4 Replies Latest reply on Mar 12, 2012 5:59 AM by revmk

    Maximum Dataset

    revmk

      Hello all,

       

      I have been asked to begin construction of a replacement database.

       

      The existing system has a number of linked tables where the number of records exceeds the 1 million mark.

       

      Client (20k) -> notes (1m)

      Client (20k) -> Widgets (73k)

      Widgets (73k) -> Transactions (500k)

       

      With this in mind, do you think FileMaker will be sufficient to sustain this and grow with it over the next few years? They often add records in bulk as they take things on.


      Thanks

        • 1. Re: Maximum Dataset
          Malcolm

          I have been asked to begin construction of a replacement database.

           

          The existing system has a number of linked tables where the number of records exceeds the 1 million mark.

           

          Client (20k) -> notes (1m)

          Client (20k) -> Widgets (73k)

           

          With this in mind, do you think FileMaker will be sufficient to sustain this and grow with it over the next few years? They often add records in bulk as they take things on.

           

          The number of records is not the sole factor. The number and type of fields in each record will play a large part in the performance of the database.

           

          Also, it is not the simple fact of having a million records. That is not a problem at all. Having a million records in the found set is the problem.

           

           

          Malcolm

          • 2. Re: Maximum Dataset
            revmk

            Hi Malcolm,

             

            Thanks for the reply.

             

            I've read a few reports that FileMaker can be 'slow' with large datasets.  Am I right in thinking, then, that it is more down to design than anything else where FM is concerned?

             

            Thanks

            • 3. Re: Maximum Dataset
              Malcolm

              I've read a few reports that FileMaker can be 'slow' with large datasets.  Am I right in thinking, then, that it is more down to design than anything else where FM is concerned?

               

              You should be able to get good performance from FileMaker.

               

              Typical FMPro apps are not designed to be big. They do things that don't work when the record set is large, like putting unstored calcs and summary fields onto list layouts. They use "Yes" or "No" instead of numbers for booleans, then create calc fields to count the Yeses. They are poorly normalised, etc.
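              To make the boolean point concrete, here's a sketch (field names are hypothetical). With a text "Yes"/"No" field you need a helper calc field before you can total anything; with a 1/0 number field the total comes straight from a stored, indexable field:

              // Text boolean: needs a helper calc field first
              IsActiveNum = If ( Status = "Yes" ; 1 ; 0 )

              // Number boolean: total directly, no helper calc needed
              Sum ( Clients::Active )   // count of active clients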

               

              If you know that you are going to deal with large record sets you can plan for it. Create fields that are indexed. Use auto-enter calcs instead of calc fields. Don't use unstored calcs. Normalise the data to reduce the number of fields/records/data required.
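              For example (hypothetical field names), a full-name field defined as an auto-enter calculation stores its result and can be indexed, unlike an unstored calc that re-evaluates every time it's displayed:

              // Auto-enter calculation on a stored text field "NameFull"
              // (uncheck "Do not replace existing value" so edits refresh it)
              NameFirst & " " & NameLast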

               

              If an unstored calc is used, figure out why. It's probably for a report or for user information. If that's the case, it can be performed by script on the subset of records needed at the time.  With script triggers, calculations can be performed just-in-time for the record on display. This is fast because the sums/calcs only operate on the current record, not across a large set.
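              A scripted version of that just-in-time approach might look like this (the layout and field names are made up for illustration):

              # Script: "Client Notes Report"
              Go to Layout [ "Notes Report" ]
              Perform Find [ Restore ]   # pull only the records the report needs
              Sort Records [ Restore ; With dialog: Off ]
              Loop
                Set Field [ Notes::LineTotal ; Notes::Qty * Notes::Price ]   # done by script, not a calc field
                Go to Record/Request/Page [ Next ; Exit after last: On ]
              End Loop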

               

              You can ensure that you never allow the user to show the full set. Provide search interfaces that display only the exact records needed.  Set up relationships so that you are only displaying small data sets via portals. You might only display notes from the last year, which might be dozens per client, not hundreds. With these approaches you keep the current found set to a minimum and FileMaker will handle it easily.
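              One way to set that up (names hypothetical): a two-predicate relationship keyed on the client plus a date cutoff, so a portal only pulls recent notes:

              // Relationship: Clients --< NotesRecent
              Clients::ClientID     =  Notes::ClientID
              Clients::gDateCutoff  ≤  Notes::NoteDate   // gDateCutoff is a global date field, e.g. a year ago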

               

              Malcolm

              • 4. Re: Maximum Dataset
                revmk

                Thanks for the response, Malcolm; a well-rounded one with a lot of tips.