2 Replies Latest reply on May 18, 2009 11:27 AM by deltatango

    How many millions can FileMaker handle?

    obeechi


      Does anyone know what the upper limit for records would be with FileMaker? Like what about 10 million records? Can FileMaker handle that? I mean in the sense that searches are still reasonably quick.

       

      Also, what would be the techniques for improving the speed of a search with 10 million records? I have several books on FileMaker, and not one of them addresses this issue.

       

      For example, I would assume that an integer search is quicker than a character or string search. That would mean putting address numbers in a numeric field separate from the street name. I know from reading about MySQL that integer searches are quicker than string searches (at least for MySQL).

       

      Then what about enumeration, like when you allow only one value from a value list? This helps search speed with MySQL, but does it help search speed with FileMaker? (I'm assuming that using a value list in a field's Validation Options ("Member of value list:") might be the same as or similar to an enumerated type.)

       

      Or what about converting text to numbers, in a field set aside for that purpose? For instance, if A=01, B=02, etc. (via a Case function), then concatenate the numbers (as text) and store the result as a number (in yet another field, *number). Then convert the search string (via calculation) to a number and search for that number within the *number field?
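      The encoding idea above can be sketched outside FileMaker. This is a hypothetical Python illustration (FileMaker itself would do this with a Case function in a calculation field; the function name and letter codes here are just assumptions for the sketch):

```python
def text_to_number(s: str) -> int:
    """Encode letters as two-digit codes (A=01, B=02, ...) and
    concatenate them into one integer, as the post describes."""
    digits = ""
    for ch in s.upper():
        if "A" <= ch <= "Z":
            code = ord(ch) - ord("A") + 1  # A -> 1, B -> 2, ...
            digits += f"{code:02d}"        # zero-pad to two digits
    # Note: storing the result as a number drops a leading zero
    # (e.g. "A" -> "01" -> 1), a quirk of this scheme.
    return int(digits) if digits else 0

# A search string is converted the same way, then compared numerically.
print(text_to_number("ABC"))  # "01" + "02" + "03" -> 10203
```

      Whether this actually beats FileMaker's own text index is an open question; the sketch just makes the transformation concrete.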

       

      Or does searching on multiple indexes help? For example, if one is searching for Don Juan at 1234 Elm Street, Chicago, IL, and the search were to use "Do" in the nameFirst field, "Ju" in the nameLast field, "1234" in the street-number (number type) field, "Ch" in the city field, and "IL" in the state field, would that be faster than searching fewer fields? (The text-to-number calculation could also be applied here, within a script.)
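      The multi-field find described above amounts to ANDing prefix matches on text fields with exact matches on number fields. A minimal Python sketch of that logic (the records, field names, and `find` helper are hypothetical; FileMaker would use one find request with criteria in several indexed fields):

```python
# Hypothetical records; in FileMaker each key would be an indexed field.
records = [
    {"nameFirst": "Don", "nameLast": "Juan", "streetNum": 1234,
     "city": "Chicago", "state": "IL"},
    {"nameFirst": "Dora", "nameLast": "Jung", "streetNum": 1234,
     "city": "Chicago", "state": "IL"},
]

def find(records, **criteria):
    """AND together prefix matches on text fields and exact matches
    on numeric fields, like a multi-field find request."""
    def matches(rec):
        for field, want in criteria.items():
            have = rec[field]
            if isinstance(have, str):
                if not have.startswith(want):  # "Do" matches "Don", "Dora"
                    return False
            elif have != want:                 # numbers compared exactly
                return False
        return True
    return [r for r in records if matches(r)]

hits = find(records, nameFirst="Do", nameLast="Ju", streetNum=1234,
            city="Ch", state="IL")
print(len(hits))  # both sample records match these prefixes -> 2
```

      The sketch also shows why more criteria can narrow the candidate set faster: every additional field is another chance to reject a record early.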

       

      Or does it help to intentionally run a find twice instead of just once? Like once for the State, and then to run a Modify Find to search within the Found Set?

       

      Or what about specifying a limit of characters in the validation tab of a field's options? Does that enhance speed of finds?

       

      I suppose compacting a database (to a copy that is used in place of the original (to be safe about it!)) will also help the speed of finds.

       

      Oh yeah, what about the cache too? The default is 8MB. How do you determine how large to set the cache?   

       

      I mean, what are the tricks (beyond having a fast CPU, a fast disk, lots of RAM, and plenty of free disk space)? And how big, record-wise, can FileMaker reasonably get? Let's assume 50 or fewer columns (which is probably more than I'll ever use, I think).

        • 1. Re: How many millions can FileMaker handle?
          philmodjunk
            

          What an interesting question! I hope we get some good responses to your post.

           

          "what are the tricks (beyond have a fast cpu, fast disk, lots of ram, plenty of free disk space?"

          I can outline one minor "trick" for you. It only applies to data that is static (not subject to change), but can dramatically improve performance if you can work within its limitations.

           

          In one of my DBs, I created a "summary" file that summarizes invoice data. Since the data will not be changed once an invoice is printed (this business does not have to deal with returns), all the data collected can be processed in a way that reduces several thousand records to fewer than a hundred. Here's how it works: a script systematically finds all invoice line items from the current day's invoices for a specific item. It then computes a total and creates one record with the date, material ID, quantity purchased, cost, etc. for that one item. The script loops through all invoiced items in this fashion. The result is that data from several thousand line-item records has been reduced to 20-30 summary records. This makes a dramatic difference in calculation speed when the data is used to produce a 5-year "cross tab style" summary report with monthly totals and averages for each invoiced item.
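          The looping script above boils down to a group-and-total pass over the line items. A hypothetical Python sketch of that aggregation (field names and sample data are invented; the FileMaker version would be a script creating records in the summary file):

```python
from collections import defaultdict
from datetime import date

# Hypothetical line items from one day's invoices.
line_items = [
    {"date": date(2009, 5, 18), "material_id": "M1", "qty": 5, "cost": 10.0},
    {"date": date(2009, 5, 18), "material_id": "M1", "qty": 3, "cost": 6.0},
    {"date": date(2009, 5, 18), "material_id": "M2", "qty": 2, "cost": 8.0},
]

def summarize(items):
    """Collapse many line-item records into one summary record per
    (date, material) pair, like the looping script does."""
    totals = defaultdict(lambda: {"qty": 0, "cost": 0.0})
    for it in items:
        key = (it["date"], it["material_id"])
        totals[key]["qty"] += it["qty"]
        totals[key]["cost"] += it["cost"]
    return [{"date": d, "material_id": m, **t}
            for (d, m), t in totals.items()]

# Three line items collapse to two summary records:
# M1 totals to qty 8 / cost 16.0, M2 stays qty 2 / cost 8.0.
print(summarize(line_items))
```

          Later reports then read the few summary records instead of re-scanning thousands of line items, which is where the speedup comes from.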

           

          Obviously, this trick doesn't work for every type of data you might collect in a database, but when you can use it, you can get some pretty dramatic performance enhancements.

          • 2. Re: How many millions can FileMaker handle?
            deltatango
              

            The most records I've gotten to in an FM DB so far is about 800,000.

             

             It is a bit slow, but more so when the file is shared. We have about 15 people connected at a time. Doing finds can be a hassle (5-12 second pauses).

             

             What really slows it down is having complicated relationships, and many of them. My solution has about 200 table occurrences in it and about 100 tables.

             

            Some pretty complex calculations added on top of this and we're talking coffee cups up the wazoo.

             

            However, it's the only program I've been able to use that can MAKE calculation fields and help you sort and GET data EASILY.

             

            Other programs, and I won't mention any names [QuickBooks! cough! cough!], can take 10-15 minutes to make a summary report of 10,000-30,000 records.

             

            So I'm not complaining...much.