
    400,000 Records. Need help with performance

    dataman


      I have a table with over 400,000 records. Finds are taking over 3 minutes to perform.

      The server is running Windows XP on a 3 GHz CPU with 3 GB of RAM.

      What should I upgrade? Should I add more RAM or a faster CPU? Do the client computers need an upgrade too?

      Thank you!

        • 1. Re: 400,000 Records. Need help with performance
          philmodjunk

          Some of my tables now exceed a million records, so I know what you mean about these delays...

          More likely, the hardware isn't the real issue; you need to review the details of your finds and the fields into which you are entering search criteria. Any find that enters criteria into unstored or unindexed fields will take longer and longer to perform as your record count grows, because FileMaker has to build a temporary index for each such field before it can perform the find.

          There are several basic options you can use to speed up your finds. You'll need to evaluate the actual design of your system and the type of finds you are performing to decide on the best approach, or combination of approaches, to use:

          1) Turn on indexing for the field, or modify the design of your system so that the field can be indexed. This might require using a script trigger to run a script that computes the value and stores it in an indexed data field each time one of the fields the calculation references is edited. This can be tricky to do with 100% accuracy and it denormalizes your data, but it can be done if you are careful.
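          For example, here's a minimal sketch of that trigger approach. The table, field, and trigger names are just placeholders I made up for illustration: assume an unstored calculation such as Quantity * UnitPrice is replaced by a stored, indexed number field, ExtendedPrice, and a script like this is run from an OnObjectSave trigger set on the Quantity and UnitPrice fields:

            # Keep the indexed ExtendedPrice field in sync with the fields it depends on
            Set Field [LineItems::ExtendedPrice; LineItems::Quantity * LineItems::UnitPrice]
            Commit Records/Requests [No dialog]

          Finds on ExtendedPrice then use its index instead of forcing FileMaker to evaluate a calculation on every record. Keep in mind that a layout-based trigger only fires for edits made through that layout object, which is part of why keeping the stored value 100% accurate takes care.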

          2) Use a denormalized "summary" table for your searches/reports. If your data becomes "static" (no further changes permitted) soon after creation, you may find that you can create a summary table and use a script to condense many records down into a much smaller number of records.

          You are basically computing summary totals in advance so that your reports don't have to compute totals from square one. In our system here, we generate hundreds of invoice-type POs with thousands of line item records every day. A summarizing script runs every evening and "condenses" all of those line item records into about a dozen records, one for each type of material purchased, with dollar and weight totals computed and stored in indexed number fields. When we pull up a 5-year summary report, we use this table and cut the number of calculations needed by more than a factor of ten compared to generating the same report directly from the line item records.
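          To give you a rough idea of what such a nightly script can look like (the table, field, and value list names below are hypothetical placeholders, not our actual system), assume a LineItems table with summary fields sTotalAmount (Total of Amount) and sTotalWeight (Total of Weight), plus a LineItemSummary table of indexed number fields:

            # Condense yesterday's line items into one summary record per material type
            Set Variable [$types; Value: ValueListItems ( Get ( FileName ) ; "Material Types" )]
            Set Variable [$i; Value: 0]
            Loop
               Set Variable [$i; Value: $i + 1]
               Exit Loop If [$i > ValueCount ( $types )]
               Set Variable [$type; Value: GetValue ( $types ; $i )]
               Go to Layout ["LineItems"]
               Enter Find Mode []
               Set Field [LineItems::Date; Get ( CurrentDate ) - 1]
               Set Field [LineItems::MaterialType; "==" & $type]
               Set Error Capture [On]
               Perform Find []
               If [Get ( FoundCount ) > 0]
                  # A summary field evaluates over the current found set on this layout
                  Set Variable [$dollars; Value: LineItems::sTotalAmount]
                  Set Variable [$weight; Value: LineItems::sTotalWeight]
                  Go to Layout ["LineItemSummary"]
                  New Record/Request
                  Set Field [LineItemSummary::Date; Get ( CurrentDate ) - 1]
                  Set Field [LineItemSummary::MaterialType; $type]
                  Set Field [LineItemSummary::TotalDollars; $dollars]
                  Set Field [LineItemSummary::TotalWeight; $weight]
                  Commit Records/Requests [No dialog]
               End If
            End Loop

          A 5-year report then only has to touch a few thousand pre-totaled summary records instead of millions of line items.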

          3) Make your search a two-stage search. Put all of the criteria that go into indexed fields into the first find. Then return to Find mode, specify the criteria for your unstored/unindexed fields in a second set of requests, and use Constrain Found Set to complete the find that pulls up the records you want. Since the second find only searches the found set produced by the first, this approach is often many times faster than a single find that locates the same records. It can even be done seamlessly for the user if you set up a search form of global fields where they enter find criteria and a script that performs the two-stage find.
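          A bare-bones version of such a script might look like this. The Invoices table, the unstored cBalanceDue calculation, and the global "g" fields are only examples of the idea, not your actual field names: the first find enters criteria into indexed fields only, and the unstored field is only searched against the much smaller found set that results:

            Go to Layout ["Invoices"]
            Set Error Capture [On]
            Enter Find Mode []
            # Stage 1: criteria for indexed fields only
            Set Field [Invoices::CustomerID; Globals::gCustomerID]
            If [not IsEmpty ( Globals::gStartDate )]
               Set Field [Invoices::Date; Globals::gStartDate & "..." & Globals::gEndDate]
            End If
            Perform Find []
            # Stage 2: constrain the found set on the unstored/unindexed field
            If [Get ( FoundCount ) > 0 and not IsEmpty ( Globals::gBalanceDue )]
               Enter Find Mode []
               Set Field [Invoices::cBalanceDue; Globals::gBalanceDue]
               Constrain Found Set []
            End If

          Global fields keep their values when you enter Find mode, which is what makes this kind of search screen possible.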