2 Replies · Latest reply on Nov 4, 2009 7:56 PM by RickWhitelaw

    Speed issue

    RickWhitelaw


      Hi,

       

      I've been working on a solution for a few years that has become essential to the running of my business. It works well and does what I want. However, the solution is normalized to what some would call an extreme degree, which, as I know, has advantages and disadvantages. All is well until I need to see a report that ends up depending on (likely) thousands of unstored calculations.

      During the course of a year I contract many musical theatre productions. Each production has many engagements (usually measured in weeks), and each production has an orchestra that can change by engagement. Several productions (and thus engagements) run simultaneously. At the micro level is a table called "Fee Engine" which calculates, in concert with many related tables, the fees of the individuals. Another table, "Fee Engine Calc", tracks totals for the engagement, and the next level up, "Production", calculates various totals across all the Engagements of a given Production. Any layout based on the top level ("Production") that sums figures originating in "Fee Engine" can be really slow.

      As an example of the approach I'm taking to alleviate this: a field in "Fee Engine Calc" called Fee_Sum calculates the sum of the fees paid out that are recorded in Fee Engine. I create an extra field in "Fee Engine Calc" with the calculation GetField ( "Fee_Sum" ), which is a STORED calculation. I could do the same one level up when calculating grand totals for a Production. I'm thinking this will speed things up considerably when displaying fee totals in a list view of several productions (records of "Production").
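      To make that concrete, here's roughly what the pair of fields looks like. This is a simplified sketch: the Sum() expression and the Fee_Sum_Stored name are stand-ins for my actual definitions.

          // In "Fee Engine Calc": the existing aggregate, which has to stay
          // unstored because it reaches across the relationship to Fee Engine
          Fee_Sum = Sum ( Fee Engine::Fee )

          // The extra field. Because the reference goes through GetField()
          // with the field name as text, this calculation can be given
          // STORED storage (and an index).
          Fee_Sum_Stored = GetField ( "Fee_Sum" )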

      So, call that method 1. The second method would be to create a simple number field in Fee Engine Calc and, triggered by an event that signifies the given Engagement is "done with" (printing cheques or the like), have a script set that field to GetField ( "Fee_Sum" ). That field would, of course, be indexed.
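      A sketch of that script, as I picture it (Fee_Sum_Static is a hypothetical name for the indexed number field, and the trigger is whatever event marks the Engagement as finished):

          # Runs on the record in "Fee Engine Calc" when the Engagement is
          # "done with", e.g. after printing cheques
          Set Field [ Fee Engine Calc::Fee_Sum_Static ; GetField ( "Fee_Sum" ) ]
          Commit Records/Requests [ No dialog ]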

       

      I've tried method 1 with one significant field and it seems to speed things up in a big way. If I apply the same thinking to pension, union dues and such, I would expect things to improve further. The second method has both the drawback and the advantage of not auto-updating.

      PhilModJunk posted about something like this a while ago.

      All opinions are of course appreciated! The solution is working and the data is safe and accurate. I just want some speed for this year-end reporting!

       

      Thanks,

      RW 

        • 1. Re: Speed issue
          mrvodka
            

          When working with FileMaker, sometimes you have to denormalize or add utility fields to make things work efficiently. With the addition of script triggers, it may make sense for you to store the aggregate figures in static number fields. If you have FMS, you can even run a schedule that periodically checks whether the figures are off.

           

          Working with million-record files, sometimes adding up all the line items and setting a field makes sense, especially for things such as closed invoices. ;)
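          As a rough sketch, a repair script along these lines could be run by hand, or by a server-side schedule if you ever get FMS. The layout and field names are just placeholders based on your description:

              # Walk all the records and fix any stored totals that have drifted
              Go to Layout [ "Fee Engine Calc" ]
              Show All Records
              Go to Record/Request/Page [ First ]
              Loop
                If [ Fee Engine Calc::Fee_Sum_Static ≠ Fee Engine Calc::Fee_Sum ]
                  Set Field [ Fee Engine Calc::Fee_Sum_Static ; Fee Engine Calc::Fee_Sum ]
                End If
                Go to Record/Request/Page [ Next ; Exit after last ]
              End Loop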

          • 2. Re: Speed issue
            RickWhitelaw
              

            Thanks, Mr Vodka,

             

            I thought as much. I don't have FMS. I could expand the opening script of the relevant files to check the simple number field against the calculation, but since the found set could always be different, this won't work at the micro level. However, it WILL work at the middle level (totals per single Engagement), and it would also work at the Production (macro) level once a production is closed. Once a payroll, engagement or production is finished, the numbers don't change anyway, so a number field at each level will work and won't need to be checked. I've just talked myself in a circle . . . .
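            For what it's worth, here's roughly how I picture that opening-script check at the Engagement level. The "Closed" flag and field names are hypothetical; a scripted find also sidesteps the found-set problem, since the script establishes its own found set:

                # Find the engagement records that have been closed out
                Go to Layout [ "Fee Engine Calc" ]
                Enter Find Mode [ ]
                Set Field [ Fee Engine Calc::Closed ; 1 ]
                Perform Find [ ]
                # Re-set each static total from the live calculation; records
                # whose figures were already right are rewritten unchanged
                Replace Field Contents [ No dialog ; Fee Engine Calc::Fee_Sum_Static ; Fee Engine Calc::Fee_Sum ]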

             

            Rick.