21 Replies Latest reply on Sep 20, 2012 12:39 PM by wsvp

    Reducing calculations ... recommended?




      As a relatively new FileMaker convert, I have found that using calculations without careful planning can lead to performance issues, especially in lists of jobs or quotes, for example, where I'm referencing quite a few fields in the calculation.


      Is there a common approach that more experienced developers use, such as 'always' scripting calculations, or does it vary according to the scenario? Any advice would be appreciated, as I would like to adopt best practice, and the shortcut of advice from more experienced developers would save me a lot of trial and error.


      Thank you in advance.



        • 1. Re: Reducing calculations ... recommended?

          Hello, Simon.


          It's difficult to answer your question precisely without more information (e.g., a sample of a calculation that's giving you trouble). However, I can give you some general principles:


          1) FileMaker has two kinds of calculations, stored and unstored. Stored calculations are evaluated and "stored" (hence the name) in the data table where they reside. As a result, they can be indexed by the database engine and will render no slower than any other field. Unstored calculations, on the other hand, have to be evaluated at runtime. There are a variety of reasons a calculation may be unstored, including referencing a related field, a global field, another unstored calculation, or a summary field. Because they have to be evaluated at runtime, unstored calculations are typically a performance hit compared to stored calculations.
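          If it helps to see the distinction outside of FileMaker, here is a rough Python analogy (Python rather than FileMaker syntax, purely for illustration; the class and field names are invented): a "stored" calculation is like a value computed once at write time, while an "unstored" one is like a property re-evaluated on every read.

```python
from dataclasses import dataclass, field

@dataclass
class InvoiceStored:
    """'Stored' style: total is computed once, at write time, and kept."""
    qty: int
    price: float
    total: float = field(init=False)

    def __post_init__(self):
        # Evaluated once when the record is written; cheap to read or index later.
        self.total = self.qty * self.price

@dataclass
class InvoiceUnstored:
    """'Unstored' style: total is re-evaluated on every access."""
    qty: int
    price: float

    @property
    def total(self):
        # Evaluated at read time, every time - the runtime cost described above.
        return self.qty * self.price
```

The trade-off is the same one the thread discusses: the stored flavor reads fast but can go stale if the inputs change behind its back; the unstored flavor is always current but pays the evaluation cost on every access.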


          2) Whenever you reference a field or function in a calculation, the calculation has to evaluate its value. This can lead to inefficiencies in a couple of ways. First, repeated references to the same field or function force FileMaker to repeat itself - which is inherently inefficient. Second, if the object you're referencing is itself something that has to be evaluated, it can slow the calc engine down because it has to figure out what A is before it can figure out what B is.


          Now, let's take a couple of examples to make that clearer. Let's say I have a need to calculate a variable tax rate on the sum of a balance in a related table. This is inherently an unstored calculation, so we're going to skip over item 1 and go straight to item 2. (I've borrowed this example from Ray Cologon's excellent FileMaker Pro 9 Bible, which I recommend.)


          Here's an original example of a calculation - from a real-world solution, BTW:



          Case(

          Sum( Invoices::BalancePayable ) > 0 and Sum( Invoices::BalancePayable ) < 50 ;
          Sum( Invoices::BalancePayable ) - Sum( Invoices::BalancePayable ) * 0.05 ;

          Sum( Invoices::BalancePayable ) >= 50 and Sum( Invoices::BalancePayable ) < 100 ;
          Sum( Invoices::BalancePayable ) - Sum( Invoices::BalancePayable ) * 0.1 ;

          Sum( Invoices::BalancePayable ) >= 100 and Sum( Invoices::BalancePayable ) < 150 ;
          Sum( Invoices::BalancePayable ) - Sum( Invoices::BalancePayable ) * 0.15 ;

          Sum( Invoices::BalancePayable ) >= 150 and Sum( Invoices::BalancePayable ) < 200 ;
          Sum( Invoices::BalancePayable ) - Sum( Invoices::BalancePayable ) * 0.2 ;

          Sum( Invoices::BalancePayable ) >= 200 ; Sum( Invoices::BalancePayable ) - Sum( Invoices::BalancePayable ) * 0.25

          )



          Whew. That's really ugly. But what else can I do? I have a bunch of boundaries, and each boundary has a different value I have to calculate.


          One thing to do is get really, really friendly with the Let function. You can read about it in the FileMaker Help, but basically, Let allows you to specify variables up front. How does that help? One way it helps us is that it allows us to calculate that unstored bit - the Sum ( Invoices::BalancePayable ) - only once. The calculation engine won't have to work nearly as hard if it only grabs that bit once instead of, what, 19 times?


          What would that look like?


          Let( Bal = Sum( Invoices::BalancePayable ) ;

          Case(
          Bal > 0 and Bal < 50 ; Bal - Bal * 0.05 ;
          Bal >= 50 and Bal < 100 ; Bal - Bal * 0.1 ;
          Bal >= 100 and Bal < 150 ; Bal - Bal * 0.15 ;
          Bal >= 150 and Bal < 200 ; Bal - Bal * 0.2 ;
          Bal >= 200 ; Bal - Bal * 0.25
          )

          )




          Whew. A lot easier on the calculation engine.


          Another way people hurt themselves is by making multiple calculation fields in an attempt to "segment" their calculation code. You might see this with a calculation like this:


               TodaysDate = Get ( CurrentDate )  [unstored]


          And then you have another calculation like this:


               Status = Case ( TodaysDate > DueDate ; "Late" ; TodaysDate = DueDate ; "Due Today" ; "OK" )


          Again, you've asked the calc engine to recalculate your current date twice - but you've also created a "dependency". That means that field B is "dependent" on field A - and can't evaluate until field A is evaluated. This can be a performance hit on your solution as well. You might be better off with this:


               Status = Let ( TodaysDate = Get ( CurrentDate ) ;

                    Case ( TodaysDate > DueDate ; "Late" ; TodaysDate = DueDate ; "Due Today" ; "OK" )

               )




          In this case, you've used Let to do two things: Evaluate the current date only once, AND eliminate the dependency tree. Both will improve performance.


          3) Recognize that there's a prioritization in certain situations. For example, the Case function evaluates conditions in the order in which they appear. When it hits a true condition, it quits evaluating. Therefore, to make your calculation faster, put the most likely conditions near the front of the calculation. This will reduce the number of iterations the engine has to run through and speed it up.
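          The same first-match-wins behavior exists in most languages' conditionals; here's a tiny Python sketch of the idea (Python rather than FileMaker syntax, and the status codes are invented for illustration):

```python
def classify(status_code):
    # Like FileMaker's Case(), an if/elif chain stops at the first true condition.
    # Putting the most common case first means most records exit after one test.
    if status_code == "A":      # suppose the vast majority of records are Active
        return "Active"
    elif status_code == "C":    # next most common
        return "Closed"
    elif status_code == "H":    # rare
        return "On Hold"
    else:                       # like Case()'s default result
        return "Unknown"
```

If 90% of your records hit the first branch, the engine does one comparison for 90% of rows instead of falling through several; the same reasoning applies to ordering the conditions in a Case statement.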


          Those are just a few ideas off the top of my head. I (and others, I'm sure) can provide more detailed advice if you give us a little more information on exactly what you're trying to do.





          • 2. Re: Reducing calculations ... recommended?

            Mike_Mitchell wrote:



            Bal > 0 and Bal < 50 ; Bal - Bal * 0.05 ;

            Bal >= 50 and Bal < 100 ; Bal - Bal * 0.1 ;

            Bal >= 100 and Bal < 150 ; Bal - Bal * 0.15 ;

            Bal >= 150 and Bal < 200 ; Bal - Bal * 0.2 ;

            Bal >= 200 ; Bal - Bal * 0.25



            Actually, this could be simplified further to:


            Bal * ( 1 - Case (

            Bal < 50 ; 0.05 ;

            Bal < 100 ; 0.1 ;

            Bal < 150 ; 0.15 ;

            Bal < 200 ; 0.2 ;

            0.25 ) )


            • 3. Re: Reducing calculations ... recommended?

              Actually, it could be even further reduced (as Ray did in his book) to:



              Let(
              [ Bal = Sum( Invoices::BalancePayable ) ;
              Rate = 0.95 - Min( 4 ; Int( Bal / 50 ) ) * 0.05 ] ;
              Bal * Rate
              )



              But the point was to demonstrate the Let function, not necessarily reduce the calc as far as possible.
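              For anyone who wants to sanity-check that the compact Rate formula really matches the original tiered Case, here's a quick Python sketch (Python rather than FileMaker syntax, purely for verification):

```python
def discount_tiered(bal):
    """Mirror of the original multi-branch Case calculation."""
    if 0 < bal < 50:
        return bal - bal * 0.05
    elif 50 <= bal < 100:
        return bal - bal * 0.10
    elif 100 <= bal < 150:
        return bal - bal * 0.15
    elif 150 <= bal < 200:
        return bal - bal * 0.20
    elif bal >= 200:
        return bal - bal * 0.25

def discount_compact(bal):
    """Compact version: the rate is derived arithmetically, as in the reduction."""
    rate = 0.95 - min(4, int(bal / 50)) * 0.05
    return bal * rate

# The two agree across every tier, including the boundaries:
for b in (10, 49.99, 50, 99, 100, 149, 150, 199, 200, 500):
    assert abs(discount_tiered(b) - discount_compact(b)) < 1e-9
```

The trick is that `Int( Bal / 50 )` counts which 50-unit band the balance falls in, and `Min( 4 ; ... )` caps the discount at the top tier.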



              • 4. Re: Reducing calculations ... recommended?

                LOL, actually I seem to remember Ray, as a purist, being opposed to single-use variables.

                • 5. Re: Reducing calculations ... recommended?

                  Well, then, I guess you can take up the argument of how pure his purism is with him.   



                  • 6. Re: Reducing calculations ... recommended?

                    Have you found using ExecuteSQL as a calc to be faster? I just wonder, since the SQL function doesn't go through relationships.

                    Could an ExecuteSQL function be stored?

                    • 7. Re: Reducing calculations ... recommended?

                      Don't argue with Ray! Dr Cologon is one smart cookie.



                      • 8. Re: Reducing calculations ... recommended?

                        Hi Simon


                        I think Mike is making some good points regarding calculation design, but I think there is another side to this question...


                        I have heard some "rumors" of development styles that avoid ALL calculation fields at the database design level, and only use Auto-Enter or scripts to set calculated values.  This would result in ALL calculations being stored, which in theory would improve the speed of Finds and Sorts (particularly with relational calculations).




                        This "No Calculating Field" approach has a lot of holes.  First, the more stored data, the bigger the file or files become (which may not be a big deal)...  Also, from a design standpoint, if an error in the script or auto-enter calculation is found, it now becomes necessary to fix the user data as well... But an even bigger concern is that stored fields are NOT "self healing"...  Any interruption during a script or the auto-enter process could result in inaccurate data being entered, and it will REMAIN so going forward.  With calculating fields, an "Account Balance", for example, will always be accurate, as it will self-heal when normal operation is restored.


                        I am a big fan of calculating fields when used correctly.  Keep in mind that if a calculation is not being displayed, it "generally" will not need to be evaluated.  So it is wise NOT to display excessive calculating fields during entry.  In low-volume environments it may not be a big deal... But if you had an account with millions of related transactions, it would be unwise to try to display the Account Balance on the entry screen, as every change would trigger an enormous re-evaluation.


                        Hope this helps.

                        • 9. Re: Reducing calculations ... recommended?

                          Hello, Jeremy.


                          The question of whether an ExecuteSQL calculation is faster is not a "yes" or "no", but "it depends".


                          By itself, just pulling information from the database, I've found ExecuteSQL typically faster at pulling information from a single table than a comparable calculation through the calc engine. Significantly faster, in fact.


                          However, things are far less clear (in my experience) when you start talking about related records. This is because you start having to define your joins in your SQL statement. Sometimes, you can do this without using a JOIN qualifier, but sometimes you can't (depending on what you need). When a JOIN clause is included, I have seen some pretty significant slowdowns, especially over the network. I believe this may have something to do with the database needing to pull the records from all the involved tables across the wire. So doing it through the relationships graph seems (just based on a perception) not to be a whole lot different.


                          Now, there's a significant caveat here that CANNOT be ignored. It has to do with the relationships graph and a caching behavior FileMaker uses to deal with joins on the graph. When a file first opens, the FileMaker client goes through the graph and caches all the joins, their implied found sets, and everything else it needs to know in order to render the layouts that are in the solution. The more TOs you have on the graph, the longer that caching process will take. Eventually, if your graph gets large enough, you'll hit what Ray Cologon called a "performance elbow": Everything just starts grinding to a halt. The solution feels like you're dragging through hip-deep molasses, and it's largely because the cache for the graph has simply become too large.


                          By using ExecuteSQL to manage calculations for things like dashboards, or for reporting joins that don't need to be on the graph permanently, you can improve the overall performance of the solution. So the more complete answer to your question is: ExecuteSQL may or may not make a single calculation faster. But integrating it into your solution, where appropriate, will definitely improve its overall performance.


                          Oh, and yes, you can store the results of an ExecuteSQL calc. You just have to do it as an auto-enter calculation, and then refresh it when needed.





                          • 10. Re: Reducing calculations ... recommended?

                            Indeed. Ray is what I call "scary smart". And a very nice, approachable guy to boot. Met him and got to talk with him a bit at DevCon this year.



                            • 11. Re: Reducing calculations ... recommended?

                              wsvp, I try to LIMIT the calculations, but don't have a "NO calculating field" rule. I work a lot in SQL databases and, obviously, it's ALL DATA, all the time. "calculations" are done in the queries for SQL. In FM now, I use auto-enter, scripts, filtered portals, and Execute SQL in ways that just remove many of the old calculations. Use it all correctly, as you say, and it may depend greatly on what you need to do in your solution.


                              • 12. Re: Reducing calculations ... recommended?


                                The JOIN idea makes sense to me, but for some reason I don't use it a lot. I'm pulling related data from a particular table (i.e., test data from the TestData table into the Students table) with a simple Let, assigning the ID to a variable.

                                So I might do this:


                                Set Field [ Students::TestScore ;

                                Let ( $ID = Students::StudentID ;   // for this particular record

                                ExecuteSQL (

                                "SELECT TestScore
                                FROM TestData
                                WHERE StudentID = ?" ;   // StudentID in the TestData table

                                "" ; "" ; $ID ) ) ]



                                That seems to work fine. It's not worrying about the relationship, or even setting one up, to grab the data.


                                I'd really love a book or a white paper that tells more about uses of ExecuteSQL. I don't come from a SQL background (I teach how to add and subtract). I'd love to know when to use joins, when to use everything, what I can get from the execute statement, and even what words I can use.  Anyone have that?

                                • 13. Re: Reducing calculations ... recommended?

                                  Hey, Jeremy.


                                  I don't have a book per se, but there's a great series of articles on using ExecuteSQL over on the FileMaker Hacks web site.




                                  They did a series on ExecuteSQL at DevCon that got rave reviews. Check it out.




                                  P.S. I see nothing wrong with the calc you have there. I do similar things on a frequent basis.   

                                  • 14. Re: Reducing calculations ... recommended?

                                    Hi wsvp,


                                    You've articulated well a struggle that I think many developers have. Because of the performance issues of unstored calculations, issues that grew larger with FMP 12, many of us DO avoid unstored calculations as much as possible. And while the end result is certainly superior performance-wise, you enumerate many negatives: design errors force data fixing, interrupted processes force data fixing, and increased file size. Let me add some others:


                                    1) It's a lot more work to script a calculation rather than use FileMaker's built-in calculation field; I feel like a lot of development "best practice" encourages us to avoid fundamental FMP tools like calculated fields and relationships in favor of clever scripted workarounds.


                                    2) If a field used to script a calculation gets added to another layout or a new script any time in the future, you have to remember to handle the calculation as well.


                                    3) To avoid the "self healing" (clever!) problem wsvp talked about, I have to run "midnight run" updates to be sure that all data from scripted calculations are up to date.


                                    4) Changing large databases with complex business logic that were built prior to FMP 10 (and script triggers) to use scripted calculations can be prohibitively expensive.


                                    FMP unstored calculation field performance is what it is. As a developer I have to make choices about where to take advantage of the ease of development of an unstored calculation and where I need to script a calculation. But my preference would be to see future improvements to the speed of FMP's calculation engine. The step backward in FMP 12's calculation engine has, I believe, been at the heart of many of the FMP 12 performance issues detailed in this forum, including the Script Debugger and Data Viewer.


                                    Gordon Shewach

                                    Desktop Services

                                    Ann Arbor, MI
