13 Replies Latest reply on Jan 14, 2013 5:17 AM by carlos_panda

    Calculation Limit?

      I feel that I should know the answer to this but just can't work it out...


      I have a table with a stored calculation field:

      calculationA (= numberB / numberA, stored)



      I want to create a new calculation field (calculationB) that contains the cumulative product of calculationA in descending date order:


      date (most recent first), calculationA, calculationB (= result)

      date 1, cA1, cB1 = 1 (most recent record to always return 1)

      date 2, cA2, cB2 = 1 x cA1

      date 3, cA3, cB3 = 1 x cA1 x cA2 (i.e.: cB2 x cA2)

      date 4, cA4, cB4 = 1 x cA1 x cA2 x cA3 (i.e.: cB3 x cA3)

      date 5, cA5, cB5 = 1 x cA1 x cA2 x cA3 x cA4 (i.e.: cB4 x cA4)



      So, in essence, each record's calculationB is the previous record's calculationB multiplied by the previous record's calculationA.
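      The recurrence can be sketched outside FileMaker - here in Python, purely to pin down the intended result (field names are from the description above; this is not FileMaker syntax):

```python
def cumulative_product(calc_a):
    """calc_a lists the calculationA values most-recent-first.

    Returns the calculationB value for each record: the most recent
    record gets 1, and every later record gets the product of all
    *previous* records' calculationA values.
    """
    calc_b = []
    running = 1
    for value in calc_a:
        calc_b.append(running)   # cB(n) = cB(n-1) * cA(n-1)
        running *= value
    return calc_b

print(cumulative_product([2, 3, 4, 5]))  # [1, 2, 6, 24]
```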


      I can make this work using a calculation based on a relationship:



      tableDATANXT (contains all records with a date greater than the current record, sorted by ascending date)


      tableDATA::calculationB = Case ( tableDATA::Date = tableDATA::FirstDate ; 1 ; ( tableDATANXT::calculationA * tableDATANXT::calculationB ) )


      BUT... only some records calculate, with the remainder displaying a "?". The actual number that calculates is variable - perhaps the first 177 out of 769 records - and appears to depend on whether the results are displayed in a table or a portal.


      Clearly the calculation works, just not enough times.


      Does anyone know of a better way to calculate a running balance as the product (rather than the sum) of a given field in a specified (date) order?

        • 1. Re: Calculation Limit?

          I am not sure I follow your description fully. Perhaps this might help?


          • 2. Re: Calculation Limit?

            Usually when I get a "?" in a field, it's because there are more characters than the field width.  Try widening your fields that have ? in them.  Worth a shot.

            • 3. Re: Calculation Limit?

              Yes - the description was pretty awful!


              However your idea of using Ln / Exp looks really interesting and at first glance appears to solve the problem.


              To be certain I'll have to spend the weekend testing but, in any event, many thanks indeed for such rapid and useful feedback.
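      In case it helps anyone else following the thread, the Ln / Exp idea - assuming it means replacing the running product with Exp of a running total of Ln values (valid only while every value is > 0) - can be sketched in Python like this:

```python
import math

# Sample data as described in this thread: 767 ones and two 0.25s.
values = [1.0] * 767 + [0.25, 0.25]

# Running product, computed directly:
direct = 1.0
for v in values:
    direct *= v

# The same product via logs: Exp ( Sum ( Ln ( value ) ) ).
# A running total of logs is a flat sum rather than a record-by-record
# recursion, which is the whole point of the trick.
via_logs = math.exp(sum(math.log(v) for v in values))

print(direct, via_logs)   # both ~ 0.0625
```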



              • 4. Re: Calculation Limit?

                Unfortunately not that easy (I wish!). Thanks anyway.



                • 5. Re: Calculation Limit?



                  I suspect your subject line is correct - that you are hitting a calculation limit. If I read your post correctly, the growth in the number is exponential. In FMP, when I take the number 3 to the 700th power (3 ^ 700), the result is a number 334 characters long. When I try the same with the number 4 it returns a question mark - apparently more digits than the calc engine can deal with. This jibes with your getting values for only the first several hundred records.


                  Bob Gossom

                  • 6. Re: Calculation Limit?

                    Yes, it may have to do with some form of limit. However, of the 769 records in my sample, 767 are "1" and the other 2 are "0.25", so one wouldn't see this as a particularly demanding calculation!


                    I'm still working on Michael's idea of using Ln / Exp (see above) which so far looks like the best idea.





                    • 7. Re: Calculation Limit?

                      carlos_panda wrote:


                      I'm still working on Michael's idea of using Ln / Exp (see above)


                      What's there to work on? It should be very simple, I think (see attached). At least as long as your values are all > 0.

                      • 8. Re: Calculation Limit?

                        Just trying to implement it in the fastest way possible (it has to cope with subsets of 100k + records).


                        I'll take a look at the file.


                        Thanks again



                        • 9. Re: Calculation Limit?

                          If you take the data in Michael's file and replicate it, you start getting a "?" around the 9,050th record. FMP does have limits on the number of meaningful digits that a number field can handle. It also starts getting very slow. There is no way FMP can support this methodology in 100k records. Indeed, I think the math co-processors in desktop computers probably can't handle numbers this size.


                          It's hard to imagine the functional utility of numbers this large outside of astronomy or [the right term for the study of very small things].


                          You might want to revisit the reason for this calc. If this precision is indeed needed, then perhaps you can change the units, i.e., use astronomical units or some other notation?


                          What is the purpose of this calc? Is it scientific?


                          Bob Gossom

                          • 10. Re: Calculation Limit?

                            BobGossom wrote:


                            There is no way FMP can support this methodology in 100k records. Indeed, I think the math co-processors in desktop computers probably can't handle numbers this size.


                            There are two separate issues here. One is the size of the resulting number: Officially, Filemaker's number fields support values up to 10^400. Thus, for example,  Factorial ( 213 ) returns an error although the number of required multiplications is merely 212 - certainly within the range of any computer (otherwise you wouldn't get a result for 2^213 either).
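                            Those digit counts are easy to verify outside FMP - for example in Python, whose integers are arbitrary-precision:

```python
import math

# Digit counts of the numbers discussed above.
print(len(str(3 ** 700)))             # 334 digits - within the stated 10^400 range
print(len(str(4 ** 700)))             # 422 digits - past the limit, hence the "?"
print(len(str(math.factorial(213))))  # more than 400 digits, so Factorial ( 213 ) overflows too
```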


                            The other question is how many records can a summary field summarize. After all, that is the basis of the "methodology" applied here: nothing more than a simple total of stored (and relatively small) values. I am not aware of any declared limits here. Surely, totalling 100k records, even if slow, is not beyond our reach?

                            • 11. Re: Calculation Limit?



                              Using your Ln idea works well. I've found that a Summary based approach works much faster for display / scrolling in portals whereas a Sum calculation (based on a related TO) is more efficient elsewhere.


                              By the way, I don't think the original problem was to do with the size of the resulting number or even the complexity of the calculation. The explanation in my original post was poor, but the underlying query (now solved) was probably quite simple, so perhaps I should restate it for the record:


                              A table contains records with the following fields: RecordNumberX, DateX, NumberX, CalcX (or Summary), sorted in descending date order (for the most recent date RecordNumberX = 1, for the next most recent date RecordNumberX = 2 etc).


                              RecordNumberX = Get ( RecordNumber )

                              CalcX = Case ( RecordNumberX = 1 ; 1 ; NumberX * GetNthRecord ( CalcX ; RecordNumberX - 1 ) )


                              ***N.B. The above calculation is only to explain the desired outcome. It is NOT the solution!



                              • 12. Re: Calculation Limit?

                                carlos_panda wrote:


                                By the way, I don't think the original problem was to do with the size of the resulting number


                                I don't think so either. It's very easy to show by changing the calculation formula to =


                                Let (

                                n = Get ( RecordNumber ) ;

                                Case ( n = 1 ; 1 ; GetNthRecord ( CalcX ; n - 1 ) )

                                )



                                There is practically nothing to "calculate" here, and the expected result is 1 for all records of the found set. Still, the calculation will break at record #175 or so, because the stack becomes too deep: in order to calculate the result of record #175, one needs the result of the same calculation for record #174; since the calculation is not stored, the result of record #174 must be calculated again - but for this to happen, we need the result of record #173 first, and so on...
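                                The stacking behaviour is the same as naive recursion in any language. A rough Python analogue (the ~175-record limit is FileMaker's; the small artificial limit here is just for illustration):

```python
import sys

# Each unstored CalcX needs the previous record's CalcX, which must itself
# be re-evaluated, so the chain of pending evaluations grows with the
# record number even though the answer is trivially 1.
def calc_x(n):
    return 1 if n == 1 else calc_x(n - 1)

sys.setrecursionlimit(200)   # stand-in for FileMaker's evaluation-stack limit

print(calc_x(100))           # fine: the chain is short enough
try:
    calc_x(1000)
except RecursionError:
    print("breaks: the chain of pending evaluations is too deep")
```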

                                • 13. Re: Calculation Limit?

                                  I agree. There's not much to calculate but it's useful to know that the reason it couldn't work beyond a certain point is due to the stack issue.