This is done all the time. While it's OK to go one or two layers deep on a calculation or two, it isn't a best practice. Each layer adds a dependency to the calculation, and every level has to be calculated for every record. The number of computations increases dramatically after even a couple of layers.
At approximately 13 levels deep I found it almost unusable. Even a simple chain, like 14 day fields each subtracting from the previous one, will become very slow.
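The blow-up is easy to model outside FileMaker. A minimal Python sketch (this is a model, not FileMaker code; the field chain and values are hypothetical) counting how many evaluations happen when a 14-field chain is re-walked for every field versus when earlier values are stored:

```python
# Model a chain of N calculation fields, field[i] = field[i-1] - 1.
# Displaying a field re-evaluates its entire dependency chain unless
# earlier values are stored (cached/indexed).

def eval_field(i, counter, cache=None):
    """Return the value of field i; counter[0] counts real computations."""
    if cache is not None and i in cache:
        return cache[i]              # stored value: no recomputation
    counter[0] += 1
    value = 100 if i == 0 else eval_field(i - 1, counter, cache) - 1
    if cache is not None:
        cache[i] = value
    return value

N = 14

# "Unstored": rendering all N fields re-walks the chain every time.
unstored = [0]
for i in range(N):
    eval_field(i, unstored)

# "Stored": each field is computed once, then reused.
stored = [0]
cache = {}
for i in range(N):
    eval_field(i, stored, cache)

print(unstored[0], stored[0])   # 105 14
```

Displaying all 14 unstored fields costs 1+2+...+14 = 105 evaluations per record, versus 14 when values are stored, and that gap multiplies by the record count.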
Interesting. Does this performance hit apply to calculations across relationships (like Sum or List)?
For clarity, are you talking about "unstored" calculations?
If your calculation only references fields in the same table, it can be a stored calculation, because the value can only change when that same record is modified, so it can easily be re-indexed on the spot. I think ch0c0halic was pretty close. I've probably chained as many as 15-20 fairly simple stored calculations without noticing much slowdown.
When referencing related tables, the calculation in the current table can't know what is going on in the other table while it isn't active, so it must be unstored. Any time you access the record, whether directly, in a portal, in a value list, or through another calculation, the calculation is performed again. Then, if you reference a calculated field from another table, that value has to be calculated for every single record in the other table before the final value can be returned. This can start to bog things down very fast.
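That multiplication is the killer. A quick Python model (again, not FileMaker code; the record counts and field names are made up) of an unstored calc that sums a per-record calculation across a related table:

```python
# Model an unstored calc that sums a calculated field across related
# records: every access to the parent re-evaluates every child's calc.

child_evals = 0

def child_calc(row):
    """A per-row calculation in the related table (e.g. qty in - qty out)."""
    global child_evals
    child_evals += 1
    return row["in"] - row["out"]

def parent_total(children):
    """Unstored: nothing is cached, so each access re-walks all children."""
    return sum(child_calc(r) for r in children)

children = [{"in": 10, "out": 3}] * 1000   # 1,000 related records

# Scrolling a list view that touches 50 parent records = 50 accesses:
for _ in range(50):
    parent_total(children)

print(child_evals)   # 50000
```

One screenful of a list view already costs 50,000 child evaluations here, and if `child_calc` were itself unstored and referenced yet another table, each of those evaluations would fan out again.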
It is so tempting to just relate and calculate like it's 2025, but I've fallen into that trap before and it gets to be unbearably slow. I still struggle with finding the best way to keep unstored calculations to a minimum and data current. In fact, I was actually thinking about posting on this same topic which is why I must apologize for the lengthy post.
I'm working on an inventory/invoicing database for a flooring company and having trouble keeping the remaining-quantity calculation from taking forever to load while staying accurate in real time. This is custom imported flooring that has to be counted in piece by piece, by pallet, by container; that gives us the received quantity. Then it is sold, and the invoice provides that quantity. When the sold order is pulled from inventory, it is counted out piece by piece, by pallet, by container, and sometimes they substitute some other identical stock for part of the ordered item. The amount deducted from the received quantity is the amount sold until the item has shipped, and after that it is the quantity that was counted out. On top of that there are damaged pieces, shrinkage, and other manual adjustments.

When the remaining quantity reaches 0, there is a status that needs to change, and that status needs to be searchable. So imagine trying to search or scroll a list view where that status calculation spans over 20 relationships of unstored calculations, with some tables well over 100,000 records and the smallest around 5-10k. It's a nightmare.
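One common workaround for exactly this situation is to keep the remaining quantity as a plain stored number that a script adjusts whenever a transaction posts, instead of an unstored calc that re-walks every relationship on every access. A Python sketch of the idea (all names here are hypothetical stand-ins, not my actual schema):

```python
# Keep "remaining" and "status" as stored values updated per transaction,
# rather than unstored calcs spanning receipts/invoices/adjustments.

class InventoryItem:
    def __init__(self):
        self.remaining = 0
        self.status = "out of stock"

    def post(self, qty):
        """qty > 0 for receipts; qty < 0 for sales, damage, shrinkage."""
        self.remaining += qty
        # Status becomes a stored, indexable, searchable value too.
        self.status = "in stock" if self.remaining > 0 else "out of stock"

item = InventoryItem()
item.post(+120)   # received: counted in by piece/pallet/container
item.post(-115)   # shipped: counted out
item.post(-5)     # damaged pieces
print(item.remaining, item.status)   # 0 out of stock
```

The trade-off is that the script has to fire on every event that changes quantity, but searches and list views then hit stored, indexed values instead of recomputing the whole history.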
Thanks for replying and sharing some real world mayhem. Do you script creation of any totals and do you archive data on occasion?
A word about custom functions here (obviously, you'll need FMP Advanced to create them). I think HammerOz rightly points out that referring back to a logical field (for Age) has advantages, for example when it comes to maintenance or revision of the logic of the calculation.
Using custom functions does not necessarily prevent dependency chains. I'm not entirely sure, but I think it can in principle, which could save a lot of processing time. (It should be pointed out that for a calculation at the end of the chain, you'll still need all the references that fields inside the chain make to fields outside the chain. So a layout that only shows that final calculation may still be slow, simply because the calculation is complex and needs many inputs.)
I am not entirely sure because, while it works in a simple example, in more complex situations using custom functions could also mean that the dependencies merely move from the calculations specified in the fields to the calculations specified in the custom functions. In that case the solution would be to define a custom function for each field in the chain, but then you'll lose the advantage of single-point definition (if you use the same chain of dependencies in multiple places, it would still be a gain).
Also, functions call other functions faster than fields call other fields, so there is a possible performance improvement there. On the other hand, replacing a field's calculation with a custom function means that, in addition to doing the calculation, the field has to call the function and hand over parameters. I'm guessing that still beats long chains of fields referencing fields.
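To make the idea concrete, here's a toy Python model (the function and numbers are hypothetical) of collapsing a 14-field chain into a single function: one call does all the steps in one pass, so there is no per-field dependency for the engine to track:

```python
# Model collapsing a chain of 14 dependent fields into one function:
# a single call computes the whole chain instead of 14 fields each
# referencing the previous one.

def chain_value(start, n):
    """Stand-in for a custom function: n successive subtractions of 1."""
    value = start
    for _ in range(n):
        value -= 1
    return value

print(chain_value(100, 14))   # 86
```

The layout then needs only one field holding `chain_value(...)`, though as noted above, any outside fields the chain reads still have to be fetched.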
Yes, and it's way worse.
Calcs based on calcs isn't "wrong"...it's just slow.
I have one "unavoidable" situation that goes 6 layers deep. (nothing is really unavoidable...)
It is for a report run once a year and it isn't worth reworking...but for only 11K records, it takes over two minutes to assemble.
Everything in life is a cost-benefit decision.
three minutes to build a report that takes three minutes to render?
For once a year, I might let it go.
For every day, I'd spend the extra time to speed it up.