Not sure what calculations you are using or how your database is designed, but you may be able to skip calculation field types altogether and instead do the calculations in a script when needed. Then your database would contain only fields with stored data, which should keep navigation quick no matter how many records you have. When needed, you run a script that performs the calculations.
There are several options. One is to take steps via scripts to keep your found sets small: don't allow a Show All Records or the like.
And with some data you can set up a "summary table" that is periodically updated: summary totals are computed and stored in number fields, so that your summary reports use the stored numbers from this table instead of summary fields that have to total large numbers of records.
Here's an example of something I've had working since FileMaker 4 or 5...
We have an invoicing-type system. Since we buy scrap metal and redeem used beverage container deposits, they are really purchase orders, but the table structure is the same as you see with typical invoices:
Since we serve 500 to 1,000 customers a day, with POs that have a minimum of four line item records each, summary and cross-tab reports of LineItems data spanning up to five years would need to crunch numbers from literally millions of records. That would bring the system to a crawl while the progress bars fill and the summary totals (we keep these for average cost, total cost, and total weight) update.
So I set up a script that runs once a night: it takes all of the current day's line items and creates one record for each type of material purchased/redeemed, with totals and averages computed and stored in number fields. That "condenses" a set of up to 4,000 records down to about a dozen records for the day.
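The nightly condense step can be sketched outside FileMaker. Here's a minimal Python illustration, with hypothetical field names (material, weight, cost) standing in for the real line item fields:

```python
from collections import defaultdict

# Hypothetical line items for one day: (material, weight, cost) tuples.
line_items = [
    ("aluminum", 12.5, 10.00),
    ("aluminum", 7.5, 6.00),
    ("copper", 3.0, 9.00),
    ("copper", 5.0, 16.00),
]

def condense(items):
    """Condense one day's line items into one summary row per material,
    storing total weight, total cost, and average cost per unit weight."""
    totals = defaultdict(lambda: {"weight": 0.0, "cost": 0.0})
    for material, weight, cost in items:
        totals[material]["weight"] += weight
        totals[material]["cost"] += cost
    # One stored record per material: reports read these numbers instead
    # of re-totaling millions of raw line item records.
    return {
        material: {
            "total_weight": t["weight"],
            "total_cost": t["cost"],
            "avg_cost": t["cost"] / t["weight"],
        }
        for material, t in totals.items()
    }

summary = condense(line_items)
```

In FileMaker this is a looping script writing into a plain number-field table, but the shape of the computation is the same.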
And thus, a five-year comparison cross-tab report with monthly totals and averages can be produced from this table without any noticeable delay.
But also note that this is fairly straightforward only for data that, once recorded, is never (or almost never) changed. Even so, it has taken extra scripting effort to make sure that this summary table always agrees with the original line item data.
That should do the trick, because once data is entered, it is never changed (once the day goes by, the data remains unchanged).
A scripting solution should do the trick; I just need to know when to run it. Maybe on close, or an OnSave-type script trigger or something?
Do you have an example file?
The sample file would be a working copy of our DB, and I'm not allowed to hand out copies of it. If this task were the only part of the script, I'd use a server schedule to run it. But since it also does some data imports from one FileMaker file to another to archive data (including the individual line items), I've set up a robot file, opened by Windows Scheduled Tasks, that performs this script once a night just before the backup schedule backs up the file.
OnLastWindowClose might work, but since you might close a file more than once in a day, and this could tie up that client for several minutes, it might not be the best option unless you added some additional code, such as code that checks the time of day and only runs the script after close of business, or that asks whether you want to run it so you can cancel out of it.
On the other hand, if you host from FileMaker Pro, you could set this up so that it only runs OnLastWindowClose on the host computer, by having the code check whether the current computer is the host and exit if it is not.
I created another layout and named it "hard enter" (the name is self-explanatory).
And since I only need to run it after a data import, I would do the import via that layout and then just go through the newly added records (find records with a certain empty field, run a loop script, and set the field from the calculations).
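That post-import pass (find records whose stored field is still empty, then loop and set it from the calculation) can be sketched like so. The field names `qty`, `price`, and `total` are hypothetical stand-ins for the real fields:

```python
# Hypothetical imported records: the stored "total" field is still empty (None).
records = [
    {"qty": 2, "price": 1.50, "total": None},
    {"qty": 4, "price": 0.25, "total": None},
    {"qty": 1, "price": 3.00, "total": 3.00},  # already computed; excluded by the find
]

def hard_enter_pass(records):
    """Emulate the 'hard enter' layout pass: find records with an empty
    computed field, loop over that found set, and set the field from
    the calculation."""
    found_set = [r for r in records if r["total"] is None]  # the "find"
    for r in found_set:                                     # the loop script
        r["total"] = r["qty"] * r["price"]                  # the Set Field step
    return records

hard_enter_pass(records)
```

Because only records with the empty field are touched, re-running the pass is harmless, which matches the "once entered, never changed" property of the data.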
Thanks for your input.