General advice required on large dataset calculations
My main table is always growing in record count (currently around 70,000 records) and may well keep growing in field count too.
I haven't been using FM long enough to have thought of this when I started the project, so I'm asking now.
Calculating directly on the data fields and adding further summary fields etc. to this table has produced noticeable slowdowns.
Is there any justification for treating this table purely as a data source and, for each layout/report, creating a separate related table (not just a table occurrence) that is populated by a script?
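For what it's worth, the pattern being described here (keep the raw table untouched and have a script rebuild a small, pre-aggregated reporting table) can be sketched outside FileMaker. This is a minimal Python/SQLite analogy, not FileMaker script steps; the table and field names are invented for illustration:

```python
import sqlite3

# Stand-in for the main data table: many records, each with a category and amount.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")
con.executemany(
    "INSERT INTO orders (region, amount) VALUES (?, ?)",
    [("North" if i % 2 else "South", i * 0.5) for i in range(1000)],
)

# On-the-fly approach: every report aggregates the whole table each time
# (analogous to unstored calculations / summary fields over all records).
live = con.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region"
).fetchall()

# Pre-aggregated approach: a "script" rebuilds a small reporting table,
# so layouts/reports only ever read a handful of summary rows.
con.execute("CREATE TABLE report_region_totals (region TEXT PRIMARY KEY, total REAL)")

def refresh_report(conn):
    """The script's job: wipe and repopulate the reporting table from the source."""
    conn.execute("DELETE FROM report_region_totals")
    conn.execute(
        "INSERT INTO report_region_totals "
        "SELECT region, SUM(amount) FROM orders GROUP BY region"
    )

refresh_report(con)
cached = con.execute(
    "SELECT region, total FROM report_region_totals"
).fetchall()
# cached now matches live, but reading it no longer touches the 70,000-record table.
```

The trade being made is work-at-write-time (the refresh script) in exchange for cheap reads, which is exactly why the "how often should it run" question below matters.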
My second question, directly off the back of that: assuming any sort of "yes" to the above, and in an ever-changing data environment, how often should the script run? You'd think after every change...
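One common alternative to "run after every change" is a dirty-flag approach: each data change only marks the report stale, and the expensive rebuild happens lazily the next time a report is actually opened. A generic Python sketch of that idea (again an analogy, not FileMaker syntax; the names are invented):

```python
class ReportCache:
    """Lazy refresh: mark dirty on each data change, rebuild only when read."""

    def __init__(self, build):
        self._build = build   # function that recomputes the report rows
        self._dirty = True    # start stale so the first read builds
        self._rows = None

    def invalidate(self):
        # Call this from every data-change "trigger": cheap, just flips a flag.
        self._dirty = True

    def rows(self):
        # Call this when a layout/report is opened: rebuild only if stale.
        if self._dirty:
            self._rows = self._build()
            self._dirty = False
        return self._rows

build_calls = []
cache = ReportCache(lambda: build_calls.append(1) or [("North", 125000.0)])
cache.rows()        # first read: builds
cache.rows()        # second read: served from cache, no rebuild
cache.invalidate()  # a record changed somewhere
cache.rows()        # rebuilds exactly once more
```

Ten edits between report views then cost one rebuild instead of ten, which can matter a lot on a 70,000-record aggregation.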
I'm just looking for experience and opinion in this area, as scripting could do a lot of the heavy lifting; it's really a question of whether it's faster at runtime and easier to maintain going forward.
Quite woolly, I know; I just feel I need a conversation about this!