Get rid of the unstored calc and set the number in a scripted fashion based on the workflow. That will make it fast.
Inside a loop? Tried this; it was marginally quicker, but it still took a long time to loop through the 7k product records.
Sorting large sets by unstored calculations will never be fast on a hosted solution. There are workarounds though:
I think there's a problem with your database design - I'd argue that the sum of sales doesn't belong in the Products table. And client_id in the Products table? Doesn't seem right to me.
Instead, you should collect those values as summary fields in the table where the actual values live. For a typical total sales count that would be something like a LineItems table. There you could have a "quantity" field and an accompanying summary field that totals those amounts, like "LineItems::quantity_sum". Make sure that LineItems::quantity is an indexed field, though - this is important.
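As a sketch, the field setup might look like this (the field names are placeholders, not anything from your file):

```
LineItems::quantity       Number   - stored, indexed
LineItems::quantity_sum   Summary  - Total of LineItems::quantity
```

The summary field carries no cost until it is actually displayed or evaluated, and because quantity is indexed the totals resolve quickly even over a relationship.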
Now you have all kinds of options for reports and different ways to count your sales. Want to see the total sales of a single product? Just put LineItems::quantity_sum on your Products layout - no need for a separate calculation field. Here comes the interesting part: the sum you see depends on the relationship between the Products and LineItems tables. The most basic case is a relationship on product_id.
But think further: I always have constant "true" and "false" fields in all of my tables, whose values are 1 and 0 respectively. I can use those fields to build relationships that filter on certain conditions and flags. Want to see the total quantity of all products that haven't been delivered yet? Make another relationship between Products and LineItems - this time not just on product_id, but with the added predicate Products::false = LineItems::delivered (assuming delivered stores 1 once an item ships). You have just summed a large collection of data very quickly, filtered by whatever conditions you can think of.
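A sketch of such a filtered relationship, assuming a LineItems::delivered flag that is 0 until an item ships (names are illustrative):

```
Relationship: Products --< LineItems_Undelivered
    Products::product_id  =  LineItems::product_id
    Products::false       =  LineItems::delivered
```

Placing LineItems_Undelivered::quantity_sum on a Products layout then shows the undelivered quantity only, with no script and no unstored calc.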
Back to your initial problem: keep the sums and totals where the underlying values live - I guess that would be the Jobs table for you. Now make a list-view report based on that table, create a sub-summary part for product_id, and sort by product_id, reordering based on a summary field like quantity_sum or whatever you call it. The "Reorder based on summary field" option is just below the default ascending and descending options in the sort dialog.
Now you have the report you were looking for, and it is fast as long as your quantity field is indexed.
Wim - would be keen to see how this can be achieved.
First - record level is fine
We have a script that populates the field based on the workflow, so it works well on a record-by-record basis.
However if we want to view the top ten values for a customer then we need to loop using the script and that still takes too long.
Any advice much appreciated.
Not sure what loop you are referring to... the idea is that the sort can stay, but that the field that is now a calc gets set during the regular workflow.
Whenever a job is entered, a script updates the job total. There are a few caveats here to account for record locks and so on, so it is more work overall than just creating a calc. But it is going to be vastly faster.
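A minimal sketch of such an update step, assuming a stored Products::total_sales number field and a Jobs::amount field (both names are assumptions); the lock handling is deliberately simplified:

```
# Script: Update Product Total  (run when a job record is committed)
Set Error Capture [ On ]
Set Field [ Products::total_sales ;
            Products::total_sales + Jobs::amount ]
If [ Get ( LastError ) ≠ 0 ]
    # Related record is locked by another user -
    # pause briefly and retry, or hand the update
    # off to a server-side script
End If
Commit Records/Requests [ With dialog: Off ]
```

Because total_sales is now a plain stored, indexable number, finds and sorts on it are fast regardless of record count.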
To get the top 10: you can sort descending (which will now be fast) and then use a short loop that uses GetNthRecord so that FM does not have to move between records. That's fast. Or do an ExecuteSQL to produce a sorted list in a variable and then LeftValues on that result...
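The ExecuteSQL variant could look like this (table and field names are assumptions based on the thread):

```
Let ( [
    // product_id list sorted by the stored total, highest first
    sorted = ExecuteSQL (
        "SELECT product_id FROM Products ORDER BY total_sales DESC" ;
        "" ; "" )
] ;
    // keep only the first ten return-separated values
    LeftValues ( sorted ; 10 )
)
```

No sorting or record navigation happens on the layout at all; the result lands in a variable and can drive a virtual list or a Go to Related Record step.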
How up-to-date must your report be? It seems like it's a job that could be run once a day, perhaps at the same time that you are archiving jobs. If you generate the data overnight the totals can be stored in the product record. Finding and sorting will then be very quick.
Presumably your "true"/"false" fields are globals?
Well, you'd think so, but globals have the disadvantage of having to be set after every restart, for every user. This would be fine for only 2 fields in the whole database, but I use these fields in every table so they are available to various relationships - so it's more like 2 * n, where n is the number of tables. That's why it is a lot simpler to just create 2 calculation fields in every table that evaluate to 1 and 0 respectively. Or use 2 simple number fields with auto-enter data (again, 1 and 0) - just make sure they are stored and indexed either way, and never empty, or your relationship will fail or be slow.
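A sketch of the two constant fields, repeated per table (either variant works; pick one and keep it consistent):

```
true    Calculation ( = 1 )  - result Number, stored, indexed
false   Calculation ( = 0 )  - result Number, stored, indexed

# or, equivalently:
true    Number - auto-enter calculated value 1, indexed
false   Number - auto-enter calculated value 0, indexed
```

Unlike globals, these are stored per record, so they need no session setup and can sit on either side of a relationship predicate.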