You'd need to explain what you mean by "a summary of a current position". What kind of data do you display at that point? What needs to be summarized? Into what kinds of totals?
How many users might be expected to access this table at the same time?
600,000 is starting to become a large number of records for a FileMaker table--enough to start taking precautions against unnecessary delays. But there's not enough detail here to say much. One key factor is not the total number of records in your table, but rather the typical number of records you will need in a found set at any given time.
Sorry for being so vague!
Basically, it is a very simple stock management system. One table would house all stock movements from location x to location y, by user and time (the third table I mention above). I then want a table/portal that looks at all of those transactions for each individual location, takes the last record for each location, and produces a report that summarises the stock holding by location for the user.
There could be instances where multiple people will be looking at the data.
Hope this makes more sense?
But how many "multiple people"? 2? 5? 100? 500?
Sounds like you have a setup similar to what is discussed here: Managing Inventory using a Transactions Ledger
Answering from that context, you have two ways to get such a report:
Do a summary report where each row on the report is a sub-summary part with a summary field totalling the transaction changes to give you a current total. Each subtotal can be for a single inventory item or for a single inventory item at a specific location. This type of report is done on a layout where the body layout part has been removed and at least one sub-summary layout part has been added.
Or you make the report from a layout based on a table where you have one record for each inventory item (or one record for each item-location combination). A relationship totals up the transactions matching each record in this table.
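Either way, the underlying aggregation is the same: current stock for each item/location is the sum of all its signed transaction changes. Here's a rough sketch of that logic in plain Python (field and table names are invented for illustration; this is not FileMaker script syntax):

```python
from collections import defaultdict

# Hypothetical transaction ledger rows: each stock movement is recorded
# as a signed quantity change for an item at a location.
transactions = [
    {"item": "Widget", "location": "Warehouse A", "qty_change": +100},
    {"item": "Widget", "location": "Warehouse A", "qty_change": -30},
    {"item": "Widget", "location": "Shop 1",      "qty_change": +30},
    {"item": "Gadget", "location": "Warehouse A", "qty_change": +50},
]

# Current stock per item/location = sum of all its transaction changes.
# This is what the sub-summary report (or the per-record relationship
# total) computes inside FileMaker.
totals = defaultdict(int)
for t in transactions:
    totals[(t["item"], t["location"])] += t["qty_change"]

for (item, location), qty in sorted(totals.items()):
    print(f"{item} @ {location}: {qty}")
```

The cost of this is what grows with the ledger: every report run has to touch every transaction row again.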
As the total number of records increases, pulling up reports of these types takes longer and longer to evaluate, and the increase is not linear: doubling the number of records more than doubles the time needed.
There are two basic approaches that can speed this up; the first is simpler to implement than the second.
The main purpose of such a transactions log, as far as I can see, is to be able to track how subtotals change over time. Totals that steadily climb to larger and larger amounts, or that keep dropping to zero or near zero, typically flag cases where you may need to make an adjustment. Checking current totals does not require a table of all the transaction records, so there are two ways around it that I know of:
1) Since this table is set up like a bookkeeping ledger, you can periodically export the data to another file to keep it for any of the above analysis, and replace the exported transactions with a "balance forward" entry. So at regular intervals, or when you start to observe unacceptable delays, you'd run a script that exports this data, adds in "balance forward" records for each inventory item (or each item-location combination), and then deletes the exported records.
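As a rough model of that condensing step (plain Python standing in for the FileMaker script; the function and field names are invented): archive the existing rows, write one "balance forward" row per item/location carrying the running total, then drop what was archived. Current totals are unchanged, but the live table shrinks to one row per line of the report.

```python
from collections import defaultdict

ledger = [
    {"item": "Widget", "location": "Shop 1", "qty_change": +30},
    {"item": "Widget", "location": "Shop 1", "qty_change": -10},
    {"item": "Gadget", "location": "Shop 1", "qty_change": +5},
]

def condense(ledger):
    """Replace all transactions with one balance-forward row per
    item/location. Returns (new_ledger, archived_rows)."""
    archived = list(ledger)  # the "export" step: keep history elsewhere
    totals = defaultdict(int)
    for row in ledger:
        totals[(row["item"], row["location"])] += row["qty_change"]
    new_ledger = [  # one balance-forward entry per item/location
        {"item": item, "location": loc, "qty_change": qty,
         "note": "balance forward"}
        for (item, loc), qty in totals.items()
    ]
    return new_ledger, archived

new_ledger, archived = condense(ledger)
```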
2) Using that report table mentioned earlier, where you have one record for each line of your report, put a simple number field into this table and update its value with every addition, change, or removal of a record in the transaction table. Script triggers and delete buttons would perform scripts that calculate the new totals/averages etc. and update the number fields in the related record in this table. This method produces very fast reports, but requires meticulous design work to make sure that there is no way to change the data without the correct script making the correct update in the report table.
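The bookkeeping in option 2 can be sketched like this (again illustrative Python rather than FileMaker script steps; the report table is modeled as a dict). The key discipline is that every add, edit, and delete of a transaction goes through one of these routines, so the stored totals never drift from the ledger:

```python
report_totals = {}  # stands in for the one-record-per-line report table

def add_transaction(item, location, qty_change):
    # Adding a transaction bumps the stored total for its item/location.
    key = (item, location)
    report_totals[key] = report_totals.get(key, 0) + qty_change

def delete_transaction(item, location, qty_change):
    # Deleting a transaction reverses its effect on the stored total.
    add_transaction(item, location, -qty_change)

def edit_transaction(item, location, old_qty, new_qty):
    # An edit is equivalent to removing the old value and adding the new.
    add_transaction(item, location, new_qty - old_qty)

add_transaction("Widget", "Shop 1", 30)
add_transaction("Widget", "Shop 1", -10)
edit_transaction("Widget", "Shop 1", -10, -5)
```

Reports then just read `report_totals` directly, which stays fast no matter how large the transaction history grows.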
Sorry, I would not think more than 7 or 8 people at any one time.
Many thanks for that Phil. I will start to have a look at the summary reports you talk about.
In terms of keeping the transactional ledger tidy, I had considered exporting "used" data like you mentioned. In terms of optimal database performance: do you have a ballpark figure for how many transactions a find/search would have to go through before performance starts to suffer?
Many thanks for your help
There are too many variables that will affect your results. The best way is to have the script ready to use when needed and to put the file into regular use. When you or your users start seeing unacceptable delays, run your script after close of business; after doing this a few times, you'll have a means of determining whether you want to do such "condensing" on a regular schedule.