I have an Archive file where the tables store millions of records. One trick for working with report layouts based on these tables is to use global fields to collect user-entered criteria for the desired report. When the user clicks a button to request a report, a script opens a small, dialog-style window on a layout not based on these tables and displays the global fields. The user enters their criteria and clicks a button to see the report. The script closes the window, enters Find mode, and only then goes to the needed layout, where it performs a find similar to those described here: Scripted Find Examples
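As a rough sketch, such a script might look like the following. All layout, table, and field names here are hypothetical--adapt them to your own file:

```
# Hypothetical "Show Report Dialog" script
New Window [ Style: Dialog ; Name: "Report Criteria" ]
Go to Layout [ "Report Criteria" ]     # small layout, not based on the big tables, showing only the globals
Pause/Resume Script [ Indefinitely ]   # the "See Report" button runs a Resume Script step
Close Window [ Current Window ]
Enter Find Mode [ Pause: Off ]         # enter Find mode BEFORE touching the report layout
Go to Layout [ "Invoice Report" ]      # layout based on the large Archive table
Set Field [ Invoices::InvoiceDate ; Invoices::gStartDate & "..." & Invoices::gEndDate ]
Set Error Capture [ On ]               # so "no records found" doesn't throw the default dialog
Perform Find []
```

Because the layout based on the big table is never visited in Browse mode, FileMaker doesn't have to render a huge found set before the find narrows it down.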
This avoids accessing the layout in Browse mode--where your found set could easily be all records or a very large subset of them. The combination of global fields and a scripted find also limits search criteria to fields that are indexed, which produces pretty fast results. But any search that produces a large set of data--such as all records for a year or more--will still produce delays while you wait for summary fields to update.
If I do need to search on an unindexed field, I use the trick described here: 400,000 Records. Need help with performence
Another method I use to get faster results for reports is to set up a system that creates a separate report table where many records are "condensed" into a much smaller group of records, with subtotals calculated from the original tables and stored in simple number fields. In my situation, this can condense several thousand "line item" records created in a single day down to about a dozen or fewer. This method won't work for every type of data you might collect in a database, nor for every possible report you might need, but when it does apply, the results can be many times faster than generating a report from the original table.
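A minimal sketch of a nightly "condense" routine, assuming a LineItems source table with a summary field (sTotalAmount, defined as Total of Amount) and a DailyTotals report table of plain number fields--all names are examples, not the actual system:

```
# Hypothetical "Condense Yesterday" script, run after the day closes
Go to Layout [ "LineItems" ]              # source table: thousands of records per day
Enter Find Mode [ Pause: Off ]
Set Field [ LineItems::Date ; Get ( CurrentDate ) - 1 ]
Perform Find []                           # found set: yesterday's line items
Set Variable [ $total ; LineItems::sTotalAmount ]   # summary field evaluates over the found set
Set Variable [ $count ; Get ( FoundCount ) ]
Go to Layout [ "DailyTotals" ]            # report table of stored number fields
New Record/Request
Set Field [ DailyTotals::Date ; Get ( CurrentDate ) - 1 ]
Set Field [ DailyTotals::ItemCount ; $count ]
Set Field [ DailyTotals::Total ; $total ] # plain stored number--no summary delay at report time
Commit Records/Requests []
```

This sketch writes one subtotal record per day; to get a dozen or so per day, you would repeat the find per category (or sort the found set and walk the groups with GetSummary), writing one DailyTotals record per group. Reports then run against the small table of stored numbers instead of recalculating summaries over millions of records.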