It's not so much the sheer number of records, but the overall design of the solution (wide tables, busy layouts, calculations, ...), plus the bandwidth/latency (if you have WAN users), and potentially the specs of your server. And the number of concurrent users and what they do in the solution.
There is no simple answer here, there are many factors that play a role.
Rough estimate: you are talking about 2,500,000 records per month if we start from 5,000 loans each month. That is a lot, so start with the design of your solution: make sure that things only calculate once, and store the results instead of using calculated fields. Be very careful with layouts that show a lot of related data, etc.
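To make "calculate once and store the results" concrete, here's a rough sketch in FileMaker script steps. All field and script names here are hypothetical, not from this thread; the point is that the math runs once (on create/edit, via a script or trigger) and lands in plain number fields instead of unstored calc fields.

```
# Hypothetical script "Recalc Loan Totals" — run once when the loan
# record is created or edited, instead of using calculated fields.
Set Field [ Loans::MonthlyPayment ;
    Round ( Loans::Principal * ( Loans::Rate / 12 ) /
        ( 1 - ( 1 + Loans::Rate / 12 ) ^ ( - Loans::TermMonths ) ) ; 2 ) ]
Set Field [ Loans::TotalPayable ; Loans::MonthlyPayment * Loans::TermMonths ]
Commit Records/Requests [ With dialog: Off ]
```

Because the results are stored in regular fields, list layouts and finds on those fields stay fast; the cost is that you must remember to re-run the script whenever an input (principal, rate, term) changes.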
To be clear,
- Does "store the results" mean scripting the calculations and using Set Field to insert the results one time (manually or programmatically triggering a recalc), rather than having them calculated on the fly via calculated fields, or...?
- What do you mean by "very careful" with layouts that show a lot of data? I had thought I'd show the related data (cost itemizations and amortization loop results) in portals. Is there a better way?
Wim's advice is sound. Also, if you're dealing with a networked solution, be careful about the width of your tables (how many fields each table has). The more fields you have (generally speaking), the slower each record will load (because all fields load, with some exceptions, regardless of whether they're displayed). In a situation where you have a portal, the entire related set will try to load, so this could hit your performance.
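One common way to keep tables narrow (a sketch with hypothetical table and field names, not a prescription): split rarely-viewed or heavy fields (containers, long notes) into a second table joined one-to-one, so list layouts only pull the slim "hot" table.

```
# Hypothetical schema split — keep the frequently loaded table narrow.
Loans        ( LoanID, Borrower, Principal, Rate, Status )
# loaded on list layouts; small, fast per-record fetch

LoanDetails  ( LoanID, ScannedContract, InternalNotes, AuditLog )
# one-to-one side table; loaded only on the detail layout

# Relationship: Loans::LoanID = LoanDetails::LoanID
```

The trade-off is an extra relationship and a second record to create per loan, but WAN users browsing lists no longer download the heavy fields on every row.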
Also: Wim mentioned avoiding calculated fields. This is especially true in the case of aggregated calculations (Sum, Count) that work across a found or related set. These force FileMaker to load the entire set of records affected. If you're loading 2.5M records ... ouch.
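A common workaround for that (again a sketch, hypothetical field names): instead of an unstored calc field like Sum ( LoanRows::Payment ) that evaluates across the whole related set every time the layout draws, run a script that computes the aggregate once and stores it in a plain number field.

```
# Hypothetical script "Refresh Loan Total" — store the aggregate once
# instead of forcing Sum() to pull the entire related set on every draw.
Set Field [ Loans::StoredPaymentTotal ; Sum ( LoanRows::Payment ) ]
Set Field [ Loans::TotalRefreshedAt ; Get ( CurrentTimestamp ) ]
Commit Records/Requests [ With dialog: Off ]
```

You can run it from a script trigger when related rows change, or as a scheduled server-side script overnight; either way, displaying the total no longer touches the 2.5M-row set.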
Hi, I'm curious about the outcome of your database solution. Does FM handle this amount of data well?