How could you possibly reach that limit on a DB of government features?
What you do with it in FileMaker will determine your performance pain point. I have seen a database with only 10,000 records and some incredibly slow unstored calculations bring FileMaker to its knees. I've also used a FileMaker database with 83 million records that had no unstored fields, and searching through it was reasonably fast. At least your file is narrow, with only 6 fields. I assume your database will grow by importing data; expect FileMaker imports to be really slow, because FileMaker updates each index for each record it imports. It does not do the entire import and then re-index like some big database engines do.
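To illustrate the import-then-re-index point in an engine that does let you control it (FileMaker doesn't), here is a minimal SQLite sketch. The table, column, and row counts are made up for the example; the point is the two strategies, not the exact timings.

```python
# Illustration only, using SQLite as a stand-in for "some big database
# engines": maintaining an index per inserted row vs. dropping the index,
# bulk-importing, and rebuilding the index once at the end.
import sqlite3
import time

def bulk_import(rows, drop_index_first):
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE features (id INTEGER, name TEXT)")
    con.execute("CREATE INDEX idx_name ON features (name)")
    if drop_index_first:
        con.execute("DROP INDEX idx_name")  # import first...
    t0 = time.perf_counter()
    con.executemany("INSERT INTO features VALUES (?, ?)", rows)
    if drop_index_first:
        # ...then re-index once, instead of updating it per record
        con.execute("CREATE INDEX idx_name ON features (name)")
    con.commit()
    elapsed = time.perf_counter() - t0
    count = con.execute("SELECT COUNT(*) FROM features").fetchone()[0]
    con.close()
    return elapsed, count

rows = [(i, f"feature-{i}") for i in range(100_000)]
per_row_time, n1 = bulk_import(rows, drop_index_first=False)
rebuild_time, n2 = bulk_import(rows, drop_index_first=True)
```

FileMaker always behaves like the first strategy, which is why large imports into indexed fields crawl.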
Note that FileMaker Server 15, through Actual Technologies, supports External SQL Sources (ESS) for PostgreSQL, which is new in 15. MySQL has been supported with ESS for years. You can keep the table in FileMaker for now, and if you later move to PostgreSQL or MySQL, set up ESS and change the table occurrence to point at the external database. Very easy to do.
FileMaker is easy to use as a front-end UI for ESS databases. It is not always the fastest, though, and may need some tweaking to work well.
Obviously you will want to have a server with lots of RAM and very fast storage access (SSD / RAID).
Why does it need to be one single file? When file A gets full, start file B and relate them, unless there is a limit on how many records FM can see across files.
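A rough sketch of that "file A fills up, start file B" idea. This is not FileMaker code; in FileMaker the segments would be separate files tied together with relationships, and `SEGMENT_CAP` and the record shape here are invented for illustration.

```python
# Hedged sketch: a thin router that spreads records across
# fixed-capacity "segment files" and can still search all of them.
SEGMENT_CAP = 1000  # made-up per-segment capacity

class SegmentedStore:
    def __init__(self):
        self.segments = [[]]  # each inner list stands for one file

    def insert(self, record):
        if len(self.segments[-1]) >= SEGMENT_CAP:
            self.segments.append([])  # file A is full, start file B
        self.segments[-1].append(record)

    def find(self, predicate):
        # The UI challenge: every search must fan out over all segments.
        return [r for seg in self.segments for r in seg if predicate(r)]

store = SegmentedStore()
for i in range(2500):
    store.insert({"id": i, "name": f"feature-{i}"})
# 2500 records at a cap of 1000 means three segment "files"
```

The fan-out in `find` is exactly why segmenting complicates the UI: every layout, search, and report has to know about all the segments.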
Unless you’re constantly deleting and importing large sets of records, I’d expect you’ll run into the 8 TB per file limit long before you’ll hit the 64 quadrillion record limit.
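Some back-of-envelope arithmetic backs this up. The average record size below is a guess for a narrow 6-field record, but even generous estimates give the same conclusion.

```python
# Why the 8 TB file limit binds long before the record-count limit.
TB = 1024 ** 4
file_cap_bytes = 8 * TB          # FileMaker's 8 TB per-file limit
record_limit = 64 * 10 ** 15     # 64 quadrillion records
avg_record_bytes = 100           # assumed average for a narrow record

records_at_file_cap = file_cap_bytes // avg_record_bytes
# Roughly 88 billion records fill 8 TB, vastly fewer than 64 quadrillion.
print(records_at_file_cap)
print(records_at_file_cap < record_limit)
```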
If you expect 100 million+ records soon and continued rapid growth, I’d strongly consider segmenting the database as greatgrey suggested (although that comes with UI challenges) or moving to a different database engine sooner rather than later.