I have a file with 4 index fields and 2 data fields to document government features.
The data fields hold the feature's name and a text description of the feature itself.
The four index fields cover which government has the feature, a feature UID, the type of feature, and the source.
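For reference, here is roughly how I picture the table once it lands in PostgreSQL or MySQL. The column names are my own placeholders, and I'm sketching it with Python's sqlite3 module just so it runs anywhere; the eventual DDL would be PostgreSQL.

```python
import sqlite3

# Stand-in schema for the eventual PostgreSQL/MySQL table.
# Column names are hypothetical placeholders for my real fields.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE government_feature (
    feature_uid   INTEGER PRIMARY KEY,  -- index field 1: feature UID
    government    TEXT NOT NULL,        -- index field 2: which government
    feature_type  TEXT NOT NULL,        -- index field 3: type of feature
    source        TEXT NOT NULL,        -- index field 4: source
    feature_name  TEXT NOT NULL,        -- data field 1: name of the feature
    feature_text  TEXT                  -- data field 2: description (text)
);
CREATE INDEX idx_government ON government_feature (government);
CREATE INDEX idx_type       ON government_feature (feature_type);
CREATE INDEX idx_source     ON government_feature (source);
""")
```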
I anticipate hitting FileMaker's upper limit (64 quadrillion records over a table's lifetime) and needing to move this table to PostgreSQL or MySQL within a few years. What I'm not sure of is what "lifetime records" means and how much that is likely to shave off the theoretical limit. I'm assuming that if you add a record, delete it, and then add it again, you have used up 2 of your limit.
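To make my assumption about "lifetime records" concrete: I'm treating the limit as a counter that every insert consumes and that a delete never refunds. A trivial sketch of that model (my assumption, not confirmed FileMaker behavior):

```python
# My assumed model: each INSERT permanently consumes one slot from
# the table's lifetime limit; DELETE frees the row but not the slot.
LIFETIME_LIMIT = 64_000_000_000_000_000  # 64 quadrillion

inserts = 2   # add the record, delete it, add it again = 2 inserts
consumed = inserts  # deletes don't restore the counter in this model

print(consumed)            # -> 2 slots of the lifetime limit used
print(LIFETIME_LIMIT - consumed)  # slots remaining, for 1 surviving record
```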
Would saving an empty copy (a clone) and importing the large table into it reset the lifetime count, so that it equals the number of records currently in the table?
At what point would it be wise for a table like the one described above to shift out of FileMaker and into a heftier database engine, with FileMaker moving to the role of a front end for accessing that data?