Since there's no easy way to tell what was changed, or whether that change truly repaired the file back to "factory specs", the safest option is to make frequent backup copies of your file. Then, when you get this kind of message, start recovering successively older copies of your file until one comes up "no problems found". Then import your data into a clone of this unrecovered backup copy.
Note: I've seen recovered files behave differently from the unrecovered copy even after the Recover process reported "no problems found".
"Then import your data into a clone of this unrecovered backup copy."
I've seen that statement at least 100 times in researching FileMaker database corruption issues. I know it's the "standard, all-encompassing recommendation". And it usually comes with the caution to be sure and export to text files first, because importing directly from a FileMaker database could propagate the corruption. Oh, and add the restriction that container fields can't be exported.
Do you think the key people on the FileMaker development team understand the great difficulties those simple instructions entail for databases with 1000 fields spread over 100 table definitions? The probability of losing information somewhere during the process is not insignificant. And then there's the time it requires.
Knowing exactly what is involved in one of these "clone recoveries" is what leads me to seek a more detailed explanation of the three structure adjustments the Recover process reported to me. More detail could do a lot of good -- for me and for others.
So, do you know of any possible sources of such information? Maybe a whitepaper somewhere?
"And it usually comes with the caution to be sure and export to text files first, because importing directly from a FileMaker database could propagate the corruption. Oh, and add the restriction that container fields can't be exported."
That's not exactly what I suggested. You'll note that I made no suggestion of exporting to intermediate text files. Importing your recovered data from a FileMaker file into the clone should be "safe" in most cases. This avoids the issue of being unable to export container fields to text files, and it is a process that can be scripted to reduce the inconvenience a bit.
It's a matter of balancing risk vs. benefit. Many developers believe that exporting the data into a merge or other text file and then importing into the clone is a bit safer. I've yet to encounter a situation where that actually eliminated an issue. Maybe I've been lucky. Given the difficulties inherent in this approach, I choose to import directly from the recovered file and then monitor the file carefully for a while. (My systems are backed up daily and I keep about a month's worth of daily backups on file, with one copy per month archived forever.)
Honestly, there have been times when I've simply used the recovered file after the Recover process reported "OK to use". Sometimes it's just too costly in terms of time and effort to restore the data back into a clone. I don't recommend this to others, because I know that I am assuming a risk that is only partially mitigated by the frequent automated backups and careful monitoring that I do.
Thank you for your posts.
There are numerous parts to a FileMaker database file, and the more often information is written to a file, the greater the possibility of incorrect information being written. The Recover command will do its best to fix a database file, but as you have heard from others, it is not a cure-all. That is, some problems may remain with the file. However, the majority of the time, the recovered file is usable. Still, because there is a possibility of an underlying issue, having a backup of the file is the only way to be certain.
The Recover.log file will provide more information, but as you discovered, it does not provide enough. Sometimes, during Recover, FileMaker cannot recognize what type of object is being referenced (a field, a graphic, the type of graphic, etc.), so the object is either removed or renamed (as can happen with a field name).
Yes, 1000 fields spanning 100 tables is definitely significant, and having additional information would be helpful.
I highly encourage you to enter this into our Feature Suggestion web form at:
I could copy your posts and paste them into the web form, but there are additional questions asked that only you can answer.