We have a FileMaker 12 file that is basically an interface file. The other day I noticed the file size was 8.5 GB and wondered what the heck!
Of the three tables in the file, one was used to store container data externally in open storage mode (only about 100 records).
So, here's what I did to try to find the cause and remove the bloat:
1) Saved the file as a clone - file size dropped to 19,056 KB - much more in line with expectations for an interface file
2) Manually removed all of the records and saved the file as a compressed copy - file size only dropped to 5.9 GB
3) Recovered the original file with the advanced options set to scan and rebuild scripts, layouts, and fields; rebuild indexes; and delete cached settings - file size was still 5.9 GB
4) Manually deleted all records from the recovered file, then saved as a compressed copy - file size remained 5.9 GB
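When comparing the copies produced by the steps above, it can help to check the exact byte counts rather than eyeballing sizes in the Finder/Explorer. Here's a small shell sketch; the `.fmp12` filenames are hypothetical stand-ins for whatever your clone, recovered, and compressed copies are actually named:

```shell
# report_sizes: print the byte size of each existing file argument.
report_sizes() {
  for f in "$@"; do
    [ -f "$f" ] || continue
    # GNU stat (-c) first, BSD/macOS stat (-f) as a fallback.
    size=$(stat -c%s "$f" 2>/dev/null || stat -f%z "$f")
    printf '%12s bytes  %s\n' "$size" "$f"
  done
}

# Hypothetical copies from steps 1-4 above; substitute your own names.
report_sizes Interface.fmp12 Interface_Clone.fmp12 \
             Interface_Recovered.fmp12 Interface_Compressed.fmp12
```

Running this on each copy as you go gives you a precise before/after trail of exactly which operation reclaimed space.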
So my questions are: why is the clone so much smaller than the recovered, record-free, index-rebuilt, compressed copy? Where is the bloat coming from that creating a clone removes but cleaning house does not? I need to start hunting this down for another file that is used for data, and I don't want to have to clone it and re-import millions of records across hundreds of tables.