Attempting to upload an 80 GB database file to FMC 17. Runs for many hours, then fails. Anyone else had this issue? Workarounds?
AFAIK, AWS (which is what's behind FMC) comes with a 40 GB disk by default (one can buy more space). Could this be the reason?
Technical Specifications | FileMaker Cloud
Markus, thanks for the suggestion. I saw that coming, though. Already upgraded to 100 GB.
I think I remember seeing a notice that you need to have twice as much free space as the file in order to upload it. So I think you need 160 GB.
I now think I kind of remember hearing that too. I'm giving it a shot (although I can't spot that requirement in any of the documentation I usually google to remind myself how to set up FMC). I'll report back...
I would seriously consider breaking that file up into smaller chunks. If for nothing else then for backup and recovery efficiency.
But that's a longer-term effort.
For this kind of file size I would hook directly into the FMC file system and transfer the file over through SCP instead of relying on the FM upload mechanism. Downside is that you would have to manually set the permissions on the FMC Linux side but overall it should be faster.
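A minimal sketch of what that SCP route could look like. The key file, hostname, and file name below are placeholders, and the database folder path and the fmserver/fmsadmin ownership reflect the usual FileMaker Server layout on Linux; verify them against your own FMC instance before running anything:

```shell
# Copy the file up to the FMC host (placeholder key, host, and file names):
scp -i my-fmc-key.pem MySolution.fmp12 centos@my-fmc-host.example.com:/tmp/

# Then log in and move it into the database folder, fixing ownership so
# the FileMaker Server process can open it (path/user names assumed --
# confirm them on your instance):
ssh -i my-fmc-key.pem centos@my-fmc-host.example.com
sudo mv /tmp/MySolution.fmp12 "/opt/FileMaker/FileMaker Server/Data/Databases/"
sudo chown fmserver:fmsadmin \
  "/opt/FileMaker/FileMaker Server/Data/Databases/MySolution.fmp12"
```

Close the file in the admin console first if a copy is already hosted, and reopen it after the transfer so the server picks it up.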
I'm pretty intimidated by the prospect of fiddling with Linux and SCP since I'm not familiar with either. Maybe a little additional info here could help me find a workaround, though.
I'm converting this solution up from FileMaker 11. The database itself is really only about 800 MB, but it has a large set of related files (PDFs, JPGs, DOCs, etc.) that were being managed with Productive Computing's File Manipulator plug-in. I'm converting the solution to use FileMaker's native externally stored container data for the move to FMC.
My initial plan was to insert the external files into internally stored container fields, upload the single payload to FMC, then reconfigure the container fields for external storage. And it would have worked if it hadn't been for this pesky transfer problem!
Can you (or anyone) suggest a different process?