That shouldn't be a problem. You can adjust the size of the cache as a way to balance how frequently the host saves to disk against the slight slowdown that can occur at the moment it saves, but you shouldn't have to worry about the system "overloading" and then "crashing".
Your best defense against ending up with a damaged file is to follow some basic guidelines:
- Don't put a shared database file in a shared directory where multiple users can open the file directly. Always open it first on a host machine and have clients connect via Open Remote...
- Make frequent backups, keep many sequential copies, and store some of them completely off-site, away from your work location.
- Don't allow third-party backup software to back up your file while it is open.
- Do not open a hosted database and attempt to make structural changes to it while others are using the file. Make such changes on a copy off the server, or while no other users are connected.
- Make sure the host machine has a healthy hard drive.
- If a file does show signs of damage, do not recover the file and then put the recovered copy back into service. Instead, replace the file with an undamaged backup. You can open and save a clone of the backup file, then import all data from the recovered copy so the replacement holds the most up-to-date data possible. With such imports, also make sure you check and update the next serial value settings for any serial number fields.
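To make the "many sequential copies" point above concrete, here is a minimal sketch of a rotating-backup step. It assumes the file being copied is closed (or is a backup copy already produced by the host), since copying an open hosted file is exactly what the guidelines warn against; the function name, paths, and retention count are all illustrative, not part of any FileMaker tooling.

```python
# Hypothetical rotating-backup sketch: copy a closed database file into an
# archive directory with a timestamp, keeping only the newest N copies.
import shutil
import time
from pathlib import Path


def rotate_backup(db_file: Path, backup_dir: Path, keep: int = 14) -> Path:
    """Copy db_file into backup_dir with a timestamp suffix, prune old copies."""
    backup_dir.mkdir(parents=True, exist_ok=True)
    stamp = time.strftime("%Y%m%d_%H%M%S")
    dest = backup_dir / f"{db_file.name}.{stamp}"
    shutil.copy2(db_file, dest)  # copy data and metadata

    # Keep only the `keep` newest copies; delete the rest.
    copies = sorted(backup_dir.glob(f"{db_file.name}.*"),
                    key=lambda p: p.stat().st_mtime, reverse=True)
    for old in copies[keep:]:
        old.unlink()
    return dest
```

Run from a scheduled task against the host's closed backup copies, and periodically move some of the archive directory off-site so a single disk or site failure can't take out every copy.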
Ok cool thanks. I appreciate all that - great advice!