Timing might be a little tricky, but you could run an OS batch file to copy the file from backup over to the necessary directory. That could be coupled with a server-side schedule to create the backup when you needed it.
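A minimal sketch of what that OS-level step might look like, assuming a POSIX shell; the folder paths and the `copy_latest_backup` helper name are hypothetical and would need to match your actual FileMaker Server backup layout:

```shell
#!/bin/sh
# Hypothetical helper: copy the most recent .fmp12 backup into the
# directory your FileMaker script's Insert File step points at.
# Both paths are assumptions -- adjust for your server.
copy_latest_backup() {
    backup_dir="$1"
    target_dir="$2"
    # ls -t sorts by modification time, newest first; take the first match.
    latest=$(ls -t "$backup_dir"/*.fmp12 2>/dev/null | head -n 1)
    [ -n "$latest" ] && cp "$latest" "$target_dir/"
}

# Example (hypothetical paths):
# copy_latest_backup /opt/FileMaker/Backups /opt/FileMaker/Transfer
```

You would schedule this just before the server-side FileMaker script runs, so the freshest backup is sitting in the target directory when the script goes to insert it.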
I have a script which is executed on the server. In this script, I would like to insert a copy of my FileMaker database into a container field.
Because this turns your whole solution into just "data", exposed to anyone who can get at that data, I would think really hard about whether this is actually necessary.
I can think of a possible use case for this: say you have a database that gets synced to a hosted copy, and you want to deploy updates to desktop and FileMaker Go clients as a clone of your hosted database. This might be a job for a robot-type setup (an unattended client running on a schedule) where you open the hosted file and insert a copy from the server backups.
Just make sure you aren't inserting a copy of the database that itself has a container field holding another copy of the database, or this starts to become like one of those Russian nesting dolls where each doll contains a smaller doll.