I wouldn't call it "easy" in all cases. It can be done, but depending on the size and complexity of your databases, it may be anything but easy.
One option for simplifying your updates is to split the database file into two parts: Data and Interface. This is called the data separation model and I'll include a link to a thread where I spell out how to split an existing file if you choose to do this. With a separate interface file, you can just swap out interface files.
Changes that incorporate "new fields, tables, relationships", however, still require uploading a new copy of the data file, and if data has been modified since you acquired the copy you modified to produce the new version, you have to import the data from all the tables in your old version into the matching tables of the new one. You also have to update any "next serial value" settings to prevent duplicate serial numbers in the updated data file. Fortunately, all of this can be scripted, so you can download backup copies to your development machine, make your changes, and then the only "nights and weekends" part of the process is uploading the new file and kicking off your import script.
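For what that import script might look like, here's a minimal sketch in FileMaker script steps. The table, field, and file names (Customers, OldData) are assumptions for illustration; you'd repeat the block once per table and adapt the serial-value field to your own key field:

```
# Run in the NEW data file, per table. Assumed names: layout "Customers",
# old data file "OldData", serial key field Customers::ID.
Set Error Capture [ On ]
Go to Layout [ "Customers" ]
Show All Records
Delete All Records [ With dialog: Off ]
Import Records [ With dialog: Off ; "OldData" ; Add ; Matching names ]
# Reset the auto-enter serial so new records don't collide with imported IDs
Show All Records
Sort Records [ With dialog: Off ; Customers::ID ; descending ]
Go to Record/Request/Page [ First ]
Set Next Serial Value [ Customers::ID ; Customers::ID + 1 ]
```

The key step is Set Next Serial Value at the end of each table's block; importing alone does not advance the auto-enter serial counter, which is exactly how duplicate serial numbers sneak into an updated data file.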
Any time I make such changes, I keep the original data file handy so that I can quickly swap it in if I discover a critical issue with the new version that escaped my testing...
Your sandbox file idea is easy: just create a file with an external data source reference to the mother file. Create your table occurrences in the relationship graph by pointing them at tables in the mother file. The TO name will appear in italics to indicate an external file, and it will work just like a TO you create on a local table.
A second idea is to clone your Mother File and then use that clone to open the tables in the hosted mother file. Double-click each TO and point it at the matching table in the remote Mother File, along with any related tables.
See below for working live...
It is possible, with suitable precautions, to make live edits on a file being served by FileMaker Server, and this is way easier than developing a second file and then applying those changes to the served file... wait, isn't that making active changes in the file, so we are back to the first statement...
You will have to make any changes to fields, or add new tables, directly in the files on the server, and modifying fields requires that no one else is accessing that table. Having everyone avoid that table for a few minutes, or making these changes after hours, is rather easy. I have done hot edits during working hours, even remotely, making instant updates to a client's database. Never had to import the data, and don't want to.
Live updates require a good backup and I always forced the backup before beginning.
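One way to force that backup on a hosted file is FileMaker Server's command-line tool. A sketch, assuming fmsadmin is on your PATH and the file name, destination path, and credentials shown are placeholders for your own:

```
# Force an immediate backup of the hosted file before starting live edits
fmsadmin backup "Mother" -d "filemac:/Volumes/Backups/PreLiveEdit/" -u admin -p secret

# Alternatively, kick off an existing backup schedule by its number
fmsadmin run schedule 1
```

Either way, confirm the backup completed before touching a field definition; the whole point is having a known-good copy ready to swap in.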
If you use the two-or-more-file paradigm, it is easy to create a second shadow copy to be modified and tested (try not to enter new records), and then, when perfected, swap it in on the server after hours.
Also, files can be created for individuals or specific purposes that work with the Mother File. I've made a file for the Owner, Bookkeeping, Sales, etc. that only had their passwords in them. If a salesman lost his computer, the file only opened the sales accounts so we didn't have to worry about possible access to all accounts. The single purpose file also prevents someone from harvesting passwords and accounts and using any or all of them.
"Live updates require a good backup and I always forced the backup before beginning"
That's absolutely essential. The risk here is that a network glitch occurring just when you save a structural change back to the server can corrupt your working file, with a resulting major interruption to the accessibility of your database while you swap the damaged copy for the backup. Even those few minutes of downtime may represent an unacceptable delay for some systems, such as the "point of sale" type system we run here with FileMaker.