While such a sync process is possible, implementing it could become very complex and difficult to maintain. Imagine trying to manage a situation where all three sites modify the same record in different ways, and your system then has to merge the changes so that none of the updates are "lost".
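To make the merge problem concrete, here is a toy sketch (not FileMaker script; the field names and structure are invented for illustration). Each site edits its own copy of one record against a common base version; the merge applies fields changed by only one site, but flags fields that two sites changed to different values:

```python
def merge_sites(base, edits):
    """Merge per-site edits of one record against a common base version.
    Fields changed by exactly one site are applied; fields changed to
    different values by multiple sites are reported as conflicts
    (the first change seen is kept in the merged record)."""
    merged = dict(base)
    changed_by = {}   # field -> (site, value) of the first change seen
    conflicts = {}    # field -> list of competing (site, value) pairs
    for site, version in edits.items():
        for field, value in version.items():
            if value == base.get(field):
                continue  # this site left the field alone
            if field not in changed_by:
                changed_by[field] = (site, value)
                merged[field] = value
            elif changed_by[field][1] != value:
                # two sites changed the same field differently: conflict
                conflicts.setdefault(field, [changed_by[field]]).append((site, value))
    return merged, conflicts
```

Even this simplified version has to make an arbitrary policy decision (keep the first change) whenever two sites disagree, which is exactly the kind of logic that becomes hard to maintain at scale.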
There are ways to improve performance for your WAN users. Essentially, it comes down to reducing the amount of data that has to make the trip from the server to the client machine each time the window refreshes. If you can simplify layouts to reduce the number of unstored calculations, conditional formats, summary fields, etc., this can improve system responsiveness. Likewise, reducing the number of records in your found set can improve performance in list and table views, as well as on layouts with summary fields.
Reducing the number of container fields or the resolution of images stored in them may improve performance.
For reporting purposes, it can even be useful to set up a system that "pre-computes" as much of the data as possible. In our system, we have millions of line item records and a report that can compute totals and averages over up to 5 years of data at a time. We use a special summary table where the individual line item records from our invoices are "condensed": each type of item listed on the report for a day's transactions becomes one record, with the daily total stored as a simple number. Our long-term summary reports then reference this table and come up much more quickly due to the greatly reduced calculation load. The more "static" your data is, the more practical this approach is to implement. Invoice data doesn't change once it's printed and the money changes hands (except to void an invoice entirely), so it works well for us.
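The condensing step described above can be sketched in a few lines (this is illustrative Python, not the poster's actual FileMaker schema; the record fields are made up):

```python
from collections import defaultdict

# Hypothetical invoice line items: (date, item_type, amount).
line_items = [
    ("2024-01-15", "widget", 10.0),
    ("2024-01-15", "widget", 5.0),
    ("2024-01-15", "gadget", 7.5),
    ("2024-01-16", "widget", 3.0),
]

def condense_daily(items):
    """Condense line items into one summary record per (date, item_type)
    with a precomputed daily total. Reports read these few records
    instead of re-summing millions of line items on every run."""
    totals = defaultdict(float)
    for date, item_type, amount in items:
        totals[(date, item_type)] += amount
    return [
        {"date": d, "item_type": t, "daily_total": total}
        for (d, t), total in sorted(totals.items())
    ]

summary = condense_daily(line_items)
```

The trade-off is that the summary table must be appended to as part of the daily workflow, which is only practical when the source data (here, printed invoices) no longer changes.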
A final option you can test is to use the data separation model and put the interface file on the local client machines instead of on the server. In theory, this would reduce the amount of data that needs to be transmitted to "paint" the layout elements, since they would come from the local file, but I don't know how significant a performance improvement this would produce.
For more on the data separation model: Convert to Separation Model
Both your tips about reporting and using a separate front end on the clients' machines are very interesting... I think the separation model may work particularly well for this solution, as the front end is fairly intensive, with fairly high amounts of conditional formatting and some container fields. Thanks for the advice.
I doubt that a local front end will be any faster when it contains "fairly high amounts of conditional formatting and some container fields". You'll hit the same bottleneck you have now. Where the local front end may render more quickly is in the non-data graphic elements that make up the layout's design, such as the size, appearance, and location of buttons, graphics pasted onto the layout background, etc.
Thank you for your post.
Whenever you access a file across a WAN, the transfer of information is going to take longer than it would on a LAN. One option is to limit the number of objects/fields on a layout so not as much information is transferred. That is, only display the essential information for each record, but have an option to display more information if needed. This would help performance. Another option is to use custom web publishing where the information is transferred and displayed via HTML.
FileMaker does not have a way to update multiple servers at one time. If you don't need the information real-time, you could download the data to each of the computers at the beginning of the day, and then have a script at the end of the day that would update the host machine and then copy back to the satellite servers.
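The end-of-day update described above can be sketched as follows (illustrative Python, not FileMaker script; it assumes each record carries a unique id and a modification timestamp, which are invented details):

```python
def end_of_day_sync(host, satellites):
    """Push each satellite's changes up to the host, keeping the most
    recently modified version of each record, then copy the merged
    host data back down to every satellite server."""
    # Step 1: update the host with any newer records from the satellites.
    for sat in satellites:
        for rec_id, rec in sat.items():
            existing = host.get(rec_id)
            if existing is None or rec["modified"] > existing["modified"]:
                host[rec_id] = rec
    # Step 2: copy the merged host data back to each satellite.
    for sat in satellites:
        sat.clear()
        sat.update(host)
    return host
```

Note the limitation the poster implies: "last modified wins" silently discards the losing edit, so this is only safe when the sites rarely touch the same records during the day.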