I have an existing transactional sync routine where the local files on FileMaker Go are a working file and a local connector file. The local connector file has external data sources pointing to a host data file and a host connector file. When a sync is run, it checks for any local records that are new or modified since LastSync and PUSHes them up, checks for deleted records on the host and deletes any matching local records, then checks for any new or modified host records to PULL down. I have recently added a PSoS call to PUSH the uploaded records from the Host Connector file into the data file.
I am now working on the PSoS version of the PULL scripts that bring records down from the Host Data file, and I can package the host records to PULL them down through the Local Connector file.
I would like some input from others who have done this before.
a. Call the PSoS PULL script one time; it packages each record to PULL down, combines all of those records into one big package in a variable, and hands that back to the Local Connector file to unpack and post into the local mobile file.
b. Call the PSoS PULL script multiple times: package a single record at a time, send it to the Local Connector file, unpack it, and post it to the local mobile file, then keep calling the PSoS PULL script until all modified or new records have been pulled.
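To make the trade-off concrete, here is a minimal sketch of the two call patterns in plain Python rather than FileMaker script steps. The names (`fetch_changed_records`, `fetch_next_payload`, `post_locally`) are placeholders I made up for illustration, not FileMaker functions; the JSON string stands in for the script result / parameter passed between the server script and the Local Connector file.

```python
import json

def pull_all_at_once(fetch_changed_records, post_locally):
    """Option (a): one server call returns every changed record
    in a single payload, unpacked on the mobile side."""
    payload = json.dumps(fetch_changed_records())  # one big variable
    for record in json.loads(payload):             # unpack in Local Connector
        post_locally(record)                       # post into mobile file

def pull_one_at_a_time(fetch_next_payload, post_locally):
    """Option (b): keep calling the server, one record per call,
    until no changed records remain (None signals 'done')."""
    while True:
        payload = fetch_next_payload()             # one PSoS call per record
        if payload is None:
            break
        post_locally(json.loads(payload))
```

Option (a) minimizes call overhead but puts the whole changed set in one variable; option (b) keeps each payload tiny at the cost of a server round trip per record.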
Under normal use the sync will only involve several records; however, if a pricing update was run on the host, the next mobile sync will have thousands of records to pull, some of which contain images as well.
Tell me your thoughts and experiences with this.
I am thinking it is safer to pull one at a time, as the local file does now; however, calling PSoS one time to package up the individual records in one go is attractive. Perhaps I could structure my loops so that a single call to the PSoS script releases one record at a time? I use a dict method to package each parent record with its children, so I would have a dict list of records to send, or release, one at a time.
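One hybrid that follows from the dict-list idea is to let the single PSoS call build the full list of dict-packaged records, but hand the result back in fixed-size chunks, so no one payload has to carry thousands of image-bearing records. This is a sketch under my own assumptions: `package_record`, `build_chunks`, and `CHUNK_SIZE` are hypothetical names, and the JSON payloads stand in for whatever the Local Connector file actually receives.

```python
import json

CHUNK_SIZE = 50  # tune to record size; keep image-heavy chunks small

def package_record(parent, children):
    """Dict-package a parent record together with its child records."""
    return {"parent": parent, "children": children}

def build_chunks(records, chunk_size=CHUNK_SIZE):
    """Split the packaged record list into bounded JSON payloads,
    so each transfer to the Local Connector stays a manageable size."""
    return [
        json.dumps(records[i:i + chunk_size])
        for i in range(0, len(records), chunk_size)
    ]

def unpack_chunk(payload):
    """What the Local Connector side would do with each chunk:
    decode it back into a list of dicts to post one record at a time."""
    return json.loads(payload)
```

The server does the expensive packaging work once, while the mobile side still posts in small transactional units, which is roughly the "release one record at a time from a single PSoS call" idea.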
Thanks for your help.