Recurring Imports - Best Method?
I have a FileMaker Pro database that works with data from a read-only PostgreSQL database hosted on a different server. The FMP database is hosted on our network on FileMaker Server 11 and is accessed by multiple users throughout the day. The needed data is imported into the FMP database from the PostgreSQL database via ODBC using a server-side script that runs the "Import Records" script step.
The data in the remote database changes often, and I have been trying to figure out the best method for importing it so that the FMP database matches the remote database exactly. I finally settled on having the import script first delete all records in the local FMP database, then import all records from the remote PostgreSQL database using the "Add new records" import action. My reasoning is that neither the "Update existing records" nor the "Update matching records" option produces an exact duplicate. "Update existing records" doesn't account for situations where the previous import had more records (for example, if 10 records were deleted in the PostgreSQL database since the last import), which leaves extra records in the local FMP database that shouldn't be there. "Update matching records" likewise never removes records that no longer exist in the source (records in the PostgreSQL database are often deleted).
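To make the mirroring problem above concrete: an exact mirror needs two steps that no single import action covers — delete local records whose keys have vanished from the source, then add/update everything else. Below is a minimal sketch of that key-diff sync, using in-memory SQLite databases as stand-ins for both the remote PostgreSQL source and the local copy; the table and column names ("items", "id", "name") are hypothetical, and this is only an illustration of the logic, not FileMaker script steps.

```python
import sqlite3

# Hypothetical stand-ins: src models the read-only remote database,
# dst models the local copy that must end up matching it exactly.
src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")
for db in (src, dst):
    db.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")

src.executemany("INSERT INTO items VALUES (?, ?)",
                [(1, "a"), (2, "b-updated"), (4, "d")])
dst.executemany("INSERT INTO items VALUES (?, ?)",
                [(1, "a"), (2, "b"), (3, "c")])  # id 3 was deleted remotely

def mirror_sync(src, dst):
    """Make dst match src exactly: delete vanished keys, then upsert."""
    src_rows = dict(src.execute("SELECT id, name FROM items"))
    dst_ids = {row[0] for row in dst.execute("SELECT id FROM items")}
    # Step 1: remove local records whose key no longer exists remotely
    # (this is what "Update matching records" alone never does).
    for gone in dst_ids - src_rows.keys():
        dst.execute("DELETE FROM items WHERE id = ?", (gone,))
    # Step 2: insert new records and overwrite changed ones.
    dst.executemany(
        "INSERT INTO items VALUES (?, ?) "
        "ON CONFLICT(id) DO UPDATE SET name = excluded.name",
        src_rows.items())
    dst.commit()

mirror_sync(src, dst)
print(sorted(dst.execute("SELECT id, name FROM items")))
# [(1, 'a'), (2, 'b-updated'), (4, 'd')]
```

The upside of this shape is that only the changed, new, and deleted rows are touched, instead of deleting and re-importing every record on each run.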
Finally, the question! I have the import scripts running many times throughout the day.
1> Can it negatively affect the file to have massive amounts of data constantly being deleted and re-imported? Currently the import scripts delete and replace roughly 200,000 records with each import.
2> There is also the issue of speed. The import process takes a couple of minutes to complete. I would love to find a faster way to do the import, if possible.
Many thanks to anyone who can offer advice!