Hopefully that question is complete; I'm not sure how to ask it more simply. I imagine this touches a fairly fundamental part of FileMaker, and I'm just now cutting my teeth on writing scripts. Here's what I'm trying to accomplish.
I have a read-only ODBC-driven table and a secondary table that reads data from it and updates itself accordingly. I've tested using the ODBC table directly as my main table, but the inability to create fields within it restricts what I can do. It's also constantly talking to the reference DB, so many of the things I've tried with it are very slow. I feel like I need a system that keeps a local table updated as external records change, rather than using the ODBC-backed table directly. That said, if there's a lean way to use that table correctly, and to run (for example) dynamic search scripts on it (I found one online that requires defining fields in the table it runs on), I'd love to hear it.
Anyway, I'm trying to keep the secondary table relatively up to date with respect to both existing data and new records. We're constantly adding and deleting data in the database the ODBC table reads from, and I want the secondary table to stay mostly synchronized with it: as records are changed or deleted in the external DB, those changes should be reflected in the secondary table.
The secondary table is currently missing over 10,000 records that exist in the ODBC table, so the first step is to populate it with all of the relevant data (I only use a handful of the 160+ fields from the external DB). The second step is a recurring script that pushes every change detected in the external DB to the secondary table, on a reasonable schedule: multiple times daily would be great, but if the process is too slow, overnight runs would do. That includes updates to existing records and the creation of new records whenever the keys are not found in the secondary table. I match on two keys, UPC and something called a price level; every item has two price levels. All of this needs to happen with minimal processing time. I've already worked out some (probably clunky, given my poor scripting skills) methods for having a given record check the external DB and grab the latest data, but I want to make sure that when someone browses the secondary table in list view, they see the most up-to-date data I can provide without sacrificing performance.
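To make the intent concrete, here's the sync logic I have in mind, sketched in plain Python rather than FileMaker script steps (the table and field names are made up for illustration, not taken from my actual file):

```python
# Sketch of the sync I'm after: plain dicts stand in for the ODBC table and
# the local secondary table. Records are keyed on the composite
# (UPC, price level), since every item appears once per price level.

def sync(external_rows, local_table):
    """Upsert external rows into local_table and drop stale records."""
    seen = set()
    for row in external_rows:
        key = (row["upc"], row["price_level"])
        seen.add(key)
        # Update the existing record, or create a new one if the key
        # isn't found in the secondary table.
        local_table[key] = dict(row)
    # Remove local records whose keys no longer exist externally,
    # mirroring deletions in the source database.
    for key in list(local_table):
        if key not in seen:
            del local_table[key]
    return local_table

external = [
    {"upc": "0001", "price_level": 1, "price": 4.99},
    {"upc": "0001", "price_level": 2, "price": 4.49},
    {"upc": "0002", "price_level": 1, "price": 9.99},
]
# One stale local record that was deleted upstream:
local = {("0003", 1): {"upc": "0003", "price_level": 1, "price": 1.00}}
sync(external, local)
print(sorted(local))  # the stale ("0003", 1) record is gone
```

In FileMaker terms I assume this would translate to looping over the ODBC records, doing a find in the secondary table on the two key fields, and either updating the found record or creating a new one, plus a pass to delete records whose keys no longer match anything external.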
I hope that's a relatively complete picture; please let me know what other information you need to understand the challenge. I could show you what I've made of my script so far, but I think it's bad enough that it's not worth trying to parse.
Thanks so much for any help you can provide!