I have a solution that pulls transactions from a variety of Pervasive SQL databases.
Refreshing my tables takes up to an hour for less than 100 000 records.
Why are IMPORT and DELETE ALL RECORDS so slow?
You're doing mass delete / import on 60 tables with 5000 fields?
Ouch. I think you're asking for trouble. Doing that much turnover on a database will likely lead to corruption; I've seen it many times. Greg's suggestion to do an update-append (where you import based on a matching record, add any missing records, and then delete any that no longer exist) will be a much safer approach.
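The update-append approach described above (FileMaker's Import with "update matching records / add remaining records" does this natively) can be sketched in plain Python. The names here are hypothetical, just to show the three steps:

```python
def sync(target: dict, source: dict) -> dict:
    """Update-append sync: update matching records, add missing ones,
    then delete records that no longer exist in the source.
    Both dicts map a primary key to a record."""
    # Step 1 + 2: update existing records and add any that are missing
    for key, record in source.items():
        target[key] = record
    # Step 3: delete leftovers that are no longer in the source
    for key in list(target.keys() - source.keys()):
        del target[key]
    return target

existing = {1: "old A", 2: "old B", 3: "stale"}
fresh    = {1: "new A", 2: "old B", 4: "new D"}
sync(existing, fresh)
print(existing)  # {1: 'new A', 2: 'old B', 4: 'new D'}
```

Only the records that actually changed get touched, which is the whole point: the indexes are updated incrementally instead of being rebuilt for every row.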
But yes, you're right; indexing is off by default and will turn on when needed. That means if someone has (a) performed a Find on a field, or (b) used a field as a key in a relationship, that field will be indexed.
Is there a reason you can't do this using ESS instead of mass import?
Check the indexing on the fields in your tables. Turn indexing off on any fields that don't need it.
When FileMaker does a mass delete / import, it has to update the index on every field. This takes time. If you don't need the indexes, turning them off will dramatically improve performance.
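The cost is easy to demonstrate outside FileMaker. This is a SQLite analogy, not FileMaker itself, but the principle is identical: every index on a table has to be maintained row by row during a mass insert or delete, so unneeded indexes slow bulk operations down.

```python
import sqlite3
import time

def bulk_load(indexed: bool, rows: int = 50_000) -> float:
    """Time a mass insert + mass delete, with or without indexes."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE t (a INTEGER, b TEXT, c TEXT)")
    if indexed:
        # Index every column, mimicking a table where everything is indexed
        for col in ("a", "b", "c"):
            con.execute(f"CREATE INDEX ix_{col} ON t ({col})")
    start = time.perf_counter()
    con.executemany("INSERT INTO t VALUES (?, ?, ?)",
                    ((i, f"b{i}", f"c{i}") for i in range(rows)))
    con.execute("DELETE FROM t")  # a mass delete touches every index too
    con.commit()
    return time.perf_counter() - start

print(f"no indexes:    {bulk_load(False):.3f}s")
print(f"three indexes: {bulk_load(True):.3f}s")  # typically several times slower
```

Scale that up to 60 tables and thousands of fields and the hour-long refresh starts to make sense.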
Make sure you use a blank form layout with no fields. If the layout happens to display a summary field, the import will be painfully slow.
If the number of records is about the same, you could try a Find All, then Import and Update into the existing records (then delete the leftovers).
We really need a proper TRUNCATE TABLE command. There's no reason for FileMaker to work so hard.
> Why are IMPORT and DELETE ALL RECORDS so slow?
As I recall, FileMaker will automatically index a field if needed... so could I try turning ALL indexing off and generally use that as a rule?
It's gonna be a helluva job, as there are about 60 tables with 5000 fields. That should be fun to flip through!
Quick Find is also automatically switched on for everything, so maybe that should be switched off as a rule too...
Wish there was a pref to say "always leave indexing off" unless I switch it on.
Do you agree?
Thanks for your opinion. I appreciate it!
These are records from multiple companies, all under one accounting package; each company has its own SQL database.
The reason for the mass import (from scratch) every time is to synchronise to a specific snapshot in time across all the companies and projects at once.
Freshness is imperative! A lot of processing happens on the imported data, and keeping track of what got processed and what didn't would be a nightmare.
I basically "suck up" the entire accounting system and then do my thing with it. It works like a charm (and has for three years now); the only problem is the slowness, and I will gladly explore avenues that might speed things up.
I have NEVER experienced corruption, and even if the data were corrupted, I would simply "refresh" my snapshot. It is used to produce one set of reports and is then refreshed again next time.
If I had to write this solution from scratch again, I might well do it differently. Isn't version 13 ALWAYS better?
On that merry note... I wish I could access an ODBC source by a variable name instead of having to pick it off the list.
Yes, what's ESS?
ESS = External SQL Sources. Check the Help file. Basically, it means connecting to the SQL data sources directly and using them as though they were FileMaker tables.
However, if you specifically need a "snapshot" at a particular point in time, it may not suit your needs.
Thanks again for all your input.
Many handy tips in here.
As I said - the snapshot thing is crucial.
I'll look at the indexing and other issues as discussed.
I will, however, evaluate ESS as well.