Break the solution out into data and interface files using the data separation model?
If not that, then import/export with .mer files, assuming there are no container fields.
Script an import?
Set up an external data source pointing at the 'old' file, and import table by table...
That would at least let you reuse the script for the next file update as well?
~ May not be what you had in mind, but it's the simplest approach that I personally could help you through.
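For what it's worth, a rough sketch of what that scripted, table-by-table import could look like once the old file has been added under File > Manage > External Data Sources (the data source name "OldData" and all layout/table/field names here are made up for illustration, not from the original post):

    # One Go to Layout / Import Records pair per table. Each layout is
    # based on the table being filled, and "Matching names" maps old
    # fields onto new ones automatically.
    Go to Layout [ "Contacts" (Contacts) ]
    Import Records [ With dialog: Off ; "OldData" ; Add ; Matching names ]

    Go to Layout [ "Invoices" (Invoices) ]
    Import Records [ With dialog: Off ; "OldData" ; Add ; Matching names ]

    # Bump each next serial value past the highest imported key so new
    # records don't collide (ExecuteSQL needs FileMaker 12 or later).
    Set Next Serial Value [ Contacts::ID ;
        ExecuteSQL ( "SELECT MAX(ID) FROM Contacts" ; "" ; "" ) + 1 ]

Once that script exists, rerunning it against the next file update is trivial, which is the main win.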
I would definitely do this with local databases on a very fast machine. We've done something similar for clients with documents in the past, and it'll go as fast as the machine will let you. Test with just a few records until you are comfortable with the process before tackling the entire db, and even then, run it a few times to get a good idea of how long the migration will take against the production data set.
We use Goya's RefreshFM for migrations and can highly recommend it. It has some very nice features beyond what you'd build into your own import scripts, and it's a great arrow to have in your quiver.
We have an "updater" tool with 2 sets of file references - to the old files and to the new files.
Each table occurrence has an equivalent layout.
We then import directly from OldTable into NewTable using matching field names.
So long as you haven't renamed fields, this works fine and preserves text formatting and container contents.
It is completely scripted, so you have the whole weekend to import and test data before users start again on the Monday.
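To make that concrete, here is a hedged sketch of the kind of loop such an updater script might run. The table list, the "OldFiles" data source name, and the convention that each table occurrence has a same-named layout are all assumptions for illustration, not the actual tool:

    # Walk a list of table names; each name doubles as a layout name in
    # the new file, and "OldFiles" is the file reference to the old copy.
    Set Variable [ $tables ; Value: "Contacts¶Invoices¶LineItems" ]
    Set Variable [ $i ; Value: 0 ]
    Loop
        Exit Loop If [ Let ( $i = $i + 1 ; $i > ValueCount ( $tables ) ) ]
        Set Variable [ $table ; Value: GetValue ( $tables ; $i ) ]
        # Go to Layout by calculated name, so one loop covers every table
        Go to Layout [ $table ]
        Import Records [ With dialog: Off ; "OldFiles" ; Add ; Matching names ]
    End Loop

Because this is FileMaker-to-FileMaker, styled text and container data come across intact, which a .mer round trip would lose.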
Christian, would you say this is faster than a normal import procedure? If so, would you estimate how much faster percentage-wise? Thanks (and thanks to all!).
Please run a test.
And be aware that our insert method uses native data types, so you don't lose detail by converting to and from text.
It should be faster, since it avoids writing text files and reading them back.