What are you really trying to achieve? A backup of your data? What are you going to do with your " 'clean' copy of the data alone"? It does seem like a lot of work if you are just trying to achieve a backup, and there are better ways to do that, like using FMServer to do it for you.
Thank you Chris,
I have had a nightmare with corrupt tables (4 of them) in a database I've been working on since FMP 4 or 5, over 12 years.
I'm sure it happened over time and I just never paid attention or even ran recover to see if the structure was solid.
I was able to 'replace' the tables after much ado.
My thought was to always have a 'current' version of data and structure on hand for any and all reasons.
I guess you're right though, that the data does me no good alone. I was probably overreacting, as I'm known to do =)
I do plan on running 'recover' on the database on a weekly basis to see if there has been any corruption in the tables and if there is, repair it like I've recently done.
Thank you for making me think about it some more.
Any more suggestions about periodically checking the integrity of a database are very welcome.
I would make exports of each table (the layouts need not have every field, as you can specify either current-table or current-layout fields). Export as FMP and you get most of what you need (calculation and summary fields come across only as raw data) without scripts and layout objects.
This may be overkill, but you could...
- First, you'll need to know about this: http://www.databuzz.com.au/using-executesql-to-query-the-virtual-schemasystem-tables/
- Have your script loop through each layout
- Identify the layout table name
- Query the virtual schema to find the base table name
- If the base table has already been processed, skip the layout. Otherwise, add the base table to a variable holding the list of base tables already processed (at the end of the loop, most likely)
- Get a list of fields in the base table (by querying again)
- Loop through this and keep only those field names that are not calcs, summaries or globals
- Set the list into a global field or variable
- Have a field in each table that uses a custom function to convert the field list in your variable to a delimited list of values from those fields in this record
- Show all records
- Export just the field with the delimited values
- You may want to export the field-name list as well, ideally as the first row
- Mark the base table as done, and end loop
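The two queries in the steps above can be sketched roughly like this. The variable names are placeholders, and the `FileMaker_Tables` / `FileMaker_Fields` virtual tables and their `FieldClass` / `FieldType` columns are as described in the databuzz article linked above; verify the details against your FileMaker version:

```
# Resolve the layout's table occurrence to its base table
# ($layoutTable would be set from Get ( LayoutTableName ))
Set Variable [ $baseTable ; Value:
  ExecuteSQL (
    "SELECT BaseTableName FROM FileMaker_Tables WHERE TableName = ?" ;
    "" ; "" ; $layoutTable )
]

# List the fields, skipping calcs, summaries and globals.
# Note: FileMaker_Fields is keyed by table-occurrence name, so this
# queries by the layout's occurrence; the fields are still those of
# the underlying base table.
Set Variable [ $fieldList ; Value:
  ExecuteSQL (
    "SELECT FieldName FROM FileMaker_Fields WHERE TableName = ? AND FieldClass = 'Normal' AND FieldType NOT LIKE 'global%'" ;
    "" ; "" ; $layoutTable )
]
```

ExecuteSQL returns the names as a return-delimited list by default, which is the shape the custom function in the later step would consume.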
Not very detailed, I know, and there would be kinks to work out, but I'm confident it could be done. The advantage, such as it is, is that if you added tables, fields, layouts, etc., you wouldn't need to update this script, so long as every table included that field to build the delimited string.
Just thinking out loud. HTH.
Beverly, thank you for your reply.
That is a nice trick, specifying the current table to include all fields.
The problem is that 'all fields' only covers the fields that exist right now.
To include any new fields I would have to manually add them each time I run the script.
They are not automatically included if I set the script to run without dialog and pre-specify the export order.
Any thoughts on how to automatically include new fields in the export, so that I could run the script completely unattended?
Out of curiosity: Do DBs run on FMS?
True! Any new fields added after the script is created will mean 're-setting' the script.
I've tried using 'Matching Names', hoping that new fields would automatically match, but that doesn't always happen either.
Yes they run on FMS - thanx, whatchya thinking =)
OK, here are my thoughts:
1] Have FM server create clones of your files to maintain your structure
2] Create 77 layouts with no UI other than the default
3] A script that goes to each layout and saves the data as Merge text files, to preserve the field headers
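A rough shape for step 3 as script steps (the `$layouts` list and output path here are made-up placeholders; the Merge format writes the field names as the first row):

```
# Loop over the bare export layouts and save each as a Merge file
Set Variable [ $layouts ; Value: LayoutNames ( Get ( FileName ) ) ]
Set Variable [ $i ; Value: 1 ]
Loop
  Go to Layout [ GetValue ( $layouts ; $i ) ]
  Show All Records
  Set Variable [ $path ; Value: Get ( DocumentsPath ) & Get ( LayoutTableName ) & ".mer" ]
  Export Records [ With dialog: Off ; "$path" ; Merge ]
  Set Variable [ $i ; Value: $i + 1 ]
  Exit Loop If [ $i > ValueCount ( $layouts ) ]
End Loop
```

LayoutNames would return every layout in the file, so in practice you'd filter it down to just the 77 export layouts, or keep those in a hand-built list instead.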
1] This seems like a LOT of work for no real benefit. Just back up your files regularly. I have been using FM for over 20 years and have rarely seen that much corruption unless a system is ignored and mismanaged for years.
2] It is not recommended to recover your files and keep using them, nor to run Recover as routine maintenance. It should really only be done when truly necessary.
Chris, thank you so much for your response.
You are really right, it is a LOT of work for no benefit. I'm 'knee-jerk' reacting after having to replace 4 bad tables, relationships, field defs and all. And again you are right that the db was ignored and not maintained correctly for years, all on me completely.
I do have great backups: weekly in the cloud to an east-coast and a west-coast Google data center, CrashPlan continual backup, 2 CCC drives locally and a Time Machine, so obviously I'm a bit OCD.
My current plan is to run Recover on my master DB once a week and see if there are any structural errors (not to use that recovered file, but just to see if there are any problems).
Also to make a clone, no records on a weekly basis.
Any other thoughts? I appreciate your time, and I am going to abandon the 'export all tables, all fields' approach; it really is meaningless.
whew, thanx, Troy
I'm sure you know you can have the server run an integrity check on backups; if there really is a problem, it would find it. It's unlikely that a recover process would discover things this wouldn't. Remember, the most common cause of corruption is a process getting interrupted, usually on a local drive, which is why I never develop on local files. The other is bad sectors on the hard drive; those will hose you for sure. I understand your need to be cautious since you've been burned, and it's good to keep an eye on things. We all need to find the right balance between setting up the proper environment and checking that the database is in good shape. If you really get concerned, use the Merge format to "clean" the exported data, since it outputs text only, which is the most basic format of data.
Sorry if this is mentioning stuff that you already know and are following, but from your description of your "a bit OCD" process I thought it was worth raising just in case. You do know that you should never have CrashPlan, CCC, Time Machine etc. running backups on databases that are live, don't you? You should have FMServer create the backups and then take copies of the backup files, not the live files. Having other backup utilities running over live files is a surefire way to corrupt them.
Lemmtech, thank you for your time. You know, I did 'know' that there is that option on the server admin 'side'. Ask me why it wasn't checked? Dunno - ARG! So thank you for bringing that back to my attention; I have since enabled that option on all my backups. thanx
Chris, thanx man, yes I did know that; a buddy told me years ago, I had no idea... At your kind suggestion, though, I did go back through all the backup 'schemes' I have going and it's all good, no backups of 'live' databases... thanx so much