I would suggest you try these debugging steps:
- isolate one record and export it. Does it open in Excel correctly?
- create a new, simple FM file, populate it with similar data (numbers, text, dates), and export it. Does it open correctly?
- export to xls format. Does that open correctly?
- export the data as a merge file and import that into Excel. Does that display as expected?
I'm just trying to see whether the problem is your particular FM file, the data in the file, the Excel install, or the exchange of the data.
I apologize, I should have added more information. Sorry for wasting your time.
I'm pretty sure it's one database. I have about 70 databases and don't have an issue with any other. I can't export using xls. The database is about 450,000 records with about 400 fields (I'm only exporting about 15 fields). I did notice one particular field creates the issue. I have tried to do a recover, but that hasn't been successful; I still get errors. Merge doesn't work.
I did notice one particular field creates the issue.
You might try this on a copy of your file:
Define a brand new field.
Use Replace Field Contents to copy the data over from the old field to the new field.
Delete the old field.
See if this makes any difference. If it does not, the problem may not be the field definition in your file, but the data in that particular field.
- "I'm pretty sure it's one database. I have about 70 databases. I don't have an issue with any other database." - That suggests it's the file or the data in the file.
- "I can't export using xls." Just curious - why not? Technical issue, or preference? It's a debugging procedure, so why not try it?
- "The database is about 450,000 records with about 400 fields. (I'm only exporting about 15 fields). I did notice one particular field creates the issue." See Phil's suggestion. What happens if you exclude the troublesome field from the export? Does all work then?
- "Merge doesn't work." Why not? Do you mean you export the data to a merge file, and you can't import it? It imports with errors? Not presented as you would expect?
xls doesn't work either. It crashes when I open the file. I also understand that xls has a limit on characters per field, so I have truncated the field down to that limit.
If I exclude the field, it does work.
Merge doesn't work. When I try to open it with Excel, the spreadsheet is just empty. I do not receive any error messages at all.
I'll try to follow Phil's suggestion to see if I can make any progress.
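Since the troublesome field was truncated to the xls limit, it may be worth double-checking the exported data mechanically. Excel caps a cell at 32,767 characters; this rough Python sketch (the filename `export.csv` is a hypothetical stand-in for your actual CSV export) flags any cell that still exceeds the limit:

```python
import csv

EXCEL_CELL_LIMIT = 32767  # Excel's maximum characters per cell

def find_oversized_cells(path, limit=EXCEL_CELL_LIMIT):
    """Yield (row_number, column_number, length) for any cell over the limit."""
    with open(path, newline="", encoding="utf-8") as f:
        for row_num, row in enumerate(csv.reader(f), start=1):
            for col_num, cell in enumerate(row, start=1):
                if len(cell) > limit:
                    yield (row_num, col_num, len(cell))

# Example (hypothetical file name):
# for hit in find_oversized_cells("export.csv"):
#     print("row %d, column %d: %d characters" % hit)
```

If this reports nothing, the truncation did its job and the crash is likely caused by something else in that field's data.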
I don't understand why the merge format wouldn't work - that suggests a remaining problem. Weird that it shows nothing.
You seem to have determined that it is the data in one field that is causing the problem. I would suggest you isolate it down to a record, perhaps by the 'Binary Method' of splitting the records into ever-decreasing halves. If you export one record, does it work OK?
Sorbsbuster, it does work with exporting one record. I have cut the database down into about 8 separate files before: 7 would open up fine, 1 would crash. CSV works on export. Merge gives me the 'Not Enough Memory' error in Excel. Again, this happens on a few computers when I try it.
Yes, but what Sorbsbuster is suggesting is that you repeatedly divide down the set of records that you are exporting until you can determine precisely which record or group of records in your file causes the problem. The idea here is that this record may be corrupted and your best bet may be to delete the record and re-enter it.
Other things to try:
Recover the file and see if you can export successfully from the recovered file.
Save a clone of your file. Export the data to a merge file, import it into the clone and then try exporting from this new copy into an excel file format.
I went ahead and tried a recover. I also saved a clone, then exported the data to a merge file and imported it as you suggested. I still haven't had any success.
Cutting down the records might just be too much - I'm working with around half a million records. Could there be an issue with special characters in the data? CSV and tab files export without any issues.
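One way to test the special-characters theory mechanically is to scan the export that does work (the tab or CSV file) for embedded control characters, which can upset Excel's handling of merge files. A rough Python sketch, assuming a tab-delimited export (the filename you pass in is a placeholder for your actual export):

```python
import unicodedata

def find_control_chars(path, encoding="utf-8"):
    """Return (line_number, repr_of_char) for each non-printable control
    character found, ignoring the tab delimiters and line endings."""
    hits = []
    with open(path, encoding=encoding, errors="replace") as f:
        for line_num, line in enumerate(f, start=1):
            for ch in line.rstrip("\r\n"):
                # Unicode category "Cc" = control character
                if ch != "\t" and unicodedata.category(ch) == "Cc":
                    hits.append((line_num, repr(ch)))
    return hits
```

Any hit points you straight at a record (and roughly at a field) worth inspecting, without needing to split the file at all.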
As you are removing some causes you need to move on to investigating other possibilities. The strength of a 'binary test' of the data is that it narrows the records down to a small suspect set (if any) very quickly. It doesn't matter that you have 500,000 records. Viz:
- the first split tests down to 250k
- the second down to 125k
- the next down to 63k
- the next down to 32k, 16k, 8k, 4k, 2k, 1k, 500, 250, 125, 60, 30, 15, 8.
So after 16 tests you have exposed any faulty data, or proved that the problem does not lie there. If you had 2,000,000 records another 2 tests would achieve the same result.
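The halving steps above can be written as a simple loop. Given any yes/no check for whether a batch of records exports and opens cleanly (the `export_ok` predicate below is a hypothetical stand-in for the manual FileMaker test), about 19 halvings isolate a single suspect record out of 500,000:

```python
def isolate_bad_record(records, export_ok):
    """Binary-search for the index of the record that makes an export fail.
    `export_ok(batch)` returns True if that batch exports cleanly.
    Assumes exactly one bad record is present."""
    lo, hi = 0, len(records)
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if export_ok(records[lo:mid]):
            lo = mid   # first half is clean, so the bad record is in the second
        else:
            hi = mid   # failure reproduced in the first half
    return lo          # index of the single suspect record

# Example with a fake predicate standing in for the manual export test:
# isolate_bad_record(list(range(500000)),
#                    lambda batch: 300000 not in batch)  # -> 300000
```

In practice each `export_ok` call is a manual FileMaker export of that found set, so the value of the method is precisely that the number of manual tests grows only logarithmically with the record count.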