There is a file size limit, but it's about 4GB from what I can see in the Known Bugs List.
But FileMaker builds field indexes during the import, and it may be bogging down while building a truly massive index of values from your imported text.
Try turning indexing off on all text fields in your target table and see if that makes a difference.
Thanks Phil, the text field in the target table is a global field, hence there is no option to turn off indexing. So I decided to wait it out; it took 46 minutes for the text content of the file to be imported. Note: the target table is linked (connected) to over 100 other table occurrences, so I also tested importing into a global field in a target table that is not linked (connected) to any other tables, and that seems to take only a few seconds. I say "seems" because I still have to insert a pause script step (using 15 seconds) to make sure that whatever FileMaker is doing internally has completed; otherwise the next 5 script steps, which act on the very large content of the target field, get seriously messed up (truncated). In other words, the script steps following Import Records are executed before the internal work is complete. Not nice. This does not happen with relatively small imported files.
The intent is to parse the text content of the target field and generate records, in this case about 23,000 records. We know that parsing in FileMaker is inherently slow, but having to perform a Substitute step 23,000 times in a loop on the large text content of the target field, even before the parsing starts, seems to really bog down performance. Currently it looks like it would take 8 or so hours to process this very large file, which is a bit intolerable.
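For what it's worth, the slowdown is probably structural: each Substitute (or similar) pass has to rescan the entire field, so 23,000 passes over the same large text is roughly quadratic in the text size, while splitting the field into a list once and then looping over the list is linear. A rough Python analogue (not FileMaker code, just a sketch of the cost model, with made-up sample data) illustrates the two approaches:

```python
# Build a sample "imported file": 23,000 lines, mimicking the large text field.
lines = [f"record {i};some;fields;here" for i in range(23_000)]
text = "\n".join(lines)

def parse_with_repeated_passes(text):
    """Quadratic-style parsing: each iteration copies and rescans the
    remaining text, like repeatedly applying Substitute/Middle to the
    whole field once per record."""
    records = []
    remaining = text
    while remaining:
        head, _, remaining = remaining.partition("\n")
        records.append(head)
    return records

def parse_single_pass(text):
    """Linear parsing: break the field into a list once, then loop
    over the (cheap) list entries."""
    return text.split("\n")

# Both produce the same 23,000 records; only the cost differs.
assert parse_with_repeated_passes(text) == parse_single_pass(text)
```

In FileMaker terms, the second shape corresponds to extracting each record's chunk by position (e.g. with GetValue against a value list built once) instead of rewriting the big field on every loop iteration.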
This FileMaker database application I created is quite complex, so making the changes necessary to improve handling of very large text files is a bit daunting.
I'm not sure that I follow all of that.
Why do you import into a global text field? Are you importing into a global field and then using an auto-enter calculation to parse data from the global field into other fields?
Are you using a script to parse the data?
And if there is no parsing at all taking place, does that make a difference?
I'm thinking that if the parsing is that slow, it's the index building on the other fields, the ones receiving the parsed data, that is slowing the process down. That may be a necessary cost of getting the job done, as such indexing is usually needed.
Darn, all that I typed disappeared, timed out.
Sorry, but I will not try to reproduce it ... all parsing is being done by scripts.
Importing into a field in a table that is not connected to other tables is fast (a few seconds), rather than into a table with over 100 connected table occurrences (46 minutes). It does not depend on the field being global.
It probably would have been smarter to perform the parsing using auto-enter calculations directly on the fields in the various tables. I will play with this and see if it results in faster processing.