4 Replies Latest reply on Nov 10, 2014 5:39 PM by rouelf_1

    Import Record From A Very Large Text File In A Folder Seems To Make FM Unresponsive.

    rouelf_1


      A very large text file: 7,541,209 characters, 1,406,507 words, 390,408 lines.

      When this very large text file is imported as a record from a folder, such as the "TemporaryPath" folder, FileMaker seems to become unresponsive. A multicolored spinning wheel shows up and seems to persist indefinitely.

      Using the script steps below:

      Set Variable [ $newFile; Value:"GEDCOM.txt" ]

      Set Variable [ $Path; Value:Get ( TemporaryPath ) & $newFile ]

      Export Field Contents [ GED_Import::GED_File; “filewin:$Path” OR “filemac:$Path” ]

      Set Variable [ $Path; Value:Get ( TemporaryPath ) ]

      Import Records [ Folder Name: $Path; File Type: Text files; Target: “GED_Import”; Method: Add; Character Set: “UTF-8”; Field Mapping: Source field 1 import to GED_Import::GED_Text ]

      This works great for relatively small files; however, this very large file seems to make FM13 Adv unresponsive (after the Import Records step). A multicolored spinning wheel shows up and seems to persist indefinitely.

      Does such a large file run into a text field content size limit?
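
      As an aside on the “filewin:$Path” OR “filemac:$Path” notation above: one way to make the export path unambiguous is to fold the platform prefix into $Path before the Export Field Contents step. A minimal sketch, assuming the common Abs ( Get ( SystemPlatform ) ) = 2 test for Windows (the variable names just mirror the listing above):

      Set Variable [ $newFile; Value:"GEDCOM.txt" ]
      Set Variable [ $prefix; Value:If ( Abs ( Get ( SystemPlatform ) ) = 2 ; "filewin:" ; "filemac:" ) ]
      Set Variable [ $Path; Value:$prefix & Get ( TemporaryPath ) & $newFile ]
      Export Field Contents [ GED_Import::GED_File; “$Path” ]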

        • 1. Re: Import Record From A Very Large Text File In A Folder Seems To Make FM Unresponsive.
          philmodjunk

          There is a file size limit, but it's about 4GB from what I can see in the Known Bugs List.

          But FileMaker builds field indexes during the import, and it may be bogging down while building a truly massive index of values from your imported text.

          Try turning indexing off on all text fields in your target table and see if that makes a difference.

          • 2. Re: Import Record From A Very Large Text File In A Folder Seems To Make FM Unresponsive.
            rouelf_1

            Thanks Phil, the text field in the target table is global, hence there is no option for turning off indexing. So I decided to wait it out; it took 46 minutes for the text content of the file to be imported. Note: the target table is linked (connected) to over 100 other table occurrences, so I tested with a global field in a target table that is not linked (connected) to any other table, and that import seems to take only a few seconds.

            I say it seems, because I still have to insert a Pause script step of about 15 seconds to make sure that whatever FM is doing internally has completed; otherwise the next 5 script steps, which act on the very large content of the target field, get seriously messed up (truncated). In other words, script steps following Import Records are executed before the internal actions are completed. Not nice. This does not happen with relatively small imported files.
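
            For what it's worth, the fixed 15-second pause could be replaced with a small wait loop that exits only once the imported field's length stops changing. A sketch only; the $lastLen variable and the 1-second interval are arbitrary choices, not from the original script:

            Set Variable [ $lastLen; Value:-1 ]
            Loop
              Exit Loop If [ Length ( GED_Import::GED_Text ) = $lastLen ]
              Set Variable [ $lastLen; Value:Length ( GED_Import::GED_Text ) ]
              Pause/Resume Script [ Duration (seconds): 1 ]
            End Loop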

            The intent is to parse the text content of the target field and generate records, in this case about 23,000 records. We know that parsing in FM is inherently slow, but having to perform a Substitute step 23,000 times in a loop on the large text content of the target field, even before the parsing starts, seems to really bog down performance. Currently it looks like it would take 8 or so hours to process this very large file, which is a bit intolerable.
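
            One pattern that sometimes helps with this kind of loop (a sketch only, untested against this file; the GED_Records table and RawRecord field are hypothetical names) is to walk the big text once by position with Position ( ) and Middle ( ), splitting on the level-0 line that starts each GEDCOM record, so the 7.5-million-character value is never rewritten with Substitute on every pass:

            Set Variable [ $text; Value:GED_Import::GED_Text ]
            Set Variable [ $start; Value:1 ]
            Loop
              # Find the start of the next GEDCOM record (a level-0 line such as "0 @I123@ INDI")
              Set Variable [ $next; Value:Position ( $text ; "¶0 " ; $start ; 1 ) ]
              Set Variable [ $chunk; Value:If ( $next = 0 ; Middle ( $text ; $start ; Length ( $text ) ) ; Middle ( $text ; $start ; $next - $start ) ) ]
              # Assumes the script is running on a layout based on the hypothetical GED_Records table
              New Record/Request
              Set Field [ GED_Records::RawRecord; $chunk ]
              Exit Loop If [ $next = 0 ]
              Set Variable [ $start; Value:$next + 1 ]
            End Loop

            Whether that beats 23,000 Substitute passes here is an open question, but it at least avoids rebuilding the entire text value on each iteration.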

            This FM database application I created is relatively complex, so making the necessary changes to improve the handling of very large text files is a bit daunting.

             

            • 3. Re: Import Record From A Very Large Text File In A Folder Seems To Make FM Unresponsive.
              philmodjunk

              I'm not sure that I follow all of that.

              Why do you import into a global text field? Are you importing into a global field and then using an auto-enter calculation to parse data from the global field into other fields?

              Are you using a script to parse the data?

              And if there is no parsing at all taking place, does that make a difference?

              I'm thinking that if it's the parsing that's so slow, it's the index building on the other fields (the ones receiving the parsed data) that is slowing the process down. That may be a necessary cost of getting the job done, as such indexing is usually needed.

              • 4. Re: Import Record From A Very Large Text File In A Folder Seems To Make FM Unresponsive.
                rouelf_1

                Darn, all that I typed disappeared; it timed out.

                Sorry, but I will not try to reproduce it ... all parsing is being done by scripts.

                Importing to a field in a table that is not connected to other tables occurs fast, in a few seconds, rather than the 46 minutes it takes with a table connected to over 100 other table occurrences. It is not dependent on the field being global.

                It probably would have been smart to perform the parsing using auto-enter calculations directly into the fields in the various tables. I will play with this and see if it results in faster processing.
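
                As a rough illustration of that auto-enter idea (the field names are hypothetical, not from the actual file): assuming each new record carries its raw GEDCOM text in a RawRecord field, an auto-enter calculation on a Name field in the same table could pull the level-1 NAME line out of it when the record is created, along these lines:

                Let ( [
                  // locate the level-1 NAME line within the raw record text
                  pos = Position ( GED_Records::RawRecord ; "¶1 NAME " ; 1 ; 1 ) ;
                  line = If ( pos = 0 ; "" ; GetValue ( Middle ( GED_Records::RawRecord ; pos + 1 ; 500 ) ; 1 ) )
                ] ;
                  // drop the leading "1 NAME " tag (7 characters) and keep the rest of the line
                  If ( line = "" ; "" ; Middle ( line ; 8 ; Length ( line ) ) )
                )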

                Thanks.