17 Replies Latest reply on Nov 29, 2016 2:52 AM by Johan Hedman

    slow JSON processing speed (BE_JSONPath & JsonGet)

    DanielSmith

  I'm developing a solution where a JSON payload is submitted via the PHP API into a record in a FileMaker database. Periodically a server-side script checks for new records, validates them and then processes the data, creating/updating various records based on business logic. Everything works well in initial testing, but when scaling up the size of the JSON, processing time and RAM usage spike dramatically.

       

      The JSON contains a single array with ~140 key/value pairs per array value. During testing I had ~10 array values but have since started scaling up to ~500 array values, though we could expect over 1000 array values in the real world, which represents ~2MB of plain text being POSTed.

       

      The first test takes a single payload of 100 array values, each containing 140 key/value pairs.

      The script loops through each of the 100 values, creates a record (via selector-connector) in a table and sets a 'UUID' field. At this point I am ignoring the remaining 139 fields of data.
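For readers outside FileMaker, the structure of that loop is roughly the following (a Python sketch of the same pattern, not the actual script; `json_get` and `create_record` are hypothetical stand-ins for the plug-in call and the New Record/Set Field steps):

```python
import json

# Hypothetical stand-in for jsonGet()/BE_JSONPath: like the plug-in
# functions, it re-parses the entire payload text on every single call.
def json_get(payload_text, index, key):
    return json.loads(payload_text)["encounter"][index][key]

# Mirror of the script's loop: one record per array value, setting only UUID.
def process_payload(payload_text, create_record):
    count = len(json.loads(payload_text)["encounter"])
    for i in range(count):                     # $loop in the FileMaker script
        create_record({"UUID": json_get(payload_text, i, "UUID")})

# Tiny usage example with a two-value payload
payload = json.dumps({"encounter": [{"UUID": "a-1"}, {"UUID": "a-2"}]})
records = []
process_payload(payload, records.append)       # one dict per array value
```

The repeated full re-parse inside `json_get` is the behaviour several replies below identify as the source of the slowdown.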

       

      All tests are running locally on a fully specced MacBook Pro (Retina) with FMPA 15. During all tests FMPA has a single core pegged at 100%.

       

      TEST 1

      A control test, without any JSON parsing, to measure raw record-creation speed. The UUID field is set with 'Get(UUID)'.

      100 records are created in 263ms

      500 records are created in 1195ms (this processing time scales linearly)

       

      TEST 2

      Using the Geist Interactive JSON functions (which leverage BaseElements under the hood) to set the UUID from the JSON using 'jsonGet ( PAYLOAD::payload ; "encounter[" & $loop & "]." &  "UUID" )'

      100 records are created in 11606ms

      500 records are created in 285002ms

       

      TEST 3

      Using the BaseElements (latest stable version) JSON functions to set the UUID from the JSON using 'BE_JSONPath ( PAYLOAD::payload ; "$.encounter[" & $loop & "].UUID" )'

      100 records are created in 6055ms

      500 records are created in 149280ms

       

      Looking at these tests, the processing time for simple record creation scales linearly, but extracting even a single key/value pair per array value does not. The longest test took 4 mins 45 secs to create 500 records and set just one field in each, based on JSON stored in a related text field. Expanding this to all 140 key/value pairs per array value gives an estimated 11 hours of processing time!! The other side effect, when run as a PSoS on a server, is that the FMSE service consumes all available RAM and, if left long enough, will eventually crash.
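The 11-hour figure is a straightforward extrapolation from the measured numbers; for the slowest case (Test 2):

```python
# Measured in Test 2: 500 records, extracting 1 of the 140 keys each.
ms_one_key = 285002
# Assuming cost scales linearly with the number of keys extracted,
# pulling all 140 keys would take roughly:
hours = ms_one_key * 140 / 1000 / 3600
print(round(hours, 1))   # -> 11.1
```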

       

      Has anyone else experienced this type of speed degradation when parsing JSON in FileMaker? Or has anyone dealt with JSON of this size (70k+ lines)? I am aware this could all be done in PHP, but I was hoping for the ease and flexibility of using FileMaker scripts to process and deal with these 'JSON payloads'.

       

      (Screenshot attached shows a cut down version of the current script.)

       

      Thanks,

        • 1. Re: slow JSON processing speed (BE_JSONPath & JsonGet)
          taylorsharpe

          I would save the payload to a variable instead of reading it out of a stored field each time. That alone should increase performance a lot.

           

          PSoS can be done correctly by passing only the data needed to create a record (not the whole payload) and using PSoS just for the creation of each record.

          • 2. Re: slow JSON processing speed (BE_JSONPath & JsonGet)
            DanielSmith

            Hi Taylor,

             

            Nice suggestion. Setting PAYLOAD::Payload into $payload in the above script has cut the 285002ms down to 191559ms, a ~30% improvement in performance. Still unworkable, though, as scaling to 140 key/values per array value would still blow this out to hours.

             

            The PSoS example does run correctly, and nothing is passed from client to server. I was more raising the concern that it is possible to kill the FMSE process with an 18-line script that processes a JSON array.

            • 3. Re: slow JSON processing speed (BE_JSONPath & JsonGet)
              user19752

              It is a waste of time to pass the large parameter (the JSON text) to the function inside a script loop, because the whole text gets parsed on every call; the loop should be inside the function so that the text is parsed only once.

              But I don't know whether these functions can work that way.

              As it stands, the functions may be scanning for the keyword 1+2+3+...+100 = 5050 times for 100 array values.

              • 4. Re: slow JSON processing speed (BE_JSONPath & JsonGet)
                DanielSmith

                What loop functions are available inside a function? I'm not 100% sure what you are proposing.

                 

                If I understand correctly... you are saying that for 500 array values with 140 key/values each, the JSON data is parsed 70,000 times. I could parse the entire JSON once, save each array value as variables $array0, $array1, $array2, etc., then loop through these and extract the 140 values I need from each. I'm not convinced it will result in a usable processing time.
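That parse-once idea can be sketched in Python (rather than FileMaker, since the plug-in functions offer no loop of their own); `split_payload` is a hypothetical helper, the equivalent of filling $array0, $array1, ...:

```python
import json

def split_payload(payload_text):
    """Parse the whole payload once and return each array value as its own
    small JSON string. Each piece is then cheap to query repeatedly,
    instead of re-scanning ~2 MB of text for every one of the
    140 * 500 = 70,000 extractions."""
    return [json.dumps(entry)
            for entry in json.loads(payload_text)["encounter"]]

# Usage example with a two-value payload
payload = json.dumps({"encounter": [
    {"UUID": "a-1", "field1": 1},
    {"UUID": "a-2", "field1": 2},
]})
pieces = split_payload(payload)
# The 140 values per record are then extracted from each small piece:
uuids = [json.loads(p)["UUID"] for p in pieces]
```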

                • 5. Re: slow JSON processing speed (BE_JSONPath & JsonGet)
                  user19752

                  For example, with this calculation in the loop:

                  jsonGet ( PAYLOAD::payload ; "encounter[" & $loop & "]." &  "UUID" )

                  the 1st iteration searches for the 1st "encounter", scanning past the keyword once; the 2nd iteration searches for the 2nd "encounter", scanning past it twice; and so on.

                  So if the maximum $loop is 100, that is 5050 scans in total; if $loop goes to 500, 125250 scans. This (500 iterations taking about 25 times as long as 100) fits your results.
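The scan counts quoted above are just the triangular numbers n(n+1)/2, and they predict the measured slowdown well:

```python
def scans(n):
    # Iteration i scans past i occurrences of "encounter", so the
    # total work is 1 + 2 + ... + n = n * (n + 1) / 2.
    return n * (n + 1) // 2

print(scans(100))               # 5050
print(scans(500))               # 125250
print(scans(500) / scans(100))  # ~24.8, close to the measured 285002/11606 ~ 24.6
```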

                   

                  Sorry, I don't have a solution right now; I am only pointing out the cause of your issue.

                  • 6. Re: slow JSON processing speed (BE_JSONPath & JsonGet)
                    user19752

                    If your data has the same fields (keys) on every record (array value), JSON is not a good fit for the purpose; it is too redundant.

                    • 7. Re: slow JSON processing speed (BE_JSONPath & JsonGet)
                      beverly

                      I am aware this could all be done using PHP however I was hoping for the ease and flexibility of using FileMaker scripts to process and deal with these 'JSON payloads'?

                       

                      I'm reading the entire thread and going back to your first post here. I think you answered your own question. You are asking an external process (plug-in) to do something that could be done in PHP (another external process). Your 'ease and flexibility' go straight out the door if your process is too slow to be practical. Sometimes it's a trade-off, but I'll go with speed every time, even if it is not as easy from the start.

                       

                      beverly

                      • 8. Re: slow JSON processing speed (BE_JSONPath & JsonGet)
                        Vincent_L

                        I would create all the records first with your first loop.

                        Then I would get the UUIDs, send all of those UUIDs in one JavaScript call, take the result and write it to a text file with BE_Write, and then finally import it.

                         

                        I think it will be the fastest way.

                         

                        If you don't care about the UUIDs being exact FileMaker UUIDs, I would skip the record creation and generate some kind of UUID in JavaScript.

                         

                        Or, if you don't need the UUIDs in the JavaScript part, I would generate every record in JavaScript, export to text, and then set up an import with an auto-enter field for the UUIDs.

                         

                        It will be much faster.

                        Because imports are the fastest way to create records, and calling a function is expensive. So by limiting yourself to one JavaScript call and using an import, you have the fastest way to go.

                        • 9. Re: slow JSON processing speed (BE_JSONPath & JsonGet)
                          DanielSmith

                          beverly wrote:

                           

                          I'm reading the entire thread and going back to your first post here. I think you answered your own question.

                           

                          Not really. Using PHP to parse the JSON and create records directly in a database is possible, but it will not be as flexible as the initially intended solution using FileMaker. The sample script I attached above is a very small part of the overall system architecture, which is not relevant to this thread, and having the raw payload land in FileMaker and then be processed on demand presents a number of major benefits to the client's workflow. Also, the client does not have PHP skills in house, so catering for minor JSON schema changes would require contractors rather than in-house staff.

                          • 10. Re: slow JSON processing speed (BE_JSONPath & JsonGet)
                            DanielSmith

                            Hi Vincent, thanks for that. Unfortunately the UUID comes from a record created at one of 300+ remote sites and must remain intact. Essentially the 300+ remote sites POST newly created records as JSON; this lands in an intermediate database, and then, once verified, a number of FileMaker databases add the data to their existing datasets on demand. All 140 fields of data for each record must be maintained.

                            • 11. Re: slow JSON processing speed (BE_JSONPath & JsonGet)
                              beverly

                              Well, perhaps I was thinking that instead of a JSON result, you could get CSV or XML or something FMP can digest directly, rather than parsing from the PHP?

                               

                              Yes, the problem is the lack of PHP skills; I get that. Still not easy, and also not fast...

                               

                              beverly

                              • 12. Re: slow JSON processing speed (BE_JSONPath & JsonGet)
                                taylorsharpe

                                Regarding killing FMSE, I haven't had problems with it dying since FMS 13. What version of FMS are you using? How much RAM do you have (just curious, since you say you run out of memory)? Tell us about the server you are using. I was wondering whether you were already on FMS 15; if not, trying it on 15 could solve this PSoS/FMSE problem.

                                • 13. Re: slow JSON processing speed (BE_JSONPath & JsonGet)
                                  DanielSmith

                                  Hi Taylor,

                                   

                                  Dev server is Windows 2012 R2 running FMS 15.0.3 (4 cores and 8GB RAM).

                                  Laptop is macOS Sierra with the latest v15 FMPA & FMS, 16GB RAM.

                                  • 14. Re: slow JSON processing speed (BE_JSONPath & JsonGet)
                                    taylorsharpe

                                    Well, that is the latest server OS and FMS. Assuming this is done at night when there aren't a lot of other processes running, 8 gigs should be more than enough (granted, I like my servers to have 16). Hmmmm, so hardware doesn't look to be the issue.

                                     

                                    Is it possible to parse the JSON into smaller groups? In doing so you could break it down into, say, fourths and spin off 4 processes running at the same time to see if that improves things. This works as long as you don't end up with more processes than cores; it's a simplistic version of parallel processing. The limit will often be the speed of your disk array or SSD rather than your CPU cores. I've done this before and sped up some big jobs this way. You just don't want, say, 4 big processes going while other people are also trying to use FMS.
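The splitting step could be sketched like this (a Python illustration; `chunk_payload` is a hypothetical helper, and each resulting chunk would be handed to its own PSoS session):

```python
import json

def chunk_payload(payload_text, n_chunks):
    """Split the 'encounter' array into n_chunks smaller, self-contained
    payloads so each can be processed by a separate server-side session."""
    entries = json.loads(payload_text)["encounter"]
    size = -(-len(entries) // n_chunks)        # ceiling division
    return [json.dumps({"encounter": entries[i:i + size]})
            for i in range(0, len(entries), size)]

# Usage example: 10 array values split across 4 workers
payload = json.dumps({"encounter": [{"UUID": str(i)} for i in range(10)]})
chunks = chunk_payload(payload, 4)
# chunks hold 3, 3, 3 and 1 array values; each chunk is itself valid JSON
```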
