DanielSmith

slow JSON processing speed (BE_JSONPath & JsonGet)

Discussion created by DanielSmith on Nov 27, 2016
Latest reply on Nov 29, 2016 by Johan Hedman

I'm developing a solution where a JSON payload is submitted via the PHP API into a record in a FileMaker database. Periodically a server-side script checks for new records, validates them, and then processes the data, creating/updating various records based on business logic. Everything works well in initial testing, but as the size of the JSON scales up, processing time and RAM usage grow far faster than linearly.


The JSON contains a single array whose elements each hold ~140 key/value pairs. During testing I had ~10 array elements but have since scaled up to ~500, though we could expect over 1000 elements in the real world, which represents ~2MB of plain text being POSTed.
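
For reference, the payload shape is roughly the following ('encounter' and 'UUID' are the real key names, taken from the paths used in the tests below; the others here are just placeholders):

{
  "encounter" : [
    { "UUID" : "...", "key2" : "...", ... ~138 more pairs ... },
    ... up to ~1000 more elements ...
  ]
}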


Each test takes a single payload of 100 (and then 500) array elements containing 140 key/value pairs each.

The script loops through each element, creates a record (via Selector Connector) in a table, and sets a 'UUID' field, as sketched below. At this point I am ignoring the remaining 139 fields of data.
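
Simplified, the loop looks like this ('Encounter' is an illustrative table name, and the real script creates the record via Selector Connector rather than a plain New Record/Request):

Set Variable [ $count ; Value: 100 ]
# $count is 100 or 500 depending on the run
Set Variable [ $loop ; Value: 0 ]
Loop
   Exit Loop If [ $loop ≥ $count ]
   New Record/Request
   # TEST 3 expression shown; TEST 1 swaps in Get ( UUID ), TEST 2 swaps in jsonGet
   Set Field [ Encounter::UUID ; BE_JSONPath ( PAYLOAD::payload ; "$.encounter[" & $loop & "].UUID" ) ]
   Set Variable [ $loop ; Value: $loop + 1 ]
End Loop
Commit Records/Requests [ With dialog: Off ]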


All tests run locally on a fully specced MacBook Pro (Retina) with FMPA 15. During every test FMPA has a single core pegged at 100%.


TEST 1

A control test that skips the JSON entirely, to baseline record-creation speed: the UUID field is set with 'Get ( UUID )'.

100 records are created in 263ms

500 records are created in 1195ms (processing time scales roughly linearly, as expected)


TEST 2

Using the Geist Interactive JSON functions (which wrap BaseElements under the hood anyway) to set the UUID from the JSON using 'jsonGet ( PAYLOAD::payload ; "encounter[" & $loop & "].UUID" )'

100 records are created in 11606ms

500 records are created in 285002ms


TEST 3

Using the BaseElements (latest stable version) JSON functions to set the UUID from the JSON using 'BE_JSONPath ( PAYLOAD::payload ; "$.encounter[" & $loop & "].UUID" )'

100 records are created in 6055ms

500 records are created in 149280ms


Looking at these tests, the processing time for simple record creation scales linearly, but extracting even a single key/value pair per array element does not: going from 100 to 500 records (5x) multiplies the time by roughly 25x in both JSON tests, which is close to quadratic. Presumably each jsonGet/BE_JSONPath call re-parses the entire payload from scratch, so the total work grows with records × payload size.

For the slowest test that is 4 mins 45 secs to create 500 records and set just a single field in each, from JSON stored in a related text field. Expanding this to all 140 key/value pairs per element would give an estimated 11 hours of processing time!! The other side effect when run as PSoS on a server is that the FMSE process consumes all available RAM and, if left running long enough, will eventually crash.
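
One mitigation I'm considering (untested, and assuming BE_JSONPath returns the matched element as JSON text): extract each array element once per loop pass, then run the 140 per-key extractions against that small fragment instead of the full ~2MB payload. It doesn't remove the per-record scan of the whole document, but it would cut that from 140 full parses per record to one:

# Pull the whole element out once (one full-payload parse per record)
Set Variable [ $element ; Value: BE_JSONPath ( PAYLOAD::payload ; "$.encounter[" & $loop & "]" ) ]
# Per-key extractions then parse only the small fragment
Set Field [ Encounter::UUID ; BE_JSONPath ( $element ; "$.UUID" ) ]
# ...repeat for the remaining 139 fields against $element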


Has anyone else experienced this type of speed degradation when parsing JSON in FileMaker? Or has anyone dealt with JSON of this size (70k+ lines)? I am aware this could all be done in PHP, but I was hoping for the ease and flexibility of using FileMaker scripts to process these JSON payloads.


(Screenshot attached shows a cut-down version of the current script.)


Thanks,
