Add JSON as a valid data source for the 'import records' script step and menu function.
Sometimes ODBC is not possible, and often we get data via JSON.
I like this idea. Really, I do! I've been using FileMaker to "import" text in many formats for years. JSON, like XML, is just another patterned TEXT format. If the pattern is "flat", direct import is easy.
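To make the "flat" case concrete, here is a minimal sketch in Python, standing in for whatever pre-processing tool you have at hand (the sample data and field names are invented for illustration). A flat JSON array of objects maps one-to-one onto rows of a CSV file, which the existing Import Records step can already consume:

```python
import csv
import json

# A "flat" JSON pattern: an array of objects whose values are all scalars.
# Sample data is invented for illustration.
flat_json = """
[
  {"id": 1, "name": "Alice", "city": "Berlin"},
  {"id": 2, "name": "Bob",   "city": "Zurich"}
]
"""

records = json.loads(flat_json)

# Write a CSV that the existing Import Records step can consume directly.
with open("contacts.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=list(records[0].keys()))
    writer.writeheader()
    writer.writerows(records)
```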
However, the beauty of XML, JSON, EDI, etc. is the nested structures with which they can be sent and received. This requires an interim step where the nested data is pushed to related child records.
Just keep those caveats in mind for this type of suggestion.
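To illustrate that interim step, here is a rough sketch (Python again, with invented key names) that splits nested JSON into one row set per table, carrying the parent key down so the child rows can be matched back up after two separate imports:

```python
import json

# Nested JSON: each order (parent) carries an array of line items (children).
# Key names are invented for illustration.
nested_json = """
[
  {"order_id": 100, "customer": "Alice",
   "items": [{"sku": "A-1", "qty": 2}, {"sku": "B-7", "qty": 1}]},
  {"order_id": 101, "customer": "Bob",
   "items": [{"sku": "A-1", "qty": 5}]}
]
"""

parents, children = [], []
for order in json.loads(nested_json):
    parents.append({"order_id": order["order_id"],
                    "customer": order["customer"]})
    for item in order["items"]:
        # Carry the parent key down so the child rows can be imported
        # into a related table and matched to their parent afterwards.
        children.append({"order_id": order["order_id"], **item})

print(parents)   # rows destined for the parent table
print(children)  # rows destined for the child table, keyed by order_id
```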
Looking forward to an overhaul of the import/export mechanism that will allow importing/exporting to multiple related records.
XML is a valid import source, and XML is capable of holding structured data. So why not also allow it for JSON?
JSON import will have restrictions; nested structures with a push to relations would be too much for FileMaker. It should be possible to live with this restriction. ARRAYS could be interpreted as repeating fields.
Maybe: if objects within objects are found, they could be stored as the "sub-object JSON string" in a simple text field. I will happily take care of that after the import.
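A minimal sketch of that interpretation (in Python; the `max_reps` limit and the column naming are assumptions, not anything FileMaker prescribes): arrays map onto repeating-field columns, and objects-in-objects are kept verbatim as a JSON string for later post-processing:

```python
import json

def flatten(record, max_reps=3):
    """Flatten one JSON object into a single importable row:
    arrays become 'repetition' columns, nested objects stay as
    raw JSON text to be post-processed after the import."""
    row = {}
    for key, value in record.items():
        if isinstance(value, list):
            # Array -> repeating field: one column per repetition.
            for i, item in enumerate(value[:max_reps], start=1):
                row[f"{key}[{i}]"] = item
        elif isinstance(value, dict):
            # Object-in-object -> keep the sub-object JSON string
            # in a plain text column, to be dealt with later.
            row[key] = json.dumps(value)
        else:
            row[key] = value
    return row

sample = {"id": 7, "tags": ["red", "blue"],
          "address": {"city": "Berlin", "zip": "10115"}}
print(flatten(sample))
# {'id': 7, 'tags[1]': 'red', 'tags[2]': 'blue',
#  'address': '{"city": "Berlin", "zip": "10115"}'}
```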
Not clear on what the implementation details might look like, but I see a need for something like this.
Thanks for posting.
You may ONLY import into one table at a time. If you choose to import "related" (child) structures into parent fields, then you must post-process to push them into related records. This is true regardless of the IMPORT TYPE. JSON would be no different, even if it is objects in arrays in objects in arrays in objects...
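As a sketch of that post-processing (Python, with hypothetical field names; in FileMaker this would be a script looping over the found set), the child records could be built like this after the one-table import:

```python
import json

# Parent rows as they might look after a one-table import: each row still
# carries its child structures as raw JSON text (hypothetical field names).
imported_rows = [
    {"order_id": 100, "items_json": '[{"sku": "A-1", "qty": 2}]'},
    {"order_id": 101, "items_json": '[{"sku": "B-7", "qty": 1}]'},
]

related_records = []
for row in imported_rows:
    # Post-process: push each embedded child object into its own
    # related record, keyed back to the parent it came from.
    for item in json.loads(row["items_json"]):
        related_records.append({"order_id": row["order_id"], **item})

print(related_records)  # one related record per embedded child object
```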
While trying to create a local version of an app, importing and exporting data to keep home base up to date makes the process slow, and it might fail if your internet connection drops in the middle of the import. API calls for the entire data set, on the other hand, are really fast, sometimes feeling almost instant ... but then parsing the JSON result into a table, even a table with no validation or auto-enter fields, takes 10 times longer than simply importing it. Therefore I really do see a need for this, with all the subsequent limitations it might come with.