8 Replies Latest reply on Jan 19, 2017 9:53 AM by jbante

    Best methods for importing/storing/reporting JSON time series data


      Hi there, Everyone!


      Looking for some help/suggestions regarding some best practices and recommendations for dealing with time series sensor data.


      I'm looking to take JSON data from an API, store it in some form in FileMaker, and then visualize it.


      The data itself is essentially just large collections of name/value pairs (a timestamp and a sensor value). The problem is dealing with the volume of data. I'm pulling values every 30 seconds from 4 tags from 3 devices 24/7 which equates to just about 35,000 data points a day.
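      For what it's worth, the stated volume checks out; a quick sketch of the arithmetic:

```javascript
// Sanity check on the stated volume: one reading every 30 seconds,
// from 4 tags on each of 3 devices, around the clock.
const readingsPerTagPerDay = (24 * 60 * 60) / 30; // 2,880 readings per tag
const totalPerDay = readingsPerTagPerDay * 4 * 3; // 4 tags x 3 devices

console.log(totalPerDay); // 34560, i.e. "just about 35,000" a day
```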


      I don't think storing each name/value pair in its own record is a good option; that's just a crazy amount of data to be doing finds on to bring into charts for visualization.


      Looking for alternatives to just storing every data point as its own record, and maybe even alternatives to visualizing/charting the data outside of FileMaker's options (for example, I've used Google Charts previously to do some visualization in a web viewer). Specifically looking for alternatives that make the data quickly findable and retrievable for display.


      I attached a copy of some of the type of JSON data that I'm working with that's returned to me from the API calls.


      Be really interested to hear any ideas/successful experiences/experiments that you all have done.


      Thanks tons in advance!

        • 1. Re: Best methods for importing/storing/reporting JSON time series data
          Johan Hedman

          There are a number of plugins that can parse JSON into FileMaker


          Base Elements


          and more


          But besides that, you also have a few FM-native modules that you can use.



          Here you can read a very good article about parsing JSON in FileMaker:

          How to Parse JSON (or any other language) in FileMaker - Soliant Consulting 

          1 of 1 people found this helpful
          • 2. Re: Best methods for importing/storing/reporting JSON time series data


            I'm not sure this is the best approach (maybe not).

            But if you think that one name/value pair per FileMaker record is not a good idea,

            how about this?


            /* In JS, convert the date/value pairs to parallel arrays */


            var dateArray = [date1, date2, ...];

            var valueArray = [value1, value2, ...];


            var obj = {

              dates: dateArray,

              values: valueArray

              /* add additional info if you want */

            };


            var dataToSave = JSON.stringify(obj);

            /* Save this to FileMaker, using FMPURL or some other technique */

            2 of 2 people found this helpful
            • 3. Re: Best methods for importing/storing/reporting JSON time series data

              I think storing every datapoint in a separate record is precisely the way it should be done. Datapoints are the basic elements (i.e. records in a database) in this problem, and using them as records is the logical approach ...


              I use a similar approach to processing a directory listing from a webserver, to get me a list of files, dates and sizes, and (crucially) the full path name. The directory listing is a little over 355,000 lines (about 20MB of data), and I repeat this process for every update of the website (from 1 to maybe 6 times a day) ... On a MacBook Pro (2.2GHz quad core i7) this takes a little over 30 minutes ... using the database even with 355,000 records is no problem whatsoever ... (I even have FileMaker databases with more than 5,000,000 records ...)


              The most difficult part is extending the listing from filenames grouped by folder to full pathnames for every file. You have a similar problem extending the "header information" (tags, devices, ...) to be included in every record with the datapoints. If you structure the database this way, records are in fact very simple and short, and the solution can be extended to include more tags and devices without the need to restructure the database or the processing ...

              You can just select whatever devices/tags are of interest for a specific reporting ...


              Unlike the directory listing, which is essentially a return-delimited list of records, your file is one long string. This has to be preprocessed before importing into FileMaker or any other database.

              Basically, add returns to split the string into a list of lines, one per "datapoint string", and then remove non-data text elements like "date", "value", brackets, etc ...

              Depending on location/number format standards, you also have to decide whether to change values from decimal point to decimal comma and import them as numbers, or leave them as is and process the (string) field after importing.

              The timestamp data has to be "cleaned up" as well, again, either in the preprocessing, or after import, to match the internal representation FileMaker uses ...
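              To make the preprocessing concrete, here is a minimal sketch. The payload shape (an array of {date, value} objects) and the sample data are assumptions; adjust the key names to the real API response:

```javascript
// Preprocessing sketch: turn the one long JSON string into a
// return-delimited list of "timestamp<TAB>value" lines for import.
const raw = '[{"date":"2017-01-19T09:00:00Z","value":3.14},' +
            '{"date":"2017-01-19T09:00:30Z","value":2.72}]';

const lines = JSON.parse(raw).map(p => {
  // Reformat the ISO timestamp to a form FileMaker can import
  // as a timestamp (MM/DD/YYYY H:MM:SS, kept in UTC for simplicity).
  const d = new Date(p.date);
  const pad = n => String(n).padStart(2, '0');
  const ts = (d.getUTCMonth() + 1) + '/' + d.getUTCDate() + '/' +
             d.getUTCFullYear() + ' ' + d.getUTCHours() + ':' +
             pad(d.getUTCMinutes()) + ':' + pad(d.getUTCSeconds());
  return ts + '\t' + p.value;
}).join('\n');

console.log(lines);
// 1/19/2017 9:00:00	3.14
// 1/19/2017 9:00:30	2.72
```

              The decimal point/comma conversion mentioned above would slot in where the value is written out.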


              Hope this gives you an idea to go forward.


              I would be very interested in hearing how you go forward and the resulting solution 



              1 of 1 people found this helpful
              • 4. Re: Best methods for importing/storing/reporting JSON time series data

                Hey Johan,


                Thanks for the replies. I think the biggest challenge for me is figuring out the best method to store, find, and retrieve the data. I'm not too concerned with the conversion of the JSON. I'm using some Python to bring in the JSON, parse it, and make it available to FileMaker Server. I've actually tried out the ModularFileMaker modules and read that Soliant Consulting blog (and others they've posted about handling JSON; super helpful btw!). Really good suggestions and helpful.


                At the moment, the biggest challenge I have is deciding whether to just go for putting each value pair into its own record or grouping them by time (i.e. taking all the values in a 10-minute period, return-delimiting them, and putting them into a single record). I'll definitely be doing the finds requested by the user via FileMaker Server, especially since this solution is going to be accessed on mobile devices. But even that I'm not really sure is the best (i.e. fastest) method for retrieving and displaying the data in a trend chart/multiple trend charts.
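                A minimal sketch of the 10-minute grouping option, assuming Unix-epoch timestamps (the field names are illustrative, not from any actual solution):

```javascript
// Sketch of the "group by 10 minutes" option: each bucket becomes one
// candidate record holding a return-delimited value list.
const points = [
  { ts: 1484816400, value: 1.1 }, // 2017-01-19 09:00:00 UTC
  { ts: 1484816430, value: 1.2 }, // 09:00:30, same 10-minute bucket
  { ts: 1484817030, value: 1.3 }, // 09:10:30, next bucket
];

const buckets = {};
for (const p of points) {
  const bucketStart = Math.floor(p.ts / 600) * 600; // 600 s = 10 min
  if (!buckets[bucketStart]) buckets[bucketStart] = [];
  buckets[bucketStart].push(p.value);
}

// One object per bucket: the start time is what the user would find on
const records = Object.entries(buckets).map(([start, values]) => ({
  startTime: Number(start),
  valueList: values.join('\r'), // CR is FileMaker's value-list separator
}));
```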

                • 5. Re: Best methods for importing/storing/reporting JSON time series data

                  Thanks for the suggestion, Sam! I hadn't considered using JavaScript, but I'm definitely toying with this solution (storing array objects in FM fields in a record).


                  The questions I have in that case are:


                  1. What's the best method for the user to find the data? In other words, what could they search on? I could timestamp when the record was created and put the time or date range of the data in another FM field, I suppose. But I really want the user to be able to select to see data in no smaller than 10-minute chunks.


                  2. How does the visualization come into play? I need some kind of trend visualization. Would I be able to use FileMaker's charts in the case that you're suggesting? Would I have to pass the contents to a third-party tool through a web viewer?



                  • 6. Re: Best methods for importing/storing/reporting JSON time series data

                    Hi, thefedorcompany

                    I have a question: where do you get the JSON data from?

                    Via a web viewer using Ajax?

                    If so, you can easily visualize the data using JavaScript.

                    After the visualization, in the saving phase, transform the JSON data so that you can easily find it afterwards.

                    E.g. divide the data by time (duration) and label it.

                    Labelling means creating a new key (e.g. "startTime") with the time data. (Insert this into your modified JSON.)

                    Then, save this to FileMaker.

                    In FileMaker, using this label, you can find the data easily.

                    In reporting, pass the found data to webviewer and rebuild data object and visualize it.
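                    A minimal sketch of that labelling idea, assuming epoch-second timestamps (the "startTime" key name and the point shape are illustrative):

```javascript
// Sketch of the labelling step: slice the series into fixed durations,
// wrap each slice in an object with a searchable "startTime" key, and
// stringify it for storage in a FileMaker field.
function labelByDuration(points, durationSec) {
  const groups = new Map();
  for (const p of points) {
    const start = Math.floor(p.ts / durationSec) * durationSec;
    if (!groups.has(start)) groups.set(start, []);
    groups.get(start).push(p);
  }
  // One labelled object per slice, ready to save and find on later
  return [...groups.entries()].map(([startTime, data]) => ({
    startTime,
    json: JSON.stringify({ startTime, data }),
  }));
}

const labelled = labelByDuration(
  [{ ts: 0, value: 1 }, { ts: 30, value: 2 }, { ts: 600, value: 3 }],
  600 // 10-minute slices
);
```

                    On the reporting side, the found records' json fields would be parsed back and handed to the web viewer for charting.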


                    I hope this will help.

                    • 7. Re: Best methods for importing/storing/reporting JSON time series data



                      Really interesting suggestion. I'm definitely willing to try it. However, I ran the numbers yesterday and it looks like I would arrive at the 5M-record point within ~140 days, assuming 35k data points (aka records) a day. I know that the maximum number of records in a table is WAY higher than 5M, but at some point server performance with finds becomes a serious issue (and I don't actually know what that performance threshold is).


                      Like you pointed out, the good thing is that the values are short (one timestamp field and one small float value). And those values aren't changing at all, just being stored, so they can be indexed to drastically improve performance. And you're also right about the flexibility that allows additions or edits to be made more easily in the future.


                      Good points about needing to parse the data. Not too worried about that at all. As other people have pointed out on this thread, there are a ton of good resources out there that will parse through JSON to make sure that I can get what I need to do an import into FM and create the records.


                      I'll be interested to know how I go forward too! =P

                      • 8. Re: Best methods for importing/storing/reporting JSON time series data

                        In any kind of analysis of this kind of high-density stream data, the conventional practice is to periodically calculate summaries over some time window and archive or discard the raw data. Which summary over which time window to calculate will depend on what kind of analysis you want to do.
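                        As a sketch of that practice, assuming epoch-second timestamps and a simple min/max/mean summary per window (all names are illustrative):

```javascript
// Sketch of periodic summarization: collapse the raw 30-second readings
// into one min/max/mean row per window (hourly here), after which the
// raw rows can be archived or discarded.
function summarize(points, windowSec) {
  const windows = new Map();
  for (const p of points) {
    const start = Math.floor(p.ts / windowSec) * windowSec;
    if (!windows.has(start)) windows.set(start, []);
    windows.get(start).push(p.value);
  }
  return [...windows.entries()].map(([windowStart, vals]) => ({
    windowStart,
    min: Math.min(...vals),
    max: Math.max(...vals),
    mean: vals.reduce((a, b) => a + b, 0) / vals.length,
    count: vals.length,
  }));
}

const summary = summarize(
  [{ ts: 0, value: 2 }, { ts: 30, value: 4 }, { ts: 3600, value: 6 }],
  3600 // one-hour windows
);
// summary[0] is { windowStart: 0, min: 2, max: 4, mean: 3, count: 2 }
```

                        At ~35k raw points a day, hourly summaries reduce the stored rows to 288 per day (12 series x 24 hours), which keeps finds and charts fast indefinitely.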

                        2 of 2 people found this helpful