I'm falling in love with APIs and pulling data down via JSON, but I'm struggling to find a good way to take a large data set (more than 8,000 records) and narrow it to a more manageable amount.
What I'm currently doing is using the Insert from URL option to pull my API data down and put it in a variable.
From there I've been able to add records based on the data I've pulled, using a script that loops through each array element until it returns a blank value (i.e., I've run out of array entries).
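For context, the loop I'm running is roughly equivalent to this Python sketch (the JSON layout and field names here are made up for illustration; my real data just needs to be a top-level array of records):

```python
import json

# Hypothetical sample of what an API might return: a top-level array of records.
api_response = json.dumps([
    {"id": 1, "last_action_date": "2018-01-22"},
    {"id": 2, "last_action_date": "2018-01-15"},
    {"id": 3, "last_action_date": "2018-01-22"},
])

def pull_records(raw_json):
    """Walk the array by index until we run past the end,
    mirroring a loop that exits on the first blank value."""
    data = json.loads(raw_json)
    records = []
    x = 0
    while True:
        try:
            record = data[x]  # grab element $x of the array
        except IndexError:
            break  # blank / out of array entries: stop looping
        records.append(record)
        x += 1
    return records

print(len(pull_records(api_response)))  # → 3
```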
My problem is that once it grew past 4,000 records, the process became unmanageable.
I'd love to either (A) filter down to just the records I need (in this case via a field in the JSON, for example where "last_action_date" = 2018-01-22, i.e., today) and only add those; or
(B) keep doing what I've been doing but find a more efficient way to do it. Currently I start at array index $x, which is always the first field/record in the API's JSON (still getting used to the terminology), and run a loop, incrementing $x each time and pulling out each record until there are none left.
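Option (A) is what I'm really after. Outside my current tool, the idea can be sketched in Python like this (the field name "last_action_date" comes from my JSON; the sample data and function name are just placeholders):

```python
import json

# Hypothetical sample response: a top-level array of records.
api_response = json.dumps([
    {"id": 1, "last_action_date": "2018-01-22"},
    {"id": 2, "last_action_date": "2018-01-15"},
    {"id": 3, "last_action_date": "2018-01-22"},
])

def filter_by_date(raw_json, target_date):
    """Keep only the records whose last_action_date matches target_date,
    so the record-creation loop only touches rows we actually need."""
    return [r for r in json.loads(raw_json)
            if r.get("last_action_date") == target_date]

matches = filter_by_date(api_response, "2018-01-22")
print(len(matches))  # → 2
```

Filtering before the loop means the slow per-record work only runs on today's records instead of all 8,000.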
Again, this works fine with a few records, but once I got into the thousands it pulled only about 250 records per 10 minutes of running. That puts me well over 5 hours of script time to get the fields updated by this process.
Help, I know there has to be a better way!