Hi, I'm trying to figure out the best way to do large imports via HTTP requests.
I have an "INVENTORY" table with item IDs (over 8,000 items), and I need to import updated buy and sell orders for each product into related tables. These orders need to be refreshed frequently, so I thought the best approach would be to create two related tables ("BUY ORDERS" and "SELL ORDERS"), joined to the inventory by item ID, and purge them (deleting all records) before each update. That seems especially reasonable since the web requests have to be made separately for buy and sell orders anyway.
The problem is that the updated information is gathered via HTTP requests that return XML. I can request multiple IDs in one call, but since request URLs are limited in length (a bit over 2,000 characters), I thought of building a variable list to request one block of item IDs at a time and looping until all items are imported.
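To make the idea concrete, here is a minimal sketch of that batching loop in Python. The endpoint URL and its `ids` query parameter are hypothetical placeholders (substitute your actual API); the chunking logic just packs as many comma-separated IDs as fit under the URL length limit before starting a new request.

```python
import urllib.request
import xml.etree.ElementTree as ET

MAX_URL_LEN = 2000
# Hypothetical endpoint and parameter name; replace with the real API's URL format.
BASE_URL = "https://example.com/api/orders?type=buy&ids="

def chunk_ids(item_ids, base_len, max_len=MAX_URL_LEN):
    """Group IDs into comma-separated blocks so each request URL stays under max_len."""
    chunks, current, length = [], [], base_len
    for item_id in item_ids:
        piece = len(str(item_id)) + 1  # +1 for the comma separator
        if current and length + piece > max_len:
            chunks.append(current)          # flush the full block
            current, length = [], base_len  # start a new one
        current.append(str(item_id))
        length += piece
    if current:
        chunks.append(current)
    return chunks

def fetch_orders(ids_block):
    """Request one block of IDs and parse the XML response."""
    url = BASE_URL + ",".join(ids_block)
    with urllib.request.urlopen(url) as resp:
        return ET.fromstring(resp.read())

# Usage: purge the orders table first, then loop over the blocks.
# for block in chunk_ids(all_item_ids, base_len=len(BASE_URL)):
#     tree = fetch_orders(block)
#     ...insert parsed records into the BUY ORDERS table...
```

With ~8,000 items this typically works out to a few dozen requests per pass, which keeps each URL safely under the limit while minimizing round trips.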
I'm wondering if anyone can think of a better way of doing this.