Using Offset on Time?

I am using Integromat and used its "time now" function to pull sales for today's date minus 5 minutes, so each pull only fetches sales from the last 5 minutes.
The time came through as 19:55:24 instead of 13:55:24, which would be about 1:55 pm Central time, so I'm guessing the Lightspeed server is 6 hours ahead of me?
I was reading this page about applying an offset, but I'm not sure how to add it to the URL.
Any help with an example URL using US Central Time, based on the link I have below, would be very much appreciated.
https://developers.lightspeedhq.com/retail/introduction/parameters/
https://api.lightspeedapp.com/API/Account/{{Account_ID}}/Sale.json?load_relations=["Customer","Customer.Contact"]&timeStamp=><,2018-12-21T00:00:00+,2018-12-21T23:59:00
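Since the API compares timeStamp values in UTC, one way to sidestep the offset problem is to build the 5-minute window in UTC on your side. A minimal Python sketch of the timestamp math (the account ID is a placeholder, and `sale_url` is my own helper name, not part of any SDK):

```python
from datetime import datetime, timedelta, timezone

def sale_url(account_id, minutes=5):
    # Build the window in UTC so it matches what the API stores,
    # regardless of the server's local offset from Central time.
    now = datetime.now(timezone.utc)
    start = (now - timedelta(minutes=minutes)).isoformat(timespec="seconds")
    end = now.isoformat(timespec="seconds")
    return (
        "https://api.lightspeedapp.com/API/Account/"
        + account_id
        + '/Sale.json?load_relations=["Customer","Customer.Contact"]'
        + "&timeStamp=><," + start + "," + end
    )

print(sale_url("12345"))
```

The `><` operator with two comma-separated values is the documented "between" filter; the `+00:00` suffix that `isoformat` produces marks the timestamps explicitly as UTC.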
https://api.lightspeedapp.com/API/Account/{{Account_ID}}/Sale.json?load_relations=["Customer","Customer.Contact"]&timeStamp=><,2018-12-27T20:33:08.946Z+,2018-12-27T20:38:08.946Z&offset=UTC-06:00
https://api.lightspeedapp.com/API/Account/{{Account_ID}}/Sale.json?load_relations=["Customer","Customer.Contact"]&timeStamp=><,2018-12-27T22:04:17.863Z-6:00+,2018-12-27T22:09:17.865Z-6:00
The time shows 22 but I believe it should say 16 for 4 pm Central.
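To double-check that conversion: 22:04 UTC is indeed 16:04 Central during standard time (UTC-06:00). A small Python sketch of the same math:

```python
from datetime import datetime, timedelta, timezone

# Fixed CST offset; during daylight saving Central is UTC-05:00.
central = timezone(timedelta(hours=-6))

utc_time = datetime(2018, 12, 27, 22, 4, 17, tzinfo=timezone.utc)
local = utc_time.astimezone(central)
print(local.isoformat())  # 2018-12-27T16:04:17-06:00
```

Note that appending `-6:00` after a trailing `Z` (as in the URL above) mixes two offset notations; the timestamp should carry either `Z` or a single `-06:00` suffix, not both.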
I guess when there is no sale you get a data output like this? You also mentioned a 100-record limit. Does that mean each pull will only return 100 sales? Just want to make sure I don't run into any hiccups later or hit that bucket limit.
Thanks for your input.
It's a shame that pushing new records into LS Retail via the API doesn't allow batch recordsets like that. Those pushes are typically one record at a time, which makes importing data into LS Retail a very expensive operation. Worst case is one record push every 10 seconds, I believe, due to API throttling.
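A sketch of what that one-record-at-a-time import loop looks like with worst-case 10-second spacing. The `send` callable here is a placeholder for whatever single-record POST you use; it is not a Lightspeed client method:

```python
import time

def push_records(records, send, min_interval=10.0):
    # Push records sequentially, spacing calls at least
    # min_interval seconds apart to stay under the throttle.
    for rec in records:
        started = time.monotonic()
        send(rec)  # your single-record POST to the API
        elapsed = time.monotonic() - started
        if elapsed < min_interval:
            time.sleep(min_interval - elapsed)
```

At 10 seconds per record, a 10,000-record import takes over a day, which is why bulk imports through this API are so expensive.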
This is a 23-location supplement store, so every 5 minutes should be sufficient. I can't imagine there would be 100 sales in any 5-minute window, but I want to send customers other data fairly quickly after they leave.
Thanks for the insight on that.
Can you refresh me on how that bucket limit works, though? If you are trying to stay under the 100-record page limit, could that get you in trouble with the bucket limit? Surely that's only a concern for hundreds of stores with a boatload of sales, hopefully.
You need to revisit the rate limits and the parameters in the API documentation.
These will show you how to determine how long to wait before your next API call, as well as how to specify which page you are requesting. With your example of 500 sales per day, that's only 5 API calls, which should take only a minute or so to pull and parse, even with loaded relations!
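The page arithmetic for that example, assuming the 100-record page cap mentioned in this thread:

```python
import math

total_records = 500   # example daily sales count from above
limit = 100           # page size cap per API call

pages = math.ceil(total_records / limit)
offsets = [p * limit for p in range(pages)]
print(pages, offsets)  # 5 [0, 100, 200, 300, 400]
```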
Well, since I have it pulling every 5 minutes and I'm dynamically changing the time window to only pull sales within the last 5 minutes, I shouldn't have to worry about second pages unless there are over 100 sales in any 5-minute period.
If that were the case, I guess I would have to set up an if statement: if over 100 records, start a new HTTPS request with the offset added to the URL?
I think that's what you are saying.
GET https://api.merchantos.com/API/Account/{AccountId}/Sale.json?archived=false&load_relations=[%22SaleLines%22,%22SaleLines.Discount%22,%22SaleLines.Note%22,%22SaleLines.Item%22,%22SalePayments%22,%22SalePayments.PaymentType%22,%22SalePayments.SaleAccounts%22,%22Customer%22,%22Customer.Contact%22,%22Discount%22]&offset=0&limit=100&timeStamp=%3E%3C,2018-12-27T11:00:00-5:00,2018-12-27T12:00:00-5:00
And here would be the second page pull:
GET https://api.merchantos.com/API/Account/{AccountId}/Sale.json?archived=false&load_relations=[%22SaleLines%22,%22SaleLines.Discount%22,%22SaleLines.Note%22,%22SaleLines.Item%22,%22SalePayments%22,%22SalePayments.PaymentType%22,%22SalePayments.SaleAccounts%22,%22Customer%22,%22Customer.Contact%22,%22Discount%22]&offset=100&limit=100&timeStamp=%3E%3C,2018-12-27T11:00:00-5:00,2018-12-27T12:00:00-5:00
The first (default) pull is offset=0, which skips no records. Once I see that there are 105 sales in that first response, I would instruct my second pull to use offset=100 to skip over the first 100 records. So my second pull would grab records 101 through 105 and complete the routine.
Any code you have should evaluate the @attributes, set a variable to the total record count, and then perform simple ceiling math to determine how many offsets you will need to specify in separate API calls...
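A sketch of that @attributes check, assuming the response has been parsed into a dict whose @attributes block carries the total count, as in the documentation's examples. The helper name is mine:

```python
import math

def offsets_needed(response, limit=100):
    # The response's @attributes block reports the total number of
    # matching records, e.g. {"@attributes": {"count": "105"}}.
    count = int(response["@attributes"]["count"])
    pages = math.ceil(count / limit)
    # The first page (offset=0) was already fetched, so return
    # only the offsets for the follow-up calls.
    return [p * limit for p in range(1, pages)]

print(offsets_needed({"@attributes": {"count": "105"}}))  # [100]
```

For the 105-sale example above this yields a single follow-up call with offset=100, which grabs records 101 through 105.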
I guess I could just set up a router with a filter: if page 1 overflows, the HTTPS request for the next pull would use offset=100; if page 2 overflows, offset=200, and so on. That would poll records 201 through 300, correct?
Then, once you know your average daily sales, throw a couple of extra requests on there just in case, which would cover another 200 over your average...