Rate Limit

jtellier Member Posts: 44

I have a process that makes a service call every 5 seconds while it processes a queue of required changes. I get:

"httpCode":"429","httpMessage":"Too Many Requests","message":"Rate Limit Exceeded: Please decrease your request volume. To eliminate this message, keep your request rate below 1 requests per second."

...even though it is one call every 5 seconds. How can we handle this better or set something? It is imperative that these operations occur.
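One common way to survive intermittent 429s is to retry with a delay, preferring the server's Retry-After hint when one is present. A minimal sketch, assuming your request wrapper can report the status code and any Retry-After value (the 10-second base delay is an assumption based on the PUT limit discussed later in this thread; check your account's actual limits):

```python
import time

def call_with_backoff(do_request, max_retries=5, base_delay=10.0):
    """Retry a request callable when it signals HTTP 429.

    do_request() should return (status_code, retry_after_seconds_or_None).
    base_delay of 10 s is an assumption matching the 1 PUT / 10 s budget
    mentioned in this thread; verify against your account's limits.
    """
    for attempt in range(max_retries):
        status, retry_after = do_request()
        if status != 429:
            return status
        # Prefer the server's Retry-After hint; otherwise back off exponentially.
        delay = retry_after if retry_after is not None else base_delay * (2 ** attempt)
        time.sleep(delay)
    return 429  # still throttled after all retries; let the caller decide
```

The injectable `do_request` callable keeps the retry logic separate from the HTTP client, so the same loop works with any library you use for the actual calls.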

9 comments

  • gregarican Member Posts: 593

    If it's a GET request, the standard is usually 1 call per second. If it's a PUT or POST request, the standard is around 1 call every 10 seconds, if I'm not mistaken. You can always inspect the response headers, which will tell you how your calls are affecting the leaky bucket stats.
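Those leaky-bucket headers can drive the pacing directly. A sketch, assuming a bucket header in a "used/size" format plus a drip-rate header (the exact header names, e.g. X-LS-API-Bucket-Level and X-LS-API-Drip-Rate, and the per-request point costs are assumptions; verify them against the vendor's rate-limit docs):

```python
def parse_bucket_level(header_value):
    """Parse a bucket header like '58/180' into (level, size).

    The 'used/size' format is an assumption based on leaky-bucket-style
    rate-limit headers; confirm the real format in the API docs.
    """
    level, size = header_value.split("/")
    return float(level), float(size)

def seconds_until_capacity(level, size, drip_rate, cost=10.0):
    """Seconds to wait until a request costing `cost` points fits the bucket.

    drip_rate is how many points drain per second; cost=10 is an assumed
    price for a write request.
    """
    available = size - level
    if available >= cost:
        return 0.0
    return (cost - available) / drip_rate
```

Sleeping for `seconds_until_capacity(...)` before each call keeps you under the limit without hard-coding a fixed interval.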

  • jtellier Member Posts: 44

    I changed it from one call every 5 seconds to one PUT every 10 seconds and it seems to work, so I assume you have some timing bugs on your end counting calls, since one a second should work..... We have 30,000 calls queued up... but the headers do not show any limiting data.

  • VintageWineGuy Member Posts: 112 ✭

    I assume you have reviewed this?

    It sounds like you are filling the bucket too fast.

  • jtellier Member Posts: 44

    Yes, I see it allows 1 PUT per 10 seconds.... That just doesn't seem like a viable rate for any productive integration, but we will keep our systems one-way wherever possible.

  • VintageWineGuy Member Posts: 112 ✭

    Yeah, bulk adds are not fast. I run a lot of mine at night, when the bucket size and drip rate increase. And I don't design anything bulk that requires real time.

  • jtellier Member Posts: 44

    As I said, one PUT every 10 seconds works. But since we have 30,000 calls queued up, the operation that should take about 30 seconds is taking about 4 days (30,000 calls × 10 s is roughly 3.5 days of pure waiting).... This can't be right.

  • gregarican Member Posts: 593

    @jtellier you have definitely stumbled upon a limitation in terms of integrating with the LS Retail platform. Any smaller client with a couple hundred customers, a couple hundred SKU's, and a couple dozen daily sales will usually encounter no issues with API integrations. And that frankly comprises most of the client base.

    Larger clients are the outliers. To give you an idea about scalability, every night I pull out that day's sales transactions and the current on-hand product listing, along with product tags and other related data sets. This routine currently takes about 3 hours for almost 9,000 items, using GET requests that I hard-code to 1 request per second.

    When we initially cut over to LS Retail I had to import all products, hard-coding the routine to 1 request every 10 seconds per the rate limits. As the math says, this initial routine took about 30 hours to complete. Since the company was regularly closed on Sunday and Monday, I was able to do this behind the scenes with no service interruption.
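The back-of-the-envelope math in these comments is just requests times interval. A one-liner for sanity-checking bulk jobs before you start them (the request counts below are illustrative, derived from the durations quoted in this thread, not from the actual data sets):

```python
def bulk_runtime_hours(n_requests, seconds_per_request):
    """Back-of-the-envelope wall-clock time for a rate-limited bulk job."""
    return n_requests * seconds_per_request / 3600.0

# A 30-hour import at 1 request / 10 s implies roughly 10,800 requests.
# 30,000 queued PUTs at 1 per 10 s is about 83 hours (~3.5 days), which
# lines up with the "about 4 days" figure earlier in the thread.
```

Running this estimate up front tells you whether a job fits in an overnight or weekend window before you commit to it.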

    If you are looking to sync up records between different systems, I'd suggest performing an initial dump of any/all LS Retail records to a middleware platform. Then you can query LS Retail for just the updated records, based on their last-edited timestamp. Those delta data sets won't be nearly as large, unless we are talking about thousands of sales transactions or something like that. Which would be a good problem to have 😀
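The delta-sync approach boils down to remembering the last successful sync time and filtering on it. A sketch, where the "timeStamp" field name and the ">" filter operator syntax are assumptions; verify the actual field and operator format against the API's search documentation:

```python
from datetime import datetime, timezone

def delta_query_params(last_sync):
    """Build query params requesting only records edited after last_sync.

    Assumes a 'timeStamp' field filtered with a '>' operator, producing
    something like timeStamp=>,2024-01-01T00:00:00+00:00 (hypothetical;
    check the real API docs before relying on this format).
    """
    return {"timeStamp": ">," + last_sync.isoformat()}
```

After each successful sync you persist the new high-water-mark timestamp, so the next run only pulls the (much smaller) delta.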

    The scalability of this is specifically the reason why we didn't cut over our larger company to the platform. We use the API to plug quite a few gaps, and we certainly couldn't do so for a company that has hundreds of thousands of customers, SKUs, etc.

  • jtellier Member Posts: 44

    Yes, that is almost exactly what I am doing now. On the initial import we were trying to supplement the product data, since we have a much more extensive data set for product information; e.g., if a SKU or description is missing, we could feed it back into Lightspeed.... With our first integration that left 30,000 calls for SKU updates alone. So I ended up just creating a queue, and Lightspeed can get the data we provide whenever they allow it... days, weeks, etc.
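A queue like that needs only a pacing loop that never exceeds the budget. A minimal sketch with an injectable clock and sleep so it can be tested without actually waiting (the 10-second interval mirrors the PUT limit discussed in this thread and is an assumption about your account; `send` stands in for whatever function performs the real API call):

```python
import time

def drain_queue(items, send, min_interval=10.0, clock=time.monotonic, sleep=time.sleep):
    """Send queued updates no faster than one per min_interval seconds.

    send(item) performs the actual API call; error handling and retries
    are left to the caller (e.g. by wrapping send with backoff logic).
    Returns the number of items sent.
    """
    last = None
    sent = 0
    for item in items:
        now = clock()
        if last is not None and now - last < min_interval:
            # Wait out the remainder of the interval before the next call.
            sleep(min_interval - (now - last))
        send(item)
        last = clock()
        sent += 1
    return sent
```

Because the queue is persistent and the loop is resumable, a multi-day drain like the 30,000-call backlog above can survive restarts; you just pick up where the queue left off.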

  • gregarican Member Posts: 593

    @jtellier If I'm not mistaken, Lightspeed Retail has an on-boarding team for initial data conversions related to cutovers for new accounts. When we were considering migrating our larger company over to LS Retail, our account rep offered this service: a one-time deal where we'd ship them our data and they would push it into LS Retail, at a faster pace than me doing it via the API.

    The only kicker is that typically we would do this over a weekend to minimize potential downtime, given our operating hours. Well, this LS Retail team didn't work on weekends. Which makes me (as a 25+ year IT drone) a bit jealous...lol.
