
Wait for Rate Limiter Resources to Become Available


I’m trying to migrate a small script from the limiter package to rate-limiter-flexible, and I’m running into a few issues.

My script calls an API that limits requests to 3 per second, so my script needs to stay under that. In order to do so, I’m trying to write a simple batching function that can be called and will automatically limit requests. Ideally, this function will also process its callbacks in order.

import { RateLimiterMemory, RateLimiterQueue } from "rate-limiter-flexible";

function mapWithRateLimiter(array, callback) {
  // Allow at most 3 requests per 1-second window.
  const memoryLimiter = new RateLimiterMemory({ points: 3, duration: 1 });
  const queueLimiter = new RateLimiterQueue(memoryLimiter, {});

  return Promise.all(array.map(async (item, index) => {
    // Wait for a token from the queue before invoking the callback.
    await queueLimiter.removeTokens(1);
    return callback(item, index);
  }));
}
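
For context, a call to this function might look something like the following (the IDs, endpoint, and callback here are made up for illustration and aren’t part of the original script):

const ids = [1, 2, 3, 4, 5, 6];

const results = await mapWithRateLimiter(ids, async (id) => {
  // Each callback runs only after a token has been acquired from the queue.
  const response = await fetch(`https://api.example.com/items/${id}`);
  return response.json();
});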

Once this function gets called with the fourth item, the removeTokens function rejects. While this makes sense for an API, in my use case I really want it to wait and resolve when a token becomes available. I was able to achieve this with limiter using the following function:

import { RateLimiter } from "limiter";

export async function mapWithRateLimiter(array, callback) {
  const limiter = new RateLimiter({
    tokensPerInterval: 3,
    interval: "second"
  });

  return Promise.all(array.map(async (item, index) => {
    // Unlike the rate-limiter-flexible version above, this waits until a token
    // is available instead of rejecting.
    await limiter.removeTokens(1);
    return callback(item, index);
  }));
}

Is this possible with rate-limiter-flexible?

Thanks!

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 5 (2 by maintainers)

Top GitHub Comments

1 reaction
LandonSchropp commented on Nov 9, 2021

Ah, that makes sense! So as a workaround I can set the maxQueueSize. Thanks!
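
For anyone landing on this later, a minimal sketch of that workaround (assuming maxQueueSize behaves as discussed in this thread) would be to pass the option explicitly instead of an empty object:

import { RateLimiterMemory, RateLimiterQueue } from "rate-limiter-flexible";

const memoryLimiter = new RateLimiterMemory({ points: 3, duration: 1 });

// maxQueueSize is set explicitly here; the value is arbitrary and only needs to
// be large enough to hold the whole batch.
const queueLimiter = new RateLimiterQueue(memoryLimiter, { maxQueueSize: 1000 });

With a large enough queue, removeTokens(1) should wait for a token to become available instead of rejecting once the first three have been consumed.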

0 reactions
animir commented on Nov 9, 2021

@LandonSchropp I see, thanks. You initialise new RateLimiterQueue(memoryLimiter, {}) with an empty options object, so the maxQueueSize option ends up set to undefined. I think it should fall back to the default value in this case, but it doesn’t.
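
In other words (this is just a reading of the explanation above, so treat it as an assumption rather than confirmed library behaviour), the two constructor calls end up behaving differently:

// Omitting the options argument should leave maxQueueSize at its library default.
const queueWithDefault = new RateLimiterQueue(memoryLimiter);

// Passing an empty options object sets maxQueueSize to undefined, which is why
// the fourth removeTokens call in the original snippet rejects.
const queueWithUndefinedSize = new RateLimiterQueue(memoryLimiter, {});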
