
Load balancing / throttling


The promise approach has great potential for solving the load-balancing / throttling problem, which the library is still missing. This is a proposal to add such a feature.

As an example of the use case, see a question plus my answer on StackOverflow about issuing massive numbers of queries.

In essence, we have a number of promises that need to be resolved sequentially. Typically this would be done with Promise.all(). However, when the number of promise objects is so large that a single call to Promise.all() is impossible, we have to implement a complex workaround, like the one I linked above (perhaps not a very good one at that).

What we need is something like promise.page or promise.throttle that would split one huge set of promises into smaller chunks and page through them until they are all resolved. In addition, we need flexibility in the resolution strategy:

  • specify that each chunk/page of promises is to be resolved one after another. This is very important, because in many cases an attempt to execute a chunk of promises before the previous one has finished will lead to running out of memory.
  • by default, the whole request should reject if one chunk rejects, but we need an option to either stop or carry on with the remaining chunks.
  • such a method would need to request each next chunk of data through a promise, because typically the data would never reside in memory in its entirety.

It should help greatly if you think of all this as an almost-infinite queue that has been terminated and now needs to be processed in its entirety, fast and in huge bulks.

And you don’t need to think of it as something that deals only with copious amounts of promises. Not at all. Here’s an example:

We have 100 requests to send to a particular service in one go. However, the service accepts only up to 10 at a time, so we need to send them in 10 blocks of 10. Promise support for throttling would shine in this case.
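A helper along these lines could be sketched with standard promises as follows. The name throttleChunks and its continueOnError option are hypothetical, not an existing API; the sketch runs each chunk only after the previous one has settled, and can optionally carry on past a failed chunk:

```javascript
// Hypothetical sketch: process promise-returning tasks in sequential chunks.
// Each task is a function returning a promise, so nothing runs (and nothing
// sits in memory) before its chunk is reached.
function throttleChunks(tasks, chunkSize, options) {
  var stopOnError = !(options && options.continueOnError);
  var results = [];
  function runChunk(start) {
    if (start >= tasks.length) return Promise.resolve(results);
    var chunk = tasks.slice(start, start + chunkSize).map(function (t) {
      return t(); // start only this chunk's tasks
    });
    return Promise.all(chunk)
      .then(function (chunkResults) {
        results.push.apply(results, chunkResults);
      })
      .catch(function (err) {
        if (stopOnError) throw err; // default: reject the whole run
        results.push(err);          // option: record the failure and carry on
      })
      .then(function () {
        return runChunk(start + chunkSize); // next chunk strictly afterwards
      });
  }
  return runChunk(0);
}
```

For the 100-requests example above, throttleChunks(tasks, 10) would send ten blocks of ten, one block at a time.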

P.S. I’ve been tempted to implement this in my own library, pg-promise, but it is truly a generic promise task, not just a database-related one.

Issue Analytics

  • State: closed
  • Created 8 years ago
  • Comments: 18 (10 by maintainers)

Top GitHub Comments

3 reactions
dsernst commented, Jul 31, 2015

Is there a definitive reason this was closed? This keeps being implemented by hand, even though a general-purpose function would be helpful.

In particular see the approach listed at the end of this answer: http://stackoverflow.com/a/28223454/2348750

Often, when people ask about this problem, what they actually care about is making a function return a result at most once every given number of milliseconds, or having a function act as a monitor for how often calls are made. The goal is to throttle the number of calls made to a web service that rate-limits. This is best handled at the level of the function that returns the promise. For example:

// note: Promise.delay and .return are Bluebird APIs
var queue = Promise.resolve();
function throttle(fn, ms){
    var res = queue.then(function(){ // wait for queue
        return fn(); // call the function
    });
    queue = Promise.delay(ms).return(queue); // make the queue wait
    return res; // return the result
}

This would let you do:

function myApiCall(){
    // returns a promise
}
// make a call at most every 300 ms; each call to throttle() enqueues one
// invocation and returns a promise for its result
throttle(myApiCall, 300); // calls will be sequenced and queued
throttle(myApiCall, 300); // calls will be made at most every 300 ms
var res = throttle(myApiCall, 300); // return this promise to consumers
1 reaction
spion commented, Jul 31, 2015

It’s a somewhat specialized feature, best implemented as a stand-alone module (independent of the promise implementation).

There are many more strategies for queueing things, including limiting to N requests per M seconds (instead of just controlling how often a single request is made), limiting to N concurrent requests, and so on. Each of these has a different implementation.
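As an illustration, the limit-to-N-concurrent strategy alone can be sketched with standard promises (concurrencyLimit is a hypothetical name, not part of any library):

```javascript
// Hypothetical sketch: allow at most n promise-returning tasks in flight.
function concurrencyLimit(n) {
  var active = 0;
  var waiting = [];
  function next() {
    if (active >= n || waiting.length === 0) return;
    active++;
    var job = waiting.shift();
    Promise.resolve()
      .then(job.fn) // start the task
      .then(
        function (v) { active--; next(); job.resolve(v); },
        function (e) { active--; next(); job.reject(e); }
      );
  }
  return function (fn) {
    return new Promise(function (resolve, reject) {
      waiting.push({ fn: fn, resolve: resolve, reject: reject });
      next(); // start immediately if a slot is free
    });
  };
}
```

Each strategy (rate per window, spacing between calls, concurrency cap) needs its own bookkeeping like this, which is why a stand-alone module fits better than a core promise method.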

Note: the implementation above (from StackOverflow) may not do what the linked StackOverflow question asks unless the calls to the throttled function are made in series. This one will work regardless:

// note: Promise.join and promise.delay(ms) are Bluebird APIs
function throttle(fn, ms) {
  var queue = Promise.resolve()
  return function() {
    var res = queue.then(fn)
    // Wait for either the result, or the previous request to complete + additional ms
    // (whichever one is slower). Don't keep the value.
    queue = Promise.join(res, queue.delay(ms)).return()
    return res
  }
}
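The same pattern can be expressed with standard promises by replacing Bluebird's Promise.join and .delay with Promise.all and a small setTimeout helper (a sketch of the idea above, not library code):

```javascript
// Standard-Promise adaptation of the wrapper above.
function delay(ms) {
  return new Promise(function (resolve) { setTimeout(resolve, ms); });
}
function throttle(fn, ms) {
  var queue = Promise.resolve();
  return function () {
    var res = queue.then(fn);
    // Advance the queue only when both the result and the previous
    // queue-plus-delay have settled, discarding the value.
    queue = Promise.all([res, queue.then(function () { return delay(ms); })])
      .then(function () {});
    return res;
  };
}
```

As before, the returned function sequences calls at least ms apart regardless of when callers invoke it, and hands each caller the promise for its own result.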
