
Distributed cache concurrency issue

See original GitHub issue

Hi @stefanprodan, when deploying many API services (using a distributed cache to store the counter) behind a load balancer, every instance needs to read the counter and uses a process-local lock while incrementing the count, as in the code below:

// Note: this lock is per-process only; 'counter' is declared earlier in the original method.
using (await AsyncLock.WriterLockAsync(counterId).ConfigureAwait(false))
{
    var entry = await _counterStore.GetAsync(counterId, cancellationToken);

    if (entry.HasValue)
    {
        // entry has not expired
        if (entry.Value.Timestamp + rule.PeriodTimespan.Value >= DateTime.UtcNow)
        {
            // increment request count
            var totalCount = entry.Value.Count + _config.RateIncrementer?.Invoke() ?? 1;

            // deep copy
            counter = new RateLimitCounter
            {
                Timestamp = entry.Value.Timestamp,
                Count = totalCount
            };
        }
    }

    // stores: id (string) - timestamp (datetime) - total_requests (long)
    await _counterStore.SetAsync(counterId, counter, rule.PeriodTimespan.Value, cancellationToken);
}

I think this has a concurrency issue: the counter will not work correctly when many requests come in. Each instance reads the counter, increments it locally, and then saves it back to the cache store, so the last write overwrites the earlier ones and increments are lost.
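The lost-update race described above can be shown with a small, deterministic simulation. This is an illustrative Python sketch, not the library's C# code; the class and function names here are made up for the example. Two "instances" both read the counter before either writes, so one increment disappears:

```python
class DistributedCacheStandIn:
    """In-memory stand-in for a distributed cache: plain get/set, no atomic ops."""

    def __init__(self):
        self._data = {}

    def get(self, key, default=0):
        return self._data.get(key, default)

    def set(self, key, value):
        self._data[key] = value


cache = DistributedCacheStandIn()

# Simulate two load-balanced instances interleaving. Each instance only
# holds a process-local lock, which does nothing across machines, so the
# read-modify-write sequences can interleave like this:
a = cache.get("client-1")     # instance A reads 0
b = cache.get("client-1")     # instance B also reads 0
cache.set("client-1", a + 1)  # A writes 1
cache.set("client-1", b + 1)  # B writes 1, clobbering A's increment

print(cache.get("client-1"))  # 1, even though two requests were counted
```

With real traffic the interleaving is nondeterministic, but whenever two instances read before either writes, the effect is the same: the counter undercounts.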

Why not use the Token Bucket algorithm to implement this feature?
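For reference, a minimal Token Bucket sketch in Python (illustrative only; this is the generic algorithm the poster suggests, not anything from AspNetCoreRateLimit). A bucket holds up to `capacity` tokens, refills at a fixed rate, and a request is allowed only if a token is available:

```python
import time


class TokenBucket:
    def __init__(self, capacity, refill_rate, now=time.monotonic):
        self.capacity = capacity        # maximum tokens (burst size)
        self.refill_rate = refill_rate  # tokens added per second
        self.tokens = float(capacity)
        self.now = now                  # injectable clock for testing
        self.last = now()

    def allow(self):
        t = self.now()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (t - self.last) * self.refill_rate)
        self.last = t
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False


# Deterministic demo with a fake clock: capacity 2, refill 1 token/second.
clock = [0.0]
bucket = TokenBucket(capacity=2, refill_rate=1.0, now=lambda: clock[0])
print(bucket.allow(), bucket.allow(), bucket.allow())  # True True False
clock[0] = 1.0
print(bucket.allow())  # True (one token has refilled)
```

Note that a token bucket alone does not fix the distributed problem: shared state (tokens, last refill time) still has to be updated atomically across instances.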

Issue Analytics

  • State: closed
  • Created: 4 years ago
  • Comments: 34 (4 by maintainers)

Top GitHub Comments

7 reactions
tomaustin700 commented, Jun 13, 2020

Hey @stefanprodan, what do you think of the changes proposed by @simonhaines? I want to use this package but am holding off until the concurrency issues are sorted.

5 reactions
simonhaines commented, Mar 5, 2020

The IDistributedCache service used to implement the distributed rate-limiting counter does not provide enough concurrency guarantees to resolve this race condition, and it likely never will. An atomic increment operation is needed, such as Redis’ INCR command.

We resolved this issue by refactoring the IRateLimitCounterStore and backing it with a Redis cache, see the repo here. This also reduces per-request latency by eliminating the read/update/write operations that are the core of this issue (see here).
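The key property of Redis' INCR is that the read, increment, and write happen as one atomic server-side operation, so concurrent instances cannot interleave and lose updates. The sketch below illustrates that pattern in Python with an in-memory stand-in (a lock-guarded counter) so it runs without a Redis server; `AtomicCounterStore` and its method names are invented for this example, not the linked repo's API:

```python
import threading


class AtomicCounterStore:
    """In-memory stand-in for Redis: incr() is atomic, like the INCR command."""

    def __init__(self):
        self._data = {}
        self._lock = threading.Lock()

    def incr(self, key):
        # The read-modify-write happens under one lock, so no caller can
        # interleave between the read and the write -- the property that
        # plain IDistributedCache get/set lacks.
        with self._lock:
            self._data[key] = self._data.get(key, 0) + 1
            return self._data[key]


store = AtomicCounterStore()
results = []

def instance(n_requests):
    # Each thread plays the role of one load-balanced service instance.
    for _ in range(n_requests):
        results.append(store.incr("client-1"))

threads = [threading.Thread(target=instance, args=(100,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(max(results))  # 400: all increments from all "instances" survive
```

With real Redis, the single INCR round trip also replaces the whole lock/read/update/write cycle, which is where the latency reduction mentioned above comes from.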

For each rate limit rule, time is divided into intervals that are the length of the rule’s period. Requests are resolved to an interval, and the interval’s counter is incremented. This is a slight change in semantics to the original implementation, but works in our use-case.
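The interval-resolution step can be sketched as follows. Time is divided into fixed windows of the rule's period, and the window index is baked into the counter key, so each window gets a fresh counter that can simply be incremented and given an expiry. The key layout here is illustrative, not necessarily what the linked repo uses:

```python
def interval_key(client_id, timestamp, period_seconds):
    """Map a request timestamp to the counter key for its fixed window."""
    window = int(timestamp // period_seconds)  # which interval this falls in
    return f"rl:{client_id}:{period_seconds}:{window}"


# Two requests 10 seconds apart with a 60-second period share a window...
print(interval_key("client-1", 30, 60))  # rl:client-1:60:0
print(interval_key("client-1", 40, 60))  # rl:client-1:60:0
# ...but a request in the next window gets a fresh counter.
print(interval_key("client-1", 65, 60))  # rl:client-1:60:1
```

This is the "slight change in semantics" mentioned above: instead of a sliding period anchored at the first request, limits apply per fixed interval, which is what makes a bare atomic increment sufficient.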

This approach requires a dependency on StackExchange.Redis to access the INCR command and key expiry, and the IConnectionMultiplexer service needs to be injected at startup.


Top Results From Across the Web

Dealing with concurrency issues when caching for high- ...
To prevent this, first you have to set soft and hard expiration date. Lets say the hard expiration date is 1 day, and...

Concurrency control in distributed caching
Concurrency control deals with issues involved with allowing multiple end users simultaneous access to shared entities, such as objects or data records.

Design | How high concurrency is handled in cache?
Recently in one of the interviews, I was asked how a cache(we are discussing redis) handles thousands of requests (both READ and WRITE)...

Distributed Caching — The Only Guide You'll Ever Need
This write-up is an in-depth guide on Distributed Cache. It does cover all the frequently asked questions about it such as What is...

How to handle concurrent updates for the same record in a ...
Generate some data based on the message payload; Cache the data on Redis; Send the data to another service. My issue is when...
