Least Recently Used (LRU) Eviction Strategy
Issue created from https://github.com/TurnerSoftware/CacheTower/issues/53#issuecomment-628631354
Currently the only eviction strategy is absolute expiry time. If there were a way to define a “size” for the cache, it would be worth exploring a “Least Recently Used” system to ensure that the hottest items stay in the cache.
Thoughts:
- Implement an extension (e.g. `AutoEvict`/`FixedCapacity`) that manages the more advanced eviction strategy
- The new extension tracks the number of items in the cache, cache keys, expiry dates and last access dates
  - Could use a LinkedList to track the most recently used items, or a timestamp directly
  - Probably easier with a `ConcurrentDictionary` with cache keys as the key and a custom type as the value
- Once a criterion has been hit (e.g. X number of items in the cache), use what it knows to evict locally (a rough sketch of this tracking approach follows this list)
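A minimal sketch of the `ConcurrentDictionary` approach above: cache keys as dictionary keys, last-access timestamps as values, and eviction of the stalest keys once a fixed capacity is exceeded. The `LruTracker` name and its shape are illustrative assumptions; it is not wired into CacheTower's extension interfaces.

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Linq;

// Hypothetical tracker: records the last access per cache key and reports which
// keys should be evicted once the tracked item count exceeds the capacity.
public class LruTracker
{
    private readonly int _maxItems;
    private readonly ConcurrentDictionary<string, DateTime> _lastAccessed = new();

    public LruTracker(int maxItems) => _maxItems = maxItems;

    // Call whenever a cache key is read or written.
    public void RecordAccess(string cacheKey) =>
        _lastAccessed[cacheKey] = DateTime.UtcNow;

    // Call when a key expires or is evicted elsewhere, to keep the tracker in sync.
    public void RecordRemoval(string cacheKey) =>
        _lastAccessed.TryRemove(cacheKey, out _);

    // Returns (and forgets) the least recently used keys once capacity is exceeded.
    public IReadOnlyList<string> GetKeysToEvict()
    {
        var overflow = _lastAccessed.Count - _maxItems;
        if (overflow <= 0)
        {
            return Array.Empty<string>();
        }

        // The oldest access timestamps are the least recently used keys.
        var staleKeys = _lastAccessed
            .OrderBy(pair => pair.Value)
            .Take(overflow)
            .Select(pair => pair.Key)
            .ToList();

        foreach (var key in staleKeys)
        {
            _lastAccessed.TryRemove(key, out _);
        }

        return staleKeys;
    }
}
```

Sorting only at eviction time keeps the hot path (`RecordAccess`) cheap and lock-free; a LinkedList-based variant would avoid the sort but would need locking around node moves.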
Challenges:
- Keeping the extension up-to-date with cache data
  - Can use the various extension hooks, but it is still a bit fiddly
- Only evicting locally
  - Can do what I already do for `RedisRemoteEvictionExtension` (pass the cache layers in directly), but it is pretty cumbersome (a rough sketch of a local-only eviction pass follows the notes below)

Notes: Handling a distributed LRU would be excessively complex and likely not useful. It would mean that every access to a cache item (including local accesses) has to broadcast that access back through the cache system and whatever distributed backends are in use.
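To make the local-only eviction point concrete, here is a rough sketch of an eviction pass that only touches the layers handed to the extension directly, so nothing is broadcast to remote backends. `ICacheLayerStub`, `LocalEvictionPass` and `EvictAsync` are stand-in names for illustration, not CacheTower types.

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;

// Stand-in for whatever in-memory layer abstraction gets passed to the extension.
public interface ICacheLayerStub
{
    ValueTask EvictAsync(string cacheKey);
}

public static class LocalEvictionPass
{
    // Evicts the flagged keys (e.g. from the LruTracker sketch above), but only
    // from the locally-held layers; no remote backend is notified.
    public static async ValueTask RunAsync(
        IEnumerable<string> keysToEvict,
        IReadOnlyList<ICacheLayerStub> localLayers)
    {
        foreach (var cacheKey in keysToEvict)
        {
            foreach (var layer in localLayers)
            {
                await layer.EvictAsync(cacheKey);
            }
        }
    }
}
```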
Issue Analytics
- State:
- Created 3 years ago
- Reactions: 1
- Comments: 5 (3 by maintainers)
That would be fantastic. LRU and bounded size are two requirements for my usage scenario.
(For my own reference)
An interesting LFU cache implementation: https://github.com/papers-we-love/papers-we-love/blob/master/caching/a-constant-algorithm-for-implementing-the-lfu-cache-eviction-scheme.pdf
Not sure how applicable it is here, given it's an LFU rather than an LRU, and it is a storage mechanism rather than an eviction strategy. There still might be some useful pieces of information to extract from it.
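For reference, a rough sketch of the constant-time LFU idea from that paper, as I understand it: a hash map from key to its frequency bucket plus a linked list of buckets ordered by access count, so recording an access and picking an eviction candidate are both O(1). All names here are illustrative and nothing below is tied to CacheTower.

```csharp
using System.Collections.Generic;
using System.Linq;

// Illustrative constant-time LFU bookkeeping (keys only, no cached values).
public class ConstantTimeLfu<TKey> where TKey : notnull
{
    private sealed class FrequencyBucket
    {
        public int Frequency;
        public readonly HashSet<TKey> Keys = new();
    }

    // Buckets in ascending frequency order; the first bucket holds the least
    // frequently used keys.
    private readonly LinkedList<FrequencyBucket> _buckets = new();
    private readonly Dictionary<TKey, LinkedListNode<FrequencyBucket>> _keyToBucket = new();

    // Record an access: move the key to the bucket for (frequency + 1),
    // creating that bucket if it does not exist yet. O(1) per access.
    public void RecordAccess(TKey key)
    {
        if (!_keyToBucket.TryGetValue(key, out var node))
        {
            // New keys start at frequency 1.
            var first = _buckets.First;
            if (first is null || first.Value.Frequency != 1)
            {
                first = _buckets.AddFirst(new FrequencyBucket { Frequency = 1 });
            }
            first.Value.Keys.Add(key);
            _keyToBucket[key] = first;
            return;
        }

        var next = node.Next;
        if (next is null || next.Value.Frequency != node.Value.Frequency + 1)
        {
            next = _buckets.AddAfter(node, new FrequencyBucket { Frequency = node.Value.Frequency + 1 });
        }

        node.Value.Keys.Remove(key);
        next.Value.Keys.Add(key);
        _keyToBucket[key] = next;

        // Drop the old bucket if it is now empty.
        if (node.Value.Keys.Count == 0)
        {
            _buckets.Remove(node);
        }
    }

    // Evict one of the least frequently used keys, if any are tracked.
    public bool TryEvict(out TKey? evicted)
    {
        var first = _buckets.First;
        if (first is null)
        {
            evicted = default;
            return false;
        }

        var key = first.Value.Keys.First();
        first.Value.Keys.Remove(key);
        _keyToBucket.Remove(key);
        if (first.Value.Keys.Count == 0)
        {
            _buckets.Remove(first);
        }

        evicted = key;
        return true;
    }
}
```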