Use lru_cache
I think lru_cache with a suitable default memory limit could be a good idea.
Let's assume a user is working with bare fit/trace objects and makes multiple different (or the same) plots, removing them and redoing them again.
Some functions are in really heavy use, e.g. transforming an object to an InferenceData object, or calculating summary values.
Some plots could probably also benefit from lru_cache, so "interactive" creation and destruction would be faster.
We would also need a setting to turn caching off and to change the memory limits.
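A minimal sketch of the idea, using the standard-library `functools.lru_cache`. The function name `summary_stats` and the `CACHE_MAXSIZE` setting are hypothetical placeholders, not actual ArviZ API; note that `lru_cache` limits entries by count, not by memory, so a true memory limit would need extra work:

```python
from functools import lru_cache

# Hypothetical user-configurable setting; lru_cache bounds the number of
# cached entries, not bytes of memory.
CACHE_MAXSIZE = 128

@lru_cache(maxsize=CACHE_MAXSIZE)
def summary_stats(draws):
    """Compute summary values for a hashable tuple of draws."""
    n = len(draws)
    mean = sum(draws) / n
    var = sum((x - mean) ** 2 for x in draws) / (n - 1)
    return mean, var

draws = tuple(float(i) for i in range(100))
summary_stats(draws)  # first call: computed and cached
summary_stats(draws)  # second call: served from the cache
print(summary_stats.cache_info().hits)  # 1
```

To "turn it off", the undecorated function is still reachable via `summary_stats.__wrapped__`, and `summary_stats.cache_clear()` frees the cache; a real implementation would wrap this behind the proposed setting.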
Issue Analytics
- State:
- Created 5 years ago
- Reactions: 1
- Comments: 8 (8 by maintainers)
I have started to work on this. I will soon open a PR to facilitate talking about design decisions for this feature.
I think lru_cache is not something we really need. Usually a function is called once per data set, and MCMC data changes from sample to sample.
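The point above can be illustrated with a small sketch (the `transform` function is a hypothetical stand-in for any per-data computation): when every call sees fresh data, an LRU cache never hits, so it only adds lookup overhead and memory use.

```python
from functools import lru_cache

@lru_cache(maxsize=32)
def transform(draws):
    """Stand-in for a per-data computation, e.g. a conversion step."""
    return sum(draws) / len(draws)

# Fresh draws on every call -> every lookup is a cache miss,
# so the cache saves no work.
for seed in range(5):
    transform(tuple(range(seed, seed + 10)))

info = transform.cache_info()
print(info.misses, info.hits)  # 5 0
```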