
I think lru_cache with a suitable default memory limit could be a good idea.

Suppose a user is working with bare fit/trace objects and creates multiple different (or identical) plots, removing them and redoing them again.

Some functions are in really heavy use, e.g. converting objects to InferenceData and calculating summary values.

Some plots could probably also benefit from lru_cache, so "interactive" creation and destruction would be faster.

We would also need a setting to turn it off and to change the memory limit.

https://docs.python.org/3/library/functools.html

Issue Analytics

  • State: closed
  • Created 5 years ago
  • Reactions: 1
  • Comments: 8 (8 by maintainers)

Top GitHub Comments

2 reactions
MFreidank commented, Oct 1, 2018

I have started to work on this. I will soon open a PR to facilitate talking about design decisions for this feature.

0 reactions
ahartikainen commented, Nov 19, 2022

I think lru_cache is not something we really need. Usually a function is called once per dataset, and MCMC data changes from sample to sample.
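The point above can be illustrated with plain Python (not ArviZ code): `lru_cache` only pays off when identical arguments recur, and it additionally requires hashable arguments, which raw sample arrays are not. If every MCMC draw differs, every call is a cache miss:

```python
from functools import lru_cache

@lru_cache(maxsize=32)
def transform(sample):
    # Stand-in for an expensive per-draw transformation.
    return tuple(x * 2 for x in sample)

# Each draw differs, so no call ever hits the cache.
draws = [(1, 2), (3, 4), (5, 6)]
for d in draws:
    transform(d)

print(transform.cache_info().hits)    # no argument ever repeats
print(transform.cache_info().misses)

# Mutable containers (like the NumPy arrays MCMC data lives in)
# cannot be cached at all, since lru_cache keys must be hashable:
try:
    transform([1, 2])
except TypeError:
    print("unhashable argument")
```

This is consistent with the comment: caching helps repeated identical lookups, not once-per-dataset pipelines.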
