Cached values may get changed for Memory backend

Migrated issue, originally created by Yi (qinsoon)

For example,

from dogpile.cache import make_region

# A region backed by the plain in-memory backend, which stores values as-is.
region = make_region().configure('dogpile.cache.memory')

@region.cache_on_arguments()
def foo():
    return [1, 2, 3]

def bar():
    a = foo()
    a += [4]  # += mutates the list object that foo() returned
    return a

print(bar())
print(bar())

The first call to bar() returns [1, 2, 3, 4] as expected. However, the 4 is appended to the cached list itself, i.e. the very object stored in the backend's cache dictionary, so the next call to bar() returns [1, 2, 3, 4, 4], which is unexpected.

This happens whenever the cached value is a mutable object: foo() returns a reference to the object held in the cache, so the in-place += in bar() modifies the cached value directly.

Though the user can work around this by cloning the value, I think the memory backend should return a clone of the cached value instead of the value itself; returning it directly leaves the cached value open to modification by the user. From the perspective of abstraction, the user should never observe a cached value changing.
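
A user-side workaround is to wrap the cached function so callers always receive a deep copy, leaving the object held by the memory backend untouched. A minimal sketch, using only the public cache_on_arguments() API; the cached_copy helper is hypothetical, not part of dogpile.cache:

import copy
import functools

from dogpile.cache import make_region

region = make_region().configure('dogpile.cache.memory')

def cached_copy(region, **kw):
    # Hypothetical helper: behaves like region.cache_on_arguments(), but hands
    # callers a deep copy so the value stored in the cache cannot be mutated.
    def decorate(fn):
        cached_fn = region.cache_on_arguments(**kw)(fn)

        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            return copy.deepcopy(cached_fn(*args, **kwargs))

        return wrapper

    return decorate

@cached_copy(region)
def foo():
    return [1, 2, 3]

def bar():
    a = foo()
    a += [4]  # mutates a private copy, not the cached list
    return a

print(bar())  # [1, 2, 3, 4]
print(bar())  # [1, 2, 3, 4]

The dogpile.cache.memory_pickle backend (the "pickle version" mentioned in the comments below) takes the other route: it stores a pickled copy of each value, which also keeps callers from mutating the cache, at the cost of a pickle round trip per access.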

Issue Analytics

  • State: closed
  • Created: 5 years ago
  • Reactions: 1
  • Comments: 11 (1 by maintainers)

Top GitHub Comments

1 reaction
sqlalchemy-bot commented on Nov 24, 2018

Michael Bayer (zzzeek) wrote:

However, adding a “clone” argument to the backend, where the user can just provide a cloning function for the kinds of objects they want to store, would allow for a memory backend that’s a lot faster than the pickle version. We can make it emit a warning if no clone function is given; folks who don’t want to clone can specify it as a no-op, so at least they know what they’re getting into.
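
A rough sketch of what such a backend might look like, assuming the MemoryBackend class in dogpile.cache.backends.memory and the register_backend() plugin hook; the clone argument, the backend name, and the myapp.caching module path are hypothetical, not part of dogpile.cache's actual API:

import copy
import warnings

from dogpile.cache import make_region
from dogpile.cache.api import CachedValue, NO_VALUE
from dogpile.cache.backends.memory import MemoryBackend
from dogpile.cache.region import register_backend

class CloningMemoryBackend(MemoryBackend):
    # Hypothetical backend: returns a clone of each cached payload so callers
    # cannot mutate the object held in the backend's cache dictionary.
    def __init__(self, arguments):
        self.clone = arguments.pop("clone", None)
        if self.clone is None:
            warnings.warn(
                "no 'clone' function given; cached values are returned by reference"
            )
            self.clone = lambda value: value  # an explicit no-op opts out of cloning
        super().__init__(arguments)

    def get(self, key):
        value = super().get(key)
        if value is NO_VALUE:
            return value
        # Values written through CacheRegion arrive wrapped in CachedValue.
        return CachedValue(self.clone(value.payload), value.metadata)

# Hypothetical registration; "myapp.caching" is wherever this class is defined.
register_backend("myapp.cloning_memory", "myapp.caching", "CloningMemoryBackend")

region = make_region().configure(
    "myapp.cloning_memory",
    arguments={"clone": copy.deepcopy},
)

With copy.deepcopy as the clone function this behaves like the pickle variant without the serialization overhead; a cheaper, type-specific clone function (for example list for flat lists) is where the speed advantage would come from.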

0 reactions
zzzeek commented on Nov 3, 2020

This option can be handled more effectively than the “pickle” version by using the new custom serializer argument coming in #191.
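
To illustrate the principle rather than the exact API introduced by #191: with a serializer/deserializer pair the backend holds bytes instead of live objects, so every read reconstructs a fresh copy and callers can no longer mutate what is cached. A minimal stand-alone sketch with pickle (the store dict and the cache_set/cache_get helpers are stand-ins, not dogpile.cache code):

import pickle

store = {}  # stands in for the memory backend's cache dictionary

def cache_set(key, value):
    # Serialize on write: the cache keeps bytes, not the caller's object.
    store[key] = pickle.dumps(value)

def cache_get(key):
    # Deserialize on read: each call builds an independent object.
    return pickle.loads(store[key])

cache_set("foo", [1, 2, 3])
a = cache_get("foo")
a += [4]                 # mutates only this caller's copy
print(cache_get("foo"))  # [1, 2, 3] -- the cached value is untouched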
