Is it safe to use `functools.lru_cache` in FastAPI app?
In an example in the documentation, a `functools.lru_cache` is used to cache (in memory) an object that is shared across many requests, in order to avoid reading from disk once per user request.
The question is: in a more general setting, where the cached function takes several arguments and the corresponding endpoint receives many concurrent requests, is it recommended to use `functools.lru_cache` as proposed within a FastAPI app, even though it is not thread-safe?
```python
from fastapi import FastAPI
import functools

app = FastAPI()

@functools.lru_cache
def make_slow_query(date: str) -> str:
    data = ...
    return data

@app.get("/data")
def get_data(date: str):
    return {
        "data": make_slow_query(date),
        ...
    }
```
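As a side note, the decorated function keeps the standard `functools` introspection helpers, so the shared in-memory cache can be checked from a quick test. The following is only a hedged sketch, assuming the `...` placeholders above are replaced with a real implementation:

```python
# Hedged sketch: inspect the shared lru_cache through a test client,
# assuming make_slow_query's placeholder body is filled in.
from fastapi.testclient import TestClient

client = TestClient(app)
client.get("/data", params={"date": "2021-01-01"})
client.get("/data", params={"date": "2021-01-01"})  # same date -> served from the cache

print(make_slow_query.cache_info())
# e.g. CacheInfo(hits=1, misses=1, maxsize=128, currsize=1)
```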
Edit
Is it the usage of `Depends` that makes it “valid”, or could it be implemented without `Depends` (as above)?
```python
from fastapi import FastAPI, Depends
import functools

app = FastAPI()

@functools.lru_cache
def make_slow_query(date: str) -> str:
    data = ...
    return data

@app.get("/data")
def get_data(queried_data: str = Depends(make_slow_query)):
    return {
        "data": queried_data,
        ...
    }
```
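For completeness, in the `Depends` variant FastAPI infers the `date` query parameter from the signature of `make_slow_query`, so the endpoint is called the same way as before. A minimal hedged check, again assuming the `...` placeholders are filled in:

```python
# Hedged sketch: the Depends-based endpoint still accepts ?date=... as a
# query parameter, taken from make_slow_query's signature.
from fastapi.testclient import TestClient

client = TestClient(app)
resp = client.get("/data", params={"date": "2021-01-01"})
print(resp.status_code, resp.json())
```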
Thank you in advance.
This is not right; if your function is not a coroutine, FastAPI will run that request inside a threadpool.
`lru_cache` is not safe for direct usage with asyncio; you may want to use something like `aiocache` for it.
Yes, it is possible. If your function is not a coroutine (`async def`), there is multi-threading happening; I explained quite deeply how it works here -> https://github.com/tiangolo/fastapi/issues/2776#issuecomment-776659392. If it works locally (testing with simple benchmarks), it SHOULD work under a heavy workload as well.
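To make the `aiocache` suggestion concrete, here is a minimal hedged sketch, assuming the `aiocache` package is installed and the slow query can be rewritten as a coroutine; `run_actual_query` is a hypothetical async helper standing in for the real work:

```python
# Hedged sketch of the async alternative: aiocache's @cached decorator
# memoizes the coroutine's result instead of functools.lru_cache.
from aiocache import cached
from fastapi import FastAPI

app = FastAPI()

@cached(ttl=600)  # keep results in memory for 10 minutes, keyed on the call arguments
async def make_slow_query(date: str) -> str:
    return await run_actual_query(date)  # hypothetical async query helper

@app.get("/data")
async def get_data(date: str):
    return {"data": await make_slow_query(date)}
```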