Does it make sense to use ThreadPoolExecutor in FastAPI?
Hi All,
I have a super simple app that has only one endpoint. This endpoint loads data from a database and this is parallelised using a ThreadPoolExecutor. For example:
import concurrent.futures

@app.get('/load_from_db')
def load_from_db():
    ....
    with concurrent.futures.ThreadPoolExecutor() as executor:
        for dataset in datasets:
            executor.submit(dataset.load_from_database)
    ....
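One thing worth noting about the snippet above: it submits work to the executor but never collects the results, so any exception raised in a worker thread is silently lost. A minimal sketch of the same pattern that gathers results and surfaces errors (the `load` function and dataset names are illustrative stand-ins, not part of the original app):

```python
import concurrent.futures

def load(dataset: str) -> str:
    # Stand-in for dataset.load_from_database in the snippet above.
    return f"{dataset}:ok"

datasets = ["users", "orders"]
results = {}
with concurrent.futures.ThreadPoolExecutor(max_workers=5) as executor:
    futures = {executor.submit(load, d): d for d in datasets}
    for fut in concurrent.futures.as_completed(futures):
        # result() re-raises any exception raised in the worker thread,
        # which a bare executor.submit would otherwise swallow.
        results[futures[fut]] = fut.result()
```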
Now I have changed from Flask to FastAPI. I have declared my endpoint with a plain def load_from_db
so that FastAPI executes it in a separate thread pool and it does not block the event loop.
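That dispatch step can be illustrated with the standard library alone: FastAPI hands a plain `def` handler to a worker thread so the event loop stays free. A sketch using `asyncio.to_thread` (Starlette's `run_in_threadpool` behaves similarly; the handler here is a stand-in, not the original endpoint):

```python
import asyncio
import threading
import time

def blocking_handler() -> str:
    # Simulates a blocking database call inside a `def` endpoint.
    time.sleep(0.05)
    return threading.current_thread().name

async def dispatch() -> tuple[str, str]:
    # Mimic how a `def` endpoint is run: hand the blocking function
    # to a worker thread so the event-loop thread stays responsive.
    loop_thread = threading.current_thread().name
    worker_thread = await asyncio.to_thread(blocking_handler)
    return loop_thread, worker_thread

loop_name, worker_name = asyncio.run(dispatch())
```

Comparing the two thread names confirms the handler ran off the event-loop thread.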
OUT-DATED: As a result, my service is now 10x slower than with Flask. I tried setting max_workers=5, but it did not really help. What is the reason?
EDIT: I created some more thorough test cases, and it turns out FastAPI is not slower for me. Let me change the question to:
Is it safe & does it make sense to use ThreadPoolExecutor in FastAPI?
Issue Analytics
- State:
- Created 2 years ago
- Comments:5 (1 by maintainers)
Top Results From Across the Web
Is having a concurrent.futures.ThreadPoolExecutor call ...
I need to use the concurrent.futures.ThreadPoolExecutor part of the code in a FastAPI endpoint. My concern is the impact of the number of...
Read more >

tiangolo/fastapi - Gitter
By default, it uses a ThreadPoolExecutor. By default, it has "number of processors on the machine, multiplied by 5". So, it's actually a...
Read more >

Concurrency and async / await - FastAPI
You can only use await inside of functions created with async def . ... That's why it makes a lot of sense to...
Read more >

Concurrent.futures in FastAPI : r/learnpython - Reddit
I have a FastAPI application where one of the APIs needs to perform its tasks in parallel. With the help of ThreadPoolExecutor we...
Read more >

How to use ThreadPoolExecutor in Python
First, create an instance of ThreadPoolExecutor . Next, we have to declare the number of worker threads. The default value of max_workers is...
Read more >
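For reference, the "number of processors multiplied by 5" default quoted above applied up to Python 3.7; since Python 3.8 the default is min(32, os.cpu_count() + 4). A quick way to check what your interpreter actually picks (inspecting a private attribute, purely for illustration):

```python
import concurrent.futures

# Create a pool without max_workers and see what default it chose.
with concurrent.futures.ThreadPoolExecutor() as pool:
    # _max_workers is a private attribute; read here only to illustrate
    # the interpreter's default, not for production use.
    default_workers = pool._max_workers
```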
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
@hellocoldworld is correct. I will add that in some cases you do actually need a thread pool, if you want to mix async with blocking operations. In that case you can use Starlette's run_in_threadpool: this is actually what FastAPI uses internally if you define your handler as def (and not async def).

Yes, it's slightly slower. If you use ThreadPoolExecutor in a def function in FastAPI, what happens is:
1. Since the def function might block, FastAPI calls it with run_in_threadpool, which runs it in a thread (thread 2).
2. Inside that thread you create a ThreadPoolExecutor, and this creates thread 3.
3. The work runs in the ThreadPoolExecutor.

This means for the handler to complete you need 4 thread switches (1->2->3->2->1).
If you use an async def there are 0 thread switches, and if you use async def with run_in_threadpool there are 2. Since each thread switch adds overhead, using ThreadPoolExecutor inside a def function will probably be slower.
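The 2-switch alternative described above can be sketched with the standard library: an async def handler fans the blocking loads out to one shared executor directly, instead of nesting a ThreadPoolExecutor inside a def handler. This is a stdlib analogue of the run_in_threadpool approach, not FastAPI's internal code; load_from_database and the dataset names are illustrative stand-ins:

```python
import asyncio
import concurrent.futures

# One shared pool, created once, instead of a new pool per request.
EXECUTOR = concurrent.futures.ThreadPoolExecutor(max_workers=5)

def load_from_database(dataset: str) -> str:
    # Placeholder for a blocking database read.
    return f"loaded:{dataset}"

async def load_all(datasets: list[str]) -> list[str]:
    # Each task is one 1->2->1 hop: event loop -> worker thread -> loop.
    loop = asyncio.get_running_loop()
    tasks = [loop.run_in_executor(EXECUTOR, load_from_database, d)
             for d in datasets]
    return await asyncio.gather(*tasks)

results = asyncio.run(load_all(["a", "b"]))
EXECUTOR.shutdown(wait=True)
```

asyncio.gather preserves the order of the inputs, so results line up with the dataset list even though the loads ran concurrently.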