Run async operations on separate threads
First Check
- I added a very descriptive title to this issue.
- I used the GitHub search to find a similar issue and didn’t find it.
- I searched the FastAPI documentation, with the integrated search.
- I already searched in Google “How to X in FastAPI” and didn’t find any information.
- I already read and followed all the tutorial in the docs and didn’t find an answer.
- I already checked if it is not related to FastAPI but to Pydantic.
- I already checked if it is not related to FastAPI but to Swagger UI.
- I already checked if it is not related to FastAPI but to ReDoc.
Commit to Help
- I commit to help with one of those options 👆
Example Code
# some_router.py
@router.get("/")
async def get_some_things():
    return await some_long_taking_method()
Description
Hey, I wanted to ask a question that has been bugging me these days. It's about FastAPI and its behavior with threads and async/non-async functions. As you can see in the sample code, I have a router with an async endpoint that runs an operation that takes, for example, 15 seconds. After doing some research I found this: "Thus, def (sync) routes run in a separate thread from a threadpool, or, in other words, the server processes the requests concurrently, whereas async def routes run on the main (single) thread, i.e., the server processes the requests sequentially". So, to summarize, what I currently have is an endpoint that takes some time to process, and if 2 users hit it at the same time, one has to wait until the other finishes up… and that's with only 2 users. Is there any way to wrap these endpoints to run on another thread? I tried loads of things that people suggested but none of them worked, so if anyone can give me a hand with it I would really appreciate it! Thanks in advance.
Operating System
Windows
Operating System Details
No response
FastAPI Version
0.79
Python Version
3.9.13
Additional Context
No response
Issue Analytics
- State:
- Created a year ago
- Comments:18 (7 by maintainers)
Top GitHub Comments
To start off, this issue makes me proud being part of this community! Great help from all over the place, that is really cool to see!
I had a brief conversation on LinkedIn with @gonzacastro, and I now understand what he is trying to achieve. I created an example that demonstrates that FastAPI can still be requested while other requests are still waiting for the 3rd party API to respond.
I added some verbosity for clarity purposes. If I call the `get_response` endpoint 3 times rapidly, the logs look like this:

Note how request 1 took longer than request 2, and although request 2 came in later than request 1, it returned its response earlier.
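The example itself isn't reproduced above, but the behaviour it demonstrates can be sketched with plain asyncio (no running server needed); `get_response`, the request ids, and the delays here are stand-ins, with `asyncio.sleep` playing the role of the slow 3rd party API:

```python
import asyncio
import time


async def get_response(request_id: int, delay: float) -> tuple[int, float]:
    """Simulates an async endpoint awaiting a slow third-party API."""
    print(f"request {request_id}: started")
    # Non-blocking wait: the event loop is free to serve other requests.
    await asyncio.sleep(delay)
    print(f"request {request_id}: finished after {delay}s")
    return request_id, delay


async def main() -> list[tuple[int, float]]:
    start = time.perf_counter()
    # Fire three "requests" almost at once, request 1 being the slowest.
    results = await asyncio.gather(
        get_response(1, 0.3),
        get_response(2, 0.1),
        get_response(3, 0.2),
    )
    elapsed = time.perf_counter() - start
    # Total time is roughly the longest delay (~0.3s), not the sum (0.6s):
    # the awaits overlap instead of queueing behind one another.
    print(f"all done in {elapsed:.2f}s")
    return results


if __name__ == "__main__":
    asyncio.run(main())
```

Request 2 logs "finished" before request 1 even though it started later, which is exactly the ordering described in the logs above.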
You just have to change the `asyncio.sleep()` into an asynchronous API call (use `httpx`) and you're good to go. Hope this answers your question!

@gonzacastro My apologies, I hadn't had my morning coffee yet. I meant that when you have synchronous blocking code in an async call stack, it will result in blocking the event loop (not the thread, like I said earlier). That is, in my opinion, a design flaw of the software. You can either:
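One common way to keep synchronous blocking code from stalling the event loop is to push it onto a worker thread. A minimal sketch using `asyncio.to_thread` (Python 3.9+; FastAPI/Starlette also offer `run_in_threadpool` for the same purpose) — `long_taking_method` and the 0.2 s delay are stand-ins for the 15-second work:

```python
import asyncio
import time


def long_taking_method() -> str:
    """Stand-in for the blocking 15-second work (shortened here)."""
    time.sleep(0.2)  # blocking call: would freeze the event loop if run directly
    return "done"


async def get_some_things() -> str:
    # Run the blocking function on a worker thread, so the event loop
    # stays free to serve other requests in the meantime.
    return await asyncio.to_thread(long_taking_method)


async def main() -> tuple[list[str], float]:
    start = time.perf_counter()
    # Two concurrent "requests": each blocks its own worker thread,
    # but neither blocks the event loop or the other.
    results = await asyncio.gather(get_some_things(), get_some_things())
    return results, time.perf_counter() - start


if __name__ == "__main__":
    print(asyncio.run(main()))
```

With the blocking call offloaded, two simultaneous requests complete in roughly 0.2 s total instead of 0.4 s, because neither one holds up the loop.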
If your code (and I mean the 15-second-runtime code) is not IO bound but CPU bound (e.g. you are performing some crazy calculation), then again I would recommend reconsidering your design and making the entire request-response loop more asynchronous. With that, I mean going from `request -> calculation -> return response` to `request -> put work in queue and return response`, so your client can check later whether the result is available. However, if the 15-second work is IO bound, you should either make that work async, or use sync all the way. Mixing them both in an IO-heavy situation (at least in FastAPI) is a bad idea and should be avoided.
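The "put work in a queue, let the client poll" pattern can be sketched without any external dependencies; `submit_job`, `get_result`, and the in-memory `jobs` dict are illustrative names, and a real deployment would hand the work to a task queue (e.g. Celery) or a process pool rather than an in-process background task:

```python
from __future__ import annotations

import asyncio
import uuid

# In-memory job store: job id -> result, or None while the job is running.
jobs: dict[str, str | None] = {}


def heavy_calculation(n: int) -> int:
    """Stand-in for the CPU-bound 15-second work."""
    return sum(i * i for i in range(n))


async def submit_job(n: int) -> str:
    """The 'request -> put work in queue and return response' step."""
    job_id = uuid.uuid4().hex
    jobs[job_id] = None
    # An in-process background task keeps the sketch self-contained;
    # CPU-bound work really belongs in a separate worker process.
    asyncio.create_task(run_job(job_id, n))
    return job_id  # client polls later with this id


async def run_job(job_id: str, n: int) -> None:
    result = await asyncio.to_thread(heavy_calculation, n)
    jobs[job_id] = str(result)


async def get_result(job_id: str) -> str | None:
    """Polling step: None means 'still working', otherwise the result."""
    return jobs.get(job_id)


async def main() -> str:
    job_id = await submit_job(10_000)
    # The client would retry on its own schedule; we poll in a tight loop here.
    while (result := await get_result(job_id)) is None:
        await asyncio.sleep(0.01)
    return result


if __name__ == "__main__":
    print(asyncio.run(main()))
```

The submit call returns immediately with an id, so the request-response cycle stays fast regardless of how long the calculation takes.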