
[QUESTION] cannot increase throughput via async def

See original GitHub issue

Description

How can I increase the throughput of FastAPI?

I have started a FastAPI server, and I have an API endpoint that calls Google’s API and returns the token from Google.

I found that if I use “async def”, the throughput is slower than with a normal “def”. The log output looks like this, which seems more sequential to me:

DEBUG:    Starting new HTTPS connection (1): oauth2.googleapis.com:443
DEBUG:    https://oauth2.googleapis.com:443 "POST /token HTTP/1.1" 200 None
DEBUG:    Starting new HTTPS connection (1): oauth2.googleapis.com:443
DEBUG:    https://oauth2.googleapis.com:443 "POST /token HTTP/1.1" 200 None
DEBUG:    Starting new HTTPS connection (1): oauth2.googleapis.com:443
DEBUG:    https://oauth2.googleapis.com:443 "POST /token HTTP/1.1" 200 None
DEBUG:    Starting new HTTPS connection (1): oauth2.googleapis.com:443
DEBUG:    https://oauth2.googleapis.com:443 "POST /token HTTP/1.1" 200 None

If I use a normal “def” function, the throughput is better.

DEBUG:    Starting new HTTPS connection (1): oauth2.googleapis.com:443
DEBUG:    Starting new HTTPS connection (1): oauth2.googleapis.com:443
DEBUG:    Starting new HTTPS connection (1): oauth2.googleapis.com:443
DEBUG:    Starting new HTTPS connection (1): oauth2.googleapis.com:443
DEBUG:    https://oauth2.googleapis.com:443 "POST /token HTTP/1.1" 200 None
DEBUG:    Starting new HTTPS connection (1): oauth2.googleapis.com:443
DEBUG:    Starting new HTTPS connection (1): oauth2.googleapis.com:443
DEBUG:    https://oauth2.googleapis.com:443 "POST /token HTTP/1.1" 200 None
DEBUG:    https://oauth2.googleapis.com:443 "POST /token HTTP/1.1" 200 None
DEBUG:    https://oauth2.googleapis.com:443 "POST /token HTTP/1.1" 200 None

I want to increase the throughput but don’t know how. I tried configuring the number of workers, but the throughput seems to remain the same.
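For reference, the worker count is normally set when starting the ASGI server rather than in the application code; here is a minimal sketch, assuming uvicorn (the issue doesn’t say which server is in use):

# Minimal sketch, assuming uvicorn is the server; each worker is a separate
# process with its own event loop.
import uvicorn

if __name__ == "__main__":
    uvicorn.run("main:app", host="0.0.0.0", port=8000, workers=4)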

Thank you very much.

Issue Analytics

  • State: closed
  • Created 4 years ago
  • Comments: 7 (4 by maintainers)

Top GitHub Comments

6 reactions
sm-Fifteen commented, Nov 26, 2019

The FastAPI documentation mentions that an async def route or dependency will be called on the server’s event loop directly, with the implicit requirement that it must not block and should spend very little time running if possible. A def route/dependency, on the other hand, will be run in a separate thread, which is intended to let you call sync libraries like requests or SQLAlchemy without having to do any extra work.

This means that whether a route is marked as async def or def has a number of subtle and sometimes counterintuitive effects:

from fastapi import Depends, FastAPI

app = FastAPI()

# Both are plain def, so FastAPI runs each call in a worker thread.
def common_parameters(q: str = None, skip: int = 0, limit: int = 100):
    return {"q": q, "skip": skip, "limit": limit}

@app.get("/items/")
def read_items(commons: dict = Depends(common_parameters)):
    return commons

Neither the route nor the dependency contains await, so shouldn’t they be def? No, because FastAPI uses def to determine which functions should be spawned inside a thread to avoid blocking the server’s main thread. It will still work, but it will spawn lots and lots of threads to perform trivial operations, and that could hurt performance.

Likewise (and this is the issue you’re probably running into):

@app.get("/")
async def root():
    requests.get('http://slowsite.example/')
    return {'result': 'done'}

Even though requests.get is a network operation, requests only does blocking I/O, so it will pause until it has received a response. That’s a problem because async def tells FastAPI that this function doesn’t have to run in its own thread and can instead be run on the server’s event loop directly. If the request takes 10 seconds to respond, that’s 10 seconds during which your server won’t be responsive because its main thread/event loop is still busy waiting for that operation (which was supposed to be as short as possible and non-blocking) to complete.

Assuming I guessed correctly, you should either keep the route that calls Google’s API as a def function (so your request calls happen in separate threads) or swap your HTTP library for an async alternative, like httpx.
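A rough sketch of the second option, for illustration only: the route path, payload fields, and response handling below are placeholders rather than details from the original issue. It just shows httpx.AsyncClient awaiting the outbound call so the event loop stays free while Google responds.

# Illustrative sketch only: assumes the route simply forwards a token request
# to Google's OAuth2 endpoint; the payload below is a placeholder.
import httpx
from fastapi import FastAPI

app = FastAPI()

@app.get("/token")
async def get_token():
    async with httpx.AsyncClient() as client:
        # Awaiting here yields control to the event loop, so other requests
        # can be handled while waiting for Google's response.
        response = await client.post(
            "https://oauth2.googleapis.com/token",
            data={"placeholder": "token-request-fields"},
        )
    return response.json()

Opening a new AsyncClient per request keeps the sketch short; in practice you would usually reuse one client across requests to avoid reconnecting on every call.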

3 reactions
steinitzu commented, Nov 25, 2019

How are you calling the Google API? Are you using a blocking HTTP library (e.g. “requests”)?

Read more comments on GitHub >

