
[FEATURE] Provide timing data for things FastAPI does outside of user code

See original GitHub issue

Is your feature request related to a problem? Please describe. I currently find myself using FastAPI as an API frontend for a database, running queries of varying complexity that usually return fairly large quantities of data. The problem is that, when trying to figure out why an API request is slow to respond, a lot happens outside of user code, and fencing the route body with the usual time_start = time.time(), time_delta = time.time() - time_start won’t capture any of it.
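The manual fencing approach described above can be sketched as follows (a minimal stand-in, with the route body simulated as plain Python and hypothetical data; no FastAPI involved):

```python
import time

def handle_request():
    # Simulated "user code": the only part a route body can time itself.
    time_start = time.time()
    rows = [{"id": i, "value": i * 2} for i in range(1000)]  # pretend DB fetch
    time_delta = time.time() - time_start
    return rows, time_delta

rows, delta = handle_request()
print(f"user code took {delta:.6f}s for {len(rows)} rows")
# Validation (pydantic) and serialization (json) happen *after* the route
# returns, inside the framework, where this timer can no longer see them.
```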

In your average FastAPI application, there are four main causes of slowdowns:

  • Endpoint code running (app)
  • Waiting for the database (io)
  • Data validation (pydantic)
  • Data serialization (json/ujson)

Of those four, only the first two are actually part of the user code; the other two are handled behind the scenes by FastAPI. If I’m testing a route that returns 50 MB of JSON (for whatever reason), it’s actually rather tricky to determine whether switching from json to ujson or orjson will yield any performance benefit, or whether the slowdown comes from Pydantic choking on the sheer amount of data being fed to it, or from the database being slow.
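To illustrate how serialization cost can be measured separately from the handler's own work, here is a small stdlib-only sketch (the payload shape and sizes are invented for illustration; in a real FastAPI app the json.dumps step happens inside the framework, invisible to the route):

```python
import json
import time

# "User code": building the response payload (pretend DB result).
t0 = time.perf_counter()
payload = [
    {"id": i, "name": f"row-{i}", "values": list(range(20))}
    for i in range(5000)
]
build_time = time.perf_counter() - t0

# Serialization cost, normally hidden inside the framework's response path.
t0 = time.perf_counter()
body = json.dumps(payload)
serialize_time = time.perf_counter() - t0

print(f"build: {build_time:.4f}s, serialize: {serialize_time:.4f}s, "
      f"{len(body)} bytes")
```

Timing the two phases independently like this is exactly what the framework could do internally and expose, since only it sees the serialization step.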

Describe the solution you’d like Given that FastAPI performs operations outside of what the user can observe or measure, it should probably expose timing data for those things in some way (especially validation and serialization, since I doubt routing factors much into the processing time). I don’t know how the implementation should go, though: any middleware only receives a Starlette Response object, and FastAPI probably shouldn’t do this sort of thing automatically any more than it does with CORS. The Response object could perhaps be extended to carry some sort of timing dict, but that’s sure to cause all sorts of compatibility issues, so I don’t know whether it can be done without some upstream work in Starlette.

Describe alternatives you’ve considered I initially tried to write a middleware for this, but Python’s own cProfile is limited to text file export, so processing the data in Python becomes an extra hurdle, if the format is even stable enough for that. Without using a profiler, ASGI middlewares simply don’t have access to timing information for the app beyond the total time spent awaiting the callable.
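The limitation described above can be demonstrated with a minimal, framework-free ASGI wrapper (all names here are illustrative; the inner app is a dummy that just sleeps to simulate DB and serialization work): the middleware can only observe the total time spent awaiting the inner app, with no per-phase breakdown.

```python
import asyncio
import time

class TotalTimeMiddleware:
    """Minimal ASGI wrapper. It can only measure the *total* time spent
    awaiting the inner app; validation and serialization inside the app
    remain opaque to it."""

    def __init__(self, app):
        self.app = app
        self.last_total = None  # seconds spent in the most recent request

    async def __call__(self, scope, receive, send):
        t0 = time.perf_counter()
        await self.app(scope, receive, send)
        self.last_total = time.perf_counter() - t0

async def dummy_app(scope, receive, send):
    # Stand-in for a FastAPI app: sleep to simulate DB + serialization work.
    await asyncio.sleep(0.01)
    await send({"type": "http.response.start", "status": 200, "headers": []})
    await send({"type": "http.response.body", "body": b"{}"})

async def main():
    sent = []

    async def send(message):
        sent.append(message)

    async def receive():
        return {"type": "http.request", "body": b""}

    mw = TotalTimeMiddleware(dummy_app)
    await mw({"type": "http", "method": "GET", "path": "/"}, receive, send)
    return mw.last_total, sent

total, messages = asyncio.run(main())
print(f"total awaited: {total:.4f}s, {len(messages)} messages sent")
```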

Additional context The recently introduced Server-Timing HTTP header seems like a perfect way to push coarse profiling information to the client and make debugging these cases somewhat easier, especially given that it’s supported in the Chrome devtools and that Firefox support is just around the corner. That’s probably beyond the scope of this issue, though.
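For reference, the Server-Timing header encodes named metrics with millisecond durations as `name;dur=value` pairs. A sketch of building such a header value from hypothetical per-phase timings (the phase names and numbers are invented for illustration):

```python
# Hypothetical per-phase timings, in milliseconds as the spec requires.
timings = {"app": 12.3, "db": 45.0, "validate": 3.2, "serialize": 7.9}

# Server-Timing entries are comma-separated "name;dur=value" metrics.
header_value = ", ".join(f"{name};dur={dur:.1f}" for name, dur in timings.items())
print("Server-Timing:", header_value)
# → Server-Timing: app;dur=12.3, db;dur=45.0, validate;dur=3.2, serialize;dur=7.9
```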

Issue Analytics

  • State: open
  • Created: 4 years ago
  • Reactions: 18
  • Comments: 54 (35 by maintainers)

Top GitHub Comments

13 reactions · sumerc commented, Nov 26, 2019

Latest update:

Implementation is mostly finished; I am currently writing more and more tests to validate corner cases, but initial tests seem pretty promising and somewhat stable.

A new API was added to yappi: set_tag_callback. You can basically tag any function or piece of code and filter by its tag (multithreaded or not). I am hoping that you can use this new API to build your own coroutine/context/request…etc. aware profiling.

Coroutine-aware wall-time profiling support is also implemented. You can see the overall wall time spent per coroutine with a correct call count (normally YIELDs are counted as separate calls, which is the default behaviour of cProfile, too).

I would be more than happy to hear your feedback on this latest work. The branch name is coroutine-profiling. Here is the link: https://github.com/sumerc/yappi/tree/coroutine-profiling

8 reactions · sm-Fifteen commented, Dec 12, 2019

This is fantastic! Way to set yappi apart from any other Python profiler, even the official ones. I also need to thank you for all the time and effort you’ve put into supporting this idea I was just throwing out there as a possible feature to integrate into the framework. I didn’t even think it was possible to profile coroutines in Python, and as it turns out, I was right…

…but now, only a month later, thanks to you, it’s now possible. So thank you, @sumerc! And here’s to yappi becoming the go-to profiler for all async programs!
