Server occupies a lot of memory when the method is not async
I just read the source code of starlette, and I think I found the reason why it is occupying so much memory.
The problem is in `starlette/routing.py`, in the `request_response()` function (together with `run_in_threadpool`, which it calls from `starlette/concurrency.py`):
```python
def request_response(func: typing.Callable) -> ASGIApp:
    """
    Takes a function or coroutine `func(request) -> response`,
    and returns an ASGI application.
    """
    is_coroutine = asyncio.iscoroutinefunction(func)

    async def app(scope: Scope, receive: Receive, send: Send) -> None:
        request = Request(scope, receive=receive, send=send)
        if is_coroutine:
            response = await func(request)
        else:
            response = await run_in_threadpool(func, request)
        await response(scope, receive, send)

    return app


async def run_in_threadpool(
    func: typing.Callable[..., T], *args: typing.Any, **kwargs: typing.Any
) -> T:
    loop = asyncio.get_event_loop()
    if contextvars is not None:  # pragma: no cover
        # Ensure we run in the same context
        child = functools.partial(func, *args, **kwargs)
        context = contextvars.copy_context()
        func = context.run
        args = (child,)
    elif kwargs:  # pragma: no cover
        # loop.run_in_executor doesn't accept 'kwargs', so bind them in here
        func = functools.partial(func, **kwargs)
    return await loop.run_in_executor(None, func, *args)
```
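To make the scenario concrete, here is a minimal sketch of the kind of non-async endpoint that takes this code path (the route, the `predict` name and the placeholder for the model call are hypothetical):

```python
from starlette.applications import Starlette
from starlette.responses import JSONResponse
from starlette.routing import Route

def predict(request):
    # Plain `def`, not `async def`, so request_response() dispatches it
    # via run_in_threadpool, i.e. loop.run_in_executor(None, ...).
    result = {"ok": True}  # placeholder for a heavy model inference call
    return JSONResponse(result)

app = Starlette(routes=[Route("/predict", predict)])
```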
My REST interface is not async, so it runs through `loop.run_in_executor`, but starlette does not specify an executor there, so the default thread pool size should be `os.cpu_count() * 5`. My test machine has 40 CPUs, so I should end up with 200 threads in the pool, and after each request the objects referenced by a worker thread are not released until that thread is reused by a later request, which occupies a lot of memory, especially when I wrap a large deep learning model in the server.
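For reference, the size comes from `concurrent.futures.ThreadPoolExecutor`'s default `max_workers`, which asyncio uses when `run_in_executor` is called with `executor=None`; a small sketch of the sizing rule (the Python 3.8 change is included only for context):

```python
import os
import sys

# Default executor sizing used when loop.run_in_executor(None, ...) has to
# create its own ThreadPoolExecutor:
#   Python <= 3.7: os.cpu_count() * 5
#   Python >= 3.8: min(32, os.cpu_count() + 4)
cpus = os.cpu_count() or 1
if sys.version_info >= (3, 8):
    default_workers = min(32, cpus + 4)
else:
    default_workers = cpus * 5
print(f"{cpus} CPUs -> {default_workers} worker threads")
```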
My question is: could we make the thread pool size configurable?
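In the meantime, one possible workaround (a sketch, not an official starlette API; the `max_workers=10` value and the startup hook are only illustrative) is to install a smaller default executor on the loop at startup, since `run_in_executor(None, ...)` falls back to whatever the loop's default executor is:

```python
import asyncio
from concurrent.futures import ThreadPoolExecutor

from starlette.applications import Starlette

app = Starlette()

@app.on_event("startup")
async def limit_threadpool() -> None:
    # Replace the lazily-created default executor with a smaller one so that
    # sync endpoints share a bounded number of threads.
    loop = asyncio.get_running_loop()
    loop.set_default_executor(ThreadPoolExecutor(max_workers=10))
```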
Issue Analytics
- State:
- Created: 3 years ago
- Reactions: 5
- Comments: 10 (7 by maintainers)
Top GitHub Comments
It’s worth mentioning that you can modify the default capacity limiter on anyio. 👍
Yup you’re right, it can be modified, I thought you meant replace 😅
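For context on that suggestion: newer starlette versions dispatch sync endpoints through anyio rather than `loop.run_in_executor`, and the default capacity limiter referred to above can be adjusted roughly like this (a sketch; the limit of 40 and the startup-hook placement are only illustrative):

```python
from anyio import to_thread
from starlette.applications import Starlette

app = Starlette()

@app.on_event("startup")
async def adjust_thread_capacity() -> None:
    # anyio's default thread limiter caps how many sync endpoints run at once.
    limiter = to_thread.current_default_thread_limiter()
    limiter.total_tokens = 40
```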