Possible deadlock when a client disconnects during http.response.body
Hi, this happens with both the h11 and httptools protocols, and with both the asyncio and uvloop loops. I tried with uvicorn 0.6.1 on Python 3.7.2, but I can also reproduce it on Python 3.6.8.
This issue doesn't occur every time. It seems to happen much less often with `--log-level debug`.
app:
```python
import asyncio
import threading

_active_requests = 0
_lock = threading.Lock()


async def app(scope, receive, send):
    global _active_requests
    assert scope["type"] == "http"

    with _lock:
        _active_requests += 1
        print(f"Request started. Currently {_active_requests} active requests.")

    chunk_size = 1024 * 512
    chunks = [(str(i) * chunk_size).encode("utf-8") for i in range(10)]
    total_size = sum(len(c) for c in chunks)

    await send({
        "type": "http.response.start",
        "status": 200,
        "headers": [
            [b"content-type", b"text/plain"],
            [b"Content-Length", f"{total_size}".encode("utf-8")],
        ],
    })
    await asyncio.sleep(1)
    for i, chunk in enumerate(chunks):
        await send({
            "type": "http.response.body",
            "body": chunk,
            "more_body": i < len(chunks) - 1,
        })

    with _lock:
        _active_requests -= 1
        print(f"Request ended. Currently {_active_requests} active requests.")
```
Client:
```
$> curl -v http://127.0.0.1:8000 -s > /dev/null
*   Trying 127.0.0.1...
* TCP_NODELAY set
* Connected to 127.0.0.1 (127.0.0.1) port 8000 (#0)
> GET / HTTP/1.1
> Host: 127.0.0.1:8000
> User-Agent: curl/7.64.0
> Accept: */*
>
< HTTP/1.1 200 OK
< date: Wed, 27 Mar 2019 23:43:29 GMT
< server: uvicorn
< content-type: text/plain
< content-length: 5242880
<
{ [65536 bytes data]
* Connection #0 to host 127.0.0.1 left intact
$> curl http://127.0.0.1:8000 -s > /dev/null
$> curl http://127.0.0.1:8000 -s > /dev/null
```
Uvicorn output, all is good:
```
(venv-3.7.2) $ uvicorn async_app:app --http h11 --loop asyncio
INFO: Started server process [21186]
INFO: Waiting for application startup.
INFO: ASGI 'lifespan' protocol appears unsupported.
INFO: Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
Request started. Currently 1 active requests.
INFO: ('127.0.0.1', 37864) - "GET / HTTP/1.1" 200
Request ended. Currently 0 active requests.
Request started. Currently 1 active requests.
INFO: ('127.0.0.1', 37866) - "GET / HTTP/1.1" 200
Request ended. Currently 0 active requests.
Request started. Currently 1 active requests.
INFO: ('127.0.0.1', 37870) - "GET / HTTP/1.1" 200
Request ended. Currently 0 active requests.
```
Same thing but this time closing the client before the end:
```
$> curl -s http://127.0.0.1:8000 | head -c 1
$> curl -s http://127.0.0.1:8000 | head -c 1
$> curl -s http://127.0.0.1:8000 | head -c 1
```

```
(venv-3.7.2) $> uvicorn async_app:app --http h11 --loop asyncio
INFO: Started server process [21511]
INFO: Waiting for application startup.
INFO: ASGI 'lifespan' protocol appears unsupported.
INFO: Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
Request started. Currently 1 active requests.
INFO: ('127.0.0.1', 37938) - "GET / HTTP/1.1" 200
Request started. Currently 2 active requests.
INFO: ('127.0.0.1', 37940) - "GET / HTTP/1.1" 200
WARNING: socket.send() raised exception.
WARNING: socket.send() raised exception.
WARNING: socket.send() raised exception.
WARNING: socket.send() raised exception.
Request ended. Currently 1 active requests.
Request started. Currently 2 active requests.
INFO: ('127.0.0.1', 37942) - "GET / HTTP/1.1" 200
^CINFO: Shutting down
INFO: Waiting for background tasks to complete. (CTRL+C to force quit)
^CINFO: Finished server process [21511]
```
With more debug:
```
$ uvicorn async_app:app --http h11 --loop asyncio --log-level debug
DEBUG: None - ASGI [1] Initialized {'type': 'lifespan'}
INFO: Started server process [21837]
INFO: Waiting for application startup.
DEBUG: None - ASGI [2] Initialized {'type': 'lifespan'}
DEBUG: None - ASGI [2] Started task
DEBUG: None - ASGI [2] Raised exception
INFO: ASGI 'lifespan' protocol appears unsupported.
INFO: Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
DEBUG: ('127.0.0.1', 37958) - Connected
DEBUG: ('127.0.0.1', 37958) - ASGI [3] Initialized {'type': 'http', 'http_version': '1.1', 'server': ('127.0.0.1', 8000), 'client': ('127.0.0.1', 37958), 'scheme': 'http', 'method': 'GET', 'root_path': '', 'path': '/', 'query_string': b'', 'headers': '<...>'}
DEBUG: ('127.0.0.1', 37958) - ASGI [3] Started task
Request started. Currently 1 active requests.
DEBUG: ('127.0.0.1', 37958) - ASGI [3] Received {'type': 'http.response.start', 'status': 200, 'headers': '<...>'}
INFO: ('127.0.0.1', 37958) - "GET / HTTP/1.1" 200
DEBUG: ('127.0.0.1', 37958) - ASGI [3] Received {'type': 'http.response.body', 'body': '<524288 bytes>', 'more_body': True}
DEBUG: ('127.0.0.1', 37958) - ASGI [3] Received {'type': 'http.response.body', 'body': '<524288 bytes>', 'more_body': True}
DEBUG: ('127.0.0.1', 37958) - ASGI [3] Received {'type': 'http.response.body', 'body': '<524288 bytes>', 'more_body': True}
DEBUG: ('127.0.0.1', 37958) - ASGI [3] Received {'type': 'http.response.body', 'body': '<524288 bytes>', 'more_body': True}
DEBUG: ('127.0.0.1', 37958) - ASGI [3] Received {'type': 'http.response.body', 'body': '<524288 bytes>', 'more_body': True}
DEBUG: ('127.0.0.1', 37958) - ASGI [3] Received {'type': 'http.response.body', 'body': '<524288 bytes>', 'more_body': True}
DEBUG: ('127.0.0.1', 37958) - ASGI [3] Received {'type': 'http.response.body', 'body': '<524288 bytes>', 'more_body': True}
DEBUG: ('127.0.0.1', 37958) - Disconnected
DEBUG: ('127.0.0.1', 37964) - Connected
DEBUG: ('127.0.0.1', 37964) - ASGI [4] Initialized {'type': 'http', 'http_version': '1.1', 'server': ('127.0.0.1', 8000), 'client': ('127.0.0.1', 37964), 'scheme': 'http', 'method': 'GET', 'root_path': '', 'path': '/', 'query_string': b'', 'headers': '<...>'}
DEBUG: ('127.0.0.1', 37964) - ASGI [4] Started task
Request started. Currently 2 active requests.
DEBUG: ('127.0.0.1', 37964) - ASGI [4] Received {'type': 'http.response.start', 'status': 200, 'headers': '<...>'}
INFO: ('127.0.0.1', 37964) - "GET / HTTP/1.1" 200
DEBUG: ('127.0.0.1', 37964) - ASGI [4] Received {'type': 'http.response.body', 'body': '<524288 bytes>', 'more_body': True}
DEBUG: ('127.0.0.1', 37964) - ASGI [4] Received {'type': 'http.response.body', 'body': '<524288 bytes>', 'more_body': True}
DEBUG: ('127.0.0.1', 37964) - ASGI [4] Received {'type': 'http.response.body', 'body': '<524288 bytes>', 'more_body': True}
DEBUG: ('127.0.0.1', 37964) - Disconnected
^CINFO: Shutting down
INFO: Waiting for background tasks to complete. (CTRL+C to force quit)
^CINFO: Finished server process [21837]
```
strace:
```
strace -tt -e trace=network,read,write,epoll_wait,open,close -o strace-output -s 300 -f uvicorn ...
```

```
21912 23:54:22.101575 epoll_wait(3, [], 3, 0) = 0
21912 23:54:22.102320 write(2, "DEBUG: ('127.0.0.1', 37992) - ASGI [15] Received {'type': 'http.response.body', 'body': '<524288 bytes>', 'more_body': True}\n", 125) = 125
21912 23:54:22.102798 sendto(7, "000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000"..., 524288, 0, NULL, 0) = 524288
21912 23:54:22.104034 write(2, "DEBUG: ('127.0.0.1', 37992) - ASGI [15] Received {'type': 'http.response.body', 'body': '<524288 bytes>', 'more_body': True}\n", 125) = 125
21912 23:54:22.104506 sendto(7, "111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111111"..., 524288, 0, NULL, 0) = 491520
21912 23:54:22.105874 write(2, "DEBUG: ('127.0.0.1', 37992) - ASGI [15] Received {'type': 'http.response.body', 'body': '<524288 bytes>', 'more_body': True}\n", 125) = 125
21912 23:54:22.108103 write(2, "DEBUG: ('127.0.0.1', 37992) - ASGI [15] Received {'type': 'http.response.body', 'body': '<524288 bytes>', 'more_body': True}\n", 125) = 125
21912 23:54:22.108482 epoll_wait(3, [{EPOLLIN|EPOLLOUT|EPOLLERR|EPOLLHUP, {u32=7, u64=94802813124615}}], 3, 4) = 1
21912 23:54:22.108668 recvfrom(7, 0x56395ce668f0, 262144, 0, NULL, NULL) = -1 ECONNRESET (Connection reset by peer)
21912 23:54:22.109192 epoll_wait(3, [], 2, 0) = 0
21912 23:54:22.109539 write(2, "DEBUG: ('127.0.0.1', 37992) - Disconnected\n", 43) = 43
21912 23:54:22.109819 close(7) = 0
21912 23:54:22.109952 epoll_wait(3, [], 2, 2) = 0
21912 23:54:22.112151 epoll_wait(3, [], 2, 0) = 0
21912 23:54:22.112325 epoll_wait(3, [], 2, 100) = 0
...
```
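The interesting line above is the second `sendto`, which only accepts 491520 of 524288 bytes: the kernel send buffer is full, so asyncio starts buffering in the transport and invokes the protocol's `pause_writing()` callback. A minimal, hypothetical reproduction of that mechanism (not uvicorn's actual code) looks like this:

```python
import asyncio


class Sender(asyncio.Protocol):
    """Writes far more data than the peer will read, to force pause_writing()."""

    def __init__(self):
        self.paused = asyncio.get_running_loop().create_future()

    def connection_made(self, transport):
        # A tiny high-water mark makes the transport buffer overflow quickly.
        transport.set_write_buffer_limits(high=1024)
        for _ in range(10_000):  # ~10 MB, far more than kernel socket buffers
            transport.write(b"x" * 1024)

    def pause_writing(self):
        # Called by asyncio once the write buffer exceeds the high-water mark.
        if not self.paused.done():
            self.paused.set_result(True)


async def main():
    # A server that accepts connections but never reads from them.
    server = await asyncio.start_server(lambda r, w: None, "127.0.0.1", 0)
    port = server.sockets[0].getsockname()[1]

    loop = asyncio.get_running_loop()
    transport, proto = await loop.create_connection(Sender, "127.0.0.1", port)
    await asyncio.wait_for(proto.paused, timeout=5)
    print("pause_writing was called")

    transport.abort()  # discard the unsent buffer
    server.close()
    await server.wait_closed()


asyncio.run(main())
```

This is exactly the state uvicorn is in right before the client disconnects in the strace above: writes are paused, and the response coroutine is waiting for permission to continue.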
From what I've seen, it seems to happen when:
- Something calls `pause_writing`.
- The ASGI `send` function is called and blocks on `await self.flow.drain()`.
- The `connection_lost` function is called.
- Nothing releases the flow-control lock, so the ASGI `send` function never returns.
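The sequence above can be sketched with a plain `asyncio.Event` standing in for the flow-control primitive (the names here are illustrative, not uvicorn's actual implementation): `pause_writing` clears the event, `drain` awaits it, and nothing ever sets it again once the connection is gone.

```python
import asyncio


async def main():
    writable = asyncio.Event()
    writable.set()

    writable.clear()  # pause_writing(): kernel send buffer is full
    # The client disconnects and connection_lost() runs, but the event stays
    # cleared, so the ASGI send() path would wait here forever:
    try:
        await asyncio.wait_for(writable.wait(), timeout=0.2)
    except asyncio.TimeoutError:
        print("send() is stuck in drain()")


asyncio.run(main())
```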
I noticed this issue while investigating a memory leak with django/asgi/gunicorn. Clients request files (10 MB - 200 MB each) and some of them close the connection early. The whole Django response stays in memory forever, since the ASGI send function never returns. This gradually consumes all memory until uvicorn is restarted.
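One plausible fix direction (sketched here with hypothetical names, not uvicorn's actual code) is to treat a lost connection as "writable" again, so that any coroutine parked in `drain()` is released, `send()` returns, and the response coroutine can finish and be garbage-collected:

```python
import asyncio


class FlowControl:
    """Toy flow-control helper: writes wait on an event cleared by pause_writing()."""

    def __init__(self):
        self._writable = asyncio.Event()
        self._writable.set()

    def pause_writing(self):
        self._writable.clear()

    def resume_writing(self):
        self._writable.set()

    async def drain(self):
        await self._writable.wait()


class Protocol:
    def __init__(self):
        self.flow = FlowControl()

    def pause_writing(self):
        self.flow.pause_writing()

    def connection_lost(self, exc):
        # The crucial step: release writers parked in flow.drain(), so the
        # application's send() call can return instead of hanging forever.
        self.flow.resume_writing()


async def main():
    proto = Protocol()
    proto.pause_writing()        # kernel send buffer filled up
    proto.connection_lost(None)  # client went away
    await asyncio.wait_for(proto.flow.drain(), timeout=1)  # returns promptly
    print("send() can return")


asyncio.run(main())
```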
Issue Analytics
- Created: 4 years ago
- Comments: 8 (6 by maintainers)
Top GitHub Comments
Thanks for helping work through this!
Okay, released as 0.7.0.b2. You'll want to install with `pip install --pre`, since it's labelled as a beta pre-release.