
Server Sent Events (SSE) and send buffer flushing

See original GitHub issue

I have a simple ASGI app, seen below. The server appears to be sending the data at the expected intervals, and the headers look right, but the browser does not receive the data until the connection is closed. Is this a buffering issue? How might one flush the buffer so it goes straight to the browser?

Here is the application code:

import asyncio
import datetime


async def app(scope, receive, send):
    # Default headers; reused by the 404 fallback below.
    headers = [(b"content-type", b"text/html")]
    if scope["path"] == "/":
        # Minimal page whose script subscribes to /sse and appends each message.
        body = (
            "<html>"
            "<body>"
            "</body>"
            "<script>"
            "  let eventSource = new EventSource('/sse');"
            "  eventSource.addEventListener('message', (e) => {"
            "    document.body.innerHTML += e.data + '<br>';"
            "  });"
            "</script>"
            "</html>"
        ).encode()

        await send({"type": "http.response.start", "status": 200, "headers": headers})
        await send({"type": "http.response.body", "body": body})
    elif scope["path"] == "/sse":
        headers = [
            (b"content-type", b"text/event-stream"),
            (b"cache-control", b"no-cache"),
            (b"connection", b"keep-alive"),
        ]

        async def body():
            # Yield the current timestamp as an SSE event every 10 seconds,
            # until the client disconnects and the task is cancelled.
            ongoing = True
            while ongoing:
                try:
                    payload = datetime.datetime.now()
                    yield f"data: {payload}\n\n".encode()
                    await asyncio.sleep(10)
                except asyncio.CancelledError:
                    ongoing = False

        await send({"type": "http.response.start", "status": 200, "headers": headers})
        async for chunk in body():
            # Send each yielded event as its own body chunk; more_body keeps the response open.
            await send({"type": "http.response.body", "body": chunk, "more_body": True})
        await send({"type": "http.response.body", "body": b""})
    else:
        await send({"type": "http.response.start", "status": 404, "headers": headers})
        await send({"type": "http.response.body", "body": b""})

This can be run by saving the file above as asgi_sse.py, then starting it with something like:

uvicorn asgi_sse:app
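
For completeness, the same server can also be started programmatically with uvicorn's standard uvicorn.run() entry point. This is just a sketch; the file name run.py and the host/port values are assumptions matching the setup above.

# run.py - programmatic equivalent of `uvicorn asgi_sse:app`
import uvicorn

if __name__ == "__main__":
    # Passing the app as an import string mirrors the command-line invocation.
    uvicorn.run("asgi_sse:app", host="127.0.0.1", port=8000)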

The headers:

$ curl -I http://localhost:8000/sse
HTTP/1.1 200 OK
date: Mon, 01 Jun 2020 09:51:41 GMT
server: uvicorn
content-type: text/event-stream
cache-control: no-cache
connection: keep-alive

And the response:

$ curl http://localhost:8000/sse
data: 2020-06-01 05:52:40.735403

data: 2020-06-01 05:52:50.736378

data: 2020-06-01 05:53:00.736812

Weird that curl gets the data immediately…

Is there documentation needed on how to get SSE to work with uvicorn? Or (more likely) am I doing something wrong?

I put the code above on repl.it. To try it out, please fork and run it. Strangely, it works as intended on repl.it.
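
To rule the server in or out, a minimal raw client can be pointed at the endpoint and print a timestamp for each chunk it receives. This is only a sketch (the file name sse_probe.py and the 127.0.0.1:8000 address are assumptions matching the setup above); if the chunks arrive roughly ten seconds apart here, the data is leaving uvicorn on time and any buffering is happening further downstream, in the browser or a proxy.

# sse_probe.py - print a timestamp for every chunk received from /sse
import asyncio
import datetime


async def probe():
    reader, writer = await asyncio.open_connection("127.0.0.1", 8000)
    writer.write(
        b"GET /sse HTTP/1.1\r\n"
        b"Host: localhost\r\n"
        b"Accept: text/event-stream\r\n"
        b"\r\n"
    )
    await writer.drain()
    while True:
        chunk = await reader.read(1024)
        if not chunk:
            break
        print(datetime.datetime.now(), chunk)


asyncio.run(probe())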

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Comments: 5 (1 by maintainers)

Top GitHub Comments

1 reaction
synodriver commented, Oct 3, 2021
import asyncio


async def func(scope, receive, send):
    if scope["type"] == "http":
        await send({"type": "http.response.start", "status": 200,
                    "headers": [(b"Content-Type", b"text/event-stream"), (b"Cache-Control", b"no-cache"),
                                (b"X-Accel-Buffering", b"no")]})
        for i in range(10):
            # Each SSE event must be terminated by a blank line, hence the double \r\n.
            await send({"type": "http.response.body", "body": f"data: {i}\r\n\r\n".encode(), "more_body": True})
            await asyncio.sleep(1)
        await send({"type": "http.response.body", "body": b""})

Maybe you need another header field, X-Accel-Buffering.
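
For reference, X-Accel-Buffering is a header honored by nginx (and some other reverse proxies) to disable response buffering for a single response, so it only helps when such a proxy sits between uvicorn and the browser. A sketch of the original /sse headers with the extra field added, purely as an illustration:

headers = [
    (b"content-type", b"text/event-stream"),
    (b"cache-control", b"no-cache"),
    (b"connection", b"keep-alive"),
    # Only meaningful when a buffering reverse proxy such as nginx is in front.
    (b"x-accel-buffering", b"no"),
]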

0 reactions
bokunodev commented, Oct 3, 2021

Hi, I just tried uvicorn and I'm facing the same issue. It seems like there is no way to manually flush the response buffer. Is there a solution for this, or is this by design and intended behavior? I'm on Arch Linux 5.10.70-1-lts.
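
As a side note, the ASGI interface itself has no flush primitive: each http.response.body message with more_body set to True is handed to the server to write out, so the await send(...) call is effectively the flush point. One way to see when chunks actually leave the application is a small wrapper like the sketch below; the name log_sends is made up for illustration, and app is assumed to be the SSE application defined earlier.

import datetime


def log_sends(app):
    # Wrap an ASGI app so every outgoing body chunk is logged with a timestamp.
    async def wrapped(scope, receive, send):
        async def logging_send(message):
            if message["type"] == "http.response.body":
                body = message.get("body", b"")
                print(datetime.datetime.now(), "sending", len(body), "bytes")
            await send(message)

        await app(scope, receive, logging_send)

    return wrapped


# Usage: wrap the application before handing it to uvicorn.
app = log_sends(app)

Comparing these timestamps with what a client receives shows whether the delay is inside the application or somewhere in the transport path.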

Read more comments on GitHub.

