
Response streaming does not work

See original GitHub issue

The following code, inspired by the documentation, does not work: the result is buffered and sent to the client all at once.

import cherrypy
import time

class Root:
    @cherrypy.expose
    def thing(self):
        cherrypy.response.headers['Content-Type'] = 'text/plain'
        def content():
            yield "Hello, "
            time.sleep(3)  # the client should see "Hello, " three seconds before "world"
            yield "world"
        return content()
    # enable streaming for this handler so the generator output is not buffered
    thing._cp_config = {'response.stream': True}

print(cherrypy.__version__)
cherrypy.quickstart(Root())

Issue Analytics

  • State: closed
  • Created 7 years ago
  • Comments: 8 (4 by maintainers)

Top GitHub Comments

2 reactions
stenci commented, Sep 19, 2016

On Windows all the clients seem to buffer, so it looks like a lower-level problem on the Windows side.

Both Firefox and curl on Linux stream the output as expected, regardless of whether the CherryPy server runs on Windows or Linux.

Both Firefox and curl on Windows buffer the output. The output of curl on Windows shows the correct timestamps, but the whole content is displayed at once at the end.
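To separate network arrival from console display, one quick check is to time the chunks on the client side. This is just a minimal sketch, assuming the requests package is installed and the server above is listening on localhost:8080 with the handler at /thing:

#! /usr/bin/env python
import time

import requests  # third-party; pip install requests

start = time.time()
resp = requests.get('http://localhost:8080/thing', stream=True)
# chunk_size=None yields data as it arrives on the socket,
# in whatever size the chunks are received
for chunk in resp.iter_content(chunk_size=None):
    print('%5.1fs  %r' % (time.time() - start, chunk))

If the chunks arrive spread out over time here but show up all at once in a browser or in the console, the buffering is happening on the client side rather than in CherryPy.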

The documentation already says:

In general, it is safer and easier to not stream output

Perhaps it is worth adding:

… and it doesn’t work well in Windows clients

I will run my scripts from Firefox on Ubuntu in VirtualBox, so I can keep track of the progress without rewriting them. It’s a horrible workaround, but it’s the easiest way for me.

Thanks for your help

1 reaction
webknjaz commented, Sep 19, 2016

I’ve already confirmed that it is working. You may check this with the command I posted above.

Here’s a handler yielding additional timestamps:

#! /usr/bin/env python

from datetime import datetime
import time
import cherrypy

class Root:
    @cherrypy.expose
    @cherrypy.config(**{'response.stream': True})
    def thing(self):
        cherrypy.response.headers['Content-Type'] = 'text/plain'
        yield 'Hi, ' * 100
        yield str(datetime.now())
        yield '================='
        time.sleep(3 * 2)
        yield str(datetime.now())
        yield 'there!'


if __name__ == '__main__':
    cherrypy.quickstart(Root())
And the corresponding curl session:

$ curl --trace-ascii - localhost:8080/thing
== Info:   Trying ::1...
== Info: connect to ::1 port 8080 failed: Connection refused
== Info:   Trying 127.0.0.1...
== Info: Connected to localhost (127.0.0.1) port 8080 (#0)
=> Send header, 83 bytes (0x53)
0000: GET /thing HTTP/1.1
0015: Host: localhost:8080
002b: User-Agent: curl/7.46.0
0044: Accept: */*
0051: 
<= Recv header, 17 bytes (0x11)
0000: HTTP/1.1 200 OK
<= Recv header, 24 bytes (0x18)
0000: Server: CherryPy/8.1.0
<= Recv header, 39 bytes (0x27)
0000: Content-Type: text/html;charset=utf-8
<= Recv header, 37 bytes (0x25)
0000: Date: Mon, 19 Sep 2016 18:22:41 GMT
<= Recv header, 28 bytes (0x1c)
0000: Transfer-Encoding: chunked
<= Recv header, 2 bytes (0x2)
0000: 
<= Recv data, 439 bytes (0x1b7)
0000: 190
0005: Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, 
0045: Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, 
0085: Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, 
00c5: Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, 
0105: Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, 
0145: Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, 
0185: Hi, Hi, Hi, Hi, 
0197: 1a
019b: 2016-09-19 21:22:41.744041
Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, Hi, 2016-09-19 21:22:41.744041<= Recv data, 23 bytes (0x17)
0000: 11
0004: =================
=================<= Recv data, 32 bytes (0x20)
0000: 1a
0004: 2016-09-19 21:22:47.750253
2016-09-19 21:22:47.750253<= Recv data, 11 bytes (0xb)
0000: 6
0003: there!
there!<= Recv data, 5 bytes (0x5)
0000: 0
0003: 
== Info: Connection #0 to host localhost left intact

Do you see the timestamps? Their difference is 6 seconds. The console output also paused for 6 seconds, exactly as the handler does. This is my proof.

So look for bottlenecks in your scripts.
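For what it’s worth, the kind of bottleneck that produces this symptom is usually a handler that assembles the whole body before handing it to CherryPy, so response.stream has nothing to send incrementally. A minimal sketch of the difference (the handler names buffered and streamed are made up for illustration):

#! /usr/bin/env python
import time

import cherrypy


class Root:
    @cherrypy.expose
    @cherrypy.config(**{'response.stream': True})
    def buffered(self):
        # The whole body is built before anything is returned, so the
        # client receives it all at once even though streaming is enabled.
        parts = []
        for i in range(5):
            time.sleep(1)
            parts.append('chunk %d\n' % i)
        return ''.join(parts)

    @cherrypy.expose
    @cherrypy.config(**{'response.stream': True})
    def streamed(self):
        # Each chunk is handed to the client as soon as it is yielded.
        for i in range(5):
            time.sleep(1)
            yield 'chunk %d\n' % i


if __name__ == '__main__':
    cherrypy.quickstart(Root())

Fetching /streamed with the curl command above should show a chunk roughly every second, while /buffered only responds after about five seconds.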

Read more comments on GitHub >

Top Results From Across the Web

Issue in Streaming response from backend server to Client - IBM
We have enabled streaming in this case so that Datapower does not process the response and streams the response to the Client. On the MPG...

Streaming response is not displayed #10506 - GitHub
Describe the Issue: I have created a filtered stream on Twitter. I can stream the responses using curl https://api.twitter.com/2/tweets/search/ ...

"Streaming Server Bad Response" | Switcher Studio Help Center
In rare instances, the "Streaming Server Bad Response" error message could mean there is a problem with the streaming destination. Many platforms provide...

What to do with errors when streaming the body of an Http ...
One of the following should do it: Close the connection (reset or normal close); Write a malformed chunk (and close the connection) which ...

Streaming requests and responses | Apigee Edge
By default, HTTP streaming is disabled and HTTP request and response payloads are written to a buffer in memory before they are processed...
