Header "Transfer-Encoding: chunked" set even if Content-Length is provided which causes body to not actually get chunked
See original GitHub issue

Test script
import requests
import time

def f():
    yield b"lol"
    time.sleep(2)
    yield b"man"

requests.post('http://127.0.0.1:8801/', data=f(), headers={"Content-Length": 6})
Actual result
Received on the server:
$ nc -p 8801 -l
POST / HTTP/1.1
Host: 127.0.0.1:8801
User-Agent: python-requests/2.0.0 CPython/3.3.1 Linux/3.11.0-031100rc4-generic
Accept: */*
Transfer-Encoding: chunked
Content-Length: 6
Accept-Encoding: gzip, deflate, compress

lolman
Expected result
Did not expect “Transfer-Encoding: chunked” since I provided the Content-Length. If requests insists on doing chunked transfer encoding, it should disregard the content length and actually chunk the content (as it does if there is no Content-Length header given).
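For reference, this is what the body should have looked like on the wire if requests had actually applied HTTP/1.1 chunked framing to the two yielded pieces (shown here as a Python bytes literal; each chunk is a hexadecimal size line, the data, and a CRLF, with a zero-size chunk terminating the body):

# Chunked framing the server should have received instead of the bare "lolman"
chunked_body = (
    b"3\r\n" b"lol\r\n"   # first chunk: 3 bytes
    b"3\r\n" b"man\r\n"   # second chunk: 3 bytes
    b"0\r\n" b"\r\n"      # zero-length chunk ends the body
)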
Issue Analytics
- Created: 10 years ago
- Comments: 52 (37 by maintainers)
Top Results From Across the Web

Chunked encoding and content-length header - Stack Overflow
No: "Messages MUST NOT include both a Content-Length header field and a non-identity transfer-coding. If the message does include a ...

Transfer-Encoding - HTTP - MDN Web Docs
Chunked encoding is useful when larger amounts of data are sent to the client and the total size of the response...

Chunked transfer encoding - Wikipedia
The chunks are sent out and received independently of one another. No knowledge of the data stream outside the currently-being-processed chunk is necessary...

Migration to 2.x — aiohttp 3.8.3 documentation
aiohttp does not enable chunked encoding automatically even if a transfer-encoding header is supplied: chunked has to be set explicitly. If chunked is...

Responses - Everything curl
When receiving a chunked response, there is no Content-Length: for the response to indicate its size. Instead, there is a Transfer-Encoding: chunked header...
@timuralp I remain opposed to adding a flag for this. It’s really just unacceptable to have an HTTP/1.1 implementation that cannot handle chunked transfer encoding in 2016. It’s been a specification requirement for so long that the first specification that required it is nearly old enough to vote in the United States of America; I don’t think we can keep cutting entities slack for not doing it.
From my perspective the bug here remains that we may incorrectly emit both Content-Length and Transfer-Encoding. Of course, my perspective is non-binding. 😉
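One caller-side workaround until that is resolved, sketched below rather than taken from requests itself: prepare the request manually and drop the user-supplied Content-Length whenever requests has chosen chunked framing (this reuses the repro’s URL and generator, and assumes prepare() still adds Transfer-Encoding for length-less iterables):

import requests

def body():
    yield b"lol"
    yield b"man"

req = requests.Request(
    'POST', 'http://127.0.0.1:8801/',
    data=body(),
    headers={"Content-Length": "6"},
)
prepared = req.prepare()

# RFC 7230 forbids sending both framings; keep the one requests will actually use.
if prepared.headers.get("Transfer-Encoding") == "chunked":
    prepared.headers.pop("Content-Length", None)

with requests.Session() as session:
    session.send(prepared)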
Currently, providing a generator and specifying a Content-Length is an error (it generates an invalid HTTP request), so nobody should be relying on this use case. That’s why I was thinking it would not break your users’ programs. Why generators? Data is not always provided by a file-like object. For example, I’d like to watch upload progress by yielding data chunk by chunk, etc. (otherwise I’d have to override read() methods).
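A possible middle ground for the progress use case, assuming the data can be wrapped in a file-like object: expose __len__ so requests sends a plain Content-Length body with no chunking, and report progress from read(). The ProgressReader class below is purely illustrative and not part of requests:

import io
import requests

class ProgressReader:
    # File-like wrapper: __len__ lets requests compute a Content-Length,
    # and read() reports progress as the transport pulls the body.
    def __init__(self, payload, step=3):
        self._buf = io.BytesIO(payload)
        self._total = len(payload)
        self._sent = 0
        self._step = step

    def __len__(self):
        return self._total

    def read(self, size=-1):
        # Cap each read so progress is reported in small increments even
        # when the transport asks for large blocks.
        limit = self._step if size is None or size < 0 else min(size, self._step)
        chunk = self._buf.read(limit)
        if chunk:
            self._sent += len(chunk)
            print(f"uploaded {self._sent}/{self._total} bytes")
        return chunk

requests.post('http://127.0.0.1:8801/', data=ProgressReader(b"lolman"))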