requests.Session doesn't properly handle closed keep-alive sessions
Hello, this was reported for requests 2.18.4 on the Debian BTS[¹] by Jonathan Lynch; since it is not Debian-specific, I'm forwarding it here:
When a server reaps a keep-alive session it sends a FIN packet to the client. Normally, requests handles this fine and rebuilds the session on the next request. However, there is an edge case involving network latency that is not properly handled:
If Python sends a request at roughly the same time as the server closes the session, the server will send an RST (as the session is closed). Python receives this RST on what it thought was a valid connection and throws an error:
requests.exceptions.ConnectionError: ('Connection aborted.',
RemoteDisconnected('Remote end closed connection without response',))
The reason I consider this a bug is that Python received the FIN packet before it received the RST. As a result, it should not be surprised when the connection is subsequently aborted. It is an edge case, but the client has enough information available that it could have handled it correctly.
The workaround is to set max_retries on the Session via an HTTPAdapter, but I believe the correct behavior when the FIN is received is to rebuild the connection and re-send any in-flight requests (rather than throwing an error). Requests correctly handles the FIN packet if there are no in-flight requests, but if there are, it ignores the FIN and instead throws an error.
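The workaround described above can be sketched as follows; this is a minimal illustration, and the retry count and example URL are assumptions, not values from the report:

```python
import requests
from requests.adapters import HTTPAdapter

# Mount an adapter whose max_retries tells urllib3 to retry the request
# if the connection was dropped (e.g. by a keep-alive timeout on the server).
session = requests.Session()
adapter = HTTPAdapter(max_retries=3)  # retry up to 3 times
session.mount("http://", adapter)
session.mount("https://", adapter)

# Requests made through this session now survive the FIN/RST race,
# at the cost of potentially re-sending a request the server already saw:
# response = session.get("https://example.com/")
```

Note that blanket retries can re-send non-idempotent requests (e.g. POST), which is presumably why the library is conservative by default.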
I will ask Jonathan to continue the discussion here. Thanks!
[¹] https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=899406
Issue Analytics
- Created 5 years ago
- Reactions: 11
- Comments: 15 (4 by maintainers)
Top GitHub Comments
Hi,
I think I found a solution and created a pull request for urllib3: urllib3/urllib3#1911. After it is merged (I hope), requests should raise a MaxRetries error as it should, because requests sets max_retries=0 on urllib3's Retry object by default. Whether it would be better to leave urllib3's default value of 3, or to pass 1, should be discussed.
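For reference, the default that requests passes down to urllib3 can be overridden with an explicit Retry object; the specific counts below are illustrative, not a recommendation from this thread:

```python
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

# By default requests constructs Retry(0, read=False), i.e. no retries,
# which is why a dropped keep-alive connection surfaces as ConnectionError
# instead of being retried transparently.
retry = Retry(total=1, connect=1, read=1)
session = requests.Session()
session.mount("https://", HTTPAdapter(max_retries=retry))
```

Passing a Retry object rather than an int also lets you control backoff and which HTTP methods are eligible for retry.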
More info -> https://bugs.python.org/issue3566 , https://hg.python.org/cpython/rev/eba80326ba53 , https://bugs.python.org/issue41345
I agree with you that the Session is not handling connections as it should, and I think that is the reason for the adapters. The RFC for "persistent" connections is very odd: it says the client should start sending TCP keep-alive packets only after 2 hours of no data received. For a service-oriented server, keeping idle sessions is not affordable. Take a video streaming server, Chaturbate say, or Netflix: users browse the options and then pick a video or room, but too many times we, as users, leave the video playing and fall asleep, so we consume resources that ultimately cost the provider more, so they raise their prices, and so on.
So in this case, your software raises an exception instead of handling the connection as it should because the server side dropped it, and maybe not the server software itself, but the load balancers, sandboxes, or any other box in the middle.
That's why I SUGGESTED enabling a lower-layer connection guard that would ensure the session, well, the last request, stays open until the server says "WTF" and closes the connection properly instead of just dropping it.
So it is not the work of just the Requests team; it really depends on 7 teams, one per OSI layer, actually 4, heheh.
So I am just trying to let you know that we can be creative and find solutions, and to understand that since the module is called Requests, it handles requests and supports itself on other, more specific modules.
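The "lower-layer connection guard" suggested above corresponds roughly to TCP keep-alive, which can be enabled per socket so the kernel probes an idle connection instead of waiting the 2-hour default. A minimal sketch; the idle/interval/count values are illustrative, and the TCP_KEEP* options are platform-specific (Linux):

```python
import socket

def enable_keepalive(sock, idle=60, interval=10, count=3):
    """Ask the kernel to probe an idle connection so a dead peer is
    detected in minutes rather than after the 2-hour RFC default."""
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)
    # These options exist on Linux; guard with hasattr for portability.
    if hasattr(socket, "TCP_KEEPIDLE"):
        sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPIDLE, idle)
    if hasattr(socket, "TCP_KEEPINTVL"):
        sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPINTVL, interval)
    if hasattr(socket, "TCP_KEEPCNT"):
        sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPCNT, count)

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
enable_keepalive(sock)
```

Keep-alive probes only detect a dead peer; they do not remove the FIN/RST race on an active request, so they complement rather than replace retries.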
XD