Stuck on an issue?

Lightrun Answers was designed to reduce the constant googling that comes with debugging third-party libraries. It collects links to all the places you might be looking at while hunting down a tough bug.

And, if you’re still stuck at the end, we’re happy to hop on a call to see how we can help out.

Issuing requests in close concurrency can unnecessarily result in multiple HTTP/2 connections.

See original GitHub issue

When I make several requests with the AsyncClient, it seems to open multiple connections to the server. My code looks like this:

async def _send_request(self, client, image, url):
    headers = {
        'Content-Type': 'image/jpeg',
        'Content-Length': str(len(image))
    }
    return await client.post(
        url,
        headers=headers,
        data=image
    )

async with httpx.AsyncClient(timeout=timeout, base_url=self.server_url) as client:
    requests = []
    for image in images:
        requests.append(self._send_request(client, image, url))
    responses = await asyncio.gather(*requests)

As far as I understand it should use connection pooling, store the connection, and reuse it. In my case I get the following debug message for every image POST request I send in the loop:

DEBUG:httpx.dispatch.connection:start_connect host='api.garaza.io' port=443 timeout=TimeoutConfig(connect_timeout=None, read_timeout=60, write_timeout=5)

Am I doing anything wrong or is it a bug?
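
As the comments below suggest, this turns out to be by design: at the moment asyncio.gather fires all the posts, no connection has been established yet, so there is nothing in the pool to reuse and each in-flight request opens its own connection. One hypothetical workaround (not from the thread, and assuming a reasonably recent httpx, where raw bytes are passed as content= rather than data=) is to await a single request first so that the remaining concurrent ones find an established connection to reuse:

import asyncio
import httpx

async def post_images(images, url, timeout=60):
    # Hypothetical sketch, not the reporter's code: send one "warm-up"
    # request before fanning out, so the pool already holds an established
    # connection (which HTTP/2 can then multiplex over).
    # Recent httpx takes raw bytes as content=; older versions used data=.
    headers = {'Content-Type': 'image/jpeg'}
    async with httpx.AsyncClient(timeout=timeout) as client:
        first, *rest = images
        responses = [await client.post(url, headers=headers, content=first)]
        responses += await asyncio.gather(
            *(client.post(url, headers=headers, content=img) for img in rest)
        )
        return responses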

Issue Analytics

  • State: closed
  • Created 4 years ago
  • Comments: 10 (9 by maintainers)

Top GitHub Comments

2 reactions
yeraydiazdiaz commented, Nov 10, 2019

I noticed this happening in a similar situation as well and it also confused me. I fully expected connections to only be created once per origin, but it seems the current behaviour is by design, according to this test?

https://github.com/encode/httpx/blob/master/tests/dispatch/test_connection_pools.py#L69-L80

@PrimozGodec, max_connections is a BoundedSemaphore which HTTPX uses to limit the number of connections in the pool. The default is 100.
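
For context, this is roughly how that pool limit is configured. The spelling has changed between httpx versions (older releases used a PoolLimits object and a pool_limits= argument), so treat the names below as an assumption about a current httpx rather than what the reporter was running:

import httpx

# Assumed current-httpx API: cap the pool at 10 connections total,
# keeping at most 5 alive for reuse. Older versions spelled this
# httpx.PoolLimits(...) passed as pool_limits=.
limits = httpx.Limits(max_connections=10, max_keepalive_connections=5)
client = httpx.AsyncClient(limits=limits)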

1 reaction
ojii commented, Feb 18, 2020

But, we don’t know ahead of time if a connection will be HTTP/1.1 or HTTP/2 until it’s been established.

Is there a way around this issue if we know beforehand that it's HTTP/2 (as in, when talking to a server that is HTTP/2-only, such as APNS)?
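
For what it's worth, httpx does expose an opt-in flag for HTTP/2 on the client (it requires the optional h2 dependency). Whether that alone avoids the extra connections is a separate question, since the quoted comment still applies until the first connection finishes its ALPN negotiation; the snippet below is just a sketch of enabling it, with the APNS hostname used purely as an illustration:

import httpx

# Requires the optional HTTP/2 extra: pip install "httpx[http2]"
# http2=True enables HTTP/2 negotiation via ALPN; the protocol is still
# only known once the connection has actually been established.
client = httpx.AsyncClient(http2=True, base_url="https://api.push.apple.com")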

Read more comments on GitHub >

Top Results From Across the Web

RFC 9113 - HTTP/2
Making multiple concurrent requests can reduce latency and improve application performance. HTTP/1.0 allowed only one request to be outstanding at a time on …
Read more >
How many concurrent requests should we multiplex in …
The number of streams that client and server can initiate isn't unlimited, it's mandated by the SETTINGS_MAX_CONCURRENT_STREAMS parameter of …
Read more >
HTTP/1.1 vs HTTP/2: What's the Difference?
With persistent connections, HTTP/1.1 assumes that a TCP connection should be kept open unless directly told to close. This allows the client to …
Read more >
Improve throughput and concurrency with HTTP/2 - Vespa Blog
The Vespa HTTP container now accepts HTTP/2 with TLS enabled. Learn how this improves HTTP throughput and efficiency, and how to get started …
Read more >
HTTP/1.1: Connections
HTTP requests and responses can be pipelined on a connection. Pipelining allows a client to make multiple requests without waiting for each response, …
Read more >
