
connection is given to multiple pool consumers simultaneously

See original GitHub issue

Initially observed on Arabba-SR6, still present on Arabba-SR7.

  • target is PostgreSQL 12 (AWS RDS).
  • openjdk version “11.0.7.0.101” 2020-07-14 LTS (Zulu)

A given PostgresqlConnection is being provided to multiple pool users simultaneously.

With Arabba-SR7, I started seeing the following stack trace (there was no exception thrown from the pool with SR6):

java.lang.IllegalArgumentException: Too many permits returned: returned=1, would bring to 11/10
  at reactor.pool.AllocationStrategies$SizeBasedAllocationStrategy.returnPermits(AllocationStrategies.java:141)
  at reactor.pool.AbstractPool.destroyPoolable(AbstractPool.java:147)
  at reactor.pool.SimpleDequePool.drainLoop(SimpleDequePool.java:310)
  at reactor.pool.SimpleDequePool.drain(SimpleDequePool.java:204)
  at reactor.pool.SimpleDequePool.doAcquire(SimpleDequePool.java:199)
  at reactor.pool.AbstractPool$Borrower.request(AbstractPool.java:378)
  at reactor.core.publisher.FluxPeek$PeekSubscriber.request(FluxPeek.java:130)
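The exception suggests the pool's permit accounting is being violated: if the same pooled connection is handed to two consumers and both releases are counted, the permit counter is incremented more times than it was decremented, tripping the upper-bound check. A minimal, self-contained sketch of that kind of size-based accounting follows; the class and method names are illustrative only, not reactor-pool's actual API.

```java
import java.util.concurrent.atomic.AtomicInteger;

// Illustrative permit counter: acquiring a connection takes a permit,
// destroying/releasing one returns it. A double-release makes the
// available count exceed the configured maximum, which is exactly the
// invariant the "Too many permits returned" exception guards.
class SizeBasedPermits {
    private final int max;
    private final AtomicInteger available;

    SizeBasedPermits(int max) {
        this.max = max;
        this.available = new AtomicInteger(max);
    }

    // Single-shot CAS attempt; a real pool would loop or park on contention.
    boolean tryAcquire() {
        int cur = available.get();
        return cur > 0 && available.compareAndSet(cur, cur - 1);
    }

    void returnPermit() {
        int after = available.incrementAndGet();
        if (after > max) {
            available.decrementAndGet(); // roll back before failing
            throw new IllegalArgumentException(
                "Too many permits returned: returned=1, would bring to " + after + "/" + max);
        }
    }
}
```

Under this model, the exception is a symptom rather than the root cause: something upstream released (or destroyed) the same connection twice.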

I have also added logging around the acquisition and close of connections, both over and under the pool. My logging shows this:

2020-10-01 22:23:03.844 DEBUG 852 --- [tor-tcp-epoll-1] c.n.b.e.p.LoggingConnectionFactory       : created CloseLoggingConnection{name='query/reader (over pool) 41', delegate=PooledConnection[CloseLoggingConnection{name='query/reader (under pool) 4', delegate=PostgresqlConnection{client=io.r2dbc.postgresql.client.ReactorNettyClient@261617c5, codecs=io.r2dbc.postgresql.codec.DefaultCodecs@4cb7990f}}]}
2020-10-01 22:23:03.851 DEBUG 852 --- [tor-tcp-epoll-1] c.n.b.e.p.LoggingConnectionFactory       : closed CloseLoggingConnection{name='query/reader (over pool) 41', delegate=PooledConnection[CloseLoggingConnection{name='query/reader (under pool) 4', delegate=PostgresqlConnection{client=io.r2dbc.postgresql.client.ReactorNettyClient@261617c5, codecs=io.r2dbc.postgresql.codec.DefaultCodecs@4cb7990f}}]} (onComplete)
2020-10-01 22:23:17.116 DEBUG 852 --- [tor-tcp-epoll-1] c.n.b.e.p.LoggingConnectionFactory       : created CloseLoggingConnection{name='query/reader (over pool) 51', delegate=PooledConnection[CloseLoggingConnection{name='query/reader (under pool) 4', delegate=PostgresqlConnection{client=io.r2dbc.postgresql.client.ReactorNettyClient@261617c5, codecs=io.r2dbc.postgresql.codec.DefaultCodecs@4cb7990f}}]}
2020-10-01 22:23:17.166 DEBUG 852 --- [tor-tcp-epoll-1] c.n.b.e.p.LoggingConnectionFactory       : closed CloseLoggingConnection{name='query/reader (over pool) 51', delegate=PooledConnection[CloseLoggingConnection{name='query/reader (under pool) 4', delegate=PostgresqlConnection{client=io.r2dbc.postgresql.client.ReactorNettyClient@261617c5, codecs=io.r2dbc.postgresql.codec.DefaultCodecs@4cb7990f}}]} (onComplete)
2020-10-01 22:23:17.179 DEBUG 852 --- [tor-tcp-epoll-1] c.n.b.e.p.LoggingConnectionFactory       : created CloseLoggingConnection{name='query/reader (over pool) 62', delegate=PooledConnection[CloseLoggingConnection{name='query/reader (under pool) 4', delegate=PostgresqlConnection{client=io.r2dbc.postgresql.client.ReactorNettyClient@261617c5, codecs=io.r2dbc.postgresql.codec.DefaultCodecs@4cb7990f}}]}
2020-10-01 22:23:17.225 DEBUG 852 --- [tor-tcp-epoll-1] c.n.b.e.p.LoggingConnectionFactory       : created CloseLoggingConnection{name='query/reader (over pool) 70', delegate=PooledConnection[CloseLoggingConnection{name='query/reader (under pool) 4', delegate=PostgresqlConnection{client=io.r2dbc.postgresql.client.ReactorNettyClient@261617c5, codecs=io.r2dbc.postgresql.codec.DefaultCodecs@4cb7990f}}]}
2020-10-01 22:23:19.297 DEBUG 852 --- [tor-tcp-epoll-1] c.n.b.e.p.LoggingConnectionFactory       : closed CloseLoggingConnection{name='query/reader (over pool) 62', delegate=PooledConnection[CloseLoggingConnection{name='query/reader (under pool) 4', delegate=PostgresqlConnection{client=io.r2dbc.postgresql.client.ReactorNettyClient@261617c5, codecs=io.r2dbc.postgresql.codec.DefaultCodecs@4cb7990f}}]} (onComplete)
2020-10-01 22:23:19.302 DEBUG 852 --- [tor-tcp-epoll-1] c.n.b.e.p.LoggingConnectionFactory       : closed CloseLoggingConnection{name='query/reader (over pool) 70', delegate=PooledConnection[CloseLoggingConnection{name='query/reader (under pool) 4', delegate=PostgresqlConnection{client=io.r2dbc.postgresql.client.ReactorNettyClient@261617c5, codecs=io.r2dbc.postgresql.codec.DefaultCodecs@4cb7990f}}]} (onComplete)

under pool 4 is the database connection. over pool 41 and over pool 51 both acquire and release it as expected; however, over pool 62 and over pool 70 are then provided the same underlying database connection simultaneously.
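The reporter's actual wrapper isn't shown in the issue; a minimal, runnable sketch of the close-logging decorator idea behind those log lines might look like the following. The real code would wrap io.r2dbc.spi.Connection (whose close() returns a Publisher); here a hypothetical SimpleConnection interface stands in so the example runs without r2dbc on the classpath.

```java
// Hypothetical stand-in for io.r2dbc.spi.Connection, reduced to close().
interface SimpleConnection {
    void close();
}

// Decorator that logs creation and close, mirroring the
// "created CloseLoggingConnection{...}" / "closed ..." lines above.
class CloseLoggingConnection implements SimpleConnection {
    private final String name;
    private final SimpleConnection delegate;
    private boolean closed;

    CloseLoggingConnection(String name, SimpleConnection delegate) {
        this.name = name;
        this.delegate = delegate;
        System.out.println("created " + this);
    }

    @Override
    public void close() {
        closed = true;
        System.out.println("closed " + this);
        delegate.close();
    }

    boolean isClosed() {
        return closed;
    }

    @Override
    public String toString() {
        return "CloseLoggingConnection{name='" + name + "', delegate=" + delegate + "}";
    }
}
```

Because the decorator prints the delegate's identity, two "created" lines sharing the same under-pool delegate without an intervening "closed" line is direct evidence of a double hand-out, which is what the log excerpt shows for over pool 62 and over pool 70.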

I am unable to reproduce this in my test suite, but it consistently happens when running my application in a deployed environment.

I initially thought that the pool was closing in-use connections (#88) as that is what I was observing. However, I now believe that this was due to the connection being both in the pool and in use, and thus being closed while in use.

~I have also observed long connection close times; we are using r2dbc-proxy to wrap connections and queries with spans. I have seen a connection acquired and query executed quickly, but then a long delay before the connection is closed (as observed by io.r2dbc.proxy.listener.LifeCycleListener#afterCloseOnConnection). Parent spans also observe the same delay.~ (edit: struck this out, we were capturing the query traces incorrectly, only timing Statement.execute)

There doesn’t seem to be any additional logging I can enable in either r2dbc-pool or reactor-pool to help debug. I’ve temporarily disabled r2dbc-pool in my application, and several observed “weird problems” have now gone away.
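For reference, since the log format above is Spring Boot's, disabling the pool can be done via configuration rather than code. This is a sketch assuming Spring Boot's R2DBC auto-configuration; verify the property against your Boot version's documentation.

```properties
# Bypass r2dbc-pool entirely: each operation gets a dedicated
# driver-level connection instead of a pooled one.
spring.r2dbc.pool.enabled=false
```

Running without the pool trades the double hand-out symptom for per-operation connection setup cost, so it is a diagnostic step rather than a fix.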

I’m open to suggestions on how to debug this further.

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Comments: 6 (3 by maintainers)

Top GitHub Comments

mp911de commented, Dec 10, 2020 (1 reaction)

Thanks a lot for your feedback.

mp911de commented, Oct 5, 2020 (1 reaction)

Too many permits returned looks like an issue in reactor-pool. Can you file a bug report in https://github.com/reactor/reactor-pool/issues since the issue seems to be rooted in the actual pool component?
