
Native direct memory leak for H2 cancellations


We’ve had several production incidents in which linkerd stopped forwarding traffic with OutOfDirectMemoryError errors. At our traffic volume, it usually takes 1-2 weeks for a particular instance to fail. Raising the direct memory limit using the -XX:MaxDirectMemorySize JVM flag appears to extend the time it takes for an instance to fail.
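For reference, here is a minimal sketch (standard JDK API, not part of the original report) that samples the JVM’s direct buffer pool, roughly what the jvm/mem/buffer/direct/* metrics below report. Note that Netty’s pooled arenas count their direct memory separately via PlatformDependent (visible in the stack trace below), so their usage may not appear in this pool.

import java.lang.management.BufferPoolMXBean;
import java.lang.management.ManagementFactory;

public class DirectMemoryProbe {
    public static void main(String[] args) throws InterruptedException {
        while (true) {
            for (BufferPoolMXBean pool : ManagementFactory.getPlatformMXBeans(BufferPoolMXBean.class)) {
                if ("direct".equals(pool.getName())) {
                    // Count and total bytes of NIO direct buffers currently allocated.
                    System.out.printf("direct buffers: count=%d usedBytes=%d%n",
                            pool.getCount(), pool.getMemoryUsed());
                }
            }
            Thread.sleep(5_000); // sample every 5 seconds
        }
    }
}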

In digging around for possible clues, we do notice that the open_streams counter appears to increase without bound for certain services:

[image: open_streams gauge over time for the affected services]

In trying to figure out what makes certain services different from others, we discovered that, due to where they sit in our request flow, calls to some services are more likely to get canceled. We further guessed that the cancellations are somehow causing a resource leak.

For reasons I don’t fully understand, linkerd will sometimes emit a warning log message when a request is canceled (it seems like this could be a relatively common and benign event, so I’m wondering why it’s emitted at the WARN verbosity):

W 0308 12:05:12.528 PST THREAD22 TraceId:ce91690fb56eea4c: Exception propagated to the default monitor (upstream address: /127.0.0.1:61833, downstream address: /127.0.0.1:9999, label: $/inet/127.1/9999).
Reset.Cancel

We set up a test harness to trigger this code path in linkerd. It consisted of a gRPC service with a method that takes a fixed amount of time (e.g., 10ms), a client that drives a large amount of request volume with a deadline that’s less than that (e.g., 5ms), and a linkerd configured with -XX:MaxDirectMemorySize=128M. When we run the test, after about 10 minutes of traffic, linkerd stops forwarding traffic with OutOfDirectMemoryError errors:

W 0307 21:01:38.951 PST THREAD19: Unhandled exception in connection with /127.0.0.1:50892, shutting down connection
io.netty.util.internal.OutOfDirectMemoryError: failed to allocate 1048576 byte(s) of direct memory (used: 133169438, max: 134217728)
    at io.netty.util.internal.PlatformDependent.incrementMemoryCounter(PlatformDependent.java:640)
    at io.netty.util.internal.PlatformDependent.allocateDirectNoCleaner(PlatformDependent.java:594)
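For reference, a rough sketch of the kind of client loop that drives this code path (not our actual harness; the Echo* classes are hypothetical generated gRPC stubs, and 4142 is linkerd’s H2 listener port from the metrics below):

import io.grpc.ManagedChannel;
import io.grpc.ManagedChannelBuilder;
import io.grpc.StatusRuntimeException;
import java.util.concurrent.TimeUnit;

public class CancelLoadClient {
    public static void main(String[] args) {
        // Point the channel at linkerd's H2 listener (4142 in the metrics below).
        ManagedChannel channel = ManagedChannelBuilder.forAddress("127.0.0.1", 4142)
                .usePlaintext()
                .build();
        // EchoGrpc / EchoRequest are hypothetical generated stubs standing in for the real test service.
        EchoGrpc.EchoBlockingStub stub = EchoGrpc.newBlockingStub(channel);
        while (true) {
            try {
                // The 5ms deadline is shorter than the server's fixed ~10ms handling time,
                // so nearly every call is cancelled and propagates as an H2 RST_STREAM.
                stub.withDeadlineAfter(5, TimeUnit.MILLISECONDS)
                        .echo(EchoRequest.newBuilder().setMessage("ping").build());
            } catch (StatusRuntimeException e) {
                // DEADLINE_EXCEEDED is expected here; keep driving traffic.
            }
        }
    }
}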

During the test, we also observe the stream count increase. After the client is terminated, the stream count seems to go back down, but the bad state of the linkerd process does not clear: if we try to forward more traffic after waiting about 10 minutes, it almost immediately fails with the same out-of-direct-memory error.

Running with the Netty leak detector set to paranoid doesn’t yield any warning messages as far as I can tell. Since the detector only reports buffers that are garbage-collected without having been released, this indicates that there’s still a live reference hanging onto the direct memory that’s being used.
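For reference, the paranoid level can be enabled with -Dio.netty.leakDetection.level=paranoid or programmatically; a minimal sketch:

import io.netty.util.ResourceLeakDetector;

public class LeakDetectionSetup {
    public static void main(String[] args) {
        // Equivalent to starting the JVM with -Dio.netty.leakDetection.level=paranoid.
        ResourceLeakDetector.setLevel(ResourceLeakDetector.Level.PARANOID);
        // Paranoid mode tracks every buffer, but it only reports a leak when a buffer
        // is garbage-collected without having been released; a buffer that is still
        // strongly referenced never shows up, no matter how long it is held.
        System.out.println("Netty leak detection level: " + ResourceLeakDetector.getLevel());
    }
}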

We wondered if maybe this was just due to the traffic volume we were throwing at linkerd, and that it was being overwhelmed rather than leaking. This doesn’t appear to be the case. We compared heap dumps taken shortly after the failure started occurring and again around 10 minutes later. Not only did it continue to fail to forward traffic, but the internal DirectArena metrics in Netty (at least, I think that’s what these are?) seemed to indicate that the allocations and deallocations did not match:

[image: Netty DirectArena allocation/deallocation metrics from the two heap dumps]
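For reference, a hedged sketch of reading those arena counters programmatically, assuming the default pooled allocator (PooledByteBufAllocator.DEFAULT) is the one in use:

import io.netty.buffer.PoolArenaMetric;
import io.netty.buffer.PooledByteBufAllocator;
import io.netty.buffer.PooledByteBufAllocatorMetric;

public class DirectArenaStats {
    public static void main(String[] args) {
        PooledByteBufAllocatorMetric metric = PooledByteBufAllocator.DEFAULT.metric();
        long allocations = 0, deallocations = 0, active = 0;
        for (PoolArenaMetric arena : metric.directArenas()) {
            allocations += arena.numAllocations();
            deallocations += arena.numDeallocations();
            active += arena.numActiveAllocations();
        }
        // In a healthy steady state the active count stays bounded; a count that only
        // grows points at direct buffers that are allocated but never freed.
        System.out.printf("direct arenas: allocations=%d deallocations=%d active=%d%n",
                allocations, deallocations, active);
    }
}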

We’re continuing to investigate what’s going on, but wanted to get this issue out in case anyone else has some ideas.

Diagnostic info:

Metrics:

{
  "jvm/start_time": 1552070340000,
  "jvm/application_time_millis": 1146241,
  "jvm/classes/total_loaded": 9044,
  "jvm/classes/current_loaded": 9034,
  "jvm/classes/total_unloaded": 10,
  "jvm/postGC/Par_Survivor_Space/used": 262752,
  "jvm/postGC/CMS_Old_Gen/used": 14603600,
  "jvm/postGC/Par_Eden_Space/used": 0,
  "jvm/postGC/used": 14866352,
  "jvm/nonheap/committed": 73457664,
  "jvm/nonheap/max": -1,
  "jvm/nonheap/used": 67724776,
  "jvm/tenuring_threshold": 6,
  "jvm/thread/daemon_count": 59,
  "jvm/thread/count": 60,
  "jvm/thread/peak_count": 238,
  "jvm/mem/postGC/Par_Survivor_Space/used": 262752,
  "jvm/mem/postGC/CMS_Old_Gen/used": 14603600,
  "jvm/mem/postGC/Par_Eden_Space/used": 0,
  "jvm/mem/postGC/used": 14866352,
  "jvm/mem/metaspace/max_capacity": 1109393410,
  "jvm/mem/buffer/direct/max": 0,
  "jvm/mem/buffer/direct/count": 1,
  "jvm/mem/buffer/direct/used": 1,
  "jvm/mem/buffer/mapped/max": 0,
  "jvm/mem/buffer/mapped/count": 0,
  "jvm/mem/buffer/mapped/used": 0,
  "jvm/mem/allocations/eden/bytes": 27257749500,
  "jvm/mem/current/used": 84282424,
  "jvm/mem/current/CMS_Old_Gen/max": 724828160,
  "jvm/mem/current/CMS_Old_Gen/used": 14603600,
  "jvm/mem/current/Metaspace/max": -1,
  "jvm/mem/current/Metaspace/used": 52679384,
  "jvm/mem/current/Par_Eden_Space/max": 279183360,
  "jvm/mem/current/Par_Eden_Space/used": 1688904,
  "jvm/mem/current/Par_Survivor_Space/max": 34865152,
  "jvm/mem/current/Par_Survivor_Space/used": 262752,
  "jvm/mem/current/Compressed_Class_Space/max": 1073741820,
  "jvm/mem/current/Compressed_Class_Space/used": 8562152,
  "jvm/mem/current/Code_Cache/max": 50331648,
  "jvm/mem/current/Code_Cache/used": 6485632,
  "jvm/num_cpus": 4,
  "jvm/gc/msec": 17836,
  "jvm/gc/eden/pause_msec.count": 9,
  "jvm/gc/eden/pause_msec.max": 6,
  "jvm/gc/eden/pause_msec.min": 2,
  "jvm/gc/eden/pause_msec.p50": 2,
  "jvm/gc/eden/pause_msec.p90": 3,
  "jvm/gc/eden/pause_msec.p95": 6,
  "jvm/gc/eden/pause_msec.p99": 6,
  "jvm/gc/eden/pause_msec.p9990": 6,
  "jvm/gc/eden/pause_msec.p9999": 6,
  "jvm/gc/eden/pause_msec.sum": 23,
  "jvm/gc/eden/pause_msec.avg": 2.5555555555555554,
  "jvm/gc/ParNew/msec": 15212,
  "jvm/gc/ParNew/cycles": 3127,
  "jvm/gc/ConcurrentMarkSweep/msec": 2624,
  "jvm/gc/ConcurrentMarkSweep/cycles": 80,
  "jvm/gc/cycles": 3207,
  "jvm/fd_limit": 10240,
  "jvm/compilation/time_msec": 28841,
  "jvm/uptime": 1169726,
  "jvm/safepoint/sync_time_millis": 1145,
  "jvm/safepoint/total_time_millis": 21396,
  "jvm/safepoint/count": 4038,
  "jvm/heap/committed": 55472128,
  "jvm/heap/max": 1038876670,
  "jvm/heap/used": 16555256,
  "jvm/fd_count": 65,
  "adminhttp/sent_bytes": 137434,
  "adminhttp/connection_received_bytes.count": 0,
  "adminhttp/connection_duration.count": 0,
  "adminhttp/connects": 5,
  "adminhttp/success": 4,
  "adminhttp/request_latency_ms.count": 0,
  "adminhttp/admission_control/deadline/exceeded": 0,
  "adminhttp/admission_control/deadline/rejected": 0,
  "adminhttp/admission_control/deadline/expired_ms.count": 0,
  "adminhttp/received_bytes": 480,
  "adminhttp/read_timeout": 0,
  "adminhttp/write_timeout": 0,
  "adminhttp/connection_sent_bytes.count": 0,
  "adminhttp/connection_requests.count": 0,
  "adminhttp/nacks": 0,
  "adminhttp/thread_usage/requests/per_thread/admin-1": 3,
  "adminhttp/thread_usage/requests/per_thread/admin-2": 2,
  "adminhttp/thread_usage/requests/mean": 0,
  "adminhttp/thread_usage/requests/relative_stddev": 0,
  "adminhttp/thread_usage/requests/stddev": 0,
  "adminhttp/transit_latency_ms.count": 0,
  "adminhttp/tls/connections": 0,
  "adminhttp/socket_unwritable_ms": 0,
  "adminhttp/closes": 0,
  "adminhttp/request_payload_bytes.count": 0,
  "adminhttp/nonretryable_nacks": 0,
  "adminhttp/socket_writable_ms": 0,
  "adminhttp/response_payload_bytes.count": 0,
  "adminhttp/dtab/size.count": 0,
  "adminhttp/requests": 4,
  "adminhttp/pending": 1,
  "adminhttp/handletime_us.count": 0,
  "adminhttp/connections": 1,
  "rt/h2/server/127.0.0.1/4142/sent_bytes": 30617,
  "rt/h2/server/127.0.0.1/4142/connection_received_bytes.count": 95,
  "rt/h2/server/127.0.0.1/4142/connection_received_bytes.max": 3999,
  "rt/h2/server/127.0.0.1/4142/connection_received_bytes.min": 0,
  "rt/h2/server/127.0.0.1/4142/connection_received_bytes.p50": 79,
  "rt/h2/server/127.0.0.1/4142/connection_received_bytes.p90": 1097,
  "rt/h2/server/127.0.0.1/4142/connection_received_bytes.p95": 1097,
  "rt/h2/server/127.0.0.1/4142/connection_received_bytes.p99": 1822,
  "rt/h2/server/127.0.0.1/4142/connection_received_bytes.p9990": 3999,
  "rt/h2/server/127.0.0.1/4142/connection_received_bytes.p9999": 3999,
  "rt/h2/server/127.0.0.1/4142/connection_received_bytes.sum": 35236,
  "rt/h2/server/127.0.0.1/4142/connection_received_bytes.avg": 370.90526315789475,
  "rt/h2/server/127.0.0.1/4142/connection_duration.count": 95,
  "rt/h2/server/127.0.0.1/4142/connection_duration.max": 729,
  "rt/h2/server/127.0.0.1/4142/connection_duration.min": 1,
  "rt/h2/server/127.0.0.1/4142/connection_duration.p50": 5,
  "rt/h2/server/127.0.0.1/4142/connection_duration.p90": 47,
  "rt/h2/server/127.0.0.1/4142/connection_duration.p95": 65,
  "rt/h2/server/127.0.0.1/4142/connection_duration.p99": 371,
  "rt/h2/server/127.0.0.1/4142/connection_duration.p9990": 729,
  "rt/h2/server/127.0.0.1/4142/connection_duration.p9999": 729,
  "rt/h2/server/127.0.0.1/4142/connection_duration.sum": 2521,
  "rt/h2/server/127.0.0.1/4142/connection_duration.avg": 26.53684210526316,
  "rt/h2/server/127.0.0.1/4142/connects": 196,
  "rt/h2/server/127.0.0.1/4142/success": 2,
  "rt/h2/server/127.0.0.1/4142/request/stream/stream_failures": 1385,
  "rt/h2/server/127.0.0.1/4142/request/stream/data_bytes.count": 92,
  "rt/h2/server/127.0.0.1/4142/request/stream/data_bytes.max": 12,
  "rt/h2/server/127.0.0.1/4142/request/stream/data_bytes.min": 0,
  "rt/h2/server/127.0.0.1/4142/request/stream/data_bytes.p50": 0,
  "rt/h2/server/127.0.0.1/4142/request/stream/data_bytes.p90": 0,
  "rt/h2/server/127.0.0.1/4142/request/stream/data_bytes.p95": 0,
  "rt/h2/server/127.0.0.1/4142/request/stream/data_bytes.p99": 12,
  "rt/h2/server/127.0.0.1/4142/request/stream/data_bytes.p9990": 12,
  "rt/h2/server/127.0.0.1/4142/request/stream/data_bytes.p9999": 12,
  "rt/h2/server/127.0.0.1/4142/request/stream/data_bytes.sum": 24,
  "rt/h2/server/127.0.0.1/4142/request/stream/data_bytes.avg": 0.2608695652173913,
  "rt/h2/server/127.0.0.1/4142/request/stream/failures": 1385,
  "rt/h2/server/127.0.0.1/4142/request/stream/failures/com.twitter.finagle.buoyant.h2.Reset$Cancel$": 1385,
  "rt/h2/server/127.0.0.1/4142/request/stream/stream_success": 15,
  "rt/h2/server/127.0.0.1/4142/request/stream/stream_duration_ms.count": 181,
  "rt/h2/server/127.0.0.1/4142/request/stream/stream_duration_ms.max": 25,
  "rt/h2/server/127.0.0.1/4142/request/stream/stream_duration_ms.min": 0,
  "rt/h2/server/127.0.0.1/4142/request/stream/stream_duration_ms.p50": 3,
  "rt/h2/server/127.0.0.1/4142/request/stream/stream_duration_ms.p90": 19,
  "rt/h2/server/127.0.0.1/4142/request/stream/stream_duration_ms.p95": 24,
  "rt/h2/server/127.0.0.1/4142/request/stream/stream_duration_ms.p99": 25,
  "rt/h2/server/127.0.0.1/4142/request/stream/stream_duration_ms.p9990": 25,
  "rt/h2/server/127.0.0.1/4142/request/stream/stream_duration_ms.p9999": 25,
  "rt/h2/server/127.0.0.1/4142/request/stream/stream_duration_ms.sum": 1364,
  "rt/h2/server/127.0.0.1/4142/request/stream/stream_duration_ms.avg": 7.535911602209945,
  "rt/h2/server/127.0.0.1/4142/request_latency_ms.count": 556,
  "rt/h2/server/127.0.0.1/4142/request_latency_ms.max": 47,
  "rt/h2/server/127.0.0.1/4142/request_latency_ms.min": 1,
  "rt/h2/server/127.0.0.1/4142/request_latency_ms.p50": 7,
  "rt/h2/server/127.0.0.1/4142/request_latency_ms.p90": 24,
  "rt/h2/server/127.0.0.1/4142/request_latency_ms.p95": 28,
  "rt/h2/server/127.0.0.1/4142/request_latency_ms.p99": 36,
  "rt/h2/server/127.0.0.1/4142/request_latency_ms.p9990": 38,
  "rt/h2/server/127.0.0.1/4142/request_latency_ms.p9999": 47,
  "rt/h2/server/127.0.0.1/4142/request_latency_ms.sum": 5898,
  "rt/h2/server/127.0.0.1/4142/request_latency_ms.avg": 10.607913669064748,
  "rt/h2/server/127.0.0.1/4142/admission_control/deadline/exceeded": 0,
  "rt/h2/server/127.0.0.1/4142/admission_control/deadline/rejected": 0,
  "rt/h2/server/127.0.0.1/4142/admission_control/deadline/expired_ms.count": 0,
  "rt/h2/server/127.0.0.1/4142/received_bytes": 30675072,
  "rt/h2/server/127.0.0.1/4142/read_timeout": 0,
  "rt/h2/server/127.0.0.1/4142/write_timeout": 0,
  "rt/h2/server/127.0.0.1/4142/connection_sent_bytes.count": 95,
  "rt/h2/server/127.0.0.1/4142/connection_sent_bytes.max": 60,
  "rt/h2/server/127.0.0.1/4142/connection_sent_bytes.min": 0,
  "rt/h2/server/127.0.0.1/4142/connection_sent_bytes.p50": 60,
  "rt/h2/server/127.0.0.1/4142/connection_sent_bytes.p90": 60,
  "rt/h2/server/127.0.0.1/4142/connection_sent_bytes.p95": 60,
  "rt/h2/server/127.0.0.1/4142/connection_sent_bytes.p99": 60,
  "rt/h2/server/127.0.0.1/4142/connection_sent_bytes.p9990": 60,
  "rt/h2/server/127.0.0.1/4142/connection_sent_bytes.p9999": 60,
  "rt/h2/server/127.0.0.1/4142/connection_sent_bytes.sum": 2940,
  "rt/h2/server/127.0.0.1/4142/connection_sent_bytes.avg": 30.94736842105263,
  "rt/h2/server/127.0.0.1/4142/connection_requests.count": 95,
  "rt/h2/server/127.0.0.1/4142/connection_requests.max": 62,
  "rt/h2/server/127.0.0.1/4142/connection_requests.min": 0,
  "rt/h2/server/127.0.0.1/4142/connection_requests.p50": 0,
  "rt/h2/server/127.0.0.1/4142/connection_requests.p90": 31,
  "rt/h2/server/127.0.0.1/4142/connection_requests.p95": 34,
  "rt/h2/server/127.0.0.1/4142/connection_requests.p99": 37,
  "rt/h2/server/127.0.0.1/4142/connection_requests.p9990": 62,
  "rt/h2/server/127.0.0.1/4142/connection_requests.p9999": 62,
  "rt/h2/server/127.0.0.1/4142/connection_requests.sum": 903,
  "rt/h2/server/127.0.0.1/4142/connection_requests.avg": 9.505263157894737,
  "rt/h2/server/127.0.0.1/4142/thread_usage/requests/per_thread/finagle/netty4-2-6": 4029,
  "rt/h2/server/127.0.0.1/4142/thread_usage/requests/per_thread/finagle/netty4-2-8": 161,
  "rt/h2/server/127.0.0.1/4142/thread_usage/requests/per_thread/finagle/netty4-2-7": 4121,
  "rt/h2/server/127.0.0.1/4142/thread_usage/requests/per_thread/finagle/netty4-2-2": 381243,
  "rt/h2/server/127.0.0.1/4142/thread_usage/requests/per_thread/finagle/netty4-2-1": 555,
  "rt/h2/server/127.0.0.1/4142/thread_usage/requests/per_thread/finagle/netty4-2-3": 2596,
  "rt/h2/server/127.0.0.1/4142/thread_usage/requests/mean": 111.2,
  "rt/h2/server/127.0.0.1/4142/thread_usage/requests/relative_stddev": 0.62538224,
  "rt/h2/server/127.0.0.1/4142/thread_usage/requests/stddev": 69.5425,
  "rt/h2/server/127.0.0.1/4142/transit_latency_ms.count": 0,
  "rt/h2/server/127.0.0.1/4142/tls/connections": 0,
  "rt/h2/server/127.0.0.1/4142/socket_unwritable_ms": 0,
  "rt/h2/server/127.0.0.1/4142/response/stream/stream_failures": 14,
  "rt/h2/server/127.0.0.1/4142/response/stream/data_bytes.count": 0,
  "rt/h2/server/127.0.0.1/4142/response/stream/failures": 14,
  "rt/h2/server/127.0.0.1/4142/response/stream/failures/com.twitter.finagle.buoyant.h2.Reset$Cancel$": 14,
  "rt/h2/server/127.0.0.1/4142/response/stream/stream_success": 2,
  "rt/h2/server/127.0.0.1/4142/response/stream/stream_duration_ms.count": 0,
  "rt/h2/server/127.0.0.1/4142/closes": 196,
  "rt/h2/server/127.0.0.1/4142/stream/stream_failures": 716,
  "rt/h2/server/127.0.0.1/4142/stream/total_latency_ms.count": 92,
  "rt/h2/server/127.0.0.1/4142/stream/total_latency_ms.max": 39,
  "rt/h2/server/127.0.0.1/4142/stream/total_latency_ms.min": 1,
  "rt/h2/server/127.0.0.1/4142/stream/total_latency_ms.p50": 5,
  "rt/h2/server/127.0.0.1/4142/stream/total_latency_ms.p90": 36,
  "rt/h2/server/127.0.0.1/4142/stream/total_latency_ms.p95": 37,
  "rt/h2/server/127.0.0.1/4142/stream/total_latency_ms.p99": 38,
  "rt/h2/server/127.0.0.1/4142/stream/total_latency_ms.p9990": 39,
  "rt/h2/server/127.0.0.1/4142/stream/total_latency_ms.p9999": 39,
  "rt/h2/server/127.0.0.1/4142/stream/total_latency_ms.sum": 1227,
  "rt/h2/server/127.0.0.1/4142/stream/total_latency_ms.avg": 13.33695652173913,
  "rt/h2/server/127.0.0.1/4142/stream/data_bytes.count": 0,
  "rt/h2/server/127.0.0.1/4142/stream/local/data/frames": 2,
  "rt/h2/server/127.0.0.1/4142/stream/local/data/bytes.count": 0,
  "rt/h2/server/127.0.0.1/4142/stream/local/reset": 393419,
  "rt/h2/server/127.0.0.1/4142/stream/local/trailers": 2,
  "rt/h2/server/127.0.0.1/4142/stream/remote/data/frames": 390433,
  "rt/h2/server/127.0.0.1/4142/stream/remote/data/bytes.count": 346,
  "rt/h2/server/127.0.0.1/4142/stream/remote/data/bytes.max": 12,
  "rt/h2/server/127.0.0.1/4142/stream/remote/data/bytes.min": 12,
  "rt/h2/server/127.0.0.1/4142/stream/remote/data/bytes.p50": 12,
  "rt/h2/server/127.0.0.1/4142/stream/remote/data/bytes.p90": 12,
  "rt/h2/server/127.0.0.1/4142/stream/remote/data/bytes.p95": 12,
  "rt/h2/server/127.0.0.1/4142/stream/remote/data/bytes.p99": 12,
  "rt/h2/server/127.0.0.1/4142/stream/remote/data/bytes.p9990": 12,
  "rt/h2/server/127.0.0.1/4142/stream/remote/data/bytes.p9999": 12,
  "rt/h2/server/127.0.0.1/4142/stream/remote/data/bytes.sum": 4152,
  "rt/h2/server/127.0.0.1/4142/stream/remote/data/bytes.avg": 12,
  "rt/h2/server/127.0.0.1/4142/stream/remote/reset": 783809,
  "rt/h2/server/127.0.0.1/4142/stream/remote/trailers": 0,
  "rt/h2/server/127.0.0.1/4142/stream/failures": 716,
  "rt/h2/server/127.0.0.1/4142/stream/failures/com.twitter.finagle.buoyant.h2.Reset$Cancel$": 716,
  "rt/h2/server/127.0.0.1/4142/stream/stream_success": 2,
  "rt/h2/server/127.0.0.1/4142/stream/open_streams": 0,
  "rt/h2/server/127.0.0.1/4142/failures": 392691,
  "rt/h2/server/127.0.0.1/4142/failures/com.twitter.finagle.buoyant.h2.Reset$Cancel$": 392507,
  "rt/h2/server/127.0.0.1/4142/failures/interrupted": 101,
  "rt/h2/server/127.0.0.1/4142/failures/interrupted/com.twitter.finagle.Failure": 101,
  "rt/h2/server/127.0.0.1/4142/failures/interrupted/com.twitter.finagle.Failure/com.twitter.finagle.CancelledConnectionException": 101,
  "rt/h2/server/127.0.0.1/4142/failures/com.twitter.finagle.ChannelWriteException": 1,
  "rt/h2/server/127.0.0.1/4142/failures/com.twitter.finagle.ChannelWriteException/com.twitter.finagle.ChannelClosedException": 1,
  "rt/h2/server/127.0.0.1/4142/failures/com.twitter.finagle.ChannelWriteException/com.twitter.finagle.ChannelClosedException/java.nio.channels.ClosedChannelException": 1,
  "rt/h2/server/127.0.0.1/4142/failures/com.twitter.finagle.buoyant.h2.Reset$ProtocolError$": 1,
  "rt/h2/server/127.0.0.1/4142/failures/restartable": 182,
  "rt/h2/server/127.0.0.1/4142/failures/restartable/com.twitter.finagle.Failure": 182,
  "rt/h2/server/127.0.0.1/4142/failures/restartable/com.twitter.finagle.Failure/com.twitter.finagle.CancelledConnectionException": 101,
  "rt/h2/server/127.0.0.1/4142/failures/rejected": 81,
  "rt/h2/server/127.0.0.1/4142/failures/rejected/com.twitter.finagle.Failure": 81,
  "rt/h2/server/127.0.0.1/4142/sourcedfailures/127.0.0.1/4142": 1,
  "rt/h2/server/127.0.0.1/4142/sourcedfailures/127.0.0.1/4142/com.twitter.finagle.ChannelWriteException": 1,
  "rt/h2/server/127.0.0.1/4142/sourcedfailures/127.0.0.1/4142/com.twitter.finagle.ChannelWriteException/com.twitter.finagle.ChannelClosedException": 1,
  "rt/h2/server/127.0.0.1/4142/sourcedfailures/127.0.0.1/4142/com.twitter.finagle.ChannelWriteException/com.twitter.finagle.ChannelClosedException/java.nio.channels.ClosedChannelException": 1,
  "rt/h2/server/127.0.0.1/4142/exn/io.netty.util.internal.OutOfDirectMemoryError": 194,
  "rt/h2/server/127.0.0.1/4142/socket_writable_ms": 0,
  "rt/h2/server/127.0.0.1/4142/dtab/size.count": 0,
  "rt/h2/server/127.0.0.1/4142/requests": 392705,
  "rt/h2/server/127.0.0.1/4142/handletime_us.count": 556,
  "rt/h2/server/127.0.0.1/4142/handletime_us.max": 917,
  "rt/h2/server/127.0.0.1/4142/handletime_us.min": 5,
  "rt/h2/server/127.0.0.1/4142/handletime_us.p50": 11,
  "rt/h2/server/127.0.0.1/4142/handletime_us.p90": 26,
  "rt/h2/server/127.0.0.1/4142/handletime_us.p95": 32,
  "rt/h2/server/127.0.0.1/4142/handletime_us.p99": 69,
  "rt/h2/server/127.0.0.1/4142/handletime_us.p9990": 152,
  "rt/h2/server/127.0.0.1/4142/handletime_us.p9999": 917,
  "rt/h2/server/127.0.0.1/4142/handletime_us.sum": 9074,
  "rt/h2/server/127.0.0.1/4142/handletime_us.avg": 16.320143884892087,
  "rt/h2/server/127.0.0.1/4142/connections": 0,
  "rt/h2/service/svc/127.0.0.1:4142/success": 2,
  "rt/h2/service/svc/127.0.0.1:4142/request/stream/stream_failures": 1385,
  "rt/h2/service/svc/127.0.0.1:4142/request/stream/data_bytes.count": 92,
  "rt/h2/service/svc/127.0.0.1:4142/request/stream/data_bytes.max": 12,
  "rt/h2/service/svc/127.0.0.1:4142/request/stream/data_bytes.min": 0,
  "rt/h2/service/svc/127.0.0.1:4142/request/stream/data_bytes.p50": 0,
  "rt/h2/service/svc/127.0.0.1:4142/request/stream/data_bytes.p90": 0,
  "rt/h2/service/svc/127.0.0.1:4142/request/stream/data_bytes.p95": 0,
  "rt/h2/service/svc/127.0.0.1:4142/request/stream/data_bytes.p99": 12,
  "rt/h2/service/svc/127.0.0.1:4142/request/stream/data_bytes.p9990": 12,
  "rt/h2/service/svc/127.0.0.1:4142/request/stream/data_bytes.p9999": 12,
  "rt/h2/service/svc/127.0.0.1:4142/request/stream/data_bytes.sum": 24,
  "rt/h2/service/svc/127.0.0.1:4142/request/stream/data_bytes.avg": 0.2608695652173913,
  "rt/h2/service/svc/127.0.0.1:4142/request/stream/failures": 1385,
  "rt/h2/service/svc/127.0.0.1:4142/request/stream/failures/com.twitter.finagle.buoyant.h2.Reset$Cancel$": 1385,
  "rt/h2/service/svc/127.0.0.1:4142/request/stream/stream_success": 15,
  "rt/h2/service/svc/127.0.0.1:4142/request/stream/stream_duration_ms.count": 181,
  "rt/h2/service/svc/127.0.0.1:4142/request/stream/stream_duration_ms.max": 25,
  "rt/h2/service/svc/127.0.0.1:4142/request/stream/stream_duration_ms.min": 0,
  "rt/h2/service/svc/127.0.0.1:4142/request/stream/stream_duration_ms.p50": 3,
  "rt/h2/service/svc/127.0.0.1:4142/request/stream/stream_duration_ms.p90": 19,
  "rt/h2/service/svc/127.0.0.1:4142/request/stream/stream_duration_ms.p95": 24,
  "rt/h2/service/svc/127.0.0.1:4142/request/stream/stream_duration_ms.p99": 24,
  "rt/h2/service/svc/127.0.0.1:4142/request/stream/stream_duration_ms.p9990": 25,
  "rt/h2/service/svc/127.0.0.1:4142/request/stream/stream_duration_ms.p9999": 25,
  "rt/h2/service/svc/127.0.0.1:4142/request/stream/stream_duration_ms.sum": 1343,
  "rt/h2/service/svc/127.0.0.1:4142/request/stream/stream_duration_ms.avg": 7.419889502762431,
  "rt/h2/service/svc/127.0.0.1:4142/request_latency_ms.count": 556,
  "rt/h2/service/svc/127.0.0.1:4142/request_latency_ms.max": 44,
  "rt/h2/service/svc/127.0.0.1:4142/request_latency_ms.min": 1,
  "rt/h2/service/svc/127.0.0.1:4142/request_latency_ms.p50": 6,
  "rt/h2/service/svc/127.0.0.1:4142/request_latency_ms.p90": 22,
  "rt/h2/service/svc/127.0.0.1:4142/request_latency_ms.p95": 25,
  "rt/h2/service/svc/127.0.0.1:4142/request_latency_ms.p99": 34,
  "rt/h2/service/svc/127.0.0.1:4142/request_latency_ms.p9990": 36,
  "rt/h2/service/svc/127.0.0.1:4142/request_latency_ms.p9999": 44,
  "rt/h2/service/svc/127.0.0.1:4142/request_latency_ms.sum": 5068,
  "rt/h2/service/svc/127.0.0.1:4142/request_latency_ms.avg": 9.115107913669064,
  "rt/h2/service/svc/127.0.0.1:4142/retries/classification_timeout": 0,
  "rt/h2/service/svc/127.0.0.1:4142/retries/response_stream_too_long": 1,
  "rt/h2/service/svc/127.0.0.1:4142/retries/per_request.count": 0,
  "rt/h2/service/svc/127.0.0.1:4142/retries/backoffs_exhausted": 0,
  "rt/h2/service/svc/127.0.0.1:4142/retries/total": 0,
  "rt/h2/service/svc/127.0.0.1:4142/retries/budget_exhausted": 0,
  "rt/h2/service/svc/127.0.0.1:4142/retries/request_stream_too_long": 0,
  "rt/h2/service/svc/127.0.0.1:4142/retries/budget": 100,
  "rt/h2/service/svc/127.0.0.1:4142/response/stream/stream_failures": 14,
  "rt/h2/service/svc/127.0.0.1:4142/response/stream/data_bytes.count": 0,
  "rt/h2/service/svc/127.0.0.1:4142/response/stream/failures": 14,
  "rt/h2/service/svc/127.0.0.1:4142/response/stream/failures/com.twitter.finagle.buoyant.h2.Reset$Cancel$": 14,
  "rt/h2/service/svc/127.0.0.1:4142/response/stream/stream_success": 2,
  "rt/h2/service/svc/127.0.0.1:4142/response/stream/stream_duration_ms.count": 0,
  "rt/h2/service/svc/127.0.0.1:4142/stream/stream_failures": 716,
  "rt/h2/service/svc/127.0.0.1:4142/stream/total_latency_ms.count": 92,
  "rt/h2/service/svc/127.0.0.1:4142/stream/total_latency_ms.max": 36,
  "rt/h2/service/svc/127.0.0.1:4142/stream/total_latency_ms.min": 1,
  "rt/h2/service/svc/127.0.0.1:4142/stream/total_latency_ms.p50": 5,
  "rt/h2/service/svc/127.0.0.1:4142/stream/total_latency_ms.p90": 34,
  "rt/h2/service/svc/127.0.0.1:4142/stream/total_latency_ms.p95": 34,
  "rt/h2/service/svc/127.0.0.1:4142/stream/total_latency_ms.p99": 36,
  "rt/h2/service/svc/127.0.0.1:4142/stream/total_latency_ms.p9990": 36,
  "rt/h2/service/svc/127.0.0.1:4142/stream/total_latency_ms.p9999": 36,
  "rt/h2/service/svc/127.0.0.1:4142/stream/total_latency_ms.sum": 1154,
  "rt/h2/service/svc/127.0.0.1:4142/stream/total_latency_ms.avg": 12.543478260869565,
  "rt/h2/service/svc/127.0.0.1:4142/stream/data_bytes.count": 0,
  "rt/h2/service/svc/127.0.0.1:4142/stream/failures": 716,
  "rt/h2/service/svc/127.0.0.1:4142/stream/failures/com.twitter.finagle.buoyant.h2.Reset$Cancel$": 716,
  "rt/h2/service/svc/127.0.0.1:4142/stream/stream_success": 2,
  "rt/h2/service/svc/127.0.0.1:4142/failures": 392691,
  "rt/h2/service/svc/127.0.0.1:4142/failures/com.twitter.finagle.buoyant.h2.Reset$Cancel$": 392507,
  "rt/h2/service/svc/127.0.0.1:4142/failures/interrupted": 101,
  "rt/h2/service/svc/127.0.0.1:4142/failures/interrupted/com.twitter.finagle.Failure": 101,
  "rt/h2/service/svc/127.0.0.1:4142/failures/interrupted/com.twitter.finagle.Failure/com.twitter.finagle.CancelledConnectionException": 101,
  "rt/h2/service/svc/127.0.0.1:4142/failures/com.twitter.finagle.ChannelWriteException": 1,
  "rt/h2/service/svc/127.0.0.1:4142/failures/com.twitter.finagle.ChannelWriteException/com.twitter.finagle.ChannelClosedException": 1,
  "rt/h2/service/svc/127.0.0.1:4142/failures/com.twitter.finagle.ChannelWriteException/com.twitter.finagle.ChannelClosedException/java.nio.channels.ClosedChannelException": 1,
  "rt/h2/service/svc/127.0.0.1:4142/failures/com.twitter.finagle.buoyant.h2.Reset$ProtocolError$": 1,
  "rt/h2/service/svc/127.0.0.1:4142/failures/restartable": 182,
  "rt/h2/service/svc/127.0.0.1:4142/failures/restartable/com.twitter.finagle.Failure": 182,
  "rt/h2/service/svc/127.0.0.1:4142/failures/restartable/com.twitter.finagle.Failure/com.twitter.finagle.CancelledConnectionException": 101,
  "rt/h2/service/svc/127.0.0.1:4142/failures/rejected": 81,
  "rt/h2/service/svc/127.0.0.1:4142/failures/rejected/com.twitter.finagle.Failure": 81,
  "rt/h2/service/svc/127.0.0.1:4142/sourcedfailures/$/inet/127.1/9999": 1,
  "rt/h2/service/svc/127.0.0.1:4142/sourcedfailures/$/inet/127.1/9999/com.twitter.finagle.ChannelWriteException": 1,
  "rt/h2/service/svc/127.0.0.1:4142/sourcedfailures/$/inet/127.1/9999/com.twitter.finagle.ChannelWriteException/com.twitter.finagle.ChannelClosedException": 1,
  "rt/h2/service/svc/127.0.0.1:4142/sourcedfailures/$/inet/127.1/9999/com.twitter.finagle.ChannelWriteException/com.twitter.finagle.ChannelClosedException/java.nio.channels.ClosedChannelException": 1,
  "rt/h2/service/svc/127.0.0.1:4142/requests": 392705,
  "rt/h2/client/$/inet/127.1/9999/connect_latency_ms.count": 6,
  "rt/h2/client/$/inet/127.1/9999/connect_latency_ms.max": 5,
  "rt/h2/client/$/inet/127.1/9999/connect_latency_ms.min": 2,
  "rt/h2/client/$/inet/127.1/9999/connect_latency_ms.p50": 2,
  "rt/h2/client/$/inet/127.1/9999/connect_latency_ms.p90": 4,
  "rt/h2/client/$/inet/127.1/9999/connect_latency_ms.p95": 5,
  "rt/h2/client/$/inet/127.1/9999/connect_latency_ms.p99": 5,
  "rt/h2/client/$/inet/127.1/9999/connect_latency_ms.p9990": 5,
  "rt/h2/client/$/inet/127.1/9999/connect_latency_ms.p9999": 5,
  "rt/h2/client/$/inet/127.1/9999/connect_latency_ms.sum": 17,
  "rt/h2/client/$/inet/127.1/9999/connect_latency_ms.avg": 2.8333333333333335,
  "rt/h2/client/$/inet/127.1/9999/failed_connect_latency_ms.count": 10,
  "rt/h2/client/$/inet/127.1/9999/failed_connect_latency_ms.max": 21,
  "rt/h2/client/$/inet/127.1/9999/failed_connect_latency_ms.min": 3,
  "rt/h2/client/$/inet/127.1/9999/failed_connect_latency_ms.p50": 9,
  "rt/h2/client/$/inet/127.1/9999/failed_connect_latency_ms.p90": 14,
  "rt/h2/client/$/inet/127.1/9999/failed_connect_latency_ms.p95": 21,
  "rt/h2/client/$/inet/127.1/9999/failed_connect_latency_ms.p99": 21,
  "rt/h2/client/$/inet/127.1/9999/failed_connect_latency_ms.p9990": 21,
  "rt/h2/client/$/inet/127.1/9999/failed_connect_latency_ms.p9999": 21,
  "rt/h2/client/$/inet/127.1/9999/failed_connect_latency_ms.sum": 92,
  "rt/h2/client/$/inet/127.1/9999/failed_connect_latency_ms.avg": 9.2,
  "rt/h2/client/$/inet/127.1/9999/sent_bytes": 7928925,
  "rt/h2/client/$/inet/127.1/9999/service_creation/failures": 182,
  "rt/h2/client/$/inet/127.1/9999/service_creation/failures/com.twitter.finagle.Failure": 182,
  "rt/h2/client/$/inet/127.1/9999/service_creation/failures/com.twitter.finagle.Failure/com.twitter.finagle.CancelledConnectionException": 101,
  "rt/h2/client/$/inet/127.1/9999/service_creation/service_acquisition_latency_ms.count": 556,
  "rt/h2/client/$/inet/127.1/9999/service_creation/service_acquisition_latency_ms.max": 35,
  "rt/h2/client/$/inet/127.1/9999/service_creation/service_acquisition_latency_ms.min": 0,
  "rt/h2/client/$/inet/127.1/9999/service_creation/service_acquisition_latency_ms.p50": 0,
  "rt/h2/client/$/inet/127.1/9999/service_creation/service_acquisition_latency_ms.p90": 8,
  "rt/h2/client/$/inet/127.1/9999/service_creation/service_acquisition_latency_ms.p95": 13,
  "rt/h2/client/$/inet/127.1/9999/service_creation/service_acquisition_latency_ms.p99": 21,
  "rt/h2/client/$/inet/127.1/9999/service_creation/service_acquisition_latency_ms.p9990": 23,
  "rt/h2/client/$/inet/127.1/9999/service_creation/service_acquisition_latency_ms.p9999": 35,
  "rt/h2/client/$/inet/127.1/9999/service_creation/service_acquisition_latency_ms.sum": 1422,
  "rt/h2/client/$/inet/127.1/9999/service_creation/service_acquisition_latency_ms.avg": 2.5575539568345325,
  "rt/h2/client/$/inet/127.1/9999/connection_received_bytes.count": 18,
  "rt/h2/client/$/inet/127.1/9999/connection_received_bytes.max": 1245384,
  "rt/h2/client/$/inet/127.1/9999/connection_received_bytes.min": 0,
  "rt/h2/client/$/inet/127.1/9999/connection_received_bytes.p50": 0,
  "rt/h2/client/$/inet/127.1/9999/connection_received_bytes.p90": 0,
  "rt/h2/client/$/inet/127.1/9999/connection_received_bytes.p95": 0,
  "rt/h2/client/$/inet/127.1/9999/connection_received_bytes.p99": 1245384,
  "rt/h2/client/$/inet/127.1/9999/connection_received_bytes.p9990": 1245384,
  "rt/h2/client/$/inet/127.1/9999/connection_received_bytes.p9999": 1245384,
  "rt/h2/client/$/inet/127.1/9999/connection_received_bytes.sum": 1240947,
  "rt/h2/client/$/inet/127.1/9999/connection_received_bytes.avg": 68941.5,
  "rt/h2/client/$/inet/127.1/9999/connection_duration.count": 18,
  "rt/h2/client/$/inet/127.1/9999/connection_duration.max": 626799,
  "rt/h2/client/$/inet/127.1/9999/connection_duration.min": 0,
  "rt/h2/client/$/inet/127.1/9999/connection_duration.p50": 9,
  "rt/h2/client/$/inet/127.1/9999/connection_duration.p90": 26,
  "rt/h2/client/$/inet/127.1/9999/connection_duration.p95": 30,
  "rt/h2/client/$/inet/127.1/9999/connection_duration.p99": 626799,
  "rt/h2/client/$/inet/127.1/9999/connection_duration.p9990": 626799,
  "rt/h2/client/$/inet/127.1/9999/connection_duration.p9999": 626799,
  "rt/h2/client/$/inet/127.1/9999/connection_duration.sum": 626478,
  "rt/h2/client/$/inet/127.1/9999/connection_duration.avg": 34804.333333333336,
  "rt/h2/client/$/inet/127.1/9999/failure_accrual/removals": 1,
  "rt/h2/client/$/inet/127.1/9999/failure_accrual/probes": 15,
  "rt/h2/client/$/inet/127.1/9999/failure_accrual/removed_for_ms": 802518,
  "rt/h2/client/$/inet/127.1/9999/failure_accrual/revivals": 0,
  "rt/h2/client/$/inet/127.1/9999/connects": 18,
  "rt/h2/client/$/inet/127.1/9999/success": 14,
  "rt/h2/client/$/inet/127.1/9999/request/stream/stream_failures": 1399,
  "rt/h2/client/$/inet/127.1/9999/request/stream/data_bytes.count": 92,
  "rt/h2/client/$/inet/127.1/9999/request/stream/data_bytes.max": 12,
  "rt/h2/client/$/inet/127.1/9999/request/stream/data_bytes.min": 0,
  "rt/h2/client/$/inet/127.1/9999/request/stream/data_bytes.p50": 0,
  "rt/h2/client/$/inet/127.1/9999/request/stream/data_bytes.p90": 0,
  "rt/h2/client/$/inet/127.1/9999/request/stream/data_bytes.p95": 0,
  "rt/h2/client/$/inet/127.1/9999/request/stream/data_bytes.p99": 0,
  "rt/h2/client/$/inet/127.1/9999/request/stream/data_bytes.p9990": 12,
  "rt/h2/client/$/inet/127.1/9999/request/stream/data_bytes.p9999": 12,
  "rt/h2/client/$/inet/127.1/9999/request/stream/data_bytes.sum": 12,
  "rt/h2/client/$/inet/127.1/9999/request/stream/data_bytes.avg": 0.13043478260869565,
  "rt/h2/client/$/inet/127.1/9999/request/stream/failures": 1399,
  "rt/h2/client/$/inet/127.1/9999/request/stream/failures/com.twitter.finagle.buoyant.h2.Reset$Cancel$": 1399,
  "rt/h2/client/$/inet/127.1/9999/request/stream/stream_success": 15,
  "rt/h2/client/$/inet/127.1/9999/request/stream/stream_duration_ms.count": 182,
  "rt/h2/client/$/inet/127.1/9999/request/stream/stream_duration_ms.max": 24,
  "rt/h2/client/$/inet/127.1/9999/request/stream/stream_duration_ms.min": 0,
  "rt/h2/client/$/inet/127.1/9999/request/stream/stream_duration_ms.p50": 4,
  "rt/h2/client/$/inet/127.1/9999/request/stream/stream_duration_ms.p90": 18,
  "rt/h2/client/$/inet/127.1/9999/request/stream/stream_duration_ms.p95": 24,
  "rt/h2/client/$/inet/127.1/9999/request/stream/stream_duration_ms.p99": 24,
  "rt/h2/client/$/inet/127.1/9999/request/stream/stream_duration_ms.p9990": 24,
  "rt/h2/client/$/inet/127.1/9999/request/stream/stream_duration_ms.p9999": 24,
  "rt/h2/client/$/inet/127.1/9999/request/stream/stream_duration_ms.sum": 1284,
  "rt/h2/client/$/inet/127.1/9999/request/stream/stream_duration_ms.avg": 7.054945054945055,
  "rt/h2/client/$/inet/127.1/9999/request_latency_ms.count": 374,
  "rt/h2/client/$/inet/127.1/9999/request_latency_ms.max": 28,
  "rt/h2/client/$/inet/127.1/9999/request_latency_ms.min": 0,
  "rt/h2/client/$/inet/127.1/9999/request_latency_ms.p50": 4,
  "rt/h2/client/$/inet/127.1/9999/request_latency_ms.p90": 13,
  "rt/h2/client/$/inet/127.1/9999/request_latency_ms.p95": 19,
  "rt/h2/client/$/inet/127.1/9999/request_latency_ms.p99": 27,
  "rt/h2/client/$/inet/127.1/9999/request_latency_ms.p9990": 28,
  "rt/h2/client/$/inet/127.1/9999/request_latency_ms.p9999": 28,
  "rt/h2/client/$/inet/127.1/9999/request_latency_ms.sum": 2065,
  "rt/h2/client/$/inet/127.1/9999/request_latency_ms.avg": 5.521390374331551,
  "rt/h2/client/$/inet/127.1/9999/retries/requeues_per_request.count": 374,
  "rt/h2/client/$/inet/127.1/9999/retries/requeues_per_request.max": 0,
  "rt/h2/client/$/inet/127.1/9999/retries/requeues_per_request.min": 0,
  "rt/h2/client/$/inet/127.1/9999/retries/requeues_per_request.p50": 0,
  "rt/h2/client/$/inet/127.1/9999/retries/requeues_per_request.p90": 0,
  "rt/h2/client/$/inet/127.1/9999/retries/requeues_per_request.p95": 0,
  "rt/h2/client/$/inet/127.1/9999/retries/requeues_per_request.p99": 0,
  "rt/h2/client/$/inet/127.1/9999/retries/requeues_per_request.p9990": 0,
  "rt/h2/client/$/inet/127.1/9999/retries/requeues_per_request.p9999": 0,
  "rt/h2/client/$/inet/127.1/9999/retries/requeues_per_request.sum": 0,
  "rt/h2/client/$/inet/127.1/9999/retries/requeues_per_request.avg": 0,
  "rt/h2/client/$/inet/127.1/9999/retries/request_limit": 0,
  "rt/h2/client/$/inet/127.1/9999/retries/budget_exhausted": 0,
  "rt/h2/client/$/inet/127.1/9999/retries/cannot_retry": 1,
  "rt/h2/client/$/inet/127.1/9999/retries/not_open": 81,
  "rt/h2/client/$/inet/127.1/9999/retries/budget": 100,
  "rt/h2/client/$/inet/127.1/9999/retries/requeues": 0,
  "rt/h2/client/$/inet/127.1/9999/received_bytes": 1240947,
  "rt/h2/client/$/inet/127.1/9999/read_timeout": 0,
  "rt/h2/client/$/inet/127.1/9999/write_timeout": 0,
  "rt/h2/client/$/inet/127.1/9999/connection_sent_bytes.count": 18,
  "rt/h2/client/$/inet/127.1/9999/connection_sent_bytes.max": 7926449,
  "rt/h2/client/$/inet/127.1/9999/connection_sent_bytes.min": 24,
  "rt/h2/client/$/inet/127.1/9999/connection_sent_bytes.p50": 24,
  "rt/h2/client/$/inet/127.1/9999/connection_sent_bytes.p90": 2336,
  "rt/h2/client/$/inet/127.1/9999/connection_sent_bytes.p95": 2633,
  "rt/h2/client/$/inet/127.1/9999/connection_sent_bytes.p99": 7926449,
  "rt/h2/client/$/inet/127.1/9999/connection_sent_bytes.p9990": 7926449,
  "rt/h2/client/$/inet/127.1/9999/connection_sent_bytes.p9999": 7926449,
  "rt/h2/client/$/inet/127.1/9999/connection_sent_bytes.sum": 7928925,
  "rt/h2/client/$/inet/127.1/9999/connection_sent_bytes.avg": 440495.8333333333,
  "rt/h2/client/$/inet/127.1/9999/connection_requests.count": 18,
  "rt/h2/client/$/inet/127.1/9999/connection_requests.max": 1075,
  "rt/h2/client/$/inet/127.1/9999/connection_requests.min": 0,
  "rt/h2/client/$/inet/127.1/9999/connection_requests.p50": 0,
  "rt/h2/client/$/inet/127.1/9999/connection_requests.p90": 0,
  "rt/h2/client/$/inet/127.1/9999/connection_requests.p95": 0,
  "rt/h2/client/$/inet/127.1/9999/connection_requests.p99": 1075,
  "rt/h2/client/$/inet/127.1/9999/connection_requests.p9990": 1075,
  "rt/h2/client/$/inet/127.1/9999/connection_requests.p9999": 1075,
  "rt/h2/client/$/inet/127.1/9999/connection_requests.sum": 1072,
  "rt/h2/client/$/inet/127.1/9999/connection_requests.avg": 59.55555555555556,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/success": 14,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/request/stream/stream_failures": 1399,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/request/stream/data_bytes.count": 92,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/request/stream/data_bytes.max": 12,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/request/stream/data_bytes.min": 0,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/request/stream/data_bytes.p50": 0,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/request/stream/data_bytes.p90": 0,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/request/stream/data_bytes.p95": 0,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/request/stream/data_bytes.p99": 0,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/request/stream/data_bytes.p9990": 12,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/request/stream/data_bytes.p9999": 12,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/request/stream/data_bytes.sum": 12,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/request/stream/data_bytes.avg": 0.13043478260869565,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/request/stream/failures": 1399,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/request/stream/failures/com.twitter.finagle.buoyant.h2.Reset$Cancel$": 1399,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/request/stream/stream_success": 15,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/request/stream/stream_duration_ms.count": 182,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/request/stream/stream_duration_ms.max": 24,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/request/stream/stream_duration_ms.min": 0,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/request/stream/stream_duration_ms.p50": 4,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/request/stream/stream_duration_ms.p90": 18,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/request/stream/stream_duration_ms.p95": 24,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/request/stream/stream_duration_ms.p99": 24,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/request/stream/stream_duration_ms.p9990": 24,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/request/stream/stream_duration_ms.p9999": 24,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/request/stream/stream_duration_ms.sum": 1288,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/request/stream/stream_duration_ms.avg": 7.076923076923077,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/request_latency_ms.count": 374,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/request_latency_ms.max": 30,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/request_latency_ms.min": 0,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/request_latency_ms.p50": 4,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/request_latency_ms.p90": 15,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/request_latency_ms.p95": 20,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/request_latency_ms.p99": 29,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/request_latency_ms.p9990": 30,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/request_latency_ms.p9999": 30,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/request_latency_ms.sum": 2326,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/request_latency_ms.avg": 6.219251336898396,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/response/stream/stream_failures": 15,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/response/stream/data_bytes.count": 0,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/response/stream/failures": 15,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/response/stream/failures/com.twitter.finagle.buoyant.h2.Reset$Cancel$": 15,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/response/stream/stream_success": 2,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/response/stream/stream_duration_ms.count": 0,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/stream/stream_failures": 716,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/stream/total_latency_ms.count": 92,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/stream/total_latency_ms.max": 31,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/stream/total_latency_ms.min": 1,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/stream/total_latency_ms.p50": 5,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/stream/total_latency_ms.p90": 23,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/stream/total_latency_ms.p95": 30,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/stream/total_latency_ms.p99": 31,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/stream/total_latency_ms.p9990": 31,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/stream/total_latency_ms.p9999": 31,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/stream/total_latency_ms.sum": 903,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/stream/total_latency_ms.avg": 9.815217391304348,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/stream/data_bytes.count": 0,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/stream/failures": 716,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/stream/failures/com.twitter.finagle.buoyant.h2.Reset$Cancel$": 716,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/stream/stream_success": 2,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/failures": 392510,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/failures/com.twitter.finagle.buoyant.h2.Reset$Cancel$": 392508,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/failures/com.twitter.finagle.ChannelWriteException": 1,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/failures/com.twitter.finagle.ChannelWriteException/com.twitter.finagle.ChannelClosedException": 1,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/failures/com.twitter.finagle.ChannelWriteException/com.twitter.finagle.ChannelClosedException/java.nio.channels.ClosedChannelException": 1,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/failures/com.twitter.finagle.buoyant.h2.Reset$ProtocolError$": 1,
  "rt/h2/client/$/inet/127.1/9999/service/svc/127.0.0.1:4142/requests": 392523,
  "rt/h2/client/$/inet/127.1/9999/tls/connections": 0,
  "rt/h2/client/$/inet/127.1/9999/socket_unwritable_ms": 0,
  "rt/h2/client/$/inet/127.1/9999/response/stream/stream_failures": 15,
  "rt/h2/client/$/inet/127.1/9999/response/stream/data_bytes.count": 0,
  "rt/h2/client/$/inet/127.1/9999/response/stream/failures": 15,
  "rt/h2/client/$/inet/127.1/9999/response/stream/failures/com.twitter.finagle.buoyant.h2.Reset$Cancel$": 15,
  "rt/h2/client/$/inet/127.1/9999/response/stream/stream_success": 2,
  "rt/h2/client/$/inet/127.1/9999/response/stream/stream_duration_ms.count": 0,
  "rt/h2/client/$/inet/127.1/9999/closes": 51,
  "rt/h2/client/$/inet/127.1/9999/stream/stream_failures": 716,
  "rt/h2/client/$/inet/127.1/9999/stream/total_latency_ms.count": 92,
  "rt/h2/client/$/inet/127.1/9999/stream/total_latency_ms.max": 29,
  "rt/h2/client/$/inet/127.1/9999/stream/total_latency_ms.min": 1,
  "rt/h2/client/$/inet/127.1/9999/stream/total_latency_ms.p50": 4,
  "rt/h2/client/$/inet/127.1/9999/stream/total_latency_ms.p90": 21,
  "rt/h2/client/$/inet/127.1/9999/stream/total_latency_ms.p95": 28,
  "rt/h2/client/$/inet/127.1/9999/stream/total_latency_ms.p99": 29,
  "rt/h2/client/$/inet/127.1/9999/stream/total_latency_ms.p9990": 29,
  "rt/h2/client/$/inet/127.1/9999/stream/total_latency_ms.p9999": 29,
  "rt/h2/client/$/inet/127.1/9999/stream/total_latency_ms.sum": 824,
  "rt/h2/client/$/inet/127.1/9999/stream/total_latency_ms.avg": 8.956521739130435,
  "rt/h2/client/$/inet/127.1/9999/stream/data_bytes.count": 0,
  "rt/h2/client/$/inet/127.1/9999/stream/local/data/frames": 56387,
  "rt/h2/client/$/inet/127.1/9999/stream/local/data/bytes.count": 174,
  "rt/h2/client/$/inet/127.1/9999/stream/local/data/bytes.max": 12,
  "rt/h2/client/$/inet/127.1/9999/stream/local/data/bytes.min": 12,
  "rt/h2/client/$/inet/127.1/9999/stream/local/data/bytes.p50": 12,
  "rt/h2/client/$/inet/127.1/9999/stream/local/data/bytes.p90": 12,
  "rt/h2/client/$/inet/127.1/9999/stream/local/data/bytes.p95": 12,
  "rt/h2/client/$/inet/127.1/9999/stream/local/data/bytes.p99": 12,
  "rt/h2/client/$/inet/127.1/9999/stream/local/data/bytes.p9990": 12,
  "rt/h2/client/$/inet/127.1/9999/stream/local/data/bytes.p9999": 12,
  "rt/h2/client/$/inet/127.1/9999/stream/local/data/bytes.sum": 2088,
  "rt/h2/client/$/inet/127.1/9999/stream/local/data/bytes.avg": 12,
  "rt/h2/client/$/inet/127.1/9999/stream/local/reset": 58701,
  "rt/h2/client/$/inet/127.1/9999/stream/local/trailers": 0,
  "rt/h2/client/$/inet/127.1/9999/stream/remote/data/frames": 14,
  "rt/h2/client/$/inet/127.1/9999/stream/remote/data/bytes.count": 0,
  "rt/h2/client/$/inet/127.1/9999/stream/remote/reset": 338,
  "rt/h2/client/$/inet/127.1/9999/stream/remote/trailers": 14,
  "rt/h2/client/$/inet/127.1/9999/stream/failures": 716,
  "rt/h2/client/$/inet/127.1/9999/stream/failures/com.twitter.finagle.buoyant.h2.Reset$Cancel$": 716,
  "rt/h2/client/$/inet/127.1/9999/stream/stream_success": 2,
  "rt/h2/client/$/inet/127.1/9999/failures": 392510,
  "rt/h2/client/$/inet/127.1/9999/failures/com.twitter.finagle.buoyant.h2.Reset$Cancel$": 392508,
  "rt/h2/client/$/inet/127.1/9999/failures/com.twitter.finagle.ChannelWriteException": 1,
  "rt/h2/client/$/inet/127.1/9999/failures/com.twitter.finagle.ChannelWriteException/com.twitter.finagle.ChannelClosedException": 1,
  "rt/h2/client/$/inet/127.1/9999/failures/com.twitter.finagle.ChannelWriteException/com.twitter.finagle.ChannelClosedException/java.nio.channels.ClosedChannelException": 1,
  "rt/h2/client/$/inet/127.1/9999/failures/com.twitter.finagle.buoyant.h2.Reset$ProtocolError$": 1,
  "rt/h2/client/$/inet/127.1/9999/exn/io.netty.util.internal.OutOfDirectMemoryError": 7,
  "rt/h2/client/$/inet/127.1/9999/available": 0,
  "rt/h2/client/$/inet/127.1/9999/singletonpool/connects/fail": 19,
  "rt/h2/client/$/inet/127.1/9999/singletonpool/connects/dead": 0,
  "rt/h2/client/$/inet/127.1/9999/socket_writable_ms": 0,
  "rt/h2/client/$/inet/127.1/9999/cancelled_connects": 10,
  "rt/h2/client/$/inet/127.1/9999/dtab/size.count": 0,
  "rt/h2/client/$/inet/127.1/9999/requests": 392523,
  "rt/h2/client/$/inet/127.1/9999/loadbalancer/num_weight_classes": 1,
  "rt/h2/client/$/inet/127.1/9999/loadbalancer/size": 1,
  "rt/h2/client/$/inet/127.1/9999/loadbalancer/rebuilds": 392807,
  "rt/h2/client/$/inet/127.1/9999/loadbalancer/closed": 0,
  "rt/h2/client/$/inet/127.1/9999/loadbalancer/load": 0,
  "rt/h2/client/$/inet/127.1/9999/loadbalancer/meanweight": 1,
  "rt/h2/client/$/inet/127.1/9999/loadbalancer/adds": 1,
  "rt/h2/client/$/inet/127.1/9999/loadbalancer/updates": 139,
  "rt/h2/client/$/inet/127.1/9999/loadbalancer/algorithm/p2c_least_loaded": 1,
  "rt/h2/client/$/inet/127.1/9999/loadbalancer/available": 0,
  "rt/h2/client/$/inet/127.1/9999/loadbalancer/max_effort_exhausted": 392678,
  "rt/h2/client/$/inet/127.1/9999/loadbalancer/busy": 1,
  "rt/h2/client/$/inet/127.1/9999/loadbalancer/removes": 0,
  "rt/h2/client/$/inet/127.1/9999/connections": 0,
  "rt/h2/bindcache/path/expires": 0,
  "rt/h2/bindcache/path/evicts": 0,
  "rt/h2/bindcache/path/misses": 1,
  "rt/h2/bindcache/path/oneshots": 0,
  "rt/h2/bindcache/bound/expires": 0,
  "rt/h2/bindcache/bound/evicts": 0,
  "rt/h2/bindcache/bound/misses": 1,
  "rt/h2/bindcache/bound/oneshots": 0,
  "rt/h2/bindcache/tree/expires": 0,
  "rt/h2/bindcache/tree/evicts": 0,
  "rt/h2/bindcache/tree/misses": 1,
  "rt/h2/bindcache/tree/oneshots": 0,
  "rt/h2/bindcache/client/expires": 0,
  "rt/h2/bindcache/client/evicts": 0,
  "rt/h2/bindcache/client/misses": 1,
  "rt/h2/bindcache/client/oneshots": 0
}

Thread dump:

2019-03-08 10:58:11
Full thread dump Java HotSpot(TM) 64-Bit Server VM (25.191-b12 mixed mode):

"Attach Listener" #513 daemon prio=9 os_prio=31 tid=0x00007fa208808000 nid=0xf21b waiting on condition [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE

"ForkJoinPool.commonPool-worker-1" #510 daemon prio=5 os_prio=31 tid=0x00007fa2084b2800 nid=0x1370f waiting on condition [0x000070000ea57000]
   java.lang.Thread.State: WAITING (parking)
	at sun.misc.Unsafe.park(Native Method)
	- parking to wait for  <0x000000079546f6b0> (a java.util.concurrent.ForkJoinPool)
	at java.util.concurrent.ForkJoinPool.awaitWork(ForkJoinPool.java:1824)
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1693)
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157)

"ForkJoinPool.commonPool-worker-0" #509 daemon prio=5 os_prio=31 tid=0x00007fa2082b3000 nid=0x1e10f waiting on condition [0x000070000e851000]
   java.lang.Thread.State: TIMED_WAITING (parking)
	at sun.misc.Unsafe.park(Native Method)
	- parking to wait for  <0x000000079546f6b0> (a java.util.concurrent.ForkJoinPool)
	at java.util.concurrent.ForkJoinPool.awaitWork(ForkJoinPool.java:1824)
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1693)
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157)

"QueueingHandlerPool-472" #507 daemon prio=5 os_prio=31 tid=0x00007fa20821e000 nid=0x1d70f waiting on condition [0x000070000df36000]
   java.lang.Thread.State: TIMED_WAITING (parking)
	at sun.misc.Unsafe.park(Native Method)
	- parking to wait for  <0x000000079548c548> (a java.util.concurrent.SynchronousQueue$TransferStack)
	at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
	at java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
	at java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
	at java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
	at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1073)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1134)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

"QueueingHandlerPool-471" #506 daemon prio=5 os_prio=31 tid=0x00007fa208aa1000 nid=0x7d0b waiting on condition [0x000070000dc2d000]
   java.lang.Thread.State: TIMED_WAITING (parking)
	at sun.misc.Unsafe.park(Native Method)
	- parking to wait for  <0x000000079548c548> (a java.util.concurrent.SynchronousQueue$TransferStack)
	at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
	at java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
	at java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
	at java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
	at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1073)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1134)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

"finagle/netty4-2-8" #505 daemon prio=5 os_prio=31 tid=0x00007fa207322800 nid=0x1720b runnable [0x00007000160b6000]
   java.lang.Thread.State: RUNNABLE
	at sun.nio.ch.KQueueArrayWrapper.kevent0(Native Method)
	at sun.nio.ch.KQueueArrayWrapper.poll(KQueueArrayWrapper.java:198)
	at sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:117)
	at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:86)
	- locked <0x00000007953d9780> (a io.netty.channel.nio.SelectedSelectionKeySet)
	- locked <0x00000007953d97f8> (a java.util.Collections$UnmodifiableSet)
	- locked <0x00000007955495a8> (a sun.nio.ch.KQueueSelectorImpl)
	at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:97)
	at io.netty.channel.nio.SelectedSelectionKeySetSelector.select(SelectedSelectionKeySetSelector.java:62)
	at io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:755)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:410)
	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at com.twitter.finagle.util.BlockingTimeTrackingThreadFactory$$anon$1.run(BlockingTimeTrackingThreadFactory.scala:23)
	at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
	at java.lang.Thread.run(Thread.java:748)

"finagle/netty4-2-7" #504 daemon prio=5 os_prio=31 tid=0x00007fa2069ad000 nid=0x1eb0b runnable [0x0000700015fb3000]
   java.lang.Thread.State: RUNNABLE
	at sun.nio.ch.KQueueArrayWrapper.kevent0(Native Method)
	at sun.nio.ch.KQueueArrayWrapper.poll(KQueueArrayWrapper.java:198)
	at sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:117)
	at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:86)
	- locked <0x0000000794d2ac00> (a io.netty.channel.nio.SelectedSelectionKeySet)
	- locked <0x00000007953d96c0> (a java.util.Collections$UnmodifiableSet)
	- locked <0x0000000795549a58> (a sun.nio.ch.KQueueSelectorImpl)
	at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:97)
	at io.netty.channel.nio.SelectedSelectionKeySetSelector.select(SelectedSelectionKeySetSelector.java:62)
	at io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:755)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:410)
	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at com.twitter.finagle.util.BlockingTimeTrackingThreadFactory$$anon$1.run(BlockingTimeTrackingThreadFactory.scala:23)
	at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
	at java.lang.Thread.run(Thread.java:748)

"finagle/netty4-2-6" #503 daemon prio=5 os_prio=31 tid=0x00007fa20844b800 nid=0x1ab0b runnable [0x0000700015eb0000]
   java.lang.Thread.State: RUNNABLE
	at sun.nio.ch.KQueueArrayWrapper.kevent0(Native Method)
	at sun.nio.ch.KQueueArrayWrapper.poll(KQueueArrayWrapper.java:198)
	at sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:117)
	at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:86)
	- locked <0x0000000794d2aac8> (a io.netty.channel.nio.SelectedSelectionKeySet)
	- locked <0x0000000794d2ab40> (a java.util.Collections$UnmodifiableSet)
	- locked <0x0000000795549a08> (a sun.nio.ch.KQueueSelectorImpl)
	at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:97)
	at io.netty.channel.nio.SelectedSelectionKeySetSelector.select(SelectedSelectionKeySetSelector.java:62)
	at io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:755)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:410)
	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at com.twitter.finagle.util.BlockingTimeTrackingThreadFactory$$anon$1.run(BlockingTimeTrackingThreadFactory.scala:23)
	at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
	at java.lang.Thread.run(Thread.java:748)

"finagle/netty4-2-5" #502 daemon prio=5 os_prio=31 tid=0x00007fa2084b7000 nid=0x1ec0b runnable [0x0000700015dad000]
   java.lang.Thread.State: RUNNABLE
	at sun.nio.ch.KQueueArrayWrapper.kevent0(Native Method)
	at sun.nio.ch.KQueueArrayWrapper.poll(KQueueArrayWrapper.java:198)
	at sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:117)
	at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:86)
	- locked <0x0000000794d2a990> (a io.netty.channel.nio.SelectedSelectionKeySet)
	- locked <0x0000000794d2aa08> (a java.util.Collections$UnmodifiableSet)
	- locked <0x00000007955499b8> (a sun.nio.ch.KQueueSelectorImpl)
	at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:97)
	at io.netty.channel.nio.SelectedSelectionKeySetSelector.select(SelectedSelectionKeySetSelector.java:62)
	at io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:755)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:410)
	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at com.twitter.finagle.util.BlockingTimeTrackingThreadFactory$$anon$1.run(BlockingTimeTrackingThreadFactory.scala:23)
	at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
	at java.lang.Thread.run(Thread.java:748)

"ForkJoinPool.commonPool-worker-3" #501 daemon prio=5 os_prio=31 tid=0x00007fa206b3b800 nid=0x1ac0b waiting on condition [0x0000700015caa000]
   java.lang.Thread.State: WAITING (parking)
	at sun.misc.Unsafe.park(Native Method)
	- parking to wait for  <0x000000079546f6b0> (a java.util.concurrent.ForkJoinPool)
	at java.util.concurrent.ForkJoinPool.awaitWork(ForkJoinPool.java:1824)
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1693)
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157)

"finagle/netty4-2-4" #500 daemon prio=5 os_prio=31 tid=0x00007fa207370000 nid=0x14a0b runnable [0x0000700015ba7000]
   java.lang.Thread.State: RUNNABLE
	at sun.nio.ch.KQueueArrayWrapper.kevent0(Native Method)
	at sun.nio.ch.KQueueArrayWrapper.poll(KQueueArrayWrapper.java:198)
	at sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:117)
	at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:86)
	- locked <0x0000000794d2af18> (a io.netty.channel.nio.SelectedSelectionKeySet)
	- locked <0x0000000794d2af90> (a java.util.Collections$UnmodifiableSet)
	- locked <0x0000000795549968> (a sun.nio.ch.KQueueSelectorImpl)
	at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:97)
	at io.netty.channel.nio.SelectedSelectionKeySetSelector.select(SelectedSelectionKeySetSelector.java:62)
	at io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:755)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:410)
	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at com.twitter.finagle.util.BlockingTimeTrackingThreadFactory$$anon$1.run(BlockingTimeTrackingThreadFactory.scala:23)
	at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
	at java.lang.Thread.run(Thread.java:748)

"QueueingHandlerPool-441" #470 daemon prio=5 os_prio=31 tid=0x00007fa2084b3800 nid=0x1d80b waiting on condition [0x0000700013941000]
   java.lang.Thread.State: TIMED_WAITING (parking)
	at sun.misc.Unsafe.park(Native Method)
	- parking to wait for  <0x000000079548c548> (a java.util.concurrent.SynchronousQueue$TransferStack)
	at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
	at java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
	at java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
	at java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
	at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1073)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1134)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

"QueueingHandlerPool-439" #468 daemon prio=5 os_prio=31 tid=0x00007fa2084b2000 nid=0x1060b waiting on condition [0x000070001373b000]
   java.lang.Thread.State: TIMED_WAITING (parking)
	at sun.misc.Unsafe.park(Native Method)
	- parking to wait for  <0x000000079548c548> (a java.util.concurrent.SynchronousQueue$TransferStack)
	at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
	at java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
	at java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
	at java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
	at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1073)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1134)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

"QueueingHandlerPool-438" #467 daemon prio=5 os_prio=31 tid=0x00007fa2084b1000 nid=0xed0b waiting on condition [0x0000700013638000]
   java.lang.Thread.State: TIMED_WAITING (parking)
	at sun.misc.Unsafe.park(Native Method)
	- parking to wait for  <0x000000079548c548> (a java.util.concurrent.SynchronousQueue$TransferStack)
	at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
	at java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
	at java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
	at java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
	at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1073)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1134)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

"QueueingHandlerPool-410" #439 daemon prio=5 os_prio=31 tid=0x00007fa2074e1000 nid=0xb50b waiting on condition [0x00007000119e4000]
   java.lang.Thread.State: TIMED_WAITING (parking)
	at sun.misc.Unsafe.park(Native Method)
	- parking to wait for  <0x000000079548c548> (a java.util.concurrent.SynchronousQueue$TransferStack)
	at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
	at java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
	at java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
	at java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
	at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1073)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1134)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

"QueueingHandlerPool-406" #435 daemon prio=5 os_prio=31 tid=0x00007fa208a5b800 nid=0x12d0b waiting on condition [0x00007000112cf000]
   java.lang.Thread.State: TIMED_WAITING (parking)
	at sun.misc.Unsafe.park(Native Method)
	- parking to wait for  <0x000000079548c548> (a java.util.concurrent.SynchronousQueue$TransferStack)
	at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
	at java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
	at java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
	at java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
	at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1073)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1134)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

"QueueingHandlerPool-405" #434 daemon prio=5 os_prio=31 tid=0x00007fa208a5a800 nid=0x9b0b waiting on condition [0x00007000111cc000]
   java.lang.Thread.State: TIMED_WAITING (parking)
	at sun.misc.Unsafe.park(Native Method)
	- parking to wait for  <0x000000079548c548> (a java.util.concurrent.SynchronousQueue$TransferStack)
	at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
	at java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
	at java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
	at java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
	at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1073)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1134)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

"QueueingHandlerPool-404" #433 daemon prio=5 os_prio=31 tid=0x00007fa208a5a000 nid=0xa30b waiting on condition [0x00007000110c9000]
   java.lang.Thread.State: TIMED_WAITING (parking)
	at sun.misc.Unsafe.park(Native Method)
	- parking to wait for  <0x000000079548c548> (a java.util.concurrent.SynchronousQueue$TransferStack)
	at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
	at java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
	at java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
	at java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
	at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1073)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1134)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

"QueueingHandlerPool-403" #432 daemon prio=5 os_prio=31 tid=0x00007fa208a59000 nid=0x1230b waiting on condition [0x0000700010fc6000]
   java.lang.Thread.State: TIMED_WAITING (parking)
	at sun.misc.Unsafe.park(Native Method)
	- parking to wait for  <0x000000079548c548> (a java.util.concurrent.SynchronousQueue$TransferStack)
	at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
	at java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
	at java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
	at java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
	at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1073)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1134)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

"QueueingHandlerPool-392" #421 daemon prio=5 os_prio=31 tid=0x00007fa208458000 nid=0xca0b waiting on condition [0x00007000104a5000]
   java.lang.Thread.State: TIMED_WAITING (parking)
	at sun.misc.Unsafe.park(Native Method)
	- parking to wait for  <0x000000079548c548> (a java.util.concurrent.SynchronousQueue$TransferStack)
	at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
	at java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
	at java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
	at java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
	at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1073)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1134)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

"QueueingHandlerPool-391" #420 daemon prio=5 os_prio=31 tid=0x00007fa208457000 nid=0x10407 waiting on condition [0x000070001029f000]
   java.lang.Thread.State: TIMED_WAITING (parking)
	at sun.misc.Unsafe.park(Native Method)
	- parking to wait for  <0x000000079548c548> (a java.util.concurrent.SynchronousQueue$TransferStack)
	at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
	at java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
	at java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
	at java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
	at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1073)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1134)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

"QueueingHandlerPool-390" #419 daemon prio=5 os_prio=31 tid=0x00007fa208456800 nid=0x9207 waiting on condition [0x0000700010099000]
   java.lang.Thread.State: TIMED_WAITING (parking)
	at sun.misc.Unsafe.park(Native Method)
	- parking to wait for  <0x000000079548c548> (a java.util.concurrent.SynchronousQueue$TransferStack)
	at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
	at java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
	at java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
	at java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
	at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1073)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1134)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

"QueueingHandlerPool-389" #418 daemon prio=5 os_prio=31 tid=0x00007fa2084ef800 nid=0x9307 waiting on condition [0x000070000ff96000]
   java.lang.Thread.State: TIMED_WAITING (parking)
	at sun.misc.Unsafe.park(Native Method)
	- parking to wait for  <0x000000079548c548> (a java.util.concurrent.SynchronousQueue$TransferStack)
	at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
	at java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
	at java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
	at java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
	at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1073)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1134)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

"QueueingHandlerPool-388" #417 daemon prio=5 os_prio=31 tid=0x00007fa2084ee800 nid=0x1500b waiting on condition [0x000070000fe93000]
   java.lang.Thread.State: TIMED_WAITING (parking)
	at sun.misc.Unsafe.park(Native Method)
	- parking to wait for  <0x000000079548c548> (a java.util.concurrent.SynchronousQueue$TransferStack)
	at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
	at java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
	at java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
	at java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
	at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1073)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1134)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

"QueueingHandlerPool-387" #416 daemon prio=5 os_prio=31 tid=0x00007fa2084ee000 nid=0x1510b waiting on condition [0x000070000fd90000]
   java.lang.Thread.State: TIMED_WAITING (parking)
	at sun.misc.Unsafe.park(Native Method)
	- parking to wait for  <0x000000079548c548> (a java.util.concurrent.SynchronousQueue$TransferStack)
	at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
	at java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
	at java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
	at java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
	at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1073)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1134)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

"QueueingHandlerPool-386" #415 daemon prio=5 os_prio=31 tid=0x00007fa2084ed000 nid=0x5a07 waiting on condition [0x000070000fa87000]
   java.lang.Thread.State: TIMED_WAITING (parking)
	at sun.misc.Unsafe.park(Native Method)
	- parking to wait for  <0x000000079548c548> (a java.util.concurrent.SynchronousQueue$TransferStack)
	at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
	at java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
	at java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
	at java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
	at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1073)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1134)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

"QueueingHandlerPool-385" #414 daemon prio=5 os_prio=31 tid=0x00007fa2082b2000 nid=0x5907 waiting on condition [0x000070000f984000]
   java.lang.Thread.State: TIMED_WAITING (parking)
	at sun.misc.Unsafe.park(Native Method)
	- parking to wait for  <0x000000079548c548> (a java.util.concurrent.SynchronousQueue$TransferStack)
	at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
	at java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
	at java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
	at java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
	at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1073)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1134)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

"QueueingHandlerPool-384" #413 daemon prio=5 os_prio=31 tid=0x00007fa2082b1800 nid=0x7c0b waiting on condition [0x000070000f67b000]
   java.lang.Thread.State: TIMED_WAITING (parking)
	at sun.misc.Unsafe.park(Native Method)
	- parking to wait for  <0x000000079548c548> (a java.util.concurrent.SynchronousQueue$TransferStack)
	at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
	at java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
	at java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
	at java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
	at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1073)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1134)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

"QueueingHandlerPool-383" #412 daemon prio=5 os_prio=31 tid=0x00007fa2082b0800 nid=0x1c70b waiting on condition [0x000070000f372000]
   java.lang.Thread.State: TIMED_WAITING (parking)
	at sun.misc.Unsafe.park(Native Method)
	- parking to wait for  <0x000000079548c548> (a java.util.concurrent.SynchronousQueue$TransferStack)
	at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
	at java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
	at java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
	at java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
	at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1073)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1134)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

"QueueingHandlerPool-382" #411 daemon prio=5 os_prio=31 tid=0x00007fa2082b0000 nid=0xeb0b waiting on condition [0x000070000f26f000]
   java.lang.Thread.State: TIMED_WAITING (parking)
	at sun.misc.Unsafe.park(Native Method)
	- parking to wait for  <0x000000079548c548> (a java.util.concurrent.SynchronousQueue$TransferStack)
	at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
	at java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
	at java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
	at java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
	at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1073)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1134)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

"QueueingHandlerPool-381" #410 daemon prio=5 os_prio=31 tid=0x00007fa20828c000 nid=0x1100b waiting on condition [0x000070000f16c000]
   java.lang.Thread.State: TIMED_WAITING (parking)
	at sun.misc.Unsafe.park(Native Method)
	- parking to wait for  <0x000000079548c548> (a java.util.concurrent.SynchronousQueue$TransferStack)
	at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
	at java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
	at java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
	at java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
	at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1073)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1134)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

"QueueingHandlerPool-378" #407 daemon prio=5 os_prio=31 tid=0x00007fa208455000 nid=0x1210b waiting on condition [0x000070000ee63000]
   java.lang.Thread.State: TIMED_WAITING (parking)
	at sun.misc.Unsafe.park(Native Method)
	- parking to wait for  <0x000000079548c548> (a java.util.concurrent.SynchronousQueue$TransferStack)
	at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
	at java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
	at java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
	at java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
	at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1073)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1134)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

"QueueingHandlerPool-377" #406 daemon prio=5 os_prio=31 tid=0x00007fa208311000 nid=0xdd0b waiting on condition [0x000070000ed60000]
   java.lang.Thread.State: TIMED_WAITING (parking)
	at sun.misc.Unsafe.park(Native Method)
	- parking to wait for  <0x000000079548c548> (a java.util.concurrent.SynchronousQueue$TransferStack)
	at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
	at java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
	at java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
	at java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
	at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1073)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1134)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

"QueueingHandlerPool-375" #404 daemon prio=5 os_prio=31 tid=0x00007fa206aac800 nid=0x1ff07 waiting on condition [0x000070000eb5a000]
   java.lang.Thread.State: TIMED_WAITING (parking)
	at sun.misc.Unsafe.park(Native Method)
	- parking to wait for  <0x000000079548c548> (a java.util.concurrent.SynchronousQueue$TransferStack)
	at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
	at java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
	at java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
	at java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
	at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1073)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1134)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

"QueueingHandlerPool-374" #403 daemon prio=5 os_prio=31 tid=0x00007fa206b55800 nid=0xe60b waiting on condition [0x000070000e74e000]
   java.lang.Thread.State: TIMED_WAITING (parking)
	at sun.misc.Unsafe.park(Native Method)
	- parking to wait for  <0x000000079548c548> (a java.util.concurrent.SynchronousQueue$TransferStack)
	at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
	at java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
	at java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
	at java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
	at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1073)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1134)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

"QueueingHandlerPool-372" #401 daemon prio=5 os_prio=31 tid=0x00007fa208404800 nid=0x1130b waiting on condition [0x000070000e445000]
   java.lang.Thread.State: TIMED_WAITING (parking)
	at sun.misc.Unsafe.park(Native Method)
	- parking to wait for  <0x000000079548c548> (a java.util.concurrent.SynchronousQueue$TransferStack)
	at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
	at java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
	at java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
	at java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
	at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1073)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1134)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

"QueueingHandlerPool-371" #400 daemon prio=5 os_prio=31 tid=0x00007fa20706a800 nid=0xf10b waiting on condition [0x000070000de33000]
   java.lang.Thread.State: TIMED_WAITING (parking)
	at sun.misc.Unsafe.park(Native Method)
	- parking to wait for  <0x000000079548c548> (a java.util.concurrent.SynchronousQueue$TransferStack)
	at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
	at java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
	at java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
	at java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
	at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1073)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1134)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

"admin-2" #398 daemon prio=5 os_prio=31 tid=0x00007fa2069ec000 nid=0x16207 runnable [0x000070001831c000]
   java.lang.Thread.State: RUNNABLE
	at sun.nio.ch.KQueueArrayWrapper.kevent0(Native Method)
	at sun.nio.ch.KQueueArrayWrapper.poll(KQueueArrayWrapper.java:198)
	at sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:117)
	at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:86)
	- locked <0x0000000795368aa8> (a io.netty.channel.nio.SelectedSelectionKeySet)
	- locked <0x0000000795368b20> (a java.util.Collections$UnmodifiableSet)
	- locked <0x00000007953e8850> (a sun.nio.ch.KQueueSelectorImpl)
	at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:97)
	at io.netty.channel.nio.SelectedSelectionKeySetSelector.select(SelectedSelectionKeySetSelector.java:62)
	at io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:755)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:410)
	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

"QueueingHandlerPool-271" #298 daemon prio=5 os_prio=31 tid=0x00007fa2074e5800 nid=0x17907 waiting on condition [0x00007000115d8000]
   java.lang.Thread.State: TIMED_WAITING (parking)
	at sun.misc.Unsafe.park(Native Method)
	- parking to wait for  <0x000000079548c548> (a java.util.concurrent.SynchronousQueue$TransferStack)
	at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
	at java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
	at java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
	at java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
	at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1073)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1134)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

"QueueingHandlerPool-246" #273 daemon prio=5 os_prio=31 tid=0x00007fa207461000 nid=0x7207 waiting on condition [0x000070000f881000]
   java.lang.Thread.State: TIMED_WAITING (parking)
	at sun.misc.Unsafe.park(Native Method)
	- parking to wait for  <0x000000079548c548> (a java.util.concurrent.SynchronousQueue$TransferStack)
	at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
	at java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
	at java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
	at java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
	at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1073)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1134)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

"QueueingHandlerPool-242" #269 daemon prio=5 os_prio=31 tid=0x00007fa206d1e800 nid=0x19607 waiting on condition [0x000070000f475000]
   java.lang.Thread.State: TIMED_WAITING (parking)
	at sun.misc.Unsafe.park(Native Method)
	- parking to wait for  <0x000000079548c548> (a java.util.concurrent.SynchronousQueue$TransferStack)
	at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
	at java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
	at java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
	at java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
	at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1073)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1134)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

"QueueingHandlerPool-232" #259 daemon prio=5 os_prio=31 tid=0x00007fa20699a000 nid=0x18307 waiting on condition [0x000070000e954000]
   java.lang.Thread.State: TIMED_WAITING (parking)
	at sun.misc.Unsafe.park(Native Method)
	- parking to wait for  <0x000000079548c548> (a java.util.concurrent.SynchronousQueue$TransferStack)
	at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
	at java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
	at java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
	at java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
	at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1073)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1134)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

"QueueingHandlerPool-229" #256 daemon prio=5 os_prio=31 tid=0x00007fa2084eb000 nid=0x15607 waiting on condition [0x000070000e548000]
   java.lang.Thread.State: TIMED_WAITING (parking)
	at sun.misc.Unsafe.park(Native Method)
	- parking to wait for  <0x000000079548c548> (a java.util.concurrent.SynchronousQueue$TransferStack)
	at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
	at java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
	at java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
	at java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
	at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1073)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1134)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

"QueueingHandlerPool-35" #62 daemon prio=5 os_prio=31 tid=0x00007fa206db0800 nid=0x7b03 waiting on condition [0x000070001019c000]
   java.lang.Thread.State: TIMED_WAITING (parking)
	at sun.misc.Unsafe.park(Native Method)
	- parking to wait for  <0x000000079548c548> (a java.util.concurrent.SynchronousQueue$TransferStack)
	at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
	at java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
	at java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
	at java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
	at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1073)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1134)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

"admin-1" #34 daemon prio=5 os_prio=31 tid=0x00007fa208ad4800 nid=0x9903 runnable [0x000070000e64b000]
   java.lang.Thread.State: RUNNABLE
	at sun.nio.ch.KQueueArrayWrapper.kevent0(Native Method)
	at sun.nio.ch.KQueueArrayWrapper.poll(KQueueArrayWrapper.java:198)
	at sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:117)
	at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:86)
	- locked <0x0000000795368bf8> (a io.netty.channel.nio.SelectedSelectionKeySet)
	- locked <0x0000000795368c70> (a java.util.Collections$UnmodifiableSet)
	- locked <0x00000007953e88a0> (a sun.nio.ch.KQueueSelectorImpl)
	at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:97)
	at io.netty.channel.nio.SelectedSelectionKeySetSelector.select(SelectedSelectionKeySetSelector.java:62)
	at io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:755)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:410)
	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

"UnboundedFuturePool-2" #30 daemon prio=5 os_prio=31 tid=0x00007fa206d15800 nid=0x9d03 waiting on condition [0x000070000e342000]
   java.lang.Thread.State: TIMED_WAITING (parking)
	at sun.misc.Unsafe.park(Native Method)
	- parking to wait for  <0x0000000795246cb8> (a java.util.concurrent.SynchronousQueue$TransferStack)
	at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
	at java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
	at java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
	at java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
	at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1073)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1134)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

"finagle/netty4-2-3" #28 daemon prio=5 os_prio=31 tid=0x00007fa2074de800 nid=0xa103 runnable [0x000070000e13c000]
   java.lang.Thread.State: RUNNABLE
	at sun.nio.ch.KQueueArrayWrapper.kevent0(Native Method)
	at sun.nio.ch.KQueueArrayWrapper.poll(KQueueArrayWrapper.java:198)
	at sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:117)
	at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:86)
	- locked <0x0000000794d2ade0> (a io.netty.channel.nio.SelectedSelectionKeySet)
	- locked <0x0000000794d2ae58> (a java.util.Collections$UnmodifiableSet)
	- locked <0x0000000795549918> (a sun.nio.ch.KQueueSelectorImpl)
	at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:97)
	at io.netty.channel.nio.SelectedSelectionKeySetSelector.select(SelectedSelectionKeySetSelector.java:62)
	at io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:755)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:410)
	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at com.twitter.finagle.util.BlockingTimeTrackingThreadFactory$$anon$1.run(BlockingTimeTrackingThreadFactory.scala:23)
	at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
	at java.lang.Thread.run(Thread.java:748)

"UnboundedFuturePool-1" #27 daemon prio=5 os_prio=31 tid=0x00007fa206b22000 nid=0xa203 waiting on condition [0x000070000e039000]
   java.lang.Thread.State: TIMED_WAITING (parking)
	at sun.misc.Unsafe.park(Native Method)
	- parking to wait for  <0x0000000795246cb8> (a java.util.concurrent.SynchronousQueue$TransferStack)
	at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
	at java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
	at java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
	at java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
	at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1073)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1134)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

"finagle/netty4-2-2" #23 daemon prio=5 os_prio=31 tid=0x00007fa208400000 nid=0xa807 runnable [0x000070000db2a000]
   java.lang.Thread.State: RUNNABLE
	at sun.nio.ch.KQueueArrayWrapper.kevent0(Native Method)
	at sun.nio.ch.KQueueArrayWrapper.poll(KQueueArrayWrapper.java:198)
	at sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:117)
	at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:86)
	- locked <0x0000000794d8a008> (a io.netty.channel.nio.SelectedSelectionKeySet)
	- locked <0x0000000794d2ac48> (a java.util.Collections$UnmodifiableSet)
	- locked <0x00000007955498c8> (a sun.nio.ch.KQueueSelectorImpl)
	at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:97)
	at io.netty.channel.nio.SelectedSelectionKeySetSelector.select(SelectedSelectionKeySetSelector.java:62)
	at io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:755)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:410)
	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at com.twitter.finagle.util.BlockingTimeTrackingThreadFactory$$anon$1.run(BlockingTimeTrackingThreadFactory.scala:23)
	at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
	at java.lang.Thread.run(Thread.java:748)

"finagle/netty4-2-1" #22 daemon prio=5 os_prio=31 tid=0x00007fa206c9a800 nid=0x310f runnable [0x000070000d821000]
   java.lang.Thread.State: RUNNABLE
	at sun.nio.ch.KQueueArrayWrapper.kevent0(Native Method)
	at sun.nio.ch.KQueueArrayWrapper.poll(KQueueArrayWrapper.java:198)
	at sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:117)
	at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:86)
	- locked <0x0000000794d10870> (a io.netty.channel.nio.SelectedSelectionKeySet)
	- locked <0x0000000794d108a0> (a java.util.Collections$UnmodifiableSet)
	- locked <0x000000079560c340> (a sun.nio.ch.KQueueSelectorImpl)
	at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:97)
	at io.netty.channel.nio.SelectedSelectionKeySetSelector.select(SelectedSelectionKeySetSelector.java:62)
	at io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:755)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:410)
	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at com.twitter.finagle.util.BlockingTimeTrackingThreadFactory$$anon$1.run(BlockingTimeTrackingThreadFactory.scala:23)
	at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
	at java.lang.Thread.run(Thread.java:748)

"finagle/netty4/boss-1" #21 daemon prio=5 os_prio=31 tid=0x00007fa2071b4800 nid=0xa603 runnable [0x000070000dd30000]
   java.lang.Thread.State: RUNNABLE
	at sun.nio.ch.KQueueArrayWrapper.kevent0(Native Method)
	at sun.nio.ch.KQueueArrayWrapper.poll(KQueueArrayWrapper.java:198)
	at sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:117)
	at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:86)
	- locked <0x0000000795275b60> (a io.netty.channel.nio.SelectedSelectionKeySet)
	- locked <0x0000000795275b78> (a java.util.Collections$UnmodifiableSet)
	- locked <0x00000007951c96b0> (a sun.nio.ch.KQueueSelectorImpl)
	at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:97)
	at io.netty.channel.nio.SelectedSelectionKeySetSelector.select(SelectedSelectionKeySetSelector.java:62)
	at io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:755)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:410)
	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
	at java.lang.Thread.run(Thread.java:748)

"finagle/netty4/boss-1" #17 daemon prio=5 os_prio=31 tid=0x00007fa206c18000 nid=0x3f07 runnable [0x000070000d71e000]
   java.lang.Thread.State: RUNNABLE
	at sun.nio.ch.KQueueArrayWrapper.kevent0(Native Method)
	at sun.nio.ch.KQueueArrayWrapper.poll(KQueueArrayWrapper.java:198)
	at sun.nio.ch.KQueueSelectorImpl.doSelect(KQueueSelectorImpl.java:117)
	at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:86)
	- locked <0x0000000795275bc0> (a io.netty.channel.nio.SelectedSelectionKeySet)
	- locked <0x0000000795275bd8> (a java.util.Collections$UnmodifiableSet)
	- locked <0x00000007951c9700> (a sun.nio.ch.KQueueSelectorImpl)
	at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:97)
	at io.netty.channel.nio.SelectedSelectionKeySetSelector.select(SelectedSelectionKeySetSelector.java:62)
	at io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:755)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:410)
	at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
	at java.lang.Thread.run(Thread.java:748)

"CollectClosables" #15 daemon prio=5 os_prio=31 tid=0x00007fa20898c800 nid=0xa903 in Object.wait() [0x000070000da27000]
   java.lang.Thread.State: WAITING (on object monitor)
	at java.lang.Object.wait(Native Method)
	- waiting on <0x00000007958174e8> (a java.lang.ref.ReferenceQueue$Lock)
	at java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:144)
	- locked <0x00000007958174e8> (a java.lang.ref.ReferenceQueue$Lock)
	at java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:165)
	at com.twitter.util.Closable$$anon$1.run(Closable.scala:165)

"HighResTimer" #14 daemon prio=5 os_prio=31 tid=0x00007fa208373000 nid=0x4003 in Object.wait() [0x000070000d924000]
   java.lang.Thread.State: WAITING (on object monitor)
	at java.lang.Object.wait(Native Method)
	- waiting on <0x0000000794e11e00> (a java.util.TaskQueue)
	at java.lang.Object.wait(Object.java:502)
	at java.util.TimerThread.mainLoop(Timer.java:526)
	- locked <0x0000000794e11e00> (a java.util.TaskQueue)
	at java.util.TimerThread.run(Timer.java:505)

"Netty 4 Timer-1" #11 daemon prio=5 os_prio=31 tid=0x00007fa208931800 nid=0x3d03 waiting on condition [0x000070000d61b000]
   java.lang.Thread.State: TIMED_WAITING (sleeping)
	at java.lang.Thread.sleep(Native Method)
	at io.netty.util.HashedWheelTimer$Worker.waitForNextTick(HashedWheelTimer.java:567)
	at io.netty.util.HashedWheelTimer$Worker.run(HashedWheelTimer.java:466)
	at java.lang.Thread.run(Thread.java:748)

"AsyncAppender-Dispatcher-Thread-1" #10 daemon prio=5 os_prio=31 tid=0x00007fa206b87000 nid=0x4303 in Object.wait() [0x000070000d518000]
   java.lang.Thread.State: WAITING (on object monitor)
	at java.lang.Object.wait(Native Method)
	- waiting on <0x000000079514cbe8> (a java.util.ArrayList)
	at java.lang.Object.wait(Object.java:502)
	at org.apache.log4j.AsyncAppender$Dispatcher.run(AsyncAppender.java:548)
	- locked <0x000000079514cbe8> (a java.util.ArrayList)
	at java.lang.Thread.run(Thread.java:748)

"Service Thread" #8 daemon prio=9 os_prio=31 tid=0x00007fa20709d800 nid=0x3903 runnable [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE

"C2 CompilerThread1" #7 daemon prio=9 os_prio=31 tid=0x00007fa206013800 nid=0x4603 waiting on condition [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE

"C2 CompilerThread0" #6 daemon prio=9 os_prio=31 tid=0x00007fa207072800 nid=0x4703 waiting on condition [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE

"Signal Dispatcher" #5 daemon prio=9 os_prio=31 tid=0x00007fa207009800 nid=0x3603 runnable [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE

"Surrogate Locker Thread (Concurrent GC)" #4 daemon prio=9 os_prio=31 tid=0x00007fa207012800 nid=0x3403 waiting on condition [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE

"Finalizer" #3 daemon prio=8 os_prio=31 tid=0x00007fa206007800 nid=0x5403 in Object.wait() [0x000070000ce03000]
   java.lang.Thread.State: WAITING (on object monitor)
	at java.lang.Object.wait(Native Method)
	- waiting on <0x0000000794cc4688> (a java.lang.ref.ReferenceQueue$Lock)
	at java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:144)
	- locked <0x0000000794cc4688> (a java.lang.ref.ReferenceQueue$Lock)
	at java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:165)
	at java.lang.ref.Finalizer$FinalizerThread.run(Finalizer.java:216)

"Reference Handler" #2 daemon prio=10 os_prio=31 tid=0x00007fa206000000 nid=0x1e03 in Object.wait() [0x000070000cd00000]
   java.lang.Thread.State: WAITING (on object monitor)
	at java.lang.Object.wait(Native Method)
	- waiting on <0x0000000794cc46b8> (a java.lang.ref.Reference$Lock)
	at java.lang.Object.wait(Object.java:502)
	at java.lang.ref.Reference.tryHandlePending(Reference.java:191)
	- locked <0x0000000794cc46b8> (a java.lang.ref.Reference$Lock)
	at java.lang.ref.Reference$ReferenceHandler.run(Reference.java:153)

"main" #1 prio=5 os_prio=31 tid=0x00007fa207016800 nid=0x1803 waiting on condition [0x000070000c5ea000]
   java.lang.Thread.State: TIMED_WAITING (parking)
	at sun.misc.Unsafe.park(Native Method)
	- parking to wait for  <0x0000000795274808> (a java.util.concurrent.CountDownLatch$Sync)
	at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
	at java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedNanos(AbstractQueuedSynchronizer.java:1037)
	at java.util.concurrent.locks.AbstractQueuedSynchronizer.tryAcquireSharedNanos(AbstractQueuedSynchronizer.java:1328)
	at java.util.concurrent.CountDownLatch.await(CountDownLatch.java:277)
	at com.twitter.util.Promise.ready(Promise.scala:600)
	at com.twitter.util.CloseAwaitably0.ready(Awaitable.scala:213)
	at com.twitter.util.CloseAwaitably0.ready$(Awaitable.scala:212)
	at io.buoyant.linkerd.Main$$anon$1.ready(Main.scala:105)
	at io.buoyant.linkerd.Main$$anon$1.ready(Main.scala:105)
	at com.twitter.util.Await$.$anonfun$all$1(Awaitable.scala:176)
	at com.twitter.util.Await$$$Lambda$845/1802415698.apply(Unknown Source)
	at scala.collection.immutable.List.foreach(List.scala:378)
	at com.twitter.util.Await$.all(Awaitable.scala:176)
	at com.twitter.util.Await$.all(Awaitable.scala:165)
	at io.buoyant.linkerd.Main$.main(Main.scala:59)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at com.twitter.app.App.$anonfun$nonExitingMain$4(App.scala:364)
	at com.twitter.app.App$$Lambda$186/1034094674.apply(Unknown Source)
	at scala.Option.foreach(Option.scala:257)
	at com.twitter.app.App.nonExitingMain(App.scala:363)
	at com.twitter.app.App.nonExitingMain$(App.scala:344)
	at io.buoyant.linkerd.Main$.nonExitingMain(Main.scala:21)
	at com.twitter.app.App.main(App.scala:333)
	at com.twitter.app.App.main$(App.scala:331)
	at io.buoyant.linkerd.Main$.main(Main.scala:21)
	at io.buoyant.linkerd.Main.main(Main.scala)

"VM Thread" os_prio=31 tid=0x00007fa207020800 nid=0x2003 runnable 

"Gang worker#0 (Parallel GC Threads)" os_prio=31 tid=0x00007fa207800000 nid=0x2603 runnable 

"Gang worker#1 (Parallel GC Threads)" os_prio=31 tid=0x00007fa20701f800 nid=0x2403 runnable 

"Gang worker#2 (Parallel GC Threads)" os_prio=31 tid=0x00007fa207020000 nid=0x2203 runnable 

"Gang worker#3 (Parallel GC Threads)" os_prio=31 tid=0x00007fa207801000 nid=0x1b03 runnable 

"Concurrent Mark-Sweep GC Thread" os_prio=31 tid=0x00007fa206811800 nid=0x2103 runnable 

"VM Periodic Task Thread" os_prio=31 tid=0x00007fa207043000 nid=0x3b03 waiting on condition 

JNI global references: 2523

Environment:

  • linkerd/namerd version, config files: 1.5.2, example config
  • Platform, version, and config files (Kubernetes, DC/OS, etc): local
  • Cloud provider or hardware configuration: Mac OS X/Nomad

Issue Analytics

  • State: closed
  • Created: 5 years ago
  • Reactions: 1
  • Comments: 20 (20 by maintainers)

Top GitHub Comments

2 reactions
zackangelo commented, Apr 2, 2019

Okay, I took a bit of a “shotgun” approach and sprayed stream cancels over the code paths likely to be triggered in a reset scenario. I was able to get the streams to drain in every case by adding a cancel in the service-to-service failure handler in the h2 router (forgive the sloppy combinator usage: https://github.com/linkerd/linkerd/compare/master...zackangelo:cancelfix).

Currently testing this fix at around 1MM requests (with both unsafe enabled and disabled) and it looks good so far.

Alex kindly offered up a more targeted patch that accomplishes the same, so I expect one of us will open a PR to fix this today.
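
For context, the general shape of “cancel the stream when the downstream call fails” is roughly the sketch below. This is not the actual patch from either branch; H2Stream, H2Msg, and cancel() are stand-in names for the h2 types and for whatever resets the stream and releases its queued frames.

    import com.twitter.finagle.{Service, SimpleFilter}
    import com.twitter.util.Future

    // Stand-ins for the h2 message/stream types (assumed shapes, not linkerd's API).
    trait H2Stream { def cancel(): Unit }
    trait H2Msg { def stream: H2Stream }

    // If the downstream call fails (e.g. a deadline-triggered Reset), cancel the
    // request stream so no Data frames stay buffered and pinned in a pooled arena.
    class CancelStreamOnFailure[Req <: H2Msg, Rep] extends SimpleFilter[Req, Rep] {
      def apply(req: Req, service: Service[Req, Rep]): Future[Rep] =
        service(req).onFailure { _ => req.stream.cancel() }
    }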

2 reactions
zackangelo commented, Apr 1, 2019

It looks like the problem might be that, under certain conditions, we’re not fully draining com.twitter.finagle.buoyant.h2.Stream instances on Reset.

When digging around in the heap and looking at only unreachable objects (i.e., those waiting to be GC’d), I found a bunch of ArrayDeque instances that still contained 1 element:

image

As far as I can tell, this should never happen: Stream instances should not be permitted to be GC’d without being drained first. In the screenshot above, the ArrayDeque instances are also hanging onto references to h2 Data frames, which in turn hold a pooled buffer reference (the screenshot says Unpooled, but that’s just the slice; the backing buffer is pooled).
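
To make the slice vs. backing-buffer point concrete, here is a small Netty sketch (independent of the linkerd code; sizes are arbitrary) showing that a slice keeps the pooled backing buffer pinned until the slice’s reference is released:

    import io.netty.buffer.PooledByteBufAllocator

    val backing = PooledByteBufAllocator.DEFAULT.directBuffer(1024)
    // slice() returns an "Unpooled...SlicedByteBuf" view, but it shares the pooled
    // buffer's reference count; the arena memory is freed only when that count hits 0.
    val slice = backing.slice(0, 512).retain() // shared refCnt: 2
    backing.release()                          // shared refCnt: 1 -> memory still pinned
    println(backing.refCnt())                  // prints 1
    slice.release()                            // shared refCnt: 0 -> returned to the arena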

As a quick test, I wanted to see if the leak would go away if I forced a drain upon finalization. I added this method to AsyncQueueReader:

    // Diagnostic only: if the reader is collected with frames still queued,
    // shout about it and release one of the buffered frames.
    override def finalize(): Unit = {
      if (frameQ.size > 0) {
        println("**************************************")
        println("queue not empty but being gc'd!!!")
        println("**************************************")

        // Poll one queued frame and release its backing buffer.
        frameQ.poll().transform {
          case Return(f) => f.release()
          case _ => Future.Done
        }; ()
      }
    }

With this method added, the leak disappears and the heap arenas appear to stay the same size. 🎉

In my current linkerd branch, I’ve removed a few things that were complicating heap analysis, like buffered writes and the buffering retry classifier. As a next step, I’m going to add them back in and see if the leak stays gone with the finalizer in place.
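
For comparison, a full drain of whatever is already buffered (the finalizer above only releases a single frame) might look like the following sketch. It reuses the frameQ/release() shapes from the snippet above and assumes the stream is already dead (e.g. after a Reset), so nothing new is being offered to the queue:

    import com.twitter.concurrent.AsyncQueue
    import com.twitter.util.{Future, Return}

    // Assumed frame shape: release() hands the backing pooled buffer back to Netty.
    trait Frame { def release(): Future[Unit] }

    // Release every frame already sitting in the queue.
    def drainQueued(frameQ: AsyncQueue[Frame]): Unit =
      while (frameQ.size > 0) {
        frameQ.poll().respond {
          case Return(f) => f.release(); ()
          case _ => ()
        }
      }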


Top Results From Across the Web

  • Fixing Java's ByteBuffer native memory "leak" (evanjones.ca)
    TL;DR: The Java NIO API caches a maximum-sized direct ByteBuffer for each thread, which looks like a native memory leak if you read...
  • [Known Issue] Out of direct memory error may occur during ...
    If you read or write very large blocks from a large number of threads, the result is a memory error that looks like...
  • MID Server runs out of memory due to memory leak.
    Cause. Heapdump analysis shows there are memory leak due to H2 Database connection leak. H2 is an in-memory Database used by Discover Patterns...
  • Troubleshoot Memory Issues in Your Java Apps
    There are two ways to detect a memory leak. First, you can wait until your app crashes and you see an OutOfMemoryError exception...
  • Memory leak messages when shutting down Apache Tomcat ...
    This is very likely to create a memory leak. Stack trace of thread: sun.misc.Unsafe.park(Native Method) java.util.concurrent.locks.
