@grpc/grpc-js: Closing Stream After Unary Call Triggers Double Free
Problem description
Attempting to close the client’s connections to a server after making a single unary call triggers a panic in Node (this may actually be a bug in Node, it’s not clear to me).
Reproduction steps
I have a client class that manages a couple of gRPC streams (_events and _tunnelStream) that stay open for the lifetime of the client. The client also makes unary calls. After making any number of unary calls (1 or more), attempting to disconnect with the function below triggers a double free in what appears to be the Node.js HTTP/2 library:
async disconnect(): Promise<void> {
  if (this._events !== null) {
    this._events.on('error', () => {})
    this._events.cancel()
  }
  if (this._tunnelStream !== null) {
    this._tunnelStream.on('error', () => {})
    this._tunnelStream.cancel()
  }
  this.rpc.close()
}
Environment
- OS: macOS / Ubuntu 20.04
- Node version: 14
- Package name and version: @grpc/grpc-js ^1.0.3
Issue Analytics
- Created: 3 years ago
- Reactions: 1
- Comments: 9 (5 by maintainers)
Top GitHub Comments
Wrapping the .cancel() call with setTimeout/setImmediate seems to mitigate the error.

That is definitely a bug in Node itself. You have more information about how you triggered the bug and your environment, so you should file an issue in that repository (https://github.com/nodejs/node) and link to it from here. I suggest that when you do, you share the error information in the form of text so that it's easier to interact with.
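The mitigation described in the comment above can be sketched roughly as follows. This is a hypothetical illustration, not code from the issue: ClientStream is a stand-in interface for a grpc-js client stream, and deferredCancel is an invented helper name. The idea is simply to push each cancel() onto the next event-loop turn with setImmediate so the HTTP/2 session is not torn down while a just-completed unary call is still unwinding.

```typescript
// Minimal stand-in for the parts of a grpc-js client stream used here.
// Real streams come from the generated gRPC client, not this stub.
interface ClientStream {
  on(event: string, cb: (...args: unknown[]) => void): void;
  cancel(): void;
}

// Hypothetical helper: swallow the expected CANCELLED error, then defer
// the actual cancel() to the next event-loop turn via setImmediate,
// which is the workaround reported to mitigate the double free.
function deferredCancel(stream: ClientStream | null): Promise<void> {
  return new Promise((resolve) => {
    if (stream === null) {
      resolve();
      return;
    }
    stream.on('error', () => {}); // ignore the cancellation error
    setImmediate(() => {
      stream.cancel();
      resolve();
    });
  });
}
```

A disconnect() method would then await deferredCancel(this._events) and deferredCancel(this._tunnelStream) before calling this.rpc.close(), so the channel is only closed after both cancellations have run on a later tick.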