Request/Response Performance
Hi, I think there is much room for improvement. Could we use this ticket to talk about performance improvements in general?
E:\Repositorys\node-nats>node benchmark\request_perf.js
Request Performance Test
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
12583 request-responses/sec
Avg roundtrip latency: 39 microseconds
Request-responses/sec: 12348 - 12583 Payload: 60 Bytes
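As a quick consistency check on the figures above (assuming requests are issued serially, one in flight at a time):

```javascript
// Throughput implied by the reported 39 us average roundtrip, if requests
// were strictly serial and latency were the only cost:
const avgLatencyUs = 39;
const serialRps = 1e6 / avgLatencyUs;   // ~25,641 request-responses/sec

// The measured ~12,583/sec instead implies ~79.5 us of wall-clock time per
// request-response, which may indicate that roughly half the per-operation
// time is client/event-loop overhead rather than round-trip latency.
const usPerOp = 1e6 / 12583;

console.log(serialRps.toFixed(0), usPerOp.toFixed(1));
```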
Testcase - https://gist.github.com/StarpTech/a8032cb07ff17da691788bfda3df75af
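The gist linked above is not reproduced here; a minimal sketch of the same request/response pattern (assuming the modern nats.js v2 API, a local nats-server on the default port, and a 60-byte payload as in the benchmark) might look like:

```javascript
// Helper to turn a count and elapsed nanoseconds into throughput/latency.
function stats(n, elapsedNs) {
  const sec = Number(elapsedNs) / 1e9;
  return { rps: n / sec, avgUs: (sec * 1e6) / n };
}

// Sketch of a request/response benchmark; requires `npm install nats`
// and a running nats-server, so it only executes when RUN_BENCH is set.
async function bench(n = 10000) {
  const { connect } = require("nats");
  const nc = await connect({ servers: "127.0.0.1:4222" });
  const payload = new Uint8Array(60); // 60-byte payload, as in the benchmark

  // Responder: echo the payload back for every request.
  const sub = nc.subscribe("bench");
  (async () => {
    for await (const m of sub) m.respond(payload);
  })();

  // Requestor: issue n serial requests and time them.
  const start = process.hrtime.bigint();
  for (let i = 0; i < n; i++) {
    await nc.request("bench", payload, { timeout: 1000 });
  }
  const { rps, avgUs } = stats(n, process.hrtime.bigint() - start);
  console.log(`${rps.toFixed(0)} request-responses/sec, avg ${avgUs.toFixed(1)} us`);
  await nc.close();
}

if (process.env.RUN_BENCH) bench();
```

Note that, as in the original testcase, requestor and responder share one process and one event loop, which is exactly the caveat raised in the comments below.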
Issue Analytics
- Created 7 years ago
- Comments: 5 (5 by maintainers)
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
I am sure we could do more if the responder were not in the same process.
But we are still at 2.5x.
The point was: even if you took Node.js without NATS and just sent a bunch of bytes over a raw socket along the same path, would you significantly beat 39 microseconds?
Requestor->Server->Responder->Server->Requestor