
ctx.Response.Send multiple times per connection?


Hello Joel,

Me again (; Quick question: is there a way to use the Send command multiple times in a session? I'm trying to stream MJPEG over HTTP 1.1 (the Chunk method works perfectly), but I don't want to end the session after Send.

I.e.

byte[] jpeg = null;
while (true)
{
    jpeg = Get_Next_JPEG();                                      // grab the frame first so jpeg.Length is valid
    ctx.Response.Send(Encoding.ASCII.GetBytes(MJPEGHeader));     // per-frame boundary/header block
    ctx.Response.Send(Encoding.ASCII.GetBytes("Content-Length: " + jpeg.Length));
    ctx.Response.Send(jpeg);                                     // JPEG payload
    ctx.Response.Send(Encoding.ASCII.GetBytes(MJPEGFooter));     // trailing delimiter before the next frame
}
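
For reference, here is a minimal sketch of what that loop could look like using chunked transfer and a multipart/x-mixed-replace response, which is how MJPEG over HTTP is usually framed. The ChunkedTransfer property and SendChunk calls are assumptions based on the "Chunk method" mentioned above, and HttpContext stands in for the server's context type; exact member names may differ in HttpServerLite/WatsonWebserver.

using System.Text;
using System.Threading.Tasks;

// Minimal sketch, not the library's documented API: ChunkedTransfer and
// SendChunk are assumed Watson-style members, and Get_Next_JPEG is the
// frame source from the example above.
static async Task StreamMjpeg(HttpContext ctx)
{
    const string boundary = "mjpegframe";  // arbitrary part delimiter

    ctx.Response.StatusCode = 200;
    ctx.Response.ContentType = "multipart/x-mixed-replace; boundary=" + boundary;
    ctx.Response.ChunkedTransfer = true;   // keep the connection open between frames

    while (true)
    {
        byte[] jpeg = Get_Next_JPEG();     // grab the frame before using jpeg.Length

        // Each frame is one multipart part: boundary line, part headers, blank line, body.
        string partHeader =
            "--" + boundary + "\r\n" +
            "Content-Type: image/jpeg\r\n" +
            "Content-Length: " + jpeg.Length + "\r\n" +
            "\r\n";

        await ctx.Response.SendChunk(Encoding.ASCII.GetBytes(partHeader));
        await ctx.Response.SendChunk(jpeg);
        await ctx.Response.SendChunk(Encoding.ASCII.GetBytes("\r\n"));  // end of this part
    }
}

Because the response is multipart/x-mixed-replace, the client keeps rendering parts as they arrive and the response is never "finished" until the connection is closed.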

Thanks and sorry to bother so much!

Issue Analytics

  • State: closed
  • Created 3 years ago
  • Comments: 22 (16 by maintainers)

Top GitHub Comments

1 reaction
jchristn commented, Jun 30, 2020

0 reactions
jchristn commented, Jun 30, 2020

Just some performance data for reference. WatsonWebserver smokes HttpServerLite in performance, but that's to be expected - http.sys (used by Watson) is in the kernel, whereas HttpServerLite is 100% in user space.

This is a simple test of 10 clients and 1000 requests against a loopback endpoint (no content delivered). HttpServerLite is the top set, Watson is the bottom set. Net-net, Watson shows roughly 5x lower average latency and about 3x the throughput.

C:\Users\joelc\Downloads>bombardier-windows-amd64.exe -c 10 -n 1000 http://localhost:9000
Bombarding http://localhost:9000 with 1000 request(s) using 10 connection(s)
 1000 / 1000 [======================================================================================] 100.00% 2481/s 0s
Done!
Statistics        Avg      Stdev        Max
  Reqs/sec      2683.78     384.07    3058.27
  Latency        3.71ms   577.70us    12.97ms
  HTTP codes:
    1xx - 0, 2xx - 1000, 3xx - 0, 4xx - 0, 5xx - 0
    others - 0
  Throughput:     1.01MB/s

C:\Users\joelc\Downloads>bombardier-windows-amd64.exe -c 10 -n 1000 http://localhost:9000
Bombarding http://localhost:9000 with 1000 request(s) using 10 connection(s)
 1000 / 1000 [======================================================================================] 100.00% 4954/s 0s
Done!
Statistics        Avg      Stdev        Max
  Reqs/sec     17345.60   15501.15   37627.71
  Latency      598.27us     1.42ms    32.91ms
  HTTP codes:
    1xx - 0, 2xx - 1000, 3xx - 0, 4xx - 0, 5xx - 0
    others - 0
  Throughput:     3.19MB/s

The second test served some content (a JPG image), again 1000 requests over 10 clients. Watson (bottom test) again showed roughly 3x lower latency and more than 3x the throughput.

C:\Users\joelc\Downloads>bombardier-windows-amd64.exe -c 10 -n 1000 http://localhost:9000/img/watson.jpg
Bombarding http://localhost:9000/img/watson.jpg with 1000 request(s) using 10 connection(s)
 1000 / 1000 [======================================================================================] 100.00% 1243/s 0s
Done!
Statistics        Avg      Stdev        Max
  Reqs/sec      1441.68     211.20    1854.85
  Latency        6.91ms     1.00ms    16.95ms
  HTTP codes:
    1xx - 0, 2xx - 1000, 3xx - 0, 4xx - 0, 5xx - 0
    others - 0
  Throughput:   203.24MB/s

C:\Users\joelc\Downloads>bombardier-windows-amd64.exe -c 10 -n 1000 http://localhost:9000/img/watson.jpg
Bombarding http://localhost:9000/img/watson.jpg with 1000 request(s) using 10 connection(s)
 1000 / 1000 [======================================================================================] 100.00% 2470/s 0s
Done!
Statistics        Avg      Stdev        Max
  Reqs/sec      4774.15    2052.52    7068.31
  Latency        2.12ms     2.11ms    49.48ms
  HTTP codes:
    1xx - 0, 2xx - 1000, 3xx - 0, 4xx - 0, 5xx - 0
    others - 0
  Throughput:   663.02MB/s

I guess living in the kernel has its advantages. HttpServerLite should still be pretty fast given that it runs entirely in user space. I only have a few more things left to do to it and will report back here when it's ready. I'm unlikely to make it the replacement for the innards of WatsonWebserver, though, because of the tremendous performance difference.
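
For reference, the no-content loopback test above would be hitting something like the following default route. This is only a minimal sketch assuming a Watson-style Server constructor and a parameterless Send(); the exact signatures vary between WatsonWebserver and HttpServerLite versions, so treat it as illustrative rather than the actual benchmark code.

using System;
using System.Threading.Tasks;
using WatsonWebserver;   // HttpServerLite exposes a similar Server/HttpContext shape

class Program
{
    static void Main()
    {
        // Listen on localhost:9000 to match the bombardier target above.
        // The constructor/Start pattern is an assumption; adjust to the version in use.
        Server server = new Server("localhost", 9000, false, DefaultRoute);
        server.Start();
        Console.ReadLine();  // keep the process alive while bombardier runs
    }

    static async Task DefaultRoute(HttpContext ctx)
    {
        // 200 with no body, i.e. "no content delivered".
        ctx.Response.StatusCode = 200;
        await ctx.Response.Send();
    }
}

With the server running, the same bombardier command shown above (-c 10 -n 1000 against http://localhost:9000) reproduces the shape of this test.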
