
Unable to respond with a Stream

See original GitHub issue

I’m unable to work out how to reply with a streamed response using Botwin. Reading up on ASP.NET Core, it looks like I’d be able to do the following:

[HttpGet]
public FileStreamResult GetTest()
{
  var stream = new MemoryStream(Encoding.ASCII.GetBytes("Hello World"));
  return new FileStreamResult(stream, new MediaTypeHeaderValue("text/plain"))
  {
    FileDownloadName = "test.txt"
  };
}

(https://stackoverflow.com/questions/42771409/how-to-stream-with-asp-net-core#42772150)

However, I can’t work out how to do the same thing with the HttpResponse available in the BotwinModule handler.

Here’s a snippet of my module, but it doesn’t return any data:

var models = this.journalReader.Read(start, end, type, ct);

var fileStream = new FileStreamResult(CreateExportStream(models, requestLog, ct), "application/csv")
{
    FileDownloadName = $"journal-{type}.csv"
};

res.Body = fileStream.FileStream;
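A minimal sketch of the direct approach, assuming `res` is the standard ASP.NET Core HttpResponse and reusing the `CreateExportStream` helper from the snippet above: rather than replacing `res.Body` (which doesn’t copy anything), copy the export stream into it.

// Sketch, not the confirmed fix: copy the export stream into the
// response body instead of assigning over res.Body.
// Assumes `res` is Microsoft.AspNetCore.Http.HttpResponse and
// CreateExportStream is the helper shown in the snippet above.
res.ContentType = "text/csv";
res.Headers["Content-Disposition"] = $"attachment; filename=journal-{type}.csv";

using (var export = CreateExportStream(models, requestLog, ct))
{
    await export.CopyToAsync(res.Body, 81920, ct);
}

Note this still relies on `CopyToAsync` reading the source stream to completion, which is exactly the limitation the comment below runs into.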

Issue Analytics

  • State: closed
  • Created: 5 years ago
  • Comments: 23 (8 by maintainers)

Top GitHub Comments

1 reaction
bazwilliams commented, Mar 23, 2018

This works perfectly for streams which can be processed in memory, but unfortunately in my case, this doesn’t fix the issue. I’m using a producer/consumer pattern to minimise the memory requirements of my service - the data coming from the database could be many hundreds of MB and I’d rather stream it directly to the client.

This means I’m not likely to have the entire contents of the stream in memory at once, which is the cause of the blocking: StreamCopyOperation.CopyToAsync won’t start copying until the source stream has completed.

The solution (with and without hacky delay) doesn’t work for streams larger than my producer/consumer buffer size. 😦
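The producer/consumer shape described above can be sketched like this, writing each CSV row to the response body as soon as it is produced so the full payload never has to fit in memory. This is an illustrative sketch only: `ReadRowsAsync` is a hypothetical stand-in for the journal reader, and System.Threading.Channels postdates this thread, though it expresses the same bounded-buffer pattern.

// Sketch of a bounded producer/consumer pipeline that streams rows
// straight to the client. ReadRowsAsync is a hypothetical helper;
// `res` is assumed to be the ASP.NET Core HttpResponse.
var rows = System.Threading.Channels.Channel.CreateBounded<string>(capacity: 100);

// Producer: pull rows from the database into the bounded channel.
var producer = Task.Run(async () =>
{
    await foreach (var row in ReadRowsAsync(start, end, type, ct))
        await rows.Writer.WriteAsync(row, ct);
    rows.Writer.Complete();
});

// Consumer: flush each row to the client immediately, so memory use
// is bounded by the channel capacity rather than the payload size.
res.ContentType = "text/csv";
await foreach (var row in rows.Reader.ReadAllAsync(ct))
{
    await res.WriteAsync(row + "\n", ct);
    await res.Body.FlushAsync(ct);
}
await producer;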

1 reaction
jchannon commented, Mar 22, 2018

Might not be able to, just a brain dump for now 😄


