
parseStream on s3 stream not working - no errors, but execution halts

See original GitHub issue

Bug description
For some reason, mm.parseStream fails to work. My code doesn’t throw any exceptions, but execution halts at that point.

Expected behavior
I expected parseStream to resolve with the parsed metadata instead of hanging.

const S3 = require('aws-sdk/clients/s3');
const s3 = new S3();
const mm = require('music-metadata');

(async function() {
  // Open a readable stream for the object stored in S3
  const stream = s3
    .getObject({
      Bucket: 'my-bucket',
      Key: 'test.wav',
    })
    .createReadStream();

  // Parsing never completes: the promise neither resolves nor rejects
  const data = await mm.parseStream(stream);
  console.log('we never get here :(', data);
})();
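
One way to sidestep the streaming behavior entirely is to download the object into memory first and use parseBuffer instead of parseStream. A minimal sketch, assuming aws-sdk v2’s promise() API and that the object is small enough to fit in memory:

const S3 = require('aws-sdk/clients/s3');
const s3 = new S3();
const mm = require('music-metadata');

(async function() {
  // Fetch the whole object into memory; Body is a Buffer in aws-sdk v2
  const { Body } = await s3
    .getObject({ Bucket: 'my-bucket', Key: 'test.wav' })
    .promise();

  // parseBuffer works on an in-memory buffer, so there is no stream to
  // hang on; only viable when the object fits in memory.
  const metadata = await mm.parseBuffer(Body);
  console.log(metadata.format);
})();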

Issue Analytics

  • State: closed
  • Created: 4 years ago
  • Comments: 30 (18 by maintainers)

Top GitHub Comments

1 reaction
Borewit commented, Oct 18, 2019

“but I’m not sure how I would know what range to grab”

But music-metadata knows. Via the ITokenizer, the SomeS3orHttpRangedReader would be able to calculate this. I am considering implementing something like that, because I have already done it for the browser.

“But again, it’s not a huge deal. Using Node.js’s streams should be sufficient!”

Understood. Don’t worry, this is an area that has my interest. In addition, I see a growing number of users reading directly from cloud storage.
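
To illustrate the idea behind such a ranged reader: instead of streaming the whole object, it would fetch only the byte slices the parser asks for, using S3’s Range parameter. The readRange helper below is hypothetical (it is not part of music-metadata or its tokenizer packages); it only shows the general shape of the approach:

const S3 = require('aws-sdk/clients/s3');
const s3 = new S3();

// Hypothetical helper: fetch only bytes [offset, offset + length) of an
// S3 object via an HTTP range request, instead of streaming the whole file.
async function readRange(bucket, key, offset, length) {
  const { Body } = await s3
    .getObject({
      Bucket: bucket,
      Key: key,
      Range: `bytes=${offset}-${offset + length - 1}`,
    })
    .promise();
  return Body; // Buffer containing just the requested slice
}

// Example: read the 12-byte RIFF/WAVE header without downloading the file
(async function() {
  const header = await readRange('my-bucket', 'test.wav', 0, 12);
  console.log(header.toString('ascii', 0, 4)); // "RIFF"
})();

A tokenizer built on a helper like this would let the parser seek to the chunks it needs (format headers, tag blocks) while skipping the audio payload.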

1 reaction
Borewit commented, Oct 17, 2019

Should be solved. Sorry, reading the entire WAV via a stream is not something I can easily prevent here; see #283.
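
Related to that: when the total length of the stream is known, it can be passed to parseStream so the parser knows where the data ends. A minimal sketch, assuming a headObject call to obtain the length and that the installed music-metadata version accepts a fileInfo object as the second argument (older versions took only a MIME type string there):

const S3 = require('aws-sdk/clients/s3');
const s3 = new S3();
const mm = require('music-metadata');

(async function() {
  const params = { Bucket: 'my-bucket', Key: 'test.wav' };

  // HEAD request: returns ContentLength/ContentType without the payload
  const head = await s3.headObject(params).promise();

  const stream = s3.getObject(params).createReadStream();

  // Telling the parser the stream size lets it know where the file ends,
  // which matters for formats that are otherwise read to exhaustion.
  const metadata = await mm.parseStream(stream, {
    mimeType: head.ContentType,
    size: head.ContentLength,
  });
  console.log(metadata.format);
})();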

Read more comments on GitHub

Top Results From Across the Web

Troubleshoot data delivery failure between Kinesis and S3
I'm trying to send data from Amazon Kinesis Data Firehose to my Amazon Simple Storage Service (Amazon S3) bucket, but it fails. How...
Parse csv file from S3 using Lambda and Node Stream
The issue with your code is that it's not correctly dealing with the asynchronous nature of JavaScript. Specifically, your code is exiting ...
8 Must-Know Tricks to Use S3 More Effectively in Python
Starting from line 9, we first upload a CSV file without explicitly specifying the content type. When we then check how this object's...
Hi, I'm having issues using the "copy into" command. I'm trying ...
I'm trying to copy a file from AWS S3 bucket into a table in snowflake ... actually load data, but will attempt to...
Resolve issues with uploading large files in Amazon S3
Note: If you use the Amazon S3 console, the maximum file size for uploads is 160 GB. · Important: If you receive errors...
