
Failing to read large files

See original GitHub issue

I’m trying to use Jimp to process images that are about 70-100 MB in size. However, I get the following error:

const Jimp = require('jimp')

// image is 87.7 MB
Jimp.read('big-image', (err, img) => {
  console.log(err)
})
> Error: maxMemoryUsageInMB limit exceeded by at least 226MB

Is there any way around this?

Issue Analytics

  • State: open
  • Created 3 years ago
  • Reactions: 16
  • Comments: 25

Top GitHub Comments

21 reactions
kane-mason commented, Nov 12, 2021

I found the solutions above would only partially work, and further transforms like crop would still error.

This is what worked for me:

// Wrap the default JPEG decoder so every decode call gets a higher memory cap
const cachedJpegDecoder = Jimp.decoders['image/jpeg']
Jimp.decoders['image/jpeg'] = (data) => {
  const userOpts = { maxMemoryUsageInMB: 1024 }
  return cachedJpegDecoder(data, userOpts)
}
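
For context, a minimal sketch of how this override might be applied before reading and then cropping a large JPEG. The file names, crop dimensions, and the 1024 MB limit are illustrative only, not part of the original comment:

const Jimp = require('jimp')

// Install the override before any reads so the jpeg-js decoder
// is allowed to use more memory than Jimp's default cap
const cachedJpegDecoder = Jimp.decoders['image/jpeg']
Jimp.decoders['image/jpeg'] = (data) =>
  cachedJpegDecoder(data, { maxMemoryUsageInMB: 1024 })

// Without a callback, Jimp.read returns a Promise
Jimp.read('big-image.jpg')
  .then((img) => img.crop(0, 0, 1000, 1000).writeAsync('cropped.jpg'))
  .catch((err) => console.error(err))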
15 reactions
kde12327 commented, Sep 9, 2020

I have the same problem, so I tried a fix. On this page (https://developer.aliyun.com/mirror/npm/package/jpeg-js) there is a maxMemoryUsageInMB option on the decode function.

I changed line 196 of node_modules\@jimp\core\dist\utils\image-bitmap.js from

this.bitmap = this.constructor.decoders[_mime](data);

to

this.bitmap = this.constructor.decoders[_mime](data, { maxMemoryUsageInMB: 2000 });

Then I could read the large image. Try this one.
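
For reference, the option being raised here comes from jpeg-js itself. A minimal sketch of calling its decoder directly with a higher limit, assuming the jpeg-js package is installed; the file name and the 2000 MB value are illustrative:

const fs = require('fs')
const jpeg = require('jpeg-js')

// Decode a large JPEG while allowing roughly 2 GB of working memory
const jpegData = fs.readFileSync('big-image.jpg')
const rawImageData = jpeg.decode(jpegData, { maxMemoryUsageInMB: 2000 })
console.log(rawImageData.width, rawImageData.height)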

Read more comments on GitHub >

Top Results From Across the Web

FileReader() fails to read large files - Stack Overflow
It is failing for big files as you are reading the whole file in memory before processing it. A File object is an...
Read more >
What's The Best Way to Handle Extremely Large (1GB+) Text ...
Solution 2: Split the Large File into Smaller Chunks. If your text editor can't open a 2GB text file, you can split it...
Read more >
Troubleshooting Uploading Large Files - SugarCRM Support
Attempting to upload a large file to a Sugar module fails; the page may simply continuously load, or fail with "file too large"...
Read more >
Why does FOPEN fail when trying to open very large files from ...
You can workaround this C limitation by breaking up your file into smaller sections, preferably on boundaries which are logical divisions of your...
Read more >
Problem reading large files - Talend Community
The input file is 530MB and the records are 354 bytes long. The job processes correctly for small files but throws a "java.lang.OutOfMemoryError"...
Read more >
