sharp(new Buffer(svg)) throws "corrupt headers" error whenever svg is larger than 10,000,000 bytes
Firstly, thank you for providing this robust and useful library!
We are encountering some difficulty with this piece of code:
```js
exports.svgToPng = function(svg) {
  return new Promise(async function(resolve, reject) {
    let buffer = Buffer.from(svg);
    console.log(buffer);
    await sharp(buffer)
      .png()
      .toBuffer()
      .then(data => {
        console.log('svgToPng() succeeded');
        return resolve(data);
      }).catch(err => {
        console.log('svgToPng() returned error: ' + err);
        return reject(err.stack);
      });
  });
};
```
We have a set of SVGs that we have tested this code against. It works perfectly on our smaller SVGs (less than 5.5 MB), but fails consistently with our larger SVGs (larger than 12.5 MB), with the error: Error: Input buffer has corrupt header.
We have double- and triple-checked our SVG headers and made sure that there weren’t any problems (the “<svg” tag was in the first 300 characters, nothing crucial missing). We have found that the headers of the succeeding SVGs are identical to those of the failing SVGs; the only difference is size.
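The manual check described above can be mirrored in a few lines. The helper name below is ours, not part of sharp:

```js
// Look for an "<svg" tag within the first 300 characters of the input,
// matching the by-hand header check described in this issue.
function hasSvgHeader(svg) {
  const head = Buffer.from(svg).toString('utf8', 0, 300);
  return head.includes('<svg');
}

console.log(hasSvgHeader('<?xml version="1.0"?><svg xmlns="http://www.w3.org/2000/svg"/>')); // true
```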
We have tried manipulating a number of settings, such as changing the sharp cache, altering the density property, and adding .resize() with various smaller sizes before the .png() call. None of these changed the behavior of the code. We even replaced all our sharp commands with:
```js
...
await sharp(buffer)
  .metadata()
  .then(data => {
    console.log(data);
  });
...
```
Still, the behavior was completely unchanged: large files failed with the corrupt header error, and smaller files worked just fine.
Our use case is that we have users uploading large DXF or SVG files to AWS S3, and we wish to quickly convert them to PNGs and upload those PNGs back to AWS S3.
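Since the failures reported here track input size rather than actual header corruption (the issue title cites a 10,000,000-byte threshold), a pre-flight size guard can surface a clearer error than sharp's generic "Input buffer has corrupt header". The limit value below is taken from this issue, not from any documented sharp API:

```js
// Assumed threshold from this issue's title; older librsvg/libxml2
// builds reportedly rejected SVG inputs above this size.
const SVG_BYTE_LIMIT = 10_000_000;

function checkSvgInput(svg) {
  const buffer = Buffer.from(svg);
  if (buffer.length > SVG_BYTE_LIMIT) {
    throw new Error(
      `SVG input is ${buffer.length} bytes; inputs over ` +
      `${SVG_BYTE_LIMIT} bytes may fail in older librsvg builds`
    );
  }
  return buffer;
}
```

On newer sharp/libvips/librsvg versions this guard should be unnecessary (see the maintainer comment below in the original thread).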
Issue Analytics
- Created 5 years ago
- Comments: 13 (7 by maintainers)
Top GitHub Comments
librsvg has recently been reducing its dependence on libxml2, instead (pre)processing XML using Rust, and in newer releases it looks like this problem has pretty much gone away.
For example the following 40MB input file can now happily be processed by the current/latest version of sharp/libvips/librsvg.
https://commons.wikimedia.org/wiki/File:Location_map_San_Francisco_Bay_Area.svg
I’m going to close this, as it’s unlikely we’ll need to expose this librsvg setting, and I suspect it will be removed very soon.
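Given that the fix arrived via newer librsvg releases, one way to sanity-check an installation is to inspect the bundled library versions. `sharp.versions` is a real export, but exactly which keys it contains (e.g. whether an `rsvg` entry is listed) depends on how the sharp/libvips binaries were built, so treat this as a hedged sketch:

```js
// Print the bundled libvips version, if sharp is available.
let versions = {};
try {
  versions = require('sharp').versions || {};
} catch (e) {
  // sharp is not installed in this environment
}
console.log('libvips:', versions.vips);
```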
@waghanza This issue relates to SVG support in sharp. Your issue looks like an unrelated AWS encoding problem. If you’re still having problems, please open a new issue with an example image and a standalone (i.e. no AWS dependency) code sample that reproduces this behaviour.