OutOfMemoryError on corrupted png file
Hi, I’m getting an OutOfMemoryError while loading a PNG file. The file is corrupted, but this kind of error should be prevented from firing anyway.
File
File location: https://better-essay-service.com/favicon-96x96.png
Info: Supposedly a 96x96 px PNG file, actual size 8556 bytes; the content can be inspected with GIMP.
Stacktrace
java.lang.OutOfMemoryError: Java heap space
at com.drew.lang.StreamReader.getBytes(StreamReader.java:71)
at com.drew.imaging.png.PngChunkReader.extract(PngChunkReader.java:96)
at com.drew.imaging.png.PngMetadataReader.readMetadata(PngMetadataReader.java:102)
Debug info
While debugging the issue, I found that extracting chunk data works until reader position 8070 in the file. At that point, calling the SequentialReader.getInt32() method evaluates the following:
// Motorola - MSB first (big endian)
return (101 << 24 & 0xFF000000) |
       (58 << 16 & 0xFF0000) |
       (99 << 8 & 0xFF00) |
       (114 & 0xFF);
A valid int value of 1698325362 is returned. Then, in StreamReader.getBytes(int count), the allocation new byte[1698325362] fails hard.
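For illustration, the same four bytes can be decoded with java.nio.ByteBuffer, which confirms that a big-endian read of these byte values produces the oversized length. The snippet below is a standalone demo, not metadata-extractor code; the byte values are taken from the debug session above.

import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class ChunkLengthDemo {
    public static void main(String[] args) {
        // The four bytes read at position 8070, interpreted as a PNG chunk length.
        byte[] raw = {101, 58, 99, 114};

        // PNG chunk lengths are stored MSB first (big endian).
        int length = ByteBuffer.wrap(raw).order(ByteOrder.BIG_ENDIAN).getInt();

        System.out.println(length); // prints 1698325362
    }
}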
I guess some sanity check on the returned values should be in place, e.g. rejecting a chunk when the reader would have to allocate more bytes than the actual length of the file.
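A minimal sketch of such a check, assuming the number of bytes remaining in the input is known; the guardedAllocate helper is hypothetical and not part of the metadata-extractor API.

import java.io.IOException;

public class ChunkSanityCheck {

    // Hypothetical guard: refuse to allocate a chunk buffer larger than the
    // number of bytes that can still remain in the input.
    static byte[] guardedAllocate(int declaredLength, long bytesRemaining) throws IOException {
        if (declaredLength < 0 || declaredLength > bytesRemaining) {
            throw new IOException("Declared chunk length " + declaredLength
                    + " exceeds remaining input (" + bytesRemaining + " bytes); file is likely corrupt");
        }
        return new byte[declaredLength];
    }

    public static void main(String[] args) throws Exception {
        try {
            // 8556-byte file with the reader at position 8070: at most 486 bytes remain.
            guardedAllocate(1698325362, 8556 - 8070);
        } catch (IOException e) {
            System.out.println(e.getMessage()); // rejected cleanly instead of throwing OutOfMemoryError
        }
    }
}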
Possibly related to https://github.com/drewnoakes/metadata-extractor/issues/273 ?
Top GitHub Comments
#535 has been merged (thanks again).
This is a good strategy. It’s not always straightforward to know how large the file is, however. For example, we may be streaming data with unknown length over a network.
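When the total length is not known up front, one common defensive pattern is to read the declared chunk in bounded pieces, so a bogus length fails with an EOFException as soon as the stream runs dry instead of triggering one huge upfront allocation. A rough sketch of that idea follows; readChunkBounded is hypothetical and not metadata-extractor code.

import java.io.ByteArrayOutputStream;
import java.io.EOFException;
import java.io.IOException;
import java.io.InputStream;

public class BoundedChunkReader {

    // Reads declaredLength bytes in 8 KB pieces instead of allocating the whole
    // buffer up front. If the declared length is corrupt and exceeds the data
    // actually available, this throws EOFException once the stream ends, having
    // buffered only the bytes that really existed.
    static byte[] readChunkBounded(InputStream in, int declaredLength) throws IOException {
        if (declaredLength < 0) {
            throw new IOException("Negative chunk length: " + declaredLength);
        }
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] piece = new byte[8192];
        int remaining = declaredLength;
        while (remaining > 0) {
            int read = in.read(piece, 0, Math.min(piece.length, remaining));
            if (read == -1) {
                throw new EOFException("Stream ended " + remaining
                        + " bytes short of declared chunk length " + declaredLength);
            }
            out.write(piece, 0, read);
            remaining -= read;
        }
        return out.toByteArray();
    }
}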