Problem with the .gz file size?
- Etcher version: 1.0.0-beta16
- Operating system and architecture: OSX 10.11.6
I make backups of SD cards with this script in the Terminal:
diskutil unmountDisk /dev/disk1
sudo dd bs=1m if=/dev/rdisk1 | gzip -c > ~/Desktop/pi.img.gz
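(A hedged aside, not from the original report: if you want to verify such a backup, decompressing the archive and comparing it against the raw card should work; the device path is the same assumption as above.)

# No output means the archive and the card match byte-for-byte:
gzip -dc ~/Desktop/pi.img.gz | sudo cmp /dev/rdisk1 -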
When I try to flash these .gz images, Etcher seems to get stuck at 99% of the process: the ETA remains at 2 or 3 seconds and the transfer rate very slowly decreases, until I get the error:
“Not enough space on the drive. Please insert larger one and try again”.
The same error appears with the uncompressed image file, but without the final slow-down.
However, on the command line, with:
gzip -dc ~/Desktop/pi.img.gz | sudo dd bs=1m of=/dev/rdisk1
there’s no error and the resulting SD card works well. Am I doing something wrong, or is there a problem with Etcher?
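(A small sketch, assuming the same path as above: before blaming the flasher, gzip itself can test the archive for corruption without writing anything.)

# Exits non-zero and prints an error if the compressed stream is damaged:
gzip -t ~/Desktop/pi.img.gz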
Issue Analytics
- Created 7 years ago
- Comments: 10 (7 by maintainers)
Top Results From Across the Web
gz doesn't return correct file size for content above 2^32 B = 4 ...
If using a card that CAPACITY > SIZE, then the only effect is that the "speed" bar is wrong, but everything else...

gzip --list gives wrong uncompressed file size for large files
This problem is due to an inherent property of the gzip file format. The gzip manual says: The `gzip' format represents the...

gzip -l returning incorrect values for uncompressed file size
The problem is that the data I'm receiving from gzip -l is invalid. Mostly it seems the uncompressed size is too small, in...

Uncompressed file estimation wrong? - Unix Stack Exchange
This is caused by the size of the field used to store the uncompressed size in gzipped files: it's only 32 bits, so...

How can I get the uncompressed size of gzip file without ...
Unfortunately, the only way to know is to extract it and count the bytes. gzip files do not properly report ...
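(A minimal sketch of that wrap-around, assuming the >4 GiB pi.img.gz from the report: the gzip header stores the uncompressed size in a 32-bit field, ISIZE in RFC 1952, so listing and counting disagree.)

# Reports the uncompressed size modulo 2^32, so a ~16 GB image looks tiny:
gzip -l ~/Desktop/pi.img.gz
# The only reliable size comes from decompressing and counting the bytes:
gzip -dc ~/Desktop/pi.img.gz | wc -c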
Top GitHub Comments
Slightly off-topic: there is actually a use-case for that - when you’re doing something like

dd if=/dev/zero of=/dev/sdc

to completely wipe out the contents of a drive. dd is a very low-level tool, which means it’s easy to abuse or get things wrong if you’re not very careful 😉

Back on-topic: specifically, the bug that was fixed in v1.0.0-beta.17 that @jviotti is referring to is that previously Etcher would only write images whose size was an integer number of megabytes. 15931539456 bytes equates to 15193.5 MB, and Etcher v1.0.0-beta.17 now copes with the extra half-megabyte on the end: https://github.com/resin-io-modules/etcher-image-write/pull/58
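(A quick way to check whether an image falls on a megabyte boundary - a sketch assuming the BSD stat that ships with OSX; pi.img is a hypothetical uncompressed image name.)

SIZE=$(stat -f%z ~/Desktop/pi.img)    # on GNU/Linux: stat -c%s
echo "scale=1; $SIZE / 1048576" | bc  # 15931539456 -> 15193.5, i.e. not a whole number of MB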
Nah, I think the snippet you’ve quoted basically says that the output file may not be an exact multiple of the block-size. So the output file will always be the exact size of the input file (the raw SD card in your case), regardless of the block size.
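(An illustrative sketch, not from the thread - the file names are made up: dd writes a final partial block rather than padding, so the copy is exactly the input’s size even with bs=1m.)

dd if=/dev/zero of=test.img bs=1000 count=1500   # 1,500,000 bytes, not a multiple of 1 MiB
dd if=test.img of=copy.img bs=1m                 # copies one full block plus a partial block
stat -f%z test.img copy.img                      # both print 1500000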
I think
My shitty USB/SD card adaptor died
may be the cause of the errors in the log you posted 😉 So you were actually seeing two problems at once: errors in your logs because of a shitty SD adaptor, and the “Not enough space on the drive” error caused by Etcher beta16 not coping with half-megabyte-sized disk images.

Hopefully both problems will be fixed by Etcher beta17 and a new SD adaptor 😃