[storage][test] Array buffer allocation failed in Windows Node 10
See original GitHub issue

The test failed on Windows with Node 10 but succeeded with Node 8 and Node 12 on Linux and macOS. The allocation is for a (2^31 - 1)-byte buffer, and the test needs to allocate about 4GB of memory in total. Since it still fails when I try to allocate a smaller buffer, I think this is due to the limited memory size of the test machine.
1) Highlevel
uploadStream should work when blockSize = BLOCK_BLOB_MAX_STAGE_BLOCK_BYTES:
Uncaught RangeError: Array buffer allocation failed
at new ArrayBuffer (<anonymous>)
at createUnsafeArrayBuffer (buffer.js:118:12)
at createUnsafeBuffer (buffer.js:112:25)
at allocate (buffer.js:330:12)
at Function.allocUnsafe (buffer.js:292:10)
at new PooledBuffer (D:\a\1\s\sdk\storage\storage-blob\dist-esm\storage-blob\src\utils\PooledBuffer.js:12:38)
at BufferScheduler.require.BufferScheduler.shiftBufferFromUnresolvedDataArray (D:\a\1\s\sdk\storage\storage-blob\dist-esm\storage-blob\src\utils\BufferScheduler.js:221:22)
at BufferScheduler.require.BufferScheduler.resolveData (D:\a\1\s\sdk\storage\storage-blob\dist-esm\storage-blob\src\utils\BufferScheduler.js:249:35)
at ReadStream.<anonymous> (D:\a\1\s\sdk\storage\storage-blob\dist-esm\storage-blob\src\utils\BufferScheduler.js:160:40)
at ReadStream.emit (events.js:198:13)
at ReadStream.EventEmitter.emit (domain.js:448:20)
at addChunk (_stream_readable.js:288:12)
at readableAddChunk (_stream_readable.js:269:11)
at ReadStream.Readable.push (_stream_readable.js:224:10)
at lazyFs.read (internal/fs/streams.js:181:12)
at FSReqWrap.wrapper [as oncomplete] (fs.js:467:17)
Issue Analytics
- State:
- Created 3 years ago
- Comments: 5 (5 by maintainers)
Top GitHub Comments
This is the scenario we discussed as a group and agreed it qualifies as a perf scenario which should be run through the new perf automation framework @mikeharder is standing up.
CC: @mitchdenny, @weshaggard
You could be running into the memory ceiling on the VM. We are seeing the same thing on some of the Java test runs for storage. I do question the need for this test, though.
What are you trying to prove/disprove by allocating an array that is so large?
Surely if you were transferring that much data down the wire, you'd be reading it from a stream into a much more modestly sized buffer and then pushing it down the wire. Sure, a customer might decide they want to feed in a large buffer for transmission instead of streaming, but I don't see much difference between allocating, say, a 100MB buffer vs. a 4GB buffer for the purposes of a unit test/live test.