Problems with large blobs (10 GB)
Hi,
We are developing a web application that manages large files (PSB files up to 10 GB), and when uploading these files we always receive the same error:
PUT https://xxxxxxxxcpn226pa5vnoa.blob.core.windows.net/scenes/a3e7881d-b886-431e-9644-5b554eaab7aa/90mm_Atlantis_Afternoon_sphere.psb?comp=block&blockid=MDg5NDg0NzgtMDAwMzQx&sv=2017-04-17&sr=c&sig=7iDoryKY8nyUVWp4VRGFIP1A8ydNtkuUlQEkHkzArLw%3D&st=2017-12-13T15:20:18Z&se=2017-12-14T15:25:18Z&sp=cw&api-version=2017-04-17 net::ERR_FILE_NOT_FOUND
startup.bundle.js?=c0gh5z2x95lj697cutvpqkyzv9:1 Uncaught TypeError: Cannot read property 'getAutoIncrementFunction' of null
at ChunkStreamWithStream.<anonymous> (azure-storage.blob.js?=k3t6a3p5o2bj7mqh2bglod5u8r:7111)
at ChunkStreamWithStream.EventEmitter.emit (azure-storage.common.js?=yijzqfmvy8am964fztaetjtvyv:37703)
at ChunkStreamWithStream.ChunkStream._emitBufferData (azure-storage.common.js?=yijzqfmvy8am964fztaetjtvyv:1251)
at ChunkStreamWithStream.ChunkStream._buildChunk (azure-storage.common.js?=yijzqfmvy8am964fztaetjtvyv:1217)
at BrowserFileReadStream.EventEmitter.emit (azure-storage.common.js?=yijzqfmvy8am964fztaetjtvyv:37700)
at readableAddChunk (azure-storage.common.js?=yijzqfmvy8am964fztaetjtvyv:35932)
at BrowserFileReadStream.Readable.push (azure-storage.common.js?=yijzqfmvy8am964fztaetjtvyv:35891)
at FileReader.BrowserFileReadStream._fileReader.onloadend (azure-storage.common.js?=yijzqfmvy8am964fztaetjtvyv:363)
at ZoneDelegate.invoke (polyfills.bundle.js?=thw4y640ea233fojkygkj6anox:1)
at Object.onInvoke (startup.bundle.js?=c0gh5z2x95lj697cutvpqkyzv9:1)
at ZoneDelegate.invoke (polyfills.bundle.js?=thw4y640ea233fojkygkj6anox:1)
at Zone.runGuarded (polyfills.bundle.js?=thw4y640ea233fojkygkj6anox:1)
at FileReader.<anonymous> (polyfills.bundle.js?=thw4y640ea233fojkygkj6anox:1)
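For reference, our upload code looks roughly like the sketch below (the account URL, SAS token variable, and block size are placeholders, not the exact values from the trace above):

```js
// Rough sketch of the upload call, using the azure-storage browser bundle.
// blobUri and sasToken are placeholders; 'scenes' matches the container in
// the failing PUT request above.
var blobUri = 'https://<account>.blob.core.windows.net';
var blobService = AzureStorage.Blob.createBlobServiceWithSas(blobUri, sasToken);

// file is the File object taken from an <input type="file"> element.
var speedSummary = blobService.createBlockBlobFromBrowserFile(
  'scenes',                       // container name
  file.name,                      // blob name
  file,                           // browser File object
  { blockSize: 4 * 1024 * 1024 }, // upload in 4 MB blocks
  function (error, result, response) {
    if (error) {
      console.error('Upload failed:', error);
    }
  });
```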
Any idea about this error?
Regards!
Issue Analytics
- State:
- Created: 6 years ago
- Comments: 13 (6 by maintainers)
Top Results From Across the Web

Should I Store Files in the Database? - wiseDATAman.com
If you don't expect to store very large volumes of BLOB data some of these issues might not apply. Evaluate the information in...

Is it possible to dynamically produce large files (10Gb) and ...
I've implemented this by creating a Blob of data and using an objectUrl to download them. (example from: Download large files in Dartlang):...

big files above 10 GB does not get uploaded properly (git lfs)
Using git lfs, I am trying to upload a lot of big data files. Most went up with no problems, but 1 files...

How to Upload Large Files in Azure Blob in chunks from using ...
Hi Team, I have tried to upload large files from the LWC componet in chunks. But not found any call-back URL for uploading...

Avoid Git LFS if possible - Hacker News
git supports large files, it just can't track changes in binary files efficiently and if they're large you check in a new blob...
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
@cdoneshot Glad to see that! Upcoming versions of the SDK will include this fix.
@cdoneshot Old versions of Chromium (the QQ browser currently uses a very old Chromium core) have a bug with local file reading: they fail to garbage-collect unused memory, which can lead to a net::ERR_FILE_NOT_FOUND error. Refer to https://github.com/jhiesey/stream-http/issues/57
Please try this version, which has a workaround for the issue: https://1drv.ms/u/s!AtKGeL9cTFgWgtcre3izmixWzq1Buw
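For context, the usual way to keep memory bounded when reading a very large file in the browser is to read the File in slices, so each chunk's buffer can be reclaimed before the next read. The sketch below illustrates the pattern only; the chunk size and the onChunk/onDone callbacks are hypothetical names, not the SDK's actual workaround:

```js
// Read a large File in slices so each chunk buffer can be garbage-collected,
// instead of keeping the whole 10 GB file in memory at once.
var CHUNK_SIZE = 4 * 1024 * 1024; // 4 MB per read (illustrative value)

function readInChunks(file, onChunk, onDone) {
  var offset = 0;
  var reader = new FileReader();

  reader.onloadend = function (event) {
    if (event.target.error) {
      console.error('Read failed:', event.target.error);
      return;
    }
    onChunk(event.target.result, offset); // e.g. send this block to storage
    offset += CHUNK_SIZE;
    if (offset < file.size) {
      readNext();
    } else {
      onDone();
    }
  };

  function readNext() {
    // Slice first, then read: the previous ArrayBuffer is no longer
    // referenced, so older Chromium builds can reclaim its memory.
    reader.readAsArrayBuffer(file.slice(offset, offset + CHUNK_SIZE));
  }

  readNext();
}
```

In a block blob upload, each such chunk is sent as a Put Block request and the blob is finalized with Put Block List, which is what the comp=block&blockid= requests in the trace above correspond to.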