directoryOpen gives up on large directories
This seems absurd, but I have a use case for the File System Access API where I pass it a game folder and it recursively reads the whole folder for all the files needed to load a map into a WebGL instance.
The problem is that the folder is ~2.5 GB and contains ~17,000 files. About 2 times out of 10, directoryOpen returns an array with all ~17,000 files; the rest of the time the function seems to walk all the folders but then stops and never returns. This works 100% of the time in Firefox and Safari, but in Chrome (latest stable and beta) it rarely works.
There are no errors or warnings in the console; it just stops and never resolves. Sometimes it reads all the folders successfully but still doesn't return; sometimes it stops a few folders short of the total. This also happens occasionally with ~2,000 files, but not as often.
Is this something to do with too many promises being created at once, is it a deeper issue, or is there some maximum limit in the API spec?
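One way to test the "too many promises at once" theory is to cap how many file reads are in flight at a time. This is a hypothetical helper, not code from the report; `mapWithConcurrency` and its parameters are my own names:

```typescript
// Hypothetical helper: map an async function over items while keeping at most
// `limit` promises in flight, instead of firing them all at once.
async function mapWithConcurrency<T, R>(
  items: T[],
  limit: number,
  fn: (item: T) => Promise<R>,
): Promise<R[]> {
  const results: R[] = new Array(items.length);
  let next = 0;

  // Each worker pulls the next unclaimed index and processes it sequentially.
  async function worker(): Promise<void> {
    while (next < items.length) {
      const i = next++;
      results[i] = await fn(items[i]);
    }
  }

  // Spawn up to `limit` workers and wait for all of them to drain the queue.
  await Promise.all(
    Array.from({ length: Math.min(limit, items.length) }, worker),
  );
  return results;
}
```

If the hang disappears with a small limit (say, 8–16 concurrent reads), that would point at Chrome struggling with thousands of simultaneous file-handle promises rather than at directoryOpen itself.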
Another observation: introducing an artificial delay with something like

```typescript
const delay = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms));
await delay(100);
```

greatly increases the chance of the function returning anything, though it obviously takes far longer to load the files. Any ideas? Thanks!
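Since throttling seems to help, a more systematic way to apply it (a sketch, not the reporter's actual code; `processInBatches` and its parameters are invented for illustration) is to process entries in small batches with a short pause between batches:

```typescript
const delay = (ms: number) =>
  new Promise<void>((resolve) => setTimeout(resolve, ms));

// Hypothetical pacing wrapper: handle `batchSize` items at a time, sleeping
// `pauseMs` between batches so the browser gets breathing room.
async function processInBatches<T>(
  items: T[],
  batchSize: number,
  pauseMs: number,
  fn: (item: T) => Promise<void>,
): Promise<void> {
  for (let i = 0; i < items.length; i += batchSize) {
    await Promise.all(items.slice(i, i + batchSize).map(fn));
    if (i + batchSize < items.length) {
      await delay(pauseMs);
    }
  }
}
```

This keeps the total added delay proportional to the number of batches rather than the number of files, so the slowdown is much smaller than sleeping before every single read.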
Issue Analytics
- Created: 3 years ago
- Comments: 5 (3 by maintainers)
Top GitHub Comments
Thanks for the bug report. I have routed it to the correct team and will close this issue in favor of the Chrome bug.
Here is the relevant block: https://github.com/x8BitRain/webhl/blob/master/src/components/FileLoader.tsx#L46, and the Chromium bug: https://bugs.chromium.org/p/chromium/issues/detail?id=1176294