Make large files great again
I was in the process of stress testing this app before making the switch, but I seem to have run into some problems with large files.
I don’t expect to upload files this large, but I tried uploading a 23GB MKV file through my admin ShareX config, straight to my publicip:9999 with no nginx in between. Monitoring RAM and CPU, the CPU spiked a few times throughout the upload, but RAM held steady at around 230MB total and never climbed, which is great. I watched the file grow in /uploads and started timing how long it took from the upload finishing to it giving me the URL. The upload itself was done, but it sat there for about a minute and thirty seconds without returning a URL. After that it started the upload over from the beginning; the old file stayed in my uploads folder while a new file was being written for the new transfer.
I cancelled it at that point, since it would probably just keep looping. I had tested beforehand with a one-gigabyte file and it worked all right (though there was still some delay between the upload finishing and the URL being generated). In my config I have maxSize: '150000MB',
and noJsMaxSize: '100GB',
The other main options are pretty much default.
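For reference, here is roughly how those two keys sit in the config. This is only a sketch assuming a typical config.js layout for this kind of app, and the comments are my reading of what each limit covers, not wording from the project itself:

```js
// Relevant excerpt only; every other option is left at its default.
module.exports = {
  maxSize: '150000MB',   // assumed: size limit for standard (API/ShareX) uploads
  noJsMaxSize: '100GB'   // assumed: separate limit advertised to the no-JS uploader
};
```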
Is there a way I could provide helpful debug logs for these transfers? I’m worried I might be hitting a bottleneck in Node’s architecture.
Issue Analytics
- Created: 3 years ago
- Comments: 10 (5 by maintainers)
Top GitHub Comments
Frankly, I hadn’t given that any thought before this issue. I also thought of something like that while this was going on, but I was sure Multer (the lib we currently use to parse multipart data) didn’t have a stream-based API to hook into. I just gave it another look, and indeed the stream-based API is only in RC versions at the moment. We could probably give that a try as-is, but eh, dunno. There are also solutions that involve writing our own Multer storage engine, such as this one. Probably a better choice for now.
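For context, here is a rough sketch of what a custom Multer storage engine along those lines could look like: it updates a hash on every chunk while the same stream is piped to disk, so the digest is ready the moment the upload finishes instead of requiring a second full read of a 23GB file. The class name, the MD5 choice, and the random-filename scheme are illustrative assumptions, not the project's actual code; only the `_handleFile` / `_removeFile` hooks and `file.stream` come from Multer's storage engine API.

```js
// Sketch of a hashing disk storage engine for Multer (not the project's real code).
const crypto = require('crypto');
const fs = require('fs');
const path = require('path');

class HashingDiskStorage {
  constructor (opts) {
    this.destination = opts.destination;
  }

  // Multer calls this with file.stream, a readable stream of the uploaded bytes.
  _handleFile (req, file, cb) {
    const name = crypto.randomBytes(8).toString('hex') + path.extname(file.originalname);
    const finalPath = path.join(this.destination, name);
    const hash = crypto.createHash('md5'); // md5 chosen only as an example
    const out = fs.createWriteStream(finalPath);

    // Feed every chunk to the hash as it passes through on its way to disk.
    file.stream.on('data', chunk => hash.update(chunk));
    file.stream.pipe(out);

    out.on('error', cb);
    out.on('finish', () => {
      // Properties passed here are merged into the req.file / req.files entry.
      cb(null, {
        filename: name,
        path: finalPath,
        size: out.bytesWritten,
        hash: hash.digest('hex')
      });
    });
  }

  _removeFile (req, file, cb) {
    fs.unlink(file.path, cb);
  }
}

module.exports = opts => new HashingDiskStorage(opts);
```

Usage would then be something like `multer({ storage: require('./hashing-disk-storage')({ destination: './uploads' }) })`, after which the route handler can read `req.file.hash` directly instead of re-reading the finished file.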
Aight, that sounds good to me as well.
This is incredible, thank you for adding this. After the 23GB file finished uploading, it gave me the link almost instantly. So much faster now.