Very large memory usage for a 50 MB Excel file
See original GitHub issue
When I call ExcelReaderFactory.CreateReader(Stream stream), where the stream contains a 50 MB Excel file, roughly 800 MB of memory is consumed by that call (halting execution while it runs).
Is this using a DOM? I thought only a single element/row was ever loaded into memory at a time.
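For context, a minimal sketch of the usual ExcelDataReader reading loop (assuming the ExcelDataReader NuGet package and a hypothetical file name `large.xlsx`). Rows are streamed one at a time via Read(), but CreateReader itself parses workbook-level structures such as the shared-string table up front, which is one plausible source of the memory spike:

```csharp
using System;
using System.IO;
using ExcelDataReader; // NuGet: ExcelDataReader

class Program
{
    static void Main()
    {
        // Required on .NET Core/.NET 5+ for the legacy code pages used by .xls files.
        System.Text.Encoding.RegisterProvider(System.Text.CodePagesEncodingProvider.Instance);

        using var stream = File.Open("large.xlsx", FileMode.Open, FileAccess.Read);

        // CreateReader loads workbook metadata (e.g. the shared-string table) before
        // returning; the rows themselves are then read forward-only, one at a time.
        using var reader = ExcelReaderFactory.CreateReader(stream);
        do
        {
            while (reader.Read()) // advance to the next row
            {
                for (int i = 0; i < reader.FieldCount; i++)
                {
                    object value = reader.GetValue(i);
                    // process value...
                }
            }
        } while (reader.NextResult()); // advance to the next worksheet
    }
}
```

Note that memory measured right after CreateReader reflects that up-front parsing, not the per-row reading that follows.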
Issue Analytics
- State:
- Created 2 years ago
- Reactions: 1
- Comments: 27
Top Results From Across the Web
- Excel issue running large files: The first reason is loading time. If your Excel file is large (2 MB or larger) you will notice longer loading time, reduced...
- Reduce the file size of your Excel spreadsheets: If the file size of your spreadsheet is too large, try the following tips to make it more manageable.
- Clean up an Excel workbook so that it uses less memory: Identifies areas in Excel workbooks that use lots of memory and describes how you can make your workbook files work more efficiently.
- Excel File Too Large - Unable to Open: The biggest problem when you say open and delete data is the fact the file just won't open. It gets to 5%, RAM...
- Microsoft Excel 365 HUGE Memory Leak (50 GB RAM): If Excel works normally, it consumes 150-500 MB of RAM. When it starts to lag, it consumes 40+ GB. Usually the image freezes for a...
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
The Server GC trades memory in favor of throughput. Then there’s the Retain VM option that also affects how .NET frees memory back to the OS.
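Both knobs mentioned above are standard .NET runtime settings. A sketch of how they could be adjusted in the project file (the `false` values here are illustrative, not a recommendation; they map to `System.GC.Server` and `System.GC.RetainVM` at runtime):

```xml
<!-- In the application's .csproj -->
<PropertyGroup>
  <!-- Workstation GC returns memory more eagerly at some throughput cost. -->
  <ServerGarbageCollection>false</ServerGarbageCollection>
  <!-- false = release freed segments back to the OS instead of retaining them. -->
  <RetainVMGarbageCollection>false</RetainVMGarbageCollection>
</PropertyGroup>
```

With Server GC and Retain VM both enabled, a high-water-mark measurement like the 800 MB reported here can overstate what the library actually needs at any one moment.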
I have an idea for how we can change ExcelDataReader to allocate less, but I will have to test whether it has any real-world impact. It could be that the objects we can do something about are so short-lived that it doesn't matter.
@kikaragyozov I am also facing memory issues when uploading huge files. In my case the files are more than 200 MB in size.
So I decided to build a library to read huge .xlsx files. The library is mostly ready; if you try it today it should work fine. Please feel free to try it and share your feedback and bug reports. In the coming days I will push a few more changes to iron out any issues and add documentation.
Link to repository https://github.com/ArjunVachhani/XlsxHelper