Only the first sheet is returned when working with huge data sets
While working with bigger data sets (tested with 2000 columns × 1000 rows) spread over multiple sheets, once the file exceeds a certain size (in my case about 40 MB), sheet_to_row_object_array only returns the first sheet.
Is this a limitation? Are there any alternatives to get all the sheets regardless of file size?
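For reference, `sheet_to_row_object_array` comes from an older js-xlsx API; in current SheetJS builds the equivalent helper is `XLSX.utils.sheet_to_json`. The parsed workbook exposes every sheet through `SheetNames`/`Sheets`, so each one can be converted in a loop. A minimal sketch (the file name `big.xlsx` is a placeholder); this does not by itself work around the size limit discussed below, it only shows the intended per-sheet access pattern:

```js
// Minimal sketch, assuming the Node.js `xlsx` (SheetJS) package and a
// hypothetical input file "big.xlsx".
const XLSX = require("xlsx");

const workbook = XLSX.readFile("big.xlsx");

// The workbook object holds every parsed sheet, not just the first one.
const rowsBySheet = {};
for (const name of workbook.SheetNames) {
  rowsBySheet[name] = XLSX.utils.sheet_to_json(workbook.Sheets[name]);
}

for (const [name, rows] of Object.entries(rowsBySheet)) {
  console.log(`${name}: ${rows.length} rows`);
}
```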
Issue Analytics
- Created: 7 years ago
- Comments: 15 (7 by maintainers)
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
@mtharrison scratch that.
The Node.js string limit is 16 bytes shy of 256 MB: https://github.com/nodejs/node/issues/3175. I thought the previous write test case would cover it, but the underlying XML files are 60 MB.
I’m going to resolve this in the next version by trying to catch that particular error and give more informative output.
Can you do one last thing: try saving that file as XLSB in Excel and see if you hit the same problem? The XLSB parser keeps data as buffers, so it theoretically should not trigger a 256 MB string conversion.
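To illustrate the buffer-based path the maintainer describes, here is a minimal sketch, assuming the workbook has been re-saved from Excel as a hypothetical `big.xlsb`: `XLSX.read` accepts a raw Node.js Buffer, so the caller never turns the file contents into one giant string.

```js
// Minimal sketch, assuming a hypothetical "big.xlsb" produced by
// re-saving the workbook in Excel. XLSX.read accepts a raw Buffer,
// so no caller-side toString() on the file contents is needed.
const fs = require("fs");
const XLSX = require("xlsx");

const buf = fs.readFileSync("big.xlsb");
const workbook = XLSX.read(buf, { type: "buffer" });

console.log(workbook.SheetNames);
```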
Thanks @SheetJSDev for the super-quick response. I tried with the debug option and the error I get is:
Looks like I’m probably hitting V8’s max string size. I logged the buffer size out before `toString()` is called on it and it’s 356 MB.
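A minimal sketch of that check, i.e. logging the buffer size and guarding the `toString()` call. The exact maximum string length depends on the Node.js/V8 version (roughly 256 MB of characters at the time of this issue, about 512 MB in later 64-bit releases), so the constant below is an illustrative assumption rather than an exact figure.

```js
// Guard a Buffer-to-string conversion against V8's string length limit.
// 0x1fffffe8 is the limit in newer 64-bit V8 builds; older versions used
// a lower value (~256 MB), so treat this constant as an assumption.
const V8_MAX_STRING_LENGTH = 0x1fffffe8;

function safeToString(buf) {
  console.log(`buffer is ${(buf.length / 1024 / 1024).toFixed(1)} MB`);
  // Byte length is an upper bound on the decoded character count, so this
  // check is conservative.
  if (buf.length > V8_MAX_STRING_LENGTH) {
    throw new Error("buffer too large to convert to a single string");
  }
  return buf.toString("utf8");
}
```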