The table gets slow or may even fail on large datasets
Hi,
The table gets progressively slower when displaying large datasets, and at a certain data size it fails with an exception. The cause is the deep-copy approach used in src/store/index.js at line 80:
JSON.parse(JSON.stringify(data))
Please consider replacing this with a cheaper alternative, such as a shallow copy with [...data].
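For comparison, here is a minimal Node.js sketch of the cost difference (the row shape and counts are illustrative, not the library's actual data model):

```js
// Illustrative benchmark: JSON round-trip deep copy vs. spread shallow copy.
// The row shape below is made up for this example.
const rows = Array.from({ length: 15000 }, (_, i) => ({
  id: i,
  name: `row-${i}`,
  value: Math.random(),
}));

console.time('JSON.parse(JSON.stringify(data))');
const deepCopy = JSON.parse(JSON.stringify(rows)); // serializes every field of every row
console.timeEnd('JSON.parse(JSON.stringify(data))');

console.time('[...data]');
const shallowCopy = [...rows]; // copies only the top-level array of references
console.timeEnd('[...data]');

console.log(deepCopy.length === shallowCopy.length); // true, both 15000
```

One caveat: [...data] shares the row objects with the original array, so it only isolates the array itself; if the store mutates individual rows, a shallow copy alone would not protect callers from those mutations. The JSON round-trip is not a faithful deep copy either, since it silently drops undefined values and functions and stringifies Dates.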
Cheers, Petar
Issue Analytics
- Created: 5 years ago
- Reactions: 5
- Comments: 7 (3 by maintainers)

I have a similar scenario to @F-Adam-B's: 30+ columns with 5,000-10,000 rows (it includes ExpandedArea too), and it's taking around 30+ seconds to render. Any help?
I'm also using a large data set and running into sluggish rendering issues. @AllenFang, will the upgrade to react-bootstrap-table-next@1.3.0 and react-bootstrap-table2-editor@1.2.0 be able to handle a table with 27 columns and 13,000 to 15,000 values in each column?
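A workaround often suggested for tables of this size (not an official answer from the maintainers in this thread) is to paginate, so React only mounts one page of rows at a time, e.g. via react-bootstrap-table2-paginator. A minimal sketch, assuming an id key field and illustrative column definitions:

```jsx
// Hypothetical usage sketch: with pagination, only ~25 rows are rendered per page,
// even though the full 13k-15k row array stays in memory.
import React from 'react';
import BootstrapTable from 'react-bootstrap-table-next';
import paginationFactory from 'react-bootstrap-table2-paginator';

const columns = [
  { dataField: 'id', text: 'ID' },
  { dataField: 'name', text: 'Name' },
  // ...remaining column definitions
];

const LargeTable = ({ data }) => (
  <BootstrapTable
    keyField="id"
    data={data}
    columns={columns}
    pagination={paginationFactory({ sizePerPage: 25 })}
  />
);

export default LargeTable;
```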