Stuck on an issue?

Lightrun Answers was designed to reduce the constant googling that comes with debugging third-party libraries. It collects links to all the places you might be looking while hunting down a tough bug.

And, if you’re still stuck at the end, we’re happy to hop on a call to see how we can help out.

The table gets slow or may even fail on large datasets

See original GitHub issue

Hi,

The table gets progressively slower when displaying large data sets, and at a certain data size it fails with an exception. The cause is the deep-copy approach used in src/store/index.js at line 80: JSON.parse(JSON.stringify(data)). Please consider changing this to a more efficient solution, such as [...data].

Cheers, Petar
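Petar's point is the cost difference between the two copy strategies. Below is a minimal, hypothetical sketch (not the library's actual store code) that contrasts a JSON round-trip deep copy with a spread shallow copy on a synthetic dataset roughly the size mentioned in the comments further down.

// Hypothetical benchmark sketch: synthetic rows, not data from the library.
const rows = Array.from({ length: 10000 }, (_, i) => ({
  id: i,
  name: `row-${i}`,
  cells: Array.from({ length: 27 }, (_, c) => `cell-${i}-${c}`),
}));

// Deep copy via JSON round-trip: serializes and re-parses every cell, so the
// cost grows with the total number of cells, and it throws on values JSON
// cannot represent (undefined, functions, circular references).
console.time('JSON deep copy');
const deepCopy = JSON.parse(JSON.stringify(rows));
console.timeEnd('JSON deep copy');

// Shallow copy via spread: allocates a new array but reuses the existing row
// objects, so the cost grows only with the number of rows.
console.time('spread shallow copy');
const shallowCopy = [...rows];
console.timeEnd('spread shallow copy');

The trade-off is that the shallow copy shares the row objects with the original array, so mutating a row through the copy also mutates the source; whether that is acceptable depends on how the store uses the copied data.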

Issue Analytics

  • State: closed
  • Created: 5 years ago
  • Reactions: 5
  • Comments: 7 (3 by maintainers)

Top GitHub Comments

2 reactions
imtmh commented on Aug 21, 2019

I have a similar scenario to @F-Adam-B's: 30+ columns with 5,000-10,000 rows (it includes ExpandedArea too). It's taking around 30+ seconds to render. Any help?

2 reactions
F-Adam-B commented on Feb 12, 2019

I'm also using a large data set and running into sluggish rendering issues. @AllenFang, will the upgrade to react-bootstrap-table-next@1.3.0 and react-bootstrap-table2-editor@1.2.0 be able to handle a table with 27 columns and 13,000 to 15,000 values in each column?

Read more comments on GitHub >

Top Results From Across the Web

Why MySQL Could Be Slow With Large Tables? - Percona
The three main issues you should be concerned if you're dealing with very large data sets are Buffers, Indexes, and Joins. Buffers. First...
Read more >
What to Do When Your Data Is Too Big for Your Memory?
Another way to handle large datasets is by chunking them. That is cutting a large dataset into smaller chunks and then processing those...
Read more >
data.table vs data.frame | Handling Large Datasets in R
R users (mostly beginners) struggle helplessly while dealing with large data sets. They get haunted by repetitive warnings, error messages ...
Read more >
how to work with large data? table with millions of records
Another thing to consider is using import, but with a SQL query instead of connecting directly to a table. You can reduce the...
Read more >
Large datasets in Power BI Premium - Microsoft Learn
The large dataset storage format allows datasets in Power BI ... the Power BI Desktop model upload size, which is still limited to...
Read more >
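The "too big for memory" result above mentions chunking. As a rough, hypothetical illustration of that idea (written in JavaScript to match the issue above, not in whatever stack the linked article uses), a large array can be processed in fixed-size slices instead of all at once:

// Hypothetical helper illustrating the chunking idea: process a large array
// in fixed-size slices so only one slice is handled at a time.
function processInChunks(data, chunkSize, processChunk) {
  for (let start = 0; start < data.length; start += chunkSize) {
    processChunk(data.slice(start, start + chunkSize));
  }
}

// Example usage with a synthetic array and a handler that just counts rows.
const sample = Array.from({ length: 5000 }, (_, i) => ({ id: i }));
processInChunks(sample, 1000, (chunk) => console.log(`processed ${chunk.length} rows`));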
