Stuck on an issue?

Lightrun Answers was designed to reduce the constant googling that comes with debugging third-party libraries. It collects links to all the places you might look while hunting down a tough bug.

And, if you’re still stuck at the end, we’re happy to hop on a call to see how we can help out.

Performance impact with huge data insertions

See original GitHub issue

We have an API paginated over 5,000 records with 500 records per page; each page’s response is about 550 KB. As soon as a page’s response is received, we write the data to the Realm DB using the realm.create() method. What we have observed is that once the Realm table holds 2,000 records, insertions 2,001 through 5,000 take 2 seconds each. This in turn makes the UI of our React Native app unresponsive.

We also have another API with 50,000 records, paginated at 500 records per page, where each page’s response is about 50 KB. Inserting all 50,000 records took nearly 2 minutes 50 seconds, yet the 5,000 records above surprisingly took 24 minutes.

Our app’s UI was unresponsive for almost the full 24 minutes because of these slow insertions. Is there a way we can optimise the insertion of such a huge amount of data?
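One common mitigation, offered here as a hedged sketch rather than the reporter’s actual code, is to batch each page’s rows into a single write transaction instead of committing record by record. RecordSchema and fetchPage() below are hypothetical stand-ins, since the issue never shows the schema or the API client:

```javascript
const Realm = require('realm');

// Hypothetical names: RecordSchema and fetchPage() are illustrative
// stand-ins; the original issue does not show either.
const RecordSchema = {
  name: 'Record',
  primaryKey: 'id',
  properties: { id: 'int', name: 'string' },
};

async function importAll(pageCount) {
  const realm = await Realm.open({ schema: [RecordSchema] });
  try {
    for (let page = 1; page <= pageCount; page++) {
      const rows = await fetchPage(page); // one paginated API call, ~500 rows

      // One transaction per page: each realm.write() is a disk commit,
      // so committing once per 500 rows amortizes that cost instead of
      // paying it on every realm.create().
      realm.write(() => {
        for (const row of rows) {
          realm.create('Record', row, true); // true = upsert by primary key (Realm JS 2.x)
        }
      });
    }
  } finally {
    realm.close();
  }
}
```

Realm only persists data at the end of a write transaction, so the number of commits, not the number of realm.create() calls, tends to dominate import time.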

Version of Realm and Tooling

  • Realm JS SDK Version: 2.21.1
  • Node or React Native: node-8.12.0, RN-0.57
  • Client OS & Version: Android Oreo (8.0)
  • Which debugger for React Native: Samsung Galaxy Tab S3 device, running Android Oreo.

Issue Analytics

  • State: closed
  • Created: 4 years ago
  • Comments: 15 (6 by maintainers)

Top GitHub Comments

1 reaction
kmvkrish commented, Apr 1, 2019

@bmunkholm We have updated our business logic to handle the sorting and the finding of unique rows after the insertions are finished. This gave us a huge performance improvement. Thanks.
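For illustration, here is a minimal sketch of the “sort and deduplicate once, after the import” approach described above; the Record type and its name property are assumptions, since the issue never shows the schema:

```javascript
// Illustrative only: 'Record' and 'name' stand in for the actual
// schema and sort key, which the issue does not show.
function sortedUniques(realm) {
  // sorted() is evaluated lazily by Realm, and here it runs once
  // after the import instead of once per inserted row.
  const sorted = realm.objects('Record').sorted('name');

  // Single pass over the sorted results to drop duplicate keys.
  const seen = new Set();
  const uniques = [];
  for (const obj of sorted) {
    if (!seen.has(obj.name)) {
      seen.add(obj.name);
      uniques.push(obj);
    }
  }
  return uniques;
}
```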

1 reaction
bmunkholm commented, Mar 28, 2019

You should ensure you have a primary key for the properties you sort on.

But you should also find a way to avoid sorting 10,000 records 5,000 times. Can’t you do that after you have inserted all the records?
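As a rough sketch of this advice, the schema could give each record a primary key and mark the sorted property as indexed; all names below are assumed for illustration:

```javascript
// Assumed schema for illustration; only the shape matters here.
const RecordSchema = {
  name: 'Record',
  primaryKey: 'id',                          // fast upserts and lookups by id
  properties: {
    id: 'int',
    name: { type: 'string', indexed: true }, // index the property you sort/filter on
  },
};
```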

Read more comments on GitHub

Top Results From Across the Web

Database crisis: How we improved performance for large database inserts
So around 250,000 queries would hit the database together.

SQL Bulk Insert Concurrency and Performance Considerations
One of the challenges we face when using SQL bulk insert from flat files can be concurrency and performance challenges, especially if the…

Does the size of a table impact the performance of insert query…
Simplistically, reducing the size of the table will not have much impact on performance. There are some cases where it could make a…

Database performance side effects of huge inserts
My question is whether so many INSERTs on one table would affect in any way the performance of the whole database?

Performance: INSERTing records into “large” tables
If you’re concerned about INSERT performance only, adding indexes will cause more writes to occur, which has the potential to slow down INSERTs…
