
Fetching large JSON data results in OutOfMemoryError

See original GitHub issue

Description

Using fetch to get several megabytes of JSON data (for me it currently happens at around 80 MB) causes Android to panic and throw an OutOfMemoryError. This happens because the whole response is read into memory as a single byte array, quickly filling up the heap.

Stacktrace:

09-01 21:32:21.035  5480  5555 E AndroidRuntime: java.lang.OutOfMemoryError: Failed to allocate a 98836368 byte allocation with 25165824 free bytes and 88MB until OOM, target footprint 133385952, growth limit 201326592
09-01 21:32:21.035  5480  5555 E AndroidRuntime: 	at okio.Buffer.readByteArray(Buffer.kt:1429)
09-01 21:32:21.035  5480  5555 E AndroidRuntime: 	at okio.Buffer.readByteArray(Buffer.kt:1424)
09-01 21:32:21.035  5480  5555 E AndroidRuntime: 	at okio.RealBufferedSource.readByteArray(RealBufferedSource.kt:238)
09-01 21:32:21.035  5480  5555 E AndroidRuntime: 	at okhttp3.ResponseBody.bytes(ResponseBody.kt:124)
09-01 21:32:21.035  5480  5555 E AndroidRuntime: 	at com.facebook.react.modules.blob.BlobModule$4.toResponseData(BlobModule.java:134)
09-01 21:32:21.035  5480  5555 E AndroidRuntime: 	at com.facebook.react.modules.network.NetworkingModule$2.onResponse(NetworkingModule.java:512)
09-01 21:32:21.035  5480  5555 E AndroidRuntime: 	at okhttp3.internal.connection.RealCall$AsyncCall.run(RealCall.kt:519)
09-01 21:32:21.035  5480  5555 E AndroidRuntime: 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1167)
09-01 21:32:21.035  5480  5555 E AndroidRuntime: 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:641)
09-01 21:32:21.035  5480  5555 E AndroidRuntime: 	at java.lang.Thread.run(Thread.java:923)
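
For illustration, the following is a minimal sketch of the pattern the trace points at, using plain okhttp outside of React Native (the URL is a hypothetical placeholder): ResponseBody.bytes() drains the entire payload into a single byte array, so one allocation has to be as large as the whole response.

import okhttp3.OkHttpClient;
import okhttp3.Request;
import okhttp3.Response;

public class BufferedFetch {
    public static void main(String[] args) throws Exception {
        OkHttpClient client = new OkHttpClient();
        Request request = new Request.Builder()
                .url("http://10.0.2.2:8080/big.json") // hypothetical test server
                .build();
        try (Response response = client.newCall(request).execute()) {
            // bytes() buffers the complete response before returning; for an
            // ~80-100 MB payload this is the allocation that fails inside
            // okio.Buffer.readByteArray in the trace above.
            byte[] payload = response.body().bytes();
            System.out.println("Loaded " + payload.length + " bytes in one allocation");
        }
    }
}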

React Native version:

System:
    OS: Linux 5.13 Solus 4.3
    CPU: (8) x64 Intel(R) Core(TM) i7-7700HQ CPU @ 2.80GHz
    Memory: 1.39 GB / 15.52 GB
    Shell: 5.1.8 - /bin/bash
  Binaries:
    Node: 14.17.5 - /usr/bin/node
    Yarn: 1.22.10 - /usr/bin/yarn
    npm: 6.14.14 - /usr/bin/npm
    Watchman: Not Found
  SDKs:
    Android SDK: Not Found
  IDEs:
    Android Studio: Not Found
  Languages:
    Java: 1.8.0_302-solus - /usr/lib64/openjdk-8/bin/javac
  npmPackages:
    @react-native-community/cli: Not Found
    react: 17.0.2 => 17.0.2 
    react-native: 0.65.1 => 0.65.1 
  npmGlobalPackages:
    *react-native*: Not Found

Steps To Reproduce

I created a test repository that shows the aforementioned behaviour. It also includes a 100 MB JSON file for testing, which you can serve locally or access directly via GitHub. The steps are as follows:

  1. Tap on the load data button
  2. Wait for the app to crash

Expected Results

Getting an out-of-memory error shouldn’t really happen with this size of data, in my opinion. Sure, 100 MB sounds like a lot at first, but in enterprise-grade apps this is probably a common scenario. In any case, I think there should be a way to dynamically switch to streaming the response, since okhttp offers byteStream() and charStream() as well. I’m not that well versed in Java, but streaming would probably help avoid all of the ~150 MB landing on the heap at once.
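
As a rough sketch of that suggestion, assuming the payload is one large top-level JSON array (the URL and the per-element handling are placeholders), okhttp’s byteStream() can be fed into an incremental parser such as Gson’s JsonReader, so only one element lives on the heap at a time:

import com.google.gson.stream.JsonReader;
import okhttp3.OkHttpClient;
import okhttp3.Request;
import okhttp3.Response;

import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

public class StreamedFetch {
    public static void main(String[] args) throws Exception {
        OkHttpClient client = new OkHttpClient();
        Request request = new Request.Builder()
                .url("http://10.0.2.2:8080/big.json") // hypothetical endpoint
                .build();
        try (Response response = client.newCall(request).execute();
             JsonReader reader = new JsonReader(new InputStreamReader(
                     response.body().byteStream(), StandardCharsets.UTF_8))) {
            long count = 0;
            reader.beginArray();          // assumes a top-level JSON array
            while (reader.hasNext()) {
                reader.skipValue();       // process one element, then discard it
                count++;
            }
            reader.endArray();
            System.out.println("Streamed " + count + " elements without buffering the body");
        }
    }
}

The same incremental approach works with Jackson’s streaming parser or Android’s built-in android.util.JsonReader; the key point is that nothing ever asks okio for the full body in a single call.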

Snack, code example, screenshot, or link to a repository:

https://github.com/curtisy1/ReactNativeFetchRepro

Issue Analytics

  • State: open
  • Created: 2 years ago
  • Reactions: 1
  • Comments: 11

Top GitHub Comments

2 reactions
stale[bot] commented, Jan 9, 2022

Hey there, it looks like there has been no activity on this issue recently. Has the issue been fixed, or does it still require the community’s attention? This issue may be closed if no further activity occurs. You may also label this issue as a “Discussion” or add it to the “Backlog” and I will leave it open. Thank you for your contributions.

1 reaction
curtisy1 commented, May 16, 2022

@TommysG We didn’t find any solution to this. What we tried instead was reducing the size of our responses and optimizing them so that this issue is less likely to happen for our use case. That is more of a band-aid solution, however, and I’m fairly certain we will easily reach that cap in production, where larger data is quite common.

Read more comments on GitHub >

Top Results From Across the Web

  • Processing large JSON files in Python without running out of …
    Loading complete JSON files into Python can use too much memory, leading to slowness or crashes. The solution: process JSON data one chunk…

  • Getting Out of memory due to large JSON data - Stack Overflow
    Sometimes this error comes when available heap size is full, you have to clear cache programmatically. Also use jackson for parsing json data…

  • processing large JSON file error: OutOfMemory
    Hi, I am new to Talend Open Studio for Data Integration. … when the tInputFileJSON reads a big file (80M), it will run…

  • How to filter 1GB JSON on the frontend and not crash … - Uptech
    Simple filtering of a large amount of data leads to “heap out of memory”. As of April 2018, I didn’t find any virtual…

  • Parse Large Json Files in Java using Gson Streaming - amitph
    The JSON string is first loaded into memory and converted into an object. Thus, large JSON objects can lead to OutOfMemoryError. We can…
