Local fetch not working in android tfjs-react-native API
See original GitHub issue

TensorFlow.js version
@tensorflow/tfjs: 1.7.4
@tensorflow/tfjs-react-native: 0.2.3
Browser version
expo : 37.0.3
Describe the problem or feature request
Local fetch does not work on Android but works fine on iOS.
Code to reproduce the bug / link to feature request
import React, { useEffect, useState } from "react";
import { StyleSheet, Text, View, Button, Image } from "react-native";
import { ImageBrowser } from "expo-multiple-media-imagepicker";
import * as tf from "@tensorflow/tfjs";
import * as tf_rn from "@tensorflow/tfjs-react-native";
import * as Permissions from "expo-permissions";
import * as jpeg from "jpeg-js";

const askPerm = async () => {
  return await Permissions.askAsync(Permissions.CAMERA_ROLL);
};

export default function App() {
  const [imagePickerOpen, setImagePickerOpen] = useState(false);
  const [TFReady, setTFReady] = useState(false);

  useEffect(() => {
    askPerm();
    tf.ready().then(() => setTFReady(true));
  }, []);

  const imageBrowserCallback = async (s) => {
    // The image browser passes a promise that resolves to the selected files.
    const files = await s;
    console.log(files);
    // Fetch the local file as binary data; this is the call that fails on Android.
    const r = await tf_rn.fetch(files[0].uri, {}, { isBinary: true });
    const rawImageData = await r.arrayBuffer();
    const imageTensor = imageToTensor(rawImageData);
    setImagePickerOpen(false);
  };

  return (
    <View style={styles.container}>
      <Text>TF Loaded {TFReady.toString()}</Text>
      {imagePickerOpen ? (
        <ImageBrowser
          max={20} // Maximum number of pickable images
          headerButtonColor={"#E31676"} // Button color on header
          badgeColor={"#E31676"} // Badge color when picking
          callback={imageBrowserCallback}
        />
      ) : (
        <Button
          title="open"
          onPress={() => setImagePickerOpen(!imagePickerOpen)}
        />
      )}
    </View>
  );
}

const styles = StyleSheet.create({
  container: {
    flex: 1,
    backgroundColor: "#fff",
    alignItems: "center",
    justifyContent: "center",
  },
});
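The snippet above calls imageToTensor, which is not shown in the issue. A typical implementation, consistent with the jpeg-js import in the snippet, decodes the JPEG bytes and strips the alpha channel. The helper below is a sketch, not code from the issue:

```javascript
import * as tf from "@tensorflow/tfjs";
import * as jpeg from "jpeg-js";

// Decode raw JPEG bytes (an ArrayBuffer or Uint8Array) into a
// [height, width, 3] tensor.
function imageToTensor(rawImageData) {
  const { width, height, data } = jpeg.decode(rawImageData, { useTArray: true });
  // jpeg-js returns RGBA; drop the alpha channel to get RGB.
  const buffer = new Uint8Array(width * height * 3);
  let offset = 0;
  for (let i = 0; i < buffer.length; i += 3) {
    buffer[i] = data[offset];
    buffer[i + 1] = data[offset + 1];
    buffer[i + 2] = data[offset + 2];
    offset += 4;
  }
  return tf.tensor3d(buffer, [height, width, 3]);
}
```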
Example of files[] on Android:
Array [
Object {
"albumId": "-1034915852",
"creationTime": 1491896243000,
"duration": 0,
"exif": Object {
"DateTime": "2017:04:11 13:07:23",
"DateTimeOriginal": "2017:04:11 13:07:23",
"ImageLength": 637,
"ImageWidth": 622,
"LightSource": 0,
"Orientation": 0,
"UserComment": "{\"sha1\":\"4e7b7ebc00464c46dfc8b03c0b8ad968770bfc6b\",\"ext\":\"jpg\"}",
},
"filename": "Share-your-encounters-with-disguised-devils.jpg",
"height": 637,
"id": "2638",
"localUri": "file:///storage/emulated/0/pictures/9gag/Share-your-encounters-with-disguised-devils.jpg",
"location": null,
"mediaType": "photo",
"modificationTime": 1491896243000,
"uri": "file:///storage/emulated/0/pictures/9gag/Share-your-encounters-with-disguised-devils.jpg",
"width": 622,
}
]
iOS:
Array [
Object {
"creationTime": 1588249831000,
"duration": 0,
"exif": Object {
"ColorModel": "RGB",
"Depth": 16,
"HasAlpha": true,
"Orientation": 1,
"PixelHeight": 2688,
"PixelWidth": 1242,
"ProfileName": "Display P3",
"{Exif}": Object {
"DateTimeOriginal": "2020:04:30 18:00:31",
"PixelXDimension": 1242,
"PixelYDimension": 2688,
"UserComment": "Screenshot",
},
"{PNG}": Object {
"InterlaceType": 0,
},
"{TIFF}": Object {
"Orientation": 1,
},
},
"filename": "IMG_0546.PNG",
"height": 2688,
"id": "30493338-1075-414A-A9D0-8C2FC25F52C4/L0/001",
"isFavorite": false,
"isHidden": false,
"localUri": "file:///var/mobile/Media/DCIM/100APPLE/IMG_0546.PNG",
"location": null,
"mediaSubtypes": Array [
"screenshot",
],
"mediaType": "photo",
"modificationTime": 1588249832257,
"orientation": 1,
"uri": "assets-library://asset/asset.PNG?id=30493338-1075-414A-A9D0-8C2FC25F52C4&ext=PNG",
"width": 1242,
},
]
Error on Android:
[Unhandled promise rejection: TypeError: Network request failed]
- node_modules/@tensorflow/tfjs-react-native/dist/platform_react_native.js:97:22 in xhr.onerror
- node_modules/event-target-shim/dist/event-target-shim.js:818:39 in EventTarget.prototype.dispatchEvent
- node_modules/react-native/Libraries/Network/XMLHttpRequest.js:574:29 in setReadyState
- node_modules/react-native/Libraries/Network/XMLHttpRequest.js:388:25 in __didCompleteResponse
- node_modules/react-native/Libraries/vendor/emitter/EventEmitter.js:190:12 in emit
- node_modules/react-native/Libraries/BatchedBridge/MessageQueue.js:436:47 in __callFunction
- node_modules/react-native/Libraries/BatchedBridge/MessageQueue.js:111:26 in __guard$argument_0
- node_modules/react-native/Libraries/BatchedBridge/MessageQueue.js:384:10 in __guard
- node_modules/react-native/Libraries/BatchedBridge/MessageQueue.js:110:17 in __guard$argument_0
* [native code]:null in callFunctionReturnFlushedQueue
Issue Analytics
- State:
- Created 3 years ago
- Reactions: 6
- Comments: 26 (6 by maintainers)
Top Results From Across the Web

React Native Android Fetch failing on connection to local API
You are not able to access your local development server because that port hasn't been forwarded by ADB yet. When you run react-native ...

TensorFlow.js for React Native is here!
The framework provides a way to author applications and is responsible for communication between the JavaScript thread and native APIs. ...

Image Classification on React Native ... - DEV Community
I think the issue is @tensorflow/tfjs-react-native "0.1.0-alpha.2" has a problem on Android platforms fetching images from file:// and HTTP ...

react-native-background-fetch - npm
iOS & Android BackgroundFetch API implementation for React Native. Latest version: 4.1.7.

Image Classification on React Native with ... - Aman Mittal
import { fetch } from '@tensorflow/tfjs-react-native'; ...
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
Digging into this, it looks like fetch is not a good fit for loading these types of files. The API our custom fetch is trying to implement leaves behavior for file:// scheme URLs undefined, so I probably wouldn't rely on it for this even where it seems to work. Instead, I would use an existing file-reading mechanism in React Native to get the data. You want to be able to read the file data into one of two formats. Here is a skeleton of some code using expo-file-system that you can adapt to your code. We do something similar in our bundleResourceIO model loader (though we use a different library for reading the file).
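The maintainer's original skeleton is not reproduced in this archive. A minimal sketch along the lines described, assuming expo-file-system and the decodeJpeg helper exported by @tensorflow/tfjs-react-native, might look like this:

```javascript
import * as FileSystem from "expo-file-system";
import * as tf from "@tensorflow/tfjs";
import { decodeJpeg } from "@tensorflow/tfjs-react-native";

// Read a local file:// image without fetch, then decode it into a tensor.
async function localFileToTensor(uri) {
  // expo-file-system reads file:// URIs reliably on both Android and iOS.
  const base64 = await FileSystem.readAsStringAsync(uri, {
    encoding: FileSystem.EncodingType.Base64,
  });
  // base64 string -> Uint8Array of raw JPEG bytes.
  const bytes = new Uint8Array(tf.util.encodeString(base64, "base64").buffer);
  // decodeJpeg returns an int32 tensor of shape [height, width, 3].
  return decodeJpeg(bytes);
}
```

This sidesteps the fetch implementation entirely, which is the workaround the comment recommends.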
Give that a try and let me know if that works.
I’d also recommend that anyone on 0.1.0-alpha.2 upgrade to 0.2.3.
@ryanvolum you should definitely downscale your images before running inference. The models are typically trained on relatively small images compared to typical camera resolutions (for example, our mobilenet wrapper internally resizes inputs to 224x224), so finding a smaller size that still gives you adequate results will help performance. On the memory-usage front, if you can resize the image before you even get it into React Native, you will save having to allocate those large base64-encoded strings.
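As an illustration of the downscaling step, assuming imageTensor is a [height, width, 3] tensor like the one produced earlier in the issue, a resize before inference might look like this (a sketch; the mobilenet wrapper already resizes internally, so this only matters for custom models or manual preprocessing):

```javascript
// Downscale to a fixed model input size before inference; tf.tidy disposes
// the intermediate tensors created inside the callback.
const resized = tf.tidy(() =>
  tf.image.resizeBilinear(imageTensor, [224, 224])
);
```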
Another performance tip: if you are processing many images, it will be faster if they are all the same size (or use a small number of fixed dimensions). Every time you pass a differently sized image through your pipeline, a bunch of texture initialization (and memory allocation) happens. If you use a small set of fixed sizes, you only pay that overhead once per size.
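One way to apply this tip is to snap each incoming image to the nearest size in a small fixed set before resizing, so texture allocations are reused across images. The sizes below are arbitrary examples, not values from the issue:

```javascript
// Candidate target sizes; every image gets resized to one of these, so the
// GPU only ever sees a handful of distinct input shapes.
const FIXED_SIZES = [224, 320, 512];

// Return the member of FIXED_SIZES closest to the given dimension.
function nearestFixedSize(n) {
  return FIXED_SIZES.reduce((best, s) =>
    Math.abs(s - n) < Math.abs(best - n) ? s : best
  );
}
```

You would then resize each image to [nearestFixedSize(height), nearestFixedSize(width)] (or a single square size) before running it through the model.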