[TypeError: img.toFloat is not a function] When passing an image from decodeJpeg (tfjs-react-native) into model.classify (tfjs-automl)
System information
- Have I written custom code (as opposed to using a stock example script provided in TensorFlow.js): Yes
- Mobile device (e.g. iPhone 8, Pixel 2, Samsung Galaxy) if the issue happens on mobile device: All mobile devices
- TensorFlow.js installed from (npm or script link): npm
- TensorFlow.js version: 2.7.0
- tfjs-react-native version: 0.5.0
- tfjs-automl version: 1.0.0
Describe the current behavior
I can read the image from the React Native file system (using expo-file-system), convert it from base64 to an array buffer, and decode it with decodeJpeg from tfjs-react-native, but once I pass the result to my ImageClassificationModel from tfjs-automl, it throws the error above. It seems like the tensor returned by decodeJpeg doesn't have the toFloat method that ImageClassificationModel.classify expects to call during preprocessing.
Standalone code to reproduce the issue
import * as ExpoFileSystem from 'expo-file-system'
import { toByteArray } from 'base64-js' // npm package 'base64-js'
import { decodeJpeg } from '@tensorflow/tfjs-react-native'

// Read the image as base64, decode to raw bytes, then to a Tensor3D
const base64 = await ExpoFileSystem.readAsStringAsync(imageUri, {
  encoding: 'base64'
})
const raw = Uint8Array.from(toByteArray(base64))
const imageTensor = decodeJpeg(raw)
const predictions = await model.classify(imageTensor) // throws: imageTensor.toFloat is not a function
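For what it's worth, here is a quick diagnostic I could add right before the classify call (this check is not part of my original code, just a way to confirm the missing method):

// Diagnostic only: if chained ops were never registered, the method is
// simply absent from the Tensor prototype.
console.log(typeof imageTensor.toFloat)           // 'undefined' when the error reproduces
console.log(imageTensor.dtype, imageTensor.shape) // e.g. 'int32' [224, 126, 3]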
Here's how I create my ImageClassificationModel, in case it's relevant. I'm using Expo to load the dictionary .txt asset as a string, then I pass it along with the asset ids for my model.json and my model_weights.bin. My metro bundler config is set up so that the .txt and .bin extensions are bundled. A placeholder call site is shown after the function below.
import { Asset } from 'expo-asset'
import * as ExpoFileSystem from 'expo-file-system'
import { loadGraphModel } from '@tensorflow/tfjs'
import { bundleResourceIO } from '@tensorflow/tfjs-react-native'
import { ImageClassificationModel } from '@tensorflow/tfjs-automl'

async function loadModel(dictId: number, modelJsonId: any, weightsId: number) {
  const [dictAsset] = await Asset.loadAsync(dictId)
  const dictRaw = await ExpoFileSystem.readAsStringAsync(dictAsset.localUri)
  const dict = dictRaw.trim().split('\n')
  const model = await loadGraphModel(bundleResourceIO(modelJsonId, weightsId))
  return new ImageClassificationModel(model, dict)
}
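And here is roughly how I call it. The asset paths below are placeholders, not my real file layout; the relevant bit is that metro is configured with 'txt' and 'bin' in resolver.assetExts so the require() calls resolve:

// Placeholder asset paths for illustration only
const classifier = await loadModel(
  require('./assets/dict.txt'),          // resolves to an asset id (number)
  require('./assets/model.json'),        // metro parses .json, so this is the model topology object
  require('./assets/model_weights.bin')  // resolves to an asset id (number)
)
const predictions = await classifier.classify(imageTensor)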
Here's what I get when I log the imageTensor object, which is supposed to be of type Tensor:
{"dataId": {}, "dtype": "int32", "id": 324, "isDisposedInternal": false, "kept": false, "rankType": "3", "shape": [224, 126, 3], "size": 84672, "strides": [378, 3]}
I can definitely see why this is crashing: ImageClassificationModel from tfjs-automl wants to call toFloat() on this Tensor as part of preprocessing before classification, and the method isn't there.
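My best guess at the mechanism (unconfirmed, and the exact module path may differ between tfjs versions): in tfjs 2.x the chained tensor methods such as toFloat() are only attached to the Tensor prototype by a side-effect registration module, so if that registration never runs in the bundle, decodeJpeg hands back a tensor without those methods. Something like:

// Side-effect import that attaches chained ops such as .toFloat() and
// .expandDims() to the Tensor prototype (path as in tfjs-core 2.x; it may
// vary between versions).
import '@tensorflow/tfjs-core/dist/public/chained_ops/register_all_chained_ops'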
Top GitHub Comments
Switching from import { loadGraphModel } from '@tensorflow/tfjs' to import * as tf from '@tensorflow/tfjs' / tf.loadGraphModel allowed me to remove the all-chained-ops script.
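In code, the change looks roughly like this. The loadGraph wrapper is just for illustration, and the commented-out "before" registration import is the all-chained-ops script mentioned above; only the import switch itself comes from this comment:

// Before: named import plus a separate chained-ops registration script
// import { loadGraphModel } from '@tensorflow/tfjs'
// import '@tensorflow/tfjs-core/dist/public/chained_ops/register_all_chained_ops'

// After: importing the full union package registers the chained ops as a
// side effect, so tensors passed to classify() have .toFloat() available
import * as tf from '@tensorflow/tfjs'
import { bundleResourceIO } from '@tensorflow/tfjs-react-native'

async function loadGraph(modelJsonId: any, weightsId: number) {
  return tf.loadGraphModel(bundleResourceIO(modelJsonId, weightsId))
}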