Deeplab/Segmentation: Error: The dtype of dict['ImageTensor'] provided in model.execute(dict) must be int32, but was float32
TensorFlow.js version
"@tensorflow/tfjs-node": "^2.0.1", "@tensorflow/tfjs": "^2.0.1"
Browser version
Node.js and Express, no browser
Describe the problem or feature request
When using a LOCALLY downloaded and served model… the index.js file in the NPM/node module requires me to cast the tensor to int32, or it errors out about float32 no matter what I pass it.
Code to reproduce the bug / link to feature request
Model loading:
const loadModelDeepLab = async () => {
  const modelName = 'pascal'; // set to your preferred model: `pascal`, `cityscapes`, or `ade20k`
  const quantizationBytes = 2; // either 1, 2 or 4
  const url = 'https://tfhub.dev/tensorflow/tfjs-model/deeplab/pascal/1/default/1/model.json?tfjs-format=file';
  return await deeplab.load({ modelUrl: url, base: modelName, quantizationBytes });
};
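For context, the usual call-site workaround is to cast the input to int32 before it reaches the model (with real tfjs that is `tf.cast(input, 'int32')`). The sketch below mirrors that cast with a minimal stand-in tensor object so it stays self-contained; the `cast` helper and the tensor shape/data are illustrative, not part of the tfjs API.

```javascript
// Illustrative stand-in for tf.cast(tensor, 'int32'): converting float data
// to int32 truncates each value toward zero, as tfjs does.
function cast(tensor, dtype) {
  const data = dtype === 'int32'
    ? tensor.data.map(Math.trunc)
    : tensor.data.slice();
  return { ...tensor, data, dtype };
}

// A tiny fake float32 "image tensor" (shape [1, 1, 3]) for demonstration.
const input = { shape: [1, 1, 3], dtype: 'float32', data: [12.7, 0.2, 254.9] };
const ready = cast(input, 'int32');
console.log(ready.dtype); // 'int32'
console.log(ready.data);  // [12, 0, 254]
```

With actual tfjs the same step is a one-liner before `predict`/`execute`, which is exactly what the library patch further down does internally.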
Info on the tensor being passed in:
Tensor {
kept: false,
isDisposedInternal: false,
shape: [ 1000, 1000, 3 ],
dtype: 'int32',
size: 3000000,
strides: [ 3000, 3 ],
dataId: {},
id: 2,
rankType: '3',
scopeId: 0 }
The error:
(node:35252) UnhandledPromiseRejectionWarning: Error: The dtype of dict['ImageTensor'] provided in model.execute(dict) must be int32, but was float32
I manually changed the offending code to the following, and everything runs:
SemanticSegmentation.prototype.predict = function (input) {
    var _this = this;
    return tf.tidy(function () {
        var data = utils_1.toInputTensor(input);
        return tf.squeeze(_this.model.execute(tf.cast(data, 'int32')));
    });
};
Issue Analytics
- State:
- Created 3 years ago
- Reactions: 3
- Comments: 6
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
Same problem for me when trying to run the TensorFlow SavedModel Import Demo with a custom SSD MobileNet v2 model.
@parthlathiya42 it worked for me with the following cast:
I got the same error.
My scenario: I trained a pre-trained model from the TensorFlow model zoo via transfer learning with the TensorFlow API, saved it as a SavedModel (model.pb), and converted it to TF.js format (model.json and sharded .bin files).
When I tried running this model.json in JavaScript (web), it gave this same error.
When I tried someone else's working converted model (model.json and sharded .bin files) in my JavaScript (web) code, it worked.
Conclusion: there is something wrong with my converted model. I converted it using tensorflowjs_converter, and my original model (model.pb) works accurately in Python.
I'm still trying to convert my model.pb file with different tensorflowjs_converter versions, as it seems to be a converter versioning issue.