tfjs-node support for saved models does not recognize valid dtypes
Simply calling tfnode.node.getMetaGraphsFromSavedModel(path) on a SavedModel that uses uint8 tensors results in this error:
(node:2420) UnhandledPromiseRejectionWarning: Error: Unsupported tensor DataType: DT_UINT8, try to modify the model in python to convert the datatype
at mapTFDtypeToJSDtype (/home/vlado/dev/test-tfjs/node_modules/@tensorflow/tfjs-node/dist/saved_model.js:465:19)
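For context, here is a minimal sketch approximating what the failing mapping in dist/saved_model.js does — a simplified illustration, not the actual tfjs-node source; the exact case list is an assumption:

```javascript
// Simplified approximation (an assumption, not the real tfjs-node source)
// of mapTFDtypeToJSDtype in dist/saved_model.js: TensorFlow proto dtype
// strings are mapped to tfjs dtypes, and anything unlisted throws.
function mapTFDtypeToJSDtype(tfDtype) {
  switch (tfDtype) {
    case 'DT_FLOAT':  return 'float32';
    case 'DT_INT32':  return 'int32';
    case 'DT_BOOL':   return 'bool';
    case 'DT_STRING': return 'string';
    default:
      // DT_UINT8 falls through to here, producing the reported error.
      throw new Error('Unsupported tensor DataType: ' + tfDtype +
        ', try to modify the model in python to convert the datatype');
  }
}
```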
However, support for uint8 was added to tfjs via https://github.com/tensorflow/tfjs/pull/2981 back in March. The newly supported data types should be recognized throughout the tfjs codebase, including tfjs-node's SavedModel loader.
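A hedged sketch of what extending the mapping might look like — the target dtypes for the added cases are assumptions for illustration, not the actual tfjs patch:

```javascript
// Illustrative sketch of an extended dtype mapping. The DT_UINT8 and
// DT_INT64 cases and their JS target dtypes are assumptions (widening to
// int32), not the actual tfjs-node fix.
function mapTFDtypeToJSDtype(tfDtype) {
  switch (tfDtype) {
    case 'DT_FLOAT':  return 'float32';
    case 'DT_INT32':  return 'int32';
    case 'DT_BOOL':   return 'bool';
    case 'DT_STRING': return 'string';
    case 'DT_UINT8':  return 'int32'; // assumption: widen uint8 to int32
    case 'DT_INT64':  return 'int32'; // assumption: narrow int64 to int32
    default:
      throw new Error('Unsupported tensor DataType: ' + tfDtype);
  }
}
```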
Environment: Ubuntu 20.04 running NodeJS 14.9.0 with TFJS 2.3.0
Issue Analytics
- Created: 3 years ago
- Comments: 18 (8 by maintainers)
Top GitHub Comments
@loretoparisi The model expects an encoded word vector as input, while the Universal Sentence Encoder (USE) model returns embeddings.
Basically, you’ll want to use the loadTokenizer() function from the previous USE version, but that one requires TFJS 1.x… I have a working version locally, but it’d be better to fix the examples instead; see the issue on the model repo. Unfortunately, @pyu10055’s commit b02310745ceac6b8e4a475719c343da53e3cade2 on the USE repo broke both the Toxicity example model and your use case entirely. The real problem is that the examples are outdated and some changes broke TFJS 2.x compatibility (in the case of USE I fail to see the reasoning behind the change; it might have been a mistake).
Meanwhile, I’ll create a gist for you that contains all you need to get this working as a single-file solution. I’ll get back to you in a bit.
EDIT: I got confused here, since a similar issue was raised w.r.t. outdated tfjs-examples. The same applies to tfjs-models, though - basically some models are incompatible with TFJS 2.x due to package changes (not for technical reasons).
@loretoparisi btw, one advantage of working with a saved_model via getMetaGraphsFromSavedModel() is that it shows the actual signature names, instead of just the incrementing array (when a model has multiple inputs and/or outputs) that you get from a graph_model. See https://github.com/tensorflow/tfjs/issues/3942 for details.
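To illustrate that difference, here is a mocked metaGraphs result — the signatureDefs layout mirrors what getMetaGraphsFromSavedModel() resolves to in tfjs-node, but the signature, tensor names, and dtypes below are invented for the example:

```javascript
// Mocked (hypothetical) result of tf.node.getMetaGraphsFromSavedModel();
// the field layout follows tfjs-node's MetaGraph shape, but the names
// and dtypes here are invented for illustration.
const metaGraphs = [{
  tags: ['serve'],
  signatureDefs: {
    serving_default: {
      inputs:  { input_tensor:    { name: 'serving_default_input_tensor:0', dtype: 'uint8' } },
      outputs: { detection_boxes: { name: 'StatefulPartitionedCall:0',      dtype: 'float32' } },
    },
  },
}];

// Each signature exposes the model's real tensor names, rather than the
// incrementing array index you get from a converted graph_model:
const sig = metaGraphs[0].signatureDefs.serving_default;
console.log(Object.keys(sig.inputs));  // ['input_tensor']
console.log(Object.keys(sig.outputs)); // ['detection_boxes']
```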