Errors executing async models loaded via tf.loadGraphModel
See original GitHub issue

TensorFlow.js version
"@tensorflow-models/coco-ssd": "^1.0.0",
"@tensorflow-models/mobilenet": "^1.0.0",
"@tensorflow-models/universal-sentence-encoder": "1.0.1",
"@tensorflow/tfjs": "1.0.0",
"@tensorflow/tfjs-node": "1.0.1",
Error also occurs on tfjs 1.0.2
Node version
Node 11.11
Describe the problem or feature request
The following error occurs when making a prediction with either coco-ssd or universal-sentence-encoder (both models that contain control flow ops), but it does not happen with mobilenet.
(node:59415) UnhandledPromiseRejectionWarning: TypeError: Cannot read property 'id' of undefined
at /Users/yassogba/projects/intent-classifier/node_modules/@tensorflow/tfjs-converter/dist/src/executor/graph_executor.js:295:99
at Array.map (<anonymous>)
at GraphExecutor.<anonymous> (/Users/yassogba/projects/intent-classifier/node_modules/@tensorflow/tfjs-converter/dist/src/executor/graph_executor.js:295:58)
at step (/Users/yassogba/projects/intent-classifier/node_modules/@tensorflow/tfjs-converter/dist/src/executor/graph_executor.js:56:23)
at Object.next (/Users/yassogba/projects/intent-classifier/node_modules/@tensorflow/tfjs-converter/dist/src/executor/graph_executor.js:37:53)
at fulfilled (/Users/yassogba/projects/intent-classifier/node_modules/@tensorflow/tfjs-converter/dist/src/executor/graph_executor.js:28:58)
at processTicksAndRejections (internal/process/next_tick.js:81:5)
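As an aside on reading this kind of failure: on Node 11, an unhandled promise rejection only prints the warning above rather than crashing. A minimal, stdlib-only sketch (not from the issue) of registering a handler so the full error object is visible while debugging:

```javascript
// Record and report rejections that nothing awaits, as happens when a
// model's internal executeAsync() promise fails inside library code.
let lastRejection = null;
process.on('unhandledRejection', (err) => {
  lastRejection = err;
  console.error('Unhandled rejection:', err.message);
});

// Simulated failure standing in for the tfjs-converter error above.
Promise.reject(new TypeError('simulated'));
```

In a real debugging session you would re-throw (or `process.exit(1)`) inside the handler to make the failure fatal and keep the stack trace intact.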
Some console logging suggests that the tensors in the graph are undefined during executeAsync. Note that I only get this error when using the Node backend (tfjs-node); with the vanilla CPU backend (tfjs) it works.
Code to reproduce the bug / link to feature request
// const tf = require('@tensorflow/tfjs'); // using this import works
const tf = require('@tensorflow/tfjs-node');
const cocossd = require('@tensorflow-models/coco-ssd');
const mob = require('@tensorflow-models/mobilenet');
const useLoader = require('@tensorflow-models/universal-sentence-encoder');

global.fetch = require('node-fetch');

// await is only valid inside an async function on Node 11, so the repro
// steps are wrapped in an async main().
async function main() {
  // This works.
  const model = await mob.load();
  const res = await model.classify(tf.randomNormal([224, 224, 3]));
  console.log(res);

  // This fails.
  const cocomodel = await cocossd.load();
  const cocoa = await cocomodel.detect(tf.randomNormal([224, 224, 3]));
  console.log(cocoa);

  // This also fails.
  console.time('Loading Universal Sentence Encoder');
  const use = await useLoader.load();
  console.timeEnd('Loading Universal Sentence Encoder');
}

main();
Issue Analytics
- State:
- Created 4 years ago
- Comments: 10 (5 by maintainers)
Top Results From Across the Web

Failing to load model using tf.loadGraphModel in tfjs
I am trying to get a webgl app from this github repo: https://github.com/terryky/tfjs_webgl_app working, but I am running into some issues.

Loading models into tensorflow.js via react.js | by Manfye Goh
Step 1: Convert Tensorflow's model to TF.js model (Python environment). Importing a TensorFlow model into TensorFlow.js is a two-step process.

How to run any Tensorflow model on a browser ... - YouTube
tensorflow #tensorflowjs #javascript In this video, I will show you how you can convert an existing Tensorflow model to Tensorflow.js so that ...

await - JavaScript - MDN Web Docs
Using await pauses the execution of its surrounding async function until the promise is settled (that is, fulfilled or rejected).

An introduction to AI in Node.js - IBM Developer
Leverage AI in your Node.js applications using TensorFlow.js. ... to load the model. For tf.GraphModel, use loadGraphModel.
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
@pyu10055 This repo code demonstrates it: https://github.com/tafsiri/use-text-classifier/blob/master/training/train.js (update the package.json to remove the other models). However, on that point: I thought all the models specify tfjs as a peerDependency, so they won't install their own copy.
Also see https://github.com/tensorflow/tfjs/issues/1454
No longer an issue on latest tfjs.