Error loading the model

See original GitHub issue

I'm trying to reproduce the README example:

    const loadImage = (src) => new Promise((resolve, reject) => {
      const image = new Image();
      image.src = src;
      image.crossOrigin = 'Anonymous';
      image.onload = () => resolve(image);
      image.onerror = (err) => reject(err);
    });

    const pretrainedModelURL = 'https://storage.googleapis.com/tfjs-models/tfjs/mobilenet_v1_0.25_224/model.json';

    tf.loadModel(pretrainedModelURL).then(model => {
      const layer = model.getLayer('conv_pw_13_relu');
      return tf.model({
        inputs: [model.inputs[0]],
        outputs: layer.output,
      });
    }).then(pretrainedModel => {
      return tf.loadModel('/model.json').then(model => {
        return loadImage('/trees/tree1.png').then(loadedImage => {
          const image = tf.reshape(tf.fromPixels(loadedImage), [1,224,224,3]);
          const pretrainedModelPrediction = pretrainedModel.predict(image);
          const modelPrediction = model.predict(pretrainedModelPrediction);
          const prediction = modelPrediction.as1D().argMax().dataSync()[0];
          console.log(prediction);
        });
      });
    }).catch(err => {
      console.error('Error', err);
    });

I get an error from this line:

    const pretrainedModelPrediction = pretrainedModel.predict(image);

The error is the following:

    tfjs.js:67 Uncaught (in promise) Error: The dtype of the feed (int32) is incompatible with that of the key 'input_1' (float32).
    at new t (tfjs.js:67)
    at assertFeedCompatibility (tfjs.js:67)
    at e.add (tfjs.js:67)
    at new e (tfjs.js:67)
    at tfjs.js:67
    at tfjs.js:49
    at e.scopedRun (tfjs.js:49)
    at e.tidy (tfjs.js:49)
    at e.tidy (tfjs.js:49)
    at s (tfjs.js:67)

Any idea about this error? My tfjs version is 0.12.2.
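
For anyone hitting the same message: in tfjs 0.x, tf.fromPixels() produces an int32 tensor, while the converted MobileNet graph declares its 'input_1' placeholder as float32, which is exactly what the error reports. Below is a minimal sketch of one way to feed compatible data; the [-1, 1] scaling is the usual MobileNet preprocessing and is an assumption here, so adjust it to whatever your training pipeline used:

    // Sketch: cast the int32 pixel tensor to float32 before calling predict().
    // Assumes the loaded image is already 224x224; the normalization below is
    // the standard MobileNet scaling to [-1, 1] and may need to match your model.
    const image = tf.tidy(() => {
      const pixels = tf.fromPixels(loadedImage).toFloat();               // int32 -> float32
      const normalized = pixels.div(tf.scalar(127.5)).sub(tf.scalar(1)); // scale to [-1, 1]
      return normalized.reshape([1, 224, 224, 3]);                       // add the batch dimension
    });
    const pretrainedModelPrediction = pretrainedModel.predict(image);

If you prefer to keep the original reshape call, tf.cast(tensor, 'float32') performs the same dtype conversion as toFloat().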

Issue Analytics

  • State: closed
  • Created 3 years ago
  • Comments: 11 (4 by maintainers)

Top GitHub Comments

1 reaction
kevinpelago commented, Jul 7, 2020

This is awesome! Great article, and thanks for the shout out!

1 reaction
aralroca commented, Jul 7, 2020

Read more comments on GitHub >

Top Results From Across the Web

Error in loading the model - jit - PyTorch Forums
Since the error message says “error loading the model”, it is likely the program failed to load the model file in torch::jit::load(path) ....
Read more >
Error loading the model because it is being used
My team is trying to use the collaborative feature of Bizagi. We have been able to successfully share a process diagram with two...
Read more >
HL2 Model viewer "Error Loading Model" when I try to compile ...
I'm trying to port a model I rigged from MAX to SFM and I compiled the .qc with the textures and model and...
Read more >
Error loading saved model · Issue #51746 - GitHub
After saving a Quant Aware model to saved_model using model.save(), I get this error while loading it with tf.keras.models.load_model() ...
Read more >
Solved: Unable to Load Model error message
Solved: I have a group of users who receive and "Unable to Load Model" error message. Another group of users have no issues...
Read more >
