
Loaded Frozen model running forever

See original GitHub issue

To get help from the community, we encourage using Stack Overflow and the tensorflow.js tag.

TensorFlow.js version

tfjs@0.15.1

Browser version

Chrome 72.0.3626.121

Describe the problem or feature request

The loaded frozen model takes an extremely long time to run (over 30 minutes; in fact, I have never seen predict() complete), even though the model is only medium-sized (about 40 MB). The same forward pass takes only about 3 seconds in Python on my Mac.


Since model.predict() produces no output or error message, I’m asking for help identifying where things might be going wrong.

const model = await tf.loadFrozenModel(MODEL_URL, WEIGHTS_URL);
console.log("Model loaded.");
console.log(model);
const cat = document.getElementById('cat');
let t = tf.fromPixels(cat);
t = t.toFloat();
t = tf.expandDims(t, 0);

console.log("Start transformation by model...");
model.predict(t).dispose(); // gets stuck here forever

Code to reproduce the bug / link to feature request

I have uploaded the code and the model artifacts for reproducing the issue here: GitHub repo: tfjs-debug-2. You can clone the repo, start a Python HTTP server with python httpserver.py, and visit localhost:1234.
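For a self-contained reproduction, the `python httpserver.py` step can be approximated with the Python standard library alone. This is only a sketch under the assumption that the repo’s httpserver.py simply serves the working directory as static files on port 1234 (the actual script lives in the linked repo):

```python
# Minimal static-file server, standing in for the repo's httpserver.py
# (assumed to just serve the current directory over HTTP on port 1234).
import threading
import urllib.request
from functools import partial
from http.server import HTTPServer, SimpleHTTPRequestHandler

def make_server(port=1234, directory="."):
    # Serve `directory` over HTTP; passing port=0 lets the OS pick a free port.
    handler = partial(SimpleHTTPRequestHandler, directory=directory)
    return HTTPServer(("127.0.0.1", port), handler)

# Demo: bind an ephemeral port, serve in the background, fetch the root page.
server = make_server(port=0)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

status = urllib.request.urlopen(f"http://127.0.0.1:{port}/").status
server.shutdown()
```

With the real script, you would then open http://localhost:1234 in Chrome so the page can fetch index.html and the model artifacts from the same origin.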

Any suggestions or advice are appreciated!

Issue Analytics

  • State: closed
  • Created: 5 years ago
  • Comments: 7 (2 by maintainers)

Top GitHub Comments

1 reaction
dsmilkov commented, Mar 19, 2019

Great! tf.fromPixels became tf.browser.fromPixels in 1.0. See the “Breaking changes” section in our release notes: https://github.com/tensorflow/tfjs/releases/tag/v1.0.0

1 reaction
dsmilkov commented, Mar 18, 2019

Hi,

If you are short on time, you can produce a custom tfjs bundle with that PR and test to see that it fixes your problem.

NOTE: This will give you tfjs 1.x, which means it will only run the new format (model.json), and you have to use the new tf.loadGraphModel API instead of tf.loadFrozenModel. If you need 0.15.x, you’ll have to check out the 0.15.x branch from tfjs-core, manually apply the fix in tensorflow/tfjs-core#1622, and follow the steps above.
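Putting both renames together, the migrated call sites under tfjs 1.x might look like the sketch below. This assumes the model artifacts have been re-converted to the 1.x model.json format and that MODEL_URL points at that file; it is not taken verbatim from the repo:

```javascript
// tfjs 1.x sketch: tf.loadGraphModel replaces tf.loadFrozenModel,
// and tf.browser.fromPixels replaces tf.fromPixels.
const model = await tf.loadGraphModel(MODEL_URL); // MODEL_URL -> .../model.json
const cat = document.getElementById('cat');

// tf.tidy disposes the intermediate tensors once predict() returns.
const output = tf.tidy(() => {
  const t = tf.browser.fromPixels(cat).toFloat().expandDims(0);
  return model.predict(t);
});
output.dispose();
```

This is browser-only code (it needs a DOM element and the tfjs script tag), so it is meant as a migration reference rather than a runnable standalone snippet.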


