Loaded Frozen model running forever
To get help from the community, we encourage using Stack Overflow and the tensorflow.js tag.
TensorFlow.js version
tfjs@0.15.1
Browser version
Chrome 72.0.3626.121
Describe the problem or feature request
The loaded frozen model takes a very long time to predict (over 30 minutes; in fact, I have never seen it finish), even though the model is only medium-sized (about 40 MB). The same forward pass takes only about 3 seconds in Python on my Mac.
Since model.predict() produces no output or error message while it runs, I'm asking for help identifying what might be wrong.
const model = await tf.loadFrozenModel(MODEL_URL, WEIGHTS_URL);
console.log("Model loaded.");
console.log(model);
const cat = document.getElementById('cat');
let t = tf.fromPixels(cat);
t = t.toFloat();
t = tf.expandDims(t, 0);
console.log("Start transformation by model...");
model.predict(t).dispose(); // gets stuck here forever
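Since predict() gives no feedback while it runs, one generic way to turn a silent hang into an explicit error is to race the result read against a timeout. Note that withTimeout below is a hypothetical helper written for this issue, not part of the tfjs API; the usage comment assumes the variables from the snippet above.

```javascript
// Hypothetical helper: race a promise against a timeout so a hung
// computation surfaces as a rejection instead of blocking forever.
function withTimeout(promise, ms) {
  return Promise.race([
    promise,
    new Promise((_, reject) =>
      setTimeout(() => reject(new Error(`timed out after ${ms} ms`)), ms)),
  ]);
}

// With the snippet above, reading the output data forces execution and
// lets backend errors (or this timeout) surface as a rejected promise:
//   const data = await withTimeout(model.predict(t).data(), 30000);
```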
Code to reproduce the bug / link to feature request
I have uploaded the code and the model artifacts for reproducing the issue here: GitHub repo: tfjs-debug-2. You can clone the repo, start a Python HTTP server with python httpserver.py, and visit localhost:1234.
Any suggestions or advice are appreciated!
Issue Analytics
- Created 5 years ago
- Comments: 7 (2 by maintainers)
Top GitHub Comments
Great!

tf.fromPixels became tf.browser.fromPixels in 1.0. See the "Breaking changes" section in our release notes: https://github.com/tensorflow/tfjs/releases/tag/v1.0.0

Hi,
If you are short on time, you can produce a custom tfjs bundle with that PR and test to see that it fixes your problem.
NOTE: This will give you tfjs 1.x, which means it will only run the new format (model.json), and you have to use the new tf.loadGraphModel API instead of tf.loadFrozenModel. If you need 0.15.x, you'll have to check out the 0.15.x branch from tfjs-core, manually apply the fix in tensorflow/tfjs-core#1622, and follow the steps above.