loadModel from url doesn't work in Node
See original GitHub issue (reported by another user, which is why there is no stack trace).
loadModel with a URL path doesn't work in Node. This is most likely because fetch is missing in Node. We should detect the environment and use Node's built-in HTTP module, or conditionally import node-fetch when we are not in the browser.
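The conditional import the issue proposes could be sketched roughly like this. This is a minimal illustration, not the library's actual code; the choice of node-fetch as the fallback package is an assumption taken from the issue text.

```javascript
// Sketch: resolve a fetch implementation depending on the environment.
// Assumes node-fetch is available as an optional dependency in Node (hypothetical).
function resolveFetch() {
  // Browsers (and Node >= 18) expose a global fetch.
  if (typeof fetch === 'function') {
    return fetch;
  }
  // Older Node versions: fall back to the node-fetch polyfill.
  return require('node-fetch');
}

// A URL-based model loader would then call resolveFetch() instead of
// referencing the global fetch directly.
const fetchFn = resolveFetch();
```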
cc @nsthorat, @tafsiri for ideas on Node <-> browser interop.
Issue Analytics
- State:
- Created 5 years ago
- Comments: 23 (5 by maintainers)
Top Results From Across the Web

TFJS-Node: How to load model from url? - Stack Overflow
I want to load a model from an url in node. This works in the browser: mobileNet = await tf.loadModel('https://storage.googleapis.

lstm example with node.js, load, and http requests
Now I'm trying to take that code and run it via node on a remote server. ... loadModel (/work/lstm_model_loader/node_modules/@tensorflow/tfjs/node_modules/@ ...

Save and load models | TensorFlow.js
Saving a model in node.js does not prevent it from being loaded in the browser. Below we will examine the different schemes available...

An introduction to AI in Node.js - IBM Developer
In this tutorial, you'll get an overview of using AI in your Node.js applications by using TensorFlow.js. To work through this learning path ...

Loading models into tensorflow.js via react.js | by Manfye Goh
If you get the console log of "Load model success" means your model is ... while the layered model doesn't have the node...
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
Top Related Hashnode Post
No results found
Top GitHub Comments
I get a fetch error even when using a local file, running tfjs 13.1 and Node 8.11.
The model was saved from Keras with the Python package.
Update – I also get an error when trying to run the example file loader code from @caisq: https://github.com/caisq/tfjs-dump/tree/master/tfjs-node-doodle
FYI, if you use tfjs-node or tfjs-node-gpu, loading a tf.LayersModel (i.e., a model converted from Keras or constructed in TF.js itself) should work with the latest release.
Code sample:
package.json looks like:
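A minimal sketch of such a package.json (the package name and version ranges are assumptions for illustration, not the commenter's original file):

```json
{
  "name": "tfjs-node-load-model",
  "version": "1.0.0",
  "dependencies": {
    "@tensorflow/tfjs-node": "^1.0.0"
  }
}
```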
Node.js code looks like:
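A minimal sketch of the loading code (the model path is a placeholder; assumes a Keras model converted to TF.js format and the file:// handler that tfjs-node provides):

```javascript
// Load a converted Keras model in Node via tfjs-node.
const tf = require('@tensorflow/tfjs-node');

async function main() {
  // Placeholder path: point this at your converted model.json.
  const model = await tf.loadLayersModel('file://./model/model.json');
  model.summary();
}

main();
```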