
loadModel from url doesn't work in Node

See original GitHub issue

(Reported by another user, which is why I don’t have the stack trace.)

loadModel with a URL path doesn’t work in Node. This is most likely related to fetch being unavailable in Node. We should detect the environment and use Node’s built-in http module, or conditionally import node-fetch, when we are not in the browser.

cc @nsthorat, @tafsiri for ideas on node <-> browser interop.
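The environment check proposed above can be sketched as follows (a minimal sketch; isBrowser and pickFetch are illustrative names, not tfjs internals):

```javascript
// Minimal sketch of the proposed detection: use the native fetch when it
// exists, fall back to node-fetch outside the browser. Names are illustrative.
function isBrowser() {
  return typeof window !== 'undefined' && typeof window.document !== 'undefined';
}

function pickFetch() {
  if (typeof fetch === 'function') return fetch;   // browser, or Node >= 18
  if (!isBrowser()) return require('node-fetch');  // older Node: polyfill
  throw new Error('no fetch implementation available');
}
```

A library would run a check like this once at load time and register the matching IO handler, so callers never see the difference.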

Issue Analytics

  • State: closed
  • Created: 5 years ago
  • Comments: 23 (5 by maintainers)

Top GitHub Comments

4 reactions
limscoder commented, Sep 27, 2018

I get a fetch error even when using a local file, running tfjs 13.1 and Node 8.11.

The model was saved from Keras with the Python package:

 tfjs.converters.save_keras_model(model, path)

and loaded in Node with:

 model = await tf.loadModel('file:///absolute/path/to/model.json');

which fails with:
(node:71934) UnhandledPromiseRejectionWarning: Error: browserHTTPRequest is not supported outside the web browser without a fetch polyfill.
    at new BrowserHTTPRequest (/Users/thoda22/projects/predictatron/metrics/lstm/node_modules/@tensorflow/tfjs-core/dist/io/browser_http.js:46:19)
    at Object.browserHTTPRequest (/Users/thoda22/projects/predictatron/metrics/lstm/node_modules/@tensorflow/tfjs-core/dist/io/browser_http.js:247:12)
    at Object.<anonymous> (/Users/thoda22/projects/predictatron/metrics/lstm/node_modules/@tensorflow/tfjs-layers/dist/models.js:98:50)
    at step (/Users/thoda22/projects/predictatron/metrics/lstm/node_modules/@tensorflow/tfjs-layers/dist/models.js:42:23)
    at Object.next (/Users/thoda22/projects/predictatron/metrics/lstm/node_modules/@tensorflow/tfjs-layers/dist/models.js:23:53)
    at /Users/thoda22/projects/predictatron/metrics/lstm/node_modules/@tensorflow/tfjs-layers/dist/models.js:17:71
    at new Promise (<anonymous>)
    at __awaiter (/Users/thoda22/projects/predictatron/metrics/lstm/node_modules/@tensorflow/tfjs-layers/dist/models.js:13:12)
    at Object.loadModelInternal (/Users/thoda22/projects/predictatron/metrics/lstm/node_modules/@tensorflow/tfjs-layers/dist/models.js:92:12)
    at Object.loadModel (/Users/thoda22/projects/predictatron/metrics/lstm/node_modules/@tensorflow/tfjs-layers/dist/exports.js:17:21)

Update – I also get an error when trying to run the example file loader code from @caisq: https://github.com/caisq/tfjs-dump/tree/master/tfjs-node-doodle
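For older tfjs versions that still route through browserHTTPRequest, a commonly suggested workaround for the error above is to expose a global fetch before loading the model. This is a hedged sketch: it assumes the separate node-fetch package (not part of tfjs), and on Node >= 18 fetch is already built in.

```javascript
// Sketch of the fetch-polyfill workaround for the error above. On Node < 18
// this assumes `npm install node-fetch@2`; on Node >= 18 fetch is built in.
function ensureFetch() {
  if (typeof fetch === 'function') return 'native';
  try {
    global.fetch = require('node-fetch'); // only works if node-fetch is installed
    return 'polyfilled';
  } catch (err) {
    return 'missing'; // install node-fetch, or switch to @tensorflow/tfjs-node
  }
}

ensureFetch();
// ...after which loading over http(s) can proceed. Note the polyfill does not
// help with file:// paths; for local files, tfjs-node is the supported route.
```

Note that node-fetch only speaks http(s), so this addresses the URL case from the issue title but not the local-file case in the comment above.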

3 reactions
caisq commented, Mar 26, 2019

FYI, if you use tfjs-node or tfjs-node-gpu, loading a tf.LayersModel (i.e., a model converted from Keras or constructed from TF.js itself) should be working with the latest release.

Code sample:

package.json looks like:

{
    "devDependencies": {
        "@tensorflow/tfjs-node": "^1.0.2"
    }
}

Node.js code looks like:

const tf = require('@tensorflow/tfjs-node');

(async function() {
    const modelURL = `https://storage.googleapis.com/tfjs-models/tfjs/mobilenet_v1_0.25_224/model.json`;
    const model = await tf.loadLayersModel(modelURL);
    model.summary();
})();
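tfjs-node covers both the remote and the local case above because the loader picks a handler based on the URL scheme; a toy dispatch (illustrative only, not the tfjs implementation) makes the distinction explicit:

```javascript
// Toy sketch: distinguish remote and local model URLs by scheme, the way
// a loader must before choosing an HTTP or filesystem handler. Not tfjs code.
function schemeOf(url) {
  const m = /^([a-z][a-z0-9+.-]*):\/\//i.exec(url);
  return m ? m[1].toLowerCase() : null;
}

console.log(schemeOf('https://storage.googleapis.com/tfjs-models/tfjs/mobilenet_v1_0.25_224/model.json')); // "https"
console.log(schemeOf('file:///absolute/path/to/model.json')); // "file"
```

With that distinction in place, an https URL needs a working fetch (or Node's http module), while a file URL needs filesystem access, which is exactly the split tfjs-node provides and a browser bundle cannot.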

Top Results From Across the Web

  • TFJS-Node: How to load model from url? - Stack Overflow
    I want to load a model from an url in node. This works in the browser: mobileNet = await tf.loadModel('https://storage.googleapis.
  • lstm example with node.js, load, and http requests
    Now I'm trying to take that code and run it via node on a remote server. ... loadModel (/work/lstm_model_loader/node_modules/@tensorflow/tfjs/node_modules/@ ...
  • Save and load models | TensorFlow.js
    Saving a model in node.js does not prevent it from being loaded in the browser. Below we will examine the different schemes available....
  • An introduction to AI in Node.js - IBM Developer
    In this tutorial, you'll get an overview of using AI in your Node.js applications by using TensorFlow.js. To work through this learning path ...
  • Loading models into tensorflow.js via react.js | by Manfye Goh
    If you get the console log of “Load model success” means your model is ... while the layered model doesn't have the node...
