
Getting error when loading wasm backend

See original GitHub issue

System information

  • OS: Windows 7, 32-bit
  • Hardware: PC, Pentium Dual Core CPU E5400 @ 2.70 GHz, 1 GB RAM (also seen on a machine with 4 GB RAM)
  • TensorFlow.js installed from (npm or script link): script
  • TensorFlow.js version: 2.3
  • Browser version: Chrome 85

The functionality (loading the 'wasm' backend) works fine on most of the machines I am using, but I have noticed a few machines where the wasm backend fails to load. The error says: Error: Backend 'wasm' has not yet been initialized. Make sure to await tf.ready() or await tf.setBackend() before calling other methods

Although I am pretty sure that I am not executing any other method before the promise resolves, as shown in the code below.

try {
    BlazeFaceModel = await blazeface.load();
    tf.setBackend('wasm').then(() => callNext());
} catch (error) {
    console.error(error);
}
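One likely explanation (an assumption on my part, not confirmed in the thread): in the snippet above, blazeface.load() runs before tf.setBackend('wasm') resolves, so the model initializes on whatever default backend TF.js picks first. A minimal sketch of the recommended ordering, with `tf` and `blazeface` replaced by hypothetical stand-in stubs so it runs self-contained:

```javascript
// Sketch of awaiting the backend switch BEFORE loading the model.
// `tf` and `blazeface` here are stand-ins, not the real libraries;
// only the await ordering is the point.
const tf = {
  backend: 'cpu', // default backend before any setBackend call
  async setBackend(name) { this.backend = name; return true; },
  async ready() { /* resolves once the active backend is initialized */ },
};
const blazeface = {
  // Records which backend was active when the model was loaded.
  async load() { return { loadedOn: tf.backend }; },
};

async function init() {
  await tf.setBackend('wasm'); // switch first...
  await tf.ready();            // ...and wait for initialization to finish
  return blazeface.load();     // only then load the model
}
```

With this ordering the model loads on the wasm backend; in the original snippet the order is reversed, so the load happens on the default backend.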

I have seen this issue mostly on machines with low-end hardware and where WebGL is not available.

Also, it would be helpful if you could clarify the following:

  1. If we ask tf to set a certain backend and it is not available, will it fall back to another backend?
  2. When I call tf.setBackend('wasm'), why is webgl loading? Is webgl loaded first and then the backend is switched?
  3. I used a promise-based approach (equivalent to await, as mentioned in the console error) to execute the next command only after the backend is set. Why does the error ask me to await again? Should I use both await tf.ready() and await tf.setBackend()?
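On question 1: as far as I know, TF.js does not silently fall back when an explicitly requested backend fails to initialize, but in the 2.x API tf.setBackend resolves to a boolean success flag, so a manual fallback chain can be built on top of it. A sketch under that assumption, again with a stand-in `tf` object that simulates WebGL being unavailable:

```javascript
// Manual backend-fallback sketch, assuming tf.setBackend resolves to
// true/false for success/failure (as in the TF.js 2.x API). `tf` is a
// stand-in that simulates a machine without WebGL.
const tf = {
  backend: null,
  async setBackend(name) {
    if (name === 'webgl') return false; // simulated: WebGL missing
    this.backend = name;
    return true;
  },
  async ready() { /* resolves once the active backend is initialized */ },
};

async function pickBackend(preferred) {
  for (const name of preferred) {
    if (await tf.setBackend(name)) {
      await tf.ready();
      return name; // first backend that initialized successfully
    }
  }
  throw new Error('No usable TF.js backend');
}
```

Trying the list in priority order, e.g. pickBackend(['webgl', 'wasm', 'cpu']), picks the first backend that actually initializes on the current machine.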

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Comments: 7

Top GitHub Comments

1 reaction
rthadur commented, Oct 15, 2020

You are right, it is supported; let's wait for @annxingyuan's response. Thank you

0 reactions
google-ml-butler[bot] commented, Nov 2, 2020

Closing as stale. Please @mention us if this needs more attention.

Read more comments on GitHub >

Top Results From Across the Web

Error when loading WebAssembly file from NestJs backend
I'm trying to load a WebAssembly file into the browser. I'm trying to load this library and when I try to do what...
Read more >
@tensorflow/tfjs-backend-wasm - npm
This package adds a WebAssembly backend to TensorFlow.js. ... If you are loading the WASM backend from jsdelivr through the script tag, ...
Read more >
Building to WebAssembly - Emscripten
The wasm backend is strict about linking files with different features sets - for example, if one file was built with atomics but...
Read more >
Common 503 errors on Fastly | Fastly Help Guides
Similar to backend read errors, connection timeouts can be caused by transient network issues, long trips to origin, and origin latency.
Read more >
Rust and WebAssembly
It comes pre-configured, and you shouldn't have to tweak this at all to get webpack and its local development server working. wasm-game-of-life/www/index.html.
Read more >
