[tfjs-backend-wasm] - chrome tab becomes unresponsive when trying to apply the bokeh effect using wasm backend
TensorFlow.js version
1.5.1
Browser version
Chrome 79.0.3945.130
Describe the problem or feature request
I am using the bodyPix model for person segmentation and then using the segmentation data to draw a bokeh effect on an HTML canvas. The Chrome tab becomes unresponsive when I set the backend to 'wasm'. There are no issues when I use the 'webgl' backend.
Code to reproduce the bug / link to feature request
```javascript
import * as bodyPix from '@tensorflow-models/body-pix';
import * as tf from '@tensorflow/tfjs';
import '@tensorflow/tfjs-backend-wasm';

await tf.setBackend('wasm');

const bpModel = await bodyPix.load({
  architecture: 'MobileNetV1',
  outputStride: 16,
  multiplier: 0.75,
  quantBytes: 2
});

const segmentationData = await bpModel.segmentPerson(this._inputVideoElement, {
  internalResolution: 'low', // resized to 0.25x the original resolution before inference
  maxDetections: 1,          // max number of person poses to detect per image
  segmentationThreshold: 0.7 // probability that a pixel belongs to a person
});

bodyPix.drawBokehEffect(
  this._outputCanvasElement,
  this._inputVideoElement,
  segmentationData,
  15, // background blur, integer between 0 and 20
  7   // edge blur, integer between 0 and 20
);
```
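One thing worth ruling out (an assumption on my part, not something confirmed in this thread) is that the model starts loading before the wasm binary has finished initializing: `tf.setBackend` resolves once the backend is registered, while `tf.ready()` additionally waits for backend initialization to complete. A minimal helper sketch:

```javascript
// Sketch: wait for the wasm backend to be fully initialized before
// loading any model. Returns the name of the active backend.
async function initWasmBackend(tf) {
  const ok = await tf.setBackend('wasm'); // resolves to false on failure
  if (!ok) {
    throw new Error('wasm backend failed to initialize');
  }
  await tf.ready(); // resolves once the backend (wasm binary) is ready
  return tf.getBackend();
}
```

The `tf` argument is passed in explicitly here only so the helper is easy to exercise in isolation; in the app it would be the imported `@tensorflow/tfjs` module.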
Issue Analytics
- Created: 4 years ago
- Comments: 10 (2 by maintainers)
Top GitHub Comments
In my experience, TF.js blocks the thread it runs in during inference, even with the WebGL backend. I moved TF.js usage into a Web Worker to mitigate that, and it fixed the UI blocking (it was most apparent during the ~5 s warmup, but each inference also blocks the UI thread when I intentionally made inference more expensive). The trade-off is that I can now only support Chrome, because inside a Web Worker the WebGL backend is only available through OffscreenCanvas. This approach is common in the community:
- https://erdem.pl/2020/02/making-tensorflow-js-work-faster-with-web-workers
- https://medium.com/@wl1508/webworker-in-tensorflowjs-49a306ed60aa
- https://towardsdatascience.com/tensorflow-js-using-javascript-web-worker-to-run-ml-predict-function-c280e966bcab
@dsmilkov @nsthorat @tylerzhu-github @rthadur I'm wondering: do you think there is another way to avoid blocking the UI/main thread while still supporting more browsers and using the WebGL backend? Thanks, and sorry for mentioning everyone.
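The worker approach described above can be made progressive rather than Chrome-only: feature-detect OffscreenCanvas at runtime and only take the worker path when it is available. A hedged sketch (`segmentation-worker.js` is a hypothetical file name, not something from this thread):

```javascript
// Sketch: decide at runtime whether the worker + OffscreenCanvas path is
// usable; otherwise fall back to main-thread inference.
function canUseWorkerCanvas(scope) {
  return typeof scope.OffscreenCanvas === 'function' &&
         typeof scope.Worker === 'function';
}

if (canUseWorkerCanvas(globalThis)) {
  // Hypothetical worker file that loads TF.js and draws the bokeh frames.
  const worker = new Worker('segmentation-worker.js');
  const offscreen = document
    .querySelector('canvas')
    .transferControlToOffscreen();
  // Transfer canvas ownership to the worker; all rendering happens there.
  worker.postMessage({ canvas: offscreen }, [offscreen]);
} else {
  // Fall back: run bodyPix on the main thread as in the original snippet.
}
```

Firefox and Safari would take the fallback branch until they ship WebGL-capable OffscreenCanvas in workers, so this does not remove the blocking there, but it avoids hard-coding a Chrome-only code path.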