tfjs-tflite gives the error RuntimeError: Aborted(). Build with -s ASSERTIONS=1 for more info.
See original GitHub issue

I’m trying to use the tfjs-tflite package to run a segmentation model, but when I load the model, I get the following warning from tflite_web_api_cc_simd.js:9:
INFO: Created TensorFlow Lite XNNPACK delegate for CPU.
And when I try to run the model, I get the following error from the same line:
RuntimeError: Aborted(). Build with -s ASSERTIONS=1 for more info.
at abort (tflite_web_api_cc_simd.js:9:9277)
at _abort (tflite_web_api_cc_simd.js:9:59963)
at tflite_web_api_cc_simd.wasm:0x1e418
at tflite_web_api_cc_simd.wasm:0x2c4ac6
at tflite_web_api_cc_simd.wasm:0x39623
at tflite_web_api_cc_simd.wasm:0x49174
at tflite_web_api_cc_simd.wasm:0x3084f7
at tflite_web_api_cc_simd.wasm:0x17b2a
at TFLiteWebModelRunner$Infer [as Infer] (eval at new_ (tflite_web_api_cc_simd.js:9:37941), <anonymous>:8:10)
at module$exports$google3$third_party$tensorflow_lite_support$web$tflite_web_api_client.TFLiteWebModelRunner.infer (tflite_web_api_client.js?965d:1745:134)
I don’t understand why it’s being aborted, or what I’m supposed to build with -s ASSERTIONS=1 to get more info…
Can somebody explain the procedure to me?
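For reference, this is roughly the loading and inference path that produces the stack trace above, using the public tfjs-tflite API (loadTFLiteModel / predict). The model URL and input shape below are placeholders rather than the actual model from this issue, and a single output tensor is assumed:

import * as tf from '@tensorflow/tfjs';
import * as tflite from '@tensorflow/tfjs-tflite';

async function run() {
  // Load the .tflite model (placeholder URL).
  const model = await tflite.loadTFLiteModel('/models/segmentation.tflite');

  // Dummy input; the shape must match what the model expects.
  const input = tf.zeros([1, 256, 256, 3]);

  // predict() drives the wasm Infer() call where the abort surfaces.
  const output = model.predict(input);
  console.log(await output.data());
}

run();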
Issue Analytics
- State:
- Created 2 years ago
- Comments: 11 (1 by maintainers)
Top Results From Across the Web
abort([object Object]). Build with -s ASSERTIONS=1 for more ...
I wrote the deploy() in a try/catch block: const deploy = async () => { try { const accounts = await web3.eth.
Read more >
-s ASSERTIONS=1 for more info and Uncaught abort(101 ...
I get the following error and I have no idea why. This error occurs about every 10th time and is gone after a page...
Read more >
error handling - @tensorflow/tfjs-tflite won't load - Stack Overflow
I tried to load up into a local webpage @tensorflow/tfjs-tflite in the ... After further testing, I'm able to load the modules via...
Read more >TFJS-TFLITE给出错误runtimeerror:中止()。使用-s断言 ...
tfjs-tflite gives the error RuntimeError: Aborted(). Build with-s ASSERTIONS=1 for more info.I'm trying to use the tfjs-tflite package to run a segmentation ...
Read more >
@tensorflow/tfjs-tflite - npm
@tensorflow/tfjs-tflite. TypeScript icon, indicating that this package has built-in type declarations. 0.0.1-alpha.9 • Public • Published 3 ...
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
Thanks @jinjingforever. I was able to add some logs to that function and find out that our model was converted without specifying the input size as a parameter, so it flagged its input layer as dynamically sized. Specifying the input size when using the converter actually did the trick. I can now see that several operations are delegated to XNNPACK, but half of them are still run by TFLite.
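One way to spot this condition from the JavaScript side, before fixing the conversion, is to inspect the loaded model’s declared input shapes, where a dynamically sized dimension typically shows up as -1. A rough sketch, assuming the inputs property that tfjs-tflite’s TFLiteModel exposes via the InferenceModel interface, again with a placeholder model URL:

import * as tflite from '@tensorflow/tfjs-tflite';

async function checkInputShapes() {
  const model = await tflite.loadTFLiteModel('/models/segmentation.tflite'); // placeholder

  // A -1 entry in a shape means that dimension is dynamic, i.e. the
  // model was converted without a fixed input size.
  for (const info of model.inputs) {
    console.log(info.name, info.shape);
  }
}

checkInputShapes();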
I don’t know if this has a big impact on the inference speed or not, but from my tests, even when using the simd_threaded version of XNNPACK, I do not get a performance improvement compared to running with tfjs in the browser (I compare quantized models on both engines)… Would you have any tips on what I could try to improve the performance further? Around half of the inference time is taken by the predict() method, while the other half is taken by the data() method that retrieves the values from the tensors.
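For what it’s worth, a simple way to confirm that split is to time the two calls separately. A rough sketch, reusing the placeholder model and input from the first snippet above (and the same single-output-tensor assumption); predict() runs the interpreter synchronously, while data() resolves a promise:

const t0 = performance.now();
const output = model.predict(input); // runs the TFLite interpreter
const t1 = performance.now();
const values = await output.data();  // copies the results out of the tensor
const t2 = performance.now();
console.log(`predict: ${(t1 - t0).toFixed(1)} ms, data: ${(t2 - t1).toFixed(1)} ms`);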
Wow, that sounds really promising! I can’t wait to try it out! Thanks a lot for your help, it was extremely appreciated 😃