Export to ONNX Error
See original GitHub issue

Hi, thanks for the great work. Following my question here, I tried to convert a model to ONNX using this repo, but I got several errors.
Running this command, just like in your example, I got a segmentation fault:
sudo python3 onnx_export.py --model mobilenetv3_100 ./mobilenetv3_100.onnx
==> Creating PyTorch mobilenetv3_100 model
==> Exporting model to ONNX format at './mobilenetv3_100.onnx'
==> Loading and checking exported model from './mobilenetv3_100.onnx'
Segmentation fault
When I tried efficientnet_b0, both with and without a checkpoint:
sudo python3 onnx_export.py --model efficientnet_b0 ./efficientnet.onnx
or
sudo python3 onnx_export.py --model efficientnet_b0 --checkpoint ../train/model_best.pth.tar --num-classes 30 ./efficientnet.onnx
I got a Couldn't export Python operator SwishAutoFn error:
==> Creating PyTorch efficientnet_b0 model
=> Loading checkpoint '../train/20191015-Deepeye36k-efficientnet_b0-224/model_best.pth.tar'
=> Loaded checkpoint '../train/20191015-Deepeye36k-efficientnet_b0-224/model_best.pth.tar'
==> Exporting model to ONNX format at './efficientnet.onnx'
Traceback (most recent call last):
File "onnx_export.py", line 75, in <module>
main()
File "onnx_export.py", line 59, in main
input_names=input_names, output_names=output_names)
File "/home/ivan/.local/lib/python3.6/site-packages/torch/onnx/__init__.py", line 26, in _export
result = utils._export(*args, **kwargs)
File "/home/ivan/.local/lib/python3.6/site-packages/torch/onnx/utils.py", line 394, in _export
operator_export_type, strip_doc_string, val_keep_init_as_ip)
RuntimeError: ONNX export failed: Couldn't export Python operator SwishAutoFn
Any help would be appreciated, thanks.
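For the SwishAutoFn failure: the memory-efficient Swish is implemented as a custom torch.autograd.Function, which the ONNX tracer cannot translate into ONNX operators. One workaround is to swap in a plain Swish built from primitive ops before exporting. A sketch under assumptions (LegacySwish below is a hypothetical stand-in for whatever module wraps SwishAutoFn in your model; newer timm releases also expose an exportable=True flag on create_model that selects export-friendly activations, which is worth checking in your installed version):

```python
# Sketch: replace a non-exportable Swish module with one built from
# primitive ops, which the ONNX tracer can handle. LegacySwish is a
# hypothetical placeholder, not timm's actual class name.
import torch
import torch.nn as nn

class ExportableSwish(nn.Module):
    """Swish as primitive ops; ONNX traces this to Sigmoid + Mul."""
    def forward(self, x):
        return x * torch.sigmoid(x)

class LegacySwish(nn.Module):
    """Hypothetical stand-in for a module that calls SwishAutoFn."""
    def forward(self, x):
        return x * torch.sigmoid(x)

def replace_swish(module, swish_types):
    # Recursively swap matching activation modules in-place so the
    # whole model traces to exportable ONNX ops.
    for name, child in module.named_children():
        if isinstance(child, swish_types):
            setattr(module, name, ExportableSwish())
        else:
            replace_swish(child, swish_types)

net = nn.Sequential(nn.Linear(4, 4), LegacySwish())
replace_swish(net, (LegacySwish,))
print(type(net[1]).__name__)  # -> ExportableSwish
```

Running replace_swish on the model before calling torch.onnx.export should avoid the "Couldn't export Python operator" error, at the cost of the memory-efficient backward pass (which is irrelevant for inference-time export).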
Issue Analytics
- Created 4 years ago
- Comments: 5 (3 by maintainers)
@ivder Good to hear. You should be able to use the models with any ONNX runtime or conversion tool (https://github.com/microsoft/onnxruntime, https://github.com/onnx/onnx-tensorrt, etc) that supports the same file format and operator versions as you export in. Caffe2 is just the default available runtime if you have PyTorch installed.
@ivder It looks like the opposite of what I expected (PyTorch/ONNX being too new). I get the same segfault with PyTorch 1.3 + ONNX 1.6 installed, and a different crash with PyTorch 1.3 + ONNX 1.5. It works with PyTorch 1.2 + ONNX 1.5.
Some related issues: https://github.com/onnx/onnx/issues/2417 https://github.com/onnx/onnx/issues/2394