
ONNX can't export SwishImplementation

See original GitHub issue

Hi, thanks for the PyTorch implementation. It is really good. However, I’ve run into an issue: while exporting with torch.onnx, I get an error. The code is as follows:

import torch
from efficientnet_pytorch import EfficientNet
model = EfficientNet.from_name(model_name='efficientnet-b0')
torch.onnx.export(model, torch.rand(10, 3, 240, 240), "EfficientNet-B0.onnx")

The error I am getting is:

Traceback (most recent call last):
  File "/home/bishshoy/pycharm/EfficientNet-PyTorch/main.py", line 17, in <module>
    main()
  File "/home/bishshoy/pycharm/EfficientNet-PyTorch/main.py", line 7, in main
    torch.onnx.export(model, torch.rand(10,3,240,240), "EfficientNet-B7.onnx")
  File "/home/bishshoy/miniconda3/lib/python3.7/site-packages/torch/onnx/__init__.py", line 27, in export
    return utils.export(*args, **kwargs)
  File "/home/bishshoy/miniconda3/lib/python3.7/site-packages/torch/onnx/utils.py", line 104, in export
    operator_export_type=operator_export_type)
  File "/home/bishshoy/miniconda3/lib/python3.7/site-packages/torch/onnx/utils.py", line 287, in _export
    proto, export_map = graph._export_onnx(params, _onnx_opset_version, defer_weight_export, operator_export_type)
RuntimeError: ONNX export failed: Couldn't export Python operator SwishImplementation

Defined at:
/home/bishshoy/pycharm/EfficientNet-PyTorch/efficientnet_pytorch/utils.py(52): forward
/home/bishshoy/miniconda3/lib/python3.7/site-packages/torch/nn/modules/module.py(477): _slow_forward
/home/bishshoy/miniconda3/lib/python3.7/site-packages/torch/nn/modules/module.py(487): __call__
/home/bishshoy/pycharm/EfficientNet-PyTorch/efficientnet_pytorch/model.py(175): extract_features
/home/bishshoy/pycharm/EfficientNet-PyTorch/efficientnet_pytorch/model.py(193): forward
/home/bishshoy/miniconda3/lib/python3.7/site-packages/torch/nn/modules/module.py(477): _slow_forward
/home/bishshoy/miniconda3/lib/python3.7/site-packages/torch/nn/modules/module.py(487): __call__
/home/bishshoy/miniconda3/lib/python3.7/site-packages/torch/jit/__init__.py(252): forward
/home/bishshoy/miniconda3/lib/python3.7/site-packages/torch/nn/modules/module.py(489): __call__
/home/bishshoy/miniconda3/lib/python3.7/site-packages/torch/jit/__init__.py(197): get_trace_graph
/home/bishshoy/miniconda3/lib/python3.7/site-packages/torch/onnx/utils.py(192): _trace_and_get_graph_from_model
/home/bishshoy/miniconda3/lib/python3.7/site-packages/torch/onnx/utils.py(224): _model_to_graph
/home/bishshoy/miniconda3/lib/python3.7/site-packages/torch/onnx/utils.py(281): _export
/home/bishshoy/miniconda3/lib/python3.7/site-packages/torch/onnx/utils.py(104): export
/home/bishshoy/miniconda3/lib/python3.7/site-packages/torch/onnx/__init__.py(27): export
/home/bishshoy/pycharm/EfficientNet-PyTorch/main.py(7): main
/home/bishshoy/pycharm/EfficientNet-PyTorch/main.py(17): <module>


Graph we tried to export:
graph(%input.1 : Float(10, 3, 240, 240)
      %1 : Float(32, 3, 3, 3)
      %2 : Float(32)
      %3 : Float(32)
      %4 : Float(32)
      %5 : Float(32)
      %6 : Long()
...
...
...

Issue Analytics

  • State: open
  • Created 4 years ago
  • Reactions: 2
  • Comments: 8 (2 by maintainers)

Top GitHub Comments

64 reactions
lukemelas commented, Oct 17, 2019

Hi, thanks for the issue. The latest update to the repo (pip version 0.5.1) includes two versions of the Swish function, one for training and another for exporting. To switch to the export-friendly version, use .set_swish(memory_efficient=False). For example:

model = EfficientNet.from_name(model_name='efficientnet-b0')
model.set_swish(memory_efficient=False)
torch.onnx.export(model, torch.rand(10,3,240,240), "EfficientNet-B0.onnx")

Let me know if this does or does not work for you.
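
For context, the memory-efficient Swish is implemented as a custom torch.autograd.Function, which the ONNX tracer cannot translate into ONNX operators, while the export-friendly version is built from plain tensor ops. A paraphrased sketch of the two variants (not the repository’s exact code) looks like this:

import torch
import torch.nn as nn

class SwishImplementation(torch.autograd.Function):
    # Memory-efficient Swish: recomputes sigmoid in backward instead of
    # keeping the full intermediate activation around.
    @staticmethod
    def forward(ctx, x):
        result = x * torch.sigmoid(x)
        ctx.save_for_backward(x)
        return result

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        sx = torch.sigmoid(x)
        return grad_output * (sx * (1 + x * (1 - sx)))

class MemoryEfficientSwish(nn.Module):
    def forward(self, x):
        return SwishImplementation.apply(x)   # opaque Python op to the ONNX exporter

class Swish(nn.Module):
    def forward(self, x):
        return x * torch.sigmoid(x)           # standard ops, traces and exports cleanly

Calling set_swish(memory_efficient=False) swaps the former for the latter throughout the model, so the traced graph contains only standard operators.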

2 reactions
joakimlindblad commented, Mar 12, 2020

So, just a reminder to update the ONNX export example in the README: https://github.com/zhanghang1989/EfficientNet-PyTorch/blob/master/README.md
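
For reference, an export-friendly example along the lines of the fix above might look like this sketch (the onnx sanity check at the end is optional, assumes the onnx package is installed, and is not taken from the repository’s README):

import torch
import onnx
from efficientnet_pytorch import EfficientNet

# Build the model and switch to the export-friendly Swish before tracing
model = EfficientNet.from_name(model_name='efficientnet-b0')
model.set_swish(memory_efficient=False)
model.eval()

dummy_input = torch.rand(10, 3, 240, 240)
torch.onnx.export(model, dummy_input, "EfficientNet-B0.onnx")

# Optional: verify the exported graph is well-formed
onnx.checker.check_model(onnx.load("EfficientNet-B0.onnx"))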

Read more comments on GitHub >

Top Results From Across the Web

EfficientNet SwishImplementation Error - PyTorch Lightning
I'm new to PyTorch Lightning and I am trying to export my model (EfficientNet based) to .onnx. According to this forum (ONNX...
Read more >
RuntimeError: ONNX export failed: Couldn't export Python ...
RuntimeError: ONNX export failed: Couldn't export Python operator SwishImplementation. Solution: weights_path = 'path to the model weights'
Read more >
How to Convert your ML Model for Cloud Inferencing (Part 3a)
Step 1: Export to ONNX. Let's start by test-converting the base model, EfficientNet in our example, to ONNX. We can then add...
Read more >
Export from PyTorch | Docs - Snap Inc.
onnx in PyTorch. The code itself is simple. First we import torch and build a test model. import torch
Read more >
Couldn't Export Pytorch Model To Onnx - ADocLib
To export a model we call the torch.onnx.export function. ... error: RuntimeError: ONNX export failed: Couldn't export Python operator SwishImplementation.
Read more >
