
Can't export mobilenetv3 model to ONNX

See original GitHub issue

import numpy as np
import torch
from torchvision import models

model = models.mobilenet_v3_small(pretrained=True)
input_np = np.random.uniform(0, 1, (1, 3, 224, 224))
input_var = torch.FloatTensor(input_np)
torch.onnx.export(model, args=(input_var,), f="cnn.onnx", verbose=False,
                  input_names=["input"], output_names=["output"])

Error: RuntimeError: Exporting the operator hardsigmoid to ONNX opset version 9 is not supported. Please feel free to request support or submit a pull request on PyTorch GitHub.

cc @neginraoof
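
For reference, the "opset version 9" in the error is simply torch.onnx.export's default when no opset_version argument is passed. A minimal sketch of making the opset explicit is below; note that on a PyTorch build whose exporter has no hardsigmoid symbolic at all (as here), raising the opset alone will not remove the error, and the workarounds in the comments below are still needed.

torch.onnx.export(model, args=(input_var,), f="cnn.onnx", verbose=False,
                  opset_version=11,
                  input_names=["input"], output_names=["output"])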

Issue Analytics

  • State: open
  • Created: 3 years ago
  • Comments: 8 (7 by maintainers)

Top GitHub Comments

4 reactions
zhiqwang commented, Mar 11, 2021

Hi @jkparuchuri

Sorry, I haven't fully tested my proposal, and you are right. I had only tested exporting nn.Hardswish before and missed the Hardsigmoid in

https://github.com/pytorch/vision/blob/c991db82abba12e664eeac14c9b643d0f1f1a7df/torchvision/models/mobilenetv3.py#L35

As you mentioned, torch.onnx.export doesn't support this operator. I replaced it with F.hardtanh to work around the problem (I've now tested the whole model). You can do something like the following to address this error.

diff --git a/torchvision/models/mobilenetv3.py b/torchvision/models/mobilenetv3.py
index 1e2606d..fdf1f6d 100644
--- a/torchvision/models/mobilenetv3.py
+++ b/torchvision/models/mobilenetv3.py
@@ -32,7 +32,7 @@ class SqueezeExcitation(nn.Module):
         scale = self.fc1(scale)
         scale = self.relu(scale)
         scale = self.fc2(scale)
-        return F.hardsigmoid(scale, inplace=inplace)
+        return F.hardtanh(scale + 3, 0., 6., inplace=inplace) / 6.
 
     def forward(self, input: Tensor) -> Tensor:
         scale = self._scale(input, True)

Now there are two ways to solve this bug:

  1. Add native support for exporting F.hardsigmoid to ONNX.
  2. Replace F.hardsigmoid with F.hardtanh, which is export-friendly and numerically equivalent, as I did above (a runtime variant of this workaround is sketched right after this list).
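
If you'd rather not patch the torchvision sources, option 2 can also be applied at runtime. The following is only a minimal sketch of that idea (a quick monkey-patch, not a proper fix), assuming torchvision's SqueezeExcitation calls F.hardsigmoid from torch.nn.functional as in the file linked above:

# Minimal runtime sketch of option 2: shadow F.hardsigmoid with the numerically
# equivalent F.hardtanh expression before export, so the exporter never sees
# the unsupported operator. This is a quick monkey-patch, not a proper fix.
import torch
import torch.nn.functional as F
from torchvision import models

def _hardsigmoid_export_friendly(x, inplace=False):
    # hardsigmoid(x) = clamp(x + 3, 0, 6) / 6
    return F.hardtanh(x + 3, 0., 6., inplace=inplace) / 6.

F.hardsigmoid = _hardsigmoid_export_friendly  # picked up by torchvision's F.hardsigmoid call

model = models.mobilenet_v3_small(pretrained=True).eval()
dummy = torch.randn(1, 3, 224, 224)
torch.onnx.export(model, dummy, "cnn.onnx", opset_version=11,
                  input_names=["input"], output_names=["output"])

On PyTorch 1.7.x, the nn.Hardswish export issue discussed in the comment below may still require its own workaround.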

Also, the export of mobilenetv3 to ONNX is missing from the unit tests; maybe we could add a test like test_shufflenet_v2_dynamic_axes (a rough sketch follows).
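
For illustration, here is a rough standalone sketch of what such a test could look like. It does not use torchvision's actual ONNX test harness, and test_shufflenet_v2_dynamic_axes may be structured differently; it simply exports with a dynamic batch dimension and validates the resulting graph.

# Rough standalone sketch of a dynamic-axes export test; not torchvision's
# actual test harness.
import io
import onnx
import torch
from torchvision import models

def test_mobilenet_v3_dynamic_axes():
    model = models.mobilenet_v3_small(pretrained=False).eval()
    dummy = torch.randn(1, 3, 224, 224)
    buf = io.BytesIO()
    torch.onnx.export(
        model, dummy, buf, opset_version=11,
        input_names=["input"], output_names=["output"],
        dynamic_axes={"input": {0: "batch"}, "output": {0: "batch"}},
    )
    buf.seek(0)
    onnx.checker.check_model(onnx.load(buf))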

cc @datumbox @fmassa What is your suggestion?

3 reactions
zhiqwang commented, Feb 26, 2021

Hi,

It seems that nn.Hardswish caused this problem. The nightly version of PyTorch has actually addressed it, which is why the unit test passes there.

If you are using PyTorch 1.7.x, you can replace it with an export-friendly version of Hardswish, as below, and set the ONNX opset version to 11.

import torch.nn as nn
import torch.nn.functional as F

class Hardswish(nn.Module):
    """
    Export-friendly version of nn.Hardswish()
    """
    def __init__(self):
        super().__init__()

    def forward(self, x):
        # hardswish(x) = x * hardsigmoid(x) = x * clamp(x + 3, 0, 6) / 6
        return x * F.hardtanh(x + 3, 0., 6.) / 6.
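
A minimal usage sketch of this idea follows. It assumes the Hardswish class above is already defined in scope and that the activations to swap are nn.Hardswish modules; module names and layout may differ between torchvision versions, and on PyTorch 1.7.x the F.hardsigmoid issue from the comment above still needs its own workaround.

# Usage sketch: recursively swap every nn.Hardswish module for the
# export-friendly version defined above, then export with opset 11.
import torch
from torchvision import models

def replace_hardswish(module):
    for name, child in module.named_children():
        if isinstance(child, nn.Hardswish):
            setattr(module, name, Hardswish())
        else:
            replace_hardswish(child)

model = models.mobilenet_v3_small(pretrained=True).eval()
replace_hardswish(model)
dummy = torch.randn(1, 3, 224, 224)
torch.onnx.export(model, dummy, "mobilenet_v3_small.onnx", opset_version=11,
                  input_names=["input"], output_names=["output"])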

