
How to export custom ops to ONNX

See original GitHub issue

I am trying to export a custom op (an NMS-style op) to ONNX. I define the static symbolic method. When the op has a single output, I get the custom ONNX node; however, when the op has more than one output, the custom node gets optimized away. Here is my minimal code.

import torch
import torch.nn as nn

def func1(a, b):
    c = torch.add(a, b)
    d = torch.sub(c, a)
    return [c, d]

class NMSop(torch.autograd.Function):

    @staticmethod
    def forward(ctx, a, thr):
        # forward produces two outputs
        return func1(a, thr)

    @staticmethod
    def symbolic(g, a, thr):
        # symbolic emits a node with a single output
        x = g.op('Aten::NMS', a, thr)
        return x

class CustomNet(nn.Module):

    def forward(self, a, b):
        return NMSop.apply(a, b)

net = CustomNet()
t = torch.randn(3, 4)
s = torch.randn(3, 4)
torch.onnx.export(net, (t, s), 'test.onnx', verbose=True)

If func1 returns only c, I get the NMS node; if it returns [c, d], I do not. Is there a way to handle multiple outputs?

Issue Analytics

  • State: open
  • Created a year ago
  • Comments: 5 (2 by maintainers)

Top GitHub Comments

1 reaction
zgplvyou commented, Sep 9, 2022

Sorry, I do not know how to add shape inference for the symbolic in PyTorch. Maybe you can add a custom schema for this.

OK, I traced the graph; the shape information is indeed missing in the JIT graph. I will keep looking for a solution next week. Anyway, happy Mid-Autumn Festival.

0 reactions
grimoire commented, Sep 9, 2022

Sorry, I do not know how to add shape inference for the symbolic in PyTorch. Maybe you can add a custom schema for this.


Top Results From Across the Web

Custom operators | onnxruntime
Using Custom Ops with TF2ONNX: This notebook covers converting a TF model using an existing custom op, defining new custom ops in Python...

How to make custom operator in onnx and run it in ... - GitHub
I want to change one operator in my onnx model to a set of substitution operators which can run on my hardware backend....

onnx custom op registration - python - Stack Overflow
First: You need to implement the operator that you try to use in python. Second: You need to register the operator you have...

How to convert Tensorflow2 Model to ONNX using tf2onnx ...
Using the Custom Ops requires trial and error method of converting the model. First, try in the command line option. Command Line Option....

Custom ONNX* Operators - OpenVINO™ Documentation
The ONNX* importer provides a mechanism to register custom ONNX operators based on predefined or custom nGraph operations. The function responsible for ...
