
Proper way to export the model to ONNX

import torch
import torchreid
import onnx

# List the available model architectures.
torchreid.models.show_avai_models()

# Build the model and load the pretrained MSMT17 weights.
model = torchreid.models.build_model(name='osnet_ain_x1_0', num_classes=1000)
torchreid.utils.load_pretrained_weights(model, "osnet_ain_x1_0_msmt17_256x128_amsgrad_ep50_lr0.0015_coslr_b64_fb10_softmax_labsmth_flip_jitter.pth")

# Export with a dummy input at the training resolution (256x128).
input_name = ['input']
output_name = ['output']
dummy_input = torch.randn(1, 3, 256, 128)
torch.onnx.export(model, dummy_input, 'osnet_ain_x1_0.onnx',
                  input_names=input_name, output_names=output_name,
                  verbose=True, export_params=True)

The exported ONNX file is only 10,633 KB, while the PyTorch checkpoint is 16,888 KB.

onnx_model = onnx.load("osnet_ain_x1_0.onnx")
onnx.checker.check_model(onnx_model)

  1. The output messages all seem fine, but is this the correct way to export (see the sketch after these questions)?
  2. What num_classes should I set?
  3. Am I using the correct input_names and output_names?

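Not from the thread, but as a point of reference: a sketch of a slightly more defensive export under the same checkpoint. It calls model.eval() explicitly (in eval mode OSNet returns the 512-d feature vector rather than classifier logits, which also plausibly explains the size gap: parameters the traced graph never touches are not serialized into the ONNX file) and marks the batch dimension as dynamic. The dynamic_axes names are illustrative. As for num_classes, it only sizes the training classifier head, which the eval-mode trace skips, so its value should not affect the exported features; the working code further down uses 1041, the MSMT17 train identity count.

import torch
import torchreid
import onnx

model = torchreid.models.build_model(name='osnet_ain_x1_0', num_classes=1000)
torchreid.utils.load_pretrained_weights(model, "osnet_ain_x1_0_msmt17_256x128_amsgrad_ep50_lr0.0015_coslr_b64_fb10_softmax_labsmth_flip_jitter.pth")
model.eval()  # feature-extraction mode; the classifier head is not traced

dummy_input = torch.randn(1, 3, 256, 128)
torch.onnx.export(
    model, dummy_input, 'osnet_ain_x1_0.onnx',
    input_names=['input'], output_names=['output'],
    # allow any batch size at inference time
    dynamic_axes={'input': {0: 'batch'}, 'output': {0: 'batch'}},
    export_params=True,
)

# Sanity-check the exported graph.
onnx.checker.check_model(onnx.load('osnet_ain_x1_0.onnx'))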

Issue Analytics

  • State: closed
  • Created: 4 years ago
  • Comments: 19 (2 by maintainers)

Top GitHub Comments

1 reaction
mikel-brostrom commented, Aug 6, 2022

I have a working multibackend (ONNX, OpenVINO and TFLite) class for the ReID models that I managed to export (mobilenet, resnet50 and osnet models) with my export script. My export pipeline is: PT --> ONNX --> OpenVINO --> TFLite. The osnet models fail in the OpenVINO export; mobilenet and resnet50 go all the way through. Feel free to experiment with it; it is in working condition, as shown by my CI pipeline. Don’t forget to drop a PR if you have any improvements! 😄
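The class itself lives in that repo rather than in this thread, but the dispatch idea is roughly the following hypothetical sketch: the backend is picked by file suffix and each runtime is imported lazily. ReIDBackend is an illustrative name, not the author's API, and the OpenVINO branch is omitted here.

class ReIDBackend:
    """Hypothetical multibackend feature extractor: picks a runtime by file suffix."""

    def __init__(self, weights):
        self.weights = str(weights)
        if self.weights.endswith('.onnx'):
            import onnxruntime
            self.session = onnxruntime.InferenceSession(self.weights)
            self.kind = 'onnx'
        elif self.weights.endswith('.tflite'):
            import tensorflow as tf
            self.interpreter = tf.lite.Interpreter(model_path=self.weights)
            self.interpreter.allocate_tensors()
            self.kind = 'tflite'
        else:
            raise ValueError(f'unsupported weights file: {weights}')

    def __call__(self, batch):
        # batch: float32 numpy array of shape (N, 3, 256, 128), NCHW
        if self.kind == 'onnx':
            name = self.session.get_inputs()[0].name
            return self.session.run(None, {name: batch})[0]
        # TFLite expects NHWC, so transpose the NCHW batch.
        inp = self.interpreter.get_input_details()[0]
        out = self.interpreter.get_output_details()[0]
        self.interpreter.set_tensor(inp['index'], batch.transpose(0, 2, 3, 1))
        self.interpreter.invoke()
        return self.interpreter.get_tensor(out['index'])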

1 reaction
stereomatchingkiss commented, Feb 1, 2020

I would love to see a summary of the model before even considering anything else.

At last I got the output; it has to be printed out manually, and since it is too long I put it on pastebin.

This time I tested it against the output values.

import torch
import onnx
import torchreid

torchreid.models.show_avai_models()

# MSMT17 has 1041 training identities.
model = torchreid.models.build_model(name='osnet_ain_x1_0', num_classes=1041)
torchreid.utils.load_pretrained_weights(model, "osnet_ain_x1_0_msmt17_256x128_amsgrad_ep50_lr0.0015_coslr_b64_fb10_softmax_labsmth_flip_jitter.pth")

# Eval mode: the forward pass returns the feature vector, not logits.
model.eval()

# An example input you would normally provide to your model's forward() method.
input = torch.ones(1, 3, 256, 128)
raw_output = model(input)

torch.onnx.export(model, input, 'osnet_ain_x1_0.onnx', verbose=False, export_params=True)

print("-------------------------check model---------------------------------------\n")

try:
    onnx_model = onnx.load("osnet_ain_x1_0.onnx")
    onnx.checker.check_model(onnx_model)
    graph_output = onnx.helper.printable_graph(onnx_model.graph)
    with open("graph_output.txt", mode="w") as fout:
        fout.write(graph_output)
except Exception as e:
    print("Something went wrong:", e)

import onnxruntime
import numpy as np

ort_session = onnxruntime.InferenceSession("osnet_ain_x1_0.onnx")

def to_numpy(tensor):
    return tensor.detach().cpu().numpy() if tensor.requires_grad else tensor.cpu().numpy()

# compute ONNX Runtime output prediction
ort_inputs = {ort_session.get_inputs()[0].name: to_numpy(input)}
ort_outs = ort_session.run(None, ort_inputs)

# compare ONNX Runtime and PyTorch results
np.testing.assert_allclose(to_numpy(raw_output), ort_outs[0], rtol=1e-03, atol=1e-05)

print("Exported model has been tested with ONNXRuntime, and the result looks good!")

But the assertion fails:

Not equal to tolerance rtol=0.001, atol=1e-05

Mismatched elements: 13 / 512 (2.54%)
Max absolute difference: 0.00070047
Max relative difference: 0.02552379
 x: array([[9.402755e-05, 2.179507e+00, 3.834043e-01, 2.136301e-01,
        0.000000e+00, 0.000000e+00, 8.341767e-01, 2.012278e+00,
        3.984281e-01, 5.461890e-02, 2.032685e-01, 7.745304e-06,...
 y: array([[9.402186e-05, 2.179636e+00, 3.834159e-01, 2.135554e-01,
        0.000000e+00, 0.000000e+00, 8.342320e-01, 2.012077e+00,
        3.984202e-01, 5.461205e-02, 2.032657e-01, 7.745568e-06,...
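A mismatch of this size (13 of 512 elements, max absolute difference around 7e-4) is typical float32 trace-versus-eager noise rather than a broken export. For ReID embeddings, the cosine similarity between the two outputs is usually the more meaningful check; a sketch, reusing the names from the script above:

import numpy as np

a = to_numpy(raw_output).ravel()
b = ort_outs[0].ravel()

# Cosine similarity between the PyTorch and ONNX Runtime embeddings;
# values very close to 1.0 indicate the export preserved the features.
cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
print(f"cosine similarity: {cos:.6f}")

# Alternatively, loosen the tolerances for an element-wise check.
np.testing.assert_allclose(a, b, rtol=1e-2, atol=1e-3)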

PS: Exporting the model with torch.jit.trace(model, example) also works; the remaining piece is getting the 512 features out.
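For completeness, a minimal sketch of that TorchScript route, assuming the same eval-mode model and example input as above (with model.eval(), the traced forward already returns the 512-d features):

import torch

model.eval()
example = torch.ones(1, 3, 256, 128)

# Trace the eval-mode forward pass and serialize it.
traced = torch.jit.trace(model, example)
traced.save('osnet_ain_x1_0.torchscript.pt')

# Reload and run without needing the torchreid source tree.
reloaded = torch.jit.load('osnet_ain_x1_0.torchscript.pt')
features = reloaded(example)
print(features.shape)  # expected: torch.Size([1, 512])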


Top Results From Across the Web

(optional) Exporting a Model from PyTorch to ONNX and ...
To export a model, we call the torch.onnx.export() function. This will execute the model, recording a trace of what operators are used to...

How to Convert a PyTorch Model to ONNX in 5 Minutes - Deci AI
Converting deep learning models from PyTorch to ONNX is quite straightforward. Start by loading a pre-trained ResNet-50 model from PyTorch's ...

Tutorial 5: Exporting a model to ONNX
We provide a python script to export the pytorch model trained by MMPose to ONNX. python tools/deployment/pytorch2onnx.py ${CONFIG_FILE} ${CHECKPOINT_FILE} ...

Best Practices for Neural Network Exports to ONNX
Our experience shows that it is easier to export PyTorch models. If possible, choose a PyTorch source and convert it using the built-in torch.onnx...

Export to ONNX - Transformers - Hugging Face
In this guide, we'll show you how to export Transformers models to ONNX ... all values close (atol: 1e-05) All good, model saved...
