Stuck on an issue?

Lightrun Answers was designed to reduce the constant googling that comes with debugging third-party libraries. It collects links to all the places you might be looking at while hunting down a tough bug.

And, if you’re still stuck at the end, we’re happy to hop on a call to see how we can help out.

ValueError: Outputs values doesn't match between reference model and ONNX exported model

See original GitHub issue

Environment info

Expected behavior

(screenshot of the error output omitted)

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 5

Top GitHub Comments

3 reactions
rumeshmadhusanka commented, Feb 2, 2022

I’m getting the same error for mBART. I’m using a colab notebook with/without GPU.

!pip install transformers[onnx] sentencepiece -q
!python -m transformers.onnx --model=facebook/mbart-large-50 --feature seq2seq-lm-with-past onnx/

Using framework PyTorch: 1.10.0+cu111
ValueError: Outputs values doesn’t match between reference model and ONNX exported model: Got max absolute difference of: 3.5762786865234375e-05
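Note that the reported difference (≈3.6e-5) only barely exceeds the exporter's default tolerance of 1e-5, so this is numerical noise rather than a broken export. A common workaround, assuming your `transformers` version exposes the `--atol` flag on the `transformers.onnx` CLI, is to relax the tolerance. The validation the exporter performs is essentially a max-absolute-difference check, sketched here in plain Python (simplified; the real check compares NumPy arrays produced by PyTorch and ONNX Runtime):

```python
# Minimal sketch of the tolerance check that transformers.onnx performs
# when validating an exported model. Values below are illustrative.

def outputs_match(reference, exported, atol=1e-5):
    """Return (ok, max_diff): whether every element agrees within atol."""
    max_diff = max(abs(r - e) for r, e in zip(reference, exported))
    return max_diff <= atol, max_diff

# A difference of ~3.58e-05 fails the default atol=1e-5 ...
ok, diff = outputs_match([1.0, 2.0], [1.0, 2.0000357627], atol=1e-5)
print(ok)   # False

# ... but passes with a relaxed tolerance, analogous to (hypothetical flags
# aside, --atol does exist in recent transformers releases):
#   python -m transformers.onnx --model=facebook/mbart-large-50 \
#       --feature seq2seq-lm-with-past --atol 1e-4 onnx/
ok, _ = outputs_match([1.0, 2.0], [1.0, 2.0000357627], atol=1e-4)
print(ok)   # True
```

Differences at the 1e-5 scale between PyTorch and ONNX Runtime are expected from float32 operator reordering, so relaxing `atol` slightly is generally safe.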

2 reactions
github-actions[bot] commented, Aug 26, 2021

This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.

Please note that issues that do not follow the contributing guidelines are likely to be ignored.

Read more comments on GitHub >

Top Results From Across the Web

  • Exporting transformers models - Hugging Face
    Starting from transformers v2.10.0 we partnered with ONNX Runtime to provide an easy ... ONNX model outputs' name match reference model ({'pooler_output', ...
  • outputs are different between ONNX and pytorch
    Problem solve by adding model.eval() before running inference of pytorch model in test code. Solution is from the link model = models.
  • Convert Transformers to ONNX with Hugging Face Optimum
    Introduction guide about ONNX and Transformers. Learn how to convert transformers like BERT to ONNX and what you can do with it.
  • Model Optimizer Frequently Asked Questions
    A : Most likely, Model Optimizer does not know how to infer output shapes of some layers in the given topology. To lessen...
  • Using Huggingface Transformers with ML.NET | Rubik's Code
    Exporting Huggingface Transformers to ONNX Models ... ONNX model outputs' name match reference model ({'pooler_output', 'last_hidden_state'} ...
