ValueError: Outputs values doesn't match between reference model and ONNX exported model
See original GitHub issue
Environment info
Expected behavior
Issue Analytics
- State:
- Created: 2 years ago
- Comments: 5
Top Results From Across the Web
Exporting transformers models - Hugging Face
Starting from transformers v2.10.0 we partnered with ONNX Runtime to provide an easy ... ONNX model outputs' name match reference model ({'pooler_output', ...
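That export path validates that the ONNX graph's output names and values match the reference PyTorch model. Below is a minimal export sketch that uses torch.onnx.export directly rather than the transformers CLI; the model name, output names, and opset version are assumptions chosen for illustration.

```python
# Minimal export sketch (assumptions: bert-base-uncased, opset 14, output names below).
import torch
from transformers import AutoModel, AutoTokenizer

model_name = "bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)
model.eval()                      # disable dropout so exported values are reproducible
model.config.return_dict = False  # return tuples so outputs map onto output_names in order

dummy = tokenizer("Hello world", return_tensors="pt")

torch.onnx.export(
    model,
    (dummy["input_ids"], dummy["attention_mask"]),
    "model.onnx",
    input_names=["input_ids", "attention_mask"],
    output_names=["last_hidden_state", "pooler_output"],
    dynamic_axes={
        "input_ids": {0: "batch", 1: "sequence"},
        "attention_mask": {0: "batch", 1: "sequence"},
        "last_hidden_state": {0: "batch", 1: "sequence"},
    },
    opset_version=14,
)
```

Exporting in eval mode matters here: dropout left active in training mode is a common cause of the value mismatch this error reports.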
outputs are different between ONNX and pytorch
Problem solved by adding model.eval() before running inference of the PyTorch model in the test code. Solution is from the link: model = models.
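A sketch of that check, assuming the model was already exported to a file named model.onnx: the PyTorch model is put in eval mode, run once as the reference, and compared against ONNX Runtime with a tolerance (the sample sentence and atol=1e-4 are assumptions).

```python
# Sketch of the reference-vs-ONNX comparison; "model.onnx", the sample sentence,
# and atol=1e-4 are assumptions for illustration.
import numpy as np
import onnxruntime as ort
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()  # without eval(), dropout makes the PyTorch reference non-deterministic

inputs = tokenizer("Hello world", return_tensors="pt")
with torch.no_grad():
    reference = model(**inputs).last_hidden_state.numpy()

session = ort.InferenceSession("model.onnx")
onnx_input_names = {i.name for i in session.get_inputs()}
onnx_inputs = {k: v.numpy() for k, v in inputs.items() if k in onnx_input_names}
onnx_output = session.run(None, onnx_inputs)[0]  # first output assumed to be last_hidden_state

print("max absolute difference:", np.abs(reference - onnx_output).max())
print("values match:", np.allclose(reference, onnx_output, atol=1e-4))
```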
Convert Transformers to ONNX with Hugging Face Optimum
Introduction guide about ONNX and Transformers. Learn how to convert transformers like BERT to ONNX and what you can do with it.
Model Optimizer Frequently Asked Questions
A: Most likely, Model Optimizer does not know how to infer output shapes of some layers in the given topology. To lessen...
Using Huggingface Transformers with ML.NET | Rubik's Code
Exporting Huggingface Transformers to ONNX Models ... ONNX model outputs' name match reference model ({'pooler_output', 'last_hidden_state'} ...
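When that output-name check fails, listing what the exported graph actually exposes is a quick first step; the sketch below assumes the exported file is named model.onnx.

```python
# Sketch: list the exported graph's actual input/output names ("model.onnx" is an assumption).
import onnxruntime as ort

session = ort.InferenceSession("model.onnx")
print("inputs: ", [i.name for i in session.get_inputs()])
print("outputs:", [o.name for o in session.get_outputs()])
# Compare the output names with the reference model's outputs,
# e.g. {'last_hidden_state', 'pooler_output'} for a BERT-style encoder.
```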
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
I’m getting the same error for mBART. I’m using a colab notebook with/without GPU.
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
Please note that issues that do not follow the contributing guidelines are likely to be ignored.