
Add MarianMT to models exportable with ONNX

See original GitHub issue

🚀 Feature request

Add support for converting the MarianMT model with the transformers.onnx package documented here. The conversion currently fails with the error "marian () is not supported yet. Only [...] are supported. If you want to support (marian) please propose a PR or open up an issue."
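
A minimal sketch of what the requested export could look like once Marian support exists, following the programmatic pattern from the transformers.onnx guide for already-supported models. The MarianOnnxConfig class and its import path, the "seq2seq-lm" task name, and the Helsinki-NLP/opus-mt-en-de checkpoint are illustrative assumptions, not details from this issue:

```python
# Hypothetical sketch: programmatic ONNX export of a MarianMT checkpoint,
# mirroring the transformers.onnx pattern used for already-supported models.
# MarianOnnxConfig is exactly the piece this issue requests, so both the
# class and its import path are assumptions here.
from pathlib import Path

from transformers import AutoTokenizer, MarianMTModel
from transformers.models.marian import MarianOnnxConfig  # assumed config class
from transformers.onnx import export

checkpoint = "Helsinki-NLP/opus-mt-en-de"  # example MarianMT language pair
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = MarianMTModel.from_pretrained(checkpoint)

# Build the export config; "seq2seq-lm" is the assumed task name for MarianMT.
onnx_config = MarianOnnxConfig(model.config, task="seq2seq-lm")
onnx_path = Path("onnx/model.onnx")
onnx_path.parent.mkdir(parents=True, exist_ok=True)

# export() traces the model with dummy inputs and writes the ONNX graph.
onnx_inputs, onnx_outputs = export(
    tokenizer, model, onnx_config, onnx_config.default_onnx_opset, onnx_path
)
print("Exported inputs:", onnx_inputs, "outputs:", onnx_outputs)
```

The CLI route that raises the error quoted above would presumably be the matching `python -m transformers.onnx --model=Helsinki-NLP/opus-mt-en-de onnx/` invocation once the feature is merged.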

Motivation

MarianMT is one of the best translation models on the Hub thanks to its extensive number of pretrained language pairs, but it can be slow for real-time use cases. Converting it to ONNX and combining that with quantization could significantly improve inference time.
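
To illustrate the quantization half of that claim, here is a minimal sketch using ONNX Runtime's dynamic (post-training) quantization; the file paths are placeholders and assume the model has already been exported to ONNX:

```python
# Minimal sketch: dynamic quantization of an exported ONNX model with
# ONNX Runtime. Weights are stored as 8-bit integers, which typically shrinks
# the file and can speed up CPU inference; file names are placeholders.
from onnxruntime.quantization import QuantType, quantize_dynamic

quantize_dynamic(
    model_input="onnx/model.onnx",         # FP32 model produced by the export
    model_output="onnx/model-quant.onnx",  # quantized model written here
    weight_type=QuantType.QInt8,           # 8-bit integer weights
)
```

The quantized graph can then be loaded with onnxruntime.InferenceSession; the actual speedup over the PyTorch model depends on hardware, sequence length, and the decoding strategy.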

Issue Analytics

  • State: closed
  • Created 2 years ago
  • Comments: 10 (9 by maintainers)

Top GitHub Comments

2 reactions
lewtun commented, Nov 24, 2021

Thanks for the ping @LysandreJik - happy to look into the issue 😃

1 reaction
Maxinho96 commented, Nov 29, 2021

> Hey @Maxinho96 do you mind if I continue working on your branch in #13854? This will allow your contribution to be accounted for once we eventually merge the support for MarianMT models 😃

Sure, thank you 🙏

Read more comments on GitHub

Top Results From Across the Web

Export to ONNX - Transformers - Hugging Face
In this guide, we'll show you how to export Transformers models to ONNX (Open Neural Network eXchange). Once exported, a model can be...

Exporting your model to ONNX format - Unity - Manual
It allows you to easily interchange models between various ML frameworks and tools. You can export a neural network from the following Deep ...

Adds DonutSwin to models exportable with ONNX #19401
Then trying to convert to onnx I get: python -m transformers.onnx --model=./swin onnx/ Local PyTorch model found. Framework not requested. Using torch to...

Tutorial 8: Pytorch to ONNX (Experimental)
List of supported models exportable to ONNX ... If the deployed backend platform is TensorRT, please add environment variables before running the file: ...

huggingface/transformers untagged-96ceb499938398068174 ...
To help contributors add new models more easily to Transformers, there is a new command that ... Add ONNX support for MarianMT models...
