
When will Pegasus be exportable to ONNX format?

See original GitHub issue

It seems like it’s not available right now. I got this error:

Error while converting the model: Unrecognized configuration class <class 'transformers.configuration_pegasus.PegasusConfig'> for this kind of AutoModel: AutoModel. Model type should be one of RetriBertConfig, T5Config, DistilBertConfig, AlbertConfig, CamembertConfig, XLMRobertaConfig, BartConfig, LongformerConfig, RobertaConfig, LayoutLMConfig, SqueezeBertConfig, BertConfig, OpenAIGPTConfig, GPT2Config, MobileBertConfig, TransfoXLConfig, XLNetConfig, FlaubertConfig, FSMTConfig, XLMConfig, CTRLConfig, ElectraConfig, ReformerConfig, FunnelConfig, LxmertConfig, BertGenerationConfig, DebertaConfig, DPRConfig, XLMProphetNetConfig, ProphetNetConfig.

Which is fair since pegasus is a new addition. Is it something the team plans to do soon?

Or can someone point me to some resources on other ways to export a pre-trained model from Hugging Face? I’m pretty new to machine learning :p

Thanks all!

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Comments: 7 (2 by maintainers)

Top GitHub Comments

2 reactions
sshleifer commented, Nov 2, 2020

@patil-suraj has a partial solution that he just posted to the forums. He might be able to extend that to Pegasus/BART.

1 reaction
patil-suraj commented, Nov 3, 2020

I’m on it! Will ping here once I get it working.

@phosfuldev, you can refer to this post to see how T5 is exported to ONNX: https://discuss.huggingface.co/t/speeding-up-t5-inference/1841

Read more comments on GitHub >

Top Results From Across the Web

Export to ONNX - Transformers - Hugging Face
For example, a model trained in PyTorch can be exported to ONNX format and then imported in TensorFlow (and vice versa). Transformers provides...
Read more >
how to convert HuggingFace's Seq2seq models to onnx format
Pegasus is a seq2seq model, you can't directly convert a seq2seq model (encoder-decoder model) using this method.
Read more >
Best Practices for Neural Network Exports to ONNX
Our experience shows that it is easier to export PyTorch models. If possible, choose a PyTorch source and convert it using the built-in torch.onnx...
Read more >
NVIDIA DRIVE OS 6.0.4 TensorRT 8.4.11 Documentation
A good first step after exporting a model to ONNX is to run constant folding ... it is possible that TensorRT will not...
Read more >
Convert Pegasus model to ONNX [Discussion] - Reddit
I got it done but the ONNX model can't generate text. Turned out that Pegasus is an encoder-decoder model and most guides are...
Read more >
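As the Reddit result above hints, the exported graphs alone can't generate text: each ONNX call is a single forward pass, so the greedy (or beam) decoding loop has to run outside the graph. A minimal greedy-loop sketch, where `decoder_step` is a hypothetical stub standing in for an onnxruntime decoder session call:

```python
# Greedy decoding lives outside the exported ONNX graphs: call the decoder
# once per step, append the argmax token, and stop at EOS.
import numpy as np

EOS = 1

def decoder_step(decoder_input_ids, encoder_hidden):
    # Stub: real code would run ort_session.run(...) on the exported decoder.
    rng = np.random.default_rng(len(decoder_input_ids))
    logits = rng.random(50)
    # Force EOS after a few steps so this sketch terminates deterministically.
    logits[EOS] = 10.0 if len(decoder_input_ids) >= 5 else -1.0
    return logits

def greedy_generate(encoder_hidden, bos=0, max_len=32):
    tokens = [bos]
    for _ in range(max_len):
        logits = decoder_step(tokens, encoder_hidden)
        next_id = int(np.argmax(logits))
        tokens.append(next_id)
        if next_id == EOS:
            break
    return tokens

out = greedy_generate(encoder_hidden=None)
```

A real wrapper would also feed past key/value caches back into the decoder graph each step to avoid recomputing the full prefix, which is the main optimization discussed in the T5 thread linked above.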
