When will Pegasus be exportable to the ONNX format?
It seems like it's not available right now; I got this error:
Error while converting the model: Unrecognized configuration class <class 'transformers.configuration_pegasus.PegasusConfig'> for this kind of AutoModel: AutoModel. Model type should be one of RetriBertConfig, T5Config, DistilBertConfig, AlbertConfig, CamembertConfig, XLMRobertaConfig, BartConfig, LongformerConfig, RobertaConfig, LayoutLMConfig, SqueezeBertConfig, BertConfig, OpenAIGPTConfig, GPT2Config, MobileBertConfig, TransfoXLConfig, XLNetConfig, FlaubertConfig, FSMTConfig, XLMConfig, CTRLConfig, ElectraConfig, ReformerConfig, FunnelConfig, LxmertConfig, BertGenerationConfig, DebertaConfig, DPRConfig, XLMProphetNetConfig, ProphetNetConfig.
That's fair, since Pegasus is a new addition. Is ONNX support something the team plans to add soon?
Or can someone point me to some resources on whether there are other ways to export a pre-trained model from Hugging Face? I'm pretty new to machine learning :p
Thanks all!
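For reference, the exact command isn't shown in this issue, but the error above typically comes from the bundled conversion helper in `transformers.convert_graph_to_onnx`, which wraps the model in a pipeline built on `AutoModel` (and `AutoModel` doesn't list `PegasusConfig`). A hypothetical reproduction, with the checkpoint name and output path as placeholders:

```python
# Hypothetical reproduction of the failure above; the checkpoint and output
# path are placeholders, not taken from the issue.
from pathlib import Path
from transformers.convert_graph_to_onnx import convert

convert(
    framework="pt",                      # PyTorch backend
    model="google/pegasus-xsum",         # any Pegasus checkpoint
    output=Path("onnx/pegasus.onnx"),    # where the ONNX graph would be written
    opset=11,
)
# Fails because the script loads the model through AutoModel, which does not
# recognize PegasusConfig.
```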
@patil-suraj has a partial solution that he just posted to the forums. He might be able to extend that to Pegasus/BART.
I’m on it! Will ping here once I get it working.
@phosfuldev, you can refer to this post to see how T5 is exported to ONNX: https://discuss.huggingface.co/t/speeding-up-t5-inference/1841
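For anyone landing here before that post is adapted, here is a minimal sketch of the same idea applied to Pegasus: export the encoder as its own ONNX graph with `torch.onnx.export` (the decoder can be exported separately in the same way, taking `decoder_input_ids` and the encoder's hidden states as inputs). This is not the exact code from the linked thread; the checkpoint name, file name, and opset are illustrative.

```python
# Minimal sketch: export the Pegasus encoder to ONNX directly with
# torch.onnx.export. Checkpoint name and output file are examples only.
import torch
from transformers import PegasusTokenizer, PegasusForConditionalGeneration

model_name = "google/pegasus-xsum"
tokenizer = PegasusTokenizer.from_pretrained(model_name)
model = PegasusForConditionalGeneration.from_pretrained(model_name)
model.eval()
model.config.return_dict = False  # return plain tuples so the ONNX tracer can handle the outputs

# Dummy input used only to trace the encoder graph.
inputs = tokenizer("ONNX export test sentence.", return_tensors="pt")

# Export only the encoder; the decoder needs a separate export with its own inputs.
torch.onnx.export(
    model.model.encoder,
    (inputs["input_ids"], inputs["attention_mask"]),
    "pegasus_encoder.onnx",
    input_names=["input_ids", "attention_mask"],
    output_names=["encoder_hidden_states"],
    dynamic_axes={
        "input_ids": {0: "batch", 1: "sequence"},
        "attention_mask": {0: "batch", 1: "sequence"},
        "encoder_hidden_states": {0: "batch", 1: "sequence"},
    },
    opset_version=11,
)
```

Note that getting fast generation out of the exported graphs (beam search, past key/value caching) takes more work than this, which is exactly what the T5 thread above walks through.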