
Convert LongT5 to ONNX

System Info

transformers-cli env

  • transformers version: 4.24.0
  • Platform: Linux-5.4.0-99-generic-x86_64-with-glibc2.17
  • Python version: 3.8.12
  • Huggingface_hub version: 0.10.1
  • PyTorch version (GPU?): 1.12.1+cu102 (True)
  • onnxruntime-gpu: 1.13.1
  • Tensorflow version (GPU?): not installed (NA)
  • Flax version (CPU?/GPU?/TPU?): not installed (NA)
  • Jax version: not installed
  • JaxLib version: not installed
  • Using GPU in script?: <fill in>
  • Using distributed or parallel set-up in script?: <fill in>

Who can help?

ONNX model conversion: @morgan

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, …)
  • My own task or dataset (give details below)

Reproduction

This command line:

python -m transformers.onnx --model pszemraj/long-t5-tglobal-base-16384-book-summary --feature seq2seq-lm-with-past --preprocessor tokenizer --framework pt .

Gives me the following error during export validation:

Validating ONNX model...
Floating point exception (core dumped)
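
For reference, the CLI essentially drives the transformers.onnx Python helpers, so the same export can be run step by step to see whether the crash happens during the export itself or during output validation. A minimal sketch (helper names as in transformers 4.24):

from pathlib import Path

from transformers import AutoModelForSeq2SeqLM, AutoTokenizer
from transformers.onnx import export, validate_model_outputs
from transformers.onnx.features import FeaturesManager

model_id = "pszemraj/long-t5-tglobal-base-16384-book-summary"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Resolve the ONNX config for the requested feature, as the CLI does.
model_kind, config_ctor = FeaturesManager.check_supported_model_or_raise(
    model, feature="seq2seq-lm-with-past"
)
onnx_config = config_ctor(model.config)

# Export the graph; this runs before the "Validating ONNX model..." step.
onnx_path = Path("model.onnx")
onnx_inputs, onnx_outputs = export(
    tokenizer, model, onnx_config, onnx_config.default_onnx_opset, onnx_path
)

# Validation is the step where the floating point exception occurs above.
validate_model_outputs(
    onnx_config, tokenizer, model, onnx_path, onnx_outputs,
    onnx_config.atol_for_validation,
)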

Expected behavior

Having a usable and validated ONNX model.

Issue Analytics

  • State: open
  • Created: 10 months ago
  • Comments: 12 (11 by maintainers)

Top GitHub Comments

1 reaction
jplu commented, Nov 18, 2022

I have tried multiple seq_len values for the input. As long as the value is not much lower than the “original” value it seems to work, but if it goes too low, I start to get a large “max absolute difference” and the validation doesn’t pass. So indeed, it is not really usable and seems too unstable, as you said @fxmarty. Thanks a lot anyway for your insights on this. I’ll leave the issue open; don’t hesitate to ping me here if there is anything I can do on my side to help fix it.
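
If it helps anyone reproduce this, below is a rough sketch of the kind of sweep described above: comparing PyTorch logits against ONNX Runtime logits at decreasing encoder lengths. It assumes a plain seq2seq-lm export saved as model.onnx (the with-past graph additionally expects past_key_values feeds) and the input/output names that the transformers ONNX configs generally use; both are assumptions to adjust for your export.

import numpy as np
import torch
from onnxruntime import InferenceSession
from transformers import AutoModelForSeq2SeqLM

model_id = "pszemraj/long-t5-tglobal-base-16384-book-summary"
model = AutoModelForSeq2SeqLM.from_pretrained(model_id).eval()
session = InferenceSession("model.onnx", providers=["CPUExecutionProvider"])

rng = np.random.default_rng(0)
for seq_len in (16384, 4096, 512, 64):  # shrink from the "original" length downwards
    feeds = {
        "input_ids": rng.integers(0, model.config.vocab_size, size=(1, seq_len), dtype=np.int64),
        "attention_mask": np.ones((1, seq_len), dtype=np.int64),
        "decoder_input_ids": np.full((1, 8), model.config.decoder_start_token_id, dtype=np.int64),
        "decoder_attention_mask": np.ones((1, 8), dtype=np.int64),
    }

    # Reference logits from the original PyTorch model.
    with torch.no_grad():
        ref = model(**{k: torch.from_numpy(v) for k, v in feeds.items()}).logits.numpy()

    # Logits from the exported graph.
    ort = session.run(["logits"], feeds)[0]

    print(f"seq_len={seq_len:6d}  max abs diff = {np.abs(ref - ort).max():.5f}")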

0 reactions
fxmarty commented, Dec 16, 2022

not stale

