Converting original T5 to be used in Transformers
See original GitHub issue

I want to use an original T5 checkpoint with the Transformers library. I found multiple answers referring to `convert_t5_original_tf_checkpoint_to_pytorch.py`, which does not seem to exist. Is there another way? Or where can I find a currently working version of that file?
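For reference, a script with this name ships inside the Transformers source tree (under the T5 model directory) rather than as an installed command, which may be why it appears missing. A hedged sketch of how such a conversion script is typically invoked — all paths below are placeholders, and the exact flag names should be checked against the script in your checked-out version of the repository:

```shell
# Placeholder paths; clone the transformers repository first so the
# conversion script is available on disk.
python src/transformers/models/t5/convert_t5_original_tf_checkpoint_to_pytorch.py \
    --tf_checkpoint_path /path/to/t5/model.ckpt \
    --config_file /path/to/config.json \
    --pytorch_dump_path /path/to/pytorch_output
```

The script needs a Transformers-style `config.json` describing the model sizes, which is what the comments below work out how to build.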
Issue Analytics
- State:
- Created 3 years ago
- Comments: 9 (4 by maintainers)
Top Results From Across the Web

T5 - Hugging Face
The original code can be found here. Training. T5 is an encoder-decoder model and converts all NLP problems into a text-to- ...

T5 Conversion from Original Tensorflow Produce rubbish Text
However, we are using the T5 original library for now, as Hugging Face Transformers is still producing rubbish text after conversion.

Understanding T5 Model : Text to Text Transfer Transformer ...
T5 : Text-to-Text-Transfer-Transformer model proposes reframing all NLP tasks into a unified text-to-text format where the input and output are ...

Neural machine translation with a Transformer and Keras | Text
This tutorial demonstrates how to create and train a sequence-to-sequence Transformer model to translate Portuguese into English. The Transformer was ...
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
…actually, the link you sent for the example config file proved to be extremely useful! Starting from there, I found all the related files. Here is everything (including the config file) for T5 Small: https://huggingface.co/t5-small. Also, an example workflow for future reference:
I see that they store configurations in .gin files, like this one: https://console.cloud.google.com/storage/browser/_details/t5-data/pretrained_models/small/operative_config.gin
When I open this on my laptop in Notepad, it looks like this:

=> the relevant part here seems to be just the model hyperparameters:
So maybe you can create a config.json based on those?
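As a minimal sketch, a `config.json` for T5-Small could be assembled from those gin hyperparameters with the standard library alone. The values below are the commonly cited T5-Small sizes, not read from the file in this thread, so each one should be checked against `operative_config.gin` before a real conversion:

```python
import json

# Hyperparameters as commonly reported for T5-Small; verify every value
# against the operative_config.gin before using this config for conversion.
t5_small_config = {
    "model_type": "t5",
    "vocab_size": 32128,
    "d_model": 512,        # hidden size
    "d_kv": 64,            # size of each attention head
    "d_ff": 2048,          # feed-forward inner dimension
    "num_layers": 6,       # encoder layers (decoder mirrors this)
    "num_heads": 8,
    "relative_attention_num_buckets": 32,
    "dropout_rate": 0.1,
    "layer_norm_epsilon": 1e-6,
}

# Write the config in the JSON layout Transformers expects to load.
with open("config.json", "w") as f:
    json.dump(t5_small_config, f, indent=2)
```

The resulting file can then be handed to the conversion script (or compared against the published one at https://huggingface.co/t5-small) to sanity-check the numbers.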
And happy to hear this 😃 you’re welcome