Using the T5 model with huggingface's mask-fill pipeline
See original GitHub issue

Does anyone know if it is possible to use the T5 model with Hugging Face’s mask-fill pipeline? The below is how you can do it with the default model, but I can’t seem to figure out how to do it with the T5 model specifically.
from transformers import pipeline
nlp_fill = pipeline('fill-mask')
nlp_fill('Hugging Face is a French company based in ' + nlp_fill.tokenizer.mask_token)
Trying the following, for example, raises the error “TypeError: must be str, not NoneType”, because nlp_fill.tokenizer.mask_token is None for T5:
nlp_fill = pipeline('fill-mask',model="t5-base", tokenizer="t5-base")
nlp_fill('Hugging Face is a French company based in ' + nlp_fill.tokenizer.mask_token)
Stack overflow question
Issue Analytics
- State:
- Created 3 years ago
- Comments: 18 (8 by maintainers)
Top Results From Across the Web
T5 - Hugging Face
T5 is an encoder-decoder model and converts all NLP problems into a text-to-text format. It is trained using teacher forcing. This means that...
Using the T5 model with huggingface's mask-fill pipeline
Does anyone know if it is possible to use the T5 model with hugging face's mask-fill pipeline? The below is how you can...

Deploy T5 11B for inference for less than $500 - philschmid
This blog will teach you how to deploy T5 11B for inference using Hugging Face Inference Endpoints. The T5 model was presented in...

Fine-Tuning T5 for Question Answering using HuggingFace ...
Prepare for the Machine Learning interview: https://mlexpert.io Subscribe: http://bit.ly/venelin-subscribe Get SH*T Done with PyTorch ...

Abstractive Summarization with Hugging Face Transformers
If you are using one of the five T5 checkpoints we have to prefix the inputs with "summarize:" (the model can also translate...
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
Could we use the following workaround? <extra_id_0> could be considered as a mask token.

Output:
@girishponkiya Thanks for your example! Unfortunately, I can’t reproduce your results. I get

Tried on CPU, GPU, ‘t5-base’ and ‘t5-3b’: same thing.