UnboundLocalError: local variable 'tokenizer' referenced before assignment
I am running the example code from the homepage, but I ran into the following problem.
import torch
from transformers import *

# Each entry pairs a model class with its tokenizer class and a pretrained checkpoint name
MODELS = [(BertModel, BertTokenizer, 'bert-base-uncased'),
          (OpenAIGPTModel, OpenAIGPTTokenizer, 'openai-gpt'),
          (GPT2Model, GPT2Tokenizer, 'gpt2'),
          (CTRLModel, CTRLTokenizer, 'ctrl'),
          (TransfoXLModel, TransfoXLTokenizer, 'transfo-xl-wt103'),
          (XLNetModel, XLNetTokenizer, 'xlnet-base-cased'),
          (XLMModel, XLMTokenizer, 'xlm-mlm-enfr-1024'),
          (DistilBertModel, DistilBertTokenizer, 'distilbert-base-cased'),
          (RobertaModel, RobertaTokenizer, 'roberta-base'),
          (XLMRobertaModel, XLMRobertaTokenizer, 'xlm-roberta-base'),
          ]

for model_class, tokenizer_class, pretrained_weights in MODELS:
    # Load the pretrained tokenizer and model for each checkpoint
    tokenizer = tokenizer_class.from_pretrained(pretrained_weights)
    model = model_class.from_pretrained(pretrained_weights)
    # Encode a sentence and run it through the model
    input_ids = torch.tensor([tokenizer.encode("Here is some text to encode", add_special_tokens=True)])
    with torch.no_grad():
        last_hidden_states = model(input_ids)[0]
UnboundLocalError: local variable 'tokenizer' referenced before assignment
This happens when model_class reaches XLMModel. I do not quite understand why, because the problem only occurs with the XLM model.
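For reference, an UnboundLocalError means a function read a local variable along a code path where it was never assigned. Since the user code assigns tokenizer directly, the error presumably originates inside the library's from_pretrained code for the XLM tokenizer in the installed version, where a local named tokenizer is only bound on some branches. Below is a minimal, self-contained sketch of the general pattern that produces this error; the function name is hypothetical and this is not the actual transformers source.

def load_vocab(path):
    try:
        # 'vocab' is assigned only if the file opens successfully
        with open(path) as f:
            vocab = f.read().split()
    except FileNotFoundError:
        pass  # on this branch 'vocab' is never assigned
    # If the except branch ran, the next line raises:
    # UnboundLocalError: local variable 'vocab' referenced before assignment
    return vocab

load_vocab("this_file_does_not_exist.txt")

The usual fix is to assign a default (or raise a clear error) on every branch before the variable is read; as the comments below note, upgrading transformers resolves it in this case.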
I can also attest to this error.
I am using a Kaggle notebook, and I get the same error after running my first cell. Most of it is default code; the bottom two lines are the key ones.
Kaggle runs transformers version 2.3.0 by default. After updating to 2.5.1 it worked just fine. To update on Kaggle, turn the internet option on in the settings panel on the right, then run:
!pip install -U transformers
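After upgrading, it is worth restarting the notebook kernel and confirming which version is actually being imported, since a stale import can keep the old module loaded. A quick check (assuming the upgrade above succeeded):

import transformers
print(transformers.__version__)  # expect 2.5.1 or newer; 2.3.0 still shows the bug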
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.