KeyError: '' - run_ner.py - Transformers 2.8.0
🐛 Bug
Information
Model I am using (Bert, XLNet …): BERT
Language I am using the model on (English, Chinese …): PT
The problem arises when using:
- [x] the official example scripts: (give details below)
- [ ] my own modified scripts: (give details below)
The task I am working on is:
- [ ] an official GLUE/SQuAD task: (give the name)
- [x] my own task or dataset: (give details below)
To reproduce
Steps to reproduce the behavior:
I’m having trouble running the script on a dataset in Portuguese. It is in the CoNLL format.
```
Traceback (most recent call last):
  File "/home/lucasrodrigues/code/transformers-2.8.0/examples/ner/run_ner.py", line 292, in <module>
    main()
  File "/home/lucasrodrigues/code/transformers-2.8.0/examples/ner/run_ner.py", line 170, in main
    if training_args.do_train
  File "/home/lucasrodrigues/code/transformers-2.8.0/examples/ner/utils_ner.py", line 124, in __init__
    pad_token_label_id=self.pad_token_label_id,
  File "/home/lucasrodrigues/code/transformers-2.8.0/examples/ner/utils_ner.py", line 207, in convert_examples_to_features
    print([label_map[label]] + [pad_token_label_id] * (len(word_tokens) - 1))
KeyError: ''
```
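For context, `convert_examples_to_features` in `utils_ner.py` builds a `label_map` dictionary from the label list and looks up each word's label in it, so any label string that is not in that list (here an empty string) raises a `KeyError`. A minimal sketch of the failing lookup, with an assumed label list:

```python
# Sketch of the lookup that fails inside convert_examples_to_features;
# the label list below is an illustrative assumption.
label_list = ["O", "B-PER", "I-PER", "B-LOC", "I-LOC"]
label_map = {label: i for i, label in enumerate(label_list)}

label = ""               # an empty label read from a malformed dataset line
print(label_map[label])  # raises KeyError: ''
```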
Expected behavior
The NER example script should train on the dataset without errors.
Environment info
- `transformers` version: 2.8.0
- Platform: Linux-4.15.0-76-generic-x86_64-with-debian-buster-sid
- Python version: 3.6.10
- PyTorch version (GPU?): 1.3.1 (True)
- Tensorflow version (GPU?): 2.0.0 (True)
- Using GPU in script?: <fill in>
- Using distributed or parallel set-up in script?: <fill in>
Top GitHub Comments
Additionally, the dataset must be in the format
Token Label
where the delimiter is a single space. The error message shows that an unexpected label (an empty string) was found, so I could imagine that the dataset format is not consistent. To check this, just run the following on your training, development and test sets (a sketch is given below):
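The exact check was not preserved in this thread; a minimal Python sketch with the same effect, assuming the splits are named `train.txt`, `dev.txt` and `test.txt` and use a single space as delimiter, could look like this:

```python
# Print every distinct label found in the second column of each split.
# File names and the single-space delimiter are assumptions; adjust as needed.
# A well-formed data line looks like:  Lisboa B-LOC
for split in ("train.txt", "dev.txt", "test.txt"):
    labels = set()
    with open(split, encoding="utf-8") as f:
        for line in f:
            line = line.rstrip("\n")
            if not line:                      # blank lines separate sentences
                continue
            token, _, label = line.partition(" ")
            labels.add(label)                 # an empty entry here means a malformed line
    print(split, sorted(labels))
```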
This should give you all labels. If you see other tokens, then there’s something wrong in the dataset.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.