KeyError: 'labels' in distill_classifier.py
Environment info
- transformers version: 4.6.1
- Platform: Darwin-19.6.0-x86_64-i386-64bit
- Python version: 3.7.6
- PyTorch version (GPU?): 1.8.1 (False)
- Tensorflow version (GPU?): 2.2.0 (False)
- Using GPU in script?: No
- Using distributed or parallel set-up in script?: No
Issue
I am trying to run the distill_classifier.py script from transformers/examples/research_projects/zero-shot-distillation/ on the roberta-large-mnli model with my own text dataset and labels. The dataset has a few hundred rows of text and 13 class labels. I am running the following in a notebook cell:
!python transformers/examples/research_projects/zero-shot-distillation/distill_classifier.py \
--data_file ./distill_data/train_unlabeled.txt \
--class_names_file ./distill_data/class_names.txt \
--teacher_name_or_path roberta-large-mnli \
--hypothesis_template "This text is about {}." \
--output_dir ./my_student/distilled
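For reference, my understanding of the README is that `--data_file` expects one unlabeled example per line and `--class_names_file` one class name per line. This is a quick pre-flight check I ran on my own files (a hypothetical helper, not part of the script):

```python
from pathlib import Path

def sanity_check(data_file: str, class_names_file: str) -> None:
    """Hypothetical pre-flight check: both files should be non-empty,
    one entry per line, with no blank lines that could shift indices."""
    texts = Path(data_file).read_text().splitlines()
    classes = Path(class_names_file).read_text().splitlines()
    assert texts and all(t.strip() for t in texts), "blank line in data file"
    assert classes and all(c.strip() for c in classes), "blank line in class names file"
    print(f"{len(texts)} examples, {len(classes)} classes")
```

Both of my files pass this check.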
The script starts to run but after a short while I receive the following error:
Trainer is attempting to log a value of "{'Science': 0, 'Math': 1, 'Social Studies': 2, 'Language Arts': 3, 'Statistics': 4, 'Calculus': 5, 'Linear Algebra': 6, 'Probability': 7, 'Chemistry': 8, 'Biology': 9, 'Supply chain management': 10, 'Economics': 11, 'Pottery': 12}"
for key "label2id" as a parameter.
MLflow's log_param() only accepts values no longer than 250 characters so we dropped this attribute.
0%| | 0/7 [00:00<?, ?it/s]Traceback (most recent call last):
File "transformers/examples/research_projects/zero-shot-distillation/distill_classifier.py", line 338, in <module>
main()
File "transformers/examples/research_projects/zero-shot-distillation/distill_classifier.py", line 328, in main
trainer.train()
File "/opt/anaconda3/lib/python3.7/site-packages/transformers/trainer.py", line 1272, in train
tr_loss += self.training_step(model, inputs)
File "/opt/anaconda3/lib/python3.7/site-packages/transformers/trainer.py", line 1734, in training_step
loss = self.compute_loss(model, inputs)
File "transformers/examples/research_projects/zero-shot-distillation/distill_classifier.py", line 119, in compute_loss
target_p = inputs["labels"]
File "/opt/anaconda3/lib/python3.7/site-packages/transformers/tokenization_utils_base.py", line 231, in __getitem__
return self.data[item]
KeyError: 'labels'
0%| | 0/7 [00:01<?, ?it/s]
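The traceback shows compute_loss indexing inputs["labels"] directly. While debugging locally I wrapped that lookup in a guard (my own sketch, not the script's code) so the failure mode is explicit instead of a bare KeyError:

```python
def get_target_labels(inputs: dict):
    """Debugging guard: fail with an actionable message if the batch
    reaching compute_loss has no 'labels' key."""
    if "labels" not in inputs:
        raise KeyError(
            f"batch has keys {sorted(inputs)} but no 'labels'; "
            "the labels column may have been dropped before training"
        )
    return inputs["labels"]
```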
I have re-examined my label files and am following the guide for distill_classifier.py exactly.
Any help with the distillation would be appreciated!
Edit: I updated torch to the latest version and receive the same error. I also reduced the number of classes from 24 to 13 and still have the issue. When I print inputs in the compute_loss function, there is no key for labels:
{'attention_mask': tensor([[1, 1, 1, ..., 0, 0, 0],
[1, 1, 1, ..., 0, 0, 0],
[1, 1, 1, ..., 0, 0, 0],
...,
[1, 1, 1, ..., 0, 0, 0],
[1, 1, 1, ..., 0, 0, 0],
[1, 1, 1, ..., 0, 0, 0]]), 'input_ids': tensor([[ 101, 1999, 2262, ..., 0, 0, 0],
[ 101, 4117, 2007, ..., 0, 0, 0],
[ 101, 2130, 2295, ..., 0, 0, 0],
...,
[ 101, 1999, 2760, ..., 0, 0, 0],
[ 101, 2057, 6614, ..., 0, 0, 0],
[ 101, 2057, 1521, ..., 0, 0, 0]])}
Is there an additional parameter that is needed to assign the labels?
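One guess on my side (unconfirmed): Trainer defaults to remove_unused_columns=True and drops any dataset column whose name is not a parameter of the model's forward method, which could silently remove a labels column before compute_loss ever sees it. The behavior can be mimicked in plain Python:

```python
import inspect

def toy_forward(self, input_ids=None, attention_mask=None):
    """Toy forward whose signature has no `labels` parameter."""
    ...

def drop_unused_columns(batch: dict, forward) -> dict:
    # Mimics Trainer's remove_unused_columns: keep only keys that
    # match parameter names of the model's forward method.
    allowed = set(inspect.signature(forward).parameters) - {"self"}
    return {k: v for k, v in batch.items() if k in allowed}

batch = {"input_ids": [101, 2057], "attention_mask": [1, 1], "labels": [0.2, 0.8]}
print(sorted(drop_unused_columns(batch, toy_forward)))  # 'labels' is gone
```

If that is indeed the cause here, passing remove_unused_columns=False in the training arguments would be worth trying.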
Edit 2: I let the Colab notebook “Distilling Zero Shot Classification.ipynb” run for a few hours and received the same error with the agnews dataset, so the notebook code may be incompatible with some other component.
Edit 3: I changed datasets, reduced to 3 classes, and tried adding the label_names argument
--label_names ["Carbon emissions", "Energy efficiency", "Water scarcity"]
my ./distill_data/class_names.txt file looks like:
Carbon Emissions
Energy Efficiency
Water Scarcity
and am still facing the same error.
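As an aside, my reading of the TrainingArguments docs (worth double-checking) is that label_names does not take class names at all: it lists the keys of the batch dict that hold labels, with "labels" as the default. A minimal illustration of the distinction:

```python
# label_names refers to batch-dict keys, not human-readable class names.
label_names = ["labels"]  # what Trainer looks up in each batch
class_names = ["Carbon Emissions", "Energy Efficiency", "Water Scarcity"]

batch = {"input_ids": [[101, 2057]], "labels": [[0.1, 0.6, 0.3]]}
found = {name: batch[name] for name in label_names if name in batch}
assert "labels" in found  # found under the key, regardless of class names
```

So passing the class names to --label_names presumably would not help even if the shell parsed the bracketed list correctly.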
Issue Analytics
- Created 2 years ago
- Comments: 5 (1 by maintainers)
Top GitHub Comments
That did the trick!
I am facing the same issue and cannot run the google colab examples either. Any help is appreciated!