KeyError when initializing the tokenizer with AutoTokenizer for `ernie-1.0-base-zh`
System Info
- transformers version: 4.6.0
- Platform: Linux-3.10.0-1160.49.1.el7.x86_64-x86_64-with-debian-buster-sid
- Python version: 3.6.10
- PyTorch version (GPU?): 1.7.0a0+7036e91 (True)
- Tensorflow version (GPU?): not installed (NA)
- Using GPU in script?: <fill in>
- Using distributed or parallel set-up in script?: <fill in>
Who can help?
No response
Information
- The official example scripts
- My own modified scripts
Tasks
- An officially supported task in the examples folder (such as GLUE/SQuAD, …)
- My own task or dataset (give details below)
Reproduction
Just use
tokenizer = AutoTokenizer.from_pretrained("nghuyong/ernie-1.0-base-zh")
then I encountered this problem:
Traceback (most recent call last):
File "prepro_std_fin.py", line 297, in <module>
main(args)
File "prepro_std_fin.py", line 266, in main
tokenizer = AutoTokenizer.from_pretrained("nghuyong/ernie-1.0-base-zh")
File "/opt/conda/lib/python3.6/site-packages/transformers/models/auto/tokenization_auto.py", line 402, in from_pretrained
config = AutoConfig.from_pretrained(pretrained_model_name_or_path, **kwargs)
File "/opt/conda/lib/python3.6/site-packages/transformers/models/auto/configuration_auto.py", line 432, in from_pretrained
config_class = CONFIG_MAPPING[config_dict["model_type"]]
KeyError: 'ernie'
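The traceback ends in AutoConfig's model-type lookup. A minimal sketch of that mechanism (a hypothetical mapping that mirrors the `CONFIG_MAPPING` dict lookup in the traceback; the real table is much larger): when the checkpoint's `config.json` declares a `model_type` that the installed transformers version has not registered, the plain dict lookup raises `KeyError`.

```python
# Hypothetical stand-in for transformers' CONFIG_MAPPING in v4.6.0,
# which had no "ernie" entry.
CONFIG_MAPPING = {"bert": "BertConfig", "gpt2": "GPT2Config"}

def resolve_config(config_dict):
    # AutoConfig reads model_type from the checkpoint's config.json and
    # looks it up in the mapping; an unregistered type raises KeyError.
    return CONFIG_MAPPING[config_dict["model_type"]]

try:
    resolve_config({"model_type": "ernie"})
except KeyError as err:
    print(f"KeyError: {err}")  # prints: KeyError: 'ernie'
```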
Expected behavior
I expected the tokenizer to load without errors. I hope you can help me solve this problem. Thanks!
Issue Analytics
- State:
- Created a year ago
- Comments:7 (2 by maintainers)
Top GitHub Comments
It might be worth posting here as well: https://huggingface.co/nghuyong/ernie-2.0-base-en/discussions/1
This seems to be an issue with the model config itself, not with the transformers library.
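Concretely, the report pins transformers at 4.6.0, which predates ERNIE support in the library; ERNIE model classes appear to have landed in transformers v4.22.0 (an assumption based on the release notes; verify against your installed version). A tiny hypothetical helper, `supports_ernie`, illustrates the version gate:

```python
def supports_ernie(transformers_version: str) -> bool:
    # Hypothetical check: the "ernie" model type was registered starting
    # with transformers v4.22.0 (assumption; confirm in the release notes).
    major, minor = (int(part) for part in transformers_version.split(".")[:2])
    return (major, minor) >= (4, 22)

print(supports_ernie("4.6.0"))   # False: the version in this report
print(supports_ernie("4.22.0"))  # True
```

In practice, upgrading with `pip install -U transformers` (or pinning `transformers>=4.22.0`) and retrying the `AutoTokenizer.from_pretrained` call is the straightforward way out.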
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
Please note that issues that do not follow the contributing guidelines are likely to be ignored.