
OSError: file bert-base-uncased/config.json not found


Environment info

  • transformers version: 4.4.2
  • Python version: 3.6
  • PyTorch version (GPU?): 1.8.0 (Tesla V100)

Information

The problem arises when using:

from transformers import BertModel
model = BertModel.from_pretrained('bert-base-uncased')

Error info (some personal info has been replaced by ---)

file bert-base-uncased/config.json not found
Traceback (most recent call last):
  File "---/anaconda3/envs/attn/lib/python3.6/site-packages/transformers-4.2.2-py3.8.egg/transformers/configuration_utils.py", line 420, in get_config_dict
  File "---/anaconda3/envs/attn/lib/python3.6/site-packages/transformers-4.2.2-py3.8.egg/transformers/file_utils.py", line 1063, in cached_path
OSError: file bert-base-uncased/config.json not found

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "---.py", line 107, in <module>
    from_pretrained_input()
  File "---.py", line 96, in from_pretrained_input
    model = BertModel.from_pretrained('bert-base-uncased')
  File "---/anaconda3/envs/attn/lib/python3.6/site-packages/transformers-4.2.2-py3.8.egg/transformers/modeling_utils.py", line 962, in from_pretrained
  File "---/anaconda3/envs/attn/lib/python3.6/site-packages/transformers-4.2.2-py3.8.egg/transformers/configuration_utils.py", line 372, in from_pretrained
  File "---/anaconda3/envs/attn/lib/python3.6/site-packages/transformers-4.2.2-py3.8.egg/transformers/configuration_utils.py", line 432, in get_config_dict
OSError: Can't load config for 'bert-base-uncased'. Make sure that:

- 'bert-base-uncased' is a correct model identifier listed on 'https://huggingface.co/models'

- or 'bert-base-uncased' is the correct path to a directory containing a config.json file
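The two resolution paths named in that error message can be illustrated with a simplified stdlib sketch (`resolve_source` is a hypothetical helper, not the library's actual loader): `from_pretrained` treats its argument as a local directory if one exists on disk, and only otherwise as a Hub model identifier.

```python
import os

def resolve_source(name):
    """Simplified sketch of how from_pretrained interprets its argument:
    an existing local directory takes precedence over a Hub identifier,
    so a stray folder named after a model can shadow the download."""
    if os.path.isdir(name):
        config_path = os.path.join(name, "config.json")
        if not os.path.isfile(config_path):
            # Mirrors: "OSError: file bert-base-uncased/config.json not found"
            raise OSError(f"file {config_path} not found")
        return ("local", config_path)
    return ("hub", name)
```

Under this model of the lookup, an empty `./bert-base-uncased` directory in the working directory is enough to produce the traceback above.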

What I have read:

https://github.com/huggingface/transformers/issues/353

What I have tried:

  1. Loading from a downloaded model file works well:

wget https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased.tar.gz

unpack the archive and rename bert_config.json to config.json, then

model = BertModel.from_pretrained(BERT_BASE_UNCASED_CACHE)

  2. Enough disk space, enough memory, a free GPU.

  3. Open internet connection, no proxy:

import pytorch_pretrained_bert as ppb
assert 'bert-large-cased' in ppb.modeling.PRETRAINED_MODEL_ARCHIVE_MAP

  4. The following models work well:

model = BertModel.from_pretrained('bert-base-cased')

model = RobertaModel.from_pretrained('roberta-base')

  5. Works in the server shell, but not in local PyCharm (remote deployment to the server).

Observations:

  • PyCharm can find the transformers installed with pip, but that install triggers this problem
  • PyCharm cannot find the transformers installed with conda (conda install transformers=4.4 -n env -c huggingface)
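The mismatch between the server shell and PyCharm suggests the two are running different Python environments. A quick stdlib check (the `locate` helper is just for illustration) shows which interpreter is active and where it would import transformers from:

```python
import importlib.util
import sys

def locate(package):
    """Return the file a package would be imported from in the current
    interpreter, or None if it is not installed in this environment."""
    spec = importlib.util.find_spec(package)
    return spec.origin if spec else None

print(sys.executable)          # interpreter PyCharm / the shell is using
print(locate("transformers"))  # e.g. a stale egg vs. the conda install
```

Running this both in the server shell and from PyCharm should reveal whether PyCharm is picking up the old transformers-4.2.2 egg visible in the traceback rather than the conda-installed 4.4.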

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Reactions: 1
  • Comments: 11 (1 by maintainers)

Top GitHub Comments

5 reactions
vildhet commented, Jul 11, 2021

Hi, I’ve had the same error but with roberta-base. It appeared that I had an empty folder named roberta-base in my working directory. Removing it solved the issue.
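That failure mode can be detected mechanically. A small sketch (`shadowing_dir` is a hypothetical helper, not part of transformers) that flags a local folder shadowing a Hub identifier:

```python
import os

def shadowing_dir(model_id, cwd="."):
    """Return the path of a local directory that would shadow the given
    Hub model identifier (i.e. it exists but contains no config.json),
    or None if nothing shadows it."""
    candidate = os.path.join(cwd, model_id)
    if os.path.isdir(candidate) and not os.path.isfile(
        os.path.join(candidate, "config.json")
    ):
        return candidate
    return None
```

If this returns a path, either delete the empty folder (as above) or populate it with a real config.json and model weights.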

1 reaction
leoliu0 commented, Jul 20, 2021

I found this issue is caused by setting the output directory to the checkpoint name: TrainingArguments(checkpoint, evaluation_strategy='steps').

Changing checkpoint to something else resolved the issue.
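The mechanism is the same as the empty-folder case above: the Trainer creates its output directory (the first positional argument of TrainingArguments), so naming it after the Hub checkpoint leaves behind a local folder that later shadows the identifier. A stdlib sketch of a guard against this (`safe_output_dir` is a hypothetical helper):

```python
import os

def safe_output_dir(output_dir, model_id):
    """Reject an output directory whose name collides with the Hub model
    identifier; the Trainer would create that folder, and a later
    from_pretrained(model_id) would resolve it as a local path."""
    if os.path.normpath(output_dir) == os.path.normpath(model_id):
        raise ValueError(
            f"output_dir {output_dir!r} shadows model id {model_id!r}; "
            "use a distinct directory such as './results'"
        )
    return output_dir
```

Equivalently, just pick an output directory that is not a model name, e.g. TrainingArguments('./results', evaluation_strategy='steps').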


