
cannot load BERTAdam when restoring from BioBert

See original GitHub issue

I am trying to convert the recently released BioBert checkpoint: https://github.com/naver/biobert-pretrained

The conversion script loads the checkpoint but appears to balk at BERTAdam while building the PyTorch model.

...
Building PyTorch model from configuration: {
  "attention_probs_dropout_prob": 0.1,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.1,
  "hidden_size": 768,
  "initializer_range": 0.02,
  "intermediate_size": 3072,
  "max_position_embeddings": 512,
  "num_attention_heads": 12,
  "num_hidden_layers": 12,
  "type_vocab_size": 2,
  "vocab_size": 28996
}

Initialize PyTorch weight ['bert', 'embeddings', 'LayerNorm', 'beta']
Traceback (most recent call last):
  File "/usr/lib/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/venvs/dev3.6/lib/python3.6/site-packages/pytorch_pretrained_bert/__main__.py", line 19, in <module>
    convert_tf_checkpoint_to_pytorch(TF_CHECKPOINT, TF_CONFIG, PYTORCH_DUMP_OUTPUT)
  File "/venvs/dev3.6/lib/python3.6/site-packages/pytorch_pretrained_bert/convert_tf_checkpoint_to_pytorch.py", line 69, in convert_tf_checkpoint_to_pytorch
    pointer = getattr(pointer, l[0])
AttributeError: 'Parameter' object has no attribute 'BERTAdam'
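The traceback makes the mechanism clear: the conversion script splits each checkpoint variable name on "/" and walks the PyTorch model attribute by attribute. A checkpoint trained with BERTAdam also contains optimizer slot variables whose names extend one segment past the model parameter, so the walk reaches a Parameter and then tries one more getattr. A minimal sketch of that walk, using stand-in classes and an illustrative variable name modeled on the log output (not the library's real code):

```python
# Minimal sketch of the attribute walk inside
# convert_tf_checkpoint_to_pytorch that produces the error above.
# The classes below are stand-ins for the real PyTorch modules.
class Param:
    """Stand-in for torch.nn.Parameter (a leaf with no children)."""

class LayerNorm:
    def __init__(self):
        self.beta = Param()

class Embeddings:
    def __init__(self):
        self.LayerNorm = LayerNorm()

class Bert:
    def __init__(self):
        self.embeddings = Embeddings()

model = Bert()
# A checkpoint saved with BERTAdam carries optimizer slots whose
# names extend past the model parameter, e.g. ".../beta/BERTAdam".
name = "embeddings/LayerNorm/beta/BERTAdam".split("/")
pointer = model
try:
    for part in name:
        pointer = getattr(pointer, part)  # fails on the "BERTAdam" leg
except AttributeError as err:
    print(err)  # 'Param' object has no attribute 'BERTAdam'
```

The walk succeeds through embeddings, LayerNorm, and beta, then fails on the trailing optimizer-slot segment, matching the AttributeError in the traceback.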

Issue Analytics

  • State:closed
  • Created 5 years ago
  • Reactions:1
  • Comments:5 (2 by maintainers)

Top GitHub Comments

0 reactions
thomwolf commented, Mar 6, 2019

This is normal. Closing the issue now.
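The resolution adopted in the library is consistent with this: variables that belong to the optimizer rather than the model are skipped during conversion (the weight loader checks for name segments such as adam_v, adam_m, and global_step). A minimal sketch of that skip logic, with an illustrative variable list rather than names read from a real BioBert checkpoint:

```python
# Sketch of the optimizer-variable filter applied when walking a
# TF checkpoint's variable list. The names below are illustrative.
SKIP = {"adam_v", "adam_m", "BERTAdam", "BERTAdam_1", "global_step"}

def is_model_variable(name: str) -> bool:
    """Keep only variables that map onto model parameters."""
    return not any(part in SKIP for part in name.split("/"))

checkpoint_vars = [
    "bert/embeddings/LayerNorm/beta",
    "bert/embeddings/LayerNorm/beta/BERTAdam",    # optimizer slot
    "bert/embeddings/LayerNorm/beta/BERTAdam_1",  # optimizer slot
    "global_step",
]
kept = [v for v in checkpoint_vars if is_model_variable(v)]
print(kept)  # ['bert/embeddings/LayerNorm/beta']
```

With this filter in place, only genuine model parameters are walked, so the attribute lookup never runs past a Parameter into an optimizer-slot segment.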


Top Results From Across the Web

  • biobert for keras version of huggingface transformers
    Error message: NotImplementedError: Weights may only be loaded based on topology into Models when loading TensorFlow-formatted weights (got ...
  • Fine-tuning a BERT model | Text - TensorFlow
    The one downside to loading this model from TF Hub is that the structure of internal Keras layers is not restored. This makes...
  • What to do when you get an error - Hugging Face Course
    Feel free to test it out :) and the first thing you think of is to load the model using the pipeline from...
  • COVID-19 Task-3 info extraction Bio-bert Graph - Kaggle
    The members of the multi-FASTA reference file for the BBV Panel were obtained by submitting FASTQ sets from the rRNA-depleted sample with the...
  • pytorch-pretrained-bert - PyPI
    PyTorch version of Google AI BERT model with script to load Google pre-trained models. ... BertAdam - Bert version of Adam algorithm with...
