
omegaconf.errors.ConfigAttributeError: Key 'checkpoint_activations' not in 'HubertConfig'


🐛 Bug

Hi,

When I tried to load a HuBERT model, I got this error:

Python 3.8.12 (default, Oct 12 2021, 13:49:34) 
[GCC 7.5.0] :: Anaconda, Inc. on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import torch
>>> import fairseq
>>> ckpt_path = "/path/to/fairseq/pretrained_models/hubert_xtralarge_ll60k_finetune_ls960_modified.pt"
>>> models, cfg, task = fairseq.checkpoint_utils.load_model_ensemble_and_task([ckpt_path])
2021-12-06 10:55:12 | INFO | fairseq.tasks.hubert_pretraining | current directory is /path/to/fairseq/scripts
2021-12-06 10:55:12 | INFO | fairseq.tasks.hubert_pretraining | HubertPretrainingTask Config {'_name': 'hubert_pretraining', 'data': '/checkpoint/abdo/old_checkpoint02/datasets/librispeech/960h/raw_repeated', 'fine_tuning': False, 'labels': ['ltr'], 'label_dir': None, 'label_rate': -1, 'sample_rate': 16000, 'normalize': True, 'enable_padding': False, 'max_keep_size': None, 'max_sample_size': 300000, 'min_sample_size': None, 'single_target': True, 'random_crop': False, 'pad_audio': False}
2021-12-06 10:55:12 | INFO | fairseq.tasks.hubert_pretraining | current directory is /path/to/fairseq/scripts
2021-12-06 10:55:12 | INFO | fairseq.tasks.hubert_pretraining | HubertPretrainingTask Config {'_name': 'hubert_pretraining', 'data': '/checkpoint/abdo/old_checkpoint02/datasets/librispeech/960h/raw_repeated', 'fine_tuning': False, 'labels': ['lyr9.km500'], 'label_dir': '/path/to/fairseq/scripts', 'label_rate': 50, 'sample_rate': 16000, 'normalize': True, 'enable_padding': False, 'max_keep_size': None, 'max_sample_size': 250000, 'min_sample_size': 32000, 'single_target': False, 'random_crop': True, 'pad_audio': False}
2021-12-06 10:55:12 | INFO | fairseq.models.hubert.hubert | HubertModel Config: {'_name': 'hubert', 'label_rate': 50, 'extractor_mode': layer_norm, 'encoder_layers': 48, 'encoder_embed_dim': 1280, 'encoder_ffn_embed_dim': 5120, 'encoder_attention_heads': 16, 'activation_fn': gelu, 'dropout': 0.0, 'attention_dropout': 0.0, 'activation_dropout': 0.1, 'encoder_layerdrop': 0.1, 'dropout_input': 0.0, 'dropout_features': 0.0, 'final_dim': 1024, 'untie_final_proj': True, 'layer_norm_first': True, 'conv_feature_layers': '[(512,10,5)] + [(512,3,2)] * 4 + [(512,2,2)] * 2', 'conv_bias': False, 'logit_temp': 0.1, 'target_glu': False, 'feature_grad_mult': 0.0, 'mask_length': 10, 'mask_prob': 0.5, 'mask_selection': static, 'mask_other': 0.0, 'no_mask_overlap': False, 'mask_min_space': 1, 'mask_channel_length': 64, 'mask_channel_prob': 0.25, 'mask_channel_selection': static, 'mask_channel_other': 0.0, 'no_mask_channel_overlap': False, 'mask_channel_min_space': 1, 'conv_pos': 128, 'conv_pos_groups': 16, 'latent_temp': [2.0, 0.5, 0.999995], 'skip_masked': False, 'skip_nomask': True}
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/path/to/fairseq/fairseq_latest/fairseq/checkpoint_utils.py", line 462, in load_model_ensemble_and_task
    model = task.build_model(cfg.model)
  File "/path/to/fairseq/fairseq_latest/fairseq/tasks/fairseq_task.py", line 335, in build_model
    model = models.build_model(cfg, self)
  File "/path/to/fairseq/fairseq_latest/fairseq/models/__init__.py", line 105, in build_model
    return model.build_model(cfg, task)
  File "/path/to/fairseq/fairseq_latest/fairseq/models/hubert/hubert_asr.py", line 146, in build_model
    w2v_encoder = HubertEncoder(cfg, task.target_dictionary)
  File "/path/to/fairseq/fairseq_latest/fairseq/models/hubert/hubert_asr.py", line 272, in __init__
    model = task.build_model(w2v_args.model)
  File "/path/to/fairseq/fairseq_latest/fairseq/tasks/fairseq_task.py", line 335, in build_model
    model = models.build_model(cfg, self)
  File "/path/to/fairseq/fairseq_latest/fairseq/models/__init__.py", line 105, in build_model
    return model.build_model(cfg, task)
  File "/path/to/fairseq/fairseq_latest/fairseq/models/hubert/hubert.py", line 302, in build_model
    model = HubertModel(cfg, task.cfg, task.dictionaries)
  File "/path/to/fairseq/fairseq_latest/fairseq/models/hubert/hubert.py", line 265, in __init__
    self.encoder = TransformerEncoder(cfg)
  File "/path/to/fairseq/fairseq_latest/fairseq/models/wav2vec/wav2vec2.py", line 858, in __init__
    if args.checkpoint_activations:
  File "/path/to/miniconda3/envs/fairseq/lib/python3.8/site-packages/omegaconf/dictconfig.py", line 305, in __getattr__
    self._format_and_raise(key=key, value=None, cause=e)
  File "/path/to/miniconda3/envs/fairseq/lib/python3.8/site-packages/omegaconf/base.py", line 95, in _format_and_raise
    format_and_raise(
  File "/path/to/miniconda3/envs/fairseq/lib/python3.8/site-packages/omegaconf/_utils.py", line 629, in format_and_raise
    _raise(ex, cause)
  File "/path/to/miniconda3/envs/fairseq/lib/python3.8/site-packages/omegaconf/_utils.py", line 610, in _raise
    raise ex  # set end OC_CAUSE=1 for full backtrace
  File "/path/to/miniconda3/envs/fairseq/lib/python3.8/site-packages/omegaconf/dictconfig.py", line 303, in __getattr__
    return self._get_impl(key=key, default_value=DEFAULT_VALUE_MARKER)
  File "/path/to/miniconda3/envs/fairseq/lib/python3.8/site-packages/omegaconf/dictconfig.py", line 361, in _get_impl
    node = self._get_node(key=key)
  File "/path/to/miniconda3/envs/fairseq/lib/python3.8/site-packages/omegaconf/dictconfig.py", line 383, in _get_node
    self._validate_get(key)
  File "/path/to/miniconda3/envs/fairseq/lib/python3.8/site-packages/omegaconf/dictconfig.py", line 135, in _validate_get
    self._format_and_raise(
  File "/path/to/miniconda3/envs/fairseq/lib/python3.8/site-packages/omegaconf/base.py", line 95, in _format_and_raise
    format_and_raise(
  File "/path/to/miniconda3/envs/fairseq/lib/python3.8/site-packages/omegaconf/_utils.py", line 694, in format_and_raise
    _raise(ex, cause)
  File "/path/to/miniconda3/envs/fairseq/lib/python3.8/site-packages/omegaconf/_utils.py", line 610, in _raise
    raise ex  # set end OC_CAUSE=1 for full backtrace
omegaconf.errors.ConfigAttributeError: Key 'checkpoint_activations' not in 'HubertConfig'
	full_key: w2v_args.checkpoint_activations
	reference_type=Optional[HubertConfig]
	object_type=HubertConfig
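
For context, this is standard OmegaConf behaviour rather than a corrupted checkpoint: fairseq builds its model configs as structured (dataclass-backed) OmegaConf objects, and a structured config rejects any key that is not declared on the backing dataclass. Here the shared TransformerEncoder in wav2vec2.py reads checkpoint_activations, a field that the wav2vec 2.0 config declares but HubertConfig (at this commit) does not, judging by the fix in the comments below. A minimal sketch of the mechanism, using a hypothetical stand-in dataclass rather than fairseq's real config:

from dataclasses import dataclass
from omegaconf import OmegaConf

@dataclass
class TinyConfig:
    # hypothetical stand-in for fairseq's HubertConfig
    label_rate: int = 50

cfg = OmegaConf.structured(TinyConfig)  # structured configs are in "struct" mode
print(cfg.label_rate)  # 50
print(cfg.checkpoint_activations)  # raises ConfigAttributeError:
                                   # Key 'checkpoint_activations' not in 'TinyConfig'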

To Reproduce

I am following the instructions here.

Environment

  • fairseq Version (e.g., 1.0 or main): 1.0.0a0+0dfd6b6
  • PyTorch Version: 1.10.0+cu102
  • OS (e.g., Linux): Ubuntu
  • How you installed fairseq (pip, source): pip install --editable ./
  • Python version: 3.8.12
  • CUDA/cuDNN version: CUDA Version: 11.1
  • GPU models and configuration: Tesla P100

Issue Analytics

  • State: closed
  • Created 2 years ago
  • Comments: 6

Top GitHub Comments

2 reactions
EmreOzkose commented, Dec 9, 2021

Actually, just add these 3 lines to HubertConfig in fairseq/models/hubert/hubert.py (around line 206):

    checkpoint_activations: bool = field(
        default=False, metadata={"help": "recompute activations and save memory for extra compute"}
    )
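
For reference, here is a sketch of where those lines land: inside the HubertConfig dataclass in fairseq/models/hubert/hubert.py (surrounding fields abbreviated, and the exact line number will vary with your commit):

from dataclasses import dataclass, field
from fairseq.dataclass import FairseqDataclass

@dataclass
class HubertConfig(FairseqDataclass):
    # ... existing fields such as label_rate, extractor_mode, etc. ...
    checkpoint_activations: bool = field(
        default=False,
        metadata={"help": "recompute activations and save memory for extra compute"},
    )

Since fairseq was installed with pip install --editable ./, the edit takes effect the next time the interpreter is restarted; no reinstall is needed.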
1 reaction
qingyundou commented, Dec 9, 2021

That fixed the problem, thanks a lot @EmreOzkose!
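
For anyone landing here later: after adding that field, the original load call should go through (the path below is a placeholder):

import fairseq

ckpt_path = "/path/to/hubert_xtralarge_ll60k_finetune_ls960.pt"
models, cfg, task = fairseq.checkpoint_utils.load_model_ensemble_and_task([ckpt_path])
model = models[0].eval()  # the loaded HuBERT ASR model, ready for inference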
