omegaconf.errors.ConfigKeyError: str interpolation key 'optimization.lr' not found
I don't know how to solve this error: "omegaconf.errors.ConfigKeyError: str interpolation key 'optimization.lr' not found". Here is the command I'm running:
Code
fairseq-train data-bin/XXXX/ --save-dir XXXX --arch lstm --encoder-bidirectional --encoder-layers 2 --decoder-layers 2 --dropout 0.2 --lr 5e-4 --batch-size 32 --optimizer adam --eval-bleu --eval-bleu-args '{"beam": 5, "max_len_a": 1.2, "max_len_b": 10}' --eval-bleu-detok moses --best-checkpoint-metric bleu --maximize-best-checkpoint-metric
I'd appreciate your help.
What have you tried?
I checked that I have the optimizer and learning rate in the command.
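For reference, a quick way to check which omegaconf and hydra-core versions are installed (the packages pinned in the fix below); this is just a sketch and assumes pip points at the same environment that fairseq was installed into:
pip show omegaconf hydra-core
python -c "import omegaconf, hydra; print(omegaconf.__version__, hydra.__version__)"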
What’s your environment?
- fairseq Version (e.g., 1.0 or master): master
- PyTorch Version (e.g., 1.0): 1.7.1
- OS (e.g., Linux): Linux
- How you installed fairseq (pip, source): source
- Build command you used (if compiling from source): git clone https://github.com/pytorch/fairseq && cd fairseq && pip install --editable ./
- Python version: 3.8.2
- CUDA/cuDNN version:
- GPU models and configuration:
- Any other relevant information:
Issue Analytics
- Created 3 years ago
- Comments: 5 (2 by maintainers)
Top GitHub Comments
Hi Myleott, I uninstalled the previous versions and reinstalled omegaconf==2.0.5 and hydra-core==1.0.4, and it works fine now. Thank you so much!
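For anyone hitting the same error, the reinstall described above amounts to roughly the following (a sketch; run it in the environment where fairseq was installed from source):
pip uninstall -y omegaconf hydra-core
pip install omegaconf==2.0.5 hydra-core==1.0.4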
I get this error after switching from a May 31st commit to the current master.