Setting different lr for different parameters using fairseq command-line
❓ Questions and Help
Before asking:
- search the issues.
- search the docs.
What is your question?
Is it possible to use the fairseq-train command-line tool to set different learning rates for different parameters?
Code
What have you tried?
I have searched the issues and the docs but cannot find useful suggestions. Thanks for your reply!
What’s your environment?
- fairseq Version (e.g., 1.0 or main):
- PyTorch Version (e.g., 1.0):
- OS (e.g., Linux):
- How you installed fairseq (pip, source):
- Build command you used (if compiling from source):
- Python version:
- CUDA/cuDNN version:
- GPU models and configuration:
- Any other relevant information:
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
That helps a lot, thanks. I have looked at the code in fairseq; it supports customizing our own optimizer through @register_optimizer. Thanks for the reply! I will close this issue since the problem is solved.
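
For context, here is a rough sketch of what such a custom optimizer could look like. The name my_two_lr_adam, the small_lr_scale option, and the way parameters are split into groups are illustrative assumptions, not fairseq APIs, and the registration/constructor signatures vary between fairseq versions (older releases use argparse args instead of a dataclass), so compare with fairseq/optim/adam.py in your installed version:

```python
# Hypothetical sketch: a custom optimizer registered with @register_optimizer
# that builds two torch parameter groups with different learning rates.
# Verify the exact signatures against the fairseq version you are running.
from dataclasses import dataclass, field
from typing import List

import torch

from fairseq.dataclass import FairseqDataclass
from fairseq.optim import FairseqOptimizer, register_optimizer


@dataclass
class TwoLRAdamConfig(FairseqDataclass):
    lr: List[float] = field(default_factory=lambda: [1e-4])
    # made-up extra option: scale applied to 1-D parameters (biases, norms)
    small_lr_scale: float = field(default=0.1)


@register_optimizer("my_two_lr_adam", dataclass=TwoLRAdamConfig)
class TwoLRAdam(FairseqOptimizer):
    def __init__(self, cfg: TwoLRAdamConfig, params):
        super().__init__(cfg)
        params = list(params)
        base_lr = cfg.lr[0]
        # Split parameters into two groups; here simply by dimensionality,
        # purely to illustrate per-group learning rates.
        small = [p for p in params if p.dim() <= 1]
        rest = [p for p in params if p.dim() > 1]
        self._optimizer = torch.optim.Adam(
            [
                # "lr_scale" is, to my understanding, honored by recent
                # FairseqOptimizer.set_lr implementations so the LR scheduler
                # keeps the ratio between groups; treat this as an assumption.
                {"params": rest, "lr": base_lr, "lr_scale": 1.0},
                {"params": small, "lr": base_lr * cfg.small_lr_scale,
                 "lr_scale": cfg.small_lr_scale},
            ]
        )

    @property
    def optimizer_config(self):
        # fairseq's LR schedulers read and override these values
        return {"lr": self.cfg.lr[0]}
```

You would then select it at training time with --optimizer my_two_lr_adam (or optimizer=my_two_lr_adam in a Hydra config), assuming the module is importable by fairseq, for example via --user-dir.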
@ZeroYuHuang I am sorry, I was wrong. There is an optimizer called composite. You can use it to set different learning rates for different parameter groups.
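
For anyone landing here later: my understanding (an assumption based on reading fairseq/optim/composite.py, not something stated in this issue) is that the composite optimizer buckets parameters by a param_group attribute set on each parameter in the model, and each named group then gets its own optimizer and LR scheduler from the config. A minimal sketch of the model-side tagging:

```python
# Sketch of the model-side tagging the composite optimizer appears to expect
# (based on my reading of fairseq/optim/composite.py on the main branch,
# where parameters are bucketed by a "param_group" attribute; please verify
# against the version you are running).
import torch.nn as nn


class TaggedModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Linear(512, 512)
        self.decoder = nn.Linear(512, 512)
        # Parameters tagged "decoder" would go to the "decoder" group of the
        # composite optimizer; untagged parameters fall into "default".
        for p in self.decoder.parameters():
            p.param_group = "decoder"
```

Each named group is then given its own optimizer and learning rate in the training config (optimizer _name: composite, with one groups entry per name); treat the exact config keys as an assumption and check the dataclasses in composite.py for the schema used by your version.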