
Setting different lr for different parameters using fairseq command-line

See original GitHub issue

❓ Questions and Help

Before asking:

  1. search the issues.
  2. search the docs.

What is your question?

Is it possible to use the fairseq-train command-line tool to set different learning rates for different parameters?

Code

What have you tried?

I have searched the issues and docs but cannot find a useful suggestion. Thanks for your reply!

What’s your environment?

  • fairseq Version (e.g., 1.0 or main):
  • PyTorch Version (e.g., 1.0)
  • OS (e.g., Linux):
  • How you installed fairseq (pip, source):
  • Build command you used (if compiling from source):
  • Python version:
  • CUDA/cuDNN version:
  • GPU models and configuration:
  • Any other relevant information:

Issue Analytics

  • State: closed
  • Created a year ago
  • Comments: 5

Top GitHub Comments

1 reaction
ZeroYuHuang commented, Jul 11, 2022

No, you cannot. I doubt PyTorch supports this feature.

I believe having more hyperparameters (an lr for each weight) is not very welcome, which means few people do this. So I wish you good luck.

Sorry, I was wrong. PyTorch can specify a learning rate for each parameter.
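
For reference, plain PyTorch handles this through optimizer parameter groups; here is a minimal sketch (the two-layer toy model is just a placeholder):

```python
import torch
import torch.nn as nn

# Toy two-layer model, only to have two distinct sets of parameters.
model = nn.Sequential(nn.Linear(10, 20), nn.Linear(20, 2))

# Each dict is a parameter group with its own options; anything a group
# omits falls back to the defaults passed at the end.
optimizer = torch.optim.Adam(
    [
        {"params": model[0].parameters(), "lr": 1e-3},
        {"params": model[1].parameters(), "lr": 1e-4},
    ],
    lr=5e-4,  # default lr for any group that does not set one
)

for group in optimizer.param_groups:
    print(group["lr"])  # 0.001, then 0.0001
```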

And yes, you can. Every class with a @register_* decorator can be invoked through the appropriate fairseq command-line tool, so with fairseq-train you can invoke your custom optimizer.
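
To make that concrete, a registration sketch is below. This is a hedged sketch, not fairseq's own code: the module name, class name, and the half/half group split are made up for illustration; register_optimizer, FairseqOptimizer, and torch.optim.Adam are the only real APIs used, and the exact hooks (dataclass vs. add_args, loading via --user-dir vs. dropping the file under fairseq/optim/) depend on your fairseq version.

```python
# my_optimizer.py -- hypothetical module and names, for illustration only.
import torch

from fairseq.optim import FairseqOptimizer, register_optimizer


@register_optimizer("two_group_adam")  # then: fairseq-train ... --optimizer two_group_adam
class TwoGroupAdam(FairseqOptimizer):
    """Adam with two parameter groups running at different learning rates."""

    def __init__(self, cfg, params):
        super().__init__(cfg)
        params = list(params)
        half = len(params) // 2  # stand-in split; real code would pick groups by name/module
        base_lr = cfg.lr[0]      # assuming the usual list-valued --lr
        self._optimizer = torch.optim.Adam(
            [
                # "lr_scale" is our own bookkeeping key, not a torch option.
                {"params": params[:half], "lr": base_lr, "lr_scale": 1.0},
                {"params": params[half:], "lr": base_lr * 0.1, "lr_scale": 0.1},
            ]
        )

    @property
    def optimizer_config(self):
        return {"lr": self.cfg.lr[0]}

    def set_lr(self, lr):
        # Keep the ratio between groups by scaling instead of overwriting.
        for group in self.param_groups:
            group["lr"] = lr * group.get("lr_scale", 1.0)
```

The set_lr override is there because, as far as I can tell, fairseq's stock lr schedulers push a single value to every parameter group on each update, which would otherwise flatten the per-group rates; this is also one reason the composite optimizer mentioned below is the cleaner route.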

That helps a lot, thanks. I have looked at the code in fairseq; it supports customizing your own optimizer through @register_optimizer. Thanks for the reply! I will close this issue since the problem is solved.

0 reactions
gmryu commented, Jul 11, 2022

@ZeroYuHuang I am sorry, I was wrong. There is an optimizer called composite.

You can use it to set a different lr for different parameter groups.
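
For anyone landing here later, the composite optimizer is configured through fairseq's Hydra config, roughly along the lines below. This is only a sketch of the shape suggested by fairseq/optim/composite.py and the hydra_integration docs, not a verified config: the group names are placeholders, and (as far as I can tell) parameters are assigned to a group via a param_group attribute set on them in the model code, which is worth double-checking for your fairseq version.

```yaml
# Hedged sketch of a composite optimizer section in a fairseq Hydra config.
optimizer:
  _name: composite
  groups:
    default:
      lr: [0.0005]
      optimizer:
        _name: adam
      lr_scheduler:
        _name: fixed
    encoder:          # placeholder group name
      lr: [0.0001]
      optimizer:
        _name: adam
      lr_scheduler:
        _name: fixed
```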

Read more comments on GitHub >

Top Results From Across the Web

  • Command-line Tools — fairseq 0.12.2 documentation
    Fairseq provides several command-line tools for training and evaluating models: fairseq-preprocess: Data pre-processing: build vocabularies and binarize ...
  • Question on how to pass pretrained parameters to a self ...
    I know the file should be put into fairseq/optimu/my_optimizer.py. ... then you can set up a composite config with different optimizer ...
  • fairseq/docs/hydra_integration.md · OFA-Sys ... - Hugging Face
    One can declare a field that, by default, will inherit its value from another config node in the same hierarchy: @dataclass FairseqAdamConfig( ...
  • Fault-Tolerant Fairseq Training — Ray 2.2.0
    The pipeline and configurations in this document will work for other ... We set args for different ray actors for communication, add a...
  • fairseq Users | Thank you for open sourcing this software
    I'm having a little bit of trouble figuring out how to configure checkpoints. Based on the fairseq-train CLI, it looks like there are...
