Add support for `OptimizerParamGroup`
See original GitHub issue.

`OptimizerParamGroup` is used to set learning rates, weight decay rates, and other optimizer properties differently for different groups of model parameters.

Example: `AdamW(vector<OptimizerParamGroup> param_groups, AdamWOptions defaults)`
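For context, that signature comes from the existing libtorch C++ API that the issue asks to mirror. Below is a minimal sketch of how it is used there; the model layout and hyperparameter values are invented for illustration, but `OptimizerParamGroup`, `AdamWOptions`, and this `AdamW` constructor are the actual C++ types:

```cpp
#include <torch/torch.h>

#include <memory>
#include <vector>

int main() {
  // Illustrative two-part model: a backbone whose weights should move
  // slowly, and a freshly initialized head that needs a larger step size.
  torch::nn::Linear backbone(784, 128), head(128, 10);

  // One OptimizerParamGroup per parameter set. The head carries its own
  // AdamWOptions; the backbone has none and inherits the defaults below.
  std::vector<torch::optim::OptimizerParamGroup> groups;
  groups.emplace_back(backbone->parameters());
  groups.emplace_back(
      head->parameters(),
      std::make_unique<torch::optim::AdamWOptions>(/*lr=*/1e-3));

  // The second argument supplies the defaults for every group that did
  // not specify options of its own.
  torch::optim::AdamW optimizer(
      groups,
      torch::optim::AdamWOptions(/*lr=*/1e-4).weight_decay(0.01));
}
```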
Issue Analytics
- Created: 2 years ago
- Reactions: 1
- Comments: 16 (16 by maintainers)
Top Results From Across the Web

In pytorch how do you use add_param_group() with a ...
Add a param group to the Optimizer's param_groups. This can be useful when fine-tuning a pre-trained network, as frozen layers can...
Read more >

torch.optim.Optimizer.add_param_group
Add a param group to the Optimizer's param_groups. This can be useful when fine-tuning a pre-trained network, as frozen layers...
Read more >

torch.optim — PyTorch 2.0 documentation
Most commonly used methods are already supported, and the interface is general enough, ... Add a param group to the Optimizer's param_groups...
Read more >

Adding new parameter groups to an optimizer · Issue #292
Hey, I am currently training a growing model that requires me to add new param groups to the optimizer at each growth.
Read more >

torch.optim — PyTorch master documentation
Add a param group to the Optimizer's param_groups. This can be useful when fine-tuning a pre-trained network, as frozen layers...
Read more >
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
@dsyme, @lostmsu – we can have it both ways:

This can be further simplified by renaming the optimizer class. That should be alright, since the name of the factory method, which should be the natural way of creating an optimizer, doesn't change:

I'll need to check how this looks in F#, too. It's just as important to get the F# UX right…