Add `weight_decay_filter` and `lars_adaptation_filter` to LARS
🚀 Feature
Add weight_decay_filter and lars_adaptation_filter to LARS
Motivation
Weight decay typically shouldn't be applied to BatchNorm parameters. See fast.ai and this PyTorch discuss thread.
The Facebook VICReg code has `weight_decay_filter` and `lars_adaptation_filter` parameters: filter functions that return True (i.e., skip weight decay / LARS adaptation) for any parameter with ndim == 1.
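As a minimal sketch (the helper name below is my own; the VICReg code uses an equivalent check), the filter amounts to:

```python
import torch

def exclude_bias_and_norm(p: torch.Tensor) -> bool:
    # Biases and normalization (e.g. BatchNorm) weights are 1-D tensors,
    # so the VICReg-style heuristic is simply ndim == 1.
    return p.ndim == 1

layer = torch.nn.Linear(4, 8)
assert not exclude_bias_and_norm(layer.weight)  # 2-D weight matrix: apply decay
assert exclude_bias_and_norm(layer.bias)        # 1-D bias: skip decay
```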
Pitch
There should be a simple way to disable weight decay and LARS adaptation for parameters with ndim == 1.
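In the meantime, one way to approximate the weight-decay half of this (a sketch using standard PyTorch param groups, not the Flash LARS API) is to split parameters into two optimizer groups so that 1-D parameters get `weight_decay=0`:

```python
import torch

def split_decay_groups(model: torch.nn.Module, weight_decay: float):
    # Put 1-D parameters (biases, norm weights) in a group with no decay;
    # everything else keeps the requested weight decay.
    decay, no_decay = [], []
    for p in model.parameters():
        if not p.requires_grad:
            continue
        (no_decay if p.ndim == 1 else decay).append(p)
    return [
        {"params": decay, "weight_decay": weight_decay},
        {"params": no_decay, "weight_decay": 0.0},
    ]

model = torch.nn.Sequential(
    torch.nn.Linear(4, 8),
    torch.nn.BatchNorm1d(8),
)
groups = split_decay_groups(model, weight_decay=1e-6)
```

These groups can be passed to any torch optimizer, e.g. `torch.optim.SGD(groups, lr=0.3)`; disabling the LARS adaptation itself, however, still requires support inside the LARS optimizer, which is what this issue asks for.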
Alternatives
Port Facebook's LARS implementation and use it instead of the Lightning Flash LARS code.
Issue Analytics
- State:
- Created: a year ago
- Reactions: 1
- Comments: 8 (5 by maintainers)

@krshrimali Thanks! And I am happy to help with code review if you tag me in the PR
@krshrimali Great! I am following this issue.