Investigate if lr_scheduler from segmentation can use PyTorch's schedulers
Back when it was initially implemented in 2019, the LR scheduler in the segmentation reference scripts couldn’t be expressed with native PyTorch schedulers, so we had to resort to LambdaLR:
https://github.com/pytorch/vision/blob/9275cc61fb3c26ce15ced0199ad8b7540d48676c/references/segmentation/train.py#L136-L138
It might be that an equivalent scheduler is now available natively in PyTorch, in which case this code can be simplified.
cc @datumbox
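For context, the referenced lines implement a polynomial ("poly") decay via LambdaLR. Below is a minimal sketch of that schedule next to a possible native replacement, assuming torch.optim.lr_scheduler.PolynomialLR (added to PyTorch core after this issue was opened) reproduces the same decay for power=0.9; the setup values are illustrative stand-ins for what the reference script builds in its training loop.

```python
import torch

# Illustrative setup; in the reference script the optimizer comes from the
# training loop and total_iters is len(data_loader) * args.epochs.
model = torch.nn.Linear(10, 21)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
total_iters = 100

# What the reference script does today: "poly" decay expressed via LambdaLR.
lr_scheduler = torch.optim.lr_scheduler.LambdaLR(
    optimizer, lambda x: (1 - x / total_iters) ** 0.9
)

# Possible simplification, assuming PolynomialLR produces the same schedule.
lr_scheduler = torch.optim.lr_scheduler.PolynomialLR(
    optimizer, total_iters=total_iters, power=0.9
)
```

Note that the reference script steps this scheduler once per iteration rather than once per epoch, which is why total_iters counts batches across the whole run.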
Issue Analytics
- State:
- Created: 2 years ago
- Comments: 10 (10 by maintainers)
Top GitHub Comments
That’s correct. In fact, once the Scheduler makes it to the nightly we can make the change. Not sure if it made it into today’s nightly or if it will appear tomorrow, but you can start a PR and I’ll review/test/merge soon. Would that work for you?
Correct, `min_lr` is the minimum permitted value for the LR. I’m not 100% sure we have to support this TBH. Let’s see what the Core team says and if there are any weird interactions we should keep in mind.

The API of Schedulers is a bit weird. The changes on `get_lr()` happen in place, so you need to undo the update of the previous epoch and apply the new one.

Sounds good, make sure you tag me on the PR.
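To illustrate the in-place quirk described in that comment: chainable schedulers derive the next LR from the optimizer’s current `lr` rather than from `base_lr`, so the factor applied at the previous step has to be divided out before the new one is applied. A hypothetical sketch of a chainable polynomial scheduler, written to show the pattern rather than to mirror the eventual implementation:

```python
from torch.optim.lr_scheduler import _LRScheduler


class PolyDecay(_LRScheduler):
    """Hypothetical chainable polynomial decay, for illustration only."""

    def __init__(self, optimizer, total_iters, power=0.9, last_epoch=-1):
        self.total_iters = total_iters
        self.power = power
        super().__init__(optimizer, last_epoch)

    def get_lr(self):
        if self.last_epoch == 0 or self.last_epoch > self.total_iters:
            # Nothing to decay at step 0 or past the end of the schedule.
            return [group["lr"] for group in self.optimizer.param_groups]
        # group["lr"] still includes the previous step's decay factor, so
        # "undo" it (divide it out) before applying the current step's.
        decay = (1 - self.last_epoch / self.total_iters) ** self.power
        prev = (1 - (self.last_epoch - 1) / self.total_iters) ** self.power
        return [group["lr"] * decay / prev
                for group in self.optimizer.param_groups]
```

If `min_lr` support were kept, the values returned by `get_lr()` could simply be clamped with `max(lr, min_lr)`, but as noted above it’s not clear that this is needed.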