
Investigate if lr_scheduler from segmentation can use PyTorch's schedulers

See original GitHub issue

Back when it was initially implemented in 2019, the LR scheduler in the segmentation reference scripts couldn’t be expressed with PyTorch’s native schedulers, so we had to resort to LambdaLR: https://github.com/pytorch/vision/blob/9275cc61fb3c26ce15ced0199ad8b7540d48676c/references/segmentation/train.py#L136-L138

It might be that this is now available in PyTorch natively, and this can be simplified.
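
For reference, the linked lines build the schedule by hand with LambdaLR, and PyTorch core gained a native PolynomialLR around the 1.13 nightlies that covers the same decay. A minimal sketch of both (the placeholder model and hyperparameter values are illustrative, not taken from the reference scripts):

    import torch

    model = torch.nn.Linear(8, 8)   # placeholder model for illustration
    total_iters = 100 * 30          # hypothetical iters_per_epoch * epochs

    # The hand-rolled pattern from the reference scripts: polynomial decay
    # from the base LR down to 0 over the whole run, with power 0.9.
    opt_a = torch.optim.SGD(model.parameters(), lr=0.02)
    lambda_scheduler = torch.optim.lr_scheduler.LambdaLR(
        opt_a, lambda x: (1 - x / total_iters) ** 0.9
    )

    # The native equivalent, available in PyTorch 1.13+:
    opt_b = torch.optim.SGD(model.parameters(), lr=0.02)
    poly_scheduler = torch.optim.lr_scheduler.PolynomialLR(
        opt_b, total_iters=total_iters, power=0.9
    )

Both should trace the same LR curve, so the simplification is essentially a drop-in replacement.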

cc @datumbox

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 10 (10 by maintainers)

Top GitHub Comments

1 reaction
datumbox commented, Aug 11, 2022

That’s correct. In fact, once the scheduler makes it into the nightly, we can make the change. I’m not sure if it made it into today’s nightly or if it will appear in tomorrow’s, but you can start a PR and I’ll review/test/merge soon. Would that work for you?

1 reaction
datumbox commented, Aug 3, 2022

Correct, the min_lr is the minimum permitted value for the LR. I’m not 100% sure we have to support this, TBH. Let’s see what the Core team says and whether there are any weird interactions we should keep in mind.
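
To make the min_lr semantics concrete: such a floor can be emulated on top of the existing LambdaLR formulation, since PolynomialLR itself takes no such argument. A minimal sketch with illustrative values:

    import torch

    model = torch.nn.Linear(8, 8)    # placeholder model
    base_lr, min_lr = 0.02, 1e-5     # illustrative values
    total_iters = 100 * 30

    optimizer = torch.optim.SGD(model.parameters(), lr=base_lr)

    # LambdaLR multiplies base_lr by the returned factor, so flooring the
    # factor at min_lr / base_lr keeps the LR from decaying below min_lr.
    scheduler = torch.optim.lr_scheduler.LambdaLR(
        optimizer,
        lambda x: max(min_lr / base_lr, (1 - x / total_iters) ** 0.9),
    )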

Although it seems correct to me, I have some doubts about this part:

The API of the schedulers is a bit weird. The changes from get_lr() happen in place, so you need to undo the update of the previous epoch and apply the new one.

Yeah, I’m putting the pieces together. I will open a PR soon.

Sounds good, make sure you tag me on the PR.
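
To unpack the quoted remark about get_lr(): chainable PyTorch schedulers compute each step’s LR from the optimizer’s current, already-decayed group["lr"], so the per-step factor has to divide out the previous epoch’s decay before applying the new one. A sketch of that ratio trick (an illustrative reimplementation, not the actual PR; _LRScheduler is PyTorch’s private base class, used here only for illustration):

    import torch
    from torch.optim.lr_scheduler import _LRScheduler

    class PolyDecay(_LRScheduler):  # illustrative name, not a torch class
        def __init__(self, optimizer, total_iters, power=0.9, last_epoch=-1):
            self.total_iters = total_iters
            self.power = power
            super().__init__(optimizer, last_epoch)

        def get_lr(self):
            if self.last_epoch == 0 or self.last_epoch > self.total_iters:
                return [group["lr"] for group in self.optimizer.param_groups]
            # factor(t) / factor(t-1): undo last epoch's decay and apply this
            # epoch's, precisely because group["lr"] is mutated in place.
            decay = ((1 - self.last_epoch / self.total_iters)
                     / (1 - (self.last_epoch - 1) / self.total_iters)) ** self.power
            return [group["lr"] * decay for group in self.optimizer.param_groups]

    model = torch.nn.Linear(8, 8)                       # placeholder model
    opt = torch.optim.SGD(model.parameters(), lr=0.02)
    sched = PolyDecay(opt, total_iters=3000)            # stepped once per iteration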

Read more comments on GitHub

Top Results From Across the Web

  • Using Learning Rate Scheduler and Early Stopping with ...
    In this article, the readers will get to learn how to use a learning rate scheduler and early stopping with PyTorch and deep learning...
  • torch.optim.lr_scheduler — Catalyst 20.03.1 documentation
    "Please open an issue if you are unable to replicate your use case: ..." "... will result in PyTorch skipping the first value of..."
  • ChainedScheduler — PyTorch 1.13 documentation
    It takes a list of chainable learning rate schedulers and performs their consecutive step() functions with just one call (see the sketch after this list). Parameters: schedulers...
  • Tutorial 5: Customize Runtime Settings
    We already support using all the optimizers implemented by PyTorch; the only modification needed is to change the optimizer field of the config...
  • ReduceLROnPlateau — Hasty.ai
    If you have ever worked on a Computer Vision project, you might know that using a learning rate scheduler might significantly increase your...
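
Since ChainedScheduler comes up in the results above: it composes several chainable schedulers so that a single step() advances all of them. A small usage sketch (the scheduler choices and values are illustrative):

    import torch
    from torch.optim.lr_scheduler import ChainedScheduler, ConstantLR, ExponentialLR

    model = torch.nn.Linear(4, 4)   # placeholder model
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

    # A constant damping factor for the first 5 steps plus a per-step
    # exponential decay; ChainedScheduler composes their in-place updates.
    scheduler = ChainedScheduler([
        ConstantLR(optimizer, factor=0.5, total_iters=5),
        ExponentialLR(optimizer, gamma=0.9),
    ])

    for _ in range(10):
        optimizer.step()
        scheduler.step()  # one call steps every scheduler in the chain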
