[Feature Request] Parameter scheduling
Providing an abstraction to adjust optimizer parameters during training seems like it might be useful - techniques like SGDR seem applicable to many types of models.

The `torch.optim.lr_scheduler` module in PyTorch core implements some useful schedulers, but (a) it can only adjust the LR, and (b) it only adjusts it per epoch.

On the other hand, the `Engine` event API seems like a really natural way to adjust parameter values, since handlers that manipulate them could be added for either `ITERATION_*` or `EPOCH_*` events, and modifying multiple parameters at once (e.g. LR and momentum) would be straightforward too.
I wrote a short IPython notebook as a prototype of one way this could look using the event API in a general way (plots are at the very bottom). I left most of the actual scheduler code in separate files for now, to see whether the idea is even worth pursuing. Would this be useful?
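To make the idea concrete, here is a minimal sketch of what an event-based scheduler could look like. The `ParamScheduler` class and `cosine_annealing` function are hypothetical names (not from the notebook); the handler is an ordinary callable, so with ignite it could be attached via `engine.add_event_handler(Events.ITERATION_STARTED, scheduler)`, and because it writes to an arbitrary key of `optimizer.param_groups`, the same class could schedule LR, momentum, or anything else.

```python
import math

def cosine_annealing(start, end, cycle_len, iteration):
    """Cosine-anneal from `start` down to `end` over `cycle_len`
    iterations, restarting each cycle (the SGDR warm-restart pattern)."""
    t = iteration % cycle_len
    return end + 0.5 * (start - end) * (1 + math.cos(math.pi * t / cycle_len))

class ParamScheduler:
    """Hypothetical handler: updates one optimizer parameter per call.

    With ignite it would be attached as an event handler, e.g.
        engine.add_event_handler(Events.ITERATION_STARTED, scheduler)
    """
    def __init__(self, optimizer, param_name, start, end, cycle_len):
        self.optimizer = optimizer
        self.param_name = param_name
        self.start, self.end, self.cycle_len = start, end, cycle_len
        self.iteration = 0

    def __call__(self, engine=None):
        value = cosine_annealing(self.start, self.end,
                                 self.cycle_len, self.iteration)
        # Write the scheduled value into every param group.
        for group in self.optimizer.param_groups:
            group[self.param_name] = value
        self.iteration += 1
```

Scheduling a second parameter (say, momentum) would just mean attaching a second `ParamScheduler` instance to the same event.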
Issue Analytics
- State:
- Created: 5 years ago
- Reactions: 1
- Comments: 32 (17 by maintainers)
Top GitHub Comments
I think `contrib` makes sense, would be nice to have community additions (maintained by the community). But I think we should strive to keep the core ignite lib lean.

No problem! I’ve been super busy as well for the past couple of weeks. I’m not opposed to opening a PR there, it’s up to you whether it would be a better match in that repo or in `ignite`. My perspective is that it’s a lot easier to set up these schedules using the event/callback API in `ignite`, but maybe there is value in moving it to core.