CosineAnnealingWarmRestarts is not compatible with LRScheduler.simulate_values()
🐛 Bug description
`CosineAnnealingWarmRestarts` is not compatible with `LRScheduler.simulate_values()`. More precisely, an object of type `CosineAnnealingWarmRestarts` cannot be replicated by `LRScheduler._replicate_lr_scheduler()`, whereas it works for `CosineAnnealingLR`.
For example:

```python
lr_scheduler = CosineAnnealingWarmRestarts(optimizer=optimizer, T_0=10, eta_min=1e-3)
lr_values = LRScheduler.simulate_values(num_events=50, lr_scheduler=lr_scheduler)
```
produces the following error:

```
Traceback (most recent call last):
  File "tutorials/misc/lr_schedulers.py", line 56, in <module>
    lr_values = LRScheduler.simulate_values(num_events=50, lr_scheduler=lr_scheduler)
  File "/work/desrozis/Softwares/conda/envs/focus-light/lib/python3.7/site-packages/ignite/contrib/handlers/param_scheduler.py", line 606, in simulate_values
    copy_lr_scheduler = LRScheduler._replicate_lr_scheduler(lr_scheduler)
  File "/work/desrozis/Softwares/conda/envs/focus-light/lib/python3.7/site-packages/ignite/contrib/handlers/param_scheduler.py", line 627, in _replicate_lr_scheduler
    copy_lr_scheduler = lr_scheduler_cls(optimizer=dummy_optimizer, **kwargs)
TypeError: __init__() got an unexpected keyword argument 'T_i'
```
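For reference, a self-contained reproduction is sketched below, assuming ignite 0.3.0 and PyTorch 1.4 as in the environment section. The single linear layer is only a placeholder; any optimizer with parameters triggers the error.

```python
import torch
import torch.nn as nn
from torch.optim.lr_scheduler import CosineAnnealingWarmRestarts

from ignite.contrib.handlers.param_scheduler import LRScheduler

# Any parametrized module works; a single linear layer keeps the example small.
model = nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

lr_scheduler = CosineAnnealingWarmRestarts(optimizer=optimizer, T_0=10, eta_min=1e-3)

# Raises: TypeError: __init__() got an unexpected keyword argument 'T_i'
lr_values = LRScheduler.simulate_values(num_events=50, lr_scheduler=lr_scheduler)
```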
This issue is due to the assumption that `lr_scheduler.state_dict()` only contains the arguments of `CosineAnnealingWarmRestarts.__init__()`. A workaround could be to remove `T_i` in the same way as `base_lrs` and `last_epoch`, but it's not very satisfactory…
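As a rough illustration of a less ad hoc direction (a sketch, not ignite's actual fix), the replication could keep only the state-dict entries that are genuine `__init__` parameters, instead of deleting hard-coded keys one by one. The helper `replicate_scheduler` below is hypothetical:

```python
import inspect

import torch
from torch.optim.lr_scheduler import CosineAnnealingWarmRestarts


def replicate_scheduler(lr_scheduler):
    # Hypothetical helper: rebuild a scheduler of the same class on a dummy
    # optimizer, passing only state-dict entries that match __init__ parameters.
    cls = lr_scheduler.__class__
    init_params = set(inspect.signature(cls.__init__).parameters)
    # last_epoch is dropped as well: restoring it on a fresh optimizer would
    # require 'initial_lr' to be present in its param groups.
    init_params -= {"self", "optimizer", "last_epoch"}
    kwargs = {k: v for k, v in lr_scheduler.state_dict().items() if k in init_params}
    dummy_param = torch.nn.Parameter(torch.zeros(1))
    dummy_optimizer = torch.optim.SGD([dummy_param], lr=0.1)
    return cls(optimizer=dummy_optimizer, **kwargs)


# Extra state entries such as T_i and T_cur are filtered out, so replicating
# a CosineAnnealingWarmRestarts scheduler no longer raises a TypeError.
param = torch.nn.Parameter(torch.zeros(1))
optimizer = torch.optim.SGD([param], lr=0.1)
scheduler = CosineAnnealingWarmRestarts(optimizer=optimizer, T_0=10, eta_min=1e-3)
scheduler_copy = replicate_scheduler(scheduler)
```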
Hope it helps to make ignite better and stronger.
Environment
- PyTorch Version (e.g., 1.4): 1.4
- Ignite Version (e.g., 0.3.0): 0.3.0
- OS (e.g., Linux): Linux
- How you installed Ignite (conda, pip, source): conda
- Python version: 3.7.6
- Any other relevant information:
Top GitHub Comments
You saved my day 😊 I'll check if it's ok.
@sdesrozis how about the same approach as used with lr finders etc.: