
CosineAnnealingWarmRestarts is not compatible with LRScheduler.simulate_values()

See original GitHub issue

🐛 Bug description

CosineAnnealingWarmRestarts is not compatible with LRScheduler.simulate_values(). Specifically, an object of type CosineAnnealingWarmRestarts cannot be replicated by LRScheduler._replicate_lr_scheduler(), although the same code works for CosineAnnealingLR.

For example:

from torch.optim.lr_scheduler import CosineAnnealingWarmRestarts
from ignite.contrib.handlers.param_scheduler import LRScheduler

lr_scheduler = CosineAnnealingWarmRestarts(optimizer=optimizer, T_0=10, eta_min=1e-3)
lr_values = LRScheduler.simulate_values(num_events=50, lr_scheduler=lr_scheduler)

produces the following error:

Traceback (most recent call last):
  File "tutorials/misc/lr_schedulers.py", line 56, in <module>
    lr_values = LRScheduler.simulate_values(num_events=50, lr_scheduler=lr_scheduler)
  File "/work/desrozis/Softwares/conda/envs/focus-light/lib/python3.7/site-packages/ignite/contrib/handlers/param_scheduler.py", line 606, in simulate_values
    copy_lr_scheduler = LRScheduler._replicate_lr_scheduler(lr_scheduler)
  File "/work/desrozis/Softwares/conda/envs/focus-light/lib/python3.7/site-packages/ignite/contrib/handlers/param_scheduler.py", line 627, in _replicate_lr_scheduler
    copy_lr_scheduler = lr_scheduler_cls(optimizer=dummy_optimizer, **kwargs)
TypeError: __init__() got an unexpected keyword argument 'T_i'

This issue is due to the assumption that lr_scheduler.state_dict() contains only arguments of CosineAnnealingWarmRestarts.__init__(). A workaround could be to remove T_i in the same way that base_lrs and last_epoch are removed, but it’s not very satisfactory…

Hope this helps make ignite better and stronger.
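The filtering idea behind that workaround can be sketched in isolation. This is a hypothetical illustration, not ignite’s actual code: `replicate_from_state` and `FakeScheduler` are made-up names, and the real `_replicate_lr_scheduler` works on a genuine PyTorch scheduler with a dummy optimizer. The sketch only shows how one could keep the state-dict entries that match the constructor’s signature, so internal variables such as `T_i` are dropped automatically instead of by hand:

```python
import inspect

def replicate_from_state(cls, state):
    # Keep only the entries of `state` that are parameters of cls.__init__,
    # dropping internal state such as T_i that the constructor does not accept.
    params = set(inspect.signature(cls.__init__).parameters) - {"self"}
    kwargs = {k: v for k, v in state.items() if k in params}
    return cls(**kwargs)

# Stand-in for a scheduler: T_0 is a constructor argument, T_i is internal state.
class FakeScheduler:
    def __init__(self, T_0, eta_min=0.0):
        self.T_0 = T_0
        self.eta_min = eta_min
        self.T_i = T_0  # internal variable, not accepted by __init__

state = {"T_0": 10, "eta_min": 1e-3, "T_i": 17, "last_epoch": 5}
copy = replicate_from_state(FakeScheduler, state)
print(copy.T_0, copy.eta_min)  # 10 0.001
```

With this approach, the filtering would not need a hard-coded list of keys to strip, which addresses the "not very satisfactory" part of the workaround.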

Environment

  • PyTorch Version (e.g., 1.4): 1.4
  • Ignite Version (e.g., 0.3.0): 0.3.0
  • OS (e.g., Linux): Linux
  • How you installed Ignite (conda, pip, source): conda
  • Python version: 3.7.6
  • Any other relevant information:

Issue Analytics

  • State: closed
  • Created: 4 years ago
  • Reactions: 1
  • Comments: 7 (5 by maintainers)

Top GitHub Comments

sdesrozis commented, May 26, 2020 (1 reaction)

You saved my day 😊 I’ll check whether it’s ok.

vfdev-5 commented, May 26, 2020 (1 reaction)

@sdesrozis how about the same approach as used with the LR finder etc.:

  • save the initial state dicts
  • do something
  • restore the initial state?
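The save/simulate/restore approach above can be sketched as follows. Everything here is an illustrative stand-in, assuming only the `state_dict()` / `load_state_dict()` interface that PyTorch schedulers expose; `FakeScheduler` and this `simulate_values` are not ignite’s implementation:

```python
import copy

class FakeScheduler:
    """Minimal stand-in exposing the state_dict interface of a PyTorch scheduler."""
    def __init__(self, T_0):
        self.T_0 = T_0
        self.T_i = T_0
        self.last_epoch = -1

    def state_dict(self):
        return dict(self.__dict__)

    def load_state_dict(self, state):
        self.__dict__.update(state)

    def step(self):
        self.last_epoch += 1

def simulate_values(scheduler, num_events):
    # 1. Save the scheduler's state before touching it.
    saved = copy.deepcopy(scheduler.state_dict())
    values = []
    # 2. Run the simulation on the live object (no replication needed).
    for _ in range(num_events):
        values.append(scheduler.last_epoch)
        scheduler.step()
    # 3. Restore the initial state so the caller's scheduler is unchanged.
    scheduler.load_state_dict(saved)
    return values

sched = FakeScheduler(T_0=10)
vals = simulate_values(sched, 3)
print(vals, sched.last_epoch)  # [-1, 0, 1] -1
```

This sidesteps replication entirely: because the object is never reconstructed from its state dict, internal variables like `T_i` that are not constructor arguments no longer cause a failure.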

