Learning rate initialization of create_lr_scheduler_with_warmup
🐛 Bug description
It seems that the learning rate initialization performed by create_lr_scheduler_with_warmup is not working as expected.
How to reproduce it
import torchvision.models as models
import torch.optim as optim
from ignite.handlers.param_scheduler import create_lr_scheduler_with_warmup

model = models.resnet18()
total_iteration = 100
warmup_iteration = 10
initial_lr = 1e-3
warmup_initial_lr = 1e-5

optimizer = optim.Adam(model.parameters(), lr=initial_lr)
lr_scheduler = create_lr_scheduler_with_warmup(
    optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=total_iteration),
    warmup_start_value=warmup_initial_lr,
    warmup_duration=warmup_iteration,
    warmup_end_value=initial_lr,
)

for _ in range(total_iteration):
    print(optimizer.param_groups[0]['lr'])
    lr_scheduler(None)
Results
0.001
1e-05
0.00012
0.00023
...
2.9559615522887284e-05
I thought that the learning rate of the optimizer should have been 1e-5 on the first iteration, but I got 1e-3.
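Judging from the output, the warmup itself appears to work (1e-5, 0.00012, 0.00023, ... after the first call); the very first printed value is simply the lr passed to the Adam constructor, since the returned scheduler only writes its value into the optimizer's param_groups when it is invoked. A minimal sketch, assuming that handler behavior, which calls the scheduler before reading the learning rate so the first observed value is the warmup start value:

import torchvision.models as models
import torch.optim as optim
from ignite.handlers.param_scheduler import create_lr_scheduler_with_warmup

model = models.resnet18()
optimizer = optim.Adam(model.parameters(), lr=1e-3)
lr_scheduler = create_lr_scheduler_with_warmup(
    optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=100),
    warmup_start_value=1e-5,
    warmup_duration=10,
    warmup_end_value=1e-3,
)

for _ in range(100):
    lr_scheduler(None)                      # apply the warmup/cosine value first
    print(optimizer.param_groups[0]["lr"])  # first printed value is now 1e-5

In an Engine-based training loop the same ordering is typically obtained by attaching the returned scheduler to Events.ITERATION_STARTED, so the warmup value is in place before the first optimizer step.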
Environment
- PyTorch Version (e.g., 1.4):
- Ignite Version (e.g., 0.3.0):
- OS (e.g., Linux):
- How you installed Ignite (conda, pip, source): pip
- Python version:
- Any other relevant information: Run on Colab
@vfdev-5 I joined! Thanks!
@vfdev-5 Wow! I have to study pytorch-ignite harder! Thanks for sharing your work!