
FastaiLRFinder does not run more than 1 epoch. Why?

See original GitHub issue

I have been trying to use the FastaiLRFinder to find the best learning rate for my module.

If we create the trainer with the function create_supervised_trainer as below:

trainer = create_supervised_trainer(
    model, optimizer, criterion, device, output_transform=custom_output_transform
)

and run it:

with lr_finder.attach(
    trainer,
    to_save=to_save,
    num_iter=50,
    end_lr=1.0,
    step_mode="exp",
) as lr_finder_training:
    lr_finder_training.run(train_loader)

A warning comes up saying: “UserWarning: Desired num_iter 50 is unreachable with the current run setup of 15 iteration (1 epochs)”

My dataloader has 15 batches to iterate, which means FastaiLRFinder does not allow you to run more than 1 epoch. Why?
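The arithmetic behind the warning is simple: the finder wants 50 optimizer steps, but one pass over a 15-batch dataloader only yields 15. A minimal sketch in plain Python (my own illustration, not Ignite's source):

```python
import math

def epochs_needed(num_iter: int, steps_per_epoch: int) -> int:
    """Epochs required for an LR finder to take num_iter steps when each
    pass over the dataloader yields steps_per_epoch batches."""
    return math.ceil(num_iter / steps_per_epoch)

# 50 desired iterations over a 15-batch loader would need 4 epochs,
# but a 1-epoch run caps out at 15 iterations, hence the warning.
print(epochs_needed(50, 15))  # -> 4
```

With that in mind, one possible workaround (assuming the installed Ignite version supports the `max_epochs` argument of `Engine.run`) is to request enough epochs explicitly, e.g. `lr_finder_training.run(train_loader, max_epochs=4)`.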

According to their source code, we can see here that this check prevents the user from requesting more iterations than fit within a single epoch of their dataloader.

But why? Am I missing something important here?

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Comments: 5 (2 by maintainers)

Top GitHub Comments

1 reaction
KickItLikeShika commented, Sep 9, 2021

@vfdev-5 Yes that’s a good one, I can work on it, thanks!

1 reaction
vfdev-5 commented, Sep 9, 2021

@KickItLikeShika you have worked on LR finder recently. Can I assign this issue to you to improve it ?

Read more comments on GitHub >

Top Results From Across the Web

FastaiLRFinder does not allow running more than 1 epoch
My dataloader has 15 batches to iterate, which means FastaiLRFinder does not allow you to run more than 1 epoch. Why?
Read more >
Why do my earlier epochs take longer than subsequent epochs?
The simplest and most intuitive reason I could think of for early epochs taking longer than later ones is that for your...
Read more >
Running one epoch at a time or all at once? - Fast.ai forums
Hello everyone, Imagine if I want to run 10 epochs in my model. I could run all of them using the same function,...
Read more >
How to do time profiling | PyTorch-Ignite
Learn how to get the time breakdown for individual epochs during training, ... We can print the results of the profiler in the...
Read more >
ignite.contrib.handlers.lr_finder — ignite master documentation
[docs]class FastaiLRFinder: """Learning rate finder handler for ... on how well the network can be trained over a range of learning rates and...
Read more >
