How to find the current epoch number inside the training loop?
See original GitHub issue

❓ Questions and Help

How do I find the current epoch number inside the training loop?
I am training a convolutional autoencoder on MNIST images and want to save the reconstructed images after every 10 epochs.
Code
class AutoEncoder(pl.LightningModule):
    def __init__(self):
        pass

    def forward(self, x):
        pass

    def training_step(self, batch, batch_idx):
        images = batch
        out = self.forward(images)
        loss = self.loss(out, images)
        if epoch % 10 == 0:  # how do I get `epoch` here?
            save(images)
I am looking for something like this, where the output images are saved every 10 epochs.
What have you tried?
I haven’t tried anything yet, since I was unable to find any documentation on the current epoch number.
What’s your environment?
I am using PyTorch 1.4 and Lightning version 0.7.
Issue Analytics
- Created: 3 years ago
- Reactions: 9
- Comments: 6 (1 by maintainers)

self.current_epoch would do (self is a LightningModule), though it starts at 0. You can find the definition of the module here: https://github.com/PyTorchLightning/pytorch-lightning/blob/master/pytorch_lightning/core/lightning.py
I’d also recommend using an IDE such as PyCharm, so you can jump to the definition of the class quickly.
If the iteration/step is required instead, there is also self.global_step.
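To make the answer concrete, here is a minimal sketch of the pattern: self.current_epoch is maintained by the Lightning Trainer and starts at 0, so a "save every 10 epochs" condition is a simple modulo check. The helper below isolates that check so it can be tested standalone; the `save_images` call and the surrounding module shown in the comment are hypothetical names, not part of the Lightning API.

```python
def is_save_epoch(current_epoch: int, every_n: int = 10) -> bool:
    """True on epochs 0, every_n, 2*every_n, ... (current_epoch starts at 0)."""
    return current_epoch % every_n == 0

# Inside a pl.LightningModule it would look roughly like:
#
#     def training_step(self, batch, batch_idx):
#         images = batch
#         out = self(images)
#         loss = self.loss(out, images)
#         if is_save_epoch(self.current_epoch):  # self.current_epoch from Lightning
#             save_images(out)                   # hypothetical save helper
#         return loss

print([e for e in range(25) if is_save_epoch(e)])  # → [0, 10, 20]
```

Note that because current_epoch starts at 0, the very first epoch also triggers a save; start the check at `current_epoch > 0` if that is unwanted.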