
"Step must only increase in log calls" when adding W&B logger after some training

See original GitHub issue

Describe the bug

First-time W&B user here. I added the WandbLogger to my PyTorch Lightning project with the default settings, i.e. Trainer(logger=WandbLogger()). Note that I’m resuming training from a checkpoint (I didn’t use the W&B logger before).

I’m getting many warnings like wandb: WARNING Step must only increase in log calls. Step 47 < 270952; dropping {'loss': -2.309272527694702}, and no metrics are reported. Other data, such as GPU utilisation, is collected correctly.
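
For context, a minimal sketch of the setup that seems to trigger this. MyLightningModule and the checkpoint path are hypothetical, and the resume API shown is the one from PyTorch Lightning releases of that era (newer versions pass ckpt_path to trainer.fit instead):

import pytorch_lightning as pl
from pytorch_lightning.loggers import WandbLogger

model = MyLightningModule()  # hypothetical module, for illustration only

trainer = pl.Trainer(
    logger=WandbLogger(),                # fresh W&B run: internal step starts at 0
    resume_from_checkpoint="last.ckpt",  # hypothetical path; global_step resumes high
)
trainer.fit(model)

# wandb only accepts non-decreasing values for the `step` passed to log();
# anything logged with a smaller step is dropped with the warning above.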

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Reactions: 5
  • Comments: 13 (6 by maintainers)

Top GitHub Comments

5 reactions
jonashaag commented, Jan 13, 2021

> encountered the same issue. @jonashaag did you figure out how to deal with this?

I’m dealing with it by not using W&B anymore ^^

1 reaction
vanpelt commented, Jul 13, 2021

Hey @jonashaag, instead of passing the “step” argument, you can log separate x-axes as separate metrics, e.g.:

# Inside a LightningModule: use the epoch as wandb's step, and log
# "val_step" as an ordinary metric to serve as a custom x-axis.
self.logger.experiment.log(
    {"val_epoch": wandb.Image(test_img, caption="caption"),
     "val_step": self.val_step},
    step=self.current_epoch)

Then, in the UI, you can choose a custom x-axis if you need to plot metrics against “val_step”, for instance. Also, you shouldn’t add the “epoch” number to the metric key, as that creates an unbounded number of charts. You can drop step=self.current_epoch above; it’s only an optimization that ensures all metrics for a given epoch are logged together.
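
As a side note, newer wandb versions also support declaring a custom x-axis up front with define_metric, which avoids juggling the step argument entirely. A minimal sketch, with made-up metric and project names:

import wandb

run = wandb.init(project="my-project")  # placeholder project name

# Declare "val_step" as the x-axis for everything under "val/".
wandb.define_metric("val_step")
wandb.define_metric("val/*", step_metric="val_step")

# val/* metrics are now plotted against val_step instead of wandb's
# internal, strictly increasing step counter.
wandb.log({"val/loss": 0.42, "val_step": 47})
run.finish()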

Read more comments on GitHub >

Top Results From Across the Web

"Step must only increase in log calls" when adding W&B ...
First time W&B user here. I added the WandbLogger to my PyTorch-Lightning project using the default settings, ie. Trainer(logger=WandbLogger()) ...
Read more >
Log Data with wandb.log - Documentation - Weights & Biases
Call wandb.log(dict) to log a dictionary of metrics, media, or custom objects to a step. Each time you log, we increment the step... (see the sketch after these results)
Read more >
Seq2Seq finetuning wandb issue - Hugging Face Forums
wandb: WARNING Step must only increase in log calls, and because of this issue it is not reporting all the stats in my...
Read more >
imitation.util.logger - imitation
A logger supporting contexts for accumulating mean values. self.accumulate_means creates a context manager. While in this context, values are logged to a sub-...
Read more >
Writing, Viewing, and Responding to Logs | Cloud Functions ...
Logs written to stdout or stderr will appear automatically in the Google Cloud console. For more advanced logging, use the Cloud Logging client...
Read more >
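
To make the step semantics from the wandb.log documentation above concrete, here is a minimal sketch that reproduces the warning with wandb alone (project name is a placeholder):

import wandb

run = wandb.init(project="step-demo")  # placeholder project name

wandb.log({"loss": 1.0}, step=100)  # ok: step increases
wandb.log({"loss": 0.9}, step=200)  # ok
wandb.log({"loss": 0.8}, step=50)   # dropped: 50 < 200 triggers the
                                    # "Step must only increase in log calls" warning
run.finish()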
