
Enable Hyperparameter logging from any hook in the LightningModule

See original GitHub issue

🚀 Feature

Make it possible to call save_hyperparameters from any hook in the LightningModule.

Motivation

Sometimes the dataset has hyperparameters that should be logged. However, the LightningDataModule is only accessible from the LightningModule once the trainer is initiated. Thus, it would be useful to call save_hyperparameters from on_fit_start, when the Trainer is specified and the hyperparameters from the dataset can easily be collected, e.g. through self.trainer.datamodule.build_hparams().
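A minimal sketch of the requested usage, assuming current PyTorch Lightning APIs; build_hparams() is the issue author's own datamodule helper, not a Lightning method:

import pytorch_lightning as pl
import torch


class MyModel(pl.LightningModule):
    def __init__(self, lr: float = 1e-3):
        super().__init__()
        self.lr = lr
        self.layer = torch.nn.Linear(4, 1)

    def on_fit_start(self):
        # Desired behaviour: once the Trainer (and thus the datamodule) is
        # attached, record the dataset hyperparameters alongside the model's.
        # Today this raises, because save_hyperparameters() expects to find
        # the __init__ arguments in the caller's local variables.
        self.save_hyperparameters(self.trainer.datamodule.build_hparams())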

Pitch

save_hyperparameters shouldn’t look for init args in the local variables when called outside the __init__ method.

Currently, this behaviour causes an exception at line 154 of utilities/parsing.py:

local_args = {k: local_vars[k] for k in init_parameters.keys()}

because the function looks for the init parameters in the caller’s local variables, which are only available when it is called from __init__.
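A hypothetical, simplified illustration of the mechanism (not the actual Lightning source): the init args are gathered by inspecting the caller's frame locals, which only contain them while __init__ is executing.

import inspect


def collect_local_args(init_parameters):
    # Look at the caller's frame, roughly the way utilities/parsing.py does.
    local_vars = inspect.currentframe().f_back.f_locals
    # Raises KeyError when the caller is a hook such as on_fit_start,
    # because the __init__ arguments are not among that frame's locals.
    return {k: local_vars[k] for k in init_parameters}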

Suggestion: Remove init parameter logging when called from other places.

Alternatives

Save init parameters and add them later.

Additional context

N/A

cc @borda @carmocca @justusschock @awaelchli @ananthsub @ninginthecloud @jjenniferdai @rohitgr7

Issue Analytics

  • State: open
  • Created: a year ago
  • Comments: 7 (4 by maintainers)

Top GitHub Comments

4 reactions
wisecornelius commented, Apr 25, 2022

I think it is quite a quick fix. I’ll give it a go sometime this week.

4 reactions
awaelchli commented, Apr 11, 2022

Hello there!

Sometimes the dataset has hyperparameters that should be logged. However, the LightningDataModule is only accessible from the LightningModule once the trainer is initiated. Thus, it would be useful to call save_hyperparameters from on_fit_start

For this particular use case, I recommend calling self.save_hyperparameters() in the DataModule directly rather than via the LightningModule hooks. This is supported in the latest PL versions. The recorded parameters (merged with the ones from the LightningModule) get logged to the logger.
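A minimal sketch of that suggestion, assuming a PL version where LightningDataModule supports save_hyperparameters() (the dataset arguments here are illustrative):

import pytorch_lightning as pl
import torch
from torch.utils.data import DataLoader, TensorDataset


class MyDataModule(pl.LightningDataModule):
    def __init__(self, batch_size: int = 32, num_workers: int = 0):
        super().__init__()
        # Called from __init__, where the arguments are still local variables;
        # they get merged with the LightningModule's hparams and logged.
        self.save_hyperparameters()

    def train_dataloader(self):
        data = TensorDataset(torch.randn(64, 4), torch.randn(64, 1))
        return DataLoader(data, batch_size=self.hparams.batch_size,
                          num_workers=self.hparams.num_workers)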

Apart from that, I think this is an honest feature request. However, let me remind everyone that the main objective of save_hyperparameters is NOT JUST to send the parameters to the logger. That is a nice-to-have, but the MAIN motivation of this method is to have the parameters saved to the checkpoint so that they can be used correctly when loading the model back from the checkpoint (via LightningModule.load_from_checkpoint). From that perspective, it would be very error-prone to let this method be called from every hook. It is imperative that save_hyperparameters captures exactly the arguments passed to __init__, no more, no less, and not any modified ones. For this reason, I recommend not going forward with this feature. Instead, we could figure out better error handling. I’m curious what others think about this.

Finally, when you only care about logging some parameters, this is also possible by accessing self.logger in any hook (or via self.log_dict).
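A minimal sketch of that approach, assuming a logger is attached (the extra values here are illustrative and could instead be collected from self.trainer.datamodule):

import pytorch_lightning as pl
import torch


class MyModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(4, 1)

    def on_fit_start(self):
        # Values gathered at fit time go straight to the logger rather than
        # into the checkpoint, so save_hyperparameters() is not involved.
        extra = {"dataset_size": 1000}  # illustrative value
        if self.logger is not None:
            self.logger.log_hyperparams(extra)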

Read more comments on GitHub >

