Enable hyperparameter logging from any hook in the LightningModule
🚀 Feature
Make it possible to call `save_hyperparameters` from any hook in the `LightningModule`.
Motivation
Sometimes the dataset has hyperparameters that should be logged. However, the `LightningDataModule` is only accessible from the `LightningModule` once the `Trainer` is initialized. Thus, it would be useful to call `save_hyperparameters` from `on_fit_start`, when the `Trainer` is set and the hyperparameters from the dataset can easily be collected, e.g. through `self.trainer.datamodule.build_hparams()`.
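Concretely, the request would enable something like the following sketch, where `build_hparams()` is the user-defined helper mentioned above (not a Lightning API):

```python
import pytorch_lightning as pl

class MyModel(pl.LightningModule):
    def __init__(self, lr: float = 1e-3):
        super().__init__()
        self.save_hyperparameters()  # works today: called from __init__

    def on_fit_start(self):
        # By this point the Trainer (and thus the datamodule) is attached,
        # so dataset hyperparameters are reachable. `build_hparams()` is
        # the user-defined helper from the paragraph above, not a PL API.
        self.save_hyperparameters(self.trainer.datamodule.build_hparams())
```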
Pitch
`save_hyperparameters` shouldn't look for `__init__` args in the local variables when called outside the `__init__` method.

Currently, this behaviour causes an exception at line 154 in `utilities/parsing.py`:

`local_args = {k: local_vars[k] for k in init_parameters.keys()}`

because the function looks for the init parameters in the local variables, which are only available when it is called from `__init__`.
Suggestion: Remove init parameter logging when called from other places.
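For reference, a minimal reproduction of the failure mode described above (a sketch; `lr` is just an example init argument):

```python
import pytorch_lightning as pl

class Repro(pl.LightningModule):
    def __init__(self, lr: float = 1e-3):
        super().__init__()

    def on_fit_start(self):
        # Fails in the PL version described here: the frame inspection in
        # utilities/parsing.py expects the __init__ parameters (here: `lr`)
        # among the caller's local variables, but this hook's frame does
        # not contain them, so the quoted dict comprehension raises.
        self.save_hyperparameters()
```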
Alternatives
Save init parameters and add them later.
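A rough sketch of this alternative as a user-side workaround, reusing the hypothetical `build_hparams()` helper from above. Note that mutating `self.hparams` after init touches exactly the checkpoint-consistency concern raised in the comments below:

```python
import pytorch_lightning as pl

class WorkaroundModel(pl.LightningModule):
    def __init__(self, lr: float = 1e-3):
        super().__init__()
        # Capture the __init__ arguments here, where frame inspection works.
        self.save_hyperparameters()

    def on_fit_start(self):
        # Merge in the dataset hyperparameters later and forward everything
        # to the logger by hand. `build_hparams()` is the hypothetical
        # helper from the motivation above.
        self.hparams.update(self.trainer.datamodule.build_hparams())
        self.logger.log_hyperparams(self.hparams)
```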
Additional context
N/A
cc @borda @carmocca @justusschock @awaelchli @ananthsub @ninginthecloud @jjenniferdai @rohitgr7
I think it is quite a quick fix. I'll give it a go sometime this week.
Hello there!

For this particular use case, I recommend calling `self.save_hyperparameters()` in the DataModule directly rather than via the LightningModule hooks. This is supported in the latest PL versions. The recorded parameters (merged with the ones from the LightningModule) get logged to the logger.
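A minimal sketch of that recommendation; the datamodule arguments here are illustrative:

```python
import pytorch_lightning as pl

class MyDataModule(pl.LightningDataModule):
    def __init__(self, batch_size: int = 32, num_workers: int = 4):
        super().__init__()
        # Records batch_size and num_workers; Lightning merges these with
        # the LightningModule's hyperparameters and sends them to the logger.
        self.save_hyperparameters()
```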
Apart from that, I think this is an honest feature request. However, let me remind everyone that the main objective of `save_hyperparameters` is NOT JUST to send them to the logger. That is a nice-to-have feature, but the MAIN motivation of this method is to have the parameters saved to the checkpoint so that they can be used in the right way when there is a desire to load the model back from the checkpoint (via `LightningModule.load_from_checkpoint`). From that perspective, it would be very error-prone to let this method be called from every hook. It is imperative that the `save_hyperparameters` method captures exactly the arguments passed to the init, not more, not less, and not any modified ones. For this reason, I recommend not going forward with this feature. Instead, we could figure out better error handling. I'm curious what others think about this.
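To illustrate why exact capture of the init arguments matters, a sketch reusing the `MyModel` example above (the checkpoint path is illustrative):

```python
# Because save_hyperparameters() recorded the exact __init__ arguments,
# Lightning can re-instantiate the model with the same configuration.
model = MyModel.load_from_checkpoint("path/to/example.ckpt")
print(model.hparams.lr)  # restored from the checkpoint
```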
Finally, when you only care about logging some parameters, this is also possible by accessing `self.logger` in any hook (or via `self.log_dict`).
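For completeness, a sketch of that logging-only route (the keys and the placeholder loss are illustrative):

```python
import torch
import pytorch_lightning as pl

class LoggingOnly(pl.LightningModule):
    def on_fit_start(self):
        # Send arbitrary key/value pairs straight to the logger backend;
        # nothing is written to the checkpoint this way.
        self.logger.log_hyperparams(
            {"dataset": type(self.trainer.datamodule).__name__}
        )

    def training_step(self, batch, batch_idx):
        loss = torch.zeros((), requires_grad=True)  # placeholder loss
        # self.log_dict covers metric-style logging from training hooks.
        self.log_dict({"train_loss": loss})
        return loss
```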