Incorrect type hints for LightningModule.save_hyperparameters and hparams
Bug description
Both LightningModule.save_hyperparameters and LightningModule.hparams appear to have the type Union[torch._tensor.Tensor, torch.nn.modules.module.Module]. I would expect the former to be a Callable and the latter to be an AttributeDict.
How to reproduce the bug
import pytorch_lightning as pl


class LitMNIST(pl.LightningModule):
    def __init__(self, layer_1_dim: int = 128, learning_rate: float = 1e-2):
        super().__init__()
        reveal_type(self.save_hyperparameters)
        self.save_hyperparameters()
        reveal_type(self.hparams)
        self.hparams.layer_1_dim
        self.hparams["learning_rate"]
Error messages and logs
The above code should not produce any mypy errors, but it produces several. reveal_type was added to see what type mypy infers for each expression.
$ mypy --strict test.py
test.py:7: note: Revealed type is "Union[torch._tensor.Tensor, torch.nn.modules.module.Module]"
test.py:8: error: "Tensor" not callable [operator]
test.py:9: note: Revealed type is "Union[torch._tensor.Tensor, torch.nn.modules.module.Module]"
test.py:10: error: Item "Tensor" of "Union[Tensor, Module]" has no attribute "layer_1_dim" [union-attr]
test.py:11: error: Value of type "Union[Tensor, Module]" is not indexable [index]
test.py:11: error: Invalid index type "str" for "Union[Tensor, Module]"; expected type "Union[None, int, slice, Tensor, List[Any], Tuple[Any, ...]]" [index]
Found 4 errors in 1 file (checked 1 source file)
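The revealed Union[Tensor, Module] comes from torch.nn.Module.__getattr__, whose return annotation is that union: any attribute mypy cannot find declared on the class falls through to that fallback. A minimal, torch-free sketch of the same phenomenon (Tensor, Module, and LitModel here are simplified stand-ins, not the real torch definitions):

```python
from typing import Union


class Tensor:
    """Stand-in for torch.Tensor."""


class Module:
    """Stand-in for torch.nn.Module: __getattr__ is annotated to return
    Union[Tensor, Module], so mypy assigns that union to any attribute
    that has no explicit declaration on the class."""

    def __getattr__(self, name: str) -> Union[Tensor, "Module"]:
        raise AttributeError(name)


class LitModel(Module):
    pass


m = LitModel()
# mypy resolves m.hparams through Module.__getattr__, so it is typed
# Union[Tensor, Module]; at runtime the access simply raises.
try:
    m.hparams
except AttributeError:
    print("resolved via __getattr__")
```

This is why even a bound method like save_hyperparameters gets the union type: the stub mypy consults does not declare it on the class it sees, so the __getattr__ fallback wins.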
Environment
- Lightning Component: LightningModule
- PyTorch Lightning Version: 1.8.2
- Lightning App Version: N/A
- PyTorch Version: 1.12.1
- Python version: 3.10.8
- OS: macOS and Linux
- CUDA/cuDNN version: N/A
- GPU models and configuration: N/A
- How you installed Lightning (`conda`, `pip`, source): Spack
- Running environment of LightningApp: local
More info
This issue has been present ever since PyTorch first added public type hints (PyTorch 1.11, March 2022). Since self.hparams is accessed repeatedly throughout every LightningModule, this currently requires type ignores in hundreds of places in TorchGeo.
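Until the upstream annotations are fixed, one way to avoid scattering type ignores is to give the attribute an explicit declaration, which takes precedence over the __getattr__ fallback for mypy. This is an illustrative sketch only: AttributeDict and Model below are simplified stand-ins (the real pytorch_lightning.utilities.parsing.AttributeDict and LightningModule differ, and in Lightning hparams is a property, so this exact annotation may not apply verbatim there):

```python
from typing import Any, Dict


class AttributeDict(Dict[str, Any]):
    """Simplified stand-in for pytorch_lightning's AttributeDict:
    mapping keys are also readable as attributes."""

    def __getattr__(self, key: str) -> Any:
        try:
            return self[key]
        except KeyError as exc:
            raise AttributeError(key) from exc


class Model:
    # The explicit class-level annotation tells mypy the attribute's
    # type directly, so it never falls back to a __getattr__ union.
    hparams: AttributeDict

    def __init__(self) -> None:
        self.hparams = AttributeDict(layer_1_dim=128, learning_rate=1e-2)


model = Model()
print(model.hparams.layer_1_dim)       # attribute-style access -> 128
print(model.hparams["learning_rate"])  # index-style access -> 0.01
```

With a declaration like this in place, both the attribute-style and index-style accesses from the reproduction type-check cleanly.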
Issue Analytics
- State:
- Created 10 months ago
- Comments:5 (5 by maintainers)
Top GitHub Comments
Here’s the relevant issue upstream: https://github.com/pytorch/pytorch/issues/81462
Thanks, I’ll close this issue and follow that discussion.