Checkpoint contains hyperparameters but IntrospectorModule's __init__ is missing the argument 'hparams'.
```
/home/anaconda3/envs/cogltx/lib/python3.7/site-packages/pytorch_lightning/utilities/warnings.py:18: UserWarning: The dataloader, train dataloader, does not have many workers which may be a bottleneck. Consider increasing the value of the `num_workers` argument in the `DataLoader` init to improve performance.
  warnings.warn(*args, **kwargs)
Traceback (most recent call last):
  File "run_20news.py", line 45, in <module>
    main_loop(config)
  File "/data/CogLTX-main/main_loop.py", line 57, in main_loop
    trainer.fit(introspector)
  File "/home/anaconda3/envs/cogltx/lib/python3.7/site-packages/pytorch_lightning/trainer/trainer.py", line 695, in fit
    self.load_spawn_weights(model)
  File "/home/anaconda3/envs/cogltx/lib/python3.7/site-packages/pytorch_lightning/trainer/distrib_data_parallel.py", line 373, in load_spawn_weights
    loaded_model = original_model.__class__.load_from_checkpoint(path)
  File "/home/anaconda3/envs/cogltx/lib/python3.7/site-packages/pytorch_lightning/core/lightning.py", line 1509, in load_from_checkpoint
    model = cls._load_model_state(checkpoint, *args, **kwargs)
  File "/home/anaconda3/envs/cogltx/lib/python3.7/site-packages/pytorch_lightning/core/lightning.py", line 1533, in _load_model_state
    f"Checkpoint contains hyperparameters but {cls.__name__}'s __init__ "
pytorch_lightning.utilities.exceptions.MisconfigurationException: Checkpoint contains hyperparameters but IntrospectorModule's __init__ is missing the argument 'hparams'. Are you loading the correct checkpoint?
```
Issue Analytics
- Created 2 years ago
- Comments: 8 (1 by maintainers)
Modify the parameters of the `__init__` method of `class IntrospectorModule(pl.LightningModule)`: change `def __init__(self, config):` to `def __init__(self, hparams):` and add `config = hparams` at the top of the method. Apply the same change to `class ReasonerModule`.
The same error occurs with pytorch-lightning 0.6.0 and 0.7.3, so the version given in the README looks wrong. @Sleepychord @dm-thu, could you please help?