Stuck on an issue?

Lightrun Answers was designed to reduce the constant googling that comes with debugging third-party libraries. It collects links to all the places you might be looking while hunting down a tough bug.

And, if you’re still stuck at the end, we’re happy to hop on a call to see how we can help out.

Checkpoint contains hyperparameters but IntrospectorModule's __init__ is missing the argument 'hparams'.

See original GitHub issue

/home/anaconda3/envs/cogltx/lib/python3.7/site-packages/pytorch_lightning/utilities/warnings.py:18: UserWarning: The dataloader, train dataloader, does not have many workers which may be a bottleneck. Consider increasing the value of the `num_workers` argument in the `DataLoader` init to improve performance.
  warnings.warn(*args, **kwargs)
Traceback (most recent call last):
  File "run_20news.py", line 45, in <module>
    main_loop(config)
  File "/data/CogLTX-main/main_loop.py", line 57, in main_loop
    trainer.fit(introspector)
  File "/home/anaconda3/envs/cogltx/lib/python3.7/site-packages/pytorch_lightning/trainer/trainer.py", line 695, in fit
    self.load_spawn_weights(model)
  File "/home/anaconda3/envs/cogltx/lib/python3.7/site-packages/pytorch_lightning/trainer/distrib_data_parallel.py", line 373, in load_spawn_weights
    loaded_model = original_model.__class__.load_from_checkpoint(path)
  File "/home/anaconda3/envs/cogltx/lib/python3.7/site-packages/pytorch_lightning/core/lightning.py", line 1509, in load_from_checkpoint
    model = cls._load_model_state(checkpoint, *args, **kwargs)
  File "/home/anaconda3/envs/cogltx/lib/python3.7/site-packages/pytorch_lightning/core/lightning.py", line 1533, in _load_model_state
    f"Checkpoint contains hyperparameters but {cls.__name__}'s __init__ "
pytorch_lightning.utilities.exceptions.MisconfigurationException: Checkpoint contains hyperparameters but IntrospectorModule's __init__ is missing the argument 'hparams'. Are you loading the correct checkpoint?
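
The exception is raised in pytorch_lightning/core/lightning.py: in the 0.x releases, load_from_checkpoint() re-instantiates the module class and passes the checkpoint's stored hyperparameters back into __init__, so the constructor must accept an hparams argument. A minimal sketch of a module that satisfies this contract (the hparams fields and the linear head are placeholders, not the actual CogLTX implementation):

```python
from argparse import Namespace

import pytorch_lightning as pl
from torch import nn


class IntrospectorModule(pl.LightningModule):
    # In pytorch-lightning 0.x, load_from_checkpoint() passes the checkpoint's
    # stored hyperparameters back into __init__, hence the `hparams` argument.
    def __init__(self, hparams):
        super().__init__()
        if isinstance(hparams, dict):   # some 0.x versions restore a plain dict
            hparams = Namespace(**hparams)
        self.hparams = hparams          # PL 0.x reads this attribute when saving checkpoints
        # Placeholder head; the real CogLTX module is more involved.
        self.classifier = nn.Linear(hparams.hidden_size, hparams.num_labels)

    def forward(self, x):
        return self.classifier(x)


# Illustrative usage:
# module = IntrospectorModule(Namespace(hidden_size=768, num_labels=2))
```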

Issue Analytics

  • State: open
  • Created 2 years ago
  • Comments: 8 (1 by maintainers)

Top GitHub Comments

1 reaction
yudong27 commented, Dec 7, 2021

Modify the parameters of the __init__ method in class IntrospectorModule(pl.LightningModule) as follows: change def __init__(self, config): to def __init__(self, hparams): and add config = hparams inside the method.

Do the same for class ReasonerModule.
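
In code, the change described above looks roughly like this (only the constructor is shown; the rest of the CogLTX initialization is elided, and the same rename applies to ReasonerModule):

```python
import pytorch_lightning as pl


class IntrospectorModule(pl.LightningModule):
    # Renamed argument per the suggestion above: was `def __init__(self, config):`
    def __init__(self, hparams):
        super().__init__()
        config = hparams  # alias so the original body, which refers to `config`, keeps working
        # ... original CogLTX initialization code that uses `config` ...
```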

0 reactions
ParanoidW commented, Nov 16, 2021

Same problem with pytorch-lightning 0.6.0 and 0.7.3; it looks like the version listed in the README is wrong. @Sleepychord @dm-thu, please help.

Read more comments on GitHub >

Top Results From Across the Web

Hparams not restored when using load_from_checkpoint ...
Problem I'm having an issue where the model is training fine, and the saved checkpoint does indeed have the hparams used in training....
Read more >
Unable to load model from checkpoint in Pytorch-Lightning
Cause. This happens because your model is unable to load hyperparameters(n_channels, n_classes=5) from the checkpoint as you do not save ...
Read more >
pytorch_lightning.core.saving - PyTorch Lightning
However, if your checkpoint weights don't have the hyperparameters saved, use this method to ... If your model's ``hparams`` argument is :class:`~argparse.
Read more >
How to tune Pytorch Lightning hyperparameters
But if you use Pytorch Lightning, you'll need to do hyperparameter tuning. ... should take a configuration dict as a parameter on initialization....
Read more >
AttributeError: 'dict' object has no attribute 'n_channels'
I am getting this error after loading the model and when the model is called. Below is my code: def main(hparams): device =...
Read more >
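
The results above all circle the same requirement: the checkpoint must carry the hyperparameters, and the module must accept them again at load time. In current pytorch-lightning releases (1.x and later), the idiomatic way to do this is save_hyperparameters(); a minimal sketch reusing the n_channels / n_classes names from the snippets above (the class name and layer are placeholders):

```python
import pytorch_lightning as pl
from torch import nn


class LitSegmenter(pl.LightningModule):
    def __init__(self, n_channels: int = 3, n_classes: int = 5):
        super().__init__()
        # Writes n_channels and n_classes into self.hparams and into every
        # checkpoint, so load_from_checkpoint() can rebuild the model.
        self.save_hyperparameters()
        self.head = nn.Linear(self.hparams.n_channels, self.hparams.n_classes)


# Restoring later needs no extra constructor arguments:
# model = LitSegmenter.load_from_checkpoint("path/to/checkpoint.ckpt")
```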
