
AimLogger error on new Pytorch-Lightning Release

See original GitHub issue

🐛 Bug

The _convert_params helper was removed in the new pytorch-lightning release (1.6.0), so AimLogger.log_hyperparams now fails with an AttributeError.

To reproduce

import pytorch_lightning as pl
from aim.pytorch_lightning import AimLogger

hparams = {'learning_rate': 1e-3}  # any hyperparameter dict

trainer = pl.Trainer(logger=AimLogger(experiment='experiment_name'))
trainer.logger.log_hyperparams(hparams)

Expected behavior

  • Successfully log the hyperparameters with no errors

Environment

  • Aim Version (3.8.0)
  • Python version (3.8.12)
  • pip version (21.2.4)
  • OS (e.g., Linux)
  • Pytorch-lightning (1.6.0)

Additional context

Error output

Error executing job with overrides: ['experiment=dma_net', 'debug=default', 'log_dir=debug', 'data_dir=/media/haritsahm/DataStorage/dataset/cityscapes/cityscape_fo_segmentation/', 'logger=aim']
Traceback (most recent call last):
  File "train.py", line 22, in main
    return train(config)
  File "/home/haritsahm/Documents/projects/deeplearning/pytorch-DMANet/src/training_pipeline.py", line 74, in train
    utils.log_hyperparameters(
  File "/opt/conda/lib/python3.8/site-packages/pytorch_lightning/utilities/rank_zero.py", line 32, in wrapped_fn
    return fn(*args, **kwargs)
  File "/home/haritsahm/Documents/projects/deeplearning/pytorch-DMANet/src/utils/__init__.py", line 143, in log_hyperparameters
    trainer.logger.log_hyperparams(hparams)
  File "/opt/conda/lib/python3.8/site-packages/pytorch_lightning/utilities/rank_zero.py", line 32, in wrapped_fn
    return fn(*args, **kwargs)
  File "/opt/conda/lib/python3.8/site-packages/aim/sdk/adapters/pytorch_lightning.py", line 70, in log_hyperparams
    params = self._convert_params(params)
AttributeError: 'AimLogger' object has no attribute '_convert_params'

Set the environment variable HYDRA_FULL_ERROR=1 for a complete stack trace.
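Until a release of aim containing the fix is installed, one possible stopgap is to patch a minimal replacement onto AimLogger. This is a sketch under the assumption that _convert_params only needs to normalise the hyperparameters (an argparse.Namespace, a mapping, or None) into a plain dict; the patching lines are commented out because they require aim to be installed.

```python
from argparse import Namespace
from typing import Any, Dict, Optional, Union


def convert_params(params: Optional[Union[Dict[str, Any], Namespace]]) -> Dict[str, Any]:
    """Minimal stand-in for the removed helper: normalise hyperparameters
    into a plain dict. None becomes an empty dict, a Namespace is
    converted via vars(), and any other mapping is copied."""
    if params is None:
        return {}
    if isinstance(params, Namespace):
        return vars(params)
    return dict(params)


# Monkey-patch sketch (requires aim; names taken from the traceback above).
# staticmethod() makes `self._convert_params(params)` resolve to
# `convert_params(params)` inside AimLogger.log_hyperparams:
# from aim.pytorch_lightning import AimLogger
# AimLogger._convert_params = staticmethod(convert_params)
```

Prefer upgrading aim once the fix ships; a patch like this should only bridge the gap.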

Issue Analytics

  • State: closed
  • Created: a year ago
  • Reactions: 1
  • Comments: 8 (6 by maintainers)

Top GitHub Comments

1 reaction
mihran113 commented, Apr 18, 2022

hey @mpds! Sorry for the inconvenience, but that’s actually a different dev release that doesn’t have the fix included. Can I ask you to try out this one and let me know if it works properly?

pip install aim==3.9.0.dev20220417

0 reactions
mihran113 commented, May 2, 2022

Hey folks! Glad to inform you that the fix was shipped with the new aim v3.9 release. Please give it a try and let me know if everything works as expected. @mpds @haritsahm @djwessel
