Support latest PyTorch Lightning
Expected behavior
Optuna works with the latest PyTorch-Lightning (1.6.0).
Environment
- Optuna version: 3.0.0b0.dev
- Python version:
- OS: Independent
- (Optional) Other libraries and their versions: PyTorch-Lightning 1.6.0
Error messages, stack traces, or logs
tests/integration_tests/test_pytorch_lightning.py:158:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
optuna/study/study.py:399: in optimize
_optimize(
optuna/study/_optimize.py:68: in _optimize
_optimize_sequential(
optuna/study/_optimize.py:162: in _optimize_sequential
trial = _run_trial(study, func, catch)
optuna/study/_optimize.py:262: in _run_trial
raise func_err
optuna/study/_optimize.py:211: in _run_trial
value_or_values = func(trial)
tests/integration_tests/test_pytorch_lightning.py:143: in objective
trainer = pl.Trainer(
/opt/hostedtoolcache/Python/3.8.12/x64/lib/python3.8/site-packages/pytorch_lightning/utilities/argparse.py:339: in insert_env_defaults
return fn(self, **kwargs)
/opt/hostedtoolcache/Python/3.8.12/x64/lib/python3.8/site-packages/pytorch_lightning/trainer/trainer.py:561: in __init__
self._call_callback_hooks("on_init_start")
/opt/hostedtoolcache/Python/3.8.12/x64/lib/python3.8/site-packages/pytorch_lightning/trainer/trainer.py:1617: in _call_callback_hooks
fn(self, *args, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <optuna.integration.pytorch_lightning.PyTorchLightningPruningCallback object at 0x7f892c1c5730>
trainer = <pytorch_lightning.trainer.trainer.Trainer object at 0x7f892c1c5040>
def on_init_start(self, trainer: Trainer) -> None:
> self.is_ddp_backend = trainer._accelerator_connector.distributed_backend is not None
E AttributeError: 'AcceleratorConnector' object has no attribute 'distributed_backend'
optuna/integration/pytorch_lightning.py:60: AttributeError
https://github.com/optuna/optuna/runs/5745775785?check_suite_focus=true
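The first failure comes from the integration's `on_init_start` hook reading `trainer._accelerator_connector.distributed_backend`, an attribute that no longer exists in PyTorch-Lightning 1.6. A hedged sketch of a version-tolerant check follows; the fallback attribute names are assumptions based on the 1.5 vs. 1.6 internals, and stand-in objects are used so the example runs without PL installed:

```python
from types import SimpleNamespace


def is_ddp_backend(trainer) -> bool:
    """Best-effort DDP detection across PL releases (attribute names vary)."""
    connector = getattr(trainer, "_accelerator_connector", None)
    # PL < 1.6 exposed `distributed_backend` on the accelerator connector.
    if connector is not None and hasattr(connector, "distributed_backend"):
        return connector.distributed_backend is not None
    # PL >= 1.6: inspect the strategy object instead (assumption: class names
    # of distributed strategies contain "ddp", e.g. DDPStrategy).
    strategy = getattr(trainer, "strategy", None)
    return strategy is not None and "ddp" in type(strategy).__name__.lower()


# Stand-ins mimicking old and new Trainer internals:
old_trainer = SimpleNamespace(
    _accelerator_connector=SimpleNamespace(distributed_backend="ddp")
)


class DDPStrategy:  # hypothetical stand-in for pl.strategies.DDPStrategy
    pass


new_trainer = SimpleNamespace(_accelerator_connector=object(), strategy=DDPStrategy())

print(is_ddp_backend(old_trainer))  # True
print(is_ddp_backend(new_trainer))  # True
```

This is a sketch, not the fix Optuna shipped; the real callback would also need to handle spawn-based strategies.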
tests/integration_tests/test_pytorch_lightning.py:6: in <module>
import pytorch_lightning as pl
../../../hostedtoolcache/Python/3.8.12/x64/lib/python3.8/site-packages/pytorch_lightning/__init__.py:30: in <module>
from pytorch_lightning.callbacks import Callback # noqa: E402
../../../hostedtoolcache/Python/3.8.12/x64/lib/python3.8/site-packages/pytorch_lightning/callbacks/__init__.py:26: in <module>
from pytorch_lightning.callbacks.pruning import ModelPruning
../../../hostedtoolcache/Python/3.8.12/x64/lib/python3.8/site-packages/pytorch_lightning/callbacks/pruning.py:31: in <module>
from pytorch_lightning.core.lightning import LightningModule
../../../hostedtoolcache/Python/3.8.12/x64/lib/python3.8/site-packages/pytorch_lightning/core/__init__.py:16: in <module>
from pytorch_lightning.core.lightning import LightningModule
../../../hostedtoolcache/Python/3.8.12/x64/lib/python3.8/site-packages/pytorch_lightning/core/lightning.py:40: in <module>
from pytorch_lightning.loggers import LightningLoggerBase
../../../hostedtoolcache/Python/3.8.12/x64/lib/python3.8/site-packages/pytorch_lightning/loggers/__init__.py:18: in <module>
from pytorch_lightning.loggers.tensorboard import TensorBoardLogger
../../../hostedtoolcache/Python/3.8.12/x64/lib/python3.8/site-packages/pytorch_lightning/loggers/tensorboard.py:26: in <module>
from torch.utils.tensorboard import SummaryWriter
../../../hostedtoolcache/Python/3.8.12/x64/lib/python3.8/site-packages/torch/utils/tensorboard/__init__.py:4: in <module>
LooseVersion = distutils.version.LooseVersion
E AttributeError: module 'distutils' has no attribute 'version'
https://github.com/optuna/optuna/runs/5745734509?check_suite_focus=true
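The second failure is unrelated to the callback: torch's tensorboard shim does `import distutils` and then reads `distutils.version`, relying on something else having already imported that submodule (our reading is that setuptools' distutils shim used to do this implicitly, and newer setuptools no longer does; the usual workarounds were pinning setuptools or upgrading to PyTorch 1.11). A plain `import pkg` does not bind `pkg.submodule` — a minimal stdlib demonstration of the semantics:

```python
import importlib
import xml  # importing a package does not import its submodules

had_dom_before = hasattr(xml, "dom")  # typically False in a fresh process

# The submodule is only bound on the package once it is imported explicitly,
# which is exactly why `distutils.version` raised AttributeError above.
importlib.import_module("xml.dom")
assert hasattr(xml, "dom")
```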
Steps to reproduce
See our CI failures.
Additional context (optional)
The simplest way forward may be to support PyTorch v1.11.0 as well (https://github.com/PyTorchLightning/pytorch-lightning/issues/12324).
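If the integration is to keep working on both sides of the 1.6 boundary, a common pattern is to branch on the installed PL version. A minimal, dependency-free sketch of the version comparison (the helper name is ours; real code would parse `pytorch_lightning.__version__`, and pre-release tags such as `1.6.0rc1` need more care):

```python
def version_tuple(v: str) -> tuple:
    """'1.6.0' -> (1, 6, 0); drops any non-numeric trailing parts."""
    parts = []
    for p in v.split("."):
        if not p.isdigit():
            break
        parts.append(int(p))
    return tuple(parts)


PL_1_6 = (1, 6)
print(version_tuple("1.6.0") >= PL_1_6)   # True  -> use the new strategy API
print(version_tuple("1.5.10") >= PL_1_6)  # False -> old `distributed_backend` path
```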
Issue Analytics
- Created: a year ago
- Reactions: 5
- Comments: 13 (3 by maintainers)
@hrzn Yes, https://github.com/optuna/optuna/pull/3431 was my attempt to resolve the PyTorch-Lightning compatibility issue, so it might be helpful.
Sorry, I mentioned the wrong line. It should have been https://github.com/optuna/optuna/blob/fd575d04cebae8b471e72aa890e4a4cb052aa083/.github/workflows/tests-integration.yml#L79.