Use the non-protected LRScheduler import
🚀 Feature
Motivation
Avoid protected imports
Pitch
The change tracked in https://github.com/pytorch/pytorch/issues/61232 has been merged into PyTorch, renaming `_LRScheduler` to `LRScheduler`. The old class is kept around for backward compatibility, but we should still switch to the new one.
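For context, a minimal sanity check (a sketch, assuming a PyTorch build that already includes the rename) showing that the new public name exists and the old protected name remains usable:

```python
import torch

# On PyTorch versions that include the rename, the public `LRScheduler` exists
# and the old `_LRScheduler` is kept as a backward-compatible alias of it.
if hasattr(torch.optim.lr_scheduler, "LRScheduler"):
    assert issubclass(
        torch.optim.lr_scheduler._LRScheduler,
        torch.optim.lr_scheduler.LRScheduler,
    )
```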
The task is to add logic like this to https://github.com/Lightning-AI/lightning/blob/d5003b1c07fda783f651a732c86ad48656be42c1/src/lightning_lite/utilities/types.py#L66 (and to update the places that use it):

```python
import torch

# `_TORCH_GREATER_EQUAL_1_14` is Lightning's flag indicating that torch >= 1.14 is installed.
LRScheduler = (
    torch.optim.lr_scheduler.LRScheduler
    if _TORCH_GREATER_EQUAL_1_14
    else torch.optim.lr_scheduler._LRScheduler
)
```
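As a rough sketch of the follow-up at a call site (the helper below is hypothetical and assumes the alias is exported from `lightning_lite.utilities.types` as proposed above):

```python
# Before: the protected import straight from PyTorch
# from torch.optim.lr_scheduler import _LRScheduler

# After: the compatibility alias proposed above
from lightning_lite.utilities.types import LRScheduler


def check_scheduler(obj: object) -> None:
    # Hypothetical helper: the isinstance check works on both old and new PyTorch,
    # since the alias resolves to whichever class the installed version provides.
    if not isinstance(obj, LRScheduler):
        raise TypeError(f"Expected an LR scheduler, got {type(obj).__name__}")
```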
Alternatives
Keep using the protected import
If you enjoy Lightning, check out our other projects! ⚡
- Metrics: Machine learning metrics for distributed, scalable PyTorch applications.
- Lite: enables pure PyTorch users to scale their existing code on any kind of device while retaining full control over their own loops and optimization logic.
- Flash: The fastest way to get a Lightning baseline! A collection of tasks for fast prototyping, baselining, fine-tuning, and solving problems with deep learning.
- Bolts: Pretrained SOTA Deep Learning models, callbacks, and more for research and production with PyTorch Lightning and PyTorch.
- Lightning Transformers: Flexible interface for high-performance research using SOTA Transformers leveraging PyTorch Lightning, Transformers, and Hydra.
cc @borda
Hi @shenoynikhil, I am working on this one.
I described the work in the top post. We want to use `LRScheduler` throughout the codebase by importing a small compatibility variable defined in `lightning/src/lightning_lite/utilities/types.py`.
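For instance, a type annotation at a call site would then reference the compatibility variable instead of the protected class (a minimal sketch, under the same assumption that the alias is exported from `lightning_lite.utilities.types`; the function is purely illustrative):

```python
from typing import Optional

from lightning_lite.utilities.types import LRScheduler  # proposed compatibility alias


def configure_scheduler(scheduler: Optional[LRScheduler] = None) -> Optional[LRScheduler]:
    # Hypothetical call site: the annotation stays valid across PyTorch versions
    # because the alias points at whichever scheduler base class is available.
    return scheduler
```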