
Lamb optimizer warning in PyTorch 1.6

See original GitHub issue

Hi, I’m getting this deprecation warning in PyTorch 1.6 for Lamb:


2020-06-25T00:58:41 - WARNING - /opt/conda/envs/py36/lib/python3.6/site-packages/torch_optimizer/lamb.py:120: UserWarning: This overload of add_ is deprecated:
	add_(Number alpha, Tensor other)
Consider using one of the following signatures instead:
	add_(Tensor other, *, Number alpha) (Triggered internally at /opt/conda/conda-bld/pytorch_1592982553767/work/torch/csrc/utils/python_arg_parser.cpp:766.)
  exp_avg.mul_(beta1).add_(1 - beta1, grad)
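The change the warning asks for is mechanical: pass the tensor as the first positional argument and the scalar as the keyword `alpha`. A minimal sketch of the new call on standalone tensors (not the optimizer's actual state, just the same update arithmetic):

```python
import torch

beta1 = 0.9
grad = torch.tensor([1.0, 2.0])
exp_avg = torch.zeros(2)

# Deprecated overload that triggers the warning in PyTorch 1.6+:
#   exp_avg.mul_(beta1).add_(1 - beta1, grad)
# Equivalent call using the keyword-alpha signature the warning recommends:
exp_avg.mul_(beta1).add_(grad, alpha=1 - beta1)
```

Both forms compute `exp_avg = exp_avg * beta1 + (1 - beta1) * grad`; only the argument order changes.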


Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Reactions: 2
  • Comments: 6 (4 by maintainers)

Top GitHub Comments

2 reactions
jettify commented, Aug 3, 2020

Fixed in https://github.com/jettify/pytorch-optimizer/pull/163 going to do PyPI release relatively soon.
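Until the fixed release reaches PyPI, the warning can be muted as a stopgap. A sketch using the standard-library `warnings` filter; the message pattern is copied from the log above, and note the filter applies process-wide:

```python
import warnings

# Ignore only the deprecated add_ overload warning emitted by lamb.py;
# all other warnings still surface normally.
warnings.filterwarnings(
    "ignore",
    message=r"This overload of add_ is deprecated",
    category=UserWarning,
)
```

Upgrading once the new torch-optimizer release is published removes the need for this filter.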

0 reactions
jettify commented, Aug 11, 2020

Read more comments on GitHub

Top Results From Across the Web

  • Trying to implement LARS/LAMB optimizers #45268 - GitHub
    Bug I'm trying to get the LAMB and LARS optimizers to work on TPUs, but I'm getting huge slowdowns at the first call...
  • Source code for flash.core.optimizers.lamb
    [docs]class LAMB(Optimizer): r"""Extends ADAM in pytorch to incorporate LAMB algorithm from the paper: `Large batch optimization for deep learning: Training ...
  • Source code for transformers.trainer - Hugging Face
    LambdaLR `, `optional`): A tuple containing the optimizer and the ... else: logger.warning( "You enabled PyTorch/XLA debug metrics but you don't have a...
  • Script and Optimize for Mobile Recipe - PyTorch
    This recipe demonstrates how to convert a PyTorch model to TorchScript which can run in a high-performance C++ environment such as iOS and...
  • 15. Changelog — PyTorch for the IPU: User Guide
    Add warning that IPU-specific optimiser states cannot be read from the host, when calling get_state() on poptorch.optim optimisers ...
