
TypeError: unsupported operand type(s) for *: 'NoneType' and 'NoneType' when constructing Ranger21


I get the following error when starting my training:

Traceback (most recent call last):
  File "tr_baseline.py", line 75, in <module>
    optimizer = Ranger21(params=model.parameters(), lr=learning_rate)
  File "/mnt/Drive1/florian/msblob/Ranger21/ranger21/ranger21.py", line 179, in __init__
    self.total_iterations = num_epochs * num_batches_per_epoch
TypeError: unsupported operand type(s) for *: 'NoneType' and 'NoneType'

I am initializing Ranger21 with:

# ranger:
optimizer = Ranger21(params=model.parameters(), lr=learning_rate)
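
For context, the traceback shows Ranger21's __init__ computing num_epochs * num_batches_per_epoch; since neither keyword was passed, both presumably fall back to a default of None, which is what the minimal sketch below reproduces (the None defaults are an assumption read off the traceback, not verified against the library source):

# Hypothetical repro of the failure: multiplying two unset (None) arguments.
num_epochs = None
num_batches_per_epoch = None
total_iterations = num_epochs * num_batches_per_epoch  # TypeError: unsupported operand type(s) for *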

Issue Analytics

  • State:open
  • Created 2 years ago
  • Comments:6 (2 by maintainers)

Top GitHub Comments

saruarlive commented, Jun 28, 2021

Well, have you tried it as shown below?

from ranger21 import Ranger21

optimizer = Ranger21(
    model.parameters(),
    lr=1e-2,
    num_epochs=epochs,
    num_batches_per_epoch=len(train_loader),
)
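
For completeness, here is a slightly fuller sketch of the same fix inside a self-contained setup; the dataset, model, and hyperparameters are placeholders, and the only detail taken from this thread is that Ranger21 wants num_epochs and num_batches_per_epoch at construction time so it can derive its total iteration count:

import torch
from torch.utils.data import DataLoader, TensorDataset
from ranger21 import Ranger21

# Placeholder data and model, only to make the snippet runnable on its own.
dataset = TensorDataset(torch.randn(256, 10), torch.randn(256, 1))
train_loader = DataLoader(dataset, batch_size=32, shuffle=True)
model = torch.nn.Linear(10, 1)
epochs = 10

optimizer = Ranger21(
    model.parameters(),
    lr=1e-2,
    num_epochs=epochs,                        # lets Ranger21 know the planned epoch count
    num_batches_per_epoch=len(train_loader),  # iterations per epoch, here taken from the loader length
)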

neuronflow commented, Jul 6, 2021

One further question: I have a training setup where I use multiple training data loaders with different numbers of batches… is it possible to apply Ranger21 in this context?
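
The thread leaves this open, but one hedged possibility: since the traceback suggests Ranger21 only needs the product num_epochs * num_batches_per_epoch (its total iteration count), you could try passing the combined batch count of all training loaders. Whether that interacts correctly with Ranger21's internal iteration-based behavior is an assumption, not something confirmed by the maintainers here:

from ranger21 import Ranger21

# loader_a and loader_b are hypothetical DataLoaders with different batch counts;
# model and epochs stand in for your own training setup.
batches_per_epoch = len(loader_a) + len(loader_b)  # total batches seen per epoch

optimizer = Ranger21(
    model.parameters(),
    lr=1e-2,
    num_epochs=epochs,
    num_batches_per_epoch=batches_per_epoch,
)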
