
'AmpOptimizerState' object has no attribute 'all_fp32_params'

See original GitHub issue

I initialized the model and the optimizer with apex amp:

from torch.optim import Adam
from torch.optim.lr_scheduler import ReduceLROnPlateau
from apex import amp

optimizer = Adam(filter(lambda p: p.requires_grad, model.parameters()), args.lr)
scheduler = ReduceLROnPlateau(optimizer, patience=1, factor=0.1, verbose=True, mode='max')
model, optimizer = amp.initialize(model, optimizer, opt_level="O1", verbosity=0)

Then I tried to add a group of newly unfrozen parameters to the optimizer before training:

optimizer.add_param_group({'params':unfreezed_params, 'lr':lr})
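
Here unfreezed_params is just a list of parameters that were switched back to requires_grad=True, collected along these lines (encoder is only a placeholder submodule name, not from my code):

unfreezed_params = []
for p in model.encoder.parameters():
    p.requires_grad = True
    unfreezed_params.append(p)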

I got the following error:

AttributeError: 'AmpOptimizerState' object has no attribute 'all_fp32_params'

System info: Ubuntu 18.04, CUDA 10.0.130, PyTorch 1.1.0

Issue Analytics

  • State: closed
  • Created 4 years ago
  • Comments: 6 (2 by maintainers)

Top GitHub Comments

1 reaction
adriansahlman commented, May 28, 2019

@thuwyh Quick workaround is to do the following:

optimizer = Adam(filter(lambda p: p.requires_grad, model.parameters()), args.lr)
scheduler = ReduceLROnPlateau(optimizer, patience=1, factor=0.1, verbose=True, mode='max')
model, optimizer = amp.initialize(model, optimizer, opt_level="O1", verbosity=0)
# Force Amp to build its internal parameter stash now, instead of waiting for
# the first backward()/zero_grad(), so add_param_group() works before training.
optimizer._lazy_init_maybe_master_weights()

1 reaction
mcarilli commented, May 27, 2019

I think I know what’s going on. Amp relies on some internal stash that has knowledge of the parameters. Typically this is silently lazy-initialized by the backward context manager, or an initial call to optimizer.zero_grad().

I anticipated that people would only want to add param groups later on in training, after many steps had already been taken and the lazy stash initialization had taken place, so that is what the tests cover. I will make sure add_param_group also triggers this initialization when necessary.
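
For illustration (not from the original thread), a minimal sketch of the flow the tests do cover, assuming apex is installed and a CUDA device is available; the toy model, shapes, and learning rates are placeholders. Take at least one step so the stash gets lazily built, and only then add the new group:

import torch
from torch.optim import Adam
from apex import amp

model = torch.nn.Linear(10, 2).cuda()
optimizer = Adam(model.parameters(), lr=1e-3)
model, optimizer = amp.initialize(model, optimizer, opt_level="O1", verbosity=0)

# At least one normal iteration: zero_grad() / the scaled backward pass
# lazily initializes Amp's internal parameter stash.
inputs = torch.randn(4, 10).cuda()
optimizer.zero_grad()
loss = model(inputs).sum()
with amp.scale_loss(loss, optimizer) as scaled_loss:
    scaled_loss.backward()
optimizer.step()

# Only now add the new (e.g. freshly unfrozen) parameter group.
new_params = [torch.nn.Parameter(torch.zeros(2, device='cuda'))]
optimizer.add_param_group({'params': new_params, 'lr': 1e-4})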

Read more comments on GitHub >

Top Results From Across the Web

AttributeError: '' object has no attribute '' - python - Stack Overflow
Your NewsFeed class instance n doesn't have a Canvas attribute. If you want to pass the Canvas defined in your Achtergrond class instance …
Read more >
AttributeError: 'DatasetConversionInfo' object has no attribute ...
Solved: my command line is pot -c SR-fsrcnn.json my JSON configuration file is here { /* Model parameters */ "model" : {
Read more >
AttributeError: 'function' object has no attribute - Databricks
Problem You are selecting columns from a DataFrame and you get an error message. ERROR: AttributeError: 'function' object has no attribute ...
Read more >
'Window' object has no attribute '_progSignedTexFont' - Builder
I keep getting the error AttributeError: 'Window' object has no attribute '_progSignedTexFont' ################ Experiment ended with exit ...
Read more >
Python AttributeError: 'tuple' object has no attribute
AttributeError: 'tuple' object has no attribute. This error occurs when attempting to access the values of a tuple incorrectly...
Read more >
