Stuck on an issue?

Lightrun Answers was designed to reduce the constant googling that comes with debugging third-party libraries. It collects links to all the places you might be looking while hunting down a tough bug.

And, if you’re still stuck at the end, we’re happy to hop on a call to see how we can help out.

AttributeError: 'DistributedDataParallel' object has no attribute 'callback_queued'

See original GitHub issue

My PyTorch version is 1.0 stable.

Traceback (most recent call last):
  File "train_cosine_iterations_distributed.py", line 420, in <module>
    optimizer.backward(loss)
  File "/home/ycg/anaconda3/lib/python3.6/site-packages/apex-0.1-py3.6-linux-x86_64.egg/apex/fp16_utils/fp16_optimizer.py", line 482, in backward
    self.loss_scaler.backward(loss.float())
  File "/home/ycg/anaconda3/lib/python3.6/site-packages/apex-0.1-py3.6-linux-x86_64.egg/apex/fp16_utils/loss_scaler.py", line 45, in backward
    scaled_loss.backward()
  File "/home/ycg/anaconda3/lib/python3.6/site-packages/torch/tensor.py", line 102, in backward
    torch.autograd.backward(self, gradient, retain_graph, create_graph)
  File "/home/ycg/anaconda3/lib/python3.6/site-packages/torch/autograd/__init__.py", line 90, in backward
    allow_unreachable=True)  # allow_unreachable flag
  File "/home/ycg/anaconda3/lib/python3.6/site-packages/apex-0.1-py3.6-linux-x86_64.egg/apex/parallel/distributed.py", line 340, in allreduce_hook
    if not self.callback_queued:
  File "/home/ycg/anaconda3/lib/python3.6/site-packages/torch/nn/modules/module.py", line 535, in __getattr__
    type(self).__name__, name))
AttributeError: 'DistributedDataParallel' object has no attribute 'callback_queued'
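
For context, the exception is raised by PyTorch's nn.Module.__getattr__, which only searches registered parameters, buffers, and submodules; per the traceback, apex's allreduce_hook looks up callback_queued on the DistributedDataParallel wrapper and finds nothing. Below is a minimal sketch of that lookup behavior, using nn.DataParallel to avoid process-group setup (DistributedDataParallel resolves attributes the same way; ToyModel and my_flag are hypothetical names):

import torch.nn as nn

class ToyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(4, 2)
        self.my_flag = True  # plain Python attribute on the inner model

wrapped = nn.DataParallel(ToyModel())
print(wrapped.module.my_flag)  # True: the attribute lives on the wrapped model
try:
    print(wrapped.my_flag)     # looked up on the wrapper itself
except AttributeError as e:
    print(e)                   # 'DataParallel' object has no attribute 'my_flag'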

Issue Analytics

  • State: closed
  • Created: 5 years ago
  • Comments: 6

Top GitHub Comments

10 reactions
vibhavagarwal5 commented, May 28, 2020

Does model = model.module help in solving the issue?
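
As a sketch of that suggestion (with a stand-in model, for illustration only): unwrap once the parallel wrapper is no longer needed, e.g. before saving or touching custom attributes; unwrapping mid-training would disable gradient synchronization.

import torch.nn as nn

model = nn.DataParallel(nn.Linear(4, 2))  # stand-in for your wrapped model
if isinstance(model, (nn.DataParallel, nn.parallel.DistributedDataParallel)):
    model = model.module  # the original, unwrapped model
print(type(model).__name__)  # Linear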

0 reactions
xieydd commented, Sep 7, 2019

All, I solved the problem; see the linked issue #457.

Read more comments on GitHub >

Top Results From Across the Web

'DistributedDataParallel' object has no attribute 'no_sync'
Hi, I am trying to fine-tune layoutLM using the following: distribution = {'smdistributed':{'dataparallel':{ 'enabled': True } ...
Read more >
AttributeError: 'DataParallel' object has no attribute 'copy'
I found this by simply googling your problem: retinanet.load_state_dict(torch.load('filename').module.state_dict()).
Read more >
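
That answer relies on the fact that a wrapped model's state_dict keys carry a 'module.' prefix. A minimal sketch of the round-trip, with stand-in models and a hypothetical checkpoint.pth:

import torch
import torch.nn as nn

model = nn.Linear(4, 2)           # stand-in architecture
wrapped = nn.DataParallel(model)  # as it was during training

# Save the inner module's weights so keys carry no 'module.' prefix.
torch.save(wrapped.module.state_dict(), 'checkpoint.pth')

# A fresh, unwrapped model of the same architecture loads them directly.
fresh = nn.Linear(4, 2)
fresh.load_state_dict(torch.load('checkpoint.pth'))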
How to reach model attributes wrapped by nn.DataParallel?
Before wrapping with nn.DataParallel I was able to reach it via model.rnn, but afterwards it raises AttributeError: 'DataParallel' object has no ...
Read more >
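
A common community workaround from threads like that one is a thin pass-through wrapper that forwards unknown attributes to the inner module; this is a user-level pattern, not an official PyTorch API (DataParallelPassthrough and RNNModel are hypothetical names):

import torch.nn as nn

class DataParallelPassthrough(nn.DataParallel):
    """Fall back to the wrapped module for attributes the wrapper lacks."""
    def __getattr__(self, name):
        try:
            return super().__getattr__(name)
        except AttributeError:
            return getattr(self.module, name)

class RNNModel(nn.Module):  # hypothetical model with a custom submodule
    def __init__(self):
        super().__init__()
        self.rnn = nn.RNN(8, 8)

model = DataParallelPassthrough(RNNModel())
print(model.rnn)  # resolves on the inner module, no AttributeError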
[pytorch] AttributeError: DistributedDataParallel has no attribute
pajamacoder, Apr 21, 2021, 22:27. Load a previously trained model and its parameters, excluding specific layers ...
Read more >
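
That post covers loading pre-trained weights while leaving specific layers out; a sketch under the assumption of a hypothetical 'classifier' head that should be re-initialized:

import torch
import torch.nn as nn

# Stand-in pre-trained network with a 'classifier' head (hypothetical names).
model = nn.Sequential()
model.add_module('backbone', nn.Linear(8, 8))
model.add_module('classifier', nn.Linear(8, 2))
torch.save(model.state_dict(), 'pretrained.pth')

state = torch.load('pretrained.pth')
# Drop the keys of layers to re-train, then load the rest non-strictly.
filtered = {k: v for k, v in state.items() if not k.startswith('classifier')}
model.load_state_dict(filtered, strict=False)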
'DistributedDataParallel' object has no attribute 'generate'(gpu ...
When training a model with DistributedDataParallel, I found that during the forward pass you can run into a 'DistributedDataParallel' object has no attribute error.
Read more >
