
[CLI] Using a non-full backward hook leads to user warning.

See original GitHub issue

Description

Using wandb with the nn.Sequential module raises a user warning.

The wandb features involved are wandb.watch and wandb.init.

How to reproduce

Run this:

import torch
import torch.nn as nn
import wandb

with wandb.init(project="Project"):
    model = nn.Sequential(nn.Linear(100, 50),
                          nn.ReLU(),
                          nn.Linear(50, 5))
    wandb.watch(model)  # attaches hooks to the model to log gradients
    model.train()
    X = torch.randn(5, 100)  # batch of 5 random inputs
    criterion = nn.CrossEntropyLoss(reduction="mean")
    labels = torch.Tensor([0, 1, 2, 3, 4]).long()  # one target per input
    output = model(X)
    loss = criterion(output, labels)
    loss.backward()

Warning:

C:\Users\Me\anaconda3\envs\Project\lib\site-packages\torch\nn\modules\module.py:795: UserWarning: Using a non-full backward hook when the forward contains multiple autograd Nodes is deprecated and will be removed in future versions. This hook will be missing some grad_input. Please use register_full_backward_hook to get the documented behavior.
  warnings.warn("Using a non-full backward hook when the forward contains multiple autograd Nodes "

Environment

  • OS: Windows 10
  • Python Version: 3.8.8
  • PyTorch Version: 1.8.0

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Reactions: 4
  • Comments: 7 (3 by maintainers)

Top GitHub Comments

2 reactions
vanpelt commented, Apr 21, 2021

The fix actually got pushed to the next release. If you want to try the fix before we release next week, you can install wandb with:

pip install --upgrade git+git://github.com/wandb/client.git@bug/issue-1122-2#egg=wandb
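
Note that GitHub no longer serves the unauthenticated git:// protocol, so on a current system the same branch would be installed over https:

pip install --upgrade git+https://github.com/wandb/client.git@bug/issue-1122-2#egg=wandb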

0 reactions
github-actions[bot] commented, Jun 21, 2021

This issue is stale because it has been open 60 days with no activity.

Read more comments on GitHub >

Top Results From Across the Web

[CLI] Using a non-full backward hook leads to user warning.
Description Using wandb with the nn.Sequential module raises a user warning. Wandb features wandb.watch and wandb.init.
Read more >
PyTorch warning about using a non-full backward hook when ...
The warning you're getting: Using a non-full backward hook when the forward contains multiple autograd Nodes is deprecated and will be removed ...
Read more >
Using non-full backward hooks on a Module that does not ...
I'm getting this warning since I upgraded to 1.8. I have a custom module that returns tensor, (tensor tensor) which triggers this. What...
Read more >
