
🐛 Bug

When I integrate densenet121 with torchcam I get the following warning, which does not appear when using the model directly.

usr/local/lib/python3.7/dist-packages/torch/nn/functional.py:1204: UserWarning: Output 0 of BackwardHookFunctionBackward is a view and is being modified inplace. This view was created inside a custom Function (or because an input was returned as-is) and the autograd logic to handle view+inplace would override the custom backward associated with the custom Function, leading to incorrect gradients. This behavior is deprecated and will be forbidden starting version 1.6. You can remove this warning by cloning the output of the custom Function. (Triggered internally at /pytorch/torch/csrc/autograd/variable.cpp:547.) result = torch.relu_(input)

Here is the code:

from torchvision.io import read_image
from torchvision.models import densenet121
from torchvision.transforms.functional import normalize, resize
from torchcam.cams import GradCAM  # import path in torchcam 0.2.x

model = densenet121(pretrained=True).eval()

img = read_image("image.png")
input_tensor = normalize(resize(img, (224, 224)) / 255., [0.485, 0.456, 0.406], [0.229, 0.224, 0.225])
extract = GradCAM(model=model)
out = model(input_tensor.unsqueeze(0))

Environment

  • I am working on Google Colab
  • Installed torchcam using pip
  • PyTorch version: 1.8.1+cu101
  • torchcam version: 0.2.0
  • Python version: 3.7.10

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Reactions: 1
  • Comments: 13 (7 by maintainers)

Top GitHub Comments

3 reactions
wizlee commented, Aug 11, 2021

It works great now =) Just as a reference for others who might stumble across this issue when using Google Colab: you can use the command below to install torchcam from GitHub instead of the PyPI repo.

!pip install git+https://github.com/frgfm/torch-cam.git#egg=torchcam
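For readers who cannot install from GitHub, a generic workaround sketch (my own assumption, not proposed in this thread): the warning comes from DenseNet's in-place torch.relu_ running on a view created by the backward hooks, so swapping in-place ReLU modules for out-of-place ones before attaching the extractor sidesteps the view+inplace conflict. The helper name below is hypothetical, not part of torchcam:

```python
import torch.nn as nn

def disable_inplace_relu(module: nn.Module) -> None:
    """Recursively replace every nn.ReLU(inplace=True) with an
    out-of-place nn.ReLU, so hooked activations are not mutated."""
    for name, child in module.named_children():
        if isinstance(child, nn.ReLU) and child.inplace:
            setattr(module, name, nn.ReLU(inplace=False))
        else:
            disable_inplace_relu(child)

# Example on a small stand-in network:
net = nn.Sequential(nn.Conv2d(3, 4, 3), nn.ReLU(inplace=True),
                    nn.Sequential(nn.ReLU(inplace=True)))
disable_inplace_relu(net)
print(all(not m.inplace for m in net.modules() if isinstance(m, nn.ReLU)))  # → True
```

You would call disable_inplace_relu(model) on the densenet121 instance before constructing GradCAM; note this trades a small amount of memory for the cleaner autograd graph.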

2 reactions
frgfm commented, Oct 14, 2021

Hi everyone 👋

My apologies, I’ve been away for a few weeks, but I can confirm a release is coming soon 😃 I just need to handle a few feature additions (CAM fusion with LayerCAM, and extending support for target_layer arg), and the release will be on its way 👍


Top Results From Across the Web

  • Automatic differentiation package - torch.autograd (PyTorch docs): Computes and returns the sum of gradients of outputs with respect to the inputs. Forward-mode Automatic Differentiation. Warning. This API is in beta...
  • PyTorch warning about using a non-full backward hook when ...: Using a non-full backward hook when the forward contains multiple autograd Nodes is deprecated and will be ...
  • JAX Quickstart (JAX documentation, Read the Docs): With its updated version of Autograd, JAX can automatically differentiate native Python and NumPy code. It can differentiate through a large subset of ...
  • Johnson-Automatic-Differentiation.pdf: Autograd. • github.com/hips/autograd. • differentiates native Python code ... warnings.warn("Output seems independent of input.").
  • Unable to install Torchvision 0.10.0 on Jetson Nano: I would like to install PyTorch and Torchvision on my Jetson Nano 2GB developer kit. I installed PyTorch version 1.9.0 successfully, ...
