Autograd Warning
🐛 Bug
When I integrate densenet121 with torchcam, I get the following warning, which I do not get when using the model directly:
```
/usr/local/lib/python3.7/dist-packages/torch/nn/functional.py:1204: UserWarning: Output 0 of BackwardHookFunctionBackward is a view and is being modified inplace. This view was created inside a custom Function (or because an input was returned as-is) and the autograd logic to handle view+inplace would override the custom backward associated with the custom Function, leading to incorrect gradients. This behavior is deprecated and will be forbidden starting version 1.6. You can remove this warning by cloning the output of the custom Function. (Triggered internally at /pytorch/torch/csrc/autograd/variable.cpp:547.)
  result = torch.relu_(input)
```
Here is the code:

```python
from torchvision.io import read_image
from torchvision.models import densenet121
from torchvision.transforms.functional import normalize, resize
from torchcam.cams import GradCAM

model = densenet121(pretrained=True).eval()
img = read_image("image.png")
input_tensor = normalize(resize(img, (224, 224)) / 255., [0.485, 0.456, 0.406], [0.229, 0.224, 0.225])
extract = GradCAM(model=model)
out = model(input_tensor.unsqueeze(0))
```
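The warning is triggered by DenseNet's in-place ReLU activations (`torch.relu_`) interacting with the backward hooks that torchcam registers. A hedged workaround, not taken from this thread, is to switch every `nn.ReLU` module in the model to its out-of-place form before constructing the extractor; the helper name `disable_inplace_relu` below is hypothetical:

```python
import torch.nn as nn

def disable_inplace_relu(model: nn.Module) -> nn.Module:
    # Flip every in-place ReLU to the out-of-place version so that
    # autograd no longer sees an in-place modification of a view
    # produced inside the backward-hook custom Function.
    for module in model.modules():
        if isinstance(module, nn.ReLU):
            module.inplace = False
    return model
```

This changes memory usage slightly (out-of-place ReLU allocates a new tensor) but leaves the model's outputs unchanged.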
Environment
- I am working on Google Colab
- Installed torchcam using pip
- PyTorch version: 1.8.1+cu101
- Torchcam version: 0.2.0
- Python version: 3.7.10
Issue Analytics
- Created 2 years ago
- Reactions: 1
- Comments: 13 (7 by maintainers)
Top GitHub Comments
It works great now =) Just as a reference for others who might stumble across this issue when using torchcam in Google Colab: you can use the command below to install torchcam from GitHub instead of the PyPI repo.

```shell
!pip install git+https://github.com/frgfm/torch-cam.git#egg=torchcam
```
Hi everyone 👋
My apologies, I’ve been away for a few weeks, but I can confirm a release is coming soon 😃 I just need to handle a few feature additions (CAM fusion with LayerCAM, and extending support for the `target_layer` arg), and the release will be on its way 👍