
Why enable retain_graph in backward?

See original GitHub issue

Describe the question: Why is retain_graph used in the model's backward pass? Does the model need to run backward twice?

https://github.com/shenweichen/DeepCTR-Torch/blob/caa12dd15144900ecfcb04908e9834669cb12304/deepctr_torch/models/basemodel.py#L226
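
For context, here is a minimal, hedged sketch of the pattern at the linked line. The tensors below are stand-ins chosen for illustration; in the real code, loss comes from a forward pass and self.reg_loss from the model's regularized weights:

    import torch

    w = torch.randn(4, requires_grad=True)   # stand-in for model weights
    x = torch.randn(4)                       # stand-in for a batch

    loss = (w * x).sum()                     # stand-in for the prediction loss
    reg_loss = (w ** 2).sum()                # stand-in for self.reg_loss
    total_loss = loss + reg_loss

    # The pattern at the linked line: backward over the combined loss,
    # keeping the graph alive so it can be traversed again later.
    total_loss.backward(retain_graph=True)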

Issue Analytics

  • State: closed
  • Created: 4 years ago
  • Comments: 5

Top GitHub Comments

1 reaction
jhhugo commented, Dec 9, 2019

Because at line 222, total_loss = loss + self.reg_loss, and self.reg_loss is computed from the graph's weights, so the graph has to be kept (retained) for backward to traverse it.
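
A hedged sketch of the failure mode this comment describes. The caching of the regularization term across steps is an assumption made to illustrate the point, not taken from the repository: if self.reg_loss is a tensor created once and reused every step, the second step's backward would traverse an already-freed subgraph unless retain_graph=True keeps it alive:

    import torch

    w = torch.randn(3, requires_grad=True)
    reg_loss = (w ** 2).sum()   # built once, reused every step (assumed)

    for step in range(2):
        x = torch.randn(3)
        loss = (w * x).sum()
        # Without retain_graph=True, step 2 raises "Trying to backward
        # through the graph a second time": reg_loss's subgraph was
        # freed by step 1's backward pass.
        (loss + reg_loss).backward(retain_graph=True)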

0 reactions
zanshuxun commented, Dec 6, 2020

We have solved this issue in v0.2.4; please run pip install -U deepctr-torch to upgrade.
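
A sketch of how an upgrade can remove the need for the flag, assuming the fix recomputes the regularization term inside the training loop; this is a plausible reading of the change, not confirmed from the diff:

    import torch

    w = torch.randn(3, requires_grad=True)

    for step in range(2):
        x = torch.randn(3)
        loss = (w * x).sum()
        reg_loss = (w ** 2).sum()     # rebuilt each step: fresh graph
        (loss + reg_loss).backward()  # no retain_graph needed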

Read more comments on GitHub >

Top Results From Across the Web

What does the parameter retain_graph mean in the Variable's ...
As long as you use retain_graph=True in your backward method, you can do backward any time you want: d.backward(retain_graph=True) # fine ...
Read more >
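
A minimal sketch of the behavior this answer describes; note that gradients accumulate into a.grad on every call:

    import torch

    a = torch.ones(2, requires_grad=True)
    d = (a * 3).sum()

    d.backward(retain_graph=True)   # fine
    d.backward(retain_graph=True)   # fine: the graph was kept alive
    d.backward()                    # last call may let the graph be freed
    print(a.grad)                   # tensor([9., 9.]): 3 accumulated per call
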
Why does ".backward(retain_graph=True)" give different ...
Another way to solve this is to use autograd.grad(), which returns the gradients instead of accumulating them. ...
Read more >
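
A sketch of the torch.autograd.grad alternative: it returns the gradient as a value instead of accumulating it into .grad:

    import torch

    a = torch.randn(2, requires_grad=True)
    loss = (a ** 2).sum()

    (grad_a,) = torch.autograd.grad(loss, a, retain_graph=True)
    print(grad_a)   # equals 2 * a
    print(a.grad)   # still None: nothing was accumulated
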
Use retain_graph True in a pytorch lightning model
I would like to know the correct way to include retain_graph=True in a pytorch_lightning model. Currently, I am using: def on_backward(self ...
Read more >
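
One commonly suggested approach, stated here as an assumption about current PyTorch Lightning rather than taken from the linked answer, is to override LightningModule.backward:

    import pytorch_lightning as pl

    class MyModel(pl.LightningModule):
        # Override the hook Lightning calls to run the backward pass,
        # so every call retains the graph.
        def backward(self, loss, *args, **kwargs):
            loss.backward(retain_graph=True)
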
Trying to backward through the graph a second time, but the ...
Hello Guys, error Traceback (most recent call last): File "", line 1, in runfile('C:/Users/SRIKANTH R/.spyder-py3/pyro_bayesian_ternary.py', ...
Read more >
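
The error in this thread can be reproduced in a few lines: the first backward frees the graph's intermediate buffers, so a second backward over the same graph fails:

    import torch

    x = torch.randn(3, requires_grad=True)
    y = (x * 2).sum()

    y.backward()  # first call frees the graph
    try:
        # Second call raises: "Trying to backward through the graph a
        # second time ..."
        y.backward()
    except RuntimeError as e:
        print(e)
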
understanding backward() in Pytorch-checkpoint
Define a scalar variable and set requires_grad to true to add it to the backward path for computing gradients. It is actually very simple...
Read more >
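
The basic setup this notebook describes, sketched:

    import torch

    # A scalar leaf with requires_grad=True joins the autograd graph.
    x = torch.tensor(2.0, requires_grad=True)
    y = x ** 3

    y.backward()
    print(x.grad)   # dy/dx = 3 * x**2 = 12.0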
