_cudnn_rnn_backward is not implemented
See original GitHub issue.
Hi, I have the following error in my code, even though I am using torch 1.3: `RuntimeError: derivative for _cudnn_rnn_backward is not implemented`. I know this is a PyTorch-related error; I am wondering in which version of PyTorch it has been resolved.
Solved by using the following code:

```python
with torch.backends.cudnn.flags(enabled=False):
```
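As a rough sketch of what this fix looks like in context (the issue itself contains no runnable snippet), the pattern below assumes a gradient-penalty-style double backward through an LSTM on a CUDA device; the model shape and variable names are illustrative only:

```python
import torch
import torch.nn as nn

# Minimal sketch (not from the issue) of the failing pattern and the fix.
# The error surfaces when a second derivative is taken through a cuDNN RNN,
# e.g. via torch.autograd.grad(..., create_graph=True) for a gradient penalty.
# Assumes a CUDA device; on CPU the cuDNN path is never taken.
lstm = nn.LSTM(input_size=8, hidden_size=16).cuda()
x = torch.randn(5, 3, 8, device="cuda", requires_grad=True)

with torch.backends.cudnn.flags(enabled=False):  # fall back to the native RNN kernels
    out, _ = lstm(x)
    # First-order gradient, kept in the graph so it can be differentiated again.
    grad_x, = torch.autograd.grad(out.sum(), x, create_graph=True)
    penalty = grad_x.pow(2).sum()
    penalty.backward()  # double backward: raises the RuntimeError if cuDNN is enabled
```

Because the forward pass runs inside the context, autograd records the non-cuDNN RNN path, which does support double backward.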
Issue Analytics
- Created: 4 years ago
- Reactions: 4
- Comments: 8 (2 by maintainers)
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
Hi, I figured it out. We should place

```python
with torch.backends.cudnn.flags(enabled=False):
```

before the creation of the model, and the context should last until after the higher-order derivatives are computed.

@nooralahzadeh Yes, I did. The complete function is as follows:
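(The function referenced above was not captured in this scrape. A minimal sketch of the placement the commenter describes, assuming a hypothetical training step with a gradient-penalty term; `train_step`, the model dimensions, and the loss are illustrative, not the commenter's code:)

```python
import torch
import torch.nn as nn

def train_step():
    # Per the comment above: enter the context before the model is created
    # and keep it open until after the higher-order derivatives are computed.
    with torch.backends.cudnn.flags(enabled=False):
        model = nn.GRU(input_size=4, hidden_size=8).cuda()
        x = torch.randn(10, 2, 4, device="cuda", requires_grad=True)
        out, _ = model(x)
        grad_x, = torch.autograd.grad(out.sum(), x, create_graph=True)
        loss = out.sum() + grad_x.pow(2).mean()  # task loss + gradient penalty
        loss.backward()
        return loss.item()
```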