
RNN GRU cell complains about reusing weights

See original GitHub issue

When I run this example I get the following error:

    ValueError: Attempt to reuse RNNCell <tensorflow.contrib.rnn.python.ops.core_rnn_cell_impl.GRUCell object at 0x115eb2630> with a different variable scope than its first use. First use of cell was with scope 'rnn/multi_rnn_cell/cell_0/gru_cell', this attempt is with scope 'rnn/multi_rnn_cell/cell_1/gru_cell'. Please create a new instance of the cell if you would like it to use a different set of weights. If before you were using: MultiRNNCell([GRUCell(...)] * num_layers), change to: MultiRNNCell([GRUCell(...) for _ in range(num_layers)]). If before you were using the same cell instance as both the forward and reverse cell of a bidirectional RNN, simply create two instances (one for forward, one for reverse). In May 2017, we will start transitioning this cell's behavior to use existing stored weights, if any, when it is called with scope=None (which can lead to silent model degradation, so this error will remain until then.)

I tried updating line 79 in rnn_train.py to multicell = rnn.MultiRNNCell([dropcell for _ in range(NLAYERS)], state_is_tuple=False), but that did not change anything. I am running TensorFlow 1.1.0 on macOS with Python 3.5 in a conda environment.
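For context, here is a minimal sketch of the construction the error message asks for, using the tf.contrib.rnn API from TensorFlow 1.1. INTERNALSIZE and pkeep below are illustrative placeholders (only NLAYERS and dropcell appear in the snippet above). The point is that [dropcell for _ in range(NLAYERS)] repeats the same cell object NLAYERS times, so the error persists; each layer needs its own freshly constructed GRUCell (and DropoutWrapper):

    import tensorflow as tf
    from tensorflow.contrib import rnn

    INTERNALSIZE = 512   # hidden units per layer (illustrative value)
    NLAYERS = 3          # number of stacked GRU layers
    pkeep = 0.8          # dropout keep probability (illustrative value)

    # Reuses one object: every layer points at the same GRUCell, so TF 1.1 raises
    # the "Attempt to reuse RNNCell ... with a different variable scope" error.
    # cell = rnn.GRUCell(INTERNALSIZE)
    # dropcell = rnn.DropoutWrapper(cell, input_keep_prob=pkeep)
    # multicell = rnn.MultiRNNCell([dropcell for _ in range(NLAYERS)], state_is_tuple=False)

    # Creates a new instance per layer: each layer gets its own set of weights.
    cells = [rnn.GRUCell(INTERNALSIZE) for _ in range(NLAYERS)]
    dropcells = [rnn.DropoutWrapper(c, input_keep_prob=pkeep) for c in cells]
    multicell = rnn.MultiRNNCell(dropcells, state_is_tuple=False)

This is a sketch of the general pattern suggested by the error message, not the exact change that was made in rnn_train.py.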

Issue Analytics

  • State: closed
  • Created: 6 years ago
  • Comments: 5 (2 by maintainers)

Top GitHub Comments

1 reaction
Bgs4269 commented, May 23, 2017

Fix confirmed!

0 reactions
martin-gorner commented, May 23, 2017

fixed!

Read more comments on GitHub >

Top Results From Across the Web

Why are the weights of RNN/LSTM networks shared across ...
The accepted answer focuses on the practical side of the question: it would require a lot of resources if the parameters are not...
Read more >
Multiple RNN in tensorflow - deep learning - Stack Overflow
but I get the following error: "Attempt to have a second RNNCell use the weights of a variable scope that already has weights"...
Read more >
Illustrated Guide to LSTM's and GRU's: A step by step ...
Gradients are values used to update a neural network's weights. ... Let's look at a cell of the RNN to see how you...
Read more >
RNN vs GRU - Weights & Biases
Here we seek to compare two types of cells, which can be used in RNN translation: RNNCells and GRUCells. The two models...
Read more >
Recurrent Neural Network-RNN - DataDrivenInvestor
In RNN, we share the weights and feed the output back into the inputs recursively. This recurrent formulation helps process sequential data.
Read more >
