Bidirectional RNNs?
I have an idea for adding bidirectional RNNs to Keras and I’m curious what the Keras devs think of it.
- Add a `Reverse` layer which simply slices its input tensor along the timestep dimension (e.g. `X_input[:, ::-1]`). This would preserve masking and slice masks in reverse as well.
- Add a `Bidirectional` class which takes an RNN class as a parameter and internally constructs the forward and backward instances, along with their merge. The backward instance can be `Reverse(RNN(Reverse(x)))`.
How does that sound?
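As a rough sketch of the slicing idea above (a NumPy stand-in, not actual Keras code; `reverse_timesteps` is a hypothetical name):

```python
import numpy as np

def reverse_timesteps(x):
    """Reverse a (batch, timesteps, features) array along the timestep axis,
    i.e. the x[:, ::-1] slice proposed for the hypothetical Reverse layer."""
    return x[:, ::-1]

# Two sequences of three timesteps with one feature each.
X = np.arange(6).reshape(2, 3, 1)
X_rev = reverse_timesteps(X)

# The same slice applies to a boolean mask, so masking is preserved:
# padding flags travel with their timesteps when the sequence is flipped.
mask = np.array([[True, True, False],
                 [True, True, True]])
mask_rev = mask[:, ::-1]

# Reversing twice recovers the original ordering, which is what makes the
# Reverse(RNN(Reverse(x))) construction line up with the forward output.
assert (reverse_timesteps(X_rev) == X).all()
```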
Issue Analytics
- Created 8 years ago
- Comments: 13 (9 by maintainers)
Top Results From Across the Web

- 10.4. Bidirectional Recurrent Neural Networks: In bidirectional RNNs, the hidden state for each time step is simultaneously determined by the data prior to and after the current time…
- Bidirectional recurrent neural networks (Wikipedia): Bidirectional recurrent neural networks (BRNN) connect two hidden layers of opposite directions to the same output. With this form of generative deep…
- Understanding Bidirectional RNN in PyTorch (Ceshine Lee): Bidirectional recurrent neural networks (RNN) are really just putting two independent RNNs together. The input sequence is fed in normal time…
- Bidirectional Recurrent Neural Networks Definition (DeepAI): Bidirectional recurrent neural networks (BRNN) connect two hidden layers running in opposite directions to a single output, allowing them to receive…
- Bidirectional recurrent neural networks (IEEE Xplore): In the first part of this paper, a regular recurrent neural network (RNN) is extended to a…
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
Figured I’d add my 2 cents on this, using the functional API:

If `return_sequences=True` on the RNNs, change `concat_axis=2` for time-wise concatenation. However, due to the `merge` layer, this implementation doesn’t let you use `Masking`. Is there a way to do concatenation that supports masking? This seems like a common enough use case.

Done, right?
https://keras.io/layers/wrappers/#bidirectional
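As a quick sanity check of the concatenation axes discussed in the comments above, here is a NumPy sketch (the shapes are illustrative assumptions, not taken from the original code):

```python
import numpy as np

batch, timesteps, units = 4, 5, 8

# With return_sequences=True each direction yields (batch, timesteps, units);
# forward and backward outputs are concatenated per timestep on axis 2.
fwd_seq = np.zeros((batch, timesteps, units))
bwd_seq = np.zeros((batch, timesteps, units))
merged_seq = np.concatenate([fwd_seq, bwd_seq], axis=2)

# Without return_sequences each direction yields only the last hidden state,
# shape (batch, units), so concatenation happens on axis 1 instead.
fwd_last = np.zeros((batch, units))
bwd_last = np.zeros((batch, units))
merged_last = np.concatenate([fwd_last, bwd_last], axis=1)

print(merged_seq.shape)   # (4, 5, 16)
print(merged_last.shape)  # (4, 16)
```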