
Bidirectional RNNs?


I have an idea for adding bidirectional RNNs to Keras and I’m curious what the Keras devs think of it.

  • Add a Reverse layer which simply slices its input tensor along the timestep dimension (e.g. X_input[:, ::-1]). This would preserve masking, slicing masks in reverse as well.
  • Add a Bidirectional class which takes an RNN class as a parameter and internally constructs the forward and backward instances, along with their merge. The backward instance can be Reverse(RNN(Reverse(x))).

How does that sound?
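The proposed composition can be illustrated with a small NumPy sketch (the reverse function here is hypothetical, standing in for the proposed Reverse layer; it is not an existing Keras API):

```python
import numpy as np

# Toy batch: 2 sequences, 4 timesteps, 3 features.
x = np.arange(24).reshape(2, 4, 3)

def reverse(t):
    # The proposed Reverse layer just flips the timestep axis.
    return t[:, ::-1]

# Bidirectional(RNN) would then compute:
#   forward  = RNN(x)
#   backward = reverse(RNN(reverse(x)))
# so backward[:, t] summarizes the sequence from the end back to step t,
# and forward/backward outputs stay aligned per timestep for merging.
assert np.array_equal(reverse(reverse(x)), x)        # reversal is its own inverse
assert np.array_equal(reverse(x)[:, 0], x[:, -1])    # first reversed step = last original step
```

The double reversal around the backward RNN is what keeps its output aligned with the forward RNN's timesteps, so the two can be concatenated element-wise.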

Issue Analytics

  • State: closed
  • Created: 8 years ago
  • Comments: 13 (9 by maintainers)

Top GitHub Comments

3 reactions
codekansas commented, Apr 18, 2016

Figured I’d add my 2 cents on this, using the functional API:

from keras.layers import Input, LSTM, merge

# inputs: (batch, timesteps, input_dim)
inputs = Input(shape=(timesteps, input_dim))
f_lstm = LSTM(n_lstm_dims)
b_lstm = LSTM(n_lstm_dims, go_backwards=True)
f_out = f_lstm(inputs)
b_out = b_lstm(inputs)
together = merge([f_out, b_out], mode='concat', concat_axis=1)

If return_sequences=True on the RNNs, change concat_axis=2 for time-wise concatenation.

However, due to the merge layer, this implementation doesn’t let you use Masking. Is there a way to do concatenation that supports masking? This seems like a common enough use case.
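The concat-axis distinction in the comment above comes down to output shapes, which can be checked with a NumPy sketch (batch/timestep/dimension sizes here are made up for illustration):

```python
import numpy as np

batch, timesteps, dims = 2, 4, 3

# With return_sequences=False each RNN emits only its last state: (batch, dims).
# The feature axis is axis 1, so that's where the two directions concatenate.
f_last = np.zeros((batch, dims))
b_last = np.ones((batch, dims))
merged_last = np.concatenate([f_last, b_last], axis=1)
assert merged_last.shape == (batch, 2 * dims)

# With return_sequences=True each RNN emits the full sequence:
# (batch, timesteps, dims). The feature axis shifts to axis 2.
f_seq = np.zeros((batch, timesteps, dims))
b_seq = np.ones((batch, timesteps, dims))
merged_seq = np.concatenate([f_seq, b_seq], axis=2)
assert merged_seq.shape == (batch, timesteps, 2 * dims)
```

In both cases the directions are stacked along the feature axis; only the position of that axis changes with return_sequences.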

1 reaction
carlthome commented, Sep 12, 2016

