
Bidirectional(LSTM(..., stateful=True)) crashes

See original GitHub issue
from keras.layers import Input, Dense, LSTM, Bidirectional
from keras.models import Model

nb_samples = 1
nb_timesteps = 1
nb_features = 1
nb_hidden = 1
nb_classes = 2   # undefined in the original snippet; any value works

i = Input(batch_shape=(nb_samples, nb_timesteps, nb_features))
o = Bidirectional(LSTM(nb_hidden, stateful=True))(i)  # raises the TypeError below
o = Dense(nb_classes, activation='softmax')(o)
model = Model(i, o)
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-14-f240ae219281> in <module>()
      5 
      6 i = Input(batch_shape=(nb_samples, nb_timesteps, nb_features))
----> 7 o = Bidirectional(LSTM(nb_hidden, stateful=True))(i)
      8 o = Dense(nb_classes, activation='softmax')(o)
      9 model = Model(i, o)

/home/carl/anaconda3/lib/python3.5/site-packages/keras/layers/wrappers.py in __init__(self, layer, merge_mode, weights, **kwargs)
    164         config = layer.get_config()
    165         config['go_backwards'] = not config['go_backwards']
--> 166         self.backward_layer = layer.__class__.from_config(config)
    167         self.forward_layer.name = 'forward_' + self.forward_layer.name
    168         self.backward_layer.name = 'backward_' + self.backward_layer.name

/home/carl/anaconda3/lib/python3.5/site-packages/keras/engine/topology.py in from_config(cls, config)
    869                 output of get_config.
    870         '''
--> 871         return cls(**config)
    872 
    873     def count_params(self):

/home/carl/anaconda3/lib/python3.5/site-packages/keras/layers/recurrent.py in __init__(self, output_dim, init, inner_init, forget_bias_init, activation, inner_activation, W_regularizer, U_regularizer, b_regularizer, dropout_W, dropout_U, **kwargs)
    675         if self.dropout_W or self.dropout_U:
    676             self.uses_learning_phase = True
--> 677         super(LSTM, self).__init__(**kwargs)
    678 
    679     def build(self, input_shape):

/home/carl/anaconda3/lib/python3.5/site-packages/keras/layers/recurrent.py in __init__(self, weights, return_sequences, go_backwards, stateful, unroll, consume_less, input_dim, input_length, **kwargs)
    163         if self.input_dim:
    164             kwargs['input_shape'] = (self.input_length, self.input_dim)
--> 165         super(Recurrent, self).__init__(**kwargs)
    166 
    167     def get_output_shape_for(self, input_shape):

/home/carl/anaconda3/lib/python3.5/site-packages/keras/engine/topology.py in __init__(self, **kwargs)
    323             # to insert before the current layer
    324             if 'batch_input_shape' in kwargs:
--> 325                 batch_input_shape = tuple(kwargs['batch_input_shape'])
    326             elif 'input_shape' in kwargs:
    327                 batch_input_shape = (None,) + tuple(kwargs['input_shape'])

TypeError: 'NoneType' object is not iterable
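The traceback shows what goes wrong: Bidirectional clones the wrapped layer by round-tripping it through layer.get_config() and from_config(), and because the stateful LSTM was constructed without an explicit input shape, the copied config carries batch_input_shape=None, so tuple(kwargs['batch_input_shape']) fails. One plausible workaround (a sketch only, untested against this exact Keras version) is to give the wrapped LSTM the batch_input_shape itself, so the cloned config is complete:

from keras.layers import Input, Dense, LSTM, Bidirectional
from keras.models import Model

nb_samples, nb_timesteps, nb_features, nb_hidden, nb_classes = 1, 1, 1, 1, 2

# Sketch: give the wrapped LSTM an explicit batch_input_shape so that the
# config Bidirectional copies when building the backward layer contains a
# real shape instead of batch_input_shape=None.
lstm = LSTM(nb_hidden, stateful=True,
            batch_input_shape=(nb_samples, nb_timesteps, nb_features))

i = Input(batch_shape=(nb_samples, nb_timesteps, nb_features))
o = Bidirectional(lstm)(i)
o = Dense(nb_classes, activation='softmax')(o)
model = Model(i, o)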

Issue Analytics

  • State: closed
  • Created: 7 years ago
  • Comments: 9 (5 by maintainers)

Top GitHub Comments

4 reactions
mobeets commented, Aug 18, 2017

Does anyone understand what is happening for a stateful bidirectional layer? It doesn’t crash now, but I’m not sure I understand how the output would make any sense.
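For anyone trying this today, here is a minimal sketch (assuming a TF 2.x-era tf.keras, where this builds and runs) of the situation mobeets describes. It executes without error, but note the semantic oddity: both directions carry state across batches, yet the state carried by the backward layer comes from the start of the previous batch, which is not the timestep adjacent to where the next backward pass begins.

import numpy as np
import tensorflow as tf

# Stateful layers need a fixed batch size, hence batch_size=1 on the Input.
i = tf.keras.Input(shape=(3, 1), batch_size=1)
o = tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(4, stateful=True))(i)
o = tf.keras.layers.Dense(2, activation='softmax')(o)
model = tf.keras.Model(i, o)

x = np.random.rand(1, 3, 1).astype('float32')
print(model.predict(x))  # runs; whether the carried state is meaningful is the question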

3 reactions
Russ09 commented, May 3, 2017

I haven’t revisited this recently, but this issue suggests that Keras doesn’t handle it and crashes, as expected. However, wouldn’t a warning that stateful bidirectional layers don’t work be more appropriate?
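The guard Russ09 is asking for might look something like this. This is a hypothetical sketch, not actual Keras code; check_stateful_wrap and its message are invented for illustration:

import warnings

def check_stateful_wrap(layer):
    # 'layer' is the RNN instance about to be wrapped in Bidirectional.
    # Hypothetical check: stateful wrapped layers crash in this Keras
    # version, so warn up front instead of failing deep inside from_config().
    if getattr(layer, 'stateful', False):
        warnings.warn('Bidirectional does not support stateful RNN layers '
                      'in this version of Keras; expect a crash or '
                      'undefined behaviour.')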

Read more comments on GitHub >

Top Results From Across the Web

How to Develop a Bidirectional LSTM For Sequence ...
LSTM with reversed input sequences (e.g. you can do this by setting the “go_backwards” argument of the LSTM layer to “True”); Bidirectional LSTM....
Read more >
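A minimal sketch of the two options that result describes (assuming a TF 2.x-era tf.keras; the layer sizes are arbitrary):

import tensorflow as tf

i = tf.keras.Input(shape=(10, 1))

# Option 1: a single LSTM that reads the sequence back-to-front.
backwards_only = tf.keras.layers.LSTM(8, go_backwards=True)(i)

# Option 2: Bidirectional runs a forward copy and a backward copy of the
# wrapped layer and merges their outputs (concatenation by default).
both_directions = tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(8))(i)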
python - Keras Bidirectional LSTMs - Stack Overflow
The image you link stacks 2 bidirectional LSTMs (or more) over a sequence input and then adds a dense layer...
Read more >
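The stacking that answer describes, as a sketch (shapes and sizes are assumed for illustration): every recurrent layer except the last needs return_sequences=True so the next layer still receives a sequence.

import tensorflow as tf

i = tf.keras.Input(shape=(20, 8))
x = tf.keras.layers.Bidirectional(
    tf.keras.layers.LSTM(16, return_sequences=True))(i)  # keeps the time axis
x = tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(16))(x)  # collapses it
o = tf.keras.layers.Dense(3, activation='softmax')(x)
model = tf.keras.Model(i, o)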
My Keras bidirectional LSTM model is giving terrible predictions
I mean the backward layer has to predict the latest value first, and only after predicting it does it see the sequence, which gives the...
Read more >
Text classification with an RNN - TensorFlow
The main disadvantage of a bidirectional RNN is that you can't efficiently stream predictions as words are being added to the end. After...
Read more >
NLP a Gentle Introduction (LSTM, Word2Vec, BERT) | Kaggle
At the end it is about Bidirectional Encoder Representations from Transformers (BERT). ... return_sequences = True, stateful=False, recurrent_dropout = 0.4, ...
Read more >
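The layer configuration quoted from that notebook, reconstructed as a sketch (the unit count and surrounding model are assumed, not the notebook’s exact values):

import tensorflow as tf

layer = tf.keras.layers.Bidirectional(
    tf.keras.layers.LSTM(64,
                         return_sequences=True,   # emit the whole sequence
                         stateful=False,          # reset state every batch
                         recurrent_dropout=0.4))  # dropout on recurrent connections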
