Stuck on an issue?

Lightrun Answers was designed to reduce the constant googling that comes with debugging third-party libraries. It collects links to all the places you might be looking while hunting down a tough bug.

And, if you’re still stuck at the end, we’re happy to hop on a call to see how we can help out.

Unable to construct simple LSTM layer

See original GitHub issue

Using the latest Keras. Code used:

import keras.backend as K
from keras.layers.core import Dense
from keras.layers.recurrent import LSTM
from keras.models import Sequential

model = Sequential()
model.add(LSTM(10, input_shape=(2, 3)))   # 10 units over 2 timesteps of 3 features
model.add(Dense(1, activation='relu'))

I get the following error, although I'm able to use any non-recurrent layer as the input layer.

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-3-611aef6e5d2e> in <module>()
      1 model = Sequential()
----> 2 model.add(LSTM(10, input_shape=(2,3)))
      3 model.add(Dense(1, activation='relu'))

/home/user/anaconda2/lib/python2.7/site-packages/Keras-2.1.6-py2.7.egg/keras/engine/sequential.pyc in add(self, layer)
    164                     # and create the node connecting the current layer
    165                     # to the input layer we just created.
--> 166                     layer(x)
    167                     set_inputs = True
    168                 else:

/home/user/anaconda2/lib/python2.7/site-packages/Keras-2.1.6-py2.7.egg/keras/layers/recurrent.pyc in __call__(self, inputs, initial_state, constants, **kwargs)
    498 
    499         if initial_state is None and constants is None:
--> 500             return super(RNN, self).__call__(inputs, **kwargs)
    501 
    502         # If any of `initial_state` or `constants` are specified and are Keras

/home/user/anaconda2/lib/python2.7/site-packages/Keras-2.1.6-py2.7.egg/keras/engine/base_layer.pyc in __call__(self, inputs, **kwargs)
    452             # Actually call the layer,
    453             # collecting output(s), mask(s), and shape(s).
--> 454             output = self.call(inputs, **kwargs)
    455             output_mask = self.compute_mask(inputs, previous_mask)
    456 

/home/user/anaconda2/lib/python2.7/site-packages/Keras-2.1.6-py2.7.egg/keras/layers/recurrent.pyc in call(self, inputs, mask, training, initial_state)
   2110                                       mask=mask,
   2111                                       training=training,
-> 2112                                       initial_state=initial_state)
   2113 
   2114     @property

/home/user/anaconda2/lib/python2.7/site-packages/Keras-2.1.6-py2.7.egg/keras/layers/recurrent.pyc in call(self, inputs, mask, training, initial_state, constants)
    607                                              mask=mask,
    608                                              unroll=self.unroll,
--> 609                                              input_length=timesteps)
    610         if self.stateful:
    611             updates = []

/home/user/anaconda2/lib/python2.7/site-packages/Keras-2.1.6-py2.7.egg/keras/backend/tensorflow_backend.pyc in rnn(step_function, inputs, initial_states, go_backwards, mask, constants, unroll, input_length)
   2927             parallel_iterations=32,
   2928             swap_memory=True,
-> 2929             maximum_iterations=input_length)
   2930         last_time = final_outputs[0]
   2931         output_ta = final_outputs[1]

TypeError: while_loop() got an unexpected keyword argument 'maximum_iterations'

EDIT: It seems the Keras master branch does not work with TensorFlow 1.4.1. Is that expected behaviour? https://travis-ci.org/deeiip/keras/jobs/382610473#L1917
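
The TypeError itself points to a Keras/TensorFlow mismatch: this Keras build passes maximum_iterations to tf.while_loop, which older TensorFlow releases do not accept. A minimal guard, assuming the argument first appeared around TensorFlow 1.5, could fail fast before the model is built:

import tensorflow as tf
from distutils.version import LooseVersion

# Assumption: tf.while_loop only accepts maximum_iterations from roughly
# TensorFlow 1.5 onwards, so older installs hit the TypeError shown above.
if LooseVersion(tf.__version__) < LooseVersion('1.5.0'):
    raise RuntimeError('TensorFlow ' + tf.__version__ + ' is too old for this '
                       'Keras build; upgrade TensorFlow or pin an older Keras.')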

Issue Analytics

  • State: closed
  • Created: 5 years ago
  • Reactions: 5
  • Comments: 6

Top GitHub Comments

10 reactions
izuro commented, Jun 12, 2018

Had the same issue with Keras 2.2.0 and tensorflow-gpu 1.8.0. Reverted back to 2.1.6 and the model fits without error:

pip uninstall keras
pip install keras==2.1.6
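
A quick sanity check after the downgrade, as a minimal sketch (the combination expected here would be Keras 2.1.6 alongside TensorFlow 1.8.0):

import keras
import tensorflow as tf

# Confirm the downgrade took effect before rebuilding the model.
print('Keras: ' + keras.__version__)
print('TensorFlow: ' + tf.__version__)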

0 reactions
praveenraghuvanshi1512 commented, Oct 6, 2019

I am getting this error on the line if isinstance(identifier, tf.train.Optimizer):

NameError: name 'tf' is not defined

TF: 1.4.0, Keras: 2.1.6

Model:

import os
import importlib
import warnings
import tensorflow as tf
import keras
warnings.filterwarnings(action='ignore', category=DeprecationWarning)
def set_keras_backend(backend):
    from keras import backend as K

    if K.backend() != backend:
        os.environ['KERAS_BACKEND'] = backend
        importlib.reload(K)
        assert K.backend() == backend
set_keras_backend("tensorflow")

from keras.models import Sequential, load_model
from keras.layers import Dense, Embedding, LSTM, Bidirectional

# maxlen and the custom metrics (f1score, sensitivity, precision) are assumed
# to be defined elsewhere in the script.
def model(embedding_matrix, embed, len_distinct, len_label, epochs, batch_size, weight, vector_size):
    import tensorflow as tf
    model = Sequential()
    model.add(Embedding(len_distinct, vector_size, input_length=maxlen, weights=[embedding_matrix], trainable=True))
    model.add(Bidirectional(LSTM(100, return_sequences=False, dropout=0.2)))  # was False
    model.add(Dense(len_label, activation=tf.nn.softmax))
    adam = tf.train.AdamOptimizer(learning_rate=0.0001)
    model.compile(loss='categorical_crossentropy', optimizer=adam,
                  metrics=['accuracy', f1score, sensitivity, precision])
    return model

I am also getting the same error "name 'tf' is not defined" for a simple network. Tried import tensorflow as tf without success. TF: 1.8.0, Keras: 2.1.6.
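
One thing that has helped with similar "name 'tf' is not defined" reports (a hedged sketch, assuming the error comes from Keras having been imported before its backend was switched to TensorFlow) is to set KERAS_BACKEND before the first keras import instead of reloading keras.backend afterwards:

import os

# KERAS_BACKEND is read when keras is first imported, so set it before that
# import rather than reloading keras.backend later.
os.environ['KERAS_BACKEND'] = 'tensorflow'

import keras
import tensorflow as tf

print(keras.backend.backend())  # expected: 'tensorflow'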

Upgraded TensorFlow to the latest (2.0.0) and replaced the import 'from keras.models import Sequential' with 'from tensorflow.keras.models import Sequential' (the tensorflow prefix), and everything worked… 😊
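
For reference, here is the original two-layer model from this issue rewritten against the tf.keras API that ships with TensorFlow 2.x, a sketch of the import switch described above:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

# Same layers as the original report, built through tf.keras so the Keras
# and TensorFlow versions can never drift apart.
model = Sequential()
model.add(LSTM(10, input_shape=(2, 3)))
model.add(Dense(1, activation='relu'))
model.compile(loss='mse', optimizer='adam')
model.summary()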

Read more comments on GitHub >

Top Results From Across the Web

ValueError issue when using LSTM layer in tensorflow
I am trying to build up a very simple LSTM structure using padding and masking to learn how to train time series data....
Read more >
Choosing the right Hyperparameters for a simple LSTM using ...
In this article, I want to give some intuition on how to make some of the decisions like finding the right parameters while...
Read more >
Working with RNNs - Keras
Build a RNN model with nested input/output. Let's build a Keras model that uses a keras.layers.RNN layer and the custom cell we just...
Read more >
How To Code Your First LSTM Network In Keras
In this article, we will learn to implement a simple Recurrent Neural Network, called LSTM Network using Keras and MNIST dataset .
Read more >
tf.keras.layers.LSTM | TensorFlow v2.11.0
Can only be used when RNN layer is constructed with stateful = True . Args: states: Numpy arrays that contains the value for...
Read more >
