
Input 0 is incompatible with layer lstm_1: expected ndim=3, found ndim=4

See original GitHub issue

Here’s the code I’ve written:


import time

from keras.models import Sequential
from keras.layers import LSTM, Dropout, Dense, Activation

model = Sequential()
model.add(LSTM(150,
               input_shape=(64, 7, 339),
               return_sequences=False))
model.add(Dropout(0.2))

model.add(LSTM(
    200,
    return_sequences=True))
model.add(Dropout(0.2))

model.add(LSTM(
    150,
    return_sequences=True))
model.add(Dropout(0.2))

model.add(Dense(
    output_dim=1))
model.add(Activation('sigmoid'))

start = time.time()
model.compile(loss='mse', optimizer='rmsprop')
print('compilation time : ', time.time() - start)

model.fit(
    trainX,
    trainY_Buy,
    batch_size=64,
    nb_epoch=10,
    verbose=1,
    validation_split=0.05)

The error I'm getting is: ValueError: Input 0 is incompatible with layer lstm_1: expected ndim=3, found ndim=4, raised on this line: model.add(LSTM(150, input_shape=(64, 7, 339), return_sequences=False))

My X shape is (492, 7, 339) and my Y shape is (492,).

Does anyone have any ideas about what I'm doing wrong?

Issue Analytics

  • State: closed
  • Created: 6 years ago
  • Reactions: 1
  • Comments: 36 (4 by maintainers)

Top GitHub Comments

181 reactions
td2014 commented on Jul 22, 2017

@ajanaliz. You may need to set return_sequences=True in the first layer. Maybe that will solve it. I hope that works. Thanks.
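To illustrate that suggestion, here is a minimal sketch (not the asker's actual code) of why stacked LSTMs need return_sequences=True: without it, an LSTM emits a 2-D (batch, units) tensor, and the next LSTM, which expects a 3-D (batch, timesteps, features) input, fails with an "expected ndim=3" error. The input shape below assumes the batch dimension has already been dropped, as the next comment explains.

from keras.models import Sequential
from keras.layers import LSTM

# return_sequences=True keeps the first LSTM's output 3-D,
# i.e. (batch, 7 timesteps, 150 units), so a second LSTM can consume it.
model = Sequential()
model.add(LSTM(150, input_shape=(7, 339), return_sequences=True))
model.add(LSTM(200))  # receives (batch, 7, 150) -- OK
model.summary()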

112 reactions
td2014 commented on Jul 22, 2017

@ajanaliz. I took a quick look, and I believe that you need to remove the leading "64" from the input shape of the LSTM layer: input_shape=(64, 7, 339) --> input_shape=(7, 339). Keras' convention is that the batch dimension (the number of examples, which is not the same as timesteps) is omitted from the input_shape argument. The batching (number of examples per batch) is handled in the fit call. I hope that helps. Thanks.
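Putting both suggestions together, here is a hedged sketch of what the corrected model might look like. It uses random stand-in data with the shapes reported in the question (X: (492, 7, 339), Y: (492,)) and the Keras 2 spelling epochs instead of the older nb_epoch; treat it as an illustration under those assumptions rather than the issue's verified solution.

import numpy as np
from keras.models import Sequential
from keras.layers import LSTM, Dropout, Dense, Activation

# Stand-in data with the shapes reported in the question:
# X: (492 samples, 7 timesteps, 339 features), Y: (492,)
trainX = np.random.random((492, 7, 339))
trainY_Buy = np.random.randint(0, 2, size=(492,))

# The batch size (64) is dropped from input_shape and passed to fit()
# instead, and every LSTM except the last returns its sequence so the
# next LSTM still receives a 3-D input.
model = Sequential()
model.add(LSTM(150, input_shape=(7, 339), return_sequences=True))
model.add(Dropout(0.2))
model.add(LSTM(200, return_sequences=True))
model.add(Dropout(0.2))
model.add(LSTM(150, return_sequences=False))  # last LSTM -> 2-D output
model.add(Dropout(0.2))
model.add(Dense(1))
model.add(Activation('sigmoid'))
model.compile(loss='mse', optimizer='rmsprop')

# 'epochs' is the Keras 2 name for the question's 'nb_epoch'.
model.fit(trainX, trainY_Buy, batch_size=64, epochs=10,
          verbose=1, validation_split=0.05)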

Read more comments on GitHub >

Top Results From Across the Web

  • Input 0 is incompatible with layer lstm_13: expected ndim=3 ...
    I solved the problem by making the input size (95000, 360, 1) and the output size (95000, 22), and changing the input shape to (360, 1) in the code ...
  • Keras: ValueError: Input 0 is incompatible with layer lstm_1 ...
    Keras: ValueError: Input 0 is incompatible with layer lstm_1: expected ndim=3, found ndim=2 - Data Science Stack Exchange ...
  • Input 0 is incompatible with layer lstm: expected ndim=3 ...
    I'm using Keras 2.2 LSTM and want to build a recurrent network, but I can't find the reason for this error.
  • How to stack multiple LSTMs in Keras? - Wandb
    The solution is to add return_sequences=True to all LSTM layers except the last one, so that its output tensor has ndim=3 (i.e. batch ...
  • ValueError: Input 0 of layer sequential is ... - TensorFlow Forum
    I am working with the LSTM model and getting this error. ... layer sequential is incompatible with the layer: expected ndim=3, found ndim=2 ...
