Input 0 is incompatible with layer lstm_1: expected ndim=3, found ndim=4
Here’s the code I’ve written:
model.add(LSTM(150, input_shape=(64, 7, 339), return_sequences=False))
model.add(Dropout(0.2))
model.add(LSTM(200, return_sequences=True))
model.add(Dropout(0.2))
model.add(LSTM(150, return_sequences=True))
model.add(Dropout(0.2))
model.add(Dense(output_dim=1))
model.add(Activation('sigmoid'))
start = time.time()
model.compile(loss='mse', optimizer='rmsprop')
print('compilation time : ', time.time() - start)
model.fit(trainX, trainY_Buy, batch_size=64, nb_epoch=10, verbose=1, validation_split=0.05)
The error I’m getting is: ValueError: Input 0 is incompatible with layer lstm_1: expected ndim=3, found ndim=4, raised on this line: model.add(LSTM(150, input_shape=(64, 7, 339), return_sequences=False))
My X shape is (492, 7, 339) and my Y shape is (492,).
Does anyone have any ideas about what I’m doing wrong?
Issue Analytics
- Created 6 years ago
- Reactions: 1
- Comments: 36 (4 by maintainers)
@ajanaliz. You may need to set return_sequences=True in the first LSTM layer. Maybe that will solve it. I hope that works. Thanks.
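A minimal sketch of that suggestion (an illustration using the layer sizes from the post above, not the original poster’s exact code): every LSTM that feeds another LSTM needs return_sequences=True so it emits a 3-D (batch, timesteps, units) tensor, and only the last LSTM before the Dense layer returns a plain 2-D output. The input_shape here is written without the batch dimension, for the reason explained in the following comment.

from keras.models import Sequential
from keras.layers import LSTM, Dropout, Dense

model = Sequential()
# Feeds another LSTM, so it must return the full sequence: (batch, 7, 150).
model.add(LSTM(150, input_shape=(7, 339), return_sequences=True))
model.add(Dropout(0.2))
# Also feeds another LSTM: (batch, 7, 200).
model.add(LSTM(200, return_sequences=True))
model.add(Dropout(0.2))
# Last LSTM before Dense: return only the final timestep, (batch, 150).
model.add(LSTM(150, return_sequences=False))
model.add(Dropout(0.2))
model.add(Dense(1, activation='sigmoid'))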
@ajanaliz. I took a quick look, and I believe you need to remove the leading 64 from the input shape of the LSTM layer: input_shape=(64, 7, 339) --> input_shape=(7, 339). Keras’ convention is that the batch dimension (the number of examples, which is not the same as the number of timesteps) is omitted from the input_shape argument. The batching (number of examples per batch) is handled in the fit call. I hope that helps. Thanks.
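Putting the two suggestions together, here is a minimal corrected sketch. It assumes the shapes from the original post (trainX: (492, 7, 339), trainY_Buy: (492,)) and the Keras 2 API, where nb_epoch and output_dim from the original snippet became epochs and units:

import time
from keras.models import Sequential
from keras.layers import LSTM, Dropout, Dense, Activation

model = Sequential()
# input_shape describes one sample: (timesteps, features) = (7, 339).
# The 64 in the original code is the batch size and does not belong here.
model.add(LSTM(150, input_shape=(7, 339), return_sequences=True))
model.add(Dropout(0.2))
model.add(LSTM(200, return_sequences=True))
model.add(Dropout(0.2))
model.add(LSTM(150, return_sequences=False))  # last LSTM: 2-D output for Dense
model.add(Dropout(0.2))
model.add(Dense(1))
model.add(Activation('sigmoid'))

start = time.time()
model.compile(loss='mse', optimizer='rmsprop')
print('compilation time :', time.time() - start)

# Batching is configured here, not in input_shape.
model.fit(trainX, trainY_Buy, batch_size=64, epochs=10, verbose=1, validation_split=0.05)

The model only ever needs the shape of a single sample, (7, 339); the batch size of 64 belongs in the fit call.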