Error when replacing merge layer with concatenate layer
Hey,
I’m currently updating my code from Keras 1.0.8 to the latest version, 2.0.6. I switched a Merge layer for the new Concatenate layer, but I’m getting an error:
The first layer in a Sequential model must get an input_shape or batch_input_shape argument.
Simplified, my code looks like this:
from keras.models import Sequential
from keras.layers import Embedding, LSTM, Dense, Concatenate

LSTM_1 = Sequential()
LSTM_1.add(Embedding(2000, 100, weights=[emb_1], input_length=100, mask_zero=True))
LSTM_1.add(LSTM(100, input_shape=(1000, 100)))
LSTM_2 = Sequential()
LSTM_2.add(Embedding(5000, 100, weights=[emb_2], input_length=2000, mask_zero=True))
LSTM_2.add(LSTM(100, input_shape=(2000, 100)))
LSTM_3 = Sequential()
LSTM_3.add(Embedding(3000, 100, weights=[emb_3], input_length=500, mask_zero=True))
LSTM_3.add(LSTM(100, input_shape=(500, 100)))
merged_model = Sequential()
merged_model.add(Concatenate([LSTM_1, LSTM_2, LSTM_3]))
merged_model.add(Dense(2, activation='softmax'))
merged_model.compile('adam', 'categorical_crossentropy')
merged_model.fit([X_1, X_2, X_3], y, batch_size=200, epochs=10, verbose=1)
Instead of the Concatenate layer I had the following line:
merged_model.add(Merge([LSTM_1, LSTM_2, LSTM_3], mode='concat'))
The problem is that merged_model.summary() gives me the following with the old Merge layer and the latest Keras version:
Layer (type) Output Shape Param #
=================================================================
merge_1 (Merge) (None, 300) 0
_________________________________________________________________
dense_1 (Dense) (None, 2) 602
=================================================================
Total params: 10,943,302
Trainable params: 241,802
Non-trainable params: 10,701,500
Before I updated to the latest version, it was building the model correctly, with the LSTM layers shown inside the summary. Can someone explain to me what’s going wrong here?
Thanks!
Well, not much to share actually. I just used the functional API instead; see the sketch below. This should work.
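Roughly, a functional-API version of the model from the question might look like the following. It is only a sketch, not the exact snippet from that comment: it reuses the vocabulary sizes, sequence lengths, and pretrained weight matrices (emb_1, emb_2, emb_3), and assumes the training data X_1, X_2, X_3, y are defined as in the question.

from keras.models import Model
from keras.layers import Input, Embedding, LSTM, Dense, concatenate

# One input per branch; lengths match the Embedding input_length values above
in_1 = Input(shape=(100,), dtype='int32')
in_2 = Input(shape=(2000,), dtype='int32')
in_3 = Input(shape=(500,), dtype='int32')

# Each branch: Embedding with the pretrained weights, followed by an LSTM
x_1 = LSTM(100)(Embedding(2000, 100, weights=[emb_1], mask_zero=True)(in_1))
x_2 = LSTM(100)(Embedding(5000, 100, weights=[emb_2], mask_zero=True)(in_2))
x_3 = LSTM(100)(Embedding(3000, 100, weights=[emb_3], mask_zero=True)(in_3))

# Concatenate the three 100-dim branch outputs into one 300-dim vector
merged = concatenate([x_1, x_2, x_3])
out = Dense(2, activation='softmax')(merged)

model = Model(inputs=[in_1, in_2, in_3], outputs=out)
model.compile('adam', 'categorical_crossentropy')
model.fit([X_1, X_2, X_3], y, batch_size=200, epochs=10, verbose=1)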
I do believe there should be an appropriate, easy Sequential solution. Many available packages are written in Sequential, and if you want to do something tricky with them you are screwed.
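That said, the Sequential branches don't have to be thrown away entirely: in Keras 2 a Sequential model can be called on a tensor just like a layer. One possible middle ground (again a sketch, reusing the LSTM_1, LSTM_2, LSTM_3 branches and data from the question) is to keep the branches Sequential and express only the merge with the functional API:

from keras.models import Model
from keras.layers import Input, Dense, concatenate

# LSTM_1, LSTM_2 and LSTM_3 are the Sequential branches from the question;
# a Sequential model is callable on a tensor, like any other layer
in_1 = Input(shape=(100,), dtype='int32')
in_2 = Input(shape=(2000,), dtype='int32')
in_3 = Input(shape=(500,), dtype='int32')

merged = concatenate([LSTM_1(in_1), LSTM_2(in_2), LSTM_3(in_3)])
out = Dense(2, activation='softmax')(merged)

merged_model = Model(inputs=[in_1, in_2, in_3], outputs=out)
merged_model.compile('adam', 'categorical_crossentropy')
merged_model.fit([X_1, X_2, X_3], y, batch_size=200, epochs=10, verbose=1)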