Stuck on an issue?

Lightrun Answers was designed to reduce the constant googling that comes with debugging third-party libraries. It collects links to all the places you might be looking while hunting down a tough bug.

And, if you’re still stuck at the end, we’re happy to hop on a call to see how we can help out.

Error when replacing merge layer with concatenate layer

See original GitHub issue

Hey,

I’m currently updating my code from Keras 1.0.8 to the latest version, 2.0.6. I switched a Merge layer for the new Concatenate layer, but I’m getting an error:

The first layer in a Sequential model must get an input_shape or batch_input_shape argument.

Simplified, my code looks like this:

from keras.models import Sequential
from keras.layers import Embedding, LSTM, Dense, Concatenate

LSTM_1 = Sequential()
LSTM_1.add(Embedding(2000, 100, weights=[emb_1], input_length=100, mask_zero=True))
LSTM_1.add(LSTM(100, input_shape=(1000, 100)))

LSTM_2 = Sequential()
LSTM_2.add(Embedding(5000, 100, weights=[emb_2], input_length=2000, mask_zero=True))
LSTM_2.add(LSTM(100, input_shape=(2000, 100)))

LSTM_3 = Sequential()
LSTM_3.add(Embedding(3000, 100, weights=[emb_3], input_length=500, mask_zero=True))
LSTM_3.add(LSTM(100, input_shape=(500, 100)))

merged_model = Sequential()
merged_model.add(Concatenate([LSTM_1, LSTM_2, LSTM_3]))
merged_model.add(Dense(2, activation='softmax'))
merged_model.compile('adam', 'categorical_crossentropy')

merged_model.fit([X_1, X_2, X_3], y, batch_size=200, epochs=10, verbose=1)

Instead of the Concatenate layer, I previously had the following line: merged_model.add(Merge([LSTM_1, LSTM_2, LSTM_3], mode='concat'))
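
For reference, the Keras 2 merge layers such as Concatenate operate on tensors rather than on Sequential models, which is presumably why the new Sequential model above never receives any shape information. A minimal sketch of the tensor-based calling convention (the input sizes here are illustrative only, not taken from the code above):

from keras.layers import Input, Dense, Concatenate
from keras.models import Model

# Concatenate is applied to a list of tensors, not to models
x1 = Input(shape=(100,))
x2 = Input(shape=(100,))
merged = Concatenate()([x1, x2])
output = Dense(2, activation='softmax')(merged)
model = Model(inputs=[x1, x2], outputs=output)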

The problem is that merged_model.summary() gives me the following with the old Merge layer on the latest Keras version:


Layer (type)                 Output Shape              Param #   
=================================================================
merge_1 (Merge)              (None, 300)               0         
_________________________________________________________________
dense_1 (Dense)              (None, 2)                 602       
=================================================================
Total params: 10,943,302
Trainable params: 241,802
Non-trainable params: 10,701,500

Before I updated to the latest version, it was building the model correctly with the LSTM layers inside.

Can someone explain to me what’s going wrong here?

Thanks!

Issue Analytics

  • State: closed
  • Created 6 years ago
  • Comments: 8

Top GitHub Comments

9 reactions
v1nc3nt27 commented, Dec 4, 2017

Well, not much to share, actually. I just used the functional API instead, like this:

from keras.layers.merge import concatenate
from keras.layers import Embedding, Input, LSTM
from keras.models import Model
from keras.layers.core import Dense
from keras import optimizers

# First LSTM
input_1 = Input(shape=(SEQ_LENGTH,), dtype='int32')
embedding_1 = Embedding(input_dim=len(EMBEDDING_FILE), output_dim=EMBEDDING_DIM, weights=[EMBEDDING_FILE], input_length=SEQ_LENGTH, mask_zero=True, trainable=True)(input_1)
LSTM_1 = LSTM(EMBEDDING_DIM, batch_input_shape=(batch_size, SEQ_LENGTH, EMBEDDING_DIM), input_shape=(SEQ_LENGTH, EMBEDDING_DIM))(embedding_1)

# Second LSTM
input_2 = Input(shape=(SEQ_LENGTH,), dtype='int32')
embedding_2 = Embedding(input_dim=len(EMBEDDING_FILE), output_dim=EMBEDDING_DIM, weights=[EMBEDDING_FILE], input_length=SEQ_LENGTH, mask_zero=True, trainable=True)(input_2)
LSTM_2 = LSTM(EMBEDDING_DIM, batch_input_shape=(batch_size, SEQ_LENGTH, EMBEDDING_DIM), input_shape=(SEQ_LENGTH, EMBEDDING_DIM))(embedding_2)

# Merge
merged = concatenate([LSTM_1, LSTM_2])

# Dense
dense_out = Dense(no_of_classes, activation='softmax')(merged)

# build and compile model
model = Model(inputs=[input_1, input_2], outputs=[dense_out])
model.compile(optimizers.Adam(), 'kullback_leibler_divergence', metrics=['accuracy'])

# train
model.fit([X_data_1, X_data_2], y_true)

This should work.
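
For a quick sanity check, here is a self-contained miniature of the same two-branch pattern; every size, name, and value below is a made-up placeholder rather than anything from the original code:

import numpy as np
from keras.layers.merge import concatenate
from keras.layers import Input, Embedding, LSTM, Dense
from keras.models import Model

inp_a = Input(shape=(10,), dtype='int32')
inp_b = Input(shape=(10,), dtype='int32')
emb = Embedding(input_dim=50, output_dim=8, mask_zero=True)  # shared embedding

out_a = LSTM(8)(emb(inp_a))
out_b = LSTM(8)(emb(inp_b))
merged = concatenate([out_a, out_b])   # shape: (batch, 16)
pred = Dense(2, activation='softmax')(merged)

model = Model(inputs=[inp_a, inp_b], outputs=pred)
model.compile('adam', 'categorical_crossentropy')

# train on random toy data just to confirm the shapes line up
X_a = np.random.randint(1, 50, size=(4, 10))
X_b = np.random.randint(1, 50, size=(4, 10))
y = np.eye(2)[np.random.randint(0, 2, size=4)]
model.fit([X_a, X_b], y, epochs=1, verbose=0)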

1 reaction
ylmeng commented, Oct 16, 2017

I do believe there should be an appropriate, easy Sequential solution. Many available packages are written with the Sequential API, and if you want to do something tricky with them, you are screwed.
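
If keeping the branches Sequential matters, one possible bridge (a sketch that mirrors the sizes from the issue’s code, not a confirmed recommendation) is to build each branch as a Sequential model and call it on an Input tensor, since Keras 2 models are callable like layers, doing only the merge functionally:

from keras.layers.merge import concatenate
from keras.layers import Input, Embedding, LSTM, Dense
from keras.models import Model, Sequential

# Branches stay Sequential; vocabulary sizes and lengths echo the issue
branch_1 = Sequential([Embedding(2000, 100, input_length=100, mask_zero=True),
                       LSTM(100)])
branch_2 = Sequential([Embedding(5000, 100, input_length=2000, mask_zero=True),
                       LSTM(100)])

# A Keras 2 model can be called on a tensor, just like a layer
in_1 = Input(shape=(100,), dtype='int32')
in_2 = Input(shape=(2000,), dtype='int32')
merged = concatenate([branch_1(in_1), branch_2(in_2)])

model = Model(inputs=[in_1, in_2], outputs=Dense(2, activation='softmax')(merged))
model.compile('adam', 'categorical_crossentropy')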

Read more comments on GitHub >

Top Results From Across the Web

Concatenate error: The added layer must be an instance of ...
I'm working on a snippet in Keras where I have two Sequential models that need to be merged in a ...
Read more >
Merging two layers - keras - Data Science Stack Exchange
Returns: A tensor, the concatenation of the inputs alongside axis axis. Since you are using the Functional API: from keras.layers import ...
Read more >
pandas.concat — pandas 1.5.2 documentation
Merge DataFrames by indexes or columns. Notes. The keys, levels, and names arguments are all optional. A walkthrough of how this method fits...
Read more >
Manage layers and groups in Photoshop - Adobe Support
Merge active: Select all the layers you want to merge whether they are vector or pixel layers, and go to Layer > Merge...
Read more >
Concatenate layer - Keras
tf.keras.layers.Concatenate(axis=-1, **kwargs). Layer that concatenates a list of inputs. It takes as input a list of tensors, all of the same shape except ......
Read more >
