ValueError: Error when checking target: expected dense_14 to have shape (None, 2) but got array with shape (928, 1)
I am working through the Keras transfer learning tutorial here: https://blog.keras.io/building-powerful-image-classification-models-using-very-little-data.html, using Keras with a TensorFlow backend. My data is made up of training data (499 and 443 images of classes 0 and 1) and validation data (101 and 103 images of classes 0 and 1).
When I try to run the block of code below, I receive the error:
ValueError: Error when checking target: expected dense_14 to have shape (None, 2) but got array with shape (928, 1)
My understanding of the structure is that my input of shape (928, 4, 4, 512) sets the input shape of the flatten_9 layer to (None, 8192), but I am confused as to why this causes an error at the dense_14 layer when the size of the hidden layers is already defined.
My model configuration is:
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
flatten_9 (Flatten)          (None, 8192)              0
_________________________________________________________________
dense_13 (Dense)             (None, 256)               2097408
_________________________________________________________________
dropout_7 (Dropout)          (None, 256)               0
_________________________________________________________________
dense_14 (Dense)             (None, 2)                 514
=================================================================
import numpy as np
from keras.models import Sequential
from keras.layers import Dense, Dropout, Flatten

# epochs, batch_size and top_model_weights_path are defined earlier in the tutorial
def train_top_model():
    train_data = np.load(open('bottleneck_features_train.npy', 'rb'))
    train_labels = np.array([0] * 499 + [1] * 443)
    validation_data = np.load(open('bottleneck_features_validation.npy', 'rb'))
    validation_labels = np.array([0] * 101 + [1] * 103)

    model = Sequential()
    model.add(Flatten(input_shape=train_data.shape[1:]))
    model.add(Dense(256, activation='relu'))
    model.add(Dropout(0.5))
    model.add(Dense(2, activation='sigmoid'))
    model.compile(optimizer='rmsprop',
                  loss='binary_crossentropy', metrics=['accuracy'])

    model.fit(train_data, train_labels,
              epochs=epochs,
              batch_size=batch_size,
              validation_data=(validation_data, validation_labels))
    model.save_weights(top_model_weights_path)
Issue Analytics
- State:
- Created: 6 years ago
- Reactions: 7
- Comments: 22
Use "keras.utils.np_utils.to_categorical" to convert your train_labels to categorical one-hot vectors.
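Because the last layer is Dense(2), the model expects targets of shape (None, 2), while the labels built in the question are a flat integer array. to_categorical produces the matching one-hot array; a minimal numpy sketch of the same encoding (the np.eye trick is just an illustration, not the Keras internals):

```python
import numpy as np

# Flat integer labels, as built in the question: shape (942,)
train_labels = np.array([0] * 499 + [1] * 443)

# keras.utils.np_utils.to_categorical yields the equivalent one-hot encoding:
one_hot = np.eye(2)[train_labels]  # shape (942, 2), matching Dense(2)'s target

print(one_hot.shape)  # (942, 2)
```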
In my case, the loss parameter when compiling the model was specified as "sparse_categorical_crossentropy". When I changed it to "categorical_crossentropy", the error was fixed.
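The two losses compute the same quantity but expect different label shapes: categorical_crossentropy takes one-hot targets of shape (batch, num_classes), while sparse_categorical_crossentropy takes integer labels of shape (batch,). A plain numpy sketch of the equivalence (the probabilities are made-up values, not outputs from the question's model):

```python
import numpy as np

# Hypothetical softmax outputs for a batch of 2 samples, 2 classes
probs = np.array([[0.9, 0.1],
                  [0.2, 0.8]])

# categorical_crossentropy: one-hot targets of shape (batch, 2)
targets = np.array([[1.0, 0.0],
                    [0.0, 1.0]])
cce = -np.mean(np.sum(targets * np.log(probs), axis=1))

# sparse_categorical_crossentropy: integer labels of shape (batch,)
int_labels = np.array([0, 1])
scce = -np.mean(np.log(probs[np.arange(len(int_labels)), int_labels]))

print(np.isclose(cce, scce))  # True: same loss, different label formats
```

So either one-hot the labels and use the categorical loss, or keep integer labels and use the sparse variant; mixing the two triggers exactly this kind of shape error.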