Keras Resnet50 implementation pooling options do nothing
Hello,
I was looking at the Resnet50 implementation bundled with Keras: https://github.com/fchollet/deep-learning-models/blob/master/resnet50.py
Supposedly there is an optional pooling toggle that does either no pooling, global average pooling, or global max pooling at the end of the network. However, looking at the code for that section:
```python
...
x = AveragePooling2D((7, 7), name='avg_pool')(x)
if include_top:
    x = Flatten()(x)
    x = Dense(classes, activation='softmax', name='fc1000')(x)
else:
    if pooling == 'avg':
        x = GlobalAveragePooling2D()(x)
    elif pooling == 'max':
        x = GlobalMaxPooling2D()(x)
```
It seems that no matter which option you pick, Global Average Pooling (GAP) is always applied, by means of the `x = AveragePooling2D((7, 7), name='avg_pool')(x)` line, which, if I understand correctly, does essentially the same thing as GAP by reducing the feature map down to a (1, 1, 2048) output.
The (optional) GlobalAveragePooling2D or GlobalMaxPooling2D operations after this line have nothing left to work with, since the output is already (1, 1) spatially; nothing can be averaged or max pooled at that point, which makes the optional toggle for them inoperable. Obviously the "no pooling" option is also non-functional because of this.
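The no-op can be checked numerically. A minimal NumPy sketch, assuming ResNet50's (7, 7, 2048) final feature map, with the array ops standing in for the Keras layers:

```python
import numpy as np

# Stand-in for the last conv block's output: 7x7 spatial, 2048 channels.
x = np.random.rand(7, 7, 2048)

# AveragePooling2D((7, 7)) collapses the whole spatial extent to 1x1,
# which is exactly what GlobalAveragePooling2D would produce:
avg = x.mean(axis=(0, 1), keepdims=True)       # shape (1, 1, 2048)

# Applying global max pooling afterwards has nothing left to reduce,
# so it just returns the averaged values unchanged:
after_max = avg.max(axis=(0, 1))               # shape (2048,)

assert np.allclose(after_max, x.mean(axis=(0, 1)))  # identical to plain GAP
```

In other words, `pooling='max'` ends up producing the same numbers as `pooling='avg'`, and `pooling=None` still yields a (1, 1, 2048) tensor instead of the raw (7, 7, 2048) features.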
I would suggest removing the offending AveragePooling2D line and letting the GlobalAveragePooling2D operation take care of this if requested by the user. The other options will then also function.
Issue Analytics
- State:
- Created 6 years ago
- Reactions: 3
- Comments: 9 (1 by maintainers)
I agree with @mxvs, `AveragePooling2D` should be removed if `pooling == None`.

Hello,
My point is that the AveragePooling2D(7,7) operation prevents the other options from working.
If you first perform AveragePooling2D((7, 7)) followed by GlobalMaxPooling2D(), you don't get max pooling at all, since a feature map that is already (1, 1) spatially has nothing left to pool.
The correct code applies AveragePooling2D only inside the include_top branch, so the else branch handles the pooling options on the raw conv output.
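A sketch of that fix, emulated in plain NumPy so the output shapes are easy to check (the real change would be in the Keras layer calls; the comments name the corresponding layers from resnet50.py, and the Dense classifier is elided):

```python
import numpy as np

def head(x, include_top=False, pooling=None):
    """Emulates the fixed network head on an (H, W, C) feature map."""
    if include_top:
        # AveragePooling2D((7, 7)) + Flatten(); in the real model,
        # Dense(classes, activation='softmax') would follow here.
        return x.mean(axis=(0, 1), keepdims=True).ravel()
    if pooling == 'avg':
        return x.mean(axis=(0, 1))    # GlobalAveragePooling2D()
    if pooling == 'max':
        return x.max(axis=(0, 1))     # GlobalMaxPooling2D()
    return x                          # pooling=None: raw conv output

feats = np.random.rand(7, 7, 2048)    # last conv block output
print(head(feats, include_top=True).shape)   # (2048,)
print(head(feats, pooling='avg').shape)      # (2048,)
print(head(feats, pooling='max').shape)      # (2048,), now a real max pool
print(head(feats).shape)                     # (7, 7, 2048), untouched
```

With AveragePooling2D moved under include_top, `pooling='max'` genuinely max-pools and `pooling=None` returns the unreduced conv features.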
This way you can get either: A) average pooling + top layer (as in the ResNet paper), B) GlobalAveragePooling without the top layer, C) GlobalMaxPooling without the top layer, or D) no pooling, simply the output of the last convolutional layer (as mentioned in the Keras documentation).
The current implementation prevents options C) and D) from working (you always get AveragePooling after the last conv layer even if you don't want it), and option B) currently amounts to just a flatten (average pooling a (1, 1) volume is a no-op).