finetune with different embedding size
Hi,
I want to fine-tune your pretrained model (r50) with an embedding size of 128. My starting point is your train_softmax.py:
CUDA_VISIBLE_DEVICES='0' python -u train_softmax.py --emb-size=128 --pretrained '../models/model-r50-am-lfw/model,0' --network r50 --loss-type 0 --margin-m 0.5 --data-dir ../datasets/faces_ms1m_112x112 --prefix ../models/model-r50-am-lfw/
This results in:
mxnet.base.MXNetError: [08:51:39] src/operator/nn/../tensor/../elemwise_op_common.h:123: Check failed: assign(&dattr, (*vec)[i]) Incompatible attr in node at 0-th output: expected [512], got [128]
I have tried removing the last fc layer (fc1), adding fc7 as done in your code, and then freezing all layers except fc7, but the dimensions still don't match.
Any advice on how to do this? Thanks.
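One way around the shape mismatch (a sketch of my own, not code from this repo) is to load the r50 checkpoint, throw away the weights belonging to the old 512-d embedding head and the old fc7 classification layer, and pass only the remaining backbone weights into training with allow_missing=True, so the new 128-d layers are initialized from scratch. The parameter names below (fc1, pre_fc1, fc7) are assumptions based on the usual insightface naming and may differ with your network config; the checkpoint path is the one from the command above.

import mxnet as mx

# Prefix and epoch of the pretrained checkpoint used in the command above.
pretrained_prefix = '../models/model-r50-am-lfw/model'
sym, arg_params, aux_params = mx.model.load_checkpoint(pretrained_prefix, 0)

# Drop every parameter tied to the old 512-d embedding head (fc1 / pre_fc1)
# and the old classification layer (fc7); their shapes no longer fit
# --emb-size 128 and cause the "expected [512], got [128]" error.
arg_params = {k: v for k, v in arg_params.items()
              if 'fc1' not in k and 'fc7' not in k}
aux_params = {k: v for k, v in aux_params.items()
              if 'fc1' not in k}

# Hand the filtered weights to the module built by train_softmax.py and let
# allow_missing=True initialize the new fc1/fc7 parameters from scratch, e.g.:
# model.fit(train_dataiter,
#           arg_params=arg_params,
#           aux_params=aux_params,
#           allow_missing=True,
#           ...)

With this, the backbone keeps the pretrained r50 weights while the embedding and margin layers are re-learned at the new size; freezing the backbone is then optional rather than required.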
@mpaffolter yes. Actually I have the same question. If there is no validation set, how can we know whether our model is overfitting?
@nttstar Could you please help answer this? Thank you. 😃
@mpaffolter @staceycy @nttstar same question. The comparison with state-of-the-art seems unfair.