Error while testing over pretrained model
I am using PyTorch v1.0 and Python 3.6, at the same commit as the release of the librispeech pretrained model.
When I try to run test.py with the librispeech pretrained model:
python3 test.py --model-path librispeech.pth --test-manifest an4_val_manifest.csv
I get this error:
RuntimeError: Error(s) in loading state_dict for DeepSpeech:
Missing key(s) in state_dict: "conv.seq_module.0.weight", "conv.seq_module.0.bias", "conv.seq_module.1.weight", "conv.seq_module.1.bias", "conv.seq_module.1.running_mean", "conv.seq_module.1.running_var", "conv.seq_module.3.weight", "conv.seq_module.3.bias", "conv.seq_module.4.weight", "conv.seq_module.4.bias", "conv.seq_module.4.running_mean", "conv.seq_module.4.running_var", "rnns.0.rnn.bias_ih_l0", "rnns.0.rnn.bias_hh_l0", "rnns.0.rnn.bias_ih_l0_reverse", "rnns.0.rnn.bias_hh_l0_reverse", "rnns.1.rnn.bias_ih_l0", "rnns.1.rnn.bias_hh_l0", "rnns.1.rnn.bias_ih_l0_reverse", "rnns.1.rnn.bias_hh_l0_reverse"
Unexpected key(s) in state_dict: "conv.0.weight", "conv.0.bias", "conv.1.weight", "conv.1.bias", "conv.1.running_mean", "conv.1.running_var", "conv.3.weight", "conv.3.bias", "conv.4.weight", "conv.4.bias", "conv.4.running_mean", "conv.4.running_var"
size mismatch for rnns.0.rnn.weight_ih_l0: copying a param with shape torch.Size([2400, 672]) from checkpoint, the shape in current model is torch.Size([2400, 1312]).
size mismatch for rnns.0.rnn.weight_ih_l0_reverse: copying a param with shape torch.Size([2400, 672]) from checkpoint, the shape in current model is torch.Size([2400, 1312]).
Can anyone please help me fix this error?
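For context, the missing keys ("conv.seq_module.N.*") versus the unexpected keys ("conv.N.*") suggest the current code wraps the conv stack in a seq_module container that the older checkpoint predates. Below is a minimal, hypothetical remapping sketch, assuming librispeech.pth stores its weights under a "state_dict" entry (which may not match your checkpoint layout). Note that strict=False can tolerate the missing RNN bias keys, but it cannot fix the weight_ih_l0 size mismatches, which point at a different audio/model configuration:

```python
import torch

# Hypothetical sketch: rename old-style "conv.N.*" keys to the newer
# "conv.seq_module.N.*" layout before loading. The checkpoint layout is an
# assumption -- adjust if the weights are not stored under "state_dict".
package = torch.load("librispeech.pth", map_location="cpu")
state_dict = package.get("state_dict", package)

remapped = {}
for key, value in state_dict.items():
    if key.startswith("conv.") and not key.startswith("conv.seq_module."):
        key = key.replace("conv.", "conv.seq_module.", 1)
    remapped[key] = value

# strict=False skips the missing RNN bias keys, but the size mismatch on
# rnns.0.rnn.weight_ih_l0 still requires a matching audio/model config.
# model.load_state_dict(remapped, strict=False)
```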
Top GitHub Comments
Hi,
Using commit 80a060f I got two other errors:
size mismatch for rnns.0.rnn.weight_ih_l0: copying a param of torch.Size([2400, 1312]) from checkpoint, where the shape is torch.Size([2400, 672]) in current model.
size mismatch for rnns.0.rnn.weight_ih_l0_reverse: copying a param of torch.Size([2400, 1312]) from checkpoint, where the shape is torch.Size([2400, 672]) in current model.
Lucian
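As a side note, the 672-versus-1312 mismatch in both directions is consistent with the spectrogram feature size differing between the checkpoint and the current audio settings (for example 8 kHz versus 16 kHz audio at a 20 ms window). Here is a rough sketch of how the RNN input size can be derived for a model of this shape, assuming two frequency-axis conv layers with kernel heights 41 and 21, stride 2, and padding 20 and 10; these values are assumptions, not confirmed against the exact commits involved:

```python
import math

def rnn_input_size(sample_rate, window_size_s, conv_channels=32):
    # Spectrogram height: sample_rate * window / 2 frequency bins, plus DC bin.
    size = int(math.floor((sample_rate * window_size_s) / 2) + 1)
    # Two assumed conv layers over the frequency axis:
    # kernel height 41, stride 2, padding 20, then kernel height 21, stride 2, padding 10.
    size = int(math.floor((size + 2 * 20 - 41) / 2 + 1))
    size = int(math.floor((size + 2 * 10 - 21) / 2 + 1))
    return size * conv_channels

print(rnn_input_size(16000, 0.02))  # 1312 -- matches the current model's expected shape
print(rnn_input_size(8000, 0.02))   # 672  -- matches the checkpoint's shape
```

If the two numbers in your error differ the same way, checking that the sample rate and window size in your test config match the ones the checkpoint was trained with is a reasonable first step.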
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.