size mismatch
Hey @dbolya,
I’m trying to use ResNet34 as a backbone, but the size mismatch error below shows up. Could you please tell me how to solve it?
RuntimeError: Error(s) in loading state_dict for ResNetBackbone:
size mismatch for layers.0.0.conv1.weight: copying a param with shape torch.Size([64, 64, 3, 3]) from checkpoint, the shape in current model is torch.Size([64, 64, 1, 1]).
size mismatch for layers.0.1.conv1.weight: copying a param with shape torch.Size([64, 64, 3, 3]) from checkpoint, the shape in current model is torch.Size([64, 256, 1, 1]).
size mismatch for layers.1.0.conv1.weight: copying a param with shape torch.Size([128, 64, 3, 3]) from checkpoint, the shape in current model is torch.Size([128, 256, 1, 1]).
size mismatch for layers.1.0.downsample.0.weight: copying a param with shape torch.Size([128, 64, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 256, 1, 1]).
size mismatch for layers.1.0.downsample.1.weight: copying a param with shape torch.Size([128]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for layers.1.0.downsample.1.bias: copying a param with shape torch.Size([128]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for layers.1.0.downsample.1.running_mean: copying a param with shape torch.Size([128]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for layers.1.0.downsample.1.running_var: copying a param with shape torch.Size([128]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for layers.1.1.conv1.weight: copying a param with shape torch.Size([128, 128, 3, 3]) from checkpoint, the shape in current model is torch.Size([128, 512, 1, 1]).
size mismatch for layers.2.0.conv1.weight: copying a param with shape torch.Size([256, 128, 3, 3]) from checkpoint, the shape in current model is torch.Size([256, 512, 1, 1]).
size mismatch for layers.2.0.downsample.0.weight: copying a param with shape torch.Size([256, 128, 1, 1]) from checkpoint, the shape in current model is torch.Size([1024, 512, 1, 1]).
size mismatch for layers.2.0.downsample.1.weight: copying a param with shape torch.Size([256]) from checkpoint, the shape in current model is torch.Size([1024]).
size mismatch for layers.2.0.downsample.1.bias: copying a param with shape torch.Size([256]) from checkpoint, the shape in current model is torch.Size([1024]).
size mismatch for layers.2.0.downsample.1.running_mean: copying a param with shape torch.Size([256]) from checkpoint, the shape in current model is torch.Size([1024]).
size mismatch for layers.2.0.downsample.1.running_var: copying a param with shape torch.Size([256]) from checkpoint, the shape in current model is torch.Size([1024]).
size mismatch for layers.2.1.conv1.weight: copying a param with shape torch.Size([256, 256, 3, 3]) from checkpoint, the shape in current model is torch.Size([256, 1024, 1, 1]).
size mismatch for layers.3.0.conv1.weight: copying a param with shape torch.Size([512, 256, 3, 3]) from checkpoint, the shape in current model is torch.Size([512, 1024, 1, 1]).
size mismatch for layers.3.0.downsample.0.weight: copying a param with shape torch.Size([512, 256, 1, 1]) from checkpoint, the shape in current model is torch.Size([2048, 1024, 1, 1]).
size mismatch for layers.3.0.downsample.1.weight: copying a param with shape torch.Size([512]) from checkpoint, the shape in current model is torch.Size([2048]).
size mismatch for layers.3.0.downsample.1.bias: copying a param with shape torch.Size([512]) from checkpoint, the shape in current model is torch.Size([2048]).
size mismatch for layers.3.0.downsample.1.running_mean: copying a param with shape torch.Size([512]) from checkpoint, the shape in current model is torch.Size([2048]).
size mismatch for layers.3.0.downsample.1.running_var: copying a param with shape torch.Size([512]) from checkpoint, the shape in current model is torch.Size([2048]).
size mismatch for layers.3.1.conv1.weight: copying a param with shape torch.Size([512, 512, 3, 3]) from checkpoint, the shape in current model is torch.Size([512, 2048, 1, 1]).
Thanks!
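For context, a minimal sketch that reproduces the same pattern of mismatches outside YOLACT, using torchvision's ResNets as stand-ins for ResNetBackbone (an assumption made purely for illustration): ResNet-34 is built from BasicBlocks with 3x3 convolutions, while the model the checkpoint is being loaded into uses Bottleneck-style blocks, as the 1x1 shapes and 2048-channel layers in the log indicate.

```python
from torchvision import models

# Illustration only: torchvision models stand in for YOLACT's ResNetBackbone.
# Comparing a ResNet-34 state_dict (BasicBlock, 3x3 convs) against a
# ResNet-50-shaped model (Bottleneck, 1x1/3x3/1x1 convs) surfaces the same
# kind of shape disagreements reported in the traceback above.
ckpt = models.resnet34(weights=None).state_dict()
model_state = models.resnet50(weights=None).state_dict()

for name, tensor in ckpt.items():
    current = model_state.get(name)
    if current is not None and current.shape != tensor.shape:
        print(f"{name}: checkpoint {tuple(tensor.shape)} vs model {tuple(current.shape)}")
```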
Top GitHub Comments
You can just try-except it and let it be; you don’t have to worry about the mismatch. Or, you can look up the architecture for ResNet-34 and find the `args` in `ResNetBackbone` that match it.

Yes, the instructions are for https://github.com/dbolya/yolact/issues/36, but they should work for any dimension mismatch due to `state_dict` loading. It’s either that, or you have to change `'args': ([3, 4, 6, 3],),` somehow to fit ResNet-34. Have a read of `ResNetBackbone` in `backbone.py`; `args` is passed in as the `layers` argument.
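A minimal sketch of the first suggestion (keep only the parameters whose shapes fit and load the rest non-strictly), assuming a plain PyTorch state_dict checkpoint. The `build_backbone()` call and the checkpoint path are placeholders for however you construct `ResNetBackbone` and wherever your ResNet-34 weights live, not YOLACT's exact API:

```python
import torch

# Sketch of the "let it be" approach: keep only the checkpoint entries whose
# shapes match the current model, then load them non-strictly so the
# incompatible layers simply stay at their random initialization.
backbone = build_backbone()                                    # placeholder: your ResNetBackbone instance
ckpt = torch.load("weights/resnet34.pth", map_location="cpu")  # placeholder checkpoint path

model_state = backbone.state_dict()
compatible = {k: v for k, v in ckpt.items()
              if k in model_state and model_state[k].shape == v.shape}

backbone.load_state_dict(compatible, strict=False)
print(f"loaded {len(compatible)} tensors, skipped {len(ckpt) - len(compatible)}")
```

The alternative in the comment is to make the backbone itself match ResNet-34's BasicBlock layout rather than the Bottleneck-style layout the current model uses; how to do that depends on what block types `backbone.py` exposes.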