
Loading DCUNet pretrained model (from JorisCos/DCUNet_Libri1Mix_enhsingle_16k)

See original GitHub issue

asteroid version: 0.4.4

Function call: model = DCUNet.from_pretrained("JorisCos/DCUNet_Libri1Mix_enhsingle_16k")

Error:

RuntimeError: Error(s) in loading state_dict for DCUNet:
	size mismatch for masker.decoders.7.deconv.re_module.weight: copying a param with shape torch.Size([180, 45, 7, 5]) from checkpoint, the shape in current model is torch.Size([180, 90, 7, 5]).
	size mismatch for masker.decoders.7.deconv.im_module.weight: copying a param with shape torch.Size([180, 45, 7, 5]) from checkpoint, the shape in current model is torch.Size([180, 90, 7, 5]).
	size mismatch for masker.decoders.7.norm.re_module.weight: copying a param with shape torch.Size([45]) from checkpoint, the shape in current model is torch.Size([90]).
	size mismatch for masker.decoders.7.norm.re_module.bias: copying a param with shape torch.Size([45]) from checkpoint, the shape in current model is torch.Size([90]).
	size mismatch for masker.decoders.7.norm.re_module.running_mean: copying a param with shape torch.Size([45]) from checkpoint, the shape in current model is torch.Size([90]).
	size mismatch for masker.decoders.7.norm.re_module.running_var: copying a param with shape torch.Size([45]) from checkpoint, the shape in current model is torch.Size([90]).
	size mismatch for masker.decoders.7.norm.im_module.weight: copying a param with shape torch.Size([45]) from checkpoint, the shape in current model is torch.Size([90]).
	size mismatch for masker.decoders.7.norm.im_module.bias: copying a param with shape torch.Size([45]) from checkpoint, the shape in current model is torch.Size([90]).
	size mismatch for masker.decoders.7.norm.im_module.running_mean: copying a param with shape torch.Size([45]) from checkpoint, the shape in current model is torch.Size([90]).
	size mismatch for masker.decoders.7.norm.im_module.running_var: copying a param with shape torch.Size([45]) from checkpoint, the shape in current model is torch.Size([90]).
	size mismatch for masker.decoders.8.deconv.re_module.weight: copying a param with shape torch.Size([90, 45, 1, 7]) from checkpoint, the shape in current model is torch.Size([135, 90, 1, 7]).
	size mismatch for masker.decoders.8.deconv.im_module.weight: copying a param with shape torch.Size([90, 45, 1, 7]) from checkpoint, the shape in current model is torch.Size([135, 90, 1, 7]).
	size mismatch for masker.decoders.8.norm.re_module.weight: copying a param with shape torch.Size([45]) from checkpoint, the shape in current model is torch.Size([90]).
	size mismatch for masker.decoders.8.norm.re_module.bias: copying a param with shape torch.Size([45]) from checkpoint, the shape in current model is torch.Size([90]).
	size mismatch for masker.decoders.8.norm.re_module.running_mean: copying a param with shape torch.Size([45]) from checkpoint, the shape in current model is torch.Size([90]).
	size mismatch for masker.decoders.8.norm.re_module.running_var: copying a param with shape torch.Size([45]) from checkpoint, the shape in current model is torch.Size([90]).
	size mismatch for masker.decoders.8.norm.im_module.weight: copying a param with shape torch.Size([45]) from checkpoint, the shape in current model is torch.Size([90]).
	size mismatch for masker.decoders.8.norm.im_module.bias: copying a param with shape torch.Size([45]) from checkpoint, the shape in current model is torch.Size([90]).
	size mismatch for masker.decoders.8.norm.im_module.running_mean: copying a param with shape torch.Size([45]) from checkpoint, the shape in current model is torch.Size([90]).
	size mismatch for masker.decoders.8.norm.im_module.running_var: copying a param with shape torch.Size([45]) from checkpoint, the shape in current model is torch.Size([90]).
	size mismatch for masker.output_layer.0.re_module.weight: copying a param with shape torch.Size([90, 1, 7, 1]) from checkpoint, the shape in current model is torch.Size([135, 1, 7, 1]).
	size mismatch for masker.output_layer.0.im_module.weight: copying a param with shape torch.Size([90, 1, 7, 1]) from checkpoint, the shape in current model is torch.Size([135, 1, 7, 1]).
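Mismatches like the ones listed above can be enumerated programmatically by comparing parameter shapes in the checkpoint against the freshly constructed model. A minimal sketch, using plain shape tuples in place of `torch.Size` so it runs without PyTorch; with a real model you would compare the shapes in `torch.load(path, map_location="cpu")["state_dict"]` against those in `model.state_dict()`:

```python
def diff_shapes(ckpt_shapes, model_shapes):
    """Return {param_name: (checkpoint_shape, model_shape)} for every mismatch."""
    mismatches = {}
    for key, ckpt_shape in ckpt_shapes.items():
        model_shape = model_shapes.get(key)
        if model_shape is not None and model_shape != ckpt_shape:
            mismatches[key] = (ckpt_shape, model_shape)
    return mismatches

# Shapes taken from the traceback above (two representative parameters).
ckpt = {"masker.decoders.7.deconv.re_module.weight": (180, 45, 7, 5),
        "masker.decoders.7.norm.re_module.weight": (45,)}
model = {"masker.decoders.7.deconv.re_module.weight": (180, 90, 7, 5),
         "masker.decoders.7.norm.re_module.weight": (90,)}
for key, (c, m) in diff_shapes(ckpt, model).items():
    print(f"{key}: checkpoint {c} vs model {m}")
```

Running this over the full state_dicts reproduces exactly the key list in the RuntimeError, which localizes the problem to the last two decoder layers and the output layer.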

Information from the cached model (via the cached_download function + torch.load): model_args = {'architecture': 'Large-DCUNet-20', 'stft_kernel_size': 1024, 'stft_stride': 256, 'sample_rate': 16000.0, 'fix_length_mode': 'pad', 'n_src': 1}
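The model_args above can be read straight out of the downloaded checkpoint, since asteroid checkpoints are dictionaries containing a model_args entry alongside the state_dict. A sketch of that inspection step; here a pickled stand-in dict keeps the example self-contained, whereas with the real file you would call torch.load(local_path, map_location="cpu") on the path returned by the download function:

```python
import os
import pickle
import tempfile

# Stand-in for a downloaded asteroid checkpoint (values copied from above);
# a real checkpoint is produced by torch.save and read back with torch.load.
checkpoint = {
    "model_args": {"architecture": "Large-DCUNet-20", "stft_kernel_size": 1024,
                   "stft_stride": 256, "sample_rate": 16000.0,
                   "fix_length_mode": "pad", "n_src": 1},
    "state_dict": {},  # parameter tensors would live here
}
path = os.path.join(tempfile.mkdtemp(), "model.bin")
with open(path, "wb") as f:
    pickle.dump(checkpoint, f)

with open(path, "rb") as f:  # torch.load(path, map_location="cpu") for a real file
    ckpt = pickle.load(f)
print(ckpt["model_args"]["architecture"])
```

Inspecting model_args this way confirms which architecture the checkpoint was trained with before attempting to instantiate it.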

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Comments: 7

Top GitHub Comments

1 reaction
JorisCos commented, Mar 1, 2021

The first Large-DCUNet-20 architecture wasn't implemented correctly, and we have since corrected it. I trained this model on the earlier Large-DCUNet-20 architecture, before the correction landed in the current version. The fix introduced small size changes in the last two decoder layers compared to the architecture I trained on, which is why the model can't be loaded: the state_dict shapes mismatch.
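Until a corrected checkpoint is published, a common (lossy) workaround for this kind of mismatch is to load only the parameters whose shapes still agree and leave the changed layers randomly initialized. A sketch of the filtering step, again using shape tuples so it runs standalone; with PyTorch you would build the filtered dict from real tensors and pass it to model.load_state_dict(filtered, strict=False):

```python
def filter_matching(ckpt_shapes, model_shapes):
    """Keep only parameters whose name and shape match the current model."""
    return {k: s for k, s in ckpt_shapes.items()
            if model_shapes.get(k) == s}

# One mismatched decoder weight (from the traceback) and one hypothetical
# encoder weight that still matches.
ckpt = {"masker.decoders.7.deconv.re_module.weight": (180, 45, 7, 5),
        "masker.encoders.0.conv.re_module.weight": (32, 1, 7, 1)}
model = {"masker.decoders.7.deconv.re_module.weight": (180, 90, 7, 5),
         "masker.encoders.0.conv.re_module.weight": (32, 1, 7, 1)}
kept = filter_matching(ckpt, model)
print(sorted(kept))  # only the matching encoder weight survives
```

Note the caveat: the skipped decoder layers stay untrained, so enhancement quality will suffer until those layers are fine-tuned or a fixed checkpoint is used.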

0 reactions
JorisCos commented, Mar 8, 2021

The model is fixed and now available here. Thanks again for reporting the issue.

