TypeError: __init__() got an unexpected keyword argument 'norm_first'
See original GitHub issue.
I get this error when trying the new htdemucs with demucs 4.0.0:
(demucs) C:\Users\Admin>demucs -n htdemucs D:\input.wav
Traceback (most recent call last):
  File "d:\programas\anaconda\envs\demucs\lib\runpy.py", line 194, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "d:\programas\anaconda\envs\demucs\lib\runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "D:\Programas\Anaconda\envs\demucs\Scripts\demucs.exe\__main__.py", line 7, in <module>
  File "d:\programas\anaconda\envs\demucs\lib\site-packages\demucs\separate.py", line 121, in main
    model = get_model_from_args(args)
  File "d:\programas\anaconda\envs\demucs\lib\site-packages\demucs\pretrained.py", line 89, in get_model_from_args
    return get_model(name=args.name, repo=args.repo)
  File "d:\programas\anaconda\envs\demucs\lib\site-packages\demucs\pretrained.py", line 74, in get_model
    model = any_repo.get_model(name)
  File "d:\programas\anaconda\envs\demucs\lib\site-packages\demucs\repo.py", line 148, in get_model
    return self.bag_repo.get_model(name_or_sig)
  File "d:\programas\anaconda\envs\demucs\lib\site-packages\demucs\repo.py", line 130, in get_model
    models = [self.model_repo.get_model(sig) for sig in signatures]
  File "d:\programas\anaconda\envs\demucs\lib\site-packages\demucs\repo.py", line 130, in <listcomp>
    models = [self.model_repo.get_model(sig) for sig in signatures]
  File "d:\programas\anaconda\envs\demucs\lib\site-packages\demucs\repo.py", line 67, in get_model
    return load_model(pkg)
  File "d:\programas\anaconda\envs\demucs\lib\site-packages\demucs\states.py", line 62, in load_model
    model = klass(*args, **kwargs)
  File "d:\programas\anaconda\envs\demucs\lib\site-packages\demucs\states.py", line 146, in __init__
    init(self, *args, **kwargs)
  File "d:\programas\anaconda\envs\demucs\lib\site-packages\demucs\htdemucs.py", line 384, in __init__
    self.crosstransformer = CrossTransformerEncoder(
  File "d:\programas\anaconda\envs\demucs\lib\site-packages\demucs\transformer.py", line 636, in __init__
    self.layers.append(MyTransformerEncoderLayer(**kwargs_classic_encoder))
  File "d:\programas\anaconda\envs\demucs\lib\site-packages\demucs\transformer.py", line 297, in __init__
    super().__init__(
TypeError: __init__() got an unexpected keyword argument 'norm_first'
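A quick way to check, in the same "demucs" environment, which torch build is active and whether its nn.TransformerEncoderLayer accepts the norm_first keyword that the htdemucs code passes in (a diagnostic sketch, not part of demucs itself):

import inspect

import torch
from torch import nn

# Report the installed torch version and whether TransformerEncoderLayer
# exposes the norm_first keyword; on builds where it does not, htdemucs
# fails with the TypeError shown above.
params = inspect.signature(nn.TransformerEncoderLayer.__init__).parameters
print("torch", torch.__version__, "- norm_first supported:", "norm_first" in params)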
Issue Analytics
- State:
- Created 9 months ago
- Comments: 9
Top Results From Across the Web
- TypeError: __init__() got an unexpected keyword argument ...
  Based on the error message only, I would suggest putting **kwargs in __init__. This object will then accept any other keyword argument...
- TypeError: __init__() got an unexpected keyword ... - GitHub
  I have used Seq2SeqTrainingArguments class from transformers: import logging from ... TypeError: init() got an unexpected keyword argument ...
- TypeError: __init__() got an unexpected keyword argument ...
  Hi, I have followed the steps given in the chapter to set up mflix app. ... TypeError: init() got an unexpected keyword argument...
- tensor.__init__() got an unexpected keyword argument 'shape'
  Running the code above gives the error: TypeError: __init__() got an unexpected keyword argument 'shape'. The comment below says that tf.zeros_initializer ...
- Transformerencoderlayer init error - nlp - PyTorch Forums
  TypeError: init() got an unexpected keyword argument 'batch_first'.
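The first result above suggests a generic workaround for this class of error: let __init__ swallow keyword arguments it does not recognize. Below is a minimal sketch of that pattern using a hypothetical class; this is not how the demucs issue was resolved (the comments below fix it by upgrading torch), it only illustrates the suggestion.

# Hypothetical layer whose constructor tolerates unknown keyword arguments
# instead of raising TypeError. Unrecognized options are kept in self.extra.
class TolerantLayer:
    def __init__(self, d_model, nhead, **kwargs):
        self.d_model = d_model
        self.nhead = nhead
        self.extra = kwargs  # e.g. norm_first on a torch build that lacks it

layer = TolerantLayer(512, 8, norm_first=False)  # no TypeError raised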
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
Yes, it’s working with torch 1.13.1. So the torch version in requirements.txt should be raised to the release in which that argument was added to torch.
I’ve investigated this a bit. MyTransformerEncoderLayer wants to pass norm_first=False to torch.nn.TransformerEncoderLayer. requirements.txt says:
I was running torch 1.9.0, which does not have that argument, but later versions do. Currently updating torch to 1.13.1…
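Putting the comments together, a minimal sketch of the fix, assuming the same conda environment as in the original command; torch 1.13.1 is the version reported to work above, and the exact minimum version is not stated in the thread:

(demucs) C:\Users\Admin>pip install --upgrade "torch>=1.13.1"
(demucs) C:\Users\Admin>demucs -n htdemucs D:\input.wav

If torchaudio is installed alongside torch, it may need to be upgraded to a matching release at the same time, since the two packages are version-coupled.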