math.sqrt gets a negative argument
Hi! I have been trying to train the Transformer-XL language model (https://github.com/kimiyoung/transformer-xl/blob/master/pytorch/run_wt103_base.sh) with RAdam, and training fails with ValueError: math domain error:
Traceback (most recent call last):
  File "train.py", line 543, in <module>
    train()
  File "train.py", line 463, in train
    optimizer.step()
  File "/transformer-xl/pytorch/radam.py", line 69, in step
    N_sma * N_sma_max / (N_sma_max - 2))) / beta1_t
ValueError: math domain error
This happens because the argument to math.sqrt is negative here: https://github.com/LiyuanLucasLiu/RAdam/blob/master/language-model/model_word_ada/radam.py#L67
What would be the right fix for this? I tried wrapping the argument in abs(), i.e. math.sqrt(abs(...)), but that performs worse than Adam.
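For context, the product under the square root contains the factor (N_sma - 4), and during the first few optimizer steps the approximated SMA length N_sma is below 4, so the product goes negative. The main radam.py at the repo root avoids this by only applying the rectified step when N_sma >= 5, and otherwise falling back to a plain bias-corrected momentum step. A minimal sketch of that guard (variable names follow the repo's radam.py; the parameter update itself is omitted, and the return value is a step size plus a flag saying whether to divide by the second-moment term):

```python
import math

def radam_step_size(step, lr, beta1, beta2):
    """Sketch of the RAdam rectification guard (step is 1-indexed)."""
    beta2_t = beta2 ** step
    # Maximum and current length of the approximated SMA.
    N_sma_max = 2.0 / (1.0 - beta2) - 1.0
    N_sma = N_sma_max - 2.0 * step * beta2_t / (1.0 - beta2_t)
    if N_sma >= 5:
        # Variance is tractable: every factor under the sqrt is positive,
        # so math.sqrt never sees a negative argument.
        r_t = math.sqrt(
            (1 - beta2_t)
            * (N_sma - 4) / (N_sma_max - 4)
            * (N_sma - 2) / N_sma
            * N_sma_max / (N_sma_max - 2)
        )
        return lr * r_t / (1 - beta1 ** step), True
    # Early steps (N_sma < 5): skip rectification and take a plain,
    # un-adapted momentum step instead of sqrt-ing a negative factor.
    return lr / (1 - beta1 ** step), False
```

This also suggests why the abs() workaround trains worse: in the early iterations it still scales the update by the badly estimated second-moment term, with an essentially arbitrary step size, instead of temporarily disabling the adaptive scaling.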
Thanks so much! I used the other script (https://github.com/LiyuanLucasLiu/RAdam/blob/master/radam.py) for training the Transformer-XL (base) language model on wt103 and am able to get better performance (better train/val perplexity after an equal number of iterations) than Adam, without any tuning of the learning rate!
Also, I’ve fixed the bug in https://github.com/LiyuanLucasLiu/RAdam/blob/master/language-model/model_word_ada/radam.py#L67 :-)
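For anyone landing here with the same error: swapping the optimizer is the whole change. A minimal sketch, assuming the radam.py from the repo root has been copied next to the training script; the constructor arguments shown (lr, betas, eps, weight_decay) follow that file, but the values here are illustrative placeholders, not the wt103 settings:

```python
import torch
import torch.nn as nn

# Drop-in replacement for torch.optim.Adam, using the RAdam class from
# https://github.com/LiyuanLucasLiu/RAdam/blob/master/radam.py
from radam import RAdam

model = nn.Linear(512, 512)  # stand-in for the Transformer-XL model
optimizer = RAdam(model.parameters(), lr=2.5e-4, betas=(0.9, 0.999),
                  eps=1e-8, weight_decay=0.0)

# The usual PyTorch training step; nothing else in the loop changes.
loss = model(torch.randn(8, 512)).pow(2).mean()
optimizer.zero_grad()
loss.backward()
optimizer.step()
```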