flaky test: tests.chainer_tests.functions_tests.loss_tests.test_triplet.TestTriplet
See original GitHub issue.

tests.chainer_tests.functions_tests.loss_tests.test_triplet.TestTriplet_param_14_{dtype=float32, batchsize=5, margin=0.5, input_dim=3, reduce='mean'}.test_double_backward_gpu
chainer/utils/__init__.py:104: in _raise_from
six.reraise(exc_type, new_exc, sys.exc_info()[2])
chainer/testing/parameterized.py:78: in new_method
return base_method(self, *args, **kwargs)
tests/chainer_tests/functions_tests/loss_tests/test_triplet.py:148: in test_double_backward_gpu
cuda.to_gpu(self.ggn))
tests/chainer_tests/functions_tests/loss_tests/test_triplet.py:137: in check_double_backward
dtype=numpy.float64, **self.check_double_backward_options)
chainer/gradient_check.py:1049: in check_double_backward
utils._raise_from(AssertionError, f.getvalue(), e)
chainer/utils/__init__.py:104: in _raise_from
six.reraise(exc_type, new_exc, sys.exc_info()[2])
chainer/gradient_check.py:1030: in check_double_backward
detect_nondifferentiable=detect_nondifferentiable)
chainer/gradient_check.py:900: in check_backward
detect_nondifferentiable, is_immutable_params=False
chainer/gradient_check.py:464: in run
self._run()
chainer/gradient_check.py:507: in _run
self._compare_gradients(gx_numeric, gx_backward, directions)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <chainer.gradient_check._CheckBackward object at 0x7efe4fc50a50>
gx_numeric = array(29.866205722202967)
gx_backward = array(-0.034166338998209554)
directions = [array([[ 0.20815725, 0.02468207, 0.13950035],
[ 0.01339531, 0.119112...84, 0.05095844, -0.21432331],
...0.034382...24, -0.0790734 , 0.05782918],
[ 0.01867079, -0.03709946, 0.04035342]]), array(-0.1402532765682968)]
def _compare_gradients(self, gx_numeric, gx_backward, directions):
atol = self.atol
rtol = self.rtol
# Compare the gradients
try:
testing.assert_allclose(
gx_numeric, gx_backward, atol=atol, rtol=rtol)
except AssertionError as e:
eps = self.eps
xs = self.xs
gys = self.gys
f = six.StringIO()
f.write('check_backward failed (eps={} atol={} rtol={})\n'.format(
eps, atol, rtol))
for i, x in enumerate(xs):
f.write('inputs[{}]:\n'.format(i))
f.write('{}\n'.format(x))
for i, gy in enumerate(gys):
f.write('grad_outputs[{}]:\n'.format(i))
f.write('{}\n'.format(gy))
for i, d in enumerate(directions):
f.write('directions[{}]:\n'.format(i))
f.write('{}\n'.format(d))
f.write('gradients (numeric): {}\n'.format(gx_numeric))
f.write('gradients (backward): {}\n'.format(gx_backward))
f.write('\n')
f.write('x: numeric gradient, y: backward gradient')
f.write(str(e))
> raise AssertionError(f.getvalue())
E AssertionError: Parameterized test failed.
E
E Base test method: TestTriplet.test_double_backward_gpu
E Test parameters:
E dtype: <type 'numpy.float32'>
E batchsize: 5
E margin: 0.5
E input_dim: 3
E reduce: mean
E
E
E (caused by)
E AssertionError: check_double_backward failed (eps=0.001 atol=0.001 rtol=0.001)
E input[0]:
E [[-0.09617408 -0.71473444 -0.43992674]
E [-0.81433535 -0.41757172 -0.24745303]
E [-0.81118673 0.77476585 -0.81320477]
E [-0.01487272 0.23381303 0.6911819 ]
E [ 0.99071407 0.16982687 -0.68330991]]
E input[1]:
E [[ 0.17976598 0.42390269 -0.44208673]
E [ 0.30072501 0.59859151 -0.84868848]
E [-0.61671281 0.8817988 0.36515245]
E [ 0.78266978 -0.8008427 -0.04612892]
E [-0.72342831 -0.72335517 -0.05331809]]
E input[2]:
E [[ 0.51794744 -0.95460355 0.55054736]
E [-0.78372806 0.70622396 -0.16120848]
E [ 0.73095554 -0.96339428 -0.79705161]
E [ 0.12965821 -0.70763624 -0.81942064]
E [-0.81238097 -0.65212333 0.15823311]]
E grad_output[0]:
E -0.20959764719
E grad_grad_input[0]:
E [[ 0.20741259 0.21149209 0.58419776]
E [-0.48766658 -0.47589964 -0.94135308]
E [ 0.86421102 -0.0758146 -0.53353965]
E [-0.74944252 0.26500601 0.82982314]
E [-0.68520314 0.90898359 0.24627317]]
E grad_grad_input[1]:
E [[-0.12677632 0.52274531 -0.71591353]
E [-0.55550683 -0.95983696 -0.21232274]
E [ 0.8240059 0.87603849 0.47866771]
E [-0.75374466 0.53062415 0.33138368]
E [-0.48579723 0.63146442 0.03448105]]
E grad_grad_input[2]:
E [[-0.89246321 0.55982959 -0.29318327]
E [-0.22499208 0.48493245 -0.08879288]
E [ 0.07664243 0.1504336 -0.70129406]
E [-0.19758978 0.63109988 0.76870728]
E [ 0.32476202 0.01258153 0.41612151]]
E
E check_backward failed (eps=0.001 atol=0.001 rtol=0.001)
E inputs[0]:
E [[-0.09617408 -0.71473444 -0.43992674]
E [-0.81433535 -0.41757172 -0.24745303]
E [-0.81118673 0.77476585 -0.81320477]
E [-0.01487272 0.23381303 0.6911819 ]
E [ 0.99071407 0.16982687 -0.68330991]]
E inputs[1]:
E [[ 0.17976598 0.42390269 -0.44208673]
E [ 0.30072501 0.59859151 -0.84868848]
E [-0.61671281 0.8817988 0.36515245]
E [ 0.78266978 -0.8008427 -0.04612892]
E [-0.72342831 -0.72335517 -0.05331809]]
E inputs[2]:
E [[ 0.51794744 -0.95460355 0.55054736]
E [-0.78372806 0.70622396 -0.16120848]
E [ 0.73095554 -0.96339428 -0.79705161]
E [ 0.12965821 -0.70763624 -0.81942064]
E [-0.81238097 -0.65212333 0.15823311]]
E inputs[3]:
E -0.20959764719
E grad_outputs[0]:
E [[ 0.20741259 0.21149209 0.58419776]
E [-0.48766658 -0.47589964 -0.94135308]
E [ 0.86421102 -0.0758146 -0.53353965]
E [-0.74944252 0.26500601 0.82982314]
E [-0.68520314 0.90898359 0.24627317]]
E grad_outputs[1]:
E [[-0.12677632 0.52274531 -0.71591353]
E [-0.55550683 -0.95983696 -0.21232274]
E [ 0.8240059 0.87603849 0.47866771]
E [-0.75374466 0.53062415 0.33138368]
E [-0.48579723 0.63146442 0.03448105]]
E grad_outputs[2]:
E [[-0.89246321 0.55982959 -0.29318327]
E [-0.22499208 0.48493245 -0.08879288]
E [ 0.07664243 0.1504336 -0.70129406]
E [-0.19758978 0.63109988 0.76870728]
E [ 0.32476202 0.01258153 0.41612151]]
E directions[0]:
E [[ 0.20815725 0.02468207 0.13950035]
E [ 0.01339531 0.11911272 -0.08085803]
E [-0.20893658 -0.0519559 0.10739109]
E [-0.01355984 0.05095844 -0.21432331]
E [ 0.0123901 -0.05658689 0.16378318]]
E directions[1]:
E [[-0.02006121 0.11393519 -0.23048172]
E [ 0.16586421 0.06701487 0.08581796]
E [-0.10686801 0.02162899 -0.18269735]
E [ 0.08262228 -0.25326029 0.03086964]
E [ 0.59068619 0.01195619 -0.22383583]]
E directions[2]:
E [[ 0.01019829 -0.06603951 -0.01031323]
E [-0.03658371 -0.03438229 -0.11183738]
E [ 0.04353145 0.21867588 -0.00710301]
E [ 0.23390824 -0.0790734 0.05782918]
E [ 0.01867079 -0.03709946 0.04035342]]
E directions[3]:
E -0.140253276568
E gradients (numeric): 29.8662057222
E gradients (backward): -0.0341663389982
E
E x: numeric gradient, y: backward gradient
E Not equal to tolerance rtol=0.001, atol=0.001
E
E (mismatch 100.0%)
E x: array(29.866205722202967)
E y: array(-0.034166338998209554)
E
E assert_allclose failed:
E shape: () ()
E dtype: float64 float64
E i: (0,)
E x[i]: 29.8662057222
E y[i]: -0.0341663389982
E relative error[i]: 875.141233679
E absolute error[i]: 29.9003720612
E relative tolerance * |y[i]|: 3.41663389982e-05
E absolute tolerance: 0.001
E total tolerance: 0.001034166339
E x: 29.866205722202967
E y: -0.034166338998209554
E
E
E (caused by)
E AssertionError: check_backward failed (eps=0.001 atol=0.001 rtol=0.001)
E inputs[0]:
E [[-0.09617408 -0.71473444 -0.43992674]
E [-0.81433535 -0.41757172 -0.24745303]
E [-0.81118673 0.77476585 -0.81320477]
E [-0.01487272 0.23381303 0.6911819 ]
E [ 0.99071407 0.16982687 -0.68330991]]
E inputs[1]:
E [[ 0.17976598 0.42390269 -0.44208673]
E [ 0.30072501 0.59859151 -0.84868848]
E [-0.61671281 0.8817988 0.36515245]
E [ 0.78266978 -0.8008427 -0.04612892]
E [-0.72342831 -0.72335517 -0.05331809]]
E inputs[2]:
E [[ 0.51794744 -0.95460355 0.55054736]
E [-0.78372806 0.70622396 -0.16120848]
E [ 0.73095554 -0.96339428 -0.79705161]
E [ 0.12965821 -0.70763624 -0.81942064]
E [-0.81238097 -0.65212333 0.15823311]]
E inputs[3]:
E -0.20959764719
E grad_outputs[0]:
E [[ 0.20741259 0.21149209 0.58419776]
E [-0.48766658 -0.47589964 -0.94135308]
E [ 0.86421102 -0.0758146 -0.53353965]
E [-0.74944252 0.26500601 0.82982314]
E [-0.68520314 0.90898359 0.24627317]]
E grad_outputs[1]:
E [[-0.12677632 0.52274531 -0.71591353]
E [-0.55550683 -0.95983696 -0.21232274]
E [ 0.8240059 0.87603849 0.47866771]
E [-0.75374466 0.53062415 0.33138368]
E [-0.48579723 0.63146442 0.03448105]]
E grad_outputs[2]:
E [[-0.89246321 0.55982959 -0.29318327]
E [-0.22499208 0.48493245 -0.08879288]
E [ 0.07664243 0.1504336 -0.70129406]
E [-0.19758978 0.63109988 0.76870728]
E [ 0.32476202 0.01258153 0.41612151]]
E directions[0]:
E [[ 0.20815725 0.02468207 0.13950035]
E [ 0.01339531 0.11911272 -0.08085803]
E [-0.20893658 -0.0519559 0.10739109]
E [-0.01355984 0.05095844 -0.21432331]
E [ 0.0123901 -0.05658689 0.16378318]]
E directions[1]:
E [[-0.02006121 0.11393519 -0.23048172]
E [ 0.16586421 0.06701487 0.08581796]
E [-0.10686801 0.02162899 -0.18269735]
E [ 0.08262228 -0.25326029 0.03086964]
E [ 0.59068619 0.01195619 -0.22383583]]
E directions[2]:
E [[ 0.01019829 -0.06603951 -0.01031323]
E [-0.03658371 -0.03438229 -0.11183738]
E [ 0.04353145 0.21867588 -0.00710301]
E [ 0.23390824 -0.0790734 0.05782918]
E [ 0.01867079 -0.03709946 0.04035342]]
E directions[3]:
E -0.140253276568
E gradients (numeric): 29.8662057222
E gradients (backward): -0.0341663389982
E
E x: numeric gradient, y: backward gradient
E Not equal to tolerance rtol=0.001, atol=0.001
E
E (mismatch 100.0%)
E x: array(29.866205722202967)
E y: array(-0.034166338998209554)
E
E assert_allclose failed:
E shape: () ()
E dtype: float64 float64
E i: (0,)
E x[i]: 29.8662057222
E y[i]: -0.0341663389982
E relative error[i]: 875.141233679
E absolute error[i]: 29.9003720612
E relative tolerance * |y[i]|: 3.41663389982e-05
E absolute tolerance: 0.001
E total tolerance: 0.001034166339
E x: 29.866205722202967
E y: -0.034166338998209554
atol = 0.001
d = array(-0.1402532765682968)
directions = [array([[ 0.20815725, 0.02468207, 0.13950035],
[ 0.01339531, 0.119112...84, 0.05095844, -0.21432331],
...0.034382...24, -0.0790734 , 0.05782918],
[ 0.01867079, -0.03709946, 0.04035342]]), array(-0.1402532765682968)]
e = AssertionError('\nNot equal to tolerance rtol=0.001, atol=0.001\n\n(mismatch 1...tolerance: 0.001034166339\nx: 29.866205722202967\ny: -0.034166338998209554\n',)
eps = 0.001
f = <StringIO.StringIO instance at 0x7efe4fc44ef0>
gx_backward = array(-0.034166338998209554)
gx_numeric = array(29.866205722202967)
gy = array([[-0.89246321, 0.55982959, -0.29318327],
[-0.22499208, 0.484932..., 0.76870728],
[ 0.32476202, 0.01258153, 0.41612151]], dtype=float32)
gys = (array([[ 0.20741259, 0.21149209, 0.58419776],
[-0.48766658, -0.475899..., 0.82982314],
[-0.68520314,...327],
[-0.22499208, 0.484932..., 0.76870728],
[ 0.32476202, 0.01258153, 0.41612151]], dtype=float32))
i = 3
rtol = 0.001
self = <chainer.gradient_check._CheckBackward object at 0x7efe4fc50a50>
x = array(-0.209597647190094, dtype=float32)
xs = (array([[-0.09617408, -0.71473444, -0.43992674],
[-0.81433535, -0.417571..., 0.6911819 ],
[ 0.99071407,...0.81942064],
[-0.81238097, -0.65212333, 0.15823311]], dtype=float32), array(-0.209597647190094, dtype=float32))
chainer/gradient_check.py:537: AssertionError
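The failure pattern above (numeric gradient ≈ 29.87 vs. backward gradient ≈ -0.03, on random inputs, only sometimes) is what a gradient check produces when the sampled inputs land near a non-differentiable point of the function. Triplet loss contains a hinge, max(0, d_pos - d_neg + margin), and when the hinge argument falls within eps of zero the central difference straddles the kink and returns a value unrelated to the analytic gradient. A minimal numpy sketch of the effect, not Chainer code; `hinge` and `numeric_grad` are illustrative stand-ins for the per-sample triplet term and for what `chainer.gradient_check` computes:

```python
import numpy as np

def hinge(x, margin=0.5):
    # Simplified stand-in for the per-sample triplet term max(0, x + margin).
    return np.maximum(0.0, x + margin)

def numeric_grad(f, x, eps=1e-3):
    # Central difference, as used by gradient_check.check_backward.
    return (f(x + eps) - f(x - eps)) / (2 * eps)

# Far from the kink: numeric and analytic gradients agree (slope is 1).
assert abs(numeric_grad(hinge, 0.3) - 1.0) < 1e-6

# At the kink (x + margin == 0): the analytic subgradient is 0 or 1, but the
# central difference straddles the corner and returns roughly 0.5 -- matching
# neither, so assert_allclose fails no matter how tight the tolerances are.
assert abs(numeric_grad(hinge, -0.5) - 0.5) < 1e-9
```

This is why the failure is flaky rather than deterministic: it only triggers when the randomly drawn anchors, positives, and negatives happen to make some hinge argument fall within eps of zero, which also explains why `check_double_backward` exposes it, since second-order checks perturb along random directions and are more likely to cross the kink.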
Issue Analytics
- Created: 4 years ago
- Comments: 6 (6 by maintainers)
Top GitHub Comments
Occurred in #8256
https://jenkins.preferred.jp/job/chainer/job/chainer_pr/2228/TEST=chainer-py3,label=mn1-p100/console
Another example:

FAIL tests/chainer_tests/functions_tests/loss_tests/test_triplet.py::TestTriplet_param_20_{batchsize=5, dtype=float64, input_dim=3, margin=0.1, reduce='mean'}::test_double_backward_gpu