
flaky test: tests.chainer_tests.functions_tests.loss_tests.test_triplet.TestTriplet


Jenkins

tests.chainer_tests.functions_tests.loss_tests.test_triplet.TestTriplet_param_14_{dtype=float32, batchsize=5, margin=0.5, input_dim=3, reduce='mean'}.test_double_backward_gpu
chainer/utils/__init__.py:104: in _raise_from
    six.reraise(exc_type, new_exc, sys.exc_info()[2])
chainer/testing/parameterized.py:78: in new_method
    return base_method(self, *args, **kwargs)
tests/chainer_tests/functions_tests/loss_tests/test_triplet.py:148: in test_double_backward_gpu
    cuda.to_gpu(self.ggn))
tests/chainer_tests/functions_tests/loss_tests/test_triplet.py:137: in check_double_backward
    dtype=numpy.float64, **self.check_double_backward_options)
chainer/gradient_check.py:1049: in check_double_backward
    utils._raise_from(AssertionError, f.getvalue(), e)
chainer/utils/__init__.py:104: in _raise_from
    six.reraise(exc_type, new_exc, sys.exc_info()[2])
chainer/gradient_check.py:1030: in check_double_backward
    detect_nondifferentiable=detect_nondifferentiable)
chainer/gradient_check.py:900: in check_backward
    detect_nondifferentiable, is_immutable_params=False
chainer/gradient_check.py:464: in run
    self._run()
chainer/gradient_check.py:507: in _run
    self._compare_gradients(gx_numeric, gx_backward, directions)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <chainer.gradient_check._CheckBackward object at 0x7efe4fc50a50>
gx_numeric = array(29.866205722202967)
gx_backward = array(-0.034166338998209554)
directions = [array([[ 0.20815725,  0.02468207,  0.13950035],
       [ 0.01339531,  0.119112...84,  0.05095844, -0.21432331],
     ...0.034382...24, -0.0790734 ,  0.05782918],
       [ 0.01867079, -0.03709946,  0.04035342]]), array(-0.1402532765682968)]

    def _compare_gradients(self, gx_numeric, gx_backward, directions):
        atol = self.atol
        rtol = self.rtol
        # Compare the gradients
        try:
            testing.assert_allclose(
                gx_numeric, gx_backward, atol=atol, rtol=rtol)
        except AssertionError as e:
            eps = self.eps
            xs = self.xs
            gys = self.gys
            f = six.StringIO()
            f.write('check_backward failed (eps={} atol={} rtol={})\n'.format(
                eps, atol, rtol))
            for i, x in enumerate(xs):
                f.write('inputs[{}]:\n'.format(i))
                f.write('{}\n'.format(x))
            for i, gy in enumerate(gys):
                f.write('grad_outputs[{}]:\n'.format(i))
                f.write('{}\n'.format(gy))
            for i, d in enumerate(directions):
                f.write('directions[{}]:\n'.format(i))
                f.write('{}\n'.format(d))
            f.write('gradients (numeric):  {}\n'.format(gx_numeric))
            f.write('gradients (backward): {}\n'.format(gx_backward))
            f.write('\n')
            f.write('x: numeric gradient, y: backward gradient')
            f.write(str(e))
>           raise AssertionError(f.getvalue())
E           AssertionError: Parameterized test failed.
E           
E           Base test method: TestTriplet.test_double_backward_gpu
E           Test parameters:
E             dtype: <type 'numpy.float32'>
E             batchsize: 5
E             margin: 0.5
E             input_dim: 3
E             reduce: mean
E           
E           
E           (caused by)
E           AssertionError: check_double_backward failed (eps=0.001 atol=0.001 rtol=0.001)
E           input[0]:
E           [[-0.09617408 -0.71473444 -0.43992674]
E            [-0.81433535 -0.41757172 -0.24745303]
E            [-0.81118673  0.77476585 -0.81320477]
E            [-0.01487272  0.23381303  0.6911819 ]
E            [ 0.99071407  0.16982687 -0.68330991]]
E           input[1]:
E           [[ 0.17976598  0.42390269 -0.44208673]
E            [ 0.30072501  0.59859151 -0.84868848]
E            [-0.61671281  0.8817988   0.36515245]
E            [ 0.78266978 -0.8008427  -0.04612892]
E            [-0.72342831 -0.72335517 -0.05331809]]
E           input[2]:
E           [[ 0.51794744 -0.95460355  0.55054736]
E            [-0.78372806  0.70622396 -0.16120848]
E            [ 0.73095554 -0.96339428 -0.79705161]
E            [ 0.12965821 -0.70763624 -0.81942064]
E            [-0.81238097 -0.65212333  0.15823311]]
E           grad_output[0]:
E           -0.20959764719
E           grad_grad_input[0]:
E           [[ 0.20741259  0.21149209  0.58419776]
E            [-0.48766658 -0.47589964 -0.94135308]
E            [ 0.86421102 -0.0758146  -0.53353965]
E            [-0.74944252  0.26500601  0.82982314]
E            [-0.68520314  0.90898359  0.24627317]]
E           grad_grad_input[1]:
E           [[-0.12677632  0.52274531 -0.71591353]
E            [-0.55550683 -0.95983696 -0.21232274]
E            [ 0.8240059   0.87603849  0.47866771]
E            [-0.75374466  0.53062415  0.33138368]
E            [-0.48579723  0.63146442  0.03448105]]
E           grad_grad_input[2]:
E           [[-0.89246321  0.55982959 -0.29318327]
E            [-0.22499208  0.48493245 -0.08879288]
E            [ 0.07664243  0.1504336  -0.70129406]
E            [-0.19758978  0.63109988  0.76870728]
E            [ 0.32476202  0.01258153  0.41612151]]
E           
E           check_backward failed (eps=0.001 atol=0.001 rtol=0.001)
E           inputs[0]:
E           [[-0.09617408 -0.71473444 -0.43992674]
E            [-0.81433535 -0.41757172 -0.24745303]
E            [-0.81118673  0.77476585 -0.81320477]
E            [-0.01487272  0.23381303  0.6911819 ]
E            [ 0.99071407  0.16982687 -0.68330991]]
E           inputs[1]:
E           [[ 0.17976598  0.42390269 -0.44208673]
E            [ 0.30072501  0.59859151 -0.84868848]
E            [-0.61671281  0.8817988   0.36515245]
E            [ 0.78266978 -0.8008427  -0.04612892]
E            [-0.72342831 -0.72335517 -0.05331809]]
E           inputs[2]:
E           [[ 0.51794744 -0.95460355  0.55054736]
E            [-0.78372806  0.70622396 -0.16120848]
E            [ 0.73095554 -0.96339428 -0.79705161]
E            [ 0.12965821 -0.70763624 -0.81942064]
E            [-0.81238097 -0.65212333  0.15823311]]
E           inputs[3]:
E           -0.20959764719
E           grad_outputs[0]:
E           [[ 0.20741259  0.21149209  0.58419776]
E            [-0.48766658 -0.47589964 -0.94135308]
E            [ 0.86421102 -0.0758146  -0.53353965]
E            [-0.74944252  0.26500601  0.82982314]
E            [-0.68520314  0.90898359  0.24627317]]
E           grad_outputs[1]:
E           [[-0.12677632  0.52274531 -0.71591353]
E            [-0.55550683 -0.95983696 -0.21232274]
E            [ 0.8240059   0.87603849  0.47866771]
E            [-0.75374466  0.53062415  0.33138368]
E            [-0.48579723  0.63146442  0.03448105]]
E           grad_outputs[2]:
E           [[-0.89246321  0.55982959 -0.29318327]
E            [-0.22499208  0.48493245 -0.08879288]
E            [ 0.07664243  0.1504336  -0.70129406]
E            [-0.19758978  0.63109988  0.76870728]
E            [ 0.32476202  0.01258153  0.41612151]]
E           directions[0]:
E           [[ 0.20815725  0.02468207  0.13950035]
E            [ 0.01339531  0.11911272 -0.08085803]
E            [-0.20893658 -0.0519559   0.10739109]
E            [-0.01355984  0.05095844 -0.21432331]
E            [ 0.0123901  -0.05658689  0.16378318]]
E           directions[1]:
E           [[-0.02006121  0.11393519 -0.23048172]
E            [ 0.16586421  0.06701487  0.08581796]
E            [-0.10686801  0.02162899 -0.18269735]
E            [ 0.08262228 -0.25326029  0.03086964]
E            [ 0.59068619  0.01195619 -0.22383583]]
E           directions[2]:
E           [[ 0.01019829 -0.06603951 -0.01031323]
E            [-0.03658371 -0.03438229 -0.11183738]
E            [ 0.04353145  0.21867588 -0.00710301]
E            [ 0.23390824 -0.0790734   0.05782918]
E            [ 0.01867079 -0.03709946  0.04035342]]
E           directions[3]:
E           -0.140253276568
E           gradients (numeric):  29.8662057222
E           gradients (backward): -0.0341663389982
E           
E           x: numeric gradient, y: backward gradient
E           Not equal to tolerance rtol=0.001, atol=0.001
E           
E           (mismatch 100.0%)
E            x: array(29.866205722202967)
E            y: array(-0.034166338998209554)
E           
E           assert_allclose failed: 
E             shape: () ()
E             dtype: float64 float64
E             i: (0,)
E             x[i]: 29.8662057222
E             y[i]: -0.0341663389982
E             relative error[i]: 875.141233679
E             absolute error[i]: 29.9003720612
E             relative tolerance * |y[i]|: 3.41663389982e-05
E             absolute tolerance: 0.001
E             total tolerance: 0.001034166339
E           x: 29.866205722202967
E           y: -0.034166338998209554
E           
E           
E           (caused by)
E           AssertionError: check_backward failed (eps=0.001 atol=0.001 rtol=0.001)
E           inputs[0]:
E           [[-0.09617408 -0.71473444 -0.43992674]
E            [-0.81433535 -0.41757172 -0.24745303]
E            [-0.81118673  0.77476585 -0.81320477]
E            [-0.01487272  0.23381303  0.6911819 ]
E            [ 0.99071407  0.16982687 -0.68330991]]
E           inputs[1]:
E           [[ 0.17976598  0.42390269 -0.44208673]
E            [ 0.30072501  0.59859151 -0.84868848]
E            [-0.61671281  0.8817988   0.36515245]
E            [ 0.78266978 -0.8008427  -0.04612892]
E            [-0.72342831 -0.72335517 -0.05331809]]
E           inputs[2]:
E           [[ 0.51794744 -0.95460355  0.55054736]
E            [-0.78372806  0.70622396 -0.16120848]
E            [ 0.73095554 -0.96339428 -0.79705161]
E            [ 0.12965821 -0.70763624 -0.81942064]
E            [-0.81238097 -0.65212333  0.15823311]]
E           inputs[3]:
E           -0.20959764719
E           grad_outputs[0]:
E           [[ 0.20741259  0.21149209  0.58419776]
E            [-0.48766658 -0.47589964 -0.94135308]
E            [ 0.86421102 -0.0758146  -0.53353965]
E            [-0.74944252  0.26500601  0.82982314]
E            [-0.68520314  0.90898359  0.24627317]]
E           grad_outputs[1]:
E           [[-0.12677632  0.52274531 -0.71591353]
E            [-0.55550683 -0.95983696 -0.21232274]
E            [ 0.8240059   0.87603849  0.47866771]
E            [-0.75374466  0.53062415  0.33138368]
E            [-0.48579723  0.63146442  0.03448105]]
E           grad_outputs[2]:
E           [[-0.89246321  0.55982959 -0.29318327]
E            [-0.22499208  0.48493245 -0.08879288]
E            [ 0.07664243  0.1504336  -0.70129406]
E            [-0.19758978  0.63109988  0.76870728]
E            [ 0.32476202  0.01258153  0.41612151]]
E           directions[0]:
E           [[ 0.20815725  0.02468207  0.13950035]
E            [ 0.01339531  0.11911272 -0.08085803]
E            [-0.20893658 -0.0519559   0.10739109]
E            [-0.01355984  0.05095844 -0.21432331]
E            [ 0.0123901  -0.05658689  0.16378318]]
E           directions[1]:
E           [[-0.02006121  0.11393519 -0.23048172]
E            [ 0.16586421  0.06701487  0.08581796]
E            [-0.10686801  0.02162899 -0.18269735]
E            [ 0.08262228 -0.25326029  0.03086964]
E            [ 0.59068619  0.01195619 -0.22383583]]
E           directions[2]:
E           [[ 0.01019829 -0.06603951 -0.01031323]
E            [-0.03658371 -0.03438229 -0.11183738]
E            [ 0.04353145  0.21867588 -0.00710301]
E            [ 0.23390824 -0.0790734   0.05782918]
E            [ 0.01867079 -0.03709946  0.04035342]]
E           directions[3]:
E           -0.140253276568
E           gradients (numeric):  29.8662057222
E           gradients (backward): -0.0341663389982
E           
E           x: numeric gradient, y: backward gradient
E           Not equal to tolerance rtol=0.001, atol=0.001
E           
E           (mismatch 100.0%)
E            x: array(29.866205722202967)
E            y: array(-0.034166338998209554)
E           
E           assert_allclose failed: 
E             shape: () ()
E             dtype: float64 float64
E             i: (0,)
E             x[i]: 29.8662057222
E             y[i]: -0.0341663389982
E             relative error[i]: 875.141233679
E             absolute error[i]: 29.9003720612
E             relative tolerance * |y[i]|: 3.41663389982e-05
E             absolute tolerance: 0.001
E             total tolerance: 0.001034166339
E           x: 29.866205722202967
E           y: -0.034166338998209554

atol       = 0.001
d          = array(-0.1402532765682968)
directions = [array([[ 0.20815725,  0.02468207,  0.13950035],
       [ 0.01339531,  0.119112...84,  0.05095844, -0.21432331],
     ...0.034382...24, -0.0790734 ,  0.05782918],
       [ 0.01867079, -0.03709946,  0.04035342]]), array(-0.1402532765682968)]
e          = AssertionError('\nNot equal to tolerance rtol=0.001, atol=0.001\n\n(mismatch 1...tolerance: 0.001034166339\nx: 29.866205722202967\ny: -0.034166338998209554\n',)
eps        = 0.001
f          = <StringIO.StringIO instance at 0x7efe4fc44ef0>
gx_backward = array(-0.034166338998209554)
gx_numeric = array(29.866205722202967)
gy         = array([[-0.89246321,  0.55982959, -0.29318327],
       [-0.22499208,  0.484932...,  0.76870728],
       [ 0.32476202,  0.01258153,  0.41612151]], dtype=float32)
gys        = (array([[ 0.20741259,  0.21149209,  0.58419776],
       [-0.48766658, -0.475899...,  0.82982314],
       [-0.68520314,...327],
       [-0.22499208,  0.484932...,  0.76870728],
       [ 0.32476202,  0.01258153,  0.41612151]], dtype=float32))
i          = 3
rtol       = 0.001
self       = <chainer.gradient_check._CheckBackward object at 0x7efe4fc50a50>
x          = array(-0.209597647190094, dtype=float32)
xs         = (array([[-0.09617408, -0.71473444, -0.43992674],
       [-0.81433535, -0.417571...,  0.6911819 ],
       [ 0.99071407,...0.81942064],
       [-0.81238097, -0.65212333,  0.15823311]], dtype=float32), array(-0.209597647190094, dtype=float32))

chainer/gradient_check.py:537: AssertionError
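
The comparison that fires is the NumPy-style closeness rule used by chainer.testing.assert_allclose: it fails when |x - y| > atol + rtol * |y|. Plugging in the logged values makes clear how far outside tolerance this run landed (a sketch, using exactly the numbers printed above):

    x, y = 29.866205722202967, -0.034166338998209554
    atol = rtol = 1e-3
    tol = atol + rtol * abs(y)  # 0.001034166339, the logged "total tolerance"
    print(abs(x - y))           # 29.9003720612, the logged absolute error
    print(abs(x - y) <= tol)    # False, so the assertion fires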

Issue Analytics

  • State: closed
  • Created 4 years ago
  • Comments: 6 (6 by maintainers)

Top GitHub Comments

1 reaction
niboshi commented, Oct 8, 2019

Occurred in #8256

https://jenkins.preferred.jp/job/chainer/job/chainer_pr/2228/TEST=chainer-py3,label=mn1-p100/console

FAIL tests/chainer_tests/functions_tests/loss_tests/test_triplet.py::TestTriplet_param_20_{batchsize=5, dtype=float64, input_dim=3, margin=0.1, reduce='mean'}::test_double_backward_gpu

15:47:08 E           AssertionError: Parameterized test failed.
15:47:08 E           
15:47:08 E           Base test method: TestTriplet.test_double_backward_gpu
15:47:08 E           Test parameters:
15:47:08 E             batchsize: 5
15:47:08 E             dtype: <class 'numpy.float64'>
15:47:08 E             input_dim: 3
15:47:08 E             margin: 0.1
15:47:08 E             reduce: mean
15:47:08 E           
15:47:08 E           
15:47:08 E           (caused by)
15:47:08 E           AssertionError: check_double_backward failed (eps=0.001 atol=0.001 rtol=0.001)
15:47:08 E           input[0]:
15:47:08 E           [[-0.80114022  0.73012697  0.90317121]
15:47:08 E            [ 0.27841482  0.92086906  0.27565943]
15:47:08 E            [ 0.5201764  -0.89653267 -0.7989888 ]
15:47:08 E            [ 0.97670485  0.85293754  0.60103428]
15:47:08 E            [ 0.67239912 -0.36301908  0.85057967]]
15:47:08 E           input[1]:
15:47:08 E           [[ 0.26038625 -0.49397861 -0.87638623]
15:47:08 E            [-0.8086239   0.34590322 -0.5575895 ]
15:47:08 E            [ 0.72782192 -0.32051706 -0.94950459]
15:47:08 E            [ 0.17560848 -0.19301458 -0.5271498 ]
15:47:08 E            [ 0.25061494  0.55150896 -0.7517515 ]]
15:47:08 E           input[2]:
15:47:08 E           [[ 0.31126033  0.31469055  0.52422832]
15:47:08 E            [ 0.3205135   0.01053198  0.05495407]
15:47:08 E            [-0.21747225 -0.54571252 -0.37720655]
15:47:08 E            [ 0.57859144  0.21781301  0.43152567]
15:47:08 E            [-0.98721978  0.59868622  0.83999947]]
15:47:08 E           grad_output[0]:
15:47:08 E           -0.8491146890473269
15:47:08 E           grad_grad_input[0]:
15:47:08 E           [[ 0.51869675  0.97485839 -0.39188017]
15:47:08 E            [ 0.42714921 -0.03297618 -0.49293311]
15:47:08 E            [-0.73241634  0.23810413 -0.4448401 ]
15:47:08 E            [ 0.23384978  0.36972641 -0.94948018]
15:47:08 E            [ 0.46250363  0.21417318 -0.14465822]]
15:47:08 E           grad_grad_input[1]:
15:47:08 E           [[-0.68025685 -0.54311771 -0.38948464]
15:47:08 E            [ 0.85649379 -0.16038923  0.36297856]
15:47:08 E            [ 0.24851975 -0.08401002  0.31021045]
15:47:08 E            [ 0.44301746 -0.46155231  0.28929007]
15:47:08 E            [ 0.27158333 -0.915723    0.46059521]]
15:47:08 E           grad_grad_input[2]:
15:47:08 E           [[-0.83639078 -0.99250979 -0.94305914]
15:47:08 E            [-0.3931395  -0.18574794 -0.93373284]
15:47:08 E            [ 0.44101624  0.37560893 -0.56154136]
15:47:08 E            [ 0.66529199 -0.70594469  0.21610022]
15:47:08 E            [ 0.24988568 -0.11140409  0.65448216]]
15:47:08 E           
15:47:08 E           check_backward failed (eps=0.001 atol=0.001 rtol=0.001)
15:47:08 E           inputs[0]:
15:47:08 E           [[-0.80114022  0.73012697  0.90317121]
15:47:08 E            [ 0.27841482  0.92086906  0.27565943]
15:47:08 E            [ 0.5201764  -0.89653267 -0.7989888 ]
15:47:08 E            [ 0.97670485  0.85293754  0.60103428]
15:47:08 E            [ 0.67239912 -0.36301908  0.85057967]]
15:47:08 E           inputs[1]:
15:47:08 E           [[ 0.26038625 -0.49397861 -0.87638623]
15:47:08 E            [-0.8086239   0.34590322 -0.5575895 ]
15:47:08 E            [ 0.72782192 -0.32051706 -0.94950459]
15:47:08 E            [ 0.17560848 -0.19301458 -0.5271498 ]
15:47:08 E            [ 0.25061494  0.55150896 -0.7517515 ]]
15:47:08 E           inputs[2]:
15:47:08 E           [[ 0.31126033  0.31469055  0.52422832]
15:47:08 E            [ 0.3205135   0.01053198  0.05495407]
15:47:08 E            [-0.21747225 -0.54571252 -0.37720655]
15:47:08 E            [ 0.57859144  0.21781301  0.43152567]
15:47:08 E            [-0.98721978  0.59868622  0.83999947]]
15:47:08 E           inputs[3]:
15:47:08 E           -0.8491146890473269
15:47:08 E           grad_outputs[0]:
15:47:08 E           [[ 0.51869675  0.97485839 -0.39188017]
15:47:08 E            [ 0.42714921 -0.03297618 -0.49293311]
15:47:08 E            [-0.73241634  0.23810413 -0.4448401 ]
15:47:08 E            [ 0.23384978  0.36972641 -0.94948018]
15:47:08 E            [ 0.46250363  0.21417318 -0.14465822]]
15:47:08 E           grad_outputs[1]:
15:47:08 E           [[-0.68025685 -0.54311771 -0.38948464]
15:47:08 E            [ 0.85649379 -0.16038923  0.36297856]
15:47:08 E            [ 0.24851975 -0.08401002  0.31021045]
15:47:08 E            [ 0.44301746 -0.46155231  0.28929007]
15:47:08 E            [ 0.27158333 -0.915723    0.46059521]]
15:47:08 E           grad_outputs[2]:
15:47:08 E           [[-0.83639078 -0.99250979 -0.94305914]
15:47:08 E            [-0.3931395  -0.18574794 -0.93373284]
15:47:08 E            [ 0.44101624  0.37560893 -0.56154136]
15:47:08 E            [ 0.66529199 -0.70594469  0.21610022]
15:47:08 E            [ 0.24988568 -0.11140409  0.65448216]]
15:47:08 E           directions[0]:
15:47:08 E           [[-0.07008382  0.06584982 -0.06653271]
15:47:08 E            [ 0.10332238 -0.03136456  0.09318712]
15:47:08 E            [ 0.00257684  0.21143694 -0.05485587]
15:47:08 E            [-0.06248292 -0.09890835  0.10743819]
15:47:08 E            [ 0.16672102 -0.26579812 -0.01501783]]
15:47:08 E           directions[1]:
15:47:08 E           [[ 0.04679249  0.02247675  0.02581011]
15:47:08 E            [ 0.19870673 -0.12829837 -0.05628477]
15:47:08 E            [-0.03828745  0.19638445  0.15570672]
15:47:08 E            [ 0.07567367  0.23296875  0.16811639]
15:47:08 E            [ 0.17175733 -0.0901377   0.21892722]]
15:47:08 E           directions[2]:
15:47:08 E           [[ 0.04171302 -0.33941204  0.13100897]
15:47:08 E            [-0.14810036 -0.20297185  0.14369851]
15:47:08 E            [ 0.09562895  0.01337666 -0.31744501]
15:47:08 E            [-0.08184975 -0.03588549  0.21496371]
15:47:08 E            [-0.14174513  0.25813273  0.12026742]]
15:47:08 E           directions[3]:
15:47:08 E           0.10211143336216855
15:47:08 E           gradients (numeric):  -331.52047442616873
15:47:08 E           gradients (backward): 0.26893694340529195
15:47:08 E           
15:47:08 E           
15:47:08 E           Not equal to tolerance rtol=0.001, atol=0.001
15:47:08 E           
15:47:08 E           Mismatch: 100%
15:47:08 E           Max absolute difference: 331.78941137
15:47:08 E           Max relative difference: 1233.70708081
15:47:08 E            x: array(-331.520474)
15:47:08 E            y: array(0.268937)
15:47:08 E           
15:47:08 E           assert_allclose failed: 
15:47:08 E             shape: () ()
15:47:08 E             dtype: float64 float64
15:47:08 E             i: (0,)
15:47:08 E             x[i]: -331.52047442616873
15:47:08 E             y[i]: 0.26893694340529195
15:47:08 E             relative error[i]: 1233.7070808065312
15:47:08 E             absolute error[i]: 331.789411369574
15:47:08 E           x: -331.52047443
15:47:08 E           y: 0.26893694
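
In both runs the numeric directional gradient is orders of magnitude larger than the backward one (29.87 vs -0.034 above, -331.52 vs 0.269 here). That pattern is characteristic of a finite difference straddling the kink of the triplet loss hinge. A minimal sketch (assuming the squared-distance hinge form max(0, ||a - p||^2 - ||a - n||^2 + margin) per sample) of why the double-backward check blows up when a sample's hinge argument lands within eps of zero:

    import numpy as np

    def hinge(d):
        # Per-sample core of the triplet loss: max(0, d), with
        # d = ||a - p||^2 - ||a - n||^2 + margin.
        return np.maximum(d, 0.0)

    def hinge_grad(d):
        # Its derivative is a step function; this is what the
        # double-backward check differentiates numerically.
        return float(d > 0)

    def central_diff(f, x, eps=1e-3):
        # The two-sided numeric estimate used by chainer.gradient_check.
        return (f(x + eps) - f(x - eps)) / (2 * eps)

    print(central_diff(hinge, 0.5))        # ~1.0: fine away from the kink
    print(central_diff(hinge, 2e-4))       # ~0.6: eps crosses the kink
    print(central_diff(hinge_grad, 2e-4))  # 500.0 = 1 / (2 * eps)

Differentiating the step across the kink yields values on the order of 1/(2*eps) = 500; after scaling by the random directions and grad outputs, that is exactly the regime of the 29.9 and 331.5 seen in these reports. Whether a given run fails thus depends only on whether the randomly drawn inputs put some sample within ~eps of the hinge, which is what makes the test flaky.
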
0 reactions
emcastillo commented, Oct 30, 2019

Another example

=================================== FAILURES ===================================
_ TestTriplet_param_10_{batchsize=5, dtype=float32, input_dim=2, margin=0.5, reduce='mean'}.test_double_backward_gpu _

self = <chainer.testing._bundle.TestTriplet_param_10_{batchsize=5, dtype=float32, input_dim=2, margin=0.5, reduce='mean'} testMethod=test_double_backward_gpu>

    @attr.gpu
    def test_double_backward_gpu(self):
        self.check_double_backward(
            cuda.to_gpu(self.a), cuda.to_gpu(self.p), cuda.to_gpu(self.n),
            cuda.to_gpu(self.gy), cuda.to_gpu(self.gga), cuda.to_gpu(self.ggp),
>           cuda.to_gpu(self.ggn))

self       = <chainer.testing._bundle.TestTriplet_param_10_{batchsize=5, dtype=float32, input_dim=2, margin=0.5, reduce='mean'} testMethod=test_double_backward_gpu>

/repo/tests/chainer_tests/functions_tests/loss_tests/test_triplet.py:148: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/repo/tests/chainer_tests/functions_tests/loss_tests/test_triplet.py:137: in check_double_backward
    dtype=numpy.float64, **self.check_double_backward_options)
/workspace/conda/envs/testenv/lib/python3.6/site-packages/chainer/gradient_check.py:1093: in check_double_backward
    utils._raise_from(AssertionError, f.getvalue(), e)
/workspace/conda/envs/testenv/lib/python3.6/site-packages/chainer/utils/__init__.py:106: in _raise_from
    six.raise_from(new_exc.with_traceback(orig_exc.__traceback__), None)
<string>:3: in raise_from
    ???
/workspace/conda/envs/testenv/lib/python3.6/site-packages/chainer/gradient_check.py:1074: in check_double_backward
    detect_nondifferentiable=detect_nondifferentiable)
/workspace/conda/envs/testenv/lib/python3.6/site-packages/chainer/gradient_check.py:944: in check_backward
    detect_nondifferentiable, is_immutable_params=False
/workspace/conda/envs/testenv/lib/python3.6/site-packages/chainer/gradient_check.py:463: in run
    self._run()
/workspace/conda/envs/testenv/lib/python3.6/site-packages/chainer/gradient_check.py:506: in _run
    self._compare_gradients(gx_numeric, gx_backward, directions)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <chainer.gradient_check._CheckBackward object at 0x7f6990eec5c0>
gx_numeric = array(5.94665761), gx_backward = array(-0.13842727)
directions = (array([[ 0.01852234, -0.0555582 ],
       [ 0.13443051,  0.24825384],
       [ 0.09185793, -0.2566507 ],
       [ 0.0... [ 0.04613984,  0.10051105],
       [-0.21912477, -0.23229948],
       [ 0.09953513, -0.11168086]]), array(0.31678283))

    def _compare_gradients(self, gx_numeric, gx_backward, directions):
        atol = self.atol
        rtol = self.rtol
        # Compare the gradients
        try:
            testing.assert_allclose(
                gx_numeric, gx_backward, atol=atol, rtol=rtol)
        except AssertionError as e:
            eps = self.eps
            xs = self.xs
            gys = self.gys
            f = six.StringIO()
            f.write('check_backward failed (eps={} atol={} rtol={})\n'.format(
                eps, atol, rtol))
            for i, x in enumerate(xs):
                f.write('inputs[{}]:\n'.format(i))
                f.write('{}\n'.format(x))
            for i, gy in enumerate(gys):
                f.write('grad_outputs[{}]:\n'.format(i))
                f.write('{}\n'.format(gy))
            for i, d in enumerate(directions):
                f.write('directions[{}]:\n'.format(i))
                f.write('{}\n'.format(d))
            f.write('gradients (numeric):  {}\n'.format(gx_numeric))
            f.write('gradients (backward): {}\n'.format(gx_backward))
            f.write('\n')
            f.write('x: numeric gradient, y: backward gradient')
            f.write(str(e))
>           raise AssertionError(f.getvalue())
E           AssertionError: Parameterized test failed.
E           
E           Base test method: TestTriplet.test_double_backward_gpu
E           Test parameters:
E             batchsize: 5
E             dtype: <class 'numpy.float32'>
E             input_dim: 2
E             margin: 0.5
E             reduce: mean
E           
E           
E           (caused by)
E           AssertionError: check_double_backward failed (eps=0.001 atol=0.001 rtol=0.001)
E           input[0]:
E           [[ 0.41859946 -0.05170702]
E            [-0.47326466 -0.8147342 ]
E            [ 0.37169746  0.55820537]
E            [-0.8330609  -0.61905867]
E            [-0.35235104  0.69633645]]
E           input[1]:
E           [[-0.04682005 -0.63617486]
E            [-0.7533653  -0.09180342]
E            [ 0.19219278 -0.82123554]
E            [ 0.77339804 -0.53673714]
E            [-0.9176852  -0.01235445]]
E           input[2]:
E           [[-0.31407344 -0.8258414 ]
E            [-0.68508047 -0.35202324]
E            [-0.33130243  0.532353  ]
E            [ 0.42267135  0.6091613 ]
E            [-0.6059476   0.9939634 ]]
E           grad_output[0]:
E           0.090973504
E           grad_grad_input[0]:
E           [[ 0.87419736 -0.17297986]
E            [ 0.8218522  -0.28409445]
E            [ 0.65562564  0.43505746]
E            [ 0.62572426 -0.37132618]
E            [-0.15479904 -0.43639556]]
E           grad_grad_input[1]:
E           [[-0.33906892  0.5149682 ]
E            [ 0.08059459 -0.34023264]
E            [ 0.7526641   0.89842427]
E            [ 0.5511121   0.26767647]
E            [-0.4323659   0.5053927 ]]
E           grad_grad_input[2]:
E           [[-0.16097863  0.01412834]
E            [-0.87892747 -0.7240429 ]
E            [ 0.55281746  0.38508472]
E            [-0.39439696  0.34372327]
E            [-0.5848213  -0.7264535 ]]
E           
E           check_backward failed (eps=0.001 atol=0.001 rtol=0.001)
E           inputs[0]:
E           [[ 0.41859946 -0.05170702]
E            [-0.47326466 -0.8147342 ]
E            [ 0.37169746  0.55820537]
E            [-0.8330609  -0.61905867]
E            [-0.35235104  0.69633645]]
E           inputs[1]:
E           [[-0.04682005 -0.63617486]
E            [-0.7533653  -0.09180342]
E            [ 0.19219278 -0.82123554]
E            [ 0.77339804 -0.53673714]
E            [-0.9176852  -0.01235445]]
E           inputs[2]:
E           [[-0.31407344 -0.8258414 ]
E            [-0.68508047 -0.35202324]
E            [-0.33130243  0.532353  ]
E            [ 0.42267135  0.6091613 ]
E            [-0.6059476   0.9939634 ]]
E           inputs[3]:
E           0.090973504
E           grad_outputs[0]:
E           [[ 0.87419736 -0.17297986]
E            [ 0.8218522  -0.28409445]
E            [ 0.65562564  0.43505746]
E            [ 0.62572426 -0.37132618]
E            [-0.15479904 -0.43639556]]
E           grad_outputs[1]:
E           [[-0.33906892  0.5149682 ]
E            [ 0.08059459 -0.34023264]
E            [ 0.7526641   0.89842427]
E            [ 0.5511121   0.26767647]
E            [-0.4323659   0.5053927 ]]
E           grad_outputs[2]:
E           [[-0.16097863  0.01412834]
E            [-0.87892747 -0.7240429 ]
E            [ 0.55281746  0.38508472]
E            [-0.39439696  0.34372327]
E            [-0.5848213  -0.7264535 ]]
E           directions[0]:
E           [[ 0.01852234 -0.0555582 ]
E            [ 0.13443051  0.24825384]
E            [ 0.09185793 -0.2566507 ]
E            [ 0.02282013  0.49061068]
E            [ 0.03514811 -0.01906749]]
E           directions[1]:
E           [[ 0.1270691  -0.23513715]
E            [ 0.10405598 -0.26617573]
E            [-0.19256268  0.17449237]
E            [ 0.08761336 -0.09339197]
E            [ 0.12706978  0.08192556]]
E           directions[2]:
E           [[ 0.09436717  0.12877533]
E            [-0.24028784 -0.14027479]
E            [ 0.04613984  0.10051105]
E            [-0.21912477 -0.23229948]
E            [ 0.09953513 -0.11168086]]
E           directions[3]:
E           0.3167828324909332
E           gradients (numeric):  5.946657606817887
E           gradients (backward): -0.13842727278540715
E           
E           x: numeric gradient, y: backward gradient
E           Not equal to tolerance rtol=0.001, atol=0.001
E           
E           Mismatch: 100%
E           Max absolute difference: 6.08508488
E           Max relative difference: 43.95871389
E            x: array(5.946658)
E            y: array(-0.138427)
E           
E           assert_allclose failed: 
E             shape: () ()
E             dtype: float64 float64
E             i: (0,)
E             x[i]: 5.946657606817887
E             y[i]: -0.13842727278540715
E             relative error[i]: 43.95871389474327
E             absolute error[i]: 6.0850848796032935
E             relative tolerance * |y[i]|: 0.00013842727278540716
E             absolute tolerance: 0.001
E             total tolerance: 0.0011384272727854071
E           x: 5.94665761
E           y: -0.13842727
E           
E           
E           (caused by)
E           AssertionError: check_backward failed (eps=0.001 atol=0.001 rtol=0.001)
E           inputs[0]:
E           [[ 0.41859946 -0.05170702]
E            [-0.47326466 -0.8147342 ]
E            [ 0.37169746  0.55820537]
E            [-0.8330609  -0.61905867]
E            [-0.35235104  0.69633645]]
E           inputs[1]:
E           [[-0.04682005 -0.63617486]
E            [-0.7533653  -0.09180342]
E            [ 0.19219278 -0.82123554]
E            [ 0.77339804 -0.53673714]
E            [-0.9176852  -0.01235445]]
E           inputs[2]:
E           [[-0.31407344 -0.8258414 ]
E            [-0.68508047 -0.35202324]
E            [-0.33130243  0.532353  ]
E            [ 0.42267135  0.6091613 ]
E            [-0.6059476   0.9939634 ]]
E           inputs[3]:
E           0.090973504
E           grad_outputs[0]:
E           [[ 0.87419736 -0.17297986]
E            [ 0.8218522  -0.28409445]
E            [ 0.65562564  0.43505746]
E            [ 0.62572426 -0.37132618]
E            [-0.15479904 -0.43639556]]
E           grad_outputs[1]:
E           [[-0.33906892  0.5149682 ]
E            [ 0.08059459 -0.34023264]
E            [ 0.7526641   0.89842427]
E            [ 0.5511121   0.26767647]
E            [-0.4323659   0.5053927 ]]
E           grad_outputs[2]:
E           [[-0.16097863  0.01412834]
E            [-0.87892747 -0.7240429 ]
E            [ 0.55281746  0.38508472]
E            [-0.39439696  0.34372327]
E            [-0.5848213  -0.7264535 ]]
E           directions[0]:
E           [[ 0.01852234 -0.0555582 ]
E            [ 0.13443051  0.24825384]
E            [ 0.09185793 -0.2566507 ]
E            [ 0.02282013  0.49061068]
E            [ 0.03514811 -0.01906749]]
E           directions[1]:
E           [[ 0.1270691  -0.23513715]
E            [ 0.10405598 -0.26617573]
E            [-0.19256268  0.17449237]
E            [ 0.08761336 -0.09339197]
E            [ 0.12706978  0.08192556]]
E           directions[2]:
E           [[ 0.09436717  0.12877533]
E            [-0.24028784 -0.14027479]
E            [ 0.04613984  0.10051105]
E            [-0.21912477 -0.23229948]
E            [ 0.09953513 -0.11168086]]
E           directions[3]:
E           0.3167828324909332
E           gradients (numeric):  5.946657606817887
E           gradients (backward): -0.13842727278540715
E           
E           x: numeric gradient, y: backward gradient
E           Not equal to tolerance rtol=0.001, atol=0.001
E           
E           Mismatch: 100%
E           Max absolute difference: 6.08508488
E           Max relative difference: 43.95871389
E            x: array(5.946658)
E            y: array(-0.138427)
E           
E           assert_allclose failed: 
E             shape: () ()
E             dtype: float64 float64
E             i: (0,)
E             x[i]: 5.946657606817887
E             y[i]: -0.13842727278540715
E             relative error[i]: 43.95871389474327
E             absolute error[i]: 6.0850848796032935
E             relative tolerance * |y[i]|: 0.00013842727278540716
E             absolute tolerance: 0.001
E             total tolerance: 0.0011384272727854071
E           x: 5.94665761
E           y: -0.13842727

atol       = 0.001
d          = array(0.31678283)
directions = (array([[ 0.01852234, -0.0555582 ],
       [ 0.13443051,  0.24825384],
       [ 0.09185793, -0.2566507 ],
       [ 0.0... [ 0.04613984,  0.10051105],
       [-0.21912477, -0.23229948],
       [ 0.09953513, -0.11168086]]), array(0.31678283))
eps        = 0.001
f          = <_io.StringIO object at 0x7f6991bc04c8>
gx_backward = array(-0.13842727)
gx_numeric = array(5.94665761)
gy         = array([[-0.16097863,  0.01412834],
       [-0.87892747, -0.7240429 ],
       [ 0.55281746,  0.38508472],
       [-0.39439696,  0.34372327],
       [-0.5848213 , -0.7264535 ]], dtype=float32)
gys        = (array([[ 0.87419736, -0.17297986],
       [ 0.8218522 , -0.28409445],
       [ 0.65562564,  0.43505746],
       [ 0.6...     [ 0.55281746,  0.38508472],
       [-0.39439696,  0.34372327],
       [-0.5848213 , -0.7264535 ]], dtype=float32))
i          = 3
rtol       = 0.001
self       = <chainer.gradient_check._CheckBackward object at 0x7f6990eec5c0>
x          = array(0.0909735, dtype=float32)
xs         = (array([[ 0.41859946, -0.05170702],
       [-0.47326466, -0.8147342 ],
       [ 0.37169746,  0.55820537],
       [-0.8...       [ 0.42267135,  0.6091613 ],
       [-0.6059476 ,  0.9939634 ]], dtype=float32), array(0.0909735, dtype=float32))

/workspace/conda/envs/testenv/lib/python3.6/site-packages/chainer/gradient_check.py:536: AssertionError
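
The thread does not show the eventual resolution, but the usual way to de-flake a gradient check on a kinked function is to keep the random test inputs away from the non-differentiable point. A hypothetical sketch of that technique (the helper name and the gap value are illustrative, not taken from chainer):

    import numpy as np

    def sample_away_from_hinge(rng, batchsize, input_dim, margin, gap=0.1):
        # Redraw (anchor, positive, negative) until every per-sample hinge
        # argument ||a - p||^2 - ||a - n||^2 + margin is at least `gap`
        # from zero, so an eps=1e-3 perturbation stays on one branch.
        while True:
            a = rng.uniform(-1, 1, (batchsize, input_dim))
            p = rng.uniform(-1, 1, (batchsize, input_dim))
            n = rng.uniform(-1, 1, (batchsize, input_dim))
            d = ((a - p) ** 2 - (a - n) ** 2).sum(axis=1) + margin
            if np.all(np.abs(d) > gap):
                return a, p, n

With every sample guaranteed to sit on one branch of the hinge, the numeric and backward directional gradients agree and the check passes deterministically.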