
Flaky test: tests/chainerx_tests/unit_tests/routines_tests/test_loss.py::test_SoftmaxCrossEntropy

See original GitHub issue

https://travis-ci.org/chainer/chainer/jobs/601719725#L4484

Occurred in #8295.

[2019-10-23 10:13:00] ___________________ test_SoftmaxCrossEntropy_param_3_{t_dtype='int16', x_dtype='float16'}[native:0] ____________________
[2019-10-23 10:13:00] 
[2019-10-23 10:13:00] device = native:0, args = (), kwargs = {}
[2019-10-23 10:13:00] backend_config = <BackendConfig use_chainerx=True chainerx_device='native:0' use_cuda=False cuda_device=None use_cudnn='never' cudnn_deterministic=False autotune=False cudnn_fast_batch_normalization=False use_ideep='never'>
[2019-10-23 10:13:00] obj = <chainer.testing._bundle.TestSoftmaxCrossEntropy_param_3_{t_dtype='int16', x_dtype='float16'} object at 0x7f102941a7f0>

...

[2019-10-23 10:13:00] E           chainer.testing.function_link.FunctionTestError: double backward is not implemented correctly
[2019-10-23 10:13:00] E           
[2019-10-23 10:13:00] E           (caused by)
[2019-10-23 10:13:00] E           AssertionError: check_double_backward failed (eps=0.001 atol=0.0001 rtol=0.001)
[2019-10-23 10:13:00] E           input[0]:
[2019-10-23 10:13:00] E           array([[-0.78076172, -1.33984375],
[2019-10-23 10:13:00] E                  [-1.2578125 , -0.56689453]], shape=(2, 2), dtype=float16, device='native:0')
[2019-10-23 10:13:00] E           grad_output[0]:
[2019-10-23 10:13:00] E           array([0.04290771, 0.39892578], shape=(2,), dtype=float16, device='native:0')
[2019-10-23 10:13:00] E           grad_grad_input[0]:
[2019-10-23 10:13:00] E           array([[0.71972656, 0.72851562],
[2019-10-23 10:13:00] E                  [0.38500977, 0.64404297]], shape=(2, 2), dtype=float16, device='native:0')
[2019-10-23 10:13:00] E           
[2019-10-23 10:13:00] E           check_backward failed (eps=0.001 atol=0.0001 rtol=0.001)
[2019-10-23 10:13:00] E           inputs[0]:
[2019-10-23 10:13:00] E           array([[-0.78076172, -1.33984375],
[2019-10-23 10:13:00] E                  [-1.2578125 , -0.56689453]], shape=(2, 2), dtype=float16, device='native:0')
[2019-10-23 10:13:00] E           inputs[1]:
[2019-10-23 10:13:00] E           array([0.04290771, 0.39892578], shape=(2,), dtype=float16, device='native:0')
[2019-10-23 10:13:00] E           grad_outputs[0]:
[2019-10-23 10:13:00] E           array([[0.71972656, 0.72851562],
[2019-10-23 10:13:00] E                  [0.38500977, 0.64404297]], shape=(2, 2), dtype=float16, device='native:0')
[2019-10-23 10:13:00] E           directions[0]:
[2019-10-23 10:13:00] E           array([[0.19774412, 0.24381235],
[2019-10-23 10:13:00] E                  [0.34787308, -0.21185603]], shape=(2, 2), dtype=float64, device='native:0')
[2019-10-23 10:13:00] E           directions[1]:
[2019-10-23 10:13:00] E           array([0.7028306 , 0.49151123], shape=(2,), dtype=float64, device='native:0')
[2019-10-23 10:13:00] E           gradients (numeric):  array(0.06802642, shape=(), dtype=float64, device='native:0')
[2019-10-23 10:13:00] E           gradients (backward): array(0.06819657, shape=(), dtype=float64, device='native:0')
[2019-10-23 10:13:00] E           
[2019-10-23 10:13:00] E           x: numeric gradient, y: backward gradient
[2019-10-23 10:13:00] E           Not equal to tolerance rtol=0.001, atol=0.0001
[2019-10-23 10:13:00] E           
[2019-10-23 10:13:00] E           (mismatch 100.0%)
[2019-10-23 10:13:00] E            x: array(0.068026)
[2019-10-23 10:13:00] E            y: array(0.068197)
[2019-10-23 10:13:00] E           
[2019-10-23 10:13:00] E           assert_allclose failed: 
[2019-10-23 10:13:00] E             shape: () ()
[2019-10-23 10:13:00] E             dtype: float64 float64
[2019-10-23 10:13:00] E             i: (0,)
[2019-10-23 10:13:00] E             x[i]: 0.06802642484513102
[2019-10-23 10:13:00] E             y[i]: 0.06819657403670369
[2019-10-23 10:13:00] E             relative error[i]: 0.00249498151448332
[2019-10-23 10:13:00] E             absolute error[i]: 0.00017014919157266883
[2019-10-23 10:13:00] E             relative tolerance * |y[i]|: 6.819657403670369e-05
[2019-10-23 10:13:00] E             absolute tolerance: 0.0001
[2019-10-23 10:13:00] E             total tolerance: 0.0001681965740367037
[2019-10-23 10:13:00] E           x: 0.06802642
[2019-10-23 10:13:00] E           y: 0.06819657
[2019-10-23 10:13:00] E           
[2019-10-23 10:13:00] E           
[2019-10-23 10:13:00] E           (caused by)
[2019-10-23 10:13:00] E           AssertionError: check_backward failed (eps=0.001 atol=0.0001 rtol=0.001)
[2019-10-23 10:13:00] E           inputs[0]:
[2019-10-23 10:13:00] E           array([[-0.78076172, -1.33984375],
[2019-10-23 10:13:00] E                  [-1.2578125 , -0.56689453]], shape=(2, 2), dtype=float16, device='native:0')
[2019-10-23 10:13:00] E           inputs[1]:
[2019-10-23 10:13:00] E           array([0.04290771, 0.39892578], shape=(2,), dtype=float16, device='native:0')
[2019-10-23 10:13:00] E           grad_outputs[0]:
[2019-10-23 10:13:00] E           array([[0.71972656, 0.72851562],
[2019-10-23 10:13:00] E                  [0.38500977, 0.64404297]], shape=(2, 2), dtype=float16, device='native:0')
[2019-10-23 10:13:00] E           directions[0]:
[2019-10-23 10:13:00] E           array([[0.19774412, 0.24381235],
[2019-10-23 10:13:00] E                  [0.34787308, -0.21185603]], shape=(2, 2), dtype=float64, device='native:0')
[2019-10-23 10:13:00] E           directions[1]:
[2019-10-23 10:13:00] E           array([0.7028306 , 0.49151123], shape=(2,), dtype=float64, device='native:0')
[2019-10-23 10:13:00] E           gradients (numeric):  array(0.06802642, shape=(), dtype=float64, device='native:0')
[2019-10-23 10:13:00] E           gradients (backward): array(0.06819657, shape=(), dtype=float64, device='native:0')
[2019-10-23 10:13:00] E           
[2019-10-23 10:13:00] E           x: numeric gradient, y: backward gradient
[2019-10-23 10:13:00] E           Not equal to tolerance rtol=0.001, atol=0.0001
[2019-10-23 10:13:00] E           
[2019-10-23 10:13:00] E           (mismatch 100.0%)
[2019-10-23 10:13:00] E            x: array(0.068026)
[2019-10-23 10:13:00] E            y: array(0.068197)
[2019-10-23 10:13:00] E           
[2019-10-23 10:13:00] E           assert_allclose failed: 
[2019-10-23 10:13:00] E             shape: () ()
[2019-10-23 10:13:00] E             dtype: float64 float64
[2019-10-23 10:13:00] E             i: (0,)
[2019-10-23 10:13:00] E             x[i]: 0.06802642484513102
[2019-10-23 10:13:00] E             y[i]: 0.06819657403670369
[2019-10-23 10:13:00] E             relative error[i]: 0.00249498151448332
[2019-10-23 10:13:00] E             absolute error[i]: 0.00017014919157266883
[2019-10-23 10:13:00] E             relative tolerance * |y[i]|: 6.819657403670369e-05
[2019-10-23 10:13:00] E             absolute tolerance: 0.0001
[2019-10-23 10:13:00] E             total tolerance: 0.0001681965740367037
[2019-10-23 10:13:00] E           x: 0.06802642
[2019-10-23 10:13:00] E           y: 0.06819657
[2019-10-23 10:13:00] 
[2019-10-23 10:13:00] ../../../virtualenv/python3.5.6/lib/python3.5/site-packages/chainer/gradient_check.py:536: FunctionTestError
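To make the failure above easier to read: chainer.gradient_check's check_backward / check_double_backward estimate a directional derivative numerically with central differences (hence the eps=0.001 and the random "directions" arrays in the log), compare it with the value obtained from backward(), and accept the pair when the absolute difference stays within atol + rtol * |backward value|, which is the "total tolerance" printed above. Plugging the logged numbers into that rule shows the check misses by only about 2e-6. A quick Python sketch of the arithmetic (variable names are illustrative, not Chainer API):

# Tolerance arithmetic behind the assert_allclose failure, using the exact
# values reported in the log above. Illustrative names, not Chainer API.
x = 0.06802642484513102   # "gradients (numeric)"  - central-difference estimate
y = 0.06819657403670369   # "gradients (backward)" - value from backward()
atol, rtol = 1e-4, 1e-3   # tolerances printed in the log

abs_err = abs(x - y)               # 1.7015e-04, the "absolute error" line
total_tol = atol + rtol * abs(y)   # 1.6820e-04, the "total tolerance" line

print(abs_err <= total_tol)        # False: over tolerance by roughly 2e-06

With float16 inputs and a freshly drawn random direction on every run, noise of that size in the numeric estimate is plausible, which matches the test passing on most runs and only occasionally failing, i.e. being flaky.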

Issue Analytics

  • State: closed
  • Created: 4 years ago
  • Comments: 5 (5 by maintainers)

Top GitHub Comments

1 reaction
kmaehashi commented, Oct 30, 2019

> Could that travis error happen before the patch was merged?

Right, sorry to bother you 🙇

0 reactions
emcastillo commented, Oct 30, 2019

Could that travis error happen before the patch was merged?


Top Results From Across the Web

Flaky tests — pytest documentation
A “flaky” test is one that exhibits intermittent or sporadic failure, that seems to have non-deterministic behaviour. Sometimes it passes, sometimes it ...

What are Flaky Tests? | TeamCity CI/CD Guide - JetBrains
Flaky tests are tests that return new results, despite there being no changes to code. Find out why flaky tests matter and how...

Flaky tests - GitLab Docs
Flaky tests. What's a flaky test? It's a test that sometimes fails, but if you retry it enough times, it passes, eventually.

What is a flaky test? Definition from WhatIs.com. - TechTarget
A flaky test is an analysis of web application code that fails to produce the same result each time the same analysis is...

How to Fix Flaky Tests - Semaphore CI
Randomly failing tests are the hardest to debug. Here's a framework you can use to fix them and keep your test suite healthy....
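The articles above land on the same two options for a numeric-precision flake like this one: make the test deterministic enough to pass reliably (for this failure, that would mean loosening atol/rtol for the float16 case or computing the reference gradient in a wider dtype), or quarantine/retry it. Purely as an illustration of the retry option, here is a minimal sketch assuming the third-party pytest-rerunfailures plugin; nothing in this issue says Chainer's CI uses it, and the test name below is hypothetical:

# Illustrative only: auto-retry a known-flaky test before reporting failure.
# Assumes the third-party plugin is installed: pip install pytest-rerunfailures
import pytest

@pytest.mark.flaky(reruns=3, reruns_delay=1)         # retry up to 3 times, 1 s apart
def test_softmax_cross_entropy_double_backward():    # hypothetical test name
    ...  # the gradient check that intermittently exceeds tolerance

The same plugin offers a command-line form, pytest --reruns 3, which retries every failing test. The trade-off is that retries can hide a genuine precision or correctness bug, so widening the float16 tolerances is usually the better long-term fix.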
