
`test_group_normalization` is flaky


This is part of the effort tracked in https://github.com/chainer/chainer/issues/6903.
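For context, group normalization splits each sample's channels into groups and normalizes every (sample, group) slice by its own mean and variance before applying a per-channel scale and shift. The sketch below is only an illustration of that forward computation, not Chainer's GroupNormalization link itself; the shapes mirror the failing parameterization further down.

```python
import numpy as np

def group_norm(x, gamma, beta, groups, eps=1e-5):
    # x: (N, C, ...); gamma, beta: per-channel scale and shift of shape (C,).
    n, c = x.shape[0], x.shape[1]
    spatial = x.shape[2:]
    # Normalize each (sample, group) slice by its own mean and variance.
    xg = x.reshape(n, groups, c // groups, *spatial)
    axes = tuple(range(2, xg.ndim))
    mean = xg.mean(axis=axes, keepdims=True)
    var = xg.var(axis=axes, keepdims=True)
    x_hat = ((xg - mean) / np.sqrt(var + eps)).reshape(x.shape)
    # Per-channel affine transform, broadcast over the spatial axes.
    bshape = (1, c) + (1,) * len(spatial)
    return gamma.reshape(bshape) * x_hat + beta.reshape(bshape)

# Shapes matching the failing parameterization below: float16 input,
# groups=4, shape (5, 4, 7).
x = np.random.uniform(-1, 1, (5, 4, 7)).astype(np.float16)
y = group_norm(x, np.ones(4, np.float16), np.zeros(4, np.float16), groups=4)
```

Note that with groups=4 and only 4 channels, each group holds a single channel, so the statistics are taken over nothing but the length-7 spatial axis.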

E   Not equal to tolerance rtol=0.01, atol=0.001
E   
E   Mismatch: 100%
E   Max absolute difference: 0.15960103
E   Max relative difference: 0.0476208
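The check behind this message is the usual elementwise tolerance criterion, |x - y| <= atol + rtol * |y|; the same formula appears as the "total tolerance" line in the longer report further down. A small sketch of that criterion, using a reference magnitude of roughly 3.35 inferred from the reported absolute and relative differences (0.1596 / 0.0476):

```python
import numpy as np

def within_tolerance(x, y, rtol=1e-2, atol=1e-3):
    # The elementwise criterion assert_allclose enforces:
    # every element must satisfy |x - y| <= atol + rtol * |y|.
    return bool(np.all(np.abs(x - y) <= atol + rtol * np.abs(y)))

# Approximate numbers implied by the report above: an absolute error of
# ~0.16 against a reference of roughly 3.35.
reference = np.array(3.35)
computed = np.array(3.35 + 0.16)
print(within_tolerance(computed, reference))                        # False
print(bool(np.isclose(computed, reference, rtol=1e-2, atol=1e-3)))  # False as well
```

An absolute error of about 0.16 would need the reference magnitude to exceed roughly 16 before rtol=0.01 alone could absorb it, which is why a result that is only about 5% off the reference still fails the check.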

Issue Analytics

  • State: closed
  • Created: 4 years ago
  • Comments: 12 (12 by maintainers)

Top GitHub Comments

1 reaction · himkt commented, Sep 26, 2019

0 reactions · toslunar commented, Oct 11, 2019

Another example

AssertionError: Parameterized test failed.

Base test method: GroupNormalizationTest.test_backward_cpu
Test parameters:
  dtype: <class 'numpy.float16'>
  groups: 4
  shape: (5, 4, 7)


(caused by)
AssertionError: check_backward failed (eps=0.02 atol=0.5 rtol=0.1)
inputs[0]:
[[[-0.6597   -0.2786   -0.4502   -0.866     0.6787    0.9307   -0.6157  ]
  [-0.3582    0.6567    0.544    -0.393     0.2432   -0.0981   -0.2578  ]
  [-0.334    -0.951    -0.2507    0.8833    0.569    -0.4792    0.525   ]
  [-0.1443    0.3481   -0.08356  -0.06055   0.9985    0.9395   -0.377   ]]

 [[-0.5396   -0.6455    0.7173   -0.8374    0.2532   -0.2427    0.906   ]
  [-0.835    -0.264     0.4263   -0.8633    0.3613   -0.04776   0.8696  ]
  [ 0.7144    0.8423    0.622    -0.8354    0.004303  0.7188    0.9087  ]
  [ 0.194    -0.767     0.377    -0.4292    0.738     0.859    -0.785   ]]

 [[-0.5293    0.9883   -0.8364   -0.7188   -0.6274    0.5396   -0.859   ]
  [ 0.1316    0.8735   -0.519     0.5107   -0.656     0.0879    0.4866  ]
  [ 0.428    -0.2292   -0.587    -0.646     0.4102   -0.4197   -0.4165  ]
  [ 0.5737    0.9214   -0.1604    0.525     0.364     0.536     0.3496  ]]

 [[ 0.3718    0.493     0.051    -0.758    -0.867    -0.2612   -0.7446  ]
  [-0.277    -0.909    -0.5264    0.09973  -0.1819    0.1396   -0.534   ]
  [ 0.4678   -0.01046   0.6934    0.7256    0.4968   -0.252    -0.921   ]
  [ 0.8867    0.353     0.2273    0.4565   -0.4707   -0.3164   -0.2764  ]]

 [[ 0.3518   -0.9614   -0.3672   -0.07025  -0.6577   -0.762    -0.649   ]
  [ 0.3206    0.777     0.01251   0.864    -0.4597    0.488     0.0858  ]
  [-0.06537  -0.2856   -0.851     0.5986   -0.4417    0.2086    0.61    ]
  [-0.3367   -0.1401   -0.4258   -0.2695    0.7236   -0.672    -0.2329  ]]]
grad_outputs[0]:
[[[ 0.1287    0.8086   -0.211    -0.1809    0.3425   -0.841     0.5474  ]
  [ 0.4956    0.688    -0.9575    0.692    -0.8716   -0.7676    0.676   ]
  [-0.4255   -0.7314   -0.1958    0.0044    0.8174    0.677    -0.1605  ]
  [-0.704    -0.07697  -0.537    -0.454     0.7397    0.4985    0.3198  ]]

 [[ 0.1519   -0.946    -0.5903    0.01534   0.897     0.79      0.0587  ]
  [ 0.5347    0.3987   -0.3652   -0.724     0.3708   -0.738    -0.415   ]
  [ 0.609    -0.7617   -0.7607    0.889    -0.1705    0.6797    0.06274 ]
  [ 0.0495    0.3865   -0.551     0.1406    0.6577    0.7563    0.3464  ]]

 [[ 0.09216  -0.7744    0.0917   -0.503    -0.7583   -0.784     0.0779  ]
  [ 0.2474    0.5327    0.2416    0.228    -0.512    -0.813    -0.2429  ]
  [-0.04     -0.001039 -0.562     0.7646   -0.318     0.4712    0.435   ]
  [ 0.874    -0.8486    0.1304    0.3389   -0.3306   -0.12134  -0.583   ]]

 [[-0.9175   -0.467     0.2722    0.8135   -0.437    -0.656     0.03123 ]
  [-0.9517    0.98      0.02519  -0.2747   -0.6406   -0.739     0.2195  ]
  [-0.565    -0.958    -0.813     0.62      0.15      0.0841   -0.4375  ]
  [-0.3457   -0.553     0.7656    0.6953    0.5923   -0.5063   -0.334   ]]

 [[ 0.0801   -0.0768    0.6587   -0.823     0.9614   -0.554     0.137   ]
  [ 0.182    -0.1415    0.2942   -0.571     0.419    -0.4385   -0.746   ]
  [-0.26     -0.9766   -0.3213   -0.7007   -0.2642   -0.805     0.2556  ]
  [ 0.4822   -0.6055   -0.8003   -0.777     0.598     0.8296    0.1587  ]]]
directions[0]:
[[[-0.01468615 -0.09900811  0.05770274  0.16961139  0.13785817
    0.05956179 -0.05523282]
  [-0.04244058  0.04156103  0.0240149   0.00614165  0.16047539
   -0.09964515 -0.0277245 ]
  [ 0.02259495  0.02067897 -0.05027345  0.00909339  0.10152042
    0.00113541 -0.04476323]
  [ 0.10639621  0.15413874 -0.08829545  0.1454353  -0.03320317
    0.0972344  -0.02483349]]

 [[-0.04277881 -0.1992991  -0.11340424  0.07463647  0.02185516
    0.08653305  0.01752948]
  [ 0.0990689   0.05532828 -0.0054805  -0.09962643 -0.070469
   -0.1708188  -0.01446748]
  [-0.10783753  0.02778369  0.10607588 -0.03587572 -0.01090009
   -0.02730403 -0.00034164]
  [-0.05480339 -0.03913067  0.02592795  0.04751338 -0.0633315
   -0.02053214 -0.04563593]]

 [[ 0.02175723 -0.09239164 -0.12681513 -0.06913682 -0.06859271
   -0.07661728 -0.03470916]
  [-0.04422379  0.03413646 -0.01430542 -0.06651568  0.05157813
   -0.07709458  0.09647058]
  [-0.03787866  0.13208998  0.10414651  0.04908289 -0.02233731
   -0.00272939 -0.14678258]
  [-0.03380065 -0.13765129 -0.10579873  0.0380069   0.08793416
    0.02926048 -0.14739194]]

 [[-0.05507611 -0.06633239  0.0268588   0.1840289  -0.0598383
   -0.02859491 -0.12536277]
  [ 0.02391206 -0.09718093 -0.06847372 -0.00030633  0.09383173
   -0.02016162 -0.03341546]
  [-0.04755332  0.03665817 -0.14065573  0.18581939 -0.01328972
    0.02539535 -0.01013336]
  [ 0.09329467  0.07517429  0.15977909 -0.06656089 -0.06847176
    0.03716567  0.07480484]]

 [[ 0.06814205 -0.04338883  0.23023077  0.05564347  0.01290136
   -0.10552235  0.05723653]
  [-0.10738653 -0.03292248  0.00364917  0.04390743  0.07575453
    0.02128216 -0.05718761]
  [-0.11509236  0.03074343 -0.01263547  0.00048201  0.12042229
    0.11167998 -0.1840705 ]
  [-0.03566337  0.09635881 -0.04111339 -0.00066766  0.02434905
    0.03048787 -0.05656357]]]
directions[1]:
[-0.04457415  0.1134466   0.01029318 -0.02037974]
directions[2]:
[ 0.05970271 -0.00331092  0.06785715  0.17863824]
gradients (numeric):  0.6237263900402468
gradients (backward): 0.07962147490314936

x: numeric gradient, y: backward gradient
Not equal to tolerance rtol=0.1, atol=0.5

Mismatch: 100%
Max absolute difference: 0.54410492
Max relative difference: 6.83364527
 x: array(0.623726)
 y: array(0.079621)

assert_allclose failed: 
  shape: () ()
  dtype: float64 float64
  i: (0,)
  x[i]: 0.6237263900402468
  y[i]: 0.07962147490314936
  relative error[i]: 6.833645267171203
  absolute error[i]: 0.5441049151370974
  relative tolerance * |y[i]|: 0.007962147490314936
  absolute tolerance: 0.5
  total tolerance: 0.507962147490315
x: 0.62372639
y: 0.07962147
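What check_backward compares in the output above is a single directional derivative rather than the full Jacobian: it draws one random direction per differentiable input (the three directions arrays, presumably for x, gamma, and beta), perturbs the inputs along those directions with step eps, and checks the resulting numeric estimate against the projection of the backprop gradients onto the same directions; that is where the two scalars, "gradients (numeric)" and "gradients (backward)", come from. A minimal sketch of the idea, not chainer.gradient_check.check_backward itself (function names here are illustrative):

```python
import numpy as np

def directional_check(forward, backward, inputs, gy, directions, eps=0.02):
    # Numeric side: central difference of sum(forward(inputs) * gy)
    # along the chosen random direction, with step size eps.
    plus = forward(*[x + eps * d for x, d in zip(inputs, directions)])
    minus = forward(*[x - eps * d for x, d in zip(inputs, directions)])
    numeric = float(np.sum((plus - minus) * gy) / (2 * eps))

    # Backward side: project the backprop gradients onto the same direction.
    grads = backward(*inputs, gy)
    analytic = float(sum(np.sum(g * d) for g, d in zip(grads, directions)))
    return numeric, analytic

# Tiny usage with y = x ** 2, whose gradient w.r.t. x is 2 * x * gy;
# the two returned numbers agree to within floating-point noise.
x = np.random.uniform(-1, 1, 7)
gy = np.random.uniform(-1, 1, 7)
d = [np.random.normal(size=7)]
print(directional_check(lambda x: x ** 2,
                        lambda x, gy: (2 * x * gy,),
                        [x], gy, d, eps=1e-3))
```

With float16 inputs the forward pass carries only about three decimal digits of precision, so a central difference with eps=0.02 can be dominated by rounding error; that kind of noise in the numeric estimate is consistent with the test passing or failing from run to run.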