
Double reporting teardown failure

See original GitHub issue

When the teardown fails, the last test case gets reported twice: once with its own verdict, and once with the error of the teardown. This also happens in the junit-xml, but only if the last test case fails (example 2). My main question is: is this behavior intended, especially the double reporting within the junit-xml, where the same test case shows up twice?

There is a related issue reporting that the test count is wrong when the teardown fails, which also shows up here.

Example 1: Last test case passes

def test_fail():
    assert False

def test_pass():
    pass

def teardown():
    assert 1 == 0
> python3 -m pytest -v --junit-xml junit.xml test_double_outcome.py
===================== test session starts =====================
platform linux -- Python 3.9.2, pytest-7.1.2, pluggy-1.0.0 -- REMOVED/venv/bin/python3
cachedir: .pytest_cache
rootdir: REMOVED
collected 2 items                                             

test_double_outcome.py::test_fail FAILED                [ 50%]
test_double_outcome.py::test_pass PASSED                [100%]
test_double_outcome.py::test_pass ERROR                 [100%]

=========================== ERRORS ============================
_______________ ERROR at teardown of test_pass ________________

    def teardown():
>       assert 1 == 0
E       assert 1 == 0

test_double_outcome.py:8: AssertionError
========================== FAILURES ===========================
__________________________ test_fail __________________________

    def test_fail():
>       assert False
E       assert False

test_double_outcome.py:2: AssertionError
- generated xml file: REMOVED/junit.xml -
=================== short test summary info ===================
FAILED test_double_outcome.py::test_fail - assert False
ERROR test_double_outcome.py::test_pass - assert 1 == 0
============ 1 failed, 1 passed, 1 error in 0.23s =============

In the junit-xml, the test shows up only once.

<?xml version="1.0" ?>
<testsuites>
    <testsuite name="pytest" errors="1" failures="1" skipped="0" tests="3" time="0.245" timestamp="2022-05-02T17:54:49.261096" hostname="db3">
        <testcase classname="test_double_outcome" name="test_fail" time="0.002">
            <failure message="assert False">def test_fail():
&gt;       assert False
E       assert False

test_double_outcome.py:2: AssertionError</failure>
        </testcase>
        <testcase classname="test_double_outcome" name="test_pass" time="0.001">
            <error message="failed on teardown with &quot;assert 1 == 0&quot;">def teardown():
&gt;       assert 1 == 0
E       assert 1 == 0

test_double_outcome.py:8: AssertionError</error>
        </testcase>
    </testsuite>
</testsuites>

Example 2: Last test case fails

If we switch the order of the test cases, the double reporting also shows up in the junit-xml.

def test_pass():
    pass

def test_fail():
    assert False

def teardown():
    assert 1 == 0
> python3 -m pytest -v --junit-xml junit.xml test_double_outcome.py
===================== test session starts =====================
platform linux -- Python 3.9.2, pytest-7.1.2, pluggy-1.0.0 -- REMOVED/venv/bin/python3
cachedir: .pytest_cache
rootdir: REMOVED
collected 2 items                                             

test_double_outcome.py::test_pass PASSED                [ 50%]
test_double_outcome.py::test_fail FAILED                [100%]
test_double_outcome.py::test_fail ERROR                 [100%]

=========================== ERRORS ============================
_______________ ERROR at teardown of test_fail ________________

    def teardown():
>       assert 1 == 0
E       assert 1 == 0

test_double_outcome.py:8: AssertionError
========================== FAILURES ===========================
__________________________ test_fail __________________________

    def test_fail():
>       assert False
E       assert False

test_double_outcome.py:5: AssertionError
- generated xml file: REMOVED/junit.xml -
=================== short test summary info ===================
FAILED test_double_outcome.py::test_fail - assert False
ERROR test_double_outcome.py::test_fail - assert 1 == 0
============ 1 failed, 1 passed, 1 error in 0.25s =============

In the junit-xml, test_fail now shows up twice:

<?xml version="1.0" ?>
<testsuites>
    <testsuite name="pytest" errors="1" failures="1" skipped="0" tests="2" time="0.044" timestamp="2022-05-02T17:52:52.371978" hostname="db3">
        <testcase classname="test_double_outcome" name="test_pass" time="0.001"/>
        <testcase classname="test_double_outcome" name="test_fail" time="0.001">
            <failure message="assert False">def test_fail():
&gt;       assert False
E       assert False

test_double_outcome.py:5: AssertionError</failure>
        </testcase>
        <testcase classname="test_double_outcome" name="test_fail" time="0.000">
            <error message="failed on teardown with &quot;assert 1 == 0&quot;">def teardown():
&gt;       assert 1 == 0
E       assert 1 == 0

test_double_outcome.py:8: AssertionError</error>
        </testcase>
    </testsuite>
</testsuites>
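The doubled entry can be confirmed programmatically. Below is a minimal sketch, using only the standard library and a cut-down stand-in for the junit.xml from Example 2 (the failure/error bodies are elided), that counts how often each test name appears in the report:

```python
# Hypothetical check: find test names that appear more than once in a junit XML.
import xml.etree.ElementTree as ET
from collections import Counter

# Cut-down stand-in for the junit.xml generated in Example 2.
JUNIT_XML = """<?xml version="1.0" ?>
<testsuites>
  <testsuite name="pytest" errors="1" failures="1" tests="2">
    <testcase classname="test_double_outcome" name="test_pass" time="0.001"/>
    <testcase classname="test_double_outcome" name="test_fail" time="0.001"/>
    <testcase classname="test_double_outcome" name="test_fail" time="0.000"/>
  </testsuite>
</testsuites>"""

def duplicated_testcases(xml_text):
    """Return test names that occur in more than one <testcase> element."""
    root = ET.fromstring(xml_text)
    names = Counter(tc.get("name") for tc in root.iter("testcase"))
    return sorted(name for name, count in names.items() if count > 1)

print(duplicated_testcases(JUNIT_XML))  # → ['test_fail']
```

Note that in the real report the testsuite's tests attribute says 2 while three <testcase> elements are present, which matches the related wrong-test-count issue mentioned above.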

My setup

> pip list          
Package       Version
------------- -------
attrs         21.4.0
iniconfig     1.1.1
packaging     21.3
pip           20.3.4
pkg-resources 0.0.0
pluggy        1.0.0
py            1.11.0
pyparsing     3.0.8
pytest        7.1.2
setuptools    44.1.1
tomli         2.0.1
wheel         0.34.2


> python3 -m pytest --version
pytest 7.1.2

Issue Analytics

  • State: open
  • Created: a year ago
  • Comments: 6 (4 by maintainers)

Top GitHub Comments

1 reaction
asottile commented, May 4, 2022

@RonnyPfannschmidt there isn't a "sometimes" doubling if you look at the xml; op was just confused by which test the teardown failure was associated with

0 reactions
RonnyPfannschmidt commented, May 4, 2022

@asottile as I'm reading this again on a computer, it's clearly doing something wrong

it should report the teardown errors in the junit, and it shouldn't blindly produce double reports. My understanding is that a plain teardown should happen for every test, not every module

so either we are missing a report or something else is completely amiss

Read more comments on GitHub >

