
Simplifying pytest markers/skip conditions for "no internet"


I’m trying to enable test runs in the conda-forge builds of this package, but I’m finding it hard to get the implementation right.

One of the reasons for this is that this package implements a way to check for internet access that is a bit unusual for a pytest-based framework. As far as I can tell, if the environment variable IMAGEIO_NO_INTERNET is set to a truthy value, certain tests should be skipped. This is handled inside imageio/testing.py, where a function called need_internet() issues pytest.skip("No internet"); that function is in turn called at the start of every test in tests/ that would otherwise require internet access.
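
For readers unfamiliar with the pattern, the helper described above roughly amounts to the following. This is a paraphrased sketch rather than the actual contents of imageio/testing.py, and the exact set of accepted truthy values is an assumption:

import os

import pytest


def need_internet():
    # Sketch only: skip the calling test when the environment variable
    # IMAGEIO_NO_INTERNET is set to a truthy value.
    if os.getenv("IMAGEIO_NO_INTERNET", "").lower() in ("1", "true", "yes"):
        pytest.skip("No internet")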

Would it be possible to consider simplifying this so that, instead of this system, you mark the tests that require internet with something like:

@pytest.mark.requires_internet
def test_bla():
    ...

Then, you select tests with or without internet access with:

$ pytest -m "requires_internet" ... # all tests that require internet
$ pytest -m "not requires_internet" ... # all tests that do not require internet

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 8 (8 by maintainers)

Top GitHub Comments

1 reaction
anjos commented, Dec 3, 2021

The possibility to run the unit tests directly by calling pytest instead of going through invoke,

This should already be possible. I don’t use invoke in my local setup at all and instead directly call pytest or coverage run -m pytest. Our CPython CI matrix calls pytest directly as well. Are you facing issues with this?

requiring pytest-cov to be installed together with .coveragerc, which is actually not packaged in the PyPI package

We shouldn’t package the tests into the PyPI package/wheel, should we? What would be the advantage of shipping .coveragerc and adding pytest-cov as a requirement?

This probably relates to my own ignorance about how to maintain this package: looking at the ci.yaml file, I was under the impression that invoke was required. This seems to work, if ffmpeg and imageio-ffmpeg are not installed:

$ pytest -sv --ignore=tests/test_ffmpeg.py --ignore=tests/test_ffmpeg_info.py tests/

Please ignore that point.

Auto-skip markers for modules/tests requiring extra python modules to be installed

Yes, this would be nice. I think it will strongly correlate with the individual plugins, which may or may not have an optional backend as a dependency. This will become even more relevant in the future because backends will move into their own repo instead of being vendored here.

OK.
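
For reference, the simplest form of such auto-skips is already covered by pytest.importorskip. A minimal sketch, using imageio_ffmpeg purely as an example of an optional backend (the test itself is hypothetical):

import pytest

# Skip every test in this module when the optional backend is not installed.
imageio_ffmpeg = pytest.importorskip("imageio_ffmpeg")


def test_ffmpeg_version_available():
    # Runs only when the import above succeeded.
    assert imageio_ffmpeg.get_ffmpeg_version()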

Removal of tests concerning ffmpeg, which could be moved to imageio-ffmpeg instead

This is a tricky one. Tests targeting the backend can move to imageio-ffmpeg, yes. However, tests that target the plugin itself will have to stay, since the plugin will continue to live here for the time being.

There are discussions about allowing and/or moving to entrypoint-based plugin discovery, which would allow the plugin itself to live in a separate repo as well. It’s a pretty cool suggestion, and in that case we could think about migrating all of ffmpeg’s tests together with the plugin itself. I am currently vetoing this suggestion though, because there are a few features in ImageIO that I am not sure how to easily migrate/integrate into such an approach.

So yeah, bottom line is that I am happy to move some of the ffmpeg tests to imageio-ffmpeg if it makes sense, but I doubt we can move them all, because the plugin lives in ImageIO and needs coverage.

Fair enough - I’ll leave this bit to you then.
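
As an aside, the consumer side of entrypoint-based discovery can stay quite small. The sketch below assumes a hypothetical entry-point group name, "imageio.plugins", which is not something ImageIO defines today, and it relies on Python 3.10+ where entry_points() accepts a group filter:

from importlib.metadata import entry_points


def discover_plugins(group="imageio.plugins"):
    # Load every installed plugin that registered itself under the group.
    return {ep.name: ep.load() for ep in entry_points(group=group)}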

I can prepare a PR with those.

🚀 🚀 that sounds pretty awesome @anjos

0 reactions
FirefoxMetzger commented, Dec 3, 2021

That’s still there, is it not?

Yes, we still do that. If you think it is still relevant to have them, then let’s keep them around.

Simply skipping a test when an internet request fails does not seem like a good idea. If there are networking errors, the tests pass even though you expected the whole suite to run. Better keep this explicit.

That’s a valid point; I didn’t think about that at all. This is indeed enough reason to keep a switch like this around.

