Enforce no new warnings in test suite
See original GitHub issue.

Our integration tests emit quite a few warnings. From skimming them, they are all caused by testing deprecated behavior. We should do two things to clean this up:
- Add some functionality to capture deprecation warnings, but let all others pass through. I'm thinking of a decorator that can be slapped onto a test case or test.
- Configure `pytest` to fail on warnings (see the config sketch below).
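For the second point, a minimal sketch of the configuration, going through `pytest`'s `filterwarnings` ini option:

```ini
# pytest.ini -- turn every warning that reaches pytest into a test failure
[pytest]
filterwarnings =
    error
```

Our own deprecation warnings would then need an explicit escape hatch, e.g. the decorator from the first point.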
Issue Analytics
- Created: 2 years ago
- Comments: 5 (5 by maintainers)
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
Hey Kush, awesome to have you here! Go ahead, pleased to have you working on this.
Let me illustrate this with an example. Let's say we have a function `foo`. This function calls into `torch`, and in a newer release they emit a warning, for example that the default arguments will be changed in the future. This warning is legitimate and should not be suppressed by the test suite. On the other hand, if we decide to deprecate `foo`, it will emit some warning (more details below) until the next major release.

Since we have tests for `foo` in our test suite, we will now also see the deprecation warning in the test log. If we enforce the warnings-equal-failures rule, the test will also fail.

My idea is the following: we create a decorator, e.g. `@deprecated`, that we can slap on every such test and that captures our deprecation warnings, i.e. suppresses them, while legitimate warnings will be emitted and in turn fail the test. We have very basic functionality like this already in https://github.com/pystiche/pystiche/blob/4324f34f74f53ef0c143b7f0d6eac99ae082636c/pystiche/misc.py#L283
In my current understanding, we would ignore warnings emitted by `pystiche` that are either a `DeprecationWarning` or a `UserWarning` containing the word "deprecate"
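A minimal sketch of what such a decorator could look like, built only on the stdlib `warnings` module. The name `deprecated` and the filter criteria come from the description above; the module check via `warning.filename` and everything else is an assumption, not the actual implementation in `pystiche.misc`:

```python
import functools
import re
import warnings


def deprecated(test_fn):
    """Capture our own deprecation warnings; re-emit everything else.

    Sketch: a warning counts as "ours" if it originates from a pystiche
    module (crude filename check, an assumption) and is a
    DeprecationWarning, or a UserWarning mentioning "deprecate".
    """

    @functools.wraps(test_fn)
    def wrapper(*args, **kwargs):
        with warnings.catch_warnings(record=True) as captured:
            warnings.simplefilter("always")  # record everything, raise nothing
            result = test_fn(*args, **kwargs)

        for warning in captured:
            ours = "pystiche" in warning.filename and (
                issubclass(warning.category, DeprecationWarning)
                or (
                    issubclass(warning.category, UserWarning)
                    and re.search("deprecate", str(warning.message), re.IGNORECASE)
                )
            )
            if not ours:
                # Re-emit legitimate warnings outside the capture context so
                # the warnings-as-errors configuration can fail the test.
                warnings.warn_explicit(
                    warning.message, warning.category, warning.filename, warning.lineno
                )
        return result

    return wrapper
```

Usage would then just be slapping `@deprecated` onto any test that exercises deprecated behavior.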
. We will probably need to play with the `warnings` package and its filters. `pytest` also has a `recwarn` fixture that captures all warnings. Maybe we could use that to filter after the test was run. We are doing something similar in our gallery tests: https://github.com/pystiche/pystiche/blob/4324f34f74f53ef0c143b7f0d6eac99ae082636c/tests/galleries/test_galleries.py#L163
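With the `recwarn` fixture, the filtering could happen after the test body, along these lines. `foo` here is a stand-in for a deprecated function, not real `pystiche` API, and the filter only checks the warning category for brevity:

```python
import warnings


def foo():
    # Stand-in for a deprecated function; emits the kind of warning the
    # test suite should tolerate.
    warnings.warn("foo is deprecated", DeprecationWarning)


def test_foo(recwarn):
    foo()

    # recwarn recorded every warning raised in the test body; anything
    # that is not one of our deprecation warnings is unexpected.
    unexpected = [
        w for w in recwarn if not issubclass(w.category, DeprecationWarning)
    ]
    assert not unexpected, f"unexpected warnings: {unexpected}"
```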
Hopefully, my intention is clearer now. Let me know if you still have questions.
Hey @krshrimali, do you have an update on this?