
2.15.1: pytest is failing

See original GitHub issue

I’m not sure … am I missing something in my build environment?

+ /usr/bin/python3 -s -B -m compileall2 -f -j48 -o 0 -o 1 -o 2 -s /home/tkloczko/rpmbuild/BUILDROOT/python-fastjsonschema-2.15.1-2.fc35.x86_64 -p / /home/tkloczko/rpmbuild/BUILDROOT/python-fastjsonschema-2.15.1-2.fc35.x86_64/usr/lib64/python3.8/site-packages /home/tkloczko/rpmbuild/BUILDROOT/python-fastjsonschema-2.15.1-2.fc35.x86_64/usr/lib/python3.8/site-packages
Listing '/home/tkloczko/rpmbuild/BUILDROOT/python-fastjsonschema-2.15.1-2.fc35.x86_64/usr/lib64/python3.8/site-packages'...
Can't list '/home/tkloczko/rpmbuild/BUILDROOT/python-fastjsonschema-2.15.1-2.fc35.x86_64/usr/lib64/python3.8/site-packages'
Listing '/home/tkloczko/rpmbuild/BUILDROOT/python-fastjsonschema-2.15.1-2.fc35.x86_64/usr/lib/python3.8/site-packages'...
Listing '/home/tkloczko/rpmbuild/BUILDROOT/python-fastjsonschema-2.15.1-2.fc35.x86_64/usr/lib/python3.8/site-packages/fastjsonschema'...
Listing '/home/tkloczko/rpmbuild/BUILDROOT/python-fastjsonschema-2.15.1-2.fc35.x86_64/usr/lib/python3.8/site-packages/fastjsonschema-2.15.1-py3.8.egg-info'...
Compiling '/home/tkloczko/rpmbuild/BUILDROOT/python-fastjsonschema-2.15.1-2.fc35.x86_64/usr/lib/python3.8/site-packages/fastjsonschema/__init__.py'...
Compiling '/home/tkloczko/rpmbuild/BUILDROOT/python-fastjsonschema-2.15.1-2.fc35.x86_64/usr/lib/python3.8/site-packages/fastjsonschema/__main__.py'...
Compiling '/home/tkloczko/rpmbuild/BUILDROOT/python-fastjsonschema-2.15.1-2.fc35.x86_64/usr/lib/python3.8/site-packages/fastjsonschema/draft04.py'...
Compiling '/home/tkloczko/rpmbuild/BUILDROOT/python-fastjsonschema-2.15.1-2.fc35.x86_64/usr/lib/python3.8/site-packages/fastjsonschema/draft07.py'...
Compiling '/home/tkloczko/rpmbuild/BUILDROOT/python-fastjsonschema-2.15.1-2.fc35.x86_64/usr/lib/python3.8/site-packages/fastjsonschema/draft06.py'...
Compiling '/home/tkloczko/rpmbuild/BUILDROOT/python-fastjsonschema-2.15.1-2.fc35.x86_64/usr/lib/python3.8/site-packages/fastjsonschema/generator.py'...
Compiling '/home/tkloczko/rpmbuild/BUILDROOT/python-fastjsonschema-2.15.1-2.fc35.x86_64/usr/lib/python3.8/site-packages/fastjsonschema/exceptions.py'...
Compiling '/home/tkloczko/rpmbuild/BUILDROOT/python-fastjsonschema-2.15.1-2.fc35.x86_64/usr/lib/python3.8/site-packages/fastjsonschema/ref_resolver.py'...
Compiling '/home/tkloczko/rpmbuild/BUILDROOT/python-fastjsonschema-2.15.1-2.fc35.x86_64/usr/lib/python3.8/site-packages/fastjsonschema/indent.py'...
Compiling '/home/tkloczko/rpmbuild/BUILDROOT/python-fastjsonschema-2.15.1-2.fc35.x86_64/usr/lib/python3.8/site-packages/fastjsonschema/version.py'...
+ /usr/lib/rpm/redhat/brp-mangle-shebangs
Executing(%check): /bin/sh -e /var/tmp/rpm-tmp.ssskAE
+ umask 022
+ cd /home/tkloczko/rpmbuild/BUILD
+ cd python-fastjsonschema-2.15.1
+ CFLAGS='-O2 -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection -fdata-sections -ffunction-sections -flto=auto -flto-partition=none'
+ CXXFLAGS='-O2 -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection -fdata-sections -ffunction-sections -flto=auto -flto-partition=none'
+ FFLAGS='-O2 -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection -fdata-sections -ffunction-sections -flto=auto -flto-partition=none -I/usr/lib64/gfortran/modules'
+ FCFLAGS='-O2 -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection -fdata-sections -ffunction-sections -flto=auto -flto-partition=none -I/usr/lib64/gfortran/modules'
+ LDFLAGS='-Wl,-z,relro -Wl,--as-needed -Wl,--gc-sections -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -flto=auto -flto-partition=none -fuse-linker-plugin'
+ CC=/usr/bin/gcc
+ CXX=/usr/bin/g++
+ FC=/usr/bin/gfortran
+ AR=/usr/bin/gcc-ar
+ NM=/usr/bin/gcc-nm
+ RANLIB=/usr/bin/gcc-ranlib
+ export CFLAGS CXXFLAGS FFLAGS FCFLAGS LDFLAGS CC CXX FC AR NM RANLIB
+ PATH=/home/tkloczko/rpmbuild/BUILDROOT/python-fastjsonschema-2.15.1-2.fc35.x86_64/usr/bin:/usr/bin:/usr/sbin:/usr/local/sbin
+ PYTHONPATH=/home/tkloczko/rpmbuild/BUILDROOT/python-fastjsonschema-2.15.1-2.fc35.x86_64/usr/lib64/python3.8/site-packages:/home/tkloczko/rpmbuild/BUILDROOT/python-fastjsonschema-2.15.1-2.fc35.x86_64/usr/lib/python3.8/site-packages
+ /usr/bin/python3 -Bm pytest -ra
=========================================================================== test session starts ============================================================================
platform linux -- Python 3.8.9, pytest-6.2.4, py-1.10.0, pluggy-0.13.1
Using --randomly-seed=3133490998
rootdir: /home/tkloczko/rpmbuild/BUILD/python-fastjsonschema-2.15.1
plugins: forked-1.3.0, shutil-1.7.0, virtualenv-1.7.0, expect-1.1.0, httpbin-1.0.0, xdist-2.2.1, flake8-1.0.7, timeout-1.4.2, betamax-0.8.1, freezegun-0.4.2, cases-3.4.6, case-1.5.3, isort-1.3.0, aspectlib-1.5.2, asyncio-0.15.1, toolbox-0.5, xprocess-0.17.1, aiohttp-0.3.0, checkdocs-2.7.0, mock-3.6.1, rerunfailures-9.1.1, requests-mock-1.9.3, Faker-8.4.0, cov-2.12.1, randomly-3.8.0, pyfakefs-4.5.0, hypothesis-6.13.14
collected 396 items

tests/test_pattern_properties.py ....                                                                                                                                [  1%]
tests/test_null.py ......                                                                                                                                            [  2%]
tests/test_integration.py ...............                                                                                                                            [  6%]
tests/test_number.py ....................................................................................................                                            [ 31%]
tests/test_security.py .....................                                                                                                                         [ 36%]
tests/test_const.py .......                                                                                                                                          [ 38%]
tests/test_exceptions.py .............                                                                                                                               [ 41%]
tests/benchmarks/test_benchmark.py EEEEEEEEEEEE                                                                                                                      [ 44%]
tests/test_common.py ..................................                                                                                                              [ 53%]
tests/test_object.py .............................................................                                                                                   [ 68%]
tests/test_default.py ..........                                                                                                                                     [ 71%]
tests/test_string.py ...........................                                                                                                                     [ 78%]
tests/test_compile_to_code.py ...                                                                                                                                    [ 79%]
tests/test_format.py ........................                                                                                                                        [ 85%]
tests/json_schema/test_draft06.py s                                                                                                                                  [ 85%]
tests/test_boolean.py .......                                                                                                                                        [ 87%]
tests/json_schema/test_draft04.py s                                                                                                                                  [ 87%]
tests/json_schema/test_draft07.py s                                                                                                                                  [ 87%]
tests/test_array.py ............................                                                                                                                     [ 94%]
. .                                                                                                                                                                  [ 94%]
tests/test_array.py ....................                                                                                                                             [100%]

================================================================================== ERRORS ==================================================================================
____________________________________________________________ ERROR at setup of test_benchmark_ok_values[value1] ____________________________________________________________
file /home/tkloczko/rpmbuild/BUILD/python-fastjsonschema-2.15.1/tests/benchmarks/test_benchmark.py, line 50
  @pytest.mark.benchmark(min_rounds=20)
  @pytest.mark.parametrize('value', (
      [9, 'hello', [1, 'a', True], {'a': 'a', 'b': 'b', 'd': 'd'}, 42, 3],
      [9, 'world', [1, 'a', True], {'a': 'a', 'b': 'b', 'd': 'd'}, 42, 3],
      [9, 'world', [1, 'a', True], {'a': 'a', 'b': 'b', 'c': 'xy'}, 42, 3],
      [9, 'world', [1, 'a', True], {'a': 'a', 'b': 'b', 'c': 'xy'}, 'str', 5],
  ))
  def test_benchmark_ok_values(benchmark, value):
E       fixture 'benchmark' not found
>       available fixtures: _session_faker, aiohttp_client, aiohttp_raw_server, aiohttp_server, aiohttp_unused_port, asserter, betamax_parametrized_recorder, betamax_parametrized_session, betamax_recorder, betamax_session, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, class_based_httpbin, class_based_httpbin_secure, class_mocker, cov, doctest_namespace, event_loop, faker, faker_seed, fast, freezer, fs, httpbin, httpbin_both, httpbin_ca_bundle, httpbin_secure, loop, loop_debug, mocker, module_mocker, monkeypatch, no_cover, package_mocker, patching, proactor_loop, pytestconfig, raw_test_server, record_property, record_testsuite_property, record_xml_attribute, recwarn, requests_mock, session_mocker, smart_caplog, stdouts, test_client, test_server, testrun_uid, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, tmpworkdir, unused_port, unused_tcp_port, unused_tcp_port_factory, virtualenv, weave, worker_id, workspace, xprocess
>       use 'pytest --fixtures [testpath]' for help on them.

/home/tkloczko/rpmbuild/BUILD/python-fastjsonschema-2.15.1/tests/benchmarks/test_benchmark.py:50
___________________________________________________________ ERROR at setup of test_benchmark_bad_values[value3] ____________________________________________________________
file /home/tkloczko/rpmbuild/BUILD/python-fastjsonschema-2.15.1/tests/benchmarks/test_benchmark.py, line 63
  @pytest.mark.benchmark(min_rounds=20)
  @pytest.mark.parametrize('value', (
      [10, 'world', [1, 'a', True], {'a': 'a', 'b': 'b', 'c': 'xy'}, 'str', 5],
      [9, 'xxx', [1, 'a', True], {'a': 'a', 'b': 'b', 'c': 'xy'}, 'str', 5],
      [9, 'hello', [], {'a': 'a', 'b': 'b', 'c': 'xy'}, 'str', 5],
      [9, 'hello', [1, 2, 3], {'a': 'a', 'b': 'b', 'c': 'xy'}, 'str', 5],
      [9, 'hello', [1, 'a', True], {'a': 'a', 'x': 'x', 'y': 'y'}, 'str', 5],
      [9, 'hello', [1, 'a', True], {}, 'str', 5],
      [9, 'hello', [1, 'a', True], {'a': 'a', 'b': 'b', 'x': 'x'}, None, 5],
      [9, 'hello', [1, 'a', True], {'a': 'a', 'b': 'b', 'x': 'x'}, 42, 15],
  ))
  def test_benchmark_bad_values(benchmark, value):
E       fixture 'benchmark' not found
>       available fixtures: _session_faker, aiohttp_client, aiohttp_raw_server, aiohttp_server, aiohttp_unused_port, asserter, betamax_parametrized_recorder, betamax_parametrized_session, betamax_recorder, betamax_session, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, class_based_httpbin, class_based_httpbin_secure, class_mocker, cov, doctest_namespace, event_loop, faker, faker_seed, fast, freezer, fs, httpbin, httpbin_both, httpbin_ca_bundle, httpbin_secure, loop, loop_debug, mocker, module_mocker, monkeypatch, no_cover, package_mocker, patching, proactor_loop, pytestconfig, raw_test_server, record_property, record_testsuite_property, record_xml_attribute, recwarn, requests_mock, session_mocker, smart_caplog, stdouts, test_client, test_server, testrun_uid, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, tmpworkdir, unused_port, unused_tcp_port, unused_tcp_port_factory, virtualenv, weave, worker_id, workspace, xprocess
>       use 'pytest --fixtures [testpath]' for help on them.

/home/tkloczko/rpmbuild/BUILD/python-fastjsonschema-2.15.1/tests/benchmarks/test_benchmark.py:63
____________________________________________________________ ERROR at setup of test_benchmark_ok_values[value2] ____________________________________________________________
file /home/tkloczko/rpmbuild/BUILD/python-fastjsonschema-2.15.1/tests/benchmarks/test_benchmark.py, line 50
  @pytest.mark.benchmark(min_rounds=20)
  @pytest.mark.parametrize('value', (
      [9, 'hello', [1, 'a', True], {'a': 'a', 'b': 'b', 'd': 'd'}, 42, 3],
      [9, 'world', [1, 'a', True], {'a': 'a', 'b': 'b', 'd': 'd'}, 42, 3],
      [9, 'world', [1, 'a', True], {'a': 'a', 'b': 'b', 'c': 'xy'}, 42, 3],
      [9, 'world', [1, 'a', True], {'a': 'a', 'b': 'b', 'c': 'xy'}, 'str', 5],
  ))
  def test_benchmark_ok_values(benchmark, value):
E       fixture 'benchmark' not found
>       available fixtures: _session_faker, aiohttp_client, aiohttp_raw_server, aiohttp_server, aiohttp_unused_port, asserter, betamax_parametrized_recorder, betamax_parametrized_session, betamax_recorder, betamax_session, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, class_based_httpbin, class_based_httpbin_secure, class_mocker, cov, doctest_namespace, event_loop, faker, faker_seed, fast, freezer, fs, httpbin, httpbin_both, httpbin_ca_bundle, httpbin_secure, loop, loop_debug, mocker, module_mocker, monkeypatch, no_cover, package_mocker, patching, proactor_loop, pytestconfig, raw_test_server, record_property, record_testsuite_property, record_xml_attribute, recwarn, requests_mock, session_mocker, smart_caplog, stdouts, test_client, test_server, testrun_uid, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, tmpworkdir, unused_port, unused_tcp_port, unused_tcp_port_factory, virtualenv, weave, worker_id, workspace, xprocess
>       use 'pytest --fixtures [testpath]' for help on them.

/home/tkloczko/rpmbuild/BUILD/python-fastjsonschema-2.15.1/tests/benchmarks/test_benchmark.py:50
___________________________________________________________ ERROR at setup of test_benchmark_bad_values[value2] ____________________________________________________________
file /home/tkloczko/rpmbuild/BUILD/python-fastjsonschema-2.15.1/tests/benchmarks/test_benchmark.py, line 63
  @pytest.mark.benchmark(min_rounds=20)
  @pytest.mark.parametrize('value', (
      [10, 'world', [1, 'a', True], {'a': 'a', 'b': 'b', 'c': 'xy'}, 'str', 5],
      [9, 'xxx', [1, 'a', True], {'a': 'a', 'b': 'b', 'c': 'xy'}, 'str', 5],
      [9, 'hello', [], {'a': 'a', 'b': 'b', 'c': 'xy'}, 'str', 5],
      [9, 'hello', [1, 2, 3], {'a': 'a', 'b': 'b', 'c': 'xy'}, 'str', 5],
      [9, 'hello', [1, 'a', True], {'a': 'a', 'x': 'x', 'y': 'y'}, 'str', 5],
      [9, 'hello', [1, 'a', True], {}, 'str', 5],
      [9, 'hello', [1, 'a', True], {'a': 'a', 'b': 'b', 'x': 'x'}, None, 5],
      [9, 'hello', [1, 'a', True], {'a': 'a', 'b': 'b', 'x': 'x'}, 42, 15],
  ))
  def test_benchmark_bad_values(benchmark, value):
E       fixture 'benchmark' not found
>       available fixtures: _session_faker, aiohttp_client, aiohttp_raw_server, aiohttp_server, aiohttp_unused_port, asserter, betamax_parametrized_recorder, betamax_parametrized_session, betamax_recorder, betamax_session, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, class_based_httpbin, class_based_httpbin_secure, class_mocker, cov, doctest_namespace, event_loop, faker, faker_seed, fast, freezer, fs, httpbin, httpbin_both, httpbin_ca_bundle, httpbin_secure, loop, loop_debug, mocker, module_mocker, monkeypatch, no_cover, package_mocker, patching, proactor_loop, pytestconfig, raw_test_server, record_property, record_testsuite_property, record_xml_attribute, recwarn, requests_mock, session_mocker, smart_caplog, stdouts, test_client, test_server, testrun_uid, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, tmpworkdir, unused_port, unused_tcp_port, unused_tcp_port_factory, virtualenv, weave, worker_id, workspace, xprocess
>       use 'pytest --fixtures [testpath]' for help on them.

/home/tkloczko/rpmbuild/BUILD/python-fastjsonschema-2.15.1/tests/benchmarks/test_benchmark.py:63
____________________________________________________________ ERROR at setup of test_benchmark_ok_values[value3] ____________________________________________________________
file /home/tkloczko/rpmbuild/BUILD/python-fastjsonschema-2.15.1/tests/benchmarks/test_benchmark.py, line 50
  @pytest.mark.benchmark(min_rounds=20)
  @pytest.mark.parametrize('value', (
      [9, 'hello', [1, 'a', True], {'a': 'a', 'b': 'b', 'd': 'd'}, 42, 3],
      [9, 'world', [1, 'a', True], {'a': 'a', 'b': 'b', 'd': 'd'}, 42, 3],
      [9, 'world', [1, 'a', True], {'a': 'a', 'b': 'b', 'c': 'xy'}, 42, 3],
      [9, 'world', [1, 'a', True], {'a': 'a', 'b': 'b', 'c': 'xy'}, 'str', 5],
  ))
  def test_benchmark_ok_values(benchmark, value):
E       fixture 'benchmark' not found
>       available fixtures: _session_faker, aiohttp_client, aiohttp_raw_server, aiohttp_server, aiohttp_unused_port, asserter, betamax_parametrized_recorder, betamax_parametrized_session, betamax_recorder, betamax_session, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, class_based_httpbin, class_based_httpbin_secure, class_mocker, cov, doctest_namespace, event_loop, faker, faker_seed, fast, freezer, fs, httpbin, httpbin_both, httpbin_ca_bundle, httpbin_secure, loop, loop_debug, mocker, module_mocker, monkeypatch, no_cover, package_mocker, patching, proactor_loop, pytestconfig, raw_test_server, record_property, record_testsuite_property, record_xml_attribute, recwarn, requests_mock, session_mocker, smart_caplog, stdouts, test_client, test_server, testrun_uid, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, tmpworkdir, unused_port, unused_tcp_port, unused_tcp_port_factory, virtualenv, weave, worker_id, workspace, xprocess
>       use 'pytest --fixtures [testpath]' for help on them.

/home/tkloczko/rpmbuild/BUILD/python-fastjsonschema-2.15.1/tests/benchmarks/test_benchmark.py:50
____________________________________________________________ ERROR at setup of test_benchmark_ok_values[value0] ____________________________________________________________
file /home/tkloczko/rpmbuild/BUILD/python-fastjsonschema-2.15.1/tests/benchmarks/test_benchmark.py, line 50
  @pytest.mark.benchmark(min_rounds=20)
  @pytest.mark.parametrize('value', (
      [9, 'hello', [1, 'a', True], {'a': 'a', 'b': 'b', 'd': 'd'}, 42, 3],
      [9, 'world', [1, 'a', True], {'a': 'a', 'b': 'b', 'd': 'd'}, 42, 3],
      [9, 'world', [1, 'a', True], {'a': 'a', 'b': 'b', 'c': 'xy'}, 42, 3],
      [9, 'world', [1, 'a', True], {'a': 'a', 'b': 'b', 'c': 'xy'}, 'str', 5],
  ))
  def test_benchmark_ok_values(benchmark, value):
E       fixture 'benchmark' not found
>       available fixtures: _session_faker, aiohttp_client, aiohttp_raw_server, aiohttp_server, aiohttp_unused_port, asserter, betamax_parametrized_recorder, betamax_parametrized_session, betamax_recorder, betamax_session, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, class_based_httpbin, class_based_httpbin_secure, class_mocker, cov, doctest_namespace, event_loop, faker, faker_seed, fast, freezer, fs, httpbin, httpbin_both, httpbin_ca_bundle, httpbin_secure, loop, loop_debug, mocker, module_mocker, monkeypatch, no_cover, package_mocker, patching, proactor_loop, pytestconfig, raw_test_server, record_property, record_testsuite_property, record_xml_attribute, recwarn, requests_mock, session_mocker, smart_caplog, stdouts, test_client, test_server, testrun_uid, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, tmpworkdir, unused_port, unused_tcp_port, unused_tcp_port_factory, virtualenv, weave, worker_id, workspace, xprocess
>       use 'pytest --fixtures [testpath]' for help on them.

/home/tkloczko/rpmbuild/BUILD/python-fastjsonschema-2.15.1/tests/benchmarks/test_benchmark.py:50
___________________________________________________________ ERROR at setup of test_benchmark_bad_values[value4] ____________________________________________________________
file /home/tkloczko/rpmbuild/BUILD/python-fastjsonschema-2.15.1/tests/benchmarks/test_benchmark.py, line 63
  @pytest.mark.benchmark(min_rounds=20)
  @pytest.mark.parametrize('value', (
      [10, 'world', [1, 'a', True], {'a': 'a', 'b': 'b', 'c': 'xy'}, 'str', 5],
      [9, 'xxx', [1, 'a', True], {'a': 'a', 'b': 'b', 'c': 'xy'}, 'str', 5],
      [9, 'hello', [], {'a': 'a', 'b': 'b', 'c': 'xy'}, 'str', 5],
      [9, 'hello', [1, 2, 3], {'a': 'a', 'b': 'b', 'c': 'xy'}, 'str', 5],
      [9, 'hello', [1, 'a', True], {'a': 'a', 'x': 'x', 'y': 'y'}, 'str', 5],
      [9, 'hello', [1, 'a', True], {}, 'str', 5],
      [9, 'hello', [1, 'a', True], {'a': 'a', 'b': 'b', 'x': 'x'}, None, 5],
      [9, 'hello', [1, 'a', True], {'a': 'a', 'b': 'b', 'x': 'x'}, 42, 15],
  ))
  def test_benchmark_bad_values(benchmark, value):
E       fixture 'benchmark' not found
>       available fixtures: _session_faker, aiohttp_client, aiohttp_raw_server, aiohttp_server, aiohttp_unused_port, asserter, betamax_parametrized_recorder, betamax_parametrized_session, betamax_recorder, betamax_session, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, class_based_httpbin, class_based_httpbin_secure, class_mocker, cov, doctest_namespace, event_loop, faker, faker_seed, fast, freezer, fs, httpbin, httpbin_both, httpbin_ca_bundle, httpbin_secure, loop, loop_debug, mocker, module_mocker, monkeypatch, no_cover, package_mocker, patching, proactor_loop, pytestconfig, raw_test_server, record_property, record_testsuite_property, record_xml_attribute, recwarn, requests_mock, session_mocker, smart_caplog, stdouts, test_client, test_server, testrun_uid, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, tmpworkdir, unused_port, unused_tcp_port, unused_tcp_port_factory, virtualenv, weave, worker_id, workspace, xprocess
>       use 'pytest --fixtures [testpath]' for help on them.

/home/tkloczko/rpmbuild/BUILD/python-fastjsonschema-2.15.1/tests/benchmarks/test_benchmark.py:63
___________________________________________________________ ERROR at setup of test_benchmark_bad_values[value6] ____________________________________________________________
file /home/tkloczko/rpmbuild/BUILD/python-fastjsonschema-2.15.1/tests/benchmarks/test_benchmark.py, line 63
  @pytest.mark.benchmark(min_rounds=20)
  @pytest.mark.parametrize('value', (
      [10, 'world', [1, 'a', True], {'a': 'a', 'b': 'b', 'c': 'xy'}, 'str', 5],
      [9, 'xxx', [1, 'a', True], {'a': 'a', 'b': 'b', 'c': 'xy'}, 'str', 5],
      [9, 'hello', [], {'a': 'a', 'b': 'b', 'c': 'xy'}, 'str', 5],
      [9, 'hello', [1, 2, 3], {'a': 'a', 'b': 'b', 'c': 'xy'}, 'str', 5],
      [9, 'hello', [1, 'a', True], {'a': 'a', 'x': 'x', 'y': 'y'}, 'str', 5],
      [9, 'hello', [1, 'a', True], {}, 'str', 5],
      [9, 'hello', [1, 'a', True], {'a': 'a', 'b': 'b', 'x': 'x'}, None, 5],
      [9, 'hello', [1, 'a', True], {'a': 'a', 'b': 'b', 'x': 'x'}, 42, 15],
  ))
  def test_benchmark_bad_values(benchmark, value):
E       fixture 'benchmark' not found
>       available fixtures: _session_faker, aiohttp_client, aiohttp_raw_server, aiohttp_server, aiohttp_unused_port, asserter, betamax_parametrized_recorder, betamax_parametrized_session, betamax_recorder, betamax_session, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, class_based_httpbin, class_based_httpbin_secure, class_mocker, cov, doctest_namespace, event_loop, faker, faker_seed, fast, freezer, fs, httpbin, httpbin_both, httpbin_ca_bundle, httpbin_secure, loop, loop_debug, mocker, module_mocker, monkeypatch, no_cover, package_mocker, patching, proactor_loop, pytestconfig, raw_test_server, record_property, record_testsuite_property, record_xml_attribute, recwarn, requests_mock, session_mocker, smart_caplog, stdouts, test_client, test_server, testrun_uid, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, tmpworkdir, unused_port, unused_tcp_port, unused_tcp_port_factory, virtualenv, weave, worker_id, workspace, xprocess
>       use 'pytest --fixtures [testpath]' for help on them.

/home/tkloczko/rpmbuild/BUILD/python-fastjsonschema-2.15.1/tests/benchmarks/test_benchmark.py:63
___________________________________________________________ ERROR at setup of test_benchmark_bad_values[value1] ____________________________________________________________
file /home/tkloczko/rpmbuild/BUILD/python-fastjsonschema-2.15.1/tests/benchmarks/test_benchmark.py, line 63
  @pytest.mark.benchmark(min_rounds=20)
  @pytest.mark.parametrize('value', (
      [10, 'world', [1, 'a', True], {'a': 'a', 'b': 'b', 'c': 'xy'}, 'str', 5],
      [9, 'xxx', [1, 'a', True], {'a': 'a', 'b': 'b', 'c': 'xy'}, 'str', 5],
      [9, 'hello', [], {'a': 'a', 'b': 'b', 'c': 'xy'}, 'str', 5],
      [9, 'hello', [1, 2, 3], {'a': 'a', 'b': 'b', 'c': 'xy'}, 'str', 5],
      [9, 'hello', [1, 'a', True], {'a': 'a', 'x': 'x', 'y': 'y'}, 'str', 5],
      [9, 'hello', [1, 'a', True], {}, 'str', 5],
      [9, 'hello', [1, 'a', True], {'a': 'a', 'b': 'b', 'x': 'x'}, None, 5],
      [9, 'hello', [1, 'a', True], {'a': 'a', 'b': 'b', 'x': 'x'}, 42, 15],
  ))
  def test_benchmark_bad_values(benchmark, value):
E       fixture 'benchmark' not found
>       available fixtures: _session_faker, aiohttp_client, aiohttp_raw_server, aiohttp_server, aiohttp_unused_port, asserter, betamax_parametrized_recorder, betamax_parametrized_session, betamax_recorder, betamax_session, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, class_based_httpbin, class_based_httpbin_secure, class_mocker, cov, doctest_namespace, event_loop, faker, faker_seed, fast, freezer, fs, httpbin, httpbin_both, httpbin_ca_bundle, httpbin_secure, loop, loop_debug, mocker, module_mocker, monkeypatch, no_cover, package_mocker, patching, proactor_loop, pytestconfig, raw_test_server, record_property, record_testsuite_property, record_xml_attribute, recwarn, requests_mock, session_mocker, smart_caplog, stdouts, test_client, test_server, testrun_uid, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, tmpworkdir, unused_port, unused_tcp_port, unused_tcp_port_factory, virtualenv, weave, worker_id, workspace, xprocess
>       use 'pytest --fixtures [testpath]' for help on them.

/home/tkloczko/rpmbuild/BUILD/python-fastjsonschema-2.15.1/tests/benchmarks/test_benchmark.py:63
___________________________________________________________ ERROR at setup of test_benchmark_bad_values[value7] ____________________________________________________________
file /home/tkloczko/rpmbuild/BUILD/python-fastjsonschema-2.15.1/tests/benchmarks/test_benchmark.py, line 63
  @pytest.mark.benchmark(min_rounds=20)
  @pytest.mark.parametrize('value', (
      [10, 'world', [1, 'a', True], {'a': 'a', 'b': 'b', 'c': 'xy'}, 'str', 5],
      [9, 'xxx', [1, 'a', True], {'a': 'a', 'b': 'b', 'c': 'xy'}, 'str', 5],
      [9, 'hello', [], {'a': 'a', 'b': 'b', 'c': 'xy'}, 'str', 5],
      [9, 'hello', [1, 2, 3], {'a': 'a', 'b': 'b', 'c': 'xy'}, 'str', 5],
      [9, 'hello', [1, 'a', True], {'a': 'a', 'x': 'x', 'y': 'y'}, 'str', 5],
      [9, 'hello', [1, 'a', True], {}, 'str', 5],
      [9, 'hello', [1, 'a', True], {'a': 'a', 'b': 'b', 'x': 'x'}, None, 5],
      [9, 'hello', [1, 'a', True], {'a': 'a', 'b': 'b', 'x': 'x'}, 42, 15],
  ))
  def test_benchmark_bad_values(benchmark, value):
E       fixture 'benchmark' not found
>       available fixtures: _session_faker, aiohttp_client, aiohttp_raw_server, aiohttp_server, aiohttp_unused_port, asserter, betamax_parametrized_recorder, betamax_parametrized_session, betamax_recorder, betamax_session, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, class_based_httpbin, class_based_httpbin_secure, class_mocker, cov, doctest_namespace, event_loop, faker, faker_seed, fast, freezer, fs, httpbin, httpbin_both, httpbin_ca_bundle, httpbin_secure, loop, loop_debug, mocker, module_mocker, monkeypatch, no_cover, package_mocker, patching, proactor_loop, pytestconfig, raw_test_server, record_property, record_testsuite_property, record_xml_attribute, recwarn, requests_mock, session_mocker, smart_caplog, stdouts, test_client, test_server, testrun_uid, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, tmpworkdir, unused_port, unused_tcp_port, unused_tcp_port_factory, virtualenv, weave, worker_id, workspace, xprocess
>       use 'pytest --fixtures [testpath]' for help on them.

/home/tkloczko/rpmbuild/BUILD/python-fastjsonschema-2.15.1/tests/benchmarks/test_benchmark.py:63
___________________________________________________________ ERROR at setup of test_benchmark_bad_values[value0] ____________________________________________________________
file /home/tkloczko/rpmbuild/BUILD/python-fastjsonschema-2.15.1/tests/benchmarks/test_benchmark.py, line 63
  @pytest.mark.benchmark(min_rounds=20)
  @pytest.mark.parametrize('value', (
      [10, 'world', [1, 'a', True], {'a': 'a', 'b': 'b', 'c': 'xy'}, 'str', 5],
      [9, 'xxx', [1, 'a', True], {'a': 'a', 'b': 'b', 'c': 'xy'}, 'str', 5],
      [9, 'hello', [], {'a': 'a', 'b': 'b', 'c': 'xy'}, 'str', 5],
      [9, 'hello', [1, 2, 3], {'a': 'a', 'b': 'b', 'c': 'xy'}, 'str', 5],
      [9, 'hello', [1, 'a', True], {'a': 'a', 'x': 'x', 'y': 'y'}, 'str', 5],
      [9, 'hello', [1, 'a', True], {}, 'str', 5],
      [9, 'hello', [1, 'a', True], {'a': 'a', 'b': 'b', 'x': 'x'}, None, 5],
      [9, 'hello', [1, 'a', True], {'a': 'a', 'b': 'b', 'x': 'x'}, 42, 15],
  ))
  def test_benchmark_bad_values(benchmark, value):
E       fixture 'benchmark' not found
>       available fixtures: _session_faker, aiohttp_client, aiohttp_raw_server, aiohttp_server, aiohttp_unused_port, asserter, betamax_parametrized_recorder, betamax_parametrized_session, betamax_recorder, betamax_session, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, class_based_httpbin, class_based_httpbin_secure, class_mocker, cov, doctest_namespace, event_loop, faker, faker_seed, fast, freezer, fs, httpbin, httpbin_both, httpbin_ca_bundle, httpbin_secure, loop, loop_debug, mocker, module_mocker, monkeypatch, no_cover, package_mocker, patching, proactor_loop, pytestconfig, raw_test_server, record_property, record_testsuite_property, record_xml_attribute, recwarn, requests_mock, session_mocker, smart_caplog, stdouts, test_client, test_server, testrun_uid, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, tmpworkdir, unused_port, unused_tcp_port, unused_tcp_port_factory, virtualenv, weave, worker_id, workspace, xprocess
>       use 'pytest --fixtures [testpath]' for help on them.

/home/tkloczko/rpmbuild/BUILD/python-fastjsonschema-2.15.1/tests/benchmarks/test_benchmark.py:63
___________________________________________________________ ERROR at setup of test_benchmark_bad_values[value5] ____________________________________________________________
file /home/tkloczko/rpmbuild/BUILD/python-fastjsonschema-2.15.1/tests/benchmarks/test_benchmark.py, line 63
  @pytest.mark.benchmark(min_rounds=20)
  @pytest.mark.parametrize('value', (
      [10, 'world', [1, 'a', True], {'a': 'a', 'b': 'b', 'c': 'xy'}, 'str', 5],
      [9, 'xxx', [1, 'a', True], {'a': 'a', 'b': 'b', 'c': 'xy'}, 'str', 5],
      [9, 'hello', [], {'a': 'a', 'b': 'b', 'c': 'xy'}, 'str', 5],
      [9, 'hello', [1, 2, 3], {'a': 'a', 'b': 'b', 'c': 'xy'}, 'str', 5],
      [9, 'hello', [1, 'a', True], {'a': 'a', 'x': 'x', 'y': 'y'}, 'str', 5],
      [9, 'hello', [1, 'a', True], {}, 'str', 5],
      [9, 'hello', [1, 'a', True], {'a': 'a', 'b': 'b', 'x': 'x'}, None, 5],
      [9, 'hello', [1, 'a', True], {'a': 'a', 'b': 'b', 'x': 'x'}, 42, 15],
  ))
  def test_benchmark_bad_values(benchmark, value):
E       fixture 'benchmark' not found
>       available fixtures: _session_faker, aiohttp_client, aiohttp_raw_server, aiohttp_server, aiohttp_unused_port, asserter, betamax_parametrized_recorder, betamax_parametrized_session, betamax_recorder, betamax_session, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, class_based_httpbin, class_based_httpbin_secure, class_mocker, cov, doctest_namespace, event_loop, faker, faker_seed, fast, freezer, fs, httpbin, httpbin_both, httpbin_ca_bundle, httpbin_secure, loop, loop_debug, mocker, module_mocker, monkeypatch, no_cover, package_mocker, patching, proactor_loop, pytestconfig, raw_test_server, record_property, record_testsuite_property, record_xml_attribute, recwarn, requests_mock, session_mocker, smart_caplog, stdouts, test_client, test_server, testrun_uid, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory, tmpworkdir, unused_port, unused_tcp_port, unused_tcp_port_factory, virtualenv, weave, worker_id, workspace, xprocess
>       use 'pytest --fixtures [testpath]' for help on them.

/home/tkloczko/rpmbuild/BUILD/python-fastjsonschema-2.15.1/tests/benchmarks/test_benchmark.py:63
============================================================================= warnings summary =============================================================================
tests/test_compile_to_code.py:7
  /home/tkloczko/rpmbuild/BUILD/python-fastjsonschema-2.15.1/tests/test_compile_to_code.py:7: PytestDeprecationWarning: @pytest.yield_fixture is deprecated.
  Use @pytest.fixture instead; they are the same.
    @pytest.yield_fixture(autouse=True)

tests/benchmarks/test_benchmark.py:50
  /home/tkloczko/rpmbuild/BUILD/python-fastjsonschema-2.15.1/tests/benchmarks/test_benchmark.py:50: PytestUnknownMarkWarning: Unknown pytest.mark.benchmark - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/mark.html
    @pytest.mark.benchmark(min_rounds=20)

tests/benchmarks/test_benchmark.py:63
  /home/tkloczko/rpmbuild/BUILD/python-fastjsonschema-2.15.1/tests/benchmarks/test_benchmark.py:63: PytestUnknownMarkWarning: Unknown pytest.mark.benchmark - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/mark.html
    @pytest.mark.benchmark(min_rounds=20)

-- Docs: https://docs.pytest.org/en/stable/warnings.html
========================================================================= short test summary info ==========================================================================
SKIPPED [3] tests/json_schema/utils.py: got empty parameter set ['schema_version', 'schema', 'data', 'is_valid'], function template_test at /home/tkloczko/rpmbuild/BUILD/python-fastjsonschema-2.15.1/tests/json_schema/utils.py:62
ERROR tests/benchmarks/test_benchmark.py::test_benchmark_ok_values[value1]
ERROR tests/benchmarks/test_benchmark.py::test_benchmark_bad_values[value3]
ERROR tests/benchmarks/test_benchmark.py::test_benchmark_ok_values[value2]
ERROR tests/benchmarks/test_benchmark.py::test_benchmark_bad_values[value2]
ERROR tests/benchmarks/test_benchmark.py::test_benchmark_ok_values[value3]
ERROR tests/benchmarks/test_benchmark.py::test_benchmark_ok_values[value0]
ERROR tests/benchmarks/test_benchmark.py::test_benchmark_bad_values[value4]
ERROR tests/benchmarks/test_benchmark.py::test_benchmark_bad_values[value6]
ERROR tests/benchmarks/test_benchmark.py::test_benchmark_bad_values[value1]
ERROR tests/benchmarks/test_benchmark.py::test_benchmark_bad_values[value7]
ERROR tests/benchmarks/test_benchmark.py::test_benchmark_bad_values[value0]
ERROR tests/benchmarks/test_benchmark.py::test_benchmark_bad_values[value5]
========================================================== 381 passed, 3 skipped, 3 warnings, 12 errors in 8.03s ===========================================================
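The `fixture 'benchmark' not found` errors above indicate that the pytest-benchmark plugin, which provides the `benchmark` fixture used by `tests/benchmarks/test_benchmark.py`, is not installed in the build environment. A minimal sketch for auditing which test-only plugin modules are importable in the current environment (the helper name `missing_plugins` is hypothetical, not part of fastjsonschema or pytest):

```python
import importlib.util

def missing_plugins(required):
    """Return the subset of top-level module names that cannot be imported."""
    return [name for name in required if importlib.util.find_spec(name) is None]

# Check the plugin module behind the 'benchmark' fixture.
print(missing_plugins(["pytest_benchmark"]))
```

If `pytest_benchmark` shows up as missing, either add the plugin to the build requirements or exclude `tests/benchmarks` from the test run (e.g. `pytest --ignore tests/benchmarks`).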

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 9 (4 by maintainers)

Top GitHub Comments

1 reaction
hh-h commented, Jun 11, 2021

That’s not related to the tests; those are just warnings and should be fixed in the code.
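
The `PytestUnknownMarkWarning` entries in the summary appear because the `benchmark` mark is normally registered by the pytest-benchmark plugin, which is absent here. If the benchmark tests are meant to be collectable without the plugin, the mark can be registered manually; a sketch of a `pytest.ini` fragment (this fragment is an illustration, not taken from the fastjsonschema repository):

```
[pytest]
markers =
    benchmark: benchmark tests (require the pytest-benchmark plugin)
```

Registering the mark silences the warning but does not provide the `benchmark` fixture, so the setup errors above would remain until the plugin is installed or the benchmark tests are deselected.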

0 reactions
kloczek commented, Jun 15, 2021

When building rpm packages, the build environment is prepared based on the BuildRequires entries in the rpm spec file.
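
Given that workflow, the two usual fixes are to declare the missing plugin as a build dependency or to skip the benchmark tests during `%check`. A hedged spec-file sketch (the dependency name `python3dist(pytest-benchmark)` and the `%pytest` macro are Fedora conventions and may differ on other distributions):

```
# Option 1: pull in the plugin so the 'benchmark' fixture exists.
BuildRequires:  python3dist(pytest-benchmark)

# Option 2: do not run benchmarks at package build time.
%check
%pytest --ignore tests/benchmarks
```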


