
Tests fail with clean and minimal installation


🐛 Bug description

I created a clean mamba environment as follows:

mamba create -n ignite-dev python=3.9 pip ipython
mamba activate ignite-dev
mamba install pytorch torchvision torchaudio cudatoolkit=11.3 -c pytorch
mamba install pytest
pip install -e .

When trying to run pytest, I get the following errors:

======================================================================= ERRORS ========================================================================
________________________________________ ERROR collecting tests/ignite/contrib/handlers/test_clearml_logger.py ________________________________________
ImportError while importing test module '/biggin/b195/lina3015/Documents/git/ignite/tests/ignite/contrib/handlers/test_clearml_logger.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../../../mambaforge/envs/ignite-dev/lib/python3.9/importlib/__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
tests/ignite/contrib/handlers/test_clearml_logger.py:6: in <module>
    import clearml
E   ModuleNotFoundError: No module named 'clearml'
______________________________________________ ERROR collecting tests/ignite/handlers/test_lr_finder.py _______________________________________________
ImportError while importing test module '/biggin/b195/lina3015/Documents/git/ignite/tests/ignite/handlers/test_lr_finder.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../../../mambaforge/envs/ignite-dev/lib/python3.9/importlib/__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
tests/ignite/handlers/test_lr_finder.py:6: in <module>
    import matplotlib
E   ModuleNotFoundError: No module named 'matplotlib'
_________________________________________________ ERROR collecting tests/ignite/metrics/test_dill.py __________________________________________________
ImportError while importing test module '/biggin/b195/lina3015/Documents/git/ignite/tests/ignite/metrics/test_dill.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../../../mambaforge/envs/ignite-dev/lib/python3.9/importlib/__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
tests/ignite/metrics/test_dill.py:1: in <module>
    import dill
E   ModuleNotFoundError: No module named 'dill'
_________________________________________________ ERROR collecting tests/ignite/metrics/test_psnr.py __________________________________________________
ImportError while importing test module '/biggin/b195/lina3015/Documents/git/ignite/tests/ignite/metrics/test_psnr.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../../../mambaforge/envs/ignite-dev/lib/python3.9/importlib/__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
tests/ignite/metrics/test_psnr.py:6: in <module>
    from skimage.metrics import peak_signal_noise_ratio as ski_psnr
E   ModuleNotFoundError: No module named 'skimage'
_________________________________________________ ERROR collecting tests/ignite/metrics/test_ssim.py __________________________________________________
ImportError while importing test module '/biggin/b195/lina3015/Documents/git/ignite/tests/ignite/metrics/test_ssim.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../../../mambaforge/envs/ignite-dev/lib/python3.9/importlib/__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
tests/ignite/metrics/test_ssim.py:6: in <module>
    from skimage.metrics import structural_similarity as ski_ssim
E   ModuleNotFoundError: No module named 'skimage'
________________________________________________ ERROR collecting tests/ignite/metrics/gan/test_fid.py ________________________________________________
ImportError while importing test module '/biggin/b195/lina3015/Documents/git/ignite/tests/ignite/metrics/gan/test_fid.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../../../mambaforge/envs/ignite-dev/lib/python3.9/importlib/__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
tests/ignite/metrics/gan/test_fid.py:6: in <module>
    import pytorch_fid.fid_score as pytorch_fid_score
E   ModuleNotFoundError: No module named 'pytorch_fid'
_______________________________________________ ERROR collecting tests/ignite/metrics/nlp/test_bleu.py ________________________________________________
ImportError while importing test module '/biggin/b195/lina3015/Documents/git/ignite/tests/ignite/metrics/nlp/test_bleu.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../../../mambaforge/envs/ignite-dev/lib/python3.9/importlib/__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
tests/ignite/metrics/nlp/test_bleu.py:7: in <module>
    from nltk.translate.bleu_score import corpus_bleu, sentence_bleu, SmoothingFunction
E   ModuleNotFoundError: No module named 'nltk'
_______________________________________________ ERROR collecting tests/ignite/metrics/nlp/test_rouge.py _______________________________________________
ImportError while importing test module '/biggin/b195/lina3015/Documents/git/ignite/tests/ignite/metrics/nlp/test_rouge.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../../../mambaforge/envs/ignite-dev/lib/python3.9/importlib/__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
tests/ignite/metrics/nlp/test_rouge.py:3: in <module>
    import nltk
E   ModuleNotFoundError: No module named 'nltk'

I realise the missing packages are listed in the requirements-dev.txt file, but I think it would be cleaner if tests requiring optional dependencies were skipped or xfailed.
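
For illustration, a minimal sketch of a per-test guard (the test name and body here are hypothetical; pytest.mark.skipif and importlib.util.find_spec are standard pytest/stdlib APIs):

import importlib.util

import pytest

# Hypothetical test that needs matplotlib. Guarding it with skipif
# avoids the collection-time ModuleNotFoundError; note the import
# moves inside the test body.
@pytest.mark.skipif(
    importlib.util.find_spec("matplotlib") is None,
    reason="matplotlib is not installed",
)
def test_lr_finder_plot():
    import matplotlib

    assert matplotlib.get_backend() is not None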

I can provide a PR if this is of interest.

Environment

  • PyTorch Version (e.g., 1.4): 1.11.0
  • Ignite Version (e.g., 0.3.0): 0.5.0
  • OS (e.g., Linux): macOS 11.6.5
  • How you installed Ignite (conda, pip, source): pip
  • Python version: 3.9

Issue Analytics

  • State: closed
  • Created: a year ago
  • Comments: 7 (5 by maintainers)

Top GitHub Comments

1 reaction
RMeli commented, Apr 4, 2022

Just a side note about our nightly releases: https://github.com/pytorch/ignite#nightly-releases on pypi / conda

I completely missed the nightly builds, sorry about that. 👀 I agree that with those, the use case I had in mind is essentially covered.

Yes, I’d say it is a bit of a niche view, but I agree that this could be a use case.

Good to know, thanks. I think this is more common in more niche scientific software, where there are often no nightly builds and the time between releases is long (6 months or more).

Thank you both for your comments and views!

0 reactions
vfdev-5 commented, Apr 4, 2022

(i.e. when they have been reviewed and incorporated in the code base but they are not yet available in a release on PyPI or elsewhere).

Just a side note about our nightly releases: https://github.com/pytorch/ignite#nightly-releases on pypi / conda

while they should also allow users to test their specific installation (with only the optional dependencies they choose to install). But maybe this is a niche view?

Yes, I’d say it is a bit of a niche view, but I agree that this could be a use case. So, it means that if we would like to support that, in all new tests we would have to do something like pytest.importorskip("tensorboard", reason="tensorboard is not installed") or clearml = pytest.importorskip("clearml").

Let me think more about that and we can decide. Previously I also thought that sometimes it was too heavy to install all those loggers and their deps…
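
Concretely, a sketch of the module-level variant (the test body is hypothetical; pytest.importorskip skips the entire module at collection time when the import fails, which is exactly what the failing modules above would need):

import pytest

# Skips this whole module during collection if clearml is missing,
# instead of raising ModuleNotFoundError.
clearml = pytest.importorskip("clearml")


def test_clearml_task_available():
    # Hypothetical smoke test; only reached when clearml imports.
    assert clearml.__name__ == "clearml"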
