
torch version inference logic broken when torchvision is specified


If I start an experiment with the following requirements defined in the UI:

torch==1.3.1

The installation works well. But if I use the following requirements:

torch==1.3.1
torchvision==0.2.1

Then it fails, trying to install torch==0.2.1 after installing torch==1.3.1. Perhaps there is a bug in how the torchvision version is parsed?

Here is the full log of the error:

Requirement already up-to-date: pip==20.1 in /home/H4dr1en/.trains/venvs-builds/3.7/lib/python3.7/site-packages (20.1)
Collecting Cython
  Using cached Cython-0.29.17-cp37-cp37m-manylinux1_x86_64.whl (2.1 MB)
Installing collected packages: Cython
Successfully installed Cython-0.29.17
Collecting torch==1.3.1+cpu
  File was already downloaded /home/H4dr1en/.trains/pip-download-cache/cu0/torch-1.3.1+cpu-cp37-cp37m-linux_x86_64.whl
Successfully downloaded torch
Collecting torch==0.2.1
  ERROR: HTTP error 403 while getting http://download.pytorch.org/whl/cu0/torch-0.2.1-cp37-cp37m-linux_x86_64.whl
  ERROR: Could not install requirement torch==0.2.1 from http://download.pytorch.org/whl/cu0/torch-0.2.1-cp37-cp37m-linux_x86_64.whl because of error 403 Client Error: Forbidden for url: http://download.pytorch.org/whl/cu0/torch-0.2.1-cp37-cp37m-linux_x86_64.whl
ERROR: Could not install requirement torch==0.2.1 from http://download.pytorch.org/whl/cu0/torch-0.2.1-cp37-cp37m-linux_x86_64.whl because of HTTP error 403 Client Error: Forbidden for url: http://download.pytorch.org/whl/cu0/torch-0.2.1-cp37-cp37m-linux_x86_64.whl for URL http://download.pytorch.org/whl/cu0/torch-0.2.1-cp37-cp37m-linux_x86_64.whl
trains_agent: ERROR: Could not download wheel name of "http://download.pytorch.org/whl/cu0/torch-0.2.1-cp37-cp37m-linux_x86_64.whl"
ERROR: Double requirement given: torch==0.2.1 from http://download.pytorch.org/whl/cu0/torch-0.2.1-cp37-cp37m-linux_x86_64.whl (from -r /tmp/cached-reqsipcp8nfs.txt (line 2)) (already in torch==1.3.1+cpu from file:///home/H4dr1en/.trains/pip-download-cache/cu0/torch-1.3.1%2Bcpu-cp37-cp37m-linux_x86_64.whl (from -r /tmp/cached-reqsipcp8nfs.txt (line 1)), name='torch')
trains_agent: ERROR: Could not install task requirements!
Command '['/home/H4dr1en/.trains/venvs-builds/3.7/bin/python', '-m', 'pip', '--disable-pip-version-check', 'install', '-r', '/tmp/cached-reqsipcp8nfs.txt']' returned non-zero exit status 1.
DONE: Running task 'c63fc150ff5049c4939cd6a37f3d30a8', exit status 1

System: Linux Debian 9. CUDA: not installed (no GPU).
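
To illustrate the suspected inference bug, here is a minimal sketch (purely hypothetical, not the actual trains-agent code): if the torch version were inferred by a prefix match on the requirement name, the torchvision line would also match and its version would overwrite the torch pin:

# Hypothetical sketch only, not the trains-agent implementation.
# A prefix match on the package name also matches "torchvision",
# so its version ends up being used for torch.
requirements = ["torch==1.3.1", "torchvision==0.2.1"]

torch_version = None
for line in requirements:
    name, _, version = line.partition("==")
    if name.startswith("torch"):   # buggy: "torchvision".startswith("torch") is True
        torch_version = version

print(torch_version)  # prints "0.2.1", which would then be resolved as torch==0.2.1
# An exact comparison (name == "torch") would avoid the overwrite.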

Issue Analytics

  • State: open
  • Created: 3 years ago
  • Reactions: 1
  • Comments: 5 (5 by maintainers)

Top GitHub Comments

1 reaction
H4dr1en commented, May 6, 2020

Regarding using PyPI with torch, the problem is that this is unstable; for example, there is no way of knowing whether the torchvision on PyPI is the CPU or the GPU version… Also, for the GPU version, the CUDA version changes from one torch version to another, so you end up with a driver mismatch for no good reason.

Thank you for pointing that out, this definitely makes sense!

With all that said, if you know what the correct version is for your setup, you can simply replace torchvision==0.2.1 with a direct https link to the wheel:

Thanks for the workaround! I’ll close as soon as the error is more explicit 👍

EDIT: @H4dr1en, what trains-agent version are you using? What package manager is trains-agent using? (see example here) What is the pip version limit configured in trains.conf? (see example here)

trains-agent==0.14.2rc2, package manager = pip, pip version = 0.21

1 reaction
bmartinn commented, May 6, 2020

Yes, you are correct. I’ll make sure the error message is corrected in the next RC.

Regarding using PyPI with torch, the problem is that this is unstable; for example, there is no way of knowing whether the torchvision on PyPI is the CPU or the GPU version… Also, for the GPU version, the CUDA version changes from one torch version to another, so you end up with a driver mismatch for no good reason.
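
For context, one common way to make the CPU/CUDA choice explicit when installing manually is to pull from PyTorch's own wheel index and pin the local version tag, for example (assuming the CPU-only build is the right one for this setup; the set of +cuXX tags available varies by torch release):

# install the CPU-only build of torch 1.3.1 from the PyTorch wheel index
pip install torch==1.3.1+cpu -f https://download.pytorch.org/whl/torch_stable.html

The +cpu suffix pins the CPU build explicitly, so there is no ambiguity about which variant pip resolves.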

With all that said, if you know what the correct version is for your setup, you can simply replace torchvision==0.2.1 with a direct https link to the wheel: https://files.pythonhosted.org/packages/ca/0d/f00b2885711e08bd71242ebe7b96561e6f6d01fdb4b9dcf4d37e2e13c5e1/torchvision-0.2.1-py2.py3-none-any.whl This would work as long as it matches the CPU/CUDA version you are running.
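
For reference, the suggested workaround written out as the task's requirements might look roughly like the following (the wheel URL is the one quoted above; swap it for whichever torchvision build matches your CPU/CUDA setup):

torch==1.3.1
https://files.pythonhosted.org/packages/ca/0d/f00b2885711e08bd71242ebe7b96561e6f6d01fdb4b9dcf4d37e2e13c5e1/torchvision-0.2.1-py2.py3-none-any.whl

With a direct URL, pip installs torchvision from that exact wheel, so there is no torchvision==<version> pin for the agent's torch version inference to misread.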
