Backend-dependent PyTorch versions
See the original GitHub issue.

- Poetry version: 1.2.0
- Python version: 3.10.6
- OS version and name: Arch Linux
- I have searched the issues of this repo and believe that this is not a duplicate.
- I have consulted the FAQ and blog for any relevant entries or release notes.
Issue
Hi!
I do deep learning and am currently trying to switch to Poetry for better dependency management 😄 I immediately encountered a problem when trying to install PyTorch:
- PyTorch versions are backend-dependent: the latest PyTorch release is built for, say, 20 different CUDA versions, and constraining everyone who clones the project to the same CUDA version makes no sense. As far as I can tell, Poetry cannot currently handle this (see the sketch after this list).
- PyTorch is not on PyPI; more on that below.
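For concreteness, pinning a single backend by pointing Poetry at one of the PyTorch wheel indexes would look roughly like the sketch below (the source name, index URL, and version number are only illustrative); it resolves, but it hard-codes one CUDA build for everyone who clones the project:

```toml
# pyproject.toml -- minimal sketch, assuming Poetry 1.2-style sources.
# The cu116 index URL and the torch version are illustrative only.
[[tool.poetry.source]]
name = "pytorch-cu116"
url = "https://download.pytorch.org/whl/cu116"
secondary = true  # lower priority than PyPI (Poetry 1.2 syntax; newer versions use `priority`)

[tool.poetry.dependencies]
python = "^3.10"
# Explicitly pinned to the CUDA 11.6 build served by the source above.
torch = { version = "1.12.1+cu116", source = "pytorch-cu116" }
```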
To work around the first problem (even if official support from Poetry would be appreciated), I went with the well-known light-the-torch, which automatically installs the right PyTorch build for the detected backend.
The problem with this is that `torch` is not added to `pyproject.toml` afterwards, so if I do a subsequent `poetry install XXX`, the `torch` package is very likely to be replaced by the generic `torch` from PyPI.
Also, specifying the PyTorch wheel URL in the `.toml` is not a solution either, since it again depends on the local backend. In a PyTorch project, the common factor is the PyTorch package version, not the backend it runs on; the backend variety only exists so that the project can run on many different configurations.
This means that Poetry is not currently compatible with PyTorch. I don't think treating PyTorch as a special case is a good idea: this release design simply exposes the need to build a package for each particular software stack, so it is really a general problem that should be solved, IMO.
I’ll be happy to discuss below how Poetry should adapt to this design!
Issue Analytics

- Created: a year ago
- Comments: 19 (10 by maintainers)
You can do this today with URL dependencies and markers (but, as markers do not include any facility to discriminate based on ML API, this doesn’t solve anything you couldn’t do already with the pytorch indexes).
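A sketch of what that could look like, assuming the wheel URLs and the marker split are purely illustrative (markers can select on OS, architecture, or Python version, but not on the installed CUDA version):

```toml
# pyproject.toml -- multiple-constraints sketch; the wheel URLs are illustrative.
[tool.poetry.dependencies]
torch = [
    # CUDA build for Linux machines (assumes everyone on Linux wants CUDA 11.6)
    { url = "https://download.pytorch.org/whl/cu116/torch-1.12.1%2Bcu116-cp310-cp310-linux_x86_64.whl", markers = "sys_platform == 'linux'" },
    # CPU-only build for macOS on Apple silicon
    { url = "https://download.pytorch.org/whl/cpu/torch-1.12.1-cp310-none-macosx_11_0_arm64.whl", markers = "sys_platform == 'darwin' and platform_machine == 'arm64'" },
]
```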
If the wheel wasn’t installed by Poetry, it may be missing a PEP 610 marker, aka `direct_url.json`. That is to say, Poetry will only consider it the same torch version and not reinstall it if that marker exists and matches the URL that Poetry was configured with.

This is getting fairly off topic and turning into more of a support discussion (and I think the original issue was more of a question than anything actionable anyway), so I’m migrating it to Discussions as such.