Poetry downloading same wheels multiple times within a single invocation
See original GitHub issue.

- I am on the latest Poetry version.
- I have searched the issues of this repo and believe that this is not a duplicate.
- If an exception occurs when executing a command, I executed it again in debug mode (-vvv option).
- OS version and name: macOS 10.14.6
- Poetry version: 1.0.5
- Link of a Gist with the contents of your pyproject.toml file: https://gist.github.com/bb/501f33ad3f35eb8c26ce2513ca6074c8
Issue
When adding a new dependency, it is downloaded multiple times; I observed three downloads, two of which are unnecessary.

Starting with a pyproject.toml as in the Gist linked above, I run:

poetry add https://github.com/oroszgy/spacy-hungarian-models/releases/download/hu_core_ud_lg-0.3.1/hu_core_ud_lg-0.3.1-py3-none-any.whl

Then I see the following output (XXX added as markers for the explanation below):
Updating dependencies XXX
Resolving dependencies... (276.1s)
Writing lock file
XXX
Package operations: 0 installs, 7 updates, 0 removals
- Updating certifi (2019.11.28 -> 2020.4.5.1)
- Updating urllib3 (1.25.8 -> 1.25.9)
- Updating asgiref (3.2.3 -> 3.2.7)
- Updating pytz (2019.3 -> 2020.1)
- Updating django (3.0.4 -> 3.0.6)
- Updating hu-core-ud-lg (0.3.1 -> 0.3.1 https://github.com/oroszgy/spacy-hungarian-models/releases/download/hu_core_ud_lg-0.3.1/hu_core_ud_lg-0.3.1-py3-none-any.whl)
XXX - Updating psycopg2-binary (2.8.4 -> 2.8.5)
At the positions marked XXX, the same 1.3 GB download is performed again and again.

Similarly, when adding another package later, XXX again marks the point at which the big download happens:
$ poetry add djangorestframework
Using version ^3.11.0 for djangorestframework
Updating dependencies
Resolving dependencies... (0.4s)
Writing lock file
XXX
Package operations: 1 install, 1 update, 0 removals
- Installing djangorestframework (3.11.0)
- Updating hu-core-ud-lg (0.3.1 -> 0.3.1 https://github.com/oroszgy/spacy-hungarian-models/releases/download/hu_core_ud_lg-0.3.1/hu_core_ud_lg-0.3.1-py3-none-any.whl)
XXX
I’d expect the file to be downloaded at most once and reused.
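The expected behaviour amounts to a download cache keyed by URL: fetch once, then reuse the file on every later resolution and install step. A minimal sketch of that idea follows; the function name, cache layout, and use of urllib are all illustrative assumptions, not Poetry's actual implementation:

```python
import hashlib
import os
import tempfile
import urllib.request


def cached_download(url: str, cache_dir: str) -> str:
    """Download ``url`` at most once, keyed by a hash of the URL.

    Hypothetical sketch of the behaviour the reporter expects from
    Poetry; names and layout are illustrative only.
    """
    os.makedirs(cache_dir, exist_ok=True)
    key = hashlib.sha256(url.encode("utf-8")).hexdigest()
    path = os.path.join(cache_dir, key + "-" + os.path.basename(url))
    if not os.path.exists(path):
        # Cache miss: fetch to a temp file, then atomically publish it,
        # so a partial download never masquerades as a cached wheel.
        tmp_fd, tmp_path = tempfile.mkstemp(dir=cache_dir)
        os.close(tmp_fd)
        urllib.request.urlretrieve(url, tmp_path)
        os.replace(tmp_path, path)
    # Cache hit (or freshly populated entry): no network transfer needed.
    return path
```

With such a cache, the 1.3 GB wheel would be transferred once during resolution and then served from disk for the install step.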
Issue Analytics

- State:
- Created: 3 years ago
- Reactions: 12
- Comments: 17 (7 by maintainers)
This is a serious problem with packages like PyTorch which are extremely large. Unless there’s a workaround for this I will definitely never use Poetry.
Suspect that code fragment uses a temporary directory for no particularly good reason.
poetry has a cache of downloaded files that it uses during installation, managed by the curiously named Chef class. I’d think that is the right thing to share with. A couple of problems, though:
I’d start with an MR that updates the chef so that get_cache_directory_for_link only cares about the URL that the link is downloaded from - that should be straightforward, and will get maintainer opinion on whether this is a sensible track. Then, if that’s accepted, follow up with some sort of rearrangement so that this cache can be shared by the chef and the solving code.