Install without resolution on packages with pre-locked requirements
What’s the problem this feature will solve?
We are using a “lockfile-based” package manager (PDM, Poetry, etc.) to generate wheels whose Requires-Dist entries in the package METADATA already contain the pre-locked, pinned dependencies. These wheels are not libraries to be distributed, but applications that describe the exact requirements they need in order to run successfully.
As the dependency resolution already happened at development time, it would be nice to avoid resolving again at installation time. Besides making the installation faster, this would also naturally ignore conflicts that were explicitly “solved”/overridden by the developer.
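For illustration, the metadata of such an application wheel would contain something along these lines (the package name, pins and file name below are made up, and the exact Requires-Dist formatting depends on the build backend):

```
$ unzip -p myapp-0.0.1-py3-none-any.whl '*/METADATA' | grep '^Requires-Dist:'
Requires-Dist: requests==2.28.1
Requires-Dist: urllib3==1.26.12
Requires-Dist: certifi==2022.9.24
```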
Describe the solution you’d like
Pip could have a flag that turns off the resolution process and lets the user install pre-resolved dependencies. Something like `pip install --resolved/--no-resolve package-0.0.1-py3-none-any.whl`.
Alternative Solutions
I could not find a workaround for this using pip or other tools. I initially thought the `--no-deps` flag could solve it, but that flag skips the direct dependencies as well, which is not what we want.
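For reference, the kind of manual plumbing this currently implies is sketched below. It is only an illustration, assuming the wheel’s Requires-Dist entries are plain pins without extras or environment markers; the file names are hypothetical.

```
# Extract the pinned requirements from the wheel's METADATA ...
unzip -p myapp-0.0.1-py3-none-any.whl '*/METADATA' \
  | sed -n 's/^Requires-Dist: //p' > pinned-requirements.txt

# ... then install the wheel together with its pins, with dependency
# resolution effectively bypassed (--no-deps installs only what is listed).
pip install --no-deps myapp-0.0.1-py3-none-any.whl -r pinned-requirements.txt
```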
Additional context
The idea of generating wheels with the whole resolved dependency tree has been previously discussed in different contexts:
- https://github.com/python-poetry/poetry/issues/1307
- https://github.com/python-poetry/poetry/issues/2778
- https://github.com/pdm-project/pdm/issues/1437
Code of Conduct
- I agree to follow the PSF Code of Conduct.
@lecardozo - a comment from my experience (and Apache Airflow’s).

I perfectly understand your need - we started thinking a long time ago about how to solve this in Apache Airflow without taking an opinionated approach and without changing the `pip` maintainers’ position on it. What we came up with is something that does not require `pip` to change and actually even acknowledges `pip`’s role as a low-level component (we used the `constraint` feature of `pip` as the way to build a complete solution).

While it is not “self-contained” in the wheel file, `pip` - via the constraint mechanism - allows you to specify a constraints file that users can (optionally) use during installation. As an application maintainer you can - if you really want - prepare such a constraint file for each version of your package and inform your users about it. This way you can have a “golden” (aka “blessed”) set of dependencies covering the whole dependency tree, stored in the form of a constraint file. See https://airflow.apache.org/docs/apache-airflow/stable/installation/installing-from-pypi.html for the instructions we give our users. Example from that doc:
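(The example there has roughly this shape; the Airflow and Python versions below are placeholders, and the exact values on that page depend on the release and Python version being installed.)

```
pip install "apache-airflow==2.5.0" \
  --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.5.0/constraints-3.8.txt"
```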
The set of dependencies in the constraint file is not embedded in the .whl metadata - it is external, and we chose to host the constraint files in orphan branches of our repository (one branch for each minor version of Airflow). We fully automated the preparation and tagging of those constraint files in our CI, so whenever we release a new version it automatically carries the latest “valid” versions of the dependencies (we automatically upgrade and test all dependencies using the eager-upgrade feature of `pip` whenever all tests pass); a rough sketch of that step is shown after the list below.

This approach has a number of advantages:
- It does not require the user to run any “extra” Python code. While your solution is simple, it requires extra Python code to install, extract the metadata and run pip underneath. Ours taps into the existing ‘constraint’ feature and allows the constraint files to be hosted remotely - no “extra” code is executed, just a proper pip command that can, for example, be generated in CI.
- We can all but guarantee that older versions of the application remain installable, without being afraid that a 3rd-party dependency release breaks them (which happened multiple times before we introduced this).
- The fact that it is not part of the .whl file is actually pretty useful. We can remotely change the set of “golden” versions in case of breaking changes in `pip` or `setuptools`. For example, one of our dependencies (flask-openid) stopped being installable with newer setuptools (because it used the dropped, deprecated 2to3 flag) - see details in https://github.com/apache/airflow/issues/18075#issuecomment-915143441. This way, as a maintainer, you have a way to help users who follow the recommended installation process - after a fix they can continue installing the released application version even in the case of such catastrophically breaking events.
- The user is not limited to those versions of the dependencies. They are the “golden” set (i.e. guaranteed to have passed all tests), but if the user wants to change a dependency and it is not limited by “install_requires”, they can upgrade or downgrade it individually after installation as they see fit.
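As mentioned above, here is a rough sketch of how such a “golden” constraint file can be produced in CI - under the assumption that the full test suite gates the result, and with a placeholder package path:

```
# Install the application, eagerly upgrading every dependency in the tree to
# the latest version that still satisfies the declared requirements.
pip install --upgrade --upgrade-strategy eager ./myapp

# Run the test suite here; only publish the constraint file if it passes.

# Snapshot the fully resolved environment into a constraint file that can be
# committed/tagged alongside the release (a real pipeline would also filter
# out the application itself from the freeze output).
pip freeze > constraints.txt
```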
You can watch my presentation explaining why and how we’ve done that: https://www.youtube.com/watch?v=_SjMdQLP30s&t=2549s - there are a lot more details and a lot more context in it.

Of course, this does not help with making such an approach “popular” and “reusable”.

This is a rather complex solution, specific to Airflow (we have more than 650 dependencies in our dependency tree, and Airflow is both an application and a library, so opinionated solutions like Poetry are not good enough for it). We had to educate our users a lot until this became “common” knowledge, but it gives you as a maintainer the possibility of an easy answer in case some dependency breaks your package: “please follow the only supported mechanism we have - use pip and constraints (see the link to the docs)”.
Maybe some day some of our experiences from Airflow can be useful when preparing a PEP that solves the “application/library” conundrum better and allows maintainers to define a “golden set” of dependencies for each package (and later such a PEP could be turned into a `pip` feature). I would be happy to be part of such an effort if there are more people willing to collaborate on it. I am quite sure that what we have in Airflow is not suitable as a “reusable” way - it’s far too complex - but I think some of our experiences could be reused, and if anything is to be done about it, it should start with a proper PEP proposal, discussion and approval.

It seems this is not something the `pypa` and `pip` maintainers currently worry about too much. From past experience, I believe the `pypa` approach and the `pip` maintainers are rather strongly opinionated, and it will IMHO require quite a lot of effort and energy to bring any proposal forward before they see the need for it, so for now I am mostly watching and commenting rather than actively proposing changes. While I know what it means to be persistent, I feel this needs at least a positive acknowledgment from the `pypa` group before anyone invests time in it. However, I think it is a real problem that needs some love in the future. I hope to help when the time is ripe, when I see that it is a good time for such proposals and that I would not have to waste enormous energy and time on something that has no support and no chance of succeeding.
Closing this out for now, since we seem to have reached agreement that… well… it’d be nice to have in pip, but it’s incompatible with the model that pip pursues as well as the position it has within the ecosystem today.