Pin nested Python dependencies
Description
Pinning all your dependencies has been good practice for a while. Increasingly, it’s also considered good practice to pin your nested (transitive) dependencies. This is something we currently don’t do.
Rationale
Recently, we had a couple of issues where upgrades of nested dependencies broke the generated projects.
Use case(s) / visualization(s)
It would be nice for cookiecutter-django to adopt this best practice, which would also make generated projects less likely to break. The three tools I know of that try to solve this problem are pip-tools, Pipenv, and Poetry.
Pipenv & Poetry are a bit new and imply a radically different workflow. They were ruled out in the past as they only provide two sets of dependencies (dev & prod: #1621, #1425). Moreover, I’m not sure how maintainable the files they produce would be in the template, with all the if/else branches we have.
So my personal favourite is pip-tools. The steps would be (a minimal sketch follows the list):
- Make our current requirements.txt files into requirements.in files
- Replace pinned versions with ranges wherever we care (e.g. Django)
- Generate the pinned requirements.txt
- Add the various if/else to the generated requirements.txt
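A minimal sketch of what that workflow could look like; the package names, ranges, and pinned versions below are illustrative, not the template’s actual dependencies:

```console
$ mv requirements.txt requirements.in
$ cat requirements.in            # after loosening the pins we care about
django>=2.2,<3.0                 # ranged where we care (illustrative)
django-environ                   # unpinned; pip-compile will pin it and its deps
$ pip-compile requirements.in    # resolves and writes a fully pinned requirements.txt
$ cat requirements.txt
django==2.2.13                   # via -r requirements.in (illustrative versions)
django-environ==0.4.5            # via -r requirements.in
pytz==2020.1                     # via django
sqlparse==0.3.1                  # via django
```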
Then, on Travis, when pyup sends us an update, we would need to check that recompiling the requirements.in files produces no changes, i.e. that the pinned requirements.txt is still compatible and in sync. This is how they do it on Warehouse (a.k.a. the new PyPI). Ideally, we would do that for all combinations of the template (#591 would help).
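A hedged sketch of such a CI check, assuming a simple recompile-and-diff approach rather than Warehouse’s actual script:

```console
$ pip-compile --quiet requirements.in    # regenerate the pins from the .in file
$ git diff --exit-code requirements.txt  # non-zero exit (build failure) if the pins drifted
```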
PS: Some more reading on this topic.
Top GitHub Comments
I don’t think this is true. Even with the new resolver in pip, this workflow still makes sense. Yes, in the pre-robust-pip-resolver world we’re in today, having this step lets you use better dependency resolution logic to handle conflicting dependencies, but that’s not the only benefit. IMO it’s not even the main one.
Maintaining a set of exact pins that you know an application works with protects deployments from being sensitive to new releases of a dependency. This makes for reproducible/robust deployments (you want that, right?). Folks have developed things like pyup / pip-tools / pipenv / poetry to make these workflows easier – because there’s enough of a benefit to doing this. 😃
Anyway, all this is to say: don’t skip this just because pip’s new resolver is better, because the resolver solves only one of the two issues. Notably, even with the new resolver, pip doesn’t provide all the functionality necessary for generating these pins out-of-the-box – that’s what pip-tools provides via pip-compile and pip-sync.
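For reference, a minimal sketch of the two pip-tools commands named above, assuming a requirements.in already exists:

```console
$ pip-compile requirements.in    # resolve and write exact pins (incl. nested deps) to requirements.txt
$ pip-sync requirements.txt      # install/uninstall until the active virtualenv matches the pins exactly
```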
Poetry and pipenv add their own formats for describing things, and handle more of the project’s development workflows than pip/pip-tools would. Some projects can go that route, but that’s more of a workflow decision, and I’m gonna cop out of that one. 😃
Sorry to bump an old thread, but I don’t think Poetry should be ruled out anymore, as it now supports extra dependencies. We could define production and local dependencies as extras and pin them that way? Additionally, Poetry now has a stable (>1.0) release.
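A hedged sketch of what that could look like in pyproject.toml, using Poetry’s optional-dependency/extras syntax; the package names and version constraints are illustrative:

```toml
[tool.poetry.dependencies]
python = "^3.8"
django = "^3.0"
# Optional dependencies, grouped into extras below (illustrative choices):
gunicorn = { version = "^20.0", optional = true }
django-debug-toolbar = { version = "^2.2", optional = true }

[tool.poetry.extras]
production = ["gunicorn"]
local = ["django-debug-toolbar"]
```

Installing would then be e.g. `poetry install --extras production`, with the exact pins recorded in poetry.lock.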