
Support container-based (e.g. Docker-based) local development workflows


I touched on this topic in another issue a while ago, but I don’t think I articulated it very well, so I’ll try to lay out our actual problem here:

We do local development on our projects (e.g. www.ubuntu.com) effectively like this (Django example, illustrative only):

docker run \
    --volume `pwd`:/srv  \
    --workdir /srv  \
    --publish 8000:8000  \
    python:3  \
    bash -c "pip install -r requirements.txt && ./manage.py runserver"

(In practice, we have a bash script at ./run in each project, which spins up the appropriate Docker containers for you, with some extra settings)
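For illustration, a minimal `./run` wrapper along the lines described above might look like the sketch below. This is not the actual script from our projects; the function name and the `--dry-run` flag are hypothetical, added so the invocation can be inspected without Docker installed:

```shell
#!/usr/bin/env bash
# Hypothetical ./run wrapper: assembles the project's dev-container
# invocation. With --dry-run it prints the docker command instead of
# executing it, so the script can be exercised without Docker.
set -euo pipefail

run_dev() {
  local port="${PORT:-8000}"
  local image="python:3"
  local cmd='pip install -r requirements.txt && ./manage.py runserver 0.0.0.0:8000'
  local args=(run --rm
    --volume "$PWD:/srv"
    --workdir /srv
    --publish "${port}:8000"
    "$image"
    bash -c "$cmd")

  if [ "${1:-}" = "--dry-run" ]; then
    echo "docker ${args[*]}"
  else
    docker "${args[@]}"
  fi
}

run_dev --dry-run
```

The point of wrapping the `docker run` call like this is that developers never need to know the flags: `./run` is the entire interface to the project.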

The point is that, rather than using filesystem and $PATH tricks to mock up an encapsulated environment for our Python dependencies (as Python virtual environments do), we use containers (in this case, Docker containers). Python virtualenvs can both have unexpected effects on the wider system (nothing stops Python packages from writing to, or changing permissions in, your home directory, and many do) and break because of quirks in that wider system (such as specific C libraries not being installed or set up correctly). Containers offer significantly more reliable encapsulation.

This gives us confidence that each project will run the same for every one of our devs, on both Linux and macOS, with very little knowledge of the project or help from other devs. The single dependency our devs need installed is Docker. We’ve been doing it this way for about 4 years and it works very well.

I love what Pipenv is doing: the new Pipfile format is nice, but the lockfiles are the killer feature. The problem is that, as far as I understand the current implementation, Pipenv can’t be used for development without also using the virtual environments themselves, because it generates the lockfile hashes from within the virtual environment.

The problem is that with our use of containers for encapsulation, we run everything as root inside the container - installing everything at the system level. This prevents Pipenv being able to generate a lockfile, so we can’t use Pipenv with this workflow.

(We could, of course, configure our containers to have a whole encapsulated user environment and still use Python virtualenvs inside the container, but it’s a whole load of extra weight and complexity to our containers that we don’t need.)

Is there any way to use Pipenv with our workflow? Or might Pipenv move towards supporting it, e.g. by decoupling Pipfile and Pipfile.lock handling from virtual environment management?
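For what it’s worth, Pipenv does ship `--system` and `--deploy` flags on `pipenv install`, which install from Pipfile.lock straight into the system site-packages and fail if the lockfile is out of date. A sketch of how that could fit a container workflow (the Dockerfile below is illustrative, not from our projects; it doesn’t solve lockfile *generation*, only installation):

```dockerfile
FROM python:3

WORKDIR /srv
RUN pip install pipenv

# Copy only the dependency manifests first, so Docker layer caching
# avoids re-resolving dependencies on every source change.
COPY Pipfile Pipfile.lock ./

# --system: install into the container's system Python, no virtualenv
# --deploy: fail if Pipfile.lock is out of date, rather than regenerating it
RUN pipenv install --system --deploy

COPY . .
CMD ["./manage.py", "runserver", "0.0.0.0:8000"]
```

This covers reproducing an environment from an existing lockfile inside a container; generating or updating the lockfile without a virtualenv is the part the issue asks about.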

Issue Analytics

  • State: closed
  • Created: 5 years ago
  • Reactions: 4
  • Comments: 14 (8 by maintainers)

Top GitHub Comments

7 reactions
nottrobin commented, Dec 6, 2018

@frostming that workflow definitely doesn’t work for me, as the whole point of the container-based encapsulation is that the container can provide the exact required version of Python, Pip and Pipenv for all developers.

Docker container is more considered as a deployment machine than a development machine.

“Considered” by whom? Docker is used for local development by tons of people. In fact, Docker was used exclusively for local development for about the first 2 years of its life while people were extremely sceptical about using it for production.

Container-based encapsulation for local development is simply vastly superior to Python virtual environments. If there’s a different container tool than Docker that you think is “considered as a development machine”, then let me know. But even if we used lxd we’d still want to install dependencies at the system level when we’re inside a container.

Not to mention that there are massive benefits to using the same tooling in development as in production (“dev-prod parity”). If you’re using Docker for production (e.g. Kubernetes), it’s pretty convenient to use it in local development, get the increased encapsulation benefits, and increase the chance of uncovering production issues.
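To make the parity argument concrete, the same container definition can drive both local development and a production-like setup. A hypothetical Compose file (service names and layout are illustrative only):

```yaml
# Hypothetical docker-compose.yml: one service definition used for
# local development; production would run the same image without the
# bind mount and with a real WSGI server instead of runserver.
services:
  web:
    image: python:3
    working_dir: /srv
    volumes:
      - .:/srv          # dev only: live-edit the source on the host
    ports:
      - "8000:8000"
    command: bash -c "pip install -r requirements.txt && ./manage.py runserver 0.0.0.0:8000"
```

Because dev and production share the image and its system-level dependencies, a missing C library or wrong Python version shows up on a developer’s laptop, not after deployment.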

5 reactions
nottrobin commented, Dec 7, 2018

I’m not really sure what benefits you get by avoiding virtual environments in docker

The benefit is that the whole system is significantly simpler. The container’s sole job is to encapsulate and run dependencies for that specific project. That is much easier to both understand and work on if everything inside the container is system-level. Otherwise, you have your own user, running a docker container, which is in turn running a different user account, which is then creating an encapsulated virtual python environment.

which is definitely going to be … significantly more unpredictable than the alternatives

I suspect this perspective is partly a result of Stockholm syndrome. If you’ve only ever worked with virtual environments, you assume they’re the simplest thing. In fact, I would argue they are empirically more complex than not using virtual environments: the number of steps you have to understand to properly comprehend what’s going on inside a virtual environment is clearly greater.

All this is kind of beside the point. I don’t need to justify our workflow, it is the one we use at pretty significant scale, it works very well, and it is a pattern practised throughout the industry. We’re certainly not going to change it because you don’t happen to think it’s a good one.

The only important question is, will the Pipenv maintainers try to help me with using it within our workflow, or will they double-down on their reputation of being dismissive of workflows that aren’t the ones practiced by the core devs? I admit, I was sceptical when I filed this issue.

