Pip Install Inside Docker


April 22, 2015 · Nicolas Girault · 2 min read


If you've ever developed a Python application, you've probably installed your Python dependencies in a virtualenv. A simple way to do so is:
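Something along these lines, for instance (a minimal sketch; the venv name and requirements.txt are just the usual conventions):

```bash
# Create an isolated environment and install the project's dependencies into it
virtualenv venv
. venv/bin/activate
pip install -r requirements.txt
```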

Thanks to virtualenv your project dependencies are now isolated from your other projects and from the operating system packages. Simple, isn't it?

Another way to locally isolate your project is to install your dependencies in a Docker container (actually, the best practice would be to use virtualenv in a Docker container, as described here: https://hynek.me/articles/virtualenv-lives).

In this use case, you'll want to store the Python packages required by your application in a mounted folder, so that you don't have to re-install them every time you reset your container in development. In other words, you'll want to install the Python dependencies into a specific folder.

The first obvious solution is pip's -t, --target <dir> option, which installs packages into <dir>.
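In a container that would look roughly like this (the /var/app/vendors path is illustrative):

```bash
# Install every dependency into the mounted folder instead of site-packages
pip install --target=/var/app/vendors -r requirements.txt
```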

However, this option turns out to be a trap. With --target, the installer changes its behaviour in a way that is undesirable for our purposes and becomes incompatible with the --upgrade option, as described here: https://github.com/pypa/pip/issues/1489.

A better solution, in line with PEP 370 (https://www.python.org/dev/peps/pep-0370/), is to use the PYTHONUSERBASE environment variable.

You then just need to use pip install --user, and your packages will be installed in a specific folder without any of the strange side effects of the --target option.
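Concretely, the idea is (again with an illustrative path):

```bash
# Per PEP 370, packages land in $PYTHONUSERBASE/lib/pythonX.Y/site-packages
# and scripts in $PYTHONUSERBASE/bin
export PYTHONUSERBASE=/var/app/vendors
pip install --user -r requirements.txt
```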

Here is the detailed step-by-step solution.

Your docker-compose file should look like this:
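A minimal sketch (the vendors and server service names match the commands below; the image, paths and application command are illustrative assumptions):

```yaml
services:
  # One-off service that installs the dependencies into the mounted folder
  vendors:
    image: python:3.4
    environment:
      PYTHONUSERBASE: /var/app/vendors
    volumes:
      - .:/var/app
    working_dir: /var/app
    command: pip install --user -r requirements.txt

  # The application itself, picking up packages from the same folder
  server:
    image: python:3.4
    environment:
      PYTHONUSERBASE: /var/app/vendors
    volumes:
      - .:/var/app
    working_dir: /var/app
    command: python app.py
```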

Install your vendors (run it a second time to check that the packages persist and nothing gets re-installed):

```bash
docker-compose run --rm vendors
```

Run your app:
```bash
docker-compose up -d server
```

Conclusion

The PYTHONUSERBASE environment variable is used to compute the path of the user site-packages directory. Use it together with pip's --user option to install Python packages into a custom directory.

Nicolas Girault

Web Developer at Theodo


When you're packaging your Python application in a Docker image, you'll often use a virtualenv. For example, you might be doing a multi-stage build in order to get smaller images.

Since you're using a virtualenv, you need to activate it. But if you're just getting started with Dockerfiles, the naive way doesn't work. And even if you do know how to do it, the usual method is repetitive and therefore error-prone.

There is a simpler way of activating a virtualenv, which I'll demonstrate in this article. But first, we'll go over some of the other, less elegant (or broken!) ways you might do it.

Note: Outside the very specific topic under discussion, the Dockerfiles in this article are not examples of best practices, since the added complexity would obscure the main point of the article.

To ensure you're following all the best practices you need for secure, correct, fast Dockerfiles, check out the Python on Docker Production Handbook.

The method that doesn’t work

If you just blindly convert a shell script into a Dockerfile, you will get something that looks right but is actually broken:
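A sketch of that naive translation (the base image, the /opt/venv location and myapp.py are illustrative):

```dockerfile
FROM python:3.9-slim

RUN python3 -m venv /opt/venv
# The shell running this RUN exits immediately, taking the activation with it:
RUN . /opt/venv/bin/activate

COPY requirements.txt .
# ...so this installs into the system Python, not the virtualenv:
RUN pip install -r requirements.txt

COPY myapp.py .
CMD ["python", "myapp.py"]
```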

It’s broken for two different reasons:

  1. Every RUN line in the Dockerfile is a different process. Running activate in a separate RUN has no effect on future RUN calls; for all practical purposes it's a no-op.
  2. When you run the resulting Docker image, it will run the CMD, which also isn't going to run inside the virtualenv, since it too is unaffected by the RUN processes.

The repetitive method that mostly works

One solution is to explicitly use the path to the binaries in the virtualenv. In this case we only have two repetitions, but in more complex situations you'll need to do it over and over again.

Besides the lack of readability, repetition is a source of error. As you add more calls to Python programs, it's easy to forget to add the magic /opt/venv/bin/ prefix.

It will (mostly) work though:
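Something like this, under the same assumptions as above:

```dockerfile
FROM python:3.9-slim

RUN python3 -m venv /opt/venv

COPY requirements.txt .
# Repetition #1: call the venv's pip by its full path
RUN /opt/venv/bin/pip install -r requirements.txt

COPY myapp.py .
# Repetition #2: call the venv's python by its full path
CMD ["/opt/venv/bin/python", "myapp.py"]
```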

The only caveat is that if any Python process launches a sub-process, that sub-process will not run in the virtualenv.


The repetitive method that totally works

You can fix that by actually activating the virtualenv separately for each RUN as well as the CMD:
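Roughly, with the same illustrative names as before:

```dockerfile
FROM python:3.9-slim

RUN python3 -m venv /opt/venv

COPY requirements.txt .
# Activate inside the same shell that runs pip
RUN . /opt/venv/bin/activate && pip install -r requirements.txt

COPY myapp.py .
# Shell-form CMD so we can activate first; exec hands PID 1 to python
CMD . /opt/venv/bin/activate && exec python myapp.py
```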

(The exec is there to get correct signal handling.)

The elegant method, in which we learn what activating actually does

It's easy to think of activate as some mysterious magic, a pentacle drawn in blood to keep Python safely trapped. But it's just software, and fairly simple software at that. The virtualenv documentation will even tell you that activate is "purely a convenience."

If you go and read the code for activate, it does a number of things:

  1. It figures out what shell you’re running.
  2. It adds a deactivate function to your shell, and messes around with pydoc.
  3. It changes the shell prompt to include the virtualenv name.
  4. It unsets the PYTHONHOME environment variable, if someone happened to set it.
  5. It sets two environment variables: VIRTUAL_ENV and PATH.

The first four are basically irrelevant to Docker usage, so that just leaves the last item. Most of the time VIRTUAL_ENV has no effect, but some tools (e.g. the poetry packaging tool) use it to detect whether you're running inside a virtualenv.

The most important part is setting PATH: PATH is a list of directories which are searched for commands to run. activate simply adds the virtualenv's bin/ directory to the start of the list.

We can replace activate by setting the appropriate environment variables ourselves: Docker's ENV command applies both to subsequent RUNs and to the CMD.

The result is the following Dockerfile:
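A sketch, sticking with the illustrative names from the earlier variants:

```dockerfile
FROM python:3.9-slim

# Some tools (e.g. poetry) check VIRTUAL_ENV to detect a virtualenv
ENV VIRTUAL_ENV=/opt/venv
RUN python3 -m venv $VIRTUAL_ENV
# The one thing activate does that matters here: put the venv's bin/ first on PATH
ENV PATH="$VIRTUAL_ENV/bin:$PATH"

COPY requirements.txt .
RUN pip install -r requirements.txt

COPY myapp.py .
CMD ["python", "myapp.py"]
```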

The virtualenv now automatically works for both RUN and CMD, without any repetition or need to remember anything.

Software isn’t magic


And there you have it: a version that is as simple as our original, broken version, but actually does the right thing. No repetition, and less scope for error.


When something seems needlessly complex, dig in and figure out how it works. The software you're using might be simpler (or more simplistic) than you think, and with a little work you might come up with a more elegant solution.