Install Anaconda In Docker Container


Anaconda can use Docker container environments to lint and complete your code. Some IDE utilities will not work, or will not offer their full features, when Docker environments are in use; for example, the Goto IDE command will not work if you try to go to a file that is located in the container (workarounds are provided anyway).

How to run anaconda’s minserver in a Docker container?

The Docker Explorer provides an interactive experience to examine and manage Docker assets such as containers, images, and so on. To see an example, navigate to the Docker Explorer, right-click your container in the Containers tab, and choose View Logs; the output is displayed in the terminal. You can also list running containers on the command line:

    $ docker ps
    CONTAINER ID   IMAGE                      COMMAND               CREATED                  STATUS         PORTS                    NAMES
    874108dfc9d9   jupyter/minimal-notebook   "tini - start-noteb"  Less than a second ago   Up 4 seconds   0.0.0.0:8900->8888/tcp   thirstyalmeida

Now that we have the container ID, we can install Python packages in the container.

There are many ways to make your anaconda connect to and use a minserver running in a Docker container. The way to use Docker with anaconda is to use docker run, docker exec or docker-compose manually to start your application environment, and then use a regular anaconda remote worker with the generic tcp://address:port configuration and whatever directory map you want or need (remember that directory maps are a common feature of all anaconda remote workers, so they are present in both the tcp:// and vagrant:// python_interpreter schemes).

We are going to present here different ways to connect your anaconda with Docker: some of them make use of docker run, others use docker exec in an already running container (which probably contains your code), and others don’t use the docker command directly but docker-compose with a docker-compose.yml file.

Run anaconda’s minserver in its own container

If you just need to use the Python interpreter installed in the container, you can simply run a new container that executes the anaconda minserver with the desired interpreter and set python_interpreter to point to your container over a tcp remote connection.

Run the container

For this example we are going to use the generic python:2.7 docker image, but it will work with any docker image that contains a valid Python installation. The command to run our container will look like this:
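A sketch of that command; the Anaconda volume path shown is the usual Sublime Text 3 Packages location on Linux and is only an assumption, so adjust both volume paths to your own system:

    docker run --rm -d \
        -v ~/.config/sublime-text-3/Packages/Anaconda:/opt/anaconda \
        -v ~/my_project:/my_project \
        python:2.7 \
        /opt/anaconda/anaconda_server/docker/start python 19360 docker_project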

The -d option makes docker run the container detached, in the background, and return its ID; the --rm option automatically removes the container as soon as it exits. With the -v option we pass the directory where Anaconda is installed inside Sublime Text 3 (for Linux in this example) as a volume to be mounted inside the container at /opt/anaconda; that way, the anaconda minserver code is available in the container to be executed, so we don’t need to make a new installation inside the container and we can be sure that it is always up to date with the latest release. We also pass the directory where our code resides, ~/my_project, as a volume to be mounted at /my_project using another -v parameter (you can mount as many volumes as you need by passing each one with a new -v parameter).

The last parameter is the command that we want to run, /opt/anaconda/anaconda_server/docker/start, with the parameters python, 19360 and docker_project. docker/start is a shell script wrapper that executes the anaconda minserver using its first argument as the python interpreter, its second as the port and its third as the project name; there is a fourth argument, which we didn’t use here, to specify extra_paths separated by commas.

Anaconda python_interpreter

With our container running, the only thing we have to do is tell anaconda that we want to use a remote tcp interpreter:
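A sketch of the setting; the pathmap-style directory map shown is illustrative, so check the anaconda remote worker documentation for the exact syntax your version expects:

    {
        "python_interpreter": "tcp://172.17.0.2:19360?pathmap=/home/user/my_project,/my_project"
    }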

Et voilà, our anaconda will use our container.

note: the address 172.17.0.2 is the address that docker automatically assigns to its first container on its default bridge network; using it, we don’t need to expose the minserver port. If we want to expose the port to the host, we should run the docker command as
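A sketch of the same run command, now mapping host port 9999 to the minserver port so that it matches the interpreter example below; the volume paths are the same assumptions as before:

    docker run --rm -d -p 9999:19360 \
        -v ~/.config/sublime-text-3/Packages/Anaconda:/opt/anaconda \
        -v ~/my_project:/my_project \
        python:2.7 \
        /opt/anaconda/anaconda_server/docker/start python 19360 docker_project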

Then we could use 'tcp://localhost:9999....' as our interpreter.

Run anaconda’s minserver in an already running container

In many situations we will already have a container executing our code, so we can use it to run the minserver with the docker exec command.
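A sketch of that invocation, assuming the anaconda code is already mounted at /opt/anaconda inside the running container; the port 19361 and the project name are illustrative:

    docker exec -d dd94c34814f5 /opt/anaconda/anaconda_server/docker/start python 19361 docker_project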

The -d option tells docker to run this command in detached mode, the second parameter, dd94c34814f5, is our container ID, and the third and last parameter is our command. Take into account that exec is not able to mount volumes, so the anaconda code must already be a volume in the container in which we are executing the command.

The python_interpreter

Exactly the same, but with a different port:
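A sketch, reusing the illustrative port from the exec example above and the same hypothetical directory map; replace 172.17.0.2 with the bridge address of your running container:

    {
        "python_interpreter": "tcp://172.17.0.2:19361?pathmap=/home/user/my_project,/my_project"
    }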

note: take into account that exec is not able to expose new ports to the host either, so if you want to expose the port where the anaconda minserver is running, you should have added it to the run command of the container in which we are executing the command.

Using docker-compose

We think this is the best approach to follow. If you are not used to the docker-compose command, take a look at its documentation on the Docker website. In short, docker-compose allows us to define and run multi-container isolated environments with docker.

We are going to borrow their getting started tutorial, so go there and follow the instructions until you complete step 3. If you followed the tutorial, you will have a directory that contains four files: app.py, docker-compose.yml, Dockerfile and requirements.txt. Now edit the docker-compose.yml file; we are going to add a new container there.

Anaconda

Add the new container to the services definition after the redis entry with the contents below (be careful with the spaces, as they have meaning in the YAML format):
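A sketch of that service entry, nested under services: next to web and redis. The image name composetest_web is the one built by the tutorial; the port, the project name and the anaconda volume path (the usual Sublime Text 3 location on Linux) are assumptions, and the ports mapping is optional, it only lets the interpreter example below connect through localhost:

    anaconda:
      image: composetest_web
      volumes:
        - ~/.config/sublime-text-3/Packages/Anaconda:/opt/anaconda
      ports:
        - "19360:19360"
      command: /opt/anaconda/anaconda_server/docker/start python 19360 docker_project /code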

As we used composetest_web as our base image, we should have the /code volume available, so we pass it as the fourth parameter to our minserver invocation so that Jedi will be able to complete code in our application.

Now you can go forward to step four of the docker-compose getting started guide and run docker-compose up to start the environment.

note: change the left side of the volume path to whatever path your anaconda is installed on.

example: you can take a look at this complete example gist.

The python_interpreter

The python interpreter is no more difficult:
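A sketch of the setting, assuming the compose sketch above exposes port 19360 to the host; the directory map pairs a hypothetical tutorial folder on the host with /code in the container and is illustrative:

    {
        "python_interpreter": "tcp://localhost:19360?pathmap=/home/user/composetest,/code"
    }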

Enjoy!

DeepLabCut can be run on Windows, Linux, or MacOS (see also technical considerations).

The installation process is as easy as this figure!

Please note, there are several modes of installation, and the user should decide to either use a system-wide installation (see note below), an Anaconda environment based installation (recommended), or the supplied Docker container (recommended for advanced Ubuntu users). One can of course also use other Python distributions than Anaconda, but Anaconda is the easiest route.


Step 1: You need to have Python >3.7 installed, and we highly recommend using Anaconda to do so.

Simply download the appropriate files here: https://www.anaconda.com/distribution/

  • Anaconda is perhaps the easiest way to install Python and additional packages across various operating systems. With Anaconda you create all the dependencies in an environment on your machine.

Step 2: Easy install: please use our supplied Anaconda environments

You first need to decide if you want to use a CPU or GPU for your models. (Note, you can also use the CPU only for project management and labeling the data! Then, for example, use Google Colaboratory GPUs for free; read more here, and there are a lot of helper videos on our YouTube channel!)

  • CPU? Great, jump to the next section below!

  • GPU? If you want to use your own GPU (i.e., a GPU is in your workstation), then you need to be sure you have a CUDA compatible GPU, CUDA, and cuDNN installed. Please note, which CUDA you install depends on what version of tensorflow you want to use. So, please check “GPU Support” below carefully. Note, DeepLabCut is up to date with the latest CUDA and tensorflow versions!

  • DIY: For the experts that want DLC in a different custom environment, it’s also on PyPI; simply pip install 'deeplabcut[gui]' (and have TensorFlow and wxPython also installed) for DeepLabCut + GUI. DeepLabCut without the GUIs can be installed with pip install deeplabcut. See more Tips at the bottom as well.

CPU or GPU:

(A) Click HERE to download the conda file! (Or you can grab it from GitHub: git clone this repo in the terminal/cmd program, while in a folder where you wish to place DeepLabCut. To git clone, type: git clone https://github.com/DeepLabCut/DeepLabCut.git. Note, this can be anywhere; even Downloads is fine.)

(B) Now, in Terminal (or Anaconda Command Prompt for Windows users), go to the folder where you downloaded the file. For example, if you downloaded it from the CLICK HERE link above, it likely went into your Downloads folder: cd C:\Users\YourUserName\Downloads

If you cloned the repo onto your Desktop, the command may look like:

cd C:\Users\YourUserName\Desktop\DeepLabCut\conda-environments

To get the location right, a cool trick is to drag the folder and drop it into Terminal. Alternatively, you can (on Windows) hold SHIFT and right-click > Copy as path, or (on Mac) right-click and while in the menu press the OPTION key to reveal Copy as Pathname.

(C) Now, in the terminal run:

conda env create -f DEEPLABCUT.yaml

(D) You can now use this environment from anywhere on your computer (i.e. no need to go back into the conda-environments folder). Just enter your environment by running:

  • Ubuntu/MacOS: source/conda activate nameoftheenv (i.e. on your Mac: conda activate DEEPLABCUT)

  • Windows: activate nameoftheenv (i.e. activate DEEPLABCUT)

Now you should see (nameofenv) on the left of your terminal screen, i.e. (DEEPLABCUT) YourName-MacBook... NOTE: no need to run pip install deeplabcut, as it is already installed!!! :)
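Putting steps (B) through (D) together, a minimal sketch on Linux/MacOS (the Downloads location is just an example):

    cd ~/Downloads
    conda env create -f DEEPLABCUT.yaml
    conda activate DEEPLABCUT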

Great, that’s it! DeepLabCut is installed!

Next, head over to the Docs to decide which mode to use DeepLabCut in. You have both standard and multi-animal installed.

Pro Tips:

If you ever want to update your DLC, just run pip install --upgrade deeplabcut once you are inside your env. If you want to use a specific release, then you need to specify the version you want, such as pip install deeplabcut==2.2. Once installed, you can check the version by running import deeplabcut and then deeplabcut.__version__. Don’t be afraid to update; DLC is backwards compatible with your 2.0+ projects, performance continues to get better, and new features are added nearly monthly.
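For example, a minimal sketch of an update and version check inside your activated env (the pinned release number is only an illustration):

    pip install --upgrade deeplabcut
    # or pin a specific release instead:
    pip install deeplabcut==2.2
    python -c "import deeplabcut; print(deeplabcut.__version__)"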


Here are some conda environment management tips: https://kapeli.com/cheat_sheets/Conda.docset/Contents/Resources/Documents/index

Pro Tip: If you want to modify code and then test it, you can use our provided testscripts. This would mean you need to be up-to-date with the latest GitHub-based code though! Please see here on how to get the latest GitHub code, and how to test your installation by following this video: https://www.youtube.com/watch?v=IOWtKn3l33s

Creating your own customized conda env (recommended route for Linux: Ubuntu, CentOS, Mint, etc.)

*Note: in a fresh Ubuntu install, you will often have to run sudo apt-get install gcc python3-dev to install the GNU Compiler Collection and the Python development environment.

Some users might want to create their own customized env. Here is an example.

In the terminal type:

conda create -n DLC python=3.8

The only thing you then need to add to the env is deeplabcut (pip install deeplabcut) or pip install 'deeplabcut[gui]', which has wxPython for GUI support. For Windows and MacOS, you just run pip install -U 'wxPython<4.1.0', but for Linux you might need the specific wheel (https://wxpython.org/pages/downloads/index.html).
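A minimal sketch of that DIY route, with the GUI extra included (drop it if you don't need wxPython):

    conda create -n DLC python=3.8
    conda activate DLC
    pip install 'deeplabcut[gui]'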

We have some tips for linux users here, as the latest Ubuntu doesn’t easily support a 1-click install: https://deeplabcut.github.io/DeepLabCut/docs/recipes/installTips.html

GPU Support:

The ONLY thing you need to do first, if you have an NVIDIA GPU and the NVIDIA driver installed, is to install CUDA plus cuDNN (once in your conda env, just run: conda install -c conda-forge cudnn). Please note that only NVIDIA GPUs are supported.

  • DRIVERS: https://www.nvidia.com/Download/index.aspx

  • CUDA: https://developer.nvidia.com/cuda-downloads (just follow the prompts here!)

The most common “new user” hurdle is installing and using your GPU, so don’t get discouraged!

CRITICAL: If you have a GPU, you should FIRST install the NVIDIA CUDA package and an appropriate driver for your specific GPU, and then you can use the supplied conda file. Please follow the instructions found here https://www.tensorflow.org/install/gpu, and more tips below, to install the correct version of CUDA and your graphics card driver. The order of operations matters.

  • Here we provide notes on how to install and check your GPU use with TensorFlow (which is used by DeepLabCut and already installed with the Anaconda files above). Thus, you do not need to independently install tensorflow.

FIRST, install a driver for your GPU. Find DRIVER HERE: https://www.nvidia.com/download/index.aspx

  • check which driver is installed by typing this into the terminal: nvidia-smi.

SECOND, install CUDA (versions up to CUDA11 are supported, together with TF2.5): https://developer.nvidia.com/ (Note that cuDNN, https://developer.nvidia.com/cudnn, is supplied inside the anaconda environment files, so you don’t need to install it again).

THIRD: Follow the steps above to get the DLC-GPU conda file and install it!

Notes:

  • All of the TensorFlow versions work with DeepLabCut. But, please be mindful that different versions of TensorFlow require different CUDA versions.

  • As the combination of TensorFlow and CUDA matters, we strongly encourage you to check your driver/cuDNN/CUDA/TensorFlow versions on this StackOverflow post.

  • To check your GPU is working, in the terminal, run:

nvcc -V to check your installed version(s) (see also the sketch after this list).

  • The best practice is to then run the supplied testscript.py (this is inside the examples folder you acquired when you git cloned the repo). Here is more information/a short video on running the testscript.

  • Additionally, if you want to use the bleeding edge, with your git clone you also get the latest code. While inside the main DeepLabCut folder, you can run ./reinstall.sh to be sure it’s installed (more here: https://github.com/DeepLabCut/DeepLabCut/wiki/How-to-use-the-latest-GitHub-code)

  • You can test that your GPU is being properly engaged with these additional tips.

  • Ubuntu users might find this installation guide for a fresh ubuntu install useful as well.
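A minimal sketch of those checks from the terminal; the TensorFlow one-liner assumes the TF 2.x versions shipped with the conda files:

    nvidia-smi     # driver and GPU visibility
    nvcc -V        # installed CUDA toolkit version(s)
    python -c "import tensorflow as tf; print(tf.config.list_physical_devices('GPU'))"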

Troubleshooting:

TensorFlow: Here are some additional resources users have found helpful (posted without endorsement):

FFMPEG:

  • A few Windows users report needing to re-install ffmpeg (after Windows updates) as described here: https://video.stackexchange.com/questions/20495/how-do-i-set-up-and-use-ffmpeg-in-windows (a potential error could occur when making new videos). On Ubuntu, the command is: sudo apt install ffmpeg

DEEPLABCUT:


  • If you git clone or download this folder and are inside of it, then import deeplabcut will import the package from there rather than from the latest release on PyPI!


System-wide considerations:


If you perform the system-wide installation, and the computer has other Python packages or TensorFlow versions installed that conflict, this will overwrite them. If you have a dedicated machine for DeepLabCut, this is fine. If there are other applications that require different versions of libraries, then one would potentially break those applications. The solution to this problem is to create a virtual environment, a self-contained directory that contains a Python installation for a particular version of Python, plus additional packages. One way to manage virtual environments is to use conda environments (for which you need Anaconda installed).

Technical Considerations:

  • Computer:

    • For reference, we use e.g. Dell workstations (79xx series) with Ubuntu 16.04 LTS, 18.04 LTS, or 20.04 LTS, and for versions prior to 2.2 we run a Docker container that has TensorFlow, etc. installed (https://github.com/DeepLabCut/Docker4DeepLabCut2.0). Now we use the new Docker containers supplied on this repo, also available through DockerHub or the deeplabcut-docker helper script.

  • Computer Hardware:

    • Ideally, you will use a strong GPU with at least 8GB memory such as the NVIDIA GeForce 1080 Ti or 2080 Ti. A GPU is not necessary, but on a CPU the (training and evaluation) code is considerably slower (10x) for ResNets, but MobileNets are faster (see WIKI). You might also consider using cloud computing services like Google cloud/amazon web services or Google Colaboratory.

  • Camera Hardware:

    • The software is very robust to track data from any camera (cell phone cameras, grayscale, color; captured under infrared light, different manufacturers, etc.). See demos on our website.

  • Software:

    • Operating System: Linux (Ubuntu), MacOS* (Mojave), or Windows 10. However, the authors strongly recommend Ubuntu! *MacOS does not support NVIDIA GPUs (easily), so we only suggest this option for CPU use or a case where the user wants to label data, refine data, etc and then push the project to a cloud resource for GPU computing steps, or use MobileNets.

    • Anaconda/Python3: Anaconda: a free and open source distribution of the Python programming language (download from https://www.anaconda.com/). DeepLabCut is written in Python 3 (https://www.python.org/) and not compatible with Python 2.

    • pip install deeplabcut

    • TensorFlow

      • You will need TensorFlow for Python 3.7, 3.8, or 3.9 with GPU support (we used version 1.0 in the paper; later versions also work with the provided code. We tested TensorFlow versions 1.0 to 1.15 and 2.0 to 2.5, and we recommend TF2.5 now).

      • To note, it is possible to run DeepLabCut on your CPU, but it will be VERY slow (see: Mathis & Warren). However, this is the preferred path if you want to test DeepLabCut on your own computer/data before purchasing a GPU, with the added benefit of a straightforward installation! Otherwise, use our COLAB notebooks for GPU access for testing.

    • Docker: We highly recommend advanced users use the supplied Docker container


Return to readme.