Docker container for TensorFlow from the sources

Using TensorFlow via Docker

This directory contains Dockerfiles to make it easy to get up and running with
TensorFlow via Docker.

Installing Docker

General installation instructions are available
on the Docker site.

Which containers exist?

We currently maintain the following Docker container images:

  • TensorFlow with all dependencies - CPU only

  • TensorFlow with all dependencies
    and support for Nvidia CUDA

Note: We also publish the same containers to
Docker Hub.

Running the container

Run the non-GPU container using

$ docker run -it -p 8888:8888

For GPU support, install the Nvidia drivers (ideally the latest) and
nvidia-docker, then run using

$ nvidia-docker run -it -p 8888:8888
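
The commands above omit the image name. As a sketch only, the invocations might look like the following; the tensorflow/tensorflow name and :latest-gpu tag are assumptions, not taken from this page, and the commands are printed rather than executed so no Docker daemon is needed:

```shell
# Assumed image names -- this page omits them, so treat these as placeholders.
CPU_IMAGE="tensorflow/tensorflow"
GPU_IMAGE="tensorflow/tensorflow:latest-gpu"

# -p 8888:8888 publishes the container's Jupyter port on the host.
CPU_CMD="docker run -it -p 8888:8888 $CPU_IMAGE"
GPU_CMD="nvidia-docker run -it -p 8888:8888 $GPU_IMAGE"
echo "$CPU_CMD"
echo "$GPU_CMD"
```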

Note: If you have problems running nvidia-docker, you may try the old method
we used previously; however, it is not recommended. If you find a bug in
nvidia-docker, please report it there and try using nvidia-docker as described above.

$ export CUDA_SO=$(\ls /usr/lib/x86_64-linux-gnu/libcuda.* | xargs -I{} echo '-v {}:{}')
$ export DEVICES=$(\ls /dev/nvidia* | xargs -I{} echo '--device {}:{}')
$ docker run -it -p 8888:8888 $CUDA_SO $DEVICES
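
The two exports above build repeated flags from a file listing (one `-v` or `--device` flag per matching file). A minimal, self-contained sketch of that xargs pattern, using mock files in a temporary directory so no actual Nvidia devices are required:

```shell
# Sketch of the flag-building pattern above. The file names are mock
# placeholders standing in for /dev/nvidia* device nodes.
tmp=$(mktemp -d)
touch "$tmp/nvidia0" "$tmp/nvidiactl"

# Same construction as DEVICES above: one --device flag per matching file.
DEVICES=$(ls "$tmp"/nvidia* | xargs -I{} echo "--device {}:{}")
echo "$DEVICES"

rm -rf "$tmp"
```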

Rebuilding the containers

Pick the Dockerfile corresponding to the container you want to build, and run

$ docker build --pull -t $USER/tensorflow-suffix -f Dockerfile.suffix .
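
As a concrete sketch of how that tag expands (the "builder" user name and "gpu" suffix below are placeholder values, and the command is printed rather than executed):

```shell
# Sketch: how the tag in the build command above expands.
# "builder" and "gpu" are placeholder values, not from this page.
USER_NAME="builder"
SUFFIX="gpu"                              # i.e. build from Dockerfile.gpu
TAG="$USER_NAME/tensorflow-$SUFFIX"

# The command that would be run (printed here instead of executed):
echo "docker build --pull -t $TAG -f Dockerfile.$SUFFIX ."
```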