Automated build of an Ubuntu 14.04 container with CUDA 7.5, cuDNN v3, and Miniconda.
A chain of Dockerfiles for machine learning and statistics research.
Requirements and installation
- one or more NVIDIA GPUs with CUDA compute capability > 3.0
- the NVIDIA CUDA driver installed on the host OS
If you want to use GPU 0 and GPU 1 (as listed by nvidia-smi), serve an IPython notebook on port 8888, and mount a volume containing your notebooks, you could run:
NV_GPU='0,1' nvidia-docker run -it -p 8888:8888 -v ~/notebooks:/notebooks tboquet/nameoftherepo
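The command above can also be wrapped in a small helper script so the GPU list, host port, notebook directory, and image name are easy to change. This is only a sketch: the image name `tboquet/nameoftherepo` is the placeholder from the example, and the script prints the command instead of executing it so you can review it before launching the container.

```shell
#!/bin/sh
# Parameters for the nvidia-docker invocation, with the defaults
# from the example above; override any of them via the environment.
GPUS="${GPUS:-0,1}"                        # GPU ids as listed by nvidia-smi
PORT="${PORT:-8888}"                       # host port for the notebook server
NOTEBOOKS="${NOTEBOOKS:-$HOME/notebooks}"  # host directory to mount at /notebooks
IMAGE="${IMAGE:-tboquet/nameoftherepo}"    # placeholder image name from the example

# Print the command rather than running it, so it can be inspected first.
echo "NV_GPU='${GPUS}' nvidia-docker run -it -p ${PORT}:8888 -v ${NOTEBOOKS}:/notebooks ${IMAGE}"
```

To actually start the container, pipe the output to `sh` or drop the `echo`.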