
Short Description
Vanilla nvidia-docker image (incl. CUDA) + cuDNN + TensorFlow
Full Description

Why do this

To simplify setting up a GPU environment for TensorFlow. See this link: https://github.com/NVIDIA/nvidia-docker/wiki/Why%20NVIDIA%20Docker

Usage

  1. Check your GPU hardware:
    lspci | grep -i nvidia
  2. Follow the instructions at https://github.com/nvidia/nvidia-docker to set up the environment and install nvidia-docker:
    1. NVIDIA driver
    2. nvidia-docker
  3. Start and enter a container:
    sudo nvidia-docker run -it --name tensorflow pennymax/tensorflow-cuda /bin/bash
  4. Test with the PTB language-model example (a minimal GPU check is also sketched after this list):
    cd /tensorflow/tensorflow/models/rnn/ptb
    python ptb_word_lm.py --data_path=/tmp/simple-examples/data/ --model small
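
To confirm that TensorFlow inside the container actually sees the GPU, you can also run a minimal device-placement check. This is a sketch using the pre-1.0, session-based TensorFlow API that this image ships; the '/gpu:0' device name and the tensor values are illustrative only.

    import tensorflow as tf

    # Pin a tiny computation to the first GPU; log_device_placement makes the
    # session print which device each op actually runs on.
    with tf.device('/gpu:0'):
        a = tf.constant([1.0, 2.0, 3.0], name='a')
        b = tf.constant([4.0, 5.0, 6.0], name='b')
        c = a + b

    sess = tf.Session(config=tf.ConfigProto(log_device_placement=True))
    print(sess.run(c))  # expected: [ 5.  7.  9.], with ops logged on gpu:0

If no GPU is visible (e.g. the container was started with plain docker instead of nvidia-docker), the pinned placement will fail with a device-assignment error rather than silently falling back to the CPU.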

What I have done on top of the vanilla nvidia-docker image

  1. Installed cuDNN and copied its headers/libs into the corresponding /usr/local/cuda folders
  2. Installed TensorFlow using the pip installation method from https://www.tensorflow.org/versions/r0.7/get_started/os_setup.html#pip-installation
  3. Cloned the TensorFlow git repo to /tensorflow
  4. Installed some utilities (e.g. wget), which could be removed later on if the image size needs to be reduced
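
A quick sanity check that these pieces are wired together is sketched below. It assumes only what the list above states (cuDNN libs under /usr/local/cuda, a pip-installed tensorflow module, and the source clone at /tensorflow); the exact cuDNN soname depends on the cuDNN version baked into the image.

    import ctypes
    import os

    import tensorflow as tf

    # The cuDNN libs were placed under /usr/local/cuda, so the dynamic loader
    # should resolve them. If the unversioned symlink is missing, try a
    # versioned soname such as libcudnn.so.4 instead.
    ctypes.CDLL('libcudnn.so')

    # The pip install provides the tensorflow module; the git clone lives at /tensorflow.
    print('TensorFlow imported from: ' + tf.__file__)
    print('Source tree present: ' + str(os.path.isdir('/tensorflow')))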
Docker Pull Command

  docker pull pennymax/tensorflow-cuda