This is my minimal hadoop-base image (based on hadoop-2.4.0 compiled on Arch
Linux kernel 3.17.6-1-ARCH x86_64).
## Run the Containers
### SSH Keys Container
First we fire up the keyhost container so that we can mount its .ssh volume into the Hadoop container:

```
sudo docker run --name keyhost ezhaar/key-host
```
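The internals of `ezhaar/key-host` aren't shown here, but key-host images of this kind typically just bake a passwordless keypair into a shared volume at build time, so every container mounting that volume can ssh to the others. A minimal sketch of the idea (the `/tmp/demo-ssh` path and file names are illustrative, not taken from the image):

```
# Generate a passwordless RSA keypair in a directory that would be
# exported as a Docker volume (e.g. VOLUME /root/.ssh in the Dockerfile).
mkdir -p /tmp/demo-ssh
ssh-keygen -t rsa -N "" -f /tmp/demo-ssh/id_rsa -q

# Authorize the key for login, so any container sharing this volume
# accepts ssh connections using the same keypair.
cp /tmp/demo-ssh/id_rsa.pub /tmp/demo-ssh/authorized_keys
chmod 600 /tmp/demo-ssh/authorized_keys
```

Mounting this directory with `--volumes-from` then gives the Hadoop container both the private key and the matching `authorized_keys`.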
Now we run the Hadoop container:
```
sudo docker run -it \
    --volumes-from keyhost \
    --name htest \
    -h master.localdomain \
    --dns-search=localdomain \
    ezhaar/hadoop-2.4.0
```
- `-h` sets the hostname and adds an entry to the `/etc/hosts` file.
- `--dns-search` updates `/etc/resolv.conf` for reverse DNS lookups.
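Inside the container, the combined effect of `-h` and `--dns-search` should look roughly like this (a sketch of the files Docker generates; the container IP is illustrative):

```
# /etc/hosts (from -h master.localdomain)
172.17.0.3   master.localdomain master

# /etc/resolv.conf (from --dns-search=localdomain)
search localdomain
```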
Once the container has started, start the ssh service and format the
namenode; then we are ready to play with Hadoop:
```
service ssh start
hdfs namenode -format
hdfs dfs -mkdir /user
hdfs dfs -mkdir /user/root
hdfs dfs -ls
```
Note: the Dockerfile starts `FROM ubuntu:14.04.1`, so the image itself is Ubuntu; only the Hadoop distribution it ships was compiled on my Arch Linux workstation. I can share my private Arch container if you need that.