Public | Automated Build

Last pushed: 2 years ago
Short Description
An easy way to try Hadoop
Full Description

# Apache Hadoop 2.7.0 Docker image

Note: this is the master branch - for a particular Hadoop version, always check the related branch

A few weeks ago we released an Apache Hadoop 2.3 Docker image - it quickly became the most popular Hadoop image in the Docker registry.

Following the success of our previous Hadoop Docker images, and with the feedback and feature requests we received aligning with the Hadoop release cycle, we have released an Apache Hadoop 2.7.0 Docker image. As with the previous version, it is available as a trusted, automated build on the official Docker registry.

FYI: all the former Hadoop releases (2.3, 2.4.0, 2.4.1, 2.5.0, 2.5.1, 2.5.2, 2.6.0) are available in the GitHub branches or in our Docker registry - check the tags.

## Build the image

If you'd like to build it directly from the Dockerfile, you can build the image as:

```
docker build -t sequenceiq/hadoop-docker:2.7.0 .
```

## Pull the image

The image is also released as an official automated build on the Docker registry - you can always pull or reference the image when launching containers.

```
docker pull sequenceiq/hadoop-docker:2.7.0
```

## Start a container

To use the Docker image you have just built or pulled:

Make sure that SELinux is disabled on the host. If you are using boot2docker you don't need to do anything.

```
docker run -it sequenceiq/hadoop-docker:2.7.0 /etc/ -bash
```


You can run one of the stock examples:

```
# run the mapreduce job
bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.0.jar grep input output 'dfs[a-z.]+'

# check the output
bin/hdfs dfs -cat output/*
```
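To rerun the example, note that MapReduce refuses to overwrite an existing output directory. A minimal sketch, assuming you are in `$HADOOP_PREFIX` inside the container as above (the `yarn[a-z.]+` pattern is just an illustrative alternative):

```shell
# remove the previous output directory first
bin/hdfs dfs -rm -r output

# rerun the grep example with a different pattern
bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.0.jar grep input output 'yarn[a-z.]+'
```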

## Hadoop native libraries, build, Bintray, etc.

The Hadoop build process is no easy task - it requires many libraries at the right versions, protobuf, etc., and takes some time. We have simplified all of this, done the build, and released a 64-bit version of the Hadoop native libraries in this Bintray repo. Enjoy.
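To verify that the native libraries are actually picked up inside the container, Hadoop ships a small diagnostic that can be run from `$HADOOP_PREFIX`:

```shell
# reports whether libhadoop, zlib, snappy, lz4, etc. were loaded
bin/hadoop checknative -a
```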

## Automate everything

As we have mentioned previously, a Dockerfile was created and released in the official Docker repository.


Comments (25)
3 months ago

Thanks!! I ran it like this and it works for me!!
docker run -d --name=hadoopserver --net=host -p 8030:8030 -p 8040:8040 -p 8042:8042 -p 8088:8088 -p 19888:19888 -p 49707:49707 -p 50010:50010 -p 50020:50020 -p 50070:50070 -p 50075:50075 -p 50090:50090 -p 9000:9000 sequenceiq/hadoop-docker:latest /etc/ -d

9 months ago

When I try to pull the image, it gets downloaded but fails at the end with an unauthorized exception. Which authorization do I need to set?

a year ago

The latest image for 2.7.1:

I try to create a directory in HDFS, but the NameNode is always in safe mode.

bash-4.1# pwd
bash-4.1# bin/hdfs dfs -mkdir /input
mkdir: Cannot create directory /input. Name node is in safe mode.
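For reference: the NameNode leaves safe mode on its own once enough blocks have been reported by DataNodes. Its state can be inspected and, if the cluster is actually healthy, safe mode can be left manually (run from `$HADOOP_PREFIX`):

```shell
# show whether the NameNode is currently in safe mode
bin/hdfs dfsadmin -safemode get

# force the NameNode out of safe mode (only do this if the cluster is healthy)
bin/hdfs dfsadmin -safemode leave
```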

a year ago

The latest Dockerfile and image for 2.7.1 on Docker Hub are missing changes committed on GitHub. In my case, I was looking for port 8020 to be exposed, as shown in the link below:

Love the image, but thought you might want to clear up the discrepancy.

a year ago

thank you very much

a year ago

Hi!

I was just curious: is there any tutorial on building a multi-node Hadoop cluster? I am very keen to learn. I'm trying out an experiment with 1 master and 2 slave nodes using CentOS. Hope to hear from you soon! Thanks! :)

2 years ago

Wow!!!! Thanks!!

2 years ago

The host machine is Ubuntu x86_64: docker run --net=host -it sequenceiq/hadoop-docker:2.7.0 /etc/ -bash

2 years ago

After submitting the example, the job remains in the PREP state. Is this expected?

./hadoop job -list
DEPRECATED: Use of this script to execute mapred command is deprecated.
Instead use the mapred command for it.

16/01/07 04:44:52 INFO client.RMProxy: Connecting to ResourceManager at /
Total jobs:1
JobId State StartTime UserName Queue Priority UsedContainers RsvdContainers UsedMem RsvdMem NeededMem AM info
job_1452158980050_0001 PREP 1452159025481 root default NORMAL 0 0 0M 0M 0M http://4b8dae616769:8088/proxy/application_1452158980050_0001/
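A job stuck in PREP usually means YARN has no NodeManager capacity available to start the ApplicationMaster. A quick way to check, sketched here assuming you are in `$HADOOP_PREFIX` inside the container with the default ResourceManager port:

```shell
# list NodeManagers with their state and capacity
bin/yarn node -list -all

# cluster-wide memory/vcore metrics from the ResourceManager REST API
curl http://localhost:8088/ws/v1/cluster/metrics
```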

2 years ago


I am using this image to create a local Hadoop cluster, however when I try to ssh in, it always prompts me for the root password, which is unknown. I was expecting SSH to be passwordless as mentioned in the Dockerfile, but it doesn't work.
Any comment/feedback would be really appreciated.