Ekino Logging Stack


This repo helps you create a two-node pool of Docker containers to easily build
an ELK stack.

It is also the official image source for:


If you are on Linux, simply execute this command:

curl -sSL | bash -s run


Running containers with docker-compose

If you have docker-compose (formerly fig), run:

docker-compose up -d

Then open your browser at localhost
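For reference, a compose file wiring these containers together might look roughly like the following sketch, in the fig-era v1 format. The service names, image names, and the CERTIFICATE_CN value are assumptions, not the repo's actual file:

```yaml
# Hypothetical docker-compose.yml sketch; names and values are assumptions.
eslogstash:
  image: ekino/eslogstash
  ports:
    - "9200:9200"
    - "5000:5000"
  environment:
    CERTIFICATE_CN: localhost

forwarder:
  image: ekino/logstash-forwarder
  links:
    - eslogstash
  volumes_from:
    - eslogstash

kibana:
  image: ekino/kibana
  links:
    - eslogstash
  ports:
    - "80:5601"
```

The shared `volumes_from` is what lets the forwarder pick up the SSL secrets generated by the first container when everything runs on one host.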

Running containers manually


Start the 1st container with elasticsearch and logstash:

docker run --name eslogstash -d \
  -p 9200:9200 \
  -p 5000:5000 \

This starts a container that will auto-generate the required SSL certificate for
Logstash's lumberjack input.

For CERTIFICATE_CN, you must specify the FQDN that remote hosts will use to
establish a secure SSL connection.
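The command above is truncated; a complete invocation might look like the following sketch. The image name `ekino/eslogstash` and the FQDN are assumptions. The snippet dry-runs by default (it only prints the command); set `DOCKER=docker` to actually execute it:

```shell
# Sketch only: image name "ekino/eslogstash" and the FQDN are assumptions.
# Dry-run by default (prints the command); set DOCKER=docker to execute.
DOCKER="${DOCKER:-echo docker}"

$DOCKER run --name eslogstash -d \
  -p 9200:9200 \
  -p 5000:5000 \
  -e CERTIFICATE_CN=logs.example.com \
  ekino/eslogstash
```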


Start the 2nd container, logstash-forwarder, with the shared SSL secret cert/key:

# copy SSL secrets from container to host
docker cp eslogstash:/etc/logstash/ssl lumberjack-secrets

# start container with volumes to your custom config file + shared secrets
docker run --name forwarder -d \
  --link \
  -e \
  -v $(readlink -f lumberjack-secrets/ssl):/etc/logstash/ssl \

It will send the first log entries to Logstash and thereby initialize the
Elasticsearch indexes. This prevents the messy
"Unable to fetch mapping" error
message!

Note: For real-life use of the container, consider using a custom config file
(-v /path/to/your/config.json:/etc/logstash/config.json) to read other
containers' log files, accessible via container volumes...
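The forwarder command above is also truncated; a complete invocation might look like this sketch. The image name `ekino/logstash-forwarder` and the SERVER_HOST variable are assumptions, and `$(pwd)` stands in for the original's `readlink -f`. Dry-run by default; set `DOCKER=docker` to execute:

```shell
# Sketch only: image name "ekino/logstash-forwarder" and the env var are assumptions.
# Dry-run by default (prints the commands); set DOCKER=docker to execute.
DOCKER="${DOCKER:-echo docker}"

# copy SSL secrets from container to host (as in the step above)
$DOCKER cp eslogstash:/etc/logstash/ssl lumberjack-secrets

# start the forwarder, linked to the 1st container, with custom config + shared secrets
$DOCKER run --name forwarder -d \
  --link eslogstash:eslogstash \
  -e SERVER_HOST=eslogstash \
  -v "$(pwd)/lumberjack-secrets/ssl":/etc/logstash/ssl \
  -v /path/to/your/config.json:/etc/logstash/config.json \
  ekino/logstash-forwarder
```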


Start the 3rd container, kibana, connected to the 1st one:

docker run --name kibana -d \
  --link \
  -p 80:5601 \

Note: Since Kibana 4, requests to ELASTICSEARCH_URL are performed from the
server, not from the browser anymore, which makes things easier to manage
(Elasticsearch no longer has to be exposed to the internet, etc.)
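Filled out, the kibana command might look like this sketch. The image name `ekino/kibana` and the ELASTICSEARCH_URL value are assumptions; ELASTICSEARCH_URL itself is the variable mentioned in the note above. Dry-run by default; set `DOCKER=docker` to execute:

```shell
# Sketch only: image name "ekino/kibana" and the URL value are assumptions.
# Dry-run by default (prints the command); set DOCKER=docker to execute.
DOCKER="${DOCKER:-echo docker}"

$DOCKER run --name kibana -d \
  --link eslogstash:eslogstash \
  -e ELASTICSEARCH_URL=http://eslogstash:9200 \
  -p 80:5601 \
  ekino/kibana
```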

Finally, open your browser at localhost

Further Reading

The TL;DR and docker-compose versions spin up 3 Docker containers running on
the same host, so they use a shared data container volume between
elasticsearch/logstash and logstash-forwarder.

The manual version uses docker cp to extract the ssl folder so it can be
distributed if containers are not run on the same host.
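For that multi-host case, the extracted folder can be shipped with standard tools. A sketch, where "forwarder-host" is a placeholder for the remote machine; dry-run by default, set `DOCKER=docker` and `SCP=scp` to execute:

```shell
# Sketch: "forwarder-host" is a hypothetical remote machine.
# Dry-run by default (prints the commands); set DOCKER=docker SCP=scp to execute.
DOCKER="${DOCKER:-echo docker}"
SCP="${SCP:-echo scp}"

# extract the SSL folder from the running container...
$DOCKER cp eslogstash:/etc/logstash/ssl lumberjack-secrets

# ...and distribute it to the host that will run logstash-forwarder
$SCP -r lumberjack-secrets/ssl forwarder-host:/etc/logstash/ssl
```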
