This image is deprecated; use graylog2/server instead.
For support, please use the official community support channels provided by Graylog: https://www.graylog.org/community-support
This project creates a Docker container with full Graylog stack installed.
You need a recent
Docker version installed; take a look here for instructions. Also check that there are enough free resources: the container should get at least 4 GB of RAM.
$ docker pull graylog2/allinone
$ docker run -t -p 9000:9000 -p 12201:12201 graylog2/allinone
This will create a container with all Graylog services running.
After starting the container, your Graylog instance is ready to use.
You can reach the web interface by pointing your browser to the IP address of your Docker host:
The default login username is admin; the password is whatever you set via GRAYLOG_PASSWORD (see below).
How to get log data in
You can create different kinds of inputs under System -> Inputs; however, you can only use ports that have been properly mapped to your Docker container, otherwise data will not get through. You already exposed the default GELF port 12201, so it is a good idea to start a GELF TCP input there. Here is a list of available GELF integrations. To start another input you have to expose the right port. For example, to start a raw TCP input on port 5555, stop your container and recreate it, appending
-p 5555:5555 to your run arguments. Similarly, the same can be done for UDP by appending the
-p 5555:5555/udp option. Then you can send raw text to Graylog like this:
echo 'first log message' | nc localhost 5555
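For the GELF TCP input on port 12201, you can also hand-craft a test message. This is a sketch, assuming a GELF TCP input is already running on the default port; note that GELF TCP frames are JSON documents terminated by a null byte:

```shell
# Minimal GELF 1.1 message (version, host and short_message are required fields)
msg='{"version":"1.1","host":"example.org","short_message":"first GELF message","level":6}'
# GELF TCP frames are null-byte-terminated JSON
printf '%s\0' "$msg" | nc localhost 12201
```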
You can configure the most important aspects of your Graylog instance through environment variables. In order
to set a variable, add a
-e VARIABLE_NAME=value option to your
docker run command. For example, to set a different admin password,
start your container like this:
$ docker run -t -p 9000:9000 -p 12201:12201 -e GRAYLOG_PASSWORD=SeCuRePwD graylog2/allinone
GRAYLOG_PASSWORD: Set admin password
GRAYLOG_USERNAME: Set username for admin user (default: admin)
GRAYLOG_TIMEZONE: Set timezone (TZ) you are in
GRAYLOG_SMTP_SERVER: Hostname/IP address of your SMTP server for sending alert mails
GRAYLOG_RETENTION: Configure how long or how many logs should be stored
GRAYLOG_NODE_ID: Set server node ID (default: random)
GRAYLOG_SERVER_SECRET: Set salt for encryption
GRAYLOG_MASTER: IP address of a remote master container (see multi container setup)
GRAYLOG_SERVER: Run only server components
GRAYLOG_WEB: Run web interface only
ES_MEMORY: Set memory used by Elasticsearch (syntax: 1024m). Defaults to 60% of host memory
Set an admin password:
GRAYLOG_PASSWORD=SeCuRePwD
Change the admin username:
GRAYLOG_USERNAME=myadmin
Set your local timezone:
GRAYLOG_TIMEZONE=Europe/Berlin
Set an SMTP server for alert e-mails:
GRAYLOG_SMTP_SERVER=mailserver.com
Disable TLS/SSL for mail delivery:
GRAYLOG_SMTP_SERVER="mailserver.com --no-tls --no-ssl"
Set the SMTP server with port, authentication, and a changed sender address:
GRAYLOG_SMTP_SERVER="mailserver.com --port=465 --user=firstname.lastname@example.org --password=SecretPassword --from-email=email@example.com --web-url=http://my.graylog.host"
Set a static server node ID:
GRAYLOG_NODE_ID=some-rand-omeu-uidasnodeid
Set a configuration master for linking multiple containers:
GRAYLOG_MASTER=<host IP address>
Only start server services:
GRAYLOG_SERVER=true
Only run the web interface:
GRAYLOG_WEB=true
Keep 30 GB of logs, distributed across 10 Elasticsearch indices:
Keep one month of logs, distributed across 30 indices with 24 hours of logs each:
Limit the amount of memory Elasticsearch uses:
ES_MEMORY=1024m
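Several of these variables can be combined in a single run. A sketch, where the timezone, mail server, and memory values are placeholders you should replace with your own:

```shell
$ docker run -t -p 9000:9000 -p 12201:12201 \
    -e GRAYLOG_PASSWORD=SeCuRePwD \
    -e GRAYLOG_TIMEZONE=Europe/Berlin \
    -e GRAYLOG_SMTP_SERVER=mailserver.com \
    -e ES_MEMORY=1024m \
    graylog2/allinone
```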
You can mount the data and log directories to store your data outside of the container:
$ docker run -t -p 9000:9000 -p 12201:12201 -e GRAYLOG_NODE_ID=some-rand-omeu-uidasnodeid -e GRAYLOG_SERVER_SECRET=somesecretsaltstring -v /graylog/data:/var/opt/graylog/data -v /graylog/logs:/var/log/graylog graylog2/allinone
Please make sure that you always use the same node-ID and server secret. Otherwise your users can't login or inputs will not be started after creating a new container on old data.
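One way to get a stable node ID and server secret is to generate them once, store them, and reuse the values for every docker run against the same data directories; a sketch using standard Linux tools:

```shell
# Generate once, store somewhere safe, and reuse for every container
# that is started on the same data directories.
NODE_ID=$(cat /proc/sys/kernel/random/uuid)               # random UUID as node ID
SECRET=$(tr -dc 'A-Za-z0-9' < /dev/urandom | head -c 32)  # 32-char random secret
echo "-e GRAYLOG_NODE_ID=$NODE_ID -e GRAYLOG_SERVER_SECRET=$SECRET"
```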
Multi container setup
The Omnibus package used for creating the container is able to split Graylog into several components.
This works in a Docker environment as long as your containers run on the same host, or otherwise
have direct network access to each other.
The first container started is the so-called
master; other containers can fetch configuration options from it.
To set up two containers, one for the web interface and one for the server component, do the following:
Start the master with the Graylog server parts:
$ docker run -t -p 12900:12900 -p 12201:12201 -p 4001:4001 -e GRAYLOG_SERVER=true graylog2/allinone
The configuration port 4001 is now accessible through the host IP address.
Start the web interface in a second container and pass the host address as
master to fetch configuration options:
$ docker run -t -p 9000:9000 -e GRAYLOG_MASTER=<host IP address> -e GRAYLOG_WEB=true graylog2/allinone
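Before starting the web container, you can verify that the master's ports are reachable from the second host; a sketch in which the IP address is a placeholder for your own host IP:

```shell
# Check that the master's REST port (12900) and configuration port (4001)
# are reachable; 172.17.4.2 is a placeholder for your host IP address
nc -z -w 3 172.17.4.2 12900 && echo "REST API reachable"
nc -z -w 3 172.17.4.2 4001 && echo "configuration port reachable"
```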
- In case you see warnings regarding the open file limit, try to set the ulimit from outside the container:
$ docker run --ulimit nofile=64000:64000 -t -p 9000:9000 -p 12201:12201 graylog2/allinone
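To check whether the limit took effect, you can inspect the soft limit for open files; inside a running container the same command can be run via docker exec (the container name below is a placeholder):

```shell
# Print the soft limit for open file descriptors in the current shell.
# Inside the container, run the same via:
#   docker exec <container-name> sh -c 'ulimit -n'
ulimit -n
```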
The devicemapper storage driver can cause problems with Graylog's disk journal on some systems. In this case please pick another driver like
overlay. Have a look here.
but it's still Graylog (1.3.4).
Can you update it to the newest version of Graylog?
Hello, I created 2 containers, 1 with the Graylog server and the other with the Graylog web interface.
docker run -t -p 12900:12900 -p 12201:12201 -p 4001:4001 -e GRAYLOG_SERVER=true graylog2/allinone
$ docker run -t -p 9000:9000 -e GRAYLOG_MASTER=172.17.4.2 -e GRAYLOG_WEB=true -e GRAYLOG_PASSWORD=mypass graylog2/allinone
Everything comes up, but when I try to enter my user and password at http://servergraylog:9000 (user admin, password mypass) it does not authenticate; I get an authentication error.
Thank you very much in advance
+1 for tags
+1 for tags
I have trouble upgrading my containers.
Please also make Graylog startup parameters configurable via Docker environment variables, e.g. -e JAVA_OPTS="..."
I don't want to use the Elasticsearch server inside the container. Is there a way to:
- not start the Elasticsearch process in the container
- pass the IP of an external Elasticsearch node
by using env variables or some other way?
"Please provide tags for different versions." +1
+1 for introducing tags
Please provide tags for different versions. It's not good practice to use "latest" in production environments.
I am trying to send a test email, but in the backend I am getting:
"2015-10-21_07:21:56.91585 WARN [EmailAlarmCallback] Stream [562732b4e4b08144f1df7828: "Common API Stream"] has alert receivers and is triggered, but email transport is not configured."
Can anyone help me with this?