Short Description
Basic installation to run Celery using a RabbitMQ broker on a separate container
Full Description

Celery on Docker

This image installs the basic dependencies of Celery so it can run against RabbitMQ on a separate host. To be able to run it, first create a new network to work in:

docker network create my-network
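Optionally, you can confirm that the network was created before starting any containers (standard Docker CLI; my-network is the name used above):

docker network ls --filter name=my-network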

Then, pull and run the RabbitMQ image on the created network:

docker run -d --net my-network --name rabbitmq_X rabbitmq:3
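Before wiring Celery to it, you may want to check that the broker container is actually up (standard Docker commands; rabbitmq_X is the container name used above):

docker ps --filter name=rabbitmq_X
docker logs rabbitmq_X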

Currently there is no Dockerfile, so it is best to run the container with an interactive terminal so that you can execute the Celery process afterwards:

docker run -it --net my-network --name celery_X javg44/celery-basic

The source code is in the /opt/app directory, where you will find the tasks.py file; you can edit it to add a result backend, more tasks, etc. This project is meant to run a master in one container and workers in multiple other containers to scale an application better, so this base image needs to be present in each of the containers that perform the tasks.
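For reference, here is a minimal sketch of what a tasks.py along these lines might look like; the broker IP address and the guest/guest credentials are placeholders (the RabbitMQ image's defaults), so adjust them to whatever your RabbitMQ container actually uses:

# tasks.py - minimal sketch; the broker address below is a placeholder
from celery import Celery

app = Celery(
    'tasks',
    broker='amqp://guest:guest@172.18.0.2//',  # replace with the RabbitMQ container's IP on my-network
    # backend='rpc://',                        # uncomment to store task results
)

@app.task
def add(x, y):
    return x + y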

Architecture

The recommended architecture is the following:

docker run -it --net my-network --name celery_master javg44/celery-basic
docker run -it --net my-network --name celery_worker_N javg44/celery-basic
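The Celery instance in tasks.py must point at the broker's address on my-network (see the note below). One way to look up the IP that Docker assigned to the RabbitMQ container is with docker inspect:

docker inspect -f '{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' rabbitmq_X

Alternatively, since my-network is a user-defined network, containers can normally resolve each other by name, so using rabbitmq_X as the hostname in the broker URL avoids hard-coding an IP.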

*If you have trouble running Celery as root, add the following environment variable with $ export C_FORCE_ROOT="true" and verify it was set correctly with $ echo $C_FORCE_ROOT.

Once the N+1 containers are running, you only need to change the IP address declared in the Celery instance (inside tasks.py) to the one assigned to the RabbitMQ container. Then, from each of the worker containers, run:

cd /opt/app
celery -A tasks worker --loglevel=INFO

This way the workers can listen for the requests the master dispatches. Inside the master container, you just need to be located in /opt/app; start python and you can call any of the tasks declared to run on the workers, as in the following example:

>>> from tasks import add
>>> add.delay(2, 2)
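If you also configure a result backend in tasks.py (as mentioned above), you can inspect the task's outcome from the same Python shell; a small sketch:

>>> result = add.delay(2, 2)
>>> result.ready()          # True once a worker has finished the task (needs a result backend)
>>> result.get(timeout=10)  # returns 4; also needs a result backend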