The project consists of 4 Docker containers:

  • postgres: DB container
  • data: Shared volume so postgres data persists
  • web: API wrapped in gunicorn
    • This is called web in the test versions of docker-compose and Dockerfile
  • nginx: NGINX layer. Connects Gunicorn to localhost:80
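For orientation, a minimal docker-compose.yml matching that four-container layout might look like the following sketch. The service names come from the list above; image tags, the build context, and the data path are assumptions:

```yaml
# Hypothetical sketch of the layout described above
# (image tags, build context, and volume path are assumptions)
postgres:
  image: postgres:9.6
  volumes_from:
    - data
data:
  image: postgres:9.6           # data-only container so postgres data persists
  command: /bin/true
  volumes:
    - /var/lib/postgresql/data
web:
  build: ./web                  # API wrapped in gunicorn
  links:
    - postgres
nginx:
  image: nginx
  ports:
    - "80:80"                   # connects Gunicorn to localhost:80
  links:
    - web
```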


Prerequisites

  • Docker

I also highly recommend installing Anaconda for Sublime Text.

Basic Info

Use docker exec -it $(docker ps | grep terraintracker_web | cut -d' ' -f1) /bin/bash to access a running instance
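That grep/cut pipeline picks out the container ID as the first space-separated field of the matching `docker ps` row. A quick sketch of the same pipeline run against canned `docker ps`-style output (the IDs and names here are made up for illustration):

```shell
# Simulated `docker ps` output (container IDs/names are made up)
sample_ps='3f1a2b3c4d5e   terraintracker_web_1        "gunicorn app:app"
9a8b7c6d5e4f   terraintracker_postgres_1   "docker-entrypoint.s"'

# Same grep/cut pipeline as above: keep the matching row, take field 1
web_id=$(printf '%s\n' "$sample_ps" | grep terraintracker_web | cut -d' ' -f1)
echo "$web_id"   # 3f1a2b3c4d5e
```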
All commands in this README depend on the Docker containers being up.
Nearly all commands are run from /bin/bash in the web container.
To start your Docker containers and access /bin/bash from the web container, do the following:

# Kill and rebuild any currently running containers
docker-compose kill; docker-compose build;

# Start Docker (leave this running and keep an eye on the logs)
docker-compose up

# Run /bin/bash in the web container (in a new terminal)
# (docker-compose exec takes the service name, not a container id)
docker-compose exec web /bin/bash

Initial DB Setup

# Execute this command from the host machine after running docker-compose up
docker-compose exec postgres psql -U postgres -d terraintracker -c 'CREATE EXTENSION postgis'

# Create db tables (back in web container)
docker-compose exec web /bin/bash
python db upgrade
python initialize


Testing

Make sure your db is up-to-date before testing.

# Run unit tests
docker-compose exec web /bin/bash
python test

# Run tests with coverage [WIP, may not work]
docker-compose exec web /bin/bash
python install
cd terraintracker
py.test --cov terraintracker --cov-report=html
# open htmlcov/index.html

# Run individual test
docker-compose exec web /bin/bash
python install
py.test -s <path_to_test_file>

# Lint
python lint

Curl the geo endpoint

docker-compose up
curl -u adam:'Test!234' 'http://localhost/geo/isWater?lat=19.1&lon=10.1'
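curl's -u flag just sends an Authorization: Basic header; the same request can be built by base64-encoding user:password yourself. A sketch using the credentials and endpoint above (the explicit-header curl line is a hypothetical equivalent, not a command from this repo):

```shell
# -u adam:'Test!234' is shorthand for sending this header by hand
token=$(printf '%s' 'adam:Test!234' | base64)
echo "$token"   # YWRhbTpUZXN0ITIzNA==

# Hypothetical equivalent of the curl command above:
# curl -H "Authorization: Basic $token" 'http://localhost/geo/isWater?lat=19.1&lon=10.1'
```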

Changing the db (Alembic migrations)

First, make the changes in the code (models folder). Then auto-generate a new migration file with

docker-compose exec web python db migrate

Now, edit the new Python file in web/migrations/versions so that it makes sense.
If you see a table/index in the migration file Alembic provided and you think
Alembic should ignore it forever, add it to [alembic:exclude] in alembic.ini.
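As a sketch, that exclude section might look like the following. The table name here is hypothetical (spatial_ref_sys is a PostGIS-managed table that often shows up in autogenerated migrations), and the exact option name depends on how this project's env.py reads the section:

```ini
# alembic.ini -- hypothetical exclude entry
[alembic:exclude]
tables = spatial_ref_sys
```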

Then, run your migration file with:

docker-compose exec web python db upgrade

Adding columns to live db config

See: for instructions.

Run psql on postgres container

docker-compose exec postgres psql -U postgres -d terraintracker


Full Reset

This initializes a clean application. It destroys all Docker containers on your system and removes all data.

# Kill and restart all containers, including volumes holding DB tables - CAUTION: YOU WILL LOSE DATA
docker stop $(docker ps -a -q) && docker rm $(docker ps -a -q) && docker volume rm $(docker volume ls -q) && docker-compose build && docker-compose up
# Give Postgres a chance to initialize
sleep 40  # wait a lil bit for the containers to come up
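A fixed sleep 40 can over- or under-shoot. A hedged alternative is a small retry loop; the helper below is generic, and the commented usage line assumes the compose service is named postgres and that pg_isready is available in that image:

```shell
# Generic retry helper: run a command until it succeeds, up to N tries.
wait_for() {
  tries=$1; shift
  i=0
  until "$@"; do
    i=$((i + 1))
    if [ "$i" -ge "$tries" ]; then
      return 1
    fi
    sleep 1
  done
}

# Hypothetical replacement for `sleep 40` (assumes the "postgres" service):
# wait_for 40 docker-compose exec -T postgres pg_isready -U postgres
```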

# In a new terminal:
#   Migrate. Add initialization data (default users, etc). Test.
docker-compose exec web /bin/bash
python ./ db upgrade
python ./ initialize
python ./ test


Docker Cloud

SSHing into Docker Cloud prod

A shorter version of these instructions:

1) Add your ssh key to the authorizedkeys stack and deploy it
2) ssh into the AWS node with:
   ssh -i <your_keyfile> root@
3) Use `docker ps` to find the container you want to ssh into
4) docker exec -it <CONTAINER_ID> /bin/bash
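To avoid retyping the key path on every login, a hypothetical ~/.ssh/config entry can wrap steps 2's flags (the host alias and placeholders below are made up; fill in the real node address and key file):

```
# ~/.ssh/config -- hypothetical alias for the prod node
Host terraintracker-prod
    HostName <aws_node_address>
    User root
    IdentityFile ~/.ssh/<your_keyfile>
```

Then step 2 becomes just `ssh terraintracker-prod`.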

