PCIC Climate Explorer Backend


Install system-level dependencies:

$ sudo apt-get install libpq-dev python-dev libhdf5-dev libnetcdf-dev libgdal-dev

GDAL doesn't properly source its own library paths when installing the Python package, so these
environment variables must be defined first:

$ export CPLUS_INCLUDE_PATH=/usr/include/gdal
$ export C_INCLUDE_PATH=/usr/include/gdal



Setup using a virtual environment.
Use the Python 3 module venv, not virtualenv, which installs Python 2.
The unit tests use datetime.timezone, which is available only in Python 3.
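For reference, the kind of timezone usage that forces Python 3 looks like this (illustrative snippet, not taken from the test suite):

```python
from datetime import datetime, timezone

# datetime.timezone exists only in Python 3; under Python 2 this import fails.
dt = datetime(2000, 1, 1, tzinfo=timezone.utc)
print(dt.isoformat())  # 2000-01-01T00:00:00+00:00
```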

$ python3 -m venv venv
$ source venv/bin/activate
(venv)$ pip install -U pip
(venv)$ pip install -r requirements.txt
(venv)$ pip install -e .

Running the dev server

A development server can be run locally using the Flask command line interface. In general, you need to set the environment variable FLASK_APP=ce.wsgi:app, and can optionally set FLASK_DEBUG=1 for live code reloading.

The database DSN can be configured with the MDDB_DSN environment variable; it defaults to 'postgresql://'.
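How such a default is typically read is sketched below (this is illustrative, not the backend's actual code; the variable name and default come from this README):

```python
import os

# Fall back to 'postgresql://' when MDDB_DSN is not set in the environment.
dsn = os.environ.get("MDDB_DSN", "postgresql://")
print(dsn)
```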

(venv)$ MDDB_DSN=postgresql://dbuser:dbpass@dbhost/dbname FLASK_APP=ce.wsgi:app flask run -p <port>


Testing

Within the virtual environment:

pip install pytest
py.test -v

Using a Docker container to test the current directory:

sudo docker run --rm -it -v ${PWD}:/app --name backend-test pcic/climate-explorer-backend bash -c "pip install pytest; py.test -v ce/tests"

Using a Docker container to test remote code changes:

sudo docker run --rm -it --name backend-test pcic/climate-explorer-backend bash -c "apt-get update; apt-get install -yq git; git fetch; git checkout <commit-ish>; pip install pytest; py.test -v ce/tests"

Setup using Docker:

Build the image:

git clone
cd climate-explorer-backend
docker build -t climate-explorer-backend-image .

It's convenient to create a separate read-only Docker container to mount the data. This container can be shared by multiple instances of the server backend. More -v arguments can be supplied as needed to bring together data from multiple locations, as long as individual files end up mapped onto the locations given for them in the metadata database.

docker run --name ce_data -v /absolute/path/to/wherever/the/needed/data/is/:/storage/data/:ro ubuntu:16.04

Finally, run the climate explorer backend image as a new container:

docker run -it -p whateverexternalport:8000 \
               -e "MDDB_DSN=postgresql://dbuser:dbpassword@host/databasename" \
               --volumes-from ce_data \
               --name climate-explorer-backend \
               climate-explorer-backend-image

If you aren't using a read-only data container, replace --volumes-from ce_data with one or more -v /absolute/path/to/wherever/the/needed/data/is/:/storage/data/ arguments.

If using the test data is sufficient, use -e "MDDB_DSN=sqlite:////app/ce/tests/data/test.sqlite" when running the container.
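Note the four slashes in that DSN: `sqlite:///` introduces a relative path, and a fourth slash starts an absolute one. A small stdlib sketch (the helper function is hypothetical, not part of the backend) that extracts the file path:

```python
def sqlite_path(dsn):
    """Return the filesystem path from a SQLAlchemy-style sqlite DSN."""
    prefix = "sqlite:///"
    if not dsn.startswith(prefix):
        raise ValueError("not a sqlite DSN: %r" % dsn)
    return dsn[len(prefix):]

# Four slashes total -> absolute path inside the container
print(sqlite_path("sqlite:////app/ce/tests/data/test.sqlite"))
```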


Releasing

Creating a versioned release involves:

  1. Increment __version__ in
  2. Summarize the changes from the last release in
  3. Commit these changes, then tag the release:

    git add
    git commit -m "Bump to version x.x.x"
    git tag -a -m "x.x.x" x.x.x
    git push --follow-tags