Pacifica Ingest Services

This is the Pacifica Ingest Services API.

This service receives, validates, and processes data provided by a
Pacifica Uploader service.

Installing the Service

Prerequisites

To run the code, the following commands are required:

The Manual Way

Install the dependencies using the pip command:

pip install -r requirements.txt

Build and install the code using the setup.py script:

python setup.py build
python setup.py install

Running the Service

Start and run a new instance using the docker-compose command:

docker-compose up

Bundle Format

The bundle format is parsed using the tarfile
package from the Python standard library.

Both data and metadata are stored in a bundle. Metadata is stored in the
metadata.txt file (JSON format). Data is stored in the data/ directory.

Display the contents of a bundle using the tar command:

tar -tf mybundle.tar

For example, the contents of mybundle.tar are:

data/mywork/project/proposal.doc
data/mywork/experiment/results.csv
data/mywork/experiment/results.doc
metadata.txt
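The layout above can be sketched with the tarfile package mentioned earlier. This is a minimal illustration, not the service's own code: the file names mirror the example listing, and the metadata content is a placeholder since the metadata.txt schema is not shown here.

```python
# Sketch: build and inspect a bundle with the standard tarfile package.
# File names and metadata keys below are illustrative placeholders.
import io
import json
import tarfile


def make_bundle(bundle_path, files, metadata):
    """Write a bundle: files under data/ plus a metadata.txt JSON document."""
    with tarfile.open(bundle_path, 'w') as bundle:
        for name, payload in files.items():
            info = tarfile.TarInfo('data/' + name)
            info.size = len(payload)
            bundle.addfile(info, io.BytesIO(payload))
        meta = json.dumps(metadata).encode('utf-8')
        info = tarfile.TarInfo('metadata.txt')
        info.size = len(meta)
        bundle.addfile(info, io.BytesIO(meta))


def list_bundle(bundle_path):
    """Return the member names, like `tar -tf mybundle.tar`."""
    with tarfile.open(bundle_path) as bundle:
        return bundle.getnames()
```

Calling list_bundle() on a bundle built this way yields the same member names that `tar -tf` prints.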

API Examples

The following endpoints define the ingest process. These examples assume you
know the IP address and port on which the WSGI service is listening.

Ingest (Single HTTP Request)

Post a bundle (defined above) to the endpoint.

POST /ingest
... tar bundle as body ...

The response will be the job status information, as if you had requested it directly.

{
  "job_id": 1234,
  "state": "OK",
  "task": "UPLOADING",
  "task_percent": "0.0",
  "updated": "2018-01-25 16:54:50",
  "created": "2018-01-25 16:54:50",
  "exception": ""
}
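Posting a bundle can be sketched with the Python standard library. The base URL is an assumption; substitute the address of your WSGI service, and note the Content-Type header is illustrative.

```python
# Sketch: POST a tar bundle to the /ingest endpoint and parse the
# JSON job status from the response. The base URL is an assumption.
import json
import urllib.request


def ingest_bundle(base_url, bundle_path):
    """POST a tar bundle to /ingest and return the parsed job status."""
    with open(bundle_path, 'rb') as bundle:
        data = bundle.read()
    req = urllib.request.Request(
        base_url + '/ingest',
        data=data,
        headers={'Content-Type': 'application/octet-stream'},
        method='POST',
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode('utf-8'))
```

The returned dictionary carries the job_id used by the status endpoint described below.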

Failures on this endpoint occur during the upload of the bundle.
Clients sending data to this endpoint should be prepared for long-running HTTP
POSTs that may take longer than clients typically handle.

Move (Single HTTP Request)

Post a metadata document to the endpoint.

POST /move
... content of move-md.json ...

The response will be the job status information, as if you had requested it directly.

{
  "job_id": 1234,
  "state": "OK",
  "task": "UPLOADING",
  "task_percent": "0.0",
  "updated": "2018-01-25 16:54:50",
  "created": "2018-01-25 16:54:50",
  "exception": ""
}

Get State for Job

Use the job_id field from the HTTP response of an ingest request to query the job's state.

GET /get_state?job_id=1234
{
  "job_id": 1234,
  "state": "OK",
  "task": "ingest files",
  "task_percent": "0.0",
  "updated": "2018-01-25 17:00:32",
  "created": "2018-01-25 16:54:50",
  "exception": ""
}

As the bundle of data is being processed, errors may occur; if that happens, the
following will be returned. When consuming this endpoint, plan for failures:
consider logging the error or showing the user a message indicating that the
ingest failed.

GET /get_state?job_id=1234
{
  "job_id": 1234,
  "state": "FAILED",
  "task": "ingest files",
  "task_percent": "0.0",
  "updated": "2018-01-25 17:01:02",
  "created": "2018-01-25 16:54:50",
  "exception": "... Python traceback ..."
}
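The failure handling advised above can be sketched as a polling loop. The base URL, polling interval, and the completion test (task_percent reaching 100) are assumptions; the source only shows the state and task_percent fields in flight.

```python
# Sketch: poll /get_state until a job completes or fails.
# Base URL, interval, and the completion condition are assumptions.
import json
import time
import urllib.request


def wait_for_job(base_url, job_id, interval=5.0, timeout=3600.0):
    """Poll /get_state and return the final status document.

    Raises RuntimeError with the service's exception text on failure,
    so callers can log it or surface it to the user.
    """
    deadline = time.monotonic() + timeout
    while True:
        url = '{}/get_state?job_id={}'.format(base_url, job_id)
        with urllib.request.urlopen(url) as resp:
            status = json.loads(resp.read().decode('utf-8'))
        if status['state'] == 'FAILED':
            raise RuntimeError('ingest failed: ' + status['exception'])
        if status['state'] == 'OK' and float(status['task_percent']) >= 100.0:
            return status
        if time.monotonic() > deadline:
            raise TimeoutError('job {} did not finish'.format(job_id))
        time.sleep(interval)
```

Raising on FAILED forces callers to handle the error path rather than silently treating a failed ingest as done.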

CLI Tools

There is an admin tool that consists of subcommands for manipulating ingest processes.

Job Subcommand

The job subcommand allows administrators to directly manipulate the state of a
job. In complex computing environments some jobs may get "stuck": neither
failed nor progressing. This can happen for any number of reasons, but the
solution is to manually fail the job.

IngestCMD job \
    --job-id 1234 \
    --state FAILED \
    --task 'ingest files' \
    --task-percent 0.0 \
    --exception 'Failed by administrator'

Contributions

Contributions are accepted on GitHub via the fork and pull request workflow.
GitHub has a good help article
if you are unfamiliar with this method of contributing.

Owner: pacifica