Sitespeedio to BigQuery

Inserts sitespeed.io results into BigQuery and uploads all artifacts to GCloud Storage.

Usage

docker run -t -i \
  -v "$(pwd)/gcloud-service-account-key.json:/home/root/gcloud-service-account-key.json" \
  -e "BIGQUERY_DATASET=myDataSetId" \
  -e "BIGQUERY_TABLE=myTableId" \
  -e "BUCKET_NAME=myBucketName" \
  -e "GCLOUD_PROJECT_NAME=myProjectId" \
  -e "GCLOUD_SERVICE_ACCOUNT_KEY=/home/root/gcloud-service-account-key.json" \
  -e "SITESPEEDIO_ARTIFACT_PATH=/sitespeed.io/sitespeedio-result" \
  travix/docker-sitespeedio-to-bigquery
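
A possible end-to-end flow is sketched below, assuming the results are produced first with the official sitespeedio/sitespeed.io image and then mounted into this container at the path given by SITESPEEDIO_ARTIFACT_PATH. The URL, folder names and IDs are placeholders, and the extra results volume mount is an assumption, not something documented by this image:

# 1. Analyse a site with sitespeed.io, writing results to ./sitespeedio-result
docker run --rm -v "$(pwd):/sitespeed.io" sitespeedio/sitespeed.io \
  --outputFolder sitespeedio-result https://www.example.com

# 2. Upload those results with this image, mounting the results directory
#    at the location referenced by SITESPEEDIO_ARTIFACT_PATH
docker run --rm \
  -v "$(pwd)/sitespeedio-result:/sitespeed.io/sitespeedio-result" \
  -v "$(pwd)/gcloud-service-account-key.json:/home/root/gcloud-service-account-key.json" \
  -e "BIGQUERY_DATASET=myDataSetId" \
  -e "BIGQUERY_TABLE=myTableId" \
  -e "BUCKET_NAME=myBucketName" \
  -e "GCLOUD_PROJECT_NAME=myProjectId" \
  -e "GCLOUD_SERVICE_ACCOUNT_KEY=/home/root/gcloud-service-account-key.json" \
  -e "SITESPEEDIO_ARTIFACT_PATH=/sitespeed.io/sitespeedio-result" \
  travix/docker-sitespeedio-to-bigquery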

Environment variables

Name                       | Description                                 | Example value
BIGQUERY_DATASET           | BigQuery dataset ID                         | dataset-id
BIGQUERY_TABLE             | BigQuery table ID                           | table-id
BUCKET_NAME                | GCloud Storage bucket name                  | bucket-name
DEBUG                      | Add extra logs for debugging                | true
GCLOUD_PROJECT_NAME        | BigQuery project ID                         | gcloud-project-name
GCLOUD_SERVICE_ACCOUNT_KEY | Path to the GCloud service account key file | /home/root/service-account.json
SITESPEEDIO_ARTIFACT_PATH  | Path of the sitespeed.io results            | /sitespeedio/sitespeedio-result
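
The key file passed via GCLOUD_SERVICE_ACCOUNT_KEY can be created with the gcloud CLI. A minimal sketch follows, assuming a dedicated service account with BigQuery data-editing and Cloud Storage object-admin roles; the account name, project ID and role choices are assumptions, not requirements documented by this image:

# Create a service account and grant it BigQuery and Cloud Storage access
gcloud iam service-accounts create sitespeedio-uploader --project myProjectId
gcloud projects add-iam-policy-binding myProjectId \
  --member "serviceAccount:sitespeedio-uploader@myProjectId.iam.gserviceaccount.com" \
  --role "roles/bigquery.dataEditor"
gcloud projects add-iam-policy-binding myProjectId \
  --member "serviceAccount:sitespeedio-uploader@myProjectId.iam.gserviceaccount.com" \
  --role "roles/storage.objectAdmin"

# Download a JSON key to mount into the container
gcloud iam service-accounts keys create gcloud-service-account-key.json \
  --iam-account sitespeedio-uploader@myProjectId.iam.gserviceaccount.com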