Sitespeedio to BigQuery

Run sitespeed.io against a given website, insert the results into BigQuery, and upload all artifacts to Google Cloud Storage.
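At a high level, the container takes a finished sitespeed.io run, flattens the summary metrics into rows, and sends those rows to BigQuery. The sketch below illustrates only the flattening step; the summary layout and field names are assumptions for illustration, not the image's actual schema:

```python
# Illustrative sketch: flatten a (hypothetical) sitespeed.io summary into a
# flat row of the kind a BigQuery streaming insert expects.
# All field names here are assumptions, not the image's real schema.
from datetime import datetime, timezone


def summary_to_row(url: str, summary: dict) -> dict:
    """Build a flat dict suitable for a row-based BigQuery insert."""
    timings = summary.get("timings", {})
    return {
        "url": url,
        "measured_at": datetime.now(timezone.utc).isoformat(),
        "first_paint_ms": timings.get("firstPaint"),
        "page_load_ms": timings.get("pageLoadTime"),
        "requests": summary.get("requests"),
    }


if __name__ == "__main__":
    sample = {"timings": {"firstPaint": 312, "pageLoadTime": 1450}, "requests": 42}
    print(summary_to_row("https://example.com", sample))
```

Flattening keeps the BigQuery schema simple (one column per metric) at the cost of having to evolve the table when new metrics are added.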


docker run -t -i \
  -v "$(pwd)/gcloud-service-account-key.json:/home/root/gcloud-service-account-key.json" \
  -e "BIGQUERY_DATASET=myDataSetId" \
  -e "BIGQUERY_TABLE=myTableId" \
  -e "BUCKET_NAME=myBucketName" \
  -e "GCLOUD_PROJECT_NAME=myProjectId" \
  -e "GCLOUD_SERVICE_ACCOUNT_KEY=/home/root/gcloud-service-account-key.json" \
  <image-name>

Environment variables

| Name | Description | Example value |
| --- | --- | --- |
| BIGQUERY_DATASET | BigQuery dataset ID | dataset-id |
| BIGQUERY_TABLE | BigQuery table ID | table-id |
| BUCKET_NAME | GCloud storage bucket name | bucket-name |
| DEBUG | Add extra logs for debugging | true |
| GCLOUD_PROJECT_NAME | Google Cloud project ID | gcloud-project-name |
| GCLOUD_SERVICE_ACCOUNT_KEY | Path to the GCloud service account key | /home/root/service-account.json |
| SITESPEEDIO_ARTIFACT_PATH | Path of the sitespeed.io results | /sitespeedio/sitespeedio-result |
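Presumably the container reads these variables at startup and fails fast if a required one is missing. A minimal Python sketch of that pattern, assuming the variable names above (the validation logic itself is hypothetical, not the image's actual entrypoint):

```python
# Hypothetical startup check for the environment variables documented above.
import os

REQUIRED = [
    "BIGQUERY_DATASET",
    "BIGQUERY_TABLE",
    "BUCKET_NAME",
    "GCLOUD_PROJECT_NAME",
    "GCLOUD_SERVICE_ACCOUNT_KEY",
]


def load_config(env: dict) -> dict:
    """Collect required settings, failing fast on anything missing."""
    missing = [name for name in REQUIRED if not env.get(name)]
    if missing:
        raise SystemExit(f"Missing required environment variables: {missing}")
    return {
        **{name.lower(): env[name] for name in REQUIRED},
        # DEBUG and SITESPEEDIO_ARTIFACT_PATH are optional, with defaults
        # taken from the example values in the table above.
        "debug": env.get("DEBUG", "").lower() == "true",
        "artifact_path": env.get(
            "SITESPEEDIO_ARTIFACT_PATH", "/sitespeedio/sitespeedio-result"
        ),
    }


if __name__ == "__main__":
    print(load_config(dict(os.environ)))
```

Failing fast on missing configuration gives a clear error at container start instead of a confusing Google Cloud authentication failure mid-run.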