Public | Automated Build

Last pushed: 2 years ago
Dockerfile of Export Tool for Salesforce

This repository contains the Dockerfile of an export tool for Salesforce (sfdc-exportjs).

Base Docker Image



  1. Install Docker.
  2. Pull the Docker image:
    docker pull tzmfree/sfdc-exportjs


Create the log directory:

mkdir -p /var/log/sfdc-exportjs

Run the Docker container:

docker run --rm -t \
-e SFDC_USERNAME="hoge@example.com" \
-e SFDC_PASSWORD="fuga123" \
-e SFDC_TARGETOBJECT="Account;Contact" \
-e SFDC_LIMIT="500000" \
-e S3_BUCKET="backup" \
-e AWS_ACCESS_KEY_ID="AKI***************" \
-e AWS_SECRET_ACCESS_KEY="********************" \
-e MODE="parallel" \
-v /var/log/sfdc-exportjs:/var/log/sfdc-exportjs \
tzmfree/sfdc-exportjs /sbin/my_init -- ./

The export logs are stored in the /var/log/sfdc-exportjs directory inside the container.
If you want to keep the logs after the container exits, run the docker command with the volume option (-v), as in the example above.

Environment Variables

SFDC_USERNAME (required)

Salesforce login username

SFDC_PASSWORD (required)

Salesforce login password

SFDC_LOGINURL (optional)

Salesforce login URL. If you want your container to access a sandbox, set this to "".
If not specified, this value defaults to "".


SFDC_TARGETOBJECT

Semicolon-separated Salesforce object API names.
For example, "Account;Contact;User;CustomObject__c".
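Because the value uses ";" as a separator, it must be quoted on the shell command line, or the shell will treat each semicolon as a command terminator. A minimal sketch of how such a value splits into the individual object names (the object list here is just an example):

```shell
# Quote the value: unquoted, the shell would try to run "Contact" etc. as commands.
TARGETS="Account;Contact;User;CustomObject__c"

# Split on ';' into a bash array and list the objects the exporter would see.
IFS=';' read -ra OBJECTS <<< "$TARGETS"
for obj in "${OBJECTS[@]}"; do
  echo "object: $obj"
done
```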

SFDC_LIMIT (optional)

SOQL limit size for each object.
If not specified, this value defaults to 500,000.
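Presumably the limit is applied as a SOQL LIMIT clause on each object's query. A sketch of the resulting query string, assuming a hypothetical Account export and the documented default (the field list is illustrative, not the tool's actual query):

```shell
# Default mirrors the documented 500,000; SFDC_LIMIT overrides it when set.
LIMIT="${SFDC_LIMIT:-500000}"

# Hypothetical query the exporter might issue for one object.
QUERY="SELECT Id FROM Account LIMIT ${LIMIT}"
echo "$QUERY"
```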

S3_BUCKET (optional)

If you want to store your data in S3, set this to the name of the destination S3 bucket.

AWS_ACCESS_KEY_ID (optional)

If you want to store your data in S3, set the AWS_ACCESS_KEY_ID value.
If not specified, the exported data is stored only in the local file system.


AWS_SECRET_ACCESS_KEY (optional)

If you want to store your data in S3, set the AWS_SECRET_ACCESS_KEY value.
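Passing secrets inline with -e leaves them in shell history and visible in `ps` output; Docker's --env-file option avoids this. A sketch, where the file name and placeholder values are arbitrary:

```shell
# Write credentials to a file readable only by the current user.
cat > sfdc-export.env <<'EOF'
SFDC_PASSWORD=fuga123
AWS_ACCESS_KEY_ID=AKIEXAMPLE
AWS_SECRET_ACCESS_KEY=EXAMPLESECRET
EOF
chmod 600 sfdc-export.env

# Then replace the inline -e flags with:
#   docker run --rm -t --env-file sfdc-export.env ... tzmfree/sfdc-exportjs ...
```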

MODE (optional)

If you want each export to run in parallel, set MODE to "parallel".
If not specified, this value defaults to "serial", meaning objects are exported one at a time.
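The difference between the two modes can be sketched in shell, assuming one export job per target object (export_object is a hypothetical stand-in for the real exporter, not part of the image):

```shell
# Hypothetical per-object export job.
export_object() { echo "exporting $1"; }

MODE="${MODE:-serial}"   # documented default

for obj in Account Contact; do
  if [ "$MODE" = "parallel" ]; then
    export_object "$obj" &    # background each job so they run concurrently
  else
    export_object "$obj"      # run each job to completion before the next
  fi
done
wait   # in parallel mode, block until every background job finishes
```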
