Create a Cloud Storage Sync Job

You can create Cloud Storage sync jobs by navigating to Integration > Object.

Before You Begin

Before you can use Cloud Storage Sync, you must set up the source directory. See Set Up Cloud Storage to learn more.

Create Cloud Storage Sync Job

Note: You can schedule the sync job to run at periodic intervals. Any file in the local directory that is newly added or modified between the intervals will be transferred to the cloud at the next interval. Any file that was not transferred successfully at the last interval will be transferred again at the next interval.
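The note above describes an incremental, retrying transfer loop: pick up files added or modified since the last interval, and retry anything that failed last time. A minimal Python sketch of that logic (the `upload` callable and the state tracking are hypothetical illustrations, not the Litmus Edge implementation):

```python
import os

def sync_once(source_dir, last_run, failed, upload):
    """Transfer files added/modified since last_run, plus earlier failures.

    upload(path) is a hypothetical callable that sends one file to the
    cloud provider and raises on error. Returns the set of paths that
    must be retried at the next interval.
    """
    candidates = set(failed)  # files that failed at the last interval
    for entry in os.scandir(source_dir):
        if entry.is_file() and entry.stat().st_mtime >= last_run:
            candidates.add(entry.path)  # new or modified since last run
    still_failed = set()
    for path in candidates:
        try:
            upload(path)
        except OSError:
            still_failed.add(path)  # retried at the next interval
    return still_failed
```

A scheduler would call `sync_once` at each interval, feeding the returned set back in so failed transfers are retried.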

To create a cloud storage sync job:

  1. Navigate to Integration > Object.
  2. Click Add sync job. The Add Cloud Sync Job dialog box displays.

  3. Enter a name for the job.
  4. Select a provider for the job.

Configure the parameters for the selected provider.

AWS S3

  • Region: Enter the region name. See Amazon Simple Storage Service endpoints and quotas to learn more.
  • Access key ID: Enter the appropriate access key ID. See AWS security credentials to learn more.
  • Secret access key: Enter the appropriate secret access key. See AWS security credentials to learn more.
  • Source: Enter the file path to the local directory. For example: /var/lib/customer/ftp-data/s3-sync. Make sure you have started the Litmus Edge FTP server and created the necessary folders for the file path. See Set Up Cloud Storage to learn more.
  • Destination: Enter the remote S3 bucket name.
  • Transfer mode: Select a transfer mode: Copy or Move.
  • Run every: Configure the time interval for the sync job in minutes or hours.
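The Source bullet requires the local directory (for example, /var/lib/customer/ftp-data/s3-sync) to exist before the first run. In practice the folder is created through the Litmus Edge FTP server (see Set Up Cloud Storage); purely as an illustration, ensuring such a path exists can be sketched as:

```python
import os

def ensure_source_dir(path):
    """Create the sync source directory (and any parents) if missing.

    Illustrative only: on Litmus Edge the folder is created via the
    FTP server, not by running a script on the device.
    """
    os.makedirs(path, exist_ok=True)  # idempotent: no error if it exists
    return os.path.isdir(path)
```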

Azure Blob Storage

  • Access Key: Enter the storage account access key.
  • Access Name: Enter the storage account name.
  • Source: Enter the file path to the local directory. For example: /var/lib/customer/ftp-data/azure-sync. Make sure you have started the Litmus Edge FTP server and created the necessary folders for the file path. See Set Up Cloud Storage to learn more.
  • Destination: Enter the Azure Blob Storage container name.
  • Transfer mode: Select a transfer mode: Copy or Move.
  • Run every: Configure the time interval for the sync job in minutes or hours.
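The Access Name and Access Key pair are the values Azure tooling conventionally combines into a storage connection string. A stdlib sketch of that standard format (the account values below are placeholders, not real credentials):

```python
def connection_string(account_name, account_key):
    """Build a standard Azure Storage connection string from the
    Access Name and Access Key entered above."""
    return (
        "DefaultEndpointsProtocol=https;"
        f"AccountName={account_name};"
        f"AccountKey={account_key};"
        "EndpointSuffix=core.windows.net"
    )
```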

Google Cloud Storage

  • JSON Key File: Paste or upload the JSON key file.
  • Source: Enter the file path to the local directory. For example: /var/lib/customer/ftp-data/gcp-sync. Make sure you have started the Litmus Edge FTP server and created the necessary folders for the file path. See Set Up Cloud Storage to learn more.
  • Destination: Enter the remote GCP bucket name.
  • Transfer mode: Select a transfer mode: Copy or Move.
  • Run every: Configure the time interval for the sync job in minutes or hours.
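The JSON key file is a Google Cloud service account key. A quick stdlib sanity check that a file at least parses and carries the fields such a key normally contains (field names follow the standard service-account key format; this is a sketch, not part of the product):

```python
import json

# Fields present in a standard service-account key file
REQUIRED = {"type", "project_id", "private_key", "client_email"}

def looks_like_service_account_key(path):
    """Return True if the file parses as JSON and has the fields a
    service-account key is expected to contain."""
    with open(path) as f:
        key = json.load(f)
    return key.get("type") == "service_account" and REQUIRED <= key.keys()
```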

Databricks Unity Catalog Volume

  • Workspace URL: Enter the Databricks workspace URL. See Get identifiers for workspace objects to learn more.
  • Authentication Option: Select one of two options:
    • Personal Access Token: If you select this, enter a Databricks personal access token for the workspace.
    • OAuth M2M: If you select this, enter the following:
      • Client Id: Enter the unique identifier for your application, used to authenticate with Databricks.
      • Client Secret: Enter the secret key associated with the Client Id.
  • Source: Enter the file path to the local directory. For example: /var/lib/customer/ftp-data/databricks-sync. Make sure you have started the Litmus Edge FTP server and created the necessary folders for the file path. See Set Up Cloud Storage to learn more.
  • Destination: Enter the file path to the Databricks directory.
  • Transfer mode: Select a transfer mode: Copy or Move.
  • Run every: Configure the time interval for the sync job in minutes or hours.
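Every provider offers the same Transfer mode choice. Assuming the conventional semantics (Copy leaves the local file in place; Move deletes it after a successful transfer — an assumption about the usual meaning, not confirmed product behavior), the difference can be sketched as:

```python
import os
import shutil

def transfer(path, dest_dir, mode, upload=shutil.copy):
    """Send one file, then apply the transfer mode.

    mode: "copy" keeps the local file; "move" deletes it after the
    upload succeeds. upload defaults to a local shutil.copy as a
    stand-in for the real cloud transfer.
    """
    upload(path, dest_dir)
    if mode == "move":
        os.remove(path)  # assumed: Move removes the local copy
```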

When done configuring the provider, click Save.

Once the job is created, you can enable or disable it. To modify the job, you must first disable it.

Enable/Disable status for sync jobs