Create a Cloud Storage Sync Job
You can create cloud storage sync jobs by navigating to Integration > Object.

Before You Begin

Before you can start using cloud storage sync, you need to set up the source directory. See Cloud Storage to learn more.

Create a Cloud Storage Sync Job

Note: You can schedule the sync job to run at periodic intervals. Any file in the local directory that is newly added or modified between intervals is transferred to the cloud at the next interval. Any file that was not transferred successfully at the last interval is transferred again at the next interval.

To create a cloud storage sync job:

1. Navigate to Integration > Object.
2. Click Add Sync Job. The Add Cloud Sync Job dialog box displays.
3. Enter a name for the job.
4. Select a provider for the job.
5. Configure the parameters for the provider.

AWS S3

- Region: Enter the region name. See Amazon Simple Storage Service endpoints and quotas to learn more.
- Access Key ID: Enter the appropriate access key ID. See AWS security credentials to learn more.
- Secret Access Key: Enter the appropriate secret access key. See AWS security credentials to learn more.
- Source: Enter the file path to the local directory, for example /var/lib/customer/ftp data/s3 sync. Make sure you have started the Litmus Edge FTP server and created the necessary folders for the file path. See Cloud Storage to learn more.
- Destination: Enter the remote S3 bucket name.
- Transfer Mode: Select a transfer mode: Copy or Move.
- Run Every: Configure the time interval for the sync job in minutes or hours.

Azure Blob Storage

- Access Key: Enter the access key. See the following resources to learn more: Security recommendations, About Azure Key Vault.
- Access Name: Enter the storage account name.
- Source: Enter the file path to the local directory, for example /var/lib/customer/ftp data/azure sync. Make sure you have started the Litmus Edge FTP server and created the necessary folders for the file path. See Cloud Storage to learn more.
- Destination: Enter the Azure Blob Storage container name.
- Transfer Mode: Select a transfer mode: Copy or Move.
- Run Every: Configure the time interval for the sync job in minutes or hours.

Google Cloud Storage

- JSON Key File: Paste or upload the JSON key file.
- Source: Enter the file path to the local directory, for example /var/lib/customer/ftp data/gcp sync. Make sure you have started the Litmus Edge FTP server and created the necessary folders for the file path. See Cloud Storage to learn more.
- Destination: Enter the remote GCP bucket name.
- Transfer Mode: Select a transfer mode: Copy or Move.
- Run Every: Configure the time interval for the sync job in minutes or hours.

Databricks Unity Catalog Volume

- Workspace URL: Enter the Databricks workspace URL. See Get identifiers for workspace objects to learn more.
- Authentication Option: You have the following option: Personal Access Token.
  - Access Token: If you select Personal Access Token, paste the access token here. See Databricks personal access token authentication to learn more.
- Source: Enter the file path to the local directory, for example /var/lib/customer/ftp data/databricks sync. Make sure you have started the Litmus Edge FTP server and created the necessary folders for the file path. See Cloud Storage to learn more.
- Destination: Enter the file path to the Databricks directory.
- Transfer Mode: Select a transfer mode: Copy or Move.
- Run Every: Configure the time interval for the sync job in minutes or hours.

See the Databricks Unity Catalog Volume guide to configure the Databricks Unity Catalog volume on the Litmus Edge WebUI and send files directly to the Databricks Unity Catalog.
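Before you save the job, you may want to confirm that the AWS S3 values above actually work. A minimal sanity check from any machine with Python and boto3 might look like the following sketch; the region, keys, and bucket name are hypothetical placeholders for the values you plan to enter in the Add Cloud Sync Job dialog.

```python
# Hypothetical values; replace with the Region, Access Key ID, Secret Access Key,
# and Destination bucket you plan to enter in the Add Cloud Sync Job dialog.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client(
    "s3",
    region_name="us-east-1",          # Region
    aws_access_key_id="AKIA...",      # Access Key ID
    aws_secret_access_key="...",      # Secret Access Key
)

try:
    # head_bucket succeeds only if the credentials are valid and can reach the bucket
    s3.head_bucket(Bucket="my-sync-bucket")   # Destination bucket name
    print("Credentials and bucket look good")
except ClientError as err:
    print(f"Check your S3 settings: {err}")
```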
When you are done configuring the provider, click Save.

Once the job is created, you can enable or disable it. To manage the job, you must first disable it.
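If it helps to reason about the transfer modes in code, the following is an illustration only, not the product's implementation, of roughly what a single copy- or move-mode interval amounts to for an S3 job. Copy mode leaves the local file in place after upload, while Move mode removes it. The paths, bucket, and credentials are hypothetical, and the real sync job additionally retries files that failed to transfer in the previous interval.

```python
# Illustration only: roughly what one sync interval does for an S3 job.
# SOURCE, BUCKET, MODE, and INTERVAL are hypothetical placeholders.
import os
import time
import boto3

SOURCE = "/var/lib/customer/ftp data/s3 sync"  # Source directory
BUCKET = "my-sync-bucket"                      # Destination bucket name
MODE = "copy"                                  # Transfer Mode: "copy" or "move"
INTERVAL = 15 * 60                             # Run Every: 15 minutes, in seconds

s3 = boto3.client("s3")  # assumes AWS credentials are already configured
last_run = 0.0

while True:
    for name in os.listdir(SOURCE):
        path = os.path.join(SOURCE, name)
        # Only files added or modified since the last interval are transferred.
        if os.path.isfile(path) and os.path.getmtime(path) >= last_run:
            s3.upload_file(path, BUCKET, name)
            if MODE == "move":
                os.remove(path)  # Move removes the local copy after upload
    last_run = time.time()
    time.sleep(INTERVAL)
```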