Command line transfers from Object Storage

Command line transfers from Object Storage to EC2 instance

This article explains how to use the ascp command line on your Aspera on Demand server to transfer files from an S3 bucket to another Aspera system. The examples below show transfers from S3 to a remote Aspera server and from S3 to the local virtual machine.

Prerequisites:

  • You have a running Aspera on Demand instance
  • You have created a local transfer user which is doc-rooted to the local disk
  • You have a bucket in your S3 storage service
  • You have your S3 credentials

Instructions - Preparation

  1. Log into the EC2 instance via SSH. If you are not sure how, see this article.
  2. Optional: Elevate to the root user

    $ sudo su -
  3. Set the S3 secret key as an environment variable. In this example, substitute S3_SECRET_KEY with your own S3 secret key.
    Note: The secret key must be URL encoded (e.g. substitute "%2F" for "/")

    # export ASPERA_DEST_PASS=S3_SECRET_KEY
  4. Set the local user password as an environment variable. In this example, substitute PASSWORD with the local transfer user's password.

    # export ASPERA_SCP_PASS=PASSWORD
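
The environment-variable setup above can be sketched end-to-end as follows. This is a sketch only: the sample key is hypothetical, and using python3's standard library for the URL encoding is an assumption (any URL-encoding tool works):

```shell
# Hypothetical key for illustration -- substitute your real S3 secret key.
S3_SECRET_KEY='abc/def+gh'

# URL-encode the key with Python's stdlib ("/" becomes "%2F", "+" becomes "%2B").
ENCODED=$(python3 -c 'import sys, urllib.parse; print(urllib.parse.quote(sys.argv[1], safe=""))' "$S3_SECRET_KEY")

# Export both variables ascp reads: the S3 secret key and the local user's password.
export ASPERA_DEST_PASS="$ENCODED"
export ASPERA_SCP_PASS='PASSWORD'

echo "$ASPERA_DEST_PASS"
```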

File Upload Examples

1) CLI transfer a file from S3 to a remote Aspera server:

[Image: Transfer_to_remote_host.png]

Run the following ascp command, with these substitutions: USERNAME, REMOTE_ASPERA_SERVER, S3_ACCESS_ID, BUCKETNAME, FILENAME, DESTINATION_PATH_ON_REMOTE_SERVER

  1. # ascp -l100m  --mode send --user USERNAME --host REMOTE_ASPERA_SERVER s3://S3_ACCESS_ID@s3.amazonaws.com/BUCKETNAME/FILENAME /DESTINATION_PATH_ON_REMOTE_SERVER
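
For concreteness, here is a sketch that assembles the same command from illustrative values. Every name below (user, host, access ID, bucket, file, destination) is a hypothetical placeholder, and the command is printed rather than executed:

```shell
# All values below are placeholders for illustration only.
USERNAME=xfer
REMOTE_ASPERA_SERVER=demo.example.com
SRC='s3://AKIAEXAMPLE@s3.amazonaws.com/my-bucket/bigfile.mov'
DEST=/incoming

# Assemble the ascp invocation; echo it instead of running it.
CMD="ascp -l100m --mode send --user $USERNAME --host $REMOTE_ASPERA_SERVER $SRC $DEST"
echo "$CMD"
```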

NOTE: If you have set up your server to use an IAM role that provides access to S3, you do not need to specify an S3 secret key or access ID.

2) CLI transfer a file from S3 to an Aspera server at Akamai:

[Image: CLI_with_Akamai.png]

This option is almost identical to the one above, with a few minor differences: here we use an IAM role for access to the local S3 bucket, and an SSH key to initiate the transfers. Because an IAM role and an SSH key are used, there is no need to set the environment variables shown above. In this test, the Akamai private key was copied to ~/.ssh/aspera01 and its permissions were set as follows:

  1. # chmod 400 ~/.ssh/aspera01

Once the key is ready, you can run the following ascp command, with these substitutions: USERNAME, REMOTE_ASPERA_SERVER, BUCKETNAME, FILENAME, DESTINATION_PATH_AT_AKAMAI

  1. # ascp -l100m  --mode send --user USERNAME --host REMOTE_ASPERA_SERVER  -i ~/.ssh/aspera01 s3://s3.amazonaws.com/BUCKETNAME/FILENAME /DESTINATION_PATH_AT_AKAMAI

NOTE: If you have many files to transfer, you can use the --file-list feature. Here is an example of the syntax:

  1.  # ascp -l100m  --mode send --user USERNAME --host REMOTE_ASPERA_SERVER  -i ~/.ssh/aspera01 --file-list=/PATH/TO/FILELIST   /DESTINATION_PATH_AT_AKAMAI

 

Here is an example of the contents of the file list:

  1. s3://s3.amazonaws.com/demo-bucket/Sample/concept.png
    s3://s3.amazonaws.com/demo-bucket/Sample/presentation.pptx
    s3://s3.amazonaws.com/demo-bucket/Sample/Analysis.xlsx
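
As a sketch, the file list can be generated and then handed to ascp as follows. The bucket and key names are illustrative, and the ascp line is shown commented out because it requires a live server and the placeholder substitutions described above:

```shell
# Write three illustrative S3 source URLs to a file list.
cat > /tmp/filelist.txt <<'EOF'
s3://s3.amazonaws.com/demo-bucket/Sample/concept.png
s3://s3.amazonaws.com/demo-bucket/Sample/presentation.pptx
s3://s3.amazonaws.com/demo-bucket/Sample/Analysis.xlsx
EOF

# Then pass it to ascp (placeholders as in the examples above):
# ascp -l100m --mode send --user USERNAME --host REMOTE_ASPERA_SERVER \
#      -i ~/.ssh/aspera01 --file-list=/tmp/filelist.txt /DESTINATION_PATH_AT_AKAMAI

# Show how many sources the list contains.
wc -l < /tmp/filelist.txt
```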

 

3) Transfer a file from S3 to the local file system:

[Image: CLI_transfer_to_local_machine.png]

Run the following ascp command, with the following substitutions: USERNAME, S3_ACCESS_ID, BUCKETNAME, FILENAME

  1. # ascp -l100m  --mode send --user USERNAME --host localhost s3://S3_ACCESS_ID@s3.amazonaws.com/BUCKETNAME/FILENAME / 

The command should result in a new file in the USERNAME docroot on the local VM.

NOTE: If you have set up your server to use an IAM role that provides access to S3, you do not need to specify an S3 secret key or access ID.
