This integration solution includes two parts: migrate, then sync. First, migrate objects from an AWS S3 source bucket to a Lyve Cloud destination bucket, moving all objects from the AWS source to Lyve Cloud. Second, sync using the AWS Chalice framework: it creates two AWS Lambda functions that monitor object creation/deletion events in the source AWS bucket and replicate each operation on the corresponding objects in the Lyve Cloud destination bucket.
- Python3 and above is required.
- AWS CLI v2 and above is required.
- Docker v20.10.12 and above is required.
- Lyve Cloud access and secret keys. These can be obtained from the Lyve Cloud Console by creating a new Service Account to be used by this solution.
- Lambda storage is limited to 10GB, so objects larger than 10GB will not be replicated; they will be skipped. The user has to configure the maximum Lambda storage size manually (inside the Lambda configuration).
- The generated Lambdas are unique to one set of two buckets, secret keys, and IAM roles (meaning the user has to create different secret keys and different Lambdas for new buckets before running).
- The integration is limited to working with the AWS default profile.
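The 10GB limitation above amounts to a size check before replication. The sketch below is illustrative only (the function name and constant are not part of the shipped code):

```python
# Illustrative helper: skip objects that exceed the Lambda storage ceiling.
# The 10 GB figure mirrors the limitation described above.
LAMBDA_STORAGE_LIMIT = 10 * 1024 ** 3  # 10 GB in bytes

def should_replicate(object_size_bytes):
    """Return True if the object fits within the Lambda storage limit."""
    return object_size_bytes <= LAMBDA_STORAGE_LIMIT
```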
Log in to the Lyve Cloud console, create a Service Account with the appropriate permissions, and extract the following:
- Access Key
- Secret key
- Endpoint URL
- Log in to AWS and go to IAM.
- Click on the "Users" tab.
- Click "Add Users".
- Enter any username you would like. Under the "Select AWS access type" select "Access key - Programmatic access".
- In the permissions page, select "Attach existing policies directly".
- Search for the following policies and select them to add (**Warning: all of these policies provide full access**):
- SecretsManagerReadWrite
- CloudWatchFullAccess
- AmazonS3FullAccess
- AWSLambdaRole
- AWSLambda_FullAccess
- AmazonS3ObjectLambdaExecutionRolePolicy
- Click "Next: Tags".
- Click "Next: Review".
- Click "Create user".
- Your keys are now generated, copy the "Access Key ID" and "Secret access key".
- Configure AWS credentials for the default profile.
- Run:
cd migrate
- Fill in the values inside the brackets of the commands below (without quotation marks, and remove the brackets):
docker build -t s3syncer .
docker run --name s3syncer --rm -i -t s3syncer bash
export AWS_DEFAULT_REGION=[Enter bucket region]
export AWS_ACCESS_KEY_ID=[Enter AWS access key]
export AWS_SECRET_ACCESS_KEY=[Enter AWS secret key]
export SOURCE_BUCKET=[Enter AWS source bucket]
export SOURCE_FOLDER=/data
export TARGET_FOLDER=/data
export TARGET_BUCKET=[Enter LyveCloud destination bucket]
python3 backup.py
export AWS_ACCESS_KEY_ID=[Enter LyveCloud access key]
export AWS_SECRET_ACCESS_KEY=[Enter LyveCloud secret key]
export ENDPOINT=[Enter LC ENDPOINT]
python3 restore.py
- Copy and paste all the commands (after your changes) into the terminal and press Enter; if the Python command did not execute, press Enter again.
- Exit the Docker container when the migration is complete by running `exit`.
When objects are copied successfully:
Output:
b'Completed 36 Bytes/36 Bytes (39 Bytes/s) with 1 file(s) remaining\rdownload: s3:// source bucket /out.txt to ../data/file name in source\n'
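The progress line above is AWS CLI output, which suggests the migration shells out to `aws s3 sync`. A hedged sketch of how such a command could be assembled from the environment variables exported earlier — this is an assumption for illustration, not backup.py's actual implementation:

```python
import os

def build_sync_cmd(source, target):
    """Assemble an 'aws s3 sync' invocation; accepts s3:// URIs or local paths."""
    return ["aws", "s3", "sync", source, target]

def backup_cmd_from_env():
    """Mirror the exported variables: pull SOURCE_BUCKET into SOURCE_FOLDER."""
    bucket = os.environ["SOURCE_BUCKET"]
    folder = os.environ.get("SOURCE_FOLDER", "/data")
    return build_sync_cmd(f"s3://{bucket}", folder)
```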
- Run:
cd ../syncer
- Run the CLI tool. Explanation of the CLI parameters:
python3 syncer_cli.py run --help
Usage: syncer_cli.py run [OPTIONS] lc_region lc_endpoint secret_name lc_access_key lc_secret_key source_bucket target_bucket
The tool has two options:
- Create a new secret from AWS SecretManager
- Use an existing secret using the -e or --exist flag
-----------------------------------
lc_region: lyve cloud region name
lc_endpoint: lyve cloud endpoint
secret_name: secret key for secret manager
lc_access_key: lyve cloud access key
lc_secret_key: lyve cloud secret key
source_bucket: aws bucket name
target_bucket: lyve cloud bucket name
Options:
-e, --exist Boolean flag to use an existing key, for example: -e=true
--help Show the command options
- Log in to the AWS Management Console and go to Secrets Manager.
- Click "Store a new secret".
- Choose "Other type of secret" for Secret type.
- Under Key/value pairs, create key/value pairs for the following:
  - Key: `lc_access_key`, Value: [Access key from step 1].
  - Key: `lc_secret_key`, Value: [Secret key from step 1].
  - Key: `lc_endpoint_url`, Value: [Endpoint URL including https:// from step 1].
- Click "Next".
- Enter a "Secret name" and make a note of it, as you will need it during Lambda function creation. The rest of the fields can be left as default.
- Click "Next".
- There is no need to set Secret rotation for this sample, so you can leave it as default.
- Click "Next".
- Review the summary and click "Store".
- Once created, you will see the secret under Secrets.
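At runtime, a Lambda would typically retrieve this secret via Secrets Manager's GetSecretValue and parse the JSON SecretString. A minimal parsing sketch using the three keys stored above (the fetch call itself is omitted, and the function name is illustrative):

```python
import json

def parse_lyve_secret(secret_string):
    """Split a Secrets Manager SecretString into Lyve Cloud credentials.

    Expects the JSON keys created above: lc_access_key, lc_secret_key,
    and lc_endpoint_url.
    """
    data = json.loads(secret_string)
    return data["lc_access_key"], data["lc_secret_key"], data["lc_endpoint_url"]
```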
After a successful deployment, two Lambdas and the requested secret key should be created. Example output:
Configuring S3 events in bucket aws_bucket to function syncer-dev-added_object_sync
Creating IAM role: syncer-dev-Removed_object_sync
Creating lambda function: syncer-dev-Removed_object_sync
Resources deployed:
- Lambda Layer ARN: arn:aws:lambda:us-east-1:secretnumber:layer:syncer-dev-managed-layer:30
- Lambda ARN: arn:aws:lambda:us-east-1:secretnumber:function:syncer-dev-added_object_sync
- Lambda ARN: arn:aws:lambda:us-east-1:secretnumber:function:syncer-dev-Removed_object_sync
To run again please change the names of the lambda functions located in 'app.py'
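Each generated Lambda receives S3 event notifications for the source bucket. A hedged sketch of extracting the bucket/key pairs a handler such as syncer-dev-Removed_object_sync would act on — the field layout follows the standard S3 event notification format, while the handler logic itself is an assumption:

```python
from urllib.parse import unquote_plus

def extract_objects(event):
    """Pull (bucket, key) pairs out of a standard S3 event notification.

    Object keys arrive URL-encoded in S3 events, so they are decoded here.
    """
    pairs = []
    for record in event.get("Records", []):
        s3 = record["s3"]
        pairs.append((s3["bucket"]["name"], unquote_plus(s3["object"]["key"])))
    return pairs
```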
- Open the "syncer" directory.
- Open the "app.py" file.
- Rename both of the Lambda functions, for example, change the name to "new name" in app.py:
- May 15, 2022: Bari Arviv (bari.arviv@seagate.com) on Ubuntu 20.04
- June 29, 2022: Leon Markovich (leon.markovich@seagate.com) on Ubuntu 20.04
This folder contains all the files needed for the migration Docker container between an AWS bucket and Lyve Cloud.
This folder contains all the files needed for syncing between an AWS bucket and Lyve Cloud after each create/delete object operation.
Contains images for the documentation.