Copy S3 Data Across AWS Accounts Using a Bash Script


Amazon Web Services (AWS) provides an efficient way to store and manage your data through its Simple Storage Service (S3). Sometimes you may need to transfer data between AWS accounts securely and efficiently. In this guide, we will walk you through copying S3 data from one AWS account (Account A) to another (Account B) using a Bash script.

This step-by-step tutorial will help you set up the necessary permissions, configure AWS CLI, and execute the script to seamlessly transfer your data.

Steps:

1. Create an S3 bucket in your source account (A) and upload some files to it.
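
For example, a minimal sketch using the AWS CLI (the bucket name and file are placeholders):

$ aws s3 mb s3://s3cp-source-bucket
$ aws s3 cp ./sample.txt s3://s3cp-source-bucket/sample.txt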


2. Create an AWS IAM user in your source account (A) and generate an access key and secret key for it.
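
The same can be done from the CLI; a quick sketch (the user name s3-copy-user is a placeholder). The create-access-key output contains the access key ID and secret key you will use in step 8:

$ aws iam create-user --user-name s3-copy-user
$ aws iam create-access-key --user-name s3-copy-user
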
3. Attach the following policies to the IAM user created in the previous step:

  • AmazonS3FullAccess
  • A custom inline policy with the following content (a CLI alternative is shown after the JSON):
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:ListBucket",
                "s3:PutObjectAcl"
            ],
            "Resource": [
                "arn:aws:s3:::<destination bucket>",
                "arn:aws:s3:::<destination bucket>/*"
            ]
        }
    ]
}
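
If you prefer the CLI, a sketch of the same step (assuming the JSON above is saved as inline-policy.json and the placeholder user name from step 2):

$ aws iam attach-user-policy --user-name s3-copy-user \
    --policy-arn arn:aws:iam::aws:policy/AmazonS3FullAccess
$ aws iam put-user-policy --user-name s3-copy-user \
    --policy-name s3-destination-write --policy-document file://inline-policy.json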

4. Now go to the destination AWS account (B) and create a destination S3 bucket where the files will be uploaded.
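
For example, assuming you have configured a separate CLI profile with the destination account's credentials (the profile and bucket names are placeholders):

$ aws s3 mb s3://s3cp-destination-bucket --profile accountB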

5. In the destination bucket, go to Permissions and click Edit to add an S3 bucket policy.

6. Add the following JSON policy and click Save changes (a CLI alternative is shown after the JSON):

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "DelegateS3Access",
            "Effect": "Allow",
            "Principal": {
                "AWS": "<AWS source account (A) user arn>"
            },
            "Action": [
                "s3:PutObject",
                "s3:ListBucket",
                "s3:PutObjectAcl"
            ],
            "Resource": [
                "arn:aws:s3:::<destination bucket name>",
                "arn:aws:s3:::<destination bucket name>/*"
            ]
        }
    ]
}
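
Alternatively, the same policy can be applied from the CLI; a sketch, assuming the JSON above is saved as bucket-policy.json and the accountB profile from step 4:

$ aws s3api put-bucket-policy --bucket s3cp-destination-bucket \
    --policy file://bucket-policy.json --profile accountB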

7. Now create a new Bash script with the following content and make it executable (a chmod example follows the script).

#!/bin/bash
# Copy objects from one S3 bucket to another in batches, skipping objects
# that are already present in the destination bucket.
# Note: the list-based approach below assumes object keys contain no whitespace.
copy_if_not_present() {
    src_bucket=$1
    dest_bucket=$2
    batch_size=$3
    # Get the list of object keys from the source bucket
    objects=($(aws s3api list-objects --bucket "$src_bucket" --query 'Contents[].Key' --output text))
    # Get the list of object keys already present in the destination bucket
    existing_objects=($(aws s3api list-objects --bucket "$dest_bucket" --query 'Contents[].Key' --output text))
    # Collect the keys that are not yet present in the destination bucket
    unique_objects=()
    for obj in "${objects[@]}"; do
        if [[ ! " ${existing_objects[*]} " =~ " ${obj} " ]]; then
            unique_objects+=("$obj")
        fi
    done
    # Copy the unique objects in batches of $batch_size
    for ((i = 0; i < ${#unique_objects[@]}; i += batch_size)); do
        batch=("${unique_objects[@]:i:batch_size}")
        for obj in "${batch[@]}"; do
            # bucket-owner-full-control ensures the destination account owns
            # the copied objects; this is why s3:PutObjectAcl is granted above
            aws s3 cp "s3://$src_bucket/$obj" "s3://$dest_bucket/$obj" --acl bucket-owner-full-control
        done
        echo "Batch $((i / batch_size + 1)) of $(((${#unique_objects[@]} + batch_size - 1) / batch_size)) completed."
    done
    echo "Data transfer completed."
}
# Example values:
#   source_bucket="s3migrationbucket"
#   destination_bucket="s3migrationdestination"
#   batch_size=50
# Call the function with the positional parameters
copy_if_not_present "$1" "$2" "$3"
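
For example, if the script is saved as bash_script (the name used in step 9):

$ chmod +x bash_script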

8. Now run the following command to configure the AWS CLI and enter the access key and secret key of your source AWS account (A).

$ aws configure 
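
The CLI prompts for four values. The key and region below are placeholders; enter the credentials generated in step 2:

AWS Access Key ID [None]: AKIAXXXXXXXXXXXXXXXX
AWS Secret Access Key [None]: <secret key from step 2>
Default region name [None]: us-east-1
Default output format [None]: json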


9. Once the AWS CLI is configured, run the Bash script with the following parameters:

$ ./bash_script <source_bucket> <destination_bucket> <batch_size>

For example:

$ ./bash_script s3cp-source-bucket s3cp-destination-bucket 10


10. The files are now copied to the destination bucket.

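To verify, list the destination bucket; the source account's user can do this because the bucket policy grants s3:ListBucket (bucket name from the earlier example):

$ aws s3 ls s3://s3cp-destination-bucket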

Please don’t hesitate to reach out with any discussions, updates, or questions. We welcome your input and are eager to hear from you!
This project was successfully completed by Rohan and Rishi Khurana.
