Revolutionize Your AWS Strategy: Effortless Copying of S3 Standard Objects Between Two AWS Accounts Using EC2
Overview :-
In today’s landscape, managing data efficiently across multiple AWS accounts is a necessity for many businesses. Specifically, copying S3 Standard objects between two AWS accounts can seem daunting but is essential for data management and security purposes. By leveraging Amazon EC2, businesses can streamline this process, ensuring both a secure and swift transfer of data.
Pre-requisites :-
Before we jump into the magic formula, let’s make sure we have everything we need:
Two AWS accounts: Obviously, you can’t transfer between accounts if you only have one.
IAM permissions: Make sure you have the necessary permissions in both accounts. This means having access to the S3 buckets you’re planning to work with.
Bucket names and regions: You need to know where you’re copying from and to, right?
AWS CLI: Basic familiarity with the AWS Command Line Interface, since the copy itself is run from the command line.
Step-1:- First, log in to your AWS source account, open your source S3 bucket, and add the bucket policy below to it.
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::123456789:root"
      },
      "Action": [
        "s3:RestoreObject",
        "s3:ListBucket",
        "s3:GetObject"
      ],
      "Resource": [
        "arn:aws:s3:::source-bucket_name/*",
        "arn:aws:s3:::source-bucket_name"
      ]
    }
  ]
}
- Replace 123456789 with your destination (second) AWS account's ID and source-bucket_name with the name of your source bucket. This policy grants the destination account permission to list the source bucket and read its objects.
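If you prefer the CLI over the console, the same policy can be attached from the source account with `aws s3api put-bucket-policy`. This is a sketch using the placeholders from the policy above (source-bucket_name and the 123456789 destination account ID); bucket-policy.json is just an illustrative file name.

```shell
# Run with SOURCE-account credentials. Replace source-bucket_name and
# 123456789 (the DESTINATION account ID) with your real values.
cat > bucket-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "",
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::123456789:root" },
      "Action": ["s3:RestoreObject", "s3:ListBucket", "s3:GetObject"],
      "Resource": [
        "arn:aws:s3:::source-bucket_name/*",
        "arn:aws:s3:::source-bucket_name"
      ]
    }
  ]
}
EOF

# Attach the policy to the source bucket.
aws s3api put-bucket-policy \
  --bucket source-bucket_name \
  --policy file://bucket-policy.json
```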
Step-2 :- Next, log in to your destination (second) AWS account, navigate to the EC2 service, create an #EC2 key pair, and download it. You will need this private key file to SSH into the instance later.
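The key pair can also be created from the command line. A minimal sketch, assuming destination-account credentials are active; transfer-key is an illustrative key-pair name:

```shell
# Run with DESTINATION-account credentials.
# Creates the key pair and saves the private key locally.
aws ec2 create-key-pair \
  --key-name transfer-key \
  --query 'KeyMaterial' \
  --output text > transfer-key.pem

# SSH refuses private keys that are readable by other users.
chmod 400 transfer-key.pem
```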
Step-3:- Open the IAM service and create a new #IAM role for EC2. Ensure that the role has the trust relationship policy below and S3 full access (for example, the AmazonS3FullAccess managed policy) so the instance can read objects from the source bucket and write them to the destination bucket.
Trust relationship policy:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Service": "ec2.amazonaws.com"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
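The same role can be created from the CLI. A hedged sketch with illustrative names (s3-copy-role, s3-copy-profile, trust-policy.json); note that EC2 attaches roles through an instance profile, which the console creates for you automatically but the CLI does not:

```shell
# Run with DESTINATION-account credentials.
cat > trust-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "ec2.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}
EOF

# Create the role with the trust relationship above.
aws iam create-role \
  --role-name s3-copy-role \
  --assume-role-policy-document file://trust-policy.json

# Grant S3 full access via the AWS managed policy.
aws iam attach-role-policy \
  --role-name s3-copy-role \
  --policy-arn arn:aws:iam::aws:policy/AmazonS3FullAccess

# EC2 consumes roles via an instance profile, so create one
# and put the role inside it.
aws iam create-instance-profile --instance-profile-name s3-copy-profile
aws iam add-role-to-instance-profile \
  --instance-profile-name s3-copy-profile \
  --role-name s3-copy-role
```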
Step-4:- Launch an EC2 instance and attach the EC2 key pair and IAM role created above. Make sure that the instance is running in the same AWS region as your S3 buckets.
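For reference, launching the instance from the CLI might look like the sketch below. The AMI ID is a placeholder you must replace with a real AMI for your region; transfer-key and s3-copy-profile are illustrative names for the key pair (Step-2) and the role's instance profile (Step-3):

```shell
# Run with DESTINATION-account credentials, in the same region as the buckets.
# ami-xxxxxxxx is a placeholder; look up a current AMI ID for your region.
aws ec2 run-instances \
  --image-id ami-xxxxxxxx \
  --instance-type t3.micro \
  --key-name transfer-key \
  --iam-instance-profile Name=s3-copy-profile \
  --count 1
```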
Step-5 :- Establish a connection to the EC2 instance using SSH (with the key pair from Step-2) or another remote access method. Check whether the AWS Command Line Interface (CLI) is installed; if it is not, follow the AWS CLI installation instructions for your operating system. Because the IAM role from Step-3 is attached to the instance, the CLI picks up temporary destination-account credentials automatically, so running aws configure with access keys is optional. If you do configure keys, use access keys for the destination account, obtained from the IAM section of the #AWSManagementConsole, and set your default region and preferred output format.
Step-6:- Run the command below on the instance to copy the objects from the source bucket to the destination bucket:
aws s3 sync s3://source-bucket_name s3://destination-bucket_name
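Before kicking off a large transfer, it can help to confirm which identity the CLI is using and to preview the copy. A sketch of that workflow, using the same bucket placeholders as above; the --dryrun flag makes sync list what it would copy without transferring anything:

```shell
# Run on the EC2 instance. This should report the IAM role
# attached to the instance (a destination-account identity).
aws sts get-caller-identity

# Preview what would be copied, without transferring anything.
aws s3 sync s3://source-bucket_name s3://destination-bucket_name --dryrun

# Perform the actual copy.
aws s3 sync s3://source-bucket_name s3://destination-bucket_name

# Spot-check the destination bucket afterwards.
aws s3 ls s3://destination-bucket_name --recursive --summarize
```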
Conclusion :-
Copying S3 Standard objects between two AWS accounts using EC2 might not sound like the most straightforward task, but it is an efficient method to ensure your data management strategies are robust, secure, and aligned with your business’s operational needs. With proper setup and configuration, this process can be automated, resulting in a secure, traceable method of data management that supports various operational requirements. Remember, thorough planning and adherence to AWS security best practices will turn this seemingly cumbersome task into a smooth and controlled process.