Simulating Amazon S3 Locally with Moto and AWS CLI
Theory
Amazon S3 (Simple Storage Service) is a cloud storage service offered by
Amazon Web Services (AWS) that provides scalable and secure storage for data of any type and volume.
It is used to store and retrieve any amount of data at any time, from anywhere on the web.
Here are some key features of Amazon S3:
- Scalable storage: S3 can automatically scale to handle any data volume, from a few kilobytes to petabytes.
- Global accessibility: Data stored in S3 is accessible from any location via the internet.
- Durability and availability: S3 provides 99.999999999% (11 nines) durability and 99.99% availability of stored objects.
- Security: S3 offers various security options, including encryption at rest and in transit, and fine-grained access controls via security policies and permissions.
- Cost efficiency: S3 offers multiple storage classes to help users optimize costs, including options for frequently accessed storage, long-term storage, backups, and archiving.
- Integration with other AWS services: S3 easily integrates with other AWS services, such as EC2 (Elastic Compute Cloud), RDS (Relational Database Service), and Lambda to build complex and scalable applications.
Practice
Open a terminal window in VSCode and enter the following command:
moto_server
After running this command, the terminal window will display the following output:
* Running on http://127.0.0.1:5000
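If the moto_server command is not found, the standalone server dependencies are probably missing. As an assumption about your environment, installing Moto with the server extra normally provides the command, and the listening port can be set explicitly if the default 5000 is already in use:
# install Moto together with its standalone server dependencies
pip install "moto[server]"
# start the server on an explicit port (5000 is the default)
moto_server -p 5000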
Open a new terminal window, or split the current one (Split Terminal in VSCode).
In the new window, enter the following command:
export AWS_ENDPOINT_URL=http://127.0.0.1:5000
This command sets the endpoint for the AWS CLI. An endpoint is the URL that serves as the entry point for an AWS web service. As a result, when we run a command with the AWS CLI, the request is intercepted and processed by the Moto server instead of being sent to AWS's regional or global servers.
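If your AWS CLI version does not recognize the AWS_ENDPOINT_URL environment variable (support for it was added only in relatively recent releases), the same effect can be achieved by passing the endpoint explicitly on each command via the global --endpoint-url option, for example:
aws --endpoint-url http://127.0.0.1:5000 sts get-caller-identity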
First, let’s check if our configuration works correctly. To do this, we can run any available AWS command. Let’s try running the command that shows the identity we are logged in with:
aws sts get-caller-identity
Running this command should return the following message:
You must specify a region. You can also configure your region by running "aws configure".
This error message indicates that the AWS CLI has not been configured and the SDK does not know which AWS region is selected or set as default. To solve this issue, we can run the command:
aws configure
This command prompts for several values. Normally these would be valid AWS credentials, but since we are running against a local Moto server, they have no functional impact. For this reason, we can enter arbitrary values for AWS Access Key ID and AWS Secret Access Key, as shown below:
AWS Access Key ID [None]: test
AWS Secret Access Key [None]: test
Default region name [None]: us-east-1
Default output format [None]: json
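For reference, aws configure stores these answers in the standard AWS CLI configuration files in your home directory. With the values above, the two files should look roughly like this (a sketch of the expected contents, not output copied from a real machine):
~/.aws/credentials:
[default]
aws_access_key_id = test
aws_secret_access_key = test

~/.aws/config:
[default]
region = us-east-1
output = json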
Now, let’s try running the previous command again:
aws sts get-caller-identity
This time, we should not get an error, and we will see output similar to the one below:
{
    "UserId": "AKIAIOSFODNN7EXAMPLE",
    "Account": "123456789012",
    "Arn": "arn:aws:sts::123456789012:user/moto"
}
This output shows the mock account number that Moto uses in place of a real AWS account number, as well as the ARN (Amazon Resource Name) of the caller identity.
In the terminal where the Moto server is running, we can see that the request from the AWS CLI was received:
127.0.0.1 - - [25/Jun/2024 17:31:47] "POST / HTTP/1.1" 200
AWS S3
In a new file, enter the following code:
import boto3
import io


def main():
    # Configure the boto3 client to use the local moto endpoint
    s3 = boto3.client("s3", region_name="us-east-1", endpoint_url="http://localhost:5000")

    # Create a bucket
    bucket_name = "example-bucket"
    s3.create_bucket(Bucket=bucket_name)

    # List buckets to verify creation
    response = s3.list_buckets()
    print("Existing buckets:")
    for bucket in response["Buckets"]:
        print(f" {bucket['Name']}")

    # Create a test file in memory
    file_content = "This is a test file."
    file_obj = io.BytesIO(file_content.encode("utf-8"))

    # Upload the file
    s3.upload_fileobj(file_obj, bucket_name, "test_file.txt")

    # List objects in the bucket to verify upload
    response = s3.list_objects_v2(Bucket=bucket_name)
    print("Files in bucket:")
    for obj in response.get("Contents", []):
        print(f" {obj['Key']}")

    # Download the file
    download_obj = io.BytesIO()
    s3.download_fileobj(bucket_name, "test_file.txt", download_obj)
    download_obj.seek(0)
    print("Downloaded file content:")
    print(download_obj.read().decode("utf-8"))

    # Delete the file
    s3.delete_object(Bucket=bucket_name, Key="test_file.txt")

    # Verify deletion
    response = s3.list_objects_v2(Bucket=bucket_name)
    print("Files in bucket after deletion:")
    for obj in response.get("Contents", []):
        print(f" {obj['Key']}")


if __name__ == "__main__":
    main()
Listing buckets
s3 = boto3.client("s3", region_name="us-east-1", endpoint_url="http://localhost:5000")
– Creates an S3 client using boto3. This client is configured to use the local moto endpoint, specified by endpoint_url.
s3.create_bucket(Bucket=bucket_name)
– Creates an S3 bucket with the specified name.
response = s3.list_buckets()
– Gets a list of existing buckets.
for bucket in response["Buckets"]
– Iterates through the list of buckets and prints their names.
Creating and uploading a file
file_content = "Acesta este un fișier de test"
– The test file content.
file_obj = io.BytesIO(file_content.encode("utf-8"))
– Creates a BytesIO object for the test file.
s3.upload_fileobj(file_obj, bucket_name, "test_file.txt")
– Uploads the file to the bucket.
Listing objects in the bucket
response = s3.list_objects_v2(Bucket=bucket_name)
– Gets a list of objects in the bucket.
for obj in response.get("Contents", [])
– Iterates through the list of objects and prints their keys.
Downloading a file
download_obj = io.BytesIO()
– Creates a BytesIO object for downloading the file.
s3.download_fileobj(bucket_name, "test_file.txt", download_obj)
– Downloads the file from the bucket.
download_obj.seek(0)
– Resets the cursor to the beginning of the BytesIO object.
print(download_obj.read().decode("utf-8"))
– Prints the content of the downloaded file.
Deleting a file
s3.delete_object(Bucket=bucket_name, Key="test_file.txt")
– Deletes the file from the bucket.
Verifying deletion
response = s3.list_objects_v2(Bucket=bucket_name)
– Checks the list of objects in the bucket after deletion.
for obj in response.get("Contents", [])
– Iterates through the list of objects and prints their keys.
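To run the script, save it under any name, for example s3_demo.py (the filename is an arbitrary choice), install boto3 if it is not already available (pip install boto3), make sure the Moto server is still running, and execute it with python s3_demo.py. On a freshly started server, the output should look roughly like this:
Existing buckets:
 example-bucket
Files in bucket:
 test_file.txt
Downloaded file content:
This is a test file.
Files in bucket after deletion:
The last section prints nothing because the only object in the bucket has just been deleted.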
The same operations can also be performed from the command line. Before running AWS CLI commands, make sure the AWS CLI is configured to use the local Moto endpoint:
export AWS_ENDPOINT_URL=http://127.0.0.1:5000
1. Creating a Bucket
aws s3api create-bucket --bucket example-bucket
aws s3api create-bucket --bucket example-bucket
– Creates an S3 bucket named example-bucket.
2. Listing Buckets
aws s3api list-buckets
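Run against the local Moto server, this lists all existing buckets; trimmed to the relevant part, the response should look roughly like the following (the CreationDate value and the Owner block, omitted here, will differ):
{
    "Buckets": [
        {
            "Name": "example-bucket",
            "CreationDate": "2024-06-25T17:40:00+00:00"
        }
    ]
}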
3. Uploading a File
Create a test file on disk:
echo "Acesta este un fișier de test." > test_file.txt
echo "Acesta este un fișier de test." > test_file.txt
– Creates a test file with the specified content.
Upload the file to the bucket:
aws s3 cp test_file.txt s3://example-bucket/test_file.txt
aws s3 cp test_file.txt s3://example-bucket/test_file.txt
– Uploads the test_file.txt file to the example-bucket.
4. Listing Objects in the Bucket
aws s3 ls s3://example-bucket/
aws s3 ls s3://example-bucket/
– Lists all objects in the example-bucket.
5. Downloading a File
aws s3 cp s3://example-bucket/test_file.txt downloaded_test_file.txt
aws s3 cp s3://example-bucket/test_file.txt downloaded_test_file.txt
– Downloads the test_file.txt from the example-bucket and saves it as downloaded_test_file.txt.
To verify the downloaded file content:
cat downloaded_test_file.txt
cat downloaded_test_file.txt
– Displays the content of the downloaded file.
6. Deleting a File
aws s3 rm s3://example-bucket/test_file.txt
aws s3 rm s3://example-bucket/test_file.txt
– Deletes the test_file.txt file from the example-bucket.
This tutorial covered the steps required to configure and use Moto to simulate AWS S3 on a local server, and demonstrated how to perform the same operations using both boto3 in Python and the AWS CLI. This approach makes it easy to develop and test applications that use AWS S3 without interacting with real AWS services, saving both time and money.