Mastering AWS S3: How to Create Buckets and Upload Objects with CLI and API

Introduction:

If you want to interact with AWS, there are three ways:

✅ WebUI: A user-friendly web-based interface to manage and configure AWS resources through a graphical interface.

✅ API: A set of APIs that enable developers to programmatically interact with AWS services, integrating AWS functionality into their applications.

✅ CLI: A command-line tool that allows interaction with AWS services using commands in the terminal or command prompt.

In this blog post, we'll explore AWS S3, specifically creating buckets and uploading objects using both the Command Line Interface (CLI) and Application Programming Interface (API) methods. 🚀


Table of contents:

  1. Introduction to AWS S3

  2. Setting Up AWS CLI and API

  3. Creating an S3 Bucket with the CLI

  4. Uploading Objects with the CLI

  5. Listing S3 Buckets with the API

  6. Creating an S3 Bucket with the API

  7. Uploading Objects with the API

  8. Retrieving Objects with the API

  9. Best Practices and Tips

  10. Conclusion

  11. References

  12. Author Bio and Contact Information


Introduction to AWS S3

  • Did you know that AWS S3 is used by many well-known companies and popular services? Companies such as Netflix, Airbnb, Pinterest, and Reddit rely on AWS S3 for their storage needs. The wide adoption of AWS S3 by successful businesses showcases its reliability, scalability, and proficiency in handling massive data, making it a trusted choice for organizations of all sizes. 🌟

  • Amazon Simple Storage Service (Amazon S3) is an object storage service that offers cost savings, scalability, availability, security, and high-performance access. It lets you store any amount of data and extract insights from it with analytical tools such as Big Data analytics, Machine Learning, and Artificial Intelligence. 💡

    AWS S3 is incredibly durable, with 99.999999999% (11 nines) durability. Your data is highly protected: if you store 10,000,000 objects, you can on average expect to lose a single object once every 10,000 years. It's a trusted choice for critical storage needs. 🛡️

  • A bucket acts as a directory, while an object represents the actual data or file stored within that bucket. Together, buckets and objects form the basic building blocks of AWS S3, providing a scalable and secure storage solution for your data. 📦

  • Bucket creation and object uploads in AWS S3 are essential for organized data, scalability, accessibility, backup, and integration with other AWS services. They unlock the full potential of AWS S3 for efficient data management and storage in the cloud. 🚀
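The 10,000-year figure above is simple arithmetic on the durability number. A quick back-of-the-envelope check in Python (the 10,000,000-object count is AWS's own example figure):

```python
# Sanity-check the durability arithmetic from the paragraph above.
durability = 0.99999999999             # 11 nines
annual_loss_fraction = 1 - durability  # ~1e-11 of stored objects lost per year
objects_stored = 10_000_000            # AWS's example figure

expected_losses_per_year = objects_stored * annual_loss_fraction
years_per_lost_object = 1 / expected_losses_per_year
print(round(years_per_lost_object))  # 10000
```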


Command Line Interface (CLI) Method:

  1. Installing and Configuring the AWS CLI:

    The AWS CLI (Command Line Interface) allows you to interact with AWS services and manage your resources from the command line. Now, let's examine each step in detail.

    1️⃣.1️⃣ Install the AWS CLI:

    👉🏼 Visit the official AWS CLI documentation (aws.amazon.com/cli) to download and install the AWS CLI for your operating system.

    👉🏼 Follow the installation instructions.

    2. Configure the AWS CLI: ⚙️

      2️⃣.1️⃣ Sign in to the AWS Management Console.

      2️⃣.2️⃣ Navigate to the IAM service:

      👉🏼 Search for "IAM" (Identity and Access Management) using the search bar at the top.

      👉🏼 Click on "IAM" to access the IAM service.

      2️⃣.3️⃣ Access IAM Users:

      👉🏼 On the left-hand side, click on "Users".

      👉🏼 To create a new user, click on the "Add user" button.

      2️⃣.4️⃣ Create a new IAM user:

      👉🏼 Enter a name in the "User name" field.

      👉🏼 Check ✅ "Provide user access to the AWS Management Console".

      👉🏼 Select ⏺️ "I want to create an IAM user".

      👉🏼 Click on "Next".

      2️⃣.5️⃣ Configure user permissions:

      👉🏼 To grant the user full access to S3, select the checkbox next to "AmazonS3FullAccess".

      👉🏼 Click on "Next".

      2️⃣.6️⃣ Review and create the user:

      👉🏼 Review the user's details to ensure everything is correct. If everything looks good, click on "Create user".

      2️⃣.7️⃣ Obtain the IAM user details

      👉🏼 On the screen, you will see the "Console sign-in details".

      👉🏼 Click on "Download .csv file".

      👉🏼 The downloaded .csv file (malay_credentials.csv) contains the new IAM user's details, i.e. the username and password.

      👉🏼 Click on "Return to users list".

      2️⃣.8️⃣ Obtain the access key and secret access key:

      👉🏼 Visit IAM > Users. Click on IAM username under the User name column. In my case, it's "malay".

      👉🏼 On the screen, you will see "Security credentials". Click on it.

      👉🏼 Scroll down and click on "Create access key".

      👉🏼 On the screen, select "Command Line Interface (CLI)" as the use case, so you can use the AWS CLI or API to interact with AWS services, such as creating S3 buckets and uploading objects. Then, click on "Next".

      👉🏼 On the final screen, you will get your "Access key" credentials. Copy and store both the "Access key ID" and "Secret access key" securely or click on "download .csv file" and store it in a safe place.

      2️⃣.9️⃣ Run the AWS CLI configuration command:

      👉🏼 In the command prompt, enter the aws configure command.

      👉🏼 Provide the AWS access key ID and secret access key (from 2️⃣.8️⃣).

      👉🏼 Set your desired AWS region, or enter us-east-1, since most AWS services are available in that region.

      👉🏼 Specify the default output format as json.

      2️⃣.1️⃣0️⃣ Sign in as IAM user:

      👉🏼 Click on the root user name (top-right) and copy the Account ID.

      👉🏼 Sign in as IAM user.

      👉🏼 Provide your old password and then a new password of your choice.

      👉🏼 Congratulations! You have successfully created an IAM user account, and you can now access AWS services through the CLI.

    3. Create an S3 Bucket: 📁

      3️⃣.1️⃣ Open a command prompt.

      3️⃣.2️⃣ To create an S3 bucket use the following command:

      aws s3 mb s3://bucket-name

      Here,

      👉🏼aws s3 invokes the AWS S3 service within the AWS CLI.

      👉🏼mb stands for "make bucket" and it creates a new S3 bucket.

      👉🏼 s3://bucket-name is used to specify the bucket name. Replace bucket-name with the desired name for your bucket.

      NOTE: The bucket name must be globally unique.

    4. Upload Objects to the Bucket: 📤

      1. To upload an object to the S3 bucket, use the command:

        aws s3 cp your-file-path s3://bucket-name/

        Here,

        👉🏼 aws s3 invokes the AWS S3 service within the AWS CLI.

        👉🏼 cp stands for "copy" and it copies files to and from S3 buckets.

        👉🏼 your-file-path is the path to the file you want to upload. Replace it with the actual path to the file on your local system.

        👉🏼 s3://bucket-name/ Replace "bucket-name" with the name of the bucket where you want to upload the file. In my case, the name of my bucket is malay-fav-dogs.

        For example, to upload a file named "golden_retriever.jpg" from my local system to an S3 bucket named "malay-fav-dogs", I run the following command:

        aws s3 cp "C:\Users\hp\Desktop\dogs\golden_retriever.jpg" s3://malay-fav-dogs

      2. Verify Upload:

        To list the contents of the S3 bucket and check whether the file exists, run the following command:

        aws s3 ls s3://bucket-name/

        Here,

        👉🏼 ls stands for "list" and retrieves a listing of the objects (data or files) in an S3 bucket.

        For example, to list the objects in my bucket, I run the following command: aws s3 ls s3://malay-fav-dogs


Application Programming Interface (API) Method:

  1. Set Up AWS SDK and Credentials: 🔑

    1️⃣.1️⃣ Configure your AWS credentials by providing your Access Key ID and Secret Access Key. (see above)

    1️⃣.2️⃣ Download & Install Anaconda:

    👉🏼 Visit https://www.anaconda.com/

    👉🏼 Click on Download.

    👉🏼 Once the download is complete, locate the downloaded installer file and double-click the downloaded file.

☑️ Check "Add Anaconda3 to my PATH environment variable" and complete the installation.

1️⃣.3️⃣ Launching Jupyter Notebook:

Jupyter Notebook provides a convenient environment for using AWS SDKs (Software Development Kits) and interacting with AWS APIs. You can install AWS SDKs for various programming languages like Python, Java, or Node.js within your Jupyter Notebook environment. This allows you to write code to create, manage, and interact with AWS services such as EC2, S3, DynamoDB, and more.

👉🏼 After installing Anaconda, open Anaconda Prompt.

👉🏼 Make a folder to keep your code in one place.

👉🏼 To launch Jupyter Notebook, run the jupyter notebook command.

1️⃣.4️⃣ What is the Boto3 library?

✅ The Boto3 library is the official Amazon Web Services (AWS) Software Development Kit (SDK) for Python. It provides a Python interface to interact with various AWS services, including Amazon S3 (Simple Storage Service), Amazon EC2 (Elastic Compute Cloud), Amazon DynamoDB, AWS Lambda, and many more.

✅ With Boto3, developers can write Python code to create, configure, and manage AWS resources programmatically. It simplifies the process of integrating Python applications with AWS services, allowing you to automate tasks, manage infrastructure, and interact with data stored in AWS.

1️⃣.5️⃣ Install the Boto3 library:

👉🏼 Run pip install boto3 command in Anaconda Prompt to install the boto3 library.

👉🏼 Create a new notebook with Python 3.

👉🏼 Run a cell of the Python notebook with Ctrl + Enter.


  2. Listing S3 buckets using API: 📁

To find the names of all the S3 buckets associated with our AWS account, let's run the following code in our Jupyter notebook and understand it line by line.

import boto3

s3_client = boto3.client('s3')
response = s3_client.list_buckets()
bucket_names = [bucket['Name'] for bucket in response['Buckets']]
for name in bucket_names:
    print(name)

2️⃣.1️⃣ Import the Boto3 library: import boto3

This statement imports the entire Boto3 library and makes its functionality available for use in our Python program.

2️⃣.2️⃣ Create an S3 client: s3_client = boto3.client('s3')

Think of a client as a tool that allows you to communicate and interact with a specific service. In the context of Boto3, a client acts as a bridge between your code and an AWS service. It provides you with methods and functions to perform specific tasks related to that service. For example, an S3 client in Boto3 would allow you to create buckets, upload files, or list objects in Amazon S3.

2️⃣.3️⃣ List all the buckets: response = s3_client.list_buckets()

Here, we use the list_buckets() method to retrieve information about all the buckets in the account.

2️⃣.4️⃣ Extract bucket names from the response:

bucket_names = [bucket['Name'] for bucket in response['Buckets']]

The "response" contains a list of dictionaries, where each dictionary represents a bucket and contains various details such as the bucket name, creation date, etc. We extract the bucket names from the response using list comprehension and store them in the bucket_names list.
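To see what that comprehension does without calling AWS, here is the same extraction run on a hand-written dictionary with the shape that list_buckets() returns (the bucket names here are just examples):

```python
from datetime import datetime

# A hand-written stand-in for the response dict returned by list_buckets().
response = {
    'Buckets': [
        {'Name': 'malay-fav-dogs', 'CreationDate': datetime(2023, 7, 1)},
        {'Name': 'malay-s3-demo', 'CreationDate': datetime(2023, 7, 2)},
    ]
}

# Pull out just the 'Name' field of each bucket dictionary.
bucket_names = [bucket['Name'] for bucket in response['Buckets']]
print(bucket_names)  # ['malay-fav-dogs', 'malay-s3-demo']
```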

2️⃣.5️⃣ Print the bucket names:

for name in bucket_names:
    print(name)

Finally, we iterate over the bucket_names list and print each bucket name.


  3. Create an S3 bucket using API: 📦

Run the following command in your Jupyter Notebook to create a bucket.

import boto3

s3 = boto3.client('s3')
bucket_name = 'malay-s3-demo'
region = 'ap-south-1'
s3.create_bucket(
    Bucket=bucket_name,
    CreateBucketConfiguration={'LocationConstraint': region}
)

A detailed explanation step by step is given in the following notebook.

Listing the bucket names again, we can see that the bucket "malay-s3-demo" has been created.


  4. Uploading Objects with the API: 📤

    To upload an image (s3.jpg) to our S3 bucket "malay-s3-demo", let's copy the path of the file (see the screenshot below).

    Now, run the following lines of code in jupyter notebook to upload a file to an S3 bucket.

    import boto3

    s3 = boto3.client('s3')
    bucket_name = 'malay-s3-demo'
    file_path = r"D:\s3\s3.jpg"
    object_key = 's3.jpg'
    s3.upload_file(file_path, bucket_name, object_key)

    For a better understanding of each line step-by-step, see the jupyter notebook (image attached below).

    NOTE:

    ✅ Replace 'malay-s3-demo' with your bucket name.

    ✅ Replace 'D:\s3\s3.jpg' with the actual file path on your local system. On Windows, prefer a raw string (r'D:\s3\s3.jpg') so backslashes are not interpreted as escape sequences.

    Let's talk about the object_key parameter with an example. Suppose you have an S3 bucket named student and you want to upload a photo malay.jpg of a student, Malay, to a specific directory (folder) called malay_info within the bucket.

    In this case, you would set the object_key as follows:

    object_key = 'malay_info/malay.jpg'

    Here, object_key is a string that represents the key or path where the file will be stored within the S3 bucket.

    When you execute the s3.upload_file() method, the malay.jpg file will be uploaded to the student S3 bucket and stored within the malay_info directory.
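Since the "directory" is nothing more than a prefix in the key string, building the key is plain string handling. A small sketch using the hypothetical student/malay_info names from the example above:

```python
import posixpath

def build_object_key(folder, filename):
    """Join a folder-style prefix and a file name into an S3 object key.
    S3 keys always use forward slashes, regardless of operating system."""
    return posixpath.join(folder, filename)

key = build_object_key('malay_info', 'malay.jpg')
print(key)  # malay_info/malay.jpg

# The upload would then be (requires configured credentials):
# s3.upload_file('malay.jpg', 'student', key)
```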

    We will discuss in further blogs how to create and delete any directory in the S3 bucket using API.⏭️


  5. Retrieving Objects with the API: 🔍

    To retrieve a list of objects in an S3 bucket using the Boto3 library in Python, run the following code in your Jupyter notebook:

    import boto3

    s3 = boto3.client('s3')
    bucket_name = 'malay-s3-demo'
    response = s3.list_objects_v2(Bucket=bucket_name)
    if 'Contents' in response:
        for obj in response['Contents']:
            object_key = obj['Key']
            print(object_key)
    else:
        print("The bucket is empty.")

    For a better understanding of each line step-by-step, see the jupyter notebook (image attached below).


Best Practices and Tips: 💡

  1. Bucket names must be unique across all AWS accounts globally. No two buckets can have the same name.

  2. Bucket names cannot contain underscores (_). Following the DNS naming conventions for S3 bucket names, only lowercase letters, numbers, periods (.), and hyphens (-) are allowed.

  3. The AWS root account ID, Access Key ID, and Secret Access Key are sensitive information; never share them or commit them to version control.
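The naming rules in points 1 and 2 can be checked locally before calling the API. A simplified validator (it covers the character rules above, not the full official rule set — for example, it does not reject IP-address-like names):

```python
import re

# 3-63 characters; lowercase letters, digits, periods and hyphens;
# must start and end with a letter or digit.
BUCKET_NAME_RE = re.compile(r'^[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]$')

def is_valid_bucket_name(name):
    return bool(BUCKET_NAME_RE.match(name))

print(is_valid_bucket_name('malay-fav-dogs'))  # True
print(is_valid_bucket_name('My_Bucket'))       # False: uppercase and underscore
```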


Conclusion: 🔚

Setting up the AWS CLI and using the AWS API for S3 bucket creation and object uploading is a breeze. Just a few commands, and you're ready to manage your AWS resources like a pro! 🦾😎


References: 🌐

  1. https://aws.amazon.com/s3/

  2. Linux World


Author Bio and Contact Information: 👨‍💻 📧

Malay Thakur is an aspiring writer and a final-year student 🎓 pursuing a Bachelor's degree in Computer Science and Engineering. With a strong interest in DevOps and AWS, Malay is enthusiastic about exploring the intersection of development and operations. He is eager to share his fresh hands-on experience and knowledge in these areas through his writing.

You can contact Malay via email or visit his LinkedIn at https://www.linkedin.com/in/malaythakur/ to learn more about his work.