Integrate Amazon S3 with Mule

Published on February 24, 2022
Author: MuleSoft Integration Team

Amazon Web Services’ S3 stands for “Simple Storage Service”. It is a cloud object storage service offered to developers as a scalable solution over the Internet.

Amazon S3 stores data using the concepts of Buckets & Objects. It offers an easy, user-friendly, fast & on-demand way to store & retrieve data online.

MuleSoft provides an Amazon S3 connector that allows us to perform various operations on Buckets & Objects.

In this blog, we will walk through the below S3 connector operations:

  • Create Bucket
  • Delete Bucket
  • Create Object
  • Get Object
  • Delete Object

Prerequisites:

  • There are 2 ways to add the Amazon S3 connector to your project:
  1. Add the Amazon S3 connector module from Exchange to the Studio palette.
  2. Or go to Anypoint Platform > Exchange > All assets > Provided by MuleSoft. Search for & select the “Amazon S3 Connector” for Mule 4. On opening the connector, click on “Dependency Snippets”, copy the snippet shown under Maven & paste it inside your pom.xml under the <dependencies> tag, as shown below.
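
For reference, the copied snippet typically looks like the sketch below; the version number is a placeholder, so use the exact snippet Exchange shows for your connector release.

    <!-- Amazon S3 Connector for Mule 4; the version below is illustrative,
         copy the exact snippet from Exchange -->
    <dependency>
        <groupId>com.mulesoft.connectors</groupId>
        <artifactId>mule-amazon-s3-connector</artifactId>
        <version>6.0.0</version>
        <classifier>mule-plugin</classifier>
    </dependency>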

  • Make sure you have an account created with Amazon Web Services. You can refer to this link https://portal.aws.amazon.com/billing/signup to sign up for a free account.
  • Note down the Security Credentials named “Access Key ID” & “Secret Access Key” from your AWS account. To create a new set of credentials, click the “Create New Access Key” button under the “Security Credentials” option.
  • Create a global connector configuration for “Amazon S3 Configuration”, which will be shared across the implementation of all the S3 operations mentioned below.

For the “Basic” connection, we will need to provide the “Access Key” & “Secret Key” details (refer to the previous point), as sketched below.
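
A minimal global configuration in XML could look like the sketch below. The property names & the region are assumptions for illustration; credentials should come from externalized properties rather than being hard-coded.

    <!-- Sketch: global Amazon S3 configuration;
         ${aws.accessKey} & ${aws.secretKey} are assumed property names -->
    <s3:config name="Amazon_S3_Configuration">
        <s3:connection accessKey="${aws.accessKey}"
                       secretKey="${aws.secretKey}"
                       region="us-east-1"/>
    </s3:config>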

  1. Create Bucket

A Bucket is used to store Objects. Whenever an Object is added to a Bucket with versioning enabled, a distinct property called “Version ID” is allocated to the Object internally.

We have created a simple flow using an HTTP Listener (to receive the POST call), Loggers (to log the required information to the console) &, most importantly, the “Create Bucket” operation from the Amazon S3 module.

Select the “Amazon_S3_Configuration” set up in the global configuration for the “Connector Configuration” field.

For the “Bucket name” field, we can enter the value directly or set it dynamically. In our case, we are reading the value dynamically from the POST call’s payload.

Set the name of the bucket in the payload while hitting the URL. A minimal version of the flow is sketched below.
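
This sketch assumes the request body is JSON with a “bucketName” field; the listener configuration name & path are also assumptions.

    <!-- Sketch: create a bucket named in the POST body
         (the field name "bucketName" is an assumption) -->
    <flow name="create-bucket-flow">
        <http:listener config-ref="HTTP_Listener_config" path="/bucket" allowedMethods="POST"/>
        <s3:create-bucket config-ref="Amazon_S3_Configuration" bucketName="#[payload.bucketName]"/>
        <logger level="INFO" message="Bucket created"/>
    </flow>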

In our case, a bucket named “for-blog” is created successfully.

2. Delete Bucket

Now we will make use of the “Delete Bucket” connector from the S3 module to delete the Bucket created in the previous section of the blog.

Here also, we will set the value of “Bucket name” dynamically using the payload; the operation is sketched below.
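
Inside the same kind of flow as before, the delete operation could look like this (same assumed payload field):

    <!-- Sketch: delete the bucket named in the POST body -->
    <s3:delete-bucket config-ref="Amazon_S3_Configuration" bucketName="#[payload.bucketName]"/>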

On triggering the URL with the name of the Bucket to be deleted, we see that the bucket named “for-blog” is deleted from S3 storage.

3. Create Object

An S3 Object contains the data which we want to store in the Bucket. The Object is identified by the value of the “Key” passed while creating it.

For the “Create Object” connector, we will need to provide the values of the below fields:

  • Bucket name: We are using an already created Bucket named “poc-aws-api” directly.
  • Key: Here we enter the name of the Object to be created dynamically using the payload.
  • Object content: We will pass the below data, in JSON format, as part of the payload to be stored.

    {
      "objectName": "newobject",
      "description": "description of the newobject comes here !"
    }
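
Putting it together, the operation could be sketched as below. The key is read from the payload’s “objectName” field (an assumption), & the exact content element name should be verified against your connector version.

    <!-- Sketch: store the incoming JSON payload as an object under the given key -->
    <s3:create-object config-ref="Amazon_S3_Configuration"
                      bucketName="poc-aws-api"
                      key="#[payload.objectName]">
        <s3:content>#[payload]</s3:content>
    </s3:create-object>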

After successfully triggering the URL, we can see that an Object with the name “newobject” is created under the “poc-aws-api” bucket.

4. Get Object

In this scenario, we will use the “Get Object” S3 connector to fetch the data stored in the Object.

We are also using the File connector’s “Write” operation to download the fetched data into a file at a local location.

In the “Get Object” connector’s configuration, we will mention the below details (sketched after the list):

  • Bucket name: The name of the Bucket where the Object is stored.
  • Key: The name of the Object to be fetched is passed via a variable named “objectName”.
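
A minimal sketch of the two operations, assuming a variable named objectName & a File connector configuration named File_Config (both assumptions):

    <!-- Sketch: fetch the object & write its content to a local JSON file -->
    <s3:get-object config-ref="Amazon_S3_Configuration"
                   bucketName="poc-aws-api"
                   key="#[vars.objectName]"/>
    <file:write config-ref="File_Config" path="#[vars.objectName ++ '.json']"/>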

On execution of the flow, the details of the object are successfully fetched & downloaded locally as a JSON file.

5. Delete Object

In this section, we will use the “Delete Object” S3 connector to delete the Object created in the 3rd section of the blog.

In the “Delete Object” connector’s configuration, we will mention the below details, followed by a short sketch:

  • Bucket name: The name of the Bucket where the Object is stored.
  • Key: The name of the Object to be deleted is passed as part of the payload.
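
A minimal sketch, reusing the assumed payload field from the Create Object section:

    <!-- Sketch: delete the object named in the POST body -->
    <s3:delete-object config-ref="Amazon_S3_Configuration"
                      bucketName="poc-aws-api"
                      key="#[payload.objectName]"/>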

After triggering the URL, we can see the mentioned Object is deleted from the Bucket.

Best Practice:

For the purpose of simplification, we have mentioned the configuration details directly. However, it is always recommended to externalize these details & encrypt the sensitive data, for example with the Secure Configuration Properties module, as sketched below.
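
A minimal sketch of this, assuming the Secure Configuration Properties module; the file name, encryption key property & credential property names are all assumptions:

    <!-- Sketch: externalized, encrypted credentials;
         secure.yaml & the property names are assumed -->
    <secure-properties:config name="Secure_Props" file="secure.yaml" key="${encryption.key}"/>
    <s3:config name="Amazon_S3_Configuration">
        <s3:connection accessKey="${secure::aws.accessKey}"
                       secretKey="${secure::aws.secretKey}"/>
    </s3:config>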

Click here to check out/download the sample code for AWS S3 integration in Anypoint Platform.
