How to Set AWS S3 Bucket Read Permissions with Relay

Cloud environments are susceptible to security issues, and misconfigured resources are a major contributor. A misconfigured S3 bucket, for example, can expose your organization’s sensitive data to bad actors.

Policies and regular enforcement of best practices are key to reducing this security risk. However, manually checking and enforcing security is time-consuming and can fall behind with all the demands a busy DevOps team faces every day.

Automating regular security checks helps you stay on top of cloud security. Relay uses simple workflows to automate these tasks so you can keep your cloud environments secure with minimal manual work.

This article demonstrates how to use a Relay workflow to ensure that Amazon Web Services (AWS) S3 buckets with READ_ACP permissions can only be read by authenticated users.

Setting Up Relay for AWS

A previous article walked you through creating a Relay account and setting up Visual Studio Code for editing workflows. The workflow in this article uses an AWS connection, so you’ll need an IAM user with permission to edit S3 buckets, created as described in that article.

Creating a Workflow

Relay offers several sample workflows, complete with code, to get you started, including one that changes buckets granting All Users access via READ_ACP to private. That’s the workflow we’ll explore in this article.

Sample workflow for editing bucket permissions

To get started, navigate to the workflow page and click Use this workflow. You will see a popup to create the workflow. Enter a name, and optionally a description, then click Create workflow.

Create workflow form

Relay may show you an error about a missing required connection. This happens if your workflow is not yet connected to an AWS account, or if your existing connection has a different name than the workflow expects. If you have an existing connection, adjust the “name” field of the !Connection tag on line 31 of the workflow to match your connection’s name. If not, create a new connection with these steps:

Workflow is missing connection alert

Click Fill in missing connections. The Connections panel in the sidebar lets you add that connection; press the blue circled plus icon next to “my-aws-account.”

Add connection button

In the next popup, fill in the access key ID and secret access key of the IAM user you created earlier.

Connection setup form

After you click Save, you’re ready to use the workflow.
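
If you’d like to confirm that the IAM user’s keys are valid before handing them to Relay, you can run a quick local check. The sketch below uses boto3, the AWS SDK for Python, which is separate from Relay; the key values shown are placeholders.

    # Optional local sanity check of the IAM user's credentials (not part of Relay).
    import boto3

    sts = boto3.client(
        "sts",
        aws_access_key_id="AKIA...",           # placeholder: your access key ID
        aws_secret_access_key="YOUR_SECRET",   # placeholder: your secret access key
    )

    # get_caller_identity succeeds for any valid credentials and returns the
    # account ID and ARN of the calling IAM user, confirming the keys work.
    print(sts.get_caller_identity()["Arn"])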

Running the Workflow

Let’s give our new workflow a try by clicking Run. A popup will ask you to fill in the dryRun parameter. Setting it to true means the workflow performs a dry run: it reports what changes it would make to your S3 buckets without actually making them. Set it to false if you do want to modify buckets.

To see exactly what the workflow does, let’s do a dry run first. After starting the run, the workflow graph will show you which step is running and which steps are completed. Clicking on each step will show you more details about that step and allow you to view its logs.

The first three steps help you decide which S3 buckets need their permissions modified:

  • list-buckets fetches a list of all your S3 buckets.
  • get-bucket-acls retrieves the Access Control Lists (ACLs) from those buckets.
  • filter-buckets checks which of those buckets have public READ permissions and are thus deemed unsafe.

If dry run is disabled, the fourth step, approval, pauses the workflow until you approve modifying the ACLs found by filter-buckets. Once approved, the fifth step, modify-acls, carries out the bucket modifications.
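
To make these steps more concrete, here is a rough boto3 sketch of what they amount to against the S3 API. This is only an illustration, under the assumption that the workflow flags buckets granting READ or READ_ACP to the AllUsers group; it is not the workflow’s actual implementation, which runs each step in its own container.

    # Illustration only: roughly what list-buckets, get-bucket-acls,
    # filter-buckets, and modify-acls do against the S3 API.
    import boto3

    s3 = boto3.client("s3")
    ALL_USERS_URI = "http://acs.amazonaws.com/groups/global/AllUsers"

    # list-buckets: fetch every bucket in the account.
    buckets = [b["Name"] for b in s3.list_buckets()["Buckets"]]

    # get-bucket-acls + filter-buckets: keep buckets whose ACL grants
    # READ or READ_ACP to the public AllUsers group.
    unsafe = []
    for name in buckets:
        acl = s3.get_bucket_acl(Bucket=name)
        for grant in acl["Grants"]:
            grantee = grant.get("Grantee", {})
            if (grantee.get("Type") == "Group"
                    and grantee.get("URI") == ALL_USERS_URI
                    and grant["Permission"] in ("READ", "READ_ACP")):
                unsafe.append(name)
                break

    print(f"Found {len(unsafe)} bucket(s) with public READ permissions: "
          + ", ".join(unsafe))

    # modify-acls (runs only after you approve): reset each unsafe
    # bucket's ACL to private.
    # for name in unsafe:
    #     s3.put_bucket_acl(Bucket=name, ACL="private")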

Look at the logs for each step to get a feel for what they do, then we’ll move on to a “wet run” to see the workflow’s modification step in action.

Trying Out a “Wet Run”

Now that we have explored what a dry run of the workflow does, let’s try out the workflow with dryRun set to false. To see any effect, you’ll first have to intentionally create an unsafe S3 bucket so the workflow can detect it as such and enforce the restrictions.

To get started, go to the S3 Management Console and click Create Bucket. Fill in a name and region, and then uncheck “Block all public access.” Tick the checkbox next to the warning that appears.

S3 Management Console bucket settings

Scroll down to the bottom of the page and click Create bucket. When you’re redirected back to the Management Console, click on the name of the newly created bucket, then head to the “Permissions” tab.

S3 Management Console permissions

Scroll down to “Access control list (ACL)” and click “Edit.” Tick the checkbox for either “Everyone” or “Authenticated users group” and also tick the warning checkbox.

S3 Management Console ACL

Press “Save changes.” If you head back to the Management Console, you’ll see this:

S3 Management Console buckets
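
If you’d rather script this test setup than click through the console, the same deliberately unsafe bucket can be created with boto3. This is a sketch with a placeholder bucket name and region; only run it in a test account.

    # Sketch: create a deliberately unsafe test bucket (the name is a
    # placeholder and must be globally unique). Only do this in a test account.
    import boto3

    s3 = boto3.client("s3", region_name="us-east-1")
    bucket = "relay-unsafe-test-bucket"

    # Create the bucket. Outside us-east-1 you must also pass
    # CreateBucketConfiguration={"LocationConstraint": "<your-region>"}.
    s3.create_bucket(Bucket=bucket)

    # Equivalent of unchecking "Block all public access" in the console.
    s3.put_public_access_block(
        Bucket=bucket,
        PublicAccessBlockConfiguration={
            "BlockPublicAcls": False,
            "IgnorePublicAcls": False,
            "BlockPublicPolicy": False,
            "RestrictPublicBuckets": False,
        },
    )

    # Equivalent of ticking "Everyone" in the ACL editor: keep the owner's
    # full control and add a public READ grant for the AllUsers group.
    owner_id = s3.get_bucket_acl(Bucket=bucket)["Owner"]["ID"]
    s3.put_bucket_acl(
        Bucket=bucket,
        GrantFullControl=f'id="{owner_id}"',
        GrantRead='uri="http://acs.amazonaws.com/groups/global/AllUsers"',
    )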

Now it’s time to launch our Relay workflow. Be sure to set dryRun to false when starting the run.

The filter-buckets step will now detect this new bucket:

Found 1 bucket(s) that have public READ permissions: relay-unsafe-test-bucket

Once you get to the approval step, press “Yes.” After modify-acls is complete, head back to the S3 Management Console and the Permissions tab of the bucket. You’ll see that the workflow has successfully removed the unsafe permissions.

S3 Management Console ACL updated
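
Behind the scenes, the fix amounts to resetting the bucket’s ACL so that no public grants remain. If you’d like to double-check from code instead of the console, a short boto3 sketch (the bucket name is a placeholder) simply re-reads the ACL:

    # Sketch: confirm no public grants remain (bucket name is a placeholder).
    import boto3

    s3 = boto3.client("s3")
    acl = s3.get_bucket_acl(Bucket="relay-unsafe-test-bucket")

    public_grants = [
        g for g in acl["Grants"]
        if g.get("Grantee", {}).get("URI", "").endswith("/AllUsers")
    ]
    # Expect 0 after the workflow has run.
    print("Public grants remaining:", len(public_grants))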

Setting Up a Trigger

To keep your S3 buckets safe, it’s more convenient to run the workflow automatically on a regular basis than to run it manually. You can do this using triggers. Relay offers three types of triggers:

  • Schedule triggers: time-based scheduling, similar to cron jobs on Linux
  • Push triggers: allow external services to trigger the workflow by making an HTTP POST request authenticated with a JSON Web Token (JWT)
  • Webhook triggers: allow external services to trigger the workflow by POSTing a payload to a workflow-specific URL

For this workflow, we’re going to use a schedule trigger. If you look at the “Code” tab of the workflow, you’ll see there is already a schedule trigger prepared in a comment. Uncomment those lines and change dryRun to false if desired.

The schedule property indicates when to run the workflow. The format is equivalent to the Linux crontab format. There are five values: minute, hour, day of the month, month (one-based, so 1 is January), and day of the week (where 0 is Sunday), in that order. An asterisk means “any value.” The default schedule for this workflow is 0 * * * *, meaning that it will run at the start of every hour because minute is zero and all other values are “any.”

The crontab format is more flexible than just numbers and asterisks. You can also specify multiple values with a comma, a range with a hyphen, and steps (“every fourth hour”-style) with a slash. Here are a few examples; a quick way to preview a schedule locally follows the list:

  • 0 3 * * 0,6 runs at 3 AM on Sunday and Saturday
  • 0 0-3 * * * runs every day at midnight, 1 AM, 2 AM, and 3 AM (ranges are inclusive)
  • 4-54/10 * * * * runs every hour at :04, :14, :24, :34, :44 and :54
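
If you want to double-check a schedule before committing it, you can preview its upcoming fire times locally. The sketch below assumes the third-party Python package croniter is installed; it has nothing to do with Relay and is only used here to verify the expression.

    # Sketch: preview the next few fire times of a cron expression locally.
    # Assumes the third-party "croniter" package (pip install croniter).
    from datetime import datetime
    from croniter import croniter

    schedule = croniter("0 * * * *", datetime.now())  # the workflow's default schedule
    for _ in range(3):
        # Print the next three times a trigger with this expression would fire.
        print(schedule.get_next(datetime))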

After enabling the schedule trigger, it will appear as the first block in the workflow graph:

Workflow graph for editing bucket permissions

Visit Relay’s documentation page Using triggers in workflows to learn more about triggers, including other trigger types you can use to automate your DevOps tasks.

Next Steps

We configured a Relay workflow to restrict access to the relevant S3 buckets. Then, we set a schedule to automatically check and enforce those restrictions, saving you time while enhancing your security.

Relay enables you to conveniently automate even more of your DevOps maintenance, increasing your team’s productivity. Other workflows, for example, let you maintain and enforce other AWS resource policies. You can also expand these examples to develop workflows for any Azure, AWS, or Google Cloud Platform (GCP) resources.

To learn more about Relay and its workflows, read Relay’s official documentation. Invest a little time setting up your automatic workflows now to boost your future productivity. Check out https://relay.sh/ and start automating your DevOps maintenance today.