How to Delete Unattached GCP Disks in 3 Easy Steps

Blog post cover

Moving to the cloud brings benefits such as reduced infrastructure costs, increased scalability, and added redundancy. As your company takes advantage of the cloud, you may follow the trend to automate both the creation and destruction of cloud resources.

One area that has received less attention is tracking and cleaning up unused resources to ensure you are not spending more than necessary. As your cloud environment gets used, resources are often changed and can become orphaned. For example, developers commonly delete virtual machines (VMs) but leave the attached disks behind - sometimes this is intentional for disaster recovery, but they’re often just forgotten. Leaving orphaned disks wastes money and can be insecure: from a security standpoint, removing unused and potentially sensitive data reduces the available cyber attack surface.
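To put the waste in concrete terms, here is a small bit of illustrative arithmetic in Python. The $0.04 per GB-month rate is an assumption for the sake of the example; actual GCP disk pricing varies by disk type and region.

```python
def monthly_waste(disk_sizes_gb, price_per_gb_month=0.04):
    """Estimate the monthly cost of orphaned disks.

    price_per_gb_month is an illustrative figure, not an actual GCP rate.
    """
    return sum(disk_sizes_gb) * price_per_gb_month

# A handful of forgotten disks adds up quickly:
# monthly_waste([100, 500]) estimates the cost of 600 GB of orphaned storage
```

Even at modest per-gigabyte rates, a few hundred gigabytes of forgotten disks accumulates a noticeable monthly bill.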

There are now easy-to-use, cloud-based tools, such as Relay.sh from Puppet, to help automate your DevOps infrastructure maintenance. These tools lower costs by reducing manual work.

Relay supports all major cloud platforms, but in this article, we will focus on the Google Cloud Platform (GCP). Let’s see how easy it is to configure Relay to access our GCP project, check for unused disks, and delete them if appropriate.

Step 1: Create a Relay Account

First, let’s explore the Relay interface, and create a GCP disk. You will need a free Relay account. After signing up, you will see three items in your main menu—Workflows, Connections, and Documentation.

Relay menu

The Workflows page shows you a list of all existing workflows and has a “New Workflow” button to create a new workflow.

There is also an “Explore Workflows” button. You often do not have to write a workflow from scratch and can use an existing workflow as a starting point then customize as needed.

Workflows list

The Connections section enables you to create and save connections to use in all workflows. This account already has both Amazon Web Services (AWS) and GCP connections:

Connections screen

We cover operations with Google Cloud Platform (GCP) in this article, but there are many other connection options. As you can see below, Relay supports many essential cloud services.

Connection types

Step 2: Create a GCP Connection

Relay connects to GCP using a service account specific to a GCP project. GCP projects group resources for easier management, and service accounts grant access to a project’s resources.

In our case, we have created and selected a project, RelayStorageDemo. From the IAM & Admin menu, select “Service Accounts.”

GCP Service Accounts menu

On the Service Accounts page, create a service account, grant it permissions, and create a JSON key you can copy into a new Relay connection. For this demo, the Owner role is sufficient; for production, you will likely want a more limited role with only the precise permissions needed.

GCP create key

Click “Create key” to open the following prompt:

GCP create JSON key

The JSON key is then saved to your downloads directory, where you can open it and copy its contents to paste into the Relay GCP connection dialog box.

GCP private key saved

The next step is to go to “Compute Engine” and then “Disks” to create an “orphaned” disk for us to detect and delete.

GCP Disks menu

We’ve created a minimum-size disk of 10 GB.

GCP Disks screen

The final step is to create a GCP connection in Relay using the JSON key you created above. The file will look like this (values have been altered to protect sensitive data):

{
  "type": "service_account",
  "project_id": "relaystoragedemo",
  "private_key_id": "c0c6b3cdfbfc6ce0f7cdc95d4948daaea2fc4a06",
  "private_key": "-----BEGIN PRIVATE KEY-----\nMIIEvAIBuhtrshkiG9w0BAQEFAASCulkjjoIBAQC.../YKZ6wAIED875d54srdx6ODzinjJiuy=\n-----END PRIVATE KEY-----\n",
  "client_email": "55555555555-compute@developer.gserviceaccount.com",
  "client_id": "44444444444444444444",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://oauth2.googleapis.com/token",
  "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
  "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/55555555555-compute%40developer.gserviceaccount.com"
}
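Before pasting the key into Relay, you can sanity-check that the downloaded file is a complete service-account key. This is a minimal Python sketch of such a check; the field list reflects the standard GCP key format shown above.

```python
import json

# Fields every GCP service-account JSON key should contain (see the sample above)
REQUIRED_FIELDS = {
    "type", "project_id", "private_key_id", "private_key",
    "client_email", "client_id", "token_uri",
}

def validate_key(raw: str) -> list:
    """Return a list of problems; an empty list means the key looks complete."""
    key = json.loads(raw)
    problems = sorted(REQUIRED_FIELDS - key.keys())
    if key.get("type") != "service_account":
        problems.append("type is not 'service_account'")
    return problems
```

For example, `validate_key(open("relaystoragedemo-key.json").read())` should return an empty list for a freshly downloaded key.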

On the Relay Connections page, press “Add Connection,” and you will get the following dialog box. Name the connection and paste in the GCP JSON key from above. Note that the placeholder text shows the required format. Even after the key is saved, you will only ever see this placeholder text when editing the connection, which helps keep your key secret.

Add GCP connection screen

Once you press Save, the connection will be available for the next step of creating a workflow.

Step 3: Create a Relay Workflow

Relay is a system for running workflows. This architecture block diagram from Relay Docs shows the overall structure. Relay.sh includes a secret store to protect secrets like your GCP Service Account key. The workflow engine can accept several different trigger types and execute complex workflows.

Relay architecture diagram

Relay offers many existing workflows to get you started.

We are using the “gcp-disk-reaper” workflow that will delete GCP disks. If you press “Explore workflows” on the Workflows page and type “gcp” in the search window, you will see “Delete GCP disks that are unattached.” Clicking on the details will display the workflow source code, or you can examine the “Readme” information. Click on “Try this workflow” to proceed.

Create workflow from gcp-disk-reaper template

After creating the workflow, you can check the settings. It should automatically pick up the GCP connection if you only have one. The “Edit workflow” display will look as follows:

GCP workflow graph

There are a few things we would like to note. The first is that there is no automated trigger. In this example, we will manually run the workflow. There are many ways to trigger a workflow, including time-of-day, webhooks, or signals from other cloud resources like GitHub, Puppet, and Docker Hub.

This workflow also has a “Dry Run” parameter, with a default of “true.” A dry run retrieves the list of disks but does not proceed to the approval or delete-disk steps. When set to false, the approval step is enabled, and you can approve or reject the deletion of the disks found.
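The dry-run pattern this workflow uses can be sketched in a few lines of Python. This is an illustration of the control flow only, not Relay’s actual implementation; `list_disks`, `delete_disk`, and `approve` are hypothetical stand-ins for the workflow’s steps.

```python
def reap_disks(list_disks, delete_disk, approve, dry_run=True):
    """Sketch of a disk-reaper control flow.

    list_disks() -> iterable of unattached disk names
    delete_disk(name) -> deletes a single disk
    approve(names) -> True only when a human approves the deletion
    """
    unattached = list(list_disks())
    if dry_run:
        # Dry run: report what would be deleted, then stop
        return {"would_delete": unattached, "deleted": []}
    if not approve(unattached):
        return {"would_delete": unattached, "deleted": []}
    for name in unattached:
        delete_disk(name)
    return {"would_delete": unattached, "deleted": unattached}
```

The key design point is that the dry-run gate sits before both the approval and the destructive step, so the default configuration can never delete anything.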

This example is a simple workflow. In practice, your logic will take other factors into account to determine whether a disk should be deleted.
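For instance, you might only reap disks that have been around for a while and are not explicitly labeled for retention. Here is a hedged sketch of such a filter; the `keep` label and the seven-day threshold are our own conventions, not Relay’s, and the dictionary shape mirrors GCP’s disk resource fields (`users`, `labels`, `creationTimestamp`).

```python
from datetime import datetime, timedelta, timezone

def should_delete(disk, min_age_days=7):
    """Delete only unattached disks older than min_age_days without a 'keep' label."""
    if disk.get("users"):  # a non-empty users list means the disk is attached to a VM
        return False
    if disk.get("labels", {}).get("keep") == "true":
        return False
    created = datetime.fromisoformat(disk["creationTimestamp"])
    return datetime.now(timezone.utc) - created > timedelta(days=min_age_days)
```

A filter like this reduces the chance of deleting a disk that was detached only moments ago or intentionally set aside for disaster recovery.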

WARNING: DO NOT RUN THIS ON A PRODUCTION GCP PROJECT. Any orphaned disks will be deleted if found and approved.

Do a dry run first to test the result in the web interface. Press the “Run workflow” button and accept the default Dry Run = true parameter. You can see the results of each step in the logs. In this case, one disk, “disk-1,” was found.

filter-disk log

Finally, we ran the workflow with Dry Run = false and approved the deletion.

Completed workflow graph

Status from the “delete-disks” step shows that the GCP disk was deleted.

delete-disks log

Rerunning the workflow fails, as there are no disks left to list.

list-disks log

Next Steps

We’ve seen how easy it is to use a Relay by Puppet workflow to delete unneeded GCP disks.

Try some of the other existing workflows to quickly set up VM, NIC, and disk maintenance, and more. These examples provide good starting points, and you can customize them for GCP or other clouds like AWS and Azure.

Puppet has provided software automation since 2009, is a leader in DevOps, and fully supports the open-source community. Discover more about Relay by Puppet in their documentation.

Sign up for a free Relay.sh account to start automating your DevOps maintenance. Then, you can use all that time and money you save for your next exciting development project.

About the Author, Dave Noderer

Dave Noderer is the CEO/President and founder of Computer Ways, Inc., a software development company focused on Microsoft technologies including .NET, Azure, Dynamics 365, IoT, and SQL Server. He is also COO of Nedd Tech Inc., which focuses on augmented reality, a Microsoft MVP alumnus, and a developer community activist. Mr. Noderer is an electrical engineer by training and has been doing software development since founding Computer Ways, Inc. in 1994.