Starting from Scratch: Crafting a Beginner’s Deployment Workflow with Docker and GCP

7/31/2024

Introduction

Deploying an application can seem daunting when you are a beginner, especially if you are still new to the world of DevOps.

For my first deployment project, I decided to create a GitHub Actions workflow to deploy a dockerized application on Google Cloud Platform (GCP). This article documents the pipeline I built to deploy a FastAPI application that performs sanity checks on client services (including their SSL certificates) and sends email notifications based on the results.

It was a rewarding experience that taught me a lot about deployment tools and services. In this article, I’m going to share that experience with you and guide you through the key stages of the process.

Prerequisites

Before getting started, you need to create a new project in GCP and make sure you have the right access to contribute to it.

You need to install the Google Cloud SDK on your machine, then authenticate and configure it for the GCP project with these commands:

gcloud init

and

gcloud auth login

You also need to create a service account key. This key is used to authenticate CI/CD actions. Make sure the service account has the necessary roles and permissions, such as roles/run.admin for Cloud Run and roles/storage.admin to manage images in the Artifact Registry.
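
If you prefer the command line over the Cloud Console, the commands below sketch one way to create the service account, grant it the roles mentioned above, and export its JSON key. The name github-deployer and the <PROJECT_ID> placeholder are examples; adapt them to your own project.

# Create a dedicated service account for the CI/CD pipeline
gcloud iam service-accounts create github-deployer \
    --display-name "GitHub Actions deployer"

# Grant the roles needed to deploy to Cloud Run and push images
gcloud projects add-iam-policy-binding <PROJECT_ID> \
    --member "serviceAccount:github-deployer@<PROJECT_ID>.iam.gserviceaccount.com" \
    --role roles/run.admin

gcloud projects add-iam-policy-binding <PROJECT_ID> \
    --member "serviceAccount:github-deployer@<PROJECT_ID>.iam.gserviceaccount.com" \
    --role roles/storage.admin

# Export a JSON key for the service account
gcloud iam service-accounts keys create key.json \
    --iam-account github-deployer@<PROJECT_ID>.iam.gserviceaccount.com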

In your GitHub repository, configure the secrets needed to store sensitive information such as Google Cloud service account keys. For example, we use a secret named GOOGLE_CREDENTIALS_DEV to store the service account's JSON key.
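
You can add the secret through the repository's Settings page, or, if you have the GitHub CLI installed, straight from the terminal (key.json is the service account key file created earlier):

gh secret set GOOGLE_CREDENTIALS_DEV < key.json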

Finally, make sure Docker is installed on your machine so you can build and test Docker images locally before pushing them to GCP.
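
A quick local check might look like the following. The image name, the port mapping and the /docs path are assumptions; adjust them to your own Dockerfile and FastAPI routes.

# Build the image from the Dockerfile at the root of the repository
docker build -t sanity-check-api:local .

# Run the container locally, mapping the port your FastAPI app listens on
docker run --rm -p 8080:8080 sanity-check-api:local

# In another terminal, check that the API responds
curl http://localhost:8080/docs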

Why GitHub Actions, Docker and GCP?

A GitHub Actions workflow is a series of automated steps that run on your code repository, making it easier to automate tasks like testing, building, and deploying applications. This is what is known as Continuous Integration and Continuous Delivery (CI/CD).

Docker is essential for modern development, as it creates isolated containers that bring together everything an application needs, making it easy to deploy across different environments.

Finally, GCP offers flexible cloud services. For this project, I used Artifact Registry to store Docker images and Cloud Run to deploy these containers in a scalable way.

Get started

Workflow files must be placed inside the “.github/workflows” directory, and they are written in YAML, the format GitHub Actions uses to define workflows. Create a new file called “dev-deployment.yaml”.
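
From the root of your repository, you can create the directory and the empty workflow file like this:

mkdir -p .github/workflows
touch .github/workflows/dev-deployment.yaml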

Step 1: Define the trigger.

name: dev Deploy

# Specifies that the workflow should be triggered on push events to the 'main' branch
on:
  push:
    branches:
      - main

Start by giving your workflow a name; it can be anything. We decided to name it “dev Deploy”. Then specify which branch triggers the workflow on push. In our case, we want the workflow to run every time a push lands on the main branch, for example when a pull request is merged.

Step 2: Define the job and the steps.

# Defines a job named "Cloud Run Deployment"
jobs:
  build:
    name: Cloud Run Deployment
    runs-on: ubuntu-latest # The type of runner that the job will execute on

    # Steps are the sequential tasks that will be executed as part of the job
    steps:
      # Checks out the repository under $GITHUB_WORKSPACE, so your job can access it
      - name: Checkout
        uses: actions/checkout@v2

      # Step to set up Python (if your tests are written in Python)
      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: '3.12' # Specify the Python version you need

      # Install dependencies
      - name: Install dependencies
        run: |
          python -m venv venv
          source venv/bin/activate
          python -m pip install --upgrade pip
          pip install -r requirements.txt

A job is a set of steps that are executed on a runner. In our case, we have a job named “Cloud Run Deployment” that will run on the “ubuntu-latest” runner.

Within the job, we outline the steps to execute. Steps are the sequential tasks that GitHub Actions will perform:

  • Checkout: checks out the code from your GitHub repository so that the workflow can access it.
  • Set up Python: As our application is FastAPI-based, we need to set up Python and specify the required version.
  • Install dependencies: We tell the workflow to create a virtual environment in which it will install the dependencies needed to run the application.

Step 3: Authenticate and setup Cloud SDK.

      - id: 'auth'
        name: 'Authenticate to Google Cloud'
        uses: 'google-github-actions/auth@v2'
        with:
          credentials_json: ${{ secrets.GOOGLE_CREDENTIALS_DEV }}

      # Sets up the Google Cloud SDK, allowing subsequent commands to use gcloud
      - name: Set up Cloud SDK
        uses: google-github-actions/setup-gcloud@v1
        with:
          export_default_credentials: true
  • Authenticate to Google Cloud: Use the google-github-actions/auth action to authenticate with Google Cloud. It reads the credentials stored in your GitHub repository's secrets.
  • Set up Cloud SDK: Installs and configures the Google Cloud SDK so that subsequent steps can run gcloud commands.

Step 4: Dockerize the application.

      # Configures Docker to use 'gcloud' as a credential helper, allowing it to push images to Google Artifact Registry
      - name: Configure Docker
        run: gcloud auth configure-docker us-central1-docker.pkg.dev

      # Builds the Docker image and tags it
      - name: Build
        run: docker build -t us-central1-docker.pkg.dev/hawkswell-dev/sanity-check-api/sanity-check-api:latest .

      # Pushes the Docker image to Google Artifact Registry
      - name: Push
        run: docker push us-central1-docker.pkg.dev/hawkswell-dev/sanity-check-api/sanity-check-api:latest
  • Configure Docker for GCP: This step configures Docker to use gcloud as a credential helper, allowing it to push images to Google Artifact Registry.
  • Build the Docker Image: Build the Docker image from your Dockerfile and tag it appropriately.
  • Push the Docker Image: Push the built Docker image to Google Artifact Registry.

At this point, our Docker image is built and pushed to Artifact Registry in GCP.

Finally, we need to deploy this image to Cloud Run.

Step 5: Deploy with Cloud Run.

      # Deploys the Docker image to Cloud Run
      - name: Deploy to Cloud Run
        run: |
          gcloud run deploy sanity-check-api \
            --region us-central1 \
            --image us-central1-docker.pkg.dev/hawkswell-dev/sanity-check-api/sanity-check-api:latest \
            --allow-unauthenticated \
            --project hawkswell-dev \
            --cpu-throttling \
            --set-env-vars SENDER_EMAIL=sender.email@address.com \
            --set-env-vars RECEIVER_EMAIL=receiver.email@address.com \
            --set-env-vars ALWAYS_SEND=False

Deploy to Cloud Run: Deploy the Docker image to Cloud Run. This makes our application accessible and scalable in the cloud.

Here is a breakdown of each line:

gcloud run deploy sanity-check-api \

This initiates the deployment of a new service or updates an existing service on Cloud Run. The sanity-check-api argument specifies the name of the service being deployed.

--region us-central1 \

This specifies the Google Cloud region where the Cloud Run service should be deployed. In our case, the service will be deployed in the us-central1 region.

--image us-central1-docker.pkg.dev/hawkswell-dev/sanity-check-api/sanity-check-api:latest \

The --image flag specifies the Docker image that will be used. The provided URL points to an image stored in Google Artifact Registry. The latest tag indicates that the most recent version of the image should be used.

--allow-unauthenticated \

This flag allows unauthenticated access to the service.

--project hawkswell-dev \

This flag specifies the Google Cloud project ID where the service will be deployed. hawkswell-dev is our project ID.

--cpu-throttling \

This option enables CPU throttling for the Cloud Run service, so CPU is only fully allocated while requests are being processed. This can help optimize resource usage and reduce costs.

--set-env-vars SENDER_EMAIL=sender.email@address.com \

This flag sets an environment variable SENDER_EMAIL which is the client email. Make sure to replace “sender.email@address.com” with a real email address.

--set-env-vars RECEIVER_EMAIL=receiver.email@address.com \

Similar to the previous one, the specified email address should be replaced with a real one.

--set-env-vars ALWAYS_SEND=False

This sets the environment variable ALWAYS_SEND to False.
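
Once the workflow has run successfully, you can double-check the deployment from your terminal. The commands below are a quick sanity check using the service name, region and project shown above; the /docs path is an assumption, so adjust it to one of your routes.

# Retrieve the public URL of the deployed service
gcloud run services describe sanity-check-api \
    --region us-central1 \
    --project hawkswell-dev \
    --format 'value(status.url)'

# Call the service to make sure it responds
curl https://<SERVICE_URL>/docs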

From now on, every time you merge a pull request into the main branch of your GitHub repository (or push to it directly), this workflow will be automatically triggered to perform the necessary installations, configurations, Docker operations and GCP deployment. This saves you a considerable amount of time, as you no longer have to execute all these commands manually. Your workflow now takes care of all this, simplifying your deployment process.

Conclusion

Deploying an application with Docker and GCP just got easier with GitHub Actions. By following this guide, you’ve automated the deployment of an application on Cloud Run. You now have the foundations to continue exploring and improving your DevOps skills. Feel free to experiment further and develop projects with confidence.
