Cloud Resume Challenge

Cover page from Tutorials Dojo

Did you know that 70% of career decisions happen on a whim while binge-watching cat videos at 2 AM? That's exactly how I stumbled upon the Cloud Resume Challenge, a beginner-friendly way to introduce yourself to AWS!

This is my attempt at the Cloud Resume Challenge by Forrest Brazeal, which involves a simple project specification utilizing AWS.

In this article, I will discuss what I did to complete the challenge, along with the added complexity of using Terraform instead of the AWS CloudFormation specified in the challenge.

Project Outcome:

Cloud Resume Architecture

What is the Challenge?

The Cloud Resume Challenge is about following a project specification, built on AWS, that results in a resume site.

The challenge encourages the use of various AWS services, along with some CI/CD tools too, but you can personalize the project by swapping out certain tools or services. For example:

  • CloudFormation -> Terraform
  • GitHub Actions -> GitLab CI/CD
  • Route 53 -> Cloudflare

The goal is to make everything work together seamlessly in the end; how you get there depends entirely on your interpretation of the project specifications.

Tackling the Challenge

The challenge consists of 16 parts, but I'll simplify it into five key sections:

  1. Building the Frontend: Designing your resume site and setting it up with S3, CloudFront, and Route 53.
  2. Building the Backend: Manually provisioning backend resources, including AWS Lambda, API Gateway, and DynamoDB.
  3. Automating with GitHub Actions: Implementing CI/CD for the frontend.
  4. Infrastructure as Code (IaC): Converting your backend into IaC using Terraform.
  5. Deploying Unit Tests: Ensuring your code is robust before deployment.

Finally, enjoy the results and prepare for the AWS Certified Cloud Practitioner (CCP) exam or any other certification!

Building the Frontend

You can use any framework you're comfortable with, such as plain HTML, CSS, and JavaScript. For this project, I used Next.js, ShadcnUI, TypeScript, and Tailwind CSS. However, this article will focus on the cloud setup rather than designing the website. So, feel free to go with any basic look or take time to design it.

Once you've designed your resume website and populated it with your data, you can manually provision the necessary AWS resources to host your site.

Before using AWS

Now, most people I saw following the challenge went directly into creating AWS resources in the console. However, I think it is better to first diagram the architecture and see the bigger picture of what you are building.

So, here is what the frontend architecture would look like:

Frontend Diagram

For those who have not yet created their AWS accounts, you can follow this guide.

Once I got the bigger picture, I knew I had to work in this order: S3 bucket -> CloudFront -> Route 53 -> ACM, plus all the little steps in between.

Configuring S3 with CloudFront, ACM & Route 53

S3: Storage for our static files, or the frontend assets we made.

  • When naming the S3 bucket, make it unique; bucket names are global, and any bucket can be reached through the URL that matches its name.
  • Don't forget to enable static website hosting.
  • Disable "Block all public access".

Goal: Have an S3 website endpoint that serves the frontend assets. (A boto3 sketch of these steps is included below.)

Endpoint of S3 Bucket

Disclaimer
Don't worry if the site isn't secure yet (HTTP). That is simply the default for S3 website endpoints; in the later steps, we are going to fix it with CloudFront.
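
If you'd rather script these S3 steps than click through the console, a rough boto3 sketch of the same configuration might look like the following. This is only a sketch under my assumptions: the bucket name is a placeholder, the bucket already exists, and your local AWS credentials are configured.

   import json
   import boto3

   BUCKET = "your-unique-resume-bucket"  # placeholder: bucket names are global

   s3 = boto3.client("s3")

   # Enable static website hosting on the bucket
   s3.put_bucket_website(
       Bucket=BUCKET,
       WebsiteConfiguration={
           "IndexDocument": {"Suffix": "index.html"},
           "ErrorDocument": {"Key": "index.html"},
       },
   )

   # Turn off "Block all public access" so a bucket policy can allow reads
   s3.put_public_access_block(
       Bucket=BUCKET,
       PublicAccessBlockConfiguration={
           "BlockPublicAcls": False,
           "IgnorePublicAcls": False,
           "BlockPublicPolicy": False,
           "RestrictPublicBuckets": False,
       },
   )

   # Allow public read access to the objects, like the console policy editor would
   s3.put_bucket_policy(
       Bucket=BUCKET,
       Policy=json.dumps({
           "Version": "2012-10-17",
           "Statement": [{
               "Effect": "Allow",
               "Principal": "*",
               "Action": "s3:GetObject",
               "Resource": f"arn:aws:s3:::{BUCKET}/*",
           }],
       }),
   )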

Route 53: After registering the domain (or transferring one in), create the DNS records that will point it at your CloudFront distribution.

CloudFront & ACM: CloudFront delivers your content from its edge locations (caching), while ACM provides the TLS/SSL certificate for your domain.

  • Link CloudFront to the S3 endpoint as its origin.
  • Create a TLS/SSL certificate for the domain with ACM (a small boto3 sketch follows below).
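
For the certificate step, a minimal boto3 sketch might look like this (the domain names are placeholders, and the DNS validation records still need to be created in Route 53 afterwards):

   import boto3

   # CloudFront only accepts ACM certificates issued in the us-east-1 region
   acm = boto3.client("acm", region_name="us-east-1")

   response = acm.request_certificate(
       DomainName="example.com",                     # placeholder apex domain
       SubjectAlternativeNames=["www.example.com"],  # placeholder subdomain
       ValidationMethod="DNS",                       # validate via Route 53 records
   )

   print(response["CertificateArn"])  # attach this ARN to the CloudFront distribution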

IMPORTANT: Always configure the right policies for each of the AWS resources you utilize, or create dedicated roles using the root account.

Now you should have a resume site with HTTPS!

Building the Backend

With the frontend in place, it's time to set up the backend. The backend involves creating and configuring services that will allow your resume site to handle dynamic content and store data. For this challenge, we'll manually provision the following AWS resources:

  • AWS Lambda: For running serverless functions.
  • API Gateway: To expose your Lambda functions as HTTP endpoints.
  • DynamoDB: To store and retrieve data, such as the number of visitors to your site.

And the backend diagram would look something like this:

Backend Diagram

Now, setting up the table was easy, and making a Lambda interact with it was easy too; I just had to recall some Python and get familiar with the boto3 library.
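
For reference, the table that the Lambda function below expects (ResumeVisitorCount, with an id partition key) only needs a few lines of boto3 to create; this is just a sketch of what the console wizard does, assuming on-demand billing:

   import boto3

   dynamodb = boto3.client("dynamodb")

   # Create the visitor-count table with a single string partition key named "id"
   dynamodb.create_table(
       TableName="ResumeVisitorCount",
       AttributeDefinitions=[{"AttributeName": "id", "AttributeType": "S"}],
       KeySchema=[{"AttributeName": "id", "KeyType": "HASH"}],
       BillingMode="PAY_PER_REQUEST",  # assumption: on-demand capacity
   )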

Lambda: A serverless service that runs your code whenever it receives a request or event.

Step 3: Write the Lambda Function Code

  1. In the Lambda console, scroll down to the Function code section.
  2. Replace the default code with the following:
   import json
   import boto3

   def lambda_handler(event, context):
       # Connect to the DynamoDB table that stores the visitor count
       dynamodb = boto3.resource('dynamodb')
       table = dynamodb.Table('ResumeVisitorCount')

       # Atomically increment the 'visits' attribute of the counter item
       response = table.update_item(
           Key={'id': 'visitor_count'},
           UpdateExpression='ADD visits :inc',
           ExpressionAttributeValues={':inc': 1},
           ReturnValues='UPDATED_NEW'
       )

       # Return a simple confirmation for API Gateway to pass back to the caller
       return {
           'statusCode': 200,
           'body': json.dumps('Visitor count updated')
       }
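
Since unit tests are also part of the challenge (item 5 in the outline above), here is a minimal sketch of how this handler could be tested locally by stubbing boto3 with unittest.mock. It assumes the handler above is saved as lambda_function.py; the test file name is hypothetical, and pytest is used as the runner:

   # test_lambda_function.py (hypothetical); run with: python -m pytest
   import json
   from unittest.mock import MagicMock, patch

   import lambda_function  # assumes the handler above lives in lambda_function.py

   @patch("lambda_function.boto3")
   def test_visitor_count_is_incremented(mock_boto3):
       # Stub the DynamoDB table so the test never touches real AWS
       mock_table = MagicMock()
       mock_boto3.resource.return_value.Table.return_value = mock_table

       result = lambda_function.lambda_handler({}, None)

       # The handler should issue exactly one atomic ADD update and report success
       mock_table.update_item.assert_called_once()
       assert result["statusCode"] == 200
       assert json.loads(result["body"]) == "Visitor count updated"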

Now, all we have to do is set up API Gateway. I used an HTTP API and configured it with a Lambda integration pointing at the function. Once it is created, we should test the API's endpoint with something like Postman (or the small Python script shown after the sample response below) to check that the API can talk to the outside world.

You should see something similar to:

   {
       "statusCode": 200,
       "body": "\"Visitor count updated\""
   }
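
As an alternative to Postman, a quick Python script can exercise the endpoint too. This is only a sketch: the invoke URL is a placeholder, the HTTP method should match whatever route you configured on the HTTP API, and requests is a third-party package (pip install requests):

   import requests

   # Placeholder: replace with your HTTP API's invoke URL and route
   API_URL = "https://your-api-id.execute-api.us-east-1.amazonaws.com/visitor-count"

   # Use the method you configured on the route (POST is assumed here)
   response = requests.post(API_URL, timeout=10)

   print(response.status_code)  # expect 200
   print(response.text)         # expect the "Visitor count updated" message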

Then, once it can communicate with the outside world, we create a hook in our frontend code that invokes the API and updates the visitor count shown on the website.

And with that, your backend setup is complete! Your site is now capable of interacting with AWS services to perform dynamic actions, such as tracking visitor counts.

Terraform (IaC)

Now, this was the trickiest part of the whole challenge for me: learning IaC. When we provision resources in the console, the default settings are usually enough, coupled with a bit of permissions work.

With Terraform, however, I had to explicitly declare every attribute of each resource and make sure that I planned and applied each change correctly.

So, once we destroyed all the manually provisioned backend resources, we rebuilt them through IaC, for reusability and for the ease of writing code rather than clicking through a UI.

Automating with GitHub Actions

With the backend and frontend set up, the next step is to automate your deployment process using GitHub Actions. This ensures that every time you push changes to your repository, the updated code is automatically deployed.

Step 1: Set Up a GitHub Actions Workflow

  1. In your GitHub repository, navigate to the Actions tab and click New workflow.
  2. Select Set up a workflow yourself.
  3. Replace the default YAML file with the following:
   name: Deploy to S3 and Invalidate CloudFront

   on:
     push:
       branches:
         - main

   jobs:
     deploy:
       runs-on: ubuntu-latest

       steps:
         - name: Checkout code
           uses: actions/checkout@v2

         - name: Configure AWS credentials
           uses: aws-actions/configure-aws-credentials@v1
           with:
             aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
             aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
             aws-region: us-east-1

         - name: Sync files to S3
           run: |
             aws s3 sync ./frontend s3://your-bucket-name --delete

         - name: Invalidate CloudFront cache
           run: |
             aws cloudfront create-invalidation --distribution-id your-distribution-id --paths "/*"

Conclusion

If you are a beginner looking to use a variety of services in the cloud, I recommend this challenge, as it teaches you the essentials of AWS services, IaC, and web development. Plus, it's also a way to have an online, hardcoded resume hosted in the cloud! A bit over-engineered for a resume, but at least it is something that can last.