<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0"><channel><title><![CDATA[Patrick Drew]]></title><description><![CDATA[Patrick Drew]]></description><link>https://blog.patrickdrew.com</link><generator>RSS for Node</generator><lastBuildDate>Sat, 25 Apr 2026 21:03:53 GMT</lastBuildDate><atom:link href="https://blog.patrickdrew.com/rss.xml" rel="self" type="application/rss+xml"/><language><![CDATA[en]]></language><ttl>60</ttl><item><title><![CDATA[The Cloud Resume Challenge: Enterprise Edition]]></title><description><![CDATA[I'm a full-stack developer who has recently moved into a DevOps / infrastructure role. Over the past year, I'd acquired a handful of AWS and Red Hat certificates but wanted to start building stuff and gain more practical experience with the cloud. Fo...]]></description><link>https://blog.patrickdrew.com/the-cloud-resume-challenge-enterprise-edition</link><guid isPermaLink="true">https://blog.patrickdrew.com/the-cloud-resume-challenge-enterprise-edition</guid><category><![CDATA[AWS]]></category><category><![CDATA[cloud-resume-challenge]]></category><category><![CDATA[Devops]]></category><category><![CDATA[ci-cd]]></category><dc:creator><![CDATA[Patrick Drew]]></dc:creator><pubDate>Wed, 12 Jul 2023 09:06:49 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/stock/unsplash/tSEiF1ZWUTo/upload/7a6f68a5e7d6496f0d8e05360ae25337.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>I'm a full-stack developer who has recently moved into a DevOps / infrastructure role. Over the past year, I'd acquired a handful of AWS and Red Hat certificates but wanted to start building stuff and gain more practical experience with the cloud. 
<a class="user-mention" href="https://hashnode.com/@forrestbrazeal">Forrest Brazeal</a>'s <a target="_blank" href="https://cloudresumechallenge.dev/">Cloud Resume Challenge</a> seemed like an excellent catalyst for putting this plan into motion. I purchased the guidebook and was intrigued to discover there were multiple advanced "mods" or extensions for each chunk of the challenge. I decided I would attempt as many of these as possible with the aim of shipping a full-featured, "enterprise edition" of the Cloud Resume Challenge. Here's how I tackled the mods I found the most interesting.</p>
<p><a target="_blank" href="https://resume.patrickdrew.com/">View the finished product here!</a></p>
<p><a target="_blank" href="https://github.com/pdrew/cloud-resume-challenge">View the code on GitHub.</a></p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1688635954501/5249c2ca-66f7-4c1b-897e-424ee293da1d.png" alt="Table of challenge mods and completion status" class="image--center mx-auto" /></p>
<h2 id="heading-setup-and-safety">Setup and Safety</h2>
<p>This wasn't a mod per se, but there was a recommended "professional" approach for creating new AWS accounts. I set up a main / billing AWS account specifically for this project and then used AWS Organisations to create development and production accounts. My code would first be deployed to the dev account and once I was happy with how it looked and functioned, it would get promoted to production. One of the goals I had for the project was to avoid using access and secret keys. To achieve this, I set up Identity Centre and configured the AWS CLI on my local machine to use single sign-on. For authenticating my GitHub workflows, I created IAM OIDC Identity Providers in the development and production accounts, so I wouldn’t need to store AWS credentials as secrets.</p>
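<p>The OIDC approach comes down to a deployment role whose trust policy only accepts tokens issued by GitHub's OIDC provider for a specific repository. As a rough sketch (the account ID and repository below are placeholders, not this project's real values), building that trust policy document in TypeScript might look like this:</p>

```typescript
// Build an IAM trust policy that lets GitHub's OIDC provider assume a
// deployment role, scoped to a single repository. The account ID and
// repo passed in below are placeholders, not this project's real values.
interface TrustPolicy {
  Version: string;
  Statement: Array<{
    Effect: string;
    Principal: { Federated: string };
    Action: string;
    Condition: Record<string, Record<string, string>>;
  }>;
}

function githubOidcTrustPolicy(accountId: string, repo: string): TrustPolicy {
  const providerArn = `arn:aws:iam::${accountId}:oidc-provider/token.actions.githubusercontent.com`;
  return {
    Version: "2012-10-17",
    Statement: [
      {
        Effect: "Allow",
        Principal: { Federated: providerArn },
        Action: "sts:AssumeRoleWithWebIdentity",
        Condition: {
          StringEquals: {
            // Only accept tokens minted for the STS audience.
            "token.actions.githubusercontent.com:aud": "sts.amazonaws.com",
          },
          StringLike: {
            // Restrict the role to workflows running in this one repository.
            "token.actions.githubusercontent.com:sub": `repo:${repo}:*`,
          },
        },
      },
    ],
  };
}
```

<p>A role created with a policy like this can then be referenced from the workflow (e.g. via the <code>role-to-assume</code> input of AWS's <code>configure-aws-credentials</code> action), with no long-lived keys stored in GitHub.</p>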
<p>My DNS configuration is a little complicated. My domain wasn't purchased from AWS, and I wanted a dedicated subdomain / child zone for my test environments. After some experimentation, I settled on hosting my primary domain (patrickdrew.com) in Route 53 of my production account, then created an NS record delegating a child zone (test.patrickdrew.com) hosted in Route 53 of the test account. I then updated the name servers for the primary domain with my DNS registrar.</p>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1688634278863/79541cd2-7452-472d-bb44-c10083d56cab.png" alt="Parent / child DNS zone setup for the challenge" class="image--center mx-auto" /></p>
<p>This account setup is partially automated using the open-source tool <a target="_blank" href="https://github.com/org-formation/org-formation-cli">org-formation</a>. When commencing the challenge, it didn’t occur to me that I could have automated the Identity Centre and Route 53 pieces of the project as well; otherwise, I would have included them in my org-formation template.</p>
<h2 id="heading-chunk-0-certification-prep">Chunk 0: Certification Prep</h2>
<p>Before starting the challenge, I'd achieved the AWS Solutions Architect - Associate, AWS Solutions Architect - Professional and Red Hat Certified Systems Administrator certifications. After roughly twelve months of grinding on the certificate mill, I was a little burnt out from studying, so I opted out of the Security Mod (achieving the AWS Security Specialty certificate).</p>
<h2 id="heading-chunk-1-building-the-front-end">Chunk 1: Building the Front End</h2>
<p>It had been a while since I'd done any front-end development, and I was looking forward to digging into some new technologies. I'd used Angular at work but had never set foot in the React world, so I implemented the resume using Next.js. Tailwind CSS was my choice for styling the site. For my first pass, I wrote the bulk of the markup in a single page, then iteratively broke out each section (work experience, certifications, projects etc.) into individual React components. This made using Tailwind's utility CSS classes much less repetitive. I then converted the front end to TypeScript. Data fetching from the back-end API was implemented with the SWR library, allowing the site to retrieve the latest visitor count without refreshing the page.</p>
<h2 id="heading-chunk-2-building-the-api">Chunk 2: Building The API</h2>
<p>I collaborate with teams that are creating AWS lambdas using .NET but had never done it myself, so I chose to write the back end in C#. I stuck with using DynamoDB for the database since NoSQL technologies were new to me and I've worked with SQL Server / RDBMS for some time. The back-end lambda was implemented using ASP.NET Minimal APIs.</p>
<p>For the DevOps Mod, I set up CloudWatch Alarms for lambda throttling and API Gateway latency metrics. These alarms publish to an SNS topic. I created another lambda in C# that subscribes to the SNS topic and sends a notification to a Slack webhook. The text of the notification includes a delightful flame emoji, sparking joy whenever I receive a deluge of alerts via Slack as the site gets crushed by seemingly modest amounts of traffic (jk, the site gets zero traffic).</p>
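<p>The notifier in this project is a C# lambda, but the core transformation is simple enough to sketch in TypeScript as an illustration (the field names follow CloudWatch's alarm notification JSON; the actual webhook call is omitted):</p>

```typescript
// Turn a CloudWatch alarm notification (as delivered via SNS) into a Slack
// message payload. Illustrative sketch only - the project's real notifier
// is a C# lambda. Field names follow CloudWatch's alarm JSON shape.
interface AlarmNotification {
  AlarmName: string;
  NewStateValue: string; // e.g. "ALARM" or "OK"
  NewStateReason: string;
}

function slackPayload(alarm: AlarmNotification): { text: string } {
  // The delightful flame emoji, reserved for genuine emergencies.
  const emoji = alarm.NewStateValue === "ALARM" ? "🔥" : "✅";
  return {
    text: `${emoji} ${alarm.AlarmName} is ${alarm.NewStateValue}: ${alarm.NewStateReason}`,
  };
}
```

<p>The payload would then be POSTed to the Slack incoming-webhook URL stored with the function's configuration.</p>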
<h2 id="heading-chunk-3-front-end-back-end-integration">Chunk 3: Front End / Back End Integration</h2>
<p>The developer mod for this chunk requires the API to count unique visitors rather than individual page hits. As recommended in the guide, I hashed the visitor's IP address to avoid storing personally identifying information in the database. I wanted to understand best practices for computing aggregates in DynamoDB, so I created a DynamoDB Stream and integrated it with a Lambda function. Each time a record representing a visitor was created or updated, the lambda would increment a record with unique visitor counts and total site visits for the current month. Using DynamoDB's TTL feature, visitor records would be deleted at the end of the month, so the table is kept as compact as possible while retaining snapshots of the past months' visitor statistics.</p>
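<p>The two building blocks here, hashing the IP and computing a month-end expiry, are easy to sketch. A minimal TypeScript version (the salt is a placeholder; DynamoDB TTL expects an epoch-seconds attribute) might look like:</p>

```typescript
import { createHash } from "node:crypto";

// Hash a visitor's IP with a salt so no raw PII is stored. The salt here
// is a placeholder; in practice it would come from configuration.
function visitorId(ip: string, salt: string): string {
  return createHash("sha256").update(salt + ip).digest("hex");
}

// Compute a DynamoDB TTL value (epoch seconds, UTC) for the first instant
// of the following month, so visitor records expire at month's end.
function endOfMonthTtl(now: Date): number {
  const next = new Date(Date.UTC(now.getUTCFullYear(), now.getUTCMonth() + 1, 1));
  return Math.floor(next.getTime() / 1000);
}
```

<p>Since the hash is deterministic, repeat visits from the same address map to the same record, which is what lets the stream-triggered lambda distinguish new visitors from returning ones.</p>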
<p>WAF rules were outside the scope of my project's budget, so for the security mod I went the quick and dirty route of enabling throttling and rate limiting on my API Gateway (but if this were a production web app for a paying client, I'd consider WAF rules essential). I'm going to be generous and give myself a 50% passing mark on this one.</p>
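<p>For intuition, API Gateway's throttling behaves like a token bucket: a steady refill rate plus a burst capacity. This toy model (not the AWS implementation, just an illustration of the mechanism) shows why a burst of requests beyond the limit gets a 429 until the bucket refills:</p>

```typescript
// A toy token-bucket rate limiter, illustrating how API Gateway-style
// throttling (rate + burst) behaves. Not the AWS implementation.
class TokenBucket {
  private tokens: number;

  constructor(private rate: number, private burst: number) {
    // Start full: a quiet client can burst up to `burst` requests at once.
    this.tokens = burst;
  }

  // Advance time by `elapsedSeconds` of refill, then try to admit one request.
  allow(elapsedSeconds: number): boolean {
    this.tokens = Math.min(this.burst, this.tokens + elapsedSeconds * this.rate);
    if (this.tokens >= 1) {
      this.tokens -= 1;
      return true;
    }
    return false; // would be a 429 Too Many Requests
  }
}
```

<p>With a rate of 1 request/second and a burst of 2, two back-to-back requests succeed, a third is throttled, and after a second of refill the next one passes again.</p>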
<h2 id="heading-chunk-4-automation-ci">Chunk 4: Automation / CI</h2>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1688709475642/32c5c9fc-b3e9-4431-bdbb-567041187cf4.png" alt class="image--center mx-auto" /></p>
<p>The teams I support at work use the AWS CDK with C#, so I decided to do the same for the infrastructure as code piece of the challenge. When a pull request is opened against the main branch of my repository, a GitHub Actions workflow deploys a new instance of the stack in a test account. These preview environments get a unique URL, DynamoDB table and the other AWS resources required to run the site. Resource names are unique and prefixed with the pull request number, meaning multiple preview environments can exist side by side in the same account. If the pull request gets updated with new code, the environment gets updated by automation. When the pull request is approved and merged into the repository's main branch, the code gets deployed into the production AWS account. When the pull request is closed (approved or denied), the preview environment gets destroyed by another GitHub Actions workflow.</p>
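<p>The naming scheme itself is the whole trick: derive every resource name and URL from the pull request number, and stacks can never collide. A minimal sketch (the exact prefix format here is illustrative, not the project's actual scheme):</p>

```typescript
// Derive collision-free names for preview-environment resources by
// prefixing with the pull request number. The exact format is
// illustrative, not this project's actual naming scheme.
function previewResourceName(prNumber: number, resource: string): string {
  return `pr-${prNumber}-${resource}`;
}

// Preview environments also get a unique URL under the test child zone.
function previewUrl(prNumber: number, baseDomain: string): string {
  return `https://pr-${prNumber}.${baseDomain}`;
}
```

<p>Because every name is a pure function of the PR number, the teardown workflow can reconstruct exactly which stack to destroy when the pull request closes.</p>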
<p>The combined developer / security mod involved securing the software supply chain of the project. As part of this mod, I investigated signing my Lambda code. I was surprised to discover that CloudFormation / CDK did not have a native feature supporting this. To achieve the desired result, I needed to create a custom CDK resource that calls the AWS Signer API to create a signing job for my lambda code. I then used the S3 object written by the signing job as the code asset for my lambda function. This way I could sign the lambda for the back-end API at deploy time and fail the deployment if the code isn't signed. I found it challenging to figure out how custom resources work in the CDK and was excited when I got it working without resorting to a third-party package (which is just as well, because I couldn't find one for dotnet!).</p>
<h2 id="heading-conclusion">Conclusion</h2>
<p><img src="https://cdn.hashnode.com/res/hashnode/image/upload/v1688634336303/4a85960e-7d43-4b8d-ade2-287b3bf0e7b6.png" alt="Architecture of the cloud resume challenge site" class="image--center mx-auto" /></p>
<p>It was fun working through the challenge and I was stretched by many of the mods. The project allowed me to apply the knowledge from my recent certifications, dabble in some new technologies and get reacquainted with some old ones. I found the experience worthwhile and would recommend it to both new and seasoned technologists alike.</p>
]]></content:encoded></item></channel></rss>