How I completed the GCP edition of the Cloud Resume Challenge in AWS in under one week

I completed the Cloud Resume Challenge in a week. Yes, just one week. Was it fun? Absolutely not. Was it worth it? Without a doubt. At every step of this challenge, the urge to quit was relentless, and my own mind was doing most of the pushing. Believe me, even with all the AI tools, blog posts, and Google searches, it was still a tough journey.

I won't dissect and over-explain the Cloud Resume Challenge in this article; there are already over six hundred blog posts doing that. Instead, you'll find the crazy conversations I had with myself throughout this challenge. So, fasten your seatbelt, because it will be a wild ride.

What is the Cloud Resume Challenge?

I know I said earlier that I wouldn't delve into the details of this challenge, but I'm aware that some folks out there might've never heard of it before. I was one of those people just a few months ago. So, I'll provide some insight into what this challenge is all about.

Forrest Brazeal created the Cloud Resume Challenge to offer cloud engineers, especially those new to the field, a hands-on opportunity to use cloud services to construct and deploy their resumes. Participants are expected to have their resumes up and running on various cloud services provided by their preferred cloud provider (based on their chosen edition).

This demanding and focused challenge pushes participants to acquire new skills, think critically, and build their resumes with minimal guidance. Trust me when I say this challenge won't spoon-feed you like a typical tutorial; instead, it immerses you in the deep end and leaves you to sink or swim.

Despite its difficulty level, this challenge has existed for a while, and numerous engineers continue to take part and complete it. This is because, over the years, the challenge has proven to be a reliable way for cloud engineers to advance their careers. The curator, Forrest, refers to it as a "baptism by fire," meaning you tackle a problem for so long that the knowledge becomes deeply ingrained in your mind.

Despite the challenge's inherent difficulty, I thought it would be a walk in the park. I had experience using AWS, but my guidebook was for the GCP edition. So, I decided to tackle this challenge using AWS by adapting all the GCP instructions to their AWS counterparts. At some points during the challenge, I was on the brink of giving up out of sheer confusion, but I persevered. There's nothing a little ChatGPT and some strategic Googling can't help with.

I mentioned earlier that I wouldn't delve into the challenge's specifics, but let me be clear: it's not as easy as it might seem. When I first picked up the guidebook for this challenge, I thought, "Oh, this will be a breeze." Turns out, I was mistaken.

Despite a year in the DevOps field, a diploma in cloud computing, two additional years as a frontend developer, and a computer science degree, nothing in my previous experience prepared me for the mental challenges I was about to face.

Mental discussions from each chunk

As the subheading suggests, this section will reveal the personal conversations that took place in my mind while I tackled this challenge. Yes, I started talking to myself. Was I losing my mind? Well, you be the judge after dealing with CORS errors for two consecutive days, deciphering error messages from over fifty failed pipeline jobs, and grappling with other challenges that cropped up during this journey.

The Cloud Resume Challenge is divided into five chunks to make the workload more manageable. While five may sound small, the challenge's requirements span sixteen steps spread across them. So much for a lighter workload, right?

In the same way that the challenge is broken down into sections, I'll break down every ordeal I faced into sections, each escalating in craziness. Remember, I did warn you that this would be a wild ride.

Chunk zero: Do I need this cert?

After reading Chunk Zero, I pondered this question for a while. Not because I couldn't afford a certificate, which is true, but because I genuinely believe that certificates don't automatically reflect one's technical abilities. Currently, my focus is on growing my skills, which is why I took up this challenge in the first place.

In this particular chunk, there was something that grabbed my attention. It wasn't essential to take the certificate exam. Instead, I could follow the resources meant for the certificate, and, bam, I could achieve the goal. And that's exactly what I did.

Fortunately, KodeKloud had a free week around the same time, for which I was extremely thankful. I swiftly signed up and enrolled in the course of my choice: the Google Cloud Digital Leader certification course. I completed this 4-hour course in six days and obtained a certificate of completion.

You might recall that I decided to do this challenge in AWS, yet here I am pursuing a GCP certification course. This is because I still wanted to learn about GCP even though AWS is my strong suit. This knowledge would prove valuable in helping me map a GCP service to its AWS equivalent. So, in a sense, I did this challenge in both AWS and GCP simultaneously.

The ultimate goal of earning a certificate is to validate that you've acquired a certain level of knowledge. While I didn't obtain the certificate by taking the main exam, I did acquire all the prerequisite knowledge for certificate holders. I'm content with that, at least for now.

*Figure 1. Chunk zero description as seen in my notes*

Chunk one: It's just HTML and CSS...I've got this!

As soon as I realized that this chunk involved HTML and CSS, I couldn't help but smirk. I told myself, "I'm a pro at this, so I'll breeze through this part." Initially, crafting the resume's structure and applying CSS for the design felt like a walk in the park, but the challenge didn't end there.

The objective of this chunk is to create the resume site using HTML and CSS and deploy it. Focused on showcasing my frontend skills, I almost overlooked this chunk's crucial "deploy" aspect. Only after finishing the site did I realize I needed to address the deployment.

The requirement for the resume's deployment was to use Google Cloud Storage, employ Cloud CDN for site caching, and utilize Cloud DNS to give my resume a stylish website URL like https://aahil-resume.com. Okay, I started to doubt if I had this under control. I had to step up and take it seriously.

Since I completed the challenge using AWS, my first task was to convert the GCP services into their AWS counterparts. This wasn't too challenging, especially after I took the Google Cloud Digital Leader course. The AWS equivalents are S3 for Cloud Storage, CloudFront for Cloud CDN, and Route 53 for Cloud DNS.

After a few hours, I set up the S3 and CloudFront components. Now, it was time to tackle Route 53. This part required me to purchase a domain name, either from a third-party domain provider or Route 53. Unfortunately, I couldn't complete this step, so I'll be stuck with a URL like https://d6imlgw8zi3aw.cloudfront.net from my CloudFront distribution.

*Figure 2. Chunk one description as seen in my notes*

Chunk two: Permissions can be a pain in the ass

Why do services within the same cloud provider, like AWS, require permissions to communicate with each other? Can someone explain this to me? My brain was in turmoil as I kept jumping between the Lambda and IAM consoles to fix a persistent permissions error.

This chunk's task was to construct the application's backend using a Lambda function and DynamoDB. But that's not all; I also had to implement an API Gateway that would receive requests from my resume and interact with DynamoDB. In simpler terms, I built an AWS serverless API that updates a visitor count in a DynamoDB table.
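For anyone curious what that serverless counter can look like, here is a minimal sketch of the kind of Lambda handler involved. The table key, attribute names, and response shape are illustrative placeholders, not my actual configuration, and the DynamoDB client is injected so the logic can be exercised without touching AWS:

```javascript
// Minimal visitor-counter Lambda sketch. Assumes a DynamoDB table with a
// partition key "id" and a numeric "count" attribute (placeholder names).
function makeHandler(table) {
  // `table` is anything exposing a DynamoDB-style update() call, so the
  // real SDK document client can be swapped in (or a stub, for testing).
  return async function handler(event) {
    const result = await table.update({
      Key: { id: "visitors" },
      UpdateExpression: "ADD #c :one", // atomic increment, no read-modify-write race
      ExpressionAttributeNames: { "#c": "count" },
      ExpressionAttributeValues: { ":one": 1 },
      ReturnValues: "UPDATED_NEW",
    });
    return {
      statusCode: 200,
      headers: { "Access-Control-Allow-Origin": "*" },
      body: JSON.stringify({ count: result.Attributes.count }),
    };
  };
}

// In the deployed function you would wire in the real client, e.g. the
// DynamoDBDocument client from @aws-sdk/lib-dynamodb, and export the handler.
module.exports = { makeHandler };
```

The `ADD` update expression is what makes the counter safe under concurrent visits; incrementing in application code after a read would lose updates.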

After several days of grappling with errors, taking multiple walks, and a bit of prayer, I managed to create the lambda function, set up the API Gateway, and establish my DynamoDB. However, a recurring permissions error from my API Gateway and DynamoDB kept haunting me.

This error tormented me for hours until I resolved it by attaching a policy that allows reading and updating my DynamoDB table to my API Gateway's role. Sounds straightforward. But you'll understand the challenge's complexity when even a minor mistake, like using the wrong attribute name in your Lambda function, makes you question your suitability for a career in cloud engineering.
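For reference, the kind of policy that unblocks this looks roughly like the following. The region, account ID, and table name are placeholders, not the values from my setup:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["dynamodb:GetItem", "dynamodb:UpdateItem"],
      "Resource": "arn:aws:dynamodb:us-east-1:123456789012:table/resume-visitors"
    }
  ]
}
```

Scoping `Action` and `Resource` this tightly follows least privilege; attaching a broad `AmazonDynamoDBFullAccess` policy would also have worked, but it grants far more than the API needs.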

You can't imagine the sheer excitement that coursed through me when my API responded with a 200 and my Lambda function actually updated the count in DynamoDB. Yes! AWS should hire me now!!

*Figure 3. Chunk two description as seen in my notes*

Chunk three: What the hell is CORS?

In all my years of frontend development, I never felt the need to grasp Cross-Origin Resource Sharing (CORS). Even though I'd often encountered it in my web browser's developer console, I never paid much attention to it. However, in this chunk, it became a real headache.

Initially, this part seemed like a breeze because all I had to do was connect the frontend (HTML, CSS) with the backend (Lambda, DynamoDB, API Gateway). I thought, "Since they worked well individually, they should work together seamlessly. I mean, what could go wrong?" Well, that turned out not to be the case.

I created an async function that would call my API, retrieve the visitor count, and display it on my resume while updating the value in my DynamoDB table. It might sound like a lot, but it was pretty straightforward. A bit of JavaScript here, some CSS there, and I thought I was all set. I eagerly checked my browser to see the result, and nothing happened.
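The wiring was roughly like the sketch below. The API URL, element id, and `{ count: N }` response shape are assumptions for illustration, not my actual values, and the fetch function is a parameter so the logic can run outside a browser:

```javascript
// Placeholder endpoint -- substitute your own API Gateway invoke URL.
const API_URL = "https://example.execute-api.us-east-1.amazonaws.com/visitors";

// Call the counter API and return the updated visitor count.
// Assumes the API responds with JSON shaped like { "count": <number> }.
async function updateVisitorCount(fetchFn = fetch) {
  const response = await fetchFn(API_URL, { method: "POST" });
  if (!response.ok) {
    throw new Error(`API call failed with status ${response.status}`);
  }
  const { count } = await response.json();
  return count;
}

// On the resume page itself, drop the value into the DOM:
// updateVisitorCount().then((c) => {
//   document.getElementById("visitor-count").textContent = c;
// });
```

This is exactly the kind of cross-origin request the browser blocks when the API's allowed origins don't include the page's URL, which is where my trouble began.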

As any frontend developer would, I quickly opened my browser's developer console to check the error message, and there it was:

*Figure 4. The CORS error that held me bound for days*

What on earth is this? I spent hours trying to debug it and kept encountering the same error. After hours of searching on Google and using ChatGPT, I turned to YouTube. There, I found an excellent tutorial that explained CORS errors and provided a simple solution: adding the URL of my environment to the list of allowed origins. And just like that, it worked! I immediately shut down my computer for the day.

*Figure 5. Chunk three description as seen in my notes*

Chunk four: I have PPTSD

I've coined the term "PPTSD," short for "Pipeline PTSD," to describe my fear of CI/CD pipelines. Remember when I mentioned going through more than fifty failed pipeline jobs? Well, this chunk is responsible for that ordeal.

The goal of this chunk was rather straightforward: Automation/CI. I typically enjoy these two aspects of the DevOps role. Or at least I thought I did. My task was to write the Terraform configuration to provision all my resources (S3, Lambda, CloudFront, API Gateway, and DynamoDB). After that, I had to deploy it using two separate GitHub Actions pipelines, one for the frontend and one for the backend.
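The backend pipeline's skeleton looked something like the fragment below. The workflow name, paths, directory layout, and secret names are illustrative, not copied from my repository:

```yaml
# Sketch of a Terraform deploy workflow (placeholder names and paths).
name: backend-deploy
on:
  push:
    branches: [main]
    paths: ["backend/**", "infra/**"]

jobs:
  terraform:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: hashicorp/setup-terraform@v3
      - name: Terraform init and apply
        working-directory: infra
        run: |
          terraform init
          terraform apply -auto-approve
        env:
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
```

The catch is that the IAM credentials behind those secrets need every permission Terraform exercises, which is how errors like a missing s3:PutBucketPolicy show up only in CI and never in local runs under your own admin-ish credentials.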

Writing and testing the configuration locally was a breeze, and I quickly progressed. However, when I reached the CI part, things took a turn for the worse. Errors that never appeared during local testing suddenly emerged out of nowhere. Out of the blue, I needed an s3:PutBucketPolicy permission. I read the logs, re-ran jobs, consulted ChatGPT, and still faced the same error.

After two days and a sleepless night, I stumbled upon a StackOverflow Q&A that provided the exact solution I needed, and that was just for the backend pipeline. I had to go through another round of debugging for my frontend pipeline when it came time to run the smoke test using Cypress. Fortunately, that only took a few hours.

By the time I finished dealing with all these pipeline issues, I had developed a new habit. Whenever I pushed a git commit to trigger a build, I'd walk away from my screen to avoid witnessing another build failure. I had become genuinely scared.

Chunk five: I'm burnt out!

By the time I completed the technical part of this challenge, I had reached a state of burnout. The final chunk required me to write a blog post about my experience. I thought, "This should be easy; I've written numerous blog posts before." However, I was mistaken. My burnout was so intense that I couldn't even write a word. I needed to take a break for a few days before I could tackle this article.

Since you're reading this now, I've completed the challenge. I won't sugarcoat it; it was a tough journey, but I'm grateful I didn't quit. I'm grateful I didn't let my mind conquer me, didn't let the errors get the better of me, didn't let the pipeline best me, and didn't let the CORS and permissions errors prevail. I'm grateful I didn't let the challenge emerge victorious.

Conclusion

If you've read this far, I have no doubt you'll want to take on this challenge yourself. Whether it's to experience the career-boosting potential it offers or to see what I've been rambling about, you'll be intrigued.

When you decide to take it on, remember that every step in this challenge is essential. Every error, setback, and even the moments of frustration and yelling—they all play a crucial role. Just keep pushing through. If you succeed, the sky's the limit; if you face challenges, keep trying!

If you've done the math to tally up the number of days I spent on this challenge, I'll save you the trouble. It was well over a week, but my repository's first commit tells me it all started a week ago, so let's go with that.

I wish I could write more, but this is already quite lengthy. If you enjoyed reading this, please let me know and give a shoutout to AmbassadorLabs on my behalf. Tell them I'd love to interview with them!

Oh, I've been rambling and actually forgot to show you my resume. My bad! You can check it out right here.

And if you're keen on diving into the code, it's available here. Enjoy!