Part 1: Pipeline fun with AWS


Katoria H.

Posted on August 17, 2023


Hello techies and future techies! We are back with a new tutorial leveraging AWS DevTools. As the name suggests, DevTools, or Developer Tools, are tools that are most commonly used by individuals operating in the DevOps space. There are so many tools that you can choose from if you’re looking to simplify your overall development life cycle, automate CI/CD pipelines, and of course, enhance and improve developer productivity. For this tutorial, we will be leveraging AWS CodeCommit, Elastic Beanstalk, Secrets Manager, KMS, IAM, and CodePipeline. Let’s dive into some of these services and explain their use cases!

When you hear of AWS CodeCommit, I’m pretty sure that you may immediately think of GitHub, which is very similar in nature. CodeCommit is a fully managed source control service that enables you to host secure and scalable Git repositories. A fully managed AWS service is one that AWS manages end to end, including the infrastructure and resources required to deliver the service. CodeCommit provides version control for your application code, allowing teams to collaborate on software development projects efficiently. Some of the key features include the following:

  • Git-based: CodeCommit supports the popular Git version control system, making it easy to integrate with existing Git workflows and tools.
  • Private Repositories: CodeCommit allows you to create private repositories to protect sensitive code from unauthorized access.
  • Integration: It seamlessly integrates with other AWS services, including CodeBuild, CodeDeploy, and CodePipeline, facilitating a streamlined development and deployment process.

Like many, I automatically think of a service like “Vault” whenever I hear “Secrets Manager.” AWS Secrets Manager is a fully managed service for securely storing, retrieving, and managing sensitive information such as passwords, API keys, and database credentials. It helps you centralize and rotate secrets, ensuring secure access to resources. Some of the key features include the following:

  • Secret Storage: Secrets Manager stores secrets securely in the AWS Cloud, encrypting them at rest and in transit.
  • Automated Rotation: It offers automatic rotation of secrets, reducing the risk of exposure and simplifying the management of credentials. Note that you should ALWAYS consider key rotation when operating in a production environment.
  • Integration: Secrets Manager can be easily integrated with other AWS services and applications, ensuring secure access to resources without exposing sensitive information.
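To make that integration point concrete, here is a quick sketch of how an application or script can pull a secret at runtime with the CLI. The secret name below is a hypothetical placeholder, not one created in this tutorial:

```shell
# Fetch a secret's value at runtime; "my-app/db-credentials" is a
# hypothetical name used purely for illustration
aws secretsmanager get-secret-value \
  --secret-id my-app/db-credentials \
  --query 'SecretString' --output text
```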

Elastic Beanstalk is pretty cool as it abstracts the underlying infrastructure, allowing you to focus on your application. It handles provisioning, scaling, and load balancing, making deployment and management easier. Some of the key features include the following:

  • Easy Rollbacks: If a deployment doesn't go as planned, Elastic Beanstalk allows you to quickly roll back to the previous version
  • Custom Domains & SSL: You can easily map custom domain names to your Elastic Beanstalk environment and configure SSL certificates for secure communication.
  • 2 Environment Types: Elastic Beanstalk offers two environment types: Web Server and Worker. Web Server environments are optimized for web applications, while Worker environments are suitable for background tasks or processing jobs.
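As a rough sketch of the rollback feature, redeploying an earlier application version from the CLI looks something like this (the environment name and version label are placeholders):

```shell
# Redeploy a previously uploaded application version to roll back;
# list available labels first with describe-application-versions
aws elasticbeanstalk update-environment \
  --environment-name my-php-env \
  --version-label v1
```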

And finally, we have AWS CodePipeline, which is also a fully managed continuous integration and continuous delivery (CI/CD) service that automates the end-to-end software release process. It allows you to define, model, and visualize the different stages of your release pipeline. Key features of CodePipeline can be found below:

  • Pipeline Automation: CodePipeline automates the build, test, and deployment stages of your application, enabling continuous integration and delivery of software changes.
  • Flexibility: It supports a wide range of integration options with third-party tools and services, enabling you to customize your CI/CD workflow.
  • Visual Workflow: CodePipeline provides a graphical interface to visualize and monitor the stages of your release process, making it easy to identify bottlenecks and issues.
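The same stage status that the console visualizes can also be pulled from the CLI; a sketch, assuming a pipeline named my-pipeline already exists (we build ours in Part 2):

```shell
# Summarize each stage's most recent execution status
aws codepipeline get-pipeline-state \
  --name my-pipeline \
  --query 'stageStates[].{stage:stageName,status:latestExecution.status}'
```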

The following prerequisites are highly recommended for this tutorial:

  1. An AWS Account
  2. AWS CLI Configured
  3. GitHub Account
  4. Familiarity with Git commands
  5. Familiarity with JSON

Let’s jump into creating a DevSecOps pipeline on AWS using AWS CLI. Please note that this tutorial may slightly differ from the steps taken in a production environment setup.

STEP 1

(1) The very first step is to create a net-new application via the Console for Elastic Beanstalk (this can also be done via the CLI if you prefer). Navigate to Elastic Beanstalk and do the following:

  • Create Application
  • Select ‘Web Server Environment’ under Environment Tier
  • Name your App
  • Select the PHP Platform
  • Leave the Application Code as ‘Sample Code’
  • Select ‘Single Instance’ under Configuration Presets
  • Select Next

Configure your service access, networking, instance scaling, and monitoring and logging to your preferences.
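If you do prefer the CLI, the console steps above can be sketched roughly as follows. The application and environment names are placeholders, and the exact PHP solution-stack label changes over time, so list the current ones first with `aws elasticbeanstalk list-available-solution-stacks`:

```shell
# Create the application, then a single-instance PHP web server environment
aws elasticbeanstalk create-application --application-name my-php-app

aws elasticbeanstalk create-environment \
  --application-name my-php-app \
  --environment-name my-php-env \
  --solution-stack-name "64bit Amazon Linux 2023 v4.1.0 running PHP 8.2" \
  --option-settings Namespace=aws:elasticbeanstalk:environment,OptionName=EnvironmentType,Value=SingleInstance
```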


(2) Secondly, we’re going to create a repository in AWS CodeCommit by running the following command:

aws codecommit create-repository --repository-name <yourreponame>

(3) Next, we need to clone the repo created above, and push our newly created files to the repo, using the following commands:

git clone <repoURL>

Be sure to change into the repo directory once cloned.

(4) To add our app files, we’re going to be locally cloning the repo found here. Now, it’s totally up to you if you’d like to modify those files on your end. Once done, you should see multiple files, such as what I have shown here:


Before pushing your newly created file to the repo created above, be sure that you have confirmed that you have the correct IAM CodeCommit & Git permissions attached to your particular user role. Without these credentials, you will not be able to proceed to the next steps of pushing your content to the repo.
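As a sketch of what that setup can look like (the user name is a placeholder, and your organization may prefer roles or a narrower custom policy):

```shell
# Attach AWS's managed CodeCommit policy to the IAM user doing the pushes
aws iam attach-user-policy \
  --user-name your-user \
  --policy-arn arn:aws:iam::aws:policy/AWSCodeCommitPowerUser

# Let Git authenticate to CodeCommit over HTTPS through the AWS CLI
git config --global credential.helper '!aws codecommit credential-helper $@'
git config --global credential.UseHttpPath true
```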

(5) We’ll now push our newly created file to the repo by using the following git commands:

git add .
git commit -m "Initial commit"
git push origin master

You can verify that the newly created files have been successfully pushed to the repo by visiting AWS CodeCommit in the Console. You should also see a newly created S3 bucket, EC2 instance, CloudWatch Metrics, Auto Scaling Group, and so forth.
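You can also verify from the CLI; a quick sketch, using whatever repository name you chose in step 2:

```shell
# Confirm the repo exists and that the default branch received the push
aws codecommit get-repository --repository-name <yourreponame> \
  --query 'repositoryMetadata.cloneUrlHttp' --output text
aws codecommit list-branches --repository-name <yourreponame>
```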


STEP 2

(1) Because we will also be storing artifacts for the build, we need to generate a new Secrets Manager secret, followed by our KMS key:

aws secretsmanager create-secret \
  --name <name> \
  --description "Database credentials for my application" \
  --secret-string '{"username": "username", "password": "password"}' \
  --query 'ARN' --output text


(2) Now that we have successfully created our secret, let’s execute the command below to generate a net-new customer-managed KMS key. Please note that an AWS managed key, along with an S3 bucket, is provided by default when using CodePipeline. If generated correctly, the output should be a key ID in UUID format (36 characters):

aws kms create-key \
  --description "<description>" \
  --query 'KeyMetadata.KeyId' --output text

(3) And finally, we need to assign the KMS key to the secret from Secrets Manager, using the command below:

aws secretsmanager update-secret \
  --secret-id <yourinfo> \
  --kms-key-id $kmsKeyId
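To double-check that the assignment took effect, you can read the key ID back off the secret; a quick sketch:

```shell
# Should print the ARN/ID of your customer-managed KMS key,
# not the default aws/secretsmanager key
aws secretsmanager describe-secret \
  --secret-id <yourinfo> \
  --query 'KmsKeyId' --output text
```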

(4) Be sure that you've added server-side encryption to your S3 bucket generated by Elastic Beanstalk, using the KMS key that you created above.
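This can be done in the S3 console, or sketched from the CLI like so. The bucket name below follows Elastic Beanstalk's usual naming pattern but is a placeholder here, and `$kmsKeyId` is the key ID from earlier in this step:

```shell
# Default all new objects in the bucket to SSE-KMS with your key
aws s3api put-bucket-encryption \
  --bucket elasticbeanstalk-<region>-<account-id> \
  --server-side-encryption-configuration '{
    "Rules": [{
      "ApplyServerSideEncryptionByDefault": {
        "SSEAlgorithm": "aws:kms",
        "KMSMasterKeyID": "'"$kmsKeyId"'"
      }
    }]
  }'
```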


Okay…before I get too winded here, stay tuned for Part 2 of this tutorial in which we dive into retrieving the region ID, IAM creds, and start building out our pipeline! See ya there!
