Part 2: Pipeline fun with AWS
Katoria H.
Posted on August 17, 2023
STEP 3
Welcome back, techies, to Part 2 of Pipelines Galore! We'll be diving into the creation of our code pipeline, retrieving our region ID and IAM creds, and taking it up a notch with additional validation testing. If you have not reviewed Part 1 of this tutorial, please take the time to do so now so that you're able to follow along as we wrap things up!
So we've verified that we have successfully created our Secret via Secrets Manager, as well as generated a net-new KMS customer managed key. To ensure we have the correct region ID and IAM role (these will be needed for the pipeline), let's run the following commands:
aws configure get region
aws iam get-role --role-name <rolename>
Please note that my role also includes a policy that has permissions for Elastic Beanstalk, CloudWatch, SQS, and X-Ray.
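If all you need from that second command is the role ARN (which is what the pipeline definition expects in its roleArn field), you can filter the response with the CLI's --query flag; a small convenience, assuming your CLI is already configured:
aws iam get-role --role-name <rolename> --query 'Role.Arn' --output text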
(1) We will now begin with the creation of our pipeline, which will ONLY include a Source and Deploy stage, though production pipelines will normally include Build and Test stages as well. There are two ways to create your pipeline: write a JSON file with the specific parameters and run the aws codepipeline create-pipeline --cli-input-json file://pipeline.json command once your file is done, OR use the Console, as shown below:
- In the Console, navigate to CodePipeline and select “Create Pipeline”
- Give your pipeline a name and select an existing role or create a new role
- For the Source stage, select your specific provider
- SKIP the Build stage
- For the Deploy stage, add your Elastic Beanstalk details
- Click “Create Pipeline”; the pipeline should take a few minutes to generate
If you'd like to test this out using the CLI instead, feel free to use this JSON sample as a guide for your pipeline (the Source stage below assumes a CodeCommit repository, which is what the RepositoryName and BranchName configuration keys belong to):
{
  "pipeline": {
    "name": "yourpipelinename",
    "roleArn": "arn:aws:iam::123456789012:role/youriamrole",
    "artifactStore": {
      "type": "S3",
      "location": "yourbucketlocation"
    },
    "stages": [
      {
        "name": "Source",
        "actions": [
          {
            "name": "SourceAction",
            "actionTypeId": {
              "category": "Source",
              "owner": "AWS",
              "provider": "CodeCommit",
              "version": "1"
            },
            "configuration": {
              "RepositoryName": "yourrepo",
              "BranchName": "master"
            },
            "outputArtifacts": [
              {
                "name": "yourinput"
              }
            ],
            "runOrder": 1
          }
        ]
      },
      {
        "name": "Deploy",
        "actions": [
          {
            "name": "Deploy",
            "actionTypeId": {
              "category": "Deploy",
              "owner": "AWS",
              "provider": "ElasticBeanstalk",
              "version": "1"
            },
            "configuration": {
              "ApplicationName": "yourinput",
              "EnvironmentName": "yourinput"
            },
            "inputArtifacts": [
              {
                "name": "yourinput"
              }
            ],
            "runOrder": 1
          }
        ]
      }
    ]
  }
}
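If you save the sample above as pipeline.json, creating the pipeline and then checking on its progress from the CLI looks like the following (using the placeholder pipeline name from the sample):
aws codepipeline create-pipeline --cli-input-json file://pipeline.json
aws codepipeline get-pipeline-state --name yourpipelinename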
(2) We can verify that our pipeline and deployment were successful by visiting CodePipeline & our Elastic Beanstalk domain, as seen below:
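If you'd rather run this check from the terminal, the AWS CLI can report the environment's status, health, and URL in one shot (the environment name below is the placeholder from the JSON sample):
aws elasticbeanstalk describe-environments --environment-names yourinput --query 'Environments[0].{Status:Status,Health:Health,URL:CNAME}'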
(3) Now, we can take this up a notch by modifying one of the files created earlier, committing a new change, and then verifying that we can reach our previous Elastic Beanstalk domain, where we should see the newly created changes. We'll push our updated files to the repo using the same git commands as before:
git add .
git commit -m "Commit message"
git push origin master
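Once the push lands, the Source stage should pick up the new commit automatically; you can peek at the most recent execution with the command below (again using the placeholder pipeline name):
aws codepipeline list-pipeline-executions --pipeline-name yourpipelinename --max-items 1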
(4) If you check the screenshots below, you'll notice that our changes to the index.html file were successful, and CloudWatch also gives us some neat metrics:
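If you want to poke at those metrics from the CLI as well, you can list what Elastic Beanstalk publishes to CloudWatch; note this assumes enhanced health reporting is enabled on your environment, which is what populates this namespace:
aws cloudwatch list-metrics --namespace AWS/ElasticBeanstalk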
And that just about wraps it up for this two-part tutorial! I hope you have enjoyed creating your very first pipeline if you're net-new to AWS CodePipeline. Remember, there are multiple ways that you can create a pipeline, and this tutorial just walked through one approach. Stay tuned for more tutorials that are coming your way! Follow me on LinkedIn to be on the lookout for new blogs.