Integrate GitHub monorepo with AWS CodePipeline to run project-specific CI/CD pipelines.


Eloka Chiejina

Posted on June 25, 2023


Introduction

CI/CD (Continuous Integration and Continuous Deployment/Delivery) is an approach to software development that emphasizes automation and frequent integration of code changes. It involves regularly merging code from multiple developers into a shared repository and automatically building and testing the code to ensure its quality. With CI/CD, the focus is on automating the entire process of delivering software, from development to deployment, enabling faster and more reliable releases. By adopting CI/CD practices, organizations can achieve improved code quality, accelerated delivery cycles, and streamlined software deployment. There are different CI/CD tools that organizations use; however, my focus is on AWS CodePipeline.
AWS CodePipeline is a fully managed continuous delivery service in AWS that helps you automate your release pipelines for fast and reliable application and infrastructure updates.

What we want to achieve

Creating a CI/CD pipeline is easy, especially when you are working on a monolithic application or on microservice-based applications with separate GitHub repositories.
It becomes a problem when working with a monorepo (microservice-based applications in which all the services live in one GitHub repository), because there are as many pipelines as there are microservices, so any change in the repository will trigger all the pipelines. This is not what we want.

We want a service's pipeline to be triggered only when there is a change in the corresponding directory in the GitHub repository. We can achieve this with a Jenkins pipeline (using changesets), GitHub Actions (using path filters), etc., but we want to achieve it with AWS CodePipeline.

It is difficult at the moment to achieve this directly with AWS CodePipeline unless you use an external environment like GitHub Actions or Jenkins to trigger the pipeline, but we want everything to be done in AWS.

One way to achieve this is to use AWS API Gateway and a Lambda function to trigger the pipeline.

Method

To achieve this, we will follow six major steps:

  • Set up the AWS Lambda function.
  • Add the StartPipelineExecution permission to the Lambda's role.
  • Set up AWS API Gateway.
  • Set up the webhook URL in the GitHub repository.
  • Configure the Lambda function with the logic to trigger AWS CodePipeline.
  • Disable the change detection option in AWS CodePipeline.

Architecture

Setting up AWS Lambda

Log on to your AWS console and search for Lambda.

Click on `Create Function`.

  • Put the function name as you like and select Python 3.9 as the runtime.

Runtime setting

  • Click on Advanced settings, enable Function URL, and select NONE as the auth type.
  • Click on Create function.

advanced setting

  • Hit the function URL on the Lambda function page and you will get this response ("Hello from Lambda!") in the browser.

Lambda function page

Response from the Function URL
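For context, the "Hello from Lambda!" response comes from the default handler that AWS generates for a new Python function, which looks roughly like this (the exact template may vary slightly between runtimes):

import json

def lambda_handler(event, context):
    # default stub generated by AWS; we will replace this with our own logic later
    return {
        'statusCode': 200,
        'body': json.dumps('Hello from Lambda!')
    }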

Add StartPipelineExecution Permission

The Lambda function will use the StartPipelineExecution API to trigger AWS CodePipeline, so we need to add that permission to the role that was created while setting up the Lambda function.
Go to IAM and select Roles.
Search for the role you created.

Modify the role

Click on the role.

Create Inline Policy

Click on Add permissions.
Select Create inline policy.
Click on JSON, add the policy below, then save.

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowStartPipelineExecution",
            "Effect": "Allow",
            "Action": "codepipeline:StartPipelineExecution",
            "Resource": "*"
        }
    ]
}
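If you prefer to do this from code instead of the console, a minimal boto3 sketch along these lines should attach the same inline policy (the role name github-webhook-lambda-role is a placeholder; use the name of your Lambda's execution role):

import json
import boto3

iam = boto3.client('iam')

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowStartPipelineExecution",
            "Effect": "Allow",
            "Action": "codepipeline:StartPipelineExecution",
            "Resource": "*"
        }
    ]
}

# attach the policy shown above as an inline policy on the Lambda execution role
iam.put_role_policy(
    RoleName='github-webhook-lambda-role',   # placeholder: your Lambda's execution role name
    PolicyName='AllowStartPipelineExecution',
    PolicyDocument=json.dumps(policy)
)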

Setting up AWS API Gateway

On the AWS console, search for API Gateway. We are using an HTTP API.

Select the `HTTP API` option.

  • Click on Build.

configure the api gateway

  • Click on integrations and select Lambda.
  • Select the Lambda function you created earlier, in the correct region.
  • Put in the name of the API.
  • Click Next.

Configure Routes

  • Select POST as the method.
  • Put in the resource path you want. (This will be used when making the request from GitHub.)
  • Select the target (the Lambda function you created).
  • Click on Next.
  • Leave everything else as default and create the API.

api gateway details
Copy the invoke URL.
The webhook URL will be the invoke URL plus the route, something like this: https://lbxy6ibso5.execute-api.eu-west-2.amazonaws.com/github-webhook

Routes
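For reference, the same HTTP API can be quick-created with boto3's apigatewayv2 client. This is only a sketch; the API name, route, and Lambda ARN are placeholders, and unlike the console it does not automatically add the permission that lets API Gateway invoke the Lambda:

import boto3

apigw = boto3.client('apigatewayv2')

# quick-create an HTTP API that proxies POST /github-webhook to the Lambda function
api = apigw.create_api(
    Name='github-webhook-api',           # placeholder name
    ProtocolType='HTTP',
    RouteKey='POST /github-webhook',     # the route GitHub will call
    Target='arn:aws:lambda:eu-west-2:123456789012:function:github-webhook-handler'  # placeholder ARN
)

# the webhook URL is the invoke URL plus the route
print(api['ApiEndpoint'] + '/github-webhook')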

Setting up the webhook URL in the GitHub repository

Go to GitHub and select the repository that holds the monorepo.

Configure the webhook
Click on Settings and select Webhooks.
Click on Add webhook.

Webhook Configuration

Copy the webhook URL you created in API Gateway and set it as the payload URL.
Change the content type to application/json.
Click on Add webhook.

successful ping to api gateway

You can check the logs to be sure the GitHub request reaches the Lambda function.

check the logs
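Note that the first delivery GitHub sends is a ping event, not a push, so it carries no commits. If you want the handler to ignore anything that is not a push, a small check like this should work (assuming the default HTTP API payload format, which passes request headers through to the Lambda event with lower-cased keys):

def is_push_event(event):
    # GitHub labels the delivery type in the X-GitHub-Event header;
    # API Gateway HTTP APIs pass request headers through to the Lambda event
    headers = event.get('headers') or {}
    return headers.get('x-github-event') == 'push'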

Request from GitHub to AWS Lambda

Configure the Lambda function with logic to trigger the AWS CodePipeline.

Let's assume that we are using two AWS CodePipelines named Codepipeline-A and Codepipeline-B.
Update the Lambda function with the code below and click on Deploy.

import boto3
import json

def lambda_handler(event, context):
    print(json.dumps(event))
    payload = json.loads(event['body'])

    # Extract necessary information from the GitHub webhook payload
    repository = payload.get('repository', {}).get('full_name')
    branch = payload.get('ref')
    commits = payload.get('commits')

    # Define the target folders for each microservice
    capture_widget_folder = 'apps/capture-widget'
    login_service_folder = 'apps/login-service'


    # Initialize the AWS CodePipeline client
    codepipeline_client = boto3.client('codepipeline')

    print(
      'repository is:', repository,
      'branch is:', branch,
      'commits is:', commits
    )

    # logic for codepipeline triggering
    triggered = []
    try:
        # Check if the changes include the capture-widget folder
        if repository and commits and any(file.startswith(capture_widget_folder) for commit in commits for file in commit.get('added', []) + commit.get('modified', [])):
            # Trigger the capture-widget CodePipeline
            response = codepipeline_client.start_pipeline_execution(
                name='Codepipeline-A'
            )
            triggered.append('Codepipeline-A')
            print('Triggered Codepipeline-A:', response)

        # Check if the changes include the login-service folder
        if repository and commits and any(file.startswith(login_service_folder) for commit in commits for file in commit.get('added', []) + commit.get('modified', [])):
            # Trigger the login-service CodePipeline
            response = codepipeline_client.start_pipeline_execution(
                name='Codepipeline-B'
            )
            triggered.append('Codepipeline-B')
            print('Triggered Codepipeline-B:', response)

        if not triggered:
            # none of the changed files matched either folder, so no pipeline was started
            print('No matching folder changes; no pipeline triggered')

        message = 'Triggered: ' + ', '.join(triggered) if triggered else 'No pipeline triggered'
        return {
            'statusCode': 200,
            'body': json.dumps(message)
        }
    except Exception as e:
        print('Error:', e)
        raise e

Update Lambda Function
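To sanity-check the handler before wiring everything up, you can invoke it locally with a trimmed-down push payload. This is just a sketch: the repository name and file paths are made up, the module name lambda_function is assumed, and running it will really call StartPipelineExecution if your local AWS credentials allow it.

import json
from lambda_function import lambda_handler  # assumes the handler above lives in lambda_function.py

# minimal stand-in for a GitHub push webhook payload (field names follow GitHub's push event)
fake_event = {
    'body': json.dumps({
        'repository': {'full_name': 'my-org/my-monorepo'},
        'ref': 'refs/heads/main',
        'commits': [
            {'added': [], 'modified': ['apps/capture-widget/src/index.js']}
        ]
    })
}

print(lambda_handler(fake_event, None))  # should start Codepipeline-A only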

You can now make changes in GitHub, and the pipeline will be triggered depending on which folder you changed and which one you configured in the Lambda function.

Remember to disable the change detection option in AWS CodePipeline, so that every push does not also trigger the pipelines directly through their own source stage.

Disable Change Detection Option

Conclusion

This can be achieved in several ways with other CI/CD tools, and you can check the logs in CloudWatch in case you run into issues or errors.

I also know that most people manage their infrastructure with Terraform; this setup can also be implemented with Terraform.
