DevSecOps with AWS- IaC at scale - Building your own platform – Part 3 - Pipeline as a Service

avelez

Alejandro Velez

Posted on November 30, 2024


Level 300

In the previous posts in this series, DevSecOps with AWS - IaC at scale - Building your own platform - Part 1 and DevSecOps with AWS - IaC at scale - Building your own platform - Part 2 - CI for IaC, you learned how to create the CI pipeline and reviewed the key questions and practices for enabling self-service capabilities around it. In the following example, you will walk through a use case that enables those self-service capabilities using the pipeline-of-pipelines approach, combined with Service Catalog capabilities, to deliver a Pipeline as a Service.

So, let's build it as a platform engineer. But first, a short overview of the solution:

Solution Overview

Now, review the solution presented in the previous blog post and focus on creating a SaaS model to deploy your standard pipelines into other DevSecOps accounts, using CDK Pipelines for this approach. A future post will enable the full self-service capabilities; for now, the focus is on the platform's foundations.

DevSecOps IaC pipeline of pipelines using CDK Pipelines

You need to follow these steps:
1. Construct the pipeline of pipelines using CDK Pipelines.
2. Test the pipeline of pipelines by creating a pipeline that deploys into a sandbox account.
3. Approve changes to promote them to the production environments.
4. Enable self-service capabilities to add new projects to the pipeline of pipelines.
5. Beyond that, create a self-service portal following an IDP (Internal Developer Portal) approach.

Requirements

• cdk >= 2.167.1
• AWS CLI >= 2.7.0
• Python >= 3.10.4

AWS Services

Hands On ☺️

First, create a CDK Pipelines stack and a deployment stage:

from aws_cdk import (
    Stage,
    Environment,
    Tags
)
from constructs import Construct

from ...cdkv2_dev_sec_ops_ia_c_terraform_stack import Cdkv2DevSecOpsIaCTerraformStack


class PipelineStageProd(Stage):
    """Deployment stage that wraps the core DevSecOps IaC pipeline stack for a single project."""

    def __init__(self, scope: Construct, id: str, props: dict = None,
                 compliance_buildef: dict = None,
                 terra_plan_buildef: dict = None,
                 kics_buildef: dict = None,
                 tfsec_buildef: dict = None,
                 checkov_buildef: dict = None,
                 testing_buildef: dict = None,
                 deployment_buildef: dict = None,
                 env_devsecops_account: Environment = None,
                 environments: dict = None,
                 tags: dict = None, **kwargs):
        super().__init__(scope, id, **kwargs)

        # Instantiate the core pipeline stack for this project
        core_stack = Cdkv2DevSecOpsIaCTerraformStack(self, f"CoreDevSecOpsIaCTerraformStack-{props['project_name']}",
                                                     stack_name=f"CoreDevSecOpsIaCTerraformStack-{props['project_name']}",
                                                     props=props,
                                                     compliance_buildef=compliance_buildef,
                                                     tfsec_buildef=tfsec_buildef,
                                                     checkov_buildef=checkov_buildef,
                                                     testing_buildef=testing_buildef,
                                                     terra_plan_buildef=terra_plan_buildef,
                                                     kics_buildef=kics_buildef,
                                                     deployment_buildef=deployment_buildef,
                                                     env_devsecops_account=env_devsecops_account,
                                                     environments=environments
                                                     )

        # Propagate the project tags to every resource in the stack
        for key, value in tags.items():
            Tags.of(core_stack).add(key=key, value=value)


Now, from the main stack, add the stages based on the project properties defined in YAML files. Here a common question arises about managing parallel deployments and multiple projects with this approach.

How can we deploy many stacks in parallel when we have many projects?

The answer is the CDK Pipelines waves capability, which is useful in several scenarios:

  1. Parallel Deployments:
    • Deploy multiple stages simultaneously to different regions or accounts.
    • Reduce overall pipeline execution time by running independent deployments in parallel.
    • Deploy the same application to multiple environments concurrently.

  2. Deployment Organization:
    • Group related deployments logically.
    • Create deployment waves for different environments (dev, test, prod).
    • Organize deployments by business units or applications.

  3. Controlled Progressive Rollouts:
    • Deploy to a subset of regions/accounts first.
    • Implement progressive deployment patterns.
    • Add validation steps between waves for safety.

  4. Resource Optimization:
    • Control concurrent deployments to manage resource usage.
    • Balance deployment speed with system load.
    • Optimize pipeline execution costs.

  5. Dependency Management:
    • Group stages that can run independently.
    • Separate dependent deployments into sequential waves.
    • Manage complex deployment orchestration.

While stages within a wave run in parallel, the waves themselves execute sequentially. This allows for controlled progression through your deployment pipeline while maximizing efficiency where parallel execution is beneficial.
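
As a minimal sketch of how waves group stages (the stage variables and names here are hypothetical; the real pipeline-of-pipelines stack later in this post shows the concrete usage):

# Illustrative fragment inside a CDK Pipelines stack.
# Stages inside a wave deploy in parallel; the waves themselves run one after the other.
wave_test = pipeline.add_wave("TestWave")
wave_test.add_stage(app_stage_us_east_1)   # app_stage_* are cdk Stage instances defined elsewhere (hypothetical)
wave_test.add_stage(app_stage_us_west_2)   # deployed in parallel with the stage above

wave_prod = pipeline.add_wave("ProdWave")  # starts only after every stage in TestWave has finished
wave_prod.add_pre(pipelines.ManualApprovalStep("ApproveProd"))  # optional validation step between waves
wave_prod.add_stage(app_stage_prod)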

Second, test the stack with the parameters. In this case, all definitions in the project configs folder are loaded, and one pipeline is created for each parameter set (team or project definition).

Each project's properties are loaded with a Python function, and a custom class abstracts the project configuration.

import os
from typing import Dict, Any, Optional
from aws_cdk import Environment
from .utils import load_yamls, load_yamls_to_dict, load_tags, find_yaml_files


class ProjectConfiguration:
    """Class to manage project configuration and environments"""

    def __init__(self, props_path: str, dirname: Optional[str] = None):
        """
        Initialize ProjectConfiguration

        Args:
            props_path (str): Path to the properties file
            dirname (str, optional): Base directory path. If None, uses the directory of this class
        """
        self.dirname = dirname if dirname is not None else os.path.dirname(__file__)
        self.props_path = props_path
        self.props = self._load_properties()
        self.environments = {}
        self.build_definitions = {}

        self._setup_environments()
        self._load_tags()
        self._load_build_definitions()

    def _load_properties(self) -> Dict[str, Any]:
        """Load main properties from YAML file"""
        full_path = os.path.join(self.props_path)
        return (load_yamls(full_path))[0]

    def _setup_environments(self) -> None:
        """Setup environment configurations"""
        self.devsecops_env = Environment(
            account=self.props['account_devsecops'],
            region=self.props['region_devsecops']
        )

        self.props["def_environments"] = {}
        for env_config in self.props["environments"]:
            self._process_environment(env_config)

        self.props["environments"] = self.environments

    def _process_environment(self, env_config: Dict[str, Any]) -> None:
        """Process individual environment configuration"""
        env_name = env_config["environment"]

        # Create CDK Environment
        self.environments[env_name] = Environment(
            account=env_config['deployment_account'],
            region=env_config['deployment_region']
        )

        # Basic environment configuration
        self.props["def_environments"][env_name] = {
            "deployment_region": env_config["deployment_region"],
            "deployment_account": env_config["deployment_account"],
            "enable": env_config["enable"],
            "manual_approval": env_config["manual_approval"],
            "peer_review_approval": env_config["peer_review_approval"]
        }

        # Test environment specific configuration
        if env_name == "test":
            self.props["def_environments"][env_name].update({
                "automate_testing": env_config["automate_testing"],
                "test_workspace": env_config["test_workspace"]
            })

        # Optional partner review email
        if "partner_review_email" in env_config:
            self.props["def_environments"][env_name]["partner_review_email"] = env_config["partner_review_email"]

    def _load_tags(self) -> None:
        """Load tags from YAML file"""
        tags = (load_yamls(os.path.join(self.props_path)))[1]['tags']
        self.props["tags"] = load_tags(tags)

    def _load_build_definitions(self) -> None:
        """Load all build definitions"""
        definition_mappings = {
            'compliance': 'path_def_compliance',
            'terra_plan': 'path_def_create_artifacts',
            'tfsec': 'path_def_tfsec',
            'checkov': 'path_def_checkov',
            'kics': 'path_def_kics',
            'testing': 'path_def_testing',
            'pipeline_cd': 'path_def_deploy'
        }

        for key, path_key in definition_mappings.items():
            self.build_definitions[key] = load_yamls_to_dict(
                os.path.join(self.dirname, self.props[path_key])
            )

    def get_environment(self, env_name: str) -> Environment:
        """Get specific environment configuration"""
        return self.environments.get(env_name)

    def get_build_definition(self, definition_type: str) -> Dict[str, Any]:
        """Get specific build definition"""
        return self.build_definitions.get(definition_type, {})


# Example usage with different configuration files
def get_project_configs():
    print("Loading project configs ...")
    dirname = os.path.dirname("./project_configs/environment_options/")
    dirname_pipes = os.path.dirname("./project_configs/pipeline_definitions")
    projects = find_yaml_files(dirname, "environment_options_terragrunt")

    project_configs = []
    for p in projects:
        # Create an instance for each project configuration file found
        conf = ProjectConfiguration(dirname=dirname_pipes, props_path=p)
        project_configs.append(conf)
    return project_configs


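For reference, this is roughly what one of those project definition files can look like, matching the keys that ProjectConfiguration reads. This is only a sketch: the file name follows the naming convention shown later, and the account IDs, regions, paths, and tag values are placeholders:

# project_configs/environment_options/environment_options_terragrunt_sandbox.yaml (illustrative values only)
project_name: terragrunt-sandbox
account_devsecops: "111111111111"
region_devsecops: us-east-1

# Paths (relative to project_configs/pipeline_definitions) to the build definitions
path_def_compliance: compliance_definition.yaml
path_def_create_artifacts: terraform_plan_definition.yaml
path_def_tfsec: tfsec_definition.yaml
path_def_checkov: checkov_definition.yaml
path_def_kics: kics_definition.yaml
path_def_testing: testing_definition.yaml
path_def_deploy: deployment_definition.yaml

environments:
  - environment: test
    deployment_account: "222222222222"
    deployment_region: us-east-1
    enable: true
    manual_approval: false
    peer_review_approval: false
    automate_testing: true
    test_workspace: test
  - environment: prod
    deployment_account: "333333333333"
    deployment_region: us-east-1
    enable: true
    manual_approval: true
    peer_review_approval: true
    partner_review_email: partner@example.com
---
# Second YAML document in the same file: tags applied to the project's stacks
tags:
  Environment: sandbox
  Project: terragrunt-sandbox

Note the second YAML document: _load_tags reads the tags from the second document returned by load_yamls.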

Third, test in sandbox and add the appropriate approval steps.

Here is the code fragment for this deployment. Two waves are created according to the reference diagram: the first deploys a sandbox template, and after approval the changes pass to the production environments.

from aws_cdk import (
    Stack,
    Environment,
    Aws,
    aws_codecommit as codecommit,
    pipelines
)
from constructs import Construct
from .load_project_configs import get_project_configs
from .stages.deploy_pipeline_stack import PipelineStageProd


class Cdkv2DSOTerraformPipelineStack(Stack):

    def __init__(self, scope: Construct, construct_id: str, props: dict = None,
                 **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)


        repo = props['infra_repository']["name"]
        synth = pipelines.ShellStep(
            "Synth",
            input=pipelines.CodePipelineSource.connection(
                repo_string=f"{props['infra_repository']['owner']}/{repo}",
                code_build_clone_output=True,
                branch=props['infra_repository']['branch'],
                connection_arn=f"arn:aws:codestar-connections:{Aws.REGION}:{Aws.ACCOUNT_ID}:connection/{props['infra_repository']['connection_id']}"
            ),
            commands=[
                "npm install -g aws-cdk",           # Install the CDK CLI on CodeBuild
                "pip install -r requirements.txt",  # Install the required packages
                "pip install checkov",
                "npx cdk synth",
            ]
        )

        pipeline = pipelines.CodePipeline(self, f"CDKPipeline-{props['project_name']}",
                                          self_mutation=True,
                                          cross_account_keys=True,
                                          synth=synth,
                                          pipeline_name=repo,
                                          )

        # Waves: stages within a wave deploy in parallel; the waves themselves run sequentially
        sandbox_wave = pipeline.add_wave("DeploySandbox")
        prod_wave = pipeline.add_wave("DeployProd")

        # Manual approval gate at the end of the sandbox wave, before the prod wave starts
        sandbox_wave.add_post(pipelines.ManualApprovalStep("PromoteToProd",
                                                           comment="Ready to apply to prod deployments"))
        projects = get_project_configs()
        for project in projects:
            project_name = project.props.get("project_name")
            print(project_name)

            stage_id = f"Deploy-{project_name}"
            # create wave
            deploy_stage = PipelineStageProd(self, stage_id, stage_name=stage_id,
                                             props=project.props,
                                             compliance_buildef=project.get_build_definition("compliance"),
                                             tfsec_buildef=project.get_build_definition("tfsec"),
                                             checkov_buildef=project.get_build_definition("checkov"),
                                             testing_buildef=project.get_build_definition("testing"),
                                             terra_plan_buildef=project.get_build_definition("terra_plan"),
                                             kics_buildef=project.get_build_definition("kics"),
                                             deployment_buildef=project.get_build_definition("pipeline_cd"),
                                             env=project.devsecops_env,  # env_client_prd_account,
                                             env_devsecops_account=project.devsecops_env,
                                             environments=project.environments,
                                             tags=project.props.get("tags")
                                             )  # env=env_client_prd_account)
            if project_name.endswith("sandbox"):
                sandbox_wave.add_stage(
                    deploy_stage,

                )
            else:
                prod_wave.add_stage(
                    deploy_stage
                )




You can modify it according to your requirements.
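
To put the pipeline of pipelines to work, the CDK app entry point only needs to instantiate this stack with the repository properties used by the Synth step. Here is a minimal sketch, assuming a hypothetical app.py and module path; the repository owner, connection ID, and account values are placeholders:

#!/usr/bin/env python3
# app.py - hypothetical entry point; module path, repository values, and account are placeholders
import aws_cdk as cdk

from cdk_pipelines.cdkv2_dso_terraform_pipeline_stack import Cdkv2DSOTerraformPipelineStack

app = cdk.App()

Cdkv2DSOTerraformPipelineStack(
    app, "DSOTerraformPipelineOfPipelines",
    props={
        "project_name": "pipeline-of-pipelines",
        "infra_repository": {
            "name": "dso-iac-pipelines",
            "owner": "my-org",
            "branch": "main",
            "connection_id": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
        },
    },
    env=cdk.Environment(account="111111111111", region="us-east-1"),
)

app.synth()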

The AWS console view is something like this:

Pipeline of pipelines AWS 1

Pipeline of pipelines AWS 2

Finally, you are ready to deploy more projects. The project structure allows you to add one file per project under this path, and the CDK app automatically extends itself and creates the new pipelines:


tree project_configs/environment_options/
project_configs/environment_options/
├── environment_options_advanceiac_v2.yaml
├── environment_options_ecs_fargate_pattern.yaml
├── environment_options_project_1.yaml
├── environment_options_project_2.yaml
├── environment_options_project_3_networking.yaml
├── environment_options_project_4_shared.yaml
├── environment_options_terragrunt_blueprint.yaml
├── environment_options_terragrunt_sandbox.yaml
└── environment_options_another_project.yaml

1 directory, 9 files


Thanks for reading and sharing. If you want to know more and get the code, please follow me; at the end of this series I will share the final version as an open source project.
