Following the second part of this series, in this post you will add controls to your deployments to align them with the NIST framework and AWS best practices: the shift-left security principle, implemented through techniques such as unit testing and SAST for infrastructure code, and the shift-right security principle, using CloudFormation Hooks.
Remember that the goal is to create a simple stack, for demo purposes an S3 bucket. You will also find the reports and improve the notification options using channels such as Slack. If you want, you can download the code and extend the functionality, for example by sending findings to AWS Security Hub or integrating CodeGuru, but you will find more demos and tips in other posts.
Hands On
Requirements
cdk >= 2.43.0
AWS CLI >= 2.7.0
Python >= 3.10.4
Pytest >= 7.1.3
cdk-nag >=2.18.44
checkov >= 2.1.229
AWS Services
AWS Cloud Development Kit (CDK): an open-source software development framework to define your cloud application resources using familiar programming languages.
AWS CodeBuild: fully managed continuous integration service that compiles source code, runs tests, and produces software packages that are ready to deploy.
AWS CodeCommit: secure, highly scalable, managed source control service that hosts private Git repositories.
AWS CodePipeline: fully managed continuous delivery service that helps you automate your release pipelines for fast and reliable application and infrastructure updates.
AWS Key Management Service (AWS KMS): lets you create, manage, and control cryptographic keys across your applications and more than 100 AWS services.
AWS CloudFormation: speed up cloud provisioning with infrastructure as code.
Figure 1 shows the steps to accomplish this task. It shows a cross-account pipeline using AWS CodePipeline, AWS CodeCommit, AWS CodeBuild, and AWS CloudFormation as the principal AWS services, plus other services such as Security Hub, CodeGuru, AWS Chatbot, and AWS CodeStar to improve vulnerability management and the developer experience. In this solution, the first step is to create a secure CDK project using cdk-nag, defining unit tests and running local SAST before pushing to the pipeline. Next, the pipeline runs the following steps:
1. The changes are detected and the pipeline is triggered. For this demo, master is the default branch.
2. The CDK project is synthesized if it is aligned with AWS security best practices.
3. The pipeline runs its self-update action.
4. The unit tests run, and their report is published to a CodeBuild report group.
5. The SAST runs, and its report is published to a CodeBuild report group.
6. The CloudFormation stack is prepared for the development environment.
7. The CloudFormation stack is deployed to the development environment after a successful result from CloudFormation Hooks.
8. To move between environments, a manual approval step is added; the notification is sent to a Slack channel.
9. The CloudFormation stack is prepared for the staging environment.
10. The CloudFormation stack is deployed to the staging environment after a successful result from CloudFormation Hooks.
Step by Step
Follow the steps to set up the CDK project shown in the second part of this series, or just clone version 1.0 of the GitHub code if you want to add the code step by step.
This is a CDK project with Python for creating a multi-account AWS deployment.
Solution Overview
Solution Overview – Enhanced CDK Pipeline
The figure shows the steps to accomplish this task. It also shows a cross-account pipeline using AWS CodePipeline, AWS CodeCommit, AWS CodeBuild, and AWS CloudFormation. But how can you construct this pipeline securely and with minimum effort? The answer: [CDK Pipelines](https://docs.aws.amazon.com/cdk/v2/guide/cdk_pipeline.html).
This pipeline is a default pipeline composed of the following steps:
1. The changes are detected and the pipeline is triggered. For this demo, master is the default branch.
2. The CDK project is synthesized if it is aligned with AWS security best practices.
3. The pipeline runs its self-update action.
4. The unit tests run, and their report is published to a CodeBuild report group.
5. The SAST…
You can add unit tests for your infrastructure constructs, but what kind of test cases should you create?
When you use infrastructure as code with a high level of abstraction, through tools such as CDK or Pulumi, you enter the world of infrastructure as software: the infrastructure is now treated as another application, and the software development cycle must also be applied to its construction and operation. The first step is to use BDD or TDD for the components, but a small component can be a single piece of a big puzzle. You should build your tests around stacks or micro-stacks, not around single components.
In this scenario, the following code shows a unit test that uses fine-grained assertions to verify that server-side encryption with AES256 is enabled for the S3 bucket.
First, create a test file `test/unit/test_simple_s3_stack.py`:
```python
import aws_cdk as core
import aws_cdk.assertions as assertions

from ...src.stacks.simple_s3_stack import SimpleS3Stack
from ...project_configs.helpers.project_configs import props

# example tests. To run these tests, uncomment this file along with the example
# resource in src/stacks/simple_s3_stack.py


def test_s3_bucket_created():
    app = core.App()
    props["storage_resources"]["s3"][0]["environment"] = 'dev'
    stack = SimpleS3Stack(app, "SimpleS3Stack",
                          props=props["storage_resources"]["s3"][0])
    template = assertions.Template.from_stack(stack)

    template.has_resource_properties("AWS::S3::Bucket", {
        "BucketEncryption": {
            "ServerSideEncryptionConfiguration": [
                {"ServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}
            ]
        }
    })
```
Of course, the cdk-pipeline can be tested in the same way.
Now, the unit test step should be created in the pipeline, in the file src/pipeline/cdk_pipeline_multienvironment_stack.py. For that, a CodeBuild step runs the pytest command and exports the report in JUnit XML format to the report group Pytest-Report.
Here the step is created, along with the report group to publish results_junitxml.xml. Notice that the input for this stage is the cdk synth output generated in the previous stage.
The next step is to add the step to the deployment stage for the development environment.
If you don't correct the findings, your pipeline will be broken. If a finding is a false positive, or the requirement doesn't apply to your environment, you can add the --soft-fail flag to the checkov command so the step passes: change `checkov -d . -o junitxml --output-file .` to `checkov -d . -o junitxml --output-file . --soft-fail`.
Finally, check the differences and push your changes.
Figures 4 and 5 show the evidence: first the SAST step in the CodePipeline panel, and second the report group created in CodeBuild.
Figure 4. SAST step Fail
Figure 5. Report Group SAST using checkov.
Enhancing the practice
So far you have applied traditional practices, but what kind of tools can you use to improve this process?
The answer: cdk-nag and CloudFormation Hooks.
cdk-nag: checks CDK applications or CloudFormation templates for best practices using a combination of available rule packs. Inspired by cfn_nag.
CloudFormation Hooks: a hook contains code that is invoked immediately before CloudFormation creates, updates, or deletes specific resources. Hooks can inspect the resources that CloudFormation is about to provision. If hooks find any resources that don't comply with your organizational guidelines, they can prevent CloudFormation from continuing the provisioning process. A hook includes a hook specification represented by a JSON schema, and handlers that are invoked at each hook invocation point.
Characteristics of hooks include:
Proactive validation – Reduces risk, operational overhead, and cost by identifying noncompliant resources before they're provisioned.
Automatic enforcement – Provide enforcement in your AWS account to prevent noncompliant resources from being provisioned by CloudFormation.
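As a quick preview (the type name, target, and configuration property below are hypothetical examples, not part of this project), a hook's JSON schema might look like this:

```json
{
  "typeName": "Demo::S3::EncryptionHook",
  "description": "Verifies that S3 buckets define server-side encryption before provisioning.",
  "typeConfiguration": {
    "properties": {
      "minimumAlgorithm": { "type": "string", "default": "AES256" }
    },
    "additionalProperties": false
  },
  "handlers": {
    "preCreate": { "targetNames": ["AWS::S3::Bucket"], "permissions": [] },
    "preUpdate": { "targetNames": ["AWS::S3::Bucket"], "permissions": [] }
  },
  "additionalProperties": false
}
```

The handlers section is what binds the hook to the preCreate and preUpdate invocation points for the targeted resource type.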
Including cdk-nag in the CDK project
First, the requirements must be updated in requirements.txt.
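For example, requirements.txt could now contain something like the following (pins match the versions in the requirements section; the exact list is an assumption and may differ in your project):

```text
aws-cdk-lib>=2.43.0
constructs>=10.0.0
cdk-nag>=2.18.44
pytest>=7.1.3
```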
Now, install the requirements using pip install -r requirements.txt.
Second, modify the aspects of each stage to include cdk-nag before the synth process, for example in the stage that deploys the S3 bucket, in the file src/pipeline/stages/deploy_app_stage.py.
```
$ cdk synth
[Error at /CdkPipelineMultienvironmentStack/DeployDev/SimpleS3Stack/multi-env-demo/Resource] AwsSolutions-S1: The S3 Bucket has server access logs disabled. The bucket should have server access logging enabled to provide detailed records for the requests that are made to the bucket.
[Error at /CdkPipelineMultienvironmentStack/DeployDev/SimpleS3Stack/multi-env-demo/Resource] AwsSolutions-S2: The S3 Bucket does not have public access restricted and blocked. The bucket should have public access restricted and blocked to prevent unauthorized access.
[Error at /CdkPipelineMultienvironmentStack/DeployStg/SimpleS3Stack/multi-env-demo/Resource] AwsSolutions-S1: The S3 Bucket has server access logs disabled. The bucket should have server access logging enabled to provide detailed records for the requests that are made to the bucket.
[Error at /CdkPipelineMultienvironmentStack/DeployStg/SimpleS3Stack/multi-env-demo/Resource] AwsSolutions-S2: The S3 Bucket does not have public access restricted and blocked. The bucket should have public access restricted and blocked to prevent unauthorized access.
Found errors
```
Notice that some findings are the same as those obtained in SAST using checkov, but in this case the process runs before the code is synthesized. The results are duplicated because the stage is used twice with different parameters.
Let's try to correct these findings.
Edit the stack construct to add a best practice for AwsSolutions-S2 (The S3 Bucket does not have public access restricted and blocked) in the file src/stacks/simple_s3_stack.py.
Just add `block_public_access=s3.BlockPublicAccess.BLOCK_ALL`:
```python
from aws_cdk import (
    Stack,
    aws_s3 as s3,
    CfnOutput,
    RemovalPolicy,
)
from constructs import Construct


class SimpleS3Stack(Stack):

    def __init__(self, scope: Construct, construct_id: str,
                 props: dict = None, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)
        # The code that defines your stack goes here
        bucket = s3.Bucket(
            self,
            id=props["bucket_name"],
            bucket_name=f'{props["bucket_name"]}-{props["environment"]}',
            versioned=True if props["versioned"] == "enable" else None,
            enforce_ssl=True,
            encryption=s3.BucketEncryption.S3_MANAGED,
            removal_policy=RemovalPolicy.DESTROY,
            # Best practices
            block_public_access=s3.BlockPublicAccess.BLOCK_ALL,
        )
        # Define outputs
        CfnOutput(self, id="S3ARNOutput", value=bucket.bucket_arn,
                  description="Bucket ARN")
```
For demo purposes, the rule AwsSolutions-S1 (The S3 Bucket has server access logs disabled) will be suppressed. For this, in the file src/pipeline/stages/deploy_app_stage.py the nag suppression is added to the stack.
To give greater depth to CloudFormation Hooks, their use and capabilities will be demonstrated in detail in future posts.
Conclusions
Use IAM Identity Center and a multi-account schema for your deployments, aligned with the AWS Security Reference Architecture (SRA) and the AWS Well-Architected Framework.
Apply the DRY principle and enable your team with the minimal tools that abstract best practices and increase the agility and velocity of delivery.
Apply the security shift-left principle: use common practices such as SAST and unit testing to validate stack properties and best practices, and enforce them before the code is synthesized using cdk-nag.
Use new features such as CloudFormation Hooks for proactive validation and automatic enforcement as security shift-right practices.
Use other services such as AWS Security Hub, AWS Chatbot, and Amazon CodeGuru to improve the user experience and vulnerability management.
Expand the features progressively and don't try to put everything together in the first step. Having a secure and robust pipeline requires time, practice, and continuous improvement.