Monitor and protect sensitive log data with CloudWatch and Terraform


Natalia Marek

Posted on December 2, 2022


A new feature for CloudWatch Logs was announced on the first day of this year's AWS re:Invent - you can now set up a data protection policy for each CloudWatch log group, which allows you to mask and audit sensitive log data.

In this post I will cover the details of this functionality and walk through how to implement it in the AWS Console and with Terraform.

How does it work?

This new CloudWatch feature will allow you to:

  • mask any information that you deem sensitive
  • audit it by setting a findings destination - this can be an S3 bucket, a Kinesis Data Firehose delivery stream and/or CloudWatch Logs.

After a data protection policy is attached to a log group, any logs added to that log group will be masked, and only users with the logs:Unmask permission will be able to view them.
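As a minimal sketch of granting that permission with Terraform (the policy name and the resource ARN scope here are hypothetical - scope them to your own log groups), an IAM policy allowing logs:Unmask could look like this:

```hcl
# Hypothetical IAM policy granting permission to view unmasked log data.
# Narrow the Resource ARN to the specific log groups in a real setup.
resource "aws_iam_policy" "unmask_example" {
  name = "cloudwatch-logs-unmask-example"

  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [
      {
        Sid      = "AllowUnmask"
        Effect   = "Allow"
        Action   = ["logs:Unmask"]
        Resource = "arn:aws:logs:*:*:log-group:log_group_example:*"
      }
    ]
  })
}
```

Note that users still need the usual read permissions (for example logs:GetLogEvents) on top of logs:Unmask to actually view the unmasked data.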

Making sure that any sensitive data is protected is essential.

How masked data looks in the CloudWatch console

It is important to note that any sensitive information written to a log group prior to setting up the data protection policy will not be detected.

What types of data can you identify?

AWS gives you a choice of 100 data types to choose from, such as:

  • Credentials, such as AWS access keys or private keys
  • Personally Identifiable Information (PII), such as national identification numbers, Social Security numbers, driver's license numbers, passport numbers, addresses, electoral roll numbers, taxpayer identification numbers, etc.
  • Protected Health Information (PHI), such as health insurance numbers, NHS numbers and health insurance information
  • Financial information, such as bank account numbers, credit card numbers and security codes
  • Device identifiers, such as IP addresses

You can find the whole list of data identifiers along with their ARNs here.

How to create sensitive data policy in the AWS Console

In your AWS Console, go to CloudWatch, select Log groups, choose the log group you would like to create the policy for, then open the Data protection section and click on Create policy.

Choose Log group and select Data protection

Now you will be able to select the data identifiers you want to mask and audit - for the purpose of this tutorial, we will choose three: Address, BankAccountNumber-GB and BankAccountNumber-US. Next we can select an audit destination - here we have selected the existing sensitive-data-audit-example-bucket, however this step is optional.
Creating data protection policy

You also have the ability to view what your policy looks like in JSON format. This is the example policy that we have just created:



{
  "Name": "data-protection-policy",
  "Description": "",
  "Version": "2021-06-01",
  "Statement": [
    {
      "Sid": "audit-policy",
      "DataIdentifier": [
        "arn:aws:dataprotection::aws:data-identifier/Address",
        "arn:aws:dataprotection::aws:data-identifier/BankAccountNumber-GB",
        "arn:aws:dataprotection::aws:data-identifier/BankAccountNumber-US"
      ],
      "Operation": {
        "Audit": {
          "FindingsDestination": {
            "S3": {
              "Bucket": "sensitive-data-audit-example-bucket"
            }
          }
        }
      }
    },
    {
      "Sid": "redact-policy",
      "DataIdentifier": [
        "arn:aws:dataprotection::aws:data-identifier/Address",
        "arn:aws:dataprotection::aws:data-identifier/BankAccountNumber-GB",
        "arn:aws:dataprotection::aws:data-identifier/BankAccountNumber-US"
      ],
      "Operation": {
        "Deidentify": {
          "MaskConfig": {}
        }
      }
    }
  ]
}




Next click on Activate data protection, and your newly created policy will mask and audit any logs added to the log group from that point on.

Create data protection policy for CloudWatch with Terraform

Below you can see an example of how to add a data protection policy resource in Terraform.

Here we will set up a policy similar to the one created before - attaching a data protection policy to an existing log_group_example log group, with an S3 bucket as the audit destination and the Address and bank account number data identifiers.




resource "aws_cloudwatch_log_group" "log_group_example" {
  name = "log_group_example"
}

resource "aws_s3_bucket" "sensitive_data_audit-example_bucket" {
  bucket = "sensitive-data-audit-example-bucket"
}

resource "aws_cloudwatch_log_data_protection_policy" "log_group_example" {
  log_group_name = aws_cloudwatch_log_group.log_group_example.name

  policy_document = jsonencode({
    Name    = "data_protection_policy"
    Version = "2021-06-01"

    Statement = [
      {
        Sid = "Audit"
        DataIdentifier = [
          "arn:aws:dataprotection::aws:data-identifier/Address",
          "arn:aws:dataprotection::aws:data-identifier/BankAccountNumber-US",
          "arn:aws:dataprotection::aws:data-identifier/BankAccountNumber-GB",
        ]
        Operation = {
          Audit = {
            FindingsDestination = {
              S3 = {
                Bucket = aws_s3_bucket.sensitive_data_audit-example_bucket.bucket
              }
            }
          }
        }
      },
      {
        Sid = "Redact"
        DataIdentifier = [
          "arn:aws:dataprotection::aws:data-identifier/Address",
          "arn:aws:dataprotection::aws:data-identifier/BankAccountNumber-US",
          "arn:aws:dataprotection::aws:data-identifier/BankAccountNumber-GB",
        ]
        Operation = {
          Deidentify = {
            MaskConfig = {}
          }
        }
      }
    ]
  })
}
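As mentioned earlier, the findings destination does not have to be S3. As a sketch of the alternative (the audit-findings-example log group name is my own placeholder), the Audit operation can instead point at another CloudWatch log group:

```hcl
# Hypothetical variant: deliver audit findings to a second CloudWatch log
# group rather than S3. A Kinesis Data Firehose destination follows the same
# pattern with Firehose = { DeliveryStream = "..." }.
resource "aws_cloudwatch_log_group" "audit_findings" {
  name = "audit-findings-example"
}

# In the policy document above, the Audit operation would then become:
#
#   Operation = {
#     Audit = {
#       FindingsDestination = {
#         CloudWatchLogs = {
#           LogGroup = aws_cloudwatch_log_group.audit_findings.name
#         }
#       }
#     }
#   }
```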




You can read more about CloudWatch data protection policies in the Amazon CloudWatch Logs documentation.
