Mastering Cross-Account DynamoDB Access with Terraform and Node.js: A Step-by-step Guide!
Dmytro Harazdovskiy
Posted on May 8, 2023
Introduction
Establishing cross-account access for DynamoDB can be a complex task, particularly when navigating the intricacies of configuring multiple AWS accounts. However, by employing the power of Terraform and Node.js, you can overcome this challenge and create a robust and efficient multi-account setup. In this article, we will guide you through the process of setting up cross-account access for DynamoDB using Terraform and Node.js, enabling you to manage and fetch data across two separate AWS accounts with ease.
The benefits of this setup are numerous. It not only streamlines your project’s resource management but also significantly improves security by segregating resources between accounts. Additionally, it allows for better scalability, enabling your infrastructure to grow and evolve alongside your project’s needs. So, let’s embark on this exciting journey and explore the world of cross-account DynamoDB access!
Prerequisites
Before we begin, it’s crucial to ensure you have the necessary tools and services at your disposal. For this tutorial, you’ll need the following:
Two AWS accounts: a Main account that holds the DDB table, and a Consumer account (it can be blank).
Terraform: We will use Terraform to automate the creation and management of AWS resources. To install Terraform, follow the instructions in the official documentation.
Node.js: Our example application for fetching data from DynamoDB will be built with Node.js. The Serverless Framework will also be used to deploy the app.
This guide assumes you have a basic understanding of AWS components, Terraform, and JavaScript syntax.
Now that you have all the necessary tools and services, let’s move forward and set up our AWS environment for cross-account DynamoDB access.
AWS Setup Overview
Let's review the AWS setup at a higher level before moving on to the details:
- Main account: the account that holds the DDB table we need access to.
- Consumer account: the account that should be granted access to the DDB table in the Main account.
1. Inside the Main AWS account, create a role (CONSUMER_ACCOUNT_ROLE) that allows the Consumer account to assume credentials for the Main account's DDB table.
2. Inside the Main account, create a policy (MAIN_ACCOUNT_DDB_POLICIES_FULL) that allows access to the DDB table.
3. Inside the Main account, attach MAIN_ACCOUNT_DDB_POLICIES_FULL to CONSUMER_ACCOUNT_ROLE.
4. Inside the Consumer account, create a policy that allows the Consumer Lambda to assume temporary credentials using CONSUMER_ACCOUNT_ROLE (a Terraform sketch of this consumer-side policy follows right after this list).
Note that step 1 also creates the trust policy rule that allows the Consumer account to use CONSUMER_ACCOUNT_ROLE in the first place.
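If you manage the Consumer account with Terraform as well, step 4 could look roughly like the sketch below. The resource name is my own choice, and the account IDs and role name are placeholders that must match what the Main account module creates; later in this post we achieve the same thing with a Serverless plugin instead.

```hcl
# Sketch only: Consumer-account policy that lets a Lambda execution role
# call sts:AssumeRole on the Main account's cross-account role.
# <MAIN_ACCOUNT_ID>, <CONSUMER_ACCOUNT_ID> and the role name are placeholders.
resource "aws_iam_policy" "consumer_account_access_policy" {
  name = "dev-consumer-account-access-policy"

  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [
      {
        Effect   = "Allow"
        Action   = "sts:AssumeRole"
        Resource = "arn:aws:iam::<MAIN_ACCOUNT_ID>:role/dev-ddb_cross_acc_access_role-<CONSUMER_ACCOUNT_ID>"
      }
    ]
  })
}
```

This policy would then be attached to the Consumer Lambda's execution role.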
Application Layer overview
Since the example Consumer Lambda can use CONSUMER_ACCOUNT_ROLE via the attached CONSUMER_ACCOUNT_ACCESS_POLICY, we need to issue temporary credentials through the Security Token Service (STS) to access the DDB table.
The credentials returned from the STS call are then used to instantiate a DDB client and query data.
Now that we are familiar with all the moving pieces, the most fun part is left: coding all of this!
Terraform Config
There are 3 things we need to create for the Main account using Terraform:
- aws_iam_role: the role that the Consumer account will use.
- aws_iam_policy: allows the role above to perform specific actions on the DDB table.
- aws_iam_policy_attachment: links the role and policies together.
Such a module could take 3 input parameters:
Consumer account id.
DDB Table name.
Environment name (just in case).
It looks fine for a one-time setup: 1 policy -> 1 attachment -> 1 role. But what if we need Read access to one table and Full access to another?
Should we duplicate this setup again for the other table?
Let's create a module that can handle these cases:
```hcl
# main.tf
resource "aws_iam_role" "ddb_access_role" {
  name        = "${var.env}-ddb_cross_acc_access_role-${var.consumer_account_id}"
  description = "Role that provides access to DynamoDB tables for account ${var.consumer_account_id} on ${var.env} env"

  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [
      {
        Action = "sts:AssumeRole"
        Effect = "Allow"
        Principal = {
          AWS = "arn:aws:iam::${var.consumer_account_id}:root"
        }
      }
    ]
  })
}

resource "aws_iam_policy" "ddb_table_policy" {
  for_each = { for table_access in var.table_access_list : table_access.table_name => table_access }

  name        = "${var.env}-DynamoDBTableAccess-${var.consumer_account_id}-${each.key}"
  description = "Access policy to DynamoDB ${each.key} table for account ${var.consumer_account_id} on ${var.env} env"

  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [
      {
        Action = lookup({
          "Read" = [
            "dynamodb:BatchGetItem",
            "dynamodb:GetItem",
            "dynamodb:Query",
            "dynamodb:Scan"
          ],
          "ReadWriteUpdate" = [
            "dynamodb:BatchWriteItem",
            "dynamodb:DeleteItem",
            "dynamodb:GetItem",
            "dynamodb:PutItem",
            "dynamodb:UpdateItem"
          ],
          "Full" = ["dynamodb:*"]
        }, each.value.access_type, [])
        Effect = "Allow"
        Resource = [
          "arn:aws:dynamodb:*:${data.aws_caller_identity.current.account_id}:table/${each.key}",
          "arn:aws:dynamodb:*:${data.aws_caller_identity.current.account_id}:table/${each.key}/index/*"
        ]
      }
    ]
  })
}

resource "aws_iam_policy_attachment" "ddb_access_policy_attachment" {
  for_each = aws_iam_policy.ddb_table_policy

  policy_arn = each.value.arn
  roles      = [aws_iam_role.ddb_access_role.name]
  name       = "cross_acc_access_policy_attachment-${var.consumer_account_id}-${each.key}"
}

data "aws_caller_identity" "current" {}
```
The IAM role `aws_iam_role.ddb_access_role` that is created has a name with the following format: `[var.env]-ddb_cross_acc_access_role-[var.consumer_account_id]`.
The `assume_role_policy` attribute of the IAM role is set to a JSON-encoded string that defines who is allowed to assume the role. In this case, the policy grants permission to the specified AWS account (`arn:aws:iam::${var.consumer_account_id}:root`) to assume the role. After applying Terraform, you can check out this policy in the Trust relationships tab on the role's page in the IAM console.
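Trusting `arn:aws:iam::${var.consumer_account_id}:root` delegates the decision of who may assume the role to the Consumer account's own IAM policies, which is exactly why step 4 in the overview is needed. If you prefer to trust only one specific principal, you could narrow the `Principal` in the trust policy instead. A sketch, with a hypothetical execution role name, that would replace the `assume_role_policy` in `aws_iam_role.ddb_access_role`:

```hcl
# Sketch: trust a single role in the Consumer account instead of the whole
# account. The role name below is a placeholder for the Lambda execution role.
assume_role_policy = jsonencode({
  Version = "2012-10-17"
  Statement = [
    {
      Action = "sts:AssumeRole"
      Effect = "Allow"
      Principal = {
        AWS = "arn:aws:iam::${var.consumer_account_id}:role/dev-example-consumer-lambda-role"
      }
    }
  ]
})
```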
The IAM policy `aws_iam_policy.ddb_table_policy` provides access to a list of DynamoDB tables specified by the `var.table_access_list` input variable. Each table in the list has an associated `access_type` attribute that specifies what kind of access (`Read`, `ReadWriteUpdate`, `Full`) should be granted.
The policy grants access to tables based on the `access_type` attribute. Specifically, the policy allows certain DynamoDB actions (e.g. `BatchGetItem`, `PutItem`, etc.) on the tables, as defined by the `lookup()` function, which returns the appropriate action list for the corresponding `access_type`.
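One side effect of the `lookup()` fallback is that an unrecognized `access_type` silently resolves to an empty `Action` list, which only surfaces as an error when the policy is actually created. A `validation` block on the `table_access_list` variable (a sketch of what could be added to the variable in `variables.tf`, shown further below) catches such typos at plan time instead:

```hcl
# Sketch: optional validation for the table_access_list variable so that an
# unsupported access_type fails at `terraform plan` rather than later.
variable "table_access_list" {
  description = "List of DynamoDB table names and access types"
  type = list(object({
    table_name  = string,
    access_type = string
  }))

  validation {
    condition = alltrue([
      for t in var.table_access_list : contains(["Read", "ReadWriteUpdate", "Full"], t.access_type)
    ])
    error_message = "access_type must be one of Read, ReadWriteUpdate, or Full."
  }
}
```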
The `for_each` attribute is used to create a separate IAM policy for each table in the list. Each policy has a name with the following format: `[var.env]-DynamoDBTableAccess-[var.consumer_account_id]-[each.key]`.
Each policy needs a unique name per table and per account, so that we can easily reuse the module for different accounts and environments without worrying about name collisions.
The IAM policy attachment `aws_iam_policy_attachment.ddb_access_policy_attachment` attaches the IAM policies to the IAM role created earlier. An attachment is created for each IAM policy created as part of the `aws_iam_policy.ddb_table_policy` resource block.
Finally, the `data.aws_caller_identity.current` data source is used to retrieve the AWS account ID of the account Terraform is being applied to (the Main account). The account ID is used in the IAM policy resources to build the table ARNs for that specific account.
Don’t forget about variables.tf
variable "consumer_account_id" {
type = string
description = "The AWS account ID of the account which gains access."
}
variable "env" {
type = string
description = "Environment name"
}
variable "table_access_list" {
description = "List of DynamoDB table names and access types"
type = list(object({
table_name = string,
access_type = string #allowed values are Read, ReadWriteUpdate, and Full
}))
}
Usage
Go to your Main account terraform setup and use the module we just created:
```hcl
#.....
module "ddb_cross_acc_access" {
  source = "path/to/module"

  table_access_list = [
    {
      table_name  = "my-table-1"
      access_type = "ReadWriteUpdate"
    },
    {
      table_name  = "my-table-2"
      access_type = "Read"
    }
  ]

  consumer_account_id = "123456789012"
  env                 = "dev"
}
#.....
```
Now, as you can see, we can specify exactly which access level the Main account grants on its DDB tables to the Consumer account!
Node.js Application Setup
1. You can use the awesome serverless-iam-roles-per-function Serverless plugin to create a policy per function that grants access to STS:

```yaml
ConsumerAccountExampleFunctions:
  handler: src/handler.handler
  name: ${self:provider.stage}-${self:service}-consumerAccountExampleFunctions
  iamRoleStatements:
    - Effect: 'Allow'
      Action:
        - 'sts:AssumeRole'
      Resource:
        - 'arn:aws:iam::${env:MAIN_ACCOUNT_ID}:role/${env:ENVIRONMENT}-ddb_cross_acc_access_role-${env:AWS_ACCOUNT_ID}'
```
It's a good security practice to specify a policy per function, since it gives us strict control over which external resources each Lambda has access to.
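The `${env:...}` references above are resolved by Serverless at deploy time. The handler code shown later reads the same values from `process.env` at runtime, so they also have to be exported as Lambda environment variables; a minimal sketch, assuming you pass them through `provider.environment` in `serverless.yml`:

```yaml
# serverless.yml (sketch): make the account/environment values available to
# the function at runtime via process.env.
provider:
  name: aws
  environment:
    MAIN_ACCOUNT_ID: ${env:MAIN_ACCOUNT_ID}
    ENVIRONMENT: ${env:ENVIRONMENT}
    AWS_ACCOUNT_ID: ${env:AWS_ACCOUNT_ID}
```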
2. Obtain temporary credentials using the STS service:
```ts
// get-sts-creds.ts
import { STS } from 'aws-sdk';

type CredentialsParams = {
  RoleArn: string; // ARN of the cross-account role to assume
  RoleSessionName: string; // Arbitrary name for the assumed-role session
};

export const getCredentials = async (params: CredentialsParams) => {
  const stsClient = new STS();

  // Assume the IAM role of the account that owns the target resource
  const assumedRoleObject = await stsClient
    .assumeRole({
      RoleArn: params.RoleArn,
      RoleSessionName: params.RoleSessionName,
    })
    .promise();

  // Get temporary credentials
  const credentials = {
    accessKeyId: assumedRoleObject?.Credentials?.AccessKeyId || '',
    secretAccessKey: assumedRoleObject?.Credentials?.SecretAccessKey || '',
    sessionToken: assumedRoleObject?.Credentials?.SessionToken || '',
  };

  if (!credentials.accessKeyId || !credentials.secretAccessKey || !credentials.sessionToken) {
    throw new Error('Credentials are not defined!');
  }

  return credentials;
};
```
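The credentials returned by `assumeRole` are temporary (sessions last about an hour by default), and every STS call adds latency, so it can make sense to reuse them across warm Lambda invocations. A sketch built on top of `getCredentials` above; the file name, refresh margin, and helper are my own choices:

```ts
// cached-creds.ts (sketch): reuse assumed-role credentials across warm Lambda
// invocations instead of calling STS on every request.
import { getCredentials } from './get-sts-creds';

type TemporaryCredentials = {
  accessKeyId: string;
  secretAccessKey: string;
  sessionToken: string;
};

// Assumption: assume-role sessions last about an hour by default, so
// refreshing after 45 minutes leaves a safety margin.
const REFRESH_AFTER_MS = 45 * 60 * 1000;

let cached: { credentials: TemporaryCredentials; fetchedAt: number } | undefined;

export const getCachedCredentials = async (roleArn: string, roleSessionName: string): Promise<TemporaryCredentials> => {
  const now = Date.now();

  if (!cached || now - cached.fetchedAt > REFRESH_AFTER_MS) {
    const credentials = await getCredentials({ RoleArn: roleArn, RoleSessionName: roleSessionName });
    cached = { credentials, fetchedAt: now };
  }

  return cached.credentials;
};
```

In the handler below, `getCachedCredentials` could be swapped in for `getCredentials` without changing anything else.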
3. Use the credentials to perform cross-account DDB calls:

```ts
import { DocumentClient } from 'aws-sdk/clients/dynamodb';
import { getCredentials } from './get-sts-creds';

const tableName = 'ExampleDDBTableName';

// AWS_ACCOUNT_ID = Consumer account id
const roleArn = `arn:aws:iam::${process.env.MAIN_ACCOUNT_ID}:role/${process.env.ENVIRONMENT}-ddb_cross_acc_access_role-${process.env.AWS_ACCOUNT_ID}`;
const roleSessionName = 'my-session-name';

export const handler = async () => {
  const credentials = await getCredentials({ RoleArn: roleArn, RoleSessionName: roleSessionName });

  const ddbClient = new DocumentClient({
    credentials
  });

  const params = { TableName: tableName };
  const data = await ddbClient.scan(params).promise();

  return data;
};
```
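The handler above uses `scan`, which reads the entire table and is mostly fine for a demo. For targeted lookups you would normally use `query` against the table's key; a sketch that assumes a hypothetical partition key attribute called `pk`:

```ts
import { DocumentClient } from 'aws-sdk/clients/dynamodb';

// Sketch: targeted read instead of a full table scan, assuming a partition
// key attribute called "pk" (adjust to your table's schema).
export const queryByPk = async (ddbClient: DocumentClient, tableName: string, pk: string) => {
  const result = await ddbClient
    .query({
      TableName: tableName,
      KeyConditionExpression: 'pk = :pk',
      ExpressionAttributeValues: { ':pk': pk },
    })
    .promise();

  return result.Items ?? [];
};
```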
Summary
Now you know how to create a Terraform module for granular cross-account DDB permissions:
On the Main account, create a role with a trusted relationship to the Consumer account.
On the Main account, create a policy with the DDB access permissions and attach it to the role.
On the Consumer account, create a policy that allows using the Main account's role.
Use the STS client in your application to obtain temporary credentials from the Main account's role.
Perform DDB queries with the granted credentials!