[AWS Experiment] 7 - STS:AssumeRole with AWS SDK


Sunbeom Kweon (Ben)

Posted on June 13, 2022


1. What is AssumeRole?

"Returns a set of temporary security credentials that you can use to access AWS resources that you might not normally have access to. These temporary credentials consist of an access key ID, a secret access key, and a security token." (Amazon Web Services)

2. But why would we ever want to use it?

Every request a client sends can be captured in transit. For example, when we perform a GET request, we may include credentials in the URL query string, and anyone who can see that URL will be able to steal the credentials.

Using AssumeRole to obtain temporary credentials for a specific role is useful when you want to protect your long-term credentials. It is much more secure to generate temporary credentials that expire quickly, so that even if they are captured by someone else, that person only has a short window of time to access the content before the credentials expire.

Using temporary credentials is essentially the same idea as using access and refresh tokens. What makes the AssumeRole API different is that an IAM user can acquire temporary credentials for "other roles" that have more, or different, permissions than the user currently has.
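As a minimal sketch of that call using the AWS SDK for JavaScript (v2), where the role ARN and session name are placeholders:

const AWS = require("aws-sdk");
const sts = new AWS.STS();

async function getTemporaryCredentials() {
    const { Credentials } = await sts.assumeRole({
        RoleArn: "arn:aws:iam::123456789012:role/s3-experiment-role", // placeholder ARN
        RoleSessionName: "experiment-session",
        DurationSeconds: 900 // the temporary credentials expire after 15 minutes
    }).promise();
    // Credentials holds AccessKeyId, SecretAccessKey, SessionToken, and Expiration,
    // which can be used in place of the user's long-term keys until they expire.
    return Credentials;
}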


3. Experiment


Prerequisites

  • An IAM user with the sts:AssumeRole permission. The policy's resource should be the ARN of the role that the user wants to assume (see the policy sketch after this list).

  • An IAM role that the user wants to assume. Ideally, the role has permissions that the user does not currently have. The role must list the user as one of its trusted entities.

  • Any AWS service that you want the assumed role to access. I used Amazon S3.
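Both policies below are minimal sketches; the account ID, user name, and role name are placeholders. The first is the identity policy attached to the IAM user:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "sts:AssumeRole",
      "Resource": "arn:aws:iam::123456789012:role/s3-experiment-role"
    }
  ]
}

The second is the trust policy on the role, naming the user as a trusted entity:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::123456789012:user/experiment-user" },
      "Action": "sts:AssumeRole"
    }
  ]
}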

Backend codes
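The route handlers below reference sts, stsParams, the S3 client helpers, and a few Node modules without showing their setup. A minimal sketch of what they assume looks like this (the role ARN is a placeholder); note that the STS call uses the v2 SDK's callback style, while the S3 client and pre-signer come from the v3 SDK:

// Assumed setup for the routes below (not shown in the original post)
const express = require("express");
const multer = require("multer");
const path = require("path");
const fs = require("fs");
const fetch = require("node-fetch");           // or the global fetch on Node 18+
const AWS = require("aws-sdk");                // v2 SDK, used here for STS
const { S3Client, GetObjectCommand, PutObjectCommand,
        HeadBucketCommand, HeadObjectCommand } = require("@aws-sdk/client-s3"); // v3 SDK
const { getSignedUrl } = require("@aws-sdk/s3-request-presigner");

const router = express.Router();
const sts = new AWS.STS();
const stsParams = {
    RoleArn: "arn:aws:iam::123456789012:role/s3-experiment-role", // placeholder ARN
    RoleSessionName: "s3-experiment-session",
    DurationSeconds: 900
};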

Get an item from S3

router.get("/item", (req, res) => {
    if (req.query === undefined) 
        return res.json({error:"Invalid QueryStrings"});

    const { bucket, key, region } = req.query;
    if (bucket === undefined || region === undefined || key === undefined) 
        return res.json({error:"Invalid QueryStrings"});

    // STS AssumeRole token
    sts.assumeRole(stsParams, async function(error, data) {
         if (error) return res.status(error.statusCode).json({error:error.message});
         else{
            try { 
                const { Credentials: { AccessKeyId, SecretAccessKey, SessionToken } } = data;
                const bucketParams = {
                    Bucket: bucket
                }
                const getObjectParams = {
                    Bucket: bucket, // The name of the bucket. For example, 'sample_bucket_101'.
                    Key: key
                };
                const credentials = {
                    accessKeyId: AccessKeyId,
                    secretAccessKey: SecretAccessKey,
                    sessionToken: SessionToken
                };
                let s3Client = new S3Client({ region, credentials });

                // Check
                await doesExistBucketOrObject({s3Client, bucketParams, objectParams:getObjectParams});

                // Returning Pre-signed URL
                const getObjCommand = new GetObjectCommand(getObjectParams);
                const url = await getSignedUrl(s3Client, getObjCommand, { expiresIn: 2400 });
                return res.json({url, success:true});

            } catch (error) {
                // error.statusCode may be undefined for non-AWS errors, so fall back to 500
                return res.status(error.statusCode || 500).json({error: error.message, success:false});
            }
        }
    });
});
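The helper doesExistBucketOrObject is not shown above; a minimal sketch of what it might look like, using the v3 HeadBucket/HeadObject commands, is:

// Hypothetical helper: throws if the bucket or the object cannot be reached
async function doesExistBucketOrObject({ s3Client, bucketParams, objectParams }) {
    await s3Client.send(new HeadBucketCommand(bucketParams)); // fails if the bucket is missing or inaccessible
    await s3Client.send(new HeadObjectCommand(objectParams)); // fails if the key does not exist
}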

Upload a file using multer & S3

const upload = multer({ dest: 'uploads/' });

router.post("/item", upload.single("files"),(req, res) => {
    if (req.body === undefined || req.body === null)
        return res.json({error:"Invalid QueryParameters: req.body"});

    const { bucket, key, region } = req.body;
    if (bucket === undefined || region === undefined || key === undefined) 
        return res.json({error:"Invalid QueryParameters"});
    if (req.file === undefined)
        return res.json({error:"No file attached"})

    const { filename, path:filepath } = req.file;
    console.log(filename, filepath);
    const realpath = path.join(__dirname.split("src")[0], filepath);
    const realfile = fs.readFileSync(realpath);
    // STS AssumeRole token
    sts.assumeRole(stsParams, async function(error, data) {
        if (error) return res.status(error.statusCode).json({error:error.message});
        else{
            try{

                // STS
                const { Credentials: { AccessKeyId, SecretAccessKey, SessionToken } } = data;
                const credentials = {
                    accessKeyId: AccessKeyId,
                    secretAccessKey: SecretAccessKey,
                    sessionToken: SessionToken
                };

                const putObjectParams = { 
                    Bucket: bucket, 
                    Key: key
                }

                let s3Client = new S3Client({ region, credentials });
                const putObjectCommand = new PutObjectCommand(putObjectParams);
                const signedUrl = await getSignedUrl(s3Client, putObjectCommand, { expiresIn: 2400 });
                const response = await fetch(signedUrl, {method: 'PUT', body: realfile});
                console.log("Successfully uploaded the object");
                return res.status(response.status).json({success:true})
            } catch(error) {
                // Error objects do not serialize well to JSON, so return the message instead
                return res.status(error.statusCode || 500).json({error: error.message, success:false})
            }
        }
    });
});
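For completeness, a hypothetical browser-side call to the upload route (assuming the router is mounted at the application root; the bucket, key, and region values are placeholders):

// Hypothetical client-side usage; field names must match upload.single("files") and req.body above
const fileInput = document.querySelector('input[type="file"]');
const form = new FormData();
form.append("files", fileInput.files[0]);
form.append("bucket", "sample_bucket_101");
form.append("key", "photos/example.png");
form.append("region", "us-east-1");

fetch("/item", { method: "POST", body: form })
    .then(res => res.json())
    .then(console.log);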

As you can see, I did not interact with S3 directly using my own AWS credentials. Instead, I first obtained temporary credentials from STS, and with those credentials I interacted with S3 to generate pre-signed URLs for performing GetObject and PutObject on the bucket.
