How to use AWS S3 pre-signed URLs to upload and download files

thesohailjafri

Sohail @chakraframer.com

Posted on March 18, 2024


So, this month we landed a new client for whom we must keep track of distributors' and doctors' orders and sales. The client required that the S3 bucket stay private, with file access allowed only through pre-signed URLs. In this blog, I will show you how to use pre-signed URLs to upload and download files from an AWS S3 bucket while keeping the bucket private.

Prerequisites

  • Basic knowledge of JavaScript
  • Basic knowledge of AWS S3 bucket
  • Basic knowledge of HTTP requests
  • Basic knowledge of Node.js and Express.js

Let's break down the task into smaller steps.

  1. Setting up the backend
  2. Develop a function to generate an AWS S3 pre-signed URL
  3. Configuring AWS S3 bucket
  4. Connecting function to an API endpoint
  5. Setting up the frontend
  6. Connecting frontend to the API

Step 1: Setting up the backend

mkdir backend
cd backend
npm init -y
npm install express aws-sdk
touch index.js

Windows users can run type nul > index.js to create the file.

// index.js
const express = require('express')
const app = express()
const AWS = require('aws-sdk')

app.listen(3000, () => {
  console.log('Server is running on port 3000')
})

Step 2: Develop a function to generate an AWS S3 pre-signed URL

// index.js
const express = require('express')
const app = express()
const AWS = require('aws-sdk')

const s3 = new AWS.S3({
  accessKeyId: process.env.AWS_ACCESS_KEY_ID, // Your AWS Access Key ID
  secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY, // Your AWS Secret Access Key
  region: process.env.AWS_REGION, // Your AWS region
  signatureVersion: 'v4', // This is the default value
})

const bucketName = process.env.AWS_S3_BUCKET_NAME // Your bucket name

const awsS3GeneratePresignedUrl = async (
  path,
  operation = 'putObject', // Default value is putObject, for get use getObject
  expires = 60
) => {
  const params = {
    Bucket: bucketName, // Bucket name
    Key: path, // File name you want to save as in S3
    Expires: expires, // 60 seconds is the default value, change if you want
  }
  const uploadURL = await s3.getSignedUrlPromise(operation, params)
  return uploadURL
}

app.listen(3000, () => {
  console.log('Server is running on port 3000')
})
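It helps to know what this function actually hands back: the generated URL is self-describing, carrying the signature, timestamp, and lifetime as query parameters. This small sketch (not part of the original post; `sampleUrl` and `presignedUrlExpiry` are made-up names, and the signature value is fake) shows how to read when a SigV4 pre-signed URL stops working:

```javascript
// A hypothetical SigV4 pre-signed URL (shape only; the signature is fake).
const sampleUrl =
  'https://my-bucket.s3.ap-south-1.amazonaws.com/file1.txt' +
  '?X-Amz-Algorithm=AWS4-HMAC-SHA256' +
  '&X-Amz-Date=20240318T100000Z' +
  '&X-Amz-Expires=60' +
  '&X-Amz-Signature=abc123'

// The URL stops working at X-Amz-Date plus X-Amz-Expires seconds.
function presignedUrlExpiry(url) {
  const params = new URL(url).searchParams
  const stamp = params.get('X-Amz-Date') // e.g. 20240318T100000Z
  const expires = Number(params.get('X-Amz-Expires'))
  // Convert the compact ISO 8601 stamp into a parseable one.
  const iso = stamp.replace(
    /^(\d{4})(\d{2})(\d{2})T(\d{2})(\d{2})(\d{2})Z$/,
    '$1-$2-$3T$4:$5:$6Z'
  )
  return new Date(Date.parse(iso) + expires * 1000)
}

console.log(presignedUrlExpiry(sampleUrl).toISOString())
// 2024-03-18T10:01:00.000Z
```

This is also why a short `Expires` is safe for direct browser uploads: the signature in the URL is only valid for that window.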

Step 3: Configuring AWS S3 bucket

If we try to call the pre-signed URL from the browser, we will get a CORS (Cross-Origin Resource Sharing) error. To fix this, we need to configure our S3 bucket to allow PUT and GET requests from our front end's origin. To do this, add a CORS configuration to the S3 bucket in the following way:

  • Open the Amazon S3 console at https://console.aws.amazon.com/s3/
  • Search for the bucket you want to configure and click on it
  • Click on the Permissions tab
  • Scroll down to the Cross-origin resource sharing (CORS) section
  • Click on Edit and add the following policy
[
    {
        "AllowedHeaders": [
            "*"
        ],
        "AllowedMethods": [
            "PUT",
            "GET",
            "HEAD"
        ],
        "AllowedOrigins": [
            "*"
        ],
        "ExposeHeaders": []
    }
]
  • Click on Save changes
  • Also make sure that Block public access (bucket settings) is turned on so the bucket itself stays private (optional, but recommended)

Change AllowedOrigins to your front end's origin to make it more secure, or use the wildcard * to allow access from any origin.
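For example, if your front end runs at a hypothetical http://localhost:5173 during development, the restricted configuration would look like this (only AllowedOrigins changes):

```json
[
    {
        "AllowedHeaders": ["*"],
        "AllowedMethods": ["PUT", "GET", "HEAD"],
        "AllowedOrigins": ["http://localhost:5173"],
        "ExposeHeaders": []
    }
]
```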

Step 4: Connecting function to an API endpoint

We will have two endpoints: one to generate a pre-signed URL for uploading a file, and another to generate a pre-signed URL for downloading a file.

// index.js
const express = require('express')
const app = express()
const AWS = require('aws-sdk')

const s3 = new AWS.S3({
  accessKeyId: process.env.AWS_ACCESS_KEY_ID, // Your AWS Access Key ID
  secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY, // Your AWS Secret Access Key
  region: process.env.AWS_REGION, // Your AWS region
  signatureVersion: 'v4', // This is the default value
})

const bucketName = process.env.AWS_S3_BUCKET_NAME // Your bucket name

const awsS3GeneratePresignedUrl = async (
  path,
  operation = 'putObject', // Default value is putObject, for get use getObject
  expires = 60
) => {
  const params = {
    Bucket: bucketName, // Bucket name
    Key: path, // File name you want to save as in S3
    Expires: expires, // 60 seconds is the default value, change if you want
  }
  const uploadURL = await s3.getSignedUrlPromise(operation, params)
  return uploadURL
}

app.get('/generate-presigned-url', async (req, res) => {
  const { path } = req.query
  const uploadURL = await awsS3GeneratePresignedUrl(path, 'putObject', 60)
  res.send({ path, uploadURL })
})

app.get('/download-presigned-url', async (req, res) => {
  const { path } = req.query
  const downloadURL = await awsS3GeneratePresignedUrl(path, 'getObject', 60)
  res.send({ downloadURL })
})

app.listen(3000, () => {
  console.log('Server is running on port 3000')
})

Step 5: Setting up the frontend

For the front end, I am using React.js; you can use any front-end framework of your choice. We will also install axios to make HTTP requests.

npx create-react-app frontend
cd frontend
npm install axios

Step 6: Connecting frontend to the API

// App.js
import React, { useState } from 'react'
import axios from 'axios'

export default function App() {
  const [file, setFile] = useState(null)
  const [uploadURL, setUploadURL] = useState('')
  const [downloadURL, setDownloadURL] = useState('')

  const generatePresignedURL = async (path, type) => {
    if (type === 'upload') {
      const response = await axios.get(
        `http://localhost:3000/generate-presigned-url?path=${path}`,
      )
      setUploadURL(response.data.uploadURL)
    } else {
      const response = await axios.get(
        `http://localhost:3000/download-presigned-url?path=${path}`,
      )
      setDownloadURL(response.data.downloadURL)
    }
  }

  const uploadFile = async (file) => {
    if (!file || !uploadURL) return
    // PUT the file body directly to S3 using the pre-signed URL
    await axios.put(uploadURL, file, {
      headers: { 'Content-Type': file.type },
    })
  }

  return (
    <div>
      <input type="file" onChange={(e) => setFile(e.target.files[0])} />
      <button onClick={() => generatePresignedURL('file1.txt', 'upload')}>
        Generate Upload URL
      </button>
      <button onClick={() => uploadFile(file)}>Upload File</button>
      <button onClick={() => generatePresignedURL('file1.txt', 'download')}>
        Generate Download URL
      </button>
      {downloadURL && (
        <a href={downloadURL} download>
          Download file
        </a>
      )}
    </div>
  )
}

Use Cases

  1. Upload files to the S3 bucket from your front end without exposing your AWS credentials.
  2. Download files from the S3 bucket to your front end without exposing your AWS credentials.
  3. Upload files directly to the S3 bucket from your front end without building an API route to proxy file uploads.

Bonus Code Snippet

// Code to get uploadURL and PUT the file to the S3 bucket using the fetch API.
import axios from 'axios'

const putFileToS3Api = async ({ uploadURL, file }) => {
  try {
    if (!file) throw new Error('No file provided')
    const res = await fetch(uploadURL, {
      method: 'PUT',
      headers: {
        'Content-Type': file.type || 'application/octet-stream',
      },
      body: file,
    })
    return res
  } catch (error) {
    console.error(error)
  }
}

const getUploadUrlApi = async ({ filename }) => {
  try {
    // Update the URL to your API endpoint.
    const res = await axios.get('http://localhost:3000/generate-presigned-url', {
      params: { path: filename },
    })
    return res
  } catch (error) {
    console.error(error)
  }
}

export const uploadFileToS3Api = async ({ file }) => {
  try {
    if (!file) throw new Error('No file provided')
    const generateUploadRes = await getUploadUrlApi({ filename: file.name })
    if (!generateUploadRes.data.uploadURL)
      throw new Error('Error generating pre signed URL')
    const uploadRes = await putFileToS3Api({
      uploadURL: generateUploadRes.data.uploadURL,
      file,
    })
    if (!uploadRes.ok) throw new Error('Error uploading file to S3')
    return {
      message: 'File uploaded successfully',
      uploadURL: generateUploadRes.data.uploadURL,
      path: generateUploadRes.data.path,
    }
  } catch (error) {
    console.error(error)
  }
}

Call the uploadFileToS3Api function with the file you want to upload to the S3 bucket in one go. Additionally, you can use await Promise.all to upload multiple files at once.

const uploadFiles = async (files) => {
  const uploadPromises = files.map((file) => uploadFileToS3Api({ file }))
  const uploadResults = await Promise.all(uploadPromises)
  console.log(uploadResults)
}
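One caveat with Promise.all: it rejects the whole batch on the first failed upload. As a variation (not covered in the original post), Promise.allSettled reports a per-file result instead; in this self-contained sketch, fakeUpload is a hypothetical stand-in for uploadFileToS3Api:

```javascript
// Sketch: upload many files, tolerating individual failures.
// fakeUpload stands in for uploadFileToS3Api so the example runs on its own.
const fakeUpload = async ({ file }) => {
  if (!file.name) throw new Error('No file name')
  return { message: 'File uploaded successfully', path: file.name }
}

const uploadAllSettled = async (files) => {
  const results = await Promise.allSettled(
    files.map((file) => fakeUpload({ file }))
  )
  // Summarise each outcome instead of failing the whole batch.
  return results.map((r, i) =>
    r.status === 'fulfilled'
      ? { file: files[i].name, ok: true }
      : { file: files[i].name, ok: false, reason: r.reason.message }
  )
}

uploadAllSettled([{ name: 'a.txt' }, { name: '' }]).then((summary) =>
  console.log(summary)
)
// logs { file: 'a.txt', ok: true } and { file: '', ok: false, reason: 'No file name' }
```

This keeps one bad file from masking the uploads that succeeded.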

Additional Resource

More information about AWS S3 pre-signed URLs can be found in the official AWS documentation.

Conclusion

That's it! You have successfully learned how to generate pre-signed URLs for uploading and downloading files from an S3 bucket while keeping the bucket private.
I hope you found this blog helpful. If you have any questions, feel free to ask in the comments below or contact me on Twitter @thesohailjafri
