How to upload blobs to Azure Storage from Next.js
kurab
Posted on March 2, 2022
Summary
- Create Next.js + TypeScript + Prisma App
- Upload blobs such as mp4 to Azure Storage
- Give a unique name *1
- Save it to Database as well
*1
The URI to reference a container or a blob must be unique. Because every account name is unique, two accounts can have containers with the same name. However, within a given storage account, every container must have a unique name. Every blob within a given container must also have a unique name within that container.
...
The Blob service is based on a flat storage scheme, not a hierarchical scheme.
via Naming and Referencing Containers, Blobs, and Metadata
Tech
- Next.js
- Prisma
- PostgreSQL
- TypeScript
package.json is as follows.
{
  ...
  "dependencies": {
    "@azure/storage-blob": "^12.8.0",
    "@prisma/client": "^3.9.2",
    "axios": "^0.25.0",
    "next": "12.0.10",
    "react": "17.0.2",
    "react-dom": "17.0.2",
    "react-hook-form": "^7.27.0",
    "uuid": "^8.3.2"
  },
  "devDependencies": {
    "@types/axios": "^0.14.0",
    "@types/node": "17.0.17",
    "@types/react": "17.0.39",
    "@types/uuid": "^8.3.4",
    "eslint": "8.9.0",
    "eslint-config-next": "12.0.10",
    "prisma": "^3.9.2",
    "typescript": "4.5.5"
  }
}
Create Next.js app
- Place an upload file form on the top page
- Use react-hook-form to handle the form
- Prisma ORM + PostgreSQL
- Create get/register APIs and call them with axios
Install Next.js and modules
$ yarn create next-app azure-storage --typescript
$ cd azure-storage
$ yarn add @prisma/client react-hook-form axios
$ yarn add @types/axios prisma --dev
PostgreSQL with Docker
docker-compose.yml
version: "3.8"
services:
  db:
    image: "postgres:12"
    ports:
      - "5432:5432"
    volumes:
      - ./pgdata:/var/lib/postgresql/data
    environment:
      - POSTGRES_USER=${DB_USER}
      - POSTGRES_PASSWORD=${DB_PASS}
      - POSTGRES_DB=${DB_NAME}
.env
DB_NAME=sample
DB_USER=johndoe
DB_PASS=randompassword
$ docker-compose up -d
Prisma
$ npx prisma init
.env
+ DATABASE_URL="postgresql://johndoe:randompassword@localhost:5432/sample?schema=public"
prisma/schema.prisma
generator client {
  provider = "prisma-client-js"
}

datasource db {
  provider = "postgresql"
  url      = env("DATABASE_URL")
}

model Item {
  id        Int      @id @default(autoincrement())
  name      String
  createdAt DateTime @default(now())

  @@map(name: "items")
}
$ npx prisma generate
$ npx prisma migrate dev
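After the migration, the generated client exposes the Item model as prisma.item. Just to illustrate what that gives you, a throwaway check script (hypothetical, not part of the app) could look like this:

import { PrismaClient } from '@prisma/client';

const prisma = new PrismaClient();

// count rows in the items table to confirm the schema and connection work
async function main() {
  const count = await prisma.item.count();
  console.log(`items: ${count}`);
}

main().finally(() => prisma.$disconnect());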
Code
├── pages
│   ├── api
│   │   ├── items.ts     // get item API
│   │   └── register.ts  // register item API
│   └── index.tsx        // form and display blob list
├── hooks
│   └── useItem.ts       // hook to call APIs
└── types
    └── ItemType.ts      // type of Item
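types/ItemType.ts simply mirrors the Prisma Item model on the client side. The exact definition is in the repo; roughly it looks like this (the createdAt type here is my assumption, since DateTime arrives as a string once it passes through the JSON API):

// types/ItemType.ts -- rough sketch, see the repo for the real definition
export type ItemType = {
  id: number;
  name: string;
  createdAt: string;
};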
The detailed code is here: kurab/next-azure-storage:baseForm
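As a rough idea of what the API routes do: register.ts is a thin wrapper around the Prisma client, and items.ts is the same idea with findMany. The request body shape and error handling below are my assumptions; the actual code is in the repo above.

// pages/api/register.ts -- minimal sketch, assuming the client posts { name: string }
import type { NextApiRequest, NextApiResponse } from 'next';
import { PrismaClient } from '@prisma/client';

const prisma = new PrismaClient();

export default async function handler(req: NextApiRequest, res: NextApiResponse) {
  if (req.method !== 'POST') {
    res.status(405).end();
    return;
  }
  // save the (renamed) file name to the items table
  const item = await prisma.item.create({
    data: { name: req.body.name },
  });
  res.status(200).json(item);
}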
$ yarn dev
or
$ yarn build
$ yarn start
OK.
Azure implementation
Read the quickstart document and apply it to Next.js + TypeScript.
Install modules
$ yarn add @azure/storage-blob uuid
$ yarn add @types/uuid --dev
Only the @azure/storage-blob module is needed to upload blobs to Azure Storage. uuid is used to give each file a unique name such as "fd629c13-dcc5-4503-a0be-934e96b99b27".
Access with sasToken
This time, I use a SAS token (Shared Access Signature) to access Azure Storage. Create one on the Azure Portal and note it down (sv=blah blah...).
.env
+ NEXT_PUBLIC_STORAGESASTOKEN='sv=.....'
+ NEXT_PUBLIC_STORAGERESOURCENAME='xxxxx'
NEXT_PUBLIC_STORAGERESOURCENAME is the Storage Account name.
Initialize a BlobServiceClient with this information like so:
import { BlobServiceClient } from '@azure/storage-blob';

const sasToken = process.env.NEXT_PUBLIC_STORAGESASTOKEN;
const storageAccountName = process.env.NEXT_PUBLIC_STORAGERESOURCENAME;

const blobService = new BlobServiceClient(
  `https://${storageAccountName}.blob.core.windows.net/?${sasToken}`
);
Easy!
Rename and upload
To keep the code simple and easy to understand, I ignore transactions this time. That is, I assume that neither the upload nor the database save ever fails.
Add an onClickSave method in index.tsx that renames the file, then uploads and saves it. (The actual upload and save live in the hook.)
...
import { v4 as uuidv4 } from 'uuid';
...
const onClickSave = (formData: any) => {
  if (formData.files[0]) {
    // rename the file to "<uuid>.<original extension>"
    const newFileName =
      uuidv4() + '.' + formData.files[0].name.split('.').pop();
    // upload the file to Azure Storage, then save its name to the database
    uploadFileToBlob(formData.files[0], newFileName);
    registerItem(newFileName);
  }
};
...
You could write this more elegantly, but this time I wanted to keep the steps separate to explain them.
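For context, formData comes from react-hook-form. A stripped-down sketch of how the form and onClickSave could be wired together in index.tsx (the useItem import path and its return shape are my assumptions; the real page is in the repo above):

// pages/index.tsx -- simplified sketch
import { useForm } from 'react-hook-form';
import { v4 as uuidv4 } from 'uuid';
import { useItem } from '../hooks/useItem'; // assumed export name and path

const Home = () => {
  const { register, handleSubmit } = useForm();
  const { uploadFileToBlob, registerItem } = useItem(); // assumed hook shape

  const onClickSave = (formData: any) => {
    if (formData.files[0]) {
      const newFileName =
        uuidv4() + '.' + formData.files[0].name.split('.').pop();
      uploadFileToBlob(formData.files[0], newFileName);
      registerItem(newFileName);
    }
  };

  // register('files') makes the selected FileList available as formData.files
  return (
    <form onSubmit={handleSubmit(onClickSave)}>
      <input type="file" accept="video/mp4" {...register('files')} />
      <button type="submit">Upload</button>
    </form>
  );
};

export default Home;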
Now, the uploadFileToBlob method in the useItem.ts hook.
...
import { BlobServiceClient, ContainerClient } from '@azure/storage-blob';

const containerName = 'sample-container';
const sasToken = process.env.NEXT_PUBLIC_STORAGESASTOKEN;
const storageAccountName = process.env.NEXT_PUBLIC_STORAGERESOURCENAME;
...
const uploadFileToBlob = useCallback(
  async (file: File | null, newFileName: string) => {
    setLoading(true);
    if (!file) {
      setMessage('No FILE');
    } else {
      // connect to the storage account using the SAS token
      const blobService = new BlobServiceClient(
        `https://${storageAccountName}.blob.core.windows.net/?${sasToken}`
      );
      const containerClient: ContainerClient =
        blobService.getContainerClient(containerName);
      // create the container if it doesn't exist yet
      await containerClient.createIfNotExists({
        access: 'container',
      });
      // upload the file under its new (unique) name
      const blobClient = containerClient.getBlockBlobClient(newFileName);
      const options = { blobHTTPHeaders: { blobContentType: file.type } };
      await blobClient.uploadData(file, options);
      setMessage('uploaded');
    }
    setLoading(false);
  },
  []
);
...
This code just follows the documentation.
If you are sure the container already exists, you can remove:
await containerClient.createIfNotExists({
  access: 'container',
});
This call logs a warning in the console every time the container already exists:
PUT https://.... 409 (The specified container already exists.)
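By the way, registerItem, the other method onClickSave calls, also lives in useItem.ts and just posts the new file name to the register API. A minimal sketch, assuming the API route accepts { name }:

...
import axios from 'axios';
...
// hypothetical sketch: POST the renamed file name to the register API
const registerItem = useCallback(async (name: string) => {
  await axios.post('/api/register', { name });
}, []);
...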
What you do after uploading the blobs is up to you. In the sample code, I set the new blob list to state, like this:
const blobs = await getBlobsInContainer(containerClient);
setBlobs(blobs);
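getBlobsInContainer is a small helper in the sample repo, not part of the SDK. A minimal sketch that collects the URL of every blob in the container (the real helper may differ):

// hypothetical helper; containerName and storageAccountName are the same
// constants used in useItem.ts above
const getBlobsInContainer = async (containerClient: ContainerClient) => {
  const urls: string[] = [];
  // listBlobsFlat() iterates over every blob in the (flat) container
  for await (const blob of containerClient.listBlobsFlat()) {
    urls.push(
      `https://${storageAccountName}.blob.core.windows.net/${containerName}/${blob.name}`
    );
  }
  return urls;
};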
Done
I use AWS most of the time, so when I hear "Azure," I think "Damn..." But Azure is so richly documented that I don't have much trouble finding what I'm looking for.
It also works well with GitHub and VS Code, of course. Good.