Amazon Web Services (AWS) S3 is a popular way to upload and store files. You can use AWS S3 together with Vercel to upload, store, and retrieve objects.
With an Existing S3 Bucket
If you already have an AWS S3 bucket configured, you can retrieve (or generate) the access key and secret key for the IAM user that manages it. Ensure that user has the `AmazonS3FullAccess` policy attached.

Add the access key and secret key as Environment Variables in your Vercel project.
```ts
import { S3Client } from '@aws-sdk/client-s3';

// AWS credentials are picked up from the environment
const s3Client = new S3Client({});
```
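If you prefer to pass the credentials explicitly (for example, because they are stored under custom variable names), the client also accepts them directly. The variable names below are placeholders, not names the SDK requires:

```ts
import { S3Client } from '@aws-sdk/client-s3';

// Placeholder variable names; use whichever names you added to your Vercel project
const s3Client = new S3Client({
  region: process.env.S3_REGION,
  credentials: {
    accessKeyId: process.env.S3_ACCESS_KEY_ID as string,
    secretAccessKey: process.env.S3_SECRET_ACCESS_KEY as string,
  },
});
```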
Creating a New S3 Bucket
We’ve created an example Next.js application that allows you to upload photos to an S3 bucket. After cloning the repository, follow these steps to create an S3 bucket with the proper permissions using the AWS CDK:
1. Create a new IAM user
   - Create a new IAM user
   - Choose programmatic access
   - Select "Attach existing policies directly"
   - Add `AmazonS3FullAccess`
2. Save the access key and secret key for the IAM user
   - These are used for programmatic access in the API Route
3. Install the AWS CLI
   - Install the AWS CLI
   - Run `aws configure`
   - Enter your root AWS user access key and secret key
   - Enter your default region
4. Create an `.env.local` file similar to `.env.example`
   - Enter your access key and secret key from the IAM user
   - Add your S3 bucket name
5. Run the CDK commands
   - Run the `cdk bootstrap` command
   - Run `cdk deploy` to create an S3 bucket with an IAM policy (a sketch of such a stack follows these steps)
6. Test the upload
   - Run `yarn dev` to start the Next.js app at `localhost:3000`
   - Choose a `.png` or `.jpg` file
   - You should see your file successfully uploaded to S3
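The exact stack lives in the example repository, but as a rough sketch (the stack and construct names, and the CORS rule, are illustrative assumptions), a CDK stack that provisions the upload bucket can look like this:

```ts
import * as cdk from 'aws-cdk-lib';
import * as s3 from 'aws-cdk-lib/aws-s3';
import { Construct } from 'constructs';

// Illustrative only; the example repository's stack may differ
export class UploadBucketStack extends cdk.Stack {
  constructor(scope: Construct, id: string, props?: cdk.StackProps) {
    super(scope, id, props);

    new s3.Bucket(this, 'UploadBucket', {
      // Reuse the bucket name from .env.local
      bucketName: process.env.S3_BUCKET_NAME,
      // CORS is needed for browser uploads via the pre-signed POST
      cors: [
        {
          allowedOrigins: ['*'],
          allowedMethods: [s3.HttpMethods.GET, s3.HttpMethods.PUT, s3.HttpMethods.POST],
          allowedHeaders: ['*'],
        },
      ],
    });
  }
}
```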
Uploading files on the server
You can use Vercel Serverless Functions to upload files to AWS S3 on the server. After creating an instance of the AWS S3 client, send a `PutObjectCommand` to create a new object with the given key and body (based on a stream).
```ts
import { createReadStream } from 'fs';
import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3';
import { NextApiRequest, NextApiResponse } from 'next';

export default async function handler(
  req: NextApiRequest,
  res: NextApiResponse,
) {
  const s3Client = new S3Client({});

  const uploadCommand = new PutObjectCommand({
    Bucket: process.env.S3_BUCKET_NAME,
    Key: 'file-name',
    Body: createReadStream('file-path'),
  });

  const response = await s3Client.send(uploadCommand);

  res.status(200).json(response);
}
```
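If you need to upload large files or streams of unknown length, the `Upload` helper from `@aws-sdk/lib-storage` performs a multipart upload instead of a single `PutObjectCommand`. This is an optional alternative, not something the route above requires; a minimal sketch:

```ts
import { createReadStream } from 'fs';
import { S3Client } from '@aws-sdk/client-s3';
import { Upload } from '@aws-sdk/lib-storage';

const s3Client = new S3Client({});

// Streams the object to S3 in parts instead of buffering it in memory
const upload = new Upload({
  client: s3Client,
  params: {
    Bucket: process.env.S3_BUCKET_NAME,
    Key: 'file-name',
    Body: createReadStream('file-path'),
  },
});

await upload.done();
```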
Uploading files in the browser
Alternatively, you can allow file uploads directly from the browser. For example, a user selects a file with an `input` element, the app requests a pre-signed POST from an API Route, and the file is then uploaded securely straight to S3.
```tsx
export default function Upload() {
  return (
    <>
      <p>Upload a .png or .jpg image (max 1MB).</p>
      <input
        onChange={uploadPhoto}
        type="file"
        accept="image/png, image/jpeg"
      />
    </>
  );
}

const uploadPhoto = async (e: React.ChangeEvent<HTMLInputElement>) => {
  const file = e.target.files?.[0]!;
  const filename = encodeURIComponent(file.name);
  const fileType = encodeURIComponent(file.type);

  // Ask the API Route for a pre-signed POST for this file
  const res = await fetch(
    `/api/upload-url?file=${filename}&fileType=${fileType}`,
  );
  const { url, fields } = await res.json();

  // Build the multipart form S3 expects: pre-signed fields first, then the file
  const formData = new FormData();
  Object.entries({ ...fields, file }).forEach(([key, value]) => {
    formData.append(key, value as string);
  });

  // POST directly to S3
  const upload = await fetch(url, {
    method: 'POST',
    body: formData,
  });

  if (upload.ok) {
    console.log('Uploaded successfully!');
  } else {
    console.error('Upload failed.');
  }
};
```
The API Route to generate the pre-signed POST is as follows:
```ts
import { S3Client } from '@aws-sdk/client-s3';
import { createPresignedPost } from '@aws-sdk/s3-presigned-post';
import { NextApiRequest, NextApiResponse } from 'next';

export default async function handler(
  req: NextApiRequest,
  res: NextApiResponse,
) {
  const s3Client = new S3Client({});

  const post = await createPresignedPost(s3Client, {
    Bucket: process.env.S3_BUCKET_NAME as string,
    Key: req.query.file as string,
    Fields: {
      acl: 'public-read',
      'Content-Type': req.query.fileType as string,
    },
    Expires: 600, // seconds
    Conditions: [
      ['content-length-range', 0, 1048576], // up to 1 MB
    ],
  });

  res.status(200).json(post);
}
```
If you need to support pausing and resuming uploads, as well as other advanced file upload strategies, explore libraries like Evaporate.
Vercel Blob
As an alternative to AWS S3, you can use Vercel's Blob storage option. Vercel Blob allows you to upload and serve files via a global network through unique and unguessable URLs.
See our docs to learn how to integrate Vercel Blob into your workflow.
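As a rough sketch of what that integration looks like (the file names are placeholders, and a Blob store with its `BLOB_READ_WRITE_TOKEN` Environment Variable is assumed), a server-side upload with the `@vercel/blob` package can be as small as:

```ts
import { readFile } from 'fs/promises';
import { put } from '@vercel/blob';

// Assumes BLOB_READ_WRITE_TOKEN is set in the environment; paths are placeholders
const fileBody = await readFile('./avatar.png');

const blob = await put('photos/avatar.png', fileBody, {
  access: 'public',
});

console.log(blob.url); // unique, unguessable URL served from Vercel's network
```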