How can I use AWS S3 with Vercel?

Amazon Web Services (AWS) S3 is a popular way to upload and store files. You can use AWS S3 together with Vercel to upload, store, and retrieve objects.

With an Existing S3 Bucket

If you already have an existing AWS S3 bucket configured, you can retrieve (or generate) the access key and secret key for the IAM User that owns it. Ensure the user's policy allows the following actions: s3:DeleteObject, s3:GetObject, s3:ListBucket, s3:PutObject, and s3:PutObjectAcl.

AWS credentials (e.g. AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY) and region configuration (e.g. AWS_REGION) can now be used directly as Environment Variables for Vercel deployments. Ensure these values are inside your Vercel project.
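You can add these from your project's dashboard, or, as a sketch, with the Vercel CLI (the variable names must match exactly):

```bash
# Each command prompts for the value and target environments
vercel env add AWS_ACCESS_KEY_ID
vercel env add AWS_SECRET_ACCESS_KEY
vercel env add AWS_REGION
```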

These variables are the default names expected by the AWS SDK, which means the user no longer has to configure credentials when using it. For example:

import { S3Client } from '@aws-sdk/client-s3';

// AWS credentials are picked up from the environment
const client = new S3Client({ region: process.env.AWS_REGION });
Example code to connect to the AWS SDK.

Creating a New S3 Bucket

We’ve created an example Next.js application that allows you to upload photos to an S3 bucket.

Create a new S3 bucket in the AWS console. During setup:

  1. In Object Ownership, select "ACLs enabled" and "Bucket owner preferred".
  2. In Block Public Access settings for this bucket, uncheck "Block all public access".

Create a new IAM User

  1. Create a new IAM user
  2. Choose programmatic access
  3. Select "Attach policies directly"
  4. Grant the s3:DeleteObject, s3:GetObject, s3:ListBucket, s3:PutObject, and s3:PutObjectAcl actions on your bucket.
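As an illustration, the attached policy could look like the following JSON document. This is a minimal sketch, not a hardened policy; replace your-bucket-name with your bucket:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:DeleteObject",
        "s3:GetObject",
        "s3:PutObject",
        "s3:PutObjectAcl"
      ],
      "Resource": "arn:aws:s3:::your-bucket-name/*"
    },
    {
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::your-bucket-name"
    }
  ]
}
```

Note that s3:ListBucket applies to the bucket ARN itself, while the object-level actions apply to the objects under it.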

Save the access key and secret key for the IAM User

  1. Select the newly created user (IAM > Users > "your-user") and navigate to "Security Credentials".
  2. Under "Access Keys", create a key and save this information. We will use this in the next step.

Create a .env.local file similar to .env.example

  1. In your .env.local file, use the information from your access key, along with the region and bucket name.
  2. Do not rename the keys; only fill in your values. This ensures the S3 client can read them as defaults.
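For example, a .env.local file along these lines (placeholder values shown; AWS_BUCKET_NAME matches the variable used in the Route Handlers below):

```bash
AWS_ACCESS_KEY_ID=AKIA...        # from the IAM access key
AWS_SECRET_ACCESS_KEY=...        # keep this secret; never commit it
AWS_REGION=us-east-1
AWS_BUCKET_NAME=your-bucket-name
```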

Configure CORS to enable uploads from your browser

  1. Navigate to your bucket, and go to the "Permissions" tab.
  2. Scroll down to find "Cross-origin resource sharing (CORS)" and click "Edit" on the right side.
  3. Paste the following code below.
[
  {
    "AllowedHeaders": ["*"],
    "AllowedMethods": ["GET", "PUT", "POST", "DELETE"],
    "AllowedOrigins": ["*"],
    "ExposeHeaders": []
  }
]
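For production, you may want to restrict AllowedOrigins to your actual deployment URL(s) instead of "*", for example (replace the domain with your own):

```json
"AllowedOrigins": ["https://your-app.vercel.app", "http://localhost:3000"]
```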

Test

  1. Start the Next.js app locally and visit localhost:3000
  2. Choose a .png or .jpg file
  3. You should see your file successfully uploaded to S3

Uploading files on the server

You can use Vercel Serverless Functions to upload files to AWS S3 on the server. After creating an instance of the AWS S3 client, you can send a PutObjectCommand to create a new object with the given key and body (for example, a file stream).

app/api/upload/route.ts
import { createReadStream } from 'fs';
import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3';

export async function POST(request: Request) {
  try {
    const client = new S3Client({ region: process.env.AWS_REGION });
    const uploadCommand = new PutObjectCommand({
      Bucket: process.env.AWS_BUCKET_NAME,
      Key: 'file-name',
      Body: createReadStream('file-path'),
    });
    const response = await client.send(uploadCommand);
    return Response.json(response);
  } catch (error) {
    return Response.json({ error: (error as Error).message }, { status: 500 });
  }
}
A Route Handler to upload a file to an S3 bucket.

Uploading files in the browser

Alternatively, you can allow file uploads directly from the browser. For example, a user can select a file using an input, which then generates a pre-signed POST from a Route Handler to allow for secure uploads.

app/page.tsx
'use client'

import { useState } from 'react'

export default function Page() {
  const [file, setFile] = useState<File | null>(null)
  const [uploading, setUploading] = useState(false)

  const handleSubmit = async (e: React.FormEvent<HTMLFormElement>) => {
    e.preventDefault()

    if (!file) {
      alert('Please select a file to upload.')
      return
    }

    setUploading(true)

    const response = await fetch(
      process.env.NEXT_PUBLIC_BASE_URL + '/api/upload',
      {
        method: 'POST',
        headers: {
          'Content-Type': 'application/json',
        },
        body: JSON.stringify({ filename: file.name, contentType: file.type }),
      }
    )

    if (response.ok) {
      const { url, fields } = await response.json()

      const formData = new FormData()
      Object.entries(fields).forEach(([key, value]) => {
        formData.append(key, value as string)
      })
      formData.append('file', file)

      const uploadResponse = await fetch(url, {
        method: 'POST',
        body: formData,
      })

      if (uploadResponse.ok) {
        alert('Upload successful!')
      } else {
        console.error('S3 Upload Error:', uploadResponse)
        alert('Upload failed.')
      }
    } else {
      alert('Failed to get pre-signed URL.')
    }

    setUploading(false)
  }

  return (
    <main>
      <h1>Upload a File to S3</h1>
      <form onSubmit={handleSubmit}>
        <input
          id="file"
          type="file"
          onChange={(e) => {
            const files = e.target.files
            if (files) {
              setFile(files[0])
            }
          }}
          accept="image/png, image/jpeg"
        />
        <button type="submit" disabled={uploading}>
          Upload
        </button>
      </form>
    </main>
  )
}
A React component that uploads files using an input.

The API Route to generate the pre-signed POST is as follows:

app/api/upload/route.ts
import { createPresignedPost } from '@aws-sdk/s3-presigned-post'
import { S3Client } from '@aws-sdk/client-s3'
import { v4 as uuidv4 } from 'uuid'

export async function POST(request: Request) {
  const { filename, contentType } = await request.json()

  try {
    const client = new S3Client({ region: process.env.AWS_REGION })
    const { url, fields } = await createPresignedPost(client, {
      Bucket: process.env.AWS_BUCKET_NAME,
      Key: uuidv4(),
      Conditions: [
        ['content-length-range', 0, 10485760], // up to 10 MB
        ['starts-with', '$Content-Type', contentType],
      ],
      Fields: {
        acl: 'public-read',
        'Content-Type': contentType,
      },
      Expires: 600, // Seconds before the pre-signed POST expires. 3600 by default.
    })

    return Response.json({ url, fields })
  } catch (error) {
    return Response.json({ error: (error as Error).message }, { status: 500 })
  }
}
A Route Handler to generate a pre-signed POST URL.
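The Conditions array above enforces the size and content-type limits at upload time. If you also want to fail fast in the browser before requesting a pre-signed POST, a small client-side check could mirror the same limits. This is a hypothetical helper, not part of the example app:

```typescript
// Hypothetical helper mirroring the pre-signed POST conditions:
// up to 10 MB, PNG or JPEG only.
const MAX_UPLOAD_BYTES = 10_485_760;
const ALLOWED_TYPES = ['image/png', 'image/jpeg'];

export function validateUpload(
  name: string,
  size: number,
  type: string,
): string | null {
  if (!ALLOWED_TYPES.includes(type)) return `Unsupported type: ${type}`;
  if (size > MAX_UPLOAD_BYTES) return `File too large: ${size} bytes`;
  if (name.trim() === '') return 'Missing file name';
  return null; // null means the file passes the checks
}
```

Calling this inside handleSubmit before the first fetch surfaces an error without a network round trip; the server-side Conditions remain the source of truth.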

If you need to support pausing and resuming uploads, as well as other advanced file upload strategies, explore libraries like Evaporate.

Vercel Blob

As an alternative to AWS S3, you can use Vercel's Blob storage option. Vercel Blob allows you to upload and serve files via a global network through unique and unguessable URLs.

See our docs to learn how to integrate Vercel Blob into your workflow.
