Add Rate Limiting with Vercel Edge Middleware and Vercel KV

This guide is a comprehensive walkthrough of rate limiting with Vercel Edge Middleware and Vercel KV. We will cover why rate limiting matters, introduce Vercel and the features involved (Vercel KV and Vercel Edge Middleware), and finish with a tutorial on implementing rate limiting in Next.js or any other frontend framework.

Why do you need rate limiting?

Rate limiting is a technique for controlling network traffic. It sets a limit on how many requests a client can make to a server in a specific period.

  1. Ensure uptime of your services: Rate limiting protects your services from being overwhelmed by excessive requests. By controlling the number of requests, you can maintain the optimal performance of your service and ensure its availability.
  2. Control billing: Rate limiting helps manage and control your billing costs by preventing unforeseen spikes in usage, particularly important when using services that charge by the request.
  3. Prevent malicious usage: Rate limiting is essential when using AI providers and Large Language Models (LLMs). Rate limiting can protect your service from malicious usage or abuse, such as DDoS attacks.
  4. Add differentiation of product usage based on plan: Rate limiting can be used to create usage tiers. For example, free users might be limited to a certain number of requests per day, while premium users might have a higher limit.

What is Vercel?

Vercel's frontend cloud gives developers frameworks, workflows, and infrastructure to build a faster, more personalized web.

We are the creators of Next.js, the React framework, and have zero-configuration support for all major frontend frameworks.

Vercel KV

Vercel KV is a globally distributed, durable key-value store.

  • Durable: It offers an easy-to-use, durable data store that ensures your data is stored safely.
  • Low latency: Being globally distributed, it offers low latency reads from anywhere in the world.
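
As a quick illustration of the client API (a minimal sketch; the key and value are hypothetical), the @vercel/kv package exposes familiar Redis-style commands:

import { kv } from '@vercel/kv';

// Store and read back a value. The kv client picks up its connection details
// from the environment variables Vercel adds when you connect a KV database.
export async function kvExample() {
  await kv.set('greeting', 'hello from vercel kv');
  const greeting = await kv.get('greeting');
  console.log(greeting); // "hello from vercel kv"
}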

Vercel Edge Middleware

Vercel Edge Middleware lets you run code at the edge on globally distributed compute, ensuring low latency and high availability. It offers:

  • Global, low latency compute decoupled from your backend: Vercel Edge Middleware operates at the edge of the network, which means your code runs closer to your users, reducing latency.
  • Block traffic before reaching your upstream service or database: Middleware can analyze, modify, or reject incoming traffic before it reaches your backend, providing an additional layer of protection and efficiency.
  • Cost effective: By handling traffic at the edge, Vercel Edge Middleware can significantly reduce the load on your servers, potentially lowering costs. Further, it uses a lightweight edge runtime, which is more cost effective than traditional serverless compute.
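
For example, a minimal Next.js middleware (a sketch only, using a hypothetical /internal path) can reject matching traffic at the edge before it ever reaches your upstream service:

middleware.ts
import { NextRequest, NextResponse } from 'next/server';

// Only run this middleware for a hypothetical internal path
export const config = {
  matcher: '/internal/:path*',
};

export default function middleware(request: NextRequest) {
  // Reject the request at the edge; it never reaches your backend
  return new NextResponse('Forbidden', { status: 403 });
}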

How to Implement Rate Limiting with Vercel

Start from a template

If you would prefer to start from a template instead of manually adding rate limiting to your project, we have created a Next.js template as well as a general rate limiting example for use with any framework.

Continue following the guide to manually add rate limiting to your existing application.

Step 1: Create a new Vercel KV database

Create a new Vercel KV instance from the Vercel dashboard. Choose your primary region and additional read regions if desired. You can follow our quickstart if you prefer.

Step 2: Connect your Vercel KV database to your project

Connect your new KV database to your Vercel project. This will automatically add the required environment variables to connect to your new durable Redis database.
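
The default kv client reads these variables automatically. If you prefer to construct the client explicitly (for example, after running vercel env pull to get the values locally), @vercel/kv also exports createClient. The variable names below are the ones Vercel typically adds; check your project's environment settings to confirm:

import { createClient } from '@vercel/kv';

// Explicit client construction using the environment variables added in this step.
const kv = createClient({
  url: process.env.KV_REST_API_URL!,
  token: process.env.KV_REST_API_TOKEN!,
});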

Step 3: Adding @upstash/ratelimit

To simplify implementing rate limiting, we recommend @upstash/ratelimit, which is a powerful HTTP-based rate limiting library with support for a variety of algorithms.

This library allows setting multiple rate limits based on logic, such as a user's plan (sketched below). Further, while the Vercel Edge Middleware is "hot," it intelligently caches results and reduces the number of calls to Vercel KV, helping prevent unnecessary usage of your database.

Check out the documentation to see all options for this library.
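
Install the library alongside the Vercel KV client, for example with npm install @upstash/ratelimit @vercel/kv. As a sketch of plan-based limits (the tier names and window sizes here are hypothetical), you can create one limiter per tier and pick between them at request time:

import { Ratelimit } from '@upstash/ratelimit';
import { kv } from '@vercel/kv';

// Hypothetical tiers: free users get 10 requests per 10 seconds, pro users get 100.
const limiters = {
  free: new Ratelimit({
    redis: kv,
    limiter: Ratelimit.slidingWindow(10, '10 s'),
    prefix: 'ratelimit:free',
  }),
  pro: new Ratelimit({
    redis: kv,
    limiter: Ratelimit.slidingWindow(100, '10 s'),
    prefix: 'ratelimit:pro',
  }),
};

export async function limitByPlan(identifier: string, plan: 'free' | 'pro') {
  // The identifier could be an IP address, user ID, or API key.
  return limiters[plan].limit(identifier);
}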

Step 4: Adding Vercel Edge Middleware

Vercel Edge Middleware allows you to define logic to run based on the incoming request. This works natively in frameworks like Next.js, or generally on the Vercel platform using any frontend framework.

For Next.js, add the following middleware file to your application:

middleware.ts
import { NextRequest, NextResponse } from 'next/server';
import { Ratelimit } from '@upstash/ratelimit';
import { kv } from '@vercel/kv';

const ratelimit = new Ratelimit({
  redis: kv,
  // 5 requests from the same IP in 10 seconds
  limiter: Ratelimit.slidingWindow(5, '10 s'),
});

// Define which routes you want to rate limit
export const config = {
  matcher: '/',
};

export default async function middleware(request: NextRequest) {
  // You could alternatively limit based on user ID or similar
  const ip = request.ip ?? '127.0.0.1';
  const { success, pending, limit, reset, remaining } = await ratelimit.limit(ip);

  return success
    ? NextResponse.next()
    : NextResponse.redirect(new URL('/blocked', request.url));
}
Adding rate limiting to your Next.js application with Middleware and Vercel KV.

If you are not using Next.js, you can use Vercel Edge Middleware with any framework via the @vercel/edge package as follows:

middleware.js
import { ipAddress, next } from '@vercel/edge'
import { Ratelimit } from '@upstash/ratelimit'
import { kv } from '@vercel/kv'

const ratelimit = new Ratelimit({
  redis: kv,
  // 5 requests from the same IP in 10 seconds
  limiter: Ratelimit.slidingWindow(5, '10 s'),
})

// Define which routes you want to rate limit
export const config = {
  matcher: '/',
}

export default async function middleware(request) {
  // You could alternatively limit based on user ID or similar
  const ip = ipAddress(request) || '127.0.0.1'
  const { success, pending, limit, reset, remaining } = await ratelimit.limit(ip)

  return success
    ? next()
    : Response.redirect(new URL('/blocked.html', request.url))
}
You can add rate limiting to any application with Vercel Edge Middleware.

In both cases, if a client exceeds the rate limit, they are redirected to a blocked page: /blocked in the Next.js example and /blocked.html in the framework-agnostic example.
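
For the Next.js case, you will also need a route at that destination. Here is a minimal sketch for the App Router (the file location and copy are just examples); alternatively, you could return a 429 response directly from the middleware instead of redirecting:

app/blocked/page.tsx
export default function BlockedPage() {
  return (
    <main>
      <h1>Too many requests</h1>
      <p>You have hit the rate limit. Please wait a moment and try again.</p>
    </main>
  );
}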
