How can I improve serverless function (lambda) cold start performance on Vercel?

Vercel Serverless Functions (which use AWS Lambda) enable developers to write server-side logic to connect to their database or external APIs. If you are seeing slow response times, this guide will help you identify the root cause of the additional latency.

When a Lambda function starts for the first time (or after a period of inactivity), it incurs a cold start. Subsequent requests to that function are then considered warm. Cold starts typically occur in under 1% of invocations, and their duration varies from under 100 ms to over 1 second.

This guide will help you improve the performance of your Lambda functions and determine whether a latency increase is caused by a cold start.

Improving your Lambda performance

The following suggestions will help you ensure optimal performance of your Vercel Serverless Functions:

  1. Choose the correct region for your functions: Vercel Serverless Functions are deployed to us-east by default. All customers can change the default region for their functions in their project settings. Choose a region that’s closest to your customers and database for optimal performance.
  2. Choose smaller dependencies inside your functions: Cold start times are correlated with function size, much of which typically comes from external dependencies. If you have large dependencies, parsing and evaluating JavaScript code can take 3-5 seconds or longer. Review your bundle with a bundle analyzer and try to eliminate the largest dependencies.
  3. Use proper caching headers: Serverless Function responses can be cached using Cache-Control headers. This helps ensure optimal performance for repeat visitors, and Vercel’s Edge cache even supports stale-while-revalidate headers. Note that on a cache miss, the function must still fetch data from your origin (e.g. your database), which is slower than reading directly from the Edge cache.
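As a minimal sketch of the caching suggestion above, a Serverless Function can set a Cache-Control header with s-maxage and stale-while-revalidate directives so the Edge cache can serve repeat visitors without hitting your origin. The route name and the specific durations here are illustrative assumptions, not prescribed values:

```javascript
// api/products.js — hypothetical route; durations are examples only.
export default function handler(req, res) {
  // Cache at the Edge for 60 seconds; for up to 5 more minutes,
  // serve the stale copy while a fresh response is fetched in
  // the background.
  res.setHeader('Cache-Control', 's-maxage=60, stale-while-revalidate=300');
  res.status(200).json({ products: [] });
}
```

With this header, only the first request after the cache expires pays the full origin round-trip; everyone else reads from the Edge cache.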

Faster performance with the Edge runtime

Vercel now also has experimental support for Edge Functions, which use a more constrained runtime than Node.js, enabling greater performance with near-zero cold starts.

By opting into the constraints of the Edge Runtime, which is based on Web APIs, you can eliminate the overhead of the AWS Lambda Node.js runtime. Edge Functions run on top of the V8 engine, but not Node.js, so there is no access to Node.js APIs such as process, path, or fs when developing with Edge Functions.

```javascript
// api/hello.js
export const config = {
  runtime: 'experimental-edge',
};

export default (req) => new Response('Hello world!');
```

An Edge API Route using Vercel Edge Functions.
