How can I improve function cold start performance on Vercel?

Vercel Functions let developers write server-side logic that connects to their database or external APIs. If you are seeing slow response times, this guide will help you identify the root cause of the additional latency.

When a function is invoked for the first time (or after a period of inactivity), it goes through a "cold start"; subsequent requests to that function are considered warm. Cold starts typically occur in under 1% of invocations, and their duration ranges from under 100 ms to over 1 second.

This guide will help you improve the performance of your functions and determine whether an increase in latency is caused by a cold start.

Improving your Function performance

The following suggestions will help you get the best performance from your Vercel Functions:

  1. Choose the correct region for your functions: Node.js Functions are deployed to iad1 (Washington, D.C., USA) by default. All customers can change the default region for their functions in their project settings. Choose the region closest to your data source for the best performance (see the region example after this list).
  2. Choose smaller dependencies inside your functions: Cold start times correlate with function size, and much of that size often comes from external dependencies. With large dependencies, parsing and evaluating JavaScript can take 3-5 seconds or longer. Review your bundle with a bundle analyzer and try to eliminate the largest dependencies (see the bundle-analyzer example below).
  3. Use proper caching headers: Function responses can be cached using Cache-Control headers, which helps ensure fast responses for repeat visitors; Vercel’s Edge cache also supports stale-while-revalidate. Note that a cache miss still has to fetch data from your origin (e.g. your database), which is slower than reading directly from the Edge cache (see the caching example below).
  4. Upgrade CPU or memory for your functions: Functions default to 0.6 CPU. Upgrading from 0.6 to 1 CPU can significantly reduce function latency, and increasing the default memory size can also help. Both can be adjusted in your project settings.
  5. Prewarm your functions using cron jobs: For low-traffic sites, set up a cron job that invokes your routes every 1 to 5 minutes. This keeps the routes warm, which can be especially useful ahead of important sessions such as a startup’s investor demo (see the cron example below).
  6. If using Sentry, upgrade to Next.js 14.1.4 or later: Next.js 14.1.4 includes performance improvements for projects using Sentry and can reduce cold start times.
  7. If using Pages Router, bundle external dependencies: Setting bundlePagesExternals to true under the experimental key in your Next.js config is strongly recommended for Pages Router projects and typically results in a large reduction in cold start time (see the config example below).
  8. Use dynamic imports: Although this is a less common case, sites with divergent code paths can use dynamic imports to load only the code a given request needs, which reduces cold start times (see the dynamic-import example below).
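
The sketches below illustrate several of the suggestions above. File paths, route names, and values are placeholders rather than required conventions.

For region selection (item 1), a minimal sketch using the Next.js App Router’s preferredRegion route segment config; whether per-route regions apply depends on your plan and runtime, and the project-wide default can also be changed in your project settings:

```ts
// app/api/users/route.ts (hypothetical route)
// Ask Vercel to run this function in iad1 (Washington, D.C.), assumed here
// to be the region closest to the database.
export const preferredRegion = 'iad1';

export async function GET() {
  // Query your data source here; keeping the function near it
  // minimizes round-trip latency on every request.
  return Response.json({ ok: true });
}
```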
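
For trimming dependencies (item 2), @next/bundle-analyzer can show what ends up in each bundle. A minimal sketch, assuming a Next.js project with an ESM config file and the package installed as a dev dependency:

```js
// next.config.mjs (assumes `npm install --save-dev @next/bundle-analyzer`)
import bundleAnalyzer from '@next/bundle-analyzer';

// Only generate the report when explicitly requested, e.g. `ANALYZE=true next build`.
const withBundleAnalyzer = bundleAnalyzer({
  enabled: process.env.ANALYZE === 'true',
});

/** @type {import('next').NextConfig} */
const nextConfig = {};

export default withBundleAnalyzer(nextConfig);
```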
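
For caching headers (item 3), a route handler can set s-maxage and stale-while-revalidate so Vercel’s Edge cache serves repeat visitors without re-invoking the function. A minimal sketch with placeholder values:

```ts
// app/api/products/route.ts (hypothetical route)
export async function GET() {
  const data = { products: [] }; // replace with your real data fetch

  return Response.json(data, {
    headers: {
      // Cache at the edge for 60 seconds, then serve stale responses for up to
      // 5 more minutes while revalidating in the background.
      'Cache-Control': 's-maxage=60, stale-while-revalidate=300',
    },
  });
}
```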
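
For prewarming (item 5), a cron job simply needs a cheap route to hit on a schedule. A sketch assuming a dedicated /api/warm endpoint and a cron entry in vercel.json (the path and schedule are examples):

```ts
// app/api/warm/route.ts (hypothetical warm-up route)
// Paired with a vercel.json cron entry such as:
//   { "crons": [{ "path": "/api/warm", "schedule": "*/5 * * * *" }] }
// which invokes this route every 5 minutes so the function instance stays warm.
export async function GET() {
  // Keep the handler cheap: the goal is only to keep the function warm.
  return Response.json({ warmed: true, at: new Date().toISOString() });
}
```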
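
For Pages Router bundling (item 7), the flag lives under experimental in your Next.js config. A minimal sketch for Next.js 14; in newer releases the option may have been stabilized under a different name, so check the docs for your version:

```js
// next.config.mjs
/** @type {import('next').NextConfig} */
const nextConfig = {
  experimental: {
    // Bundle external dependencies for Pages Router pages and API routes
    // into the function output, reducing the work done on cold start.
    bundlePagesExternals: true,
  },
};

export default nextConfig;
```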
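
For dynamic imports (item 8), a heavy module can be loaded only on the code path that needs it, so it is not parsed and evaluated on every cold start. A sketch with a hypothetical local module:

```ts
// app/api/report/route.ts (hypothetical route with divergent code paths)
export async function GET(request: Request) {
  const { searchParams } = new URL(request.url);

  if (searchParams.get('format') === 'pdf') {
    // Only requests that actually need the heavy PDF renderer pay for loading it.
    const { renderPdf } = await import('./render-pdf'); // hypothetical local module
    return new Response(await renderPdf(), {
      headers: { 'Content-Type': 'application/pdf' },
    });
  }

  // The common path stays lightweight.
  return Response.json({ report: 'summary' });
}
```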
