
Streaming Functions

Learn how Vercel enables streaming Function responses over time for ecommerce, AI, and more.

Vercel Functions support streaming responses when using the Node.js, Edge, or Python runtimes, allowing you to render parts of the UI as they become ready. This lets users interact with your app before the entire page finishes loading by populating the most important components first.

Vercel enables you to use the standard Web Streams API in your functions. All functions using the Node.js runtime support streaming by default.

Streaming functions also support the waitUntil method, which keeps a function running after the response has finished streaming. While the function will likely run for about the same amount of time, and therefore cost about the same, as it would if you waited for the whole response to be ready, your end users get a better, more interactive experience.
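The sketch below illustrates the idea: the response streams back immediately, while background work continues after the response finishes. The `waitUntil` function here is a minimal local stand-in so the example is self-contained; in a real Vercel Function you would import `waitUntil` from `@vercel/functions` instead, and `recordAnalytics` is a hypothetical background task.

```typescript
// Minimal local stand-in for waitUntil: the platform keeps the function
// alive until every promise passed to it has settled.
const pending: Promise<unknown>[] = [];
function waitUntil(promise: Promise<unknown>): void {
  pending.push(promise);
}

let logged = false;
async function recordAnalytics(): Promise<void> {
  // Hypothetical background task, e.g. writing to an analytics store
  await new Promise((resolve) => setTimeout(resolve, 10));
  logged = true;
}

function GET(): Response {
  const encoder = new TextEncoder();
  const stream = new ReadableStream({
    start(controller) {
      controller.enqueue(encoder.encode('Hello'));
      controller.close();
    },
  });
  // Schedule background work without delaying the streamed response
  waitUntil(recordAnalytics());
  return new Response(stream);
}

const response = GET();
const body = await response.text(); // the caller gets the body right away
await Promise.all(pending); // the platform then awaits background work
```

The key point is that the response is returned before `recordAnalytics` settles; the platform, not the caller, waits for the background work.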

You can stream by default with the Next.js App Router when using either the Node.js or Edge runtime. Other frameworks also support streaming functions.

To stream functions when using the Next.js Pages Router, create an app/api directory in your project and add your streaming functions there. The App Router can be used alongside the Pages Router to enable streaming functions.

app/api/streaming-example/route.ts
export const runtime = 'nodejs'; // This is the default and can be omitted
export const dynamic = 'force-dynamic'; // always run dynamically
 
export async function GET() {
  // Encode text chunks before enqueuing them on the stream
  const encoder = new TextEncoder();
  const customReadable = new ReadableStream({
    start(controller) {
      // Encode 'Basic Streaming Test' and add
      // the resulting bytes to the stream's queue
      controller.enqueue(encoder.encode('Basic Streaming Test'));
      // Close the stream so no further chunks can be added
      controller.close();
    },
    },
  });
 
  return new Response(customReadable, {
    headers: { 'Content-Type': 'text/html; charset=utf-8' },
  });
}
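To see why streaming helps, the sketch below builds a similar response that enqueues several chunks over time, then consumes it incrementally with a reader, the way a browser or a `fetch()` caller would. The chunk contents and delays are illustrative, not part of the example above.

```typescript
// Produce a response whose body arrives as several chunks over time
const encoder = new TextEncoder();
const stream = new ReadableStream({
  async start(controller) {
    for (const chunk of ['Basic ', 'Streaming ', 'Test']) {
      controller.enqueue(encoder.encode(chunk));
      // Simulate work between chunks so they arrive over time
      await new Promise((resolve) => setTimeout(resolve, 5));
    }
    controller.close();
  },
});

const response = new Response(stream, {
  headers: { 'Content-Type': 'text/html; charset=utf-8' },
});

// Read incrementally: each chunk can be rendered as soon as it arrives,
// instead of waiting for the full body
const reader = response.body!.getReader();
const decoder = new TextDecoder();
const received: string[] = [];
while (true) {
  const { done, value } = await reader.read();
  if (done) break;
  received.push(decoder.decode(value, { stream: true }));
}
```

Each `reader.read()` resolves as soon as the next chunk is available, which is what lets a UI populate its most important components first.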


When using the edge runtime, some limitations apply.

Streaming Python functions is enabled by default for all new projects from November 22nd, 2024, and will be enabled by default for all existing customers on January 5th, 2025.

You can stream responses from Vercel Functions that use the Python runtime. Currently, this is an opt-in feature for existing functions, but from January 5th, 2025, streaming will be enabled by default for all Python functions. Until then, you can enable streaming with the following steps:

  1. From the dashboard, select your project and go to the Settings tab.
  2. Select Environment Variables from the left side in settings.
  3. Add a new environment variable with the key VERCEL_FORCE_PYTHON_STREAMING and the value 1. Ensure it is set for every environment where you want to force streaming.
  4. Redeploy your project. Streaming will be enabled on your next production deployment.
  5. Pull your env vars into your local project with the following command:
    terminal
    vercel env pull
    For more information, see Environment Variables.

When your function is streaming, it can take advantage of extended runtime logs, which show the real-time output of your function along with larger and more frequent log entries. Because of the change in format and the potential increase in frequency, your Log Drains may be affected. We recommend verifying that your ingestion can handle both the new format and the higher volume.

By default, Vercel Functions have a maximum duration of 10 seconds on Hobby and 15 seconds on Pro and Enterprise.

Consider increasing the maximum duration of your Node.js functions so they can stream responses for longer periods.
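In the Next.js App Router, one way to do this is the `maxDuration` route segment config, sketched below. The value of 60 is illustrative and must fall within the limits allowed by your plan.

```typescript
// app/api/streaming-example/route.ts
// Allow this route to run for up to 60 seconds (subject to plan limits)
export const maxDuration = 60;
```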


Last updated on November 22, 2024