Changelog

Follow us on X to hear about the changes first!

AI-enhanced search for Vercel documentation

You can now get AI-assisted answers to your questions directly from the Vercel docs search:

  • Use natural language to ask questions about the docs
  • View recent search queries and continue conversations
  • Easily copy code and markdown output
  • Leave feedback to help us improve the quality of responses

Open the search menu with ⌘K (or Ctrl+K on Windows) on vercel.com/docs.


Gemini AI Chatbot with Generative UI support

The Gemini AI Chatbot template is a streaming-enabled, Generative UI starter application. It's built with the Vercel AI SDK, Next.js App Router, and React Server Components & Server Actions.

This template features persistent chat history, rate limiting to prevent abuse, session storage, user authentication, and more.

The template uses the models/gemini-1.0-pro-001 model; however, the Vercel AI SDK lets you swap in another LLM provider (such as OpenAI, Anthropic, Cohere, or Hugging Face, or use LangChain) with just a few lines of code.

Try the demo or deploy your own.


Hostname support in Web Analytics

You can now inspect and filter hostnames in Vercel Web Analytics.

  • Domain insights: Analyze traffic by specific domains. This is beneficial for per-country domains, or for building multi-tenant applications.
  • Advanced filtering: Apply filters based on hostnames to view page views and custom events per domain.

This feature is available to all Web Analytics customers.

Learn more in our documentation about filtering.


Node.js v20 LTS is now generally available

Node.js 20 is now fully supported for Builds and Vercel Functions. You can select 20.x in the Node.js Version section on the General page in the Project Settings. The default version for new projects is now Node.js 20.

Node.js 20 offers improved performance and introduces new core APIs to reduce the dependency on third-party libraries in your project.

The exact version used by Vercel is currently 20.11.1, and minor and patch releases are applied automatically. Therefore, only the major version (20.x) is guaranteed.
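Besides the dashboard setting, you can pin the major version in your project's package.json via the standard npm engines field, which Vercel reads to pick the Node.js version. A minimal fragment:

```json
{
  "engines": {
    "node": "20.x"
  }
}
```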

Read the documentation for more.


Next.js AI Chatbot 2.0

The Next.js AI Chatbot template has been updated to use AI SDK 3.0 with React Server Components.

We've included Generative UI examples so you can quickly create rich chat interfaces beyond just plain text. The chatbot has also been upgraded to the latest Next.js App Router and Shadcn UI.

Lastly, we've simplified the default authentication setup by removing the need to create a GitHub OAuth application prior to initial deployment. This will make it faster to deploy and also easier for open source contributors to use Vercel Preview Deployments when they make changes.

Try the demo or deploy your own.


Skew Protection is now generally available

Last year, we introduced Vercel's industry-first Skew Protection mechanism and we're happy to announce it is now generally available.

Skew Protection solves two problems with frontend applications:

  1. If users try to request assets (like CSS or JavaScript files) in the middle of a deployment, Skew Protection enables truly zero-downtime rollouts and ensures those requests resolve successfully.
  2. Outdated clients continue to call the correct API endpoints (or React Server Actions) for their version, even after new server code is published from the latest deployment.

Since the initial release of Skew Protection, we have made the following improvements:

  • Skew Protection can now be managed through the UI in the advanced Project Settings
  • Pro customers now default to 12 hours of protection
  • Enterprise customers can get up to 7 days of protection

Skew Protection is now supported in SvelteKit (v5.2.0 of the Vercel adapter) in addition to Next.js (stable since v14.1.4), with support for more frameworks coming soon. Framework authors can view a reference implementation here.

Learn more in the documentation to get started with Skew Protection.
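At its core, the mechanism works by tagging each client with the ID of the deployment that served it, so later requests can be routed back to that same deployment. A minimal sketch of the idea in JavaScript (the `dpl` query parameter and `VERCEL_DEPLOYMENT_ID` environment variable follow Vercel's documented convention; the function name and fallback value are illustrative):

```javascript
// Sketch of how a framework adapter can tag asset URLs with the
// deployment ID that built the page, so Vercel routes those requests
// to the same deployment. Illustration only, not the actual adapter code.

// On Vercel, the deployment ID is exposed at build time via an
// environment variable; the fallback here is a made-up example value.
const deploymentId = process.env.VERCEL_DEPLOYMENT_ID ?? "dpl_example123";

// Append the deployment ID to a same-origin URL via the `dpl` query parameter.
function withDeploymentParam(url, id = deploymentId) {
  const u = new URL(url, "https://example.com");
  u.searchParams.set("dpl", id);
  return u.pathname + u.search;
}

console.log(withDeploymentParam("/_app/chunk.js"));
// e.g. "/_app/chunk.js?dpl=dpl_example123"
```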


Prioritize production builds available on all plans

To accelerate the production release process, customers on all plans can now prioritize changes to the Production Environment over Preview Deployments.

With this setting configured, any Production Deployment changes will skip the line of queued Preview Deployments and go to the front of the queue.

You can also increase your build concurrency limit to start multiple builds at once. Additionally, Enterprise customers can contact sales to purchase enhanced build machines with larger memory and storage.

Check out our documentation to learn more.


Manage your Vercel Functions CPU and memory in the dashboard

You can now configure function CPU from the project settings page, where you can change your project’s default memory and, by extension, CPU. Previously, this could only be changed in vercel.json.

The memory configuration of a function determines how much memory and CPU the function can use while executing. This new UI makes it clearer that increasing memory also increases vCPU, which can result in better performance depending on the workload type.

Existing workloads that have not modified vercel.json use the cost-effective basic option. Increasing function CPU raises the cost for the same duration but may make the function faster; in some cases, the speedup makes the change cost-neutral or even cheaper.

Check out the documentation to learn more.
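For reference, the vercel.json equivalent of the new dashboard setting looks like the fragment below (the glob pattern is illustrative, and 3009 MB corresponds to the highest memory tier at the time of writing):

```json
{
  "functions": {
    "api/**/*.js": {
      "memory": 3009
    }
  }
}
```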


Improved hard caps for Spend Management

Pro customers can now automatically pause all projects when a spend amount is reached.

Spend Management allows you to receive notifications, trigger a webhook, and now immediately pause projects when metered usage exceeds the set amount within the current billing cycle. This stops you from incurring further costs from your production deployments.

  • You'll receive real-time notifications when your spending approaches and exceeds the set amount. For further control, you can continue to use a webhook in addition to automatic project pausing.
  • Notifications include Web and Email alerts at 50%, 75%, and 100% of the set amount. You can also receive SMS notifications when your spending reaches 100%.

Check out our documentation to learn more.


View and override feature flags from the Vercel Toolbar

You can now view and override your application's feature flags from the Vercel Toolbar.

This means you can override flags provided by LaunchDarkly, Optimizely, Statsig, Hypertune, Split, or your custom setup without leaving your Vercel environment.

Vercel can now query an API Route defined in your application to find out about your feature flags, and will pick up their values by scanning the DOM for script tags. From there you can create overrides from the Vercel Toolbar, per session, for shorter feedback loops and improved QA and testing. Additionally, the overrides will be stored in an optionally encrypted cookie so your application can respect them.
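As an illustration, the API route the Toolbar queries returns your flag definitions as JSON; a hypothetical response might look like the fragment below (the flag name, fields, and values here are examples, not the exact schema — see the documentation for the precise contract):

```json
{
  "definitions": {
    "new-checkout-flow": {
      "description": "Enables the redesigned checkout",
      "options": [
        { "value": false, "label": "Off" },
        { "value": true, "label": "On" }
      ]
    }
  }
}
```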

This functionality is currently in beta and available to users on all plans.

Check out the documentation to learn more.

If you're a feature flag provider and interested in integrating with the Vercel Toolbar, contact us.
