Next.js on Vercel
Vercel is the native Next.js platform, designed to enhance the Next.js experience. Next.js is a fullstack React framework for the web, maintained by Vercel.
While Next.js works when self-hosting, deploying to Vercel is zero-configuration and provides additional enhancements for scalability, availability, and performance globally.
There are multiple ways to get started with Next.js on Vercel:
- Choose a template from the Vercel templates marketplace
- If you already have a project with Next.js, install Vercel CLI and run the `vercel` command from your project's root directory
- Clone one of our Next.js example repos to your favorite git provider and deploy it on Vercel with the deploy button
Vercel deployments can integrate with your git provider to generate preview URLs for each pull request you make to your Next.js project.
Incremental Static Regeneration (ISR) allows you to create or update content without redeploying your site. ISR has three main benefits for developers: better performance, improved security, and faster build times.
When self-hosting, ISR is limited to a single-region workload. Statically generated pages are not distributed closer to visitors by default without additional configuration or vendoring of a CDN. By default, self-hosted ISR also does not persist generated pages to durable storage; instead, these files live in the Next.js cache, which expires.
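On Vercel, no extra configuration is needed to take advantage of ISR. As a minimal sketch (the page path, data shape, and 60-second revalidation window below are illustrative assumptions, not a prescribed setup), a `pages`-router page opts into ISR by returning a `revalidate` interval from `getStaticProps`:

```tsx
// pages/blog/[slug].tsx — hypothetical page used only to illustrate ISR
import type { GetStaticPaths, GetStaticProps } from 'next';

interface PostProps {
  title: string;
}

export const getStaticPaths: GetStaticPaths = async () => {
  // Pre-render no paths at build time; generate each page on its first request.
  return { paths: [], fallback: 'blocking' };
};

export const getStaticProps: GetStaticProps<PostProps> = async ({ params }) => {
  // Placeholder data; replace with your own CMS or database call.
  const title = `Post: ${params?.slug}`;

  return {
    props: { title },
    // Regenerate this page in the background at most once every 60 seconds.
    revalidate: 60,
  };
};

export default function Post({ title }: PostProps) {
  return <h1>{title}</h1>;
}
```

With `fallback: 'blocking'`, pages that were not pre-rendered at build time are generated on their first request and then cached until the next revalidation.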
To summarize, using ISR with Next.js on Vercel:
- Better performance with our global Edge Network
Server-Side Rendering (SSR) allows you to render pages dynamically on the server. This is useful for pages where the rendered data needs to be unique on every request, such as checking authentication or looking at the location of an incoming request.
On Vercel, you can server-render Next.js applications in either the Node.js runtime (default) with Serverless Functions or the Edge runtime with Edge Functions. This allows you to pick the best rendering strategy on a per-page basis.
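As a minimal sketch of server-rendering on a per-page basis (the page path and the use of the `x-vercel-ip-country` geolocation header are illustrative assumptions), a `pages`-router page opts into SSR by exporting `getServerSideProps`, which runs on every request:

```tsx
// pages/dashboard.tsx — hypothetical page used only to illustrate SSR
import type { GetServerSideProps } from 'next';

interface DashboardProps {
  country: string;
}

// Runs on every request, so the response is never served from the static cache.
export const getServerSideProps: GetServerSideProps<DashboardProps> = async ({ req }) => {
  // x-vercel-ip-country is a geolocation header set by Vercel; it may be absent locally.
  const header = req.headers['x-vercel-ip-country'];
  const country = typeof header === 'string' ? header : 'unknown';

  return { props: { country } };
};

export default function Dashboard({ country }: DashboardProps) {
  return <p>This request came from: {country}</p>;
}
```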
To summarize, SSR with Next.js on Vercel:
- Has zero-configuration support for `Cache-Control` headers, including `stale-while-revalidate`
Image Optimization helps you achieve faster page loads by reducing the size of images and using modern image formats.
When deploying to Vercel, images are automatically optimized on demand, keeping your build times fast while improving your page load performance and Core Web Vitals.
When self-hosting, Image Optimization uses the default Next.js server for optimization. This server manages the rendering of pages and serving of static files.
To use Image Optimization with Next.js on Vercel, import the `next/image` component into the component you'd like to add an image to, as shown in the following example:
```tsx
import Image from 'next/image';

interface ExampleProps {
  name: string;
}

const ExampleComponent = (props: ExampleProps) => {
  return (
    <>
      {/* example.png is served from the project's public/ directory */}
      <Image
        src="/example.png"
        alt="Example picture"
        width={500}
        height={500}
      />
      <span>{props.name}</span>
    </>
  );
};

export default ExampleComponent;
```
To summarize, using Image Optimization with Next.js on Vercel:
- Zero-configuration Image Optimization when using `next/image`
@next/font enables built-in automatic self-hosting for any font file. This means you can optimally load web fonts with zero layout shift, thanks to the underlying CSS `size-adjust` property it uses.
This also allows you to use all Google Fonts with performance and privacy in mind. CSS and font files are downloaded at build time and self-hosted with the rest of your static files. No requests are sent to Google by the browser.
To use font optimization with Next.js on Vercel, first install the `@next/font` package:

```bash
pnpm i @next/font
```
Then, you can import a font from the `@next/font` package and use it to style elements, as shown below:
```tsx
import { Roboto } from '@next/font/google';

const roboto = Roboto({
  // Specifying weight is only required for
  // non-variable fonts.
  weight: '400',
  subsets: ['latin'],
  display: 'swap',
});

export default function FontExample({
  children,
}: {
  children: React.ReactNode;
}) {
  return (
    <html lang="en" className={roboto.className}>
      <body>{children}</body>
    </html>
  );
}
```
To summarize, using Font Optimization with Next.js on Vercel:
- Loads web fonts with zero layout shift
- Self-hosts CSS and font files at build time, so no requests are sent to Google by the browser
Dynamic social card images (using the Open Graph protocol) allow you to create a unique image for every page of your site. This is useful when sharing links on the web through social platforms or through text message.
The `@vercel/og` image generation library allows you to generate fast, dynamic social card images using Next.js API Routes.

On Vercel, your Next.js API Routes using Vercel OG are automatically optimized using Vercel Edge Functions and WebAssembly. This enables social card images to be generated faster, more cheaply, and at greater scale than with self-hosted Next.js.
To use OG image generation with Next.js, you must create a `pages/api` directory. Note that even if you're using Next.js 13 and your app lives in the `app` directory, you must still put your API routes in the `pages/api` directory.

You must also run Node v16 or higher, and install the `@vercel/og` package with the following command:

```bash
pnpm i @vercel/og
```
The following example demonstrates using OG image generation in both Next.js 12 and 13:
```tsx
import { ImageResponse } from '@vercel/og';

export const config = {
  runtime: 'edge',
};

export default function handler() {
  return new ImageResponse(
    (
      <div
        style={{
          fontSize: 128,
          background: 'white',
          width: '100%',
          height: '100%',
          display: 'flex',
          textAlign: 'center',
          alignItems: 'center',
          justifyContent: 'center',
        }}
      >
        Hello world!
      </div>
    ),
    {
      width: 1200,
      height: 600,
    },
  );
}
```
To see your generated image, run `npm run dev` in your terminal and visit the `/api/og` route in your browser (most likely `http://localhost:3000/api/og`).
To summarize, the benefits of using Vercel OG with Next.js include:
- Social card images generated with Edge Functions and WebAssembly, making them faster, cheaper, and more scalable than with self-hosted Next.js
Middleware is code that executes before a request is processed. Because Middleware runs before the cache, it's an effective way of providing personalization to statically generated content.
When self-hosting, Middleware is limited to a single-region workload and is not distributed closer to visitors without additional configuration or vendoring of Edge compute. Since Middleware runs before every request, using Edge compute is a more efficient and performant way to serve content.
To get started, create a `middleware.ts` file in the root directory of your project. The following example demonstrates Middleware that uses a `matcher` and geolocation information to block visitors from a specific country (Sweden, in this example) from visiting a secret page:
```ts
import { NextResponse } from 'next/server';
import type { NextRequest } from 'next/server';

// The country to block from accessing the secret page
const BLOCKED_COUNTRY = 'SE';

// Trigger this middleware to run on the `/secret-page` route
export const config = {
  matcher: '/secret-page',
};

export function middleware(req: NextRequest) {
  // Extract country. Default to US if not found.
  const country = (req.geo && req.geo.country) || 'US';

  console.log(`Visitor from ${country}`);

  // Specify the correct route based on the request's location
  if (country === BLOCKED_COUNTRY) {
    req.nextUrl.pathname = '/login';
  } else {
    req.nextUrl.pathname = `/secret-page`;
  }

  // Rewrite to the URL
  return NextResponse.rewrite(req.nextUrl);
}
```
To summarize, Middleware with Next.js on Vercel:
- Runs using Edge Middleware, which is deployed globally
Vercel supports streaming for Serverless Functions, Edge Functions, and React Server Components in Next.js projects. Streaming data allows you to fetch information in chunks rather than all at once, speeding up Function responses. Using streams can improve your app's user experience and prevent your Serverless and Edge Functions from failing when fetching large files.
To create an Edge Function that returns a readable stream, your Function's exported handler method should return a `Response` object that takes an instance of the `ReadableStream` interface as the first argument in its constructor. For example:
```ts
export const config = {
  runtime: 'edge',
};

export default async function handler(_request: Request) {
  const encoder = new TextEncoder();

  const customReadable = new ReadableStream({
    start(controller) {
      controller.enqueue(encoder.encode('Basic Streaming Test'));
      controller.close();
    },
  });

  return new Response(customReadable, {
    headers: { 'Content-Type': 'text/html; charset=utf-8' },
  });
}
```
See our docs on Edge Function streaming to learn more.
In Next.js 13 (using the `app` directory), you can use the `loading` file convention, or a `Suspense` component, to show an instant loading state from the server while the content of a route segment loads.

The `loading` file offers a solution for displaying a loading state for an entire route or route segment, rather than specific parts of a page. This file lives at the same level as the `layout` file that applies to the route, and its contents will be displayed until all data fetching in the route segment has finished.

The following example demonstrates a basic `loading` file:
```tsx
export default function Loading() {
  return <p>Loading...</p>;
}
```
Learn more about loading in Next 13's docs.
The `Suspense` component, introduced in React 18, enables you to display a fallback until the components nested within it have finished loading. This solution is more granular than showing a loading state for an entire route, and is useful when only sections of your UI need a loading state.

You can specify a component to show during the loading state with the `fallback` prop on the `Suspense` component, as shown below:
```tsx
import { Suspense } from 'react';
import { PostFeed, Weather } from './components';

export default function Posts() {
  return (
    <section>
      <Suspense fallback={<p>Loading feed...</p>}>
        <PostFeed />
      </Suspense>
      <Suspense fallback={<p>Loading weather...</p>}>
        <Weather />
      </Suspense>
    </section>
  );
}
```
To summarize, using Streaming with Next.js on Vercel:
- Supports streaming for Serverless Functions, Edge Functions, and React Server Components in the `app/` directory

Preview Mode enables you to view draft content from your Headless CMS immediately, while still statically generating pages in production.
When self-hosting, every request using Preview Mode hits the Next.js server, potentially incurring extra load or cost. Further, by spoofing the cookie, malicious users could attempt to gain access to your underlying Next.js server.
On Vercel, our Edge Network prevents invalid requests for Preview Mode automatically. If the preview cookie is invalid, our CDN responds and removes it. If the user is unauthenticated, you will not pay for any extra usage. Requests using Preview Mode are automatically routed to a Serverless Function instead of serving from the static cache.
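For reference, here is a minimal sketch of a preview API route in the `pages` router (the `PREVIEW_SECRET` environment variable and the query parameters are hypothetical; adapt the checks to your CMS):

```ts
// pages/api/preview.ts — hypothetical route used only to illustrate Preview Mode
import type { NextApiRequest, NextApiResponse } from 'next';

export default function handler(req: NextApiRequest, res: NextApiResponse) {
  // Reject requests that don't carry the expected secret or a slug to preview.
  if (req.query.secret !== process.env.PREVIEW_SECRET || !req.query.slug) {
    return res.status(401).json({ message: 'Invalid preview request' });
  }

  // Set the Preview Mode cookies so subsequent requests render draft content.
  res.setPreviewData({});

  // Redirect to the page being previewed.
  res.redirect(`/${req.query.slug}`);
}
```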
To summarize, the benefits of using Preview Mode with Next.js on Vercel include:
- Invalid Preview Mode requests are blocked at the Edge Network, so unauthenticated visitors never incur extra usage
- Valid Preview Mode requests are automatically routed to a Serverless Function instead of being served from the static cache
Vercel's Analytics features enable you to visualize and monitor your application's performance over time. The Analytics tab in your project's dashboard offers detailed insights into your website's visitors, with metrics like top pages, top referrers, and user demographics.
To use Analytics, navigate to the Analytics tab of your project dashboard on Vercel and select Enable in the modal that appears.
To track visitors and page views, we recommend first installing our `@vercel/analytics` package by running the terminal command below in the root directory of your Next.js project:

```bash
pnpm i @vercel/analytics
```
Then, follow the instructions below to add the `Analytics` component to your app, using either the `pages` directory or the `app` directory.

The `Analytics` component is a wrapper around the tracking script, offering a more seamless integration with Next.js.

If you are using the `pages` directory, add the following component to your main app file:
```tsx
import type { AppProps } from 'next/app';
import { Analytics } from '@vercel/analytics/react';

function MyApp({ Component, pageProps }: AppProps) {
  return (
    <>
      <Component {...pageProps} />
      <Analytics />
    </>
  );
}

export default MyApp;
```
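If you are using the `app` directory, a minimal sketch is to render the same `Analytics` component from your root layout (shown here as a bare-bones `app/layout.tsx`):

```tsx
// app/layout.tsx
import { Analytics } from '@vercel/analytics/react';

export default function RootLayout({
  children,
}: {
  children: React.ReactNode;
}) {
  return (
    <html lang="en">
      <body>
        {children}
        {/* Rendered once for the whole app, so every route is tracked. */}
        <Analytics />
      </body>
    </html>
  );
}
```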
To summarize, Analytics with Next.js on Vercel:
- Offers detailed insights into your website's visitors, with metrics like top pages, top referrers, and user demographics
You can see data about your project's Core Web Vitals performance in your dashboard on Vercel. Doing so will allow you to track your web application's loading speed, responsiveness, and visual stability so you can improve the overall user experience.
On Vercel, you can track your Next.js app's Core Web Vitals in your project's dashboard.
If you're self-hosting your app, you can use the `reportWebVitals` hook to send metrics to any analytics provider. Doing so requires creating your own custom `App` component file. Then you must export a `reportWebVitals` function from your custom `App` component, as demonstrated below:
```jsx
export function reportWebVitals(metric) {
  switch (metric.name) {
    case 'FCP':
      // handle FCP results
      break;
    case 'LCP':
      // handle LCP results
      break;
    case 'CLS':
      // handle CLS results
      break;
    case 'FID':
      // handle FID results
      break;
    case 'TTFB':
      // handle TTFB results
      break;
    case 'INP':
      // handle INP results (note: INP is still an experimental metric)
      break;
    default:
      break;
  }
}

function MyApp({ Component, pageProps }) {
  return <Component {...pageProps} />;
}

export default MyApp;
```
Next.js uses Google's `web-vitals` library to measure the Web Vitals metrics available in `reportWebVitals`.
To summarize, tracking Web Vitals with Next.js on Vercel:
- Enables you to track traffic performance metrics, such as First Contentful Paint or First Input Delay
- Shows you a score for your app's performance on each recorded metric, which you can use to track improvements or regressions
Vercel has partnered with popular service providers, such as MongoDB and Sanity, to create integrations that make using those services with Next.js easier. There are many integrations across multiple categories, such as Commerce, Databases, and Logging.
To summarize, Integrations on Vercel:
- Make it easier to use services such as MongoDB and Sanity with Next.js, across categories like Commerce, Databases, and Logging
See our Frameworks documentation page to learn about the benefits available to all frameworks when you deploy on Vercel.
Learn more about deploying Next.js projects on Vercel with the following resources: