Community is an integral part of Vercel. In order to advance our community efforts, we are proud to announce an all-new design for Vercel TV. Vercel TV is a collection of community learning materials and conference talks that aims to support, encourage, and cultivate developer communities around the world.
In this blog post, we will walk through some of the exciting new aspects of this reimagined web experience, along with some of the technical details around implementing the new Vercel TV.

Refined Page Design

Redesigned TV page

To make access to information as easy as possible, we have introduced features that enable more comfortable navigation across the Vercel TV page, allowing categorized content to be expanded and collapsed as needed. We believe this model more clearly guides users toward relevant resources.

New Player


New Custom Player Controls

Continuing to pull on the thread of making our content more accessible, we have created a new player interface that supports a theater-style appearance for distraction-free viewing that prioritizes content. The player additionally features the custom controls we have previously rolled out to our blog posts, including thumbnail previews while seeking and full-screen support.

Behind the Scenes

The new Vercel TV experience is made possible by integrating with Mux: an API-first platform, powered by data and designed by experts to make beautiful playback possible for every development team, including our team at Vercel.
Choosing the right tool (or the right API) for the job is important to us. Just as we see Vercel as the right tool for serverless deployments, we see Mux as the right tool for Vercel TV. Here's why.

Straightforward Integration

One of the benefits of serverless computing is that server management and capacity planning decisions are completely hidden from developers and instead managed by experts and teams more specialized in that specific domain. Mux essentially provides a similar "serverless-style" solution when it comes to video: stream management and bandwidth-centric decisions are completely hidden from us and managed by more specialized experts in the field of video over HTTP.
We see similarities between Vercel for deployment APIs and Mux for video APIs: both tools aim to provide a pleasant developer experience through idiomatic API design while alleviating the conceptual load placed on developers, enabling more efficient workflows.
Integration with Mux was fairly straightforward, mainly because Mux can ingest existing assets by URL. Since we already had our media archive in a cloud storage bucket, uploading to Mux was a matter of a single API call: an HTTP POST request containing the URL of the file.
fetch('https://api.mux.com/video/v1/assets', {
  method: 'POST',
  // Real requests must also be authenticated with a Mux access token.
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    input: videoUrl
  })
})
From there, the file is ingested, and an asset is created and prepared for streaming across a variety of devices and bandwidths, resulting in lower costs and better viewer experiences. Mux uses HLS for delivery, which makes videos highly available even as a given user's bandwidth fluctuates.
After POSTing to Mux, we are given an ID. Let us call this videoId. Our uploaded asset will then be available at a URL along the lines of stream.mux.com/videoId.m3u8 or similar. Note the m3u8 file extension; we will come back to that later.
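In practice, Mux's asset-creation response nests one or more playback IDs under data.playback_ids, and it is a playback ID that goes into the stream URL. Here is a minimal sketch of how one might pull that out of the response; the helper names getPlaybackId and toStreamUrl (and the trimmed-down example response) are ours, not part of the Mux API:

```javascript
// Hypothetical helpers for working with a Mux asset-creation response.
// Mux returns the new asset under a `data` envelope, with playback IDs
// nested in `data.playback_ids`.
const getPlaybackId = (muxResponse) => muxResponse.data.playback_ids[0].id

// HLS streams for a playback ID are served from stream.mux.com.
const toStreamUrl = (playbackId) => `https://stream.mux.com/${playbackId}.m3u8`

// Example response, trimmed to only the fields we use here:
const response = {
  data: {
    id: 'asset-id',
    playback_ids: [{ id: 'abc123', policy: 'public' }]
  }
}

const videoId = getPlaybackId(response)
const streamUrl = toStreamUrl(videoId)
```

With the example response above, streamUrl would be https://stream.mux.com/abc123.m3u8.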
Since the entirety of this website is built with Next.js, and by extension, React, we have created a simple component that ties it all together by playing a video from Mux and using our custom player and controls to deliver what we hope is an exceptional user experience.


Since our player and its controls are regular React components, we can support seeking with thumbnails provided by Mux's Thumbnail API.

Player in action.

Here is a code snippet that we could use to create a list (or a Fragment) of dynamically generated thumbnails from the Thumbnail API.
const videoTitle = getVideoTitle(myVideo)
const videoDuration = getVideoDuration(myVideo)
const seekThumbnailWidth = 300
const seekThumbnailHeight = 120
const segment = videoDuration >= 30 ? Math.floor(videoDuration / 30) : 1

const makeThumbnails = (existingThumbnails = [], time = 0) => {
  if (time >= videoDuration) {
    return existingThumbnails
  }

  // videoId comes from the earlier upload step.
  const muxThumbsApiEndpoint =
    `https://image.mux.com/${videoId}/thumbnail.jpg?` +
    `width=${seekThumbnailWidth}` +
    `&height=${seekThumbnailHeight}` +
    `&time=${time}`

  const newThumbnail = <img key={time} alt={videoTitle} src={muxThumbsApiEndpoint} />
  return makeThumbnails([...existingThumbnails, newThumbnail], time + segment)
}
Creating a list of thumbnail images from the Thumbnail API

We would then use this fragment in our <Controls /> component, allowing users to see more clearly where they will land as they seek. In fact, the video above uses these dynamically generated thumbnails in its own seeker controls, thanks to the Thumbnail API.

HTTP Live Streaming

Mux uses HLS for high-availability delivery of content to consumers. HLS is typically considered the right choice for a number of reasons, the most common being:
  • HLS can play on nearly all devices.
  • HLS uses common formats and codecs.
  • HLS works over HTTP and reduces the complexity of having to use other protocols.

How does it work?

HLS streams are made up of manifests. The primary manifest (also called the master manifest) is a playlist (commonly seen with the extension .m3u8) that lists the assortment of sizes and types available for a single video. For example, a master manifest could list a video at different resolutions: 320p, 720p, 4K, and so on. Each of these entries points to a media manifest.
Each media manifest represents a different version of a video: a unique resolution, bitrate, and codec combination. For example, one media manifest describes 1080p at 5 Mbps, while another describes 720p at 3 Mbps.
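To make this concrete, a pared-down master manifest matching the example above could look like the following. This is a sketch of the HLS playlist format, not an actual Mux manifest; real manifests carry more attributes (codecs, frame rates, and so on), and the media-manifest paths here are placeholders:

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=5000000,RESOLUTION=1920x1080
1080p/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=3000000,RESOLUTION=1280x720
720p/index.m3u8
```

Each URI line points at one media manifest, and the BANDWIDTH and RESOLUTION attributes tell the player what that rendition costs to play.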
It might be best to think of these manifests as CSS Media Queries or responsive images but for web video.
Media manifests are playlists that link to short segments (2-5 seconds or so) of video. Conceptually, these are similar to chunks in most streaming paradigms, including ReadableStreams in Node.js and the browser.
The key reason for splitting a video into chunks is that it allows an HLS player to adapt to varying network conditions: it can switch to a lower-quality set of chunks when the network degrades, and select higher-quality chunks in optimal conditions, all on the fly.
The master manifest is passed into an HLS player and the show begins.


Choosing the right tools and APIs for a given task is important to us at Vercel. Given the sentiments above, and how well the various moving parts integrate, we believe we have done just that in crafting our redesigned Vercel TV experience.
Our hope in creating and curating this showcase of community content is that the knowledge shared in and around Vercel's sphere of influence will become more accessible to all interested parties, further equipping and encouraging the developer community at large.