
Manage and optimize usage for Data Cache

Learn how to understand the different charts in the Vercel dashboard, how usage relates to billing, and how to optimize your usage of Data Cache.

Data Cache is available in Beta on all plans

This section details our improved infrastructure pricing. On April 25, 2024, these changes will apply to all new Pro customers. Starting May 25, 2024, current Pro customers will see these changes take effect on their next billing cycle. The Hobby tier remains free. The granular Vercel Data Cache, introduced with the App Router, is still in public beta and will not be charged until it is generally available.

The Data Cache section shows the following charts:

Manage and Optimize pricing

  • Overview: The usage from fetch requests to origins. Not priced.
  • Reads: The total amount of Read Units used to access the Data Cache. Priced.
  • Writes: The total amount of Write Units used to store new data in the Data Cache. Priced.

Reads and writes to the Data Cache are measured in 8 KB units:

  • Read unit: One read unit equals 8 KB of data read from the cache
  • Write unit: One write unit equals 8 KB of data written to the cache
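
For example, a single 20 KB response read from the cache would count as roughly three read units (20 KB spread across 8 KB units), and writing that same response would count as roughly three write units, assuming each operation is rounded up to whole units.
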
Managed Infrastructure pricing

  • Read Units: The first 1,000,000 are included on Hobby and the first 10,000,000 on Pro; additional usage on Pro is billed at $0.40 per 1,000,000 read units.
  • Write Units: The first 200,000 are included on Hobby and the first 2,000,000 on Pro; additional usage on Pro is billed at $4.00 per 1,000,000 write units.
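
As a rough illustration of how these allowances translate into a bill (an estimate only, not an official billing formula, and ignoring regional price differences), a Pro team's monthly overage could be modeled like this:

```ts
// Illustrative estimate only: assumes the Pro allowances and prices from the table above.
const PRO_INCLUDED_READS = 10_000_000;  // read units included on Pro
const PRO_INCLUDED_WRITES = 2_000_000;  // write units included on Pro
const READ_PRICE_PER_MILLION = 0.4;     // USD per additional 1,000,000 read units
const WRITE_PRICE_PER_MILLION = 4.0;    // USD per additional 1,000,000 write units

function estimateProOverage(readUnits: number, writeUnits: number): number {
  const extraReads = Math.max(0, readUnits - PRO_INCLUDED_READS);
  const extraWrites = Math.max(0, writeUnits - PRO_INCLUDED_WRITES);
  return (
    (extraReads / 1_000_000) * READ_PRICE_PER_MILLION +
    (extraWrites / 1_000_000) * WRITE_PRICE_PER_MILLION
  );
}

// 15M read units and 3M write units in a month ≈ $2.00 + $4.00 = $6.00 in overage
console.log(estimateProOverage(15_000_000, 3_000_000));
```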

Incremental Static Regeneration (ISR): Pages that use ISR can use Data Cache when they're generated from a mix of static and dynamic data. Therefore, they can contribute towards the cost of Data Cache.

The Data Cache overview chart shows the usage from fetch requests divided by:

  • Hits: Percentage of fetch requests to cache that result in a cache hit
  • Misses: Percentage of fetch requests to cache that result in a cache miss
  • Requests: Number of requests to any unique path
  • Bandwidth: Amount of data transferred from any unique path

You get charged based on the amount of data read from your Data Cache and the region(s) in which the reads happen.

When viewing your Data Cache read units chart, you can group by:

  • Origin: To see the number of reads from either the Vercel Data Cache or Incremental Static Regeneration (ISR)
  • Projects: To see the number of read units for each project
  • Region: To see the number of read units for each region

You get charged based on the number of write units written to your Data Cache and the region(s) in which the writes happen.

When viewing your Data Cache writes chart, you can group by the sum of units to see a total of all writes across your team's projects, or group by:

  • Origin: To see the number of writes to either the Vercel Data Cache or Incremental Static Regeneration (ISR)
  • Projects: To see the number of write units for each project
  • Region: To see the number of write units for each region

Consider the following methods to optimize your Data Cache writes:

  • Use a higher revalidate value: Selecting a higher revalidate value means data is marked as stale less frequently, reducing the need for new writes (as illustrated after this list)
  • Use on-demand revalidation: Moving to on-demand revalidation can also help reduce the number of writes. This allows you to manually control when certain data is revalidated, reducing unnecessary writes
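
To put the first option in perspective: with revalidate: 60, a single cached fetch can be refreshed, and therefore rewritten, up to 1,440 times per day, while revalidate: 3600 caps it at 24 rewrites per day, assuming traffic keeps the entry in use.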

You incur charges based on the volume of data read from and written to your Data Cache and the geographical regions where those operations occur. Adjusting revalidation intervals influences how often data is fetched from the cache. By increasing the interval between revalidations for static data, and managing revalidations for dynamic data, you can reduce the frequency of origin reads.

While this strategy impacts origin reads directly, it also indirectly reduces cache writes, since data remains valid in the cache for longer periods. This maximizes the cost-efficiency of each cache operation, which can be significant when using both granular Data Cache and ISR, as each system adapts its caching strategy based on the nature of the data processed.

To manage and reduce these costs effectively, consider the following revalidation strategies:

Time-based revalidation

Best for static content that does not change frequently

Set a revalidation interval for each cached item based on how frequently it updates. For data that rarely changes (such as an about page or hero image), extend the revalidation period. This lets data remain cached for longer before it is marked as stale and re-fetched from the origin, minimizing the frequency of origin reads. This is cost-effective for static components in pages using ISR.

For more details, see the time-based revalidation example.
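
As a minimal sketch of time-based revalidation in the Next.js App Router (the URL and the one-week interval below are placeholders; choose values that match how often your content actually changes):

```tsx
// app/about/page.tsx — illustrative route
// Revalidate cached data for this segment at most once per week (604,800 seconds).
export const revalidate = 604800;

export default async function AboutPage() {
  // Alternatively, set the interval per request on the fetch itself.
  const res = await fetch('https://api.example.com/about', {
    next: { revalidate: 604800 },
  });
  const data = await res.json();

  return <h1>{data.title}</h1>;
}
```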

On-demand revalidation

Best for dynamic content that changes frequently

Trigger revalidations manually for specific data as needed, instead of on fixed intervals. This is useful for dynamic data where updates do not follow a predictable pattern, such as user-specific content or product prices. By controlling when data is revalidated, you can avoid unnecessary origin reads and refresh only when necessary.

For more details, see the on-demand revalidation example.
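
A minimal sketch of on-demand revalidation using a Route Handler (the endpoint path, payload shape, and '/products' fallback are illustrative assumptions, not a required API):

```ts
// app/api/revalidate/route.ts — illustrative endpoint
import { revalidatePath } from 'next/cache';
import { NextRequest, NextResponse } from 'next/server';

// Call this route (for example, from a CMS webhook) only when the underlying data changes.
export async function POST(request: NextRequest) {
  const { path } = await request.json();

  // Mark the cached data for this path as stale; it is re-fetched on the next request.
  revalidatePath(path ?? '/products');

  return NextResponse.json({ revalidated: true, now: Date.now() });
}
```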

Tag-based revalidation

Best for groups of related data that should update together

Group related data using tags and revalidate all associated items simultaneously with revalidateTag. This refreshes entire groups of data at once, reducing individual origin reads. This is useful when using ISR, allowing for bulk updates of static content.

For more details, see the tag-based revalidation example.
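
A minimal sketch of tag-based revalidation (the 'seasonal-sale' tag, URL, and helper names are illustrative):

```ts
// lib/products.ts — illustrative server-side helpers
import { revalidateTag } from 'next/cache';

// Tag the cached fetch so every entry in the group can be refreshed together.
export async function getSeasonalSaleProducts() {
  const res = await fetch('https://api.example.com/products?collection=winter', {
    next: { tags: ['seasonal-sale'] },
  });
  return res.json();
}

// Call this from a Server Action or Route Handler when the sale data changes;
// every cached fetch tagged 'seasonal-sale' is marked stale at once.
export function refreshSeasonalSale() {
  revalidateTag('seasonal-sale');
}
```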

Real-world scenario for cache revalidation intervals

Consider an e-commerce platform with a global customer base. This site has both static content (such as product descriptions) and dynamic content (such as customer reviews and stock levels).

To optimize cache performance and reduce costs, the site can adjust revalidation intervals based on the nature of the content:

Optimizing static content: For static product descriptions that rarely change, set longer revalidation intervals. For example, revalidating these pages every week rather than daily reduces origin reads. The longer interval ensures that the product descriptions remain cached longer before they are marked as stale and re-fetched from the origin, optimizing cache performance.

Optimizing dynamic content: For dynamic content like customer reviews and stock levels, which frequently change, use on-demand revalidation to trigger revalidations when a customer submits a review or when stock levels update. This approach prevents unnecessary origin reads by refreshing content only when there are actual changes.

Utilizing tag-based revalidation: To manage promotions or seasonal collections that involve multiple related products, use tag-based revalidation. By tagging all products in a seasonal sale and revalidating them simultaneously with revalidateTag, all products associated with the tag are refreshed at once. This is useful during high-traffic periods like Black Friday, allowing the site to handle large volumes of static content updates efficiently.

Adjusting revalidation intervals based on the nature of the content ensures that the cache is optimized for both static and dynamic data. This can reduce the frequency of origin reads and writes and enhance performance and cost-efficiency for your site.

The bandwidth chart shows the amount of data the Vercel Data Cache has received and sent for your projects. You can group by:

  • Ratio: To see the amount of data transferred and written by the Data Cache
  • Projects: To see the amount of data transferred and written for each project, and a percentage of the total

The revalidation chart shows the number of revalidation requests made to the Data Cache. You can group by:

  • Ratio: To see the number of revalidation requests made to the Data Cache
  • Projects: To see the number of revalidation requests for each project
Last updated on April 29, 2024