
Manage and optimize usage for Serverless Functions

Learn how to understand the different charts in the Vercel dashboard, how usage relates to billing, and how to optimize your usage of resources.

The Serverless Functions section shows the following charts:

Manage and Optimize pricing

| Metric               | Description                                                                     | Priced | Optimize   |
| -------------------- | ------------------------------------------------------------------------------- | ------ | ---------- |
| Function Invocations | The number of times your Functions have been invoked                            | Yes    | Learn More |
| Function Duration    | The time your Serverless Functions have spent responding to requests            | Yes    | Learn More |
| Throttles            | Instances where requests to Functions are not served due to concurrency limits  | No     | N/A        |
Managed Infrastructure Hobby and Pro resources

| Resource             | Hobby Included      | Pro Included          | Pro Extra                       |
| -------------------- | ------------------- | --------------------- | ------------------------------- |
| Function Duration    | First 100 GB-Hours  | First 1,000 GB-Hours  | $0.18 per 1 GB-Hour             |
| Function Invocations | First 100,000       | First 1,000,000       | $0.60 per 1,000,000 Invocations |

You are charged based on the number of times your Functions are invoked. This includes both successful and errored invocations, but does not include cache hits. Invocations are counted regardless of the response status code.

When using Incremental Static Regeneration with Next.js, both the revalidate option for getStaticProps and fallback for getStaticPaths will result in a Function invocation on revalidation, not for every user request.
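
For illustration, here is a minimal Next.js Pages Router sketch of those two options; the page path, revalidation interval, and fallback mode are assumptions, not values Vercel prescribes:

```tsx
// pages/posts/[slug].tsx — hypothetical page using Incremental Static Regeneration.
import type { GetStaticPaths, GetStaticProps } from 'next';

export const getStaticPaths: GetStaticPaths = async () => ({
  paths: [], // nothing pre-rendered at build time
  fallback: 'blocking', // missing paths are generated on demand
});

export const getStaticProps: GetStaticProps = async ({ params }) => ({
  props: { slug: params?.slug ?? null },
  // Re-generate the page at most once every 60 seconds; each revalidation is
  // one Function invocation, regardless of how many users request the page.
  revalidate: 60,
});

export default function Post({ slug }: { slug: string | string[] | null }) {
  return <h1>{String(slug)}</h1>;
}
```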

When viewing your Function Invocations graph, you can group by Ratio to see a total of all invocations across your team's projects that finished successfully, errored, or timed out.

Serverless Function execution also increases edge network usage. Caching your Serverless Function's responses reduces its GB-hours, but does not reduce the network usage incurred whenever the function does execute.

  • Use the Projects option to see the total number of invocations for each project within your team. This can help you identify which projects are using the most invocations and where you can optimize
  • Cache your responses using edge caching and Cache-Control headers, as shown in the sketch after this list. This can help reduce the number of invocations that your Functions receive and makes responses faster for users
  • See How can I reduce my Serverless Execution usage on Vercel? for more general information on how to reduce your Serverless Functions usage.
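
As a sketch of the caching approach above, assuming a Node.js Serverless Function under api/ (the route, response body, and header values are illustrative):

```ts
// api/products.ts — hypothetical Vercel Serverless Function.
import type { VercelRequest, VercelResponse } from '@vercel/node';

export default function handler(req: VercelRequest, res: VercelResponse) {
  // s-maxage lets the edge cache serve repeat requests without invoking the
  // function again; stale-while-revalidate serves the cached copy while a
  // fresh response is generated in the background.
  res.setHeader('Cache-Control', 's-maxage=60, stale-while-revalidate=300');
  res.status(200).json({ products: [] });
}
```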

You are charged based on the amount of time your Serverless Functions have run. This is sometimes called "wall-clock time", which refers to the actual time elapsed during a process, similar to how you would measure time passing on a wall clock. It includes all time spent from start to finish of the process, regardless of whether that time was actively used for processing or spent waiting for a streamed response. Function Duration is calculated in GB-Hours, which is the memory allocated to each Function in GB multiplied by the time in hours it was running.

For example, if a function has 1.7 GB (1769 MB) of memory and is executed 1 million times at a 1-second duration:

  • Total Seconds: 1M * (1s) = 1,000,000 Seconds
  • Total GB-Seconds: 1769/1024 GB * 1,000,000 Seconds = 1,727,539.06 GB-Seconds
  • Total GB-Hrs: 1,727,539.06 GB-Seconds / 3600 = 479.87 GB-Hrs
  • The total Serverless Function Execution is 479.87 GB-Hrs.
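
The same arithmetic expressed as a small TypeScript helper, using the numbers from the example above (the helper name is illustrative, not part of any Vercel API):

```ts
// Rough GB-Hours estimate for a batch of invocations.
function gbHours(memoryMb: number, invocations: number, avgDurationSeconds: number): number {
  const gbSeconds = (memoryMb / 1024) * invocations * avgDurationSeconds;
  return gbSeconds / 3600;
}

console.log(gbHours(1769, 1_000_000, 1).toFixed(2)); // ≈ 479.87
```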

To see your current usage, navigate to the Usage tab on your team's Dashboard and go to Serverless Functions > Duration. You can use the Ratio option to see the total amount of execution time across all projects within your team, including the completions, errors, and timeouts.

  • Use the Projects option to see the total amount of execution time for each project within your team. This can help you identify which projects are using the most execution time and where you can optimize
  • By default, Functions get 1 vCPU and a default memory allocation, but can be configured to use more. This may affect how long your functions run
  • You can also adjust the maximum duration for your functions to prevent them from running for too long (see the sketch after this list)
  • To reduce the GB-hours (Execution) of your functions, ensure you are using edge caching and Cache-Control headers. If you are using Incremental Static Regeneration, note that Vercel counts Function invocations that happen on page revalidation towards both GB-hours and Fast Origin Transfer
  • For troubleshooting issues that may be causing your functions to run longer than expected or timeout, see What can I do about Vercel Serverless Functions timing out?
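
As a minimal sketch of capping a function's maximum duration, assuming a Next.js App Router route handler (the path, the 30-second limit, and the generateReport helper are assumptions; memory is adjusted separately, for example in your project settings or vercel.json):

```ts
// app/api/report/route.ts — hypothetical route handler with a capped duration.
export const maxDuration = 30; // seconds; the function stops if it runs longer

export async function GET() {
  const data = await generateReport(); // assumed to be a potentially slow operation
  return Response.json(data);
}

// Hypothetical helper, included only so the sketch is self-contained.
async function generateReport() {
  return { ok: true };
}
```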

This counts the number of times that a request to your Functions could not be served because the concurrency limit was hit.

While this is not a chargeable metric, it will cause a 429: FUNCTION_RATE_LIMIT error. To learn more, see What should I do if I receive a 429 error on Vercel?.

Last updated on July 24, 2024