If you are encountering the error 429: FUNCTION_RATE_LIMIT with your Vercel Serverless Functions, it means you have exhausted the default limit of 1000 concurrent executions. While this is extremely rare, this article covers cases in which it may happen, how to investigate further, and how you can increase the limit.
If a request triggers a Serverless Function, a single instance of the function is initialized. If another request arrives before the first one has finished returning, another instance is initialized, resulting in two functions running concurrently. Once an instance is initialized, it remains alive for a short period of time, and state such as temporary files, memory caches, and sub-processes is preserved so that the instance can be re-used. You can read more about the conceptual model of Serverless Functions and their lifecycle in our documentation.
Serverless Functions have a maximum concurrency of 1000 across all projects within a given account scope and region. Once that limit is exceeded, a 429 status code is returned. Log Drains can provide further insight into which requests are being rejected. If you or your team needs increased concurrency for high-traffic events, such as marketing campaigns, product launches, or swag drops, please get in touch with our Sales Team to explore your options.
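To get a feel for when the limit comes into play, concurrency can be approximated as request rate multiplied by average function duration (Little's law). A sketch, with illustrative numbers only:

```javascript
// Rough estimate of steady-state concurrency (Little's law):
// concurrent executions ≈ requests per second × average duration in seconds.
function estimateConcurrency(requestsPerSecond, avgDurationSeconds) {
  return requestsPerSecond * avgDurationSeconds;
}

// For example, 200 req/s with a 5s average response time already
// saturates the default limit of 1000 concurrent executions.
console.log(estimateConcurrency(200, 5)); // 1000
```

This also shows why slow responses matter as much as raw traffic: halving the average duration roughly halves the concurrency consumed.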
Serverless Function concurrency can be controlled by reducing the number of Function invocations. If your response can be cached, even for a short period of time, adding the appropriate cache-control headers can limit the number of requests that reach your Function while also speeding up your responses. Other options to consider: Can my project use less Server-Side Rendering or fewer API routes? Can I leverage Incremental Static Regeneration? Can I deploy my Serverless Functions to multiple regions? Can I introduce a Serverless Function Execution Timeout lower than my plan limit?
Hitting the Serverless Function concurrency limit is often a symptom, with the root cause still to be discovered. If you see Serverless Execution time per request increasing, available function capacity is being consumed faster as well. We recommend checking whether your upstream provider (API, CMS, database, etc.) is scaling with Vercel. If it is not, a concurrency increase will only put further strain on your upstream provider.
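One way to keep a slow upstream from holding concurrency slots open is to cap how long the function waits for it. A generic sketch (the `withTimeout` helper and the 2-second budget are illustrative, not a Vercel API):

```javascript
// Races a promise (e.g. an upstream fetch) against a timer so a slow
// upstream cannot keep the function, and its concurrency slot, alive
// for the full plan-level execution limit.
function withTimeout(promise, ms) {
  let timer;
  const timeout = new Promise((_, reject) => {
    timer = setTimeout(() => reject(new Error('upstream timeout')), ms);
  });
  return Promise.race([promise, timeout]).finally(() => clearTimeout(timer));
}

// Usage (illustrative URL): fail fast after 2s instead of waiting on
// an upstream that is not scaling with your traffic.
// const response = await withTimeout(fetch('https://api.example.com/items'), 2000);
```

Failing fast like this trades a degraded response for freed capacity, and the resulting timeout errors in your logs point you at the upstream bottleneck.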