Python is now supported in the ongoing in-function concurrency public beta.
In-function concurrency lets a single function instance handle multiple invocations at the same time, improving resource efficiency. By reusing active instances instead of spinning up new ones, it reduces idle compute time and the associated costs.
In-function concurrency is particularly beneficial for workloads that make external API or database calls, such as requests to AI models, where functions often sit idle while waiting for responses.
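As an illustration, consider the kind of I/O-bound handler described above. This is a minimal sketch, not taken from the announcement: the FastAPI framework, the httpx client, and the upstream endpoint URL are assumptions made for the example. While one invocation awaits the external call, in-function concurrency lets the same instance begin serving other invocations instead of sitting idle.

```python
# Minimal sketch of an I/O-bound Python function (illustrative only; the
# framework, client library, and endpoint URL below are assumptions).
import httpx
from fastapi import FastAPI

app = FastAPI()

@app.post("/api/summarize")
async def summarize(payload: dict):
    # The instance spends most of its time waiting on this network call;
    # with in-function concurrency, that wait can overlap with other invocations.
    async with httpx.AsyncClient(timeout=30.0) as client:
        response = await client.post(
            "https://api.example.com/v1/generate",  # hypothetical model endpoint
            json={"prompt": payload.get("text", "")},
        )
    return response.json()
```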
The in-function concurrency public beta is available to Pro and Enterprise customers using Standard or Performance Function CPU, and can be enabled through your dashboard. Real-time tracking of resource savings is available in Observability.
Learn more in our blog post and documentation, or get started with our template by enabling In-function concurrency in your project settings.