LiteLLM server now supported on Vercel


You can now deploy LiteLLM server on Vercel, giving developers an OpenAI-compatible gateway that connects to any supported LLM provider, including Vercel AI Gateway.

app.py
from litellm.proxy import proxy_server
app = proxy_server.app

Basic LiteLLM Gateway app
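
For this app to build on Vercel, LiteLLM's proxy dependencies must be available at install time. A minimal requirements.txt might look like the following (the file name is an assumption based on Vercel's Python runtime conventions, not something specified in this post):

```
litellm[proxy]
```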

To route a single model through Vercel AI Gateway, use the following configuration in litellm_config.yaml:

litellm_config.yaml
model_list:
  - model_name: gpt-5.4-gateway
    litellm_params:
      model: vercel_ai_gateway/openai/gpt-5.4
      api_key: os.environ/VERCEL_AI_GATEWAY_API_KEY

Routing a model through Vercel AI Gateway in LiteLLM
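
Once deployed, the gateway can be queried with any OpenAI-compatible client. A minimal sketch using the OpenAI Python SDK, assuming a deployment URL and a LiteLLM key (both placeholders below, not values from this post):

```python
# Hypothetical usage sketch: the base_url and api_key are placeholders.
from openai import OpenAI

# Point the standard OpenAI client at the LiteLLM gateway deployed on Vercel.
client = OpenAI(
    base_url="https://your-app.vercel.app",
    api_key="sk-1234",  # a LiteLLM virtual key or master key
)

# "gpt-5.4-gateway" is the model_name declared in litellm_config.yaml;
# LiteLLM resolves it and routes the request through Vercel AI Gateway.
response = client.chat.completions.create(
    model="gpt-5.4-gateway",
    messages=[{"role": "user", "content": "Hello from LiteLLM on Vercel!"}],
)
print(response.choices[0].message.content)
```

Querying the deployed gateway with the OpenAI SDK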

Deploy LiteLLM on Vercel, or learn more in our documentation.