You can now deploy the LiteLLM server on Vercel, giving developers an OpenAI-compatible gateway to any supported LLM provider, including Vercel AI Gateway.
```python
from litellm.proxy import proxy_server

app = proxy_server.app
```

Basic LiteLLM Gateway app
To route a single model through Vercel AI Gateway, use the following configuration in `litellm_config.yaml`:
```yaml
- model_name: gpt-5.4-gateway
  litellm_params:
    model: vercel_ai_gateway/openai/gpt-5.4
    api_key: os.environ/VERCEL_AI_GATEWAY_API_KEY
```

Routing a model through Vercel AI Gateway in LiteLLM
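Because the gateway speaks the OpenAI API, any OpenAI-compatible client can call it once deployed. Below is a minimal sketch that builds a chat completions request against the gateway using only the standard library; the deployment URL and the `Bearer` key are placeholders for your own deployment, and the model name matches the `model_name` from the config above.

```python
import json
import urllib.request

# Placeholder: replace with your own Vercel deployment URL.
GATEWAY_URL = "https://my-litellm.vercel.app/v1/chat/completions"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat completion request for the gateway."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        GATEWAY_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            # Placeholder key: a LiteLLM virtual key, if you configured one.
            "Authorization": "Bearer sk-1234",
        },
    )

req = build_request("gpt-5.4-gateway", "Hello!")
# Sending the request requires a live deployment:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

The same request works with the official `openai` client by setting its `base_url` to the deployment URL.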
Deploy LiteLLM on Vercel, or learn more in our documentation.