OpenAI Codex
OpenAI Codex is OpenAI's agentic coding tool. You can configure it to use Vercel AI Gateway, enabling you to:
- Route requests through multiple AI providers
- Monitor traffic and spend in your AI Gateway Overview
- View detailed traces in Vercel Observability under AI
- Use any model available through the gateway
Configure Codex to use AI Gateway through its configuration file so the settings persist across sessions.
Follow the installation instructions on the OpenAI Codex repository to install the Codex CLI tool.
Set your AI Gateway API key in your shell configuration file, for example in `~/.zshrc` or `~/.bashrc`:

```bash
export AI_GATEWAY_API_KEY="your-ai-gateway-api-key"
```

After adding this, reload your shell configuration:

```bash
source ~/.zshrc # or: source ~/.bashrc
```

Open `~/.codex/config.toml` and add the following:

```toml
# ~/.codex/config.toml
[model_providers.vercel]
name = "Vercel AI Gateway"
base_url = "https://ai-gateway.vercel.sh/v1"
env_key = "AI_GATEWAY_API_KEY"
wire_api = "responses"

[profiles.vercel]
model_provider = "vercel"
model = "openai/gpt-5.2-codex"
```

The configuration above:
- Sets up a model provider named `vercel` that points to the AI Gateway
- References your `AI_GATEWAY_API_KEY` environment variable
- Creates a `vercel` profile that uses the Vercel provider
- Specifies `openai/gpt-5.2-codex` as the default model
- Uses `wire_api = "responses"` for the OpenAI Responses API format
Start Codex with the `vercel` profile:

```bash
codex --profile vercel
```

Vercel AI Gateway now routes your requests. To confirm, check your AI Gateway Overview in the Vercel dashboard.
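Because the gateway speaks the OpenAI-compatible wire format, you can also check your key and connectivity outside Codex with a raw request. This sketch assumes the gateway exposes the standard `/v1/models` listing endpoint, and it skips the call when no key is set:

```shell
# Sketch: probe the gateway directly (assumes an OpenAI-compatible
# /v1/models endpoint; skips the network call if no key is exported).
GATEWAY_URL="https://ai-gateway.vercel.sh/v1"
if [ -n "${AI_GATEWAY_API_KEY:-}" ]; then
  status=$(curl -s -o /dev/null -w '%{http_code}' "$GATEWAY_URL/models" \
    -H "Authorization: Bearer $AI_GATEWAY_API_KEY")
  echo "gateway /models returned HTTP $status"
else
  echo "AI_GATEWAY_API_KEY not set; skipping gateway check"
fi
```

An HTTP 200 means your key is valid and the gateway is reachable; a 401 usually points at a missing or mistyped key.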
To use a different model, update the `model` field in your config:

```toml
# ~/.codex/config.toml
[profiles.vercel]
model_provider = "vercel"
model = "anthropic/claude-sonnet-4.5"
# Or try other models:
# model = "google/gemini-3-flash"
# model = "openai/o3"
```

When using non-OpenAI models through the gateway, you may see warnings about model metadata not being found. These warnings are safe to ignore, since the gateway handles model routing.
To switch between models quickly, define one profile per model. Add each profile to your config file:
```toml
# ~/.codex/config.toml
[model_providers.vercel]
name = "Vercel AI Gateway"
base_url = "https://ai-gateway.vercel.sh/v1"
env_key = "AI_GATEWAY_API_KEY"
wire_api = "responses"

[profiles.vercel]
model_provider = "vercel"
model = "openai/gpt-5.2-codex"

[profiles.fast]
model_provider = "vercel"
model = "openai/gpt-4o-mini"

[profiles.reasoning]
model_provider = "vercel"
model = "openai/o3"

[profiles.claude]
model_provider = "vercel"
model = "anthropic/claude-sonnet-4.5"
```

Switch between profiles using the `--profile` flag:

```bash
codex --profile vercel
codex --profile claude
```
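If you switch profiles often, ordinary shell aliases in `~/.zshrc` or `~/.bashrc` save some typing. The alias names below are just examples; they assume the profile names defined in the config above:

```shell
# Optional convenience: one alias per frequently used profile.
alias codex-fast='codex --profile fast'
alias codex-claude='codex --profile claude'
alias codex-o3='codex --profile reasoning'
```

After reloading your shell, `codex-claude` launches Codex with the `claude` profile.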