LibreChat
LibreChat is an open-source AI chat platform that you can self-host. You can configure it to use AI Gateway for unified model access and spend monitoring.
Go to the AI Gateway tab of the Vercel dashboard and click API keys to create a new API key.
Clone the LibreChat repository and set up the environment:
```bash
git clone https://github.com/danny-avila/LibreChat.git
cd LibreChat
cp .env.example .env
```

Windows users: replace `cp` with `copy` if needed. Docker Desktop is required for this setup.

Create a `docker-compose.override.yml` file in your LibreChat root directory to mount the configuration:

```yaml
# docker-compose.override.yml
services:
  api:
    volumes:
      - type: bind
        source: ./librechat.yaml
        target: /app/librechat.yaml
```

This allows LibreChat to read your custom endpoint configuration.
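To confirm that Docker Compose picks up the override before you start the stack, you can print the merged configuration; the `grep` filter below is just one way to spot the bind mount.

```bash
# Show the merged Compose configuration and check that the
# librechat.yaml bind mount from the override file is present.
docker compose config | grep -A 3 "librechat.yaml"
```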
Add your AI Gateway API key to your `.env` file in the LibreChat root directory:

```bash
# .env
AI_GATEWAY_API_KEY=your-ai-gateway-api-key
```

Use the `${VARIABLE_NAME}` pattern to reference environment variables. Do not include raw API keys in the YAML file.

Create a `librechat.yaml` file in your LibreChat root directory:

```yaml
# librechat.yaml
version: 1.2.8
cache: true
endpoints:
  custom:
    - name: "Vercel"
      apiKey: "${AI_GATEWAY_API_KEY}"
      baseURL: "https://ai-gateway.vercel.sh/v1"
      titleConvo: true
      models:
        default:
          - "openai/gpt-5.2"
          - "anthropic/claude-sonnet-4.5"
          - "google/gemini-3-flash"
        fetch: true
      titleModel: "openai/gpt-5.2"
```

Setting `fetch: true` automatically fetches all available models from AI Gateway. Browse the full catalog on the models page.

Start or restart your LibreChat instance to apply the configuration:
```bash
docker compose up -d
```

If LibreChat is already running, restart it:

```bash
docker compose restart
```

Once started, navigate to http://localhost:3080/ to access LibreChat.
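If the custom endpoint does not appear, the container logs are the quickest place to look for YAML parsing or mount errors; `api` matches the service name used in the override file above.

```bash
# Follow the LibreChat API container logs to check that
# librechat.yaml was loaded without configuration errors.
docker compose logs -f api
```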
In the LibreChat interface:
- Click the endpoint dropdown at the top
- Select Vercel
- Choose a model from the available options
Your requests will now be routed through AI Gateway. You can verify this by checking your AI Gateway Overview in the Vercel dashboard.
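You can also exercise the gateway directly, independent of LibreChat, and watch the request appear in the Overview. The sketch below assumes the baseURL above exposes a standard OpenAI-compatible `/chat/completions` route and that the key is exported in your shell.

```bash
# Send a minimal chat completion request through AI Gateway.
# Assumes AI_GATEWAY_API_KEY is exported in your shell environment.
curl https://ai-gateway.vercel.sh/v1/chat/completions \
  -H "Authorization: Bearer $AI_GATEWAY_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "openai/gpt-5.2",
    "messages": [{ "role": "user", "content": "Hello from my LibreChat setup" }]
  }'
```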
View your usage, spend, and request activity in the AI Gateway tab of the Vercel dashboard. See the observability documentation for more details.
You can customize the LibreChat endpoint configuration (a combined example follows the list):
- titleConvo: Set to true to enable automatic conversation titles
- titleModel: Specify which model to use for generating conversation titles
- modelDisplayLabel: Customize the label shown in the interface (optional)
- dropParams: Remove default parameters that some providers don't support
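As an illustration only, a custom endpoint entry combining these options might look like the sketch below; the modelDisplayLabel value and the dropParams entries are placeholder assumptions, so adjust them to what your providers actually require.

```yaml
endpoints:
  custom:
    - name: "Vercel"
      apiKey: "${AI_GATEWAY_API_KEY}"
      baseURL: "https://ai-gateway.vercel.sh/v1"
      titleConvo: true
      titleModel: "openai/gpt-5.2"
      # Placeholder label shown in the LibreChat interface
      modelDisplayLabel: "AI Gateway"
      # Placeholder list of parameters to strip from requests
      dropParams: ["stop", "user"]
      models:
        fetch: true
```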
See the LibreChat custom endpoints documentation for all available options.