LibreChat

Last updated January 23, 2026

LibreChat is an open-source AI chat platform that you can self-host. You can configure it to use AI Gateway for unified model access and spend monitoring.

  1. Go to the AI Gateway tab of the Vercel dashboard and click API keys to create a new API key.
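    Optionally, confirm the key works before wiring it into LibreChat. This sketch assumes AI Gateway exposes an OpenAI-compatible models endpoint at the base URL configured later in this guide; your-ai-gateway-api-key is a placeholder:

    Terminal
    curl https://ai-gateway.vercel.sh/v1/models \
      -H "Authorization: Bearer your-ai-gateway-api-key"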

  2. Clone the LibreChat repository and set up the environment:

    Terminal
    git clone https://github.com/danny-avila/LibreChat.git
    cd LibreChat
    cp .env.example .env

    Windows users: Replace cp with copy if needed. Docker Desktop is required for this setup.

  3. Create a docker-compose.override.yml file in your LibreChat root directory to mount the configuration:

    docker-compose.override.yml
    services:
      api:
        volumes:
          - type: bind
            source: ./librechat.yaml
            target: /app/librechat.yaml

    This allows LibreChat to read your custom endpoint configuration.

  4. Add your AI Gateway API key to your .env file in the LibreChat root directory:

    .env
    AI_GATEWAY_API_KEY=your-ai-gateway-api-key

    Use the ${VARIABLE_NAME} pattern to reference environment variables. Do not include raw API keys in the YAML file.

  5. Create a librechat.yaml file in your LibreChat root directory:

    librechat.yaml
    version: 1.2.8
    cache: true
     
    endpoints:
      custom:
        - name: "Vercel"
          apiKey: "${AI_GATEWAY_API_KEY}"
          baseURL: "https://ai-gateway.vercel.sh/v1"
          titleConvo: true
          models:
            default:
              - "openai/gpt-5.2"
              - "anthropic/claude-sonnet-4.5"
              - "google/gemini-3-flash"
            fetch: true
          titleModel: "openai/gpt-5.2"

    Setting fetch: true automatically fetches all available models from AI Gateway. Browse the full catalog on the models page.

  6. Start or restart your LibreChat instance to apply the configuration:

    Terminal
    docker compose up -d

    If LibreChat is already running, restart it:

    Terminal
    docker compose restart

    Once started, navigate to http://localhost:3080/ to access LibreChat.

  7. In the LibreChat interface:

    1. Click the endpoint dropdown at the top
    2. Select Vercel
    3. Choose a model from the available options

    Your requests will now be routed through AI Gateway. You can verify this by checking your AI Gateway Overview in the Vercel dashboard.
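    You can also confirm routing from the command line, independent of the UI. This is a sketch assuming the gateway's OpenAI-compatible chat completions endpoint; the model name comes from the configuration above and your-ai-gateway-api-key is a placeholder:

    Terminal
    curl https://ai-gateway.vercel.sh/v1/chat/completions \
      -H "Authorization: Bearer your-ai-gateway-api-key" \
      -H "Content-Type: application/json" \
      -d '{"model": "openai/gpt-5.2", "messages": [{"role": "user", "content": "Hello"}]}'

    The request should then appear in the AI Gateway Overview alongside traffic from LibreChat.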

  8. View your usage, spend, and request activity in the AI Gateway tab of the Vercel dashboard. See the observability documentation for more details.

You can customize the LibreChat endpoint configuration:

  • titleConvo: Set to true to enable automatic conversation titles
  • titleModel: Specify which model to use for generating conversation titles
  • modelDisplayLabel: Customize the label shown in the interface (optional)
  • dropParams: Remove default parameters that some providers don't support

See the LibreChat custom endpoints documentation for all available options.
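The options above can be combined in a single endpoint definition. This is a sketch; the dropParams values shown are hypothetical examples, and which parameters you need to drop depends on the models you use:

librechat.yaml
endpoints:
  custom:
    - name: "Vercel"
      apiKey: "${AI_GATEWAY_API_KEY}"
      baseURL: "https://ai-gateway.vercel.sh/v1"
      modelDisplayLabel: "AI Gateway"
      titleConvo: true
      titleModel: "openai/gpt-5.2"
      dropParams: ["stop", "frequency_penalty"]
      models:
        fetch: true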
