Custom Endpoints

Add custom AI providers like OpenRouter and Ollama to LibreChat using librechat.yaml

LibreChat supports any OpenAI API-compatible service as a custom endpoint. You configure endpoints in librechat.yaml, store API keys in .env, and mount the config via docker-compose.override.yml for Docker deployments.

Which File Does What?

Custom endpoint setup involves three files, each with a specific role:

  1. librechat.yaml -- Defines your custom endpoints (name, API URL, models, display settings)
  2. .env -- Stores sensitive values like API keys (referenced from librechat.yaml using ${VAR_NAME} syntax)
  3. docker-compose.override.yml -- Mounts librechat.yaml into the Docker container (Docker users only)

For a full overview of how these files work together, see the Configuration Overview.
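As a minimal illustration of how the three files reference each other (the provider name and key below are placeholders, not real values):

```yaml
# --- librechat.yaml ---
endpoints:
  custom:
    - name: "MyProvider"            # hypothetical provider
      apiKey: "${MY_PROVIDER_KEY}"  # resolved from .env at startup

# --- .env ---
# MY_PROVIDER_KEY=your_api_key_here

# --- docker-compose.override.yml (Docker only) ---
# services:
#   api:
#     volumes:
#       - ./librechat.yaml:/app/librechat.yaml
```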

Before You Start

This guide assumes you have LibreChat installed and running. If not, complete the Docker setup first.

Step 1. Mount librechat.yaml (Docker Only)

Docker users need to mount librechat.yaml as a volume so the container can read it. Skip this step if you are running LibreChat locally without Docker.

cp docker-compose.override.yml.example docker-compose.override.yml

Edit docker-compose.override.yml and ensure the volume mount is uncommented:

services:
  api:
    volumes:
      - type: bind
        source: ./librechat.yaml
        target: /app/librechat.yaml

Learn more: Docker Override Guide

Step 2. Configure librechat.yaml

Create a librechat.yaml file in the project root (if it does not exist) and add your endpoint configuration. See the librechat.yaml guide for detailed setup instructions.

Here is an example with OpenRouter and Ollama:

version: 1.3.5
cache: true
endpoints:
  custom:
    - name: "OpenRouter"
      apiKey: "${OPENROUTER_KEY}"
      baseURL: "https://openrouter.ai/api/v1"
      models:
        default: ["meta-llama/llama-3-70b-instruct"]
        fetch: true
      titleConvo: true
      titleModel: "meta-llama/llama-3-70b-instruct"
      dropParams: ["stop"]
      modelDisplayLabel: "OpenRouter"
    - name: "Ollama"
      apiKey: "ollama"
      baseURL: "http://host.docker.internal:11434/v1/"
      models:
        default: ["llama3:latest", "command-r", "mixtral", "phi3"]
        fetch: true
      titleConvo: true
      titleModel: "current_model"

Browse all compatible providers in the AI Endpoints section. For the full field reference, see Custom Endpoint Object Structure.

API Key Configuration

When configuring API keys in custom endpoints, you have three options:

  1. Environment variable (recommended): apiKey: "${OPENROUTER_KEY}" -- reads from .env
  2. User provided: apiKey: "user_provided" -- users enter their own key in the UI
  3. Direct value (not recommended): apiKey: "sk-your-actual-key" -- stored in plain text
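In librechat.yaml, the three options look like this (the keys shown are placeholders):

```yaml
endpoints:
  custom:
    - name: "OpenRouter"
      apiKey: "${OPENROUTER_KEY}"   # option 1: resolved from .env
    # apiKey: "user_provided"       # option 2: each user enters a key in the UI
    # apiKey: "sk-or-v1-..."        # option 3: hard-coded (avoid; plain text)
```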

Step 3. Set Environment Variables

Add the API keys referenced in your librechat.yaml to the .env file:

OPENROUTER_KEY=your_openrouter_api_key

Each ${VARIABLE_NAME} in librechat.yaml must have a matching entry in .env.
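This matching requirement is easy to check with a short script. The following is a quick sanity check, not part of LibreChat (`missing_env_vars` is a hypothetical helper): it lists every `${VAR}` referenced in librechat.yaml that has no matching entry in .env.

```python
import re

def missing_env_vars(yaml_text: str, env_text: str) -> set[str]:
    """Return ${VAR} names referenced in librechat.yaml but absent from .env."""
    referenced = set(re.findall(r"\$\{(\w+)\}", yaml_text))
    defined = {
        line.split("=", 1)[0].strip()
        for line in env_text.splitlines()
        if "=" in line and not line.lstrip().startswith("#")
    }
    return referenced - defined
```

An empty result means every referenced variable is defined.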

Step 4. Restart and Verify

After editing configuration files, you must restart LibreChat for changes to take effect.

docker compose down && docker compose up -d

Open LibreChat in your browser. Your custom endpoints should appear in the endpoint selector dropdown.

Not Seeing Your Endpoint?

Check the server logs for configuration errors:

docker compose logs api

Common issues include YAML syntax errors, missing environment variables, and a librechat.yaml that is not actually mounted into the container (Docker). Validate your YAML with the YAML Validator.
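One frequent YAML pitfall is a tab character in indentation, which the YAML spec forbids. A small local check (a sketch; `find_tab_indented_lines` is not a LibreChat utility) can flag offending lines before you restart:

```python
def find_tab_indented_lines(yaml_text: str) -> list[int]:
    """Return 1-based line numbers whose indentation contains a tab (invalid YAML)."""
    bad = []
    for lineno, line in enumerate(yaml_text.splitlines(), start=1):
        indent = line[: len(line) - len(line.lstrip())]  # leading whitespace only
        if "\t" in indent:
            bad.append(lineno)
    return bad
```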
