OpenRouter

Complete setup guide for using OpenRouter as a custom endpoint in LibreChat

OpenRouter provides access to hundreds of AI models through a single API. This guide walks you through setting up OpenRouter as a custom endpoint in LibreChat from scratch.

Prerequisites

Before starting, make sure you have:

- A running LibreChat installation (this guide assumes the Docker Compose setup)
- Access to the .env and librechat.yaml files in your LibreChat project root

Setup

Get an API Key

Create an account at openrouter.ai and generate an API key from the Keys page.

Copy the key -- it starts with sk-or-v1-.

Add the Key to Your .env File

Open your .env file in the project root and add your OpenRouter API key:

OPENROUTER_KEY=sk-or-v1-your-key-here

Use OPENROUTER_KEY, Not OPENROUTER_API_KEY

You must use OPENROUTER_KEY as the variable name. Using OPENROUTER_API_KEY will override the built-in OpenAI endpoint to use OpenRouter as well, which is almost certainly not what you want.
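As a quick sanity check, you can grep your .env for the variable name before starting the server. This sketch uses a throwaway sample file at /tmp/env-example as a stand-in; point the path at your real .env:

```shell
# Illustration with a sample file; replace /tmp/env-example with your .env.
printf 'OPENROUTER_KEY=sk-or-v1-example\n' > /tmp/env-example

if grep -q '^OPENROUTER_API_KEY=' /tmp/env-example; then
  echo "Found OPENROUTER_API_KEY; rename it to OPENROUTER_KEY"
else
  echo "Variable name looks correct"
fi
```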

Add the Endpoint to librechat.yaml

Add the following to your librechat.yaml file. If the file already has content, merge the endpoints section with your existing configuration:

version: 1.3.5
cache: true
endpoints:
  custom:
    - name: "OpenRouter"
      apiKey: "${OPENROUTER_KEY}"
      baseURL: "https://openrouter.ai/api/v1"
      models:
        default: ["meta-llama/llama-3-70b-instruct"]
        fetch: true
      titleConvo: true
      titleModel: "meta-llama/llama-3-70b-instruct"
      dropParams: ["stop"]
      modelDisplayLabel: "OpenRouter"

Key fields explained:

| Field | Purpose |
| --- | --- |
| apiKey: "${OPENROUTER_KEY}" | References the env var from Step 2. The ${} syntax tells LibreChat to read the value from .env. |
| models.fetch: true | Fetches the full model list from OpenRouter's API, so new models appear automatically. |
| dropParams: ["stop"] | Removes the stop parameter from requests. OpenRouter models use varied stop tokens, so dropping this avoids compatibility issues. |
| modelDisplayLabel: "OpenRouter" | The name shown in LibreChat's endpoint selector. |
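The ${} substitution can be pictured with a rough sketch. This is illustration only, not LibreChat's actual config parser; the value here is a made-up example key:

```shell
# Illustration only: LibreChat reads OPENROUTER_KEY from .env and
# substitutes it wherever librechat.yaml contains "${OPENROUTER_KEY}".
export OPENROUTER_KEY="sk-or-v1-example"
line='apiKey: "${OPENROUTER_KEY}"'
echo "$line" | sed "s/\${OPENROUTER_KEY}/$OPENROUTER_KEY/"
# prints: apiKey: "sk-or-v1-example"
```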

Restart LibreChat

docker compose down && docker compose up -d

Verify It Works

Open LibreChat in your browser. You should see OpenRouter in the endpoint selector dropdown. Select it to see the available models.

If OpenRouter does not appear, check the server logs for configuration errors:

docker compose logs api | grep -i "error\|openrouter"

Customization

Using user_provided API Key

Instead of storing the key in .env, you can let each user provide their own key through the LibreChat UI:

apiKey: "user_provided"

Users will see a key input field when selecting the OpenRouter endpoint.
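Put together with the configuration from earlier, the endpoint entry would look like this (the same fields as above, with only apiKey changed):

```yaml
endpoints:
  custom:
    - name: "OpenRouter"
      apiKey: "user_provided"
      baseURL: "https://openrouter.ai/api/v1"
      models:
        default: ["meta-llama/llama-3-70b-instruct"]
        fetch: true
      titleConvo: true
      titleModel: "meta-llama/llama-3-70b-instruct"
      dropParams: ["stop"]
      modelDisplayLabel: "OpenRouter"
```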

Limiting Available Models

Instead of fetching all models, you can specify a fixed list:

models:
  default: ["anthropic/claude-3.5-sonnet", "openai/gpt-4o", "meta-llama/llama-3-70b-instruct"]
  fetch: false
