Custom Endpoints
LibreChat supports OpenAI API-compatible services through the `librechat.yaml` configuration file.
This guide assumes you have already set up LibreChat using Docker, as shown in the Local Setup Guide.
Step 1. Create or Edit a Docker Override File
- Create a file named `docker-compose.override.yml` at the project root (if it doesn't already exist).
- Add the following content to the file:
```yaml
services:
  api:
    volumes:
      - type: bind
        source: ./librechat.yaml
        target: /app/librechat.yaml
```
Learn more about the Docker Compose Override File here.
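To confirm Compose picks up the override, you can print the merged configuration; a quick check, assuming you run it from the project root:

```bash
# docker-compose.override.yml is merged automatically when it sits
# next to docker-compose.yml; this prints the combined result.
docker compose config
```

The `api` service in the output should show the bind mount for `librechat.yaml`.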
Step 2. Configure librechat.yaml
- Create a file named `librechat.yaml` at the project root (if it doesn't already exist).
- Add your custom endpoints: you can view compatible endpoints in the AI Endpoints section.
  - The list is not exhaustive, and generally every OpenAI API-compatible service should work.
  - There are many options for Custom Endpoints. View them all here: Custom Endpoint Object Structure.
- As an example, here is a configuration for both OpenRouter and Ollama:
```yaml
version: 1.1.4
cache: true
endpoints:
  custom:
    - name: "OpenRouter"
      apiKey: "${OPENROUTER_KEY}"
      baseURL: "https://openrouter.ai/api/v1"
      models:
        default: ["gpt-3.5-turbo"]
        fetch: true
      titleConvo: true
      titleModel: "current_model"
      summarize: false
      summaryModel: "current_model"
      forcePrompt: false
      modelDisplayLabel: "OpenRouter"
    - name: "Ollama"
      apiKey: "ollama"
      baseURL: "http://host.docker.internal:11434/v1/"
      models:
        default:
          [
            "llama3:latest",
            "command-r",
            "mixtral",
            "phi3"
          ]
        fetch: true # fetching list of models is not supported
      titleConvo: true
      titleModel: "current_model"
```
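Before starting LibreChat, you can sanity-check that Ollama's OpenAI-compatible API is reachable; a minimal probe, assuming Ollama listens on port 11434 of your host and that your Ollama version exposes the `/v1/models` route (from the host itself, use `localhost` rather than `host.docker.internal`):

```bash
# List the models Ollama serves through its OpenAI-compatible API.
curl http://localhost:11434/v1/models
```

If this returns a JSON list of models, the `baseURL` above should work from inside the container.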
Step 3. Configure .env File
- Edit your existing `.env` file at the project root.
  - Copy `.env.example` and rename it to `.env` if it doesn't already exist.
- According to the config above, the environment variable `OPENROUTER_KEY` is expected and should be set:

```bash
OPENROUTER_KEY=your_openrouter_api_key
```
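Once the containers are running, you can verify the variable actually reaches the `api` service; a quick check, assuming the service name from the override file above:

```bash
# Print the key as seen inside the running api container.
docker compose exec api printenv OPENROUTER_KEY
```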
Notes:
- By way of example, this guide assumes you have set up Ollama independently and that it is accessible at `http://host.docker.internal:11434`.
- `host.docker.internal` is a special DNS name that resolves to the internal IP address used by the host.
- You may need to change this to the actual IP address of your Ollama instance.
- In a future guide, we will go into setting up Ollama along with LibreChat.
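Note that on Linux, `host.docker.internal` is not defined by default. One common workaround (a sketch, only needed outside Docker Desktop) is to map it to the host gateway in your `docker-compose.override.yml`:

```yaml
services:
  api:
    extra_hosts:
      # Resolves host.docker.internal to the host's gateway IP on Linux.
      - "host.docker.internal:host-gateway"
```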
Step 4. Run the App
- Now that your files are configured, you can run the app:

```bash
docker compose up
```
Or, if the app is already running, you can restart it with:

```bash
docker compose restart
```
Note: Make sure your Docker Desktop or Docker Engine is running before executing the command.
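If you prefer to keep the containers in the background, you can start detached and watch the logs to confirm your custom endpoints load without errors:

```bash
# Start in the background, then follow the api service logs.
docker compose up -d
docker compose logs -f api
```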
Conclusion
That’s it! You have now configured Custom Endpoints for your LibreChat instance.
Additional Links
Explore more about LibreChat and how to configure it to your needs.
- Updating LibreChat
  - Instructions on how to update this setup with the latest changes to LibreChat.
- Configuring AI Providers
  - Configure OpenAI, Google, Anthropic, and OpenAI Assistants.
- Configuring a Custom Endpoint
  - Configure services such as DeepSeek, OpenRouter, Ollama, Mistral AI, Databricks, Groq, and others.
  - Click here for a list of known, compatible services.
- Environment Configuration
  - Read for a comprehensive look at the `.env` file.
- librechat.yaml File Configuration
  - Configure custom rate limiters, file outputs, and much more with the `librechat.yaml` file.
- Ubuntu Docker Deployment Guide
  - Read for advanced Docker setup on a remote/headless server.
- Setup the Azure OpenAI endpoint
  - Configure multiple Azure regions and deployments for seamless use with LibreChat.