Apple MLX

MLX OpenAI Compatibility: this endpoint connects to MLX's OpenAI-compatible server.

Notes:

  • Known: icon provided.

  • The API is mostly strict about unrecognized parameters.

  • Only one model is served at a time; to use another model, run a separate MLX server instance and add another endpoint with a different baseURL (see the example after the configuration below).

librechat.yaml
    - name: "MLX"
      apiKey: "mlx"
      baseURL: "http://localhost:8080/v1/" 
      models:
        default: [
          "Meta-Llama-3-8B-Instruct-4bit"
          ]
        fetch: false # fetching list of models is not supported
      titleConvo: true
      titleModel: "current_model"
      summarize: false
      summaryModel: "current_model"
      forcePrompt: false
      modelDisplayLabel: "Apple MLX"
      addParams:
        max_tokens: 2000
        stop: ["<|eot_id|>"]
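
To serve a second model, run another MLX server instance on its own port and add a separate endpoint entry pointing at it. The sketch below is illustrative only: the endpoint name "MLX-Mistral", port 8081, and the "Mistral-7B-Instruct-v0.2-4bit" model are placeholder values, not part of the configuration above.

    - name: "MLX-Mistral"
      apiKey: "mlx"
      baseURL: "http://localhost:8081/v1/" # hypothetical second server on its own port
      models:
        default: ["Mistral-7B-Instruct-v0.2-4bit"] # placeholder model name
        fetch: false # fetching list of models is not supported
      titleConvo: true
      titleModel: "current_model"
      modelDisplayLabel: "Apple MLX (Mistral)"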
