Configuring LLM Providers

Lumen AI can use a wide range of LLM providers and models. Discover how to configure your favorite provider, whether you run a model in the cloud or locally.
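
The general pattern is the same for every provider: make the provider's credentials available and, if you want explicit control, pass an LLM object when launching the explorer. The Python sketch below illustrates that pattern; the lumen.ai.llm.OpenAI class, the llm parameter and the data argument of ExplorerUI are assumptions about the current Lumen AI API, so consult the provider guides below for the exact names.

    # Minimal sketch of selecting a provider explicitly. The class, parameter
    # and file names here are assumptions/placeholders; see the guides below.
    import os
    import lumen.ai as lmai

    os.environ["OPENAI_API_KEY"] = "sk-..."      # placeholder key; or export it in your shell

    llm = lmai.llm.OpenAI()                      # swap in another provider class here
    lmai.ExplorerUI(data="penguins.csv", llm=llm).servable()   # example dataset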

Use OpenAI

Learn how to configure OpenAI as the LLM provider used by Lumen AI.

Use OpenAI with Lumen AI
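
For hosted providers the API key is often all you need: Lumen AI is assumed to pick up the standard OPENAI_API_KEY variable and default to OpenAI when no explicit llm object is passed. Mistral and Anthropic (below) follow the same pattern with MISTRAL_API_KEY and ANTHROPIC_API_KEY.

    # Sketch: export the key before launching Lumen AI. Automatic detection of
    # the key and the default provider choice are assumptions; see the guide above.
    import os
    os.environ["OPENAI_API_KEY"] = "sk-..."   # or: export OPENAI_API_KEY=... in your shell
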
Use Mistral

Learn how to configure Mistral as the LLM provider used by Lumen AI.

Use Mistral with Lumen AI
Use Anthropic

Learn how to configure Anthropic as the LLM provider used by Lumen AI.

Use Anthropic with Lumen AI
Use Azure AI

Learn how to configure Azure AI as the LLM provider used by Lumen AI.

Use Azure AI with Lumen AI
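
Azure AI needs a resource endpoint in addition to a key. A minimal sketch, assuming Lumen AI exposes an AzureOpenAI wrapper and honors the environment variables read by the Azure OpenAI client:

    # Sketch: Azure requires both a key and the resource endpoint. The variable
    # names match the Azure OpenAI client; the lmai.llm.AzureOpenAI class name
    # and the automatic pickup of these variables are assumptions.
    import os
    import lumen.ai as lmai

    os.environ["AZURE_OPENAI_API_KEY"] = "..."                                     # placeholder
    os.environ["AZURE_OPENAI_ENDPOINT"] = "https://my-resource.openai.azure.com"   # placeholder

    llm = lmai.llm.AzureOpenAI()
    lmai.ExplorerUI(data="penguins.csv", llm=llm).servable()
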
Use Llama.cpp

Learn how to configure Llama.cpp as the LLM provider used by Lumen AI.

Use Llama.cpp with Lumen AI
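
Llama.cpp runs the model locally, so no API key is involved; you instead tell the wrapper which model to load. A minimal sketch, assuming a lumen.ai.llm.LlamaCpp wrapper exists:

    # Sketch: run a local model via llama.cpp instead of a cloud API. The
    # LlamaCpp class name and its defaults are assumptions; configure the model
    # it loads as described in the guide above.
    import lumen.ai as lmai

    llm = lmai.llm.LlamaCpp()
    lmai.ExplorerUI(data="penguins.csv", llm=llm).servable()
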
Use OpenAI-compatible endpoints

Learn how to configure any OpenAI-compatible endpoint, such as a locally hosted server, as the LLM provider used by Lumen AI.

Use OpenAI-compatible endpoints
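
Many self-hosted servers (for example vLLM, Ollama or llama.cpp in server mode) expose an OpenAI-compatible API, so you can reuse the OpenAI wrapper and only change the server address. A sketch under the assumption that the wrapper accepts an endpoint override; the parameter names, the Ollama-style URL and the dummy key are placeholders:

    # Sketch: point the OpenAI wrapper at a local OpenAI-compatible server.
    # The endpoint/api_key parameter names are assumptions; the URL is the
    # default Ollama address and the key is a dummy value many servers ignore.
    import lumen.ai as lmai

    llm = lmai.llm.OpenAI(endpoint="http://localhost:11434/v1", api_key="unused")
    lmai.ExplorerUI(data="penguins.csv", llm=llm).servable()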