Configuring LLM Providers#
Lumen AI can use a wide range of LLM providers and models. Discover how to configure your favorite provider, whether running a model in the cloud or locally.
Use OpenAI
Learn how to configure OpenAI as the LLM provider used by Lumen AI.
Use Mistral
Learn how to configure Mistral as the LLM provider used by Lumen AI.
Use Anthropic
Learn how to configure Anthropic as the LLM provider used by Lumen AI.
Use Llama.cpp
Learn how to configure Llama.cpp as the LLM provider used by Lumen AI.
Use OpenAI compatible endpoints
Learn how to configure OpenAI-compatible endpoints as the LLM provider used by Lumen AI.
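Before diving into a provider-specific guide, note the common pattern: hosted providers are typically configured by exposing an API key through an environment variable that the provider's SDK reads automatically. A minimal sketch, assuming Lumen AI picks up the conventional environment variables for each SDK (the exact variables Lumen AI honors are an assumption here; see the individual guides for the authoritative names):

```python
import os

# Conventional API-key environment variables for common provider SDKs.
# Whether Lumen AI reads these directly is an assumption; consult the
# provider-specific guide for the exact configuration it expects.
os.environ["OPENAI_API_KEY"] = "sk-..."      # OpenAI (placeholder key)
# os.environ["MISTRAL_API_KEY"] = "..."      # Mistral
# os.environ["ANTHROPIC_API_KEY"] = "..."    # Anthropic
# Local providers such as Llama.cpp need no API key; they are instead
# pointed at a model file or a locally served endpoint.
```

Setting the variable in your shell before launching Lumen AI (e.g. `export OPENAI_API_KEY=...`) achieves the same effect without hard-coding the key in Python.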