Use Azure AI with Lumen AI#
Lumen AI ships with the lumen.ai.llm.AzureOpenAI and lumen.ai.llm.AzureMistralAI wrappers for Azure-hosted LLMs.
Prerequisites#
Lumen AI installed in your Python environment.
An Azure OpenAI Service resource. You can create one through the Azure Portal.
An Azure Inference API key and endpoint URL. You can obtain these from the Azure Portal under your Azure OpenAI resource.
Using Environment Variables#
Set the Environment Variables
Set the AZUREAI_ENDPOINT_KEY and AZUREAI_ENDPOINT_URL environment variables in your system. This allows Lumen AI to automatically detect and use your Azure credentials.
On Linux or macOS:
export AZUREAI_ENDPOINT_KEY='your-azure-api-key'
export AZUREAI_ENDPOINT_URL='your-azure-endpoint'
On Windows:
set AZUREAI_ENDPOINT_KEY=your-azure-api-key
set AZUREAI_ENDPOINT_URL=your-azure-endpoint
Now run lumen-ai serve and select whether you want to use OpenAI- or Mistral-based models:
lumen-ai serve <your-data-file-or-url> --provider <azure-openai | azure-mistral>
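If you start Lumen AI from a Python script or notebook rather than a shell, the same variables can be set programmatically before the UI is constructed. A minimal sketch, setting only the two variables named above:

import os

# Set the Azure credentials so Lumen AI can pick them up automatically
os.environ["AZUREAI_ENDPOINT_KEY"] = "your-azure-api-key"
os.environ["AZUREAI_ENDPOINT_URL"] = "your-azure-endpoint"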
Using CLI Arguments#
Alternatively, you can provide the API key and endpoint as CLI arguments:
lumen-ai serve <your-data-file-or-url> --provider <azure-openai | azure-mistral> --api-key <your-azure-api-key> --provider-endpoint <your-azure-endpoint>
Using Python#
In Python, import the LLM wrapper lumen.ai.llm.AzureOpenAI or lumen.ai.llm.AzureMistralAI and pass it to the lumen.ai.ui.ExplorerUI:
import lumen.ai as lmai
azure_llm = lmai.llm.AzureOpenAI(api_key='your-azure-api-key', endpoint='your-azure-endpoint')
ui = lmai.ui.ExplorerUI('<your-data-file-or-url>', llm=azure_llm)
ui.servable()
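If you prefer Azure-hosted Mistral models instead, the companion wrapper can be configured the same way. A minimal sketch, assuming AzureMistralAI accepts the same api_key and endpoint parameters as AzureOpenAI:

import lumen.ai as lmai

# Assumed to mirror the AzureOpenAI constructor arguments
azure_llm = lmai.llm.AzureMistralAI(api_key='your-azure-api-key', endpoint='your-azure-endpoint')

ui = lmai.ui.ExplorerUI('<your-data-file-or-url>', llm=azure_llm)
ui.servable()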
Managed Identity#
When working with Azure in a corporate setting, you may have to use a managed identity provider. To enable this use case, configure a token_provider:
import lumen.ai as lmai
from azure.identity import DefaultAzureCredential, get_bearer_token_provider

# Obtain an Azure AD bearer token provider from your managed identity
token_provider = get_bearer_token_provider(
    DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default"
)

# Pass the token provider to each model configuration instead of an API key
llm = lmai.llm.AzureOpenAI(
    api_version=...,
    endpoint=...,
    model_kwargs={
        "default": {"model": "gpt4o-mini", "azure_ad_token_provider": token_provider},
        "reasoning": {"model": "gpt4o", "azure_ad_token_provider": token_provider},
    },
)
ui = lmai.ui.ExplorerUI('<your-data-file-or-url>', llm=llm)
ui.servable()
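Here the "default" and "reasoning" entries in model_kwargs map Lumen AI's model roles to your Azure deployments, with each entry receiving the token provider in place of an API key. Since ui.servable() marks the app for Panel's server, you can launch either script with panel serve <your-script>.py and open the explorer in your browser.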