Polarity Command Center supports a wide range of LLMs, including locally run and hosted models. By default, Command Center is disabled. To enable and configure Command Center, navigate to “Server Configuration” → “AI Settings”. To enable Command Center you must provide your LLM connection settings and check the “Enable Polarity AI Features” box.

See below for detailed instructions on configuring popular LLM providers.
Google Gemini
LLM Provider
Select Gemini as the provider
LLM Model
Choose your Gemini Model from the dropdown list
API Key
Add your Gemini API Key
Save your settings and click on the “Test AI Configuration” button in the top right to ensure your settings work.
OpenAI Models
For models hosted by OpenAI, use the following configuration:
LLM Provider
Select OpenAI as the provider
LLM Model
Select your OpenAI LLM model from the dropdown.
API Key
Add your OpenAI API Key
Base URL
Open “Advanced Options” and in the Base URL option enter the following URL:
https://api.openai.com/v1/chat/completions
Save your settings and click on the “Test AI Configuration” button in the top right to ensure your settings work.
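Command Center makes these requests for you, but if you want to sanity-check your key and the endpoint independently, the sketch below (Python standard library only; the API key and model name are placeholders, not values from this document) builds the same kind of chat-completions request this Base URL expects:

```python
import json
import urllib.request

# Placeholders for illustration; substitute your real key and model.
API_KEY = "sk-..."
BASE_URL = "https://api.openai.com/v1/chat/completions"

def build_request(prompt: str, model: str = "gpt-4") -> urllib.request.Request:
    """Build a chat-completions POST request for the Base URL above."""
    payload = {"model": model, "messages": [{"role": "user", "content": prompt}]}
    return urllib.request.Request(
        BASE_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
        method="POST",
    )

req = build_request("Say hello")
print(req.full_url, req.get_method())
# To actually send it (requires a valid key and network access):
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```

A 200 response from this endpoint is a good sign your key and URL will also work in Command Center.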
Azure OpenAI Models
Command Center supports OpenAI models hosted in Azure.
LLM Provider
Select OpenAI as the provider
LLM Model
Select your OpenAI LLM model from the dropdown.
API Key
Add your Azure LLM API Key
Base URL
Open “Advanced Options” and in the Base URL option enter your Azure OpenAI URL including the deployment name and chat completions path. Deployment URLs have the following format:
https://{{tenantName}}-openai-instance.openai.azure.com/openai/deployments/{{deploymentName}}/chat/completions?api-version={{apiVersion}}
You will need to replace {{tenantName}}, {{deploymentName}}, and {{apiVersion}} with the appropriate values for your environment and model. As an example, a complete URL will look like this:
https://mycompany-openai-instance.openai.azure.com/openai/deployments/gpt-4-32k/chat/completions?api-version=2023-12-01-preview
Save your settings and click on the “Test AI Configuration” button in the top right to ensure your settings work.
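If you script your configuration, the template substitution above is easy to get wrong by hand. This small Python sketch fills in the three placeholders; the example values are the ones shown in this document, and you would substitute your own tenant, deployment, and API version:

```python
# Azure OpenAI deployment URL template from the documentation above.
AZURE_TEMPLATE = (
    "https://{tenant_name}-openai-instance.openai.azure.com"
    "/openai/deployments/{deployment_name}/chat/completions"
    "?api-version={api_version}"
)

def azure_base_url(tenant_name: str, deployment_name: str, api_version: str) -> str:
    """Fill in the three placeholders of the deployment URL template."""
    return AZURE_TEMPLATE.format(
        tenant_name=tenant_name,
        deployment_name=deployment_name,
        api_version=api_version,
    )

url = azure_base_url("mycompany", "gpt-4-32k", "2023-12-01-preview")
print(url)
# → https://mycompany-openai-instance.openai.azure.com/openai/deployments/gpt-4-32k/chat/completions?api-version=2023-12-01-preview
```

The printed URL matches the complete example above and is what you would paste into the Base URL option.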
Open WebUI
Command Center can connect to APIs hosted via Open WebUI (https://openwebui.com/). Configure Open WebUI to expose an OpenAI-compatible API and then set the following in Command Center:
LLM Provider
Select OpenAI as the provider
LLM Model
Select your OpenAI LLM model from the dropdown. Note that if your model is not listed, you can rename the model in Open WebUI to match one of the models in the dropdown list.
API Key
Add your Open WebUI API Key
Base URL
Open “Advanced Options” and in the Base URL option enter the following URL:
https://{{open-web-ui-fqdn}}/api/v1/chat/completions
Note that the Open WebUI URL differs slightly from the OpenAI URL in that its path begins with api.
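The extra path segment is easy to miss. This Python check (using a hypothetical FQDN, openwebui.example.com, for illustration) highlights that only the path prefix changes between the two URLs:

```python
from urllib.parse import urlparse

# Hypothetical Open WebUI host for illustration; use your own FQDN.
openai_url = "https://api.openai.com/v1/chat/completions"
open_webui_url = "https://openwebui.example.com/api/v1/chat/completions"

# Only the path differs: Open WebUI prefixes it with /api.
assert urlparse(openai_url).path == "/v1/chat/completions"
assert urlparse(open_webui_url).path == "/api/v1/chat/completions"
print(urlparse(open_webui_url).path)
```

If the test button below reports a 404, a missing /api prefix in the Base URL is a likely cause.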
Save your settings and click on the “Test AI Configuration” button in the top right to ensure your settings work.