This custom integration adds a conversation agent powered by Azure OpenAI to Home Assistant. It is based on the original OpenAI Conversation integration for Home Assistant.
This is equivalent to the built-in OpenAI Conversation integration, except that it uses the OpenAI models available through Azure; beyond that, the goal is to keep the differences to a minimum. You can use this conversation integration with assistants in Home Assistant to control your house. It has all the capabilities of the built-in OpenAI Conversation integration.
| Azure OpenAI Conversation Version | Home Assistant Version | Minimal API Version |
|---|---|---|
| 0.x.y | 2023.4.x | 2023-06-01-preview |
| 1.x.y | 2023.5+ | 2023-06-01-preview |
| 2.x.y | 2025.1 | 2023-12-01-preview |
| 3.1.y | 2025.6 | - no need to specify - |
| 4.0.y | 2025.8 - 2025.9 | - no need to specify - |
| 4.1.y | 2025.10+ | - no need to specify - |
| 4.2.y | 2025.12+ | - no need to specify - |
| 4.3.y | 2026.2.1+ | - no need to specify - |
- Deploy an Azure AI Foundry instance to a region supported by the Responses API. (If you already have a Foundry instance, you can skip this step.)
- To enable conversations, deploy a chat completion model (such as `gpt-4o-mini` or `gpt-4.1-mini`). If your model is not the default `gpt-4o-mini`, you'll need to configure it later in step 6.
- If you want to generate images using the `generate_image` service, also deploy the `dall-e-3` model.
- Download and install the integration from HACS: Azure OpenAI Conversation.
- Restart your Home Assistant instance.
- Click here or go to Settings → Devices & Services → Add Integration → Azure OpenAI Conversation.
- Enter your `API Key` and `API Base URL` (use the format `https://your-resource.services.ai.azure.com/`) and hit Submit.
- Configure your assistant to use the Azure OpenAI Conversation integration.
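As a quick sanity check on the `API Base URL` format, here is a small sketch that validates the expected URL shape and composes a chat-completions endpoint from it. The path layout, `api-version` value, and `DEPLOYMENT` name are assumptions based on the public Azure OpenAI REST documentation, not values taken from this integration's internals:

```python
# Sketch: validate the API Base URL format and build a hypothetical
# chat-completions endpoint from it. The URL path and api-version below
# are assumptions from the public Azure OpenAI REST docs, not values
# taken from this integration.
import re

BASE_URL = "https://your-resource.services.ai.azure.com/"  # placeholder resource
DEPLOYMENT = "gpt-4o-mini"   # the model deployment name you created
API_VERSION = "2024-06-01"   # assumed; check the Azure docs for current versions

def chat_completions_url(base: str, deployment: str, api_version: str) -> str:
    """Compose a deployment-scoped chat-completions URL (assumed layout)."""
    if not re.fullmatch(r"https://[a-z0-9-]+\.services\.ai\.azure\.com/", base):
        raise ValueError(
            "Base URL should look like https://your-resource.services.ai.azure.com/"
        )
    return f"{base}openai/deployments/{deployment}/chat/completions?api-version={api_version}"

print(chat_completions_url(BASE_URL, DEPLOYMENT, API_VERSION))
```

If the integration rejects your base URL, checking it against this shape (trailing slash included) is a reasonable first step.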
Options for Azure OpenAI Conversation can be set via the user interface, by taking the following steps:
- Browse to your Home Assistant instance.
- In the sidebar click on Settings -> Devices & Services.
- Find the Azure OpenAI Conversation integration and click 'Configure'.
Options available (same as built-in OpenAI conversation):
- `Instructions`: The starting text for the AI language model to generate new text from. This text can include information about your Home Assistant instance, devices, and areas, and is written using Home Assistant Templating.
- `Model`: The name of the GPT language model deployed for text generation (e.g. `my-gpt35-model`). You can find more details on the available models in the Azure OpenAI Documentation. If you are having issues using an assistant that uses this integration, please check that this is the model you actually deployed.
- `Maximum Tokens to Return in Response`: The maximum number of "tokens" (roughly, word fragments) that the AI model should generate in its completion of the prompt. For more information, see the Azure OpenAI Completion Documentation.
- `Temperature`: A value that determines the level of creativity and risk-taking the model should use when generating text. A higher temperature means the model is more likely to generate unexpected results, while a lower temperature produces more deterministic output. See the Azure OpenAI Completion Documentation for more information.
- `Top P`: An alternative to temperature, top_p determines the proportion of the most likely word choices the model should consider when generating text. A lower top_p means the model will only consider the most likely words, while a higher top_p means a wider range of words, including less likely ones, will be considered. For more information, see the Azure OpenAI Completion Documentation.
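As a small illustration of the `Instructions` option, here is a minimal prompt written with Home Assistant Templating. The sensor entity is made up; `now()`, `areas()`, `area_name()`, and `states()` are standard Home Assistant template functions — adapt the names to your own setup:

```jinja2
You are a voice assistant for Home Assistant.
The current time is {{ now().strftime("%H:%M") }}.
Areas in this home:
{% for area in areas() %}
- {{ area_name(area) }}
{% endfor %}
Answer questions about device state, e.g. the living room temperature
is {{ states('sensor.living_room_temperature') }} °C.
```

The template is rendered before each conversation, so the model always sees current state values.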
This value cannot be changed through the options; to update it you must delete and recreate the integration. Make sure you have all required values, such as the API key, saved before recreating it.
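To build intuition for how the `Temperature` and `Top P` options above interact, here is a self-contained sketch of temperature scaling and nucleus (top-p) filtering over a toy next-word distribution. This illustrates the sampling idea only; it is not code from the model or from this integration, and the word scores are invented:

```python
# Toy illustration of temperature scaling and top-p (nucleus) filtering.
# The word scores are made up; real models work on token logits.
import math

def apply_temperature(logits: dict[str, float], temperature: float) -> dict[str, float]:
    """Softmax over logits divided by temperature (lower -> more deterministic)."""
    scaled = {w: l / temperature for w, l in logits.items()}
    z = sum(math.exp(v) for v in scaled.values())
    return {w: math.exp(v) / z for w, v in scaled.items()}

def top_p_filter(probs: dict[str, float], top_p: float) -> dict[str, float]:
    """Keep the smallest set of words whose cumulative probability reaches top_p."""
    kept, total = {}, 0.0
    for word, p in sorted(probs.items(), key=lambda kv: kv[1], reverse=True):
        kept[word] = p
        total += p
        if total >= top_p:
            break
    z = sum(kept.values())          # renormalize what survived the cut
    return {w: p / z for w, p in kept.items()}

logits = {"lights": 2.0, "music": 1.0, "heating": 0.5, "disco": -1.0}
for t in (0.5, 1.5):
    print(f"temperature={t}:", apply_temperature(logits, t))
# With top_p=0.9, the least likely word ("disco") is excluded entirely:
print("top_p=0.9:", top_p_filter(apply_temperature(logits, 1.0), 0.9))
```

Lower temperature sharpens the distribution toward the top word, while lower top_p cuts the long tail of unlikely words out of consideration, which matches the descriptions above.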
Please refer to the release history.
While it'd be nice to have more developers, you can contribute without knowing how to code. You can file bugs/feature requests, or you can help with other tasks like UI Translations and updating the README.
- Clone the repository and move into it.
- Copy the environment template: `cp .env.example .env`, then set `GITHUB_TOKEN` in `.env` with read-only access to Contents and Metadata (required for HACS checks).
- Run local checks using Docker:
  - `make lint` runs Ruff
  - `make hassfest` runs Home Assistant integration validation
  - `make hacs` runs HACS validation
  - `make test` runs all of the above
- (Optional) Start Home Assistant locally for manual testing: `docker compose up -d homeassistant`. Integration files are mounted from `./custom_components/azure_openai_conversation` into the container.
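For reference, the resulting `.env` from the second step might look like this; the token value is a placeholder, and a fine-grained token with read-only Contents and Metadata access is enough:

```dotenv
# .env — consumed by the Make targets above
# Placeholder value; substitute your own GitHub token
GITHUB_TOKEN=github_pat_xxxxxxxxxxxxxxxxxxxx
```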
More languages can be added here, contributions are welcome :)
Languages available:
- English
The README file will be used for documentation; if it's expanded in the future with automations or other tweaks, we can consider a wiki for that purpose.
Disclaimer: Don't worry about making mistakes as we can revert using the history 😊.
MIT - By providing a contribution, you agree the contribution is licensed under MIT.

