Commit a8d1522: feat(open_responses): document integration
---
title: Open Responses
description: Instructions on how to integrate an OpenAI Responses API-compatible endpoint as a conversation agent
ha_category:
  - AI
  - Voice
ha_release: 2026.6
ha_iot_class: Cloud Polling
ha_config_flow: true
ha_codeowners:
  - '@Komzpa'
ha_domain: open_responses
ha_integration_type: service
ha_platforms:
  - ai_task
  - conversation
ha_quality_scale: bronze
related:
  - docs: /integrations/openai_conversation/
    title: OpenAI
  - docs: /integrations/open_router/
    title: OpenRouter
  - docs: /voice_control/voice_remote_expose_devices/
    title: Exposing entities to Assist
---

The **Open Responses** {% term integration %} lets Home Assistant use an endpoint that implements the OpenAI Responses API. You can use it with a proxy, gateway, load balancer, or provider that exposes a Responses-compatible API surface and supports the models and tools you configure in Home Assistant.

Use this integration when you want to connect Home Assistant to an OpenAI Responses API-compatible endpoint that is not the official OpenAI API endpoint. If you use the official OpenAI API endpoint directly, use the [OpenAI integration](/integrations/openai_conversation/) instead.

Controlling Home Assistant is done by giving the conversation agent access to the Assist API. You can control which devices and entities it can access from the {% my voice_assistants title="exposed entities page" %}.

## Prerequisites

Before setting up the integration, make sure you have:

- An API key for your Responses-compatible endpoint
- The full API base URL for the endpoint
- A model name supported by that endpoint

The base URL must point to the API root used by the OpenAI Python SDK. For many compatible endpoints, this ends in `/v1`.
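
To illustrate the expected URL shape, here is a small hypothetical helper (not part of Home Assistant or the OpenAI SDK) that normalizes a user-supplied endpoint URL into the API-root form many compatible endpoints use:

```python
# Hypothetical helper for illustration only: normalize an endpoint URL
# into the API-root form that typically ends in /v1.
from urllib.parse import urlparse


def normalize_base_url(url: str) -> str:
    """Strip trailing slashes and append /v1 if no version segment is present."""
    url = url.rstrip("/")
    last_segment = urlparse(url).path.split("/")[-1]
    if not last_segment.startswith("v"):
        url += "/v1"
    return url


print(normalize_base_url("https://llm.example.com"))     # -> https://llm.example.com/v1
print(normalize_base_url("https://llm.example.com/v1/"))  # -> https://llm.example.com/v1
```

Your endpoint may use a different API root; when in doubt, use the exact base URL your provider documents for the OpenAI Python SDK.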

{% important %}
Home Assistant validates that the endpoint can list models during setup. After setup, compatibility depends on the endpoint implementing the Responses API features you enable, such as tool calling, structured output, web search, code interpreter, and image generation. If an endpoint behaves differently from the official OpenAI API, reproduce the issue against that endpoint before reporting it to Home Assistant.
{% endimportant %}

{% include integrations/config_flow.md %}

{% configuration_basic %}
API key:
  description: "The API key used to authenticate requests to your Responses-compatible endpoint."
Base URL:
  description: "The full API base URL for your Responses-compatible endpoint."
{% endconfiguration_basic %}

{% include integrations/option_flow.md %}

The integration provides the following types of subentries:

- [Conversation](/integrations/conversation/)
- [AI Task](/integrations/ai_task/)

The Conversation and AI Task subentries have the following configuration options. Some options may be unavailable depending on the subentry type or selected model.

{% configuration_basic %}
Instructions:
  description: Instructions for the AI on how it should respond to your requests. It is written using [Home Assistant templating](/docs/templating/).
Control Home Assistant:
  description: If enabled, the model can interact with Home Assistant. It can only control or provide information about entities that are [exposed](/voice_control/voice_remote_expose_devices/) to it.
Recommended settings:
  description: If enabled, Home Assistant uses the recommended model and settings.
{% endconfiguration_basic %}
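
For illustration, the Instructions field might contain a template like the following (the `person.anna` entity is a made-up example; use entities from your own home):

{% raw %}

```jinja2
You are a voice assistant for this home.
Answer briefly and in plain language.
The local time is {{ now().strftime("%H:%M") }}.
{% if is_state("person.anna", "home") %}Anna is home right now.{% endif %}
```

{% endraw %}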

If you choose not to use the recommended settings, you can configure the following options:

{% configuration_basic %}
Model:
  description: The model name to send to your Responses-compatible endpoint. The value must match a model supported by that endpoint.
Maximum tokens to return in response:
  description: The maximum number of output tokens the model can generate.
Temperature:
  description: Controls the creativity of the model response. Higher values can make responses more varied, while lower values make responses more deterministic.
Top P:
  description: Controls nucleus sampling. Lower values make the model consider fewer likely tokens.
Store requests and responses:
  description: If enabled, Home Assistant asks the endpoint to store requests and responses. Whether this is supported, where data is stored, and how long it is retained depends on your endpoint.
Service tier:
  description: The service tier value to send to the endpoint. Whether Auto, Standard, Flex, or Priority are supported depends on your endpoint and selected model.
Enable web search:
  description: Allows the model to use web search through the Responses API if your endpoint supports it.
Search context size:
  description: Controls how much context the web search tool can retrieve if web search is enabled and supported by your endpoint.
Include home location:
  description: Allows Home Assistant to use the location of your Home Assistant instance to provide more relevant web search results.
Code interpreter:
  description: Allows the model to use the code interpreter tool if your endpoint and model support it.
Image model:
  description: The image model to use when generating images with AI Task if your endpoint supports image generation.
Reasoning effort:
  description: Controls how many reasoning tokens the model can use before creating a response if the selected model supports reasoning.
Reasoning summary:
  description: Controls the length and detail of reasoning summaries provided by the model if the selected model supports reasoning summaries.
Verbosity:
  description: Controls response detail for models that support verbosity.
{% endconfiguration_basic %}
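
As a rough sketch of how these options relate to the wire format, the following shows the kind of Responses API request body such options map onto. Field names follow the public Responses API; the model name is hypothetical, and the exact payload Home Assistant builds may differ:

```python
# Illustrative sketch only: roughly how the options above map onto a
# Responses API request body. Not the actual payload Home Assistant sends.
request_body = {
    "model": "gpt-5-mini",           # "Model" (hypothetical model name)
    "instructions": "You are a voice assistant for Home Assistant.",
    "input": "Turn off the kitchen lights",
    "max_output_tokens": 3000,       # "Maximum tokens to return in response"
    "temperature": 1.0,              # "Temperature"
    "top_p": 1.0,                    # "Top P"
    "store": False,                  # "Store requests and responses"
    "service_tier": "auto",          # "Service tier"
    "tools": [
        {"type": "web_search"},      # "Enable web search"
        {"type": "code_interpreter", "container": {"type": "auto"}},
    ],
    "reasoning": {"effort": "medium", "summary": "auto"},
}

# A compatible endpoint only has to accept the subset of fields it supports.
print(sorted(request_body))
```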

## Supported functionality

The Open Responses integration supports:

- Conversation agents for Assist
- AI Task data generation
- AI Task image generation when the endpoint supports Responses API image generation
- Home Assistant tool calling through the Assist API
- Responses API options such as structured output, reasoning settings, web search, code interpreter, and response storage when supported by the endpoint

## Known limitations

Open Responses is a compatibility integration. It does not translate requests between different provider APIs, add provider-specific fallback behavior, or hide endpoint compatibility problems.

If setup succeeds but a later request fails, check whether your endpoint supports the exact Responses API feature used by the selected model and subentry options. For example, an endpoint might support basic text responses but not web search, code interpreter, image generation, or reasoning summaries.

## Troubleshooting

### The integration cannot connect during setup

Check that the base URL is reachable from Home Assistant and points to the API root expected by the OpenAI Python SDK. Also check that the API key has permission to list models.

### A model or tool fails after setup

Disable advanced options such as web search, code interpreter, image generation, reasoning summaries, or response storage, then try again. If the request works with fewer options enabled, the endpoint likely does not support one of the selected Responses API features.
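
This narrowing-down process can be sketched as a loop that retries with one advanced option disabled at a time. The helper below is a stand-alone illustration, not Home Assistant code; `send_request` is a stand-in for an actual API call that returns whether the request succeeded:

```python
# Hypothetical troubleshooting sketch: disable advanced options one at a
# time to find which Responses API feature an endpoint rejects.
from typing import Callable

ADVANCED_OPTIONS = [
    "web_search", "code_interpreter", "image_generation",
    "reasoning_summary", "store",
]


def find_unsupported(send_request: Callable[[set], bool]) -> list:
    """Return the advanced options whose removal makes the request succeed."""
    enabled = set(ADVANCED_OPTIONS)
    if send_request(enabled):
        return []  # everything works; nothing to disable
    return [
        option for option in ADVANCED_OPTIONS
        if send_request(enabled - {option})  # succeeds once `option` is dropped
    ]


# Example: a fake endpoint that rejects any request using web search.
print(find_unsupported(lambda opts: "web_search" not in opts))  # -> ['web_search']
```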

### Issues with a compatible endpoint

If the issue happens only with a third-party endpoint, proxy, or gateway, report it to that endpoint first. Home Assistant can usually fix bugs in how it calls the Responses API, but it cannot fix differences in endpoint behavior.

## Removing the integration

This integration follows standard integration removal. No extra steps are required.

{% include integrations/remove_device_service.md %}
