This page covers all LangChain integrations with [Microsoft Azure](https://portal.azure.com) and other [Microsoft](https://www.microsoft.com) products.
<Tip>
**Recommended: Azure OpenAI**
We recommend using @[Azure OpenAI][AzureOpenAI] across [chat models](#chat-models), [LLMs](#llms), and [embedding models](#embedding-models). With the [v1 API](https://learn.microsoft.com/en-us/azure/ai-foundry/openai/api-version-lifecycle?tabs=python) (Generally Available as of August 2025), you can use your Azure endpoint and API keys directly with the @[`langchain-openai`] package to call any model deployed in [Microsoft Foundry](https://learn.microsoft.com/en-us/azure/ai-foundry/) (including OpenAI, Llama, DeepSeek, Mistral, and Phi) through a single interface. You also get native support for Microsoft Entra ID authentication and access to the latest features including the [Responses API](#responses-api) and [reasoning models](/oss/integrations/chat/azure_chat_openai#reasoning-effort-and-summary). [Get started here](#azure-openai).
**Samples and tutorials:**
- [microsoft/langchain-for-beginners](https://github.com/microsoft/langchain-for-beginners): A hands-on course introducing LangChain with Azure OpenAI.
- [Azure-Samples/langchain-agent-python](https://github.com/Azure-Samples/langchain-agent-python): Build and deploy LangChain agents on Azure.
</Tip>
<Note>
**Claude on Azure**
Microsoft Foundry also offers access to all [Anthropic Claude models](https://learn.microsoft.com/en-us/azure/foundry/foundry-models/how-to/use-foundry-models-claude), including Opus, Sonnet, and Haiku. Claude models are served through a dedicated Anthropic-native endpoint rather than the Azure OpenAI v1 API. Use [`langchain-anthropic`](/oss/integrations/chat/anthropic) pointed at your Foundry Anthropic endpoint.
</Note>
## Chat models
Microsoft offers three main options for accessing chat models through Azure:
1. **[Azure OpenAI](https://learn.microsoft.com/en-us/azure/ai-services/openai/)** (recommended) — Access any model deployed in Microsoft Foundry (including OpenAI, Llama, DeepSeek, Mistral, and Phi) through a single interface, with enterprise features such as keyless authentication through [Microsoft Entra ID](https://learn.microsoft.com/en-us/azure/ai-foundry/openai/how-to/managed-identity), regional data residency, and private networking. Use @[`ChatOpenAI`] on the v1 API, or @[`AzureChatOpenAI`] for traditional deployments.

   Azure OpenAI also supports the [Responses API](#responses-api), which gives you access to server-side tools like code interpreter, image generation, and file search directly from your chat model.

2. **[Azure AI](https://learn.microsoft.com/en-us/azure/ai-studio/how-to/deploy-models)** — Recommended for accessing tools, storage, and custom middleware from the broader Azure ecosystem alongside your chat model.

3. **[Azure ML](https://learn.microsoft.com/en-us/azure/machine-learning/)** — Allows deployment and management of custom or fine-tuned open-source models with Azure Machine Learning.
### Azure OpenAI
To get started with Azure OpenAI, [create an Azure deployment](https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/create-resource) and install the `langchain-openai` package:
<CodeGroup>
```bash pip
pip install -U langchain-openai
```
</CodeGroup>
On the v1 API, use @[`ChatOpenAI`] directly against your Azure endpoint — no `api_version` required:

```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="gpt-5.4-mini",  # your Azure deployment name
    base_url="https://your-resource.openai.azure.com/openai/v1/",  # your Azure endpoint
    api_key="<your-api-key>",
)
```
For traditional Azure OpenAI API versions, use @[`AzureChatOpenAI`]:
See the [Azure ChatOpenAI integration page](/oss/integrations/chat/azure_chat_openai) for setup, authentication, and examples.

#### Responses API
Azure OpenAI supports the [Responses API](https://learn.microsoft.com/en-us/azure/ai-foundry/openai/how-to/responses), which provides stateful conversations, built-in tools (web search, file search, code interpreter), and structured reasoning summaries. @[`ChatOpenAI`] automatically routes to the Responses API when you set the `reasoning` parameter, or you can opt in explicitly with `use_responses_api=True`:
<Tabs>
<Tab title="reasoning">
```python
from langchain_openai import ChatOpenAI

# Setting `reasoning` routes the call to the Responses API automatically.
llm = ChatOpenAI(
    model="gpt-5.4-mini",  # your Azure deployment name
    reasoning={"effort": "low"},
)

response = llm.invoke("Summarize the bitter lesson.")
print(response.text)
```
</Tab>
<Tab title="use_responses_api">
```python
from langchain_openai import ChatOpenAI

# Opt in to the Responses API explicitly.
llm = ChatOpenAI(
    model="gpt-5.4-mini",  # your Azure deployment name
    use_responses_api=True,
)

response = llm.invoke("Summarize the bitter lesson.")
print(response.text)
```
</Tab>
</Tabs>
For a walkthrough of reasoning effort, reasoning summaries, and streaming with the Responses API, see [Reasoning effort and summary](/oss/integrations/chat/azure_chat_openai#reasoning-effort-and-summary).
### Azure AI
> [Azure AI Foundry](https://learn.microsoft.com/en-us/azure/developer/python/get-started) is the broader Azure AI platform. The `langchain-azure-ai` package lets you bring Azure-native tools, storage, and custom middleware into your LangChain app, and exposes chat models deployed in Foundry through the `AzureAIOpenAIApiChatModel` class.
<CodeGroup>
```bash pip
pip install -U langchain-azure-ai
```
</CodeGroup>
See a [usage example](/oss/integrations/chat/azure_ai).
### Azure ML chat online endpoint
<CodeGroup>
```bash pip
pip install -U langchain-community
```

```bash uv
uv add langchain-community
```
</CodeGroup>
See the [Azure ML chat endpoint documentation](/oss/integrations/chat/azureml_chat_endpoint) for accessing chat models hosted with [Azure Machine Learning](https://azure.microsoft.com/en-us/products/machine-learning/).
## LLMs
Microsoft offers two main options for accessing LLMs through Azure:
1. **[Azure OpenAI](https://learn.microsoft.com/en-us/azure/ai-services/openai/)** (recommended) — Access any model deployed in Microsoft Foundry (including OpenAI, Llama, DeepSeek, Mistral, and Phi) as a completion LLM with @[`AzureOpenAI`].

2. **[Azure ML](https://learn.microsoft.com/en-us/azure/machine-learning/)** — Use custom or open-source models hosted on Azure Machine Learning online endpoints.
### Azure OpenAI
See a [usage example](/oss/integrations/llms/azure_openai).
<CodeGroup>
```bash pip
pip install -U langchain-openai
```

```bash uv
uv add langchain-openai
```
</CodeGroup>
<Tabs>
<Tab title="Entra ID (recommended)">
```python
from azure.identity import DefaultAzureCredential, get_bearer_token_provider
from langchain_openai import AzureOpenAI

token_provider = get_bearer_token_provider(
    DefaultAzureCredential(),
    "https://cognitiveservices.azure.com/.default",
)

llm = AzureOpenAI(
    azure_deployment="gpt-5.4-mini",  # your Azure deployment name
    api_version="2025-04-01-preview",
    azure_ad_token_provider=token_provider,
)

print(llm.invoke("Write a haiku about the ocean."))
```
</Tab>
<Tab title="API key">
```python
from langchain_openai import AzureOpenAI

llm = AzureOpenAI(
    azure_deployment="gpt-5.4-mini",  # your Azure deployment name
    api_version="2025-04-01-preview",
    # Reads AZURE_OPENAI_API_KEY and AZURE_OPENAI_ENDPOINT from the environment
)

print(llm.invoke("Write a haiku about the ocean."))
```
</Tab>
</Tabs>
### Azure ML
<CodeGroup>
```bash pip
pip install -U langchain-community
```

```bash uv
uv add langchain-community
```
</CodeGroup>
See a [usage example](/oss/integrations/llms/azure_ml).
## Embedding models
Microsoft offers two main options for accessing embedding models through Azure:
1. **[Azure OpenAI](https://learn.microsoft.com/en-us/azure/ai-services/openai/)** (recommended) — Use embedding models deployed in Microsoft Foundry (including OpenAI `text-embedding-3-small`, `text-embedding-3-large`, and Cohere) with @[`AzureOpenAIEmbeddings`].

2. **[Azure AI](https://learn.microsoft.com/en-us/azure/ai-studio/how-to/deploy-models)** — Recommended for accessing tools, storage, and custom middleware from the broader Azure ecosystem alongside your embedding model.
### Azure OpenAI
See a [usage example](/oss/integrations/embeddings/azure_openai).
<CodeGroup>
```bash pip
pip install -U langchain-openai
```

```bash uv
uv add langchain-openai
```
</CodeGroup>
<Tabs>
<Tab title="Entra ID (recommended)">
```python
from azure.identity import DefaultAzureCredential, get_bearer_token_provider
from langchain_openai import AzureOpenAIEmbeddings

token_provider = get_bearer_token_provider(
    DefaultAzureCredential(),
    "https://cognitiveservices.azure.com/.default",
)

embeddings = AzureOpenAIEmbeddings(
    azure_deployment="text-embedding-3-small",  # your Azure deployment name
    api_version="2025-04-01-preview",
    azure_ad_token_provider=token_provider,
)

vector = embeddings.embed_query("LangChain makes agents easy.")
```
</Tab>
<Tab title="API key">
```python
from langchain_openai import AzureOpenAIEmbeddings

embeddings = AzureOpenAIEmbeddings(
    azure_deployment="text-embedding-3-small",  # your Azure deployment name
    api_version="2025-04-01-preview",
    # Reads AZURE_OPENAI_API_KEY and AZURE_OPENAI_ENDPOINT from the environment
)

vector = embeddings.embed_query("LangChain makes agents easy.")
```
</Tab>
</Tabs>