Commit e337fb7

marlenezw and Copilot committed
Refine Microsoft integrations page: v1 API model list, Claude note, Azure AI/ML install snippets
Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
1 parent 069c818 commit e337fb7

1 file changed

Lines changed: 212 additions & 78 deletions

File tree

src/oss/python/integrations/providers/microsoft.mdx

@@ -7,28 +7,34 @@ sidebarTitle: "Microsoft"
 This page covers all LangChain integrations with [Microsoft Azure](https://portal.azure.com) and other [Microsoft](https://www.microsoft.com) products.
 
 <Tip>
-**Recommended: Azure OpenAI with `ChatOpenAI`**
+**Recommended: Azure OpenAI**
 
-For most applications, we recommend using [Azure OpenAI](https://learn.microsoft.com/en-us/azure/ai-services/openai/) together with @[`ChatOpenAI`] on the [v1 API](https://learn.microsoft.com/en-us/azure/ai-foundry/openai/api-version-lifecycle?tabs=python) (Generally Available as of August 2025). This gives you a unified interface across OpenAI and Azure OpenAI, native support for Microsoft Entra ID authentication, and access to the latest features including the [Responses API](#responses-api) and [reasoning models](/oss/integrations/chat/azure_chat_openai#reasoning-effort-and-summary).
-
-See the [Azure ChatOpenAI integration page](/oss/integrations/chat/azure_chat_openai) for setup, authentication, and examples.
+We recommend using @[Azure OpenAI][AzureOpenAI] across [chat models](#chat-models), [LLMs](#llms), and [embedding models](#embedding-models). With the [v1 API](https://learn.microsoft.com/en-us/azure/ai-foundry/openai/api-version-lifecycle?tabs=python) (Generally Available as of August 2025), you can use your Azure endpoint and API keys directly with the @[`langchain-openai`] package to call any model deployed in [Microsoft Foundry](https://learn.microsoft.com/en-us/azure/ai-foundry/) (including OpenAI, Llama, DeepSeek, Mistral, and Phi) through a single interface. You also get native support for Microsoft Entra ID authentication and access to the latest features including the [Responses API](#responses-api) and [reasoning models](/oss/integrations/chat/azure_chat_openai#reasoning-effort-and-summary). [Get started here](#azure-openai).
 
 **Samples and tutorials:**
-- [microsoft/langchain-for-beginners](https://github.com/microsoft/langchain-for-beginners) A hands-on course introducing LangChain with Azure OpenAI.
-- [Azure-Samples/langchain-agent-python](https://github.com/Azure-Samples/langchain-agent-python) Build and deploy LangChain agents on Azure.
+- [microsoft/langchain-for-beginners](https://github.com/microsoft/langchain-for-beginners): A hands-on course introducing LangChain with Azure OpenAI.
+- [Azure-Samples/langchain-agent-python](https://github.com/Azure-Samples/langchain-agent-python): Build and deploy LangChain agents on Azure.
 </Tip>
 
+<Note>
+**Claude on Azure**
+
+Microsoft Foundry also offers access to all [Anthropic Claude models](https://learn.microsoft.com/en-us/azure/foundry/foundry-models/how-to/use-foundry-models-claude), including Opus, Sonnet, and Haiku. Claude models are served through a dedicated Anthropic-native endpoint rather than the Azure OpenAI v1 API. Use [`langchain-anthropic`](/oss/integrations/chat/anthropic) pointed at your Foundry Anthropic endpoint.
+</Note>
+
 ## Chat models
 
 Microsoft offers three main options for accessing chat models through Azure:
 
-1. **[Azure OpenAI](https://learn.microsoft.com/en-us/azure/ai-services/openai/)** (recommended) — Provides access to OpenAI models like `gpt-5`, `o4-mini`, and `gpt-4.1` through Microsoft Azure's enterprise platform. Use @[`ChatOpenAI`] on the v1 API, or @[`AzureChatOpenAI`] for traditional deployments.
-2. **[Azure AI](https://learn.microsoft.com/en-us/azure/ai-studio/how-to/deploy-models)** — Offers access to a variety of models from different providers including Anthropic, DeepSeek, Cohere, Phi, and Mistral through a unified API.
+1. **[Azure OpenAI](https://learn.microsoft.com/en-us/azure/ai-services/openai/)** (recommended) — Access any model deployed in Microsoft Foundry (including OpenAI, Llama, DeepSeek, Mistral, and Phi) through a single interface, with enterprise features such as keyless authentication through [Microsoft Entra ID](https://learn.microsoft.com/en-us/azure/ai-foundry/openai/how-to/managed-identity), regional data residency, and private networking. Use @[`ChatOpenAI`] on the v1 API, or @[`AzureChatOpenAI`] for traditional deployments.
+
+   Azure OpenAI also supports the [Responses API](#responses-api), which gives you access to server-side tools like code interpreter, image generation, and file search directly from your chat model.
+2. **[Azure AI](https://learn.microsoft.com/en-us/azure/ai-studio/how-to/deploy-models)** — Recommended for accessing tools, storage, and custom middleware from the broader Azure ecosystem alongside your chat model.
 3. **[Azure ML](https://learn.microsoft.com/en-us/azure/machine-learning/)** — Allows deployment and management of custom or fine-tuned open-source models with Azure Machine Learning.
 
 ### Azure OpenAI
 
->[Azure OpenAI](https://learn.microsoft.com/en-us/azure/cognitive-services/openai/) is an Azure service that provides powerful language models from OpenAI — including the GPT-5, GPT-4.1, and o-series families — with enterprise features such as keyless authentication through Microsoft Entra ID, regional data residency, and private networking.
+To get started with Azure OpenAI, [create an Azure deployment](https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/create-resource) and install the `langchain-openai` package:
 
 <CodeGroup>
 ```bash pip
@@ -40,17 +46,42 @@ Microsoft offers three main options for accessing chat models through Azure:
 ```
 </CodeGroup>
 
-On the v1 API, use @[`ChatOpenAI`] directly against your Azure endpoint — no `api_version` required:
-
-```python
-from langchain_openai import ChatOpenAI
-
-llm = ChatOpenAI(
-    model="gpt-5.4-mini",  # your Azure deployment name
-    base_url="https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1/",
-    api_key="your-azure-api-key",
-)
-```
+On the v1 API, use @[`ChatOpenAI`] directly against your Azure endpoint—no `api_version` required:
+
+<Tabs>
+<Tab title="Entra ID (recommended)">
+```bash
+pip install azure-identity
+```
+
+```python
+from azure.identity import DefaultAzureCredential, get_bearer_token_provider
+from langchain_openai import ChatOpenAI
+
+token_provider = get_bearer_token_provider(
+    DefaultAzureCredential(),
+    "https://cognitiveservices.azure.com/.default",
+)
+
+llm = ChatOpenAI(
+    model="gpt-5.4-mini",  # your Azure deployment name
+    base_url="https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1/",
+    api_key=token_provider,  # callable that handles token refresh
+)
+```
+</Tab>
+<Tab title="API key">
+```python
+from langchain_openai import ChatOpenAI
+
+llm = ChatOpenAI(
+    model="gpt-5.4-mini",  # your Azure deployment name
+    base_url="https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1/",
+    api_key="your-azure-api-key",
+)
+```
+</Tab>
+</Tabs>
 
 For traditional Azure OpenAI API versions, use @[`AzureChatOpenAI`]:
 
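The Entra ID tabs above pass a callable as `api_key`. In `azure-identity`, `get_bearer_token_provider` wraps a credential in a zero-argument function that fetches a bearer token and reuses it until it nears expiry, so every request gets a valid token without manual refresh. A stdlib-only sketch of that caching pattern (the `FakeCredential` stub and the 300-second refresh margin are illustrative, not the real `azure-identity` implementation):

```python
import time


class FakeCredential:
    """Stand-in for azure.identity.DefaultAzureCredential (illustrative only)."""

    def __init__(self):
        self.calls = 0

    def get_token(self, scope):
        self.calls += 1
        # Return (token, expiry-timestamp), mimicking azure-identity's AccessToken
        return (f"token-{self.calls}", time.time() + 3600)


def bearer_token_provider(credential, scope, refresh_margin=300):
    """Return a zero-arg callable that caches the token until near expiry."""
    cache = {"token": None, "expires": 0.0}

    def provider():
        # Refresh when within `refresh_margin` seconds of expiry
        if time.time() > cache["expires"] - refresh_margin:
            cache["token"], cache["expires"] = credential.get_token(scope)
        return cache["token"]

    return provider


cred = FakeCredential()
provider = bearer_token_provider(cred, "https://cognitiveservices.azure.com/.default")
assert provider() == "token-1"
assert provider() == "token-1"  # cached: the credential is not called again
assert cred.calls == 1
```

This is why the v1 examples can hand `api_key` a function: the client calls it per request, and the provider decides whether the cached token is still good.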
@@ -64,25 +95,50 @@ See the [Azure ChatOpenAI integration page](/oss/integrations/chat/azure_chat_op
 
 Azure OpenAI supports the [Responses API](https://learn.microsoft.com/en-us/azure/ai-foundry/openai/how-to/responses), which provides stateful conversations, built-in tools (web search, file search, code interpreter), and structured reasoning summaries. @[`ChatOpenAI`] automatically routes to the Responses API when you set the `reasoning` parameter, or you can opt in explicitly with `use_responses_api=True`:
 
-```python
-from langchain_openai import ChatOpenAI
-
-llm = ChatOpenAI(
-    model="gpt-5.4-mini",
-    base_url="https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1/",
-    api_key="your-azure-api-key",
-    use_responses_api=True,
-)
-
-response = llm.invoke("Summarize the bitter lesson.")
-print(response.text)
-```
+<Tabs>
+<Tab title="Entra ID (recommended)">
+```python
+from azure.identity import DefaultAzureCredential, get_bearer_token_provider
+from langchain_openai import ChatOpenAI
+
+token_provider = get_bearer_token_provider(
+    DefaultAzureCredential(),
+    "https://cognitiveservices.azure.com/.default",
+)
+
+llm = ChatOpenAI(
+    model="gpt-5.4-mini",
+    base_url="https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1/",
+    api_key=token_provider,
+    use_responses_api=True,
+)
+
+response = llm.invoke("Summarize the bitter lesson.")
+print(response.text)
+```
+</Tab>
+<Tab title="API key">
+```python
+from langchain_openai import ChatOpenAI
+
+llm = ChatOpenAI(
+    model="gpt-5.4-mini",
+    base_url="https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1/",
+    api_key="your-azure-api-key",
+    use_responses_api=True,
+)
+
+response = llm.invoke("Summarize the bitter lesson.")
+print(response.text)
+```
+</Tab>
+</Tabs>
 
 For a walkthrough of reasoning effort, reasoning summaries, and streaming with the Responses API, see [Reasoning effort and summary](/oss/integrations/chat/azure_chat_openai#reasoning-effort-and-summary).
 
 ### Azure AI
 
->[Azure AI Foundry](https://learn.microsoft.com/en-us/azure/developer/python/get-started) provides access to a wide range of models from various providers including Azure OpenAI, DeepSeek R1, Cohere, Phi and Mistral through the `AzureAIOpenAIApiChatModel` class.
+>[Azure AI Foundry](https://learn.microsoft.com/en-us/azure/developer/python/get-started) is the broader Azure AI platform. The `langchain-azure-ai` package lets you bring Azure-native tools, storage, and custom middleware into your LangChain app, and exposes chat models deployed in Foundry through the `AzureAIOpenAIApiChatModel` class.
 
 <CodeGroup>
 ```bash pip
@@ -94,59 +150,151 @@ For a walkthrough of reasoning effort, reasoning summaries, and streaming with t
 ```
 </CodeGroup>
 
-Configure your endpoint. You can use a project endpoint with `DefaultAzureCredential`, or set an API key directly.
-
-```bash
-export AZURE_AI_PROJECT_ENDPOINT=your-project-endpoint
-```
+See a [usage example](/oss/integrations/chat/azure_ai).
 
-```python
-from langchain_azure_ai.chat_models import AzureAIOpenAIApiChatModel
-from azure.identity import DefaultAzureCredential
-
-llm = AzureAIOpenAIApiChatModel(
-    model="gpt-5.4",
-    credential=DefaultAzureCredential(),
-)
-```
+### Azure ML chat online endpoint
 
-See a [usage example](/oss/integrations/chat/azure_ai)
+<CodeGroup>
+```bash pip
+pip install -U langchain-community
+```
 
-### Azure ML chat online endpoint
+```bash uv
+uv add langchain-community
+```
+</CodeGroup>
 
 See the [Azure ML chat endpoint documentation](/oss/integrations/chat/azureml_chat_endpoint) for accessing chat
 models hosted with [Azure Machine Learning](https://azure.microsoft.com/en-us/products/machine-learning/).
 
 
 ## LLMs
 
-### Azure ML
-
-See a [usage example](/oss/integrations/llms/azure_ml).
+Microsoft offers two main options for accessing LLMs through Azure:
 
-```python
-from langchain_community.llms.azureml_endpoint import AzureMLOnlineEndpoint
-```
+1. **[Azure OpenAI](https://learn.microsoft.com/en-us/azure/ai-services/openai/)** (recommended) — Access any model deployed in Microsoft Foundry (including OpenAI, Llama, DeepSeek, Mistral, and Phi) as a completion LLM with @[`AzureOpenAI`].
+2. **[Azure ML](https://learn.microsoft.com/en-us/azure/machine-learning/)** — Use custom or open-source models hosted on Azure Machine Learning online endpoints.
 
 ### Azure OpenAI
 
 See a [usage example](/oss/integrations/llms/azure_openai).
 
-```python
-from langchain_openai import AzureOpenAI
-```
+<CodeGroup>
+```bash pip
+pip install -U langchain-openai
+```
+
+```bash uv
+uv add langchain-openai
+```
+</CodeGroup>
+
+<Tabs>
+<Tab title="Entra ID (recommended)">
+```python
+from azure.identity import DefaultAzureCredential, get_bearer_token_provider
+from langchain_openai import AzureOpenAI
+
+token_provider = get_bearer_token_provider(
+    DefaultAzureCredential(),
+    "https://cognitiveservices.azure.com/.default",
+)
+
+llm = AzureOpenAI(
+    azure_deployment="gpt-5.4-mini",  # your Azure deployment name
+    api_version="2025-04-01-preview",
+    azure_ad_token_provider=token_provider,
+)
+
+print(llm.invoke("Write a haiku about the ocean."))
+```
+</Tab>
+<Tab title="API key">
+```python
+from langchain_openai import AzureOpenAI
+
+llm = AzureOpenAI(
+    azure_deployment="gpt-5.4-mini",  # your Azure deployment name
+    api_version="2025-04-01-preview",
+    azure_endpoint="https://YOUR-RESOURCE-NAME.openai.azure.com/",
+    api_key="your-azure-api-key",
+)
+
+print(llm.invoke("Write a haiku about the ocean."))
+```
+</Tab>
+</Tabs>
+
+### Azure ML
+
+<CodeGroup>
+```bash pip
+pip install -U langchain-community
+```
+
+```bash uv
+uv add langchain-community
+```
+</CodeGroup>
+
+See a [usage example](/oss/integrations/llms/azure_ml).
 
 ## Embedding models
 
 Microsoft offers two main options for accessing embedding models through Azure:
 
+1. **[Azure OpenAI](https://learn.microsoft.com/en-us/azure/ai-services/openai/)** (recommended) — Use embedding models deployed in Microsoft Foundry (including OpenAI `text-embedding-3-small`, `text-embedding-3-large`, and Cohere) with @[`AzureOpenAIEmbeddings`].
+2. **[Azure AI](https://learn.microsoft.com/en-us/azure/ai-studio/how-to/deploy-models)** — Recommended for accessing tools, storage, and custom middleware from the broader Azure ecosystem alongside your embedding model.
+
 ### Azure OpenAI
 
-See a [usage example](/oss/integrations/embeddings/azure_openai)
+See a [usage example](/oss/integrations/embeddings/azure_openai).
 
-```python
-from langchain_openai import AzureOpenAIEmbeddings
-```
+<CodeGroup>
+```bash pip
+pip install -U langchain-openai
+```
+
+```bash uv
+uv add langchain-openai
+```
+</CodeGroup>
+
+<Tabs>
+<Tab title="Entra ID (recommended)">
+```python
+from azure.identity import DefaultAzureCredential, get_bearer_token_provider
+from langchain_openai import AzureOpenAIEmbeddings
+
+token_provider = get_bearer_token_provider(
+    DefaultAzureCredential(),
+    "https://cognitiveservices.azure.com/.default",
+)
+
+embeddings = AzureOpenAIEmbeddings(
+    azure_deployment="text-embedding-3-small",  # your Azure deployment name
+    api_version="2025-04-01-preview",
+    azure_ad_token_provider=token_provider,
+)
+
+vector = embeddings.embed_query("LangChain makes agents easy.")
+```
+</Tab>
+<Tab title="API key">
+```python
+from langchain_openai import AzureOpenAIEmbeddings
+
+embeddings = AzureOpenAIEmbeddings(
+    azure_deployment="text-embedding-3-small",  # your Azure deployment name
+    api_version="2025-04-01-preview",
+    azure_endpoint="https://YOUR-RESOURCE-NAME.openai.azure.com/",
+    api_key="your-azure-api-key",
+)
+
+vector = embeddings.embed_query("LangChain makes agents easy.")
+```
+</Tab>
+</Tabs>
 
 ### Azure AI
 
@@ -160,21 +308,7 @@ from langchain_openai import AzureOpenAIEmbeddings
 ```
 </CodeGroup>
 
-Configure your endpoint. You can use a project endpoint with `DefaultAzureCredential`, or set an API key directly.
-
-```bash
-export AZURE_AI_PROJECT_ENDPOINT=your-project-endpoint
-```
-
-```python
-from langchain_azure_ai.embeddings import AzureAIOpenAIApiEmbeddingsModel
-from azure.identity import DefaultAzureCredential
-
-embed_model = AzureAIOpenAIApiEmbeddingsModel(
-    model="text-embedding-ada-002",
-    credential=DefaultAzureCredential(),
-)
-```
+See a [usage example](/oss/integrations/providers/azure_ai#azure-ai-model-inference-for-embeddings).
 
 ## Middleware
 
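A detail the diff relies on but never states side by side: the two endpoint styles address deployments differently. The v1 style uses a versionless `/openai/v1/` path and sends the deployment name in the request body as `model`, which is why no `api_version` is needed; the traditional style embeds the deployment in the URL path and appends `api-version` as a query parameter. A stdlib sketch of the two URL shapes (the helper names are mine, and the path patterns follow Azure's documented endpoint formats):

```python
def v1_chat_url(resource: str) -> str:
    """v1 API: versionless path; the deployment is routed via `model` in the body."""
    return f"https://{resource}.openai.azure.com/openai/v1/chat/completions"


def traditional_chat_url(resource: str, deployment: str, api_version: str) -> str:
    """Traditional API: deployment in the path, version as a query parameter."""
    return (
        f"https://{resource}.openai.azure.com/openai/deployments/"
        f"{deployment}/chat/completions?api-version={api_version}"
    )


assert v1_chat_url("contoso") == (
    "https://contoso.openai.azure.com/openai/v1/chat/completions"
)
assert traditional_chat_url("contoso", "gpt-4.1", "2025-04-01-preview").endswith(
    "api-version=2025-04-01-preview"
)
```

This is the practical difference between passing `base_url=".../openai/v1/"` to `ChatOpenAI` and passing `azure_endpoint` plus `api_version` to `AzureChatOpenAI` or `AzureOpenAI` in the examples above.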