---
title: Microsoft integrations
description: Integrate with Microsoft using LangChain Python.
sidebarTitle: Microsoft
---
This page covers all LangChain integrations with Microsoft Azure and other Microsoft products.
**Recommended: Azure OpenAI**

We recommend using @[Azure OpenAI][AzureOpenAI] across [chat models](#chat-models), [LLMs](#llms), and [embedding models](#embedding-models). With the [v1 API](https://learn.microsoft.com/en-us/azure/ai-foundry/openai/api-version-lifecycle?tabs=python) (Generally Available as of August 2025), you can use your Azure endpoint and API keys directly with the @[`langchain-openai`] package to call any model deployed in [Microsoft Foundry](https://learn.microsoft.com/en-us/azure/ai-foundry/) (including OpenAI, Llama, DeepSeek, Mistral, and Phi) through a single interface. You also get native support for Microsoft Entra ID authentication and access to the latest features, including the [Responses API](#responses-api) and [reasoning models](/oss/integrations/chat/azure_chat_openai). [Get started here](#azure-openai).
**Samples and tutorials:**
- [microsoft/langchain-for-beginners](https://github.com/microsoft/langchain-for-beginners): A hands-on course introducing LangChain with Azure OpenAI.
- [Azure-Samples/langchain-agent-python](https://github.com/Azure-Samples/langchain-agent-python): Build and deploy LangChain agents on Azure.
Microsoft Foundry also offers access to all [Anthropic Claude models](https://learn.microsoft.com/en-us/azure/foundry/foundry-models/how-to/use-foundry-models-claude), including Opus, Sonnet, and Haiku. Claude models are served through a dedicated Anthropic-native endpoint rather than the Azure OpenAI v1 API. Use [`langchain-anthropic`](/oss/integrations/chat/anthropic) pointed at your Foundry Anthropic endpoint.
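As an illustrative sketch only — the endpoint shape and model name below are placeholders, not values from this page; copy the actual Anthropic endpoint URL and deployment name from your Foundry resource:

```python
# Sketch: base_url shape and model name are assumptions, not documented values.
from langchain_anthropic import ChatAnthropic

llm = ChatAnthropic(
    model="claude-sonnet-4-5",  # assumed deployment name
    base_url="https://YOUR-RESOURCE.services.ai.azure.com/anthropic",  # assumed shape
    api_key="your-foundry-api-key",
)
```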
Microsoft offers three main options for accessing chat models through Azure:
- **Azure OpenAI** (recommended) — Access any model deployed in Microsoft Foundry (including OpenAI, Llama, DeepSeek, Mistral, and Phi) through a single interface, with enterprise features such as keyless authentication through Microsoft Entra ID, regional data residency, and private networking. Use @[ChatOpenAI] on the v1 API, or @[AzureChatOpenAI] for traditional deployments. Azure OpenAI also supports the Responses API, which gives you access to server-side tools like code interpreter, image generation, and file search directly from your chat model.
- **Azure AI** — Recommended for accessing tools, storage, and custom middleware from the broader Azure ecosystem alongside your chat model.
- **Azure ML** — Allows deployment and management of custom or fine-tuned open-source models with Azure Machine Learning.
To get started with Azure OpenAI, create an Azure deployment and install the `langchain-openai` package:
```bash uv
uv add langchain-openai
```
On the v1 API, use @[ChatOpenAI] directly against your Azure endpoint — no `api_version` required:

<Tabs>
<Tab title="Microsoft Entra ID">

```python
from azure.identity import DefaultAzureCredential, get_bearer_token_provider
from langchain_openai import ChatOpenAI

token_provider = get_bearer_token_provider(
    DefaultAzureCredential(),
    "https://cognitiveservices.azure.com/.default",
)

llm = ChatOpenAI(
    model="gpt-5.4-mini",  # your Azure deployment name
    base_url="https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1/",
    api_key=token_provider,  # callable that handles token refresh
)
```

</Tab>
<Tab title="API key">

```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="gpt-5.4-mini",  # your Azure deployment name
    base_url="https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1/",
    api_key="your-azure-api-key",
)
```

</Tab>
</Tabs>
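The `api_key` callable is simply a zero-argument function that returns a current bearer token; `get_bearer_token_provider` wraps the credential so tokens are cached and refreshed transparently. A minimal plain-Python stand-in illustrating that contract (`make_token_provider` is hypothetical, not part of any Azure SDK):

```python
import time

def make_token_provider(fetch_token, lifetime_seconds=3600, refresh_margin=300):
    """Return a zero-argument callable that caches a token and re-fetches it
    shortly before expiry -- the contract a callable api_key must satisfy."""
    state = {"token": None, "expires_at": 0.0}

    def provider():
        now = time.time()
        if state["token"] is None or now >= state["expires_at"] - refresh_margin:
            state["token"] = fetch_token()
            state["expires_at"] = now + lifetime_seconds
        return state["token"]

    return provider

# Fake fetcher standing in for a real credential's token request.
fetch_count = 0

def fake_fetch():
    global fetch_count
    fetch_count += 1
    return f"token-{fetch_count}"

provider = make_token_provider(fake_fetch)
first = provider()   # fetches a new token
second = provider()  # served from cache, no re-fetch
```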
For traditional Azure OpenAI API versions, use @[AzureChatOpenAI]:

```python
from langchain_openai import AzureChatOpenAI
```

See the [Azure ChatOpenAI integration page](/oss/integrations/chat/azure_chat_openai) for end-to-end setup, Entra ID authentication, tool calling, and reasoning examples.
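For illustration — deployment name, endpoint, and key are placeholders — a minimal `AzureChatOpenAI` configuration mirrors the `AzureOpenAI` examples later on this page:

```python
from langchain_openai import AzureChatOpenAI

llm = AzureChatOpenAI(
    azure_deployment="gpt-5.4-mini",  # your Azure deployment name
    api_version="2025-04-01-preview",
    azure_endpoint="https://YOUR-RESOURCE-NAME.openai.azure.com/",
    api_key="your-azure-api-key",
)
```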
Azure OpenAI supports the Responses API, which provides stateful conversations, built-in tools (web search, file search, code interpreter), and structured reasoning summaries. @[ChatOpenAI] automatically routes to the Responses API when you set the `reasoning` parameter, or you can opt in explicitly with `use_responses_api=True`:
<Tabs>
<Tab title="Microsoft Entra ID">

```python
from azure.identity import DefaultAzureCredential, get_bearer_token_provider
from langchain_openai import ChatOpenAI

token_provider = get_bearer_token_provider(
    DefaultAzureCredential(),
    "https://cognitiveservices.azure.com/.default",
)

llm = ChatOpenAI(
    model="gpt-5.4-mini",
    base_url="https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1/",
    api_key=token_provider,
    use_responses_api=True,
)

response = llm.invoke("Summarize the bitter lesson.")
print(response.text)
```

</Tab>
<Tab title="API key">

```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="gpt-5.4-mini",
    base_url="https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1/",
    api_key="your-azure-api-key",
    use_responses_api=True,
)

response = llm.invoke("Summarize the bitter lesson.")
print(response.text)
```

</Tab>
</Tabs>
For a walkthrough of reasoning effort, reasoning summaries, and streaming with the Responses API, see the Azure ChatOpenAI integration page.
Azure AI Foundry is the broader Azure AI platform. The `langchain-azure-ai` package lets you bring Azure-native tools, storage, and custom middleware into your LangChain app, and exposes chat models deployed in Foundry through the `AzureAIOpenAIApiChatModel` class.

```bash pip
pip install -U langchain-azure-ai
```
```bash uv
uv add langchain-azure-ai
```
See a usage example.
```bash pip
pip install -U langchain-community
```
```bash uv
uv add langchain-community
```
See the Azure ML chat endpoint documentation for accessing chat models hosted with Azure Machine Learning.
Microsoft offers two main options for accessing LLMs through Azure:
- **Azure OpenAI** (recommended) — Access any model deployed in Microsoft Foundry (including OpenAI, Llama, DeepSeek, Mistral, and Phi) as a completion LLM with @[AzureOpenAI].
- **Azure ML** — Use custom or open-source models hosted on Azure Machine Learning online endpoints.
See a usage example.
```bash pip
pip install -U langchain-openai
```
```bash uv
uv add langchain-openai
```
<Tabs>
<Tab title="Microsoft Entra ID">

```python
from azure.identity import DefaultAzureCredential, get_bearer_token_provider
from langchain_openai import AzureOpenAI

token_provider = get_bearer_token_provider(
    DefaultAzureCredential(),
    "https://cognitiveservices.azure.com/.default",
)

llm = AzureOpenAI(
    azure_deployment="gpt-5.4-mini",  # your Azure deployment name
    api_version="2025-04-01-preview",
    azure_ad_token_provider=token_provider,
)

print(llm.invoke("Write a haiku about the ocean."))
```

</Tab>
<Tab title="API key">

```python
from langchain_openai import AzureOpenAI

llm = AzureOpenAI(
    azure_deployment="gpt-5.4-mini",  # your Azure deployment name
    api_version="2025-04-01-preview",
    azure_endpoint="https://YOUR-RESOURCE-NAME.openai.azure.com/",
    api_key="your-azure-api-key",
)

print(llm.invoke("Write a haiku about the ocean."))
```

</Tab>
</Tabs>
```bash uv
uv add langchain-community
```
See a usage example.
Microsoft offers two main options for accessing embedding models through Azure:
- **Azure OpenAI** (recommended) — Use embedding models deployed in Microsoft Foundry (including OpenAI `text-embedding-3-small`, `text-embedding-3-large`, and Cohere) with @[AzureOpenAIEmbeddings].
- **Azure AI** — Recommended for accessing tools, storage, and custom middleware from the broader Azure ecosystem alongside your embedding model.
See a usage example.
```bash pip
pip install -U langchain-openai
```
```bash uv
uv add langchain-openai
```
<Tabs>
<Tab title="Microsoft Entra ID">

```python
from azure.identity import DefaultAzureCredential, get_bearer_token_provider
from langchain_openai import AzureOpenAIEmbeddings

token_provider = get_bearer_token_provider(
    DefaultAzureCredential(),
    "https://cognitiveservices.azure.com/.default",
)

embeddings = AzureOpenAIEmbeddings(
    azure_deployment="text-embedding-3-small",  # your Azure deployment name
    api_version="2025-04-01-preview",
    azure_ad_token_provider=token_provider,
)

vector = embeddings.embed_query("LangChain makes agents easy.")
```

</Tab>
<Tab title="API key">

```python
from langchain_openai import AzureOpenAIEmbeddings

embeddings = AzureOpenAIEmbeddings(
    azure_deployment="text-embedding-3-small",  # your Azure deployment name
    api_version="2025-04-01-preview",
    azure_endpoint="https://YOUR-RESOURCE-NAME.openai.azure.com/",
    api_key="your-azure-api-key",
)

vector = embeddings.embed_query("LangChain makes agents easy.")
```

</Tab>
</Tabs>
```bash uv
uv add langchain-azure-ai
```
See a usage example.
Azure AI Content Safety provides guardrails you can apply to LangChain agents through middleware. The `langchain-azure-ai` package currently exports middleware for text moderation, image moderation, prompt injection detection, protected material detection, and groundedness evaluation.
Install the middleware package:
```bash pip
pip install -U langchain-azure-ai
```
```bash uv
uv add langchain-azure-ai
```
See the Microsoft Foundry middleware guide.
```python
from langchain_azure_ai.agents.middleware import AzureContentModerationMiddleware
```

Azure AI Foundry (formerly `Azure AI Studio`) provides the capability to upload data assets to cloud storage and register existing data assets from the following sources:

- `Microsoft OneLake`
- `Azure Blob Storage`
- `Azure Data Lake gen 2`
First, you need to install several python packages.
```bash pip
pip install azureml-fsspec azure-ai-generative
```
```bash uv
uv add azureml-fsspec azure-ai-generative
```
See a usage example.
```python
from langchain.document_loaders import AzureAIDataLoader
```

Azure AI Document Intelligence (formerly known as `Azure Form Recognizer`) is a machine-learning based service that extracts text (including handwriting), tables, document structures, and key-value pairs from digital or scanned PDFs, images, Office and HTML files.

Document Intelligence supports `JPEG/JPG`, `PNG`, `BMP`, `TIFF`, `HEIF`, `DOCX`, `XLSX`, `PPTX`, and `HTML`.
First, you need to install a python package.
```bash pip
pip install azure-ai-documentintelligence
```
```bash uv
uv add azure-ai-documentintelligence
```
See a usage example.
```python
from langchain.document_loaders import AzureAIDocumentIntelligenceLoader
```

Azure Blob Storage is Microsoft's object storage solution for the cloud. Blob Storage is optimized for storing massive amounts of unstructured data. Unstructured data is data that doesn't adhere to a particular data model or definition, such as text or binary data.
Azure Blob Storage is designed for:
- Serving images or documents directly to a browser.
- Storing files for distributed access.
- Streaming video and audio.
- Writing to log files.
- Storing data for backup and restore, disaster recovery, and archiving.
- Storing data for analysis by an on-premises or Azure-hosted service.
```bash uv
uv add langchain-azure-storage
```
See usage examples for the Azure Blob Storage Loader.
```python
from langchain_azure_storage.document_loaders import AzureBlobStorageLoader
```

Microsoft OneDrive (formerly `SkyDrive`) is a file-hosting service operated by Microsoft.
First, you need to install a python package.
```bash pip
pip install o365
```
```bash uv
uv add o365
```
See a usage example.
```python
from langchain_community.document_loaders import OneDriveLoader
```

Microsoft OneDrive (formerly `SkyDrive`) is a file-hosting service operated by Microsoft.
First, you need to install a python package.
```bash pip
pip install o365
```
```bash uv
uv add o365
```
```python
from langchain_community.document_loaders import OneDriveFileLoader
```

Microsoft Word is a word processor developed by Microsoft.
See a usage example.
```python
from langchain_community.document_loaders import UnstructuredWordDocumentLoader
```

Microsoft Excel is a spreadsheet editor developed by Microsoft for Windows, macOS, Android, iOS and iPadOS. It features calculation or computation capabilities, graphing tools, pivot tables, and a macro programming language called Visual Basic for Applications (VBA). Excel forms part of the Microsoft 365 suite of software.
The `UnstructuredExcelLoader` is used to load Microsoft Excel files. The loader works with both `.xlsx` and `.xls` files. The page content will be the raw text of the Excel file. If you use the loader in `"elements"` mode, an HTML representation of the Excel file will be available in the document metadata under the `text_as_html` key.
See a usage example.
```python
from langchain_community.document_loaders import UnstructuredExcelLoader
```

Microsoft SharePoint is a website-based collaboration system, developed by Microsoft, that uses workflow applications, "list" databases, and other web parts and security features to empower business teams to work together.
See a usage example.
```python
from langchain_community.document_loaders.sharepoint import SharePointLoader
```

Microsoft PowerPoint is a presentation program by Microsoft.
See a usage example.
```python
from langchain_community.document_loaders import UnstructuredPowerPointLoader
```

First, let's install dependencies:
```bash pip
pip install bs4 msal
```
```bash uv
uv add bs4 msal
```
See a usage example.
```python
from langchain_community.document_loaders.onenote import OneNoteLoader
```

Playwright is an open-source automation tool developed by Microsoft that allows you to programmatically control and automate web browsers. It is designed for end-to-end testing, scraping, and automating tasks across various web browsers such as `Chromium`, `Firefox`, and `WebKit`.
First, let's install dependencies:
```bash pip
pip install playwright unstructured
```
```bash uv
uv add playwright unstructured
```
See a usage example.
```python
from langchain_community.document_loaders import PlaywrightURLLoader
```

Azure Cosmos DB provides chat message history storage for conversational AI applications, enabling you to persist and retrieve conversation history with low latency and high availability.

```bash pip
pip install langchain-azure-ai
```
```bash uv
uv add langchain-azure-ai
```
Configure your Azure Cosmos DB connection:
```python
from langchain_azure_ai.chat_message_histories import CosmosDBChatMessageHistory

history = CosmosDBChatMessageHistory(
    cosmos_endpoint="https://<your-account>.documents.azure.com:443/",
    cosmos_database="<your-database>",
    cosmos_container="<your-container>",
    session_id="<session-id>",
    user_id="<user-id>",
    credential="<your-credential>",  # or use connection_string
)
```

AI agents can rely on Azure Cosmos DB as a unified memory system solution, enjoying speed, scale, and simplicity. This service successfully enabled OpenAI's ChatGPT service to scale dynamically with high reliability and low maintenance. Powered by an atom-record-sequence engine, it is the world's first globally distributed NoSQL, relational, and vector database service that offers a serverless mode.
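The pattern behind session-scoped chat history is simple: messages are appended under a `(user_id, session_id)` key and read back in insertion order. A plain-Python sketch of that contract (no Azure SDK involved; `InMemoryChatHistory` is illustrative, not a LangChain class):

```python
from collections import defaultdict

class InMemoryChatHistory:
    """Toy stand-in for a persistent chat history store, keyed per user/session."""

    def __init__(self):
        self._store = defaultdict(list)  # (user_id, session_id) -> list of messages

    def add_message(self, user_id, session_id, role, content):
        self._store[(user_id, session_id)].append({"role": role, "content": content})

    def messages(self, user_id, session_id):
        # Return messages in insertion order for this session only.
        return list(self._store[(user_id, session_id)])

history = InMemoryChatHistory()
history.add_message("u1", "s1", "human", "Hi!")
history.add_message("u1", "s1", "ai", "Hello, how can I help?")
history.add_message("u1", "s2", "human", "Different session")
```

A real store like Cosmos DB adds durability, replication, and latency guarantees on top of this same interface.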
Below are two available Azure Cosmos DB APIs that can provide vector store functionalities.
Azure Cosmos DB for MongoDB vCore makes it easy to create a database with full native MongoDB support. You can apply your MongoDB experience and continue to use your favorite MongoDB drivers, SDKs, and tools by pointing your application to the API for MongoDB vCore account's connection string. Use vector search in Azure Cosmos DB for MongoDB vCore to seamlessly integrate your AI-based applications with your data that's stored in Azure Cosmos DB.
See detailed configuration instructions.
We need to install the `langchain-azure-ai` and `pymongo` Python packages.
```bash uv
uv add langchain-azure-ai pymongo
```
Azure Cosmos DB for MongoDB vCore provides developers with a fully managed MongoDB-compatible database service for building modern applications with a familiar architecture.
With Cosmos DB for MongoDB vCore, developers can enjoy the benefits of native Azure integrations, low total cost of ownership (TCO), and the familiar vCore architecture when migrating existing applications or building new ones.
Sign Up for free to get started today.
See a usage example.
```python
from langchain_azure_ai.vectorstores import AzureCosmosDBMongoVCoreVectorSearch
```

Azure Cosmos DB for NoSQL now offers vector indexing and search in preview. This feature is designed to handle high-dimensional vectors, enabling efficient and accurate vector search at any scale. You can now store vectors directly in the documents alongside your data. This means that each document in your database can contain not only traditional schema-free data, but also high-dimensional vectors as other properties of the documents. This colocation of data and vectors allows for efficient indexing and searching, as the vectors are stored in the same logical unit as the data they represent. This simplifies data management and AI application architectures, and improves the efficiency of vector-based operations.
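To make the colocation idea concrete, here is a plain-Python sketch (no Azure SDK, tiny 2-dimensional vectors) of documents carrying their vectors as ordinary properties and being ranked by cosine similarity — the pattern Cosmos DB for NoSQL implements natively with real indexing at scale:

```python
import math

# Each document holds schema-free data plus its vector, side by side.
docs = [
    {"id": "1", "text": "LangChain agents", "vector": [1.0, 0.0]},
    {"id": "2", "text": "Azure storage", "vector": [0.0, 1.0]},
    {"id": "3", "text": "Agent frameworks", "vector": [0.9, 0.1]},
]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def vector_search(query_vector, docs, k=2):
    # Brute-force top-k by cosine similarity; a real store uses an index.
    return sorted(docs, key=lambda d: cosine(query_vector, d["vector"]), reverse=True)[:k]

top = vector_search([1.0, 0.1], docs, k=2)
```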
See detailed configuration instructions.
We need to install the `langchain-azure-ai` and `azure-cosmos` Python packages.
```bash uv
uv add langchain-azure-ai azure-cosmos
```
Azure Cosmos DB offers a solution for modern apps and intelligent workloads by being very responsive with dynamic and elastic autoscale. It is available in every Azure region and can automatically replicate data closer to users. It has SLA guaranteed low-latency and high availability.
Sign Up for free to get started today.
See a usage example.
```python
from langchain_azure_ai.vectorstores import AzureCosmosDBNoSqlVectorSearch
```

Azure Database for PostgreSQL - Flexible Server is a relational database service based on the open-source Postgres database engine. It's a fully managed database-as-a-service that can handle mission-critical workloads with predictable performance, security, high availability, and dynamic scalability.
See set up instructions for Azure Database for PostgreSQL.
Simply use the connection string from your Azure Portal.
Since Azure Database for PostgreSQL is open-source Postgres, you can use LangChain's Postgres support to connect to Azure Database for PostgreSQL.
Azure SQL Database is a robust service that combines scalability, security, and high availability, providing all the benefits of a modern database solution. It also provides a dedicated Vector data type & built-in functions that simplifies the storage and querying of vector embeddings directly within a relational database. This eliminates the need for separate vector databases and related integrations, increasing the security of your solutions while reducing the overall complexity.
By leveraging your current SQL Server databases for vector search, you can enhance data capabilities while minimizing expenses and avoiding the challenges of transitioning to new systems.
See detailed configuration instructions.
We need to install the `langchain-sqlserver` Python package.

```bash pip
pip install langchain-sqlserver==0.1.1
```

Sign Up for free to get started today.
See a usage example.
```python
from langchain_sqlserver import SQLServer_VectorStore
```

Azure AI Search is a cloud search service that gives developers infrastructure, APIs, and tools for information retrieval of vector, keyword, and hybrid queries at scale. See the Azure AI Search usage examples.

```python
from langchain_community.vectorstores.azuresearch import AzureSearch
```

Azure AI Search (formerly known as `Azure Search` or `Azure Cognitive Search`) is a cloud search service that gives developers infrastructure, APIs, and tools for building a rich search experience over private, heterogeneous content in web, mobile, and enterprise applications.
Search is foundational to any app that surfaces text to users, where common scenarios include catalog or document search, online retail apps, or data exploration over proprietary content. When you create a search service, you'll work with the following capabilities:
- A search engine for full text search over a search index containing user-owned content
- Rich indexing, with lexical analysis and optional AI enrichment for content extraction and transformation
- Rich query syntax for text search, fuzzy search, autocomplete, geo-search and more
- Programmability through REST APIs and client libraries in Azure SDKs
- Azure integration at the data layer, machine learning layer, and AI (AI Services)
See set up instructions.
See a usage example.
```python
from langchain_community.retrievers import AzureAISearchRetriever
```

Azure Database for PostgreSQL - Flexible Server is a relational database service based on the open-source Postgres database engine. It's a fully managed database-as-a-service that can handle mission-critical workloads with predictable performance, security, high availability, and dynamic scalability.
See set up instructions for Azure Database for PostgreSQL.
You need to enable the `pgvector` extension in your database to use Postgres as a vector store. Once you have the extension enabled, you can use `PGVector` in LangChain to connect to Azure Database for PostgreSQL.
See a usage example. Simply use the connection string from your Azure Portal.
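As a hedged sketch — the host, credentials, database, and embedding deployment below are placeholders, and the exact keyword arguments should be checked against the `langchain-postgres` package you install — wiring `PGVector` to an Azure connection string might look like:

```python
# Sketch only: placeholder values throughout; verify parameter names against
# the langchain-postgres version you are using.
from langchain_openai import AzureOpenAIEmbeddings
from langchain_postgres import PGVector

store = PGVector(
    embeddings=AzureOpenAIEmbeddings(azure_deployment="text-embedding-3-small"),
    collection_name="my_docs",
    # Connection string from the Azure Portal, in SQLAlchemy form:
    connection="postgresql+psycopg://<user>:<password>@<server>.postgres.database.azure.com:5432/<db>",
)
```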
We need to get the `POOL_MANAGEMENT_ENDPOINT` environment variable from the Azure Container Apps service.
See the Azure dynamic sessions setup instructions.
We need to install a python package.
```bash pip
pip install langchain-azure-dynamic-sessions
```
```bash uv
uv add langchain-azure-dynamic-sessions
```
See a usage example.
```python
from langchain_azure_dynamic_sessions import SessionsPythonREPLTool
```

Follow the Bing search tool documentation for detailed explanations and instructions for this tool.

The environment variables `BING_SUBSCRIPTION_KEY` and `BING_SEARCH_URL` are required from the Bing Search resource.
```python
from langchain_community.tools.bing_search import BingSearchResults
from langchain_community.utilities import BingSearchAPIWrapper

api_wrapper = BingSearchAPIWrapper()
tool = BingSearchResults(api_wrapper=api_wrapper)
```

We need to install several python packages.
```bash pip
pip install azure-ai-formrecognizer azure-cognitiveservices-speech azure-ai-vision-imageanalysis
```
```bash uv
uv add azure-ai-formrecognizer azure-cognitiveservices-speech azure-ai-vision-imageanalysis
```
See a usage example.
```python
from langchain_community.agent_toolkits import azure_ai_services
```

The `azure_ai_services` toolkit includes the following tools:

- Image Analysis: `AzureAiServicesImageAnalysisTool`
- Document Intelligence: `AzureAiServicesDocumentIntelligenceTool`
- Speech to Text: `AzureAiServicesSpeechToTextTool`
- Text to Speech: `AzureAiServicesTextToSpeechTool`
- Text Analytics for Health: `AzureAiServicesTextAnalyticsForHealthTool`
We need to install several python packages.
```bash pip
pip install azure-ai-formrecognizer azure-cognitiveservices-speech azure-ai-vision-imageanalysis
```
```bash uv
uv add azure-ai-formrecognizer azure-cognitiveservices-speech azure-ai-vision-imageanalysis
```
See a usage example.
```python
from langchain_community.agent_toolkits import AzureCognitiveServicesToolkit
```

The toolkit includes the following tools that query the Azure Cognitive Services:

- `AzureCogsFormRecognizerTool`: Form Recognizer API
- `AzureCogsImageAnalysisTool`: Image Analysis API
- `AzureCogsSpeech2TextTool`: Speech2Text API
- `AzureCogsText2SpeechTool`: Text2Speech API
- `AzureCogsTextAnalyticsHealthTool`: Text Analytics for Health API
```python
from langchain_community.tools.azure_cognitive_services import (
    AzureCogsFormRecognizerTool,
    AzureCogsImageAnalysisTool,
    AzureCogsSpeech2TextTool,
    AzureCogsText2SpeechTool,
    AzureCogsTextAnalyticsHealthTool,
)
```

We need to install the `O365` Python package.
```bash uv
uv add O365
```
See a usage example.
```python
from langchain_community.agent_toolkits import O365Toolkit
```

You can use individual tools from the Office 365 Toolkit:

- `O365CreateDraftMessage`: creating a draft email in Office 365
- `O365SearchEmails`: searching email messages in Office 365
- `O365SearchEvents`: searching calendar events in Office 365
- `O365SendEvent`: sending calendar events in Office 365
- `O365SendMessage`: sending an email in Office 365
```python
from langchain_community.tools.office365 import O365CreateDraftMessage
from langchain_community.tools.office365 import O365SearchEmails
from langchain_community.tools.office365 import O365SearchEvents
from langchain_community.tools.office365 import O365SendEvent
from langchain_community.tools.office365 import O365SendMessage
```

We need to install the `azure-identity` Python package.
```bash uv
uv add azure-identity
```
See a usage example.
```python
from langchain_community.agent_toolkits import PowerBIToolkit
from langchain_community.utilities.powerbi import PowerBIDataset
```

You can use individual tools from the Azure PowerBI Toolkit:

- `InfoPowerBITool`: getting metadata about a PowerBI Dataset
- `ListPowerBITool`: getting table names
- `QueryPowerBITool`: querying a PowerBI Dataset
```python
from langchain_community.tools.powerbi.tool import InfoPowerBITool
from langchain_community.tools.powerbi.tool import ListPowerBITool
from langchain_community.tools.powerbi.tool import QueryPowerBITool
```

Playwright is an open-source automation tool developed by Microsoft that allows you to programmatically control and automate web browsers. It is designed for end-to-end testing, scraping, and automating tasks across various web browsers such as `Chromium`, `Firefox`, and `WebKit`.
We need to install several python packages.
```bash pip
pip install playwright lxml
```
```bash uv
uv add playwright lxml
```
See a usage example.
```python
from langchain_community.agent_toolkits import PlayWrightBrowserToolkit
```

You can use individual tools from the PlayWright Browser Toolkit.
```python
from langchain_community.tools.playwright import ClickTool
from langchain_community.tools.playwright import CurrentWebPageTool
from langchain_community.tools.playwright import ExtractHyperlinksTool
from langchain_community.tools.playwright import ExtractTextTool
from langchain_community.tools.playwright import GetElementsTool
from langchain_community.tools.playwright import NavigateTool
from langchain_community.tools.playwright import NavigateBackTool
```

We need to install a Python package.
```bash pip
pip install gremlinpython
```
```bash uv
uv add gremlinpython
```
See a usage example.
```python
from langchain_community.graphs import GremlinGraph
from langchain_community.graphs.graph_document import GraphDocument, Node, Relationship
```

Microsoft Bing, commonly referred to as `Bing` or `Bing Search`, is a web search engine owned and operated by Microsoft.
See a usage example.
```python
from langchain_community.utilities import BingSearchAPIWrapper
```

Presidio (origin from Latin praesidium 'protection, garrison') helps to ensure sensitive data is properly managed and governed. It provides fast identification and anonymization modules for private entities in text and images such as credit card numbers, names, locations, social security numbers, bitcoin wallets, US phone numbers, financial data and more.
First, you need to install several Python packages and download a spaCy model.
```bash uv
uv add langchain-experimental openai presidio-analyzer presidio-anonymizer spacy Faker
python -m spacy download en_core_web_lg
```
See usage examples.
```python
from langchain_experimental.data_anonymizer import PresidioAnonymizer, PresidioReversibleAnonymizer
```