Automatically review pull requests in Azure DevOps using AI-powered code analysis. This Azure Function integrates with Azure DevOps to provide intelligent code reviews, adding comments directly to your PRs and optionally creating improvement PRs with suggested fixes.
- When a PR is created or updated in Azure DevOps, a webhook event is triggered
- The Azure Function receives the webhook payload and validates it
- The function checks if the PR is eligible for review (not a draft, not AI-generated)
- The function loads the review guidelines from the specified source
- The function initializes the selected AI model based on the MODEL_TYPE setting
- For each changed file in the PR:
- The function retrieves the old and new content
- The AI model analyzes the changes and generates comments
- Comments are added to the PR at specific line numbers
- If corrections are available and CREATE_NEW_PR is true:
- A new branch is created based on the source branch
- The corrected files are committed to the new branch
- A new PR is created with the AI-suggested improvements
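The eligibility check in the flow above (skip drafts and PRs the tool itself generated) can be sketched as a small pure function. Note that the field names and the `[AI]` title marker below are illustrative assumptions, not the project's actual schema:

```javascript
// Hypothetical eligibility check: skip draft PRs and PRs that were
// generated by the AI reviewer itself (marker convention is assumed).
function isEligibleForReview(pr) {
  if (pr.isDraft) return false;                  // don't review drafts
  if (pr.title && pr.title.startsWith("[AI]")) { // assumed marker for AI-created PRs
    return false;
  }
  return true;
}

console.log(isEligibleForReview({ isDraft: true, title: "Fix bug" }));    // false
console.log(isEligibleForReview({ isDraft: false, title: "[AI] Fixes" })); // false
console.log(isEligibleForReview({ isDraft: false, title: "Fix bug" }));    // true
```

Filtering early like this keeps the function from burning AI tokens on drafts or from reviewing its own improvement PRs in a loop.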
- Automatic triggering on Azure DevOps PR webhook events
- AI-powered code analysis (supports multiple AI models)
- Contextual comments added directly to PR lines
- Optional creation of improvement PRs with AI-suggested fixes
- Customizable review guidelines
- Support for multiple files in a single PR
- Configurable AI model selection via environment variables
- Azure account with Function App creation permissions
- Azure DevOps project with admin access
- At least one AI API key (Azure OpenAI, OpenAI, or Google Gemini)
- Code review guidelines document
- Azure Portal > Resource Groups > Create
- Sign in with the Azure CLI:
az login
- Create the resource group:
az group create --name <RESOURCE_GROUP_NAME> --location <REGION>
- Use Node.js runtime stack
az functionapp create --resource-group <RESOURCE_GROUP_NAME> --consumption-plan-location <REGION> --runtime node --functions-version 4 --name <APP_NAME> --storage-account <STORAGE_NAME>
- Open the project in VS Code
- Install the Azure Functions extension
- Sign in to your Azure account
- Press Ctrl+Shift+P and run "Azure Functions: Deploy to Function App" (or right-click the project and select "Deploy to Function App")
- Select your target Function App
cd azure-function-pr-review
npm install
func azure functionapp publish <YOUR_FUNCTION_APP_NAME>
If you want to deploy a different model in Azure:
- Go to AI Foundry
- Create an Azure AI Hub (make sure the region you select supports the model you plan to deploy; for example, as of February 2025 the DeepSeek R1 model is available only in the East US, East US 2, West US, West US 3, South Central US, and North Central US regions)
- Go to the resource and launch Azure AI Foundry
- Create a project
- Deploy the model you want
In the Azure Portal, navigate to your Function App > Configuration and add the following application settings:
Environment Variable | Description |
---|---|
AZURE_PAT | Personal Access Token for Azure DevOps with Code (Read & Write) permissions |
AZURE_REPO | Default repository name (optional if provided in webhook) |
INSTRUCTION_SOURCE | Path or URL to review guidelines file |
CREATE_NEW_PR | Set to "true" to create new PRs with AI suggestions |
MODEL_TYPE | AI model to use: "azure", "openai", "gemini", "deepseek", "anthropic", or "groq" |
GEMINI_API_KEY | Google Gemini API key (required if MODEL_TYPE is "gemini") |
OPENAI_API_KEY | OpenAI API key (required if MODEL_TYPE is "openai") |
AZURE_OPENAI_API_KEY | Azure OpenAI API key (required if MODEL_TYPE is "azure") |
AZURE_OPENAI_API_INSTANCE_NAME | Azure OpenAI instance name (required if MODEL_TYPE is "azure") |
AZURE_OPENAI_API_DEPLOYMENT_NAME | Azure OpenAI deployment name (required if MODEL_TYPE is "azure") |
AZURE_OPENAI_API_VERSION | Azure OpenAI API version (defaults to "2023-12-01-preview") |
DEEPSEEK_API_KEY | DeepSeek API key (required if MODEL_TYPE is "deepseek" or "deepseek-r1") |
ANTHROPIC_API_KEY | Anthropic API key (required if MODEL_TYPE is "anthropic" or "claude") |
GROQ_API_KEY | Groq API key (required if MODEL_TYPE is "groq") |
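Because each MODEL_TYPE needs its own keys, it can help to validate the configuration at startup rather than failing mid-review. A minimal sketch (the `missingVars` helper is hypothetical; the variable names come from the table above):

```javascript
// Map each MODEL_TYPE to the environment variables it requires
// (names taken from the configuration table; helper itself is a sketch).
const REQUIRED_VARS = {
  azure: [
    "AZURE_OPENAI_API_KEY",
    "AZURE_OPENAI_API_INSTANCE_NAME",
    "AZURE_OPENAI_API_DEPLOYMENT_NAME",
  ],
  openai: ["OPENAI_API_KEY"],
  gemini: ["GEMINI_API_KEY"],
  deepseek: ["DEEPSEEK_API_KEY"],
  anthropic: ["ANTHROPIC_API_KEY"],
  groq: ["GROQ_API_KEY"],
};

// Return the names of any required variables that are missing from env
function missingVars(env) {
  const required = REQUIRED_VARS[env.MODEL_TYPE] || [];
  return required.filter((name) => !env[name]);
}

console.log(missingVars({ MODEL_TYPE: "openai" }));                        // ["OPENAI_API_KEY"]
console.log(missingVars({ MODEL_TYPE: "gemini", GEMINI_API_KEY: "x" }));   // []
```

Calling something like `missingVars(process.env)` on function startup surfaces configuration mistakes immediately instead of during the first webhook.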
- Go to Project Settings > Service Hooks
- Create new webhook with:
- Trigger: Pull request created
- URL: the function's invocation URL, including its function key
- Using VS Code: press Ctrl+Shift+P and run "Azure Functions: Copy Function URL"
- Using the CLI: after copying the URL, replace the trailing "==" of the function key with "%3D%3D" before pasting it into the webhook (the key must be URL-encoded)
FOR /F "delims=" %a IN ('az functionapp function show --resource-group <RESOURCE_GROUP> --name <FUNCTION_APP> --function-name PRReviewTrigger --query invokeUrlTemplate -o tsv') DO SET "URL=%a"
FOR /F "delims=" %b IN ('az functionapp function keys list --resource-group <RESOURCE_GROUP> --name <FUNCTION_APP> --function-name PRReviewTrigger --query default -o tsv') DO SET "KEY=%b"
ECHO %URL%?code=%KEY%
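The "==" replacement exists because "=" is a reserved character in URL query strings and must be percent-encoded as "%3D". Any standard URL-encoding routine produces the same result, for example in Node.js (the key below is illustrative, not a real one):

```javascript
// "=" percent-encodes to "%3D", so a function key ending in "=="
// ends in "%3D%3D" once URL-encoded.
const key = "abc123=="; // illustrative function key
const encoded = encodeURIComponent(key);
console.log(encoded); // "abc123%3D%3D"
```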
You can configure multiple API keys in your environment variables and switch between models by changing only the MODEL_TYPE
setting. This allows you to:
- Easily switch between different AI providers
- Optimize for cost by selecting the most economical option
- Choose the model that performs best for your specific codebase
- Have fallback options if one service is unavailable
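The fallback idea can be sketched as trying providers in order until one succeeds. Everything here is illustrative: `reviewWithFallback` and `callModel` are hypothetical names, not functions in this project:

```javascript
// Hypothetical fallback loop: try each provider in order until one
// returns a review; callModel is injected so it can be stubbed in tests.
async function reviewWithFallback(providers, callModel, prompt) {
  for (const provider of providers) {
    try {
      return await callModel(provider, prompt);
    } catch (err) {
      // Log the failure and try the next provider in the list
      console.error(`${provider} failed: ${err.message}`);
    }
  }
  throw new Error("All providers failed");
}

// Usage with a stubbed callModel: "openai" fails, so "gemini" answers.
const stubCall = async (provider, prompt) => {
  if (provider === "openai") throw new Error("unavailable");
  return `${provider}: reviewed`;
};
reviewWithFallback(["openai", "gemini"], stubCall, "diff").then(console.log); // "gemini: reviewed"
```

In the real function the provider list could be derived from which API keys are configured, so an outage in one service degrades gracefully instead of failing the review.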
Below is an AI model comparison table as of April 2025.
- All prices are in USD and current as of April 2025
- "Cached" refers to cached input pricing, which is typically lower than standard input pricing
- Many providers offer batch processing with discounts (typically 25-50%)
- Context length refers to the maximum number of tokens the model can process in a single request
- Parameter sizes are not explicitly stated for many models, especially the newest ones
- Deployment options (global, regional) may affect pricing, especially for Azure OpenAI
- Some models offer time-based discounts (e.g., DeepSeek's off-peak pricing)
- Actual performance may vary based on specific use cases and implementation
- Google Gemini: cloud.google.com/vertex-ai/generative-ai/pricing
- OpenAI models: openai.com/api/pricing
- OpenAI o3-mini: apidog.com/blog/openai-o3-mini-api-pricing
- GPT-4.5: apidog.com/blog/gpt-4-5-api-price
- GPT-4o: dev.to/foxinfotech/how-much-does-gpt-4o-cost-per-month-4l17
- Azure OpenAI: azure.microsoft.com/en-us/pricing/details/cognitive-services/openai-service
- Claude models: anthropic.com/claude/sonnet
- Claude Haiku pricing: techcrunch.com/2024/11/04/anthropic-hikes-the-price-of-its-haiku-model
- DeepSeek models: api-docs.deepseek.com/quick_start/pricing
- DeepSeek R1: deepseek-r1.com/pricing
- Groq pricing: groq.com/pricing
- GroqCloud models: groq.com/groqcloud-now-offers-qwen-2-5-32b-and-deepseek-r1-distill-qwen-32b