Add Prompty support to AI Foundry Projects SDK #40106
base: feature/azure-ai-projects-beta8
Conversation
Pull Request Overview
This pull request adds Prompty support to the AI Foundry Projects SDK and updates related samples, telemetry instrumentation, and documentation to align with the new functionality.
- Introduces new samples demonstrating chat completions using both inline prompt strings and Prompty files (a usage sketch follows this list).
- Updates the PromptTemplate parsing logic and agent sample integrations.
- Bumps the SDK version from 1.0.0b7 to 1.0.0b8 and revises telemetry event handling.
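For orientation, here is a minimal sketch of the inline-prompt-string path, assuming the `PromptTemplate.from_string` / `create_messages` pattern used in the new sample; the placeholder names and keyword arguments below are illustrative, not a definitive API reference:

```python
from azure.ai.projects.prompts import PromptTemplate

# Build a template from an inline, mustache-style prompt string.
# The "rules" and "input" placeholder names are illustrative.
prompt = PromptTemplate.from_string(
    prompt_template="""
    system:
    You are a helpful hotel assistant. Answer using only these rules:
    {{rules}}

    user:
    {{input}}
    """
)

# Render the template into a list of chat messages; the keyword
# arguments fill the corresponding template placeholders.
messages = prompt.create_messages(
    rules="Breakfast is served from 7am to 10am.",
    input="When I arrive, can I still have breakfast?",
)

# `messages` can then be passed to a chat completions client, such as the
# azure-ai-inference ChatCompletionsClient used in the sample.
```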
Reviewed Changes
Copilot reviewed 15 out of 15 changed files in this pull request and generated 1 comment.
| File | Description |
|---|---|
| `sdk/ai/azure-ai-projects/samples/inference/sample_chat_completions_with_azure_ai_inference_client_and_prompt_string.py` | New sample using inline prompt strings for chat completions. |
| `sdk/ai/azure-ai-projects/samples/inference/sample_chat_completions_with_azure_ai_inference_client_and_prompty.py` | New sample using a Prompty file for chat completions. |
| `sdk/ai/azure-ai-projects/azure/ai/projects/prompts/_utils.py` | Added a utility function for processing multiline strings. |
| `sdk/ai/azure-ai-projects/azure/ai/projects/prompts/_patch.py` | Updated the PromptTemplate class to support Prompty files and inline strings. |
| `sdk/ai/azure-ai-projects/azure/ai/projects/prompts/__init__.py` | Module initialization with a prompty dependency check and patch execution. |
| `sdk/ai/azure-ai-projects/samples/agents/sample_agents_azure_ai_search.py` & `README.md` | Updated agent samples and documentation to include Prompty and reference enhancements. |
| `sdk/ai/azure-ai-projects/setup.py` | Added `extras_require` for Prompty-related dependencies. |
| `sdk/ai/azure-ai-projects/CHANGELOG.md` | Updated the changelog for the new release. |
| `sdk/ai/azure-ai-projects/azure/ai/projects/telemetry/agents/_ai_agents_instrumentor.py` | Revised telemetry instrumentation to improve type handling and return values. |
| `sdk/ai/azure-ai-projects/samples/agents/async_samples/sample_agents_basics_async_with_azure_monitor_tracing.py` | Reformatted sample code for asynchronous tracing integration. |
| `sdk/ai/azure-ai-projects/azure/ai/projects/_version.py` | SDK version bump to 1.0.0b8. |
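For the Prompty-file sample listed above, the flow might look like the sketch below; the extras name in the install comment, the `.prompty` file name, and the keyword arguments are assumptions made for illustration:

```python
# Install with the Prompty extras added in setup.py (extras name assumed):
#   pip install "azure-ai-projects[prompts]"
from azure.ai.projects.prompts import PromptTemplate

# Load a .prompty file (YAML front matter plus a templated prompt body).
prompt = PromptTemplate.from_prompty(file_path="hotel_assistant.prompty")  # hypothetical file

# Fill the placeholders declared in the .prompty file and get chat messages.
messages = prompt.create_messages(
    rules="Breakfast is served from 7am to 10am.",
    input="When I arrive, can I still have breakfast?",
)
```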
Comments suppressed due to low confidence (2)
sdk/ai/azure-ai-projects/samples/inference/sample_chat_completions_with_azure_ai_inference_client_and_prompt_string.py:59
- [nitpick] Using `input` as a variable name shadows the built-in function. Consider renaming it to something more descriptive, like `user_input`.
  `input = "When I arrived, can I still have breakfast?"`

sdk/ai/azure-ai-projects/samples/inference/sample_chat_completions_with_azure_ai_inference_client_and_prompty.py:43
- [nitpick] Using `input` as a variable name shadows the built-in function. Consider renaming it to `user_input` for clarity.
  `input = "When I arrived, can I still have breakfast?"`
API change check: APIView has identified API-level changes in this PR and created the following API reviews.
Description
Please add an informative description that covers the changes made by the pull request and link all relevant issues.
If an SDK is being regenerated based on a new swagger spec, a link to the pull request containing these swagger spec changes has been included above.
All SDK Contribution checklist:
- General Guidelines and Best Practices
- Testing Guidelines