fix foundry openai responses api c# sample code #643
Conversation
@qkfang : Thanks for your contribution! The author(s) and reviewer(s) have been notified to review your proposed change.
Learn Build status updates of commit 38ec1c8: ✅ Validation status: passed
For more details, please refer to the build report.
Learn Build status updates of commit 061c14d: ✅ Validation status: passed
For more details, please refer to the build report.
Can you review the proposed changes? IMPORTANT: When the changes are ready for publication, add the #label:"aq-pr-triaged" label.
Pull request overview
This PR attempts to fix compilation errors in the C# sample code for the OpenAI Responses API by updating it to work with OpenAI NuGet package version 2.8.0. The changes include adding package installation instructions, updating using statements, modifying the client initialization code, adding pragma warnings to suppress experimental API warnings, and including example usage code.
Key Changes:
- Updated package installation to explicitly include Azure.Identity and Azure.Core
- Modified ResponsesClient instantiation with new constructor parameters and authentication options
- Added #pragma warning directives to suppress OPENAI001 warnings about experimental APIs
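Based on the PR summary above, the updated package installation step presumably amounts to commands along these lines (package names are taken from the summary; the OpenAI version is the one cited in the PR description):

```shell
# Install the OpenAI SDK version the PR targets, plus the Azure auth packages
# the updated sample is said to require.
dotnet add package OpenAI --version 2.8.0
dotnet add package Azure.Identity
dotnet add package Azure.Core
```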
```csharp
var projectClient = new ResponsesClient(
    model: "<YOUR-DEPLOYMENT-NAME>", // e.g. gpt-5.2-chat, must be a model that supports Responses API
    authenticationPolicy: tokenPolicy, // if use EntraID
    // credential: new ApiKeyCredential("<YOUR-KEY>") // if use APIKEY
    clientOptions
);
```
Copilot AI reviewed on Jan 2, 2026:
The ResponsesClient constructor call has invalid syntax. The parameters are incorrectly mixed between named and positional arguments. The third parameter 'clientOptions' is not properly named and the second parameter 'authenticationPolicy' appears to be invalid for this constructor.
Based on the OpenAI SDK patterns shown in other parts of the codebase (e.g., articles/ai-foundry/foundry-models/how-to/generate-responses.md:98-100), the recommended approach is to use AIProjectClient with GetProjectResponsesClientForModel instead of directly instantiating ResponsesClient.
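The pattern Copilot recommends could be sketched as follows. This is a sketch only: `AIProjectClient` and `GetProjectResponsesClientForModel` are taken from the review comment, the endpoint placeholder is hypothetical, and the response calls follow the openai-dotnet example linked in the PR description; verify exact signatures against the Azure.AI.Projects and OpenAI packages.

```csharp
#pragma warning disable OPENAI001 // the Responses API is marked experimental by the OpenAI package
using Azure.AI.Projects; // assumed package exposing AIProjectClient
using Azure.Identity;

// Authenticate with Entra ID and resolve a Responses client through the
// project, instead of constructing ResponsesClient directly.
var projectClient = new AIProjectClient(
    new Uri("<YOUR-PROJECT-ENDPOINT>"), // hypothetical placeholder
    new DefaultAzureCredential());

var responsesClient =
    projectClient.GetProjectResponsesClientForModel("<YOUR-DEPLOYMENT-NAME>");

var response = responsesClient.CreateResponse("Say hello.");
Console.WriteLine(response.GetOutputText());
#pragma warning restore OPENAI001
```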
Copilot encountered an error and was unable to review this pull request. You can try again by re-requesting a review.
Learn Build status updates of commit 4af8a59: ✅ Validation status: passed
For more details, please refer to the build report.
Updated the description of the code snippet for clarity.
Learn Build status updates of commit 8370b0f: ✅ Validation status: passed
For more details, please refer to the build report.
Learn Build status updates of commit 3d8354e: ✅ Validation status: passed
For more details, please refer to the build report.
The provided C# sample code for the OpenAI Responses API does not work and has multiple compilation errors; see below.

I have updated the sample code based on the latest OpenAI NuGet package (2.8.0) and referenced the latest OpenAI repo documentation:
https://www.nuget.org/packages/OpenAI
https://github.com/openai/openai-dotnet/blob/main/examples/Responses/Example01_SimpleResponse.cs
- Added comments to highlight that the model must support the Responses API.
- Added a sample using an API key (OAuth is much harder to get working correctly).
- Disabled the warning from the OpenAI NuGet package saying "'OpenAI.Responses.ResponsesClient' is for evaluation purposes only and is subject to change or removal in future updates. Suppress this diagnostic to proceed."; it comes from the package itself.
The updated sample compiles correctly and was tested working with the Foundry gpt-5.2-chat model.
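Putting the description above together (API key auth, experimental-API suppression, a Responses-capable model), the updated sample presumably looks roughly like the following. The constructor parameter names and the options type are assumptions inferred from the diff and the linked openai-dotnet example, not verified against OpenAI 2.8.0:

```csharp
#pragma warning disable OPENAI001 // ResponsesClient is marked experimental by the package
using System.ClientModel; // ApiKeyCredential
using OpenAI;
using OpenAI.Responses;

var client = new ResponsesClient(
    model: "<YOUR-DEPLOYMENT-NAME>", // must be a model that supports the Responses API
    credential: new ApiKeyCredential("<YOUR-KEY>"), // API key auth (simpler than OAuth)
    options: new OpenAIClientOptions { Endpoint = new Uri("<YOUR-ENDPOINT>") }); // assumed options shape

var response = client.CreateResponse("Hello!");
Console.WriteLine(response.GetOutputText());
#pragma warning restore OPENAI001
```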