Library name and version
Azure.AI.Inference
Describe the bug
When I set a custom JSON response format for model output, the server throws an error that the value is not supported. I tried it with DeepSeek and Llama 4 models; neither worked.
The same code works with the Azure.AI.OpenAI package.
Expected behavior
The model should return structured output based on the provided schema.
Actual behavior
The server throws an error that the value is not supported.
Reproduction Steps
Call a DeepSeek model with ResponseFormat set to a custom JSON schema format.
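A minimal repro sketch is below. The endpoint, key, model name, and schema are placeholders, and the API shapes (`ChatCompletionsResponseFormat.CreateJsonFormat`, `response.Value.Content`) follow the 1.0.0-beta versions of Azure.AI.Inference; exact signatures may differ by package version.

```csharp
using System;
using System.Collections.Generic;
using Azure;
using Azure.AI.Inference;

// Placeholder endpoint, key, and model name.
var client = new ChatCompletionsClient(
    new Uri("https://<your-resource>.services.ai.azure.com/models"),
    new AzureKeyCredential("<your-key>"));

var options = new ChatCompletionsOptions
{
    Model = "DeepSeek-V3",
    Messages =
    {
        new ChatRequestSystemMessage("Extract the event details as JSON."),
        new ChatRequestUserMessage("Team sync on Friday at 10am."),
    },
    // Custom JSON schema response format; this is the setting the
    // service rejects ("value is not supported") for DeepSeek/Llama models.
    ResponseFormat = ChatCompletionsResponseFormat.CreateJsonFormat(
        "event",
        new Dictionary<string, BinaryData>
        {
            ["type"] = BinaryData.FromString("\"object\""),
            ["properties"] = BinaryData.FromString(
                "{\"name\":{\"type\":\"string\"},\"day\":{\"type\":\"string\"}}"),
            ["required"] = BinaryData.FromString("[\"name\",\"day\"]"),
        }),
};

Response<ChatCompletions> response = client.Complete(options);
Console.WriteLine(response.Value.Content);
```

With Azure.AI.OpenAI, the equivalent `ChatResponseFormat.CreateJsonSchemaFormat(...)` call succeeds against the same kind of schema, which is why a structured-output response is expected here as well.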
Environment
.NET 9
Labels
- This issue points to a problem in the data-plane of the library.
- Workflow: This issue is the responsibility of the Azure service team.
- Issues that are reported by GitHub users external to the Azure organization.
- Workflow: This issue needs attention from the Azure service team or SDK team.
- The issue doesn't require a change to the product in order to be resolved. Most issues start as that.