[BUG] Structured Output throws error from Server #50246

Open
@bharathm03

Description

Library name and version

Azure.AI.Inference

Describe the bug

When I set a custom JSON response format for the model output, the server throws an error saying the value is not supported. I tried it with DeepSeek and Llama 4 models; neither worked.

The same code works with the Azure.AI.OpenAI package.
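
For comparison, here is a minimal sketch of the equivalent Azure.AI.OpenAI (2.x) call that succeeds for me; the endpoint, key, and deployment name are placeholders for my actual values, and the exact overloads may differ between package versions:

```csharp
// Working comparison sketch using Azure.AI.OpenAI 2.x.
// Endpoint, key, and deployment name are placeholders.
using System;
using Azure;
using Azure.AI.OpenAI;
using OpenAI.Chat;

var azureClient = new AzureOpenAIClient(
    new Uri("https://<your-resource>.openai.azure.com/"), // placeholder
    new AzureKeyCredential("<your-api-key>"));            // placeholder

ChatClient chatClient = azureClient.GetChatClient("gpt-4o"); // placeholder deployment

var options = new ChatCompletionOptions
{
    // The same kind of JSON-schema response format succeeds here:
    ResponseFormat = ChatResponseFormat.CreateJsonSchemaFormat(
        "AnswerSchema",
        BinaryData.FromString("""
        {
          "type": "object",
          "properties": { "answer": { "type": "string" } },
          "required": ["answer"],
          "additionalProperties": false
        }
        """),
        jsonSchemaIsStrict: true)
};

ChatCompletion completion = await chatClient.CompleteChatAsync(
    new ChatMessage[] { new UserChatMessage("What is 2 + 2?") }, options);
Console.WriteLine(completion.Content[0].Text);
```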

Expected behavior

The model should return structured output that conforms to the provided JSON schema.

Actual behavior

The server throws an error saying the value is not supported.

Reproduction Steps

Call a DeepSeek (or Llama 4) model through Azure.AI.Inference with ResponseFormat set to a JSON schema, as in the sketch below.
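
A minimal repro sketch (top-level console app). The endpoint, key, schema, and model name are placeholders; `ChatCompletionsResponseFormat.CreateJsonFormat` is the structured-output helper from the Azure.AI.Inference beta samples, and its exact overload may differ between package versions:

```csharp
// Minimal repro sketch; endpoint, key, and model name are placeholders.
using System;
using System.Collections.Generic;
using Azure;
using Azure.AI.Inference;

var client = new ChatCompletionsClient(
    new Uri("https://<your-resource>.services.ai.azure.com/models"), // placeholder
    new AzureKeyCredential("<your-api-key>"));                       // placeholder

// A trivial JSON schema for the structured output.
var schema = new Dictionary<string, BinaryData>
{
    ["type"] = BinaryData.FromObjectAsJson("object"),
    ["properties"] = BinaryData.FromObjectAsJson(
        new { answer = new { type = "string" } }),
    ["required"] = BinaryData.FromObjectAsJson(new[] { "answer" }),
    ["additionalProperties"] = BinaryData.FromObjectAsJson(false)
};

var options = new ChatCompletionsOptions
{
    Model = "DeepSeek-V3", // placeholder deployment name; Llama 4 fails the same way
    Messages = { new ChatRequestUserMessage("What is 2 + 2?") },
    // Setting a JSON-schema response format is what triggers the
    // "value is not supported" error from the server:
    ResponseFormat = ChatCompletionsResponseFormat.CreateJsonFormat("AnswerSchema", schema)
};

Response<ChatCompletions> response = await client.CompleteAsync(options);
Console.WriteLine(response.Value.Content);
```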

Environment

.NET 9

Metadata

Assignees

No one assigned

    Labels

    AI Model Inference
    Client: This issue points to a problem in the data-plane of the library.
    Service Attention: Workflow: this issue is the responsibility of the Azure service team.
    customer-reported: Issues that are reported by GitHub users external to the Azure organization.
    needs-team-attention: Workflow: this issue needs attention from the Azure service team or SDK team.
    question: The issue doesn't require a change to the product in order to be resolved. Most issues start as that.
