Status: Open

Description
Package: azure-ai-inference (version: 1.0.0b9)
Python version: 3.13.3
Operating System: Windows
Issue: When using the chat completions client to generate responses with `complete()`, the returned object does not include a `context` field with citations.
Code to reproduce:
```python
import logging

from azure.ai.projects import AIProjectClient
from azure.identity import DefaultAzureCredential as AzureIdentityDefaultCredential


def init_ai_foundry_client():
    try:
        # app_settings is our application's own configuration object
        project_conn_string = app_settings.azure_openai.project_conn_string
        ai_foundry_project = AIProjectClient.from_connection_string(
            conn_str=project_conn_string,
            credential=AzureIdentityDefaultCredential(),
        )
        return ai_foundry_project.inference.get_chat_completions_client()
    except Exception:
        # logging.exception already records the active traceback; passing the
        # exception as an extra positional argument would be misinterpreted
        # as a %-format argument.
        logging.exception("Exception in AI Foundry initialization")
        raise


ai_foundry_client = init_ai_foundry_client()

messages = [
    {
        "role": "system",
        "content": (
            "You are an AI assistant that helps people find information and "
            "generate content. Do not answer any questions unrelated to "
            "retrieved documents. If you can't answer questions from "
            "available data, always answer that you can't respond to the "
            "question with available data."
        ),
    }
]

response = ai_foundry_client.complete(
    model="gpt-4o",
    messages=messages,
    temperature=0,
    max_tokens=1000,
)
```
Expected: The response should include a `context` field with citations along with the generated content.
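To make the expectation concrete, here is a minimal, self-contained sketch of how our code reads citations out of a completion payload. The payload shape (`choices[0].message.context.citations`) is an assumption based on the Azure OpenAI "On Your Data" response format, and `extract_citations` is a hypothetical helper of ours, not part of the SDK; the point is that with azure-ai-inference 1.0.0b9 the `context` key is simply absent from the returned object.

```python
# Hypothetical raw payload when the service attaches retrieval citations
# (assumed shape, modeled on the Azure OpenAI "On Your Data" response format).
payload = {
    "choices": [
        {
            "message": {
                "content": "Answer text [doc1].",
                "context": {
                    "citations": [
                        {"title": "Doc 1", "url": "https://example.com/doc1"},
                    ]
                },
            }
        }
    ]
}


def extract_citations(payload):
    """Return the citations list from the first choice, or [] if absent."""
    message = payload.get("choices", [{}])[0].get("message", {})
    return message.get("context", {}).get("citations", [])


print(extract_citations(payload))  # one citation when "context" is present
print(extract_citations({"choices": [{"message": {"content": "hi"}}]}))  # []
```

With the client above, the equivalent lookup on `response` always comes back empty, which is the behavior being reported.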
Metadata

Labels:
- Issues related to the client library for Azure AI Model Inference (\sdk\ai\azure-ai-inference)
- This issue points to a problem in the data plane of the library.
- Workflow: this issue is the responsibility of the Azure service team.
- Reported by a GitHub user external to the Azure organization.
- Workflow: this issue needs attention from the Azure service team or SDK team.
- The issue doesn't require a change to the product in order to be resolved.