Description
- Package Name: azure-ai-evaluation
- Package Version: 1.3.0
- Operating System: macOS
- Python Version: 3.11
Describe the bug
The bug occurs with an AI Foundry hub that uses a private endpoint connection. The call to azure.ai.evaluation.evaluate fails with a DeserializationError, because msrest.serialization.Deserializer is not initialized with the classes PrivateEndpointConnection, PrivateEndpoint, and PrivateLinkServiceConnectionState that are returned by LiteMLClient.workspace_get_info() during the evaluate() call.
To Reproduce
Steps to reproduce the behavior:
- Create an Azure AI Foundry Hub
- Create a private endpoint in the same resource group
- Create an AI Foundry project
- Use this example to run a few evaluations, providing the project information for the AI Foundry project created previously (a minimal sketch of such a call is shown below)
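Below is a minimal sketch of the kind of call that fails. The evaluator choice, file name, and project identifiers are placeholders rather than values taken from the linked example; any JSONL file with "response" and "ground_truth" columns should do.

```python
# Minimal repro sketch (not the exact sample code linked above). Assumptions:
# "evaluation_data.jsonl" is a local file with "response" and "ground_truth"
# columns, and the azure_ai_project values are placeholders for the hub/project
# that sits behind the private endpoint.
from azure.ai.evaluation import evaluate, F1ScoreEvaluator

azure_ai_project = {
    "subscription_id": "<subscription-id>",
    "resource_group_name": "<resource-group>",
    "project_name": "<ai-foundry-project-name>",
}

# With the private-endpoint-enabled hub, this call raises DeserializationError
# while resolving the workspace info instead of uploading the results.
result = evaluate(
    data="evaluation_data.jsonl",
    evaluators={"f1_score": F1ScoreEvaluator()},
    azure_ai_project=azure_ai_project,
)
```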
Expected behavior
The call to evaluate should succeed and the results should be uploaded to AI Foundry.
Labels
- This issue points to a problem in the data-plane of the library.
- Issues related to the client library for Azure AI Evaluation
- Workflow: This issue is responsible by Azure service team.
- This issue requires a change to an existing behavior in the product in order to be resolved.
- Workflow: This issue needs attention from Azure service team or SDK team