
Remove Matrix and 400 error ignoring from chatCompletions #32948

Open

mikhail wants to merge 9 commits into main

Conversation

@mikhail (Member) commented Feb 10, 2025

No description provided.

@mikhail (Member Author) commented Feb 10, 2025

@microsoft-github-policy-service agree company="Microsoft"

@azure-sdk (Collaborator) commented:

API change check

API changes are not detected in this pull request.

@mikhail mikhail marked this pull request as ready for review March 6, 2025 21:15
@deyaaeldeen (Member) commented:

I queued a pipeline run in https://dev.azure.com/azure-sdk/internal/_build/results?buildId=4635999&view=logs&j=43701d7e-216c-59bf-08d1-1efd0c7ac90a&t=837467d0-f383-5ba6-1b1a-21aa68346dfb and it seems there are still a few cases that should be handled. For example:

[vitest] → 400 Unsupported Model. Model Name: 'o1-mini' Model Version '2024-09-12'. Please retry with supported model: *** (0314),gpt-35-turbo (0301),*** (1106-preview),o-mini (2024-07-18), (0613),*** (turbo-2024-04-09),gpt-35-turbo (1106),gpt-35-turbo (0125),o (2024-05-13),-32k (0613),-32k (0314), (0125-preview),***o (2024-08-06),gpt-35-turbo-16k (0613)

Azure Search Error: 403, message='Server responded with status 403. Error message: ', url='/indexes/?api-version=2024-03-01-preview'
[vitest] Server responded with status 403. Error message: ]

@mikhail (Member Author) commented Mar 10, 2025

My hunch is that these two errors should not be automatically ignored. If we expect the model to be supported and it isn't, that is extremely important to surface. Similarly with 403: if the live test key suddenly loses the necessary permissions and every test fires off a 403, I don't want the tests to pass. Open to discussing this.

@deyaaeldeen (Member) commented:

@mikhail Yeah, I agree. There is a balance here between "are we testing the client" and "are we testing the service". I like the tests to be green as long as the client is doing its job, but I would still like enough information in the test output to understand how the service behaves.

@mikhail (Member Author) commented Mar 10, 2025

@deyaaeldeen Agreed. I'd like us to settle on some hard-defined rules about where that balance lies. If we're testing the client, then let's create mock responses that include 4xx and 500 errors and ensure the client handles them properly. If a test hits the live service, it's no longer a unit/component/integration test and shouldn't be treated as one. For example, I see zero value in, and on the contrary some danger in, hitting an API that responds with Model Not Supported and then marking the test as a success, when that test should be verifying some model functionality.
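
As a rough illustration of the kind of mock-based test meant here (a sketch assuming vitest; the `ChatClient` shape and `Transport` stub are hypothetical, not the SDK's actual client):

```ts
// Hypothetical minimal client used only to illustrate the point above:
// HTTP errors from the service are surfaced to the caller, never swallowed.
import { describe, expect, it } from "vitest";

type Transport = (path: string, body: unknown) => Promise<{ status: number; body: unknown }>;

class ChatClient {
  constructor(private readonly transport: Transport) {}

  async getChatCompletions(deployment: string, messages: unknown[]): Promise<unknown> {
    const response = await this.transport(`/deployments/${deployment}/chat/completions`, {
      messages,
    });
    if (response.status >= 400) {
      // Propagate service errors instead of treating them as "acceptable".
      throw Object.assign(new Error(`Request failed with status ${response.status}`), {
        status: response.status,
      });
    }
    return response.body;
  }
}

describe("ChatClient error handling", () => {
  it("propagates a 400 Unsupported Model error instead of ignoring it", async () => {
    // Stub transport that always answers with a 400, as the live service did for o1-mini.
    const transport: Transport = async () => ({
      status: 400,
      body: { error: { code: "UnsupportedModel", message: "Unsupported Model." } },
    });
    const client = new ChatClient(transport);

    await expect(
      client.getChatCompletions("o1-mini", [{ role: "user", content: "hi" }]),
    ).rejects.toMatchObject({ status: 400 });
  });

  it("rejects on 403 rather than letting the test pass", async () => {
    const transport: Transport = async () => ({ status: 403, body: {} });
    const client = new ChatClient(transport);

    await expect(client.getChatCompletions("gpt-4", [])).rejects.toMatchObject({ status: 403 });
  });
});
```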

@deyaaeldeen (Member) commented:

> some danger in hitting an API that responds with Model Not Supported and then marking the test as a success

I think there is a misunderstanding: the sensible thing to do here is not to use this particular model in this particular test.

@mikhail mikhail enabled auto-merge (squash) March 11, 2025 21:59
@mikhail mikhail disabled auto-merge March 11, 2025 21:59
@mikhail mikhail enabled auto-merge (squash) March 11, 2025 22:00
@minhanh-phan (Member) left a comment

Thank you for doing this 💯 I love the direction we're going with withDeployments. I only have some small comments, but no major concerns from me!

].includes(error.code) ||
error.type === "invalid_request_error" ||
error.name === "AbortError" ||
errorStr.includes("Connection error") ||
errorStr.includes("toolCalls") ||
error.status === 404
[

nit: Would it make sense to put all these strings in an AcceptableErrors variable for this list, similar to GlobalAcceptedErrors? Feel free to ignore this since we're moving away from this function.
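
A rough sketch of that idea, mirroring the condition quoted above (the names `acceptableErrorCodes`, `acceptableErrorSubstrings`, and `isAcceptableError` are illustrative, as is deriving `errorStr` via `JSON.stringify`):

```ts
// Illustrative only: one way to collect the accepted-error checks above into named constants.
const acceptableErrorCodes: string[] = [
  // ...the codes currently listed inline before `.includes(error.code)`
];
const acceptableErrorSubstrings = ["Connection error", "toolCalls"];

function isAcceptableError(error: any): boolean {
  // Assumes `errorStr` is a stringified form of the error, as in the snippet above.
  const errorStr = JSON.stringify(error);
  return (
    acceptableErrorCodes.includes(error.code) ||
    error.type === "invalid_request_error" ||
    error.name === "AbortError" ||
    acceptableErrorSubstrings.some((substring) => errorStr.includes(substring)) ||
    error.status === 404
  );
}
```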

Comment on lines +127 to +131
* @param clientsAndDeployments -
* @param run -
* @param validate -
* @param modelsListToSkip -
* @param acceptableErrors -

Suggested change
- * @param clientsAndDeployments -
- * @param run -
- * @param validate -
- * @param modelsListToSkip -
- * @param acceptableErrors -
+ * @param clientsAndDeployments - Information about the clients and their associated deployments.
+ * @param run - A function that runs the test on a given client and model.
+ * @param validate - An optional function to validate the result of the run function.
+ * @param modelsListToSkip - An optional list of models to skip during the test.
+ * @param acceptableErrors - An optional list of acceptable errors that should not cause the test to fail.
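
For context, a rough sketch of a withDeployments-style helper consistent with those parameter descriptions (the types and the skip/acceptable-error logic below are assumptions, not the SDK's actual implementation):

```ts
// Assumed shapes, for illustration only.
interface DeploymentInfo {
  deploymentName: string;
  model: { name: string; version: string };
}

interface ClientAndDeployments<TClient> {
  client: TClient;
  deployments: DeploymentInfo[];
}

async function withDeployments<TClient, TResult>(
  clientsAndDeployments: ClientAndDeployments<TClient>[],
  run: (client: TClient, deploymentName: string) => Promise<TResult>,
  validate?: (result: TResult) => void,
  modelsListToSkip?: Partial<{ name: string; version: string }>[],
  acceptableErrors?: { messageSubstring: string[] },
): Promise<void> {
  for (const { client, deployments } of clientsAndDeployments) {
    for (const deployment of deployments) {
      // Skip models the caller explicitly listed.
      const skip = modelsListToSkip?.some(
        (m) =>
          (m.name === undefined || m.name === deployment.model.name) &&
          (m.version === undefined || m.version === deployment.model.version),
      );
      if (skip) continue;

      try {
        const result = await run(client, deployment.deploymentName);
        validate?.(result);
      } catch (error) {
        const message = error instanceof Error ? error.message : String(error);
        // Only errors the caller explicitly marked acceptable are tolerated;
        // anything else (e.g. an unexpected 400 or 403) fails the test.
        const acceptable = acceptableErrors?.messageSubstring.some((s) => message.includes(s));
        if (!acceptable) throw error;
      }
    }
  }
}
```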
