Fix: Preserve local and custom model names in parse_model_name #1648
ps2program wants to merge 4 commits into confident-ai:main
Conversation
- Adjust tests to expect a `None` return for `None` input instead of an error
- Add test cases for local/custom model prefixes that should be preserved
- Modify existing tests to align with the updated logic for known vs. unknown providers
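A minimal sketch of tests matching these expectations, assuming the utility's documented location in `deepeval.models.utils` (test names and the exact cases are illustrative, not the PR's actual test file):

```python
import pytest

from deepeval.models.utils import parse_model_name


def test_none_input_returns_none():
    # Updated expectation: None input returns None instead of raising TypeError.
    assert parse_model_name(None) is None


@pytest.mark.parametrize("name", ["local/llama-3", "custom/model-xyz"])
def test_custom_prefixes_are_preserved(name):
    # Unknown providers are no longer stripped from the model name.
    assert parse_model_name(name) == name


def test_known_provider_prefix_is_stripped():
    # Known providers still follow the "provider/model" convention.
    assert parse_model_name("openai/gpt-4o") == "gpt-4o"
```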
Hey @ps2program, actually we're thinking about removing this function entirely. The reason it was introduced is that we wanted to fit the LiteLLM format of "provider/model", but it'd be better for us to just support LiteLLM directly. Was wondering, can you do that PR instead?
Hi @penguine-ip, thanks for the context and the opportunity! I'll go ahead and handle the removal of the `parse_model_name` function. Before proceeding, I wanted to confirm a few things:
If we're removing the function entirely, here are the areas that currently rely on it and would need cleanup:
Please confirm if it's okay to proceed with removing it from all of the above as part of the PR.
Hi @penguine-ip, please have a look at the updated changes. Description:
Please review and let me know if there are any edge cases or legacy concerns to address.
Hey @ps2program, I was thinking more of having a LiteLLM integration directly, so that it appears as one of the supported providers.
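For readers following along, a direct LiteLLM integration would presumably follow deepeval's custom-model pattern: a thin wrapper that delegates generation to `litellm.completion`, so LiteLLM's own `"provider/model"` routing replaces the prefix parsing entirely. A rough sketch, assuming deepeval's `DeepEvalBaseLLM` base class; the class name and constructor are hypothetical, not the eventual implementation:

```python
import litellm
from deepeval.models import DeepEvalBaseLLM


class LiteLLMModel(DeepEvalBaseLLM):
    """Hypothetical wrapper that routes generation through LiteLLM."""

    def __init__(self, model: str):
        # LiteLLM already understands "provider/model" strings
        # (e.g. "ollama/llama3"), so no prefix parsing is needed here.
        self.model = model

    def load_model(self):
        return self.model

    def generate(self, prompt: str) -> str:
        response = litellm.completion(
            model=self.model,
            messages=[{"role": "user", "content": prompt}],
        )
        return response.choices[0].message.content

    async def a_generate(self, prompt: str) -> str:
        response = await litellm.acompletion(
            model=self.model,
            messages=[{"role": "user", "content": prompt}],
        )
        return response.choices[0].message.content

    def get_model_name(self):
        return self.model
```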
Hi @penguine-ip, sure, I can get this working. Thanks for the clarification.
Summary
This PR improves the `parse_model_name` utility in `deepeval.models.utils` by ensuring that model names with non-standard or custom prefixes like `local/llama-3` are preserved and not truncated.
Problem
The current implementation of `parse_model_name` assumes that any string before a `/` is a provider name and always strips it. This causes issues for locally hosted or custom models like `local/llama-3`, which get incorrectly truncated to just `llama-3`.
Fix
"openai","anthropic","cohere").NoneforNoneinput instead of raisingTypeError.local/llama-3,custom/model-xyz).Example Behavior