Confused Add Custom Model Providers #1181
henryperkins asked this question in Q&A · Unanswered
Replies: 2 comments
-
✨✨ Here's an AI-assisted sketch of how you might approach this issue, saved by @henryperkins using Copilot Workspace v0.27
-
Quick side note: I tried to fix it, but something in chainRunner, or the way chain initialization and Azure integrate, threw me off. Will attempt again.
-
When adding a custom model using the OpenAI-compatible format, its configuration is incorrectly grouped with the Azure OpenAI configuration. As a result, the system validates the custom model's settings against Azure OpenAI parameters, and validation fails for the custom endpoint.
This makes it difficult to use multiple models at once, such as one from Azure OpenAI and another from an OpenAI-compatible endpoint. The current behavior breaks Copilot functionality whenever models from different sources are configured together.
Steps to Reproduce:
1. Add an Azure OpenAI model to the configuration.
2. Add a second model using an OpenAI-compatible endpoint (non-Azure).
3. Attempt to validate the custom model configuration.
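As a concrete illustration, the steps above might correspond to a settings object like the sketch below. Note that the `customModels` shape, the `provider` values, and all field names here are assumptions for illustration, not the plugin's actual configuration schema.

```typescript
// Hypothetical settings sketch reproducing the issue.
// All key names are assumptions, not the plugin's real schema.
const settings = {
  customModels: [
    {
      // Step 1: an Azure OpenAI model, which needs Azure-specific fields.
      name: "azure-gpt4",
      provider: "azure-openai",
      baseUrl: "https://example.openai.azure.com",
      apiKey: "azure-key",
      deploymentName: "gpt-4",
      apiVersion: "2024-02-01",
    },
    {
      // Step 2: a non-Azure, OpenAI-compatible endpoint (e.g. a local server).
      name: "local-llm",
      provider: "openai-compatible",
      baseUrl: "http://localhost:8000/v1",
      apiKey: "local-key",
      // Step 3: validation fails for this entry, because it is checked
      // against Azure's deploymentName/apiVersion requirements even
      // though they do not apply to a plain OpenAI-compatible endpoint.
    },
  ],
};
```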
Observed Behavior:
• The custom model configuration fails validation because it is checked against Azure OpenAI requirements, not the provided custom endpoint configuration.
Expected Behavior:
• Each model configuration should be validated independently based on its own settings and endpoint requirements.
• Custom OpenAI-compatible models should not be linked to Azure OpenAI validation rules.
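A minimal sketch of the expected behavior, where each entry is validated against the rules of its own provider. The `CustomModel` type, provider tags, and field names below are hypothetical; none of these identifiers come from the plugin's codebase.

```typescript
// Hypothetical per-provider validation: each model is checked against
// its own provider's requirements, so a custom OpenAI-compatible entry
// is never validated against Azure-specific fields.
type Provider = "azure-openai" | "openai-compatible";

interface CustomModel {
  name: string;
  provider: Provider;
  baseUrl: string;
  apiKey: string;
  // Azure-only fields; irrelevant for OpenAI-compatible endpoints.
  deploymentName?: string;
  apiVersion?: string;
}

function validateModel(model: CustomModel): string[] {
  const errors: string[] = [];
  if (!model.apiKey) errors.push(`${model.name}: missing API key`);
  switch (model.provider) {
    case "azure-openai":
      // Azure additionally requires a deployment name and an API version.
      if (!model.deploymentName) errors.push(`${model.name}: missing deployment name`);
      if (!model.apiVersion) errors.push(`${model.name}: missing API version`);
      break;
    case "openai-compatible":
      // A plain OpenAI-compatible endpoint only needs a base URL.
      if (!model.baseUrl) errors.push(`${model.name}: missing base URL`);
      break;
  }
  return errors;
}

const models: CustomModel[] = [
  { name: "azure-gpt4", provider: "azure-openai",
    baseUrl: "https://example.openai.azure.com", apiKey: "azure-key",
    deploymentName: "gpt-4", apiVersion: "2024-02-01" },
  { name: "local-llm", provider: "openai-compatible",
    baseUrl: "http://localhost:8000/v1", apiKey: "local-key" },
];

// Validating each model independently: neither entry leaks its
// requirements onto the other.
const allErrors = models.flatMap(validateModel);
console.log(allErrors.length === 0 ? "all models valid" : allErrors.join("\n"));
```

The key design point is the `switch` on `provider`: validation rules are selected per entry, so adding an Azure model never changes how a non-Azure entry is checked.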
Impact:
• Prevents seamless use of Copilot with multiple AI models when one is from Azure OpenAI and another from a non-Azure, OpenAI-compatible source.
• Reduces flexibility for developers using diverse AI endpoints.