Add provider ordering hint for chat model picker #4972
kevin-m-kent wants to merge 8 commits into microsoft:main from
Conversation
Add github.copilot.chat.modelPickerVendorOrdering experiment setting (tagged experimental + onExp) that enables vendor-prioritized ordering in the model picker: OpenAI → Anthropic → Gemini → others. When the experiment is enabled, sets vendorPriority on each model based on the endpoint's modelProvider field. The Auto model is excluded from vendor ordering (it always appears first via separate logic in VS Code). Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
Featured models (GPT-5.4, GPT-5.4 mini, Claude Opus 4.6, Claude Sonnet 4.6) get lowest priority values to appear first after Auto. Remaining models use vendor-based grouping: OpenAI (100) → Anthropic (200) → Gemini (300) → others (400). Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
ExperimentBased config keys must be read with getExperimentBasedConfig() not getConfig(). This was causing the vendorOrdering setting to never be read as true. Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
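The bug described in this commit can be illustrated with a toy sketch. The two method names come from this repo's configuration service, but the class, signatures, and key names below are simplified assumptions — the point is only that a plain `getConfig()` read never consults the experimentation service, so an experiment-backed key stays at its static default:

```typescript
// Simplified stand-ins for the repo's configuration service (real
// signatures differ). Experiment-backed keys are resolved through the
// experimentation service, so a plain getConfig() read only ever sees
// the contributed default.
type ExperimentValues = Record<string, unknown>;

class ToyConfigService {
	constructor(
		private readonly defaults: Record<string, unknown>,
		private readonly expValues: ExperimentValues,
	) {}

	// Plain read: static defaults only — never consults the experiment service.
	getConfig<T>(key: string): T {
		return this.defaults[key] as T;
	}

	// Experiment-aware read: the treatment value wins over the default.
	getExperimentBasedConfig<T>(key: string): T {
		return (key in this.expValues ? this.expValues[key] : this.defaults[key]) as T;
	}
}

const service = new ToyConfigService(
	{ 'chat.modelPickerVendorOrdering': false }, // contributed default
	{ 'chat.modelPickerVendorOrdering': true },  // value set by the experiment
);

// The bug: getConfig() never reflects the experiment treatment.
console.log(service.getConfig<boolean>('chat.modelPickerVendorOrdering'));               // false
console.log(service.getExperimentBasedConfig<boolean>('chat.modelPickerVendorOrdering')); // true
```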
Remove the hardcoded featured model list and assign vendorPriority only by provider (OpenAI, Anthropic, Gemini, others). Top-level promotion is left to VS Code's models control manifest. Also switch the setting to a simple config-backed flag so it can be enabled reliably in local dev. Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
Run vscode-dts:update so the local proposed API definitions and vscodeCommit match the current upstream VS Code commit expected by CI. Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
Pull request overview
This PR introduces an opt-in configuration that provides a provider-based ordering hint (vendorPriority) to VS Code’s chat model picker, so models can be sorted by vendor (e.g., OpenAI before Anthropic, etc.) while leaving “featured” promotion to VS Code’s models manifest.
Changes:
- Adds a new setting `github.copilot.chat.modelPickerVendorOrdering` (and a corresponding `ConfigKey`) to enable vendor-prioritized ordering.
- Extends the proposed VS Code LM API type surface with an optional `vendorPriority?: number` field on `LanguageModelChatInformation`.
- When enabled, assigns `vendorPriority` for non-Auto endpoints based on `endpoint.modelProvider`.
Summary per file:
| File | Description |
|---|---|
| src/platform/configuration/common/configurationService.ts | Adds ConfigKey.ModelPickerVendorOrdering to read the new setting. |
| src/extension/vscode.proposed.chatProvider.d.ts | Adds vendorPriority?: number to the proposed LanguageModelChatInformation interface. |
| src/extension/conversation/vscode-node/languageModelAccess.ts | Reads the setting and emits vendorPriority for model picker ordering. |
| package.json | Contributes the new setting schema entry under github.copilot.chat.modelPickerVendorOrdering. |
Copilot's findings
- Files reviewed: 3/3 changed files
- Comments generated: 2
```typescript
	return 0;
case 'anthropic':
	return 1;
case 'google':
```
getVendorPriority doesn't match the setting description ("...OpenAI → Anthropic → Gemini...") and will not prioritize BYOK Gemini models because their vendor/provider name is 'Gemini' (see GeminiNativeBYOKLMProvider.providerName = 'Gemini'). Consider handling 'gemini' (and/or more robust matching) so Gemini models actually get priority 2 when this setting is enabled.
Suggested change:
```typescript
case 'google':
case 'gemini':
```
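Following this suggestion, a case-insensitive `getVendorPriority` that also matches the BYOK `'Gemini'` provider name might look like the sketch below. The function name comes from the review comment above; the exact shape of the helper in `languageModelAccess.ts` may differ:

```typescript
// Maps a model provider name to an ordering bucket:
// OpenAI → Anthropic → Gemini → everything else.
// Lowercasing covers both the built-in 'google' provider id and the
// BYOK providerName 'Gemini' (GeminiNativeBYOKLMProvider.providerName).
function getVendorPriority(modelProvider: string | undefined): number {
	switch (modelProvider?.toLowerCase()) {
		case 'openai':
			return 0;
		case 'anthropic':
			return 1;
		case 'google':
		case 'gemini':
			return 2;
		default:
			return 3;
	}
}
```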
package.json (Outdated)
| "github.copilot.chat.modelPickerVendorOrdering": { | ||
| "type": "boolean", | ||
| "default": false, | ||
| "markdownDescription": "Enable vendor-prioritized ordering in the model picker (OpenAI → Anthropic → Gemini → others).", | ||
| "tags": [ | ||
| "experimental", | ||
| "onExp" | ||
| ] | ||
| }, |
This new setting's markdownDescription is hard-coded English text, while nearby Copilot settings use localized %...% keys (e.g. github.copilot.config.rateLimitAutoSwitchToAuto). To keep settings UI localizable, add a new entry to package.nls.json and reference it here (e.g. %github.copilot.config.modelPickerVendorOrdering%).
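One way to address this, sketched below; the key name is illustrative and should follow whatever `%...%` naming convention the existing entries in package.nls.json use. First the new package.nls.json entry:

```json
{
	"github.copilot.config.modelPickerVendorOrdering": "Enable vendor-prioritized ordering in the model picker (OpenAI → Anthropic → Gemini → others)."
}
```

Then the setting in package.json would reference it as `"markdownDescription": "%github.copilot.config.modelPickerVendorOrdering%"` instead of the hard-coded English string.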
Just a heads up for @lramos15 and @sandy081! @kevin-m-kent @cwebster-99 if we want to take this in exp for this iteration, we can merge by EOD today and test on Monday!
Assign vendorPriority after creating the LanguageModelChatInformation object so the runtime property is preserved without requiring the current upstream proposed API typings to include it yet. Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
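A sketch of the pattern this commit describes. The interface and endpoint shape below are simplified stand-ins (in the real code the object is a `LanguageModelChatInformation` and the priority comes from `endpoint.modelProvider`), and the Auto check is illustrative:

```typescript
// Simplified stand-in for LanguageModelChatInformation before the
// proposed API gains the optional vendorPriority field.
interface ChatModelInfo {
	id: string;
	name: string;
}

interface ToyEndpoint {
	id: string;
	name: string;
	modelProvider: string;
	isAuto: boolean; // illustrative; the real Auto check differs
}

// OpenAI → Anthropic → Gemini → others (see getVendorPriority above).
function vendorBucket(provider: string): number {
	switch (provider.toLowerCase()) {
		case 'openai': return 0;
		case 'anthropic': return 1;
		case 'google':
		case 'gemini': return 2;
		default: return 3;
	}
}

function toModelInfo(endpoint: ToyEndpoint, vendorOrderingEnabled: boolean): ChatModelInfo {
	const info: ChatModelInfo = { id: endpoint.id, name: endpoint.name };
	// Auto is excluded: VS Code pins it first through separate logic.
	if (vendorOrderingEnabled && !endpoint.isAuto) {
		// Assign after construction so the runtime property is preserved even
		// though the current upstream typings don't declare vendorPriority yet.
		(info as ChatModelInfo & { vendorPriority?: number }).vendorPriority =
			vendorBucket(endpoint.modelProvider);
	}
	return info;
}
```

Once the upstream `vendorPriority?: number` field lands in the proposed API typings, the type assertion can be dropped and the property set directly in the object literal.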
Summary
- Adds `github.copilot.chat.modelPickerVendorOrdering` to emit a provider-based ordering hint for the chat model picker
- Assigns `vendorPriority` by provider so OpenAI models sort ahead of Anthropic, Gemini, and other providers

Testing