Add provider ordering hint for chat model picker #4972

Open
kevin-m-kent wants to merge 8 commits into microsoft:main from kevin-m-kent:kevin-m-kent/model-picker-ordering

Conversation

@kevin-m-kent
Contributor

Summary

  • add github.copilot.chat.modelPickerVendorOrdering to emit a provider-based ordering hint for the chat model picker
  • assign vendorPriority by provider so OpenAI models sort ahead of Anthropic, Gemini, and other providers
  • leave top-level promotion to VS Code's existing models control manifest instead of hardcoding featured models in the extension
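The provider-based ordering hint can be sketched as a small provider-to-priority map. This is a hypothetical sketch, not the PR's actual code: the function name, the 'gemini' alias (taken from the review suggestion on this PR), and the exact priority values are illustrative.

```typescript
// Hypothetical sketch of a provider-to-priority mapping for the model picker.
// Names and values are illustrative, not the extension's real code.
function getVendorPriority(modelProvider: string | undefined): number {
	switch (modelProvider?.toLowerCase()) {
		case 'openai':
			return 0; // OpenAI models sort first
		case 'anthropic':
			return 1;
		case 'google':
		case 'gemini': // BYOK Gemini providers report 'Gemini' as the vendor
			return 2;
		default:
			return 3; // all other providers sort last
	}
}

// Lower vendorPriority sorts earlier; Auto is excluded from vendor ordering
// and is always listed first by separate logic in VS Code.
const sorted = ['anthropic', 'openai', 'Gemini', 'xai'].sort(
	(a, b) => getVendorPriority(a) - getVendorPriority(b)
);
console.log(sorted); // → [ 'openai', 'anthropic', 'Gemini', 'xai' ]
```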

Testing

  • enabled the setting in a local OSS dev build and verified the picker orders models by provider while top-level featured models come from the manifest

kevin-m-kent and others added 4 commits April 3, 2026 17:39
Add github.copilot.chat.modelPickerVendorOrdering experiment setting
(tagged experimental + onExp) that enables vendor-prioritized ordering
in the model picker: OpenAI → Anthropic → Gemini → others.

When the experiment is enabled, sets vendorPriority on each model based
on the endpoint's modelProvider field. Auto model is excluded from
vendor ordering (always appears first via separate logic in vscode).

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>

Featured models (GPT-5.4, GPT-5.4 mini, Claude Opus 4.6, Claude
Sonnet 4.6) get lowest priority values to appear first after Auto.
Remaining models use vendor-based grouping:
OpenAI (100) → Anthropic (200) → Gemini (300) → others (400).

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>

ExperimentBased config keys must be read with getExperimentBasedConfig(),
not getConfig(). This was causing the vendorOrdering setting to never be
read as true.

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
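The failure mode this commit fixes can be illustrated with a toy configuration service. The classes and fields below are hypothetical stand-ins, not the extension's real implementation: an experiment-tagged setting must be read through the experiment-aware accessor, because a plain getConfig() only ever sees the declared default.

```typescript
// Toy illustration (hypothetical classes, not the extension's real code).
type BoolKey = 'modelPickerVendorOrdering';

class ToyConfigService {
	private readonly defaults: Record<BoolKey, boolean> = {
		modelPickerVendorOrdering: false,
	};
	// Treatment values assigned by the experimentation service.
	private readonly expAssignments: Partial<Record<BoolKey, boolean>> = {
		modelPickerVendorOrdering: true,
	};

	getConfig(key: BoolKey): boolean {
		return this.defaults[key]; // never consults the experiment assignment
	}

	getExperimentBasedConfig(key: BoolKey): boolean {
		return this.expAssignments[key] ?? this.defaults[key];
	}
}

const config = new ToyConfigService();
console.log(config.getConfig('modelPickerVendorOrdering'));                // false (the bug)
console.log(config.getExperimentBasedConfig('modelPickerVendorOrdering')); // true (the fix)
```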

Remove the hardcoded featured model list and assign vendorPriority
only by provider (OpenAI, Anthropic, Gemini, others). Top-level
promotion is left to VS Code's models control manifest.

Also switch the setting to a simple config-backed flag so it can be
enabled reliably in local dev.

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>

Run vscode-dts:update so the local proposed API definitions and
vscodeCommit match the current upstream VS Code commit expected by CI.

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
Contributor

Copilot AI left a comment


Pull request overview

This PR introduces an opt-in configuration that provides a provider-based ordering hint (vendorPriority) to VS Code's chat model picker, so models can be sorted by vendor (e.g., OpenAI before Anthropic) while leaving "featured" promotion to VS Code's models manifest.

Changes:

  • Adds a new setting github.copilot.chat.modelPickerVendorOrdering (and corresponding ConfigKey) to enable vendor-prioritized ordering.
  • Extends the proposed VS Code LM API type surface with an optional vendorPriority?: number field on LanguageModelChatInformation.
  • When enabled, assigns vendorPriority for non-Auto endpoints based on endpoint.modelProvider.
Show a summary per file

  • src/platform/configuration/common/configurationService.ts: Adds ConfigKey.ModelPickerVendorOrdering to read the new setting.
  • src/extension/vscode.proposed.chatProvider.d.ts: Adds vendorPriority?: number to the proposed LanguageModelChatInformation interface.
  • src/extension/conversation/vscode-node/languageModelAccess.ts: Reads the setting and emits vendorPriority for model picker ordering.
  • package.json: Contributes the new setting schema entry under github.copilot.chat.modelPickerVendorOrdering.

Copilot's findings

  • Files reviewed: 3/3 changed files
  • Comments generated: 2

			return 0;
		case 'anthropic':
			return 1;
		case 'google':

Copilot AI Apr 3, 2026


getVendorPriority doesn't match the setting description ("...OpenAI → Anthropic → Gemini...") and will not prioritize BYOK Gemini models because their vendor/provider name is 'Gemini' (see GeminiNativeBYOKLMProvider.providerName = 'Gemini'). Consider handling 'gemini' (and/or more robust matching) so Gemini models actually get priority 2 when this setting is enabled.

Suggested change
-		case 'google':
+		case 'google':
+		case 'gemini':

package.json Outdated
Comment on lines +3037 to +3045
"github.copilot.chat.modelPickerVendorOrdering": {
"type": "boolean",
"default": false,
"markdownDescription": "Enable vendor-prioritized ordering in the model picker (OpenAI → Anthropic → Gemini → others).",
"tags": [
"experimental",
"onExp"
]
},

Copilot AI Apr 3, 2026


This new setting's markdownDescription is hard-coded English text, while nearby Copilot settings use localized %...% keys (e.g. github.copilot.config.rateLimitAutoSwitchToAuto). To keep settings UI localizable, add a new entry to package.nls.json and reference it here (e.g. %github.copilot.config.modelPickerVendorOrdering%).
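A sketch of what the suggested localization change could look like. The %...% key name is illustrative, modeled on the neighboring settings the reviewer mentions, not a key that exists in the repo. In package.nls.json:

```json
{
	"github.copilot.config.modelPickerVendorOrdering": "Enable vendor-prioritized ordering in the model picker (OpenAI → Anthropic → Gemini → others)."
}
```

and the setting's schema entry in package.json would then reference it:

```json
"markdownDescription": "%github.copilot.config.modelPickerVendorOrdering%"
```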

@justschen
Contributor

just a heads up for @lramos15 and @sandy081!

@kevin-m-kent @cwebster-99 if we want to take this in exp for this iteration, we can merge by eod today and test on monday!

kevin-m-kent and others added 3 commits April 3, 2026 18:40
Assign vendorPriority after creating the LanguageModelChatInformation
object so the runtime property is preserved without requiring the
current upstream proposed API typings to include it yet.

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
…Copilot <223556219+Copilot@users.noreply.github.com>


3 participants