## Problem
When using OpenClaude with a custom OpenAI-compatible provider (e.g. Kimi, GLM, DeepSeek via OpenRouter), the `/model` picker shows hardcoded GPT-5.x / Codex models that are irrelevant to the configured provider. Meanwhile, models defined in `agentModels` within `~/.openclaude/settings.json` do not appear in the picker at all.
## Current behavior
- Set up a provider profile with Kimi (https://api.kimi.com/coding/, model `kimi-k2.6`)
- Define additional models in `settings.json` under `agentModels` (GLM-5.1, DeepSeek V4 Flash, MiniMax)
- Open the `/model` picker

Result: The picker shows `gpt-5.5`, `gpt-5.4`, `gpt-5.3-codex`, `gpt-5.3-codex-spark`, `gpt-5.2-codex`, `gpt-5.1-codex-max`, `gpt-5.1-codex-mini`, `gpt-5.5-mini`, `gpt-5.4-mini`, plus Sonnet/Opus/Haiku — none of which are available through the configured provider. The custom `agentModels` (`glm-5.1`, `deepseek-v4-flash`, etc.) are nowhere in the list.
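For reference, the `agentModels` block used in this setup looks roughly like the following. The field names and endpoint URLs here are illustrative only; the actual settings schema may differ:

```json
{
  "agentModels": [
    { "id": "glm-5.1", "baseUrl": "https://api.z.ai/v1" },
    { "id": "deepseek-v4-flash", "baseUrl": "https://openrouter.ai/api/v1" },
    { "id": "minimax", "baseUrl": "https://api.sambanova.ai/v1" }
  ]
}
```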
## Expected behavior
- Models from `agentModels` in `settings.json` should appear in the `/model` picker so the user can switch between them for the main session (not just subagents)
- Hardcoded GPT/Codex models should not be injected when `getAPIProvider() === "openai"` and the base URL is clearly not OpenAI/Codex (e.g. api.kimi.com, api.z.ai, openrouter.ai)
## Root cause (source analysis)
In `src/utils/model/modelOptions.ts`, lines 550-551:

```typescript
if (getAPIProvider() === 'openai' || getAPIProvider() === 'codex') {
  payg3pOptions.push(...getCodexModelOptions())
}
```

This unconditionally injects 10 GPT/Codex model options whenever the provider is `openai`, regardless of the actual base URL or configured models.

Meanwhile, `agentModels` from `settings.json` are only consumed by `resolveAgentProvider()` in `src/services/api/agentRouting.ts` for subagent routing — they are never surfaced in the model picker (`getModelOptionsBase()`).
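One way to close that gap would be to map `agentModels` entries into picker options alongside the built-in ones. A minimal sketch, assuming a simple entry shape (the `AgentModelEntry` and `ModelOption` interfaces below are illustrative, not the project's actual types):

```typescript
// Illustrative types: the real agentModels schema may differ.
interface AgentModelEntry {
  id: string          // model identifier, e.g. "glm-5.1"
  baseUrl: string     // provider endpoint for this model
  displayName?: string
}

interface ModelOption {
  value: string
  label: string
}

// Convert configured agentModels into options the /model picker can show,
// labeling each one with its provider hostname for disambiguation.
function agentModelOptions(agentModels: AgentModelEntry[]): ModelOption[] {
  return agentModels.map((entry) => ({
    value: entry.id,
    label: entry.displayName ?? `${entry.id} (${new URL(entry.baseUrl).hostname})`,
  }))
}
```

The hostname suffix in the label keeps two identically named models from different providers distinguishable in the picker.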
## Proposed solution
- Skip Codex injection when `OPENAI_BASE_URL` does not point to an OpenAI/Codex endpoint
- Surface `agentModels` from `settings.json` in the `/model` picker, so users can switch the main session model to any of their configured providers
- Alternatively, allow `providerProfiles` to define multiple profiles with different base URLs, and let the user switch between them from `/model` (currently only the active profile's models appear)
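The first bullet could be a small base-URL check in front of the injection in `getModelOptionsBase()`. A sketch, assuming the override is read from `OPENAI_BASE_URL` (the helper name `isOpenAIEndpoint` is mine, not existing code in the repo):

```typescript
// Hypothetical guard: treat an unset base URL as the default OpenAI endpoint,
// and only accept hosts under openai.com.
function isOpenAIEndpoint(baseUrl: string | undefined): boolean {
  if (!baseUrl) return true // no override => default api.openai.com
  try {
    const host = new URL(baseUrl).hostname
    return host === 'api.openai.com' || host.endsWith('.openai.com')
  } catch {
    return false // unparseable URL: don't inject Codex options
  }
}

// Usage at the injection site in modelOptions.ts would then look like:
// if ((getAPIProvider() === 'openai' || getAPIProvider() === 'codex')
//     && isOpenAIEndpoint(process.env.OPENAI_BASE_URL)) {
//   payg3pOptions.push(...getCodexModelOptions())
// }
```

With this guard, a Kimi or OpenRouter base URL would suppress the ten GPT/Codex entries while leaving the default OpenAI setup unchanged.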
## Use case
I use a tiered multi-provider setup:
- GLM-5.1 (Z.AI) — for planning, 10-20 tps
- Kimi K2.6 (Moonshot) — for routine coding, 20-60 tps
- DeepSeek V4 Flash (OpenRouter) — for exploration, high throughput
- MiniMax (SambaNova) — for simple tasks, 200-400 tps
All are defined in `agentModels` + `agentRouting` in `settings.json`. The routing works for subagents, but I cannot switch the main session model to any of these without manually editing env vars and restarting.
## Environment
- OpenClaude v0.10.0
- macOS
- Provider: Moonshot AI (Kimi Code) via OpenAI-compatible profile