
Feature: Show agentModels from settings.json in /model picker instead of hardcoded GPT/Codex models #1119

@suenot

Description

Problem

When using OpenClaude with a custom OpenAI-compatible provider (e.g. Kimi, GLM, DeepSeek via OpenRouter), the /model picker shows hardcoded GPT-5.x / Codex models that are irrelevant to the configured provider. Meanwhile, models defined in agentModels within ~/.openclaude/settings.json do not appear in the picker at all.

Current behavior

  1. Set up a provider profile with Kimi (https://api.kimi.com/coding/, model kimi-k2.6)
  2. Define additional models in settings.json under agentModels (GLM-5.1, DeepSeek V4 Flash, MiniMax)
  3. Open /model picker

Result: The picker shows gpt-5.5, gpt-5.4, gpt-5.3-codex, gpt-5.3-codex-spark, gpt-5.2-codex, gpt-5.1-codex-max, gpt-5.1-codex-mini, gpt-5.5-mini, gpt-5.4-mini, plus Sonnet/Opus/Haiku — none of which are available through the configured provider. The custom agentModels (glm-5.1, deepseek-v4-flash, etc.) are nowhere in the list.

Expected behavior

  • Models from agentModels in settings.json should appear in the /model picker so the user can switch between them for the main session (not just subagents)
  • Hardcoded GPT/Codex models should not be injected when getAPIProvider() === "openai" and the base URL is clearly not OpenAI/Codex (e.g. api.kimi.com, api.z.ai, openrouter.ai)

Root cause (source analysis)

In src/utils/model/modelOptions.ts, lines 550-551:

if (getAPIProvider() === 'openai' || getAPIProvider() === 'codex') {
    payg3pOptions.push(...getCodexModelOptions())
}

This unconditionally injects the GPT/Codex model options whenever the provider is openai or codex, regardless of the actual base URL or configured models.
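One way to fix this would be to gate the injection on the configured base URL. A minimal sketch, assuming a hypothetical isOfficialOpenAIEndpoint helper (the helper name and host list are illustrative, not existing code):

```typescript
// Hypothetical helper: only treat the endpoint as official OpenAI/Codex when
// the configured base URL actually points at an OpenAI host.
const OPENAI_HOSTS = new Set(["api.openai.com", "chatgpt.com"]);

function isOfficialOpenAIEndpoint(baseUrl: string | undefined): boolean {
  if (!baseUrl) return true; // no override → default OpenAI endpoint
  try {
    return OPENAI_HOSTS.has(new URL(baseUrl).hostname);
  } catch {
    return false; // unparsable URL → do not inject Codex models
  }
}

// The injection site in modelOptions.ts would then become something like:
//   if ((getAPIProvider() === 'openai' &&
//        isOfficialOpenAIEndpoint(process.env.OPENAI_BASE_URL))
//       || getAPIProvider() === 'codex') {
//       payg3pOptions.push(...getCodexModelOptions())
//   }
```

With a guard like this, a profile pointing at api.kimi.com or openrouter.ai would no longer pull in the GPT/Codex list.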

Meanwhile, agentModels from settings.json are only consumed by resolveAgentProvider() in src/services/api/agentRouting.ts for subagent routing — they are never surfaced in the model picker (getModelOptionsBase()).
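Surfacing agentModels in the picker could amount to a merge step in getModelOptionsBase(). A sketch under assumed shapes (the ModelOption and settings types below are guesses at the real structures, not the actual schema):

```typescript
// Assumed shapes; the real types in modelOptions.ts / settings.json may differ.
interface ModelOption {
  id: string;
  label: string;
}

interface Settings {
  agentModels?: Record<string, { model: string; baseUrl: string }>;
}

// Append every agentModels entry to the picker list, skipping models that
// are already present so built-in and custom entries do not duplicate.
function withAgentModels(base: ModelOption[], settings: Settings): ModelOption[] {
  const extra = Object.entries(settings.agentModels ?? {}).map(
    ([name, cfg]): ModelOption => ({ id: cfg.model, label: name })
  );
  const seen = new Set(base.map((o) => o.id));
  return [...base, ...extra.filter((o) => !seen.has(o.id))];
}
```

A merge like this would make the custom models selectable for the main session while leaving the existing subagent routing untouched.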

Proposed solution

  1. Skip Codex injection when OPENAI_BASE_URL does not point to an OpenAI/Codex endpoint
  2. Surface agentModels from settings.json in the /model picker, so users can switch the main session model to any of their configured providers
  3. Alternatively, support multiple providerProfiles with different base URLs and let the user switch between them from /model (currently only the active profile's models appear)

Use case

I use a tiered multi-provider setup:

  • GLM-5.1 (Z.AI) — for planning, 10-20 tps
  • Kimi K2.6 (Moonshot) — for routine coding, 20-60 tps
  • DeepSeek V4 Flash (OpenRouter) — for exploration, high throughput
  • MiniMax (SambaNova) — for simple tasks, 200-400 tps

All are defined in agentModels + agentRouting in settings.json. The routing works for subagents, but I cannot switch the main session model to any of these without manually editing env vars and restarting.

Environment

  • OpenClaude v0.10.0
  • macOS
  • Provider: Moonshot AI (Kimi Code) via OpenAI-compatible profile
