feat(zenmux): route models through protocol-specific SDKs #1254
rekram1-node merged 2 commits into anomalyco:dev from
Conversation
Pull request overview
This PR updates the ZenMux provider and model metadata so models are routed through protocol-appropriate SDKs/endpoints (notably Anthropic + MiniMax), while aligning capabilities/limits with observed runtime behavior.
Changes:
- Switch ZenMux's default SDK configuration to OpenAI-compatible (`@ai-sdk/openai-compatible`) using `https://zenmux.ai/api/v1`.
- Add per-model `[provider]` overrides to route OpenAI models via `@ai-sdk/openai` and Anthropic/MiniMax models via `@ai-sdk/anthropic` (with the `/api/anthropic/v1` endpoint).
- Adjust and clean up model metadata (e.g., add interleaving where needed, tweak a capability flag, remove deprecated/unsupported model entries, update limits).
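The per-model override described above might look roughly like the following TOML sketch. Only the `npm` key is confirmed by this PR's commit message; the URL key name is an assumption for illustration, since the actual schema is not shown here:

```toml
# Hypothetical sketch of a per-model override, e.g. in a claude-*.toml file.
# The "api" key name is assumed; only `npm = "@ai-sdk/anthropic"` appears
# verbatim in this PR's commit message.
[provider]
npm = "@ai-sdk/anthropic"
api = "https://zenmux.ai/api/anthropic/v1"
```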
Reviewed changes
Copilot reviewed 37 out of 37 changed files in this pull request and generated no comments.
Summary per file:
| File | Description |
|---|---|
| providers/zenmux/provider.toml | Changes ZenMux default SDK + base API to OpenAI-compatible /api/v1. |
| providers/zenmux/models/z-ai/glm-4.7.toml | Adds interleaving config for reasoning output (reasoning_content). |
| providers/zenmux/models/z-ai/glm-4.7-flash-free.toml | Adds interleaving config for reasoning output (reasoning_content). |
| providers/zenmux/models/openai/gpt-5.toml | Adds per-model provider override to use @ai-sdk/openai with ZenMux /api/v1. |
| providers/zenmux/models/openai/gpt-5.4.toml | Adds per-model provider override to use @ai-sdk/openai with ZenMux /api/v1. |
| providers/zenmux/models/openai/gpt-5.4-pro.toml | Adds per-model provider override to use @ai-sdk/openai with ZenMux /api/v1. |
| providers/zenmux/models/openai/gpt-5.4-nano.toml | Adds per-model provider override to use @ai-sdk/openai with ZenMux /api/v1. |
| providers/zenmux/models/openai/gpt-5.4-mini.toml | Adds per-model provider override to use @ai-sdk/openai with ZenMux /api/v1. |
| providers/zenmux/models/openai/gpt-5.3-codex.toml | Adds per-model provider override to use @ai-sdk/openai with ZenMux /api/v1. |
| providers/zenmux/models/openai/gpt-5.3-chat.toml | Adds per-model provider override to use @ai-sdk/openai with ZenMux /api/v1. |
| providers/zenmux/models/openai/gpt-5.2.toml | Adds per-model provider override to use @ai-sdk/openai with ZenMux /api/v1. |
| providers/zenmux/models/openai/gpt-5.2-pro.toml | Adds per-model provider override to use @ai-sdk/openai with ZenMux /api/v1. |
| providers/zenmux/models/openai/gpt-5.2-codex.toml | Adds per-model provider override to use @ai-sdk/openai with ZenMux /api/v1. |
| providers/zenmux/models/openai/gpt-5.1.toml | Adds per-model provider override to use @ai-sdk/openai with ZenMux /api/v1. |
| providers/zenmux/models/openai/gpt-5.1-codex.toml | Adds per-model provider override to use @ai-sdk/openai with ZenMux /api/v1. |
| providers/zenmux/models/openai/gpt-5.1-codex-mini.toml | Adds per-model provider override to use @ai-sdk/openai with ZenMux /api/v1. |
| providers/zenmux/models/openai/gpt-5.1-chat.toml | Adds per-model provider override to use @ai-sdk/openai with ZenMux /api/v1. |
| providers/zenmux/models/openai/gpt-5-codex.toml | Adds per-model provider override to use @ai-sdk/openai with ZenMux /api/v1. |
| providers/zenmux/models/moonshotai/kimi-k2.5.toml | Updates capability flag (temperature = false) to match runtime behavior. |
| providers/zenmux/models/minimax/minimax-m2.toml | Adds per-model provider override to use @ai-sdk/anthropic with ZenMux /api/anthropic/v1. |
| providers/zenmux/models/minimax/minimax-m2.7.toml | Adds per-model provider override to use @ai-sdk/anthropic with ZenMux /api/anthropic/v1. |
| providers/zenmux/models/minimax/minimax-m2.7-highspeed.toml | Adds per-model provider override to use @ai-sdk/anthropic with ZenMux /api/anthropic/v1. |
| providers/zenmux/models/minimax/minimax-m2.5.toml | Removes interleaving and adds provider override to @ai-sdk/anthropic on /api/anthropic/v1. |
| providers/zenmux/models/minimax/minimax-m2.5-lightning.toml | Removes interleaving and adds provider override to @ai-sdk/anthropic on /api/anthropic/v1. |
| providers/zenmux/models/minimax/minimax-m2.1.toml | Adds per-model provider override to use @ai-sdk/anthropic with ZenMux /api/anthropic/v1. |
| providers/zenmux/models/google/gemini-3-pro-image-preview.toml | Removes unsupported/deprecated model entry from ZenMux catalog. |
| providers/zenmux/models/anthropic/claude-sonnet-4.toml | Adds per-model provider override to use @ai-sdk/anthropic with ZenMux /api/anthropic/v1. |
| providers/zenmux/models/anthropic/claude-sonnet-4.6.toml | Adds per-model provider override to use @ai-sdk/anthropic with ZenMux /api/anthropic/v1. |
| providers/zenmux/models/anthropic/claude-sonnet-4.5.toml | Adds per-model provider override to use @ai-sdk/anthropic with ZenMux /api/anthropic/v1. |
| providers/zenmux/models/anthropic/claude-opus-4.toml | Updates output limit and adds provider override to @ai-sdk/anthropic on /api/anthropic/v1. |
| providers/zenmux/models/anthropic/claude-opus-4.6.toml | Adds per-model provider override to use @ai-sdk/anthropic with ZenMux /api/anthropic/v1. |
| providers/zenmux/models/anthropic/claude-opus-4.5.toml | Adds per-model provider override to use @ai-sdk/anthropic with ZenMux /api/anthropic/v1. |
| providers/zenmux/models/anthropic/claude-opus-4.1.toml | Adds per-model provider override to use @ai-sdk/anthropic with ZenMux /api/anthropic/v1. |
| providers/zenmux/models/anthropic/claude-haiku-4.5.toml | Adds per-model provider override to use @ai-sdk/anthropic with ZenMux /api/anthropic/v1. |
| providers/zenmux/models/anthropic/claude-3.7-sonnet.toml | Adds per-model provider override to use @ai-sdk/anthropic with ZenMux /api/anthropic/v1. |
| providers/zenmux/models/anthropic/claude-3.5-sonnet.toml | Removes deprecated model entry from ZenMux catalog. |
| providers/zenmux/models/anthropic/claude-3.5-haiku.toml | Adds per-model provider override to use @ai-sdk/anthropic with ZenMux /api/anthropic/v1. |
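For contrast with the per-model overrides above, the provider-level default described in the first table row might look like this minimal sketch (key names are assumptions; the package name and base URL are taken from the PR description):

```toml
# Hypothetical sketch of providers/zenmux/provider.toml after this change.
# Key names assumed for illustration; values come from the PR description.
npm = "@ai-sdk/openai-compatible"
api = "https://zenmux.ai/api/v1"
```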
Can u provide a bit more info for this change, evidence it works, etc? Kinda a big PR and idk this provider.
@rekram1-node Added more detail to the PR description, but the short version is:
For validation: I manually tested all updated model configs and confirmed they work correctly.
Is there anything blocking this PR? Hope this one can be merged asap. |
…MiniMax M2.1/M2.5

- Add `[provider]` npm = "@ai-sdk/anthropic" to match the -free variants
- Remove incorrect `[interleaved]` section per the PR anomalyco#1254 design decision

MiniMax models should route through `@ai-sdk/anthropic` with per-model provider overrides, not rely on `[interleaved]` for reasoning-content handling.
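The MiniMax change in this commit can be sketched as a before/after TOML diff. The `[provider]`, `npm`, and `[interleaved]` names come from the commit text itself; the field inside `[interleaved]` is assumed for illustration:

```toml
# Before (sketch): reasoning handled via an incorrect [interleaved] section.
# [interleaved]
# field = "reasoning_content"   # field name assumed, not from the diff

# After (sketch): route the model through the Anthropic SDK instead.
[provider]
npm = "@ai-sdk/anthropic"
```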
Summary
Why
- …`[provider]` override.
- …`reasoning_content` needs to be preserved across turns. "Interleaved reasoning_content lost for non-Claude models on Anthropic SDK" (opencode#14638) documents the concrete failure mode for Kimi K2.5 on the Anthropic SDK path: prior assistant `reasoning_content` is dropped on replay because `@ai-sdk/anthropic` only serializes Anthropic `thinking` blocks. Using the OpenAI-compatible path for these models avoids that incompatibility.
- Anthropic and MiniMax models route through `@ai-sdk/anthropic`, while OpenAI models are explicitly routed through `@ai-sdk/openai`, so each family uses the SDK/API path that best matches its protocol and feature set.

Validation