
Commit 95f7fee

fix: add model-name inference for MiMo and GLM/Z.AI in local proxy
inferRemoteModelOpenAIShimConfig() only recognized deepseek, kimi, and moonshot. When using a local proxy (e.g. GRouter) with mimo-v2.5-pro or GLM-5.1, the function returned undefined, so these models got no reasoning config even with the treatAsLocal merge fix. Now mimo and glm patterns are recognized and return the same reasoning config their vendor files define.
1 parent 1c4ce8b commit 95f7fee

1 file changed: 22 additions & 0 deletions

src/integrations/runtimeMetadata.ts
@@ -163,6 +163,28 @@ function inferRemoteModelOpenAIShimConfig(
     }
   }
 
+  if (normalizedModel.includes('mimo')) {
+    return {
+      preserveReasoningContent: true,
+      requireReasoningContentOnAssistantMessages: true,
+      reasoningContentFallback: '',
+      thinkingRequestFormat: 'deepseek-compatible',
+      maxTokensField: 'max_tokens',
+      removeBodyFields: ['store'],
+    }
+  }
+
+  if (normalizedModel.includes('glm')) {
+    return {
+      preserveReasoningContent: true,
+      requireReasoningContentOnAssistantMessages: true,
+      reasoningContentFallback: '',
+      thinkingRequestFormat: 'deepseek-compatible',
+      maxTokensField: 'max_tokens',
+      removeBodyFields: ['store'],
+    }
+  }
+
   return undefined
 }
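The added checks are plain substring matches on the lowercased model name, each returning the same deepseek-compatible reasoning config. A minimal standalone sketch of that behavior (the function and type names here are illustrative, not the actual runtimeMetadata.ts exports):

```typescript
// Sketch of substring-based model-name inference, mirroring the diff above.
// Config field names follow the diff; everything else is hypothetical.
type ReasoningShimConfig = {
  preserveReasoningContent: boolean
  requireReasoningContentOnAssistantMessages: boolean
  reasoningContentFallback: string
  thinkingRequestFormat: string
  maxTokensField: string
  removeBodyFields: string[]
}

function inferShimConfig(model: string): ReasoningShimConfig | undefined {
  const normalizedModel = model.toLowerCase()
  // mimo and glm share the same deepseek-compatible reasoning config.
  if (normalizedModel.includes('mimo') || normalizedModel.includes('glm')) {
    return {
      preserveReasoningContent: true,
      requireReasoningContentOnAssistantMessages: true,
      reasoningContentFallback: '',
      thinkingRequestFormat: 'deepseek-compatible',
      maxTokensField: 'max_tokens',
      removeBodyFields: ['store'],
    }
  }
  return undefined // unrecognized models still fall through
}

console.log(inferShimConfig('mimo-v2.5-pro')?.thinkingRequestFormat) // deepseek-compatible
console.log(inferShimConfig('gpt-4o')) // undefined
```

Because the match is a substring test on the lowercased name, variants like `GLM-5.1` routed through a local proxy are recognized without listing every version explicitly.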

0 commit comments
