Help connecting to minimax, moonshot and zai models via Bedrock #21228
Replies: 3 comments
I use Moonshot Kimi K2.5 directly through their API at api.moonshot.ai/v1 and it works well as an OpenAI-compatible endpoint. If you are having trouble with the Bedrock routing, you might want to try hitting Moonshot directly as a workaround while debugging the Bedrock config. My litellm-style config for the direct route looks like this:

```yaml
model_name: "kimi-k2.5"
litellm_params:
  model: openai/kimi-k2.5
  api_base: https://api.moonshot.ai/v1
  api_key: your-moonshot-api-key
```

The model ID on Moonshot's side is just `kimi-k2.5`. For the Bedrock-specific issue, the model identifier format for newer marketplace models on Bedrock sometimes needs the full ARN instead of the shorthand. Have you tried using the model ARN from your Bedrock console instead of the shorthand ID?
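To illustrate the ARN suggestion, here is a minimal sketch of building the `bedrock/<arn>` model string that litellm's `completion()` accepts. The ARN below is a placeholder; copy the real one from your Bedrock console.

```python
# Sketch: turn a full Bedrock model/inference-profile ARN into the model
# string litellm expects. The ARN here is a PLACEHOLDER, not a real model.

def bedrock_model_string(arn: str) -> str:
    """Prefix a Bedrock ARN with 'bedrock/' for use as litellm's model param."""
    return f"bedrock/{arn}"

# Placeholder ARN; replace with the one shown in your Bedrock console.
arn = "arn:aws:bedrock:us-east-1:123456789012:inference-profile/example-model-id"
model = bedrock_model_string(arn)
print(model)
```

You would then pass `model` to `litellm.completion(model=model, ...)` the same way you pass the shorthand ID today.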
Bedrock integration for non-standard models can be tricky! At RevolutionAI (https://revolutionai.io) we have set up similar custom model routing. For MiniMax/Moonshot/Zai via Bedrock:

```python
import litellm

response = litellm.completion(
    model="bedrock/your-model-arn",
    messages=[...],
    aws_region_name="us-east-1",
)
```

Alternative approach: if Bedrock does not have these models, consider using the LiteLLM router to hit the provider APIs directly with a fallback to Bedrock models.

What specific error are you seeing? That would help narrow down whether it is auth, model availability, or configuration.
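The direct-API-with-Bedrock-fallback idea can be sketched as a router config expressed as a plain Python dict. All model names, the API key, and the ARN below are placeholders; this shows the shape of the config, not working credentials.

```python
# Sketch of a LiteLLM-style router config: hit Moonshot's OpenAI-compatible
# endpoint directly, fall back to a Bedrock deployment of the same model.
# Every name, key, and ARN here is a PLACEHOLDER.

model_list = [
    {
        "model_name": "kimi-k2.5",  # primary: Moonshot's own API
        "litellm_params": {
            "model": "openai/kimi-k2.5",
            "api_base": "https://api.moonshot.ai/v1",
            "api_key": "your-moonshot-api-key",
        },
    },
    {
        "model_name": "kimi-k2.5-bedrock",  # fallback: Bedrock route
        "litellm_params": {
            "model": "bedrock/your-model-arn",  # placeholder ARN
            "aws_region_name": "us-east-1",
        },
    },
]

# On failure of "kimi-k2.5", the router retries against the Bedrock alias.
fallbacks = [{"kimi-k2.5": ["kimi-k2.5-bedrock"]}]
```

You would hand `model_list` and `fallbacks` to LiteLLM's Router (or the equivalent YAML to the proxy); the point is that the fallback alias carries its own provider-specific params.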
Bedrock third-party models have specific model IDs that differ from their marketing names. Get the correct model IDs with:

```shell
aws bedrock list-foundation-models --region us-east-1 | grep -E "minimax|moonshot|zai"
```

Debug step: try with an explicit model ID:

```yaml
model_list:
  - model_name: "MiniMax"
    litellm_params:
      model: bedrock/converse/minimax.minimax-01
      aws_region_name: us-east-1
```

The `converse/` prefix helps with newer Bedrock models. We configure Bedrock + LiteLLM at Revolution AI; the model ID format varies by provider.
Dumb question. All my other AWS Bedrock models are working fine, except these 3. What is the correct config to connect? I confirmed I can use these models via the AWS Console, but I am getting errors for these 3. Running v1.81.11.