
feat: upgrade MiniMax provider with M2.7 models and OpenAI-compatible API #2418

Open

octo-patch wants to merge 1 commit into open-compass:main from octo-patch:feature/upgrade-minimax-m2.7

Conversation

@octo-patch

Summary

Upgrade the MiniMax LLM provider integration to support the latest models and API endpoint:

  • New MiniMaxAPI class with full-featured OpenAI-compatible API support via https://api.minimax.io/v1/chat/completions
  • Latest models: MiniMax-M2.7, MiniMax-M2.5, MiniMax-M2.5-highspeed (up to 204K context)
  • Environment variable support: MINIMAX_API_KEY with multi-key rotation
  • Temperature clamping: Automatically clamps to MiniMax range [0, 1.0]
  • Reasoning content handling: think_tag support for M2.5/M2.7 thinking models, inline <think>...</think> tag stripping
  • System prompt support
  • Model config files for easy evaluation setup
  • 20 unit tests + 3 integration tests

The existing MiniMax and MiniMaxChatCompletionV2 classes are preserved for backward compatibility, with the default API URL updated from api.minimax.chat to api.minimax.io.
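The inline `<think>...</think>` stripping described above can be sketched with a small regex helper. This is an illustrative sketch, not the PR's actual implementation; the `strip_think_tags` name is an assumption.

```python
import re

# Match a <think>...</think> block plus any trailing whitespace.
# re.DOTALL lets the reasoning span multiple lines.
THINK_TAG_RE = re.compile(r"<think>.*?</think>\s*", re.DOTALL)


def strip_think_tags(text: str) -> str:
    """Remove inline <think>...</think> reasoning blocks from model output."""
    return THINK_TAG_RE.sub("", text).strip()


raw = "<think>The user asks 2+2, which is 4.</think>The answer is 4."
print(strip_think_tags(raw))  # -> "The answer is 4."
```

The non-greedy `.*?` keeps multiple think blocks in one response from being merged into a single match.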

Changes

| File | Change |
| --- | --- |
| `opencompass/models/minimax_api.py` | Add `MiniMaxAPI` class, update default URL |
| `opencompass/models/__init__.py` | Export `MiniMaxAPI` |
| `opencompass/configs/models/minimax/` | Config files for M2.7, M2.5, M2.5-highspeed |
| `tests/models/test_minimax_api.py` | 20 unit tests + 3 integration tests |

Usage

```python
from opencompass.models import MiniMaxAPI

model = MiniMaxAPI(
    path="MiniMax-M2.7",
    key="ENV",  # reads from MINIMAX_API_KEY
    temperature=0.7,
)
results = model.generate(["What is 2+2?"], max_out_len=500)
```
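The `key="ENV"` multi-key rotation could be sketched as below. The comma-separated key format and the `load_keys` helper are assumptions for illustration only; the PR's actual parsing and rotation logic may differ.

```python
import os
from itertools import cycle


def load_keys(env_var: str = "MINIMAX_API_KEY"):
    """Read one or more API keys from the environment and return a
    round-robin iterator over them (comma-separated format assumed)."""
    raw = os.environ.get(env_var, "")
    keys = [k.strip() for k in raw.split(",") if k.strip()]
    if not keys:
        raise ValueError(f"{env_var} is not set")
    return cycle(keys)


# Demo: two keys rotate round-robin across successive requests.
os.environ["MINIMAX_API_KEY"] = "key-a,key-b"
rotation = load_keys()
first_three = [next(rotation) for _ in range(3)]
print(first_three)  # -> ['key-a', 'key-b', 'key-a']
```

Rotating keys per request spreads evaluation traffic across accounts, which helps when a single key's rate limit would throttle a large benchmark run.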

Test plan

  • 20 unit tests covering initialization, key rotation, temperature clamping, think tag handling, system prompt, retry logic
  • 3 integration tests with real MiniMax API (M2.7, M2.5-highspeed, temperature)
  • CI pipeline validation
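A unit test for the temperature-clamping behavior might look like the following; the `clamp_temperature` helper is a hypothetical stand-in for whatever the `MiniMaxAPI` class does internally.

```python
def clamp_temperature(t: float) -> float:
    """Clamp a sampling temperature to MiniMax's supported [0, 1.0] range."""
    return max(0.0, min(1.0, t))


# Unit-test style assertions: out-of-range values are clamped,
# in-range values pass through unchanged.
assert clamp_temperature(1.5) == 1.0
assert clamp_temperature(-0.2) == 0.0
assert clamp_temperature(0.7) == 0.7
```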

feat: upgrade MiniMax provider with M2.7 models and OpenAI-compatible API

Add MiniMaxAPI class supporting the latest MiniMax models (MiniMax-M2.7,
MiniMax-M2.5, MiniMax-M2.5-highspeed) via the OpenAI-compatible
/v1/chat/completions endpoint at api.minimax.io.

Key improvements over existing MiniMax/MiniMaxChatCompletionV2:
- Environment variable support (MINIMAX_API_KEY) with multi-key rotation
- Temperature clamping to MiniMax's [0, 1.0] range
- Reasoning content handling (think_tag) for M2.5/M2.7 models
- Inline <think>...</think> tag stripping
- System prompt support
- Default API URL updated from api.minimax.chat to api.minimax.io
- Model config files for MiniMax-M2.7, M2.5, and M2.5-highspeed
- 20 unit tests + 3 integration tests

Backward compatible: existing MiniMax and MiniMaxChatCompletionV2 classes
are preserved with updated default URL.
