
feat: add MiniMax as LLM provider (M2.7/M2.5/M2.5-highspeed) #120

Open
octo-patch wants to merge 1 commit into xiangsx:master from octo-patch:add-minimax-provider


@octo-patch

Summary

Add MiniMax (https://platform.minimaxi.com/) as a first-class LLM provider, supporting the latest MiniMax-M2.7, MiniMax-M2.5, and MiniMax-M2.5-highspeed models via MiniMax's OpenAI-compatible API.

Changes

  • model/minimax/index.ts: New MiniMax provider extending the Chat base class, with streaming support, temperature clamping to the [0, 1.0] range, configurable base_url/api_key/proxy, and a MINIMAX_API_KEY environment-variable fallback
  • model/base.ts: Add MiniMaxM2_7, MiniMaxM2_5, MiniMaxM2_5_highspeed to ModelType enum; add Site.MiniMax
  • model/index.ts: Register MiniMax provider in ChatModelFactory
  • utils/config.ts: Add minimax config section (api_key, base_url, proxy)
  • README.md / README_zh.md: Document MiniMax provider and MINIMAX_API_KEY env var
  • jest.config.js + tests/minimax/: 15 unit tests + 3 integration tests
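Two of the behaviors listed for the provider (temperature clamping and the env-var key fallback) can be sketched in isolation. This is an illustrative TypeScript sketch under assumed names (`MiniMaxOptions`, `clampTemperature`, `resolveApiKey`), not the PR's actual code:

```typescript
// Illustrative sketch only — not the code in model/minimax/index.ts.

interface MiniMaxOptions {
  api_key?: string;
  base_url?: string;
  proxy?: string;
}

// Clamp any requested temperature into the [0, 1.0] range the PR describes.
function clampTemperature(t: number): number {
  return Math.max(0, Math.min(1.0, t));
}

// Resolve the API key: explicit config wins, otherwise fall back to
// the MINIMAX_API_KEY environment variable.
function resolveApiKey(opts: MiniMaxOptions): string | undefined {
  return opts.api_key ?? process.env.MINIMAX_API_KEY;
}
```

For example, `clampTemperature(1.7)` returns `1.0`, and `resolveApiKey({})` picks up `MINIMAX_API_KEY` when it is set.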

Supported Models

| Model | Context Window |
| --- | --- |
| MiniMax-M2.7 | 1,048,576 tokens |
| MiniMax-M2.5 | 1,048,576 tokens |
| MiniMax-M2.5-highspeed | 204,800 tokens |
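Callers could keep the context windows above in a lookup to trim history before hitting a model's limit. A minimal sketch; the constant and function names are illustrative, not part of the PR:

```typescript
// Hypothetical lookup of the context-window sizes from the table above.
const MINIMAX_CONTEXT_WINDOWS: Record<string, number> = {
  'MiniMax-M2.7': 1_048_576,
  'MiniMax-M2.5': 1_048_576,
  'MiniMax-M2.5-highspeed': 204_800,
};

// True if the prompt fits within the named model's context window.
function fitsContextWindow(model: string, tokenCount: number): boolean {
  const limit = MINIMAX_CONTEXT_WINDOWS[model];
  return limit !== undefined && tokenCount <= limit;
}
```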

Test Plan

  • 15 unit tests pass (model support, constructor, askStream, temperature clamping, error handling, env fallback)
  • 3 integration tests pass (streaming M2.7, streaming M2.5-highspeed, multi-turn conversation)
  • Manual testing with MINIMAX_API_KEY set, via the /ask and /ask/stream endpoints
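For the streaming tests, an OpenAI-compatible endpoint typically emits server-sent-event lines whose delta text sits at `choices[0].delta.content`. A minimal sketch of extracting that text from one chunk, assuming the standard OpenAI streaming chunk shape (this is not the PR's askStream implementation):

```typescript
// Hypothetical parser for one SSE line from an OpenAI-compatible stream.
// The choices[0].delta.content shape is the standard OpenAI streaming
// format the PR says MiniMax's API is compatible with.
function extractDelta(line: string): string | null {
  if (!line.startsWith('data: ')) return null; // ignore comments/other events
  const payload = line.slice('data: '.length).trim();
  if (payload === '[DONE]') return null; // end-of-stream sentinel
  try {
    const parsed = JSON.parse(payload);
    return parsed.choices?.[0]?.delta?.content ?? null;
  } catch {
    return null; // malformed chunk: skip rather than crash the stream
  }
}
```

For example, `extractDelta('data: {"choices":[{"delta":{"content":"Hello"}}]}')` yields `"Hello"`, while `data: [DONE]` and keep-alive comment lines yield `null`.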

