This repository contains a shell script that provides intelligent proxy functionality for Claude Code CLI, allowing it to work with multiple AI model providers through a unified interface.
The `.claude.sh` script intercepts Claude Code CLI commands and automatically routes them to the appropriate model provider based on the `--model` parameter. It supports multiple Chinese AI providers while maintaining compatibility with the standard Claude Code interface.
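The exact mechanics live in `.claude.sh`, but the general shape is easy to sketch: a shell function shadows the `claude` binary, inspects the requested model, and points the CLI at the matching provider, typically through the `ANTHROPIC_BASE_URL` and `ANTHROPIC_AUTH_TOKEN` environment variables that Claude Code honors. The snippet below is a minimal illustration of that idea for a single provider, not the script itself:

```bash
# Minimal sketch (illustration only): shadow the claude binary with a shell
# function and redirect one model to its provider's Anthropic-compatible endpoint.
claude() {
  if [[ "$*" == *"glm-4.5"* ]]; then
    ANTHROPIC_BASE_URL="https://open.bigmodel.cn/api/anthropic" \
    ANTHROPIC_AUTH_TOKEN="$ZHIPU_API_KEY" \
    command claude "$@"
  else
    command claude "$@"   # no match: fall through to the default Claude endpoint
  fi
}
```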
The script supports the following model providers and their corresponding models:
| Provider | Environment Variable | Base URL | Supported Models |
|---|---|---|---|
| Zhipu (智谱) | `ZHIPU_API_KEY` | https://open.bigmodel.cn/api/anthropic | `glm-4.5` |
| Alibaba (阿里) | `DASHSCOPE_API_KEY` | https://dashscope.aliyuncs.com/api/v2/apps/claude-code-proxy | `qwen3-coder-plus` |
| DeepSeek | `DEEPSEEK_API_KEY` | https://api.deepseek.com/anthropic | `deepseek-chat` |
| Moonshot (月之暗面) | `MOONSHOT_API_KEY` | https://api.moonshot.cn/anthropic | `kimi-k2-0905-preview` |
| LongCat | `LONGCAT_API_KEY` | https://api.longcat.chat/anthropic | `LongCat-Flash-Chat` |
- Install Claude Code CLI:

  ```bash
  npm install -g @anthropic-ai/claude-code
  ```

- Source the proxy script in your shell configuration:

  ```bash
  # Add to ~/.zshrc or ~/.bashrc
  source /path/to/.claude.sh
  ```

- Set up your API keys for the desired providers:

  ```bash
  export ZHIPU_API_KEY=your_zhipu_key
  export DASHSCOPE_API_KEY=your_alibaba_key
  export DEEPSEEK_API_KEY=your_deepseek_key
  export MOONSHOT_API_KEY=your_moonshot_key
  export LONGCAT_API_KEY=your_longcat_key
  ```
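After reloading your shell configuration, a quick loop like the one below (using `printenv`, so it behaves the same in bash and zsh) confirms which of the keys from the table above are actually exported:

```bash
source ~/.zshrc   # or ~/.bashrc
for key in ZHIPU_API_KEY DASHSCOPE_API_KEY DEEPSEEK_API_KEY MOONSHOT_API_KEY LONGCAT_API_KEY; do
  if [ -n "$(printenv "$key")" ]; then echo "$key: set"; else echo "$key: not set"; fi
done
```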
Use Claude Code as normal, but specify the desired model with the `--model` parameter:
```bash
# Use Zhipu's GLM-4.5 model
claude --model glm-4.5

# Use Alibaba's Qwen3 Coder Plus
claude --model qwen3-coder-plus

# Use DeepSeek Chat
claude --model deepseek-chat

# Use Moonshot's Kimi K2
claude --model kimi-k2-0905-preview

# Use LongCat's model
claude --model LongCat-Flash-Chat

# Use default Claude model (no proxy)
claude
```

The proxy script provides:

- Automatic Provider Detection: The script automatically detects which provider to use based on the model name
- Environment Variable Validation: Checks that required API keys are set before routing requests
- Transparent Proxy: Maintains full compatibility with Claude Code CLI syntax and options
- Error Handling: Provides clear error messages when API keys are missing or invalid
- Flexible Model Parameter: Supports both `--model model_name` and `--model=model_name` formats
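Handling both flag forms comes down to a small argument scan. The POSIX-style loop below is illustrative; the variable names are assumptions, not necessarily what `.claude.sh` uses:

```bash
# Illustrative parser for "--model <name>" and "--model=<name>"
model=""
while [ $# -gt 0 ]; do
  case "$1" in
    --model)   model="$2"; shift ;;   # value is the next argument
    --model=*) model="${1#--model=}" ;;
  esac
  shift
done
echo "requested model: ${model:-<default>}"
```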
The script uses an associative array (`PROVIDERS`) to configure provider mappings. Each entry contains:
- Environment variable name for the API key
- Base URL for the provider's Anthropic-compatible API
- Comma-separated list of supported models
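A mapping in that shape might look roughly like the following (bash 4+); the array keys, field order, and `|` separator are assumptions for illustration, and only the values come from the table above:

```bash
# Illustrative layout: "API key variable | base URL | comma-separated models"
declare -A PROVIDERS=(
  [zhipu]="ZHIPU_API_KEY|https://open.bigmodel.cn/api/anthropic|glm-4.5"
  [alibaba]="DASHSCOPE_API_KEY|https://dashscope.aliyuncs.com/api/v2/apps/claude-code-proxy|qwen3-coder-plus"
  [deepseek]="DEEPSEEK_API_KEY|https://api.deepseek.com/anthropic|deepseek-chat"
  [moonshot]="MOONSHOT_API_KEY|https://api.moonshot.cn/anthropic|kimi-k2-0905-preview"
  [longcat]="LONGCAT_API_KEY|https://api.longcat.chat/anthropic|LongCat-Flash-Chat"
)

# Split one entry back into its three fields
IFS='|' read -r key_var base_url models <<< "${PROVIDERS[deepseek]}"
```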
The script provides helpful error messages:
- CLI Not Found: Instructions for installing Claude Code if not detected
- Missing API Key: Specific environment variable name needed for each provider
- Provider Routing: Confirmation when successfully routing to a specific provider
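The checks behind those messages amount to a few guard clauses. The function name and message wording below are illustrative, not copied from the script:

```bash
# Illustrative guard clauses for the three cases above
route_to_provider() {
  local provider="$1" key_var="$2"
  if ! command -v claude >/dev/null 2>&1; then
    echo "Error: Claude Code CLI not found. Install it: npm install -g @anthropic-ai/claude-code" >&2
    return 1
  fi
  if [ -z "$(printenv "$key_var")" ]; then
    echo "Error: $key_var is not set; it is required to use $provider models." >&2
    return 1
  fi
  echo "Routing request to $provider" >&2
}
```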
All proxied requests use a 10-minute timeout (`API_TIMEOUT_MS=600000`) to accommodate longer processing times for complex tasks.
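In environment-variable terms, that presumably comes down to the script exporting something like:

```bash
# 10 minutes, in milliseconds
export API_TIMEOUT_MS=600000
```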
This setup includes support for Model Context Protocol (MCP) servers to enhance AI assistant capabilities with additional tools and resources.
- `cf-docs`: Cloudflare documentation search and resources
- `context7`: Up-to-date library documentation and code examples
- `chrome-devtools`: Browser developer tools integration
For detailed MCP setup instructions and commands for different AI assistants (Claude, Gemini, Codex), see MCP.md.
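As a rough shape, registering one of these servers with the Claude CLI looks something like the command below; the package name is an assumption, so check MCP.md for the exact commands used in this setup:

```bash
# Hypothetical example: register the context7 server with Claude Code
claude mcp add context7 -- npx -y @upstash/context7-mcp
```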
MCP servers provide additional tools like:
- Access to current documentation for various libraries
- Cloudflare product documentation and resources
- Browser debugging and inspection capabilities
- Real-time code examples and API references