1 change: 1 addition & 0 deletions README.md
@@ -140,6 +140,7 @@ Advanced and source-build guides:
| Codex OAuth | `/provider` | Opens ChatGPT sign-in in your browser and stores Codex credentials securely |
| Codex | `/provider` | Uses existing Codex CLI auth, OpenClaude secure storage, or env credentials |
| Xiaomi MiMo | `/provider` or env vars | OpenAI-compatible API at `https://api.xiaomimimo.com/v1`; uses `MIMO_API_KEY` and defaults to `mimo-v2.5-pro` |
| Cloudflare Workers AI | `/provider` or env vars | OpenAI-compatible API at `https://api.cloudflare.com/client/v4/accounts/<ACCOUNT_ID>/ai/v1`; uses `CLOUDFLARE_API_TOKEN`. Replace `<ACCOUNT_ID>` with your Cloudflare account id. |
| Ollama | `/provider` or env vars | Local inference with no API key |
| Atomic Chat | `/provider`, env vars, or `bun run dev:atomic-chat` | Local Model Provider; auto-detects loaded models |
| Bedrock / Vertex / Foundry | env vars | Additional provider integrations for supported environments |
11 changes: 11 additions & 0 deletions docs/advanced-setup.md
@@ -164,6 +164,17 @@ export OPENAI_MODEL=mimo-v2.5-pro

The `/provider` Xiaomi MiMo preset uses the same endpoint and stores the key as `MIMO_API_KEY`. `OPENAI_API_KEY` also works as a compatibility fallback, but `MIMO_API_KEY` keeps the profile tied to the MiMo route.

### Cloudflare Workers AI

```bash
export CLAUDE_CODE_USE_OPENAI=1
export CLOUDFLARE_API_TOKEN=...
export OPENAI_BASE_URL=https://api.cloudflare.com/client/v4/accounts/<ACCOUNT_ID>/ai/v1
export OPENAI_MODEL=@cf/meta/llama-3.3-70b-instruct-fp8-fast
```

Replace `<ACCOUNT_ID>` with your Cloudflare account id (visible in the Cloudflare dashboard URL). `OPENAI_API_KEY` also works as a compatibility fallback, but `CLOUDFLARE_API_TOKEN` keeps the profile tied to the Cloudflare preset. The `/provider` Cloudflare Workers AI preset stores the token under `CLOUDFLARE_API_TOKEN`.
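For illustration, the placeholder substitution described above can be sketched as a tiny helper (hypothetical names, not part of OpenClaude; TypeScript to match the codebase):

```typescript
// Hypothetical helper: substitute a Cloudflare account id into the
// placeholder base URL used by the Workers AI preset.
const CLOUDFLARE_BASE_URL_TEMPLATE =
  'https://api.cloudflare.com/client/v4/accounts/<ACCOUNT_ID>/ai/v1'

function resolveCloudflareBaseUrl(accountId: string): string {
  return CLOUDFLARE_BASE_URL_TEMPLATE.replace('<ACCOUNT_ID>', accountId)
}

// e.g. resolveCloudflareBaseUrl('abc123')
// → 'https://api.cloudflare.com/client/v4/accounts/abc123/ai/v1'
```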

### Mistral

```bash
26 changes: 26 additions & 0 deletions src/commands/provider/provider.test.tsx
@@ -369,6 +369,32 @@ test('buildProfileSaveMessage labels descriptor-backed Venice profiles consisten
expect(message).not.toContain('sk-venice-secret-12345678')
})

test('buildProfileSaveMessage labels descriptor-backed Cloudflare Workers AI profiles consistently', () => {
// Cloudflare URLs embed the user's account id, so the saved profile reflects
// the literal substituted URL. Make sure the label still routes through the
// descriptor preset rather than falling back to a generic "OpenAI-compatible"
// string, matching the Venice / MiMo / Bankr coverage above.
const message = buildProfileSaveMessage(
'openai',
{
OPENAI_API_KEY: 'cf-secret-token-12345678',
CLOUDFLARE_API_TOKEN: 'cf-secret-token-12345678',
OPENAI_MODEL: '@cf/meta/llama-3.3-70b-instruct-fp8-fast',
OPENAI_BASE_URL:
'https://api.cloudflare.com/client/v4/accounts/abc123/ai/v1',
},
'D:/codings/Opensource/openclaude/.openclaude-profile.json',
)

expect(message).toContain('Saved Cloudflare Workers AI profile.')
expect(message).toContain('Model: @cf/meta/llama-3.3-70b-instruct-fp8-fast')
expect(message).toContain(
'Endpoint: https://api.cloudflare.com/client/v4/accounts/abc123/ai/v1',
)
expect(message).toContain('Credentials: configured')
expect(message).not.toContain('cf-secret-token-12345678')
})

test('buildProfileSaveMessage describes Gemini access token / ADC mode clearly', () => {
const message = buildProfileSaveMessage(
'gemini',
1 change: 1 addition & 0 deletions src/integrations/compatibility.test.ts
@@ -40,6 +40,7 @@ const EXPECTED_PRESETS = [
'zai',
'bankr',
'atomic-chat',
'cloudflare',
'gitlawb-opengateway',
] as const satisfies readonly ProviderPreset[]

19 changes: 18 additions & 1 deletion src/integrations/generated/integrationArtifacts.generated.ts
@@ -4,6 +4,7 @@
import type { AnthropicProxyDescriptor, BrandDescriptor, GatewayDescriptor, ModelDescriptor, ProviderPresetManifestEntry, VendorDescriptor } from '../descriptors.js'
import vendorAnthropic from '../vendors/anthropic.js'
import vendorBankr from '../vendors/bankr.js'
import vendorCloudflare from '../vendors/cloudflare.js'
import vendorDeepseek from '../vendors/deepseek.js'
import vendorGemini from '../vendors/gemini.js'
import vendorMinimax from '../vendors/minimax.js'
@@ -60,7 +61,7 @@ import modelQwen from '../models/qwen.js'
import modelXai from '../models/xai.js'
import modelXiaomiMimo from '../models/xiaomi-mimo.js'

export const VENDOR_DESCRIPTORS = [vendorAnthropic, vendorBankr, vendorDeepseek, vendorGemini, vendorMinimax, vendorMoonshot, vendorOpenai, vendorVenice, vendorXai, vendorXiaomiMimo, vendorZai] as const satisfies readonly VendorDescriptor[]
export const VENDOR_DESCRIPTORS = [vendorAnthropic, vendorBankr, vendorCloudflare, vendorDeepseek, vendorGemini, vendorMinimax, vendorMoonshot, vendorOpenai, vendorVenice, vendorXai, vendorXiaomiMimo, vendorZai] as const satisfies readonly VendorDescriptor[]
export const GATEWAY_DESCRIPTORS = [gatewayAtomicChat, gatewayAzureOpenai, gatewayBedrock, gatewayCustom, gatewayDashscopeCn, gatewayDashscopeIntl, gatewayGithub, gatewayGitlawbOpengateway, gatewayGroq, gatewayHicap, gatewayKimiCode, gatewayLmstudio, gatewayMistral, gatewayNvidiaNim, gatewayOllama, gatewayOpenrouter, gatewayTogether, gatewayVertex] as const satisfies readonly GatewayDescriptor[]
export const ANTHROPIC_PROXY_DESCRIPTORS = [] as const satisfies readonly AnthropicProxyDescriptor[]
export const BRAND_DESCRIPTORS = [brandClaude, brandDeepseek, brandGemini, brandGlm, brandGpt, brandKimi, brandLlama, brandMinimax, brandMistral, brandNemotron, brandOpenaiCompatibleAlias, brandQwen, brandXai, brandXiaomiMimo] as const satisfies readonly BrandDescriptor[]
@@ -150,6 +151,21 @@ export const PROVIDER_PRESET_MANIFEST = [
"OPENAI_MODEL"
]
},
{
"preset": "cloudflare",
"routeKind": "vendor",
"routeId": "cloudflare",
"vendorId": "cloudflare",
"description": "Cloudflare Workers AI OpenAI-compatible endpoint. Replace <ACCOUNT_ID> in the base URL with your Cloudflare account id.",
"label": "Cloudflare Workers AI",
"name": "Cloudflare Workers AI",
"apiKeyEnvVars": [
"CLOUDFLARE_API_TOKEN"
],
"modelEnvVars": [
"OPENAI_MODEL"
]
},
{
"preset": "deepseek",
"routeKind": "vendor",
@@ -401,6 +417,7 @@ export const ORDERED_PROVIDER_PRESETS = [
"dashscope-intl",
"azure-openai",
"bankr",
"cloudflare",
"deepseek",
"gemini",
"groq",
84 changes: 84 additions & 0 deletions src/integrations/vendors/cloudflare.ts
@@ -0,0 +1,84 @@
import { defineVendor } from '../define.js'

// Cloudflare Workers AI exposes an OpenAI-compatible endpoint scoped to the
// caller's Cloudflare account id. The base URL contains `<ACCOUNT_ID>` as a
// literal placeholder: users must substitute their account id via the
// `/provider` baseUrl edit (or by setting `OPENAI_BASE_URL` directly) before
// requests will succeed. This is the same shape `docs/advanced-setup.md`
// already uses for Azure (`https://your-resource.openai.azure.com/...`). See
// issue #1100.
//
// A first-class AI Gateway integration with `gateway_id` URL templating and
// dynamic `/models` discovery on the Groq #1143 / mapModel pattern are clean
// follow-ups left for separate PRs.
export default defineVendor({
id: 'cloudflare',
label: 'Cloudflare Workers AI',
classification: 'openai-compatible',
defaultBaseUrl:
'https://api.cloudflare.com/client/v4/accounts/<ACCOUNT_ID>/ai/v1',
defaultModel: '@cf/meta/llama-3.3-70b-instruct-fp8-fast',
requiredEnvVars: ['CLOUDFLARE_API_TOKEN'],
setup: {
requiresAuth: true,
authMode: 'api-key',
credentialEnvVars: ['CLOUDFLARE_API_TOKEN'],
},
transportConfig: {
kind: 'openai-compatible',
openaiShim: {
maxTokensField: 'max_tokens',
// Workers AI rejects unknown OpenAI body fields (`store`, persistence
// flags) — mirror the Mistral / Gemini / Cerebras strip pattern.
removeBodyFields: ['store'],
supportsApiFormatSelection: false,
supportsAuthHeaders: false,
},
},
preset: {
id: 'cloudflare',
description:
'Cloudflare Workers AI OpenAI-compatible endpoint. Replace <ACCOUNT_ID> in the base URL with your Cloudflare account id.',
label: 'Cloudflare Workers AI',
name: 'Cloudflare Workers AI',
apiKeyEnvVars: ['CLOUDFLARE_API_TOKEN'],
modelEnvVars: ['OPENAI_MODEL'],
},
validation: {
kind: 'credential-env',
routing: {
// `<ACCOUNT_ID>` placeholder won't match a real URL, so rely on host
// matching to associate user-edited Cloudflare URLs back to this preset.
matchDefaultBaseUrl: false,
matchBaseUrlHosts: ['api.cloudflare.com', 'gateway.ai.cloudflare.com'],
},
credentialEnvVars: ['CLOUDFLARE_API_TOKEN', 'OPENAI_API_KEY'],
missingCredentialMessage:
'Cloudflare Workers AI auth is required. Set CLOUDFLARE_API_TOKEN or OPENAI_API_KEY.',
},
catalog: {
source: 'static',
models: [
{
id: '@cf/meta/llama-3.3-70b-instruct-fp8-fast',
apiName: '@cf/meta/llama-3.3-70b-instruct-fp8-fast',
label: 'Llama 3.3 70B Instruct (FP8 Fast)',
},
{
id: '@cf/meta/llama-3.1-8b-instruct',
apiName: '@cf/meta/llama-3.1-8b-instruct',
label: 'Llama 3.1 8B Instruct',
},
{
id: '@cf/deepseek-ai/deepseek-r1-distill-qwen-32b',
apiName: '@cf/deepseek-ai/deepseek-r1-distill-qwen-32b',
label: 'DeepSeek R1 Distill Qwen 32B',
},
{
id: '@cf/qwen/qwen2.5-coder-32b-instruct',
apiName: '@cf/qwen/qwen2.5-coder-32b-instruct',
label: 'Qwen 2.5 Coder 32B Instruct',
},
],
},
usage: { supported: false },
})
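The `matchBaseUrlHosts` routing above, where a user-edited URL is associated back to the preset by hostname because the `<ACCOUNT_ID>` placeholder never equals a real base URL, can be illustrated with a minimal standalone sketch (hypothetical helper, not the actual descriptor machinery):

```typescript
// Hypothetical sketch of host-based preset routing: match user-edited
// Cloudflare base URLs to the preset by hostname rather than by exact URL.
const CLOUDFLARE_HOSTS = ['api.cloudflare.com', 'gateway.ai.cloudflare.com']

function matchesCloudflarePreset(baseUrl: string): boolean {
  try {
    return CLOUDFLARE_HOSTS.includes(new URL(baseUrl).hostname)
  } catch {
    return false // not a parseable URL
  }
}
```

Any account-scoped URL on either host matches, so the association survives the user substituting their own account id.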
5 changes: 4 additions & 1 deletion src/utils/providerFlag.ts
@@ -203,7 +203,10 @@ export function applyProviderFlag(
: process.env.OPENAI_API_KEY !== undefined &&
process.env.OPENAI_API_KEY === process.env.MINIMAX_API_KEY
? 'minimax'
: null
: process.env.OPENAI_API_KEY !== undefined &&
process.env.OPENAI_API_KEY === process.env.CLOUDFLARE_API_TOKEN
? 'cloudflare'
: null

delete process.env.CLAUDE_CODE_USE_OPENAI
delete process.env.CLAUDE_CODE_USE_GEMINI
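The ternary cascade in this hunk infers which preset mirrored its secret into `OPENAI_API_KEY` by comparing env values; a standalone sketch of the idea (hypothetical names, simplified to two presets):

```typescript
// Hypothetical sketch of the mirrored-credential check: when OPENAI_API_KEY
// equals a provider-specific secret, assume the profile came from that preset.
type Env = Record<string, string | undefined>

function inferPresetFromEnv(env: Env): 'minimax' | 'cloudflare' | null {
  const key = env.OPENAI_API_KEY
  if (key === undefined) return null
  if (key === env.MINIMAX_API_KEY) return 'minimax'
  if (key === env.CLOUDFLARE_API_TOKEN) return 'cloudflare'
  return null
}
```

The `key === undefined` guard matters: without it, an env with neither variable set would spuriously match on `undefined === undefined`.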
6 changes: 5 additions & 1 deletion src/utils/providerProfile.ts
@@ -93,6 +93,7 @@ const PROFILE_ENV_KEYS = [
'XAI_API_KEY',
'VENICE_API_KEY',
'MIMO_API_KEY',
'CLOUDFLARE_API_TOKEN',
] as const

export type CompatibilityProfileMode =
@@ -117,6 +118,7 @@ const SECRET_ENV_KEYS = [
'XAI_API_KEY',
'VENICE_API_KEY',
'MIMO_API_KEY',
'CLOUDFLARE_API_TOKEN',
] as const

export type ProviderProfile =
@@ -173,6 +175,7 @@ export type ProfileEnv = {
XAI_API_KEY?: string
VENICE_API_KEY?: string
MIMO_API_KEY?: string
CLOUDFLARE_API_TOKEN?: string
}

export type ProfileFile = {
@@ -194,7 +197,8 @@ type SecretValueSource = Partial<
| 'BNKR_API_KEY'
| 'XAI_API_KEY'
| 'VENICE_API_KEY'
| 'MIMO_API_KEY',
| 'MIMO_API_KEY'
| 'CLOUDFLARE_API_TOKEN',
string | undefined
>
>
39 changes: 39 additions & 0 deletions src/utils/providerProfiles.test.ts
@@ -210,6 +210,19 @@ function buildXiaomiMimoProfile(overrides: Partial<ProviderProfile> = {}): Provi
})
}

function buildCloudflareProfile(overrides: Partial<ProviderProfile> = {}): ProviderProfile {
return buildProfile({
provider: 'cloudflare',
name: 'Cloudflare Workers AI',
// Account-scoped URL: users substitute their account id for `<ACCOUNT_ID>`.
// Tests use a literal id so host-matching for the descriptor is exercised.
baseUrl: 'https://api.cloudflare.com/client/v4/accounts/abc123/ai/v1',
model: '@cf/meta/llama-3.3-70b-instruct-fp8-fast',
apiKey: 'cloudflare-test-token',
...overrides,
})
}

describe('applyProviderProfileToProcessEnv', () => {
test('openai profile clears competing gemini/github flags', async () => {
const { applyProviderProfileToProcessEnv } =
@@ -530,6 +543,32 @@ describe('applyProviderProfileToProcessEnv', () => {
expect(getFreshAPIProvider()).toBe('xiaomi-mimo')
})

test('cloudflare profile applies OpenAI-compatible env with CLOUDFLARE_API_TOKEN mirror', async () => {
// Account-scoped URL: a real user has substituted their Cloudflare account
// id for `<ACCOUNT_ID>`. The env-build path should mirror the api key into
// `CLOUDFLARE_API_TOKEN` so the descriptor's host-based route detection
// picks the cloudflare preset back up on the next reload.
const { applyProviderProfileToProcessEnv } =
await importFreshProviderProfileModules()
process.env.CLAUDE_CODE_USE_GEMINI = '1'

applyProviderProfileToProcessEnv(buildCloudflareProfile())
const { getAPIProvider: getFreshAPIProvider } =
await importFreshProvidersModule()

expect(process.env.CLAUDE_CODE_USE_GEMINI).toBeUndefined()
expect(String(process.env.CLAUDE_CODE_USE_OPENAI)).toBe('1')
expect(process.env.OPENAI_BASE_URL).toBe(
'https://api.cloudflare.com/client/v4/accounts/abc123/ai/v1',
)
expect(process.env.OPENAI_MODEL).toBe(
'@cf/meta/llama-3.3-70b-instruct-fp8-fast',
)
expect(process.env.OPENAI_API_KEY).toBe('cloudflare-test-token')
expect(process.env.CLOUDFLARE_API_TOKEN).toBe('cloudflare-test-token')
expect(getFreshAPIProvider()).toBe('openai')
})

test('xiaomi mimo profile normalizes stale docs endpoint to resolving API host', async () => {
const { applyProviderProfileToProcessEnv } =
await importFreshProviderProfileModules()
14 changes: 14 additions & 0 deletions src/utils/providerProfiles.ts
@@ -541,6 +541,11 @@ function isProcessEnvAlignedWithProfile(
profile.baseUrl?.toLowerCase().includes('api.mimo-v2.com')
? !includeApiKey ||
sameOptionalEnvValue(processEnv.MIMO_API_KEY, profile.apiKey)
: true) &&
(profile.baseUrl?.toLowerCase().includes('api.cloudflare.com') ||
profile.baseUrl?.toLowerCase().includes('gateway.ai.cloudflare.com')
? !includeApiKey ||
sameOptionalEnvValue(processEnv.CLOUDFLARE_API_TOKEN, profile.apiKey)
: true)
)
}
@@ -666,6 +671,9 @@ export function applyProviderProfileToProcessEnv(profile: ProviderProfile): void
if (route.routeId === 'xiaomi-mimo' || profile.baseUrl.toLowerCase().includes('api.xiaomimimo.com') || profile.baseUrl.toLowerCase().includes('api.mimo-v2.com')) {
openAIProfileEnv.MIMO_API_KEY = profile.apiKey
}
if (route.routeId === 'cloudflare' || profile.baseUrl.toLowerCase().includes('api.cloudflare.com') || profile.baseUrl.toLowerCase().includes('gateway.ai.cloudflare.com')) {
openAIProfileEnv.CLOUDFLARE_API_TOKEN = profile.apiKey
}
}
if (route.gatewayId === 'nvidia-nim') {
openAIProfileEnv.NVIDIA_NIM = '1'
@@ -973,6 +981,12 @@ function buildOpenAICompatibleStartupEnv(
) {
env.MIMO_API_KEY = activeProfile.apiKey
}
if (
activeProfile.baseUrl?.toLowerCase().includes('api.cloudflare.com') ||
activeProfile.baseUrl?.toLowerCase().includes('gateway.ai.cloudflare.com')
) {
env.CLOUDFLARE_API_TOKEN = activeProfile.apiKey
}
} else {
delete env.OPENAI_API_KEY
}