Run OpenCode inside Cloudflare Sandboxes! Just open the worker URL in your browser to get the full OpenCode web experience.
- Copy `.dev.vars.example` to `.dev.vars` and add your Anthropic API key:

  ```sh
  cp .dev.vars.example .dev.vars
  # Edit .dev.vars with your ANTHROPIC_API_KEY
  ```

- Install dependencies and run:

  ```sh
  npm install
  npm run dev
  ```

- Open http://localhost:8787 in your browser - you'll see the OpenCode web UI!
The worker acts as a transparent proxy to OpenCode running in the container:
```
Browser → Worker → Sandbox DO → Container :4096 → OpenCode Server
                                                        ↓
                                  Proxies UI from desktop.dev.opencode.ai
```
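The proxy step is essentially a URL rewrite: whatever path the browser requests is forwarded unchanged to OpenCode on port 4096 inside the container. A minimal sketch of that rewrite (the helper name and host below are illustrative, not the actual source):

```typescript
// Sketch of the worker's transparent-proxy rewrite (names are illustrative).
// Every incoming path and query string is forwarded unchanged to OpenCode
// listening on port 4096 inside the container.

const OPENCODE_PORT = 4096;

// Rewrite a public worker URL to the container-internal OpenCode URL.
function toContainerUrl(incoming: string, containerHost: string): string {
  const url = new URL(incoming);
  return `http://${containerHost}:${OPENCODE_PORT}${url.pathname}${url.search}`;
}

console.log(toContainerUrl('http://localhost:8787/session/abc?x=1', 'container'));
// → http://container:4096/session/abc?x=1
```

In the real worker, the rewritten URL is used to `fetch` into the sandbox and the response is streamed straight back to the browser, which is why the UI behaves as if OpenCode were served directly.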
OpenCode handles everything:

- API routes (`/session/*`, `/event`, etc.)
- Web UI (proxied from `desktop.dev.opencode.ai`)
- WebSocket for terminal
- Web UI - Full browser-based OpenCode experience
- Isolated execution - Code runs in secure sandbox containers
- Persistent sessions - Sessions survive across requests
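Persistence falls out of Durable Object routing: as long as requests resolve to the same sandbox ID, they reach the same container and the same running OpenCode process. A sketch of one possible ID scheme (the per-user derivation here is an assumption for illustration, not the actual source):

```typescript
// Sketch: a stable sandbox ID means Cloudflare routes every matching request
// to the same Durable Object, so the OpenCode process and its sessions persist.
// Deriving the ID from a user identifier is one hypothetical scheme.
function sandboxIdFor(userId: string | null): string {
  return userId ? `opencode-${userId}` : 'opencode-default';
}

console.log(sandboxIdFor('alice')); // → opencode-alice
console.log(sandboxIdFor(null));   // → opencode-default
```

The resulting string would be passed to the sandbox lookup in the worker; every request that produces the same ID lands on the same instance.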
You can route AI requests through Cloudflare AI Gateway for monitoring, caching, and rate limiting. With unified billing, the gateway handles provider API keys — you only need your Cloudflare credentials.
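For context, AI Gateway exposes a per-account, per-gateway base URL that provider traffic is routed through. A hedged sketch of how such a URL is assembled (the provider plugin builds this internally; the exact path it uses may differ):

```typescript
// Sketch: AI Gateway requests go to a URL of the form
//   https://gateway.ai.cloudflare.com/v1/{accountId}/{gatewayId}/{provider}
// Shown only to illustrate what the account/gateway credentials identify.
function gatewayBaseUrl(accountId: string, gatewayId: string, provider: string): string {
  return `https://gateway.ai.cloudflare.com/v1/${accountId}/${gatewayId}/${provider}`;
}

console.log(gatewayBaseUrl('acct123', 'my-gateway', 'anthropic'));
// → https://gateway.ai.cloudflare.com/v1/acct123/my-gateway/anthropic
```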
Add these variables to `.dev.vars`:

```sh
CLOUDFLARE_ACCOUNT_ID=your-account-id
CLOUDFLARE_GATEWAY_ID=your-gateway-id
CLOUDFLARE_API_TOKEN=your-api-token
```

Configure the provider in `src/index.ts`. Models must be declared explicitly using the `provider/model` format:
```typescript
const getConfig = (env: Env): Config => ({
  provider: {
    'cloudflare-ai-gateway': {
      options: {
        accountId: env.CLOUDFLARE_ACCOUNT_ID,
        gatewayId: env.CLOUDFLARE_GATEWAY_ID,
        apiToken: env.CLOUDFLARE_API_TOKEN
      },
      models: {
        'anthropic/claude-sonnet-4-5-20250929': {},
        'openai/gpt-4o': {}
      }
    }
  }
});
```

When using the SDK programmatically, specify the model with `providerID: 'cloudflare-ai-gateway'`:
```typescript
await client.session.prompt({
  sessionID,
  model: {
    providerID: 'cloudflare-ai-gateway',
    modelID: 'anthropic/claude-sonnet-4-5-20250929'
  },
  parts: [{ type: 'text', text: 'Hello!' }]
});
```

You can pass additional environment variables to the OpenCode process using the `env` option. This is useful for:
- OTEL telemetry - Configure OpenTelemetry exporters
- Distributed tracing - Propagate W3C trace context (`TRACEPARENT`)
- Custom configuration - Any other env vars your setup requires
```typescript
const traceparent = request.headers.get('traceparent');
const server = await createOpencodeServer(sandbox, {
  config: getConfig(env),
  env: {
    ...(traceparent ? { TRACEPARENT: traceparent } : {}),
    OTEL_EXPORTER_OTLP_ENDPOINT: 'http://127.0.0.1:4318',
    OTEL_EXPORTER_OTLP_PROTOCOL: 'http/protobuf'
  }
});
```

Happy hacking!