A high-performance proxy server that lets Claude Code use the ChatGPT backend API via Codex OAuth credentials.
This project is an experiment and is provided as-is with no guarantees. Upstream APIs may change at any time and break compatibility.
- 🚀 FastAPI runtime - Async Python server with clear request routing
- 🔐 OAuth token management - Automatic token refresh from `~/.codex/auth.json`
- 🔄 API translation - Anthropic Messages API → OpenAI Responses API (via LiteLLM adapter)
- 📊 Structured logging - Structlog with JSON output
- 📈 Distributed tracing - OpenTelemetry integration
- ✅ Type-safe - Pydantic validation for all API calls
- ⚙️ CLI-configurable - Ports, hosts, model mappings via CLI args
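To make the translation feature concrete, here is a minimal sketch of what mapping an Anthropic Messages request onto a Responses-style payload can look like. This is illustrative only: the field names on the OpenAI side and the helper name are assumptions, not the proxy's actual code (which goes through the LiteLLM adapter).

```python
# Hypothetical sketch of the Anthropic -> OpenAI Responses translation step.
# Field names on the Responses side are illustrative assumptions.
def translate_request(anthropic_body: dict, model_mapping: dict) -> dict:
    """Map an Anthropic Messages request onto a Responses-style payload."""
    source_model = anthropic_body["model"]
    # Fall back to the original name if no mapping is configured.
    target_model = model_mapping.get(source_model, source_model)

    inputs = []
    if "system" in anthropic_body:
        inputs.append({"role": "system", "content": anthropic_body["system"]})
    for msg in anthropic_body.get("messages", []):
        inputs.append({"role": msg["role"], "content": msg["content"]})

    return {
        "model": target_model,
        "input": inputs,
        "max_output_tokens": anthropic_body.get("max_tokens", 1024),
    }

translated = translate_request(
    {
        "model": "claude-3-5-sonnet-20241022",
        "system": "Be brief.",
        "messages": [{"role": "user", "content": "Say hello"}],
        "max_tokens": 256,
    },
    {"claude-3-5-sonnet-20241022": "gpt-5.2-codex"},
)
print(translated["model"])  # gpt-5.2-codex
```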
```bash
uv sync
```

Make sure you're authenticated with Codex CLI:

```bash
codex login
```

This creates `~/.codex/auth.json` with OAuth tokens.
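For a sense of how the proxy can load those credentials, here is a hedged sketch that writes and reads a sample file. The `tokens` / `access_token` / `refresh_token` field names are assumptions for illustration; the real `auth.json` schema is owned by Codex CLI and may differ.

```python
# Illustrative only: assumed auth.json field names, not the real schema.
import json
import os
import tempfile

sample = {
    "tokens": {
        "access_token": "sk-example-access",
        "refresh_token": "sk-example-refresh",
    }
}

with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "auth.json")  # stands in for ~/.codex/auth.json
    with open(path, "w") as f:
        json.dump(sample, f)

    # What a loader might do on startup: read and pick out the access token.
    with open(path) as f:
        creds = json.load(f)
    access_token = creds["tokens"]["access_token"]

print(access_token)
```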
```bash
# Basic usage
uv run cc-codex-oauth-proxy

# Custom configuration
uv run cc-codex-oauth-proxy \
  --port 8082 \
  --host 127.0.0.1 \
  --model-mapping claude-3-5-sonnet-20241022=gpt-5.2-codex
```

```bash
export ANTHROPIC_BASE_URL=http://localhost:8082
export ANTHROPIC_API_KEY=dummy
```

```bash
claude "Say hello"
```

```bash
# Start the proxy
LOG_LEVEL=info uv run cc-codex-oauth-proxy
```

In another terminal:

```bash
# Point Claude Code at the proxy
export ANTHROPIC_BASE_URL=http://localhost:8082
export ANTHROPIC_API_KEY=dummy

# Verify tool usage via a file read
claude -p "Read pyproject.toml"
```

```bash
uv run cc-codex-oauth-proxy \
  --host 127.0.0.1 \
  --port 8082 \
  --model-mapping claude-sonnet-4-5-20250929=gpt-5.2-codex
```

```
--port <number>              Server port (default: 8082)
--host <string>              Bind address (default: 127.0.0.1)
--codex-config <path>        Path to auth.json (default: ~/.codex/auth.json)
--backend-url <url>          Backend API URL
--model-mapping <key=value>  Model mapping (repeatable)
--log-level <level>          Log level: trace|debug|info|warn|error
--otel-endpoint <url>        OpenTelemetry endpoint
```

See `.env.example` for all available environment variables.
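The repeatable `--model-mapping key=value` flag can be handled with a standard append-style argument. The flag names mirror the CLI reference above; the parsing details below are an illustrative sketch, not the proxy's actual argument handling.

```python
# Sketch of parsing repeatable --model-mapping key=value flags with argparse.
import argparse

def parse_args(argv):
    parser = argparse.ArgumentParser(prog="cc-codex-oauth-proxy")
    parser.add_argument("--port", type=int, default=8082)
    parser.add_argument("--host", default="127.0.0.1")
    # action="append" collects every occurrence of the flag into a list.
    parser.add_argument("--model-mapping", action="append", default=[],
                        metavar="KEY=VALUE")
    args = parser.parse_args(argv)
    # Turn the repeated key=value pairs into a lookup dict.
    args.model_map = dict(pair.split("=", 1) for pair in args.model_mapping)
    return args

args = parse_args([
    "--port", "9000",
    "--model-mapping", "claude-3-5-sonnet-20241022=gpt-5.2-codex",
    "--model-mapping", "claude-3-5-haiku-20241022=gpt-5.1-codex-mini",
])
print(args.model_map)
```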
```
Claude Code → Proxy (localhost:8082) → ChatGPT Backend
                  ↓
    [Anthropic format → OpenAI Responses format]
    [Uses Codex OAuth token]
```
This proxy implements the wide events pattern for superior observability:
- One comprehensive event per request - not scattered log lines
- High cardinality, high dimensionality - 50+ attributes capturing complete context
- Business + technical context - models, tokens, function calls, performance metrics
- Queryable structure - enables sophisticated filtering and analytics
Example wide event attributes:

```
request_id, request_timestamp, anthropic_model, anthropic_message_count,
translation_source_model, translation_target_model, codex_model,
codex_tools (JSON), codex_function_calls (JSON), codex_api_duration_ms,
anthropic_response_id, anthropic_input_tokens, anthropic_output_tokens,
total_duration_ms, success, error_type
Query examples in Jaeger:
- "All requests with function calls": `codex_function_calls != "[]"`
- "Slow requests (>1s)": `total_duration_ms > 1000`
- "Model mapping failures": `translation_model_mapped = false`
```bash
# Start Jaeger
docker run -d --name jaeger \
  -p 4318:4318 \
  -p 16686:16686 \
  jaegertracing/all-in-one:latest

# Run proxy with tracing
uv run cc-codex-oauth-proxy --otel-endpoint http://localhost:4318

# View traces
open http://localhost:16686
```

Trace Structure:
- One span per request: `handle_messages_request`
- All metadata accumulated as span attributes
- Function calls and tools stored as JSON arrays
- Complete end-to-end visibility
Default mappings:
| Claude Model | GPT Model |
|---|---|
| claude-sonnet-4-5-20250929 | gpt-5.2-codex |
| claude-haiku-4-5-20250929 | gpt-5.1-codex-mini |
| claude-haiku-4-5-20251001 | gpt-5.1-codex-mini |
| claude-3-5-sonnet-20241022 | gpt-5.2-codex |
| claude-3-5-haiku-20241022 | gpt-5.1-codex-mini |
| claude-3-opus-20240229 | gpt-5.2 |
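The default table above is just a lookup; a plausible fallback rule (assumed here, not confirmed by the source) is to pass unknown model names through unchanged.

```python
# The default mapping table above as a dict, with an assumed pass-through
# fallback for unmapped model names.
DEFAULT_MODEL_MAP = {
    "claude-sonnet-4-5-20250929": "gpt-5.2-codex",
    "claude-haiku-4-5-20250929": "gpt-5.1-codex-mini",
    "claude-haiku-4-5-20251001": "gpt-5.1-codex-mini",
    "claude-3-5-sonnet-20241022": "gpt-5.2-codex",
    "claude-3-5-haiku-20241022": "gpt-5.1-codex-mini",
    "claude-3-opus-20240229": "gpt-5.2",
}

def resolve_model(name: str) -> str:
    return DEFAULT_MODEL_MAP.get(name, name)

print(resolve_model("claude-3-opus-20240229"))   # gpt-5.2
print(resolve_model("claude-unknown-model"))     # claude-unknown-model
```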
Override via CLI:
```bash
uv run cc-codex-oauth-proxy \
  --model-mapping claude-sonnet-4-5-20250929=gpt-5.2-codex \
  --model-mapping claude-haiku-4-5-20250929=gpt-5.1-codex-mini
```

- Binds to `127.0.0.1` by default (localhost only)
- Tokens redacted in logs (only last 8 chars shown)
- No data persisted or logged to disk
- Uses existing Codex OAuth credentials
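The "last 8 chars" redaction mentioned above might look like the following; the helper name and exact masking style are illustrative, not the proxy's actual function.

```python
# Sketch of show-only-the-last-8-chars token redaction for log output.
def redact_token(token: str, visible: int = 8) -> str:
    if len(token) <= visible:
        return "*" * len(token)  # too short to safely show a suffix
    return "..." + token[-visible:]

print(redact_token("sk-abcdef1234567890"))  # ...34567890
```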
- Run `codex login` to authenticate.
- The proxy auto-refreshes tokens; if refresh fails, re-run `codex login`.
- Check that the network doesn't buffer SSE responses.
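When debugging buffered streams, it helps to inspect exactly which SSE frames arrive and when. Here is a minimal, illustrative SSE frame parser (not the proxy's code) you could point at captured wire data:

```python
# Minimal SSE frame parser: frames are separated by a blank line, and each
# frame carries "event:" and "data:" fields.
def parse_sse(raw: str) -> list:
    events = []
    for frame in raw.split("\n\n"):
        event = {}
        for line in frame.splitlines():
            if line.startswith("event:"):
                event["event"] = line[len("event:"):].strip()
            elif line.startswith("data:"):
                event.setdefault("data", "")
                event["data"] += line[len("data:"):].strip()
        if event:
            events.append(event)
    return events

stream = "event: message_start\ndata: {}\n\nevent: message_stop\ndata: {}\n\n"
events = parse_sse(stream)
print([e["event"] for e in events])  # ['message_start', 'message_stop']
```

If every frame arrives at once at the end of the response, something in the path (a reverse proxy, VPN, or corporate middlebox) is buffering the stream.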
```bash
# Basic run while developing
uv run cc-codex-oauth-proxy
```

MIT