fix: route OpenAI Codex shortcuts to correct endpoint #566

Meetpatel006 wants to merge 6 commits into Gitlawb:main from
Conversation
Pull request overview
Fixes OpenAI Codex shortcut models (codexplan / codexspark) so they resolve to and display the correct Codex endpoint (https://chatgpt.com/backend-api/codex) rather than the default OpenAI v1 base URL, and updates the startup banner to use the resolved provider request rather than raw env values.
Changes:
- Added an OpenAI Codex shortcut-alias helper and adjusted `resolveProviderRequest()` to select the Codex base URL for those shortcuts.
- Updated Codex transport selection to use the finalized/resolved base URL.
- Updated the startup screen provider detection to use `resolveProviderRequest()` for endpoint + resolved model display.
Reviewed changes
Copilot reviewed 2 out of 2 changed files in this pull request and generated 2 comments.
| File | Description |
|---|---|
| src/services/api/providerConfig.ts | Adds Codex shortcut alias handling and updates provider request/base URL/transport resolution logic. |
| src/components/StartupScreen.ts | Uses resolved provider request for startup banner endpoint/model display. |
Comments suppressed due to low confidence (1)
src/components/StartupScreen.ts:112
- In the OpenAI branch, `name` defaults to "OpenAI" even when `resolveProviderRequest()` selects the Codex transport / Codex endpoint (e.g. `OPENAI_MODEL=codexplan` resolves to `https://chatgpt.com/backend-api/codex`). This makes the startup banner internally inconsistent (provider shown as OpenAI, but endpoint is Codex). Consider setting `name = 'Codex'` when `resolvedRequest.transport === 'codex_responses'` (or when the resolved base URL is the Codex base URL) before applying the other provider heuristics.
```ts
const baseUrl = resolvedRequest.baseUrl
const isLocal = isLocalProviderUrl(baseUrl)
let name = 'OpenAI'
if (/deepseek/i.test(baseUrl) || /deepseek/i.test(rawModel)) name = 'DeepSeek'
else if (/openrouter/i.test(baseUrl)) name = 'OpenRouter'
```
```ts
const shellModel = process.env.OPENAI_MODEL?.trim() ?? ''
const isCodexAliasModel =
  isOpenAICodexShortcutAlias(requestedModel) ||
  isOpenAICodexShortcutAlias(shellModel)
const finalBaseUrl =
```
`isCodexAliasModel` currently considers `process.env.OPENAI_MODEL` (`shellModel`) even when the caller passes an explicit `options.model`. This can incorrectly force `finalBaseUrl` to `DEFAULT_CODEX_BASE_URL` for non-Codex requests (e.g., `resolveProviderRequest({ model: 'gpt-4o' })` while the environment has `OPENAI_MODEL=codexplan`), which would misroute traffic to the Codex endpoint. Consider basing the Codex-shortcut check only on `requestedModel`, or only consulting `shellModel` when `options?.model` is not provided and `requestedModel` came from the env var.
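One way to implement this suggestion, as a minimal sketch. The helper names below (`isCodexShortcutAlias`, `shouldRouteToCodex`) are illustrative, not necessarily the PR's actual identifiers:

```typescript
// Illustrative sketch: only consult the environment model when the caller
// did not pass an explicit model, so resolveProviderRequest({ model: 'gpt-4o' })
// cannot be hijacked by a stray OPENAI_MODEL=codexplan in the environment.
const CODEX_SHORTCUT_ALIASES = new Set(['codexplan', 'codexspark'])

function isCodexShortcutAlias(model: string): boolean {
  return CODEX_SHORTCUT_ALIASES.has(model.trim().toLowerCase())
}

function shouldRouteToCodex(optionsModel: string | undefined, envModel: string): boolean {
  // An explicit caller-provided model always wins over the env var.
  const requestedModel = optionsModel ?? envModel
  return isCodexShortcutAlias(requestedModel)
}
```

With this shape, the env var is only consulted when no model was passed in, which matches the narrower behavior Copilot proposes.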
```ts
const finalBaseUrl =
  !isGithubMode && isCodexAliasModel && !explicitBaseUrl
    ? DEFAULT_CODEX_BASE_URL
    : rawBaseUrl
```
The `finalBaseUrl` override for Codex shortcuts is currently gated on `!explicitBaseUrl` (i.e., whether the caller passed `options.baseUrl`), not on whether the user configured a non-Codex endpoint. This makes behavior inconsistent across call sites (e.g., `openaiShim` calls `resolveProviderRequest` without `baseUrl`, so a locally configured `OPENAI_BASE_URL=http://127.0.0.1:8080/v1` would be ignored for `codexplan`/`codexspark` and silently switched to the Codex endpoint). Consider basing the override on `rawBaseUrl`'s value (e.g., only override when no base URL is set / it's empty/"undefined" / it's the official OpenAI v1 base URL), rather than on whether it was passed via options.
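A sketch of the value-based gating this comment suggests. The two constants match the defaults named in the diff; `resolveCodexBaseUrl` itself is a hypothetical helper, not code from the PR:

```typescript
const DEFAULT_OPENAI_BASE_URL = 'https://api.openai.com/v1'
const DEFAULT_CODEX_BASE_URL = 'https://chatgpt.com/backend-api/codex'

// Hypothetical helper: override to the Codex endpoint only when the
// configured value is effectively unset or is the official OpenAI v1 base.
function resolveCodexBaseUrl(rawBaseUrl: string | undefined, isCodexAlias: boolean): string {
  const value = rawBaseUrl?.trim() ?? ''
  const isUnsetOrOfficial =
    value === '' || value === 'undefined' || value === DEFAULT_OPENAI_BASE_URL
  if (isCodexAlias && isUnsetOrOfficial) return DEFAULT_CODEX_BASE_URL
  // Any other explicit value (e.g. a local provider URL) is respected.
  return value || DEFAULT_OPENAI_BASE_URL
}
```

Gating on the value rather than the call site keeps `openaiShim`-style callers and direct `options.baseUrl` callers consistent.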
thank you for this bro @Meetpatel006 please look into the copilot comments
Thanks for the fix, @Meetpatel006 — diagnosis is spot on and I confirmed the endpoint routing works on my end. A few things before we merge:

1. Provider label in the banner — same thing Copilot flagged.
2. Regression test — could you add a test asserting the Codex shortcut routing?
3. GitHub mode guard — the diff also adds an `envBaseUrl` guard for GitHub mode.

Non-blocking: could you also confirm the default base URL behavior?

Thanks again — the core fix is clean, just want to get these sorted before merging.
@gnanam1990,
Thanks for the quick turnaround, @Meetpatel006 — the label fix and the new test coverage look great.

One small thing still outstanding: the diff has this guard that I flagged earlier and I don't see it mentioned in your response —

```ts
const envBaseUrl =
  isGithubMode && envBaseUrlRaw && getGithubEndpointType(envBaseUrlRaw) === 'custom'
    ? undefined
    : envBaseUrlRaw
```

This is a behavior change to GitHub mode routing that's unrelated to the Codex shortcut fix. Could you briefly explain what it's for? If it's fixing a known bug, I'm happy to expand the PR description. If it's speculative or unrelated, I'd prefer to pull it into a separate PR so each change can be reviewed independently.

Once that's sorted, this is good to go.
```ts
const isCodexModelForGithub = isGithubMode && isCodexAlias(requestedModel)
```
gnanam1990
left a comment
Thanks @Meetpatel006 — narrowing the guard to `isCodexModelForGithub` is exactly right. The logic makes sense now: if someone is in GitHub Copilot mode and requests a Codex alias, a stray custom `OPENAI_BASE_URL` (e.g. a local Ollama) shouldn't override GitHub's endpoint. Appreciate the careful response to feedback.
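The narrowed guard described above could look roughly like this. Note this is a simplified stand-in (it omits the `getGithubEndpointType(...) === 'custom'` check from the actual diff, and the function names are illustrative):

```typescript
// Simplified sketch of the narrowed GitHub-mode guard: a custom
// OPENAI_BASE_URL is only ignored when GitHub mode is active AND the
// requested model is one of the Codex shortcut aliases.
function isCodexAlias(model: string): boolean {
  return ['codexplan', 'codexspark'].includes(model.trim().toLowerCase())
}

function effectiveEnvBaseUrl(
  isGithubMode: boolean,
  requestedModel: string,
  envBaseUrlRaw: string | undefined,
): string | undefined {
  const isCodexModelForGithub = isGithubMode && isCodexAlias(requestedModel)
  // Drop the env base URL only for the GitHub + Codex-alias combination.
  return isCodexModelForGithub ? undefined : envBaseUrlRaw
}
```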
One non-blocking nit: the explanation you gave in the thread is great — worth copying into the PR description so future reviewers don't need to scroll the comments to understand why a GitHub-mode guard lives in a Codex shortcut fix. Also, a small test for the GitHub+Codex+custom-URL edge case would be nice to have as a follow-up, but the main code paths are well covered.
LGTM 🚀
@kevincodex1 final call
Pull request overview
Copilot reviewed 3 out of 3 changed files in this pull request and generated 3 comments.
```ts
const hasUserSetBaseUrl = rawBaseUrl && rawBaseUrl !== DEFAULT_OPENAI_BASE_URL
const finalBaseUrl =
  !isGithubMode && isCodexAliasModel && !hasUserSetBaseUrl
    ? DEFAULT_CODEX_BASE_URL
    : rawBaseUrl
```
`hasUserSetBaseUrl` compares `rawBaseUrl` to `DEFAULT_OPENAI_BASE_URL` without normalizing. If a user sets `OPENAI_BASE_URL` to an equivalent value like `https://api.openai.com/v1/` (trailing slash) or different casing, this will be treated as "custom", preventing the Codex shortcut override and potentially flipping transport back to `chat_completions`. Consider normalizing before comparison (e.g., trimming trailing slashes or parsing with `new URL`).
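A possible normalization helper along the lines this comment suggests (the name `normalizeBaseUrl` is hypothetical):

```typescript
// Hypothetical helper: normalize a base URL before comparing it to the
// default, so trailing slashes and host casing don't flip the result.
function normalizeBaseUrl(raw: string): string {
  try {
    const url = new URL(raw.trim())
    // The URL parser already lowercases the host; strip trailing slashes
    // from the path so ".../v1/" and ".../v1" compare equal.
    return `${url.protocol}//${url.host}${url.pathname.replace(/\/+$/, '')}`
  } catch {
    // Not an absolute URL; fall back to trimming trailing slashes only.
    return raw.trim().replace(/\/+$/, '')
  }
}
```

The comparison would then be `normalizeBaseUrl(rawBaseUrl) !== normalizeBaseUrl(DEFAULT_OPENAI_BASE_URL)` instead of a raw string equality.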
```ts
// Override to Codex when resolved endpoint is Codex
if (resolvedRequest.transport === 'codex_responses' || baseUrl.includes('chatgpt.com/backend-api/codex')) {
  name = 'Codex'
```
The provider label is set to "Codex" when `resolvedRequest.transport === 'codex_responses'`, but `codex_responses` can be selected for reasons other than hitting the Codex endpoint (e.g., Responses API on the OpenAI base URL). If the intent is "label as Codex only when using the Codex endpoint", prefer checking `isCodexBaseUrl(resolvedRequest.baseUrl)` (and avoid the hard-coded `.includes('chatgpt.com/backend-api/codex')`).
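The endpoint-based check could be sketched as follows. `isCodexBaseUrl` is the helper name from the suggestion above; its implementation here is an assumption:

```typescript
const CODEX_BASE_URL = 'https://chatgpt.com/backend-api/codex'

// Assumed implementation of the suggested isCodexBaseUrl helper: match the
// exact Codex endpoint, tolerating a trailing slash.
function isCodexBaseUrl(baseUrl: string): boolean {
  return baseUrl.trim().replace(/\/+$/, '') === CODEX_BASE_URL
}

// Label the provider "Codex" only when actually hitting the Codex endpoint,
// regardless of which transport was selected.
function providerLabel(resolvedBaseUrl: string): string {
  return isCodexBaseUrl(resolvedBaseUrl) ? 'Codex' : 'OpenAI'
}
```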
```ts
test('resolves codexplan from env var OPENAI_MODEL to Codex endpoint', () => {
  process.env.OPENAI_MODEL = 'codexplan'
  delete process.env.OPENAI_BASE_URL
  delete process.env.CLAUDE_CODE_USE_GITHUB

  const resolved = resolveProviderRequest()
  expect(resolved.transport).toBe('codex_responses')
  expect(resolved.baseUrl).toBe('https://chatgpt.com/backend-api/codex')
  expect(resolved.resolvedModel).toBe('gpt-5.4')
})

test('does not override custom base URL for codexplan (e.g., local provider)', () => {
  process.env.OPENAI_MODEL = 'codexplan'
  process.env.OPENAI_BASE_URL = 'http://localhost:11434/v1'
  delete process.env.CLAUDE_CODE_USE_GITHUB

  const resolved = resolveProviderRequest()
  expect(resolved.transport).toBe('chat_completions')
  expect(resolved.baseUrl).toBe('http://localhost:11434/v1')
})
```
These tests set `process.env.OPENAI_MODEL` but the suite doesn't restore it in `afterEach` (only `OPENAI_BASE_URL`, `OPENAI_API_BASE`, and `CLAUDE_CODE_USE_GITHUB` are restored). This can leak state into later tests (including other files, depending on the runner). Capture and restore `OPENAI_MODEL` similarly to the other env vars, or delete it after the test.
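One way to address this, sketched as a capture/restore pair to wire into the runner's existing `beforeEach`/`afterEach` hooks (the function names here are illustrative, not the suite's actual setup):

```typescript
// Sketch: capture OPENAI_MODEL before each test and restore it afterwards,
// mirroring how the suite already restores OPENAI_BASE_URL and friends.
let savedOpenAIModel: string | undefined

function captureOpenAIModel(): void {
  savedOpenAIModel = process.env.OPENAI_MODEL
}

function restoreOpenAIModel(): void {
  // Deleting (rather than assigning undefined) keeps the env truly unset.
  if (savedOpenAIModel === undefined) delete process.env.OPENAI_MODEL
  else process.env.OPENAI_MODEL = savedOpenAIModel
}
```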
Vasanthdev2004
left a comment
Quick correction from my side after rechecking the actual GitHub PR changed-files surface.
I was wrong to call out unrelated prompt/identity changes here. The real PR surface for #566 is only:
- src/components/StartupScreen.ts
- src/services/api/codexShim.test.ts
- src/services/api/providerConfig.ts
So please ignore the earlier scope-contamination comment from me — that came from me looking at local branch state instead of the actual GitHub PR file list, and that was a review mistake on my side.
Rechecking the current head on the real PR surface:
- the Codex shortcut endpoint fix looks correct
- the startup banner now uses the resolved provider request and no longer shows the old OpenAI/Codex mismatch
- the focused regression coverage for `codexplan`/`codexspark` and custom base URLs is in place
- current `smoke-and-tests` is green
The narrowed GitHub+Codex+custom-URL guard also makes sense based on the explanation in thread.
From a code-review perspective, I do not see a remaining blocker on the current head.
Updated verdict: Approve-ready
I am still not sending an approval review right now only because the contributor explicitly asked maintainers to wait before merge (do not merge it, wait). But the earlier scope blocker I posted was incorrect, and on the actual PR surface this now looks good from my side.
…djust base URL handling
Summary: 3 Files Modified
Vasanthdev2004
left a comment
Thanks for the follow-up. I rechecked the current head 1825b6e997d9a06af34c6a574a390c0626e77f47 against the actual GitHub PR surface, the latest commits, and the current check state.
This is a targeted re-review of the latest changes.
I do still see one blocker on the current head:
- The latest commit now forces the Codex endpoint even when the user explicitly set a custom OpenAI-compatible base URL, and that is a behavior regression from the earlier safer version of the fix.

  In `src/services/api/providerConfig.ts`, the current logic now does:

  - detect `codexplan`/`codexspark` (including the env-resolved-model case)
  - then unconditionally set `finalBaseUrl = DEFAULT_CODEX_BASE_URL` for non-GitHub mode

  That means an explicit user-provided base URL like `http://localhost:11434/v1` is no longer respected for those shortcut aliases. The updated tests in `src/services/api/codexShim.test.ts` and `src/commands/provider/provider.test.tsx` now lock in that forced override behavior, for example changing the old expectation from "respect the custom/local endpoint" to "always route to Codex".

  I don't think we should merge that behavior change in this PR. The earlier version that fixed the banner + Codex shortcut routing while still preserving explicit custom base URLs was safer. Forcing a remote Codex endpoint over an explicit local/custom OpenAI-compatible endpoint silently changes user routing and network behavior, which is exactly the kind of config-precedence change I'd want to avoid unless it is a deliberate, separately justified product decision.
Non-blocking notes:
- the PR surface itself is still clean and focused
- the banner/label fix is still good
- current `smoke-and-tests` is green
Verdict: Needs changes
If you restore the earlier "respect explicit custom base URL" behavior while keeping the banner fix and the shortcut routing regression tests, I’m happy to re-review quickly.
on it
…nd adjust transport logic
What changed:
Pull request overview
Copilot reviewed 4 out of 4 changed files in this pull request and generated 2 comments.
```ts
const isCodexAliasModel =
  isOpenAICodexShortcutAlias(requestedModel) || requestedMatchesEnvCodexShortcut
const hasUserSetBaseUrl = rawBaseUrl && rawBaseUrl !== DEFAULT_OPENAI_BASE_URL
const finalBaseUrl =
  !isGithubMode && isCodexAliasModel && !hasUserSetBaseUrl
    ? DEFAULT_CODEX_BASE_URL
    : rawBaseUrl
```
`hasUserSetBaseUrl` treats `rawBaseUrl === DEFAULT_OPENAI_BASE_URL` as "not user set", so an explicit `OPENAI_BASE_URL=https://api.openai.com/v1` (or `options.baseUrl`) will still be overridden to the Codex endpoint for shortcut aliases. This also creates inconsistent behavior where a trailing slash (e.g. `.../v1/`) avoids the override. Consider treating any explicitly provided base URL (options/env) as user-set, or normalize/compare URLs rather than raw strings, and only apply the Codex default when no base URL was provided at all.
```ts
test('resolves codexplan from env var OPENAI_MODEL to Codex endpoint', () => {
  process.env.OPENAI_MODEL = 'codexplan'
  delete process.env.OPENAI_BASE_URL
  delete process.env.CLAUDE_CODE_USE_GITHUB

  const resolved = resolveProviderRequest()
  expect(resolved.transport).toBe('codex_responses')
  expect(resolved.baseUrl).toBe('https://chatgpt.com/backend-api/codex')
  expect(resolved.resolvedModel).toBe('gpt-5.4')
```
These tests mutate `process.env.OPENAI_MODEL` but the suite's env reset logic at the top of the file does not restore it. This can leak `OPENAI_MODEL=codexplan` into later tests (including other `describe` blocks in this file) and cause order-dependent failures. Capture the original `OPENAI_MODEL` and restore/delete it in the existing `afterEach` cleanup (or add a local `beforeEach`/`afterEach` in this `describe`).
Summary
What changed: Fixed endpoint URL for OpenAI Codex model shortcuts (`codexplan`/`codexspark`) to correctly route to `https://chatgpt.com/backend-api/codex` instead of `https://api.openai.com/v1`. Also fixed the startup banner to use the resolved provider request instead of raw environment variables.

Why it changed: When using `OPENAI_MODEL=codexplan` with `CLAUDE_CODE_USE_OPENAI=1`, the runtime correctly used Codex transport but the banner displayed the wrong endpoint URL because it read directly from `process.env.OPENAI_BASE_URL` instead of using the provider resolution logic.

Impact

User-facing impact: Banner now shows the correct Codex endpoint (`https://chatgpt.com/backend-api/codex`) when using `codexplan` or `codexspark` models.

Developer/maintainer impact: Separated OpenAI Codex shortcut handling from GitHub model handling to prevent cross-provider mixing. Added an `isOpenAICodexShortcutAlias()` helper for clean separation.
Before: (screenshot)

After: (screenshot)
Testing
- `bun run build`
- `bun run test:provider` (154 pass, 0 fail)
- `bun test --max-concurrency=1` (pre-existing Windows path failures unrelated to this change)

Focused tests: `src/services/api/codexShim.test.ts`, `src/services/api/providerConfig.ts`

Notes
`OPENAI_MODEL=codexplan`