fix: align codex oauth models and request contract#717

Open
tuanminhhole wants to merge 1 commit into decolua:master from tuanminhhole:fix/codex-oauth-contract

Conversation

tuanminhhole commented Apr 21, 2026

9Router Codex OAuth fix (draft)

Closes #718

Proposed PR Title

Fix Codex OAuth request contract and narrow exposed ChatGPT-account models

Summary

  • use a valid Codex /responses probe payload during provider self-test
  • remove unsupported max_output_tokens before dispatching Codex requests
  • narrow the exposed cx/* model list to models verified to work with ChatGPT Plus OAuth on 2026-04-21

Repro

  1. Connect an OpenAI Codex OAuth account in 9Router with ChatGPT Plus.
  2. Call /api/providers/:id/test or route a cx/* model through /v1/responses.
  3. Observe false unavailable state or repeated 400 errors for models the account can actually use.

Root Cause

  • the Codex provider self-test used an invalid payload shape:
    • empty input
    • stream: false
    • no instructions
  • the executor still forwarded max_output_tokens, which the current Codex backend rejects
  • the UI exposed several cx/* models that ChatGPT-account OAuth currently rejects upstream
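The corrected probe shape implied by these notes can be sketched as follows. The field requirements (instructions present, list-form input, stream: true, store: false) come from the contract described above; the function name, probe text, and content-item shape are illustrative assumptions, not the actual testUtils.js code.

```javascript
// Hypothetical sketch of a contract-compliant Codex /responses probe payload.
// Requirements taken from the root-cause notes: non-empty list-form input,
// instructions present, stream: true, store: false. Names here are
// illustrative, not the actual 9Router helpers.
function buildCodexProbePayload(model) {
  return {
    model,
    instructions: "Connectivity probe. Reply with a single word.",
    input: [
      { role: "user", content: [{ type: "input_text", text: "ping" }] },
    ],
    stream: true,
    store: false,
  };
}

// Example: build the probe for one of the verified models.
const probe = buildCodexProbePayload("gpt-5.4");
console.log(probe.stream, probe.store, probe.input.length); // true false 1
```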

Verified Working Models

  • cx/gpt-5.4
  • cx/gpt-5.4-mini
  • cx/gpt-5.3-codex
  • cx/gpt-5.2

Verified Rejected Models

  • cx/gpt-5.3-codex-xhigh
  • cx/gpt-5.3-codex-high
  • cx/gpt-5.3-codex-low
  • cx/gpt-5.3-codex-none
  • cx/gpt-5.3-codex-spark
  • cx/gpt-5.2-codex
  • cx/gpt-5.1-codex-max
  • cx/gpt-5.1-codex
  • cx/gpt-5.1-codex-mini
  • cx/gpt-5.1-codex-mini-high
  • cx/gpt-5.1
  • cx/gpt-5-codex
  • cx/gpt-5-codex-mini

Files Changed In Upstream Draft

  • .tmp/9router-upstream/open-sse/executors/codex.js
  • .tmp/9router-upstream/open-sse/config/providerModels.js
  • .tmp/9router-upstream/src/app/api/providers/[id]/test/testUtils.js
  • .tmp/9router-upstream/README.md

Suggested PR Body

This patch fixes a contract mismatch between 9Router's Codex OAuth integration and the current ChatGPT-backed Codex /responses backend.

Changes:

  • send a valid payload during Codex provider self-test so accounts are not marked unavailable because of schema errors
  • strip max_output_tokens before forwarding Codex requests
  • narrow the exposed cx/* list to models that were verified working with ChatGPT Plus OAuth on 2026-04-21

Why:

  • the previous self-test payload could fail even when the token was valid
  • current backend contract requires instructions, list-form input, stream: true, and store: false
  • several exposed Codex models are currently rejected upstream for ChatGPT-account OAuth and should not be offered as ready-to-use defaults
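The last two changes can be sketched as follows, with illustrative names (the real code lives in open-sse/executors/codex.js and open-sse/config/providerModels.js); the verified model list is copied from the tables above.

```javascript
// Hypothetical sketch: narrow the exposed cx/* list to verified models and
// strip the parameter the current Codex backend rejects. Function names are
// assumptions for illustration, not 9Router's actual API.
const VERIFIED_CX_MODELS = new Set([
  "cx/gpt-5.4",
  "cx/gpt-5.4-mini",
  "cx/gpt-5.3-codex",
  "cx/gpt-5.2",
]);

// Keep only models verified working with ChatGPT Plus OAuth.
function filterExposedModels(models) {
  return models.filter((m) => VERIFIED_CX_MODELS.has(m));
}

// Drop max_output_tokens before forwarding the request body.
function stripUnsupportedParams(body) {
  const { max_output_tokens, ...rest } = body;
  return rest;
}

console.log(filterExposedModels(["cx/gpt-5.4", "cx/gpt-5-codex"])); // [ 'cx/gpt-5.4' ]
console.log("max_output_tokens" in stripUnsupportedParams({ max_output_tokens: 16 })); // false
```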



Development

Successfully merging this pull request may close these issues.

Bug: Codex OAuth exposes unsupported ChatGPT-account models and misclassifies availability
