# fix(openai): normalize tool schemas before OpenAI / Responses / Codex requests (#8474)
Signed-off-by: Istvan Benedek <istvan.benedek.dev@gmail.com>
Supplemental review context for this PR. These notes mirror the local review pack I used while validating the change. The two Mermaid diagrams below are the same explanation that I exported locally as PNGs; I’m keeping the binary assets out of the repo and putting the reviewer context here instead.

## What changed

The fix is intentionally provider-side. It does not change tool runtime behavior or autovisualiser rendering logic. Affected request paths:

- `crates/goose/src/providers/formats/openai.rs`
- `crates/goose/src/providers/formats/openai_responses.rs`
- `crates/goose/src/providers/chatgpt_codex.rs`
## What to look at first

The main logic is in `crates/goose/src/providers/formats/openai.rs`. The key reviewer question is not whether the outgoing schema is a byte-for-byte rewrite of the original; it is whether the outgoing schema stays valid enough for tool use while avoiding shapes that OpenAI rejects.

Main tradeoff: the normalized schema is no longer an exact copy of the tool’s declared schema, but it preserves the tool’s intent in a subset that OpenAI accepts.
## Diagram: request-shaping flow

```mermaid
flowchart TD
    A["Tool input_schema from rmcp/schemars"] --> B["OpenAI-facing request builder"]
    B --> C{"Schema contains $defs / $ref / anyOf / recursive refs?"}
    C -->|No| D["Keep schema unchanged"]
    C -->|Yes| E["Normalize schema"]
    E --> F["Resolve local refs"]
    E --> G["Flatten unions"]
    E --> H["Break recursive refs"]
    E --> I["Ensure object/array completeness"]
    D --> J["OpenAI-safe tool schema"]
    F --> J
    G --> J
    H --> J
    I --> J
    J --> K["Serialize request to OpenAI / Responses / Codex"]
```
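The decision branch in the flow above can be sketched as a recursive scan for the constructs that route a schema into the "Normalize schema" path. This is illustrative Python only, not the Rust request builder; the helper name is hypothetical, and detecting `$ref` subsumes the recursive-ref case here:

```python
# Hedged sketch (not the PR's Rust code): does this schema use constructs
# OpenAI may reject? Any $ref hit also covers recursive refs, since those
# are expressed via $ref in the first place.
RISKY_KEYS = ("$defs", "definitions", "$ref", "anyOf", "oneOf", "allOf")

def needs_normalization(schema):
    """Return True if the schema (or any subschema) uses a risky construct."""
    if isinstance(schema, dict):
        if any(key in schema for key in RISKY_KEYS):
            return True
        return any(needs_normalization(v) for v in schema.values())
    if isinstance(schema, list):
        return any(needs_normalization(item) for item in schema)
    return False

print(needs_normalization({"type": "object",
                           "properties": {"x": {"$ref": "#/$defs/X"}}}))  # True
print(needs_normalization({"type": "object",
                           "properties": {"x": {"type": "number"}}}))     # False
```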
## Diagram: concrete before/after example

```mermaid
flowchart LR
    BEFORE["Before<br/><br/>$defs: SingleDonutChart<br/>data.anyOf: $ref | array($ref)<br/>values.items.anyOf: number | object(label, value)"] --> X["normalize_json_schema_for_openai()"]
    X --> AFTER["After<br/><br/>No $defs / $ref / anyOf<br/>data.type: [object, array]<br/>data.items.type: object<br/>values.items.type: [number, object]"]
    AFTER --- NOTE["Same tool intent<br/>smaller schema surface for OpenAI"]
```
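One piece of the before/after example can be reduced to a tiny runnable sketch: collapsing a simple `anyOf` union into a JSON Schema `type` array, as happens to `values.items` in the diagram. This is illustrative Python, not the Rust `normalize_json_schema_for_openai()` from the patch, and it deliberately handles only branches that declare a `type`:

```python
# Hedged sketch: flatten anyOf-of-typed-branches into a single schema with a
# `type` array. Non-type keywords from the branches are merged shallowly; the
# real normalizer also resolves refs and breaks recursion.
def flatten_anyof(schema):
    branches = schema.get("anyOf")
    if not branches or not all("type" in b for b in branches):
        return schema  # nothing this sketch can flatten
    merged = {"type": [b["type"] for b in branches]}
    for branch in branches:
        for key, value in branch.items():
            if key != "type":
                merged.setdefault(key, value)
    return merged

# Mirrors values.items from the diagram: number | object(label, value)
before = {"anyOf": [{"type": "number"},
                    {"type": "object",
                     "properties": {"label": {"type": "string"},
                                    "value": {"type": "number"}}}]}
after = flatten_anyof(before)
# after["type"] == ["number", "object"]; no anyOf remains
```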
## Validation

Focused regression tests run locally:

- `cargo test -p goose --no-default-features --features rustls-tls test_validate_tool_schemas -- --nocapture`
- `cargo test -p goose --no-default-features --features rustls-tls test_create_request_sanitizes_tool_schema -- --nocapture`
- `cargo test -p goose --no-default-features --features rustls-tls test_codex_request_sanitizes_tool_schema -- --nocapture`
- `cargo test -p goose --no-default-features --features rustls-tls test_responses_request_sanitizes_tool_schema -- --nocapture`
Practical validation also passed locally with a patched install and `autovisualiser` enabled.
## Summary
OpenAI-backed Goose providers can reject valid Goose tool schemas before normal generation starts.
Observed failure: the request is rejected at tool-registration time, before any model output is produced.
The root issue is not limited to one tool. Goose can emit JSON Schema constructs that OpenAI rejects during tool registration, including:
- `$defs`
- `$ref`
- `anyOf`/`oneOf`

This patch normalizes tool parameter schemas into an OpenAI-compatible subset before serializing requests for:

- OpenAI (chat completions)
- Responses
- Codex
## Why This Approach
The narrower donut-only fix is useful, but it solves one manifestation of a broader provider-compatibility problem.
This patch keeps tool definitions and runtime behavior intact, and instead fixes the serialization boundary where OpenAI rejects the payload.
## Implementation
In `crates/goose/src/providers/formats/openai.rs`, this change:

- resolves local `$ref` entries from `$defs`/`definitions`
- flattens `anyOf`/`oneOf` unions
- merges `allOf` branches

The same normalization is then applied when building requests in:
- `crates/goose/src/providers/formats/openai.rs`
- `crates/goose/src/providers/formats/openai_responses.rs`
- `crates/goose/src/providers/chatgpt_codex.rs`

## Validation
Focused regression tests:
- `cargo test -p goose --no-default-features --features rustls-tls test_validate_tool_schemas -- --nocapture`
- `cargo test -p goose --no-default-features --features rustls-tls test_create_request_sanitizes_tool_schema -- --nocapture`
- `cargo test -p goose --no-default-features --features rustls-tls test_codex_request_sanitizes_tool_schema -- --nocapture`
- `cargo test -p goose --no-default-features --features rustls-tls test_responses_request_sanitizes_tool_schema -- --nocapture`

Manual validation:
- patched local install with `autovisualiser` enabled

## Reviewer Notes
This is intentionally a compatibility transform, not an exact schema-preserving rewrite.

The tradeoff is: the schema sent to OpenAI can differ structurally from the tool's declared schema, but it preserves the tool's intent in a subset the API accepts.

The patch is limited to provider-facing request shaping and does not change tool execution behavior.
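To make the transform concrete for reviewers, here is a hedged sketch of the ref-inlining and recursion-breaking steps listed under Implementation. It is Python for readability, not the PR's Rust code in `crates/goose/src/providers/formats/openai.rs`, and the helper name is hypothetical:

```python
# Illustrative sketch: inline local #/$defs/... and #/definitions/... refs,
# and break recursive refs by degrading them to a plain object.
def inline_refs(schema, defs=None, seen=None):
    if defs is None:
        defs = {}
        if isinstance(schema, dict):
            defs = {**schema.get("$defs", {}), **schema.get("definitions", {})}
    seen = seen or set()
    if isinstance(schema, dict):
        ref = schema.get("$ref", "")
        if isinstance(ref, str) and ref.startswith(("#/$defs/", "#/definitions/")):
            name = ref.rsplit("/", 1)[-1]
            if name in seen:  # recursive ref: break the cycle
                return {"type": "object"}
            return inline_refs(defs.get(name, {"type": "object"}), defs, seen | {name})
        # drop the now-inlined $defs/definitions containers
        return {k: inline_refs(v, defs, seen)
                for k, v in schema.items() if k not in ("$defs", "definitions")}
    if isinstance(schema, list):
        return [inline_refs(item, defs, seen) for item in schema]
    return schema

# A self-referential schema, loosely modeled on the donut-chart case:
schema = {
    "$defs": {"Node": {"type": "object",
                       "properties": {"next": {"$ref": "#/$defs/Node"}}}},
    "type": "object",
    "properties": {"root": {"$ref": "#/$defs/Node"}},
}
flat = inline_refs(schema)
# flat contains no $defs/$ref; the recursive "next" became a plain object
```

Breaking a cycle by degrading to `{"type": "object"}` trades schema precision for a shape OpenAI will accept, which is exactly the tradeoff described above.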
## Related Issue
I’ll add a follow-up PR comment with a concise reviewer guide and diagrams showing the request-shaping flow and a concrete before/after schema example.