
[codex] Normalize Azure OpenAI max token aliases #390

Draft

54sun wants to merge 1 commit into HKUDS:main from 54sun:codex-azure-openai-max-token-normalization


Conversation


@54sun 54sun commented Apr 25, 2026

Summary

  • normalize max_completion_tokens to max_tokens before Azure OpenAI provider execution
  • add regression coverage for Azure OpenAI factory calls

Root cause

Newer model helpers emit max_completion_tokens for gpt-5.* models, but DeepTutor's Azure Responses provider only accepts max_tokens, which it maps internally to max_output_tokens. The unrecognized alias leaked through to the provider and caused Azure OpenAI calls to fail before ever reaching the API.
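The normalization described above can be sketched as follows. This is a minimal illustration, not DeepTutor's actual code: the function name and call site are assumptions, and only the parameter-dict shape is taken from the PR description.

```python
def normalize_max_token_aliases(params: dict) -> dict:
    """Hypothetical sketch: map the newer `max_completion_tokens`
    alias to `max_tokens` before the Azure OpenAI provider runs.
    (The provider then maps `max_tokens` to `max_output_tokens`.)"""
    normalized = dict(params)  # avoid mutating the caller's kwargs
    if "max_completion_tokens" in normalized:
        value = normalized.pop("max_completion_tokens")
        # Keep an explicit max_tokens if the caller already set one.
        normalized.setdefault("max_tokens", value)
    return normalized
```

For example, `{"max_completion_tokens": 256}` becomes `{"max_tokens": 256}`, while a request that already carries `max_tokens` is left with its original value.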

Validation

  • python -m py_compile deeptutor/services/llm/factory.py tests/services/llm/test_factory_provider_exec.py
  • live Docker backend LLM and embedding connection tests passed

