Commit 67b9da6 (1 parent: 4a77084)

docs: openai temperature model aliasing incompatibility

1 file changed: 4 additions & 0 deletions

docs/llm-providers.md
@@ -74,3 +74,7 @@ uv run oss-bugfind-crs build example_configs/test-local json-c
 # Should say hello from LLM
 uv run oss-bugfind-crs run example_configs/test-local json-c json_array_fuzzer
 ```
+
+# Known Issues with Aliasing Models
+
+Recent LLMs such as GPT-5 no longer accept the `temperature` parameter. This can cause hard-to-diagnose failures when you swap model backends in LiteLLM while keeping the original model name (e.g., the model name is set to `gpt-4o` but the relay routes requests to `gpt-5`): clients keep sending `temperature`, it is passed through unchanged, and the upstream API rejects the request.
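One way a relay or shim can guard against this is to strip parameters the target model rejects before forwarding the request. Below is a minimal sketch of that idea; the `NO_TEMPERATURE_MODELS` set and `sanitize_params` helper are illustrative assumptions for this example, not part of this repo or of LiteLLM's API:

```python
# Illustrative sketch: drop request parameters that the aliased backend
# rejects before forwarding. The model set below is an assumption for
# the example, not an exhaustive or authoritative list.

NO_TEMPERATURE_MODELS = {"gpt-5"}

def sanitize_params(target_model: str, params: dict) -> dict:
    """Return a copy of `params` without options the target model rejects."""
    cleaned = dict(params)
    if target_model in NO_TEMPERATURE_MODELS:
        cleaned.pop("temperature", None)
    return cleaned

# The client asked for gpt-4o semantics, but the relay routes to gpt-5:
print(sanitize_params("gpt-5", {"temperature": 0.2, "max_tokens": 64}))
# {'max_tokens': 64}
```

LiteLLM also exposes a `drop_params` setting intended for this class of problem; whether it covers `temperature` for a given alias is worth verifying against its documentation.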