Closed
Labels: bug (Something isn't working), llm translation, ui-dashboard (Issues related to the LiteLLM UI Dashboard)
Description
Check for existing issues
- I have searched the existing issues and checked that my issue is not a duplicate.
What happened?
When I try an OpenAI LLM in the Playground, I get the following error.
endpoint: `v1/chat/completions`

```
Error fetching response: Error: 404 litellm.BadRequestError: OpenAIException - Invalid URL (POST /v1/responses/chat/completions)
No fallback model group found for original model_group=gpt-5.1-fallback. Fallbacks=[{'Generate_Chat_Default': ['Generate_Chat_Premium']}, {'Extract_Info_Default': ['llm_cache_anthropic']}, {'Extract_Info_Premium': ['llm_cache_anthropic']}]. Received Model Group=gpt-5.1-fallback
Available Model Group Fallbacks=None
Error doing the fallback: litellm.BadRequestError: OpenAIException - Invalid URL (POST /v1/responses/chat/completions)
No fallback model group found for original model_group=gpt-5.1-fallback.
```
endpoint: `v1/responses`

```
Error fetching response: Error: 401 litellm.BadRequestError: OpenAIException - {
  "error": {
    "message": "Your request to POST /v1/responses/responses must be made with a session key (that is, it can only be made from the browser). You made it with the following key type: .",
    "type": "invalid_request_error",
    "param": null,
    "code": "missing_scope"
  }
}
No fallback model group found for original model_group=gpt-5.1-fallback. Fallbacks=[{'Generate_Chat_Default': ['Generate_Chat_Premium']}, {'Extract_Info_Default': ['llm_cache_anthropic']}, {'Extract_Info_Premium': ['llm_cache_anthropic']}]. Received Model Group=gpt-5.1-fallback
Available Model Group Fallbacks=None
Error doing the fallback: litellm.BadRequestError: OpenAIException - {
  "error": {
    "message": "Your request to POST /v1/responses/responses must be made with a session key (that is, it can only be made from the browser). You made it with the following key type: .",
    "type": "invalid_request_error",
    "param": null,
    "code": "missing_scope"
  }
}
```
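Both traces show the proxy posting to a doubled path (`/v1/responses/chat/completions` and `/v1/responses/responses`), which looks like the endpoint suffix being appended to a base URL that already ends in `/v1/responses`. A minimal sketch of that hypothesis, assuming an OpenAI-style client that joins `base_url` with the per-endpoint route (the base URL below is illustrative only, not taken from my config):

```python
# Hypothetical illustration of the doubled path in the error traces.
# Assumption: the configured base URL already has the endpoint path baked in,
# and the client then appends the per-endpoint route on top of it.
base_url = "https://api.openai.com/v1/responses"  # misconfigured base URL

chat_url = f"{base_url}/chat/completions"
responses_url = f"{base_url}/responses"

print("POST", chat_url)       # POST https://api.openai.com/v1/responses/chat/completions
print("POST", responses_url)  # POST https://api.openai.com/v1/responses/responses
```

Both constructed URLs match the invalid paths reported by OpenAIException, so the bug may be in how the Playground (or the router) builds the request URL for each endpoint.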
Steps to Reproduce
- Go to the LiteLLM Playground.
- Select an OpenAI LLM model.
Relevant log output
What part of LiteLLM is this about?
UI Dashboard
What LiteLLM version are you on?
v1.82.3
Twitter / LinkedIn details
No response