The prompt playground does support OpenAI-API-compatible LLM providers. To resolve the connection error when using the "Qwen/Qwen2.5-72B-Instruct" model with an unsafe HTTP client, ensure that the server is configured with the correct …
I can use this provider via OpenAI's Python library, with `http_client` set to an unsafe HTTP client, i.e.

`unsafe_http_client = httpx.Client(verify=False)`

but I get a connection error when running it in the prompt playground. Any ideas?
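For context, here is a minimal sketch of how the provider can be reached directly with the OpenAI client and an unverified `httpx` client; the `base_url` and `api_key` values below are placeholders, not settings taken from this thread:

```python
import httpx
from openai import OpenAI

# Disable TLS certificate verification (only do this for endpoints you trust).
unsafe_http_client = httpx.Client(verify=False)

# Placeholder base_url/api_key: point these at your OpenAI-compatible server.
client = OpenAI(
    base_url="https://my-llm-gateway.example.com/v1",
    api_key="EMPTY",
    http_client=unsafe_http_client,
)

response = client.chat.completions.create(
    model="Qwen/Qwen2.5-72B-Instruct",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```

If this direct call works but the prompt playground still fails, the playground is presumably building its own HTTP client and not reusing the unverified one, so the custom `http_client` (or equivalent TLS setting) would need to be passed through wherever the playground constructs its provider.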