Hi,
I am trying to configure DevoxxGenieIDEAPlugin v0.6.9 with the Open WebUI API (https://docs.openwebui.com/getting-started/api-endpoints/).
The Ollama URL is working fine.
The Open WebUI API is working too:
curl -H "Authorization: Bearer XXXXXXX" http://docker-tools:3000/api/models
With curl I get a JSON response listing all models.
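For completeness, the chat endpoint can be tested with curl as well. This is only a sketch assuming the /api/chat/completions path from the Open WebUI docs and my devstral:24b model; the token is the same as above:

```shell
# Direct test of Open WebUI's OpenAI-compatible chat endpoint
# (assumes the documented /api/chat/completions path and my devstral:24b model)
curl -H "Authorization: Bearer XXXXXXX" \
     -H "Content-Type: application/json" \
     -d '{"model": "devstral:24b", "messages": [{"role": "user", "content": "Hello"}]}' \
     http://docker-tools:3000/api/chat/completions
```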
But it is not working in DevoxxGenieIDEAPlugin.
When I set Custom OpenAI Model = devstral:24b I get:
2025-08-04 22:01:39,977 [ 379028] INFO - STDOUT - 22:01:39.977 [prompt-exec-6] ERROR c.d.g.s.p.error.PromptErrorHandler - Error occurred while processing chat message
2025-08-04 22:01:39,977 [ 379028] INFO - STDOUT - java.util.concurrent.CompletionException: com.devoxx.genie.service.prompt.error.ModelException: Provider unavailable: Invalid HTTP request received.
2025-08-04 22:01:39,977 [ 379028] INFO - STDOUT - at java.base/java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:315)
2025-08-04 22:01:39,977 [ 379028] INFO - STDOUT - at java.base/java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:320)
2025-08-04 22:01:39,977 [ 379028] INFO - STDOUT - at java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1770)
2025-08-04 22:01:39,977 [ 379028] INFO - STDOUT - at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
2025-08-04 22:01:39,977 [ 379028] INFO - STDOUT - at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
2025-08-04 22:01:39,977 [ 379028] INFO - STDOUT - at java.base/java.lang.Thread.run(Thread.java:1583)
2025-08-04 22:01:39,977 [ 379028] INFO - STDOUT - Caused by: com.devoxx.genie.service.prompt.error.ModelException: Provider unavailable: Invalid HTTP request received.
2025-08-04 22:01:39,977 [ 379028] INFO - STDOUT - at com.devoxx.genie.service.prompt.response.nonstreaming.NonStreamingPromptExecutionService.processChatMessage(NonStreamingPromptExecutionService.java:206)
2025-08-04 22:01:39,977 [ 379028] INFO - STDOUT - at com.devoxx.genie.service.prompt.response.nonstreaming.NonStreamingPromptExecutionService.lambda$executeQuery$0(NonStreamingPromptExecutionService.java:75)
2025-08-04 22:01:39,977 [ 379028] INFO - STDOUT - at java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1768)
2025-08-04 22:01:39,977 [ 379028] INFO - STDOUT - ... 3 common frames omitted
2025-08-04 22:01:39,977 [ 379028] INFO - STDOUT - Caused by: dev.langchain4j.exception.InvalidRequestException: Invalid HTTP request received.
2025-08-04 22:01:39,977 [ 379028] INFO - STDOUT - at dev.langchain4j.internal.ExceptionMapper$DefaultExceptionMapper.mapHttpStatusCode(ExceptionMapper.java:69)
2025-08-04 22:01:39,977 [ 379028] INFO - STDOUT - at dev.langchain4j.internal.ExceptionMapper$DefaultExceptionMapper.mapException(ExceptionMapper.java:42)
2025-08-04 22:01:39,977 [ 379028] INFO - STDOUT - at dev.langchain4j.internal.ExceptionMapper.withExceptionMapper(ExceptionMapper.java:29)
2025-08-04 22:01:39,977 [ 379028] INFO - STDOUT - at dev.langchain4j.internal.RetryUtils.lambda$withRetryMappingExceptions$2(RetryUtils.java:307)
2025-08-04 22:01:39,977 [ 379028] INFO - STDOUT - at dev.langchain4j.internal.RetryUtils$RetryPolicy.withRetry(RetryUtils.java:195)
2025-08-04 22:01:39,977 [ 379028] INFO - STDOUT - at dev.langchain4j.internal.RetryUtils.withRetry(RetryUtils.java:247)
2025-08-04 22:01:39,977 [ 379028] INFO - STDOUT - at dev.langchain4j.internal.RetryUtils.withRetryMappingExceptions(RetryUtils.java:307)
2025-08-04 22:01:39,977 [ 379028] INFO - STDOUT - at dev.langchain4j.internal.RetryUtils.withRetryMappingExceptions(RetryUtils.java:291)
2025-08-04 22:01:39,977 [ 379028] INFO - STDOUT - at dev.langchain4j.model.openai.OpenAiChatModel.doChat(OpenAiChatModel.java:151)
2025-08-04 22:01:39,977 [ 379028] INFO - STDOUT - at dev.langchain4j.model.chat.ChatLanguageModel.chat(ChatLanguageModel.java:47)
2025-08-04 22:01:39,977 [ 379028] INFO - STDOUT - at dev.langchain4j.service.DefaultAiServices$1.invoke(DefaultAiServices.java:224)
2025-08-04 22:01:39,977 [ 379028] INFO - STDOUT - at com.devoxx.genie.service.prompt.response.nonstreaming.$Proxy251.chat(Unknown Source)
2025-08-04 22:01:39,977 [ 379028] INFO - STDOUT - at com.devoxx.genie.service.prompt.response.nonstreaming.NonStreamingPromptExecutionService.processChatMessage(NonStreamingPromptExecutionService.java:183)
2025-08-04 22:01:39,977 [ 379028] INFO - STDOUT - ... 5 common frames omitted
2025-08-04 22:01:39,977 [ 379028] INFO - STDOUT - Caused by: dev.langchain4j.exception.HttpException: Invalid HTTP request received.
2025-08-04 22:01:39,977 [ 379028] INFO - STDOUT - at dev.langchain4j.http.client.jdk.JdkHttpClient.execute(JdkHttpClient.java:51)
2025-08-04 22:01:39,977 [ 379028] INFO - STDOUT - at dev.langchain4j.model.openai.internal.SyncRequestExecutor.execute(SyncRequestExecutor.java:20)
2025-08-04 22:01:39,977 [ 379028] INFO - STDOUT - at dev.langchain4j.model.openai.internal.RequestExecutor.execute(RequestExecutor.java:39)
2025-08-04 22:01:39,977 [ 379028] INFO - STDOUT - at dev.langchain4j.model.openai.OpenAiChatModel.lambda$doChat$3(OpenAiChatModel.java:152)
2025-08-04 22:01:39,977 [ 379028] INFO - STDOUT - at dev.langchain4j.internal.ExceptionMapper.withExceptionMapper(ExceptionMapper.java:27)
2025-08-04 22:01:39,977 [ 379028] INFO - STDOUT - ... 15 common frames omitted
2025-08-04 22:01:39,978 [ 379029] INFO - STDOUT - 22:01:39.978 [prompt-exec-5] ERROR c.d.g.s.prompt.error.PromptException - ERROR:Null response received - false
- Is my configuration wrong?
- Is there a bug?
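If it helps, both candidate base URLs can be compared directly with curl. This is only a sketch under my assumption that the plugin's OpenAI client (langchain4j OpenAiChatModel, per the stack trace) posts to <base URL>/chat/completions:

```shell
# Assumption: the plugin posts OpenAI-style requests to <base URL>/chat/completions.
# Print only the HTTP status code for each candidate base URL.
for BASE in http://docker-tools:3000/api http://docker-tools:3000; do
  echo "== $BASE/chat/completions =="
  curl -s -o /dev/null -w "%{http_code}\n" \
       -H "Authorization: Bearer XXXXXXX" \
       -H "Content-Type: application/json" \
       -d '{"model": "devstral:24b", "messages": [{"role": "user", "content": "Hello"}]}' \
       "$BASE/chat/completions"
done
```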