
[BUG] LLM Ollama Provider (No response generated) #1384

@Layomg

Description

Describe the bug

I added a local Ollama LLM provider in the settings, following the process described in the documentation. The problem is that when I try to chat with any custom model it says "No response generated", and I don't know where to look to check the URL.

Also, I used a Python script to access http://localhost:11434/v1/completions directly and it works, but when going through ApeRAG the request doesn't even show up in the terminal where Ollama is running.
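For reference, a minimal sketch of the kind of direct check described above, assuming the `requests` library and a placeholder model name (the issue does not say which model was used):

```python
import requests

# Hit Ollama's OpenAI-compatible completions endpoint directly.
# "llama3" is a placeholder; substitute whatever model is pulled locally.
resp = requests.post(
    "http://localhost:11434/v1/completions",
    json={"model": "llama3", "prompt": "Hello"},
)
resp.raise_for_status()
print(resp.json()["choices"][0]["text"])
```

This request shows up in the Ollama terminal, while requests made through the ApeRAG provider do not.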

(Five screenshots attached.)

Screenshots & Logs

If applicable, add screenshots to help explain your problem.

Additional context

  • OS: Windows 11 - Docker
  • Browser: Brave

Metadata

Assignees

Labels

Stale, bug (Something isn't working)

Type

No type

Projects

No projects

Milestone

No milestone

Relationships

None yet

Development

No branches or pull requests
