Issue Category
AI Assistant (Ollama)
Bug Description
I configured my AI assistant to use a remote LLM server (LMStudio). After the initial configuration, clicking "Models & settings" shows an empty page.
Steps to Reproduce
- Configure LMStudio as your OpenAI-compatible server. My example: http://192.168.8.211:1234
- Reopen "Models & settings": the page is empty
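A quick way to confirm the server itself is reachable (a sketch, assuming LMStudio exposes the standard OpenAI-compatible `/v1/models` route on the address above):

```shell
# Query the OpenAI-compatible model-list endpoint of the LMStudio server.
# Adjust the host/port to match your setup; 192.168.8.211:1234 is the
# example address from the steps above.
curl -s http://192.168.8.211:1234/v1/models
```

If this returns a JSON model list, the server is reachable and the blank page is likely a UI-side issue rather than a connectivity problem.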
Expected Behavior
Page shows up
Actual Behavior
Empty page
N.O.M.A.D. Version
1.31.0
Operating System
Ubuntu 24.04
Docker Version
No response
Do you have a dedicated GPU?
Yes
GPU Model (if applicable)
No response
System Specifications
No response
Service Status (if relevant)
No response
Relevant Logs
No response
Browser Console Errors (if UI issue)
No response
Screenshots
No response
Additional Context
No response
Pre-submission Checklist