
[Bug]: "models & settings" empty page when using a remote LLM server #679

@lukasz-kastelik

Description


Issue Category

AI Assistant (Ollama)

Bug Description

I configured my AI assistant to use a remote LLM server (LMStudio). After the initial configuration, clicking "Models & settings" shows an empty page.

Steps to Reproduce

  1. Configure LMStudio as your OpenAI-compatible server. My example: http://192.168.8.211:1234
  2. Reopen "Models & settings" — the page is empty
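Not part of the original report, but a quick way to separate a server-side failure from a UI bug: OpenAI-compatible servers such as LMStudio expose a model-listing endpoint at `/v1/models`. A minimal sketch (the helper name is hypothetical) for building that URL from the configured base URL:

```python
def models_endpoint(base_url: str) -> str:
    """Return the /v1/models URL for an OpenAI-compatible server.

    Strips a trailing slash so the path is joined cleanly either way.
    """
    return base_url.rstrip("/") + "/v1/models"

# The reporter's example server:
print(models_endpoint("http://192.168.8.211:1234"))
# -> http://192.168.8.211:1234/v1/models
```

If fetching that URL (e.g. with `curl`) returns a JSON model list while the "Models & settings" page stays empty, the problem is likely in how the UI handles the remote server's response rather than in the server configuration itself.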

Expected Behavior

The "Models & settings" page opens and shows the configured models.

Actual Behavior

Empty page

N.O.M.A.D. Version

1.31.0

Operating System

Ubuntu 24.04

Docker Version

No response

Do you have a dedicated GPU?

Yes

GPU Model (if applicable)

No response

System Specifications

No response

Service Status (if relevant)

No response

Relevant Logs

Browser Console Errors (if UI issue)

Screenshots

No response

Additional Context

No response

Pre-submission Checklist

  • I have searched for existing issues that might be related to this bug
  • I am running the latest version of Project N.O.M.A.D. (or have noted my version above)
  • I have redacted any personal or sensitive information from logs and screenshots
  • This issue is NOT related to running N.O.M.A.D. on an unsupported/non-Debian-based OS

Labels

bug (Something isn't working)
