
LM Studio returns Unexpected endpoint or method. #113

@skunkmonkey

Description


I can't seem to get this extension to work with LM Studio. I've successfully used my server with other software, so I know the server works.
I have CORS enabled. I'm serving on the local network. I've made sure the URL in BMO Chatbot points to the correct port for my LM Studio server. No glory. It can't detect the model; I just see No Model in the dropdown, and refreshing results in:
[ERROR] Unexpected endpoint or method. (GET /api/tags). Returning 200 anyway

I've tried multiple LLMs and every prompt template in the book. Am I missing something, or is this plugin broken for LM Studio?
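For reference, the log line above suggests the plugin is hitting GET /api/tags, which is the Ollama-style model-list route, while LM Studio's local server speaks the OpenAI-compatible API and lists models at GET /v1/models. Below is a minimal sketch of that check (assumptions: LM Studio serving at its default http://localhost:1234, Node 18+ for the global fetch); adjust the host and port to match your setup.

```typescript
// Minimal sketch: query LM Studio's OpenAI-compatible model list.
// Assumption: the server is reachable at the default http://localhost:1234.
const BASE_URL = "http://localhost:1234";

async function listModels(): Promise<void> {
  // GET /v1/models is the OpenAI-style model-list endpoint that LM Studio serves;
  // GET /api/tags (Ollama's route) triggers the "Unexpected endpoint or method" log.
  const res = await fetch(`${BASE_URL}/v1/models`);
  if (!res.ok) {
    throw new Error(`GET /v1/models failed: ${res.status} ${res.statusText}`);
  }
  const body = await res.json();
  // OpenAI-style responses wrap the loaded models in a "data" array.
  for (const model of body.data ?? []) {
    console.log(model.id);
  }
}

listModels().catch(console.error);
```

If this prints your loaded model but the plugin still shows No Model, the mismatch is in which endpoint the plugin queries, not in the server itself.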
