
[FEAT] Support ollama over the network. #172

@hayleigh-dot-dev

Description

I'd like to use glass on my lower-powered MacBook Air while running models remotely with ollama on a separate Mac Studio I have. Apps like glass often only support on-device local LLMs, but the ability to talk to models elsewhere on the local network could make glass accessible to more users.
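For reference, ollama already exposes its HTTP API over the network: on the remote machine it can be started with `OLLAMA_HOST=0.0.0.0 ollama serve`, and the client side only needs a configurable base URL instead of a hardcoded `localhost`. A minimal sketch of what that could look like (TypeScript; names such as `OLLAMA_BASE_URL` and `askOllama` are illustrative, not glass's actual integration code):

```ts
// Sketch: read the ollama endpoint from an env var instead of assuming localhost.
// OLLAMA_BASE_URL and askOllama are hypothetical names for illustration only.
const OLLAMA_BASE_URL = process.env.OLLAMA_BASE_URL ?? "http://localhost:11434";

interface OllamaChatResponse {
  message: { role: string; content: string };
}

async function askOllama(model: string, prompt: string): Promise<string> {
  const res = await fetch(`${OLLAMA_BASE_URL}/api/chat`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model,
      messages: [{ role: "user", content: prompt }],
      stream: false, // request a single JSON response rather than a stream
    }),
  });
  if (!res.ok) {
    throw new Error(`ollama request failed: ${res.status} ${res.statusText}`);
  }
  const data = (await res.json()) as OllamaChatResponse;
  return data.message.content;
}

// Example: OLLAMA_BASE_URL=http://mac-studio.local:11434 on the MacBook Air,
// with the Mac Studio running: OLLAMA_HOST=0.0.0.0 ollama serve
```

So the change would mostly be making the endpoint a setting rather than an assumption.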
