Feature Request: Support for local AI components #9

@nickgnat

Description

As a homelabber and board game nerd, I love the project! I saw another GitHub issue suggesting that the AI features be made optional, but I wanted to offer another option: many self-hosted apps integrate LiteLLM to allow the use of a variety of LLMs, including self-hosted Ollama. Would this be an option you'd consider? On the vector DB side, Qdrant is a popular option that supports both self-hosting and cloud use.

I don't necessarily have the know-how to integrate these myself, but I could certainly scaffold a docker-compose.yml that provides those components and test out any integrations!
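For reference, a compose file along those lines might look like this. This is just a sketch, not tested against this project: the service names, ports, volume names, and the `ollama/llama3` model are my assumptions, and the LiteLLM service would still need to be wired into the app's config.

```yaml
# Hypothetical docker-compose.yml sketch -- illustrative only, not part of this project.
services:
  ollama:
    image: ollama/ollama:latest
    volumes:
      - ollama_data:/root/.ollama   # persist downloaded models
    ports:
      - "11434:11434"

  litellm:
    image: ghcr.io/berriai/litellm:main-latest
    depends_on:
      - ollama
    ports:
      - "4000:4000"
    environment:
      # Point LiteLLM at the Ollama container over the compose network.
      - OLLAMA_API_BASE=http://ollama:11434
    # Model name here is an example; a mounted LiteLLM config file is another option.
    command: ["--model", "ollama/llama3", "--port", "4000"]

  qdrant:
    image: qdrant/qdrant:latest
    ports:
      - "6333:6333"
    volumes:
      - qdrant_data:/qdrant/storage  # persist collections

volumes:
  ollama_data:
  qdrant_data:
```

The idea being that the app would only ever talk to LiteLLM's OpenAI-compatible endpoint (`http://litellm:4000`) and to Qdrant, so the actual model backend stays swappable.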
