As a homelabber and board game nerd, I love the project! I saw another GitHub issue suggesting making the AI features optional, but I wanted to offer another option: many self-hosted apps integrate LiteLLM to allow the use of a variety of LLMs, including self-hosted Ollama. Would this be something you'd consider? On the vector DB side, Qdrant is a popular option that supports both self-hosting and cloud use.
I don't necessarily have the know-how to integrate these myself, but I could certainly scaffold together a docker-compose.yml that provides those components (see the rough sketch below) and test out any integrations!
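
Something along these lines is what I had in mind. This is just a rough, untested sketch using the stock Ollama and Qdrant images and LiteLLM's proxy image with their default ports; service names, tags, and volume names are placeholders:

```yaml
# docker-compose.yml -- sketch only; adjust images, tags, and ports to taste.
services:
  ollama:
    image: ollama/ollama:latest
    volumes:
      - ollama-data:/root/.ollama      # persist pulled models
    ports:
      - "11434:11434"                  # Ollama's default API port

  litellm:
    image: ghcr.io/berriai/litellm:main-latest
    command: ["--config", "/app/config.yaml"]
    volumes:
      - ./litellm-config.yaml:/app/config.yaml:ro
    ports:
      - "4000:4000"                    # LiteLLM proxy exposes an OpenAI-compatible API here
    depends_on:
      - ollama

  qdrant:
    image: qdrant/qdrant:latest
    volumes:
      - qdrant-data:/qdrant/storage    # persist collections
    ports:
      - "6333:6333"                    # HTTP API
      - "6334:6334"                    # gRPC API

volumes:
  ollama-data:
  qdrant-data:
```

The referenced litellm-config.yaml would then map an OpenAI-style model name onto whatever backend the user actually runs, e.g. a local Ollama model (model name below is just an example):

```yaml
# litellm-config.yaml -- route requests for "gpt-3.5-turbo" to a local Ollama model
model_list:
  - model_name: gpt-3.5-turbo
    litellm_params:
      model: ollama/llama3
      api_base: http://ollama:11434
```

That way the app could keep talking to a single OpenAI-compatible endpoint while users swap in hosted or self-hosted models behind LiteLLM.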