Local LLM/AI support? #198
Unanswered
ThiccNippls asked this question in Q&A
Replies: 1 comment
- Would like to see Ollama integration in the future.
- Do you plan on adding local LLM support (Mistral, Llama, Ollama, etc.) so that LLMs can privately interact with stored data?
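For anyone exploring this before official support exists, here is a minimal sketch of what an Ollama integration could look like. It assumes a locally running Ollama server at its default address (`http://localhost:11434`) and uses Ollama's `/api/generate` REST route; the helper names `build_payload` and `ask_local_llm` are hypothetical, not part of any existing project API.

```python
import json
import urllib.request

# Ollama's default local endpoint (assumption: server running with default settings)
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_payload(model: str, prompt: str) -> dict:
    """Build a request body for Ollama's /api/generate endpoint.

    stream=False asks Ollama to return one complete JSON object
    instead of a stream of partial responses.
    """
    return {"model": model, "prompt": prompt, "stream": False}


def ask_local_llm(model: str, prompt: str) -> str:
    """Send a prompt to a locally running Ollama server and return its reply.

    Data never leaves the machine, which is the point of local LLM support.
    """
    data = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

A caller would then do something like `ask_local_llm("llama3", "Summarize my stored notes about X")`, with the application injecting the relevant stored data into the prompt.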