Replies: 3 comments · 3 replies
- Interesting. Will try this approach once the plugin system (#255) is finished.
- Not tried anything so far, but I can see how a graph memory could make it feel more personal.
- I've been thinking: wouldn't it be awesome if we could use Unsloth-supported AI models together with Ollama? Imagine this: we fine-tune a model on Google Colab and then run it locally with Ollama to power an AI girlfriend. I already have a journaling project built with PySide6; the goal is to create daily notes that can include audio recordings (compressed to FLAC), images (converted to WebP), and encrypted text, audio, or photos, all synced through Git. The plan is to add a feature that exports everything as JSON, which can then be used as a fine-tuning dataset. Once the dataset is ready, I'll fine-tune an Unsloth-supported model on it, and after that the next step would be merging everything into the AI girlfriend project. The journaling project is at https://gitlab.com/krafi/git-backed-diary. Wouldn't that be a new revelation?
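  For the JSON export step, roughly something like this is what I have in mind. It's only a sketch: the diary layout, the `*.md` file pattern, and the `conversations` record shape are assumptions for illustration, not the actual git-backed-diary structure.

  ```python
  # Sketch of exporting daily notes to a JSONL fine-tuning dataset.
  # Paths, file patterns, and the record shape are assumptions, not the real project layout.
  import json
  from pathlib import Path

  DIARY_ROOT = Path("~/diary").expanduser()  # assumed location of the synced Git repo
  OUT_FILE = Path("dataset.jsonl")

  def export_entries(root: Path, out: Path) -> None:
      """Turn each daily note into one chat-style record for fine-tuning."""
      with out.open("w", encoding="utf-8") as f:
          for note in sorted(root.glob("**/*.md")):  # assuming Markdown daily notes
              text = note.read_text(encoding="utf-8").strip()
              if not text:
                  continue
              record = {
                  "conversations": [
                      {"role": "user", "content": f"Journal entry for {note.stem}:"},
                      {"role": "assistant", "content": text},
                  ]
              }
              f.write(json.dumps(record, ensure_ascii=False) + "\n")

  if __name__ == "__main__":
      export_entries(DIARY_ROOT, OUT_FILE)
  ```

  A JSONL file in a conversations-style format like this can then be loaded with the Hugging Face datasets library and mapped onto whatever chat template the Unsloth notebook expects; the exact field names depend on that template, so treat the shape above as a placeholder.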
- The waifu can have a vector DB, but why not add another layer of graph-based memory? PKMs and wikis are nothing new, but they probably hold some kind of power that is distinct from vector DBs. "Manual links" are probably more deliberate and can develop a bit of personality, while vectors are often stuck with whatever the embedding model considers similar, which means originality and lateral thinking might need this as a component. There are like two options for wiring it in, but either way, the kind of manual-link layer I mean is sketched below.
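  Just a toy sketch, not a real implementation; every name here (`GraphMemory`, the relation labels, the example notes) is invented for illustration.

  ```python
  # Toy sketch of a "manual links" graph layer sitting next to vector recall.
  # All class, field, and relation names are invented for illustration.
  from collections import defaultdict
  from dataclasses import dataclass, field

  @dataclass
  class GraphMemory:
      notes: dict[str, str] = field(default_factory=dict)  # note id -> text
      # links[src] holds (relation, dst) pairs, e.g. ("running_joke", "sunscreen_joke")
      links: dict[str, set[tuple[str, str]]] = field(default_factory=lambda: defaultdict(set))

      def add(self, note_id: str, text: str) -> None:
          self.notes[note_id] = text

      def link(self, src: str, relation: str, dst: str) -> None:
          self.links[src].add((relation, dst))

      def neighbours(self, note_id: str, depth: int = 1) -> list[str]:
          """Walk manual links outward; this is the lateral recall that
          embedding-similarity search would not surface on its own."""
          seen, frontier = {note_id}, [note_id]
          for _ in range(depth):
              nxt = []
              for nid in frontier:
                  for _, dst in self.links.get(nid, ()):
                      if dst not in seen:
                          seen.add(dst)
                          nxt.append(dst)
              frontier = nxt
          return [self.notes[n] for n in seen if n != note_id and n in self.notes]

  # Usage: vector search finds the obvious hit, the graph pulls in linked context.
  g = GraphMemory()
  g.add("beach_trip", "We talked about the beach trip in June.")
  g.add("sunscreen_joke", "Running joke about forgetting sunscreen.")
  g.link("beach_trip", "running_joke", "sunscreen_joke")
  print(g.neighbours("beach_trip"))
  ```

  The point being: a link like "running_joke" is something that was deliberately written down, so walking it can pull up context that pure embedding similarity would rank low, which is where the originality and lateral-thinking angle comes from.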