Your own local friend
This is a very quick wrapper around ollama to serve a local LLM AI friend to an iOS shortcut. There are two servers to choose from: TypeScript (index.ts) or Python (server.py).
TypeScript server:
- Install ollama
- Run ollama
ollama run llama3.1:8b
(If you use a different model, just change it in index.ts; see the sketch after this list.)
- Install deps
npm i
- Start server
npm start
- Install the shortcut on your phone (Either download from this repo or click here)
- Provide the IP address of your computer
- Give your friend a name!
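For reference, here is a minimal sketch of the kind of thing index.ts does, assuming an HTTP server that relays the shortcut's message to Ollama's /api/chat endpoint and returns the reply. The route, port, and payload shape are illustrative assumptions, not the repo's actual code; the model string is the part you would change for a different model.

```ts
// Sketch only: a tiny relay between the iOS shortcut and a local Ollama
// instance. The real index.ts defines the actual route, port and payload.
import { createServer } from "node:http";

const MODEL = "llama3.1:8b"; // change this if you run a different model
const PORT = 3000;           // assumed port; use whatever index.ts listens on

createServer(async (req, res) => {
  // Collect the JSON body sent by the shortcut, e.g. {"message": "hello"}.
  let body = "";
  for await (const chunk of req) body += chunk;
  const { message } = JSON.parse(body || "{}");

  // Forward the message to Ollama's chat API (Ollama listens on 11434).
  const ollamaRes = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: MODEL,
      messages: [{ role: "user", content: message }],
      stream: false,
    }),
  });
  const data = (await ollamaRes.json()) as { message: { content: string } };

  // Hand the assistant's reply back to the shortcut as plain text.
  res.writeHead(200, { "Content-Type": "text/plain" });
  res.end(data.message.content);
}).listen(PORT, () => console.log(`Friend listening on port ${PORT}`));
```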
Python server:
- Install ollama
- Run ollama
ollama run llama3.1:8b
(If you use a different model, just change it in server.py.)
- Install Python deps
pip install -r requirements.txt
- Start server
python server.py
- Install the shortcut on your phone (Either download from this repo or click here)
- Provide the IP address of your computer
- Give your friend a name!
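Whichever server you run, it can help to check that it is reachable from another device on your network before installing the shortcut. The route, port, and payload below are placeholders only; check index.ts or server.py for the real ones and substitute your computer's IP address.

```bash
curl http://<your-computer-ip>:3000/chat \
  -H "Content-Type: application/json" \
  -d '{"message": "hello"}'
```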
- Customise identity.txt to change things about your friend.
- Use ngrok to expose your friend and talk on the go (see the note below).
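For the ngrok step, assuming the server listens on port 3000 (check the port your server actually uses), something like the command below exposes it publicly; point the shortcut at the https URL ngrok prints instead of your local IP.

```bash
ngrok http 3000
```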