Before you begin, make sure you have:
- macOS 11+ (Buddy is optimized for Apple Silicon)
- Node.js 18+ (check with `node --version`)
- 16GB RAM (8GB minimum)
- 5-6GB free disk space
- Chrome Browser
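The Node.js requirement above can be checked in a script; a minimal sketch in POSIX shell (the `parse_major` helper is just for illustration):

```shell
#!/bin/sh
# Preflight sketch: verify the installed Node.js meets the 18+ requirement.
# parse_major turns a version string like "v18.19.0" into its major number.
parse_major() {
  printf '%s\n' "$1" | sed 's/^v\{0,1\}\([0-9][0-9]*\).*/\1/'
}

major=$(parse_major "$(node --version 2>/dev/null)")
if [ -n "$major" ] && [ "$major" -ge 18 ]; then
  echo "Node.js OK (major version $major)"
else
  echo "Node.js 18+ required (found: ${major:-none})" >&2
fi
```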
Ollama is the local AI runtime that powers Buddy.
- Visit https://ollama.ai
- Download Ollama for macOS
- Open the downloaded `.dmg` file
- Drag Ollama to your Applications folder
- Open Ollama (it will appear in your menu bar)
Verify installation:

```bash
ollama --version
```

Choose one of these models:

Smaller, faster, good for most tasks. Uses ~4GB RAM:

```bash
ollama pull llama3.2
```

Better writing quality. Uses ~7GB RAM:

```bash
ollama pull llama3
```

Test the model:
```bash
ollama run llama3.2 "Say hello"
```

- Navigate to the backend folder:

```bash
cd /Users/manmit/Dev/idea/buddy/backend
```

- Install dependencies:

```bash
npm install
```

- Verify the `.env` file exists and has correct settings:

```bash
cat .env
```

Should show:

```
PORT=3000
OLLAMA_HOST=http://localhost:11434
OLLAMA_MODEL=llama3.2
```
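If you later want to read single values from `.env` in a script rather than `cat` the whole file, a minimal sketch, assuming plain `KEY=VALUE` lines with no quoting (the `env_get` name is invented here):

```shell
#!/bin/sh
# env_get KEY [FILE] — print the value of KEY from a .env-style file.
# Sketch only: assumes one plain KEY=VALUE per line, no quoting or "export".
env_get() {
  grep "^$1=" "${2:-.env}" 2>/dev/null | head -n 1 | cut -d= -f2-
}

# Example: env_get OLLAMA_MODEL   # prints llama3.2 with the .env above
```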
- Start the backend server:

```bash
npm run dev
```

You should see:

```
🚀 Buddy Backend Server Started!
================================
📍 Server: http://localhost:3000
🤖 Ollama: http://localhost:11434
🧠 Model: llama3.2
```
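The server takes a moment to come up, so instead of testing immediately, a small polling helper can wait until the health endpoint responds. A sketch (assumes `curl`; the `wait_for` name is invented here):

```shell
#!/bin/sh
# wait_for URL [ATTEMPTS] — poll URL roughly once per second until it
# responds, or fail after ATTEMPTS tries (default 10). Sketch only.
wait_for() {
  url=$1
  attempts=${2:-10}
  i=0
  while [ "$i" -lt "$attempts" ]; do
    if curl -s -o /dev/null --max-time 1 "$url"; then
      return 0
    fi
    i=$((i + 1))
    sleep 1
  done
  return 1
}

# Example: wait_for http://localhost:3000/health && echo "Backend is up"
```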
Open a new terminal and test the endpoints:
Health check:
```bash
curl http://localhost:3000/health
```

Test Ollama connection:

```bash
curl http://localhost:3000/api/test
```

Test chat:

```bash
curl -X POST http://localhost:3000/api/chat \
  -H "Content-Type: application/json" \
  -d '{"message": "What is your name?"}'
```

If something goes wrong, work through these checks:

```bash
# Check if Ollama is running
curl http://localhost:11434

# If not, open Ollama from Applications
```

```bash
# List installed models
ollama list

# Pull the model if missing
ollama pull llama3.2
```

```bash
# Check what's using port 3000
lsof -i :3000

# Kill the process or change PORT in .env
```

```bash
# Clear node_modules and reinstall
rm -rf node_modules package-lock.json
npm install
```

Once the backend is running successfully:
- Proceed to Phase 2: Data Storage & RAG
- Add your personal information
- Build the Chrome extension
For convenience, use the provided scripts:

```bash
# Make scripts executable
chmod +x scripts/*.sh

# Install Ollama and model
./scripts/install-ollama.sh

# Start development environment
./scripts/start-dev.sh
```

If you encounter issues:
- Check the troubleshooting section above
- Review the logs in the terminal
- Verify all prerequisites are met
- Check that Ollama is running in the menu bar
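The last two checklist items can be turned into a quick one-shot status report; a sketch, assuming `curl` and the ports configured above:

```shell
#!/bin/sh
# One-shot status report for the local services used by Buddy (sketch).
check() {
  # $1 = label, $2 = URL; curl succeeds only if the service answers.
  if curl -s -o /dev/null --max-time 2 "$2"; then
    echo "OK:   $1 ($2)"
  else
    echo "DOWN: $1 ($2)"
  fi
}

check "Ollama"  "http://localhost:11434"
check "Backend" "http://localhost:3000/health"
```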