SayLO is an open-source interview preparation platform that focuses on privacy and local-first AI processing. It provides simulated interviews, resume parsing, video interviews (via Jitsi), and performance analytics.
Clean, modular architecture with separated frontend and backend.
- Frontend entry: `frontend/src/main.tsx`
- Backend entry: `backend/index.js`
- Frontend dev: `npm run dev`
- Backend dev: `npm run dev:backend`
- Both: `npm run dev:all`
- Features
- Getting started (dev & production)
- Environment variables
- Testing
- Project structure
- Ollama (optional) and AI
- Contributing
- Troubleshooting
- Local-first AI interview simulations (optional Ollama integration)
- Resume parsing (client-side PDF processing)
- Jitsi-based video interviews
- Session storage using IndexedDB / Dexie
- Performance metrics and analytics
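Interview sessions persisted via Dexie feed the performance analytics. A minimal sketch of what a stored session record and a simple metric could look like — the field names and the `averageScore` helper are illustrative assumptions, not SayLO's actual schema:

```typescript
// Illustrative shape of a stored interview session (NOT SayLO's real schema).
interface InterviewSession {
  id: number;
  answers: { question: string; score: number }[]; // score assumed 0-100
}

// Simple aggregate metric over a session's answers.
function averageScore(session: InterviewSession): number {
  if (session.answers.length === 0) return 0;
  return (
    session.answers.reduce((sum, a) => sum + a.score, 0) /
    session.answers.length
  );
}

const demo: InterviewSession = {
  id: 1,
  answers: [
    { question: "Tell me about yourself", score: 80 },
    { question: "Describe a conflict you resolved", score: 60 },
  ],
};
console.log(averageScore(demo)); // → 70
```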
Prerequisites
- Node.js v18+ (Node 20 recommended)
- npm (or pnpm/yarn)
Clone:

```bash
git clone https://github.com/hitesh-kumar123/saylo.git
cd saylo
```

Install:

```bash
npm install
```

Run locally (two terminals):

- Terminal A: start the backend

  ```bash
  npm run dev:backend
  ```

- Terminal B: start the frontend

  ```bash
  npm run dev
  ```

Or run both in one terminal:

```bash
npm run dev:all
```

Open http://localhost:5173 in your browser. The backend API runs at http://localhost:3001/api.
Production build

```bash
npm run build
npm run preview
```

Notes

- The repository doesn't run a single combined process by default. Running the frontend and backend in separate shells is the simplest local workflow.
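For reference, the root scripts used above would typically be wired up roughly like this — a sketch only, assuming `dev:all` uses a tool such as `concurrently`; check the actual root `package.json` for the real definitions:

```json
{
  "scripts": {
    "dev": "npm --prefix frontend run dev",
    "dev:backend": "npm --prefix backend run dev",
    "dev:all": "concurrently \"npm run dev:backend\" \"npm run dev\""
  }
}
```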
Create a backend `.env` at the project root, and a frontend `.env.local` (also at the root) for Vite variables.
Backend (`.env`):

```
PORT=3001
JWT_SECRET=your_jwt_secret_here
FRONTEND_ORIGIN=http://localhost:5173
```
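A hedged sketch of how a config loader might consume these variables on the backend — the defaults and structure are assumptions for illustration (the real entry point is `backend/index.js`), written in TypeScript for clarity:

```typescript
// Illustrative config loader; variable names match the .env above,
// but the defaults and shape are assumptions, not SayLO's actual code.
interface BackendConfig {
  port: number;
  jwtSecret: string;
  frontendOrigin: string;
}

function loadConfig(env: Record<string, string | undefined>): BackendConfig {
  return {
    port: Number(env.PORT ?? "3001"),
    jwtSecret: env.JWT_SECRET ?? "",
    frontendOrigin: env.FRONTEND_ORIGIN ?? "http://localhost:5173",
  };
}

console.log(loadConfig(process.env));
```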
Frontend (`.env.local`):

```
VITE_API_URL=http://localhost:3001/api
VITE_JITSI_DOMAIN=meet.jit.si
VITE_OLLAMA_HOST=http://localhost:11434
VITE_OLLAMA_MODEL=llama3.2:3b
```
Tip: Vite only exposes variables prefixed with VITE_ to client code.
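Client code reads these via `import.meta.env` (e.g. `import.meta.env.VITE_API_URL`). As an illustration, a small hypothetical helper for composing request URLs from the configured base — not taken from SayLO's source:

```typescript
// Hypothetical helper: joins the configured API base (VITE_API_URL)
// with a request path, normalizing any duplicate slashes at the seam.
function buildApiUrl(base: string, path: string): string {
  return `${base.replace(/\/+$/, "")}/${path.replace(/^\/+/, "")}`;
}

console.log(buildApiUrl("http://localhost:3001/api", "/sessions"));
// → http://localhost:3001/api/sessions
```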
Frontend tests with Vitest:

```bash
npm test              # Watch mode
npm run test:run      # Single run
npm run test:ui       # UI dashboard
npm run test:coverage # Coverage report
```

Test files: `frontend/test/`
saylo/
├── frontend/ # React + TypeScript frontend
│ ├── src/
│ │ ├── components/ # Reusable UI components
│ │ ├── pages/ # Page components
│ │ ├── services/ # API & utility services
│ │ ├── store/ # Zustand state management
│ │ └── types/ # TypeScript types
│ └── package.json
│
├── backend/ # Node/Express API
│ ├── index.js # Main entry point
│ ├── config/ # Configuration files
│ ├── middleware/ # Express middleware
│ ├── routes/ # API endpoints
│ ├── services/ # Business logic
│ ├── utils/ # Utilities & validators
│ └── db/ # Database (mock, replace with real)
│
├── docs/ # Documentation
├── test/ # Frontend tests
└── package.json # Root package config
SayLO can integrate with Ollama to provide richer AI question generation and analysis. Ollama is optional — SayLO has fallback logic when Ollama is not available.
Basic steps (high level):
- Install Ollama (download from https://ollama.ai or use the platform installer).
- Run `ollama serve` (default port 11434).
- Pull a model, e.g. `ollama pull llama3.2:3b`.
- Configure `VITE_OLLAMA_HOST` and `VITE_OLLAMA_MODEL` in `.env.local`.
See OLLAMA_SETUP.md for a more detailed guide and troubleshooting tips.
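Ollama serves an HTTP API on the configured host; its generate endpoint is `POST /api/generate`. A hedged sketch of the kind of request body a client could build — the wrapper function and prompt are illustrative, not SayLO's actual service code:

```typescript
// Builds the JSON body for Ollama's POST /api/generate endpoint.
// The model name mirrors VITE_OLLAMA_MODEL; stream: false requests
// a single JSON reply instead of a streamed response.
interface OllamaGenerateRequest {
  model: string;
  prompt: string;
  stream: boolean;
}

function buildGenerateRequest(
  model: string,
  prompt: string
): OllamaGenerateRequest {
  return { model, prompt, stream: false };
}

const body = buildGenerateRequest(
  "llama3.2:3b",
  "Ask one behavioral interview question."
);
console.log(JSON.stringify(body));

// A real call would then POST it to the configured host, e.g.:
// fetch(`${import.meta.env.VITE_OLLAMA_HOST}/api/generate`, {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(body),
// });
```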
Small contribution guidelines are in docs/CONTRIBUTING.md. In short:
- Fork and use feature branches
- Run tests and linters before opening a PR
- Add tests for new functionality
- Use clear, small commits
- If resume upload / parsing fails: check browser console for PDF.js worker errors and ensure the file is a valid PDF.
- If video (Jitsi) fails to connect: verify browser camera/microphone permissions and that `VITE_JITSI_DOMAIN` is reachable.
- If AI features are missing: either Ollama isn't running or the configured model isn't available (check with `ollama list`).
If you encounter environment-specific issues, open an issue with a short reproduction and logs.