LawBuddy AI is an intelligent legal assistant platform designed to simplify legal processes. It leverages local Large Language Models (LLMs) via Ollama to provide secure, private, and instant legal analysis, document drafting, and case management.
- 🤖 AI Legal Assistant: Real-time chat with custom-tuned local models (`LawBuddy:latest`) for case analysis and strategy.
- 📂 Case Management: Track cases, clients, status, and deadlines in a unified dashboard.
- 📝 Document Drafting: AI-assisted generation of legal letters, complaints, and contracts.
- 🔒 Secure & Private: All data processing happens locally. No external API calls.
- 🎨 Modern UI: Clean, responsive interface with Dark Mode support.
- Node.js: v18 or higher recommended.
- Ollama: Must be installed and running (download Ollama from the official website).
Install dependencies:

```bash
npm install
```

Ensure Ollama is running. You need the `LawBuddy:latest` model. If you don't have it yet, pull a base model such as Llama 3 or Mistral:

```bash
ollama pull llama3

# Or create a custom model if you have a Modelfile
# ollama create LawBuddy -f Modelfile
```

Note: The application is configured to use `LawBuddy:latest`. You can change this in `routes/ai.js` or via environment variables.
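Since the model name can be overridden via environment variables, the lookup might look roughly like the sketch below. `OLLAMA_MODEL` is an assumed variable name used here for illustration; check `routes/ai.js` for the one the project actually reads.

```javascript
// Hypothetical sketch: resolve the chat model at startup, preferring an
// environment override and falling back to the bundled default.
function resolveModel(env = process.env) {
  return env.OLLAMA_MODEL || "LawBuddy:latest";
}

// With no override set, the default model is used.
console.log(resolveModel({})); // "LawBuddy:latest"
console.log(resolveModel({ OLLAMA_MODEL: "llama3" })); // "llama3"
```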
Create a `.env` file in the root directory:
```
PORT=3000
SESSION_SECRET=your-secure-session-secret
```

Start the application:

```bash
# Start server
npm start

# Dev mode (if nodemon is installed)
npm run dev
```

Visit http://localhost:3000 in your browser.
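Once the server is up, it talks to Ollama over the local HTTP API (default port 11434). The sketch below shows what such a request could look like; `buildOllamaRequest` is a hypothetical helper for illustration, not the project's actual code.

```javascript
// Sketch: build a request for Ollama's local /api/generate endpoint.
// All data stays on the machine — no external API calls.
function buildOllamaRequest(prompt, model = "LawBuddy:latest") {
  return {
    url: "http://localhost:11434/api/generate",
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model, prompt, stream: false }),
    },
  };
}

// Usage (requires a running Ollama instance):
// const { url, options } = buildOllamaRequest("Summarize this clause in plain English.");
// const res = await fetch(url, options);
// const { response } = await res.json();
```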
Project structure:

- `server.js`: Main Express application entry point.
- `routes/`: API route handlers (`auth`, `cases`, `ai`, etc.).
- `database.js`: Local JSON file-based database (no SQL/NoSQL setup required).
- `public/`: Static assets (HTML, CSS, JS).
LawBuddy AI is an assistive tool and does not provide professional legal advice. All outputs should be reviewed by a licensed attorney.
This project is licensed under the MIT License.