
LyraNote

Your AI-powered second brain — chat with your knowledge, not just store it.

English · 简体中文


👋 Getting Started

LyraNote is a modern, AI-powered personal knowledge management app designed to be your second brain. By integrating RAG (Retrieval-Augmented Generation), multi-step AI Agents, knowledge graphs, and long-term memory, LyraNote lets you truly converse with your own knowledge base — not just search through it.


✨ Features

Knowledge Management

  • Multi-format Import — Ingest PDF files, web URLs, and Markdown text; auto-parsed, chunked, and vectorized into your knowledge base.
  • RAG Conversations — AI answers questions grounded in your notebook's knowledge base, with source citations.
  • Knowledge Graph — Automatically extracts entities and relationships from sources and renders an interactive force-directed graph.
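
The import pipeline above (parse → chunk → vectorize) is not detailed in this README; as a minimal sketch, assuming fixed-size character chunks with overlap (the chunk size and overlap values are illustrative, not LyraNote's actual parameters):

```python
def chunk_text(text: str, chunk_size: int = 500, overlap: int = 50) -> list[str]:
    """Split a document into overlapping character chunks ready for embedding."""
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        # Step forward by less than a full chunk so adjacent chunks share context
        start += chunk_size - overlap
    return chunks
```

Each chunk would then be embedded and stored in pgvector for similarity search during RAG conversations.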

AI Assistant

  • Streaming AI Chat — Real-time SSE streaming with multi-turn context support.
  • Deep Research Agent — Multi-step autonomous research: browses the web and produces structured research reports.
  • AI Copilot — A floating AI panel docked beside the editor, always aware of your current notebook.
  • Inline Ghost Text — AI suggestions appear inline as you type; press Tab to accept.
  • AI-Generated Content — One-click generation of summaries, FAQs, study guides, briefings, and more.
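
The streaming chat delivers tokens over SSE. As an illustration of the wire format only — the event names `message` and `done` are assumptions, not LyraNote's actual protocol:

```python
def sse_format(event: str, data: str) -> str:
    """Serialize one Server-Sent Event frame: event name, data payload, blank-line terminator."""
    return f"event: {event}\ndata: {data}\n\n"

def stream_tokens(tokens):
    """Yield LLM output tokens as SSE frames, followed by a terminal 'done' frame."""
    for tok in tokens:
        yield sse_format("message", tok)
    yield sse_format("done", "[DONE]")
```

In the real backend, a generator like this would be wrapped in a FastAPI streaming response so the browser receives tokens as they are produced.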

Rich Note Editing

  • Rich Text Editor — Powered by Tiptap with Markdown shortcuts, headings, lists, code blocks, and blockquotes.
  • Auto-save — Edits sync to the backend in real time.
  • Public Sharing — Generate a read-only public link for any notebook.

Smart Automation

  • Long-term Memory — AI remembers user preferences and knowledge points across sessions for continuous personalization.
  • Scene Awareness — Automatically detects conversation context (research / writing / learning / review) and adapts strategy.
  • Mind Maps — Renders interactive mind maps directly inside AI chat.
  • Scheduled Tasks — Create Cron jobs (daily news digests, knowledge briefings, etc.) with email delivery.
  • Proactive Insights — AI proactively surfaces insight cards related to your current content.
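
A scheduled task boils down to resolving a schedule to its next run time. As a simplified sketch — real Cron expressions are richer, and this is not LyraNote's actual Celery-based scheduler — resolution for a daily `HH:MM` schedule might look like:

```python
from datetime import datetime, timedelta

def next_run(schedule_hhmm: str, now: datetime) -> datetime:
    """Next occurrence of a daily HH:MM schedule strictly after `now`."""
    hour, minute = map(int, schedule_hhmm.split(":"))
    candidate = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
    if candidate <= now:
        # Today's slot has passed; schedule for the same time tomorrow
        candidate += timedelta(days=1)
    return candidate
```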

🛠 Tech Stack

Frontend (web/)

| Technology | Purpose |
| --- | --- |
| Next.js 15 (App Router) | React full-stack framework |
| React 19 + TypeScript | UI development |
| Tailwind CSS | Utility-first styling |
| Tiptap | Rich text editor |
| TanStack Query | Server-state management & caching |
| Zustand | Client-side global state |
| Framer Motion | Animations |
| react-force-graph-2d | Knowledge graph visualization |
| markmap | Mind map rendering |
| next-intl | Internationalization (i18n) |

Backend (api/)

| Technology | Purpose |
| --- | --- |
| Python 3.12 + FastAPI | Async web framework |
| SQLAlchemy 2.0 + asyncpg | Async ORM |
| Alembic | Database migrations |
| PostgreSQL 16 + pgvector | Relational data + vector similarity search |
| Celery + Redis | Background async task queue |
| OpenAI SDK | LLM calls & text embeddings |
| LangGraph | Multi-step Agent orchestration |
| MinIO / S3 | File object storage |
| Tavily API | Web search tool |

🏗 Architecture

┌─────────────────────────────────────────────────────────────────┐
│                         User Browser                            │
│          Next.js 15 Frontend (React 19 + Tiptap + Zustand)      │
└──────────────────────────────┬──────────────────────────────────┘
                               │ HTTP / SSE
┌──────────────────────────────▼──────────────────────────────────┐
│                   FastAPI Backend (Python 3.12)                  │
│   ┌─────────────┐  ┌──────────────┐  ┌───────────────────────┐  │
│   │  REST API   │  │  SSE Stream  │  │    Celery Worker      │  │
│   │   Routers   │  │  AI Chat     │  │  (Background AI Tasks)│  │
│   └──────┬──────┘  └──────┬───────┘  └──────────┬────────────┘  │
│          │                │                     │               │
│   ┌──────▼────────────────▼─────────────────────▼────────────┐  │
│   │                  Agent / Skills Layer                     │  │
│   │   ReAct Agent · RAG · Deep Research · Memory · KG · Write │  │
│   └──────────────────────────────────────────────────────────┘  │
└──────────────────────────────────────────────────────────────────┘
         │                    │                    │
┌────────▼──────┐   ┌─────────▼────────┐   ┌──────▼───────────┐
│  PostgreSQL   │   │      Redis       │   │   MinIO / S3     │
│  + pgvector   │   │ (Celery Broker   │   │  (File Storage)  │
│ (Data+Vector) │   │   + Cache)       │   │                  │
└───────────────┘   └──────────────────┘   └──────────────────┘

🛳 Self Hosting

Option 1 — Local Development

Best for hot-reload debugging. The data layer (PostgreSQL + Redis) is managed by Docker; the application layer runs as local processes.

```bash
git clone https://github.com/LinMoQC/LyraNote.git
cd LyraNote
./lyra init     # interactive setup wizard — generates api/.env
./lyra local    # start local dev mode
```

The CLI automatically: detects/starts database containers → creates a Python venv → installs dependencies → runs DB migrations → starts FastAPI, Celery Worker, and Next.js Dev Server in parallel.

Press Ctrl+C to stop local processes; database containers are unaffected.


Option 2 — Docker Compose (All-in-one)

Runs everything — frontend, backend, worker, and all infrastructure — in containers. Good for a quick full-stack preview or self-hosted server deployment.

1. Configure environment variables

```bash
./lyra init     # interactive wizard — generates api/.env with all required values
```

Infrastructure connection strings (DATABASE_URL, REDIS_URL, STORAGE_S3_*) are already injected by the compose file — no need to set them in .env.

AI configuration (API keys, models, etc.) is managed via the Setup Wizard on first launch, stored in the database. .env values act only as fallbacks.

2. Start (development)

```bash
./lyra docker   # start all services via Docker Compose
```

Once running:

  • Frontend: http://localhost:3000
  • Backend API: http://localhost:8000
  • API Docs: http://localhost:8000/docs
```bash
./lyra logs     # tail live logs
./lyra stop     # stop all services
```

3. Deploy to production server

Use docker-compose.prod.yml which adds Nginx, disables debug, and omits exposed ports for the database/cache services.

```bash
./lyra init     # select "production server" mode — generates root .env with domain + passwords
./lyra prod     # pull cloud images and start production stack
```

The init wizard automatically generates a random JWT_SECRET, POSTGRES_PASSWORD, and MINIO_ROOT_PASSWORD, and writes them into the root .env consumed by docker-compose.prod.yml.

Then open https://your-domain.com and complete the Setup Wizard.


Option 3 — Frontend on Vercel + Backend on Server

Deploy the backend via Docker Compose on your server, and the frontend separately on Vercel.

Backend (server)

```bash
./lyra init     # generates root .env (choose "production server" mode)
docker compose -f docker-compose.prod.yml up -d db redis minio minio-init api worker
```

Open port 80 (Nginx) in your firewall. HTTPS via a reverse proxy is strongly recommended.

Frontend (Vercel)

Add the following Environment Variable in your Vercel project dashboard:

| Variable | Value |
| --- | --- |
| `NEXT_PUBLIC_API_BASE_URL` | `https://your-server.com/api/v1` |

Push your code and Vercel will deploy automatically.


⚙️ Environment Variables

Backend (api/.env)

In Docker Compose mode, infrastructure connection strings are injected by the compose file and do not need to be set in .env.

| Variable | Description | Required |
| --- | --- | --- |
| `JWT_SECRET` | JWT signing key (`openssl rand -hex 32`) | Yes |
| `GOOGLE_CLIENT_ID/SECRET` | Google OAuth | Optional |
| `GITHUB_CLIENT_ID/SECRET` | GitHub OAuth | Optional |

AI-related config (OPENAI_API_KEY, LLM_MODEL, EMBEDDING_MODEL, TAVILY_API_KEY, storage backend, etc.) is stored in the database and managed via the Setup Wizard or Settings page. Values in .env act only as fallbacks.

Frontend (web/.env.local)

| Variable | Description |
| --- | --- |
| `NEXT_PUBLIC_API_BASE_URL` | Backend API base URL (only needed for Vercel deployments) |

⌨️ Quick Start

Option A — install globally once, then use lyra anywhere:

```bash
npm install -g lyra-cli   # installs the lyra command globally
# or: cd scripts/lyra-cli && npm link

lyra init     # interactive wizard — generates .env
lyra docker   # start all services (Docker Compose)
lyra status   # check service health
lyra logs     # tail live logs
lyra stop     # stop everything
```

Option B — no install, run directly from the repo root:

```bash
git clone https://github.com/LinMoQC/LyraNote.git
cd LyraNote
./lyra          # interactive menu (uses Node.js, no npm install needed)
```

Requires Node.js ≥ 18 and Docker. Run node --version to verify.


🤝 Contributing

Contributions are welcome! Feel free to open an issue or submit a pull request.



Copyright © 2026 LinMoQC.
This project is MIT licensed.