Commit 83b00b2

revert: restore README destroyed by author section commit

1 parent d980edc · commit 83b00b2

File tree

1 file changed (+241, −6 lines)

README.md

Lines changed: 241 additions & 6 deletions
<h1 align="center">AI Video Comment Analyzer</h1>

<p align="center">
AI-powered YouTube comment analysis with ML sentiment detection, topic modeling, and AI-generated summaries.
</p>

<p align="center">
<img src="demo/demo.gif" alt="Demo" width="700">
</p>

## Architecture

<p align="center">
<img src="demo/architecture.svg" alt="Architecture" width="900">
</p>

## Features

- **Comment Extraction**: Fetch comments per video with yt-dlp, up to a configurable limit (default 100)
- **Sentiment Analysis**: BERT-powered multilingual sentiment classification (positive/negative/neutral/suggestion)
- **Topic Modeling**: BERTopic clustering to identify key discussion themes
- **AI Summaries**: Local LLM-powered summaries via Ollama (llama3.2:3b)
- **Multi-Page Dashboard**: Dedicated pages for Overview, Charts, Topics, and Comments
- **Real-time Progress**: SSE streaming with ML metrics during analysis
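
The dashboard consumes analysis progress over SSE. The exact event payload is defined by the backend (`api/routers/analysis.py`, not shown here); as a minimal sketch, assuming each `data:` line carries a JSON progress object, a client could parse the stream like this:

```python
import json


def parse_sse_lines(raw: str) -> list[dict]:
    """Collect the JSON payload of each `data:` line in an SSE stream.

    The event shape (one JSON object per `data:` line) is an assumption
    for illustration; the real payload is whatever the backend emits.
    """
    events = []
    for line in raw.splitlines():
        if line.startswith("data:"):
            events.append(json.loads(line[len("data:"):].strip()))
    return events


# Hypothetical stream fragment for illustration:
stream = 'data: {"step": "sentiment", "processed": 32, "total": 100}\n\n'
events = parse_sse_lines(stream)
```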

<details>
<summary>Home</summary>
<img src="demo/home.png" alt="Home" width="700">
</details>

<details>
<summary>Overview</summary>
<img src="demo/overview.png" alt="Overview" width="700">
</details>

<details>
<summary>Charts</summary>
<img src="demo/charts.png" alt="Charts" width="700">
</details>

<details>
<summary>Suggestions</summary>
<img src="demo/suggestions.png" alt="Suggestions" width="700">
</details>

## Quick Start

### Prerequisites

- Node.js 20+
- pnpm
- Python 3.11+
- [uv](https://docs.astral.sh/uv/) (Python package manager)
- [Ollama](https://ollama.ai) (optional, for AI summaries)
- NVIDIA GPU with CUDA (optional, for faster ML inference)

### Frontend Setup

```bash
# Install dependencies
pnpm install

# Run development server
pnpm dev
```

Open [http://localhost:3000](http://localhost:3000) in your browser.

### Backend Setup

```bash
# Install uv (if not already installed)
curl -LsSf https://astral.sh/uv/install.sh | sh

# Install dependencies (includes the ML stack: torch, transformers, bertopic)
uv sync

# Run API server
uv run uvicorn api.main:app --reload --port 8000
```

The API is available at [http://localhost:8000](http://localhost:8000).

### Ollama Setup (Optional)

For AI-generated summaries of comment sentiment:

```bash
# Install Ollama
curl -fsSL https://ollama.ai/install.sh | sh

# Pull the model
ollama pull llama3.2:3b

# Start the Ollama server (if not already running)
ollama serve
```
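
Once Ollama is running, the backend talks to its REST API. As a rough sketch of that interaction (the app's real prompt and request logic live in `api/services/summarizer.py`; the prompt wording below is an assumption):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # matches the OLLAMA_URL default


def build_generate_request(model: str, prompt: str) -> dict:
    # Ollama's /api/generate endpoint accepts a model name, a prompt,
    # and a stream flag; stream=False returns one complete response.
    return {"model": model, "prompt": prompt, "stream": False}


def summarize(comments: list[str]) -> str:
    """Ask the local model for a one-shot summary of the comments."""
    prompt = "Summarize the overall sentiment of these comments:\n" + "\n".join(comments)
    payload = json.dumps(build_generate_request("llama3.2:3b", prompt)).encode()
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```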
## Architecture

### Multi-Page Routes

```
/                        - Home (URL input + history sidebar)
/analysis/[id]           - Overview page (At a Glance + Summary Cards)
/analysis/[id]/charts    - Charts page (2x2 grid with context)
/analysis/[id]/topics    - Topics page (list + detail panel)
/analysis/[id]/comments  - Comments page (filters + sorting)
```

### Tech Stack

- **Frontend**: Next.js 15, React 19, TypeScript, Tailwind CSS v4, shadcn/ui, Recharts
- **Backend**: FastAPI, Python 3.11+, yt-dlp, SQLAlchemy
- **Database**: SQLite
- **AI/ML**:
  - `nlptown/bert-base-multilingual-uncased-sentiment` (sentiment)
  - BERTopic with `all-MiniLM-L6-v2` embeddings (topics)
  - Ollama with llama3.2:3b (AI summaries)
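
The nlptown model emits 1–5 star labels rather than the app's sentiment buckets, so a mapping step is needed somewhere in `api/services/sentiment.py`. A minimal sketch, assuming star labels like `"3 stars"` and thresholds of my own choosing (the `suggestion` class presumably comes from a separate rule and is not shown):

```python
def stars_to_sentiment(label: str) -> str:
    """Map an nlptown star label ("1 star".."5 stars") to a sentiment bucket.

    The exact thresholds are an assumption for illustration, not the
    app's confirmed logic.
    """
    stars = int(label.split()[0])
    if stars <= 2:
        return "negative"
    if stars == 3:
        return "neutral"
    return "positive"
```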

### Design System

- **Typography**: Fraunces (display), DM Sans (body), JetBrains Mono (code)
- **Colors**: Navy (#1E3A5F) with a Terracotta (#D4714E) accent on a warm cream background
- **Sentiment**: Green (positive), Red (negative), Blue (suggestion), Gray (neutral)

## Project Structure

```
src/
├── app/
│   ├── page.tsx              # Home (input + history)
│   ├── layout.tsx            # Root layout with fonts
│   ├── globals.css           # Design system
│   └── analysis/[id]/
│       ├── layout.tsx        # Analysis layout (nav + tabs)
│       ├── page.tsx          # Overview page
│       ├── charts/page.tsx   # Charts page
│       ├── topics/page.tsx   # Topics page
│       └── comments/page.tsx # Comments page
├── components/
│   ├── navigation/           # GlobalNav, AnalysisTabs
│   ├── blocks/               # EvidenceStrip, SummaryCard, SentimentFilter
│   ├── charts/               # SentimentPie, EngagementBar, etc.
│   └── results/              # TopicRanking, CommentCard, etc.
├── context/
│   └── analysis-context.tsx  # Shared analysis state
├── hooks/
│   ├── useAnalysis.ts        # Analysis + ML metrics
│   └── useAnalysisData.ts    # Data fetching
└── types/
    └── index.ts              # TypeScript interfaces

api/
├── main.py                   # FastAPI app
├── config.py                 # Environment config
├── routers/
│   └── analysis.py           # SSE streaming endpoints
├── services/
│   ├── youtube.py            # yt-dlp extraction
│   ├── sentiment.py          # BERT sentiment
│   ├── topics.py             # BERTopic modeling
│   └── summarizer.py         # Ollama summaries
└── db/
    └── models.py             # SQLAlchemy models
```

## API Endpoints

| Method | Endpoint | Description |
|--------|----------|-------------|
| POST   | `/api/analysis/analyze` | Start analysis (SSE stream) |
| GET    | `/api/analysis/result/{id}` | Get analysis results |
| GET    | `/api/analysis/result/{id}/comments` | Get comments for analysis |
| GET    | `/api/analysis/history` | List past analyses |
| DELETE | `/api/analysis/history/{id}` | Delete an analysis |
| GET    | `/api/analysis/search` | Search YouTube videos |
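
As a small plain-stdlib sketch of calling the GET endpoints above (the `abc123` analysis id is hypothetical, used only for illustration):

```python
import json
import urllib.request

BASE_URL = "http://localhost:8000"  # backend from "Backend Setup"


def endpoint(path: str, **params) -> str:
    """Fill `{id}`-style placeholders in a route from the table above."""
    return BASE_URL + path.format(**params)


def fetch_json(url: str) -> dict:
    # Plain urllib GET; requires the API server to be running.
    with urllib.request.urlopen(url) as resp:
        return json.loads(resp.read())


history_url = endpoint("/api/analysis/history")
result_url = endpoint("/api/analysis/result/{id}", id="abc123")
```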
## Environment Variables

Copy `.env.example` to `.env` and configure:

```bash
# YouTube
YOUTUBE_MAX_COMMENTS=100

# Ollama (AI summaries)
OLLAMA_URL=http://localhost:11434
OLLAMA_MODEL=llama3.2:3b
OLLAMA_ENABLED=true

# ML processing
SENTIMENT_BATCH_SIZE=32
MAX_TOPICS=10
```
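
These values are read by `api/config.py` (not shown here). A minimal sketch of such a loader, with the defaults above assumed to be the fallbacks:

```python
import os


def env_bool(name: str, default: bool) -> bool:
    # Treat "1", "true", "yes" (any case) as truthy; fall back to default.
    return os.environ.get(name, str(default)).strip().lower() in {"1", "true", "yes"}


def env_int(name: str, default: int) -> int:
    return int(os.environ.get(name, default))


# Defaults mirror the .env example above; the real loader may differ.
YOUTUBE_MAX_COMMENTS = env_int("YOUTUBE_MAX_COMMENTS", 100)
OLLAMA_ENABLED = env_bool("OLLAMA_ENABLED", True)
SENTIMENT_BATCH_SIZE = env_int("SENTIMENT_BATCH_SIZE", 32)
MAX_TOPICS = env_int("MAX_TOPICS", 10)
```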
## Development

### Running the App

```bash
# Terminal 1 - Frontend
pnpm dev

# Terminal 2 - Backend
uv run uvicorn api.main:app --reload --port 8000

# Terminal 3 - Ollama (optional)
ollama serve
```

### Running Tests

```bash
# Run all tests
uv run pytest tests/ -v

# Run with coverage
uv run pytest tests/ -v --cov=api --cov-report=term-missing
```

### Code Quality

```bash
# Lint the backend
uv run ruff check api/ tests/

# Format the backend
uv run ruff format api/ tests/

# Lint the frontend
pnpm lint
```

### CI Pipeline

GitHub Actions runs on every push and pull request:

- **Lint**: Ruff check + format verification
- **Test**: pytest with a 65% coverage threshold
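
The workflow file itself is not shown in this README; a minimal sketch of a job matching the steps above might look like the following (the file path, action names, and versions are assumptions, not the repository's actual workflow):

```yaml
# .github/workflows/ci.yml (path and versions are assumptions)
name: CI
on: [push, pull_request]

jobs:
  backend:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: astral-sh/setup-uv@v5
      - run: uv sync
      - run: uv run ruff check api/ tests/
      - run: uv run ruff format --check api/ tests/
      - run: uv run pytest tests/ --cov=api --cov-fail-under=65
```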
## License

MIT
