
Product Teardown

The PM research tool I wished existed when I was prepping for interviews.

Enter any product name. Get a structured, opinionated analysis — not a summary, but a real teardown with user journey maps, drop-off analysis, and solutions you can actually present.

Live Demo · Report Bug · Request Feature


Why I Built This

Every PM I know does the same thing before interviews: spend 4-6 hours Googling a company, reading App Store reviews, scrolling through Reddit threads, checking G2 ratings, and manually piecing together a "product teardown" in a Google Doc.

The output is usually surface-level — a feature list, some competitor names, and a vague "I'd improve onboarding" recommendation.

I wanted something that goes deeper. Something that actually walks through the full PM problem-solving cycle the way it's done in APM assignments and product competitions:

Identify the problem → Segment users → Map the journey →
Find drop-off points → Propose solutions → Prioritize with data →
Define metrics → Design experiments → Plan implementation

So I built it.


What You Get

When you enter a product name, the tool scrapes real data from 8+ sources and generates a 3,000-5,000 word analysis covering 12 PM frameworks.

Here's what a teardown of Notion looks like:

User Journey & Drop-off Analysis

Every stage of the funnel mapped with expected vs actual behavior, estimated drop-off rates, and root causes from real user reviews.

| Journey Stage | Expected | Actual | Drop-off | Root Cause |
|---|---|---|---|---|
| Awareness → Signup | High-intent visitors convert | 12% conversion rate | 88% | Pricing page confusion |
| Signup → Activation | Users create first page | 68% reach aha moment | 32% | Blank-page anxiety |
| Activation → D7 Retention | Users return within a week | 42% come back | 58% | Template discovery gap |

Effort-Impact Matrix

Every solution scored and prioritized. Quick wins separated from big bets.

| Solution | Impact | Effort | Priority |
|---|---|---|---|
| Guided template picker on signup | High | Low | Ship this week |
| AI-assisted page setup | High | Medium | Next sprint |
| Collaborative onboarding flow | Medium | High | Backlog |

Metrics Framework

Not vague "increase engagement" — specific metrics with formulas.

  • North Star: Weekly active editors (users who create/edit 2+ pages per week)
  • KR1: Increase D7 retention from 42% to 55% by Q3
  • KR2: Reduce signup-to-activation time from 48h to 12h
  • KR3: Grow template usage rate from 23% to 40%
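A metric like the North Star above only works if its formula is unambiguous. As a sketch of what "weekly active editors" could mean operationally (the event shape, field names, and the "2+ distinct pages" reading are my assumptions, not the tool's actual schema):

```typescript
// Hypothetical edit-event shape; field names are illustrative only.
interface EditEvent {
  userId: string;
  page: string;
  timestamp: number; // ms since epoch
}

// Weekly active editors: users who create/edit 2+ distinct pages
// within a given 7-day window.
function weeklyActiveEditors(events: EditEvent[], weekStart: number): number {
  const weekEnd = weekStart + 7 * 24 * 60 * 60 * 1000;
  const pagesByUser = new Map<string, Set<string>>();
  for (const e of events) {
    if (e.timestamp < weekStart || e.timestamp >= weekEnd) continue;
    if (!pagesByUser.has(e.userId)) pagesByUser.set(e.userId, new Set());
    pagesByUser.get(e.userId)!.add(e.page);
  }
  let count = 0;
  for (const pages of pagesByUser.values()) {
    if (pages.size >= 2) count++;
  }
  return count;
}
```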

Plus 9 More Sections

Business context, problem space deep dive, user segmentation, competitive landscape, sentiment analysis, A/B testing plans, implementation roadmap, strategic outlook, and PM recommendations.


Two Modes

Teardown Mode

Full end-to-end product analysis. Use this when you want to understand a product completely — for interview prep, competitive analysis, or portfolio pieces.

Enter: "Notion"
Domain: notion.so (optional, for deeper website analysis)

Case Study Mode

Problem-focused deep dive. Use this when you want to solve a specific challenge — for PM assignments, case competitions, or feature proposals.

Enter: "Spotify"
Problem: "podcast discovery and retention"

The case study mode walks through root cause analysis (5 Whys), designs 3 solutions with user flows, and includes A/B testing plans with hypothesis → control → variant → success criteria.
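The hypothesis → control → variant → success criteria sequence maps onto a simple record shape; a sketch of what one generated test plan might look like (the interface and field names are mine, not the tool's output schema):

```typescript
// Illustrative shape of one A/B test plan; field names are assumptions.
interface ABTestPlan {
  hypothesis: string;
  control: string;
  variant: string;
  successCriteria: string;
}

// Example instance for the Spotify podcast-discovery case above.
const podcastDiscoveryTest: ABTestPlan = {
  hypothesis:
    "Surfacing a podcast episode on the home feed increases podcast D7 retention",
  control: "Current home feed (music-only recommendations)",
  variant: "Home feed with one personalized podcast episode card",
  successCriteria:
    "Podcast D7 retention improves by a pre-registered minimum detectable effect",
};
```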


Data Sources

The tool scrapes these sources in parallel and feeds everything to Claude for synthesis:

| Source | What It Pulls | Why It Matters |
|---|---|---|
| App Store | Ratings, reviews, description, release notes | User sentiment at scale |
| Google Play | Listing data, reviews | Android perspective |
| G2 | Enterprise reviews, pros/cons, feature ratings | B2B user voice |
| Product Hunt | Launch reception, community feedback | Early adopter sentiment |
| Crunchbase | Funding, investors, team size | Business context |
| Reddit | User discussions, complaints, praise | Unfiltered opinions |
| Google News | Recent coverage, announcements | Strategic moves |
| Company Website | Landing page, pricing, features, about | Positioning & messaging |

If a source doesn't return useful data (some products aren't on G2, for example), the tool continues with what it has. It works best with well-known products that have presence across multiple platforms.
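That "continue with what it has" behavior maps naturally onto Promise.allSettled; a minimal sketch (the result shape and function names are illustrative, not the repo's actual code):

```typescript
// Illustrative result shape for one scraped source.
type ScrapeResult = { source: string; data: string };

// Run every scraper concurrently; a scraper either resolves with data
// or rejects (blocked request, missing listing, network error).
async function gatherAll(
  scrapers: Array<() => Promise<ScrapeResult>>
): Promise<ScrapeResult[]> {
  const settled = await Promise.allSettled(scrapers.map((s) => s()));
  // Keep fulfilled results; failed sources simply drop out of the prompt.
  return settled
    .filter(
      (r): r is PromiseFulfilledResult<ScrapeResult> => r.status === "fulfilled"
    )
    .map((r) => r.value);
}
```

Unlike Promise.all, allSettled never rejects, so one blocked source can't take down the whole run.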


Getting Started

Use the Website

Go to pm-teardown.vercel.app, type a product name, and hit generate. That's it.

Self-Host

If you want to run your own instance:

Prerequisites

Setup

```shell
# Clone the repo
git clone https://github.com/okxint/product-teardown.git
cd product-teardown

# Install dependencies
npm install

# Set your API key
export ANTHROPIC_API_KEY=sk-ant-your-key-here

# Run locally
npm run dev
```

Open http://localhost:3000 and start analyzing.

Deploy to Vercel

Deploy with Vercel

One click. Just add your ANTHROPIC_API_KEY as an environment variable.


Tech Stack

| Layer | Technology | Why |
|---|---|---|
| Framework | Next.js 16 (App Router) | Server components + API routes in one project |
| Styling | Tailwind CSS v4 | Utility-first, supports light/dark mode with CSS vars |
| AI | Claude API (Anthropic SDK) | Best reasoning for structured, analytical output |
| Scraping | Axios + Cheerio | Lightweight, no browser needed for most sources |
| Streaming | ReadableStream | Real-time output as Claude generates |
| Language | TypeScript | Type safety across the whole stack |
| Deployment | Vercel | Zero-config Next.js hosting |

Project Structure

```
src/
├── app/
│   ├── page.tsx              # Landing page
│   ├── analyze/
│   │   └── page.tsx          # Analysis interface (input → streaming results)
│   ├── api/
│   │   └── analyze/
│   │       └── route.ts      # POST endpoint: scrape → analyze → stream
│   ├── layout.tsx            # Root layout with theme support
│   └── globals.css           # Theme system (light/dark CSS variables)
├── components/
│   └── theme-toggle.tsx      # Dark/light mode toggle
└── lib/
    ├── scraper.ts            # 7 scraper functions + gatherAllData
    └── prompts.ts            # Teardown & case study prompt templates
```

How It Works Under the Hood

  1. User submits a product name via the /analyze page
  2. API route receives the request, validates input, checks rate limit (5 requests/IP/hour)
  3. Scraper runs 7 scrapers in parallel using Promise.allSettled — App Store (iTunes API), Reddit (JSON API), G2, Product Hunt, Crunchbase, Google News, and the company website (Cheerio for HTML parsing)
  4. Research data is assembled into a structured prompt with the appropriate template (teardown or case study)
  5. Claude generates the analysis via streaming — the response flows to the browser in real-time as it's being written
  6. User sees the markdown rendered with tables, headers, and formatting, and can copy or download the result
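The rate-limit check in step 2 could be as small as an in-memory sliding window; this is a sketch under that assumption, not the repo's actual implementation:

```typescript
const WINDOW_MS = 60 * 60 * 1000; // 1 hour
const MAX_REQUESTS = 5;

// ip -> timestamps of recent requests
const hits = new Map<string, number[]>();

// Returns true if the request is allowed, false if the IP is over its limit.
function checkRateLimit(ip: string, now: number = Date.now()): boolean {
  // Drop timestamps that have aged out of the window.
  const recent = (hits.get(ip) ?? []).filter((t) => now - t < WINDOW_MS);
  if (recent.length >= MAX_REQUESTS) {
    hits.set(ip, recent);
    return false;
  }
  recent.push(now);
  hits.set(ip, recent);
  return true;
}
```

One caveat with this approach: on serverless hosts like Vercel, in-memory state is per-instance and short-lived, so a production limiter would typically back the map with a shared store.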

The streaming approach means you see the analysis building paragraph by paragraph instead of waiting 60 seconds for a blank page to suddenly fill.
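The core of that streaming pattern is wrapping a stream of text chunks in a web-standard ReadableStream; a dependency-free sketch (in the real route the chunks would come from the Anthropic SDK's streaming response, which is not shown here):

```typescript
// Turn any async stream of text chunks into a streaming HTTP response,
// the shape a Next.js route handler can return directly.
function streamResponse(chunks: AsyncIterable<string>): Response {
  const encoder = new TextEncoder();
  const stream = new ReadableStream({
    async start(controller) {
      try {
        // Each chunk is flushed to the client as soon as it arrives.
        for await (const chunk of chunks) {
          controller.enqueue(encoder.encode(chunk));
        }
      } finally {
        controller.close();
      }
    },
  });
  return new Response(stream, {
    headers: { "Content-Type": "text/plain; charset=utf-8" },
  });
}
```

The browser side can then read the body incrementally with `response.body.getReader()` and render markdown as it grows.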


Limitations

Honest about what this doesn't do:

  • Not a replacement for using the product. This analyzes public data. It can't evaluate the actual UX of an onboarding flow or test a feature.
  • Scraping is fragile. Some sources, G2 and Crunchbase especially, block automated requests. The tool handles failures gracefully, but some analyses will have fewer data points than others.
  • Inferred data is marked [Inferred]. When Claude doesn't have hard numbers, it makes reasonable estimates and flags them. Don't present inferred numbers as facts.
  • Rate limited. The free hosted version allows 5 analyses per hour per IP. Self-host if you need more.
  • Best for well-known products. A startup with 50 users won't have App Store reviews or Reddit discussions to analyze.

Roadmap

Things I want to add:

  • PDF export — Generate a formatted PDF that's presentation-ready
  • History — Save past analyses locally so you can revisit them
  • Comparison mode — Analyze two products side-by-side
  • Custom data upload — Paste your own research (survey results, analytics) for Claude to incorporate
  • Better scraping — Playwright-based scraping for JS-rendered sites
  • Shareable links — Generate a public link to share your analysis

Contributing

Contributions are welcome. If you want to:

  • Fix a bug — Open a PR with a clear description of the issue and fix
  • Add a scraper — New data sources are always useful. Follow the pattern in src/lib/scraper.ts
  • Improve prompts — The analysis quality lives or dies by the prompts in src/lib/prompts.ts
  • UI improvements — Keep it clean and minimal. No gratuitous animations.
To get set up:

```shell
# Fork and clone
git clone https://github.com/YOUR_USERNAME/product-teardown.git
cd product-teardown
npm install
npm run dev
```
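If you're adding a scraper, the shape is roughly "fetch, extract, fail soft." A hypothetical sketch (Trustpilot as the source, the `SourceData` shape, and the injectable `fetchPage` parameter are all my inventions for illustration; the real pattern to follow lives in src/lib/scraper.ts and uses Axios + Cheerio):

```typescript
// Illustrative per-source result shape.
interface SourceData {
  source: string;
  summary: string;
}

// A new scraper: fetch a page, extract what matters, return null on failure
// so gatherAllData can continue with the sources that worked.
// `fetchPage` is injectable so the function can be exercised without network access.
async function scrapeTrustpilot(
  product: string,
  fetchPage: (url: string) => Promise<string> = async (url) =>
    (await fetch(url)).text()
): Promise<SourceData | null> {
  try {
    const html = await fetchPage(
      `https://www.trustpilot.com/review/${encodeURIComponent(product)}`
    );
    // Real code would use Cheerio selectors here; a crude regex keeps
    // this sketch dependency-free.
    const title = html.match(/<title>([^<]*)<\/title>/)?.[1] ?? "";
    if (!title) return null;
    return { source: "Trustpilot", summary: title };
  } catch {
    // A blocked or failed source just drops out of the analysis.
    return null;
  }
}
```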

License

MIT — do whatever you want with it.


Built by @okxint

If this helped you prep for an interview or build a better product, star the repo — it helps others find it.
