Social Finder is a Flask-based OSINT web application for finding public social profiles connected to a first name and last name.
It combines Sherlock username checks, direct profile probes, search-engine dorks, link validation heuristics, a modern dashboard UI, PDF export, anti-spam controls, and an optional Premium login mode.
The project is designed as a professional portfolio application: privacy-conscious, containerized, hardened for deployment, and explicit about responsible use.
- 🔍 Generates likely username variants from a first name and last name.
- ⚡ Searches public profiles with Sherlock.
- 🌐 Extends coverage with direct probes for platforms such as GitHub, GitLab, Reddit, Threads, Bluesky, Mastodon, Twitch, Medium, Dev.to, Behance, Dribbble, Pinterest, SoundCloud, Telegram, Linktree, About.me, and more.
- 🧠 Validates candidate links with stricter heuristics for 404 pages, generic login redirects, private profiles, unavailable pages, and minimal responses.
- 📊 Displays results in a responsive dashboard with counters, status filters, text filtering, collapsible details, and clickable cards.
- 📄 Exports visible results to PDF.
- 🍪 Keeps a cookie/privacy banner and remembers the user's theme preference.
- 🌙 Supports persistent dark mode across home, privacy, and disclaimer pages.
- 🔐 Provides an optional Premium login popup while still allowing free limited usage.
- 🚫 Applies anti-spam rate limits to searches, login, PDF export, and link-check endpoints.
- 🐳 Runs with Docker Compose using a hardened runtime configuration.
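The username-variant step above can be sketched roughly like this. This is a minimal illustration, not the project's actual code: the function name and the exact pattern list are assumptions.

```python
def username_variants(first: str, last: str) -> list[str]:
    """Generate common username patterns from a first and last name."""
    f, l = first.lower().strip(), last.lower().strip()
    candidates = [
        f + l,          # johndoe
        f + "." + l,    # john.doe
        f + "_" + l,    # john_doe
        f[0] + l,       # jdoe
        f + l[0],       # johnd
        l + f,          # doejohn
    ]
    # Preserve order while removing duplicates (single-letter names
    # can make two patterns collide).
    seen, out = set(), []
    for c in candidates:
        if c not in seen:
            seen.add(c)
            out.append(c)
    return out
```

Each variant is then fed to Sherlock and to the direct profile probes; `DIRECT_VARIANT_LIMIT` caps how many are probed directly.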
- Backend: Python, Flask, Gunicorn
- Frontend: HTML, CSS, Vanilla JS
- OSINT Engine: Sherlock + custom direct probes
- Search: DuckDuckGo by default, optional SerpAPI
- Export: PDF generation in memory
- Deployment: Docker & Docker Compose
No local Python setup is required when using Docker.
From the project root:

    docker compose up -d --build

Open:

    http://localhost:3000

Check container status:

    docker compose ps

View logs:

    docker compose logs -f web

Stop the app:

    docker compose down

All runtime variables are defined in the root `.env` file. Docker Compose loads it automatically.
Important defaults:

    APP_NAME=Social Finder - Open Version
    APP_PORT=3000
    AUTH_ENABLED=true
    AUTH_USERNAME=admin
    AUTH_PASSWORD=change-me-now
    AUTH_TOKEN_SECRET=change-this-secret-before-publishing
    AUTH_TOKEN_TTL=43200
    SEARCH_STREAM_RATE_MAX=1
    PDF_EXPORT_RATE_MAX=1
    LINK_CHECK_RATE_MAX=1
    PREMIUM_SEARCH_STREAM_RATE_MAX=30
    PREMIUM_PDF_EXPORT_RATE_MAX=60
    PREMIUM_LINK_CHECK_RATE_MAX=600

After changing `.env`, recreate the container:

    docker compose up -d --build --force-recreate

Premium login is controlled by:
    AUTH_ENABLED=true
    AUTH_USERNAME=admin
    AUTH_PASSWORD=change-me-now
    AUTH_TOKEN_SECRET=change-this-secret-before-publishing

When `AUTH_ENABLED=true`, the UI shows a Login Premium popup.
Users can:

- log in with the credentials from `.env`;
- close the popup;
- choose Continue with the free limit.
Free users keep the default anti-spam limits. Premium users receive higher limits configured through the PREMIUM_* variables.
For local testing, the default login is:

    Username: admin
    Password: change-me-now

Before publishing, change both:

    AUTH_PASSWORD=use-a-strong-password
    AUTH_TOKEN_SECRET=use-a-long-random-secret

To disable the Premium popup:

    AUTH_ENABLED=false

The default free limit mirrors the original anti-spam behavior: 1 request every 5 minutes per IP on sensitive endpoints.
| Variable | Default | Meaning |
|---|---|---|
| `LOGIN_RATE_MAX` | 5 | Login attempts per IP every 5 minutes. |
| `SEARCH_RATE_MAX` | 1 | `/api/search` requests per IP every 5 minutes. |
| `SEARCH_STREAM_RATE_MAX` | 1 | Streaming searches per IP every 5 minutes. |
| `PDF_EXPORT_RATE_MAX` | 1 | PDF exports per IP every 5 minutes. |
| `LINK_CHECK_RATE_MAX` | 1 | Manual link checks per IP every 5 minutes. |
| `PREMIUM_SEARCH_STREAM_RATE_MAX` | 30 | Premium streaming searches per IP every 5 minutes. |
| `PREMIUM_PDF_EXPORT_RATE_MAX` | 60 | Premium PDF exports per IP every 5 minutes. |
| `PREMIUM_LINK_CHECK_RATE_MAX` | 600 | Premium link checks per IP every 5 minutes. |
Rate limits are stored in memory and reset when the container restarts.
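That in-memory behavior can be sketched as a simple sliding-window counter per IP. This is an illustration of the general technique, not the app's actual implementation; the class and method names are assumptions.

```python
import time
from collections import defaultdict

WINDOW_SECONDS = 300  # 5-minute window, matching the defaults above


class RateLimiter:
    """Sliding-window, per-IP request counter held in process memory."""

    def __init__(self, max_requests: int, window: int = WINDOW_SECONDS):
        self.max_requests = max_requests
        self.window = window
        self.hits: dict[str, list[float]] = defaultdict(list)

    def allow(self, ip: str) -> bool:
        now = time.monotonic()
        # Drop timestamps that have fallen out of the window.
        self.hits[ip] = [t for t in self.hits[ip] if now - t < self.window]
        if len(self.hits[ip]) >= self.max_requests:
            return False
        self.hits[ip].append(now)
        return True
```

Because the state lives in a Python dict, restarting the container (or running multiple Gunicorn workers without shared state) resets or fragments the counters, which is why the limits reset on restart.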
| Variable | Default | Meaning |
|---|---|---|
| `SERPAPI_KEY` | (empty) | Enables SerpAPI instead of DuckDuckGo for dork search. |
| `SCRAPE_WORKERS` | 10 | Concurrent workers used for link validation. |
| `PAGE_TIMEOUT` | 8 | HTTP timeout (seconds) for target page checks. |
| `SHERLOCK_TIMEOUT` | 5 | Timeout (seconds) passed to Sherlock. |
| `DIRECT_VARIANT_LIMIT` | 6 | Maximum username variants used for direct profile probes. |
You can test Social Finder directly at:
- Change `AUTH_PASSWORD` and `AUTH_TOKEN_SECRET` before exposing the app publicly.
- Keep `FLASK_DEBUG=0`.
- Put the app behind a reverse proxy with TLS.
- Keep `read_only: true`, `cap_drop: ALL`, `no-new-privileges:true`, and `tmpfs` enabled in Compose.
- The public `/healthz` route is intentionally hidden. Docker uses the internal health path configured by `INTERNAL_HEALTH_PATH`.
- The app does not persist search results to disk. PDF exports are generated in memory.
The Docker setup uses:

- multi-stage build;
- non-root runtime user;
- Gunicorn instead of the Flask debug server;
- read-only filesystem;
- writable `/tmp` mounted as `tmpfs`;
- dropped Linux capabilities;
- `no-new-privileges`;
- healthcheck through an internal-only route.
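The hardening options above map onto a Compose service roughly like this. This is a sketch, not the project's actual `docker-compose.yml`: the service name and healthcheck command are assumptions.

```yaml
# Sketch of the hardened runtime described above.
services:
  web:
    build: .
    env_file: .env
    ports:
      - "3000:3000"
    read_only: true            # read-only root filesystem
    cap_drop:
      - ALL                    # drop all Linux capabilities
    security_opt:
      - no-new-privileges:true
    tmpfs:
      - /tmp                   # only writable path, backed by memory
    healthcheck:
      # Assumes the internal health route is reachable on the app port.
      test: ["CMD", "python", "-c",
             "import os, urllib.request; urllib.request.urlopen('http://127.0.0.1:3000' + os.environ['INTERNAL_HEALTH_PATH'])"]
      interval: 30s
      timeout: 5s
```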
- PDF export of search results.
- Improved detection heuristics.
- Extended supported platforms.
- Optional Premium authentication with free limited mode.
- Anti-spam rate limiting for searches, login, PDF export, and link checks.
- Cookie/privacy banner.
- Persistent dark mode.
- Status filters for valid and invalid results.
- Clickable result cards opening in a new tab.
- Updated privacy and disclaimer pages for portfolio positioning.
- Hardened Docker image and Docker Compose runtime.
- Hidden public debug health route.
- `.env`-based configuration.
- Add automated tests for rate limiting and validation heuristics.
- Add platform category presets.
- Add optional encrypted audit logging for private deployments.
- Add reverse proxy examples for Nginx or Caddy.
- Add stricter Content Security Policy for production.
- Add export formats beyond PDF, such as CSV and JSON.
Use this tool only with public data and for lawful, ethical, defensive, demonstrative, or portfolio-related purposes.
Do not use it for:
- harassment;
- stalking;
- spam;
- credential attacks;
- unauthorized mass collection.
MIT, where applicable to the original project.
See the LICENSE file for details.
Contributions, issues, and feature requests are welcome!
Feel free to open an issue or submit a pull request.
⚡ Built with ❤️ for OSINT & Cybersecurity research.



