Multi-agent AI that researches a market sector, ranks the top 3 stocks, and explains exactly why, in seconds.
For research and education only. Not financial advice.
Try it free · Self-host it · View the API docs
Stock Analyst runs a three-agent AI crew (powered by CrewAI + Google Gemini) that takes any investable sector (Semiconductors, Clean Energy, AI Infrastructure, Healthcare), searches the live web, crunches public financial signals, and returns a structured 🥇🥈🥉 podium of the three best stocks. Every pick is explained with head-to-head reasoning, and your history is tracked against real 30/60/90-day price returns so you can see how the AI performs over time.
The crew pipeline:
```
Market Researcher ──▶ Financial Analyst ──▶ Investment Strategist
  (live search)        (signal scoring)       (ranked output)
```
| | Feature | Notes |
|---|---|---|
| 🤖 | 3-agent AI pipeline | Market researcher → financial analyst → investment strategist, fully automated |
| 🏆 | Ranked top-3 podium | 🥇🥈🥉 with head-to-head reasoning at each rank |
| 🔍 | Live web search | Serper-powered search on every run, so no stale training data |
| 📈 | Track record | Past AI picks vs. actual 30/60/90-day Yahoo Finance returns |
| 📄 | PDF export | Download any report as a formatted PDF, no print dialog |
| ⚡ | Real-time progress | WebSocket streaming shows the crew thinking live |
| 🔐 | Auth0 + Google sign-in | OIDC, opaque-token safe, JWT validated on every request |
| 🐳 | Docker-ready | One command to self-host on Fly.io, AWS Lightsail, or any VPS |
| 🗄️ | SQLite persistence | All runs stored; history survives restarts |
Stock Analyst is free to try. Pro unlocks unlimited daily research and advanced features.
| | Free | Pro ($2.99/mo) |
|---|---|---|
| Sector research runs per day | 1 | Unlimited |
| All 15 sectors | ✅ | ✅ |
| 🥇🥈🥉 ranked picks + reasoning | ✅ | ✅ |
| Track record (30/60/90-day returns) | ✅ | ✅ |
| PDF export | ✅ | ✅ |
| Real-time WebSocket progress | ✅ | ✅ |
| Email on completion | ❌ | ✅ (coming soon) |
| Weekly sector digest | ❌ | ✅ (coming soon) |
| Watchlist ±5% alerts | ❌ | ✅ (coming soon) |
| Priority queue (no cold starts) | ❌ | ✅ (coming soon) |
Pro is in early access. Join the waitlist; it's free and you'll be first in line.
| Sign-in | Dashboard |
|---|---|
| ![]() | ![]() |

| Sector selection | Live research progress |
|---|---|
| ![]() | ![]() |

| History | Track record |
|---|---|
| ![]() | ![]() |
No setup required. Sign in at stocks.srini.fyi with Google and you're researching in under 30 seconds.
```bash
# 1. Clone
git clone https://github.com/your-org/stock-analyst.git
cd stock-analyst/market_researcher

# 2. Configure
cp .env.example .env
# Fill in: GEMINI_API_KEY, SERPER_API_KEY, JWT_JWKS_URL, JWT_ISSUER, JWT_AUDIENCE, CORS_ORIGINS

# 3. Run
docker build -t stock-analyst .
docker run -p 8000:8000 --env-file .env stock-analyst
```

The API is now live at http://localhost:8000. OpenAPI docs: http://localhost:8000/docs
Then start the frontend:

```bash
cd ../frontend
cp .env.example .env         # set VITE_AUTH0_* and VITE_API_URL
npm install && npm run dev   # http://localhost:3000
```

Full deployment guide (Fly.io, HTTPS, env vars): docs/HOSTING_stocks.srini.fyi.md
```mermaid
flowchart TB
    subgraph clients [Clients]
        Dash[React SPA / CLI]
    end
    subgraph fastapi [FastAPI]
        HttpRoutes[HTTP routes]
        WsRoute[WebSocket stream]
        JobRunner[Async job runner]
    end
    subgraph jobs [In-memory]
        JobReg[Job registry + event log]
    end
    subgraph sqlite [SQLite]
        TUsers[(users)]
        TRuns[(research_runs)]
        TCache[(sector_cache)]
    end
    subgraph crew [CrewAI pipeline]
        Ag1[market_researcher]
        Ag2[financial_analyst]
        Ag3[investment_strategist]
    end
    subgraph cloud [External services]
        GeminiAPI[Google Gemini]
        SerperAPI[Serper search]
    end
    Dash --> HttpRoutes
    Dash --> WsRoute
    HttpRoutes --> JobReg
    HttpRoutes --> JobRunner
    WsRoute --> JobReg
    JobRunner --> TUsers
    JobRunner --> TRuns
    JobRunner --> TCache
    JobRunner --> crew
    Ag1 --> GeminiAPI
    Ag1 --> SerperAPI
    Ag2 --> GeminiAPI
    Ag2 --> SerperAPI
    Ag3 --> GeminiAPI
```
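The architecture above keeps job state in an in-memory registry plus event log, which the WebSocket route replays to subscribers. A minimal sketch of how such a registry might look; the class and method names here are illustrative, not the actual `api.py` implementation:

```python
import threading
from collections import defaultdict

class JobRegistry:
    """Hypothetical in-memory job registry + append-only event log.

    Events are appended as the crew runs; a WebSocket subscriber replays
    the log from any index, so reconnects never miss progress.
    """

    def __init__(self) -> None:
        self._lock = threading.Lock()
        self._events = defaultdict(list)   # job_id -> [event, ...]
        self._done = set()

    def publish(self, job_id: str, event: dict) -> None:
        with self._lock:
            self._events[job_id].append(event)

    def finish(self, job_id: str) -> None:
        with self._lock:
            self._done.add(job_id)

    def is_done(self, job_id: str) -> bool:
        with self._lock:
            return job_id in self._done

    def replay(self, job_id: str, start: int = 0) -> list:
        """Return all events from index `start` onward (full replay at 0)."""
        with self._lock:
            return self._events[job_id][start:]
```

Because the log is append-only, a late subscriber gets the same event stream as a live one, which is what makes the cached-run "replay" path work without frontend changes.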
```mermaid
flowchart LR
    subgraph inputs [Kickoff inputs]
        Sec[sector]
        Year[current year]
    end
    inputs --> T1[market_landscape_task]
    T1 --> T2[financial_signals_task]
    T2 --> T3[investment_recommendation_task]
    T3 --> Out[InvestmentRecommendation JSON + DB row]
```
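The final task emits JSON validated against `InvestmentRecommendation` in `schemas.py` (a Pydantic model). As a rough, stdlib-only illustration of the shape such a model might take — the field names and tickers below are invented for the example, not the actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class StockPick:
    rank: int          # 1 = gold, 2 = silver, 3 = bronze
    ticker: str
    rationale: str     # head-to-head reasoning vs. the other picks

@dataclass
class InvestmentRecommendation:
    sector: str
    picks: list = field(default_factory=list)  # list[StockPick]

    def podium(self) -> list:
        """Tickers ordered gold -> silver -> bronze."""
        return [p.ticker for p in sorted(self.picks, key=lambda p: p.rank)]
```

The real Pydantic model additionally gives the crew a strict JSON schema to fill, which is what keeps the LLM output machine-parseable.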
Results are cached per sector for 24 hours and shared across all users. At most 15 CrewAI runs happen per day (one per sector). Subsequent users get the cached result instantly via the existing WebSocket replay path, so no frontend changes are required.
```
User clicks sector
        │
        ▼
Fresh cache hit? ──YES──▶ Clone run to user history → instant WebSocket replay
        │
        NO
        ▼
Free user, daily limit reached? ──YES──▶ 429 + upgrade prompt
        │
        NO
        ▼
Run CrewAI job → save run → update sector_cache
```
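The cache-first branch of this flow can be sketched with stdlib `sqlite3`. The table and column names below are simplified guesses at the real `sector_cache` schema in `db.py`, and `run_crew` stands in for the expensive CrewAI call:

```python
import sqlite3
import time

TTL_SECONDS = 24 * 3600  # 24-hour cache window, per the design above

# Hypothetical minimal schema; the actual columns live in db.py.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE sector_cache (sector TEXT PRIMARY KEY, result TEXT, created_at REAL)"
)

def get_or_run(sector: str, run_crew) -> tuple:
    """Return (result, from_cache). run_crew(sector) is the expensive CrewAI call."""
    row = conn.execute(
        "SELECT result, created_at FROM sector_cache WHERE sector = ?", (sector,)
    ).fetchone()
    if row and time.time() - row[1] < TTL_SECONDS:
        return row[0], True            # fresh hit: replay to the user instantly
    result = run_crew(sector)          # at most one real run per sector per TTL
    conn.execute(
        "INSERT OR REPLACE INTO sector_cache VALUES (?, ?, ?)",
        (sector, result, time.time()),
    )
    return result, False
```

With 15 sectors and a 24-hour TTL, this is exactly what bounds the system at 15 CrewAI runs per day regardless of user count.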
```
stock-analyst/
├── market_researcher/                   # Python backend (FastAPI + CrewAI)
│   ├── src/market_researcher/
│   │   ├── api.py                       # FastAPI app, JWT auth, jobs, WebSocket
│   │   ├── crew.py                      # CrewAI agents, tasks, Gemini LLM, Serper
│   │   ├── db.py                        # SQLite: users, research_runs, sector_cache
│   │   ├── schemas.py                   # InvestmentRecommendation (Pydantic)
│   │   └── config/
│   │       ├── agents.yaml
│   │       └── tasks.yaml
│   ├── Dockerfile
│   ├── pyproject.toml
│   └── README.md                        # Full backend docs, env vars, API reference
├── frontend/                            # React + Vite SPA
│   └── README.md                        # Frontend setup and troubleshooting
└── docs/
    ├── HOSTING_stocks.srini.fyi.md      # Production deployment guide
    ├── PLAN_caching_and_freemium.md     # Freemium architecture plan
    └── screenshots/
```
The backend is configured via `market_researcher/.env`. Copy `.env.example` to get started.
| Variable | Required | Description |
|---|---|---|
| `GEMINI_API_KEY` | ✅ | Google Gemini API key |
| `SERPER_API_KEY` | ✅ | Serper web search key |
| `JWT_JWKS_URL` | ✅ | Auth0 JWKS endpoint, e.g. `https://<tenant>.auth0.com/.well-known/jwks.json` |
| `JWT_ISSUER` | ✅ | Auth0 issuer, e.g. `https://<tenant>.auth0.com/` |
| `JWT_AUDIENCE` | ✅ | Auth0 API Identifier; must match `VITE_AUTH0_AUDIENCE` in the frontend |
| `CORS_ORIGINS` | ✅ | Comma-separated allowed origins, e.g. `https://stocks.srini.fyi` |
| `GEMINI_MODEL` | ❌ | Override model; default `gemini/gemini-2.5-flash` |
| `MARKET_RESEARCHER_DB` | ❌ | SQLite path; default `data/app.db` |
| `API_DEV_SKIP_AUTH` | ❌ | `true` to bypass JWT in local dev |
| `API_DEV_PASSWORD_LOGIN` | ❌ | `true` to enable `POST /auth/dev-login` |
> **Production checklist:** disable `API_DEV_SKIP_AUTH`, `API_DEV_PASSWORD_LOGIN`, and `VITE_DEV_PASSWORD_LOGIN`. Use HTTPS origins everywhere.
For the full variable reference and Auth0 setup, see market_researcher/README.md.
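As an illustration of how the table above could be consumed at startup, here is a hedged, stdlib-only sketch; the real config handling in `api.py` may differ, and the function name is invented:

```python
import os

def load_settings() -> dict:
    """Read backend configuration from the environment, per the table above."""
    required = ["GEMINI_API_KEY", "SERPER_API_KEY", "JWT_JWKS_URL",
                "JWT_ISSUER", "JWT_AUDIENCE", "CORS_ORIGINS"]
    missing = [k for k in required if not os.environ.get(k)]
    if missing:
        # Fail fast: a half-configured backend is worse than no backend.
        raise RuntimeError(f"Missing required env vars: {', '.join(missing)}")
    return {
        "gemini_model": os.environ.get("GEMINI_MODEL", "gemini/gemini-2.5-flash"),
        "db_path": os.environ.get("MARKET_RESEARCHER_DB", "data/app.db"),
        "cors_origins": [o.strip() for o in os.environ["CORS_ORIGINS"].split(",")],
        "dev_skip_auth": os.environ.get("API_DEV_SKIP_AUTH", "").lower() == "true",
    }
```

Failing fast on missing required keys surfaces configuration mistakes at boot rather than on the first authenticated request.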
Auth0 (OpenID Connect) handles sign-in. Google OAuth is wired through Auth0, so no separate Google credentials live in the frontend.
Quick Auth0 setup:

- Create a Single Page Application in Auth0 → copy its Client ID to `VITE_AUTH0_CLIENT_ID`.
- Create an API in Auth0 → set Identifier to `https://market-researcher-api` → copy it to `JWT_AUDIENCE` and `VITE_AUTH0_AUDIENCE`.
- Set Allowed Callback / Logout / Web Origins for your domain.
- (Optional) Add a Post-Login Action to embed profile claims on the access token.
Full walkthrough + troubleshooting table: market_researcher/README.md#auth0--jwt and frontend/README.md.
| Method | Path | Description |
|---|---|---|
| `GET` | `/health` | Health check |
| `GET` | `/me` | Current user profile (upserts on call) |
| `POST` | `/research` | Start a research job; returns `{ job_id }` |
| `GET` | `/research/jobs/{job_id}` | Job status + result |
| `WS` | `/research/ws/{job_id}` | Live event stream; append `?token=<JWT>` |
| `GET` | `/research/history` | Recent runs for the current user |
| `GET` | `/research/{run_id}` | Single persisted run |
| `POST` | `/auth/dev-login` | Dev-only HS256 token (when `API_DEV_PASSWORD_LOGIN=true`) |
Interactive docs are available at `/docs` when the API is running.
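The `/auth/dev-login` route issues a dev-only HS256 token. For intuition about what that means, here is a stdlib-only sketch of minting and verifying such a token; the actual claim set and secret handling in `api.py` are assumptions, and real deployments validate RS256 tokens against the Auth0 JWKS instead:

```python
import base64
import hashlib
import hmac
import json
import time

SECRET = "dev-only-secret"  # hypothetical; a real app reads this from the environment

def _b64url(data: bytes) -> str:
    """JWT uses unpadded URL-safe base64."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def mint_dev_token(sub: str) -> str:
    """Build header.payload.signature, signed with HMAC-SHA256."""
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = _b64url(json.dumps({"sub": sub, "iat": int(time.time())}).encode())
    sig = _b64url(hmac.new(SECRET.encode(), f"{header}.{payload}".encode(),
                           hashlib.sha256).digest())
    return f"{header}.{payload}.{sig}"

def verify(token: str) -> dict:
    """Check the signature in constant time, then decode the claims."""
    header, payload, sig = token.split(".")
    expected = _b64url(hmac.new(SECRET.encode(), f"{header}.{payload}".encode(),
                                hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):
        raise ValueError("bad signature")
    return json.loads(base64.urlsafe_b64decode(payload + "=" * (-len(payload) % 4)))
```

This is why the production checklist insists on disabling `API_DEV_PASSWORD_LOGIN`: a shared symmetric secret is fine for local development but has no place behind a public endpoint.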
```bash
# Backend
cd market_researcher
python -m venv .venv && source .venv/bin/activate   # Windows: .venv\Scripts\activate
pip install -e .
cp .env.example .env     # fill in keys; set API_DEV_SKIP_AUTH=true for local
uvicorn market_researcher.api:app --reload --port 8000
```

```bash
# Frontend (separate terminal)
cd frontend
npm install
cp .env.example .env     # set VITE_AUTH0_* and VITE_API_URL=http://localhost:8000
npm run dev              # http://localhost:3000
```

Run the crew from the CLI (no API needed):

```bash
cd market_researcher
run_crew                           # researches the default sector
python -m market_researcher.main   # same
```

All research output is generated by AI using publicly available information and is provided for educational and research purposes only. It is not financial advice. Past AI picks tracked against actual returns are shown for transparency, not as a guarantee of future performance. Always do your own due diligence before making investment decisions.
Built with CrewAI · Google Gemini · FastAPI · Auth0





