agents-radar is a daily digest generator for the AI open-source ecosystem. A GitHub Actions cron job runs at 00:00 UTC (08:00 CST) and produces five Chinese-language reports, published as GitHub Issues and committed Markdown files.
```bash
pnpm start         # run the full digest locally
pnpm typecheck     # tsc --noEmit
pnpm lint          # ESLint
pnpm lint:fix      # ESLint --fix
pnpm format        # Prettier --write src
pnpm format:check  # Prettier --check src
```

Required env vars for local runs (set one LLM provider group):
```bash
export GITHUB_TOKEN=ghp_xxxxx

# Option A — OpenAI-compatible (takes precedence when OPENAI_API_KEY is set)
export OPENAI_API_KEY=sk-xxxxx
export OPENAI_BASE_URL=https://your-provider/v1  # optional
export OPENAI_MODEL=gpt-4o                       # optional, default: gpt-4o

# Option B — Anthropic (default when OPENAI_API_KEY is absent)
export ANTHROPIC_API_KEY=sk-ant-xxxxx
export ANTHROPIC_BASE_URL=https://api.kimi.com/coding/  # omit for Anthropic
export ANTHROPIC_MODEL=claude-sonnet-4-6                # optional

export DIGEST_REPO=owner/repo  # omit to skip GitHub issue creation
```

The pipeline runs in four sequential phases, each implemented as a named async function in `src/index.ts`:
- `fetchAllData` — all network I/O in parallel: GitHub API (issues/PRs/releases) for 17 repos, Claude Code Skills, Anthropic/OpenAI sitemaps, GitHub Trending HTML + Search API, Hacker News Algolia API.
- `generateSummaries` — per-repo LLM calls, all in parallel, rate-limited to 5 concurrent requests by a queue in `src/report.ts`.
- Comparisons — two LLM calls: cross-tool CLI comparison and OpenClaw cross-ecosystem comparison.
- Save phase — `buildCliReportContent`/`buildOpenclawReportContent` build Markdown strings; `saveWebReport`/`saveTrendingReport`/`saveHnReport` call the LLM, write the file, and create a GitHub Issue.
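The phase sequence can be sketched as four sequential awaits. This is a hedged skeleton, not the real `src/index.ts`: the `RepoData` shape and the stub bodies are invented for illustration; only the phase order and the phase names follow the description above.

```typescript
// Illustrative skeleton of the four-phase pipeline. The real phase functions
// in src/index.ts take different arguments; RepoData and all stub bodies here
// are invented.
type RepoData = { repo: string; items: string[] };

async function fetchAllData(): Promise<RepoData[]> {
  // Phase 1: all network I/O fired in parallel (GitHub API, sitemaps, HN, ...).
  return [{ repo: "anthropics/claude-code", items: ["example PR"] }];
}

async function generateSummaries(data: RepoData[]): Promise<string[]> {
  // Phase 2: one LLM call per repo, fired in parallel, capped at 5 concurrent.
  return Promise.all(data.map(async (d) => `summary of ${d.repo}`));
}

async function generateComparisons(summaries: string[]): Promise<string[]> {
  // Phase 3: two cross-cutting LLM calls (CLI comparison, OpenClaw comparison).
  return [`comparison across ${summaries.length} tools`];
}

async function savePhase(sections: string[]): Promise<void> {
  // Phase 4: build Markdown, write files, create GitHub Issues.
  console.log(`saving ${sections.length} sections`);
}

async function main(): Promise<void> {
  const data = await fetchAllData();
  const summaries = await generateSummaries(data);
  const comparisons = await generateComparisons(summaries);
  await savePhase([...summaries, ...comparisons]);
}
```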
| File | Responsibility |
|---|---|
| `src/index.ts` | Orchestration: repo config, phase functions, `main()` |
| `src/github.ts` | GitHub API helpers: `fetchRecentItems`, `fetchRecentReleases`, `fetchSkillsData`, `createGitHubIssue` |
| `src/prompts.ts` | LLM prompt builders (one per report type) and `formatItem` |
| `src/report.ts` | `callLlm` (with concurrency limiter), `saveFile`, `autoGenFooter` |
| `src/web.ts` | Sitemap-based web content fetching; state persisted to `digests/web-state.json` |
| `src/trending.ts` | GitHub Trending HTML scraper + Search API topic queries |
| `src/hn.ts` | Hacker News top AI stories via Algolia HN Search API |
| `src/generate-manifest.ts` | Generates `manifest.json` (sidebar data for Web UI) and `feed.xml` (RSS 2.0 feed) |
Files written to `digests/YYYY-MM-DD/`:
| File | Label | Notes |
|---|---|---|
| `ai-cli.md` | digest | Always generated |
| `ai-agents.md` | openclaw | Always generated |
| `ai-web.md` | web | Skipped if no new sitemap content |
| `ai-trending.md` | trending | Skipped if both data sources fail |
| `ai-hn.md` | hn | Skipped if Algolia fetch fails |
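The "Skipped if …" rows all follow one pattern: a save function bails out when its phase produced no data. A minimal sketch of that pattern — the real `saveWebReport` in `src/index.ts` almost certainly has a different signature; the `newUrls` parameter and the return type are assumptions:

```typescript
// Sketch of the "skip when empty" pattern used by the conditional reports.
// newUrls and the string|null return type are invented for illustration.
async function saveWebReport(newUrls: string[]): Promise<string | null> {
  if (newUrls.length === 0) {
    console.log("ai-web.md skipped: no new sitemap content");
    return null; // nothing written, no issue created
  }
  // Real flow: callLlm(...) -> write digests/YYYY-MM-DD/ai-web.md -> GitHub Issue
  return "ai-web.md";
}
```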
- `CLI_REPOS` (6): claude-code, codex, gemini-cli, kimi-cli, opencode, qwen-code
- `OPENCLAW` + `OPENCLAW_PEERS` (11): openclaw/openclaw + 10 peer projects (sorted by stars)
- `CLAUDE_SKILLS_REPO`: anthropics/skills — no date filter, sorted by popularity
- Web: anthropic.com + openai.com via sitemap, state in `digests/web-state.json`
- Trending: github.com/trending (HTML) + GitHub Search API (6 AI topics, 7-day window)
- HN: Algolia HN Search API — 6 parallel queries, top-30 AI stories by points, last 24 h
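As an illustration of the HN source, one of the six queries could be built against the public Algolia HN Search API like this. The `hnSearchUrl` helper and the "LLM" query term are assumptions, not the repo's code; the endpoint and the `tags`/`numericFilters` parameters are Algolia's documented ones:

```typescript
// Builds one Algolia HN Search request for stories from the last 24 hours.
function hnSearchUrl(query: string, sinceUnixSec: number): string {
  const params = new URLSearchParams({
    query,
    tags: "story",                                  // stories only, no comments
    hitsPerPage: "30",
    numericFilters: `created_at_i>${sinceUnixSec}`, // created inside the window
  });
  return `https://hn.algolia.com/api/v1/search?${params.toString()}`;
}

const since = Math.floor(Date.now() / 1000) - 24 * 60 * 60;
const url = hnSearchUrl("LLM", since);
// The real fetcher fires six such queries in parallel, merges the hits,
// and keeps the top 30 stories by points.
```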
- All LLM prompts are in `src/prompts.ts`. Each report type has its own builder function. Prompts are written in Chinese and produce Chinese output.
- `callLlm(prompt, maxTokens?)` defaults to 4096 tokens. The web report uses 8192, trending uses 6144, and the HN report uses the default 4096.
- On 429 rate-limit errors, `callLlm` retries up to 3 times with exponential backoff (5 s / 10 s / 20 s); the concurrency slot is released during the wait.
- The concurrency limiter (`LLM_CONCURRENCY = 5`) prevents 429s when many parallel LLM calls fire. Do not bypass it by calling the Anthropic SDK directly.
- GitHub issue label colors are defined in `LABEL_COLORS` in `src/github.ts`. Add new labels there.
- `sampleNote(total, sampled)` in `src/prompts.ts` formats the "(共 N 条,展示前 M 条)" note ("N items total, showing the first M"). Reuse it — do not inline the same string format.
- Web state (`digests/web-state.json`) is committed to git on every run. It is the source of truth for which URLs have been seen.
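The limiter-plus-backoff behavior described above can be sketched as follows. This is a minimal illustration, not the code in `src/report.ts`: the `acquire`/`release` queue, the `run` callback, and the extra `baseDelayMs` parameter are assumptions; only `LLM_CONCURRENCY = 5`, the 3 retries, and the 5 s / 10 s / 20 s schedule come from the doc.

```typescript
// Minimal 5-slot concurrency gate plus exponential backoff on 429s.
const LLM_CONCURRENCY = 5;
let active = 0;
const waiters: Array<() => void> = [];

async function acquire(): Promise<void> {
  if (active < LLM_CONCURRENCY) { active++; return; }
  await new Promise<void>((resolve) => waiters.push(resolve));
  active++;
}

function release(): void {
  active--;
  waiters.shift()?.(); // wake the next queued caller, if any
}

async function callLlm(
  run: () => Promise<string>, // stand-in for the actual provider call
  maxRetries = 3,
  baseDelayMs = 5000,         // 5 s -> 10 s -> 20 s
): Promise<string> {
  for (let attempt = 0; ; attempt++) {
    await acquire();
    try {
      const result = await run();
      release();
      return result;
    } catch (err) {
      release(); // free the slot before (and during) the backoff wait
      if ((err as { status?: number })?.status !== 429 || attempt >= maxRetries) throw err;
      await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** attempt));
    }
  }
}
```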
- Web UI: `index.html` reads `manifest.json` to build the sidebar, then fetches `digests/YYYY-MM-DD/report.md` on demand.
- RSS feed: `feed.xml` at the repo root, generated by `src/generate-manifest.ts` in the same `pnpm manifest` step. It contains the latest 30 items (newest first) across all report types. Item links use hash routing: `https://duanyytop.github.io/agents-radar/#YYYY-MM-DD/report`.
- Both `manifest.json` and `feed.xml` are committed together in the "Commit manifest and feed" GHA step.
- The `REPORT_LABELS` map in `generate-manifest.ts` must be kept in sync with the `LABELS` object in `index.html` when adding new report types.
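For illustration, feed-item assembly with hash routing might look like the sketch below. `FeedEntry`, `rssItem`, and `feedItems` are invented names; only the base URL, the `#YYYY-MM-DD/report` route shape, and the latest-30/newest-first rule come from the doc.

```typescript
// Hypothetical feed-item builder; not the real src/generate-manifest.ts.
type FeedEntry = { title: string; date: string; report: string };

const SITE = "https://duanyytop.github.io/agents-radar";

function rssItem(e: FeedEntry): string {
  const link = `${SITE}/#${e.date}/${e.report}`; // hash routing into the Web UI
  return `<item><title>${e.title}</title><link>${link}</link><guid>${link}</guid></item>`;
}

function feedItems(entries: FeedEntry[]): string[] {
  return [...entries]
    .sort((a, b) => b.date.localeCompare(a.date)) // newest first
    .slice(0, 30)                                 // latest 30 items
    .map(rssItem);
}
```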
- Create a data fetcher (or add to an existing one).
- Add a `buildXxxPrompt` function in `src/prompts.ts`.
- Wire it into `fetchAllData`, `generateSummaries`, and a `saveXxxReport` function in `src/index.ts`.
- Add a label color entry to `LABEL_COLORS` in `src/github.ts`.
- Add the report ID and label to `REPORT_LABELS` in `src/generate-manifest.ts` and `LABELS` in `index.html`.
- Add the report file name to `REPORT_FILES` in `src/generate-manifest.ts`.
- Update both README files and this file.