Turn your codebase into a comprehensive Wiki in minutes.
Any Model. Any Repo. Any Environment.
Wiki As Readme is the most flexible AI documentation tool available. Whether you're running a local Llama 3 model via Ollama, using Google's Gemini Pro, or hitting OpenAI's API, this tool adapts to your stack. It seamlessly integrates with any Git platform (GitHub, GitLab, Bitbucket) or local folders, making it the ultimate "drop-in" documentation solution.
Note
Some features and integrations are currently under development. Wiki-As-Readme warmly welcomes your contributions and Pull Requests!
This project is built to be truly pluggable. You choose how to run it, where to run it, and what powers it.
- Commercial APIs: Google Vertex AI (Gemini), OpenAI (GPT-4), Anthropic (Claude), xAI (Grok).
- Open/Local Models: Ollama, OpenRouter, HuggingFace.
- On-Premise: Connect to your own private LLM endpoints safely.
- Cloud Repos: Works seamlessly with GitHub, GitLab, and Bitbucket.
- Local Development: Analyze code directly from your local file system without pushing.
- Private/Enterprise: Full support for private instances and self-hosted Git servers.
- CI/CD: Drop it into GitHub Actions.
- Container: Run it via Docker Compose.
- Service: Deploy as a long-running API server with Webhooks.
- CLI: Run it locally while you code.
- 🧠 **Deep Context Analysis**: Analyzes file structure and relationships to understand the project's architecture before writing.
- 📦 **Smart Structure Generation**: Automatically determines a logical hierarchy (Sections > Pages) for your documentation.
- 🔍 **Comprehensive Content**: Writes detailed pages including architecture overviews, installation guides, and API references.
- 📊 **Automatic Diagrams**: Generates Mermaid.js diagrams (flowcharts, sequence diagrams, class diagrams) to visualize architecture.
- 🚗 **Hybrid Output**: Generates both individual Markdown files for a wiki and a single consolidated README.md.
- ⚡ **Async & Scalable**: Built with FastAPI and AsyncIO for non-blocking, efficient generation of large documentation sets.
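The async design described above can be sketched with a bounded-concurrency pattern. This is an illustrative sketch, not the project's actual code: `generate_page` is a hypothetical stand-in for an LLM call, and the semaphore cap mirrors the documented `max_concurrency` setting.

```python
import asyncio

MAX_CONCURRENCY = 5  # mirrors the default `max_concurrency` setting

async def generate_page(title: str, sem: asyncio.Semaphore) -> str:
    """Hypothetical stand-in for one LLM-backed page generation."""
    async with sem:  # at most MAX_CONCURRENCY requests are in flight at once
        await asyncio.sleep(0.01)  # placeholder for the actual LLM request
        return f"# {title}\n\n(generated content)"

async def generate_wiki(titles: list[str]) -> list[str]:
    """Generate all pages concurrently, but never more than the cap at a time."""
    sem = asyncio.Semaphore(MAX_CONCURRENCY)
    return await asyncio.gather(*(generate_page(t, sem) for t in titles))

if __name__ == "__main__":
    pages = asyncio.run(generate_wiki(["Overview", "Installation", "API Reference"]))
    print(len(pages))  # 3
```

Because every page request awaits inside the semaphore, a large wiki never exceeds the provider's rate limits while still generating pages in parallel.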
Curious about the results? Check out our sample outputs to see the quality of documentation generated by Wiki As Readme:
- **LangGraph Wiki Example (English)**: A high-quality, structured wiki generated from the LangGraph repository, featuring architecture overviews, core concepts, and Mermaid diagrams.
- **LangGraph Wiki Example (Korean)**: The same LangGraph wiki, generated in Korean.
- Documentation for this project, generated by itself!
- See how the generated wiki is automatically organized into a Notion database with sub-pages and structured content.
This project is designed to be pluggable and can be used in multiple ways depending on your needs:
- **GitHub Action**: Automate documentation updates in your CI/CD pipeline.
- **Docker Compose (Local)**: Run the full UI/API locally without installing Python dependencies.
- **Local Python Development**: For developers who want to modify the source code.
- **Server & Webhooks**: Deploy as a long-running service with Webhook support.
Add this workflow to your repository to automatically update a WIKI.md file whenever you push changes.
- 🎮 **Manual Trigger**: You can manually run the workflow from the "Actions" tab and customize settings (Language, Model, Notion Sync, Commit Method) on the fly.
- 📒 **Notion Sync**: Optionally sync the generated content to a Notion Database.
Create `.github/workflows/update-wiki.yml`:

```yaml
name: Wiki-As-Readme As Action

on:
  # 1. When pushing to main branch (runs automatically with defaults)
  push:
    branches:
      - main
    paths-ignore:
      - 'README.md'
      - 'WIKI.md'
      - '.github/workflows/update-wiki.yml'

  # 2. Manual trigger (allows custom input settings)
  # "Use workflow from" menu in GitHub UI handles branch selection automatically.
  workflow_dispatch:
    inputs:
      language:
        description: 'Language code (e.g., ko, en, ja, etc.)'
        required: false
        default: 'en'
      llm_provider:
        description: 'LLM Provider (google, openai, anthropic, etc.)'
        required: false
        default: 'google'
      model_name:
        description: 'Model Name'
        required: false
        default: 'gemini-2.5-flash'
      sync_to_notion:
        description: 'Sync to Notion? (true/false)'
        type: boolean
        required: false
        default: false
      is_comprehensive_view:
        description: 'Generate comprehensive wiki with more pages (true/false)'
        type: boolean
        required: false
        default: true
      commit_method:
        description: 'How to apply changes'
        type: choice
        options:
          - push
          - pull-request
        default: 'push'

concurrency:
  group: wiki-update-${{ github.ref }}
  cancel-in-progress: true

jobs:
  wiki-time:
    runs-on: ubuntu-latest
    permissions:
      contents: write
      pull-requests: write
    env:
      WIKI_OUTPUT_PATH: "WIKI.md"
    steps:
      # 1. Checkout code
      # No 'ref' needed; it automatically checks out the branch selected in the "Run workflow" UI.
      - name: Checkout code
        uses: actions/checkout@v4

      # -----------------------------------------------------------------------
      # [OPTIONAL] GCP Credentials Setup
      # Create GCP key only if using Google Provider (defaults to 'google' if undefined)
      # -----------------------------------------------------------------------
      - name: Create GCP Credentials File
        if: ${{ (inputs.llm_provider == 'google') || (inputs.llm_provider == '') || (github.event_name == 'push') }}
        env:
          GCP_KEY: ${{ secrets.GOOGLE_APPLICATION_CREDENTIALS }}
        run: |
          if [ -n "$GCP_KEY" ]; then
            echo "$GCP_KEY" > ./gcp-key.json
          else
            echo "::warning::GOOGLE_APPLICATION_CREDENTIALS secret is missing, but provider is set to google."
          fi

      # 2. Generate Wiki Content & Sync
      - name: Generate Content (and Sync to Notion if enabled)
        uses: catuscio/wiki-as-readme@v1.5.0
        env:
          # --- Basic Settings ---
          # Use input if available, otherwise default to 'en' (e.g., for push events)
          LANGUAGE: ${{ inputs.language || 'en' }}
          WIKI_OUTPUT_PATH: ${{ env.WIKI_OUTPUT_PATH }}
          IS_COMPREHENSIVE_VIEW: ${{ inputs.is_comprehensive_view == '' && 'true' || inputs.is_comprehensive_view }}

          # --- LLM Provider and Model Settings ---
          LLM_PROVIDER: ${{ inputs.llm_provider || 'google' }}
          MODEL_NAME: ${{ inputs.model_name || 'gemini-2.5-flash' }}

          # --- API Key Settings ---
          # [GCP / Vertex AI]
          GCP_PROJECT_NAME: ${{ secrets.GCP_PROJECT_NAME }}
          GCP_MODEL_LOCATION: ${{ secrets.GCP_MODEL_LOCATION }}
          GOOGLE_APPLICATION_CREDENTIALS: /github/workspace/gcp-key.json
          # [Other Providers]
          OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
          ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}

          # --- GitHub Token ---
          GIT_API_TOKEN: ${{ secrets.GITHUB_TOKEN }}

          # --- Notion Sync Settings ---
          # Pass "true" only if inputs.sync_to_notion is true; otherwise "false" (including push events)
          NOTION_SYNC_ENABLED: ${{ inputs.sync_to_notion || 'false' }}
          NOTION_API_KEY: ${{ secrets.NOTION_API_KEY }}
          NOTION_DATABASE_ID: ${{ secrets.NOTION_DATABASE_ID }}

      # -----------------------------------------------------------------------
      # [OPTIONAL] GCP Credentials Cleanup
      # -----------------------------------------------------------------------
      - name: Remove GCP Credentials File
        if: always()
        run: rm -f ./gcp-key.json

      # 3. Commit and Push Changes (Update file in GitHub Repo)
      # Option A: Direct Push (default for push events or when 'push' is selected)
      - name: Commit and Push changes
        if: ${{ inputs.commit_method == 'push' || github.event_name == 'push' }}
        uses: stefanzweifel/git-auto-commit-action@v5
        with:
          commit_message: "✨📚 Update ${{ env.WIKI_OUTPUT_PATH }} via Wiki-As-Readme Action (${{ inputs.language || 'en' }})"
          file_pattern: ${{ env.WIKI_OUTPUT_PATH }}

      # Option B: Create Pull Request (when 'pull-request' is selected)
      - name: Create Pull Request
        if: ${{ inputs.commit_method == 'pull-request' }}
        uses: peter-evans/create-pull-request@v7
        with:
          title: "✨📚 Update ${{ env.WIKI_OUTPUT_PATH }} via Wiki-As-Readme Action"
          body: |
            This PR was automatically generated by [Wiki-As-Readme](https://github.com/catuscio/wiki-as-readme) Action.

            It includes the following changes:
            - Updated wiki content in **${{ env.WIKI_OUTPUT_PATH }}** based on the current state of the repository.
            - (If enabled) Synchronized changes to the linked Notion database.

            ---
            > 📖 Powered by [Wiki-As-Readme](https://github.com/catuscio/wiki-as-readme)
            > Turn your codebase into a comprehensive Wiki in minutes, delivered in a single Readme.
            > Works with Any Model. Any Repo. Any Environment.
          branch: wiki-update-${{ github.run_id }}
          commit-message: "✨📚 Update ${{ env.WIKI_OUTPUT_PATH }} via Wiki-As-Readme Action (${{ inputs.language || 'en' }})"
          add-paths: ${{ env.WIKI_OUTPUT_PATH }}
```
The workflow runs in two ways:
| Trigger | When | Commit Method | Settings |
|---|---|---|---|
| `push` | Code is pushed to `main` | Always direct push | Uses defaults (language: `en`, model: `gemini-2.5-flash`) |
| `workflow_dispatch` | Manually from the "Actions" tab | Choose push or pull request | Customizable per run |

Changes to `README.md`, `WIKI.md`, and the workflow file itself are excluded from push triggers via `paths-ignore` to prevent infinite loops.
When `commit_method` is `push` (or on automatic push events):

- The generated `WIKI.md` is committed directly to the current branch.
- Uses `stefanzweifel/git-auto-commit-action` to detect changes, stage `WIKI.md`, and push.
- If the content hasn't changed, no commit is created.
- The commit message follows the format: `✨📚 Update WIKI.md via Wiki-As-Readme Action (en)`

**Best for:** Automated workflows where you want docs to always stay in sync with the code.
When `commit_method` is `pull-request` (only available via manual trigger):

- A new branch `wiki-update-{run_id}` is created from the current branch.
- The generated `WIKI.md` is committed to that branch.
- A Pull Request is automatically opened against the current branch using `peter-evans/create-pull-request`.
- The PR body includes a summary of what was generated and links back to Wiki-As-Readme.

**Best for:**

- Team workflows where wiki changes should be reviewed before merging.
- CI/CD environments where pushes to `main`/`develop` trigger deployments; using a PR avoids accidentally kicking off a deploy pipeline from an auto-generated doc commit.
| Secret | Required | Description |
|---|---|---|
| `GOOGLE_APPLICATION_CREDENTIALS` | If using Google/Vertex AI | GCP service account JSON key |
| `GCP_PROJECT_NAME` | If using Google/Vertex AI | Vertex AI project ID |
| `GCP_MODEL_LOCATION` | If using Google/Vertex AI | Vertex AI region |
| `OPENAI_API_KEY` | If using OpenAI | OpenAI API key |
| `ANTHROPIC_API_KEY` | If using Anthropic | Anthropic API key |
| `NOTION_API_KEY` | If Notion sync enabled | Notion integration token |
| `NOTION_DATABASE_ID` | If Notion sync enabled | Target Notion database ID |

`GITHUB_TOKEN` is automatically provided by GitHub Actions; no manual setup is needed. It requires the `contents: write` and `pull-requests: write` permissions, as configured in the workflow.
Run the application locally with a single command. This is the easiest way to try out the UI.
1. Configure `.env`: Copy `.env.example` to `.env`.
   - Set your API keys (e.g., `LLM_PROVIDER`, `OPENAI_API_KEY`, or `GCP_...`).
   - (Optional) Configure Notion Sync settings (`NOTION_SYNC_ENABLED`, etc.) or change `LOCAL_REPO_PATH` to point at your target code.
2. Run:

   ```bash
   docker-compose up --build
   ```

3. Access:
   - Web UI: http://localhost:8501
     - Tip: Use the History tab in the sidebar to view and download previously generated wikis.
   - API Docs: http://localhost:8000/docs
For developers who want to modify the source code or run without Docker.
Prerequisites: Python 3.12+, uv.
1. Clone & Install:

   ```bash
   git clone https://github.com/catuscio/wiki-as-readme.git
   cd wiki-as-readme
   uv sync
   source .venv/bin/activate
   ```

2. Configure `.env`: Copy `.env.example` to `.env` and set your variables.
3. Run Backend:

   ```bash
   uv run uvicorn src.server:app --reload --port 8000
   ```

4. Run Frontend:

   ```bash
   uv run streamlit run src/app.py
   ```
You can deploy the API server to handle requests or webhooks (e.g., from GitHub).
- Endpoint: `POST /api/v1/webhook/github`
- Payload: Standard GitHub push event payload.
- Behavior: Triggers a background task to generate the wiki for the repository and commit it back (requires `GIT_API_TOKEN`).
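As a rough illustration of exercising this endpoint, the snippet below builds a GitHub-style signed push payload. The signing scheme (HMAC-SHA256 over the raw body, sent as `X-Hub-Signature-256`) is GitHub's standard webhook format; the payload fields and the secret value here are placeholder assumptions.

```python
import hashlib
import hmac
import json

def sign_payload(body: bytes, secret: str) -> str:
    """Compute the value GitHub puts in the X-Hub-Signature-256 header."""
    digest = hmac.new(secret.encode(), body, hashlib.sha256).hexdigest()
    return f"sha256={digest}"

# Heavily truncated stand-in for a real GitHub push-event payload.
payload = {
    "ref": "refs/heads/main",
    "repository": {"clone_url": "https://github.com/owner/repo.git"},
}
body = json.dumps(payload).encode()

headers = {
    "Content-Type": "application/json",
    "X-GitHub-Event": "push",
    # Only required if GITHUB_WEBHOOK_SECRET is set on the server:
    "X-Hub-Signature-256": sign_payload(body, "my-webhook-secret"),
}

# POST `body` with `headers` to http://<your-server>/api/v1/webhook/github
# using any HTTP client to trigger a generation run.
print(headers["X-Hub-Signature-256"][:7])  # sha256=
```

The same `sign_payload` helper is handy for smoke-testing the webhook locally before pointing a real GitHub webhook at your deployment.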
Whether running locally or in Docker, you configure the app via environment variables.
See `.env.example` for a complete template with comments.
| Category | Variable | Description | Default |
|---|---|---|---|
| LLM | `LLM_PROVIDER` | `google`, `openai`, `anthropic`, `xai`, `openrouter`, `ollama` | `google` |
| | `MODEL_NAME` | Specific model identifier | `gemini-2.5-flash` |
| | `LLM_BASE_URL` | Custom base URL (e.g., for Ollama or proxies) | — |
| | `USE_STRUCTURED_OUTPUT` | Use native JSON mode (requires model support) | `true` |
| | `temperature` | LLM randomness (0.0 = deterministic, 1.0 = creative) | `0.0` |
| | `max_retries` | Retry count for failed LLM requests | `3` |
| | `max_concurrency` | Max parallel LLM calls (prevents rate limits) | `5` |
| | `llm_timeout` | Timeout in seconds for each LLM request | `300` |
| Auth | `OPENAI_API_KEY` | OpenAI API key | — |
| | `ANTHROPIC_API_KEY` | Anthropic API key | — |
| | `OPENROUTER_API_KEY` | OpenRouter API key | — |
| | `XAI_API_KEY` | xAI API key | — |
| | `GIT_API_TOKEN` | GitHub/GitLab PAT for private repos | — |
| GCP | `GCP_PROJECT_NAME` | Vertex AI project ID | — |
| | `GCP_MODEL_LOCATION` | Vertex AI region | — |
| Output | `language` | Wiki language (`ko`, `en`, `ja`, `zh`, `zh-tw`, `es`, `vi`, `pt-br`, `fr`, `ru`) | `en` |
| | `WIKI_OUTPUT_PATH` | Path to save generated wiki | `./WIKI.md` |
| | `LOCAL_REPO_PATH` | Local repo path for Docker mounting | `.` |
| | `IGNORED_PATTERNS` | JSON array of glob patterns to exclude from analysis | (see `config.py`) |
| | `IS_COMPREHENSIVE_VIEW` | Generate comprehensive wiki (8-12 pages) vs concise (4-6 pages) | `true` |
| Notion | `NOTION_SYNC_ENABLED` | Sync to Notion after generation | `false` |
| | `NOTION_API_KEY` | Notion Integration Token | — |
| | `NOTION_DATABASE_ID` | Target Notion Database ID | — |
| Webhook | `GITHUB_WEBHOOK_SECRET` | HMAC secret for webhook signature verification | — |
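For orientation, a minimal `.env` that uses only variables from the table above might look like the following; the key values are placeholders, and which keys you need depends on your chosen provider.

```env
# --- LLM ---
LLM_PROVIDER=google
MODEL_NAME=gemini-2.5-flash

# --- Auth (only what your provider needs) ---
OPENAI_API_KEY=sk-...
GIT_API_TOKEN=ghp_...

# --- Output ---
language=en
WIKI_OUTPUT_PATH=./WIKI.md
IS_COMPREHENSIVE_VIEW=true

# --- Notion ---
NOTION_SYNC_ENABLED=false
```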
The backend API is built with FastAPI. You can access the interactive Swagger documentation at http://localhost:8000/docs when the server is running.
Starts a background task to generate the wiki and saves it as a Markdown file on the server.
Request Body:
```json
{
  "repo_url": "https://github.com/owner/repo",
  "repo_type": "github",
  "language": "en",
  "is_comprehensive_view": true
}
```

Starts a background task to generate the wiki. The resulting text is stored in the task status.
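A client-side sketch of calling the generation API with the request body shown above. The endpoint path below is an assumption for illustration (check the Swagger docs at `/docs` for the actual route); only the stdlib `urllib` is used.

```python
import json
import urllib.request

API_URL = "http://localhost:8000"        # port from the Docker Compose setup
GENERATE_PATH = "/api/v1/wiki/generate"  # hypothetical path; verify in /docs

def build_generate_request(repo_url: str, language: str = "en") -> urllib.request.Request:
    """Build a POST request whose body matches the documented schema."""
    body = {
        "repo_url": repo_url,
        "repo_type": "github",
        "language": language,
        "is_comprehensive_view": True,
    }
    return urllib.request.Request(
        API_URL + GENERATE_PATH,
        data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_generate_request("https://github.com/owner/repo")
print(req.get_method())  # POST
# Send with urllib.request.urlopen(req) once the server is running,
# then poll the task-status endpoint for the result.
```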
Retrieves the status and result of a generation task.
Endpoint for GitHub Webhooks (Push events). Triggers automatic wiki generation on pushes to the main branch.
- HMAC Verification: If `GITHUB_WEBHOOK_SECRET` is set, the endpoint verifies the `X-Hub-Signature-256` header.
- Loop Prevention: Commits made by `Wiki-As-Readme-Bot` or containing "via Wiki-As-Readme" in the message are automatically ignored to prevent infinite loops.
- Branch Filter: Only `refs/heads/main` pushes trigger generation; all other branches are ignored.
- Requires: The `GITHUB_ACCESS_TOKEN` environment variable, used to commit the generated wiki back to the repository.
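The loop-prevention and branch-filter rules above can be sketched as a single predicate. This is an illustrative reconstruction, not the project's actual handler code; the author name and message marker come from the documented behavior.

```python
BOT_AUTHOR = "Wiki-As-Readme-Bot"
LOOP_MARKER = "via Wiki-As-Readme"

def should_skip_push(ref: str, author: str, message: str) -> bool:
    """Return True when a push event must NOT trigger wiki generation."""
    if ref != "refs/heads/main":   # branch filter: only main triggers
        return True
    if author == BOT_AUTHOR:       # ignore the bot's own commits
        return True
    if LOOP_MARKER in message:     # ignore auto-generated doc commits
        return True
    return False

print(should_skip_push("refs/heads/main", "alice", "fix: handle timeouts"))  # False
print(should_skip_push("refs/heads/main", "alice",
                       "Update WIKI.md via Wiki-As-Readme Action (en)"))     # True
```

Without a check like this, every wiki commit would fire the webhook again and regenerate the wiki indefinitely.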
- Frontend: Streamlit (User Interface)
- Backend: FastAPI (REST API, Background Tasks)
- LLM Integration: LiteLLM (Unified interface for 100+ LLMs)
- Data Models: Pydantic (Type safety & Structured Output validation)
- Diagrams: Mermaid.js
Contributions are welcome! Please feel free to submit a Pull Request.
- Fork the project.
- Create your feature branch (`git checkout -b feature/AmazingFeature`).
- Commit your changes (`git commit -m 'Add some AmazingFeature'`).
- Push to the branch (`git push origin feature/AmazingFeature`).
- Open a Pull Request.
This project is licensed under the MIT License - see the LICENSE file for details.
- This project is heavily influenced by and utilizes core logic from deepwiki-open by AsyncFuncAI.
- Built with the power of open-source libraries.
- Inspired by the need for better automated documentation.
