This repository serves as the canonical reference for conventions and best practices. When Claude is used in any repository that uses the repomatic CLI and its [tool.repomatic] configuration, it should follow the same conventions defined here—including the structure and guidelines of this claude.md file itself.
In other words, downstream repositories should mirror the patterns established here for code style, documentation, testing, and design principles.
Contributing upstream: If Claude spots inefficiencies, potential improvements, performance bottlenecks, missing features, or opportunities for better adaptability in the repomatic CLI, its configuration, the reusable workflows, or this claude.md file itself, it should propose these changes upstream via a pull request or issue at kdeldycke/repomatic. This benefits all downstream repositories.
Upstream runtime dependency boundary: Downstream repos must have only one runtime dependency on the upstream repository: reusable workflow uses: calls (e.g., kdeldycke/repomatic/.github/workflows/autofix.yaml@vX.Y.Z). These are version-pinned to a git tag, giving downstream repos control over when to upgrade. All other references to the upstream (documentation links in PR body templates, footer attribution) are informational only — they do not affect functionality if the upstream is unavailable. Do not introduce new runtime dependencies on the upstream repo (e.g., Renovate shareable presets, remote config extends, API calls to upstream) as they create unversioned coupling where an upstream breakage would cascade to all downstream repos simultaneously.
```shell
# Run all tests with coverage.
$ uv run --group test pytest

# Run a single test file.
$ uv run --group test pytest tests/test_changelog.py

# Run a specific test.
$ uv run --group test pytest tests/test_changelog.py::test_function_name
```

```shell
$ uv run --group typing mypy repomatic
```

```shell
# Run locally during development.
$ uv run repomatic --help

# Try without installation using uvx.
$ uvx -- repomatic --help
```
- `claude.md`: Contributor and Claude-focused directives — code style, testing guidelines, design principles, and internal development guidance.
- `readme.md`: User-facing documentation for the `repomatic` CLI and `[tool.repomatic]` configuration — installation, usage, and the workflows that implement them.
When adding new content, consider whether it benefits end users (readme.md) or contributors/Claude working on the codebase (claude.md).
claude.md must contain only conventions, policies, rationale, and non-obvious rules that Claude cannot discover by reading the codebase. Actively remove:
- Structural inventories — project trees, module tables, workflow lists. Claude can discover these via Glob/Read.
- Code examples that duplicate source files — YAML snippets copied from workflows, Python patterns visible in every module. Reference the source file instead.
- General programming knowledge — standard Python idioms, well-known library usage, tool descriptions derivable from imports.
- Implementation details readable from code — what a function does, what a workflow's concurrency block looks like. Only the rationale for non-obvious choices belongs here.
Always update documentation when making changes:
- `changelog.md`: Add a bullet point describing what changed (new features, bug fixes, behavior changes), not why. Keep entries concise and actionable. Justifications and rationale belong in documentation (readme.md, Sphinx docs) or code comments, not in the changelog.
- `readme.md`: Update relevant sections when adding/modifying workflow jobs, CLI commands, or configuration options.
The following documentation artifacts must stay in sync with the code. When changing any of these, update the others:
- CLI output in `readme.md`: The inline `uvx -- repomatic` help block, `--version` output, and development version output must match actual CLI output. Re-run the commands and update the pasted text.
- Version references in `readme.md`: The `--version` examples and example workflow `@vX.Y.Z` reference must reflect the latest released version.
- Workflow job descriptions in `readme.md`: Each `.github/workflows/*.yaml` workflow section must document all jobs by their actual job ID, with accurate descriptions of what they do, their requirements, and skip conditions.
- Binary download URLs in `readme.md`: The download table URLs are automatically frozen during releases (`/releases/latest/download/` → `/releases/download/vX.Y.Z/` with versioned filenames). No manual update needed.
- `[tool.repomatic]` configuration table in `readme.md`: Generated by `repomatic --table-format github config` from the `Config` dataclass docstrings. Re-run the command and update the pasted table when adding, removing, or modifying config fields.
Each piece of knowledge has one canonical home, chosen by audience. Other locations get a brief pointer ("See module.py for rationale.").
| Audience | Home | Content |
|---|---|---|
| End users | `readme.md` | Installation, configuration, usage. |
| Setup walkthroughs | `setup-guide.md` issue | Step-by-step setup with deep links to repo settings pages. |
| Developers | Python docstrings | Design decisions, trade-offs, "why" explanations. |
| Workflow maintainers | YAML comments | Brief "what" + pointer to Python code for "why." |
| Bug reporters | `.github/ISSUE_TEMPLATE/` | Reproduction steps, version commands. |
| Contributors / Claude | `claude.md` | Conventions, policies, non-obvious rules. |
YAML → Python distillation: When workflow YAML files contain lengthy "why" explanations, migrate the rationale to Python module, class, or constant docstrings (using reST admonitions like `.. note::` and `.. warning::`). Trim the YAML comment to a one-line "what" plus a pointer: `# See repomatic/module.py for rationale.`
Document design decisions, trade-offs, and non-obvious implementation choices directly in the code using docstring admonitions (reST `.. warning::`, `.. note::`, `.. caution::`), inline comments, and module-level docstrings for constants that need context.
Use correct capitalization for proper nouns and trademarked names:
- PyPI (not `PyPi`) — the Python Package Index. The "I" is capitalized because it stands for "Index". See PyPI trademark guidelines.
- GitHub (not `Github`)
- GitHub Actions (not `Github Actions` or `GitHub actions`)
- JavaScript (not `Javascript`)
- TypeScript (not `Typescript`)
- macOS (not `MacOS` or `macos`)
- iOS (not `IOS` or `ios`)
The version string is always bare (e.g., `1.2.3`). The `v` prefix is a tag namespace — it only appears when the reference is to a git tag or something derived from a tag (action ref, comparison URL, commit message). This aligns with PEP 440, PyPI, and semver conventions.
| Context | Format | Example | Rationale |
|---|---|---|---|
| Python `__version__`, `pyproject.toml` | `1.2.3` | `version = "5.10.1"` | PEP 440 bare version. |
| Git tags | `v1.2.3` | `v5.10.1` | Tag namespace convention. |
| GitHub comparison URLs | `v1.2.3...v1.2.4` | `compare/v5.10.0...v5.10.1` | References tags. |
| GitHub action/workflow refs | `@v1.2.3` | `actions/checkout@v6.0.2` | References tags. |
| Commit messages | `v1.2.3` | `[changelog] Release v5.10.1` | References the tag being created. |
| CLI `--version` output | `1.2.3` | `repomatic, version 5.10.1` | Package version, not a tag. |
| Changelog headings | `1.2.3` | `` ## [`5.10.1` (2026-02-17)] `` | Package version, code-formatted. |
| PyPI URLs | `1.2.3` | `pypi.org/project/repomatic/5.10.1/` | PyPI uses bare versions. |
| PyPI admonitions | `1.2.3` | `` `5.10.1` `` is available on PyPI | Package version, not a tag. |
| PR titles | `v1.2.3` | Release `v5.10.1` | References the tag. |
| Prose/documentation | `v1.2.3` or `1.2.3` | Depends on referent | Match what is being referenced. |
Rules:

- No `v` prefix on package versions. Anywhere the version identifies the package (PyPI, changelog heading, CLI output), use the bare version: `1.2.3`.
- `v` prefix on tag references. Anywhere the version identifies a git tag (comparison URLs, action refs, commit messages, PR titles), use `v1.2.3`.
- Always backtick-escape versions in prose. Both `v1.2.3` (tag) and `1.2.3` (package) are identifiers, not natural language. In markdown, wrap them in backticks: `` `v1.2.3` ``, `` `1.2.3` ``. In reST docstrings, use double backticks: ``` ``v1.2.3`` ```.
- Development versions follow PEP 440: `1.2.3.dev0` with optional `+{short_sha}` local identifier.
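As an illustration, the tag/package boundary reduces to adding or stripping the `v` prefix. These are hypothetical helpers, not actual repomatic code:

```python
# Hypothetical helpers illustrating the tag vs. package version convention.
def to_tag(version: str) -> str:
    """Package version -> git tag: add the `v` namespace prefix."""
    return f"v{version}"

def to_version(tag: str) -> str:
    """Git tag -> package version: strip the `v` prefix."""
    return tag.removeprefix("v")

assert to_tag("5.10.1") == "v5.10.1"
assert to_version("v5.10.1") == "5.10.1"
```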
- All comments in Python files must end with a period.
- Docstrings use reStructuredText format (vanilla style, not Google/NumPy).
- Documentation in `./docs/` uses MyST markdown format where possible. Fall back to reStructuredText if necessary.
- Keep lines within 88 characters in Python files, including docstrings and comments (ruff default). Markdown files have no line-length limit.
- Titles in markdown use sentence case.
- Import from the root package (`from repomatic import cli`), not submodules when possible.
- Place imports at the top of the file, unless avoiding circular imports. Never use local imports inside functions — move them to the module level. Local imports hide dependencies, bypass ruff's import sorting, and make it harder to see what a module depends on.
- Version-dependent imports (e.g., the `tomllib` fallback for Python 3.10) should be placed after all normal imports but before the `TYPE_CHECKING` block. This allows ruff to freely sort and organize the normal imports above without interference.
Place a module-level `TYPE_CHECKING` block after all imports (including version-dependent conditional imports). Use `TYPE_CHECKING = False` (not `from typing import TYPE_CHECKING`) to avoid importing `typing` at runtime. See existing modules for the canonical pattern.
Only add `TYPE_CHECKING = False` when there is a corresponding `if TYPE_CHECKING:` block. If all type-checking imports are removed, remove the `TYPE_CHECKING = False` assignment too — a bare assignment with no consumer is dead code.
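A minimal module-layout sketch following these import rules. Names are illustrative, not actual repomatic source; the Python 3.10 branch assumes the `tomli` backport is installed:

```python
# Illustrative module layout: normal imports, then version-dependent
# imports, then the TYPE_CHECKING block.
from __future__ import annotations

import sys

# Version-dependent imports: after normal imports, before TYPE_CHECKING.
if sys.version_info >= (3, 11):
    import tomllib
else:  # Python 3.10 fallback (assumes the `tomli` backport is installed).
    import tomli as tomllib  # type: ignore[no-redef]

# Plain assignment avoids importing typing at runtime.
TYPE_CHECKING = False
if TYPE_CHECKING:
    from collections.abc import Mapping

def parse_config(text: str) -> dict:
    """Parse TOML text into a plain dict."""
    return tomllib.loads(text)
```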
Use modern equivalents from `collections.abc` and built-in types instead of `typing` imports. Use `X | Y` instead of `Union` and `X | None` instead of `Optional`. New modules should include `from __future__ import annotations` (PEP 563).
Omit type annotations on local variables, loop variables, and assignments when mypy can infer the type from the right-hand side. Annotations add visual noise without helping the type checker.
```python
# ✅ Preferred: mypy infers the type.
root_dir = None
name = "default"
items = []
```

```python
# ❌ Avoid: redundant annotation that mypy already knows.
root_dir: Path | None = None
name: str = "default"
items: list[str] = []
```

When to annotate: Add an explicit annotation only when mypy cannot infer the correct type and reports an error — e.g., empty collections that need a specific element type (`items: list[Package] = []`), `None` initializations where the intended type isn't obvious from later usage, or narrowing a union that mypy doesn't resolve on its own.
Function signatures are unaffected. Always annotate function parameters and return types — those are part of the public API and cannot be inferred.
This project supports Python 3.10+. Be aware of syntax features that are not available in Python 3.10:
- Multi-line f-string expressions (Python 3.12+): You cannot break an f-string after the `{` character and continue the expression on the next line.

  ```python
  # fmt: off
  # ❌ Fails on Python 3.10 (only works in Python 3.12+).
  message = f"value={
      some_long_expression
  }"

  # ✅ Works on Python 3.10+: split into concatenated strings.
  message = (
      "value="
      f"{some_long_expression}"
  )
  # fmt: on
  ```

- Exception groups and `except*` (Python 3.11+).
- `Self` type hint (Python 3.11+): Use `from typing_extensions import Self` instead.
For short commands that fit on one line, use plain inline `run:` without any block scalar indicator:

```yaml
# ✅ Preferred for short commands: plain inline.
- name: Check out repository
  run: git checkout main
```

When a command is too long for a single line, use the folded block scalar (`>`) to split it across multiple lines:

```yaml
# ✅ Preferred for long commands: folded block scalar joins lines with spaces.
- name: Run linter
  run: >
    uvx --no-progress 'yamllint==1.38.0' --strict --format github
    --config-data "{rules: {line-length: {max: 120}}}" .

# ❌ Avoid: literal block scalar with backslash continuations.
- name: Run linter
  run: |
    uvx --no-progress 'yamllint==1.38.0' --strict --format github \
    --config-data "{rules: {line-length: {max: 120}}}" .
```

Why: The `>` scalar folds newlines into spaces, producing a single command without needing backslash escapes. This is cleaner and avoids issues with trailing whitespace after `\`.

When to use `|`: Use the literal block scalar (`|`) only when the command requires preserved newlines (e.g., multi-statement scripts, heredocs).
CLI commands, workflow job IDs, PR branch names, and PR body template names must all agree on the same verb prefix. This consistency makes the conventions learnable and grepable across all four dimensions.
| Prefix | Semantics | Source of truth | Idempotent? | Examples |
|---|---|---|---|---|
| `sync-X` | Regenerate from a canonical or external source. | Template, API, repo | Yes | `sync-gitignore`, `sync-mailmap`, `sync-renovate` |
| `update-X` | Compute from project state. | Lockfile, git log | Yes | `update-deps-graph`, `update-checksums` |
| `format-X` | Rewrite to enforce canonical style. | Formatter rules | Yes | `format-json`, `format-markdown`, `format-python` |
| `fix-X` | Correct content (auto-fix). | Linter/checker rules | Yes | `fix-typos` |
| `lint-X` | Check content without modifying it. | Linter rules | Yes | `lint-changelog` |
Rules:

- Pick the verb that matches the data source. If the operation pulls from an external template, API, or canonical reference, it is a `sync`. If it computes from local project state (lockfiles, git history, source code), it is an `update`. If it reformats existing content, it is a `format`.
- Name the specific tool or file, not a generic category. The noun in `verb-noun` must identify the concrete tool, file, or resource the operation targets (e.g., `sync-zizmor`, `sync-gitignore`, `sync-renovate`). Do not use abstract groupings like `sync-linter-configs` or `sync-vcs-configs`. If a second tool is added to a category, create a separate operation for it.
- All four dimensions must agree. When adding a file-modifying operation, the CLI command, workflow job ID, PR branch name, and PR body template file name must all use the same `verb-noun` identifier (e.g., `sync-gitignore` everywhere). For read-only operations (`lint-*`), only the CLI command and workflow job ID apply.
- Function names follow the CLI name. The Python function backing a CLI command uses the underscore equivalent of the CLI name (e.g., `sync_gitignore` for `sync-gitignore`). Exception: when the function name would collide with an imported module, use the Click `name=` parameter to override (e.g., `@repomatic.command(name="update-deps-graph")` on a function named `deps_graph`) or append a `_cmd` suffix (e.g., `sync_uv_lock_cmd` to avoid collision with `from .renovate import sync_uv_lock`).
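The function-naming rule can be sketched as a tiny mapping. This is a hypothetical helper, not actual repomatic code:

```python
# Hypothetical helper: derive a backing function name from a verb-noun
# CLI command name, per the naming convention.
def cli_to_function_name(cli_name: str, *, collision: bool = False) -> str:
    """Map a `verb-noun` CLI name to its backing Python function name."""
    name = cli_name.replace("-", "_")
    # Append `_cmd` when the plain name would collide with an import.
    return f"{name}_cmd" if collision else name

assert cli_to_function_name("sync-gitignore") == "sync_gitignore"
assert cli_to_function_name("sync-uv-lock", collision=True) == "sync_uv_lock_cmd"
```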
Every automated operation follows the naming conventions and is idempotent. The contracts below define the required properties for each operation type.
Every sync-* operation modifies or overwrites user-controlled files or resources. Users must retain full control: each sync operation must be individually disableable via [tool.repomatic].
Required properties (checklist for adding or auditing a sync job):

- Config toggle. A `*_sync: bool = True` field in the `Config` dataclass. Dotted sub-key in `[tool.repomatic]` (e.g., `gitignore.sync = false`). Alphabetically sorted among existing sync fields.
- CLI command. A `repomatic sync-*` command that loads config, checks the toggle, and exits cleanly (`ctx.exit(0)`) when disabled. Uses `@pass_context` to receive `ctx`.
- Toggle enforcement. For CLI-based syncs: the toggle field goes in `SUBCOMMAND_CONFIG_FIELDS` (checked in the CLI, not exposed as metadata). For workflow-only syncs (no CLI command): the toggle is exposed as a metadata output and checked in the job's `if:` condition.
- Workflow job. A `sync-*` job in the appropriate workflow file (usually `autofix.yaml`, but lifecycle-specific syncs may live elsewhere — e.g., `sync-dev-release` in `release.yaml`, `sync-labels` in `labels.yaml`). Requires: metadata `needs:` when applicable, prerequisite `if:` conditions, PR creation via `peter-evans/create-pull-request` (branch name = job ID, body from `repomatic pr-body --template sync-*`). Exception: syncs targeting API resources (e.g., labels) rather than repo files apply changes directly.
- Documentation. Config table row and TOML example in `readme.md`. Job description with "Skipped if" clause in `readme.md`. Changelog entry.
- Tests. Default and custom value assertions in `test_repomatic_config_defaults` and `test_repomatic_config_custom_values`.
Invariants:
- A disabled toggle must produce zero side effects: no file writes, no API calls, no PRs.
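As a sketch, the toggle contract looks roughly like this. Field and function names are illustrative, not the actual repomatic source:

```python
# Illustrative sketch of the sync-toggle contract: a Config field defaulting
# to True, and a sync that exits with zero side effects when disabled.
from dataclasses import dataclass

@dataclass
class Config:
    """Per-repo configuration loaded from [tool.repomatic]."""

    gitignore_sync: bool = True  # Maps to `gitignore.sync` in TOML.

def sync_gitignore(config: Config) -> bool:
    """Return True if the sync ran, False if the toggle disabled it."""
    if not config.gitignore_sync:
        # Disabled toggle: no file writes, no API calls, no PRs.
        return False
    # ... perform the sync against the canonical source ...
    return True

assert sync_gitignore(Config(gitignore_sync=False)) is False
```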
Every update-* operation computes derived artifacts from project state (lockfiles, git history, source code). Unlike sync operations, these generate computed output rather than overwriting user-authored content.
Required properties:
- CLI command. A `repomatic update-*` command.
- Workflow job. An `update-*` job in the appropriate workflow file with PR creation via `peter-evans/create-pull-request` (branch name = job ID, body from `repomatic pr-body --template update-*`).
- Documentation. Job description in `readme.md`. Changelog entry.

Optional properties:

- CLI command. A CLI wrapper is only required when the update runs custom repomatic Python logic (e.g., `update-deps-graph`). Updates that invoke external tools or standalone scripts (e.g., `sphinx-apidoc`) may call them directly from the workflow without a `repomatic update-*` wrapper.
- Config toggle. Add a `*_update: bool = True` toggle only when the generated output involves files the user may want to manage independently. If added, follow the sync toggle pattern (`Config` field, `SUBCOMMAND_CONFIG_FIELDS`, tests).
- Config parameters. Output paths, filtering options, or depth limits belong as `Config` fields (e.g., `dependency-graph.output`, `dependency-graph.level`). These configure behavior without enabling/disabling the operation.
Every format-* and fix-* operation rewrites files using a pinned external tool. format-* enforces canonical style (semantics-preserving); fix-* corrects content errors such as typos (semantics-altering). The naming convention table defines when to use each prefix.
Required properties:
- CLI command. A `repomatic format-*` or `repomatic fix-*` command that wraps a pinned external tool (e.g., ruff, mdformat, jq, typos).
- Workflow job. A job in the appropriate workflow file (usually `autofix.yaml`) with PR creation via `peter-evans/create-pull-request` (branch name = job ID, body from `repomatic pr-body --template verb-noun`).
- Documentation. Job description in `readme.md`. Changelog entry.

Invariants:

- No config toggle. Format jobs gate on metadata file-detection outputs (e.g., `python_files`, `markdown_files`, `json_files`), making them self-skipping when irrelevant. Fix jobs may run unconditionally when the tool applies to all file types.
- The external tool version must be pinned in the CLI command for reproducibility.
Every lint-* operation checks content without modifying it. Lint operations are read-only.
Required properties:
- CLI command. A `repomatic lint-*` command. Returns exit code 0 on pass, non-zero on failure.
- Workflow job. A `lint-*` job in `lint.yaml` (not `autofix.yaml`). No PR creation — lints gate merges via status checks.
- Documentation. Job description in `readme.md`. Changelog entry.

Optional properties:

- CLI command. A CLI wrapper is only required when the lint runs custom Python logic (e.g., `lint-repo`). Lints that invoke a standard external tool (`mypy`, `yamllint`, `actionlint`, `zizmor`, `gitleaks`, etc.) may call the tool directly from the workflow without a `repomatic lint-*` wrapper.

Invariants:

- Read-only. No file writes, no PRs, no side effects beyond exit code and stdout/stderr output.
- Lives in `lint.yaml`, not `autofix.yaml`.
Keep definitions sorted for readability and to minimize merge conflicts:
- Workflow jobs: Ordered by execution dependency (upstream jobs first), then alphabetically within the same dependency level.
- Python module-level constants and variables: Alphabetically, unless there is a logical grouping or dependency order. Hard-coded domain constants (e.g., `NOT_ON_PYPI_ADMONITION`, `SKIP_BRANCHES`) should be placed at the top of the file, immediately after imports. These constants encode domain assertions and business rules — surfacing them early gives readers an immediate sense of the assumptions the module operates under.
- YAML configuration keys: Alphabetically within each mapping level.
- Documentation lists and tables: Alphabetically, unless a logical order (e.g., chronological in changelog) takes precedence.
A complete release consists of all of the following. If any are missing, the release is incomplete:
- Git tag (`vX.Y.Z`) created on the freeze commit
- GitHub release with non-empty release notes matching the `changelog.md` entry for that version
- Binaries attached to the GitHub release for all 6 platform/architecture combinations (linux-arm64, linux-x64, macos-arm64, macos-x64, windows-arm64, windows-x64)
- PyPI package published at the matching version
- `changelog.md` entry with the release date and comparison URL finalized
- Use `@pytest.mark.parametrize` when testing the same logic for multiple inputs. Prefer parametrize over copy-pasted test functions that differ only in their data — it deduplicates test logic, improves readability, and makes it trivial to add new cases.
- Keep test logic simple with straightforward asserts.
- Tests should be sorted logically and alphabetically where applicable.
- Test coverage is tracked with `pytest-cov` and reported to Codecov.
- Do not use classes for grouping tests. Write test functions as top-level module functions. Only use test classes when they provide shared fixtures, setup/teardown methods, or class-level state.
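A minimal parametrize sketch, using a toy `slugify()` stand-in rather than real project code:

```python
# Parametrize sketch: one test function, many input/expected pairs.
import pytest

def slugify(title: str) -> str:
    """Toy function standing in for real project code."""
    return title.lower().replace(" ", "-")

@pytest.mark.parametrize(
    ("title", "expected"),
    [
        ("Hello World", "hello-world"),
        ("already-slugged", "already-slugged"),
    ],
)
def test_slugify(title: str, expected: str) -> None:
    assert slugify(title) == expected
```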
This repository uses two Claude Code agents defined in .claude/agents/. Their definitions should be lean — if a rule belongs in CLAUDE.md, put it here and reference it from the agent file. Do not duplicate.
CLAUDE.md defines the rules. The codebase and GitHub (issues, PRs, CI logs) are what you measure against those rules. When they disagree, fix the code to match the rules. If the rules are wrong, fix CLAUDE.md.
Patterns that recur across sessions — watch for these proactively:
- Documentation drift is the most frequent issue. CLI output, version references, and workflow job descriptions in `readme.md` go stale after every release or refactor. Always verify docs against actual output after changes.
- CI debugging starts from the URL. When a workflow fails, fetch the run logs first (`gh run view --log-failed`). Do not guess at the cause.
- Type-checking divergence. Code that passes `mypy` locally may fail in CI where `--python-version 3.10` is used. Always consider the minimum supported Python version.
- Simplify before adding. When asked to improve something, first ask whether existing code or tools already cover the case. Remove dead code and unused abstractions before introducing new ones.
- Agents make fixes in the working tree only. Never commit, push, or create PRs.
- Prefer mechanical enforcement (tests, autofix jobs, linting checks) over prose rules. If a rule can be checked by code, it should be.
- Agent definitions should reference `CLAUDE.md` sections, not restate them.
- qa-engineer is the gatekeeper for agent definition changes.
Skills in .claude/skills/ are user-invocable only (disable-model-invocation: true) and follow agent conventions: lean definitions, no duplication with CLAUDE.md, reference sections instead of restating rules.
Skills are grouped by lifecycle phase. Each skill includes a "Next steps" section that suggests related skills to run next, creating a guided workflow:
- Setup: `/repomatic-init`
- Mechanical convenience (run CI steps locally): `/repomatic-sync`, `/repomatic-lint`
- Development: `/repomatic-deps`, `/repomatic-test`
- Release: `/repomatic-changelog`, `/repomatic-release`
- Maintenance: `/repomatic-audit`, `/repomatic-topics`
Run `repomatic list-skills` to see all skills with descriptions.
The repomatic ecosystem has two layers of automation:
- Mechanical layer — CLI commands and CI workflows that deterministically sync, lint, format, and fix files. These run automatically on every push to `main` via the `autofix.yaml` and `lint.yaml` workflows.
- Analytical layer — Judgment-based tasks that require comparing context, weighing trade-offs, and making recommendations. These cannot be automated by the CLI.
What the autofix workflow already handles mechanically: All sync-*, update-*, format-*, and fix-* jobs in autofix.yaml run on every push to main — no skill needed to trigger them. See § Automated operation contracts for the structural requirements of each operation type and the workflow files for the current inventory of jobs.
What skills should focus on — the gaps the mechanical layer cannot cover:
- Analysis of custom job content in header-only workflows (stale action versions, missing workarounds, outdated integration patterns). The sync only touches the header.
- Cross-repo pattern comparison — identifying conventions in the upstream reference that downstream repos should adopt, or downstream innovations that should be contributed back.
- Judgment calls — whether a config difference is intentional divergence or stale drift, whether a workaround is still needed, what to exclude from sync.
- Interactive guidance — explaining lint results, suggesting fix strategies, walking through release prep.
When writing or updating skills, always check whether the task overlaps with the mechanical layer. If a skill mostly wraps a CLI command that already runs in CI, the skill's value must come from the analytical layer on top: explaining results, suggesting next steps, or catching things the mechanical tool misses. Do not duplicate what CI already does.
- Create something that works (to provide business value).
- Create something that's beautiful (to lower maintenance costs).
- Work on performance.
The repomatic CLI and its [tool.repomatic] configuration in pyproject.toml are the project's primary interfaces. Everything else — reusable workflows, templates, label definitions — is a delivery mechanism. New features should be implemented in the CLI first; workflows should call the CLI, not the other way around. Documentation should lead with what the CLI does and how to configure it.
Linting and formatting are automated via GitHub workflows. Developers don't need to run these manually during development, but are still expected to make a best effort. Push your changes and the workflows will catch any issues and perform the nitpicking.
GitHub Actions lacks conditional step groups — you cannot conditionally skip multiple steps with a single condition. Rather than duplicating `if:` conditions on every step, augment the `repomatic metadata` subcommand to compute the condition once and reference it from workflow steps.
Why: Python code in repomatic is simpler to maintain, test, and debug than complex GitHub Actions workflow logic. Moving conditional checks into metadata extraction centralizes logic in one place.
Example: Instead of a separate "check" step followed by multiple steps with `if: steps.check.outputs.allowed == 'true'`, add the check to metadata output and reference `steps.metadata.outputs.some_check == 'true'`.
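A rough Python sketch of that pattern, assuming an illustrative output name (`release_allowed`) rather than an actual repomatic output:

```python
# Illustrative sketch: compute a condition once in Python and expose it as a
# step output, instead of duplicating `if:` logic on every workflow step.
import os

def format_output(name: str, value: bool) -> str:
    """Render a GITHUB_OUTPUT line; GitHub expressions compare strings."""
    return f"{name}={'true' if value else 'false'}"

def write_outputs(lines: list[str]) -> None:
    """Append rendered lines to the file GitHub Actions reads for outputs."""
    path = os.environ.get("GITHUB_OUTPUT")
    if path:  # Only present inside an Actions runner.
        with open(path, "a", encoding="utf-8") as fh:
            fh.write("\n".join(lines) + "\n")

# Steps then reference `steps.metadata.outputs.release_allowed == 'true'`.
write_outputs([format_output("release_allowed", True)])
```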
GitHub Actions workflows run in an environment where race conditions, eventual consistency, and partial failures are common. Prefer a belt-and-suspenders approach: use multiple independent mechanisms to ensure correctness rather than relying on a single guarantee.
For example, changelog.yaml's bump-versions job needs to know the latest released version. Rather than trusting that git tags are always available:
- Belt — The `workflow_run` trigger ensures the job runs after the release workflow completes, so tags exist by then.
- Suspenders — The `is_version_bump_allowed()` function falls back to commit message parsing (`[changelog] Release vX.Y.Z`) when tags aren't found.
Apply the same philosophy elsewhere: avoid single points of failure in workflow logic. If a job depends on external state (tags, published packages, API availability), add a fallback or a graceful default. When possible, make operations idempotent so re-runs are safe.
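A simplified sketch of this fallback chain; the real `is_version_bump_allowed()` lives in repomatic and differs in detail:

```python
# Simplified belt-and-suspenders sketch: prefer git tags, fall back to
# parsing the release commit message when tags are not available yet.
from __future__ import annotations

import re

RELEASE_COMMIT = re.compile(r"\[changelog\] Release v(\d+\.\d+\.\d+)")

def latest_version(tags: list[str], head_message: str) -> str | None:
    """Return the latest released version, or None if undeterminable."""
    versions = [t.removeprefix("v") for t in tags if t.startswith("v")]
    if versions:
        # Belt: tags are the primary source of truth.
        return max(versions, key=lambda v: tuple(map(int, v.split("."))))
    # Suspenders: tags may not be fetched yet; parse the commit message.
    match = RELEASE_COMMIT.search(head_message)
    return match.group(1) if match else None

assert latest_version([], "[changelog] Release v5.10.1") == "5.10.1"
assert latest_version(["v5.9.0", "v5.10.0"], "") == "5.10.0"
```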
See also: actions/checkout#504 for context on actions/checkout's default merge commit behavior on pull requests.
When `workflow_run` fires, `github.event.workflow_run.head_sha` points to the commit that triggered the upstream workflow — not the latest commit on `main`. If the release cycle added commits after that trigger (freeze + unfreeze), checking out `head_sha` produces a stale tree and the resulting PR will conflict with current `main`.
Fix: Use `github.sha` instead, which for `workflow_run` events resolves to the latest commit on the default branch. The `workflow_run` trigger's purpose is timing (ensuring tags exist), not pinning to a specific commit. This applies to any job that needs the current state of `main` after an upstream workflow completes.
The release workflow creates a draft, uploads all assets, then publishes. Once published with GitHub immutable releases enabled, tags and assets are locked. Tag names are permanently burned — reinforcing the skip and move forward principle. Release notes remain editable for `sync-github-releases`.
What immutable releases actually locks: Immutability only blocks asset uploads and modifications on published releases (HTTP 422: `Cannot upload assets to an immutable release`). Published releases can still be deleted (along with their tags via `--cleanup-tag`). This distinction is critical for the dev release strategy below.
Dev releases use drafts. The `sync-dev-release` job creates dev pre-releases as drafts (`--draft --prerelease`) rather than published pre-releases. This ensures the workflow can upload binaries and packages to the release after creation. The release stays as a draft permanently — it is never published. On the next push, `cleanup_dev_releases()` deletes all existing `.dev0` releases (drafts are always deletable) before creating a fresh one. See `repomatic/github/dev_release.py` for implementation.
Workflows and CLI commands must be safe to re-run. Running the same command or workflow twice with the same inputs should produce the same result without errors or unwanted side effects (e.g., duplicate tags, duplicate PR comments, redundant file modifications).
In practice:
- Use `--skip-existing` or equivalent guards when creating resources (tags, releases, published packages).
- Check for existing state before writing (e.g., skip adding an admonition if it's already present, skip creating a PR if one already exists for the branch).
- Prefer upsert semantics over create-only semantics.
- Make file-modifying operations convergent: applying the same transformation to an already-transformed file should be a no-op.
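A minimal sketch of a convergent transformation; the admonition text is illustrative:

```python
# Convergent (idempotent) file transformation: applying it twice yields the
# same result as applying it once.
ADMONITION = "> [!WARNING]\n> This version is not on PyPI yet.\n"

def add_admonition(text: str) -> str:
    """Prepend the admonition, skipping if it is already present."""
    if text.startswith(ADMONITION):
        return text  # Already transformed: no-op on re-run.
    return ADMONITION + text

once = add_admonition("# Changelog\n")
assert add_admonition(once) == once  # Convergent: second run is a no-op.
```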
When idempotency is not achievable, document the reason in a comment or docstring explaining what side effects occur on re-runs and why they are acceptable or unavoidable.
When a release goes wrong — squash merge, broken artifact, bad metadata — prefer skipping the version and releasing the next one over reverting commits, force-pushing, or rewriting main. A burned version number is cheap; a botched automated recovery is not.
This mirrors how package repositories handle defective releases. PyPI lets maintainers yank a release rather than delete it, preserving immutability while signaling that consumers should upgrade. The same principle applies to our workflow:
- Don't automate destructive recovery. Automated reverts, force-pushes, and history rewrites on `main` are high-risk operations that compound the original mistake. The `detect-squash-merge` job creates a notification issue instead of reverting precisely for this reason.
- Notify and let humans decide. Open an issue, fail the workflow, and trust the maintainer to choose the right recovery path. A human in the loop is always safer than an automated guess.
- Version numbers are disposable. Software skips versions routinely. A changelog entry for an unpublished version is a minor cosmetic issue, not a correctness problem.
- Existing safeguards are the real protection. The tagging, publishing, and release jobs are gated on commit message patterns (`[changelog] Release v`). If those gates hold, no broken release escapes — regardless of what landed on `main`.
When designing new workflow safeguards, default to detection + notification rather than detection + automated fix. The blast radius of a missed notification is zero; the blast radius of a bad automated fix can be catastrophic.
> [!NOTE]
> For user-facing documentation, see readme.md § Concurrency and cancellation.
Workflows use two concurrency strategies depending on whether they perform critical release operations. Read the `concurrency:` block in each workflow file for the exact YAML.
`release.yaml` handles tagging, PyPI publishing, and GitHub release creation. These operations must run to completion. A conditional `cancel-in-progress: false` doesn't work because the expression is evaluated on the new workflow run, not the old one: if a regular commit were pushed while a release workflow was running, the new run would cancel the release, since both would share the same concurrency group.
The solution is to give each release run its own unique group keyed on the commit SHA. Both the `[changelog] Release` and `[changelog] Post-release` patterns must be matched because a release push bundles two commits into one event, and `github.event.head_commit` refers to the most recent one (the post-release bump).
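As an illustration only (the workflow file itself is authoritative; group names here are invented), such a `concurrency:` block might look like:

```yaml
concurrency:
  # Sketch: release pushes get a unique group keyed on the commit SHA, so a
  # later push can never land in the same group and cancel an in-flight
  # release. Everything else shares a per-branch group and cancels normally.
  group: >-
    ${{ (startsWith(github.event.head_commit.message, '[changelog] Release')
         || startsWith(github.event.head_commit.message, '[changelog] Post-release'))
        && format('release-{0}', github.sha)
        || format('ci-{0}', github.ref) }}
  cancel-in-progress: true
```

Because each release run's group is unique, `cancel-in-progress: true` is harmless for releases: there is never a second run in the same group to cancel.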
The `prepare-release` job in `changelog.yaml` creates a PR with exactly two commits that must be merged via "Rebase and merge" (never squash):
- **Freeze commit** (`[changelog] Release vX.Y.Z`) — freezes everything to the release version: finalizes the changelog date and comparison URL, removes the "unreleased" warning, freezes workflow action references to `@vX.Y.Z`, and freezes CLI invocations to a PyPI version.
- **Unfreeze commit** (`[changelog] Post-release bump vX.Y.Z → vX.Y.Z`) — unfreezes for the next development cycle: reverts action references back to `@main`, reverts CLI invocations back to local source (`--from . repomatic`), adds a new unreleased changelog section, and bumps the version to the next patch.
The auto-tagging job in `release.yaml` depends on these being separate commits: it uses `release_commits_matrix` to identify and tag only the freeze commit. Squashing would merge both into one, breaking the tagging logic.
**Squash merge safeguard:** The `detect-squash-merge` job in `release.yaml` detects squash merges by checking whether the head commit message starts with ``Release `v`` (the PR title pattern) rather than `[changelog] Release v` (the canonical freeze commit pattern). When detected, it opens a GitHub issue assigned to the person who merged, then fails the workflow. The release is effectively skipped: existing safeguards in `create-tag` prevent tagging, publishing, and releasing.
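The core of that detection is a prefix check on the head commit message. A hypothetical sketch (the authoritative logic lives in `release.yaml`):

```python
def is_squashed_release(message: str) -> bool:
    """Detect a squashed release PR merge from the head commit message.

    A squash merge adopts the PR title ("Release `v..."), while a proper
    "Rebase and merge" preserves the canonical freeze commit prefix
    ("[changelog] Release v...").
    """
    return message.startswith("Release `v")

assert is_squashed_release("Release `v1.2.3`")
assert not is_squashed_release("[changelog] Release v1.2.3")
```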
On `main`, workflows use `--from . repomatic` to run the CLI from local source (dogfooding). The freeze commit pins these invocations to `repomatic==X.Y.Z` so tagged releases reference a published package. The unfreeze commit reverts them back for the next development cycle.
`changelog.yaml` includes `github.event_name` in its concurrency group to prevent cross-event cancellation. This is required because `changelog.yaml` has both `push` and `workflow_run` triggers. Without `event_name` in the group, the `workflow_run` event (which fires when "Build & release" completes) would cancel the `push` event's `prepare-release` job, but then skip `prepare-release` itself (due to `if: github.event_name != 'workflow_run'`), so `prepare-release` would never run.
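Illustratively (again, the workflow file is authoritative), including the event name in the group looks like:

```yaml
concurrency:
  # Sketch: adding the event name keeps push and workflow_run events in
  # separate groups, so a workflow_run event can never cancel the push
  # event's prepare-release job.
  group: ${{ github.workflow }}-${{ github.event_name }}-${{ github.ref }}
  cancel-in-progress: true
```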
Always prefer long-form options over short-form for readability when invoking commands:
- Use `--output` instead of `-o`.
- Use `--verbose` instead of `-v`.
- Use `--recursive` instead of `-r`.
The repomatic CLI defines both short and long-form options for convenience, but workflow files and scripts should use long-form options for clarity.
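For instance, using GNU grep purely as a neutral illustration (the principle applies to any command):

```shell
# Set up a scratch directory for the demonstration.
dir=$(mktemp -d)
mkdir "$dir/src"
printf 'todo: tidy this up\n' > "$dir/src/notes.txt"
cd "$dir"

# Short-form: terse but cryptic when read in a workflow file.
grep -r -i -l "todo" src/

# Long-form equivalent: self-documenting at a glance.
grep --recursive --ignore-case --files-with-matches "todo" src/
```

Both invocations behave identically; only the long-form one can be understood without consulting a man page.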
When invoking `uv` and `uvx` commands in GitHub Actions workflows:
- `--no-progress` on all CI commands (uv-level flag, placed before the subcommand). Progress bars render poorly in CI logs.
- `--frozen` on `uv run` commands (run-level flag, placed after `run`). The lockfile should be immutable in CI.
- Flag placement: `uv --no-progress run --frozen -- command` (not `uv run --no-progress`).
- Exceptions: omit `--frozen` for `uvx` with pinned versions, `uv tool install`, CLI invocability tests, and local development examples.
- Prefer explicit flags over environment variables (`UV_NO_PROGRESS`, `UV_FROZEN`). Flags are self-documenting, visible in logs, avoid conflicts (e.g., `UV_FROZEN` vs `--locked`), and align with the long-form option principle.
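A representative workflow step (sketch; the step name is illustrative, the flag placement is the point):

```yaml
- name: Run tests
  # uv-level flag (--no-progress) goes before the subcommand;
  # run-level flags (--frozen, --group) go after it.
  run: uv --no-progress run --frozen --group test -- pytest
```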