Local tooling to maintain a merged Slack export archive, build browsable outputs, and publish the latest export ZIP to Cloudflare R2.
- `archive/` - merged Slack export data (living archive)
- `imports/` - raw export ZIPs
- `dist/` - generated viewer + markdown outputs for qmd-style indexing
- `lib/` - task implementations called by `mise`
- `mise.toml` - tool versions + task entrypoints
```
mise run setup
```

`mise run setup` checks/installs the pinned tools from `mise.toml` (Python, AWS CLI), initializes submodules, and verifies dependencies.
```
cp .env.example .env
```

Fill in real values for:
- R2 account + upload endpoint settings
- R2 upload keypair (Read & Write token)
- bucket/object names (defaults already set)
`mise run upload` automatically loads `.env`.
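mise loads `.env` itself, so no extra tooling is needed; purely as an illustration of the simple `KEY=VALUE` format the file uses, here is a minimal sketch of a loader (the parser and sample values are illustrative, not part of this repo):

```python
import os
import tempfile

def load_env(path: str = ".env") -> dict[str, str]:
    """Minimal .env loader: KEY=VALUE lines; blank lines and '#' comments ignored."""
    env = {}
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            env[key.strip()] = value.strip().strip('"')
    return env

# Demonstrate with a throwaway sample file (values are fake).
path = os.path.join(tempfile.gettempdir(), "sample.env")
with open(path, "w") as fh:
    fh.write('# R2 credentials\nR2_ACCOUNT_ID=abc123\nR2_BUCKET="railsperf-exports"\n')
print(load_env(path))
```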
```
mise run publish -- imports/monthly-2026-05.zip
```

Use `mise run publish -- --skip-merge` when `archive/` is already up to date and you only want build + upload.
```
mise run merge -- imports/monthly-2026-05.zip
mise run build
mise run upload
```

`mise run build` now writes a qmd-friendly markdown corpus into:
- `dist/archive.md` - top-level index file
- `dist/markdown/<channel>/<YYYY-MM>.md` - monthly channel documents
These files are structured for local indexing tools like qmd: real headings, smaller documents, and newest-first message ordering.
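As an illustration of that per-channel, per-month structure, here is a sketch of the grouping and newest-first ordering. This is not the actual `lib/` build code; the message fields (`ts`, `user`, `text`) are assumed from Slack's export format, and the heading layout is invented:

```python
from collections import defaultdict
from datetime import datetime, timezone

def monthly_docs(channel: str, messages: list[dict]) -> dict[str, str]:
    """Group messages into per-month markdown docs, newest-first,
    keyed by 'markdown/<channel>/<YYYY-MM>.md' style paths."""
    by_month: dict[str, list[dict]] = defaultdict(list)
    for msg in messages:
        ts = float(msg["ts"])  # Slack timestamps are epoch seconds as strings
        month = datetime.fromtimestamp(ts, tz=timezone.utc).strftime("%Y-%m")
        by_month[month].append(msg)

    docs = {}
    for month, msgs in by_month.items():
        msgs.sort(key=lambda m: float(m["ts"]), reverse=True)  # newest first
        lines = [f"# #{channel}: {month}", ""]
        for m in msgs:
            lines.append(f"## {m['user']} @ {m['ts']}")  # real heading per message
            lines.append(m["text"])
            lines.append("")
        docs[f"markdown/{channel}/{month}.md"] = "\n".join(lines)
    return docs
```

Keeping each document scoped to one channel-month keeps files small enough for local indexers, and the descending sort puts the most recent discussion at the top of every file.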
`mise run upload` creates `railsperf-export-latest.zip` from `dist/` (falls back to `.tar` if `zip` is unavailable), then uploads via:
- `R2_PRESIGNED_PUT_URL` + `curl`, or
- AWS CLI to `https://<ACCOUNT_ID>.r2.cloudflarestorage.com`, using:
  - `R2_ACCOUNT_ID`
  - `R2_UPLOAD_ACCESS_KEY_ID`
  - `R2_UPLOAD_SECRET_ACCESS_KEY`
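The presigned-URL path is just an HTTP `PUT` of the archive bytes. A minimal stdlib sketch of that request (the actual task uses `curl`; the filename and content type here are assumptions):

```python
import os
import urllib.request

def build_put_request(url: str, zip_path: str) -> urllib.request.Request:
    """Build a PUT request uploading the archive ZIP to a presigned URL."""
    with open(zip_path, "rb") as fh:
        body = fh.read()
    return urllib.request.Request(
        url,
        data=body,
        method="PUT",
        headers={"Content-Type": "application/zip"},
    )

# Only attempt the real upload when a presigned URL is configured.
if "R2_PRESIGNED_PUT_URL" in os.environ:
    req = build_put_request(os.environ["R2_PRESIGNED_PUT_URL"],
                            "railsperf-export-latest.zip")
    with urllib.request.urlopen(req) as resp:
        print(resp.status)
```

A presigned URL already encodes the bucket, object name, and signature, so no access keys are needed at upload time; that is why this path skips the AWS CLI entirely.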
- `users.json` merged by `id` (incoming wins)
- `channels.json` merged by `id` (incoming wins)
- per-day channel files merged by message `ts`
- excluded channels: `#random`, `#introductions`
- idempotent when re-merging the same export
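The merge rules above can be sketched as key-based dictionary merges; this is an illustration of the semantics, not the actual `lib/` task:

```python
def merge_by_id(existing: list[dict], incoming: list[dict]) -> list[dict]:
    """users.json / channels.json semantics: keyed by 'id', incoming wins."""
    merged = {item["id"]: item for item in existing}
    merged.update({item["id"]: item for item in incoming})
    return list(merged.values())

# Channels skipped during merge, per this README.
EXCLUDED = {"random", "introductions"}

def is_excluded(channel: str) -> bool:
    return channel.lstrip("#") in EXCLUDED

def merge_day(existing: list[dict], incoming: list[dict]) -> list[dict]:
    """Per-day channel file semantics: messages keyed by 'ts', incoming wins."""
    merged = {msg["ts"]: msg for msg in existing}
    merged.update({msg["ts"]: msg for msg in incoming})
    return sorted(merged.values(), key=lambda m: float(m["ts"]))
```

Because each merge is a keyed upsert, re-merging the same export replaces every record with an identical copy, which is what makes the operation idempotent.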