BigDaddyAman/pg-r2-backup

Postgres-to-R2 Backup (S3-Compatible)

A lightweight PostgreSQL backup automation tool that creates scheduled backups and securely uploads them to S3-compatible object storage such as Cloudflare R2, AWS S3, Wasabi, Backblaze B2, or MinIO.

It is designed to run reliably on PaaS platforms, with first-class support for Docker and cron scheduling, while remaining fully portable via a CLI or container.


✨ Features

  • 📦 Automated Backups — scheduled daily or hourly PostgreSQL backups
  • 🔐 Optional Encryption — gzip compression or 7z encryption with password
  • ☁️ Cloudflare R2 Integration — seamless S3-compatible storage support
  • 🧹 Retention Policy — automatically delete old backups
  • 🔗 Flexible Database URLs — supports private and public PostgreSQL connection URLs
  • ⚡ Optimized Performance — parallel pg_dump and multipart S3 uploads
  • 🐳 Docker Ready — portable, lightweight container
  • 🚀 Deployment Templates — no fork required for normal usage
  • 🪣 S3-Compatible Storage — works with R2, AWS S3, Wasabi, B2, MinIO
  • 💾 Optional Local Retention — keep backups locally for CLI, VPS, or NAS usage
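The parallel-dump feature maps onto pg_dump's directory format, which is the only output format that accepts parallel workers. As a sketch of how such a command might be assembled (the function name and defaults are illustrative, not the tool's actual internals):

```python
from typing import List

def pg_dump_cmd(database_url: str, out_dir: str, jobs: int = 4) -> List[str]:
    """Build a parallel pg_dump command line.

    pg_dump only supports --jobs together with the directory
    format (-Fd), so that format is hard-coded here.
    """
    return [
        "pg_dump",
        "--format=directory",
        f"--jobs={jobs}",
        f"--file={out_dir}",
        database_url,
    ]

# Example: four parallel workers dumping into ./backup_dir
cmd = pg_dump_cmd("postgresql://user:pass@host:5432/db", "backup_dir")
```

The resulting list can be handed to `subprocess.run(cmd, check=True)` once a database is reachable.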

🚀 Deployment on Railway

  1. Click the Deploy on Railway button below
  2. Railway will create a new project using the latest version of this repository
  3. Add the required environment variables in the Railway dashboard
  4. (Optional) Configure a cron job for your desired backup schedule

Railway uses ephemeral storage. Local backup files are deleted by default after upload.

Deploy on Railway


🔧 Environment Variables (S3-Compatible)

DATABASE_URL=           # PostgreSQL database URL (private)
DATABASE_PUBLIC_URL=    # Public PostgreSQL URL (optional)
USE_PUBLIC_URL=false    # Set true to use DATABASE_PUBLIC_URL

DUMP_FORMAT=dump        # sql | plain | dump | custom | tar
FILENAME_PREFIX=backup  # Backup filename prefix
MAX_BACKUPS=7           # Number of backups to retain
KEEP_LOCAL_BACKUP=false # Keep backup file locally after upload (not recommended on PaaS)

R2_ENDPOINT=            # S3 endpoint URL
R2_BUCKET_NAME=         # Bucket name
R2_ACCESS_KEY=          # Access key
R2_SECRET_KEY=          # Secret key
S3_REGION=us-east-1     # Required for AWS S3 (ignored by R2/MinIO)

BACKUP_PASSWORD=        # Optional: enables 7z encryption
BACKUP_TIME=00:00       # Daily backup time (UTC, HH:MM)

Variable names use R2_* for historical reasons, but any S3-compatible provider can be used by changing the endpoint and credentials. For AWS S3 users: ensure S3_REGION matches your bucket’s region.


☁️ Supported S3-Compatible Providers

This project uses the standard AWS S3 API via boto3, and works with:

  • Cloudflare R2 (recommended)
  • AWS S3
  • Wasabi
  • Backblaze B2 (S3 API)
  • MinIO (self-hosted)

Example Endpoints

| Provider | Endpoint Example |
| --- | --- |
| Cloudflare R2 | `https://<accountid>.r2.cloudflarestorage.com` |
| AWS S3 | `https://s3.amazonaws.com` |
| Wasabi | `https://s3.wasabisys.com` |
| Backblaze B2 | `https://s3.us-west-004.backblazeb2.com` |
| MinIO | `http://localhost:9000` |

⏰ Railway Cron Jobs

You can configure the backup schedule using Railway Cron Jobs:

  1. Open your Railway project
  2. Go to Deployments → Cron
  3. Add a cron job targeting this service

Common Cron Expressions

| Schedule | Cron Expression | Description |
| --- | --- | --- |
| Hourly | `0 * * * *` | Every hour |
| Daily | `0 0 * * *` | Once per day (UTC midnight) |
| Twice Daily | `0 */12 * * *` | Every 12 hours |
| Weekly | `0 0 * * 0` | Every Sunday |
| Monthly | `0 0 1 * *` | First day of the month |

Tips

  • All cron times are UTC
  • Use https://crontab.guru to validate expressions
  • Adjust MAX_BACKUPS to match your schedule

If you use Railway Cron Jobs, the service starts once per execution: it is expected to run a single backup and then exit, rather than relying on any internal scheduler. Ensure the backup process exits cleanly after completion; otherwise, Railway will skip subsequent cron executions.
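Under that cron-driven model, the entry point reduces to a run-once script: perform one backup, report an exit code, and stop. A sketch of that shape (`run_backup` is a placeholder for the real dump-and-upload step, not this project's function):

```python
def run_backup() -> bool:
    """Placeholder for the actual dump-and-upload step."""
    return True

def main() -> int:
    # One backup per invocation; returning promptly lets the platform
    # schedule the next cron run instead of skipping it because the
    # previous execution never finished.
    return 0 if run_backup() else 1

# The real entry point would end with: raise SystemExit(main())
```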


🖥️ Running Locally or on Other Platforms

The tool can run on any platform that provides:

  • Python 3.9+
  • pg_dump (PostgreSQL client tools)
  • Environment variables
  • Long-running background processes or cron

Docker images use Python 3.12 by default.
Local execution supports Python 3.9+.

Supported Environments

  • Local machine (Linux / macOS / Windows*)
  • VPS (Netcup, Hetzner, DigitalOcean, etc.)
  • Docker containers
  • Other PaaS providers (Heroku, Fly.io, Render, etc.)

*Windows is supported when pg_dump is installed and available in PATH.

Local Requirements

  • Python 3.9+
  • PostgreSQL client tools (pg_dump)
  • pip

Run Manually (Local)

pip install -r requirements.txt
python main.py

Run with Docker (Optional)

Build and run the image locally:

docker build -t postgres-to-r2-backup .
docker run --env-file .env postgres-to-r2-backup

Ensure the container is allowed to run continuously when not using an external cron scheduler.

All scheduling uses UTC by default (e.g. Malaysia UTC+8 → set BACKUP_TIME=16:00 for midnight).
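The UTC conversion in that note can be computed rather than worked out by hand. A small sketch using only the standard library (the function name is illustrative; zoneinfo is available from Python 3.9, the tool's minimum version):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

def backup_time_utc(local_hhmm: str, tz_name: str) -> str:
    """Convert a local wall-clock HH:MM into the UTC value for BACKUP_TIME."""
    hour, minute = map(int, local_hhmm.split(":"))
    # A fixed reference date; for zones with DST the offset (and thus
    # the result) would depend on the date chosen.
    local = datetime(2024, 1, 1, hour, minute, tzinfo=ZoneInfo(tz_name))
    return local.astimezone(ZoneInfo("UTC")).strftime("%H:%M")

# Malaysia (UTC+8): midnight local time corresponds to 16:00 UTC.
print(backup_time_utc("00:00", "Asia/Kuala_Lumpur"))  # 16:00
```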

Run from Prebuilt Docker Image

If you downloaded a prebuilt Docker image archive (.tar or .tar.gz), you can run it without building locally:

# Extract the archive (if compressed)
tar -xzf postgres-to-r2-backup_v1.0.6.tar.gz

# Load the image into Docker
docker load -i postgres-to-r2-backup_v1.0.6.tar

# Run the container
docker run --env-file .env postgres-to-r2-backup:v1.0.6

Prebuilt images are architecture-specific (amd64 / arm64).


🧰 Using the CLI (Global Installation)

In addition to running as a Railway or Docker service, this project can be installed via pip and used as a standalone CLI tool.

Install via pip

pip install pg-r2-backup

Requirements

  • Python 3.9+
  • PostgreSQL client tools (pg_dump) installed and available in PATH

Quick Start (CLI)

mkdir backups
cd backups

pg-r2-backup init      # creates .env from .env.example
pg-r2-backup doctor    # checks environment and dependencies
pg-r2-backup run       # runs a backup immediately

CLI Commands

pg-r2-backup run            # Run backup immediately
pg-r2-backup doctor         # Check environment & dependencies
pg-r2-backup config show    # Show current configuration
pg-r2-backup init           # Create .env from .env.example
pg-r2-backup schedule       # Show scheduling examples
pg-r2-backup --version

Environment Variable Resolution (CLI)

When running via the CLI, environment variables are resolved in the following order:

  1. A .env file in the current working directory (or parent directory)
  2. System environment variables

This allows different folders to maintain separate backup configurations.
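That resolution order can be sketched as a small lookup helper: the nearest .env (current directory or a parent) wins, and the system environment is the fallback. This illustrates the documented order only; it is not the tool's actual implementation:

```python
import os
from pathlib import Path
from typing import Optional

def resolve_env(key: str, start: Optional[Path] = None) -> Optional[str]:
    """Look up key in the nearest .env, then the system environment."""
    directory = (start or Path.cwd()).resolve()
    for folder in (directory, *directory.parents):
        env_file = folder / ".env"
        if env_file.is_file():
            for line in env_file.read_text().splitlines():
                line = line.strip()
                if line and not line.startswith("#") and "=" in line:
                    name, _, value = line.partition("=")
                    if name.strip() == key:
                        return value.strip()
            break  # only the nearest .env is consulted
    return os.environ.get(key)
```

Per-folder configuration then falls out naturally: running the CLI from a different directory picks up that directory's .env.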

Local Backup Behavior (CLI)

By default, pg-r2-backup deletes the local backup file after a successful upload.

To keep a local copy (recommended for local machines, VPS, or NAS):

KEEP_LOCAL_BACKUP=true

Not recommended on PaaS platforms (Railway, Fly.io, Render, Heroku, etc.) due to ephemeral filesystems.

Scheduling Backups (CLI)

The CLI does not run a background scheduler. Use your operating system or platform scheduler instead.

Linux / macOS (cron)

0 0 * * * pg-r2-backup run

Windows (Task Scheduler)

  • Program: pg-r2-backup
  • Arguments: run
  • Start in: folder containing .env (working directory)

Railway / Docker

Use the platform's built-in scheduler (recommended).

💡 Tip
Run pg-r2-backup schedule at any time to see scheduling examples.


🛠 Development & Contributions

Fork this repository only if you plan to:

  • Modify the backup logic
  • Add features or integrations
  • Submit pull requests
  • Run locally for development

For security best practices and deployment recommendations, see SECURITY.md.

❓ FAQ

Why only DATABASE_URL?
This matches how most modern platforms expose PostgreSQL credentials.
Support for separate DB variables may be added if there is demand.

📜 License

This project is open source under the MIT License.

You are free to use, modify, and distribute it with attribution.
