A lightweight PostgreSQL backup automation tool that creates scheduled backups and securely uploads them to S3-compatible object storage such as Cloudflare R2, AWS S3, Wasabi, Backblaze B2, or MinIO.
It is designed to run reliably on PaaS platforms, with first-class support for Docker and cron scheduling, while remaining fully portable via a CLI or container.
- 📦 Automated Backups — scheduled daily or hourly PostgreSQL backups
- 🔐 Optional Encryption — gzip compression or 7z encryption with password
- ☁️ Cloudflare R2 Integration — seamless S3-compatible storage support
- 🧹 Retention Policy — automatically delete old backups
- 🔗 Flexible Database URLs — supports private and public PostgreSQL connection URLs
- ⚡ Optimized Performance — parallel pg_dump and multipart S3 uploads
- 🐳 Docker Ready — portable, lightweight container
- 🚀 Deployment Templates — no fork required for normal usage
- 🪣 S3-Compatible Storage — works with R2, AWS S3, Wasabi, B2, MinIO
- 💾 Optional Local Retention — keep backups locally for CLI, VPS, or NAS usage
- Click the Deploy on Railway button below
- Railway will create a new project using the latest version of this repository
- Add the required environment variables in the Railway dashboard
- (Optional) Configure a cron job for your desired backup schedule
Railway uses ephemeral storage. Local backup files are deleted by default after upload.
```env
DATABASE_URL=            # PostgreSQL database URL (private)
DATABASE_PUBLIC_URL=     # Public PostgreSQL URL (optional)
USE_PUBLIC_URL=false     # Set true to use DATABASE_PUBLIC_URL
DUMP_FORMAT=dump         # sql | plain | dump | custom | tar
FILENAME_PREFIX=backup   # Backup filename prefix
MAX_BACKUPS=7            # Number of backups to retain
KEEP_LOCAL_BACKUP=false  # Keep backup file locally after upload (not recommended on PaaS)
R2_ENDPOINT=             # S3 endpoint URL
R2_BUCKET_NAME=          # Bucket name
R2_ACCESS_KEY=           # Access key
R2_SECRET_KEY=           # Secret key
S3_REGION=us-east-1      # Required for AWS S3 (ignored by R2/MinIO)
BACKUP_PASSWORD=         # Optional: enables 7z encryption
BACKUP_TIME=00:00        # Daily backup time (UTC, HH:MM)
```

Variable names use `R2_*` for historical reasons, but any S3-compatible provider can be used by changing the endpoint and credentials. For AWS S3 users: ensure `S3_REGION` matches your bucket's region.
This project uses the standard AWS S3 API via boto3, and works with:
- Cloudflare R2 (recommended)
- AWS S3
- Wasabi
- Backblaze B2 (S3 API)
- MinIO (self-hosted)
| Provider | Endpoint Example |
|---|---|
| Cloudflare R2 | `https://<accountid>.r2.cloudflarestorage.com` |
| AWS S3 | `https://s3.amazonaws.com` |
| Wasabi | `https://s3.wasabisys.com` |
| Backblaze B2 | `https://s3.us-west-004.backblazeb2.com` |
| MinIO | `http://localhost:9000` |
You can configure the backup schedule using Railway Cron Jobs:
- Open your Railway project
- Go to Deployments → Cron
- Add a cron job targeting this service
| Schedule | Cron Expression | Description |
|---|---|---|
| Hourly | `0 * * * *` | Every hour |
| Daily | `0 0 * * *` | Once per day (UTC midnight) |
| Twice Daily | `0 */12 * * *` | Every 12 hours |
| Weekly | `0 0 * * 0` | Every Sunday |
| Monthly | `0 0 1 * *` | First day of the month |
Tips
- All cron times are UTC
- Use https://crontab.guru to validate expressions
- Adjust `MAX_BACKUPS` to match your schedule
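Note that `MAX_BACKUPS` counts files, not days, so it must cover your retention window at the chosen cron frequency. A quick sanity check (the helper is illustrative, not part of the tool):

```python
def max_backups_for(days_to_keep: int, backups_per_day: int) -> int:
    # MAX_BACKUPS counts backup files, not days: to keep a window of
    # `days_to_keep` days, multiply by how often the cron job fires.
    return days_to_keep * backups_per_day

print(max_backups_for(7, 1))   # daily cron, keep a week  -> 7
print(max_backups_for(7, 24))  # hourly cron, keep a week -> 168
```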
If you use Railway Cron Jobs, the service will start once per execution. In this setup, the service is expected to run a single backup and exit. Any internal scheduler should not be relied on. Ensure the backup process exits cleanly after completion; otherwise, Railway will skip subsequent cron executions.
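The run-once contract can be sketched as a minimal entrypoint (`run_backup` is a placeholder standing in for the project's actual backup routine):

```python
def run_backup() -> bool:
    # Placeholder for the real backup routine (pg_dump + upload).
    return True

def one_shot() -> int:
    # Cron-job mode: perform exactly one backup and return an exit
    # code; the process must terminate afterwards so the platform
    # can schedule the next execution.
    return 0 if run_backup() else 1
```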
It can run on any platform that supports:
- Python 3.9+
- `pg_dump` (PostgreSQL client tools)
- Environment variables
- Long-running background processes or cron
Docker images use Python 3.12 by default.
Local execution supports Python 3.9+.
- Local machine (Linux / macOS / Windows*)
- VPS (Netcup, Hetzner, DigitalOcean, etc.)
- Docker containers
- Other PaaS providers (Heroku, Fly.io, Render, etc.)
\*Windows is supported when `pg_dump` is installed and available in PATH.
- Python 3.9+
- PostgreSQL client tools (`pg_dump`)
- pip

```shell
pip install -r requirements.txt
python main.py
```

Build and run the image locally:
```shell
docker build -t postgres-to-r2-backup .
docker run --env-file .env postgres-to-r2-backup
```

Ensure the container is allowed to run continuously when not using an external cron scheduler.
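In continuous mode, the internal scheduler effectively sleeps until the next `BACKUP_TIME` each day. The idea can be sketched in a few lines (a simplified model, not the project's actual code):

```python
from datetime import datetime, timedelta, timezone

def seconds_until(hhmm: str, now=None) -> float:
    # Seconds from `now` (UTC) until the next occurrence of HH:MM --
    # roughly what a daily scheduler would sleep before the next backup.
    h, m = map(int, hhmm.split(":"))
    now = now or datetime.now(timezone.utc)
    target = now.replace(hour=h, minute=m, second=0, microsecond=0)
    if target <= now:
        target += timedelta(days=1)  # already passed today -> tomorrow
    return (target - now).total_seconds()
```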
All scheduling uses UTC by default (e.g. Malaysia is UTC+8, so set `BACKUP_TIME=16:00` for local midnight).
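The offset arithmetic can be double-checked with a short Python helper (illustrative only, not part of the tool):

```python
from datetime import datetime, timedelta, timezone

def backup_time_utc(local_hhmm: str, utc_offset_hours: float) -> str:
    # Convert a local HH:MM wall-clock time to the UTC value
    # expected by BACKUP_TIME.
    h, m = map(int, local_hhmm.split(":"))
    local_tz = timezone(timedelta(hours=utc_offset_hours))
    local = datetime(2024, 1, 1, h, m, tzinfo=local_tz)
    return local.astimezone(timezone.utc).strftime("%H:%M")

print(backup_time_utc("00:00", 8))  # Malaysia (UTC+8) midnight -> 16:00
```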
If you downloaded a prebuilt Docker image archive (.tar or .tar.gz), you can run it without building locally:
```shell
# Extract the archive (if compressed)
tar -xzf postgres-to-r2-backup_v1.0.6.tar.gz

# Load the image into Docker
docker load -i postgres-to-r2-backup_v1.0.6.tar

# Run the container
docker run --env-file .env postgres-to-r2-backup:v1.0.6
```

Prebuilt images are architecture-specific (amd64 / arm64).
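To pick the matching archive, you can map the host's reported machine type to Docker's architecture names (an illustrative helper; names outside the mapping pass through unchanged):

```python
import platform

def docker_arch(machine: str) -> str:
    # Map platform.machine() output to Docker's architecture naming
    # (amd64 / arm64) so the matching prebuilt image gets loaded.
    return {"x86_64": "amd64", "aarch64": "arm64"}.get(machine.lower(), machine.lower())

print(docker_arch(platform.machine()))
```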
This project can also be used as a standalone CLI tool, installable via pip, in addition to running as a Railway or Docker service.
```shell
pip install pg-r2-backup
```

- Python 3.9+
- PostgreSQL client tools (`pg_dump`) installed and available in PATH
```shell
mkdir backups
cd backups

pg-r2-backup init    # creates .env from .env.example
pg-r2-backup doctor  # checks environment and dependencies
pg-r2-backup run     # runs a backup immediately
```

Available commands:

```shell
pg-r2-backup run          # Run backup immediately
pg-r2-backup doctor       # Check environment & dependencies
pg-r2-backup config show  # Show current configuration
pg-r2-backup init         # Create .env from .env.example
pg-r2-backup schedule     # Show scheduling examples
pg-r2-backup --version
```

When running via the CLI, environment variables are resolved in the following order:
- A `.env` file in the current working directory (or a parent directory)
- System environment variables

This allows different folders to maintain separate backup configurations.
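The precedence can be modeled in a few lines (a simplified sketch, not the tool's actual code; `dotenv_values` stands in for the parsed `.env` file):

```python
import os

def resolve(name: str, dotenv_values: dict, default=None):
    # Values from the working directory's .env win over the system
    # environment, matching the resolution order described above.
    if name in dotenv_values:
        return dotenv_values[name]
    return os.environ.get(name, default)

os.environ["MAX_BACKUPS"] = "14"
print(resolve("MAX_BACKUPS", {"MAX_BACKUPS": "7"}))  # .env wins -> 7
print(resolve("FILENAME_PREFIX", {}, "backup"))      # falls back to default
```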
By default, pg-r2-backup deletes the local backup file after a successful upload.
To keep a local copy (recommended for local machines, VPS, or NAS):
```env
KEEP_LOCAL_BACKUP=true
```
Not recommended on PaaS platforms (Railway, Fly.io, Render, Heroku, etc.) due to ephemeral filesystems.
The CLI does not run a background scheduler. Use your operating system or platform scheduler instead.
Linux / macOS (cron):

```shell
0 0 * * * pg-r2-backup run
```

Windows (Task Scheduler):

- Program: `pg-r2-backup`
- Arguments: `run`
- Start in: folder containing `.env` (working directory)

Railway / Docker:

Use the platform's built-in scheduler (recommended).
💡 Tip
Run `pg-r2-backup schedule` at any time to see scheduling examples.
Fork this repository only if you plan to:
- Modify the backup logic
- Add features or integrations
- Submit pull requests
- Run locally for development
For security best practices and deployment recommendations, see SECURITY.md.
Why only `DATABASE_URL`?
This matches how most modern platforms expose PostgreSQL credentials.
Support for separate DB variables may be added if there is demand.
This project is open source under the MIT License.
You are free to use, modify, and distribute it with attribution.