This repository pairs a separate Python pipeline runner with a Cloudflare Worker serving layer.
```
cd /Users/viktorvitovec/Documents/Projekty/HousesPredict-v2/worker-app
npx wrangler whoami
npx wrangler login
npx wrangler d1 create praha-price-predictor
npx wrangler r2 bucket create praha-price-models
```

Copy the returned `database_id` into `wrangler.jsonc` and uncomment the `d1_databases` and `r2_buckets` bindings.
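Once uncommented, the bindings might look like the sketch below. The binding names (`DB`, `MODELS`) and the placeholder id are assumptions; use whatever names your Worker code expects and the id printed by `wrangler d1 create`.

```jsonc
// Sketch of the uncommented bindings in wrangler.jsonc.
// "DB" / "MODELS" are assumed binding names; paste the real database_id.
{
  "d1_databases": [
    {
      "binding": "DB",
      "database_name": "praha-price-predictor",
      "database_id": "<returned-database-id>"
    }
  ],
  "r2_buckets": [
    {
      "binding": "MODELS",
      "bucket_name": "praha-price-models"
    }
  ]
}
```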
```
npx wrangler d1 migrations apply praha-price-predictor --remote
```

The latest migration also creates service tables for:
- user profiles
- free prediction usage tracking
- premium entitlements
- Stripe customer mapping
- scored market opportunities for the dashboard
Set the product secrets on the Worker:

```
npx wrangler secret put SUPABASE_URL
npx wrangler secret put SUPABASE_ANON_KEY
npx wrangler secret put STRIPE_SECRET_KEY
npx wrangler secret put STRIPE_WEBHOOK_SECRET
npx wrangler secret put APP_BASE_URL
npx wrangler secret put STRIPE_PRICE_ID
```

Optional non-secret vars can be added in `wrangler.jsonc` or via the dashboard:

- PREMIUM_PLAN_CODE
- PREMIUM_PRICE_LABEL
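In `wrangler.jsonc`, non-secret vars go under the `vars` key; a minimal sketch (the values shown are placeholders, not the project's real plan code or label):

```jsonc
// Sketch: plain-text configuration vars in wrangler.jsonc.
// Both values below are placeholder assumptions.
{
  "vars": {
    "PREMIUM_PLAN_CODE": "premium-monthly",
    "PREMIUM_PRICE_LABEL": "Premium (monthly)"
  }
}
```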
```
npx wrangler check
npx wrangler deploy
```

You have two supported scheduling options.
Option A: separate Linux host
Run the Python pipeline on a separate Linux host with:
```
./ops/run-scrape-publish.sh
./ops/run-train-publish.sh
```

Install the example cron from `pipeline.crontab.example`.
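For orientation, crontab entries mirroring the GitHub Actions cadence (scrape/publish every 6 hours, train/publish nightly) could look like this. The repo path, log path, and exact times are assumptions; prefer the shipped `pipeline.crontab.example`.

```
# Sketch only — paths and minute offsets are assumptions.
# Scrape/publish every 6 hours:
0 */6 * * * /path/to/repo/ops/run-scrape-publish.sh >> /var/log/pipeline.log 2>&1
# Train/publish nightly:
30 2 * * * /path/to/repo/ops/run-train-publish.sh >> /var/log/pipeline.log 2>&1
```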
Option B: GitHub Actions
Enable these workflows and set these repository secrets:
- CLOUDFLARE_API_TOKEN
- CLOUDFLARE_ACCOUNT_ID
- ALERT_WEBHOOK_URL (optional)
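The repository secrets above can be set from a terminal with the GitHub CLI. A dry-run sketch: the `echo` prints each command instead of running it; remove it to apply for real (requires an installed, authenticated `gh`).

```shell
# Dry run: print one `gh secret set` command per repository secret.
# Remove the `echo` to actually set them via the GitHub CLI.
for name in CLOUDFLARE_API_TOKEN CLOUDFLARE_ACCOUNT_ID ALERT_WEBHOOK_URL; do
  echo gh secret set "$name"
done
```

`gh secret set NAME` prompts for the value, so nothing sensitive ends up in shell history.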
The GitHub Actions setup provisions Python and Node, runs scrape/publish every 6 hours and train/publish nightly, and uploads generated reports as workflow artifacts. Without the Cloudflare secrets, scheduled runs skip remote publish/housekeeping and exit successfully, so notification emails stay quiet until publishing is configured.
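The schedule triggers described above would be expressed in the workflow files roughly as follows. The minute offsets and the split across workflows are assumptions; check the files under `.github/workflows` for the actual definitions.

```yaml
# Sketch: schedule triggers matching the cadence described above.
on:
  schedule:
    - cron: "0 */6 * * *"   # scrape/publish every 6 hours
    - cron: "30 1 * * *"    # train/publish nightly
  workflow_dispatch: {}     # allow manual runs from the Actions tab
```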