A satirical AI-moderated debate platform where citizens engage in discourse under the watchful eye of our benevolent Robot Overlord.
The Robot Overlord API is the backend service for a unique debate platform that combines human creativity with AI moderation. Users can create topics, submit posts, and engage in discussions while an AI "Robot Overlord" moderates content and maintains order in the digital realm.
Built with modern Python technologies, this API provides a robust foundation for real-time debate management, user authentication, content moderation, and administrative oversight.
- Google OAuth Authentication - Secure login with JWT token management
- Role-Based Access Control - Granular permissions for Citizens, Moderators, and Admins
- AI-Powered Moderation - Automated content analysis and moderation decisions
- Analytics Dashboard - Real-time platform metrics and user engagement data
- Gamification System - User badges, loyalty scores, and leaderboards
- Multilingual Support - Content translation and internationalization
- Background Processing - Async job queues for content moderation and analytics
- WebSocket Support - Real-time updates and notifications (planned)
- Backend: FastAPI (Python 3.13+)
- Database: PostgreSQL 17 with pgvector extension
- Cache/Queue: Redis with ARQ workers
- Authentication: Google OAuth 2.0 + JWT
- AI Integration: OpenAI, Anthropic, Google AI APIs
- Deployment: Docker Compose
- Docker Desktop (recommended)
- Just Command Runner - brew install just
- Python 3.13+ (for local development)
- PostgreSQL 17+ (for local development)
- Clone the repository
git clone <repository-url>
cd therobotoverlord-mono/therobotoverlord-api
- Set up environment variables
cp .env.example .env
Edit .env with your API keys:
# Required: Google OAuth credentials
AUTH_GOOGLE_CLIENT_ID=your_google_client_id
AUTH_GOOGLE_CLIENT_SECRET=your_google_client_secret
# Required: At least one AI provider
LLM_OPENAI_API_KEY=your_openai_api_key
LLM_ANTHROPIC_API_KEY=your_anthropic_api_key
- Start the application
just run
- Verify it's running
just status
- API Health: http://localhost:8000/health
- API Docs: http://localhost:8000/docs
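If you would rather verify from code than a browser, a minimal sketch using httpx (an assumption; any HTTP client works) against the default port:
# Quick health check against the local API (illustrative only)
import httpx

resp = httpx.get("http://localhost:8000/health")
print(resp.status_code, resp.text)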
The project uses Just for simplified command management:
# Core commands
just run # Start all services
just stop # Stop all services
just status # Check service status
just logs # View all logs
just clean # Reset everything (removes data)
# Service-specific logs
just logs-api # API server logs
just logs-workers # Background worker logs
just logs-postgres # Database logs
just logs-redis # Redis logs
# Development
just restart-api # Restart API after code changes
just restart-workers # Restart workers
just pre-commit # Run code quality checks
just test # Run tests
If you prefer to run without Docker:
# Install dependencies
uv sync
# Set up PostgreSQL database
createdb therobotoverlord
export DATABASE_URL="postgresql://username:password@localhost/therobotoverlord"
# Start Redis (required for background jobs)
redis-server
# Run database migrations
uv run yoyo apply --database $DATABASE_URL migrations/
# Start the API server
uv run python -m therobotoverlord_api.main
# In another terminal, start background workers
uv run python -m therobotoverlord_api.workers.main
Once running, you can explore the API:
- Interactive Docs: http://localhost:8000/docs
- Health Check: http://localhost:8000/health
- OpenAPI Schema: http://localhost:8000/openapi.json
- POST /auth/google - Google OAuth login
- GET /users/me - Get current user profile
- POST /topics - Create a new debate topic
- GET /topics - List all topics
- POST /posts - Submit a post to a topic
- GET /admin/dashboard - Admin analytics (admin only)
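As a rough sketch of how a client might call one of these endpoints once it holds a JWT from the Google OAuth flow (the Bearer header is an assumption based on the JWT auth described above, not something this README specifies):
# Illustrative client call; token handling details are assumptions
import httpx

token = "<jwt-from-/auth/google>"
resp = httpx.get(
    "http://localhost:8000/users/me",
    headers={"Authorization": f"Bearer {token}"},
)
resp.raise_for_status()
print(resp.json())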
# View migration status
docker-compose exec api uv run yoyo list --database $DATABASE_URL migrations/
# Access PostgreSQL directly
docker-compose exec postgres psql -U postgres -d therobotoverlord
# Reset database (development only)
just clean && just run
The system runs background workers for the following tasks (a minimal worker sketch follows this list):
- Content Moderation - AI analysis of posts and comments
- Analytics Processing - User engagement metrics
- Appeal Handling - Automated moderation appeal reviews
- Health Monitoring - System status checks
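For orientation, an ARQ background worker is typically just an async function registered on a WorkerSettings class. The sketch below is illustrative only; the function and settings names are hypothetical and do not reflect the project's actual worker module:
# Illustrative ARQ worker sketch (names are hypothetical)
from arq.connections import RedisSettings

async def moderate_post(ctx, post_id: int) -> None:
    # ctx carries shared resources managed by ARQ (e.g. the Redis connection)
    ...  # analyze the post with an AI provider and record the decision

class WorkerSettings:
    functions = [moderate_post]
    redis_settings = RedisSettings(host="localhost", port=6379)

# Run with: arq my_module.WorkerSettings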
# View all logs
just logs
# View worker logs specifically
just logs-workers
# Monitor Redis job queue
docker-compose exec redis redis-cli monitor
The system uses a consolidated PostgreSQL schema with 35+ tables:
- Core: users, topics, posts, comments, tags
- Auth: sessions, roles, permissions (RBAC)
- Moderation: flags, sanctions, appeals, AI decisions
- Analytics: user metrics, engagement tracking
- Gamification: badges, leaderboards, loyalty scores
- Admin: dashboard config, system settings
Built on ARQ (Async Redis Queue) for reliable job processing:
# Example: Queue a moderation job
from therobotoverlord_api.workers.moderation import moderate_post
await moderate_post.delay(post_id=123)
Clean data access through repository classes:
from therobotoverlord_api.database.repositories.user import UserRepository
user_repo = UserRepository()
user = await user_repo.get_by_id(user_id)
therobotoverlord-api/
├── src/therobotoverlord_api/
│   ├── api/                 # FastAPI routes
│   ├── database/            # Database models & repositories
│   ├── workers/             # Background job workers
│   ├── auth/                # Authentication logic
│   └── main.py              # Application entry point
├── migrations/              # Database migrations
├── scripts/                 # Deployment scripts
└── docker-compose.yml       # Development environment
- Code Changes: Edit files in src/therobotoverlord_api/
- Database Changes: Create new migration files in migrations/ (see the migration sketch after this list)
- Testing: Run just restart-api to reload
- Logs: Use just logs-api to debug
- Pre-commit: Run just pre-commit before committing
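For the Database Changes step, migrations are applied with yoyo (per the yoyo apply commands above). Yoyo migrations can be plain SQL or Python files; a minimal Python-style migration might look like this (the file and table names are only an example, not part of the actual schema):
# migrations/0099__example.py (hypothetical file name)
from yoyo import step

steps = [
    step(
        # apply
        "CREATE TABLE example_notes (id SERIAL PRIMARY KEY, body TEXT NOT NULL)",
        # rollback
        "DROP TABLE example_notes",
    ),
]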
Key configuration options in .env:
# Database
DATABASE_URL=postgresql://postgres:password@localhost/therobotoverlord
# Redis
REDIS_URL=redis://localhost:6379/0
# Authentication
AUTH_GOOGLE_CLIENT_ID=your_client_id
AUTH_GOOGLE_CLIENT_SECRET=your_client_secret
AUTH_JWT_SECRET_KEY=your_jwt_secret
# AI Providers (at least one required)
LLM_OPENAI_API_KEY=your_openai_key
LLM_ANTHROPIC_API_KEY=your_anthropic_key
LLM_GOOGLE_API_KEY=your_google_key
# Start all services
just run
# Check service status
just status
# View logs
just logs
# Stop services
just stop
# Clean reset (removes all data)
just clean
- Set up PostgreSQL 17 with extensions: pgcrypto, citext, vector, pg_trgm (a quick verification sketch follows this list)
- Configure Redis for job queues
- Set production environment variables
- Run migrations: uv run yoyo apply --database $DATABASE_URL migrations/
- Start services: API server + background workers
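Once the production database is configured, a quick way to confirm the required extensions are actually installed (a sketch that assumes asyncpg is available; substitute your production DSN):
# Illustrative check that the extensions listed above are installed
import asyncio
import asyncpg

REQUIRED = {"pgcrypto", "citext", "vector", "pg_trgm"}

async def check(dsn: str) -> None:
    conn = await asyncpg.connect(dsn)
    installed = {r["extname"] for r in await conn.fetch("SELECT extname FROM pg_extension")}
    await conn.close()
    print("missing extensions:", REQUIRED - installed or "none")

asyncio.run(check("postgresql://postgres:password@localhost/therobotoverlord"))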
- Fork the repository
- Create a feature branch: git checkout -b feature/amazing-feature
- Make your changes and test locally
- Submit a pull request
This project is licensed under the MIT License - see the LICENSE file for details.