This is the backend API for the DevFest Lecce 2025 conference application. It's built with Django and Django REST Framework, providing endpoints for managing conferences, speakers, badges, user connections, and leaderboards.
- User Management: Firebase-based authentication for secure user access (see the example request below)
- Conference Management: Manage conference details, rooms, and schedules
- Speaker Profiles: CRUD operations for speaker information
- Badge System: Gamification with collectible badges and points
- Connections: Network with other attendees
- Leaderboard: Track user points and rankings
- RESTful API: Well-documented API with Swagger/OpenAPI support
- Admin Interface: Django admin panel for easy data management
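Authenticated endpoints expect a Firebase ID token issued to the client app. As a rough illustration only (the endpoint path and the `Bearer` header scheme below are assumptions rather than documented routes; check the Swagger docs for the actual API), an authenticated request would look something like this:

```bash
# Hypothetical request: the /api/speakers/ path is only an illustration
curl -H "Authorization: Bearer $FIREBASE_ID_TOKEN" \
  http://localhost:8000/api/speakers/
```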
To run the project locally you will need:

- Python 3.10 or higher
- PostgreSQL (for production) or SQLite (for development)
- uv - Fast Python package installer
- Firebase project (for authentication)
- Google Cloud Storage bucket (optional, for media storage)
Clone the repository and install the dependencies:

```bash
git clone https://github.com/riccardotornesello/devfest-lecce-be.git
cd devfest-lecce-be
uv sync
```

Copy the example environment file and configure it:

```bash
cp .env.example .env
```

Edit `.env` with your configuration:

- `SECRET_KEY`: Django secret key (generate one for production)
- `DEBUG`: Set to `false` in production
- `ALLOWED_HOSTS`: Your domain(s)
- `FIREBASE_AUDIENCE`: Your Firebase project ID
- `POSTGRES_*`: Database credentials (if using PostgreSQL)
- `GS_BUCKET_NAME`: Google Cloud Storage bucket (optional)
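A minimal development `.env` might look like the sketch below; the exact `POSTGRES_*` variable names and the comma-separated list format are assumptions here, so treat `.env.example` as the authoritative reference:

```bash
# Development sketch with placeholder values (check .env.example for the exact variable names)
SECRET_KEY=dev-only-insecure-key
DEBUG=true
ALLOWED_HOSTS=localhost,127.0.0.1

# Firebase project used to validate ID tokens
FIREBASE_AUDIENCE=your-firebase-project-id

# PostgreSQL credentials (optional in development; SQLite is used otherwise)
POSTGRES_DB=devfest
POSTGRES_USER=devfest
POSTGRES_PASSWORD=devfest
POSTGRES_HOST=localhost
POSTGRES_PORT=5432

# Optional: Google Cloud Storage bucket for media files
GS_BUCKET_NAME=your-media-bucket
```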
Run the migrations, create a superuser, and start the development server:

```bash
cd devfest_lecce_2025_be
uv run manage.py migrate
uv run manage.py createsuperuser
uv run manage.py runserver
```

The API will be available at http://localhost:8000
Once the server is running, you can access the API documentation at:
- Swagger UI: http://localhost:8000/swagger/
- ReDoc: http://localhost:8000/redoc/
- Admin Panel: http://localhost:8000/admin/
This project uses ruff for linting and formatting:
```bash
# Format code
uv run ruff format

# Lint and auto-fix
uv run ruff check --fix
```

Install pre-commit hooks to automatically lint and format on commit:
```bash
uv run pre-commit install
```
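To run all hooks against the entire codebase once (for example, right after installing them):

```bash
uv run pre-commit run --all-files
```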
This repository includes two GitHub Actions workflows for development purposes:

- **CI Workflow** (`.github/workflows/ci.yml`): Runs on push and pull requests
  - Lints and formats code with ruff
  - Runs security scans
  - Validates Django configuration
  - Tests Docker build
- **Release Workflow** (`.github/workflows/release.yml`): Runs on release creation
  - Builds Docker image
  - Publishes to GitHub Container Registry
  - Creates build attestation for supply chain security
Note: These workflows are for reference and development testing only. They validate code quality and ensure the Docker image builds correctly, but they do not deploy to production.
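The main checks can also be approximated locally before pushing (a rough equivalent of the CI steps, not the exact workflow commands):

```bash
# Approximate the CI checks locally
uv run ruff format --check                             # formatting
uv run ruff check                                      # linting
(cd devfest_lecce_2025_be && uv run manage.py check)   # Django configuration
docker build -t devfest-lecce-be .                     # Docker build test
```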
The project is organized into the following Django apps:

```
devfest_lecce_2025_be/
├── app/          # Core Django settings and configuration
├── badges/       # Badge system app
├── conferences/  # Conference management app
├── connections/  # User connections app
├── leaderboard/  # Points and rankings app
├── rooms/        # Conference rooms app
├── speakers/     # Speaker profiles app
└── users/        # User management app
```
The easiest way to run the application with all of its dependencies is with Docker Compose:
```bash
# Start all services (database and web server)
docker-compose up

# Run migrations (in another terminal)
docker-compose run migrate

# Stop all services
docker-compose down

# Remove all data
docker-compose down -v
```

The API will be available at http://localhost:8000
Build and run with Docker:
```bash
docker build -t devfest-lecce-be .
docker run -p 8000:8000 --env-file .env devfest-lecce-be
```

The production deployment is fully automated using Google Cloud Build. On every push to the main branch, Cloud Build automatically:
- Builds the Docker image and pushes it to Artifact Registry
- Updates the Cloud Run Job with the new image
- Runs database migrations via the Cloud Run Job (`TASK=migrate`)
- Collects static files via the Cloud Run Job (`TASK=collectstatic`)
- Deploys the backend service to Cloud Run with the new image
The pipeline is configured in cloudbuild.yaml and is triggered automatically by a Cloud Build trigger set up through Terraform (see Infrastructure section below).
Manual Trigger (if needed):
```bash
gcloud builds submit --config cloudbuild.yaml \
  --substitutions=_ARTIFACT_REGISTRY="europe-west1-docker.pkg.dev/devfest-lecce/devfest-lecce",_SERVICE_REGION="europe-west1"
```

Docker images can also be published to GitHub Container Registry through the release workflow. This is useful for development and testing purposes:
Creating a Release:
```bash
# Create and push a tag
git tag -a v1.0.0 -m "Release version 1.0.0"
git push origin v1.0.0

# Then create a release on GitHub from the tag
```

The workflow automatically publishes the image to:

- `ghcr.io/riccardotornesello/devfest-lecce-be:latest`
- `ghcr.io/riccardotornesello/devfest-lecce-be:1.0.0`
- `ghcr.io/riccardotornesello/devfest-lecce-be:1.0`
- `ghcr.io/riccardotornesello/devfest-lecce-be:1`
Pulling the Image:
```bash
docker pull ghcr.io/riccardotornesello/devfest-lecce-be:latest
```

Ensure the following environment variables are set in your production environment:
- `SECRET_KEY`: A strong, random secret key
- `DEBUG=false`
- `ALLOWED_HOSTS`: Your production domain(s)
- `CORS_ALLOWED_ORIGINS`: Your frontend domain(s)
- `CSRF_TRUSTED_ORIGINS`: Your frontend domain(s)
- Database credentials
- `FIREBASE_AUDIENCE`: Your Firebase project ID
- `GS_BUCKET_NAME`: For media storage (if using Google Cloud Storage)
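For illustration only, a production configuration could look roughly like this (placeholder values; the frontend domain and the comma-separated list format are assumptions):

```bash
# Production sketch with placeholder values
SECRET_KEY=<strong-random-secret>
DEBUG=false
ALLOWED_HOSTS=api.devfest.gdglecce.it
CORS_ALLOWED_ORIGINS=https://your-frontend-domain.example
CSRF_TRUSTED_ORIGINS=https://your-frontend-domain.example
FIREBASE_AUDIENCE=your-firebase-project-id
GS_BUCKET_NAME=devfest-lecce-media
# Database credentials go here as well (POSTGRES_*, see above)
```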
The entire Google Cloud infrastructure is managed with Terraform and lives in the `infrastructure/` directory. Terraform provisions the following resources:
- Google Cloud Project Services (Artifact Registry, Cloud Build, Cloud Run, SQL Admin)
- Artifact Registry repository for Docker images
- Cloud Storage bucket for media files
- Cloud SQL (PostgreSQL) database instance (optional - can use external database instead)
- Cloud Run service for the backend API
- Cloud Run job for migrations and static file collection
- Load balancer with SSL certificate
- Cloud Build trigger (automatic deployment on push to `main`)
- Service accounts and IAM permissions
You can choose between two database configurations:
- Cloud SQL (default): Managed PostgreSQL instance on Google Cloud
- External Database: Use your own PostgreSQL database (self-hosted or from another provider)
To set up the infrastructure:

- **Prerequisites:**
  - Google Cloud Project created
  - Terraform installed (`terraform` CLI)
  - Google Cloud SDK installed and authenticated (`gcloud auth application-default login`)
  - GitHub repository connected to Google Cloud Build
- **Configure variables:**

  Create a `terraform.tfvars` file in the `infrastructure/` directory.

  For Cloud SQL (default):

  ```hcl
  project       = "your-gcp-project-id"
  region        = "europe-west1"
  repository_id = "devfest-lecce"
  bucket_name   = "devfest-lecce-media"
  db_password   = "your-secure-database-password"
  domain        = "api.devfest.gdglecce.it"
  repo_owner    = "riccardotornesello"
  repo_name     = "devfest-lecce-be"
  ```

  For an external database:

  ```hcl
  project          = "your-gcp-project-id"
  region           = "europe-west1"
  repository_id    = "devfest-lecce"
  bucket_name      = "devfest-lecce-media"
  use_cloud_sql    = false
  external_db_host = "your-database-host.example.com"
  external_db_port = 5432
  external_db_name = "devfest_lecce_db"
  external_db_user = "devfest"
  db_password      = "your-secure-database-password"
  domain           = "api.devfest.gdglecce.it"
  repo_owner       = "riccardotornesello"
  repo_name        = "devfest-lecce-be"
  ```
- **Initialize and apply Terraform:**

  ```bash
  cd infrastructure
  terraform init
  terraform plan   # Review the infrastructure changes
  terraform apply  # Apply the changes (type 'yes' to confirm)
  ```
- **Post-setup:**
  - Once applied, Terraform will output important values like database connection details
  - The Cloud Build trigger will automatically deploy on every push to `main`
  - Manual intervention should not be needed unless infrastructure changes are required
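The recorded Terraform outputs can be printed again at any time:

```bash
cd infrastructure
terraform output   # List the values exported by the configuration
```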
For detailed information about the infrastructure components, see the infrastructure README.
This application follows security best practices:
- No hardcoded secrets: All sensitive data is loaded from environment variables
- HTTPS enforcement: In production, all traffic is redirected to HTTPS
- Security headers: XSS protection, content type nosniff, and frame options
- CSRF protection: Enabled for all state-changing operations
- CORS configuration: Properly configured allowed origins
- Firebase authentication: Secure token-based authentication
- SQL injection protection: Django ORM provides protection by default
- Password validation: Strong password requirements enforced
Before deploying to production, make sure to:

- Set `DEBUG=false`
- Use a strong, unique `SECRET_KEY`
- Configure `ALLOWED_HOSTS` with your specific domains
- Set up HTTPS/SSL certificates
- Configure `CORS_ALLOWED_ORIGINS` to only allow your frontend
- Use a production-grade database (PostgreSQL)
- Set up regular database backups
- Keep dependencies up to date
- Monitor application logs
- Set up rate limiting if needed
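Django's built-in deployment check is a quick way to verify several of these settings before going live:

```bash
cd devfest_lecce_2025_be
uv run manage.py check --deploy
```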
Contributions are welcome! To contribute:

- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Make your changes
- Run tests and linting (`uv run ruff check --fix`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.
- Riccardo Tornesello - [email protected]
- DevFest Lecce organizing team
- Google Developers Group Lecce
- All contributors to this project
For issues and questions, please open an issue on GitHub or contact the development team.