A robust Django template with Supabase authentication integration, designed for scalable API development with a built-in credit system, monitoring, and deployment solutions.
This template leverages Supabase as a powerful backend service that provides authentication, database, and storage capabilities with minimal configuration.
- JWT Authentication: Secure token-based authentication with Supabase Auth
- Row Level Security (RLS): Database-level multi-tenant data isolation
- Storage Buckets: Managed file storage with security policies
- Real-time Subscriptions: Live data updates for web and mobile clients
- Serverless Edge Functions: Deploy custom logic to the edge
- Users authenticate through Supabase (OAuth, email/password, phone, magic link)
- Supabase issues a JWT token with custom claims
- Django middleware validates the JWT token for protected routes
- Claims from the JWT token define user permissions and tenant access
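As a rough illustration, the validation step can be implemented with PyJWT against the project's JWT secret. This is a minimal sketch, not the template's exact middleware; it assumes the SUPABASE_JWT_SECRET variable described in the setup section below.

# Minimal sketch of Supabase JWT validation (not the template's exact middleware)
import os
import jwt  # PyJWT
from django.http import JsonResponse

SUPABASE_JWT_SECRET = os.getenv("SUPABASE_JWT_SECRET", "")

class SupabaseJWTMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        auth_header = request.headers.get("Authorization", "")
        if auth_header.startswith("Bearer "):
            token = auth_header.split(" ", 1)[1]
            try:
                # Supabase signs access tokens with HS256 and audience "authenticated"
                claims = jwt.decode(
                    token,
                    SUPABASE_JWT_SECRET,
                    algorithms=["HS256"],
                    audience="authenticated",
                )
                request.supabase_claims = claims  # e.g. sub, role, app_metadata
            except jwt.InvalidTokenError:
                return JsonResponse({"detail": "Invalid token"}, status=401)
        return self.get_response(request)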
The template includes a comprehensive multi-tenancy solution using Supabase's Row Level Security:
-- Enable RLS on tables
ALTER TABLE your_table ENABLE ROW LEVEL SECURITY;
-- Create tenant isolation policy
CREATE POLICY tenant_isolation ON your_table
FOR ALL USING (tenant_id = auth.jwt() -> 'app_metadata' ->> 'tenant_id');
- Database-level data isolation between tenants
- Automatic tenant context from JWT claims
- Middleware for maintaining tenant context in Django (a minimal sketch follows this list)
- Built-in tenant management through Django admin
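The tenant-context middleware mentioned above can be as small as reading the same claim the RLS policy uses and attaching it to the request; a hedged sketch (it assumes decoded claims are already available, as in the JWT sketch earlier):

# Sketch: derive tenant context from the validated JWT claims
class TenantContextMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        claims = getattr(request, "supabase_claims", None) or {}
        # app_metadata.tenant_id is the same claim the RLS policy reads
        request.tenant_id = (claims.get("app_metadata") or {}).get("tenant_id")
        return self.get_response(request)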
The template's credit system works seamlessly with Supabase through:
- Credit Decorator: Apply @with_credits(credit_amount=5) to any view (example after this list)
- Credit Utility Function: Wrap any function with credit-based access control
- Database Integration: Track credit transactions with Supabase RLS policies
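For example, a view can be charged per call like this (a hedged sketch; the import path apps.credits.decorators is an assumption, adjust it to wherever the template defines with_credits):

from rest_framework.decorators import api_view
from rest_framework.response import Response

from apps.credits.decorators import with_credits  # assumed location of the decorator

@api_view(["POST"])
@with_credits(credit_amount=5)  # deducts 5 credits per successful call
def generate_report(request):
    # ...business logic that costs credits...
    return Response({"status": "report queued"})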
- Create a Supabase Project:
- Sign up at supabase.com
- Create a new project and note your project URL and API keys
- Configure Environment Variables (a client setup sketch follows this list):
# .env file
SUPABASE_URL=https://your-project.supabase.co
SUPABASE_PUBLIC_KEY=your-public-anon-key
SUPABASE_SERVICE_KEY=your-service-role-key
SUPABASE_JWT_SECRET=your-jwt-secret
- Enable Required Features:
- Configure authentication providers in the Supabase dashboard
- Enable Row Level Security for database tables
- Create storage buckets with appropriate policies
- Deploy Edge Functions (optional):
- Use the Supabase CLI to deploy serverless functions
- Connect edge functions to your Django API endpoints
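With the environment variables above in place, a server-side client can be created; a minimal sketch using the supabase-py package (an assumption, the template's own wiring may differ):

# Sketch: reading the .env values and creating a server-side Supabase client
import os
from supabase import create_client  # supabase-py

SUPABASE_URL = os.getenv("SUPABASE_URL")
SUPABASE_SERVICE_KEY = os.getenv("SUPABASE_SERVICE_KEY")

# The service-role key bypasses RLS -- keep it server-side only
supabase = create_client(SUPABASE_URL, SUPABASE_SERVICE_KEY)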
The template includes utility SQL functions for advanced use cases:
-- Create a function to execute arbitrary SQL (admin only)
CREATE OR REPLACE FUNCTION exec_sql(query text)
RETURNS VOID
LANGUAGE plpgsql
SECURITY DEFINER
AS $$
BEGIN
EXECUTE query;
END;
$$;
-- Grant access to authenticated users (note: this allows any signed-in user to run arbitrary SQL; restrict to service_role for production)
GRANT EXECUTE ON FUNCTION exec_sql(text) TO authenticated;
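Once exposed, the function can be called from Python through PostgREST's RPC interface; a hedged sketch using supabase-py (never pass user-supplied SQL to it):

# Sketch: invoking exec_sql via RPC with the service-role client from the setup section
import os
from supabase import create_client

supabase = create_client(os.getenv("SUPABASE_URL"), os.getenv("SUPABASE_SERVICE_KEY"))

# Runs with the definer's privileges -- restrict this to trusted, static statements
supabase.rpc("exec_sql", {"query": "ANALYZE;"}).execute()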
- Secure Authentication via Supabase JWT validation
- Rate Limiting & Credit System with concurrency-safe credit tracking for API usage
- Monitoring & Observability with Prometheus metrics and structured logging
- Production-Ready Deployment with Docker, Hetzner Cloud, and Coolify support
- Async Task Processing with Redis and Celery integration
- Flexible ORM Options with Django ORM and optional Drizzle ORM support
- CI/CD Pipeline Integration for automated deployments
- Subscription Management: Create and manage subscription plans
- Stripe Integration: Manage subscriptions, webhooks, and credit allocation
- Credit System: Manage credits for API usage
- Testing: Comprehensive test cases for each app and view
django-supabase-template/
├── backend/              # Django application code
├── config/               # Environment and configuration files
├── docker/               # Docker-related files
├── _docs/                # Project documentation
├── .env.example          # Example environment variables
├── docker-compose.yml    # Docker Compose configuration
├── requirements.txt      # Python dependencies
└── README.md             # Project documentation
- Docker and Docker Compose
- Git
- Supabase account and project
- Stripe account and API keys
- Hetzner Cloud account and API keys
- Coolify account and API keys
- Supabase API keys (hosted project or self-hosted instance)
- Drizzle ORM (optional)
- PostgreSQL database (optional)
- Clone the repository
git clone https://github.com/your-org/django-supabase-template.git
cd django-supabase-template
- Configure environment variables
cp .env.example .env
# Edit .env with your Supabase credentials and other settings
- Start the services
docker-compose up --build
- Apply database migrations
docker exec -it backend python manage.py migrate
- Access the API
The API will be available at http://localhost:8000/api/
To simplify the integration of the Supabase template into your existing Python project, you can use the provided setup script. This script will copy all necessary files and configurations from the template into your project.
- Copies all necessary files and folders from the template to your project
- Merges requirements.txt if both your project and the template have them
- Preserves existing files in your project while adding new ones
- Handles configuration files like .env, docker-compose.yml, etc.
- Run the Setup Script: After cloning the repository, navigate to the project directory and run the script:
python _setup_integration.py /path/to/your/existing/project
This will copy all files and directories (except the script itself) to your project, including:
- .env.example as .env
- docker-compose.yml
- config folder (for Prometheus)
- database folder
- docker folder (containing Dockerfile)
- All other necessary files
- requirements.txt Merging: The project uses requirements.txt to manage dependencies. If your existing project has a requirements.txt and the template also has one, the script will intelligently merge them, preserving your existing dependencies while adding the ones required by the template.
- Follow the Remaining Setup Steps: After running the script, follow the remaining setup steps in the README to configure your project.
- Users authenticate through Supabase (OAuth, email/password, OTP)
- Supabase issues a JWT token
- Django validates the JWT token using middleware
- Role-based access control is enforced based on Supabase claims
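Role-based checks can then be expressed as a DRF permission over those claims; a minimal sketch (it assumes the middleware attaches decoded claims to request.supabase_claims, and the role location in app_metadata is illustrative):

# Sketch: role-based access control built on Supabase claims
from rest_framework.permissions import BasePermission

class HasSupabaseRole(BasePermission):
    required_role = "admin"  # illustrative role stored in app_metadata

    def has_permission(self, request, view):
        claims = getattr(request, "supabase_claims", None) or {}
        role = (claims.get("app_metadata") or {}).get("role")
        return role == self.required_role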
The template includes a robust credit system with advanced features:
- Per-user rate limiting via Django REST Framework throttling
- Credit-based usage tracking for premium features
- API endpoints to check remaining credits
- Concurrency-safe credit operations with row-level locking (sketched after this list)
- Credit hold mechanism for long-running operations
- UUID primary keys for distributed environments
- Structured logging for comprehensive auditing
- Prometheus metrics for tracking credit transactions and balances
- Performance measurement with duration metrics and failure tracking
- Integration with monitoring dashboards
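A concurrency-safe deduction typically wraps the balance update in a transaction and locks the balance row with select_for_update; a hedged sketch (the UserCredit model and its fields are illustrative, not the template's exact API):

# Illustrative sketch of a row-locked credit deduction (model and field names are assumptions)
from django.db import transaction

from apps.credits.models import UserCredit  # hypothetical balance model

def deduct_credits(user, amount):
    with transaction.atomic():
        # Lock the user's balance row so concurrent requests serialize here
        credit = UserCredit.objects.select_for_update().get(user=user)
        if credit.balance < amount:
            raise ValueError("Insufficient credits")
        credit.balance -= amount
        credit.save(update_fields=["balance"])
        return credit.balance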
The CreditUsageRate model allows administrators to define credit costs for different API endpoints. Here's how to use it:
- Create a New Credit Usage Rate: You can create a new credit usage rate for an endpoint in the Django shell or through the admin interface:
from apps.credits.models import CreditUsageRate

new_rate = CreditUsageRate.objects.create(
    endpoint_path='/api/resource/',  # The API endpoint
    credits_per_request=5,           # Set the number of credits for this endpoint
    description='Cost for accessing the resource endpoint',
    is_active=True                   # Set to True to make it active
)
- Update Existing Credit Usage Rates: If you need to change the credit cost for an existing endpoint, fetch the instance and update it:
existing_rate = CreditUsageRate.objects.get(endpoint_path='/api/resource/')
existing_rate.credits_per_request = 3  # Set new credit cost
existing_rate.save()  # Save changes
- Managing Credit Holds: For long-running operations, you can place a hold on credits:
from apps.credits.models import CreditHold

# Place a hold on 10 credits
hold = CreditHold.place_hold(
    user=request.user,
    amount=10,
    description="Long-running task",
    endpoint="/api/long-task/"
)

# Later, release or consume the hold
if task_successful:
    hold.consume()  # Convert hold to an actual deduction
else:
    hold.release()  # Release the hold without charging credits
The template includes a powerful Redis caching system to improve application performance, reduce database load, and enhance scalability.
- High-Performance Caching: In-memory storage for frequently accessed data
- Smart Invalidation: Maintains data consistency across operations
- Structured Cache Keys: Organized cache key patterns for different data types
- Multiple Caching Patterns: Function results, API responses, database queries, and more
- Monitoring: Cache hit/miss metrics for performance analysis
- Testable Design: Comprehensive testing utilities and patterns
The Redis caching implementation serves multiple purposes:
- Django Cache Backend: Primary cache backend for Django
- Session Storage: Fast, in-memory session storage
- Celery Message Broker: Backend for asynchronous task processing
- Rate Limiting: Efficient implementation of API rate limiting (see the throttle settings sketch after this list)
- Pub/Sub Messaging: For real-time features
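For example, per-user rate limiting can lean on DRF's built-in throttles, whose counters are stored in the Django cache and therefore in Redis here; a hedged settings sketch (the rates are illustrative):

# Sketch: DRF throttling backed by the Redis cache configured below
REST_FRAMEWORK = {
    "DEFAULT_THROTTLE_CLASSES": [
        "rest_framework.throttling.UserRateThrottle",
        "rest_framework.throttling.AnonRateThrottle",
    ],
    "DEFAULT_THROTTLE_RATES": {
        "user": "100/min",  # authenticated users
        "anon": "20/min",   # anonymous requests
    },
}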
Redis is pre-configured in the template with production-ready settings:
# Redis Configuration in settings.py
REDIS_URL = os.getenv("REDIS_URL", "redis://redis:6379/0")

# Cache Configuration
CACHES = {
    "default": {
        "BACKEND": "django_redis.cache.RedisCache",
        "LOCATION": "redis://redis:6379/1",
        "OPTIONS": {
            "CLIENT_CLASS": "django_redis.client.DefaultClient",
            "PARSER_CLASS": "redis.connection.HiredisParser",
            "CONNECTION_POOL_CLASS": "redis.BlockingConnectionPool",
            "CONNECTION_POOL_CLASS_KWARGS": {
                "max_connections": 50,
                "timeout": 20,
            },
            "MAX_CONNECTIONS": 1000,
            "IGNORE_EXCEPTIONS": True,
        },
    }
}
The template implements several caching strategies:
from apps.caching.utils.redis_cache import cache_result

@cache_result(timeout=60 * 15)  # Cache for 15 minutes
def expensive_calculation(param1, param2):
    # Complex operation here
    return result
from rest_framework.decorators import api_view
from rest_framework.response import Response

from apps.caching.utils.redis_cache import cache_result

@api_view(['GET'])
@cache_result(timeout=300)  # Cache for 5 minutes
def get_data(request):
    # Expensive database queries or API calls
    return Response(data)
from django.core.cache import cache

def get_user_stats(user_id):
    # Generate cache key
    cache_key = f"user_stats:{user_id}"

    # Try to get from cache (check against None so falsy cached values still count as hits)
    cached_data = cache.get(cache_key)
    if cached_data is not None:
        return cached_data

    # If not in cache, fetch and store
    stats = UserStats.objects.filter(user_id=user_id).annotate(
        # Complex aggregations here
    ).values('counts', 'averages')

    # Cache for 1 hour
    cache.set(cache_key, stats, 3600)
    return stats
from django.contrib.auth import get_user_model
from django.core.cache import cache

User = get_user_model()

def update_user(user_id, data):
    # Update the user (a queryset .update() avoids calling update() on a model instance)
    User.objects.filter(id=user_id).update(**data)
    user = User.objects.get(id=user_id)

    # Invalidate all related cache keys
    cache_keys = [
        f"user:{user_id}",
        f"user_stats:{user_id}",
        "all_users",
    ]
    cache.delete_many(cache_keys)

    return user
The template follows these patterns for cache key generation:
- User Authentication: auth:user:{hashed_token}
- Database Queries: db_query:{table}:{hashed_query_params}
- Storage Operations: storage:list:{bucket_name}:{hashed_path}
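A key for the database-query pattern might be built by hashing the query parameters; a small illustrative helper (not part of the template's API):

# Sketch: building a structured cache key for a database query
import hashlib
import json

def db_query_cache_key(table, params):
    # Hash the parameters so keys stay short and deterministic
    hashed = hashlib.sha256(json.dumps(params, sort_keys=True).encode()).hexdigest()[:16]
    return f"db_query:{table}:{hashed}"

# e.g. db_query_cache_key("users", {"is_active": True}) -> "db_query:users:<hash>"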
- Timeout Selection: Cache timeouts are configured based on data volatility
- Memory Management: Selective caching to optimize memory usage
- Connection Pooling: Configured with optimal connection limits
The template includes comprehensive testing utilities:
# In your test file
from django.core.cache import cache
from django.test import TestCase, override_settings

@override_settings(CACHES={
    "default": {
        "BACKEND": "django.core.cache.backends.locmem.LocMemCache",
    }
})
class CacheTest(TestCase):
    def setUp(self):
        cache.clear()

    def tearDown(self):
        cache.clear()

    def test_cache_hit(self):
        # Test that caching works as expected
        cache.set('test_key', 'test_value', 10)
        self.assertEqual(cache.get('test_key'), 'test_value')
See the detailed documentation in _docs/redis/ for more information on caching strategies, implementation patterns, and testing approaches.
The template includes comprehensive monitoring with Grafana dashboards for visualizing application metrics and system performance.
- Real-time Dashboards: Pre-configured dashboards for API and system monitoring
- API Performance Metrics: Request rates, latency, error rates, and anomaly detection
- System Monitoring: CPU, memory, disk, and network usage metrics
- Container Insights: Resource utilization for containerized services
- Alerting Capabilities: Set up notifications for critical conditions
The monitoring stack consists of:
- Django Monitoring App: Collects and exposes metrics from your application
- Prometheus: Scrapes and stores metrics from various sources
- Grafana: Visualizes metrics with interactive dashboards
- Node Exporter: Provides host system metrics (CPU, memory, disk)
- cAdvisor: Collects container-level metrics
- Start the Monitoring Stack
docker-compose -f docker-compose.monitoring.yml up -d
- Access the Grafana Dashboard
Open your browser and navigate to http://localhost:3000
Default credentials:
- Username: admin
- Password: admin
- View the Pre-configured Dashboards
Navigate to Dashboards > Browse > Django API Monitoring
- Django API Monitoring Dashboard
- API Request Rate by endpoint
- Response Time metrics (95th percentile)
- Error Rate tracking
- Credit Usage monitoring
- Active User tracking
- System Overview Dashboard
- Host resource utilization
- Container performance metrics
- Network traffic analysis
See the detailed documentation in _docs/grafana/ for more information on customizing dashboards, setting up alerts, and adding custom metrics.
For detailed deployment instructions, see the deployment documentation.
- Hetzner Cloud: Complete instructions for setting up on Hetzner Cloud servers
- Coolify: Step-by-step guide for deploying with the Coolify platform
For a comprehensive list of environment variables and their descriptions, see the environment variables reference.
The template includes Prometheus integration for monitoring:
- Metrics Collection:
- API endpoint response times
- Credit transactions and balances
- Task queue performance
- Prometheus Configuration:
- Pre-configured prometheus.yml in the config directory
- Django-prometheus integration for easy metrics exposure
- Dashboard Integration:
- Ready to integrate with Grafana for visualization
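Custom metrics can sit alongside the django-prometheus exporter; a hedged sketch using prometheus_client (the metric name and labels are illustrative):

# Sketch: a custom counter exposed on the /metrics endpoint
from prometheus_client import Counter

credit_deductions_total = Counter(
    "credit_deductions_total",
    "Number of credit deductions processed",
    ["endpoint"],
)

def record_deduction(endpoint):
    # Call wherever credits are deducted
    credit_deductions_total.labels(endpoint=endpoint).inc()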
Comprehensive documentation is available in the _docs/ directory.
MIT