📖 Documentation • 🚀 Quick Start • 🔧 API Reference • 🐳 Deployment • 🤝 Contributing
Dyvine is a production-ready, high-performance REST API designed for content management. It provides comprehensive content download, user management, live streaming, and cloud storage integration capabilities.
🎯 Core Features • ⚡ Async Processing • 🔄 Batch Operations • ☁️ Cloud Integration • 📊 Observability
Dyvine provides a comprehensive API for downloading and managing content with production-grade reliability. It supports various content types including videos, images, live streams, and user information with built-in cloud storage integration.
- 📥 Content Management: Download videos, images, and live streams
- 👥 User Operations: Retrieve user profiles and content analytics
- ⚡ Batch Processing: Efficient bulk content download operations
- 📌 Operation Tracking: Persistent operation records for asynchronous downloads
- 🏗️ Architecture: Async operations with connection pooling
- ☁️ Cloud Storage: Direct integration with object storage
- 🔧 Developer Experience:
- Complete type hints throughout codebase
- Detailed error messages and logging
- Auto-generated OpenAPI/Swagger documentation
- Production-ready configuration management
- Python 3.12+ (with uv as the recommended package manager)
- Git
- 2GB+ free disk space
- Active internet connection
- Valid authentication cookie
- Optional: Object storage credentials
```bash
# Clone repository
git clone https://github.com/memenow/dyvine.git
cd dyvine

# Setup with uv (recommended)
uv sync

# Install development dependencies (optional)
uv sync --all-extras
```
Environment Setup:

```bash
cp .env.example .env
```

Required Configuration:

Edit the `.env` file with your settings:

```bash
# Essential settings
DOUYIN_COOKIE=your_cookie_here

# Optional: Object storage integration
R2_ACCOUNT_ID=your_account_id
R2_ACCESS_KEY_ID=your_access_key
R2_SECRET_ACCESS_KEY=your_secret_key
R2_BUCKET_NAME=your_bucket_name

# Optional: local operation state database
API_OPERATION_DB_PATH=data/douyin/state/operations.db
```
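Before starting the server, it can help to sanity-check these settings. The following is a small hypothetical helper (not part of Dyvine) that verifies the required cookie is set and that R2 credentials are either complete or absent:

```python
import os

# Hypothetical pre-flight check; the variable names mirror the .env example above.
REQUIRED = ["DOUYIN_COOKIE"]
R2_VARS = ["R2_ACCOUNT_ID", "R2_ACCESS_KEY_ID", "R2_SECRET_ACCESS_KEY", "R2_BUCKET_NAME"]

def check_env(env: dict) -> list:
    """Return a list of configuration problems; an empty list means OK."""
    problems = [f"missing required setting: {name}" for name in REQUIRED if not env.get(name)]
    # R2 integration is optional, but a partial credential set is almost
    # certainly a mistake, so flag it explicitly.
    r2_present = [name for name in R2_VARS if env.get(name)]
    if r2_present and len(r2_present) < len(R2_VARS):
        problems.append("partial R2 configuration: set all R2_* variables or none")
    return problems

if __name__ == "__main__":
    for problem in check_env(dict(os.environ)):
        print(problem)
```

Running it against your shell environment prints one line per problem and nothing when the configuration looks consistent.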
```bash
# Start development server
uv run uvicorn src.dyvine.main:app --reload

# Production server
uv run uvicorn src.dyvine.main:app --host 0.0.0.0 --port 8000
```

The API will be available at:
- Application: http://localhost:8000
- Interactive Documentation: http://localhost:8000/docs
- Alternative Documentation: http://localhost:8000/redoc
Base URL: `http://localhost:8000/api/v1`
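Asynchronous downloads return an operation ID that clients poll until completion. The loop below is a minimal client-side sketch; the `status`/`running` field values and the commented `requests` wiring are assumptions, not the documented response schema:

```python
import time

BASE_URL = "http://localhost:8000/api/v1"

def poll_operation(fetch_status, operation_id, interval_s=1.0, max_attempts=30):
    """Poll an async operation until it leaves the (assumed) 'running' state.

    `fetch_status` wraps the HTTP GET so the loop itself needs no network
    access and is easy to test.
    """
    for _ in range(max_attempts):
        record = fetch_status(operation_id)
        if record.get("status") != "running":
            return record
        time.sleep(interval_s)
    raise TimeoutError(f"operation {operation_id} still running after {max_attempts} polls")

# Wiring in a real HTTP client might look like (untested sketch):
# import requests
# def fetch_status(op_id):
#     resp = requests.get(f"{BASE_URL}/livestreams/operations/{op_id}")
#     resp.raise_for_status()
#     return resp.json()
```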
```bash
# Get user information
GET /api/v1/users/{user_id}

# Download user content
POST /api/v1/users/{user_id}/content:download

# Get post details
GET /api/v1/posts/{post_id}

# List user posts
GET /api/v1/posts/users/{user_id}/posts

# Download user posts
POST /api/v1/posts/users/{user_id}/posts:download

# Download active livestream
POST /api/v1/livestreams/users/{user_id}/stream:download

# Download from URL
POST /api/v1/livestreams/stream:download

# Check asynchronous operation status
GET /api/v1/livestreams/operations/{operation_id}
```

Get User Information:
```bash
curl "http://localhost:8000/api/v1/users/USER_ID"
```

Download User Posts:

```bash
curl -X POST "http://localhost:8000/api/v1/posts/users/USER_ID/posts:download" \
  -H "Content-Type: application/json"
```

Download a Livestream by User ID:

```bash
curl -X POST "http://localhost:8000/api/v1/livestreams/users/USER_ID/stream:download" \
  -H "Content-Type: application/json" \
  -d '{"output_path": null}'
```

Download a Livestream by URL:

```bash
curl -X POST "http://localhost:8000/api/v1/livestreams/stream:download" \
  -H "Content-Type: application/json" \
  -d '{"url": "https://live.douyin.com/123456789"}'
```

Check Asynchronous Operation Status:

```bash
curl "http://localhost:8000/api/v1/livestreams/operations/OPERATION_ID"
```

The project includes a comprehensive test suite with full async support:
```bash
# Run all tests
uv run pytest

# Run with coverage
uv run pytest --cov=src/dyvine

# Run specific test file
uv run pytest tests/services/test_livestream_service.py

# Run with verbose output
uv run pytest -v
```

```text
tests/
├── conftest.py                      # Shared fixtures and sys.path setup
├── services/
│   ├── test_livestream_service.py  # Livestream service unit tests
│   ├── test_storage_service.py     # R2 storage service tests
│   └── test_user_service.py        # User service tests
├── test_dependencies.py             # DI container tests
├── test_main.py                     # App startup and health check tests
└── test_utils.py                    # Utility function tests
```
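Service-level async code can be unit-tested without a running server by mocking the transport layer. The sketch below shows the general pattern; `get_user` and `client.fetch` are hypothetical names for illustration, not the project's actual API:

```python
import asyncio
from unittest.mock import AsyncMock

async def get_user(client, user_id):
    """Toy service coroutine standing in for a real service method."""
    return await client.fetch(f"/users/{user_id}")

def test_get_user_uses_client():
    # AsyncMock lets us await the fake transport and assert on the call.
    client = AsyncMock()
    client.fetch.return_value = {"id": "42", "name": "demo"}
    result = asyncio.run(get_user(client, "42"))
    assert result["name"] == "demo"
    client.fetch.assert_awaited_once_with("/users/42")
```

With plain `asyncio.run` inside the test function, no pytest plugin is required, though `pytest-asyncio` offers the same thing more ergonomically.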
For the simplest Docker deployment, you only need to set essential environment variables:
```bash
# Build the image
docker build -t dyvine:latest .

# Run with minimal required configuration
docker run -d \
  --name dyvine \
  -p 8000:8000 \
  -e DOUYIN_COOKIE="your_douyin_cookie_here" \
  -e SECURITY_SECRET_KEY="your-production-secret-key" \
  -e SECURITY_API_KEY="your-production-api-key" \
  dyvine:latest
```

For production deployment with cloud storage:
```bash
docker run -d \
  --name dyvine \
  -p 8000:8000 \
  -v $(pwd)/data:/app/data \
  -v $(pwd)/logs:/app/logs \
  -e DOUYIN_COOKIE="your_douyin_cookie_here" \
  -e SECURITY_SECRET_KEY="your-production-secret-key" \
  -e SECURITY_API_KEY="your-production-api-key" \
  -e R2_ACCOUNT_ID="your_r2_account_id" \
  -e R2_ACCESS_KEY_ID="your_r2_access_key" \
  -e R2_SECRET_ACCESS_KEY="your_r2_secret_key" \
  -e R2_BUCKET_NAME="your_r2_bucket_name" \
  -e R2_ENDPOINT="your_r2_endpoint" \
  --restart unless-stopped \
  dyvine:latest
```

If you prefer using an `.env` file:
```bash
# Copy and customize environment template
cp .env.docker .env

# Edit .env with your configuration

# Run with env file
docker run -d \
  --name dyvine \
  -p 8000:8000 \
  -v $(pwd)/data:/app/data \
  -v $(pwd)/logs:/app/logs \
  --env-file .env \
  --restart unless-stopped \
  dyvine:latest
```
Prerequisites:
- Kubernetes cluster
- kubectl configured
- Container registry access
Deploy:

```bash
# Update image reference in deploy/k8s.yaml
kubectl apply -f deploy/k8s.yaml

# Verify deployment
kubectl get pods -l app=dyvine
kubectl get services dyvine
```
- Monitoring: Scrape `/metrics` and aggregate structured logs
- Horizontal Scaling: Use a shared operation backend before increasing replicas beyond 1
- Backup: Implement persistent volume and log archival strategies
- Persistent Storage: The default Kubernetes manifests provision a `ReadWriteOnce` `PersistentVolumeClaim` named `dyvine-operation-state` for the SQLite-backed operation store at `/app/data`. The cluster must provide a default StorageClass. Reusing `emptyDir` is not supported because operation state must survive Pod restarts.
```bash
GET /livez
GET /readyz
GET /startupz
GET /health
GET /metrics
```

Response includes:

- Liveness, readiness, and startup signals
- Application status, version, and uptime
- Memory and CPU metrics
- Prometheus-compatible metrics at `/metrics`

`/livez`, `/readyz`, and `/startupz` are the orchestration-facing probes. `/health` is retained as a human-facing summary endpoint. Its HTTP status semantics changed with the introduction of the separate probes: it now returns `503` only when the aggregated status is `unhealthy`, while a `degraded` status (for example, missing R2 credentials) returns `200`. Monitors that previously relied on `/health` to page on `degraded` should migrate to `/readyz`, which fails closed when any dependency is missing.
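The status-code semantics described above can be summarized as a pair of pure functions. This is an illustration of the contract, not the project's implementation:

```python
def health_http_status(aggregated: str) -> int:
    """/health pages only on 'unhealthy'; 'degraded' is reported as 200."""
    return 503 if aggregated == "unhealthy" else 200

def readyz_http_status(all_dependencies_ok: bool) -> int:
    """/readyz fails closed: any missing dependency yields 503."""
    return 200 if all_dependencies_ok else 503
```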
- Structured JSON logging for machine readability
- Request correlation tracking
- Async-safe request context propagation
- Development/production formatting modes
- Performance metrics collection
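Async-safe request context propagation is typically built on Python's `contextvars`, which copies context into each task so concurrent requests keep separate correlation IDs. An illustrative sketch of the mechanism (the project's actual logging internals may differ):

```python
import asyncio
import contextvars
import uuid

# One ContextVar holds the correlation ID for the current logical request.
request_id = contextvars.ContextVar("request_id", default="-")

async def handle_request(name):
    request_id.set(uuid.uuid4().hex[:8])  # assign an ID for this request
    await asyncio.sleep(0)                # the value survives awaits
    return (name, request_id.get())

async def serve_two():
    # gather() wraps each coroutine in a task with its own copy of the
    # context, so concurrent requests never see each other's IDs.
    return await asyncio.gather(handle_request("a"), handle_request("b"))
```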
- Dyvine currently relies on Douyin session cookies for upstream access.
- Built-in API authentication and rate limiting are not enforced by the application.
- If you deploy the API on the public internet, place it behind an API gateway, ingress policy, or service mesh that enforces authentication, authorization, and rate limits.
- The bundled Kubernetes manifests run the API as a single replica because the default operation store uses a pod-local SQLite file.
- Do not increase replicas or enable autoscaling until you replace the SQLite-backed operation store with a shared backend.
- The base manifests include a `PersistentVolumeClaim` (`dyvine-operation-state`, `ReadWriteOnce`, 1Gi) mounted at `/app/data` so that in-flight operation records survive Pod restarts. Increasing replicas beyond 1 still requires replacing the SQLite-backed operation store with a shared backend.
```bash
# Code formatting
uv run black .
uv run isort .

# Type checking
uv run mypy src/dyvine

# Linting
uv run ruff check .

# Run all checks
uv run pytest && uv run black . && uv run isort . && uv run mypy src/dyvine && uv run ruff check .
```

The project uses GitHub Actions for continuous integration and deployment:
- Code Quality: Automated linting, formatting, and type checking
- Testing: Comprehensive test suite with coverage reporting
- Security: Vulnerability scanning with Trivy, Safety, and Bandit
- Docker: Multi-platform image builds and pushes to GitHub Container Registry
- Releases: Automated release creation for version tags
- `ci-cd.yml`: Main CI/CD pipeline (runs on push/PR)
- `code-quality.yml`: Code quality checks (runs on PR)
- `dependency-check.yml`: Weekly security dependency audit
Images are automatically built and pushed to:
```text
ghcr.io/memenow/prod/dyvine:latest
ghcr.io/memenow/prod/dyvine:<version>
ghcr.io/memenow/prod/dyvine:<branch>-<sha>
```
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.