NOTE: Native NDI support is now integrated into Daydream Scope, so this bridge is no longer needed for Scope. It remains useful for using NDI with StreamDiffusion via the Daydream API.
Real-time AI video transformation bridge using the Daydream API. Capture video via NDI and transform it with AI in real-time.
*Demo video: resolume_github.mp4*
- NDI Input - Capture video from any NDI source on your network
- Real-time AI Transformation - Transform your video feed using StreamDiffusion
- Multiple AI Models - Support for SD 1.5, SD Turbo, SDXL Turbo
- ControlNet Support - Depth, Canny edge, and Tile ControlNets
- Web-based Control Panel - Adjust all parameters in a clean browser UI
- Low Latency - WebRTC output for minimal delay
- macOS 10.15+ or Windows 10+ (with NDI SDK)
- Python 3.10+
- NDI Tools installed
- Active internet connection
```bash
cd daydream-bridge

# Install dependencies
pip install -r requirements.txt

# Run the bridge
python app.py
```

On first run, a browser window will open to sign in to Daydream:
- Create an account at daydream.live if you don't have one (the free trial includes 10 hours of video)
- Sign in when the browser opens
- The app will automatically create and save an API key to `~/.daydream/credentials`
You only need to sign in once — the API key is saved locally and reused on subsequent runs.
- Start your NDI source (VirtualDJ, Resolume, OBS, or any NDI-capable app)
- Run the bridge with `python app.py`
- Open the control panel in your browser (it opens automatically)
- Select your NDI source from the dropdown
- Click "Start Stream" to begin AI transformation
- Adjust parameters in real-time using the sliders
| Parameter | Description |
|---|---|
| Prompt | Text description of desired visual style |
| Negative Prompt | What to avoid in generation |
| Denoise (Delta) | How much AI transformation to apply |
| Depth | Preserve depth/3D structure |
| Canny | Preserve edges/outlines |
| Tile | Preserve texture patterns |
| Guidance | How closely to follow the prompt |
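As a concrete illustration, the parameters above map naturally onto a settings payload. The field names and value ranges below are illustrative assumptions, not the documented Daydream API schema:

```python
# Illustrative parameter payload. Field names and ranges are assumptions
# for illustration only, not the actual Daydream API schema.
params = {
    "prompt": "neon cyberpunk city, vivid colors",
    "negative_prompt": "blurry, low quality",
    "denoise": 0.6,          # how much AI transformation to apply
    "controlnets": {
        "depth": 0.5,        # preserve depth/3D structure
        "canny": 0.3,        # preserve edges/outlines
        "tile": 0.2,         # preserve texture patterns
    },
    "guidance": 7.5,         # how closely to follow the prompt
}
```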
The bridge automatically scans for NDI sources on your network. Common sources include:
- VirtualDJ - Enable NDI output in VirtualDJ's broadcast settings
- OBS Studio - Use the NDI plugin
- Any NDI-capable application
The bridge supports two backends:
Uses the Daydream cloud API for processing. Easy to set up: just sign in and go.
- 10 hours free trial
- No GPU required
- StreamDiffusion v1 models
Use Daydream Scope for more models and no cloud costs.
Features:
- StreamDiffusionV2, LongLive, Krea Realtime Video, and more
- LoRA support for custom styles
- Run locally or on RunPod
Using Scope with this bridge:
- Start Scope (locally or on RunPod)
  - Local: `uv run daydream-scope` → http://localhost:8000
  - RunPod: Use the RunPod template → get your proxy URL
- In the bridge control panel:
  - Switch to the "Scope (Self-hosted)" tab
  - Paste your Scope URL (e.g., `https://xxx-8000.proxy.runpod.net`)
  - Click Test Connection to verify
  - Select your NDI source and start streaming
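Before pasting a Scope URL, it can help to normalize it the way a connection test might: ensure a scheme and strip trailing slashes so endpoint paths append cleanly. `normalize_scope_url` is a hypothetical helper, not part of the bridge:

```python
def normalize_scope_url(url: str) -> str:
    # Hypothetical helper: normalize a Scope base URL before testing it.
    url = url.strip()
    if not url.startswith(("http://", "https://")):
        # assume plain HTTP for bare host:port (e.g. a local Scope instance)
        url = "http://" + url
    return url.rstrip("/")
```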
RunPod setup:
- Deploy the Scope RunPod template
- Set the `HF_TOKEN` environment variable for TURN servers (required for WebRTC through firewalls)
- Wait for deployment, then get your proxy URL from RunPod
- Paste the URL in the bridge and stream!
GPU requirements for self-hosting:
- Minimum: RTX 3090/4090 (24GB VRAM)
- For Krea Realtime Video: RTX 5090 or better (32GB+ VRAM)
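The VRAM guidance above can be encoded as a quick pre-flight check. The minimums come from the list above; `has_enough_vram` and the model keys are hypothetical names for illustration:

```python
# Minimum VRAM (GB) per model family, per the guidance above.
# Model keys are hypothetical identifiers, not official Scope model names.
MIN_VRAM_GB = {
    "streamdiffusion-v2": 24,   # RTX 3090/4090 class
    "krea-realtime-video": 32,  # RTX 5090 or better
}

def has_enough_vram(model: str, available_gb: float) -> bool:
    # Hypothetical pre-flight check before starting a self-hosted model;
    # unknown models fall back to the 24 GB baseline.
    return available_gb >= MIN_VRAM_GB.get(model, 24)
```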
MIT License - see LICENSE file for details.
- Daydream - Real-time AI video generation API
- Daydream Scope - Self-hosted AI video generation
- NDI - Network Device Interface