Description
I am encountering consistent CUDA Out of Memory (OOM) errors when running ROSE locally on a GPU with 16 GB of VRAM (RTX 4070 Ti 16GB). Although 16 GB is sufficient for many video generation tasks, the default inference settings (resolution and frame count) appear too aggressive for this memory capacity. Could you please provide guidance on optimized settings for 16 GB cards?
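For reference, here is a rough back-of-envelope sketch of why resolution and frame count dominate memory use in a diffusion-transformer video model. All of the numbers below (8x spatial / 4x temporal VAE compression, fp16, a naively materialized attention matrix) are my assumptions for illustration, not ROSE's actual architecture:

```python
def token_count(frames, height, width, spatial_ds=8, temporal_ds=4):
    """Number of latent tokens the transformer attends over,
    assuming an 8x spatial / 4x temporal VAE compression (an
    assumption, not ROSE's confirmed config)."""
    return (frames // temporal_ds) * (height // spatial_ds) * (width // spatial_ds)

def naive_attn_gib(frames, height, width, bytes_per_elem=2):
    """Memory for one fully materialized n x n attention matrix in
    fp16, per head. Flash/SDPA attention avoids allocating this, but
    it shows how memory grows quadratically with token count."""
    n = token_count(frames, height, width)
    return n * n * bytes_per_elem / 2**30

# Halving both spatial dimensions cuts tokens 4x and the naive
# attention matrix 16x, which is why lowering resolution is the
# most effective knob on a 16 GB card.
full = naive_attn_gib(49, 720, 1280)
small = naive_attn_gib(49, 360, 640)
print(f"720x1280: ~{full:.1f} GiB/head, 360x640: ~{small:.1f} GiB/head")
```

Under these assumptions, dropping from 720x1280 to 360x640 shrinks the quadratic attention term by 16x, so even modest reductions in resolution or frame count should make a large difference on 16 GB.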