Question:
Thank you for this amazing project! I’m encountering issues with inference on multiple GPUs.
Context:
I have 4 GPUs, each with 20GB of memory.
I want to run inference across all 4 GPUs to improve performance.
Ask:
Could you please guide me on how to modify the code to enable multi-GPU inference? Are there specific configurations or examples I should follow?
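For context, here is the kind of approach I had in mind: a minimal sketch of naive layer sharding (pipeline-style) across GPUs in plain PyTorch. The toy model and layer split are my own assumptions, not this project's actual model; the sketch falls back to CPU on machines without GPUs. Is something like this the right direction, or does the project support a built-in option?

```python
# Hypothetical sketch (NOT this project's API): naive pipeline-style
# sharding of a toy model's layers across available GPUs.
import torch
import torch.nn as nn

n_gpus = torch.cuda.device_count()
# Fall back to CPU so the sketch also runs on machines without GPUs.
devices = [f"cuda:{i}" for i in range(n_gpus)] or ["cpu"] * 4

class ShardedModel(nn.Module):
    """Toy model: each stage is placed on its own device."""
    def __init__(self, hidden=64, n_stages=4):
        super().__init__()
        self.stages = nn.ModuleList(
            nn.Linear(hidden, hidden).to(devices[i % len(devices)])
            for i in range(n_stages)
        )

    def forward(self, x):
        for i, stage in enumerate(self.stages):
            # Move activations to the device holding the next stage.
            x = x.to(devices[i % len(devices)])
            x = stage(x)
        return x

model = ShardedModel().eval()
with torch.no_grad():
    out = model(torch.randn(2, 64))
print(out.shape)  # torch.Size([2, 64])
```

With 4×20GB GPUs, splitting the model this way would let each card hold roughly a quarter of the weights, but the stages run sequentially per request, so I'm also unsure whether this actually improves throughput compared to other strategies.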