After digging around a bit with my 5090 on cu128-slim, it seems the following versions of torch and xformers are working for me:
```shell
pip install torch==2.7.0 torchvision==0.22.0 torchaudio==2.7.0 xformers==0.0.30 --index-url https://download.pytorch.org/whl/cu128
```
You also need the following flash-attn wheel for most samplers to work:
```shell
pip install https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.11/flash_attn-2.8.3+cu128torch2.7-cp312-cp312-linux_x86_64.whl
```
All of my samplers seem to be working fine, as well as PulID. Maybe more people can test these out, but I think these should be the usable versions for all Blackwell cards.
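Not part of the original report, but here is a quick stdlib-only sketch for checking that the pins above actually took in your environment (the `PINS` dict just mirrors the pip line; the `+cu128`-style local suffix is stripped before comparing):

```python
# Sketch: verify installed package versions against the pinned set above.
# Assumes the packages were installed with the pip command in this comment.
from importlib.metadata import version, PackageNotFoundError

PINS = {
    "torch": "2.7.0",
    "torchvision": "0.22.0",
    "torchaudio": "2.7.0",
    "xformers": "0.0.30",
}

def check_pins(pins):
    """Return a list of (package, expected, found) mismatches.

    `found` is None when the package is not installed at all.
    """
    problems = []
    for pkg, expected in pins.items():
        try:
            found = version(pkg)
        except PackageNotFoundError:
            found = None
        # cu128 wheels report versions like "2.7.0+cu128";
        # compare only the base version before the local "+" suffix.
        if found is None or found.split("+")[0] != expected:
            problems.append((pkg, expected, found))
    return problems

if __name__ == "__main__":
    mismatches = check_pins(PINS)
    if not mismatches:
        print("all pins match")
    for pkg, expected, found in mismatches:
        print(f"{pkg}: expected {expected}, found {found}")
```

If it prints nothing but "all pins match", your environment matches the versions reported to work here.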