[Suggestion] Blackwell with cu128 xformer version #144

@RokuDoan

Description

After digging around a bit with my 5090 on cu128-slim, it seems the following versions of torch and xformers are working for me:

pip install torch==2.7.0 torchvision==0.22.0 torchaudio==2.7.0 xformers==0.0.30 --index-url https://download.pytorch.org/whl/cu128

Also, with the following flash-attn wheel, most samplers work:

pip install https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.4.11/flash_attn-2.8.3+cu128torch2.7-cp312-cp312-linux_x86_64.whl

All of my samplers seem to work fine, as does PulID. Maybe more people can test these out, but I think these should be the usable versions for all Blackwell cards.
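Since getting a matching set of wheels is the fiddly part, here is a small stdlib-only sketch to confirm what actually ended up installed (the `report` helper is my own, not part of any of these packages; it degrades gracefully if a package is missing):

```python
import importlib.util

def report(pkg):
    # Check installability without importing heavy CUDA libs unnecessarily.
    if importlib.util.find_spec(pkg) is None:
        return f"{pkg}: not installed"
    mod = __import__(pkg)
    return f"{pkg}: {getattr(mod, '__version__', 'unknown')}"

for pkg in ("torch", "torchvision", "torchaudio", "xformers", "flash_attn"):
    print(report(pkg))
```

If torch is present, `python -c "import torch; print(torch.version.cuda, torch.cuda.get_device_capability())"` should report a 12.8 CUDA build; anything older than cu128 is unlikely to support Blackwell.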
