
Build on Ubuntu 24.04 with CUDA 12.8 and GCC 14 still fails on flash_attn #552

Description

@swingliluyao603

We saw a workaround suggesting building flash-attention from source, but flash-attention won't build with GCC 14 either.

Can anyone suggest a workaround that actually works?
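
Not a confirmed fix, but a commonly suggested workaround for GCC 14 incompatibilities is to point the CUDA build at an older host compiler such as gcc-13. The sketch below assumes gcc-13/g++-13 are installed and that flash-attention's source build (via PyTorch's cpp_extension) picks up the `CC`/`CXX` environment variables for host compilation; treat it as an assumption to verify rather than a guaranteed solution.

```python
# Sketch: build flash-attn from source with an older host compiler (gcc-13).
# Assumes gcc-13/g++-13 are installed (e.g. apt install gcc-13 g++-13) and that
# the flash-attn build honors CC/CXX for the nvcc host compiler and C++ sources.
import os
import subprocess

env = os.environ.copy()
env["CC"] = "gcc-13"   # host C compiler; typically forwarded to nvcc via -ccbin
env["CXX"] = "g++-13"  # host C++ compiler for non-CUDA sources
env["MAX_JOBS"] = "4"  # limit parallel nvcc jobs to keep memory usage manageable

# --no-build-isolation so the build sees the already-installed torch.
subprocess.run(
    ["pip", "install", "flash-attn", "--no-build-isolation"],
    env=env,
    check=True,
)
```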
