Releases: Dao-AILab/flash-attention

v2.4.3 (22 Jan 01:15)
Bump to v2.4.3

v2.4.2 (26 Dec 00:29)
Bump to v2.4.2

v2.4.1 (24 Dec 05:01)
Bump to v2.4.1

v2.4.0.post1 (22 Dec 18:10)
[CI] Don't compile for Python 3.7 / PyTorch 2.2

v2.4.0 (22 Dec 08:10)
Bump to v2.4.0

v2.3.6 (28 Nov 00:24)
Bump to v2.3.6

v2.3.5 (27 Nov 03:09)
Bump to v2.3.5

v2.3.4 (20 Nov 07:22)
Bump to v2.3.4

v2.3.3 (24 Oct 07:24)
Bump to v2.3.3

v2.3.2 (09 Oct 00:22)
Bump to v2.3.2