Releases: Dao-AILab/flash-attention

v2.5.4 (21 Feb 2024, 00:32)

Bump to v2.5.4

v2.5.3 (10 Feb 2024, 09:09)

Bump to v2.5.3

v2.5.2 (31 Jan 2024, 10:46)

Bump to v2.5.2

v2.5.1.post1 (30 Jan 2024, 22:34)

[CI] Install torch 2.3 using index

v2.5.1 (30 Jan 2024, 05:07)

Bump to v2.5.1

v2.5.0 (23 Jan 2024, 07:41)

Bump to v2.5.0

v2.4.3.post1 (22 Jan 2024, 01:24)

[CI] Fix CUDA 12.2.2 compilation

v2.4.3 (22 Jan 2024, 01:15)

Bump to v2.4.3

v2.4.2 (26 Dec 2023, 00:29)

Bump to v2.4.2

v2.4.1 (24 Dec 2023, 05:01)

Bump to v2.4.1
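
Each entry above is a git tag with a corresponding flash-attn release on PyPI. As a minimal sketch for confirming which release is installed (assuming the package is already installed via pip; the flash_attn package exposes its release tag as __version__):

```python
import flash_attn

# Prints the installed release tag, matching the tags listed above
# (e.g. "2.5.4").
print(flash_attn.__version__)
```

To pin one of the releases listed here, a requirement such as flash-attn==2.5.4 can be passed to pip; the project's README recommends installing with --no-build-isolation.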