Releases: lucidrains/x-transformers

2.16.1

12 Feb 16:25

What's Changed

  • Support efficient flash attention for packed sequences using flash-attn-2.0 by @muthissar in #350
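As background for this change: "packed" (varlen) attention concatenates variable-length sequences into one long token sequence and hands the kernel cumulative offsets so attention never crosses sequence boundaries; flash-attn 2.x exposes this via `flash_attn_varlen_func(q, k, v, cu_seqlens_q, cu_seqlens_k, max_seqlen_q, max_seqlen_k)`. The sketch below only illustrates the cumulative-offset bookkeeping — it is not the x-transformers implementation, and the helper name `cu_seqlens` is our own.

```python
# Illustrative sketch only (not the x-transformers code from #350):
# how cumulative sequence lengths describe a packed batch.

def cu_seqlens(lengths):
    """Cumulative sequence lengths: boundaries of each packed sequence,
    in the format flash-attn's varlen kernels expect (starts with 0)."""
    out = [0]
    for n in lengths:
        out.append(out[-1] + n)
    return out

lengths = [3, 5, 2]        # three sequences packed into 10 tokens
offsets = cu_seqlens(lengths)
print(offsets)             # [0, 3, 8, 10]

# tokens i with offsets[k] <= i < offsets[k+1] belong to sequence k,
# so the attention mask reduces to interval checks on these offsets
max_seqlen = max(lengths)  # the varlen kernel also needs this
print(max_seqlen)          # 5
```

Because the mask is implied by the offsets, no padding tokens are computed, which is where the efficiency win over dense padded batches comes from.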

New Contributors

  • @muthissar made their first contribution in #350

Full Changelog: 2.16.0...2.16.1

2.16.0

07 Feb 17:29

Full Changelog: 2.15.2...2.16.0

2.15.2

03 Feb 17:36

Full Changelog: 2.15.1...2.15.2

2.15.1

03 Feb 17:10

Full Changelog: 2.15.0...2.15.1

2.15.0

03 Feb 15:16

Full Changelog: 2.14.2...2.15.0

2.14.2

06 Jan 22:17

Full Changelog: 2.14.1...2.14.2

2.14.1

06 Jan 20:55

Full Changelog: 2.14.0...2.14.1

2.14.0

06 Jan 15:32

Full Changelog: 2.12.2...2.14.0

2.12.2

25 Dec 14:59

Full Changelog: 2.12.1...2.12.2

2.12.1

25 Dec 14:21

Full Changelog: 2.12.0...2.12.1