Releases: lucidrains/x-transformers
2.16.1
What's Changed
- Support efficient flash attention for packed sequences using flash-attn-2.0 by @muthissar in #350
New Contributors
- @muthissar made their first contribution in #350
Full Changelog: 2.16.0...2.16.1
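The packed-sequence ("varlen") path referenced above avoids padding by concatenating variable-length sequences into a single buffer; efficient flash attention kernels then enforce a block-diagonal attention pattern so tokens never attend across a packing boundary. A minimal NumPy sketch of that implicit mask, purely illustrative (the function name and inputs here are not x-transformers or flash-attn API):

```python
import numpy as np

def block_diagonal_mask(seq_lens):
    """Build the attention mask that packed-sequence attention enforces
    implicitly: each token may attend only within its own sequence."""
    total = sum(seq_lens)
    mask = np.zeros((total, total), dtype=bool)
    offset = 0
    for n in seq_lens:
        # Allow attention only inside this sequence's block.
        mask[offset:offset + n, offset:offset + n] = True
        offset += n
    return mask

# Packing a length-2 and a length-3 sequence into one 5-token buffer:
mask = block_diagonal_mask([2, 3])
```

A kernel operating on the packed buffer never materializes this dense mask; it uses the cumulative sequence lengths to skip cross-sequence pairs, which is where the efficiency gain comes from.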
2.16.0
Full Changelog: 2.15.2...2.16.0
2.15.2
Full Changelog: 2.15.1...2.15.2
2.15.1
Full Changelog: 2.15.0...2.15.1
2.15.0
Full Changelog: 2.14.2...2.15.0
2.14.2
Full Changelog: 2.14.1...2.14.2
2.14.1
Full Changelog: 2.14.0...2.14.1
2.14.0
Full Changelog: 2.12.2...2.14.0
2.12.2
Full Changelog: 2.12.1...2.12.2
2.12.1
Full Changelog: 2.12.0...2.12.1