5 releases

- 0.10.2 (new): Apr 1, 2026
- 0.10.1: Apr 1, 2026
- 0.10.0: Mar 31, 2026
- 0.9.2: Jan 24, 2026
- 0.9.2-alpha.2: Dec 16, 2025

#2401 in Machine learning

Download history (downloads per week):

- 2026-02-04: 10
- 2026-02-11: 215
- 2026-02-18: 127
- 2026-02-25: 70
- 2026-03-04: 19
- 2026-03-11: 24
- 2026-03-18: 41
- 2026-03-25: 10

96 downloads per month
Used in 6 crates (2 directly)

MIT/Apache

1.5MB
33K SLoC

Candle Flash Attention v3 Layer

A Flash Attention v3 layer for Hopper GPUs (NVIDIA sm90a architecture), for use with the candle framework.
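Fused kernel aside, Flash Attention computes standard scaled dot-product attention, softmax(QK^T / sqrt(d)) V. As a point of reference, here is a minimal plain-Rust sketch of that computation for a single head; it only illustrates the math this crate's GPU kernel accelerates, and does not use the crate's API:

```rust
// Reference scaled dot-product attention for one head:
//   out = softmax(Q K^T / sqrt(d)) V
// Flash Attention v3 produces the same result via a fused,
// memory-efficient kernel on sm90a; this version is CPU-only
// and exists purely to show the computation.
fn attention(q: &[Vec<f32>], k: &[Vec<f32>], v: &[Vec<f32>]) -> Vec<Vec<f32>> {
    let d = q[0].len() as f32;
    let scale = 1.0 / d.sqrt();
    q.iter()
        .map(|qi| {
            // scaled dot products of this query against every key
            let scores: Vec<f32> = k
                .iter()
                .map(|kj| qi.iter().zip(kj).map(|(a, b)| a * b).sum::<f32>() * scale)
                .collect();
            // numerically stable softmax over the scores
            let max = scores.iter().cloned().fold(f32::NEG_INFINITY, f32::max);
            let exps: Vec<f32> = scores.iter().map(|s| (s - max).exp()).collect();
            let sum: f32 = exps.iter().sum();
            // weighted sum of value rows, one output column at a time
            let dv = v[0].len();
            (0..dv)
                .map(|c| exps.iter().zip(v).map(|(w, vr)| w / sum * vr[c]).sum::<f32>())
                .collect()
        })
        .collect()
}

fn main() {
    let q = vec![vec![1.0, 0.0], vec![0.0, 1.0]];
    let k = q.clone();
    let v = vec![vec![1.0, 2.0], vec![3.0, 4.0]];
    let out = attention(&q, &k, &v);
    println!("{:?}", out);
}
```

The fused kernel never materializes the full score matrix in GPU memory, which is what makes it fast on long sequences; the arithmetic above is otherwise identical.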

Dependencies

~25MB
~524K SLoC