#machine-learning #tensor #blas

candle-flash-attn

Flash attention layer for the candle ML framework
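A minimal usage sketch of what calling this crate typically looks like, assuming it exposes a flash_attn(q, k, v, softmax_scale, causal) helper over candle-core tensors in (batch, seq_len, num_heads, head_dim) layout, and that the kernels require a CUDA device with f16/bf16 inputs; the exact function signature and tensor layout should be checked against the crate's documentation.

    use candle_core::{DType, Device, Tensor};
    use candle_flash_attn::flash_attn;

    fn main() -> candle_core::Result<()> {
        // Flash attention kernels run on CUDA and expect f16/bf16 inputs (assumption).
        let device = Device::new_cuda(0)?;
        let (batch, seq_len, num_heads, head_dim) = (1, 128, 8, 64);

        // Assumed layout: (batch, seq_len, num_heads, head_dim).
        let q = Tensor::randn(0f32, 1.0, (batch, seq_len, num_heads, head_dim), &device)?
            .to_dtype(DType::F16)?;
        let k = q.clone();
        let v = q.clone();

        let softmax_scale = 1.0 / (head_dim as f32).sqrt();
        // causal = true applies an autoregressive (lower-triangular) mask.
        let out = flash_attn(&q, &k, &v, softmax_scale, true)?;
        println!("output dims: {:?}", out.dims());
        Ok(())
    }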

12 unstable releases (3 breaking)

0.4.1 Feb 28, 2024
0.3.3 Jan 31, 2024
0.3.2 Dec 20, 2023
0.3.1 Nov 12, 2023

#318 in Machine learning

Download history: weekly downloads from 2023-12-23 through 2024-04-06, peaking at 291/week in the week of 2024-02-24

79 downloads per month
Used in 7 crates (4 directly)

MIT/Apache

2MB
23K SLoC


Dependencies

~9MB
~188K SLoC