

llama-sys

llama-sys is a set of bindgen-generated wrappers for llama.cpp. The crate provides a low-level interface to llama.cpp so you can use it from your Rust projects. To use llama-sys, add the following to your Cargo.toml file:

[dependencies]
llama-sys = "0.1.0"

Then bring the generated bindings into scope in your Rust code:

use llama_sys::*;

Note that llama-sys provides a lower-level interface than llm-chain-llama and may be more difficult to use. However, if you need fine-grained control over llama.cpp, llama-sys is the way to go.
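Because the wrappers are generated directly from llama.cpp's headers, every call is a raw unsafe FFI call, and the functions available mirror the C API of whatever llama.cpp revision the crate was built against. As a rough sketch only (llama_print_system_info, llama_context_default_params, and the n_ctx field are assumed from llama.cpp's C API and may differ between llama.cpp versions), a minimal program could look like this:

use std::ffi::CStr;
use llama_sys::*;

fn main() {
    unsafe {
        // Calls into the generated bindings are unsafe: upholding llama.cpp's
        // invariants (valid pointers, correct call order) is the caller's job.
        let info = llama_print_system_info(); // *const c_char listing build-time features
        println!("{}", CStr::from_ptr(info).to_string_lossy());

        // Default context parameters as defined by llama.cpp.
        let params = llama_context_default_params();
        println!("default n_ctx: {}", params.n_ctx);
    }
}

From here, loading a model, tokenizing, and running inference all go through the same unsafe interface; if you would rather have a safe, higher-level API, use llm-chain-llama instead.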
