
llama-cpp-rs-2

A wrapper around the llama.cpp library for Rust.

Info

This is part of the project powering all the LLMs at utilityai. It is tightly coupled to llama.cpp and mimics its API as closely as possible, while remaining safe, so that it can stay up to date.
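For orientation, the sketch below shows roughly what driving the crate looks like: initialise the backend, load a model, create a context, tokenize a prompt, and decode it, mirroring the llama.cpp flow. The type and method names here (LlamaBackend, LlamaModel::load_from_file, new_context, str_to_token, LlamaBatch, decode) are assumptions about the crate's surface and may not match the current API exactly; treat this as a sketch and consult the crate's examples for the real signatures.

```rust
// A minimal sketch, assuming names that mirror llama.cpp's own flow.
// Check the crate's examples and docs for the exact API.
use std::path::Path;

use llama_cpp_2::context::params::LlamaContextParams;
use llama_cpp_2::llama_backend::LlamaBackend;
use llama_cpp_2::llama_batch::LlamaBatch;
use llama_cpp_2::model::params::LlamaModelParams;
use llama_cpp_2::model::{AddBos, LlamaModel};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // The ggml backend must be initialised once, as in llama.cpp itself.
    let backend = LlamaBackend::init()?;

    // Load a GGUF model from disk (the path is a placeholder).
    let model_params = LlamaModelParams::default();
    let model =
        LlamaModel::load_from_file(&backend, Path::new("model.gguf"), &model_params)?;

    // Create an inference context bound to the model.
    let mut ctx = model.new_context(&backend, LlamaContextParams::default())?;

    // Tokenize a prompt and feed it to the model as a single batch,
    // requesting logits only for the last prompt token.
    let tokens = model.str_to_token("Hello, world!", AddBos::Always)?;
    let mut batch = LlamaBatch::new(512, 1);
    let last = tokens.len() - 1;
    for (i, token) in tokens.into_iter().enumerate() {
        batch.add(token, i as i32, &[0], i == last)?;
    }
    ctx.decode(&mut batch)?;

    // Sampling the next token from the logits would follow here,
    // again mirroring llama.cpp's sampling functions.
    Ok(())
}
```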

Dependencies

This crate uses bindgen to generate its bindings to llama.cpp, which means you need clang installed on your system.

If this is a problem for you, open an issue and we can look into shipping pre-generated bindings.

See bindgen for more information.
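To illustrate why clang is needed: bindgen drives libclang at build time to parse the llama.cpp headers and emit Rust declarations. A build script for this kind of -sys crate looks roughly like the sketch below; the header path and file names are placeholders, not this crate's actual build script.

```rust
// build.rs (illustrative sketch only).
use std::env;
use std::path::PathBuf;

fn main() {
    // bindgen parses the C/C++ header with libclang, which is why clang
    // must be installed on the build machine.
    let bindings = bindgen::Builder::default()
        .header("llama.cpp/llama.h") // placeholder path to the header
        .generate()
        .expect("failed to generate llama.cpp bindings");

    // Write the generated declarations into OUT_DIR so they can be
    // `include!`d from the crate's source.
    let out_dir = PathBuf::from(env::var("OUT_DIR").unwrap());
    bindings
        .write_to_file(out_dir.join("bindings.rs"))
        .expect("failed to write bindings");
}
```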

Disclaimer

This crate is not safe. There are absolutely ways to misuse the llama.cpp API it exposes and trigger undefined behaviour (UB); please open an issue if you spot one. Do not use this crate for tasks where UB is unacceptable.

This is not a simple library to use. In an ideal world, a nicer abstraction would be written on top of this crate to provide an ergonomic API; the benefit of this crate over raw bindings is safety (and not a great deal of it, at that) and not much else.

We compensate for this shortcoming (we hope) by providing lots of examples and good documentation. Testing is a work in progress.

Contributing

Contributions are welcome. Please open an issue before starting work on a non-trivial PR.
