#llm-chain #llm

llm-chain-llama

A library implementing llm-chains for LLaMA. Chains can be used to apply the model to complete complex tasks, such as agents.

26 releases (9 breaking)

0.13.0 Nov 15, 2023
0.12.3 Jun 27, 2023
0.12.0 May 31, 2023

#376 in Machine learning

81 downloads per month

MIT license

1MB
26K SLoC

C 10K SLoC // 0.1% comments
Rust 9K SLoC // 0.0% comments
C++ 5K SLoC // 0.1% comments
Python 1K SLoC // 0.1% comments
CUDA 544 SLoC // 0.0% comments
Shell 177 SLoC // 0.2% comments
Zig 49 SLoC
Batch 47 SLoC
Swift 19 SLoC

llm-chain-llama 🦙

Welcome to LLM-Chain-LLaMa, a powerful and versatile driver for LLaMa-style models! This crate leverages the amazing llama.cpp library, making it simple and efficient to run LLaMa, Alpaca, and similar models in a Rust environment.

Getting Started 🏁

To begin, you'll need to acquire a LLaMa model and convert it to the format llama.cpp expects. Don't worry; we've got your back! Just follow the instructions from llama.cpp and you'll be up and running in no time. 🦾
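
Once the model is converted, using it from Rust follows the usual llm-chain pattern. The sketch below is illustrative only: it assumes the `executor!`, `prompt!`, and `parameters!` macros from the companion llm-chain crate, and the exact model-path/option configuration varies between versions, so check the docs for the release you use.

```rust
// Cargo.toml (assumed): llm-chain, llm-chain-llama, tokio
use llm_chain::{executor, parameters, prompt};

#[tokio::main(flavor = "current_thread")]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Build an executor backed by llama.cpp. The path to your converted model
    // is supplied through llm-chain's options/environment configuration; the
    // exact mechanism depends on the llm-chain version (assumption).
    let exec = executor!(llama)?;

    // Run a single prompt against the local model.
    let res = prompt!("The capital cities of Europe, listed alphabetically:")
        .run(&parameters!(), &exec)
        .await?;

    println!("{}", res);
    Ok(())
}
```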

Features 🌟

LLM-Chain-LLaMa is packed with all the features you need to harness the full potential of LLaMa, Alpaca, and similar models. Here's a glimpse of what's inside:

  • Running chained LLaMa-style models in a Rust environment, taking your applications to new heights 🌄
  • Prompts for working with instruct models, empowering you to easily build virtual assistants and other amazing applications (see the sketch after this list) 🧙‍♂️
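
As a rough illustration of chaining, a two-step sequential chain run against the local model might look like the sketch below. This assumes llm-chain's `Step` and sequential `Chain` types and the `{{text}}` pass-through parameter; names and signatures may differ between versions.

```rust
use llm_chain::{chains::sequential::Chain, executor, parameters, prompt, step::Step};

#[tokio::main(flavor = "current_thread")]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Local LLaMa-backed executor (same setup as the single-prompt example).
    let exec = executor!(llama)?;

    // Two steps: the first drafts an outline, the second expands it.
    // Each step's output is handed to the next as the `{{text}}` parameter
    // (assumed default pass-through name; verify against your llm-chain version).
    let chain = Chain::new(vec![
        Step::for_prompt_template(prompt!(
            "Write a one-sentence outline for a blog post about Rust:"
        )),
        Step::for_prompt_template(prompt!(
            "Expand the following outline into a short paragraph: {{text}}"
        )),
    ]);

    let res = chain.run(parameters!(), &exec).await?;
    println!("{}", res);
    Ok(())
}
```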

So gear up and dive into the fantastic world of LLM-Chain-LLaMa! Let the power of LLaMa-style models propel your projects to the next level. Happy coding, and enjoy the ride! 🎉🥳

Dependencies

~16–30MB
~494K SLoC