#language-model #llm #chatgpt #chain #langchain

llm-chain

A library for running chains of LLMs (such as ChatGPT) in series to complete complex tasks, such as text summarization

30 releases (10 breaking)

0.13.0 Nov 15, 2023
0.12.3 Jun 27, 2023
0.12.0 May 31, 2023
0.1.1-rc.1 Mar 25, 2023

#53 in Machine learning

826 downloads per month
Used in 10 crates

MIT license

195KB
4K SLoC

llm-chain 🚀

llm-chain is a collection of Rust crates designed to help you create advanced LLM applications such as chatbots, agents, and more. As a comprehensive LLM-Ops platform, we have strong support for both cloud and locally hosted LLMs. We also provide robust support for prompt templates and for chaining prompts together into multi-step chains, enabling complex tasks that LLMs can't handle in a single step, as well as vector store integrations that make it easy to give your model long-term memory and subject-matter knowledge. This empowers you to build sophisticated applications.

This crate is the main crate for llm-chain. You will also need a driver crate, such as llm-chain-openai or llm-chain-local.

Examples 💡

To help you get started, here is an example demonstrating how to use llm-chain. You can find more examples in the examples folder in the repository.

use llm_chain::{executor, parameters, prompt};

#[tokio::main(flavor = "current_thread")]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Create an executor; with the llm-chain-openai driver this targets ChatGPT.
    let exec = executor!()?;
    // Build a prompt from a system message and a user message, then run it.
    let res = prompt!(
        "You are a robot assistant for making personalized greetings",
        "Make a personalized greeting for Joe"
    )
    .run(&parameters!(), &exec)
    .await?;
    println!("{}", res);
    Ok(())
}

➡️ Tutorial: get started with llm-chain
➡️ Quick start: create a project based on our template

Features 🌟

  • Prompt templates: Create reusable and easily customizable prompt templates for consistent and structured interactions with LLMs.
  • Chains: Build powerful chains of prompts that allow you to execute more complex tasks, step by step, leveraging the full potential of LLMs (see the sketch after this list).
  • ChatGPT support: Supports ChatGPT models, with plans to add OpenAI's other models in the future.
  • LLaMa support: Provides seamless integration with LLaMa models, enabling natural language understanding and generation tasks with Facebook's research models.
  • Alpaca support: Incorporates support for Stanford's Alpaca models, expanding the range of available language models for advanced AI applications.
  • Tools: Enhance your AI agents' capabilities by giving them access to various tools, such as running Bash commands, executing Python scripts, or performing web searches, enabling more complex and powerful interactions.
  • Extensibility: Designed with extensibility in mind, making it easy to integrate additional LLMs as the ecosystem grows.
  • Community-driven: We welcome and encourage contributions from the community to help improve and expand the capabilities of llm-chain.
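
To illustrate the prompt-template and chain features together, here is a minimal sketch of a two-step sequential chain, loosely based on the crate's sequential-chain examples. It assumes the executor!, prompt!, and parameters! macros, the Step and sequential Chain types, and an OpenAI driver; the template placeholders and parameter names ({{topic}}, {{text}}) are illustrative, so check the repository examples for the exact API of your version.

use llm_chain::chains::sequential::Chain;
use llm_chain::step::Step;
use llm_chain::{executor, parameters, prompt};

#[tokio::main(flavor = "current_thread")]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let exec = executor!()?;
    // Two steps run in series; the first step's output is passed to the
    // second step as its {{text}} parameter.
    let chain = Chain::new(vec![
        Step::for_prompt_template(prompt!(
            "You are a bot for drafting article outlines",
            "Write a short outline for an article about {{topic}}"
        )),
        Step::for_prompt_template(prompt!(
            "You are a bot for summarizing outlines",
            "Summarize this outline in one sentence:\n{{text}}"
        )),
    ]);
    // "topic" is an illustrative parameter name filled in at run time.
    let res = chain
        .run(parameters!("topic" => "chaining LLM prompts"), &exec)
        .await?;
    println!("{}", res);
    Ok(())
}

As with the greeting example above, running this requires the OPENAI_API_KEY environment variable described under Getting Started.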

Getting Started 🚀

To start using llm-chain, add it as a dependency in your Cargo.toml:

cargo add llm-chain llm-chain-openai
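
Equivalently, you can declare the dependencies in Cargo.toml by hand. A rough sketch follows; the version numbers are illustrative (0.13.0 is the latest llm-chain release listed above, and the llm-chain-openai version may differ), and tokio is the async runtime used by the examples:

[dependencies]
llm-chain = "0.13"
llm-chain-openai = "0.13"
tokio = { version = "1", features = ["macros", "rt"] }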

The examples for llm-chain-openai require you to set the OPENAI_API_KEY environment variable, which you can do like this:

export OPENAI_API_KEY="sk-YOUR_OPEN_AI_KEY_HERE"

Then, refer to the documentation and examples to learn how to create prompt templates, chains, and more.

Contributing 🤝

We warmly welcome contributions from everyone! If you're interested in helping improve llm-chain, please check out our CONTRIBUTING.md file for guidelines and best practices.

License 📄

llm-chain is licensed under the MIT License.

Connect with Us 🌐

If you have any questions, suggestions, or feedback, feel free to open an issue or join our community Discord. We're always excited to hear from our users and learn about your experiences with llm-chain.

We hope you enjoy using llm-chain to unlock the full potential of Large Language Models in your projects. Happy coding! 🎉

Dependencies

~16–32MB
~501K SLoC