15 releases (5 breaking)

0.5.0 Dec 3, 2024
0.4.1 Nov 13, 2024
0.3.0 Oct 24, 2024
0.2.1 Oct 1, 2024
0.0.0 May 29, 2024

#139 in Machine learning


822 downloads per month
Used in 5 crates

MIT license

265KB
5K SLoC

Rig

Rig is a Rust library for building LLM-powered applications that focuses on ergonomics and modularity.

More information about this crate can be found in the crate documentation.

High-level features

  • Full support for LLM completion and embedding workflows
  • Simple but powerful common abstractions over LLM providers (e.g. OpenAI, Cohere) and vector stores (e.g. MongoDB, in-memory)
  • Integrate LLMs in your app with minimal boilerplate
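The embedding workflow mentioned above follows a builder pattern similar to the completion example further down. The sketch below is an assumption based on the crate documentation, not verbatim API: the EmbeddingsBuilder type, the embedding_model constructor, and the simple_document method are assumed names that may differ between versions of the crate.

```rust
use rig::embeddings::EmbeddingsBuilder;
use rig::providers::openai;

#[tokio::main]
async fn main() {
    // Requires `OPENAI_API_KEY`, as in the completion example below.
    let openai_client = openai::Client::from_env();

    // Assumed constructor for an OpenAI embedding model.
    let model = openai_client.embedding_model("text-embedding-ada-002");

    // Embed a couple of documents in one batched request.
    let embeddings = EmbeddingsBuilder::new(model)
        .simple_document("doc1", "Rig is a Rust library for LLM applications.")
        .simple_document("doc2", "It abstracts over providers and vector stores.")
        .build()
        .await
        .expect("Failed to embed documents");

    println!("Embedded {} documents", embeddings.len());
}
```

The resulting embeddings can then be loaded into one of the supported vector stores.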

Installation

cargo add rig-core
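Since the example below uses tokio and reads the API key from the environment, a typical project setup looks like this (a sketch; the key value is a placeholder):

```shell
# Add Rig and the tokio async runtime with the features the example needs
cargo add rig-core
cargo add tokio --features macros,rt-multi-thread

# The OpenAI client reads its key from the environment
export OPENAI_API_KEY="<your key>"
```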

Simple example:

use rig::{completion::Prompt, providers::openai};

#[tokio::main]
async fn main() {
    // Create OpenAI client and model
    // This requires the `OPENAI_API_KEY` environment variable to be set.
    let openai_client = openai::Client::from_env();

    let gpt4 = openai_client.model("gpt-4").build();

    // Prompt the model and print its response
    let response = gpt4
        .prompt("Who are you?")
        .await
        .expect("Failed to prompt GPT-4");

    println!("GPT-4: {response}");
}

Note: using #[tokio::main] requires enabling tokio's macros and rt-multi-thread features, or simply the full feature to enable everything (cargo add tokio --features macros,rt-multi-thread).

Integrations

Rig supports the following LLM providers natively:

  • OpenAI
  • Cohere
  • Google Gemini
  • xAI

Additionally, Rig currently has the following integration sub-libraries:

  • MongoDB vector store: rig-mongodb
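Each integration ships as its own crate and is added alongside rig-core, for example:

```shell
cargo add rig-mongodb
```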

Dependencies

~5–19MB
~247K SLoC