Rig
Rig is a Rust library for building LLM-powered applications that focuses on ergonomics and modularity.
More information about this crate can be found in the crate documentation.
High-level features
- Full support for LLM completion and embedding workflows
- Simple but powerful common abstractions over LLM providers (e.g. OpenAI, Cohere) and vector stores (e.g. MongoDB, in-memory); a short sketch of this follows the list
- Integrate LLMs in your app with minimal boilerplate
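Because completion is exposed through shared traits such as `rig::completion::Prompt`, application code can be written once and reused across providers. The sketch below illustrates the idea; the `ask` helper is hypothetical (not part of Rig), and the exact `Prompt` signature may vary between versions, so treat this as an outline rather than the library's API.

```rust
use rig::completion::Prompt;

/// Hypothetical helper: works with any completion model or agent that
/// implements `Prompt`, regardless of which provider it was built from.
/// Assumes `prompt` takes `&self` and a `&str`, as in the example further below.
async fn ask(model: &impl Prompt, question: &str) -> String {
    model
        .prompt(question)
        .await
        .expect("failed to prompt the model")
}
```

For instance, the `gpt4` model built in the example under "Installation" could be passed as `ask(&gpt4, "Who are you?")` without the helper knowing which provider backs it.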
Installation
```sh
cargo add rig-core
```
Simple example:

```rust
use rig::{completion::Prompt, providers::openai};

#[tokio::main]
async fn main() {
    // Create the OpenAI client and model.
    // This requires the `OPENAI_API_KEY` environment variable to be set.
    let openai_client = openai::Client::from_env();
    let gpt4 = openai_client.model("gpt-4").build();

    // Prompt the model and print its response.
    let response = gpt4
        .prompt("Who are you?")
        .await
        .expect("Failed to prompt GPT-4");

    println!("GPT-4: {response}");
}
```
Note: using `#[tokio::main]` requires that you enable tokio's `macros` and `rt-multi-thread` features, or simply `full` to enable all of them (`cargo add tokio --features macros,rt-multi-thread`).
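For reference, the two `cargo add` commands above result in dependency entries roughly like the following in `Cargo.toml` (the version numbers here are illustrative; `cargo add` pins whatever release is current):

```toml
[dependencies]
# Versions shown are illustrative, not prescriptive.
rig-core = "0.5"
tokio = { version = "1", features = ["macros", "rt-multi-thread"] }
```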
Integrations
Rig supports the following LLM providers natively:
- OpenAI
- Cohere
- Google Gemini
- xAI
Additionally, Rig currently has the following integration sub-libraries:
- MongoDB vector store: `rig-mongodb`
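Each integration is a separate crate added alongside `rig-core`; for example, the MongoDB vector store:

```sh
cargo add rig-mongodb
```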