7 releases (breaking)

0.10.0 Mar 26, 2024
0.9.0 Feb 24, 2024
0.8.0 Feb 20, 2024
0.7.0 Jan 23, 2024
0.4.0 Jan 4, 2024

#714 in Machine learning


168 downloads per month
Used in 3 crates (via rai)

MIT/Apache

290KB
8K SLoC

RAI


RAI is an ML framework with ergonomic APIs in Rust, built around lazy computation and composable function transformations.

Installation

cargo add rai
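
A minimal quick-start sketch (not from the upstream docs), using only the constructors and ops that appear in the snippets below; tensors are built lazily and their values are produced when displayed:

use rai::{Cpu, Tensor, F32};

fn main() {
    // Building the expression only records the computation; nothing runs yet.
    let x = Tensor::ones([2, 3], F32, &Cpu);
    let y = x.sin();
    // Displaying the tensor produces its values.
    println!("{}", y);
}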

Code snippets

Function transformations (jvp, vjp, grad, value_and_grad)

use rai::{grad, Cpu, Func, Tensor, F32};

fn f(x: &Tensor) -> Tensor {
    x.sin()
}

fn main() {
    // Compose `grad` twice to get the second derivative of `f`.
    let grad_fn = grad(grad(f));
    let x = Tensor::ones([1], F32, &Cpu);
    let grad = grad_fn.apply((&x,));
    // Inspect the lazily built computation graph, then print the result.
    println!("{}", grad.dot_graph());
    println!("{}", grad);
}

NN modules, optimizers and loss functions

fn loss_fn<M: TrainableModule<Input = Tensor, Output = Tensor>>(
    model: &M,
    input: &Tensor,
    labels: &Tensor,
) -> (Tensor, Aux<Tensor>) {
    let logits = model.forward(input);
    let loss = softmax_cross_entropy(&logits, labels).mean(..);
    // Return the logits as auxiliary output alongside the scalar loss.
    (loss, Aux(logits))
}

fn train_step<M: TrainableModule<Input = Tensor, Output = Tensor>, O: Optimizer>(
    optimizer: &mut O,
    model: &M,
    input: &Tensor,
    labels: &Tensor,
) {
    // Get both the loss value (with its aux output) and the gradients.
    let vg_fn = value_and_grad(loss_fn);
    let ((_loss, Aux(_logits)), (grads, ..)) = vg_fn.apply((model, input, labels));
    // Compute updated parameters, force their evaluation, and write them back.
    let mut params = optimizer.step(&grads);
    eval(&params);
    model.update_params(&mut params);
}
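
For context, a hypothetical epoch loop driving the train_step above; the model, optimizer, and batches are assumed to be constructed elsewhere with rai's nn and optimizer types:

fn train_epoch<M: TrainableModule<Input = Tensor, Output = Tensor>, O: Optimizer>(
    optimizer: &mut O,
    model: &M,
    batches: &[(Tensor, Tensor)],
) {
    // Run one optimization step per (input, labels) batch.
    for (input, labels) in batches {
        train_step(optimizer, model, input, labels);
    }
}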

Examples

LICENSE

This project is licensed under either of

Apache License, Version 2.0
MIT license

at your option.

Dependencies

~4–17MB
~189K SLoC