#machine-learning #deep-learning #tensor

rai-core

ML framework with Ergonomic APIs in Rust

9 breaking releases

0.10.0 Mar 26, 2024
0.8.0 Feb 20, 2024
0.2.0 Dec 28, 2023

#863 in Machine learning

170 downloads per month
Used in 5 crates (3 directly)

MIT/Apache

280KB
8K SLoC

RAI

ML framework with ergonomic APIs in Rust. Lazy computation and composable transformations.

Installation

cargo add rai
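
Tensors in RAI are lazy: operations record a computation graph that is evaluated on demand (see eval in the training snippet below). The following is a minimal sketch of that model, reusing only calls that appear in the snippets on this page; the assumption that printing a tensor forces evaluation is mine, not stated in the README.

use rai::{Cpu, Tensor, F32};

fn main() {
    // Building the expression only records a lazy computation graph.
    let x = Tensor::ones([4], F32, &Cpu);
    let y = x.sin();
    // Inspect the recorded graph.
    println!("{}", y.dot_graph());
    // Printing the tensor displays the values (assumption: Display forces
    // evaluation; the README only shows eval being called explicitly).
    println!("{}", y);
}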

Code snippets

Function transformations (jvp, vjp, grad, value_and_grad)

use rai::{grad, Cpu, Func, Tensor, F32};

fn f(x: &Tensor) -> Tensor {
    x.sin()
}

fn main() {
    // Compose grad twice to get the second derivative of f.
    let grad_fn = grad(grad(f));
    let x = Tensor::ones([1], F32, &Cpu);
    let grad = grad_fn.apply((&x,));
    // Print the recorded computation graph, then the evaluated result.
    println!("{}", grad.dot_graph());
    println!("{}", grad);
}
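
Here grad(grad(f)) is the second derivative of sin, which is mathematically -sin(x), so for x = 1 the result is about -0.841; dot_graph() prints the recorded computation graph as well.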

NN modules, optimizers and loss functions

// Imports (TrainableModule, Optimizer, Aux, softmax_cross_entropy,
// value_and_grad, eval) are omitted here, as in the original snippet.
fn loss_fn<M: TrainableModule<Input = Tensor, Output = Tensor>>(
    model: &M,
    input: &Tensor,
    labels: &Tensor,
) -> (Tensor, Aux<Tensor>) {
    let logits = model.forward(input);
    // Mean softmax cross-entropy loss; the logits are carried along
    // as an auxiliary output.
    let loss = softmax_cross_entropy(&logits, labels).mean(..);
    (loss, Aux(logits))
}

fn train_step<M: TrainableModule<Input = Tensor, Output = Tensor>, O: Optimizer>(
    optimizer: &mut O,
    model: &M,
    input: &Tensor,
    labels: &Tensor,
) {
    // Get both the loss value (with auxiliary logits) and the gradients.
    let vg_fn = value_and_grad(loss_fn);
    let ((_loss, Aux(_logits)), (grads, ..)) = vg_fn.apply((model, input, labels));
    // Compute updated parameters, force evaluation of the lazy graph,
    // and write the results back into the model.
    let mut params = optimizer.step(&grads);
    eval(&params);
    model.update_params(&mut params);
}
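
Aux marks outputs (here the logits) that the loss function returns alongside the scalar loss; value_and_grad passes them through with the value while computing gradients, and the (grads, ..) destructuring keeps the gradients handed to the optimizer. optimizer.step produces the updated parameters, eval forces the lazy computation, and update_params writes them back into the model.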

Examples

LICENSE

This project is licensed under either of

Apache License, Version 2.0
MIT license

at your option.

Dependencies

~2–14MB
~146K SLoC