
mgrad

A minimal automatic differentiation library

5 releases

Uses the Rust 2024 edition

0.1.5 Apr 30, 2025
0.1.4 Apr 29, 2025

#255 in Machine learning


174 downloads per month

MIT license

38KB
1K SLoC

A minimal automatic differentiation library. Just for fun...

All code is in a single file (src/mgrad.rs), so it is easy to copy-paste into other projects.
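To show the technique a crate like this implements, here is a minimal sketch of tape-based reverse-mode automatic differentiation. It is illustrative only: the `Tape`, `Node`, and method names below are assumptions for the sketch, not mgrad's actual internals.

```rust
// Illustrative tape-based reverse-mode autodiff (NOT mgrad's internals).
// Each operation records its parents on the tape together with the local
// partial derivative with respect to each parent.
#[derive(Clone, Copy)]
struct Node {
    parents: [(usize, f64); 2], // (parent index, local partial)
    n_parents: usize,
}

struct Tape {
    nodes: Vec<Node>,
    vals: Vec<f64>,
}

impl Tape {
    fn new() -> Self {
        Tape { nodes: Vec::new(), vals: Vec::new() }
    }

    fn push(&mut self, val: f64, parents: [(usize, f64); 2], n: usize) -> usize {
        self.nodes.push(Node { parents, n_parents: n });
        self.vals.push(val);
        self.vals.len() - 1
    }

    fn variable(&mut self, v: f64) -> usize {
        self.push(v, [(0, 0.0); 2], 0)
    }

    fn sin(&mut self, a: usize) -> usize {
        let v = self.vals[a];
        self.push(v.sin(), [(a, v.cos()), (0, 0.0)], 1) // d sin(x)/dx = cos(x)
    }

    fn mul(&mut self, a: usize, b: usize) -> usize {
        let (va, vb) = (self.vals[a], self.vals[b]);
        self.push(va * vb, [(a, vb), (b, va)], 2) // product rule
    }

    fn backward(&self, root: usize) -> Vec<f64> {
        let mut grads = vec![0.0; self.vals.len()];
        grads[root] = 1.0;
        // Walk the tape in reverse; children always come after their parents.
        for i in (0..=root).rev() {
            let node = self.nodes[i];
            for &(p, local) in &node.parents[..node.n_parents] {
                grads[p] += local * grads[i]; // chain rule accumulation
            }
        }
        grads
    }
}

fn main() {
    let mut t = Tape::new();
    let x = t.variable(2.0);
    let s = t.sin(x);
    let z = t.mul(x, s); // z = x * sin(x)
    let grads = t.backward(z);
    // dz/dx = sin(x) + x * cos(x)
    println!("dz/dx at x=2: {}", grads[x]);
}
```

Keeping the whole graph in one `Vec` and walking it backwards is the same single-file, dependency-free style the crate advertises.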

use mgrad::nn;

fn main() {
    let x = nn::variable(1.0);
    let y = x.sin() + nn::constant(1.0);
    let y = (x.pow(2) * y).ln();
    y.backward(1.0);

    // y = ln(x^2 * (sin(x) + 1))
    // dy/dx should be ~ 2.29341
    println!("The gradient of y=ln(x^2 * (sin(x) + 1)) at x=1 is: {:?}", x.grad);
}
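The expected value can be checked without mgrad. For y = ln(x^2 * (sin x + 1)) the chain rule gives dy/dx = 2/x + cos x / (sin x + 1), and a central finite difference over plain `f64` confirms it (this sketch uses only the standard library, not the crate's API):

```rust
// Independent check of the gradient above, using only std.
// y = ln(x^2 * (sin x + 1))  =>  dy/dx = 2/x + cos x / (sin x + 1)
fn f(x: f64) -> f64 {
    (x * x * (x.sin() + 1.0)).ln()
}

fn main() {
    let x = 1.0_f64;
    // Analytic derivative from the chain rule.
    let analytic = 2.0 / x + x.cos() / (x.sin() + 1.0);
    // Central finite difference as a numeric cross-check.
    let h = 1e-6;
    let numeric = (f(x + h) - f(x - h)) / (2.0 * h);
    println!("analytic = {analytic:.5}, numeric = {numeric:.5}");
    assert!((analytic - 2.29341).abs() < 1e-5);
    assert!((analytic - numeric).abs() < 1e-6);
}
```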

Run cargo doc --open to see the documentation.
More examples can be found in the examples folder.

No runtime deps