#automatic-differentiation #var #pure

niura

Automatic differentiation in pure Rust

1 unstable release

0.1.0 May 24, 2022

#7 in #var

MPL-2.0 license

16KB
418 lines

Niura is an automatic differentiation library written in Rust.

Add niura to your project

[dependencies]

niura = { git = "https://github.com/taminki/niura" }
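
The line above tracks the default branch of the repository. If you want reproducible builds, Cargo's standard git-dependency fields let you pin a branch, tag, or commit instead; the branch name below is only an illustration, not something the project documents:

[dependencies]
niura = { git = "https://github.com/taminki/niura", branch = "main" }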

Why

While there are a couple of Rust libraries that provide automatic differentiation, most of them have fairly verbose APIs.

Niura aims to stay as close as possible to symbolic mathematics, with the major deviation being that you need to pass variables by reference, since differentiating with respect to a variable that has been dropped doesn't make sense.

Usage example

use niura::var::{self, *};

fn main() {

    // Two scalar variables tracked by the autodiff graph
    let a = var::float(2.0);
    let b = var::float(3.0);

    // Build the expression c = tan(a * b); operands are passed by reference
    let c = tan(&(&a * &b));

    // Reverse-mode pass: accumulate the gradient of c with respect to each variable
    c.backward();
    println!("{}", a.grad()); // dc/da = b * sec^2(a * b)
}
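
As a quick sanity check of the number printed above, the partial derivative of c = tan(a * b) with respect to a is b * sec^2(a * b). The snippet below is plain std Rust (no niura calls) that evaluates that derivative analytically and via a central finite difference:

fn main() {
    // f(a, b) = tan(a * b), the same expression as in the example above
    let f = |a: f64, b: f64| (a * b).tan();
    let (a, b) = (2.0_f64, 3.0_f64);

    // Analytic partial derivative: d/da tan(a * b) = b * sec^2(a * b) = b / cos(a * b)^2
    let analytic = b / (a * b).cos().powi(2);

    // Independent check with a central finite difference
    let h = 1e-6;
    let numeric = (f(a + h, b) - f(a - h, b)) / (2.0 * h);

    // Both should agree with what a.grad() reports, roughly 3.2541
    println!("analytic: {analytic:.6}, finite difference: {numeric:.6}");
}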

What's next

This was built as an exploratory project to learn how automatic differentiation works, so there is no concrete timeline.

However, I'm currently working on adding support for differentiating vector-valued functions, and in the future I'd like to add the features needed to turn this into a machine learning library.

Dependencies

~310KB