2 releases (Rust 2015 edition)

Version | Date
---|---
0.0.2 | Oct 21, 2018
0.0.1 | Aug 26, 2018
This is a package for writing differentiable programs (i.e. neural networks) in Rust.

- Look at the documentation at docs.rs/drug for a better description
- Check out an example: `cargo run --example mnist --release`
- Please give me feedback!
Versions
0.0.2
- Saving functionality
- New optimizers: momentum, Adam, RMSProp
- Nodes are now all part of one enum, rather than boxed traits
- Changed some function type signatures as per clippy suggestions
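The enum-over-boxed-traits change above can be sketched generically. This is an illustrative example of the design pattern only; the variant and method names are hypothetical, not drug's actual API:

```rust
// One enum for all node kinds, dispatched with `match`, instead of a
// virtual call through `Box<dyn Trait>`. Variants here are hypothetical.
enum Node {
    Add,
    Relu,
}

impl Node {
    // Static dispatch: the compiler sees every case at the call site.
    fn forward(&self, a: f32, b: f32) -> f32 {
        match self {
            Node::Add => a + b,
            Node::Relu => (a + b).max(0.0),
        }
    }
}

fn main() {
    // Nodes of different kinds live in one homogeneous Vec, no boxing needed.
    let graph = vec![Node::Add, Node::Relu];
    let out: Vec<f32> = graph.iter().map(|n| n.forward(-1.5, 1.0)).collect();
    println!("{:?}", out); // [-0.5, 0.0]
}
```

Compared with `Vec<Box<dyn Operation>>`, this avoids heap allocation per node and an indirect call per evaluation, at the cost of a closed set of node types.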
0.0.1
- Initial release
lib.rs:
∂rug - Differentiable Rust Graph
This crate is a collection of utilities to build neural networks (differentiable programs). See the examples for implementations of canonical neural networks; you may need to download the datasets yourself to use them. Examples include:
- Mnist with dense networks
- Mnist with convolutional neural networks (though embarrassingly slowly)
- Penn TreeBank character prediction with RNN and GRU
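The building block shared by all of these examples is the dense layer. A minimal forward pass with ReLU can be sketched in plain Rust with `Vec` for clarity (the crate's examples use ndarray instead; this function is illustrative, not part of drug):

```rust
// Dense (fully connected) layer forward pass: out[i] = relu(W[i] . x + b[i]).
fn dense_relu(input: &[f32], weights: &[Vec<f32>], bias: &[f32]) -> Vec<f32> {
    weights
        .iter()
        .zip(bias)
        .map(|(row, b)| {
            // Dot product of one weight row with the input, plus bias.
            let z: f32 = row.iter().zip(input).map(|(w, x)| w * x).sum::<f32>() + b;
            z.max(0.0) // ReLU activation
        })
        .collect()
}

fn main() {
    let x = vec![1.0, 2.0];
    let w = vec![vec![0.5, -0.25], vec![-1.0, 1.0]];
    let b = vec![0.1, 0.0];
    println!("{:?}", dense_relu(&x, &w, &b)); // [0.1, 1.0]
}
```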
Planned Future Features
- Higher level API
- Building complexes of nodes (conv + bias + relu) / RNN cells, with parameter reuse
- Subgraphs / updating subsets of graphs (e.g. for GAN) with separate optimizers
- Parallel backprop through multiple arguments of one node
- ndarray-parallel or OpenMPI for graph replication and parallelization
- Link to some optimized OpenCL maths backend for GPU utilization
Reinforcement learning applications may also challenge the architecture, but I don't understand the process well enough yet to consider adding it to the library.
Wish list
- Operator overloading API + Taking advantage of the type system and const generics
- May require a total overhaul, or may be possible with a "Graph Cursor" trait and more sophisticated handles beyond the current Idxs
- Automatic differentiation of operations defined only from loops (proc macros?)
- Taking advantage of just in time compilation and fusion of operations / kernels
- Other kinds of derivatives e.g. jacobian
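To make the operator-overloading wish concrete, here is a common way such an API can be built: forward-mode autodiff with dual numbers via `std::ops`. This is a generic sketch of the technique, not part of drug:

```rust
use std::ops::{Add, Mul};

// A dual number carries a value and its derivative w.r.t. one chosen input.
#[derive(Clone, Copy, Debug)]
struct Dual {
    val: f32, // value
    der: f32, // derivative
}

impl Add for Dual {
    type Output = Dual;
    fn add(self, rhs: Dual) -> Dual {
        // Sum rule: (u + v)' = u' + v'
        Dual { val: self.val + rhs.val, der: self.der + rhs.der }
    }
}

impl Mul for Dual {
    type Output = Dual;
    fn mul(self, rhs: Dual) -> Dual {
        // Product rule: (uv)' = u'v + uv'
        Dual { val: self.val * rhs.val, der: self.der * rhs.val + self.val * rhs.der }
    }
}

fn main() {
    // Differentiate f(x) = x*x + x at x = 3: f'(x) = 2x + 1 = 7.
    let x = Dual { val: 3.0, der: 1.0 };
    let f = x * x + x;
    println!("f = {}, f' = {}", f.val, f.der); // f = 12, f' = 7
}
```

With const generics, the same idea extends to vectors of duals (a row of the Jacobian per forward pass), which connects to the "other kinds of derivatives" item above.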