Unda

General purpose neural network crate

Unda aims to bring the future of deep learning to the world of Rust. With dynamic input traits, concurrent minibatch processing, and full dense network support (with convolutions soon to come), Unda makes neural network development easy and blazingly fast.

Installation

Use the package manager cargo to add unda to your Rust project:

cargo add unda

or add the dependency directly to your Cargo.toml file:

[dependencies]
unda = "0.2.2"

Usage

use unda::core::network::Network;
use unda::core::layer::{methods::{activations::Activations, errors::ErrorTypes}, layers::{LayerTypes, InputTypes}};
use unda::core::data::input::Input;

fn main() {
    // XOR truth table: four input pairs and their expected outputs
    let inputs = vec![vec![0.0, 0.0], vec![1.0, 0.0], vec![0.0, 1.0], vec![1.0, 1.0]];
    let outputs = vec![vec![0.0], vec![1.0], vec![1.0], vec![0.0]];

    let mut new_net = Network::new(4);

    // Two input neurons, a three-neuron ReLU hidden layer, and a single
    // sigmoid output neuron; the final argument is the layer's learning rate
    new_net.set_input(InputTypes::DENSE(2));
    new_net.add_layer(LayerTypes::DENSE(3, Activations::RELU, 0.001));
    new_net.add_layer(LayerTypes::DENSE(1, Activations::SIGMOID, 0.001));

    new_net.compile();

    // Train for 2 epochs using mean absolute error as the loss
    new_net.fit(&inputs, &outputs, 2, ErrorTypes::MeanAbsolute);

    println!("1 and 0: {:?}", new_net.predict(vec![1.0, 0.0])[0]);
    println!("0 and 1: {:?}", new_net.predict(vec![0.0, 1.0])[0]);
    println!("1 and 1: {:?}", new_net.predict(vec![1.0, 1.0])[0]);
    println!("0 and 0: {:?}", new_net.predict(vec![0.0, 0.0])[0]);

    // Serialize the trained network to disk
    new_net.save("best_network.json");
}

Examples

The unda repository hosts a number of example ML models that solve common problems. These examples can be found in the /examples folder and can be run by entering:

cargo run --release --example {example_name}

where example_name is the name of the file/folder you wish to run, omitting the .rs extension.

Currently, Unda has example implementations for XOR, MNIST, and a breast cancer model from Kaggle.
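For instance, assuming the XOR example file is named xor.rs, the invocation would be:

cargo run --release --example xor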

Important! When running the MNIST example, please make sure to put the appropriate ubyte files into the /src/util/mnist directory of this repository. We are currently working on using reqwest to download the dataset automatically, but for now this must be done manually.

Here are the Google Drive links to the necessary ubyte files:

Implications for the future of ML

Using the built-in Input trait, practically any data type can be mapped to an input for a neural network without the need for cutting corners, and the inner trait for layers allows for a plug-and-play style of neural network development. Currently, Unda has full support for dense layers, Adam optimization for backpropagation, activation functions (Sigmoid, TanH, ReLU, and LeakyReLU), and even loss analysis per model and per layer.
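As an illustration of that idea, here is a minimal, self-contained sketch of mapping a domain type onto a flat vector of f32 inputs. The trait and type names below are hypothetical stand-ins invented for this example; unda's actual Input trait may expose a different set of methods.

trait ToNetworkInput {
    // Flatten a value into the f32 features a dense input layer expects
    fn to_param(&self) -> Vec<f32>;
}

// A domain type fed to a network without hand-written preprocessing
struct HousingRecord {
    square_feet: f32,
    bedrooms: u8,
    has_garage: bool,
}

impl ToNetworkInput for HousingRecord {
    fn to_param(&self) -> Vec<f32> {
        vec![
            self.square_feet,
            self.bedrooms as f32,
            if self.has_garage { 1.0 } else { 0.0 },
        ]
    }
}

fn main() {
    let record = HousingRecord { square_feet: 1800.0, bedrooms: 3, has_garage: true };
    assert_eq!(record.to_param(), vec![1800.0, 3.0, 1.0]);
}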

Gradient descent can currently run either synchronously as stochastic gradient descent or asynchronously as minibatch gradient descent.
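To make the distinction concrete, here is a crate-agnostic sketch of the two update schemes on a single scalar parameter; the toy per-sample loss (w - x)^2 and every name here are illustrative, not part of unda's API. Stochastic descent updates once per sample, while the minibatch variant averages gradients over a batch, which is what lets the per-sample work run concurrently.

// Gradient of the per-sample loss (w - target)^2 with respect to w
fn grad(w: f32, target: f32) -> f32 {
    2.0 * (w - target)
}

fn main() {
    let data: Vec<f32> = vec![1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0];
    let lr = 0.05;

    // Stochastic gradient descent: one parameter update per sample
    let mut w_sgd = 0.0;
    for &x in &data {
        w_sgd -= lr * grad(w_sgd, x);
    }

    // Minibatch gradient descent: average the gradient over each batch
    // and update once per batch
    let mut w_mb = 0.0;
    for batch in data.chunks(4) {
        let avg: f32 = batch.iter().map(|&x| grad(w_mb, x)).sum::<f32>() / batch.len() as f32;
        w_mb -= lr * avg;
    }

    // With more passes, both estimates approach the data mean (4.5)
    println!("sgd: {w_sgd:.4}, minibatch: {w_mb:.4}");
}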

TODO

Currently, Unda is in a very early beta stage; the following features are still in development:

[Neural Network Goals]

  • Create abstract representation for layers (Layer trait)
    • Dense
    • Convolutional
      • Categorical Crossentropy
      • SoftMax
    • Recurrent
  • Allow for different activation functions and learning rates on each layer
  • Adam Optimization in backprop
  • Helper Function for parsing CSV data
  • Helper Function for generating the MNIST dataset
  • Helper Functions for generating and deriving categorical data

If open source development is your thing, we at Unda would love additional work on anything that can be implemented. Please contact eversonb@msoe.edu if you'd like to help out!

License

Licensed under the Apache License, Version 2.0 http://www.apache.org/licenses/LICENSE-2.0 or the MIT license http://opensource.org/licenses/MIT, at your option. This file may not be copied, modified, or distributed except according to those terms.
