
About

Sprout is a simple machine learning library in Rust, built without any pre-existing ML or linear algebra libraries. I made Sprout to get a better understanding of ML concepts.

Key Features

  • Fully Connected Layers
  • Convolution Layers
  • Mini-Batch Gradient Descent (see the conceptual sketch after this list)
  • Normalizations
  • Model Saving/Loading to JSON
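
Mini-batch gradient descent updates the parameters after every small batch of samples, rather than after each individual sample or only once per full pass over the data. The snippet below is a conceptual sketch of that idea in plain Rust (a single linear neuron fit to a toy dataset); it is a generic illustration only, not Sprout's internal implementation or API.

// Conceptual sketch of mini-batch gradient descent on a single linear neuron.
// Illustration only; this is not Sprout's API or internals.
fn main() {
    // Toy data: learn y = 2*x + 1.
    let data: Vec<(f64, f64)> = (0..100)
        .map(|i| {
            let x = i as f64 / 10.0;
            (x, 2.0 * x + 1.0)
        })
        .collect();

    let (mut w, mut b) = (0.0_f64, 0.0_f64);
    let learning_rate = 0.01;
    let batch_size = 8;

    for _epoch in 0..500 {
        // One parameter update per mini-batch, instead of per sample or per epoch.
        for batch in data.chunks(batch_size) {
            let (mut grad_w, mut grad_b) = (0.0, 0.0);
            for &(x, y) in batch {
                let error = (w * x + b) - y; // gradient of 0.5 * error^2
                grad_w += error * x;
                grad_b += error;
            }
            let n = batch.len() as f64;
            w -= learning_rate * grad_w / n;
            b -= learning_rate * grad_b / n;
        }
    }

    println!("learned w = {w:.3}, b = {b:.3} (target: w = 2.0, b = 1.0)");
}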

How To Use

Sprout builds a model from a Vec of the included Layer struct, which is passed into the Network struct as shown here:
use Sprouts::{Layer::{Layer, LayerType}, network::Network, activation::ActivationFunction::*, loss_function::LossType::*};

let layers = vec![
    Layer::dense([2, 3], Sigmoid),
    Layer::dense([3, 1], Sigmoid),
];

// Network::new(layers, learning_rate, batch_size, loss_function);
let nn = Network::new(layers, 0.2, 1, MSE);

//Prints network's loss and epoch progress in the terminal
nn.dense_train(true);

//data: Vec<[Inputs, Outputs]>
let data: Vec<[Vec<f64>; 2]> = vec![
    [vec![1.0, 0.0], vec![0.0]],
    [vec![0.0, 0.0], vec![1.0]],
    [vec![1.0, 1.0], vec![1.0]],
    [vec![0.0, 1.0], vec![0.0]],
];  

//dense_train(data, epochs)
nn.dense_train(data.clone(), 10000);

for i in 0..data.len() {
    println!(
        "Input: {:?} || Output: {:?} || Target: {:?}",
        data[i][0].clone(),
        nn.dense_forward(data[i][0].clone()),
        data[i][1].clone()
    );
}

As of now, the only supported layers are convolutional and dense layers; pooling layers are next on the agenda.
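
For readers unfamiliar with the distinction, the sketch below shows what a valid-padding 2D convolution and a 2x2 max-pooling step compute on a small matrix. It is a conceptual illustration in plain Rust, not Sprout's API or internals.

// Conceptual sketch: what a conv layer and a pooling layer compute.
// Illustration only; this is not Sprout's API or internals.

// Valid (no padding) 2D convolution / cross-correlation of `input` with `kernel`.
fn conv2d(input: &[Vec<f64>], kernel: &[Vec<f64>]) -> Vec<Vec<f64>> {
    let (ih, iw) = (input.len(), input[0].len());
    let (kh, kw) = (kernel.len(), kernel[0].len());
    let mut out = vec![vec![0.0; iw - kw + 1]; ih - kh + 1];
    for r in 0..out.len() {
        for c in 0..out[0].len() {
            let mut sum = 0.0;
            for i in 0..kh {
                for j in 0..kw {
                    sum += input[r + i][c + j] * kernel[i][j];
                }
            }
            out[r][c] = sum;
        }
    }
    out
}

// 2x2 max pooling with stride 2: keeps the largest value in each window.
fn max_pool_2x2(input: &[Vec<f64>]) -> Vec<Vec<f64>> {
    let mut out = vec![vec![0.0; input[0].len() / 2]; input.len() / 2];
    for r in 0..out.len() {
        for c in 0..out[0].len() {
            let window = [
                input[2 * r][2 * c],
                input[2 * r][2 * c + 1],
                input[2 * r + 1][2 * c],
                input[2 * r + 1][2 * c + 1],
            ];
            out[r][c] = window.iter().cloned().fold(f64::MIN, f64::max);
        }
    }
    out
}

fn main() {
    let image = vec![
        vec![1.0, 2.0, 3.0, 0.0],
        vec![4.0, 5.0, 6.0, 1.0],
        vec![7.0, 8.0, 9.0, 2.0],
        vec![0.0, 1.0, 2.0, 3.0],
    ];
    // A simple horizontal edge-detection kernel.
    let kernel = vec![vec![1.0, 1.0], vec![-1.0, -1.0]];

    println!("conv output: {:?}", conv2d(&image, &kernel));
    println!("pooled input: {:?}", max_pool_2x2(&image));
}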

I will expand this README soon...

License

This project is licensed under the MIT License.
