
ksnn

ksnn, or Kosiorek's Simple Neural Networks, is a crate that simplifies the creation, training, and validation of a neural network. The crate is heavily inspired by "Neural Networks from Scratch in Python" by Harrison Kinsley & Daniel Kukieła.

4 releases (breaking)

0.3.0 Jul 18, 2022
0.2.0 Jul 6, 2022
0.1.2 Jul 5, 2022
0.1.1 Jul 5, 2022

#548 in Machine learning

21 downloads per month

MIT/Apache

68KB
1K SLoC

Crate TODOs

  • Improving crate efficiency, likely through multithreading or moving calculations from the CPU to the GPU
  • Addition of more network types, such as regression networks
  • Addition of more activation and entropy functions
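
For context on the activation names used in the example below: `ActivationReLU` refers to the rectified linear unit, which passes positive inputs through unchanged and zeroes out negatives. A minimal standalone sketch in plain Rust (for illustration only; this is not the crate's internal implementation):

```rust
/// ReLU: max(0, x), applied element-wise.
fn relu(x: f64) -> f64 {
    x.max(0.0)
}

/// Apply ReLU across a slice of values.
fn relu_slice(xs: &[f64]) -> Vec<f64> {
    xs.iter().map(|&x| relu(x)).collect()
}

fn main() {
    // Negative inputs become 0.0; positive inputs pass through.
    let out = relu_slice(&[-1.5, 0.0, 2.3]);
    println!("{:?}", out);
}
```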

Examples

How to create a ClassificationNetwork:

// Step 1: Get training data and the answers for that data, formatted as a 2d array.
let x_train = ndarray::arr2(&[
    [0.7, 0.29, 1.0, 0.55, 0.33, 0.27],
    [0.01, 0.08, 0.893, 0.14, 0.19, 0.98]
]);

let y_train = ndarray::arr2(&[
    [0, 0, 1],
    [0, 1, 0]
]);

// Step 2: Get testing data and the answers for that data, formatted as a 2d array.
let x_test = ndarray::arr2(&[
    [0.64, 0.456, 0.68, 0.1, 0.123, 0.32],
    [0.78, 0.56, 0.58, 0.12, 0.37, 0.46]
]);

let y_test = ndarray::arr2(&[
    [1, 0, 0],
    [0, 1, 0]
]);

// Step 3: Create the network.
let mut neural_network = ksnn::ClassificationNetwork::new(
    vec!["ActivationReLU", "ActivationReLU", "ActivationReLU", "SoftmaxLossCC"],
    vec![32, 64, 48, 3],
    ksnn::enable_dropout_layers(true),
    ksnn::optimizers::optimizer_adam_def(),
    &x_train,
);

// Step 4: Adjust dropout layers, if enabled.
neural_network.dropout_layers[0].rate = 0.8;
neural_network.dropout_layers[1].rate = 0.75;
neural_network.dropout_layers[2].rate = 0.9;

// Step 5: Adjust weight regularizers as desired.
neural_network.dense_layers[0].weight_regularizer_l2 = 5e-4;
neural_network.dense_layers[0].bias_regularizer_l2 = 5e-4;

neural_network.dense_layers[1].weight_regularizer_l2 = 5e-3;
neural_network.dense_layers[1].bias_regularizer_l2 = 5e-3;

neural_network.dense_layers[2].weight_regularizer_l2 = 5e-5;
neural_network.dense_layers[2].bias_regularizer_l2 = 5e-5;

// Step 6: Fit, or train, the network on the training data.
neural_network.fit(100, 1, x_train, y_train);
// Step 7: Test your trained network on data it hasn't seen before to see how well it
// deals with new information.
neural_network.validate(x_test, y_test);
// Step 8: Save your network to a file to be loaded and used later.
neural_network.save("my_network.json");
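
The final layer name `SoftmaxLossCC` suggests the common pairing of a softmax output layer with categorical cross-entropy loss. As a rough sketch of what that combination computes (plain Rust, an assumed reading of the name, not ksnn's actual code):

```rust
/// Softmax: exponentiate (shifted by the max for numerical stability)
/// and normalize so the outputs sum to 1 and can be read as probabilities.
fn softmax(logits: &[f64]) -> Vec<f64> {
    let max = logits.iter().cloned().fold(f64::NEG_INFINITY, f64::max);
    let exps: Vec<f64> = logits.iter().map(|&x| (x - max).exp()).collect();
    let sum: f64 = exps.iter().sum();
    exps.iter().map(|&e| e / sum).collect()
}

/// Categorical cross-entropy against a one-hot target: -ln(p[correct class]).
/// The probability is clamped away from zero to avoid ln(0).
fn cross_entropy(probs: &[f64], one_hot: &[f64]) -> f64 {
    probs
        .iter()
        .zip(one_hot)
        .map(|(&p, &t)| -t * p.max(1e-7).ln())
        .sum()
}

fn main() {
    let probs = softmax(&[2.0, 1.0, 0.1]);
    let loss = cross_entropy(&probs, &[1.0, 0.0, 0.0]);
    println!("probs = {:?}, loss = {:.4}", probs, loss);
}
```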

How to load a ClassificationNetwork:

let mut neural_network = ksnn::ClassificationNetwork::load("my_network.json");
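
Note that the `y_train` and `y_test` arrays in the creation example are one-hot encoded: each row contains a single 1 in the position of the correct class. A small standalone helper for building such rows from class indices (plain Rust, for illustration only, not part of the ksnn API):

```rust
/// Build a one-hot row of length `n_classes` with a 1.0 at `class_idx`.
fn one_hot(class_idx: usize, n_classes: usize) -> Vec<f64> {
    let mut row = vec![0.0; n_classes];
    row[class_idx] = 1.0;
    row
}

fn main() {
    // Class indices 2 and 1 over 3 classes reproduce the example's y_train rows.
    let y_train: Vec<Vec<f64>> = [2, 1].iter().map(|&c| one_hot(c, 3)).collect();
    println!("{:?}", y_train);
}
```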

Dependencies

~6–16MB
~193K SLoC