
fann-rs


Rust wrapper for the Fast Artificial Neural Network (FANN) library. This crate provides a safe interface to FANN on top of the low-level bindings fann-sys-rs.

Documentation

Usage

Add fann and libc to the list of dependencies in your Cargo.toml:

[dependencies]
fann = "*"
libc = "*"

and this to your crate root:

extern crate fann;
extern crate libc;

Usage examples are included in the Documentation.


lib.rs:

A Rust wrapper for the Fast Artificial Neural Network library.

A new neural network with random weights can be created with the Fann::new method, or, for different network topologies, with its variants Fann::new_sparse and Fann::new_shortcut. Existing neural networks can be saved to and loaded from files.
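A minimal sketch of the three constructors and of round-tripping a network through a file. The file name is a placeholder, and the save and from_file method names are assumptions to be checked against the Documentation:

extern crate fann;
use fann::Fann;

fn main() {
    // Fully connected network: 2 inputs, one hidden layer with 3 neurons, 1 output.
    let fann = Fann::new(&[2, 3, 1]).unwrap();
    // Sparsely connected network: create only 50% of the possible connections.
    let _sparse = Fann::new_sparse(0.5, &[2, 3, 1]).unwrap();
    // Shortcut network: each layer is also connected to all following layers.
    let _shortcut = Fann::new_shortcut(&[2, 1]).unwrap();
    // Save the network to a file and load it back. (Placeholder file name.)
    fann.save("my_net.net").unwrap();
    let _restored = Fann::from_file("my_net.net").unwrap();
}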

Similarly, training data sets can be loaded from and saved to human-readable files, or training data can be provided directly to the network as slices of floating point numbers.
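A sketch of both approaches; the TrainData::from_file, TrainData::save, Fann::on_data and incremental Fann::train calls used here are assumptions based on this description, so check them against the Documentation:

extern crate fann;
use fann::{Fann, TrainData};

fn main() {
    let mut fann = Fann::new(&[2, 3, 1]).unwrap();
    // Load a training data set from a human-readable file, and write a copy back out.
    let data = TrainData::from_file("test_files/xor.data").unwrap();
    data.save("xor_copy.data").unwrap();
    // Train on the whole data set.
    fann.on_data(&data).train(1000, 0.001).unwrap();
    // Or run a single incremental training step on raw slices:
    // inputs [-1.0, 1.0] with desired output [1.0].
    fann.train(&[-1.0, 1.0], &[1.0]).unwrap();
}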

Example:

extern crate fann;
use fann::{ActivationFunc, Fann, TrainAlgorithm, QuickpropParams};

fn main() {
    // Create a new network with two input neurons, a hidden layer with three neurons, and one
    // output neuron.
    let mut fann = Fann::new(&[2, 3, 1]).unwrap();
    // Configure the activation functions for the hidden and output neurons.
    fann.set_activation_func_hidden(ActivationFunc::SigmoidSymmetric);
    fann.set_activation_func_output(ActivationFunc::SigmoidSymmetric);
    // Use the Quickprop learning algorithm, with default parameters.
    // (Otherwise, Rprop would be used.)
    fann.set_train_algorithm(TrainAlgorithm::Quickprop(Default::default()));
    // Train for up to 500000 epochs, displaying progress information after intervals of 1000
    // epochs. Stop when the network's error on the training data drops to 0.001.
    let max_epochs = 500000;
    let epochs_between_reports = 1000;
    let desired_error = 0.001;
    // Train directly on data loaded from the file "xor.data".
    fann.on_file("test_files/xor.data")
        .with_reports(epochs_between_reports)
        .train(max_epochs, desired_error).unwrap();
    // The network now approximates the XOR problem:
    assert!(fann.run(&[-1.0,  1.0]).unwrap()[0] > 0.9);
    assert!(fann.run(&[ 1.0, -1.0]).unwrap()[0] > 0.9);
    assert!(fann.run(&[ 1.0,  1.0]).unwrap()[0] < 0.1);
    assert!(fann.run(&[-1.0, -1.0]).unwrap()[0] < 0.1);
}

FANN also supports cascade training, where the network's topology is changed during training by adding additional neurons:

extern crate fann;
use fann::{ActivationFunc, CascadeParams, Fann};

fn main() {
    // Create a new network with two input neurons and one output neuron.
    let mut fann = Fann::new_shortcut(&[2, 1]).unwrap();
    // Use the default cascade training parameters, but a higher weight multiplier:
    fann.set_cascade_params(&CascadeParams {
        weight_multiplier: 0.6,
        ..CascadeParams::default()
    });
    // Add up to 50 neurons, displaying progress information after each.
    // Stop when the network's error on the training data drops to 0.001.
    let max_neurons = 50;
    let neurons_between_reports = 1;
    let desired_error = 0.001;
    // Train directly on data loaded from the file "xor.data".
    fann.on_file("test_files/xor.data")
        .with_reports(neurons_between_reports)
        .cascade()
        .train(max_neurons, desired_error).unwrap();
    // The network now approximates the XOR problem:
    assert!(fann.run(&[-1.0,  1.0]).unwrap()[0] > 0.9);
    assert!(fann.run(&[ 1.0, -1.0]).unwrap()[0] > 0.9);
    assert!(fann.run(&[ 1.0,  1.0]).unwrap()[0] < 0.1);
    assert!(fann.run(&[-1.0, -1.0]).unwrap()[0] < 0.1);
}
