5 releases

0.1.4 Oct 14, 2022
0.1.3 Oct 14, 2022
0.1.2 Oct 14, 2022
0.1.1 Oct 14, 2022
0.1.0 Oct 14, 2022


A simple neural network written in Rust.


This implementation of a neural network using gradient descent is written completely from the ground up in Rust. You can specify the shape of the network as well as its learning rate. Additionally, you can choose from several predefined datasets, for example the XOR and CIRCLE datasets, which represent their respective functions inside the unit square, or more complex datasets like RGB_DONUT, which represents a donut-like shape with a rainbow-like color transition.
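To make the dataset idea concrete, here is a hypothetical sketch of how such datasets can be defined: each sample is a point (x, y) in the unit square labelled by the underlying function. The function names are illustrative and are not the crate's actual API.

```rust
/// XOR label: 1.0 when exactly one coordinate lies in the upper half
/// of the unit square (illustrative, not the crate's API).
fn xor_label(x: f64, y: f64) -> f64 {
    if (x > 0.5) != (y > 0.5) { 1.0 } else { 0.0 }
}

/// CIRCLE label: 1.0 inside a circle centred in the unit square
/// (radius chosen arbitrarily for this sketch).
fn circle_label(x: f64, y: f64) -> f64 {
    if (x - 0.5).powi(2) + (y - 0.5).powi(2) < 0.25 * 0.25 { 1.0 } else { 0.0 }
}

fn main() {
    assert_eq!(xor_label(0.9, 0.1), 1.0);
    assert_eq!(xor_label(0.9, 0.9), 0.0);
    assert_eq!(circle_label(0.5, 0.5), 1.0);
    assert_eq!(circle_label(0.0, 0.0), 0.0);
}
```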

Below, you can see a training process where the network is trying to learn the color values of the RGB_DONUT dataset.


The following features are currently implemented:

  • Optimizers
    1. Adam
    2. RMSProp
    3. SGD
  • Loss Functions
    1. Quadratic
  • Activation Functions
    1. Sigmoid
    2. ReLU
  • Layers
    1. Dense
  • Plotting
    1. Plotting the cost-history during training
    2. Plotting the final predictions, either in grayscale or RGB
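For reference, the activation and loss functions named above are simple to write down in plain Rust. This is a generic sketch of those formulas, not the crate's actual code:

```rust
/// Logistic sigmoid: squashes any real input into (0, 1).
fn sigmoid(x: f64) -> f64 {
    1.0 / (1.0 + (-x).exp())
}

/// Rectified linear unit: passes positives through, clamps negatives to 0.
fn relu(x: f64) -> f64 {
    x.max(0.0)
}

/// Quadratic (mean squared error) loss over a batch of predictions.
fn quadratic_loss(pred: &[f64], target: &[f64]) -> f64 {
    pred.iter()
        .zip(target)
        .map(|(p, t)| (p - t).powi(2))
        .sum::<f64>()
        / pred.len() as f64
}

fn main() {
    assert_eq!(sigmoid(0.0), 0.5);
    assert_eq!(relu(-3.0), 0.0);
    assert_eq!(relu(2.5), 2.5);
    assert!((quadratic_loss(&[1.0, 0.0], &[0.0, 0.0]) - 0.5).abs() < 1e-12);
}
```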


The process of creating and training the neural network is pretty straightforward:
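As a minimal from-scratch sketch of what creating and training looks like, the following trains a tiny 2-4-1 dense network with sigmoid activations, quadratic loss, and plain SGD on the XOR points. The shape, initial weights, and function names here are illustrative assumptions, not the crate's API:

```rust
fn sigmoid(x: f64) -> f64 {
    1.0 / (1.0 + (-x).exp())
}

/// Trains a 2-4-1 sigmoid network on XOR with plain SGD;
/// returns (initial, final) mean quadratic loss.
fn train_xor(epochs: usize, lr: f64) -> (f64, f64) {
    let data = [
        ([0.0, 0.0], 0.0),
        ([0.0, 1.0], 1.0),
        ([1.0, 0.0], 1.0),
        ([1.0, 1.0], 0.0),
    ];
    // Deterministic, asymmetric initial weights (a real crate would randomize).
    let mut w1 = [[0.5, -0.4], [-0.3, 0.6], [0.2, 0.3], [-0.6, -0.2]];
    let mut b1 = [0.1, -0.1, 0.05, -0.05];
    let mut w2 = [0.7, -0.6, 0.4, -0.3];
    let mut b2 = 0.05;

    let mean_loss = |w1: &[[f64; 2]; 4], b1: &[f64; 4], w2: &[f64; 4], b2: f64| {
        data.iter()
            .map(|&(x, t)| {
                let h: Vec<f64> = (0..4)
                    .map(|j| sigmoid(w1[j][0] * x[0] + w1[j][1] * x[1] + b1[j]))
                    .collect();
                let y = sigmoid((0..4).map(|j| w2[j] * h[j]).sum::<f64>() + b2);
                (y - t).powi(2)
            })
            .sum::<f64>()
            / data.len() as f64
    };

    let initial = mean_loss(&w1, &b1, &w2, b2);
    for _ in 0..epochs {
        for &(x, t) in &data {
            // Forward pass.
            let h: Vec<f64> = (0..4)
                .map(|j| sigmoid(w1[j][0] * x[0] + w1[j][1] * x[1] + b1[j]))
                .collect();
            let y = sigmoid((0..4).map(|j| w2[j] * h[j]).sum::<f64>() + b2);
            // Backward pass for the quadratic loss (y - t)^2.
            let dy = 2.0 * (y - t) * y * (1.0 - y);
            for j in 0..4 {
                let dh = dy * w2[j] * h[j] * (1.0 - h[j]);
                w2[j] -= lr * dy * h[j];
                w1[j][0] -= lr * dh * x[0];
                w1[j][1] -= lr * dh * x[1];
                b1[j] -= lr * dh;
            }
            b2 -= lr * dy;
        }
    }
    (initial, mean_loss(&w1, &b1, &w2, b2))
}

fn main() {
    let (before, after) = train_xor(5_000, 0.5);
    println!("loss: {before:.4} -> {after:.4}");
    assert!(after < before);
}
```

The same loop shape applies to the larger networks shown below; only the layer sizes, optimizer, and dataset change.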


Example Training Process

Below, you can see how the network learns:

Learning Animation


Final Result

(figure: final prediction for RGB_DONUT with SGD, network shape 2,64,64,64,64,64,3)

Cool training results


Big Network

(figures: final prediction and cost history for RGB_DONUT with RMSProp, network shape 2,128,128,128,3)

Small Network

(figures: final prediction and cost history for RGB_DONUT with SGD, network shape 2,8,8,8,3)


(figures: final prediction and cost history for XOR with SGD, network shape 2,8,8,8,1)

