5 releases
Version | Date
---|---
0.1.0 | May 31, 2024
0.0.4 | May 21, 2024
0.0.3 | May 19, 2024
0.0.2 | May 19, 2024
0.0.1 | May 18, 2024
#106 in Machine learning
645 downloads per month
74KB
984 lines
Modular neural network in Rust.
Create modular neural networks in Rust with ease! Intended for educational purposes; the vector operations are not thoroughly optimized.
Quickstart
use neurons::network::Network;
use neurons::{activation, objective, optimizer};

fn main() {
    let mut network = Network::new();
    network.add_layer(4, 50, activation::Activation::Linear, true);
    network.add_layer(50, 50, activation::Activation::Linear, true);
    network.add_layer(50, 1, activation::Activation::Linear, false);

    network.set_optimizer(
        optimizer::Optimizer::AdamW(
            optimizer::AdamW {
                learning_rate: 0.001,
                beta1: 0.9,
                beta2: 0.999,
                epsilon: 1e-8,
                decay: 0.01,
                momentum: vec![], // To be filled by the network
                velocity: vec![], // To be filled by the network
            }
        )
    );
    network.set_objective(
        objective::Objective::MSE, // Objective function
        Some((-1f32, 1f32))        // Gradient clipping
    );

    println!("{}", network);

    let (x, y) = ...; // Load data
    let epochs = 1000;
    let loss = network.learn(x, y, epochs); // Train the network
}
Examples can be found in the examples directory.
Progress
- Layer types
  - Dense
  - Feedback
  - Convolutional
- Activation functions
  - Linear
  - Sigmoid
  - Tanh
  - ReLU
  - LeakyReLU
  - Softmax
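For reference, the activations above are standard element-wise functions. A stand-alone sketch of three of them — these free functions are illustrative and not part of the crate's API:

```rust
// Illustrative definitions of three listed activations (not the crate's API).
fn relu(x: f32) -> f32 {
    x.max(0.0)
}

fn leaky_relu(x: f32, alpha: f32) -> f32 {
    if x >= 0.0 { x } else { alpha * x }
}

fn sigmoid(x: f32) -> f32 {
    1.0 / (1.0 + (-x).exp())
}

fn main() {
    println!("{}", relu(-2.0));    // 0
    println!("{}", sigmoid(0.0));  // 0.5
    println!("{}", leaky_relu(-2.0, 0.01));
}
```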
- Objective functions
  - AE
  - MAE
  - MSE
  - RMSE
  - CrossEntropy
  - BinaryCrossEntropy
  - KLDivergence
  - Huber
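As an example of the objectives listed, MSE is the average of squared prediction errors over a batch. A minimal sketch with a hypothetical helper function, not the crate's implementation:

```rust
// Mean squared error over a batch (hypothetical helper, not the crate's API).
fn mse(pred: &[f32], target: &[f32]) -> f32 {
    assert_eq!(pred.len(), target.len());
    let sum: f32 = pred.iter().zip(target).map(|(p, t)| (p - t).powi(2)).sum();
    sum / pred.len() as f32
}

fn main() {
    let pred = [0.5f32, 1.5, 2.0];
    let target = [1.0f32, 1.0, 2.0];
    // (0.25 + 0.25 + 0.0) / 3 ≈ 0.1667
    println!("MSE = {}", mse(&pred, &target));
}
```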
- Optimization techniques
  - SGD
  - SGDM
  - Adam
  - AdamW
  - RMSprop
  - Minibatch
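To illustrate one of the techniques above, a single SGD-with-momentum (SGDM) update step can be sketched as follows; the velocity accumulates a decaying sum of past gradients, and the weights move against it. The function is hypothetical, not the crate's implementation:

```rust
// One SGDM update: v <- beta * v + grad; w <- w - lr * v.
// Hypothetical free function, not the crate's API.
fn sgdm_step(w: &mut [f32], v: &mut [f32], grad: &[f32], lr: f32, beta: f32) {
    for i in 0..w.len() {
        v[i] = beta * v[i] + grad[i];
        w[i] -= lr * v[i];
    }
}

fn main() {
    let mut w = vec![1.0f32, -1.0];
    let mut v = vec![0.0f32, 0.0];
    let grad = [2.0f32, -2.0];
    sgdm_step(&mut w, &mut v, &grad, 0.1, 0.9);
    // On the first step the velocity equals the gradient,
    // so each weight moves by lr * grad.
    println!("{:?}", w);
}
```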
- Regularization
  - Dropout
  - Batch normalization
  - Early stopping
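Of the regularization items listed, early stopping is the simplest to show: stop training once the loss has failed to improve for `patience` consecutive epochs. An illustrative sketch, not the crate's API:

```rust
// Patience-based early stopping (illustrative, not the crate's API).
struct EarlyStopping {
    best: f32,
    patience: usize,
    bad_epochs: usize,
}

impl EarlyStopping {
    fn new(patience: usize) -> Self {
        Self { best: f32::INFINITY, patience, bad_epochs: 0 }
    }

    // Returns true once the loss has not improved for `patience` epochs.
    fn should_stop(&mut self, loss: f32) -> bool {
        if loss < self.best {
            self.best = loss;
            self.bad_epochs = 0;
        } else {
            self.bad_epochs += 1;
        }
        self.bad_epochs >= self.patience
    }
}

fn main() {
    let mut stopper = EarlyStopping::new(2);
    for (epoch, loss) in [1.0f32, 0.9, 0.95, 0.96, 0.97].iter().enumerate() {
        if stopper.should_stop(*loss) {
            println!("stopping at epoch {}", epoch); // stopping at epoch 3
            break;
        }
    }
}
```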
- Parallelization
  - Multi-threading
- Testing
  - Unit tests
  - Integration tests
- Other
  - Documentation
  - Type conversion (e.g. f32, f64)
  - Network type specification (e.g. f32, f64)
  - Saving and loading
  - Logging
  - Data from file
  - Custom tensor/matrix types
  - Custom random weight initialization
  - Plotting
Inspiration
Sources
Dependencies
~1.5–2MB
~26K SLoC