Smyl - A Machine Learning Library in Rust
Smyl is a machine learning library written in Rust, providing a set of tools and abstractions for building and training neural networks.
Features
- Matrix Operations: Smyl provides a Matrix struct for efficient matrix operations, including addition, subtraction, multiplication, and more (see the short sketch after this list).
- Activation Functions: Smyl includes common activation functions such as ReLU and Sigmoid.
- Layers: Smyl defines two main layer types, SignalLayer and SynapseLayer, which can be used to build neural network architectures.
- Macros (optional): With the macros feature enabled, Smyl provides a set of macros to simplify the creation of neural networks.
- Idx3 Support (optional): With the idx3 feature enabled, Smyl can read and process data in the IDX3 format, commonly used for storing images.
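As a quick, hedged sketch of what working with the Matrix type looks like, here is a minimal program that uses only constructions shown in the examples further down (tuple-struct construction, the subtraction operator, and dbg! printing); anything beyond that is not assumed here and should be checked on docs.rs:

use smyl::prelude::*;

fn main() {
    // Two 1x4 matrices, built with the same tuple-struct syntax as in the examples below.
    let a: SignalLayer<f64, 4> = Matrix([[1.0, 2.0, 3.0, 4.0]]);
    let b: SignalLayer<f64, 4> = Matrix([[0.5, 0.5, 0.5, 0.5]]);

    // Element-wise subtraction, the same operation used for the output error
    // (output3 - expected) in the hello-world example.
    let diff = a - b;
    dbg!(diff);
}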
Getting Started
To use Smyl, add the following to your Cargo.toml file:
[dependencies]
smyl = "0.1.1"
Then, in your Rust code, you can start with this hello-world example:
extern crate rand;
extern crate smyl;

use rand::{thread_rng, Rng};
use smyl::layer::LayerGradient;
use smyl::prelude::*;

fn main() {
    let mut rng = thread_rng();

    // A 1x4 input signal and the 1x4 target the network should learn to produce.
    let input: SignalLayer<f64, 4> = Matrix([[1.0, 0.0, 1.0, 0.5]]);
    let expected: SignalLayer<f64, 4> = Matrix([[1.0, 0.0, 0.5, 0.75]]);
    let learning_rate: f64 = 0.01;

    // Three randomly initialised layers chaining 4 -> 2 -> 2 -> 4, all using ReLU.
    let mut synapses1: SynapseLayer<f64, 4, 2, ReLU> = rng.gen();
    let mut synapses2: SynapseLayer<f64, 2, 2, ReLU> = rng.gen();
    let mut synapses3: SynapseLayer<f64, 2, 4, ReLU> = rng.gen();

    let total_epochs = 1_000_000;
    let chunk_epochs = 1;

    for _ in 0..total_epochs / chunk_epochs {
        // Gradients are accumulated over a chunk of epochs, then applied once.
        let mut gradient1 = LayerGradient::default();
        let mut gradient2 = LayerGradient::default();
        let mut gradient3 = LayerGradient::default();

        for _ in 0..chunk_epochs {
            // Forward pass through the three layers.
            let output1 = synapses1.forward(input);
            let output2 = synapses2.forward(output1);
            let output3 = synapses3.forward(output2);

            // Backward pass: start from the output error and propagate it back,
            // accumulating each layer's gradient along the way.
            let output_gradient3 = output3 - expected;
            let output_gradient2 =
                synapses3.backward(output2, output3, output_gradient3, &mut gradient3);
            let output_gradient1 =
                synapses2.backward(output1, output2, output_gradient2, &mut gradient2);
            synapses1.backward(input, output1, output_gradient1, &mut gradient1);

            dbg!(output3);
        }

        // Apply the accumulated gradients with the chosen learning rate.
        synapses3.apply_gradient(gradient3, learning_rate);
        synapses2.apply_gradient(gradient2, learning_rate);
        synapses1.apply_gradient(gradient1, learning_rate);
    }
}
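Note how the layer widths chain together here: 4 → 2 → 2 → 4, so each SynapseLayer's output feeds the next layer's input, and the final output has the same width as the expected target, which is what makes the error output3 - expected well formed.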
Alternatively, you might prefer using the ann! macro (available with the macros feature enabled):
use smyl::prelude::*;

fn main() {
    // Build a small network with the ann! macro.
    let mut ann = ann!(4, 6, 4);

    let input = Matrix([[1.0, 0.0, 1.0, 0.5]]);
    let expected = Matrix([[1.0, 0.0, 0.0, 0.0]]);

    let batch = 100;
    let learning_rate = 0.01;

    for i in 0..10000 {
        // Forward pass, then accumulate the gradient for this sample.
        let output = ann.forward(input);
        ann.backward(expected);

        // Every `batch` iterations, apply the accumulated gradient and print the output.
        if i % batch == 0 {
            ann.apply_chunk_gradient(learning_rate * batch as f64);
            println!("{output:?}");
        }
    }
}
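Here, ann!(4, 6, 4) presumably builds a network with a 4-wide input, a 6-wide hidden layer, and a 4-wide output, matching the 1x4 input and expected matrices; every batch iterations, apply_chunk_gradient is called with learning_rate * batch as f64 to apply the updates accumulated since the previous call.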
Examples
You can find more examples of how to use Smyl in the examples directory of the repository:
- Hello World: A simple example demonstrating the basic usage of the library.
- Macros: An example showcasing the use of macros for creating neural networks.
- MNIST: A more complex example that demonstrates training a neural network on the MNIST dataset.
Documentation
For more detailed documentation, please refer to the docs.rs page.
Contributing
Contributions are welcome! Please feel free to submit a pull request or open an issue for any suggestions or improvements.
License
This project is licensed under the GNU GPLv3 License. See the LICENSE file for details.
Acknowledgments
- Thanks to the Rust community for their support and contributions.
- Special thanks to the 3Blue1Brown channel for making the mathematics behind machine learning accessible.