#deep-learning #machine-learning #neural-network #pytorch

tinygrad

You like pytorch? You like micrograd? You love tinygrad! ❤️

✨️ tinygrad

A Rust crate for building and training neural networks. tinygrad provides a simple interface for defining tensors, performing forward and backward passes, and implementing basic operations such as dot products and summation.

🚀 Quick Start

Get started with the tinygrad library by following these simple steps:

  1. Install the tinygrad crate by adding the following to your Cargo.toml file:
[dependencies]
tinygrad = "0.1.0"
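(Alternatively, run cargo add tinygrad from your project directory to add the same dependency.)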
  2. Use the Tensor type, with the TensorTrait trait in scope, to create tensors and run forward and backward passes:
use ndarray::array;
use tinygrad::{Tensor, Context, TensorTrait};

// Create a tensor
let value = array![1.0, 2.0, 3.0];
let tensor = Tensor::new(value);

// Perform forward and backward passes
let mut ctx = Context::new();
let result = tensor.forward(&mut ctx, vec![tensor.get_value()]);
tensor.backward(&mut ctx, array![1.0, 1.0, 1.0].view());
  3. Implement custom operations by defining structs that implement the ForwardBackward trait, as in the dot-product example below (a summation sketch follows it):
use ndarray::ArrayView1;
use tinygrad::{ForwardBackward, Context};

// Example operation: Dot product
struct Dot;

impl ForwardBackward for Dot {
    fn forward(&self, _ctx: &mut Context, inputs: Vec<ArrayView1<f64>>) -> f64 {
        let input = &inputs[0];
        let weight = &inputs[1];
        input.dot(weight)
    }

    fn backward(&self, _ctx: &mut Context, _grad_output: ArrayView1<f64>) {
        // For z = input . weight, the gradient of the input is the weight
        // (and vice versa), each scaled by the upstream gradient. How the
        // saved inputs are retrieved and where the gradients are written
        // depends on Context, so the body is left as an exercise.
    }
}
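
The crate description also mentions summation. As a second illustration, here is a sketch of a hypothetical Sum operation that mirrors the Dot example above; the backward body is again left open, since this README does not show how Context exposes the saved inputs:

use ndarray::ArrayView1;
use tinygrad::{ForwardBackward, Context};

// Hypothetical example operation: summation
struct Sum;

impl ForwardBackward for Sum {
    fn forward(&self, _ctx: &mut Context, inputs: Vec<ArrayView1<f64>>) -> f64 {
        // Sum all elements of the single input vector
        inputs[0].sum()
    }

    fn backward(&self, _ctx: &mut Context, _grad_output: ArrayView1<f64>) {
        // For y = sum(x), dy/dx_i = 1 for every element, so the input's
        // gradient is the upstream gradient broadcast over x's shape.
        // Writing it back depends on Context, as in the Dot example.
    }
}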

🔧 Usage Example

use ndarray::array;
use tinygrad::{Tensor, Context, TensorTrait, ForwardBackward};

fn main() {
    let input = array![1.0, 2.0, 3.0];
    let weight = array![4.0, 5.0, 6.0];

    let input_tensor = Box::new(Tensor::new(input));
    let weight_tensor = Box::new(Tensor::new(weight));

    // `Dot` is the operation struct defined in the Quick Start above.
    let dot_fn = Dot;
    let mut ctx = Context::new();

    let inputs = vec![
        input_tensor.get_value(),
        weight_tensor.get_value(),
    ];
    let output = dot_fn.forward(&mut ctx, inputs);

    println!("Dot product: {:?}", output);

    let grad_output = array![1.0, 1.0, 1.0];
    dot_fn.backward(&mut ctx, grad_output.view());

    let grad_input = &input_tensor.grad;
    let grad_weight = &weight_tensor.grad;

    println!("Gradient for input: {:?}", grad_input);
    println!("Gradient for weight: {:?}", grad_weight);
}
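
With these inputs the dot product is 1·4 + 2·5 + 3·6 = 32. If backward is implemented to accumulate gradients as outlined in the Dot sketch above, grad_input would come out as the weight vector [4.0, 5.0, 6.0] and grad_weight as the input vector [1.0, 2.0, 3.0], each scaled by the all-ones upstream gradient.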

🧪 Testing

Run tests for the tinygrad crate using:

cargo test
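
If you add tests of your own, a minimal sketch might look like the following, assuming the Dot operation and the API shown above, with Dot defined in the enclosing module:

#[cfg(test)]
mod tests {
    use ndarray::array;
    use tinygrad::{Context, ForwardBackward};
    use super::Dot; // the example operation from the Quick Start

    #[test]
    fn dot_forward_computes_inner_product() {
        let x = array![1.0, 2.0, 3.0];
        let w = array![4.0, 5.0, 6.0];
        let mut ctx = Context::new();
        // 1*4 + 2*5 + 3*6 = 32
        let output = Dot.forward(&mut ctx, vec![x.view(), w.view()]);
        assert_eq!(output, 32.0);
    }
}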

🌐 GitHub Repository

You can access the source code for the tinygrad crate on GitHub.

🤝 Contributing

Contributions and feedback are welcome! If you'd like to contribute, report an issue, or suggest an enhancement, please engage with the project on GitHub. Your contributions help improve this crate for the community.

📘 Documentation

Full documentation for tinygrad is available on docs.rs.

📄 License

This project is licensed under the MIT License.
