4 releases (breaking)

| Version | Date |
|---|---|
| 0.4.0 | Feb 27, 2024 |
| 0.3.0 | Feb 23, 2024 |
| 0.2.1 | Feb 22, 2024 |
| 0.1.0 | Feb 9, 2024 |
| 0.0.0 | |
#94 in Machine learning
558 downloads per month
39KB
835 lines
# neat

Implementation of the NEAT algorithm using genetic-rs.
## Features

- **rayon** - Uses parallelization on the `NeuralNetwork` struct and adds the `rayon` feature to the `genetic-rs` re-export.
- **serde** - Adds the `NNTSerde` struct and allows for serialization of `NeuralNetworkTopology`.
- **crossover** - Implements the `CrossoverReproduction` trait on `NeuralNetworkTopology` and adds the `crossover` feature to the `genetic-rs` re-export.
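To opt into these features, one might declare them in `Cargo.toml`. This is a sketch, not copied from the crate's manifest; the version matches the latest release listed above, and the feature names follow the list:

```toml
[dependencies]
# Enable the optional features described above.
neat = { version = "0.4", features = ["rayon", "serde", "crossover"] }
```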
## How To Use

When working with this crate, you'll want to use the `NeuralNetworkTopology` struct in your agent's DNA, then use `NeuralNetwork::from` when you finally want to test its performance. The `genetic-rs` crate is also re-exported with the rest of this crate.

Here's an example of how one might use this crate:
```rust
use neat::*;

// `Foo` and `Bar` stand in for whatever other state your agent carries.
#[derive(Clone, RandomlyMutable, DivisionReproduction)]
struct MyAgentDNA {
    network: NeuralNetworkTopology<1, 2>,
    other_stuff: Foo,
}

impl GenerateRandom for MyAgentDNA {
    fn gen_random(rng: &mut impl rand::Rng) -> Self {
        Self {
            network: NeuralNetworkTopology::new(0.01, 3, rng),
            other_stuff: Foo::gen_random(rng),
        }
    }
}

struct MyAgent {
    network: NeuralNetwork<1, 2>,
    some_other_state: Bar,
}

impl From<&MyAgentDNA> for MyAgent {
    fn from(value: &MyAgentDNA) -> Self {
        Self {
            network: NeuralNetwork::from(&value.network),
            some_other_state: Bar::default(),
        }
    }
}

fn fitness(dna: &MyAgentDNA) -> f32 {
    let mut agent = MyAgent::from(dna);
    // ... use agent.network.predict() and agent.network.flush() throughout
    // multiple iterations, then return the resulting fitness score
}

fn main() {
    let mut rng = rand::thread_rng();
    let mut sim = GeneticSim::new(
        Vec::gen_random(&mut rng, 100),
        fitness,
        division_pruning_nextgen,
    );

    // ... simulate generations, etc.
}
```
## License

This crate falls under the MIT license.
## Dependencies

~0.7–1.6MB
~33K SLoC