#genetic-algorithm #algorithm #optimization #search-algorithms #metaheuristics

metaheuristics-nature (no-std)

A collection of nature-inspired metaheuristic algorithms

70 releases (44 stable)

10.0.1 Mar 27, 2024
9.2.3 Aug 24, 2023
9.2.2 Jun 18, 2023
8.0.4 Nov 21, 2022
0.8.0 Jul 27, 2021

#172 in Algorithms

363 downloads per month
Used in 2 crates (via four-bar)

MIT license

77KB
1.5K SLoC

metaheuristics-nature

A collection of nature-inspired metaheuristic algorithms. This crate provides an objective function trait, several well-known methods, and utility functions to help you implement your own search method.

This crate implements the following algorithms:

  • Real-coded Genetic Algorithm (RGA)
  • Differential Evolution (DE)
  • Particle Swarm Optimization (PSO)
  • Firefly Algorithm (FA)
  • Teaching-Learning Based Optimization (TLBO)

Supporting features:

  • Parallelizable seeded random number generator (RNG)
    • The RNG is reproducible in both single-threaded and multi-threaded programs.
  • Pareto front for Multi-Objective Optimization (MOO)
    • The objective function may return multiple fitness values.
    • The best solutions found so far are kept as a non-dominated set (the Pareto front), as sketched below.
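
To illustrate what keeping the best solutions "as a set" means, here is a minimal, crate-independent Rust sketch of Pareto dominance for minimization; the crate's actual internals may differ:

// Pareto-dominance sketch (not the crate's internal API).
// `a` dominates `b` if it is no worse in every objective and
// strictly better in at least one.
fn dominates(a: &[f64], b: &[f64]) -> bool {
    a.iter().zip(b).all(|(x, y)| x <= y) && a.iter().zip(b).any(|(x, y)| x < y)
}

// Insert a candidate's fitness values into a non-dominated set.
fn insert_front(front: &mut Vec<Vec<f64>>, ys: Vec<f64>) {
    if front.iter().any(|kept| dominates(kept, &ys)) {
        return; // dominated by an existing solution; discard it
    }
    front.retain(|kept| !dominates(&ys, kept)); // evict dominated entries
    front.push(ys);
}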

Each algorithm exposes the same API and default parameters to help you compare different implementations. For example, you can test another algorithm by replacing Rga with De.

use metaheuristics_nature as mh;

let mut report = Vec::with_capacity(20);

// Build and run the solver
let s = mh::Solver::build(mh::Rga::default(), mh::tests::TestObj)
    .seed(0)
    .task(|ctx| ctx.gen == 20)
    .callback(|ctx| report.push(ctx.best.get_eval()))
    .solve();
// Get the optimized parameters and the final fitness of your function
let (xs, p) = s.as_best();
// If `p` is a `WithProduct` type, it wraps the fitness values and the product
let err = p.ys();
let result = p.as_result();
// Get the history reports
let y2 = &report[2];
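
To demonstrate the swap mentioned above, here is a minimal sketch of the same build with the algorithm replaced; everything except the first argument is unchanged (assuming the same test objective):

// Same builder chain with De (Differential Evolution) instead of Rga
let s = mh::Solver::build(mh::De::default(), mh::tests::TestObj)
    .seed(0)
    .task(|ctx| ctx.gen == 20)
    .solve();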

What kinds of problems can be solved?

If your problem can be simulated and evaluated, these optimization methods can search for the best design for you! 🚀

Assume your simulation can be expressed as a function f that takes the parameters X and returns an evaluation value y. The optimization method then adjusts X = {x0, x1, ...} to obtain the smallest y. Their relationship can be written as f(X) = y.
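
As a concrete (hypothetical) example of such an f, the classic sphere benchmark sums the squared parameters; an optimizer would push X toward the all-zero point, where y = 0:

// f(X) = y: the sphere function, a common optimization benchmark
// whose minimum y = 0 lies at X = {0, 0, ...}.
fn f(xs: &[f64]) -> f64 {
    xs.iter().map(|x| x * x).sum()
}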

The number of parameters in X is called the "dimension". Imagine X as a coordinate in a multi-dimensional space and y as the weight of that "point". As the dimension increases, the problem becomes harder to search.

The "metaheuristic" algorithms use multiple points to search for the minimum value, which detects the local gradient across the most feasible solutions and keeps away from the local optimum, even with an unknown gradient or feasible region.

Please have a look at the API documentation for more information.

Gradient-based Methods

For more straightforward functions, for example when the first derivative is known, gradient-based methods such as OSQP are recommended for the fastest speed.
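
For contrast with the population approach, here is a toy gradient-descent loop for a one-dimensional function with a known derivative (OSQP itself targets quadratic programs; this only shows the gradient-based idea):

// Toy gradient descent on f(x) = (x - 3)^2, whose derivative
// f'(x) = 2 * (x - 3) is known in closed form.
fn main() {
    let mut x = 0.0_f64;
    let lr = 0.1; // learning rate (step size)
    for _ in 0..100 {
        let grad = 2.0 * (x - 3.0);
        x -= lr * grad; // step against the gradient
    }
    println!("x ≈ {x}"); // converges toward the minimum at x = 3
}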

Dependencies

~1.1–2.2MB
~42K SLoC