#algorithm #optimization #metaheuristic

metaheuristics-nature (no-std compatible)

A collection of nature-inspired meta-heuristic algorithms

Used in four-bar

MIT license




A collection of nature-inspired meta-heuristic algorithms. This crate provides an objective function trait, well-known methods, and tool functions that let you implement your own searching method.

This crate implements the following algorithms:

  • Real-coded Genetic Algorithm (RGA)
  • Differential Evolution (DE)
  • Particle Swarm Optimization (PSO)
  • Firefly Algorithm (FA)
  • Teaching-Learning Based Optimization (TLBO)

Each algorithm provides the same API and default parameters to help you compare different implementations. For example, you can try another algorithm by simply replacing Rga with De.

use metaheuristics_nature::{Rga, Solver};

// Build and run the solver (`MyFunc` stands for your own objective function type)
let s = Solver::build(Rga::default())
    .task(|ctx| ctx.gen == 20)
    .solve(MyFunc::new());
// Get the result from objective function
let ans = s.result();
// Get the optimized XY value of your function
let x = s.best_parameters();
let y = s.best_fitness();
// Get the history reports
let report = s.report();

What kinds of problems can be solved?

If your problem can be simulated and evaluated, optimization is an efficient way to find the best design! 🚀

Assuming that your simulation can be done with a function f that takes the parameters X and returns the evaluation value y, the optimization method will adjust X to obtain the smallest y. Their relationship can be written as f(X) = y.

The number of parameters in X is called the "dimension". Imagine X as a coordinate in a multi-dimensional space and y as the weight of that "point". The higher the dimension, the more difficult the search becomes.

The "meta-heuristic" algorithms use multiple points to search for the minimum value. Instead of following local gradients, they sample across most of the feasible solutions, which helps them escape local optima.
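The multi-point, gradient-free idea can be sketched in plain Rust with a naive random search (this is an illustration, not the crate's implementation; a tiny hand-rolled LCG is used so the example needs no external crates):

```rust
// Sphere test function: minimum 0 at the origin.
fn sphere(x: &[f64; 2]) -> f64 {
    x.iter().map(|v| v * v).sum()
}

fn main() {
    // Deterministic linear congruential generator producing values in [0, 1).
    let mut seed: u64 = 42;
    let mut rand01 = move || {
        seed = seed
            .wrapping_mul(6364136223846793005)
            .wrapping_add(1442695040888963407);
        (seed >> 11) as f64 / (1u64 << 53) as f64
    };

    let bound = 5.0; // search domain: [-5, 5] in each dimension
    let mut best = ([bound, bound], sphere(&[bound, bound]));

    // Evaluate many candidate points instead of computing a gradient.
    for _ in 0..10_000 {
        let x = [
            rand01() * 2.0 * bound - bound,
            rand01() * 2.0 * bound - bound,
        ];
        let y = sphere(&x);
        if y < best.1 {
            best = (x, y);
        }
    }
    println!("best fitness = {:.3}", best.1);
    assert!(best.1 < 0.1); // random sampling lands near the optimum of 0
}
```

Real meta-heuristics (RGA, DE, PSO, FA, TLBO) refine this idea by letting the population share information between iterations instead of sampling blindly.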

Please see the API documentation for a more detailed explanation.

