
# metaheuristics-nature

A collection of nature-inspired meta-heuristic algorithms. This crate provides an objective-function trait, well-known methods, and utility functions that let you implement your own search method.

This crate implements the following algorithms:

- Real-coded Genetic Algorithm (RGA)
- Differential Evolution (DE)
- Particle Swarm Optimization (PSO)
- Firefly Algorithm (FA)
- Teaching-Learning Based Optimization (TLBO)

Each algorithm shares the same API and default parameters, which makes it easy to compare different implementations. For example, you can test another algorithm by simply replacing `Rga` with `De`:

```rust
use metaheuristics_nature::{Rga, Solver};

// Build and run the solver
let s = Solver::build(Rga::default())
    .task(|ctx| ctx.gen == 20)
    .solve(MyFunc::new());
// Get the result from the objective function
let ans = s.result();
// Get the optimized XY values of your function
let x = s.best_parameters();
let y = s.best_fitness();
// Get the history reports
let report = s.report();
```

### What kinds of problems can be solved?

If your problem can be simulated and evaluated, optimization is an efficient way to find the best design! 🚀

Assuming that your simulation can be done with a function `f`: by inputting the parameters `X`, you get the evaluation value `y`, and the optimization method will try to adjust `X` to obtain the smallest `y`. Their relationship can be written as `f(X) = y`.
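As a concrete illustration of the `f(X) = y` relationship, here is a minimal stand-alone example. The sphere function is a common optimization benchmark (an assumption for illustration, not part of this crate) that maps a parameter vector `X` to a single evaluation value `y`:

```rust
// Sphere benchmark: f(X) = x1^2 + x2^2 + ... + xn^2.
// The global minimum is y = 0 at X = [0, 0, ..., 0].
fn sphere(xs: &[f64]) -> f64 {
    xs.iter().map(|x| x * x).sum()
}

fn main() {
    let x = [1.0, 2.0, 3.0];
    let y = sphere(&x); // 1 + 4 + 9 = 14
    println!("f({x:?}) = {y}");
}
```

An optimizer's job is then to adjust `X` until `y` is as small as possible.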

The number of parameters in `X` is called the "dimension". Imagine that `X` is a coordinate in a multi-dimensional space and `y` is the weight of that "point". The higher the dimension, the more difficult the problem is to search.

The "meta-heuristic" algorithms use multiple points to search for the minimum value: they detect the local gradient across most of the feasible solutions while keeping away from local optima.
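As a rough sketch of the multi-point idea, here is a plain greedy random search, far simpler than the algorithms this crate implements. The `sphere` objective and the tiny hand-rolled RNG are assumptions chosen to keep the sketch dependency-free:

```rust
// Sphere benchmark used as a stand-in objective: f(X) = sum of x_i^2.
fn sphere(xs: &[f64]) -> f64 {
    xs.iter().map(|x| x * x).sum()
}

// Minimal LCG so the sketch needs no external `rand` crate.
struct Lcg(u64);

impl Lcg {
    fn next_f64(&mut self) -> f64 {
        self.0 = self
            .0
            .wrapping_mul(6364136223846793005)
            .wrapping_add(1442695040888963407);
        (self.0 >> 11) as f64 / (1u64 << 53) as f64 // uniform in [0, 1)
    }
}

// Multi-point greedy random search: maintain several candidate points,
// perturb each one, and keep whichever of old/new has the lower fitness.
// Returns the best fitness found across the whole population.
fn optimize(seed: u64, dim: usize, pop: usize, gens: usize) -> f64 {
    let mut rng = Lcg(seed);
    // Start each point uniformly in [-5, 5]^dim.
    let mut points: Vec<Vec<f64>> = (0..pop)
        .map(|_| (0..dim).map(|_| rng.next_f64() * 10.0 - 5.0).collect())
        .collect();
    for _ in 0..gens {
        for p in points.iter_mut() {
            // Perturb every coordinate by up to ±0.5, keep the better point.
            let cand: Vec<f64> = p.iter().map(|x| x + rng.next_f64() - 0.5).collect();
            if sphere(&cand) < sphere(p) {
                *p = cand;
            }
        }
    }
    points.iter().map(|p| sphere(p)).fold(f64::INFINITY, f64::min)
}

fn main() {
    let best = optimize(42, 3, 8, 200);
    println!("best fitness: {best}");
}
```

Real meta-heuristics (DE, PSO, and the others listed above) improve on this by letting the points share information instead of searching independently.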

Please see the API documentation for a more detailed explanation.

#### Dependencies

~1.3–2MB

~37K SLoC