
# egobox

Rust toolbox for Efficient Global Optimization algorithms inspired by SMT.

The purpose of `egobox` is twofold:

- for developers: a set of Rust libraries useful to implement Bayesian optimization (EGO-like) algorithms,
- for end-users: a Python module, the Python binding of the implemented EGO-like optimizer, named `Egor`.

## The Rust libraries

The `egobox` Rust libraries consist of the following sub-packages:

Name | Description
---|---
`doe` | sampling methods; contains LHS, FullFactorial, Random methods
`gp` | Gaussian process regression; contains Kriging and PLS dimension reduction
`moe` | mixture of experts using GP models
`ego` | efficient global optimization with basic constraints and mixed-integer handling
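For a flavor of what `doe` provides, here is a dependency-free sketch of Latin Hypercube Sampling, the core technique behind its LHS method. This is illustrative only and not the `egobox-doe` API; the `Lcg` helper and function names are made up for the sketch so it needs no external `rand` crate.

```rust
// Tiny deterministic linear congruential generator, so the sketch
// stays free of external crates.
struct Lcg(u64);

impl Lcg {
    fn next_f64(&mut self) -> f64 {
        self.0 = self
            .0
            .wrapping_mul(6364136223846793005)
            .wrapping_add(1442695040888963407);
        // Take the top 53 bits to build a float in [0, 1).
        (self.0 >> 11) as f64 / (1u64 << 53) as f64
    }

    // Fisher-Yates shuffle driven by the LCG.
    fn shuffle(&mut self, v: &mut [usize]) {
        for i in (1..v.len()).rev() {
            let j = (self.next_f64() * (i + 1) as f64) as usize;
            v.swap(i, j.min(i));
        }
    }
}

/// n samples in [0, 1)^dim with exactly one sample per stratum
/// per dimension -- the defining LHS property.
fn lhs(n: usize, dim: usize, rng: &mut Lcg) -> Vec<Vec<f64>> {
    let mut samples = vec![vec![0.0; dim]; n];
    for d in 0..dim {
        // Assign each sample a distinct stratum in this dimension.
        let mut strata: Vec<usize> = (0..n).collect();
        rng.shuffle(&mut strata);
        for (i, &s) in strata.iter().enumerate() {
            // One point drawn uniformly inside stratum [s/n, (s+1)/n).
            samples[i][d] = (s as f64 + rng.next_f64()) / n as f64;
        }
    }
    samples
}

fn main() {
    let mut rng = Lcg(42);
    let pts = lhs(10, 2, &mut rng);
    assert_eq!(pts.len(), 10);
    // Check the LHS property: each of the 10 strata holds exactly
    // one point, in each dimension.
    for d in 0..2 {
        let mut occupied: Vec<usize> =
            pts.iter().map(|p| (p[d] * 10.0) as usize).collect();
        occupied.sort();
        assert_eq!(occupied, (0..10).collect::<Vec<_>>());
    }
    println!("first point: {:?}", pts[0]);
}
```

The real crate builds on `ndarray` and exposes configurable variants (e.g. optimized LHS), but the stratification idea is the same.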

### Usage

Depending on the sub-packages you want to use, add the following declarations to your `Cargo.toml`:

```toml
[dependencies]
egobox-doe = { version = "0.5.0" }
egobox-gp = { version = "0.5.0" }
egobox-moe = { version = "0.5.0" }
egobox-ego = { version = "0.5.0" }
```

### Features

#### `serializable-gp`

The `serializable-gp` feature enables the serialization of GP models using the `serde` crate.

#### `persistent-moe`

The `persistent-moe` feature enables `save()` and `load()` methods for the MoE model to/from a JSON file using the `serde` crate.
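To turn these features on, you would enable them per crate in your `Cargo.toml`, for instance (feature names as listed above):

```toml
[dependencies]
egobox-gp = { version = "0.5.0", features = ["serializable-gp"] }
egobox-moe = { version = "0.5.0", features = ["persistent-moe"] }
```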

### Examples

Examples (in each sub-package's `examples/` folder) are run as follows:

```bash
$ cd doe && cargo run --example samplings --release
$ cd gp && cargo run --example kriging --release
$ cd moe && cargo run --example clustering --release
$ cd ego && cargo run --example ackley --release
```

### BLAS/LAPACK backend (optional)

`egobox` relies on the linfa project for methods like clustering and dimension reduction, and also tries to adopt, as far as possible, the same coding structures.

As with `linfa`, the linear algebra routines used in `gp`, `moe`, and `ego` are provided by the pure-Rust linfa-linalg crate, the default linear algebra provider.

Otherwise, you can choose an external BLAS/LAPACK backend available through the ndarray-linalg crate. In this case, you have to specify the `blas` feature and a `linfa` BLAS/LAPACK backend feature (more information in linfa features).

For instance, to use `gp` with the Intel MKL BLAS/LAPACK backend, you could specify the following features in your `Cargo.toml`:

```toml
[dependencies]
egobox-gp = { version = "0.5.0", features = ["blas", "linfa/intel-mkl-static"] }
```

or you could run the `gp` example as follows:

```bash
$ cd gp && cargo run --example kriging --release --features blas,linfa/intel-mkl-static
```

## The Python optimizer Egor

Thanks to the PyO3 project, which makes Rust well suited for building Python extensions, the EGO algorithm written in Rust (aka `Egor`) is bound in Python. You can install the Python package using:

```bash
$ pip install egobox
```

See the tutorial notebook for usage of the optimizer.
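For intuition on what an EGO-style optimizer like `Egor` does at each iteration: it picks the next point to evaluate by maximizing an acquisition function, classically Expected Improvement (EI), computed from the GP's predicted mean and standard deviation. The sketch below is a self-contained illustration of the EI formula, not the egobox implementation; the `erf` approximation is a standard Abramowitz-Stegun polynomial used here to avoid external crates.

```rust
// Standard normal PDF.
fn pdf(x: f64) -> f64 {
    (-(x * x) / 2.0).exp() / (2.0 * std::f64::consts::PI).sqrt()
}

// erf via Abramowitz & Stegun 7.1.26 (max abs error ~1.5e-7).
fn erf(x: f64) -> f64 {
    let sign = if x < 0.0 { -1.0 } else { 1.0 };
    let x = x.abs();
    let t = 1.0 / (1.0 + 0.3275911 * x);
    let poly = ((((1.061405429 * t - 1.453152027) * t + 1.421413741) * t
        - 0.284496736)
        * t
        + 0.254829592)
        * t;
    sign * (1.0 - poly * (-x * x).exp())
}

// Standard normal CDF.
fn cdf(x: f64) -> f64 {
    0.5 * (1.0 + erf(x / std::f64::consts::SQRT_2))
}

/// EI(x) = (f_min - mu) * Phi(z) + sigma * phi(z), with
/// z = (f_min - mu) / sigma, where mu and sigma are the GP's
/// predicted mean and standard deviation at x, and f_min is the
/// best objective value observed so far (minimization).
fn expected_improvement(mu: f64, sigma: f64, f_min: f64) -> f64 {
    if sigma <= 0.0 {
        // No uncertainty: improvement is deterministic.
        return (f_min - mu).max(0.0);
    }
    let z = (f_min - mu) / sigma;
    (f_min - mu) * cdf(z) + sigma * pdf(z)
}

fn main() {
    // A point predicted well below the current best (with some
    // uncertainty) scores higher EI than one predicted at the best.
    let promising = expected_improvement(-1.0, 0.5, 0.0);
    let neutral = expected_improvement(0.0, 0.5, 0.0);
    println!("EI promising = {promising:.4}, EI neutral = {neutral:.4}");
    assert!(promising > neutral);
}
```

The optimizer then evaluates the true objective at the EI maximizer, updates the GP, and repeats.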

## Why egobox?

I started this library as a way to learn Rust and see if it could be used to implement algorithms like those in the SMT toolbox[^1]. As the first components (doe, gp) emerged, it appeared I could translate Python code almost line by line into Rust (well... after a great deal of borrow-checker fighting!), thanks to the Rust ndarray library ecosystem.

This library also relies on the linfa project, which aims at being the "scikit-learn-like ML library for Rust". Along the way I could contribute to `linfa` by porting the Gaussian mixture model (`linfa-clustering/gmm`) and partial least squares family methods (`linfa-pls`), confirming that translating Python algorithms into Rust can be pretty straightforward.

While I did not benchmark my Rust code exactly against the SMT Python one, from my debugging sessions I noticed I did not get such a great speed-up. Actually, algorithms like `doe` and `gp` rely extensively on linear algebra, and the famous Python libraries `numpy`/`scipy` are strongly optimized by calling compiled C or Fortran code.

My guess at this point is that the interest could come from some Rust algorithms built upon these initial building blocks; hence I started to implement the mixture-of-experts algorithm (`moe`) and, on top of it, the surrogate-based optimization EGO algorithm (`ego`), which gives its name to the library[^2][^3]. Aside from performance, such a library can also take advantage of the other Rust selling points.

## Cite

If you happen to find this Rust library useful for your research, you can cite this project as follows:

```bibtex
@Misc{egobox,
  author = {Rémi Lafage},
  title = {Egobox: efficient global optimization toolbox in Rust},
  year = {2020--},
  url = "https://github.com/relf/egobox"
}
```

[^1]: M. A. Bouhlel and J. T. Hwang and N. Bartoli and R. Lafage and J. Morlier and J. R. R. A. Martins. A Python surrogate modeling framework with derivatives. Advances in Engineering Software, 2019.

[^2]: Bartoli, Nathalie, et al. "Adaptive modeling strategy for constrained global optimization with application to aerodynamic wing design." Aerospace Science and technology 90 (2019): 85-102.

[^3]: Dubreuil, Sylvain, et al. "Towards an efficient global multidisciplinary design optimization algorithm." Structural and Multidisciplinary Optimization 62.4 (2020): 1739-1765.
