Mixture of experts

egobox-moe provides a Rust implementation of the mixture of experts algorithm. It is a Rust port of the mixture of experts implementation of the SMT Python library.
The big picture

egobox-moe is a library crate in the top-level package egobox.
Current state

egobox-moe currently implements mixtures of the Gaussian process models provided by egobox-gp:

- Clustering (linfa-clustering/gmm)
- Hard recombination / smooth recombination (see the sketch below)
- Gaussian process model choice: specify the allowed regression and correlation models
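The two recombination schemes determine how the per-cluster Gaussian process experts are combined into a single prediction. Below is a minimal, self-contained Rust sketch of that idea only; it is not the egobox-moe API, and the function names and data layout are illustrative assumptions.

```rust
// Illustrative sketch (not the crate's API): combining per-cluster expert
// predictions with hard vs. smooth recombination. `responsibilities` holds
// the cluster membership probabilities of one input point (e.g. from a GMM).

fn hard_recombination(responsibilities: &[f64], expert_preds: &[f64]) -> f64 {
    // Hard: use only the prediction of the most probable cluster's expert.
    let (best, _) = responsibilities
        .iter()
        .enumerate()
        .max_by(|a, b| a.1.partial_cmp(b.1).unwrap())
        .expect("at least one expert");
    expert_preds[best]
}

fn smooth_recombination(responsibilities: &[f64], expert_preds: &[f64]) -> f64 {
    // Smooth: blend all experts, weighted by the cluster probabilities.
    responsibilities
        .iter()
        .zip(expert_preds)
        .map(|(r, p)| r * p)
        .sum()
}

fn main() {
    let responsibilities = [0.8, 0.2]; // probabilities from the clustering step
    let expert_preds = [1.0, 3.0];     // each Gaussian process expert's prediction
    println!("hard:   {}", hard_recombination(&responsibilities, &expert_preds));   // 1.0
    println!("smooth: {}", smooth_recombination(&responsibilities, &expert_preds)); // 1.4
}
```

Hard recombination gives piecewise expert predictions with possible discontinuities at cluster boundaries, while smooth recombination blends experts according to the cluster probabilities and yields a continuous surrogate.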
Examples

There are usage examples in the examples/ directory. To run one, use:

$ cargo run --release --example clustering
License
Licensed under the Apache License, Version 2.0 (http://www.apache.org/licenses/LICENSE-2.0).