25 releases (16 breaking)

| Version | Date |
|---|---|
| 0.18.1 (new) | Apr 10, 2024 |
| 0.17.0 | Apr 4, 2024 |
| 0.16.0 | Mar 7, 2024 |
| 0.14.0 | Dec 13, 2023 |
| 0.4.0 | Jul 9, 2022 |
# Mixture of experts

`egobox-moe` provides a Rust implementation of the mixture of experts algorithm. It is a Rust port of the mixture-of-experts implementation from the SMT Python library.
## The big picture

`egobox-moe` is a library crate in the top-level package `egobox`.
## Current state

`egobox-moe` currently implements a mixture of the Gaussian processes provided by `egobox-gp`:

- Clustering (`linfa-clustering/gmm`)
- Hard recombination / smooth recombination
- Gaussian process model choice: specify the allowed regression and correlation models
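The two recombination modes above can be sketched in plain Rust. This is a minimal illustration of the idea, not the `egobox-moe` API: the experts are stand-in closures and the gating weights (e.g. cluster responsibilities from a GMM) are assumed to be given.

```rust
// Sketch of hard vs. smooth recombination of expert predictions.
// Experts and gating weights are hypothetical stand-ins, not egobox-moe types.

/// Hard recombination: use only the expert whose gating weight is highest.
fn predict_hard(experts: &[fn(f64) -> f64], weights: &[f64], x: f64) -> f64 {
    let best = weights
        .iter()
        .enumerate()
        .max_by(|a, b| a.1.partial_cmp(b.1).unwrap())
        .map(|(i, _)| i)
        .unwrap();
    experts[best](x)
}

/// Smooth recombination: blend all experts, weighted by their gating
/// probabilities (assumed to sum to 1).
fn predict_smooth(experts: &[fn(f64) -> f64], weights: &[f64], x: f64) -> f64 {
    experts.iter().zip(weights).map(|(e, w)| w * e(x)).sum()
}

fn main() {
    // Two toy experts: one linear, one quadratic.
    let experts: Vec<fn(f64) -> f64> = vec![|x| 2.0 * x, |x| x * x];
    // Gating weights for some input region.
    let weights = [0.8, 0.2];

    let x = 3.0;
    println!("hard:   {}", predict_hard(&experts, &weights, x));
    println!("smooth: {}", predict_smooth(&experts, &weights, x));
}
```

Hard recombination gives piecewise predictions with possible discontinuities at cluster boundaries, while smooth recombination yields a continuous blend of the experts.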
## Examples

There are some usage examples in the examples/ directory. To run one, use:

```bash
$ cargo run --release --example clustering
```
## License

Licensed under the Apache License, Version 2.0 (http://www.apache.org/licenses/LICENSE-2.0).
## Dependencies

~14–26MB, ~439K SLoC