Mixture of experts
egobox-moe provides a Rust implementation of the mixture of experts algorithm. It is a Rust port of the mixture of experts from the SMT Python library.
The big picture
egobox-moe is a library crate in the top-level package egobox.
Current state
egobox-moe currently implements a mixture of the Gaussian process models provided by egobox-gp:
- Clustering (linfa-clustering/gmm)
- Hard recombination / smooth recombination (see the sketch below)
- Gaussian process model choice: specify the allowed regression and correlation models
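To make the two recombination modes concrete, here is a minimal, self-contained sketch of how expert predictions can be combined. It is illustrative only and does not use the egobox-moe API: the Expert struct, the predict_hard/predict_smooth functions, and the hard-coded responsibilities (which in practice would come from the GMM clustering step) are all hypothetical.

```rust
// Toy "expert": a local linear model y = a * x + b fitted on one cluster.
struct Expert {
    a: f64,
    b: f64,
}

impl Expert {
    fn predict(&self, x: f64) -> f64 {
        self.a * x + self.b
    }
}

/// Hard recombination: only the expert whose cluster is most responsible
/// for `x` produces the prediction.
fn predict_hard(experts: &[Expert], responsibilities: &[f64], x: f64) -> f64 {
    let (best, _) = responsibilities
        .iter()
        .enumerate()
        .max_by(|(_, a), (_, b)| a.partial_cmp(b).unwrap())
        .expect("at least one expert");
    experts[best].predict(x)
}

/// Smooth recombination: all expert predictions are blended, weighted by
/// the cluster responsibilities (e.g. GMM posterior probabilities).
fn predict_smooth(experts: &[Expert], responsibilities: &[f64], x: f64) -> f64 {
    experts
        .iter()
        .zip(responsibilities)
        .map(|(e, w)| w * e.predict(x))
        .sum()
}

fn main() {
    // Two experts covering two regions of the input space.
    let experts = vec![Expert { a: 1.0, b: 0.0 }, Expert { a: -1.0, b: 4.0 }];
    // Hypothetical responsibilities for x = 2.1 (would come from the GMM).
    let responsibilities = vec![0.3, 0.7];
    let x = 2.1;

    println!("hard:   {}", predict_hard(&experts, &responsibilities, x));
    println!("smooth: {}", predict_smooth(&experts, &responsibilities, x));
}
```

Hard recombination keeps the surrogate piecewise (each point is served by exactly one expert), while smooth recombination blends experts near cluster boundaries, avoiding discontinuities in the predicted response.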
Examples
There are some usage examples in the examples/ directory. To run one, use:
$ cargo run --release --example clustering
License
Licensed under the Apache License, Version 2.0 (http://www.apache.org/licenses/LICENSE-2.0).