46 releases (breaking)
Uses the new Rust 2024 edition
| Version | Release date |
|---|---|
| 0.33.0 | Oct 16, 2025 |
| 0.31.0 | Jul 11, 2025 |
| 0.27.0 | Mar 13, 2025 |
| 0.25.0 | Dec 19, 2024 |
| 0.4.0 | Jul 9, 2022 |
#455 in Machine learning
252 downloads per month
Used in 4 crates (3 directly)
380KB
7.5K SLoC
Mixture of experts
egobox-moe provides a Rust implementation of the mixture of experts algorithm.
It is a Rust port of the mixture of experts implementation of the SMT Python library.
The big picture
egobox-moe is a library crate in the top-level package egobox.
Current state
egobox-moe currently implements a mixture of the Gaussian processes provided by egobox-gp:
- Clustering (linfa-clustering/gmm)
- Hard recombination / smooth recombination (see the sketch after this list)
- Gaussian process model choice: specify the allowed regression and correlation models
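To make the two recombination modes concrete, here is a minimal, self-contained sketch. It does not use the egobox-moe API: the two "experts" and the cluster responsibility below are hypothetical stand-ins, whereas the crate itself fits Gaussian process experts (from egobox-gp) and derives cluster responsibilities from a linfa-clustering GMM.

```rust
// Toy sketch of hard vs. smooth recombination (plain Rust, not the
// egobox-moe API). In the real crate the local experts are Gaussian
// processes and the responsibilities come from a GMM; here simple
// closed-form stand-ins are used instead.

/// Hypothetical expert trained on the x < 0 region.
fn expert_left(x: f64) -> f64 {
    -x
}

/// Hypothetical expert trained on the x >= 0 region.
fn expert_right(x: f64) -> f64 {
    x * x
}

/// Responsibility of the "right" cluster for input x.
/// Stands in for a GMM posterior probability.
fn responsibility_right(x: f64) -> f64 {
    1.0 / (1.0 + (-10.0 * x).exp())
}

/// Hard recombination: use only the expert of the most probable cluster.
fn predict_hard(x: f64) -> f64 {
    if responsibility_right(x) >= 0.5 {
        expert_right(x)
    } else {
        expert_left(x)
    }
}

/// Smooth recombination: probability-weighted blend of all experts.
fn predict_smooth(x: f64) -> f64 {
    let r = responsibility_right(x);
    r * expert_right(x) + (1.0 - r) * expert_left(x)
}

fn main() {
    for &x in &[-1.0, -0.1, 0.0, 0.1, 1.0] {
        println!(
            "x = {x:>5.2}  hard = {:>6.3}  smooth = {:>6.3}",
            predict_hard(x),
            predict_smooth(x)
        );
    }
}
```

Hard recombination switches abruptly at the cluster boundary, while smooth recombination blends the experts according to the cluster probabilities, which avoids discontinuities in the resulting surrogate.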
Examples
There are some usage examples in the examples/ directory. To run one, use:
$ cargo run --release --example clustering
License
Licensed under the Apache License, Version 2.0 (http://www.apache.org/licenses/LICENSE-2.0).
Dependencies
~21–36MB
~620K SLoC