46 releases (breaking)

Uses new Rust 2024

0.33.0 Oct 16, 2025
0.31.0 Jul 11, 2025
0.27.0 Mar 13, 2025
0.25.0 Dec 19, 2024
0.4.0 Jul 9, 2022

#455 in Machine learning

252 downloads per month
Used in 4 crates (3 directly)

Apache-2.0

380KB
7.5K SLoC

Mixture of experts

crates.io docs

egobox-moe provides a Rust implementation of the mixture of experts algorithm. It is a Rust port of the mixture of experts implementation in the SMT Python library.

The big picture

egobox-moe is a library crate in the top-level package egobox.

Current state

egobox-moe currently implements a mixture of Gaussian processes built on egobox-gp:

  • Clustering (linfa-clustering/gmm)
  • Hard recombination / Smooth recombination (illustrated in the sketch after this list)
  • Gaussian process model choice: specify the allowed regression and correlation models
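The hard/smooth distinction can be illustrated without the crate itself. The sketch below is plain Rust (standard library only) and is not egobox-moe's API: the two linear "experts", the Gaussian cluster weights, and all coefficients are made up for illustration. Hard recombination delegates each input to the expert of its most responsible cluster, while smooth recombination blends all experts' outputs using responsibility weights.

// Conceptual sketch only (std-only, not the egobox-moe API): a 1D mixture of
// two made-up "experts" combined by hard vs. smooth recombination.
// Cluster responsibilities come from Gaussian weights; each expert is a
// simple linear model standing in for a Gaussian process.

/// Unnormalized Gaussian weight of `x` for a cluster centered at `mu`.
fn gaussian_weight(x: f64, mu: f64, sigma: f64) -> f64 {
    (-(x - mu).powi(2) / (2.0 * sigma * sigma)).exp()
}

/// Smooth recombination: responsibility-weighted average of expert predictions.
/// Each expert is (cluster center, slope, intercept).
fn smooth_predict(x: f64, experts: &[(f64, f64, f64)]) -> f64 {
    let weights: Vec<f64> = experts
        .iter()
        .map(|&(mu, _, _)| gaussian_weight(x, mu, 1.0))
        .collect();
    let total: f64 = weights.iter().sum();
    experts
        .iter()
        .zip(&weights)
        .map(|(&(_, a, b), w)| (a * x + b) * (w / total))
        .sum()
}

/// Hard recombination: only the most responsible expert predicts.
fn hard_predict(x: f64, experts: &[(f64, f64, f64)]) -> f64 {
    let (_, a, b) = experts
        .iter()
        .copied()
        .max_by(|&(mu1, _, _), &(mu2, _, _)| {
            gaussian_weight(x, mu1, 1.0)
                .partial_cmp(&gaussian_weight(x, mu2, 1.0))
                .unwrap()
        })
        .unwrap();
    a * x + b
}

fn main() {
    // Two experts: one "fitted" around x = 0, one around x = 5 (made-up coefficients).
    let experts = [(0.0, 1.0, 0.0), (5.0, -2.0, 15.0)];
    for x in [0.0, 2.5, 5.0] {
        println!(
            "x = {x:.1}  hard = {:.3}  smooth = {:.3}",
            hard_predict(x, &experts),
            smooth_predict(x, &experts)
        );
    }
}

Near a cluster center both strategies agree; between clusters, hard recombination switches abruptly from one expert to the other, while smooth recombination interpolates between them.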

Examples

There are some usage examples in the examples/ directory. To run an example, use:

$ cargo run --release --example clustering

License

Licensed under the Apache License, Version 2.0 (http://www.apache.org/licenses/LICENSE-2.0).

Dependencies

~21–36MB
~620K SLoC