Mixture of experts

egobox-moe provides a Rust implementation of the mixture of experts algorithm. It is a port of the mixture of experts implementation of the SMT Python library.

The big picture

egobox-moe is a library crate in the top-level package egobox.

Current state

egobox-moe currently implements a mixture of the Gaussian process models provided by egobox-gp (a usage sketch follows the list below):

  • Clustering (linfa-clustering/gmm)
  • Hard recombination / Smooth recombination
  • Gaussian process model choice: specify the allowed regression and correlation models
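
As a rough illustration of how these pieces fit together, the following is a minimal sketch of clustering training data, fitting one Gaussian process expert per cluster, and querying the resulting mixture. The type and method names used here (GpMixture::params, Recombination::Smooth, predict) and the reliance on linfa's Dataset/Fit traits are assumptions made for illustration, not a confirmed API; refer to the docs.rs documentation of the egobox-moe version you use.

// Hypothetical usage sketch; the names below are assumptions, not the confirmed egobox-moe API.
use egobox_moe::{GpMixture, Recombination};
use linfa::prelude::*; // Dataset and Fit traits (assumed integration)
use ndarray::{array, Array2};

fn main() {
    // Toy 1D training data: y = x * sin(x) sampled at a few points.
    let xt: Array2<f64> = array![[0.0], [1.0], [2.0], [3.0], [4.0], [5.0]];
    let yt: Array2<f64> = xt.mapv(|x| x * x.sin());

    // Cluster the inputs with a GMM, fit one GP expert per cluster, then
    // recombine the expert predictions smoothly across cluster boundaries.
    let moe = GpMixture::params()
        .n_clusters(2)
        .recombination(Recombination::Smooth(None))
        .fit(&Dataset::new(xt, yt))
        .expect("MoE training failed");

    // Predict at a new point with the trained mixture.
    let y_pred = moe.predict(&array![[2.5]]).expect("prediction failed");
    println!("prediction at x = 2.5: {}", y_pred);
}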

Examples

There are some usage examples in the examples/ directory. To run one, use:

$ cargo run --release --example clustering

License

Licensed under the Apache License, Version 2.0 http://www.apache.org/licenses/LICENSE-2.0
