
Mixture of experts

crates.io docs

egobox-moe provides a Rust implementation of the mixture of experts algorithm. It is a Rust port of the mixture of experts from the SMT Python library.

The big picture

egobox-moe is a library crate in the top-level package egobox.

Current state

egobox-moe currently implements a mixture of the Gaussian processes provided by egobox-gp (see the usage sketch after the list below):

  • Clustering (linfa-clustering/gmm)
  • Hard recombination / Smooth recombination
  • Gaussian process model choice: specify the allowed regression and correlation models
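
As a rough illustration of how these pieces combine, the sketch below fits a two-cluster mixture on toy 1D data and predicts at new points. It assumes a linfa-style builder API; the names Moe, Recombination, Dataset::new, fit and predict are assumptions made for this example and may not match the crate's actual exports, so treat the examples/ directory and the docs.rs documentation as the reference.

use egobox_moe::{Moe, Recombination};   // assumed exports; real names may differ
use linfa::prelude::*;                  // assumed linfa-style Dataset / Fit / Predict traits
use ndarray::Array2;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Toy 1D data: a piecewise function whose two regimes suit two local experts.
    let xt = Array2::from_shape_fn((50, 1), |(i, _)| i as f64 / 50.0);
    let yt = xt.mapv(|x| if x < 0.5 { x * x } else { (10.0 * x).sin() });

    // Cluster the inputs (GMM under the hood), fit one GP per cluster,
    // and blend the local predictions smoothly at prediction time.
    let mixture = Moe::params()
        .n_clusters(2)
        .recombination(Recombination::Smooth(Some(0.5))) // or Recombination::Hard
        .fit(&Dataset::new(xt, yt))?;                     // assumed linfa Fit impl

    // Predict at a few new points.
    let x_new = Array2::from_shape_vec((3, 1), vec![0.1, 0.5, 0.9])?;
    println!("{}", mixture.predict(&x_new)?);             // assumed predict signature
    Ok(())
}

With hard recombination each prediction point is handled entirely by the expert of its closest cluster, while the smooth variant weights all experts by cluster membership probabilities.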

Examples

There are some usage examples in the examples/ directory. To run one, use:

$ cargo run --release --example clustering

License

Licensed under the Apache License, Version 2.0 http://www.apache.org/licenses/LICENSE-2.0
