40 releases

Uses the Rust 2021 edition

0.6.0 Aug 9, 2022
0.6.0-rc.2 Jul 31, 2022
0.5.1 Feb 16, 2022
0.4.7 Aug 14, 2021
0.0.10 Feb 12, 2018

#11 in Math

Download history: roughly 900–2,100 downloads per week between 2022-04-27 and 2022-08-10.

7,948 downloads per month
Used in 15 crates (10 directly)

MIT/Apache-2.0 and maybe LGPL-3.0

1MB
19K SLoC

Mathematical optimization in pure Rust

Website | Book | Docs (latest release) | Docs (main branch) | Examples (latest release) | Examples (main branch)


argmin is a numerical optimization library written entirely in Rust.

argmin's goal is to offer a wide range of optimization algorithms with a consistent interface. It is type-agnostic by design, meaning that any type and/or math backend, such as nalgebra or ndarray, can be used -- even your own.
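To illustrate the idea of a type-agnostic design, here is a minimal pure-Rust sketch: a solver step written against a small trait rather than a concrete vector type, so that `Vec<f64>`, nalgebra, ndarray, or a custom type could all back it. The trait and function names here are hypothetical illustrations, not argmin's actual API.

```rust
// Hypothetical trait capturing the one operation this step needs.
trait VectorOps {
    /// Returns self - factor * other, element-wise.
    fn scaled_sub(&self, other: &Self, factor: f64) -> Self;
}

// A plain Vec<f64> backend; nalgebra or ndarray impls would look similar.
impl VectorOps for Vec<f64> {
    fn scaled_sub(&self, other: &Self, factor: f64) -> Self {
        self.iter().zip(other).map(|(a, b)| a - factor * b).collect()
    }
}

/// One gradient-descent step, generic over the backing vector type.
fn descent_step<V: VectorOps>(param: &V, gradient: &V, step_size: f64) -> V {
    param.scaled_sub(gradient, step_size)
}

fn main() {
    let x = vec![1.0, 2.0];
    let g = vec![0.5, 0.5];
    let next = descent_step(&x, &g, 0.1);
    println!("{:?}", next); // approximately [0.95, 1.95]
}
```

Because the solver only depends on the trait, swapping math backends requires no change to the algorithm itself.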

Observers allow one to track the progress of iterations, either via the provided observers for logging to screen or disk, or by implementing your own.

An optional checkpointing mechanism helps to mitigate the negative effects of crashes in unstable computing environments.

Due to Rust's powerful generics and traits, most features can be replaced with your own tailored implementations.

argmin is designed to simplify the implementation of optimization algorithms and as such can also be used as a toolbox for the development of new algorithms. One can focus on the algorithm itself, while the handling of termination, parameter vectors, populations, gradients, Jacobians and Hessians is taken care of by the library.
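The division of labour described above can be sketched in pure Rust: the "algorithm" reduces to a single update line, while a small driver handles iteration counting and termination. This is an illustrative standalone sketch, not argmin's API.

```rust
/// Runs gradient descent: `grad` computes the gradient, while the loop
/// handles the bookkeeping (iteration limit, convergence test).
fn run_solver<G>(grad: G, mut x: f64, step: f64, tol: f64, max_iters: u32) -> f64
where
    G: Fn(f64) -> f64,
{
    for _ in 0..max_iters {
        let g = grad(x);
        if g.abs() < tol {
            break; // converged: gradient magnitude below tolerance
        }
        x -= step * g; // the actual "algorithm" is this one line
    }
    x
}

fn main() {
    // Minimize f(x) = (x - 3)^2, whose gradient is 2(x - 3).
    let minimum = run_solver(|x| 2.0 * (x - 3.0), 0.0, 0.1, 1e-8, 1000);
    println!("{minimum}"); // close to 3.0
}
```

In argmin, the equivalent bookkeeping (plus parameter vectors, populations, Jacobians, and Hessians) is handled by the library, so a new solver only implements its core iteration.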

Algorithms

  • Line searches
    • Backtracking line search
    • Moré-Thuente line search
    • Hager-Zhang line search
  • Trust region method
    • Cauchy point method
    • Dogleg method
    • Steihaug method
  • Steepest descent
  • Conjugate gradient method
  • Nonlinear conjugate gradient method
  • Newton methods
    • Newton’s method
    • Newton-CG
  • Quasi-Newton methods
    • BFGS
    • L-BFGS
    • DFP
    • SR1
    • SR1-TrustRegion
  • Gauss-Newton method
  • Gauss-Newton method with linesearch
  • Golden-section search
  • Landweber iteration
  • Brent’s method
  • Nelder-Mead method
  • Simulated Annealing
  • Particle Swarm Optimization
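As a concrete taste of one of the solvers listed above, here is golden-section search sketched in pure Rust for a unimodal function on an interval. This is a standalone illustration, not argmin's implementation.

```rust
/// Golden-section search: shrinks [a, b] around the minimum of a unimodal
/// function by the inverse golden ratio each iteration.
fn golden_section_search<F: Fn(f64) -> f64>(f: F, mut a: f64, mut b: f64, tol: f64) -> f64 {
    let inv_phi = (5f64.sqrt() - 1.0) / 2.0; // 1/phi ≈ 0.618
    let mut c = b - inv_phi * (b - a);
    let mut d = a + inv_phi * (b - a);
    while (b - a).abs() > tol {
        // Keep the sub-interval that must contain the minimum.
        if f(c) < f(d) {
            b = d;
        } else {
            a = c;
        }
        c = b - inv_phi * (b - a);
        d = a + inv_phi * (b - a);
    }
    (a + b) / 2.0
}

fn main() {
    // Minimum of f(x) = (x - 2)^2 on [0, 5] is at x = 2.
    let xmin = golden_section_search(|x| (x - 2.0) * (x - 2.0), 0.0, 5.0, 1e-6);
    println!("{xmin}"); // close to 2.0
}
```

A production version would cache the two function evaluations across iterations; the sketch recomputes them for clarity.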

License

Licensed under either of

  • Apache License, Version 2.0
  • MIT license

at your option.

Contribution

Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in the work by you, as defined in the Apache-2.0 license, shall be dual licensed as above, without any additional terms or conditions.

Dependencies

~0.8–3.5MB
~70K SLoC