
Devtimer

The compact yet complete benchmarking suite for Rust. Period.


I've seen many, many benchmarking tools, but few of them embrace the simplicity that actually speeds up development. devtimer provides a compact yet complete benchmarking suite for code written in Rust, built on the standard library alone. You can benchmark a single operation, or run an operation multiple times and get the minimum, maximum and average execution times. Since this crate has no external dependencies, it is small, fast and does exactly what it claims to. Happy benchmarking!
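Since devtimer uses the standard library only, a simple timer of this kind can be sketched with nothing but `std::time::Instant`. The `SimpleTimer` type below is a hypothetical illustration of the idea, not devtimer's actual implementation:

```rust
use std::time::{Duration, Instant};

/// A minimal stopwatch in the spirit of a simple timer
/// (illustrative only; not devtimer's real implementation).
struct SimpleTimer {
    start: Option<Instant>,
    elapsed: Option<Duration>,
}

impl SimpleTimer {
    fn new() -> Self {
        SimpleTimer { start: None, elapsed: None }
    }
    /// Starts (or restarts) the timer.
    fn start(&mut self) {
        self.elapsed = None;
        self.start = Some(Instant::now());
    }
    /// Stops the timer, recording the elapsed time.
    fn stop(&mut self) {
        self.elapsed = self.start.map(|s| s.elapsed());
    }
    /// Elapsed time in nanoseconds, if the timer was started and stopped.
    fn time_in_nanos(&self) -> Option<u128> {
        self.elapsed.map(|d| d.as_nanos())
    }
}

fn main() {
    let mut timer = SimpleTimer::new();
    timer.start();
    std::thread::sleep(Duration::from_millis(10));
    timer.stop();
    // sleep() guarantees at least 10 ms, i.e. 10_000_000 ns
    assert!(timer.time_in_nanos().unwrap() >= 10_000_000);
    println!("took {} ns", timer.time_in_nanos().unwrap());
}
```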


Add this to your `Cargo.toml`:

devtimer = "*"

Then add this line to your source file (i.e. `main.rs`, `lib.rs`, or wherever you need it):

use devtimer::DevTime;

Example usage

Simple usage

Let's say there are two functions, very_long_operation() and another_op(), that take a very long time to execute. We can time their execution as shown below:

fn main() {
    let mut timer = DevTime::new_simple();
    timer.start();
    very_long_operation();
    timer.stop();
    println!("The operation took: {} ns", timer.time_in_nanos().unwrap());
    // You can keep re-using the timer for other operations
    timer.start(); // this resets the timer and starts it again
    another_op();
    timer.stop();
    println!("The operation took: {} secs", timer.time_in_secs().unwrap());
    println!("The operation took: {} milliseconds", timer.time_in_millis().unwrap());
    println!("The operation took: {} microseconds", timer.time_in_micros().unwrap());
    println!("The operation took: {} ns", timer.time_in_nanos().unwrap());

    // With version 1.1.0 and upwards
    timer.start_after(&std::time::Duration::from_secs(2));
    // The timer will start after two seconds
    // Do some huge operation now
    timer.stop();
    println!("The operation took: {} nanoseconds", timer.time_in_nanos().unwrap());
}

Example: Benchmarking (for 3.0.0 and up)

use devtimer::run_benchmark;
fn main() {
  // We will simulate a long operation with std::thread::sleep()
  // Run 10 iterations for the test
  let bench_result = run_benchmark(10, |_| {
    // Fake a long running operation
    std::thread::sleep(std::time::Duration::from_secs(1));
  });
  // Print the minimum, maximum and average execution times
  bench_result.print_stats();
}
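The min, max and average statistics such a run-through reports can be computed with the standard library alone. The `bench()` helper below is a hypothetical sketch of the idea, not devtimer's actual `run_benchmark()`:

```rust
use std::time::{Duration, Instant};

/// Times `iterations` runs of `op` and reports the fastest, slowest and
/// mean run in nanoseconds (illustrative sketch, not devtimer's code).
fn bench<F: FnMut()>(iterations: u32, mut op: F) -> (u128, u128, u128) {
    let mut times: Vec<u128> = Vec::with_capacity(iterations as usize);
    for _ in 0..iterations {
        let start = Instant::now();
        op();
        times.push(start.elapsed().as_nanos());
    }
    let min = *times.iter().min().unwrap();
    let max = *times.iter().max().unwrap();
    let avg = times.iter().sum::<u128>() / times.len() as u128;
    (min, max, avg)
}

fn main() {
    // Fake a slow operation with a short sleep
    let (min, max, avg) = bench(10, || {
        std::thread::sleep(Duration::from_millis(1));
    });
    println!("min: {} ns, max: {} ns, avg: {} ns", min, max, avg);
    assert!(min <= avg && avg <= max);
}
```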

Example: Tagged timers (for 3.0.0 and up)

use devtimer::DevTime;
fn main() {
  let mut cmplx = DevTime::new_complex();
  // Create a timer with tag `timer-1`
  cmplx.create_timer("timer-1").unwrap();
  cmplx.start_timer("timer-1").unwrap();
  // Simulate a slow operation
  std::thread::sleep(std::time::Duration::from_micros(10));
  cmplx.stop_timer("timer-1").unwrap();
  // Create a timer with tag `cool-timer`
  cmplx.create_timer("cool-timer").unwrap();
  cmplx.start_timer("cool-timer").unwrap();
  // Simulate a slow operation
  std::thread::sleep(std::time::Duration::from_micros(10));
  cmplx.stop_timer("cool-timer").unwrap();

  // We can output a benchmark in this way
  println!("`cool-timer` took: {} us", cmplx.time_in_micros("cool-timer").unwrap());

  // Or we can iterate through all timers
  for (tname, timer) in cmplx.iter() {
    println!("{} - {} ns", tname, timer.time_in_nanos().unwrap());
  }

  // Or we can print results in the default '{timername} - {time} ns' format
  cmplx.print_results();
}
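Conceptually, tagged timers are just a map from tag to elapsed time. The std-only sketch below illustrates that idea; it is hypothetical and not how devtimer's complex timers are actually implemented:

```rust
use std::collections::HashMap;
use std::time::{Duration, Instant};

fn main() {
    // A tag -> elapsed-time map, sketching what a set of tagged
    // timers tracks (illustrative only).
    let mut timers: HashMap<&str, Duration> = HashMap::new();

    for tag in ["timer-1", "cool-timer"] {
        let start = Instant::now();
        std::thread::sleep(Duration::from_millis(2)); // simulate a slow operation
        timers.insert(tag, start.elapsed());
    }

    // Iterate over all timers and print each result
    for (tag, elapsed) in &timers {
        println!("{} - {} ns", tag, elapsed.as_nanos());
    }
    // sleep() guarantees at least 2 ms per timed operation
    assert!(timers["cool-timer"].as_nanos() >= 2_000_000);
}
```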

Timing functions available (names are self explanatory):

  • time_in_secs() -> Returns the number of seconds the operation took
  • time_in_millis() -> Returns the number of milliseconds the operation took
  • time_in_micros() -> Returns the number of microseconds the operation took
  • time_in_nanos() -> Returns the number of nanoseconds the operation took
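These four getters are different views of one elapsed duration, mirroring the accessors on `std::time::Duration` (1 s = 1,000 ms = 1,000,000 µs = 1,000,000,000 ns). The example below shows the conversions using plain std, with no devtimer needed:

```rust
use std::time::Duration;

fn main() {
    // One elapsed duration viewed at the four granularities
    // exposed by the timing functions
    let elapsed = Duration::new(2, 500_000_000); // 2.5 seconds
    assert_eq!(elapsed.as_secs(), 2);            // whole seconds, truncated
    assert_eq!(elapsed.as_millis(), 2_500);      // milliseconds
    assert_eq!(elapsed.as_micros(), 2_500_000);  // microseconds
    assert_eq!(elapsed.as_nanos(), 2_500_000_000); // nanoseconds
    println!("{} s / {} ms", elapsed.as_secs(), elapsed.as_millis());
}
```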

See the full docs here.

Why are there no tests?

Well, I can't think of any test that would run uniformly across all systems. If I did something like:

let mut timer = DevTime::new_simple();
timer.start();
std::thread::sleep(std::time::Duration::from_secs(2));
timer.stop();
assert_eq!(timer.time_in_secs().unwrap(), 2);

It can easily fail (and has failed), because system calls take time, and that time differs from system to system. A second-level assertion like the one above will usually pass, but compared at microsecond or nanosecond granularity the same tests have failed multiple times. Hence I decided to omit all tests from this crate.
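For illustration, one common workaround (not adopted by this crate) is to assert a range instead of an exact value. The lower bound is reliable because sleep() guarantees a minimum duration; any upper bound remains a machine-dependent guess:

```rust
use std::time::{Duration, Instant};

fn main() {
    let start = Instant::now();
    std::thread::sleep(Duration::from_millis(50));
    let elapsed = start.elapsed();
    // sleep() guarantees *at least* the requested time, so the lower
    // bound is safe; the upper bound is a heuristic and can still
    // fail on a heavily loaded machine.
    assert!(elapsed >= Duration::from_millis(50));
    assert!(elapsed < Duration::from_millis(500));
    println!("slept for {} ms", elapsed.as_millis());
}
```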


This project is licensed under the Apache-2.0 License. Keep coding and benchmarking!
