8 releases

| Version | Date |
|---|---|
| 0.4.3 | Jan 23, 2026 |
| 0.4.2 | Oct 18, 2023 |
| 0.4.1 | Jun 3, 2021 |
| 0.3.0 | Apr 20, 2017 |
| 0.1.1 | Apr 20, 2017 |
# entropy

A Rust library for calculating Shannon entropy and metric entropy of byte sequences.
## Installation

Add this to your `Cargo.toml`:

```toml
[dependencies]
entropy = "0.4"
```
## Usage

```rust
use entropy::{shannon_entropy, metric_entropy};

// Calculate Shannon entropy (in bits)
let h = shannon_entropy("hello, world");
assert_eq!(h, 3.0220551);

// Works with byte slices too
let h = shannon_entropy(b"\x00\x01\x02\x03");

// Calculate metric entropy (shannon_entropy / input.len())
let m = metric_entropy("hello, world");
assert_eq!(m, 0.25183794);
```
## What is Shannon Entropy?

Shannon entropy measures the average amount of information per symbol in a message, expressed in bits. For byte data, it ranges from 0 to 8 bits per byte:
| Entropy (bits/byte) | Meaning |
|---|---|
| 0 | Completely predictable (a single repeated byte, e.g., "aaaa") |
| 8 | Maximum randomness (all 256 byte values equally likely) |
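To make the formula concrete, here is a minimal from-scratch sketch of Shannon entropy over byte frequencies, H = -Σ pᵢ·log2(pᵢ). This is an independent illustration of the math, not the crate's actual implementation:

```rust
/// Shannon entropy of a byte slice, in bits per byte:
/// H = -sum(p_i * log2(p_i)) over the byte values that occur.
/// (Illustrative sketch only, not the crate's code.)
fn byte_entropy(data: &[u8]) -> f64 {
    if data.is_empty() {
        return 0.0;
    }
    // Count how often each of the 256 possible byte values occurs.
    let mut counts = [0usize; 256];
    for &b in data {
        counts[b as usize] += 1;
    }
    let len = data.len() as f64;
    // Sum -p * log2(p) over the byte values actually present.
    counts
        .iter()
        .filter(|&&c| c > 0)
        .map(|&c| {
            let p = c as f64 / len;
            -p * p.log2()
        })
        .sum()
}

fn main() {
    println!("{}", byte_entropy(b"aaaa")); // single repeated byte: 0 bits
    println!("{}", byte_entropy(b"\x00\x01\x02\x03")); // 4 equally likely bytes: 2 bits
    println!("{:.7}", byte_entropy(b"hello, world"));
}
```

Only byte values with a nonzero count contribute, which sidesteps the undefined `0 · log2(0)` term.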
## Common Use Cases
- Cryptography - Measuring randomness of keys or random number generators
- Data compression - Estimating how compressible data is
- Malware analysis - Detecting packed or encrypted executables
- Password strength - Estimating password complexity
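As a sketch of the malware-analysis use case: scan a buffer in fixed-size windows and flag regions whose entropy is high, since packed or encrypted sections tend toward 8 bits/byte. The `byte_entropy` helper below is a hypothetical from-scratch stand-in for the crate's `shannon_entropy`, and the window size and threshold are illustrative choices, not recommended values:

```rust
// Stand-in entropy helper for this sketch (the crate's shannon_entropy
// could be used instead).
fn byte_entropy(data: &[u8]) -> f64 {
    let mut counts = [0usize; 256];
    for &b in data {
        counts[b as usize] += 1;
    }
    let len = data.len() as f64;
    counts
        .iter()
        .filter(|&&c| c > 0)
        .map(|&c| {
            let p = c as f64 / len;
            -p * p.log2()
        })
        .sum()
}

/// Return (offset, entropy) for each fixed-size window whose entropy
/// exceeds `threshold` bits per byte.
fn high_entropy_windows(data: &[u8], window: usize, threshold: f64) -> Vec<(usize, f64)> {
    data.chunks(window)
        .enumerate()
        .map(|(i, chunk)| (i * window, byte_entropy(chunk)))
        .filter(|&(_, h)| h > threshold)
        .collect()
}

fn main() {
    // Low-entropy zero padding followed by a region containing every
    // byte value once (entropy 8 bits/byte, like encrypted data).
    let mut buf = vec![0u8; 256];
    buf.extend(0..=255u8);
    for (offset, h) in high_entropy_windows(&buf, 256, 7.0) {
        println!("high entropy at offset {offset}: {h:.2} bits/byte");
    }
}
```

Real tools typically use overlapping or sliding windows for finer localization; non-overlapping chunks keep the sketch short.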