# hrw-hash
A minimalistic implementation of the Highest Random Weight (HRW) hashing algorithm, also known as rendezvous hashing, as described in "A Name-Based Mapping Scheme for Rendezvous" by Thaler and Ravishankar (1996).
The weighted variant of the HRW algorithm is implemented using the logarithmic method described in "Weighted Distributed Hash Tables" by Schindelhauer and Schomaker (2005).
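For intuition, the logarithmic method boils down to a per-node score of the form `-weight / ln(h)`, where `h` is the node's hash for a given key mapped into the interval (0, 1); the node with the highest score wins. The snippet below is only an illustrative sketch of that scoring rule, not the crate's actual internals:

```rust
/// Illustrative only: score a node for a key using the logarithmic
/// method. `hash` is the combined 64-bit hash of (node, key) and
/// `weight` is the node's capacity. Not the crate's exact code.
fn weighted_score(hash: u64, weight: f64) -> f64 {
    // Map the hash into (0, 1); the +1/+2 offsets avoid ln(0).
    let h = (hash as f64 + 1.0) / (u64::MAX as f64 + 2.0);
    // ln(h) < 0, so the score is positive; larger weights and
    // hashes closer to 1 produce larger scores.
    -weight / h.ln()
}
```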
To constrain the number of hashing operations, the implementation hashes each node and each key only once (instead of computing `nodes * keys` hashes). This optimization is nicely explained in the "Rendezvous Hashing: The Path to Faster Hashes Calculation" blog post.
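Concretely, this means each node's hash can be computed once when the structure is built and each key's hash once per lookup, with the two values merged by a cheap bit-mixing step rather than re-hashing every (node, key) pair. A possible shape of such a mixer (illustrative only; the crate's actual mixing function may differ):

```rust
/// Illustrative only: merge a precomputed node hash with a per-key
/// hash using a splitmix64-style finalizer, instead of re-hashing
/// the (node, key) pair from scratch.
fn combine(node_hash: u64, key_hash: u64) -> u64 {
    let mut x = node_hash ^ key_hash;
    x ^= x >> 30;
    x = x.wrapping_mul(0xbf58476d1ce4e5b9);
    x ^= x >> 27;
    x = x.wrapping_mul(0x94d049bb133111eb);
    x ^= x >> 31;
    x
}
```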
## Features
- Absolutely minimalistic implementation with sane defaults.
- Allows weighted nodes.
- Optimized for performance and memory usage. No wasted re-hashing.
- Allows a massive number of nodes (`O(log(n))` lookup time instead of `O(n)`).
## Motivation
Given an iterator of nodes (`IntoIterator<Item = Node>`), the aim is to produce a sorted list of references to these nodes (`Iterator<Item = &Node>`) for any given key. This list serves as a priority-sorted list of destination nodes for the key.

Both weighted and non-weighted nodes are supported.
## Usage
For non-weighted nodes:
```rust
use hrw_hash::HrwNodes;

// Anything that implements `IntoIterator<Item = Hash + Eq>` can
// be used as the list of target nodes.
let hrw = HrwNodes::new((0..10).map(|i| format!("node{}", i)));

// For a given key, get an iterator over node references
// (sorted by their HRW weight).
let shard_id = 0;
let replicas: Vec<&String> = hrw.sorted(&shard_id).take(3).collect();
assert_eq!(replicas, vec!["node1", "node6", "node4"]);
```
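Because `sorted` yields nodes in priority order, failover comes almost for free: skip the nodes that are currently unavailable and take the next-best candidates. A small sketch (the `down` set is hypothetical and not part of the crate):

```rust
use hrw_hash::HrwNodes;
use std::collections::HashSet;

let hrw = HrwNodes::new((0..10).map(|i| format!("node{}", i)));

// Hypothetical set of nodes that are currently unavailable.
let down: HashSet<&str> = ["node1"].into_iter().collect();

// Pick the first three live nodes in HRW priority order.
let replicas: Vec<&String> = hrw.sorted(&0)
    .filter(|node| !down.contains(node.as_str()))
    .take(3)
    .collect();
assert_eq!(replicas.len(), 3);
```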
For weighted nodes, which can have different capacities:
```rust
use hrw_hash::{WeightedHrwNodes, WeightedNode};

// Define a node
// (anything that implements `Hash + Eq` can be used as a node).
#[derive(Debug, PartialEq, Eq, Hash)]
struct Node {
    id: u64,
    capacity: usize,
}

impl Node {
    fn new(id: u64, capacity: usize) -> Self {
        Self { id, capacity }
    }
}

// Implement the `WeightedNode` trait for the node.
impl WeightedNode for Node {
    fn capacity(&self) -> usize {
        self.capacity
    }
}

let mut nodes = Vec::new();
for i in 0..100 {
    // Regular nodes have the same capacity.
    nodes.push(Node::new(i, 1));
}
// Add some nodes with higher capacities.
nodes.push(Node::new(100, 50));
nodes.push(Node::new(101, 20));

let hrw = WeightedHrwNodes::new(nodes);

// Nodes `100` and `101` have higher capacity, so they will be
// selected more often -- even though there are many other nodes.
assert_eq!(hrw.sorted(&"foobar1").next(), Some(&Node::new(29, 1)));
assert_eq!(hrw.sorted(&"foobar2").next(), Some(&Node::new(78, 1)));
assert_eq!(hrw.sorted(&"foobar3").next(), Some(&Node::new(100, 50)));
assert_eq!(hrw.sorted(&"foobar4").next(), Some(&Node::new(101, 20)));
assert_eq!(hrw.sorted(&"foobar5").next(), Some(&Node::new(100, 50)));
```
## License

MIT