aarc
Quickstart
- `Arc` / `Weak`: drop-in replacements for the standard library's `Arc` and `Weak`, but implemented with deferred reclamation semantics.
- `AtomicArc` / `AtomicWeak`: variants of `Arc` and `Weak` with atomically updatable pointers, supporting standard atomic operations like `load` and `compare_exchange`.
- `Guard`: a novel smart pointer that can be loaded from an `AtomicArc` or `AtomicWeak`, designed to reduce contention when multiple threads operate on the same atomic variable. It prevents deallocation but does not contribute to reference counts. (This was renamed from `Snapshot` in an earlier version, to reduce confusion.) See the sketch after this list for these types in action.
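A minimal sketch of how these types fit together. This is illustrative, not from the crate's docs: it assumes an `AtomicArc::new(Option<T>)` constructor; the `load`, `Guard` deref, and `Arc::from(&Guard)` conversions all appear in the Treiber stack example further below.

```rust
use aarc::{Arc, AtomicArc};

fn main() {
    // Assumed constructor: AtomicArc::new takes an Option holding the initial value.
    let atomic = AtomicArc::new(Some(53usize));

    // load yields an Option<Guard<T>>; the Guard keeps the value alive
    // without contributing to the strong reference count.
    let guard = atomic.load().unwrap();
    assert_eq!(*guard, 53);

    // Promote the Guard to a full Arc when longer-term ownership is needed.
    let arc = Arc::from(&guard);
    assert_eq!(*arc, 53);
}
```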
Motivation
Data structures built with `Arc` typically require locks for synchronization, because only the reference counts can be updated atomically, not the pointer or the contained data. While locks are often the right approach, lock-free data structures can offer better theoretical and practical performance guarantees in highly contended settings.
Instead of protecting in-place updates with locks, an alternative approach is to perform copy-on-write updates by atomically installing pointers. To avoid use-after-free, mechanisms for safe memory reclamation (SMR) are typically employed (e.g. hazard pointers, epoch-based reclamation). `aarc` uses the blazingly fast algorithm provided by the `fast-smr` crate and builds on top of it, hiding unsafety and providing convenient RAII semantics through reference-counted pointers.
Examples
Example 1: Treiber Stack
```rust
use std::ptr::null;

use aarc::{Arc, AsPtr, AtomicArc, Guard};

struct StackNode {
    val: usize,
    next: Option<Arc<Self>>,
}

struct Stack {
    top: AtomicArc<StackNode>,
}

impl Stack {
    fn push(&self, val: usize) {
        let mut top = self.top.load();
        loop {
            // Snapshot the current top as a raw pointer for the CAS below.
            let top_ptr = top.as_ref().map_or(null(), AsPtr::as_ptr);
            let new_node = Arc::new(StackNode {
                val,
                next: top.as_ref().map(Arc::from), // promote the Guard to an Arc
            });
            // Try to install the new node; on failure, retry with the observed top.
            match self.top.compare_exchange(top_ptr, Some(&new_node)) {
                Ok(()) => break,
                Err(before) => top = before,
            }
        }
    }

    fn pop(&self) -> Option<Guard<StackNode>> {
        let mut top = self.top.load();
        while let Some(top_node) = top.as_ref() {
            // Try to swing top to the next node; on failure, retry with the actual top.
            match self
                .top
                .compare_exchange(top_node.as_ptr(), top_node.next.as_ref())
            {
                Ok(()) => return top,
                Err(actual_top) => top = actual_top,
            }
        }
        None
    }
}
```
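A hedged usage sketch for the stack above, reusing the `Stack` and `StackNode` types from the example. It is not part of the original example and assumes `AtomicArc` implements `Default` as an empty (null) atomic pointer so an empty `Stack` can be constructed:

```rust
use std::thread;

fn main() {
    // Assumption: AtomicArc::default() is an empty (null) atomic pointer.
    // std::sync::Arc is used here only to share the Stack itself across threads.
    let stack = std::sync::Arc::new(Stack {
        top: Default::default(),
    });

    // Push from several threads concurrently; no locks are taken.
    let handles: Vec<_> = (0..4usize)
        .map(|i| {
            let s = stack.clone();
            thread::spawn(move || s.push(i))
        })
        .collect();
    for h in handles {
        h.join().unwrap();
    }

    // Drain the stack; each pop hands back a Guard<StackNode>.
    let mut count = 0;
    while let Some(node) = stack.pop() {
        println!("popped {}", node.val);
        count += 1;
    }
    assert_eq!(count, 4);
}
```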
Roadmap
- relax atomic orderings from SeqCst to Acq/Rel
- add tagged pointers
- add more tests and stabilize APIs