#async #futures #concurrency


A hassle-free data type for asynchronous programming





Add the dependency to your Cargo.toml:

desync = "0.8"

Desync provides a new synchronisation type, Desync<T>, which works by ordering operations on its enclosed data type instead of the traditional method of using mutexes to protect critical sections. This allows concurrency to be built around two basic operations:

  • desync_thing.sync(|thing| /* ... */) for synchronous access to the data
  • desync_thing.desync(|thing| /* ... */) for asynchronous access to the data - running the supplied task in the background.

If only the sync() operation is used, this is roughly equivalent to a standard Mutex, except with much stronger guarantees about which thread gets the data first. The other operation, desync(), effectively replaces the need to spawn threads and move data around in order to add concurrency to a program.

Desync also provides equivalent methods for async code: future_sync() will perform an operation in the current async context and future_desync() will schedule an operation in the background. These can be freely mixed with the sync() and desync() operations, making it straightforward to combine code using traditional threading with code using async futures. As Desync uses order-of-operations to guarantee exclusive access to the data, these operations can borrow the contained data across any awaits that might be needed, unlike a lock guard from the Mutex type, which can't be sent between threads.

Desync provides fairly strong ordering guarantees: in particular, when any of the methods return, the ordering of the operation is guaranteed relative to any following operation. This property makes desync code quite easy to follow and less prone to race conditions than traditional threading. The ability to easily schedule updates asynchronously provides a way around common scenarios where the need to lock multiple mutexes can create deadlocks.

Quick start

Desync provides a single type, Desync<T> that can be used to replace both threads and mutexes. This type schedules operations for a contained data structure so that they are always performed in order and optionally in the background.

Such a Desync object can be created like so:

use desync::Desync;
let number = Desync::new(0);

It supports two main operations. desync() will schedule a new job for the object that runs on a background thread. It's useful for deferring long-running operations and for moving updates off the current thread so they can run in parallel.

let number = Desync::new(0);
number.desync(|val| {
    // Long update here
    *val = 42;
});

// We can carry on with what we're doing; the update is now running in the background

The other operation is sync, which schedules a job to run synchronously on the data structure. This is useful for retrieving values from a Desync.

let new_number = number.sync(|val| *val);           // = 42

Desync objects always run operations in the order that is provided, so all operations are serialized from the point of view of the data that they contain. When combined with the ability to perform operations asynchronously, this provides a useful way to immediately parallelize long-running operations.

Working with futures

Desync has support for the futures library. The simplest operation is future_sync(), which creates a future that runs asynchronously on a Desync object but, unlike desync(), can return a result. It works like this:

let future_number = number.future_sync(|val| future::ready(*val));
assert!(executor::block_on(async { future_number.await.unwrap() }) == 42);

There is also a future_desync() operation, which can be used in cases where the thread is expected to block. It can be used in the same situations as future_sync() but has a detach() method to leave the task running in the background, or a sync() method to wait for the result to be computed.

Desync can also run streams in the background, via the pipe_in() and pipe() functions. These work on Arc<Desync<T>> references and process a stream asynchronously; they provide a powerful way to handle input and to connect Desync objects together using message passing for communication.

let some_object = Arc::new(Desync::new(some_object));

pipe_in(Arc::clone(&some_object), some_stream,
    |some_object, input| some_object.process(input));

let output_stream = pipe(Arc::clone(&some_object), some_stream,
    |some_object, input| some_object.process_with_output(input));

