
parallel_reader


A utility (with no dependencies) for reading from a file (or any Read stream) and processing it in chunks, in parallel.

This is useful when reading is not the bottleneck, but the work you need to do on the data is slow and easily parallelizable.

Examples might be:

  • hashing
  • compression
  • sending chunks over a network

This crate provides a function, read_stream_and_process_chunks_in_parallel, that takes a Read stream, a chunk size, a number of threads, and a processing function, and takes care of reading the stream and handing chunks off to worker threads.

Your processing function can also return an error; in that case processing stops early and the error is returned to you, along with the byte offset of the chunk that was being processed when it failed.
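
For illustration, here is a minimal usage sketch. The exact signature is not spelled out in this README, so the argument order, the Arc-wrapped closure, the error type, and the input file name below are assumptions; see docs.rs for the actual API.

```rust
use std::fs::File;
use std::sync::Arc;

use parallel_reader::read_stream_and_process_chunks_in_parallel;

fn main() -> std::io::Result<()> {
    // Hypothetical input file; any type implementing Read should work here.
    let mut file = File::open("big-input.bin")?;

    // Assumed argument order: (reader, chunk size, thread count, per-chunk function).
    // The per-chunk function is assumed to receive the chunk's byte offset and its
    // bytes, and may return an error to stop processing early.
    let result = read_stream_and_process_chunks_in_parallel(
        &mut file,
        1024 * 1024, // 1 MiB chunks
        4,           // worker threads
        Arc::new(|offset: u64, data: &[u8]| -> Result<(), String> {
            // Stand-in for slow, parallelizable work: hashing, compression,
            // sending the chunk over a network, etc.
            println!("got {} bytes at offset {}", data.len(), offset);
            Ok(())
        }),
    );

    if let Err(e) = result {
        // The returned error is described above as including the offset of
        // the chunk that failed.
        eprintln!("stopped early: {:?}", e);
    }
    Ok(())
}
```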

For now, this uses only synchronous, blocking I/O, but maybe in the future I'll add another function that uses async streams and runs futures in parallel. PRs are welcome.
