ai-dataloader
A Rust port of the PyTorch DataLoader library.
Highlights
- Iterable or indexable (map-style) DataLoader.
- Customizable Sampler, BatchSampler, and collate_fn.
- Parallel dataloader using rayon for indexable datasets (experimental).
- Integration with ndarray and tch-rs, with CPU and GPU support.
- Default collate function that automatically collates most of your types (supports nesting).
- Shuffling for iterable and indexable DataLoader.
More info in the documentation.
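Conceptually, collating transposes a batch of samples into batched columns. The sketch below illustrates that idea on the `(label, text)` pairs used later in this README; it is standalone Rust, not the crate's actual collate_fn API.

```rust
// Illustrative only: "collate" a batch of (label, text) samples into
// a pair of batched columns, which is what a default collate function
// conceptually does before any conversion to ndarray/tch tensors.
fn collate(batch: Vec<(i32, &str)>) -> (Vec<i32>, Vec<&str>) {
    // unzip splits the vector of pairs into two parallel vectors.
    batch.into_iter().unzip()
}

fn main() {
    let batch = vec![(0, "hola"), (1, "hello")];
    let (labels, texts) = collate(batch);
    println!("{labels:?} {texts:?}");
}
```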
Examples
Examples can be found in the examples folder, but here is a simple one:
```rust
use ai_dataloader::DataLoader;

let loader = DataLoader::builder(vec![(0, "hola"), (1, "hello"), (2, "hallo"), (3, "bonjour")])
    .batch_size(2)
    .shuffle()
    .build();

for (label, text) in &loader {
    println!("Label {label:?}");
    println!("Text {text:?}");
}
```
tch-rs integration
In order to collate your data into Torch tensors that can run on the GPU, you must activate the tch feature.
This feature relies on the tch crate for bindings to the C++ libtorch API. The libtorch library is required and can be downloaded either automatically or manually. The tch documentation explains how to set up your environment for these bindings; please refer to it for detailed information or support.
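For the manual route, a minimal setup sketch might look like the following. The install path is illustrative, and the environment variables are the ones the tch crate reads; only the feature name `tch` comes from this README.

```shell
# Point the tch build at a manually downloaded libtorch
# (adjust the path to your actual install location).
export LIBTORCH=/path/to/libtorch
export LD_LIBRARY_PATH=${LIBTORCH}/lib:$LD_LIBRARY_PATH

# Build with the crate's tch feature enabled.
cargo build --features tch
```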
Next Features
These features could be added in the future:
- RandomSampler with replacement
- parallel dataloader for iterable datasets
- distributed dataloader
MSRV
The current MSRV is 1.63.