#machine-learning #tensor #data #format #sparse #self-describing

metatensor

Self-describing sparse tensor data format for atomistic machine learning and beyond

9 releases

0.2.0 Sep 24, 2024
0.1.6 Sep 11, 2024
0.1.5 Mar 12, 2024
0.1.3 Feb 12, 2024
0.1.0 Oct 11, 2023

#172 in Data structures


2,268 downloads per month

BSD-3-Clause

215KB
3K SLoC

Metatensor


Metatensor is a self-describing sparse tensor data format for atomistic machine learning and beyond, storing values together with the gradients of these values. Think of a numpy ndarray or pytorch Tensor equipped with extra metadata for atomic systems and other point cloud data. The core of this library is written in Rust, and we provide APIs for C, C++, and Python.
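As a rough illustration of what "values plus metadata" looks like in practice, here is a minimal sketch using the Python API (Labels, TensorBlock, and TensorMap are the documented Python classes); the label names and toy values below are invented for the example and not prescribed by metatensor.

```python
import numpy as np
from metatensor import Labels, TensorBlock, TensorMap

# Keys identify each block of the sparse tensor ("block_id" is a made-up key
# name for this example).
keys = Labels(names=["block_id"], values=np.array([[0], [1]]))

blocks = []
for _ in range(2):
    # Each block stores a dense values array plus metadata describing its rows
    # (samples), optional intermediate axes (components), and columns
    # (properties).
    block = TensorBlock(
        values=np.random.rand(3, 2),
        samples=Labels(["system", "atom"], np.array([[0, 0], [0, 1], [0, 2]])),
        components=[],
        properties=Labels(["property"], np.array([[0], [1]])),
    )
    blocks.append(block)

tensor = TensorMap(keys, blocks)
print(tensor.keys)
```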

The main class of metatensor is the TensorMap data structure, which defines a custom block-sparse data format. If you are using metatensor from Python, we additionally provide a collection of mathematical, logical, and other utility operations that make working with TensorMaps more convenient, as sketched below.
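For instance, assuming the operations are installed alongside the Python package and reusing the tensor from the sketch above, arithmetic and I/O work directly on whole TensorMaps, block by block; the exact function set may vary between versions, so treat this as a hedged sketch rather than a complete API reference.

```python
import metatensor

# Element-wise arithmetic is applied block by block, keeping all metadata intact.
doubled = metatensor.multiply(tensor, 2.0)
total = metatensor.add(tensor, doubled)

# Blocks are retrieved by their key metadata rather than by position.
block = total.block(block_id=0)
print(block.values)

# Because the format is self-describing, values and metadata round-trip together.
metatensor.save("example.npz", total)
reloaded = metatensor.load("example.npz")
```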

Documentation

For details, tutorials, and examples, please have a look at our documentation.

Contributors

Thanks go to all the people who make metatensor possible:

contributors list

We always welcome new contributors. If you want to help, take a look at our contribution guidelines; afterwards, you can start with an open issue marked as good first issue.

Dependencies

~1.4–3.5MB
~58K SLoC