#machine-learning #tensor #data #format #sparse #self-describing

metatensor

Self-describing sparse tensor data format for atomistic machine learning and beyond

7 releases

0.1.5 Mar 12, 2024
0.1.4 Mar 12, 2024
0.1.3 Feb 12, 2024
0.1.2 Jan 26, 2024
0.1.0 Oct 11, 2023

#240 in Data structures

2,079 downloads per month

BSD-3-Clause

205KB
3K SLoC

Metatensor

Metatensor is a self-describing sparse tensor data format for atomistic machine learning and beyond, storing values and the gradients of these values together. Think of it as a numpy ndarray or a PyTorch Tensor equipped with extra metadata for atomic systems and other point cloud data. The core of this library is written in Rust, and we provide APIs for C, C++, and Python.

The main class of metatensor is the TensorMap data structure, which defines a custom block-sparse data format. If you are using metatensor from Python, we also provide a collection of mathematical, logical, and other utility operations to make working with TensorMaps more convenient.
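As a rough illustration, here is a minimal sketch of building a TensorMap from the Python API: a single dense block whose axes are annotated with Labels, grouped under one sparse key. The constructor signatures (`Labels`, `Labels.range`, `TensorBlock`, `TensorMap`) are assumed from the Python package; check the documentation for the exact API of the version you install.

```python
import numpy as np
from metatensor import Labels, TensorBlock, TensorMap

# One dense block of values: 3 samples x 2 properties.
# Labels attach names and integer metadata entries to each axis.
block = TensorBlock(
    values=np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]),
    samples=Labels.range("sample", 3),
    components=[],
    properties=Labels.range("property", 2),
)

# The TensorMap groups blocks under sparse keys; here a single key.
keys = Labels(["key"], np.array([[0]], dtype=np.int32))
tensor = TensorMap(keys, [block])

# Blocks can be retrieved by index (or by key selection) and expose
# their values together with the metadata of each axis.
print(tensor.block(0).values)
```

In actual atomistic workflows, the keys and sample/property labels would carry physically meaningful entries (e.g. atomic species or structure indices) rather than the placeholder names used above.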

Documentation

For details, tutorials, and examples, please have a look at our documentation.

Contributors

Thanks go to all the people who make metatensor possible:

contributors list

We always welcome new contributors. If you want to help, take a look at our contribution guidelines; a good place to start is an open issue marked as good first issue.

Dependencies

~1.4–3MB
~50K SLoC