

Self-describing sparse tensor data format for atomistic machine learning and beyond






Metatensor is a self-describing sparse tensor data format for atomistic machine learning and beyond, storing values and the gradients of these values together. Think of it as a numpy ndarray or pytorch Tensor equipped with extra metadata for atomic systems and other point cloud data. The core of this library is written in Rust, and we provide APIs for C, C++, and Python.
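As a rough illustration of the "values plus gradients stored together" idea, here is a minimal pure-Python sketch (hypothetical names such as `LabeledArray` and `add_gradient`, not metatensor's actual API):

```python
from dataclasses import dataclass, field

@dataclass
class LabeledArray:
    # hypothetical container: dense values plus named gradients of those values
    values: list
    gradients: dict = field(default_factory=dict)

    def add_gradient(self, parameter: str, data: list) -> None:
        # store d(values)/d(parameter) right next to the values themselves
        self.gradients[parameter] = data

# per-structure energies, with their gradients w.r.t. atomic positions attached
energies = LabeledArray(values=[[-1.2], [-3.4]])
energies.add_gradient("positions", [[[0.0, 0.1, 0.2]], [[0.3, 0.4, 0.5]]])

print(sorted(energies.gradients))  # ['positions']
```

Keeping the gradients inside the same object means they travel with the values through any transformation, which is the property metatensor provides for real atomistic data.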

The main class of metatensor is the TensorMap data structure, which defines a custom block-sparse data format. If you are using metatensor from Python, we additionally provide a collection of mathematical, logical, and other utility operations to make working with TensorMaps more convenient.
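To make "block-sparse" concrete, here is a minimal conceptual sketch (hypothetical key names, not metatensor's real TensorMap API): the data is split into dense blocks indexed by metadata keys, and only keys that actually occur in the data get a block.

```python
# minimal block-sparse container: metadata keys -> dense blocks;
# keys that never occur simply have no entry (conceptual sketch only)
blocks = {
    # key (center_species, neighbor_species) -> dense 2D block of values
    (1, 1): [[0.1, 0.2], [0.3, 0.4]],
    (1, 8): [[0.5, 0.6]],
    # no (8, 8) key: that combination never appears, so no memory is spent on it
}

def lookup(center: int, neighbor: int):
    """Return the dense block for a metadata key, or None if it is absent."""
    return blocks.get((center, neighbor))

print(lookup(1, 8))   # [[0.5, 0.6]]
print(lookup(8, 8))   # None
```

A real TensorMap additionally labels the rows and columns of every block with metadata (samples, components, properties), so that operations can align blocks by meaning rather than by position.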


For details, tutorials, and examples, please have a look at our documentation.


Thanks go to all the people who make metatensor possible:

contributors list

We always welcome new contributors. If you want to help, take a look at our contribution guidelines; a good starting point is an open issue marked as good first issue.

