This tool computes (dis)similarity between two or more PNG images using an algorithm approximating human vision.
Comparison is done using the SSIM algorithm at multiple weighted resolutions.
The value returned is 1/SSIM-1, where 0 means identical images, and >0 (unbounded) is the amount of difference. Values are not directly comparable with results from other tools. See below on interpreting the values.
- Comparison is done in L*a*b* color space (D65 white point, sRGB gamma). Other implementations use "RGB" or grayscale without gamma correction.
- Supports alpha channel.
- No OpenCV or MATLAB needed.
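The score mapping described above is a simple transform of the SSIM value; a hypothetical helper (the function name is illustrative, not part of the crate's API) makes the relationship concrete:

```rust
/// DSSIM = 1/SSIM - 1: SSIM of 1.0 (identical) maps to 0,
/// and the result grows without bound as SSIM approaches 0.
fn dssim_from_ssim(ssim: f64) -> f64 {
    1.0 / ssim - 1.0
}

fn main() {
    assert_eq!(dssim_from_ssim(1.0), 0.0); // identical images
    assert_eq!(dssim_from_ssim(0.5), 1.0); // lower SSIM, larger DSSIM
    println!("ok");
}
```

Note the mapping is nonlinear: a DSSIM of 0.02 is not "twice as bad" as 0.01 in any perceptual sense, which is one reason the values are not a percentage.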
dssim file-original.png file-modified.png
Will output something like "0.02341" (smaller is better) followed by a filename.
You can supply multiple filenames to compare them all with the first file:
dssim file.png modified1.png modified2.png modified3.png
You can save an image visualising the difference between the files:
dssim -o difference.png file.png file-modified.png
It's also usable as a library.
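A minimal library usage might look like the following sketch. It assumes the `dssim` 2.x crate API (`Dssim::new`, `dssim::load_image`, `Dssim::compare`); check the crate's documentation for the exact signatures in your version:

```rust
// Cargo.toml: dssim = "2"
// Sketch only: function and type names assumed from the dssim 2.x crate.
fn main() -> Result<(), Box<dyn std::error::Error>> {
    let attr = dssim::Dssim::new();
    let original = dssim::load_image(&attr, "file-original.png")?;
    let modified = dssim::load_image(&attr, "file-modified.png")?;
    // compare returns the overall score plus per-scale SSIM maps
    let (diff, _ssim_maps) = attr.compare(&original, modified);
    println!("{}", diff); // smaller is better, 0 means identical
    Ok(())
}
```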
Please be careful about color profiles in the images. Different profiles, or lack of support for profiles, can make images appear different even when the pixels are the same.
The amount of difference goes from 0 to infinity. It's not a percentage.
If you're comparing two different image compression codecs, then ensure you either:
- compress images to the same file size, and then use DSSIM to compare which one is closest to the original, or
- compress images to the same DSSIM value, and compare file sizes to see how much file size gain each option gives.
When you quote results, please include DSSIM version, since the scale has changed between versions.
The version is printed when you run the tool.
You need Rust 1.48 or later.
cargo build --release
Will give you the binary at `target/release/dssim`.
- The comparison is done on multiple weighted scales (based on IWSSIM) to measure features of different sizes. A single-scale SSIM is biased towards differences smaller than its Gaussian kernel.
- Scaling is done in linear-light RGB to model physical effects of viewing distance/lenses. Scaling in sRGB or Lab would have incorrect gamma and mask distortions caused by chroma subsampling.
- The a/b channels of Lab are compared with lower spatial precision, to simulate the eye's higher sensitivity to brightness changes than to color changes.
- The lightness component of SSIM is ignored when comparing color channels.
- The SSIM score is pooled using a combination of local maxima and global averages. You can get per-pixel SSIM from the API to implement custom pooling.
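The linear-light scaling mentioned above requires undoing the sRGB transfer curve first. A self-contained sketch of the standard sRGB-to-linear conversion (the function name is illustrative, not the crate's API):

```rust
/// Standard sRGB electro-optical transfer function: converts a
/// gamma-encoded sRGB component in [0, 1] to linear light.
fn srgb_to_linear(c: f32) -> f32 {
    if c <= 0.04045 {
        c / 12.92
    } else {
        ((c + 0.055) / 1.055).powf(2.4)
    }
}

fn main() {
    // Averaging pixels for downscaling must happen on linear values;
    // averaging gamma-encoded values would darken edges and mask
    // chroma-subsampling distortions, as noted above.
    assert_eq!(srgb_to_linear(0.0), 0.0);
    assert!((srgb_to_linear(1.0) - 1.0).abs() < 1e-6);
    assert!(srgb_to_linear(0.5) < 0.5); // mid-gray is darker in linear light
    println!("ok");
}
```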