Classification builds a color description tree for the image. Reduction collapses the tree until the number of colors it represents is at most the number of colors desired in the output image. Assignment defines the output image's color map and sets each pixel's color by reclassifying it in the reduced tree. Our goal is to minimize the numerical discrepancies between the original colors and the quantized colors. To learn more about quantization error, see Measuring Color Reduction Error later in this document.
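The classification step can be sketched as follows. This is a minimal illustration, not ImageMagick's actual implementation: it assumes 8-bit RGB channels and a fixed maximum depth of 8, and the node fields (n1, n2, running color sums) follow the description in this document.

```python
# A minimal sketch of octree color classification, assuming 8-bit RGB
# channels and a maximum tree depth of 8. Names are illustrative.

MAX_DEPTH = 8

def child_index(r, g, b, depth):
    """Pick one of 8 children from bit (7 - depth) of each channel."""
    shift = 7 - depth
    return (((r >> shift) & 1) << 2) | (((g >> shift) & 1) << 1) | ((b >> shift) & 1)

class Node:
    def __init__(self):
        self.children = [None] * 8
        self.n1 = 0           # pixels classified through this node
        self.n2 = 0           # pixels classified at exactly this node
        self.sums = [0, 0, 0]  # running channel sums for the mean color

def classify(root, r, g, b):
    """Walk one pixel down the tree, creating nodes as needed."""
    node = root
    for depth in range(MAX_DEPTH):
        node.n1 += 1
        i = child_index(r, g, b, depth)
        if node.children[i] is None:
            node.children[i] = Node()
        node = node.children[i]
    node.n1 += 1
    node.n2 += 1
    node.sums[0] += r
    node.sums[1] += g
    node.sums[2] += b

root = Node()
for color in [(255, 0, 0), (254, 1, 0), (0, 0, 255)]:
    classify(root, *color)
```

The two near-identical reds share a path near the root and only diverge deep in the tree, which is why reduction can later merge them into one representative color.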
  total number of nodes = 1 + Sum(8^i), i = 1, ..., k

For k = 8:

                                             8^8 - 1
  number of nodes = 1 + (8^1 + 8^2 + ... + 8^8) = 1 + 8 * -------  = 19,173,961
                                              8 - 1
For Cmax = 255:

  maximum tree depth = log2(255)
                     = log_e(255) / log_e(2)
                     = 7.99
                    ~= 8
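Both quantities above are easy to verify directly; the short check below assumes an octree of depth k over 8-bit channels (Cmax = 255):

```python
import math

# Total node count for a full octree of depth k = 8:
k = 8
total_nodes = 1 + sum(8**i for i in range(1, k + 1))  # 1 + 8*(8^8 - 1)/7
print(total_nodes)  # 19173961

# Maximum useful tree depth for Cmax = 255:
max_depth = math.log2(255)
print(round(max_depth, 2))  # 7.99, so 8 levels suffice
```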
  Ep = 0
  while number of nodes with (n2 > 0) > required maximum number of colors
      prune all nodes such that E <= Ep
      set Ep to minimum E in remaining nodes
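The pruning loop above can be sketched as follows. This is a simplified flat-list version: each node is reduced to an (E, n2) pair, where E is the node's quantization error and n2 its pixel count. In the real tree, pruning a node also merges its pixel statistics into its parent, which this sketch omits.

```python
# A minimal sketch of the reduction loop, over a flat list of
# (E, n2) pairs rather than a real tree.

def reduce_colors(nodes, max_colors):
    """Prune nodes with error E <= Ep until at most max_colors remain."""
    Ep = 0
    while sum(1 for E, n2 in nodes if n2 > 0) > max_colors:
        nodes = [(E, n2) for E, n2 in nodes if E > Ep]
        # Raise the pruning threshold to the smallest remaining error.
        Ep = min(E for E, n2 in nodes)
    return nodes

remaining = reduce_colors([(5, 10), (1, 3), (2, 7), (9, 4)], 2)
```

Each pass removes the nodes contributing the least error, so the colors that survive are those whose merging would cost the most accuracy.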
To measure the difference between the original and color-reduced images (the total color reduction error), ImageMagick sums, over all pixels in the image, the squared distance in RGB space between each original pixel value and its color-reduced value. ImageMagick prints several error measurements, including the mean error per pixel, the normalized mean error, and the normalized maximum error.
The normalized error measurements can be used to compare images. In general, the closer the mean error is to zero, the more the quantized image resembles the source image. Ideally, the error measure would be perceptually based, since the human eye is the final judge of quantization quality.
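The error measurements described above can be sketched as follows. The normalization here divides by the largest possible squared RGB distance for an 8-bit pixel; ImageMagick's exact normalization formulas may differ.

```python
# A sketch of the color reduction error metrics, assuming 8-bit RGB
# pixels given as (r, g, b) tuples. Normalization terms are an
# assumption, not ImageMagick's exact formulas.

def quantization_errors(original, quantized):
    """Return (mean error per pixel, normalized mean error, normalized max error)."""
    assert len(original) == len(quantized)
    sq_dists = [
        sum((a - b) ** 2 for a, b in zip(o, q))
        for o, q in zip(original, quantized)
    ]
    max_sq = 3 * 255 ** 2  # largest possible squared distance per pixel
    mean_per_pixel = sum(sq_dists) / len(original)
    return mean_per_pixel, mean_per_pixel / max_sq, max(sq_dists) / max_sq

orig = [(255, 0, 0), (0, 255, 0)]
quant = [(250, 0, 0), (0, 255, 0)]
mean_e, norm_mean, norm_max = quantization_errors(orig, quant)
```

A mean error of zero means the quantized image is pixel-for-pixel identical to the source in RGB space.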
These errors are measured and printed when -verbose and -colors are specified on the command line:
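For example, an invocation along these lines requests 16 output colors and prints the error metrics (the file names are hypothetical, and modern ImageMagick releases use the "magick" command in place of the legacy "convert"):

```shell
# Quantize to 16 colors; -verbose prints the error measurements.
convert input.png -colors 16 -verbose output.png
```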