Is dE the Standard Deviation between Sample and Standard?
Question: Is dE the Standard Deviation between Sample and Standard?
Answer:
A bare-bones description might be: dE is the color difference from a Standard, and when a tolerance is applied, it is presumed that any dE value less than the tolerance will be an acceptable match. CIE76, which is the technical name for dE*, is the geometric (Euclidean) distance in color space between the Standard and Sample readings. Since human perception is not uniform throughout the gamut of color space, it is recommended that each Standard color have a unique acceptance tolerance associated with it when using dE*. For example, our ability to differentiate between shades of Yellow is much greater than our ability to differentiate between shades of Deep Blue. If a tolerance of dE* < 1 indicates an acceptable match for the Yellows, a visually comparable tolerance for Deep Blue might be dE* < 1.7.
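As an illustration, here is a minimal sketch of the CIE76 calculation in Python. The function name and the example L*a*b* readings are hypothetical; only the Euclidean-distance formula itself comes from the CIE76 definition.

```python
import math

def delta_e_cie76(standard, sample):
    """CIE76 dE*: Euclidean distance between two L*a*b* readings."""
    dL = sample[0] - standard[0]  # lightness difference
    da = sample[1] - standard[1]  # red/green axis difference
    db = sample[2] - standard[2]  # yellow/blue axis difference
    return math.sqrt(dL**2 + da**2 + db**2)

# Hypothetical readings: (L*, a*, b*) for a Standard and a Sample.
standard = (52.0, 41.5, 28.0)
sample = (52.6, 41.0, 28.9)
print(delta_e_cie76(standard, sample))  # ~1.19, fails a dE* < 1 tolerance
```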
CMC (dECMC) and CIE2000 (dE*2000) consist of more complex equations, designed so that a uniform tolerance can be applied throughout color space. These equations were constructed so that a value of 1 represents the limit of a typical acceptable match. Using CMC or CIE2000 would allow the user to apply a tolerance of 1 to both the Yellow and Deep Blue colors from the previous example.
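The full CMC and CIE2000 equations are too long to reproduce here, but as a sketch, both are available in the Python colormath package (an assumption; any equivalent color-science library would do). The L*a*b* values reuse the hypothetical readings above; note that CMC defaults to the 2:1 lightness-to-chroma weighting commonly used for acceptability.

```python
from colormath.color_objects import LabColor
from colormath.color_diff import delta_e_cie2000, delta_e_cmc

# Hypothetical Standard and Sample readings from the CIE76 example.
std = LabColor(lab_l=52.0, lab_a=41.5, lab_b=28.0)
smp = LabColor(lab_l=52.6, lab_a=41.0, lab_b=28.9)

# Both metrics are tuned so that ~1 marks the edge of an acceptable match,
# regardless of where in color space the Standard sits.
print(delta_e_cie2000(std, smp))         # dE*2000
print(delta_e_cmc(std, smp, pl=2, pc=1)) # dECMC(2:1)
```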
Standard Deviation, and k or C.I. (confidence interval), are statistical terms used to model univariate groups of data that follow a Gaussian (normal) distribution. These models work well when applied to XYZ values or Indices, but they may be flawed when applied to multivariate values like the various dE types. Since dE can never be negative, a group of dE values cannot form a normal distribution, which is what the Standard Deviation is designed to describe. When working with a group of dE's, a Hotelling T² model should be used instead.
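As a sketch of that last point: rather than taking the standard deviation of scalar dE values, a one-sample Hotelling T² statistic can be computed on the underlying (dL*, da*, db*) component differences, which can be negative and so can reasonably be modeled as multivariate normal. The data below are simulated for illustration, and numpy/scipy are assumed.

```python
import numpy as np
from scipy import stats

def hotelling_t2(diffs):
    """One-sample Hotelling T^2 test that the mean (dL, da, db)
    difference vector is zero. diffs is an (n, p) array of readings."""
    n, p = diffs.shape
    mean = diffs.mean(axis=0)
    cov = np.cov(diffs, rowvar=False)          # sample covariance, (p, p)
    t2 = n * mean @ np.linalg.inv(cov) @ mean  # Hotelling T^2 statistic
    # Convert T^2 to an F statistic to obtain a p-value.
    f_stat = (n - p) / (p * (n - 1)) * t2
    p_value = stats.f.sf(f_stat, p, n - p)
    return t2, p_value

# Simulated batch of (dL, da, db) component differences from a Standard.
rng = np.random.default_rng(0)
diffs = rng.normal(loc=[0.2, -0.1, 0.3], scale=0.5, size=(20, 3))
t2, p_value = hotelling_t2(diffs)
print(f"T^2 = {t2:.2f}, p = {p_value:.4f}")
```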