The measured color and intensity of reflections from a small, uniform surface element with no inherent light emission or opacity depend on three functions: the spectral power distribution of the illuminant, I(λ); the spectral reflective properties of the surface material, R(λ); and the spectral sensitivities of the imager, S(λ).
The signal power measured by a detector can be expressed as:

P = ∫ I(λ) R(λ) S(λ) dλ

where the integral runs over the visible wavelengths.
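The detector signal P = ∫ I(λ) R(λ) S(λ) dλ can be approximated numerically with a Riemann sum. A minimal sketch follows; the illuminant, reflectance, and sensitivity curves are illustrative placeholders, not measured data:

```python
import numpy as np

# Sampled wavelengths across the visible range, 10 nm steps
wavelengths = np.arange(400, 701, 10).astype(float)

# Placeholder spectra (hypothetical, for illustration only)
I = np.ones_like(wavelengths)                    # flat (equal-energy) illuminant
R = np.linspace(0.2, 0.8, wavelengths.size)      # reflectance rising toward red
S = np.exp(-((wavelengths - 550) / 60.0) ** 2)   # sensitivity peaking at 550 nm

d_lambda = 10.0                                  # integration step in nm
P = np.sum(I * R * S) * d_lambda                 # Riemann-sum approximation of P
```

Finer wavelength sampling (smaller d_lambda) improves the approximation at the cost of more spectral samples.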
In order to form a color image, the human eye, as well as photographic and video equipment, uses multiple adjacent sensors with different spectral responses. Human vision relies on three types of light-sensitive cone cells to produce color perception. In developing a color model based on human perception, the International Commission on Illumination (CIE) defined a set of three color-matching functions, x̄(λ), ȳ(λ), and z̄(λ). These can be thought of as the spectral sensitivity curves of three linear light detectors that yield the CIE XYZ tristimulus values P_X, P_Y, and P_Z, known collectively as the “CIE standard observer.”
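The tristimulus values are three weighted integrals of the stimulus spectrum, one per color-matching function. The sketch below uses crude Gaussian stand-ins for the real CIE 1931 color-matching functions, which are tabulated rather than analytic; the curve parameters are assumptions chosen only to mimic their rough shape:

```python
import numpy as np

wl = np.arange(400, 701, 5).astype(float)   # wavelength grid, 5 nm steps

def gauss(mu, sigma):
    """Unnormalized Gaussian bump over the wavelength grid."""
    return np.exp(-0.5 * ((wl - mu) / sigma) ** 2)

# Rough stand-ins for the CIE x̄, ȳ, z̄ curves (illustrative only)
x_bar = 1.06 * gauss(600, 38) + 0.36 * gauss(446, 20)  # two-lobed, like real x̄
y_bar = gauss(556, 47)
z_bar = 1.78 * gauss(449, 22)

stimulus = np.ones_like(wl)   # equal-energy stimulus, i.e. I(λ)·R(λ) = const
d_wl = 5.0

# Tristimulus values as discretized integrals
P_X = np.sum(stimulus * x_bar) * d_wl
P_Y = np.sum(stimulus * y_bar) * d_wl
P_Z = np.sum(stimulus * z_bar) * d_wl
```

In practice one would substitute the published CIE tables for the Gaussian approximations; the integration structure stays the same.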
Digital image sensors predominantly use one of two methods to measure tristimulus values: a color filter array overlaid on inherently monochromatic photodiodes, or stacked photodiodes that exploit the fact that the absorption depth of photons in silicon increases with wavelength λ.
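The color-filter-array approach can be sketched as follows: each photodiode records only the color channel of the filter above it, so the sensor output is a single-channel mosaic from which full color must later be interpolated. The 2×2 Bayer tile (R G / G B) used here is the most common layout; the scene data are synthetic:

```python
import numpy as np

# Hypothetical full-color scene (height x width x RGB), values in [0, 1]
rng = np.random.default_rng(1)
rgb = rng.random((4, 4, 3))

# Sample the scene through a Bayer color filter array:
# row-even/col-even -> red, row-odd/col-odd -> blue, the rest -> green
mosaic = np.zeros((4, 4))
mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]  # red filter sites
mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]  # green filter sites
mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]  # green filter sites
mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]  # blue filter sites
```

Half the sites are green, matching the eye's greater luminance sensitivity in that band; a demosaicing step would then interpolate the two missing channels at every site.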
However, neither of these methods produces spectral responses similar to those of the human eye. As a result, color measurements will differ between different photo detection and reproduction equipment, and between image sensors and human observers photographing the same scene, i.e., the same I(λ) and R(λ).
Thus, the purpose of camera calibration is to transform and correct the tristimulus values measured by a camera or image sensor so that its effective spectral responses match those of the CIE standard observer.
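A common calibration approach, sketched here under the assumption of a purely linear correction, is to fit a 3×3 matrix M mapping the camera's raw tristimulus readings onto CIE XYZ reference values by least squares over a set of color patches. The patch data and the "true" matrix below are synthetic, chosen so the fit can be checked:

```python
import numpy as np

rng = np.random.default_rng(0)
xyz_ref = rng.random((24, 3))                 # reference XYZ for 24 color patches

# Synthetic ground-truth correction matrix (hypothetical)
M_true = np.array([[0.9,  0.2, -0.1],
                   [0.1,  1.1, -0.2],
                   [0.0, -0.1,  1.2]])

# Simulated raw camera readings: xyz_ref = camera_raw @ M_true.T by construction
camera_raw = xyz_ref @ np.linalg.inv(M_true).T

# Least-squares fit of camera_raw @ M.T ≈ xyz_ref
M_t, *_ = np.linalg.lstsq(camera_raw, xyz_ref, rcond=None)
M = M_t.T

corrected = camera_raw @ M.T                  # calibrated tristimulus values
```

Because no two fixed sets of spectral responses are related by an exact linear transform, a real calibration of this kind minimizes, rather than eliminates, the residual color error; nonlinear or per-illuminant corrections can reduce it further.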