Color is an AREA function - not pixel

A Bayer Pattern
An array of filters is placed over the sensor, one filter per cell. The pattern has 2X as many Green filters as either Red or Blue.

Each cell then responds only to light of the one hue passed by its particular filter.

Hence each location (cell) records a single light-intensity reading, with one color associated with that reading.
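To make the sampling concrete, here is a minimal Python sketch of how a Bayer sensor could be simulated: each cell keeps only the channel passed by its filter, so half the cells record green and a quarter each record red and blue. The RGGB layout and the use of NumPy are assumptions for illustration, not a description of any particular camera.

```python
import numpy as np

def bayer_mosaic(rgb):
    """Simulate a Bayer sensor: keep one color channel per cell.

    Assumes an RGGB layout: red at (even row, even col), green at the two
    mixed positions, blue at (odd row, odd col).
    """
    h, w, _ = rgb.shape
    mosaic = np.zeros((h, w), dtype=rgb.dtype)
    mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]  # red cells
    mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]  # green cells (even rows)
    mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]  # green cells (odd rows)
    mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]  # blue cells
    return mosaic
```

Running this on a full-color image yields a single-channel mosaic: every value is a genuine intensity reading, but no cell knows more than one hue.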

Color is an area function and not a pixel function. This is true in both RAW mode and JPEG mode. When color is captured with the Bayer method, which most digital cameras use today, each pixel (cell) in the array has only one color associated with it. This is quite sufficient for photography: an image in which every pixel carried a unique, unrelated color would look like nearly random noise, so real scenes lose little by being sampled this way.

The world naturally tends toward area color. Colors appear continuous in tone, with gradual gradations. Part of this is due to the form of objects, which are naturally shaded by their three-dimensional shape. Shadow patterns break up a single color into a gradient of color. There are exceptions in nature, such as a field of wildflowers, but such scenes are relatively rare.

It takes readings from all three Bayer filter colors to produce the entire range of color. If the camera is not in RAW mode, the software within the camera decides how to combine readings from neighboring cells to produce a complete color value for each cell. Each cell is assigned a RED, GREEN, and BLUE value based on the predominance of color in its area.
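As a rough illustration of how software might combine neighboring filter readings into a full color value per cell, here is a simple bilinear-interpolation sketch. It assumes the RGGB mosaic produced above and uses SciPy for the neighborhood averaging; actual camera firmware uses far more sophisticated, usually proprietary, demosaicing.

```python
import numpy as np
from scipy.ndimage import convolve

def demosaic_bilinear(mosaic):
    """Rebuild full RGB from an RGGB mosaic by averaging nearby cells
    that share the same filter color (simple bilinear demosaicing)."""
    h, w = mosaic.shape
    r = np.zeros((h, w)); g = np.zeros((h, w)); b = np.zeros((h, w))
    r[0::2, 0::2] = mosaic[0::2, 0::2]   # known red samples
    g[0::2, 1::2] = mosaic[0::2, 1::2]   # known green samples
    g[1::2, 0::2] = mosaic[1::2, 0::2]
    b[1::2, 1::2] = mosaic[1::2, 1::2]   # known blue samples

    # Kernels average the known neighbors; zeros at unknown cells drop out.
    k_rb = np.array([[1, 2, 1],
                     [2, 4, 2],
                     [1, 2, 1]]) / 4.0    # red/blue: 1/4 of cells known
    k_g  = np.array([[0, 1, 0],
                     [1, 4, 1],
                     [0, 1, 0]]) / 4.0    # green: 1/2 of cells known
    r = convolve(r, k_rb, mode='mirror')
    g = convolve(g, k_g,  mode='mirror')
    b = convolve(b, k_rb, mode='mirror')
    return np.dstack([r, g, b])
```

The point of the sketch is only that each cell's missing two channels are estimated from its area, which is exactly why color ends up being an area property rather than a per-pixel one.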

Today color is an AREA function and not an individual pixel function. (There are new sensors being developed that may change this in the future.)