Image sensor

From Wikipedia, the free encyclopedia
This is an old revision of this page, as edited by Shaddack (talk | contribs) at 22:01, 20 January 2006 (New article). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

An image sensor is a device that converts a visual image into an electric signal. It is used chiefly in digital cameras and other imaging devices. It is usually an array of charge-coupled devices (CCDs) or CMOS sensors.

There are several main types of color image sensors, differing in their color-separation mechanism:

  • Bayer sensor, low-cost and most common, using a Bayer filter that passes red, green, or blue light to selected pixels, forming interleaved grids sensitive to red, green, and blue. The missing color values at each pixel are then interpolated using a demosaicing algorithm.
  • Foveon X3 sensor, using an array of layered sensors where every pixel contains three stacked sensors sensitive to the individual colors.
  • 3CCD, using three discrete image sensors, with the color separation done by a dichroic prism. Generally considered to give the best image quality, and more expensive than single-CCD sensors.
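The demosaicing step mentioned for the Bayer sensor can be illustrated with a small sketch. The following is a hypothetical minimal implementation (not from any particular camera pipeline), assuming an RGGB Bayer layout and simple bilinear interpolation: each missing color sample is estimated as a weighted average of its known neighbors.

```python
import numpy as np

def demosaic_bilinear(raw):
    """Bilinear demosaicing of a single-channel Bayer mosaic.

    raw: 2D float array captured through an RGGB Bayer filter
         (R at even rows/even cols, B at odd rows/odd cols, G elsewhere).
    Returns an HxWx3 RGB image. This is an illustrative sketch, not a
    production pipeline (real demosaicing algorithms are edge-aware).
    """
    h, w = raw.shape
    # Per-channel masks marking which pixels actually sampled that color.
    mask = np.zeros((h, w, 3))
    mask[0::2, 0::2, 0] = 1  # red sites
    mask[0::2, 1::2, 1] = 1  # green sites (even rows)
    mask[1::2, 0::2, 1] = 1  # green sites (odd rows)
    mask[1::2, 1::2, 2] = 1  # blue sites

    # 3x3 bilinear interpolation kernel.
    kernel = np.array([[0.25, 0.5, 0.25],
                       [0.5,  1.0, 0.5],
                       [0.25, 0.5, 0.25]])

    def conv3(x):
        # Zero-padded 3x3 convolution implemented with shifted slices.
        p = np.pad(x, 1)
        out = np.zeros_like(x, dtype=float)
        for dy in range(3):
            for dx in range(3):
                out += kernel[dy, dx] * p[dy:dy + h, dx:dx + w]
        return out

    rgb = np.zeros((h, w, 3))
    for c in range(3):
        # Normalized convolution: average only over known samples.
        num = conv3(raw * mask[..., c])
        den = conv3(mask[..., c])
        rgb[..., c] = num / np.maximum(den, 1e-12)
    return rgb
```

A uniform gray scene is a useful sanity check: every interpolated channel should come out equal to the raw value, since all neighbors agree.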

Specialty sensors

Special sensors are used for various applications. The most important are sensors for thermal imaging, multi-spectral imaging, sensor arrays for X-rays, and highly sensitive arrays for astronomy.