What Are Image Sensors and How Do They Work?
An image sensor is a solid-state device, the part of the camera's hardware that captures light and converts what you see through a viewfinder or LCD monitor into an image. Think of the sensor as the electronic equivalent of film.
When using a film camera you can insert any kind of film you want. It’s the film you choose that gives photographs distinctive colors, tones, and grain. If you think one film gives images that are too blue or red, you can change to another film. With digital cameras, the “film” is permanently part of the camera so buying a digital camera is in part like selecting a film to use. Like film, different image sensors render colors differently, have different amounts of “grain,” different sensitivities to light, and so on. The only ways to evaluate these aspects are to examine some sample photographs from the camera or read reviews written by people you trust.
Initially, charge-coupled devices (CCDs) were the only image sensors used in digital cameras. They had already been well developed through their use in astronomical telescopes, scanners, and video camcorders. However, there is now a well-established alternative, the CMOS image sensor. Both CCD and CMOS image sensors capture light using a grid of small photosites on their surfaces. Where they differ is in how they process the image and how they are manufactured.
CCD image sensors.
A charge-coupled device (CCD) gets its name from the way the charges on its pixels are read after an exposure. The charges on the first row are transferred to a place on the sensor called the readout register. From there, they are fed to an amplifier and then on to an analog-to-digital converter. Once a row has been read, its charges are deleted from the readout register, the next row enters, and all of the rows above march down one row. With each row "coupled" to the row above in this way, each row of pixels is read, one row at a time.
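The row-by-row readout described above can be sketched as a toy simulation. The grid of charge values here is purely hypothetical; the point is only the order of operations: shift a row into the readout register, read it out charge by charge, then march every remaining row down one place.

```python
# Toy simulation of CCD row-by-row readout (hypothetical charge values,
# not real sensor data).

def ccd_readout(pixels):
    """Read a 2D grid of charges the way a CCD does: the row nearest the
    readout register is shifted in, read out charge by charge, then all
    remaining rows march down one place."""
    output = []
    rows = [row[:] for row in pixels]      # copy; rows[0] is nearest the register
    while rows:
        readout_register = rows.pop(0)     # shift the next row into the register
        for charge in readout_register:    # feed each charge to the amplifier/ADC
            output.append(charge)
        # Popping the row is the "delete and march down" step: every
        # remaining row is now one place closer to the readout register.
    return output

sensor = [
    [10, 20, 30],  # row closest to the readout register
    [40, 50, 60],
    [70, 80, 90],
]
print(ccd_readout(sensor))  # [10, 20, 30, 40, 50, 60, 70, 80, 90]
```

Because every charge must physically march through its neighbors to reach the single readout register, the whole sensor is read serially, which is one reason CCD readout tends to be slower than CMOS designs that can address pixels more directly.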
CMOS image sensors.
Image sensors are manufactured in factories called wafer foundries or fabs where the tiny circuits and devices are etched onto silicon chips. The biggest problem with CCDs is that they are created in foundries using specialized and expensive processes that can only be used to make other CCDs.
Meanwhile, larger foundries use a different process called Complementary Metal Oxide Semiconductor (CMOS) to make millions of chips for computer processors and memory. CMOS is by far the most common and highest yielding chip-making process in the world.
Using this same process and the same equipment to manufacture CMOS image sensors cuts costs dramatically because the fixed costs of the plant are spread over a much larger number of devices.
As a result of these economies of scale, the cost of fabricating a CMOS wafer is significantly less than the cost of fabricating a similar wafer using the specialized CCD process. Costs are lowered even further because CMOS image sensors can have processing circuits created on the same chip. With CCDs, these processing circuits must be on separate chips.
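The fixed-cost-spreading argument can be made concrete with a quick calculation. The figures below are invented purely for illustration; they are not real foundry numbers.

```python
# Hypothetical numbers to illustrate economies of scale in wafer fabrication.
fixed_cost = 1_000_000.0  # plant cost per period (assumed)
ccd_wafers = 1_000        # output of a specialized CCD-only line (assumed)
cmos_wafers = 100_000     # output of a shared, high-volume CMOS line (assumed)

cost_per_ccd_wafer = fixed_cost / ccd_wafers     # 1000.0
cost_per_cmos_wafer = fixed_cost / cmos_wafers   # 10.0

print(cost_per_ccd_wafer, cost_per_cmos_wafer)
```

The same plant cost divided over a hundred times as many wafers yields a hundredth of the fixed cost per wafer, which is the core of the economic advantage the text describes.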
What Are the Differences Between CCD and CMOS Sensors?
- CCD sensors, as mentioned above, create high-quality, low-noise images. CMOS sensors, traditionally, are more susceptible to noise.
- Because each pixel on a CMOS sensor has several transistors located next to it, the light sensitivity of a CMOS chip tends to be lower. Many of the photons hitting the chip hit the transistors instead of the photodiode.
- CMOS circuits traditionally consume little power, so implementing a sensor in CMOS yields a low-power sensor.
- CCDs use a process that consumes lots of power. CCDs consume as much as 100 times more power than an equivalent CMOS sensor.
- CMOS chips can be fabricated on just about any standard silicon production line, so they tend to be extremely inexpensive compared to CCD sensors.
- CCD sensors have been mass produced for a longer period of time, so they are more mature. They tend to have higher quality and more pixels.
Image Sensors—Sensitivity and Noise
In some situations images are not as clear as they could be. They appear grainy, with randomly scattered colored pixels that break up smooth areas. This is what's known as noise. It has three basic causes:
• Small photosites on the sensor. A small photosite captures less light, so its signal is weaker relative to the noise. There is nothing you can do about this cause, and it makes the following two causes even more severe.
• A long shutter speed that lets light into the camera for a long time, usually in a dim or dark setting, gives noise a chance to build up.
• A high ISO setting lets you use a faster shutter speed to avoid blur but also amplifies the noise along with the signal. Many cameras have one or more noise reduction modes that reduce the effects of this noise.
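The third cause above can be illustrated with a small calculation using made-up numbers. Raising ISO means a shorter exposure captures less light, and the extra gain amplifies the sensor's noise right along with the weaker signal, so the signal-to-noise ratio drops.

```python
# Toy illustration (invented numbers) of why high ISO looks noisier.
read_noise = 4.0            # fixed sensor noise per pixel (assumed)

# Low ISO: long exposure, no extra gain.
signal_low = 800.0          # light captured during the long exposure (assumed)
snr_low = signal_low / read_noise            # 200.0

# High ISO: 1/8 the exposure to avoid blur, 8x the gain to compensate.
gain = 8.0
signal_high = signal_low / gain              # only 100.0 units of light captured
snr_high = (signal_high * gain) / (read_noise * gain)   # 25.0

print(snr_low, snr_high)
```

The gain restores the image brightness, but since it multiplies signal and noise equally, the image ends up with the poorer signal-to-noise ratio of the short exposure, which is exactly the graininess the text describes.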
Despite their differences, both types of sensors are capable of giving very good results and both types are used by major camera companies. Canon and Nikon both use CMOS sensors in their high-end digital SLRs as do many other camera companies.