Charge-coupled devices (CCDs) and CMOS active pixel sensors (CMOS sensors) are the primary image sensor technologies. They rely on lenses and other optical components to deliver the image to be captured.
This FAQ reviews the differences between CCDs and CMOS sensors, looks at some key specifications of image sensors and the concept of quantum efficiency, and reviews some related optical elements.
CCD sensors are being replaced in many applications by CMOS sensors. Cost and power consumption are primary driving factors; CMOS sensors are much less expensive to produce and consume as little as 5% of the energy required by a CCD. But CMOS sensors also have lower performance. They are more susceptible to noise and can have lower light sensitivity. The bottom line: CMOS sensors are improving and are suitable for a growing range of machine vision applications, but CCDs are still preferred in most high-performance vision applications.
Resolution measures the number of pixels and is a baseline specification for image sensors. It’s generally stated as the number of pixels in the X- and Y-axes. While it may seem that ‘more is better’ in terms of resolution, it’s usually advisable to determine the resolution needed to ensure the required levels of accuracy and repeatability. Too many pixels can place unnecessary demands on image processing electronics.
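One way to pin down the "resolution needed" is to work backward from the field of view and the smallest feature that must be detected. The sketch below assumes the common rule of thumb of at least two pixels across the smallest feature; the function name and the specific values are illustrative, not from the text, and real systems often budget three to four pixels per feature.

```python
import math

def pixels_needed(fov_mm: float, smallest_feature_mm: float,
                  pixels_per_feature: float = 2.0) -> int:
    """Minimum pixel count along one axis to resolve the smallest feature.

    Assumes the rule-of-thumb sampling of `pixels_per_feature` pixels
    across the smallest feature of interest.
    """
    return math.ceil(fov_mm / smallest_feature_mm * pixels_per_feature)

# Example: a 100 mm field of view where 0.2 mm defects must be detected.
print(pixels_needed(100, 0.2))  # 1000 pixels along that axis
```

A sensor comfortably above this count along each axis meets the accuracy requirement without the excess pixels that burden the image processing electronics.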
Quantum efficiency (QE) is an important metric for measuring the sensitivity of both CMOS and CCD sensors. It’s the ratio of the charge created (number of electrons) to the number of photons hitting the sensor. QE changes with wavelength, and the QE specification is the efficiency at the optimal wavelength, often in the range of 500 to 600 nm (green/yellow). For example, a sensor with a QE of 90% at 550 nm may have a QE of only 25% at 200 nm. And QE curves are not necessarily smooth; they can have multiple peaks.
Sensor size and pixel size
Sensor size is a secondary consideration. It’s related to resolution, but the relationship is evolving as the pixels become smaller and more densely packed. There’s a wide range of image sensor sizes used for different applications (Figure 1). Industrial machine vision systems tend to use smaller sensors while video cameras for television and movie production use larger sensors. For a typical ½” sensor in an industrial imaging system, here are some examples of the relationship between pixel size and resolution:
- Using relatively large 9.9 µm pixels can deliver a resolution of 640 x 480 pixels (0.3 megapixels, MP).
- Smaller 5.5 µm pixels can boost the resolution to 1280 x 1024 pixels (1.3 MP).
- Shrinking the pixels down to 3.6 µm can deliver a resolution of 1600 x 1200 pixels (2 MP).
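The examples above can be sanity-checked by multiplying pixel pitch by pixel count to get the active sensor area. The nominal 6.4 × 4.8 mm active area for a ½" format is a common industry convention, not stated in the text, so treat the comparison as approximate.

```python
def active_area_mm(pixel_um: float, h_px: int, v_px: int) -> tuple:
    """Active sensor area (width, height) in mm, assuming square pixels."""
    return (pixel_um * h_px / 1000.0, pixel_um * v_px / 1000.0)

# The three pixel-size/resolution pairings from the text:
for pitch, h, v in [(9.9, 640, 480), (5.5, 1280, 1024), (3.6, 1600, 1200)]:
    w, ht = active_area_mm(pitch, h, v)
    print(f"{pitch} um pixels at {h}x{v}: {w:.2f} x {ht:.2f} mm")
```

The first case lands almost exactly on the nominal ½" area, which illustrates the trade: for a fixed format, higher resolution is bought by shrinking the pixels, which in turn reduces the light collected per pixel.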
Lenses are needed to focus the image on the sensor. A few of the key specifications for lenses in machine vision applications are resolution, field of view, depth of field, and working distance (Figure 2):
- The resolving power of the lens and the resolution of the vision sensor need to be matched. A lens with too much resolving power adds unnecessary system cost, while one with too little fails to use the full capabilities of the sensor.
- The field of view (FoV) is the amount of the scene captured by the camera. For example, a wide-angle lens captures a larger area and therefore has a larger FoV. Sensor size also affects FoV: for a given primary magnification, determined by the lens, larger sensors support larger FoVs.
- Depth of field (DoF) is the range of distances in which the image remains in focus. A shallow DoF can be useful for inspection systems that are tightly focused on a specific plane where the object will appear. In other applications, like an autonomous mobile robot, a wider DoF is needed. Bar code reading is an application that can benefit from a moderately wide DoF.
- The working distance is the distance between the object and the front of the lens. It’s related to FoV and DoF. In general, for a given lens, a longer working distance results in a wider FoV and a deeper DoF, and vice versa. Long working distances are used in applications like high-temperature thermal imaging, where the camera needs to be a safe distance from the object being measured, while a short working distance can be used for close-up inspections of miniaturized devices.
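The FoV and working-distance relationships above can be sketched numerically. The two formulas below (FoV = sensor dimension / primary magnification, and a thin-lens estimate of object distance) are standard first-order approximations, not taken from the text; real lenses deviate from them, and working distance is measured from the front of the lens rather than its principal plane.

```python
def fov_mm(sensor_dim_mm: float, pmag: float) -> float:
    """Field of view along one axis: sensor dimension / primary magnification."""
    return sensor_dim_mm / pmag

def object_distance_mm(focal_mm: float, pmag: float) -> float:
    """Thin-lens approximation of the lens-to-object distance for a
    given magnification: s = f * (1 + 1/m)."""
    return focal_mm * (1.0 + 1.0 / pmag)

# A 1/2" sensor (6.4 mm wide) imaged at 0.1x primary magnification:
print(fov_mm(6.4, 0.1))              # 64 mm horizontal FoV
print(object_distance_mm(25, 0.1))   # ~275 mm with a 25 mm lens
```

Doubling the magnification halves the FoV and pulls the object distance in, illustrating why wide-FoV, long-working-distance setups and close-up, narrow-FoV setups sit at opposite ends of the lens-selection trade space.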
CMOS image sensors are being used in an increasing number of industrial machine vision systems. CCD sensors are still used in the most demanding applications. Regardless of the imager technology, the performance of the lens system must be matched to the sensor capabilities and needs of the application.