NIR Technology Improves Driver Assistance Systems

March 28, 2014
ADAS imaging performance improvements, like raised HDR sensitivity characteristics, elimination of motion artifacts, increased resolution, higher speed, and extended functional safety, are all on the horizon. 

Although the figures have steadily dropped over recent years, there are still 75 fatalities on Europe’s roads every day, according to the European Commission. In the United States, 11.4 automotive deaths per 100,000 citizens are recorded annually. The use of optoelectronics to provide video support in modern advanced driver assistance systems (ADAS) can greatly reduce the likelihood of accidents.

Through such implementations, drivers will benefit from improved visibility levels. For example, sudden bends in the road or potential obstacles can be detected much earlier, so drivers can take appropriate action much sooner. Yet ensuring an accurate visual assessment of the traffic situation in dark, rainy, or misty conditions is a major challenge not only for the human eye but also for imaging devices, as the viewing area often is limited to the range of the headlamps.

Video cameras are used as a front end for these systems. They can capture details such as the location, shape, size, brightness, and color of objects in the line of sight, regardless of the lighting conditions. Captured images can be displayed on the vehicle’s instrument panel or analyzed via software to generate warnings and even intervene in the vehicle’s operation if necessary.

There are issues, though. The dynamic range of a scene (i.e., the brightness ratio between the brightest and darkest point in the scene) will, in many cases, far exceed the capabilities of conventional image sensors based on standard CMOS technology that exhibit linear sensitivity. The lights from oncoming vehicles will also seriously impair the ability of the eye (or a camera) to detect hazards that lie ahead.
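
Dynamic range is normally quoted in decibels, as 20·log10 of that brightness ratio. The minimal sketch below, with assumed, purely illustrative luminance values for headlamp glare and an unlit roadside object, shows why a night scene can reach on the order of 120 dB, far more than a conventional linear CMOS sensor can capture in a single exposure.

```python
import math

# Assumed, illustrative scene luminances in cd/m^2 (not measured values):
bright = 50_000.0   # glare from an oncoming headlamp
dark = 0.05         # an unlit object at the edge of the road

# Dynamic range expressed in decibels: 20 * log10(brightest / darkest).
dynamic_range_db = 20 * math.log10(bright / dark)
print(f"Scene dynamic range: {dynamic_range_db:.0f} dB")   # prints about 120 dB
```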

If detailed imaging data cannot be acquired due to the camera’s lack of dynamic range, then the safety of the driver, passengers, and pedestrians may be compromised. By increasing the dynamic range of a camera or image sensor, its ability to determine where potential dangers lie will be markedly improved. The application of a non-linear approach presents a way of achieving this. Such an approach will reduce the background noise while ensuring that pixel saturation only begins at much higher signal strengths.

HDR Devices

The emergence of high dynamic range (HDR) imaging devices finally delivers the extended sensitivity that car manufacturers have been seeking. ADAS operational performance can then be enhanced to cope with adverse weather conditions and nighttime driving.

The pixels of some image sensors now can deliver dual functionality. These devices can cover the full sensitivity range, including the visual and near infrared (NIR) range, and are therefore much more sensitive than the conventional color pixels currently employed in automotive imaging systems. In addition, they can be used to estimate the fraction of NIR in the RGB pixel signals. This means that an unwanted NIR signal can be taken out of the color signal. The true color image then can be created by combining the visible and NIR signals.
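
A rough illustration of that correction step is sketched below. It is an assumption about how such a pipeline could work, not the documented Melexis algorithm: the NIR component is estimated from a clear pixel (which responds to visible plus NIR light), and a scaled version of it is subtracted from the raw R, G, and B signals. The calibration factor k is hypothetical.

```python
import numpy as np

def remove_nir_contamination(r, g, b, clear, k=0.5):
    """Sketch only: estimate the NIR component from the clear-pixel signal
    and subtract it from the raw R, G, B values. The factor k is a
    hypothetical calibration constant relating the clear-pixel NIR response
    to the NIR leakage seen by the color pixels."""
    # A clear pixel responds to visible + NIR, so the part of its signal
    # not explained by the color channels is taken as the NIR estimate.
    nir_estimate = np.clip(clear - (r + g + b), 0.0, None)

    # Remove the scaled NIR estimate from each color channel.
    r_true = np.clip(r - k * nir_estimate, 0.0, None)
    g_true = np.clip(g - k * nir_estimate, 0.0, None)
    b_true = np.clip(b - k * nir_estimate, 0.0, None)
    return r_true, g_true, b_true
```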

While driving at night, HDR systems combine the luminance picture (a) and RGB color image (b) to form a composite image (c).

Figure a shows that the clear pixels supply a sufficiently bright grayscale image. The relatively dark color image shown in Figure b mainly reflects the colors of the lights on an almost black background. Since the spectral energy in the visible region is small compared to the sensor’s full range of sensitivity, the brightness component of the filtered color pixels is significantly weaker than the luminance of the clear pixels. The combined final image in Figure c provides a natural representation of all traffic-related color information.
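
The fusion step itself can be pictured with the short sketch below, an assumed, simplified version of such a combination rather than the documented algorithm: the color ratios (hue and saturation) come from the dim RGB image, while the brightness is replaced by the much stronger clear-pixel luminance.

```python
import numpy as np

def fuse_luma_chroma(clear_luma, r, g, b, eps=1e-6):
    """Sketch only: keep the color ratios from the weak RGB image but take
    the brightness from the more sensitive clear (unfiltered) pixels."""
    # Brightness implied by the filtered color pixels (Rec. 601 luma weights).
    rgb_luma = 0.299 * r + 0.587 * g + 0.114 * b

    # Gain that maps the weak RGB brightness onto the clear-pixel luminance.
    gain = clear_luma / (rgb_luma + eps)

    # Scaling all three channels by the same gain preserves hue and
    # saturation while the brightness follows the clear pixels.
    return r * gain, g * gain, b * gain
```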

Reduced visibility in poor driving conditions increases the risk of accidents significantly, as the state of the road can be misjudged or obstacles not detected. Active driver assistance mechanisms based on HDR image sensors with smoother pixel response curves and employing highly advanced proprietary algorithms offer the means to enhance visibility levels in such circumstances. This can increase safety in the driving environment. Furthermore, the generation of multi-kneepoint HDR images in a single exposure can effectively eliminate the presence of motion artifacts.

On The Market

Melexis has developed a series of image sensors specifically to improve ADAS effectiveness. These devices feature high sensitivity across all the wavelengths in the NIR spectrum, as well as provisions for full-color reproduction. They employ six-kneepoint, multiple-slope CMOS pixels, whose discharge rate increases in proportion to the incident light intensity.

While dark pixels are only partially discharged during the integration time, very bright pixels discharge completely and are reset so they can begin recording again within the remaining, shortened integration time. This process can be repeated several times with ever-lower recharge voltages and shorter rest periods.
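
The net effect can be modeled as a piecewise-linear response curve. The sketch below uses a hypothetical set of kneepoints and halving slopes (not the actual pixel parameters) to show how the repeated partial resets compress the bright parts of the scene into the remaining signal swing.

```python
def multislope_response(intensity, knees, slopes):
    """Sketch of a multiple-slope pixel response: linear up to the first
    kneepoint, then each further segment continues with a smaller slope,
    approximating a logarithmic curve. knees is an ascending list of
    input intensities; slopes has one entry per segment (len(knees) + 1)."""
    signal = 0.0
    prev_knee = 0.0
    for knee, slope in zip(knees, slopes):
        if intensity <= knee:
            return signal + slope * (intensity - prev_knee)
        signal += slope * (knee - prev_knee)
        prev_knee = knee
    # Beyond the last kneepoint the final (smallest) slope applies.
    return signal + slopes[-1] * (intensity - prev_knee)

# Hypothetical six-kneepoint curve with halving slopes, for illustration only.
knees = [1, 2, 4, 8, 16, 32]
slopes = [1.0, 0.5, 0.25, 0.125, 0.0625, 0.03125, 0.015625]
for x in (0.5, 3, 10, 100):
    print(f"intensity {x:>5} -> signal {multislope_response(x, knees, slopes):.3f}")
```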

The result is what can be considered as a linear approximation of a logarithmic curve. A proprietary adaptive control algorithm is used to adjust the shutter speed and expand the dynamic range, ensuring that the incremental signal-to-noise ratio (iSNR) always remains above a minimum threshold.
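
The control loop itself is proprietary and not publicly documented, so the fragment below is only a hypothetical stand-in: a simple heuristic that shortens the exposure when too many pixels clip and lengthens it when most of the frame sits in dark, noisy codes, one crude way to keep the signal-to-noise ratio from collapsing at either end of the range.

```python
def adjust_exposure(exposure_ms, histogram, total_pixels,
                    sat_frac_max=0.01, dark_frac_max=0.25):
    """Hypothetical exposure-control heuristic (not the Melexis algorithm).
    histogram is a list of pixel counts per output code; total_pixels is
    the number of pixels in the frame."""
    saturated = histogram[-1] / total_pixels                      # clipped pixels
    dark = sum(histogram[: len(histogram) // 8]) / total_pixels   # lowest codes

    if saturated > sat_frac_max:
        exposure_ms *= 0.9    # too many clipped pixels: shorten the exposure
    elif dark > dark_frac_max:
        exposure_ms *= 1.1    # frame mostly dark: lengthen the exposure
    return exposure_ms
```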

Moving forward, new developments in camera technology not only will improve the quality of the images available to a car’s ADAS, but also will enable tuning of the system’s sensitivity. Based on data about the driver’s level of alertness, the system will be able to identify the decisive moment at which to warn the driver or, alternatively, execute an automatic braking or avoidance maneuver.

ADAS imaging performance improvements, like raised HDR sensitivity characteristics, elimination of motion artifacts, increased resolution, higher speed, and extended functional safety, are all on the horizon. They will be supplemented by other emerging sensing technologies that will see widespread deployment in the coming years, including cost-effective far infrared (FIR) cameras and time-of-flight (ToF) 3D cameras. These innovations are almost certain to help automobile manufacturers pursue their long-term goal of introducing car models that are fully autonomous.

About the Author

Cliff de Locht | Product Marketing Manager

Cliff de Locht is product marketing manager for the Optoelectronics Division of Melexis, where he is in charge of developing automotive safety and driver assistance solutions. Based at its Tessenderlo site in Belgium, he has been with the company for seven years. Previously, he was senior product manager for data networks at Belgian telecommunication operator Belgacom. He has a master’s degree in microelectronics and telecommunications from the University of Brussels. 
