Research Article | Open Access
SWIR Cameras for the Automotive Field: Two Test Cases
This paper presents the results obtained by the 2WIDE_SENSE project, an EU-funded project aimed at developing a low-cost camera sensor able to acquire the full spectrum from the visible bandwidth to the Short Wave InfraRed one (from 400 to 1700 nm). Two specific applications have been evaluated, both related to the automotive field: one regarding the possibility of detecting icy and wet surfaces in front of the vehicle and the other regarding pedestrian detection capability. The former application relies on the physical fact that water strongly absorbs electromagnetic radiation in the SWIR band around 1450 nm, so an icy or wet pavement should appear dark; the latter is based on the observation that the amount of radiation in the SWIR band is quite high even at night and in poor weather conditions. Results show that, although the combined use of the SWIR and visible spectrum seems to be a promising approach, its use in outdoor environments is not always effective.
Increasing road safety is an objective of primary importance for every political institution, and great improvements are possible through the development of more intelligent vehicles.
The ability to properly analyze the context in which the vehicle is moving, under hard real-time constraints, is strongly influenced by the availability of powerful sensors. However, such sensors are usually quite expensive, which makes the development of affordable intelligent vehicles a difficult task.
Many research efforts are therefore devoted to building cheap smart sensors that can provide data to better analyze an environment as complex as the automotive one.
The SWIR sensor presented here is such a smart, low-cost device. To validate its usefulness, this paper presents the results obtained for two different functionalities: detecting pedestrians and discriminating among wet, dry, and icy pavement. These functionalities were selected since the additional use of the SWIR bandwidth component should, theoretically, improve the results.
1.1. SWIR Band
SWIR usually identifies the part of the electromagnetic spectrum that ranges approximately from 1 μm to 2 μm.
Similarly to what happens with visible light, in standard automotive applications this band is mainly populated by the light reflected by the objects in the scene rather than by their thermal blackbody radiation; thermal emission only becomes relevant for objects with temperatures above 150°C. The applications served by SWIR are therefore those which benefit from the reduced scattering of longer wavelengths and from invisible illumination sources, such as the passive illumination provided by the night glow originating in the upper atmosphere or active illumination from eye-safe lasers. Illumination is thus an important issue when dealing with SWIR images.
Adverse weather conditions are dangerous for driving. Rain both reduces visibility and makes roadway surfaces dangerous. Wet brakes are less effective too. Snow and ice cause roads to become even more slippery, especially when the temperature is at or below freezing. Slush makes it difficult to steer, hard-packed snow increases the danger of skidding, and black ice makes driving extremely dangerous. Stopping distances on slippery pavement are from two to ten times longer than on dry pavement, so that for a vehicle travelling at 30 km/h they can reach 52 m on black ice. Moreover, anti-lock braking systems (ABS) are usually tuned for the most slippery scenario and are therefore less effective than they could be in normal situations. The detection of the general road status, or of the presence of slippery spots in front of the vehicle, can therefore significantly improve driving safety. It can be noticed that in Europe (EU-18) around 3800 casualties are due to wet, icy, or snowy situations [1]. Most of the proposed solutions to this problem are not based on a true prediction but are focused on the estimation of the road friction, namely, the monitoring of tyre slip. These approaches are mainly based on the use of inertial sensors or GPS or on the monitoring of tyre noise [2–4]. Conversely, different perception approaches have been proposed for a true prediction, like the use of radars [5] or lasers [6]. The use of standard cameras has been proposed as well [7–10], exploiting the different polarization of the light reflected from the road surface.
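The stopping-distance figures above can be checked against the standard braking model d = v²/(2μg). The friction coefficients below are typical textbook values chosen for illustration, not measurements from this work:

```python
# Illustrative braking-distance estimate, d = v^2 / (2 * mu * g).
# The friction coefficients are typical textbook values (assumed
# for illustration, not measured in this paper).
G = 9.81  # gravitational acceleration, m/s^2

def braking_distance(speed_kmh: float, mu: float) -> float:
    """Ideal braking distance in metres for a given speed and
    tyre-road friction coefficient (reaction time not included)."""
    v = speed_kmh / 3.6  # convert km/h to m/s
    return v ** 2 / (2 * mu * G)

for surface, mu in [("dry asphalt", 0.7), ("wet asphalt", 0.4), ("black ice", 0.07)]:
    print(f"{surface:12s} mu={mu:.2f} -> {braking_distance(30, mu):5.1f} m")
```

With these assumed coefficients, the dry-to-ice ratio comes out close to ten, and the black-ice distance at 30 km/h lands near the 52 m quoted above, which is consistent with the "two to ten times" range.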
However, the most promising approach seems to be the analysis of the different spectral content of the light reflected from the asphalt in dry, wet, icy, or snowy conditions [11].
More precisely, the Short Wave InfraRed (SWIR, 0.9 μm to 1.7 μm) bandwidth shows different light reflection patterns depending on the road status (see Figure 1) [12]. According to this result, some solutions based on the use of custom spectrometers have already been implemented, for example, Volvo's Road Eye or Vaisala's Road Weather Sensors family. While the use of a spectrometer can be effective, the proposed solutions are not suitable for on-board installation on vehicles.
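The classification idea behind these spectral approaches can be sketched in a few lines: water (and ice) absorbs strongly near 1450 nm but much less near 1100 nm, so a band-intensity ratio over a road region drops when the surface is wet or icy. The threshold values below are hypothetical placeholders for illustration, not calibrated figures from this work:

```python
import numpy as np

# Band-ratio sketch: water absorption near 1450 nm lowers
# I(1450)/I(1100) for wet or icy asphalt. Thresholds are
# hypothetical, not calibrated values from this paper.

def band_ratio(roi_1450: np.ndarray, roi_1100: np.ndarray) -> float:
    """Mean-intensity ratio between the two SWIR bands for one ROI."""
    return float(roi_1450.mean()) / (float(roi_1100.mean()) + 1e-9)

def classify_road(ratio: float, wet_thr: float = 0.6, icy_thr: float = 0.35) -> str:
    """Coarse three-way decision from the band ratio (illustrative thresholds)."""
    if ratio < icy_thr:
        return "icy"
    if ratio < wet_thr:
        return "wet"
    return "dry"
```

In practice the thresholds would have to be learned from calibrated acquisitions, and, as discussed later, outdoor illumination changes make a single fixed threshold problematic.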
Road condition assessment is not the only application in which SWIR technology could be used: pedestrian detection is another promising field of application.
Although several improvements in vehicle safety have been achieved in the last 25 years (e.g., crash tests, passive safety measures, new energy absorption materials), further reductions in road fatalities and injuries must be achieved. The development of active video-based driver assistance systems to preemptively detect dangerous situations involving vulnerable road users (VRUs) such as pedestrians is thus of fundamental importance for warning the driver or automatically taking control of the vehicle (e.g., braking), and becomes particularly valuable in case of driver distraction or poor visibility conditions. Yet vision-based pedestrian detection is a difficult problem for a number of reasons [13, 14].
The objects of interest appear in highly cluttered backgrounds and exhibit a wide range of appearances, due to body size and pose, clothing, and outdoor lighting conditions. Because the vehicle is moving, simple background subtraction methods (such as those used in surveillance applications) cannot be used to obtain a foreground region containing the human shape. Furthermore, pedestrians can exhibit highly irregular motion, making prediction and situation analysis difficult. Finally, there are hard real-time requirements and tight performance criteria.
A peculiar characteristic of the SWIR spectrum is that human skin, having a very high water content, absorbs much of the longer wavelengths and therefore appears very dark, if not almost black, in SWIR images (see Figure 2). Previous works within the IR bandwidth have dealt with skin detection both for face recognition [15] and for people detection [16–18], but these are ineffective for an automotive pedestrian detector, since usually very little skin area shows from the clothes. We have therefore applied a classic approach for pedestrian detection: an SVM classifier based on deformable part models [19, 20].
2. Hardware Equipment
Different solutions have been developed and used to collect data.
Solution 1 consists of a specific sensor for which a large-bandwidth lens has been developed. In addition, the camera features a filter on the sensor that enables the independent acquisition of four different spectral bandwidths. The sensor of the 2WIDE_SENSE camera module has been mainly developed by Alcatel-Thales III–V Lab and is an uncooled InGaAs- and InP-based 640 × 512 pixel array with a 15 μm pitch and a MAGIC logarithmic readout circuit (see Figure 3).
The two main features of the sensor are its large spectral sensitivity (400 nm to 1700 nm) and its logarithmic gain, which avoids saturation effects.
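The benefit of a logarithmic readout can be illustrated numerically: a linear pixel clips once the signal exceeds full scale, while a logarithmic response compresses several decades of illumination into the output range. The constants below are illustrative only, not the actual parameters of the MAGIC circuit:

```python
import math

# Illustrative comparison of linear vs logarithmic pixel response.
# Constants are assumed for illustration, not taken from the
# MAGIC readout circuit's specification.

FULL_SCALE = 1.0

def linear_response(flux: float, gain: float = 1.0) -> float:
    """Linear pixel: clips (saturates) once gain * flux hits full scale."""
    return min(gain * flux, FULL_SCALE)

def log_response(flux: float, i0: float = 1e-4) -> float:
    """Logarithmic pixel: output grows with log10 of the flux above a
    dark level i0; ~6 decades are mapped into [0, FULL_SCALE]."""
    return min(max(math.log10(flux / i0), 0.0) / 6.0, FULL_SCALE)
```

A scene containing both a dark road surface and direct sunlight spans many decades of flux; the logarithmic curve keeps both within the output range where the linear one would saturate.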
Furthermore, a specific microlens module has been developed by Optec S.p.A. (see Figure 3) to let the camera exploit its full spectrum capabilities.
The OB-V-SWIR 16 apochromatic lens is based on a combination of six elements produced using a specific moldable glass.
The lens transmittance is nearly constant over the whole operating band: 98% in the 400 nm–1550 nm interval, decreasing to 96% in the 1550 nm–1700 nm interval.
The most interesting feature of the developed camera is the presence of a Bayer-like filter used to independently acquire specific spectral bandwidths, selected according to the needs of the automotive world. More precisely, four different sapphire substrates have been grown on the pixel array and fourteen layers of TiO2 and SiO2 have been deposited on the substrates, obtaining a 4 × 4 pixel mask pattern (see Figure 4).
Each pixel filter is a high-pass filter with the following bandwidths: C, clear (no filter, full bandwidth); F1, 1350 nm–1700 nm (SWIR); F2, 1000 nm–1700 nm (SWIR); and F4, above 540 nm (red, NIR, and SWIR) (see Figure 5). The bandwidth of each filter has been selected according to the most common ADAS functions, such as Traffic Sign Recognition and High Beam Assist.
Other bandwidths can be easily obtained by combining different contributions; as an example, the blue and green bandwidths can be obtained as the difference between the C component and the F4 contribution. The large-bandwidth camera module developed during the project became available, and was therefore tested, only during the final stage of the 2WIDE_SENSE experiments.
This paper reports on the preliminary tests done using a state-of-the-art InGaAs camera module with the OB-V-SWIR 16 microlens and high-pass SWIR filters applied to the lens (transmission bands as F1 and F2).
Solution 2 consists of a multispectral camera module. It was developed during the project and was therefore available, and tested, only during the final stage of the 2WIDE_SENSE experiments. A detailed description of the camera sensor, the filter pattern, and the large-bandwidth lens has been provided above.
Conversely, most of the tests have been carried out using a state-of-the-art InGaAs camera module equipped with a high-transmission SWIR lens. In order to mimic and evaluate the most suitable filters for the final prototype, a number of different filters have been used and tested (see Figure 3).
The camera used for the tests is the OWL SW1.7, a high-sensitivity InGaAs FPA produced by Raptor Photonics and equipped with a sensor developed by Alcatel-Thales III–V Lab, both partners of the project consortium. The camera has a sensitivity bandwidth in the 400 nm–1700 nm interval, covering the whole spectrum from the visible to the SWIR, and acquires 14-bit images with exposure times in the 500 ns–500 μs interval.
The lens used is the OB-SWIR25/2, developed and produced by Optec S.p.A. It is a high-transmission lens featuring a transmission rate above 94% in the 900 nm–1700 nm interval. The focal length is 25 mm, with a 35.5° angle of view.
In order to test a number of spectrum bandwidths and to compare the quantity of light reflected by the asphalt for different conditions and wavelengths, several filters have been used. In the preliminary phase of the project, tunable liquid crystal filters were employed to perform several temporally sequential acquisitions. These tunable filters allowed the selection of different wavelengths with a 20 nm bandwidth resolution from 850 nm to 1800 nm and a transmittance of around 60%. In the following phase, a filter wheel (see Figure 6(b)) with 12 filters was installed between the lens and the camera, allowing selection among the available filters. This is a manual operation and therefore limits the use of the filters to still scenes.
3.1. Results for Road Safety
Outdoor tests have been performed using both the state-of-the-art InGaAs camera with the OB-SWIR25/2 lens and the filter wheel, as shown in Figure 6(a). The acquisition sessions for this activity were carried out at daytime, in sunny and cloudy weather conditions, with the road surface dry, wet, or icy in some areas, as shown in the examples reported in Figure 7.
All combinations of gain and integration time values were also investigated to find the most suitable acquisition parameters for the RSM function. Some examples of these tests are shown in Figure 8.
Dry, wet, and icy road conditions at daytime have been investigated. In the following, two scenes showing different illumination and road conditions have been selected (see Figures 9 and 10). The spectral analysis has been done measuring the intensity values of the selected ROIs by using the filters included in the filter wheel operating in the SWIR bandwidth only.
The resulting ratios, shown in Figures 9 and 10, underline a behavior comparable to the indoor data, although some relevant differences are noticeable:
(i) ratio values differ from the laboratory ones due to the different source spectrum (halogen lamp in the lab, sun outdoors);
(ii) due to changes in illumination conditions (clouds, etc.) during the acquisition (spectra are collected by means of temporally sequential measurements through the filter wheel filters), it is not possible to find a ratio that acts as a good indicator of the road condition.
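The two effects listed above can be illustrated numerically: a purely multiplicative illumination change (such as a lamp dimming indoors) cancels in a band ratio, whereas spectrally selective cloud absorption (stronger near 1500 nm) shifts the ratio itself even when the road state is unchanged. The attenuation factors below are made up purely for illustration:

```python
# Numerical illustration of the two listed effects. ROI intensities
# and attenuation factors are invented for illustration only.

i_1100, i_1500 = 100.0, 60.0     # clear-sky ROI intensities (arbitrary units)
ratio_clear = i_1500 / i_1100    # baseline band ratio

# Case 1: uniform 50% dimming -> both bands scale equally, ratio unchanged.
ratio_dimmed = (0.5 * i_1500) / (0.5 * i_1100)

# Case 2: clouds attenuate 1500 nm more than 1100 nm -> ratio drops
# even though the road condition did not change.
ratio_cloudy = (0.4 * i_1500) / (0.8 * i_1100)

print(ratio_clear, ratio_dimmed, ratio_cloudy)
```

This is why a fixed ratio threshold that works under a stable halogen lamp fails outdoors: the cloudy-sky ratio is indistinguishable from a genuinely wetter surface.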
Taking the previous considerations into account, several measurements were made to characterize how the presence of clouds affects the ratios. Spectra on a day of changeable weather, initially cloudy and then clear, were collected using a calibrated spectrometer, and their temporal evolution was compared to the theoretical solar spectra at sea level. In order to understand the contribution of the clouds, some outdoor spectra were also collected during a cloudy day. The table of Figure 9 shows the variations of different spectral ratios during the acquisition. It was not possible to evaluate the I(1500)/I(1100) ratio due to the limited spectral sensitivity of the spectrometer.
From these measurements it can be noticed that illumination changes not only affect the intensity levels at all wavelengths but, due to the extra absorption by the water molecules in clouds, also attenuate some wavelength ranges (e.g., around 1500 nm) more strongly than others.
Our tests have shown that for indoor acquisitions the lamp spectrum affects the results only by a multiplicative factor; outdoors, the unpredictable changes in illumination not only affect the intensity levels at all wavelengths but, due to the extra absorption by cloud water molecules, also affect different wavelength ranges differently. Image processing techniques applied to satellite and airborne pictures have also been considered in search of a procedure to limit this unwanted behavior, but all such spectral analysis techniques are applied to images clear of clouds, a hard restriction which is totally unsuitable for functions to be applied in the automotive field.
3.2. Results for Pedestrian Detection
A database of more than 10,000 images in different illumination and weather conditions with varied combinations of gain and exposure time has been collected, paying special attention to cases of reduced visibility caused by haze and fog (see Figures 11 and 12). A thorough investigation of images acquired in the SWIR bandwidths and comparisons with images acquired in the visible spectrum have been carried out.
In the following subsections three visibility conditions will be dealt with: clear sky, haze, and fog. To detect pedestrians in the SWIR spectrum, the object detection method based on deformable part models illustrated in [19, 20] has been employed. Since it is based on both contrast-sensitive and contrast-insensitive HOG features, we found that training the classifier on visible images only was suitable for detection on SWIR images, with comparable detection rates. The following results have therefore been obtained by training the classifier on datasets publicly available on the web (the PASCAL datasets [21]) featuring images acquired in the visible spectrum only.
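The reason a visible-trained classifier transfers to SWIR lies in the nature of the HOG features: gradient orientations are histogrammed per cell, and in the contrast-insensitive variant the gradient sign is folded away, so absolute intensity levels and contrast polarity matter little. A minimal sketch of one such cell histogram (not the full deformable-part-model pipeline of [19, 20]):

```python
import numpy as np

# Contrast-insensitive HOG cell histogram sketch: orientations are
# folded to [0, 180) degrees so the feature ignores contrast polarity,
# which is what makes a visible-trained detector usable on SWIR frames.

def hog_cell_histogram(cell: np.ndarray, n_bins: int = 9) -> np.ndarray:
    """Orientation histogram for one image cell, weighted by gradient
    magnitude and L2-normalised."""
    gy, gx = np.gradient(cell.astype(np.float64))
    mag = np.hypot(gx, gy)
    # Fold orientation to [0, 180): the gradient sign is discarded.
    ang = np.degrees(np.arctan2(gy, gx)) % 180.0
    bins = np.minimum((ang / (180.0 / n_bins)).astype(int), n_bins - 1)
    hist = np.bincount(bins.ravel(), weights=mag.ravel(), minlength=n_bins)
    return hist / (np.linalg.norm(hist) + 1e-12)
```

Note that inverting an image (dark skin on a bright background versus the reverse) leaves this histogram unchanged, which is the property exploited when running the detector on SWIR imagery.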
Images acquired in the SWIR spectrum under clear sky conditions show that objects with high water content, like human skin, appear much darker than in visible-only images. Nonetheless, this peculiar characteristic is not very useful for effectively detecting pedestrians, as the skin areas emerging from the clothes are of variable size and not in a fixed position; even the face (not always visible, e.g., in a rear view) could be partially covered by sunglasses, a scarf, and so on.
In addition, image contrast may change significantly across seasons, with light changes or particular atmospheric conditions such as high humidity levels and other absorption phenomena, making the skin color an unreliable indicator for automotive applications (see Figure 13). Through the classification process, correct detection rates comparable to those obtained on visible-only images are achievable, but with no practical advantage in employing a SWIR sensor with respect to a standard visible-only one (see Figure 14).
Haze is an atmospheric phenomenon in which dust, smoke, and other wet or dry particles obscure the clarity of the sky. SWIR wavelengths are able to penetrate these particle layers, making visibility at a distance clearer (see Figure 15). However, due to the space between the particles, haze becomes perceptible only from kilometers away, making any pedestrian detection application for the automotive field of questionable utility.
Acquisitions carried out in foggy conditions have shown that, despite the capability of longer wavelengths to penetrate suspensions of water particles, in the presence of fog clear visibility is not achievable with a SWIR sensor (see Figure 17). Due to the peculiar nature of this atmospheric phenomenon, the scattering effect, predominantly in the forward direction, affects the SWIR wavelengths as well, making imaging at a distance impossible. The classifier returns correct detections only when the pedestrian is close enough to the camera (see Figures 16 and 18).
The experiments carried out with the previously described sensor, both for icy and wet pavement detection and for pedestrian detection in a real-world context, have not been fully satisfying. Both applications have proven to be very sensitive to the environmental conditions, in terms of both weather and illumination.
The idea of adopting a SWIR sensor in the automotive field is nonetheless not ill-posed: the SWIR spectrum presents very interesting physical properties but, in order to effectively exploit them in real-world applications, it is mandatory to define proper strategies to address the above-mentioned issues.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
- KfV, NTUA, SWOV, and TRL, “SafetyNet, annual statistical report 2008,” Tech. Rep., European Road Safety Observatory, 2008.
- S.-I. Akama, T. Tabaru, and S. Shin, “Bayes estimation of road surface using road noise,” in Proceedings of the 30th Annual Conference of IEEE Industrial Electronics Society (IECON '04), pp. 2923–2928, IEEE Computer Society, Busan, Korea, 2004.
- M. Bian, K. Li, D. Jin, and X. Lian, “Road condition estimation for automotive anti-skid control system based on BP neural network,” in Proceedings of the IEEE International Conference on Mechatronics and Automation (ICMA '05), pp. 1017–1022, IEEE Computer Society, Niagara Falls, Canada, 2005.
- P. P. Lin, M. Ye, and K.-M. Lee, “Intelligent observer-based road surface condition detection and identification,” in Proceedings of the IEEE International Conference on Systems, Man and Cybernetics (SMC '08), pp. 2465–2470, IEEE Computer Society, Singapore, 2008.
- V. Viikari, T. Varpula, and M. Kantanen, “Automotive radar technology for detecting road conditions. Backscattering properties of dry, wet, and icy asphalt,” in Proceedings of the 5th European Radar Conference (EuRAD '08), pp. 276–279, IEEE Computer Society, Amsterdam, The Netherlands, 2008.
- M. Mika, Y. Hiroyuki, K. Takao, I. Takeshi, and S. Mitsuo, “Road surface condition detector using high peak power fiber laser,” Transactions of the Institute of Electrical Engineers of Japan, vol. 10, no. 4, pp. 1198–1204, 2000.
- R. Omer and L. Fu, “An automatic image recognition system for winter road surface condition classification,” in Proceedings of the 13th International IEEE Conference on Intelligent Transportation Systems (ITSC '10), pp. 1375–1379, 2010.
- M. Jokela, M. Kutila, and L. Le, “Road condition monitoring system based on a stereo camera,” in Proceedings of the IEEE 5th International Conference on Intelligent Computer Communication and Processing (ICCP '09), pp. 423–428, 2009.
- J. Casselgren, M. Kutila, and M. Jokela, “Slippery road detection by using different methods of polarised light,” in Advanced Microsystems for Automotive Applications, pp. 207–220, 2012.
- Y. Lu and M. J. Higgins-Luthman, “Black ice detection and warning system,” US Patent App. 11/948,086, 2007.
- J. Casselgren, M. Sjödahl, and J. LeBlanc, “Angular spectral response from covered asphalt,” Applied Optics, vol. 46, no. 20, pp. 4277–4288, 2007.
- J. Casselgren, M. Sjödahl, and J. P. LeBlanc, “Model-based winter road classification,” International Journal of Vehicle Systems Modelling and Testing, vol. 7, no. 3, pp. 268–284, 2012.
- E. Binelli, A. Broggi, A. Fascioli et al., “A modular tracking system for far infrared pedestrian recognition,” in Proceedings of the IEEE Intelligent Vehicles Symposium, pp. 759–764, Las Vegas, Nev, USA, 2005.
- A. Broggi, A. Cappalunga, C. Caraffi et al., “The passive sensing suite of the TerraMax autonomous vehicle,” in Proceedings of the IEEE Intelligent Vehicles Symposium (IV '08), pp. 769–774, Eindhoven, Netherlands, 2008.
- H. Chang, A. Koschan, M. Abidi, S. G. Kong, and C.-H. Won, “Multispectral visible and infrared imaging for face recognition,” in Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (CVPRW '08), pp. 1–6, IEEE Computer Society, Anchorage, Alaska, USA, 2008.
- A. S. Nunez and M. J. Mendenhall, “Detection of human skin in near infrared hyperspectral imagery,” in Proceedings of the IEEE International Geoscience and Remote Sensing Symposium, pp. II621–II624, IEEE Computer Society, 2008.
- A. L. Brooks, Improved multispectral skin detection and its application to search space reduction for dismount detection based on histograms of oriented gradients [M.S. thesis], Air Force Institute of Technology, Wright-Patterson Air Force Base, Ohio, USA, 2012.
- G. A. Kilgore and P. R. Whillock, “Skin detection sensor,” United States Patent Office, Publication no. US2007/0106160A1, Application no. 11/264,654, Issued patent US7446316, 2008-11-04, 2008.
- P. F. Felzenszwalb, R. B. Girshick, and D. McAllester, “Cascade object detection with deformable part models,” in Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR '10), pp. 2241–2248, 2010.
- P. F. Felzenszwalb, R. B. Girshick, D. McAllester, and D. Ramanan, “Object detection with discriminatively trained part-based models,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 32, no. 9, pp. 1627–1645, 2010.
- M. Everingham, L. van Gool, C. Williams, J. Winn, and A. Zisserman, “The PASCAL Visual Object Classes,” http://pascallin.ecs.soton.ac.uk/challenges/VOC/.
Copyright © 2014 Nicola Bernini et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.