Article

Hyperspectral Image-Based Night-Time Vehicle Light Detection Using Spectral Normalization and Distance Mapper for Intelligent Headlight Control

1 Department of Electronic Engineering, Yeungnam University, 280, Daehak-ro, Gyeongsan-si, Gyeongsangbuk-do KS011, Korea
2 Daegu Gyeongbuk Institute of Science and Technology, 333, Techno jungang-daero, Hyeonpung-myeon, Dalseong-gun, Daegu KS002, Korea
* Author to whom correspondence should be addressed.
Sensors 2016, 16(7), 1058; https://doi.org/10.3390/s16071058
Submission received: 15 April 2016 / Revised: 19 June 2016 / Accepted: 22 June 2016 / Published: 8 July 2016
(This article belongs to the Special Issue Non-Contact Sensing)

Abstract: This paper proposes a vehicle light detection method using a hyperspectral camera, instead of a Charge-Coupled Device (CCD) or Complementary Metal-Oxide-Semiconductor (CMOS) camera, for adaptive car headlamp control. To apply Intelligent Headlight Control (IHC), the vehicle headlights need to be detected. Headlights comprise a variety of light sources, such as Light Emitting Diodes (LEDs), High-Intensity Discharge (HID) lamps, and halogen lamps; rear lamps use LED or halogen sources. This paper reviews recent research on IHC. Several problems arise when detecting headlights in CCD or CMOS images, such as erroneous detection of street lights or sign lights and of reflections of the ego-car's own light. To solve these problems, this study uses hyperspectral images, because they have hundreds of bands and provide more information than a CCD or CMOS camera. The proposed method detects headlights using the Spectral Angle Mapper (SAM), Spectral Correlation Mapper (SCM), and Euclidean Distance Mapper (EDM). The experimental results highlight the feasibility of the proposed method for three types of lights (LED, HID, and halogen).

1. Introduction

Intelligent headlight control (IHC) is an important component of advanced driver assistance systems (ADAS) for night-time driving [1]. IHC systems aim to adjust a vehicle's high beam automatically to illuminate the road ahead. The higher rate of vehicle accidents at night than during the day highlights the need for an IHC system [2]. Another important topic is improving road visibility at night [3]. Several detection techniques have been used for IHC systems, such as radar systems [4], microwave detectors [5], ultrasonic sensors [6], and infrared thermal sensors. These methods, however, are very expensive, so recent research has moved toward vision-based intelligent night-time driver assistance and surveillance systems (VIDASS systems) using a dashboard camera, because of its low cost and potential to collect a large amount of information [7]. Two kinds of cameras are available for developing an IHC system: Complementary Metal-Oxide-Semiconductor (CMOS) [8] and Charge-Coupled Device (CCD) cameras [9]. A CMOS sensor is an integrated-circuit image sensor, whereas a CCD sensor forms an image by shifting electrical charge across the device. Research on vehicle detection during the day has been reported [10], but such methods are not useful at night. Other vehicle light detection systems have been reported, based on grayscale images [11] with the Otsu algorithm [12] or blob detection techniques [13,14,15]. Several of these studies also used a Region of Interest (ROI) [12,16]. A probabilistic tracking method that represents the associations of two blobs across different frames has also been reported [17]. Another detection method uses R3I sensors, whose different appearance features serve as the input to a novel classifier-based module that yields a degree of resemblance to a vehicle light [18].
These studies can be applied to hyperspectral image processing, because visible hyperspectral images include the red, green, and blue (RGB) bands, provided the hyperspectral camera has a high frame rate. Figure 1 shows the limitations of CCD-based headlight detection using the threshold method: (a) is a composite image built from the RGB bands of a hyperspectral image; and (b) is the headlight detection result as a binary image. This method, however, is not adequate for headlight detection, as shown in Figure 1, for the following reasons. First, other lights can be detected as targets (headlights), such as signboard lights, streetlights, and traffic lights, as in Figure 1 (where a streetlight is detected as a headlight). Second, some studies used an ROI, but the position of the ROI changes with the location of the installed camera. Therefore, this study examined another way of detecting vehicle headlights using hyperspectral images. Hyperspectral images are collected at very narrow wavelength intervals [19]. In addition, hyperspectral images provide more information than multispectral images and CCD images. Hyperspectral imaging combines imaging and spectroscopy to acquire both spatial and spectral data simultaneously [20]. For example, in the image of Figure 1, each pixel carries spectral data, as shown in Figure 3. In this paper, each pixel contains a great deal of spectral information because the number of bands is 258. Hyperspectral images therefore have many applications, such as detecting mechanical damage in fruit for food quality inspection [21], and identifying cancer [22] or tumors [23] in the medical field. This paper conducts experiments to detect vehicle headlights, such as light emitting diodes (LED), high-intensity discharge (HID), and halogen lamps, toward developing an IHC system. In a headlight lens, glass dispersion shifts the position of the focal point with wavelength.
For example, the focal point of blue light is closer to the lens than that of red light, because the refractive index for blue light is typically higher than for red light [24]. This paper proposes a headlight detection method using hyperspectral images instead of CCD or CMOS cameras. In addition, this paper compares preprocessing methods (raw data vs. mean-centered normalization) under detection metrics such as the Spectral Angle Mapper (SAM), Spectral Correlation Mapper (SCM), and Euclidean Distance Mapper (EDM). Section 2 explains the purpose of this research and the experimental scenario. Section 3 describes the proposed vehicle headlight detection system using detection metrics after normalization. Section 4 presents the experimental results. Section 5 reports the conclusions.

2. Research Objectives and Experiment Scenario

A hyperspectral camera sensor must be commercialized to apply the IHC technique using hyperspectral imaging technology. In particular, the lens design is a very important part. The projected area of a lamp can be calculated geometrically from a pixel illumination model. The larger the lens magnification, the larger the projected area of the lamp in an image. If the projected area of the lamp is less than 1, the lamp fits into a single pixel [25]. This paper presents four types of experimental results. The first experiment detects headlights (LED, HID, and halogen lamps) and distinguishes them from street lamps. The second experiment distinguishes headlights from the ego-car's light reflected by roadside infrastructure. The third experiment examines headlight detection capability as a function of target distance (maximum 500 m). The final experiment detects rear lamps, such as LED and halogen lamps. Headlight detection should be possible at a distance of at least 750 m in a straight line. On the other hand, it is very difficult to obtain a sufficient distance for the experiment. Therefore, this paper compared the headlight detection capability at target distances of 50 m and 500 m. This paper presents the results of spectral analysis for active high-beam control, and it will thus assist future studies of IHC using spectral information. A hyperspectral image can also be used for lane-marking detection and traffic sign recognition by extracting selected bands (e.g., the RGB bands), if the hyperspectral camera has a high resolution and frame rate comparable to a current dashboard camera. The current state-of-the-art hyperspectral sensor can provide 16–25 bands at a frame rate of 10 Hz. Figure 2 summarizes the headlight detection flow, including a comparison of preprocessing methods, detection methods, and an evaluation of the performance of each method and the previous method.
Three types of headlights were used: LED, HID, and halogen. Preprocessing can be divided into two types. First, the RGB data is extracted from the hyperspectral image; the red, green, and blue bands are 639 nm, 549.41 nm, and 457.97 nm, respectively. The Mean-Norm is mean-centered normalization [26]. Headlights are then detected using spectral normalization and a spectral distance mapper, such as the Spectral Angle Mapper (SAM), Spectral Correlation Mapper (SCM), or Euclidean Distance Mapper (EDM).
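As an illustration of the RGB extraction step above, the following sketch picks the band indices nearest to the stated red, green, and blue wavelengths. The evenly spaced 400–1000 nm band grid and the variable names are assumptions made for illustration, not details taken from the paper.

```python
import numpy as np

# Assumed band centers: 258 bands evenly spaced over the camera's
# 400-1000 nm spectral range (the paper does not state the exact grid).
wavelengths = np.linspace(400.0, 1000.0, 258)

# RGB wavelengths used in the paper for the composite image.
rgb_targets = {"red": 639.0, "green": 549.41, "blue": 457.97}

# For each color, take the band whose center is closest to the target.
rgb_bands = {name: int(np.argmin(np.abs(wavelengths - wl)))
             for name, wl in rgb_targets.items()}

# A composite image would then be built as
# cube[:, :, [rgb_bands["red"], rgb_bands["green"], rgb_bands["blue"]]].
```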

3. Proposed Vehicle Headlight Detection System Using Hyperspectral Image

Figure 3 presents the spectral profiles (at 10 random points) when the distance from the hyperspectral camera to the vehicle is 300 m. The peak value of the LED headlight is observed at 445.71 nm in the visible spectrum, because LED lamps do not emit ultraviolet or infrared wavelengths; this makes them more efficient than other light sources, such as halogen and HID lamps [27]. HID lamps produce light from the radiant energy generated by a gas discharge [28]. Halogen lamps emit light when a tungsten filament is heated to high temperatures [29]. Therefore, each headlight source has different spectral features.
Figure 4 presents the spectral profile of a street lamp. Street lamps have spectral profiles distinct from those of headlamps.

3.1. Preprocessing Spectral Profile (Normalized Spectral Profile)

Normalization techniques are used to solve the problem that the same material can produce different spectral profiles due to shading and shadow effects [26]. A range of normalization methods exists. Max-Norm (max-normalization) divides the spectral profile by the maximum value in a pixel [30]. Norm-Norm (norm-normalization) divides the spectral profile by the norm value in a pixel [31]. The most common technique is Mean-Norm, i.e., mean-centered normalization [32], which can be expressed as Equation (1):
$$I(P,\lambda) = c_0 \, \frac{i(P,\lambda)}{\frac{1}{N} \sum_{\lambda=1}^{N} i(P,\lambda)}$$
where I(P, λ) is the normalized spectral intensity of the Pth pixel at the λth band in the hyperspectral image, i(P, λ) is the raw spectral intensity, N is the total number of bands (258 in this paper), and c₀ is a scale factor used to bring the values into a displayable range [30]; the constant c₀ may be omitted. In Equation (1), the raw intensity i(P, λ) is divided by the mean of the raw spectral profile. Figure 5 presents the three headlight profiles after Mean-Norm processing: (a) the LED headlight, (b) the HID headlight, and (c) the halogen headlight.
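Equation (1) amounts to dividing each pixel's spectrum by its own mean. A minimal sketch of this step (the function name `mean_norm` is hypothetical; the code assumes only what the equation states):

```python
import numpy as np

def mean_norm(spectrum, c0=1.0):
    """Mean-centered normalization of one pixel's spectral profile:
    divide the raw intensities by their mean over all N bands.
    c0 is the optional display scale factor from Equation (1)."""
    spectrum = np.asarray(spectrum, dtype=float)
    return c0 * spectrum / spectrum.mean()
```

Because the spectrum is divided by its own mean, a uniformly brighter or darker copy of the same spectrum (e.g., due to shading) maps to the same normalized profile, which is the motivation given in the text.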

3.2. Processing Headlight Detection

This study used three detection metrics: SAM, SCM, and EDM. These methods were compared for each headlight type (LED, HID, and halogen).

3.2.1. Spectral Angle Mapper (SAM)

The SAM algorithm is based on the idealized assumption that a single pixel of a remotely sensed image represents one ground cover material and can therefore be assigned uniquely to a single class. SAM is a simple measurement of the spectral similarity between two spectra [33]: it computes the angle between the reference spectrum vector and each pixel vector in N-dimensional space, as in Equation (2), where SAM(P) is the spectral angle between the reference spectrum and the test spectrum of the Pth pixel. The reference spectrum is a learned profile, and the test spectrum is the profile of an image pixel being classified as headlamp or not.
$$\mathrm{SAM}(P) = \cos^{-1}\!\left( \frac{\sum_{n=1}^{N} i_{ref}\, i_{test}}{\sqrt{\sum_{n=1}^{N} i_{ref}^{2}}\; \sqrt{\sum_{n=1}^{N} i_{test}^{2}}} \right)$$
Geometrically, the smaller the angle (close to 0°), the more similar the materials of the two spectra; the larger the angle (close to 90°), the more dissimilar they are, as shown in Figure 6.
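Equation (2) is the angle between two vectors, so it can be sketched directly with a dot product and vector norms (`sam` is a hypothetical function name; the clipping guard is an implementation detail added here against floating-point rounding, not part of the paper):

```python
import numpy as np

def sam(i_ref, i_test):
    """Spectral Angle Mapper: angle (in radians) between a reference
    spectrum and a test spectrum, per Equation (2). 0 means the spectra
    point in the same direction; pi/2 means they are orthogonal."""
    i_ref = np.asarray(i_ref, dtype=float)
    i_test = np.asarray(i_test, dtype=float)
    cos_angle = np.dot(i_ref, i_test) / (
        np.linalg.norm(i_ref) * np.linalg.norm(i_test))
    # Clip to [-1, 1] so rounding error cannot push arccos out of domain.
    return float(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
```

Note that scaling either spectrum by a positive constant leaves the angle unchanged, which is why SAM is insensitive to overall brightness.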

3.2.2. Spectral Correlation Mapper (SCM)

The SCM was introduced [35] to measure the spectral similarity between the reference spectral intensity (I_ref) and the test spectral intensity (I_test):
$$\mathrm{SCM}(P) = \frac{N \sum_{n=1}^{N} i_{ref}\, i_{test} - \sum_{n=1}^{N} i_{ref} \sum_{n=1}^{N} i_{test}}{\sqrt{\left[ N \sum_{n=1}^{N} i_{ref}^{2} - \left( \sum_{n=1}^{N} i_{ref} \right)^{2} \right] \left[ N \sum_{n=1}^{N} i_{test}^{2} - \left( \sum_{n=1}^{N} i_{test} \right)^{2} \right]}}$$
where I_ref is the average spectrum of each headlight (measured at a distance of 300 m from the hyperspectral camera to the vehicle) and I_test is the test spectrum in the test image (measured at a distance of 450 m). The SCM has the advantage that the similarity is expressed as a single number in [−1, 1], so inverse correlation can also be distinguished.
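Equation (3) is the Pearson correlation between the two spectra, so it can be sketched equivalently by mean-centering both vectors first (`scm` is a hypothetical name; the mean-centered form is an algebraic rearrangement of the sum-based formula, assumed here for clarity):

```python
import numpy as np

def scm(i_ref, i_test):
    """Spectral Correlation Mapper: Pearson correlation between a
    reference and a test spectrum, per Equation (3). +1 means perfectly
    correlated profiles, -1 means inversely correlated profiles."""
    i_ref = np.asarray(i_ref, dtype=float)
    i_test = np.asarray(i_test, dtype=float)
    r = i_ref - i_ref.mean()    # mean-centered reference spectrum
    t = i_test - i_test.mean()  # mean-centered test spectrum
    return float(np.dot(r, t) / (np.linalg.norm(r) * np.linalg.norm(t)))
```

Unlike SAM, the mean-centering makes SCM invariant to an additive offset as well as to positive scaling, and the sign captures inverse correlation.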

3.2.3. Euclidean Distance Mapper (EDM)

The EDM is one of the most popular spectral similarity measures and has been used widely in multispectral and hyperspectral image analysis. The EDM is expressed as Equation (4) in N-dimensional space [36].
$$\mathrm{EDM}(P) = \sqrt{ \sum_{n=1}^{N} \left( I_{ref} - I_{test} \right)^{2} }$$
The Euclidean distance measures the spectral distance between two spectral profiles in the N-dimensional (number of bands) spectral feature space. Figure 7 illustrates the EDM in two bands. When the optimization goal is to minimize the squared error, the negative Euclidean distance can be used [37].
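Equation (4) is the ordinary vector norm of the difference between the two spectra; a one-line sketch (`edm` is a hypothetical name):

```python
import numpy as np

def edm(i_ref, i_test):
    """Euclidean Distance Mapper, per Equation (4): spectral distance
    between two profiles in N-dimensional band space; 0 means identical."""
    diff = np.asarray(i_ref, dtype=float) - np.asarray(i_test, dtype=float)
    return float(np.linalg.norm(diff))
```

In contrast to SAM and SCM, EDM is sensitive to overall brightness, which is consistent with the paper finding that the raw-data (unnormalized) variant worked best with EDM for the halogen lamp.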

4. Experimental Results

4.1. Hyperspectral Image Acquisition System

The hyperspectral image acquisition system consisted of a SPECIM VNIR camera (Middleton Spectral Vision, Tallahassee, FL, USA) mounted on a rotary tripod (Table 1). The original image contained a total of 1032 bands at the highest spectral resolution; however, this study used only 258 bands to detect the headlamps, in order to reduce the number of dimensions. The spectral range was 400–1000 nm. The radiance data was saved in 12-bit binary files. Although the full image size was 1392 × 1040 pixels, only a 200 × 350 pixel region was used, to reduce the data volume.
The headlamps tested were from a Mercedes-Benz E-Class (LED lamp) (Friedberg, Germany), a Hyundai Grandeur-HG (HID lamp) (Daegu, Korea), a Kia Sorento (halogen lamp) (Daegu, Korea), a Hyundai Sonata (halogen lamp) (Daegu, Korea), and a Kia Cerato (halogen lamp) (Daegu, Korea). Unlike the detection of light reflected from a material, this experiment minimized the aperture size of the hyperspectral camera in order to detect the light coming directly from the lamp. Matlab R2015b (The MathWorks Inc., Natick, MA, USA) was used to analyze the spectral profiles and to process the headlamp detection. Figure 8 presents the visible near-infrared (VNIR) hyperspectral image acquisition system installed on a parking lot at Yeungnam University (Gyeongsan-si, Korea). The distance between the camera and the automotive headlamp was 450 m (test image).

4.2. First Experiment: Detection of Three Lamp Types

Three types of evaluations were applied to each method combination for the detection of each headlight type (LED, HID, and halogen). This experiment distinguishes each headlamp from street lamps. The preprocessing and detection methods described above were used.

4.2.1. Evaluation Performance Using ROC and AUC in the First Experiment

First, the methods were evaluated using the Receiver Operating Characteristic (ROC) curve and the Area Under the Curve (AUC), so that the best combination can be found by comparison with the previous headlight detection method (the Otsu algorithm applied after grayscale conversion). In the experiment, the detection results can be compared via the AUC of the ROC analysis of the detectors. A mask image was made, as shown in Figure 9, to compare the detection methods using the ROC and AUC.
Figure 10 shows the ROC results of headlight detection using each method. The types of headlights are listed at the beginning of each row in the figure, and the three detection metrics head the columns. Each subfigure presents two results: from the hyperspectral image (proposed method) and from the RGB image (previous method). Figure 10 shows that the methods differ in false positive rate (FPR, i.e., 1 − specificity) at the same true positive rate (TPR, i.e., sensitivity) [38]. The number of thresholds is 1000, at equal intervals.
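The threshold sweep described above can be sketched as follows. This is a minimal illustration under stated assumptions: `roc_points` and `auc` are hypothetical names, the thresholds are spread evenly over the score range as in the paper's 1000-threshold setup, and the AUC is computed by trapezoidal integration (the paper does not state its integration rule).

```python
import numpy as np

def roc_points(scores, mask, n_thresholds=1000):
    """Sweep evenly spaced thresholds over a detection-score image and
    return (FPR, TPR) pairs against a ground-truth mask. 'scores' holds
    per-pixel similarity scores (higher = more headlight-like); 'mask'
    marks the true headlight pixels."""
    scores = np.asarray(scores, dtype=float).ravel()
    mask = np.asarray(mask, dtype=bool).ravel()
    thresholds = np.linspace(scores.min(), scores.max(), n_thresholds)
    fpr, tpr = [], []
    for t in thresholds:
        detected = scores >= t
        tp = np.sum(detected & mask)       # true positives at this threshold
        fp = np.sum(detected & ~mask)      # false positives at this threshold
        tpr.append(tp / max(mask.sum(), 1))
        fpr.append(fp / max((~mask).sum(), 1))
    return np.array(fpr), np.array(tpr)

def auc(fpr, tpr):
    """Area under the ROC curve by the trapezoidal rule, after sorting
    the points by FPR (ties broken by TPR)."""
    order = np.lexsort((tpr, fpr))
    f = np.asarray(fpr, dtype=float)[order]
    t = np.asarray(tpr, dtype=float)[order]
    return float(np.sum((f[1:] - f[:-1]) * (t[1:] + t[:-1]) / 2.0))
```

A detector whose scores perfectly separate headlight pixels from background yields an AUC of 1.0, which is why the AUC values in Table 2 can rank the preprocessing/metric combinations.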
Table 2 summarizes the AUC metric in each headlight along with a combination with the preprocessing and detection metric.
In the case of the LED headlight, the best detection metric was the SCM and the AUC values were higher than with previous methods using the SCM (Raw, Mean-Norm) and the EDM (Raw). In the case of the HID headlight, the best combination method was the SAM detection metric and Mean-Norm preprocessing. In addition, all AUC values were higher than with previous methods. In the case of the halogen headlight, the best combination method was the EDM detection metric and raw data.

4.2.2. Detection Performance Given the Same False Positive Rate (FPR)

Figure 11 presents the results at the same FPR and compares the proposed method with the previous method [12]. Previous methods detected headlights using grayscale images, an ROI, and blobs from a CCD or CMOS dashboard camera. The proposed method detects headlights using the best combination method on a hyperspectral image. Therefore, the results are compared between grayscale and the best combination method. The detection performance was compared when the FPR was lower than 0.0001; comparing the methods at such a low FPR is more informative because the number of target pixels is very small compared to the entire image. In Figure 11, there are three types of headlights: LED, HID, and halogen. For the hyperspectral image, the best combinations were Mean-Norm with SCM for LED, Mean-Norm with SAM for HID, and raw data with EDM for halogen. For the previous method, the thresholds were the 337th, 510th, and 310th for LED, HID, and halogen, respectively, in the RGB image. For the proposed method, the thresholds were the 45th, 565th, and 421st in the hyperspectral image. The total number of thresholds was one thousand.

4.3. Second Experiment: Distinction of Headlights from the Reflections of Ego-Car

The second experiment was conducted to distinguish headlights from reflections of the ego-car's light by roadside infrastructure. Figure 12 shows the experimental environment. In the center of Figure 12, the green squares mark the halogen headlamps of an oncoming car; the red square marks the reflector posts; and the yellow circle marks the hyperspectral camera.

4.3.1. Comparison of the Spectral Information between Halogen Lamp and the Reflection from a Halogen Lamp

The halogen lamp and the reflections of the ego-car show different spectral information. In particular, the yellow reflector, whose spectrum is shown in Figure 13c, absorbs in the blue region (380–500 nm). Similarly, the red reflector, whose spectrum is shown in Figure 13d, absorbs in the blue and green regions (380–580 nm). In addition, the white reflector showed a spectral profile different from that of the halogen lamp, as shown in Figure 14.

4.3.2. Headlight Detection Results in the Reflections of Ego-Car Environment

Figure 15 shows the experimental results of distinguishing the headlights from the reflection plates. Note that the proposed hyperspectral image analysis method can remove the false detections caused by the reflection plates.

4.4. Third Experiment: Comparing Headlight Detection at Near and Far Distances

This experiment compares headlight detection at near and far distances. Figure 16 compares headlight detection at a far distance (500 m) and a near distance (50 m). The maximum line-of-sight (LOS) distance available at the authors' university was 500 m.
The proposed hyperspectral image-based spectral analysis method can detect near and far headlights simultaneously. Almost no degradation of detection performance with headlight distance was observed, because the hyperspectral lamp signatures show consistent spectral profiles.

4.5. Fourth Experiment: Detecting Rear Lamps and Distinguishing Them from Other Lamps

Rear lamps are usually red, and two types of light source are common: halogen and LED. The detection of rear lamps is very important in the development of an IHC system.

4.5.1. Comparison of the Spectral Information of Rear Lamps (Halogen and LED)

Figure 17 shows the spectral profiles of each rear lamp type, halogen and LED. The intensity of the halogen rear lamp is distributed from 650 nm to 900 nm, whereas the LED rear lamp has a peak at 630 nm.

4.5.2. Rear Lamp Detection Result with Other Lights

Rear lamps can also be distinguished from other lights, such as street lamps or headlights. Figure 18 shows the rear lamp detection result at 500 m. Because the detector was trained only on the rear lamp profile, the headlights of other vehicles are not detected, as shown in Figure 18e,f.
The proposed hyperspectral image-based spectral analysis method can thus also detect rear lamps.

5. Conclusions

This paper proposed a new detection method using hyperspectral images to detect automotive headlights effectively for IHC. The experiments were conducted using the SAM, SCM, and EDM distance metrics, and the detection performance was assessed using ROC and AUC analysis on real headlight images; two additional experiments were also performed. The threshold sensitivity of the previous methods was compared with that of the proposed method. According to the results, the best combination for LED is the SCM detection metric with either raw data or Mean-Norm; for HID, the SAM detection metric with Mean-Norm; and for the halogen lamp, the EDM metric with raw data. The detection performance was also compared at the same FPR (0.0001). Furthermore, headlights could be distinguished from reflector plates using hyperspectral images, and rear lamps could be detected among other lights. Nevertheless, future studies will be needed to improve the detection performance and to reduce the processing time using a band selection method.

Acknowledgments

This study was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Science, ICT and Future Planning (NRF-2014R1A2A2A01002299).

Author Contributions

The contributions were distributed between the authors as follows: Heekang Kim wrote the text of the manuscript, programmed the headlight detection, surveyed the related literature, and performed the accuracy experiments. Soon Kwon guided the active high-beam scenarios and experimental environments, including discussions of the test results. Sungho Kim reviewed this paper, performed an in-depth discussion of the related literature, and confirmed the accuracy of the experiments.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Li, Y.; Pankanti, S. A performance study of an intelligent headlight control system. In Proceedings of the Winter Conference on Applications of Computer Vision, Kona, HI, USA, 5–7 January 2011; pp. 440–447.
  2. Mohite, H.; Mahangade, B.; Gholase, M.; Kattgihalimath, S.; Kumbhar, S. Intelligent and Adaptive Headlight with Electronic Controlled Power Steering System (IAEPS). In Proceedings of the IEEE Workshop on International Journal of Current Engineering and Technology (IJCET), Burbank, CA, USA, 2 April 2015; Volume 5, pp. 1026–1029.
  3. Büthorn, R.; Tadjine, H.H.; Auerbach, B.; Schulze, K. Advanced Headlight System: 3D High Beam. In Proceedings of the International Forum on Advanced Microsystems for Automotive Applications (AMAA2013), Berlin, Germany, 17–18 June 2013; pp. 77–88.
  4. Gavriilidis, A.; Muller, D.; Muller-Schneiders, S.; Velten, J.; Kummert, A. Sensor system blockage detection for night time headlight control based on camera and radar sensor information. In Proceedings of the International IEEE Conference on Intelligent Transportation Systems (ITSC), Anchorage, AK, USA, 16–19 September 2012; pp. 78–83.
  5. Kurita, K. Non-contact Moving Vehicle Detection Method Based on Electrostatic Induction Technique. J. Sci. Eng. 2015, 6, 24–28. [Google Scholar]
  6. Shekhar, S.; Kamal, M.; Joshi, A.; Kumar, R.; Nigam, V. Antitheft System in Vehicle with Automatic Headlights. Int. J. Adv. Res. Innov. 2015, 3, 394–397. [Google Scholar]
  7. Salvi, G. An automated nighttime vehicle counting and detection system for traffic surveillance. In Proceedings of the International Conference on Computational Science and Computational Intelligence (CSCI2014), Las Vegas, NV, USA, 10–13 March 2014; Volume 1, pp. 131–136.
  8. Eum, S.; Jung, H.G. Enhancing light blob detection for intelligent headlight control using lane detection. IEEE Trans. Intell. Transp. Syst. 2013, 1, 1003–1011. [Google Scholar] [CrossRef]
  9. Zou, Q.; Ling, H.; Luo, S.; Huang, Y.; Tian, M. Robust Nighttime Vehicle Detection by Tracking and Grouping Headlights. IEEE Trans. Intell. Transp. Syst. 2015, 16, 2838–2849. [Google Scholar] [CrossRef]
  10. Gang, L.; Zhang, M.; Zhao, X.; Wang, S. Improved Genetic Algorithm Optimization for Forward Vehicle Detection Problems. Information 2015, 6, 339–360. [Google Scholar] [CrossRef]
  11. Chen, Y.L.; Chiang, H.H.; Chiang, C.Y.; Liu, C.M.; Yuan, S.M.; Wang, J.H. A vision-based driver nighttime assistance and surveillance system based on intelligent image sensing techniques and a heterogamous dual-core embedded system architecture. Sensors 2012, 12, 2373–2399. [Google Scholar] [CrossRef] [PubMed]
  12. Guo, J.; Wang, J.; Guo, X.; Yu, C.; Sun, X. Preceding vehicle detection and tracking adaptive to illumination variation in night traffic scenes based on relevance analysis. Sensors 2014, 14, 15325–15347. [Google Scholar] [CrossRef] [PubMed]
  13. Taktak, R.; Dufaut, M.; Husson, R. Vehicle detection at night using image processing and pattern recognition. In Proceedings of the IEEE International Conference Image Processing (ICIP 94), Austin, TX, USA, 13–16 November 1994.
  14. Sina, I.; Wibisono, A.; Nurhadiyatna, A.; Hardjono, B.; Jatmiko, W.; Mursanto, P. Vehicle counting and speed measurement using headlight detection. In Proceedings of the IEEE International Conference on Advanced Computer Science and Information Systems (ICACSIS 2013), Bali, Indonesia, 28–29 September 2013; pp. 149–154.
  15. Juric, D.; Loncaric, S. A method for on-road night-time vehicle headlight detection and tracking. In Proceedings of the International Conference on Connected Vehicles and Expo (ICCVE 2014), Vienna, Austria, 3–7 November 2014; Volume 6, pp. 339–360.
  16. Ogura, R.; Ohashi, G. Vehicles detection based on extremas in nighttime driving scene. In Proceedings of the 1st IEEE Global Conference on Consumer Electronics, Tokyo, Japan, 2–5 October 2012; pp. 679–682.
  17. Rubio, J.C.; Serrat, J.; López, A.M.; Ponsa, D. Multiple-target tracking for intelligent headlights control. IEEE Trans. Intell. Transp. Syst. 2012, 13, 594–605. [Google Scholar] [CrossRef]
  18. López, A.; Hilgenstock, J.; Busse, A.; Baldrich, R.; Lumbreras, F.; Serrat, J. Nighttime vehicle detection for intelligent headlight control. In Advanced Concepts for Intelligent Vision Systems; Springer: Berlin/Heidelberg, Germany, 2008; pp. 113–124. [Google Scholar]
  19. Konda Naganathan, G.; Cluff, K.; Samal, A.; Calkins, C.; Jones, D.D.; Lorenzen, C.; Subbiah, J. A prototype on-line AOTF hyperspectral image acquisition system for tenderness assessment of beef carcasses. J. Food Eng. 2014, 154, 1–9. [Google Scholar] [CrossRef]
  20. Fu, X.; Kim, M.S.; Chao, K.; Qin, J.; Lim, J.; Lee, H.; Garrido-Varom, A.; Pérez-Marín, D.; Ying, Y. Detection of melamine in milk powders based on NIR hyperspectral imaging and spectral similarity analyses. J. Food Eng. 2014, 124, 97–104. [Google Scholar] [CrossRef]
  21. Rivera, N.V.; Gomez-Sanchis, J.; Chanona-Perez, J.; Carrasco, J.J.; Millan-Giraldo, M.; Lorente, D.; Cubero, S.; Blasco, J. Early detection of mechanical damage in mango using NIR hyperspectral images and machine learning. Biosyst. Eng. 2014, 122, 91–98. [Google Scholar]
  22. Du, B.; Zhang, L.; Tao, D.; Wang, N.; Chen, T. A spectral dissimilarity constrained nonnegative matrix factorization based cancer screening algorithm from hyperspectral fluorescence images. In Proceedings of the International Conference on Computerized Healthcare (ICCH 2012), Hong Kong, China, 17–18 December 2012; pp. 112–119.
  23. Liu, Z.; Wang, H.; Li, Q. Tongue tumor detection in medical hyperspectral images. Sensors 2011, 12, 162–174. [Google Scholar] [CrossRef] [PubMed]
  24. Boyce, P.; Hoffman, D.D.; Wallaschek, J. Automotive Lighting and Human Vision; Springer: Berlin, Germany, 2007. [Google Scholar]
  25. Yamazato, T.; Kinoshita, M.; Arai, S.; Souke, E.; Yendo, T.; Fujii, T.; Kamakura, K.; Okada, H. Vehicle Motion and Pixel Illumination Modeling for Image Sensor Based Visible Light Communication. Trans. ASAE 2015, 46, 551–558. [Google Scholar] [CrossRef]
  26. Ibrahim, I. Illumination Invariance and Shadow Compensation on Hyperspectral Images; Cranfield University: Cranfield, UK, 2014. [Google Scholar]
  27. Jung, E.D.; Lee, Y.L. Development of a heat dissipating LED headlamp with silicone lens to replace halogen bulbs in used cars. Appl. Therm. Eng. 2015, 86, 143–150. [Google Scholar] [CrossRef]
  28. Nsibi, W.; Chammam, A.; Nehdi, M.N.; Mrabet, B.; Sellami, A.; Zissis, G. HID lamps under low frequency square wave operation: Experimental Evaluation of Dimming effects. Light. Res. Technol. 2016. [Google Scholar] [CrossRef]
  29. Gaglianone, L.A.; Lima, A.F.; Goncalves, L.S.; Cavalcanti, A.N.; Aguiar, F.H.B.; Marchi, G.M. Mechanical properties and degree of conversion of etch-and-rinse and self-etch adhesive systems cured by a quartz tungsten halogen lamp and a light-emitting diode. J. Mech. Behav. Biomed. Mater. 2012, 12, 139–143. [Google Scholar] [CrossRef] [PubMed]
  30. Cheng, X.; Tao, Y.; Chen, Y.R.; Luo, Y. Nir/MIR dual–sensor machine vision system for online apple stem–end/calyx recognition. Trans. ASAE 2003, 46, 551–558. [Google Scholar] [CrossRef]
  31. Feilhauer, H.; Asner, G.P.; Martin, R.E.; Schmidtlein, S. Brightness-normalized partial least squares regression for hyperspectral data. J. Quant. Spectrosc. Radiat. Transf. 2010, 144, 1947–1957. [Google Scholar] [CrossRef]
  32. Sun, D.W. Hyperspectral Imaging for Food Quality Analysis and Control; Elsevier: Amsterdam, Nederland, 2010. [Google Scholar]
  33. Liu, X.; Yang, C. A kernel spectral angle mapper algorithm for remote sensing image classification. In Proceedings of the IEEE International Congress on Image and Signal Processing (CISP 2013), Hangzhou, China, 16–18 December 2013; pp. 814–818.
  34. Ibraheem, I. Comparative study of maximum likelihood and spectral angle mapper algorithms used for automated detection of melanoma. Skin Res. Technol. 2015, 21, 84–89. [Google Scholar] [CrossRef] [PubMed]
  35. van der Meero, F.; Bakker, W. Cross correlogram spectral matching: application to surface mineralogical mapping by using AVIRIS data from Cuprite, Nevada. Remote Sens. Environ. 1997, 61, 371–382. [Google Scholar] [CrossRef]
  36. Tuominen, J.; Lipping, T. Assessment of hyperspectral classification methods for benthic cover type mapping. In Proceedings of the IEEE International Geoscience and Remote Sensing Symposium (IGARSS 2012), Munich, Germany, 22–27 July 2012; pp. 4837–4840.
  37. Su, H.; Sheng, Y.; Du, P.; Liu, K. Adaptive affinity propagation with spectral angle mapper for semi-supervised hyperspectral band selection. Appl. Opt. 2012, 51, 2656–2663. [Google Scholar] [CrossRef] [PubMed]
  38. Nagaoka, T.; Kiyohara, Y.; Koga, H.; Nakamura, A.; Saida, T.; Sota, T. Modification of a melanoma discrimination index derived from hyperspectral data: a clinical trial conducted in 2 centers between March 2011 and December 2013. Skin Res. Technol. 2015, 21, 278–283. [Google Scholar] [CrossRef] [PubMed]
Figure 1. Previous headlight detection method: (a) composite image produced by extracting the RGB bands from the hyperspectral camera; (b) binary image of headlight detection using the Otsu algorithm [12].
Figure 2. Block diagram of processing to detect vehicle headlights and a comparison of each method.
Figure 3. Measured spectral profiles of headlights: (a) LED; (b) HID; (c) halogen.
Figure 4. Spectral profile of a streetlight.
Figure 5. Comparison of the spectral profiles after Mean-Norm preprocessing: (a) LED headlight; (b) HID headlight; (c) halogen headlight.
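The Mean-Norm preprocessing behind Figure 5 can be understood as dividing each pixel spectrum by its mean intensity over all bands, so that lamps of different brightness are compared by spectral shape alone. A minimal sketch of this idea (the function and array names are my own, not from the paper):

```python
import numpy as np

def mean_normalize(spectrum):
    """Divide a 1-D spectrum by its mean intensity across all bands."""
    spectrum = np.asarray(spectrum, dtype=float)
    return spectrum / spectrum.mean()

# Two spectra with the same shape but different overall brightness,
# e.g. the same lamp observed at different distances
bright = np.array([2.0, 4.0, 6.0, 8.0])
dim = np.array([0.5, 1.0, 1.5, 2.0])

# After Mean-Norm the two profiles coincide
print(np.allclose(mean_normalize(bright), mean_normalize(dim)))  # True
```

After normalization every spectrum has unit mean, which is what makes the matched profiles in Figure 5 overlap regardless of lamp intensity.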
Figure 6. Description of the SAM in three bands (adapted from [34]).
Figure 7. Description of the EDM in two bands.
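The three detection metrics illustrated in Figures 6 and 7 compare a pixel spectrum against a reference headlight spectrum: SAM measures the angle between the two vectors, SCM their Pearson correlation (the angle after mean-centering), and EDM their Euclidean distance. A hedged sketch of the standard definitions (variable names are illustrative):

```python
import numpy as np

def sam(x, r):
    """Spectral Angle Mapper: angle (radians) between spectra x and r."""
    cos_t = np.dot(x, r) / (np.linalg.norm(x) * np.linalg.norm(r))
    return np.arccos(np.clip(cos_t, -1.0, 1.0))

def scm(x, r):
    """Spectral Correlation Mapper: Pearson correlation of x and r."""
    xc, rc = x - x.mean(), r - r.mean()
    return np.dot(xc, rc) / (np.linalg.norm(xc) * np.linalg.norm(rc))

def edm(x, r):
    """Euclidean Distance Mapper: straight-line distance between spectra."""
    return np.linalg.norm(x - r)

reference = np.array([1.0, 2.0, 3.0])
pixel = np.array([2.0, 4.0, 6.0])  # same shape, double brightness

print(sam(pixel, reference))  # 0.0: identical spectral angle
print(scm(pixel, reference))  # 1.0: perfectly correlated
print(edm(pixel, reference))  # > 0: distance is brightness-sensitive
```

The toy example shows why the paper pairs EDM with normalization: SAM and SCM are already insensitive to overall brightness, while raw Euclidean distance is not.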
Figure 8. Visible near infrared (VNIR) hyperspectral image acquisition system.
Figure 9. Mask image to calculate the ROC and AUC.
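The mask of Figure 9 labels which pixels are true headlight pixels; sweeping a threshold over each metric's score map then traces the ROC curves of Figure 10, whose areas are the AUC values in Table 2. A minimal sketch of the AUC computation via the rank statistic (function and array names are mine, not from the paper):

```python
import numpy as np

def roc_auc(scores, labels):
    """AUC as the probability that a random positive pixel scores
    higher than a random negative pixel (ties count half)."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=bool)
    pos, neg = scores[labels], scores[~labels]
    # Pairwise comparison of every positive against every negative
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

# Toy score map: mask == 1 marks headlight pixels, which should
# receive higher detection scores than the background
mask = np.array([1, 1, 0, 0, 0])
scm_scores = np.array([0.99, 0.95, 0.40, 0.30, 0.20])
print(roc_auc(scm_scores, mask))  # 1.0: perfect separation
```

An AUC of 1.0 means some threshold separates headlight pixels from background perfectly, which is why the near-unity values in Table 2 indicate reliable detection.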
Figure 10. ROC curves for each combination of methods: (a) SAM detection metric for an LED headlight; (b) SCM detection metric for an LED headlight; (c) EDM detection metric for an LED headlight; (d) SAM detection metric for an HID headlight; (e) SCM detection metric for an HID headlight; (f) EDM detection metric for an HID headlight; (g) SAM detection metric for a halogen headlight; (h) SCM detection metric for a halogen headlight; (i) EDM detection metric for a halogen headlight.
Figure 11. Comparison of the previous method [12] and the proposed method for (a) LED; (b) HID; (c) halogen.
Figure 12. Overview of the experimental environment.
Figure 13. Spectral information of a halogen lamp and reflections: (a) halogen lamp spectra; (b) white reflection spectra; (c) yellow reflection spectra; (d) red reflection spectra.
Figure 14. Comparison of the normalized spectra of the halogen lamp and reflection plate: (a) without smoothing filter; (b) with smoothing filter.
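Figure 14 contrasts the normalized spectra with and without a smoothing filter. The caption does not specify which filter is used, so as one plausible choice this sketch applies a simple moving average along the spectral axis (a common way to suppress band-to-band noise; all names are illustrative):

```python
import numpy as np

def smooth(spectrum, window=5):
    """Moving-average smoothing along the spectral axis."""
    kernel = np.ones(window) / window
    return np.convolve(spectrum, kernel, mode="same")

# A synthetic spectrum corrupted by band-to-band noise
rng = np.random.default_rng(0)
clean = np.sin(np.linspace(0, np.pi, 100))
noisy = clean + rng.normal(0, 0.05, size=100)

smoothed = smooth(noisy)
# Smoothing pulls the noisy profile back toward the clean one
print(np.mean((smoothed - clean) ** 2) < np.mean((noisy - clean) ** 2))
```

Smoothing before spectral matching reduces the chance that narrow noise spikes dominate the angle or distance computed against the reference profile.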
Figure 15. Results of headlight detection: (a) test scene (RGB picture); (b) using the RGB image; (c) using the hyperspectral image.
Figure 16. Comparison of the detection results according to the headlight distance: (a) RGB composite image acquired at 500 m; (b) RGB composite image acquired at 50 m; (c) SCM result for the hyperspectral image at 500 m; (d) SCM result for the hyperspectral image at 50 m; (e) detection result by thresholding to the SAM image for the 500 m data; (f) detection result by thresholding to SAM image for the 50 m data.
Figure 17. Spectral information of rear lamps: (a) halogen rear lamp; (b) LED rear lamp.
Figure 18. Comparison of the detection of each rear lamp, such as halogen and LED: (a) RGB composite image acquired from a halogen lamp; (b) RGB composite image acquired from a LED lamp; (c) SCM result for the hyperspectral image from a halogen lamp; (d) SCM result for the hyperspectral image from a LED lamp; (e) detection result by thresholding to SAM image from a halogen lamp; (f) detection result by thresholding to SAM image from a LED lamp.
Table 1. Specifications of the hyperspectral image acquisition system.

| Item | Specifications |
| --- | --- |
| Spectral range | 400–1000 nm (VNIR) |
| Spectrograph | ImSpector V10E, 30 μm slit, 2.8 nm spectral resolution |
| Camera | Kappa, 1392 × 1040 pixels, 12 bits, 11 fps, FireWire interface |
| Scanner | Rotational tripod, scan angle: max 160° |
Table 2. AUC values in each headlight image (including combinations in each method).

| Headlight Type | Detection | Raw | Mean-Norm |
| --- | --- | --- | --- |
| LED (OTSU = 0.9964) | SAM | 0.9949 | 0.9857 |
| | SCM | 0.9992 | 0.9992 |
| | EDM | 0.9970 | 0.9858 |
| HID (OTSU = 0.9937) | SAM | 0.9980 | 0.9981 |
| | SCM | 0.9965 | 0.9965 |
| | EDM | 0.9970 | 0.9980 |
| Halogen (OTSU = 0.9800) | SAM | 0.9366 | 0.9351 |
| | SCM | 0.9656 | 0.9654 |
| | EDM | 0.9936 | 0.9334 |

Kim, H.; Kwon, S.; Kim, S. Hyperspectral Image-Based Night-Time Vehicle Light Detection Using Spectral Normalization and Distance Mapper for Intelligent Headlight Control. Sensors 2016, 16, 1058. https://doi.org/10.3390/s16071058
