Search Results (5)

Search Parameters:
Keywords = taillight pairing

34 pages, 8312 KB  
Article
A Taillight Matching and Pairing Algorithm for Stereo-Vision-Based Nighttime Vehicle-to-Vehicle Positioning
by Thai-Hoa Huynh and Myungsik Yoo
Appl. Sci. 2020, 10(19), 6800; https://doi.org/10.3390/app10196800 - 28 Sep 2020
Cited by 5 | Viewed by 2725
Abstract
Stereo vision systems offer several potential benefits for advanced autonomous vehicles compared with other existing technologies, such as vehicle-to-vehicle (V2V) positioning. This paper explores a stereo-vision-based nighttime V2V positioning process that detects vehicle taillights. To address the crucial problems that arise when applying this process to urban traffic, we propose a three-fold contribution. The first contribution is a detection method that labels and determines the pixel coordinates of every taillight region in the images. Second, a stereo matching method based on a gradient boosted tree is proposed to determine which taillight in the left image corresponds to each taillight in the right image. Third, we offer a neural-network-based method to pair the two taillights that belong to the same vehicle. Experiments were conducted on a four-lane traffic road, and the results were used to quantitatively evaluate the performance of each proposed method in real situations.
(This article belongs to the Section Computing and Artificial Intelligence)
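The pairing step described above decides whether two detected taillight regions belong to the same vehicle. A minimal sketch of that idea, using hand-picked geometric features and a hard-coded gate in place of the paper's trained neural network (the feature set and thresholds here are illustrative assumptions, not the authors' parameters):

```python
import numpy as np

def pair_features(box_a, box_b):
    """Geometric features for a candidate taillight pair.

    Boxes are (x, y, w, h) in pixels. These features are illustrative
    stand-ins for whatever the paper's pairing network consumes.
    """
    xa, ya, wa, ha = box_a
    xb, yb, wb, hb = box_b
    cy_a, cy_b = ya + ha / 2, yb + hb / 2
    v_offset = abs(cy_a - cy_b) / max(ha, hb)                    # vertical alignment
    size_ratio = min(wa * ha, wb * hb) / max(wa * ha, wb * hb)   # area similarity
    gap = abs((xa + wa / 2) - (xb + wb / 2)) / max(wa, wb)       # normalized spacing
    return np.array([v_offset, size_ratio, gap])

def is_plausible_pair(box_a, box_b, max_offset=0.5, min_ratio=0.5, max_gap=8.0):
    """Simple hand-set gate; the paper trains a neural network instead."""
    v, r, g = pair_features(box_a, box_b)
    return bool(v <= max_offset and r >= min_ratio and 1.0 <= g <= max_gap)
```

In practice the features would be fed to the learned classifier rather than thresholded; the gate above only shows which geometric cues make a pair plausible.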

18 pages, 2003 KB  
Article
Nighttime Vehicle Detection and Tracking with Occlusion Handling by Pairing Headlights and Taillights
by Tuan-Anh Pham and Myungsik Yoo
Appl. Sci. 2020, 10(11), 3986; https://doi.org/10.3390/app10113986 - 8 Jun 2020
Cited by 20 | Viewed by 7284
Abstract
In recent years, vision-based vehicle detection has received considerable attention in the literature. Depending on the ambient illuminance, vehicle detection methods are classified as daytime or nighttime methods. In this paper, we propose a nighttime vehicle detection and tracking method with occlusion handling based on vehicle lights. First, bright blobs that may be vehicle lights are segmented in the captured image. Then, a machine-learning-based method is proposed to classify whether the bright blobs are headlights, taillights, or other illuminant objects. Subsequently, the detected vehicle lights are tracked to further facilitate the determination of the vehicle position. As one vehicle is indicated by one or two light pairs, a light pairing process using spatiotemporal features is applied to pair vehicle lights. Finally, vehicle tracking with occlusion handling is applied to refine incorrect detections under various traffic situations. Experiments on two-lane and four-lane urban roads were conducted, and a quantitative evaluation of the results shows the effectiveness of the proposed method.
(This article belongs to the Section Computing and Artificial Intelligence)
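The blob classification step separates headlights, taillights, and other illuminants. A toy sketch of the idea, with hand-chosen color features and a rule-based decision standing in for the paper's trained classifier (features and thresholds are assumptions for illustration only):

```python
import numpy as np

def blob_features(patch):
    """Illustrative features for a bright-blob patch, an (H, W, 3) RGB array.

    A learned classifier would consume features like these to decide
    whether a blob is a headlight (whitish), a taillight (reddish),
    or another illuminant; the exact feature set is an assumption.
    """
    patch = np.asarray(patch, dtype=float)
    r, g, b = patch.reshape(-1, 3).mean(axis=0)
    redness = r / (g + b + 1e-6)            # taillights skew red
    brightness = (r + g + b) / 3.0 / 255.0  # overall intensity
    h, w = patch.shape[:2]
    return np.array([redness, brightness, w / h])

def classify_blob(patch, red_thresh=0.8, bright_thresh=0.2):
    """Toy rule standing in for the machine-learning classifier."""
    redness, brightness, _ = blob_features(patch)
    if brightness < bright_thresh:
        return "other"
    return "taillight" if redness > red_thresh else "headlight"
```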

19 pages, 6607 KB  
Article
Performance Evaluation of Region-Based Convolutional Neural Networks Toward Improved Vehicle Taillight Detection
by Zhenzhou Wang, Wei Huo, Pingping Yu, Lin Qi, Shanshan Geng and Ning Cao
Appl. Sci. 2019, 9(18), 3753; https://doi.org/10.3390/app9183753 - 8 Sep 2019
Cited by 9 | Viewed by 3888
Abstract
Increasingly serious traffic jams and accidents threaten the economy and human life. The lamp semantics of driving are a major way to transmit driving-behavior information between vehicles: detecting and recognizing vehicle taillights makes it possible to acquire and understand taillight semantics, which is of great significance for realizing multi-vehicle behavior interaction and assisted driving. Detecting taillights and identifying their semantics on real traffic roads during the day is a challenge. The main contribution of this paper is a neural network that detects vehicles, combined with image processing that recognizes the taillights of the preceding vehicle. First, the outlines of the preceding vehicles are detected and extracted using convolutional neural networks. Then, the taillight areas are extracted in the Hue-Saturation-Value (HSV) color space, and taillight pairs are detected from correlations of histograms, colors, and positions. The taillight states are then identified from the histogram feature parameters of the taillight image. The detected taillight state of the preceding vehicle is prompted to the driver to reduce accidents caused by untimely judgement of the preceding vehicle's driving intention. The experimental results show that this method can accurately identify taillight status during the daytime and can effectively reduce confused judgements caused by light interference.
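One of the pairing cues named above is histogram correlation between candidate taillight patches. A minimal numpy sketch of that cue (bin count, range, and grayscale input are assumptions; the paper also combines color and position cues):

```python
import numpy as np

def hist_correlation(patch_a, patch_b, bins=32):
    """Pearson correlation between the grayscale histograms of two
    candidate taillight patches. Two lamps on the same vehicle tend
    to have similar intensity distributions, so a high correlation
    supports pairing them; binning details here are assumptions.
    """
    ha, _ = np.histogram(patch_a, bins=bins, range=(0, 256), density=True)
    hb, _ = np.histogram(patch_b, bins=bins, range=(0, 256), density=True)
    ha = ha - ha.mean()
    hb = hb - hb.mean()
    denom = np.sqrt((ha ** 2).sum() * (hb ** 2).sum())
    return float((ha * hb).sum() / denom) if denom > 0 else 0.0
```

Patches with matching distributions score near 1.0, while patches drawn from very different intensity ranges score near or below 0.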

23 pages, 765 KB  
Article
Preceding Vehicle Detection and Tracking Adaptive to Illumination Variation in Night Traffic Scenes Based on Relevance Analysis
by Junbin Guo, Jianqiang Wang, Xiaosong Guo, Chuanqiang Yu and Xiaoyan Sun
Sensors 2014, 14(8), 15325-15347; https://doi.org/10.3390/s140815325 - 19 Aug 2014
Cited by 22 | Viewed by 8785
Abstract
Preceding-vehicle detection and tracking at nighttime are challenging problems due to the disturbance of extraneous illuminant sources coexisting with the vehicle lights. To improve the accuracy and robustness of vehicle detection, a novel method for nighttime vehicle detection and tracking is proposed in this paper. The gray-level characteristics of taillights are used to determine the lower boundary of the threshold for taillight segmentation, and the optimal threshold is calculated using the OTSU algorithm between this lower boundary and the highest grayscale of the region of interest. Candidate taillight pairs are extracted based on the similarity between left and right taillights, and non-vehicle pairs are removed through relevance analysis of the vehicle location between frames. To reduce the false negative rate, a vehicle tracking method based on taillight estimation is applied: a taillight spot candidate is sought in the region predicted by Kalman filtering, and a disturbed taillight is estimated from the symmetry and location of the other taillight of the same vehicle. Vehicle tracking is completed after estimating the vehicle location from the two taillight spots. Experiments on a vehicle platform indicate that the proposed method detects vehicles quickly, correctly, and robustly in actual traffic environments with illumination variation.
(This article belongs to the Special Issue Positioning and Tracking Sensors and Technologies in Road Transport)
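The segmentation step above restricts Otsu's search to the interval between a taillight-derived lower boundary and the image maximum. A sketch of that bounded search, assuming an 8-bit grayscale region of interest (how the lower boundary itself is chosen is image-dependent and not reproduced here):

```python
import numpy as np

def bounded_otsu(gray, lower):
    """Otsu's threshold searched only between `lower` and the image
    maximum, mirroring the paper's bounded search for taillight
    segmentation. Maximizes between-class variance over the interval.
    """
    gray = np.asarray(gray, dtype=float).ravel()
    upper = int(gray.max())
    best_t, best_var = lower, -1.0
    for t in range(lower, upper):
        fg = gray[gray > t]   # candidate taillight pixels
        bg = gray[gray <= t]  # background pixels
        if fg.size == 0 or bg.size == 0:
            continue
        w_fg, w_bg = fg.size / gray.size, bg.size / gray.size
        var_between = w_fg * w_bg * (fg.mean() - bg.mean()) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t
```

Constraining the search interval keeps dim background structure from dragging the threshold below the taillight intensity range.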

20 pages, 695 KB  
Article
A Region Tracking-Based Vehicle Detection Algorithm in Nighttime Traffic Scenes
by Jianqiang Wang, Xiaoyan Sun and Junbin Guo
Sensors 2013, 13(12), 16474-16493; https://doi.org/10.3390/s131216474 - 2 Dec 2013
Cited by 39 | Viewed by 8212
Abstract
Detecting preceding vehicles in nighttime traffic scenes is an important part of advanced driver assistance systems (ADAS). This paper proposes a region-tracking-based vehicle detection algorithm built on image processing techniques. First, the brightness of the taillights at night is used as the typical feature, and an existing global detection algorithm detects and pairs the taillights. Once a vehicle is detected, a time-series analysis model predicts the vehicle position and the possible region (PR) of the vehicle in the next frame; the vehicle is then detected only within the PR. This reduces detection time and avoids false pairings between bright spots inside and outside the PR. Additionally, we present a threshold-updating method that makes the thresholds adaptive. Finally, experimental studies demonstrate the application and substantiate the superiority of the proposed algorithm. The results show that the proposed algorithm simultaneously reduces both the false negative and false positive detection rates.
(This article belongs to the Section Physical Sensors)
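Both tracking papers above predict where a taillight will appear in the next frame and search only there. A minimal sketch of that prediction step, using a generic constant-velocity Kalman predict (the state layout and noise matrices are assumptions, not either paper's parameters):

```python
import numpy as np

def predict_region(state, P, F=None, Q=None):
    """One Kalman predict step for a taillight spot.

    state = [x, y, vx, vy] (pixel position and per-frame velocity);
    P is the state covariance. A search window centered on the
    predicted (x, y) plays the role of the possible region (PR).
    The matrices below are a generic constant-velocity model.
    """
    if F is None:
        F = np.array([[1, 0, 1, 0],   # x  <- x + vx
                      [0, 1, 0, 1],   # y  <- y + vy
                      [0, 0, 1, 0],   # vx <- vx
                      [0, 0, 0, 1]],  # vy <- vy
                     dtype=float)
    if Q is None:
        Q = np.eye(4) * 0.1           # assumed process noise
    state = F @ state
    P = F @ P @ F.T + Q
    return state, P
```

The growing covariance P sizes the search window: the less certain the track, the larger the PR around the predicted center.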
