Open Access Article
Sensors 2017, 17(4), 883; doi:10.3390/s17040883

Multiple Objects Fusion Tracker Using a Matching Network for Adaptively Represented Instance Pairs

Department of Media Engineering, Catholic University of Korea, 43-1, Yeokgok 2-dong, Wonmi-gu, Bucheon-si, Gyeonggi-do 14662, Korea
* Author to whom correspondence should be addressed.
Academic Editor: Simon X. Yang
Received: 27 February 2017 / Revised: 14 April 2017 / Accepted: 14 April 2017 / Published: 18 April 2017
(This article belongs to the Special Issue Sensors for Transportation)

Abstract

Multiple-object tracking is affected by various sources of distortion, such as occlusion, illumination variations and motion changes. Overcoming these distortions by tracking on RGB frames alone, for example with shift-based methods, has limitations because of the material distortions inherent in RGB frames. To overcome these distortions, we propose a multiple-object fusion tracker (MOFT), which uses a combination of 3D point clouds and corresponding RGB frames. The MOFT uses a matching function, initialized on large-scale external sequences, to determine which candidates in the current frame match the target object in the previous frame. After tracking over a few frames, the initialized matching function is fine-tuned according to the appearance models of the target objects. The fine-tuning of the matching function is constructed in a structured form with diverse matching-function branches. In general multiple-object tracking situations, the scale of objects in a scene varies with the distance between the target objects and the sensors. If target objects at various scales are represented with the same strategy, information loss occurs in their representations. In this paper, the output map of a convolutional layer obtained from a pre-trained convolutional neural network is used to adaptively represent instances without information loss. In addition, the MOFT fuses the tracking results obtained from each modality at the decision level, using basic belief assignment to compensate for the tracking failures of individual modalities, rather than fusing modalities by selectively using the features of each modality. Experimental results indicate that the proposed tracker provides state-of-the-art performance on the multiple object tracking (MOT) and KITTI benchmarks.
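The decision-level fusion step mentioned above can be sketched with Dempster's rule of combination over basic belief assignments (BBAs). The snippet below is a minimal illustration only: the hypothesis names, mass values, and the combine_bba helper are assumptions made for this example, not the MOFT implementation described in the paper.

```python
from itertools import product

def combine_bba(m1, m2):
    """Dempster's rule of combination for two basic belief assignments.

    Each BBA maps a frozenset of hypotheses to a mass in [0, 1],
    with masses summing to 1. Returns the fused BBA.
    """
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb  # mass assigned to contradictory hypotheses
    if conflict >= 1.0:
        raise ValueError("total conflict: the two sources are incompatible")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Hypothetical frame of discernment: does a candidate match the tracked target?
MATCH, NO_MATCH = frozenset({"match"}), frozenset({"no_match"})
THETA = MATCH | NO_MATCH  # ignorance: either hypothesis may hold

# Illustrative masses from an RGB-based and a point-cloud-based tracker.
m_rgb = {MATCH: 0.55, NO_MATCH: 0.15, THETA: 0.30}
m_lidar = {MATCH: 0.40, NO_MATCH: 0.10, THETA: 0.50}

fused = combine_bba(m_rgb, m_lidar)
print(fused[MATCH])  # ~0.69, higher than either modality's individual mass
```

In this sketch, fusing the two sources raises the belief in the "match" hypothesis above that of either modality alone, which is the intended effect of compensating one modality's failures with the other.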
Keywords: multiple objects tracking; deep learning; multiple sensor fusion; LIDAR; CCD
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Oh, S.-I.; Kang, H.-B. Multiple Objects Fusion Tracker Using a Matching Network for Adaptively Represented Instance Pairs. Sensors 2017, 17, 883.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
