Open Access Article
Sensors 2018, 18(1), 171; doi:10.3390/s18010171

A New Localization System for Indoor Service Robots in Low Luminance and Slippery Indoor Environment Using Afocal Optical Flow Sensor Based Sensor Fusion

Department of Electrical and Computer Engineering, Automation and Systems Research Institute (ASRI), Seoul National University, Seoul 151-742, Korea
Inter-University Semiconductor Research Center (ISRC), Seoul National University, Seoul 151-742, Korea
Author to whom correspondence should be addressed.
Received: 6 December 2017 / Revised: 5 January 2018 / Accepted: 5 January 2018 / Published: 10 January 2018
(This article belongs to the Special Issue Indoor LiDAR/Vision Systems)

In this paper, a new localization system for indoor service robots is proposed that uses sensor fusion based on an afocal optical flow sensor (AOFS), targeting low-luminance and slippery environments where conventional localization systems perform poorly. To accurately estimate the moving distance of a robot on slippery surfaces, the robot was equipped with an AOFS in addition to two conventional wheel encoders. To estimate the robot's orientation, a forward-viewing mono-camera and a gyroscope were adopted. In very low luminance, conventional feature extraction and matching for localization are impractical; instead, the interior space structure was estimated from the camera image together with the robot's orientation. To enhance the appearance of image boundaries, a rolling guidance filter was applied after histogram equalization. The proposed system was designed to run on a low-cost processor and was implemented on a consumer robot. Experiments were conducted under a low illumination condition of 0.1 lx in a carpeted environment, with the robot driving a 1.5 × 2.0 m square trajectory 20 times. When only the wheel encoders and gyroscope were used for localization, the maximum position error was 10.3 m and the maximum orientation error was 15.4°. With the proposed system, the maximum position error was 0.8 m and the maximum orientation error was within 1.0°.
Keywords: AOFS; localization; low illumination; slippery environment
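The abstract describes fusing an AOFS with wheel encoders to estimate moving distance on slippery floors, where spinning wheels make encoders over-report motion while the optical flow sensor tracks the ground directly. The paper's actual fusion algorithm is not given in the abstract; the following is only a minimal illustrative sketch of the idea, in which the function name, slip threshold, and decision rule are all assumptions, not taken from the paper:

```python
def fuse_distance(d_encoder, d_aofs, slip_threshold=0.05):
    """Return a fused per-step displacement estimate in metres.

    Hypothetical rule (not from the paper): if the encoder and AOFS
    estimates disagree by more than slip_threshold (relative), assume
    wheel slip and trust the ground-referenced AOFS; otherwise average
    the two sensors to reduce noise.
    """
    if d_encoder == 0 and d_aofs == 0:
        return 0.0
    denom = max(abs(d_encoder), abs(d_aofs))
    if abs(d_encoder - d_aofs) / denom > slip_threshold:
        return d_aofs  # large disagreement: slip suspected, use AOFS
    return 0.5 * (d_encoder + d_aofs)  # agreement: average the sensors

# Example: on carpet the wheels spin through 0.10 m while the robot
# actually advances only 0.02 m, so the AOFS estimate is kept.
print(fuse_distance(0.10, 0.02))   # -> 0.02
```

A real implementation would fuse these signals probabilistically (e.g. in a Kalman filter) rather than with a hard threshold; the sketch only shows why a ground-referenced sensor helps when wheel odometry slips.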

This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Yi, D.-H.; Lee, T.-J.; Cho, D.-I. A New Localization System for Indoor Service Robots in Low Luminance and Slippery Indoor Environment Using Afocal Optical Flow Sensor Based Sensor Fusion. Sensors 2018, 18, 171.

Sensors EISSN 1424-8220. Published by MDPI AG, Basel, Switzerland.