Article

Afocal Optical Flow Sensor for Reducing Vertical Height Sensitivity in Indoor Robot Localization and Navigation

1 Department of Electrical and Computer Engineering, Automation and Systems Research Institute (ASRI), Seoul National University, Seoul 151-742, Korea
2 Inter-University Semiconductor Research Center (ISRC), Seoul National University, Seoul 151-742, Korea
* Author to whom correspondence should be addressed.
Sensors 2015, 15(5), 11208-11221; https://doi.org/10.3390/s150511208
Submission received: 6 March 2015 / Revised: 16 April 2015 / Accepted: 8 May 2015 / Published: 13 May 2015
(This article belongs to the Special Issue Sensors for Indoor Mapping and Navigation)

Abstract

This paper introduces a novel afocal optical flow sensor (OFS) system for odometry estimation in indoor robotic navigation. The OFS used in computer optical mice has been adopted for mobile robots because it is not affected by wheel slippage. Vertical height variance is a dominant factor in the systematic error when estimating moving distances for mobile robots driving on uneven surfaces. We propose an approach that mitigates this error by using an afocal (infinite effective focal length) system. We conducted experiments on a linear guide over carpet and three other materials, varying the sensor height from 30 to 50 mm over a moving distance of 80 cm. Each experiment was repeated 10 times. For the proposed afocal OFS module, a 1 mm change in sensor height induces a 0.1% systematic error; for comparison, the error for a conventional fixed-focal-length OFS module is 14.7%. Finally, the proposed afocal OFS module was installed on a mobile robot and tested 10 times on a carpet over distances of 1 m. The average distance estimation error and standard deviation are 0.02% and 17.6 mm, respectively, whereas those for a conventional OFS module are 4.09% and 25.7 mm, respectively.


1. Introduction

Mobile robots are widely used in indoor environments for various purposes, including floor cleaning, guidance, and surveillance. Accurate localization allows a mobile robot to perform mapping, obstacle avoidance, and path planning reliably [1]. Localization can be divided into two types: absolute and relative [2]. Absolute localization relies on landmarks, maps, beacons, or satellite signals to determine the global position and orientation of the robot. Relative localization is usually used in the process model of estimators for absolute localization, or when absolute localization measurements such as GPS are temporarily unavailable [3,4,5,6].
Dead reckoning (DR) is commonly used for the intermediate estimation of position during movement, typically when wheel encoders are available to measure drive wheel rotation. Encoders are inexpensive and allow very high sampling rates; however, errors in the kinematic model parameters, wheel slippage, and other subtle causes can produce poor position estimates. Poor position estimates during path execution require more frequent path re-planning, incurring extra overhead and possibly delaying mission completion. Therefore, it is important to minimize errors in the estimated position during the path execution phase.
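To make the DR step concrete, the following sketch shows the standard differential-drive pose update from per-wheel displacements. It is a textbook midpoint approximation, not the paper's implementation; the function and parameter names are illustrative.

```python
import math

def dead_reckon(pose, d_left, d_right, wheel_base):
    """Advance an (x, y, theta) pose using differential-drive wheel displacements.

    d_left / d_right are the distances travelled by each wheel since the last
    update (e.g., from encoder counts); wheel_base is the distance between the
    wheels. Illustrative names, not from the paper.
    """
    x, y, theta = pose
    d = (d_left + d_right) / 2.0               # forward displacement of the center
    dtheta = (d_right - d_left) / wheel_base   # heading change
    # Midpoint approximation: move along the average heading of the step.
    x += d * math.cos(theta + dtheta / 2.0)
    y += d * math.sin(theta + dtheta / 2.0)
    return (x, y, theta + dtheta)

# Straight 1 m run: both wheels travel 1000 mm, heading unchanged.
pose = dead_reckon((0.0, 0.0, 0.0), 1000.0, 1000.0, 300.0)
```

Any error in `d_left`/`d_right` (e.g., from wheel slippage) propagates directly into the pose, which is why slip-free odometry sources such as the OFS are attractive.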
There has been interest in the use of optical flow techniques for vision-based mobile robot localization and navigation. McCarthy and Barnes [7] presented a comparison of four optical flow methods and temporal filters for mobile robot navigation; the strongest performance was achieved using the Lucas–Kanade method with a recursive filter. Campbell et al. [8] proposed criteria and an experimental method for evaluating optical flow for visual odometry systems in real, unstructured environments. Optical flow has also been used to aid the navigation of an omnidirectional robot [9]: a charge-coupled device (CCD) camera was positioned at a 45° downward angle with respect to the ground in front of the robot, and the obtained optical flow was combined with the DR results using the maximum likelihood technique. However, these methods of calculating optical flow are quite complex and require a large number of computations to obtain good results.
Using the two-dimensional displacement output of optical mouse sensors, which are small, inexpensive, non-contact devices, has been suggested as an alternative method of odometry estimation [10,11,12,13,14,15,16,17,18,19]. Optical mouse sensors integrate a small complementary metal oxide semiconductor (CMOS) camera (on the order of 30 × 30 pixels) with digital signal processing hardware and proprietary firmware algorithms to infer displacement in both the X and Y directions from the optical flow of features identified in consecutive image frames [20]. The use of optical mouse sensors for odometry has clear advantages over wheel encoders: specifically, independence from kinematic forces such as wheel slippage, and the ability to resolve movement in more than one axis, which is of particular interest for omnidirectional platforms [21].
Several sources of error can be identified in the use of optical mouse sensors, including surface type, height variance, lighting conditions, and angular displacement [13,17]. Vertical height variance is a key source of odometry error when using optical mouse sensors [11,13,22], because the distance inferred by the optical flow algorithm scales with height. As the height increases, each pixel represents a larger surface area, causing the sensor to report successively lower count values. Hence, the highest count values are sampled closest to the floor, where each pixel occupies the minimum area and features thus appear to move the greatest distance.
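The height dependence described above can be sketched with a first-order pinhole-camera model: image displacement, and hence the count output, scales as 1/height, so a sensor calibrated at one height misreports distance at any other. This is a simplifying assumption for illustration (it ignores focus blur, which in practice also degrades the fixed-focus sensor), not the paper's model.

```python
def estimated_distance(d_true_mm, height_mm, calib_height_mm):
    """First-order pinhole model of a fixed-focus optical flow sensor.

    Counts scale as 1/height, so a sensor calibrated at calib_height_mm
    misreports distance when the actual height differs. Illustrative
    simplification only.
    """
    return d_true_mm * calib_height_mm / height_mm

# Calibrated at 35 mm; carpet pile lifts the sensor to 40 mm:
# an 800 mm move is reported as only 700 mm (12.5% short).
est = estimated_distance(800.0, 40.0, 35.0)
```

Even this idealized model shows why millimetre-scale height changes matter; the afocal design removes the height term from the magnification entirely.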
Generally, vertical height variance is a result of the robot moving over irregular surfaces; this is the dominant cause of systematic error in horizontal displacement measurements. Researchers have characterized error variances for different surfaces, particularly those commonly used in indoor flooring [13,21]. The results demonstrated that different ground surfaces yield different sensitivities, with variances typically less than 10%. Larger variances from the mean are typically recorded on rougher surfaces (e.g., carpet or stoneware) or over significant discontinuities (e.g., joins between tiles), suggesting that the variances are related to the previously identified vertical variance errors.
To minimize error induced by height variance, many of the existing implementations use a surface with minimal height variance and mount the optical mouse sensor assembly (including the lens) in direct contact with the surface being measured [11,12,13]. The use of contact-free refocused optical mouse sensors for odometry estimation has been investigated in [23,24,25,26], and the results show the sensor data is linear with respect to height [24]. Scarab, the lunar rover prototype developed at Carnegie Mellon University, uses four commodity optical mouse sensors, each of which is attached to a lens of a different focal length [27]. A differential optical navigation sensor for mobile robots that provides consistent output despite changes in the distance between the surface and the sensor has also been proposed using two OFSs [28].
In this paper, we aim to mitigate the systematic error induced by height variance in indoor mobile robots using an afocal optical mouse sensor module that consists of a single optical mouse sensor, a refocusing lens, and a pinhole. Using the methods described in this paper, accurate position estimates can be maintained even when wheel slip occurs or uneven ground is encountered.

2. Methods

2.1. Afocal Optical Flow Sensor System

To mitigate errors induced by height variance, we propose an afocal OFS system with a pinhole located at the rear focus of the first lens. An afocal system is an optical system that produces no net convergence or divergence of the beam (i.e., it has an infinite effective focal length). It is formed by combining two focal systems such that the rear focal point of the first system coincides with the front focal point of the second system. Rays parallel to the horizontal axis in object space are conjugate to rays parallel to the axis in image space. Common afocal systems include a telescope imaging a star (the light entering the system is at infinity, and the image it forms is at infinity), binoculars, and beam expanders [29]. Figure 1 shows an afocal system in which only the “A” rays parallel to the axis pass through the pinhole, whereas the “B” rays, which are not parallel to the axis, are blocked by the pinhole screen. In Equation (1), the magnification is constant and determined only by the focal lengths of the two lenses:
m = f2/f1 = h′/h    (1)
where m, f1, f2, h, and h′ are the magnification, the rear focal length of lens L1, the front focal length of lens L2, the subject, and the conjugate scale-changed subject, respectively.
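The claim that the magnification depends only on the two focal lengths can be checked numerically with paraxial ray-transfer (ABCD) matrices, a standard geometrical-optics tool not used explicitly in the paper. For two thin lenses separated by f1 + f2, the system matrix has a zero C element (no net optical power, i.e., infinite effective focal length) and an A element of magnitude f2/f1.

```python
def matmul(a, b):
    """2x2 matrix product, a @ b."""
    return [[a[0][0]*b[0][0] + a[0][1]*b[1][0], a[0][0]*b[0][1] + a[0][1]*b[1][1]],
            [a[1][0]*b[0][0] + a[1][1]*b[1][0], a[1][0]*b[0][1] + a[1][1]*b[1][1]]]

def afocal_system(f1, f2):
    """Paraxial ray-transfer matrix of two thin lenses separated by f1 + f2."""
    lens = lambda f: [[1.0, 0.0], [-1.0 / f, 1.0]]   # thin lens of focal length f
    gap = [[1.0, f1 + f2], [0.0, 1.0]]               # free-space propagation
    return matmul(lens(f2), matmul(gap, lens(f1)))

# Example focal lengths (illustrative, not the module's actual values).
M = afocal_system(10.0, 6.0)
# M[1][0] == 0  -> no net convergence (infinite effective focal length)
# |M[0][0]| == f2/f1 -> magnification set only by the focal lengths
```

Because the height of the sensor enters only through the propagation outside this two-lens block, the image scale, and therefore the count-to-distance conversion, stays fixed as the floor height varies.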
The OFS selected for this work is the ADNS-3080 (Figure 2) [20], a common OFS that includes an internal low-resolution camera and a digital signal processor (DSP) programmed to estimate the relative displacement of the micro-shadows in the acquired images [30]. An OFS estimates moving distance by analyzing the images reflected from the actively illuminated surface; it cannot operate properly if the surface has no irregularities or is insufficiently reflective, like glass. This family of optical sensors uses complementary metal oxide semiconductor technology to integrate the camera and the DSP on the same chip. The sensor computes the optical flow by comparatively analyzing the sequence of images acquired of a flat surface in front of the sensor in order to estimate the motion of the surface [31]. The motion can be measured at different resolutions, such as 800 counts per inch. Under this configuration, the OFS is internally calibrated to output one count when a plain surface in front of the sensor is translated by 31.75 μm. Any change in the height between the sensor and the surface requires a calibration procedure to convert the counts measured by the sensor into an estimate of the physical quantity under measurement [32]. Palacin et al. [13] introduced an optical mouse calibration method using a MATLAB function.
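The count-to-distance conversion above is a fixed scale factor: at 800 counts per inch, one count corresponds to 25.4 mm / 800 = 31.75 μm. A minimal sketch (the helper name and example values are ours; with re-lensed optics the scale would come from a calibration run rather than this nominal constant):

```python
COUNTS_PER_INCH = 800                    # resolution setting used in this work
MM_PER_COUNT = 25.4 / COUNTS_PER_INCH    # = 0.03175 mm, i.e., 31.75 um per count

def counts_to_mm(dx_counts, dy_counts, scale=MM_PER_COUNT):
    """Convert raw sensor counts to millimetres.

    With replacement optics the effective magnification differs from the
    stock lens, so in practice `scale` should come from a calibration
    procedure such as the MATLAB-based one of Palacin et al. [13].
    """
    return dx_counts * scale, dy_counts * scale

# Roughly 800 mm of straight travel at the nominal scale.
dx_mm, dy_mm = counts_to_mm(25197, 0)
```

The height-variance problem is precisely that the true `scale` drifts with sensor height for a fixed-focus module, whereas the afocal module keeps it nearly constant.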
Figure 1. Diagram of the afocal system.
Figure 2. Cross-sectional view of the printed circuit board assembly components for the OFS ADNS-3080 (courtesy of Avago).
The original design of an optical mouse based on this sensor combines three parts: an OFS; an external light-emitting diode (LED) to illuminate the surface in front of the sensor; and a small plastic structure that includes two convex lenses to focus the light of the LED and the image acquired by the OFS. The combination of these parts was optimized to acquire focused images of a surface at a very low height (from 2.3 mm to 2.5 mm), although this range can be increased by replacing the originally provided lens.
The ADNS-3080 includes an internal imaging device with a sensitive 30 × 30 array of gray-intensity pixels. This OFS operates at very high frame rates and has a very fast internal shutter. It was originally designed to acquire images of a flat surface (approximately 1.82 mm × 1.82 mm) at a very low height (2.4 mm) to estimate displacement between images. The ADNS-3080 has a standard serial peripheral interface (SPI) bus to read/write the internal registers and control the common actions of the sensor. According to the manufacturer’s specifications, the sensor can acquire and process images at very high speed (from 2000 to 6469 frames per second) due to the small size of the acquired images (30 × 30 pixels) [26].
The newly designed afocal OFS module consists of a pinhole with a diameter of 0.5 mm, a lens with a focal length of 10 mm (L1 in Figure 1), and an ADNS-3080 OFS that replaces the second lens in the afocal system (see Figure 3). As the pinhole size increases, more poorly collimated light rays enter through the pinhole, which makes the image less clear and in turn increases the sensitivity to height variations. Note that if the pinhole is too small, there is insufficient light for the afocal OFS to recognize the surface pattern. We selected a pinhole diameter of 0.5 mm with four LEDs in this study: the small pinhole gives clear images and blocks more nonparallel light, while the additional LEDs provide the extra light the sensor needs to obtain sufficiently bright images [33].
Figure 3. Cross-sectional view of the afocal OFS module with a pinhole diameter of 0.5 mm and a lens with a focal length of 10 mm, which images the floor onto the light-sensitive area of the sensor.
Figure 4. Fabricated afocal OFS module: (a) top view without circuit board; (b) top view with circuit board; (c) perspective view; and (d) bottom view.
The fabricated afocal OFS module at different angles is shown in Figure 4. Figure 4a depicts the pinhole, Figure 4b shows the ADNS-3080 optical mouse chip, Figure 4c shows the perspective view of the afocal OFS module, and Figure 4d shows the lens and the red LEDs.

2.2. Experimental Setup

To obtain accurate and repeatable data, a robotic gantry system consisting of a height-adjustable jig and a belt-driven linear actuator is used (Figure 5). The linear actuator (DRSB99SL40-ST1000, Dream Robot System, Incheon, Korea) offers a repeat precision of 20 μm, an accuracy of 50 μm, and a maximum speed of 1000 mm/s [34]. Because the maximum travel of the linear actuator is 100 cm, the range is set to 80 cm in the experiments. A rack-and-pinion-type linear stage positioner (XWG60, Misumi, Tokyo, Japan) is used to precisely adjust the height [35].
Figure 5. Experimental setup: (a) robotic gantry system; (b) Misumi XWG60 rack-and-pinion-type linear stage positioner; and (c) activation image of the afocal OFS module.
A microcontroller unit communicates with the sensor via one serial peripheral interface channel. A Microchip processor (dsPIC33FJ256) is used, employing a 16-bit architecture that integrates microcontroller features with the computational capabilities of a digital signal processor [36]. The sampling period for obtaining motion information from the sensor is 20 ms.
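A polling step of this kind can be sketched as follows. The register addresses match the ADNS-3080 datasheet's map (Motion 0x02, Delta_X 0x03, Delta_Y 0x04) but should be verified against the datasheet revision in use; the `read_reg` callable abstracts the MCU-specific SPI transaction (chip select, address byte, read delay, data byte) and is injected here so the sketch stays self-contained.

```python
# ADNS-3080 register addresses (per the Avago datasheet; verify before use).
REG_MOTION, REG_DELTA_X, REG_DELTA_Y = 0x02, 0x03, 0x04

def to_signed8(b):
    """Delta registers hold 8-bit two's-complement values."""
    return b - 256 if b > 127 else b

def poll_motion(read_reg):
    """One polling step (run every 20 ms in this setup).

    `read_reg(addr) -> byte` abstracts the SPI transaction, which is
    MCU-specific; reading the Motion register first latches the deltas.
    """
    if not (read_reg(REG_MOTION) & 0x80):   # MOT bit: motion since last read?
        return 0, 0
    dx = to_signed8(read_reg(REG_DELTA_X))
    dy = to_signed8(read_reg(REG_DELTA_Y))
    return dx, dy

# Example with a fake bus reporting a motion of (-3, +5) counts.
fake = {0x02: 0x80, 0x03: 0xFD, 0x04: 0x05}
dx, dy = poll_motion(fake.get)
```

At a 20 ms period and a maximum delta of ±127 counts per read, the readable speed is bounded; the 35 cm/s test speed stays well inside that envelope at the nominal 31.75 μm/count scale.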
Most causes of height variation in indoor navigation, excluding variations associated with robot suspension, are rough carpet, doorsills, low obstacles, uneven ground, and any objects that a robot can climb. To verify the feasibility of the fabricated afocal OFS module in these environments, four types of floor materials were selected for the experiments: laminate floor, vinyl sheet, texture-style carpet, and loop-style carpet (Figure 6). The height between the OFS and the surface of the material is varied from 30 to 50 mm in seven steps, and the sensor module is moved by the robotic gantry system over a distance of 80 cm at a speed of 35 cm/s. Each experiment is repeated 10 times, and the image quality is examined every 5 mm within the experimental height interval.
Figure 6. Floor materials used in this work: (a) texture-style carpet; (b) loop-style carpet; (c) laminate floor; and (d) vinyl sheet.
Finally, the afocal OFS module is installed on a mobile robot (Figure 7) and tested 10 times on the loop-style carpet (Figure 6b) over distances of 1 m. The loop-style carpet is selected for the robot experiments because its surface height variation is the greatest among the flooring materials tested (Figure 6). The height variation of the loop-style carpet is shown in Figure 8. The surface profile is measured with a laser distance sensor (DLS-B 30, DIMETIX, Herisau, Switzerland) [37], which has a resolution of 0.1 mm, mounted on the robotic gantry system. The laser sensor is moved from 0 to 200 mm in 1 mm intervals along the linear motion guide while the surface profile is recorded. The average height and standard deviation (SD) are 86 mm and 8.7 mm, respectively.
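The profile statistics reported above reduce to a mean and standard deviation over the 1 mm samples. A minimal sketch (the sample values below are made up purely for illustration; only the paper's reported figures of 86 mm mean and 8.7 mm SD describe the real carpet):

```python
import math

def profile_stats(heights_mm):
    """Mean and population standard deviation of a measured surface profile,
    as computed for the 0-200 mm laser scan of the loop-style carpet."""
    n = len(heights_mm)
    mean = sum(heights_mm) / n
    sd = math.sqrt(sum((h - mean) ** 2 for h in heights_mm) / n)
    return mean, sd

# Illustrative samples only; the paper reports mean 86 mm, SD 8.7 mm.
mean, sd = profile_stats([80.0, 86.0, 95.0, 78.0, 91.0])
```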
Moving distance information is gathered while the robot is moved by manual operation. After the acquisition of the experimental data, the average moving distance estimation error and SD are calculated.
Figure 7. Vacuum-cleaning robot equipped with an afocal OFS module: (a) rear view and (b) bottom view.
Figure 8. Laser measured surface profile of the loop-style carpet.

3. Results and Discussion

This section analyzes the performance of the suggested afocal OFS module in robustly estimating distance in indoor environments with height variation. A first experiment was carried out to determine whether the suggested sensor system improves image clarity. Images of the word “an” (MS Word, 9 pt font size, Times New Roman) obtained with both OFS modules are shown in Figure 9 as the height between the sensor and the test floor changes from 30 mm to 50 mm at intervals of 5 mm; the images acquired through the pinhole are consistently clearer than those obtained without one. The only difference between the afocal OFS module and the conventional fixed-focal-length OFS module used in this test is the existence of the pinhole; the conventional OFS module is calibrated at a height of 35 mm. Since the conventional OFS module has no pinhole, it can be regarded as having a very large pinhole, and as the pinhole size increases, more poorly collimated light rays enter, making the image less clear and increasing the sensitivity to height variations.
Figure 9. Image acquisition for different OFS module at different heights: (a) sensor height in mm; (b) images obtained using the conventional fixed-focal-length OFS module; and (c) images obtained using the proposed afocal OFS module.
The measured mean distance and SD are shown in Figure 10. In addition to the data in Figure 10, Table 1 lists the error percentages obtained while varying the height between each OFS module and the floor material. The results show that a 1 mm offset in sensor height induces a 0.1% systematic error in the afocal module over the height range of 30 mm to 40 mm; in comparison, the systematic error for a 1 mm offset in a conventional OFS module with a fixed focal length over the height range of 30 mm to 35 mm is 14.7%. Table 1 also shows that when the height of the afocal OFS rises above 40 mm, the error rates start to increase. This is because the pinhole diameter is not ideally zero, and the illumination on the surface decreases as the height increases.
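The per-millimetre figure quoted above can be reproduced from two Table 1 rows: the change in measured distance per 1 mm of height, expressed as a percent of the true 800 mm travel. The example values are read from Table 1 (afocal module, laminate floor, 30 mm and 40 mm); the function name is ours.

```python
def systematic_error_per_mm(mean_a, mean_b, height_a, height_b, true_mm=800.0):
    """Change in measured distance per 1 mm of sensor height, as a percent
    of the true travel distance, estimated from two measured means."""
    return abs(mean_b - mean_a) / abs(height_b - height_a) / true_mm * 100.0

# Afocal module, laminate floor: 799 mm at 30 mm height, 806 mm at 40 mm.
slope = systematic_error_per_mm(799.0, 806.0, 30.0, 40.0)  # ~0.09 %/mm
```

This is consistent with the paper's rounded figure of about 0.1% per millimetre for the afocal module in its working range.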
Figure 10. OFS-measured mean distances and standard deviations after 10 repeated measurements on each floor type at different heights: (a) texture-style carpet; (b) loop-style carpet; (c) laminate floor; and (d) vinyl sheet.
The test results for a mobile robot with an installed OFS module are shown in Figure 11 and Table 2. For ten 1000 mm movements of the robot over loop-style carpet, the average estimated distance is 1000.2 mm (0.02% distance error), and the SD is 17.6 mm. For the conventional OFS module with a fixed focal length, the average estimated distance is 959.1 mm (4.09% distance error), and the SD is 25.7 mm.
Figure 11. OFS-measured mean distances and standard deviations after 10 repeats on the loop-style carpet in a mobile robot: (a) conventional OFS module and (b) proposed afocal OFS module.
Table 1. Computed distance error for various module heights.
(Conv. = conventional fixed-focal-length OFS module; Afocal = proposed afocal OFS module.)

| Floor Material | Height (mm) | Conv. Mean (mm) | Conv. SD | Conv. Error (%) | Afocal Mean (mm) | Afocal SD | Afocal Error (%) |
|---|---|---|---|---|---|---|---|
| Laminate Floor | 30 | 0 | 0 | 100.0 | 799 | 0.9 | 0.1 |
| Laminate Floor | 33 | 678 | 2.9 | 15.3 | 802 | 0.9 | 0.2 |
| Laminate Floor | 35 | 802 | 0.4 | 0.3 | 804 | 0.8 | 0.5 |
| Laminate Floor | 37 | 823 | 0.7 | 2.9 | 805 | 0.7 | 0.6 |
| Laminate Floor | 40 | 8 | 1.4 | 99.0 | 806 | 0.5 | 0.8 |
| Laminate Floor | 45 | 0 | 0 | 100.0 | 694 | 2.9 | 13.3 |
| Laminate Floor | 50 | 0 | 0 | 100.0 | 0 | 0.0 | 100.0 |
| Vinyl Sheet | 30 | 29 | 2.4 | 96.4 | 820 | 0.4 | 2.5 |
| Vinyl Sheet | 33 | 809 | 2.2 | 1.1 | 820 | 0.5 | 2.5 |
| Vinyl Sheet | 35 | 827 | 0.6 | 3.4 | 820 | 0.6 | 2.5 |
| Vinyl Sheet | 37 | 452 | 22.1 | 43.5 | 819 | 0.7 | 2.3 |
| Vinyl Sheet | 40 | 0 | 0 | 100.0 | 817 | 0.7 | 2.1 |
| Vinyl Sheet | 45 | 0 | 0 | 100.0 | 722 | 2.4 | 9.8 |
| Vinyl Sheet | 50 | 0 | 0 | 100.0 | 253 | 2.4 | 68.4 |
| Texture-Style Carpet | 30 | 671 | 20.6 | 16.1 | 789 | 2.6 | 1.4 |
| Texture-Style Carpet | 33 | 796 | 6.5 | 0.5 | 799 | 1.6 | 0.2 |
| Texture-Style Carpet | 35 | 790 | 6.8 | 1.3 | 808 | 0.6 | 1.0 |
| Texture-Style Carpet | 37 | 482 | 18.3 | 39.8 | 810 | 0.6 | 1.3 |
| Texture-Style Carpet | 40 | 179 | 9.1 | 77.6 | 812 | 0.5 | 1.5 |
| Texture-Style Carpet | 45 | 28 | 4.7 | 96.5 | 807 | 1.5 | 0.9 |
| Texture-Style Carpet | 50 | 0 | 0 | 100.0 | 614 | 4.4 | 23.3 |
| Loop-Style Carpet | 30 | 267 | 9.5 | 66.6 | 794 | 0.5 | 0.8 |
| Loop-Style Carpet | 33 | 749 | 2.3 | 6.4 | 802 | 0.4 | 0.2 |
| Loop-Style Carpet | 35 | 810 | 0.3 | 1.3 | 809 | 0.3 | 1.1 |
| Loop-Style Carpet | 37 | 803 | 1.6 | 0.4 | 804 | 0.5 | 0.4 |
| Loop-Style Carpet | 40 | 532 | 3.4 | 33.5 | 798 | 0.6 | 0.3 |
| Loop-Style Carpet | 45 | 0 | 0 | 100.0 | 737 | 1.8 | 7.9 |
| Loop-Style Carpet | 50 | 0 | 0 | 100.0 | 390 | 5.9 | 51.3 |
Table 2. Estimated moving distances for 1 m movements of mobile robots on loop-style carpet.
| Module | Mean (mm) | Standard Deviation (mm) | Error (%) |
|---|---|---|---|
| Conventional Fixed-Focal-Length OFS Module | 959.1 | 25.7 | 4.09 |
| Proposed Afocal OFS Module | 1000.2 | 17.6 | 0.02 |

4. Conclusions

In this paper, a novel, more scale-invariant afocal OFS module using a single OFS for height-varying indoor environments was introduced and experimentally evaluated. To minimize the horizontal displacement measurement error caused by OFS height variation while the robot is moving, an afocal system was applied to an OFS for the first time.
We conducted experiments to evaluate the performance of the proposed method on a linear motion guide over carpet, laminate floor, and vinyl sheet. The sensor height was varied from 30 mm to 50 mm in seven steps over a travel distance of 80 cm, and each experiment was repeated 10 times. The systematic error corresponding to a 1 mm change in sensor height for the proposed afocal OFS module was 0.1%; for comparison, the corresponding value for a conventional OFS module with a fixed focal length was 14.7%. Finally, the proposed afocal OFS module was installed on a mobile robot and tested 10 times on the loop-style carpet over distances of 1 m. The average distance estimation error and standard deviation (SD) for the new module were 0.02% and 17.6 mm, respectively; in contrast, the corresponding values for the conventional OFS module were 4.09% and 25.7 mm, respectively. As the afocal OFS moved outside the vertical height range of 30 mm to 45 mm, the measured moving distances decreased because the pinhole diameter is not ideally zero and the illumination on the surface decreases as the height increases.
The proposed approach using an afocal system and an optical mouse sensor showed encouraging results and robust performance for accurate indoor odometry, even in the presence of localized high and low points. The afocal OFS could be an indispensable sensor for indoor robotic navigation.

Acknowledgments

This research has been supported by LG Electronics Inc., Seoul, Korea.

Author Contributions

Dong-Hoon Yi designed and fabricated the device and implemented software for obtaining the displacement and images from the OFS module. Tae-Jae Lee performed the experiments. Dong-Il “Dan” Cho supervised all work and is the corresponding author.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Ishigami, G.; Nagatani, K.; Yoshida, K. Path planning for planetary exploration rovers and its evaluation based on wheel slip dynamics. In Proceedings of the 2007 IEEE International Conference on Robotics and Automation, Roma, Italy, 10–14 April 2007; pp. 2361–2366.
  2. Borenstein, J.; Everett, H.R.; Feng, L.; Wehe, D. Mobile robot positioning: Sensors and techniques. J. Robot. Syst. 1997, 14, 231–249.
  3. Munguia, R.; Castillo-Toledo, B.; Grau, A. A robust approach for a filter-based monocular simultaneous localization and mapping (SLAM) system. Sensors 2013, 13, 8501–8522.
  4. Li, I.H.; Chen, M.C.; Wang, W.Y.; Su, S.F.; Lai, T.W. Mobile robot self-localization system using single webcam distance measurement technology in indoor environments. Sensors 2014, 14, 2089–2109.
  5. Lee, D.; Myung, H. Solution to the SLAM problem in low dynamic environments using a pose graph and an RGB-D sensor. Sensors 2014, 14, 12467–12496.
  6. Guerra, E.; Munguia, R.; Grau, A. Monocular SLAM for autonomous robots with enhanced features initialization. Sensors 2014, 14, 6317–6337.
  7. McCarthy, C.; Barnes, N. Performance of optical flow techniques for indoor navigation with a mobile robot. In Proceedings of the 2004 IEEE International Conference on Robotics and Automation, New Orleans, LA, USA, 26 April–1 May 2004; pp. 5093–5098.
  8. Campbell, J.; Sukthankar, R.; Nourbakhsh, I. Techniques for evaluating optical flow for visual odometry in extreme terrain. In Proceedings of the 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems, Sendai, Japan, 28 September–2 October 2004; pp. 3704–3711.
  9. Nagatani, K.; Tachibana, S.; Sofne, M.; Tanaka, Y. Improvement of odometry for omnidirectional vehicle using optical flow information. In Proceedings of the 2000 IEEE/RSJ International Conference on Intelligent Robots and Systems, Takamatsu, Japan, 30 October–5 November 2000; pp. 468–473.
  10. Jackson, J.D.; Callahan, D.W.; Marstrander, J. A rationale for the use of optical mice chips for economic and accurate vehicle tracking. In Proceedings of the 2007 IEEE International Conference on Automation Science and Engineering, Scottsdale, AZ, USA, 22–25 September 2007; pp. 939–944.
  11. Ng, T. The optical mouse as a two-dimensional displacement sensor. Sens. Actuators A Phys. 2003, 107, 21–25.
  12. Lee, S.; Song, J. Mobile robot localization using optical flow sensors. Int. J. Control Autom. Syst. 2004, 2, 485–493.
  13. Palacin, J.; Valganon, I.; Pernia, R. The optical mouse for indoor mobile robot odometry measurement. Sens. Actuators A Phys. 2006, 126, 141–147.
  14. Bonarini, A.; Matteucci, M.; Restelli, M. A kinematic-independent dead-reckoning sensor for indoor mobile robotics. In Proceedings of the 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems, Sendai, Japan, 28 September–2 October 2004; pp. 3750–3755.
  15. Bonarini, A.; Matteucci, M.; Restelli, M. Automatic error detection and reduction for an odometric sensor based on two optical mice. In Proceedings of the 2005 IEEE International Conference on Robotics and Automation, Barcelona, Spain, 18–22 April 2005; pp. 1675–1680.
  16. Sekimori, D.; Miyazaki, F. Self-localization for indoor mobile robots based on optical mouse sensor values and simple global camera information. In Proceedings of the 2005 IEEE International Conference on Robotics and Biomimetics, Hong Kong SAR and Macau SAR, 29 June–3 July 2005; pp. 605–610.
  17. Minoni, U.; Signorini, A. Low-cost optical motion sensors: An experimental characterization. Sens. Actuators A Phys. 2006, 128, 402–408.
  18. Bradshaw, J.; Lollini, C.; Bishop, B. On the development of an enhanced optical mouse sensor for odometry and mobile robotics education. In Proceedings of the Thirty-Ninth Southeastern Symposium on System Theory, Macon, GA, USA, 4–6 March 2007; pp. 6–10.
  19. Kim, S.; Lee, S. Robust velocity estimation of an omnidirectional mobile robot using a polygonal array of optical mice. Int. J. Control Autom. Syst. 2008, 6, 713–721.
  20. Avago. ADNS-3080 and ADNS-3088 High Performance Optical Sensor. Available online: http://www.alldatasheet.com (accessed on 5 February 2015).
  21. Cooney, J.; Xu, W.; Bright, G. Visual dead-reckoning for motion control of a Mecanum-wheeled mobile robot. Mechatronics 2004, 14, 623–637.
  22. Tunwattana, N.; Roskilly, A.; Norman, R. Investigations into the effects of illumination and acceleration on optical mouse sensors as contact-free 2D measurement devices. Sens. Actuators A Phys. 2009, 149, 87–92.
  23. Ross, R.; Devlin, J. Analysis of real-time velocity compensation for outdoor optical mouse sensor odometry. In Proceedings of the 2010 11th International Conference on Control Automation Robotics & Vision, Singapore, 7–10 December 2010; pp. 839–843.
  24. Ross, R.; Devlin, J.; Wang, S. Toward refocused optical mouse sensors for outdoor optical flow odometry. IEEE Sens. J. 2012, 12, 1925–1932.
  25. Dahmen, H.; Mallot, H.A. Odometry for ground moving agents by optic flow recorded with optical mouse chips. Sensors 2014, 14, 21045–21064.
  26. Font, D.; Tresanchez, M.; Palleja, T.; Teixido, M.; Palacin, J. Characterization of a low-cost optical flow sensor when using an external laser as a direct illumination source. Sensors 2011, 11, 11856–11870.
  27. Dille, M.; Grocholsky, B.; Singh, S. Outdoor downward-facing optical flow odometry with commodity sensors. In Field and Service Robotics; Springer: Berlin, Germany, 2010; pp. 183–193.
  28. Hyun, D.; Yang, H.S.; Park, H.R.; Park, H.-S. Differential optical navigation sensor for mobile robots. Sens. Actuators A Phys. 2009, 156, 296–301.
  29. John, E.G. Field Guide to Geometrical Optics; SPIE Press: Bellingham, WA, USA, 2004.
  30. Avago. Optical Mice and How They Work. Available online: http://www.digikey.com (accessed on 5 February 2015).
  31. Siah, T.H.; Kong, H.Y. Optical Navigation Using One-Dimensional Correlation. U.S. Patent 7,315,013, January 2008.
  32. Hu, J.-S.; Chang, Y.-J.; Hsu, Y.-L. Calibration and on-line data selection of multiple optical flow sensors for odometry applications. Sens. Actuators A Phys. 2009, 149, 74–80.
  33. Mensov, S.N.; Lee, H.Y.; Kim, S.W.; Yi, D.H.; Yoon, J.S.; Yoon, H.W.; Park, D.W.; Cho, M.J. Sensor System and Position Recognition System. Korean Patent Registration No. 101304948, March 2008.
  34. Dream Robot System. DRSB90. Available online: http://www.drs11.co.kr (accessed on 5 February 2015).
  35. MiSUMi. XWG60. Available online: http://www.misumi-ec.com (accessed on 6 February 2015).
  36. Microchip. dsPIC33FJ256MC710A. Available online: http://www.microchip.com (accessed on 6 February 2015).
  37. DIMETIX. DLS-B 30. Available online: http://www.dimetix.com (accessed on 9 April 2015).
