
High-Efficiency Automatic Recharging Mechanism for Cleaning Robot Using Multi-Sensor

1 Department of Computer Science and Information Engineering, National Yunlin University of Science and Technology, Yunlin 64002, Taiwan
2 Intelligent Recognition Industry Service Research Center (IRIS Research Center), National Yunlin University of Science and Technology, Yunlin 64002, Taiwan
3 College of Future, Bachelor Program in Interdisciplinary Studies, National Yunlin University of Science and Technology, Yunlin 64002, Taiwan
* Author to whom correspondence should be addressed.
Sensors 2018, 18(11), 3911; https://doi.org/10.3390/s18113911
Submission received: 26 October 2018 / Revised: 7 November 2018 / Accepted: 8 November 2018 / Published: 13 November 2018
(This article belongs to the Section Physical Sensors)

Abstract

Cleaning robots have the highest penetration rate among service robots. This paper proposes a high-efficiency mechanism that lets an intelligent cleaning robot automatically return to its charging dock in a short time when its power is insufficient. The proposed mechanism first combines the robot's own motor encoders with neural network linear regression to estimate the moving distance and rotation angle, and hence the robot's own location. At the same time, a self-rotating camera scans the number of infrared spots on the docking station to locate the station, so that the cleaning robot returns to charge properly in two stages, the existing infrared range and the extended infrared range. In addition, six-axis acceleration and ultrasonic sensors are applied to handle the angle error caused by collisions. Experimental results show that the proposed recharging mechanism significantly improves recharging efficiency.

1. Introduction

In recent years, mobile robots have been widely used in industrial automation, home automation, hospitals, entertainment, space exploration, the military, etc. [1,2,3,4,5,6,7,8,9,10,11,12,13]. Thanks to their decreasing size and cost, more and more mobile robots now work around us and help in our daily lives, for example with cooking, cleaning, house plant watering, pet feeding, and taking care of children [1,2,3,4,5]. In order to complete these tasks continuously and efficiently, many auto-recharging methods have been proposed.
Current efficient auto-recharging methods can be classified into three types. (1) In simultaneous localization and mapping (SLAM), the robot starts in an unknown environment and, as it moves, repeatedly senses its surroundings to estimate its own position and state, incrementally building an environment map from the information gathered at each estimated location [14,15,16,17,18,19]. Lidar SLAM has high measurement accuracy but is expensive; camera SLAM is cheaper, but large images make the computation complicated. (2) In wireless positioning and recharging, the strength of a received wireless signal, such as Bluetooth, WiFi, or ZigBee, is used to estimate the position of the docking station [20,21,22,23]. Each wireless device places different requirements on the environment, so this method is susceptible to environmental factors; in particular, the signal strength is seriously affected when the same frequency band is used in the same area. (3) In multi-sensor positioning and recharging, the recharging path is determined by sensing the position and orientation of the docking station with sensors such as ultrasound, cameras, and infrared [24,25,26,27,28,29]. The advantage of this approach is that it can fuse information collected by multiple sensors, yielding positioning accuracy better than that obtained from a single sensor, and its cost is much lower than that of lidar SLAM.
Based on the review above, this paper uses multiple sensors to propose a high-efficiency automatic recharging mechanism for cleaning robots, the most widely used robots in the home. Cleaning robots come in general and intelligent types. A general-type cleaning robot has only basic cleaning and recharging functions, without positioning or path planning. Intelligent cleaning robots are equipped with sensors, such as ultrasound, cameras, and infrared, that support positioning and path planning. Using ultrasound, a camera, infrared, and a six-axis acceleration sensor, we design a high-efficiency positioning and path-planning mechanism that lets an intelligent cleaning robot automatically return to charging in a short time when its power is insufficient. The proposed mechanism first combines the robot's own motor encoders with neural network linear regression to estimate the moving distance and rotation angle, and hence the robot's own location. To adjust the travel direction of the robot until it enters the existing infrared signal range, a self-rotating camera scans the number of infrared spots on the docking station; the robot moves toward the spots until the count changes from one to two, which indicates entry into the existing infrared signal range (the 60-degree range), since only one spot is captured in the extended infrared range of 90 degrees. Depending on the robot's speed, the six-axis acceleration and ultrasonic sensors are applied hierarchically to handle collisions: the six-axis sensor compensates the angle error caused by collisions when the travel speed is slow, while the ultrasonic sensors avoid obstacles when the travel speed is fast.
The rest of this paper is organized as follows. Section 2 lists the specification of the test cleaning robot and hardware setting. Section 3 presents the proposed recharging mechanism. Section 4 shows experimental results. Section 5 concludes our work.

2. The Test Cleaning Robot and Hardware Setting

2.1. The Test Cleaning Robot

The test robot in this paper is a teaching robot produced by the American company iRobot, as shown in Figure 1. The user communicates with the robot through an RS-232 interface. On top of this robot, we combine an Arduino platform, a Raspberry Pi 3, a camera, and ultrasonic sensors to build the automatic return-charging mechanism. Figure 2 shows the overall structure of the robot. The hardware setting and the proposed mechanism are introduced in the following subsections.
RS-232 is a serial transmission standard that transfers characters bit by bit, usually over RX and TX pins. The robot receives instructions from the Arduino on RX and reports its state back to the Arduino on TX. The robot has three modes: (1) Passive mode, which enables the robot but does not allow modifying its internal sensor parameters; (2) Safe mode, which enables the robot and allows modifying internal sensor parameters, with some sensors restricted; and (3) Full mode, which extends Safe mode by removing all sensor-modification restrictions.
This paper adopts mode (3). According to the official specification [5], the robot is controlled by Opcodes (Open Interface control instructions), including Start, Stop, Drive, and Drive Direct, which represent start, stop, forward/backward, and rotation, respectively. We set the wheel speed value so that the robot travels at that speed and turns left or right; for example, a speed value of 50 makes the robot advance at 50 mm/s.
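To make the Opcode interface concrete, the following Python sketch drives the robot over a serial link using pyserial. It is our illustration, not code from the paper; the opcode values (Start = 128, Full = 132, Drive Direct = 145, Stop = 173) follow the iRobot Create 2 Open Interface specification as we know it and should be verified against the official document, and the port name is an assumption.

```python
import struct
import time

import serial  # pyserial

PORT = "/dev/ttyUSB0"  # assumed device name on the controlling board
BAUD = 115200          # Create 2 default baud rate

def drive_direct(ser, right_mm_s, left_mm_s):
    """Opcode 145 (Drive Direct): signed 16-bit wheel speeds in mm/s, right wheel first."""
    ser.write(struct.pack(">Bhh", 145, right_mm_s, left_mm_s))

with serial.Serial(PORT, BAUD, timeout=1) as ser:
    ser.write(bytes([128]))    # Start: open the interface (Passive mode)
    ser.write(bytes([132]))    # Full: mode (3), no sensor restrictions
    drive_direct(ser, 50, 50)  # advance straight at 50 mm/s
    time.sleep(2)              # travel for 2 s
    drive_direct(ser, 0, 0)    # halt the wheels
    ser.write(bytes([173]))    # Stop: close the interface
```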
For sweeping, the cleaning robot offers three functions: random-path cleaning, spiral-path cleaning, and square-path cleaning. For automatically returning to charge, it uses the signal emitted by the infrared transmitter on the docking station: while traveling, the robot continuously scans for this infrared signal, which guides it back to the docking station for charging.

2.2. Arduino for Calculating Travel Distance and Direction Angle

A general cleaning robot does not know its own position while performing its tasks. When returning to the docking station to charge, it can rely only on infrared scanning; if the robot is outside the range of the infrared signal emitted by the docking station, it takes a long time to find the station and recharge. To solve this problem, we use the embedded Arduino platform to control the robot and to calculate the travel distance and direction angle, as shown in Figure 3.

2.3. Infrared LED and Camera

The robot's built-in automatic return-to-charge method continuously scans for the docking station's infrared signal using the infrared receiver on the robot. Figure 4 shows the emission range of the infrared light. Within this range, the robot can return to charge once it detects the infrared signal; outside it, however, the robot does not know that the docking station is nearby and walks randomly, possibly even away from the station.
In order to improve this situation, we added three infrared LEDs to the docking station to extend the existing infrared signal range. Since the robot's infrared receiver cannot detect the added LEDs, a camera is needed, with the captured images processed on the Raspberry Pi 3, as shown in Figure 5 and Figure 6.
To determine the camera's coverage after it was added, we ran the test shown in Figure 7: the black dotted circle marks where the cleaning robot was placed, and the upper part of the figure shows the corresponding angle to the docking station. We first placed the robot in the middle, at the 70-degree position, and checked whether the camera captured the infrared light. Testing showed that the coverage extends up to 90 degrees.
Extending the range beyond 100 degrees would require adding more LEDs, but in practice the 90-degree range is enough for the robot to return to the docking station correctly, as shown in Table 1.

2.4. Six-Axis Sensors

The MPU-6050 six-axis sensor, shown in Figure 8, consists of an accelerometer and a gyroscope that together determine the robot's current direction and angle. Its specifications are listed in Table 2.

3. Proposed Mechanism

This section combines neural network linear regression with the sensing techniques described above, including the self-rotating camera, the six-axis sensor, and the ultrasonic sensors, into a high-efficiency recharging mechanism.

3.1. Neural Network-Linear Regression

Before introducing the proposed mechanism for automatically returning the cleaning robot to its charger, we review the neural network in the linear regression case for later use. A neuron with output y_j receives inputs x_i from predecessor neurons through connections with weights w_{ji}:

  y_j = \sum_i w_{ji} x_i = W_j^T X      (1)

where W_j = [w_{ji}]^T and X = [x_i]^T. The error e_j between the desired output d_j and y_j is

  e_j = d_j - y_j = d_j - W_j^T X      (2)

or, equivalently,

  \frac{1}{2} e_j^2 = \frac{1}{2} (d_j - W_j^T X)^2      (3)

The total error is

  \frac{1}{2} e^2 = \frac{1}{2} \sum_j e_j^2 = \frac{1}{2} \sum_j (d_j - W_j^T X)^2      (4)

which implies

  \frac{\partial}{\partial W_j} \frac{1}{2} e^2 = -\sum_j (d_j - W_j^T X) X = -e X      (5)

By introducing a parameter \eta called the learning rate, the new weight vector W_j^{new} is obtained by

  W_j^{new} = W_j - \eta (-e X) = W_j + \eta e X      (6)
As shown in Figure 9, each input corresponds to a weight; the weighted sum of the inputs forms the input of the next layer, and the weights are updated continuously until convergence, which yields the final continuous output.
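As a minimal sketch of Equations (1)-(6), the update rule can be coded directly; this is our illustration, not the authors' implementation, and the function name and defaults are ours:

```python
import numpy as np

def train_linear_regression(X, d, eta=0.0001, iterations=500):
    """Fit y = W^T x by the update W <- W + eta * e * x of Equation (6).
    X: (n_samples, n_features) inputs; d: (n_samples,) desired outputs."""
    W = np.zeros(X.shape[1])
    for _ in range(iterations):
        for x, target in zip(X, d):
            e = target - W @ x   # error e = d - W^T x, Equation (2)
            W = W + eta * e * x  # gradient step, Equation (6)
    return W
```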

3.1.1. Travel Distance Estimation

The distance D traveled by the cleaning robot is generally given by

  D = v (dt / 100)      (7)

where v is the wheel speed value, dt is the walking duration in milliseconds, and dt/100 expresses the time in units of 100 ms. Replacing dt/100 by t, Equation (7) can be rewritten as

  D(t) = v t      (8)

In other words, 1 s corresponds to 1000 ms / 100 ms = 10, i.e., t = 10, so a travel time of 2 s gives D = v(2 × 10). To verify the accuracy of Equation (8), we performed a series of experiments; the results are shown in Table 3. With the relation distance error = actual travel distance − estimated travel distance, these results indicate that the longer the time, the greater the error, as shown in Figure 10.
In order to reduce the distance error in Table 3, we apply the neural network linear regression of Section 3.1 to fit the curve f_error sketched by the distance error in Figure 10:

  f_{error} = \sum_i w_i x_i = W^T X      (9)

  W^{new} = W - \eta (-e X) = W + \eta e X      (10)

where W = [w_i]^T, X = [x_i]^T, and \eta = 0.0001.
After about 500 iterations the error saturates. We therefore train for 500 iterations and compare the estimated travel distance with the actual travel distance; the results are shown in Table 4. The gap between the estimated and the actual travel distance is greatly reduced. When the walking time reaches 30 to 40 s, the actual travel distance gradually becomes smaller than the estimate; however, a home robot rarely travels in a continuous straight line for 30 s, and the proposed method recalculates the distance whenever the robot collides with or detects an obstacle, so the region beyond 30 s is not considered.
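For example, f_error can be fitted to the Table 3 measurements with the train_linear_regression sketch above and then used to correct the raw estimate; the bias column and the compensation helper are our additions:

```python
import numpy as np

# Table 3: travel time t vs. distance error (cm); a bias column gives an intercept.
t = np.array([10, 20, 30, 40, 50, 60, 70, 80, 90, 100], dtype=float)
err = np.array([1.45, 2.8, 3.2, 4.1, 5.2, 6.2, 7.1, 7.8, 8.7, 9.4])
X = np.column_stack([t, np.ones_like(t)])

W = train_linear_regression(X, err, eta=0.0001, iterations=500)

def compensated_distance(raw_estimate_cm, t):
    """Raw estimate D(t) plus the learned error curve f_error(t)."""
    return raw_estimate_cm + W @ np.array([t, 1.0])
```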

3.1.2. Rotation Angle Estimation

To locate the robot, in addition to the estimated travel distance we need the rotation angle of the robot's body, given by

  \theta = (delay / 100) (|E_R - E_L| / L)      (11)

where \theta is the rotation angle of the robot, E_R and E_L are the speed values of the right and left wheels, and L is the distance between the two wheels. Replacing the time term by t as in the previous subsection gives

  \theta(t) = t (|E_R - E_L| / L)      (12)
To verify the accuracy of Equation (12), we ran a series of tests; the results are shown in Table 5. They show that the longer the rotation lasts, the larger the angle error becomes. The actual rotation angle is smaller than the estimate, so actual rotation angle = estimated rotation angle − angle error. To reduce the angle error in Table 5, we apply the neural network linear regression of Section 3.1 to fit the curve \theta_{error} sketched by the angle error in Figure 11:

  \theta_{error} = \sum_i w_i x_i = W^T X      (13)

  W^{new} = W - \eta (-e X) = W + \eta e X      (14)

where W = [w_i]^T and X = [x_i]^T.
After 300 to 500 training iterations, the resulting angle error is already close to zero. We therefore train for 300 iterations and compare the estimated rotation angle with the actual rotation angle; the results are shown in Table 6. The gap between the estimated and the actual rotation angle is greatly reduced.
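The same training sketch applies unchanged to the angle error; for instance, fitting the Table 5 errors (the learning rate here is our choice, since the text does not specify one for the angle case):

```python
import numpy as np

# Table 5: rotation time (s) vs. angle error (degrees).
t = np.array([1.0, 1.5, 2.0, 2.5, 3.0])
err = np.array([12.5, 19.825, 27.1, 34.375, 41.15])
X = np.column_stack([t, np.ones_like(t)])

W = train_linear_regression(X, err, eta=0.01, iterations=300)
# Compensated angle = estimated angle - (W @ [t, 1]), per the relation above.
```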

3.2. Cleaning Robot Position Estimation

Having compensated the estimates of the robot's travel distance and rotation angle, we can estimate its position. In a Cartesian coordinate system with the docking station at (0, 0), the robot position is updated by

  x' = x + D \cos(90° - \theta)
  y' = y + D \sin(90° - \theta),  0 ≤ \theta ≤ 2\pi      (15)

where D is the travel distance and 90° − θ is the angle between the robot's heading and the x-axis, as shown in Figure 12.
The compensated travel distance and rotation angle are computed on the Arduino to obtain the robot's Cartesian coordinates. In addition, the Arduino is connected to the Raspberry Pi over USB, so the robot's coordinates can be observed on an end device through the Raspberry Pi, as shown in Figure 13.
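A minimal sketch of the update in Equation (15) follows; it is our illustration, while the actual computation runs on the Arduino:

```python
import math

def update_pose(x, y, D, theta_deg):
    """Equation (15): advance the pose by travel distance D; the heading is
    90 degrees minus the accumulated rotation angle theta (dock at the origin)."""
    heading = math.radians(90.0 - theta_deg)
    return x + D * math.cos(heading), y + D * math.sin(heading)

# Example: from the dock at (0, 0), travel 30 cm after rotating 30 degrees.
print(update_pose(0.0, 0.0, 30.0, 30.0))  # -> (15.0, 25.98)
```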

3.3. Self-Rotating Camera for Searching Docking Station

To adjust the robot's heading until it enters the existing infrared signal range, we propose a search method that combines the self-rotating camera with the angle and distance estimates of the previous section. It works as follows. When an infrared spot is captured via the HoughCircles function [30], the current estimated angle and distance are used to judge whether the docking station is on the robot's left or right, and the robot proceeds in that direction. This is repeated until the number of infrared spots changes from one to two, indicating that the robot has entered the existing infrared signal range (the 60° range), because only one spot is captured in the 90° range. Moreover, the radius of the captured spot can be compared against Table 7 to obtain the actual distance between the robot and the docking station, thereby eliminating the distance error.
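A sketch of the spot detection with OpenCV's HoughCircles is shown below; the preprocessing and the Hough parameters are our assumptions, chosen only to respect the 9-33 pixel radius range of Table 7:

```python
import cv2

def detect_ir_spots(frame):
    """Detect candidate infrared spots in one camera frame; returns (x, y, r) triples."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.medianBlur(gray, 5)                         # suppress sensor noise
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1, minDist=20,
                               param1=100, param2=30,      # edge/accumulator thresholds
                               minRadius=9, maxRadius=33)  # radius bounds from Table 7
    return [] if circles is None else list(circles[0])

# Two detected spots mean the robot has entered the 60-degree existing range;
# the spot radius can then be looked up in Table 7 to estimate the distance.
```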

3.4. Collision Angle Error Compensation by Six-Axis Sensors

While executing its tasks, the cleaning robot may collide with obstacles. A collision can displace the robot and change its heading angle. To correct this collision offset, this paper uses a six-axis sensor (MPU-6050) to compensate the collision error angle. As shown in Table 8, collisions introduce about 10° of error between the estimated and actual angles. To reduce this error, we use the two parts of the six-axis sensor, the accelerometer and the gyroscope. The raw values read from them are shown in Figure 14.

3.4.1. The Part of the Accelerometer

Since the cleaning robot travels horizontally, we compute the angle \phi_{acce} between the Z-axis and the X-axis of the accelerometer from the raw accelerometer readings a_z and a_x:

  \phi_{acce} = \tan^{-1}(ac_z / ac_x) \times 180 / \pi      (16)

where ac_z = a_z / acce_{sen}, ac_x = a_x / acce_{sen}, and acce_{sen} is the sensitivity of the accelerometer in LSB/g.

3.4.2. The Part of the Gyroscope

The gyroscope measures angular velocity based on the conservation of angular momentum. The rotation rate illustrated in Figure 15 is usually expressed in dps (°/s). Since the robot travels horizontally, we compensate the angular offset only around the Z-axis: if only the Z-axis rotates, the X- and Y-axis outputs are 0 and the Z-axis output is the angular velocity of rotation.
We first calculate the angular velocity v_z around the Z-axis by

  v_z(t) = g_z(t) / gyro_{sen}      (17)

where g_z(t) is the raw value read from the gyroscope and gyro_{sen} is its sensitivity. The rotation angle \phi_{gyro} around the Z-axis is then

  \phi_{gyro} = \int v_z(t) \, dt      (18)
Both methods have advantages and disadvantages: the accelerometer is prone to error under fast vibration (high frequency), while the gyroscope is prone to drift when almost stationary (low frequency). To estimate the collision offset, we therefore fuse the two with a parameter \alpha (0 < \alpha < 1):

  \phi = \alpha \phi_{acce} + (1 - \alpha) \phi_{gyro}      (19)

This completes the collision angle error compensation. The detailed usage flow of the six-axis sensor is shown in Figure 16.
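The following sketch combines Equations (16)-(19); the sensitivity constants are the MPU-6050 datasheet values for the smallest ranges in Table 2, and the fusion weight α = 0.3 is our assumption, since the paper does not report its value:

```python
import math

ACCE_SEN = 16384.0  # LSB/g at the +/-2 g setting (MPU-6050 datasheet)
GYRO_SEN = 131.0    # LSB/(deg/s) at the +/-250 deg/s setting
ALPHA = 0.3         # fusion weight, 0 < alpha < 1; the value is our assumption

def fuse_angle(ax_raw, az_raw, gz_raw, phi_gyro_prev, dt_s):
    """Fuse the accelerometer and gyroscope angle estimates (Equations (16)-(19))."""
    # Equation (16): angle between the Z- and X-axis from the accelerometer.
    phi_acce = math.degrees(math.atan2(az_raw / ACCE_SEN, ax_raw / ACCE_SEN))
    # Equations (17)-(18): integrate the Z-axis angular velocity over one step.
    phi_gyro = phi_gyro_prev + (gz_raw / GYRO_SEN) * dt_s
    # Equation (19): weighted combination of the two estimates.
    return ALPHA * phi_acce + (1.0 - ALPHA) * phi_gyro, phi_gyro
```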

3.5. Avoid Obstacles with Ultrasonic Sensors

When the cleaning robot moves fast, a collision produces a large angular offset from the impact force, and the six-axis sensor cannot sample fast enough, so the compensation fails. To solve this problem, as shown in Figure 17 and Figure 18, we install three ultrasonic sensors on the robot so that it changes direction immediately when it detects nearby obstacles, avoiding collisions altogether.
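A sketch of the avoidance logic, assuming three forward-facing sensors (left, center, right); the threshold, polling rate, and the read_distance_cm stub are ours, since the paper does not give them:

```python
import time

SAFE_CM = 20.0  # clearance threshold; the value is our assumption

def read_distance_cm(sensor_id):
    """Placeholder for an HC-SR04-style echo-time reading; on the real robot the
    three sensors (left, center, right) are read through the Arduino."""
    raise NotImplementedError

def avoid_obstacles(drive_forward, turn):
    """Poll the three sensors and steer away from whichever side is blocked."""
    while True:
        left, center, right = (read_distance_cm(i) for i in range(3))
        if center < SAFE_CM:
            turn("left" if left > right else "right")  # obstacle ahead: take the freer side
        elif left < SAFE_CM:
            turn("right")
        elif right < SAFE_CM:
            turn("left")
        else:
            drive_forward()                            # path clear: keep moving
        time.sleep(0.05)                               # poll at about 20 Hz (our choice)
```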

3.6. The Proposed Mechanism

This subsection combines the components of Sections 3.1, 3.2, 3.3, 3.4 and 3.5 into the complete high-efficiency automatic recharging mechanism, shown in Figure 19 and summarized as follows (a control-loop sketch follows Step 5):
Step 1. Enable the cleaning robot and let it perform its tasks.
Step 2. Wait for the recharge instruction. When the cleaning robot receives it, determine the robot's current position and enable the recharge functions of Steps 3 and 4; otherwise go back to Step 1.
Step 3. Use a speed threshold to select between the six-axis sensor and the ultrasonic sensors: the six-axis sensor compensates the collision angle error when the travel speed is slow, while the ultrasonic sensors avoid obstacles when the travel speed is fast.
Step 4. Calculate the robot's angle and capture infrared spots via the HoughCircles function to judge whether the charging dock is on the robot's left or right according to the current estimated angle and distance, and proceed in that direction. Repeat until the number of infrared spots changes from one to two, indicating entry into the existing infrared signal range (the 60° range); then go to Step 5.
Step 5. Enable infrared scanning until the robot docks and charges correctly.
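The control-loop sketch referenced above; the robot facade and all of its method names are hypothetical glue over the pieces developed in the previous subsections, and the switching speed is our assumption:

```python
SPEED_THRESHOLD_MM_S = 100  # Step 3 switching speed; the value is our assumption

def auto_recharge(robot):
    """Steps 1-5 as a loop; `robot` is a hypothetical facade over the drive
    commands and sensors described in the previous subsections."""
    while not robot.recharge_requested():      # Steps 1-2: clean until power is low
        robot.clean()
    while not robot.docked():
        if robot.speed_mm_s() > SPEED_THRESHOLD_MM_S:
            robot.ultrasonic_avoidance()       # Step 3: fast travel -> ultrasound
        else:
            robot.six_axis_compensation()      # Step 3: slow travel -> six-axis sensor
        if len(robot.camera_ir_spots()) >= 2:  # Step 4: two spots = 60-degree range
            robot.infrared_docking()           # Step 5: built-in infrared homing
        else:
            robot.steer_toward_dock_estimate() # Step 4: use the position estimate
```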

4. Experimental Results

The experimental environment is 275 cm long and 254 cm wide and contains obstacles, as shown in Figure 20. The black circle is the starting position of the cleaning robot, and the charging dock is placed behind it.

4.1. Collision Angle Compensation

To verify the effect of the six-axis angle compensation, we first measured the robot without it, comparing the estimated and actual position and angle over time. The results in Table 9 show large differences between the estimated and actual positions and angles. Adding the six-axis sensor yields the corrected results in Table 10: the collision angle error is much smaller than without compensation, and the estimated position stays within 50 cm of the actual position.
Although six-axis compensation improves accuracy at a travel speed of 50 mm/s, at 200 mm/s the error becomes large again, as shown in Table 11. The faster the robot moves, the greater the impact force, so the angular offset from a collision is larger and the six-axis sensor's sampling cannot keep up, causing the compensation to fail. To solve this problem, we use the three ultrasonic sensors: when the robot detects nearby obstacles, it immediately changes direction, avoiding collisions and the angle errors they cause.
Table 12 shows the results after adding the ultrasonic sensors. The angle error caused by collisions is greatly reduced; however, the longer the robot travels, the more the accumulated error causes the estimated position to drift from the actual one.

4.2. Comparison between General-Type Auto-Recharging Mechanism and the Proposed Auto-Recharging Mechanism

This section compares the general-type recharging mechanism with the proposed one. The general-type mechanism has only the basic infrared-detection recharging function, with no additional sensors and no means of computing its position. The proposed mechanism combines several sensors to estimate the approximate position of the robot and builds the automatic recharging procedure on that estimate. The comparison is as follows.
The experiment has two parts. In the first part, the cleaning robot is placed outside the range of the infrared signal emitted by the docking station, i.e., outside the dotted fan in front of the docking station in Figure 21, and is then asked to return to charge. As shown in Figure 21a–d for the general-type mechanism, the circle with an arrow marks the starting position of the cleaning robot, and the arrow shows its initial orientation. When the recharge instruction is issued, the robot travels in the direction of the arrow; if it reaches no infrared signal, it wanders until the signal is detected.
Figure 21e–h shows the automatic recharging mechanism proposed in this paper. The robot walks to the same positions as in the general-type test and then activates the recharging function. Following the proposed mechanism, it determines which direction to take and completes the recharge action. From these figures and Table 13, the recharging efficiency is clearly much improved.
The second part of the experiment is performed inside the infrared signal range of the charging dock. Similarly, we compare the recharging paths and times in different directions for the general-type and the proposed mechanisms, as shown in Figure 22a–h and Table 14. The automatic recharging time of the proposed mechanism is still shorter than that of the general-type mechanism.

5. Conclusions

This paper designs a high-efficiency automatic recharging mechanism that plans the return to the charger using the travel distance, rotation angle, and other information. We explored a variety of approaches to find suitable solutions for the robot's travel distance, rotation angle, collision-induced angle error, and related issues. The resulting return-charging method is clearly more efficient than that of the general-type mechanism.
In future research, we will continue to reduce the error that accumulates during long travel, which remains even though machine learning is used to compensate the errors of the cleaning robot's travel distance and rotation angle. We will also improve the accuracy of the image recognition.

Author Contributions

Formal analysis, C.-L.C.; Methodology, C.-Y.C.; Software, Z.-Y.T.; Validation, S.-T.C.

Funding

This research received no external funding.

Acknowledgments

This work was financially supported by the “Intelligent Recognition Industry Research Service Center (IRIS Research Center)” from The Featured Areas Research Center Program within the framework of the Higher Education Sprout Project by the Ministry of Education (MOE) in Taiwan.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Song, G.; Wei, Z.; Zhang, W.; Song, A. A hybrid sensor network system for home monitoring applications. IEEE Trans. Consum. Electron. 2007, 53, 1434–1439.
2. Song, G.; Zhou, Y.; Wei, Z.; Song, A. A smart node architecture for adding mobility to wireless sensor networks. Sens. Actuators A Phys. 2008, 147, 216–221.
3. Song, G.; Yin, K.; Zhou, Y.; Cheng, X. A Surveillance Robot with Hopping Capabilities for Home Security. IEEE Trans. Consum. Electron. 2009, 55, 2034–2039.
4. Nugent, C.D.; Finlay, D.D.; Fiorini, P.; Tsumaki, Y.; Prassler, E. Home automation as a means of independent living. IEEE Trans. Autom. Sci. Eng. 2008, 5, 1–8.
5. Oh, Y.; Yoon, J.; Park, J.; Kim, M.; Kim, H. A name recognition based call-and-come service for home robots. IEEE Trans. Consum. Electron. 2008, 54, 247–253.
6. de Jesus Rubio, J. Discrete time control based in neural networks for pendulums. Appl. Soft Comput. 2018, 68, 821–832.
7. Meda-Campaña, J.A.; Grande-Meza, A.; Rubio, J.D.; Tapia-Herrera, R.; Hernández-Cortés, T.; Curtidor-López, A.V.; Páramo-Carranza, L.A.; Cázares-Ramírez, I.O. Design of stabilizers and observers for a class of multivariable TS fuzzy models on the basis of new interpolation functions. IEEE Trans. Fuzzy Syst. 2018, 26, 2649–2662.
8. de Jesus Rubio, J.; García, E.; Aquino, G.; Aguilar-Ibáñez, C.; Pacheco, J.; Zacarías, A. Learning of operator hand movements via least angle regression to be teached in a manipulator. Evol. Syst. 2018.
9. Pan, Y.; Liu, Y.; Xu, B.; Yu, H. Hybrid feedback feedforward: An efficient design of adaptive neural network control. Neural Netw. 2016, 76, 122–134.
10. Rubio, J.J. Modified optimal control with a back propagation network for robotic arms. IET Control Theory Appl. 2012, 6, 2216–2225.
11. Bae, Y. An improved measurement method for the strength of radiation of reflective beam in an industrial optical sensor based on laser displacement meter. Sensors 2016, 16, 752.
12. Cho, J.-H.; Kim, Y.-G. Design of autonomous logistics transportation robot system with fork-type lifter. Int. J. Fuzzy Logic Intell. Syst. 2017, 17, 177–186.
13. Tiep, D.K.; Lee, K.; Im, D.-Y.; Kwak, B.; Ryoo, Y.-J. Design of fuzzy-PID controller for path tracking of mobile robot with differential drive. Int. J. Fuzzy Logic Intell. Syst. 2018, 18, 220–228.
14. Wolf, D.F.; Sukhatme, G.S. Mobile robot simultaneous localization and mapping in dynamic environments. Auton. Robots 2005, 19, 53–65.
15. Sujan, V.A.; Meggiolaro, M.A.; Belo, F.A.W. A new technique in mobile robot simultaneous localization and mapping. Rev. Controle Autom. 2006, 17.
16. Chou, Y.-S.; Liu, J.-S. A robotic indoor 3D mapping system using a 2D laser range finder mounted on a rotating four-bar linkage of mobile platform. Int. J. Adv. Robot. Syst. 2013, 10, 257–271.
17. Santos, J.M.; Portugal, D.; Rocha, R.P. An evaluation of 2D SLAM techniques available in Robot Operating System. In Proceedings of the 2013 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR), Linkoping, Sweden, 21–26 October 2013; pp. 1–6.
18. Pajaziti, A.; Avdullahu, P. SLAM—Map building and navigation via ROS. Int. J. Intell. Syst. Appl. Eng. (IJISAE) 2014, 2, 71–75.
19. Lee, S.; Tewolde, G.; Lim, J.; Kwon, J. 2D SLAM solution for low-cost mobile robot based on embedded single board computer. In Proceedings of the 2017 World Congress on Advances in Nano, Bio, Robotics and Energy (ANBRE17), Seoul, Korea, 28 August–1 September 2017.
20. Sheu, J.P.; Hsieh, K.Y.; Cheng, P.W. Design and implementation of mobile robot for nodes replacement in wireless sensor networks. J. Inf. Sci. Eng. 2008, 24, 393–410.
21. Song, G.; Wang, H.; Zhang, J.; Meng, T. Automatic docking system for recharging home surveillance robots. IEEE Trans. Consum. Electron. 2011, 57, 428–435.
22. Su, K.-L.; Liao, Y.-L.; Lin, S.-P.; Lin, S.-F. An Interactive Auto-recharging System for Mobile Robots. Int. J. Autom. Smart Technol. 2014, 4, 43–53.
23. Saravanan, S.; Ramamoorthy, K. Implementation of Automatic Docking System Based Home Surveillance Robot. In Proceedings of the International Conference on Innovations in Engineering and Technology (ICIET), Bangkok, Thailand, 5–6 August 2016.
24. Roufas, K.; Zhang, Y.; Duff, D.; Yim, M. Six degree of freedom sensing for docking using IR LED emitters and receivers. In Proceedings of the 7th International Symposium on Experimental Robotics (ISER), Waikiki, HI, USA, 11–13 December 2000.
25. Luo, R.C.; Su, K.L. Multilevel Multisensor-Based Intelligent Recharging System for Mobile Robot. IEEE Trans. Ind. Electron. 2008, 55, 270–279.
26. Liu, P.J.; Zhu, Y.H.; Cui, X.D.; Wang, X.L.; Yan, J.B.; Zhao, J. Multisensor-based autonomous docking for ubot modular reconfigurable robot. In Proceedings of the International Conference on Mechatronics and Automation (ICMA), Chengdu, China, 5–8 August 2012; pp. 772–776.
27. Ruiz, E.; Acuna, R.; Certad, N.; Terrones, A.; Cabrera, M.E. Development of a control platform for the mobile robot roomba using ROS and a Kinect sensor. In Proceedings of the Latin American Robotics Symposium and Competition, Arequipa, Peru, 21–27 October 2013; pp. 55–60.
28. Zhu, Y.; Jin, H.; Zhang, X. A multi-sensory autonomous docking approach for a self-reconfigurable robot without mechanical guidance. Int. J. Adv. Robot. Syst. 2014, 11, 10.
29. Poomarin, W.; Chancharoen, R.; Sangveraphunsiri, V. Automatic docking with obstacle avoidance of a differential wheel mobile robot. Int. J. Mech. Eng. Robot. Res. 2016, 5, 11–16.
30. HoughCircles. Available online: http://monkeycoding.com/?p=654 (accessed on 10 November 2018). (In Chinese)
Figure 1. iRobot Create 2.
Figure 2. Cleaning robot hardware architecture.
Figure 3. Arduino is used as the platform for calculating travel distance and direction angle.
Figure 4. Existing range of infrared light.
Figure 5. Camera.
Figure 6. Raspberry Pi 3.
Figure 7. Infrared signal range with and without adding three infrared LEDs.
Figure 8. MPU-6050 six-axis sensor.
Figure 9. Machine learning-linear regression.
Figure 10. f_error curve.
Figure 11. θ_error graph.
Figure 12. Robot position estimation.
Figure 13. Remote observation interface.
Figure 14. Six-axis sensor raw values.
Figure 15. Gyro rotation direction diagram.
Figure 16. Operation flowchart of six-axis sensors.
Figure 17. Three ultrasonic sensors installed on the clean robot.
Figure 18. Operation flowchart of ultrasonic sensors.
Figure 19. Auto-recharging mechanism for cleaning robot.
Figure 20. Experimental environment.
Figure 21. First part of the experiment: the cleaning robot is placed outside the range of the infrared signal emitted by the docking station and asked to return to charge under the general-type and the proposed mechanisms. (a–d) General-type mechanism with the robot facing up, right, left, and down, respectively: without detecting the infrared signal, the robot wanders until the signal is detected, except in (b), where the robot facing right reaches the signal without detouring. (e–h) Proposed mechanism with the robot facing up, right, left, and down, respectively: the robot determines which direction to follow and then finishes the recharge action.
Figure 22. Second part of the experiment: the cleaning robot is placed inside the range of the infrared signal emitted by the docking station and asked to return to charge under the general-type and the proposed mechanisms. (a–d) General-type mechanism with the robot facing up, right, left, and down, respectively: the robot wanders until the infrared signal is detected. (e–h) Proposed mechanism with the robot facing up, right, left, and down, respectively: the robot determines which direction to follow and then finishes the recharge action.
Table 1. Infrared signal range with and without adding three infrared LEDs.

                           | Without Adding Three Infrared LEDs | With Adding Three Infrared LEDs
Infrared signal range (°)  | 60                                 | 60–90
Table 2. Specifications of the MPU-6050 six-axis sensor.

Chip model             | MPU-6050
Power supply           | 3–5 V
Communication protocol | I2C
Gyroscope range        | ±250, ±500, ±1000, ±2000 °/s
Accelerometer range    | ±2, ±4, ±8, ±16 g
AD conversion          | 16-bit AD converter, 16-bit data output
Table 3. Travel distance—general estimation.

Travel Time t (×100 ms) | Estimated Travel Distance (cm) | Actual Travel Distance (cm) | Distance Error (cm)
10  | 5  | 6.45 | 1.45
20  | 10 | 12.8 | 2.8
30  | 15 | 18.2 | 3.2
40  | 20 | 24.1 | 4.1
50  | 25 | 30.2 | 5.2
60  | 30 | 36.2 | 6.2
70  | 35 | 42.1 | 7.1
80  | 40 | 47.8 | 7.8
90  | 45 | 53.7 | 8.7
100 | 50 | 59.4 | 9.4
Table 4. Comparison of travel distance after compensation.

Travel Time t (×100 ms) | Estimated Travel Distance (cm) | Actual Travel Distance (cm) | Distance Error (cm)
10  | 6      | 5.9   | −0.1
20  | 12     | 11.8  | −0.2
30  | 18     | 18.2  | 0.2
40  | 23.97  | 24.2  | 0.23
50  | 29.95  | 30.1  | 0.15
60  | 35.93  | 36.3  | 0.37
70  | 41.92  | 42.4  | 0.48
80  | 47.9   | 48.6  | 0.7
90  | 53.88  | 54.3  | 0.42
100 | 59.86  | 60.4  | 0.54
120 | 71.63  | 71.4  | −0.23
140 | 83.57  | 83.5  | −0.07
160 | 95.51  | 96.2  | 0.69
200 | 119.39 | 118.5 | −0.89
250 | 149.23 | 146   | −3.23
300 | 179.08 | 175.8 | −3.28
400 | 238.77 | 231.7 | −7.07
Table 5. Rotation angle calculation—general estimation method.

Rotation Time (s) | Estimated Rotation Angle (°) | Actual Rotation Angle (°) | Angle Error (°)
1   | 42      | 29.5 | 12.5
1.5 | 63.825  | 44   | 19.825
2   | 85.1    | 58   | 27.1
2.5 | 106.375 | 72   | 34.375
3   | 127.65  | 86.5 | 41.15
Table 6. Comparison of rotation angle after compensation.

Rotation Time t (×100 ms) | Estimated Rotation Angle (°) | Actual Rotation Angle (°) | Angle Error (°)
10 | 29.49   | 29    | 0.49
15 | 44.025  | 44    | 0.025
20 | 58.01   | 59    | 0.99
25 | 71.985  | 72    | 0.015
30 | 85.97   | 86.5  | 0.53
35 | 100.895 | 101.5 | 0.6
40 | 113.09  | 114   | 0.91
45 | 129.675 | 127.5 | −2.175
50 | 144.07  | 142.3 | −1.77
55 | 158.465 | 157.5 | −0.965
Table 7. Distance table.

Distance between Robot and Docking Station (cm) | Radius (pixel)
10  | 25–33
20  | 25–27
30  | 25–27
40  | 19–23
50  | 18–20
60  | 18–20
70  | 17–19
80  | 15–18
90  | 14–17
100 | 12–15
110 | 9–12
Table 8. Collision angle offset without adding the six-axis sensor.

Collision Number | Estimated Rotation Angle (°) | Actual Rotation Angle (°) | Angle Error (°)
1  | 58.4  | 57  | 1.4
2  | 350   | 352 | 2
3  | 281   | 286 | 5
4  | 212.8 | 218 | 5.2
5  | 161.4 | 170 | 8.6
6  | 110   | 120 | 10
7  | 41.3  | 52  | 8.7
8  | 332.8 | 347 | 14.2
9  | 264   | 277 | 13
10 | 195   | 206 | 11
Table 9. Comparison without the six-axis sensor.

Time (s) | Estimated Position (cm) | Actual Location (cm) | Distance Error (cm) | Estimated Angle (°) | Actual Angle (°) | Angle Error (°)
1  | (−58.1, 157)    | (−56.8, 164.3)  | 7.4   | 102.4 | 103 | 0.6
2  | (94.3, 182)     | (97.7, 208.2)   | 26.4  | 339.1 | 349 | 9.9
3  | (55, 98.1)      | (83.5, 106.6)   | 29.7  | 276   | 286 | 10
4  | (−9.7, 44.5)    | (17, 46)        | 26    | 118.6 | 134 | 15.4
5  | (−23.1, 259.7)  | (−53.6, 247.5)  | 32.8  | 329.9 | 357 | 27.1
6  | (115.5, 105.6)  | (131, 154.3)    | 51.1  | 241   | 276 | 34.3
7  | (−101, 103.4)   | (33.5, 58.2)    | 141.4 | 110.1 | 137 | 26.9
8  | (−3.1, 263.4)   | (−73.2, 236)    | 75.3  | 54.9  | 92  | 37.1
9  | (−64.7, 196.9)  | (−66.7, 142.2)  | 54.7  | 225.8 | 268 | 42.2
10 | (−11.3, 11.6)   | (95, 35.5)      | 108.7 | 351   | 33  | −42
15 | (−4.9, 50.2)    | (137.2, 8.6)    | 148.1 | 248.1 | 253 | 4.9
20 | (−68.3, 61.7)   | (36.5, 53.4)    | 105.1 | 128.7 | 121 | 7.7
30 | (−140.1, 14)    | (135.6, 1.2)    | 275.7 | 109   | 96  | 13
40 | (−148.2, 161.7) | (−83.2, 246.4)  | 106.8 | 19.6  | 8   | 11.6
50 | (−137.4, 26.6)  | (92.1, 3.4)     | 230.7 | 171.6 | 170 | 1.6
Table 10. Comparison of adding six-axis sensors (speed 50 mm/s).

Time (s) | Estimated Position (cm) | Actual Location (cm) | Distance Error (cm) | Estimated Angle (°) | Actual Angle (°) | Angle Error (°)
1  | (−49, 146.1)    | (−36, 152)      | 14.3 | 123.6 | 125 | 1.4
2  | (−15.5, 192.3)  | (−4.2, 194.4)   | 11.5 | 38.6  | 39  | 0.4
3  | (111.8, 146.2)  | (122, 133.8)    | 16.1 | 222   | 223 | 1
4  | (−87.5, 58.1)   | (−73.5, 56.6)   | 14.1 | 84.3  | 86  | 1.7
5  | (−15.2, 252)    | (−14.8, 243.5)  | 8.5  | 307   | 305 | 2
6  | (137.2, 214.7)  | (131.4, 200)    | 15.8 | 252.7 | 246 | 6.7
7  | (22.5, 6.5)     | (10, 9.5)       | 12.9 | 160.9 | 156 | 4.9
8  | (34.2, 140.8)   | (22.3, 139.8)   | 11.9 | 7.8   | 10  | 2.2
9  | (77, 237.5)     | (77.3, 230)     | 7.5  | 166.1 | 160 | 6.1
10 | (−84.2, 165.3)  | (−80.1, 170.2)  | 6.4  | 75    | 71  | 4
15 | (125.4, 33.8)   | (138.2, 35.4)   | 12.9 | 48.3  | 45  | 3.3
20 | (−45.9, 102.4)  | (−54.1, 98.6)   | 9    | 59.4  | 56  | 3.4
30 | (154.6, 153.4)  | (133.7, 148.6)  | 21.4 | 350.3 | 347 | 3.3
40 | (−66.8, 79.2)   | (−50.4, 95.3)   | 22.9 | 244.9 | 240 | 4.9
50 | (118.8, 240.6)  | (145.2, 227.3)  | 29.5 | 147.1 | 141 | 6.1
Table 11. Comparison of adding six-axis sensors (speed 200 mm/s).

Time (s) | Estimated Position (cm) | Actual Location (cm) | Distance Error (cm) | Estimated Angle (°) | Actual Angle (°) | Angle Error (°)
1  | (−62.4, 155.1) | (−90.5, 223.7) | 74.1  | 106.64 | 103 | 3.64
2  | (32.3, 38)     | (65.1, 72.4)   | 47.5  | 216.9  | 210 | 6.9
3  | (79.1, 24.9)   | (122.3, 78.3)  | 68.7  | 235.4  | 260 | 24.6
4  | (−2.5, 204.2)  | (−70.5, 244.9) | 79.2  | 331.9  | 350 | 18.1
5  | (86.2, 89.3)   | (54.2, 85)     | 32.3  | 84.1   | 70  | 14.1
6  | (11.7, 63.8)   | (37.5, 5.6)    | 63.7  | 171.4  | 161 | 10.4
7  | (26.7, 140.5)  | (75.1, 168.6)  | 56    | 307.4  | 321 | 13.6
8  | (−3.5, 236.5)  | (−76, 243.2)   | 72.8  | 343.6  | 358 | 14.4
9  | (−13.3, 102.7) | (−83.9, 31.1)  | 100.6 | 17     | 40  | 23
10 | (124.3, 121.9) | (46, 141.5)    | 80.7  | 228.9  | 230 | 1.1
15 | (−53.4, 108.4) | (−15.6, 133.2) | 45.2  | 114.8  | 122 | 7.2
20 | (33.9, 214.5)  | (59.1, 175.3)  | 46.6  | 106.8  | 108 | 1.2
30 | (148.6, 77.2)  | (88.9, 114.9)  | 70.6  | 43.6   | 54  | 10.4
40 | (−38.6, 87.5)  | (−86.3, 115.7) | 55.4  | 215.3  | 227 | 11.7
50 | (146.3, 158.9) | (86.4, 133.8)  | 64.9  | 157.2  | 172 | 14.8
Table 12. Comparison of adding ultrasonic sensors (speed 200 mm/s).

Time (s) | Estimated Position (cm) | Actual Location (cm) | Distance Error (cm) | Estimated Angle (°) | Actual Angle (°) | Angle Error (°)
1  | (−103.5, 214.2) | (−101, 219)    | 5.4  | 193.3 | 192 | 1.3
2  | (17.1, 236.7)   | (14.6, 241.6)  | 5.5  | 179.9 | 177 | 2.9
3  | (91.8, 165.8)   | (95.8, 150.3)  | 16   | 112   | 111 | 1
4  | (18.9, 155.3)   | (14.3, 129.1)  | 26.6 | 112   | 110 | 2
5  | (129.6, 120.1)  | (123.5, 109)   | 12.7 | 96.3  | 96  | 0.3
6  | (141.1, 128.7)  | (134.1, 116.2) | 14.3 | 268.1 | 268 | 0.1
7  | (123.9, 113.8)  | (110.3, 100.1) | 19.3 | 114.3 | 114 | 0.3
8  | (−87.6, 108.4)  | (−99, 96.9)    | 16.6 | 97.6  | 94  | 3.6
9  | (−54.9, 240.1)  | (−64.3, 229)   | 14.5 | 354   | 351 | 3
10 | (130.3, 97.9)   | (123, 88.2)    | 11.9 | 98    | 96  | 2
15 | (−103.3, 208.2) | (−80.5, 195.1) | 26.3 | 282.2 | 277 | 5.2
20 | (112.9, 125.3)  | (133.1, 124.3) | 20.2 | 75.5  | 73  | 2.5
30 | (84.5, 30.3)    | (53.7, 25.2)   | 31.2 | 347.9 | 341 | 6.9
40 | (104.3, 107)    | (118.4, 98)    | 16.7 | 128.6 | 127 | 1.6
50 | (99.2, 160)     | (120.6, 185.4) | 33.2 | 91.8  | 90  | 1.8
Table 13. Time spent in the first part of the experiment.

General-Type Mechanism | Recharging Time (s) | The Proposed Mechanism | Recharging Time (s)
(a) | 76  | (e) | 36
(b) | 20  | (f) | 26
(c) | 35  | (g) | 33
(d) | 112 | (h) | 29
Table 14. Time spent in the second part of the experiment.

General-Type Mechanism | Recharging Time (s) | The Proposed Mechanism | Recharging Time (s)
(a) | 32 | (e) | 31
(b) | 43 | (f) | 30
(c) | 29 | (g) | 30
(d) | 41 | (h) | 35
