Article

Remote Control of a Mobile Robot for Indoor Patrol

Department of Communications, Navigation and Control Engineering, National Taiwan Ocean University, 2 Pei-Ning Road, Keelung 20224, Taiwan
* Author to whom correspondence should be addressed.
Appl. Sci. 2016, 6(3), 82; https://doi.org/10.3390/app6030082
Submission received: 29 November 2015 / Revised: 14 February 2016 / Accepted: 25 February 2016 / Published: 15 March 2016

Abstract

This study applies a smartphone, Bluetooth, and a Wi-Fi wireless network to the remote control of a wheeled mobile robot (WMR). The first part of this study demonstrates that the WMR can be controlled manually by a smartphone, which can remotely command forward, backward, left-turn, and right-turn operations. The second part of this article presents object tracking: the WMR can follow a moving object through the use of image processing for object tracking and distance detection. In the third part, infrared sensors and fuzzy system algorithms are integrated into the control scheme. Through wall-following and obstacle-avoidance control, the WMR can successfully perform indoor patrol.

Graphical Abstract

1. Introduction

Over the past few years, different types of wheeled mobile robots (WMRs) have been proposed. The WMR’s advantages include high mobility, high load capacity, and easy control. Previous WMR studies have chiefly focused on developing effective performance and helping people work in various environments. Many robots have been used in applications such as indoor services, space exploration, military undertakings, entertainment, and healthcare services. In recent years, intelligent control has been applied to WMRs for a variety of tasks [1,2,3,4,5,6,7,8,9,10,11,12,13]. One of the most common intelligent controllers is the fuzzy controller [9,10,11,12,13], because its design is relatively simple and flexible. This paper presents three control schemes. The first is smartphone remote control, in which commands are sent over a Zigbee wireless link to the computer-controlled robot; forward, backward, left-turn, and right-turn movements can be performed via the smartphone. The second control scheme uses the EYECAM camera and image processing to detect a moving object and calculate the distance to its center, and then applies a fuzzy controller to track the moving object; the moving target can be clearly seen on the screen of the smartphone. In the third scheme, the WMR integrates infrared sensors into the fuzzy control system and distance detection. A camera mounted on the WMR provides surveillance, so the robot’s surroundings can be seen on the phone, and home patrol and monitoring can therefore be accomplished.
With advances in technology and industry, WMR applications such as path following, navigation, target tracking, multi-vehicle coordination, and path planning have been widely discussed. Many of these applications use intelligent systems in controller design, such as fuzzy systems and neural networks. Fuzzy reasoning is quite similar to human thinking: fuzzy logic offers greater fault tolerance, generalizes well, and is better suited to nonlinear systems. When a WMR has to work in a variety of environments, the choice of controller becomes a very important issue. In the paper of Juang et al. [10], the vehicle used a visual sensor to track a moving object; the authors applied a simplified type-2 fuzzy system to a WMR for target-following control. Zhan et al. [12] used rapid path planning and fuzzy logic control to make the WMR trace expected paths. Seder et al. [14] proposed a method based on the integration of a focused D* search algorithm and a dynamic window local obstacle-avoidance algorithm, with adaptations that provide efficient avoidance of moving obstacles. In the paper of Chung et al. [15], a position-control algorithm was proposed with two separate feedback loops, a velocity feedback loop and a position feedback loop; the algorithm is designed to compensate for both internal and external errors, and it makes it possible for the WMR to accurately follow the designed trajectory. Other studies include the use of image processing and a cerebellar model articulation controller for path-following control [1], a laser sensor-positioning algorithm in controller design for unknown-environment map building [4], automobile ultrasonic sensors [6], a localization system for path planning [7], and the distance between the object and the camera [9].
In this study, remote control by smartphone for mini-vehicle applications [16,17] is proposed. In our previous work [17], the mobile robot could only be turned on via Wi-Fi and performed preset path-following control. In this study, more functions are added to the mobile robot: in addition to the turn-on control, dynamic object tracking and obstacle-avoidance controls are added. The robot can also be controlled remotely by a personal computer (PC) or a smartphone, and the moving path can be changed by the operator at the remote site. Images captured by the network camera placed on the robot can be transmitted to the remote PC or smartphone. Intelligent control, wireless communications, and sensor fusion are successfully integrated in this study. The outline of this paper is as follows. Section 2 discusses the image processing technology for filtering noise on the floor, capturing targets from the image frame, and computing the distance and angle via the camera. Section 3 presents the WMR dynamic tracking and indoor patrol schemes, and describes the fuzzy steering control rules. Section 4 explains the experimental settings and describes the architecture of the hardware system, which includes the sensors, Zigbee, smartphone, and camera, together with block diagrams of the communications scheme. Section 5 presents five experimental results of the proposed system. Conclusions are given in the last section.

2. Image Processing

In most image processing methods, the light source strongly affects the final result. In this study, the image in the red/green/blue (RGB) color space is transformed to the hue/saturation/value (HSV) color space, which significantly reduces the impact of lightness. The HSV model defines a color space in terms of three constituent components: hue (H), saturation (S), and value (V). H is an angle from 0 to 360 degrees. Saturation indicates the amount of grey in the color and ranges from 0 to 100% (sometimes expressed as a value from 0 to 1). Value is the brightness of the color, varies with color saturation, and ranges from 0 to 100%. The hue and saturation ranges can be set to appropriate values so that the image processing acts as a filter, discarding useless information and preserving useful information. The center of an object can then be detected, and the camera’s distance-calculation method gives the object’s position, so the robot can track a moving object. The RGB color space values are converted to HSV color space values by the following equations [17,18,19]:
$$H = \begin{cases} \cos^{-1}\dfrac{\frac{1}{2}\left[(R-G)+(R-B)\right]}{\sqrt{(R-G)^{2}+(R-B)(G-B)}}, & B \le G \\[2mm] 2\pi-\cos^{-1}\dfrac{\frac{1}{2}\left[(R-G)+(R-B)\right]}{\sqrt{(R-G)^{2}+(R-B)(G-B)}}, & B > G \end{cases} \quad (1)$$

$$S = \frac{\max(R,G,B)-\min(R,G,B)}{\max(R,G,B)} \quad (2)$$

$$V = \frac{\max(R,G,B)}{255} \quad (3)$$
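For concreteness, the RGB-to-HSV conversion of Equations (1)–(3) can be written in a few lines. The paper implements its image processing in C# (with Emgu CV), so the following standalone C# sketch is our reconstruction rather than the authors’ code; the 1/2 factor in the hue numerator follows the standard derivation of this formula:

```csharp
using System;

static class HsvConversion
{
    // Convert an 8-bit RGB pixel to HSV following Equations (1)-(3).
    // H is returned in radians [0, 2*pi); S and V are in [0, 1].
    public static (double H, double S, double V) RgbToHsv(byte r8, byte g8, byte b8)
    {
        double R = r8, G = g8, B = b8;
        double max = Math.Max(R, Math.Max(G, B));
        double min = Math.Min(R, Math.Min(G, B));

        // Hue (Equation (1)): an angle in RGB space; undefined for pure
        // grey (R = G = B), in which case we simply return 0.
        double H = 0.0;
        double denom = Math.Sqrt((R - G) * (R - G) + (R - B) * (G - B));
        if (denom > 1e-9)
        {
            // Clamp against floating-point drift before taking the arccosine.
            double c = Math.Min(1.0, Math.Max(-1.0, 0.5 * ((R - G) + (R - B)) / denom));
            double angle = Math.Acos(c);
            H = (B <= G) ? angle : 2.0 * Math.PI - angle;
        }

        // Saturation (Equation (2)): the spread between the channels.
        double S = (max > 0.0) ? (max - min) / max : 0.0;

        // Value (Equation (3)): brightness normalized to [0, 1].
        double V = max / 255.0;

        return (H, S, V);
    }
}
```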
After the HSV transformation, the image is converted to a binary image so that the foreground can easily be separated from the background. Binarization maps every pixel to either 0 or 255, where 0 represents black and 255 represents white. Pixels set to 255 form the foreground, which contains the captured dynamic obstacle, whereas pixels set to 0 form the background to be filtered out. An appropriate threshold must first be chosen: pixels whose values are larger than the threshold are set to 255, and pixels whose values are smaller are set to 0. The threshold affects the captured image in object-tracking control. Figure 1 shows an example of an original RGB image transformed into the HSV and binary images.
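A minimal sketch of this thresholding step is given below; the threshold value is scene-dependent, and the paper does not state a specific number:

```csharp
// Threshold a grey-level image into a binary image: pixels brighter than
// the threshold become foreground (255), everything else background (0).
static byte[,] Binarize(byte[,] grey, byte threshold)
{
    int rows = grey.GetLength(0), cols = grey.GetLength(1);
    var bin = new byte[rows, cols];
    for (int i = 0; i < rows; i++)
        for (int j = 0; j < cols; j++)
            bin[i, j] = grey[i, j] > threshold ? (byte)255 : (byte)0;
    return bin;
}
```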
Distance between the robot and the moving target can be obtained by a geometry method [20]. Figure 2 and Figure 3 show the top view and side view of the EYECAM camera (Draganfly Innovations Inc., Saskatoon, SK, Canada). Three prerequisite angles can be obtained by Equations (4)–(6), where the angles are shown in Figure 2 and Figure 3.
$$\alpha = \tan^{-1}\left(\frac{E}{b}\right) \quad (4)$$

$$\theta = \tan^{-1}\left(\frac{E}{L_y}\right) \quad (5)$$

$$\beta = \tan^{-1}\left(\frac{L_x/2}{b+L_y}\right) \quad (6)$$
E denotes the height from the camera to the floor, b represents the distance from the projection point of the camera on the floor to the closest viewpoint, and Ly is the largest range that the camera can see, as shown in Figure 3. Lx is the largest width that the camera can see; it is the length from the left to the right of the view, as shown in Figure 2. Coordinates of the target can be obtained as follows:
$$y = \frac{E}{\tan\left[\theta+(\alpha-\theta)\dfrac{v}{W}\right]} \quad (7)$$
$$x = y\,\tan\left[\beta\left(\frac{2u}{H}-1\right)\right] \quad (8)$$
x is the distance of the target away from the center line of the camera, as shown in Figure 2. y is the distance of the target away from the WMR, as shown in Figure 3.
In Equations (7) and (8), u and v are the horizontal and vertical pixel coordinates of the center of the captured target, respectively, and H and W are the horizontal and vertical pixel dimensions of the camera image. In this paper, E is 15 cm, Lx is 160 cm, Ly is 150 cm, b is 10 cm, H is 320, and W is 480. After image processing, we obtain the center coordinates of the target, and the distance estimation then gives the distance between the WMR and the target. The dynamic tracking task is performed using this information together with the position of the target in the image frame. Figure 4 shows the relative coordinate of the target with respect to the WMR.
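The full pixel-to-floor mapping can be collected into one routine. The C# sketch below plugs the paper’s constants into Equations (4)–(8); the linear pixel-to-angle interpolation in Equation (7) is our reconstruction of the figure geometry, not code from the paper:

```csharp
using System;

static class TargetGeometry
{
    // Camera geometry constants from the paper (cm / pixels).
    const double E = 15.0;   // camera height above the floor
    const double b = 10.0;   // floor distance to the nearest visible point
    const double Lx = 160.0; // widest visible width
    const double Ly = 150.0; // farthest visible range
    const int H = 320;       // horizontal pixel count
    const int W = 480;       // vertical pixel count

    // Estimate the target's floor coordinates (x, y) from its pixel
    // center (u, v), with v = 0 at the top (far edge) of the image.
    public static (double x, double y) Locate(int u, int v)
    {
        double alpha = Math.Atan(E / b);                  // Equation (4)
        double theta = Math.Atan(E / Ly);                 // Equation (5)
        double beta  = Math.Atan((Lx / 2.0) / (b + Ly));  // Equation (6)

        // Depression angle interpolated linearly across image rows.
        double depression = theta + (alpha - theta) * v / (double)W;
        double y = E / Math.Tan(depression);              // Equation (7)

        // Horizontal offset from the camera center line (u = H/2 gives 0).
        double x = y * Math.Tan(beta * (2.0 * u / H - 1.0)); // Equation (8)
        return (x, y);
    }
}
```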

3. Fuzzy Control

In the first part of this study, the mobile robot is designed to track a moving object in complex environments. Figure 5 shows the dynamic tracking control scheme with the fuzzy controller, and the procedure of dynamic tracking is shown in Figure 6. The proposed fuzzy steering controller has two inputs: the horizontal position of the target in a frame of the captured image, and the distance between the moving target and the WMR. The output of the system is the turning angular velocity of the wheeled mobile robot. Figure 7 shows the membership functions of X, the horizontal position of the target in a frame of the captured image; its fuzzy sets are L, MI, and R, which represent left, intermediate, and right, respectively. Figure 8 shows the membership functions of Y, the vertical position of the target in a frame of the captured image; its fuzzy sets are N, M, and F, which represent near, medium, and far, respectively. The fuzzy sets of the left wheel speed (the output variable D) are LS, LMS, LMM, LMF, LM, LFS, LFM, LFF, and LF, which represent very slow, greatly slow, medium slow, minimally slow, medium, minimally fast, medium fast, fast, and very fast, respectively. Membership functions of the left wheel speed are shown in Figure 9, where the value of the wheel speed is the digital input code of the wheel’s motor. The definition of the right wheel speed is similar. Table 1 lists the fuzzy rules of the left wheel speed for the fuzzy steering controller; the fuzzy rules of the right wheel speed can be obtained by a similar design process.
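To make the inference concrete, the following C# sketch evaluates Table 1 with triangular membership functions and weighted-average defuzzification. The breakpoints and the numeric speed codes are illustrative placeholders; the paper defines them only graphically in Figures 7–9:

```csharp
using System;

static class TrackingFuzzy
{
    // Triangular membership function with peak at c, feet at a and e.
    static double Tri(double x, double a, double c, double e)
        => Math.Max(0.0, Math.Min((x - a) / (c - a), (e - x) / (e - c)));

    // Left-wheel speed rule table (Table 1): rows X = {L, MI, R},
    // columns Y = {N, M, F}. The numeric codes stand in for the motor
    // input codes whose membership functions appear in Figure 9.
    static readonly double[,] Speed =
    {
        // Y = N  Y = M  Y = F
        { 20.0,  40.0,  10.0 },  // X = L : LMS, LMF, LS
        { 30.0,  50.0,  80.0 },  // X = MI: LMM, LM,  LFF
        { 70.0,  60.0,  90.0 },  // X = R : LFM, LFS, LF
    };

    // Infer the left wheel speed from the target's horizontal pixel
    // position u (0..320) and its distance d in cm (0..150).
    public static double LeftWheelSpeed(double u, double d)
    {
        double[] mx = { Tri(u, -1, 0, 160), Tri(u, 0, 160, 320), Tri(u, 160, 320, 321) };
        double[] my = { Tri(d, -1, 0, 75),  Tri(d, 0, 75, 150),  Tri(d, 75, 150, 151) };

        double num = 0.0, den = 0.0;
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++)
            {
                double w = Math.Min(mx[i], my[j]);  // rule firing strength
                num += w * Speed[i, j];
                den += w;
            }
        return den > 0 ? num / den : 0.0;
    }
}
```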
In the second part of the study, a fuzzy control scheme is designed to drive the WMR in an indoor patrol situation. The WMR is controlled by sensor signals that provide distance information, so that the robot can be driven along a wall. In wall following, a safety distance between the robot and the wall is predefined: the robot moves along the wall while keeping a constant distance from it. Two infrared (IR) sensors are installed on the robot, at its right and left sides. Each IR sensor points 45 degrees away from the heading angle; they are marked SR_L and SR_R, as shown in Figure 10. The distance measured by SR_R is dR, and the distance measured by SR_L is dL. The control sequence of the indoor patrol is shown in Figure 11.
In the indoor patrol fuzzy steering controller, the inputs are the two IR detection signals, and the output of the system is the turning angular velocity of the robot. The fuzzy sets of the left IR sensor are LF, LM, and LC, which represent far, medium, and close, respectively; the right IR sensor is defined similarly with sets RF, RM, and RC. The fuzzy sets of the left wheel speed are again LS, LMS, LMM, LMF, LM, LFS, LFM, LFF, and LF, representing very slow, greatly slow, medium slow, minimally slow, medium, minimally fast, medium fast, fast, and very fast, respectively. The fuzzy rules for the indoor patrol controller of the left wheel speed are shown in Table 2.
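The wall-following controller can be sketched in the same style as the tracking controller, with the two IR distances dL and dR as inputs and Table 2 as the rule base. The membership breakpoints and numeric speed codes below are again illustrative assumptions:

```csharp
using System;

static class PatrolFuzzy
{
    // Triangular membership function with peak at c, feet at a and e.
    static double Tri(double x, double a, double c, double e)
        => Math.Max(0.0, Math.Min((x - a) / (c - a), (e - x) / (e - c)));

    // Left-wheel speed codes encoding Table 2. Rows: left IR distance dL
    // {close, medium, far}; columns: right IR distance dR {close, medium, far}.
    // The numeric codes are placeholders for the output fuzzy sets.
    static readonly double[,] Speed =
    {
        { 90.0, 60.0, 70.0 },  // dL close (LC) : LF,  LFS, LFM
        { 80.0, 50.0, 30.0 },  // dL medium (LM): LFF, LM,  LMM
        { 15.0, 40.0, 20.0 },  // dL far (LF)   : LC,  LMF, LMS
    };

    // Fuzzy wall following: dL and dR are the 45-degree IR distances in cm.
    public static double LeftWheelSpeed(double dL, double dR)
    {
        double[] ml = { Tri(dL, -1, 0, 30), Tri(dL, 0, 30, 60), Tri(dL, 30, 60, 61) };
        double[] mr = { Tri(dR, -1, 0, 30), Tri(dR, 0, 30, 60), Tri(dR, 30, 60, 61) };

        double num = 0.0, den = 0.0;
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++)
            {
                double w = Math.Min(ml[i], mr[j]);  // rule firing strength
                num += w * Speed[i, j];
                den += w;
            }
        return den > 0 ? num / den : 0.0;
    }
}
```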

4. Experimental Settings

The WMR used in this study is the Boe-Bot Idrobot (Parallax Inc., Rocklin, CA, USA), as shown in Figure 12. In addition to Bluetooth control, the robot can also be controlled by a smartphone (HTC Corporation, Taoyuan, Taiwan) via a Zigbee wireless module (Texas Instruments Inc., Dallas, TX, USA). Zigbee is used to transmit command signals between the webcam (Kinyo Inc., Hsinchu, Taiwan), the robot, and the PC (ASUSTeK Computer Inc., Taipei, Taiwan). Wi-Fi communication is applied to the connection between the smartphone and the PC in indoor patrol. Because of its limited memory space, the Boe-Bot Idrobot can process and store only 2 KB. The I/O pins occupied by the IR sensors (Parallax Inc., Rocklin, CA, USA) and the Zigbee module are another limitation requiring further study. The transmitting frequencies of the robot’s Zigbee module and camera (Kinyo Inc., Hsinchu, Taiwan) are 2.4 GHz; they are assigned to COM port 5 (COM5) and COM3 on the PC, respectively. The image is transmitted at 50 ms per frame, and Zigbee receives one unit of data every 20 ms; the data is then processed by the PC. Dynamic objects are detected with Emgu CV, the .NET wrapper of the open-source computer vision library, supplied as dynamic link library (DLL) files. Figure 13 shows the operation buttons: stop, up, down, left, right, track, and patrol. This operating interface is used to receive robot information, send control commands, and process the feedback values. The image processing and fuzzy control are coded in C#.
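On the PC side, the two serial links map naturally onto .NET serial ports. A minimal sketch follows; the baud rate is an assumption, since the paper does not state one:

```csharp
using System.IO.Ports;

static class CommPorts
{
    // Open the two serial links used in the experiments: COM5 for the
    // robot's Zigbee module and COM3 for the camera's receiver.
    public static (SerialPort zigbee, SerialPort camera) Open()
    {
        var zigbee = new SerialPort("COM5", 9600);
        var camera = new SerialPort("COM3", 9600);
        zigbee.Open();
        camera.Open();
        return (zigbee, camera);
    }
}
```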
Figure 14 shows the structure of the experimental settings, where (A), (B), and (C) are detailed in Figure 15. Figure 15a shows the radio frequency (RF) wireless link from the camera to the PC. Figure 15b shows the communication from the WMR to the PC: the IR reading is first obtained, then encoded and transferred to the PC over the Zigbee wireless network; COM5 receives the information, and the data is then decoded. Figure 15c shows the control flow of the command sent to the WMR: the fuzzy controller processes the information, and the corresponding encoded speed command is sent to the WMR through COM5 over the Zigbee wireless network. Figure 16 shows the block diagram of the remote control.
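One cycle of the loops in Figure 15b,c can be sketched as follows, reusing the PatrolFuzzy sketch from Section 3. The comma-separated "dL,dR" message and the "L&lt;code&gt;" command format are hypothetical, since the paper does not document its wire protocol:

```csharp
using System.IO.Ports;

// Read the encoded IR distances from COM5, run the wall-following fuzzy
// controller, and send the wheel-speed command back over Zigbee.
static void ControlCycle(SerialPort zigbee)
{
    string line = zigbee.ReadLine();        // e.g. "23,41" from the WMR
    string[] parts = line.Split(',');
    double dL = double.Parse(parts[0]);
    double dR = double.Parse(parts[1]);

    double left = PatrolFuzzy.LeftWheelSpeed(dL, dR);
    zigbee.WriteLine("L" + (int)left);      // encoded left-wheel speed
}
```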

5. Experimental Results

The first experiment is conducted in the terrain shown in Figure 17. Figure 18 shows the PC-based control operation of the wheeled robot for a lower-right-turn control application. The robot is driven by a simple proportional control scheme.
Figure 19 shows the testing environment of the second test, which is based on smartphone control operation. Figure 20 shows the right-turn operation. The manual operation control sequences of the smartphone and the WMR are shown in Figure 21 and Figure 22. After the command arrow is pressed, as shown in Figure 20, the command signal is transmitted to the WMR, which then performs the required action.
The first and second experiments described above are manually operated; the following three experiments use autonomous control. The flowchart of the automatic dynamic tracking control sequence is shown in Figure 6, whereas that of the automatic indoor patrol control sequence is shown in Figure 11. In dynamic tracking, Figure 23 shows the tracking operation, and Figure 24 shows the images on the PC screen captured by the robot’s camera.
Indoor patrol and remote surveillance are also tested. The WMR is controlled by a two-input two-output fuzzy controller. Figure 25 shows the testing field, Figure 26 shows different positions of the wall-following patrol, and Figure 27 shows the images captured on the PC screen.
Remote surveillance is shown in Figure 28, Figure 29, Figure 30 and Figure 31. Figure 28 shows the testing field of the remote surveillance, Figure 29 shows different remote control operations, Figure 30 shows the remote site, and Figure 31 shows the captured images on the smartphone.

6. Conclusions

This study presents the integration of image processing techniques, fuzzy theory, wireless communications, and a smartphone into a wheeled mobile robot (WMR) for real-time moving-object recognition, tracking, and remote surveillance. The WMR uses a webcam to capture its surroundings and calculates the relative position of the target object through image processing and distance computation algorithms. A fuzzy system is applied to robot control. In this study, three sets of experiments are conducted. In the first case, a PC and a smartphone are used to directly control the WMR’s forward, backward, left-turn, and right-turn movements. The second case focuses on target-tracking control: the WMR can track a specific target using the HSV algorithm and a fuzzy controller, and the target can be clearly seen on the smartphone via the webcam on the WMR. In the third case, the WMR is used for surveillance. The WMR can be controlled remotely by a smartphone via wireless communications, and it uses infrared sensors and a fuzzy controller for obstacle avoidance and wall-following control. The WMR can perform indoor patrol and monitor its surroundings, and the home site conditions can be clearly seen on the smartphone. The experiments show that the proposed control design and system integration of the wheeled mobile robot work well for indoor patrol.

Acknowledgments

This study is partially supported by the National Taiwan Ocean University.

Author Contributions

S.Y. Juang and J.G. Juang conceived and designed the experiments; S.Y. Juang performed the experiments; J.G. Juang wrote the paper.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Juang, J.G.; Hsu, K.J.; Lin, C.M. Path following of a wheeled mobile robot based on image processing and adaptive CMAC. J. Marine Sci. Technol. 2014, 22, 331–340.
  2. Juang, J.G.; Chen, H.S.; Lin, C.C. Intelligent path tracking and motion control for wheeled mobile robot. GESTS Int. Trans. Comput. Sci. Eng. 2010, 61, 57–68.
  3. Juang, J.G.; Yu, C.L.; Lin, C.M.; Yeh, R.G.; Rudas, I.J. Real-time image recognition and path tracking to wheeled mobile robot for taking an elevator. Acta Polytech. Hung. 2013, 10, 5–23.
  4. Juang, J.G.; Wang, J.A. Indoor map building by laser sensor and positioning algorithms. Appl. Mech. Mater. 2015, 765, 752–756.
  5. Chen, Y.S.; Wang, W.H.; Juang, J.G. Application of Intelligent Computing to Autonomous Vehicle Control. In Proceedings of the 2010 IEEE World Congress on Computational Intelligence, Barcelona, Spain, 18–23 July 2010.
  6. Lin, C.C.; Juang, J.G. An Obstacle Avoiding Control Strategy for Wheeled Mobile Robot. In Proceedings of the 2008 National Symposium on System Science and Engineering, ILan, Taiwan, 6 June 2008.
  7. Wang, W.H.; Juang, J.G. Application of Localization System to WMR Path Planning and Parking Control. In Proceedings of the 2009 IEEE International Conference on Advanced Intelligent Mechatronics, Singapore, 14–17 July 2009.
  8. Chen, Y.S.; Juang, J.G. Intelligent Obstacle Avoidance Strategy for Wheeled Mobile Robot. In Proceedings of the 2009 ICROS-SICE International Joint Conference, Fukuoka, Japan, 18–21 August 2009.
  9. Juang, J.G.; Wu, C.H. Type-2 fuzzy control of a mobile robot for avoiding moving object. Key Eng. Mater. J. 2011, 474, 1300–1305.
  10. Juang, J.G.; Lo, C.W. Computer-aided mobile robot control based on visual and fuzzy systems. Adv. Sci. Lett. 2012, 13, 84–89.
  11. Wu, C.H.; Juang, J.G. Application of Image Process and Fuzzy Theory to Dynamic Obstacle Avoidance for an Autonomous Vehicle. In Proceedings of the 2010 National Symposium on System Science and Engineering, Taipei, Taiwan, 1–2 July 2010.
  12. Zhan, J.J.; Wu, C.H.; Juang, J.G. Application of Image Process and Distance Computation to WMR Obstacle Avoidance and Parking Control. In Proceedings of the 5th IEEE Conference on Industrial Electronics and Applications, Taichung, Taiwan, 15–17 January 2010; pp. 1264–1269.
  13. Lee, T.H.; Lam, H.K.; Leung, F.H.F.; Tam, P.K.S. Fuzzy Model Reference Control of Wheeled Mobile Robots. In Proceedings of the 27th Annual Conference of the IEEE Industrial Electronics Society, Denver, CO, USA, 29 November–2 December 2001; pp. 570–573.
  14. Seder, M.; Petrović, I. Dynamic Window Based Approach to Mobile Robot Motion Control in the Presence of Moving Obstacles. In Proceedings of the 2007 IEEE International Conference on Robotics and Automation, Roma, Italy, 10–14 April 2007; pp. 1987–1991.
  15. Chung, Y.; Park, C.; Harashima, F. A position control differential drive wheeled mobile robot. IEEE Trans. Ind. Electron. 2001, 48, 853–863.
  16. Kelber, C.R.; Dreger, R.S.; Comes, G.G.K.; Webber, D.; Schirmbeck, J.; Netto, R.H.; Borges, D.A. Cell-phone Guided Vehicle, an Application Based on a Drive-by-wire Automated System. In Proceedings of the 2003 IEEE Intelligent Vehicles Symposium, Columbus, OH, USA, 9–11 June 2003.
  17. Juang, S.Y.; Juang, J.G. Real-time Indoor Surveillance Based on Smartphone and Mobile Robot. In Proceedings of the 2012 IEEE International Conference on Industrial Informatics, Beijing, China, 25–27 July 2012.
  18. HSV Color Space. Available online: http://www.topbits.com/hsv.html (accessed on 20 May 2012).
  19. Bunks. The HSV Colorspace. Available online: http://gimsavvy.com/BOOK/index.html?node50.html (accessed on 20 May 2012).
  20. Hsiao, Y.L. Navigation and Obstacle Avoidance of Wheeled Mobile Manipulators with an Eye-in-Hand Vision System. Master's Thesis, Department of Mechanical Engineering, National Cheng Kung University, Taiwan, July 2005.
Figure 1. (a) Original red/green/blue (RGB) image; (b) hue/saturation/value (HSV) space image; (c) Binary image of the HSV space.
Figure 2. Top view of camera [17].
Figure 3. Side view of camera [17].
Figure 4. Relative coordinate of target with respect to wheeled mobile robot (WMR).
Figure 5. Dynamic tracking control scheme with fuzzy controller.
Figure 6. The flowchart of dynamic tracking.
Figure 7. Membership functions of the horizontal position of a moving target in an image frame.
Figure 8. Membership functions of the vertical position of a moving target in an image frame.
Figure 9. Membership functions of the left wheel speed w.r.t. moving target position.
Figure 10. Sensors and input variables of the WMR [17].
Figure 11. Flowchart of indoor patrol [17].
Figure 12. Experiment setup.
Figure 13. Visual Studio 2008 operating interface.
Figure 14. Schematic diagram for personal computer (PC) connection.
Figure 15. (a) Camera transmits images to the PC; (b) Infrared data transmission from WMR to PC; (c) Command signal is calculated and sent to the WMR.
Figure 16. Remote control scheme.
Figure 17. Surrounding environment of the first test.
Figure 18. PC-based control operation example: lower right movement.
Figure 19. The testing environment of the second test based on smartphone control operation.
Figure 20. Smartphone control example: right-turn operation; press right arrow.
Figure 21. Sequence of smartphone operation.
Figure 22. Sequence of WMR operation.
Figure 23. Dynamic tracking operation. (a) Starting tracking operation; (b) Tracking position 1; (c) Tracking position 2.
Figure 24. Images on the PC screen that are captured by the robot’s camera. (a) Captured image at Figure 23a position; (b) Captured image at Figure 23b position; (c) Captured image at Figure 23c position.
Figure 25. Testing field of indoor patrol.
Figure 26. Wall-following patrol. (a) Initial position; (b) Wall-following patrol 1 position; (c) Patrol 2 position; (d) Patrol 3 position; (e) Patrol 4 position; (f) Patrol 5 position; (g) Patrol 6 position; (h) Ending position.
Figure 27. Captured images on PC screen. (a–h) Captured images on the PC screen at the Figure 26a–h positions, respectively.
Figure 28. Initial position in the remote surveillance testing field.
Figure 29. Remote operation. (a) Forward operation; (b) Backward operation; (c) Left-turn operation; (d) Right-turn operation.
Figure 30. Smartphone remote control site.
Figure 31. Captured images on smartphone. (a–d) Captured images at the Figure 29a–d positions, respectively.
Table 1. Fuzzy rules of the left wheel speed (output D) for the fuzzy steering controller of dynamic tracking.

X \ Y | N   | M   | F
------|-----|-----|-----
L     | LMS | LMF | LS
MI    | LMM | LM  | LFF
R     | LFM | LFS | LF
Table 2. Fuzzy rules for the indoor patrol controller of the left wheel speed (output D).

Left IR \ Right IR | RF  | RM  | RC
-------------------|-----|-----|-----
LF                 | LMS | LMF | LC
LM                 | LMM | LM  | LFF
LC                 | LFM | LFS | LF
