Concept Paper

Indoor Autonomous Vehicle Navigation—A Feasibility Study Based on Infrared Technology

Ray-Shine Run and Zhi-Yu Xiao
Department of Electronics Engineering, National United University, 36003 Miaoli, Taiwan
* Author to whom correspondence should be addressed.
Appl. Syst. Innov. 2018, 1(1), 4; https://doi.org/10.3390/asi1010004
Submission received: 6 September 2017 / Revised: 8 January 2018 / Accepted: 8 January 2018 / Published: 10 January 2018

Abstract

The application of autonomous vehicles has grown dramatically in recent years. Not only has the rail-guided vehicle (RGV) been widely used in traditional production lines, but the automatic guided vehicle (AGV) has also seen increasing use. Positioning and path planning are two major functions of autonomous vehicles, and there are many ways to fulfill these requirements. The infrared remote control has been used heavily and successfully in home appliances for decades, which encouraged us to apply this mature and cost-effective technology to an autonomous vehicle. By decoding the coded signal from an infrared light-emitting diode (LED) mounted on the ceiling, the autonomous vehicle can be positioned to within 50 mm. In addition, by shaping the beam pattern of the infrared light from the ceiling, an invisible route can be produced on the ground. That is to say, instead of the traditional rail-guided method, these invisible paths can guide the autonomous vehicle. We have implemented a prototype of an autonomous vehicle system based on this concept, with the aim of creating a simple and reliable approach to the navigation of an indoor autonomous vehicle.

1. Introduction

Positioning is a key technology that enables the development of indoor autonomous vehicles. Various approaches, such as radio frequency identification (RFID) [1,2,3], ZigBee, and WiFi [4,5,6,7], have been validated in many research studies over the past years. Recently, a robotics company (Marvelmind Robotics, Sunnyvale, CA, USA) even launched an "Indoor GPS" with ±2 cm precision based on stationary ultrasonic beacons united by a radio interface in a license-free band [8]. Indeed, most positioning problems could be solved by any of the above methods. However, this paper verifies a cost-effective method utilizing infrared technology. Not only can the positioning of the test vehicle be accomplished by making use of the mature encoding technology of the infrared signal, but path planning has also been demonstrated by the use of an invisible route (described later). Based on infrared technology, a simple method for the navigation of an indoor autonomous vehicle is presented in this article.

2. Methods

2.1. Mechanical

Since carrying capacity is not an issue for this study, a small toy-like tricycle was adopted as the test vehicle, as shown on the left in Figure 1. The two front wheels are driven by DC motors, and a single idler wheel sits at the back. The vehicle chassis is made of a transparent acrylic board that serves as the payload space for the electronic system (shown on the right in Figure 1).

2.2. Electronic

Most of the effort was focused on the electronics. A block diagram of the electronic system of the test vehicle is shown in Figure 2; the details are described below.

2.2.1. Motion Control

The two DC motors, DC MOTOR-1 and DC MOTOR-2, are driven by H-Bridge Driver-1 and H-Bridge Driver-2, respectively, and the PWM-1 and PWM-2 signals provided by the microcontroller (MCU) complete the motion control of the test vehicle. By appropriately adjusting the duty ratios of the PWM signals, the four basic types of motion (forward, backward, left turn, and right turn) can be performed. The H-bridge driver has been applied in many research studies [9,10]; thus, its details are omitted here. In addition, the test vehicle is equipped with five ultrasonic sensors (SRF05) for obstacle detection.
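As a rough illustration of how the four basic motions map onto the two PWM duty ratios, the following minimal C sketch uses a hypothetical set_pwm() hardware call with signed duty values (negative meaning the H-bridge reverses the motor); the duty percentages and the assignment of motor 1 to the left wheel are illustrative assumptions, not values taken from the actual firmware.

/* Minimal sketch of differential-drive motion control.
 * set_pwm() is a hypothetical hardware-abstraction call; real firmware
 * would program the MCU PWM peripheral and the H-bridge direction pins. */
#include <stdio.h>

typedef enum { MOVE_FORWARD, MOVE_BACKWARD, TURN_LEFT, TURN_RIGHT, STOP } motion_t;

/* duty in percent; the sign selects the H-bridge direction (assumption) */
static void set_pwm(int motor, int duty)
{
    printf("motor %d -> duty %d%%\n", motor, duty);   /* stand-in for hardware access */
}

static void drive(motion_t m)
{
    switch (m) {
    case MOVE_FORWARD:  set_pwm(1,  60); set_pwm(2,  60); break; /* equal duty: go straight */
    case MOVE_BACKWARD: set_pwm(1, -60); set_pwm(2, -60); break;
    case TURN_LEFT:     set_pwm(1,  30); set_pwm(2,  60); break; /* slow the left wheel */
    case TURN_RIGHT:    set_pwm(1,  60); set_pwm(2,  30); break; /* slow the right wheel */
    default:            set_pwm(1,   0); set_pwm(2,   0); break;
    }
}

int main(void) { drive(MOVE_FORWARD); drive(TURN_LEFT); return 0; }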

2.2.2. Infrared Module

The infrared module consists of an infrared LED and an infrared receiver, as shown in Figure 3.
The infrared LED can be driven by a general-purpose I/O pin of the MCU to transmit invisible light (wavelength around 900 nm). To improve immunity against ambient light, the infrared signal is modulated by a 38 kHz carrier (pulse train). Furthermore, pulse distance encoding (the so-called Nippon Electric Company (NEC) infrared (IR) transmission protocol) is adopted as the communication protocol; an illustration of pulse distance encoding is shown in Figure 4.
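To make the pulse-distance idea concrete, the compilable C sketch below prints the burst/space timing of a standard NEC frame. The emit() function is a hypothetical stand-in for gating the 38 kHz carrier on the LED pin, and the 562.5 µs/1687.5 µs bit timings are the standard NEC values rather than figures quoted by this paper; the address and command values are arbitrary.

/* Sketch of NEC pulse-distance encoding: each bit is a fixed ~562.5 us
 * carrier burst followed by a space whose length carries the value
 * (~562.5 us for '0', ~1687.5 us for '1').  emit() only prints the
 * timing here; real firmware would gate the 38 kHz carrier instead. */
#include <stdio.h>
#include <stdint.h>

static void emit(int carrier_on, unsigned us)
{
    printf("%s for %u us\n", carrier_on ? "burst" : "space", us);
}

static void nec_send_byte(uint8_t b)
{
    for (int i = 0; i < 8; i++) {              /* NEC sends LSB first */
        emit(1, 562);                          /* fixed mark */
        emit(0, (b >> i) & 1 ? 1687 : 562);    /* space length encodes the bit */
    }
}

static void nec_send_frame(uint8_t addr, uint8_t cmd)
{
    emit(1, 9000); emit(0, 4500);              /* leader: 9 ms burst + 4.5 ms space */
    nec_send_byte(addr); nec_send_byte(~addr); /* address and its complement */
    nec_send_byte(cmd);  nec_send_byte(~cmd);  /* command and its complement */
    emit(1, 562);                              /* final stop burst */
}

int main(void) { nec_send_frame(0x00, 0x2A); return 0; }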

2.2.3. Positioning

An illustration of the positioning of the autonomous vehicle is shown in Figure 5. Two infrared LEDs are mounted at different locations (stations A and B) on the ceiling and continuously transmit the "encoded data" mentioned above. Thus, not only can the autonomous vehicle determine whether it is approaching a station by detecting the 38 kHz pulse train, it can also determine which station it is by decoding the encoded infrared signal. As a matter of course, the autonomous vehicle can carry out various missions (e.g., "pass through" or "stay for 10 s") when the infrared signal from the ceiling is encoded appropriately.
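The decision logic this implies can be sketched as below: carrier detection signals "approaching a station", and the decoded identifier selects the mission. The station ID values and the mission set are illustrative assumptions.

/* Sketch of the positioning decision: the 38 kHz carrier tells the
 * vehicle it is approaching a station, and the decoded identifier
 * tells it which station (and hence which mission).  The identifier
 * values and the mission names are assumptions for illustration. */
#include <stdio.h>
#include <stdint.h>
#include <stdbool.h>

typedef enum { MISSION_NONE, MISSION_PASS_THROUGH, MISSION_STAY_10S } mission_t;

static mission_t on_station_update(bool carrier_detected, bool frame_decoded, uint8_t station_id)
{
    if (!carrier_detected || !frame_decoded)
        return MISSION_NONE;                 /* not near a station, or ID not yet decoded */
    switch (station_id) {
    case 0x01: return MISSION_PASS_THROUGH;  /* e.g., station A */
    case 0x02: return MISSION_STAY_10S;      /* e.g., station B */
    default:   return MISSION_NONE;
    }
}

int main(void)
{
    printf("near station B -> mission %d\n", on_station_update(true, true, 0x02));
    return 0;
}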

2.2.4. Invisible Route

Instead of the traditional rail-guided (or line-tracing) method, this research adopts a simple and cost-effective approach to path planning: an invisible route formed by an infrared beam. Precisely speaking, an invisible light zone is produced on the ground when the infrared beam is projected from the ceiling, and the outer contour of this invisible light zone becomes the invisible route. Normally, a circular contour is produced when the infrared LED is mounted on the ceiling by itself. However, a semi-circular contour results if an appropriate piece of cardboard is placed close to the infrared LED. A top-view illustration of the two types of invisible route is shown in Figure 6.
The dotted-line range in Figure 6 is the so-called invisible light zone. Since the cardboard blocks almost half of the infrared projection, it is not hard to see that a semi-circular contour is produced by this approach. The diameter (A to B) of the semi-circular portion is the interesting part, which needs further explanation.
An illustration of the "invisible line tracing" navigation is shown in Figure 7. Consider a test vehicle equipped with two front infrared receivers passing through the invisible light zone "right on the track" (vehicle "a" in Figure 7). In this condition, one of the infrared receivers receives the 38 kHz carrier while the other receives nothing; the vehicle only needs to drive the two wheels at the same speed to keep going right on the track. Consider another case (vehicle "b" in Figure 7), in which the vehicle has gone deep into the invisible light zone and both infrared receivers receive the 38 kHz carrier. The vehicle then has to adjust the speed ratio of the two wheels (make a left turn) to get back on track.
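The two-receiver decision rule can be summarized in a few lines of C, as sketched below. Which receiver is meant to stay inside the zone, and how to react when neither receiver sees the carrier, are not stated explicitly in the paper, so both are marked as assumptions.

/* Sketch of the two-receiver "invisible line tracing" decision logic.
 * Inputs are whether each front receiver currently detects the 38 kHz
 * carrier.  Assumption: the right receiver is the one that stays inside
 * the invisible light zone; the "both dark" branch is also an assumption,
 * since the paper only discusses the other two cases. */
#include <stdio.h>
#include <stdbool.h>

typedef enum { GO_STRAIGHT, STEER_LEFT, STEER_RIGHT } steer_t;

static steer_t trace_step(bool left_sees_carrier, bool right_sees_carrier)
{
    if (!left_sees_carrier && right_sees_carrier)
        return GO_STRAIGHT;              /* vehicle "a": right on the track */
    if (left_sees_carrier && right_sees_carrier)
        return STEER_LEFT;               /* vehicle "b": too deep in the zone */
    return STEER_RIGHT;                  /* drifted out of the zone (assumed) */
}

int main(void)
{
    printf("%d %d %d\n",
           trace_step(false, true),      /* straight ahead */
           trace_step(true,  true),      /* left turn */
           trace_step(false, false));    /* right turn (assumed) */
    return 0;
}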

2.2.5. Path Planning

Basically, at least three types of path planning (left turn, go straight, right turn) are needed for an autonomous vehicle to complete a maneuver. By cascading multiple semi-circular invisible routes as described above, a simple and effective method that fulfills this basic requirement is proposed in this research. The relative position of adjacent infrared LEDs and the angle of inclination of the cardboard next to each infrared LED are the two key points of this method, and it is easy to understand from the simple geometrical illustrations shown in Figure 8.

3. Results and Discussion

3.1. Experimental Setups

Using an off-the-shelf internet protocol (IP) camera with night vision, pictures of the invisible light zone (shown in Figure 9) were captured with the infrared LEDs mounted 100 cm (La) and 75 cm (Lb) above the ground, respectively. The two types of invisible route are easy to identify from the pictures, since the sharp straight-line part of the semi-circular contour (shown on the right in Figure 9) is clearly visible. The radii of the two invisible light zones are around 60 cm (Ra) and 45 cm (Rb), respectively, and the relationship between the radius and the height of the infrared projection can be explained with the geometric illustration shown in Figure 10.
The view angle θ of the test infrared LED can be calculated by simple trigonometry:
tan(θ/2) = Ra/La = Rb/Lb
According to the experimental data mentioned above, the derived value of θ is around 62 degrees in both cases, which is close to the typical specification (50–60 degrees) of the IR remote controls of general home appliances. A few videos captured by the above IP camera demonstrate the navigation of the test vehicle; an example of semi-circular path planning is shown in the sequence of frames in Figure 11, and more vivid results can be viewed in the Supplementary Materials.
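The view-angle arithmetic can be checked directly from the reported measurements, as in the short program below; it simply evaluates θ = 2·atan(R/L) for the two cases.

/* Quick check of the view-angle estimate, theta = 2*atan(R/L), using the
 * two measurements reported above (radii and heights in centimetres). */
#include <stdio.h>
#include <math.h>

int main(void)
{
    const double R[2] = { 60.0, 45.0 };    /* measured zone radii  Ra, Rb */
    const double L[2] = { 100.0, 75.0 };   /* LED mounting heights La, Lb */
    const double pi = acos(-1.0);

    for (int i = 0; i < 2; i++) {
        double theta_deg = 2.0 * atan(R[i] / L[i]) * 180.0 / pi;
        printf("case %d: theta = %.1f degrees\n", i + 1, theta_deg);  /* ~61.9 in both cases */
    }
    return 0;
}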

3.2. The Accuracy of Positioning

Figure 12 shows an illustration of the positioning experiment. Three semi-circular invisible routes, named A, B, and C, were concatenated into a straight line for the test. The infrared signals sent to invisible zones A and B were not encoded, whereas the infrared signal sent to invisible zone C was encoded as an identification of the station. In addition, to increase the positioning accuracy, a third infrared receiver was mounted at the rear of the test vehicle (see Figure 12).
In the beginning, the test vehicle moved at normal speed (greater than 20 cm/s) through invisible routes A and B. It began to slow down when the front infrared receivers detected the encoded identification signal. The test vehicle then continued at a relatively lower speed (less than 10 cm/s) until the third receiver detected the encoded identification signal, whereupon it stopped immediately at the positioning point. Figure 13 shows pictures captured around the positioning point; the red mark "10" on the ruler was set as the preset position point.
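This three-stage stopping sequence amounts to a small state machine, sketched below. The speed figures follow the bounds quoted in the text (over 20 cm/s cruising, under 10 cm/s on approach); the state names and the exact speeds are illustrative.

/* Sketch of the stopping sequence: cruise, slow down once a front receiver
 * decodes the station identifier, and stop once the rear receiver decodes it. */
#include <stdio.h>
#include <stdbool.h>

typedef enum { CRUISE, APPROACH, STOPPED } stage_t;

static int speed_for(stage_t s)
{
    switch (s) {
    case CRUISE:   return 20;   /* "normal speed" (> 20 cm/s in the text) */
    case APPROACH: return 8;    /* "relatively slower" (< 10 cm/s) */
    default:       return 0;    /* stopped at the positioning point */
    }
}

static stage_t next_stage(stage_t s, bool front_decoded_id, bool rear_decoded_id)
{
    if (s == CRUISE && front_decoded_id)  return APPROACH;
    if (s == APPROACH && rear_decoded_id) return STOPPED;
    return s;
}

int main(void)
{
    stage_t s = CRUISE;
    s = next_stage(s, true, false);   /* front receiver sees the station ID */
    printf("stage %d, speed %d cm/s\n", s, speed_for(s));
    s = next_stage(s, true, true);    /* rear receiver sees the station ID */
    printf("stage %d, speed %d cm/s\n", s, speed_for(s));
    return 0;
}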
Table 1 shows the data of the positioning experiment. The values in the "Offset" row indicate the distance from the preset point. According to the data in Table 1, the maximum positioning deviation is around 30 mm.

3.3. Trajectory Analysis of the Path

For a semi-circular invisible zone, the light-dark contrast at the diameter margin is much stronger than that at the circular-arc margin. That is to say, the semi-circular invisible zone is suitable for straight-line guidance. Ideally, the trajectory of the autonomous vehicle should be a straight line when the semi-circular invisible route is applied. Obviously, the key question is: how wide is that line?
Figure 14 shows an illustration of the trajectory analysis of the path. "W" is the width of the car body, which equals the distance between the two front wheels; "S" is the spacing between the two front infrared receivers; and "D" stands for the longitudinal dynamic range of the trajectory. A more detailed view is shown in Figure 15.
Apparently, "D" is highly positively correlated with "S". Precisely speaking, they satisfy the inequality below:
D − W < S
In this case, W = 165 mm and S = 30 mm. Table 2 shows the data of the experiment.
Table 2 records 20 cruise trips of the test vehicle along the invisible route at a speed of 10 cm/s. For each trip, "Left" stands for the lowest level reached by the left wheel, and "Right" stands for the highest level reached by the right wheel. All of the data were measured in coordinates relative to a fixed reference point (origin). The final step was to find the maximum of the highest levels (268 mm in Table 2) and the minimum of the lowest levels (93 mm in Table 2). "D", the so-called longitudinal dynamic range of the trajectory, is the difference between these two extreme values:
D = 268 − 93 = 175 (mm)
and D − W = 175 − 165 = 10 (mm) < S (30 mm)
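For completeness, the small program below re-derives D from the Table 2 values and checks the D − W < S relation; the array literals are simply the transcribed table data, and W and S are the values quoted above.

/* Re-computation of the longitudinal dynamic range D from the Table 2
 * data: D = max(Right) - min(Left), then compare D - W with S. */
#include <stdio.h>

int main(void)
{
    const int left[20]  = { 94, 95, 96, 96, 95, 96, 95, 96, 94, 94,
                            93, 94, 94, 94, 95, 94, 94, 95, 94, 96 };
    const int right[20] = { 265, 267, 268, 267, 266, 266, 266, 265, 264, 265,
                            263, 263, 263, 264, 265, 265, 265, 266, 264, 265 };
    const int W = 165, S = 30;        /* car-body width and receiver spacing, mm */

    int lo = left[0], hi = right[0];
    for (int i = 1; i < 20; i++) {
        if (left[i]  < lo) lo = left[i];
        if (right[i] > hi) hi = right[i];
    }
    printf("D = %d mm, D - W = %d mm, S = %d mm\n", hi - lo, hi - lo - W, S);
    return 0;
}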
In principle, the result is consistent with the above discussion. However, an obvious question arises: is a smaller "S" always better? Indeed, a larger "S" leads to a relatively wider straight-line trajectory, but a counter-effect arises when "S" falls below a certain threshold. Figure 16 shows the trajectories for different settings of "S".
Not only does "D" stop decreasing, but the average velocity of the test vehicle can also be significantly reduced when "S" is smaller than the threshold. In this research, a value of approximately 2–4 cm was found by trial and error to be a suitable "S" for navigation on the invisible route. The cruising speed is another concern when setting "S"; however, we have not devoted much effort to this issue.

4. Conclusions

A prototype of an autonomous vehicle system based on mature infrared technology has been implemented, and a simple and reliable method of positioning and path planning for indoor navigation has been demonstrated on the test vehicle. However, much work remains to improve the navigation performance in practical applications. For instance, more useful invisible contours of the infrared projection could be produced by shaping the cardboard masks more ingeniously. Furthermore, a network that can effectively manage many invisible routes, which may be intricately linked together, is another key problem to be solved. We will continue to invest effort in these topics in the near future.

Supplementary Materials

The videos of this paper are available online at www.mdpi.com/2571-5577/1/1/4.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Liu, R.; Koch, A.; Zell, A. Path following with passive UHF RFID received signal strength in unknown environments. In Proceedings of the Intelligent Robots and Systems (IROS), Vilamoura, Portugal, 7–12 October 2012. [Google Scholar]
  2. Liu, R.; Yuen, C.; Do, T.N.; Tan, U.X. Fusing Similarity-based Sequence and Dead Reckoning for Indoor Positioning without Training. IEEE Sens. J. 2017, 17, 4197–4207. [Google Scholar] [CrossRef]
  3. Liu, R.; Yuen, C.; Do, T.N.; Jiao, D.; Liu, X.; Tan, U.X. Cooperative Relative Positioning of Mobile Users by Fusing IMU Inertial and UWB Ranging Information. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA 2017), Singapore, 29 May–3 June 2017; pp. 5623–5629. [Google Scholar]
  4. Ting, S.L.; Kwok, S.K.; Tsang, A.H.C.; Ho, G.T.S. The Study on Using Passive RFID Tags for Indoor Positioning. Int. J. Eng. Bus. Manag. 2011, 3, 9–15. [Google Scholar] [CrossRef]
  5. Weekly, K.; Zou, H.; Xie, L.; Jia, Q.S.; Bayen, A.M. Indoor occupant positioning system using active RFID deployment and particle filters. In Proceedings of the 2014 IEEE International Conference on Distributed Computing in Sensor Systems, Marina Del Rey, CA, USA, 26–28 May 2014. [Google Scholar]
  6. Zhao, Y.; Dong, L.; Wang, J.; Hu, B.; Fu, Y. Implementing indoor positioning system via ZigBee devices. In Proceedings of the 42nd Asilomar Conference on Signals, Systems and Computers, Pacific Grove, CA, USA, 26–29 October 2008. [Google Scholar]
  7. Yang, C.; Shao, H.R. WiFi-based indoor positioning. IEEE Commun. Mag. 2015, 53, 150–157. [Google Scholar] [CrossRef]
  8. Marvelmind Robotics. Precise (±2 cm) Indoor GPS: For Autonomous Robots, Copters and VR. 2017. Available online: https://marvelmind.com/.
  9. Run, R.-S.; Yen, J.-C.; Tsai, C.-Y. A Low Cost Implementation of GPS Guided Driverless Cars. In Proceedings of the 5th IEEE Conference on Industrial Electronics and Applications (ICIEA2010), Taichung, Taiwan, 15–17 June 2010. [Google Scholar]
  10. Run, R.-S.; Chang, Y.-C.; Cheng, F.-C. A Straightforward Approach of Automatic Parking System-“Training-Recording-Play Back”. In Proceedings of the 2012 IEEE International Symposium on Circuits & Systems (ISCAS), Seoul, Korea, 20–23 May 2012. [Google Scholar]
Figure 1. (a) Test vehicle; (b) electronic system.
Figure 2. The block diagram of the electronic system. MCU: microcontroller. PWM: pulse width modulation.
Figure 3. The infrared module: infrared LED and infrared receiver.
Figure 4. The timing illustration of pulse distance encoding.
Figure 5. The illustration of positioning for the autonomous vehicle.
Figure 6. An illustration of different invisible routes.
Figure 7. An illustration of "invisible line tracing" navigation.
Figure 8. An illustration of path planning.
Figure 9. Pictures of the invisible light zone.
Figure 10. The relationship between the radius and height for the infrared projection.
Figure 11. The continuous frame of pictures captured during a demonstration of navigation.
Figure 12. The illustration of positioning.
Figure 13. The pictures captured around the positioning point.
Figure 14. An illustration of the trajectory analysis of the path.
Figure 15. The illustration of the longitudinal dynamic range of the trajectory.
Figure 16. The trajectories for different settings of "S".
Table 1. The data of the positioning experiment.

No.            1    2    3    4    5    6    7    8    9   10
Offset (mm)   +4   +5  −15   −5  −15   −7  −15   −7   −6  −16

No.           11   12   13   14   15   16   17   18   19   20
Offset (mm)  −30   +2  −25  −14  −29   −9   −5  −13    0   +2
Table 2. The data of the longitudinal dynamic range of the trajectory.

No.           1    2    3    4    5    6    7    8    9   10
Left (mm)    94   95   96   96   95   96   95   96   94   94
Right (mm)  265  267  268  267  266  266  266  265  264  265

No.          11   12   13   14   15   16   17   18   19   20
Left (mm)    93   94   94   94   95   94   94   95   94   96
Right (mm)  263  263  263  264  265  265  265  266  264  265

