# Unmanned Aerial Vehicle Flight Point Classification Algorithm Based on Symmetric Big Data


## Abstract


## 1. Introduction

## 2. Related Work

## 3. Overview of Flying Points

#### 3.1. UAV Point and Angle

Elements $x_{i,j}$, $y_{i,j}$, and $z_{i,j}$ are located on the x-, y-, and z-axes and correspond to the GPS longitude, GPS latitude, and absolute altitude, respectively.
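The point and angle definitions above can be sketched as small data structures. A minimal Python sketch, in which the class and field names are assumptions (the paper only defines the symbols $P_{i,j}$ and $R_{i,j}$):

```python
from dataclasses import dataclass

@dataclass
class UAVPoint:
    """UAV point P[i][j]: position on the x-, y-, and z-axes."""
    x: float  # GPS longitude
    y: float  # GPS latitude
    z: float  # absolute altitude

@dataclass
class UAVAngle:
    """UAV angle R[i][j]: attitude about the x-, y-, and z-axes."""
    varphi: float  # rotation about the x-axis
    theta: float   # rotation about the y-axis
    phi: float     # rotation about the z-axis
```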

#### 3.2. Expression of the UAV Flight Points

The qth imaging point $W_{i,q}$ consists of UAV point ${P}_{i,q}$ and angle ${R}_{i,q}$, where $q$ is the measuring sequence of the point ${P}_{i,j}$ and the angle ${R}_{i,j}$. Imaging point set **W**, which is a collection of imaging points, contains all set imaging points. Non-imaging point set **U**, which is the collection of non-imaging points, includes takeoff point $B$ and goal point $L$ ($B \in \mathbf{U}$, $L \in \mathbf{U}$). The flight points are therefore divided into imaging point set **W** and non-imaging point set **U**, as shown in Figure 1. Among the imaging points, points and angles are set to obtain images of the surveillance points as follows: the two imaging points that obtain the images of one surveillance point are set during the first flight and the second flight.

#### 3.3. Normalization of Imaging and Non-Imaging Points

Regarding non-imaging point set **U**, the set of all x-axis elements is defined as ${\mathit{P}}_{x}$, the set of all y-axis elements as ${\mathit{P}}_{y}$, and the set of all z-axis elements as ${\mathit{P}}_{z}$. Regarding imaging point set **W**, all angles located on the x-axis are defined as the set ${\mathit{R}}_{\varphi}$, all angles located on the y-axis as the set ${\mathit{R}}_{\theta}$, and all angles located on the z-axis as the set ${\mathit{R}}_{\phi}$.

**W′** is the collection of normalized imaging points. The qth normalized imaging point $W{\prime}_{i,q}$ is defined as shown in Equation (5), and the normalization of each element of point $P{\prime}_{i,q}$ and angle $R{\prime}_{i,q}$ is shown in Equation (6). In this normalization, an element greater than the maximum value is set to 1, and an element less than the minimum value is set to 0.

**U′** is the collection of normalized non-imaging points. The jth normalized non-imaging point $U{\prime}_{i,j}$ is defined, as shown in Equation (7), using normalized point $P{\prime}_{i,j}$; the normalization of each element of point $P{\prime}_{i,j}$ is shown in Equation (8).
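The clamped min–max normalization described above can be sketched as follows; `normalize` is a hypothetical helper name, and the clamping to [0, 1] follows the rule stated in the text:

```python
def normalize(value: float, vmin: float, vmax: float) -> float:
    """Min-max normalize one element; clamp results outside [0, 1]."""
    if vmax == vmin:  # degenerate range: avoid division by zero
        return 0.0
    v = (value - vmin) / (vmax - vmin)
    return min(1.0, max(0.0, v))  # >max maps to 1, <min maps to 0
```

Each element of a point or angle is normalized independently against the minimum and maximum of its own set (e.g., ${\mathit{P}}_{x}$).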

#### 3.4. Weight Establishment

#### 3.5. Setting Error Range

## 4. K-Means-Based Classification

#### 4.1. Definition of Imaging Point and Non-Imaging Point Clusters

Normalized non-imaging point set **U′** is classified and added to normalized non-imaging point cluster set **C′**_{m}, a collection of normalized non-imaging point clusters defined as shown in Equation (10). The oth normalized non-imaging point cluster, which categorizes a collection of normalized non-imaging points, is denoted C′_{m,o}. Cluster C′_{m,o} includes a part of normalized non-imaging point set **U′**, and its centroid is denoted ${{\mu}^{\prime}}_{m,o}$ and defined as $\left[x{\prime}_{m,o},y{\prime}_{m,o},z{\prime}_{m,o}\right]$. The final classification of **U′** is given by the normalized non-imaging point cluster set **C′**_{w} and the normalized non-imaging point cluster centroid set $\mathit{M}{\prime}_{w}$.

Normalized imaging point set **W′**, the collection of normalized imaging points, is classified in the same way as normalized non-imaging point cluster set **C′**_{m}. Normalized imaging point cluster set **C′**_{n}, the collection of normalized imaging point clusters during the nth run, is defined as shown in Equation (12). Set **C′**_{n} includes normalized imaging point cluster C′_{n,g}, the gth normalized imaging point cluster, which includes some elements of normalized imaging point set **W′**. The normalized imaging point cluster centroid $\mu {\prime}_{n,g}$ is defined by Equation (14), using normalized point $P{\prime}_{n,g}$ and angle $R{\prime}_{n,g}$. The final classification results of **W′** are the normalized imaging point cluster set **C′**_{v} and the normalized imaging point cluster centroid set $\mathit{M}{\prime}_{v}$.
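A cluster centroid such as $\mu {\prime}_{n,g}$ is, under the usual K-means definition, the per-element mean of the cluster's members. A minimal sketch under that assumption (the exact form of Equation (14) is not reproduced in this excerpt, and the function name is hypothetical):

```python
def centroid(members: list[list[float]]) -> list[float]:
    """Per-element mean of the normalized members of one cluster.

    Each member is a flat list of normalized elements, e.g.
    [x', y', z'] for a non-imaging point.
    """
    n = len(members)
    dims = len(members[0])
    return [sum(m[d] for m in members) / n for d in range(dims)]
```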

#### 4.2. Normalized Non-Imaging Point Categorization

First, normalized non-imaging point cluster set **C′**_{m} is defined and the normalized non-imaging point cluster centroids are set; K is chosen by considering the total flight distance during the categorization of the non-imaging points. Second, non-imaging points are classified using the normalized non-imaging point cluster centroids. Third, the centroids are adjusted according to the non-imaging points in the re-allocated clusters. Fourth, the second and third steps are repeated until the normalized non-imaging point cluster centroids no longer change. Normalized non-imaging point cluster set **C′**_{w} and normalized non-imaging point cluster centroid set $M{\prime}_{w}$ are created based on normalized non-imaging point set **U′**.

The elements of **U′** are assigned to clusters based on the established centroids. To categorize normalized non-imaging point $U{\prime}_{i,j}$ as a cluster member, its similarity is calculated as shown in Equation (15): function $u\left(P{\prime}_{i,j},P{\prime}_{m,o}\right)$ returns the sum of the weighted Euclidean distances of normalized non-imaging point $U{\prime}_{i,j}$. Using this function, $U{\prime}_{i,j}$ is added to the normalized non-imaging point cluster with the lowest value, as shown in Equation (16).
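The weighted-distance assignment of a normalized non-imaging point to its nearest cluster can be sketched as follows. The exact form of Equation (15) is not reproduced in this excerpt, so treat this as an illustration under the assumption that the similarity is a weighted Euclidean distance; `u_dist` and `assign` are hypothetical names:

```python
import math

def u_dist(p: list, c: list, w: list) -> float:
    """Weighted Euclidean distance between normalized point p and centroid c.

    w holds the per-element weights (e.g. w_x, w_y, w_z from Section 3.4).
    """
    return math.sqrt(sum(wi * (pi - ci) ** 2 for pi, ci, wi in zip(p, c, w)))

def assign(p: list, centroids: list, w: list) -> int:
    """Index of the cluster whose centroid is nearest to p (Equation (16) style argmin)."""
    return min(range(len(centroids)), key=lambda o: u_dist(p, centroids[o], w))
```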

#### 4.3. Non-Imaging Point Denormalization
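Denormalization maps normalized cluster results back to the original coordinate ranges. As a minimal sketch, assuming the standard inverse of the min–max normalization in Section 3.3 (the section's own equations are not reproduced in this excerpt, and the function name is hypothetical):

```python
def denormalize(v: float, vmin: float, vmax: float) -> float:
    """Map a normalized element in [0, 1] back to its original range."""
    return vmin + v * (vmax - vmin)
```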

#### 4.4. Normalized Imaging Point Categorization

First, normalized imaging point cluster set **C′**_{n} is determined, and the normalized imaging point cluster centroids are set. Second, the set centroids are used to divide the normalized imaging points. Third, the centroids are readjusted from the normalized imaging points belonging to the reassigned clusters. Fourth, Steps 2 and 3 are repeated until there is no variation in the normalized imaging point cluster centroids. If the g₁th and g₂th imaging point cluster centroids lie within the error range with the lowest cost, they are combined into the gth normalized imaging point cluster centroid together with their normalized imaging point cluster members; after creating the new clusters, the method returns to Step 3. If neither condition is met, it proceeds to Step 2.

The elements of **W′** are then clustered. To cluster a normalized imaging point, its similarity is calculated as shown in Equation (25): function $f\left(P{\prime}_{i,q},R{\prime}_{i,q},P{\prime}_{n,g},R{\prime}_{n,g}\right)$ returns the sum of the weighted Euclidean distances of the normalized state. Each normalized imaging point is added to the cluster with the lowest such value, as shown in Equation (26).
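For imaging points, the similarity extends over the six normalized elements (point and angle), each with its own weight. A sketch under the assumption that Equation (25) is a sum of weighted per-element Euclidean terms (the exact equation is not reproduced here, and `f_dist` is a hypothetical name):

```python
import math

def f_dist(p: list, r: list, cp: list, cr: list, wp: list, wr: list) -> float:
    """Weighted distance of a normalized imaging point (point p, angle r)
    from a cluster centroid split into point part cp and angle part cr.

    wp and wr hold the point and angle weights from Section 3.4.
    """
    point_term = sum(w * (a - b) ** 2 for a, b, w in zip(p, cp, wp))
    angle_term = sum(w * (a - b) ** 2 for a, b, w in zip(r, cr, wr))
    return math.sqrt(point_term + angle_term)
```

Setting an angle weight to zero (as Table of weights in Section 5.3 does for ${\omega}_{\varphi}$ and ${\omega}_{\theta}$) removes that element from the similarity entirely.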

#### 4.5. Normalized Imaging Point Denormalization

## 5. Experiments

#### 5.1. Environment for Flight Path Collection

#### 5.2. Collected Flight Path

#### 5.3. Definitions of Constant Values


#### 5.4. Categorization Results for the Proposed Method

## 6. Conclusions

## Acknowledgments

## Author Contributions

## Conflicts of Interest


**Figure 3.** Example recorded images of the surveillance points: (**a**) Club Room; (**b**) Railings; (**c**) Soccer Field; and (**d**) Soccer Goal Post.

**Figure 4.** Five flight paths collected by the pilot: (**a**) first collected flight path; (**b**) second collected flight path; (**c**) third collected flight path; (**d**) fourth collected flight path; and (**e**) fifth collected flight path.

**Point**

| Elements | Minimum of ${\mathit{P}}_{x}$ | Maximum of ${\mathit{P}}_{x}$ | Minimum of ${\mathit{P}}_{y}$ | Maximum of ${\mathit{P}}_{y}$ | Minimum of ${\mathit{P}}_{z}$ | Maximum of ${\mathit{P}}_{z}$ |
|---|---|---|---|---|---|---|
| Function | $MIN\left({\mathit{P}}_{x}\right)$ | $MAX\left({\mathit{P}}_{x}\right)$ | $MIN\left({\mathit{P}}_{y}\right)$ | $MAX\left({\mathit{P}}_{y}\right)$ | $MIN\left({\mathit{P}}_{z}\right)$ | $MAX\left({\mathit{P}}_{z}\right)$ |

**Angle**

| Elements | Minimum of ${\mathit{R}}_{\varphi}$ | Maximum of ${\mathit{R}}_{\varphi}$ | Minimum of ${\mathit{R}}_{\theta}$ | Maximum of ${\mathit{R}}_{\theta}$ | Minimum of ${\mathit{R}}_{\phi}$ | Maximum of ${\mathit{R}}_{\phi}$ |
|---|---|---|---|---|---|---|
| Function | $MIN\left({\mathit{R}}_{\varphi}\right)$ | $MAX\left({\mathit{R}}_{\varphi}\right)$ | $MIN\left({\mathit{R}}_{\theta}\right)$ | $MAX\left({\mathit{R}}_{\theta}\right)$ | $MIN\left({\mathit{R}}_{\phi}\right)$ | $MAX\left({\mathit{R}}_{\phi}\right)$ |

**Point**

| Element | x | y | z |
|---|---|---|---|
| Weight | ${\omega}_{x}$ | ${\omega}_{y}$ | ${\omega}_{z}$ |

**Angle**

| Element | $\varphi$ | $\theta$ | $\phi$ |
|---|---|---|---|
| Weight | ${\omega}_{\varphi}$ | ${\omega}_{\theta}$ | ${\omega}_{\phi}$ |

**Point**

| Element | x | y | z |
|---|---|---|---|
| Error range | ${\delta}_{x}$ | ${\delta}_{y}$ | ${\delta}_{z}$ |

**Angle**

| Element | $\varphi$ | $\theta$ | $\phi$ |
|---|---|---|---|
| Error range | ${\delta}_{\varphi}$ | ${\delta}_{\theta}$ | ${\delta}_{\phi}$ |

| Point | No. | Latitude | Longitude | Absolute Altitude |
|---|---|---|---|---|
| Take-Off Point | | 35.853235 | 128.489487 | 0 |
| Goal Point | | 35.853235 | 128.487913 | 0 |
| Surveillance Points | 1 | 35.853361 | 128.489358 | 1 |
| | 2 | 35.853361 | 128.489358 | 3 |
| | 3 | 35.853382 | 128.489028 | 1 |
| | 4 | 35.853382 | 128.489028 | 3 |
| | 5 | 35.853392 | 128.488721 | 1 |
| | 6 | 35.853392 | 128.488721 | 3 |
| | 7 | 35.853379 | 128.488425 | 1 |
| | 8 | 35.853379 | 128.488425 | 3 |
| | 9 | 35.853363 | 128.488111 | 1 |
| | 10 | 35.853363 | 128.488111 | 3 |
| | 11 | 35.852967 | 128.489392 | 1 |
| | 12 | 35.852872 | 128.489260 | 1 |
| | 13 | 35.852887 | 128.488144 | 1 |
| | 14 | 35.853001 | 128.488031 | 1 |

**Point**

| Function | $MIN\left({\mathit{P}}_{x}\right)$ | $MAX\left({\mathit{P}}_{x}\right)$ | $MIN\left({\mathit{P}}_{y}\right)$ | $MAX\left({\mathit{P}}_{y}\right)$ | $MIN\left({\mathit{P}}_{z}\right)$ | $MAX\left({\mathit{P}}_{z}\right)$ |
|---|---|---|---|---|---|---|
| Value | 35.853043 | 35.853281 | 128.487862 | 128.489505 | 0 | 4.185 |

**Angle**

| Function | $MIN\left({\mathit{R}}_{\varphi}\right)$ | $MAX\left({\mathit{R}}_{\varphi}\right)$ | $MIN\left({\mathit{R}}_{\theta}\right)$ | $MAX\left({\mathit{R}}_{\theta}\right)$ | $MIN\left({\mathit{R}}_{\phi}\right)$ | $MAX\left({\mathit{R}}_{\phi}\right)$ |
|---|---|---|---|---|---|---|
| Value | −0.128718 | 0.069656 | −0.060353 | 0.138317 | −3.074328 | 3.134437 |

| Weight | ${\omega}_{x}$ | ${\omega}_{y}$ | ${\omega}_{z}$ | ${\omega}_{\varphi}$ | ${\omega}_{\theta}$ | ${\omega}_{\phi}$ |
|---|---|---|---|---|---|---|
| Value | 17 | 1.5 | 2 | 0 | 0 | 2 |

| Error range | ${\delta}_{x}$ | ${\delta}_{y}$ | ${\delta}_{z}$ | ${\delta}_{\varphi}$ | ${\delta}_{\theta}$ | ${\delta}_{\phi}$ |
|---|---|---|---|---|---|---|
| Value | 35.853107 | 128.487940 | 0.5 | 0.104720 | 0.104720 | 0.261799 |

| Normalized error range | $\delta {\prime}_{x}$ | $\delta {\prime}_{y}$ | $\delta {\prime}_{z}$ | $\delta {\prime}_{\varphi}$ | $\delta {\prime}_{\theta}$ | $\delta {\prime}_{\phi}$ |
|---|---|---|---|---|---|---|
| Value | 0.267647 | 0.047703 | 0.119474 | 0.527892 | 0.527103 | 0.042166 |

© 2016 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC-BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Kwak, J.; Park, J.H.; Sung, Y.
Unmanned Aerial Vehicle Flight Point Classification Algorithm Based on Symmetric Big Data. *Symmetry* **2017**, *9*, 1.
https://doi.org/10.3390/sym9010001
