Norm-Based Binary Search Trees for Speeding Up KNN Big Data Classification
Abstract
Due to their large size and/or high dimensionality, classifying Big Data with traditional machine learning is a challenging task, particularly with the well-known K-nearest neighbors (KNN) classifier, which is slow and lazy by nature. In this paper, we propose a new approach to Big Data classification using the KNN classifier, based on inserting the training examples into a binary search tree that is later used to speed up the search for test examples. For this purpose, we use two methods to sort the training examples. The first calculates the minimum/maximum scaled norm of each example and rounds it to 0 or 1. Examples with 0-norms are sorted into the left child of a node and those with 1-norms into the right child; this process continues recursively until a leaf node holds one example, or a small number of examples with the same norm. The second proposed method inserts each example into the binary search tree based on its similarity to the examples with the minimum and maximum Euclidean norms. Experimental results on several machine learning big datasets show that both methods are much faster than most of the state-of-the-art methods compared, with competitive accuracy rates obtained by the second method, which shows great potential for further enhancement of both methods for use in practice.
Share & Cite This Article
Hassanat, A.B.A. Norm-Based Binary Search Trees for Speeding Up KNN Big Data Classification. Computers 2018, 7, 54.
Hassanat ABA. Norm-Based Binary Search Trees for Speeding Up KNN Big Data Classification. Computers. 2018; 7(4):54.

Chicago/Turabian Style
Hassanat, Ahmad B.A. 2018. "Norm-Based Binary Search Trees for Speeding Up KNN Big Data Classification." Computers 7, no. 4: 54.
Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.