
NS-k-NN: Neutrosophic Set-Based k-Nearest Neighbors Classifier

1 Department of Electrical and Electronics Engineering, Technology Faculty, Firat University, Elazig 23119, Turkey
2 Department of Computer Science, University of Illinois at Springfield, Springfield, IL 62703, USA
3 Department of Mathematics and Sciences, University of New Mexico, Gallup, NM 87301, USA
* Author to whom correspondence should be addressed.
Symmetry 2017, 9(9), 179; https://doi.org/10.3390/sym9090179
Received: 2 August 2017 / Revised: 16 August 2017 / Accepted: 29 August 2017 / Published: 2 September 2017
(This article belongs to the Special Issue Neutrosophic Theories Applied in Engineering)
The k-nearest neighbors (k-NN) algorithm is a simple and efficient non-parametric supervised classifier. It determines the class label of an unknown sample from its k nearest neighbors in a training set, where the neighbors are found with a distance function. Although k-NN produces successful results, several extensions have been proposed to improve its precision. The neutrosophic set (NS) defines three memberships, T, I, and F, which denote the truth, indeterminacy, and falsity membership degrees, respectively. In this paper, the NS memberships are adopted to improve the classification performance of the k-NN classifier. A new, straightforward k-NN approach based on NS theory is proposed. It calculates the NS memberships with a supervised neutrosophic c-means (NCM) algorithm, and a final belonging membership U is obtained from the NS triples as U = T + I − F. A voting scheme similar to that of fuzzy k-NN is then used for class label determination. Extensive experiments on several toy and real-world datasets evaluate the proposed method's performance, and the method is compared with k-NN, fuzzy k-NN, and two weighted k-NN schemes. The results are encouraging, with a clear improvement over the compared methods.
Keywords: k-NN; fuzzy k-NN; neutrosophic sets; data classification
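The abstract outlines a three-step pipeline: assign neutrosophic (T, I, F) memberships to the training samples with a supervised NCM step, combine them into U = T + I − F, and classify with a fuzzy-k-NN-style weighted vote. The Python sketch below illustrates that pipeline under simplifying assumptions; the distance-to-class-mean formulas used for T, I, and F and the helper names (ns_memberships, ns_knn_predict) are illustrative stand-ins, not the authors' exact NCM computation.

```python
import numpy as np

def ns_memberships(X_train, y_train, m=2.0, eps=1e-12):
    """Assign a final belonging membership U = T + I - F to each training sample.

    Assumed simplification: T, I, F are derived from distances to class means
    rather than from the paper's supervised NCM optimization.
    """
    classes = np.unique(y_train)
    means = np.array([X_train[y_train == c].mean(axis=0) for c in classes])
    # distance of every training sample to every class mean
    d = np.linalg.norm(X_train[:, None, :] - means[None, :, :], axis=2) + eps
    inv = d ** (-2.0 / (m - 1.0))              # fuzzy-style inverse-distance weights
    W = inv / inv.sum(axis=1, keepdims=True)   # row-normalized class memberships

    T = np.empty(len(X_train))
    I = np.empty(len(X_train))
    F = np.empty(len(X_train))
    for i, c in enumerate(y_train):
        own = int(np.where(classes == c)[0][0])
        others = np.delete(np.arange(len(classes)), own)
        T[i] = W[i, own]                 # truth: membership in the sample's own class
        F[i] = W[i, others].max()        # falsity: strongest rival-class membership
        I[i] = 1.0 - abs(T[i] - F[i])    # indeterminacy: how ambiguous the sample is
    return T + I - F                     # final belonging membership U

def ns_knn_predict(X_train, y_train, U, x, k=5, m=2.0, eps=1e-12):
    """Fuzzy-k-NN-style vote, weighted by U and inverse distance to the query x."""
    d = np.linalg.norm(X_train - x, axis=1) + eps
    idx = np.argsort(d)[:k]                    # indices of the k nearest neighbors
    w = d[idx] ** (-2.0 / (m - 1.0))           # inverse-distance weights
    classes = np.unique(y_train)
    scores = [np.sum(w * U[idx] * (y_train[idx] == c)) for c in classes]
    return classes[int(np.argmax(scores))]

# Toy usage on two Gaussian blobs (hypothetical data, for illustration only)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(4, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
U = ns_memberships(X, y)
print(ns_knn_predict(X, y, U, np.array([3.5, 3.5]), k=5))  # expected: 1
```

The fuzzy-k-NN-style weighting (the exponent −2/(m − 1)) mirrors the voting scheme mentioned in the abstract; only the source of the neighbor memberships changes, from fuzzy class memberships to the neutrosophic score U.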

MDPI and ACS Style

Akbulut, Y.; Sengur, A.; Guo, Y.; Smarandache, F. NS-k-NN: Neutrosophic Set-Based k-Nearest Neighbors Classifier. Symmetry 2017, 9, 179.

