The k-nearest neighbors (k-NN) algorithm, known as a simple and efficient approach, is a non-parametric supervised classifier. It determines the class label of an unknown sample from its k nearest neighbors in a training set, which are found according to a distance function. Although k-NN produces successful results, several extensions have been proposed to improve its precision. The neutrosophic set (NS) defines three memberships, namely T, I, and F, which denote the truth, indeterminacy, and falsity membership degrees, respectively. In this paper, the NS memberships are adopted to improve the classification performance of the k-NN classifier. A new, straightforward k-NN approach is proposed based on NS theory. It calculates the NS memberships using a supervised neutrosophic c-means (NCM) algorithm. A final belonging membership U is then calculated from the NS (T, I, F) triples. A similar final voting scheme to that of fuzzy
k-NN is considered for class label determination. Extensive experiments are conducted on several toy and real-world datasets to evaluate the proposed method's performance. We further compare the proposed method with k-NN, fuzzy k-NN, and two weighted k-NN schemes. The results are encouraging and show a clear improvement.
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.