1. Introduction
The classical measure theory has been widely used to represent uncertainties in data. However, these measures are valid only for precise data, and hence they may be unable to give accurate judgments for data that are uncertain and imprecise in nature. To handle this, fuzzy set (FS) theory, developed by Zadeh [1], has received much attention over recent decades because of its capability of handling uncertainties. Subsequently, Atanassov [2] proposed the concept of an intuitionistic fuzzy set (IFS), which extends FS theory with the addition of a degree of non-membership. As IFS theory has been widely used by researchers [3,4,5,6,7,8,9,10,11,12,13,14,15,16] in different disciplines for handling the uncertainties in data, its corresponding analysis is more meaningful than the crisp analysis of FSs. Nevertheless, neither FS nor IFS theory is able to deal with indeterminate and inconsistent information. For instance, consider a person giving their opinion about an object with 0.5 being the possibility that the statement is true, 0.7 the possibility that the statement is false and 0.2 the possibility that he or she is not sure. To resolve this, Smarandache [17] introduced a new component called the "indeterminacy-membership function" alongside the "truth-membership function" and "falsity-membership function", all of which are independent components, and hence the corresponding set is known as a neutrosophic set (NS), which is the generalization of the IFS and FS. However, without specification, NSs are difficult to apply to real-life problems. Thus, a particular case of the NS called a single-valued NS (SVNS) has been proposed by Smarandache [17] and Wang et al. [18].
After this pioneering work, researchers have been engaged in its extensions and applications to different disciplines. However, the most important task for the decision-maker is to rank the objects so as to obtain the desired object(s). For this, researchers have made efforts to enrich the concept of information measures in neutrosophic environments. Broumi and Smarandache [19] introduced the Hausdorff distance, while Majumdar [20] presented the Hamming and Euclidean distances for comparing SVNSs. Ye [21] presented the concept of correlation for single-valued neutrosophic numbers (SVNNs). Additionally, Ye [22] improved the concept of cosine similarity for SVNSs, which was first introduced by Kong et al. [23] in a neutrosophic environment. Nancy and Garg [24] presented an improved score function for ranking SVNNs and applied it to solve decision-making problems. Garg and Nancy [25] presented a parametric entropy measure and applied it to solve decision-making problems. Recently, Garg and Nancy [26] presented a technique for order preference by similarity to ideal solution (TOPSIS) method under an interval NS environment to solve decision-making problems. Aside from these, various authors have incorporated the idea of NS theory into similarity measures [27,28], distance measures [29,30], the cosine similarity measure [19,22,31], and aggregation operators [22,31,32,33,34,35,36,37,38,39,40].
On the basis of the above observations, distance and similarity measures are of key importance in a number of theoretical and applied statistical-inference and data-processing problems. Studies have shown that similarity, entropy and divergence measures can be induced by a normalized distance measure on the basis of their axiomatic definitions. On the other hand, SVNSs are one of the most successful theories for handling the uncertainties and certainties in a system, but little systematic research has explored these problems. This gap motivates us to develop some families of distance measures on SVNSs to solve decision-making problems in which preferences for the different alternatives are given in the form of neutrosophic numbers. The main contributions of this work are summarized as follows: (i) to highlight, through illustrative examples, the shortcomings of various existing distance measures under single-valued neutrosophic information; (ii) to overcome these shortcomings, this paper defines some new series of biparametric distance measures between SVNSs, which depend on two parameters, namely p and t, where p is the norm and t identifies the level of uncertainty. The various desirable relations between these measures have been investigated in detail. We then utilize these measures to solve pattern-recognition and medical-diagnosis problems and compare their performance with that of some existing approaches.
The rest of this paper is organized as follows. Section 2 briefly describes the concepts of NSs, SVNSs and their corresponding existing distance measures. Section 3 presents a family of normalized and weighted normalized distance measures between two SVNSs, and some of their desirable properties are investigated in detail, while generalized distance measures are proposed in Section 4. The defined measures are illustrated by an example in Section 5, in the fields of pattern recognition and medical diagnosis, to demonstrate their effectiveness and stability. Finally, a concrete conclusion is drawn in Section 6.
3. Some New Distance Measures between SVNSs
In this section, we present the Hamming and the Euclidean distances between SVNSs, which can be used in real scientific and engineering applications.
Letting the class of SVNSs over the universal set X be under consideration, we define the following distances between two SVNSs A and B, involving the uncertainty parameter t:
- (i)
Hamming distance:
- (ii)
Normalized Hamming distance:
- (iii)
Euclidean distance:
- (iv)
Normalized Euclidean distance:
where t is a parameter.
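Because the displayed formulas did not survive extraction, the following sketch shows only the classical (t-free) normalized Hamming and Euclidean distances between SVNSs as a stand-in for the biparametric forms defined above; the function names and the triple-based representation are illustrative assumptions, not the paper's notation.

```python
# Sketch of the classical normalized Hamming and Euclidean distances
# between SVNSs. The biparametric (p, t) forms defined in the text are
# not reproduced here (their formulas are not shown in this extract);
# these are the standard t-free special cases. Names are illustrative.

def normalized_hamming(A, B):
    """A, B: lists of (T, I, F) triples over the same universe X."""
    n = len(A)
    return sum(abs(ta - tb) + abs(ia - ib) + abs(fa - fb)
               for (ta, ia, fa), (tb, ib, fb) in zip(A, B)) / (3 * n)

def normalized_euclidean(A, B):
    n = len(A)
    s = sum((ta - tb) ** 2 + (ia - ib) ** 2 + (fa - fb) ** 2
            for (ta, ia, fa), (tb, ib, fb) in zip(A, B))
    return (s / (3 * n)) ** 0.5

A = [(0.5, 0.2, 0.7), (0.6, 0.1, 0.3)]
B = [(0.4, 0.3, 0.6), (0.6, 0.2, 0.3)]
print(normalized_hamming(A, B))
print(normalized_euclidean(A, B))
```

Both values lie in [0, 1], in line with properties (P1)–(P4) below.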
Then, on the basis of the distance properties as defined in Definition 4, we can obtain the following properties:
Proposition 1. The above-defined distance d(A, B) between two SVNSs A and B satisfies the following properties (P1)–(P4):
- (P1)
0 ≤ d(A, B) ≤ 1;
- (P2)
d(A, B) = 0 iff A = B;
- (P3)
d(A, B) = d(B, A);
- (P4)
If A ⊆ B ⊆ C, then d(A, C) ≥ d(A, B) and d(A, C) ≥ d(B, C).
Proof. For two SVNSs A and B, we have the following:
- (P1)
For every x_i ∈ X, the degrees T_A(x_i), I_A(x_i), F_A(x_i) and T_B(x_i), I_B(x_i), F_B(x_i) all lie in [0, 1]. Thus, |T_A(x_i) − T_B(x_i)| ≤ 1, |I_A(x_i) − I_B(x_i)| ≤ 1 and |F_A(x_i) − F_B(x_i)| ≤ 1. Hence, by the definition of d, we obtain 0 ≤ d(A, B) ≤ 1.
- (P2)
Firstly, we assume that A = B, which implies that T_A(x_i) = T_B(x_i), I_A(x_i) = I_B(x_i) and F_A(x_i) = F_B(x_i) for every x_i ∈ X. Thus, by the definition of d, we obtain d(A, B) = 0. Conversely, assuming that d(A, B) = 0 for two SVNSs A and B implies that every term of the defining sum vanishes. After solving, we obtain T_A(x_i) = T_B(x_i), I_A(x_i) = I_B(x_i) and F_A(x_i) = F_B(x_i) for every x_i ∈ X, which implies A = B. Hence d(A, B) = 0 iff A = B.
- (P3)
This is straightforward from the definition of d, which is symmetric in A and B.
- (P4)
If A ⊆ B ⊆ C, then T_A(x_i) ≤ T_B(x_i) ≤ T_C(x_i), I_A(x_i) ≥ I_B(x_i) ≥ I_C(x_i) and F_A(x_i) ≥ F_B(x_i) ≥ F_C(x_i), which implies that each componentwise difference between A and C dominates the corresponding difference between A and B. By adding these inequalities, we obtain d(A, C) ≥ d(A, B). Similarly, we obtain d(A, C) ≥ d(B, C).
☐
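As a quick numerical sanity check, properties (P1)–(P4) can be tested on concrete SVNSs. Since the paper's exact biparametric formula is not shown in this extract, the standard normalized Hamming distance is used below as a stand-in, with the containment A ⊆ B ⊆ C encoded componentwise (truth increasing, indeterminacy and falsity decreasing):

```python
# Numerical check of (P1)-(P4) using the standard normalized Hamming
# distance as a stand-in for the measure defined above (the paper's
# exact biparametric formula is not shown in this extract).

def d(A, B):
    n = len(A)
    return sum(abs(a - b) for x, y in zip(A, B) for a, b in zip(x, y)) / (3 * n)

# A ⊆ B ⊆ C: truth degrees increase while indeterminacy and falsity decrease
A = [(0.3, 0.4, 0.6)]
B = [(0.5, 0.3, 0.4)]
C = [(0.8, 0.1, 0.2)]

assert 0 <= d(A, B) <= 1                           # (P1)
assert d(A, A) == 0                                # (P2), forward direction
assert d(A, B) == d(B, A)                          # (P3)
assert d(A, C) >= d(A, B) and d(A, C) >= d(B, C)   # (P4)
```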
Proposition 2. The distance defined in Equation (11) is also a valid measure.
Proof. For two SVNSs A and B, we have the following:
- (P1)
For every x_i ∈ X, the degrees of A and B all lie in [0, 1], so each of the squared differences (T_A(x_i) − T_B(x_i))², (I_A(x_i) − I_B(x_i))² and (F_A(x_i) − F_B(x_i))² is at most 1. Hence, by the definition in Equation (11), we obtain 0 ≤ d(A, B) ≤ 1.
- (P2)
Assuming that A = B implies that T_A(x_i) = T_B(x_i), I_A(x_i) = I_B(x_i) and F_A(x_i) = F_B(x_i) for every x_i ∈ X, and hence, using Equation (11), we obtain d(A, B) = 0. Conversely, assuming that d(A, B) = 0 implies that every term of the defining sum vanishes. After solving these, we obtain T_A(x_i) = T_B(x_i), I_A(x_i) = I_B(x_i) and F_A(x_i) = F_B(x_i) for every x_i ∈ X; that is, A = B. Therefore, d(A, B) = 0 iff A = B.
- (P3)
This is straightforward from the definition, which is symmetric in A and B.
- (P4)
If A ⊆ B ⊆ C, then T_A(x_i) ≤ T_B(x_i) ≤ T_C(x_i), I_A(x_i) ≥ I_B(x_i) ≥ I_C(x_i) and F_A(x_i) ≥ F_B(x_i) ≥ F_C(x_i). Therefore, each squared componentwise difference between A and C dominates the corresponding one between A and B. Hence, by the definition in Equation (11), we obtain d(A, C) ≥ d(A, B). Similarly, we obtain d(A, C) ≥ d(B, C).
☐
Now, on the basis of these proposed distance measures, we conclude that they successfully overcome the shortcomings of the existing measures described above.
Example 3. If we apply the proposed distance measures to the data considered in Example 1 to classify the pattern C, then, for the chosen value of the parameter t, the pattern C is classified with the pattern B, and hence the proposed measures are able to identify the best pattern.
Example 4. If we utilize the proposed distances for the above-considered Example 2, then there is a significant effect of the change in the falsity membership on the measure values, and hence consequently on the ranking values.
Proposition 3. Measures and satisfy the following properties:
- (i)
;
- (ii)
.
Proof. We can easily obtain that , and thus by Proposition 1, we obtain . Similarly, we can obtain . ☐
However, in many practical situations, the elements under consideration may have different importance, and thus the weight of each element should be taken into account. In the following, we develop the normalized weighted Hamming distance and the normalized weighted Euclidean distance between SVNSs.
- (i)
The normalized weighted Hamming distance:
- (ii)
The normalized weighted Euclidean distance:
where t is a parameter.
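A hedged sketch of the weighted counterparts follows, assuming each element x_i carries a weight w_i ∈ [0, 1] with Σ w_i = 1 (a standard convention; the exact weighted formulas are not shown in this extract):

```python
# Sketch of normalized weighted Hamming and Euclidean distances between
# SVNSs, assuming a weight vector w with entries in [0, 1] summing to 1.
# The t-parameterized forms in the text are not reproduced here.

def weighted_hamming(A, B, w):
    return sum(wi * (abs(ta - tb) + abs(ia - ib) + abs(fa - fb)) / 3
               for wi, (ta, ia, fa), (tb, ib, fb) in zip(w, A, B))

def weighted_euclidean(A, B, w):
    s = sum(wi * ((ta - tb) ** 2 + (ia - ib) ** 2 + (fa - fb) ** 2) / 3
            for wi, (ta, ia, fa), (tb, ib, fb) in zip(w, A, B))
    return s ** 0.5
```

With equal weights w_i = 1/n these reduce to the unweighted normalized distances, which is consistent with the remark that the weighted measures also satisfy (P1)–(P4).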
It is straightforward to check that the normalized weighted distances between SVNSs A and B also satisfy the above properties (P1)–(P4).
Proposition 4. Distance measures and satisfy the relation .
Proof. Because , then for any two SVNSs A and B, we have ; that is, . ☐
Proposition 5. Let A and B be two SVNSs in X; then and are the distance measures.
Proof. Because and then we can easily obtain . Thus, satisfies (P1). The proofs of (P2)–(P4) are similar to those of Proposition 1. Similar is true for . ☐
Proposition 6. The distance measures and satisfy the relation .
Proof. The proof follows from Proposition 4. ☐
Proposition 7. The distance measures and satisfy the inequality .
Proof. For two SVNSs A and B, we have
which implies that
For any , we have . Therefore,
By adding these inequalities and by the definition of , we have
As A and B are arbitrary SVNSs, thus we obtain . ☐
Proposition 8. Measures and satisfy the inequality .
Proof. The proof follows from Proposition 7. ☐
The Hausdorff distance between two non-empty closed and bounded sets is a measure of the resemblance between them. For example, considering and in the Euclidean domain R, the Hausdorff distance in the additive set environment is given by the following [8]:
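The displayed formula is missing from this extract; in its commonly used interval form (an assumption here), the Hausdorff distance between closed intervals A = [a1, a2] and B = [b1, b2] of R is max(|a1 − b1|, |a2 − b2|):

```python
# Hausdorff distance between two closed intervals of R, in its usual
# closed form (assumed here, as the displayed equation is missing):
# h([a1, a2], [b1, b2]) = max(|a1 - b1|, |a2 - b2|).

def hausdorff_interval(a, b):
    (a1, a2), (b1, b2) = a, b
    return max(abs(a1 - b1), abs(a2 - b2))

print(hausdorff_interval((1.0, 3.0), (2.0, 5.0)))  # 2.0
```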
Now, for any two SVNSs A and B over , we propose the following utmost distance measures:
Utmost normalized Hamming distance:
Utmost normalized weighted Hamming distance:
Utmost normalized Euclidean distance:
Utmost normalized weighted Euclidean distance:
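The four "utmost" measures above replace the componentwise sum by the componentwise maximum. A sketch of the first of them, the utmost normalized Hamming distance, under the same representational assumptions as before:

```python
# Sketch of the utmost (Hausdorff-style) normalized Hamming distance:
# for each element, the largest of the three component gaps is taken,
# and the results are averaged over the universe. Names are illustrative.

def utmost_normalized_hamming(A, B):
    n = len(A)
    return sum(max(abs(ta - tb), abs(ia - ib), abs(fa - fb))
               for (ta, ia, fa), (tb, ib, fb) in zip(A, B)) / n
```

Since max(a, b, c) ≥ (a + b + c)/3 for non-negative a, b, c, this utmost distance always dominates the componentwise-average Hamming distance; this kind of componentwise comparison is the pattern behind the inequalities in the propositions that follow.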
Proposition 9. The distance defined in Equation (14) for two SVNSs A and B is a valid distance measure.
Proof. The above measure satisfies the following properties:
- (P1)
As A and B are SVNSs, the degrees T_A(x_i), I_A(x_i), F_A(x_i) and T_B(x_i), I_B(x_i), F_B(x_i) all lie in [0, 1]. Thus, each of |T_A(x_i) − T_B(x_i)|, |I_A(x_i) − I_B(x_i)| and |F_A(x_i) − F_B(x_i)| is at most 1, and so is their maximum. Hence, by the definition of the measure, we obtain 0 ≤ d(A, B) ≤ 1.
- (P2)
Similar to the proof of Proposition 1.
- (P3)
This is clear from Equation (14).
- (P4)
Let A ⊆ B ⊆ C, which implies T_A(x_i) ≤ T_B(x_i) ≤ T_C(x_i), I_A(x_i) ≥ I_B(x_i) ≥ I_C(x_i) and F_A(x_i) ≥ F_B(x_i) ≥ F_C(x_i). Therefore, |T_A(x_i) − T_C(x_i)| ≥ |T_A(x_i) − T_B(x_i)|, |I_A(x_i) − I_C(x_i)| ≥ |I_A(x_i) − I_B(x_i)| and |F_A(x_i) − F_C(x_i)| ≥ |F_A(x_i) − F_B(x_i)|, and hence the maximum over the three components for the pair (A, C) dominates that for the pair (A, B). Hence d(A, C) ≥ d(A, B). Similarly, we obtain d(A, C) ≥ d(B, C).
☐
Proposition 10. For , , and are the distance measures.
Proof. The proof follows from the above proposition. ☐
Proposition 11. The measures and satisfy the following inequality: .
Proof. Because
, therefore
Hence, . ☐
Proposition 12. The measures and satisfy the inequality .
Proof. The proof follows from Proposition 11. ☐
Proposition 13. The measures and satisfy the inequality .
Proof. Because for any , , the remaining proof follows from Proposition 7. ☐
Proposition 14. The measures and satisfy the inequality .
Proof. The proof follows from Proposition 13. ☐
Proposition 15. The measures and satisfy the following inequality:
Proof. For positive numbers , , we have . Thus, for any two SVNSs A and B, we have . Hence . ☐
Proposition 16. The measures and satisfy the following inequality:
Proof. The proof follows from Proposition 15. ☐
Proposition 17. The measures , and satisfy the following inequalities:
- (i)
;
- (ii)
.
Proof. Because and , by adding these inequalities, we obtain . On the other hand, by multiplying these, we obtain . ☐