Entropy 2018, 20(6), 403; doi:10.3390/e20060403

Article
Vague Entropy Measure for Complex Vague Soft Sets
1 Department of Actuarial Science and Applied Statistics, Faculty of Business & Information Science, UCSI University, Jalan Menara Gading, Cheras 56000, Kuala Lumpur, Malaysia
2 School of Mathematics, Thapar Institute of Engineering & Technology (Deemed University), Patiala 147004, Punjab, India
3 A-Level Academy, UCSI College KL Campus, Lot 12734, Jalan Choo Lip Kung, Taman Taynton View, Cheras 56000, Kuala Lumpur, Malaysia
* Author to whom correspondence should be addressed.
Received: 12 March 2018 / Accepted: 17 May 2018 / Published: 24 May 2018

Abstract: The complex vague soft set (CVSS) model is a hybrid of complex fuzzy sets and soft sets that has the ability to accurately represent and model two-dimensional information for real-life phenomena that are periodic in nature. In the existing studies of fuzzy sets and their extensions, the uncertainties present in the data are handled with the help of a membership degree that takes values in a subset of the real numbers. In the present work, this condition is relaxed by allowing degrees whose range is the unit disc of the complex plane, which handles the information in a better way. Under this environment, we develop some entropy measures of the CVSS model induced by the axiomatic definition of the distance measure. Some desirable relations between them are also investigated. A numerical example related to the detection of an image by a robot is given to illustrate the proposed entropy measures.
Keywords:
vague entropy; distance induced vague entropy; distance; complex fuzzy set; complex vague soft set

1. Introduction

Classical information measures deal with information that is precise in nature, and information theory is one of the trusted ways to measure the degree of uncertainty in data. In our day-to-day life, uncertainty plays a dominant role in any decision-making process. In other words, as systems grow more complex by the day, decision makers may have to give their judgments in an imprecise, vague and uncertain environment. To deal with such information, Zadeh [1] introduced the theory of fuzzy sets (FSs) for handling the uncertainties in the data by defining a membership function with values between 0 and 1. In this environment, Deluca and Termini [2] proposed a set of axioms for fuzzy entropy. Liu [3] and Fan and Xie [4] both studied information measures related to entropy, distance, and similarity for fuzzy sets. With growing complexities, researchers have developed extensions such as the intuitionistic fuzzy set (IFS) [5], vague set (VS) [6] and interval-valued IFS [7] to deal with the uncertainties. Under these extensions, Szmidt and Kacprzyk [8] extended the axioms of Deluca and Termini [2] to the IFS environment. Later on, corresponding to Deluca and Termini’s [2] fuzzy entropy measure, Vlachos and Sergiadis [9] extended their measure to the IFS environment. Burillo and Bustince [10] introduced the entropy of intuitionistic fuzzy sets (IFSs) as a tool to measure the degree of intuitionism associated with an IFS. Garg et al. [11] presented a generalized intuitionistic fuzzy entropy measure of order α and degree β to solve decision-making problems. In addition to the mentioned examples, other authors have also addressed the problem of decision-making by using different information measures [12,13,14,15,16,17,18,19,20,21,22,23,24,25].
All of the above work has been successfully applied to various disciplines without considering the parameterization factor during the analysis. Therefore, in certain cases, these existing theories may be unable to classify the object. To cope with such situations, many researchers have paid increasing attention to soft set (SS) theory [26]. Since its inception, researchers have been engaged in extending it. For instance, Maji et al. [27,28] combined the theory of SSs with FSs and IFSs and came up with the new concepts of the fuzzy soft set (FSS) and the intuitionistic fuzzy soft set (IFSS). Further, hybridizations of SSs with other models, such as the generalized fuzzy soft set [29,30], generalized intuitionistic fuzzy soft set [31,32], distance measures [33,34,35,36], and fuzzy number intuitionistic fuzzy soft sets [37], play a dominant role in the decision-making process. The IFSS, in particular, handles the uncertainties in the data by incorporating the ideas of the experts as well as the parametric factors. In that environment, Arora and Garg [38,39] presented some aggregation operators for intuitionistic fuzzy soft numbers. Garg and Arora [40] presented a non-linear methodology for solving decision-making problems in an IFSS environment. In terms of information measures, Garg and Arora [34] developed various distance and similarity measures for dual hesitant FSSs. Recently, Garg and Arora [41] presented Bonferroni mean aggregation operators for an IFSS environment. Apart from these, the vague soft set [42] is an alternative theory which is the hybridization of the vague set [6] and the soft set [26]. In this field, Chen [43] developed some similarity measures for vague sets. Wang and Qu [44] developed some entropy, similarity and distance measures for vague sets. Selvachandran et al. [45] introduced distance-induced entropy measures for generalized intuitionistic fuzzy soft sets.
The above theories using FSs, IFSs, IFSSs, VSs and FSSs are widely employed by researchers, but they are able to handle only the uncertainty in the data. None of these models is able to handle the fluctuations of the data at a given phase of time during their execution; yet in today’s life, the uncertainty and vagueness of the data change periodically with the passage of time, and hence the existing theories are unable to capture this information. To overcome this deficiency, Ramot et al. [46] presented the complex fuzzy set (CFS), in which the range of the membership function is extended from the real numbers to the unit disc of the complex plane. Ramot et al. [47] generalized traditional fuzzy logic to complex fuzzy logic, in which the sets used in the reasoning process are complex fuzzy sets, characterized by complex-valued membership functions. Later on, Greenfield et al. [48] extended the concept of the CFS by taking the grade of the membership function as an interval number rather than a single number. Yazdanbakhsh and Dick [49] conducted a systematic review of CFSs and complex fuzzy logic and discussed their applications. Alkouri and Salleh [50] extended the concept of the CFS to complex intuitionistic fuzzy sets (CIFSs) by adding a complex-valued non-membership degree and studied their basic operations. Alkouri and Salleh [51] introduced the concepts of CIF relations, compositions and projections, and proposed a distance measure between two CIFSs. Rani and Garg [52] presented a series of distance measures for the CIFS environment. Kumar and Bajaj [53] proposed some distance and entropy measures for CIF soft sets. In these theories, two-dimensional information (amplitude and phase terms) is represented in a single set. The function of the phase term is to model the periodicity and/or seasonality of the elements.
For instance, when dealing with an economics-related situation, the phase term represents the time taken for the change in an economic variable to impact the economy. On the other hand, in robotics, the phase term can represent direction, whereas in image processing, the phase term can represent the non-physical attributes of the image.
As an alternative to these theories, the concept of the complex vague soft set (CVSS) [54] handles two-dimensional information by combining the properties of CFSs [46], soft sets [26] and vague sets [6]. CVSSs differ from the existing sets in the following features: (1) an interval-based membership structure that provides users with a means of recording their hesitancy in the process of assigning membership values to the elements; (2) the ability to handle the partial ignorance of the data; (3) adequate parameterization abilities that allow for a more comprehensive representation of the parameters. Selvachandran et al. [54,55] investigated complex vague soft sets (CVSSs). Selvachandran et al. [56] presented similarity measures for CVSSs and their applications to pattern recognition problems.
Thus, motivated by the concept of the CVSS, the focus of this work is to explore the structural characteristics of CVSSs and to present some information measures for handling the uncertainties in the data. To the best of our knowledge, the information measures in the aforementioned studies cannot be utilized to handle CVSS information. To achieve this, we develop axiomatic definitions of the distance and entropy measures between CVSSs and hence propose some new entropy measures. Some of the algebraic properties of these measures and the relations between them are also verified. The proposed measures have the following characteristics: (1) they serve as a complement to the CVSS model and its relations in representing and modeling time-periodic phenomena; (2) they have elegant properties that increase their reach and applicability; (3) they have important applications in many real-world problems in the areas of image detection, pattern recognition and image processing; (4) they add to the existing collection of methodologies and techniques in artificial intelligence and soft computing, where it is often necessary to determine the degree of vagueness of the data in order to make optimal decisions. This supports the increasingly widespread trend of using mathematical tools to complement scientific theories and existing procedures in the handling and solving of real-life problems that involve vague, unreliable and uncertain two-dimensional information. Furthermore, an effort has been made to solve the classification problem in multi-dimensional complex data sets. To elaborate the proposed method, we focus on the representation and recognition of digital images defined by multi-dimensional complex data sets, using the properties of CVSSs and a new distance and entropy measure for this model.
The rest of the manuscript is organized as follows: in Section 2, we briefly review the basic concepts of SSs and CVSSs. In Section 3, we give the axiomatic definitions of the distance and entropy measures for CVSSs. In Section 4, some basic relationships between the distance and entropy measures are derived. In Section 5, the utility of the CVSS model and its entropy measures is illustrated by applying them to the classification of digital images with multi-dimensional data. Finally, conclusions and future work are stated in Section 6.

2. Preliminaries

In this section, we briefly review some basic concepts related to VSs, SSs and CVSSs defined over the universal set U.
Definition 1 [6].
A vague set (VS) V in U is characterized by truth and falsity membership functions t_V, f_V : U → [0, 1] with t_V(x) + f_V(x) ≤ 1 for any x ∈ U. The values t_V(x) and f_V(x) are real numbers in [0, 1]. The grade of membership for x lies in the interval [t_V(x), 1 − f_V(x)], and the uncertainty of x is defined as (1 − f_V(x)) − t_V(x).
It is clearly seen from this definition that VSs are a generalization of fuzzy sets. If we set 1 − f_V(x) = t_V(x), the VS reduces to an FS, whereas if we set f_V(x) = ν_V(x) (called the non-membership degree), the VS reduces to an IFS. On the other hand, if we set t_V(x) = μ_V^L(x) and 1 − f_V(x) = μ_V^U(x), the VS reduces to an interval-valued FS. Thus, we conclude that VSs are a generalization of FSs, IFSs and interval-valued FSs.
Definition 2 [6].
Let A = {<x, [t_A(x), 1 − f_A(x)]> : x ∈ U} and B = {<x, [t_B(x), 1 − f_B(x)]> : x ∈ U} be two VSs defined on U. The basic operational laws between them are defined as follows:
(i) 
Containment: A ⊆ B if t_A(x) ≤ t_B(x) and 1 − f_A(x) ≤ 1 − f_B(x) for all x ∈ U.
(ii) 
Complement: A^c = {<x, [f_A(x), 1 − t_A(x)]> : x ∈ U}.
(iii) 
Union: A ∪ B = {<x, [max(t_A(x), t_B(x)), max(1 − f_A(x), 1 − f_B(x))]> : x ∈ U}.
(iv) 
Intersection: A ∩ B = {<x, [min(t_A(x), t_B(x)), min(1 − f_A(x), 1 − f_B(x))]> : x ∈ U}.
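As a concrete illustration, the operations above can be sketched in a few lines of Python, with each vague set stored as a dict mapping an element to its pair (t, f); the representation and function names are our own, not the paper's:

```python
# A vague set maps each element x to a pair (t, f) with t(x) + f(x) <= 1;
# the membership grade of x then lies in the interval [t(x), 1 - f(x)].

def vs_complement(A):
    # A^c = {<x, [f(x), 1 - t(x)]>}, i.e., swap t and f
    return {x: (f, t) for x, (t, f) in A.items()}

def vs_union(A, B):
    # max on t(x) and on 1 - f(x); the latter is a min on f(x)
    return {x: (max(A[x][0], B[x][0]), min(A[x][1], B[x][1])) for x in A}

def vs_intersection(A, B):
    # min on t(x) and on 1 - f(x); the latter is a max on f(x)
    return {x: (min(A[x][0], B[x][0]), max(A[x][1], B[x][1])) for x in A}

def vs_subset(A, B):
    # A ⊆ B iff t_A(x) <= t_B(x) and 1 - f_A(x) <= 1 - f_B(x) for all x
    return all(A[x][0] <= B[x][0] and 1 - A[x][1] <= 1 - B[x][1] for x in A)

A = {"x1": (0.3, 0.4), "x2": (0.5, 0.2)}
B = {"x1": (0.5, 0.3), "x2": (0.6, 0.1)}
```

Since A ⊆ B in this toy example, the union recovers B and the intersection recovers A, as the definition predicts.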
Definition 3 [26].
Let P(U) denote the power set of U. A pair (F, A) is called a soft set (SS) over U, where F is a mapping given by F : A → P(U).
Definition 4 [42].
Let V(U) be the set of all vague sets over U. A pair (F̂, A) is called a vague soft set (VSS) over U, where F̂ is a mapping given by F̂ : A → V(U). Mathematically, a VSS can be defined as follows:
(F̂, A) = {<x, [t_{F̂(e)}(x), 1 − f_{F̂(e)}(x)]> : x ∈ U, e ∈ A}
It is clearly seen that this set is the hybridization of the SSs and VSs.
Definition 5 [57].
A complex vague set (CVS) is defined as an ordered pair
A = {<x, [t_A(x), 1 − f_A(x)]> : x ∈ U}
where t_A : U → {a : a ∈ C, |a| ≤ 1} and f_A : U → {a : a ∈ C, |a| ≤ 1} are the truth and falsity membership functions taking values in the unit disc, defined as t_A(x) = r_{t_A}(x)·e^{i w^r_{t_A}(x)} and 1 − f_A(x) = (1 − k_{f_A}(x))·e^{i(2π − w^k_{f_A}(x))}, where i = √(−1).
Definition 6 [54].
Let P(U) denote the complex vague power set of U and let E be the set of parameters. For any A ⊆ E, a pair (F, A) is called a complex vague soft set (CVSS) over U, where F : A → P(U) is defined, for each a ∈ A, as:
F(a) = {(x_j, [r_{t_{F_a}}(x_j), 1 − k_{f_{F_a}}(x_j)]·e^{i[w^r_{t_{F_a}}(x_j), 2π − w^k_{f_{F_a}}(x_j)]}) : x_j ∈ U}
where j = 1, 2, 3, … indexes the elements of U, the amplitude terms r_{t_{F_a}}(x) and 1 − k_{f_{F_a}}(x) are real-valued in [0, 1] with 0 ≤ r_{t_{F_a}}(x) + k_{f_{F_a}}(x) ≤ 1, the phase terms w^r_{t_{F_a}}(x) and 2π − w^k_{f_{F_a}}(x) are real-valued in the interval (0, 2π], and i = √(−1).
The major advantages of the CVSS are that it represents two-dimensional information in a single set and each object is characterized in terms of its magnitude as well as its phase term. Further, the soft set component in CVSS provides an adequate parameterization tool to represent the information.
Definition 7 [54].
Let (F, A) and (G, B) be two CVSSs over U. The basic operations between them are defined as follows:
(i) 
(F, A) ⊆ (G, B) if and only if the following conditions are satisfied for all x ∈ U:
(a) 
r_{t_{F_a}}(x) ≤ r_{t_{G_b}}(x) and k_{f_{G_b}}(x) ≤ k_{f_{F_a}}(x);
(b) 
w^r_{t_{F_a}}(x) ≤ w^r_{t_{G_b}}(x) and w^k_{f_{G_b}}(x) ≤ w^k_{f_{F_a}}(x).
(ii) 
Null CVSS: (F, A) = ϕ if r_{t_{F_a}}(x) = 0, k_{f_{F_a}}(x) = 1, w^r_{t_{F_a}}(x) = 0 and w^k_{f_{F_a}}(x) = 2π for all x ∈ U.
(iii) 
Absolute CVSS: (F, A) = U if r_{t_{F_a}}(x) = 1, k_{f_{F_a}}(x) = 0, w^r_{t_{F_a}}(x) = 2π and w^k_{f_{F_a}}(x) = 0 for all x ∈ U.
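A minimal Python sketch of these operations, with each CVSS stored as a dict from (parameter, element) pairs to tuples (r, k, wr, wk) encoding the amplitude interval [r, 1 − k] and the phase interval [wr, 2π − wk]; the encoding and names are our own shorthand, not the paper's:

```python
import math

TWO_PI = 2 * math.pi

def cvss_subset(F, G):
    """Containment test of Definition 7(i)."""
    return all(
        F[key][0] <= G[key][0] and G[key][1] <= F[key][1]      # r_F <= r_G, k_G <= k_F
        and F[key][2] <= G[key][2] and G[key][3] <= F[key][3]  # same pattern for phases
        for key in F
    )

def null_cvss(keys):
    # Definition 7(ii): r = 0, k = 1, wr = 0, wk = 2*pi everywhere
    return {key: (0.0, 1.0, 0.0, TWO_PI) for key in keys}

def absolute_cvss(keys):
    # Definition 7(iii): r = 1, k = 0, wr = 2*pi, wk = 0 everywhere
    return {key: (1.0, 0.0, TWO_PI, 0.0) for key in keys}
```

Under this encoding the null CVSS is contained in every CVSS and every CVSS is contained in the absolute CVSS, as the definition intends.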

3. Axiomatic Definition of Distance Measure and Vague Entropy

Let E be a set of parameters and U be the universe of discourse. In this section, we present some information measures, namely distance and entropy measures, for the collection of CVSSs, which is denoted by CVSS(U).
Definition 8.
Let (F, A), (G, B), (H, C) ∈ CVSS(U). A complex-valued function d : CVSS(U) × CVSS(U) → {a : a ∈ C, |a| ≤ 1} is called a distance measure between CVSSs if it satisfies the following axioms:
(D1) 
d((F, A), (G, B)) = d((G, B), (F, A));
(D2) 
d((F, A), (G, B)) = 0 ⟺ (F, A) = (G, B);
(D3) 
d((F, A), (G, B)) = 1 ⟺ for all e ∈ E and x ∈ U, both (F, A) and (G, B) are crisp sets in U, i.e.,
  • (F, A) = {(x, [0, 0]e^{i[0, 0]})} and (G, B) = {(x, [1, 1]e^{i[2π, 2π]})},
  • or (F, A) = {(x, [0, 0]e^{i[2π, 2π]})} and (G, B) = {(x, [1, 1]e^{i[0, 0]})},
  • or (F, A) = {(x, [1, 1]e^{i[2π, 2π]})} and (G, B) = {(x, [0, 0]e^{i[0, 0]})},
  • or (F, A) = {(x, [1, 1]e^{i[0, 0]})} and (G, B) = {(x, [0, 0]e^{i[2π, 2π]})};
(D4) 
if (F, A) ⊆ (G, B) ⊆ (H, C), then d((F, A), (H, C)) ≥ max(d((F, A), (G, B)), d((G, B), (H, C))).
Next, we define the axiomatic definition for the vague entropy for a CVSS.
Definition 9.
A complex-valued function M : CVSS(U) → {a : a ∈ C, |a| ≤ 1} is called the vague entropy of CVSSs if it satisfies the following axioms for any (F, A), (G, B) ∈ CVSS(U):
(M1) 
0 ≤ |M(F, A)| ≤ 1;
(M2) 
M(F, A) = 0 ⟺ (F, A) is a crisp set on U, i.e., for all a ∈ A and x ∈ U, either r_{t_{F_a}}(x) = 1 and k_{f_{F_a}}(x) = 0 with w^r_{t_{F_a}}(x) = 2π, w^k_{f_{F_a}}(x) = 0, or with w^r_{t_{F_a}}(x) = 0, w^k_{f_{F_a}}(x) = 2π; or r_{t_{F_a}}(x) = 0 and k_{f_{F_a}}(x) = 1 with w^r_{t_{F_a}}(x) = 0, w^k_{f_{F_a}}(x) = 2π, or with w^r_{t_{F_a}}(x) = 2π, w^k_{f_{F_a}}(x) = 0;
(M3) 
M(F, A) = 1 ⟺ (F, A) is completely vague, i.e., for all a ∈ A and x ∈ U, r_{t_{F_a}}(x) = k_{f_{F_a}}(x) and w^r_{t_{F_a}}(x) = w^k_{f_{F_a}}(x);
(M4) 
M(F, A) = M((F, A)^c);
(M5) 
if one of the following two cases holds for all a ∈ A and x ∈ U:
Case 1: r_{t_{F_a}}(x) ≤ r_{t_{G_b}}(x) and k_{f_{F_a}}(x) ≥ k_{f_{G_b}}(x) whenever r_{t_{G_b}}(x) ≤ k_{f_{G_b}}(x), and w^r_{t_{F_a}}(x) ≤ w^r_{t_{G_b}}(x) and w^k_{f_{F_a}}(x) ≥ w^k_{f_{G_b}}(x) whenever w^r_{t_{G_b}}(x) ≤ w^k_{f_{G_b}}(x);
Case 2: r_{t_{F_a}}(x) ≥ r_{t_{G_b}}(x) and k_{f_{F_a}}(x) ≤ k_{f_{G_b}}(x) whenever r_{t_{G_b}}(x) ≥ k_{f_{G_b}}(x), and w^r_{t_{F_a}}(x) ≥ w^r_{t_{G_b}}(x) and w^k_{f_{F_a}}(x) ≤ w^k_{f_{G_b}}(x) whenever w^r_{t_{G_b}}(x) ≥ w^k_{f_{G_b}}(x);
then M(F, A) ≤ M(G, B).
Based on this definition, it is clear that a value close to 0 indicates that the CVSS has a very low degree of vagueness, whereas a value close to 1 implies that the CVSS is highly vague. For all x ∈ U, the nearer r_{t_{F_a}}(x) is to k_{f_{F_a}}(x), the larger the vague entropy, which reaches its maximum when r_{t_{F_a}}(x) = k_{f_{F_a}}(x). Condition (M5), on the other hand, is slightly different, as it is constructed using the sharpened version of a vague soft set explained in Hu et al. [58], instead of the usual condition that (F, A) ⊆ (G, B) implies that the entropy of (F, A) is higher than the entropy of (G, B). In [58], Hu et al. proved that this condition is inaccurate and provided several counter-examples to disprove it. Subsequently, they replaced this flawed condition with two new cases. We generalized these two cases to derive condition (M5) in this paper, in a bid to increase the accuracy of our proposed vague entropy. We refer the readers to [58] for further information on these revised conditions.

4. Relations between the Proposed Distance Measure and Vague Entropy

In the following, let U denote the absolute CVSS and ϕ the null CVSS. Based on the above definitions, we establish some relationships between the distance measure and vague entropy as follows:
Theorem 1.
Let (F, A) be a CVSS and let d be a distance measure between CVSSs. Then the measures M_1, M_2 and M_3 defined below
(i)
M_1(F, A) = 1 − d((F, A), (F, A)^c)
(ii)
M_2(F, A) = d((F, A) ∪ (F, A)^c, U)
(iii)
M_3(F, A) = 1 − d((F, A) ∪ (F, A)^c, (F, A) ∩ (F, A)^c)
are valid vague entropies of CVSSs.
Proof. 
Here, we shall prove only part (i); the others can be proved similarly.
It is clearly seen from the definition of vague entropy that M_1 satisfies conditions (M1) to (M4), so we only need to prove (M5). For this, consider the two cases stated in Definition 9. We prove that condition (M5) is satisfied for Case 1; the proof for Case 2 is similar and is thus omitted.
From the conditions given in Case 1 of (M5), we obtain the following relationships:
r_{t_{F_a}}(x) ≤ r_{t_{G_b}}(x) ≤ k_{f_{G_b}}(x) ≤ k_{f_{F_a}}(x)
and w^r_{t_{F_a}}(x) ≤ w^r_{t_{G_b}}(x) ≤ w^k_{f_{G_b}}(x) ≤ w^k_{f_{F_a}}(x).
Therefore, we have:
ϕ ⊆ (F, A) ⊆ (G, B) ⊆ (G, B)^c ⊆ (F, A)^c ⊆ U.
Hence, it follows that:
d((F, A), (F, A)^c) ≥ d((G, B), (G, B)^c).
Now, by the definition of M_1, we have:
M_1(F, A) = 1 − d((F, A), (F, A)^c)
≤ 1 − d((G, B), (G, B)^c)
= M_1(G, B).
This completes the proof.  ☐
Theorem 2.
If d is the distance measure between CVSSs, then:
M_4(F, A) = d((F, A) ∪ (F, A)^c, U) / d((F, A) ∩ (F, A)^c, U)
is a vague entropy of CVSSs.
Proof. 
For two CVSSs (F, A) and (G, B), it is clear that M_4 satisfies conditions (M1)–(M4), so it is enough to prove that M_4 satisfies condition (M5).
Consider Case 1:
r_{t_{F_a}}(x) ≤ r_{t_{G_b}}(x) and k_{f_{F_a}}(x) ≥ k_{f_{G_b}}(x) whenever r_{t_{G_b}}(x) ≤ k_{f_{G_b}}(x),
and w^r_{t_{F_a}}(x) ≤ w^r_{t_{G_b}}(x) and w^k_{f_{F_a}}(x) ≥ w^k_{f_{G_b}}(x) whenever w^r_{t_{G_b}}(x) ≤ w^k_{f_{G_b}}(x),
which implies that:
r_{t_{F_a}}(x) ≤ r_{t_{G_b}}(x) ≤ k_{f_{G_b}}(x) ≤ k_{f_{F_a}}(x)
and w^r_{t_{F_a}}(x) ≤ w^r_{t_{G_b}}(x) ≤ w^k_{f_{G_b}}(x) ≤ w^k_{f_{F_a}}(x).
Thus, we obtain:
ϕ ⊆ (F, A) ∩ (F, A)^c ⊆ (G, B) ∩ (G, B)^c ⊆ (G, B) ∪ (G, B)^c ⊆ (F, A) ∪ (F, A)^c ⊆ U.
Therefore, we have:
d((F, A) ∪ (F, A)^c, U) ≤ d((G, B) ∪ (G, B)^c, U)
and d((F, A) ∩ (F, A)^c, U) ≥ d((G, B) ∩ (G, B)^c, U).
Hence, by the definition of M_4, we have:
M_4(F, A) = d((F, A) ∪ (F, A)^c, U) / d((F, A) ∩ (F, A)^c, U)
≤ d((G, B) ∪ (G, B)^c, U) / d((G, B) ∩ (G, B)^c, U)
= M_4(G, B).
Similarly, for the other case, i.e., when r_{t_{F_a}}(x) ≥ r_{t_{G_b}}(x) and k_{f_{F_a}}(x) ≤ k_{f_{G_b}}(x) whenever r_{t_{G_b}}(x) ≥ k_{f_{G_b}}(x), and w^r_{t_{F_a}}(x) ≥ w^r_{t_{G_b}}(x) and w^k_{f_{F_a}}(x) ≤ w^k_{f_{G_b}}(x) whenever w^r_{t_{G_b}}(x) ≥ w^k_{f_{G_b}}(x), we again have M_4(F, A) ≤ M_4(G, B). Hence (M5) is satisfied.
Therefore, M 4   is a valid entropy measure.  ☐
Theorem 3.
For a CVSS (F, A), if d is a distance measure between CVSSs, then:
M_5(F, A) = d((F, A) ∩ (F, A)^c, ϕ) / d((F, A) ∪ (F, A)^c, ϕ)
is a vague entropy of CVSSs.
Proof. 
The proof is similar to that of Theorem 2 and is thus omitted.  ☐
Theorem 4.
For two CVSSs (F, A) and (G, B), if d is a distance measure between CVSSs such that:
d((F, A), (G, B)) = d((F, A)^c, (G, B)^c),
then the entropies M_4 and M_5 satisfy M_4 = M_5.
Proof. 
By definition of M 4 and M 5 , we have:
M_4(F, A) = d((F, A) ∪ (F, A)^c, U) / d((F, A) ∩ (F, A)^c, U)
= d(((F, A) ∪ (F, A)^c)^c, U^c) / d(((F, A) ∩ (F, A)^c)^c, U^c)
= d((F, A) ∩ (F, A)^c, ϕ) / d((F, A) ∪ (F, A)^c, ϕ)
= M_5(F, A).  ☐
Theorem 5.
For a CVSS (F, A), if d is a distance measure between CVSSs that satisfies:
d((F, A), U) = d((F, A), ϕ),
then:
M_6(F, A) = d((F, A) ∪ (F, A)^c, U) / d((F, A) ∪ (F, A)^c, ϕ)
is a vague entropy of CVSSs.
Theorem 6.
If d is a distance measure between CVSSs and satisfies d((F, A), U) = d((F, A), ϕ), then:
M_7(F, A) = d((F, A) ∩ (F, A)^c, ϕ) / d((F, A) ∩ (F, A)^c, U)
is a vague entropy of CVSSs.
Theorem 7.
If d is a distance measure between CVSSs that satisfies:
d((F, A), (G, B)) = d((F, A)^c, (G, B)^c),
then M 6 = M 7 .
Proof. 
The proofs of Theorems 5–7 are similar to those above and are thus omitted.  ☐
Theorem 8.
If d is a distance measure between CVSSs, then:
M_8(F, A) = 1 − d((F, A) ∩ (F, A)^c, U) + d((F, A) ∪ (F, A)^c, U)
is a vague entropy of CVSSs.
Theorem 9.
If d is a distance measure between CVSSs, then:
M_9(F, A) = 1 − d((F, A) ∪ (F, A)^c, ϕ) + d((F, A) ∩ (F, A)^c, ϕ)
is a vague entropy of CVSSs.
Theorem 10.
If d is a distance measure between CVSSs (F, A) and (G, B) such that:
d((F, A), (G, B)) = d((F, A)^c, (G, B)^c),
then M 8 = M 9 .
Theorem 11.
If d is a distance measure between CVSSs, then:
M_10(F, A) = 1 − d((F, A) ∪ (F, A)^c, ϕ) + d((F, A) ∪ (F, A)^c, U)
is a vague entropy of CVSSs.
Theorem 12.
If d is a distance measure between CVSSs, then:
M_11(F, A) = 1 − d((F, A) ∩ (F, A)^c, U) + d((F, A) ∩ (F, A)^c, ϕ)
is a vague entropy of CVSSs.
Theorem 13.
If d is a distance measure between CVSSs (F,A) and (G,B) such that:
d((F, A), (G, B)) = d((F, A)^c, (G, B)^c),
then M 10 = M 11 .
Proof. 
The proofs of these theorems are similar to those above and are thus omitted.  ☐

5. Illustrative Example

In this section, we present a scenario which necessitates the use of CVSSs. Subsequently, we present an application of the entropy measures proposed in Section 4 to an image detection problem to illustrate the validity and effectiveness of our proposed entropy formula.
Firstly, we shall define the distance between any two CVSSs as follows:
Definition 10.
Let ( F ,   A ) and ( G ,   B ) be two CVSSs over U . The distance between ( F ,   A ) and ( G ,   B ) is as given below:
d((F, A), (G, B)) = (1/4mn) Σ_{j=1}^{n} Σ_{i=1}^{m} [ max{ |r_{t_{F(a_i)}}(x_j) − r_{t_{G(b_i)}}(x_j)|, |k_{f_{G(b_i)}}(x_j) − k_{f_{F(a_i)}}(x_j)| } + (1/2π) max{ |w^r_{t_{F(a_i)}}(x_j) − w^r_{t_{G(b_i)}}(x_j)|, |w^k_{f_{G(b_i)}}(x_j) − w^k_{f_{F(a_i)}}(x_j)| } ]
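Definition 10 translates directly into code. The sketch below (Python, with our own (r, k, wr, wk) encoding of each (parameter, element) entry) is a direct transcription of the formula:

```python
import math

TWO_PI = 2 * math.pi

def cvss_distance(F, G):
    """Distance of Definition 10; F and G map (parameter, element) keys
    to (r, k, wr, wk) tuples and must share the same keys."""
    total = 0.0
    for key in F:
        rF, kF, wrF, wkF = F[key]
        rG, kG, wrG, wkG = G[key]
        amplitude = max(abs(rF - rG), abs(kG - kF))
        phase = max(abs(wrF - wrG), abs(wkG - wkF)) / TWO_PI
        total += amplitude + phase
    return total / (4 * len(F))  # the 1/(4mn) factor, with mn = len(F)

# the extreme sets of Definition 7
keys = [("e1", "x1"), ("e1", "x2"), ("e2", "x1"), ("e2", "x2")]
absolute = {key: (1.0, 0.0, TWO_PI, 0.0) for key in keys}
null = {key: (0.0, 1.0, 0.0, TWO_PI) for key in keys}
```

One observation worth noting: each summand contributes at most 2, so under the 1/(4mn) normalization the distance between the null and absolute CVSSs evaluates to 0.5 rather than 1; axiom (D3) should therefore be read relative to whichever normalization one adopts.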
In order to demonstrate the utility of the proposed entropy measures M_i (i = 1, 2, …, 11), we illustrate them with a numerical example. For this, consider a CVSS (F, A) whose data are defined over the parameters e_1, e_2 ∈ E and the elements x_1, x_2, x_3 ∈ U as follows:
(F, A) =
        e_1                                      e_2
x_1:  [0.2, 0.8]e^{i[0.1(2π), 0.2(2π)]}    [0.3, 0.5]e^{i[0.2(2π), 0.4(2π)]}
x_2:  [0.3, 0.6]e^{i[0.4(2π), 0.5(2π)]}    [0.5, 0.8]e^{i[0.1(2π), 0.2(2π)]}
x_3:  [0.2, 0.3]e^{i[0.2(2π), 0.4(2π)]}    [0.7, 0.9]e^{i[0.4(2π), 0.5(2π)]}
and hence the complement of the CVSS is:
(F, A)^c =
        e_1                                      e_2
x_1:  [0.2, 0.8]e^{i[0.8(2π), 0.9(2π)]}    [0.5, 0.7]e^{i[0.6(2π), 0.8(2π)]}
x_2:  [0.4, 0.7]e^{i[0.5(2π), 0.6(2π)]}    [0.2, 0.5]e^{i[0.8(2π), 0.9(2π)]}
x_3:  [0.7, 0.8]e^{i[0.6(2π), 0.8(2π)]}    [0.1, 0.3]e^{i[0.5(2π), 0.6(2π)]}
Then, by using the distance measure of Definition 10, we get d((F, A), (F, A)^c) = 0.1708, d((F, A) ∪ (F, A)^c, U) = 0.2167, d((F, A) ∪ (F, A)^c, (F, A) ∩ (F, A)^c) = 0.1708, d((F, A) ∩ (F, A)^c, U) = 0.3875, d((F, A) ∪ (F, A)^c, ϕ) = 0.3875, and d((F, A) ∩ (F, A)^c, ϕ) = 0.2167. Therefore, the values of the entropy measures defined in the above theorems are computed as:
(i)
M_1(F, A) = 1 − d((F, A), (F, A)^c) = 1 − 0.1708 = 0.8292.
(ii)
M_2(F, A) = d((F, A) ∪ (F, A)^c, U) = 0.2167.
(iii)
M_3(F, A) = 1 − d((F, A) ∪ (F, A)^c, (F, A) ∩ (F, A)^c) = 1 − 0.1708 = 0.8292.
(iv)
M_4(F, A) = d((F, A) ∪ (F, A)^c, U) / d((F, A) ∩ (F, A)^c, U) = 0.2167/0.3875 = 0.5592.
(v)
M_5(F, A) = d((F, A) ∩ (F, A)^c, ϕ) / d((F, A) ∪ (F, A)^c, ϕ) = 0.2167/0.3875 = 0.5592.
(vi)
M_6(F, A) = d((F, A) ∪ (F, A)^c, U) / d((F, A) ∪ (F, A)^c, ϕ) = 0.2167/0.3875 = 0.5592.
(vii)
M_7(F, A) = d((F, A) ∩ (F, A)^c, ϕ) / d((F, A) ∩ (F, A)^c, U) = 0.2167/0.3875 = 0.5592.
(viii)
M_8(F, A) = 1 − d((F, A) ∩ (F, A)^c, U) + d((F, A) ∪ (F, A)^c, U) = 1 − 0.3875 + 0.2167 = 0.8292.
(ix)
M_9(F, A) = 1 − d((F, A) ∪ (F, A)^c, ϕ) + d((F, A) ∩ (F, A)^c, ϕ) = 1 − 0.3875 + 0.2167 = 0.8292.
(x)
M_10(F, A) = 1 − d((F, A) ∪ (F, A)^c, ϕ) + d((F, A) ∪ (F, A)^c, U) = 1 − 0.3875 + 0.2167 = 0.8292.
(xi)
M_11(F, A) = 1 − d((F, A) ∩ (F, A)^c, U) + d((F, A) ∩ (F, A)^c, ϕ) = 1 − 0.3875 + 0.2167 = 0.8292.
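The eleven measures can also be checked numerically. The self-contained Python sketch below re-implements Definition 10 together with the CVSS complement, union and intersection (our own (r, k, wr, wk) encoding; the complement swaps the truth/falsity grades and their phase terms), evaluates M_1 to M_11 for a small hypothetical CVSS, and verifies the equalities claimed in Theorems 4, 7, 10 and 13:

```python
import math

TWO_PI = 2 * math.pi

def d(F, G):
    """Distance of Definition 10 over a common key set."""
    total = sum(
        max(abs(F[x][0] - G[x][0]), abs(G[x][1] - F[x][1]))
        + max(abs(F[x][2] - G[x][2]), abs(G[x][3] - F[x][3])) / TWO_PI
        for x in F
    )
    return total / (4 * len(F))

def comp(F):
    # complement: swap the truth/falsity grades and their phase terms
    return {x: (k, r, wk, wr) for x, (r, k, wr, wk) in F.items()}

def union(F, G):
    return {x: (max(F[x][0], G[x][0]), min(F[x][1], G[x][1]),
                max(F[x][2], G[x][2]), min(F[x][3], G[x][3])) for x in F}

def inter(F, G):
    return {x: (min(F[x][0], G[x][0]), max(F[x][1], G[x][1]),
                min(F[x][2], G[x][2]), max(F[x][3], G[x][3])) for x in F}

def entropies(F):
    """Evaluate M_1, ..., M_11 for a CVSS F."""
    U = {x: (1.0, 0.0, TWO_PI, 0.0) for x in F}  # absolute CVSS
    O = {x: (0.0, 1.0, 0.0, TWO_PI) for x in F}  # null CVSS
    Fc = comp(F)
    u, n = union(F, Fc), inter(F, Fc)
    return {
        1: 1 - d(F, Fc),
        2: d(u, U),
        3: 1 - d(u, n),
        4: d(u, U) / d(n, U),
        5: d(n, O) / d(u, O),
        6: d(u, U) / d(u, O),
        7: d(n, O) / d(n, U),
        8: 1 - d(n, U) + d(u, U),
        9: 1 - d(u, O) + d(n, O),
        10: 1 - d(u, O) + d(u, U),
        11: 1 - d(n, U) + d(n, O),
    }

# a small hypothetical CVSS over two (parameter, element) pairs
F = {("e1", "x1"): (0.2, 0.2, 0.1 * TWO_PI, 0.8 * TWO_PI),
     ("e1", "x2"): (0.3, 0.5, 0.2 * TWO_PI, 0.6 * TWO_PI)}
M = entropies(F)
```

Because this distance satisfies d(X, Y) = d(X^c, Y^c), the pairs (M_4, M_5), (M_6, M_7), (M_8, M_9) and (M_10, M_11) coincide, exactly as Theorems 4, 7, 10 and 13 predict.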
Next, we give an illustrative example from the field of pattern recognition, which is stated and demonstrated below.

5.1. The Scenario

A type of robot has a single eye capable of capturing (and hence memorizing) things it sees as an 850 × 640, 24 bit bitmap image. The robot was shown an object (a pillow with a smiley), and the image that was captured by the robot’s eye at that instant is shown in Figure 1. This image was saved as pic001.bmp in the memory of the robot.
The robot was then given a way (in this example, it is done by human input) to recognize the object, whenever the robot encounters the object again, by retrieving the colors at certain coordinates of its field of vision, and then comparing this with the same coordinates from image pic001.bmp stored in its memory. In order to distinguish noises, the coordinates are chosen in clusters of four, as shown in Figure 2. The coordinates of the clusters of the images are summarized in Table 1.
We now have three images, namely image A, image B and image C. The robot needs to recognize whether the object shown in each of images A, B and C is the same as the object shown in image pic001.bmp stored in the robot’s memory. Images A, B and C are shown in Figure 3, Figure 4 and Figure 5, respectively. For comparison purposes, pic001.bmp is shown alongside all three images.
From a human perspective, it is clear that the object shown in image A will be recognized as the same object shown in image pic001.bmp, and it will be concluded that the object shown in image B (a red airplane) is not the object in image pic001.bmp stored in the memory of the robot. No conclusion can be deduced from image C, as it is made up of only noise, and therefore we are unable to deduce the exact object behind the noise. By retrieving the coordinates from Table 1, we obtain the sets of colors given in Table 2.
The luminosity and hue of the pixels are obtained using a picture editing program, and these are given in Table 3 and Table 4, respectively.
Luminosity is denoted by L_{𝒷,k,n,b} and hue by H_{𝒷,k,n,b}, where 𝒷 indexes the image, k ∈ {LE, RE, LF, CF, RF, T, M}, n indexes the element, and b ∈ {0, 1, 2, 3} indexes the pixel within each cluster of four pixels.

5.2. Formation of CVSS and Calculation of Entropies

Let U = {x_1, x_2, x_3} and A = {LE, RE, LF, CF, RF, T, M}. We now form three CVSSs (ℱ_𝒷, A), 𝒷 ∈ {1, 2, 3}, which denote images A, B and C, respectively, using the formula given below:
ℱ_𝒷(k)(x_n) = [e^{−(γ_{𝒷,(k,n)})²/ρ}, e^{−(y_{𝒷,(k,n)})²/ρ}] · e^{2πi[e^{−(ϑ_{𝒷,(k,n)})²/ϱ}, e^{−(θ_{𝒷,(k,n)})²/ϱ}]},
where:
y_{𝒷,(k,n)} = min{|L_{𝒷,k,n,p} − L_{0,k,n,q}| : p, q ∈ {0, 1, 2, 3}},   γ_{𝒷,(k,n)} = max{|L_{𝒷,k,n,p} − L_{0,k,n,q}| : p, q ∈ {0, 1, 2, 3}},
θ_{𝒷,(k,n)} = min{|H_{𝒷,k,n,p} − H_{0,k,n,q}| : p, q ∈ {0, 1, 2, 3}},   ϑ_{𝒷,(k,n)} = max{|H_{𝒷,k,n,p} − H_{0,k,n,q}| : p, q ∈ {0, 1, 2, 3}}.
We choose ρ = 6400 and ϱ = 6400 for this scenario. The CVSSs formed for this scenario are given in Table 5, Table 6 and Table 7.
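This formation step can be sketched in Python. The cluster values below are hypothetical (the actual clusters come from Tables 3 and 4); note that the lower endpoint of each interval is produced by the largest pairwise difference and the upper endpoint by the smallest, so identical clusters give the interval [1, 1]:

```python
import math

RHO = 6400  # rho = varrho = 6400, as chosen in the text

def membership_interval(cluster_m, cluster_0, rho=RHO):
    """Vague membership interval [lower, upper] for one attribute k and position n.

    cluster_m, cluster_0: the four luminosity (or hue) values of the
    cluster in image m and in the stored image pic001.bmp.
    """
    diffs = [abs(p - q) for p in cluster_m for q in cluster_0]
    y, gamma = min(diffs), max(diffs)
    # large differences shrink the endpoints towards 0
    return math.exp(-gamma**2 / rho), math.exp(-y**2 / rho)

# Hypothetical clusters that are close to each other -> interval near [1, 1]
lo, hi = membership_interval([24, 27, 28, 25], [23, 24, 24, 24])
print(round(lo, 3), round(hi, 3))  # -> 0.996 1.0
```

The same function applied to the hue clusters, with ϱ in place of ρ, gives the phase-term interval.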
By using Definition 10, the entropy values for images A, B and C are summarized in Table 8.
From these values, it can clearly be seen that M_i(ℱ_3, A) > M_i(ℱ_2, A) > M_i(ℱ_1, A) for all i = 1, 2, …, 11. Hence it can be concluded that image A is the image closest to the original image pic001.bmp stored in the memory of the robot, whereas image C is the least similar to the original pic001.bmp image. The high entropy value of (ℱ_3, A) is also an indication of the abnormality of image C compared to images A and B. These entropy values and the results obtained for this scenario demonstrate the effectiveness of our proposed entropy formulas. The entropy values in Table 8 further verify the relationships between the 11 formulas proposed in Section 4.
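The monotone pattern across the three images can be checked mechanically against the Table 8 values; a minimal sketch (only the distinct value triples are listed, since several measures coincide in this scenario):

```python
# Entropy values (image A, image B, image C) taken from Table 8.
# M1, M3 and M8-M11 share one triple; M4-M7 share another.
table8 = {
    "M1, M3, M8-M11": (0.571, 0.647, 0.847),
    "M2":             (0.039, 0.089, 0.328),
    "M4-M7":          (0.084, 0.202, 0.682),
}

for name, (a, b, c) in table8.items():
    # M_i(F1, A) < M_i(F2, A) < M_i(F3, A) for every measure
    assert a < b < c, name

print("image A has the lowest entropy under every measure")
```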

6. Conclusions

The objective of this work is to introduce some entropy measures for the complex vague soft set environment in order to measure the degree of vagueness between sets. To this end, we first gave axiomatic definitions of the distance and entropy measures for two CVSSs, and then proposed some desirable relations between them. The advantage of the proposed measures is that they are defined over sets whose membership and non-membership degrees are complex numbers rather than real numbers, and hence handle two-dimensional information more faithfully. All of the information measures proposed here complement the CVSS model in representing and modeling time-periodic phenomena. The proposed measures are illustrated with a numerical example related to the problem of image detection by a robot. Furthermore, the use of CVSSs enables efficient modeling of the periodicity and/or the non-physical attributes in signal processing, image detection, and multi-dimensional pattern recognition, all of which involve multi-dimensional data. The work presented in this paper can serve as a foundation for further study of information measures for complex fuzzy sets and their generalizations. On our part, we are currently studying inclusion measures and developing clustering algorithms for CVSSs. In the future, the results of this paper can be extended to other uncertain and fuzzy environments [59,60,61,62,63,64,65,66,67,68].

Author Contributions

Conceptualization, Methodology, Validation, Writing-Original Draft Preparation, G.S.; Writing-Review & Editing, H.G.; Investigation and Visualization, S.G.Q.; Funding Acquisition, G.S.

Funding

This research was funded by the Ministry of Higher Education, Malaysia, grant number FRGS/1/2017/STG06/UCSI/03/1 and UCSI University, Malaysia, grant number Proj-In-FOBIS-014.

Acknowledgments

The authors are thankful to the editor and anonymous reviewers for their constructive comments and suggestions that helped us in improving the paper significantly. The authors would like to gratefully acknowledge the financial assistance received from the Ministry of Education, Malaysia under grant no. FRGS/1/2017/STG06/UCSI/03/1 and UCSI University, Malaysia under grant no. Proj-In-FOBIS-014.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Zadeh, L.A. Fuzzy sets. Inf. Control 1965, 8, 338–353. [Google Scholar] [CrossRef]
  2. De Luca, A.; Termini, S. A definition of a non-probabilistic entropy in the setting of fuzzy sets theory. Inf. Control 1972, 20, 301–312. [Google Scholar] [CrossRef]
  3. Liu, X. Entropy, distance measure and similarity measure of fuzzy sets and their relations. Fuzzy Sets Syst. 1992, 52, 305–318. [Google Scholar]
  4. Fan, J.; Xie, W. Distance measures and induced fuzzy entropy. Fuzzy Sets Syst. 1999, 104, 305–314. [Google Scholar] [CrossRef]
  5. Atanassov, K.T. Intuitionistic fuzzy sets. Fuzzy Sets Syst. 1986, 20, 87–96. [Google Scholar] [CrossRef]
  6. Gau, W.L.; Buehrer, D.J. Vague sets. IEEE Trans. Syst. Man Cybern. 1993, 23, 610–613. [Google Scholar] [CrossRef]
  7. Atanassov, K.; Gargov, G. Interval-valued intuitionistic fuzzy sets. Fuzzy Sets Syst. 1989, 31, 343–349. [Google Scholar] [CrossRef]
  8. Szmidt, E.; Kacprzyk, J. Entropy for intuitionistic fuzzy sets. Fuzzy Sets Syst. 2001, 118, 467–477. [Google Scholar] [CrossRef]
  9. Vlachos, I.K.; Sergiadis, G.D. Intuitionistic fuzzy information—Application to pattern recognition. Pattern Recognit. Lett. 2007, 28, 197–206. [Google Scholar] [CrossRef]
  10. Burillo, P.; Bustince, H. Entropy on intuitionistic fuzzy sets and on interval-valued fuzzy sets. Fuzzy Sets Syst. 1996, 78, 305–316. [Google Scholar] [CrossRef]
  11. Garg, H.; Agarwal, N.; Tripathi, A. Generalized intuitionistic fuzzy entropy measure of order α and degree β and its applications to multi-criteria decision making problem. Int. J. Fuzzy Syst. Appl. 2017, 6, 86–107. [Google Scholar] [CrossRef]
  12. Liao, H.; Xu, Z.; Zeng, X. Distance and similarity measures for hesitant fuzzy linguistic term sets and their application in multi-criteria decision making. Inf. Sci. 2014, 271, 125–142. [Google Scholar] [CrossRef]
  13. Garg, H.; Agarwal, N.; Tripathi, A. A novel generalized parametric directed divergence measure of intuitionistic fuzzy sets with its application. Ann. Fuzzy Math. Inform. 2017, 13, 703–727. [Google Scholar]
  14. Gou, X.; Xu, Z.; Liao, H. Hesitant fuzzy linguistic entropy and cross-entropy measures and alternative queuing method for multiple criteria decision making. Inf. Sci. 2017, 388–389, 225–246. [Google Scholar] [CrossRef]
  15. Liu, W.; Liao, H. A bibliometric analysis of fuzzy decision research during 1970–2015. Int. J. Fuzzy Syst. 2017, 19, 1–14. [Google Scholar] [CrossRef]
  16. Garg, H. Hesitant pythagorean fuzzy sets and their aggregation operators in multiple attribute decision making. Int. J. Uncertain. Quantif. 2018, 8, 267–289. [Google Scholar] [CrossRef]
  17. Garg, H. Distance and similarity measure for intuitionistic multiplicative preference relation and its application. Int. J. Uncertain. Quantif. 2017, 7, 117–133. [Google Scholar] [CrossRef]
  18. Dimuro, G.P.; Bedregal, B.; Bustince, H.; Jurio, A.; Baczynski, M.; Mis, K. QL-operations and QL-implication functions constructed from tuples (O, G, N) and the generation of fuzzy subsethood and entropy measures. Int. J. Approx. Reason. 2017, 82, 170–192. [Google Scholar] [CrossRef]
  19. Liao, H.; Xu, Z.; Viedma, E.H.; Herrera, F. Hesitant fuzzy linguistic term set and its application in decision making: A state-of-the art survey. Int. J. Fuzzy Syst. 2017, 1–27. [Google Scholar] [CrossRef]
  20. Garg, H.; Agarwal, N.; Tripathi, A. Choquet integral-based information aggregation operators under the interval-valued intuitionistic fuzzy set and its applications to decision-making process. Int. J. Uncertain. Quantif. 2017, 7, 249–269. [Google Scholar] [CrossRef]
  21. Garg, H. Linguistic Pythagorean fuzzy sets and its applications in multiattribute decision-making process. Int. J. Intell. Syst. 2018, 33, 1234–1263. [Google Scholar] [CrossRef]
  22. Garg, H. A robust ranking method for intuitionistic multiplicative sets under crisp, interval environments and its applications. IEEE Trans. Emerg. Top. Comput. Intell. 2017, 1, 366–374. [Google Scholar] [CrossRef]
  23. Yu, D.; Liao, H. Visualization and quantitative research on intuitionistic fuzzy studies. J. Intell. Fuzzy Syst. 2016, 30, 3653–3663. [Google Scholar] [CrossRef]
  24. Garg, H. Generalized intuitionistic fuzzy entropy-based approach for solving multi-attribute decision-making problems with unknown attribute weights. Proc. Natl. Acad. Sci. India Sect. A Phys. Sci. 2017, 1–11. [Google Scholar] [CrossRef]
  25. Garg, H. A new generalized improved score function of interval-valued intuitionistic fuzzy sets and applications in expert systems. Appl. Soft Comput. 2016, 38, 988–999. [Google Scholar] [CrossRef]
  26. Molodtsov, D. Soft set theory—First results. Comput. Math. Appl. 1999, 27, 19–31. [Google Scholar] [CrossRef]
  27. Maji, P.K.; Biswas, R.; Roy, A. Intuitionistic fuzzy soft sets. J. Fuzzy Math. 2001, 9, 677–692. [Google Scholar]
  28. Maji, P.K.; Biswas, R.; Roy, A.R. Fuzzy soft sets. J. Fuzzy Math. 2001, 9, 589–602. [Google Scholar]
  29. Yang, H.L. Notes on generalized fuzzy soft sets. J. Math. Res. Expos. 2011, 31, 567–570. [Google Scholar]
  30. Majumdar, P.; Samanta, S.K. Generalized fuzzy soft sets. Comput. Math. Appl. 2010, 59, 1425–1432. [Google Scholar] [CrossRef]
  31. Agarwal, M.; Biswas, K.K.; Hanmandlu, M. Generalized intuitionistic fuzzy soft sets with applications in decision-making. Appl. Soft Comput. 2013, 13, 3552–3566. [Google Scholar] [CrossRef]
  32. Garg, H.; Arora, R. Generalized and group-based generalized intuitionistic fuzzy soft sets with applications in decision-making. Appl. Intell. 2018, 48, 343–356. [Google Scholar] [CrossRef]
  33. Majumdar, P.; Samanta, S. Similarity measure of soft sets. New Math. Nat. Comput. 2008, 4, 1–12. [Google Scholar] [CrossRef]
  34. Garg, H.; Arora, R. Distance and similarity measures for dual hesitant fuzzy soft sets and their applications in multi criteria decision-making problem. Int. J. Uncertain. Quantif. 2017, 7, 229–248. [Google Scholar] [CrossRef]
  35. Kharal, A. Distance and similarity measures for soft sets. New Math. Nat. Comput. 2010, 6, 321–334. [Google Scholar] [CrossRef]
  36. Jiang, Y.; Tang, Y.; Liu, H.; Chen, Z. Entropy on intuitionistic fuzzy soft sets and on interval-valued fuzzy soft sets. Inf. Sci. 2013, 240, 95–114. [Google Scholar] [CrossRef]
  37. Garg, H.; Agarwal, N.; Tripathi, A. Fuzzy number intuitionistic fuzzy soft sets and its properties. J. Fuzzy Set Valued Anal. 2016, 2016, 196–213. [Google Scholar] [CrossRef]
  38. Arora, R.; Garg, H. Prioritized averaging/geometric aggregation operators under the intuitionistic fuzzy soft set environment. Sci. Iran. E 2018, 25, 466–482. [Google Scholar] [CrossRef]
  39. Arora, R.; Garg, H. Robust aggregation operators for multi-criteria decision making with intuitionistic fuzzy soft set environment. Sci. Iran. E 2018, 25, 931–942. [Google Scholar] [CrossRef]
  40. Garg, H.; Arora, R. A nonlinear-programming methodology for multi-attribute decision-making problem with interval-valued intuitionistic fuzzy soft sets information. Appl. Intell. 2017, 1–16. [Google Scholar] [CrossRef]
  41. Garg, H.; Arora, R. Bonferroni mean aggregation operators under intuitionistic fuzzy soft set environment and their applications to decision-making. J. Oper. Res. Soc. 2018, 1–14. [Google Scholar] [CrossRef]
  42. Xu, W.; Ma, J.; Wang, S.; Hao, G. Vague soft sets and their properties. Comput. Math. Appl. 2010, 59, 787–794. [Google Scholar] [CrossRef]
  43. Chen, S.M. Measures of similarity between vague sets. Fuzzy Sets Syst. 1995, 74, 217–223. [Google Scholar] [CrossRef]
  44. Wang, C.; Qu, A. Entropy, similarity measure and distance measure of vague soft sets and their relations. Inf. Sci. 2013, 244, 92–106. [Google Scholar] [CrossRef]
  45. Selvachandran, G.; Maji, P.; Faisal, R.Q.; Salleh, A.R. Distance and distance induced intuitionistic entropy of generalized intuitionistic fuzzy soft sets. Appl. Intell. 2017, 1–16. [Google Scholar] [CrossRef]
  46. Ramot, D.; Milo, R.; Fiedman, M.; Kandel, A. Complex fuzzy sets. IEEE Trans. Fuzzy Syst. 2002, 10, 171–186. [Google Scholar] [CrossRef]
  47. Ramot, D.; Friedman, M.; Langholz, G.; Kandel, A. Complex fuzzy logic. IEEE Trans. Fuzzy Syst. 2003, 11, 450–461. [Google Scholar] [CrossRef]
  48. Greenfield, S.; Chiclana, F.; Dick, S. Interval-Valued Complex Fuzzy Logic. In Proceedings of the IEEE International Conference on Fuzzy Systems (FUZZ), Vancouver, BC, Canada, 24–29 July 2016; pp. 1–6. [Google Scholar] [CrossRef]
  49. Yazdanbakhsh, O.; Dick, S. A systematic review of complex fuzzy sets and logic. Fuzzy Sets Syst. 2018, 338, 1–22. [Google Scholar] [CrossRef]
  50. Alkouri, A.; Salleh, A. Complex Intuitionistic Fuzzy Sets. In Proceedings of the 2nd International Conference on Fundamental and Applied Sciences, Kuala Lumpur, Malaysia, 12–14 June 2012; Volume 1482, pp. 464–470. [Google Scholar]
  51. Alkouri, A.U.M.; Salleh, A.R. Complex Atanassov’s intuitionistic fuzzy relation. Abstr. Appl. Anal. 2013, 2013, 287382. [Google Scholar] [CrossRef]
  52. Rani, D.; Garg, H. Distance measures between the complex intuitionistic fuzzy sets and its applications to the decision-making process. Int. J. Uncertain. Quantif. 2017, 7, 423–439. [Google Scholar] [CrossRef]
  53. Kumar, T.; Bajaj, R.K. On complex intuitionistic fuzzy soft sets with distance measures and entropies. J. Math. 2014, 2014, 972198. [Google Scholar] [CrossRef]
  54. Selvachandran, G.; Majib, P.; Abed, I.E.; Salleh, A.R. Complex vague soft sets and its distance measures. J. Intell. Fuzzy Syst. 2016, 31, 55–68. [Google Scholar] [CrossRef]
  55. Selvachandran, G.; Maji, P.K.; Abed, I.E.; Salleh, A.R. Relations between complex vague soft sets. Appl. Soft Comput. 2016, 47, 438–448. [Google Scholar] [CrossRef]
  56. Selvachandran, G.; Garg, H.; Alaroud, M.H.S.; Salleh, A.R. Similarity measure of complex vague soft sets and its application to pattern recognition. Int. J. Fuzzy Syst. 2018, 1–14. [Google Scholar] [CrossRef]
  57. Singh, P.K. Complex vague set based concept lattice. Chaos Solitons Fractals 2017, 96, 145–153. [Google Scholar] [CrossRef]
  58. Hu, D.; Hong, Z.; Wang, Y. A new approach to entropy and similarity measure of vague soft sets. Sci. World J. 2014, 2014, 610125. [Google Scholar] [CrossRef] [PubMed]
  59. Arora, R.; Garg, H. A robust correlation coefficient measure of dual hesitant fuzzy soft sets and their application in decision making. Eng. Appl. Artif. Intell. 2018, 72, 80–92. [Google Scholar] [CrossRef]
  60. Qiu, D.; Lu, C.; Zhang, W.; Lan, Y. Algebraic properties and topological properties of the quotient space of fuzzy numbers based on mares equivalence relation. Fuzzy Sets Syst. 2014, 245, 63–82. [Google Scholar] [CrossRef]
  61. Garg, H. Generalised Pythagorean fuzzy geometric interactive aggregation operators using Einstein operations and their application to decision making. J. Exp. Theor. Artif. Intell. 2018. [Google Scholar] [CrossRef]
  62. Garg, H.; Arora, R. Novel scaled prioritized intuitionistic fuzzy soft interaction averaging aggregation operators and their application to multi criteria decision making. Eng. Appl. Artif. Intell. 2018, 71, 100–112. [Google Scholar] [CrossRef]
  63. Qiu, D.; Zhang, W. Symmetric fuzzy numbers and additive equivalence of fuzzy numbers. Soft Comput. 2013, 17, 1471–1477. [Google Scholar] [CrossRef]
  64. Garg, H.; Kumar, K. Distance measures for connection number sets based on set pair analysis and its applications to decision making process. Appl. Intell. 2018, 1–14. [Google Scholar] [CrossRef]
  65. Qiu, D.; Zhang, W.; Lu, C. On fuzzy differential equations in the quotient space of fuzzy numbers. Fuzzy Sets Syst. 2016, 295, 72–98. [Google Scholar] [CrossRef]
  66. Garg, H.; Kumar, K. An advanced study on the similarity measures of intuitionistic fuzzy sets based on the set pair analysis theory and their application in decision making. Soft Comput. 2018. [Google Scholar] [CrossRef]
  67. Garg, H.; Nancy. Linguistic single-valued neutrosophic prioritized aggregation operators and their applications to multiple-attribute group decision-making. J. Ambient Intell. Hum. Comput. 2018, 1–23. [Google Scholar] [CrossRef]
  68. Garg, H.; Kumar, K. Some aggregation operators for Linguistic intuitionistic fuzzy set and its application to group decision-making process using the set pair analysis. Arab. J. Sci. Eng. 2018, 43, 3213–3227. [Google Scholar] [CrossRef]
Figure 1. The image of the object captured by the robot.
Figure 2. The clusters of the image pic001.bmp for recognition purposes.
Figure 3. Image A (left) and the original image pic001.bmp (right).
Figure 4. Image B (left) and the original image pic001.bmp (right).
Figure 5. Image C (left) and the original image pic001.bmp (right).
Table 1. The coordinates of the clusters for image pic001.bmp from Figure 2.
Positions: n = 1 (1st position), n = 2 (2nd position), n = 3 (3rd position).
“Left Eye” (LEn)
n = 1: (323, 226), (324, 226), (323, 227), (324, 227)
n = 2: (301, 252), (302, 252), (301, 253), (302, 253)
n = 3: (345, 252), (346, 252), (345, 253), (346, 253)
“Right Eye” (REn)
n = 1: (486, 226), (487, 226), (486, 227), (487, 227)
n = 2: (464, 252), (465, 252), (464, 253), (465, 253)
n = 3: (509, 252), (510, 252), (509, 253), (510, 253)
“Left side of Face” (LFn)
n = 1: (284, 119), (285, 119), (284, 120), (285, 120)
n = 2: (167, 312), (168, 312), (167, 313), (168, 313)
n = 3: (275, 519), (276, 519), (275, 520), (276, 520)
“Centre of Face” (CFn)
n = 1: (407, 168), (408, 168), (407, 169), (408, 169)
n = 2: (406, 262), (407, 262), (406, 263), (407, 263)
n = 3: (406, 363), (407, 363), (406, 364), (407, 364)
“Right side of Face” (RFn)
n = 1: (553, 120), (554, 120), (553, 121), (554, 121)
n = 2: (671, 307), (672, 307), (671, 308), (672, 308)
n = 3: (562, 521), (563, 521), (562, 522), (563, 522)
“Tongue” (Tn)
n = 1: (581, 404), (582, 404), (581, 405), (582, 405)
n = 2: (562, 429), (563, 429), (562, 430), (563, 430)
n = 3: (598, 430), (599, 430), (598, 431), (599, 431)
“Mouth” (Mn)
n = 1: (274, 403), (278, 407), (282, 411), (286, 415)
n = 2: (393, 469), (401, 469), (409, 469), (417, 469)
n = 3: (553, 395), (556, 389), (559, 383), (562, 377)
Remark: For a computer image, the top-leftmost pixel is labeled (0, 0).
Table 2. The sets of colors for image A, B, C and image pic001.bmp.
[Table 2 consists of color swatch images (Entropy 20 00403 i001–i004) showing, for each cluster k ∈ {LE, RE, LF, CF, RF, T, M} and each position n = 1, 2, 3, the four pixel colors of pic001.bmp (memory) and of images A, B and C.]
Table 3. The values of the luminosity for image A, B, C and image pic001.bmp.
[Table 3 lists, for each cluster k ∈ {LE, RE, LF, CF, RF, T, M} and each position n = 1, 2, 3, the four luminosity values L_(m,k,n,b), b = 0, 1, 2, 3, of pic001.bmp (memory, m = 0), image A (m = 1), image B (m = 2) and image C (m = 3); the individual numeric entries are omitted here.]
Table 4. The values of the hue of the pixels for image A, B, C and image pic001.bmp.
[Table 4 lists, for each cluster k ∈ {LE, RE, LF, CF, RF, T, M} and each position n = 1, 2, 3, the four hue values H_(m,k,n,b), b = 0, 1, 2, 3, of pic001.bmp (memory, m = 0), image A (m = 1), image B (m = 2) and image C (m = 3); the individual numeric entries are omitted here.]
Table 5. Tabular representation of (ℱ_1, A).
k \ n | n = 1 | n = 2 | n = 3
LE | [0.996, 1.000]e^2πi[0.996, 0.997] | [0.960, 0.987]e^2πi[0.994, 0.996] | [0.987, 0.994]e^2πi[0.994, 0.998]
RE | [0.920, 0.945]e^2πi[0.994, 0.994] | [0.927, 0.955]e^2πi[0.994, 0.996] | [0.899, 0.927]e^2πi[0.987, 0.992]
LF | [0.920, 0.955]e^2πi[0.998, 0.999] | [0.666, 0.708]e^2πi[0.997, 0.997] | [0.990, 0.997]e^2πi[0.999, 1.000]
CF | [0.955, 0.990]e^2πi[0.999, 1.000] | [0.913, 0.939]e^2πi[0.999, 0.999] | [0.913, 0.939]e^2πi[0.999, 0.999]
RF | [0.798, 0.825]e^2πi[0.999, 1.000] | [0.434, 0.475]e^2πi[0.998, 0.998] | [0.349, 0.386]e^2πi[0.999, 0.999]
T | [0.323, 0.358]e^2πi[0.999, 0.999] | [0.314, 0.349]e^2πi[0.998, 1.000] | [0.251, 0.298]e^2πi[0.997, 0.999]
M | [0.992, 0.999]e^2πi[0.994, 0.998] | [0.868, 0.945]e^2πi[0.990, 0.996] | [0.884, 0.913]e^2πi[0.998, 0.999]
Table 6. Tabular representation of (ℱ_2, A).
k \ n | n = 1 | n = 2 | n = 3
LE | [0.945, 0.955]e^2πi[0.008, 0.034] | [0.258, 0.444]e^2πi[0.999, 0.999] | [0.591, 0.708]e^2πi[0.992, 0.996]
RE | [0.950, 0.960]e^2πi[0.034, 0.034] | [0.955, 0.973]e^2πi[0.032, 0.034] | [0.955, 0.969]e^2πi[0.031, 0.032]
LF | [0.222, 0.258]e^2πi[0.021, 0.022] | [0.580, 0.612]e^2πi[0.042, 0.055] | [0.807, 0.876]e^2πi[0.913, 0.933]
CF | [0.332, 0.377]e^2πi[0.028, 0.068] | [0.994, 1.000]e^2πi[0.933, 0.939] | [0.623, 0.728]e^2πi[0.001, 0.005]
RF | [0.386, 0.405]e^2πi[0.068, 0.071] | [0.666, 0.708]e^2πi[0.068, 0.090] | [0.996, 0.999]e^2πi[0.150, 0.172]
T | [0.559, 0.623]e^2πi[0.999, 0.999] | [0.798, 0.852]e^2πi[0.019, 0.032] | [0.475, 0.516]e^2πi[0.998, 1.000]
M | [0.465, 0.623]e^2πi[0.997, 1.000] | [0.007, 0.071]e^2πi[0.994, 0.999] | [0.454, 0.892]e^2πi[0.002, 0.987]
Table 7. Tabular representation of (ℱ_3, A).
k \ n | n = 1 | n = 2 | n = 3
LE | [0.178, 0.999]e^2πi[0.001, 0.759] | [0.332, 0.868]e^2πi[0.028, 0.973] | [0.021, 0.999]e^2πi[0.000, 0.969]
RE | [0.229, 0.997]e^2πi[0.048, 0.666] | [0.718, 0.933]e^2πi[0.000, 0.034] | [0.222, 0.939]e^2πi[0.001, 0.516]
LF | [0.749, 0.876]e^2πi[0.001, 0.992] | [0.266, 0.749]e^2πi[0.051, 0.973] | [0.495, 0.996]e^2πi[0.005, 0.899]
CF | [0.559, 0.997]e^2πi[0.083, 0.973] | [0.340, 0.612]e^2πi[0.004, 0.386] | [0.655, 0.955]e^2πi[0.032, 0.876]
RF | [0.332, 0.969]e^2πi[0.068, 0.913] | [0.728, 0.884]e^2πi[0.001, 0.788] | [0.738, 0.998]e^2πi[0.080, 0.996]
T | [0.559, 0.973]e^2πi[0.001, 0.950] | [0.548, 0.973]e^2πi[0.028, 0.945] | [0.046, 0.965]e^2πi[0.003, 0.105]
M | [0.282, 0.999]e^2πi[0.057, 0.955] | [0.161, 1.000]e^2πi[0.005, 0.973] | [0.906, 0.996]e^2πi[0.001, 0.229]
Table 8. Summary of the entropy values for images A, B and C.
Entropy Measure | Image A (ℱ_1, A) | Image B (ℱ_2, A) | Image C (ℱ_3, A)
M_1 | 0.571 | 0.647 | 0.847
M_2 | 0.039 | 0.089 | 0.328
M_3 | 0.571 | 0.647 | 0.847
M_4 | 0.084 | 0.202 | 0.682
M_5 | 0.084 | 0.202 | 0.682
M_6 | 0.084 | 0.202 | 0.682
M_7 | 0.084 | 0.202 | 0.682
M_8 | 0.571 | 0.647 | 0.847
M_9 | 0.571 | 0.647 | 0.847
M_10 | 0.571 | 0.647 | 0.847
M_11 | 0.571 | 0.647 | 0.847

© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).