Article

A Novel (R,S)-Norm Entropy Measure of Intuitionistic Fuzzy Sets and Its Applications in Multi-Attribute Decision-Making

School of Mathematics, Thapar Institute of Engineering & Technology, Deemed University, Patiala 147004, Punjab, India
* Author to whom correspondence should be addressed.
Mathematics 2018, 6(6), 92; https://doi.org/10.3390/math6060092
Submission received: 16 May 2018 / Revised: 26 May 2018 / Accepted: 28 May 2018 / Published: 30 May 2018
(This article belongs to the Special Issue Fuzzy Mathematics)

Abstract:
The objective of this manuscript is to present a novel information measure for quantifying the degree of fuzziness of intuitionistic fuzzy sets (IFSs). To achieve this, we define an (R, S)-norm-based information measure, called the entropy, to measure the degree of fuzziness of a set. We then prove that the proposed entropy measure is valid and satisfies certain properties, and an illustrative example related to a linguistic variable is given to demonstrate it. We then utilize the measure to propose two decision-making approaches for solving the multi-attribute decision-making (MADM) problem in the IFS environment, considering the attribute weights to be either partially known or completely unknown. Finally, a practical example is provided to illustrate the decision-making process. The results corresponding to different pairs of (R, S) give the decision-maker different choices for assessing the results.

1. Introduction

Multi-attribute decision-making (MADM) problems are an important part of decision theory, in which we choose the best alternative from a finite set based on the collective information. Traditionally, it has been assumed that the information for assessing the alternatives is given in the form of real numbers. However, uncertainty and fuzziness are big issues in real-world problems nowadays and can be found everywhere, as in our discussions or the way we process information. To deal with such situations, the theory of fuzzy sets (FSs) [1] and extended fuzzy sets such as the intuitionistic fuzzy set (IFS) [2] and interval-valued IFS (IVIFS) [3] are the most successful, as they characterize the attribute values in terms of membership degrees. During the last few decades, researchers have been paying more attention to these theories and have successfully applied them to various situations in the decision-making process. The two important aspects of solving the MADM problem are, first, to design an appropriate function that aggregates the different preferences of the decision-makers into a collective one and, second, to design appropriate measures to rank the alternatives. For the former, an aggregation operator is an important part of decision-making; it usually takes the form of a mathematical function that aggregates all the individual input data into a single value. Over the last decade, numerous attempts have been made by different researchers to process information values using different aggregation operators under the IFS and IVIFS environments. For instance, Xu and Yager [4] and Xu [5] presented some weighted averaging and geometric aggregation operators to aggregate different intuitionistic fuzzy numbers (IFNs). Garg [6] and Garg [7] presented some interactive improved aggregation operators for IFNs using Einstein norm operations.
Wang and Wang [8] characterized the preference of the decision-makers in terms of interval-numbers, and then, an MADM was presented corresponding to it with completely unknown weight vectors. Wei [9] presented some induced geometric aggregation operators with intuitionistic fuzzy information. Arora and Garg [10] and Arora and Garg [11] presented some aggregation operators by considering the different parameterization factors in the analysis in the intuitionistic fuzzy soft set environment. Zhou and Xu [12] presented some extreme weighted averaging aggregation operators for solving decision-making problems in terms of the optimism and pessimism points of view. Garg [13] presented some improved geometric aggregation operators for IVIFS. A complete overview about the aggregation operators in the IVIFSs was summarized by Xu and Guo in [14]. Jamkhaneh and Garg [15] presented some new operations for the generalized IFSs and applied them to solve decision-making problems. Garg and Singh [16] presented a new triangular interval Type-2 IFS and its corresponding aggregation operators.
With regard to information measures, the entropy measure is basically known as a measure of information, originating from the fundamental 1948 paper “The Mathematical Theory of Communication” by C. E. Shannon [17]. Information theory is one of the trusted areas for measuring the degree of uncertainty in data. However, classical information measures deal with information that is precise in nature. In order to overcome this, Deluca and Termini [18] proposed a set of axioms for fuzzy entropy. Later on, Szmidt and Kacprzyk [19] extended the axioms of Deluca and Termini [18] to the IFS environment. Vlachos and Sergiadis [20] extended their measure to the IFS environment. Burillo and Bustince [21] introduced the entropy of IFSs as a tool to measure the degree of intuitionism associated with an IFS. Garg et al. [22] presented a generalized intuitionistic fuzzy entropy measure of order α and degree β to solve decision-making problems. Wei et al. [23] presented an entropy measure based on trigonometric functions. Garg et al. [24] presented an entropy-based method for solving decision-making problems. Zhang and Jiang [25] presented an intuitionistic fuzzy entropy by generalizing the measure of Deluca and Termini [18]. Verma and Sharma [26] presented an exponential order measure between IFSs.
In contrast to entropy measures, distance or similarity measures are also used by researchers to measure the similarity between two IFSs. In that direction, Taneja [27] presented a theory of generalized information measures in the fuzzy environment. Boekee and Van der Lubbe [28] presented the R-norm information measure. Hung and Yang [29] presented similarity measures between two different IFSs based on the Hausdorff distance. Garg [30] and Garg and Arora [31] presented a series of distance and similarity measures in different environments to solve decision-making problems. Joshi and Kumar [32] presented an (R, S)-norm fuzzy information measure to solve decision-making problems. Garg and Kumar [33,34] presented some similarity and distance measures of IFSs by using set pair analysis theory. Meanwhile, decision-making methods based on some measures (such as distance, similarity degree, correlation coefficient and entropy) were proposed to deal with fuzzy IF and interval-valued IF MADM problems [35,36,37,38].
In [39,40,41,42,43], emphasis was given by the researchers to the attribute weights during ranking of the alternatives. It is quite obvious that the final ranking order of the alternatives highly depends on the attribute weights, because the variation of weight values may result in a different final ranking order of alternatives [39,44,45,46,47]. Now, based on the characteristics of the attribute weights, the decision-making problem can be classified into three types: (a) the decision-making situation where the attribute weights are completely known; (b) the decision-making situation where the attribute weights are completely unknown; (c) the decision-making situation where the attribute weights are partially known. Thus, based on these types, the attribute weights in MADM can be classified as subjective and objective attribute weights based on the information acquisition approach. If the decision-maker gives weights to the attributes, then such information is called subjective. The classical approaches to determine the subjective attribute weights are the analytic hierarchy process (AHP) method [48] and the Delphi method [49]. On the other hand, the objective attribute weights are determined by the decision-making matrix, and one of the most important approaches is the Shannon entropy method [17], which expresses the relative intensities of the attributes’ importance to signify the average intrinsic information transmitted to the decision-maker. In the literature, several authors [39,44,50,51,52] have addressed the MADM problem with subjective weight information. However, some researchers formulated a nonlinear programming model to determine the attribute weights. For instance, Chen and Li [44] presented an approach to assess the attribute weights by utilizing IF entropy in the IFS environment. Garg [53] presented a generalized intuitionistic fuzzy entropy measure to determine the completely unknown attribute weight to solve the decision-making problems. 
Although some researchers put some efforts into determining the unknown attribute weights [45,46,54,55] under different environments, still it remains an open problem.
Therefore, in an attempt to address such problems, and motivated by the ability of IFSs to describe the uncertainties in data, this paper proposes a new entropy measure to quantify the degree of fuzziness of a set in the IFS environment. The aim of this entropy is to determine the attribute weights when they are either partially known or completely unknown. For this, we propose a novel entropy measure named the (R, S)-norm-based information measure, which makes the decision more flexible and reliable corresponding to different values of the parameters R and S. Some of the desirable properties of the proposed measures are investigated, and some of their correlations are derived. Some of the existing measures are obtained from the proposed entropy measures as special cases. Furthermore, we propose two approaches for solving the MADM problem based on the proposed entropy measures, considering the attribute weights to be either partially known or completely unknown. Two illustrative examples are considered to demonstrate the approach and to compare the results with those of some existing approaches.
The rest of this paper is organized as follows. In Section 2, we present some basic concepts of IFSs and the existing entropy measures. In Section 3, we propose a new ( R , S )-norm-based information measure in the IFS environment. Various desirable relations among the approaches are also investigated in detail. Section 4 describes two approaches for solving the MADM problem with the condition that attribute weights are either partially known or completely unknown. The developed approaches have been illustrated with a numerical example. Finally, a concrete conclusion and discussion are presented in Section 5.

2. Preliminaries

Some basic concepts related to IFSs and their operations, defined over the universal set X, are highlighted in this section.
Definition 1.
[2] An IFS A defined on X is an ordered pair given by:
A = {⟨x, ζ_A(x), ϑ_A(x)⟩ ∣ x ∈ X}
where ζ_A, ϑ_A : X → [0, 1] represent, respectively, the membership and non-membership degrees of the element x, such that ζ_A(x), ϑ_A(x) ∈ [0, 1] and ζ_A(x) + ϑ_A(x) ≤ 1 for all x. For convenience, this pair is denoted by A = ⟨ζ_A, ϑ_A⟩ and called an intuitionistic fuzzy number (IFN) [4,5].
Definition 2.
[4,5] Let IFS(X) denote the family of all intuitionistic fuzzy sets over the universal set X, and let A, B ∈ IFS(X). Then some operations can be defined as follows:
  • A ⊆ B if ζ_A(x) ≤ ζ_B(x) and ϑ_A(x) ≥ ϑ_B(x), for all x ∈ X;
  • A ⊇ B if ζ_A(x) ≥ ζ_B(x) and ϑ_A(x) ≤ ϑ_B(x), for all x ∈ X;
  • A = B iff ζ_A(x) = ζ_B(x) and ϑ_A(x) = ϑ_B(x), for all x ∈ X;
  • A ∪ B = {⟨x, max(ζ_A(x), ζ_B(x)), min(ϑ_A(x), ϑ_B(x))⟩ : x ∈ X};
  • A ∩ B = {⟨x, min(ζ_A(x), ζ_B(x)), max(ϑ_A(x), ϑ_B(x))⟩ : x ∈ X};
  • A^c = {⟨x, ϑ_A(x), ζ_A(x)⟩ : x ∈ X}.
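For readers who prefer code, the operations above can be sketched as follows. This is an illustrative aid of ours, not part of the paper: an IFS over a finite universe is stored as a list of (membership, non-membership) pairs, and all function names are our own.

```python
# Illustrative sketch (ours, not the paper's): the set operations of
# Definition 2 for IFSs stored as lists of (membership, non-membership) pairs.

def ifs_union(A, B):
    # A ∪ B: pointwise max of memberships, min of non-memberships
    return [(max(za, zb), min(va, vb)) for (za, va), (zb, vb) in zip(A, B)]

def ifs_intersection(A, B):
    # A ∩ B: pointwise min of memberships, max of non-memberships
    return [(min(za, zb), max(va, vb)) for (za, va), (zb, vb) in zip(A, B)]

def ifs_complement(A):
    # A^c: swap the membership and non-membership degrees
    return [(v, z) for (z, v) in A]

def ifs_subset(A, B):
    # A ⊆ B: memberships no larger and non-memberships no smaller, at every point
    return all(za <= zb and va >= vb for (za, va), (zb, vb) in zip(A, B))

A = [(0.3, 0.5), (0.6, 0.2)]
B = [(0.4, 0.4), (0.7, 0.1)]
print(ifs_union(A, B))   # [(0.4, 0.4), (0.7, 0.1)]
print(ifs_subset(A, B))  # True
```

Note that the de Morgan-style duality A ∪ B = (A^c ∩ B^c)^c follows directly from these definitions.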
Definition 3.
[19] An entropy E : IFS(X) → ℝ⁺ is a real-valued functional satisfying the following four axioms for A, B ∈ IFS(X):
(P1)
E(A) = 0 if and only if A is a crisp set, i.e., either ζ_A(x) = 1, ϑ_A(x) = 0 or ζ_A(x) = 0, ϑ_A(x) = 1 for all x ∈ X.
(P2)
E(A) = 1 if and only if ζ_A(x) = ϑ_A(x) for all x ∈ X.
(P3)
E(A) = E(A^c).
(P4)
If A ⊆ B, that is, if ζ_A(x) ≤ ζ_B(x) and ϑ_A(x) ≥ ϑ_B(x) for any x ∈ X, then E(A) ≤ E(B).
Vlachos and Sergiadis [20] proposed the measure of intuitionistic fuzzy entropy in the IFS environment as follows:
E(A) = −(1/(n ln 2)) Σ_{i=1}^{n} [ζ_A(x_i) ln ζ_A(x_i) + ϑ_A(x_i) ln ϑ_A(x_i) − (1 − π_A(x_i)) ln(1 − π_A(x_i)) − π_A(x_i) ln 2]
Zhang and Jiang [25] presented a measure of intuitionistic fuzzy entropy based on a generalization of measure of Deluca and Termini [18] as:
E(A) = −(1/n) Σ_{i=1}^{n} [((ζ_A(x_i) + 1 − ϑ_A(x_i))/2) log((ζ_A(x_i) + 1 − ϑ_A(x_i))/2) + ((ϑ_A(x_i) + 1 − ζ_A(x_i))/2) log((ϑ_A(x_i) + 1 − ζ_A(x_i))/2)]
Verma and Sharma [26] proposed an exponential order entropy in the IFS environment as:
E(A) = (1/(n(√e − 1))) Σ_{i=1}^{n} [((ζ_A(x_i) + 1 − ϑ_A(x_i))/2) e^{1 − (ζ_A(x_i) + 1 − ϑ_A(x_i))/2} + ((ϑ_A(x_i) + 1 − ζ_A(x_i))/2) e^{1 − (ϑ_A(x_i) + 1 − ζ_A(x_i))/2} − 1]
Garg et al. [22] generalized entropy measure E α β ( A ) of order α and degree β as:
E_α^β(A) = ((2 − β)/(n(2 − β − α))) Σ_{i=1}^{n} log[(ζ_A^{α/(2−β)}(x_i) + ϑ_A^{α/(2−β)}(x_i)) (ζ_A(x_i) + ϑ_A(x_i))^{1 − α/(2−β)} + 2^{1 − α/(2−β)} (1 − ζ_A(x_i) − ϑ_A(x_i))]
where log is to base two, α > 0, β ∈ [0, 1], and α + β ≠ 2.

3. Proposed ( R , S )-Norm Intuitionistic Fuzzy Information Measure

In this section, we define a new ( R , S )-norm information measure, denoted by H R S , in the IFS environment. For it, let Ω be the collection of all IFSs.
Definition 4.
For a collection of IFSs A = {⟨x, ζ_A(x), ϑ_A(x)⟩ ∣ x ∈ X}, an information measure H_R^S : Ω^n → ℝ, n ≥ 2, is defined as follows:
H_R^S(A) =
(R·S)/(n(R − S)) Σ_{i=1}^{n} [(ζ_A^S(x_i) + ϑ_A^S(x_i) + π_A^S(x_i))^{1/S} − (ζ_A^R(x_i) + ϑ_A^R(x_i) + π_A^R(x_i))^{1/R}], when either R > 1, 0 < S < 1 or 0 < R < 1, S > 1;
R/(n(R − 1)) Σ_{i=1}^{n} [1 − (ζ_A^R(x_i) + ϑ_A^R(x_i) + π_A^R(x_i))^{1/R}], when S = 1, 0 < R < 1;
S/(n(1 − S)) Σ_{i=1}^{n} [(ζ_A^S(x_i) + ϑ_A^S(x_i) + π_A^S(x_i))^{1/S} − 1], when R = 1, 0 < S < 1;
−(1/n) Σ_{i=1}^{n} [ζ_A(x_i) log ζ_A(x_i) + ϑ_A(x_i) log ϑ_A(x_i) + π_A(x_i) log π_A(x_i)], when R = 1 = S.
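The following sketch (ours, not the paper's) implements the branch structure of Definition 4 in code. The limiting case R = S ≠ 1 is left out, and base-2 logarithms are assumed in the Shannon-type branch:

```python
import math

def entropy_RS(A, R, S):
    """(R,S)-norm intuitionistic fuzzy entropy of Definition 4 (a sketch).

    A is a list of (membership, non-membership) pairs; the hesitancy degree
    is pi = 1 - membership - non-membership.  The limiting case R = S != 1
    is not handled here.
    """
    n = len(A)
    total = 0.0
    for zeta, theta in A:
        pi = 1.0 - zeta - theta
        if R != 1 and S != 1:      # first case: R > 1, 0 < S < 1 or 0 < R < 1, S > 1
            es = (zeta**S + theta**S + pi**S) ** (1.0 / S)
            er = (zeta**R + theta**R + pi**R) ** (1.0 / R)
            total += R * S / (n * (R - S)) * (es - er)
        elif S == 1 and R != 1:    # second case of Definition 4
            er = (zeta**R + theta**R + pi**R) ** (1.0 / R)
            total += R / (n * (R - 1)) * (1.0 - er)
        elif R == 1 and S != 1:    # third case of Definition 4
            es = (zeta**S + theta**S + pi**S) ** (1.0 / S)
            total += S / (n * (1 - S)) * (es - 1.0)
        else:                      # R = S = 1: Shannon-type case (base 2 assumed)
            total -= sum(d * math.log2(d) for d in (zeta, theta, pi) if d > 0) / n
    return total

crisp = [(1.0, 0.0), (0.0, 1.0)]
print(entropy_RS(crisp, 2.0, 0.5))  # 0.0
```

As a quick sanity check, the value is 0 for crisp sets and is unchanged when memberships and non-memberships are swapped, in line with properties (P1) and (P4) of Theorem 1.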
Theorem 1.
An intuitionistic fuzzy entropy measure H R S ( A ) defined in Equation (6) for IFSs is a valid measure, i.e., it satisfies the following properties.
(P1)
H_R^S(A) = 0 if and only if A is a crisp set, i.e., ζ_A(x_i) = 1, ϑ_A(x_i) = 0 or ζ_A(x_i) = 0, ϑ_A(x_i) = 1 for all x_i ∈ X.
(P2)
H_R^S(A) = 1 if and only if ζ_A(x_i) = ϑ_A(x_i) for all x_i ∈ X.
(P3)
H_R^S(A) ≤ H_R^S(B) if A is crisper than B, i.e., if ζ_A(x_i) ≤ ζ_B(x_i) and ϑ_A(x_i) ≤ ϑ_B(x_i) for max{ζ_B(x_i), ϑ_B(x_i)} ≤ 1/3, and ζ_A(x_i) ≥ ζ_B(x_i) and ϑ_A(x_i) ≥ ϑ_B(x_i) for min{ζ_B(x_i), ϑ_B(x_i)} ≥ 1/3, for all x_i ∈ X.
(P4)
H_R^S(A) = H_R^S(A^c) for all A ∈ IFS(X).
Proof. 
To prove that the measure defined by Equation (6) is a valid information measure, we will have to prove that it satisfies the four properties defined in the definition of the intuitionistic fuzzy information measure.
  • Sharpness: In order to prove (P1), we need to show that H_R^S(A) = 0 if and only if A is a crisp set, i.e., either ζ_A(x) = 1, ϑ_A(x) = 0 or ζ_A(x) = 0, ϑ_A(x) = 1 for all x ∈ X.
    Firstly, we assume that H_R^S(A) = 0 for R, S > 0 and R ≠ S. Therefore, from Equation (6), we have:
    (R·S)/(n(R − S)) Σ_{i=1}^{n} [(ζ_A^S(x_i) + ϑ_A^S(x_i) + π_A^S(x_i))^{1/S} − (ζ_A^R(x_i) + ϑ_A^R(x_i) + π_A^R(x_i))^{1/R}] = 0
    ⟹ (ζ_A^S(x_i) + ϑ_A^S(x_i) + π_A^S(x_i))^{1/S} − (ζ_A^R(x_i) + ϑ_A^R(x_i) + π_A^R(x_i))^{1/R} = 0 for all i = 1, 2, …, n,
    i.e., (ζ_A^S(x_i) + ϑ_A^S(x_i) + π_A^S(x_i))^{1/S} = (ζ_A^R(x_i) + ϑ_A^R(x_i) + π_A^R(x_i))^{1/R} for all i = 1, 2, …, n.
    Since R, S > 0 and R ≠ S, the above equation is satisfied only if ζ_A(x_i) = 0, ϑ_A(x_i) = 1 or ζ_A(x_i) = 1, ϑ_A(x_i) = 0 for all i = 1, 2, …, n.
    Conversely, we assume that the set A = (ζ_A, ϑ_A) is a crisp set, i.e., either ζ_A(x_i) = 0 or 1. Now, for R, S > 0 and R ≠ S, we obtain:
    (ζ_A^S(x_i) + ϑ_A^S(x_i) + π_A^S(x_i))^{1/S} − (ζ_A^R(x_i) + ϑ_A^R(x_i) + π_A^R(x_i))^{1/R} = 0
    for all i = 1, 2, …, n, which gives H_R^S(A) = 0.
    Hence, H_R^S(A) = 0 iff A is a crisp set.
  • Maximality: We will find the maxima of the function H_R^S(A); for this purpose, we differentiate Equation (6) with respect to ζ_A(x_i) and ϑ_A(x_i). We get:
    ∂H_R^S(A)/∂ζ_A(x_i) = (R·S)/(n(R − S)) Σ_{i=1}^{n} [(ζ_A^S(x_i) + ϑ_A^S(x_i) + π_A^S(x_i))^{(1−S)/S} (ζ_A^{S−1}(x_i) − π_A^{S−1}(x_i)) − (ζ_A^R(x_i) + ϑ_A^R(x_i) + π_A^R(x_i))^{(1−R)/R} (ζ_A^{R−1}(x_i) − π_A^{R−1}(x_i))]
    and:
    ∂H_R^S(A)/∂ϑ_A(x_i) = (R·S)/(n(R − S)) Σ_{i=1}^{n} [(ζ_A^S(x_i) + ϑ_A^S(x_i) + π_A^S(x_i))^{(1−S)/S} (ϑ_A^{S−1}(x_i) − π_A^{S−1}(x_i)) − (ζ_A^R(x_i) + ϑ_A^R(x_i) + π_A^R(x_i))^{(1−R)/R} (ϑ_A^{R−1}(x_i) − π_A^{R−1}(x_i))]
    In order to check the concavity of the function, we calculate its second-order derivatives as follows:
    ∂²H_R^S(A)/∂ζ_A²(x_i) = (R·S)/(n(R − S)) Σ_{i=1}^{n} [(1 − S)(ζ_A^S(x_i) + ϑ_A^S(x_i) + π_A^S(x_i))^{(1−2S)/S} (ζ_A^{S−1}(x_i) − π_A^{S−1}(x_i))² + (S − 1)(ζ_A^S(x_i) + ϑ_A^S(x_i) + π_A^S(x_i))^{(1−S)/S} (ζ_A^{S−2}(x_i) + π_A^{S−2}(x_i)) − (1 − R)(ζ_A^R(x_i) + ϑ_A^R(x_i) + π_A^R(x_i))^{(1−2R)/R} (ζ_A^{R−1}(x_i) − π_A^{R−1}(x_i))² − (R − 1)(ζ_A^R(x_i) + ϑ_A^R(x_i) + π_A^R(x_i))^{(1−R)/R} (ζ_A^{R−2}(x_i) + π_A^{R−2}(x_i))]
    ∂²H_R^S(A)/∂ϑ_A²(x_i) = (R·S)/(n(R − S)) Σ_{i=1}^{n} [(1 − S)(ζ_A^S(x_i) + ϑ_A^S(x_i) + π_A^S(x_i))^{(1−2S)/S} (ϑ_A^{S−1}(x_i) − π_A^{S−1}(x_i))² + (S − 1)(ζ_A^S(x_i) + ϑ_A^S(x_i) + π_A^S(x_i))^{(1−S)/S} (ϑ_A^{S−2}(x_i) + π_A^{S−2}(x_i)) − (1 − R)(ζ_A^R(x_i) + ϑ_A^R(x_i) + π_A^R(x_i))^{(1−2R)/R} (ϑ_A^{R−1}(x_i) − π_A^{R−1}(x_i))² − (R − 1)(ζ_A^R(x_i) + ϑ_A^R(x_i) + π_A^R(x_i))^{(1−R)/R} (ϑ_A^{R−2}(x_i) + π_A^{R−2}(x_i))]
    and
    ∂²H_R^S(A)/∂ϑ_A(x_i)∂ζ_A(x_i) = (R·S)/(n(R − S)) Σ_{i=1}^{n} [(1 − S)(ζ_A^S(x_i) + ϑ_A^S(x_i) + π_A^S(x_i))^{(1−2S)/S} (ϑ_A^{S−1}(x_i) − π_A^{S−1}(x_i))(ζ_A^{S−1}(x_i) − π_A^{S−1}(x_i)) − (1 − R)(ζ_A^R(x_i) + ϑ_A^R(x_i) + π_A^R(x_i))^{(1−2R)/R} (ϑ_A^{R−1}(x_i) − π_A^{R−1}(x_i))(ζ_A^{R−1}(x_i) − π_A^{R−1}(x_i))]
    To find the maximum/minimum point, we set ∂H_R^S(A)/∂ζ_A(x_i) = 0 and ∂H_R^S(A)/∂ϑ_A(x_i) = 0, which gives ζ_A(x_i) = ϑ_A(x_i) = π_A(x_i) = 1/3 for all i; this is the critical point of the function H_R^S.
    (a)
    When R < 1 and S > 1, at the critical point ζ_A(x_i) = ϑ_A(x_i) = π_A(x_i) = 1/3, we compute that:
    ∂²H_R^S(A)/∂ζ_A²(x_i) < 0 and (∂²H_R^S(A)/∂ζ_A²(x_i)) · (∂²H_R^S(A)/∂ϑ_A²(x_i)) − (∂²H_R^S(A)/∂ϑ_A(x_i)∂ζ_A(x_i))² > 0
    Therefore, the Hessian matrix of H_R^S(A) is negative semi-definite, and hence H_R^S(A) is a concave function. As the critical point of H_R^S is ζ_A = ϑ_A = 1/3, by concavity, H_R^S(A) has a relative maximum at ζ_A = ϑ_A = 1/3.
    (b)
    When R > 1 and S < 1, at the critical point, we can again easily obtain that:
    ∂²H_R^S(A)/∂ζ_A²(x_i) < 0 and (∂²H_R^S(A)/∂ζ_A²(x_i)) · (∂²H_R^S(A)/∂ϑ_A²(x_i)) − (∂²H_R^S(A)/∂ϑ_A(x_i)∂ζ_A(x_i))² > 0
    This proves that H_R^S(A) is a concave function with its global maximum at ζ_A(x_i) = ϑ_A(x_i) = 1/3.
    Thus, for all R, S > 0 with either R < 1, S > 1 or R > 1, S < 1, the global maximum value of H_R^S(A) is attained at ζ_A(x_i) = ϑ_A(x_i) = 1/3, i.e., H_R^S(A) is maximum if and only if A is the most fuzzy set.
  • Resolution: In order to prove that our proposed entropy function is monotonically increasing with respect to ζ_A(x_i) and monotonically decreasing with respect to ϑ_A(x_i), for convenience, let ζ_A(x_i) = x, ϑ_A(x_i) = y and π_A(x_i) = 1 − x − y; then it is sufficient to prove that for R, S > 0, R ≠ S, the entropy function:
    f(x, y) = (R·S)/(n(R − S)) [(x^S + y^S + (1 − x − y)^S)^{1/S} − (x^R + y^R + (1 − x − y)^R)^{1/R}]
    where x, y ∈ [0, 1], is an increasing function w.r.t. x and a decreasing function w.r.t. y.
    Taking the partial derivative of f with respect to x and y respectively, we get:
    ∂f/∂x = (R·S)/(n(R − S)) [(x^S + y^S + (1 − x − y)^S)^{(1−S)/S} (x^{S−1} − (1 − x − y)^{S−1}) − (x^R + y^R + (1 − x − y)^R)^{(1−R)/R} (x^{R−1} − (1 − x − y)^{R−1})]
    and:
    ∂f/∂y = (R·S)/(n(R − S)) [(x^S + y^S + (1 − x − y)^S)^{(1−S)/S} (y^{S−1} − (1 − x − y)^{S−1}) − (x^R + y^R + (1 − x − y)^R)^{(1−R)/R} (y^{R−1} − (1 − x − y)^{R−1})]
    For the extreme point of f, we set ∂f/∂x = 0 and ∂f/∂y = 0, and get x = y = 1/3.
    Furthermore, for R, S > 0 and R ≠ S, we have ∂f/∂x ≥ 0 when x ≤ y and ∂f/∂x ≤ 0 when x ≥ y, i.e., f(x, y) is increasing with respect to x for x ≤ y and decreasing for x ≥ y. On the other hand, ∂f/∂y ≥ 0 and ∂f/∂y ≤ 0 when x ≥ y and x ≤ y, respectively.
    Further, since H_R^S(A) is a concave function of the IFS A, if max{ζ_B(x_i), ϑ_B(x_i)} ≤ 1/3 with ζ_A(x_i) ≤ ζ_B(x_i) and ϑ_A(x_i) ≤ ϑ_B(x_i), this implies that:
    ζ_A(x_i) ≤ ζ_B(x_i) ≤ 1/3; ϑ_A(x_i) ≤ ϑ_B(x_i) ≤ 1/3; π_A(x_i) ≥ π_B(x_i) ≥ 1/3
    Thus, we observe that (ζ_B(x_i), ϑ_B(x_i), π_B(x_i)) is closer to (1/3, 1/3, 1/3) than (ζ_A(x_i), ϑ_A(x_i), π_A(x_i)). Hence, H_R^S(A) ≤ H_R^S(B).
    Similarly, if min{ζ_B(x_i), ϑ_B(x_i)} ≥ 1/3, then we get H_R^S(A) ≤ H_R^S(B).
  • Symmetry: By the definition of H R S ( A ) , we can easily obtain that H R S ( A c ) = H R S ( A ) .
Hence H R S ( A ) satisfies all the properties of the intuitionistic fuzzy information measure and, therefore, is a valid measure of intuitionistic fuzzy entropy. ☐
Consider two IFSs A and B defined over X = {x_1, x_2, …, x_n}. Take the disjoint partition of X as:
X_1 = {x_i ∈ X ∣ A ⊆ B} = {x_i ∈ X ∣ ζ_A(x_i) ≤ ζ_B(x_i); ϑ_A(x_i) ≥ ϑ_B(x_i)}
and:
X_2 = {x_i ∈ X ∣ A ⊇ B} = {x_i ∈ X ∣ ζ_A(x_i) ≥ ζ_B(x_i); ϑ_A(x_i) ≤ ϑ_B(x_i)}
Next, we define the joint and conditional entropies between IFSs A and B as follows:
  • Joint entropy:
    H_R^S(A ∪ B) = (R·S)/(n(R − S)) Σ_{i=1}^{n} [(ζ_{A∪B}^S(x_i) + ϑ_{A∪B}^S(x_i) + π_{A∪B}^S(x_i))^{1/S} − (ζ_{A∪B}^R(x_i) + ϑ_{A∪B}^R(x_i) + π_{A∪B}^R(x_i))^{1/R}]
    = (R·S)/(n(R − S)) Σ_{x_i ∈ X_1} [(ζ_B^S(x_i) + ϑ_B^S(x_i) + π_B^S(x_i))^{1/S} − (ζ_B^R(x_i) + ϑ_B^R(x_i) + π_B^R(x_i))^{1/R}] + (R·S)/(n(R − S)) Σ_{x_i ∈ X_2} [(ζ_A^S(x_i) + ϑ_A^S(x_i) + π_A^S(x_i))^{1/S} − (ζ_A^R(x_i) + ϑ_A^R(x_i) + π_A^R(x_i))^{1/R}]
    where π_{A∪B}(x_i) = 1 − ζ_{A∪B}(x_i) − ϑ_{A∪B}(x_i).
  • Conditional entropy:
    H_R^S(A|B) = (R·S)/(n(R − S)) Σ_{x_i ∈ X_2} [(ζ_A^S(x_i) + ϑ_A^S(x_i) + π_A^S(x_i))^{1/S} − (ζ_A^R(x_i) + ϑ_A^R(x_i) + π_A^R(x_i))^{1/R} − (ζ_B^S(x_i) + ϑ_B^S(x_i) + π_B^S(x_i))^{1/S} + (ζ_B^R(x_i) + ϑ_B^R(x_i) + π_B^R(x_i))^{1/R}]
    and:
    H_R^S(B|A) = (R·S)/(n(R − S)) Σ_{x_i ∈ X_1} [(ζ_B^S(x_i) + ϑ_B^S(x_i) + π_B^S(x_i))^{1/S} − (ζ_B^R(x_i) + ϑ_B^R(x_i) + π_B^R(x_i))^{1/R} − (ζ_A^S(x_i) + ϑ_A^S(x_i) + π_A^S(x_i))^{1/S} + (ζ_A^R(x_i) + ϑ_A^R(x_i) + π_A^R(x_i))^{1/R}]
Theorem 2.
Let A and B be two IFSs defined on the universal set X = {x_1, x_2, …, x_n}, where A = {⟨x_i, ζ_A(x_i), ϑ_A(x_i)⟩ ∣ x_i ∈ X} and B = {⟨x_i, ζ_B(x_i), ϑ_B(x_i)⟩ ∣ x_i ∈ X}, such that for each x_i ∈ X either A ⊆ B or A ⊇ B; then:
H_R^S(A ∪ B) + H_R^S(A ∩ B) = H_R^S(A) + H_R^S(B)
Proof. 
Let X 1 and X 2 be the two disjoint sets of X, where,
X_1 = {x ∈ X : A ⊆ B}, X_2 = {x ∈ X : A ⊇ B}
i.e., for x_i ∈ X_1 we have ζ_A(x_i) ≤ ζ_B(x_i), ϑ_A(x_i) ≥ ϑ_B(x_i), and for x_i ∈ X_2 we have ζ_A(x_i) ≥ ζ_B(x_i), ϑ_A(x_i) ≤ ϑ_B(x_i). Writing g_T(C; x_i) = (ζ_C^T(x_i) + ϑ_C^T(x_i) + π_C^T(x_i))^{1/T} for an IFS C and T ∈ {R, S}, we have:
H_R^S(A ∪ B) + H_R^S(A ∩ B)
= (R·S)/(n(R − S)) Σ_{i=1}^{n} [g_S(A ∪ B; x_i) − g_R(A ∪ B; x_i)] + (R·S)/(n(R − S)) Σ_{i=1}^{n} [g_S(A ∩ B; x_i) − g_R(A ∩ B; x_i)]
= (R·S)/(n(R − S)) { Σ_{x_i ∈ X_1} [g_S(B; x_i) − g_R(B; x_i)] + Σ_{x_i ∈ X_2} [g_S(A; x_i) − g_R(A; x_i)] + Σ_{x_i ∈ X_1} [g_S(A; x_i) − g_R(A; x_i)] + Σ_{x_i ∈ X_2} [g_S(B; x_i) − g_R(B; x_i)] }
= H_R^S(A) + H_R^S(B)
 ☐
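As an illustrative spot-check of Theorem 2 (our own example, using the general branch of Definition 4 and our choice of the pair (R, S) = (2, 0.5)), one can verify the identity numerically for sets satisfying A ⊆ B at some points and A ⊇ B at the others:

```python
# Numerical spot-check (illustrative, not from the paper) of Theorem 2:
# H(A ∪ B) + H(A ∩ B) = H(A) + H(B) when, at every point, either A ⊆ B or A ⊇ B.

def entropy(A, R, S):
    # general branch of Definition 4 (R > 1, 0 < S < 1)
    n = len(A)
    out = 0.0
    for z, v in A:
        p = 1.0 - z - v
        es = (z**S + v**S + p**S) ** (1.0 / S)
        er = (z**R + v**R + p**R) ** (1.0 / R)
        out += R * S / (n * (R - S)) * (es - er)
    return out

def union(A, B):
    return [(max(za, zb), min(va, vb)) for (za, va), (zb, vb) in zip(A, B)]

def inter(A, B):
    return [(min(za, zb), max(va, vb)) for (za, va), (zb, vb) in zip(A, B)]

# At x1 the pair satisfies A ⊆ B; at x2 it satisfies A ⊇ B.
A = [(0.2, 0.6), (0.7, 0.1)]
B = [(0.5, 0.3), (0.4, 0.5)]
R, S = 2.0, 0.5
lhs = entropy(union(A, B), R, S) + entropy(inter(A, B), R, S)
rhs = entropy(A, R, S) + entropy(B, R, S)
print(abs(lhs - rhs) < 1e-12)  # True
```

The identity holds because the union picks B's values on X_1 and A's on X_2, while the intersection picks the complementary values, so both sides sum exactly the same per-point terms.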
Theorem 3.
The maximum and minimum values of the entropy H_R^S(A) are independent of the parameters R and S.
Proof. 
From the above theorem, we conclude that the entropy is maximum if and only if A is the most fuzzy IFS and minimum when A is a crisp set. Therefore, it is enough to show that the value of H_R^S(A) in these conditions is independent of R and S. When A is the most fuzzy IFS, i.e., ζ_A(x_i) = ϑ_A(x_i) for all x_i ∈ X, then H_R^S(A) = 1, and when A is a crisp set, i.e., either ζ_A(x_i) = 0, ϑ_A(x_i) = 1 or ζ_A(x_i) = 1, ϑ_A(x_i) = 0 for all x_i ∈ X, then H_R^S(A) = 0. Hence, in both cases, H_R^S(A) is independent of the parameters R and S. ☐
Remark 1.
From the proposed measure, it is observed that some of the existing measures can be obtained from it by assigning particular cases to R and S. For instance,
  • When π_A(x_i) = 0 for all x_i ∈ X, the proposed measure reduces to the entropy measure of Joshi and Kumar [32].
  • When R = S and S > 0, the proposed measure reduces to the measure of Taneja [27].
  • When R = 1 and R ≠ S, the measure is equivalent to the R-norm entropy presented by Boekee and Van der Lubbe [28].
  • When R = S = 1, the proposed measure becomes the well-known Shannon entropy.
  • When S = 1 and R ≠ S, the proposed measure becomes the measure of Bajaj et al. [37].
Theorem 4.
Let A and B be two IFSs defined over the set X such that either A ⊆ B or B ⊆ A; then the following statements hold:
  • H_R^S(A ∪ B) = H_R^S(A) + H_R^S(B|A);
  • H_R^S(A ∪ B) = H_R^S(B) + H_R^S(A|B);
  • H_R^S(A ∪ B) = H_R^S(A) + H_R^S(B|A) = H_R^S(B) + H_R^S(A|B).
Proof. 
For two IFSs A and B and by using the definitions of joint, conditional and the proposed entropy measures, we get:
  • Consider:
    Writing g_T(C; x_i) = (ζ_C^T(x_i) + ϑ_C^T(x_i) + π_C^T(x_i))^{1/T} for an IFS C and T ∈ {R, S}:
    H_R^S(A ∪ B) − H_R^S(A) − H_R^S(B|A)
    = (R·S)/(n(R − S)) Σ_{i=1}^{n} [g_S(A ∪ B; x_i) − g_R(A ∪ B; x_i)] − (R·S)/(n(R − S)) Σ_{i=1}^{n} [g_S(A; x_i) − g_R(A; x_i)] − (R·S)/(n(R − S)) Σ_{x_i ∈ X_1} [g_S(B; x_i) − g_R(B; x_i) − g_S(A; x_i) + g_R(A; x_i)]
    = (R·S)/(n(R − S)) { Σ_{x_i ∈ X_1} [g_S(B; x_i) − g_R(B; x_i)] + Σ_{x_i ∈ X_2} [g_S(A; x_i) − g_R(A; x_i)] − Σ_{x_i ∈ X_1} [g_S(A; x_i) − g_R(A; x_i)] − Σ_{x_i ∈ X_2} [g_S(A; x_i) − g_R(A; x_i)] − Σ_{x_i ∈ X_1} [g_S(B; x_i) − g_R(B; x_i) − g_S(A; x_i) + g_R(A; x_i)] }
    = 0
  • Consider:
    Writing g_T(C; x_i) = (ζ_C^T(x_i) + ϑ_C^T(x_i) + π_C^T(x_i))^{1/T} for an IFS C and T ∈ {R, S}:
    H_R^S(A ∪ B) − H_R^S(B) − H_R^S(A|B)
    = (R·S)/(n(R − S)) Σ_{i=1}^{n} [g_S(A ∪ B; x_i) − g_R(A ∪ B; x_i)] − (R·S)/(n(R − S)) Σ_{i=1}^{n} [g_S(B; x_i) − g_R(B; x_i)] − (R·S)/(n(R − S)) Σ_{x_i ∈ X_2} [g_S(A; x_i) − g_R(A; x_i) − g_S(B; x_i) + g_R(B; x_i)]
    = (R·S)/(n(R − S)) { Σ_{x_i ∈ X_1} [g_S(B; x_i) − g_R(B; x_i)] + Σ_{x_i ∈ X_2} [g_S(A; x_i) − g_R(A; x_i)] − Σ_{x_i ∈ X_1} [g_S(B; x_i) − g_R(B; x_i)] − Σ_{x_i ∈ X_2} [g_S(B; x_i) − g_R(B; x_i)] − Σ_{x_i ∈ X_2} [g_S(A; x_i) − g_R(A; x_i) − g_S(B; x_i) + g_R(B; x_i)] }
    = 0
  • This can be deduced from Parts (1) and (2).
Before elaborating on the comparison between the proposed entropy function and other entropy functions, we state a definition [56] for an IFS of the form A = {⟨x, ζ_A(x), ϑ_A(x)⟩ ∣ x ∈ X} defined on the universal set X, which is as follows:
A^n = {⟨x, [ζ_A(x)]^n, 1 − [1 − ϑ_A(x)]^n⟩ ∣ x ∈ X}
Definition 5.
The concentration of an IFS A of the universe X is denoted by C O N ( A ) and is defined by:
CON(A) = {⟨x, ζ_{CON(A)}(x), ϑ_{CON(A)}(x)⟩ ∣ x ∈ X}
where ζ_{CON(A)}(x) = [ζ_A(x)]² and ϑ_{CON(A)}(x) = 1 − [1 − ϑ_A(x)]², i.e., the operation of the concentration of an IFS is defined by CON(A) = A².
Definition 6.
The dilation of an IFS A of the universe X is denoted by D I L ( A ) and is defined by:
DIL(A) = {⟨x, ζ_{DIL(A)}(x), ϑ_{DIL(A)}(x)⟩ ∣ x ∈ X}
where ζ_{DIL(A)}(x) = [ζ_A(x)]^{1/2} and ϑ_{DIL(A)}(x) = 1 − [1 − ϑ_A(x)]^{1/2}, i.e., the operation of the dilation of an IFS is defined by DIL(A) = A^{1/2}.
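The modifier A^n of Equation (12), and with it CON(A) = A² and DIL(A) = A^{1/2}, can be sketched in code as follows (illustrative code of ours; it reproduces the sets listed in Example 1 below):

```python
def ifs_power(A, k):
    # A^k from Equation (12): memberships raised to k, 1 - (1 - nonmembership)^k
    return [(z**k, 1.0 - (1.0 - v)**k) for z, v in A]

def concentration(A):
    # CON(A) = A^2
    return ifs_power(A, 2)

def dilation(A):
    # DIL(A) = A^(1/2)
    return ifs_power(A, 0.5)

LARGE = [(0.1, 0.8), (0.3, 0.5), (0.5, 0.4), (0.9, 0.0), (1.0, 0.0)]
print([(round(z, 4), round(v, 4)) for z, v in concentration(LARGE)])
# [(0.01, 0.96), (0.09, 0.75), (0.25, 0.64), (0.81, 0.0), (1.0, 0.0)]
```

Applying `dilation` to the same set yields the values listed for A^{1/2} in Example 1, e.g. (0.3162, 0.5528) at x_1.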
Example 1.
Consider a universe of discourse X = {x_1, x_2, x_3, x_4, x_5}, and an IFS A “LARGE” of X may be defined by:
LARGE = {(x_1, 0.1, 0.8), (x_2, 0.3, 0.5), (x_3, 0.5, 0.4), (x_4, 0.9, 0), (x_5, 1, 0)}
Using the operations as defined in Equation (12), we have generated the following IFSs
A^{1/2}, A^2, A^3, A^4,
which are defined as follows:
A^{1/2} may be treated as “More or less LARGE”
A^2 may be treated as “Very LARGE”
A^3 may be treated as “Quite very LARGE”
A^4 may be treated as “Very very LARGE”
and their corresponding sets are computed as:
A^(1/2) = { (x_1, 0.3162, 0.5528), (x_2, 0.5477, 0.2929), (x_3, 0.7071, 0.2254), (x_4, 0.9487, 0), (x_5, 1, 0) }
A^2 = { (x_1, 0.01, 0.96), (x_2, 0.09, 0.75), (x_3, 0.25, 0.64), (x_4, 0.81, 0), (x_5, 1, 0) }
A^3 = { (x_1, 0.001, 0.9920), (x_2, 0.0270, 0.8750), (x_3, 0.1250, 0.7840), (x_4, 0.7290, 0), (x_5, 1, 0) }
A^4 = { (x_1, 0.0001, 0.9984), (x_2, 0.0081, 0.9375), (x_3, 0.0625, 0.8704), (x_4, 0.6561, 0), (x_5, 1, 0) }
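The power, concentration and dilation operations above can be sketched in a few lines and checked against the sets just computed. This is an illustrative sketch, not code from the paper: the helper names `ifs_power`, `con` and `dil` are ours, and an IFS is assumed to be stored as a list of (membership, non-membership) pairs.

```python
def ifs_power(A, p):
    """A^p = { <x, mu^p, 1 - (1 - nu)^p> : x in X }, as in Equation (12)."""
    return [(mu ** p, 1 - (1 - nu) ** p) for mu, nu in A]

def con(A):
    """Concentration CON(A) = A^2 (Definition 5)."""
    return ifs_power(A, 2)

def dil(A):
    """Dilation DIL(A) = A^(1/2) (Definition 6)."""
    return ifs_power(A, 0.5)

# The IFS "LARGE" of Example 1 as (membership, non-membership) pairs.
large = [(0.1, 0.8), (0.3, 0.5), (0.5, 0.4), (0.9, 0.0), (1.0, 0.0)]

very_large = con(large)       # reproduces A^2, e.g. (0.01, 0.96) at x1
more_or_less = dil(large)     # reproduces A^(1/2), e.g. (0.3162, 0.5528) at x1
```

Rounding the computed pairs to four decimal places reproduces the sets A^(1/2) and A^2 listed above.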
From the viewpoint of these mathematical operations, the entropy values of the IFSs A^(1/2), A, A^2, A^3 and A^4 defined above should satisfy the requirement:
E(A^(1/2)) > E(A) > E(A^2) > E(A^3) > E(A^4)
Based on the dataset above, we compute the entropy measures of these sets at different values of R and S. The results corresponding to these different pairs of values are summarized in Table 1, along with the results of the existing approaches. From the computed values, it is observed that, for suitable pairs of (R, S), the ranking order of the linguistic variables produced by the proposed entropy follows the pattern described in Equation (13), while the order corresponding to [19,21,57] and [58] is E(A) > E(A^(1/2)) > E(A^2) > E(A^3) > E(A^4), which does not satisfy the requirement given in Equation (13). Hence, the proposed entropy measure is a good alternative and performs better than the existing measures. Furthermore, the different pairs of (R, S) give the decision-maker more choices for assessing the alternatives from the viewpoint of structured linguistic variables.
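The computation behind Table 1 can be reproduced with a short script. The following is a minimal sketch, assuming the summand form of the proposed measure (Equation (16)) with π = 1 − ζ − ϑ; the function names are ours, not the paper's.

```python
def h_rs(A, R, S):
    """Proposed (R,S)-norm entropy of an IFS given as (mu, nu) pairs."""
    n = len(A)
    coeff = (R * S) / (n * (R - S))
    total = 0.0
    for mu, nu in A:
        pi = max(1.0 - mu - nu, 0.0)   # hesitancy degree; clamp tiny round-off
        total += ((mu ** S + nu ** S + pi ** S) ** (1.0 / S)
                  - (mu ** R + nu ** R + pi ** R) ** (1.0 / R))
    return coeff * total

def ifs_power(A, p):
    """A^p as in Equation (12)."""
    return [(mu ** p, 1 - (1 - nu) ** p) for mu, nu in A]

large = [(0.1, 0.8), (0.3, 0.5), (0.5, 0.4), (0.9, 0.0), (1.0, 0.0)]

R, S = 0.3, 2
entropies = [h_rs(ifs_power(large, p), R, S) for p in (0.5, 1, 2, 3, 4)]
# entropies is strictly decreasing (approx. 2.3615, 2.3589, 1.8624, 1.4312,
# 1.1246), i.e. the requirement of Equation (13) holds for this (R, S) pair.
```

For this pair the ordering E(A^(1/2)) > E(A) > E(A^2) > E(A^3) > E(A^4) is recovered, matching the first H_RS row of Table 1.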

4. MADM Problem Based on the Proposed Entropy Measure

In this section, we present a method for solving the MADM problem based on the proposed entropy measure.

4.1. Approach I: When the Attribute Weight Is Completely Unknown

In this section, we present a decision-making approach for solving the multi-attribute decision-making problem in the intuitionistic fuzzy set environment. For this, consider a set of n different alternatives, denoted by A_1, A_2, …, A_n, which are evaluated by a decision-maker under m different attributes G_1, G_2, …, G_m. Assume that the decision-maker has evaluated these alternatives in the intuitionistic fuzzy environment and recorded the rating values in the form of the IFNs α_ij = ⟨ζ_ij, ϑ_ij⟩, where ζ_ij denotes the degree to which the alternative A_i satisfies the attribute G_j, while ϑ_ij denotes the degree to which A_i dissatisfies G_j, such that ζ_ij, ϑ_ij ∈ [0, 1] and ζ_ij + ϑ_ij ≤ 1 for i = 1, 2, …, n and j = 1, 2, …, m. Further, assume that the weight ω_j (j = 1, 2, …, m) of each attribute is completely unknown. Hence, based on the decision-maker's preferences α_ij, the collective values are summarized in the form of the decision matrix D as follows:
D =
        G_1            G_2            …    G_m
A_1     ⟨ζ_11, ϑ_11⟩   ⟨ζ_12, ϑ_12⟩   …    ⟨ζ_1m, ϑ_1m⟩
A_2     ⟨ζ_21, ϑ_21⟩   ⟨ζ_22, ϑ_22⟩   …    ⟨ζ_2m, ϑ_2m⟩
⋮       ⋮              ⋮              ⋱    ⋮
A_n     ⟨ζ_n1, ϑ_n1⟩   ⟨ζ_n2, ϑ_n2⟩   …    ⟨ζ_nm, ϑ_nm⟩
Then, the following steps of the proposed approach are summarized to find the best alternative(s).
Step 1:
Normalize the rating values of the decision-maker, if required, by converting the rating values corresponding to the cost type attribute into the benefit type. For this, the following normalization formula is used:
r_ij = ⟨ζ_ij, ϑ_ij⟩, for a benefit-type attribute; ⟨ϑ_ij, ζ_ij⟩, for a cost-type attribute
and hence, we obtain the normalized IF decision matrix R = ( r i j ) n × m .
Step 2:
Based on the matrix R, the information entropy of attribute G j ( j = 1 , 2 , , m ) is computed as:
(H_RS)_j = [R × S / (n(R − S))] Σ_{i=1}^n [ (ζ_ij^S + ϑ_ij^S + π_ij^S)^(1/S) − (ζ_ij^R + ϑ_ij^R + π_ij^R)^(1/R) ]
where R , S > 0 and R S .
Step 3:
Based on the entropy values H_RS(α_ij) defined in Equation (16), the degree of divergence d_j of the average intrinsic information provided by the ratings on the attribute G_j can be defined as d_j = 1 − κ_j, where κ_j = Σ_{i=1}^n H_RS(α_ij), j = 1, 2, …, m. The value of d_j represents the inherent contrast intensity of the attribute G_j, and hence, based on it, the attribute weights ω_j (j = 1, 2, …, m) are given as:
ω_j = d_j / Σ_{j=1}^m d_j = (1 − κ_j) / Σ_{j=1}^m (1 − κ_j) = (1 − κ_j) / (m − Σ_{j=1}^m κ_j)
Step 4:
Construct the weighted sum of each alternative by multiplying the score function of each criterion by its assigned weight as:
Q(A_i) = Σ_{j=1}^m ω_j (ζ_ij − ϑ_ij); i = 1, 2, …, n
Step 5:
Rank all the alternatives A_i (i = 1, 2, …, n) in descending order of their Q(A_i) values and, hence, choose the best alternative.
The above-mentioned approach is illustrated with a practical example below.
Example 2.
Consider a decision-making problem from the recruitment sector. Assume that a pharmaceutical company wants to select a lab technician for a micro-bio laboratory. For this, the company has published a notification in a newspaper and considered four attributes for the selection, namely academic record (G_1), personal interview evaluation (G_2), experience (G_3) and technical capability (G_4). On the basis of the notification conditions, only five candidates, A_1, A_2, A_3, A_4 and A_5, are shortlisted as alternatives and presented to the panel of experts for this post. The main objective of the company is to choose the best candidate among them. In order to describe the ambiguity and uncertainty in the data, the preferences for each alternative are represented in the IFS environment, in the form of IFNs, as follows:
D =
        G_1          G_2          G_3          G_4
A_1     ⟨0.7, 0.2⟩   ⟨0.5, 0.4⟩   ⟨0.6, 0.2⟩   ⟨0.6, 0.3⟩
A_2     ⟨0.7, 0.1⟩   ⟨0.5, 0.2⟩   ⟨0.7, 0.2⟩   ⟨0.4, 0.5⟩
A_3     ⟨0.6, 0.3⟩   ⟨0.5, 0.1⟩   ⟨0.5, 0.3⟩   ⟨0.6, 0.2⟩
A_4     ⟨0.8, 0.1⟩   ⟨0.6, 0.3⟩   ⟨0.3, 0.7⟩   ⟨0.6, 0.3⟩
A_5     ⟨0.6, 0.3⟩   ⟨0.4, 0.6⟩   ⟨0.7, 0.2⟩   ⟨0.5, 0.4⟩
Then, the steps of the proposed approach are followed to find the best alternative(s) as below:
Step 1:
Since all the attributes are of the same type, there is no need for the normalization process.
Step 2:
Without loss of generality, we take R = 0.3 and S = 2 and compute the entropy value of each attribute by using Equation (16). The results are H_RS(G_1) = 3.4064, H_RS(G_2) = 3.372, H_RS(G_3) = 3.2491 and H_RS(G_4) = 3.7564.
Step 3:
Based on these entropy values, the weight of each criterion is calculated as ω = (0.2459, 0.2425, 0.2298, 0.2817)^T.
Step 4:
The overall weighted score values of the alternatives corresponding to R = 0.3, S = 2 and ω = (0.2459, 0.2425, 0.2298, 0.2817)^T, obtained by using Equation (18), are Q(A_1) = 0.3237, Q(A_2) = 0.3071, Q(A_3) = 0.3294, Q(A_4) = 0.2375 and Q(A_5) = 0.1684.
Step 5:
Since Q(A_3) > Q(A_1) > Q(A_2) > Q(A_4) > Q(A_5), the ranking order of the alternatives is A_3 ≻ A_1 ≻ A_2 ≻ A_4 ≻ A_5. Thus, the best alternative is A_3.
However, in order to analyze the influence of the parameters R and S on the final ranking order of the alternatives, the steps of the proposed approach were executed for R varying from 0.1 to 0.9 and S from 1.2 to 5.0. The overall score values of each alternative, along with the ranking order, are summarized in Table 2. From this analysis, we conclude that the decision-maker can choose the values of R and S, and hence the corresponding alternative, according to his or her goal. Therefore, the proposed measures give the decision-maker various choices for reaching the target.
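The five steps of Approach I can be sketched end-to-end on the data of Example 2. This is an illustrative implementation under our own naming conventions, using the decision matrix above with R = 0.3 and S = 2:

```python
R, S = 0.3, 2

# Decision matrix of Example 2: rows = alternatives A1..A5,
# columns = attributes G1..G4, entries = IFNs (membership, non-membership).
D = [
    [(0.7, 0.2), (0.5, 0.4), (0.6, 0.2), (0.6, 0.3)],   # A1
    [(0.7, 0.1), (0.5, 0.2), (0.7, 0.2), (0.4, 0.5)],   # A2
    [(0.6, 0.3), (0.5, 0.1), (0.5, 0.3), (0.6, 0.2)],   # A3
    [(0.8, 0.1), (0.6, 0.3), (0.3, 0.7), (0.6, 0.3)],   # A4
    [(0.6, 0.3), (0.4, 0.6), (0.7, 0.2), (0.5, 0.4)],   # A5
]
n, m = len(D), len(D[0])

def attribute_entropy(j):
    """(H_RS)_j of Equation (16) for attribute G_j, summed over alternatives."""
    coeff = (R * S) / (n * (R - S))
    total = 0.0
    for i in range(n):
        mu, nu = D[i][j]
        pi = max(1.0 - mu - nu, 0.0)
        total += ((mu ** S + nu ** S + pi ** S) ** (1.0 / S)
                  - (mu ** R + nu ** R + pi ** R) ** (1.0 / R))
    return coeff * total

kappa = [attribute_entropy(j) for j in range(m)]          # Step 2
d = [1.0 - k for k in kappa]                              # Step 3: divergences
omega = [dj / sum(d) for dj in d]                         # entropy weights
Q = [sum(omega[j] * (mu - nu) for j, (mu, nu) in enumerate(row))
     for row in D]                                        # Step 4: Equation (18)
ranking = sorted(range(n), key=lambda i: -Q[i])           # Step 5 (0-based)
# ranking == [2, 0, 1, 3, 4], i.e. A3 > A1 > A2 > A4 > A5
```

Running this reproduces the entropies, weights and scores reported in Steps 2–4 to the printed precision.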

4.2. Approach II: When the Attribute Weight Is Partially Known

In this section, we present an approach for solving the multi-attribute decision-making problem in the IFS environment where the information about the attribute weight is partially known. The description of the MADM problem is mentioned in Section 4.1.
Since decision-making in real-life situations is highly complex due to a large number of constraints, and since human thinking is inherently subjective, the importance of the attribute weight vector is often only incompletely known. In order to represent this incomplete information about the weights, the following relationships have been defined for i ≠ j:
  • A weak ranking: ω_i ≥ ω_j;
  • A strict ranking: ω_i − ω_j ≥ σ_i (σ_i > 0);
  • A ranking with multiples: ω_i ≥ σ_i ω_j (0 ≤ σ_i ≤ 1);
  • An interval form: λ_i ≤ ω_i ≤ λ_i + δ_i (0 ≤ λ_i ≤ λ_i + δ_i ≤ 1);
  • A ranking of differences: ω_i − ω_j ≥ ω_k − ω_l (j ≠ k ≠ l).
The set of this known weight information is denoted by Δ in this paper.
Then, the proposed approach is summarized in the following steps to obtain the most desirable alternative(s).
Step 1:
Similar to Approach I.
Step 2:
Similar to Approach I.
Step 3:
The overall entropy of the alternative A_i (i = 1, 2, …, n) over the attributes G_j (j = 1, 2, …, m) is given by:
H(A_i) = Σ_{j=1}^m H_RS(α_ij) = [R × S / (n(R − S))] Σ_{j=1}^m [ (ζ_ij^S + ϑ_ij^S + π_ij^S)^(1/S) − (ζ_ij^R + ϑ_ij^R + π_ij^R)^(1/R) ]
where R , S > 0 and R S .
By considering the importance of each attribute in terms of the weight vector ω = (ω_1, ω_2, …, ω_m)^T, we formulate the following linear programming model to determine the weights:
min H = Σ_{i=1}^n Σ_{j=1}^m ω_j H_RS(α_ij) = [R × S / (n(R − S))] Σ_{j=1}^m ω_j Σ_{i=1}^n [ (ζ_ij^S + ϑ_ij^S + π_ij^S)^(1/S) − (ζ_ij^R + ϑ_ij^R + π_ij^R)^(1/R) ]
s.t. Σ_{j=1}^m ω_j = 1, ω_j ≥ 0 and ω ∈ Δ
After solving this model, we get the optimal weight vector ω = ( ω 1 , ω 2 , , ω m ) T .
Step 4:
Construct the weighted sum of each alternative by multiplying the score function of each criterion by its assigned weight as:
Q(A_i) = Σ_{j=1}^m ω_j (ζ_ij − ϑ_ij); i = 1, 2, …, n
Step 5:
Rank all the alternatives A_i (i = 1, 2, …, n) in descending order of their Q(A_i) values and, hence, choose the best alternative.
To demonstrate the above-mentioned approach, a numerical example is given below.
Example 3.
Consider the MADM problem stated and described in Example 2, where the five alternatives A_1, A_2, …, A_5 are assessed under the four attributes G_1, G_2, G_3, G_4 in the IFS environment. Here, we assume that the information about the attribute weights is partially known and is given by the decision-maker as Δ = { 0.15 ≤ ω_1 ≤ 0.45, 0.2 ≤ ω_2 ≤ 0.5, 0.1 ≤ ω_3 ≤ 0.3, 0.1 ≤ ω_4 ≤ 0.2, ω_1 ≥ ω_4, Σ_{j=1}^4 ω_j = 1 }. Then, based on the rating values mentioned in Equation (19), the following steps of Approach II are executed:
Step 1:
Since all the attributes are of the same type, there is no need for normalization.
Step 2:
Without loss of generality, we take R = 0.3 and S = 2 and compute the entropy value of each attribute by using Equation (20). The results are H_RS(G_1) = 3.4064, H_RS(G_2) = 3.372, H_RS(G_3) = 3.2491 and H_RS(G_4) = 3.7564.
Step 3:
Formulate the optimization model by utilizing the rating values and the partial weight information Δ = { 0.15 ≤ ω_1 ≤ 0.45, 0.2 ≤ ω_2 ≤ 0.5, 0.1 ≤ ω_3 ≤ 0.3, 0.1 ≤ ω_4 ≤ 0.2, ω_1 ≥ ω_4, Σ_{j=1}^4 ω_j = 1 } as:
min H = 3.4064 ω_1 + 3.372 ω_2 + 3.2491 ω_3 + 3.7564 ω_4
subject to 0.15 ≤ ω_1 ≤ 0.45, 0.2 ≤ ω_2 ≤ 0.5, 0.1 ≤ ω_3 ≤ 0.3, 0.1 ≤ ω_4 ≤ 0.2, ω_1 ≥ ω_4 and ω_1 + ω_2 + ω_3 + ω_4 = 1.
We solve this model with the help of MATLAB, obtaining the weight vector ω = (0.15, 0.45, 0.30, 0.10)^T.
Step 4:
The overall weighted score values of the alternatives corresponding to R = 0.3, S = 2 and ω = (0.15, 0.45, 0.30, 0.10)^T, obtained by using Equation (21), are Q(A_1) = 0.2700, Q(A_2) = 0.3650, Q(A_3) = 0.3250, Q(A_4) = 0.1500 and Q(A_5) = 0.1150.
Step 5:
Since Q(A_2) > Q(A_3) > Q(A_1) > Q(A_4) > Q(A_5), the ranking order of the alternatives is A_2 ≻ A_3 ≻ A_1 ≻ A_4 ≻ A_5. Thus, the best alternative is A_2.
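The small linear program of Step 3 can also be checked without MATLAB. The sketch below, purely illustrative (any LP solver would do), brute-forces the model over a 0.01 grid of feasible weight vectors and recovers the reported optimum:

```python
# Objective coefficients: the attribute entropies H_RS(G1..G4) from Step 2.
c = [3.4064, 3.372, 3.2491, 3.7564]

best_w, best_val = None, float("inf")
for i in range(15, 46):                  # 0.15 <= w1 <= 0.45
    for j in range(20, 51):              # 0.20 <= w2 <= 0.50
        for k in range(10, 31):          # 0.10 <= w3 <= 0.30
            r = 100 - i - j - k          # w4 fixed by sum(w) = 1
            if not (10 <= r <= 20 and i >= r):  # 0.1 <= w4 <= 0.2, w1 >= w4
                continue
            w = (i / 100, j / 100, k / 100, r / 100)
            val = sum(ci * wi for ci, wi in zip(c, w))
            if val < best_val:
                best_w, best_val = w, val

# best_w == (0.15, 0.45, 0.3, 0.1): the weight vector reported in Step 3.
```

Because the objective pushes weight toward the attribute with the smallest entropy (G_3), the upper bounds on ω_3 and ω_2 together with ω_1 ≥ ω_4 pin the optimum at the boundary, matching the MATLAB solution.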

5. Conclusions

In this paper, we proposed an entropy measure based on the (R, S)-norm in the IFS environment. Since the uncertainties present in the data play a crucial role during the decision-making process, the proposed (R, S)-norm-based information measure quantifies the degree of fuzziness of a set while maintaining its advantages. Various desirable relations, as well as several properties of the measure, were investigated in detail, and it was observed that some of the existing measures are special cases of the proposed one. Furthermore, for different parametric values of R and S, the decision-maker(s) may have different choices for making a decision according to his/her preferences. In addition, to explore the structural characteristics and functioning of the proposed measure, two decision-making approaches were presented to solve MADM problems in the IFS environment in which the attribute weights are either partially known or completely unknown, and these approaches were illustrated with numerical examples. The major advantage of the proposed measure is that it gives the decision-maker various choices for selecting the best alternative according to the desired goals, which makes the decision process more flexible and reliable. From these studies, it is concluded that the proposed work provides a new and easy way to handle uncertainty and vagueness in data and, hence, offers an alternative way to solve decision-making problems in the IFS environment. In the future, the results of this paper can be extended to other uncertain and fuzzy environments [59,60,61,62].

Author Contributions

Conceptualization, Methodology, Validation, H.G.; Formal Analysis, Investigation, H.G., J.K.; Writing-Original Draft Preparation, H.G.; Writing-Review & Editing, H.G.; Visualization, H.G.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Zadeh, L.A. Fuzzy sets. Inf. Control 1965, 8, 338–353. [Google Scholar] [CrossRef]
  2. Atanassov, K.T. Intuitionistic fuzzy sets. Fuzzy Sets Syst. 1986, 20, 87–96. [Google Scholar] [CrossRef]
  3. Atanassov, K.; Gargov, G. Interval-valued intuitionistic fuzzy sets. Fuzzy Sets Syst. 1989, 31, 343–349. [Google Scholar] [CrossRef]
  4. Xu, Z.S.; Yager, R.R. Some geometric aggregation operators based on intuitionistic fuzzy sets. Int. J. Gen. Syst. 2006, 35, 417–433. [Google Scholar] [CrossRef]
  5. Xu, Z.S. Intuitionistic fuzzy aggregation operators. IEEE Trans. Fuzzy Syst. 2007, 15, 1179–1187. [Google Scholar]
  6. Garg, H. Generalized intuitionistic fuzzy interactive geometric interaction operators using Einstein t-norm and t-conorm and their application to decision-making. Comput. Ind. Eng. 2016, 101, 53–69. [Google Scholar] [CrossRef]
  7. Garg, H. Novel intuitionistic fuzzy decision-making method based on an improved operation laws and its application. Eng. Appl. Artif. Intell. 2017, 60, 164–174. [Google Scholar] [CrossRef]
  8. Wang, W.; Wang, Z. An approach to multi-attribute interval-valued intuitionistic fuzzy decision-making with incomplete weight information. In Proceedings of the 15th IEEE International Conference on Fuzzy Systems and Knowledge Discovery, Jinan, China, 18–20 October 2008; Volume 3, pp. 346–350. [Google Scholar]
  9. Wei, G. Some induced geometric aggregation operators with intuitionistic fuzzy information and their application to group decision-making. Appl. Soft Comput. 2010, 10, 423–431. [Google Scholar] [CrossRef]
  10. Arora, R.; Garg, H. Robust aggregation operators for multi-criteria decision-making with intuitionistic fuzzy soft set environment. Sci. Iran. E 2018, 25, 931–942. [Google Scholar] [CrossRef]
  11. Arora, R.; Garg, H. Prioritized averaging/geometric aggregation operators under the intuitionistic fuzzy soft set environment. Sci. Iran. 2018, 25, 466–482. [Google Scholar] [CrossRef]
  12. Zhou, W.; Xu, Z. Extreme intuitionistic fuzzy weighted aggregation operators and their applications in optimism and pessimism decision-making processes. J. Intell. Fuzzy Syst. 2017, 32, 1129–1138. [Google Scholar] [CrossRef]
  13. Garg, H. Some robust improved geometric aggregation operators under interval-valued intuitionistic fuzzy environment for multi-criteria decision -making process. J. Ind. Manag. Optim. 2018, 14, 283–308. [Google Scholar] [CrossRef]
  14. Xu, Z.; Gou, X. An overview of interval-valued intuitionistic fuzzy information aggregations and applications. Granul. Comput. 2017, 2, 13–39. [Google Scholar] [CrossRef]
  15. Jamkhaneh, E.B.; Garg, H. Some new operations over the generalized intuitionistic fuzzy sets and their application to decision-making process. Granul. Comput. 2018, 3, 111–122. [Google Scholar] [CrossRef]
  16. Garg, H.; Singh, S. A novel triangular interval type-2 intuitionistic fuzzy sets and their aggregation operators. Iran. J. Fuzzy Syst. 2018. [Google Scholar] [CrossRef]
  17. Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423. [Google Scholar] [CrossRef]
  18. De Luca, A.; Termini, S. A definition of a non-probabilistic entropy in the setting of fuzzy set theory. Inf. Control 1971, 20, 301–312. [Google Scholar] [CrossRef]
  19. Szmidt, E.; Kacprzyk, J. Entropy for intuitionistic fuzzy sets. Fuzzy Sets Syst. 2001, 118, 467–477. [Google Scholar] [CrossRef]
  20. Vlachos, I.K.; Sergiadis, G.D. Intuitionistic fuzzy information-application to pattern recognition. Pattern Recognit. Lett. 2007, 28, 197–206. [Google Scholar] [CrossRef]
  21. Burillo, P.; Bustince, H. Entropy on intuitionistic fuzzy sets and on interval-valued fuzzy sets. Fuzzy Sets Syst. 1996, 78, 305–316. [Google Scholar] [CrossRef]
  22. Garg, H.; Agarwal, N.; Tripathi, A. Generalized Intuitionistic Fuzzy Entropy Measure of Order α and Degree β and its applications to Multi-criteria decision-making problem. Int. J. Fuzzy Syst. Appl. 2017, 6, 86–107. [Google Scholar] [CrossRef]
  23. Wei, C.P.; Gao, Z.H.; Guo, T.T. An intuitionistic fuzzy entropy measure based on the trigonometric function. Control Decis. 2012, 27, 571–574. [Google Scholar]
  24. Garg, H.; Agarwal, N.; Tripathi, A. Entropy based multi-criteria decision-making method under Fuzzy Environment and Unknown Attribute Weights. Glob. J. Technol. Optim. 2015, 6, 13–20. [Google Scholar]
  25. Zhang, Q.S.; Jiang, S.Y. A note on information entropy measure for vague sets. Inf. Sci. 2008, 178, 4184–4191. [Google Scholar] [CrossRef]
  26. Verma, R.; Sharma, B.D. Exponential entropy on intuitionistic fuzzy sets. Kybernetika 2013, 49, 114–127. [Google Scholar]
  27. Taneja, I.J. On generalized information measures and their applications. In Advances in Electronics and Electron Physics; Elsevier: New York, NY, USA, 1989; Volume 76, pp. 327–413. [Google Scholar]
  28. Boekee, D.E.; Van der Lubbe, J.C. The R-norm information measure. Inf. Control 1980, 45, 136–155. [Google Scholar] [CrossRef]
  29. Hung, W.L.; Yang, M.S. Similarity measures of intuitionistic fuzzy sets based on Hausdorff distance. Pattern Recognit. Lett. 2004, 25, 1603–1611. [Google Scholar] [CrossRef]
  30. Garg, H. Distance and similarity measure for intuitionistic multiplicative preference relation and its application. Int. J. Uncertain. Quantif. 2017, 7, 117–133. [Google Scholar] [CrossRef]
  31. Garg, H.; Arora, R. Distance and similarity measures for Dual hesistant fuzzy soft sets and their applications in multi criteria decision-making problem. Int. J. Uncertain. Quantif. 2017, 7, 229–248. [Google Scholar] [CrossRef]
  32. Joshi, R.; Kumar, S. An (R,S)-norm fuzzy information measure with its applications in multiple-attribute decision-making. Comput. Appl. Math. 2017, 1–22. [Google Scholar] [CrossRef]
  33. Garg, H.; Kumar, K. An advanced study on the similarity measures of intuitionistic fuzzy sets based on the set pair analysis theory and their application in decision making. Soft Comput. 2018, 1–12. [Google Scholar] [CrossRef]
  34. Garg, H.; Kumar, K. Distance measures for connection number sets based on set pair analysis and its applications to decision-making process. Appl. Intell. 2018, 1–14. [Google Scholar] [CrossRef]
  35. Garg, H.; Nancy. On single-valued neutrosophic entropy of order α. Neutrosophic Sets Syst. 2016, 14, 21–28. [Google Scholar]
  36. Selvachandran, G.; Garg, H.; Alaroud, M.H.S.; Salleh, A.R. Similarity Measure of Complex Vague Soft Sets and Its Application to Pattern Recognition. Int. J. Fuzzy Syst. 2018, 1–14. [Google Scholar] [CrossRef]
  37. Bajaj, R.K.; Kumar, T.; Gupta, N. R-norm intuitionistic fuzzy information measures and its computational applications. In Eco-friendly Computing and Communication Systems; Springer: Berlin, Germany, 2012; pp. 372–380. [Google Scholar]
  38. Garg, H.; Kumar, K. Improved possibility degree method for ranking intuitionistic fuzzy numbers and their application in multiattribute decision-making. Granul. Comput. 2018, 1–11. [Google Scholar] [CrossRef]
  39. Mei, Y.; Ye, J.; Zeng, Z. Entropy-weighted ANP fuzzy comprehensive evaluation of interim product production schemes in one-of-a-kind production. Comput. Ind. Eng. 2016, 100, 144–152. [Google Scholar] [CrossRef]
  40. Chen, S.M.; Chang, C.H. A novel similarity measure between Atanassov’s intuitionistic fuzzy sets based on transformation techniques with applications to pattern recognition. Inf. Sci. 2015, 291, 96–114. [Google Scholar] [CrossRef]
  41. Garg, H. Hesitant Pythagorean fuzzy sets and their aggregation operators in multiple attribute decision-making. Int. J. Uncertain. Quantif. 2018, 8, 267–289. [Google Scholar] [CrossRef]
  42. Chen, S.M.; Cheng, S.H.; Chiou, C.H. Fuzzy multiattribute group decision-making based on intuitionistic fuzzy sets and evidential reasoning methodology. Inf. Fusion 2016, 27, 215–227. [Google Scholar] [CrossRef]
  43. Kaur, G.; Garg, H. Multi-Attribute Decision-Making Based on Bonferroni Mean Operators under Cubic Intuitionistic Fuzzy Set Environment. Entropy 2018, 20, 65. [Google Scholar] [CrossRef]
  44. Chen, T.Y.; Li, C.H. Determining objective weights with intuitionistic fuzzy entropy measures: A comparative analysis. Inf. Sci. 2010, 180, 4207–4222. [Google Scholar] [CrossRef]
  45. Li, D.F. TOPSIS- based nonlinear-programming methodology for multiattribute decision-making with interval-valued intuitionistic fuzzy sets. IEEE Trans. Fuzzy Syst. 2010, 18, 299–311. [Google Scholar] [CrossRef]
  46. Garg, H.; Arora, R. A nonlinear-programming methodology for multi-attribute decision-making problem with interval-valued intuitionistic fuzzy soft sets information. Appl. Intell. 2017, 1–16. [Google Scholar] [CrossRef]
  47. Garg, H.; Nancy. Non-linear programming method for multi-criteria decision-making problems under interval neutrosophic set environment. Appl. Intell. 2017, 1–15. [Google Scholar] [CrossRef]
  48. Saaty, T.L. Axiomatic foundation of the analytic hierarchy process. Manag. Sci. 1986, 32, 841–845. [Google Scholar] [CrossRef]
  49. Hwang, C.L.; Lin, M.J. Group Decision Making under Multiple Criteria: Methods and Applications; Springer: Berlin, Germany, 1987. [Google Scholar]
  50. Arora, R.; Garg, H. A robust correlation coefficient measure of dual hesistant fuzzy soft sets and their application in decision-making. Eng. Appl. Artif. Intell. 2018, 72, 80–92. [Google Scholar] [CrossRef]
  51. Garg, H.; Kumar, K. Some aggregation operators for linguistic intuitionistic fuzzy set and its application to group decision-making process using the set pair analysis. Arab. J. Sci. Eng. 2018, 43, 3213–3227. [Google Scholar] [CrossRef]
  52. Abdullah, L.; Najib, L. A new preference scale mcdm method based on interval-valued intuitionistic fuzzy sets and the analytic hierarchy process. Soft Comput. 2016, 20, 511–523. [Google Scholar] [CrossRef]
  53. Garg, H. Generalized intuitionistic fuzzy entropy-based approach for solving multi-attribute decision-making problems with unknown attribute weights. Proc. Natl. Acad. Sci. India Sect. A Phys. Sci. 2017, 1–11. [Google Scholar] [CrossRef]
  54. Xia, M.; Xu, Z. Entropy/cross entropy-based group decision-making under intuitionistic fuzzy environment. Inf. Fusion 2012, 13, 31–47. [Google Scholar] [CrossRef]
  55. Garg, H.; Nancy. Linguistic single-valued neutrosophic prioritized aggregation operators and their applications to multiple-attribute group decision-making. J. Ambient Intell. Humaniz. Comput. 2018, 1–23. [Google Scholar] [CrossRef]
  56. De, S.K.; Biswas, R.; Roy, A.R. Some operations on intuitionistic fuzzy sets. Fuzzy Sets Syst. 2000, 117, 477–484. [Google Scholar] [CrossRef]
  57. Zeng, W.; Li, H. Relationship between similarity measure and entropy of interval-valued fuzzy sets. Fuzzy Sets Syst. 2006, 157, 1477–1484. [Google Scholar] [CrossRef]
  58. Hung, W.L.; Yang, M.S. Fuzzy Entropy on intuitionistic fuzzy sets. Int. J. Intell. Syst. 2006, 21, 443–451. [Google Scholar] [CrossRef]
  59. Garg, H. Some methods for strategic decision-making problems with immediate probabilities in Pythagorean fuzzy environment. Int. J. Intell. Syst. 2018, 33, 687–712. [Google Scholar] [CrossRef]
  60. Garg, H. Linguistic Pythagorean fuzzy sets and its applications in multiattribute decision-making process. Int. J. Intell. Syst. 2018, 33, 1234–1263. [Google Scholar] [CrossRef]
  61. Garg, H. Generalized interaction aggregation operators in intuitionistic fuzzy multiplicative preference environment and their application to multicriteria decision-making. Appl. Intell. 2017, 1–17. [Google Scholar] [CrossRef]
  62. Garg, H.; Arora, R. Generalized and Group-based Generalized intuitionistic fuzzy soft sets with applications in decision-making. Appl. Intell. 2018, 48, 343–356. [Google Scholar] [CrossRef]
Table 1. Entropy measures values corresponding to existing approaches, as well as the proposed approach.
Entropy Measure            A^(1/2)   A        A^2      A^3      A^4
E_BB [21]                  0.0818    0.1000   0.0980   0.0934   0.0934
E_SK [19]                  0.3446    0.3740   0.1970   0.1309   0.1094
E_ZL [57]                  0.4156    0.4200   0.2380   0.1546   0.1217
E_HY [58]                  0.3416    0.3440   0.2610   0.1993   0.1613
E_ZJ [25]                  0.2851    0.3050   0.1042   0.0383   0.0161
E_{0.4}^{0.2} [22]         0.5995    0.5981   0.5335   0.4631   0.4039
H_RS (proposed measure)
R = 0.3, S = 2             2.3615    2.3589   1.8624   1.4312   1.1246
R = 0.5, S = 2             0.8723    0.8783   0.6945   0.5392   0.4323
R = 0.7, S = 2             0.5721    0.5769   0.4432   0.3390   0.2725
R = 2.5, S = 0.3           2.2882    2.2858   1.8028   1.3851   1.0890
R = 2.5, S = 0.5           0.8309    0.8368   0.6583   0.5104   0.4103
R = 2.5, S = 0.7           0.5369    0.5415   0.4113   0.3138   0.2538
Table 2. Effect of R and S on the entropy measure H R S by using Approach I.
S     R     H_RS(A1)   H_RS(A2)   H_RS(A3)   H_RS(A4)   H_RS(A5)   Ranking Order
1.2   0.1   0.3268     0.3084     0.3291     0.2429     0.1715     A3 ≻ A1 ≻ A2 ≻ A4 ≻ A5
      0.3   0.3241     0.3081     0.3292     0.2374     0.1690     A3 ≻ A1 ≻ A2 ≻ A4 ≻ A5
      0.5   0.3165     0.2894     0.3337     0.2368     0.1570     A3 ≻ A1 ≻ A2 ≻ A4 ≻ A5
      0.7   0.1688     −0.0988    0.4296     0.2506     −0.0879    A3 ≻ A4 ≻ A1 ≻ A5 ≻ A2
      0.9   0.3589     0.3992     0.3065     0.2328     0.2272     A2 ≻ A1 ≻ A3 ≻ A4 ≻ A5
1.5   0.1   0.3268     0.3084     0.3291     0.2429     0.1715     A3 ≻ A1 ≻ A2 ≻ A4 ≻ A5
      0.3   0.3239     0.3076     0.3293     0.2374     0.1688     A3 ≻ A1 ≻ A2 ≻ A4 ≻ A5
      0.5   0.3132     0.2811     0.3359     0.2371     0.1515     A3 ≻ A1 ≻ A2 ≻ A4 ≻ A5
      0.7   0.4139     0.5404     0.2712     0.2272     0.3185     A2 ≻ A1 ≻ A5 ≻ A3 ≻ A4
      0.9   0.3498     0.3741     0.3125     0.2334     0.2121     A2 ≻ A1 ≻ A3 ≻ A4 ≻ A5
2.0   0.1   0.3268     0.3084     0.3291     0.2429     0.1715     A3 ≻ A1 ≻ A2 ≻ A4 ≻ A5
      0.3   0.3237     0.3071     0.3294     0.2375     0.1684     A3 ≻ A1 ≻ A2 ≻ A4 ≻ A5
      0.5   0.3072     0.2666     0.3396     0.2381     0.1415     A3 ≻ A1 ≻ A2 ≻ A4 ≻ A5
      0.7   0.3660     0.4140     0.3022     0.2308     0.2393     A2 ≻ A1 ≻ A3 ≻ A5 ≻ A4
      0.9   0.3461     0.3631     0.3150     0.2331     0.2062     A2 ≻ A1 ≻ A3 ≻ A4 ≻ A5
2.5   0.1   0.3268     0.3084     0.3291     0.2429     0.1715     A3 ≻ A1 ≻ A2 ≻ A4 ≻ A5
      0.3   0.3235     0.3067     0.3295     0.2376     0.1681     A3 ≻ A1 ≻ A2 ≻ A4 ≻ A5
      0.5   0.3010     0.2517     0.3436     0.2396     0.1308     A3 ≻ A1 ≻ A2 ≻ A4 ≻ A5
      0.7   0.3578     0.3920     0.3074     0.2304     0.2261     A2 ≻ A1 ≻ A3 ≻ A4 ≻ A5
      0.9   0.3449     0.3591     0.3158     0.2322     0.2045     A2 ≻ A1 ≻ A3 ≻ A4 ≻ A5
3.0   0.1   0.3268     0.3084     0.3291     0.2429     0.1715     A3 ≻ A1 ≻ A2 ≻ A4 ≻ A5
      0.3   0.3234     0.3064     0.3296     0.2376     0.1678     A3 ≻ A1 ≻ A2 ≻ A4 ≻ A5
      0.5   0.2946     0.2368     0.3476     0.2417     0.1199     A3 ≻ A1 ≻ A4 ≻ A2 ≻ A5
      0.7   0.3545     0.3829     0.3095     0.2298     0.2209     A2 ≻ A1 ≻ A3 ≻ A4 ≻ A5
      0.9   0.3442     0.3570     0.3161     0.2314     0.2037     A2 ≻ A1 ≻ A3 ≻ A4 ≻ A5
5.0   0.1   0.3268     0.3084     0.3291     0.2429     0.1715     A3 ≻ A1 ≻ A2 ≻ A4 ≻ A5
      0.3   0.3231     0.3058     0.3298     0.2379     0.1674     A3 ≻ A1 ≻ A2 ≻ A4 ≻ A5
      0.5   0.2701     0.1778     0.3638     0.2520     0.0767     A3 ≻ A1 ≻ A4 ≻ A2 ≻ A5
      0.7   0.3496     0.3706     0.3123     0.2277     0.2137     A2 ≻ A1 ≻ A3 ≻ A4 ≻ A5
      0.9   0.3428     0.3532     0.3168     0.2293     0.2020     A2 ≻ A1 ≻ A3 ≻ A4 ≻ A5
