Article

Similarity Measures of Linguistic Cubic Hesitant Variables for Multiple Attribute Group Decision-Making

Department of Electrical and Information Engineering, Shaoxing University, 508 Huancheng West Road, Shaoxing 312000, China
* Author to whom correspondence should be addressed.
Information 2019, 10(5), 168; https://doi.org/10.3390/info10050168
Submission received: 4 April 2019 / Revised: 24 April 2019 / Accepted: 24 April 2019 / Published: 5 May 2019

Abstract

A linguistic cubic hesitant variable (LCHV) is a hybrid form of linguistic value for group decision-making environments. It is composed of an interval linguistic variable and several single-valued linguistic variables given by different decision-makers (DMs). Because of the uncertainty and hesitancy of DMs, the numbers of linguistic variables in different LCHVs are usually unequal, so the least common multiple number (LCMN) extension method is adopted. Based on the included angle and the distance between two LCHVs, we present two cosine similarity measures and develop a multiple attribute group decision-making (MAGDM) approach. An engineer-selection example is used to implement the proposed LCHV MAGDM method and to demonstrate its simplicity and feasibility. A sensitivity analysis with respect to weight changes shows that, in this application, the similarity measure based on distance is more stable than the similarity measure based on the included angle.

1. Introduction

In the age of big data, large amounts of information are used to solve decision-making (DM) problems in various fields, such as the manufacturing domain [1], the selection of power generation technology [2], and the selection of a transport service provider [3]. However, not all evaluation information can be expressed directly by real numbers, so handling fuzzy information remains a critical topic of decision-making theory and methods. Humans are more accustomed to evaluating in linguistic expressions than in numerical values. The concept of linguistic variables (LVs) was first proposed by Zadeh [4] in 1975. Herrera et al. [5,6] then used linguistic information to solve DM problems, and subsequent studies introduced various aggregation operators [7,8,9,10,11] to handle linguistic decision information. In some uncertain environments it is difficult for decision-makers to make an assessment with a single-valued LV; they prefer to give the evaluation as an interval LV rather than a single-valued LV. An interval LV is called an uncertain linguistic value (ULV) [12]. To solve uncertain linguistic DM problems, many uncertain linguistic aggregation operators were introduced, such as the ULV ordered weighted averaging (ULOWA) operator [13], the UL hybrid geometric mean (ULHGM) operator [14], and the uncertain pure linguistic hybrid harmonic averaging (UPLHHA) operator [15]. After that, the linguistic cubic variable (LCV) was proposed by Ye [16]. The LCV is a hybrid linguistic evaluation form consisting of an interval LV (ULV) and a single-valued LV, and it represents a comprehensive evaluation of an attribute given by a group of people. Several LCV aggregation operators were developed, such as the LCV weighted geometric averaging (LCVWGA) operator [16], the LCV weighted arithmetic averaging (LCVWAA) operator [16], the LCV Dombi weighted geometric average (LCVDWGA) operator [17], and the LCV Dombi weighted arithmetic average (LCVDWAA) operator [17]. Based on these operators, corresponding approaches were developed to solve MAGDM problems in an LCV environment.
Although an LCV can express group decision-making information, it cannot express the evaluation values of a group objectively and accurately when the values given by the decision-makers differ greatly. For this reason, Ye [18] put forward another hybrid form of linguistic value, defined as a linguistic cubic hesitant variable (LCHV). In group decision-making, each decision-maker evaluates an attribute with either an interval LV or a single-valued LV. All the interval LVs are merged into one interval LV, and all the different single-valued LVs are combined into a hesitant linguistic set (HLS). The interval LV and the HLS together constitute a new hybrid linguistic value form, called an LCHV. Hesitant linguistic sets of different LCHVs may have different lengths, so it is difficult to aggregate two LCHVs directly. Several methods exist for making hesitant linguistic sets of different lengths uniform [19,20,21,22]; among them, taking the least common multiple number (LCMN) of the hesitant linguistic elements as the extension size is the more objective choice. Based on the LCMN extension method, Ye [18] presented the LCHVWGA and LCHVWAA aggregation operators, but these methods were implemented with given weights and their stability has not been tested.
The similarity measure is an effective tool for quantifying the degree of similarity between two fuzzy variables or fuzzy sets. It has been widely applied to handle various types of decision-making information, such as trapezoidal fuzzy neutrosophic numbers [23], simplified neutrosophic hesitant fuzzy sets [24], linguistic neutrosophic numbers [25], neutrosophic cubic sets [26], and hesitant linguistic neutrosophic numbers [27]. So far, apart from the WGA and WAA aggregation operators proposed by Ye [18], there are no other studies on LCHVs. In this study, two similarity measures are developed to quantify the degree of similarity between two LCHVs, and a novel method is presented for solving MAGDM problems with LCHV information.
The rest of this article is organized as follows. Section 2 briefly reviews LCHVs. Based on the distance and the included angle of two LCHVs, two cosine similarity measures are proposed in Section 3. The weighted MAGDM method based on these measures is presented in Section 4. An application example is given in Section 5 to implement the proposed DM method and to analyze the influence of hesitation extension on similarity. A sensitivity analysis with respect to weight changes is performed in Section 6. Finally, we conclude the study and outline directions for future research.

2. The Concept of Linguistic Cubic Hesitant Variables

The LCHV is a new hybrid linguistic variable form. Ye [18] first presented the definition of an LCHV as follows.
Definition 1.
[18] Let $S = \{L_\gamma \mid \gamma \in [0, \tau]\}$ be an LTS, where $\tau$ is an even number. An LCHV $V$ is defined as $V = (L_{\tilde{u}}, L_{\tilde{h}})$, where $L_{\tilde{u}} = [L_p, L_q]$ for $L_p, L_q \in S$ and $q \geq p$ is an uncertain (interval) LV, and $L_{\tilde{h}} = \{L_{\phi_\beta} \mid L_{\phi_\beta} \in S, \beta = 1, \ldots, k\}$ is a hesitant linguistic variable (HLV). The HLV is a set of $k$ single-valued LVs ranked in ascending order.
For example, five experts are invited to assess the service management level of a hospital. They give their assessments from the linguistic term set S = {L0 (none), L1 (very low), L2 (low), L3 (slightly low), L4 (moderate), L5 (slightly high), L6 (high), L7 (very high), L8 (super high)}. Two experts give the interval LVs [L3, L5] and [L4, L6], and the other three experts give the single-valued LVs L5, L3, and L5. The two interval LVs are merged into one, [L3, L6], and the different single-valued LVs form the hesitant set {L3, L5}. The resulting uncertain LV and hesitant set constitute the linguistic cubic hesitant variable (LCHV) ([L3, L6], {L3, L5}), which represents the assessments of all five experts.
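For readers who want to experiment with these notions programmatically, the following minimal Python sketch shows one possible container for an LCHV over S = {L0, ..., Lτ}. The class name, field names, and the choice of Python are our own illustrative assumptions, not part of [18].

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class LCHV:
    """A linguistic cubic hesitant variable: an interval LV plus a hesitant set,
    both stored by their subscripts in the LTS S = {L_0, ..., L_tau}."""
    interval: Tuple[int, int]   # uncertain LV [L_p, L_q] with p <= q
    hesitant: List[int]         # single-valued LVs, kept in ascending order

    def __post_init__(self):
        p, q = self.interval
        assert p <= q, "an interval LV requires q >= p"
        self.hesitant = sorted(self.hesitant)

# The hospital example: merged interval [L3, L6] and hesitant set {L3, L5}.
hospital = LCHV(interval=(3, 6), hesitant=[3, 5])
print(hospital)   # LCHV(interval=(3, 6), hesitant=[3, 5])
```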
Definition 2.
[18] Let $V = (L_{\tilde{u}}, L_{\tilde{h}}) = ([L_p, L_q], \{L_{\phi_1}, L_{\phi_2}, \ldots, L_{\phi_\beta}\})$ for $L_p, L_q, L_{\phi_i} \in \{L_\gamma \mid \gamma \in [0, \tau]\}$ $(i = 1, 2, \ldots, \beta)$ be an LCHV. It is classified as follows:
(1) If $\phi_i \in [p, q]$ $(i = 1, 2, \ldots, \beta)$ for $p, q \in [0, \tau]$, then $V = ([L_p, L_q], \{L_{\phi_1}, L_{\phi_2}, \ldots, L_{\phi_\beta}\})$ is an internal LCHV.
(2) If $\phi_i \notin [p, q]$ $(i = 1, 2, \ldots, \beta)$ for $p, q, \phi_i \in [0, \tau]$, then $V = ([L_p, L_q], \{L_{\phi_1}, L_{\phi_2}, \ldots, L_{\phi_\beta}\})$ is an external LCHV.

3. Cosine Measures of LCHVs

In practice, experts evaluate different objects with different degrees of hesitancy, so the LCHVs obtained from the experts contain various numbers of hesitant LVs. To compute similarity measures between them, the hesitant parts must be extended to the same number of LVs. We use the LCMN extension method [18] for this purpose.
Assume that $V_i = (L_{\tilde{u}_i}, L_{\tilde{h}_i}) = ([L_{p_i}, L_{q_i}], \{L_{\phi_{i1}}, L_{\phi_{i2}}, \ldots, L_{\phi_{i\alpha_i}}\})$ $(i = 1, 2, \ldots, k)$ is a set of LCHVs and $\lambda$ is the LCMN of $(\alpha_1, \alpha_2, \ldots, \alpha_k)$ for $L_{\tilde{h}_i}$ $(i = 1, 2, \ldots, k)$. Then the LVs of each LCHV can be extended to the same number $\lambda$ in the following form:
$$V_i^e = \left( [L_{p_i}, L_{q_i}], \Big\{ \underbrace{L_{\phi_{i1}}, \ldots, L_{\phi_{i1}}}_{\lambda/\alpha_i}, \underbrace{L_{\phi_{i2}}, \ldots, L_{\phi_{i2}}}_{\lambda/\alpha_i}, \ldots, \underbrace{L_{\phi_{i\alpha_i}}, \ldots, L_{\phi_{i\alpha_i}}}_{\lambda/\alpha_i} \Big\} \right), \quad i = 1, 2, \ldots, k. \tag{1}$$
Example 1.
Suppose $V_1 = ([L_4, L_5], \{L_4, L_6, L_7\})$ and $V_2 = ([L_3, L_5], \{L_4, L_6\})$ are two LCHVs in the LTS $S = \{L_\gamma \mid \gamma \in [0, 8]\}$. Then the LCMN is $\lambda = 6$ according to $\alpha_1 = 3$ and $\alpha_2 = 2$. Based on Equation (1), the extended forms of the two LCHVs are:
$V_1^e = ([L_4, L_5], \{L_4, L_4, L_6, L_6, L_7, L_7\})$ and $V_2^e = ([L_3, L_5], \{L_4, L_4, L_4, L_6, L_6, L_6\})$.
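The LCMN extension itself is easy to sketch in code. The following Python fragment is an illustrative helper (the name `lcmn_extend` and the list-of-subscripts representation are our own assumptions): it repeats each hesitant element λ/αᵢ times, as in Equation (1), and reproduces Example 1.

```python
from functools import reduce
from math import gcd

def lcm(a: int, b: int) -> int:
    """Least common multiple of two positive integers."""
    return a * b // gcd(a, b)

def lcmn_extend(hesitant_sets):
    """Extend hesitant sets of sizes alpha_1, ..., alpha_k to their LCMN length lambda
    by repeating every element lambda/alpha_i times."""
    lam = reduce(lcm, (len(h) for h in hesitant_sets))
    return [[x for x in h for _ in range(lam // len(h))] for h in hesitant_sets]

# Example 1: alpha_1 = 3 and alpha_2 = 2, so lambda = 6.
h1e, h2e = lcmn_extend([[4, 6, 7], [4, 6]])
print(h1e)   # [4, 4, 6, 6, 7, 7]
print(h2e)   # [4, 4, 4, 6, 6, 6]
```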
Definition 3.
Let $V_1 = ([L_{p_1}, L_{q_1}], \{L_{\phi_{11}}, L_{\phi_{12}}, \ldots, L_{\phi_{1\lambda}}\})$ and $V_2 = ([L_{p_2}, L_{q_2}], \{L_{\phi_{21}}, L_{\phi_{22}}, \ldots, L_{\phi_{2\lambda}}\})$ be two LCHVs in the LTS $S = \{L_\gamma \mid \gamma \in [0, \tau]\}$, and let the linguistic scale function be $f(L_\gamma) = \gamma/\tau$ for $\gamma \in [0, \tau]$. Based on the distance and the included angle of the two LCHVs, the two cosine similarity measures between them are defined below.
(1) Cosine similarity measure on the basis of distance
$$S_{DLCHV}(V_1, V_2) = \frac{1}{2}\left\{ \cos\!\left( \frac{|f(L_{p_1}) - f(L_{p_2})| + |f(L_{q_1}) - f(L_{q_2})|}{4}\,\pi \right) + \cos\!\left( \frac{\sum_{i=1}^{\lambda} |f(L_{\phi_{1i}}) - f(L_{\phi_{2i}})|}{2\lambda}\,\pi \right) \right\} = \frac{1}{2}\left\{ \cos\!\left( \frac{|p_1 - p_2| + |q_1 - q_2|}{4\tau}\,\pi \right) + \cos\!\left( \frac{\sum_{i=1}^{\lambda} |\phi_{1i} - \phi_{2i}|}{2\tau\lambda}\,\pi \right) \right\} \tag{2}$$
(2) Cosine similarity measure on the basis of the included angle of the two LCHVs
$$S_{ALCHV}(V_1, V_2) = \frac{1}{2}\left\{ \frac{f(L_{p_1})f(L_{p_2}) + f(L_{q_1})f(L_{q_2})}{\sqrt{f(L_{p_1})^2 + f(L_{q_1})^2}\,\sqrt{f(L_{p_2})^2 + f(L_{q_2})^2}} + \frac{\sum_{i=1}^{\lambda} f(L_{\phi_{1i}})f(L_{\phi_{2i}})}{\sqrt{\sum_{i=1}^{\lambda} f(L_{\phi_{1i}})^2}\,\sqrt{\sum_{i=1}^{\lambda} f(L_{\phi_{2i}})^2}} \right\} = \frac{1}{2}\left\{ \frac{p_1 p_2 + q_1 q_2}{\sqrt{p_1^2 + q_1^2}\,\sqrt{p_2^2 + q_2^2}} + \frac{\sum_{i=1}^{\lambda} \phi_{1i}\phi_{2i}}{\sqrt{\sum_{i=1}^{\lambda} \phi_{1i}^2}\,\sqrt{\sum_{i=1}^{\lambda} \phi_{2i}^2}} \right\} \tag{3}$$
According to Equations (2) and (3), the cosine similarity measures $S_{DLCHV}(V_1, V_2)$ and $S_{ALCHV}(V_1, V_2)$ satisfy the following properties (q1)–(q3):
(q1) $0 \leq S_{DLCHV}(V_1, V_2) \leq 1$ and $0 \leq S_{ALCHV}(V_1, V_2) \leq 1$.
(q2) $S_{DLCHV}(V_1, V_2) = S_{DLCHV}(V_2, V_1)$ and $S_{ALCHV}(V_1, V_2) = S_{ALCHV}(V_2, V_1)$.
(q3) If $V_1 = V_2$, then $S_{DLCHV}(V_1, V_2) = 1$ and $S_{ALCHV}(V_1, V_2) = 1$.
Proof. 
We first prove properties (q1)–(q3) for $S_{DLCHV}(V_1, V_2)$.
(q1) Let $X = |f(L_{p_1}) - f(L_{p_2})| + |f(L_{q_1}) - f(L_{q_2})|$ for $f(L_{p_1}), f(L_{q_1}), f(L_{p_2}), f(L_{q_2}) \in [0, 1]$, and let $Y = \sum_{i=1}^{\lambda} |f(L_{\phi_{1i}}) - f(L_{\phi_{2i}})|$ for $f(L_{\phi_{1i}}), f(L_{\phi_{2i}}) \in [0, 1]$, where $i = 1, 2, \ldots, \lambda$. Then $0 \leq X \leq 2$ and $0 \leq Y \leq \lambda$, so $0 \leq \cos(X\pi/4) \leq 1$ and $0 \leq \cos(Y\pi/(2\lambda)) \leq 1$. Thus, $0 \leq S_{DLCHV}(V_1, V_2) \leq 1$.
(q2) It is obvious.
(q3) If $V_1 = V_2$, then $f(L_{p_1}) = f(L_{p_2})$, $f(L_{q_1}) = f(L_{q_2})$, and $f(L_{\phi_{1i}}) = f(L_{\phi_{2i}})$ for $i = 1, 2, \ldots, \lambda$. Hence $S_{DLCHV}(V_1, V_2) = 1$ holds.
The properties (q1)–(q3) of $S_{ALCHV}(V_1, V_2)$ can be proved as follows:
(q1) $S_{ALCHV}(V_1, V_2) \geq 0$ is obvious. $S_{ALCHV}(V_1, V_2) \leq 1$ follows from the Cauchy–Schwarz inequality
$(\alpha_1\beta_1 + \alpha_2\beta_2 + \cdots + \alpha_n\beta_n)^2 \leq (\alpha_1^2 + \alpha_2^2 + \cdots + \alpha_n^2) \times (\beta_1^2 + \beta_2^2 + \cdots + \beta_n^2)$, where $(\alpha_1, \alpha_2, \ldots, \alpha_n) \in R^n$ and $(\beta_1, \beta_2, \ldots, \beta_n) \in R^n$. Then the following inequalities hold:
$f(L_{p_1})f(L_{p_2}) + f(L_{q_1})f(L_{q_2}) \leq \sqrt{f(L_{p_1})^2 + f(L_{q_1})^2}\,\sqrt{f(L_{p_2})^2 + f(L_{q_2})^2}$ and $\sum_{i=1}^{\lambda} f(L_{\phi_{1i}})f(L_{\phi_{2i}}) \leq \sqrt{\sum_{i=1}^{\lambda} f(L_{\phi_{1i}})^2}\,\sqrt{\sum_{i=1}^{\lambda} f(L_{\phi_{2i}})^2}$.
Hence, we obtain
$\dfrac{f(L_{p_1})f(L_{p_2}) + f(L_{q_1})f(L_{q_2})}{\sqrt{f(L_{p_1})^2 + f(L_{q_1})^2}\,\sqrt{f(L_{p_2})^2 + f(L_{q_2})^2}} \leq 1$ and $\dfrac{\sum_{i=1}^{\lambda} f(L_{\phi_{1i}})f(L_{\phi_{2i}})}{\sqrt{\sum_{i=1}^{\lambda} f(L_{\phi_{1i}})^2}\,\sqrt{\sum_{i=1}^{\lambda} f(L_{\phi_{2i}})^2}} \leq 1$.
According to Equation (3), we have $S_{ALCHV}(V_1, V_2) \leq 1$. Thus, $0 \leq S_{ALCHV}(V_1, V_2) \leq 1$ holds.
(q2) It is obvious.
(q3) If $V_1 = V_2$, then $f(L_{p_1}) = f(L_{p_2})$, $f(L_{q_1}) = f(L_{q_2})$, and $f(L_{\phi_{1i}}) = f(L_{\phi_{2i}})$ for $i = 1, 2, \ldots, \lambda$. Then we have
$f(L_{p_1})f(L_{p_2}) + f(L_{q_1})f(L_{q_2}) = \sqrt{f(L_{p_1})^2 + f(L_{q_1})^2}\,\sqrt{f(L_{p_2})^2 + f(L_{q_2})^2}$ and $\sum_{i=1}^{\lambda} f(L_{\phi_{1i}})f(L_{\phi_{2i}}) = \sqrt{\sum_{i=1}^{\lambda} f(L_{\phi_{1i}})^2}\,\sqrt{\sum_{i=1}^{\lambda} f(L_{\phi_{2i}})^2}$.
Hence, $S_{ALCHV}(V_1, V_2) = 1$ holds. □
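As a concrete check of Equations (2) and (3), the sketch below implements the two measures in Python. The function names `s_d_lchv` and `s_a_lchv` and the tuple representation `((p, q), [phi_1, ..., phi_lambda])` are our own conventions rather than anything from [18]; the printed values match the worked computation for $V_{11}^e$ in Section 5.1.

```python
from math import cos, pi, sqrt

def s_d_lchv(v1, v2, tau: int) -> float:
    """Distance-based cosine similarity of Equation (2).
    v1, v2 = ((p, q), hesitant) with hesitant parts already of equal length."""
    (p1, q1), h1 = v1
    (p2, q2), h2 = v2
    lam = len(h1)
    term_u = cos((abs(p1 - p2) + abs(q1 - q2)) / (4 * tau) * pi)
    term_h = cos(sum(abs(a - b) for a, b in zip(h1, h2)) / (2 * tau * lam) * pi)
    return 0.5 * (term_u + term_h)

def s_a_lchv(v1, v2, tau: int) -> float:
    """Included-angle cosine similarity of Equation (3); tau cancels out,
    so only the subscripts matter (it is kept in the signature for symmetry)."""
    (p1, q1), h1 = v1
    (p2, q2), h2 = v2
    term_u = (p1 * p2 + q1 * q2) / (sqrt(p1**2 + q1**2) * sqrt(p2**2 + q2**2))
    term_h = (sum(a * b for a, b in zip(h1, h2))
              / (sqrt(sum(a * a for a in h1)) * sqrt(sum(b * b for b in h2))))
    return 0.5 * (term_u + term_h)

# Check against the worked example in Section 5.1: V_11^e versus the ideal LCHV.
v11e = ((4, 6), [5, 5, 5, 6, 6, 6])
ideal = ((8, 8), [8] * 6)
print(round(s_d_lchv(v11e, ideal, tau=8), 4))   # 0.8567
print(round(s_a_lchv(v11e, ideal, tau=8), 4))   # 0.9882
```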
Definition 4.
Let $R = \{V_{r1}, V_{r2}, \ldots, V_{rk}\}$ and $G = \{V_{g1}, V_{g2}, \ldots, V_{gk}\}$ be two LCHV sets, where $V_{ri}$ and $V_{gi}$ $(i = 1, 2, \ldots, k)$ are LCHVs in the LTS $S = \{L_\gamma \mid \gamma \in [0, \tau]\}$. Taking into account the weights of the elements $V_{ri}$ and $V_{gi}$ $(i = 1, 2, \ldots, k)$, the weighted similarity measures between $R$ and $G$ are defined, respectively, as follows:
$$S_{\omega DLCHVS}(R, G) = \sum_{j=1}^{k} \omega_j S_{DLCHV}(V_{rj}, V_{gj}) \tag{4}$$
$$S_{\omega ALCHVS}(R, G) = \sum_{j=1}^{k} \omega_j S_{ALCHV}(V_{rj}, V_{gj}) \tag{5}$$
where $\omega_j \in [0, 1]$ and $\sum_{j=1}^{k} \omega_j = 1$ for $j = 1, 2, \ldots, k$.
In addition, the two weighted cosine similarity measures $S_{\omega DLCHVS}(R, G)$ and $S_{\omega ALCHVS}(R, G)$ also satisfy the following properties (q1)–(q3):
(q1) $0 \leq S_{\omega DLCHVS}(R, G) \leq 1$ and $0 \leq S_{\omega ALCHVS}(R, G) \leq 1$;
(q2) $S_{\omega DLCHVS}(R, G) = S_{\omega DLCHVS}(G, R)$ and $S_{\omega ALCHVS}(R, G) = S_{\omega ALCHVS}(G, R)$;
(q3) If $R = G$, then $S_{\omega DLCHVS}(R, G) = 1$ and $S_{\omega ALCHVS}(R, G) = 1$.
The above properties (q1)–(q3) for $S_{\omega DLCHVS}(R, G)$ and $S_{\omega ALCHVS}(R, G)$ can be proved easily.
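Equations (4) and (5) are plain weighted sums of the element-wise measures, so a single helper suffices. The sketch below assumes the `s_d_lchv`/`s_a_lchv` functions from the previous sketch and a weight vector that sums to 1; the function name is our own.

```python
def weighted_similarity(R, G, weights, measure, tau: int) -> float:
    """Weighted cosine similarity between two LCHV sets R and G, Equations (4)-(5).
    `measure` is s_d_lchv or s_a_lchv; R, G, and weights have equal length."""
    return sum(w * measure(r, g, tau) for r, g, w in zip(R, G, weights))
```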

4. MAGDM Approach Based on Cosine Similarity Measures of LCHVs

Based on the cosine similarity measures of LCHVs, we propose an MAGDM method to solve DM problems with LCHV information.
Suppose $V = \{V_1, V_2, \ldots, V_n\}$ is a set of $n$ alternatives and $A = \{A_1, A_2, \ldots, A_m\}$ is a set of $m$ attributes in an MAGDM problem. When assessing attribute $A_j$ of alternative $V_i$, each decision-maker gives an interval LV or a single-valued LV from the LTS $S = \{L_\gamma \mid \gamma \in [0, \tau]\}$. The evaluation values of attribute $A_j$ of alternative $V_i$ given by all decision-makers constitute an LCHV $V_{ij} = (L_{\tilde{u}_{ij}}, L_{\tilde{h}_{ij}}) = ([L_{p_{ij}}, L_{q_{ij}}], \{L_{\phi_{ij}(1)}, L_{\phi_{ij}(2)}, \ldots, L_{\phi_{ij}(\alpha_{ij})}\})$ $(i = 1, 2, \ldots, n;\ j = 1, 2, \ldots, m)$, where the uncertain LV $[L_{p_{ij}}, L_{q_{ij}}]$ satisfies $q_{ij} \geq p_{ij}$ and the HLV set $L_{\tilde{h}_{ij}}$ is ranked in ascending order. Hence, all the assessed LCHVs construct a decision matrix $M = (V_{ij})_{n \times m}$. Furthermore, $\omega_j$ is the weight of attribute $A_j$, where $\omega_j \in [0, 1]$ and $\sum_{j=1}^{m} \omega_j = 1$. Determining the attribute weights $\omega_j$ is important for the objectivity of the decision results. Several algorithms have been presented to determine weight coefficients, such as the analytic hierarchy process (AHP) [28], the Decision-Making Trial and Evaluation Laboratory (DEMATEL) method [29], the best–worst method (BWM) [30], and the full consistency method (FUCOM) [31]. Each of them applies to different areas of decision-making. For instance, the BWM was used for the selection of wagons for the internal transport of a logistics company [32] and for the location selection of roundabout construction [33], while the AHP was applied to evaluate university web pages [34].
According to the proposed cosine similarity measures, we developed an MAGDM method of LCHVs using the following steps:
Step 1. Suppose $\alpha_{ij}$ is the number of LVs in $L_{\tilde{h}_{ij}}$ of $V_{ij}$, and let $\lambda_j$ be the LCMN of $(\alpha_{1j}, \alpha_{2j}, \ldots, \alpha_{nj})$ $(j = 1, 2, \ldots, m)$. Extend the HLVs of $(L_{\tilde{h}_{1j}}, L_{\tilde{h}_{2j}}, \ldots, L_{\tilde{h}_{nj}})$ $(j = 1, 2, \ldots, m)$ to the same number $\lambda_j$. Based on the extension method mentioned above, each LCHV in the matrix $M = (V_{ij})_{n \times m}$ is extended to the following form:
$$V_{ij}^e = \left( [L_{p_{ij}}, L_{q_{ij}}], \Big\{ \underbrace{L_{\phi_{ij}(1)}, \ldots, L_{\phi_{ij}(1)}}_{\lambda_j/\alpha_{ij}}, \underbrace{L_{\phi_{ij}(2)}, \ldots, L_{\phi_{ij}(2)}}_{\lambda_j/\alpha_{ij}}, \ldots, \underbrace{L_{\phi_{ij}(\alpha_{ij})}, \ldots, L_{\phi_{ij}(\alpha_{ij})}}_{\lambda_j/\alpha_{ij}} \Big\} \right).$$
Then, we get the extension matrix below:
$$M^e = \begin{bmatrix} V_{11}^e & V_{12}^e & \cdots & V_{1m}^e \\ V_{21}^e & V_{22}^e & \cdots & V_{2m}^e \\ \vdots & \vdots & \ddots & \vdots \\ V_{n1}^e & V_{n2}^e & \cdots & V_{nm}^e \end{bmatrix}$$
Step 2. Establish the ideal LCHV set $V^* = (V_1^*, V_2^*, \ldots, V_m^*)$, where $V_j^* = ([L_\tau, L_\tau], \{\underbrace{L_\tau, L_\tau, \ldots, L_\tau}_{\lambda_j}\})$ for $j = 1, 2, \ldots, m$.
Step 3. According to Equation (2) or Equation (3), calculate the cosine similarity measure between $V_{ij}^e$ and $V_j^*$ to obtain $S_{DLCHV}(V_{ij}^e, V_j^*)$ or $S_{ALCHV}(V_{ij}^e, V_j^*)$.
Step 4. Based on the weight of each attribute Aj (j = 1, 2, …, m), calculate the overall weighted cosine similarity as follows:
$$S_{\omega DLCHVS}(V_i^e, V^*) = \sum_{j=1}^{m} \omega_j S_{DLCHV}(V_{ij}^e, V_j^*) \tag{6}$$
$$S_{\omega ALCHVS}(V_i^e, V^*) = \sum_{j=1}^{m} \omega_j S_{ALCHV}(V_{ij}^e, V_j^*) \tag{7}$$
where $\omega_j \in [0, 1]$ and $\sum_{j=1}^{m} \omega_j = 1$.
Step 5. Rank the alternatives by the overall weighted cosine similarity values; the alternative with the bigger value is the better one.
Step 6. End.
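The whole procedure (Steps 1–5) can be sketched as one routine. This is an illustrative implementation under our own naming and data conventions, not the authors' code: it reuses the `lcmn_extend` helper and one of the measure functions sketched in Section 3.

```python
def magdm_rank(matrix, weights, tau: int, measure):
    """Steps 1-5 of the proposed method. matrix[i][j] = ((p, q), hesitant subscripts)
    is the LCHV of alternative i on attribute j; `measure` is s_d_lchv or s_a_lchv."""
    n, m = len(matrix), len(matrix[0])
    # Step 1: per attribute, extend all hesitant parts to that column's LCMN length.
    ext_cols = [lcmn_extend([matrix[i][j][1] for i in range(n)]) for j in range(m)]
    scores = []
    for i in range(n):
        total = 0.0
        for j in range(m):
            v_ij = (matrix[i][j][0], ext_cols[j][i])
            # Step 2: the ideal LCHV for attribute j has lambda_j copies of L_tau.
            ideal_j = ((tau, tau), [tau] * len(ext_cols[j][i]))
            # Steps 3-4: per-attribute similarity, accumulated with the attribute weight.
            total += weights[j] * measure(v_ij, ideal_j, tau)
        scores.append(total)
    # Step 5: rank alternatives by decreasing overall weighted similarity.
    order = sorted(range(n), key=lambda k: scores[k], reverse=True)
    return scores, ["V" + str(k + 1) for k in order]
```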

5. Illustrative Example

First, we use an example to demonstrate that the proposed MAGDM method with LCHVs is as feasible as the existing method in actual applications. Then we compare and analyze the characteristics of the proposed cosine similarity measures.

5.1. Illustrative Example

A computer company is looking for a software engineer. Four candidates, V1, V2, V3, and V4, are selected from all of the applicants by the human resources department. They are further evaluated by five experts on three aspects: innovation capability, work experience, and self-confidence. The weight vector of the three requirements is w = (0.45, 0.35, 0.2). The five experts evaluate each candidate Vi (i = 1, 2, 3, 4) over the three attributes Aj (j = 1, 2, 3) by LCHVs from the linguistic term set S = {L0 (none), L1 (very low), L2 (low), L3 (slightly low), L4 (moderate), L5 (slightly high), L6 (high), L7 (very high), L8 (super high)}. The following evaluation matrix consists of the LCHV information given by the five experts.
$$D = (V_{ij})_{4 \times 3} = \begin{bmatrix} ([L_4, L_6], \{L_5, L_6\}) & ([L_4, L_6], \{L_4, L_6, L_7\}) & ([L_4, L_7], \{L_5, L_6\}) \\ ([L_3, L_5], \{L_4, L_5, L_6\}) & ([L_5, L_7], \{L_6, L_7\}) & ([L_4, L_6], \{L_4, L_5\}) \\ ([L_5, L_7], \{L_5, L_6\}) & ([L_6, L_7], \{L_4, L_5, L_6\}) & ([L_5, L_7], \{L_4, L_6, L_7\}) \\ ([L_6, L_7], \{L_5, L_6, L_7\}) & ([L_5, L_7], \{L_5, L_7\}) & ([L_4, L_6], \{L_6, L_7\}) \end{bmatrix}$$
Thus, the decision-making steps based on the distance cosine similarity measure are as follows:
Step 1. According to the number $\alpha_{ij}$ of LVs in $L_{\tilde{h}_{ij}}$ of $V_{ij}$ $(i = 1, 2, 3, 4;\ j = 1, 2, 3)$, the LCMN of $(\alpha_{1j}, \alpha_{2j}, \ldots, \alpha_{4j})$ is $\lambda_j = 6$ for $j = 1, 2, 3$. Then each LCHV in the matrix $D = (V_{ij})_{4 \times 3}$ is extended in the following form:
$$D^e = (V_{ij}^e)_{4 \times 3} = \begin{bmatrix} ([L_4, L_6], \{L_5, L_5, L_5, L_6, L_6, L_6\}) & ([L_4, L_6], \{L_4, L_4, L_6, L_6, L_7, L_7\}) & ([L_4, L_7], \{L_5, L_5, L_5, L_6, L_6, L_6\}) \\ ([L_3, L_5], \{L_4, L_4, L_5, L_5, L_6, L_6\}) & ([L_5, L_7], \{L_6, L_6, L_6, L_7, L_7, L_7\}) & ([L_4, L_6], \{L_4, L_4, L_4, L_5, L_5, L_5\}) \\ ([L_5, L_7], \{L_5, L_5, L_5, L_6, L_6, L_6\}) & ([L_6, L_7], \{L_4, L_4, L_5, L_5, L_6, L_6\}) & ([L_5, L_7], \{L_4, L_4, L_6, L_6, L_7, L_7\}) \\ ([L_6, L_7], \{L_5, L_5, L_6, L_6, L_7, L_7\}) & ([L_5, L_7], \{L_5, L_5, L_5, L_7, L_7, L_7\}) & ([L_4, L_6], \{L_6, L_6, L_6, L_7, L_7, L_7\}) \end{bmatrix}$$
Step 2. Establish the ideal LCHV set $V^* = (V_1^*, V_2^*, V_3^*)$. Since $\lambda_j = 6$ for $j = 1, 2, 3$, we have $V_j^* = ([L_8, L_8], \{L_8, L_8, L_8, L_8, L_8, L_8\})$ for $j = 1, 2, 3$. If the $\lambda_j$ were unequal, the sizes of the $V_j^*$ would differ.
Step 3. Calculate the cosine similarity measures between $V_{ij}^e$ and $V_j^*$ based on Equation (2). Taking $X_1 = V_{11}^e = ([L_4, L_6], \{L_5, L_5, L_5, L_6, L_6, L_6\})$ and $X_2 = V_1^* = ([L_8, L_8], \{L_8, L_8, L_8, L_8, L_8, L_8\})$, we calculate $S_{DLCHV}(V_{11}^e, V_1^*)$ as follows:
$$\begin{aligned} S_{DLCHV}(V_{11}^e, V_1^*) = S_{DLCHV}(X_1, X_2) &= \frac{1}{2}\left\{ \cos\!\left( \frac{|p_1 - p_2| + |q_1 - q_2|}{4\tau}\,\pi \right) + \cos\!\left( \frac{\sum_{i=1}^{\lambda} |\phi_{1i} - \phi_{2i}|}{2\tau\lambda}\,\pi \right) \right\} \\ &= \frac{1}{2}\left\{ \cos\!\left( \frac{|4 - 8| + |6 - 8|}{4 \times 8}\,\pi \right) + \cos\!\left( \frac{3|5 - 8| + 3|6 - 8|}{2 \times 8 \times 6}\,\pi \right) \right\} \\ &= \frac{1}{2}\left\{ \cos\!\left( \frac{3}{16}\pi \right) + \cos\!\left( \frac{5}{32}\pi \right) \right\} = 0.8567 \end{aligned}$$
Similarly, we can get all the values of $S_{DLCHV}(V_{ij}^e, V_j^*)$ for $i = 1, 2, 3, 4$ and $j = 1, 2, 3$:
$S_{DLCHV}(V_{11}^e, V_1^*) = 0.8567$, $S_{DLCHV}(V_{12}^e, V_2^*) = 0.8642$, $S_{DLCHV}(V_{13}^e, V_3^*) = 0.8819$;
$S_{DLCHV}(V_{21}^e, V_1^*) = 0.7693$, $S_{DLCHV}(V_{22}^e, V_2^*) = 0.9404$, $S_{DLCHV}(V_{23}^e, V_3^*) = 0.8022$;
$S_{DLCHV}(V_{31}^e, V_1^*) = 0.9029$, $S_{DLCHV}(V_{32}^e, V_2^*) = 0.9194$, $S_{DLCHV}(V_{33}^e, V_3^*) = 0.9104$;
$S_{DLCHV}(V_{41}^e, V_1^*) = 0.9404$, $S_{DLCHV}(V_{42}^e, V_2^*) = 0.9239$, $S_{DLCHV}(V_{43}^e, V_3^*) = 0.8942$.
Step 4. Through Equation (6), we can get the overall weighted distance cosine similarity as follows:
$$\begin{aligned} S_{\omega DLCHVS}(V_1^e, V^*) &= \sum_{j=1}^{3} \omega_j S_{DLCHV}(V_{1j}^e, V_j^*) = 0.45 \times 0.8567 + 0.35 \times 0.8642 + 0.2 \times 0.8819 = 0.8644 \\ S_{\omega DLCHVS}(V_2^e, V^*) &= \sum_{j=1}^{3} \omega_j S_{DLCHV}(V_{2j}^e, V_j^*) = 0.45 \times 0.7693 + 0.35 \times 0.9404 + 0.2 \times 0.8022 = 0.8358 \\ S_{\omega DLCHVS}(V_3^e, V^*) &= \sum_{j=1}^{3} \omega_j S_{DLCHV}(V_{3j}^e, V_j^*) = 0.45 \times 0.9029 + 0.35 \times 0.9194 + 0.2 \times 0.9104 = 0.9102 \\ S_{\omega DLCHVS}(V_4^e, V^*) &= \sum_{j=1}^{3} \omega_j S_{DLCHV}(V_{4j}^e, V_j^*) = 0.45 \times 0.9404 + 0.35 \times 0.9239 + 0.2 \times 0.8942 = 0.9254 \end{aligned}$$
Step 5. Using the similarity results, we can rank the four candidates as $V_4 \succ V_3 \succ V_1 \succ V_2$.
Similarly, according to the included-angle cosine similarity measure of two LCHVs in Equation (3), taking $X_1 = V_{11}^e = ([L_4, L_6], \{L_5, L_5, L_5, L_6, L_6, L_6\})$ and $X_2 = V_1^* = ([L_8, L_8], \{L_8, L_8, L_8, L_8, L_8, L_8\})$, we calculate $S_{ALCHV}(V_{11}^e, V_1^*)$ as follows:
$$\begin{aligned} S_{ALCHV}(V_{11}^e, V_1^*) = S_{ALCHV}(X_1, X_2) &= \frac{1}{2}\left\{ \frac{p_1 p_2 + q_1 q_2}{\sqrt{p_1^2 + q_1^2}\,\sqrt{p_2^2 + q_2^2}} + \frac{\sum_{i=1}^{\lambda} \phi_{1i}\phi_{2i}}{\sqrt{\sum_{i=1}^{\lambda} \phi_{1i}^2}\,\sqrt{\sum_{i=1}^{\lambda} \phi_{2i}^2}} \right\} \\ &= \frac{1}{2}\left\{ \frac{4 \times 8 + 6 \times 8}{\sqrt{4^2 + 6^2}\,\sqrt{8^2 + 8^2}} + \frac{3 \times 5 \times 8 + 3 \times 6 \times 8}{\sqrt{3 \times 5^2 + 3 \times 6^2}\,\sqrt{6 \times 8^2}} \right\} \\ &= \frac{1}{2}\left\{ \frac{5}{\sqrt{26}} + \frac{33}{\sqrt{1098}} \right\} = 0.9882 \end{aligned}$$
In the same way, we can obtain the cosine similarity values of each attribute $S_{ALCHV}(V_{ij}^e, V_j^*)$ for $i = 1, 2, 3, 4$ and $j = 1, 2, 3$ as follows:
$S_{ALCHV}(V_{11}^e, V_1^*) = 0.9882$, $S_{ALCHV}(V_{12}^e, V_2^*) = 0.9786$, $S_{ALCHV}(V_{13}^e, V_3^*) = 0.9803$;
$S_{ALCHV}(V_{21}^e, V_1^*) = 0.9785$, $S_{ALCHV}(V_{22}^e, V_2^*) = 0.9917$, $S_{ALCHV}(V_{23}^e, V_3^*) = 0.9872$;
$S_{ALCHV}(V_{31}^e, V_1^*) = 0.9911$, $S_{ALCHV}(V_{32}^e, V_2^*) = 0.9965$, $S_{ALCHV}(V_{33}^e, V_3^*) = 0.9815$;
$S_{ALCHV}(V_{41}^e, V_1^*) = 0.9940$, $S_{ALCHV}(V_{42}^e, V_2^*) = 0.9864$, $S_{ALCHV}(V_{43}^e, V_3^*) = 0.9888$.
Then, by Equation (7), we obtain the overall weighted included-angle cosine similarity values as follows:
$$\begin{aligned} S_{\omega ALCHVS}(V_1^e, V^*) &= \sum_{j=1}^{3} \omega_j S_{ALCHV}(V_{1j}^e, V_j^*) = 0.45 \times 0.9882 + 0.35 \times 0.9786 + 0.2 \times 0.9803 = 0.9833 \\ S_{\omega ALCHVS}(V_2^e, V^*) &= \sum_{j=1}^{3} \omega_j S_{ALCHV}(V_{2j}^e, V_j^*) = 0.45 \times 0.9785 + 0.35 \times 0.9917 + 0.2 \times 0.9872 = 0.9849 \\ S_{\omega ALCHVS}(V_3^e, V^*) &= \sum_{j=1}^{3} \omega_j S_{ALCHV}(V_{3j}^e, V_j^*) = 0.45 \times 0.9911 + 0.35 \times 0.9965 + 0.2 \times 0.9815 = 0.9911 \\ S_{\omega ALCHVS}(V_4^e, V^*) &= \sum_{j=1}^{3} \omega_j S_{ALCHV}(V_{4j}^e, V_j^*) = 0.45 \times 0.9940 + 0.35 \times 0.9864 + 0.2 \times 0.9888 = 0.9903 \end{aligned}$$
From the above similarity results, the four alternatives are ranked as $V_3 \succ V_4 \succ V_2 \succ V_1$. The ranking order based on the included-angle cosine similarity measure of two LCHVs differs from that based on the distance cosine similarity measure.
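As a quick way to run the whole procedure on the data of Section 5.1, the fragment below feeds the decision matrix and the weight vector into the `magdm_rank` routine sketched in Section 4 (again an assumption of this illustration, not code from [18]).

```python
# Decision matrix D from Section 5.1; each entry is ((p, q), hesitant subscripts).
D = [
    [((4, 6), [5, 6]),    ((4, 6), [4, 6, 7]), ((4, 7), [5, 6])],
    [((3, 5), [4, 5, 6]), ((5, 7), [6, 7]),    ((4, 6), [4, 5])],
    [((5, 7), [5, 6]),    ((6, 7), [4, 5, 6]), ((5, 7), [4, 6, 7])],
    [((6, 7), [5, 6, 7]), ((5, 7), [5, 7]),    ((4, 6), [6, 7])],
]
w = [0.45, 0.35, 0.2]

scores_d, rank_d = magdm_rank(D, w, tau=8, measure=s_d_lchv)
scores_a, rank_a = magdm_rank(D, w, tau=8, measure=s_a_lchv)
print(rank_d)   # distance-based ranking: ['V4', 'V3', 'V1', 'V2']
print(rank_a)   # included-angle ranking, printed for comparison
```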

5.2. Related Comparison

Ye [18] proposed the LCHVWAA and LCHVWGA aggregation operators of LCHVs. Based on these existing operators and the cosine similarity measures proposed in this paper, the MAGDM results for the above example are shown in Table 1.
We can see from Table 1 that the results based on the distance cosine similarity measure are consistent with those provided by Ye [18], but the results based on the included-angle cosine similarity measure differ from them.

5.3. Extension Analysis

To analyze why the two cosine similarity measures yield different DM results, consider two LCHVs $Z_1 = ([L_5, L_7], \{L_4, L_6\})$ and $Z_2 = ([L_4, L_6], \{L_5, L_7\})$ from a practical example. Suppose the LCMN is 6; then $Z_1^e = ([L_5, L_7], \{L_4, L_4, L_4, L_6, L_6, L_6\})$ and $Z_2^e = ([L_4, L_6], \{L_5, L_5, L_5, L_7, L_7, L_7\})$ are the extensions of $Z_1$ and $Z_2$. To compare the influence of the extension on the two cosine similarity measures, take $V = ([L_8, L_8], \{L_8, L_8\})$ and $V^* = ([L_8, L_8], \{L_8, L_8, L_8, L_8, L_8, L_8\})$. The results of $S_{DLCHV}(Z_i, V)$, $S_{ALCHV}(Z_i, V)$, $S_{DLCHV}(Z_i^e, V^*)$, and $S_{ALCHV}(Z_i^e, V^*)$ for $i = 1, 2$ are listed in Table 2.
From the table, we can see that $S_{DLCHV}(Z_1, V)$ is equal to $S_{DLCHV}(Z_2, V)$, and $S_{ALCHV}(Z_1, V)$ is equal to $S_{ALCHV}(Z_2, V)$ as well. The ranges of the hesitant and uncertain parts are interchanged in the two LCHVs, and the results of the two cosine measures are the same before extension. After extension, the included-angle cosine similarity measures with respect to the ideal LCHV are still equal, but the distance cosine similarity measures with respect to the ideal LCHV differ. Moreover, the included-angle cosine similarity measures with respect to the ideal LCHV increase with the extension, whereas the distance cosine similarity measures decrease. In this case, the cosine similarity measure based on distance is more sensitive to the extension of LCHVs.

6. Sensitivity Analysis with Respect to Weight Changes

Scholars have proposed various models to analyze the stability of multi-criteria decision-making (MCDM) methods [35,36]. As shown in the above section, the two similarity measure MAGDM methods produced different ranking results. We usually choose the best alternative according to the ranking results, and the ranking results largely depend on the values of the weight coefficients. We therefore performed a sensitivity analysis to assess how changes in the weights would change the ranking of the alternatives. The sensitivity analysis covers eight scenarios with different weight coefficients, as shown in Table 3.
The ranking results for each scenario are shown in Table 4 and Table 5. The results show that the rankings of the alternatives change as the weight coefficients change, as can be seen in Table 4. The similarity measure based on distance prefers alternative V4, which is ranked first in seven out of the eight scenarios. By comparison, in Table 5, alternatives V3 and V4 are the two best alternatives under the included-angle cosine similarity measure.
The sensitivity analysis showed that both similarity measure methods were sensitive to changes in the weights. The similarity measure based on distance was relatively stable and mostly favored alternative V4. The similarity measure based on the included angle was more sensitive to weight changes and its ranking results changed considerably, but the worst alternative was always V1.
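The weight-scenario analysis is straightforward to reproduce from the per-attribute distance similarities listed in Section 5.1. The self-contained Python sketch below re-weights those values under the eight scenarios of Table 3 and re-ranks the candidates; the printed orders agree with Table 4 (the scenario labels and variable names are ours).

```python
# Per-attribute distance similarities S_D(V_ij^e, V_j^*) from Section 5.1 (rows V1-V4).
s_d = [
    [0.8567, 0.8642, 0.8819],
    [0.7693, 0.9404, 0.8022],
    [0.9029, 0.9194, 0.9104],
    [0.9404, 0.9239, 0.8942],
]
# Weight scenarios of Table 3.
scenarios = {
    "S-1": (0.33, 0.33, 0.33), "S-2": (0.8, 0.1, 0.1),
    "S-3": (0.1, 0.8, 0.1),    "S-4": (0.1, 0.1, 0.8),
    "S-5": (0.4, 0.4, 0.2),    "S-6": (0.2, 0.4, 0.4),
    "S-7": (0.4, 0.2, 0.4),    "S-8": (0.45, 0.3, 0.25),
}
for name, w in scenarios.items():
    totals = [sum(wj * sij for wj, sij in zip(w, row)) for row in s_d]
    order = sorted(range(4), key=lambda i: totals[i], reverse=True)
    print(name, " > ".join("V" + str(i + 1) for i in order))
# S-4, for instance, yields V3 > V4 > V1 > V2, matching the S-4 column of Table 4.
```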

7. Conclusions

In this paper, the distance and included-angle cosine similarity measures were applied for the first time to deal with LCHV information in an MAGDM environment. We then established a novel MAGDM approach on the basis of the LCMN extension and the cosine similarity measures of LCHVs, and presented a practical example to implement the proposed DM method. Although both approaches can solve the MAGDM problem, their DM results differed in the example case. We then compared and discussed the impact of the extension on the two cosine similarity measures, and finally analyzed the sensitivity of the two methods to the weights. The main highlights of the proposed method are summarized below.
(1) Cosine similarity measures based on distance and included angle were used to solve an MAGDM problem with LCHV information for the first time. By using the linguistic scale function, the calculation of the similarity measures is simple and requires few operations.
(2) To demonstrate the stability of the proposed methods, a sensitivity analysis with respect to weight changes was performed. By comparison, the similarity measure based on distance was a better fit for the engineer selection case.
(3) Although the LCMN extension method is comparatively objective, this paper provides only a preliminary analysis of the influence of hesitation extension on the similarity measures.
In the future, more research could be done on LCHV MAGDM. For instance, aggregation operators or measures that are not affected by the degree of hesitation could be put forward, and more models could be used to analyze the stability of the proposed LCHV MAGDM methods so that decision-makers can choose appropriate methods based on their stability. The proposed methods can also be applied to various other fields.

Author Contributions

J.Y. proposed the cosine similarity measure operators. X.L. presented the MAGDM methods and comparative analysis. All authors wrote the paper together.

Funding

This research received no external funding.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Chatterjee, P.; Mondal, S.; Boral, S. A novel hybrid method for non-traditional machining process selection using factor relationship and Multi-Attributive Border Approximation Method. Facta Univ. Ser. Mech. Eng. 2017, 15, 439–456.
2. Dragan, P.; Ibrahim, B.; Korica, S.; Radojko, O. A novel approach for the selection of power generation technology using a linguistic neutrosophic combinative distance-based assessment (CODAS) method: A case study in Libya. Energies 2018, 11, 2489.
3. Liu, F.; Wu, G.A.; Vesko, L.; Milena, V. A multicriteria model for the selection of the transport service provider: A single valued neutrosophic DEMATEL multicriteria model. Decis. Mak. Appl. Manag. Eng. 2018, 1, 121–130.
4. Zadeh, L.A. The concept of a linguistic variable and its application to approximate reasoning Part I. Inf. Sci. 1975, 8, 199–249.
5. Herrera, F.; Herrera-Viedma, E.; Verdegay, L. A model of consensus in group decision making under linguistic assessments. Fuzzy Sets Syst. 1996, 79, 73–87.
6. Herrera, F.; Herrera-Viedma, E. Linguistic decision analysis: Steps for solving decision problems under linguistic information. Fuzzy Sets Syst. 2000, 115, 67–82.
7. Xu, Z.S. A method based on linguistic aggregation operators for group decision making with linguistic preference relations. Inf. Sci. 2004, 166, 19–30.
8. Xu, Z.S. A note on linguistic hybrid arithmetic averaging operator in multiple attribute group decision making with linguistic information. Group Decis. Negot. 2006, 15, 593–604.
9. Merigó, J.M.; Casanovas, M.; Martínez, L. Linguistic aggregation operators for linguistic decision making based on the Dempster-Shafer theory of evidence. Int. J. Uncertain. Fuzz. Knowl.-Based Syst. 2010, 18, 287–304.
10. Xu, Y.J.; Merigó, J.M.; Wang, H.M. Linguistic power aggregation operators and their application to multiple attribute group decision making. Appl. Math. Model. 2012, 36, 5427–5444.
11. Merigó, J.M.; Casanovas, M.; Palacios-Marqués, D. Linguistic group decision making with induced aggregation operators and probabilistic information. Appl. Soft Comput. 2014, 24, 669–678.
12. Xu, Z.S. Uncertain linguistic aggregation operators based approach to multiple attribute group decision making under uncertain linguistic environment. Inf. Sci. 2004, 168, 171–184.
13. Xu, Z.S. Induced uncertain linguistic OWA operators applied to group decision making. Inf. Sci. 2006, 7, 231–238.
14. Wei, G.W. Uncertain linguistic hybrid geometric mean operator and its application to group decision making under uncertain linguistic environment. Int. J. Uncertain. Fuzz. Knowl.-Based Syst. 2009, 17, 251–267.
15. Peng, B.; Ye, C.; Zeng, S. Uncertain pure linguistic hybrid harmonic averaging operator and generalized interval aggregation operator based approach to group decision making. Int. J. Uncertain. Fuzz. Knowl.-Based Syst. 2012, 36, 175–181.
16. Ye, J. Multiple attribute decision-making method based on linguistic cubic variables. J. Intell. Fuzzy Syst. 2018, 34, 2351–2361.
17. Lu, X.P.; Ye, J. Dombi aggregation operators of linguistic cubic variables for multiple attribute decision making. Information 2018, 9, 188.
18. Ye, J.; Cui, W.H. Multiple Attribute Decision-Making Method Using Linguistic Cubic Hesitant Variable. Algorithms 2018, 11, 135.
19. Zhu, B.; Xu, Z. Consistency Measures for Hesitant Fuzzy Linguistic Preference Relations. IEEE Trans. Fuzzy Syst. 2014, 22, 35–45.
20. Liao, H.; Xu, Z.; Zeng, X.J.; Merigó, J.M. Qualitative decision making with correlation coefficients of hesitant fuzzy linguistic term sets. Int. J. Uncertain. Fuzz. Knowl.-Based Syst. 2015, 76, 127–138.
21. Ye, J. Multiple-attribute Decision-Making Method under a Single-Valued Neutrosophic Hesitant Fuzzy Environment. J. Intell. Syst. 2014, 24, 23–36.
22. Ye, J. Multiple Attribute Decision-Making Methods Based on the Expected Value and the Similarity Measure of Hesitant Neutrosophic Linguistic Numbers. Cogn. Comput. 2018, 10, 454–463.
23. Biswas, P.; Pramanik, S.; Giri, B.C. Cosine Similarity Measure Based Multi-attribute Decision-making with Trapezoidal Fuzzy Neutrosophic Numbers. Neutrosophic Sets Syst. 2014, 8, 47–57.
24. Mahmood, T.; Ye, J.; Khan, Q. Vector similarity measures for simplified neutrosophic hesitant fuzzy set and their applications. J. Inequal. Spec. Funct. 2016, 7, 176–194.
25. Shi, L.L.; Ye, J. Cosine Measures of Linguistic Neutrosophic Numbers and Their Application in Multiple Attribute Group Decision-Making. Information 2017, 8, 117.
26. Lu, Z.K.; Ye, J. Cosine Measures of Neutrosophic Cubic Sets for Multiple Attribute Decision-Making. Symmetry 2017, 9, 121.
27. Cui, W.H.; Ye, J. Multiple-Attribute Decision-Making Method Using Similarity Measures of Hesitant Linguistic Neutrosophic Numbers Regarding Least Common Multiple Cardinality. Symmetry 2018, 10, 330.
28. Saaty, T.L. Analytic Hierarchy Process; McGraw-Hill: New York, NY, USA, 1980.
29. Gabus, A.; Fontela, E. World Problems an Invitation to Further Thought within the Framework of DEMATEL; Battelle Geneva Research Centre: Geneva, Switzerland, 1972; pp. 1–8.
30. Rezaei, J. Best-worst multi-criteria decision-making method. Omega 2015, 53, 49–57.
31. Dragan, P.; Željko, S.; Siniša, S. A New Model for Determining Weight Coefficients of Criteria in MCDM Models: Full Consistency Method (FUCOM). Symmetry 2018, 10, 393.
32. Lashko, S.I.; Lashko, T.A. The selection of wagons for the internal transport of a logistics company: A novel approach based on rough BWM and rough SAW methods. Symmetry 2017, 9, 264.
33. Stević, Ž.; Pamučar, D.; Subotić, M.; Antuchevičiene, J.; Zavadskas, E. The location selection for roundabout construction using Rough BWM–Rough WASPAS approach based on a new Rough Hamy aggregator. Sustainability 2017, 10, 2817.
34. Dragan, P.; Željko, S.; Edmundas, K.Z. Integration of interval rough AHP and interval rough MABAC methods for evaluating university web pages. Appl. Soft Comput. 2018, 67, 141–163.
35. Dragan, P.; Darko, B.; Aca, R. Multi-criteria decision making: An example of sensitivity analysis. Serb. J. Manag. 2017, 11, 1–27.
36. Irik, M.; Dragan, P. A sensitivity analysis in MCDM problems: A statistical approach. Decis. Mak. Appl. Manag. Eng. 2018, 1, 51–80.
Table 1. LCHV 1 MAGDM 2 results.

| MAGDM Method | Similarity or Score | Ranking Order | The Best |
| $S_{\omega DLCHVS}(V_i^e, V^*)$ | 0.8644, 0.8358, 0.9102, 0.9254 | $V_4 \succ V_3 \succ V_1 \succ V_2$ | V4 |
| $S_{\omega ALCHVS}(V_i^e, V^*)$ | 0.9833, 0.9849, 0.9911, 0.9903 | $V_3 \succ V_4 \succ V_2 \succ V_1$ | V3 |
| LCHVWAA 3 [18] | 5.3666, 5.3228, 5.8822, 6.1173 | $V_4 \succ V_3 \succ V_1 \succ V_2$ | V4 |
| LCHVWGA 4 [18] | 5.3172, 5.0878, 5.8428, 6.0388 | $V_4 \succ V_3 \succ V_1 \succ V_2$ | V4 |

1 LCHV = linguistic cubic hesitant variable; 2 MAGDM = multiple attribute group decision making; 3 LCHVWAA = linguistic cubic hesitant variable weighted arithmetic average; 4 LCHVWGA = linguistic cubic hesitant variable weighted geometric average.
Table 2. Similarity measures before and after extension.

| Distance Similarity | Result | Included Angle Similarity | Result |
| $S_{DLCHV}(Z_1, V)$ | 0.9999 | $S_{ALCHV}(Z_1, V)$ | 0.9831 |
| $S_{DLCHV}(Z_2, V)$ | 0.9999 | $S_{ALCHV}(Z_2, V)$ | 0.9831 |
| $S_{DLCHV}(Z_1^e, V^*)$ | 0.9839 | $S_{ALCHV}(Z_1^e, V^*)$ | 0.9839 |
| $S_{DLCHV}(Z_2^e, V^*)$ | 0.9851 | $S_{ALCHV}(Z_2^e, V^*)$ | 0.9839 |
Table 3. Scenarios with different attribute weights.

| Scenario | A1 | A2 | A3 |
| S-1: Uniform weights | 0.33 | 0.33 | 0.33 |
| S-2: Priority of attribute A1 | 0.8 | 0.1 | 0.1 |
| S-3: Priority of attribute A2 | 0.1 | 0.8 | 0.1 |
| S-4: Priority of attribute A3 | 0.1 | 0.1 | 0.8 |
| S-5: Priority of attributes A1, A2 | 0.4 | 0.4 | 0.2 |
| S-6: Priority of attributes A2, A3 | 0.2 | 0.4 | 0.4 |
| S-7: Priority of attributes A1, A3 | 0.4 | 0.2 | 0.4 |
| S-8: Given weights | 0.45 | 0.3 | 0.25 |
Table 4. Alternatives ranking for different weight scenarios (distance similarity).

| Alternative | S-1 | S-2 | S-3 | S-4 | S-5 | S-6 | S-7 | S-8 |
| V1 | 3 | 3 | 4 | 3 | 3 | 3 | 3 | 3 |
| V2 | 4 | 4 | 3 | 4 | 4 | 4 | 4 | 4 |
| V3 | 2 | 2 | 2 | 1 | 2 | 2 | 2 | 2 |
| V4 | 1 | 1 | 1 | 2 | 1 | 1 | 1 | 1 |
Table 5. Alternatives ranking for different weight scenarios (included angle similarity).

| Alternative | S-1 | S-2 | S-3 | S-4 | S-5 | S-6 | S-7 | S-8 |
| V1 | 4 | 3 | 4 | 4 | 4 | 4 | 4 | 4 |
| V2 | 3 | 4 | 2 | 2 | 3 | 3 | 3 | 3 |
| V3 | 2 | 2 | 1 | 3 | 1 | 1 | 2 | 1 |
| V4 | 1 | 1 | 3 | 1 | 2 | 2 | 1 | 2 |
