Article

Making Group Decisions within the Framework of a Probabilistic Hesitant Fuzzy Linear Regression Model

1 Department of Statistics, Lahore Campus, COMSATS University Islamabad, Islamabad 45550, Pakistan
2 Research Team on Intelligent Decision Support Systems, Department of Artificial Intelligence and Applied Mathematics, Faculty of Computer Science and Information Technology, West Pomeranian University of Technology in Szczecin, ul. Zołnierska 49, 71-210 Szczecin, Poland
3 National Institute of Telecommunications, Szachowa 1, 04-894 Warsaw, Poland
4 Department of Mathematics, Virtual University of Pakistan, Lahore 54000, Pakistan
* Author to whom correspondence should be addressed.
Sensors 2022, 22(15), 5736; https://doi.org/10.3390/s22155736
Submission received: 14 July 2022 / Revised: 26 July 2022 / Accepted: 27 July 2022 / Published: 31 July 2022
(This article belongs to the Special Issue Fuzzy Systems and Neural Networks for Engineering Applications)

Abstract: A fuzzy set extension known as the hesitant fuzzy set (HFS) has grown in popularity for decision making in recent years, especially when experts have trouble evaluating several alternatives with a single assessment value in a fuzzy environment. However, it suffers from a significant problem in its applications, namely considerable loss of information. The probabilistic hesitant fuzzy set (PHFS) has been proposed to improve the HFS: it attaches probability values to the HFS and can therefore retain more information. Previously, fuzzy regression models such as the fuzzy linear regression model (FLRM) and the hesitant fuzzy linear regression model were used for decision making; however, these models do not provide information about the distribution. To address this issue, we propose a probabilistic hesitant fuzzy linear regression model (PHFLRM) that incorporates distribution information to handle multi-criteria decision-making (MCDM) problems. The PHFLRM treats the input-output (IPOP) variables as probabilistic hesitant fuzzy elements (PHFEs) and uses a linear programming model (LPM) to estimate the parameters. A case study illustrates the proposed methodology. Additionally, an MCDM technique called the technique for order preference by similarity to ideal solution (TOPSIS) is employed, and its rankings are compared with those of the PHFLRM. Lastly, Spearman's rank correlation test assesses the statistical significance of the two sets of rankings.

1. Introduction

Statistical regression analysis is a valuable tool for determining the functional relationship between an output variable (the dependent variable) and input variables (the independent variables). In statistical regression analysis, the relationship between IPOP variables is determined using precise data and crisp relationships. However, when a phenomenon is imprecise, when variability is vague rather than stochastic, or when the distributional assumptions of the underlying regression model are violated or cannot be tested (e.g., due to a small sample size), it is more reasonable to assume a fuzzy relationship than a crisp one. Several researchers have modified and extended statistical regression analysis to overcome these limitations using fuzzy set theory (FST). Tanaka et al. [1] first introduced fuzzy regression analysis employing an LPM. Tanaka [2] later introduced fuzzy intervals, and Celmin [3] and Diamond [4] introduced fuzzy least-square models. Tanaka's model was very sensitive to outliers, so Peters [5] generalized the approach [1] such that output values no longer simply fall within or outside the interval but instead belong to it to a certain degree of membership. Wang and Tsaur [6] presented a variable selection approach for an FLRM with crisp input and fuzzy output based on two criteria: the minimal total sum of vagueness and the minimal total sum of squares in estimation. Hong et al. [7] used fuzzy arithmetic operations to evaluate FLRMs based on Tanaka's approach [1], where both IPOP data are fuzzy numbers (FNs); the solutions are derived using a generalized linear algorithm. Sakawa [7] modified the FLRM by introducing a two-phase construction of a linear regression model, incorporating least-square estimation and an LPM in different phases.
Tanaka and Lee [8] used the proposed identification approach to perform exponential possibility regression analysis, resulting in a smaller region of possibility distribution that considered all possible sets of IPOP linear systems. To make LPMs more predictable and reduce computational effort, Modarres et al. [9] developed three FLRMs: risk seeking, risk neutral, and risk averse. They also developed a mathematical programming model to estimate FLRM parameters from crisp input and fuzzy output data. Parvathi et al. [10] introduced intuitionistic FLRM by incorporating an extension of FST called an intuitionistic fuzzy set into FLRM, where the parameters are symmetric triangular intuitionistic FNs. The parameters of an intuitionistic FLRM are estimated using an LPM that minimizes the total fuzziness of intuitionistic FLRM, which is associated with the width of intuitionistic fuzzy parameters. Sultan et al. [11] developed a fuzzy regression model employing HFS to solve a decision-making problem, in which IPOP variables are observed as hesitant fuzzy elements.
Nonlinear programming is used when the constraints or the objective function are nonlinear. To account for the nonlinear situation, Bárdossy [12]—considering regression models for FNs and the nonlinear problem—developed a generalized mathematical programming model. When the relationship between IPOP variables is intricate and nonlinear, determining the number of input variables for the model selection and the number of powers for input variables is challenging. Fuzzy regression analysis has also been studied from the perspective of the least-square approach, where the variability between the predicted fuzzy values and the actual fuzzy data is minimized for different distance measurements between two FNs. The fuzzy least-square method was initially proposed by Celmin [3] and Diamond [4] simultaneously, who estimated the fuzzy model parameters by minimizing the sum of squared error of the output variable.
The FST, introduced by Zadeh [13], is an excellent tool for describing ambiguous or vague information, and the FST and its generalizations are powerful tools used in many fields [14,15]. The FST effectively employs membership functions and fuzzy numbers to deal with uncertainty in decision-making problems. It nevertheless has limitations when dealing with imprecise and vague data; as a result, it has been developed in a number of directions, including type-2 FSs [16], hesitant FSs [17], probabilistic hesitant FSs [18], and intuitionistic FSs [19]. Recently, decision-making problems involving hesitant fuzzy information have become a significant focus of research. Liu et al. [20] introduced a correlation coefficient approach to determine the strength of association between HFSs, which can be used to evaluate whether they are negatively or positively associated. Zeng et al. [21] introduced the weighted dual HFS, along with a few fundamental mathematical operations for weighted dual hesitant fuzzy elements, including union, intersection, multiplication, and complement; and Yan et al. [22] proposed a mathematical model for monitoring and evaluating bridge safety based on the HFS. The HFS has a significant weakness in terms of information loss; to address this flaw, an extension of the HFS called the PHFS has been proposed, which enhances the HFS with probability and is capable of retaining more information. Zhu and Xu [18] first developed the concept of the PHFS, which incorporates distribution information into the HFS. Afterward, Zhang et al. [23] improved the PHFE and developed properties and aggregation operations for the modified PHFEs. In addition, Gao et al. [24] introduced a dynamic reference point technique using the PHFS for emergency response based on probabilistic hesitant fuzzy information.
Li and Wang [25] modified the QUALIFLEX approach to include probabilistic hesitant fuzzy environments and applied the suggested method to the selection of green suppliers; Wu et al. [26] developed a novel consensus-achieving approach for probabilistic hesitant fuzzy group decision making, and they implemented the suggested method to evaluate the strategic positions of energy transmission and distribution networks, and so on.
A single criterion is not enough in real-world decision-making problems, as they are often poorly structured and highly complex. Multi-criteria decision-making (MCDM) methods solve complex problems and help to make the right decision; finding the best alternative among multiple alternatives is a challenging task. Several techniques are used to assist DMs in ranking the alternatives, such as the Analytic Hierarchy Process [27], the Best Worst Method [28], EDAS (Evaluation based on Distance from Average Solution) [29], and TOPSIS (Technique for Order Preference by Similarity to Ideal Solution) [30]. The TOPSIS method is a well-known technique that considers the distances to both the positive ideal solution (PIS) and the negative ideal solution (NIS) simultaneously, and assigns a preference order based on relative closeness, a combination of these two distance measures. Recently, many papers have been devoted to developing new approaches, i.e., a new logarithm methodology of additive weights [31], FUCOM [32], COMET extensions [33,34], the WASPAS method [35], SPOTIS [36,37], RAFSI [38], and an integrated SWOT-fuzzy PIPRECIA [39]. These methods are valuable and address the main challenges of MCDA techniques, such as resistance to the rank reversal paradox or handling uncertainty. Sometimes, authors propose new operators to support decision making [40,41,42].
The literature review shows how the area of regression analysis has developed gradually and how researchers continue to show increasing interest over time. Most of the researchers' attention has focused on the FLRM, a simple linear regression model developed using the FST. Still, several extensions of the FST can be employed in the FLRM for complex problems. The PHFS works in a hesitant environment so that a researcher not only collects information as an HFS, but also finds probability values for each HFE; the resulting elements are referred to as PHFEs. Motivated by the PHFS, we extend the fuzzy regression model developed by Peters [5] using probabilistic information in a hesitant environment, calling the result the PHFLRM, where IPOP variables are observed as PHFEs. We introduce the concept of the PHFLRM such that the model's coefficients are symmetric triangular fuzzy numbers (STFNs). Consequently, the PHFLRM incorporates these PHFEs into the fuzzy regression analysis and uses the LPM to estimate the PHFLRM parameters. Furthermore, alternatives are ranked according to the residual values of the proposed PHFLRM. The proposed approach is evaluated by comparing the results of the PHFLRM to those of TOPSIS, the most popular MCDM technique. Previously, fuzzy regression models such as the FLRM [43] and the HFLRM [11] were used for decision making; however, these models do not give distribution information. The novelty of our proposed PHFLRM is that it incorporates distribution information to account for multi-criteria decision-making (MCDM) problems.
This study is organized as follows: In Section 2, some basic definitions and terminologies are discussed. In Section 3, we establish the idea of the PHFLRM. Section 4 includes an algorithm for the proposed PHFLRM approach. Section 5 presents an application example of the proposed approach and a comparative study of the PHFLRM with the TOPSIS method. This study concludes in Section 6.

2. Preliminaries

This section discusses basic definitions and terminologies to help readers understand the proposed approach. It is generally tough to reach a final conclusion, because people are usually hesitant when making decisions. Torra [17] developed the following definition of HFS in consideration of this problem:
Definition 1
([17]). For a fixed set Z, a HFS on Z is a function that, when applied to Z, returns a subset of values that fall within the interval $[0, 1]$. Mathematically, it is defined as:
$E = \{ \langle z, h_E(z) \rangle \mid z \in Z \}$,
where $h_E(z)$ denotes the set of possible hesitant membership degrees of $z \in Z$ to the set $E$, and it is called a hesitant fuzzy element (HFE).
The PHFS proposed by Zhu and Xu [18] is an enhanced form of HFS that not only addresses the situation in which decision makers are uncertain as to which of several assessment values best represents their perspective, but also assigns varying probabilities to the assessed values. Mathematically, it is defined as:
Definition 2
([18]). Let Z be a reference set; then a PHFS on Z is defined as:
$E_p = \{ \langle z, h_z(\gamma_l \mid p_l) \rangle \mid z \in Z \}$,
where $h_z(\gamma_l \mid p_l)$ denotes the probabilistic membership degrees of the element $z \in Z$ to the set $E_p$. It is referred to as a PHFE and can take several membership degrees $\gamma_l$ $(l = 1, 2, \ldots, \#h_z(p))$ with probabilities $p_l$ such that $\sum_{l=1}^{\#h_z(p)} p_l = 1$. For the sake of convenience, we abbreviate $h_z(\gamma_l \mid p_l)$ as $h_z(p)$, i.e., $h_z(p) = h_z(\gamma_l \mid p_l)$.
Sometimes, the probabilistic information for a PHFE is incomplete; in this situation, an estimate for the incomplete probabilistic information is used by averaging the available data.
Definition 3
([23]). If a PHFE $h_z(p)$ is given with $\sum_{l=1}^{\#h_z(p)} p_l < 1$, then the completed probabilities for $h_z(p)$ are obtained as $p'_l = p_l / \sum_{l=1}^{\#h_z(p)} p_l$, $l = 1, 2, \ldots, \#h_z(p)$.
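Definition 3 amounts to renormalizing the available probabilities. The sketch below is only an illustration; representing a PHFE as a list of (membership, probability) pairs is our assumption, not notation from the paper.

```python
def normalize_probs(phfe):
    """Complete the probabilistic information of a PHFE (Definition 3).

    `phfe` is a list of (membership, probability) pairs whose probabilities
    may sum to less than 1; each probability is divided by the current total
    so the completed probabilities sum to 1.
    """
    total = sum(p for _, p in phfe)
    if total <= 0:
        raise ValueError("probabilities must have a positive sum")
    return [(g, p / total) for g, p in phfe]

# memberships 0.3 and 0.5 reported with incomplete probabilities 0.3 and 0.3
print(normalize_probs([(0.3, 0.3), (0.5, 0.3)]))  # probabilities become 0.5 each
```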
 Some basic operations of PHFEs are defined as follows.
Definition 4.
Let $h_{z1}(p)$, $h_{z2}(p)$ and $h_z(p)$ be three PHFEs; then, for any $\lambda > 0$,
1. $h_{z1}(p) \oplus h_{z2}(p) = \bigcup_{\gamma_{1l} \in h_{z1}(p),\, \gamma_{2k} \in h_{z2}(p)} \{ [\gamma_{1l} + \gamma_{2k} - \gamma_{1l}\gamma_{2k}] \, (p_{1l} p_{2k} / \sum_{l=1}^{\#h_{z1}(p)} p_{1l} \cdot \sum_{k=1}^{\#h_{z2}(p)} p_{2k}) \}$;
2. $h_z^{\lambda}(p) = \bigcup_{\gamma_l \in h_z(p)} \{ \gamma_l^{\lambda} \, (p_l) \}$;
3. $\lambda h_z(p) = \bigcup_{\gamma_l \in h_z(p)} \{ [1 - (1 - \gamma_l)^{\lambda}] \, (p_l) \}$.
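The three operations of Definition 4 can be sketched as follows, again with the (membership, probability) pair-list representation as an illustrative assumption:

```python
from itertools import product

def phfe_sum(h1, h2):
    """PHFE sum h1 (+) h2 (Definition 4, item 1).

    Every pair of elements combines as g1 + g2 - g1*g2, with probability
    p1*p2 rescaled by the product of the two probability totals.
    """
    s1 = sum(p for _, p in h1)
    s2 = sum(p for _, p in h2)
    return [((g1 + g2 - g1 * g2), (p1 * p2) / (s1 * s2))
            for (g1, p1), (g2, p2) in product(h1, h2)]

def phfe_power(h, lam):
    """h^lam (Definition 4, item 2): raise each membership to the power lam."""
    return [(g ** lam, p) for g, p in h]

def phfe_scale(h, lam):
    """lam*h (Definition 4, item 3): 1 - (1 - g)^lam for each membership."""
    return [(1 - (1 - g) ** lam, p) for g, p in h]
```

For single-valued PHFEs these reduce to the familiar Einstein-free fuzzy operations, e.g., `phfe_sum([(0.5, 1.0)], [(0.5, 1.0)])` yields a single element with membership 0.75.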
Definition 5
([23]). Let $h_z(p)$ be a PHFE; the score function of $h_z(p)$ is defined as:
$S_r(h_z(p)) = \left( \sum_{l=1}^{\#h_z(p)} \gamma_l \, p_l \right) / \sum_{l=1}^{\#h_z(p)} p_l$
Let $h_{z1}(p)$ and $h_{z2}(p)$ be two PHFEs; then
1. if $S_r(h_{z1}(p)) > S_r(h_{z2}(p))$, then $h_{z2}(p) < h_{z1}(p)$;
2. if $S_r(h_{z1}(p)) = S_r(h_{z2}(p))$, then $h_{z1}(p) = h_{z2}(p)$.
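The score function is a probability-weighted mean of the memberships; a minimal sketch (pair-list representation assumed for illustration):

```python
def score(phfe):
    """Score of a PHFE (Definition 5): probability-weighted mean membership."""
    num = sum(g * p for g, p in phfe)
    den = sum(p for _, p in phfe)
    return num / den

h1 = [(0.4, 0.5), (0.6, 0.5)]
h2 = [(0.2, 0.3), (0.9, 0.7)]
print(score(h1))              # 0.5
print(score(h2) > score(h1))  # True, so h1 < h2 by the comparison rule
```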
Definition 6.
Suppose $h_{z1}(p)$ and $h_{z2}(p)$ are two PHFEs. Assuming $\#h_{z1}(p) = \#h_{z2}(p)$, the distance between $h_{z1}(p)$ and $h_{z2}(p)$ is defined as
$D(h_{z1}(p), h_{z2}(p)) = \sqrt{ \sum_{l=1}^{\#h_{z1}(p)} \left( \gamma_{1l} \, p_l - \gamma_{2l} \, p_l \right)^2 }$
The distance measure $D(h_{z1}(p), h_{z2}(p))$ satisfies the following properties:
1. $D(h_{z1}(p), h_{z2}(p)) \le 1$;
2. $D(h_{z1}(p), h_{z2}(p)) = 0$ if and only if $h_{z1}(p) = h_{z2}(p)$;
3. $D(h_{z1}(p), h_{z2}(p)) = D(h_{z2}(p), h_{z1}(p))$.
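The distance of Definition 6 can be sketched directly; the equal-length requirement corresponds to the β-normalization used later in the algorithm (pair-list representation assumed for illustration):

```python
import math

def phfe_distance(h1, h2):
    """Distance between two equal-length PHFEs (Definition 6).

    Assumes both PHFEs have been normalized to the same length and share
    the probability vector, as in the paper's setting.
    """
    if len(h1) != len(h2):
        raise ValueError("normalize the PHFEs to equal length first")
    return math.sqrt(sum((g1 * p - g2 * p) ** 2
                         for (g1, p), (g2, _) in zip(h1, h2)))

print(phfe_distance([(1.0, 1.0)], [(0.0, 1.0)]))  # 1.0, the maximal distance
```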

3. Probabilistic Hesitant Fuzzy Linear Regression Model

In this section, we discuss our proposed PHFLRM methodology from a statistical perspective using hesitant fuzzy information.
Initially, the FLRM was introduced by Tanaka et al. [1]. It is defined as:
$\hat{Y}_i = \tilde{A}_0 + \tilde{A}_1 X_{i1} + \tilde{A}_2 X_{i2} + \tilde{A}_3 X_{i3} + \cdots + \tilde{A}_N X_{iN}$,
where the parameters $\tilde{A}_j = (\alpha_j, c_j)$ are symmetrical TFNs, $\alpha_j$ is the centre, and $c_j$ is the spread of the symmetrical TFNs. The FLRM minimizes the total spread of the symmetrical TFNs in the following way [44]:
$\min \sum_{j=0}^{N} c_j \sum_{i=1}^{M} x_{ij}$
subject to the constraints
$y_i \le \sum_{j=0}^{N} \alpha_j x_{ij} + F_m^{-1} \sum_{j=0}^{N} c_j x_{ij}$,
$y_i \ge \sum_{j=0}^{N} \alpha_j x_{ij} - F_m^{-1} \sum_{j=0}^{N} c_j x_{ij}$,
$x_{i0} = 1, \quad c_j \ge 0$,
where $F$ is the membership function of a standardized fuzzy parameter [43].
Peters [5] modified Tanaka's model [1], introducing a new variable $\lambda$ in the following way:
$\max \bar{\lambda} = \frac{1}{M} \sum_{i=1}^{M} \lambda_i$
with the constraints
$d_0 + (1 - \bar{\lambda}) p_0 \ge \sum_{i=1}^{M} \sum_{j=0}^{N} c_j x_{ij}$,
$y_i \le (1 - \lambda_i) p_i + \sum_{j=0}^{N} \alpha_j x_{ij} + \sum_{j=0}^{N} c_j x_{ij}$,
$y_i \ge -(1 - \lambda_i) p_i + \sum_{j=0}^{N} \alpha_j x_{ij} - \sum_{j=0}^{N} c_j x_{ij}$,
$x_{i0} = 1, \quad c_j \ge 0, \quad F_m^{-1}(H) = 1$,
where $\lambda_i$ $(0 \le \lambda_i \le 1)$ represents the degree of membership to the set of good solutions.
The parameters $d_0$, $p_0$, and $p_i$ are selected to determine the width of the estimated interval. A wide interval (a high $p_0$ and a small $p_i$) corresponds to a lenient requirement on minimizing the spread, while a narrow interval (a small $p_0$ and a high $p_i$) corresponds to a strict condition. The value of $d_0$, the desired value of the objective function, is taken as 0 [5].
Motivated by Peters [5], we introduce the PHFLRM for solving decision-making problems. The output variable $Y_i = \{ y_i^k (p_{y_i^k}) \mid 1 \le i \le M,\ 1 \le k \le P \}$ and the input variables $X_j = \{ x_{ij}^k (p_{x_{ij}^k}) \mid 1 \le i \le M,\ 0 \le j \le N,\ 1 \le k \le P \}$ are PHFEs. It is defined as:
$Y_i = \tilde{\gamma}_0 X_0 \oplus \tilde{\gamma}_1 X_1 \oplus \tilde{\gamma}_2 X_2 \oplus \tilde{\gamma}_3 X_3 \oplus \cdots \oplus \tilde{\gamma}_N X_N$
where the parameters $\tilde{\gamma}_j = (\alpha_j^k, c_j^k)$, $0 \le j \le N$, are symmetrical TFNs and $k$ denotes the number of values assigned by the $P$ DMs to the IPOP variables. The PHFLRM parameters are estimated using the following LPM.
$\max \bar{\lambda}^k = \frac{1}{M} \sum_{i=1}^{M} \lambda_i^k$
with the constraints
$d_0 + (1 - \bar{\lambda}^k) p_0 \ge \sum_{i=1}^{M} \sum_{j=0}^{N} c_j^k x_{ij}^k (p_{x_{ij}^k})$,
$y_i^k (p_{y_i^k}) \le (1 - \lambda_i^k) p_i + \sum_{j=0}^{N} \alpha_j^k x_{ij}^k (p_{x_{ij}^k}) + \sum_{j=0}^{N} c_j^k x_{ij}^k (p_{x_{ij}^k})$,
$y_i^k (p_{y_i^k}) \ge -(1 - \lambda_i^k) p_i + \sum_{j=0}^{N} \alpha_j^k x_{ij}^k (p_{x_{ij}^k}) - \sum_{j=0}^{N} c_j^k x_{ij}^k (p_{x_{ij}^k})$,
$\lambda_i^k \le 1, \quad x_{i0}^k (p_{x_{i0}^k}) = 1, \quad c_j^k \ge 0$.
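A Peters-style LPM of this kind can be set up with an off-the-shelf LP solver. The sketch below works on crisp score values of the PHFEs rather than full PHFEs (an assumption for illustration), and is not the paper's exact implementation; `fit_fuzzy_lp` and its synthetic data are hypothetical names.

```python
import numpy as np
from scipy.optimize import linprog

def fit_fuzzy_lp(X, y, p0=1000.0, pi=1.0, d0=0.0):
    """Estimate Peters-style fuzzy regression parameters by LP (a sketch).

    X is an M x (N+1) matrix of crisp score values of the inputs (first
    column all ones); y holds the output score values. Decision variables:
    memberships lambda_i in [0, 1], centres alpha_j (free), and spreads
    c_j >= 0 of the symmetric triangular fuzzy coefficients.
    """
    X = np.asarray(X, float); y = np.asarray(y, float)
    M, K = X.shape
    n = M + 2 * K                        # variable order: lambda, alpha, c
    cost = np.zeros(n); cost[:M] = -1.0  # maximise sum(lambda) via minimise
    A, b = [], []
    # spread constraint: (p0/M)*sum(lambda) + sum_ij c_j|x_ij| <= d0 + p0
    row = np.zeros(n); row[:M] = p0 / M; row[M + K:] = np.abs(X).sum(axis=0)
    A.append(row); b.append(d0 + p0)
    for i in range(M):
        # y_i <= sum(alpha x) + sum(c|x|) + (1 - lambda_i)*pi
        r1 = np.zeros(n); r1[i] = pi; r1[M:M + K] = -X[i]; r1[M + K:] = -np.abs(X[i])
        A.append(r1); b.append(pi - y[i])
        # y_i >= sum(alpha x) - sum(c|x|) - (1 - lambda_i)*pi
        r2 = np.zeros(n); r2[i] = pi; r2[M:M + K] = X[i]; r2[M + K:] = -np.abs(X[i])
        A.append(r2); b.append(pi + y[i])
    bounds = [(0, 1)] * M + [(None, None)] * K + [(0, None)] * K
    res = linprog(cost, A_ub=np.array(A), b_ub=np.array(b), bounds=bounds)
    return res.x[:M], res.x[M:M + K], res.x[M + K:]
```

On exactly linear data the memberships all reach 1, since some centre vector fits every observation without any spread.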

4. Decision-Making Algorithms

In this section, we describe in detail the algorithms for the PHFLRM and the TOPSIS method, respectively.

4.1. Algorithm for PHFLRM

Assume $A = \{A_1, A_2, \ldots, A_M\}$ is a set of alternatives and $D = \{d_l, 1 \le l \le P\}$ is a set of DMs that provide their evaluations in the form of PHFEs about the alternatives $A_i$ under input variables $X_j$ $(j = 0, 1, 2, \ldots, N)$ and the output variable $Y_i$ $(i = 1, 2, \ldots, M)$. Let $H_1 = [X_{ij}]_{M \times N}$ be the input variable decision matrix and $H_2 = [Y_i]_{M \times 1}$ be the output variable decision matrix, where $X_{ij} = \{x_{ij}^k (p_{x_{ij}^k}), k = 1, 2, \ldots, \#(X_{ij})\}$ and $Y_i = \{y_i^k (p_{y_i^k}), k = 1, 2, \ldots, \#(Y_i)\}$ are PHFEs. Figure 1 shows the flowchart of the proposed algorithm, and below are its detailed steps.
Step 1.
Let $H = [Z_{ij}]_{M \times (N+1)}$ be a connected IPOP variable decision matrix provided by the DMs, where $Z_{ij} = \{z_{ij}^k (p_{z_{ij}^k}), k = 1, 2, \ldots, \#(Z_{ij})\}$ are PHFEs.
Step 2.
For two finite PHFEs $h_1$ and $h_2$, there are two opposite principles of normalization. The first is α-normalization, in which we remove elements from whichever of $h_1$ and $h_2$ has more elements than the other. The second is β-normalization, in which we add elements to whichever of $h_1$ and $h_2$ has fewer elements than the other. In this study, we use the principle of β-normalization to make all PHFEs in the matrix $H$ equal in length. Let $\grave{H} = [\grave{Z}_{ij}]_{M \times (N+1)}$ be the normalized matrix, where $\grave{Z}_{ij} = \{\grave{z}_{ij}^k (p_{\grave{z}^k}), k = 1, 2, \ldots, S\}$ are PHFEs.
Step 3.
Using Definition 3, the probabilistic information is completed for the PHFEs in the decision matrix $\grave{H}$. Let $\bar{H} = [\bar{Z}_{ij}]_{M \times (N+1)}$ be the decision matrix after completing the probabilistic information in $\grave{H}$, where $\bar{Z}_{ij} = \{\bar{z}_{ij}^k (p_{\bar{z}^k}), k = 1, 2, \ldots, P\}$ are PHFEs.
Step 4.
Again, normalize the matrix $\bar{H}$ using the following equation:
$\hat{z}_{ij}^k (p_{\hat{z}_{ij}^k}) = \dfrac{\bar{z}_{ij}^k (p_{\bar{z}_{ij}^k}) - \min(\bar{Z}_{ij})}{\max(\bar{Z}_{ij}) - \min(\bar{Z}_{ij})}$
Let $\hat{H} = [\hat{Z}_{ij}]_{M \times (N+1)}$ be the normalized decision matrix, where $\hat{Z}_{ij} = \{\hat{z}_{ij}^k (p_{\hat{z}_{ij}^k}), k = 1, 2, \ldots, P\}$ are PHFEs.
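Step 4 amounts to column-wise min-max rescaling. A minimal sketch, operating on a matrix of crisp score values rather than full PHFEs (an assumption for illustration):

```python
import numpy as np

def minmax_normalize(scores):
    """Rescale each column of an M x (N+1) matrix of score values to [0, 1],
    mirroring (z - min) / (max - min) from Step 4 of the algorithm."""
    scores = np.asarray(scores, dtype=float)
    lo = scores.min(axis=0)
    hi = scores.max(axis=0)
    return (scores - lo) / (hi - lo)

print(minmax_normalize([[0.0, 10.0], [5.0, 20.0], [10.0, 30.0]]))
```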
Step 5.
By using the normalized decision matrix H ^ , the PHFLRM is obtained. We further estimate the parameters of PHFLRM employing LPM.
Step 6.
Rank the alternatives using the residual values obtained from the score values of $Y_i$ $(i = 1, 2, \ldots, M)$ and $Y_i^*$ $(i = 1, 2, \ldots, M)$, i.e., $e_i = S_r(Y_i) - S_r(Y_i^*)$, where $Y_i^*$ are the predicted values calculated using Definitions 3 and 4.
Step 7.
Finally, the alternatives are ranked according to the values of e i ( i = 1 , 2 , , M ) . The alternative with the least residual is identified as the best choice.
The HFS, an extension of the FST, has attracted the attention of many researchers in a short period, as hesitant situations are very common in real-world problems. Numerous extensions have been introduced to address the uncertainty caused by hesitation; the PHFS is one of them. The PHFS illustrates not only decision makers' hesitancy when they are undecided about something, but also the distribution of the hesitant information. In the PHFLRM (3), the IPOP variables are observed as PHFEs instead of HFEs, the HFE being the basic element of the HFS.

4.2. The TOPSIS Algorithm

TOPSIS, an MCDM methodology developed by Hwang and Yoon [30], selects the alternative with the shortest distance from the positive ideal solution (PIS) and the longest distance from the negative ideal solution (NIS) among all possible alternatives. The mathematical formulation of the TOPSIS method when the criteria values are PHFEs is as follows:
Step 1.
Take the decision matrices $H$, $\grave{H}$ and $\bar{H}$ as obtained in Steps 1, 2, and 3 of Section 4.1.
Step 2.
Normalize the decision matrix $\bar{H}$ with the help of the following formula:
$\hat{Z}_{ij} = \left\{ \dfrac{\bar{z}_{ij}^k (p_{\bar{z}_{ij}^k})}{\sqrt{\sum_{i=1}^{M} \left( \bar{z}_{ij}^k (p_{\bar{z}_{ij}^k}) \right)^2}},\ k = 1, 2, \ldots, P \right\}$
Let H ^ = [ Z ^ i j ] M × ( N + 1 ) be the normalized decision matrix, where Z ^ i j are PHFEs.
Step 3.
The weighted normalized decision matrix is calculated by multiplying the normalized decision matrix by the associated weights, i.e., $V_{ij} = \hat{Z}_{ij} \times W_j$, where $V_{ij}$ is a PHFE.
Step 4.
Determine the positive ideal solution $A^+$ and the negative ideal solution $A^-$ as
$A^+ = \{ (\max_i V_{ij} \mid j \in J_b), (\min_i V_{ij} \mid j \in J_c) \} = \{ A_1^+, A_2^+, \ldots, A_j^+, \ldots, A_{N+1}^+ \}$
$A^- = \{ (\min_i V_{ij} \mid j \in J_b), (\max_i V_{ij} \mid j \in J_c) \} = \{ A_1^-, A_2^-, \ldots, A_j^-, \ldots, A_{N+1}^- \}$
where J b and J c represent the set of benefit and cost criteria, respectively.
Step 5.
Calculate the Euclidean distances $D_i^+$ and $D_i^-$ of each alternative $A_i$ from the positive ideal solution $A^+$ and the negative ideal solution $A^-$, respectively, using Definition 6.
Step 6.
Calculate the relative closeness P i of each alternative to the ideal solution as
$P_i = \dfrac{D_i^-}{D_i^- + D_i^+}, \quad i = 1, 2, \ldots, M.$
Step 7.
The alternatives $A_i$ $(i = 1, 2, \ldots, M)$ are ranked according to the relative closeness values $P_i$ in descending order.
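Steps 2-7 above can be sketched compactly on a crisp score matrix (an illustrative simplification of the PHFE version; the function name and data are hypothetical):

```python
import numpy as np

def topsis(D, weights, benefit):
    """Rank alternatives with TOPSIS on a crisp M x n score matrix.

    `weights` sums to 1; `benefit[j]` is True for benefit criteria and
    False for cost criteria. Returns the relative closeness P_i of each
    alternative (higher is better).
    """
    D = np.asarray(D, dtype=float)
    R = D / np.sqrt((D ** 2).sum(axis=0))        # vector normalization (Step 2)
    V = R * np.asarray(weights)                  # weighted matrix (Step 3)
    benefit = np.asarray(benefit)
    A_pos = np.where(benefit, V.max(axis=0), V.min(axis=0))   # PIS (Step 4)
    A_neg = np.where(benefit, V.min(axis=0), V.max(axis=0))   # NIS (Step 4)
    d_pos = np.sqrt(((V - A_pos) ** 2).sum(axis=1))           # Step 5
    d_neg = np.sqrt(((V - A_neg) ** 2).sum(axis=1))
    return d_neg / (d_pos + d_neg)               # relative closeness (Step 6)

# an alternative dominating on both benefit criteria attains closeness 1
print(topsis([[3.0, 3.0], [1.0, 1.0]], [0.5, 0.5], [True, True]))
```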

5. Application Example

Wheat is the most important rabi crop in Pakistan, and it is also the country's staple diet; wheat production is therefore one of the most pressing concerns confronting the agricultural industry today. Various factors such as farm size, seed quality, fertilizer price, irrigation area, and rainfall amount contribute to the yield of wheat. In this example, a simultaneous analysis including multiple variables is performed for efficient decision making. We consider rainfall amount $(X_1)$, farm size $(X_2)$, and irrigation area $(X_3)$ in order to determine their effect on wheat yield $(Y)$. Twelve districts $A_i$ $(i = 1, 2, \ldots, 12)$ of Punjab (Pakistan) are selected as alternatives. These alternatives are evaluated using $X_{j+1}$ $(j = 0, 1, 2)$ and $Y_i$ $(i = 1, 2, \ldots, 12)$ as input and output variables, respectively. The IPOP variables have been evaluated by three agriculture department experts. The steps necessary to resolve this problem are listed below.
Step 1.
Table 1 shows the connected IPOP variable decision matrix provided by the DMs employing PHFEs.
Step 2 & 3.
We obtain the matrix $\bar{H}$, shown in Table 2, by making all PHFEs equal in length using β-normalization and making the sum of all probabilities equal to one for all PHFEs in the decision matrices $H$ and $\grave{H}$, respectively.
Step 4 & 5.
After normalizing the matrix $\bar{H}$, we estimate the parameters from the normalized decision matrix $\hat{H}$ using LP, as follows:
For k = 1
$\max \bar{\lambda}^1 = \frac{1}{M} \sum_{i=1}^{M} \lambda_i^1$
Subject to the constraints
$\lambda_1^1 + \lambda_2^1 + \cdots + \lambda_{12}^1 + \frac{12}{1000} \left( 12 c_0^1 + 5.0417 c_1^1 + 0.3597 c_2^1 + 0.0534 c_3^1 \right) \le 12$
and
$\lambda_1^1 - (\alpha_0^1 + 0.4398\alpha_1^1 + 0.0121\alpha_2^1 + 0.0073\alpha_3^1) - (c_0^1 + 0.4398c_1^1 + 0.0121c_2^1 + 0.0073c_3^1) \le 0.2416$
$\lambda_2^1 - (\alpha_0^1 + 0.3254\alpha_1^1 + 0.0277\alpha_2^1 + 0.0040\alpha_3^1) - (c_0^1 + 0.3254c_1^1 + 0.0277c_2^1 + 0.0040c_3^1) \le 0.1681$
$\lambda_3^1 - (\alpha_0^1 + 0.5052\alpha_1^1 + 0.0173\alpha_2^1 + 0.0028\alpha_3^1) - (c_0^1 + 0.5052c_1^1 + 0.0173c_2^1 + 0.0028c_3^1) \le 0.3053$
$\lambda_4^1 - (\alpha_0^1 + 0.4251\alpha_1^1 + 0.0290\alpha_2^1 + 0.0026\alpha_3^1) - (c_0^1 + 0.4251c_1^1 + 0.0290c_2^1 + 0.0026c_3^1) \le 0.1061$
$\lambda_5^1 - (\alpha_0^1 + 0.4914\alpha_1^1 + 0.0248\alpha_2^1 + 0.0079\alpha_3^1) - (c_0^1 + 0.4914c_1^1 + 0.0248c_2^1 + 0.0079c_3^1) \le 0.0915$
$\lambda_6^1 - (\alpha_0^1 + 0.4447\alpha_1^1 + 0.0264\alpha_2^1 + 0.0052\alpha_3^1) - (c_0^1 + 0.4447c_1^1 + 0.0264c_2^1 + 0.0052c_3^1) \le 0.0467$
$\lambda_7^1 - (\alpha_0^1 + 0.4643\alpha_1^1 + 0.0305\alpha_2^1 + 0.0055\alpha_3^1) - (c_0^1 + 0.4643c_1^1 + 0.0305c_2^1 + 0.0055c_3^1) \le 0.1354$
$\lambda_8^1 - (\alpha_0^1 + 0.3590\alpha_1^1 + 0.0379\alpha_2^1 + 0.0089\alpha_3^1) - (c_0^1 + 0.3590c_1^1 + 0.0379c_2^1 + 0.0089c_3^1) \le 0.0280$
$\lambda_9^1 - (\alpha_0^1 + 0.4002\alpha_1^1 + 0.0460\alpha_2^1 + 0.0071\alpha_3^1) - (c_0^1 + 0.4002c_1^1 + 0.0460c_2^1 + 0.0071c_3^1) \le 0.2544$
$\lambda_{10}^1 - (\alpha_0^1 + 0.4290\alpha_1^1 + 0.0310\alpha_2^1 + 0.0011\alpha_3^1) - (c_0^1 + 0.4290c_1^1 + 0.0011c_2^1 + 0.0243c_3^1) \le 0.0244$
$\lambda_{11}^1 - (\alpha_0^1 + 0.4067\alpha_1^1 + 0.0513\alpha_2^1 + 0.0010\alpha_3^1) - (c_0^1 + 0.4067c_1^1 + 0.0513c_2^1 + 0.0010c_3^1) \le 0.1288$
$\lambda_{12}^1 - (\alpha_0^1 + 0.3509\alpha_1^1 + 0.0257\alpha_2^1 + 0.0000\alpha_3^1) - (c_0^1 + 0.3509c_1^1 + 0.0257c_2^1 + 0.0000c_3^1) \le 0.3132$
and
$\lambda_1^1 + (\alpha_0^1 + 0.4398\alpha_1^1 + 0.0121\alpha_2^1 + 0.0073\alpha_3^1) - (c_0^1 + 0.4398c_1^1 + 0.0121c_2^1 + 0.0073c_3^1) \le 1.7584$
$\lambda_2^1 + (\alpha_0^1 + 0.3254\alpha_1^1 + 0.0277\alpha_2^1 + 0.0040\alpha_3^1) - (c_0^1 + 0.3254c_1^1 + 0.0277c_2^1 + 0.0040c_3^1) \le 1.8319$
$\lambda_3^1 + (\alpha_0^1 + 0.5052\alpha_1^1 + 0.0173\alpha_2^1 + 0.0028\alpha_3^1) - (c_0^1 + 0.5052c_1^1 + 0.0173c_2^1 + 0.0028c_3^1) \le 1.6947$
$\lambda_4^1 + (\alpha_0^1 + 0.4251\alpha_1^1 + 0.0290\alpha_2^1 + 0.0026\alpha_3^1) - (c_0^1 + 0.4251c_1^1 + 0.0290c_2^1 + 0.0026c_3^1) \le 1.8939$
$\lambda_5^1 + (\alpha_0^1 + 0.4914\alpha_1^1 + 0.0248\alpha_2^1 + 0.0079\alpha_3^1) - (c_0^1 + 0.4914c_1^1 + 0.0248c_2^1 + 0.0079c_3^1) \le 1.9085$
$\lambda_6^1 + (\alpha_0^1 + 0.4447\alpha_1^1 + 0.0264\alpha_2^1 + 0.0052\alpha_3^1) - (c_0^1 + 0.4447c_1^1 + 0.0264c_2^1 + 0.0052c_3^1) \le 1.9533$
$\lambda_7^1 + (\alpha_0^1 + 0.4643\alpha_1^1 + 0.0305\alpha_2^1 + 0.0055\alpha_3^1) - (c_0^1 + 0.4643c_1^1 + 0.0305c_2^1 + 0.0055c_3^1) \le 1.8646$
$\lambda_8^1 + (\alpha_0^1 + 0.3590\alpha_1^1 + 0.0379\alpha_2^1 + 0.0089\alpha_3^1) - (c_0^1 + 0.3590c_1^1 + 0.0379c_2^1 + 0.0089c_3^1) \le 1.9720$
$\lambda_9^1 + (\alpha_0^1 + 0.4002\alpha_1^1 + 0.0460\alpha_2^1 + 0.0071\alpha_3^1) - (c_0^1 + 0.4002c_1^1 + 0.0460c_2^1 + 0.0071c_3^1) \le 1.7456$
$\lambda_{10}^1 + (\alpha_0^1 + 0.4290\alpha_1^1 + 0.0310\alpha_2^1 + 0.0011\alpha_3^1) - (c_0^1 + 0.4290c_1^1 + 0.0011c_2^1 + 0.0243c_3^1) \le 1.9757$
$\lambda_{11}^1 + (\alpha_0^1 + 0.4067\alpha_1^1 + 0.0513\alpha_2^1 + 0.0010\alpha_3^1) - (c_0^1 + 0.4067c_1^1 + 0.0513c_2^1 + 0.0010c_3^1) \le 1.8712$
$\lambda_{12}^1 + (\alpha_0^1 + 0.3509\alpha_1^1 + 0.0257\alpha_2^1 + 0.0000\alpha_3^1) - (c_0^1 + 0.3509c_1^1 + 0.0257c_2^1 + 0.0000c_3^1) \le 1.6868$
For k = 2
$\max \bar{\lambda}^2 = \frac{1}{M} \sum_{i=1}^{M} \lambda_i^2$
Subject to the constraints
$\lambda_1^2 + \lambda_2^2 + \lambda_3^2 + \cdots + \lambda_{12}^2 + \frac{12}{1000} \left( 12 c_0^2 + 5.0660 c_1^2 + 0.3584 c_2^2 + 0.0308 c_3^2 \right) \le 12$
and
$\lambda_1^2 - (\alpha_0^2 + 0.3758\alpha_1^2 + 0.0197\alpha_2^2 + 0.0012\alpha_3^2) - (c_0^2 + 0.3758c_1^2 + 0.0197c_2^2 + 0.0012c_3^2) \le 0.3483$
$\lambda_2^2 - (\alpha_0^2 + 0.3843\alpha_1^2 + 0.0228\alpha_2^2 + 0.0012\alpha_3^2) - (c_0^2 + 0.3843c_1^2 + 0.0228c_2^2 + 0.0012c_3^2) \le 0.1615$
$\lambda_3^2 - (\alpha_0^2 + 0.5068\alpha_1^2 + 0.0261\alpha_2^2 + 0.0030\alpha_3^2) - (c_0^2 + 0.5052c_1^2 + 0.0261c_2^2 + 0.0030c_3^2) \le 0.3020$
$\lambda_4^2 - (\alpha_0^2 + 0.3632\alpha_1^2 + 0.0187\alpha_2^2 + 0.0018\alpha_3^2) - (c_0^2 + 0.3632c_1^2 + 0.0018c_2^2 + 0.0026c_3^2) \le 0.2275$
$\lambda_5^2 - (\alpha_0^2 + 0.4201\alpha_1^2 + 0.0264\alpha_2^2 + 0.0016\alpha_3^2) - (c_0^2 + 0.0264c_1^2 + 0.0248c_2^2 + 0.0016c_3^2) \le 0.3174$
$\lambda_6^2 - (\alpha_0^2 + 0.4463\alpha_1^2 + 0.0281\alpha_2^2 + 0.0052\alpha_3^2) - (c_0^2 + 0.4463c_1^2 + 0.0281c_2^2 + 0.0052c_3^2) \le 0.2810$
$\lambda_7^2 - (\alpha_0^2 + 0.3969\alpha_1^2 + 0.0236\alpha_2^2 + 0.0056\alpha_3^2) - (c_0^2 + 0.3969c_1^2 + 0.0236c_2^2 + 0.0056c_3^2) \le 0.1272$
$\lambda_8^2 - (\alpha_0^2 + 0.4235\alpha_1^2 + 0.0379\alpha_2^2 + 0.0024\alpha_3^2) - (c_0^2 + 0.4235c_1^2 + 0.0379c_2^2 + 0.0024c_3^2) \le 0.0000$
$\lambda_9^2 - (\alpha_0^2 + 0.3509\alpha_1^2 + 0.0383\alpha_2^2 + 0.0012\alpha_3^2) - (c_0^2 + 0.3509c_1^2 + 0.0383c_2^2 + 0.0012c_3^2) \le 0.1223$
$\lambda_{10}^2 - (\alpha_0^2 + 0.5052\alpha_1^2 + 0.0408\alpha_2^2 + 0.0043\alpha_3^2) - (c_0^2 + 0.5052c_1^2 + 0.0408c_2^2 + 0.0043c_3^2) \le 0.0149$
$\lambda_{11}^2 - (\alpha_0^2 + 0.4790\alpha_1^2 + 0.0414\alpha_2^2 + 0.0011\alpha_3^2) - (c_0^2 + 0.4790c_1^2 + 0.0414c_2^2 + 0.0011c_3^2) \le 0.2474$
$\lambda_{12}^2 - (\alpha_0^2 + 0.4140\alpha_1^2 + 0.0048\alpha_2^2 + 0.0000\alpha_3^2) - (c_0^2 + 0.4140c_1^2 + 0.0346c_2^2 + 0.0048c_3^2) \le 0.1893$
and
$\lambda_1^2 + (\alpha_0^2 + 0.3758\alpha_1^2 + 0.0197\alpha_2^2 + 0.0012\alpha_3^2) - (c_0^2 + 0.3758c_1^2 + 0.0197c_2^2 + 0.0012c_3^2) \le 1.6517$
$\lambda_2^2 + (\alpha_0^2 + 0.3843\alpha_1^2 + 0.0228\alpha_2^2 + 0.0012\alpha_3^2) - (c_0^2 + 0.3843c_1^2 + 0.0228c_2^2 + 0.0012c_3^2) \le 1.8385$
$\lambda_3^2 + (\alpha_0^2 + 0.5068\alpha_1^2 + 0.0261\alpha_2^2 + 0.0030\alpha_3^2) - (c_0^2 + 0.5052c_1^2 + 0.0261c_2^2 + 0.0030c_3^2) \le 1.6980$
$\lambda_4^2 + (\alpha_0^2 + 0.3632\alpha_1^2 + 0.0187\alpha_2^2 + 0.0018\alpha_3^2) - (c_0^2 + 0.3632c_1^2 + 0.0018c_2^2 + 0.0026c_3^2) \le 1.7725$
$\lambda_5^2 + (\alpha_0^2 + 0.4201\alpha_1^2 + 0.0264\alpha_2^2 + 0.0016\alpha_3^2) - (c_0^2 + 0.0264c_1^2 + 0.0248c_2^2 + 0.0016c_3^2) \le 1.6826$
$\lambda_6^2 + (\alpha_0^2 + 0.4463\alpha_1^2 + 0.0281\alpha_2^2 + 0.0052\alpha_3^2) - (c_0^2 + 0.4463c_1^2 + 0.0281c_2^2 + 0.0052c_3^2) \le 1.7190$
$\lambda_7^2 + (\alpha_0^2 + 0.3969\alpha_1^2 + 0.0236\alpha_2^2 + 0.0056\alpha_3^2) - (c_0^2 + 0.3969c_1^2 + 0.0236c_2^2 + 0.0056c_3^2) \le 1.8728$
$\lambda_8^2 + (\alpha_0^2 + 0.4235\alpha_1^2 + 0.0379\alpha_2^2 + 0.0024\alpha_3^2) - (c_0^2 + 0.4235c_1^2 + 0.0379c_2^2 + 0.0024c_3^2) \le 2.0000$
$\lambda_9^2 + (\alpha_0^2 + 0.3509\alpha_1^2 + 0.0383\alpha_2^2 + 0.0012\alpha_3^2) - (c_0^2 + 0.3509c_1^2 + 0.0383c_2^2 + 0.0012c_3^2) \le 1.8777$
$\lambda_{10}^2 + (\alpha_0^2 + 0.5052\alpha_1^2 + 0.0408\alpha_2^2 + 0.0043\alpha_3^2) - (c_0^2 + 0.5052c_1^2 + 0.0408c_2^2 + 0.0043c_3^2) \le 1.9851$
$\lambda_{11}^2 + (\alpha_0^2 + 0.4790\alpha_1^2 + 0.0414\alpha_2^2 + 0.0011\alpha_3^2) - (c_0^2 + 0.4790c_1^2 + 0.0414c_2^2 + 0.0011c_3^2) \le 1.7526$
$\lambda_{12}^2 + (\alpha_0^2 + 0.4140\alpha_1^2 + 0.0048\alpha_2^2 + 0.0000\alpha_3^2) - (c_0^2 + 0.4140c_1^2 + 0.0346c_2^2 + 0.0048c_3^2) \le 1.8107$
For k = 3
$\max \bar{\lambda}^3 = \frac{1}{M} \sum_{i=1}^{M} \lambda_i^3$
Subject to the constraints
$\lambda_1^3 + \lambda_2^3 + \lambda_3^3 + \cdots + \lambda_{12}^3 + \frac{12}{1000} \left( 12 c_0^3 + 5.0111 c_1^3 + 0.3423 c_2^3 + 0.0424 c_3^3 \right) \le 12$
and
$\lambda_1^3 - (\alpha_0^3 + 0.4431\alpha_1^3 + 0.0277\alpha_2^3 + 0.0012\alpha_3^3) - (c_0^3 + 0.4431c_1^3 + 0.0277c_2^3 + 0.0012c_3^3) \le 0.2301$
$\lambda_2^3 - (\alpha_0^3 + 0.3859\alpha_1^3 + 0.0173\alpha_2^3 + 0.0045\alpha_3^3) - (c_0^3 + 0.3859c_1^3 + 0.0173c_2^3 + 0.0045c_3^3) \le 0.2754$
$\lambda_3^3 - (\alpha_0^3 + 0.4333\alpha_1^3 + 0.0202\alpha_2^3 + 0.0001\alpha_3^3) - (c_0^3 + 0.4333c_1^3 + 0.0202c_2^3 + 0.0001c_3^3) \le 0.4029$
$\lambda_4^3 - (\alpha_0^3 + 0.4284\alpha_1^3 + 0.0243\alpha_2^3 + 0.0052\alpha_3^3) - (c_0^3 + 0.4284c_1^3 + 0.0243c_2^3 + 0.0052c_3^3) \le 0.2216$
$\lambda_5^3 - (\alpha_0^3 + 0.4216\alpha_1^3 + 0.0201\alpha_2^3 + 0.0016\alpha_3^3) - (c_0^3 + 0.4216c_1^3 + 0.0201c_2^3 + 0.0016c_3^3) \le 0.3090$
$\lambda_6^3 - (\alpha_0^3 + 0.3815\alpha_1^3 + 0.0229\alpha_2^3 + 0.0065\alpha_3^3) - (c_0^3 + 0.3815c_1^3 + 0.0229c_2^3 + 0.0065c_3^3) \le 0.2754$
$\lambda_7^3 - (\alpha_0^3 + 0.4676\alpha_1^3 + 0.0265\alpha_2^3 + 0.0023\alpha_3^3) - (c_0^3 + 0.4676c_1^3 + 0.0265c_2^3 + 0.0023c_3^3) \le 0.2460$
$\lambda_8^3 - (\alpha_0^3 + 0.4251\alpha_1^3 + 0.0313\alpha_2^3 + 0.0025\alpha_3^3) - (c_0^3 + 0.4251c_1^3 + 0.0313c_2^3 + 0.0025c_3^3) \le 0.5023$
$\lambda_9^3 - (\alpha_0^3 + 0.3582\alpha_1^3 + 0.0477\alpha_2^3 + 0.0014\alpha_3^3) - (c_0^3 + 0.3582c_1^3 + 0.0477c_2^3 + 0.0014c_3^3) \le 0.1190$
$\lambda_{10}^3 - (\alpha_0^3 + 0.4319\alpha_1^3 + 0.0339\alpha_2^3 + 0.0046\alpha_3^3) - (c_0^3 + 0.4319c_1^3 + 0.0339c_2^3 + 0.0046c_3^3) \le 0.5135$
$\lambda_{11}^3 - (\alpha_0^3 + 0.4807\alpha_1^3 + 0.0342\alpha_2^3 + 0.0076\alpha_3^3) - (c_0^3 + 0.4807c_1^3 + 0.0342c_2^3 + 0.0076c_3^3) \le 0.1141$
$\lambda_{12}^3 - (\alpha_0^3 + 0.3538\alpha_1^3 + 0.0362\alpha_2^3 + 0.0049\alpha_3^3) - (c_0^3 + 0.3538c_1^3 + 0.0362c_2^3 + 0.0049c_3^3) \le 0.1877$
and
$\lambda_{1}^{3} + (\alpha_{0}^{3} + 0.4431\alpha_{1}^{3} + 0.0277\alpha_{2}^{3} + 0.0012\alpha_{3}^{3}) - (c_{0}^{3} + 0.4431c_{1}^{3} + 0.0277c_{2}^{3} + 0.0012c_{3}^{3}) \le 1.7699$
$\lambda_{2}^{3} + (\alpha_{0}^{3} + 0.3859\alpha_{1}^{3} + 0.0173\alpha_{2}^{3} + 0.0045\alpha_{3}^{3}) - (c_{0}^{3} + 0.3859c_{1}^{3} + 0.0173c_{2}^{3} + 0.0045c_{3}^{3}) \le 1.7246$
$\lambda_{3}^{3} + (\alpha_{0}^{3} + 0.4333\alpha_{1}^{3} + 0.0202\alpha_{2}^{3} + 0.0001\alpha_{3}^{3}) - (c_{0}^{3} + 0.4333c_{1}^{3} + 0.0202c_{2}^{3} + 0.0001c_{3}^{3}) \le 1.5971$
$\lambda_{4}^{3} + (\alpha_{0}^{3} + 0.4284\alpha_{1}^{3} + 0.0243\alpha_{2}^{3} + 0.0052\alpha_{3}^{3}) - (c_{0}^{3} + 0.4284c_{1}^{3} + 0.0243c_{2}^{3} + 0.0052c_{3}^{3}) \le 1.7784$
$\lambda_{5}^{3} + (\alpha_{0}^{3} + 0.4216\alpha_{1}^{3} + 0.0201\alpha_{2}^{3} + 0.0016\alpha_{3}^{3}) - (c_{0}^{3} + 0.4216c_{1}^{3} + 0.0201c_{2}^{3} + 0.0016c_{3}^{3}) \le 1.6910$
$\lambda_{6}^{3} + (\alpha_{0}^{3} + 0.3815\alpha_{1}^{3} + 0.0229\alpha_{2}^{3} + 0.0065\alpha_{3}^{3}) - (c_{0}^{3} + 0.3815c_{1}^{3} + 0.0229c_{2}^{3} + 0.0065c_{3}^{3}) \le 1.7246$
$\lambda_{7}^{3} + (\alpha_{0}^{3} + 0.4676\alpha_{1}^{3} + 0.0265\alpha_{2}^{3} + 0.0023\alpha_{3}^{3}) - (c_{0}^{3} + 0.4676c_{1}^{3} + 0.0265c_{2}^{3} + 0.0023c_{3}^{3}) \le 1.7540$
$\lambda_{8}^{3} + (\alpha_{0}^{3} + 0.4251\alpha_{1}^{3} + 0.0313\alpha_{2}^{3} + 0.0025\alpha_{3}^{3}) - (c_{0}^{3} + 0.4251c_{1}^{3} + 0.0313c_{2}^{3} + 0.0025c_{3}^{3}) \le 1.4977$
$\lambda_{9}^{3} + (\alpha_{0}^{3} + 0.3582\alpha_{1}^{3} + 0.0477\alpha_{2}^{3} + 0.0014\alpha_{3}^{3}) - (c_{0}^{3} + 0.3582c_{1}^{3} + 0.0477c_{2}^{3} + 0.0014c_{3}^{3}) \le 1.8810$
$\lambda_{10}^{3} + (\alpha_{0}^{3} + 0.4319\alpha_{1}^{3} + 0.0339\alpha_{2}^{3} + 0.0046\alpha_{3}^{3}) - (c_{0}^{3} + 0.4319c_{1}^{3} + 0.0339c_{2}^{3} + 0.0046c_{3}^{3}) \le 1.4865$
$\lambda_{11}^{3} + (\alpha_{0}^{3} + 0.4807\alpha_{1}^{3} + 0.0342\alpha_{2}^{3} + 0.0076\alpha_{3}^{3}) - (c_{0}^{3} + 0.4807c_{1}^{3} + 0.0342c_{2}^{3} + 0.0076c_{3}^{3}) \le 1.8859$
$\lambda_{12}^{3} + (\alpha_{0}^{3} + 0.3538\alpha_{1}^{3} + 0.0362\alpha_{2}^{3} + 0.0049\alpha_{3}^{3}) - (c_{0}^{3} + 0.3538c_{1}^{3} + 0.0362c_{2}^{3} + 0.0049c_{3}^{3}) \le 1.8123$
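The LP models above share the structure of the classical possibilistic regression of Tanaka et al. [1], to which the PHFLRM adds the membership variables λ i k and score-based IPOP values. As a point of reference only, a minimal sketch of that crisp special case (centre vector alpha, non-negative spread vector c, inclusion level h; the function name and interface are illustrative, not taken from the paper) can be written with `scipy.optimize.linprog`:

```python
import numpy as np
from scipy.optimize import linprog

def tanaka_flr(X, y, h=0.0):
    """Possibilistic (Tanaka-style) fuzzy linear regression sketch.

    Finds symmetric triangular coefficients (alpha_j, c_j) minimizing the
    total spread while every observation lies inside the estimated fuzzy
    band at level h. Decision vector: [alpha_0..alpha_p, c_0..c_p].
    """
    y = np.asarray(y, dtype=float)
    X = np.column_stack([np.ones(len(y)), np.asarray(X, dtype=float)])  # intercept
    n, p = X.shape
    absX = np.abs(X)
    # Objective: minimize total spread sum_i c'|x_i| (alphas carry no cost).
    cost = np.concatenate([np.zeros(p), absX.sum(axis=0)])
    # Band constraints rewritten as A_ub z <= b_ub:
    #   alpha'x_i - (1-h) c'|x_i| <= y_i      (lower edge below y_i)
    #  -alpha'x_i - (1-h) c'|x_i| <= -y_i     (upper edge above y_i)
    A_ub = np.vstack([
        np.hstack([X, -(1.0 - h) * absX]),
        np.hstack([-X, -(1.0 - h) * absX]),
    ])
    b_ub = np.concatenate([y, -y])
    bounds = [(None, None)] * p + [(0, None)] * p  # spreads must be nonnegative
    res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:p], res.x[p:]  # centres, spreads
```

For exactly collinear crisp data, e.g. `tanaka_flr([[1], [2], [3]], [2, 4, 6])`, the sketch recovers the crisp line with zero spreads.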
After solving the linear programming model, as mentioned above, we obtain the values of λ i k ( i = 1 , 2 , … , 12 ) , α j k ( j = 0 , 1 , 2 , 3 ) and c j k ( j = 0 , 1 , 2 , 3 ) , which are shown in Table 3.
Table 3 shows that the estimated values λ i k obtained by solving the LP model are either equal to 1 or very close to 1. The resulting estimated PHFLRM, employing the equations from Section 3, is obtained as follows:
Y * = ( 0.6467 , 0.01252 ) + ( 0.1575 , 0.06378 ) X 1 + ( 5.3685 , 2.6160 ) X 2 + ( 8.8450 , 0.4811 ) X 3 .
Steps 6 and 7.
Using the PHFLRM, we find the estimated PHFEs ( Y * ) of all possible alternatives. For brevity, we compute only the estimated PHFE Y 1 * for the alternative A 1 , using Definitions 3 and 4, as follows:
Y 1 * = { 0.7194 ( 0.03889 ) , 0.7137 ( 0.03333 ) , 0.7197 ( 0.03888 ) , 0.7308 ( 0.04667 ) , 0.7253 ( 0.04667 ) , 0.7311 ( 0.04667 ) , 0.7424 ( 0.05444 ) , 0.7371 ( 0.04667 ) , 0.7559 ( 0.05444 ) , 0.7038 ( 0.02917 ) , 0.6977 ( 0.02500 ) , 0.7041 ( 0.02917 ) , 0.7158 ( 0.03500 ) , 0.7100 ( 0.03000 ) , 0.7162 ( 0.03500 ) , 0.7281 ( 0.04083 ) , 0.7225 ( 0.03500 ) , 0.7284 ( 0.04083 ) , 0.7038 ( 0.02917 ) , 0.6977 ( 0.02500 ) , 0.7041 ( 0.02917 ) , 0.7158 ( 0.03500 ) , 0.7100 ( 0.03000 ) , 0.7162 ( 0.03500 ) , 0.7281 ( 0.04083 ) , 0.7281 ( 0.03500 ) , 0.7284 ( 0.04083 ) }
By using Definition 5, the score value S c ( Y 1 * ) of the estimated PHFE Y 1 * is computed as 0.7223 . In the same way, the score values S c ( Y i * ) of the remaining estimated PHFEs, Y i * ( i = 2 , … , 12 ) , are obtained in Table 4. Further, the residual value e i against each alternative Y i is calculated as e i = S c ( Y i ) − S c ( Y i * ) , i = 1 , 2 , … , 12 , and finally, all alternatives are ranked using these residual values e i in Table 4. The smallest residual, e 3 = 30.04823 , corresponds to the alternative A 3 , so it is considered the best choice. The alternative A 11 has the largest residual, e 11 = 47.9838 , and is considered the worst alternative.
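Assuming the commonly used PHFE score S c ( h ) = Σ γ s p s / Σ p s (our reading of Definition 5, which is not reproduced in this excerpt), the scoring and residual-ranking step can be sketched as follows; `phfe_score` and `rank_by_residuals` are illustrative names:

```python
def phfe_score(phfe):
    """Score of a probabilistic hesitant fuzzy element, given as a list of
    (membership value, probability) pairs: the probability-weighted mean of
    the membership values (probabilities need not sum to 1)."""
    num = sum(g * p for g, p in phfe)
    den = sum(p for _, p in phfe)
    return num / den

def rank_by_residuals(observed_scores, estimated_scores):
    """Rank alternatives by the residual e_i = Sc(Y_i) - Sc(Y_i*);
    the smaller the residual, the better the alternative (rank 1 = best)."""
    residuals = [y - y_hat for y, y_hat in zip(observed_scores, estimated_scores)]
    order = sorted(range(len(residuals)), key=lambda i: residuals[i])
    ranks = [0] * len(residuals)
    for rank, i in enumerate(order, start=1):
        ranks[i] = rank
    return residuals, ranks
```

With the A 3 and A 11 rows of Table 4 as input, the helper reproduces their residuals and relative order.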

A Comparative Study of the PHFLRM and TOPSIS

The TOPSIS method, an MCDM tool, is used to verify the results and efficiency of our proposed approach. For the same problem, the results of the proposed method are compared with the results of the TOPSIS method. We take rainfall amount ( X 1 ) , farm size ( X 2 ) , irrigated area ( X 3 ) , and wheat yield ( Y ) as the benefit criteria. Following steps 1, 2, and 3 of the TOPSIS algorithm (Section 4.1), we obtain the PIS ( A + ) and NIS ( A − ) , as follows:
A + = { { 0.3422 , 0.3420 , 0.3292 } , { 0.4079 , 0.3542 , 0.4019 } , { 0.3437 , 0.3312 , 0.3420 } , { 0.3297 , 0.3540 , 0.3500 } }
A − = { { 0.2246 , 0.2403 , 0.2454 } , { 0.1765 , 0.2180 , 0.2153 } , { 0.2289 , 0.2674 , 0.2398 } , { 0.2338 , 0.2328 , 0.1953 } }
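The distance-and-closeness steps that follow can be sketched for crisp data as below; this is a generic TOPSIS illustration, not the probabilistic hesitant fuzzy variant of Section 4.1, and the `topsis` helper and its interface are assumed names:

```python
import numpy as np

def topsis(matrix, weights=None, benefit=None):
    """Generic crisp TOPSIS: vector-normalize and weight the decision matrix,
    locate the ideal (PIS) and anti-ideal (NIS) points, then score each
    alternative by its relative closeness P_i = D- / (D+ + D-)."""
    M = np.asarray(matrix, dtype=float)
    n, m = M.shape
    w = np.ones(m) / m if weights is None else np.asarray(weights, dtype=float)
    benefit = [True] * m if benefit is None else benefit
    V = w * M / np.linalg.norm(M, axis=0)            # weighted normalized matrix
    pis = np.where(benefit, V.max(axis=0), V.min(axis=0))
    nis = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_plus = np.linalg.norm(V - pis, axis=1)         # distance to ideal
    d_minus = np.linalg.norm(V - nis, axis=1)        # distance to anti-ideal
    closeness = d_minus / (d_plus + d_minus)
    ranking = (-closeness).argsort().argsort() + 1   # 1 = best
    return closeness, ranking
```

An alternative that dominates on every benefit criterion receives closeness 1 and rank 1; one dominated everywhere receives closeness 0.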
Further, the Euclidean distances ( D + and D − ) and the relative closeness ( P i ) of each alternative are computed in Table 5 by using steps 5, 6, and 7 of the algorithm (Section 4.1), as follows:
Table 5 shows that the best choice among the alternatives is A 3 , as it has the largest value of P i , whereas the alternative A 11 is considered the worst choice, as it has the smallest value of P i . Further, the two sets of rankings, R P H F L R M and R T O P S I S , are compared using the bar chart in Figure 2, as follows:
Figure 2 illustrates that the ranking orders of the two sets of rankings, R P H F L R M and R T O P S I S , are very similar, and that there is no significant difference between them. Although the graphical presentation provides a quick assessment of the performance of the two ranking sets, it is not conclusive. In order to determine the statistical significance of the two sets of rankings, Spearman's rank correlation coefficient is calculated, as shown in Table 6.
From Table 6, Spearman's correlation coefficient is calculated as r s = 1 − 6 ( 38 ) / 1716 = 0.87 , which shows that the two sets of rankings, R P H F L R M and R T O P S I S , are strongly related to each other [45]. To evaluate whether the correlation coefficient r s = 0.87 is significant, a statistical test is performed, taking the null hypothesis ( H 0 : there is no relationship between the two sets of rankings) against the alternative hypothesis ( H 1 : there is a relationship between the two sets of rankings) at the 5 % level of significance. As the calculated value Z = r s √ ( M − 1 ) = 0.87 √ ( 12 − 1 ) = 2.88 exceeds the table value Z 0.05 = 1.645 , we reject H 0 and conclude that there is a very strong relationship between the two sets of rankings. Additionally, the values of the correlation coefficient r w and the similarity coefficient W S [46] were examined for the considered example. These values are 0.8607 and 0.9289, respectively, confirming the close agreement between the obtained rankings.
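The Spearman calculation above can be checked directly from the two ranking columns of Table 6; the only assumption is the paper's large-sample statistic Z = r s √ ( M − 1 ) :

```python
import math

def spearman_from_ranks(r1, r2):
    """Spearman's rank correlation r_s = 1 - 6*sum(d^2) / (n*(n^2 - 1))
    for two rankings of the same n alternatives (no ties)."""
    n = len(r1)
    d2 = sum((a - b) ** 2 for a, b in zip(r1, r2))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Rankings of alternatives A1..A12 from Tables 4 and 5.
r_phflrm = [2, 6, 1, 8, 3, 5, 10, 9, 11, 7, 12, 4]
r_topsis = [2, 3, 1, 5, 4, 6, 8, 10, 11, 9, 12, 7]

rs = spearman_from_ranks(r_phflrm, r_topsis)  # sum d^2 = 38, rs ~ 0.87
z = rs * math.sqrt(len(r_phflrm) - 1)         # ~ 2.88 > 1.645, so reject H0
```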

6. Conclusions

This paper provides an MCDM approach based on FLRMs by incorporating probabilistic hesitant information. This concept has not been explored previously and offers a novel alternative to statistical regression in resolving MCDM challenges. The proposed PHFLRM methodology is applied in agriculture to evaluate wheat production in different districts of Pakistan by considering significant factors such as rainfall, farm size, and irrigated area. We examined the yields of twelve districts across the country in the context of three factors that significantly affect wheat yield production. Further criteria and alternatives could be included; however, computation becomes more complicated as the number of alternatives or criteria examined increases. Finally, the outcomes of the suggested methodology (PHFLRM) are compared to those of the widely used decision-making technique TOPSIS.
Compared with TOPSIS, the complexity of the proposed methodology does not grow as quickly when more criteria and alternatives are inserted into a given MCDM problem. The proposed methodology obtains the ranking for a decision-making problem by solving a simple LP model, which provides results quickly and with less computational time than TOPSIS. The proposed methodology may therefore be a feasible alternative decision-making approach that accommodates a high level of system fuzziness. In the future, we will further investigate the applications of the FLRM in decision making using different FS extensions, and we will also investigate the accuracy of the obtained results.

Author Contributions

Conceptualization, A.S. (Ayesha Sultan), W.S., S.F., M.I. and A.S. (Andrii Shekhovtsov); methodology, A.S. (Ayesha Sultan), W.S., S.F., M.I. and A.S. (Andrii Shekhovtsov); software, A.S. (Ayesha Sultan), W.S., S.F., M.I. and A.S. (Andrii Shekhovtsov); validation, A.S. (Ayesha Sultan), W.S., S.F., M.I. and A.S. (Andrii Shekhovtsov); formal analysis, A.S. (Ayesha Sultan), W.S., S.F., M.I. and A.S. (Andrii Shekhovtsov); investigation, A.S. (Ayesha Sultan), W.S., S.F., M.I. and A.S. (Andrii Shekhovtsov); resources, A.S. (Ayesha Sultan), W.S., S.F., M.I. and A.S. (Andrii Shekhovtsov); data curation, A.S. (Ayesha Sultan), W.S., S.F., M.I. and A.S. (Andrii Shekhovtsov); writing—original draft preparation, A.S. (Ayesha Sultan), W.S., S.F., M.I. and A.S. (Andrii Shekhovtsov); writing—review and editing, A.S. (Ayesha Sultan), W.S., S.F., M.I. and A.S. (Andrii Shekhovtsov); visualization, A.S. (Ayesha Sultan), W.S., S.F., M.I. and A.S. (Andrii Shekhovtsov); supervision, W.S. and S.F.; project administration, W.S. and S.F.; funding acquisition, W.S. All authors have read and agreed to the published version of the manuscript.

Funding

The work was supported by the National Science Centre 2018/29/B/HS4/02725 and 2021/41/B/HS4/01296 (Andrii Shekhovtsov and Wojciech Sałabun).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

The authors would like to thank the editor and the anonymous reviewers, whose insightful comments and constructive suggestions helped us to significantly improve the quality of this paper.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
MCDM      Multi-Criteria Decision-Making;
HFS       Hesitant Fuzzy Set;
PHFS      Probabilistic Hesitant Fuzzy Set;
FLRM      Fuzzy Linear Regression Model;
PHFE      Probabilistic Hesitant Fuzzy Element;
IPOP      Input–Output;
LPM       Linear Programming Model;
PHFLRM    Probabilistic Hesitant Fuzzy Linear Regression Model;
TOPSIS    Technique for Order Preference by Similarity to Ideal Solution.

References

  1. Tanaka, H.; Uejima, S.; Asai, K. Linear regression analysis with fuzzy model. IEEE Trans. Syst. Man Cybern. 1982, 12, 903–907. [Google Scholar]
  2. Tanaka, H. Fuzzy data analysis by possibilistic linear models. Fuzzy Sets Syst. 1987, 24, 363–375. [Google Scholar] [CrossRef]
  3. Celmiņš, A. Least squares model fitting to fuzzy vector data. Fuzzy Sets Syst. 1987, 22, 245–269. [Google Scholar] [CrossRef]
  4. Diamond, P. Fuzzy least squares. Inf. Sci. 1988, 46, 141–157. [Google Scholar] [CrossRef]
  5. Peters, G. Fuzzy linear regression with fuzzy intervals. Fuzzy Sets Syst. 1994, 63, 45–55. [Google Scholar] [CrossRef]
  6. Wang, H.F.; Tsaur, R.C. Bicriteria variable selection in a fuzzy regression equation. Comput. Math. Appl. 2000, 40, 877–883. [Google Scholar] [CrossRef]
  7. Hong, D.H.; Song, J.K.; Do, H.Y. Fuzzy least-squares linear regression analysis using shape preserving operations. Inf. Sci. 2001, 138, 185–193. [Google Scholar] [CrossRef]
  8. Tanaka, H.; Lee, H. Fuzzy linear regression combining central tendency and possibilistic properties. In Proceedings of the 6th International Fuzzy Systems Conference, Barcelona, Spain, 1–5 July 1997; Volume 1, pp. 63–68. [Google Scholar]
  9. Modarres, M.; Nasrabadi, E.; Nasrabadi, M.M. Fuzzy linear regression analysis from the point of view risk. Int. J. Uncertain. Fuzziness Knowl.-Based Syst. 2004, 12, 635–649. [Google Scholar] [CrossRef]
  10. Parvathi, R.; Malathi, C.; Akram, M.; Atanassov, K.T. Intuitionistic fuzzy linear regression analysis. Fuzzy Optim. Decis. Mak. 2013, 12, 215–229. [Google Scholar] [CrossRef]
  11. Sultan, A.; Sałabun, W.; Faizi, S.; Ismail, M. Hesitant Fuzzy linear regression model for decision making. Symmetry 2021, 13, 1846. [Google Scholar] [CrossRef]
  12. Bardossy, A. Note on fuzzy regression. Fuzzy Sets Syst. 1990, 37, 65–75. [Google Scholar] [CrossRef]
  13. Zadeh, L.A. Fuzzy sets. Inf. Control 1965, 8, 338–353. [Google Scholar]
  14. Sahu, R.; Dash, S.R.; Das, S. Career selection of students using hybridized distance measure based on picture fuzzy set and rough set theory. Decis. Mak. Appl. Manag. Eng. 2021, 4, 104–126. [Google Scholar] [CrossRef]
  15. Gorcun, O.F.; Senthil, S.; Küçükönder, H. Evaluation of tanker vehicle selection using a novel hybrid fuzzy MCDM technique. Decis. Mak. Appl. Manag. Eng. 2021, 4, 140–162. [Google Scholar] [CrossRef]
  16. Zadeh, L.A. The concept of a linguistic variable and its application to approximate reasoning—I. Inf. Sci. 1975, 8, 199–249. [Google Scholar] [CrossRef]
  17. Torra, V. Hesitant fuzzy sets. Int. J. Intell. Syst. 2010, 25, 529–539. [Google Scholar] [CrossRef]
  18. Zhu, B.; Xu, Z. Probability-hesitant fuzzy sets and the representation of preference relations. Technol. Econ. Dev. Econ. 2018, 24, 1029–1040. [Google Scholar] [CrossRef]
  19. Atanassov, K.T. Intuitionistic fuzzy sets. Fuzzy Sets Syst. 1986, 20, 87–96. [Google Scholar] [CrossRef]
  20. Liu, X.; Wang, Z.; Zhang, S.; Garg, H. Novel correlation coefficient between hesitant fuzzy sets with application to medical diagnosis. Expert Syst. Appl. 2021, 183, 115393. [Google Scholar] [CrossRef]
  21. Zeng, W.; Xi, Y.; Yin, Q.; Guo, P. Weighted dual hesitant fuzzy set and its application in group decision making. Neurocomputing 2021, 458, 714–726. [Google Scholar] [CrossRef]
  22. Yan, Y.; Wu, X.; Wu, Z. Bridge safety monitoring and evaluation based on hesitant fuzzy set. Alex. Eng. J. 2022, 61, 1183–1200. [Google Scholar] [CrossRef]
  23. Zhang, S.; Xu, Z.; He, Y. Operations and integrations of probabilistic hesitant fuzzy information in decision making. Inf. Fusion 2017, 38, 1–11. [Google Scholar] [CrossRef]
  24. Gao, J.; Xu, Z.; Liao, H. A dynamic reference point method for emergency response under hesitant probabilistic fuzzy environment. Int. J. Fuzzy Syst. 2017, 19, 1261–1278. [Google Scholar] [CrossRef]
  25. Li, J.; Wang, J.Q. An extended QUALIFLEX method under probability hesitant fuzzy environment for selecting green suppliers. Int. J. Fuzzy Syst. 2017, 19, 1866–1879. [Google Scholar] [CrossRef]
  26. Wu, Z.; Jin, B.; Xu, J. Local feedback strategy for consensus building with probability-hesitant fuzzy preference relations. Appl. Soft Comput. 2018, 67, 691–705. [Google Scholar] [CrossRef]
  27. Saaty, T.L. The Analytic Hierarchy Process; McGraw-Hill: New York, NY, USA, 1980. [Google Scholar]
  28. Rezaei, J. Best-worst multi-criteria decision-making method. Omega 2015, 53, 49–57. [Google Scholar] [CrossRef]
  29. Keshavarz Ghorabaee, M.; Zavadskas, E.K.; Olfat, L.; Turskis, Z. Multi-criteria inventory classification using a new method of evaluation based on distance from average solution (EDAS). Informatica 2015, 26, 435–451. [Google Scholar] [CrossRef]
  30. Palczewski, K.; Sałabun, W. The fuzzy TOPSIS applications in the last decade. Procedia Comput. Sci. 2019, 159, 2294–2303. [Google Scholar] [CrossRef]
  31. Pamucar, D.; Žižović, M.; Biswas, S.; Božanić, D. A new logarithm methodology of additive weights (LMAW) for multi-criteria decision-making: Application in logistics. Facta Univ. Ser. Mech. Eng. 2021, 19, 361–380. [Google Scholar] [CrossRef]
  32. Pamucar, D.; Ecer, F. Prioritizing the weights of the evaluation criteria under fuzziness: The fuzzy full consistency method–FUCOM-F. Facta Univ. Ser. Mech. Eng. 2020, 18, 419–437. [Google Scholar] [CrossRef]
  33. Faizi, S.; Sałabun, W.; Ullah, S.; Rashid, T.; Więckowski, J. A new method to support decision-making in an uncertain environment based on normalized interval-valued triangular fuzzy numbers and comet technique. Symmetry 2020, 12, 516. [Google Scholar] [CrossRef] [Green Version]
  34. Faizi, S.; Rashid, T.; Sałabun, W.; Zafar, S.; Wątróbski, J. Decision making with uncertainty using hesitant fuzzy sets. Int. J. Fuzzy Syst. 2018, 20, 93–103. [Google Scholar] [CrossRef] [Green Version]
  35. Stanujkić, D.; Karabašević, D. An extension of the WASPAS method for decision-making problems with intuitionistic fuzzy numbers: A case of website evaluation. Oper. Res. Eng. Sci. Theory Appl. 2018, 1, 29–39. [Google Scholar] [CrossRef]
  36. Dezert, J.; Tchamova, A.; Han, D.; Tacnet, J.M. The SPOTIS rank reversal free method for multi-criteria decision-making support. In Proceedings of the 2020 IEEE 23rd International Conference on Information Fusion (FUSION), Rustenburg, South Africa, 6–9 July 2020; pp. 1–8. [Google Scholar]
  37. Shekhovtsov, A.; Kizielewicz, B.; Sałabun, W. New rank-reversal free approach to handle interval data in mcda problems. In Proceedings of the International Conference on Computational Science, Krakow, Poland, 16–18 June 2021; Springer: Berlin/Heidelberg, Germany, 2021; pp. 458–472. [Google Scholar]
  38. Božanić, D.; Milić, A.; Tešić, D.; Salabun, W.; Pamučar, D. D numbers–FUCOM–fuzzy RAFSI model for selecting the group of construction machines for enabling mobility. Facta Univ. Ser. Mech. Eng. 2021, 19, 447–471. [Google Scholar] [CrossRef]
  39. Đalić, I.; Ateljević, J.; Stević, Ž.; Terzić, S. An integrated swot–fuzzy piprecia model for analysis of competitiveness in order to improve logistics performances. Facta Univ. Ser. Mech. Eng. 2020, 18, 439–451. [Google Scholar] [CrossRef]
  40. Li, Z.; Wei, G. Pythagorean fuzzy heronian mean operators in multiple attribute decision making and their application to supplier selection. Int. J. Knowl.-Based Intell. Eng. Syst. 2019, 23, 77–91. [Google Scholar] [CrossRef]
  41. Ashraf, A.; Ullah, K.; Hussain, A.; Bari, M. Interval-Valued Picture Fuzzy Maclaurin Symmetric Mean Operator with application in Multiple Attribute Decision-Making. Rep. Mech. Eng. 2022, 3, 301–317. [Google Scholar] [CrossRef]
  42. Wei, G.; Lu, M.; Gao, H. Picture fuzzy heronian mean aggregation operators in multiple attribute decision making. Int. J. Knowl.-Based Intell. Eng. Syst. 2018, 22, 167–175. [Google Scholar] [CrossRef]
  43. Karsak, E.E.; Sener, Z.; Dursun, M. Robot selection using a fuzzy regression-based decision-making approach. Int. J. Prod. Res. 2012, 50, 6826–6834. [Google Scholar] [CrossRef]
  44. Kim, K.J.; Moskowitz, H.; Koksalan, M. Fuzzy versus statistical linear regression. Eur. J. Oper. Res. 1996, 92, 417–434. [Google Scholar] [CrossRef]
  45. Chowdhury, A.K.; Debsarkar, A.; Chakrabarty, S. Novel methods for assessing urban air quality: Combined air and noise pollution approach. J. Atmos. Pollut. 2015, 3, 1–8. [Google Scholar]
  46. Sałabun, W.; Urbaniak, K. A new coefficient of rankings similarity in decision-making problems. In Proceedings of the International Conference on Computational Science, Amsterdam, The Netherlands, 3–5 June 2020; Springer: Berlin/Heidelberg, Germany, 2020; pp. 632–645. [Google Scholar]
Figure 1. Flowchart of the proposed algorithm for PHFLRM.
Figure 2. Ranking with PHFLRM and TOPSIS.
Table 1. Decision matrix H.
A i Y i X i 1 X i 2 X i 3
A 1 { 475 ( 0.35 ) , 478 ( 0.30 ) , 482 ( 0.35 ) } { 280 ( 0.35 ) , 281 ( 0.30 ) , 282 ( 0.35 ) } { 23 ( 0.25 ) , 24 ( 0.30 ) , 25 ( 0.35 ) } { 13.40 ( 0.40 ) , 13.50 ( 0.30 ) }
A 2 { 520 ( 0.35 ) , 524 ( 0.35 ) , 530 ( 0.30 ) } { 245 ( 0.30 ) , 246 ( 0.35 ) , 247 ( 0.35 ) } { 25 ( 0.35 ) , 26 ( 0.30 ) , 27 ( 0.25 ) } { 13.30 ( 0.35 ) , 13.50 ( 0.30 ) , 13.60 ( 0.35 ) }
A 3 { 436 ( 0.35 ) , 438 ( 0.35 ) , 439 ( 0.30 ) } { 320 ( 0.35 ) , 321 ( 0.35 ) , 322 ( 0.30 ) } { 24 ( 0.25 ) , 25 ( 0.30 ) , 26 ( 0.25 ) } { 12.55 ( 0.35 ) , 12.65 ( 0.35 ) , 12.70 ( 0.30 ) }
A 4 { 530 ( 0.35 ) , 536 ( 0.30 ) , 540 ( 0.30 ) } { 271 ( 0.35 ) , 272 ( 0.30 ) , 273 ( 0.35 ) } { 25 ( 0.40 ) , 26 ( 0.30 ) , 30 ( 0.30 ) } { 13.75 ( 0.30 ) , 13.25 ( 0.30 ) , 13.30 ( 0.35 ) }
A 5 { 496 ( 0.40 ) , 500 ( 0.30 ) , 506 ( 0.30 ) } { 296 ( 0.35 ) , 297 ( 0.30 ) , 298 ( 0.30 ) } { 26 ( 0.35 ) , 27 ( 0.35 ) , 27 ( 0.30 ) } { 13.68 ( 0.40 ) , 13.75 ( 0.30 ) , 13.80 ( 0.30 ) }
A 6 { 520 ( 0.40 ) , 526 ( 0.30 ) , 530 ( 0.30 ) } { 283 ( 0.35 ) , 284 ( 0.35 ) , 285 ( 0.30 ) } { 27 ( 0.35 ) , 28 ( 0.30 ) , 29 ( 0.30 ) } { 14.00 ( 0.35 ) , 14.50 ( 0.30 ) , 14.80 ( 0.35 ) }
A 7 { 540 ( 0.35 ) , 545 ( 0.35 ) , 551 ( 0.30 ) } { 295 ( 0.35 ) , 296 ( 0.30 ) , 297 ( 0.35 ) } { 28 ( 0.35 ) , 28 ( 0.30 ) , 30 ( 0.30 ) } { 14.18 ( 0.35 ) , 14.25 ( 0.35 ) , 14.30 ( 0.30 ) }
A 8 { 530 ( 0.40 ) , 545 ( 0.40 ) , 552 ( 0.40 ) } { 269 ( 0.30 ) , 270 ( 0.35 ) , 271 ( 0.35 ) } { 34 ( 0.35 ) , 34 ( 0.35 ) , 35 ( 0.30 ) } { 14.23 ( 0.40 ) , 14.35 ( 0.30 ) , 14.40 ( 0.30 ) }
A 9 { 545 ( 0.30 ) , 548 ( 0.35 ) , 550 ( 0.35 ) } { 243 ( 0.35 ) , 250 ( 0.30 ) , 255 ( 0.30 ) } { 39 ( 0.35 ) , 40 ( 0.30 ) } { 13.30 ( 0.40 ) , 13.50 ( 0.30 ) , 13.60 ( 0.30 ) }
A 10 { 532 ( 0.40 ) , 537 ( 0.40 ) , 540 ( 0.20 ) } { 303 ( 0.30 ) , 304 ( 0.35 ) , 305 ( 0.30 ) } { 33 ( 0.30 ) , 34 ( 0.35 ) , 35 ( 0.30 ) } { 13.38 ( 0.30 ) , 13.45 ( 0.35 ) , 13.65 ( 0.35 ) }
A 11 { 544 ( 0.35 ) , 550 ( 0.30 ) , 553 ( 0.35 ) } { 303 ( 0.30 ) , 304 ( 0.35 ) , 305 ( 0.35 ) } { 38 ( 0.35 ) , 38 ( 0.30 ) , 40 ( 0.25 ) } { 13.35 ( 0.30 ) , 13.45 ( 0.30 ) , 13.55 ( 0.40 ) }
A 12 { 503 ( 0.30 ) , 507 ( 0.35 ) , 508 ( 0.35 ) } { 250 ( 0.30 ) , 251 ( 0.35 ) , 252 ( 0.30 ) } { 31 ( 0.30 ) , 32 ( 0.35 ) , 33 ( 0.35 ) } { 12.63 ( 0.30 ) , 13.75 ( 0.35 ) }
Table 2. Decision matrix.
A i Y i X i 1 X i 2 X i 3
A 1 { 475 ( 0.35 ) , 478 ( 0.30 ) , 482 ( 0.35 ) } { 280 ( 0.35 ) , 281 ( 0.30 ) , 282 ( 0.35 ) } { 23 ( 0.25 ) , 24 ( 0.30 ) , 25 ( 0.35 ) } { 13.40 ( 0.40 ) , 13.50 ( 0.30 ) , 13.50 ( 0.30 ) }
A 2 { 520 ( 0.35 ) , 524 ( 0.35 ) , 530 ( 0.30 ) } { 245 ( 0.30 ) , 246 ( 0.35 ) , 247 ( 0.35 ) } { 25 ( 0.35 ) , 26 ( 0.30 ) , 27 ( 0.25 ) } { 13.30 ( 0.35 ) , 13.50 ( 0.30 ) , 13.60 ( 0.35 ) }
A 3 { 436 ( 0.35 ) , 438 ( 0.35 ) , 439 ( 0.30 ) } { 320 ( 0.35 ) , 321 ( 0.35 ) , 322 ( 0.30 ) } { 24 ( 0.25 ) , 25 ( 0.30 ) , 26 ( 0.25 ) } { 12.55 ( 0.35 ) , 12.65 ( 0.35 ) , 12.70 ( 0.30 ) }
A 4 { 530 ( 0.35 ) , 536 ( 0.30 ) , 540 ( 0.30 ) } { 271 ( 0.35 ) , 272 ( 0.30 ) , 273 ( 0.35 ) } { 25 ( 0.40 ) , 26 ( 0.30 ) , 30 ( 0.30 ) } { 13.75 ( 0.30 ) , 13.25 ( 0.30 ) , 13.30 ( 0.35 ) }
A 5 { 496 ( 0.40 ) , 500 ( 0.30 ) , 506 ( 0.30 ) } { 296 ( 0.35 ) , 297 ( 0.30 ) , 298 ( 0.30 ) } { 26 ( 0.35 ) , 27 ( 0.35 ) , 27 ( 0.30 ) } { 13.68 ( 0.40 ) , 13.75 ( 0.30 ) , 13.80 ( 0.30 ) }
A 6 { 520 ( 0.40 ) , 526 ( 0.30 ) , 530 ( 0.30 ) } { 283 ( 0.35 ) , 284 ( 0.35 ) , 285 ( 0.30 ) } { 27 ( 0.35 ) , 28 ( 0.30 ) , 29 ( 0.30 ) } { 14.00 ( 0.35 ) , 14.50 ( 0.30 ) , 14.80 ( 0.35 ) }
A 7 { 540 ( 0.35 ) , 545 ( 0.35 ) , 551 ( 0.30 ) } { 295 ( 0.35 ) , 296 ( 0.30 ) , 297 ( 0.35 ) } { 28 ( 0.35 ) , 28 ( 0.30 ) , 30 ( 0.30 ) } { 14.18 ( 0.35 ) , 14.25 ( 0.35 ) , 14.30 ( 0.30 ) }
A 8 { 530 ( 0.40 ) , 545 ( 0.40 ) , 552 ( 0.40 ) } { 269 ( 0.30 ) , 270 ( 0.35 ) , 271 ( 0.35 ) } { 34 ( 0.35 ) , 34 ( 0.35 ) , 35 ( 0.30 ) } { 14.23 ( 0.40 ) , 14.35 ( 0.30 ) , 14.40 ( 0.30 ) }
A 9 { 545 ( 0.30 ) , 548 ( 0.35 ) , 550 ( 0.35 ) } { 243 ( 0.35 ) , 250 ( 0.30 ) , 255 ( 0.30 ) } { 39 ( 0.35 ) , 40 ( 0.30 ) , 40 ( 0.30 ) } { 13.30 ( 0.40 ) , 13.50 ( 0.30 ) , 13.60 ( 0.30 ) }
A 10 { 532 ( 0.40 ) , 537 ( 0.40 ) , 540 ( 0.20 ) } { 303 ( 0.30 ) , 304 ( 0.35 ) , 305 ( 0.30 ) } { 33 ( 0.30 ) , 34 ( 0.35 ) , 35 ( 0.30 ) } { 13.38 ( 0.30 ) , 13.45 ( 0.35 ) , 13.65 ( 0.35 ) }
A 11 { 544 ( 0.35 ) , 550 ( 0.30 ) , 553 ( 0.35 ) } { 303 ( 0.30 ) , 304 ( 0.35 ) , 305 ( 0.35 ) } { 38 ( 0.35 ) , 38 ( 0.30 ) , 40 ( 0.25 ) } { 13.35 ( 0.30 ) , 13.45 ( 0.30 ) , 13.55 ( 0.40 ) }
A 12 { 503 ( 0.30 ) , 507 ( 0.35 ) , 508 ( 0.35 ) } { 250 ( 0.30 ) , 251 ( 0.35 ) , 252 ( 0.30 ) } { 31 ( 0.30 ) , 32 ( 0.35 ) , 33 ( 0.35 ) } { 12.63 ( 0.30 ) , 13.75 ( 0.35 ) , 13.75 ( 0.35 ) }
Table 3. Estimated values obtained by PHFLRM.
k = 1 k = 2 k = 3
λ 1 1 = 1.0000 λ 1 2 = 1.0000 λ 1 3 = 1.0000
λ 2 1 = 1.0000 λ 2 2 = 0.9872 λ 2 3 = 1.0000
λ 3 1 = 1.0000 λ 3 2 = 1.0000 λ 3 3 = 0.9975
λ 4 1 = 1.0000 λ 4 2 = 1.0000 λ 4 3 = 1.0000
λ 5 1 = 1.0000 λ 5 2 = 1.0000 λ 5 3 = 1.0000
λ 6 1 = 1.0000 λ 6 2 = 1.0000 λ 6 3 = 1.0000
λ 7 1 = 1.0000 λ 7 2 = 1.0000 λ 7 3 = 1.0000
λ 8 1 = 1.0000 λ 8 2 = 1.0000 λ 8 3 = 1.0000
λ 9 1 = 0.9824 λ 9 2 = 1.0000 λ 9 3 = 1.0000
λ 10 1 = 1.0000 λ 10 2 = 1.0000 λ 10 3 = 0.9797
λ 11 1 = 1.0000 λ 11 2 = 1.0000 λ 11 3 = 1.0000
λ 12 1 = 1.0000 λ 12 2 = 1.0000 λ 12 3 = 1.0000
α 0 1 = 0.5984 α 0 2 = 0.8602 α 0 3 = 0.4812
α 1 1 = 2.1288 α 1 2 = 15.5364 α 1 3 = 8.8699
α 2 1 = 3.2602 α 2 2 = 11.1963 α 2 3 = 1.6489
α 3 1 = 0.3059 α 3 2 = 1.0018 α 3 3 = 0.2233
c 0 1 = 0.0000 c 0 2 = 0.03757 c 0 3 = 0.0000
c 1 1 = 0.0000 c 1 2 = 0.0000 c 1 3 = 1.4431
c 2 1 = 1.3864 c 2 2 = 1.7105 c 2 3 = 4.7510
c 3 1 = 0.1913 c 3 2 = 0.0000 c 3 3 = 0.0000
Table 4. Ranking with PHFLRM ( R P H F L R M ).
A i Sc ( Y i ) Sc ( Y i * ) e i R P H F L R M
A 1 37.0085 0.7223 36.2862 2
A 2 44.2253 0.7203 43.5050 6
A 3 30.7738 0.7255 30.04823 1
A 4 46.0855 0.7258 45.3597 8
A 5 40.1481 0.7305 39.4176 3
A 6 44.1899 0.7333 43.4566 5
A 7 47.7795 0.7362 47.0433 10
A 8 46.3455 0.7465 45.5989 9
A 9 48.6282 0.7526 47.8756 11
A 10 45.4646 0.7476 44.7179 7
A 11 48.7416 0.7578 47.9838 12
A 12 41.5150 0.7352 40.7798 4
Table 5. Ranking using the TOPSIS (Section 4.1) approach.
A i D + D P i R TOPSIS
A 1 0.3595 0.1877 0.6570 2
A 2 0.3311 0.1781 0.6503 3
A 3 0.3585 0.1850 0.6591 1
A 4 0.2976 0.2094 0.5870 5
A 5 0.3075 0.2156 0.5878 4
A 6 0.2764 0.2262 0.5500 6
A 7 0.2497 0.2465 0.5032 8
A 8 0.2446 0.2948 0.4535 10
A 9 0.2051 0.3605 0.3627 11
A 10 0.2478 0.2936 0.4577 9
A 11 0.1856 0.3731 0.3323 12
A 12 0.2817 0.2385 0.5416 7
Table 6. Spearman's rank correlation coefficients.
A i R P H F L R M R T O P S I S d 2
A 1 2 2 0
A 2 6 3 9
A 3 1 1 0
A 4 8 5 9
A 5 3 4 1
A 6 5 6 1
A 7 10 8 4
A 8 9 10 1
A 9 11 11 0
A 10 7 9 4
A 11 12 12 0
A 12 4 7 9