Modifying Hellwig’s Method for Multi-Criteria Decision-Making with Mahalanobis Distance for Addressing Asymmetrical Relationships

Abstract: Hellwig's method is a multi-criteria decision-making technique designed to facilitate the ranking of alternatives based on their proximity to the ideal solution. Typically, this approach calculates distances using the Euclidean norm, implicitly assuming that the considered criteria are independent. However, in real-world situations, the assumption of criteria independence is rarely met. This paper proposes an extension of Hellwig's method that incorporates the Mahalanobis distance. Replacing the Euclidean distance with the Mahalanobis distance has proven effective in handling correlations among criteria, especially in the context of asymmetrical relationships between criteria. We then investigate the impact of the Euclidean and Mahalanobis distance measures on several variants of the Hellwig procedure, analyzing examples based on illustrative data with 10 alternatives and 4 criteria. Additionally, we examine the influence of three normalization formulas in Hellwig's aggregation procedure. The results indicate that both the distance measure and the normalization formula affect the final rankings. The evaluation and ranking of alternatives using the Euclidean distance measure are influenced by the normalization formula, albeit to a limited extent. In contrast, the Mahalanobis distance-based Hellwig method remains unaffected by the choice of normalization formula. The study concludes that the ranking of alternatives strongly depends on the distance measure employed, whether Euclidean or Mahalanobis. The Mahalanobis distance-based Hellwig method is a valuable tool for decision-makers in real-life situations: it enables the evaluation of alternatives while considering interactions between criteria, providing a more comprehensive perspective for decision-making.


Introduction
Multi-criteria decision-making (MCDM) methods are a collection of techniques designed to address complex problems that involve the evaluation and ranking of alternatives based on multiple criteria, which may sometimes conflict with each other [1,2]. These methods are widely used in various fields [3], including business [4], engineering [5], environmental science [6], sustainability [7], and public policy [8], among others. The goal is to provide decision-makers with a systematic and structured approach to making choices when faced with a range of alternatives. Among them, there is a class of techniques based on aggregation formulas incorporating reference solutions, such as TOPSIS (Technique for Order Preference by Similarity to Ideal Solution) [9], Hellwig's method [10], VIKOR (VlseKriterijumska Optimizacija I Kompromisno Resenje) [11], DAPR [12], BWM (the Best-Worst Method) [13], or BIPOLAR [14].
MCDM methods typically involve several steps to determine the overall preference value for each alternative, as follows:
• Weights determination: This process involves assigning weights to the criteria based on their relative importance in the decision-making process.
• Distance measure: This step calculates the distance between the alternatives and reference points, providing a measure of their dissimilarity or similarity.
• Aggregation formula: Aggregation involves combining the normalized values, weights, and distance measures to obtain an overall preference value for each alternative.
The paper focuses on Hellwig's method [10], which is based on measuring the distances from the alternatives to the ideal solution. Two critical aspects of this method are scrutinized: the distance measure and the normalization formula.
The aims of the paper are twofold. Firstly, it introduces an extension of Hellwig's method, namely the Mahalanobis distance-based Hellwig method (HM). The classical Hellwig method (H) relies on the Euclidean distance, implicitly assuming that the criteria are independent. However, real-life situations may not always align with this assumption; therefore, it is necessary to adapt the technique to such situations. The Mahalanobis distance is employed to measure the distance between the alternatives and the ideal, taking into consideration the dependence among criteria. While the Euclidean distance presupposes independence among variables, the Mahalanobis distance considers the covariance structure, making it more appropriate for datasets with correlated or asymmetrically distributed variables.
Secondly, we specifically investigate the impact of the distance measure (Euclidean vs. Mahalanobis) and the normalization formula in Hellwig's measure. Various normalization methods that can be employed within MCDM have been proposed in the literature [1,9,15]. The article by Jahan and Edwards [15] undertakes a comparative analysis of six normalization techniques within multi-criteria decision-making methods. For our comparative analysis, we employed three well-known normalization procedures: vector normalization, linear scale transformation (Max-Min method), and linear scale transformation (Sum method).
Several authors have investigated how alternative normalization procedures can influence the ranking of alternatives obtained through MCDM methods [16][17][18][19][20][21][22][23]. We analyze and compare results derived from examples utilizing different variants of Hellwig's method, taking into account two distance measures and three normalization formulas. This analysis is conducted using illustrative data comprising 10 alternatives and 4 criteria.
The rest of the paper is organized as follows: In Section 2, we briefly outline the concept of the Mahalanobis distance and its application in multi-criteria analyses. In Section 3, the classical and extended Hellwig methods are presented. In Section 4, five illustrative examples are investigated concerning distance measures (Euclidean and Mahalanobis) and normalization formulas (vector normalization, min-max method, sum method), with differences in the dependence between criteria. The paper finishes with conclusions.

Mahalanobis Distance and Multi-Criteria Analyses
The Mahalanobis distance, first proposed by Mahalanobis in 1936 [24], is a statistical metric for measuring distance with particular applicability in tasks such as classification, clustering, and multi-criteria decision-making. It measures the separation between two points within a multi-dimensional space while accounting for the covariance among the variables. The covariance matrix incorporated in the distance calculation represents the interrelationships and interdependencies among variables. When the covariance matrix is equal to the identity matrix, the Mahalanobis distance simplifies to the Euclidean distance. A more precise description, calculation, and comparison of the Euclidean and Mahalanobis distances can be found in [25]. Studies [26][27][28][29] are devoted to the Mahalanobis distance and its properties in the context of multicriteria analysis.
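The geometric difference between the two metrics can be illustrated with a short sketch (NumPy; the data, the two test points, and all names are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)
# Two strongly correlated criteria (hypothetical data): the second column
# is largely a linear function of the first.
X = rng.normal(size=(200, 2))
X[:, 1] = 0.9 * X[:, 0] + 0.1 * X[:, 1]

C_inv = np.linalg.inv(np.cov(X, rowvar=False))  # inverse covariance matrix

def euclidean(u, v):
    return np.sqrt(np.sum((u - v) ** 2))

def mahalanobis(u, v, C_inv):
    d = u - v
    return np.sqrt(d @ C_inv @ d)

origin = np.zeros(2)
a = np.array([1.0, 1.0])    # lies along the correlation direction
b = np.array([1.0, -1.0])   # lies across the correlation direction

# Both points are equally far from the origin in Euclidean terms, but the
# Mahalanobis distance penalizes the direction that conflicts with the
# observed covariance structure, so b ends up much farther away than a.
d_euc_a, d_euc_b = euclidean(a, origin), euclidean(b, origin)
d_mah_a, d_mah_b = mahalanobis(a, origin, C_inv), mahalanobis(b, origin, C_inv)
```

With an identity covariance matrix, the two functions would coincide, which is the simplification mentioned above.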
Multi-criteria methods based on Mahalanobis and Euclidean distances find widespread application in data analysis. The Mahalanobis distance finds utility in several MCDM approaches, including TOPSIS [26][27][28][29][30][31], TODIM (an acronym in Portuguese for Interactive and Multicriteria Decision Making) [32], and other decision-making problems [33][34][35][36]. The Mahalanobis distance, incorporating correlations among diverse criteria, empowers us to proficiently address the asymmetrical relationships among criteria. It aids decision-makers in evaluating alternatives based on their preferences and goals, taking into account the interaction between criteria.

Hellwig's Framework: A Short Literature Review
Hellwig's method [10], originally proposed by Hellwig in 1968, has undergone several modifications to address real-life problems. In his pioneering work [10], Hellwig introduced the concept of a development measure built on a pattern of economic development constructed from the most favorable values of each criterion. This method allows for determining the ranking of objects described in a multidimensional space by calculating the distances between the pattern of development and the objects. This concept has been applied to assess differences and similarities among various countries regarding qualified staff, corresponding to the economic development level of each country.
Hellwig's method is particularly popular as a linear ordering technique in the Polish literature, especially in the field of economic research. It is worth noting that the number of citations has been steadily increasing, particularly due to numerous publications in English. It has also gained recognition among international researchers as a multi-criteria method based on a reference point. According to Google Scholar (Harzing's Publish or Perish 8 software as of 1 January 2024), the paper [10] has been cited 1479 times (26.41 times per year).
Hellwig's general framework consists of the following steps:
Step 1. Determination of the decision matrix D = [x_ij]_{m×n}, where x_ij is the value of the j-th criterion for the i-th alternative (i = 1, ..., m; j = 1, ..., n).
Step 2. Determination of the vector of weights w = [w_1, ..., w_n], where w_j > 0 (j = 1, ..., n) is the weight of the criterion C_j and ∑_{j=1}^{n} w_j = 1. In the later analyses, we implemented equal weights. However, it should be noted that various proposals for establishing weights exist in the literature [59][60][61][62][63][64]. Tzeng et al. [65] classify weighting methods as objective when weights are computed from outcomes and subjective when they depend only on the preferences of decision-makers. A third class combines subjective and objective weighting methods. Da Silva et al. [62] identified and discussed more than 50 methods, of which 49 are subjective, 7 are objective, and the others are hybrid.
Step 3. Determination of the ideal solution I = [x_1^+, ..., x_n^+], where x_j^+ = max_i x_ij for benefit criteria and x_j^+ = min_i x_ij for cost criteria (j = 1, ..., n).
Step 4. Normalization of the decision matrix. We present here three well-known and frequently used normalization techniques that we later apply in the comparison studies [9,19,20]:
• Vector normalization, which transforms performance ratings into a normalized vector as follows:
z_ij = x_ij / sqrt(∑_{i=1}^{m} x_ij^2);
• Linear scale transformation (Max-Min method), which scales the performance ratings linearly based on the minimum and maximum values observed for each criterion (for benefit criteria):
z_ij = (x_ij − min_i x_ij) / (max_i x_ij − min_i x_ij);
• Linear scale transformation (Sum method), where performance ratings are linearly transformed based on the sum of the values of each criterion:
z_ij = x_ij / ∑_{i=1}^{m} x_ij;
where x_ij is the value of the j-th criterion for the i-th alternative (i = 1, ..., m; j = 1, ..., n).
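The three normalization formulas above can be sketched directly in NumPy (the example matrix is hypothetical, and all criteria are assumed to be benefit criteria):

```python
import numpy as np

def vector_norm(X):
    # z_ij = x_ij / sqrt(sum_i x_ij^2), computed column-wise (per criterion)
    return X / np.sqrt((X ** 2).sum(axis=0))

def max_min_norm(X):
    # z_ij = (x_ij - min_i x_ij) / (max_i x_ij - min_i x_ij)
    return (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))

def sum_norm(X):
    # z_ij = x_ij / sum_i x_ij
    return X / X.sum(axis=0)

# Hypothetical decision matrix: 4 alternatives (rows), 2 criteria (columns).
X = np.array([[3.0, 40.0],
              [1.0, 20.0],
              [4.0, 10.0],
              [2.0, 30.0]])
```

Each variant maps the criteria onto comparable scales: vector normalization yields unit-norm columns, the Max-Min method maps every criterion onto [0, 1], and the Sum method makes every column sum to one.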
Step 5. Building the weighted normalized matrix V = [v_ij]_{m×n}, where v_ij = w_j z_ij for i = 1, ..., m and j = 1, ..., n.
Step 6. Calculating the distance of the i-th alternative A_i from the ideal I by using the Euclidean or Mahalanobis distance measure:

• Euclidean distance measure (dE_i) [10]:
dE_i = sqrt(∑_{j=1}^{n} (v_ij − v_j^+)^2),
where v_ij and v_j^+ are the weighted normalized values of x_ij and x_j^+, respectively.

• Mahalanobis distance measure (dM_i) [29,31]:
dM_i = sqrt((x_i − x^+) W C^{-1} W (x_i − x^+)^T),
where C is the variance-covariance matrix of the data matrix D and W = diag(√w_1, ..., √w_n) is the diagonal matrix built from the weights w_1, w_2, ..., w_n assigned to the criteria. In practical terms, the choice of distance measure depends on the data's characteristics and the specifics of the multi-criteria analysis. While the Euclidean distance presupposes independence among variables, the Mahalanobis distance considers the covariance structure, making it more appropriate for datasets with correlated or asymmetrically distributed data. The Mahalanobis distance between an alternative and the ideal solution is based on the normalized data and the estimated covariance matrix, which represents the relationships and dependencies between criteria.
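A minimal sketch of Step 6, assuming a normalized decision matrix Z of benefit criteria, equal weights, and an ideal taken as the columnwise maximum (all data and names are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)
Z = rng.uniform(0.0, 1.0, size=(10, 4))  # normalized matrix: 10 alternatives, 4 criteria
w = np.full(4, 0.25)                     # equal weights, as in the examples
ideal = Z.max(axis=0)                    # best normalized value per criterion

# Euclidean distance on the weighted normalized values.
V = Z * w
dE = np.sqrt(((V - ideal * w) ** 2).sum(axis=1))

# Mahalanobis distance with W = diag(sqrt(w)) and the covariance matrix of Z.
C_inv = np.linalg.inv(np.cov(Z, rowvar=False))
W = np.diag(np.sqrt(w))
M = W @ C_inv @ W                        # combined weighting/covariance matrix
diff = Z - ideal
# Row-wise quadratic form: dM_i = sqrt(diff_i M diff_i^T).
dM = np.sqrt(np.einsum("ij,jk,ik->i", diff, M, diff))
```

Both vectors contain one distance per alternative; only the Mahalanobis variant reacts to the correlation structure of Z.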
Step 7. Calculating Hellwig's measure H_i or Hellwig's measure based on the Mahalanobis distance, HM_i, for the i-th alternative using the following formulas:

• Classical approach (H measure based on Euclidean distance):
H_i = 1 − dE_i / d_0, where d_0 = d̄ + 2 S_d, with d̄ and S_d denoting the mean and standard deviation of the distances dE_1, ..., dE_m;
• Extended approach (HM measure based on Mahalanobis distance):
HM_i = 1 − dM_i / d_0, where d_0 is computed analogously from the distances dM_1, ..., dM_m.
Step 8. Ranking the alternatives according to descending H_i or HM_i values.
A higher value of Hellwig's measure corresponds to a higher ranking position for the respective alternative.
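The aggregation in Step 7 and the ranking in Step 8 can be sketched as follows (assuming the classical reference value d_0 = mean + 2 × standard deviation of the distances; the distance values below are illustrative):

```python
import numpy as np

def hellwig_measure(d):
    """Convert distances-to-ideal into Hellwig's synthetic measure.

    Values close to 1 indicate alternatives close to the ideal solution;
    the measure can occasionally be negative when a distance exceeds d0.
    """
    d = np.asarray(d, dtype=float)
    d0 = d.mean() + 2 * d.std()  # classical reference value
    return 1 - d / d0

# Hypothetical distances (Euclidean or Mahalanobis) for five alternatives.
d = np.array([0.10, 0.25, 0.40, 0.15, 0.30])
h = hellwig_measure(d)
ranking = np.argsort(-h)  # Step 8: indices ordered by descending measure
```

The alternative with the smallest distance to the ideal always receives the largest measure, so `ranking[0]` is its index.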
Wang and Wang [31] showed the following property.
Property 1 ([31]). A non-singular linear transformation of the data does not affect the Mahalanobis distance measure.
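Property 1 can be verified numerically: if the covariance matrix is re-estimated from the transformed data, the Mahalanobis distance is unchanged under any non-singular linear transformation (a sketch with hypothetical data and an arbitrarily chosen invertible matrix A):

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(50, 3))                       # hypothetical data, 3 criteria

def mahalanobis(u, v, C_inv):
    d = u - v
    return np.sqrt(d @ C_inv @ d)

A = np.array([[2.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 3.0]])                    # non-singular: det(A) = 7

Y = X @ A.T                                        # linearly transformed data

C_inv_X = np.linalg.inv(np.cov(X, rowvar=False))
C_inv_Y = np.linalg.inv(np.cov(Y, rowvar=False))   # covariance re-estimated

d_before = mahalanobis(X[0], X[1], C_inv_X)
d_after = mahalanobis(Y[0], Y[1], C_inv_Y)
# d_before and d_after agree up to floating-point error, because
# cov(Y) = A cov(X) A^T cancels the transformation inside the quadratic form.
```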

Numerical Examples
This section compares the procedures and results obtained from the different variants of Hellwig's method: the Hellwig method with the Euclidean distance based on vector normalization (H1), max-min normalization (H2), and sum normalization (H3), and the Hellwig method with the Mahalanobis distance (HM). Let us note that, as a consequence of Property 1, the results of the HM method do not depend on the normalization formulas N1, N2, and N3. This gives us four variants of Hellwig's method. The results of the variants of Hellwig's method were compared (a) based on the Euclidean distance for different normalization formulas and (b) by contrasting the Euclidean distance with the Mahalanobis distance measure.
The problem under consideration involves assessing ten alternatives with four benefit criteria. We assumed equal weights in the analyses in order to concentrate only on the distance measure and normalization formula incorporated in the algorithm. The examples differ in the data and the correlations between criteria. To validate the HM method and examine the relationships between the criteria, we utilize the Pearson correlation coefficient. Additionally, the correlation between the results obtained from the different variants of Hellwig's method is analyzed using both the Spearman and Pearson coefficients. The interpretation of the absolute value of the Pearson or Spearman coefficient is as follows: [0, 0.1) negligible; [0.1, 0.4) weak; [0.4, 0.7) moderate; [0.7, 0.9) strong; [0.9, 1] very strong.
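The comparison machinery used throughout the examples can be sketched in a few lines of NumPy (the helper names are hypothetical; the rank-based Spearman computation assumes no tied values, which holds for the measure values reported in the tables):

```python
import numpy as np

def pearson(x, y):
    return np.corrcoef(x, y)[0, 1]

def ranks(x):
    # 0-based rank of each value (no tie handling)
    return np.argsort(np.argsort(x)).astype(float)

def spearman(x, y):
    # Spearman coefficient = Pearson correlation of the rank vectors
    return pearson(ranks(x), ranks(y))

def interpret(r):
    """Verbal scale used in the paper for |r|."""
    r = abs(r)
    if r < 0.1:
        return "negligible"
    if r < 0.4:
        return "weak"
    if r < 0.7:
        return "moderate"
    if r < 0.9:
        return "strong"
    return "very strong"

# Hypothetical measure values for two Hellwig variants.
h1 = np.array([0.61, 0.47, 0.82, 0.35, 0.70])
h2 = np.array([0.58, 0.50, 0.79, 0.30, 0.74])
```

For example, `interpret(spearman(h1, h2))` summarizes how strongly two variants agree on the ranking.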

Example 1. (negligible or weak correlation between criteria).
Table 1 displays the data and correlation matrix among the criteria in Example 1. In this case, a negligible or weak correlation is evident between the criteria. The highest Pearson correlation exists between criteria C3 and C4 (0.116), followed by C3 and C2 (0.103). All other Pearson coefficients are below 0.100. The ideal, based on the max and min values (see Formula (3)), has the form: [..., 30, 20, 30].
The criteria values are normalized using Formulas (6)-(8), respectively. Following this, the Euclidean or Mahalanobis distances between each alternative and the ideal object are calculated using Formula (11) or (12), respectively. Finally, the synthetic measure is derived using Formula (13) or (14). The outcomes of the various Hellwig's measures are presented in Table 2.
From Table 2, we can observe that the rankings differ for Hellwig's measures based on the Euclidean distance and the various normalization formulas, but these differences are not pronounced. The Spearman coefficients between Hellwig's measures based on the Euclidean distance are as follows: S(H1, H2) = 0.952, S(H1, H3) = 0.939, and S(H2, H3) = 0.915. The Pearson coefficient also confirms a very strong correlation: P(H1, H2) = 0.976, P(H1, H3) = 0.994, and P(H2, H3) = 0.956. In all variants of Hellwig's method, the rankings converge for alternatives A3 and A7. For the remaining alternatives, the disparity ranges only from 1 to 2 positions. Additionally, the Spearman coefficients between the measure HM and the other measures are very high: S(H1, HM) = 0.952 and S(H2, HM) = 0.976, or high: S(H3, HM) = 0.891. Similarly, a very strong correlation was observed when comparing those measures using the Pearson coefficient: P(H1, HM) = 0.956, P(H2, HM) = 0.991, and P(H3, HM) = 0.923. The highest concordance for HM is achieved with H2. The graphical representation of the results of Hellwig's measures is illustrated in Figure 1. We can observe that, in this case, the disparities between all variants of Hellwig's method are marginal.


Example 2. (from weak to very strong correlation between criteria).
Table 3 presents the data and correlation matrix for the criteria in Example 2. In this instance, the correlation coefficients range from 0.136 to 0.992. The strongest Pearson correlation is observed between criteria C3 and C2 (0.992), followed by C3 and C1 (0.881) and C1 and C2 (0.708). Meanwhile, the lowest Pearson coefficient is found between C1 and C4 (0.136). The outcomes of the various Hellwig's measures obtained in Example 2 are presented in Table 4. Table 4 indicates that the rankings obtained through the Hellwig procedure with the Euclidean distance measure are identical, resulting in S(H1, H2) = S(H1, H3) = S(H2, H3) = 1.000. We also observed a very strong correlation between the Hi values as measured by the Pearson coefficient: P(H1, H2) = 0.993, P(H1, H3) = 0.999, and P(H2, H3) = 0.988.
Distinctions arise when comparing Hellwig's methods based on the Euclidean distance with those based on the Mahalanobis distance. Nevertheless, in all cases, the rankings converge for alternatives A3, A7, and A8. Discrepancies for the remaining alternatives range from 1 to 6 positions. The Spearman coefficients between the HM measure and the other measures reveal moderate relationships: S(H1, HM) = S(H2, HM) = S(H3, HM) = 0.552. A higher Pearson correlation (moderate or strong) was observed when comparing these measures: P(H1, HM) = 0.709, P(H2, HM) = 0.655, and P(H3, HM) = 0.725. The highest concordance for HM is achieved with H3 (0.725). The graphical representation of Hellwig's measures is depicted in Figure 2. Note that Hellwig's approach, neglecting the interaction between criteria, results in an overestimation of the values for the top-scoring alternatives A3, A7, A8, and A10 when compared with the HM measure. Conversely, it exhibits an opposite deviation for the low-scoring alternatives A1 and A5.
Example 3. (from negligible to very strong correlation between criteria).
Table 5 presents the data and correlation matrix for the criteria in Example 3, and the outcomes of the various Hellwig's measures are presented in Table 6. More distinctions arise when comparing Hellwig's methods based on the Euclidean distance with those based on the Mahalanobis distance. In all cases, discrepancies for the alternatives range from 1 to 6 positions. The Spearman coefficients between the HM measure and the other measures reveal a weak (S(H2, HM) = 0.382) or moderate (S(H1, HM) = 0.442 and S(H3, HM) = 0.442) correlation. A strong Pearson correlation was observed when comparing these measures: P(H1, HM) = 0.755, P(H2, HM) = 0.747, and P(H3, HM) = 0.751. The highest concordance for HM is achieved with H1 (0.755). The graphical representation of Hellwig's measures is depicted in Figure 3. Please note that Hellwig's methods, when utilizing the Euclidean distance measure, lead to an overestimation of the values for the high-scoring alternatives A3, A8, and A10 when compared with the HM measure. Conversely, they exhibit an opposite deviation for the lower-scoring alternatives A1 and A4.

Example 4. (strong or very strong correlation between criteria).
Table 7 presents both the data and the correlation matrix for the criteria outlined in Example 4. It is noteworthy that we observe high Pearson correlation coefficients, ranging from 0.723 (between C3 and C1, and between C4 and C1) to 0.910 (between C4 and C2).
The outcomes of various Hellwig's measures obtained in Example 4 are presented in Table 8.
For alternatives A5, A7, A8, and A10, the rankings converge in all cases. Disparities for the remaining alternatives range only from 1 to 2 positions. Moreover, the Spearman coefficients between the HM measure and the classical Hellwig measures are very strong: S(H1, HM) = 0.921, S(H2, HM) = 0.921, and S(H3, HM) = 0.947. Similarly, a strong Pearson correlation is observed when comparing these measures: P(H1, HM) = 0.827, P(H2, HM) = 0.829, and P(H3, HM) = 0.827. The highest concordance for HM is achieved with H2 for the Pearson coefficient and with H3 for the Spearman coefficient. The graphical representation of the results for Hellwig's measures is depicted in Figure 4. It is worth noting that the alternative in the first position according to the HM measure has a value of 1. Additionally, Hellwig's approach, neglecting the interaction between criteria, results in an overestimation of the values for the high-scoring alternatives A2, A3, A5, and A7 when compared with the HM measure. Conversely, the low-scoring alternative A1 is underestimated relative to the HM measure.

Example 5. (moderate and strong correlation between criteria).
Table 9 presents both the data and the correlation matrix for the criteria outlined in Example 5. The Pearson coefficient varies from 0.656 (between C4 and C1) to 0.747 (between C4 and C3).
The outcomes of various Hellwig's measures obtained in Example 5 are presented in Table 10.
For alternatives A1, A5, and A8, the rankings converge in all cases. Disparities for the remaining alternatives range only from 1 to 3 positions. Moreover, the Spearman coefficients between the HM measure and the classical Hellwig measures are strong: S(H1, HM) = S(H2, HM) = S(H3, HM) = 0.842. Similarly, a strong Pearson correlation is observed when comparing these measures: P(H1, HM) = 0.821, P(H2, HM) = 0.812, and P(H3, HM) = 0.819. The highest concordance for HM is achieved with H1. The graphical representation of the results for Hellwig's measures is depicted in Figure 5.
It is worth noting that Hellwig's methods based on the Euclidean distance, neglecting the interaction between criteria, result in an overestimation of the values for the high-scoring alternatives A2, A3, A5, A7, and A8. Conversely, the low-scoring alternatives A1 and A9 are underestimated when compared to the HM measure.
Table 11 compares the results obtained in the five examples. The results can be summarized as follows:

Relationships between Hi and HM Measure
Firstly, it should be noted that the normalization formula has an impact on the final ranking when the Euclidean distance is implemented, but this impact is only marginal. However, this does not occur with the Mahalanobis distance, as the results remain the same regardless of the type of normalization employed.
Secondly, it can be observed that the rankings obtained using the classical Hellwig methods based on the Euclidean distance and the Hellwig method based on the Mahalanobis distance differ when there is some dependence within the data. These results are consistent with other findings in the literature [31]. Even in the case of moderate or small relationships between criteria, the rankings obtained by the classical Hellwig methods and those of HM do not coincide. It is also difficult to say which of the normalization formulas, in the case of the Euclidean-based Hellwig method, gives results more consistent with the Mahalanobis distance-based Hellwig method with respect to the Pearson coefficient.
Thirdly, we can observe that Hellwig's method, neglecting the interaction between criteria, usually results in an overestimation of the values for the high-scoring alternatives. Conversely, the low-scoring alternatives are underestimated when compared with their values in the Mahalanobis distance-based Hellwig method. It should be noted that these results are consistent with findings in the literature in which TOPSIS methods based on Euclidean and Mahalanobis distances were compared [31].

Conclusions
In this paper, we proposed the Mahalanobis distance-based Hellwig method, which incorporates dependencies among criteria. We also investigated the impact of the distance measure (Euclidean and Mahalanobis) and normalization (vector normalization, Min-Max method, Sum method) in several variants of Hellwig's procedure. We analyzed five illustrative examples that differ in the relationships between criteria.
Summing up, the contributions of the article include the following:
1. Developing a modification of the Hellwig measure by utilizing the Mahalanobis distance, which considers correlations among criteria and enables us to effectively account for the asymmetrical relationships between them.
2. Investigating the impact of the distance measure and the normalization variants of the Hellwig procedure on the evaluation and rank ordering of alternatives.
3. Analyzing the impact of the correlation between criteria on the consistency of results obtained using different variants of Hellwig's method.
The Mahalanobis distance proves valuable when dealing with asymmetric datasets or datasets featuring correlated variables. Asymmetric datasets often exhibit varying degrees of correlation between variables, and the Mahalanobis distance provides a means to adjust for these correlations. In contrast to the Euclidean distance, which assumes independence among variables, the Mahalanobis distance considers variable relationships by incorporating the covariance matrix.
Consequently, this study shows that the multi-criteria HM method, relying on the Mahalanobis distance, proves effective in addressing correlations between criteria, a critical aspect in the context of asymmetric data. This methodology enables a more accurate reflection of the true data structure, mitigating potential errors associated with assuming criteria independence. At the same time, the Euclidean distance may be less suitable for datasets with asymmetric dependencies between criteria. It neglects information regarding correlation and data structure, potentially resulting in inaccuracies when criteria exhibit strong correlation or asymmetric dependencies.
This work acknowledges certain limitations that will serve as subjects for further research. In the paper, the focus was limited to a few examples that served as illustrations of the challenges and consequences associated with the choice of a variant of Hellwig's method. Further research could delve into considering different normalization techniques to better understand and potentially mitigate their impact on rankings, especially when utilizing the Euclidean distance. Future studies may aim to explore and quantify the extent of criteria interdependence, seeking to establish patterns or criteria characteristics that contribute to the divergence in rankings between Hellwig methods based on the Euclidean distance and those based on the Mahalanobis distance. Future investigations could focus on the interaction effects between criteria, examining the nuances that lead to the overestimation of high-scoring alternatives and the underestimation of low-scoring ones, particularly in the context of variants of Hellwig's method. It would be beneficial to extend the study to different datasets to assess the generalizability of the observed patterns and to identify any dataset-specific factors that may influence the results. Consideration of comparisons with alternative methods beyond the TOPSIS approach could provide a broader perspective on the performance of Hellwig's methods and their variations. By addressing these aspects in future research, a more comprehensive understanding of the observed phenomena and potential strategies for improvement or mitigation can be achieved.

Figure 1. Graphical representation of the ranking of alternatives obtained by different variants of Hellwig's method in Example 1.

Figure 2. Graphical representation of the ranking of alternatives obtained by different variants of Hellwig's method in Example 2.


Figure 3. Graphical representation of the ranking of alternatives obtained by different variants of Hellwig's method in Example 3.

Figure 4. Graphical representation of the ranking of alternatives obtained by different variants of Hellwig's method in Example 4.


Figure 5. Graphical representation of the ranking of alternatives obtained by different variants of Hellwig's method in Example 5.


Funding: The contribution was supported by the grant WZ/WI-IIT/2/22 from the Bialystok University of Technology and funded by the Ministry of Education and Science.

Table 1. Data and correlation matrix for Example 1.

Table 2. The distance values, measure values, and rank-ordering of alternatives obtained by different variants of Hellwig's method (Example 1).

Table 3. Data and correlation matrix for Example 2.

Table 4. The distance values, measure values, and rank-ordering of alternatives obtained by different variants of Hellwig's measure (Example 2).

Table 5. Data and correlation matrix for Example 3.

Table 6. The distance values, measure values, and rank-ordering of alternatives obtained by different variants of Hellwig's measure (Example 3).

Table 7. Data and correlation matrix for Example 4.

Table 8. The distance values, measure values, and rank-ordering of alternatives obtained by different variants of Hellwig's measure (Example 4).

Table 9. Data and correlation matrix for Example 5.


Table 10. The distance values, measure values, and rank-ordering of alternatives obtained by different variants of Hellwig's measure (Example 5).

Table 11. Comparison of the results obtained in the examples.
