Article

On the Use of Data Envelopment Analysis for Multi-Criteria Decision Analysis

CSIRO Environment, Queensland Biosciences Precinct, 306 Carmody Road, St. Lucia 4067, Australia
Algorithms 2024, 17(3), 89; https://doi.org/10.3390/a17030089
Submission received: 1 December 2023 / Revised: 16 February 2024 / Accepted: 17 February 2024 / Published: 20 February 2024

Abstract

Data envelopment analysis (DEA) has been proposed as a means of assessing alternative management options when there are multiple criteria with multiple indicators each. While the method has been widely applied, the implications of how the method is applied for the resultant ranking of management alternatives have not previously been considered. We consider the impact on option ranking of ignoring an implicit hierarchical structure when there are different numbers of indicators associated with potential higher-order objectives. We also consider the implications of the use of radial or slacks-based approaches on option ranking with and without a hierarchical structure. We use an artificial data set as well as data from a previous study to assess the implications of the approach adopted, with the aim of providing guidance for future applications of DEA for multi-criteria decision making. We find substantial benefits in applying a hierarchical approach in the evaluation of the management alternatives. We also find that slacks-based approaches are better able to differentiate between management alternatives given multiple objectives and indicators.

1. Introduction

The impetus to achieve triple bottom line outcomes by businesses, ecologically sustainable development goals by governments, and social, economic, and ecological sustainability in natural resource use has resulted in a wide range of management options being developed that produce different combinations of outcomes across the broad set of criteria being used for their assessment. In many cases, these management alternatives result in trade-offs in outcomes against different objectives. For example, achieving environmental goals may result in a loss of employment opportunities at a national level, while at the business level, social considerations may result in loss of profits. Assessing trade-offs in these outcomes, however, is complex, particularly when different management options produce substantially different outcomes against each criterion. This trade-off is made more difficult when the outcomes of the different management options are measured in different and non-commensurable units (e.g., pollution in tons, profits in monetary values, employment impacts in terms of number of jobs lost, etc.).
Several approaches have been developed to rank different management options given their impact on the different criteria of interest, usually through the estimate of a composite indicator that allows the overall impacts to be compared. These approaches fall under the general area of multi-criteria decision analysis (MCDA), which aggregates outcomes over multiple objectives or criteria to provide a ranking of management alternatives given these potential trade-offs. Key to the application of MCDA is the use of preference weights associated with each objective, allowing a weighted composite value to be estimated by aggregation across all objectives. The use of MCDA to support management decisions has been applied in a wide range of areas, such as healthcare [1,2], agriculture [3,4], transportation [5,6], and resource and environmental management [7,8,9,10], to name a few.
Data envelopment analysis (DEA) has also been broadly employed in an MCDA context. DEA is commonly used to estimate productivity measures such as technical efficiency, technical change, and capacity utilisation of individual firms or businesses. These firms are generally considered individual decision making units (DMUs). These measures reflect the extent to which a set of outputs produced by a DMU are maximized given the set of inputs employed (an output-oriented approach), or the degree to which their input use is minimized to achieve a given level of output (an input-oriented approach). From a decision support perspective, outcomes under different management options (even for a given firm) can also be considered effectively as the unit of production, where the outputs are the expected set of outcomes associated with each criterion or objective under each management option. Consistent with DEA terminology, we will consider these equivalent to DMUs, even though they represent the set of alternatives rather than decision makers per se.
The similarities between DEA and MCDA have long been established [11,12,13,14], and the use of DEA as an MCDA approach is gaining increasing interest in the decision making literature, often combined with criterion preference weights derived using other approaches, such as the analytic hierarchy process (AHP) [15], or used to estimate the criteria weights based on AHP-type pairwise comparisons [16]. de Oliveira et al. [15] identified over 220 papers published between 1994 and 2022 that combined DEA and MCDA approaches or where DEA was used in an MCDA context. Applications of DEA as an MCDA tool have been undertaken in a wide variety of industries, such as agriculture [12], energy [17], fisheries [18], and railways [19]. DEA has also been applied to compare the efforts of different countries in protecting biodiversity [20] and improving water quality [21], where outputs are expressed in terms of biodiversity or water quality indices, respectively.
The DEA efficiency measure is sometimes referred to as the ‘benefit of the doubt’ indicator (BODI) [22,23] in an MCDA context, as it implicitly recognizes that DMUs may have different objective preferences and that their choices of output mix may reflect these preferences. In the case of DMUs actually representing different management options (rather than decision makers per se), these options may have been designed with achieving particular outcomes in mind, and hence their performance across multiple criteria may reflect this. A key advantage of the benefit of the doubt indicator is that objective preference weights do not need to be explicitly provided, as they are assumed implicit in the production mix of the DMU. However, as noted above, decision maker preferences can be directly included in the DEA model.
A feature of these studies is that all criterion indicators and their associated outcomes are compared at the same time. Where these indicators could be considered sub-indicators of a broader objective set (e.g., ecological, economic, or social objectives), different numbers of indicators in each implicitly apply different importance weightings to their corresponding broader objective, as they have a greater overall influence on the BODI/efficiency score. For example, an objective represented by five indicators would have (implicitly) greater influence on the overall efficiency score than an objective with two or three indicators. While this may be intentional in some cases, in others it may reflect availability of indicators rather than an implicit preference for some outcomes. For example, environmental indicators may be more readily accessible than social indicators (or vice versa). Without some compensating analysis, the environmental outcomes would dominate the determination of the ‘best’ management option in this case.
Two key approaches to DEA are the radial [24] and additive (slacks-based) [25] approaches. Both approaches have been widely applied in efficiency analysis, and each has particular theoretical and practical advantages and disadvantages. Radial models are conceptually simple, assuming that outputs (or inputs) change proportionally given an output (input) orientation, subject to any underlying returns to scale conditions. Slacks-based measures remove the proportionality assumption and assume that outputs (or inputs) can change individually based on what combinations are observed in the data [26]. The previously cited applications of DEA used either the radial approach [12,17,18] or the slacks-based approach [19], without consideration of the impact of the approach used on the outcomes of the analysis.
In this study, we consider the implications of both approaches when using DEA for MCDA. We also consider the impact of imposing a hierarchical structure on the resultant ranking of options. Hierarchical structures are commonly imposed in MCDA [27], with one method—the analytic hierarchy process (AHP) [28]—designed specifically to provide decision support through a hierarchical structure. AHP is generally used to estimate weights associated with each objective, with the weights estimated for each level of the hierarchy (i.e., higher-level objectives such as maximizing overall ‘social’ or ‘economic’ outcomes; lower-level (but more specific) objectives such as community benefits, equity, etc.). We consider the impact on option ranking of ignoring the implicit hierarchical structure when there are different numbers of indicators associated with each higher-order objective.
The overall purpose of the study is to provide guidance to future applications of DEA for multi-criteria decision making as to the implications of the approach adopted. Specifically, we consider how the type of DEA model used and how it is applied affects the estimated relative efficiency score related to each management alternative being considered by decision makers. We do this through the use of, first, a hypothetical data set to illustrate the differences in outcomes more generally, and second, using an example data set from a previous MCDA application.

2. Materials and Methods

2.1. Data Envelopment Analysis

DEA is a non-parametric (linear programming) frontier-based method that is used to assess the relative efficiency of a set of decision making units (DMUs). DEA is a well-established method for productivity analysis [29,30,31] that has been broadly applied to a wide range of industries.
The efficiency of a DMU is determined by its level of outputs (e.g., level of production) and inputs (e.g., capital and labor) relative to other DMUs. The efficient frontier is defined by the data, given by the set of DMUs that produces the highest level of (different combinations of) outputs for a given set of inputs. A particular advantage of DEA is the ability to include multiple outputs and inputs in the analysis. A further advantage of DEA is that the underlying functional form of the production function (i.e., relationship between outputs and inputs) does not need to be specified, and essentially, a separate ‘production function’ is identified for each DMU.
Outputs are represented by the extent to which objective or criterion outcomes are achieved under each management option. In this context, each management option is effectively treated as DMU-equivalent in a production context. In practice, these outputs are generally represented by indicators associated with the objective rather than the objective per se. There do not need to be any inputs in the traditional productivity analysis sense if each option involves the same level of resources to implement, although if different management options involve different resource inputs, these can be captured in the analysis. For example, if the different management options involve different costs, these can be included as an input, and the resultant measure is then similar to what may be produced in a cost effectiveness analysis. Otherwise, each DMU is assigned an equal ‘input’ with a value of 1 (one).
We consider two approaches to the estimation of the management option ‘efficiency’: an output-oriented radial DEA model [24] and an output-oriented additive model (often referred to as a slacks-based model) [25,32,33]. A simplified comparison of the two approaches is given in Figure 1, where the radial model results in a radial expansion to the frontier, while the additive model estimates the horizontal or vertical distance to the frontier. The radially (or additively) expanded inefficient DMU at the frontier may fall between two (or more in the case of multiple outputs) other DMUs (as seen in Figure 1), with its position determined by the weighted average of the outputs of the efficient DMUs.
The radial DEA model has been applied in a wide range of contexts, including to assess different management options given multiple objectives [18]. For the purposes of using DEA as an MCDA tool, we employ an output-oriented model as we seek to identify the management options that maximize the outcomes with respect to the multiple objectives or criteria. We can define the output-oriented radial model as:
$$
\begin{aligned}
\max \quad & \Phi_1 \\
\text{subject to} \quad & \Phi_1 y_{1,m} \le \sum_j z_j y_{j,m} \quad \forall m \in M \\
& \sum_j z_j x_{j,n} \le x_{1,n} \quad \forall n \in N
\end{aligned}
$$
where $\Phi_1$ is a scalar showing by how much DMU 1 can proportionally increase its outputs, $y_{j,m}$ is the outcome against objective $m$ by DMU $j$, $x_{j,n}$ is the amount of input $n$ used by DMU $j$ (set at 1 for all DMUs), and $z_j$ are weighting factors. In a productivity context, we can add an additional constraint, $\sum_j z_j = 1$, to impose variable returns to scale in the model; otherwise, the model (implicitly) imposes constant returns to scale. With a single (constant) input, there is no difference between imposing variable returns or constant returns to scale [34]. With variable inputs (e.g., costs of each option), imposing variable returns to scale as well as constant returns to scale may (via the derived measure of scale efficiency) provide additional information to assist decision makers. However, assuming decision makers aim to maximize the outcomes given the available resources, constant returns to scale may be a more appropriate assumption.
From this, the technical efficiency (TE) associated with the management option can be given by:
$$
TE = y \left( \Phi_1 y \right)^{-1} = \Phi_1^{-1}
$$
The measure of TE ranges from zero to 1, with 1 being full efficiency (i.e., on the frontier). Values less than 1 indicate that the DMU is operating at less than its full potential given the set of inputs. In the context of our analysis, the most efficient management option is that which best achieves the overall set of objectives.
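To make the formulation above concrete, a minimal sketch in R (the language used for the analyses in this study) of the output-oriented radial model, solved directly as a linear program for one DMU, is given below. The lpSolve package, the radial_te() helper name, and the toy data are illustrative assumptions rather than the code used in the study.

```r
# Illustrative sketch only: output-oriented radial DEA for one DMU as an LP.
# Assumes the lpSolve package; radial_te() is a hypothetical helper name.
library(lpSolve)

radial_te <- function(Y, X, dmu) {
  # Y: J x M matrix of outputs (criterion outcomes); X: J x N matrix of inputs
  J <- nrow(Y); M <- ncol(Y); N <- ncol(X)
  obj <- c(1, rep(0, J))                      # variables (Phi, z_1..z_J); maximise Phi
  A_out <- cbind(Y[dmu, ], -t(Y))             # Phi*y_{0,m} - sum_j z_j*y_{j,m} <= 0
  A_in  <- cbind(rep(0, N), t(X))             # sum_j z_j*x_{j,n} <= x_{0,n}
  sol <- lp(direction = "max", objective.in = obj,
            const.mat = rbind(A_out, A_in),
            const.dir = rep("<=", M + N),
            const.rhs = c(rep(0, M), X[dmu, ]))
  1 / sol$objval                              # TE = 1 / Phi, as above
}

# Small made-up example: 5 options, 3 criterion outcomes, a constant unit input
set.seed(1)
Y_toy <- matrix(runif(15, 1, 10), nrow = 5)
X_toy <- matrix(1, nrow = 5, ncol = 1)
sapply(1:5, function(d) radial_te(Y_toy, X_toy, d))
```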
In this study, we also consider an additive (also known as slacks-based) form of the DEA model, following Tone [25], given by:
$$
\begin{aligned}
\min \quad & S^1 = \frac{1 - (1/n)\sum_{i=1}^{n} s_i^- / x_{i0}}{1 + (1/m)\sum_{j=1}^{m} s_j^+ / y_{j0}} \\
\text{subject to} \quad & y_{m0} = \sum_j z_j y_{j,m} - s_{1,m}^+ \quad \forall m \in M \\
& x_{n0} = \sum_j z_j x_{j,n} + s_{1,n}^- \quad \forall n \in N
\end{aligned}
$$
where the y and x variables are as defined previously, with $s_{1,m}^+$ and $s_{1,n}^-$ representing the output and input slack variables required to reach the frontier. As with the radial model, adding the constraint $\sum_j z_j = 1$ imposes variable returns to scale in the model, while excluding this constraint results in constant returns to scale being (implicitly) imposed.
As all DMUs have the same input level, $s_{1,n}^- = 0 \; \forall n$, and the objective function can be re-expressed as [35]:
$$
\min \; S^1 = \frac{1}{1 + (1/m)\sum_{j=1}^{m} s_j^+ / y_{j0}}
$$
In this case, technical efficiency is given by $TE_j^s = S_j^1$. A further feature of the measure is that $TE_j^s \le TE_j$. That is, the slacks-based measure of efficiency is always less than or equal to the radial measure.
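A corresponding sketch of the simplified slacks-based objective is given below. Because minimizing $S^1$ is equivalent to maximizing the average output slack ratio, the simplified problem can also be solved as a linear program; as before, the sbm_te() helper and the use of lpSolve are illustrative assumptions, with the constraint $\sum_j z_j = 1$ reflecting the zero input slacks noted above.

```r
# Illustrative sketch only: simplified slacks-based measure (constant unit input,
# so only output slacks remain), again solved with lpSolve; sbm_te() is hypothetical.
sbm_te <- function(Y, dmu) {
  J <- nrow(Y); M <- ncol(Y)
  # Variables (z_1..z_J, s_1^+..s_M^+); minimising S^1 is equivalent to
  # maximising the average output slack ratio (1/m) * sum_j s_j^+ / y_j0
  obj <- c(rep(0, J), 1 / (M * Y[dmu, ]))
  A_out <- cbind(t(Y), -diag(M))              # sum_j z_j*y_{j,m} - s_m^+ = y_{0,m}
  A_in  <- matrix(c(rep(1, J), rep(0, M)), nrow = 1)  # sum_j z_j = 1 (zero input slack)
  sol <- lp(direction = "max", objective.in = obj,
            const.mat = rbind(A_out, A_in),
            const.dir = rep("=", M + 1),
            const.rhs = c(Y[dmu, ], 1))
  1 / (1 + sol$objval)                        # TE^s = S^1, never above the radial TE
}
```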
The radial models were implemented using the Benchmarking [36] package in R [37], and the slacks-based model using deaR [38].

2.2. Hierarchical Structure of Decision Making

MCDA approaches originate from different theoretical traditions, but most involve some form of aggregation model to allow preference estimates to be combined and compared across criteria [39]. This is generally achieved through the construction of an objective or criterion hierarchy, with specific measurable objectives/criteria [27].
An example of such a hierarchy is given in Figure 2, in which three objectives have a different number of indicators (or measurable sub-objectives) associated with each. In this example, ignoring the hierarchical structure and comparing all indicators at the same time will implicitly give more weight to Objective 3 (as it has the most indicators) and least weight to Objective 2 (as it has the fewest associated indicators).
The hierarchical structure is taken into account in the analysis through first estimating the efficiency score for each objective separately (based on the indicator values that make up that objective), then estimating the overall score using the individual objective scores as the outputs. This two-stage process compensates for different objectives having different numbers of indicators when preferences relating to each indicator are unknown.
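As a sketch of this two-stage process (under the same illustrative assumptions as the code in Section 2.1, and reusing the hypothetical radial_te() helper), the calculation could be structured as follows; a grouping vector maps each indicator to its higher-level objective.

```r
# Illustrative sketch of the two-stage hierarchical scoring; hierarchical_te()
# is a hypothetical helper reusing radial_te() from the sketch in Section 2.1
# (the slacks-based helper could be substituted in the same way).
hierarchical_te <- function(Y, groups) {
  # groups: integer vector assigning each indicator (column of Y) to an objective
  J <- nrow(Y)
  X <- matrix(1, nrow = J, ncol = 1)           # constant unit 'input' for every DMU
  # Stage 1: one efficiency score per higher-level objective
  stage1 <- sapply(unique(groups), function(g) {
    Yg <- Y[, groups == g, drop = FALSE]
    sapply(seq_len(J), function(d) radial_te(Yg, X, d))
  })
  # Stage 2: the objective-level scores become the outputs of a second DEA
  sapply(seq_len(J), function(d) radial_te(stage1, X, d))
}
```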

2.3. Data and Analyses

The analysis utilized two different data sets to test the implications of model choice given the structure of the data used and the importance of hierarchical structures. First, a hypothetical data set was generated to ensure a wide spread of criterion outcomes. The ‘efficiency’ of each set of outcomes was compared given different hierarchical structures and model type (i.e., radial or slacks-based) to assess how these impact the overall scores.
Second, a waste management data set from a previous MCDA application was re-examined using the two different DEA methods, with the DEA outcomes compared with the original MCDA analysis.
The two data sets differ in terms of the relationship between the variables. The variables, representing outcome indicators, within the hypothetical data set are relatively uncorrelated. In contrast, the outcome variables in the waste management data set were more correlated.
A hierarchical structure was also applied to both data sets to test the effects of introducing such a structure on the overall efficiency score of each management alternative. In the case of the hypothetical data set, this structure was varied to test the effects of different numbers of indicators/sub-objectives/criteria on the estimated efficiency score and overall estimate of performance. For the waste management data set, the indicators naturally grouped into higher-order environmental, technological, and economic groups.

2.3.1. Hypothetical Data and Analysis

One hundred DMUs were randomly generated with each producing nine outputs (i.e., indicators) with a range of values from 1 to 10. The R code used to generate and analyze the data is presented in the Supporting Information. The range of values of each variable and the correlation between these values is illustrated in Figure 3. Correlation between the variables was low, generally less than +/−0.2.
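An illustrative approximation of this data-generation step is shown below; the seed and object names are placeholders, and the exact code used is provided in the Supplementary Materials.

```r
# Illustrative approximation of the hypothetical data set (the actual code and
# seed are in the Supplementary Materials): 100 DMUs, nine indicators on 1-10
set.seed(42)                                   # placeholder seed
n_dmu <- 100
Y <- matrix(runif(n_dmu * 9, min = 1, max = 10), nrow = n_dmu,
            dimnames = list(NULL, paste0("V", 1:9)))
X <- matrix(1, nrow = n_dmu, ncol = 1)         # constant unit 'input' for every DMU
range(cor(Y)[upper.tri(cor(Y))])               # pairwise correlations, typically small
```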
The data were analyzed first by comparing all indicators together using both DEA methods (i.e., radial and slacks-based), and the set of efficient DMUs was identified, representing the management options that produce the greatest overall benefits. A set of different hierarchical structures was then developed. Each structure was assumed to include three different broader ‘objectives’, each with varying numbers of associated indicators. An outline of the experimental design is given in Table 1, where the number in brackets represents the number of indicators associated with each of the three objectives. The same experimental design was applied to both DEA model types.
For the hierarchical analysis, a two-step process was implemented. First, the indicators within each broader (higher-level) objective were compared, giving an efficiency score for each objective. For example, in the ‘equal’ scenario (Table 1), the efficiency score for the first higher-level objective group was derived using only variables one to three; the second using variables four to six; and the third using variables seven to nine. Next, these higher-level objective group efficiency scores were used as the outcome associated with each broader objective, and the three objectives were compared in a second round of analysis. That is, the efficiency score of each higher-level objective group was used as an output, which was then compared with the efficiency scores of the other higher-order objective groups. The number of indicators associated with each of the broader objectives was changed to see the effects of this on the resultant efficiency score of each DMU.
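Continuing the illustrative sketches above, the scenarios could be evaluated along the following lines, where each grouping vector reproduces the indicator allocations in Table 1 (hierarchical_te() is the hypothetical helper sketched in Section 2.2).

```r
# Illustrative evaluation of the Table 1 scenarios with the hypothetical helpers
all_scores <- sapply(seq_len(n_dmu), function(d) radial_te(Y, X, d))  # 'All indicators'
scenarios <- list(equal        = rep(1:3, times = c(3, 3, 3)),   # V1-V3; V4-V6; V7-V9
                  unequal      = rep(1:3, times = c(2, 3, 4)),   # V1-V2; V3-V5; V6-V9
                  very_unequal = rep(1:3, times = c(2, 2, 5)))   # V1-V2; V3-V4; V5-V9
hier_scores <- sapply(scenarios, function(g) hierarchical_te(Y, g))
```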

2.3.2. Waste Management Data

The analysis was also applied using data from a previous MCDA analysis involving a range of different options for development of a waste management system. For each management alternative, a measure of the cost of its implementation and the outcomes in terms of environmental impacts and benefits were estimated. The data were originally produced by Hokkanen and Salminen [40], who applied ELECTRE III [41] to assess the different options, and more recently used by Sarkis [42], who compared DEA to a range of other MCDA approaches.
The data include several undesirable outputs. There are a number of ways of dealing with undesirable outputs, including ignoring them, including them as inputs, or transforming the variables [43]. Sarkis [42] included these undesirable outputs as inputs, along with the costs of their implementation. In our study, we have applied the approach developed by Seiford and Zhu [44], where the undesirable outputs (y) are transformed as y′ = −y + D, where D is a constant value that makes the resultant transformed output (y’) positive for all y. In this case, the maximum value of each of the undesirable variables plus one was added to the negative values. The ‘plus one’ was added such that the minimum value of the transformed variable was 1 (rather than zero). The higher the value of the transformed variable, the better the option compares to the worst performing option. This approach was adopted as the resultant data can be readily applied in both a radial and slacks-based DEA.
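A minimal sketch of this transformation, assuming a generic numeric vector of undesirable output values, is:

```r
# Illustrative sketch: Seiford-Zhu style transformation of an undesirable output,
# with D = max(y) + 1 so that the minimum transformed value equals 1
transform_undesirable <- function(y) -y + (max(y) + 1)
transform_undesirable(c(5, 12, 8))             # hypothetical emissions -> 8, 1, 5
```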
The transformed data are presented in Table 2 for the 22 different management alternatives. A hierarchy containing three higher-level criteria (Environmental, Technical, and Economic) was assumed and the outputs allocated accordingly. All Environmental outputs were undesirable, with the values in Table 2 being transformed as indicated above. The cost of each management alternative was used as the input.
This is a relatively small data set for use in DEA. To ensure appropriate degrees of freedom, the number of comparisons should generally be greater than max[3(m + n), m × n], where m and n are the number of outputs and inputs, respectively [45], suggesting that 24 or more alternatives would be needed to produce reliable results. With only 22 alternatives available, the analysis may therefore incorrectly inflate the share of fully efficient options (i.e., TE = 1).
Unlike the hypothetical data, the waste treatment data are more highly correlated (Table 3), both positively and negatively. This indicates a greater degree of proportionality in outputs as well as explicit trade-offs (e.g., greenhouse gases compared with acidic gases). This may affect the efficiency scores derived using the different methods.
As with the hypothetical data, the alternatives were examined using both the radial and slacks-based models, and with and without a hierarchical structure imposed.

3. Results

3.1. Using Hypothetical Data

Using the radial model, 66 DMUs were found to be fully efficient when all indicators were compared together (Table 4). The number of efficient DMUs declined when the hierarchical structure was imposed, with all three hierarchical treatments producing similar numbers of efficient DMUs even though the number of indicators compared within each of the broader objectives varied. Only 48 DMUs were found to be fully efficient across all scenarios, indicating that some treatments identified a DMU as efficient while others did not.
In contrast, the slacks-based model found only 34 fully efficient DMUs, and this number decreased to between two and ten when a hierarchical structure was imposed. In contrast to the analysis using the radial model, no single DMU was found to be fully efficient under all treatments. That is, the results are sensitive to how the variables are distributed between each higher order criterion.
The relative ranking of the outcomes was sensitive to the type of model used as well as how the model was applied (Table 5). The DMU ranks obtained using all the indicators were reasonably correlated between the two models, but the rank correlation declined when a hierarchical approach was adopted and varied with how many indicators were associated with each higher-level objective in the hierarchy. Ranks rather than efficiency scores were compared, as the ordering of the management options (DMUs) is of more relevance to decision analysis.
The outcomes are also compared in Figure 4 and Figure 5 for the radial and slacks-based models, respectively. These figures present the distribution of efficiency scores of each treatment against each other treatment.
For both model types, using all indicators separately in the analysis (i.e., the ‘All’ category in Figure 4 and Figure 5) results in a greater proportion of options being identified as fully efficient (i.e., TE = 1), with many of these values being less than 1 when the indicators are split into sub-groups and assessed as a hierarchy.
The value of the efficiency score is also dependent on how the hierarchy is structured, i.e., whether equally balanced across the objectives or unequally balanced. This is more pronounced for the slacks-based model (Figure 5), where the efficiency scores showed greater differences based on which hierarchy structure was imposed than for the radial DEA model.

3.2. Using the Waste Management Data

The estimated efficiency scores (i.e., cost-effectiveness measures) and their relative rankings are given in Table 6. Both the radial and slacks-based approaches identified more alternatives as efficient when all data were compared than when the hierarchical structure of the information was taken into consideration. For example, the number of efficient (i.e., TE = 1) alternatives using all data was 12 and 11 for the radial and slacks-based models, respectively, decreasing to 5 and 4 alternatives when the hierarchical structure was imposed.
The relative rankings also changed substantially when the hierarchical structure was imposed. For example, alternative 3 was found to be efficient (TE = 1; rank = 1) when all the data were used, although it fell to 18th and 20th place, respectively, when the hierarchical structure was imposed (Table 6).
The efficiency scores also differed substantially for several of the alternatives when estimated using the different models. For example, the efficiency scores for alternative 2 were substantially lower when estimated using the slacks-based model than the radial model, although the impact of this on their rank was less pronounced.
The correlations between the ranks from the different measures are presented in Table 7. Hokkanen and Salminen [40] provided the relative rankings of each of the options using the ELECTRE III method, and these are also included in the correlation matrix in Table 7. The correlations between ELECTRE III and the ranks derived using DEA were generally low, but were higher for the slacks-based model in general and for the hierarchical slacks-based model in particular. The ELECTRE III method incorporates preference weights for the different criteria, although unfortunately these weights were not presented by the authors [40].

4. Discussion

The use of hierarchical structures in MCDA is relatively common. Different management objectives often have different numbers of sub-objectives, or different numbers of indicators relating to each objective. Introducing a hierarchical structure in the assessment of management alternatives using DEA provides a number of benefits. Foremost, it allows an assessment of how each management alternative performs against each objective, as well as an overall benefit-of-the-doubt measure over all objectives and indicators. While not considered in this study, the efficiency score associated with each group of indicators (i.e., potentially representing a separate management objective) could be examined alongside the overall performance measure of the management option.
The use of an intermediate step also has benefits in terms of saving degrees of freedom when data are limited. The number of indicators that can be compared in an analysis is limited by the number of available observations. The discrimination power and accuracy of DEA with respect to the ability to estimate the performance of DMUs decreases as the number of DMUs decreases or the numbers of inputs and outputs increases [46]. While there is no definitive estimate of the number of observations required to ensure robust estimation of the efficiency scores, a commonly applied rule of thumb is that the number of observations (DMUs) should be greater than max[3(m + n), m × n], where m and n are the number of outputs and inputs, respectively [45]. In our hypothetical example, with nine outputs and one input, 30 observations would be required to meet this. With 100 generated DMUs, this was not a concern. For the waste management example, however, degrees of freedom issues may have contributed to the large number of efficient alternatives when all the data were used, but substantially fewer when breaking the problem up into essentially four separate estimation stages (one each for each objective and one for the combined objective level analysis). For example, when all data were used, the problem required 24 observations (i.e., 3 × (7 + 1) alternatives) although only 22 alternatives were available. In contrast, estimating the efficiency score associated with just the environmental objectives required only 15 observations (i.e., 3 × (4 + 1)).
If decision makers do have differing preferences for the different objectives, then weights can be applied also at each stage to provide a weighted overall efficiency score. While not explicitly covered in this study, an example of how importance weights can be applied in DEA is given by Pascoe et al. [18]. If weights were available for the waste management example, it is expected that the correlation between the MCDA and DEA model rankings would have been greater.
In the hypothetical analysis, each observation was treated as a separate management option for consideration. In many MCDA studies, multiple sets of observations are associated with each management option reflecting uncertainty in management outcomes. These may be derived through stochastic simulation using models or through (differing) expert opinion from a range of experts. The DEA framework offers particular advantages in this case, as each potential outcome set is compared with all other sets. The resultant efficiency scores can be considered as a distribution associated with each management option, allowing the effects of uncertainty to be explicitly captured in the analysis.
We find that the results (i.e., efficiency scores and relative rankings) are also sensitive to model choice. From the results using both data sets, the slacks-based model provided greater discrimination between management options (DMUs), with a smaller set of options identified as efficient. Avkiran et al. [35] note that a key challenge with radial models is that a DMU identified as efficient may still exhibit lower levels of output in some indicators than others that are also identified as efficient. Slacks-based models directly assess these indicator-specific differences, although the loss of the proportionality assumption may mean that the ability of the options to achieve efficiency is limited if such a proportionality constraint exists [35]. However, as we are not interested in improving the efficiency of inefficient management alternatives but in identifying which options are the most efficient in a multi-criterion environment, this is not necessarily a problem.
The choice of which model to use to assess the management alternatives may depend on the degree of independence between management outcomes. If management outcomes are perfectly independent, such that trade-offs between them are possible, then the slacks-based model may be more appropriate. Conversely, if the outcomes are not totally independent, such that increasing one will require some increase in another (and vice versa), then the radial model may be more appropriate given the associated proportionality assumptions. In the hypothetical data set, each outcome was totally independent, and hence the slacks-based model was able to provide greater discrimination between the hypothetical alternatives that produced them. For the waste management example, several outputs were correlated (e.g., toxins, acidic gases, and nitrogen released into the water), suggesting the proportionality assumption underlying the radial model may have been appropriate. For this example, both models produced similar rankings and numbers of ‘efficient’ options, with the use of the hierarchical structure more important in distinguishing the most efficient alternatives.
The study assumes that the relative preferences for each of the management outcomes are unknown, and hence the models provide a ‘benefit of the doubt’ measure of the efficiency of the different alternatives. Under this scenario, the outcomes are sensitive to which form of model is used and whether (and how) a hierarchy of criteria is employed. The addition of preference weights is possible in DEA as noted previously, and this may reduce these differences if the importance of each criterion to the decision maker were more explicit. This is an area for future consideration.

5. Conclusions

The results of the analysis have several implications for the use of DEA for MCDA. First, as a ‘benefit of the doubt’ measure, DEA will not necessarily identify a single ‘best’ management option, but may instead identify a subset of efficient options. These options will have different strengths with regard to different objectives. With the addition of objective weightings, this subset may be able to be narrowed further and potentially a single best option identified. Without a set of objective weights, managers will need to assess trade-offs within this (reduced) group subjectively.
Second, in cases where different objectives have different numbers of indicators (or quantifiable and measurable sub-objectives), ignoring the implicit (or explicit) hierarchical structure can result in less efficient management options appearing to be efficient. This could result in less-than-optimal decisions being made, as the pool of potentially optimal management options is greater than it should be. The triple bottom line framework (i.e., environmental, social, and economic) provides a useful conceptual classification system for many MCDA variables, even if an explicit hierarchy has not been pre-determined.
The third key result is that the efficiency scores, and hence ranking of the management options, are sensitive to the DEA approach, with the radial approach potentially overestimating the proportion of efficient options when the outcome data are uncorrelated. In contrast, the slacks-based approach can provide greater discrimination between management options, minimizing the set of efficient options that need to be subsequently considered.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/a17030089/s1, Code used in the study.

Funding

This research received no external funding.

Data Availability Statement

All data used that are not given directly in the paper can be generated using the code in the Supplementary Materials.

Conflicts of Interest

The author declares no conflict of interest.

References

  1. Diaby, V.; Campbell, K.; Goeree, R. Multi-criteria decision analysis (MCDA) in health care: A bibliometric analysis. Oper. Res. Health Care 2013, 2, 20–24. [Google Scholar] [CrossRef]
  2. Gongora-Salazar, P.; Rocks, S.; Fahr, P.; Rivero-Arias, O.; Tsiachristas, A. The Use of Multicriteria Decision Analysis to Support Decision Making in Healthcare: An Updated Systematic Literature Review. Value Health 2023, 26, 780–790. [Google Scholar] [CrossRef] [PubMed]
  3. Mayorga-Martínez, A.A.; Kucha, C.; Kwofie, E.; Ngadi, M. Designing nutrition-sensitive agriculture (NSA) interventions with multi-criteria decision analysis (MCDA): A review. Crit. Rev. Food Sci. Nutr. 2023, 1–20. [Google Scholar] [CrossRef]
  4. Blanquart, S. Role of multicriteria decision-aid (MCDA) to promote sustainable agriculture: Heterogeneous data and different kinds of actors in a decision process. Int. J. Agric. Resour. Gov. Ecol. 2009, 8, 258–281. [Google Scholar] [CrossRef]
  5. Kügemann, M.; Polatidis, H. Multi-Criteria Decision Analysis of Road Transportation Fuels and Vehicles: A Systematic Review and Classification of the Literature. Energies 2020, 13, 157. [Google Scholar] [CrossRef]
  6. Broniewicz, E.; Ogrodnik, K. Multi-criteria analysis of transport infrastructure projects. Transp. Res. Part D Transp. Environ. 2020, 83, 102351. [Google Scholar] [CrossRef]
  7. Kiker, G.A.; Bridges, T.S.; Varghese, A.; Seager, T.P.; Linkov, I. Application of multicriteria decision analysis in environmental decision making. Integr. Environ. Assess. Manag. 2005, 1, 95–108. [Google Scholar] [CrossRef]
  8. Huang, I.B.; Keisler, J.; Linkov, I. Multi-criteria decision analysis in environmental sciences: Ten years of applications and trends. Sci. Total Environ. 2011, 409, 3578–3594. [Google Scholar] [CrossRef]
  9. Martínez-García, M.; Valls, A.; Moreno, A.; Aldea, A. A semantic multi-criteria approach to evaluate different types of energy generation technologies. Environ. Model. Softw. 2018, 110, 129–138. [Google Scholar] [CrossRef]
  10. Cegan, J.C.; Filion, A.M.; Keisler, J.M.; Linkov, I. Trends and applications of multi-criteria decision analysis in environmental sciences: Literature review. Environ. Syst. Decis. 2017, 37, 123–133. [Google Scholar] [CrossRef]
  11. Stewart, T.J. Relationships between Data Envelopment Analysis and Multicriteria Decision Analysis. J. Oper. Res. Soc. 1996, 47, 654–665. [Google Scholar] [CrossRef]
  12. André, F.J.; Herrero, I.; Riesgo, L. A modified DEA model to estimate the importance of objectives with an application to agricultural economics. Omega 2010, 38, 371–382. [Google Scholar] [CrossRef]
  13. Dyckhoff, H.; Souren, R. Integrating multiple criteria decision analysis and production theory for performance evaluation: Framework and review. Eur. J. Oper. Res. 2021, 297, 795–816. [Google Scholar] [CrossRef]
  14. Belton, V. Applications in Business, Industry and Commerce: Proceedings of the Ninth International Conference on Multiple Criteria Decision Making, Berlin, 1992. In On Integrating Data Envelopment Analysis with Multiple Criteria Decision Analysis, Theory; Goicoechea, A., Duckstein, L., Zionts, S., Eds.; Springer: Berlin, Germany, 1992; pp. 71–79. [Google Scholar]
  15. de Oliveira, M.S.; Steffen, V.; de Francisco, A.C.; Trojan, F. Integrated data envelopment analysis, multi-criteria decision making, and cluster analysis methods: Trends and perspectives. Decis. Anal. J. 2023, 8, 100271. [Google Scholar] [CrossRef]
  16. Tavana, M.; Soltanifar, M.; Santos-Arteaga, F.J.; Sharafi, H. Analytic hierarchy process and data envelopment analysis: A match made in heaven. Expert Syst. Appl. 2023, 223, 119902. [Google Scholar] [CrossRef]
  17. Galán-Martín, Á.; Guillén-Gosálbez, G.; Stamford, L.; Azapagic, A. Enhanced data envelopment analysis for sustainability assessment: A novel methodology and application to electricity technologies. Comput. Chem. Eng. 2016, 90, 188–200. [Google Scholar] [CrossRef]
  18. Pascoe, S.; Cannard, T.; Dowling, N.A.; Dichmont, C.M.; Asche, F.; Little, L.R. Use of Data Envelopment Analysis (DEA) to assess management alternatives in the presence of multiple objectives. Mar. Policy 2023, 148, 105444. [Google Scholar] [CrossRef]
  19. Azadeh, A.; Ghaderi, S.F.; Izadbakhsh, H. Integration of DEA and AHP with computer simulation for railway system improvement and optimization. Appl. Math. Comput. 2008, 195, 775–785. [Google Scholar] [CrossRef]
  20. Halkos, G.E.; Tzeremes, N.G. Measuring biodiversity performance: A conditional efficiency measurement approach. Environ. Model. Softw. 2010, 25, 1866–1873. [Google Scholar] [CrossRef]
  21. Macpherson, A.J.; Principe, P.P.; Mehaffey, M. Using Malmquist Indices to evaluate environmental impacts of alternative land development scenarios. Ecol. Indic. 2013, 34, 296–303. [Google Scholar] [CrossRef]
  22. Cherchye, L.; Moesen, W.; Rogge, N.; Puyenbroeck, T.V. An Introduction to ‘Benefit of the Doubt’ Composite Indicators. Soc. Indic. Res. 2007, 82, 111–145. [Google Scholar] [CrossRef]
  23. O’Donnell, C.J. Productivity and Efficiency Analysis: An Economic Approach to Measuring and Explaining Managerial Performance; Springer: Singapore, 2018. [Google Scholar]
  24. Charnes, A.; Cooper, W.W.; Rhodes, E. Measuring the efficiency of decision making units. Eur. J. Oper. Res. 1978, 2, 429–444. [Google Scholar] [CrossRef]
  25. Tone, K. A slacks-based measure of efficiency in data envelopment analysis. Eur. J. Oper. Res. 2001, 130, 498–509. [Google Scholar] [CrossRef]
  26. Tone, K. Slacks-Based Measure of Efficiency. In Handbook on Data Envelopment Analysis; Cooper, W.W., Seiford, L.M., Zhu, J., Eds.; Springer: Boston, MA, USA, 2011; pp. 195–209. [Google Scholar]
  27. Marttunen, M.; Haag, F.; Belton, V.; Mustajoki, J.; Lienert, J. Methods to inform the development of concise objectives hierarchies in multi-criteria decision analysis. Eur. J. Oper. Res. 2019, 277, 604–620. [Google Scholar] [CrossRef]
  28. Saaty, T. The Analytic Hierarchy Process: Planning, Priority Setting, Resource Allocation; McGraw-Hill: New York, NY, USA, 1980. [Google Scholar]
  29. Färe, R.; Grosskopf, S. Theory and application of directional distance functions. J. Product. Anal. 2000, 13, 93–103. [Google Scholar] [CrossRef]
  30. Färe, R.; Grosskopf, S.; Kokkelenberg, E.C. Measuring Plant Capacity, Utilization and Technical Change: A Nonparametric Approach. Int. Econ. Rev. 1989, 30, 655–666. [Google Scholar] [CrossRef]
  31. Färe, R.; Grosskopf, S.; Kirkley, J. Multi-Output Capacity Measures and Their Relevance for Productivity. Bull. Econ. Res. 2000, 52, 101–113. [Google Scholar] [CrossRef]
  32. Charnes, A.; Cooper, W.W.; Golany, B.; Seiford, L.; Stutz, J. Foundations of data envelopment analysis for Pareto-Koopmans efficient empirical production functions. J. Econom. 1985, 30, 91–107. [Google Scholar] [CrossRef]
  33. Tone, K.; Toloo, M.; Izadikhah, M. A modified slacks-based measure of efficiency in data envelopment analysis. Eur. J. Oper. Res. 2020, 287, 560–571. [Google Scholar] [CrossRef]
  34. Tone, K. A strange case of the cost and allocative efficiencies in DEA. J. Oper. Res. Soc. 2002, 53, 1225–1231. [Google Scholar] [CrossRef]
  35. Avkiran, N.K.; Tone, K.; Tsutsui, M. Bridging radial and non-radial measures of efficiency in DEA. Ann. Oper. Res. 2008, 164, 127–138. [Google Scholar] [CrossRef]
  36. Bogetoft, P.; Otto, L. Benchmarking with DEA, SFA, and R; Springer: New York, NY, USA, 2020. [Google Scholar]
  37. R Core Team. R: A Language and Environment for Statistical Computing; R Foundation for Statistical Computing: Vienna, Austria, 2018. [Google Scholar]
  38. Coll-Serrano, V.; Bolos, V.; Suarez, R.B. deaR: Conventional and Fuzzy Data Envelopment Analysis; Version 1.2.5; CRAN.R: Vienna, Austria, 2022; Available online: https://cran.r-project.org/web/packages/deaR/deaR.pdf (accessed on 1 December 2023).
  39. Regier, D.A.; Peacock, S. Theoretical Foundations of MCDA. In Multi-Criteria Decision Analysis to Support Healthcare Decisions; Marsh, K., Goetghebeur, M., Thokala, P., Baltussen, R., Eds.; Springer International Publishing: Cham, Switzerland, 2017; pp. 9–28. [Google Scholar]
  40. Hokkanen, J.; Salminen, P. Choosing a solid waste management system using multicriteria decision analysis. Eur. J. Oper. Res. 1997, 98, 19–36. [Google Scholar] [CrossRef]
  41. Figueira, J.R.; Mousseau, V.; Roy, B. ELECTRE Methods. In Multiple Criteria Decision Analysis: State of the Art Surveys; Greco, S., Ehrgott, M., Figueira, J.R., Eds.; Springer: New York, NY, USA, 2016; pp. 155–185. [Google Scholar]
  42. Sarkis, J. A comparative analysis of DEA as a discrete alternative multiple criteria decision tool. Eur. J. Oper. Res. 2000, 123, 543–557. [Google Scholar] [CrossRef]
  43. Halkos, G.; Petrou, K.N. Treating undesirable outputs in DEA: A critical review. Econ. Anal. Policy 2019, 62, 97–104. [Google Scholar] [CrossRef]
  44. Seiford, L.M.; Zhu, J. Modeling undesirable factors in efficiency evaluation. Eur. J. Oper. Res. 2002, 142, 16–20. [Google Scholar] [CrossRef]
  45. Cooper, W.W.; Seiford, L.M.; Tone, K. Data Envelopment Analysis: A Comprehensive Text with Models, Applications, References and DEA-Solver Software; Springer: New York, NY, USA, 2007; Volume 2. [Google Scholar]
  46. Khezrimotlagh, D.; Cook, W.D.; Zhu, J. Number of performance measures versus number of decision making units in DEA. Ann. Oper. Res. 2021, 303, 529–562. [Google Scholar] [CrossRef]
Figure 1. Comparison of (a) radial expansion and (b) slacks-based DEA. Green points represent efficient DMUs which define the frontier; yellow points represent inefficient DMUs that sit below the frontier.
Figure 2. Example objective hierarchy.
Figure 3. Summary of data used in the analysis: (a) distribution of the values of each variable and (b) distribution of correlation coefficients between variables across the 100 randomly generated DMUs.
Figure 4. Comparison of efficiency scores with different hierarchical structure, radial model.
Figure 5. Comparison of efficiency scores with different hierarchical structure, slacks-based model.
Table 1. Experimental design used in the hypothetical analysis.

Scenario          Structure                 Variables in Each Group
All indicators    9 individual indicators   V1–V9
Hierarchy
  Equal           (3, 3, 3)                 V1–V3; V4–V6; V7–V9
  Unequal         (2, 3, 4)                 V1–V2; V3–V5; V6–V9
  Very unequal    (2, 2, 5)                 V1–V2; V3–V4; V5–V9
Table 2. Transformed data used in the waste management example.

Columns: Alternative; Cost (input); Environmental outputs (Greenhouse Gases, Toxins, Acidic Gases, Nitrogen); Technical output (Reliability); Economic outputs (Employment, Waste Recovery).
165616.99662501351413,900
278630.5619650141823,600
391289.1091188942439,767
45899.8932604924091013,900
570637.3883464927971423,600
683499.061171142996.51840,667
75808.6862734926391013,900
868237.4493574929071423,600
9838103.080170113106.52241,747
105798.118298492789913,900
1168837.3723794931371323,600
12838104.318172103226.51742,467
135959.1741714914591213,900
1470936.7002694919471723,600
1584995.53723141456.52040,667
166049.1741714914591213,900
1773636.7002694919471723,600
1887195.53723141456.52040,667
195791176471259713,900
2069532.7372474514861823,600
21827112.49020317071645,167
22982112.46820117071645,167
Table 3. Correlation between the transformed variable values in the waste treatment data.

                          GHG      Toxins   Acidic   Nitrogen   Reliab.   Employ.   Recov.
Greenhouse gas            1.000
Toxins                   −0.570    1.000
Acidic gases             −0.958    0.673    1.000
Surface water nitrogen    0.157    0.633   −0.108     1.000
Technical reliability    −0.468    0.437    0.306     0.416     1.000
Employment                0.746   −0.498   −0.607    −0.152    −0.786     1.000
Resource recovery         0.995   −0.559   −0.944     0.138    −0.493     0.765     1.000
Table 4. Number of efficient (TE = 1) DMUs in each scenario.

Scenario                  Radial   Slacks-Based
All indicators              66         34
Hierarchy
  Equal (3,3,3)             49          6
  Unequal (2,3,4)           49          2
  Very unequal (2,2,5)      48         10
All scenarios               48          0
Table 5. Spearman’s rank correlations between different treatments.

                        Radial Model                        Slacks-Based Model
Model           All     3,3,3   2,3,4   2,2,5       All     3,3,3   2,3,4   2,2,5
Radial
  All           1.000   0.821   0.858   0.841       0.785   0.446   0.441   0.450
  3,3,3         0.821   1.000   0.964   0.976       0.628   0.470   0.438   0.435
  2,3,4         0.858   0.964   1.000   0.975       0.687   0.439   0.379   0.378
  2,2,5         0.841   0.976   0.975   1.000       0.656   0.472   0.417   0.416
Slacks-based
  All           0.785   0.628   0.687   0.656       1.000   0.422   0.395   0.430
  3,3,3         0.446   0.470   0.439   0.472       0.422   1.000   0.878   0.829
  2,3,4         0.441   0.438   0.379   0.417       0.395   0.878   1.000   0.884
  2,2,5         0.450   0.435   0.378   0.416       0.430   0.829   0.884   1.000
Table 6. Efficiency scores and relative rankings given different estimation procedures.

Columns: Alternative; Efficiency scores (Radial: All, Hierarchy; Slacks-based: All, Hierarchy); Rank of efficiency scores (Radial: All, Hierarchy; Slacks-based: All, Hierarchy).
10.9840.9340.2680.2421692019
20.9310.7970.0250.02320182121
31.0000.7741.0000.028118120
40.9960.9770.9610.941136115
50.9660.8310.9550.76615141111
60.9980.8270.9530.68612141112
71.0001.0001.0001.0001111
81.0000.8911.0000.9211714
91.0000.8631.0000.91411014
101.0001.0001.0001.0001111
111.0000.8641.0000.8301814
121.0000.8331.0000.6891816
131.0001.0001.0001.0001111
141.0000.9431.0000.8231413
150.9710.8050.4140.4055666
160.9850.9700.9230.7964333
170.9630.8750.9190.6814433
180.9470.7640.3820.3434444
191.0001.0000.4280.4301133
201.0001.0001.0000.8731112
211.0000.8751.0001.0001111
220.8420.6210.1790.1491111
Table 7. Rank correlations between different models, waste management example.

                    ELECTRE III   Radial All   Radial Hierarchy   SBM All   SBM Hierarchy
ELECTRE III            1.000
Radial all             0.369        1.000
Radial Hierarchy       0.228        0.405         1.000
SBM all                0.510        0.907         0.356           1.000
SBM Hierarchy          0.621        0.509         0.762           0.627        1.000
