Article

Relative Efficiency of Radiation Treatment Centers: An Application of Data Envelopment Analysis

1 Ivey Business School, University of Western Ontario, London, ON N6G 0N1, Canada
2 School of Management, Economics, and Mathematics, King’s University College, University of Western Ontario, London, ON N6A 2M3, Canada
* Author to whom correspondence should be addressed.
Healthcare 2022, 10(6), 1033; https://doi.org/10.3390/healthcare10061033
Submission received: 24 April 2022 / Revised: 22 May 2022 / Accepted: 30 May 2022 / Published: 2 June 2022

Abstract

This study determines the relative efficiencies of cancer treatment centers in Ontario, taking into account the differences among them so that their performances can be compared against provincial targets. These differences include physical and financial resources and patient demographics. An analytical framework is developed based on a three-step data envelopment analysis (DEA) model to build efficiency metrics for planning, delivery, and quality of treatment at each center. Regression analysis is used to explain the efficiency metrics, and we demonstrate how these findings can inform continuous improvement efforts.

1. Introduction

The role of radiotherapy in oncology has increased over the last decade as technological advances have continued [1,2]. Providing access to timely and appropriate radiotherapy services is crucial in order to minimize radiotoxicity and optimize patient outcomes [3,4,5].
In Ontario, Canada, there are 15 regional cancer centers (RCCs) that provide radiotherapy services to its 14.5 million residents. As a means of promoting access, RCCs are distributed across the Province and vary in size, availability of specialized equipment, and extent of clinical expertise [6]. While the provincial health authority devised a plan in 2015 to increase the performance of its cancer treatment centers through continuous improvement cycles, the challenge became how to compare, identify, and subsequently implement improvement opportunities and best practices to all its centers [7]. Given the heterogeneity of center attributes and available resources, measuring their relative performances against the same benchmarks may not be a fair assessment. For RCCs with a similar set of resources (inputs), one would expect similar levels of performance, including, for example, the number of patients completing treatment and the percentage of patients starting treatment within pre-specified wait-time targets (outputs). Similarly, one may expect an RCC with fewer inputs to yield lower performance. In reality, however, some RCCs may be more (or less) efficient, producing greater (or fewer) outputs with the same or fewer inputs due to a variety of reasons, such as patient composition, types of services provided, and whether the center is a teaching hospital or not.
Data envelopment analysis (DEA) is a linear programming technique that is widely used to compare decision-making units (DMUs) that provide similar services but operate with differing levels of resources. DEA enables an improved comparison of the relative efficiency of each DMU, that is, how well each DMU is able to transform its set of inputs into desired outputs. Following DEA, regression analysis can be used to identify factors associated with inefficiency, and how a less-efficient DMU can improve its performance [8,9,10,11,12]. (We use the terms DMU and RCC interchangeably).
This study employs DEA and regression analysis using data from 2013 to 2016 obtained from two provincial databases that report cancer-related patient-level activity and diagnoses. DEA is used to identify the factors associated with efficient planning and treatment of radiation therapy for patients treated for cancer in Ontario RCCs, and regression analysis is used to explain findings from the DEA model. We show how this analysis can be used in the development of continuous improvement initiatives, which is not typically discussed in DEA research. There is an opportunity to address gaps in the literature, specifically by contributing a DEA study focused on cancer care in a Canadian province with novel managerial insights regarding how to interpret and act upon efficiency scores. To the best of our knowledge, this is the first study to use DEA to evaluate radiation treatment center performance, and we couple the analysis with interpretations of model results that can initiate improvement efforts. This study also draws attention to challenges faced by provincial healthcare providers, and we demonstrate how integrated benchmarks can guide decision-makers and lead to potentially beneficial collaborations between RCCs.

2. Literature Review

Özcan [13], Cantor and Poh [8], and Kohl et al. [14] explored DEA in healthcare settings, providing several examples of applications to hospitals, nursing homes, and international health studies. In particular, Cantor and Poh [8] reviewed articles that use DEA in combination with other technical approaches, such as regression or factor analysis, to measure healthcare system efficiency, and Kohl et al. [14] provided a review of DEA applied specifically in hospital settings. Those authors note an important disconnect between the findings of DEA and the actions taken to improve efficiency: being able to identify low-performing DMUs and quantifying the inputs (or outputs) that would yield 100% relative efficiency are important first steps; however, DEA does not prescribe a process for using that knowledge to achieve those targets.
Efficiency in hospitals has been studied before [10,15,16], along with the effectiveness of cancer screening programs as measured through detection rates [17,18] and cost [19,20]. However, few studies directly relate to our context, i.e., efficiency analysis of radiation centers. Langabeer and Özcan [21] applied a DEA Malmquist approach in their longitudinal study of inpatient cancer centers across a five-year period in the United States, and found that greater specialization of treatment does not necessarily lead to higher efficiency or lower costs. Theirs was one of the few DEA studies dedicated to cancer care. Meanwhile, Allin et al. [11] focused on comparing 89 health regions in Canada, examining the potential years of life lost that could be due to system inefficiencies, though their focus was not on cancer care.
Expanding on seminal work in DEA [22,23], Simar and Wilson [24] and Lothgren and Tambour [25] established bootstrapping frameworks for approximating a sampling distribution of the relative scale efficiencies, thereby enabling the construction of confidence intervals for these estimates. Bootstrap DEA can be applied to any industry sector, from banking [26] and education [27] to healthcare supply chains [28]. In each of these settings, regression analysis follows DEA to identify potential causes of inefficiency for the DMUs.

3. Problem Description

Each of the 15 cancer centers in Ontario is subject to specific levels of available resources and expertise (Table 1). For example, some programs are located in teaching hospitals and others are not, and each center has its own level of specialized equipment required for radiotherapy, or medical resource level. Centers can be further distinguished by their treatment capabilities, or their “diversification”. It was observed that for every center, the majority of treatments delivered were to the pelvis and chest, but certain centers had wider or more diversified “portfolios” of body regions treated (e.g., brain). This is in contrast to specialization, which implies that a center would treat only certain body regions and not others. We express diversification as the proportion of radiation treatments delivered to body regions other than pelvis or chest. Finally, centers are situated across the province, where local populations vary. Catchment population refers to the census population within a 50-kilometer radius of the RCC as determined by population counts reported in the 2011 and 2016 census reports [29,30]. Annual population growth rates for Canada [31] were used to estimate populations between 2012 and 2015, inclusive.
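To make these derived measures concrete, here is a minimal sketch (in Python; all treatment counts, the census figure, and the growth rate are hypothetical, not study data) of how diversification and intercensal catchment populations could be computed:

```python
# Sketch: computing diversification and intercensal catchment populations.
# All figures below are illustrative; the study used actual ALR treatment
# counts, 2011/2016 census populations, and World Bank growth rates.

# Radiation treatments delivered by body region for one hypothetical RCC
treatments = {"pelvis": 4200, "chest": 3100, "brain": 650, "head_neck": 900, "other": 350}

total = sum(treatments.values())
non_pelvis_chest = total - treatments["pelvis"] - treatments["chest"]
diversification = non_pelvis_chest / total
print(f"Diversification: {diversification:.2%}")  # share of non-pelvis/chest treatments

# Intercensal estimate: grow the 2011 census count forward with an assumed
# annual national growth rate, yielding populations for 2012-2015.
pop_2011 = 480_000      # hypothetical census population within 50 km of the RCC
growth_rate = 0.011     # hypothetical ~1.1% annual growth
pop_by_year = {2011 + k: round(pop_2011 * (1 + growth_rate) ** k) for k in range(5)}
print(pop_by_year)
```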
Measuring the performances of these distinct centers against the same benchmarks may not be the best assessment. Rather, their performances relative to their available inputs and respective outputs would provide a clearer picture of how efficiently they are operating compared to one another. The scope of our study is limited to cancer treatments delivered by linear accelerators, or LINACs.
To determine the appropriate inputs and outputs for the regional cancer centers (RCCs), a patient’s radiation treatment journey is followed, while considering three dimensions of treatment: planning, delivery, and quality. The planning dimension begins with the patient’s initial diagnosis date. The patient is then referred for a consultation with a radiation oncologist. The time between referral and the first consultation visit with a radiation oncologist is an important performance indicator, called referral-to-consult (RTC). If radiotherapy is indicated, the radiation oncologist will then develop a course of treatment, and will also determine the date that treatment could begin (the ready-to-treat date). The time between when the patient is physically ready to be treated and their first treatment is another important indicator, the ready-to-treat to first treatment (RTT) time. To monitor wait times for the radiation treatment program, the provincial health authority established a 14-day benchmark for both the RTC and RTT wait times for all RCCs; the proportion of patients whose wait time is within that target is tracked yearly. While each patient’s course of treatment (e.g., dosages and timing of treatments) could differ, this planning phase should be consistent for all patients (Figure 1).
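As an illustration of how the two wait-time indicators could be computed from patient-level dates, consider the following sketch (Python; the records are hypothetical, and the study derived the actual indicators from ALR data):

```python
# Sketch: proportion of patients meeting the 14-day RTC and RTT targets.
# The dates below are hypothetical stand-ins for patient-level ALR records.
from datetime import date

patients = [
    # (referral, first consult, ready-to-treat, first treatment)
    (date(2016, 3, 1), date(2016, 3, 10), date(2016, 3, 20), date(2016, 3, 30)),
    (date(2016, 4, 4), date(2016, 4, 25), date(2016, 5, 2), date(2016, 5, 12)),
    (date(2016, 5, 2), date(2016, 5, 13), date(2016, 5, 24), date(2016, 6, 20)),
]

TARGET_DAYS = 14
rtc_met = sum((consult - referral).days <= TARGET_DAYS
              for referral, consult, _, _ in patients)
rtt_met = sum((treatment - ready).days <= TARGET_DAYS
              for _, _, ready, treatment in patients)

print(f"RTC target met: {rtc_met / len(patients):.0%}")  # 2 of 3 patients
print(f"RTT target met: {rtt_met / len(patients):.0%}")  # 2 of 3 patients
```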
Once a plan is in place, radiation treatments are administered according to that plan (the delivery dimension of treatment). This requires patients to visit their respective regional cancer centers on specific dates and times (e.g., radiation is applied Monday through Friday for 3 weeks). The number and timing of treatments that can be delivered at a center depend upon that center’s availability and utilization of its resources, which comprise medical equipment such as LINACs.
Patient support visits and quality assurance visits could also be booked around these treatment visits to minimize the number of center visits required of the patient. Patient support visits consist of patient education and coordination/scheduling of radiation-related visits. Quality assurance visits include exposing the patient to a thermoluminescent dosimeter, acquisition of portal images or volumetric images, use of active breathing control, use of respiratory gating equipment, manual calculations, fluence/dosimetry checks and peer review. These activities are captured in the quality dimension of treatment planning and delivery. Figure 1 illustrates the approximate timing of each treatment dimension. Though several visits may occur simultaneously without requiring the patient to physically change locations within the center, we consider them to be distinct. This allows for resources, such as number of staff, technicians, and physicians, to be indirectly incorporated in the DEA model, should information pertaining to full-time equivalent levels not be easily accessible.

3.1. Data

Data between 2013 and 2016 were examined from two databases: Activity Level Reporting (ALR), which reports patient-level activity within the cancer system, and the Ontario Cancer Registry (OCR), a database of Ontario residents who have been diagnosed with cancer and residents who have died of cancer. From the ALR, the total number of visits by type (e.g., planning and simulation) and the number of incident cases (i.e., new cases of diagnosed cancer) were gathered. Each patient was assigned to an RCC based on the location of their first treatment. The number of deaths by year and by RCC was calculated by linking to the OCR and the Registered Persons Database, which hold information on Ontario residents’ access to public health services. The proportion of patients whose RTC and RTT wait times were within the provincial target of 14 days was calculated, along with the total number of radiation treatments delivered to specific body regions. From both the ALR and OCR, the number of new cancer diagnoses occurring after a patient’s first diagnosis was identified as a surrogate for the subsequent cancer diagnosis rate. Publicly available sources were consulted for teaching hospital designations [32] and for data on medical resource (MR) capacity and utilization [33,34].
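A minimal sketch of this linkage and assignment logic follows (Python/pandas; the table layouts and column names are hypothetical stand-ins, since the actual ALR, OCR, and Registered Persons Database schemas are not public):

```python
# Sketch: linking activity records (ALR-like) to a registry (OCR-like) and
# assigning each patient to the RCC of their first treatment. All names
# and values are hypothetical.
import pandas as pd

alr = pd.DataFrame({
    "patient_id": [1, 1, 2, 3],
    "rcc":        ["C2", "C5", "C2", "C9"],
    "visit_type": ["treatment", "treatment", "planning", "treatment"],
    "visit_date": pd.to_datetime(["2015-02-01", "2015-06-01", "2015-03-10", "2015-04-02"]),
})
ocr = pd.DataFrame({"patient_id": [1, 2, 3], "death_year": [2016, pd.NA, pd.NA]})

# Assign each patient to the RCC where their first treatment occurred
first_tx = (alr[alr.visit_type == "treatment"]
            .sort_values("visit_date")
            .groupby("patient_id", as_index=False)
            .first()
            .rename(columns={"rcc": "assigned_rcc"}))

# Link to the registry to count deaths by assigned RCC
linked = first_tx.merge(ocr, on="patient_id", how="left")
print(linked.groupby("assigned_rcc")["death_year"].count())
```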

3.2. Privacy and Software

This study was approved by the Western University Health Sciences Research Ethics Board. Data envelopment analysis was performed using optimization solver CPLEX 12.10 via the Python API (Python version 3.6) and regression analysis was performed with statistical software R (version 3.6.0).

4. Computing Relative Efficiencies

To compare the performances of RCCs between 2013 and 2016, a DEA model was constructed to compute the relative efficiency score for each center, across the three treatment dimensions, for each year. We posit that the provincial health authority has more control over its inputs than its outputs and that resources are limited in our problem setting (as is typical in healthcare environments), so the DEA was formulated as an input-oriented, variable-returns-to-scale (VRS) model, where economies of scale may exist (i.e., we do not assume a constant rate of substitution between inputs and outputs). In other words, the VRS assumption is more general and does not require that a change in the inputs produce a proportional change in the outputs. Details regarding the mathematical formulation of the input-oriented VRS DEA model, $[DEA]$, can be found in Appendix A.1; we note once again that RCCs are analogous to decision-making units (DMUs).
The relative efficiencies computed in a DEA model indicate how well an RCC is able to transform its inputs into outputs, relative to other RCCs. Consider Figure 2. The points A through F represent different RCCs and the levels of output they can achieve based on their inputs. The DEA determined that RCCs A, B, C, and D are efficient, and joining their coordinates forms the efficient frontier. An inefficient RCC such as E can improve its relative efficiency in several ways: increase output without changing its original input (move to E1), use fewer inputs to produce the same level of outputs (E2), or adjust both inputs and outputs in such a way as to reach the frontier (E3).
Efficiency scores for the planning dimension were generated using the numbers of clinic visits, planning visits, and simulation visits as inputs; and the percentages of patients with RTC and RTT times ≤14 days as outputs (Table 2). Efficiency scores for the delivery dimension used medical resource (MR) capacity and the inverse of RTT times ≤14 days as inputs; and MR utilization and the number of treatment visits as outputs. The inverse of RTT was used because a smaller input value is desirable in an input-oriented DEA model (i.e., a lower inverse RTT implies that a higher proportion of patients are within the target wait time window). Similarly, a larger output value is desirable in this type of analysis.
Lastly, efficiency in the quality dimension used the number of patient support visits, quality assurance visits, and treatment visits as inputs, and survival rate and the inverse of the subsequent diagnosis rates as the outputs. The subsequent diagnosis rate was measured by first identifying for each patient in the population over all years in our study whether an incident (i.e., cancer diagnosis) was the first or a subsequent cancer diagnosis. Then, in a given year, the total number of subsequent cancer diagnoses across the patient population was divided by the total number of cancer incidents to determine the subsequent diagnosis rate. The complement of this value is used in this analysis, since larger outputs are more desirable in an input-oriented VRS model.
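For clarity, the arithmetic behind the two quality outputs can be sketched as follows (Python; the counts are illustrative only):

```python
# Sketch: quality-dimension outputs for one RCC in one year.
# Counts are illustrative; the study derived them from ALR/OCR records.
incidents = 2400        # cancer diagnoses in the year
subsequent = 360        # diagnoses that were not the patient's first
deaths = 430

subsequent_rate = subsequent / incidents
inverse_subsequent = 1 - subsequent_rate   # complement: larger is better
survival_rate = 1 - deaths / incidents

print(f"Inverse subsequent diagnosis rate: {inverse_subsequent:.2%}")  # 85.00%
print(f"Survival rate: {survival_rate:.2%}")                           # 82.08%
```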
Solving $[DEA]$ provides only a snapshot of relative efficiency scores, based on single measures of inputs and outputs at a given point in time. To better capture the variability in these efficiencies, a bootstrap DEA [24,35] allows computation of an average efficiency and confidence interval through sampling with replacement. This bootstrap approach (presented in Appendix A.2) repeatedly samples inputs to generate a range of efficiencies, ultimately providing an average efficiency score that better distinguishes efficient RCCs from one another, even if the differences are small. Table 3, Table 4 and Table 5 show efficiency scores derived from $[DEA]$, along with the bias-corrected mean efficiencies and their 5th and 95th quantiles (denoted by $\theta$, $\hat{\theta}$, $q_{0.05}$, and $q_{0.95}$, respectively). The ranking of RCCs remains, in general, consistent with the results from solving $[DEA]$ just once, indicating that the results are robust.

5. Explaining Relative Efficiencies

We want to understand the differences in efficiencies to gain insights into how to interpret them. External factors that may be determinants of relative efficiency include:
(1) Center size;
(2) Center catchment population;
(3) Radiation treatment diversification;
(4) Teaching hospital designation.
Center size was estimated using the medical resource level (MRL) as a proxy (Table 1). Catchment population and radiation treatment diversification were defined in Section 3. Table 6 presents the variables in our regression model, and shows mean and standard deviation values for these regression variables over the four years of study.
The regression model takes the following form:
$$\hat{\theta}_j^* = \beta_0 + \beta_1 \mathrm{MRL}_j + \beta_2 \mathrm{DIV}_j + \beta_3 \mathrm{POP}_j + \beta_4 \mathrm{TEA}_j + \epsilon_j$$
where $\hat{\theta}_j^*$ denotes a transformed bias-corrected efficiency score. Because the $\hat{\theta}_j$ are censored at 1 (i.e., $0 \le \hat{\theta}_j \le 1$), various cut-off points (and thus regression models) would otherwise need to be developed to compute meaningful efficiency estimates that remain within 0 and 1. However, by transforming the scores so that they are left-censored at 0 only, a censored (or Tobit) regression can be applied and interpreted similarly to an ordinary least squares regression [13]. We apply the following transformation to compute $\hat{\theta}_j^*$:
$$\hat{\theta}_j^* = \frac{1}{\hat{\theta}_j} - 1$$
Note that with this transformation, $\hat{\theta}_j^* \ge 0$ and can be interpreted as an “inefficiency” score; i.e., $\hat{\theta}_j^* = 0$ indicates that RCC $j$ is 100% efficient.
For each model dimension, a left-censored (Tobit) regression analysis was performed with the vglm function from the VGAM library in R, using the transformed efficiencies over all four years of our study. From the results in Table 6, this combination of determinants explains roughly 64% and 38% of the variation in the planning and quality efficiency scores, respectively, but does not adequately account for variation in delivery efficiency ($R^2 < 0.1\%$).
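The study fit these models in R with vglm; for readers working in Python, a left-censored Tobit model can also be fit by maximum likelihood directly. The sketch below uses simulated data, so the variables mimic Table 6 only in spirit and the coefficients are illustrative, not the study’s estimates:

```python
# Sketch: left-censored (Tobit) regression fit by maximum likelihood.
# The study used VGAM::vglm in R; here simulated data stand in for the
# transformed inefficiency scores, so all estimates are illustrative.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(1)
n = 60                                          # 15 RCCs x 4 years
mrl = rng.normal(7.7, 4.2, n)                   # center size proxy
div = rng.normal(30.5, 10.1, n)                 # diversification (%)
pop = rng.normal(586.0, 350.0, n)               # catchment ('000s)
tea = rng.integers(0, 2, n).astype(float)       # teaching designation

# Standardize regressors for a well-conditioned optimization
Z = np.column_stack([mrl, div, pop, tea])
Z = (Z - Z.mean(axis=0)) / Z.std(axis=0)
X = np.column_stack([np.ones(n), Z])

beta_true = np.array([1.0, 0.8, 1.2, 0.0, -0.4])             # hypothetical effects
y = np.maximum(X @ beta_true + rng.normal(0, 1.0, n), 0.0)   # left-censored at 0

def neg_loglik(params):
    beta, log_sigma = params[:-1], params[-1]
    sigma = np.exp(log_sigma)
    mu = X @ beta
    cens = y <= 0
    # Uncensored observations contribute a normal density term...
    ll = norm.logpdf((y[~cens] - mu[~cens]) / sigma).sum() - (~cens).sum() * log_sigma
    # ...censored observations contribute the probability mass below 0
    ll += norm.logcdf(-mu[cens] / sigma).sum()
    return -ll

res = minimize(neg_loglik, x0=np.zeros(X.shape[1] + 1), method="BFGS")
print("estimated coefficients:", np.round(res.x[:-1], 2))
```

Because the dependent variable is an inefficiency score, a positive estimated coefficient indicates that the factor is associated with worse efficiency, matching the sign convention used in Table 6.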
Due to the transformation applied to the efficiency scores, interpretation of the coefficients must be handled carefully: a positive coefficient indicates that efficiency worsens as the explanatory variable increases, and vice versa. For the planning dimension, higher efficiency scores were associated with smaller centers (lower levels of medical resources) ($p < 0.01$) and centers with less diverse radiation treatment portfolios ($p < 0.001$), but not with catchment population ($p = 0.253$) or teaching designation ($p = 0.218$). Smaller RCCs were also more efficient in the quality dimension ($p < 0.05$).

6. From Rankings to Continuous Improvement

Treatment center policies can be informed by relative efficiency scores. Rather than focusing solely on the rankings that a DEA provides, we want to work towards actionable plans that contribute to the continuous improvement of RCCs. We developed several visualizations to identify potential courses of action for an RCC to improve upon its performance in a given dimension. In the following subsections, visualizations for a selection of computed results are presented; similar analysis can be performed for all dimensions, years, and combinations of RCCs.

6.1. Comparing Results by Dimension

For each of the three treatment dimensions, relative efficiency scores can be compared to identify whether any trends in performance are apparent. Figure 3 plots the bias-corrected mean efficiencies for the planning dimension ($\hat{\theta}$), computed according to the method in Appendix A.2 and shown in Table 3 by year and RCC.
The efficiencies of some centers are distinctly lower than others: C8, C11, C13, and C14 consistently perform with $\hat{\theta} < 0.30$. Focusing on a specific center or group of centers within this range, say C8 and C14, allows us to understand which inputs and outputs impact their planning efficiency scores (Figure 4).
Though both lie in the band of low efficiency, C14’s performance in the planning dimension has improved since 2013, whereas C8’s has declined. These trends can be attributed to numerous changes in inputs and outputs for C14 and C8. In particular, C14 saw a general decline in clinic visits and an improvement in the proportion of patients meeting the RTC wait time target, which together outweigh the increase in simulation visits. For C8, however, the decrease in planning visits over this period did not offset increases in clinic and simulation visits or the worsening of the RTC target.
We do not recommend exhausting pairwise comparisons of centers; rather, decision-makers should identify “peer centers” (or, RCCs with similar performances in a treatment dimension) and scrutinize why those RCCs perform differently. In what specific measures does one RCC outperform the other, and how can each RCC strive for improvement?

6.2. Relative Comparison of RCCs

To assist in identifying these peer centers, we propose visualizations in Figure 5, Figure 6 and Figure 7 to compare two dimensions in a given year (2016).
Say we compare centers based on both the planning and quality dimensions (Figure 5). In keeping with our C8 and C14 pairing, we can see how “far” C8 is from C14, but also how close it is in performance to other centers, such as C11. While all centers should strive to be in the top-right quadrant (i.e., 100% efficient in both dimensions), in the short term it is perhaps better to compare just certain groups of centers, quadrant by quadrant. This visualization also quickly identifies clusters of centers: according to this study, non-teaching hospitals (denoted by “×”) generally performed better in both planning and quality in 2016 than their teaching counterparts. This clustering is also apparent when comparing the planning and delivery dimensions (Figure 6) and the delivery and quality dimensions (Figure 7) in 2016, as most non-teaching hospitals performed better than teaching hospitals in all three dimensions.
The dotted red lines in each of these quadrant charts represent arbitrarily chosen thresholds for performance and should be adjusted appropriately by decision-makers. For example, in Figure 7, if a 60% efficiency threshold in the quality dimension is considered too high to strive for, it can be lowered to some other value, say 30%. By focusing on the fewer centers that have not met this 30% target, decision-makers can tailor strategic plans to get those centers past this milestone. Alternatively, centers scoring very high on quality (given managerial targets) may reduce efforts in this dimension to focus on under-performing aspects of other dimensions. Through these positive incremental changes, centers may be encouraged to adopt more continuous improvement initiatives.
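A quadrant chart of this kind is straightforward to produce. The sketch below (Python/matplotlib) plots four centers with illustrative planning and quality scores (loosely based on the 2016 values in Table 3 and Table 5) against decision-maker-chosen thresholds:

```python
# Sketch: quadrant chart in the style of Figures 5-7.
# Scores and thresholds are illustrative, not exact study values.
import matplotlib.pyplot as plt

centers  = ["C1", "C8", "C11", "C14"]
planning = [0.70, 0.36, 0.75, 0.19]   # bias-corrected planning scores
quality  = [0.70, 0.15, 0.17, 0.20]   # bias-corrected quality scores
teaching = [False, True, True, True]

fig, ax = plt.subplots()
for name, p, q, is_teaching in zip(centers, planning, quality, teaching):
    # Non-teaching hospitals are marked with "x", as in the figures
    ax.scatter(p, q, marker="o" if is_teaching else "x", color="black")
    ax.annotate(name, (p, q), xytext=(4, 4), textcoords="offset points")

# Thresholds chosen by decision-makers split the chart into quadrants
ax.axvline(0.6, color="red", linestyle="dotted")
ax.axhline(0.3, color="red", linestyle="dotted")
ax.set_xlabel("Planning efficiency")
ax.set_ylabel("Quality efficiency")
ax.set_xlim(0, 1)
ax.set_ylim(0, 1)
plt.show()
```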

7. Discussion

As part of measuring the performances of RCCs, the provincial health authority measures and reports various indicators, including RTC and RTT. Although this is an important and effective means of assessing improvement opportunities, indicators measured in isolation do not speak to the center’s efficiency. DEA, in contrast, provides a single numeric value that signifies each center’s relative efficiency and can identify system-level strategies towards performance improvement (e.g., increase medical resource levels).
We recognize that computing these scores is not enough; we must also consider how to motivate managerial action from these scholarly insights [14]. Typically, inefficient RCCs would seek to reduce their inputs while maintaining or increasing their outputs, with specific targets obtained using slack values from model $[P]$. These RCCs may also consider emulating a “weighted” combination of efficient RCCs, based on results from model $[DEA]$. While knowing these target values is useful, how to achieve them operationally is another story. We present a variation of this benchmarking information to assist in identifying potential partnerships between centers. Visualizing DEA inputs, outputs, and the resulting efficiency scores allows policy makers to quickly see how centers are performing across several radiation treatment dimensions and against other centers over time. By considering three separate treatment dimensions, we show that an RCC that is inefficient in one dimension can still be a leader in another. The parameters listed in Table 2 were used as inputs and outputs. While there are other measures that could be incorporated, the number of RCCs we studied was a limiting factor: Özcan and Legg [36] suggest that the numbers of inputs and outputs in a DEA model should satisfy $n \ge \max(r \times s, 3(r + s))$, where $n$, $r$, and $s$ denote the numbers of RCCs, inputs, and outputs, respectively. With $n = 15$ RCCs, we used $r \le 3$ and $s \le 2$ in any one DEA model.
Some determinants of relative efficiencies discussed in Section 5 are not within the decision-making jurisdiction of an individual radiation treatment center. For example, center catchment population and teaching hospital designation cannot be controlled or changed by an RCC. Medical resource levels and treatment diversification, however, can be influenced by the specific needs of the population served by the RCC and the policies developed by radiation treatment center administrators.
Furthermore, the inputs and outputs used in our analysis were limited to values that are currently available. Other center-specific information that could have been more informative would include full-time equivalents (FTEs) for radiation treatment professionals, such as oncologists, dosimetrists, technicians, and administrators. Provincial funding also differs by center and is partly based on the center’s case mix, neither of which was provided by the health authority for use in our study. A center’s case mix would influence the complexity of radiation treatments planned and delivered, which could impact RTC and RTT wait times, medical resource utilization, and patient survival rates, but also the specialization of FTEs and types of equipment required to deliver specialized treatments. Without this more granular information, insights from DEA models are limited to high-level interpretations.
Finally, it is important to be mindful of metrics deemed important by provincial authorities to ensure the insights developed by DEA modeling are meaningful. For example, during the analysis period (2013 to 2016), RTC and RTT wait times were considered important indicators for how well RCCs were meeting provincial wait time targets. However, in their most recent plan, the provincial authority modified these indicators to measure instead wait time from diagnosis to first treatment date and the radiation integrated wait time [37]. The DEA model should be revised to reflect these updates in provincial measurements. Regardless, the analytical approach we presented here can be followed with the appropriate value substitutions [10,11,21].

8. Conclusions

Using data envelopment analysis, multiple and varying inputs and outputs were considered together by separating the patient’s radiation treatment journey into three phases: planning, delivery, and quality. With bias-corrected DEA scores computed from the bootstrap model, efficient centers are better distinguished from one another based on their mean efficiencies compared to using VRS DEA results directly. The censored (Tobit) regression analysis identifies external determinants of efficiency, namely, center size (measured by medical resource level), diversification of radiation treatment, center catchment population, and teaching hospital designation. These determinants account for roughly 64% and 38% of variation in efficiencies in the planning and quality dimensions, respectively (but they did not significantly explain variation in the delivery dimension).
We highlight that this analysis is not prescriptive: while it can identify problem areas, it does not actually prescribe the specific actions for centers to take to reach their targets. Rather, with a larger number of RCCs for comparison, it can point decision-makers in directions that could lead to learning opportunities and beneficial collaborations between regional cancer centers for meeting provincial goals.
There are interesting empirical and theoretical future research directions for our work. For example, how can we combine the three efficiency scores into a single measure and evaluate all the centers with this new score? A single score would be more practical and easier to understand and act upon; however, building such a theoretical and empirical framework is an open research question. Another option would be to compare all centers within a province to those in another province while taking into account provincial differences, giving policy makers a broader reference set against which to drive improvement.

Author Contributions

Conceptualization, M.A.B. and F.F.R.; methodology, T.B., M.A.B. and F.F.R.; software, T.B.; formal analysis, T.B.; investigation, T.B., M.A.B. and F.F.R.; writing—original draft preparation, T.B. and M.A.B.; writing—review and editing, T.B., M.A.B., F.F.R. and D.B.; visualization, T.B.; funding acquisition, M.A.B., F.F.R. and D.B. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Cancer Care Ontario, Planning and Regional Programs.

Institutional Review Board Statement

This study was approved by the Western University Health Sciences Research Ethics Board. The project ID is 112072 and the approval date was 7 August 2018.

Informed Consent Statement

Not applicable.

Data Availability Statement

Ontario Health is prohibited from making the data used in this research publicly accessible if it includes potentially identifiable personal health information and/or personal information as defined in Ontario law, specifically the Personal Health Information Protection Act (PHIPA) and the Freedom of Information and Protection of Privacy Act (FIPPA). Upon request, data de-identified to a level suitable for public release may be provided by Ontario Health.

Acknowledgments

Parts of this material are based on data and information provided by Ontario Health (Cancer Care Ontario) and include data received by Ontario Health (Cancer Care Ontario) from the Canadian Institute for Health Information (CIHI), and Institute for Clinical Evaluative Services (ICES). The opinions, results, views, and conclusions reported in this publication are those of the authors and do not reflect those of Ontario Health (Cancer Care Ontario), CIHI, and/or ICES. No endorsement by Ontario Health (Cancer Care Ontario), CIHI, and/or ICES is intended or should be inferred. We thank Alexander Smith for his support in this project.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A. Bootstrap DEA Model Solution Approach

Appendix A.1. Input-Oriented VRS DEA Model

We follow the DEA frameworks established in [22,23], as presented in [13]. We begin by detailing the primal linear program from which we construct the dual model used in the bootstrap DEA. Let $c = 1, 2, 3$ denote the planning, delivery, and quality dimensions of the radiation treatment program, respectively. Let $j = 1, 2, \ldots, n$ index the regional cancer centers (DMUs), $i = 1, 2, \ldots, m_c$ the inputs, and $r = 1, 2, \ldots, s_c$ the outputs for each DMU. We use the following notation for every treatment dimension $c$ and year $t$:
$\theta_j^{ct}$ = relative efficiency of DMU $j$
$x_{ij}^{ct}$ = input $i$ of DMU $j$
$y_{rj}^{ct}$ = output $r$ of DMU $j$
$v_i^{ct}$ = implicit price of input $i$
$u_r^{ct}$ = implicit price of output $r$
$u_0$ = a variable unrestricted in sign (urs)
For every DMU $k = 1, 2, \ldots, n$; treatment dimension $c = 1, 2, 3$; and year $t = 2013, \ldots, 2016$, we have:

$$[P]_k^{ct}: \quad \max \; \theta_k^{ct} = \sum_{r=1}^{s_{ct}} u_r^{ct} y_{rk}^{ct} + u_0 \tag{A1}$$

$$\text{s.t.} \quad \sum_{r=1}^{s_{ct}} u_r^{ct} y_{rj}^{ct} - \sum_{i=1}^{m_{ct}} v_i^{ct} x_{ij}^{ct} + u_0 \le 0, \quad j = 1, 2, \ldots, n \tag{A2}$$

$$\sum_{i=1}^{m_{ct}} v_i^{ct} x_{ik}^{ct} = 1 \tag{A3}$$

$$u_r^{ct} \ge 0, \quad r = 1, 2, \ldots, s_{ct} \tag{A4}$$

$$v_i^{ct} \ge 0, \quad i = 1, 2, \ldots, m_{ct} \tag{A5}$$

$$u_0 \ \text{urs} \tag{A6}$$
Constraint (A2) ensures the weighted sum of outputs for a DMU is less than or equal to the weighted sum of its inputs, while (A3) normalizes the weighted inputs of DMU $k$ to equal 1. Constraints (A4)–(A6) define the domains of the variables. The objective function maximizes the weighted outputs of DMU $k$: if the optimum equals 1, then DMU $k$ is considered efficient; otherwise it is deemed inefficient. (Note that we omit the superscripts $c$ and $t$ from here on to streamline our notation, but all models are solved for every treatment dimension $c$ and year $t$.)
An inefficient DMU need not despair: information regarding how it might reach the efficient frontier can be obtained by solving the dual of $[P]$. After introducing weights $\lambda_j$ for each DMU $j$ and the dual efficiency $\theta_k$ of DMU $k$, the dual of $[P]$ is formulated as follows:
$$[DEA]_k: \quad \min \; \theta_k \tag{A7}$$

$$\text{s.t.} \quad \sum_{j=1}^{n} x_{ij} \lambda_j \le x_{ik} \theta_k, \quad i = 1, 2, \ldots, m \tag{A8}$$

$$\sum_{j=1}^{n} y_{rj} \lambda_j \ge y_{rk}, \quad r = 1, 2, \ldots, s \tag{A9}$$

$$\lambda_j \ge 0, \quad j = 1, 2, \ldots, n \tag{A10}$$

$$\sum_{j=1}^{n} \lambda_j = 1 \tag{A11}$$
Here, we minimize the dual efficiency of DMU $k$: Constraint (A8) ensures the weighted sum of reference inputs $i$ is less than or equal to the (scaled) input of DMU $k$, and Constraint (A9) ensures the weighted sum of reference outputs $r$ is greater than or equal to the output of DMU $k$. Inequalities (A10) ensure the weights are non-negative, and Constraint (A11) imposes the VRS convexity condition.
An efficient DMU $k$ will yield $\theta_k = 1$. For an inefficient DMU $k$, solving $[DEA]$ will give $\theta_k < 1$. The reference set (or benchmarks) for this inefficient center is identified by observing the efficient DMUs $j$ whose $\lambda_j > 0$.
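As a concrete illustration of solving $[DEA]_k$, the sketch below uses scipy’s LP solver on a toy single-input, single-output data set (the study itself used CPLEX via its Python API; the numbers here are illustrative):

```python
# Sketch: input-oriented VRS DEA for each DMU, solved as a linear program.
# Toy data: one input and one output per DMU; DMU 3 is inefficient.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0], [4.0], [7.0], [8.0]])   # inputs (rows = DMUs)
Y = np.array([[1.0], [3.0], [3.2], [3.8]])   # outputs

def vrs_efficiency(k, X, Y):
    n, m = X.shape
    s = Y.shape[1]
    # Decision variables: [theta, lambda_1, ..., lambda_n]
    c = np.zeros(n + 1)
    c[0] = 1.0                                       # minimize theta
    A_ub = np.vstack([
        np.hstack([-X[k].reshape(m, 1), X.T]),       # (A8): sum_j x_ij lam_j <= theta x_ik
        np.hstack([np.zeros((s, 1)), -Y.T]),         # (A9): sum_j y_rj lam_j >= y_rk
    ])
    b_ub = np.concatenate([np.zeros(m), -Y[k]])
    A_eq = np.concatenate([[0.0], np.ones(n)]).reshape(1, -1)   # (A11): sum lam = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(None, None)] + [(0, None)] * n)      # (A10): lam >= 0
    return res.x[0]

for k in range(len(X)):
    print(f"DMU {k + 1}: theta = {vrs_efficiency(k, X, Y):.3f}")
```

With this data set, DMUs 1, 2, and 4 lie on the frontier ($\theta_k = 1$), while DMU 3 obtains $\theta_3 = 5/7 \approx 0.714$ by benchmarking against a convex combination of DMUs 2 and 4; the study’s models differ only in having more input and output columns.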

Appendix A.2. Bootstrap DEA

We follow the approach outlined in [38], which uses the Shephard input distance $\delta$ ($\delta \ge 1$) rather than the Farrell efficiency score $\theta$ (where $\theta \le 1$ and $\delta = \theta^{-1}$). For each year $t$ and treatment dimension $c$, the procedure is as follows (a code sketch follows the list):
(1) Solve an input-oriented VRS DEA and calculate $\delta_j$ for $j = 1, 2, \ldots, n$.
(2) Let $\{\delta_1, \delta_2, \ldots, \delta_n, 2-\delta_1, 2-\delta_2, \ldots, 2-\delta_n\}$ be a list of mirrored efficiencies.
(3) For $B = 1, \ldots, 2000$:
(a) Generate a standardized realization of efficiencies through sampling with replacement from the mirrored efficiencies;
(b) Using this sample, generate fictional reference inputs, $x^*$;
(c) Solve the VRS DEA using the original outputs and inputs, with $x^*$ as reference inputs for DMUs $1, \ldots, n$.
(4) Calculate the bias-corrected mean efficiency, $\hat{\theta}_j$, and quantiles for each DMU $j$.
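The sketch below mirrors steps (1)–(4) in Python, reusing the LP from the Appendix A.1 sketch generalized to accept a separate reference-input set. It omits the kernel-smoothing step of the full algorithm in [38], so it illustrates the structure of the procedure rather than reproducing the exact estimator:

```python
# Sketch: simplified bootstrap DEA following steps (1)-(4) above.
# Kernel smoothing from the full algorithm in [38] is omitted.
import numpy as np
from scipy.optimize import linprog

def vrs_theta(k, X, Y, X_ref):
    """Input-oriented VRS efficiency of DMU k against reference inputs X_ref."""
    n, m = X_ref.shape
    s = Y.shape[1]
    c = np.zeros(n + 1)
    c[0] = 1.0
    A_ub = np.vstack([
        np.hstack([-X[k].reshape(m, 1), X_ref.T]),   # reference inputs vs. DMU k
        np.hstack([np.zeros((s, 1)), -Y.T]),         # outputs must cover y_k
    ])
    b_ub = np.concatenate([np.zeros(m), -Y[k]])
    A_eq = np.concatenate([[0.0], np.ones(n)]).reshape(1, -1)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.x[0]

def bootstrap_dea(X, Y, B=2000, seed=0):
    rng = np.random.default_rng(seed)
    n = len(X)
    theta = np.array([vrs_theta(k, X, Y, X) for k in range(n)])   # step (1)
    delta = 1.0 / theta                               # Shephard distances (>= 1)
    mirrored = np.concatenate([delta, 2.0 - delta])   # step (2)
    boot = np.empty((B, n))
    for b in range(B):                                # step (3)
        draw = rng.choice(mirrored, size=n, replace=True)   # (3a) resample
        draw = np.where(draw < 1.0, 2.0 - draw, draw)       # reflect back to >= 1
        X_star = X * (draw / delta)[:, None]                # (3b) fictional inputs
        boot[b] = [vrs_theta(k, X, Y, X_star) for k in range(n)]  # (3c)
    theta_bc = 2 * theta - boot.mean(axis=0)          # step (4): bias correction
    q05, q95 = np.quantile(boot, [0.05, 0.95], axis=0)
    return theta_bc, q05, q95
```

Applied to the toy data from the Appendix A.1 sketch, the bias-corrected scores typically fall below the point estimates, the same $\hat{\theta} < \theta$ pattern visible throughout Table 3, Table 4 and Table 5.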
This bootstrap approach repeatedly samples inputs to generate a range of efficiencies, ultimately providing us with an average efficiency score that better distinguishes efficient DMUs from one another, even if the differences are small. Table 3, Table 4 and Table 5 show efficiency scores derived from $[DEA]$, along with the bias-corrected mean efficiencies and their 5th and 95th quantiles (denoted by $\theta$, $\hat{\theta}$, $q_{0.05}$, and $q_{0.95}$, respectively). The ranking of DMUs remains, in general, consistent with the results from solving $[DEA]$ just once.

References

  1. Jaffray, D.A.; Gospodarowicz, M.K. Radiation Therapy for Cancer. In Cancer: Disease Control Priorities, 3rd ed.; Gelband, H., Jha, P., Sankaranarayanan, R., Horton, S., Eds.; The International Bank for Reconstruction and Development: Washington, DC, USA, 2015; Volume 3, Chapter 14. [Google Scholar] [CrossRef] [Green Version]
  2. Citrin, D.E. Recent developments in radiotherapy. N. Engl. J. Med. 2017, 377, 1065–1075. [Google Scholar] [CrossRef] [PubMed]
  3. Cohen, J.; Harper, A.; Nichols, E.M.; Rao, G.G.; Mohindra, P.; Roque, D.M. Barriers to timely completion of radiation therapy in patients with cervical cancer in an urban tertiary care center. Cureus 2017, 9, e1681. [Google Scholar] [CrossRef] [Green Version]
  4. Lin, S.M.; Ku, H.Y.; Chang, T.C.; Liu, T.W.; Hong, J.H. The prognostic impact of overall treatment time on disease outcome in uterine cervical cancer patients treated primarily with concomitant chemoradiotherapy: A nationwide Taiwanese cohort study. Oncotarget 2017, 8, 85203. [Google Scholar] [CrossRef] [PubMed]
  5. Olivotto, I.A.; Lesperance, M.L.; Truong, P.T.; Nichol, A.; Berrang, T.; Tyldesley, S.; Germain, F.; Speers, C.; Wai, E.; Holloway, C.; et al. Intervals longer than 20 weeks from breast-conserving surgery to radiation therapy are associated with inferior outcome for women with early-stage breast cancer who are not receiving chemotherapy. J. Clin. Oncol. 2009, 27, 16–23. [Google Scholar] [CrossRef] [PubMed]
  6. Harnett, N.; Bak, K.; Lockhart, E.; Ang, M.; Zychla, L.; Gutierrez, E.; Warde, P. The Clinical Specialist Radiation Therapist (CSRT): A case study exploring the effectiveness of a new advanced practice role in Canada. J. Med. Radiat. Sci. 2018, 65, 86–96. [Google Scholar] [CrossRef] [PubMed]
  7. Cancer Care Ontario. Ontario Cancer Plan 4: 2015–2019; Cancer Care Ontario: Toronto, ON, Canada, 2015. [Google Scholar]
  8. Cantor, V.J.M.; Poh, K.L. Integrated Analysis of Healthcare Efficiency: A Systematic Review. J. Med. Syst. 2018, 42, 8. [Google Scholar] [CrossRef]
  9. Dexter, F.; O’Neill, L. Data envelopment analysis to determine by how much hospitals can increase elective inpatient surgical workload for each specialty. Anesth. Analg. 2004, 99, 1492–1500. [Google Scholar] [CrossRef]
  10. Aletras, V.; Kontodimopoulos, N.; Zagouldoudis, A.; Niakas, D. The short-term effect on technical and scale efficiency of establishing regional health systems and general management in Greek NHS hospitals. Health Policy 2007, 83, 236–245. [Google Scholar] [CrossRef]
  11. Allin, S.; Veillard, J.; Wang, L.; Grignon, M. How can health system efficiency be improved in Canada? Healthc. Policy 2015, 11, 33. [Google Scholar] [CrossRef] [Green Version]
  12. Zakowska, I.; Godycki-Cwirko, M. Data envelopment analysis applications in primary health care: A systematic review. Fam. Pract. 2020, 37, 147–153. [Google Scholar] [CrossRef]
  13. Özcan, Y.A. Health Care Benchmarking and Performance Evaluation: An Assessment Using Data Envelopment Analysis, 2nd ed.; Springer Science + Business Media: New York, NY, USA, 2014. [Google Scholar]
  14. Kohl, S.; Schoenfelder, J.; Fügener, A.; Brunner, J.O. The use of Data Envelopment Analysis (DEA) in healthcare with a focus on hospitals. Health Care Manag. Sci. 2019, 22, 245–286. [Google Scholar] [CrossRef] [PubMed]
  15. Araújo, C.; Barros, C.P.; Wanke, P. Efficiency determinants and capacity issues in Brazilian for-profit hospitals. Health Care Manag. Sci. 2014, 17, 126–138. [Google Scholar] [CrossRef] [PubMed]
  16. Büchner, V.A.; Hinz, V.; Schreyögg, J. Health systems: Changes in hospital efficiency and profitability. Health Care Manag. Sci. 2016, 19, 130–143. [Google Scholar] [CrossRef] [PubMed]
  17. Sherlaw-Johnson, C.; Gallivan, S.; Jenkins, D. Evaluating cervical cancer screening programmes for developing countries. Int. J. Cancer 1997, 72, 210–216. [Google Scholar] [CrossRef]
  18. Jensen, A.; Vejborg, I.; Severinsen, N.; Nielsen, S.; Rank, F.; Mikkelsen, G.J.; Hilden, J.; Vistisen, D.; Dyreborg, U.; Lynge, E. Performance of clinical mammography: A nationwide study from Denmark. Int. J. Cancer 2006, 119, 183–191. [Google Scholar] [CrossRef]
  19. Van Luijt, P.; Heijnsdijk, E.; de Koning, H. Cost-effectiveness of the Norwegian breast cancer screening program. Int. J. Cancer 2017, 140, 833–840. [Google Scholar] [CrossRef] [Green Version]
  20. Cheng, C.Y.; Datzmann, T.; Hernandez, D.; Schmitt, J.; Schlander, M. Do certified cancer centers provide more cost-effective care? A health economic analysis of colon cancer care in Germany using administrative data. Int. J. Cancer 2021, 149, 1744–1754. [Google Scholar] [CrossRef]
  21. Langabeer, J.R.; Özcan, Y.A. The economics of cancer care: Longitudinal changes in provider efficiency. Health Care Manag. Sci. 2009, 12, 192–200. [Google Scholar] [CrossRef]
  22. Charnes, A.; Cooper, W.W.; Rhodes, E. Measuring the efficiency of decision making units. Eur. J. Oper. Res. 1978, 2, 429–444. [Google Scholar] [CrossRef]
  23. Banker, R.D.; Charnes, A.; Cooper, W.W. Some models for estimating technical and scale inefficiencies in data envelopment analysis. Manag. Sci. 1984, 30, 1078–1092. [Google Scholar] [CrossRef] [Green Version]
  24. Simar, L.; Wilson, P.W. Sensitivity analysis of efficiency scores: How to bootstrap in nonparametric frontier models. Manag. Sci. 1998, 44, 49–61. [Google Scholar] [CrossRef] [Green Version]
  25. Lothgren, M.; Tambour, M. Testing scale efficiency in DEA models: A bootstrapping approach. Appl. Econ. 1999, 31, 1231–1237. [Google Scholar] [CrossRef]
  26. Périco, A.E.; Santana, N.B.; Rebelatto, D.A.N. Estimating the efficiency from Brazilian banks: A bootstrapped Data Envelopment Analysis (DEA). Production 2016, 26, 551–561. [Google Scholar] [CrossRef]
  27. Lee, B.L.; Worthington, A.; Wilson, C. Learning environment and primary school efficiency: A DEA bootstrap truncated regression analysis. Int. J. Educ. Res. 2019, 33, 678–697. [Google Scholar] [CrossRef] [Green Version]
  28. Kim, C.; Kim, H.J. A study on healthcare supply chain management efficiency: Using bootstrap data envelopment analysis. Health Care Manag. Sci. 2019, 22, 534–548. [Google Scholar] [CrossRef]
  29. Statistics Canada. Population and dwelling counts, for Canada and forward sortation areas as reported by the respondents, 2011 Census (table). Population and Dwelling Count Highlight Tables. 2011 Census. In Statistics Canada Catalogue No. 98-310-XWE2011002; Statistics Canada: Ottawa, ON, Canada, 2012. [Google Scholar]
  30. Statistics Canada. Population and dwelling counts, for Canada and forward sortation areas© as reported by the respondents, 2016 Census (table). Population and Dwelling Count Highlight Tables. 2016 Census. In Statistics Canada Catalogue No. 98-402-X2016001; Statistics Canada: Ottawa, ON, Canada, 2017. [Google Scholar]
  31. World Bank. World Development Indicators; World Bank: Washington, DC, USA, 2021. [Google Scholar]
  32. Ontario Ministry of Health and Long-Term Care. Group A: Classification of Hospitals. Available online: https://www.health.gov.on.ca/en/common/system/services/hosp/group_a.aspx (accessed on 4 May 2021).
  33. Cancer Care Ontario. Radiation Treatment Capital Investment Strategy April 2012; Cancer Care Ontario: Toronto, ON, Canada, 2012. [Google Scholar]
  34. Cancer Care Ontario. Radiation Treatment Capital Investment Strategy 2018; Cancer Care Ontario: Toronto, ON, Canada, 2018. [Google Scholar]
  35. Simar, L.; Wilson, P.W. A general methodology for bootstrapping in non-parametric frontier models. J. Appl. Stat. 2000, 27, 779–802. [Google Scholar] [CrossRef]
  36. Ozcan, Y.A.; Legg, J.S. Performance measurement for radiology providers: A national study. Int. J. Healthc. Technol. Manag. 2014, 14, 209–221. [Google Scholar] [CrossRef]
  37. Cancer Care Ontario. Radiation Treatment Program: Implementation Plan 2019–2023; Cancer Care Ontario: Toronto, ON, Canada, 2019. [Google Scholar]
  38. Behr, A. Production and Efficiency Analysis with R; Springer: Berlin/Heidelberg, Germany, 2015. [Google Scholar]
Figure 1. Key dates.
Figure 2. Visualization of a VRS model with a single input and single output.
Figure 3. Mean bias-corrected planning efficiency scores.
Figure 4. Comparing planning dimension performance: C8 and C14.
Figure 5. Benchmarking according to planning and quality dimensions (2016).
Figure 6. Benchmarking according to planning and delivery dimensions (2016).
Figure 7. Benchmarking according to delivery and quality dimensions (2016).
Table 1. Characteristics of regional cancer centers in Ontario.

| RCC | Teaching Hospital (Yes = 1) | Medical Resource Level a | Diversification % a | Catchment Population (’000s) a |
|-----|-----|-----|-----|-----|
| C1 | 0 | [1, 6) b | 32.62 | ≤500 b |
| C2 | 0 | [6, 10) | 21.28 | >500 |
| C3 | 0 | [6, 10) | 23.52 | >500 |
| C4 | 0 | [6, 10) | 20.06 | >500 |
| C5 | 0 | [1, 6) | 23.33 | ≤500 |
| C6 | 0 | [1, 6) | 16.33 | >500 |
| C7 | 0 | [6, 10) | 21.01 | ≤500 |
| C8 | 1 | ≥10 | 33.63 | ≤500 |
| C9 | 1 | ≥10 | 36.49 | >500 |
| C10 | 1 | [1, 6) | 31.50 | ≤500 |
| C11 | 1 | ≥10 | 55.69 | >500 |
| C12 | 1 | [6, 10) | 31.80 | ≤500 |
| C13 | 1 | ≥10 | 40.13 | ≤500 |
| C14 | 1 | ≥10 | 39.70 | ≤500 |
| C15 | 1 | [6, 10) | 30.42 | ≤500 |

a Yearly average from 2013 to 2016. b Values presented in this table have been categorized to maintain center anonymity; each center’s precise values were used in our quantitative analysis.
Table 2. Input and output parameters for DEA models.

| Parameter | Description | Dim (I or O) 1 | 2013 Mean | 2013 SD | 2014 Mean | 2014 SD | 2015 Mean | 2015 SD | 2016 Mean | 2016 SD |
|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|
| Clinic Visits | Number of new radiation and follow-up clinic visits with a physician | P(I) | 12,568.87 | 7615.68 | 13,256.07 | 7458.43 | 13,910.4 | 7832.72 | 13,357.07 | 7782.20 |
| Planning Visits | Number of radiation planning visits | P(I) | 961.87 | 1307.27 | 889.40 | 1229.06 | 929.20 | 1313.66 | 939.53 | 1404.12 |
| Simulation Visits | Number of visits involving conventional simulation, CT simulation, or emerging imaging methods | P(I) | 3950.87 | 2450.01 | 3989.93 | 2730.23 | 3999.93 | 2505.11 | 4103.20 | 2479.76 |
| RTC Target | % of patients whose time from referral to a radiation oncologist until the consult occurred (referral-to-consult; RTC) was ≤14 days | P(O) | 82.81 | 8.29 | 86.51 | 6.39 | 87.20 | 6.58 | 85.67 | 6.14 |
| RTT Target | % of patients who started treatment ≤14 days from the date the patient was deemed ‘ready-to-treat’ (RTT) by the radiation oncologist responsible for that patient’s care | P(O), D(I) 2 | 93.58 | 4.42 | 94.03 | 3.21 | 92.81 | 3.44 | 90.74 | 7.17 |
| MR Capacity | Available MR equipment hours | D(I) | 21,310.40 | 11,790.95 | 21,310.40 | 11,790.95 | 21,310.40 | 11,790.95 | 19,136.80 | 12,658.32 |
| MR Utilization | % time MR equipment is in use, calculated as the number of hours MR equipment was in use divided by MR capacity | D(O) | 89.60 | 15.08 | 90.60 | 15.38 | 91.60 | 15.65 | 81.53 | 13.38 |
| Treatment Visits | Number of visits where radiation is given with a LINAC | D(O), Q(I) | 48,462.67 | 38,107.06 | 48,491.93 | 39,037.90 | 50,253.33 | 38,813.57 | 49,702.87 | 37,659.65 |
| Patient Support Visits | Patient education and co-ordination/scheduling of radiation-related visits | Q(I) | 39,085.73 | 34,866.61 | 41,535.40 | 38,200.28 | 44,525.00 | 39,306.04 | 43,830.27 | 40,144.35 |
| Quality Assurance Visits | Some activities include image acquisition, use of respiratory gating equipment, peer review, and fluence/dosimetry checks | Q(I) | 77,278.80 | 91,423.45 | 79,555.27 | 87,393.01 | 84,650.60 | 89,687.21 | 87,858.20 | 91,477.60 |
| Survival Rate (%) | 1 − (Deaths ÷ Incidents) | Q(O) | 81.91 | 17.80 | 81.07 | 18.14 | 84.95 | 14.90 | 81.31 | 17.74 |
| Inverse Subsequent Diagnosis Rate (%) | 1 − (Subsequent Cancer Diagnoses ÷ Incidents) | Q(O) | 85.42 | 1.64 | 84.08 | 1.64 | 78.35 | 2.17 | 78.61 | 2.24 |

1 Dim = dimension (P = planning, D = delivery, Q = quality), I = input, O = output; 2 the inverse of the RTT target was used as the delivery input.
Table 3. Planning dimension efficiencies, bias-corrected bootstrap efficiencies, and quantiles. Column groups give θ, θ̂, q0.05, and q0.95 for 2013, 2014, 2015, and 2016, from left to right.

| DMU | θ | θ̂ | q0.05 | q0.95 | θ | θ̂ | q0.05 | q0.95 | θ | θ̂ | q0.05 | q0.95 | θ | θ̂ | q0.05 | q0.95 |
|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|
| C1 | 0.743 | 0.653 | 0.538 | 0.736 | 0.454 | 0.384 | 0.289 | 0.451 | 0.738 | 0.625 | 0.506 | 0.722 | 0.838 | 0.696 | 0.605 | 0.808 |
| C2 | 1.000 | 0.768 | 0.557 | 0.990 | 1.000 | 0.740 | 0.544 | 0.983 | 1.000 | 0.721 | 0.535 | 0.976 | 1.000 | 0.721 | 0.568 | 0.957 |
| C3 | 1.000 | 0.765 | 0.546 | 0.991 | 1.000 | 0.737 | 0.544 | 0.983 | 1.000 | 0.798 | 0.587 | 0.972 | 1.000 | 0.727 | 0.570 | 0.959 |
| C4 | 1.000 | 0.887 | 0.754 | 0.992 | 0.688 | 0.602 | 0.507 | 0.677 | 0.388 | 0.329 | 0.274 | 0.382 | 1.000 | 0.737 | 0.590 | 0.952 |
| C5 | 1.000 | 0.771 | 0.567 | 0.991 | 1.000 | 0.752 | 0.580 | 0.985 | 1.000 | 0.762 | 0.629 | 0.976 | 1.000 | 0.744 | 0.616 | 0.953 |
| C6 | 1.000 | 0.874 | 0.709 | 0.993 | 1.000 | 0.733 | 0.544 | 0.983 | 1.000 | 0.807 | 0.673 | 0.978 | 1.000 | 0.750 | 0.627 | 0.955 |
| C7 | 1.000 | 0.826 | 0.637 | 0.992 | 1.000 | 0.800 | 0.614 | 0.984 | 0.938 | 0.769 | 0.603 | 0.916 | 1.000 | 0.749 | 0.615 | 0.953 |
| C8 | 0.228 | 0.198 | 0.152 | 0.226 | 0.285 | 0.246 | 0.197 | 0.281 | 0.250 | 0.211 | 0.168 | 0.245 | 0.445 | 0.363 | 0.310 | 0.431 |
| C9 | 0.342 | 0.312 | 0.263 | 0.340 | 0.264 | 0.228 | 0.193 | 0.260 | 0.307 | 0.267 | 0.232 | 0.300 | 0.280 | 0.225 | 0.194 | 0.267 |
| C10 | 0.585 | 0.513 | 0.370 | 0.580 | 1.000 | 0.761 | 0.563 | 0.981 | 1.000 | 0.724 | 0.535 | 0.973 | 1.000 | 0.718 | 0.568 | 0.948 |
| C11 | 0.602 | 0.548 | 0.453 | 0.597 | 0.792 | 0.706 | 0.619 | 0.779 | 0.731 | 0.629 | 0.545 | 0.713 | 1.000 | 0.754 | 0.632 | 0.955 |
| C12 | 0.176 | 0.159 | 0.133 | 0.175 | 0.213 | 0.189 | 0.168 | 0.210 | 0.128 | 0.108 | 0.088 | 0.126 | 0.255 | 0.208 | 0.181 | 0.244 |
| C13 | 1.000 | 0.763 | 0.546 | 0.991 | 0.384 | 0.334 | 0.267 | 0.381 | 0.422 | 0.357 | 0.285 | 0.416 | 1.000 | 0.720 | 0.569 | 0.952 |
| C14 | 0.168 | 0.152 | 0.121 | 0.167 | 0.158 | 0.140 | 0.114 | 0.156 | 0.213 | 0.194 | 0.171 | 0.211 | 0.230 | 0.185 | 0.155 | 0.224 |
| C15 | 0.885 | 0.786 | 0.633 | 0.878 | 1.000 | 0.733 | 0.544 | 0.983 | 0.891 | 0.768 | 0.582 | 0.874 | 1.000 | 0.722 | 0.568 | 0.948 |
Table 4. Delivery dimension efficiencies, bias-corrected bootstrap efficiencies, and quantiles. Column groups give θ, θ̂, q0.05, and q0.95 for 2013, 2014, 2015, and 2016, from left to right.

| DMU | θ | θ̂ | q0.05 | q0.95 | θ | θ̂ | q0.05 | q0.95 | θ | θ̂ | q0.05 | q0.95 | θ | θ̂ | q0.05 | q0.95 |
|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|
| C1 | 1.000 | 0.968 | 0.867 | 0.999 | 1.000 | 0.970 | 0.913 | 0.998 | 1.000 | 0.970 | 0.903 | 0.998 | 1.000 | 0.976 | 0.865 | 1.000 |
| C2 | 0.988 | 0.978 | 0.966 | 0.986 | 0.981 | 0.970 | 0.954 | 0.980 | 1.000 | 0.987 | 0.973 | 0.999 | 1.000 | 0.990 | 0.975 | 1.000 |
| C3 | 1.000 | 0.978 | 0.937 | 0.999 | 1.000 | 0.975 | 0.935 | 0.998 | 1.000 | 0.974 | 0.931 | 0.999 | 1.000 | 0.976 | 0.836 | 1.000 |
| C4 | 0.907 | 0.899 | 0.889 | 0.906 | 0.924 | 0.919 | 0.912 | 0.923 | 0.942 | 0.936 | 0.929 | 0.941 | 0.945 | 0.942 | 0.938 | 0.945 |
| C5 | 0.985 | 0.977 | 0.965 | 0.984 | 0.981 | 0.967 | 0.948 | 0.979 | 0.986 | 0.978 | 0.963 | 0.985 | 0.976 | 0.970 | 0.952 | 0.976 |
| C6 | 0.980 | 0.969 | 0.947 | 0.979 | 1.000 | 0.981 | 0.960 | 0.999 | 0.886 | 0.875 | 0.848 | 0.885 | 1.000 | 0.980 | 0.912 | 1.000 |
| C7 | 0.963 | 0.955 | 0.946 | 0.962 | 0.975 | 0.966 | 0.955 | 0.974 | 0.974 | 0.965 | 0.955 | 0.973 | 0.996 | 0.992 | 0.979 | 0.996 |
| C8 | 0.926 | 0.919 | 0.912 | 0.925 | 0.953 | 0.945 | 0.936 | 0.952 | 0.977 | 0.970 | 0.961 | 0.976 | 0.951 | 0.949 | 0.944 | 0.951 |
| C9 | 0.997 | 0.987 | 0.964 | 0.996 | 0.997 | 0.985 | 0.964 | 0.995 | 1.000 | 0.989 | 0.971 | 0.999 | 0.980 | 0.974 | 0.959 | 0.980 |
| C10 | 1.000 | 0.987 | 0.960 | 0.999 | 1.000 | 0.985 | 0.947 | 0.999 | 1.000 | 0.986 | 0.953 | 0.999 | 1.000 | 0.974 | 0.836 | 1.000 |
| C11 | 1.000 | 0.967 | 0.867 | 0.999 | 1.000 | 0.969 | 0.913 | 0.999 | 1.000 | 0.970 | 0.904 | 0.998 | 1.000 | 0.974 | 0.837 | 1.000 |
| C12 | 1.000 | 0.986 | 0.973 | 0.999 | 0.911 | 0.902 | 0.890 | 0.910 | 0.943 | 0.934 | 0.923 | 0.941 | 1.000 | 0.992 | 0.978 | 1.000 |
| C13 | 0.840 | 0.835 | 0.829 | 0.839 | 0.898 | 0.891 | 0.883 | 0.897 | 0.904 | 0.897 | 0.890 | 0.903 | 0.804 | 0.802 | 0.799 | 0.803 |
| C14 | 0.984 | 0.975 | 0.963 | 0.983 | 0.982 | 0.972 | 0.958 | 0.981 | 0.978 | 0.969 | 0.956 | 0.977 | 0.992 | 0.990 | 0.987 | 0.992 |
| C15 | 0.969 | 0.962 | 0.953 | 0.968 | 0.967 | 0.961 | 0.954 | 0.965 | 0.982 | 0.976 | 0.968 | 0.981 | 0.980 | 0.977 | 0.971 | 0.980 |
Table 5. Quality dimension efficiencies, bias-corrected bootstrap efficiencies, and quantiles. Column groups give θ, θ̂, q0.05, and q0.95 for 2013, 2014, 2015, and 2016, from left to right.

| DMU | θ | θ̂ | q0.05 | q0.95 | θ | θ̂ | q0.05 | q0.95 | θ | θ̂ | q0.05 | q0.95 | θ | θ̂ | q0.05 | q0.95 |
|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|
| C1 | 1.000 | 0.719 | 0.567 | 0.958 | 1.000 | 0.713 | 0.551 | 0.986 | 1.000 | 0.673 | 0.556 | 0.924 | 1.000 | 0.700 | 0.554 | 0.966 |
| C2 | 1.000 | 0.680 | 0.531 | 0.958 | 1.000 | 0.711 | 0.550 | 0.984 | 1.000 | 0.698 | 0.577 | 0.933 | 1.000 | 0.698 | 0.548 | 0.965 |
| C3 | 0.656 | 0.502 | 0.369 | 0.633 | 1.000 | 0.711 | 0.551 | 0.981 | 1.000 | 0.667 | 0.554 | 0.921 | 1.000 | 0.696 | 0.546 | 0.964 |
| C4 | 1.000 | 0.676 | 0.514 | 0.962 | 1.000 | 0.776 | 0.561 | 0.985 | 1.000 | 0.673 | 0.558 | 0.926 | 1.000 | 0.773 | 0.571 | 0.967 |
| C5 | 1.000 | 0.679 | 0.519 | 0.957 | 1.000 | 0.714 | 0.551 | 0.983 | 1.000 | 0.672 | 0.556 | 0.921 | 1.000 | 0.696 | 0.547 | 0.960 |
| C6 | 0.249 | 0.193 | 0.142 | 0.243 | 0.817 | 0.693 | 0.516 | 0.805 | 0.745 | 0.572 | 0.482 | 0.694 | 0.641 | 0.500 | 0.399 | 0.621 |
| C7 | 1.000 | 0.696 | 0.544 | 0.964 | 1.000 | 0.810 | 0.573 | 0.984 | 1.000 | 0.674 | 0.561 | 0.929 | 1.000 | 0.695 | 0.548 | 0.960 |
| C8 | 0.767 | 0.583 | 0.433 | 0.747 | 0.192 | 0.165 | 0.137 | 0.189 | 0.228 | 0.168 | 0.139 | 0.211 | 0.192 | 0.154 | 0.117 | 0.188 |
| C9 | 0.055 | 0.043 | 0.033 | 0.054 | 0.224 | 0.197 | 0.169 | 0.220 | 0.214 | 0.165 | 0.142 | 0.200 | 0.165 | 0.150 | 0.132 | 0.164 |
| C10 | 0.407 | 0.357 | 0.301 | 0.403 | 0.910 | 0.841 | 0.760 | 0.905 | 1.000 | 0.794 | 0.685 | 0.927 | 0.998 | 0.903 | 0.802 | 0.988 |
| C11 | 0.243 | 0.188 | 0.144 | 0.233 | 0.232 | 0.194 | 0.145 | 0.228 | 0.225 | 0.167 | 0.135 | 0.216 | 0.215 | 0.170 | 0.125 | 0.211 |
| C12 | 0.306 | 0.229 | 0.175 | 0.294 | 0.372 | 0.341 | 0.305 | 0.366 | 0.411 | 0.323 | 0.277 | 0.394 | 0.448 | 0.390 | 0.339 | 0.440 |
| C13 | 0.085 | 0.073 | 0.060 | 0.084 | 0.218 | 0.203 | 0.185 | 0.215 | 0.278 | 0.223 | 0.192 | 0.262 | 0.248 | 0.216 | 0.190 | 0.243 |
| C14 | 0.111 | 0.091 | 0.068 | 0.109 | 0.182 | 0.155 | 0.120 | 0.180 | 0.216 | 0.164 | 0.137 | 0.205 | 0.236 | 0.202 | 0.162 | 0.233 |
| C15 | 1.000 | 0.672 | 0.514 | 0.955 | 0.391 | 0.338 | 0.275 | 0.385 | 1.000 | 0.669 | 0.556 | 0.923 | 0.494 | 0.439 | 0.384 | 0.489 |
Table 6. Regression variables and results.

| Dependent Variable | Description | Mean | SD | Coefficient (Planning) | Coefficient (Delivery) | Coefficient (Quality) |
|-----|-----|-----|-----|-----|-----|-----|
| MRL | Medical Resource Level: a measurement of medical equipment required for radiation treatment delivery | 7.68 | 4.20 | 0.2216 ** | 0.0055 * | 0.4234 * |
| DIV | Radiation Treatment Diversification: proportion of radiation treatments delivered to body regions other than pelvis or chest | 30.50 | 10.12 | 9.2068 *** | −0.1373 | −3.789 |
| POP | Center Catchment Population: population within a 50 km radius of the center (in ’000s) | 585.93 | 350.10 | −0.0007 | 0.0000 | 0.0000 |
| TEA | Teaching hospital designation (see Table 1 for binary designation) | — | — | −0.6216 | 0.0014 | 2.093 |
| R² | | | | 0.6364 | 0.0008 | 0.3846 |

*** p < 0.001, ** p < 0.01, * p < 0.05. Note: a positive (negative) coefficient indicates a decrease (increase) in efficiency.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
