Article

To Rank or Not to Rank with Indices? That Is the Question

Centre for Environment and Sustainability (CES), University of Surrey, Guildford, Surrey GU2 7XH, UK
Sustainability 2020, 12(14), 5572; https://doi.org/10.3390/su12145572
Submission received: 18 June 2020 / Revised: 4 July 2020 / Accepted: 6 July 2020 / Published: 10 July 2020

Abstract

Ranking countries via index-based league tables is now commonplace and is said by its proponents to provide countries with an ability to compare performance with their peers, spurring them to learn from others and make improvements. The Human Development Index (HDI) is arguably one of the most influential indices of its type in terms of reporting within the media and influence on development policy and funding allocation. It is often used as part of a suite of indices to assess sustainability. The index was first published in the Human Development Report (HDR) of 1990 and has appeared in each of the HDRs published since then. This paper reports the first research of its type designed to explore the impacts of methodological changes over 28 years (1991 to 2018) on the ranks of a sample of 135 countries appearing in the HDRs. Results suggest that methodological changes in the HDI have had a statistically significant impact on the ranking of the majority (82%) of countries in the sample, and the ranks of countries that tend to appear towards the top, middle, or bottom of the HDI league table are just as likely to be influenced by changes in HDI methodology. The paper suggests that after nearly 30 years of the HDI, there is an urgent need for independent and empirical research on the changes that it has helped bring about.

1. Introduction

The ongoing COVID-19 tragedy has, at the time of writing, been at least partially responsible for the deaths of over 400,000 people globally, and the number of infections is estimated to be over 7 million. These are sobering and upsetting figures, especially since the virus was only confirmed in countries outside of China in January 2020—barely 6 months ago. Since 30 March 2020, the UK government has been giving daily briefings on the state of play with COVID-19, although prior to that there were various statements from the prime minister and other senior politicians. The daily briefings have followed a broadly consistent format of a member of the government (e.g., the minister for health) making a statement, followed by a government expert (e.g., the chief medical officer) presenting a series of slides showing the progression of the disease in the UK. In the period between 30 March and 9 May 2020, one of the key slides each day comprised a comparison of the UK (England, Scotland, Wales, and Northern Ireland) with other countries in terms of cumulative deaths that could be ascribed to COVID-19. Figure 1 is a graph based on a compilation of COVID-19 mortality data for the UK and a number of European countries. The data used for Figure 1 are those provided by the UK government in a spreadsheet released for each of the daily briefings and have not been adjusted to account for differences in population size or indeed any other factors. Day zero was taken to be the first day when 50 or more deaths attributed to COVID-19 were reported in each country, and the lines show the cumulative number of deaths. In the UK, this was initially reported in terms of deaths taking place in hospitals but was later replaced by a more accurate figure based on deaths in all settings: primarily hospitals and care homes for the elderly.
The intention was clearly to show a comparison between the UK and other countries in terms of disease progression, even though it was acknowledged that there are differences between them in the ways in which the data were collected. Nonetheless, the graph clearly shows how the UK was heavily affected by the disease—far more so than any of the other European countries. As a result of this stark picture, questions were continually being asked of the politicians and government experts as to why the UK was doing so badly. What was it about the UK that resulted in such a high number of deaths due to COVID-19?
However, this “ranking” of countries via line graphs of mortality attributed to COVID-19 ceased as of 9 May—the last briefing where the graph was presented. The sudden omission of the ranking chart did receive a fair level of criticism in the UK press and indeed from politicians. Interestingly, there was a parallel debate around the same time amongst experts about the value of such ranking. Professor Sir David Spiegelhalter, Chairman of the Winton Centre for Risk and Evidence Communication at the University of Cambridge, wrote an article published in the Guardian newspaper on 30 April 2020 which included the following:
Every country has different ways of recording Covid-19 deaths: the large number of untested deaths in care homes have not featured in Spain’s statistics—which, like the UK’s, require a positive test result. The numbers may be useful for looking at trends, but they are not reliable indicators for comparing the absolute levels […] But, of course, people are not so interested in the numbers themselves—they want to say why they are so high, and ascribe blame. But if it’s difficult to rank this country, it’s even trickier to give reasons for our position.
(emphasis added by author)
This article was picked up by both the government and its experts as a reason why it is not good to have rankings of the UK with other countries. They all pointed to differences in the way data are collected across countries and indeed the ways in which mortality from COVID-19 was defined—points which Professor Spiegelhalter highlighted in his article. Thus, the argument goes, if data are not strictly comparable, then country rankings should cease. Indeed, Professor Spiegelhalter began to take issue with the use of his article and critique of rankings by politicians and in a later edition of the Guardian there can be found the following rebuff:
A statistician has asked the government to stop using an article he wrote for the Guardian as justification for why Britain’s death toll from coronavirus should not be compared with that of other countries. Prof David Spiegelhalter said in the piece published on 30 April that comparing the number of deaths from Covid-19 between countries was difficult because of the different methodologies used by governments to measure deaths. The day the article was published, England’s chief medical officer, Prof Chris Whitty, praised it during the No 10 daily coronavirus briefing, saying it showed that comparing death rates in different countries was a “fruitless exercise” […] Boris Johnson again referred to Spiegelhalter’s words on Wednesday in a response to the Labour leader, Keir Starmer, during prime minister’s questions, after Britain’s death toll became the highest in Europe and second highest globally […] However, a few hours later Spiegelhalter tweeted: “Polite request to PM and others: please stop using my Guardian article to claim we cannot make any international comparisons yet. I refer only to detailed league tables—of course we should now use other countries to try and learn why our numbers are high.”
(Harry Taylor, Guardian, 6 May 2020; emphasis added by author)
The message now appears to be far more nuanced; league table rankings appear to be unacceptable (“fruitless exercise”) while it seems other forms of “country comparisons” are acceptable if they help to provide “learning”. However, this debate over the validity and presumed usefulness of country rankings in league tables is just a recent—albeit intense, given the health and political ramifications—incarnation of one that is much older within the community of researchers and practitioners working with indicators, including those used to assess sustainability. The contextualized and selective use of indices by users has been well-known and reported in the literature for many years (a discussion for sustainability indicators can be found in [1]) and, ironically, the UK has been something of a leader in this field, as league table rankings have been embraced by governments of all political shades and have long existed in many sectors, including education and health. It often comes down to a fundamental question: to rank or not to rank performance with what are, by definition, a simplified set of indicators or even a single index?

2. Literature Review

This paper is not about COVID-19 or indeed about the UK or international response to the pandemic. Analyses of the epidemiology of the disease and the effectiveness (or not) of the various responses by agencies are best left to a later date. Instead, the paper will build on the prompt provided by this latest manifestation of the league table ranking debate surrounding the use of indices of country performance. It will seek to re-open that question by focusing on one of the oldest indices still published routinely in country league table format—the Human Development Index (HDI).
The HDI is arguably one of the most influential indices of its type in terms of reporting within the media and influence on development policy and allocation of funding. It was developed during the 1970s and 1980s [2], and it is often said that the motive was largely a desire by the United Nations Development Programme (UNDP) to try to move the development discourse away from what it saw as a strong emphasis at the time, from other powerful international agencies such as the World Bank, on economic development and towards a more balanced “human” development. Since 1990, the UNDP has published country league table rankings of the HDI as one of the tables within its annual Human Development Reports (HDRs), usually as the first in a suite of indicator tables at the end of the report [3,4]. The HDI encapsulated some of the social indicators readily available at country level at the time and combined them with a proxy measure of income (initially gross domestic product (GDP) or GDP/capita) within a theoretical framework of human development that drew heavily from Nobel Prize winner Amartya Sen’s work on “capabilities” [5]. From the very beginning, the HDI was intended to allow the allocation of a single “headline” number to each country to “capture” its human development rather than rely on a suite of separate social indicators. The use of a single index to capture human development meant that countries could be ranked in a league table format, with the best performers at the top and the worst at the bottom. Thus, countries could easily compare their performance against those they consider to be their peer group. The logic here is that the government of a poorly performing country in the league table will feel external and internal pressure, an example of the latter being pressure from media reporting [6], to improve its standing.
In almost complete contradiction to the point made by Professor Spiegelhalter, with such league table rankings it can be argued that the value of the HDI becomes less important and what matters is where a country is ranked relative to peers [7,8]. Indeed, the kind of in-depth strategic debate that can be had here about improving a country’s HDI rank is illustrated in a paper by Bryane (2018) for Brunei [9].
Since 1990, the underlying conceptualization of the index as combining three components has remained intact:
  • Life expectancy as a proxy measure for health. It is assumed here that people cannot improve their livelihood unless they are healthy.
  • Education. It is assumed here that higher levels of education provide the capability to develop as it provides, amongst other things, opportunities for employment and career development.
  • Income (proxied by GDP/capita). It is assumed here that people need financial capital to help improve their livelihood options through the purchase of goods and services.
Thus, good health, education, and income were seen by the creators of the HDI as key to providing the basis for people to break out of low human development and were considered to be of equal importance [10]. Indeed, the HDI has often been seen, rightly or wrongly, as a measure of quality of life and it has often been included within suites of indicators intended to assess sustainability.
The UNDP has consistently refused to expand the components of the HDI and has argued that simplicity is a vital requirement for transparency [11,12,13,14,15]. While the creators of the HDI have not altered its “soul” in any way, they have made changes to the way in which the HDI is calculated as well as to the choice of components to best reflect that soul. This is understandable given that the HDI has been around since 1990 and has attracted much attention and critique from researchers and practitioners from the very start [16,17,18], with some suggesting alternative weightings of components [19,20] or even new indices altogether [2]. However, and here there is a strong echo of the COVID-19 ranking debate in the UK, any change to the choice of indicators, the quality of the datasets, and decisions over the methodology of calculation would potentially, of course, influence country placement in the league tables irrespective of what a government does (or does not do). As with the COVID-19 indicators, there is a reliance on data collected by the countries themselves, and while politicians and others may rightly point to the variation in the way data on disease incidence and mortality are collected across Europe, the same point could equally, if not far more so, apply to the 100+ countries included in the HDI league table spanning the developed and developing worlds. Why should such league tables of the HDI also not be regarded as a fruitless exercise? Indeed, the uncertainties embedded within the HDI can add much fuzziness to the notion of an objective or “true” country ranking, just as they can for COVID-19 indicators, and this has been well-reported in the literature since the origin of the HDI. For example, Høyland et al. [21] (pp. 11–12), following their analysis of the uncertainty in index rankings, have noted in words that resonate with the concern of Professor Spiegelhalter regarding the use of country ranks for COVID-19 mortality: “Whenever the scores of international index rankings are taken literally, the indexes may be poor guides for policies as each link between indicators and scores is noisy and uncertain, but presented as certain.”
Here is the basic conundrum behind the question, “To rank or not to rank?” While the intention may be well-meaning (i.e., to allow countries to compare their performance with peers as a spur for them to learn and improve) there is a danger that the indicators and the ranking of countries based on them may be taken too literally and presented as some kind of objective truth. This results in a grey area that is open to exploitation by those who wish to praise the indicators and report them when they do well in the league table and denigrate them as “noisy and uncertain” when their ranking is not so good. The results may be exactly, and indeed almost predictably given the intense context provided by COVID-19, the sort of debate Professor Spiegelhalter became embroiled in with the politicians and government officials. Just where can the line be drawn between a need for comparing the performance of countries so lessons can be learned and the use of a tool such as a league table which apparently allows just that?
What is less well reported in the literature is the impact that factors such as differing methodologies for collecting and reporting data between countries, as well as for compiling (aggregating) them into indices, have on country rank. In fairness, the latter point (aggregation methodology) has received some attention. Morse (2013) explored the HDI rankings of 167 countries and how they may have been affected by a change in the methodology for the income component of the HDI that took place in 1999 [22]. Results suggested that for the majority (65%) of countries in the dataset, “adjusted ranks” between two periods (1991–1998 and 1999–2009) were not influenced by the change in methodology for handling GDP/capita, while for 35% the change did influence their rank. However, there have been a number of methodological changes in the HDI, not only with regard to how the income component is handled, and there has been no exploration to date of the impact coming from the totality of change (all components together) and how these impacts compare between countries. For example, are countries towards the top, middle, or lower end of the league table more vulnerable to such change, or is the vulnerability equally distributed across the spectrum of HDI ranks?
The methodology by which the components of the HDI are aggregated has seen some significant changes since 1990, and these have revolved around the following:
  • Education component: A number of components have been used during the life of the HDI, including years of schooling, enrolment in full-time education, and adult literacy rate. The latter was used in the education component until the HDR of 2009, after which it was dropped and only years of schooling was employed.
  • Income component: There have been a number of changes here, some small and some large. Firstly, the UNDP has alternated between the use of logarithmic (base 10) and Atkinson transformations to transform the data and help avoid a dominance of this component in the index [22]. In 1990 and between the HDRs of 1999 and 2018, the UNDP used logarithms, while between the HDRs of 1991 and 1998, they used the Atkinson formula. Secondly, while most of the years (HDRs 1990 to 2009) used real GDP/capita (adjusted for purchasing power parity (PPP) and chained to a particular year), in more recent publications of the HDR (HDRs of 2010 onwards), there was a switch to using gross national income (GNI) per capita (also adjusted for PPP). GDP and GNI are similar metrics but not the same. At the same time as changing to GNI, logarithm base e was used rather than logarithm base 10 for transforming GNI/capita.
  • Arithmetic and geometric means. It has always been assumed that the three components of the HDI have the same weight within the index, and until 2009 this was achieved by taking the arithmetic mean of the three HDI components. After that year (HDRs 2010 onwards) the geometric mean was used instead, ostensibly to avoid high values of one component compensating for low values of another [23]. However, this change has been claimed to have a negative impact on the HDI for developing countries [19,24].
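The practical effect of the switch from arithmetic to geometric mean can be sketched with hypothetical component indices (the values below are invented for illustration and are not taken from any HDR):

```python
# Hypothetical normalized component indices (each on a 0-1 scale):
# a country with strong health and income but weak education.
health, education, income = 0.80, 0.30, 0.90

# Pre-2010 HDRs: arithmetic mean, where a strong component can
# compensate for a weak one.
arithmetic_hdi = (health + education + income) / 3

# HDRs 2010 onwards: geometric mean, which penalizes imbalance
# between components.
geometric_hdi = (health * education * income) ** (1 / 3)

print(round(arithmetic_hdi, 3))  # 0.667
print(round(geometric_hdi, 3))   # 0.6
```

Under the geometric mean, the same country scores noticeably lower because its weak education index is no longer fully offset by health and income, which is consistent with the claim [19,24] that the change can disadvantage countries with uneven component profiles.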
As well as these more significant changes, there have been other smaller ones related to data years for the index components and the choice of minimum and maximum values for standardization. Following on from this, it is possible to establish two key years which saw significant changes in the calculation of the HDI:
  • Change 1 (HDR 1999): Transformation of the GDP/capita component changed from the Atkinson formula to the use of logarithm (base 10).
  • Change 2 (HDR 2010): Adult literacy rate was dropped from the education component and changes were made to the way in which the income component was calculated and transformed. In addition, there was a shift from arithmetic to geometric mean for combining the three components of the HDI.
Which of these two changes had the larger impact on country rank? If there are impacts, is the effect greater for countries that occupy the top, middle, or lower ends of the table? Finally, why is there a continued fascination with using indices to rank countries, such that problems like that encountered in the UK with COVID-19 still seem to occur? These three questions form the basis for the work reported here, although the emphasis will be placed on the first two. The third will be explored in the discussion.
The paper will first set out the major methodological changes in the HDI that have occurred during its lifetime, and based upon this, a number of key “change” years will be identified. These change years will then be used to explore the first of the questions above. The paper will then move on to the second question and explore whether there is any evidence of the methodological changes in the HDI having differential impacts across countries occupying different parts of the table.

3. Materials and Methods

3.1. HDI Ranks

The HDI published in the HDR 1990 was not included in the analysis, largely because the index was arguably still experimental at that time and relatively few countries were included. Hence, the focus here is upon the HDI published in the HDRs between 1991 and 2018. Over that period there have been many geopolitical changes, and the reported rank of a country may change between HDRs as the number of countries changes. For example, Niger was ranked 187 in the HDR of 2014 but 188 in HDR 2015 and 189 in HDR 2018. In all three cases, it was the bottom-ranked country, so the ranks of 187, 188, and 189 are simply a reflection of changes in the number of countries included in the HDI league table. Thus, allowance needs to be made for changes in the number of countries by calculating an adjusted rank for each of them, and in the work reported here this has been based on a fixed scale from 1 (top-ranked) to 2 (bottom-ranked). The original rank of a country is that taken from the HDI table in the HDRs, and adjusted ranks were calculated as follows:
Adjusted rank = 1 + ((original rank − 1)/(lowest rank − 1))
The result of adjustment in this way is a series of ranks spanning 1 to 2 irrespective of the number of countries in the HDI table. For example, using adjusted ranks for the HDI tables in HDR 2014, HDR 2015, and HDR 2018 means that Niger has a value of 2.0 for each of the years rather than having ranks varying between 187 and 189.
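The adjustment can be expressed in a couple of lines of code (a minimal sketch; the function and variable names are mine, not the paper's):

```python
def adjusted_rank(original_rank: int, lowest_rank: int) -> float:
    """Map an HDI league-table rank onto a fixed scale of 1 (top) to 2 (bottom)."""
    return 1 + (original_rank - 1) / (lowest_rank - 1)

# Niger was bottom-ranked in three HDRs with different table sizes,
# yet its adjusted rank is 2.0 in every case.
print(adjusted_rank(187, 187))  # 2.0 (HDR 2014)
print(adjusted_rank(188, 188))  # 2.0 (HDR 2015)
print(adjusted_rank(189, 189))  # 2.0 (HDR 2018)

# A mid-table country sits near 1.5 regardless of table size.
print(adjusted_rank(95, 189))   # 1.5
```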
In order to allow for consistency of comparison of adjusted ranks over time, a group of 135 countries was selected that were territorially the same between 1991 and 2018 (Table 1). It should be noted that this sample of 135 countries represents a significant proportion of the total number of countries included in the HDI tables between 1991 (84% of all countries) and 2018 (71% of all countries). Thus, each country had a total of 25 adjusted ranks between 1991 and 2018.

3.2. Analysis of HDI Adjusted Rank

In order to test the impact of methodological changes of the HDI on adjusted rank for countries, the adjusted ranks were analyzed using linear regression (least squares estimation). The model adopted was as follows:
Adjusted rank = intercept + β1 N + β2 Change 1 + β3 Change 2 + error
where N is the number of countries included in the HDI table for that year. While the original ranks were adjusted to a fixed scale between 1 and 2, the number of countries could still have an impact, as countries may be pushed up or down the table by the inclusion of new countries below or above them.
Change 1 and Change 2 are dummy variables each having values of 0 or 1. Values of 0 were used to cover the years before the methodological change and values of 1 were used to cover the years after the methodological change. For Change 1, 0 was used for the period from 1991 to 1998 when Atkinson transformation was used for GDP/capita and 1 from 1999 to 2018 when logarithms were used. For Change 2, 0 was used for the period 1991 to 2009 and 1 for the period 2010 to 2018 when changes were made to the education and income components as well as a switch from arithmetic to geometric mean.
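A minimal sketch of this model, fitted by ordinary least squares on simulated data (the adjusted ranks, country counts, and effect sizes below are all invented, not drawn from the HDRs; the paper fits one such regression per country):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated series for one hypothetical country over the 25 HDRs
# published between 1991 and 2018 (all values invented for illustration).
n_hdrs = 25
n_countries = rng.integers(160, 190, size=n_hdrs).astype(float)

# Dummy coding: 0 before the methodological change, 1 after it.
change1 = np.array([0] * 8 + [1] * 17, dtype=float)   # Change 1 from HDR 1999
change2 = np.array([0] * 18 + [1] * 7, dtype=float)   # Change 2 from HDR 2010

# Generate a response with known effects (+0.03 for Change 1, -0.05 for
# Change 2) plus noise, then see whether least squares recovers them.
adj_rank = 1.5 + 0.03 * change1 - 0.05 * change2 + rng.normal(0, 0.01, n_hdrs)

# Design matrix: intercept, N, Change 1, Change 2.
X = np.column_stack([np.ones(n_hdrs), n_countries, change1, change2])
beta, *_ = np.linalg.lstsq(X, adj_rank, rcond=None)

print(beta)  # [intercept, coefficient for N, Change 1, Change 2]
```

The recovered coefficients for Change 1 and Change 2 land close to the values used to generate the data, which is the logic behind reading the fitted coefficients (and their significance) as the impact of each methodological change on a country's adjusted rank.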

4. Results

The relationship between the standard deviation and mean of adjusted ranks for the 135 countries is shown in Figure 2. If standard deviation is used as a measure of “volatility” in adjusted rank, then this is clearly greater for countries towards the middle of the league table than at the top (high human development) or the bottom (low human development). Countries at the extremes tend to be relatively stable in terms of their rank, but for some middle-ranked countries the volatility is large, a conclusion that matches that of Cilingirturk and Kocak (2018) [8]. Given the volatility in adjusted rank shown in Figure 2, the question that needs to be asked is: is this due to methodological changes, or to countries simply doing “better” than their peers in terms of improving human development? The results of the least squares regressions using dummy variables (0 or 1) for the periods of relative stasis either side of the key change years (1999 and 2010 for Change 1 and Change 2, respectively) are shown in Table 2. The 135 countries have been grouped into the following categories:
  • No significance: None of the regression coefficients are statistically significant at P < 0.05
  • Single significance: One coefficient is statistically significant at P < 0.05 (either the number of countries (N), Change 1, or Change 2)
  • Double significance: Two coefficients are statistically significant at P < 0.05 (N with Change 1, N with Change 2, or Change 1 with Change 2)
  • Triple significance: All three coefficients are statistically significant (N, Change 1, and Change 2)
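The grouping can be expressed as a small helper function (a hypothetical sketch; the paper does not publish code, and the p-values below are invented):

```python
def classify_country(p_values, alpha=0.05):
    """Group a country by how many of its regression coefficients are
    statistically significant at the given threshold.

    p_values: dict mapping 'N', 'Change 1', 'Change 2' to their p-values.
    """
    significant = [name for name, p in p_values.items() if p < alpha]
    label = {0: "none", 1: "single", 2: "double", 3: "triple"}[len(significant)]
    return label, significant

# A country whose rank responds to both methodological changes but not
# to the number of countries in the table:
print(classify_country({"N": 0.40, "Change 1": 0.01, "Change 2": 0.03}))
# ('double', ['Change 1', 'Change 2'])
```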
Graphical examples of the influence of the Change 1 and Change 2 factors on HDI rank are shown for four countries (Fiji, Gabon, Botswana, and Malta) in Figure 3. Malta is one of the minority of countries with no significant influence from any of the factors, and indeed its adjusted HDI rank remained relatively stable from 1991 to 2018. Fiji had a significant influence from Change 1, while Gabon had a significant influence from Change 2. Botswana had significant influences from both Change 1 and Change 2.
A summary of the number of countries (and percentage) within each of these four groups (none, single, double, and triple) is provided in Table 3. Only 24 (18%) countries had rankings that were unaffected by changes in the number of countries or the Change 1 and Change 2 shifts in HDI methodology. The vast majority (82%) of countries had ranks that were influenced by at least one of the changes. Indeed, most countries (60%) had ranks that were influenced by either Change 1, Change 2, or both (Change 1 with Change 2). The number of countries (N) seemed to have a relatively minor influence. Only 2% of countries had adjusted ranks influenced by N, and only 11% were influenced by the combinations of (N with Change 1) and (N with Change 2). Only 12 (9%) of countries were influenced by all three (N, Change 1 and Change 2). Thus, across the 135 countries, it seems that changes in the number of countries had relatively little influence on adjusted rank relative to Change 1 and Change 2.
Figure 4 shows the distribution of countries with statistically significant influences from combinations of N, Change 1, and Change 2, as well as those with no statistically significant influence. In each graph, the vertical axis is the mean adjusted rank, and the countries have been placed in order from lowest mean rank (best human development) at the left-hand side to highest mean rank (worst human development) at the right-hand side. Each point is a country. Figure 4a,b shows the distributions for those countries having a statistically significant influence on adjusted rank coming from Change 1 and Change 2, respectively, while Figure 4c is a plot of those countries having a significant influence from both Change 1 and Change 2. Given that the number of countries influenced by N, as well as by N in combination with Change 1 and Change 2, is relatively small, these have been combined into a single graph (Figure 4d). Figure 4e shows those countries with a significant influence on adjusted rank coming from N, Change 1, and Change 2, while Figure 4f shows the distribution of countries having no statistically significant influence from any of the independent variables. There is no obvious “bunching” of points within any of these plots, which suggests that the influences of the independent variables on adjusted rank are not especially concentrated anywhere along the distribution. In other words, the spread of points suggests that countries with low or high adjusted ranks are just as likely to be influenced by methodological changes as those in the middle of the distribution.
The size of the impact of Change 1 and Change 2 on adjusted rank may be gleaned from the values of the respective regression coefficients shown in Table 2. To allow for an easier comparison, the statistically significant regression coefficients for Change 1 and Change 2 are plotted in Figure 5. For some countries, the coefficients are negative (adjusted rank is reduced by the change, signifying an apparent increase in human development), while for others they are positive (adjusted rank is increased by the change, signifying a decline in human development), and there is large variation between countries in terms of the size of the coefficient. Overall, there is some suggestion here that the impact on adjusted rank is greater for Change 1 than for Change 2, but the difference is not all that marked.

5. Discussion

While it has been well-reported that methodological shifts in the HDI do have an influence on country rankings within the reported league tables [22], and there have been studies which looked at the uncertainty surrounding country rankings of the HDI and other indices [21], this is the first study of its type to explore the influence on country rank of the major methodological changes between 1991 and 2018 taken in their totality. While the analysis was a simple one, it does point to some intriguing conclusions that would certainly warrant more detailed investigation. For some countries (18% of the panel used in the analysis) the methodological changes had no significant impact on rank, while for the vast majority (82%) there is an influence from one of the changes or combinations of them. The Morse (2013) study included 167 countries in an analysis of the impact on rank of a change in the methodology of the income component in 1999, equivalent to Change 1 in this study, and noted that the ranks of 35% of countries were influenced by the change [22]. However, in this study, with a smaller sample (135 countries) and a longer period (1991 to 2018), Change 1 was found to significantly influence the ranks of just 19% of countries. Change 2 had a significant influence on the adjusted ranks of 14% of countries, but 27% were influenced by both Change 1 and Change 2. The number of countries included in the published HDI league table was of relatively minor importance in terms of influencing adjusted rank; the major factors were Change 1 and Change 2.
There is no suggestion from the results that “sensitivity” to methodological change is concentrated at any part of the rank spectrum; if there are influences, then these would seem to be just as likely for those countries ranked high, middle, or low in the HDI league table. Once the ranks for the 135 countries have been adjusted to range from 1 to 2, the influence of the number of countries in the published league table also becomes relatively minor. Indeed, in terms of the increased volatility in adjusted rank seen towards the middle of Figure 2 (and noted by [8]), there is no suggestion from these results that methodological changes per se are the primary cause, although they do make a contribution for the majority (82%) of the countries. It is more likely that shifts up and down the league table for those middle-ranked countries are due to fluctuations in performance across the HDI components, at least in terms of how this is reflected in the data available to the UNDP for constructing the index.
However, while the effects of methodological changes in the HDI appear to be evenly distributed across the group of 135 countries included in this study, it is important to note that for any one country the changes can matter, even if the result is a shift of just a few places up or down the HDI league table. Many illustrations could be provided; here is just one, referring to India:
The robust economic growth notwithstanding, India has garnered a lowly 119th rank in the United Nation’s Human Development Index due to poor social infrastructure, mainly in areas of education and healthcare. In the ’Human Development Report 2010’ by United Nations Development Programme (UNDP) that covered 169 countries and territories, India’s position is way below China (89th spot) and Sri Lanka (91) … India’s position in the index has improved by one position on the basis of a five-year comparison since 2005.
Press Trust of India, New Delhi (4 November 2010)
India’s adjusted rank was significantly influenced by both Change 1 and Change 2. The change of just one position between 2009 and 2010 referred to in the article, even if based on the five previous years, could readily be explained by the change in HDI methodology in 2010.
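The kind of per-country regression summarized in Table 2 can be sketched as follows. This is a hedged reconstruction, not the paper’s actual code: adjusted rank is regressed on the number of countries in the league table (N) and on two dummy variables that switch on with the 1999 (Change 1) and 2010 (Change 2) methodology changes, and a coefficient is flagged as significant when its |t| exceeds an approximate two-sided 5% critical value. The annual year grid and dummy coding here are illustrative assumptions.

```python
import numpy as np

def significant_factors(adj_rank, n_countries, years, t_crit=2.06):
    # Dummy variables for the two methodology changes (assumed coding).
    years = np.asarray(years)
    d1 = (years >= 1999).astype(float)  # Change 1: income component, 1999
    d2 = (years >= 2010).astype(float)  # Change 2: the 2010 revision
    X = np.column_stack([np.ones(len(years)),
                         np.asarray(n_countries, dtype=float), d1, d2])
    y = np.asarray(adj_rank, dtype=float)

    # Ordinary least squares with classical standard errors.
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    dof = len(y) - X.shape[1]
    s2 = resid @ resid / dof
    se = np.sqrt(np.diag(s2 * np.linalg.inv(X.T @ X)))

    # Flag coefficients whose |t| exceeds the approximate 5% critical value.
    names = ["Intercept", "N", "Change 1", "Change 2"]
    return {name: bool(abs(b / s) > t_crit)
            for name, b, s in zip(names, beta, se)}
```

Under this sketch, a country such as India, whose adjusted rank responds to both changes, would come out with both “Change 1” and “Change 2” flagged, while countries in the “none significant” group of Table 2 would return all flags as False.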
The HDI has had significant longevity, nearly 30 years at the time of writing, and is widely reported and used by aid agencies, amongst others, to help allocate scarce resources. The HDI often appears within suites of indices designed to assess sustainability, and some have even suggested modifications that would make it more of an index of sustainability [25,26,27]. It is also laudable to see flexibility in the construction of the index and an openness to address issues and look for ways to make improvements. The creators of the HDI have always been open and transparent about the changes they have made and why they made them. They have also been aware of the impact of methodological changes and have published versions of the HDI in the HDRs based on a consistent methodology to better facilitate an exploration of trends over time. Despite all of this, it needs to be noted that the current HDI league table for each year of publication is the very first table presented in the collection of tables at the end of the HDR and is thus inevitably a highlight. It is therefore reasonable to assume, despite the inclusion of other “adjusted” HDI tables, that the current HDI league table is the one which will “jump out” at the lay reader.
Given the uncertainties inherent in the HDI and the ranking of countries based upon it [21,28], is it still, for all its faults, a valuable tool to help improve human development? Does it matter if some of the shifts in rank are caused by changes in HDI methodology, as long as the index has helped improve people’s lives? Perhaps surprisingly, there have been few, if any, empirical and independent (from the UNDP) studies that have addressed the impact of the HDI, and this is certainly a gap that needs to be addressed with some urgency. After all, the league table style of HDI presentation was chosen by the UNDP from the very beginning, and the intention was a clear one: to provide a vehicle by which countries, and indeed other “consumers” of the rankings such as international aid agencies and the press, could compare themselves with others. The emphasis on ranking was no accident; it was a policy choice and deserves to be assessed dispassionately. It is well-known that the HDI rankings, even with their uncertainties, are touted by politicians in those countries that do well [21,28] and are used by international development agencies to make decisions over the allocation of aid. League tables continue to fascinate, and the COVID-19 rankings noted at the start of the paper attracted a great deal of press and public attention in the UK. However, trying to introduce nuance by claiming that country rankings are not valid while at the same time claiming that country comparisons are useful, thereby allowing politicians and those who advise them to both applaud and dismiss ranking as it suits, is hardly a ringing endorsement of the league table approach. Nuance may be well-meaning, and indeed appropriate given the uncertainty surrounding index rankings and the impacts of methodological change on a long-running index, but there is always the suspicion that it is the favored retreat of those who do badly in the rankings.
In the case of the UK COVID-19 international data presentation shown in Figure 1, it might be argued that by the time the government stopped presenting the graph in its daily briefings (from 10 May 2020 onwards), the country was so far above any other European country in terms of mortality that it was no longer necessary to keep repeating the point. Hence, some may say that there is a degree of “ranking fatigue”; once a country becomes rooted in one spot at the bottom or top of the league table, it may appear pointless to keep repeating it. If this is so, then surely one could make a similar point regarding the HDI league table. After all, since the first version published in the HDR of 1990, the top place has been dominated by a handful of countries (mostly just two, Canada and Norway) while the bottom of the table has been occupied entirely by African countries (mostly Sierra Leone and Niger).
Nonetheless, for all the faults inherent in ranking countries using relatively simple indices, which are open to issues of varying data collection methods (and hence data quality), let alone the changes to index methodology noted in this study, there may well be significant benefits to be gained. The criticisms of indicator “technocrats” would be insignificant if the indices and the rankings based on them do deliver benefits for communities; after all, that is precisely what the HDI, and the league table style in which it is presented, is meant to achieve. It is always important to remember that behind all the graphs and tables in this paper are people. Figure 1 is first and foremost an embodiment of tragedy, not competition, but so too are the rankings based on the HDI, which embody, for many in the developing world, poor life expectancy and poor access to education. It is sometimes all too easy with indicators to forget the human stories at their heart. Indeed, looking at it from another angle, there may also be some fundamental questions to ask about how indices were developed and whether (potentially unconscious) cultural biases could favor the rankings of some countries over others. This point has been made before, for example with the Environmental Sustainability Index [29], but to date it has not received much attention from researchers.
At the same time, while the HDRs are replete with case study stories of success and failure, there is little, if any, attempt to link these to the HDI, which is perhaps surprising given the emphasis placed on the index. The HDI was meant to be transformative by refocusing the attention of policy makers away from an excessive focus on GDP, but has it achieved that [30]? Assessing the impact of a single index is certainly complex and challenging and could take many inter-related forms, with many processes at play, but after nearly 30 years of the HDI, and all of the effort that has gone into maintaining and promoting the index, now is the time for a better understanding of the changes, good and bad, that it has helped bring about.

6. Conclusions

Index-based league table rankings have been popular for many years and are said to generate a sense of comparison with peers which facilitates learning and creates pressure to improve performance. The HDI-based league tables published by the UNDP since 1990 provide an example. However, methodological changes in the HDI over 28 years (1991 to 2018) appear to have had a significant impact on the ranking of the majority (82%) of the 135 countries in the sample, and the impacts are not focused on any part of the distribution of country ranks: countries at the top, middle, or bottom of the HDI league table are just as likely to be influenced by changes in HDI methodology. Does it matter that there is no precise, robust, and consistent measure of human development upon which the ranks are based? Maybe ranking, for all of its faults, does deliver benefits. This is a question that needs to be addressed, even if assessing the impact of a single index is challenging.

Funding

This research received no external funding.

Acknowledgments

The author would like to thank the anonymous reviewers for their suggestions to improve the paper.

Conflicts of Interest

The author declares no conflict of interest.

References

1. Herzi, A.A. Sustainability indicators system and policy process in Malaysia: A framework for utilisation and learning. J. Environ. Manag. 2004, 73, 357–371.
2. Hou, J.; Walsh, P.P.; Zhang, J. The dynamics of Human Development Index. Soc. Sci. J. 2015, 52, 331–347.
3. Böhringer, C.; Jochem, P.E.P. Measuring the immeasurable—A survey of sustainability indices. Ecol. Econ. 2007, 63, 1–8.
4. Wilson, J.; Tyedmers, P.; Pelot, R. Contrasting and comparing sustainable development indicator metrics. Ecol. Indic. 2007, 7, 299–314.
5. Sen, A. Development as Freedom; Oxford University Press: Oxford, UK, 1999.
6. Morse, S. Harnessing the power of the press with indices. Ecol. Indic. 2011, 11, 1681–1688.
7. Ogwang, T. Inter-country inequality in human development indicators. Appl. Econ. Lett. 2000, 7, 443–446.
8. Cilingirturk, A.M.; Kocak, H. Human Development Index (HDI) Rank-Order Variability. Soc. Indic. Res. 2018, 137, 481–504.
9. Bryane, M. What does Brunei teach us about using Human Development Index rankings as a policy tool? Dev. Policy Rev. 2018, 36, 414–431.
10. Anand, S.; Sen, A.K. Human Development Index: Methodology and Measurement; Human Development Report Office Occasional Paper; UNDP: New York, NY, USA, 1994.
11. Carlucci, F.; Pisani, S. A multivariate measure of human development. Soc. Indic. Res. 1995, 36, 145–176.
12. Booysen, F. An overview and evaluation of composite indices of development. Soc. Indic. Res. 2002, 59, 115–151.
13. Ranis, G.; Stewart, F.; Samman, E. Human Development: Beyond the Human Development Index. J. Hum. Dev. 2006, 7, 323–358.
14. Stapleton, L.M.; Garrod, G.D. Keeping things simple: Why the Human Development Index should not diverge from its equal weights assumption. Soc. Indic. Res. 2007, 84, 179–188.
15. Nguefack-Tsague, G.; Klasen, S.; Zucchini, W. On weighting the components of the Human Development Index: A statistical justification. J. Hum. Dev. Capab. 2011, 12, 183–202.
16. Kelly, A.C. The Human Development Index: ‘Handle with care’. Popul. Dev. Rev. 1991, 17, 315–324.
17. Streeten, P. Human development: Means and ends. Am. Econ. Rev. 1994, 84, 232–237.
18. Moldan, B. The Human Development Index. In Sustainability Indicators: A Report on the Project on Indicators of Sustainable Development; Moldan, B., Billharz, S., Matravers, R., Eds.; John Wiley and Sons: Chichester, UK, 1997; pp. 242–252.
19. Ravallion, M. Troubling trade-offs in the Human Development Index. J. Dev. Econ. 2012, 99, 201–209.
20. Pinar, M.; Stengos, T.; Topaloglou, N. Testing for the implicit weights of the dimensions of the Human Development Index using stochastic dominance. Econ. Lett. 2017, 161, 38–42.
21. Høyland, B.; Moene, K.; Willumsen, F. The tyranny of international index rankings. J. Dev. Econ. 2012, 97, 1–14.
22. Morse, S. Bottom rail on top: The shifting sands of sustainable development indicators as tools to assess progress. Sustainability 2013, 5, 2421–2441.
23. Kawada, Y.; Nakamura, Y.; Otani, S. An axiomatic foundation of the multiplicative Human Development Index. Rev. Income Wealth 2019, 65, 771–784.
24. Tarabusi, E.C.; Guarini, G. Level dependence of the adjustment for unbalance and inequality for the Human Development Index. Soc. Indic. Res. 2016, 126, 527–553.
25. Neumayer, E. The Human Development Index and sustainability—A constructive proposal. Ecol. Econ. 2001, 39, 101–114.
26. Neumayer, E. Human development and sustainability. J. Hum. Dev. Capab. 2012, 13, 561–579.
27. Bravo, G. The Human Sustainable Development Index: New calculations and a first critical analysis. Ecol. Indic. 2014, 37, 145–150.
28. Cherchye, L.; Ooghe, E.; Van Puyenbroeck, T. Robust human development rankings. J. Econ. Inequal. 2008, 6, 287–321.
29. The Ecologist and Friends of the Earth. Keeping score: Which countries are the most sustainable? Ecologist 2001, 31, 44–47.
30. Malay, O.E. Do Beyond GDP indicators initiated by powerful stakeholders have a transformative potential? Ecol. Econ. 2019, 162, 100–107.
Figure 1. Cumulative number of deaths ascribed to COVID-19 in some European countries. Notes: Day zero is taken to be the first day when the number of deaths reached 50 or more in each of the respective countries. There are two lines for the UK; one is for deaths reported in hospitals and the other is for all settings (primarily hospitals and care homes for the elderly). Sources: The graph has been compiled from UK Government data provided at each of the daily briefings. The data can be accessed at www.gov.uk/government/collections/slides-and-datasets-to-accompany-coronavirus-press-conferences.
Figure 2. Relationship between the standard deviation and mean of adjusted rank of the Human Development Index (HDI).
Figure 3. Impacts of the two HDI methodology changes (in 1999 and 2010) on the adjusted rank for four countries (ad).
Figure 4. Distribution of countries with a statistically significant influence on adjusted rank coming from the number of countries in that published HDI league table (N), Change 1, and Change 2.
Figure 5. Statistically significant regression coefficients (and standard error of coefficient) for Change 1 and Change 2 dummy variables. Note: The dependent variable in each case was adjusted mean rank. Graphs a and b: Only the Change 1 or Change 2 coefficients were significant at P < 0.05. Graphs c and d: Both the Change 1 and Change 2 coefficients were significant at P < 0.05. Graph c shows the coefficients for the Change 1 independent variable while graph d presents the coefficients for the Change 2 variable.
Table 1. List of countries included in the analysis.
Albania | Cyprus | Jordan | Philippines
Algeria | Denmark | Kenya | Poland
Angola | Djibouti | Korea (Republic of) | Portugal
Argentina | Dominican Republic | Kuwait | Qatar
Australia | Ecuador | Lao People’s Democratic Republic | Romania
Austria | Egypt | Lesotho | Saudi Arabia
Bahamas | El Salvador | Libyan Arab Jamahiriya | Senegal
Bahrain | Equatorial Guinea | Luxembourg | Sierra Leone
Bangladesh | Ethiopia | Madagascar | Singapore
Barbados | Fiji | Malawi | South Africa
Belgium | Finland | Malaysia | Spain
Belize | France | Maldives | Sri Lanka
Benin | Gabon | Mali | Sudan
Bolivia | Gambia | Malta | Suriname
Botswana | Germany | Mauritania | Swaziland
Brazil | Ghana | Mauritius | Sweden
Brunei Darussalam | Greece | Mexico | Switzerland
Bulgaria | Guatemala | Mongolia | Syrian Arab Republic
Burkina Faso | Guinea | Morocco | Tanzania
Burundi | Guinea-Bissau | Mozambique | Thailand
Cambodia | Guyana | Myanmar | Togo
Cameroon | Haiti | Namibia | Trinidad and Tobago
Canada | Honduras | Nepal | Tunisia
Cape Verde | Hong Kong, China | Netherlands | Turkey
Central African Republic | Hungary | New Zealand | Uganda
Chad | Iceland | Nicaragua | United Arab Emirates
Chile | India | Niger | United Kingdom
China | Indonesia | Nigeria | Uruguay
Colombia | Iran (Islamic Republic of) | Norway | United States
Comoros | Ireland | Pakistan | Venezuela
Congo | Israel | Panama | Viet Nam
Congo (Democratic Republic of the) | Italy | Papua New Guinea | Yemen
Costa Rica | Jamaica | Paraguay | Zambia
Côte d’Ivoire | Japan | Peru
Note: These countries had reported values for the HDI in each of the years between 1991 and 2018. Hence, each country had a total of 25 HDI values and ranks reported in the Human Development Reports.
Table 2. List of country groupings based on significance of the regression coefficients.
Regression coefficients (SE in parentheses)
Country | Intercept (SE) | Number of countries (SE) | Change 1 (SE) | Change 2 (SE) | F-value | R² (%)
All (number of countries, Change 1 and Change 2) are significant:
Algeria | 2.2602 (0.3026) *** | −0.0043 (0.0018) * | 0.0808 (0.0258) ** | −0.0784 (0.0310) * | 9.6 *** | 52
Austria | 1.3500 (0.0517) *** | −0.0016 (0.0003) *** | 0.0127 (0.0044) ** | 0.0454 (0.0053) *** | 31.8 *** | 79
Brunei Darussalam | 1.5763 (0.1084) *** | −0.0020 (0.0006) ** | −0.0510 (0.0092) *** | 0.0239 (0.0111) * | 21.8 *** | 72
Chile | 1.5225 (0.0974) *** | −0.0019 (0.0006) ** | 0.0314 (0.0083) ** | 0.0218 (0.0100) * | 7.5 ** | 45
Cote d’Ivoire | 1.3781 (0.1149) *** | 0.0025 (0.0007) ** | 0.0891 (0.0098) *** | −0.0281 (0.0118) * | 51.6 *** | 86
Iran | 2.2868 (0.3211) *** | −0.0047 (0.0019) * | 0.0839 (0.0273) ** | −0.1039 (0.0329) ** | 12.6 *** | 59
Kenya | 1.3044 (0.1465) *** | 0.0025 (0.0009) ** | 0.0621 (0.0125) *** | −0.0684 (0.0150) *** | 16.2 *** | 65
Papua New Guinea | 1.4951 (0.1116) *** | 0.0014 (0.0007) * | 0.0331 (0.0095) ** | 0.0377 (0.0114) ** | 30.3 *** | 79
Peru | 1.1791 (0.1284) *** | 0.0020 (0.0008) * | −0.0698 (0.0109) *** | −0.0438 (0.0131) ** | 22.9 *** | 73
Philippines | 0.8761 (0.1689) *** | 0.0039 (0.0010) *** | −0.0847 (0.0144) *** | 0.0862 (0.0173) *** | 31.9 *** | 79
Qatar | 1.6649 (0.1481) *** | −0.0021 (0.0009) * | −0.0583 (0.0126) *** | −0.0326 (0.0152) * | 31.3 *** | 79
South Africa | 0.6296 (0.2469) * | 0.0051 (0.0014) ** | 0.1258 (0.0210) *** | −0.0612 (0.0253) * | 25.8 *** | 76
Sudan | 2.1544 (0.1307) *** | −0.0016 (0.0008) * | −0.0646 (0.0111) *** | 0.1023 (0.0134) *** | 27.0 *** | 76
Change 1 only is significant:
Australia | 0.9997 (0.0676) *** | 0.0003 (0.0004) ns | −0.0402 (0.0058) *** | −0.0115 (0.0069) ns | 25.2 *** | 75
Bahamas | 0.9948 (0.1802) *** | 0.0010 (0.0011) ns | 0.0772 (0.0153) *** | 0.0141 (0.0184) ns | 17.5 *** | 67
Bahrain | 1.5688 (0.1641) *** | −0.0017 (0.0010) ns | −0.0551 (0.0140) *** | 0.0341 (0.0168) ns | 8.8 *** | 49
Burundi | 1.7695 (0.1623) *** | 0.0009 (0.0010) ns | 0.0471 (0.0138) ** | −0.0048 (0.0166) ns | 6.9 ** | 43
Cambodia | 1.8029 (0.1296) *** | 0.0003 (0.0008) ns | −0.1116 (0.0110) *** | −0.0054 (0.0133) ns | 45.9 *** | 85
Central African Republic | 1.7037 (0.1249) *** | 0.0011 (0.0007) ns | 0.0755 (0.0106) *** | 0.0032 (0.0128) ns | 30.9 *** | 79
China | 1.3879 (0.2607) *** | 0.0011 (0.0015) ns | −0.0510 (0.0222) * | −0.0406 (0.0267) ns | 3.9 * | 27
Congo DR | 1.4802 (0.2334) *** | 0.0019 (0.0014) ns | 0.1058 (0.0199) *** | 0.0310 (0.0239) ns | 23.8 *** | 74
Ecuador | 1.8795 (0.2908) *** | −0.0026 (0.0017) ns | 0.0614 (0.0248) * | 0.0113 (0.0298) ns | 2.3 ns | 14
Fiji | 1.4062 (0.4183) ** | −0.0005 (0.0025) ns | 0.1394 (0.0356) *** | 0.0381 (0.0428) ns | 8.5 *** | 48
Gambia | 2.1036 (0.1354) *** | −0.0009 (0.0008) ns | −0.0485 (0.0115) *** | 0.0166 (0.0139) ns | 8.9 *** | 50
Guatemala | 1.6019 (0.0820) *** | 0.0002 (0.0005) ns | 0.0326 (0.0070) *** | 0.0062 (0.0084) ns | 13.4 *** | 61
Indonesia | 1.8385 (0.1349) *** | −0.0014 (0.0008) ns | 0.0334 (0.0115) ** | 0.0141 (0.0138) ns | 3.5 * | 24
Jamaica | 0.8775 (0.2711) ** | 0.0033 (0.0016) ns | 0.0487 (0.0231) * | −0.0460 (0.0277) ns | 4.4 * | 30
Malawi | 1.8069 (0.1060) *** | 0.0005 (0.0006) ns | 0.0229 (0.0090) * | −0.0161 (0.0108) ns | 3.2 * | 21
Mauritania | 2.0582 (0.1659) *** | −0.0010 (0.0010) ns | −0.0337 (0.0141) * | −0.0062 (0.0170) ns | 5.4 ** | 35
Mauritius | 1.2646 (0.2067) *** | 0.0003 (0.0012) ns | 0.0530 (0.0176) ** | −0.0106 (0.0211) ns | 4.0 * | 27
Mozambique | 1.8903 (0.1030) *** | 0.0003 (0.0006) ns | 0.0302 (0.0088) ** | 0.0013 (0.0105) ns | 6.5 ** | 41
Nepal | 2.0682 (0.1290) *** | −0.0011 (0.0008) ns | −0.0689 (0.0110) *** | 0.0083 (0.0132) ns | 22.1 *** | 73
Norway | 1.0345 (0.0331) *** | −0.0001 (0.0002) ns | −0.0181 (0.0028) *** | −0.0006 (0.0034) ns | 21.0 *** | 71
Poland | 1.1710 (0.1711) *** | 0.0006 (0.0010) ns | −0.0632 (0.0146) *** | −0.0263 (0.0175) ns | 10.8 *** | 55
Tunisia | 1.8730 (0.1971) *** | −0.0023 (0.0012) ns | 0.0589 (0.0168) ** | −0.0151 (0.0202) ns | 4.8 * | 32
United Arab Emirates | 1.7547 (0.2413) *** | −0.0026 (0.0014) ns | −0.0510 (0.0205) * | −0.0209 (0.0247) ns | 10.3 *** | 54
Venezuela | 1.5561 (0.1956) *** | −0.0017 (0.0011) ns | 0.1103 (0.0167) *** | 0.0381 (0.0200) ns | 21.5 *** | 72
Viet Nam | 1.4860 (0.1476) *** | 0.0011 (0.0009) ns | −0.0535 (0.0126) *** | 0.0099 (0.0151) ns | 6.1 ** | 39
Both Change 1 and Change 2 are significant:
Bangladesh | 2.0051 (0.1312) *** | −0.0010 (0.0008) ns | −0.0259 (0.0112) * | −0.0421 (0.0134) ** | 17.6 *** | 67
Barbados | 0.9854 (0.1493) *** | 0.0008 (0.0009) ns | 0.0406 (0.0127) ** | 0.0904 (0.0153) *** | 41.4 *** | 83
Belgium | 1.0188 (0.1140) *** | 0.0003 (0.0007) ns | −0.0302 (0.0097) ** | 0.0518 (0.0117) *** | 11.3 *** | 56
Bolivia | 1.6656 (0.1277) *** | 0.0000 (0.0007) ns | −0.0232 (0.0109) * | −0.0406 (0.0131) ** | 11.0 *** | 55
Botswana | 1.8700 (0.2791) *** | −0.0020 (0.0016) ns | 0.1966 (0.0238) *** | −0.1080 (0.0286) ** | 25.0 *** | 75
Brazil | 1.5044 (0.1455) *** | −0.0008 (0.0009) ns | 0.0410 (0.0124) ** | 0.0322 (0.0149) * | 8.2 *** | 47
Canada | 0.9810 (0.0556) *** | 0.0001 (0.0003) ns | 0.0127 (0.0047) * | 0.0282 (0.0057) *** | 26.4 *** | 76
Cape Verde | 1.4264 (0.1615) *** | 0.0015 (0.0009) ns | −0.0862 (0.0138) *** | 0.0563 (0.0165) ** | 16.1 *** | 65
Colombia | 1.3105 (0.1925) *** | 0.0000 (0.0011) ns | 0.0700 (0.0164) *** | 0.1019 (0.0197) *** | 34.7 *** | 81
Comoros | 1.6354 (0.1040) *** | 0.0010 (0.0006) ns | −0.0342 (0.0089) *** | 0.0826 (0.0106) *** | 38.4 *** | 82
Congo | 1.7723 (0.1228) *** | −0.0003 (0.0007) ns | 0.0628 (0.0105) *** | −0.0414 (0.0126) ** | 14.0 *** | 62
Costa Rica | 1.4177 (0.1395) *** | −0.0012 (0.0008) ns | 0.0570 (0.0119) *** | 0.1042 (0.0143) *** | 46.9 *** | 85
Cyprus | 1.2818 (0.0803) *** | −0.0008 (0.0005) ns | 0.0158 (0.0068) * | 0.0290 (0.0082) ** | 8.2 *** | 48
Djibouti | 1.9699 (0.1214) *** | −0.0002 (0.0007) ns | −0.0758 (0.0103) *** | 0.0373 (0.0124) ** | 20.5 *** | 71
Equatorial Guinea | 2.0762 (0.2676) *** | −0.0015 (0.0016) ns | −0.1478 (0.0228) *** | 0.0708 (0.0274) * | 17.5 *** | 67
France | 1.0645 (0.1026) *** | −0.0002 (0.0006) ns | 0.0375 (0.0087) *** | 0.0368 (0.0105) ** | 21.0 *** | 71
Guinea | 2.0306 (0.1345) *** | −0.0003 (0.0008) ns | −0.0566 (0.0115) *** | 0.0383 (0.0138) * | 9.9 *** | 53
India | 1.6489 (0.0833) *** | 0.0007 (0.0005) ns | −0.0530 (0.0071) *** | −0.0278 (0.0085) ** | 34.0 *** | 80
Italy | 1.1665 (0.0450) *** | −0.0003 (0.0003) ns | −0.0086 (0.0038) * | 0.0305 (0.0046) *** | 18.2 *** | 68
Japan | 0.8699 (0.0881) *** | 0.0008 (0.0005) ns | 0.0253 (0.0075) ** | 0.0244 (0.0090) * | 22.2 *** | 73
Korea Rep | 1.3337 (0.0974) *** | −0.0009 (0.0006) ns | −0.0249 (0.0083) ** | −0.0634 (0.0100) *** | 49.6 *** | 86
Kuwait | 1.4431 (0.1901) *** | −0.0008 (0.0011) ns | −0.0784 (0.0162) *** | 0.0689 (0.0194) ** | 10.7 *** | 55
Luxembourg | 1.2040 (0.1319) *** | −0.0005 (0.0008) ns | −0.0572 (0.0112) *** | 0.0544 (0.0135) *** | 12.3 *** | 59
Maldives | 1.5875 (0.2209) *** | 0.0002 (0.0013) ns | −0.1125 (0.0188) *** | 0.0522 (0.0226) * | 13.1 *** | 60
Mongolia | 1.8105 (0.2200) *** | −0.0013 (0.0013) ns | 0.0802 (0.0187) *** | −0.1078 (0.0225) *** | 15.5 *** | 64
Morocco | 1.5724 (0.1073) *** | 0.0006 (0.0006) ns | 0.0254 (0.0091) * | −0.0381 (0.0110) ** | 5.8 ** | 37
New Zealand | 1.0629 (0.1088) *** | 0.0001 (0.0006) ns | 0.0246 (0.0093) * | −0.0654 (0.0111) *** | 16.6 *** | 66
Nigeria | 1.7658 (0.0905) *** | 0.0002 (0.0005) ns | 0.0566 (0.0077) *** | −0.0438 (0.0093) *** | 22.6 *** | 73
Portugal | 1.3785 (0.1028) *** | −0.0010 (0.0006) ns | −0.0504 (0.0088) *** | 0.0738 (0.0105) *** | 24.1 *** | 74
Romania | 1.1343 (0.2224) *** | 0.0017 (0.0013) ns | −0.0736 (0.0189) *** | −0.1028 (0.0228) *** | 20.5 *** | 71
Sweden | 0.8905 (0.0737) *** | 0.0009 (0.0004) ns | −0.0209 (0.0063) ** | 0.0238 (0.0075) ** | 10.6 *** | 55
Syria | 1.2339 (0.2837) *** | 0.0013 (0.0017) ns | 0.1447 (0.0242) *** | 0.0743 (0.0290) * | 31.8 *** | 79
Tanzania | 1.8223 (0.1845) *** | 0.0000 (0.0011) ns | 0.0608 (0.0157) *** | −0.0635 (0.0189) ** | 7.8 ** | 46
Thailand | 1.7462 (0.2384) *** | −0.0023 (0.0014) ns | 0.0744 (0.0203) ** | 0.1019 (0.0244) *** | 16.3 *** | 66
Trinidad and Tobago | 1.0992 (0.1220) *** | 0.0006 (0.0007) ns | 0.0910 (0.0104) *** | 0.0346 (0.0125) * | 57.4 *** | 88
Uruguay | 1.1405 (0.1044) *** | 0.0003 (0.0006) ns | 0.0513 (0.0089) *** | 0.0340 (0.0107) ** | 32.3 *** | 80
Zambia | 1.7438 (0.2728) *** | 0.0002 (0.0016) ns | 0.1262 (0.0232) *** | −0.1030 (0.0279) ** | 12.7 *** | 59
Change 2 only is significant:
Angola | 2.2368 (0.2047) *** | −0.0019 (0.0012) ns | 0.0000 (0.0174) ns | −0.0825 (0.0209) *** | 16.4 *** | 66
Argentina | 1.4837 (0.1719) *** | −0.0016 (0.0010) ns | −0.0008 (0.0146) ns | 0.0479 (0.0176) * | 2.5 ns | 16
Burkina Faso | 1.8893 (0.0514) *** | 0.0005 (0.0003) ns | −0.0014 (0.0044) ns | −0.0165 (0.0053) ** | 3.6 * | 24
Chad | 1.8901 (0.0719) *** | 0.0003 (0.0004) ns | 0.0081 (0.0061) ns | 0.0236 (0.0074) ** | 11.9 *** | 58
Egypt | 1.9139 (0.1633) *** | −0.0015 (0.0010) ns | 0.0186 (0.0139) ns | −0.0569 (0.0167) ** | 10.9 *** | 55
Ethiopia | 1.7010 (0.1741) *** | 0.0013 (0.0010) ns | 0.0296 (0.0148) ns | −0.0538 (0.0178) ** | 3.9 * | 27
Finland | 1.1263 (0.1120) *** | −0.0004 (0.0007) ns | 0.0097 (0.0095) ns | 0.0494 (0.0115) *** | 11.3 *** | 56
Gabon | 1.3911 (0.2393) *** | 0.0015 (0.0014) ns | 0.0135 (0.0204) ns | −0.1084 (0.0245) *** | 7.9 *** | 46
Germany | 1.0528 (0.0942) *** | 0.0002 (0.0006) ns | 0.0159 (0.0080) ns | −0.0711 (0.0096) *** | 26.7 *** | 76
Ghana | 1.6170 (0.1396) *** | 0.0008 (0.0008) ns | 0.0053 (0.0119) ns | −0.0321 (0.0143) * | 1.7 ns | 8
Greece | 1.2249 (0.0876) *** | −0.0006 (0.0005) ns | 0.0037 (0.0075) ns | 0.0220 (0.0090) * | 2.6 ns | 16
Iceland | 1.2555 (0.1150) *** | −0.0013 (0.0007) ns | −0.0140 (0.0098) ns | 0.0557 (0.0118) *** | 7.8 ** | 46
Israel | 1.0504 (0.0551) *** | 0.0004 (0.0003) ns | 0.0066 (0.0047) ns | −0.0382 (0.0056) *** | 19.4 *** | 70
Mali | 1.8479 (0.0889) *** | 0.0008 (0.0005) ns | −0.0154 (0.0076) ns | −0.0232 (0.0091) * | 4.8 * | 32
Mexico | 1.1809 (0.1428) *** | 0.0006 (0.0008) ns | 0.0092 (0.0122) ns | 0.0556 (0.0146) ** | 13.3 *** | 61
Senegal | 1.7027 (0.1413) *** | 0.0010 (0.0008) ns | 0.0120 (0.0120) ns | −0.0385 (0.0145) * | 2.5 ns | 16
Sierra Leone | 1.9312 (0.0537) *** | 0.0004 (0.0003) ns | 0.0015 (0.0046) ns | −0.0430 (0.0055) *** | 29.4 *** | 78
Spain | 1.3843 (0.1422) *** | −0.0017 (0.0008) ns | 0.0233 (0.0121) ns | 0.0396 (0.0145) * | 4.5 * | 30
Sri Lanka | 1.4945 (0.2632) *** | 0.0001 (0.0015) ns | 0.0225 (0.0224) ns | −0.0869 (0.0269) ** | 5.2 ** | 35
Only the number of countries is significant:
Honduras | 1.3832 (0.1179) *** | 0.0016 (0.0007) * | −0.0122 (0.0100) ns | 0.0027 (0.0121) ns | 3.0 ns | 20
Niger | 1.8774 (0.0479) *** | 0.0006 (0.0003) * | 0.0051 (0.0041) ns | −0.0037 (0.0049) ns | 4.1 * | 28
Togo | 1.4464 (0.1245) *** | 0.0022 (0.0007) ** | −0.0080 (0.0106) ns | 0.0155 (0.0127) ns | 8.7 *** | 49
Both number of countries and Change 1 are significant:
Cameroon | 1.4206 (0.1159) *** | 0.0019 (0.0007) * | 0.0532 (0.0099) *** | −0.0200 (0.0119) ns | 20.2 *** | 71
El Salvador | 1.2022 (0.1368) *** | 0.0025 (0.0008) ** | −0.0583 (0.0117) *** | −0.0189 (0.0140) ns | 9.9 *** | 53
Guinea-Bissau | 2.0699 (0.0508) *** | −0.0008 (0.0003) * | 0.0278 (0.0043) *** | −0.0094 (0.0052) ns | 14.7 *** | 63
Ireland | 1.4830 (0.1344) *** | −0.0022 (0.0008) * | −0.0383 (0.0114) ** | −0.0089 (0.0137) ns | 17.9 *** | 68
Lesotho | 0.9626 (0.2054) *** | 0.0044 (0.0012) ** | 0.0552 (0.0175) ** | 0.0091 (0.0210) ns | 22.0 *** | 72
Singapore | 1.7666 (0.1792) *** | −0.0033 (0.0010) ** | −0.0525 (0.0153) ** | −0.0272 (0.0183) ns | 25.3 *** | 75
Swaziland | 1.0508 (0.2538) *** | 0.0036 (0.0015) * | 0.0675 (0.0216) ** | −0.0168 (0.0260) ns | 10.4 *** | 54
Turkey | 1.9671 (0.2546) *** | −0.0032 (0.0015) * | 0.0865 (0.0217) *** | −0.0425 (0.0260) ns | 7.2 ** | 44
Both number of countries and Change 2 are significant:
Guyana | 1.2014 (0.1275) *** | 0.0023 (0.0007) ** | −0.0186 (0.0109) ns | 0.0420 (0.0130) ** | 16.6 *** | 66
Hong Kong | 1.4730 (0.0600) *** | −0.0020 (0.0004) *** | 0.0049 (0.0051) ns | −0.0405 (0.0061) *** | 73.6 *** | 90
Myanmar | 1.3265 (0.1113) *** | 0.0024 (0.0007) ** | −0.0026 (0.0095) ns | 0.0265 (0.0114) * | 18.2 *** | 68
Paraguay | 1.1121 (0.1215) *** | 0.0023 (0.0007) ** | −0.0055 (0.0103) ns | 0.0585 (0.0124) *** | 30.7 *** | 79
Saudi Arabia | 1.9945 (0.2634) *** | −0.0034 (0.0015) * | 0.0016 (0.0224) ns | −0.1229 (0.0269) *** | 24.0 *** | 74
Suriname | 0.8304 (0.2668) ** | 0.0033 (0.0016) * | 0.0283 (0.0227) ns | 0.0697 (0.0273) * | 15.0 *** | 64
None of the factors are significant:
Albania | 0.8875 (0.5286) ns | 0.0034 (0.0031) ns | −0.0169 (0.0450) ns | −0.0945 (0.0541) ns | 1.2 ns | 2
Belize | 0.8812 (0.5098) ns | 0.0030 (0.0030) ns | 0.0419 (0.0434) ns | 0.0387 (0.0522) ns | 2.9 ns | 19
Benin | 1.8899 (0.1886) *** | 0.0000 (0.0011) ns | 0.0117 (0.0161) ns | −0.0359 (0.0193) ns | 1.7 ns | 8
Bulgaria | 0.9972 (0.2883) ** | 0.0018 (0.0017) ns | 0.0181 (0.0245) ns | −0.0436 (0.0295) ns | 0.9 ns | 0
Denmark | 1.2593 (0.1059) *** | −0.0010 (0.0006) ns | −0.0007 (0.0090) ns | −0.0095 (0.0108) ns | 3.6 * | 25
Dominican Republic | 1.5007 (0.1328) *** | 0.0001 (0.0008) ns | 0.0001 (0.0113) ns | −0.0010 (0.0136) ns | 0.0 ns | 0
Haiti | 1.4418 (0.1871) *** | 0.0023 (0.0011) ns | 0.0114 (0.0159) ns | −0.0017 (0.0191) ns | 3.7 * | 25
Hungary | 0.9181 (0.1971) *** | 0.0018 (0.0012) ns | −0.0108 (0.0168) ns | −0.0186 (0.0202) ns | 0.8 ns | 0
Jordan | 1.8305 (0.2482) *** | −0.0019 (0.0015) ns | 0.0298 (0.0211) ns | −0.0302 (0.0254) ns | 2.5 ns | 16
Lao | 2.0406 (0.1328) *** | −0.0015 (0.0008) ns | −0.0096 (0.0113) ns | −0.0234 (0.0136) ns | 8.7 *** | 49
Libya | 1.4009 (0.4489) ** | 0.0001 (0.0026) ns | −0.0763 (0.0382) ns | 0.0688 (0.0459) ns | 1.8 ns | 9
Madagascar | 1.3551 (0.2360) *** | 0.0026 (0.0014) ns | 0.0244 (0.0201) ns | −0.0302 (0.0241) ns | 2.5 ns | 16
Malaysia | 1.2963 (0.0801) *** | 0.0002 (0.0005) ns | 0.0124 (0.0068) ns | −0.0166 (0.0082) ns | 2.2 ns | 13
Malta | 1.2665 (0.1358) *** | −0.0004 (0.0008) ns | −0.0133 (0.0116) ns | 0.0066 (0.0139) ns | 0.8 ns | 0
Namibia | 1.5828 (0.2389) *** | 0.0006 (0.0014) ns | 0.0050 (0.0203) ns | −0.0392 (0.0244) ns | 1.0 ns | 0
Netherlands | 1.1529 (0.0724) *** | −0.0007 (0.0004) ns | 0.0053 (0.0062) ns | −0.0051 (0.0074) ns | 2.3 ns | 14
Nicaragua | 1.1965 (0.2240) *** | 0.0026 (0.0013) ns | 0.0127 (0.0191) ns | −0.0145 (0.0229) ns | 2.5 ns | 16
Pakistan | 1.5752 (0.1065) *** | 0.0011 (0.0006) ns | 0.0143 (0.0091) ns | −0.0186 (0.0109) ns | 2.8 ns | 18
Panama | 1.5842 (0.2077) *** | −0.0017 (0.0012) ns | 0.0282 (0.0177) ns | 0.0189 (0.0213) ns | 1.3 ns | 4
Switzerland | 1.1080 (0.1533) *** | −0.0003 (0.0009) ns | 0.0077 (0.0131) ns | −0.0236 (0.0157) ns | 1.6 ns | 7
Uganda | 1.6455 (0.1747) *** | 0.0014 (0.0010) ns | −0.0271 (0.0149) ns | −0.0081 (0.0179) ns | 1.3 ns | 4
United Kingdom | 1.1104 (0.1604) *** | −0.0002 (0.0009) ns | 0.0074 (0.0137) ns | 0.0282 (0.0164) ns | 2.0 ns | 11
United States | 0.9107 (0.1114) *** | 0.0007 (0.0007) ns | 0.0119 (0.0095) ns | −0.0135 (0.0114) ns | 1.3 ns | 3
Yemen | 1.5585 (0.2126) *** | 0.0015 (0.0012) ns | −0.0022 (0.0181) ns | 0.0102 (0.0218) ns | 1.5 ns | 6
ns = Not significant at 0.05 (P > 0.05); * P < 0.05; ** P < 0.01; *** P < 0.001.
Table 3. Summary of regression results.
Group | Sub-group | Number of countries in sub-group | Percentage (%)
None | No significant coefficients | 24 | 18
Single | Number of countries (N) | 3 | 2
Single | Change 1 | 25 | 19
Single | Change 2 | 19 | 14
Single | Sub-total | 47 | 35
Double | N and Change 1 | 9 | 7
Double | N and Change 2 | 6 | 4
Double | Change 1 and Change 2 | 37 | 27
Double | Sub-total | 52 | 38
Triple | N and Change 1 and Change 2 | 12 | 9
 | Overall total | 135 | 100

Share and Cite

MDPI and ACS Style

Morse, S. To Rank or Not to Rank with Indices? That Is the Question. Sustainability 2020, 12, 5572. https://doi.org/10.3390/su12145572
