This section seeks to position the evolution of philanthrocapitalism in the context of economic history, evolution, and complexity theory. Humans, like other primates and many other social animals, are a hierarchical species, but the scale of tolerable inequality is not infinite. Excessive inequality, where visible, leads to resentment and, at times, organized revolt [15]. To reduce this risk, elites both disguise their wealth and, at times, redistribute part of it. In the Roman Empire, elites, some of them incredibly wealthy, had informal obligations to relieve the hardship of the masses in times of crisis [16]. Similar obligations also existed for Indian rulers.
2.1. Saving the British Class System
In the early 19th century, following the 1789 French Revolution, there was anxiety among the British ruling class that revolution would cross the English Channel. Unrest and dissent in Britain did occur, yet extreme, state-supported domestic violence to quell protest gradually abated, and the 1832 Reform Act ceded some political power to the middle class, although the reform of the British Poor Law in 1834 halved national spending on welfare [17].
The easing of suppression was combined with growing economic opportunities for a rising proportion of the British population, soothing unrest in Britain, except in Ireland. Opportunities afforded by the industrial revolution and by the enormous, still-expanding empire, itself founded on coercion and violent state power [18,19], contributed to wealth that led to the emergence of a large British middle class, widespread literacy, sewage systems in the major towns, and even garden cities [20]. After an initial dip in the 1830s and 1840s, probably worsened by the temporary cessation of organized smallpox vaccination in Glasgow and perhaps elsewhere [17], British life expectancy improved through the 19th century [20]. This was due not only to economic “take-off” [21] based on ingenuity as well as distant pillage, but also to organized public health [22] and to reformers such as Charles Dickens, whose work was eagerly consumed by the increasingly literate masses. These social factors interacted and reinforced each other, creating a snowball effect that led to a social “phase change”.
Physical resources were vital to this progress, but so too were education, idealism, religious fervor, science, protest, redistribution, and reformers wary of the “old corruption”—the ruling class [23]. The “self-organization” of these and other factors remains relevant to contemporary global and planetary health.
2.2. Saving American Capitalism
In response to the Great Depression, itself a consequence of runaway capitalism, Franklin Roosevelt, the patrician U.S. president, introduced the New Deal and then the Second New Deal. These measures, together with the government’s incorporation of many other Leftist ideas, weakened support for the American Communist Party and, some say, saved American capitalism [24,25]. From about 1933 until about 1970, and especially in the 1940s, the wages of U.S. workers rose, while the incomes of the rich fell as taxes were increased [26]. Greater equality, favored by prominent economists such as J.M. Keynes and J.K. Galbraith Sr., triumphed in the decades immediately after World War II. Reduced domestic inequality, in both Britain and the U.S., was also important in sustaining morale during the war. The period of relative egalitarianism in the U.S. has been called the Great Compression [26].
Thus, in both Britain and the U.S., in the 19th and 20th centuries, respectively, it can be argued that inequality was lowered, at least temporarily, in order to prevent an utter loss of elite privilege, such as occurred in France and Russia. In some cases, aristocrats may also have sincerely supported the cause of the common people (“noblesse oblige”—nobility obligates), motivated by compassion, a sensing of these risks, or, perhaps more likely, a combination of the two [27]. Philanthrocapitalists also emerged in this category: people such as Peter Cooper, Andrew Carnegie, John D. Rockefeller, Henry Wellcome and, more recently, Bill Gates, George Soros, Warren Buffett, and others (see Table 1) [28,29]. The fortunes of these entrepreneurs who became philanthrocapitalists accumulated due to the expansion in overall wealth that resulted from science, discovery, and the conversion of nature (such as oil, land, minerals, and forests) into manufactured and agricultural goods. But these fortunes also accumulated, at least in Western countries, because of neoliberalism, the reversal of the Great Compression, and the introduction of new norms that spurned taxation, good government, and public goods [9]. Few if any philanthrocapitalists accumulated their fortunes during the Great Compression, a period of comparative self-restraint as well as comparatively high taxation. (The bequest of Sir Henry Wellcome, who died in 1936, created what, until recently, was the world’s largest health foundation.) Note, also, that few if any philanthrocapitalists inherited their fortunes; they are not nobility or “old” money. Furthermore, many millionaires (more recently, billionaires) do not even pretend to be generous to the poor, although some take advantage of tax incentives to support the arts, museums, or the environment, without claiming to promote any aspect of social justice. Some others work to undermine both democracy and environmental protection [30].
2.3. Saving the World System
A hat-trick of global catastrophes (the Great War, the Depression, and World War II [WWII]) led, in the mid-1940s, to the birth of the United Nations, the successor to the failed League of Nations. Setting aside the opinion of cynics and Cold War bulldogs such as George Kennan [31], the UN, at least in its early decades, was widely regarded as a legitimate attempt to save the world system, or civilization as we know it [32] (see Appendix B).
During the Great Compression, some forms of global inequality eased. For example, many nations were granted independence, although others, such as the former French colonies of Algeria and Indochina, were not willingly let go, triggering savage wars with long-lasting consequences. The anticolonial war in Algeria also helped to inform and inspire the psychiatrist Frantz Fanon, whose book The Wretched of the Earth [33] helped to raise awareness of the plight of people in what was still generally called the Third World [34].
The vindictiveness of Britain, France, and the U.S. that followed the “Great” War (World War I) saw Germany burdened with enormous debt, producing a short-lived peace that Keynes immediately foresaw as fragile [35]. This mistake was not repeated after the second war. Instead, the Marshall Plan can be seen as a form of self-interested generosity by the U.S., widely credited with accelerating economic recovery in Western Europe, which saw Germany emerge as a strong and durable U.S. ally (until the presidency of Donald Trump), rather than the adversary it became in the 1930s.
In Britain, survivors of WWII voted out the Conservatives in 1945, and the National Health Service (NHS) was introduced in 1948, cementing a bipartisan compact protective of the NHS that still exists, despite the sustained shift to the Right that followed the election of U.K. Prime Minister Margaret Thatcher in 1979. In many countries, including the U.S., executives showed voluntary self-restraint over their salaries and bonuses for about three decades after WWII, extending and consolidating the Great Compression [36]. However, today, in Britain [37], France, and the U.S., inequality is again rising, triggering major social reactions, including shifts towards nationalism and authoritarianism.
2.4. The Rise and Decline of WHO as a Promoter of Global Health
Adding to the trio of calamities that incubated the United Nations institutions, including the World Health Organization (WHO), was the “Spanish” influenza pandemic, which killed as many as 50 million people in the final years of World War I and the period immediately following it (when the global population was under 2 billion, compared to over 7.5 billion today) [38]. Some hypothesize that the unusual virulence of the responsible virus evolved in the appalling conditions of undernutrition, crowding, and mustard gas-damaged lungs that characterized the Great War’s closing years, conditions that may have served inadvertently as a natural laboratory of biological warfare [39,40].
One aim of WHO (founded in 1948) was to prevent such pandemics. Another was to improve health in developing countries, places today mostly called the Global South [34]. In 1955, for example, WHO launched a highly ambitious, ultimately unsuccessful campaign to eradicate malaria [41]. Twelve years later, WHO launched another eradication campaign, against smallpox [42]. Within only 11 years, that campaign proved successful. In 1988, in a third major vertical health project, WHO proposed to eradicate polio by the year 2000 [43]. Smallpox was an easier target for eradication than either malaria or polio, though the achievement was still phenomenal: an unprecedented feat that remains unique.
In its early decades, most WHO funding was discretionary (i.e., from the perspective of WHO), provided by member states as assessed dues in proportion to their income. Such discretionary funding peaked at about two-thirds of the total in the 1960s [44]. The drive for WHO to promote health culminated in what Halfdan Mahler, the third WHO director-general (1973–1988), called the “warm decade” of the 1970s [45]. In this decade, WHO launched its campaign of “Health for All by the Year 2000” [45,46]. Such a proposal would be unthinkable today, not least because only about 20% of WHO’s current spending is discretionary [44]. The 1978 Alma Ata conference proved to be the height of WHO’s aspiration [47]. However, a more limited appeal to universality has re-emerged in WHO’s calls for universal coverage or access in a wide range of health-related areas, including HIV/AIDS, reproductive health, health insurance, and free health services, particularly for women and children [48].
From the early 1980s, a shift occurred in the funding of WHO, in parallel with the rising influence of neoliberalism, the doctrine that market exchange is the supreme guide to economic prosperity and thus to human well-being [9,49,50]. It is probably not coincidental that around this time the number of influential people with vivid memories and a deep understanding of the catastrophes that gave birth to the UN was declining, due to the ageing of the generation for whom these experiences were formative. The British historian Simon Szreter likened the public, in its acceptance of the alleged benefits of “trickle-down” economics in Western economies during this period, to a naïve prospector tricked by fool’s gold [17].
Neoliberal policies, in conjunction with tightening Limits to Growth [4], have, at first gradually but now more rapidly, undermined support for many global public goods [51]. This slow cascade of consequences is still evolving and has many manifestations. It is revealed, for example, by three recent unilateral U.S. decisions: the announced withdrawal from the 2015 Paris climate change agreement (effective November 2020), the reneging on the Western alliance to quell Iranian nuclear aspirations, and the exit from the UN Human Rights Council [52]. The increasingly recognized failure of neoliberalism and associated doctrines to generate a fair distribution of health, income, and opportunity has fueled rising nationalism almost everywhere, and increased intolerance, in almost all high-income settings, of the record number of migrants, refugees, and displaced people.
An early casualty of the rise of post-WWII neoliberalism was support for comprehensive primary health care, an approach grounded in philosophies such as the importance of the social and political determinants of health, knowledge transfer, and partnership [53,54]. Compounding this erosion was the gradual abandonment of horizontal health care approaches and the suppression of the knowledge that rapid population growth in poor settings helps to lock in poverty [55].
The pursuit of comprehensive primary health care was narrowed to “GOBI”—a strategy based on the four pillars of growth charts (G), oral rehydration (O), breastfeeding (B), and immunization (I) [56]. Then, support even for this ghost of comprehensive primary health care faltered. Funds for mass immunization in the Global South dwindled [57], leading to a vaccine-funding shortfall that eventually helped to stimulate the Global Alliance for Vaccines and Immunization (GAVI), initially funded by the BMGF (see Box 2).
“Bill and Melinda Gates added to the momentum by hosting a dinner at their home for leading scientists to discuss what could be done to overcome the barriers preventing millions of children from receiving basic vaccines. Bill and Melinda challenged their guests to come back with proposals for ‘breakthrough solutions’.”
“In March 1999, a second summit at Bellagio in northern Italy provided the answer to the Gates’ challenge. Rather than setting up a new international organization, the existing major players in global immunization—the key UN agencies, leaders of the vaccine industry, representatives of bilateral aid agencies and major foundations—agreed to work together through a new partnership: the Global Alliance for Vaccines and Immunization (GAVI, the Vaccine Alliance).”
“The Vaccine Alliance’s dream of delivering vaccines to millions of the world’s poorest children moved a step closer to reality in November 1999, when the Gates Foundation pledged US$ 750 million over five years to GAVI. Two months later, in January 2000, GAVI was formally launched at the World Economic Forum in Davos, Switzerland.”
In 2002, as a result of the increasing use of market forces (and before the largely BMGF-driven revival of interest in global health had flowered), the annual budget of WHO was reported to be less than the combined advertising spending of Coca-Cola and PepsiCo [58]. In response to its widely perceived loss of status and influence [59], WHO sought and accepted a growing fraction of its support from private sources. Since 2000, funding for WHO has more than doubled, and much of this increase has come from the private sector, led by the BMGF [44]. For the period 2015–2017, voluntary contributions to WHO accounted for 80% of its total income. Of the voluntary contributions, the share of the BMGF (13.7%) approached that of the U.S. (18%) [44]. However, the fourth-largest contributor to WHO in this period, at $112 million per annum, was GAVI, which remains partly, perhaps substantially, dependent on the BMGF. Therefore, the contribution of the BMGF to WHO, directly and indirectly, may have exceeded the voluntary contribution of the U.S.
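The arithmetic behind this inference can be sketched. In the minimal formalization below, $g$ denotes GAVI's share of WHO's voluntary contributions (its $112 million per annum expressed as a percentage of the voluntary total) and $f$ denotes the fraction of GAVI's own funding ultimately supplied by the BMGF; neither figure is reported in the sources cited above, so both are placeholders rather than data:
\[
\underbrace{13.7\%}_{\text{BMGF direct}} + f \cdot g \;>\; \underbrace{18\%}_{\text{U.S. direct}}
\quad\Longleftrightarrow\quad
f \cdot g \;>\; 4.3 \text{ percentage points.}
\]
On this reading, the claim holds whenever the indirect channel through GAVI adds more than 4.3 percentage points to the BMGF's direct 13.7% share of voluntary contributions.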
WHO’s direction and policies are thus influenced by philanthrocapitalists and by neoliberal governments, in ways that are unlikely to be widely understood by the public. Big pharmaceutical companies and even beer barons have also been claimed to exert growing power over global health actors and policy [60]. While direct donations to WHO from drug companies are banned [61], Richard Horton asserted in 2018 that “the pharmaceutical industry is slowly colonizing global health in the same way that it has colonized clinical medicine” [60]. The loss of independence by WHO has been lamented by both the current WHO director-general (Tedros Adhanom) and his predecessor, Margaret Chan, but so far with little positive result [44,49]. In fact, Horton hints that, under Tedros (who prefers to be known by his first name), the problem is deepening [60].