It has become fashionable to attribute most of the nation’s problems and, especially, those of the university to neoliberalism. On the left, calling someone a neoliberal is an all-purpose political dismissal. Hillary Clinton was constantly described as such despite the fact that her economic proposals were the most progressive of any major presidential candidate at least since George McGovern, and maybe since Franklin Roosevelt’s proposal for a second Bill of Rights. Of course, all manner of misrepresentation is typical of political campaigns, so it is not surprising that Clinton should have been attacked in this way. But the phenomenon is an instance of the widespread misuse and misunderstanding of neoliberalism.
When people say “neoliberalism,” they often seem to mean capitalism. By one narrative of economic history, we are currently living under the neoliberal stage of capitalism, which implies that today the two are identical. But it is important to understand that neoliberalism was first an ideology and second a political program. That program has been partially implemented in many places, but nowhere has it entirely displaced other forms of capitalism or other residual and emergent economic forms. My goal in this essay is to sort out what impact neoliberalism has had on the American university and, especially, on the humanities. My argument is that, while neoliberalism has had a significant impact, especially on public universities, one needs to cast a wider net to understand the current state of the humanities in American higher education.
1. The American University
In order to gauge correctly the impact of neoliberalism, it is important to understand the American university system as it developed prior to the advent of this ideology. It was never the university Bill Readings discovered in German idealist philosophy, and from which he argued, in The University in Ruins, the modern American university had fallen (Readings 1996). While the German university was a model for the emergence of the American university in the last quarter of the nineteenth century, Americans were more interested in the actual practice of German universities than in the theories behind them. German idealism historically provided grounds for the notion that the university must be radically autonomous. As Jürgen Habermas explains, Humboldt and Schleiermacher were “concerned with the problem of how modern science and scholarship, released from the tutelage of religion and the church, can be institutionalized without their autonomy being threatened from another quarter,” whether that be the state or bourgeois society (Habermas 1989, p. 108). They also argue, according to Habermas, that “it is in the interest of the state itself to guarantee the university the external form of an internally unlimited freedom” (Habermas 1989, p. 109). Habermas argues that this “idea of the university” does help to explain some characteristics of the German university, but he also observes that this university never existed even in Germany. Not only does he show that the German system was neither so purely devoted to research nor so isolated from the state and bourgeois society, but he also asks, “isn’t the very premise that a vast structure like the modern university system should be permeated with and sustained by a way of thinking common to its members unrealistic?” (Habermas 1989, p. 101).
It is true that in the late nineteenth century, as Laurence Veysey shows, “German higher education, however incompletely understood, became the focus of extravagant excitement and admiration…An insufficiently differentiated Germany, partly real and partly imaginary, became the symbol for all scientific claims upon American education” (Veysey 1965, p. 128). But American reformers seldom invoked the names of Humboldt or other idealists, focusing instead on the practice and productivity of German universities. According to Veysey, the American invocation of Germany emphasized the “painstaking investigation of particulars, both in laboratories and in such areas as historical documents” (Veysey 1965, p. 127). Moreover, the German model influenced only the research vision, one of the three programs of higher educational reform that shaped the American university. The other two, which Veysey labels “utility” and “liberal culture,” were derived from other sources. Those who championed utility advocated universities that would serve the public good, whose orientation was toward “real life,” and which were themselves “democratic” (Veysey 1965, pp. 60–62). Those who advocated for liberal culture as the model for educational reform rejected the scientific model as too narrow and the utilitarian one as debased. Strongly influenced by Matthew Arnold’s ideas of culture, proponents of liberal culture urged a curriculum built around great works and the study of past civilizations. The American system of higher education includes institutions that emerged out of each of these programs. The Morrill Act, which created the land-grant institutions, required instruction in agriculture and the mechanical arts, defining these institutions as practical. Liberal culture was the dominant model behind the liberal arts colleges, which were often what the smaller old colleges became in the twentieth century, while research would seem to be the model that informed the research university. But Veysey’s point is that most American institutions of higher education as they developed in the twentieth century were influenced by all three visions. As a result, American universities have never understood themselves as radically autonomous, but rather as places where research and teaching were understood to have practical benefits beyond the ivory tower.
Veysey’s story of the American university’s emergence is best described as intellectual history; it is about the ideas that helped to reshape higher education. But to this history we need to add an economic dimension, which Christopher Newfield has done in Ivy and Industry.
There Newfield argues that, “in the late nineteenth century, American capitalism underwent what we could call its first corporate revolution…The research university was a central component of this rising industrial system” (Newfield 2003, p. 16). Indeed, the rise of the modern research university occurred simultaneously with the shift from the free-market capitalism of individual entrepreneurs to a monopoly capitalism dominated by modern corporations. This new kind of business organization required college-educated workers for their technical expertise, for their management and communications skills, and for the disciplined work habits they learned at university. Conversely, Newfield shows how the university has long been dependent on business for “technological development; managerial administration, including financial control; and entrepreneurial visions of the public good” (Newfield 2003, p. 20). The point is not that the university was a mere extension of business and industry, but rather that it was never more than partially independent of them.
Many people think that the dependence of the university on government and private support for research emerged only in the wake of World War II and the Cold War, but in fact the dependence on external research funding began in earnest during World War I. According to Roger Geiger, the National Research Council (NRC) enabled the emergence of a research program in support of the war effort: “The NRC was first and foremost a network for joining academic, industrial, and government science” (Geiger 1986, p. 97). Moreover, in 1918 many universities became military training facilities: “Campuses were nationalized almost as completely as the railroads had been” (Geiger 1986, p. 102). While government involvement in university life and support for research receded after the war, external private research support continued to grow. World War II and the Cold War would make government support for university research a permanent reality. As Newfield reports,
In the early 1960s, President Clark Kerr at California offered a famous warning about the increasingly prosperous university’s decreasing autonomy: “Federal support of scientific research during World War II,” he wrote, “has had a greater impact on higher education than any event” since the land-grant movement was enshrined in the Morrill Act of 1862. Kerr detailed the ways in which an indirect form of “federal influence” operated through a nearly irresistible structure of financial opportunities to reduce “the authority of the department chairman, the dean, the president, [and]…faculty government.” The research university had become a “federal grant university” in which direct state control was avoided in favor of a much more effective system of financial rewards and penalties. (Newfield 2003, p. 73)
Newfield argues that direct corporate support for research began to grow significantly in the wake of the economic difficulties of the 1970s. This connection was abetted by the Bayh–Dole Act of 1980, which, by changing patent law, “explicitly encouraged the commercialization of federally sponsored university research” (Newfield 2003, p. 180). It is precisely at this moment that neoliberalism begins to have an impact on both government and universities.
The subjects taught and studied at American universities confirm their long-standing connection to the practical and the economic. While most universities offered courses in the disciplines of the liberal arts and sciences (e.g., English, sociology, chemistry), many also allowed students to major in such vocationally oriented subjects as nursing, education, business administration, hotel management, agriculture, and engineering. Moreover, professional schools for law, medicine, and business were regular features of most research universities. While it is true that majors in the liberal arts and sciences have declined over the past 50 years, even in 1890 they accounted for under 60% of the total. By the 1950s, they had fallen below 50%. And even when students did major in traditional disciplines, they and their parents did not necessarily understand the value of these studies. As Sinclair Lewis’s Babbitt put it, “there’s a whole lot of valuable time lost even at the U., studying poetry and French and subjects that never brought in anybody a cent” (Lewis 1991).
It would be a mistake, however, to assume that only those who majored in applied fields were being educated to serve capitalism and the status quo. Richard Ohmann argued as early as 1970 that under the New Criticism, the study of literature had been radically depoliticized. He also observed that the teaching of composition was linked to “administered thought” and the “military–industrial complex” (Ohmann 1976). His critique of the familiar injunction to “Use Definite, Specific, Concrete Language” observes that a whole series of ideologically invested assumptions are entailed by it: ahistoricism, empiricism, fragmentation, solipsism, and denial of conflict (Ohmann 1987, pp. 249–50). Since freshman composition was then, as now, the English course to which most university students were exposed, this example illustrates how the humanities could be strongly aligned with the needs of bourgeois society. In this, he argues, English courses are not unique: “the whole university has tacitly and understandably adopted and acted upon an ideology that aligns professors’ interests with those of governing groups outside the university” (Ohmann 1976, p. 170).
2. Neoliberalism
Neoliberalism is an ideology in both senses of the term. It began life as an articulated economic and political program, but within American political discourse it also became a “motivated cover-up,” a set of assumptions that disguised the interests that lay behind what passed for common sense. While the explicit program was openly forced on foreign governments in need of help from international organizations such as the World Bank and the International Monetary Fund, the program was seldom named by those who supported it in the American domestic context. Indeed, neoliberal economics in the U.S. was typically disguised by appeals to racism, sexism, and anti-government individualism, all ideologies that long pre-date neoliberalism.
David Harvey dates the rise of neoliberalism’s influence to the 1970s, but observes that the program was actually born in 1947 in Mont Pèlerin, Switzerland, when a group of academics (which included the economist Milton Friedman) “gathered together around the renowned Austrian political philosopher Friedrich von Hayek” (Harvey 2005, pp. 19–20). In the name of individual freedom, private property, and the invisible hand of the market, the Mont Pèlerin group opposed Keynesian economics as well as all forms of socialism and state planning. While they were supported by some members of the U.S. bourgeoisie, “this movement remained on the margins of both policy and academic influence until the troubled years of the 1970s” (Harvey 2005, p. 22). According to Harvey, it gained dominance in the U.K. and the U.S. at the end of the decade with the elections of Margaret Thatcher and Ronald Reagan. As Jamie Peck has shown, neoliberalism’s influence grew as a result of a campaign funded by wealthy supporters, which included the establishment of “free-market” think tanks in every state in addition to national ones such as the Manhattan Institute and the Heritage Foundation (Peck 2010, pp. 123–25).
What happened in the 1970s that created fertile ground for a program that had previously had little influence on the domestic policies of advanced industrial societies? The answer is that the profits of major corporations declined significantly. Some of this had to do with changes in global economic conditions. After World War II, the United States was in the enviable position of being the only industrial power with its productive capacity undiminished by the war. Indeed, the war effort had meant that its productive capacity was even greater than it had been before the conflict. By the 1970s, however, our former adversaries and allies had rebuilt, and they were competing with the U.S. successfully. Moreover, the U.S. experienced two “oil shocks” in the 1970s, when Middle Eastern politics led to oil embargoes that drove up the price of energy, causing inflation and cutting into the profits of businesses not selling energy. During the 1970s, the economy of the U.S. was widely characterized as suffering from “stagflation,” a combination of low economic growth and high inflation. By the end of the decade, inflation rates running above 10% were particularly troubling, not only to bankers but also to average citizens. The response of Federal Reserve Chair Paul Volcker, whom Harvey characterizes as a neoliberal, was to raise interest rates to levels above 20%, which caused more hardship and consternation (Harvey 2005, pp. 1–2).
When profits were high and were expected to remain so, corporate leaders were willing to compromise with labor and with the welfare state. Once profits declined, however, both labor and social welfare programs were blamed for “economic failure.” This blame was assigned in explicit ideological efforts to discredit unions and welfare programs, but declining profits also motivated corporations to seek greater political influence. Supposed reforms of campaign finance law in the 1970s actually enabled large-scale corporate giving to candidates and parties.
These economic and political conditions help account for the election of Ronald Reagan to the presidency in 1980 and the initiation of the first concerted neoliberal program in the U.S. Neoliberalism had been imposed, with the help of the U.S., by dictatorships in Chile and Argentina earlier in the 1970s. Those countries saw waves of privatization and social austerity, and neoliberalism was internationally recognized as the explicit program being invoked there. In the U.S., however, Reagan appealed to older, familiar American political rhetoric of the Right, couched in a suspicion of government and, especially, of taxation and government borrowing. These traditional right-wing themes were coupled with racism, as social welfare programs were represented as providing benefits to African Americans at the expense of whites. Thus, neoliberal policies were advanced as making the country fairer to white people. The racist rhetoric masked the fact that social welfare programs provided more aid to whites than to blacks, but it also had its limits. Social Security and Medicare, which the neoliberal agenda slated for privatization, proved largely to be politically untouchable because they were perceived as benefiting whites.
Reagan was able to get Congress to enact a massive tax cut for the wealthy, but perhaps more important, his administration marked a shift in expectations. Just eight years earlier, both President Nixon and his Democratic challenger George McGovern had proposed schemes for direct government income supplements for all Americans. While Nixon did not follow through on this campaign promise, he did assert, “we are all Keynesians now.” Reagan’s “achievement” was to demonize government so that Americans stopped believing that it would continue to help make their lives better. In addition, his decision to fire the nation’s air-traffic controllers rather than negotiate with them when they went on strike signaled a new hostility between government and labor.
While Reagan did not cut government spending anywhere near enough to cover the cost of his tax cuts—thus running up the federal deficit—the idea that cutting spending was an unambiguous good created the conditions by which various forms of austerity might be imposed. The difficult economic conditions reduced tax revenues for many states and localities, which led to reduced support for higher education. According to Michael Fabricant and Stephen Brier, “The New York City fiscal crisis and the imposition of tuition at CUNY in the 1976–77 academic year signaled the end of the three-decades-long era of sustained growth in public higher education” (Fabricant and Brier 2016, p. 91). While Newfield dates the defunding of public education to 1980, the process seems to have started in earnest around 1990: “The real dollar value of per capita student funding that states provided to public colleges and universities declined 2.3 percent between 1990 and 2010. The U.S. inflation rate over that twenty-one-year period averaged 2.7 percent per year, totaling more than 56 percent over the course of the two decades,” resulting in a 35.3 percent difference from the total that level funding would have provided (Fabricant and Brier 2016, p. 92). In the wake of the Great Recession of 2008, more extreme cuts to state support for higher education became the norm, with all but three states making reductions between 2008 and 2015, the majority greater than 20% (Newfield 2016, p. 168). The loss of public financial support is the most significant impact neoliberalism has had on the American university.
But cuts in state support are only part of the story of what Newfield calls the “half-way privatization” of public higher education (Newfield 2016, p. 31). Two other key parts are increasing tuition and the vast expansion of student debt. During the period in which public support was being cut, tuition at four-year public universities more than doubled. Tuition increases were possible because students were encouraged to borrow to pay for college. Between 2000 and 2010, average per-student debt increased by 45 percent, to $25,150 (Newfield 2016, p. 193). These changes shifted the cost of higher education from society to the individual student, which is exactly the outcome that neoliberal theory desires.
Privatization, however, was not just a matter of austerity in government spending. Indeed, tuition increases paid for with student debt to some extent mitigated and to a significant degree masked the effects of declining public support for higher education. One might imagine that such a radical change in government priorities would have produced significant public discussion and been met with vocal opposition, especially from higher educational institutions themselves. But as Newfield shows, university administrators largely accepted the new model of funding: “Cut our funding and we won’t complain if you let us hike tuition: this is the hidden contract between public university executives and their state officials” (Newfield 2016, p. 170).
These executives were the more willing to accept this contract because other aspects of privatization were seen as opportunities. Besides tuition, public universities saw increased private philanthropic support, contract research, and patent income as the means by which they could become more independent of what many had long experienced as the whims of state legislators.
The problem is that all of these sources together could not be counted on to make up for the loss of public support. Tuition increases were for a period reliable enough, but beginning in the 2000s, the decades-long increases far above the rate of inflation came in for public criticism and made universities a political target. Public universities found that unlimited tuition increases were no longer possible. While some public universities have been quite successful in attracting private philanthropy, as Newfield shows, the scale of private giving is far too small to allow it to replace lost state funding (Newfield 2016, pp. 118–19). Moreover, regardless of scale, major gifts are seldom unrestricted, disqualifying them from making up shortfalls in institutions’ general funds. Most often, large gifts are directed to business schools or intercollegiate sports, meaning that they do not contribute at all to universities’ core missions. Finally, as Newfield demonstrates, contract research does not pay for itself. While research grants often enable important research to be conducted, this research must be subsidized by general funds, i.e., tuition. The hope that such research will produce significant patent income has been likened to playing the lottery.
It is clear that neoliberal ideology has supported the privatization of public higher education through its insistence that the free market is always the best way to regulate any human activity. Neoliberalism would do away with all public education, returning learning to the private tuition model. If we have not yet reached this extreme, neoliberalism has changed universities’ conceptions of their own missions by demanding that they do only what will bring in income, regardless of the use value of the activities. The idea that education should be “not for profit,” for which Martha Nussbaum has eloquently argued, has been increasingly forgotten under the influence of neoliberalism (Nussbaum 2010). As Wendy Brown puts it, under the influence of neoliberalism, “public goods of any kind are increasingly difficult to speak of or secure. The market metrics contouring every dimension of human conduct and institutions make it daily more difficult to explain why universities, libraries, parks and natural reserves, city services and elementary schools, even roads and sidewalks, are or should be publicly accessible and publicly provisioned” (Brown 2015, p. 176).
Yet the current economic situation of American universities cannot be blamed entirely on neoliberalism. The appeal of many of the strategies of privatization depended on aspects of academic culture that predate neoliberalism’s influence, even if they are often reinforced by it. For example, American universities have long existed within a status hierarchy. Traditionally, there were in fact two separate informal but widely understood hierarchies. One of these was based mainly on research, and it included both private institutions such as Harvard, Cornell, and Stanford and major public universities such as California, Michigan, and Wisconsin. But there was also a hierarchy based solely on social status, in which private universities such as those of the Ivy League and private liberal arts colleges such as Smith, Wesleyan, or Wellesley ranked above any public institutions. Because hierarchies of social status are not easily changed, the main opportunities for institutions to improve their standing were in the research hierarchy. The advent of the U.S. News rankings in 1983 began to make the public more aware of such hierarchies. These rankings appeared around the same time that increasing numbers of institutions redefined themselves as having research missions. The competition for research contracts and philanthropic support was driven not only by a lack of public funding but also by the desire for higher status. Doubtless neoliberalism and the environment it fostered intensified this competition, but it did not bring it into existence.
Competition among universities has also been fostered by the perception of a scarcity of students, and this despite the fact that in recent years the growth of the percentage of age cohorts attending four-year colleges has stagnated, leaving a large untapped market. The end of the post-World War II baby boom produced the first panic over declining enrollments, though it was somewhat mitigated by the increasing percentage of high-school graduates who enrolled in universities. More recently, the end of the baby boomlet of the 1980s and 1990s has produced a new panic without a similar mitigation. While one might have thought that the desire to attract students would have led to a renewed concern with teaching, in fact it has led to increased spending at private institutions on quality-of-life amenities, and at all classes of institutions on administrators devoted to student life. This is likely because, while it is difficult for prospective students to assess the quality of teaching, it is easy for them to judge the athletic facilities or the comfort of the dormitories. Here, universities’ need to market themselves is consistent with neoliberalism but is driven by factors that go beyond ideology.
The result of these two kinds of competition has been the declining importance of teaching at many institutions. The most obvious example of this is the enormous growth of adjunct faculty and the decline in the number and percentage of tenure-track faculty. Of course, this casualization of labor is entirely consistent with neoliberalism, and it has been to some degree a response to declining public support. While the shift to adjunct labor has not been as consistent among private universities, the phenomenon has occurred there as well, suggesting that lack of public funding can be only a partial explanation. NYU, for example, shifted teaching to adjuncts in order to afford the cost of faculty stars’ salaries (Newfield 2008, pp. 233–34). And since these stars taught fewer undergraduate courses than other tenure-track faculty, even more teaching had to be done by adjuncts or graduate students. Newfield describes this strategy as “bait and switch,” since undergraduates are attracted to the institution by the presence of the stars but most of their courses are taught by others (Newfield 2008, p. 234).
3. The Humanities
In the ideological environment of neoliberalism, the humanities would seem to be particularly vulnerable. While all of the traditional disciplines of the liberal arts and sciences are devoted to pure rather than applied research, the public and politicians have historically found it easier to understand how the natural and some of the social sciences can lead to practical applications and therefore to monetary returns. Because neoliberalism rejects the very idea of “not-for-profit” and insists that all values must be measured by the market, the humanities appear valueless. This has been a problem both for humanities enrollments and for the status of humanities disciplines within the university.
It is often thought that neoliberalism has helped to foster a large decline in humanities majors. Humanities enrollments have historically been affected by economic influences: recessions are bad for the humanities, but enrollments typically recover in good times. Since we have been arguing that the 1970s is the period when low profits and “stagflation” paved the way for neoliberal policies and politics, it is perhaps not a coincidence that, as Michael Bérubé puts it, “there was a decline in bachelor’s degrees in English, just as there was a drop-off in humanities enrollments more generally. But it happened almost entirely between 1970 and 1980.” Since then, there has not been a general decline in those majoring in the humanities. Bérubé quotes Nate Silver: “In 2011, 1.1 out of every 100 21-year-olds graduated with a bachelor’s degree in English, down only incrementally from 1.2 in 2001 and 1.3 in 1991. And the percentage of English majors as a share of the population is actually higher than it was in 1981, when only 0.7 out of every 100 21-year-olds received a degree in English” (Bérubé 2013). In the wake of the 2008 recession, there is some evidence that majors in the humanities at elite liberal arts institutions and private universities declined more than elsewhere, while, paradoxically, humanities majors at community colleges have increased (Flaherty 2017; Commission on the Humanities and Social Sciences 2013; American Academy of Arts and Sciences 2017; Menton 2013). There is also evidence that, even as the number of majors declined at these institutions, courses in the humanities remained popular. As of 2017, history, historically Yale’s most popular major, was once again so for the class of 2019 (Hussari 2017). What these trends suggest is that the humanities are perceived as riskier majors for finding jobs in tough times, even though statistics show that humanities majors will eventually earn more than those majoring in many more vocationally oriented subjects. Neoliberalism as an ideology has doubtless contributed to this perception, but it is not in the main responsible for it. One could argue that neoliberal policies have made jobs more scarce, thus making college students more insecure about their futures, but that argument would be difficult to sustain given automation and the other influences on employment.
Another problem for the humanities since Ohmann wrote English in America is that they are no longer entirely apolitical. Newfield thinks that the attack on “political correctness” in universities, which the Right initiated in the early 1990s, was undertaken because of fear of the political impact of the humanities. He believes that this right-wing assault on the humanities was partially responsible for the decline in prestige of these disciplines and for the cuts in public funding to higher education in general (Newfield 2008, pp. 49–122). The recent poll showing that Republicans regard colleges as having a negative impact on the U.S. suggests that these attacks have been persuasive to many (Pew Research Center 2017). While it would be a mistake to imagine that most teaching or research in the humanities is in fact genuinely oppositional, the Right’s attacks suggest the difficulty of the university’s playing an adversarial role.
Neoliberal ideology and the university management policies it has fostered may have played a bigger role in calling into question the prestige and value of the knowledge the humanities produce. The humanities disciplines have the advantage of requiring very little financial support for research when compared with the natural sciences and technological fields. Yet when universities see external research funding as an essential part of their strategy to prosper in the market, they see the arts and humanities as irrelevant. Even though such research is a net cost to the universities that do it, bringing in grants makes an institution appear successful. But income aside, neoliberalism calls into question the kind of knowledge that the humanities produce, since it cannot be readily turned into practical applications, i.e., commodities. While, as Sinclair Lewis attests, this view of the humanities long antedates neoliberalism, the latter has certainly reinforced it.
Neoliberalism and the accompanying but separate phenomenon of financialization have made it harder to understand the value of nonapplied research in any field. That is because pure research by definition yields practical benefits only indirectly and more slowly than applied research. Because the investment climate of recent years has demanded immediate and consistent returns to shareholders, major corporations have radically cut back their own research and development operations. While these corporations have sought to use universities as a way to subsidize research they are no longer willing to fully support, this research is almost always narrowly focused on creating new products quickly. Knowledge of any kind that cannot be used to produce quick profits is thereby devalued, and humanistic knowledge almost never leads to quick profits. The assumption that value must be realized immediately also affects the way humanities majors are perceived. Since such majors do not prepare students for specific jobs, and since humanities students typically earn less than those with more vocational majors early in their careers, the fact that humanities students have the advantage in the long run is ignored. Despite the fact that contemporary college graduates are increasingly likely to change careers more often than those of previous generations, the intellectual flexibility that the study of the humanities produces is not recognized as the enormous benefit it will increasingly be.
Neoliberalism has not fundamentally changed the role of the humanities in the American university. Rather, it is best understood as having exacerbated tendencies long present in American higher education. Moreover, it is at least questionable whether we should speak of the “neoliberal university.” Newfield’s argument in The Great Mistake
suggests that we are on the road to such a university, but that we are not there yet. Public universities have been only partially privatized, and private universities remain (mostly) not-for-profit institutions. A hegemonic neoliberalism would end public and private not-for-profit education. In such a world, the humanities would likely be relegated to a few elite institutions, where knowledge of them would lend cultural capital not available elsewhere. Humanities research would become a rich person’s avocation. It is because neoliberalism remains a contested ideology that we can still assert the value of education as a public good. The recognition of the humanities as a key component of that good is evidenced by the support for the American Academy of Arts and Sciences manifesto, The Heart of the Matter
, a project supported by leaders from virtually every sector of American society (Commission on the Humanities and Social Sciences 2013
). The political obstacles to creating more support for higher education in general and the humanities in particular have less to do with neoliberalism than with racism, sexism, and anti-intellectualism. The election of Donald Trump was not an endorsement of neoliberalism—much of his populist rhetoric was anti-neoliberal—but of racism, sexism, and status resentment. While I cannot offer a magic bullet for killing these endemic social diseases, I suggest a strategy to make public education once again a priority. In order to restore public support for higher education, we need to show that an educated population is good for everyone. This means that we need to argue for its value in terms of outcomes other than higher income, since that metric will always be reduced to “for whom?” While we need to claim that education produces positive effects in the real world, the effects we should emphasize are increased creativity, flexibility, personal satisfaction, and civic engagement. In other words, we need to make the case for higher education as a public, not merely a private, good.