Paying Attention: Big Data and Social Advertising as Barriers to Ecological Change

Abstract: Big data and online media conglomerates have significant power over the behavior of individuals. Online platforms have become the largest canvas for advertising, and the most profitable commodity is users' attention. Large tech companies, such as Facebook and Alphabet, use historically effective psychological advertisement tactics in tandem with enormous amounts of user data to effectively and efficiently meet the needs of their customers, who are not the end-users, but the corporations competing for advertising space on users' screens. This commodification of attention is a serious threat to socio-ecological sustainability. In this paper, I argue that big data and social advertising platforms, such as Facebook, use commodified attention to take advantage of psycho-social neuroticisms and commodity fetishism in modern individuals to perpetuate conspicuous consumption. They also contribute to highly fragmented information ecologies that intentionally obscure scientific facts regarding ecological emergencies. The commitment to shareholders and growth economics makes social advertising conglomerates a significant barrier to a socio-ecological future. I provide a series of solutions to this problem at the institutional, research, policy, and individual levels, along with areas for future sustainability research.


Introduction
Beginning in the 1930s, there was a systemic shift in the relationship between science and technology that placed social focus on the end products of technology. This view permeates common understandings of technology as ever expanding and as a mechanism for growth economics. Heidegger warned that technology would begin to deeply undermine social relations, rather than act as a co-evolutionary process for human development [1]. Most technology critics of his time were instead, and understandably, focused on how technology seemed to contribute to pollution and heavy warfare [2]. However, others noticed early trends in line with Heidegger's concerns and provided alternative visions in hopes of untangling technology and growth economics. Mumford [3] presented a vision of a polytechnic in which technology is life-oriented and a tool for the democratic relation of diversity among humans, while Schumacher's [4] visions of appropriate technology demonstrate early concern regarding socio-technical relationships. These visions were not enough to put moral boundaries on technological innovation. Instead, technology became the primary tool for efficient economic growth. Now, unchecked socio-technological relationships are deeply undermining socio-ecological transitions, particularly through social advertising platforms.
In the most recent agenda-setting attempt between socio-ecological research and technology, Wallis et al. state that "technology is not neutral" [5]. Today, technological development is so extraordinarily rapid that there is hardly time to consider whether innovations are truly improvements, let alone the moral implications of the changes. Social media and cell phones have always been sold as more efficient and effective ways to connect with friends and family through cyberspace, without limiting our ability to be free and mobile individuals. However, this technological superstructure is not built on a base of social goodwill or media corporations' desire to strengthen social relationships and connect loved ones to one another. Social media companies allow free usage of their platforms to facilitate a highly effective connection between advertisers and consumers. It is for the benefit of their shareholders that these companies allow people to use their advertising platforms for the price of attention. While this seems like a small price to pay, the commodification of attention is behind some of the most problematic, powerful, and pervasive socio-ecological emergencies ever faced by humanity.
Sustainability scholars have largely engaged with criticisms of physical technologies, rather than social technologies, with attitudes that vary considerably [6]. Strong sustainability research is more associated with pessimistic views of technology [7,8] or with arguments that technology sold as a saving grace for ecological destruction is unfounded [9,10]. These warnings often focus on biophysical limits to technology [11,12] or on undesirable rebound effects [13]. Much of the research linking digital technology with environmental issues focuses either on the total energetic cost of operating these systems or on how big data can contribute to more efficient energy use patterns [14][15][16]. These approaches to socio-ecological technological research are insufficient, particularly regarding modern advances in advertising for behavior modification.
Mainstream economists used to argue that it is "neither necessary nor useful to attribute to advertising the function of changing tastes" [17]. Modern mainstream economists now recognize the persuasive power of advertising to alter consumer tastes and to create brand loyalty [18]. On the older view, the purpose of advertising is simply to provide consumers with more information for better decisions, as a complementary piece to physical market products. In the ecological economics literature, advertising is pinpointed as problematic, as it immorally persuades people to consume beyond necessity [19,20]. However, subsequent solutions focus on utilizing advertising for environmental change or on having advertisers do more community service work, as if the problem is still Don Draper, an advertising salesman, psychologically manipulating the public to sell a dream [21][22][23]. Socio-ecological researchers are significantly behind the curve on this issue. Today, Don Draper is a highly intelligent supercomputer collecting data on every move of social media users and employing those same psychological tricks with ever-increasing efficiency. While advertising used to face inevitable diminishing returns, these new forms of advertising allow manufacturers to set prices well above costs and earn considerable profits, and they are subject to increasing returns to scale.
The pervasiveness of social media is very well known. In 2018, 2.62 billion people used social media at least once a month, at an average of 135 min per day [24]. With that much attention on such a concentrated space, it is unsurprising that advertisers took advantage. The idea of an attention economy developed in the 1990s as the use of ads within online media took hold [25,26]. User attention was quickly pinpointed as a scarce and highly profitable resource [27]. Alphabet and Facebook capture an ever-increasing share of advertising profits through the monetization of unpaid user "work" and their data [28]. However, an overview of technological attitudes within socio-ecological research does not even mention the terms social media, advertising, or big data, suggesting a major gap in the socio-technical criticism from sustainability scholars [6]. Socio-ecological researchers need to foreground these issues if they are to have any influence on behavior change.
Through advertising and a systemic locking of the technological black box, corporations have reoriented the ways society, consumers, and researchers engage with technology. Sold as a great social connector and provider of truth, news, and information, these social-advertising platforms are instead a leading detractor of social well-being and the primary platform for the strategic obscuration of truth. In this paper, I demonstrate that the benefits of technology as a tool for social connection and shared curiosity are reduced to the lowest common neoliberal denominator: consumption. Social technology has become the largest canvas for advertising, and the most profitable commodity on the market is our attention [25,[29][30][31]. After an introduction to the problem of commodified attention, I explore the destructive relationship between commodified attention and limits to growth, mainly that social-advertising companies use commodified attention to (a) prey on the psycho-social neuroticisms and commodity fetishism of modern individuals to incite an "urge to splurge" [32] that fuels destructive conspicuous consumption and (b) contribute to a highly fractured information ecology [33] that obscures or rejects scientific facts regarding global overheating and planetary boundaries while simultaneously fueling political radicalization and human-Earth dualisms. The pervasiveness and power of these systems have become so strong that ecological economists must immediately prioritize research into these issues to loosen their grip on our reality.
The goal of this paper is to demonstrate how social advertising platforms create a major barrier to ecological change. The pervasiveness of social media use and the nature of the outcomes I describe in what follows make these platforms and advertising tactics one of the most important areas for future research. Given this, I suggest a research agenda for ecological economics that takes a stronger approach to tackling these deeply embedded systems.

Contextualizing Commodified Attention
In 2016, Instagram announced that information in user feeds would be ordered to show moments that their algorithms believed the user would care about the most, and they have kept the details of the algorithm quite quiet [34]. These algorithms are passed off as simply the math behind the screen. While math used to be seen as an unbiased filter for a deeply entangled world, it now fuels complex and important problems, such as the subprime mortgage crisis that caused the 2008 economic recession [35]. Algorithms are not without goals; they are embedded with the goals of the programmers employed by the corporations. Quite clearly, the goals of social advertising conglomerates are to meet the needs of their customers and shareholders. The customers are not the users, but rather the advertisers competing for space on users' screens [36]. The algorithms track behaviors and patterns to make decisions on what users should be shown to maximize profit to customers. "These decisions rest on ontological processes of defining and categorizing the data resulting from" online activity by users [34]. Thus, the purpose of algorithms is not to provide users with the most enjoyable or psychologically beneficial news feed, but rather to sell the most effective use of advertising space. The user is the commodity of multi-billion-dollar publicly traded enterprises. These algorithms play a significant role in what content is consumed online [37][38][39], and by "establishing the conditions by which social media users are seen, algorithms serve as disciplinary apparatuses that prescribe participatory norms" [34]. To this end, algorithms are highly effective and complex programs with the primary goal of monetary growth and wealth maximization for the customers [40].
Some characterize these systems of algorithms as artificial intelligence [41,42]. The computers in the metaphorical basements of Facebook and Alphabet are incredibly vast, complex linked networks running various algorithms. To some extent, the algorithms "know" the meaning of specific words enough to associate them with behavioral outcomes, weaknesses, and vulnerabilities. Big data companies such as Alphabet run tests with thousands of variables, testing whether something as simple as changing a font color improves user engagement, thereby increasing the "knowledge" of the AI [35,43]. In 2018, a confidential report leaked from Facebook indicated that ads would be targeted not on demographics but on how users are predicted to behave, and that the company is able to persuade users toward different decisions, altering human choice and behavior [44]. Moreover, these algorithms can change behaviors beyond consumption, such as political engagement [45].
An engineer who worked at several of these multimedia companies said that even the engineers do not fully understand the function, decisions, language, and learning of these programs [46]. They provide the program with a goal, the machines figure out how to achieve it, and with every mistake or missed opportunity the algorithms improve at capturing users' attention and selling it to the highest bidder [47]. These machines go as far as to record how long users look at a picture or a post to create a highly individualized stream. If the user stops to dwell on the ad, but does not click it, the algorithms learn to include more priming before purchase. Bilić demonstrates that, given Facebook's underlying foundations and approach, the company commodifies audience and algorithm, which in tandem solidifies its dominance and control over Internet usage habits [48].
When we are online, "we are ranked, categorized, and scored in hundreds of models, on the basis of our revealed preferences and patterns" [35]. After we are pinpointed, the multi-variable algorithms run against one another to see which action was most successful and then reconfigure future actions for increased effectiveness within milliseconds. Our ignorance of the algorithm is integral to its success, as it plays on deeply embedded psychological weaknesses, which I explore in the following section. People who spend time looking at vaccine or climate change misinformation and other elements of our fractured information ecology are fed more of the same, reinforcing their positions and making them more likely to watch more videos, generating advertising revenue, or to make purchases that solidify their identity.
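The learn-and-reconfigure loop described above can be illustrated with a toy sketch. The following is emphatically not the proprietary system of Facebook or Alphabet; it is a minimal epsilon-greedy bandit, a standard optimization technique of the kind plausibly underlying such ad serving, with invented variant names and click rates:

```python
import random

# Illustrative sketch only: an ad server "learning" which ad variant
# captures the most attention. Variant names and click rates are invented.
random.seed(42)

# Hidden "true" click-through rates the algorithm must discover.
TRUE_CLICK_RATES = {"red_font": 0.02, "blue_font": 0.05, "green_font": 0.03}

counts = {v: 0 for v in TRUE_CLICK_RATES}      # impressions served per variant
estimates = {v: 0.0 for v in TRUE_CLICK_RATES}  # running click-rate estimates

def choose_variant(epsilon=0.1):
    """Mostly exploit the best-known variant; occasionally explore others."""
    if random.random() < epsilon:
        return random.choice(list(TRUE_CLICK_RATES))
    return max(estimates, key=estimates.get)

def record_feedback(variant, clicked):
    """Fold user feedback back into the estimate (incremental mean)."""
    counts[variant] += 1
    estimates[variant] += (clicked - estimates[variant]) / counts[variant]

# Every simulated impression refines the estimates: the feedback loop
# the text describes, compressed to its bare mechanics.
for _ in range(50_000):
    v = choose_variant()
    clicked = 1 if random.random() < TRUE_CLICK_RATES[v] else 0
    record_feedback(v, clicked)

best = max(estimates, key=estimates.get)
print(best, {v: round(e, 3) for v, e in estimates.items()})
```

Even this crude sketch concentrates impressions on whichever variant users reward with attention; the production systems discussed here do the same with thousands of variables and per-user behavioral profiles rather than three font colors.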
The use of these psychological tactics in advertising is not new. In the 2002 British television documentary series The Century of the Self, Adam Curtis explores the work of Edward Bernays and Sigmund and Anna Freud. The series discusses how those in powerful positions have used Freud's work on psychoanalysis to manipulate the masses. Freud's nephew, Bernays, famously used his uncle's psychological techniques in public relations campaigns. The docu-series argues that while consumer behavior was once slightly more rational, it is now driven by humanity's most primitive impulses that make individuals believe that wants are equally important as needs, particularly regarding identity formation.
Bernays' popularity as a publicist caught the attention of the president of the 1939 New York World's Fair, Grover Whalen, a former corporate executive. The fair promised to present "The World of Tomorrow", but Whalen envisioned exhibits as primarily "commercial, industrial, oriented to consumer products", with the level "pitched to the mentality of a twelve-year-old" [49].
Prominent scientists such as Harold Urey and Albert Einstein wanted science presented for its own sake and way of thinking, rather than as a route to improved economic growth. The famed science popularizer Carl Sagan lamented that these pleas were unsuccessful and that henceforth "corporate and consumer focus remained central ... essentially nothing appeared about science as a way of thinking, much less as a bulwark of a free society" [49], a trend that has since only deepened. For Sagan and other scientists of the time, science and technology were part of a romantic curiosity that inspired innovations and helped humanity understand truths about the world and themselves. Corporations and advertisers co-opted science and technology and turned them into highly efficient routes of wealth accumulation and economic growth. There are now various concerted efforts by vested interests, such as lobby groups or politicians, to keep people engaged with technology for consumption's sake. We can see this further exacerbated in social media, which I explore in section four.
This systemic prioritization of the products of science and technology, rather than the processes or ideas, became a primary tool for selling new products to individuals [50]. The 1950s was a particularly vibrant time for selling technological progress to realize the American dream; luxury became a central defining feature of consumerism, driven by technology [51]. With computers, advertisers had to get smarter. Advertising needs to maneuver around ad-blockers and users' general irritation with online ads. Four primary strategies emerged to deal with this: (a) Native advertising or branded content-when a page on a website appears to be news or an opinion article, but it is really a corporation trying to sell something [52]; (b) Influencers-a type of micro-celebrity with a large number of online followers who uses their social capital to influence behavior to generate income [34]; (c) Controlled cyber realities-this includes ads that appear at the top of search engine results, banners that are targeted directly through advertising schemes, and the order in which posts and ads appear on social media timelines to portray the kind of reality algorithms decide for users [53]; (d) Ad-block circumventing, including websites that will detect ad-blockers and refuse access until they are disabled, or video ads that appear before actual content that viewers are forced to watch.
Except for ad-block circumventing, these new approaches to advertising are often veiled as more legitimate and unbiased content.
Technology is not simply "not neutral". Socio-technical relationships are deeply embedded in a Gramscian common sense that places technology almost entirely inside a growth and progress paradigm. Socio-ecological research on existing and possible relationships between technology and society suffers because of this. Before socio-ecological researchers can discuss ways to govern technology with ethics and accountability, there need to be hard checks put into place to drastically reduce the reach of big data companies. Their tactics not only degrade the social fabric of society but are accelerating environmental degradation by increasing consumption. This is a largely unregulated network of deeply embedded microtargeting advertisement tactics that plays on users' psychology to control the behavior of billions of people [54].
Commodification of attention is in part to blame for increased polarization in politics, the pervasiveness of conspiracy theories, denial of climate change, the undermining of democratic institutions, and patterns of consumer behavior [32,35,47,54]. This is problematic for socio-ecological researchers in two primary ways: (a) as a tool for conspicuous consumption through development of identity and (b) as a means to obscure truth and radicalize political opinions. In the following two sections I use literature from historical sociology, psychology, resilience studies, socio-ecological systems, and ecological economics to provide a narrative review of this problem.

Modernity and the Selfie
Unfortunately, a trifecta of complex systemic processes interacted to solidify technology as an agent for economic growth. While Bernays was developing highly effective methods for selling products, technology was surpassing an ease of access barrier, and people were becoming optimally primed to consume.
Growth economics emerged alongside, and relied on, the process of individualization and movement toward rights of the individual [55,56]. Polanyi famously argued that the central dynamic of moving into modernity was the disembedding of economic activities as a distinct domain, separate from cultural ties [57]. Livelihood became structured around incentives for individual economic gain, rather than well-being of community and relationships. Increasing need and opportunity for distinct social roles shifted the I/We balance toward the "I" [58]. People became increasingly oriented toward the self, rather than groups, amidst rapid progress and discontinuity [59]. Social processes shifted to make sense of the maelstrom characteristic of industrial cities, by making individuals both the subject and object of modernization. This provided individuals with a sense of power to change the world and create themselves within it [60].
The replacement of immediate relationships with secondary institutions resulted in a reorientation by individuals to depend on "fashions, social policy, economic cycles, markets" [55]. Within this process, individuals shift increasingly from prescribed to "elective biography" [56], meaning they are increasingly forced to choose and mold their social identity and bolster that via consumption. In doing so, capitalist modernity came to be engendered by social relations and patterns that furthered individualization. The idealized notion of the sovereign individual with rights of their own epitomized social order [56,61]. This rich and complex historical sociology ultimately led to the replacement of authentic social life with its representation, giving root to the "culture industry".
The idea of the culture industry was coined by Frankfurt theorists Adorno and Horkheimer, who argued that through the factory of popular culture, the masses are manipulated into passivity, particularly in robbing individuals of their imagination and providing reality for them [62]. They argued that "All are free to dance and enjoy themselves, just as they have been free, since the historical neutralisation of religion, to join any of the innumerable sects. But freedom to choose an ideology, since ideology always reflects economic coercion, everywhere proves to be freedom to choose what is always the same." The decisions made by individuals regarding their inner lives were all oriented toward becoming accepted members within the culture industry. Personality was homogenized and organized around acceptability as defined by mass media, which made it easier for corporations and advertisers to sell products through mass production. Advertisers in the culture industry succeeded in convincing individuals to consume to fit in with the culture industry, even if they were aware of the process, contributing significantly to mass consumption.
The internet has only heightened and exacerbated these processes, particularly in accentuating extreme narcissism. It is no longer enough to consume to create an image of one's self; now people want as many others as possible to both see that image and legitimize it. Social media provides an immediate way to measure the success of the representation according to peers; this opened the floodgates for companies to employ the tactics of Bernays and other psychologists alongside the technological capabilities of big data and algorithms. Social media represents a tailored experience in which the individuality of identity is central [63]. These new modes of visibility within Facebook and Instagram, constructed by algorithmic power, produce what Bucher calls a perceived "threat of invisibility" for the user [37]. He argues that this threat is so strong that it reverses Foucault's notion of surveillance via permanent visibility into surveillance via the fear of disappearing and becoming obsolete. The psychological impacts of all this are extreme.
In 1938, The Harvard Study of Adult Development began collecting data on what makes men happy and healthy. Their longitudinal study suggested with great certainty that people are happy into their old age if they have strong and healthy relationships, and that tending to relationships is a form of long-term self-care that protects people from depression and declines in mental and physical health and is a strong predictor of happiness [64,65]. While these findings came to public attention many years ago, the importance of relational bonds is still not actively encouraged.
The way many Western individuals live "is not conducive to our emotional, social, and psychological well-being ... well-being is a collateral casualty of modernity" [66]. A 2014 study found that social media use directly correlates with lower rates of happiness in relationships and higher divorce rates [67], and further evidence suggests it creates artificial relationships rife with jealousy and bullying while decreasing happiness and efficiency at work [68]. In Canada, consumers who buy cigarettes are faced with images of cancerous lungs or children struggling to breathe from second-hand smoke. But when Canadians log onto their devices, they are greeted with happy AI assistants and messages on Facebook wishing good morning or reminding them of pleasant memories from that day in years past. Perhaps warnings of depression and anxiety should be prominently displayed on these platforms. Instead, there is continuous cultural persuasion to tweet, follow, and like, despite negative impacts on well-being, because consumption is designed as a central vehicle to communicate self-identity, a central obsession in modern society [69].
It may be argued that the well-being of adults is their own responsibility, but children are a primary demographic for these systems. Continuous technological stimulation and use of social advertising platforms is stunting critical development and leading to increased levels of depression and anxiety in kids as young as three years old [70]. Jean Twenge's research finds that teenagers born in the mid-1990s to mid-2000s are incredibly technology-capable and constantly linked to devices during waking hours [71]. These teens suffer from high rates of depression. In the United States, 65% of girls aged 13-18 have experienced suicide-related events such as thinking about suicide, self-harm, or clinical depression. The percentage of teen girls with suicide-related events increased by 58% in 2012-the same year smartphones became widely popular in Western high schools. A large contributing factor to declines of mental health in Western society is this excessive differentiation of the individual and the self-perception of independence [72]. Twenge and Foster argue that this obsession is not an organic human development, but has been engineered by advertising companies, social media, and consumer society to perpetuate a feedback loop of economic growth [73].
The perception of the self as sovereign and self-determining is sold back to individuals covertly through internet advertisements. The same providers of this message provide a platform for consumers to share their carefully curated self to gain approval from their peers in hopes of garnering self-esteem. Where people get their self-esteem matters a great deal.
In 1973 Ernest Becker published The Denial of Death, which offered a "broad and powerful conceptual analysis of human motivation based on the notion that the awareness of death, and the consequent denial thereof, is a dynamic force that instigates and directs a substantial portion of human activity" [74]. Becker attributed a human's recognition of mortality as a primary motivator for human behavior. In response to such finitude, people develop mechanisms to cope with death anxiety, with one primary method being cultural worldviews, which provide a deep sense of meaning. While cultures vary greatly from person to person, Becker argues they all provide the same psychological insulation from fear of death. The worldview alone is insufficient, and humans take on prescribed social roles to acquire a unique sense of significance and value to the world and to bolster self-esteem [74].
This thinking came to form Terror Management Theory (TMT): Our basic identities and motivations function to assuage our deep anxieties from awareness of our impending death [32]. Because of this, when the public is reminded of environmental issues, they end up resorting to consumerism to form identity and status, as this is the lowest-common-denominator of self-esteem [75]. Consumerism helps to navigate uncertainty and confusion as it bolsters self-esteem, which also overwhelms consumers' moral reservations. Personal style and the feeling of making the right consumer choice are markers of prestige and self-worth [76]. Consumerism serves a double function: first to distract people from their doubts and mortality, while secondly providing a visible façade to the rest of society to proclaim success and prestige. This creates multiple feedback loops (Figure 1). Response from users on primes to spend, successful spending, and user feedback on the spending provides algorithms with more data to increase effectiveness. For the user, self-esteem is either bolstered or diminished, either way restarting the cycle to try and maintain or gain back the esteem. Algorithmic advertising has billions of people in a cycle of conspicuous consumption that degrades both mental well-being and environmental sustainability.

Disinformation to Divide and Conquer
A great deal of success in advertising and selling online is convincing users of truth. Individuals treat online communication through social media as concrete and objective representations of the real world [63]. Social media allows users to come to new (mis)understandings of the world that impact offline exchanges and realities, including the replacement of face-to-face communities. In doing so, they work to create a version of themselves that only interacts with selective information, adhering to the "neoliberal performance principles" [77]. While users see these interactions as a community experience, they are highly isolated "behind a curtain of self-presenting 'performance'" [63]. Social media platforms provide users an instant form of community seemingly for free, but the user is also accepting ritualized entertainment, which Fuchs refers to as "play labor", a new ideology of capitalism in which "objectively alienated labor is presented as creativity, freedom, and autonomy that is fun for the workers" [63]. Within this labor, the user feels free and connected, all while their worldview becomes solidified by algorithmically driven cyber-communities. The more a user associates with a perceived community, the more the algorithm will help solidify that association and, in turn, bolster user self-esteem and certainty within an uncertain world.
Never have the implications of this been more obvious than within the social media information ecology of COVID-19. The hyper-connected world has led to grave problems of misinformation on multiple aspects of the pandemic [78]. Scientific interventions that are uncontroversial within the broader scientific community become extreme points of contention within cyber-communities, which seep into real life. For example, the use of facemasks to slow the spread of the virus is based on firm scientific evidence [79][80][81], and yet people literally took to the streets at the very thought of a perceived constriction of individual liberty and freedom. Within weeks, one's stance on facemasks became highly associated with political orientation and, for many, eventually with larger conspiracy narratives. The information ecology of the internet is extremely effective at taking advantage of crises. For environmental topics such as climate change, as with the COVID-19 pandemic, users seek out, intentionally or unintentionally, information that helps to alleviate anxiety. Predatory social media algorithms prey upon this behavior.
Facebook has intentionally had its algorithm show more "I voted" tags to encourage voting, which the company calculated increased turnout by an estimated 340,000 votes, enough to swing an entire state and election [35]. Politicians target specific people through consumer marketing to ascertain beliefs and likelihood to vote, organize, and help fund-raise. O'Neil describes individuals as "stocks", and algorithms decide which stock is worth investing in by deciding what information to provide the stock with and how it will be delivered. Political donors who want to get the most out of their contributions would "give a drip-feed of money based on whether the messages they hear are ones they agree with. For them, managing a politician is like training a dog with treats" [35]. Civic life is impacted by these tactics. When fake information permeates into the public, it is nearly impossible to change the minds of those who found it convincing in the first place. This is reflective of a new era of "information warfare", within which deliberate disinformation strategies are employed by state and non-state actors [82]. Social-advertising platforms are used "to employ time-tested propaganda techniques to yield far-reaching results" [82], such as amplifying existing narratives with networks of bots that force algorithms into thinking a topic is an emerging trend, heightening its exposure. These attacks sometimes undermine the government, public institutions, and researchers, providing users with a more convincing "truth" for their worldviews. Masks do not hamper breathing when worn, even over long periods of time, and yet there are swaths of people claiming they reduce oxygen levels [83]. Even if these kinds of disinformation campaigns do not result in immediate consumption, they build users' trust in their cyber-communities and the political organizations associated with them, which helps algorithms be more successful in ad targeting.
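The bot-amplification tactic described above exploits a structural feature of trending algorithms: many rank topics by relative growth in mentions rather than absolute volume, so a small topic needs only a modest artificial push to outrank much larger organic ones. The following toy model is purely illustrative; the topic names, counts, and growth-rate heuristic are my own assumptions, not the scoring used by any real platform.

```python
# Toy model: trending heuristics that rank topics by relative growth in
# mentions can be gamed by a small bot network. All numbers are invented
# for illustration; no real platform's algorithm is represented here.

def trending_score(prev_count: int, recent_count: int) -> float:
    """Relative growth in mentions between two time windows."""
    return (recent_count - prev_count) / max(prev_count, 1)

# Organic topics: (mentions in previous window, mentions in current window).
organic = {"climate": (500, 520), "sports": (300, 330)}
scores = {topic: trending_score(p, r) for topic, (p, r) in organic.items()}

# A fringe claim with tiny organic growth, plus 80 bot accounts
# each posting it once in the current window.
prev, recent, bots = 10, 12, 80
scores["fringe_claim"] = trending_score(prev, recent + bots)

# The fringe claim now dominates the trend ranking despite having a
# fraction of the absolute volume of the organic topics.
top = max(scores, key=scores.get)
```

Because the heuristic divides by the previous count, the smallest topic is the cheapest to amplify, which is exactly why manufactured "emerging trends" are an attractive propaganda vector.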
This suggests that research based in scientific truth is not a convincing route for change. Advertisers manipulate users to fit into narrow attention clusters that make it difficult for any person to be a recognized and trustworthy source within multiple clusters [29]. The inability of public health professionals and experts to sufficiently combat COVID-19 disinformation is troubling. It suggests that for the billions of people using social media every day, reality is no longer shaped by science or expert positions, but rather by information gathered online with little to no concern for accuracy.
The same obstacles exist for convincing the public that global overheating is imminently important. Elsewhere, I have argued that "we need to ensure urgently that the science-policy nexus is stronger and more robust to uncertainty and disinformation at national and international levels, not only to support long-term policymaking, but also to avoid chaotic responses to more immediate crises" [33]. However, there are two problems with this. First, the uncertainty and complexity of the world mean we have no idea what chaotic response we need to defend against, and such crises are likely to occur more frequently and with greater urgency, especially in relation to global overheating and ecological emergencies. This suggests that we need modes of knowledge generation that prioritize cosmo-local resilience. The peer-to-peer and commons knowledge system prioritizes free and shared information and can be part of the solution. While cyber-communities were at war over the seriousness of the virus and whether masks were necessary, peer-to-peer communities were sharing 3D printer designs for personal protective equipment and enabling rapid retooling of manufacturing equipment by providing free information on how to do so. P2P knowledge sharing helps establish resilience in the face of unknown catastrophe by providing as much information as possible and creating a network of prosocial brain power.
Second, regardless of the strength of a science-policy nexus, cyber-communities are not established and bolstered by strong science and policy, but rather by corporations seeking to turn a profit or politicians looking to further political ideologies. The political polarization and crisis of truth is both a source of anxiety and a prime source of anxiety alleviation, making it a resilient terror-to-terror management system. The only way users feel they can deal with the anxiety of COVID-19 and the displeasure of seeing loved ones become "radicalized" is to take control of their world, and social media provides them that (perceived) control. In a matter of minutes, a user may see an article that angers them, a political message that resonates with that anger, and then a cute video of a cat. Not only are the algorithms building users' cyber-worlds, but they are also "flattening" the impact that major events have on individuals by putting enormously important political issues beside "empty digital space" such as cats and celebrities [63]. This degrades users' ability to process information in meaningful ways and makes large issues seem too chaotic for meaningful engagement, producing "emotional dissonance" [84]. The only perceived way to meaningfully engage with major political issues is to like them, share them, and move on. This gives users the sense that they have meaningfully participated in the political realm while not actually inciting any change. In this state, the user is more susceptible to curated information ecologies, their mind less able to critically process ideas and sort the important from the unimportant. The only way to challenge these problems of disinformation, radicalized news feeds, and pliant users is to limit the power of algorithms, big data, and the psychological power of social media.

Loosening the Grip
There are major barriers to the implementation of radical policy and institutional change. But if the story above has convinced you that individuals need to see this as a battle for the well-being of the self and the planet, there are some immediate actions an individual can take:
• Clear your cookies regularly; there are browser applications that will do this automatically every month, week, or day, depending on your settings.
• Stop using social-advertising platforms such as Facebook altogether. At the very least, delete them from your phone and only look at them when you have an express purpose.
• Install an adblocker on your browser.
• Fact-check absolutely everything you read. Make use of Snopes and other debunking websites.
• When an ad appears, even if you are interested, do not click it. Clear your cookies, open a new tab (preferably incognito), and go to the website independently. Do not give the AI dog its bone.
• Give time and attention to the voices of others to reduce polarization. As difficult as it may be to speak to someone on the far left or the far right, understand that these positions have been manufactured by multi-billion-dollar corporations; the complex systems at work behind their opinions are not their ethics or brain, but a highly efficient and intense AI system.
• Randomly report ads as offensive when they pop up on your feed.
• Buy from companies that support open-source technologies and knowledge.
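The first item in the list above, scheduled cookie clearing, can also be scripted directly. As a minimal sketch, the function below empties a Firefox-style cookie store, where cookies live in an SQLite file containing a `moz_cookies` table; profile paths and schemas vary by browser and version, so the file path and three-column schema in the demo are simplifying assumptions, and a purpose-built browser extension remains the more robust route.

```python
# Sketch: clearing cookies from a Firefox-style SQLite cookie store.
# Assumes a `moz_cookies` table; real profile paths/schemas vary by
# browser and version. The demo below uses a throwaway database.
import os
import sqlite3
import tempfile

def clear_cookies(db_path: str) -> int:
    """Delete every row from moz_cookies; return how many were removed."""
    conn = sqlite3.connect(db_path)
    try:
        removed = conn.execute("SELECT COUNT(*) FROM moz_cookies").fetchone()[0]
        conn.execute("DELETE FROM moz_cookies")
        conn.commit()
    finally:
        conn.close()
    return removed

# Demo against a throwaway database mimicking a minimal schema
# (a real cookies.sqlite has more columns; this is illustrative only).
path = os.path.join(tempfile.mkdtemp(), "cookies.sqlite")
conn = sqlite3.connect(path)
conn.execute("CREATE TABLE moz_cookies (host TEXT, name TEXT, value TEXT)")
conn.executemany(
    "INSERT INTO moz_cookies VALUES (?, ?, ?)",
    [("tracker.example", "id", "abc"), ("ads.example", "uid", "xyz")],
)
conn.commit()
conn.close()

removed = clear_cookies(path)
```

Run on a schedule (e.g. via cron), this achieves the same effect as the auto-clearing browser applications mentioned above; the browser should be closed while the file is modified.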
Beyond this, given the pervasiveness and extreme hold these systems have over individual behavior, and the continuation of conspicuous consumption and the spread of misinformation, these issues need more serious consideration in socio-ecological research. Even within the university, "mainstream social science . . . has been unable to provide answers to these challenges" because universities are "thoroughly integrated into the capitalist system and been heavily influenced by the trend of marketization, especially in the USA" [28]. In this section, I suggest some areas for academic research, institutional change, and policy approaches. The lack of recent research from the field of ecological economics regarding social media and advertising signifies the need for a research agenda on these problems.

Academic Areas for Research
Here I provide four areas for future research regarding the complexities between socio-ecological systems, technology, and advertising. I have grouped research questions and ideas within four overarching themes: foundational questions, nonrational drivers for change, advertising and policy, and property rights. I have included research questions under each of these headings in Table 1. There are fundamental philosophical and institutional considerations regarding the role of technology within ecological futures. Quite clearly, the question is not simply which kinds of technologies we might want in the future; technology and advertising have become inextricably entangled. But there is a lack of understanding regarding both what is happening and how it is happening. Ecological economists need to engage with this relationship in more meaningful ways, and there are various philosophical questions and approaches for doing so. Another fundamental issue is that technology has vast emancipatory power for creating a space where democratic and revolutionary ideas circulate. The power of social media in organizing social movements is well known. However, as long as this takes place on Facebook and similar platforms, movements are not organizing on a communications platform, but on the ground of the largest advertising agency in the world, fueled by neoliberal capitalist logics [77].

Nonrational Drivers
Quite clearly, truth is no longer significant or important. Some socio-ecological researchers routinely argue that we have science on our side and that the biophysical basis of sustainability science is the most important starting point for change [85]. However, in a world where truth is relative to one's political positioning and internet activity, ecological economists need a different philosophy of behavior change. Behavior is not changed through truth; it is changed through emotion, so scientific truth and foregrounding solutions in biophysical realities are not the route for change. However, there is an emotionality of science that ecological economists could adopt more deeply to establish a romantic cosmology. In Reinventing the Sacred, Stuart Kauffman makes a similar argument: that traditional scientific reductionism is inadequate for change. He differs from other thinkers by providing complexity science and philosophy as alternatives. Hoping to breach the wide gap between spirituality and rationalism, Kauffman argues that several emergent phenomena of complex systems (i.e., agency, value, meaning) are irreducible and absent from the codified regularities of physics [86]. These complexities "appear to preclude even sensible probability statements", resulting in an unpredictable future and suggesting a "persistent creativity" among these various scales of complexity [86]. He argues that these complex phenomena are therefore symbolically sacred, and that positivist frameworks thus need to reinstate spirituality. A reasoned approach for doing so is to share and emphasize the wonder and awe of the universe. Herein lies the ultimate strength of Kauffman's argument for ecological economists: to "use the God word, for my hope is to honorably steal its aura to authorize the sacredness of the creativity in nature". Ecological economists cannot shy away from heightening and embracing the sacredness of nature.
It is this view of science that Carl Sagan lamented losing to Bernays and the World's Fair. Sagan and other scientists were enchanted by science, such as Niels Bohr, who said: "When it comes to atoms, language can only be used as in poetry". However, these sorts of non-rational drivers of change undermine the systems that currently function to fill these voids, such as capitalism and rationality. Without capitalism and rationality, there could be a loss of social welfare systems and technological/social progress, creating new problems [87]. Thus, fulfilled individuals are potentially faced with a Catch-22: they require both the fundamental systems built on the back of rationality and the non-rational spiritual renewal that undermines those systems. Brian Swimme, in his book The Universe is a Green Dragon, raises a related question: if scientific rationality is the foundation of the problem of disenchantment, can it also be what reenchantment requires, or are the two potentially mutually exclusive [88]?

Advertising and Policy
Large-scale change is unlikely to happen without major policy interventions. Various policies exist within the ecological economics literature that would address pieces of this problem, such as hard income capping, better work/life balance, and redefinitions of success. However, policy makers should more specifically and more aggressively seek out ways to ban, tax, and warn against these technological systems.

Property Rights
The political and monetary interests invested in these systems are incredibly powerful, and these powerful systems rely on arguably unethical intellectual property rights. To combat a great deal of these issues, knowledge needs to be free and open. The ability of countries and individuals to respond to COVID-19 was entirely dependent upon information. Producing a vaccine and creating a resilient force against the spread of the virus requires shared data [89]. Decentralized knowledge and network architectures represent opportunities for individuals and groups to re-appropriate the internet [90]. Additionally, the peer-to-peer (P2P) networks that make up decentralized knowledge commons are a socio-technical infrastructure that builds resilience in the face of uncertainty. Bauwens argues that, on a small scale, distributed commons empowered by P2P are establishing the roots of a new social mode of production and knowledge [91]. Such a model enhances a participatory civil society, both locally and globally, founded on democratic principles. Institutions looking to combat the disinformation ecology online should publish and present knowledge within an open knowledge commons. Those who do not contribute to local knowledge commons only further perpetuate the problem. The commodification of knowledge is as instrumental in these problems as the commodification of attention.
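One technical foundation that lets P2P knowledge commons operate without a central certifying authority is content addressing: a shared artifact (say, a 3D-printer design) is identified by a cryptographic hash of its own bytes, so any peer can verify that a fetched copy is authentic. The sketch below illustrates the idea with Python's standard `hashlib`; the design file and the dictionary standing in for the commons are hypothetical placeholders, not any particular P2P protocol.

```python
# Sketch of content addressing, the verification mechanism behind many
# decentralized knowledge commons. The "commons" dict and design bytes
# are illustrative placeholders, not a real P2P protocol.
import hashlib

def content_id(data: bytes) -> str:
    """A content address: the SHA-256 digest of the bytes themselves."""
    return hashlib.sha256(data).hexdigest()

# A peer publishes a design to the commons under its content ID...
design = b"G-code for a face-shield bracket (illustrative placeholder)"
commons = {content_id(design): design}

# ...and any other peer can fetch the bytes by ID and verify their
# integrity locally: if even one byte was altered, the hashes differ.
cid = content_id(design)
fetched = commons[cid]
verified = content_id(fetched) == cid
```

Because trust attaches to the content rather than to any server, this design choice supports exactly the kind of resilient, authority-free sharing of protective-equipment designs described above.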

Institutional Solutions
O'Neil suggests that the tech companies themselves should adopt behaviors of ethical and reasonable technology, such as creating a Hippocratic Oath for mathematical modelers and computer scientists [35]. Institutions could also lessen the pressure on workers to advertise on advertising platforms, though this carries an obvious catch-22 of reduced visibility and legitimation. However, institutions could provide digital literacy training to their employees and workers, with significant warnings about how these platforms can influence behavior. Greater awareness may not necessarily intervene in the system, but it can provide greater buy-in toward alternatives when alternatives are presented. General literacy regarding electrical engineering and programming is also vastly important.

Policy Approaches
There are various ways that policy makers can help to combat these issues. The problem, of course, is getting comprehensive policies past invested interests. Policy makers should look for ways to ban, tax, and warn regarding these technological systems. In the 1990s, when children watched Saturday morning cartoons, they were protected from advertisements through regulation of advertisers' interactions with children [47]. Now, children consume their media through streams such as YouTube, which are unregulated regarding the quality and quantity of advertisements on children's channels. Policy makers should work to ban advertisements on websites and media directed at children. Extending this, policy makers should push to add warnings to social media websites informing users that interactions with the platform are linked to increased rates of depression and anxiety, and urging them to be aware of false information. Most importantly, policy makers need to put limits on these corporations, such as taxing the extraction of data, regulating the amount and kind of information they are allowed to collect, dumbing down algorithms, and adding warnings to all possible fake news items on the internet. Enhanced privacy laws for individuals are also increasingly important in this fight for social meaning.

Conclusions
It is not technological fixes or technology in general that deserve critique and attention from socio-ecological researchers; it is specific technological processes. Social media and big data should be a central concern for socio-ecological researchers. The reach of social technologies is so vast that it undermines and degrades both the social and ecological systems that people depend on for a healthy life. The behavior manipulation caused by employing big data for advertising purposes drives conspicuous consumption, degrades moral fabrics, and strengthens global overheating denial. Ecological economists need to centralize research on these topics and get up to speed on modern advertising tactics, which are incredibly resilient and strong barriers to ecologically oriented behavior change in citizens.
At the end of his book Attention Merchants, Wu links back to philosopher William James, who suggests that our experience of being alive "would ultimately amount to whatever we had paid attention to" during our temporary stint on the pale blue dot [30]. There is now a multi-billion-dollar industry devoted to altering that experience for the sake of consumption, growth economics, and political gain. Our shared collective culture is an important part of being human, and it is being sold. How much are we paying attention?
Funding: This research was funded by Leadership for the Ecozoic and Economics for the Anthropocene at McGill University.