1. Introduction
During the early months of COVID-19, rumors spread that the illness was not caused by the virus. Elaborate conspiracy theories claiming that "radiation" from 5G mobile phone towers was the deeper cause quickly developed, building on older versions of the same theory. This time they found a rapt audience hungry for effective action, resulting in physical attacks against both masts and engineers, including more than 100 arson attacks on towers in at least eight countries (
Chee 2020;
Jolley and Paterson 2020). The study of conspiracy beliefs is, therefore, timely and important.
Conspiracy theories vary in their scope and direction. Some undoubtedly “arouse strong feelings and divide communities and society” and thus fit the definition of ‘controversial issues’ (
Council of Europe 2015). Conspiracy theories tend to be and correlate positively with other explanations that conflict with best academic knowledge (
Lobato et al. 2014;
Stone et al. 2018), and belief in them leads to less support for democratic processes and institutions (
Imhoff et al. 2020;
Jolley and Douglas 2014). They typically fill a function as theodicy on the human level (
Barkun 2003), explaining the evils of current situations as originating, at least partially, from other social groups. Among them, we find conspiracy theories that take religion and/or ethnicity as their topic, with, for example, actors from (other) religions being cast as the villains of the narrative (
Dyrendal et al. 2018). When antisemitic and Islamophobic conspiracy theories are subscribed to, such beliefs can be tied to strong social identities, making them hard to combat. In common with other conspiracy theories, such variations play a role in political polarization (
Kreko et al. 2018), and they are used in identity-protective cognition (
Lewandowsky et al. 2013). There is, however, a silver lining to the very problems noted: they make conspiracy theories a good case for practicing the analytical skills needed to deal with a host of these problems (cf.
Cook 2019).
2. Conspiracy Theory
“Conspiracy theory” can be a slippery concept. Researchers approach it differently, with different building blocks being more appropriate for some disciplines and certain research interests (see
Dyrendal et al. 2018). For example, some disciplines stress the "theory" part of the construct. This leads historians and cultural theorists to demand a developed narrative, which can then be analyzed for rhetorical styles, the historical development of themes, how they travel, political usage, and a host of related issues (e.g.,
Butter 2014). Philosophers, on the other hand, tend to focus on epistemic concerns, the shape of arguments, and related issues (e.g.,
Dentith 2018). Religious studies scholars likewise try on the term with the tools of their own trade and tend to see myths, theodicies, and communities of “believers” (e.g.,
Robertson and Dyrendal 2018).
Another set of options for addressing conspiracy theories relates to surveys as the main tool for finding them. This approach is typically more prominent in political science and social psychology. Developed narratives, genres, the shape of arguments, and epistemological concerns are all harder to press into short survey questions. The research thus treats conspiracy "beliefs" as some level of agreement with, typically, one-sentence assertions about conspiracy (e.g.,
Uscinski 2020, p. 31), and it is primarily interested in what predicts such beliefs and what they in turn predict. These findings may not cover
beliefs—especially in the way qualitative studies of religion tend to treat notions of belief—as much as they discover
suspicions about conspiracy (cf.
Smallpage et al. 2020;
Wood 2017). However, the findings on its correlates are consistent enough that the way such scholars tend to understand "conspiracy theory" can be generally useful, not least because they give us the most specific ideas about where such beliefs come from and how they can be mitigated when harmful.
In place of a hard definition, the term conspiracy theory may, thus, be understood here along the lines of agreement with one general survey item: “Much of what happens in the world today is decided by a small and secretive group of individuals” (
Oliver and Wood 2014, p. 959). This “secret cabal” item from political scientists Eric Oliver and Thomas Wood stresses suspicion that something secret and sinister is going on, and that it is directed by a small, shadowy group of people, but it is void of details. It is thus an example of a general attitude that can be underwritten from multiple political and religious points of view.
Specific conspiracy beliefs will include specific grievances. Conspiracy theories that are successful enough to arouse concern tend to address ongoing issues of a political nature (
Butter 2014;
Moore 2018). The issues may be long-standing and relate to established lines of social conflict, or they may relate to more recent developments and fissures in the polity. In either case, conspiracy narratives typically convey suspicion that some outgroup or outgroups are dangerous and harbor plans in secret. In developed and successful narratives, the outgroup is said not merely to plot in secret to harm the ingroup or its sacred values; they are already committing horrific deeds against innocents and subverting the true order of things. The outgroup in question will vary in size depending on the lines of conflict, and the "small and secretive group" may be exchanged for anything up to and including ethnic groups, such as "the Jews" or "the Chinese".
Conspiracy theories analyzed as part of the category “controversial issues” can, thus, be addressed along some of the dimensions related to such controversy: they can be long-standing or they can be recent and, as a class, they can be treated both as inherently and as superficially controversial (
Council of Europe 2015). Analyzed as inherently controversial, conspiracy beliefs can derive from a fundamental disagreement on strong beliefs and central values and, therefore, be seen as intractable. However, they can also be framed as a topic solvable by looking at evidence and arguments, and, thus, as “merely” superficially controversial. Overall, it may be useful to think of conspiracy theories as a category residing between the superficially and the inherently controversial. They may be seen as
expressing strong beliefs and values to be taken seriously rather than literally; the challenge for educators is to turn what seems inherently controversial, relating to strong values and fundamental beliefs, into something that can be resolved by appeal to logic and evidence. This may not always be difficult: not all interest in conspiracy narratives is belief, and some beliefs could be tentative and passing—not least among adolescents. The problem is that we do not know how much of the interest is belief, or how many students it concerns.
3. Conspiracy Beliefs, Adolescents, and School
Conspiracy beliefs have typically been studied in adult populations. This goes for both qualitative and quantitative research into conspiracy theories. The closest we come to systematic knowledge about how conspiracy thinking and conspiracy beliefs may look among those of school age is the research on young adults. However, most of this research has been conducted on undergraduates, and, while these are close in age, they seem likely to differ systematically from even final-year high school students along the lines of family background and academic aptitude. Both may reasonably be suspected of affecting the propensity to conspiracy beliefs and the types of beliefs held (
Goertzel 1994;
Smallpage et al. 2020;
Ståhl and van Prooijen 2018). We know that conspiracy belief is related to factors such as belief in the paranormal, lower analytical thinking skills, ontological confusions, and a higher degree of reliance on intuition (
Lindeman and Aarnio 2007;
Swami et al. 2014;
Wood and Douglas 2018). Since critical and analytical thinking must be learned, while intuitive, intentionalist, and anthropomorphic thinking is more automatic and common in children, one could reasonably expect both to be more prevalent in a broader population of adolescents, who also have less education (cf.
Rizeq 2019).
We know, however, little about the social and developmental pathways of conspiracy beliefs in adolescents. It is possible that increased exposure to non- and anti-scientific beliefs and related identity construction could, over time, effectively either compensate for or directly counteract the effects of age and education. The former is the main finding of the most direct study into the topic to date (
Rizeq 2019, chp. 4). When they compare adolescents (age 12–19,
Mage = 15.9) and young adults (age 19–30,
Mage = 19.4) across beliefs in the paranormal, conspiracy theories, and anti-scientific attitudes, it is only on paranormal beliefs that there is a significant difference. The levels of anti-scientific attitudes and beliefs in conspiracy theories are effectively identical across age groups, and they seem to be driven by the same factors. That the levels are similar may seem to be bad news for the effects of education, but if the factors behind them are indeed the same, it is promising for the effects of didactic interventions. Before we go there, however, we should know more about how the situation in school is perceived.
Our knowledge here is also sparse, and the research is just beginning. Some early qualitative data from teachers and students in Norway yield some unsurprising data points on which to reflect.1
First, during school visits, the teachers were often not much better equipped than students when it came to conceptualizing "conspiracy theory". The teacher responses considered here are, however, mostly of a different kind. Most were gathered from open-ended questions in surveys where teachers had taken part in a half- or full-day course about conspiracy theories. These teachers, therefore, showed both higher motivation and richer associations to draw on when answering the questionnaire than we are used to seeing when merely visiting classes. Since the questionnaire was also used as a tool for group discussions, the collective answers were able to exemplify a wide range of conspiracy beliefs and reflections on them in the class context.
Overall, the teachers' answers mostly concurred with what we found in student sessions with regard to both the prevalence and the types of theories circulating. The conversations and Q&A sessions with Norwegian high school students at multiple schools showed that most were unfamiliar with extant conspiracy theories. That was true even of nationally highly salient theories. After the mass murders at Utøya in 2011, many students learned about the murderer's motivating conspiracy theories. A handful of years later, this knowledge would seem to be in decline. While a minority volunteered that they recognized the "Eurabia" term as associated with him, very few could reproduce even the overarching "great replacement" (e.g.,
Zuquete 2018, pp. 146–55) elements of his beliefs. The kinds of conspiracy beliefs students typically showed some familiarity with were those often cited in mainstream media or specific types of conspiracy lore directed at their age group. An example of the latter was the "Illuminati controls popular music" theories. These had also featured in a popular, humorous meme ("Illuminati confirmed") among teenagers a few years earlier, and they were in decline but still popular during our time frame (2017–2019). From general media coverage, the moon landing as a hoax and flat earth theories were trending as topics and were cited by several students, as well as by groups of teachers. Interested students also mentioned classics of media coverage, such as theories about the deaths of JFK and Princess Diana, but they showed little to no familiarity with the content. The three lines of conspiracy theory that elicited clearer familiarity and belief responses were 9/11 theories, great replacement theories, and "climate change as a hoax" theories. These were also more clearly tied to political identity. Some topics were more likely to be noted by teachers in response to the questionnaire items about which, if any, conspiracy theories they experienced as problematic: "group polarization" conspiracy theories, "Eurabia" theories, "Islamization" theories, and "theories where minorities are scapegoats" were some of the answers. "Just asking questions" about the Holocaust was also mentioned. While the latter had also been used as an example in the course, the examples of what were deemed most problematic clustered around group prejudices.
Students and teachers agreed that only a small minority of students show more than passing interest in conspiracy theories in classroom settings, and many of these responses are jocular or explicitly critical. In conversations with students expressing something that seems more positive, conspiracy theories often appear to occupy the ambivalent space of identity play rather than being an established belief of consequence. Establishing whether students believe or merely play at postures of belief is its own problem; either way, teachers saw handling both the ambivalent and the small subgroup of clear believers as difficult. This was not merely because some of these conspiracy theories were otherwise associated with extremism, but because they were sensitive in other ways. Some students used conspiracy theories to disrupt class, express outsider status, and seek attention.2
The problem for teachers was then whether to address this at all, and, if so, how to handle the issue in a manner that maintained classroom flow but at the same time was sensitive and avoided pushing the student further away.
This was typical of how teachers attempted to master such situations. When conspiracy theories came up in the classroom, it was almost always because a student spontaneously uttered support or fascination for one of them. Although experiences varied broadly,3 these "panic moments", as one group of teachers called them, could be difficult to handle. This was especially the case when the students who brought up these issues were among the more socially isolated and had taken on an explicit outsider role. These "outsiders" made themselves difficult to reach, the dynamic in the classroom became difficult, and teachers often felt they lacked constructive, didactic tools to treat the situation as a possible occasion for learning. One stated that she, therefore, tried to smooth over and otherwise ignore the topic; others went into open dialogue, letting the class explore with whatever tools they felt they had (usually statements of opinion). This was especially difficult when students were pushing conspiracy beliefs denigrating fellow students—typically antisemitic and/or Islamophobic versions, but also gender-related ones. This made the universal demand for treating viewpoints and students with respect difficult; furthermore, it was sometimes seen as possibly tied to radicalization into extremist attitudes.4
One group of teachers analyzed the situation as they saw it: new technological and socio-political conditions make conspiracy theories more available, more relatable, more normalized, and, thus, also potentially more problematic. The multiplicity of platforms with semi-public spheres, where new subcultures flourish, create new narratives, and adapt old ones according to sometimes briefly popular special interests, makes it impossible for teachers to follow up on "what's going on".
The technological development and the multiplicity of spheres and tales, including tales about lesser-known mainstream news, can exacerbate teachers' feeling of being unprepared when these half-baked suspicions or full-blown narratives show up in class. The primary remedy they saw was a more general preparedness ("media literacy"). Other groups concurred, and that was also what many attempted to practice, applying general elements of source criticism and whatever elements of analytical thinking they had available. Some were wisely skeptical of amplifying conspiracy theories by giving specific theories special attention, especially as time is limited. One group wrote that "the hard part is when we lack knowledge required to address the circular arguments in the conspiracy theory and do not have time to really analyze the arguments by going deeper to the premises of how to establish reliable knowledge". In general, and again unsurprisingly, as these teachers were mostly attendees of a course on conspiracy theories, they wanted to learn more general tools for handling conspiracy theories as a topic. Their own shorthand suggestion was often (to be embedded within a framework of learning) "critical thinking".
Although these interactions have offered many areas for reflection, they need to be interpreted with caution. One problem is the select nature of the respondents; another is time and connection. The longest session with students was 90 min; the others ranged from 15 to 45 min after a lecture. One challenge in eliciting responses from students on conspiracy theories in such short sessions was that they usually had only vague notions of what should be considered a conspiracy theory. This is not surprising, as any teacher of first-year theory courses in the academic study of religion can attest. Thinking about abstract concepts and applying them to statements and narratives is a difficult exercise at the best of times. But where "religion" comes with established prototypes in organized social activities and culturally central narratives and symbols, "conspiracy theory" has less of the same cultural baggage. Further, the sample of teachers was drawn from those completing a professional learning course, alongside teachers who had not attended such a course. As such, the experience of the course may have impacted responses in ways we have not considered as part of this article. Nonetheless, the responses from teachers echo the general findings related to other controversial issues: they addressed "teaching style, teachers' responsibility to protect student sensitivities, classroom climate and control, teachers' lack of expert knowledge, and their ability to deal with spontaneous questions and remarks" (
Council of Europe 2015, p. 15). Are there definitive responses to these issues?
4. What to Do?
There may be an upside to facing conspiracy theories head-on, but there is no hard and fast solution. The good news is that what seems to work best is related to goals schools are already pursuing, and that focusing on certain broader skills may help achieve more goals than merely protecting against destructive conspiracy beliefs. However, the message comes with two caveats: as mentioned above, the knowledge base on conspiracy theories derives almost exclusively from older age groups, and studies on how to reduce conspiracy thinking (see
Kreko 2020) are relatively few and typically address specific conspiracy theories.
The most basic conventional technique is fact-checking and correction. Conceptualized as a teacher-driven procedure, the important elements are to get the facts straight, keep the message simple and as non-threatening as possible to overall worldview and identity, and tell a story (
Cook 2017). Conspiracy theories can be “sticky” (
Jolley and Douglas 2017) and the counter may need to be not only based on reliable information, but also to be as sticky as possible, summarized as SUCCES: “simple, unexpected, concrete, credible, invoke an emotional response, and tell a story” (
Cook 2017, p. 14). Using clear, easy-to-understand graphics helps, and the correction should follow the structure of presenting the facts before the erroneous story and then explaining how and why the story goes wrong.
This is not an easy task to be prepared for, and it is badly suited to the "panic moments" when controversial topics crop up spontaneously. While research shows that corrections do work, they do not in and of themselves transfer between situations. Nor is a correction after the misinformation has been spread the most effective way to combat even specific wrong information. Happily, albeit with the caveats above, there is a better tool both for combating misinformation and for training broader sets of skills: from extant evidence, the most promising tactic involves inoculation, or "prebunking" (
Banas and Miller 2013;
Cook et al. 2017;
Jolley and Douglas 2017;
Roozenbeek et al. 2020).
Inoculation is a long-established, effective communication strategy against hostile persuasion. Its history goes back to the Cold War, when early versions were developed as a protection against "brainwashing", and it has shown itself effective over a wide range of situations for protecting against misinformation (
van der Linden and Roozenbeek 2020). As the name indicates, it is a "vaccine". Recipients of attempted persuasion are presented with a "weaker version" of the misinformation in order to develop resistance to it. The strategy enlists both cognitive and affective dimensions before presenting the misinformation: first, there is a warning about the coming threat, then a preemptive refutation, and only then is one presented with the misinformation itself. In practice, it can mean something like presenting how a conspiracy theory is destructive, e.g., how antisemitic conspiracy theories are used not only to denigrate and promote discrimination and violence against Jews, but also to promote anti-democratic political goals, followed by clear, factual information
before one is introduced to examples of conspiracy theories. The order in which things are presented is important, since, as we mentioned above, misinformation is “sticky” (
Lewandowsky et al. 2012) and conspiracy theories seem to be especially so (
Jolley and Douglas 2017). The pervasive cynicism about and distrust of scientific and other mainstream sources of information is often included as a specific inoculation of its own in conspiracy theories: science and media are bought and paid for by Them and cannot be trusted. The corollary is that only contrary voices can be trusted, partially accounting for the anti-science permeating conspiracy culture (cf.
Barkun 2003).
The inoculation strategy utilizing the strict warning–fact–misinformation structure is effective, but it tends to be fairly specific. For educational purposes, we would like a more general approach that could tie into larger programs of “critical thinking” skills (e.g.,
Sellars et al. 2018). The style of thinking that has been shown to predict lower conspiracy beliefs is "conscious, deliberate, effortful, slow, affectively-neutral, and rule-based" (
Swami et al. 2014). It is analytic and "actively open-minded" (e.g.,
Stanovich and West 2007). To work as a generalized inoculation, it needs both general and specific tools. Recent studies on combating climate change misinformation and “fake news” yield an approximation of what we are looking for (
Cook et al. 2017;
van der Linden and Roozenbeek 2020): learning general strategies both for good thinking and reliable conclusions and for identifying unreliable information, misinformation, and disinformation. This means the inoculation takes on a more active form. Rather than passively reading a persuasive message, participants (students) actively take part in checking sources, facts, and logic, and in constructing and deconstructing fake news, misinformation, and disinformation.
The teacher changes roles accordingly, from the all-knowing sage of immediate rebuttal to a coach instructing in critical investigation skills. Students need instruction, but the point is that
they are the ones doing the exercises and training their skills. Studies consistently show that those who employ analytical thinking more regularly score lower on conspiracy beliefs (e.g.,
Swami et al. 2014). Training the “mindware” (
Stanovich 2011) required for analytical thinking takes practice, but it reduces badly grounded beliefs (
Swami et al. 2014), probably reduces prejudice (
Jolley et al. 2020;
Yilmaz et al. 2016), and should have other beneficial effects.
5. How to Do It (and When)?
Conspiracy theories may be considered a subset of many topics, and as such they offer the possibility of training skills across multiple classes and situations. They are relevant to ("English") literature studies and the natural sciences as much as to history and religious, media, or social studies. Conspiracy theories can be a subset of rumors and "fake news"; they can be part of deliberate propaganda, denialism, and disinformation; and they can be provocative "trolling" for entertainment and disruption. All of these yield traits and tactics we may use to help students identify problems with conspiracy narratives, and since these traits and tactics are much more common than specific conspiracy beliefs, these are skills that transfer to other situations.
Considered as narrative, they can draw on (or belong to) a genre ("melodrama"), they are often employed in constructing the narrative arc of thrillers, and their mode of construction borrows heavily from the detective genre (e.g.,
Boltanski 2014;
Borenstein 2019). Conspiracy theories often borrow their moral universe from melodrama and related mythology: heroes fight to unmask and stop villains who are committing grotesque misdeeds against innocent victims. All that is good suffers, and heroic, masculinized action is needed to reconstruct the true order of the world. Like detective novels, conspiracy theories present a world filled with hidden clues, clues that the orthodox channels of science, media, and police are unable or unwilling to follow to their (only) logical end. The villains
have to be leaving clues that form hidden patterns to be uncovered, and these are inevitably meaningful once one has learned to see them. As in thrillers and spy novels, the conspiratorial threat in conspiracy theories tends to be larger and more pervasive than the limited threats of detective fiction. The threat is against the good of the nation or more. Few can be trusted, and an important, recurring plot twist is that "good" actors and allies turn out to be working for Them, which explains why those who disagree cannot be trusted.
All these elements may be present in how conspiracy theories are used as political speech (e.g., in a jeremiad). Some versions borrow from apocalyptic speech in presenting the deeds, goals, and consequences should the conspiracy of political opponents not be stopped: all that is good will be brought to an end, and the dastardly deeds of the conspiracy and their pernicious effects on society, morals, and religion are everywhere (cf.
Hofstadter 1964). These narratives often make use of
atrocity tales. A specific version of melodrama presented, sometimes truthfully, as valid news about recently uncovered crime, atrocity tales give details about horrific crimes committed against one or more innocent victims (e.g.,
Best 1990). They are often followed by "advocacy numbers": highly inflated and misleading estimates of how many suffer from this kind of crime, and thus at the hands of its evil perpetrators.
These kinds of conspiracy theories tend to circulate as rumors, and they are used in "fake news". This means that they appeal to something as being fact and point to something they claim to be evidence. Often this takes the form of denying what mainstream sources present as true, and, more systematically, of "denialism" toward contrary views from sources such as science or mainstream news. This leads us to the possible role of conspiracy theory in teaching media literacy, source criticism, and fact checking, and in learning about certain common errors of reasoning. Knowledge about the literary genres and the shape of classical conspiracy narratives may help, but it is not strictly necessary, since conspiracy theories are subsumed here under general skills for learning how best to investigate the possible truth of statements. Simple guidelines for student activity relate to such things as checking the source (see
European Commission 2020): Is the source known for reliable news? Do they retract news and change their coverage when better knowledge is presented? Do independent fact checkers treat them as reliable, and do they support the specific claim?
“Fake news” regularly uses conspiracy theories, and they are a constant presence in denialism. Psychologist John Cook notes some simple traits to look for (
Cook 2020), one of which goes directly to source criticism: fake experts, logical fallacies, impossible expectations, cherry picking, and conspiracy theories (FLICC).5
People claiming expertise they do not have, covering themselves in vocabulary that may sound like relevant science to the uninitiated, are a mainstay of the larger family of misleading claims. Checking the credentials of specific "experts" cited, as well as the venue of the claims, is good practice and a fairly simple group exercise to set up. For other training venues looking more at the processes behind spreading conspiracy theories and other fake news online, "the fake news game" is an interactive alternative shown to be highly effective in inoculating against fake news (
van der Linden and Roozenbeek 2020).
Originally tested as a card game in a high school setting (16–18-year-olds), it is one of the few interventions, general or specific, tested on adolescents. The online version, called Bad News, places the player in the role of a maker of fake news in a simulated social media setting. The surface goal is to gain "followers" while gaining and keeping "credibility" with them (ibid.). The player learns sets of different disinformation strategies and how and why they work. Learning to use, and, thus, to recognize the use of, emotional content, polarization, trolling, and discrediting opponents—with or without claiming conspiracy—is specifically relevant to inoculating against conspiracy theories.
Logical thinking and fallacies can be dry material for students to encounter. They are still important. Good analytical thinking is a general goal of education; it trains good citizens for democratic participation, and it helps students understand the world better and thus to feel some measure of control in life. In the “panic moments” when students present controversial theories, it also helps that good practices in analytical thinking are essential to uncovering real conspiracies. After all, the world is full of small- and medium-sized conspiracies. As John Cook and Stephan Lewandowsky write in
The Conspiracy Theory Handbook (
Cook and Lewandowsky 2020), healthy skepticism and coherent thinking that is responsive to new evidence can lead to uncovering such true conspiracies, whereas the unreliable reasoning typical of conspiracy thinking leads to false results and false accusations.
Cook 2020) may not be comprehensive, but the points are well illustrated, are often easy to identify, and also assist in media literacy. For instance, "fake experts" involves false balance, with extreme minority views magnified to present a fake debate that primarily serves to mislead and increase polarization. Cook's logical fallacy list contains good, common pointers that, by extension, show good practice in thinking; it also contains some new items that are good to think with and to point to in conspiracy theories. For instance, the distraction technique of the "red herring" is accompanied by the subcategory "blowfish": blowing a small detail out of proportion and drawing conclusions from it that would be wildly inappropriate even if the alleged fact held up to scrutiny.
The Conspiracy Theory Handbook follows up on the FLICC themes with content related to identifying what is specific to conspiracy theories. These items (“CONSPIR”) also make for good student exercises, identifying things such as internal contradictions, a certainty that things are caused by nefarious intent, and the reinterpretation of randomness into a suspicion of conspiracy that makes itself immune to contrary evidence.
Using a modified inoculation strategy across disciplines that focuses on general characteristics of conspiracy theories holds several possibilities. It activates students in learning to recognize rhetorical strategies, storytelling techniques, and good and bad practices with regard to sources, thinking, and arguing. By training students in good habits, it can help turn the "panic moments" into opportunities for active learning, making what can appear to be an inherently controversial issue into something more superficial that can be checked according to the rules of good thinking. It gives the teacher an active role as instructor and, therefore, sometimes also as referee, but it also involves the peer group in the habit of training skills. It will involve active students interacting with both "rules" and examples, training good habits, and exercising critical, actively open-minded thinking that should also fit into larger programs of critical thinking (cf.
Sellars et al. 2018).
When successful, this approach should strengthen thinking skills, which may lead to an increased feeling of understanding the world, and thus of being more in control, and to less vulnerability to malicious manipulation and to propaganda that leads to hatred of outgroups.