Editorial

Changing Pro-Environmental Behavior: Evidence from (Un)Successful Intervention Studies

1
Behavioral Economics and Engineering Group, Faculty of Economics and Business, KU Leuven, Naamsestraat 69, 3000 Leuven, Belgium
2
Department of Psychology, University of Amsterdam, Nieuwe Achtergracht 129 B, 1018 WT Amsterdam, The Netherlands
*
Authors to whom correspondence should be addressed.
Sustainability 2021, 13(14), 7748; https://doi.org/10.3390/su13147748
Submission received: 7 July 2021 / Accepted: 10 July 2021 / Published: 12 July 2021
Human behavior is the main driver of environmental degradation and climate change [1,2]. Preserving our standard of living requires behavior change, and the most successful attempts will be informed by a solid understanding of what causes behaviors that affect the natural environment. Intervention studies can further this understanding by studying the causal determinants of pro-environmental behavior through (quasi-)experimental manipulation. Researchers across the behavioral sciences have followed this approach and tested a variety of behavior change techniques and intervention strategies [3,4,5,6]. With this Special Issue, we aim to contribute to this impressive body of evidence by inviting intervention studies that might otherwise be lost to the file drawer.
For the intervention literature to provide reliable behavior change knowledge, it needs to be largely free of systematic bias. Unfortunately, studies are more likely to enter the published literature if they yield statistically significant results, which leads to inflated effect size estimates [7,8,9]. Recent meta-analyses show that this kind of publication bias also affects intervention research in the environmental domain (e.g., [10,11]). In addition, researchers often tend to selectively report analyses that provide statistically significant results, which further contributes to the inflation of effect size estimates and the prevalence of false-positive findings in the literature [12,13].
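The inflation mechanism described here can be illustrated with a small simulation (a hypothetical sketch with invented parameters, not drawn from any of the cited meta-analyses): when only studies with p < .05 enter the literature, the average published effect size overestimates the true effect, especially for underpowered designs.

```python
import math
import random
import statistics

def simulate_publication_bias(true_effect=0.2, n_per_group=50,
                              n_studies=2000, seed=1):
    """Compare the mean effect size (Cohen's d) across all simulated
    two-group studies with the mean across only those reaching p < .05
    (the 'published' ones)."""
    rng = random.Random(seed)
    all_d, published_d = [], []
    for _ in range(n_studies):
        control = [rng.gauss(0.0, 1.0) for _ in range(n_per_group)]
        treatment = [rng.gauss(true_effect, 1.0) for _ in range(n_per_group)]
        pooled_sd = math.sqrt((statistics.variance(control)
                               + statistics.variance(treatment)) / 2)
        d = (statistics.mean(treatment) - statistics.mean(control)) / pooled_sd
        t = d * math.sqrt(n_per_group / 2)  # two-sample t from Cohen's d
        all_d.append(d)
        if abs(t) > 1.98:  # approx. two-sided critical value for df = 98
            published_d.append(d)
    return statistics.mean(all_d), statistics.mean(published_d)

mean_all, mean_published = simulate_publication_bias()
# The mean across all studies recovers the true effect (~0.2), while the
# mean across the statistically significant studies alone is inflated.
```

Under these assumed parameters, only the studies that happen to overestimate the effect clear the significance threshold, so the "published" average is biased upward even though every individual study is conducted correctly.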

1. Valuable Null Results

In this Special Issue, we addressed these issues by promoting the unbiased and transparent reporting of pro-environmental behavior research. We explicitly encouraged the submission of non-significant results and replication studies and ensured that no submission was rejected because of null results or the lack of subjective novelty. Submissions were rejected, however, when they contained serious methodological or reporting weaknesses that rendered those papers less informative for readers. At the editorial review stage, we encouraged authors to report all the studies they conducted on their research question, to report their analyses in an unbiased way (e.g., including supplementary tables displaying correlations between all study variables), and to make raw data openly available.
In this issue, we are publishing eleven reports of empirical studies and one systematic review that are valuable additions to an unbiased research literature (see Figure 1 for a word cloud of published articles). All of the empirical reports included at least one intervention study examining the effect of an experimental manipulation on pro-environmental behavior or closely related outcomes. In total, the authors reported 16 experiments: four of them were conducted in the field, six in the laboratory, and six online. These experiments used a variety of methodological approaches examining different intervention techniques and measuring pro-environmental behavior.
The field experiments examined the effect of real-world interventions on observations of actual behaviors of environmental relevance. For example, Ramli [14] studied the effectiveness of a feedback intervention on water use, and Goodhew and colleagues [15] tested whether inquiries about a wall insulation scheme would increase following the presentation of images visualizing heat loss from uninsulated walls. Similarly, Sloot and colleagues [16] examined which kind of appeal (financial, environmental, communal) was most effective in stimulating students to request information or to take a flyer about an environmental initiative.
Other researchers searched for ways to study pro-environmental behaviors under more controlled conditions in the laboratory. Both Hahn et al. [17] and Buttlar et al. [18] made use of a mouse-tracking paradigm to study whether information-based interventions could change (ambivalent) attitudes regarding packaged food and food with expired best-before dates, respectively. Moreover, Brick and Sherman [19] tested whether making choices in public vs. private laboratory settings would increase donations to environmental organizations and self-reported preferences for pro-environmental products.
Similar approaches were pursued in some of the online intervention studies. While Hallez and colleagues [20] examined whether eco labels could reduce the ecological footprint of hypothetical food choices, AlDoh and colleagues [21] tested the effect of dynamic norms on individuals’ self-reported interest in reducing meat consumption. In addition to these cross-sectional studies, two online studies used a longitudinal approach, measuring pro-environmental behavior at two or more time points and administering the intervention in between. Kesenheimer and Greitemeyer [22] examined whether daily text message appeals would increase pro-environmental donations and pro-environmental behavior on a multi-item self-report scale, and Tröger and colleagues [23] investigated the effect of reflective diary-writing on self-reported sufficiency-oriented behaviors.
All but one of these 16 experiments used a between-group design to compare pro-environmental behavior across (two to five) conditions. In their systematic review on community interventions, Biglan and colleagues [24] present interrupted time-series designs as an alternative experimental approach to the study of pro-environmental behavior. Rather than being varied between groups of individuals, an intervention could be (repeatedly) introduced, removed, or changed over time to study its effect on pro-environmental behavior. Lange and colleagues [25] used a similar approach by linking the refusal of plastic bags in a takeaway restaurant to different prosocial incentives implemented across time points.
Only three of the 16 published experiments presented statistically significant results for the main intervention effect. Even when results were statistically significant, they were subject to a number of critical limitations. In our view, none of the studies provides conclusive evidence for or against the effectiveness of a particular intervention approach. However, we think that it is vitally important for the success of our research field that these results are published after a thorough quality control process (e.g., peer review). This is particularly important for designing future field studies, because little may be known about the logistical and design challenges of field work. Inconclusive individual studies can be valuable building blocks for a cumulative (and eventually conclusive) research literature.

2. Accumulating Conclusive Evidence

The most relevant questions of pro-environmental behavior research cannot be conclusively addressed in a single study but rather through the systematic and unbiased accumulation of results. For many interventions, we still lack practically useful estimates of their effectiveness and reliable knowledge about the behavioral, contextual, or individual characteristics that moderate this effectiveness. If we want to know which kind of intervention works best for the promotion of which type of behavior in which population, we need extraordinarily large datasets rarely found in individual studies. However, such knowledge can be obtained by pooling data across studies (e.g., with meta-analyses). For example, by aggregating information across multiple intervention studies examining the effect of incentives on pro-environmental behavior, Maki and colleagues [26] aimed to test whether the effectiveness of incentives differed between behaviors or depended on incentive characteristics. Unfortunately, they were not able to test some of their hypotheses (e.g., regarding the differential effects of positive and negative reinforcement) due to a lack of studies in the published literature. In the context of such meta-analyses, inconclusive intervention studies (such as the ones published in the present Special Issue) can make a valuable contribution to addressing relevant research questions, regardless of whether they themselves found statistically significant results. Moreover, these studies provide a wealth of exploratory, secondary estimates (e.g., of the relationship between predictors), and these estimates are also valuable for future accumulation.
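To illustrate why pooling increases precision, the following sketch applies standard fixed-effect inverse-variance weighting to three hypothetical studies (all effect sizes and variances are invented for illustration and do not correspond to any study in this issue):

```python
import math

def pool_fixed_effect(effects, variances):
    """Fixed-effect inverse-variance pooling: each study is weighted by
    the inverse of its sampling variance, so precise studies count more."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    standard_error = math.sqrt(1.0 / sum(weights))
    return pooled, standard_error

# Three hypothetical studies, none individually conclusive
effects = [0.30, 0.10, 0.22]    # invented effect size estimates
variances = [0.04, 0.02, 0.03]  # invented sampling variances
estimate, se = pool_fixed_effect(effects, variances)
# The pooled standard error is smaller than that of any single study,
# so the combined estimate is more precise than any of its inputs.
```

Because the pooled standard error shrinks as studies accumulate, individually inconclusive experiments (like many in this Special Issue) still tighten the combined estimate, provided they are reported without selection on their results.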
We suggest that intervention studies must meet several criteria to contribute to a cumulative empirical literature (see also [27]). First, they need to be free of reporting biases. If only significant or hypothesis-compatible findings find their way into the literature, any meta-analysis of that literature will be biased. With this Special Issue, we illustrate one way this criterion can be achieved, similar to the Registered Report format: by publishing studies based on their methods, not their results. Second, methods and results need to be described in a comprehensive and transparent way to allow meta-analysts to identify potentially relevant commonalities and differences between studies. In our role as guest editors, we tried to promote such reporting practices and, together with the authors and reviewers, we tried to make every report as transparent and informative as possible, including open code and data where possible. Third, methods need to be comparable to a degree that allows meaningful aggregation in meta-analyses. When reviewing the submissions to this Special Issue, we observed that different authors gave different names to similar interventions or similar names to possibly very different interventions. Standardizing the operationalization of interventions in line with, for example, established taxonomies of behavior change techniques [28] may help to build a more cumulative research literature.
In a similar vein, we think that more standardization may be helpful on the level of behavioral assessment. Many intervention studies rely on non-validated ad hoc measures for the assessment of pro-environmental behavior or its assumed antecedents [29]. Increased use of psychometrically established measures and procedures may render results more comparable and thus more valuable for a cumulative science of pro-environmental behavior change.
We heartily thank all the authors and reviewers who contributed to this Special Issue. On a personal note, we both experienced the review and revision processes as very constructive and cooperative, guided by the common goal to create the most informative empirical reports rather than chase novelty or low p-values. We hope that this Special Issue illustrates the value of inconclusive results and improves our knowledge about the causal determinants of pro-environmental behavior.

Author Contributions

Conceptualization, F.L. and C.B.; writing—original draft preparation, F.L.; writing—review and editing, C.B.; visualization, C.B. Both authors have read and agreed to the published version of the manuscript.

Funding

F.L. was funded by an FWO postdoctoral fellowship (No 12U1221N).

Acknowledgments

We thank all authors, reviewers, and the MDPI editorial staff for their support of this Special Issue.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. IPCC. Climate Change 2014: Synthesis Report. Contribution of Working Groups I, II and III to the Fifth Assessment Report of the Intergovernmental Panel on Climate Change. 2014. Available online: https://www.ipcc.ch/report/ar5/syr/ (accessed on 9 April 2021).
  2. Ripple, W.J.; Wolf, C.; Newsome, T.M.; Galetti, M.; Alamgir, M.; Crist, E.; Mahmoud, M.I.; Laurance, W.F.; Coudrain, A.; Catry, T.; et al. World scientists’ warning to humanity: A second notice. BioScience 2017, 67, 1026–1028. [Google Scholar] [CrossRef]
  3. Byerly, H.; Balmford, A.; Ferraro, P.J.; Hammond Wagner, C.; Palchak, E.; Polasky, S.; Ricketts, T.H.; Schwartz, A.J.; Fisher, B. Nudging pro-environmental behavior: Evidence and opportunities. Front. Ecol. Environ. 2018, 16, 159–168. [Google Scholar] [CrossRef] [Green Version]
  4. Gelino, B.W.; Erath, T.G.; Reed, D.D. Going Green: A Systematic Review of Proenvironmental Empirical Research in Behavior Analysis. Behav. Soc. Issues 2021. [Google Scholar] [CrossRef]
  5. Osbaldiston, R.; Schott, J.P. Environmental sustainability and behavioral science: Meta-analysis of proenvironmental behavior experiments. Environ. Behav. 2012, 44, 257–299. [Google Scholar] [CrossRef]
  6. Zelezny, L.C. Educational interventions that improve environmental behaviors: A meta-analysis. J. Environ. Educ. 1999, 31, 5–14. [Google Scholar] [CrossRef]
  7. Franco, A.; Malhotra, N.; Simonovits, G. Publication bias in the social sciences: Unlocking the file drawer. Science 2014, 345, 1502–1505. [Google Scholar] [CrossRef] [PubMed]
  8. Ioannidis, J.P. Why most published research findings are false. PLoS Med. 2005, 2, e124. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  9. Kühberger, A.; Fritz, A.; Scherndl, T. Publication bias in psychology: A diagnosis based on the correlation between effect size and sample size. PLoS ONE 2014, 9, e105825. [Google Scholar] [CrossRef] [PubMed]
  10. Mackay, C.M.; Schmitt, M.T. Do people who feel connected to nature do more to protect it? A meta-analysis. J. Environ. Psychol. 2019, 65, 101323. [Google Scholar] [CrossRef] [Green Version]
  11. Maki, A.; Carrico, A.R.; Raimi, K.T.; Truelove, H.B.; Araujo, B.; Yeung, K.L. Meta-analysis of pro-environmental behaviour spillover. Nat. Sustain. 2019, 2, 307–315. [Google Scholar] [CrossRef]
  12. Gelman, A.; Loken, E. The statistical crisis in science. Am. Sci. 2014, 102, 460–465. [Google Scholar] [CrossRef]
  13. Simmons, J.P.; Nelson, L.D.; Simonsohn, U. False-positive psychology: Undisclosed flexibility in data collection and analysis allows presenting anything as significant. Psychol. Sci. 2011, 22, 1359–1366. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  14. Ramli, U. Social norms based eco-feedback for household water consumption. Sustainability 2021, 13, 2796. [Google Scholar] [CrossRef]
  15. Goodhew, J.; Pahl, S.; King, K.; Sanders, M.; Elliott, P.; Fox, M.; Boomsma, C.; Goodhew, S. Engaging People with Energy Efficiency: A Randomised Controlled Trial Testing the Effects of Thermal Imaging Visuals in a Letter Communication. Sustainability 2021, 13, 3543. [Google Scholar] [CrossRef]
  16. Sloot, D.; Jans, L.; Steg, L. Is an Appeal Enough? The Limited Impact of Financial, Environmental, and Communal Appeals in Promoting Involvement in Community Environmental Initiatives. Sustainability 2021, 13, 1085. [Google Scholar] [CrossRef]
  17. Hahn, L.; Buttlar, B.; Walther, E. Unpacking Plastic: Investigating Plastic Related Ambivalence. Sustainability 2021, 13, 2186. [Google Scholar] [CrossRef]
  18. Buttlar, B.; Löwenstein, L.; Geske, M.-S.; Ahlmer, H.; Walther, E. Love Food, Hate Waste? Ambivalence towards Food Fosters People’s Willingness to Waste Food. Sustainability 2021, 13, 3971. [Google Scholar] [CrossRef]
  19. Brick, C.; Sherman, D.K. When Does Being Watched Change Pro-Environmental Behaviors in the Laboratory? Sustainability 2021, 13, 2766. [Google Scholar] [CrossRef]
  20. Hallez, L.; Qutteina, Y.; Boen, F.; Smits, T. The ABC’s of Ecological and Nutrition Labels. The Impact of Label Theme and Complexity on the Environmental Footprint of Online Grocery Choices. Sustainability 2021, 13, 2474. [Google Scholar] [CrossRef]
  21. AlDoh, A.; Sparks, P.; Harris, P.R. Dynamic Norms and Food Choice: Reflections on a Failure of Minority Norm Information to Influence Motivation to Reduce Meat Consumption. Sustainability 2021, in press. [Google Scholar]
  22. Kesenheimer, J.S.; Greitemeyer, T. Ego or Eco? Neither Ecological nor Egoistic Appeals of Persuasive Climate Change Messages Impacted Pro-Environmental Behavior. Sustainability 2020, 12, 10064. [Google Scholar] [CrossRef]
  23. Tröger, J.; Wullenkord, M.C.; Barthels, C.; Steller, R. Can Reflective Diary-Writing Increase Sufficiency-Oriented Consumption? A Longitudinal Intervention Addressing the Role of Basic Psychological Needs, Subjective Well-Being, and Time Affluence. Sustainability 2021, 13, 4885. [Google Scholar] [CrossRef]
  24. Biglan, A.; Bonner, A.C.; Johansson, M.; Ghai, J.L.; Van Ryzin, M.J.; Dubuc, T.L.; Seniuk, H.A.; Fiebig, J.H.; Coyne, L.W. The State of Experimental Research on Community Interventions to Reduce Greenhouse Gas Emissions—A Systematic Review. Sustainability 2020, 12, 7593. [Google Scholar] [CrossRef]
  25. Lange, F.; De Weerdt, L.; Verlinden, L. Reducing Plastic Bag Use Through Prosocial Incentives. Sustainability 2021, 13, 2421. [Google Scholar] [CrossRef]
  26. Maki, A.; Burns, R.J.; Ha, L.; Rothman, A.J. Paying people to protect the environment: A meta-analysis of financial incentive interventions to promote proenvironmental behaviors. J. Environ. Psychol. 2016, 47, 242–255. [Google Scholar] [CrossRef]
  27. Lange, F. Are difficult-to-study populations too difficult to study in a reliable way? Lessons learned from meta-analyses in clinical neuropsychology. Eur. Psychol. 2020, 25, 41–50. [Google Scholar] [CrossRef]
  28. Michie, S.; Richardson, M.; Johnston, M.; Abraham, C.; Francis, J.; Hardeman, W.; Eccles, M.P.; Cane, J.; Wood, C.E. The behavior change technique taxonomy (v1) of 93 hierarchically clustered techniques: Building an international consensus for the reporting of behavior change interventions. Ann. Behav. Med. 2013, 46, 81–95. [Google Scholar] [CrossRef]
  29. Lange, F.; Dewitte, S. Measuring pro-environmental behavior: Review and recommendations. J. Environ. Psychol. 2019, 63, 92–100. [Google Scholar] [CrossRef]
Figure 1. Word cloud of articles in the Special Issue. The most common words in the published articles are displayed here with more frequent words appearing larger. Most function words were excluded and duplicates were combined, such as behavior/behaviour/behaviors. Image generated at worditout.com.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
