Article

Viewing Natural vs. Urban Images and Emotional Facial Expressions: An Exploratory Study

Faculty of Informatics and Management, University of Hradec Králové, Rokitanského 62, 500 03 Hradec Králové, Czech Republic
*
Author to whom correspondence should be addressed.
Int. J. Environ. Res. Public Health 2021, 18(14), 7651; https://doi.org/10.3390/ijerph18147651
Submission received: 5 June 2021 / Revised: 14 July 2021 / Accepted: 16 July 2021 / Published: 19 July 2021
(This article belongs to the Section Mental Health)

Abstract

There is a large body of evidence that exposure to simulated natural scenes has positive effects on emotions and reduces stress. Some studies have used self-reported assessments, and others have used physiological measures or combined self-reports with physiological measures; however, analysis of facial emotional expression has rarely been assessed. In the present study, participant facial expressions were analyzed while viewing forest trees with foliage, forest trees without foliage, and urban images using iMotions’ AFFDEX software designed for the recognition of facial emotions. It was assumed that natural images would evoke a higher magnitude of positive emotions in facial expressions and a lower magnitude of negative emotions than urban images. However, the results showed only very low magnitudes of facial emotional responses, and differences between natural and urban images were not significant. While the stimuli used in the present study represented an ordinary deciduous forest and urban streets, differences between the effects of mundane and attractive natural scenes and urban images are discussed. It is suggested that more attractive images could result in more pronounced emotional facial expressions. The findings of the present study have methodological relevance for future research. Moreover, not all urban dwellers have the possibility to spend time in nature; therefore, knowing more about the effects of some forms of simulated natural scenes as surrogate nature also has practical relevance.

1. Introduction

People react to the natural environment mostly with positive emotions, and viewing the natural environment also has a positive function in mental restoration [1,2,3]. Moreover, there is a large body of evidence that exposure to simulated natural scenes also has similar positive effects [4]. Viewing simulated natural scenes may help people to improve their actual emotional state and their mental restoration in situations in which they have no opportunity to visit a real natural environment. Thus, it is useful to deeply analyze immediate emotional reactions to various types of natural scenes with diverse research methods. In the present study, we explored emotional facial expressions in viewing natural and urban images and employed automated facial expression analysis by machine vision software. These techniques have developed and improved considerably in the last three decades and may overcome the drawbacks and time consumption associated with the facial action coding system and the technical difficulties of facial electromyography. Recently, automated computer-based technologies have demonstrated sufficient reliability (e.g., [5,6]), and their accuracy may surpass that of human raters in many cases [7].

1.1. Positive Effects of Viewing Surrogate Nature

A growing body of research has documented the therapeutic and health-improving effects of contact with the natural environment. Many studies have shown that being in a natural environment benefits health, increases positive emotion, reduces stress, and has direct and positive impacts on well-being and mental health (for review, [1,2,3]).
However, not all urban dwellers have the possibility to spend time in nature; moreover, the natural environment may be difficult to access from large metropolitan areas. Therefore, environmental psychologists have explored whether some forms of simulated nature can have at least partially similar positive effects. The research findings have suggested that exposure to simulated natural scenes (e.g., viewing photographs, slides, videos, and virtual computer-generated nature scenes (for review, see [4])) may also have similar positive effects. For instance, it was documented that viewing natural images can improve mood and perceived restoration [8,9,10,11,12]. Viewing natural videos can also improve mood and perceived restoration and reduce stress (e.g., [13,14,15,16,17,18,19,20]). A similar effect was revealed with the exploration of nature scenes in virtual reality (e.g., [21,22,23,24,25]). Some of these studies used self-reported assessments (e.g., [8,11,14,16,20,26]), while others used physiological measures or combined self-reports with physiological measures (e.g., [13,15,17,18,22,27,28]). A detailed overview of methods surveying environmental perception was provided by Browning et al. [21]. An analysis of facial emotional expression has only rarely been used (e.g., [29,30,31,32,33]).

1.2. Ekman’s Six Basic Emotions

The link between emotions and facial expressions is based on theoretical grounds and has empirical support. Several decades ago, Ekman [34] defined the six most basic emotions that should be common in all cultures: anger, disgust, fear, happiness, sadness, and surprise. They can be easily recognized in facial expressions. Although there are also other theoretical frameworks that describe emotions in a dimensional space (e.g., [35]), Ekman’s concept of six emotions has preferentially been used in analyses of facial expressions of emotions.

1.3. Measurement of Emotional Facial Expressions

Currently, three methods are used in research studies to measure facial expressions of emotion: the facial action coding system, facial electromyography, and automatic computer facial expression analysis. The first method, the facial action coding system, is based on a subjective identification of six basic emotions in video-recorded faces [36]. Specially trained human coders evaluate specific emotional expressions called “action units” that account for the expression of six basic emotions. The action unit is the smallest visible functional facial movement that human observers can recognize. Although this is a method that provides sufficient validity in facial emotion description, its disadvantage is the considerable time required for data processing. Another research technique, facial electromyography, is based on monitoring activations of facial muscles during changes in emotional responses. It requires applying electrodes on the skin surface. It enables the identification of the specific facial muscle patterns used to display, for instance, joy, appetite, and disgust (e.g., [37]). This technique enables the detection of subtle facial muscle activities, but its disadvantage is technical complexity. Moreover, having electrodes attached to the face is far from a natural condition.

1.4. Validation of Software for Automated Facial Expression Analysis

Currently, there are three major commercial software tools for automated facial expression analysis: Noldus’ FaceReader (Noldus Information Technology, Wageningen, The Netherlands) [38], iMotions’ FACET module (iMotions, Copenhagen, Denmark) [39], and iMotions’ AFFDEX module (iMotions, Copenhagen, Denmark) [39]. There is a debate regarding the reliability of these software programs in emotion recognition compared to facial electromyography or the facial action coding system. In a comparison with EMG results, Beringer et al. [40] validated the FACET software for happy and angry expressions. Recently, Kulke et al. [41] compared the AFFDEX emotion recognition software with facial electromyography measurements for the ability to identify happy, angry, and neutral faces. However, there might be specific situations where human observers are better than automated face analysis. For instance, Del Líbano et al. [42] investigated how prototypical happy faces can be discriminated from blended expressions with a smile but nonhappy eyes and found that human observers using facial action units were better than the FACET software for automated analysis. They concluded that FACET can be a valid tool for categorizing prototypical expressions, but it is not reliable enough for the discrimination of blended expressions.

1.5. Facial Expressions While Viewing Natural Environment

There are only a few studies that analyzed facial expressions when viewing the natural environment. These studies mostly employed facial electromyography. Electromyographic responses were mostly measured using the facial muscles of the forehead because these muscles can reflect mental and emotional stress better than other muscles. An increase in facial electromyography amplitude is a reflection of an increased level of muscle tension and, conversely, a decrease in amplitude reflects decreased tension.
Cacioppo et al. [29] presented slides that were mildly to moderately evocative of positive and negative affect (mountain cliff, bruised torso, ocean beach, and polluted roadway) for 5 s to participants. They found that facial electromyographic activity over the brow, eye, and cheek muscle regions differentiated the pleasantness and intensity of affective reactions to the visual stimuli. In the study by Chang et al. [30], participants viewed images of an office with window views of nature or the urban environment for 15 s. The electrodes were placed above the eyebrows. The amplitude of electromyography, whose growth indicated an increasing degree of muscle tension, was inspected. In addition, changes in EEG, blood volume pulse, and state anxiety were recorded. The results indicated that the participants were less anxious while watching a view of nature or indoor plants in contrast to offices without window views or offices without plants. However, the electromyographic results were inconsistent with the other measures. While there were lower amplitudes with the city-window office, the highest amplitude, curiously, was with the nature-window office. In a subsequent study, Chang et al. [31] presented natural images with various levels of restorativeness to the participants, each for 10 s. Electromyographic responses were measured using the facial muscles of the forehead, and EEG and blood volume pulse were assessed. The results revealed a large degree of congruency between the psychological measures of restorativeness and the three physiological responses. In summary, these few studies suggested that viewing natural images may elicit changes in facial expressions.
To date, automated facial expression recognition has not been used for the analysis of facial movements in viewing urban images. However, Wei et al. [32] explored facial emotional expressions in a real outdoor environment during a walk. Participants were asked to repeatedly take selfies while walking on urban streets or in a forest park that reflected their natural facial expressions and real-time emotions. The photographs were analyzed using FireFACE software. It was shown that the forest experience evoked higher happy scores but lower neutral scores than the urban environment.

1.6. The Goals

To date, our knowledge about emotional reactions after viewing images with natural environments registered via facial expressions is rather limited. To our knowledge, this technique has not been used in the context of environmental psychology and environmental preference research. Our goal was to explore these direct facial expressions while viewing a diverse range of images by using automated facial expression analysis, specifically iMotions’ AFFDEX software. In the present study, facial expressions while viewing natural images, namely, forest trees with foliage, forest trees without foliage, and urban images were investigated. Based on previous findings, it was hypothesized that natural images would evoke a higher magnitude of positive emotions in facial expressions and a lower magnitude of negative emotions than urban images. Furthermore, we explored whether people react in a different way to forest trees with foliage and forest trees without foliage.

2. Materials and Methods

2.1. Participants

Sixty-six undergraduates participated in the experiment. The sample comprised young adults between the ages of 18 and 25 (mean = 20.97, SD = 1.11; 42 females). The participants were enrolled in the first, second, or third year of various psychology courses. They were students in informatics, financial management, and tourism at the University of Hradec Králové. The University of Hradec Králové is a small regional university, and the students come mostly from the nearby northeastern regions of the Czech Republic, Hradec Králové and Pardubice. The area consists mostly of lowlands and temperate highlands, covered largely with deciduous forests. The participants lived in towns and villages where the natural environment is easily accessible. Thus, the stimuli presented in the experiment (see below) included the type of landscape known to the participants. Similarly, the types of city buildings were known to the participants.

2.2. Ethical Approval

Ethical approval for the present study was obtained from the Committee for Research Ethics at the University of Hradec Králové (No. 4/2018). Participants signed an informed consent form in which they declared that they voluntarily participated in the experiment and that they were informed about the experimental procedure. They agreed that recordings of their facial behavior would be registered and used for scientific purposes only. They were allowed to withdraw from the experiment at any time.

2.3. Stimulus Material

Images used in the experiment were taken by one of the authors (Figure 1). They included images of forests and urban scenes. The images were transformed into a 1920 × 1080 pixel resolution using Adobe Photoshop CS 6 software. All images had their brightness levels and contrast balanced using the “Auto Levels”, “Auto Contrast”, and “Auto Colors” options in Adobe Photoshop. The photographs were not further digitally modified. Twenty-four images were presented in one experimental session. Eight natural images of deciduous forests with foliage were taken mainly in forests around the city of Prague. An additional eight natural images of deciduous forests without foliage were taken in the same areas as the previous set of photographs. Eight images were photographs of urban streets in Prague in the Czech Republic.

2.4. Apparatus

The experiment was controlled by a PC with a 1920 × 1200 pixel resolution screen and a diagonal of 61 cm, with a Logitech Webcam C920 camera (Logitech, Newark, CA, USA) situated on the top of the screen. The camera and presentation of stimuli, as well as the data processing, were controlled by the iMotions 8.0 software. The facial expression analysis was conducted by the iMotions Facial Expression Analysis Module AFFDEX (iMotions, Copenhagen, Denmark). The web camera recorded facial videos while the participants viewed the stimuli, and then the videos were imported into the iMotions software for facial expression analysis postprocessing. AFFDEX enables the measurement of seven emotional categories: joy, anger, surprise, fear, contempt, sadness, and disgust. All emotional indicators were scored by the software on a scale from 0 to 100, indicating the probability of having detected the emotion. A magnitude of 0 indicated that the emotion was absent; in turn, a magnitude of 100 indicated a 100% probability of having detected the emotion.

2.5. Procedure

The participants were tested individually in a laboratory. The research was conducted in December 2019 within working days from December 10 to December 18 from 9:00 to 16:00. The participants selected the date and time of the experimental session according to their free time. After arrival to the laboratory, the participant signed the informed consent form. Then, he/she was informed about the experiment and read the instructions. The instructions were as follows: “You will take part in a study, in which you will successively examine a series of images presented on the computer screen. View an image with composure. Do not try to remember its content or its details. Your face will be recorded. Each image will be displayed for 15 s”. The participants sat approximately 70 cm from the display monitor. The images were presented in a random order. Every trial started with a fixation cross situated in the center of the screen on a gray background. The participants had to fixate on the fixation cross for 2 s before the image appeared. Each image was displayed for 15 s. There was a comfortable temperature in the laboratory, about 23 degrees Celsius.

3. Results

First, the raw data were exported from AFFDEX. Approximately 240 measurements were obtained for one image, and approximately 1900 measurements were obtained for one participant within one image category (urban images, forest images with vegetation, forest images without vegetation; see Supplementary Materials, Table S1). Next, the mean scores were calculated for each participant and the images in each category (Table 1). The results showed that the level of identified facial emotions was very low, under 1%, and differences between the scores for specific emotions under these conditions were also small. One-way repeated measures analyses of variance (ANOVA) were conducted to test the effect of the experimental condition (urban images, forest images with vegetation, forest images without vegetation) on the level of facial expression of specific emotions (Table 2). The analyses showed that the effect of the experimental conditions was nonsignificant for facial expressions of all emotions. Only for facial expressions of fear, compared to facial expressions of other emotions, were more pronounced differences found between urban images and both sets of forest images in the expected direction; however, the p-value was only 0.121.
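The analysis pipeline described above (averaging per-frame emotion scores into one value per participant and condition, then running a one-way repeated-measures ANOVA per emotion) can be sketched as follows. This is a minimal illustration with hypothetical data, not the authors' actual code; it implements the standard within-subjects ANOVA decomposition in pure Python.

```python
# Sketch of the aggregation + one-way repeated-measures ANOVA step.
# Hypothetical input: one row per participant, one column per condition
# (e.g., urban, forest with foliage, forest without foliage), where each
# cell is that participant's mean AFFDEX score (0-100) for one emotion.
from statistics import mean


def repeated_measures_anova(data):
    """Return (F, df_conditions, df_error) for a one-way
    repeated-measures ANOVA on a participants x conditions table."""
    n = len(data)        # number of participants
    k = len(data[0])     # number of conditions
    grand = mean(v for row in data for v in row)

    # Between-conditions sum of squares
    ss_cond = n * sum(
        (mean(row[j] for row in data) - grand) ** 2 for j in range(k)
    )
    # Between-subjects sum of squares (removed from the error term,
    # which is what makes the design "repeated measures")
    ss_subj = k * sum((mean(row) - grand) ** 2 for row in data)
    # Total and residual (error) sums of squares
    ss_total = sum((v - grand) ** 2 for row in data for v in row)
    ss_error = ss_total - ss_cond - ss_subj

    df_cond = k - 1
    df_error = (k - 1) * (n - 1)
    f_stat = (ss_cond / df_cond) / (ss_error / df_error)
    return f_stat, df_cond, df_error


# Usage with made-up per-participant means for one emotion:
scores = [
    [0.41, 0.52, 0.48],   # participant 1: urban, foliage, no foliage
    [0.10, 0.25, 0.18],   # participant 2
    [0.33, 0.47, 0.55],   # participant 3
]
f, df1, df2 = repeated_measures_anova(scores)
print(f"F({df1}, {df2}) = {f:.3f}")
```

In practice, the p-value for the resulting F statistic would be obtained from the F distribution (e.g., via `scipy.stats.f.sf`); the sketch stops at the F statistic to stay dependency-free.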

4. Discussion

By using automated facial expression analysis, the present study explored whether a short viewing of urban or natural environments would elicit changes in facial expressions of emotions that might reflect changes in actual emotional state. Although we predicted differences between facial expressions while viewing urban and natural images, we did not find any significant differences in our study, which is in contrast with a large body of previous research (for review, see [21]), which documented diverse reactions to virtual urban and natural scenes by using introspection or different physiological methods.
In the study by Wei et al. [32], participants walked along a forest or an urban street for five hours and were asked to take selfies every 30 min by posing with their natural facial expressions and real-time emotion. Photographs of their faces were analyzed and processed by facial expression analysis software to obtain scores for happy, sad, and neutral expressions. It was found that the forest walk evoked higher happy and lower neutral expressions than the walk in an urban environment. Clearly, people who spend a long time in a pleasant natural environment might express positive emotion on their faces. Thus, the first explanation of our failure may be that the 15-second viewing of an image was too short to elicit observable facial expressions of an emotional reaction. However, Cacioppo et al. [29] observed that even five seconds of presentation of slides with outdoor environments resulted in changes in facial expressions; however, they used a different measure, namely, facial electromyographic activity.
The second possible explanation may be that the visual stimuli used in the present study were not sufficiently distinct to elicit intense emotional reactions accompanied by visible emotional facial expressions. As examples of urban images, we used photographs of ordinary urban apartment houses from the first half of the 20th century. Similarly, natural images represented photographs of ordinary deciduous forests located around the capital city taken under “normal” atmospheric conditions. Moreover, they were not further digitally modified to make them more attractive. Thus, the visual stimuli used in this experiment represented common environments where participants were living and, thus, may not have had the capacity to elicit a feeling of pronounced emotional responses. In the present study, we did not employ attractive natural images that have mostly been used in environmental psychology research [20], such as high mountains, rocks, lakes, sea, etc. For instance, Cacioppo et al. [29], who reported changes in facial electromyographic activity after 5-second slide presentations of natural stimuli, used stimuli that were rated as mildly to moderately pleasant (e.g., mountain cliff) or mildly relaxing (e.g., ocean beach). Clearly, a mountain cliff or an ocean beach are more distinct environments than central European lowland forests. This explanation is consistent with the findings of the Joye and Bolderdijk study [43], where participants watched pictures of awesome and mundane nature. They found that watching awesome natural scenes compared to mundane nature scenes and a neutral condition had pronounced emotional effects. Clearly, future research should compare the effects of mundane vs. more attractive natural scenes on emotional facial expressions. 
It is worth commenting on possible effects of participants’ experiences and cultural background on perception and estimation of aesthetical values of natural environments (e.g., see scenic beauty estimation method [44]). These individual variables may even result in a different estimation of the scenic beauty of an identical natural environment. Moreover, although verbal evaluations of environments also have their cognitive component that may be influenced by common beliefs (e.g., nature is beautiful, an urban street is ugly), facial emotional expressions are more spontaneous and reflect actual emotions. Thus, these individual variables may play a more substantial role in our research than in investigations based only on the verbal estimation of the environment. A specific environment may elicit positive emotion because people may have positive experiences and memories with that environment, or the environment is surprising because it is in strong contrast with their everyday environment, and this may elicit a desire to visit such an attractive environment, etc. Thus, the further limitation of the present study is that these individual experiences and backgrounds were not explored.
Furthermore, we may also speculate that the experimental situation, when participants know that they are part of the experiment and quickly observe diverse visual stimuli, may also differ from real-life situations, when they are using some form of virtual nature for relaxation. On the other hand, a large body of research has observed various emotional reactions to the natural environment in the laboratory (for review, see [4]). Perhaps an appropriate instruction that stresses the necessity to concentrate on visual stimuli and to imagine that they are inside the environment for a relaxed walk may strengthen emotional reactions.
As mentioned above, to date, there is a lack of data from studies that used the same research methodology and computer software. Our results can only be compared with the data obtained in the most recent study that was conducted in a different field. Otamendi and Sutil Martín [45] explored facial expressions in perceiving video advertisements processed by the same AFFDEX software that was used in our study. In their study, the participants viewed advertisement spots lasting 91 s that consisted of 31 scenes. The spots showed the accompanying role that a mother plays throughout the life of a child, from birth to adulthood. Similarly, they also reported small values for specific emotions, the highest for joy with a mean = 4.82, and smaller for the other emotions with means between 0.42 and 1.12 (AFFDEX scores emotions on a scale from 0 to 100). Only in the target group for the advertisement (mature aged women) did they find higher emotional reactions (mean for joy = 14.17). Thus, their investigation obtained similar small average values for emotional facial expressions in nontarget groups, as we found in our study.
To conclude, although our findings did not confirm differences in emotional reactions to natural and urban scenes, we suppose that there are other variables that may influence these findings. A low emotional salience of the pictures was already mentioned. Moreover, it seems that a random and short presentation of different visual stimuli, which is typical for experiments in the area of visual perception, is not ideal for the investigation of emotional reactions to visual stimuli, even where they consist of “mundane” environments and are not emotionally salient enough. It seems that by using more attractive visual environments, it could be possible to find significant differences in facial emotional expressions. Moreover, instructions in the experiment to be more immersed and engaged in the presented visual environment may affect the results. It is also worth commenting on possible individual variables. Thus, the present study has methodological relevance for future research. Moreover, knowing more about the effects of viewing simulated natural scenes on emotional reactions also has practical relevance.

5. Conclusions

The present study represents one of the first attempts to use automated facial expression analysis by machine vision software within the context of environmental psychology and research on preferred environments. The results showed that a mundane environment with low emotional salience did not elicit significant facial emotional expressions. This finding may help future research that could provide deeper insights into the positive effect of viewing certain forms of simulated natural scenes. Not all urban dwellers have the possibility to spend time in nature, or the natural environment may be difficult to access. Therefore, it is useful to explore whether some forms of simulated nature can have at least partially similar positive effects.

Supplementary Materials

The following are available online at https://www.mdpi.com/article/10.3390/ijerph18147651/s1, Table S1: Dataset.

Author Contributions

Conceptualization, M.F.; methodology, M.F. and J.P.; software, J.P.; validation, J.P. and M.F.; formal analysis, M.F. and J.P.; data curation, J.P.; writing—original draft preparation, M.F.; writing—review and editing, M.F. and J.P.; project administration, J.P.; funding acquisition, M.F. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Student Specific Research Grants 1/2021 from the Faculty of Informatics and Management at the University of Hradec Králové.

Institutional Review Board Statement

The study was conducted according to the guidelines of the Declaration of Helsinki and approved by the Committee for Research Ethics at the University of Hradec Králové, No. 4/2018, 8 January 2019.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The datasets supporting this article have been uploaded as part of the Supplementary Materials.

Acknowledgments

We thank Daniel Kučera, Jan Nálevka, Jan Nosek, Filip Roškot, Filip Rousek, and Michal Voda for their help in conducting the experiment.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Bowler, D.E.; Buyung-Ali, L.M.; Knight, T.M.; Pullin, A.S. A Systematic Review of Evidence for the Added Benefits to Health of Exposure to Natural Environments. BMC Public Health 2010, 10, 456. [Google Scholar] [CrossRef] [Green Version]
  2. Bratman, G.N.; Hamilton, J.P.; Daily, G.C. The Impacts of Nature Experience on Human Cognitive Function and Mental Health: Nature Experience, Cognitive Function, and Mental Health. Ann. N. Y. Acad. Sci. 2012, 1249, 118–136. [Google Scholar] [CrossRef]
  3. McMahan, E.A.; Estes, D. The Effect of Contact with Natural Environments on Positive and Negative Affect: A Meta-Analysis. J. Posit. Psychol. 2015, 10, 507–519. [Google Scholar] [CrossRef]
  4. Browning, M.H.E.M.; Mimnaugh, K.J.; van Riper, C.J.; Laurent, H.K.; LaValle, S.M. Can Simulated Nature Support Mental Health? Comparing Short, Single-Doses of 360-Degree Nature Videos in Virtual Reality with the Outdoors. Front. Psychol. 2020, 10, 2667. [Google Scholar] [CrossRef] [Green Version]
  5. Lewinski, P.; den Uyl, T.M.; Butler, C. Automated Facial Coding: Validation of Basic Emotions and FACS AUs in FaceReader. J. Neurosci. Psychol. Econ. 2014, 7, 227–236. [Google Scholar] [CrossRef] [Green Version]
  6. Calvo, M.G.; Nummenmaa, L. Perceptual and Affective Mechanisms in Facial Expression Recognition: An Integrative Review. Cogn. Emot. 2016, 30, 1081–1106. [Google Scholar] [CrossRef] [PubMed]
  7. Geiger, M.; Wilhelm, O. Computerized Facial Emotion Expression Recognition. In Digital Phenotyping and Mobile Sensing; Baumeister, H., Montag, C., Eds.; Studies in Neuroscience, Psychology and Behavioral Economics; Springer International Publishing: Cham, Switzerland, 2019; pp. 31–44. ISBN 978-3-030-31619-8. [Google Scholar]
  8. Berman, M.G.; Jonides, J.; Kaplan, S. The Cognitive Benefits of Interacting with Nature. Psychol. Sci. 2008, 19, 1207–1212. [Google Scholar] [CrossRef] [PubMed]
  9. Johnsen, S.Å.K.; Rydstedt, L.W. Active Use of the Natural Environment for Emotion Regulation. Eur. J. Psychol. 2013, 9, 798–819. [Google Scholar] [CrossRef]
  10. Lee, K.E.; Williams, K.J.H.; Sargent, L.D.; Williams, N.S.G.; Johnson, K.A. 40-Second Green Roof Views Sustain Attention: The Role of Micro-Breaks in Attention Restoration. J. Environ. Psychol. 2015, 42, 182–189. [Google Scholar] [CrossRef]
  11. Martínez-Soto, J.; Gonzales-Santos, L.; Barrios, F.A.; Lena, M.E.M.-L. Affective and Restorative Valences for Three Environmental Categories. Percept. Mot. Skills 2014, 119, 901–923. [Google Scholar] [CrossRef]
  12. Staats, H.; Kieviet, A.; Hartig, T. Where to Recover from Attentional Fatigue: An Expectancy-Value Analysis of Environmental Preference. J. Environ. Psychol. 2003, 23, 147–157. [Google Scholar] [CrossRef]
  13. Akers, A.; Barton, J.; Cossey, R.; Gainsford, P.; Griffin, M.; Micklewright, D. Visual Color Perception in Green Exercise: Positive Effects on Mood and Perceived Exertion. Environ. Sci. Technol. 2012, 46, 8661–8666. [Google Scholar] [CrossRef] [PubMed]
  14. Bornioli, A.; Parkhurst, G.; Morgan, P.L. Psychological Wellbeing Benefits of Simulated Exposure to Five Urban Settings: An Experimental Study from the Pedestrian’s Perspective. J. Transp. Health 2018, 9, 105–116. [Google Scholar] [CrossRef]
  15. de Kort, Y.A.W.; Meijnders, A.L.; Sponselee, A.A.G.; IJsselsteijn, W.A. What’s Wrong with Virtual Trees? Restoring from Stress in a Mediated Environment. J. Environ. Psychol. 2006, 26, 309–320. [Google Scholar] [CrossRef]
  16. Mayer, F.S.; Frantz, C.M.; Bruehlman-Senecal, E.; Dolliver, K. Why Is Nature Beneficial?: The Role of Connectedness to Nature. Environ. Behav. 2009, 41, 607–643. [Google Scholar] [CrossRef]
  17. Pilotti, M.; Klein, E.; Golem, D.; Piepenbrink, E.; Kaplan, K. Is Viewing a Nature Video After Work Restorative? Effects on Blood Pressure, Task Performance, and Long-Term Memory. Environ. Behav. 2015, 47, 947–969. [Google Scholar] [CrossRef]
  18. Snell, T.L.; McLean, L.A.; McAsey, F.; Zhang, M.; Maggs, D. Nature Streaming: Contrasting the Effectiveness of Perceived Live and Recorded Videos of Nature for Restoration. Environ. Behav. 2019, 51, 1082–1105. [Google Scholar] [CrossRef]
  19. Tabrizian, P.; Baran, P.K.; Smith, W.R.; Meentemeyer, R.K. Exploring Perceived Restoration Potential of Urban Green Enclosure through Immersive Virtual Environments. J. Environ. Psychol. 2018, 55, 99–109. [Google Scholar] [CrossRef]
  20. van den Berg, A.E.; Koole, S.L.; van der Wulp, N.Y. Environmental Preference and Restoration: (How) Are They Related? J. Environ. Psychol. 2003, 23, 135–146. [Google Scholar] [CrossRef]
  21. Browning, M.H.E.M.; Saeidi-Rizi, F.; McAnirlin, O.; Yoon, H.; Pei, Y. The Role of Methodological Choices in the Effects of Experimental Exposure to Simulated Natural Landscapes on Human Health and Cognitive Performance: A Systematic Review. Environ. Behav. 2020, 53, 001391652090648. [Google Scholar] [CrossRef]
  22. Chirico, A.; Ferrise, F.; Cordella, L.; Gaggioli, A. Designing Awe in Virtual Reality: An Experimental Study. Front. Psychol. 2018, 8, 2351. [Google Scholar] [CrossRef] [PubMed]
  23. Chirico, A.; Gaggioli, A. When Virtual Feels Real: Comparing Emotional Responses and Presence in Virtual and Natural Environments. Cyberpsychol. Behav. Soc. Netw. 2019, 22, 220–226. [Google Scholar] [CrossRef] [PubMed]
  24. Felnhofer, A.; Kothgassner, O.D.; Schmidt, M.; Heinzle, A.-K.; Beutl, L.; Hlavacs, H.; Kryspin-Exner, I. Is Virtual Reality Emotionally Arousing? Investigating Five Emotion Inducing Virtual Park Scenarios. Int. J. Hum. Comput. Stud. 2015, 82, 48–56. [Google Scholar] [CrossRef]
  25. Higuera-Trujillo, J.L.; López-Tarruella Maldonado, J.; Llinares Millán, C. Psychological and Physiological Human Responses to Simulated and Real Environments: A Comparison between Photographs, 360° Panoramas, and Virtual Reality. Appl. Ergon. 2017, 65, 398–409. [Google Scholar] [CrossRef]
  26. Hartig, T.; Korpela, K.; Evans, G.W.; Gärling, T. A Measure of Restorative Quality in Environments. Scand. Hous. Plan. Res. 1997, 14, 175–194. [Google Scholar] [CrossRef]
  27. Pretty, J.; Peacock, J.; Sellens, M.; Griffin, M. The Mental and Physical Health Outcomes of Green Exercise. Int. J. Environ. Health Res. 2005, 15, 319–337. [Google Scholar] [CrossRef]
  28. Ulrich, R.S. Natural Versus Urban Scenes: Some Psychophysiological Effects. Environ. Behav. 1981, 13, 523–556. [Google Scholar] [CrossRef]
  29. Cacioppo, J.T.; Petty, R.E.; Losch, M.E.; Kim, H.S. Electromyographic Activity over Facial Muscle Regions Can Differentiate the Valence and Intensity of Affective Reactions. J. Pers. Soc. Psychol. 1986, 50, 260–268. [Google Scholar] [CrossRef]
  30. Chang, C.-Y.; Chen, P.-K. Human Response to Window Views and Indoor Plants in the Workplace. Hort. Sci. 2005, 40, 1354–1359. [Google Scholar] [CrossRef]
  31. Chang, C.-Y.; Hammitt, W.E.; Chen, P.-K.; Machnik, L.; Su, W.-C. Psychophysiological Responses and Restorative Values of Natural Environments in Taiwan. Landsc. Urban Plan. 2008, 85, 79–84. [Google Scholar] [CrossRef]
  32. Wei, H.; Hauer, R.J.; He, X. A Forest Experience Does Not Always Evoke Positive Emotion: A Pilot Study on Unconscious Facial Expressions Using the Face Reading Technology. For. Policy Econ. 2021, 123, 102365. [Google Scholar] [CrossRef]
  33. Wei, H.; Ma, B.; Hauer, R.J.; Liu, C.; Chen, X.; He, X. Relationship between Environmental Factors and Facial Expressions of Visitors during the Urban Forest Experience. Urban For. Urban Green. 2020, 53, 126699. [Google Scholar] [CrossRef]
  34. Ekman, P.; Friesen, W.V. Measuring Facial Movement. J. Nonverbal. Behav. 1976, 1, 56–75. [Google Scholar] [CrossRef]
  35. Russell, J.A.; Mehrabian, A. Evidence for a Three-Factor Theory of Emotions. J. Res. Pers. 1977, 11, 273–294. [Google Scholar] [CrossRef]
  36. Ekman, P. Universal facial expressions of emotion. Calif. Ment. Health Res. Dig. 1970, 8, 151–158. [Google Scholar]
  37. Fridlund, A.J.; Cacioppo, J.T. Guidelines for Human Electromyographic Research. Psychophysiology 1986, 23, 567–589. [Google Scholar] [CrossRef]
  38. Noldus Information Technology. FaceReader. 2007. Available online: http://noldus.com/facereader (accessed on 5 June 2021).
  39. iMotions. Facial Expression Analysis: The Definitive Guide. 2016. Available online: https://imotions.com/facialexpression-guide-ebook/ (accessed on 5 June 2021).
  40. Beringer, M.; Spohn, F.; Hildebrandt, A.; Wacker, J.; Recio, G. Reliability and Validity of Machine Vision for the Assessment of Facial Expressions. Cogn. Syst. Res. 2019, 56, 119–132. [Google Scholar] [CrossRef]
  41. Kulke, L.; Feyerabend, D.; Schacht, A. A Comparison of the Affectiva iMotions Facial Expression Analysis Software with EMG for Identifying Facial Expressions of Emotion. Front. Psychol. 2020, 11, 329. [Google Scholar] [CrossRef]
  42. Del Líbano, M.; Calvo, M.G.; Fernández-Martín, A.; Recio, G. Discrimination between Smiling Faces: Human Observers vs. Automated Face Analysis. Acta Psychol. 2018, 187, 19–29. [Google Scholar] [CrossRef]
  43. Joye, Y.; Bolderdijk, J.W. An Exploratory Study into the Effects of Extraordinary Nature on Emotions, Mood, and Prosociality. Front. Psychol. 2015, 5, 1577. [Google Scholar] [CrossRef]
  44. Daniel, T.C.; Boster, R.S. Measuring Landscape Esthetics: The Scenic Beauty Estimation Method; Res. Pap. RM-RP-167; U.S. Department of Agriculture, Forest Service, Rocky Mountain Range and Experiment Station: Fort Collins, CO, USA, 1976.
  45. Otamendi, F.J.; Sutil Martín, D.L. The Emotional Effectiveness of Advertisement. Front. Psychol. 2020, 11, 2088. [Google Scholar] [CrossRef] [PubMed]
Figure 1. Examples of stimuli used in the experiment: (a) urban image, (b) forests with foliage, and (c) forests without foliage.
Table 1. Mean scores for individual emotional categories with exposure to urban images, forest images with foliage, and forest images without foliage (the scale ranged from 0 to 100).
Emotion     Urban Scenes        Forest with Foliage    Forest without Foliage
            Mean      SD        Mean      SD           Mean      SD
Anger       0.210     0.819     0.106     0.346        0.160     0.571
Contempt    0.249     0.204     0.257     0.324        0.326     0.444
Disgust     0.483     0.152     0.522     0.413        0.474     0.160
Fear        0.219     0.743     0.126     0.442        0.126     0.525
Joy         0.143     0.789     0.098     0.675        0.143     0.768
Sadness     0.288     1.076     0.264     0.905        0.265     0.781
Surprise    0.468     1.760     0.370     1.116        0.330     0.914
Table 2. Results from one-way repeated measures ANOVAs for individual emotional categories.
Emotion     df        F        p
Anger       2, 128    1.112    0.332
Contempt    2, 128    1.558    0.214
Disgust     2, 128    0.728    0.485
Fear        2, 128    2.148    0.121
Joy         2, 128    0.050    0.608
Sadness     2, 128    0.123    0.884
Surprise    2, 128    1.640    0.200
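For readers who want to see how the F ratios in Table 2 arise, a one-way repeated-measures ANOVA partitions the total sum of squares into condition, subject, and error components and forms F = MS_conditions / MS_error. The sketch below is illustrative only: the `rm_anova_f` helper and its toy data are ours, not the study's actual analysis pipeline. The degrees-of-freedom pattern in Table 2 (2, 128) corresponds to k = 3 image conditions and n = 65 participants, since df = (k − 1) and (k − 1)(n − 1).

```python
def rm_anova_f(data):
    """One-way repeated-measures ANOVA F statistic.

    data: list of per-subject lists, one score per condition
          (each inner list has the same length).
    Returns (F, df_conditions, df_error).
    """
    n = len(data)      # number of subjects
    k = len(data[0])   # number of conditions
    grand = sum(sum(row) for row in data) / (n * k)

    cond_means = [sum(row[j] for row in data) / n for j in range(k)]
    subj_means = [sum(row) / k for row in data]

    # Partition the total sum of squares.
    ss_cond = n * sum((m - grand) ** 2 for m in cond_means)
    ss_subj = k * sum((m - grand) ** 2 for m in subj_means)
    ss_total = sum((x - grand) ** 2 for row in data for x in row)
    ss_error = ss_total - ss_cond - ss_subj

    df_cond = k - 1
    df_error = (k - 1) * (n - 1)
    f = (ss_cond / df_cond) / (ss_error / df_error)
    return f, df_cond, df_error
```

With 65 subjects and 3 conditions this yields df = (2, 128), matching Table 2; a dedicated routine such as `AnovaRM` in statsmodels would report the same F and add the p value.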
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Franěk, M.; Petružálek, J. Viewing Natural vs. Urban Images and Emotional Facial Expressions: An Exploratory Study. Int. J. Environ. Res. Public Health 2021, 18, 7651. https://doi.org/10.3390/ijerph18147651

Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
