Entry

Eye-Tracking Applications in Architecture and Design

by Alexandros A. Lavdas 1,2,3
1 Eurac Research, Institute for Biomedicine, Affiliated Institute of the University of Lübeck, Via Galvani 31, 39100 Bolzano, Italy
2 The Human Architecture & Planning Institute, Inc., 43 Bradford St., Concord, MA 01742, USA
3 Department of Psychology, Webster University, Athens Campus, 9 Ipitou Street, 10557 Athens, Greece
Encyclopedia 2024, 4(3), 1312-1323; https://doi.org/10.3390/encyclopedia4030086
Submission received: 14 June 2024 / Revised: 2 September 2024 / Accepted: 11 September 2024 / Published: 13 September 2024
(This article belongs to the Section Behavioral Sciences)

Definition

Eye-tracking is a biometric technique that has started to find applications in research on our interaction with the built environment. Depending on the focus of a given study, valence and arousal measurements can also be collected to acquire emotional, cognitive, and behavioral insights and correlate them with eye-tracking data. These measurements can give architects and designers a basis for data-driven decision-making throughout the design process. In instances involving existing structures, biometric data can also be utilized for post-occupancy analysis. This entry will discuss eye-tracking and eye-tracking simulation in the context of our current understanding of the importance of our interaction with the built environment for both physical and mental well-being.

1. History of Eye Tracking

The first documented systematic interest in eye movements can be found in Aristotle [1], with the first experimental setup attributed to Ptolemy, who had devised a board for examining the range of binocular single and double vision [2]. This topic was later examined in detail by Alhazen [3] and first approached systematically in the modern era by Wells [4]. During the 19th century, interest in the systematic study of eye movements increased [5,6], and in 1879 the French ophthalmologist Louis Émile Javal noticed that readers’ eyes do not scan a text at a constant speed but instead make quick movements interrupted by short pauses, providing the first description of the movements now known as saccades [7]. In 1901, Dodge and Cline introduced a non-invasive eye-tracking technique using light reflected from the cornea, recording eye position onto a photographic plate [8]. However, it only recorded eye positions on the horizontal plane and required the participant’s head to be motionless.
A few years later, motion picture photography was first used to record eye movements in two dimensions [9], utilizing a small speck of white material that had been placed on the participants’ eyes. A more intrusive approach was that of Edmund Huey in 1908, utilizing an apparatus that could be used to track eye movement during reading. Subjects had to wear a type of contact lens that had a small opening in front of the pupil and was connected to a pointer that changed its position in response to the movements of the eye [10].
The first dedicated eye-tracking laboratory was founded in 1929, marking the recognition of eye tracking as a field of study. A number of additional advances in eye-tracking systems were made during the first half of the twentieth century by combining corneal reflection and motion picture techniques (see [11] for a review). In one of them, in the late 1940s, motion picture cameras were used to study the movements of pilots’ eyes as they used cockpit controls and instruments to land an airplane, in what was the earliest application of eye tracking to usability engineering, i.e., the systematic study of users interacting with products, aimed at improving product design [12]. In the early 1950s, the development of the electrooculogram (EOG) marked a new chapter in the field of eye tracking, providing a more accurate method for tracking eye movements than earlier techniques. The EOG measures the corneoretinal standing potential, with the cornea positive relative to the back of the eye. It uses skin electrodes attached near the lateral and medial canthus, allowing the potential to be measured when participants move their eyes a set distance in the horizontal plane [13]. In the 1950s and 1960s, Alfred Lukyanovich Yarbus pioneered the study of saccades when viewing complex images by recording the eye movements of observers presented with natural objects and scenes [14]. An important finding was that gaze fixations can be influenced by the instructions given to an observer, demonstrating that low-level, stimulus-driven guidance of attention can be overridden by high-level factors [15].
The pupil center corneal reflection technique (PCCR) was developed in the 1970s and became a standard method for tracking eye movements because of its accuracy and ease of application [6]. It involves specialized glasses that shine near-infrared light at the eye, creating reflections that can be detected in both the pupil and the cornea. An infrared camera incorporated in the glasses tracks the vector between the corneal reflection and the pupil center, determining gaze direction. This technology allows researchers to observe the participants’ natural gaze and attention in various environments. Other, less frequently used methods have also been developed [5].
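The geometric core of PCCR can be illustrated with a short sketch: the gaze point is estimated from the two-dimensional vector between the pupil center and the corneal glint, mapped to screen coordinates through a calibration step. The function names and the affine calibration model below are illustrative assumptions, not any specific vendor’s implementation.

```python
# Minimal sketch of the PCCR idea: gaze is estimated from the 2-D vector
# between the pupil centre and the corneal glint, mapped to screen
# coordinates via a least-squares calibration fit (illustrative model).
import numpy as np

def pccr_vector(pupil_xy, glint_xy):
    """Difference vector (pixels) between pupil centre and corneal glint."""
    return np.asarray(pupil_xy, float) - np.asarray(glint_xy, float)

def fit_calibration(vectors, screen_points):
    """Least-squares affine map from PCCR vectors to screen coordinates."""
    V = np.hstack([vectors, np.ones((len(vectors), 1))])  # add bias column
    coeffs, *_ = np.linalg.lstsq(V, screen_points, rcond=None)
    return coeffs  # shape (3, 2)

def gaze_point(vector, coeffs):
    """Map one PCCR vector to an (x, y) screen position."""
    return np.append(vector, 1.0) @ coeffs

# toy calibration: four known screen targets and the vectors observed there
vecs = np.array([[0, 0], [10, 0], [0, 8], [10, 8]], float)
targets = np.array([[0, 0], [1920, 0], [0, 1080], [1920, 1080]], float)
C = fit_calibration(vecs, targets)
print(gaze_point([5, 4], C))  # maps to the screen centre, [960. 540.]
```

Real systems add glint disambiguation, head-pose compensation, and nonlinear calibration terms; the affine fit is the simplest useful version of the mapping.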
The first eye-tracking study of images, including images of architecture, was performed as early as 1935 [16], with the first study dedicated to architecture appearing in the 1970s, followed by a handful of studies in the following years (see [17]). The use of eye tracking as a tool for investigating our interactions with the built environment has grown very rapidly in recent years, and the following paragraphs discuss some of the relevant studies, especially in the context of evidence-based design for promoting psychosomatic health.

2. Eye Tracking and AI-Simulated Eye-Tracking: Applications and Findings in Architecture and Design

2.1. What Determines First Fixations

Initial gaze fixations are guided by pre-attentive processes in the brain. Our visual system has developed mechanisms to prioritize pertinent/salient information for determining suitable actions and regulating their execution. This selection process initiates as soon as an image reaches the retina, with low-level visual feature computation commencing at this stage and persisting in the lateral geniculate nucleus of the thalamus and early visual cortical areas. There are neurons in the early stages of perception that are specialized to detect specific fundamental visual attributes, such as differences in brightness and contrasting colors. As processing continues downstream through the thalamus and into the visual cortex, characteristics like orientation, direction, and speed of movement are detected (as reviewed by [18]). These features of the visual scene are computed in parallel, creating an early “saliency map” [19]. This first, pre-attentive processing, lasting about 200–250 milliseconds, creates information that guides the early deployment of selective attention and, potentially, action. Approximately 10% of the retinal output follows an alternative pathway directed towards the phylogenetically older system involving the superior colliculus as well as the pulvinar nucleus of the thalamus. This alternative route is responsible for the early response to motion detected in the peripheral visual field [20], as well as fear responses triggered directly from the pulvinar to the amygdala, such as when we perceive threats from potential predators [21]. It is interesting to note that primates seem to have a dedicated circuit for snake recognition in the pulvinar [22], a heritage of the early struggle of humans and their ancestors with serpents.
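The “saliency map” idea described above can be illustrated with a toy center-surround computation: local contrast is approximated as the difference between a fine and a coarse blur of image intensity. This is a deliberate simplification of full models such as Itti and Koch’s [18,19], which combine many feature channels (color, orientation, motion) across scales.

```python
# Toy centre-surround "saliency": |fine blur - coarse blur| of intensity.
# A bright square on a dark field produces high values at its borders,
# mimicking how local contrast attracts early attention (illustrative only).
import numpy as np

def box_blur(img, k):
    """Mean filter with a (2k+1)x(2k+1) window, edge-padded."""
    H, W = img.shape
    p = np.pad(img, k, mode="edge")
    out = np.zeros((H, W))
    for dy in range(2 * k + 1):
        for dx in range(2 * k + 1):
            out += p[dy : dy + H, dx : dx + W]
    return out / (2 * k + 1) ** 2

def saliency(img, k_center=1, k_surround=3):
    """Centre-surround contrast map."""
    return np.abs(box_blur(img, k_center) - box_blur(img, k_surround))

img = np.zeros((32, 32))
img[12:20, 12:20] = 1.0          # a bright square on a dark field
sal = saliency(img)
print(sal[12, 12] > sal[2, 2])   # True: contrast peaks at the square's border
```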

2.2. Importance

As our survival depends on our ability to quickly and accurately interpret environmental cues, any difficulties encountered in this early processing can trigger feelings of danger and stress. In other words, environments that pose challenges in their visual processing can induce anxiety, as they lack a feeling of place or “anchoring”.
The cohesion of visual elements thus plays a pivotal role in effective information processing, but whether a stimulus elicits a response of fear or attraction depends on the nature of that stimulus. For example, a building with a coherent design captures visual attention, but so does a predator, whose form also possesses geometric coherence. The form has to be quickly processed and assessed, and delays in this initial information processing can also delay the second evaluation stage, potentially causing stress and a sense of unease. In other words, a lack of coherence in the environment may prompt anxiety because it complicates the assessment of whether the surroundings are benign or threatening.
Through his pioneering work, Christopher Alexander [23] identified several parameters that create a connection between viewers and their surroundings. A number of studies have since demonstrated that exposure to certain fractal visual patterns, both in nature and in architecture, can have measurable physiological effects [24,25]. Of particular interest is the finding of the activation of the default mode network of the brain during the perception of fractals [26,27], which, together with findings on the role of fractals in stress reduction [24,25], data related to specific electroencephalogram responses to fractal patterns [28], and our own data from eye-tracking and eye-tracking simulation studies [29,30,31], supports the notion of their privileged status in terms of perceptual fluency, a key determinant of initial information processing speed.
While the benefits of exposure to natural scenes have been documented extensively, similar effects can result from artificial environments that mimic the geometrical qualities of nature [32,33]. Importantly, these qualities are not limited to fractal properties, as the pioneering work of Nikos Salingaros has shown [34,35]. Our processing system is finely tuned to the geometry of the natural environment, responding positively to specific types of organized complexity. The presence of fractals is an important aspect of this phenomenon, but it does not describe the entire range of “connecting” qualities [36]. The experience of beauty, often overlooked in modern architecture, results from a multiparametric evaluation of a visual scene, and our current understanding of these findings has started creating a new appreciation of the affective qualities of our built environment [37]. Studies have associated stress with environments that lack a specific level of organized complexity [38], an effect that has parallels with the results of sensory input deprivation [39]. There are also visual patterns that can cause discomfort directly, through mechanistic processes [40].
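Since fractal geometry figures prominently in this discussion, the standard box-counting estimate of fractal dimension D, obtained from the slope of log N(s) against log(1/s), where N(s) is the number of occupied boxes of side s, can be sketched as follows. The Sierpinski-carpet test image is an illustrative choice; its theoretical dimension is log 8 / log 3 ≈ 1.893.

```python
# Box-counting estimate of fractal dimension D: count boxes N(s) of side s
# that contain part of a binary pattern, then fit log N(s) ~ -D log s.
import numpy as np

def box_count(img, s):
    """Number of s-by-s boxes containing at least one 'on' pixel."""
    H, W = img.shape
    count = 0
    for y in range(0, H, s):
        for x in range(0, W, s):
            if img[y : y + s, x : x + s].any():
                count += 1
    return count

def fractal_dimension(img, sizes=(1, 3, 9, 27)):
    counts = [box_count(img, s) for s in sizes]
    # slope of log N versus log(1/s) estimates D
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

def sierpinski_carpet(level):
    """Build a 3^level x 3^level carpet by recursive tiling."""
    img = np.ones((1, 1), dtype=bool)
    for _ in range(level):
        z = np.zeros_like(img)
        img = np.block([[img, img, img], [img, z, img], [img, img, img]])
    return img

D = fractal_dimension(sierpinski_carpet(4))  # 81x81 test image
print(round(D, 3))  # close to log(8)/log(3) ≈ 1.893
```

For natural or architectural photographs, the same procedure is applied to an edge or threshold map of the image, and box sizes are chosen to span the available scales.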
Eye tracking alone does not provide direct information about anything beyond visual attraction, but measurements of valence and arousal can also be collected to assess emotional, cognitive, and behavioral information [41], as discussed below.

2.3. Relevance for Architecture and Design

In recent years, numerous studies and experiments within the architecture, engineering, and construction sectors have used eye tracking for design evaluation [31,42,43,44,45]. Various companies (for example, [46,47,48,49]) offer research-grade glasses for collecting real-world eye-tracking data, including gaze, fixation, and saccade information. To understand participant behavior in natural settings, it is essential to move beyond the lab and bring research tools directly to those settings, and portable solutions are now available. Webcam-based eye-tracking at home, with eye movements recorded in front of a calibrated monitor, is also offered as a good compromise.
In addition to actual eye-tracking using volunteers, there are now commercial artificial intelligence (AI) applications that can predict initial viewer responses to images (for example, Visual Attention Software (VAS) from 3M [50], Eyequant [51], Attention Insight [52], Neurons [53], and Expoze (version 1.01) [54]). These applications are trained on extensive eye-tracking experimental data. They generate maps displaying the likelihood of fixation points and estimating the temporal sequence of these fixations, effectively unveiling the subconscious processing of visual stimuli with remarkable accuracy. Initially designed for applications such as product design, advertising, and signage, this software is now also being used to assess architectural and environmental design [29,30,55]. This technology is highly suitable for performing direct comparative evaluations of different structures, facilitating the assessment of both quantitative and semi-quantitative parameters.
Various researchers, including the author of this article, have carried out eye-tracking investigations using images of architecture and constructed environments. These studies have used both volunteers in actual eye-tracking experiments [31,42,56,57,58] and 3M’s VAS [29,30,50,59,60,61]. The analysis of gaze and gaze sequence can be performed either on whole images or on pre-defined areas of interest. The capacity to predict a user’s engagement with a building’s design, particularly as experienced through its facade, is important both as a study subject and because of potential practical applications.
One notable insight coming from such studies is that initial fixation points, guided by pre-attentive processing, tend to center around the presence of humans, particularly their facial features, even within depictions of architectural or urban settings. Moreover, the gaze naturally shifts towards specific elements such as contrasts, details, and structural components that play a role in establishing a comprehensive sense of geometrical coherence [29,30,31,59,60]. Certain structures instantly engage the viewers’ interest because of their cohesive design, while others actively discourage attention, diverting viewers’ gazes away from the building’s exterior. These observations indicate a strong connection between the visual processes at play and the mathematical coherence or structured complexity inherent in the design. Some designs produce a fragmented gaze heatmap that seems to disintegrate upon zooming into the image, while others possess a coherence detectable in the gaze heatmap both initially and at subsequent zoom levels, with heatmap coverage that scales across iterative zoom levels as more detail is “discovered” at each of them [29,30,59] (Figure 1).
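A gaze heatmap of the kind discussed here is typically rendered by depositing a Gaussian “splat” at each fixation, weighted by fixation duration, then normalizing the accumulated map for display. The fixation coordinates, durations, and sigma in the sketch below are made-up illustrative values, not any specific software’s defaults.

```python
# Sketch of building a gaze heatmap from fixation data: each fixation
# adds a duration-weighted Gaussian, and the map is normalised to [0, 1].
import numpy as np

def fixation_heatmap(fixations, shape, sigma=15.0):
    """fixations: iterable of (x, y, duration_ms); shape: (H, W)."""
    H, W = shape
    yy, xx = np.mgrid[0:H, 0:W]
    heat = np.zeros(shape)
    for x, y, dur in fixations:
        heat += dur * np.exp(-((xx - x) ** 2 + (yy - y) ** 2) / (2 * sigma**2))
    return heat / heat.max()  # normalise so the hottest point equals 1.0

# hypothetical fixation data: two clustered fixations and one outlier
fixes = [(40, 30, 420), (45, 34, 260), (120, 90, 180)]
hm = fixation_heatmap(fixes, (120, 160))
print(np.unravel_index(hm.argmax(), hm.shape))  # (row, col) near the cluster
```

Area-of-interest statistics (dwell time, fixation count) are then computed by summing such maps, or the raw fixations, within pre-defined regions.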
Similar findings have come from a recent eye-tracking study with volunteers [31]. In one of the categories examined in this study, modern vs. traditional, a clear bias for traditional architecture over modern buildings was found. The geometric organization of traditional facades commanded the first fixation in most image pairs. The data also suggested that complexity informs the brain of stimuli worth examining more closely, hence producing significantly longer dwell times. As images in this category came from the Harris Poll of 2020 [62], where the conscious preference of Americans for traditional buildings was recorded, these results demonstrate the convergence of questionnaire and biometric data, acquired independently by different researchers in different population samples (Figure 2; this is discussed further in the next section). The same study also examined other morphological features that are thought to be inherently attractive to the gaze, such as face-like geometry. We know that infants are immediately attracted to faces and that, as early as a week after birth, they tend to look longer at faces that adults consider attractive [63], a phenomenon that has been shown to generalize across race, sex, and age by the 6th month [64]. Eye-tracking results from this study showed an attraction even to the most rudimentary face-like pattern (Figure 3).
These findings carry significant implications for our perception of architecture. Contemporary buildings, particularly those featuring plain glass exteriors, often receive fleeting glances without distinct fixation points on the building itself. This stands in stark contrast to traditional structures, where the presence of organized complexity incorporating nested hierarchies appears to direct attention to the entirety of the construction. The role of pre-attentive processing becomes evident in attracting individuals to certain structures while overlooking others. For example, individuals facing old buildings can swiftly identify an entrance. To foster inviting spaces, buildings should integrate elements such as fractal scaling, organized complexity, and repeating symmetries. These features guide viewers in the right direction, fostering a sense of safety, especially in outdoor environments. Eye-tracking can be a practical guide in future building design, identifying all these issues at an early design stage.

2.4. Eye-Tracking and Stimulus Valence

The intensity and, even more so, the valence of the emotions that will follow when the unconscious gaze attraction is followed by conscious perception of a scene cannot be assessed by eye-tracking alone. Combining eye-tracking with data from questionnaires, as in [31], is one way to fill this gap. Another is the combination of eye-tracking with other biometric techniques. One such approach is the use of electrodermal activity, also called galvanic skin response (GSR) [65]. GSR measures the skin’s conductivity and, through it, can detect changes in sympathetic nervous system activity that reflect emotional states [66]. Stimuli that evoke a sense of danger, fear, happiness, or other emotional states can elevate sweat gland activity, which can be measured by GSR. Hence, GSR can provide a measure of emotional arousal, although obviously it cannot identify the valence of the emotion being experienced but only its impact on the periphery [67]. For a more complete understanding of the individual’s reaction to the environment, combinations with other techniques can be employed. For example, one study used eye tracking together with facial electromyography (fEMG) to collect data on participants’ visual attention and facial expressions in response to virtual reality environments (VEs) [68].
fEMG is a psychophysiological method used to detect electrical potentials generated by facial muscles [69]. These potentials are in the microvolt range and relate linearly to both the strength of muscle contraction and the number of contracted muscles. This method allows for a better understanding not only of muscle movements and activity themselves but, importantly in the present context, also of their association with the specific emotions and behaviors with which facial expressions correlate [70,71]. For example, parameters like the level of luminance in rooms, the presence or absence of natural lighting, wall color, and openness of spaces, as well as the presence or absence of outside landmarks and of a visible entrance, were shown to change the way people perceive a space, as reflected in the measures studied [68]. GSR sensor data showed that skin conductance levels were higher in a negative environment, reflecting increased stress compared to the positive environment. Moreover, heart rate variability indicated a greater emotional response in the negative space compared to baseline values and those obtained in the positive space. fEMG software can often be integrated with other data analysis tools and software platforms, including electroencephalography and eye tracking. fEMG devices can be wearable, compact, and lightweight, so they can be used outside a laboratory setting, and have already been implemented, in combination with other techniques, in neuroarchitectural investigations [72,73].
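As an illustration of how arousal is commonly quantified from a GSR/EDA trace, one simple approach subtracts a slow tonic baseline and counts phasic skin-conductance responses (SCRs) exceeding an amplitude threshold. The moving-average detrending, the threshold value, and the synthetic signal below are illustrative assumptions, not a standard analysis pipeline.

```python
# Illustrative SCR counting on a synthetic EDA signal: remove a slow
# "tonic" baseline (moving average), then count phasic peaks above a
# small amplitude threshold (here 0.05 microsiemens, an assumed value).
import numpy as np

def count_scrs(signal, fs, window_s=4.0, threshold=0.05):
    """Count local phasic peaks above `threshold` after detrending."""
    w = int(window_s * fs)
    kernel = np.ones(w) / w
    tonic = np.convolve(signal, kernel, mode="same")  # slow baseline
    phasic = signal - tonic
    peaks = 0
    for i in range(1, len(phasic) - 1):
        if phasic[i] > threshold and phasic[i] >= phasic[i - 1] and phasic[i] > phasic[i + 1]:
            peaks += 1
    return peaks

fs = 10  # Hz
t = np.arange(0, 60, 1 / fs)
eda = 2.0 + 0.002 * t                 # drifting tonic level (synthetic)
for onset in (10, 30, 48):            # three simulated SCRs
    eda += 0.3 * np.exp(-((t - onset) ** 2) / 2.0) * (t > onset - 3)
print(count_scrs(eda, fs))  # detects the 3 simulated responses
```

Dedicated EDA toolboxes use more careful decomposition of tonic and phasic components, but the arousal logic is the same: more and larger SCRs indicate stronger sympathetic activation.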
In the 1970s, Ekman and Friesen [74,75] developed a technique for quantifying facial expressions, the facial action coding system (FACS). This is a comprehensive system for distinguishing all visible, anatomically based facial movements, forming a basis for correlating muscle actions during the expression of basic emotions with those emotions. FACS dissects all observable facial movements into 44 distinct muscle action units [76] and can be used both for real-time observations and for video-recorded interactions conducted in laboratory conditions, in conjunction with automated facial coding (AFC) [77] software, which can categorize facial movements into emotional or cognitive states [78]. AFC enables the analysis of facial expressions in various contexts, where individuals are observed in their natural environment without the constraints of technical equipment, even using the participants’ computers at home, in the same way that eye-tracking at home is performed through calibrated webcams. Data for both eye-tracking and facial analysis can be collected at the same time and analyzed either separately or together, with cross-correlation. The company iMotions offers such a solution, with its facial expression analysis software module integrating the facial coding engines AFFDEX by Affectiva and Realeyes. Using a webcam, one can synchronize expressed facial emotions with stimuli directly in the iMotions software, simply import videos, and carry out the relevant analysis [48].
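The FACS-to-emotion step can be caricatured in a few lines: action-unit (AU) combinations detected on a face are matched against prototypical patterns, such as the commonly cited happiness prototype AU6 + AU12. The mapping below is a simplification for illustration; production AFC systems use learned classifiers over AU intensities rather than a fixed lookup table.

```python
# Toy FACS illustration: facial movement coded as action units (AUs),
# with AU combinations matched to prototypical emotions. The prototypes
# follow commonly cited EMFACS-style patterns, simplified for clarity.
PROTOTYPES = {
    "happiness": {6, 12},       # cheek raiser + lip corner puller
    "sadness": {1, 4, 15},      # inner brow raiser + brow lowerer + lip corner depressor
    "surprise": {1, 2, 5, 26},  # brow raisers + upper lid raiser + jaw drop
    "anger": {4, 5, 7, 23},     # brow lowerer + lid tightener + lip tightener
}

def classify(active_aus, min_overlap=1.0):
    """Return the best-matching prototype whose AUs are sufficiently present."""
    best, best_score = None, 0.0
    for emotion, aus in PROTOTYPES.items():
        score = len(aus & active_aus) / len(aus)  # fraction of prototype AUs active
        if score >= min_overlap and score > best_score:
            best, best_score = emotion, score
    return best

print(classify({6, 12}))        # happiness
print(classify({1, 2, 5, 26}))  # surprise
```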
Tracking facial expressions can be a powerful indicator of emotional experiences and complement eye-tracking, providing insights into valence and, as part of a synthesis of multiple data streams, contributing to a better understanding of our interactions with the environment.

2.5. Using Virtual Reality Environments

VEs are useful for neuroscientists and psychologists as research tools, as well as for architects, designers, and stakeholders, in applications ranging from the early design phases to the real estate sector. VEs offer the advantage of facilitating experimental research in a laboratory under controlled conditions without the compromises involved in using two-dimensional images. VE solutions have already been used in many studies in architecture and interior and urban design, for example in manipulating ceiling height, colors, wall curvature, and surface textures [79,80,81,82,83,84] and adding virtual plants [85]. VE studies can also be combined with sensors monitoring correlates of emotional responses, such as heart rate and skin conductance [68,86,87,88]. However, the incompleteness inherent in virtual experiences presents a potential drawback [89,90]. For example, the lack of gravitational and acceleration-related sensations when navigating virtual spaces hinders multisensory integration. These factors make the VE experience palpably different from reality and need to be taken into account when designing experiments.

3. Conclusions

Eye-tracking, either on its own or in combination with other biometric measurements and questionnaire responses, can provide architects and designers with a basis for data-driven decision-making throughout the design process. In cases involving existing structures, biometric data can also be utilized for post-occupancy analysis.
The information obtained from these studies places the concept of biometrics in architecture and design in a more practical and human-centered perspective, perhaps surprisingly to the naïve observer. By systematically exploring the workings of our subjective experiences, we can establish more objective assessments of architectural forms. This stands in contrast to the ad hoc concepts that architects and designers have commonly employed over the past century.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

Although the author works with the Human Architecture & Planning Institute, Inc., the company is non-profit, and the author declares no conflicts of interest.

References

  1. Ross, W.D. The Works of Aristotle; Clarendon: Oxford, UK, 1927; Volume 7. [Google Scholar]
  2. Lejeune, A. L’Optique de Claude Ptolémée dans la Version Latine d’après l’Arabe de l´Émir Eugène de Sicile; Université de Louvain: Louvain, Belgium, 1956. [Google Scholar]
  3. Sabra, A.I. The Optics of Ibn Al-Haytham; The Warburg Institute: London, UK, 1989. [Google Scholar]
  4. Wells, W.C. An Essay upon Single Vision with Two Eyes: Together with Experiments and Observations on Several Other Subjects in Optics; Cadell: London, UK, 1792. [Google Scholar]
  5. Wade, N.J. Pioneers of eye movement research. Iperception 2010, 1, 33–68. [Google Scholar] [CrossRef] [PubMed]
  6. Jacob, R.J.K.; Karn, K.S. Eye tracking in Human-Computer Interaction and Usability Research: Ready to Deliver the Promises. In The Mind’s Eye: Cognitive and Applied Aspects of Eye Movement Research; Hyönä, J., Radach, R., Deubel, H., Eds.; Elsevier Science: Amsterdam, The Netherlands, 2003; Volume 2, pp. 573–605. [Google Scholar]
  7. Javal, E. Essai sur la physiologie de la lecture. Ann. d’Oculistique 1878, 80, 97–117. [Google Scholar]
  8. Dodge, R.; Cline, T.S. The angle velocity of eye movements. Psychol. Rev. 1901, 8, 145–157. [Google Scholar] [CrossRef]
  9. Judd, C.H.; McAllister, C.N.; Steel, W.M. General introduction to a series of studies of eye movements by means of kinetoscopic photographs. In Psychological Review, Monograph Supplements; Baldwin, J.M., Warren, H.C., Judd, C.H., Eds.; The Review Publishing Company: Baltimore, MD, USA, 1905; Volume 7, pp. 1–16. [Google Scholar]
  10. Huey, E.B. The Psychology and Pedagogy of Reading, with a Review of the History of Reading and Writing and of Methods, Texts, and Hygiene in Reading; Macmillan: New York, NY, USA, 1908; p. xvi+469. [Google Scholar]
  11. Mackworth, J.F.; Mackworth, N.H. Eye fixations recorded on changing visual scenes by the television eye-marker. J. Opt. Soc. Am. 1958, 48, 439–445. [Google Scholar] [CrossRef]
  12. Fitts, P.M.; Jones, R.E.; Milton, J.L. Eye movements of aircraft pilots during instrument-landing approaches. Aeronaut. Eng. Rev. 1950, 9, 56. [Google Scholar]
  13. Creel, D.J. The electrooculogram. In Handbook of Clinical Neurology; Levin, K.H., Chauvel, P., Eds.; Elsevier: Amsterdam, The Netherlands, 2019; Volume 160, pp. 495–499. [Google Scholar]
  14. Yarbus, A.L. Eye Movements and Vision; Haigh, B., Translator; Plenum Press: New York, NY, USA, 1967. [Google Scholar]
  15. Tatler, B.W.; Wade, N.J.; Kwan, H.; Findlay, J.M.; Velichkovsky, B.M. Yarbus, eye movements, and vision. Iperception 2010, 1, 7–27. [Google Scholar] [CrossRef]
  16. Buswell, G.T. How People Look at Pictures: A Study of the Psychology and Perception in Art; University Chicago Press: Chicago, IL, USA, 1935. [Google Scholar]
  17. Aalto, P.; Steinert, M. Emergence of eye-tracking in architectural research: A review of studies 1976–2021. Archit. Sci. Rev. 2024, 1, 1–11. [Google Scholar]
  18. Itti, L.; Koch, C. Computational modelling of visual attention. Nat. Rev. Neurosci. 2001, 2, 194–203. [Google Scholar] [CrossRef]
  19. Tollner, T.; Zehetleitner, M.; Gramann, K.; Muller, H.J. Stimulus saliency modulates pre-attentive processing speed in human visual cortex. PLoS ONE 2011, 6, e16276. [Google Scholar] [CrossRef]
  20. Banich, M.T.; Compton, R.J. Cognitive Neuroscience; Cambridge University Press: Cambridge, UK, 2018. [Google Scholar]
  21. McFadyen, J. Investigating the Subcortical Route to the Amygdala Across Species and in Disordered Fear Responses. J. Exp. Neurosci. 2019, 13, 1179069519846445. [Google Scholar] [CrossRef]
  22. Van Le, Q.; Isbell, L.A.; Matsumoto, J.; Nguyen, M.; Hori, E.; Maior, R.S.; Tomaz, C.; Tran, A.H.; Ono, T.; Nishijo, H. Pulvinar neurons reveal neurobiological evidence of past selection for rapid detection of snakes. Proc. Natl. Acad. Sci. USA 2013, 110, 19000–19005. [Google Scholar] [CrossRef] [PubMed]
  23. Alexander, C.; Ishikawa, S.; Silverstein, M.; Jacobson, M.; Fiksdahl King, I.; Angel, S. A Pattern Language; Oxford University Press: New York, NY, USA, 1977. [Google Scholar]
  24. Taylor, R.P. Reduction of Physiological Stress Using Fractal Art and Architecture. Leonardo 2006, 39, 245–251. [Google Scholar] [CrossRef]
  25. Taylor, R.P.; Spehar, B.; Van Donkelaar, P.; Hagerhall, C.M. Perceptual and Physiological Responses to Jackson Pollock’s Fractals. Front. Hum. Neurosci. 2011, 5, 60. [Google Scholar] [CrossRef]
  26. Fischmeister, F.P.; Martins, M.J.D.; Beisteiner, R.; Fitch, W.T. Self-similarity and recursion as default modes in human cognition. Cortex 2017, 97, 183–201. [Google Scholar] [CrossRef] [PubMed]
  27. Martins, M.J.; Fischmeister, F.P.; Puig-Waldmuller, E.; Oh, J.; Geissler, A.; Robinson, S.; Fitch, W.T.; Beisteiner, R. Fractal image perception provides novel insights into hierarchical cognition. Neuroimage 2014, 96, 300–308. [Google Scholar] [CrossRef]
  28. Hägerhäll, C.; Laike, T.; Taylor, R.; Küller, M.; Küller, R.; Martin, T. Investigations of Human EEG Response to Viewing Fractal Patterns. Perception 2008, 37, 1488–1494. [Google Scholar] [CrossRef]
  29. Lavdas, A.A.; Salingaros, N.A. Architectural Beauty: Developing a Measurable and Objective Scale. Challenges 2022, 13, 56. [Google Scholar] [CrossRef]
  30. Lavdas, A.A.; Salingaros, N.A.; Sussman, A. Visual Attention Software: A New Tool for Understanding the “Subliminal” Experience of the Built Environment. Appl. Sci. 2021, 11, 6197. [Google Scholar] [CrossRef]
  31. Rosas, H.J.; Sussman, A.; Sekely, A.C.; Lavdas, A.A. Using Eye Tracking to Reveal Responses to the Built Environment and Its Constituents. Appl. Sci. 2023, 13, 12071. [Google Scholar] [CrossRef]
  32. Frumkin, H. Beyond toxicity: Human health and the natural environment. Am. J. Prev. Med. 2001, 20, 234–240. [Google Scholar] [CrossRef]
  33. Joye, Y. Fractal Architecture Could Be Good for You. Nexus Netw. J. 2007, 9, 311–320. [Google Scholar] [CrossRef]
  34. Salingaros, N.A. The laws of architecture from a physicist’s perspective. Phys. Essays 1995, 8, 638–643. [Google Scholar] [CrossRef]
  35. Salingaros, N.A.; Mehaffy, M.W. A Theory of Architecture; Umbau-Verlag: Solingen, Germany, 2006. [Google Scholar]
  36. Salingaros, N.A. Unified Architectural Theory: Form, Language, Complexity: A Companion to Christopher Alexander’s “The phenomenon of Life: The nature of Order, Book 1”; Sustasis Foundation: Portland, OR, USA, 2013. [Google Scholar]
  37. Zeki, S. Beauty in Architecture: Not a Luxury-Only a Necessity. Archit. Des. 2019, 89, 14–19. [Google Scholar] [CrossRef]
  38. Ellard, C. Places of the Heart: The Psychogeography of Everyday Life; Perseus Books; LLC Ed.: New York City, NY, USA, 2015. [Google Scholar]
  39. Merrifield, C.; Danckert, J. Characterizing the psychophysiological signature of boredom. Exp. Brain. Res. 2014, 232, 481–491. [Google Scholar] [CrossRef]
  40. Penacchio, O.; Otazu, X.; Wilkins, A.J.; Haigh, S.M. A mechanistic account of visual discomfort. Front. Neurosci. 2023, 17, 1200661. [Google Scholar] [CrossRef]
  41. Mauss, I.B.; Robinson, M.D. Measures of emotion: A review. Cogn. Emot. 2009, 23, 209–237. [Google Scholar] [CrossRef]
  42. Zou, Z.; Ergan, S. Where do we look? An eye-tracking study of architectural features in building design. In Advances in Informatics and Computing in Civil and Construction Engineering; Mutis, I., Hartmann, T., Eds.; Springer: Cham, Switzerland, 2018. [Google Scholar]
  43. Afrooz, A.; White, D.; Neuman, M. Which visual cues are important in way-finding? Measuring the influence of travel mode on visual memory for built environments. Assist. Technol. Res. Ser. 2014, 35, 394–403. [Google Scholar] [CrossRef]
  44. Hollander, J.; Purdy, A.; Wiley, A.; Foster, V.; Jacob, R.; Taylor, H.; Brunyé, T. Seeing the city: Using eye-tracking technology to explore cognitive responses to the built environment. J. Urban. Int. Res. Placemaking Urban Sustain. 2018, 12, 156–171. [Google Scholar] [CrossRef]
  45. Lu, Z.; Pesarakli, H. Seeing Is Believing: Using Eye-Tracking Devices in Environmental Research. HERD Health Environ. Res. Des. J. 2023, 16, 15–52. [Google Scholar] [CrossRef]
  46. Tobii. Available online: https://www.tobii.com/products/eye-trackers (accessed on 14 June 2024).
  47. PupilLabs. Available online: https://pupil-labs.com/products/neon (accessed on 14 June 2024).
  48. iMotions. Available online: https://imotions.com/ (accessed on 21 October 2023).
  49. Biopac. Available online: https://www.biopac.com/product/eye-tracking-etv (accessed on 14 June 2024).
  50. 3M. Visual Attention Software. Available online: https://www.3m.com/3M/en_US/visual-attention-software-us/ (accessed on 16 May 2021).
  51. Eyequant. Available online: https://www.eyequant.com (accessed on 1 December 2023).
  52. Attention Insight. Available online: https://attentioninsight.com/ (accessed on 1 December 2023).
  53. Neurons. Available online: www.neuronsinc.com (accessed on 1 December 2023).
  54. Expoze. Available online: https://www.expoze.io (accessed on 1 December 2023).
  55. Sussman, A.; Ward, J. Game-Changing Eye-Tracking Studies Reveal How We Actually See Architecture. Available online: https://commonedge.org/game-changing-eye-tracking-studies-reveal-how-we-actually-see-architecture/ (accessed on 8 November 2023).
  56. Sussman, A.; Ward, J. Eye-tracking Boston City Hall to better understand human perception and the architectural experience. New Des. Ideas 2019, 3, 53–59. [Google Scholar]
  57. Lisińska-Kuśnierz, M.; Krupa, M. Suitability of Eye Tracking in Assessing the Visual Perception of Architecture-A Case Study Concerning Selected Projects Located in Cologne. Buildings 2020, 10, 20. [Google Scholar] [CrossRef]
  58. Suárez, L. Subjective Experience and Visual Attention to a Historic Building. Front. Archit. Res. 2020, 9, 774–804. [Google Scholar] [CrossRef]
  59. Salingaros, N.A.; Sussman, A. Biometric Pilot-Studies Reveal the Arrangement and Shape of Windows on a Traditional Façade to be Implicitly “Engaging”, Whereas Contemporary Façades Are Not. Urban Sci. 2020, 4, 26. [Google Scholar] [CrossRef]
  60. Hollander, J.; Sussman, A.; Lowitt, P.; Angus, N.; Situ, M. Eye-tracking emulation software: A promising urban design tool. Archit. Sci. Rev. 2021, 64, 383–393. [Google Scholar] [CrossRef]
  61. Brielmann, A.A.; Buras, N.H.; Salingaros, N.A.; Taylor, R.P. What Happens in Your Brain When You Walk Down the Street? Implications of Architectural Proportions, Biophilia, and Fractal Geometry for Urban Science. Urban Sci. 2021, 6, 3. [Google Scholar] [CrossRef]
  62. NCAS. Americans’ Preferred Architecture for Federal Buildings. Available online: https://www.civicart.org/americans-preferred-architecture-for-federal-buildings (accessed on 1 December 2023).
  63. Slater, A.; Schulenburg, C.V.D.; Brown, E.; Badenoch, M. Newborn infants prefer attractive faces. Infant Behav. Dev. 1998, 21, 345–354. [Google Scholar] [CrossRef]
  64. Langlois, J.H.; Ritter, J.M.; Roggman, L.A.; Vaughn, L.S. Facial diversity and infant preferences for attractive faces. Dev. Psychol. 1991, 27, 79–84. [Google Scholar]
  65. Benedek, M.; Kaernbach, C. A continuous measure of phasic electrodermal activity. J. Neurosci. Methods 2010, 190, 80–91. [Google Scholar] [CrossRef]
  66. Shi, Y.; Ruiz, N.; Taib, R.; Choi, E.; Chen, F. Galvanic Skin Response (GSR) as an Index of Cognitive Load. In Proceedings of the Conference on Human Factors in Computing Systems, CHI 2007, San Jose, CA, USA, 28 April–3 May 2007; pp. 2651–2656. [Google Scholar]
  67. Bakker, J.; Pechenizkiy, M.; Sidorova, N. What’s Your Current Stress Level? Detection of Stress Patterns from GSR Sensor Data. In Proceedings of the 2011 IEEE 11th International Conference on Data Mining Workshops, Vancouver, BC, Canada, 11 December 2011; pp. 573–580. [Google Scholar]
  68. Ergan, S.; Radwan, A.; Zou, Z.; Tseng, H.-A.; Han, X. Quantifying Human Experience in Architectural Spaces with Integrated Virtual Reality and Body Sensor Networks. J. Comput. Civ. Eng. 2019, 33, 04018062. [Google Scholar] [CrossRef]
  69. Read, G.L. Facial Electromyography (EMG). In The International Encyclopedia of Communication Research Methods; John Wiley & Sons, Inc.: Hoboken, NJ, USA, 2015; pp. 1–10. [Google Scholar]
  70. Dimberg, U. Facial electromyography and emotional reactions. Psychophysiology 1990, 27, 481–494. [Google Scholar] [CrossRef]
  71. van Boxtel, A. Facial EMG as a Tool for Inferring Affective States. In Proceedings of the Measuring Behavior 2010, Eindhoven, The Netherlands, 24–27 August 2010. [Google Scholar]
  72. Chang, C.-Y.; Chen, P.-K. Human Response to Window Views and Indoor Plants in the Workplace. HortScience Publ. Am. Soc. Hortic. Sci. 2005, 40, 1354–1359. [Google Scholar] [CrossRef]
  73. Balakrishnan, B.; Sundar, S.S. Capturing Affect in Architectural Visualization—A Case for integrating 3-dimensional visualization and psychophysiology. In Proceedings of the Communicating Space(s) 24th eCAADe Conference Proceedings, Volos, Greece, 6–9 September 2006; pp. 664–669, ISBN 0-9541183-5-9. [Google Scholar]
  74. Ekman, P.; Friesen, W.V. Measuring facial movement. Environ. Psychol. Nonverbal Behav. 1976, 1, 56–75. [Google Scholar] [CrossRef]
  75. Ekman, P.; Friesen, W.V. Facial Action Coding System: A Technique for the Measurement of Facial Movement; Consulting Psychologists Press: Palo Alto, CA, USA, 1978. [Google Scholar]
  76. Ekman, P.; Rosenberg, E. (Eds.) What the Face Reveals: Basic and Applied Studies of Spontaneous Expression Using the Facial Action Coding System (FACS); Oxford University Press: New York, NY, USA, 2005. [Google Scholar] [CrossRef]
  77. Valstar, M.; Mehu, M.; Jiang, B.; Pantic, M.; Scherer, K. Meta-Analysis of the First Facial Expression Recognition Challenge. In IEEE Transactions on Systems, Man, and Cybernetics. Part B, Cybernetics: A Publication of the IEEE Systems, Man, and Cybernetics Society; IEEE: Piscataway, NJ, USA, 2012; Volume 42. [Google Scholar] [CrossRef]
  78. Lewinski, P.; Uyl, T.; Butler, C. Automated Facial Coding: Validation of Basic Emotions and FACS AUs in FaceReader. J. Neurosci. Psychol. Econ. 2014, 7, 227–236. [Google Scholar] [CrossRef]
  79. Elver Boz, T.; Demirkan, H.; Urgen, B. Visual perception of the built environment in virtual reality: A systematic characterization of human aesthetic experience in spaces with curved boundaries. Psychol. Aesthet. Creat. Arts 2022. Advance online publication. [Google Scholar] [CrossRef]
  80. Coburn, A.; Vartanian, O.; Kenett, Y.; Nadal, M.; Hartung, F.; Hayn-Leichsenring, G.; Navarrete, G.; González-Mora, J.; Chatterjee, A. Psychological and neural responses to architectural interiors. Cortex 2020, 126, 217–241. [Google Scholar]
  81. Lee, K.; Park, C.-H.; Kim, J.H. Examination of User Emotions and Task Performance in Indoor Space Design Using Mixed-Reality. Buildings 2023, 13, 1483. [Google Scholar] [CrossRef]
  82. Tawil, N.; Ascone, L.; Kühn, S. The contour effect: Differences in the aesthetic preference and stress response to photo-realistic living environments. Front. Psychol. 2022, 13, 933344. [Google Scholar] [CrossRef]
  83. Zou, Z.; Yu, X.; Ergan, S. Integrating Biometric Sensors, VR, and Machine Learning to Classify EEG Signals in Alternative Architecture Designs. In Computing in Civil Engineering 2019; ASCE: Reston, VA, USA, 2019. [Google Scholar]
  84. Banaei, M.; Ahmadi, A.; Gramann, K.; Hatami, J. Emotional evaluation of architectural interior forms based on personality differences using virtual reality. Front. Archit. Res. 2019, 9, 138–147. [Google Scholar] [CrossRef]
  85. Mostajeran, F.; Steinicke, F.; Reinhart, S.; Stuerzlinger, W.; Riecke, B.; Kühn, S. Adding virtual plants leads to higher cognitive performance and psychological well-being in virtual reality. Sci. Rep. 2023, 13, 8053. [Google Scholar] [CrossRef]
  86. Fich, L.B.J.; Jönsson, P.; Kirkegaard, P.H.; Wallergård, M.; Garde, A.H.; Hansen, Å. Can architectural design alter the physiological reaction to psychosocial stress? A virtual TSST experiment. Physiol. Behav. 2014, 135, 91–97. [Google Scholar] [CrossRef]
  87. Vecchiato, G.; Tieri, G.; Jelic, A.; De Matteis, F.; Maglione, A.; Babiloni, F. Electroencephalographic Correlates of Sensorimotor Integration and Embodiment during the Appreciation of Virtual Architectural Environments. Front. Psychol. 2015, 6, 1944. [Google Scholar] [CrossRef] [PubMed]
  88. Higuera Trujillo, J.L.; Llinares, C.; Montañana, A.; Rojas, J.-C. Multisensory stress reduction: A neuro-architecture study of paediatric waiting rooms. Build. Res. Inf. 2019, 48, 269–285. [Google Scholar] [CrossRef]
  89. Valentine, C. Health Implications of Virtual Architecture: An Interdisciplinary Exploration of the Transferability of Findings from Neuroarchitecture. Int. J. Environ. Res. Public Health 2023, 20, 2735. [Google Scholar] [CrossRef] [PubMed]
  90. Kalantari, S.; Neo, J.R.J. Virtual Environments for Design Research: Lessons Learned From Use of Fully Immersive Virtual Reality in Interior Design Research. J. Inter. Des. 2020, 45, 27–42. [Google Scholar] [CrossRef]
Figure 1. Panel showing the royal balcony with the marble throne, an 18th-century building belonging to the Golestan Palace complex in Tehran, Iran. It stands near the border of the palace complex, and a tall modernist-style office building across the street is visible behind it. (A) Original wide-angle photo. (B) Eye-tracking simulation heatmap of (A). (C,E) Zoomed-in views. (D,F) Heatmaps for (C,E). Warmer colors represent higher gaze probability, with red being the highest. Heatmap coverage scales consistently through successive zoom levels for the traditional façade, while it falls apart for the modern one as we move closer in (from [30]).
Figure 2. (A) The U.S. courthouse in Toledo, Ohio (left) and a modern counterpart, the Hansen Federal Building in Ogden, Utah (right). (B) Heatmaps of both buildings. Warmer colors represent higher gaze probability, with red being the highest. (C) Outlined areas of interest (AOIs). The time to first fixation (TTFF) was shorter for the traditional building, and the dwell time was longer. The eye-tracking data for this and other pairs agree with the preference data from the 2020 Harris Poll, which used the same images (from [31]).
Figure 3. (A) A spoon with a rudimentary face-like cut-out (left) and a processed version without it (right). (B) Heatmaps of both versions, as above. (C) AOIs. The TTFF was shorter for the “face” spoon, and the dwell time was longer. These data demonstrate gaze attraction to face-like features even in their most basic form (from [31]).
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
