Article

Beyond Visuals and Audio: What Is the Effect of Olfactory Stimulus in Immersive Virtual Reality Fire Safety Training?

Faculty of Artificial Intelligence in Education, Central China Normal University, Wuhan 430079, China
*
Author to whom correspondence should be addressed.
Educ. Sci. 2025, 15(10), 1386; https://doi.org/10.3390/educsci15101386
Submission received: 22 August 2025 / Revised: 2 October 2025 / Accepted: 3 October 2025 / Published: 17 October 2025

Abstract

Immersive virtual reality (IVR) has demonstrated significant potential in educational contexts. Nonetheless, prior IVR implementations have primarily focused on visual and auditory simulations and neglected olfaction, which has limited the immersiveness of the learning experience. To address this gap, we conducted an experimental study involving 64 students to examine the impact of integrating olfactory stimulus into IVR systems for fire safety training. Participants were randomly assigned to the control group (without olfactory stimulus, n = 32) or the experimental group (with olfactory stimulus, n = 32). The results indicated that the integration of olfactory stimulus significantly promoted high-arousal positive emotions, increased the sense of presence, and reduced cognitive load, although it did not significantly improve learning performance. Thematic analysis further revealed that the incorporation of olfactory stimulus provided learners with an immersive learning experience. Moreover, the IVR system with olfactory stimulus yielded a high quality of experience. These findings have significant implications for the practice of learning in IVR and for multisensory learning theory.

1. Introduction

With the rapid proliferation of head-mounted displays and decreasing usage costs, immersive virtual reality (IVR) is finding increasingly widespread applications in education. In fire safety training in particular, IVR can create a safe and controlled learning environment, support interaction between learners and the learning environment, and allow for repeated practice, thereby enhancing the effectiveness of fire safety training (Leder et al., 2019). IVR has already made substantial progress in visual and auditory simulation. Several scholars have argued that integrating senses beyond audiovisual elements is the future direction for virtual reality (VR) in education (Andonova et al., 2023; Edwards et al., 2019). Such integration may further bridge the gap between controlled educational environments and real-world scenarios, making IVR-based education more relevant and effective across various contexts. For example, in fire safety training, olfactory information may play a significant role in assessing fire scenes, responding to stress, and forming memories within virtual environments. However, previous research has primarily focused on the use of audiovisual elements in IVR and has not fully utilized other sensory channels (Guttentag, 2010; Lopes & Falk, 2024; Melo et al., 2020). Therefore, it is necessary to explore the incorporation of other senses to enhance IVR learning.
Olfaction, as a sense closely linked to the brain regions responsible for memory and emotional processing, may influence learning performance and emotions in IVR environments (Li et al., 2024). It can also simulate highly realistic environments to enhance sense of presence, which is defined as “the psychological sense of being in the virtual environment” (Slater & Wilbur, 1995). While some studies have explored the impact of olfaction on learning, only a few have focused on IVR learning environments. Regarding learning performance, olfactory stimulus can serve as an additional sensory cue to aid learners in memory and recall, but empirical research conclusions on this topic remain inconsistent. For example, Garcia-Ruiz et al. (2008) found that adding a mint scent during web-based learning significantly improved learners’ performance in information recall tasks. Conversely, some studies suggest that olfactory stimulus has no significant impact on learning performance (Covaci et al., 2018) or even produces negative effects (Ghinea & Ademoye, 2009). Regarding emotion and sense of presence, existing empirical research has also yielded mixed conclusions. Some studies have shown that olfactory cues can enhance sense of presence (Archer et al., 2022) and provoke stronger emotions (Ranasinghe et al., 2019). However, Serrano et al. (2016) found no significant improvement in emotion induction and sense of presence with olfactory stimuli. Furthermore, Quality of Experience (QoE) is a critical determinant of success for interactive learning systems. The current research mainly focuses on the following dimensions of QoE: relevance, distraction, consistency, annoyance, realism, experience, and liking (Kani-Zabihi et al., 2021; Mesfin et al., 2020).
In particular, the impact of olfactory stimuli on cognitive load remains controversial and lacks empirical evidence. According to cognitive load theory, learners have limited cognitive resources during the learning process. When learners need to process information from multiple sensory channels simultaneously, cognitive load increases (Magana et al., 2019). IVR systems deliver rich scene information, and incorporating olfactory stimuli could lead to cognitive overload. However, some researchers have presented a different viewpoint: olfactory stimuli can help learners embed their cognitive processes in the virtual environment and thereby reduce cognitive load (Araiza-Alba et al., 2021; Pouw et al., 2014). Therefore, empirical research is urgently needed to explore the actual impact of olfactory stimuli on cognitive load.
To address the research gaps, this study employed a randomized experiment to investigate the effects of olfactory stimulus on learners’ learning in IVR-based fire safety training. We posed the following research questions:
  • Does the IVR-based fire safety training system developed in this study enhance learners’ learning performance?
  • What impact does the olfactory stimulus in IVR-based learning have on learners’ learning performance, emotion, sense of presence, cognitive load, and learning experience?
  • What is the QoE for IVR systems incorporating olfactory stimulus?

2. Literature Review

2.1. From Audiovisual IVR to Multisensory IVR

Existing research mainly focuses on the use and design of audiovisual elements in IVR (Guttentag, 2010; Lopes & Falk, 2024; Melo et al., 2020), leaving richer multisensory experiences underexplored and the educational potential of IVR not fully exploited. IVR is a media technology that simulates the real world by providing multiple sensory stimuli and can be accessed through devices such as head-mounted displays. It has been widely and effectively applied in learning and training, such as fire safety training (Çakiroğlu & Gökoğlu, 2019; Lovreglio et al., 2021). With improvements in display resolution and rendering technology, the visual quality of virtual environments has been significantly enhanced. Audio implementation has evolved from traditional monophonic systems to stereophonic formats, while spatial audio technology further enhances auditory realism through authentic simulation of three-dimensional soundscapes (Russell et al., 2022; Warp et al., 2022). Existing research suggests that enhancing audiovisual fidelity and designing appropriate audiovisual cues can facilitate learning in IVR environments (Jiang et al., 2024; Han et al., 2023; Yu et al., 2023). For instance, Jiang et al. (2024) explored the impact of visual and auditory fidelity on hazard recognition training in virtual environments, revealing that high-fidelity audiovisual conditions better promote learning outcomes. Yu et al. (2023) demonstrated that visual cues on strings and musical scores in IVR-based musical instrument learning reduced learners’ cognitive load and improved their playing skills. Nevertheless, current IVR applications do not consistently yield satisfactory results. Several studies have identified negative or non-significant impacts (Buttussi & Chittaro, 2018; Makransky et al., 2019), which may be attributable to IVR’s inability to fully replicate real-world learning contexts. Other studies indicate that rich but irrelevant information in IVR environments may increase extraneous cognitive load, thereby impeding learning (Frederiksen et al., 2020; Makransky et al., 2019). Moreover, a meta-analysis reported relatively small effect sizes for IVR-based learning (Coban et al., 2022), underscoring the necessity of developing novel approaches to optimize it.
Some researchers believe that the future trend of IVR will be a shift from audiovisual to multi-sensory experiences (Andonova et al., 2023; Flavián et al., 2021). Embodied cognition theory supports this perspective. It holds that the human body explores the environment and forms cognition through motor behaviors and sensory input (Gibson, 1966). The synergistic operation of multisensory modalities facilitates the establishment of rich and easily accessible mental representations, thereby promoting knowledge comprehension, retention, and transfer to real-world contexts (Hutmacher & Kuhbandner, 2018; Novak & Schwan, 2021). Constructivist learning theory further emphasizes that abundant sensory information provides essential materials and contextual cues for knowledge construction, enabling learners to build an understanding of new knowledge by deriving meaning from these sensory experiences (Piaget, 1971). The cognitive affective model of immersive learning (CAMIL) proposed by Makransky and Petersen (2021), grounded in empirical studies, highlights that enriched sensory information in IVR environments enhances the learning process by strengthening presence, ultimately facilitating knowledge acquisition (Krassmann et al., 2019). Notably, haptic and olfactory channels, as primary knowledge acquisition channels in human cognition, may circumvent the limitations of finite working memory capacity through their unconscious and effortless cognitive processing characteristics (Paas & Sweller, 2012). Furthermore, the authentic and interactive experience afforded by additional sensory information could potentially release cognitive resources by embedding the learner’s cognitive activities within virtual environments, thereby reducing cognitive load (Araiza-Alba et al., 2021). Therefore, our study investigates the effects of integrating olfactory stimulus in IVR environments, aiming to optimize learning outcomes in immersive educational settings.

2.2. Effect of Olfactory Stimuli on Learning

In the evolution of human sensory systems, olfaction has developed a unique feedback regulation mechanism. It not only helps individuals perceive their surrounding environment and avoid potential dangers through odor information, but also influences behaviors and decisions in daily life (Stevenson, 2010). Unlike visual and auditory signals, which are relayed through the thalamus, olfactory signals originate in the olfactory bulb and project directly to the limbic system (e.g., the amygdala and hippocampus) without thalamic relay (Herz & Engen, 1996). This distinctive neural pathway provides a special physiological basis for olfactory stimuli, enabling them to act directly on brain regions related to emotions and memories (Rolls, 2004; Royet et al., 2003). Therefore, olfactory stimuli play a unique role in regulating emotions and participating in memory construction (Herz, 2016; Toffolo et al., 2012).
However, existing research has not reached a consensus on whether olfactory stimulus can enhance learning performance, and the results are mixed (Covaci et al., 2018; Degel & Köster, 1999; Garcia-Ruiz et al., 2008; Ghinea & Ademoye, 2009; Knötzele et al., 2023; Sorokowska et al., 2022). For instance, Knötzele et al. (2023) found that exposure to rose aroma during learning, sleep, and recall phases enhanced memory effects for Japanese–German word pairs (unrelated to roses) among participants. Sorokowska et al. (2022) further demonstrated that in a task requiring free verbal recall of a story, the presence of contextually congruent odors during the encoding and retrieval phases enhanced performance on delayed memory tests. Conversely, Ghinea and Ademoye (2009) presented learners with six odors that matched the content of six video segments (e.g., a burnt smell for a bush fire video) and found that these olfactory stimuli had an adverse effect on students’ recall of the video information. Degel and Köster (1999) demonstrated that even when participants were unaware of the odor’s presence, a faint jasmine scent in the testing environment impaired performance on letter counting, mathematical, and creativity tests. Regarding emotion, sense of presence, and QoE, relevant conclusions are also inconsistent (Archer et al., 2022; Baus & Bouchard, 2017; Ranasinghe et al., 2019; Sabiniewicz et al., 2021; Serrano et al., 2016; Shaw et al., 2019). For instance, Sabiniewicz et al. (2021) found that when static virtual scenes (such as a rose garden or an orange basket) were paired with matching olfactory stimuli, this visual–olfactory congruence enhanced the pleasantness of the scenes and the richness of descriptions, but did not influence the memorization of the visual virtual environment. On the other hand, Shaw et al. (2019) found that in a virtual fire evacuation scenario, a scene-matching burning wood scent enhanced the sense of reality of the scenario by strengthening participants’ recognition that the “building was actually on fire”, yet produced no significant change in participants’ subjective ratings of emotional responses such as “stress” and “anxiety”.
As one of the important sensory channels for enhancing IVR experiences, olfaction has attracted extensive attention due to its potential applications. Nevertheless, empirical studies that integrate olfactory stimuli into IVR to improve learning outcomes remain relatively scarce, and the conclusions of relevant studies are inconsistent. Some studies have demonstrated that olfactory stimuli can significantly enhance learning effectiveness. For instance, incorporating olfactory stimuli consistent with learning content in IVR environments has been shown to increase learners’ knowledge retention (Andonova et al., 2023; de Bruijn & Bender, 2018), reduce negative emotions, and boost self-efficacy (Kaimal et al., 2020). However, other studies have failed to identify positive effects of olfactory stimuli on learning, and in some cases, even observed negative impacts. Baus et al. (2022) discovered that introducing irrelevant odors in IVR environments diminished users’ memory accuracy in spatial tasks. Narciso et al. (2019) reported that adding olfactory stimuli (such as the smell of burning wood) during VR training exerted no significant influence on learners’ sense of presence, motion sickness, fatigue, stress, or knowledge transfer. These conflicting findings indicate that the impact of olfactory stimulus on cognitive processing and emotional experiences within IVR learning environments remains unclear, necessitating further empirical investigation.

3. Materials and Methods

3.1. Participants

This study randomly recruited 64 college learners from a university in central China, aged between 18 and 26 years, with an average age of 22.64 years. Among them, there were 9 males and 55 females. Before the experiment began, they were randomly assigned to either the experimental group (with olfactory stimulus, n = 32) or the control group (without olfactory stimulus, n = 32). Both groups wore VR headsets (Oculus Rift) and used controllers for interaction. During the interaction with the VR scene, the experimental group received olfactory stimulus, while the control group did not. All learners had normal or corrected-to-normal hearing and vision, and passed an olfactory disorder test, indicating no health issues related to smell, such as a cold or anosmia (physiological inability to perceive any olfactory stimulus). Additionally, they had not previously participated in any olfactory-related experiments, voluntarily participated in this experiment, and signed written informed consent forms. They were informed that they could withdraw from the experiment at any time. After the experiment, they also received a certain amount of compensation. The research protocol was approved by the Institutional Review Board (IRB-202211018, approved on 8 December 2022).

3.2. VR Material and Olfactory Stimulus Design

The learning material used in the experiment is a VR fire safety training system developed with Unity 3D. As shown in Figure 1, the system consists of knowledge explanation (common types of fires and classification of fire extinguishers), fire extinguisher usage simulation, and an escape experience (such as walking through safe escape routes, dialing emergency numbers, covering the mouth and nose with a wet towel, using a fire extinguisher, testing door temperature, sounding the alarm, and avoiding the use of elevators). Learners entered the VR scene from a first-person perspective using the Oculus Rift device. They could move and interact within the scene through small body movements. For long-distance movement, they could use the controller to click on arrows within the VR scene, which teleported them instantly to the arrow’s location. The entire VR experience lasted 15–20 min to avoid discomfort such as dizziness, nausea, and vomiting that may result from prolonged VR exposure.
The olfactory stimulus was designed to correspond with the VR fire escape training module, specifically the smell of smoke. The smoke scent was provided by a lit mosquito coil with a mild, fragrance-free formula; short-term exposure from a distance poses no health risks. The smoke scent was presented from the moment smoke appeared in the scenario until participants covered their nose and mouth with a wet towel.
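The timing rule can be summarized as a simple on/off toggle tied to scenario events. The sketch below is purely illustrative of that protocol (in the study, the scent was delivered by a manually lit mosquito coil rather than an automated dispenser); all names are hypothetical.

```python
# Illustrative sketch of the scent timing protocol described above.
# Hypothetical event handler: scent on when smoke appears, scent off once
# the participant covers nose and mouth with the wet towel.
from dataclasses import dataclass

@dataclass
class ScentController:
    scent_on: bool = False

    def on_event(self, event: str) -> None:
        if event == "smoke_appears":
            self.scent_on = True      # start presenting the smoke scent
        elif event == "wet_towel_applied":
            self.scent_on = False     # stop presenting the smoke scent

controller = ScentController()
for event in ["scenario_start", "smoke_appears", "wet_towel_applied"]:
    controller.on_event(event)
    print(event, "-> scent on?", controller.scent_on)
```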

3.3. Experimental Process

The experiment comprised a preliminary experiment and a formal experiment. Before the formal experiment, eight participants were recruited for the preliminary experiment (four for each experimental condition) to ensure that the olfactory stimulus could be perceived by participants. Based on the results of the preliminary experiment, we determined the time required for learners to complete the scene learning, tests, and questionnaires.
The formal experiment was conducted in a well-ventilated laboratory; the experimental environment is shown in Figure 2. Before the experiment, each participant completed the basic information survey and prior knowledge test (5 min). Then, each participant was assigned a unique user ID to allow for anonymous data collection. The experiment assistant introduced the entire experimental procedure and precautions to the participants and provided training on operating the VR equipment (5 min). Participants in the olfactory stimulus group also underwent an olfactory impairment test (to determine whether they could smell hand cream) (3 min). Next, participants wore the Oculus Rift head-mounted display and experienced the VR scene according to their assigned condition (15–20 min). The olfactory stimulus group received the smoke scent stimulus while experiencing the VR scene, while the non-olfactory stimulus group only experienced the VR scene. After completing the fire escape experience, participants completed an experience questionnaire and a post-test on fire safety knowledge on a computer provided by the researchers (30 min). After each session, fans were turned on and windows were opened for ventilation to ensure that the olfactory stimulus had dissipated before the next participant arrived.

3.4. Measuring Tools

The collected data included learning performance, emotions, sense of presence, cognitive load, learning experience, and QoE. Learning performance was measured through knowledge tests. The knowledge test was designed based on fire safety test questions from existing research combined with the knowledge points of this study, and included a prior knowledge test and an immediate post-test after the experiment. The test was reviewed by five professors specializing in educational technology. The test consisted of seven single-choice questions (one point each), five multiple-choice questions (two points each), nine true/false questions (one point each), and one short-answer question (two points), totaling twenty-eight points. The prior knowledge test and the immediate post-test had the same question types and similar difficulty levels, with the presentation order and format of similar questions randomly adjusted. The scoring of the short-answer question was completed independently by two researchers after they received training. Spearman correlation analysis indicated a high level of consistency between the two researchers’ scores (r = 0.930). Therefore, the final score for the short-answer question was the average of the two researchers’ scores.
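A minimal sketch of this scoring check, assuming hypothetical rater scores (the study data are not reproduced here): the Spearman correlation between the two raters’ short-answer scores is computed, and the final score is the average of the two ratings, as described above.

```python
# Sketch of the inter-rater consistency check and score averaging.
# The score arrays are hypothetical placeholders, not the study data.
import numpy as np
from scipy.stats import spearmanr

rater1 = np.array([2.0, 1.5, 2.0, 1.0, 0.5, 2.0, 1.5, 1.0])  # hypothetical scores
rater2 = np.array([2.0, 1.5, 1.5, 1.0, 1.0, 2.0, 1.5, 1.0])  # hypothetical scores

rho, p_value = spearmanr(rater1, rater2)   # inter-rater consistency
final_scores = (rater1 + rater2) / 2       # final score = mean of the two raters

print(f"Spearman r = {rho:.3f} (p = {p_value:.3f})")
print("Final short-answer scores:", final_scores)
```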
Emotions were measured using the Self-Assessment Manikin scale developed by Bradley and Lang (1994). It encompasses two dimensions: valence and arousal, which are commonly measured in research. Participants indicated their emotional valence by choosing from images representing happiness or unhappiness, and their arousal state by choosing from images representing excitement or calmness. A 5-point pictorial scale was used for measurement. Since the emotion measurement method was directly adopted from similar studies and each dimension had only one item, reliability and validity tests were not conducted.
Presence is “a state of consciousness, the psychological sense of being in the virtual environment” (Slater & Wilbur, 1995). It is determined by the display technology, the sorts of sensory information required to perform the task at hand, and the individual differences in preferences for information displayed in various modalities (Bystrom et al., 1999). In this study, presence was measured using the scale developed by Serrano et al. (2016). Participants were asked to choose the image that best represented their level of presence in the learning environment from five options. Since the method for measuring presence was directly adopted from similar studies and consisted of only one item, reliability and validity tests were not conducted.
Cognitive load can be categorized into intrinsic cognitive load (arising from the inherent complexity of the task itself), extraneous cognitive load (caused by poor instructional design or the manner in which information is presented), and germane cognitive load (the effective cognitive load utilized for learning) (Paas et al., 2003; Sweller et al., 1998). In this study, cognitive load was measured using a questionnaire based on Klepsch et al. (2017). It consists of eight items, with two items measuring intrinsic cognitive load, three items measuring extraneous cognitive load, and three items measuring germane cognitive load. For example, “This task is very complex for me,” “The design of this task is very inconvenient for my learning,” and “The learning task includes elements that support my understanding of the task.” A 5-point Likert scale was used, ranging from 1 to 5, with higher numbers indicating greater agreement with the described situation, where 1 means “strongly disagree” and 5 means “strongly agree.” The Cronbach’s alpha for the three dimensions was 0.81, 0.86, and 0.85, which indicates high reliability.
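As an illustration of the reliability figures reported above, the sketch below shows how Cronbach’s alpha could be computed for one subscale; the response matrix is a hypothetical placeholder (rows are participants, columns are items on the 1–5 Likert scale), not the study data.

```python
# Minimal sketch of a Cronbach's alpha computation for one subscale.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: 2-D array of shape (n_participants, n_items)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical responses to the three extraneous-load items.
extraneous_load = np.array([
    [2, 1, 2], [3, 3, 2], [1, 2, 1], [4, 4, 3], [2, 2, 2], [3, 2, 3],
])
print(f"Cronbach's alpha = {cronbach_alpha(extraneous_load):.2f}")
```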
In the post-intervention survey, the following open-ended item collected learners’ perceptions of the IVR system and its impact on their learning experience: “What are your thoughts on the learning experience during the ‘Virtual Reality Fire Safety Skills Training’? Please share any relevant comments.” We intentionally omitted mention of olfactory stimulus to avoid influencing responses, allowing learners to report their experiences without guidance or interference.
QoE refers to the user’s overall satisfaction and perception. In this study, we focused specifically on participants’ perception of the IVR environment with olfactory stimulus. Therefore, the questionnaire was administered only to the experimental group and not to the control group. The questionnaire of QoE is shown in Table 1. This questionnaire measures seven dimensions of QoE: relevance, distraction, consistency, annoyance, realism, experience, and liking. A 5-point Likert scale was used, with higher numbers indicating greater agreement with the described situation, where 1 means “strongly disagree” and 5 means “strongly agree.” The Cronbach’s alpha was 0.87, indicating high reliability of the scale.

3.5. Data Analysis

For learning performance, emotion, sense of presence, and cognitive load, we first conducted normality and homogeneity of variance tests, finding that the data did not follow a normal distribution and lacked homogeneity of variance. Therefore, we used the Wilcoxon signed-rank test to analyze the differences in learning performance before and after the IVR intervention within each group. The Mann–Whitney U test was used to analyze the differences between the experimental group and the control group in learning performance, emotion, sense of presence, and cognitive load. A p-value of less than 0.05 was considered statistically significant. Additionally, the r value (point-biserial correlation coefficient) was reported as the effect size, with r = 0.10 indicating a small effect, r = 0.24 a medium effect, and r = 0.37 a large effect (Cohen, 1988; Fritz et al., 2012). Data analysis was performed using IBM SPSS 26.0.
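The following sketch illustrates this analysis workflow in Python with SciPy rather than SPSS; all score arrays are hypothetical placeholders, and the effect size is computed as r = |Z|/√N following Fritz et al. (2012), ignoring tie corrections for simplicity.

```python
# Hedged sketch of the statistical workflow described above (not the authors' SPSS syntax):
# Wilcoxon signed-rank test for within-group pre/post gains, Mann-Whitney U test between
# groups, and r = |Z| / sqrt(N) as the effect size. All data below are hypothetical.
import numpy as np
from scipy.stats import wilcoxon, mannwhitneyu

pre  = np.array([12, 14, 10, 15, 13, 11, 16, 12])   # hypothetical pre-test scores
post = np.array([15, 18, 15, 21, 20, 19, 25, 22])   # hypothetical post-test scores
w_stat, p_within = wilcoxon(pre, post)               # within-group pre/post comparison

cg = np.array([19, 20, 18, 21, 20, 19, 22, 20])      # hypothetical control-group post-test
eg = np.array([20, 21, 19, 22, 21, 20, 23, 21])      # hypothetical experimental-group post-test
u_stat, p_between = mannwhitneyu(cg, eg, alternative="two-sided")

# Effect size via the normal approximation of the Mann-Whitney U statistic
# (tie correction omitted for brevity).
n1, n2 = len(cg), len(eg)
z = (u_stat - n1 * n2 / 2) / np.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
r = abs(z) / np.sqrt(n1 + n2)

print(f"Wilcoxon p = {p_within:.3f}")
print(f"Mann-Whitney U = {u_stat:.0f}, p = {p_between:.3f}, r = {r:.2f}")
```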
Regarding learning experience, thematic analysis was employed to process the data. Following anonymization and the assignment of new participant identifiers, two coders (the second and third authors) independently identified potential themes in the responses to the open-ended item. During the coding process, the coders were kept unaware of the participants’ group assignments. After initial independent coding, discrepancies were reconciled through consensus discussions to establish the final coding. The representative themes were integrated to compare the differences in learning experiences between the experimental group and the control group. In terms of QoE, we directly reported the scores for each dimension of the questionnaire.

4. Results

4.1. Preliminary Analyses

The Mann–Whitney U test revealed no significant differences between the two groups of learners in terms of age (p = 0.726), experience in playing games (p = 0.229), experience in using VR (p = 0.963), experience in evacuation drills (p = 0.384), experience in fire safety training (p = 0.132), and prior knowledge (p = 0.165). In addition, the chi-square test revealed no significant difference in gender between the two groups (p = 0.281), with 29 females and 3 males in the experimental group and 26 females and 6 males in the control group. This indicates that individual differences between the two groups of learners were unlikely to be the source of the between-group differences observed in the key measures.
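For illustration, the gender comparison can be reconstructed from the counts reported above with a chi-square test; note that applying a continuity correction would yield a p-value different from the one reported in the text, so the sketch below omits it.

```python
# Sketch of the gender balance check using the counts reported above
# (EG: 29 female / 3 male; CG: 26 female / 6 male).
from scipy.stats import chi2_contingency

contingency = [[29, 3],   # experimental group: females, males
               [26, 6]]   # control group: females, males
chi2, p, dof, expected = chi2_contingency(contingency, correction=False)
print(f"chi2 = {chi2:.3f}, df = {dof}, p = {p:.3f}")
```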

4.2. Learning Performance

First, pre-post comparisons via Wilcoxon signed-rank tests revealed significant knowledge gains in both experimental (Z = −4.945, p < 0.001) and control groups (Z = −4.947, p < 0.001) following immersive virtual fire safety training.
Subsequently, intergroup analysis using the Mann–Whitney U test indicated that while the experimental group demonstrated marginally higher mean post-test scores, no statistically significant difference emerged (U = 424, p = 0.234), as shown in Table 2.

4.3. Emotion

The experimental group demonstrated significantly enhanced emotional responses relative to controls, as shown in Table 2. Specifically, they exhibited higher emotional valence (p = 0.004, r = 0.356) and greater emotional arousal (p = 0.020, r = 0.291), both reaching a medium effect. This indicates more pronounced positive affect and emotional engagement during the intervention.

4.4. Sense of Presence

A robust between-group difference emerged in sense of presence (p = 0.002, r = 0.383), as shown in Table 2. Participants in the experimental condition reported substantially heightened presence levels compared to the control group, with the difference reaching a large effect size.

4.5. Cognitive Load

As shown in Table 2, the Mann–Whitney U test demonstrated that the cognitive load of the experimental group was significantly lower than that of the control group (p = 0.038, r = 0.259), reaching a medium effect size. This suggests that olfactory stimulus reduced cognitive load.

4.6. Learning Experience

The results of the thematic analysis yielded three prominent themes. The frequency with which these themes were mentioned by the experimental and control groups is presented in Table 3. Below, we elaborate on each theme, providing representative supporting comments. Each comment is accompanied by the participant’s group assignment and ID.
  • Theme 1: IVR delivers authentic immersive learning experience
Multiple learners reported that the IVR system effectively simulated real fire scenarios, providing an immersive learning experience. This perspective was shared by 26 learners in the experimental group and 11 in the control group; such comments were therefore more frequent in the experimental group. The following comments illustrate this theme:
“The fire escape experience was very realistic, making me feel as if I were actually there.”
(ID: CG_13)
“The VR training felt totally immersive, especially when the burnt smell hit you. That instantly took the realism to another level!”
(ID: EG_38)
“The odor simulation was exceptionally authentic and perfectly aligned with the atmosphere of a fire emergency scenario.”
(ID: EG_47)
  • Theme 2: IVR facilitates knowledge comprehension and retention
Learners across both groups indicated that IVR enhanced their understanding and retention of fire safety evacuation knowledge. These observations came from 11 learners in the experimental group and 10 in the control group, a similar number of comments across the two groups. The following comments substantiate this theme:
“This hands-on learning experience was more useful than mere theoretical study. My grasp of the relevant knowledge has also become much stronger.”
(ID: CG_10)
“The whole learning process really made me feel the atmosphere of a fire. It helped me learn fire escape skills better and remember potential fire hazards.”
(ID: EG_50)
  • Theme 3: IVR poses challenges for knowledge comprehension and retention
A minority of learners noted that IVR learning hindered knowledge retention, citing the extensive content volume. This view was expressed by 3 learners in the experimental group versus 8 in the control group; such comments were less frequent in the experimental group. The following comments demonstrate this theme:
“There is so much to learn about fire classification and how to use fire extinguishers. It is difficult to remember all of this.”
(ID: CG_9)
“The written information in the knowledge learning section does not highlight the key points, so it is difficult to understand.”
(ID: EG_63)

4.7. QoE

QoE for the IVR learning system with olfactory stimulus is shown in Figure 3. The mean values of all dimensions are above 4.0, except for the “distraction” dimension. The scores for relevance, consistency, realism, and experience exceeded 4.5.

5. Discussion

5.1. Does the IVR-Based Fire Safety Training System Developed in This Study Enhance Learners’ Learning Performance?

In terms of the effect of IVR in fire safety training, the research results indicated that IVR significantly improves learning performance, regardless of the presence of olfactory stimulus. This finding is consistent with previous research (Çakiroğlu & Gökoğlu, 2019; Lovreglio et al., 2021). For example, Lovreglio et al. (2021) found that using VR significantly improved participants’ knowledge acquisition in fire extinguisher operation training. Çakiroğlu and Gökoğlu (2019) developed an IVR-based fire safety skills training system and found that the fire safety behavioral skills of 10 primary school learners improved significantly, and most of the learners could transfer the behavioral skills to real environments. The reason may be that IVR creates a controllable virtual environment that allows learners to interact with it and understand the potential consequences of their actions during a fire. This helps learners better appreciate the importance and urgency of learning fire safety knowledge, enhances information processing, and increases learning engagement, thereby promoting knowledge acquisition (Hutmacher & Kuhbandner, 2018).

5.2. What Impact Does the Olfactory Stimulus in IVR-Based Learning Have on Learners’ Learning Performance, Emotion, Sense of Presence, Cognitive Load, and Learning Experience?

For olfactory stimulus in IVR, the research found that it did not have a significant impact on learning performance. Previous studies have also suggested that olfactory stimulus does not significantly affect learning performance (Ademoye & Ghinea, 2013; Covaci et al., 2018). One reason may be that the smoke smell was mainly used to create a realistic scenario and enhance the learning experience rather than to serve as a retrieval cue, and thus had little effect on knowledge acquisition. Additionally, the test was not directly related to the olfactory stimulus, so even if the olfactory stimulus did influence immediate knowledge acquisition, this was not reflected in the test scores. Future studies could include scenario simulations or practical exercises to further test the impact of olfactory stimulus.
Regarding emotion and sense of presence, the findings demonstrate that olfactory stimulus evoked positive and high-arousal emotions and strengthened participants’ sense of presence. These results align robustly with extant literature across diverse application domains. Regarding affective impact, Lu et al. (2023) observed similar emotional benefits in online education contexts, where scent cues elevated teacher–learner interactivity and fostered positive affective states. Similarly, our results on sense of presence converged with a study on VR game environments (Archer et al., 2022), which confirmed that olfactory integration significantly enhanced spatial presence. Crucially, the consistency of these effects across distinct educational settings (training, online learning, and gaming) underscores the broad validity of olfactory stimulus as a strategy for enhancing users’ sense of presence and emotional engagement.
In regard to cognitive load, we found that the inclusion of olfactory stimulus in IVR reduced cognitive load. This is contrary to previous research findings (Garcia-Ruiz et al., 2021). The possible reason is that adding olfactory stimulus in IVR enhances learners’ perception and understanding of events and interactions within the virtual environment. This evokes emotional resonance among learners, effectively embedding cognitive activities into the environment and reducing the cognitive load (Novak & Schwan, 2021). Moreover, the olfactory channel may serve as an additional sensory pathway, providing learners with extra encoding methods. This allows information to be processed in parallel through multiple channels, such as visual, auditory, and olfactory, thereby reducing the processing demand on any single channel and consequently reducing cognitive load (Webb et al., 2022).
Regarding learning experience, the thematic analysis indicates that olfactory-enhanced IVR primarily influences learning through its effects on immersion and knowledge processing. First, enhanced immersion was reported more frequently by the experimental group (Theme 1: EG = 26 vs. CG = 11), indicating that the olfactory stimulus markedly enhanced the immersion of the IVR system. This is consistent with existing research results (Baus & Bouchard, 2017). Second, the findings revealed contradictory patterns in knowledge processing: Theme 2 captured constructive perspectives on the olfactory stimulus’s enhancement of knowledge retention and comprehension, whereas Theme 3 reflected detrimental effects. The feedback belonging to Theme 2 was similar across both groups (Theme 2: EG = 11 vs. CG = 10), which aligns with the quantitative results showing no significant between-group difference in learning performance. The criticisms belonging to Theme 3 were more prevalent among control group participants, but their overall frequency remained limited (Theme 3: EG = 3 vs. CG = 8). It is worth noting that these negative assessments predominantly targeted the knowledge explanation part, where the text-dense presentation of declarative knowledge likely led to cognitive overload (Magana et al., 2019). In the future, knowledge should be represented multimodally in VR learning environments.

5.3. What Is the QoE for IVR Systems Incorporating Olfactory Stimulus?

The questionnaire showed that the overall QoE was at a high level, indicating that the IVR system in our study had a high overall satisfaction. This is consistent with the previous research results, which stated that olfactory integration can enhance QoE in immersive learning (Murray et al., 2016; Murray et al., 2017). Furthermore, variations in some dimensions of QoE, such as odor pleasantness (Baus & Bouchard, 2017), can lead to divergent research outcomes. It is therefore essential to clarify various dimensions of QoE for our system to delineate the boundaries of research conclusions.
The scores for relevance, consistency, and realism indicated that the olfactory stimulus in this study was highly compatible with the virtual scene and that the timing of its delivery was appropriate. This is because the type of olfactory stimulus in this study was similar to the smoke smell of a fire scene, namely the smell of burning objects, and we strictly controlled the olfactory stimulus to be provided only after the fire began. The scores for experience, annoyance, and liking were also high, which might be because the mosquito coil smell used was not irritating; the smell therefore supported the threat simulation without causing physiological aversion. Distraction was the only dimension that did not exceed 4.0. This indicates that although most participants stated that the olfactory stimulus did not hinder learning, some participants were distracted by it. This is consistent with the results of the thematic analysis of the learning experience in this study, suggesting a potential risk of olfactory stimulus hindering learning.

6. Conclusions

This study investigated the effects of incorporating olfactory stimuli in an IVR-based fire safety training program on learning performance, emotions, sense of presence, cognitive load, and learning experience. Prior research on IVR-based learning has primarily focused on audiovisual elements, overlooking the potential of olfaction. Our findings demonstrate that the IVR-based fire safety training effectively enhanced learning performance. Although adding olfactory stimuli to the IVR environment did not yield a further significant improvement in learning performance, it significantly elevated high-arousal positive emotions, enhanced the sense of presence, and reduced cognitive load. These effects collectively fostered a more engaging and less mentally demanding learning process. The high QoE reported by learners further affirms the practical viability of this multisensory approach.
These findings offer significant insights for educational practice and theory, highlighting the importance of olfactory applications in IVR-based learning. For practical implications, we recommend that educators selectively incorporate olfactory stimulus into instruction only when aligned with clear learning objectives. This is because although it can offer a more enriching experience, our data indicated that it did not improve learning outcomes substantially. Educators should leverage the IVR–olfactory synergy to create diverse, immersive learning experiences through multifaceted scenarios and activities. They can also influence emotional responses and cognitive processes by utilizing olfactory stimulus, thereby stimulating learners’ positive emotions and reducing cognitive load. In terms of theoretical implications, our study enriches the multimedia learning framework by incorporating the olfactory modality. This work extends Mayer’s dual-channel theory by broadening its scope, thereby propelling the development of multimedia learning theory and offering empirical evidence for multisensory learning theory.

Limitations and Future Research

This study has several limitations. First, due to sampling constraints, the number of male participants recruited was significantly lower than that of female participants. Kani-Zabihi et al. (2021) held that gender differences may influence olfactory sensitivity, odor preference, and emotional associations with smells. This gender imbalance in the sample limits the comprehensiveness of the findings. Second, the use of burning mosquito coils makes it difficult to control stimulus intensity throughout the experiment, as the concentration of the released odor cannot be maintained at a completely stable level. This may have introduced variations in exposure levels among participants within the experimental group. Third, the generalizability of the findings to other learning themes and scenarios may be limited. The olfactory stimulus used in this study, the smell of smoke, is an inherent and highly relevant component of fire safety training. In learning contexts where olfactory stimuli are not naturally embedded or do not carry meaningful informational value, the effects observed here may not directly apply.
Future research should address the issue of demographic representativeness by recruiting a gender-balanced sample of participants and employing a precisely controlled odor delivery system to ensure the consistency of stimulus intensity. Furthermore, to enhance the generalizability of the findings, future studies should investigate the effectiveness of olfactory stimulus in other learning contexts, particularly those where scents are not an inherent part of the environment or do not convey critical information.

Author Contributions

Conceptualization, W.L. and L.Q.; Data curation, L.Q.; Formal analysis, T.G., L.Q.; Funding acquisition, W.L.; Investigation, L.Q., T.G., and R.L.; Methodology, T.G.; Project administration, L.Q.; Resources, W.L.; Software, T.G.; Supervision, W.L.; Validation, L.Q., T.G.; Visualization, T.G.; Writing—original draft, T.G. and R.L.; Writing—review & editing, T.G. and L.Q. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the General Office for Education Sciences Planning of Hubei Province, grant number 2024GA072 and the Fundamental Research Funds for the Central Universities, grant number 30106250035.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and approved by the Institutional Review Board of Central China Normal University (protocol code IRB-202211018 approved on 8 December 2022).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study. Written informed consent has been obtained from the participants to publish this paper.

Data Availability Statement

Dataset available on request from the authors.

Acknowledgments

We are grateful to the members of the dissertation committee and the editors of this special issue for their invaluable feedback, which greatly benefited this work. Deepseek-V3.1 was used for language polishing to improve the clarity and readability of the introduction, literature review, and discussion sections of this manuscript. After using this tool, all authors reviewed the content to ensure accuracy and took full responsibility for the content of the publication.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
IVR: Immersive virtual reality
VR: Virtual reality
QoE: Quality of Experience

References

  1. Ademoye, O. A., & Ghinea, G. (2013). Information recall task impact in olfaction-enhanced multimedia. ACM Transactions on Multimedia Computing, Communications, and Applications, 9(3), 1–16. [Google Scholar] [CrossRef]
  2. Andonova, V., Reinoso-Carvalho, F., Jimenez Ramirez, M. A., & Carrasquilla, D. (2023). Does multisensory stimulation with virtual reality (VR) and smell improve learning? An educational experience in recall and creativity. Frontiers in Psychology, 14, 1176697. [Google Scholar] [CrossRef]
  3. Araiza-Alba, P., Keane, T., Chen, W. S., & Kaufman, J. (2021). Immersive virtual reality as a tool to learn problem-solving skills. Computers & Education, 164, 104121. [Google Scholar] [CrossRef]
  4. Archer, N. S., Bluff, A., Eddy, A., Nikhil, C. K., Hazell, N., Frank, D., & Johnston, A. (2022). Odour enhances the sense of presence in a virtual reality environment. PLoS ONE, 17(3), e0265039. [Google Scholar] [CrossRef] [PubMed]
  5. Baus, O., & Bouchard, S. (2017). Exposure to an unpleasant odour increases the sense of presence in virtual reality. Virtual Reality, 21(2), 59–74. [Google Scholar] [CrossRef]
  6. Baus, O., Bouchard, S., Nolet, K., & Berthiaume, M. (2022). In a dirty virtual room: Exposure to an unpleasant odor increases the senses of presence, reality, and realism. Cogent Psychology, 9(1), 2115690. [Google Scholar] [CrossRef]
  7. Bradley, M. M., & Lang, P. J. (1994). Measuring emotion: The self-assessment manikin and the semantic differential. Journal of Behavior Therapy and Experimental Psychiatry, 25(1), 49–59. [Google Scholar] [CrossRef]
  8. Buttussi, F., & Chittaro, L. (2018). Effects of different types of virtual reality display on presence and learning in a safety training scenario. IEEE Transactions on Visualization and Computer Graphics, 24(2), 1063–1076. [Google Scholar] [CrossRef]
  9. Bystrom, K. E., Barfield, W., & Hendrix, C. (1999). A conceptual model of the sense of presence in virtual environments. Presence: Teleoperators & Virtual Environments, 8(2), 241–244. [Google Scholar]
  10. Coban, M., Bolat, Y. I., & Goksu, I. (2022). The potential of immersive virtual reality to enhance learning: A meta-analysis. Educational Research Review, 36, 100452. [Google Scholar] [CrossRef]
  11. Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Erlbaum. [Google Scholar]
  12. Covaci, A., Ghinea, G., Lin, C. H., Huang, S. H., & Shih, J. L. (2018). Multisensory games-based learning-lessons learnt from olfactory enhancement of a digital board game. Multimedia Tools and Applications, 77(16), 21245–21263. [Google Scholar] [CrossRef]
  13. Çakiroğlu, Ü., & Gökoğlu, S. (2019). Development of fire safety behavioral skills via virtual reality. Computers & Education, 133, 56–68. [Google Scholar] [CrossRef]
  14. de Bruijn, M. J., & Bender, M. (2018). Olfactory cues are more effective than visual cues in experimentally triggering autobiographical memories. Memory, 26(4), 547–558. [Google Scholar] [CrossRef] [PubMed]
  15. Degel, J., & Köster, E. P. (1999). Odors: Implicit memory and performance effects. Chemical Senses, 24(3), 317–325. [Google Scholar] [CrossRef] [PubMed]
  16. Edwards, B. I., Bielawski, K. S., Prada, R., & Cheok, A. D. (2019). Haptic virtual reality and immersive learning for enhanced organic chemistry instruction. Virtual Reality, 23(4), 363–373. [Google Scholar] [CrossRef]
  17. Flavián, C., Ibáñez-Sánchez, S., & Orús, C. (2021). The influence of scent on virtual reality experiences: The role of aroma-content congruence. Journal of Business Research, 123, 289–301. [Google Scholar] [CrossRef]
  18. Frederiksen, J. G., Sørensen, S. M. D., Konge, L., Svendsen, M. B. S., Nobel-Jørgensen, M., Bjerrum, F., & Andersen, S. A. W. (2020). Cognitive load and performance in immersive virtual reality versus conventional virtual reality simulation training of laparoscopic surgery: A randomized trial. Surgical Endoscopy, 34(3), 1244–1252. [Google Scholar] [CrossRef]
  19. Fritz, C. O., Morris, P. E., & Richler, J. J. (2012). Effect size estimates: Current use, calculations, and interpretation. Journal of Experimental Psychology: General, 141(1), 2–18. [Google Scholar] [CrossRef]
  20. Garcia-Ruiz, M. A., El-Seoud, S. A., Edwards, A., Aljaam, J. M., & Aquino-Santos, R. (2008). Integrating the sense of smell in an educational human-computer interface. Interactive Computer Aided Learning. [Google Scholar]
  21. Garcia-Ruiz, M. A., Kapralos, B., & Rebolledo-Mendez, G. (2021). An overview of olfactory displays in education and training. Multimodal Technologies and Interaction, 5(10), 64. [Google Scholar] [CrossRef]
  22. Ghinea, G., & Ademoye, O. A. (2009, June 28–July 3). Olfaction-enhanced multimedia: Bad for information recall? 2009 IEEE International Conference on Multimedia and Expo (pp. 970–973), New York, NY, USA. [Google Scholar]
  23. Gibson, J. J. (1966). The senses considered as perceptual systems. Houghton Mifflin. [Google Scholar]
  24. Guttentag, D. A. (2010). Virtual reality: Applications and implications for tourism. Tourism Management, 31(5), 637–651. [Google Scholar] [CrossRef]
  25. Han, J., Liu, G., & Zheng, Q. (2023). Prior knowledge as a moderator between signaling and learning performance in immersive virtual reality laboratories. Frontiers in Psychology, 14, 1118174. [Google Scholar] [CrossRef]
  26. Herz, R. S. (2016). The role of odor-evoked memory in psychological and physiological health. Brain Sciences, 6(3), 22. [Google Scholar] [CrossRef]
  27. Herz, R. S., & Engen, T. (1996). Odor memory: Review and analysis. Psychonomic Bulletin & Review, 3(3), 300–313. [Google Scholar] [CrossRef]
  28. Hutmacher, F., & Kuhbandner, C. (2018). Long-term memory for haptically explored objects: Fidelity, durability, incidental encoding, and cross-modal transfer. Psychological Science, 29(12), 2031–2038. [Google Scholar] [CrossRef]
  29. Jiang, T., Fang, Y., Goh, J., & Hu, S. (2024). Impact of simulation fidelity on identifying swing-over hazards in virtual environments for novice crane operators. Automation in Construction, 165, 105580. [Google Scholar] [CrossRef]
  30. Kaimal, G., Carroll-Haskins, K., Ramakrishnan, A., Magsamen, S., Arslanbek, A., & Herres, J. (2020). Outcomes of visual self-expression in virtual reality on psychosocial well-being with the inclusion of a fragrance stimulus: A pilot mixed-methods study. Frontiers in Psychology, 11, 589461. [Google Scholar] [CrossRef]
  31. Kani-Zabihi, E., Hussain, N., Mesfin, G., Covaci, A., & Ghinea, G. (2021). On the influence of individual differences in cross-modal Mulsemedia QoE. Multimedia Tools and Applications, 80(2), 2377–2394. [Google Scholar] [CrossRef]
  32. Klepsch, M., Schmitz, F., & Seufert, T. (2017). Development and validation of two instruments measuring intrinsic, extraneous, and germane cognitive load. Frontiers in Psychology, 8, 1997. [Google Scholar] [CrossRef]
  33. Knötzele, J., Riemann, D., Frase, L., Feige, B., van Elst, L. T., & Kornmeier, J. (2023). Presenting rose odor during learning, sleep and retrieval helps to improve memory consolidation: A real-life study. Scientific Reports, 13(1), 2371. [Google Scholar] [CrossRef]
  34. Krassmann, A. L., Melo, M., Pinto, D., Peixoto, B., Bessa, M., & Bercht, M. (2019). What is the relationship between the sense of presence and learning in virtual reality? A 24-year systematic literature review. PRESENCE: Virtual and Augmented Reality, 28, 247–265. [Google Scholar]
  35. Leder, J., Horlitz, T., Puschmann, P., Wittstock, V., & Schütz, A. (2019). Comparing immersive virtual reality and powerpoint as methods for delivering safety training: Impacts on risk perception, learning, and decision making. Safety Science, 111, 271–286. [Google Scholar] [CrossRef]
  36. Li, W., Qian, L., Feng, Q., & Luo, H. (2024). Using olfactory cues in text materials benefits delayed retention and schemata construction. Scientific Reports, 14, 17819. [Google Scholar] [CrossRef] [PubMed]
  37. Lopes, M. K., & Falk, T. H. (2024). Audio-visual-olfactory immersive digital nature exposure for stress and anxiety reduction: A systematic review on systems, outcomes, and challenges. Frontiers in Virtual Reality, 5, 1252539. [Google Scholar] [CrossRef]
  38. Lovreglio, R., Duan, X., Rahouti, A., Phipps, R., & Nilsson, D. (2021). Comparing the effectiveness of fire extinguisher virtual reality and video training. Virtual Reality, 25(1), 133–145. [Google Scholar] [CrossRef]
  39. Lu, Q., Zhang, Y., Zhang, Y., Ma, S. E., Zhang, Y., Qin, Y., Gao, P., Zhang, Q., & Xu, Y. (2023, April 23–28). Atmospheror: Towards an olfactory interactive system for enhancing social presence and interaction in synchronous online classes. Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems (pp. 1–8), Hamburg, Germany. [Google Scholar]
  40. Magana, A. J., Serrano, M. I., & Rebello, N. S. (2019). A sequenced multimodal learning approach to support students’ development of conceptual learning. Journal of Computer Assisted Learning, 35(4), 516–528. [Google Scholar] [CrossRef]
  41. Makransky, G., & Petersen, G. B. (2021). The cognitive affective model of immersive learning (CAMIL): A theoretical research-based model of learning in immersive virtual reality. Educational Psychology Review, 33, 937–958. [Google Scholar] [CrossRef]
  42. Makransky, G., Terkildsen, T. S., & Mayer, R. E. (2019). Adding immersive virtual reality to a science lab simulation causes more presence but less learning. Learning and Instruction, 60, 225–236. [Google Scholar] [CrossRef]
  43. Melo, M., Gonçalves, G., Monteiro, P., Coelho, H., Vasconcelos-Raposo, J., & Bessa, M. (2020). Do multisensory stimuli benefit the virtual reality experience? A systematic review. IEEE Transactions on Visualization and Computer Graphics, 28(2), 1428–1442. [Google Scholar] [CrossRef]
  44. Mesfin, G., Hussain, N., Kani-Zabihi, E., Covaci, A., Saleme, E. B., & Ghinea, G. (2020). QoE of cross-modally mapped Mulsemedia: An assessment using eye gaze and heart rate. Multimedia Tools and Applications, 79(11), 7987–8009. [Google Scholar] [CrossRef]
  45. Murray, N., Lee, B., Qiao, Y., & Miro-Muntean, G. (2016, June 6–8). The influence of human factors on olfaction based mulsemedia quality of experience. 2016 Eighth International Conference on Quality of Multimedia Experience (pp. 1–6), Lisbon, Portugal. [Google Scholar]
  46. Murray, N., Lee, B., Qiao, Y., & Miro-Muntean, G. (2017). The impact of scent type on olfaction-enhanced multimedia quality of experience. IEEE Transactions on Systems, Man, and Cybernetics: Systems, 47(9), 2503–2515. [Google Scholar] [CrossRef]
  47. Narciso, D., Bessa, M., Melo, M., & Vasconcelos-Raposo, J. (2019, November 21–22). Virtual reality for training-the impact of smell on presence, cybersickness, fatigue, stress and knowledge transfer. 2019 International Conference on Graphics and Interaction (pp. 115–121), Faro, Portugal. [Google Scholar]
  48. Novak, M., & Schwan, S. (2021). Does touching real objects affect learning? Educational Psychology Review, 33(2), 637–665. [Google Scholar] [CrossRef]
  49. Paas, F., Renkl, A., & Sweller, J. (2003). Cognitive load theory and instructional design: Recent developments. Educational Psychologist, 38(1), 1–4. [Google Scholar] [CrossRef]
  50. Paas, F., & Sweller, J. (2012). An evolutionary upgrade of cognitive load theory: Using the human motor system and collaboration to support the learning of complex cognitive tasks. Educational Psychology Review, 24, 27–45. [Google Scholar] [CrossRef]
  51. Piaget, J. (1971). Science of education and the psychology of the child. Penguin Books. [Google Scholar]
  52. Pouw, W. T. J. L., Van Gog, T., & Paas, F. (2014). An embedded and embodied cognition review of instructional manipulatives. Educational Psychology Review, 26(1), 51–72. [Google Scholar] [CrossRef]
  53. Ranasinghe, N., Koh, K. C. R., Tram, N. T. N., Liangkun, Y., Shamaiah, K., Choo, S. G., Tolley, D., Karwita, S., Chew, B., Chua, D., & Do, E. Y. L. (2019). Tainted: An olfaction-enhanced game narrative for smelling virtual ghosts. International Journal of Human-Computer Studies, 125, 7–18. [Google Scholar] [CrossRef]
  54. Rolls, E. T. (2004). The functions of the orbitofrontal cortex. Brain and Cognition, 55(1), 11–29. [Google Scholar] [CrossRef] [PubMed]
  55. Royet, J. P., Plailly, J., Delon-Martin, C., Kareken, D. A., & Segebarth, C. (2003). fMRI of emotional responses to odors: Influence of hedonic valence and judgment, handedness, and gender. Neuroimage, 20(2), 713–728. [Google Scholar] [CrossRef] [PubMed]
  56. Russell, V., Murphy, D., & Neff, F. (2022, October 4–7). The design of an experiment to evaluate the effect of spatial sound on memory recall in a virtual reality learning environment. 33rd European Conference on Cognitive Ergonomics (pp. 1–5), Kaiserslautern, Germany. [Google Scholar]
  57. Sabiniewicz, A., Schaefer, E., Guducu, C., Manesse, C., Bensafi, M., Krasteva, N., Nelles, G., & Hummel, T. (2021). Smells influence perceived pleasantness but not memorization of a visual virtual environment. i-Perception, 12(2), 2041669521989731. [Google Scholar] [CrossRef] [PubMed]
  58. Serrano, B., Baños, R. M., & Botella, C. (2016). Virtual reality and stimulation of touch and smell for inducing relaxation: A randomized controlled trial. Computers in Human Behavior, 55, 1–8. [Google Scholar] [CrossRef]
  59. Shaw, E., Roper, T., Nilsson, T., Lawson, G., Cobb, S. V., & Miller, D. (2019, May 4–9). The heat is on: Exploring user behaviour in a multisensory virtual environment for fire evacuation. 2019 CHI Conference on Human Factors in Computing Systems (pp. 1–13), Scotland, UK. [Google Scholar]
  60. Slater, M., & Wilbur, S. (1995). Through the looking glass world of presence: A framework for immersive virtual environments. In M. Slater (Ed.), FIVE ’95 framework for immersive virtual environments. QMW University of London. [Google Scholar]
  61. Sorokowska, A., Nord, M., Stefańczyk, M. M., & Larsson, M. (2022). Odor-based context-dependent memory: Influence of olfactory cues on declarative and nondeclarative memory indices. Learning & Memory, 29(5), 136–141. [Google Scholar]
  62. Stevenson, R. J. (2010). An initial evaluation of the functions of human olfaction. Chemical Senses, 35(1), 3–20. [Google Scholar] [CrossRef]
  63. Sweller, J., Van Merrienboer, J. J. G., & Paas, F. (1998). Cognitive architecture and instructional design. Educational Psychology Review, 10(3), 251–296. [Google Scholar] [CrossRef]
  64. Toffolo, M. B. J., Smeets, M. A. M., & Van Den Hout, M. A. (2012). Proust revisited: Odours as triggers of aversive memories. Cognition & Emotion, 26(1), 83–92. [Google Scholar]
  65. Warp, R., Zhu, M., Kiprijanovska, I., Wiesler, J., Stafford, S., & Mavridou, I. (2022, October 17–21). Validating the effects of immersion and spatial audio using novel continuous biometric sensor measures for virtual reality. 2022 IEEE International Symposium on Mixed and Augmented Reality Adjunct (pp. 262–265), Singapore. [Google Scholar]
  66. Webb, M., Tracey, M., Harwin, W., Tokatli, O., Hwang, F., Johnson, R., Barrett, N., & Jones, C. (2022). Haptic-enabled collaborative learning in virtual reality for schools. Education and Information Technologies, 27(1), 937–960. [Google Scholar] [CrossRef]
  67. Yu, S., Liu, Q., Johnson-Glenberg, M. C., Han, M., Ma, J., Ba, S., & Wu, L. (2023). Promoting musical instrument learning in virtual reality environment: Effects of embodiment and visual cues. Computers & Education, 198, 104764. [Google Scholar] [CrossRef]
Figure 1. Scenes of VR learning material.
Figure 2. Experimental environment.
Figure 3. Results of QoE questionnaire.
Table 1. Questionnaire of QoE.

Item | Dimension | Description
Q1 | Relevance | The smell was relevant to the VR scene I was experiencing.
Q2 | Distraction | The smell was not distracting.
Q3 | Consistency | The smell was consistent with the VR scene when released.
Q4 | Annoyance | The smell was not annoying.
Q5 | Realism | The smell enhanced the realism of my learning in VR.
Q6 | Experience | The smell enhanced my VR learning experience.
Q7 | Liking | I enjoyed the smell that was added to VR safety learning.
Table 2. Results of Descriptive Statistics and Mann–Whitney U Test.

Variable | Group | Mean | SD | U | Z | p
Knowledge post-test | CG | 19.86 | 2.25 | 424 | −1.189 | 0.234
Knowledge post-test | EG | 20.34 | 3.11 | | |
Emotional valence | CG | 4.06 | 0.76 | 318 | −2.847 | 0.004
Emotional valence | EG | 4.53 | 0.84 | | |
Emotional arousal | CG | 3.09 | 0.96 | 347 | −2.325 | 0.020
Emotional arousal | EG | 3.63 | 1.01 | | |
Sense of presence | CG | 3.56 | 1.08 | 297 | −3.060 | 0.002
Sense of presence | EG | 4.28 | 1.02 | | |
Cognitive load | CG | 2.97 | 0.86 | 361 | −2.070 | 0.038
Cognitive load | EG | 2.58 | 0.60 | | |
Table 3. Result of thematic analysis on learning experience.

Theme | EG N | EG % | CG N | CG %
IVR delivers authentic immersive learning experiences. | 26 | 81.25 | 11 | 34.38
IVR facilitates knowledge comprehension and retention. | 11 | 34.38 | 10 | 31.25
IVR poses challenges for knowledge comprehension and retention. | 3 | 9.38 | 8 | 25.00
