Article

Augmented Reality for Learning Algorithms: Evaluation of Its Impact on Students’ Emotions Using Artificial Intelligence

by Mónica Gómez-Ríos 1,2,*, Maximiliano Paredes-Velasco 2, J. Ángel Velázquez-Iturbide 2 and Miguel Ángel Quiroz Martínez 1

1 Computer Science Department, Universidad Politécnica Salesiana, Guayaquil 090101, Ecuador
2 Department of Computer Science and Statistics, Universidad Rey Juan Carlos, 28933 Móstoles, Madrid, Spain
* Author to whom correspondence should be addressed.
Appl. Sci. 2025, 15(14), 7745; https://doi.org/10.3390/app15147745
Submission received: 9 June 2025 / Revised: 30 June 2025 / Accepted: 1 July 2025 / Published: 10 July 2025

Abstract

Augmented reality is an educational technology mainly used in disciplines with a strong physical component, such as architecture or engineering. However, its application is much less common in more abstract fields, such as programming and algorithms. Some augmented reality apps for algorithm education exist, but their effect remains insufficiently assessed. In particular, emotions are an important factor in learning, so the emotional impact of augmented reality should be determined. This article investigates the impact of an augmented reality tool for learning Dijkstra’s algorithm on students’ emotions. The investigation uses an artificial intelligence tool that detects emotions in real time through facial recognition. The data captured with this tool show that students’ positive emotions increased significantly, statistically surpassing negative emotions, and that some negative emotions, such as fear, were considerably reduced. The results show the same trend as those obtained with psychometric questionnaires, but both the positive and negative emotions registered with questionnaires were significantly greater than those registered with the artificial intelligence tool. The contribution of this article is twofold. Firstly, it reinforces previous findings on the positive emotional impact of augmented reality on students. Secondly, it shows that different instruments for measuring emotions align in direction but differ in degree.

1. Introduction

Augmented reality (AR) has shown great potential for learning by combining virtual objects with the real environment [1]. AR is an innovative tool that fosters creativity [2], facilitates the development of key digital skills [3], and increases student motivation and engagement by allowing them to explore educational content interactively, maintaining their long-term interest [2]. This impact is visible at all educational levels [4]: from early childhood education, where concepts are often introduced playfully, to higher education, where it provides practical experiences in different disciplines [5].
Likewise, emotions play an important role in students’ academic performance, as educational experiences are deeply tied to students’ emotional states. In addition to improving the understanding of abstract concepts, AR can influence students’ emotional states during the learning process. Evaluating this impact is key to optimizing the educational experience [6] and thus contributing to a more appropriate learning experience design.
AR has been less widely used in virtual disciplines such as programming or algorithms than in other disciplines, such as history or engineering. However, AR also has educational potential for these fields. Thus, it can facilitate the understanding of algorithms by allowing the visualization of data structures and computational processes. Experiments with university students have shown that AR-based learning systems make it possible to reduce the cognitive load and improve the efficiency of algorithm learning, allowing for better visualization and understanding of structures such as trees, graphs, and optimization strategies [2]. However, challenges have been identified in the accessibility of the technology, the availability of suitable devices, and the need for teacher training for its effective implementation. As AR continues to evolve, it is crucial to further investigate its impact at different educational levels and to design pedagogical strategies that maximize its benefits in teaching algorithms [2]. Although AR has proven useful in teaching abstract virtual concepts, its impact on students has been insufficiently explored. This article presents a study of students’ emotions while using the AR tool DARA, aimed at learning Dijkstra’s algorithm.
This study addresses two main objectives. The first is to analyze the impact of using an augmented-reality-based tool (DARA) on students’ emotional responses during the learning of Dijkstra’s algorithm. The second objective is to compare the effectiveness of two emotional assessment techniques—facial recognition based on artificial intelligence and the AEQ (achievement emotions questionnaire) psychometric questionnaire—for measuring students’ emotions before, during, and after the learning experience.
Based on these objectives, the following are the main contributions of this study. Firstly, the positive impact of an AR app on students’ emotions in algorithm education was replicated. The replication of results is relevant, as a different measurement instrument was used. Secondly, the degree of increase or decrease in emotions was statistically different between the two instruments. This divergence was not expected and can be relevant for researchers focusing on the emotional impact of technology in education; thus, it deserves further research.
This article is structured as follows: Section 2 presents related work, Section 3 and Section 4 describe the technological resources and the methodology used, respectively, Section 5 analyzes the results obtained, Section 6 presents the discussion, and, finally, Section 7 reports the conclusions and future lines of research.

2. Related Work

This section presents the role of emotions in learning and reviews the instruments for their assessment.

2.1. Emotions in Learning

Learning is influenced by multiple factors, including context, teaching methodology, and students’ emotional state [7]. Various studies have shown that emotions play a crucial role in motivation and attention, directly impacting students’ ability to process and retain information [7,8,9]. In this sense, the relationship between emotions and learning has been studied, where positive emotions have been shown to enhance comprehension and academic performance, while negative emotions can interfere with concentration and generate barriers to learning [10]. An emotionally favorable environment can improve students’ commitment and willingness to study [11]. Therefore, the use of technologies, such as AR, has been explored as a way to make learning more motivating and emotionally enriching [10].
If we distinguish between types of emotions, positive emotions can mediate the relationship between teacher support and student engagement, while negative emotions can interfere with commitment and concentration [10]. Likewise, intense emotional experiences, whether positive or negative, facilitate long-term information retention [7].
Emotions such as curiosity, interest, and confidence enhance academic performance, while frustration and anxiety can motivate or hinder performance, depending on the context [12]. Given their direct impact on learning, measuring student emotions has become a key objective for educators and researchers.

2.2. Assessment of Emotions

The current literature includes several methods and instruments for assessing the impact of technology on emotions in the educational setting. Psychometric questionnaires are probably the most common tool, where students typically rate different questions or statements on how they feel about the learning experience. For example, the AEQ [12] and PANAS (Positive and Negative Affect Schedule) [13] questionnaires assess the positive and negative emotions experienced during learning in this way. Other questionnaires used are DASS (Depression Anxiety Stress Scale) and POMS (Profile of Mood States) [14], which measure student tension, fatigue, anxiety, and stress. Complementary instruments include the CD-RISC (Connor–Davidson Resilience Scale) [15], which evaluates emotional resilience, analyzing the student’s ability to face challenging situations in the educational environment.
Psychometric questionnaires have limitations due to participant subjectivity and the difficulty in capturing emotions in real time. For example, students may not fully understand the questions posed and, therefore, may answer inaccurately. Or, if questionnaires are completed after the learning experience, students respond based on what they remember, which can lead to a discrepancy between the feelings they experienced and what they expressed [16,17]. These limitations motivate the need to explore complementary assessment methods that offer a more complete and accurate view of students’ emotions.

Computer technology makes it possible to measure the emotions experienced while learning more objectively and continuously [18]. Biometric methods, such as physiological sensors and electroencephalography (EEG) devices, make it possible to capture emotions in real time in an objective manner, providing more detailed and accurate data than that obtained through traditional questionnaires [19]. Physiological sensors make it possible to monitor emotions in real time to analyze emotional responses in educational settings [17]. Electroencephalography has been used to study brain activity linked to emotions. It has proven effective in research that analyzes how different technological resources influence the emotional state of students [20]. It is also applied in studies that relate physical activity with emotions through Frontal Alpha Asymmetry (FAA) [21]. Recent advances have used neural networks to improve the accuracy of emotion recognition through EEG signals [22]. However, biometric instruments have the problem of their invasive nature, as they force the student to wear sensors and devices with limited usability that prevent free interaction [19].

AI combined with non-invasive devices is a recent line of research that attempts to solve this problem, for example, through facial emotion recognition [18]. Examples of these systems are found in work environments and e-learning environments [23]. Some systems use traditional algorithms for facial recognition. Algorithms such as Eigenfaces [24] and Viola–Jones [25] have been successfully applied in emotion detection, achieving high accuracy in educational environments. Their use has shown that their combination with deep learning models allows for accurate facial recognition in real time [8]. These technologies make it possible to generate detailed reports on students’ emotional behavior, enabling adaptive education [26]. Emotions have also been identified in online learning using models based on convolutional neural networks (CNNs), providing valuable feedback regarding the emotional impact on academic performance [27]. However, these systems have limitations in terms of scalability, accuracy, or applicability in face-to-face educational environments.

There are some other techniques, although they are less commonly used. Techniques such as text analysis using natural language processing (NLP) have made it possible to infer emotions from student writing, such as essays or forum discussions [27]. Eye tracking has made it possible to monitor students’ attention through visual patterns, correlating this data with emotions such as curiosity and engagement [28]. AI-based voice analysis techniques have also proven effective in identifying emotions such as anxiety and excitement in verbal interactions, providing a deeper interpretation of the emotions felt during educational activities.
Finally, hybrid approaches that combine several of these techniques are gaining popularity, allowing for comprehensive emotion monitoring [29].

3. Technological Resources: DARA and Emolive

This section describes the two technological resources used in this research: the AR educational tool used to explain Dijkstra’s algorithm and the facial recognition tool used to assess students’ emotions.

3.1. DARA in Learning Greedy Algorithms with Augmented Reality

The greedy design technique solves optimization problems in stages. At each stage, a candidate (an element available or calculable from the problem’s input data) is chosen and incorporated into the solution, with no subsequent reversal of the choice. This strategy can be expressed in a high-level schema with several functions: selection of the next candidate, solution feasibility, objective function, etc. Although the greedy schema is simple, its application to specific problems is not obvious [30]. In some cases, the functions that define the greedy schema may be implemented differently, may not be necessary, or may even be merged. This high variability is usually not explained [31] and makes it difficult for students to understand and apply the greedy technique.
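To make the schema concrete, the sketch below expresses the generic greedy loop in Java, the language of the code that DARA overlays. The interface and method names (GreedyProblem, selectNext, isFeasible, objective) are illustrative assumptions and do not reproduce the decomposition used in DARA or in [30].

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical decomposition of the greedy schema into its characteristic functions.
interface GreedyProblem<C> {
    boolean hasCandidates();                            // candidates left to consider?
    C selectNext();                                     // choose the most promising candidate
    boolean isFeasible(List<C> solution, C candidate);  // can the candidate be added?
    boolean isComplete(List<C> solution);               // is the solution finished?
    double objective(List<C> solution);                 // value of a (partial) solution
}

final class GreedySchema {
    // Generic greedy loop: pick a candidate, keep it only if feasible,
    // and never reconsider a choice once it has been made.
    static <C> List<C> solve(GreedyProblem<C> p) {
        List<C> solution = new ArrayList<>();
        while (!p.isComplete(solution) && p.hasCandidates()) {
            C candidate = p.selectNext();
            if (p.isFeasible(solution, candidate)) {
                solution.add(candidate);
            }
        }
        return solution;
    }
}
```

Dijkstra’s algorithm instantiates this schema by repeatedly selecting the unvisited node with the smallest tentative distance, which is why it is used as the learning target in DARA.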
DARA is an AR educational application designed to facilitate learning the well-known Dijkstra’s greedy algorithm [32], which solves the problem of calculating the shortest paths from one node to all other nodes in a weighted graph. In addition to its emotional impact, the effectiveness of DARA in supporting knowledge acquisition has been evaluated in previous studies, showing improvements in student understanding and learning outcomes [33]. DARA uses a smartphone camera to capture code fragments, onto which it overlays visualizations and textual explanations. Screenshots of these augmented visualizations and DARA’s interface are provided in Supplementary Material S2. Some interface elements are in Spanish, as the app was originally developed in that language.
A student uses DARA as follows. Initially, they read the problem statement and an explanation of the greedy schema. In the second phase, the smartphone camera is activated, and the student focuses on the Java source code for Dijkstra’s algorithm from the book [30] (either on paper or digitally captured). This phase of the application uses AR. The contextual explanations appear over the detected code (Figure 1b). On the other hand, if the camera captures the code fragment corresponding to the declaration of the graph as an adjacency matrix, a 3D visualization of the graph is generated (Figure 1a). These explanations help the student to interpret the different parts of the source code and relate them to the elements that make up the greedy schema. The student can move through the code and receive augmented feedback in real time.
Finally, the student can activate the third phase of the application, which displays a 2D interactive animation of the algorithm. This animation shows a table with the main variables of the algorithm implemented in Java, where the edges and nodes selected by the algorithm can be consulted. It is possible to move forward or backward through the execution to observe the evolution of the variables.
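For reference, the following Java sketch shows the algorithmic content that the animation exposes: the tentative-distance array and the set of nodes selected so far, computed over an adjacency-matrix graph. It is a minimal illustration under assumed variable names and an assumed example graph, not the textbook implementation [30] that the app recognizes.

```java
import java.util.Arrays;

public class DijkstraSketch {
    static final int INF = Integer.MAX_VALUE / 2;  // "no edge" marker, halved to avoid overflow

    // graph[i][j] = weight of edge i -> j, or INF if absent.
    // Returns the shortest distance from 'source' to every node.
    static int[] shortestPaths(int[][] graph, int source) {
        int n = graph.length;
        int[] dist = new int[n];          // tentative distances (the table shown in the animation)
        boolean[] selected = new boolean[n];  // nodes already chosen by the greedy step
        Arrays.fill(dist, INF);
        dist[source] = 0;

        for (int step = 0; step < n; step++) {
            // Greedy choice: the unselected node with the smallest tentative distance.
            int u = -1;
            for (int v = 0; v < n; v++) {
                if (!selected[v] && (u == -1 || dist[v] < dist[u])) u = v;
            }
            if (dist[u] == INF) break;    // remaining nodes are unreachable
            selected[u] = true;
            // Relax the edges leaving u; earlier choices are never undone.
            for (int v = 0; v < n; v++) {
                if (graph[u][v] < INF && dist[u] + graph[u][v] < dist[v]) {
                    dist[v] = dist[u] + graph[u][v];
                }
            }
        }
        return dist;
    }

    public static void main(String[] args) {
        int[][] g = {
            {0,   4,   1,   INF},
            {INF, 0,   INF, 1},
            {INF, 2,   0,   5},
            {INF, INF, INF, 0}
        };
        System.out.println(Arrays.toString(shortestPaths(g, 0)));  // [0, 3, 1, 4]
    }
}
```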
In summary, augmented reality is implemented in DARA through virtual overlays that display contextual explanations and 3D visualizations of the algorithm via a smartphone camera. Its non-immersive nature and compatibility with mobile devices facilitate its integration into educational environments without requiring additional equipment.

3.2. Emotion Detection with Emolive

Emolive is a tool that allows users to track their emotions in real time using AI facial recognition. Emolive can detect six emotions: anger, disgust, fear, happiness, surprise, and sadness. If the detected facial expression is not recognized as any of these emotions, Emolive categorizes it as “neutral”, just like other facial recognition systems [34].
Emolive uses deep learning techniques to accurately detect emotions. In particular, it uses a convolutional neural network trained with the FER2013 dataset, which is widely used in emotion recognition tasks.
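As an illustration of the final classification step, the sketch below maps a vector of per-class probabilities (one per FER2013 class) to an emotion label, falling back to “neutral” when confidence is low, as described above. The 0.5 threshold, the class ordering, and the method names are assumptions made for the example, not Emolive’s documented internals.

```java
// Post-processing of a facial-expression classifier's output (model itself not shown).
public class EmotionLabeler {
    // FER2013 class order assumed for this example only.
    static final String[] CLASSES =
        {"anger", "disgust", "fear", "happiness", "sadness", "surprise", "neutral"};

    static String label(double[] probabilities, double threshold) {
        int best = 0;
        for (int i = 1; i < probabilities.length; i++) {
            if (probabilities[i] > probabilities[best]) best = i;
        }
        // Low-confidence predictions are reported as "neutral", mirroring the fallback above.
        return probabilities[best] < threshold ? "neutral" : CLASSES[best];
    }

    public static void main(String[] args) {
        double[] p = {0.05, 0.02, 0.08, 0.70, 0.05, 0.05, 0.05};
        System.out.println(label(p, 0.5));  // happiness
    }
}
```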
The tool can detect the emotions of up to 12 students simultaneously with a single camera. It also allows teachers to create student groups and generate follow-up reports.
Using Emolive is straightforward. The teacher registers the group of students and can either start monitoring immediately or schedule specific activation periods (start time and duration). During monitoring, the teacher can pause and resume the system as needed. The dashboard displays each student’s emotions through average bar charts and time series plots, which are automatically updated every two minutes.
This tool has been used in classroom settings and has demonstrated an average accuracy of 85% in detecting students’ emotions (ranging from 81% to 93%, depending on the emotion) [33]. As it is open-source and compatible with commonly available devices, it is suitable for use in resource-constrained environments.
A representative screenshot of the Emolive interface is shown in Figure 2, and additional examples are provided in Supplementary Material S3.

4. Methodology

This section describes the methodology for classroom intervention with DARA.

4.1. Hypotheses

The two objectives of the study were outlined in the introduction. Consistent with these objectives, we propose the following two hypotheses:
H1. 
Students who use AR for learning greedy algorithms experience significantly more positive emotions than negative ones.
H2. 
The assessment of the positive and negative emotions experienced while using AR does not vary between psychometric questionnaire techniques and facial recognition techniques.

4.2. Participants

The total sample size for this study was 48 students, divided into a group of 30 students and a group of 18. All of them consented to participate in this study.
Let us call the first group GEmolive, since their emotions were identified using the Emolive application. This group is made up of 30 second-year students of the Computer Science Engineering degree program at the Universidad Politécnica Salesiana in Guayaquil, Ecuador, during the 2023–2024 academic year. The students were taking the Data Structures course, which studies data structures (both linear and nonlinear) and the design and analysis of basic algorithms for manipulating these data structures.
We call the second group GAEQ since their emotions were identified using the validated AEQ, a psychometric instrument designed to measure a wide range of academic emotions (see Supplementary Material S1 for the full list of AEQ items used). This group consists of 18 students from a previous study conducted in the 2019–2020 academic year [35], which serves as a control group to validate hypothesis H2. The students in this second group had the same profile (subject, course, degree, and university) as the members of the GEmolive group, thus eliminating possible cultural or educational biases.
Although these data were collected in a previous academic year, they were ethically obtained, anonymized, and stored for academic purposes. Their use in the present research was limited to statistical comparison, with no additional contact or intervention involving the original participants. Furthermore, the current study was reviewed and authorized by the Research Vice-Rectorate of the Universidad Politécnica Salesiana in 2024, confirming compliance with ethical standards.

4.3. Procedure

Participants in both groups worked on the same learning task (learning Dijkstra’s algorithm), used the same version of DARA, and followed a very similar working methodology. The only difference was the instrument each group used to measure emotions.
The classroom intervention with the students of the GEmolive group was organized in three phases:
  • Getting Started. Fifteen minutes were spent providing students with instructions on how to use the DARA app and installing it on their mobile devices.
  • Development. Thirty minutes were allocated to complete the work assignment. Students had the Java source code for Dijkstra’s algorithm on paper and used DARA to read the problem statement and an explanation of the greedy scheme. DARA then activated the smartphone camera, and students focused on the parts of the source code they found the most complex or did not understand, to obtain additional explanations on the smartphone screen. The students then manipulated an animation of the algorithm for a given graph.
  • Finalization. This final phase, which lasted 30 min, was used to address students’ questions and concerns.
In Phases 1 and 3, students were organized into 3 shifts of 10 students each, ensuring that each shift did not exceed the maximum number of 12 users supported by Emolive. All three shifts received the same treatment and were taught by the same teacher. Each shift convened at a different time. While one shift of students was in Phases 2 and 3, the remaining students were given free time outside the classroom.
The students in the GAEQ group followed the same methodology, with only two variations. Firstly, students completed the task of Phase 2 in a single shift (there was no need to limit the group to 12 students because Emolive was not used). Secondly, an additional 20 min was allocated between Phases 1 and 3 to allow the students to complete the questionnaires (questionnaires were not provided during Phase 2 so as not to interrupt the task). A visual summary of the classroom intervention methodology, including group assignments, intervention phases, and the emotion assessment timeline, is shown in Figure 3.
GEmolive group participants were assessed using the Emolive tool. Emolive allows for continuous, real-time measurement of emotions, but it was decided to record emotions at only three moments: at the beginning, during, and at the end of the learning task. This way, the analysis is focused on some of the key moments of the session and avoids an overload of irrelevant data. Therefore, the six non-neutral emotions detected by Emolive (anger, disgust, fear, happiness, surprise, and sadness) were measured at all three moments. For each of the three measurement moments, two average variables were computed, one for positive emotions (happy and surprised) and another for negative emotions (anger, fear, disgust, and sadness). An anonymized example of the Emolive emotion detection log is available in Supplementary Material S4.
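The aggregation into the two composite variables can be illustrated with a short Java sketch. The emotion names and the 0–5 score range follow the description above, while the data layout is an assumption made for the example; the sample values reproduce the “before” means of Table 1 and yield the GEmolive “before” composites reported in Table 3 (2.55 and 2.59).

```java
import java.util.Map;

public class EmotionAggregation {
    // scores: emotion name -> intensity at one measurement moment (0-5 scale assumed)
    static double positiveMean(Map<String, Double> scores) {
        return (scores.get("happiness") + scores.get("surprise")) / 2.0;
    }

    static double negativeMean(Map<String, Double> scores) {
        return (scores.get("anger") + scores.get("fear")
              + scores.get("disgust") + scores.get("sadness")) / 4.0;
    }

    public static void main(String[] args) {
        // Example values taken from the "before" means reported in Table 1.
        Map<String, Double> before = Map.of(
            "happiness", 2.40, "surprise", 2.70,
            "anger", 2.03, "fear", 4.50, "disgust", 0.00, "sadness", 3.83);
        System.out.printf("positive=%.2f negative=%.2f%n",
                          positiveMean(before), negativeMean(before));
        // Prints positive=2.55 negative=2.59, matching the GEmolive "before" row of Table 3.
    }
}
```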
The emotions of the GAEQ student group were measured with the psychometric AEQ questionnaire. The questionnaire presents statements about different emotional states, and students must indicate the degree to which they identify with those emotions using a 5-point Likert scale, from “very little” (scored 1) to “very much” (scored 5). This questionnaire measures eight possible emotions: enjoyment, hope, pride, anger, anxiety, shame, hopelessness, and boredom. It consists of 75 items organized into three sections that allow the emotional state to be assessed at three different times: before, during, and after completing the learning task.
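Scoring the AEQ therefore reduces to averaging each emotion’s 1–5 item responses within a section. A minimal sketch follows; the item counts per emotion are illustrative rather than the actual AEQ item-to-scale mapping, which is given in Supplementary Material S1.

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class AeqScoring {
    // responses: emotion name -> the 1-5 Likert answers given to that emotion's items
    static Map<String, Double> subscaleMeans(Map<String, List<Integer>> responses) {
        return responses.entrySet().stream()
                .collect(Collectors.toMap(
                        e -> e.getKey(),
                        e -> e.getValue().stream()
                              .mapToInt(Integer::intValue).average().orElse(0.0)));
    }

    public static void main(String[] args) {
        // Illustrative item counts only; the real AEQ has 75 items across eight emotions.
        Map<String, List<Integer>> duringSection = Map.of(
                "enjoyment", List.of(4, 5, 4),
                "anxiety", List.of(2, 1, 3));
        System.out.println(subscaleMeans(duringSection));
        // e.g., {enjoyment=4.33..., anxiety=2.0} (map order not guaranteed)
    }
}
```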
Note that some specific emotions are not captured by both instruments; for example, one measures happiness while the other one measures enjoyment. To compare the measures from both instruments, emotions were grouped into positive and negative, as has been done in other studies [36].

5. Results

Statistical analyses, whether descriptive or inferential, were performed with SPSS Statistics (version 29.0.2.0), with a 95% confidence level for inferential procedures.

5.1. Analysis of H1: Difference Between Positive and Negative Emotions

Table 1 shows the descriptive statistics for the emotions measured in the GEmolive group. It can be observed that the emotion “disgusted” was not experienced at any point during the experience (Mean = 0.00), while the emotion “happy” increased continuously from the beginning of the session to the end. Likewise, the emotions “anger” and “fear” decreased throughout the experience. Finally, the emotions “sad” and “surprised” fluctuated throughout the experience.
Figure 4 shows the means of these variables together with the neutral emotion, where it can be seen that the positive and negative emotions started with a similar mean value, but throughout the experience, the former increased, while the latter decreased.
To determine whether the difference between the means of positive and negative emotions at the beginning and end was significant, an inferential statistical analysis was performed. First, because the sample size was less than 50, the Shapiro–Wilk test with Lilliefors significance correction was applied to determine whether the sample followed a normal distribution. The result was that none of the variables followed a normal distribution. Consequently, a nonparametric test for comparing related samples, the Wilcoxon signed-rank test, was applied. The results are shown in Table 2. Significant differences are bolded across the three instants (before, during, and after).
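The tests reported here were run in SPSS. Purely as an illustration, the same paired comparison could be reproduced with the Apache Commons Math 3 library, assuming commons-math3 is on the classpath; the arrays below are placeholder values, not the study data.

```java
import org.apache.commons.math3.stat.inference.WilcoxonSignedRankTest;

public class PairedComparison {
    public static void main(String[] args) {
        // Placeholder paired scores (e.g., positive vs. negative composites per student).
        double[] positiveAfter = {3.5, 3.0, 4.0, 3.5, 2.5, 4.5, 3.0, 3.5};
        double[] negativeAfter = {1.5, 2.0, 1.0, 2.5, 1.5, 1.0, 2.0, 1.5};

        WilcoxonSignedRankTest test = new WilcoxonSignedRankTest();
        // exactPValue = false selects the normal approximation; true computes the exact p-value.
        double p = test.wilcoxonSignedRankTest(positiveAfter, negativeAfter, false);
        System.out.println("p-value = " + p);
    }
}
```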

5.2. Analysis of H2: Emotional Variation Through Psychometric Questionnaire and Facial Recognition

The emotion scores measured with Emolive and those measured with AEQ were compared. Table 3 shows the descriptive statistics for the positive and negative emotions measured at the three time points for both groups.
The sample distribution was checked. It was observed that all variables in the GAEQ group followed a normal distribution. However, this was not the case for the GEmolive group (Shapiro–Wilk with Lilliefors significance correction, p < 0.05). Consequently, the nonparametric Mann–Whitney U test for unrelated samples was applied. Table 4 shows the hypotheses tested by this test and its results (significant differences found are marked in bold). Significant differences were found in the positive emotions of both groups at the beginning and end of the experience, as well as in the negative emotions of both groups at the end.
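Again, this comparison was performed in SPSS. The sketch below shows how the same unpaired test could be reproduced with Apache Commons Math 3 (assumed available), using placeholder values rather than the study data.

```java
import org.apache.commons.math3.stat.inference.MannWhitneyUTest;

public class GroupComparison {
    public static void main(String[] args) {
        // Placeholder composite scores for the two independent groups.
        double[] positiveAfterAEQ     = {3.8, 4.0, 3.5, 2.9, 4.2, 3.6};
        double[] positiveAfterEmolive = {3.0, 3.4, 2.8, 3.1, 3.6, 2.9};

        MannWhitneyUTest test = new MannWhitneyUTest();
        double u = test.mannWhitneyU(positiveAfterAEQ, positiveAfterEmolive);       // U statistic
        double p = test.mannWhitneyUTest(positiveAfterAEQ, positiveAfterEmolive);   // asymptotic p-value
        System.out.println("U = " + u + ", p-value = " + p);
    }
}
```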

6. Discussion

6.1. Hypothesis 1

The first hypothesis was stated as follows:
H1. 
Students who use AR for learning greedy algorithms experience significantly more positive emotions than negative ones.
The results from Emolive indicate that, at the beginning of the experience, the positive and negative emotions experienced by the students were at a similar level (varying within 0.4 points on a scale of 1 to 5). However, as they participated in the experience, positive emotions increased while negative emotions decreased, with this difference being statistically significant at the end of the experience. Therefore, hypothesis H1 of the intervention can be affirmed, confirming that students who used AR to learn greedy algorithms experienced significantly more positive emotions than negative ones.
Previous research has found evidence along these lines. For example, in a study on the relationship between activity and emotions in algorithm learning with AR [35], it was found that participants reported higher levels of positive emotions than negative emotions during the AR experience. The decrease in negative emotions (anger, disgust, fear, and sadness) found in this study is in line with the findings of other studies analyzing emotions using facial recognition. Specifically, it had been found [35] that 86% and 87% of students felt that using the AR tool was neither unpleasant nor depressing, respectively.
Among negative emotions, the following finding is striking. One of the negative emotions that commonly manifests with the use of AR is anxiety [37]. Emolive does not allow for anxiety to be measured directly, but it does measure fear, which is strongly related to anxiety [10,35]. Therefore, one would expect this emotion to evolve similarly to anxiety due to the use of AR. However, the results found in the present study indicate otherwise. Initially, the fear experienced by the students was high (4.5 points out of 5), and by the end, it had decreased by 38% (practically reaching 2.7 out of 5). It is worth noting that the previous study used as a control group in the present study [35] found an increase in anxiety, using the same DARA tool and the same task as those used in the present study. Therefore, it would be advisable to conduct additional studies to shed light on the divergence between the increase in anxiety and the reduction in fear.

6.2. Hypothesis 2

The second hypothesis to be validated was stated as follows:
H2. 
The assessment of the positive and negative emotions experienced while using AR does not vary between psychometric questionnaire techniques and facial recognition techniques.
The results showed significant differences between the measurements collected using the AEQ questionnaire and the Emolive system. Thus, at the end of the experience, the positive and negative emotions detected by the psychometric questionnaire were significantly greater than those automatically detected by facial recognition with Emolive. The same difference was found in the positive emotions at the beginning of the experience. Consequently, hypothesis H2 is rejected, and it cannot be stated that the assessment of positive and negative emotions experienced while using AR does not vary with the measurement technique. It is unclear what may explain this divergence between the two types of emotion recognition. Some studies have identified limitations in the use of psychometric questionnaire techniques. For example, a student may express a false memory of an emotion, and emotional distortion may occur when responding based on memories of past emotions [12,38]. These phenomena could explain the divergence found in this study between the two emotion assessment techniques. However, the evidence found in this study is not conclusive enough to draw this conclusion, and a more in-depth analysis of the specific emotions assessed by each technique is necessary.
The differences observed between these techniques may be attributed to both psychological and technical factors. From the psychological perspective, questionnaires such as the AEQ can be subject to memory or social desirability biases [39]. In contrast, AI-based facial recognition tools like Emolive assess emotions in real time, but may be limited by technical conditions such as lighting or facial visibility [40]. Therefore, some researchers argue that both methods can be complementary and provide a more holistic understanding of learners’ emotional states [41].

6.3. Limitations

This study has several limitations that constrain the generalizability of its results. The first limitation stems from the simultaneous use of the two software applications. Sometimes, difficulties were encountered in recognizing emotions due to the students’ body posture during their interaction with DARA: Emolive did not correctly capture emotions when students tilted their face forward to interact with the mobile AR device, which affected the quality of the facial expression analysis. In these cases, Emolive informed the teacher of this situation, and the teacher reminded the students to adopt a more appropriate posture.
A second limitation relates to the constraints of both emotional measurement instruments. Although the AEQ is a validated psychometric tool, and the Emolive system has demonstrated an accuracy of 85%, both approaches present inherent limitations that may affect the results. Additionally, the assessment was conducted with a limited number of students (only 48 participants) and within a specific academic context—second-year undergraduate students in a computer science program—which limits the generalizability of the findings to other populations or educational settings.
A third limitation lies in the use of historical data for the GAEQ group, which corresponds to a previous academic year. This introduces potential confounding variables, such as differences in group characteristics or educational conditions. Furthermore, the absence of a control group within the same academic cycle reduces the level of experimental control.

7. Conclusions

This article explores the emotional impact of using an augmented reality tool for learning Dijkstra’s algorithm, called DARA. The evolution of the emotional state of 48 second-year students enrolled in the Computer Science Engineering degree during the use of DARA was analyzed. The emotional state was monitored using Emolive, an AI-based facial recognition tool for emotion detection. The main result is that students experienced significantly more positive emotions than negative ones after using augmented reality. In addition, it was found that some unpleasant emotions, such as fear, were reduced in intensity by 38% at the end of the experience. The results of this study were compared with those of a previous study based on the AEQ psychometric questionnaire. The trend recorded with both approaches is the same. However, statistically significant differences were found between the measures of the two methods, for both positive and negative emotions.
For the future, we propose two directions for further research. First, to analyze the causes of the divergence found between the two techniques in emotion detection, which may be due to psychological biases in self-report tools and technical limitations in AI-based recognition. Second, to explore the integration of both techniques into a joint emotion assessment model. The differences identified in this study, though only briefly addressed, suggest the potential value of combining both approaches for a more comprehensive emotional assessment. A more detailed analysis of these psychological and technical factors is suggested as a future research direction.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/app15147745/s1, Material S1: Supplementary_AEQ_items.pdf; Material S2: Supplementary_DARA_screenshots.pdf; Material S3: Supplementary_Emolive_interface.pdf; Material S4: Supplementary_Session_Log_Example_Anonymized.xlsx.

Author Contributions

Conceptualization, M.G.-R.; Methodology, M.G.-R.; Software, M.G.-R., M.Á.Q.M., and M.P.-V.; Formal analysis, M.P.-V.; Investigation, M.G.-R.; Resources, M.G.-R.; Data curation, M.G.-R.; Writing—original draft preparation, M.G.-R.; Writing—review and editing, M.G.-R., M.P.-V., and J.Á.V.-I.; Visualization, M.G.-R.; Supervision, M.P.-V. and J.Á.V.-I.; Project administration, M.G.-R. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by Universidad Politécnica Salesiana, Ecuador, which covered the article processing charges (APCs). No external funding was received.

Institutional Review Board Statement

This study was reviewed and approved by the Vice-Rectorate for Research of the Universidad Politécnica Salesiana (UPS), and a signed statement was issued by the Vice-Rector for Research (Ref. UPS/VINV/2024/05/10), confirming that the study complies with the applicable ethical principles for academic research activities.

Informed Consent Statement

Informed consent was obtained from all participants involved in this study. Participation was voluntary, and students were informed of the study’s purpose, procedures, and their right to withdraw at any time without consequences.

Data Availability Statement

The data presented in this study are openly available at https://doi.org/10.21950/FZF6YA.

Acknowledgments

The authors would like to thank the Universidad Politécnica Salesiana of Ecuador for supporting the publication of this article. No external funding was received.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Altinpulluk, H. Determining the Trends of Using Augmented Reality in Education between 2006–2016. Educ. Inf. Technol. 2019, 24, 1089–1114.
  2. Avila-Garzon, C.; Bacca-Acosta, J.; Duarte, J.; Betancourt, J. Augmented Reality in Education: An Overview of Twenty-Five Years of Research. Contemp. Educ. Technol. 2021, 13, ep312.
  3. Moreno-Fernández, O.; Solís-Espallargas, C.; Moreno-Crespo, P.; Ferreras-Listán, M. Augmented Reality and Primary Education: Linkage, Potentiality and Applicability from the Perspective of Teachers in Initial Training. Hacet. Univ. J. Educ. 2023, 38, 388–389.
  4. Huang, T.-C.; Tseng, H.-P. Extended Reality in Applied Sciences Education: A Systematic Review. Appl. Sci. 2025, 15, 4038.
  5. Cabero-Almenara, J.; Llorente-Cejudo, C.; Martinez-Roig, R. The Use of Mixed, Augmented and Virtual Reality in History of Art Teaching: A Case Study. Appl. Syst. Innov. 2022, 5, 44.
  6. Ibáñez, M.B.; Uriarte Portillo, A.; Zatarain Cabada, R.; Barrón, M.L. Impact of Augmented Reality Technology on Academic Achievement and Motivation of Students from Public and Private Mexican Schools. A Case Study in a Middle-School Geometry Course. Comput. Educ. 2020, 145, 103734.
  7. Vistorte, A.O.R.; Deroncele-Acosta, A.; Ayala, J.L.M.; Barrasa, A.; López-Granero, C.; Martí-González, M. Integrating Artificial Intelligence to Assess Emotions in Learning Environments: A Systematic Literature Review. Front. Psychol. 2024, 15, 1387089.
  8. Kaur, M.; Kumar, M. Facial Emotion Recognition: A Comprehensive Review. Expert Syst. 2024, 41, e13670.
  9. Kalateh, S.; Estrada-Jimenez, L.A.; Nikghadam-Hojjati, S.; Barata, J. A Systematic Review on Multimodal Emotion Recognition: Building Blocks, Current State, Applications, and Challenges. IEEE Access 2024, 12, 103976–104019.
  10. Gómez-Rios, M.D.; Paredes-Velasco, M.; Hernández-Beleño, R.D.; Fuentes-Pinargote, J.A. Analysis of Emotions in the Use of Augmented Reality Technologies in Education: A Systematic Review. Comput. Appl. Eng. Educ. 2023, 31, 216–234.
  11. Ibáñez, M.-B.; Delgado-Kloos, C. Augmented Reality for STEM Learning: A Systematic Review. Comput. Educ. 2018, 123, 109–123.
  12. Pekrun, R.; Goetz, T.; Frenzel, A.C.; Barchfeld, P.; Perry, R.P. Measuring Emotions in Students’ Learning and Performance: The Achievement Emotions Questionnaire (AEQ). Contemp. Educ. Psychol. 2011, 36, 36–48.
  13. Watson, D.; Clark, L.A.; Tellegen, A. Development and Validation of Brief Measures of Positive and Negative Affect: The PANAS Scales. J. Pers. Soc. Psychol. 1988, 54, 1063.
  14. Petrowski, K.; Albani, C.; Zenger, M.; Brähler, E.; Schmalbach, B. Revised Short Screening Version of the Profile of Mood States (POMS) From the German General Population. Front. Psychol. 2021, 12, 631668.
  15. Connor, K.M.; Davidson, J.R.T. Development of a New Resilience Scale: The Connor-Davidson Resilience Scale (CD-RISC). Depress. Anxiety 2003, 18, 76–82.
  16. Bosch, N.; D’Mello, S.; Baker, R.; Ocumpaugh, J.; Shute, V.; Ventura, M.; Wang, L.; Zhao, W. Automatic Detection of Learning-Centered Affective States in the Wild. In Proceedings of the 20th International Conference on Intelligent User Interfaces, Atlanta, GA, USA, 29 March–1 April 2015; ACM: New York, NY, USA, 2015; pp. 379–388.
  17. Xie, J.; Luo, Y.; Wang, S.; Liu, G. Electroencephalography-Based Recognition of Six Basic Emotions in Virtual Reality Environments. Biomed. Signal Process. Control 2024, 93, 106189.
  18. Flores, E.; Solis-Fonseca, J.-P.; Cuba-Aguilar, C.-R.; Rosales-Fernandez, J.-H.; Barahona-Altao, Y.-G. Emotional State of the Classroom through Artificial Intelligence to Improve the Teaching-Learning Process in a University. In Proceedings of the 21st LACCEI International Multi-Conference for Engineering, Education and Technology, Buenos Aires, Argentina, 17–21 July 2023; Latin American and Caribbean Consortium of Engineering Institutions: Boca Raton, FL, USA, 2023.
  19. Bogicevic Sretenovic, M.; Milenkovic, I.; Jovanovic, B.; Simic, D.; Minovic, M.; Milovanovic, M. Bringing Biometric Sensors to the Classroom: A Fingerprint Acquisition Laboratory for Improving Student Motivation and Commitment. Appl. Sci. 2020, 10, 880.
  20. Wang, X.; Ren, Y.; Luo, Z.; He, W.; Hong, J.; Huang, Y. Deep Learning-Based EEG Emotion Recognition: Current Trends and Future Perspectives. Front. Psychol. 2023, 14, 1126994.
  21. Wilhelm, R.A.; Lacey, M.F.; Masters, S.L.; Breeden, C.J.; Mann, E.; MacDonald, H.V.; Gable, P.A.; White, E.J.; Stewart, J.L. Greater Weekly Physical Activity Linked to Left Resting Frontal Alpha Asymmetry in Women: A Study on Gender Differences in Highly Active Young Adults. Psychol. Sport. Exerc. 2024, 74, 102679.
  22. Wang, Z.; Wang, Y.; Tang, Y.; Pan, Z.; Zhang, J. Knowledge Distillation Based Lightweight Domain Adversarial Neural Network for Electroencephalogram-Based Emotion Recognition. Biomed. Signal Process. Control 2024, 95, 106465.
  23. Canazas, A.P.; Blaz, J.J.R.; Martínez, P.D.T.; Mamani, X.J. Sistema de Identificación de Emociones a Través de Reconocimiento Facial Utilizando Inteligencia Artificial. Innovación Softw. 2022, 3, 140–150.
  24. Mukhopadhyay, S.; Sharma, S. Real Time Facial Expression and Emotion Recognition Using Eigen Faces, LBPH and Fisher Algorithms. In Proceedings of the 2020 10th International Conference on Cloud Computing, Data Science & Engineering (Confluence), Noida, India, 29–31 January 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 212–220.
  25. Aouani, H.; Ben Ayed, Y. Deep Facial Expression Detection Using Viola-Jones Algorithm, CNN-MLP and CNN-SVM. Soc. Netw. Anal. Min. 2024, 14, 65.
  26. Grassi, D. Supporting Developers’ Emotional Awareness: From Self-Reported Emotions to Biometrics. In Proceedings of the 28th International Conference on Evaluation and Assessment in Software Engineering, Salerno, Italy, 18–21 June 2024; ACM: New York, NY, USA, 2024; pp. 500–504.
  27. Liu, J.; Yan, C.; Liu, W.; Liu, X.; Ding, Y.; Zhou, Y. Research on Learner “Emotion-Behavior-Ability” Characteristics Based on MOOC Online Education User Profiles. Inf. Process Manag. 2025, 62, 104026.
  28. Huangfu, Q.; Li, H.; Ban, Y.; He, J. An Eye-Tracking Study on the Effects of Displayed Teacher Enthusiasm on Students’ Learning Procedural Knowledge of Chemistry in Video Lectures. J. Chem. Educ. 2024, 101, 259–269.
  29. Liu, T.; Wang, M.; Yang, B.; Liu, H.; Yi, S. ESERNet: Learning Spectrogram Structure Relationship for Effective Speech Emotion Recognition with Swin Transformer in Classroom Discourse Analysis. Neurocomputing 2025, 612, 128711.
  30. Velázquez-Iturbide, J.Á. The Design and Coding of Greedy Algorithms Revisited. In Proceedings of the 16th Annual Joint Conference on Innovation and Technology in Computer Science Education, Darmstadt, Germany, 27–29 June 2011; ACM: New York, NY, USA, 2011; pp. 8–12.
  31. Velázquez-Iturbide, J.Á. Reflections on Teaching Algorithm Courses. In Proceedings of the 56th ACM Technical Symposium on Computer Science Education V. 1, Pittsburgh, PA, USA, 26 February–1 March 2025; pp. 1148–1154.
  32. Sahni, S. Data Structures, Algorithms, and Applications in Java; Universities Press: Ibadan, Nigeria, 2000; ISBN 8173715238.
  33. Quiroz-Martinez, M.A.; Díaz-Fernández, S.; Aguirre-Sánchez, K.; Gomez-Rios, M.D. Analysis of Students’ Emotions in Real-Time during Class Sessions through an Emotion Recognition System. In X Congreso Internacional de Ciencia, Tecnología e Innovación para la Sociedad (CITIS 2024); Springer: Cham, Switzerland, 2024.
  34. Salloum, S.A.; Alomari, K.M.; Alfaisal, A.M.; Aljanada, R.A.; Basiouni, A. Emotion Recognition for Enhanced Learning: Using AI to Detect Students’ Emotions and Adjust Teaching Methods. Smart Learn. Environ. 2025, 12, 21.
  35. Paredes-Velasco, M.; Velázquez-Iturbide, J.Á.; Gómez-Ríos, M. Augmented Reality with Algorithm Animation and Their Effect on Students’ Emotions. Multimed. Tools Appl. 2023, 82, 11819–11845.
  36. Fernández Herrero, J.; Gómez Donoso, F.; Roig Vila, R. The First Steps for Adapting an Artificial Intelligence Emotion Expression Recognition Software for Emotional Management in the Educational Context. Br. J. Educ. Technol. 2023, 54, 1939–1963.
  37. Poitras, E.G.; Harley, J.M.; Liu, Y.S. Achievement Emotions with Location-based Mobile Augmented Reality: An Examination of Discourse Processes in Simulated Guided Walking Tours. Br. J. Educ. Technol. 2019, 50, 3345–3360.
  38. D’Mello, S.; Graesser, A. Dynamics of Affective States during Complex Learning. Learn. Instr. 2012, 22, 145–157.
  39. Robinson, M.D.; Clore, G.L. Episodic and Semantic Knowledge in Emotional Self-Report: Evidence for Two Judgment Processes. J. Pers. Soc. Psychol. 2002, 83, 198–215.
  40. Barrett, L.F.; Adolphs, R.; Marsella, S.; Martinez, A.M.; Pollak, S.D. Emotional Expressions Reconsidered: Challenges to Inferring Emotion From Human Facial Movements. Psychol. Sci. Public Interest 2019, 20, 1–68.
  41. Calvo, R.A.; D’Mello, S. Affect Detection: An Interdisciplinary Review of Models, Methods, and Their Applications. IEEE Trans. Affect. Comput. 2010, 1, 18–37.
Figure 1. When the smartphone camera detects the Java code corresponding to the adjacency matrix declaration, DARA displays a 3D visualization of the graph to help students understand its structure, as shown in (a). When detecting other code segments, DARA overlays textual explanations directly onto the source code, allowing the student to relate programming instructions to the components of the greedy algorithm, as illustrated in (b). The student moves the camera across the code to explore different fragments and their contextual annotations.
Figure 2. Real-time emotion detection in the classroom using the Emolive system. The application identifies the emotions of multiple students simultaneously through webcam input. In this screenshot, different emotional states are detected and labeled above the corresponding students.
Figure 3. Methodological design of the classroom intervention comparing the GEmolive and GAEQ groups. The process is organized into three main phases: Preparation, Development, and Finalization. The duration of each stage is indicated, along with the specific instruments and procedures applied in each group.
Figure 4. Evolution of emotions during the experience detected with Emolive.
Table 1. Descriptive statistics of emotions at three moments of the experience.
Emotion (Type)        | Before (N = 30)         | During (N = 30)         | After (N = 30)
                      | Min.  Max.  Mean  SD    | Min.  Max.  Mean  SD    | Min.  Max.  Mean  SD
Anger (negative)      | 0.00  5.00  2.03  2.01  | 0.00  4.00  1.20  1.69  | 0.00  5.00  1.10  1.67
Disgusted (negative)  | 0.00  0.00  0.00  0.00  | 0.00  0.00  0.00  0.00  | 0.00  0.00  0.00  0.00
Fearful (negative)    | 3.00  5.00  4.50  0.68  | 2.00  5.00  4.03  1.03  | 0.00  5.00  2.77  1.63
Happy (positive)      | 0.00  5.00  2.40  1.83  | 0.00  5.00  3.00  1.98  | 1.00  5.00  3.57  1.10
Neutral (neutral)     | 0.00  5.00  4.00  1.51  | 0.00  5.00  2.77  1.70  | 2.00  5.00  3.83  1.15
Sad (negative)        | 1.00  5.00  3.83  1.29  | 0.00  4.00  1.53  1.20  | 0.00  3.00  2.20  1.10
Surprised (positive)  | 0.00  4.00  2.70  1.64  | 0.00  5.00  2.97  1.59  | 0.00  5.00  2.90  1.79
Table 2. Wilcoxon Signed-Rank Test.
Variable                              | Sig.
Positive–negative emotions (before)   | 0.925
Positive–negative emotions (during)   | <0.001
Positive–negative emotions (after)    | <0.001
Table 3. Descriptive statistics of the emotions detected by AEQ and Emolive.
Variable (Emotions)  | Group (N)      | Min.  | Max.  | Mean  | SD
Positive before      | GAEQ (18)      | 2.50  | 5.00  | 3.93  | 0.88
                     | GEmolive (30)  | 0.00  | 3.50  | 2.55  | 0.98
Negative before      | GAEQ (18)      | 1.00  | 4.13  | 2.28  | 0.78
                     | GEmolive (30)  | 1.25  | 3.75  | 2.59  | 0.78
Positive during      | GAEQ (18)      | 2.59  | 4.95  | 3.85  | 0.65
                     | GEmolive (30)  | 0.00  | 4.00  | 2.98  | 1.63
Negative during      | GAEQ (18)      | 1.00  | 4.23  | 2.03  | 0.79
                     | GEmolive (30)  | 0.50  | 2.75  | 1.69  | 0.69
Positive after       | GAEQ (18)      | 2.25  | 5.00  | 3.69  | 0.85
                     | GEmolive (30)  | 2.50  | 4.50  | 3.23  | 0.91
Negative after       | GAEQ (18)      | 1.44  | 4.19  | 2.44  | 0.85
                     | GEmolive (30)  | 0.00  | 3.00  | 1.52  | 0.94
Table 4. Hypothesis contrast of equality of means between groups.
H. | Null Hypothesis                                                                                  | Sig. (Mann–Whitney U) | Decision
1  | The distribution of positive emotions before is the same across group categories (AEQ/Emolive)  | <0.001                | Reject the null hypothesis
2  | The distribution of negative emotions before is the same across group categories (AEQ/Emolive)  | 0.140                 | Fail to reject the null hypothesis
3  | The distribution of positive emotions during is the same across group categories (AEQ/Emolive)  | 0.344                 | Fail to reject the null hypothesis
4  | The distribution of negative emotions during is the same across group categories (AEQ/Emolive)  | 0.151                 | Fail to reject the null hypothesis
5  | The distribution of positive emotions after is the same across group categories (AEQ/Emolive)   | 0.042                 | Reject the null hypothesis
6  | The distribution of negative emotions after is the same across group categories (AEQ/Emolive)   | 0.006                 | Reject the null hypothesis

