Article

Emotion-Aware Education Through Affective Computing and Learning Analytics: Insights from a Moroccan University Case Study

by
Nisserine El Bahri
1,*,
Zakaria Itahriouan
2 and
Mohammed Ouazzani Jamil
1
1
Systems and Sustainable Environment Laboratory, Private University of Fez, Fez 30050, Morocco
2
Intelligent Processing, Data Analysis, and Cybersecurity Team, Moulay Ismail University, Meknes 50050, Morocco
*
Author to whom correspondence should be addressed.
Digital 2025, 5(3), 45; https://doi.org/10.3390/digital5030045
Submission received: 29 July 2025 / Revised: 11 September 2025 / Accepted: 15 September 2025 / Published: 22 September 2025
(This article belongs to the Topic Artificial Intelligence Models, Tools and Applications)

Abstract

In a world where artificial intelligence is constantly reshaping education, taking students’ emotions into account is crucial for enhancing their engagement and academic performance. This article presents LearnerEmotions, an online application that employs computer vision to determine how learners are feeling in real time from their facial expressions. Teachers and institutions can use this tool, designed for both in-person and remote classes, to access analytical dashboards and monitor students’ emotions. The facial expression recognition model used in this application achieved an average accuracy of 0.91 and a loss of 0.3 in the real environment. More than 9 million emotional data points were gathered from an experiment involving 65 computer engineering students, and these insights were correlated with attendance and academic performance. While negative emotions such as anger, sadness, and fear are associated with decreased performance and lower attendance, the statistical study shows a strong correlation between positive emotions such as surprise and joy and successful academic performance. These results underline the need for technological tools that offer immediate pedagogical regulation and support the notion that emotions play an important role in the learning process. LearnerEmotions, which takes students’ emotional states into account, is thus a promising first step toward more adaptive learning.

1. Introduction

Whether in distance or face-to-face teaching and learning, the use of computer-based tools has become essential. Although these tools already support routine teaching and learning tasks, they can be used to improve this domain in several further ways. Moreover, this increasing use of technology means that the student is almost always in front of a computer while learning. It therefore becomes increasingly feasible to develop solutions that analyze student behavior, which can lead to a deeper understanding of students and, consequently, to better teaching.
Gathering and exploiting emotional data may provide valuable insights for the creation of more effective instructional materials and strategies [1]. Analyzing students’ emotional reactions to different teaching approaches may help identify the most engaging and successful teaching strategies. Teachers can then refine their teaching strategies, implement practical ideas, and adapt the curriculum to better satisfy the academic and emotional requirements of their students.
This study focuses on how students’ emotional data can be used to enhance learning through technology. Initially, we created an application that enables us to capture and display emotions depending on the learner’s facial expressions. Then, in the second stage, we recorded the students’ emotions while teaching a module using this software. Finally, in order to explore the relations between academic achievement and emotions using a data-driven approach, we gathered and analyzed this data.
The main objective of the software we created for this study is to offer a tool for evaluating the emotions of students during in-person or remote learning by utilizing advances in computer vision. It aims to provide teachers with a tool that facilitates real-time emotional monitoring of their students. The software uses the frontal camera to capture students’ facial expressions during teaching and learning sessions in order to identify their emotions. Additionally, the software can assist in creating a massive data source for several kinds of data science studies that employ the understanding of students’ emotions as a factor of enhancing instruction.
In this paper, we first present the emotion recognition application that we developed, focusing on its main features, architecture, and technical specifications. We then describe how this application was used in a real course at the Private University of Fez to record and analyze students’ emotions and their correlation with academic performance.

2. Technology and Emotions

Affective computing, or emotion recognition, is a rapidly growing field that uses artificial intelligence and embedded technologies to perceive, assess, and decipher human emotional states. The goal is to make computer systems more responsive and user-friendly, particularly in the field of education. Various technological methodologies have been suggested and confirmed in scientific publications.

2.1. Facial Expression

One of the most widespread techniques for identifying emotions is the examination of facial expressions [2,3]. It is based on the analysis of facial muscle gestures, standardized in the Facial Action Coding System (FACS) [4,5] developed by Ekman & Friesen. Nowadays, computer vision models, generally based on convolutional neural networks (CNNs), allow for high-accuracy real-time detection. This is the main approach used in the application developed for this study.
Among the most used datasets to train these models are FER2013, AffectNet and CK+ [6]. Several recent studies [7,8] showed that CNN models trained on these datasets achieve an accuracy greater than 90% on the seven basic emotions of Ekman.

2.2. Voice

The voice is a rich source of emotional information. Voice analysis can extract characteristics such as pitch, intensity, prosody, and sound spectrum to classify expressed emotions [9,10]. Models such as OpenSMILE are frequently used to extract these acoustic parameters [11,12]. Many recent research studies using deep neural networks have achieved excellent results in terms of emotion recognition accuracy [13,14].

2.3. Body Language

Posture, gestures, proxemics, and micromovements are all aspects of body language [15,16]. Despite its complexity, it is now possible to leverage depth sensors (such as the Kinect) or RGB cameras combined with skeleton detection algorithms such as OpenPose to infer emotions through movement [17,18]. According to many studies [19], integrating postural data with facial expression significantly improves the accuracy of emotion recognition.

2.4. Physiological Signals

Identifying emotions through physiological signals relies on bodily indicators [20,21] such as heart rate, electrodermal conductance, or skin temperature, captured by portable devices such as the Empatica E4 [22] or Shimmer3 GSR+ [23]. These sensors offer the ability to detect, live and without intrusion, unconscious reactions of the autonomic nervous system associated with emotions such as stress, involvement, or frustration. Several research works [24,25] have shown that examining these signals together can classify emotions with very high accuracy.

2.5. Neural Signals (EEG)

Emotion identification through neural signal analysis, especially through electroencephalography (EEG), represents a non-invasive technique that allows the detection of emotional fluctuations in real time thanks to high temporal resolution [26,27]. Several recent studies have demonstrated the effectiveness of EEG in emotion classification based on valence and arousal dimensions, particularly using deep learning models such as CNN, RNN or hybrid networks [28,29].

3. Emotions in Teaching and Learning

Emotion recognition and its impact on teaching and learning is a major area of interest that is attracting growing attention from researchers in the educational technology domain [30,31,32]. By integrating emotion recognition techniques into the classroom, teachers may obtain important insights into their students’ emotional states and modify their teaching style accordingly.
Increasing student engagement is one of the main objectives of processing students’ emotions in the context of teaching and learning [33,34]. Strategies applied by teachers can be adjusted to grab and maintain students’ attention by closely evaluating their emotional states. If a student is displaying signs of boredom or disinterest, the teacher can adjust the class to include interactive components, hands-on exercises, or multimedia resources.
The personalized learning enabled by emotion recognition technology also offers significant opportunities [35,36]. Since every student is different, their emotional reactions are crucial to their educational process. Using emotion recognition, teachers can modify the learning process to meet the needs of each individual student. For example, the teacher might provide extra explanations, one-on-one assistance, or other learning resources if a student is struggling to understand or progress.
The capacity to identify emotions also facilitates early intervention and provides struggling students with help. By identifying negative emotional states such as concern or disinterest and providing appropriate solutions [37,38], teachers are better equipped to handle them. By being proactive, teachers may help students overcome difficulties more quickly and avoid potential obstacles to learning.
Additionally, emotion recognition promotes students’ self-awareness and self-regulation [39]. Teachers can assist students in becoming more conscious of their own learning preferences, abilities, and obstacles by giving them information on the emotional reactions they had during the learning process. Students can improve their general well-being and academic performance by learning how to manage stress, regulate their emotions, and adopt a good learning attitude by becoming more self-aware.

4. Software Description

Analyzing students’ emotions without the help of technology would be impractical, given the importance of considering emotions in teaching as described in the previous section. Facial expression identification can now be performed with outstanding accuracy thanks to advancements in artificial intelligence, particularly computer vision.
Whether studying in person or virtually, students spend most of their study time in front of a computer. This suggests that we can leverage their computers as data collection tools to examine their behavior. To accomplish this, we developed an application that runs on their devices. LearnerEmotions is a web application that can run on students’ computers and access their front-facing cameras in order to analyze facial expressions and detect emotions. Teachers or institutions can use this application to better understand their students by analyzing their emotions and using emotional data to optimize teaching techniques.
The current software is primarily based on an algorithm that uses a machine learning pretrained model to identify and store the emotions of students. The facial expression recognition model achieved an average accuracy of 0.91 and a loss of 0.3 in the experimental environment. Using JavaScript technology, the above process operates on the front end (student machine). Below we outline the key phases of this algorithm in general:
  • Access the student’s camera.
  • For each frame image, the algorithm crops the face first.
  • For each cropped face, the model detects the corresponding emotion.
  • For all frames in one second, the algorithm calculates the most dominant emotion.
  • The algorithm sends the most dominant emotions and the timestamp to the backend.
  • The backend algorithm receives the data and stores it in an object-oriented database linked to the specific student.
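The per-second aggregation and upload steps above can be sketched as follows. This is a minimal sketch, not the application’s actual code: the `/api/emotions` endpoint name and the payload shape are assumptions made for illustration.

```javascript
// Per-second aggregation: given the emotion labels the model predicted
// for each frame captured within one second, pick the most frequent one.
function dominantEmotion(frameLabels) {
  const counts = {};
  for (const label of frameLabels) {
    counts[label] = (counts[label] || 0) + 1;
  }
  // Return the label whose count is highest in this one-second window.
  return Object.entries(counts).reduce((best, cur) =>
    cur[1] > best[1] ? cur : best
  )[0];
}

// Upload step: POST the dominant emotion and a timestamp to the back end.
// The endpoint URL is illustrative, not the application's real API.
async function sendDominantEmotion(label) {
  await fetch('/api/emotions', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ emotion: label, timestamp: Date.now() }),
  });
}
```

In the real front end, `dominantEmotion` would be fed the per-frame predictions of the pretrained model once per second before the result is sent to the back end.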
To record or analyze students’ emotions, the teacher must first create a group and then add the students involved in this experience to this group. Students subsequently receive an invitation email to join the group, which they must accept in order to participate. After that, the teacher has to create a session in the application which should normally correspond to a real course session that will take place either in the classroom or remotely. Subsequently, they must specify the group related to this session. At the scheduled time, students open the application and enter the session, granting access to their front-facing cameras. Once access is granted, the application begins detecting their emotions from facial expressions. During the session, students can view their detected emotions in real time, and teachers can visualize students’ emotions through dashboards provided by the application.
The emotions detected by the application are not just displayed, but they are also saved in a database. This means that the teacher and even the student can permanently view this data after the end of the session. The application offers dashboards summarizing the session’s emotions. Additionally, teachers can download the raw data for analysis in other tools or applications.
Based on user accounts, the program secures access to data and interfaces. All users, whether instructors or students, must create an account and define a password. Before accessing their data and interfaces, users must complete an authentication process. Every protected feature begins with online identity verification. Additionally, the application handles updating user profiles and changing passwords.

4.1. Software Architecture

LearnerEmotions uses the MERN stack technology. MERN stands for MongoDB, Express, React, and Node, the four main building blocks of the stack. The application was developed in the form of two nodes that cooperate. The first node executes on the client-side machine (front end) and the other executes on the server side (back end). Therefore, the client-side application is designed to:
  • Display the web interfaces.
  • Access the client’s front-facing camera.
  • Process the video frames captured by the camera.
  • Send asynchronous HTTP requests to the back end.
The server side generally receives and processes HTTP requests from the clients in order to:
  • Send the public web content (HTML, CSS, and JavaScript files) to the front end.
  • Save or update data (users, groups, sessions, emotions, etc.) to the object database.
  • Request the data from the object database.
  • Return the data requested by the front end in JSON format.
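As an illustration of the save and retrieve responsibilities listed above, the back-end handler logic might look as follows. This is a sketch under stated assumptions: an in-memory array stands in for the MongoDB collection, and all field names are illustrative rather than the application’s actual schema.

```javascript
// In-memory store standing in for the MongoDB collection (illustrative).
const emotionStore = [];

// Save an emotion record; in the real back end this logic would sit
// behind an Express POST route and write to MongoDB.
function saveEmotion(record) {
  const { studentId, sessionId, emotion, timestamp } = record;
  emotionStore.push({ studentId, sessionId, emotion, timestamp });
  return { saved: true };
}

// Retrieve all records for one student and session, in JSON-ready form
// (an Express GET route would pass this result to res.json).
function getEmotions(studentId, sessionId) {
  return emotionStore.filter(
    (r) => r.studentId === studentId && r.sessionId === sessionId
  );
}
```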
Figure 1 illustrates the process of detecting and storing students’ emotions based on the architecture of the application.

4.2. Software Functionalities

LearnerEmotions was developed to provide teachers with the tools they need to manage teaching sessions while monitoring their students’ emotional states. At the same time, the software allows students to engage with the platform and participate in recording their feelings. In this section, we present the features available for both teachers and students.

4.2.1. Teacher Features

As teachers are primarily responsible for managing teaching sessions, whether in the classroom or in distance learning, we decided to assign them the task of managing emotion recordings. Therefore, the instructor role in the application is responsible for managing student groups, sessions, and overseeing the recording of emotions. It should be noted that instructors can only access their own groups and cannot access those of other instructors. The features offered by this software to teachers are listed below:
  • Create Groups of students: All teaching activities, whether in-person or through distance learning, take place within a framework that organizes students into groups. This feature allows the instructor to create and manage student groups according to their objectives or the arrangement used at their institution.
  • Add students to a group individually or from a CSV file: To control which students join a group, the teacher must first send invitations via email. The teacher enters the student data either by filling out a form for each student individually or by uploading a CSV file containing the necessary information (for cases where multiple students need to be added at once). This operation triggers the sending of emails to the students, inviting them to join the group.
  • Create Sessions and Define Their Nature: The teacher designs specific sessions during which students’ emotions are recorded. For example, a teacher may create a session for the UML module, another for the Java module, and so on. Each session must be created for a particular group, and the instructor needs to specify the learning activities, such as lectures, practical work, guided projects, etc.
  • Start and stop emotions recording: The teacher can control when students’ emotions are recorded for each session. This feature should be used while students are attending the session. The teacher can also pause and resume recording at any time during the session.
  • Supervise Students’ Emotions: During the session, the teacher can visualize in real time the emotional changes in each student in the session. This is made possible through the dashboards provided by the application.
  • Display Students’ Emotions After the Session Ends: The recognized emotions can be viewed not only during the session but also after its completion, as the emotion data is stored in a database. This allows the teacher to review student-specific dashboards for an in-depth analysis of emotions even after the session has concluded.
  • Download emotions data as a csv file: This feature allows teachers to analyze student emotion data more freely, without being limited to the application’s dashboard features. Instructors can download the raw data for use in other data analysis tools to gain more personalized insights.

4.2.2. Student Features

Because emotions must be recorded in relation to a given course session, the application allows students to access emotion recording sessions specific to each course session. Notably, a student cannot view the emotions of other students but can see their own recorded emotions. The following is a list of features available to students:
  • Join a Student Group: This feature allows the student to become part of a group and participate in emotion recording sessions. The student can join a group after receiving an invitation from the teacher and has the option to accept or decline the invitation.
  • Join Session: This feature allows the student to enter a session when an emotion recording is starting, enabling the system to begin recognizing their emotions.
  • Start Emotion Registration: This feature allows the student to begin recording their emotions during a specific session. This operation requires the student to grant the application access to the front-facing camera.
  • Display Emotions in Real Time: The student has access to a dashboard that shows their identified emotions and how they vary in real time.
  • Display Personal Emotion Data After the Session Ends: The student can review a visual representation of their emotions from any previously recorded session at any time.

4.2.3. Software Interface Examples

Figure 2 shows the student interface, which displays the student’s face in real time alongside a pie chart showing the percentages of their emotions during the session. These detected emotions are processed on the server side and stored in a database along with other parameters, such as the student, course title, course type, and more. This application is primarily designed for teachers who want to analyze their students’ emotions during course sessions, practical work, or any type of teaching activity. It allows teachers to view students’ emotions in real time, either in a simple mode or through detailed dashboards.
The software’s primary goal is to provide educators and educational institutions with tools to assess and analyze students’ emotions. By offering data extraction features, it allows for multiple use cases, enabling teachers to gain insights that meet their specific needs. Additionally, the application provides dashboards that display graphical charts, either in real time during sessions or after their completion. Figure 3 shows an example of a pie chart that a teacher can view for a student enrolled in a specific session, while Figure 4 illustrates how a student’s emotions change throughout a course session.
For each of the previous charts, two separate algorithms were developed to implement these features according to the architecture in use: one on the back end to retrieve data from the database, and one on the front end to request data from the back end and display it in the user interface. Below, we provide an overview of the back-end algorithm’s main steps for the pie chart feature, which was developed using the Express framework:
  • Send a request to the database to retrieve all rows specific to the current student and session.
  • Count the number of occurrences of each emotion.
  • Calculate the percentage of each emotion.
  • Return the percentage of each emotion.
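The counting and percentage steps above can be sketched as a single function (a minimal sketch; the record shape, with an `emotion` field per row, is an assumption):

```javascript
// Compute the percentage of each emotion over all records retrieved
// from the database for the current student and session.
function emotionPercentages(records) {
  const counts = {};
  for (const r of records) {
    counts[r.emotion] = (counts[r.emotion] || 0) + 1;
  }
  const percentages = {};
  for (const [emotion, n] of Object.entries(counts)) {
    percentages[emotion] = (n / records.length) * 100;
  }
  return percentages;
}
```

In the Express route, the returned object would be serialized with `res.json` and consumed by the front-end chart.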
The previous algorithm is invoked on the front end by the user to display the chart in a specific area of the browser, depending on the view where it is called. The general front-end algorithm for displaying a pie chart is as follows:
  • Asynchronously call the URL that provides emotion percentages, sending the student ID and session ID as parameters.
  • Display the received emotion percentages as a pie chart.
  • Repeat the process every second to update the chart in real time.
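The front-end steps above can be sketched as follows. The chart-data shape and the URL with its query parameters are assumptions for illustration, not the application’s real API.

```javascript
// Convert the percentages returned by the back end into the labels/values
// arrays a typical chart library expects (shape is illustrative).
function toPieChartData(percentages) {
  const labels = Object.keys(percentages);
  const values = labels.map((l) => percentages[l]);
  return { labels, values };
}

// Polling loop: fetch fresh percentages once per second and re-render
// the pie chart, as described in the steps above.
function startPieChartPolling(studentId, sessionId, renderChart) {
  return setInterval(async () => {
    const res = await fetch(
      `/api/emotions/percentages?student=${studentId}&session=${sessionId}`
    );
    renderChart(toPieChartData(await res.json()));
  }, 1000);
}
```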

5. Experiment

5.1. Method

The objective of this experiment is to use our application to analyze students’ emotions and investigate the relationship between learner emotions and academic performance. For this study, we hypothesized that positive emotions (joy, surprise) would correlate positively with academic performance and attendance, while negative emotions (anger, sadness, fear, disgust) would show negative correlations. We then used the application to record learners’ emotions in real time during all sessions of a machine learning course with a total time volume of 42 h of study (24 h of lectures and 18 h of practical work). All sessions were conducted in person, and both groups were taught by the same professor. The experiment was conducted in a single course module. While this provided a controlled environment, we acknowledge that the scope of the findings may be limited and may not fully reflect variations across different subjects.
In the application, two groups were created corresponding to the two groups in which students study this module. The students were added to their appropriate groups. They received an email inviting them to create an account and join the experiment of recording their emotions. All students agreed to join their group and were therefore able to participate in this experiment. During each teaching session, students and teachers accessed their spaces on the platform dedicated to this experiment. The teacher then started the recording session to begin capturing the students’ emotions during the current session. As a result, the emotions expressed by the students during all the sessions were captured in real time and saved in the application’s database.
All the emotions recorded in this experiment were retrieved from the database in raw format. These data were pre-processed and then integrated with the students’ grades and absences in order to examine the relationships between the emotional data on the one hand and performance and attendance on the other.

5.2. Sample Students

The students who participated in this experiment are second-year computer engineering students at the Faculty of Engineering Sciences of the Private University of Fez. Their total number is 65, divided into two groups: the first contains 31 students and the second 34. The average age of the students is 21 years, and they consist of 44 boys and 21 girls. We acknowledge that this relatively small and context-specific sample limits the generalizability of the findings.

5.3. Data Collection and Preprocessing

The data as mentioned above is stored in an object database. An Emotion class was defined to carry attributes specific to the recognized emotion in addition to the relationships with the student and the course. The data related to this class was retrieved and preprocessed to prepare it for analysis. Table 1 shows a sample of the preprocessed data.
In the first phase, the data was saved in a csv file, but manipulating this file proved very difficult given the enormous number of records (more than 9 million lines). Therefore, we saved the pre-processed data in a table in the database. This greatly facilitated its exploitation by setting up an algorithm to calculate percentages of emotions by student. We note that missing values were excluded during preprocessing, and outliers (e.g., extreme values due to detection errors) were filtered using statistical thresholds before correlation analysis.
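The per-student percentage computation described above can be sketched as follows. Field names and the missing-value rule are illustrative assumptions; the paper’s exact outlier thresholds are not reproduced here.

```javascript
// Aggregate raw per-second records into per-student emotion percentages,
// dropping rows with a missing emotion label (as in the preprocessing).
function percentagesByStudent(records) {
  const byStudent = {};
  for (const r of records) {
    if (!r.emotion) continue; // exclude missing values
    if (!byStudent[r.studentId]) byStudent[r.studentId] = [];
    byStudent[r.studentId].push(r.emotion);
  }
  const result = {};
  for (const [student, labels] of Object.entries(byStudent)) {
    const counts = {};
    for (const l of labels) counts[l] = (counts[l] || 0) + 1;
    result[student] = Object.fromEntries(
      Object.entries(counts).map(([e, n]) => [e, (n / labels.length) * 100])
    );
  }
  return result;
}
```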

6. Results

Learning is a complex process influenced by many factors, of which emotions play a central role. These emotional states can have a direct impact on students’ concentration, motivation, and, consequently, academic performance. With our application, artificial intelligence now makes it possible to analyze emotions in real time using facial recognition. In this context, we propose to exploit the data collected in this study to examine the relationship between emotional states and academic performance.
Therefore, we integrated the data on the percentages of the students’ emotions with two other parameters: the module grade and attendance. Table 2 presents a sample of the dataset for 5 students. The grade is out of 20 and the attendance is out of 28, the total number of sessions carried out (1 h 30 min per session). The values presented for the emotions represent the percentage of time each emotion was recorded for the student across all sessions. For example, student 1 showed the emotion “angry” for 11.38% of the time they spent in this module.
To determine the relationship between performance and all emotions, we calculated the correlation coefficient between emotions and grade. We also calculated the same metric between emotions and attendance. Table 3 presents the correlation values of emotions with either grade or attendance.
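The correlation coefficient used here is the standard Pearson coefficient. A minimal sketch of its computation over two per-student series (e.g., joy percentage vs. module grade) is:

```javascript
// Pearson correlation coefficient between two equal-length numeric arrays,
// e.g. per-student emotion percentages vs. per-student grades.
function pearson(xs, ys) {
  const n = xs.length;
  const mean = (a) => a.reduce((s, v) => s + v, 0) / n;
  const mx = mean(xs);
  const my = mean(ys);
  let cov = 0, vx = 0, vy = 0;
  for (let i = 0; i < n; i++) {
    const dx = xs[i] - mx;
    const dy = ys[i] - my;
    cov += dx * dy; // covariance numerator
    vx += dx * dx;  // variance numerator of xs
    vy += dy * dy;  // variance numerator of ys
  }
  return cov / Math.sqrt(vx * vy);
}
```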
The results indicate that certain emotions have a significant correlation with the academic performance of the students who participated in this experiment. In particular, joy and surprise, with correlations of 0.592 and 0.430, respectively, are linked to students who achieved good grades. This means that students experiencing positive emotions during the sessions, such as the joy of learning or curiosity about new content (expressed by surprise), are more likely to achieve good grades. On the other hand, negative emotions such as anger (−0.410), disgust (−0.439), sadness (−0.273), and fear (−0.282) are negatively correlated with grades, although the magnitude of this correlation fluctuates. These observations indicate that negative emotions affect concentration and motivation, leading to decreased academic performance. According to these findings, emotions like anger and disgust negatively influence academic performance.
Regarding class attendance, the correlations are significantly weaker, but they follow the same trend. This means that students who experience positive emotions or remain neutral are slightly more likely to attend classes regularly. However, negative emotions such as anger (−0.260), disgust (−0.176), and fear (−0.114) are weakly correlated with reduced attendance, meaning that these emotions can be considered as factors of gradual disengagement.
Although the correlation fluctuates from one emotion to another, we observed similar behavior among the emotions deemed positive (joy, surprise) and among those perceived as negative (anger, sadness, fear, disgust). Thus, in order to obtain a more accurate interpretation of the results and a more robust analysis, it is preferable to classify emotions into two large groups: positive emotions and negative emotions. This method makes it possible to appreciate the overall impact of positive and negative emotions on performance and attendance in class, while minimizing the statistical fluctuation caused by the distinct nuances of each emotion. Table 4 shows the transformation made to the dataset for the first 5 students as an example. The value of the positive emotion represents the sum of the values of the emotions considered positive (joy, surprise).
In order to analyze the results to establish the link between positive and negative emotions on the one hand, and academic success and class attendance on the other, we determined the correlation of these elements with the new values of emotions. Table 5 below presents the result of the calculation of the correlation coefficients.
By analyzing the results indicated in Table 5, a significant link between students’ emotions, their academic performance, and their attendance has been revealed. Specifically, in the correlation between positive emotions and academic performance, a strong positive correlation (0.699) was observed. This can be interpreted as an argument that confirms that students who express more joy and surprise tend to perform better. On the other hand, a strong negative correlation (−0.682) was established between negative emotions and scores. This means that emotions such as sadness, anger, or fear are linked to lower performance.
Concerning attendance, the relationship with positive emotions remains positive, but its value is very weak (0.179). This implies that students who experience positive feelings have a slightly greater tendency to attend class than students with negative emotions, although this impact is less significant than that on academic performance. On the other hand, a moderate negative correlation exists between negative emotions and class attendance (−0.312), indicating that students experiencing stress, worry, or other unpleasant feelings tend to be less diligent in class. This outcome may be associated with gradual disaffection or discomfort with the educational setting.
To graphically visualize the relationship between positive emotions and students’ academic performance, we plotted the data points representing the students in a plane with the percentage of positive emotions on the first axis and grades on the second axis. Figure 5 shows the result obtained. We notice in this figure that almost all students with more positive emotions have grades between good and excellent (Zone A). On the other hand, students with less positive emotions have varied grades, and a significant number have grades below average (Zone B).
To further investigate the link between emotions and academic performance, we categorized students into four distinct categories based on their achievements: students in serious difficulty (grades < 5), vulnerable students (5 ≤ grades < 10), satisfactory-performing students (10 ≤ grades < 15), and brilliant students (grades ≥ 15). This categorization facilitates a more precise observation of the progression of emotions across different degrees of success and the identification of the most fragile groups. By identifying these categories, we are able to understand more precisely how emotions affect the transition from failure to excellence, and to develop tailor-made pedagogical approaches for each type of profile.
For each student, we computed the difference between the shares of positive and negative emotions in order to examine each group separately. The larger this difference, the more that student’s positive emotions dominated; the lower (more negative) it was, the more negative emotions dominated. Table 6 illustrates this computation for the first five students.
We then examined the distribution of students according to whether this difference was positive or negative, grouping them by academic performance, and computed the mean and average deviation of the differences for each group. The results are shown in Table 7.
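The computation behind Tables 6 and 7 can be sketched as follows. This is an interpretation: the paper does not define “average deviation”, so the sketch assumes it denotes the mean absolute deviation around the group mean.

```python
from statistics import mean

# Positive/negative emotion shares (%) for the five sample students (Table 4).
emotions = {
    "Student 1": (0.59, 22.14),
    "Student 2": (0.14, 17.21),
    "Student 3": (3.69, 31.09),
    "Student 4": (20.33, 1.40),
    "Student 5": (16.00, 1.14),
}

# Table 6: signed difference between positive and negative shares.
diffs = {name: round(pos - neg, 2) for name, (pos, neg) in emotions.items()}

def average_deviation(xs):
    """Mean absolute deviation around the mean (assumed meaning of the term)."""
    m = mean(xs)
    return mean(abs(x - m) for x in xs)

# Table 7-style summary for one group of students:
group = list(diffs.values())
summary = {
    "positive_share_%": 100 * sum(d > 0 for d in group) / len(group),
    "mean": round(mean(group), 2),
    "average_deviation": round(average_deviation(group), 2),
}
print(summary)
```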
Analysis of the gap between positive and negative emotions reveals a distinct progression across grade groups, ranging from students with significant academic difficulties to those who excel.
A large majority (81.82%) of students with a grade of 5 or lower expressed primarily negative emotions. The mean of the positive-negative difference for this group is strongly negative (−18.13), indicating a clear preponderance of unfavorable feelings, while the average deviation of 10.29 signals a moderate dispersion of the emotions experienced. These findings indicate that students with major academic problems primarily experience negative emotions, which are likely to impair their motivation and engagement in the learning process.
Among students with grades between 5 and 10, the tendency toward negative emotions persists, though with a slight improvement: 70% of them present a negative difference, and the group mean has narrowed to −6.21. The average deviation is higher (11.81), suggesting greater variability in emotional experience. This indicates that some students are beginning to regulate their emotions, while most still struggle to experience positive feelings about their learning.
The group of students with grades between 10 and 15 represents a tipping point where positive emotions gradually begin to dominate: 60% of them show a positive difference, signifying an improvement in emotional well-being. The group mean is slightly above zero (1.96), but the considerable average deviation (14.01) reflects a large disparity within this group. This transition indicates that students with average to good performance are cultivating a more balanced mindset, which could strengthen their engagement and motivation.
Finally, students who scored above 15 clearly stand out for their preponderance of positive emotions: 85.71% of them show a positive difference, suggesting that they generally experience more positive emotions than negative ones. The group mean of 10.81 indicates significant emotional well-being, while the average deviation is the lowest (9.54), suggesting greater uniformity within this group. These observations suggest that excelling students demonstrate a more confident mindset, better stress management, and stronger motivation to succeed.

7. Discussion

7.1. Academic Performance

In this study, we investigated the influence of emotions on students’ academic achievement using our application. Overall, struggling students appear more prone to negative emotions, while better-performing students express mainly positive ones. The results therefore confirm that a learner’s emotional state plays a key role in the learning process, affecting not only motivation but also attention and engagement in class.
Additionally, the difficulties students encounter in their studies generate more frustration, anxiety, or discouragement. These emotions can lead to gradual disengagement over time: attention may decrease during class sessions, or students may develop an unfavorable perception of their own chances of success. This can create a harmful cycle in which diminished self-esteem accentuates negative emotions, which in turn hinders learning.
Conversely, students who perform well tend to display more positive emotions, such as joy or curiosity. These feelings foster a mindset conducive to learning by stimulating the desire to discover, understand, and persevere in the face of obstacles.
All of this makes clear that an emotionally positive environment is essential for learning success, as it allows learners to manage their stress effectively and stay positive in the face of learning challenges. Emotion management also appears to affect class attendance: students who experience primarily positive emotions tend to attend regularly, while those dominated by negative emotions generally attend less. This may be because an unfavorable emotional state makes learning harder, prompting some students to avoid situations that cause them stress or discomfort.
Our findings resonate with Pekrun’s Control-Value Theory of Achievement Emotions [40], which posits that positive activating emotions (e.g., joy, curiosity) enhance motivation and performance, whereas negative emotions undermine engagement. This theoretical lens supports the interpretation of our correlations and provides a psychological basis for understanding the role of emotions in learning.
These observations highlight the need for an educational environment that considers students’ emotional state. Implementing tactics to create a more pleasant classroom atmosphere, such as interactive teaching approaches, available psychological support, or stress management methods, could not only enhance students’ well-being but also optimize their academic performance.
Finally, we acknowledge that the present study employed correlation analysis as an exploratory step. While this provides initial insights, more sophisticated approaches, such as regression models, classification algorithms, or machine-learning validation, could yield deeper predictive insights. We intend to explore these methods in future work.

7.2. Impact of the Software

The solution proposed in this project opens new perspectives for a better understanding of students. Using a set of criteria, the application evaluates students’ emotions, enabling transversal research based on the collected data. This approach helps answer a variety of questions regarding the assessment of teaching and learning processes, an evaluation that was not previously feasible in traditional classroom settings. Universities and educational institutions can gain deeper insights into their students by using the application for emotional analysis. As explained in the first section, research has already demonstrated that understanding learner emotions can help increase learner engagement [33,34], apply personalized learning [35,36], provide struggling students with help [37,38], promote students’ self-awareness and self-regulation [39], and provide valuable insights for creating more effective instructional materials and strategies [1]. Below, we cite cases where the application can be particularly useful:
  • For teachers, who are typically interested in their students’ emotional states during face-to-face instruction, the application makes it possible to adjust teaching methods accordingly. Doing this unaided is not always feasible, particularly in large classes. Moreover, as machines increasingly take over instructional tasks, the application can help teachers stay aware of students’ emotions.
  • Classifying Students According to Their Emotional State and Correlating with Other Parameters: The purpose of this software is to store all emotional information in an object database. By allowing open data utilization, it enables further studies. The recorded data can be imported and analyzed in various contexts. For example, unsupervised learning can be used to identify groups of students with similar emotional patterns, which can then be correlated with other variables, such as grades in hard or soft skill modules.
  • Analyzing Changes in Students’ Emotions According to Content: For the same student, it is possible to analyze whether their emotions differ across various modules. This can provide an emotional diagnosis by module, helping to address issues with the student. For example, if a student consistently feels sad or angry in certain modules but happy in others, educators can discuss the underlying reasons with the student to promote greater engagement in the less positive modules.
  • Analyzing the Variation in Students’ Emotions by Educational Content (Course, Practical Work, Guided Project, etc.): Pedagogical techniques are fundamental to effective teaching and learning. Analyzing students’ emotions can help determine the effectiveness of these strategies. This application serves as a valuable tool for teachers to evaluate their teaching approaches by collecting emotional feedback from their students.
  • Detecting Extreme Cases: The algorithm can automatically identify students who exhibit emotions that differ significantly from those of their peers or who consistently display negative emotions across all modules. This feature helps pinpoint such individuals so that the underlying causes of their low emotional engagement can be addressed.
  • Reporting Proactive Alerts: Because emotion data is recorded in raw form, users retain the flexibility to apply analysis techniques of their choice, including rules that proactively flag concerning emotional patterns.
  • Bringing Distance Teaching Closer to Classroom Teaching: Many limitations were observed in distance learning by both educators and students during the COVID-19 period. This software can help address some of these gaps, allowing teachers and students to feel more comfortable and engaged in distance learning sessions.
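The unsupervised-learning use case sketched above (grouping students with similar emotional patterns) could look like the following. This is an illustrative, dependency-free 2-means clustering on the positive/negative emotion shares from the Table 4 sample; a production system would more likely use a library such as scikit-learn.

```python
from statistics import mean

# Positive/negative emotion shares (%) per student (sample values, Table 4).
points = {
    "Student 1": (0.59, 22.14),
    "Student 2": (0.14, 17.21),
    "Student 3": (3.69, 31.09),
    "Student 4": (20.33, 1.40),
    "Student 5": (16.00, 1.14),
}

def kmeans2(pts, iters=20):
    """Cluster 2-D points into two groups with plain Lloyd iterations."""
    names = list(pts)
    centers = [pts[names[0]], pts[names[-1]]]  # seed with first and last student
    assign = {}
    for _ in range(iters):
        # Assignment step: nearest centre by squared Euclidean distance.
        assign = {
            n: min((0, 1), key=lambda k: (p[0] - centers[k][0]) ** 2
                                         + (p[1] - centers[k][1]) ** 2)
            for n, p in pts.items()
        }
        # Update step: move each centre to the mean of its members.
        for k in (0, 1):
            members = [pts[n] for n, c in assign.items() if c == k]
            if members:
                centers[k] = (mean(m[0] for m in members),
                              mean(m[1] for m in members))
    return assign

clusters = kmeans2(points)
print(clusters)  # separates the mostly-negative from the mostly-positive profiles
```

The resulting clusters can then be correlated with other variables, such as grades in hard- or soft-skill modules, as described in the second use case.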
Emotion recognition is a topic of significant interest in the field of computer vision. Considerable research has been conducted on this subject, particularly in developing effective machine learning models and applying these findings to practical use cases, such as the application described in this paper. To illustrate how our software differs from other published works, we compare existing emotion recognition applications across several features in Table 8. These features are as follows:
  • Language: The technologies and programming languages used to develop the software.
  • Display on Video: Indicates whether the application shows the identified emotions in real time on the video.
  • Teaching and Learning Workflow: Indicates whether the application supports aspects of the teaching and learning workflow, such as organizing classes and student groups, and recording emotions according to the type of session (lecture, lab, guided project, etc.).
  • Users Involved in Teaching and Learning: Indicates whether the application provides user accounts specific to educators, learners, and other participants in the teaching and learning process.
  • Dashboards: Indicates whether the software provides features that allow educators or other stakeholders to view charts visually depicting emotions in dashboards.
  • Data storage: Indicates whether the application provides features to store emotional data in files, databases, or other types of storage.
  • Use Outside the Teaching and Learning Context: Indicates whether the application can be used to detect emotions in other contexts beyond teaching and learning.

7.3. Limitations

Although the proposed approach shows promising results, several limitations remain and should be addressed in future work:
  • One limitation of this study lies in the restricted dataset, as the experiment was conducted with 65 computer engineering students from a single Moroccan university. Consequently, the generalizability of the findings is limited, and further studies involving larger and more diverse populations across multiple universities and disciplines are necessary.
  • Another limitation concerns the fact that our experiment was carried out on a single course module. Emotional reactions and their influence on performance may vary across disciplines and types of learning activities. Future studies should therefore extend the experimentation to different subjects and pedagogical contexts.
  • This study did not stratify results by demographic variables such as gender or age, nor by prior academic performance. These factors may influence the relationship between emotions and grades, and they will be considered in future studies to refine our understanding.
  • Although multimodal information (e.g., voice, body language, physiological signals) is important for robust emotion detection, this study focused exclusively on facial expressions. Future work will extend LearnerEmotions to integrate multimodal signals.
  • Regarding software, a limitation is the lack of analysis regarding scalability, cost, and hardware dependency. LearnerEmotions is lightweight and web-based, requiring only a webcam-enabled computer, which facilitates deployment. However, a formal analysis of scalability and costs will be included in future work.
  • Although the model used in LearnerEmotions demonstrated an accuracy of 0.91 in real deployment tests, this version of the software did not include a comprehensive evaluation of performance metrics such as latency and error rates. Future work will incorporate a real-time monitoring system to assess and ensure the reliability and efficiency of emotion detection during live sessions.
  • It is important to note that cultural differences in interpreting facial expressions may influence emotion recognition. To extend the use of our application to students from other cultures, cross-cultural validation is needed to ensure robustness and general applicability of the system.

8. Conclusions

This research highlights the importance of considering learner emotions as a factor that directly impacts academic success and students’ engagement in their training. The results of this study indicate, through an experimental approach, that struggling students experience more negative emotions, which can harm their motivation and concentration, while more successful students show a higher prevalence of positive emotions, contributing to more effective learning. The effect of emotions also manifests in attendance and engagement: students who developed positive emotions tend to attend scheduled classes more regularly.
Artificial intelligence applied to emotion analysis in education promises improvements on several levels, notably personalized learning and enhanced pedagogy. The application we developed supports real-time recognition of students’ emotional states, giving teachers and institutions a valuable tool for adapting their teaching approaches to learners’ emotional needs.
These results underline the need to consider learners’ emotional well-being within educational systems, whether in face-to-face or distance courses. To build on this work, pedagogical approaches and support mechanisms should be developed that promote a positive learning environment in which students feel both stimulated and supported. In future work, we will examine, using our application, the most effective strategies to help learners better regulate their emotions, thus contributing to their academic success and personal development.
Future work will focus on extending this research in several directions. First, we aim to increase the sample size and include students from different universities and disciplines to enhance generalizability. Second, we plan to integrate multimodal data sources (e.g., voice, body language, physiological signals) into the system for more robust emotion recognition. Additionally, technical improvements will include real-time monitoring of performance metrics such as accuracy and latency, as well as a detailed analysis of scalability and deployment costs. Finally, cross-cultural validation will be conducted to ensure applicability in diverse educational contexts.

Author Contributions

Conceptualization, Z.I. and N.E.B.; methodology, Z.I. and N.E.B.; software, N.E.B.; validation, Z.I. and M.O.J.; formal analysis, M.O.J.; investigation, Z.I.; resources, M.O.J.; data curation, N.E.B.; writing—original draft preparation, N.E.B.; writing—review and editing, Z.I. and M.O.J.; visualization, M.O.J.; supervision, Z.I.; project administration, Z.I.; funding acquisition, Z.I. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by “Centre National pour la Recherche Scientifique et Technique”, under grant number SHSE-2021/49.

Institutional Review Board Statement

This study was conducted in compliance with the principles of the Declaration of Helsinki (1975, revised in 2013). According to Moroccan national legislation, specifically Law No. 28-13 (2015), ethics committee approval is required only for interventional studies involving clinical procedures, drugs, or biological sampling. Since the present study is non-interventional, observational, and educational in nature, and involved only anonymized behavioral data from adult student participants, it does not fall under the scope of Law No. 28-13 and therefore did not require formal Institutional Review Board approval. All participants provided informed consent prior to participation. All data were anonymized at collection by replacing student identifiers with pseudonymous codes. Data were stored securely in encrypted databases accessible only to the research team, ensuring participant privacy and confidentiality.

Informed Consent Statement

All individuals provided their informed consent. Participants received complete disclosures about the nature, goals, and confidentiality of their data, as well as information about their right to discontinue participation at any time without repercussions.

Data Availability Statement

The datasets presented in this article are not readily available due to privacy restrictions. Requests to access the datasets should be directed to the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Linnenbrink-Garcia, L.; Patall, E.A.; Pekrun, R. Adaptive Motivation and Emotion in Education: Research and Principles for Instructional Design. Policy Insights Behav. Brain Sci. 2016, 3, 228–236. [Google Scholar] [CrossRef]
  2. Khare, S.K.; Blanes-Vidal, V.; Nadimi, E.S.; Acharya, U.R. Emotion Recognition and Artificial Intelligence: A Systematic Review (2014–2023) and Research Recommendations. Inf. Fusion 2024, 102, 102019. [Google Scholar] [CrossRef]
  3. Mohana, M.; Subashini, P. Facial Expression Recognition Using Machine Learning and Deep Learning Techniques: A Systematic Review. SN Comput. Sci. 2024, 5, 432. [Google Scholar] [CrossRef]
  4. Canal, F.Z.; Müller, T.R.; Matias, J.C.; Scotton, G.G.; de Sa Junior, A.R.; Pozzebon, E.; Sobieranski, A.C. A Survey on Facial Emotion Recognition Techniques: A State-of-the-Art Literature Review. Inf. Sci. 2022, 582, 593–617. [Google Scholar] [CrossRef]
  5. El Bahri, N.; Itahriouan, Z.; Abtoy, A.; Belhaouari, S.B. Using Convolutional Neural Networks to Detect Learner’s Personality Based on the Five Factor Model. Comput. Educ. Artif. Intell. 2023, 5, 100163. [Google Scholar] [CrossRef]
  6. Gursesli, M.C.; Lombardi, S.; Duradoni, M.; Bocchi, L.; Guazzini, A.; Lanata, A. Facial Emotion Recognition (FER) Through Custom Lightweight CNN Model: Performance Evaluation in Public Datasets. IEEE Access 2024, 12, 45543–45559. [Google Scholar] [CrossRef]
  7. Elsheikh, R.A.; Mohamed, M.A.; Abou-Taleb, A.M.; Ata, M.M. Improved Facial Emotion Recognition Model Based on a Novel Deep Convolutional Structure. Sci. Rep. 2024, 14, 29050. [Google Scholar] [CrossRef] [PubMed]
  8. Agung, E.S.; Rifai, A.P.; Wijayanto, T. Image-Based Facial Emotion Recognition Using Convolutional Neural Network on Emognition Dataset. Sci. Rep. 2024, 14, 14429. [Google Scholar] [CrossRef]
  9. Singh, Y.B.; Goel, S. A Systematic Literature Review of Speech Emotion Recognition Approaches. Neurocomputing 2022, 492, 245–263. [Google Scholar] [CrossRef]
  10. Schuller, D.M.; Schuller, B.W. A Review on Five Recent and Near-Future Developments in Computational Processing of Emotion in the Human Voice. Emot. Rev. 2021, 13, 44–50. [Google Scholar] [CrossRef]
  11. Turcian, D.; Stoicu-Tivadar, V. Speech Emotion Recognition Using openSMILE and GPT 3.5 Transformer. In Studies in Health Technology and Informatics; Mantas, J., Hasman, A., Demiris, G., Saranto, K., Marschollek, M., Arvanitis, T.N., Ognjanović, I., Benis, A., Gallos, P., Zoulias, E., et al., Eds.; IOS Press: Amsterdam, The Netherlands, 2024; ISBN 978-1-64368-533-5. [Google Scholar]
  12. Yuan, X.; Wong, W.P.; Lam, C.T. Speech Emotion Recognition Using Multi-Layer Perceptron Classifier. In Proceedings of the 2022 IEEE 10th International Conference On Information, Communication And Networks (ICICN), Zhangye, China, 19–22 August 2022; pp. 644–648. [Google Scholar]
  13. Zhao, Y.; Shu, X. Speech Emotion Analysis Using Convolutional Neural Network (CNN) and Gamma Classifier-Based Error Correcting Output Codes (ECOC). Sci. Rep. 2023, 13, 20398. [Google Scholar] [CrossRef]
  14. Trinh Van, L.; Dao Thi Le, T.; Le Xuan, T.; Castelli, E. Emotional Speech Recognition Using Deep Neural Networks. Sensors 2022, 22, 1414. [Google Scholar] [CrossRef]
  15. Ahmed, F.; Bari, A.H.; Gavrilova, M.L. Emotion Recognition from Body Movement. IEEE Access 2019, 8, 11761–11781. [Google Scholar] [CrossRef]
  16. Wu, J.; Zhang, Y.; Sun, S.; Li, Q.; Zhao, X. Generalized Zero-Shot Emotion Recognition from Body Gestures. Appl. Intell. 2022, 52, 8616–8634. [Google Scholar] [CrossRef]
  17. Elansary, L.; Taha, Z.; Gad, W. Survey on Emotion Recognition through Posture Detection and the Possibility of Its Application in Virtual Reality. arXiv 2024, arXiv:2408.01728. [Google Scholar] [CrossRef]
  18. Försterling, M.; Gerdemann, S.; Parkinson, B.; Hepach, R. Exploring the Expression of Emotions in Children’s Body Posture Using OpenPose. In Proceedings of the Annual Meeting of the Cognitive Science Society, Rotterdam, The Netherlands, 26–29 July 2024; Volume 46. [Google Scholar]
  19. Liu, Y.; Li, X.; Wang, M.; Bi, J.; Lin, S.; Wang, Q.; Yu, Y.; Ye, J.; Zheng, Y. Multimodal Depression Recognition and Analysis: Facial Expression and Body Posture Changes via Emotional Stimuli. J. Affect. Disord. 2025, 381, 44–54. [Google Scholar] [CrossRef]
  20. Saganowski, S.; Perz, B.; Polak, A.G.; Kazienko, P. Emotion Recognition for Everyday Life Using Physiological Signals from Wearables: A Systematic Literature Review. IEEE Trans. Affect. Comput. 2022, 14, 1876–1897. [Google Scholar] [CrossRef]
  21. Lin, W.; Li, C. Review of Studies on Emotion Recognition and Judgment Based on Physiological Signals. Appl. Sci. 2023, 13, 2573. [Google Scholar] [CrossRef]
  22. Chandra, V.; Priyarup, A.; Sethia, D. Comparative Study of Physiological Signals from Empatica E4 Wristband for Stress Classification. In Proceedings of the Advances in Computing and Data Sciences, Nashik, India, 23–24 April 2021; Singh, M., Tyagi, V., Gupta, P.K., Flusser, J., Ören, T., Sonawane, V.R., Eds.; Springer International Publishing: Cham, Switzerland, 2021; pp. 218–229. [Google Scholar]
  23. Ronca, V.; Martinez-Levy, A.C.; Vozzi, A.; Giorgi, A.; Aricò, P.; Capotorto, R.; Borghini, G.; Babiloni, F.; Di Flumeri, G. Wearable Technologies for Electrodermal and Cardiac Activity Measurements: A Comparison between Fitbit Sense, Empatica E4 and Shimmer GSR3+. Sensors 2023, 23, 5847. [Google Scholar] [CrossRef] [PubMed]
  24. Nandini, D.; Yadav, J.; Singh, V.; Mohan, V.; Agarwal, S. An Ensemble Deep Learning Framework for Emotion Recognition through Wearable Devices Multi-Modal Physiological Signals. Sci. Rep. 2025, 15, 17263. [Google Scholar] [CrossRef]
  25. Wang, Z.; Wang, Y. Emotion Recognition Based on Multimodal Physiological Electrical Signals. Front. Neurosci. 2025, 19, 1512799. [Google Scholar] [CrossRef]
  26. Wang, J.; Wang, M. Review of the Emotional Feature Extraction and Classification Using EEG Signals. Cogn. Robot. 2021, 1, 29–40. [Google Scholar] [CrossRef]
  27. Jafari, M.; Shoeibi, A.; Khodatars, M.; Bagherzadeh, S.; Shalbaf, A.; García, D.L.; Gorriz, J.M.; Acharya, U.R. Emotion Recognition in EEG Signals Using Deep Learning Methods: A Review. Comput. Biol. Med. 2023, 165, 107450. [Google Scholar] [CrossRef] [PubMed]
  28. Ahmadzadeh Nobari Azar, N.; Cavus, N.; Esmaili, P.; Sekeroglu, B.; Aşır, S. Detecting Emotions through EEG Signals Based on Modified Convolutional Fuzzy Neural Network. Sci. Rep. 2024, 14, 10371. [Google Scholar] [CrossRef]
  29. Wang, Q.; Wang, M.; Yang, Y.; Zhang, X. Multi-Modal Emotion Recognition Using EEG and Speech Signals. Comput. Biol. Med. 2022, 149, 105907. [Google Scholar] [CrossRef] [PubMed]
  30. Hökkä, P.; Vähäsantanen, K.; Paloniemi, S. Emotions in Learning at Work: A Literature Review. Vocat. Learn. 2020, 13, 1–25. [Google Scholar] [CrossRef]
  31. Salazar, C.; Aguilar, J.; Monsalve-Pulido, J.; Montoya, E. Affective Recommender Systems in the Educational Field. A Systematic Literature Review. Comput. Sci. Rev. 2021, 40, 100377. [Google Scholar] [CrossRef]
  32. Yadegaridehkordi, E.; Noor, N.F.B.M.; Ayub, M.N.B.; Affal, H.B.; Hussin, N.B. Affective Computing in Education: A Systematic Review and Future Research. Comput. Educ. 2019, 142, 103649. [Google Scholar] [CrossRef]
  33. Thomas, C.L.; Allen, K. Driving Engagement: Investigating the Influence of Emotional Intelligence and Academic Buoyancy on Student Engagement. J. Furth. High. Educ. 2021, 45, 107–119. [Google Scholar] [CrossRef]
  34. Zhang, K.; Wu, S.; Xu, Y.; Cao, W.; Goetz, T.; Parks-Stamm, E.J. Adaptability Promotes Student Engagement under COVID-19: The Multiple Mediating Effects of Academic Emotion. Front. Psychol. 2021, 11, 633265. [Google Scholar] [CrossRef]
  35. Dipace, A.; Loperfido, F.F.; Scarinci, A. Analysing Emotions to Personalise Learning for EduOpen Moocs’ Platform. In Proceedings of the Conference Proceedings The Online, Open and Flexible Higher Education Conference, Begin, China, 10–12 October 2018; pp. 9–17. [Google Scholar]
  36. Lim, L.-A.; Dawson, S.; Gašević, D.; Joksimović, S.; Pardo, A.; Fudge, A.; Gentili, S. Students’ Perceptions of, and Emotional Responses to, Personalised Learning Analytics-Based Feedback: An Exploratory Study of Four Courses. Assess. Eval. High. Educ. 2021, 46, 339–359. [Google Scholar] [CrossRef]
  37. Beltman, S.; Poulton, E. “Take a Step Back”: Teacher Strategies for Managing Heightened Emotions. Aust. Educ. Res. 2019, 46, 661–679. [Google Scholar] [CrossRef]
  38. Klopfer, K.M.; Scott, K.; Jenkins, J.; Ducharme, J. Effect of Preservice Classroom Management Training on Attitudes and Skills for Teaching Children with Emotional and Behavioral Problems: A Randomized Control Trial. Teach. Educ. Spec. Educ. 2019, 42, 49–66. [Google Scholar] [CrossRef]
  39. Mertens, E.C.; Deković, M.; Van Londen, M.; Reitz, E. Parallel Changes in Positive Youth Development and Self-Awareness: The Role of Emotional Self-Regulation, Self-Esteem, and Self-Reflection. Prev. Sci. 2022, 23, 502–512. [Google Scholar] [CrossRef]
  40. Pekrun, R. The Control-Value Theory of Achievement Emotions: Assumptions, Corollaries, and Implications for Educational Research and Practice. Educ. Psychol. Rev. 2006, 18, 315–341. [Google Scholar] [CrossRef]
  41. McGrath, J.; Nnamoko, N. TrackEd: An Emotion Tracking Tool for e-Meeting Platforms. Softw. Impacts 2023, 17, 100560. [Google Scholar] [CrossRef]
  42. Kulas, D.; Wrobel, M.R. AffecTube—Chrome Extension for YouTube Video Affective Annotations. SoftwareX 2023, 23, 101504. [Google Scholar] [CrossRef]
  43. Churaev, E.; Savchenko, A.V. A Standalone Software for Real-Time Facial Analysis in Online Conferences and e-Lessons. Softw. Impacts 2023, 16, 100507. [Google Scholar] [CrossRef]
  44. Savchenko, A.V. HSEmotion: High-Speed Emotion Recognition Library. Softw. Impacts 2022, 14, 100433. [Google Scholar] [CrossRef]
Figure 1. Illustration of emotion recognition based on the architecture of LearnerEmotions.
Figure 2. Student Emotion Recording and Real-Time Display Interface.
Figure 3. Pie chart of a student’s emotions during a session.
Figure 4. Emotion Variation Curve of a Student During a Session.
Figure 5. Relationship between positive emotion and Academic performance.
Table 1. Sample of raw data.

| Student Id | Course Id | Timestamp | Dominant Emotion |
|---|---|---|---|
| 112 | 12 | 25 March 2024 14:32:13 | Neutral |
| 112 | 12 | 25 March 2024 14:32:14 | Neutral |
| 109 | 12 | 25 March 2024 14:32:13 | Fear |
| 109 | 12 | 25 March 2024 14:32:14 | Neutral |
| 109 | 12 | 25 March 2024 14:32:15 | Surprised |
Table 2. Sample of the dataset related to 5 students.

| | Angry | Happy | Sad | Surprised | Afraid | Disgust | Neutral | Grade/20 | Attendance/28 |
|---|---|---|---|---|---|---|---|---|---|
| Student 1 | 11.38 | 0 | 2.13 | 0.59 | 6.38 | 2.25 | 77.27 | 5 | 26 |
| Student 2 | 1.7 | 0.04 | 14.33 | 0.1 | 0.44 | 0.74 | 82.65 | 12 | 28 |
| Student 3 | 24.41 | 2.73 | 3.13 | 0.96 | 1.64 | 1.91 | 65.22 | 5 | 24 |
| Student 4 | 0.59 | 14.23 | 0.63 | 6.10 | 0 | 0.18 | 78.27 | 16 | 26 |
| Student 5 | 0.59 | 14.87 | 0.1 | 1.13 | 0.03 | 0.42 | 82.86 | 14 | 25 |
Table 3. Correlation coefficient between emotions and grade.

| | Angry | Happy | Sad | Surprised | Afraid | Disgust | Neutral |
|---|---|---|---|---|---|---|---|
| Grades | −0.410 | 0.592 | −0.273 | 0.430 | −0.282 | −0.439 | −0.044 |
| Attendance | −0.260 | 0.131 | −0.086 | 0.136 | −0.114 | −0.176 | 0.213 |
Table 4. Sample of Students’ emotions according to new classes.

| | Positive | Negative |
|---|---|---|
| Student 1 | 0.59 | 22.14 |
| Student 2 | 0.14 | 17.21 |
| Student 3 | 3.69 | 31.09 |
| Student 4 | 20.33 | 1.4 |
| Student 5 | 16 | 1.14 |
Table 5. Correlation between academic performance and binary emotions classes.

| | Positive | Negative |
|---|---|---|
| Grade | 0.699 | −0.682 |
| Attendance | 0.179 | −0.312 |
Table 6. Difference between positive and negative emotions (Sample of 5 students).

| | Positive | Negative | Difference |
|---|---|---|---|
| Student 1 | 0.59 | 22.14 | −21.55 |
| Student 2 | 0.14 | 17.21 | −17.07 |
| Student 3 | 3.69 | 31.09 | −27.4 |
| Student 4 | 20.33 | 1.4 | 18.93 |
| Student 5 | 16 | 1.14 | 14.86 |
Table 7. Distribution of students based on their positive or negative difference.

| Students’ Sample (Grade Range) | Positive Difference (%) | Negative Difference (%) | Mean | Average Deviation |
|---|---|---|---|---|
| ≤5 | 18.18 | 81.82 | −18.13 | 10.29 |
| >5 to ≤10 | 30.00 | 70.00 | −6.21 | 11.81 |
| >10 to ≤15 | 60.00 | 40.00 | 1.96 | 14.01 |
| >15 | 85.71 | 14.29 | 10.81 | 9.54 |
Table 8. Comparison to other Applications.
NatureLanguageDisplay on the VideoTeaching and Learning WorkflowTeaching and Learning UsersDashboardsData StorageUse Outside Teaching and Learning Context
TrackEd [41]DesktopPython 3.8××××
AffecTube [42]Chrome ExtensionPython, JavaScript××××
Churaev & Savchenko [43]DesktopC++, Qt, Apache TVM, OpenCV××××
HSEmotion [44]MobilePython××××
Learner EmotionsWeb App.JavaScript×
√: The feature exists. ×: the feature does not exist.