An Emotion-Aware Learning Analytics System Based on Semantic Task Automation

Abstract: E-learning has become a critical factor in the academic environment due to the wide range of possibilities that it opens for the learning context. However, these platforms often increase the difficulties of communication between teachers and students. Without real contact, teachers find it harder to adapt their methods and content to their students, while students find it harder to maintain their focus. This paper aims to address this challenge through the use of emotion and engagement recognition techniques. We propose an emotion-aware e-learning platform architecture that recognizes students' emotions and attention in order to improve their academic performance. The system integrates a semantic task automation system that allows users to easily create and configure their own automation rules to adapt the study environment. The main contributions of this paper are: (1) the design of an emotion-aware learning analytics architecture; (2) the integration of this architecture into a semantic task automation platform; and (3) the validation of the use of emotion recognition in the e-learning platform using the partial least squares structural equation modeling (PLS-SEM) methodology.


Introduction
Nowadays, e-learning [1] is a key factor in the learning context, due to the great number of advantages and possibilities that it entails in different academic environments (i.e., distance learning, self-directed learning, or face-to-face learning) [2]. For this reason, it has inspired the research and development efforts of numerous companies and universities that seek to improve different learning methods in order to obtain better academic performance [3]. However, these platforms also entail specific challenges [4]. One of these challenges is that maintaining focus and motivation becomes difficult for certain students [5]. In addition, these platforms hinder communication between teachers and students [6]. In this kind of learning, teachers find it hard to know their students and to check whether the methods and contents are appropriate for them. This is particularly important in recent years, as the analysis of the role that emotions play in academic environments is gaining momentum [7]. In this context, two technologies play a key role: emotion-aware systems [8] and learning analytics [9] techniques. Emotion-aware systems, based on cognitive computing [10], are capable of detecting human emotions through techniques such as image recognition or speech analysis. By using these systems, human emotions can be detected on a larger scale, with better results and fewer human-machine interactions. Learning analytics systems are fed with data from the learning environment to conduct a comprehensive analysis using machine learning [11] techniques and algorithms. Moreover, these systems can be improved thanks to a better understanding of the relevance of, and relationships between, the different factors that appear in the learning process. However, interoperability between different platforms and components has become a key issue in these systems [12]. The vast amount of generated data requires highly agile data models to support better decisions and other actionable results. These models enable monitoring, processing, optimizing, and analyzing data to gain insights [13]. Semantic technologies offer a suitable approach to interoperability: by sharing common vocabularies, they enable the interoperable representation of inferred data.
In order to address the above challenges and advance the state of the art in e-learning platforms, we propose an emotion-aware system with learning analytics capabilities integrated into a semantic task automation platform. The main goal is to improve the learning process by including emotion and attention recognition techniques [14]. The proposed system recognizes students' emotions and engagement, allowing educators to take this information into account when adapting their content and methodologies. Furthermore, taking advantage of the growing number of intelligent devices, the platform also integrates ambient intelligence technologies to adapt the environment to the students' mood [15]. This is accomplished by integrating an emotion-aware task automation platform called EWE Tasker, developed in a previous work [16]. This platform offers an easy way to integrate smart devices and services in an environment and use them for automating daily tasks. Thus, the design and development of an emotion-aware architecture for e-learning platforms is the main contribution of this paper. In addition, a semantic task automation platform has been integrated into our system in order to provide it with mood and engagement regulation through ambient intelligence techniques. Finally, the usefulness of an e-learning platform that takes students' emotions and attention into account has been validated using the partial least squares structural equation modeling (PLS-SEM) methodology [17].
The rest of this paper is organized as follows. First, an overview of the related work in emotion-aware e-learning systems and learning analytics is given in Section 2. Then, Section 3 describes the reference architecture of the proposed emotion-aware e-learning system, describing the main components and modules as well as their implementation. Section 4 describes the case studies defined for the proposed architecture. The validation of the use of an emotion-aware e-learning system for improving academic performance is described in Section 5. Finally, the conclusions drawn from this work, along with possible future lines of work, are described in Section 6.

Background
This section describes the background and related work for the system proposed in this paper. First, Section 2.1 describes the motivation and objectives of the study of emotions related to the learning process. Then, an overview of the most widely used emotion recognition tools is given in Section 2.2. Section 2.3 presents the related work on emotion-aware e-learning platforms. Finally, Section 2.4 gives an overview of the state of the art in learning analytics.

Emotion Theories Applied to Education
Over the last years, emotions in the learning process have become an important field of study for psychologists. It is well known that emotions influence the way people act [18]. In the academic context, the connection between emotions and encouragement has been proven [19]. Understanding and managing emotions is therefore a key factor in improving academic results, due to the close relationship between attention and performance [20]. The individual characteristics of each student produce differences in cognition, motivation, and learning styles [21], and consequently in their learning outcomes [22].
Pekrun et al. [23] proposed a model of how emotions influence students' learning and achievement. This model studies the effect of mood on performance and how cognitive and motivational mechanisms may mediate it. The authors state that the influence of emotions can be mediated by several mechanisms with cumulative or contradictory effects, so that predicting overall effects on performance poses a challenge. The study of Ranellucci et al. [24] provided empirical support for this model, evaluating the relations between achievement goals, emotions, learning strategies, and performance, and showing both direct and indirect effects of students' goals on academic achievement through their emotions and learning strategies. This study presented the benefits of mastery-approach goals for students' emotions and demonstrated that enjoyment was beneficial for most learning strategies, while boredom predicted poorer time management and anxiety predicted lower self-monitoring. In addition, performance-approach goals predicted less critical thinking, and performance-avoidance goals predicted more anxiety, boredom, and critical thinking, and lower achievement gains.
In addition, an affective model for education was proposed by Kort et al. [14]. This model argues that a typical learning experience involves a range of positive and negative emotions. According to the authors, this range of emotions occurs naturally and is an inevitable part of the learning cycle, so it is important to help students understand and recognize this process and propel themselves through it. The Social Emotional Development (SED) model, proposed by Seal et al. [25], links the literature on social emotional intelligence and competence development. This model includes self-awareness, consideration of others, connection to others, and impacting change. Furthermore, the work aims to provide a framework to understand and facilitate the growth of students' social and emotional capacity to recognize emotional cues, process emotional information, and use emotional knowledge to adapt to social challenges in higher education. Finally, the work by Frenzel et al. [26] argues for the existence of positive reciprocal links between teachers' and students' enjoyment, and that these links are mediated by their observations of each other's classroom behaviors.
Following these theories, recognizing students' emotions seems essential to stimulate them during the learning process. However, despite the importance of recognizing students' emotions in e-learning platforms, there are few real implementations of these platforms that include emotion recognition. The inclusion of this feature can multiply the benefits of distance learning and add value to the platforms in charge of providing this service. For this purpose, it is essential to know how emotion awareness can be achieved.

Emotion Recognition Tools
Feidakis [8] defines emotion awareness as the implicit or explicit collection of emotion data and the recognition of emotional patterns. The author explains this definition by distinguishing three types of emotion capture: explicit (subjective report of feelings), implicit (voice, face, and physiological signals), and emotion pattern recognition (identification of people's emotions based on observation).
The first steps were carried out by expert psychologists who identified the emotions of a patient based on their body posture, tone of voice, or facial expression [27]. Later, self-reporting tools became widely used because of their simplicity, low cost, and non-intrusiveness [28]. However, there are also some disadvantages to be taken into account, such as a lack of honesty among participants, misinterpretation of the questions, or the use of unadjusted rating scales. The traditional approach of detecting emotions through questionnaires answered by the participants does not yield very efficient methods [29]. This is the reason for focusing on automatic emotion detection using multimodal approaches, which combine different information sources (e.g., facial recognition, speech analysis, and biometric data) [30].
In recent years, improvements in the field of Artificial Intelligence [10] have made it possible to perform the same analysis of emotions with emotion recognition devices and without human intervention. These have the advantage of being non-invasive and of not requiring specialized equipment, since most of them use common devices such as webcams or microphones. Algorithms to predict emotions based on facial expressions are mature and considered accurate. Currently, there are two main techniques for facial expression recognition, depending on how feature data are extracted: appearance-based features or geometry-based features [31]. Both techniques have in common the extraction of features from the images, which are fed into a classification system. They differ mainly in the features extracted from the video images and in the classification algorithm used [32]. Geometry-based techniques find specific features, such as the corners of the mouth or the eyebrows, and extract emotional data from them. In contrast, appearance-based extraction techniques describe the texture of the face caused by expressions and extract emotional data from skin changes [33]. Emotion recognition from speech analysis is an area that has been gaining momentum in recent years [34]. Speech features are divided into four main categories: continuous features (pitch, energy, formants), qualitative features (voice quality, harshness, breathiness), spectral features (linear predictive codes, Mel frequency cepstral coefficients), and Teager energy operator-based features, such as TEO-decomposed FM variation (TEO-FM-Var) and TEO Autocorrelation Envelope (TEO-Auto-Env) [35].
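To make the geometry-based approach more concrete, the following is a minimal sketch (not tied to any specific library) that derives a scale-invariant feature from hypothetical facial landmark coordinates; in a real system, a vector of such features would be fed into a classifier:

```python
import math

# Geometry-based feature extraction works on facial landmark coordinates.
# This illustrative sketch derives a "smile width" feature as the
# mouth-corner distance normalized by the inter-ocular distance, making
# it invariant to the scale of the detected face.
def distance(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def smile_width(landmarks):
    """landmarks: dict of (x, y) points for the eyes and mouth corners."""
    inter_ocular = distance(landmarks["left_eye"], landmarks["right_eye"])
    mouth = distance(landmarks["mouth_left"], landmarks["mouth_right"])
    return mouth / inter_ocular

# Hypothetical landmark positions (in pixels):
face = {
    "left_eye": (100, 100), "right_eye": (160, 100),
    "mouth_left": (110, 160), "mouth_right": (150, 160),
}
print(round(smile_width(face), 2))  # prints 0.67
```

Appearance-based pipelines replace such hand-crafted distances with texture descriptors computed over the whole face region, but the classification stage that follows is analogous.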
Physiological signals are another data source for recognizing people's emotions [36]. The idea of wearables that detect the wearer's affective state dates back to the early days of affective computing [37]. For example, skin conductance changes when the skin is sweaty, which is related to stress situations, among other effects. Skin conductance is used as an indicator of arousal, with which it is correlated [38]: a low level of skin conductivity suggests a low arousal level. Heart rate is also a physiological signal connected with emotions, as its variability increases with arousal. In general, heart rate is higher for pleasant and low-arousal stimuli than for unpleasant and high-arousal stimuli [38].

Emotion Recognition in E-Learning
The tools presented above are the basis of emotion recognition research and have been applied in recent years to create emotion-aware learning systems. Among the most relevant works, the proposal of Derick et al. [39] stands out. They implemented a set of visualizations intended to allow learners to reflect on their affective states and their connection with specific learning activities through self-reporting. Their results showed that the simplest visualizations helped the students most, since many of them did not know how to interpret the complex ones. Also making use of self-reports, complemented with data about interactions between students and teachers, Ruiz et al. [40,41] proposed a method to measure students' mood based on a model of twelve emotions (six positive and six negative). With the data obtained, a dashboard was implemented, resulting in improved academic results for almost 75% of the students.
The work of [42] opted to use a library known as Clmtrackr with a webcam to detect emotions based on facial expressions. In this way, it was possible to relate the students' mood to the learning activity they were doing, overcoming some of the difficulties of affective communication in e-learning environments. Ez-Zaoui et al. [43] proposed a multimodal and contextual approach to recognize emotions in e-learning platforms. For this purpose, four sources of information were captured and analyzed: audio, video, self-reports, and students' interactions. This analysis resulted in a dashboard designed for teachers, showing the correlation between activities and emotions.
Finally, the work carried out by Robal et al. [44] evaluated the performance of gaze tracking tools in detecting attention losses of students in Massive Open Online Courses (MOOCs). The conclusion reached was that hardware-based tools perform better than software-based ones. Despite this, the platform proposed in this work implements a software solution, known as WebGazer [45], because of the impossibility of using dedicated hardware outside the lab and its non-negligible results (an error of 175 pixels and an average visual angle of 4.17°). The majority of these works have in common the use of a dashboard to show the emotions captured in the learning process [42]. These works not only seek to collect as much data as possible about students' emotions, but also aim to present the data in an effective way that teachers and students can use to improve their results.

Learning Analytics
The growing interest in applying machine learning techniques in different fields, coupled with the emergence of numerous data capture techniques, has resulted in a growing trend to use both to improve virtually any area of society. Within this trend appears the concept of learning analytics, which can be defined as "the use of intelligent data, learner-produced data, and analysis models to discover information and social connections, and to predict and advise on learning" [9]. This definition becomes even more relevant in the environment of an e-learning platform, in which students' interactions are captured easily and on a large scale.
In the approach proposed by [46], the learning analytics process is defined in three steps. The first is to collect the most relevant data from the educational environment and pre-process it to clean, anonymize, and transform the data into a format suitable for later use. The second step consists of applying different data analysis techniques in order to find a solution that improves the learning process. Once a solution is found, action must be taken to achieve the proposed goals. Finally, the third step, named post-processing, consists of evaluating the entire process. This evaluation can result in obtaining new data that had not been taken into account, pre-processing better oriented to the final objective, defining new attributes and metrics, modifying analysis variables, or choosing different prediction and analysis methods. As mentioned, learning analytics is a research field of growing interest today, which is reflected in the amount of research that seeks to improve the academic outcomes of students participating in experiments. The work of [47] aimed to combat dropout in the first year of college. For this purpose, a system was developed with the capability of making predictions about students' academic future based on academic and demographic information. The results showed that students who had been advised about their trends obtained better grades and a lower dropout rate than their peers in the same semester.
Pardos et al. [48] analyzed the relationship between affective state and student behavior. With this aim, measurements were taken of students' work context, actions, utterances, facial expressions, body language, and interactions with teachers or fellow students. A machine learning model was then implemented, whose results demonstrated that it is possible to predict students' final grades based on their behavior and course performance. Lastly, the work of [49] applied learning analytics in order to improve the learning process when students work in teams. They proposed a system capable of monitoring and evaluating teamwork competencies through data about e-learning platform forums, cloud storage platform interactions and files, and wiki tools. The results showed a strong relationship between active interactions, such as writing a message in the course forum, and academic performance. Accordingly, teachers considered that, using this system, they were in a better position to evaluate each student's teamwork.

Semantic Task Automation
A recent shift in ambient intelligence is to move from the model of full and transparent automation to smart collaboration since autonomous often leave users feeling out of control [50].A popular approach to interconnect and personalize both IoT and Internet services is the use of Event-Condition-Action (ECA) rules, also known as trigger-action rules.A number of now prominent web sites, mobile, and desktop applications feature this rule-based task automation model, such as If-this-then-that (IFTTT) [51].These systems, called Task Automation Service (TAS) [52], are typically web platforms or smartphone applications that provide an intuitive visual programming environment where inexperienced users seamlessly create and manage their automations.
In this context, we developed EWE Tasker [53], a semantic task automation platform which aims to board rule capabilities and address two of the major drawbacks of these systems: (1) the only incoming data streams available are those the platform is prepared for; and, (2) the lack of a mechanism to use and reason over large scale data outside their platform such as the Linked Open Data (LOD) cloud or context data [54].
EWE Tasker is based on the Evented WEb (EWE) ontology [52], a standardized data schema designed to model the most significant aspects of Task Automation Services. EWE provides a common model to define and describe task automation services, representing their rules and enabling rule interoperability. In addition, it provides a base vocabulary on which to build domain-specific vocabularies. Four major classes make up the core of EWE: Channel, Event, Action, and Rule. The class Channel defines individuals that either generate Events, provide Actions, or both (such as an emotion recognition sensor or a smart light). The class Event defines a particular occurrence of a process and allows users to describe under which conditions rules should be triggered. The recognition of sadness generated by the emotion detector sensor is an example of an entity belonging to this class. The class Action defines an operation provided by a Channel that is triggered under certain conditions (e.g., changing the light color). Finally, the class Rule defines an ECA rule, triggered by an Event, that produces the execution of an Action. An example of a rule is: "If sadness is detected, then change the light color."
Users of the EWE Tasker platform can easily create and configure their own automation rules by means of a mobile phone application or a web interface. The platform integrates several external devices and services (such as Gmail or Spotify) and can easily be adapted to different smart environments, benefiting from the advantages of semantic technologies.

Emotion-Aware E-Learning Platform Architecture
Having analyzed the state of the art, we present the proposed architecture along with its implementation details. We aim to build a system that enables the personalization of the learning experience based on students' emotions. For this purpose, and taking into account the e-learning literature [55][56][57], we have identified the following requirements to be fulfilled by the architecture: (1) to be integrable into the chosen e-learning platform; (2) to recognize and store the students' emotions; (3) to be able to convert data into valuable information for users; and (4) to enable automatic adaptation of the student environment. The proposed architecture matching these requirements is shown in Figure 1. The platform architecture is composed of two main components: the emotion-aware e-learning component and the emotion recognition component. The former is the core of the platform, and it provides the system with the capability to collect, store, process, analyze, and visualize all the data related to the system. In addition, it integrates a semantic task automation platform. The emotion recognition component, in turn, endows the system with emotion and engagement detection. This component is integrated into an e-learning platform.

Emotion Recognition Component
The Emotion Recognition component is responsible for collecting data from the e-learning platform. It consists of different views developed as web components that must be integrated into the original e-learning platform. For the proposed architecture, we have used Moodle as the e-learning platform. Moodle allows us to create online courses quickly and easily, thanks to its friendly web interface. In addition, its use is widespread in the academic world, so it is not a barrier for teachers or students, which adds relevance to our case study. Moodle also allows us to integrate the emotion recognition component easily.
This component is composed of two modules: the emotion widget and the emotion recognition tools. The former is responsible for sending all the data captured in the platform to the emotion-aware e-learning component, while the latter is responsible for collecting data regarding students' emotion and engagement. Three emotion recognition tools have been integrated into this component: the self-reporter, the videogazer, and the emotestcapturer.
The self-reporter tool implements a form in which students can record the emotions they are experiencing each day of the course, as well as the intensity of these emotions. Students can fill in the report every day, indicating how they feel by selecting an emotion and its intensity. These data allow us to perform constant monitoring of students' moods throughout the course using real metrics that originate in the subject itself. In addition, these data allow us to compute correlations with measurements obtained from other sub-modules and with the academic results achieved in each phase of the course. The videogazer tool, based on the WebGazer library [45], allows us to detect students' attention in video lessons. Thanks to this tool, it is possible to determine whether users are paying attention to the lesson based on where they are looking. In this way, it is possible to relate the different phases of the video lesson to the attention and interest produced by each of them. Finally, the emotestcapturer implements a tool that allows us to capture students' emotions while they are solving a test. This tool is powered by the Clmtrackr library [58], which is able to detect the emotions of users through the positions of the coordinates that outline their faces. All the data collected by these tools are sent to the emotion-aware e-learning component, where they are processed.
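As an illustration of how an attention decision could be made from gaze predictions such as those produced by WebGazer, the sketch below (our own simplification, not WebGazer's API) labels a gaze sample as attentive when it falls inside the video element's bounding box, with a tolerance margin of the same order as the 175-pixel error reported in Section 2.3:

```python
# Illustrative attention decision from a predicted gaze point, in screen
# pixels. The video bounding box and margin values are hypothetical; the
# margin absorbs part of the gaze-prediction error of software trackers.
def is_attentive(gaze_x, gaze_y, video_box, margin=175):
    """video_box: (left, top, right, bottom) of the video element."""
    left, top, right, bottom = video_box
    return (left - margin <= gaze_x <= right + margin and
            top - margin <= gaze_y <= bottom + margin)

box = (200, 100, 1000, 550)         # hypothetical video element position
print(is_attentive(600, 300, box))  # gaze well inside the video: True
print(is_attentive(50, 900, box))   # gaze far below the video: False
```

Aggregating these per-sample decisions over time yields the attention-per-second series that is later correlated with the video content.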

Emotion-Aware E-Learning Component
The Emotion-Aware E-learning component's main functions are collecting, storing, processing, analyzing, and visualizing the data obtained from the emotion recognition component. In addition, the component is able to adapt the smart environment according to automation rules, thanks to the integration of a task automation platform. An overview of the data interchange carried out by this component is shown in Figure 2. The data are received by the emotion server submodule.

This submodule consists of a web service that receives and handles requests from the emotion recognition component. It receives management- or student-related information and passes it to the corresponding submodule of the system depending on its nature. This information is stored by the storage submodule and sent to the learning analytics submodule and the emotion event trigger. The storage submodule consists of two repositories that store the system logic data and the user-related data. One repository, consisting of a MongoDB database [59], is used for storing personal user information (age, sex, etc.) and academic information (courses, lessons, etc.). The other repository, based on the distributed analytics and search engine Elasticsearch [60], is used to store the emotion- and attention-related information. These data contain all the information about the students' mood, personal information, and grades.
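The routing performed by the emotion server can be sketched as follows; the payload schema is a hypothetical simplification, and in-memory lists stand in for the MongoDB and Elasticsearch clients:

```python
# Sketch of the emotion server's routing logic: management/academic data
# goes to the document store (MongoDB in our architecture), while emotion
# and attention events go to the analytics index (Elasticsearch). Both
# store clients are replaced here by in-memory lists for illustration.
USER_STORE = []      # stand-in for the MongoDB repository
EMOTION_INDEX = []   # stand-in for the Elasticsearch repository

def handle_request(payload):
    kind = payload.get("type")
    if kind in ("user", "course"):        # personal / academic information
        USER_STORE.append(payload)
        return "stored:mongodb"
    if kind in ("emotion", "attention"):  # emotion / engagement events
        EMOTION_INDEX.append(payload)
        return "stored:elasticsearch"
    raise ValueError(f"unknown payload type: {kind}")

print(handle_request({"type": "emotion", "emotion": "sadness", "intensity": 0.84}))
print(handle_request({"type": "user", "age": 22, "country": "ES"}))
```

Keeping the two stores separate lets the analytics index be optimized for time-series queries while personal and course data remain in the document database.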

In order to visualize these data, the Elasticsearch-based repository connects with the visualization module, based on the dashboard visualization tool Kibana [61]. The data must be shown in an accessible and easy-to-understand way, without requiring previous knowledge from the users. To make this possible, Kibana has been selected as the implementation tool for these visualizations. Kibana provides a web-based environment in which we can implement our visualizations. We have used different chart types to define several visualizations. These visualizations, which can be seen in Section 4, are grouped into three different dashboards: two for teachers and one for students. The visualizations designed for these dashboards are: average self-report emotions over time (area); average, maximum, and minimum grades per topic (table); number of students enrolled (gauge); success rate (goal); grade levels (heatmap); average attention per video, average grade per topic, and average emotions per test (bar); emotion evolution per minute in a test, average attention per second, and average self-report emotions over time (line); number of students per gender and number of students per grade (pie); and student country (region map).
Besides analyzing data by means of visualization, the system also provides a way to perform data analysis using machine learning techniques. The learning analytics submodule has this goal and allows users to draw conclusions that cannot be obtained through ordinary study and visualization techniques. This submodule performs an exploratory analysis of the data using K-Means and finds correlations between features, enabling the detection of anomalous cases, trends, and correlations in the students' data.
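A minimal, self-contained sketch of this exploratory analysis is shown below, with a toy K-Means implementation and a Pearson correlation over hypothetical (attention, grade) records; the production submodule would rely on a library implementation rather than this didactic version:

```python
import random
from statistics import mean

# Pearson correlation between two feature vectors.
def pearson(xs, ys):
    mx, my = mean(xs), mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) *
           sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

# Toy K-Means: assign each point to the nearest centroid, then move each
# centroid to the mean of its cluster, for a fixed number of iterations.
def kmeans(points, k, iters=20, seed=0):
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda c: sum(
                (a - b) ** 2 for a, b in zip(p, centroids[c])))
            clusters[nearest].append(p)
        centroids = [tuple(mean(dim) for dim in zip(*cl)) if cl else centroids[j]
                     for j, cl in enumerate(clusters)]
    return centroids

# Hypothetical (attention %, grade) records for a group of students:
students = [(90, 9.1), (85, 8.7), (40, 5.2), (35, 4.8), (88, 9.4), (30, 4.5)]
attention, grades = zip(*students)
print(round(pearson(attention, grades), 3))  # strong positive correlation
print(sorted(kmeans(students, k=2)))         # two clearly separated groups
```

On this toy data, the clustering separates high-attention/high-grade students from low-attention/low-grade ones, which is the kind of trend the submodule is meant to surface.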
Finally, to perform automatic adaptation of the students' smart environment, a semantic task automation platform has been integrated. The integrated automation platform is EWE Tasker, developed in a previous work [16]. This platform offers an easy way to integrate smart devices and services in an environment and use them for automating daily tasks. These integrations range from devices such as smart lights, sensors, and smartphones to services such as Gmail or Twitter. Users of the platform are able to create and configure their own automation rules and to integrate new devices and services thanks to its semantic core, based on the EWE ontology [52]. In order to import the received data into the platform, the emotion event trigger submodule applies a semantic layer over them. Listing 1 shows an example of an emotion-detected event, along with its corresponding parameters and the generator channel, modeled according to the work of Muñoz et al. [62]. The event has been generated by the detection of "sadness" with an average intensity of 84%. The example is written in Notation3, a shorthand non-XML serialization of Resource Description Framework (RDF) models designed with human readability in mind [63].
The task automation platform processes these events and performs the corresponding actions on the environment, according to the automation rules predefined by the users. The platform is linked to different services and devices, enabling the adaptation of the environment to students' emotions or engagement. In addition, it includes a smartphone app that can connect with different devices in order to open up new possibilities. Among the developed capabilities are stress detection with the Empatica E4 wristband [64] and indoor geolocation using Estimote iBeacons [65]. The integration of the automation platform enables the automatic improvement of students' mood and comfort by adapting their environment.
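As a sketch of the kind of semantic lifting the emotion event trigger performs, the snippet below serializes a raw detection into a Notation3-style description; the namespace and property names are illustrative placeholders, not the actual EWE vocabulary terms used in Listing 1:

```python
# Illustrative semantic layer: turn a raw detection event into a
# Notation3-style RDF description. The "ex:" prefix and property names
# are hypothetical stand-ins for the real EWE vocabulary.
def event_to_n3(emotion, intensity, channel):
    return "\n".join([
        "@prefix ex: <http://example.org/ewe-demo#> .",
        "",
        "ex:event1 a ex:Event ;",
        f"    ex:generatedBy ex:{channel} ;",
        f'    ex:emotion "{emotion}" ;',
        f"    ex:intensity {intensity} .",
    ])

print(event_to_n3("sadness", 0.84, "emotionDetector"))
```

Once lifted into RDF, the event can be matched against semantically described rules regardless of which sensor or service originally produced it.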

Case Study
In order to better understand the usefulness of the proposed system, we have defined three case studies: Academic and Emotional Dashboards for Teachers, an Emotional Dashboard for Students, and Smart Automation for Students and Teachers. The context in which these case studies take place is an online programming course. This course, entitled "Basic Python for Beginners" [66], is available on Moodle.net. The course offers a set of video lessons and topic-based questionnaires.
These case studies are described in the following, explaining how they allow us to test the main features of the system: emotion and engagement recognition; data collection, analysis, and visualization; and smart automation.

Academic and Emotional Dashboards for Teachers
This case study is meant to remove communication barriers between teachers and students in distance learning environments.An overview of this use case is given in Figure 3.
The student's goal is to complete the lessons that make up the course. For this purpose, the student carries out three types of activities: theory study, video lessons, and questionnaires. Meanwhile, the student's engagement, stress, and mood are monitored. In addition, self-report emotion questionnaires are administered during the course. While the student is watching the video lessons, his/her attention is measured. This enables us to capture the student's engagement in relation to the topic and the current video time. In addition, the student can take a test, answering a series of questions related to the studied topic. While the user takes the test, the emotions and stress level experienced are captured, as can be seen in Figure 4. The collected data are shown to the teacher in a dashboard, enabling a better understanding of the student's learning experience. This allows the teacher to adapt the methodology, content, or activities in order to improve the student's mood and, consequently, his/her academic results. Figure 5 shows an example of the charts available to the teacher in the dashboard. These charts show the statistics obtained from students during the different activities of the course. This information ranges from grades to emotional data, associated with each topic. In this way, teachers can analyze how each emotion, and engagement, affects the learning outcomes obtained. In this example, the chart shows the students' average engagement, allowing the teacher to analyze the average attention that the students have paid to the video lesson of a specific topic. It can be seen how the attention is high in the first seconds of the video, then varies as time goes by, until it drops sharply at the end.
Through these visualizations, the teacher can better understand the characteristics of the group of students participating in the course. This allows him/her to adapt the contents for the students who have greater difficulties, taking into account a great variety of factors such as emotion, attention, age, or nationality. In this way, the communication barriers of an e-learning environment are overcome.
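As an illustration of how an average-attention chart of this kind can be produced, the following Python sketch averages per-second engagement samples across students. The data layout is hypothetical and does not reflect the platform's actual storage format.

```python
from collections import defaultdict

def average_engagement(samples):
    """Average engagement per video second across all students.

    `samples` is an iterable of (student_id, video_second, engagement)
    tuples, where engagement is a score in [0, 1] produced by the
    attention recognizer.  Returns {video_second: mean_engagement}.
    """
    totals = defaultdict(lambda: [0.0, 0])
    for _student, second, engagement in samples:
        totals[second][0] += engagement
        totals[second][1] += 1
    return {second: acc[0] / acc[1] for second, acc in totals.items()}

# Two students watching the same video lesson (illustrative data).
samples = [
    ("s1", 0, 0.9), ("s2", 0, 0.8),  # high attention in the first seconds
    ("s1", 1, 0.6), ("s2", 1, 0.4),  # attention varies as time goes by
]
curve = average_engagement(samples)
```

The resulting dictionary can be plotted directly to obtain a curve like the one described for Figure 5.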

Emotional Dashboard for Students
This case study is based on the previous one but also includes an emotional dashboard for students. This dashboard allows students to perform an introspection exercise, in which they can learn how emotions affect their performance and, consequently, look for solutions to make the most of the course. An overview of this case study is presented in Figure 6. The Student's Emotional Dashboard is designed to show the evolution of the students' emotional state in a clean and straightforward way. In addition, the visualizations include the grades obtained during the course. Figure 7 shows some of the developed visualizations.
In particular, Figure 7a shows the evolution of the emotions that the student has experienced while taking tests. The color code, used in this visualization and in others in which emotions appear, is inspired by Borth et al. [67]. In this example, it can be seen that happiness is the most reported emotion, that surprise is also frequently reported, and that negative emotions have small values in comparison.
On the other hand, Figure 7b shows how the feelings expressed by the student have evolved during the different lessons of the course. In this example, we can see that sadness is the emotion that the student experiences with the highest intensity, followed by anger.

Smart Automation for Students and Teachers
The main goal of the last case study is to improve the comfort and mood of students while they are studying, through the use of smart devices and task automation. For this purpose, three different scenarios have been designed, as shown in Figure 8: mood regulation, attention improvement, and report generation. The mood regulation scenario aims to adapt the environment to the emotions experienced by a student through the use of smart devices located in his/her workplace. Students can create and configure their own mood regulation rules using the task automation platform. In this way, the environment of the student can automatically change according to his/her emotions, with the purpose of enhancing positive feelings. The automation rules can involve several devices or services, such as smart lights or speakers. Thus, while the student is studying, the work environment will adapt to his/her mood, improving his/her comfort and, consequently, his/her performance. An example of a rule to achieve this automation is: if a student is sad, then play happy music.
The second scenario is attention improvement, which aims to improve students' attention when they are studying the course contents through the e-learning platform. As in the mood regulation scenario, automation rules are created and configured by the students. In this scenario, the system detects whether the attention paid to the slides is under a certain threshold. When this happens, the system performs an action to warn the student and recover his/her attention. These actions range from showing notifications on the phone to light or sound signals, depending on the student's preferences. Through these automations, the student can improve his/her concentration, optimizing study time and obtaining better academic results. A rule example for attention improvement is: if the student's attention falls, then show a notification.
Finally, the report generation scenario has the goal of facilitating the exploratory analysis of the course data presented in the previous section. When faced with a large amount of data, this analysis becomes a process that can take from minutes to hours. Waiting for this process to end can discourage teachers from using these analysis techniques, leaving them unused or undervalued. For this reason, and taking advantage of the capabilities of the automation platform, teachers can configure rules to be warned about the completion of an analysis. In this way, when the system detects that an analysis task has finished, the teacher is notified using his/her preferred method configured in the automation rule (email, mobile notification, etc.) and can also obtain the generated report. A possible rule following this scenario is: if a new analysis has been performed, then send an email to the teacher.
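The three scenarios share the same trigger-action pattern. The Python sketch below illustrates that pattern with hypothetical event shapes, a hypothetical threshold, and illustrative action names; in the actual platform, these rules are expressed semantically rather than hard-coded.

```python
# Minimal trigger-action dispatcher mirroring the three scenarios.
# Event shapes, the threshold, and the action names are illustrative,
# not the platform's actual API.
ATTENTION_THRESHOLD = 0.4  # hypothetical cut-off

def handle_event(event, actions):
    """Append the automation action matching `event` to `actions`."""
    kind = event["type"]
    if kind == "emotion_detected" and event["emotion"] == "sadness":
        actions.append("play_happy_song")          # mood regulation
    elif kind == "attention_sample" and event["value"] < ATTENTION_THRESHOLD:
        actions.append("notify_student")           # attention improvement
    elif kind == "analysis_finished":
        actions.append("email_report_to_teacher")  # report generation
    return actions

log = []
handle_event({"type": "emotion_detected", "emotion": "sadness"}, log)
handle_event({"type": "attention_sample", "value": 0.25}, log)
handle_event({"type": "analysis_finished"}, log)
print(log)  # ['play_happy_song', 'notify_student', 'email_report_to_teacher']
```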
These automations are carried out through the definition of semantic rules in the automation platform. Listing 2 shows the semantic definition of a sample mood regulation rule, with its corresponding event, action, and parameters. In this case, the rule is: if a student is sad, then play happy music, composed of the event of sadness detection and the action of playing a happy song. The song to play and the emotion detected are the parameters of the rule.
In this section, we have presented the case studies designed to improve the mood of students participating in an online course through the study of their emotions. The proposed platform achieves this goal through the integration of several visualization dashboards, analysis techniques, and smart automation. To this end, visualization tools have been proposed to broaden the knowledge of both teachers and students about the interactions that occur during the course activities. These interactions include data pertaining to emotions, attention, and course grades.

Validation
As presented before, the main experimental contribution of this work is the design and implementation of an emotion-aware learning analytics system for e-learning platforms, based on machine learning techniques and integrated with a semantic task automation platform. To assess the scope of this research, a set of hypotheses was raised:

Hypothesis 1 (H1). Being aware of his/her emotions improves user satisfaction.

Hypothesis 2 (H2). The system quality of an e-learning platform directly affects user satisfaction.

Hypothesis 3 (H3). Being aware of their emotions helps students to obtain better academic results.

Hypothesis 4 (H4). The system quality of an e-learning platform is a key factor for academic results.
To evaluate the proposed system, it was tested in an experiment carried out with real users in the scope of a Python course on the Moodle platform [68]. In this experiment, the entire system was deployed, including the emotion recognition tools, the visualization and analysis modules, and the smart automation system. The following sections cover the participants, materials, design, results, and conclusions drawn from the experiment, focusing on its objective.

Participants
The experiment was conducted among 30 students from a technical university, including both genders. The ages of the participants ranged from 20 to 25. Their selection was motivated by the characteristics of the course in which the experiment was set, a Python programming course for beginners. The participants' knowledge of programming was extensive, while their knowledge of the Python language varied from one participant to another.
Ethical approval for the experiment was obtained from the Ethics Committee of the Technical University of Madrid, and the study was carried out in accordance with the ethical guidelines [69]. Participants were aware of the nature of the experiment and gave informed consent prior to participation in the study. All subjects could withdraw from the experiment at any time.

Materials
The material needed to carry out this experiment includes a wide variety of devices, sensors, and computer equipment, detailed below:

•	Camera (Gucee HD92), which feeds the video to the emotion recognizer submodule.
•	Room lighting (a WS2812B LED strip controlled by a WeMos ESP8266 board), used as an actuator on the light level of the room, with the possibility of using several lighting patterns.
•	Google Chromecast [70], which transmits content over a local network.
•	LG TV 49UJ651V, used for displaying images.
•	Google Home, which the system uses to play music.

Participants accessed the HTML-based web interface using a desktop computer with the Firefox browser [71].

Methodology
During the experiment, each participant completed a lesson in an online programming course. This course, entitled "Basic Python for Beginners" [66], is available on Moodle.net. It was chosen mainly because it adapts easily to the case studies that we defined in Section 4.
While participants were taking the course, the e-learning system was fed with the information provided by the different sensors that continually monitored the participant's emotional and attention states. As the student progressed through the course, different adaptation rules were triggered depending on his/her emotion and attention, such as changing the light color or playing music. In addition, the student could visualize his/her emotions using the developed dashboard. The experiment finished when the lesson was completed, and a questionnaire was given to the participants just after the session concluded. The questions use five-point Likert scales and are oriented to capture the participant's view of the system. They are summarized in Table 1.
Table 1. Questionnaire for the participants.

No. | Question Formulation | Code
Q1 | The system is easy to use | SQ1
Q2 | System reliability is correct | SQ2
Q3 | The system provides interactive features between user and system | SQ3
Q4 | The system provides custom information | SQ4
Q5 | The system features are attractive for the user | SQ5
Q6 | The system provides information in a rapid way | SQ6
Q7 | System interface is clear and intuitive | SQ7
Q8 | The system provides the needed information related to emotions | EIQ1
Q9 | The information provided by the system is accessible at the required moment | EIQ2
Q10 | The information related to emotions provided by the system is relevant for your task | EIQ3
Q11 | The system provides enough information related to emotions | EIQ4
Q12 | The information provided by the system is easy to understand | EIQ5
Q13 | The system provides updated data of your emotions | EIQ6
Q14 | The system enables me to accomplish the tasks more efficiently | US1
Q15 | The system helps me learn effectively | US2
Q16 | The system improves my learning performance | US3
Q17 | The system features are improvements with regard to standard e-learning platforms | US4
Q18 | Overall, the system is useful | US5
Q19 | The system saves my time in searching for materials | BE1
Q20 | The system has increased my knowledge and helped me to be successful in the course | BE2
Q21 | The system makes communication easier with the teacher and other students | BE3
Q22 | The system allows the teacher to adapt to students' needs | BE4
Q23 | The system has helped me to achieve the learning goals of the module | BE5

Results
We tested our research hypotheses using partial least squares structural equation modeling [17], with SmartPLS 3.2.9 [72]. This software allows us to easily check the psychometric properties of the measurement and structural models. It enables the analysis of reliability and validity, and the estimation of the strength of the relationships among different model variables simultaneously.
PLS-SEM is suitable for small samples, obtaining good results with a sample size at least 10 times the largest number of structural paths directed at a particular construct in the structural model [73]. Figure 9 shows an overview of our model, according to the proposed hypotheses. Since in our model the maximum number of paths directed at a construct is two, our minimum sample size would be 20, so we consider PLS-SEM an appropriate method for our sample. The measurement model was assessed using the following criteria. First, we check indicator reliability, verifying that the outer loading for each indicator is greater than 0.70 [74]. Second, we use Cronbach's alpha (α) and Composite Reliability (CR) to prove internal consistency reliability; the cut-off value is greater than 0.70 for both tests [75]. Finally, we check for convergent and discriminant validity. The former is assessed with the average variance extracted (AVE), which should be greater than 0.50 [76]. The latter, discriminant validity, is checked using the Fornell-Larcker criterion [76], cross-loadings [75], and the Heterotrait-Monotrait ratio (HTMT) [77].
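For reference, the internal-consistency and convergent-validity figures can be computed directly from the standardized outer loadings. A minimal Python sketch with illustrative loadings (not the values obtained in this study):

```python
def composite_reliability(loadings):
    """Composite reliability: (sum l)^2 / ((sum l)^2 + sum(1 - l^2)),
    where l are the standardized outer loadings of one construct."""
    s = sum(loadings)
    error = sum(1 - l * l for l in loadings)
    return s * s / (s * s + error)

def average_variance_extracted(loadings):
    """AVE: mean of the squared standardized loadings."""
    return sum(l * l for l in loadings) / len(loadings)

# Illustrative standardized outer loadings for one construct
# (not this study's data).
loadings = [0.82, 0.79, 0.88]
cr = composite_reliability(loadings)        # ~0.87, above the 0.70 cut-off
ave = average_variance_extracted(loadings)  # ~0.69, above the 0.50 cut-off
```

Cronbach's alpha is computed analogously from the inter-item covariances; SmartPLS reports all three measures directly.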
For the test of individual item reliability, we examined the outer loadings, which give the loading of the measures on the constructs they intend to measure. Following [74], we deleted all indicators with loadings lower than 0.40 and retained those higher than 0.70. For those whose value was between the two thresholds, we retained the indicators that did not decrease the value of AVE and CR. As a result, eight indicators were eliminated, resulting in the indicators shown in Table 2. Next, we retrieved CR, Cronbach's alpha, and AVE in order to check for internal consistency and convergent validity. In Table 2, we can see that all the values meet the minimum threshold for internal consistency. The composite reliability for all scales was between 0.76 and 0.93, above the minimum required value of 0.70. In addition, the average variance extracted (AVE), employed to assess convergent validity, was greater than 0.50 for all constructs. Third, we evaluated discriminant validity using the correlation matrix for the Fornell-Larcker method, whose results are presented in Table 4.
Discriminant validity indicates the extent to which a given construct differs from other latent constructs [78]. The table shows that the diagonal values are larger than the other values in the same column, which is the expected behavior according to the AVE. Cross-loadings were the second method used to assess discriminant validity. Table 3 shows the values of these loadings for each indicator. The results show that all items correlated more strongly with their latent variable than with any other variable in the model [79]. Finally, we checked for discriminant validity problems using the HTMT criterion, proposed by [77]. HTMT is the average Heterotrait-Heteromethod correlation relative to the average Monotrait-Heteromethod correlations. Heterotrait-Heteromethod correlations are correlations of indicators across constructs measuring different phenomena, while Monotrait-Heteromethod correlations are correlations of indicators measuring the same construct. Table 4 shows that all the HTMT values are within the accepted threshold (less than 0.90). Thus, our measurement model verifies all three conditions suggested by prior research, providing strong support for its reliability.

Having confirmed that the measurement model is reliable and valid, we next evaluated the structural model results. For the structural model, we followed the criteria proposed by [74]:

•	Assess the structural model for collinearity issues (VIF < 5).
•	Assess the significance and relevance of the structural model relationships (t > 1.65).
•	Assess the level of R² (cut-off levels: 0.190 weak; 0.333 moderate; 0.670 substantial).
•	Assess the level of Q² (cut-off point larger than zero).
•	Assess the model's fit.
Thus, we first checked for collinearity symptoms to ensure that the results are valid. We did this by analyzing the variance inflation factor (VIF). A VIF greater than 5.00 would indicate overly high collinearity and, consequently, a potential collinearity problem. The retrieved VIF values are all between 1.17 and 3.98, so they always meet the accepted threshold (VIF < 5). Thus, we can affirm that collinearity was not a problem in our data.
Then, we proceeded to evaluate the significance of each path coefficient. Path coefficients (β) represent the relationships between the constructs in the model. These relationships are shown in Figure 9. The significance of each path coefficient, the standard errors, and the t-values were calculated by the PLS bootstrapping algorithm, with 5000 samples. From the analysis, we obtain that the emotional information contributes to improving user satisfaction (β = 0.771, p = 0) and benefits (β = 0.23, p = 0.04), supporting H1 and H3. In addition, we also observe that the system quality improves user satisfaction and benefits, supporting H2 (β = 0.142, p = 0.03) and H4 (β = 0.63, p = 0).
We then assessed our model performance using the explanatory power (R²) and predictive relevance (Q²). Table 5 shows the results obtained with SmartPLS for each endogenous variable. The predictive relevance was obtained using blindfolding in SmartPLS, with an omission distance D of 7. The values of R², according to the cut-off points of 0.67, 0.33, and 0.19 for substantial, moderate, and weak levels, respectively [80], give us a moderate level for emotion information quality and a substantial level for system quality. According to Hair et al. [73], Q² measures the capability of the path model to predict the endogenous measuring items indirectly from the prediction of their latent variables using the related structural relations. The cut-off point is to be larger than 0, so, as can be seen in Table 5, the model has strong predictive relevance for the endogenous constructs. As an additional indicator, we used the global goodness-of-fit (GoF) criterion proposed by [81] to assess the model fit. GoF is defined as "how well the specified model reproduces the observed covariance matrix among the indicator items" [74], so it allows us to evaluate the overall performance of the model at both the measurement and structural levels. The GoF index ranges between 0 and 1, and is calculated as the square root of the product of the average AVE and the average R²: GoF = sqrt(mean(AVE) × mean(R²)). There are no inference-based criteria to assess the statistical significance of the GoF index, due to its descriptive nature [82]. However, Wetzels et al. [83] propose several baselines in order to validate PLS models globally. These baselines are, according to different effect sizes of R²: GoF small (0.10), GoF medium (0.25), and GoF large (0.36). The obtained value of 0.46, higher than the cut-off of 0.36 proposed by Wetzels et al. [83] for large effect sizes of R², indicates a good overall fit and adequately validates the complex PLS model globally.
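Since GoF reduces to the square root of the product of the average AVE and the average R², its computation is direct; the AVE and R² values below are illustrative, not the study's exact figures.

```python
from math import sqrt
from statistics import mean

def goodness_of_fit(ave_values, r2_values):
    """Global GoF: square root of (mean AVE) x (mean R^2)."""
    return sqrt(mean(ave_values) * mean(r2_values))

# Illustrative AVE and R^2 values (not the study's exact figures).
gof = goodness_of_fit([0.60, 0.55, 0.65], [0.35, 0.60])  # ~0.53, above 0.36
```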
These results also validate the structural model, confirming the proposed hypotheses and validating that the use of an emotion-aware e-learning platform improves students' satisfaction and helps to improve their academic performance.

Teacher Validation
Finally, in order to validate the usefulness of the system for teachers, it was tested by five professors at the Technical University of Madrid. Participants were asked to try the system and then answer a set of questions evaluating its usefulness. These questions use five-point Likert scales and are oriented to capture the participant's view of the system. The questions raised, along with the mean values obtained, are summarized in Table 6. As shown in the table, the system proved quite useful to the professors, achieving high scores for all the questions. However, the score for Q4 is slightly lower than the others, which suggests that further effort is still needed to improve communication between the teacher and the students.

No. | Question Formulation | Avg.
Q1 | The information provided by the system related to students' emotion is useful | 4.2
Q2 | The system features are improvements with regard to standard e-learning platforms | 4.8
Q3 | The system allows the teacher to adapt the content to students' needs | 4.6
Q4 | The system makes communication easier with the teacher and other students | 3.8
Q5 | Overall, the system is useful | 4.6

Conclusions and Future Work
This paper proposes an architecture for an emotion-aware e-learning platform based on semantic technologies, which enables emotion analysis in study places and the automated adaptation of these study places to the students' needs. The proposed architecture allows teachers to study these analyses in order to adapt their methodology, with the purpose of improving students' mood and consequently helping them to achieve better academic performance. In this way, our platform advances in decreasing communication barriers between teachers and students in distance learning environments, while it also provides the students with mood regulation capabilities. In addition, the application of a semantic layer enables data interoperability and portability of components. Finally, the system has been implemented and evaluated using the PLS-SEM methodology.
Through the experimentation, we have verified a set of hypotheses. In summary: (1) using the proposed emotion-aware e-learning system helps students to obtain better academic results; (2) the system quality of an e-learning platform directly affects user satisfaction; (3) using the proposed emotion-aware e-learning system improves user satisfaction; and, finally, (4) the system quality of an e-learning platform is a critical factor for academic results. These results encourage the use and improvement of this kind of e-learning system, as they seem to provide users with several advantages, such as the analysis of their mood and attention or the adaptation of their smart environment to this mood.
As future work, many lines can be followed to continue this work. One of these lines is to delve deeper into emotion and mood regulation: the integration of more sophisticated ways of adapting the smart environment to emotions could be a key improvement for these systems, and we are currently exploring new ways of emotion regulation. Another line of future work is the enhancement of the learning analytics side of the system, studying new methods and improving the current analysis in order to offer new relevant data. Furthermore, we also plan to integrate new ways of emotion recognition in the system by means of the analysis of students' text interactions; this would add reliability to the system, as it incorporates a mature way of recognizing emotions. In addition, it would be relevant to implement the system in a real college course in order to study and evaluate its benefits in a real scenario. In this line, the implementation of the system in companies that provide courses for employee training would open new opportunities for the validation and improvement of the system. Finally, it would also be interesting to evaluate in depth the benefits of the system for teachers, analyzing whether the information about the students provided by the system entails savings in the effort and time needed to adapt the material and courses to students, and whether this adaptation provides better results.

Listing 1. Semantic representation of an emotion detected event, written in Notation3.

ewe:hasEmotion rdfs:subPropertyOf ewe:hasParameter .
ewe:hasUser rdfs:subPropertyOf ewe:hasParameter .
onyx:Emotion rdfs:subClassOf ewe:Parameter .

ewe:WebCamSensor a owl:Class ;
    rdfs:subClassOf ewe:Channel ;
    dcterms:title "Web Camera" ;
    dcterms:description "This channel represents a web camera sensor able to recognize emotions from video." ;
    ewe:generatesEvent ewe:EmotionDetectionEvent .

ewe:EmotionDetectionEvent a owl:Class ;
    rdfs:subClassOf ewe:Event ;
    dcterms:description "This event fires every time an emotion is detected by a certain user." ;
    dcterms:title "New emotion detected" ;
    ewe:hasEmotion onyx:Emotion ;
    ewe:hasUser ewe:User .

ewe:web-cam1 rdf:type ewe:WebCamSensor .

onyx:sadness1 rdf:type onyx:Emotion ;
    dcterms:title "Sadness emotion detected" ;
    dcterms:value "sadness" ;
    onyx:source ewe:web-cam1 ;
    onyx:hasEmotionIntensity 0.84 .

ewe:student3238 rdf:type ewe:User ;
    foaf:accountName "Student 3238" .

ewe:sadnessdetected1 rdf:type ewe:EmotionDetectionEvent ;
    ewe:generatedBy ewe:web-cam1 ;
    ewe:hasEmotion onyx:sadness1 ;
    ewe:hasUser ewe:student3238 .

Figure 3. Academic and emotional dashboards for teachers' case study.

Figure 4. Emotional capture in the e-learning platform.

Figure 5. Average attention of students for a certain lesson.

Figure 6. Emotional dashboard for students' case study.

Figure 7. Self-reported emotion and average captured emotions for a test. (a) Emotions self-reported by students, shown per week; (b) example of the average captured emotions.

Listing 2. Rule instances.

ewe:PlaySong a owl:Class ;
    rdfs:subClassOf ewe:Action ;
    dcterms:description "Play a song on a speaker." ;
    dcterms:title "Play song" ;
    ewe:hasSong mo:Track .

ewe:Speaker a owl:Class ;
    rdfs:subClassOf ewe:Channel ;
    dcterms:title "Audio speaker" ;
    dcterms:description "This channel represents a smart speaker able to play sounds." ;
    ewe:providesAction ewe:PlaySong .

ewe:play-happy-song rdf:type ewe:PlaySong ;
    ewe:providedBy ewe:speaker1 ;
    ewe:hasSong mo:all-together-now .

ewe:speaker1 rdf:type ewe:Speaker .

ewe:sadnessdetected1 rdf:type ewe:EmotionDetectionEvent ;
    ewe:generatedBy ewe:web-cam1 ;
    ewe:hasEmotion onyx:sadness1 ;
    ewe:hasUser ewe:student3238 .

mo:all-together-now rdf:type mo:Track ;
    dc:title "All together now" ;
    foaf:maker mo:the-beatles .

ewe:regulate-sadness a ewe:Rule ;
    dcterms:title "Sadness regulation rule" ;
    dcterms:description "This rule aims to regulate sadness by means of playing a happy song." ;
    ewe:triggeredByEvent ewe:sadnessdetected1 ;
    ewe:firesAction ewe:play-happy-song .

Table 2. Internal consistency and convergent validity.

Table 3. Cross-loadings for indicators. Bold numbers represent the correlation of each item with its latent variable, which is stronger than with any other variable in the model.

Table 4. Fornell-Larcker criterion and HTMT values. Bold numbers show that the diagonal values of correlation are larger than the other values in the column, which is the expected behavior and validates discriminant validity.

Table 6. Questionnaire for the professors.