Article

Learning Analytics to Determine Profile Dimensions of Students Associated with Their Academic Performance

by Andres Gonzalez-Nucamendi 1, Julieta Noguez 1,*, Luis Neri 1, Víctor Robledo-Rella 1, Rosa María Guadalupe García-Castelán 1 and David Escobar-Castillejos 2

1 Tecnologico de Monterrey, School of Engineering and Science, Ave. Eugenio Garza Sada 2501, Monterrey 64849, Mexico
2 Facultad de Ingeniería, Universidad Panamericana, Augusto Rodin 498, Mexico City 03920, Mexico
* Author to whom correspondence should be addressed.
Appl. Sci. 2022, 12(20), 10560; https://doi.org/10.3390/app122010560
Submission received: 14 September 2022 / Revised: 6 October 2022 / Accepted: 10 October 2022 / Published: 19 October 2022
(This article belongs to the Special Issue Data Analytics and Machine Learning in Education)

Abstract
With the recent advancements of learning analytics techniques, it is possible to build predictive models of student academic performance at an early stage of a course, using students' self-regulation learning and affective strategies (SRLAS) and their multiple intelligences (MI). This process can be conducted to determine the most important factors that lead to good academic performance. A quasi-experimental study on 618 undergraduate students was performed to determine student profiles based on these two constructs: MI and SRLAS. After calibrating the students' profiles, learning analytics techniques were used to study the relationships among the dimensions defined by these constructs and student academic performance using principal component analysis, clustering patterns, and regression and correlation analyses. The results indicate that logical-mathematical intelligence, intrinsic motivation, and self-regulation have a positive impact on academic performance. In contrast, anxiety and dependence on external motivation have a negative effect on academic performance. A priori knowledge of the characteristics of a student sample and its likely behavior predicted by the models may provide both students and teachers with an early-awareness alert that can help teachers design enhanced proactive and strategic decisions aimed at improving academic performance and reducing dropout rates. From the student side, knowledge about their main academic profile will sharpen their metacognition, which may improve their academic performance.

1. Introduction

Learning analytics, understood as the use of data about students to improve their learning, is an approach through which teachers can better understand education, become more conscious of their students, and better capitalize on teaching resources [1]. In particular, the search to provide adaptive learning environments that offer students alternative learning options, such as various types of resources, interactive activities, and personalized services, begins with the challenge of knowing their academic backgrounds, needs, and profiles. Throughout history, educational institutions have been concerned with improving the skills and learning outcomes of students to provide society with well-prepared professionals who are ready to work out solutions and enter the labor market. However, one of the main issues has been the determination of the key factors that influence academic performance in a given learning environment. In this context, education has recently benefited from powerful data analysis tools, such as data mining and learning analytics [2,3].
Educational data mining, like learning analytics, may guide educational institutions in providing suitable learning environments that promote academic success [4,5,6]. Therefore, institutions have started using learning analytics tools to improve services and student outcomes and to promote life-long learning [7,8]. Learning analytics denotes the collection and analysis of data about learners and their instructional and learning contexts to improve learning and learning environments. Therefore, learning analytics is near the top of the priority list for many institutions in higher education. Furthermore, new and evolving technologies are creating more and greater opportunities for the personalization of education. However, poor academic performance and declining student retention in higher education continue to drive the need for more personalized, engaging student experiences to maintain enrollment. Therefore, current technologies are reaching into the education ecosystem and creating opportunities to bring the personalization of education to real environments [9]. This can benefit: (i) students in their learning process and outcomes, (ii) designers of specific programs and courses focused on personalizing learning, (iii) instructors in their performance, and (iv) researchers. All of them can apply learning analytics more effectively to improve teaching as well as learning in higher education [10].
The benefits of learning analytics typically take one of three forms: (a) early-alert warning or reminder systems, so that teachers or institutions can intervene with academic support for students; (b) predictive analytics platforms, so that institutions can monitor the evolution of student learning; and (c) course planning and navigation systems that support course designers by providing relevant data-driven insights. Frequently, these systems obtain data from the student services systems of institutions to identify, for example, students at risk of failing courses or dropping out, student behavior patterns, or points of failure within the system [11,12]. However, in learning environments that are only partially digitized, teachers are required to rely on their pedagogy and transmission of knowledge to enable students to acquire knowledge and develop their skills. This conjunction is shaped by the connection between the specific characteristics of teachers and students. This meeting point helps reveal how teachers and their teaching methods influence the manner in which students feel, think, and act, an aspect that is not always intentionally planned during the teaching process [13].
In previous research, the authors of the current study defined student profiles on the basis of the constructs of multiple intelligences (MI) and self-regulated learning and affective strategies (SRLAS) to identify the most important characteristics related to the academic success of engineering students [14]. In that work, the authors proposed three alternative measures to handle the 16 different dimensions associated with both constructs. The study found that biases due to intrinsic student optimism or pessimism can be significantly reduced in the proposed average measure by considering a normalized measure (NM). Furthermore, the study identified that students with high levels of logical-mathematical intelligence, improved self-regulation, and low levels of anxiety exhibited better academic performance. To complete and extend these previous findings, and to take advantage of new tools for data analysis, the current study presents a formal learning analytics study of the data, including clustering, correlation, regression, and principal component analyses, to examine the relationships among the above-mentioned dimensions and student academic performance.
We present a novel solution that combines student profiles with learning analytics techniques to build predictive academic performance models. We use learning analytics tools to identify the most important dimensions of the MI and SRLAS constructs associated with academic performance. For this reason, the appropriate instruments for determining student profiles according to the MI and SRLAS dimensions were first selected [14]. Then, learning analytics techniques were applied to determine the main profile dimensions associated with academic performance.
The present study addresses the following research question:
Is it possible to identify the impact of the students’ profile dimensions on their academic performance using predictive models based on MI and SRLAS?
Furthermore, we present the following research hypothesis:
The timely determination of students’ profiles based on their MI and SRLAS dimensions to enhance their academic performance can be achieved through learning analytics.
The remainder of the paper is structured as follows: Section 2 discusses related work. Section 3 introduces the MI and SRLAS constructs as adapted from our previous study [14]. The methodology of this research is outlined in Section 4. Section 5 provides the selected learning analytics techniques and shows the main results. Section 6 offers a discussion of the findings and compares them with those of previous studies. Finally, Section 7 concludes and outlines future work.

2. Related Work

Recently, learning analytics has been used to disclose patterns that exert an impact on student learning. Specifically, Van Leeuwen et al. [15] have used learning analytics tools in a computer-supported collaborative learning environment to motivate and guide teachers in providing better interventions and in supporting collaborative groups of students faced with problems regarding cognitive activities. Moreover, the search for successful patterns for timely interventions led Sousa-Vieira et al. [16] to conduct an in-depth examination of student activities on the SocialWire platform. Particularly, this platform programs three types of online activities: (a) pre-class activities, (b) questionnaires before partial exams, and (c) the use of forums for collaborative learning. Comparing the results obtained through various success/failure classifiers, the authors concluded that the student final course grades are best predicted with the pace of the activities in which they participated, that is, the number of events per unit of time, instead of the type of initial activity. Moreover, Teo et al. [17] demonstrated the usefulness of learning analytics methods in analyzing knowledge creation and collaboration in an online electric and electronics engineering course, whereas Kim et al. [18] used learning analytics to support self-regulated learning in asynchronous online courses.
Undoubtedly, technological advances have improved the design and development processes of educational applications. In addition, interest in the use of ICT to enhance and predict academic performance has emerged [19,20,21]. Some studies have focused on identifying hidden knowledge and patterns using data mining techniques [22]. As such, applications and systems have experienced exponential growth in recent years in this field.
Pandey and Taruna [23] developed multiple classifiers using K-nearest neighbors and decision trees to predict academic performance. The authors used a dataset of academic and demographic information from a university in India to predict the academic performance of undergraduate engineering students. They mentioned that the proposed method can also be used for the development of decision support systems.
Hasan et al. [24] used decision tree algorithms to achieve the prediction of academic performance. To test their methodology, records from 22 students that contained academic information and activities in Moodle were used. A mining tool, named the Waikato Environment for Knowledge Analysis and developed at the University of Waikato, New Zealand, was used to evaluate the decision tree algorithm along with access time in Moodle. The authors found that the random forest tree approach obtained better results in this task than comparative decision tree algorithms. Similarly, Hamsa et al. [25] also used decision trees along with their implemented genetic fuzzy systems and Fuzzy Fitness Finder. The authors reported that the results obtained from the decision tree classifier enabled the lecturers to take better care of students. Alternatively, the fuzzy logic approach provided friendlier results, which provided students with mental satisfaction, whereas lecturers could attend them indirectly.
In the same area, Bravo-Agapito et al. [26] examined the use of exploratory factor analysis, multiple linear regression, cluster analysis, and correlation to determine whether students are engaged in a course and to predict their academic performance. The authors used Moodle interaction data, characteristics, and grades of 802 undergraduate students and found that the prediction of academic performance is principally based on four factors, namely, access (variables related to student access to Moodle, including visits to forums and glossaries), questionnaires (visits to and attempts to complete questionnaires), tasks (variables related to consulted and submitted tasks), and age. Moreover, the authors reported that academic performance is inversely related to age.
Trujillo-Torres et al. [27] focused on mathematical competence. They proposed that the perception of students, the relationship between teacher and students, the classroom, gender, teaching-learning methods, and motivation are crucial factors for achieving optimal academic performance. The study intended to determine the optimal algorithm model for predicting the maximum learning gain of students. They employed a 14-item questionnaire, which was validated using the Kaiser–Guttman criterion and Tucker–Lewis Index. The cross-sectional study recruited a total of 2018 high-school students. The results indicated that the role of the classroom and the teacher–student relationship exerted a large influence on mathematics scores. Along a similar research line, Sharabiani et al. [28] designed a prediction model using Bayesian networks to forecast the grades of engineering students in three courses. The study examined the records of 300 students to test the proposed model and used 10 variables, such as demographic data and scores obtained from previous courses. The accuracy exhibited by their approach was compared with other models, such as decision trees, K-nearest neighbors, and naive Bayes. In this direction, D’Uggento et al. [29] also identified the usefulness of adopting a periodic monitoring system, which considers statistical techniques, such as logistic regression, survival analysis, and Cox regression model. These techniques enabled the early detection and modification of factors to achieve optimal results regarding students’ expectations and quality of higher education. The authors used data from 7485 freshmen students enrolled in an academic year.
In the search for factors that exert various impacts on learning, Akhtar et al. [30] used a computer support collaborative learning environment in a computer laboratory course to monitor student participation and to predict student success. The authors found that achievement was positively correlated with course attendance, grouping with peers, and time allocation for task, whereas it was negatively correlated with the seating distance of students relative to the position of the lecturer. Using the linear regression approach, the authors suggested that learning analytics can be used to predict academic performance and to identify students at risk of course failure. Similarly, Atkinson [31] investigated the relationship of learning style, gender, and prior experience in design and technology among trainee teachers in their degree program. Although the results from the learning style groupings (verbal-visual and holistic-analytic) did not meet expectations and, although the conclusions about gender differences lacked a consensus, the study observed a positive relationship between achievement and past experience.
Regarding the role of anxiety on the learning outcomes of students, Chapell et al. [32] investigated the relationship between test anxiety and academic performance in a large sample composed of 4000 undergraduate and 1414 graduate students from different majors enrolled in public universities in the USA. Using descriptive statistics, the authors observed a small but significant inverse relationship between these two variables. Moreover, Vitasari et al. [33] investigated the relationship between study anxiety and academic performance in a large sample of engineering students in Malaysia. The results demonstrated a significant correlation between high levels of anxiety and low levels of academic performance. Furthermore, the study concluded that anxiety during studying is a major predictor of academic performance and exerts a detrimental effect on student academic achievement. In similar research, Balogun et al. [34] scrutinized the moderating role of achievement motivation in the relationship between test anxiety and academic performance among undergraduate students in Nigeria. The results indicated that, although test anxiety and achievement motivation exerted negative and positive effects on academic performance, respectively, achievement motivation significantly moderated these relationships. Therefore, the authors concluded that universities should design appropriate psycho-educational interventions to enhance the achievement motivation of students.
Nowadays, evaluation should be aligned with specific competencies, such that students can exhibit their understanding and abilities through examinations and teachers can improve their teaching [35]. Empirical evidence illustrates that an active learning environment encourages students to be more open and committed. When evaluation considers class participation, quizzes, lab experiments, and presentations, in addition to written exams, students obtain a more well-rounded view of their capabilities. Table 1 summarizes and categorizes the references mentioned above according to their main attributes, methods, and contributions to the field.

3. Multiple Intelligence (MI) and Self-Regulated Learning and Affective Strategies (SRLAS)

Following the description of the state of the art, this section explains the constructs used in this research. The employed instruments have been validated statistically and socially [36]. To discover the variables that may be related to academic performance, we define the student profile on the basis of two constructs: (a) MI and (b) SRLAS, as explained in our previous research [14].

3.1. Multiple Intelligence (MI)

To identify the characteristics of the students from a wide perspective, the study employed the MI theory by Gardner [37]. Scholars have recognized that MI can be related to academic performance [38,39,40,41]. Therefore, the authors developed an instrument, as explained in [14], to assess students’ MI from Gardner’s questionnaire as presented by Armstrong [42]. However, after reviewing the wording of several questions, it was necessary to adapt some of them to the Latin-Mexican culture of our student sample. Table 2 presents the eight MI dimensions considered in this study.

3.2. Self-Regulation Learning and Affective Strategies (SRLAS)

SRLAS refers to the ability of students to recognize and adopt the most appropriate approaches to achieve optimal learning, which can be derived from knowledge about their academic strengths and limitations. It implies consciousness about their skills and areas of opportunity, and the adoption of the appropriate attitude toward self-motivation to reach specific goals, despite the difficulty of attaining such goals. Consequently, appropriate learning strategies, in addition to suitable affective schemes, can play decisive roles in academic performance [43,44,45]. As explained in [14], Gargallo's instrument was adapted to assess these learning strategies [46]. The tool was statistically validated for university students with academic levels and culture (Latin) similar to those of our Mexican student sample. Table 2 presents the eight SRLAS dimensions adopted in this research.

4. Methodology

The methodology of the quasi-experimental study was divided into two phases, as shown in Figure 1: (1) what we call the "previous research", published in [14], in which the MI and SRLAS instruments were defined, the data were collected through surveys published on a website, the reliability of the instruments was validated, and a normalized measure was obtained to remove the bias of the optimism–pessimism effect of the students' self-perception; and (2) the investigation of this work, called the "current research", which includes the learning analytics process applied once the normalized database of student profiles was obtained.
To facilitate the understanding of the first part [14], the main processes that comprised it are briefly described below.

4.1. Previous Research

4.1.1. Instruments’ Adaptation

In the first phase, we decided to determine the main factors to characterize student profiles, using questionnaires adapted from Gardner's multiple intelligences (MI) model [37] and Gargallo's self-regulation learning and affective strategies (SRLAS) [46]. Various instruments were selected and adapted for each questionnaire, as presented in the preliminary studies [47,48]. The criterion used to select the best measure was its predictive power on the students' final grade. In this study, the student profile was defined as the set of values obtained for the eight MI dimensions in the Gardner questionnaire and the eight SRLAS dimensions adapted from the Gargallo questionnaire.

4.1.2. Students’ Profiles Data Collection

Student Sample

We started with a purposive sample of N = 1693 students who completed the adapted MI and SRLAS questionnaires. The sample included primarily engineering students (94%) enrolled in physics, mathematics, and software engineering courses, and a smaller sample (6%) of students enrolled in finance and economics courses.

Collection Method

To collect student responses, a web system with a dashboard was developed to host the MI and SRLAS questionnaires, as illustrated in Figure 2. In this way, each student obtained their profile composed of a set of values for the eight MI and eight SRLAS dimensions (Table 2). However, not all students in the initial sample provided complete responses, and all incomplete entries were excluded from the data analysis. Therefore, the final sample size was reduced to N = 618 students from different courses. Figure 2 shows a view of the basic dashboard created once a student answers each questionnaire; radar graphs display the student's profile for each dimension of each instrument. It is worth mentioning that students are told that both instruments are diagnostic, not defining, and that they show areas of opportunity in their cognitive abilities (MI) and in their self-regulation, learning, and affective strategies (SRLAS).

4.1.3. Reliability with Cronbach’s Alphas

Cronbach's alpha is a method for assessing and evaluating questionnaires [49]. It is computed from the correlations among the questionnaire items that form a scale and is a measure of the internal consistency of those items, that is, how closely related a set of items is as a group. It varies from 0 to 1: the closer it is to 1, the more consistent the items related to that scale are. It is considered a measure of scale reliability. Therefore, to analyze the reliability of the instruments, Cronbach's alpha was calculated for each dimension of each instrument over the student sample. The average value across dimensions was 0.8121, which means that the reliability of the instruments used here is solid [48].
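As a minimal illustration, the sketch below shows how Cronbach's alpha could be computed for one dimension from its item responses; the `responses` matrix (students × items) and the demo values are hypothetical and not part of the original instruments.

```python
import numpy as np

def cronbach_alpha(responses: np.ndarray) -> float:
    """Cronbach's alpha for a (n_students, n_items) matrix of one scale."""
    responses = np.asarray(responses, dtype=float)
    k = responses.shape[1]                          # number of items in the scale
    item_var = responses.var(axis=0, ddof=1).sum()  # sum of item variances
    total_var = responses.sum(axis=1).var(ddof=1)   # variance of the scale totals
    return (k / (k - 1)) * (1.0 - item_var / total_var)

# Example: alpha for one dimension answered by five students on four items
# (illustrative values only).
demo = np.array([[4, 5, 4, 3], [2, 3, 2, 2], [5, 5, 4, 5], [3, 3, 3, 2], [4, 4, 5, 4]])
print(round(cronbach_alpha(demo), 3))
```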

4.1.4. Normalized Measurement

Three alternative measures were proposed to assess students' levels in the different dimensions of both constructs (MI and SRLAS). It was possible to significantly reduce biases in the average measure due to students' optimism or pessimism when considering a normalized measure [14]. Therefore, an NM was defined in order to decrease this effect on the average measure. For a given student and a given construct (MI or SRLAS), the NM was defined as the ratio of the average measure of each dimension ($AM_i$) to the mean value of the eight dimensions of that construct, as given in Equation (1):
$$NM_i = \frac{AM_i}{\text{Questionnaire Mean (student)}} \qquad (1)$$
where i = 1–8 for each construct. In this way, for each student of our sample, a numerical value was calculated for each one of the 16 dimensions in Table 2: 8 for MI and 8 for SRLAS. This allowed us to determine each student profile.
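A short sketch of Equation (1) in code is given below; the 8-element `avg_measures` vector holding one student's $AM_i$ values for one construct is a hypothetical input name.

```python
import numpy as np

def normalized_measure(avg_measures) -> np.ndarray:
    """Normalized measure NM_i = AM_i / mean(AM_1..AM_8) for one student
    and one construct (MI or SRLAS)."""
    avg_measures = np.asarray(avg_measures, dtype=float)
    return avg_measures / avg_measures.mean()

# Example with illustrative average measures for the eight dimensions of one construct
print(normalized_measure([3.2, 4.1, 2.8, 3.9, 4.4, 3.0, 3.6, 4.0]).round(2))
```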

4.2. Current Research

For this study, the Department of Student Services of our institution provided us with the database corresponding to the sample of students with the final grades. Once the database of student profiles was integrated and normalized, based on the research question, the following data analytics process was applied.

4.2.1. Research Question

As mentioned in the Introduction section, the research question selected in this second phase is:
Is it possible to identify the impact of the students’ profile dimensions on their academic performance using predictive models based on MI and SRLAS?
To test this question, the learning analytics techniques described below were applied.

4.2.2. Exploratory Analysis

Principal component analysis (PCA) was used first since it provides a visual tool for correlations in a multivariate context (many variables). PCA shows how the variables correlate considering the joint effects among all the variables. To facilitate visualization, it was decided to use the circle of correlations of all the variables of each instrument for the final sample of N = 618 students.
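The following sketch outlines how such a PCA biplot and correlation circle could be produced with scikit-learn and Matplotlib; the `profiles` matrix of normalized dimension scores and the `dim_names` labels are hypothetical stand-ins for the study's data.

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

def correlation_circle(profiles, dim_names):
    """Biplot of students (dots) and dimensions (arrows) on the principal plane."""
    X = StandardScaler().fit_transform(profiles)
    pca = PCA(n_components=2).fit(X)
    scores = pca.transform(X)                                         # student coordinates
    loadings = pca.components_.T * np.sqrt(pca.explained_variance_)   # dimension arrows

    fig, ax = plt.subplots()
    ax.scatter(scores[:, 0], scores[:, 1], s=8, alpha=0.4)
    for (x, y), name in zip(loadings, dim_names):
        ax.arrow(0, 0, x, y, head_width=0.03, color="red")
        ax.annotate(name, (x, y))
    ax.set_xlabel(f"PC1 ({pca.explained_variance_ratio_[0]:.0%})")
    ax.set_ylabel(f"PC2 ({pca.explained_variance_ratio_[1]:.0%})")
    plt.show()
    return pca
```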

4.2.3. Explanatory Analysis

Although clustering is commonly applied as an exploratory technique, in this case, since the final grades of each student in their respective courses were available, each group formed could be associated with different grade levels.

4.2.4. Predictive Model

In order to identify the variables that have the greatest influence on the students’ school performance, a multiple regression analysis was first carried out. However, due to the presence of multicollinearity between the variables, it was necessary to analyze some correlations between the variables that explain the results of the multiple regression predictive model.

4.2.5. Results and Hypothesis Testing

The following section shows the results and analysis, and Section 6 discusses the findings regarding the significance of the explanatory variables, which corroborate the hypothesis.

5. Results and Analysis Using Learning Analytics Techniques

To test our research question, a descriptive analysis of the sample using learning analytics techniques was performed to generalize the results observed in the target population. The selected tools were as follows: (a) Principal Component Analysis (PCA), (b) Clustering, and (c) Correlation and Regression Analysis. Through PCA, a graphical visualization of the relationships between the different variables or dimensions is sought. This analysis tool shows a visual perspective that provides important details of the relationships among the variables. As a complementary tool, clustering allows groups of students with similar characteristics to be formed, which can be subsequently analyzed in terms of their academic performance. The analysis of the structure of these clusters provides relevant information for our goal. In addition, correlation and regression analysis provide generalizable regularities to the target population. However, as discussed below, highly correlated explanatory variables cannot be present in the same regression model due to the multicollinearity effect, which distorts the regression coefficients and the significance of the observed relations. Consequently, it is important to complement the regression analysis with correlation analysis to better understand the dynamics through which the various regression models are being formed.
Once the NM was chosen as the best way to characterize students’ profiles in a more objective manner, the selected learning analytical techniques were applied to identify those dimensions that could be related to academic performance. These analyses are presented and discussed next.

5.1. Principal Component Analysis (PCA) and Correlation’s Circle

Considering all the student profiles obtained in our sample, biplot diagrams of principal components for MI and SRLAS were built. In Figure 3a,b, the principal planes containing the largest amount of possible information with only two axes are presented for the MI and SRLAS dimensions, respectively. These axes are named PC1 and PC2 for convenience. In these plots, each dot represents the profile of a given student. Each dimension is represented with an arrow, and the angle between any two arrows corresponds to the degree of correlation between these two dimensions. Therefore, a small angle corresponds to a strong correlation between these two dimensions, a 90° angle implies zero correlation, and a 180° angle indicates total anti-correlation. Moreover, points close to a specific arrow represent students with preponderance in the corresponding dimension. Large arrows or far points from the center indicate dimensions or students best represented in this plot, respectively.
In Figure 3a, it is observed that LogMath, Lin, BodKin, and Intra are quite correlated, while the remaining dimensions are less correlated with them, with Mus being almost independent of Spa and Inter. Overall, the PCA biplots show relative independence among the MI dimensions, which was one of Gardner’s hypotheses when he proposed his multiple intelligences’ scheme [37].
Regarding the SRLAS biplot (Figure 3b), the dimensions form two distinct groups. On the one hand, Anx and ExtMot are highly correlated; on the other hand, the remaining dimensions are correlated among themselves, but not with Anx and ExtMot. The numerical values of the correlations indicate that IntMot and SelfReg present the highest correlation within the SRLAS dimensions.

5.2. Clustering

In this section, the formation of clusters of students having similar profiles according to their MI and/or SRLAS dimensions is presented. The clustering processes were programmed in R using a hierarchical classification algorithm coupled with the Ward method. Several trials with different numbers of clusters were tested, and four clusters were chosen because they presented the derived information most clearly. For each cluster formed, the average grades for the courses in which the students were enrolled, as registered in our database, were included to compare academic performance among the different clusters.
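The study reports running the hierarchical Ward clustering in R; the sketch below reproduces the same idea with SciPy and pandas under assumed column names (`FinalGrade` and the eight dimension columns are hypothetical).

```python
import pandas as pd
from scipy.cluster.hierarchy import linkage, fcluster

def cluster_profiles(profiles: pd.DataFrame, n_clusters: int = 4) -> pd.DataFrame:
    """Ward hierarchical clustering of student profiles, summarized per cluster."""
    dims = profiles.drop(columns=["FinalGrade"])
    Z = linkage(dims.values, method="ward")                    # Ward linkage tree
    labels = fcluster(Z, t=n_clusters, criterion="maxclust")   # cut into n_clusters groups
    summary = profiles.assign(Cluster=labels).groupby("Cluster").agg(
        Students=("FinalGrade", "size"),
        AvgGrade=("FinalGrade", "mean"),
        StdGrade=("FinalGrade", "std"),
    )
    return summary.round(2)
```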

5.2.1. MI Clustering

In Figure 4a,b, the populations of the four MI clusters and their corresponding average grades are presented in the pie chart and the bar diagram, respectively. Table 3 also presents these values, along with the average grades and their standard deviations.
From Table 3, it is observed that the average grades for Clusters 1, 2, and 3 are similar but relatively different from that of Cluster 4. Therefore, paired one-tail t-tests between the clusters were performed to validate these results. It was found that there was no statistical difference among the average grades of Clusters 1, 2, and 3. On the other hand, Cluster 4 was compared with the union of Clusters 1, 2, and 3. The following hypotheses were used:
$$H_0: \mu_{123} = \mu_4$$
$$H_a: \mu_{123} > \mu_4$$
It was observed that the average grade for Clusters 1, 2, and 3 is statistically higher than the average grade for Cluster 4 with a p-value = 0.0580 (Table 4).
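As an illustration of this comparison (sketched as a one-tailed two-sample Welch test; the grade arrays are hypothetical), the hypotheses above could be tested as follows:

```python
import numpy as np
from scipy import stats

def one_tailed_ttest(grades_123: np.ndarray, grades_4: np.ndarray) -> float:
    """One-tailed test of H0: mu_123 = mu_4 against Ha: mu_123 > mu_4."""
    t_stat, p_two = stats.ttest_ind(grades_123, grades_4, equal_var=False)
    # Halve the two-sided p-value when the observed difference points in the
    # direction of the alternative hypothesis.
    return p_two / 2 if t_stat > 0 else 1 - p_two / 2
```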
To investigate the possible differences among the MI clusters in more detail, a bar plot on a 1–5 scale, with the average values obtained for each MI dimension, was constructed (Figure 5a). Cluster 3, consisting of 34 students and having a slightly higher average grade, presents the smallest values for all MI dimensions compared with the other clusters. To emphasize the differences among clusters and to facilitate the analysis of the data, a normalized radar chart is shown in Figure 5b, formed by setting the difference between the maximum and minimum values of the sample equal to 1 and interpolating the remaining values of the sample.
As mentioned above, Cluster 3 (red line in Figure 5) has the lowest values for all MI dimensions, despite exhibiting the (slightly) highest average grade. Cluster 1 shows the highest values for almost all dimensions, while Clusters 2 and 3 present intermediate values. The results indicate that Cluster 4, composed of 88 students with the lowest average grade, presents a very high value for the intrapersonal dimension but a relatively low value for the interpersonal dimension. These results suggest that the MI dimensions alone are not sufficient to categorically explain the differences between students' grades.
Consequently, in this study, the information derived from MI is complemented with that provided by SRLAS to determine the combination of dimensions that could better explain the academic performance of undergraduate engineering students.

5.2.2. SRLAS Clustering

A similar clustering analysis was performed for the SRLAS dimensions, where four clusters were also determined after analyzing the results for different numbers of clusters. Figure 6a,b present the populations of the four clusters formed based on the SRLAS dimensions and their corresponding average grades using a pie chart and a bar diagram, respectively. These values are also included in Table 5, along with the average final grades and their standard deviations.
It was found that the average grade for Cluster 1 is higher than those for Clusters 2, 3, and 4. Similar to the MI dimensions, one-tail t-tests between clusters were performed to determine whether the differences in the average final grades are statistically different. The study observed no statistically significant differences among Clusters 2, 3, and 4. Therefore, we opted to compare the average grade for Cluster 1 with those for Clusters 2, 3, and 4 combined (Table 6). The following hypotheses were used:
$$H_0: \mu_{234} = \mu_1$$
$$H_a: \mu_{234} < \mu_1$$
Table 6 conclusively indicates that the difference between the average grades for Cluster 1 and the union of Clusters 2, 3, and 4 is statistically significant, with a very small p-value (less than $1 \times 10^{-4}$).
To study the differences among clusters in more detail, a bar plot on a 1–5 scale was constructed with the average values obtained for each of the SRLAS dimensions (Figure 7a), similar to the one built for the MI dimensions. The corresponding radar diagram, emphasizing the differences among the clusters' dimensions, is presented in Figure 7b.
As can be seen in Figure 7b, Cluster 1, which has the best academic performance, is the cluster with the lowest anxiety values and the lowest need for extrinsic motivation.
It is important to mention that, although the MI and SRLAS constructs were combined in preliminary analyses, this integration is not recommended due to the information overload produced by the 16 dimensions involved. No additional benefits were obtained, so it was decided to study these two constructs separately in this work.

5.3. Regression Analysis and Correlations

Although the clustering analysis may provide insight into the dimensions that have a greater impact on academic performance, the findings emerging from this analysis are not yet conclusive. Regression analysis generalizes the results on a probabilistic basis: it uses a sample of the data to conduct hypothesis tests and draw conclusions about the entire population. To verify the statistical rigor of the relationships between academic performance and students' MI and SRLAS profiles, regressions were performed between students' final grades and the dimensions of the following: (a) MI, (b) SRLAS, and (c) a combination of both. Minitab was used to select the best set of explanatory variables with a stepwise method, and the best least-squares regressions were determined. The coefficients in the equations below represent the relative weights of the dimensions that appear in the regression equation, and the figures in parentheses indicate their statistical significance, given by their corresponding p-values. Positive coefficient values indicate dimensions that have a favorable impact on the final grades, while negative values have the opposite effect.
1. Regression 1 (Final grade vs. MI dimensions)
$$\mathrm{FinalGrade} = 73.10 + 6.66\,\mathrm{LogMath} - 2.15\,\mathrm{Mus} - 2.08\,\mathrm{Lin} - 2.06\,\mathrm{Nat} + \ldots \qquad (2)$$
(p-values: 0.0000, 0.0000, 0.0010, 0.0342, 0.0260)
N = 618, R² = 16.01%
2. Regression 2 (Final grade vs. SRLAS dimensions)
$$\mathrm{FinalGrade} = 80.50 - 2.76\,\mathrm{Anx} + 3.88\,\mathrm{SelfReg} - 1.60\,\mathrm{ExtMot} - 1.34\,\mathrm{SocInt} + \ldots \qquad (3)$$
(p-values: 0.0000, 0.0001, 0.0010, 0.0164, 0.0532)
N = 618, R² = 16.50%
3. Regression 3 (Final grade vs. MI ∪ SRLAS dimensions)
$$\mathrm{FinalGrade} = 74.06 - 2.94\,\mathrm{Anx} + 4.81\,\mathrm{LogMath} - 2.41\,\mathrm{Lin} + 2.58\,\mathrm{SelfReg} - 2.17\,\mathrm{Mus} + \ldots \qquad (4)$$
(p-values: 0.0000, 0.0000, 0.0000, 0.0127, 0.0020, 0.0008)
N = 618, R² = 19.05%
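The study reports fitting these models with Minitab's stepwise selection; as a rough sketch of an equivalent fit, the snippet below estimates the combined model of Equation (4) with statsmodels, assuming a hypothetical DataFrame whose columns hold the final grade and the normalized dimension scores.

```python
import pandas as pd
import statsmodels.api as sm

def fit_final_grade_model(data: pd.DataFrame):
    """OLS fit of the final grade on the dimensions retained in Equation (4)."""
    predictors = ["Anx", "LogMath", "Lin", "SelfReg", "Mus"]
    X = sm.add_constant(data[predictors])        # intercept plus explanatory variables
    model = sm.OLS(data["FinalGrade"], X).fit()
    print(model.summary())                       # coefficients, p-values, and R^2
    return model
```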
Based on the regression equations, the following conclusions can be derived: (a) logical-mathematical intelligence exerts an important positive impact on students' final grades; (b) the self-regulation dimension also has a positive influence on the final grade, whereas the anxiety and external motivation dimensions have a negative relationship; and (c) considering both MI and SRLAS constructs together, the most important dimensions that present a positive relationship with academic performance are the logical-mathematical and self-regulation dimensions, whereas anxiety has the most negative influence on academic performance. In multiple regression, correlated explanatory variables compete for a place in the model, so the significance of the variables already included changes when a new variable is introduced.
Consequently, a correlation analysis is essential to understand the dynamics of the stepwise algorithm and to obtain an appropriate regression analysis. Table 7, Table 8 and Table 9 present correlation matrices, which enable a better understanding of the three regressions mentioned above. The entries illustrate the correlation coefficient r between the corresponding dimensions, as well as the correlations of the final grade with each of the MI and/or SRLAS dimensions.
In Table 7, it is observed that all intelligences are positively correlated, where the highest correlation observed is between LogMath and Intra (r = 0.74). This means that, on average, the higher a student’s LogMath dimension is, the higher their Intra dimension, and vice versa. Moreover, it is found that the LogMath dimension is the most correlated with the remaining ones. The average of the correlations among LogMath and the remaining intelligences is r = 0.57. On the contrary, the lowest correlation is found for Mus and Inter (r = 0.15). This means that, if a student has a high Mus dimension, it does not provide much information about their Inter dimension. Finally, the Mus dimension is the least correlated with the remaining seven dimensions with an average correlation of r = 0.36.
On the other hand, Table 8 indicates that the SRLAS dimensions are also positively correlated. Notably, Anx and ExtMot are the least correlated with all the other dimensions; however, they have a high correlation with each other (r = 0.50, as discussed in Section 5.1). The average correlation of Anx with the remaining variables is r = 0.19, and for ExtMot it is r = 0.23. This means that Anx and ExtMot have a low linear association with the other dimensions. The SRLAS dimension that presents the highest correlation with the remaining seven dimensions is SelfReg, with an average correlation of r = 0.54. Consequently, the dimension that best represents the SRLAS construct as a whole is SelfReg. Individually, the highest correlation is found between SelfReg and IntMot, with r = 0.75. In other words, on average, the more intrinsically motivated the students are, the more self-regulated they are, and vice versa. The lowest correlation coefficient is found between ExtMot and InfProc, with r = 0.019, which is an indicator of the level of independence between the students' information processing capacity and their extrinsic motivation. All these findings agree reasonably well with the results obtained with the PCA presented in Section 5.1.
Finally, Table 9 presents the main cross-correlations between MI and SRLAS dimensions. In general, we observed low correlations, which can be interpreted as evidence of the level of independence between the MI and SRLAS constructs.
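For reference, a correlation table like Tables 7–9, together with the average cross-correlations quoted above, could be obtained with pandas as sketched below (the DataFrame and its column names are hypothetical).

```python
import numpy as np
import pandas as pd

def correlation_summary(dims: pd.DataFrame) -> pd.Series:
    """Pearson correlation matrix and each dimension's average correlation
    with the remaining dimensions (e.g., r = 0.57 for LogMath in the text)."""
    corr = dims.corr(method="pearson")
    print(corr.round(2))                          # full correlation matrix
    off_diag = corr.copy()
    np.fill_diagonal(off_diag.values, np.nan)     # drop self-correlations
    return off_diag.mean(axis=1).round(2)         # average correlation per dimension
```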

6. Discussion

The PCA biplots demonstrate the relative independence of several of the MI (Figure 3a) and SRLAS (Figure 3b) dimensions, which may help to interpret the characteristics of the students of a given section. However, it is important to emphasize that each construct is composed of eight dimensions, whereas the 2D biplot only enables the visualization of two composite dimensions (i.e., principal components). The two corresponding principal components for MI and SRLAS constitute 49% and 60%, respectively, of their total inertia. In other words, the main planes of MI and SRLAS contain 49% and 60%, respectively, of the total information that can be obtained from the eight dimensions of each construct. Therefore, using only the distribution of students on the biplots to interpret the dimensions of the entire sample does not yet by itself provide enough information for teachers to implement the most appropriate pedagogical actions for their students.
Even though MI dimensions may be considered more related to cognitive indicators, and SRLAS dimensions to behavioral aspects of the student, our results are not meant to provide an elaborated dashboard to monitor and evaluate the actual performance of a given student regarding a particular task. We do not yet have documented interventions derived from this basic dashboard (Figure 2). It provides only a first “picture” of the student profile regarding the 16 considered dimensions. To measure the actual effect of the MI and SRLAS profiles on the academic performance of the student separately is a complicated task, since actual learning is not limited to the learning process, and the final academic performance is also influenced by other complex factors, such as family environment, personal feelings and student personality.
To better track the effect of cognitive and behavioral learning analytics (LA), several authors have developed cognitive and/or behavioral dashboards to assist in the learning process [50,51,52]. According to Yousef and Khatirty (2021) [51], the key objective of a behavioral LA dashboard is to gather data in a single repository, from multiple channels and networks, used to generate context models that provide students with customized input and personal recommendations. In the case of a cognitive LA dashboard, four levels are considered: (a) description: observing events and other data to obtain a detailed picture of a student's activity; (b) diagnosis: descriptive elements needed to evaluate an outcome; (c) prediction: setting the likely outcome based on certain elements; and (d) recommendation: setting how to achieve a desired learning outcome from a specific element.
Sedrakyan et al. [50] implemented a system of dashboards that allows students and teachers to continuously monitor the academic and behavioral status of the students, as well as the evolution of their academic performance on a given task. They consider the regulatory mechanisms underlying the learning processes to provide the student with effective feedback (epistemic, corrective, and/or suggestive) to advance efficiently and effectively in the learning cycle, including aspects of self-regulation controlled by the student. The most comprehensive feedback includes both cognitive and conceptual aspects. However, the detailed mechanisms for user intervention in feedback remain challenging. In this sense, Wiggins and McTighe's (1998) [53] "backward instructional design" provides richer opportunities for tracking the whole learning process. In addition, recent research shows increased interest in exploring biofeedback opportunities based on multi-modal data collected from various wearable sensors and audio/video streams [54].
Considering the MI clustering process alone, it is difficult to clearly identify the dimensions that exerted an important influence on academic performance. Cluster 3, with a slightly better average final grade, is also the group with the lowest values in all dimensions, whereas Cluster 4, with the lowest average grade, did not provide a suitable pattern leading to sound conclusions for its MI dimensions (Table 3 and Table 4, and Figure 5). This is partially due to the fact that the differences in grades among the clusters obtained were rather small and statistically insignificant. Therefore, the results of this research suggest that the MI construct alone lacked sufficient strength to clearly explain the differences between the grades. This conclusion is consistent with findings of other authors in the sense that there is not a simple correlation between MI dimensions and student academic performance. In fact, in the past decades, the relationship between MI dimensions and academic performance has been considered a field of research among educators from various areas at different academic levels. It has been argued that a simple correlation is lacking among these variables [55,56,57,58]. Notably, Lee [59] suggests that, apart from clearly defining the MI in the surveys, it is also a good idea to ask senior students to participate in the study, since they have gone through the entire academic engineering spectrum. Lee’s work indicates that most students had some level of mixed MI, except the musical one. Whereas the logical-mathematical and linguistic skills were found as the most influential dimensions, musical and body-kinesthetic intelligences were the ones perceived with the least applicability to predict academic performance.
Regarding the results of the SRLAS clustering, it can be appreciated that Cluster 1, with a significantly higher average grade (Table 5), has the highest IntMot and InfProc values, relatively high values for SelfReg, SocInt, FitMood, and InfSearch, and the smallest Anx and ExtMot values (Figure 6b). On the other hand, Cluster 3, with the lowest average grade, has high values for all dimensions, including Anx and ExtMot. Cluster 4, with an intermediate average grade, presents relatively small values for all dimensions. Finally, Cluster 2, also with an intermediate average grade, reveals intermediate values for all the dimensions. Therefore, it is suggested that relatively high values of IntMot, InfProc, InfSearch, SocInt, SelfReg, and FitMood, combined with low values of ExtMot and Anx, can promote better academic performance. Likewise, students presenting high levels of anxiety (Anx) and a strong need for external motivation (ExtMot) may face difficulties in achieving appropriate learning outcomes. This assertion is somewhat validated by the teaching experience of the authors and the expectations from such student behavior.
To reinforce and complement the information obtained from the clustering analysis, a regression analysis was performed. From the regression equation obtained for the MI dimensions (Equation (2)), it was found that logical-mathematical intelligence had a critically positive impact on students' final grades, as expected, since the sample was composed mostly of engineering students. In addition, from the regression equation for the SRLAS construct (Equation (3)), it was identified that students' self-regulation also had an important positive impact on their final grades, while students' anxiety and the need for external motivation had a negative impact on students' outcomes. Overall, taking the MI and SRLAS constructs together in the regression analysis (Equation (4)), it can be concluded that the most important dimensions that presented a positive relationship with academic performance are logical-mathematical intelligence and self-regulation, while students' anxiety has the most negative impact on academic performance.
There have been multiple efforts to predict student academic outcomes in order to improve academic performance. The contexts, the variables used, the analytical techniques applied, and the objectives pursued have all been diverse. For example, Akhtar et al. [30], through analysis of variance, correlation, and regression, found that academic performance was positively correlated with course attendance, grouping with peers in the collaborative learning environment, and the time spent on learning tasks. However, it was negatively correlated with the distance from the student's seat to the position of the teacher. They consider their findings to be important in detecting students who are at risk of failing a course. In contrast, the present study focuses on variables regarding the MI and SRLAS dimensions.
To examine academic performance in detail, Matzavela et al. [60] suggest that, apart from pedagogical efforts, complementing the analysis with a specific student profile may be helpful in analyzing student performance. Information on gender, level of education of parents, their income, birth order in the family, and the current working conditions of the students, among others, may be useful. Along this idea, Aman et al. [61] believe that socioeconomic factors, academic history, and personal interests also play a crucial role in predicting academic performance in developing countries.
During the present research, it was found that multiple intelligences (MI), which may be considered similar to learning styles, were less effective than SRLAS in predicting academic performance. These results are similar to those reported by Atkinson [31], who observed that learning styles are not decisive for predicting student academic performance. However, Atkinson's work shows that students' achievements are positively correlated with previous experience.
The learning analytics techniques applied in this work were similar to those used by Bravo-Agapito et al. [26], who applied principal component analysis, correlation analysis, factor analysis, and clustering as exploratory techniques and multiple regression for their predictive analysis. In contrast, the present study did not require exploratory factor analysis because the instruments used here had already been validated. The input data also differ: they analyzed student interaction data from Moodle, such as accesses, questionnaires, tasks, and student age, while our database uses student profiles based on the MI and SRLAS constructs.
The effect of self-regulation learning on academic performance was also studied by Kim et al. [18], who analyzed student data in asynchronous online courses using classification techniques such as decision trees and random forests. These authors used clustering to identify groups of self-regulated, partially self-regulated, and non-self-regulated students. After observing better academic performance among self-regulated students, they used the random forest classifier to deduce rules for the development of student self-regulation. The present research's findings are in line with these authors in the sense that self-regulation is an important aspect of academic success. However, unlike their work, this study included mostly engineering students, and self-regulation learning was estimated through one of the SRLAS dimensions.
Another important aspect to consider in learning prediction models is the management of student anxiety. The relationship between test anxiety and academic performance has been investigated by several authors. Chapell et al. [32] found a small but significant inverse relationship between these two variables through basic statistics on students from different majors and universities in the United States. Similarly, Vitasari et al. [33] reported a significant correlation between high anxiety levels and low academic performance in a large sample of engineering students in Malaysia, concluding that anxiety is an important predictor that has a detrimental effect on student academic performance. These findings are consistent with the results obtained in this study, where anxiety is one of the eight dimensions of our SRLAS construct with a significant effect on academic performance. The negative relationship between anxiety and academic performance is found in the clustering analysis, where the clusters characterized by a low level of anxiety had significantly higher performance than the other clusters. In addition, in the regression analysis, it was identified that anxiety is a variable with a very significant negative effect on the prediction of academic performance. Furthermore, in the present study, a high correlation between students' anxiety and their need for extrinsic motivation was also found.
The detrimental role of anxiety in student academic outcomes is also noted in the work of Garg et al. [21], who use a machine learning-based model for predicting students' performance in higher education. They also used visualizations and classification techniques to find significant factors for building a predictive model. They found that support vector machine, random forest, and naive Bayes techniques may be trained effectively on limited samples to produce appropriate prediction performance.
With regard to motivation, Trujillo-Torres et al. [27] investigated mathematical competence in middle schools using a 14-item questionnaire. They found that the teacher–student relationship and motivation were crucial factors in achieving optimal academic performance. In addition, Balogun et al. [34] found that, although anxiety and motivation had a negative and a positive effect on academic performance, respectively, it was possible to moderate anxiety through motivation. Therefore, they concluded that universities should design adequate psychoeducational interventions to improve motivation and increase student performance. In this regard, the present research included two types of motivation, intrinsic and extrinsic, in the SRLAS construct with undergraduate students. It was identified that intrinsic motivation was positively related to students' academic performance, while extrinsic motivation, defined as the need of students for external motivation, presented an inverse relationship with students' academic outcomes.
Consequently, the methodology proposed in this research will allow instructors to propose mechanisms for designing tools for two main types of support actions. First, it will be possible to: (a) provide early warnings to increase student success, (b) build models of student behavior to predict academic performance, (c) increase self-reflection and self-awareness of responsibilities and roles in the teaching-learning process, and (d) design applications to improve timely feedback and evaluation processes. Second, it will also be possible to offer course recommendations in adaptive systems, providing tools to predict dropouts, thus increasing student retention, as well as making suggestions on the optimal use of educational resources. The contributions of the predictive values of the constructs used in the present work to define the students' profiles may also be combined with the results of diagnostic tests applied at the beginning of the courses to offer personalized and adaptive learning environments, in both face-to-face and online modes. Furthermore, in a next phase, we plan to determine individualized predictions by means of classification algorithms, in order to know whether each student has a risk profile for failure and to take the appropriate support actions at the appropriate time.

7. Conclusions and Future Work

Based on the Multiple Intelligences (MI) and Self-regulation Learning and Affective Strategies (SRLAS) constructs, obtained by means of previously validated surveys, clusters of students with similar profiles were formed and characterized. Instructors may therefore have the opportunity to classify their students from the beginning of the course, aiming to make proactive and timely decisions to improve their teaching strategies. The regression analysis results suggest that undergraduate engineering students are likely to obtain better grades when: (a) they have higher logical-mathematical intelligence (LogMath), (b) they are intrinsically motivated to learn the subject (IntMot), and (c) they are self-regulated (SelfReg). Cluster and correlation analyses further showed that students with higher grades are those who can better handle their anxiety (Anx), do not show a strong need for external motivation (ExtMot), and have appropriate skills to process information (InfProc). Consequently, it can be concluded that engineering students who lack logical-mathematical intelligence, require strong extrinsic motivation, and show relatively high levels of anxiety may face difficulties with their academic performance. This conclusion fits well with our research hypothesis.
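To make the regression step concrete, the sketch below fits an ordinary least-squares model of the final grade on the profile dimensions named above using statsmodels. It is illustrative only, with hypothetical column names, and is not the exact specification used in this study.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical data frame with the MI/SRLAS dimensions named in the conclusions
# and the final course grade; column names are illustrative.
df = pd.read_csv("student_profiles.csv")

X = sm.add_constant(df[["log_math", "int_mot", "self_reg", "anx", "ext_mot"]])
model = sm.OLS(df["final_grade"], X).fit()

# Positive coefficients are expected for log_math, int_mot, and self_reg, and
# negative coefficients for anx and ext_mot, mirroring the reported findings.
print(model.summary())
```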
The methodology of the present research would allow instructors to provide early warnings to increase student success, build models of student behavior to predict academic performance, increase self-reflection and self-awareness of responsibilities and roles in the teaching–learning process, design applications that improve timely feedback and evaluation processes, offer course recommendations in adaptive systems to predict dropouts, and make optimal use of educational resources. Future work should focus on individualized prediction by means of classification algorithms to identify students with a risk profile of failure and take timely supporting actions. With this information, a dynamic dashboard could be built to follow student academic performance and skills acquisition. Similar studies with student samples from other disciplines, such as social sciences, humanities, or business, are required to broaden the horizon.

Author Contributions

Conceptualization, A.G.-N. and J.N.; data curation, A.G.-N. and D.E.-C.; formal analysis, A.G.-N., J.N., and L.N.; funding acquisition, J.N.; investigation, D.E.-C., V.R.-R., and R.M.G.G.-C.; methodology, A.G.-N., J.N., and L.N.; project administration, J.N. and R.M.G.G.-C.; resources, J.N.; software, D.E.-C. and A.G.-N.; supervision, V.R.-R. and R.M.G.G.-C.; validation, L.N. and V.R.-R.; visualization, J.N. and D.E.-C.; writing—original draft, A.G.-N., J.N., and L.N.; writing—review and editing, L.N., V.R.-R., R.M.G.G.-C., and D.E.-C. All authors have read and agreed to the published version of the manuscript.

Funding

This study was funded by the NOVUS Grant (PEP No. PHHT032-17CX00003), Writing Lab, and TecLabs of the Institute for the Future of Education from the Tecnologico de Monterrey.

Data Availability Statement

Not applicable.

Acknowledgments

The authors would like to acknowledge Vicerrectoría de Investigación y Posgrado, the Advanced Artificial Intelligence Research Group, the CyberLearning and Data Science Laboratory, and the Science Department of Tecnologico de Monterrey, Mexico City Campus.

Conflicts of Interest

The authors declare that they have no conflict of interest.

References

1. Clow, D. An overview of learning analytics. Teach. High. Educ. 2013, 18, 683–695.
2. Dietz-Uhler, B.; Hurn, J.E. Using Learning Analytics to Predict (and Improve) Student Success: A Faculty Perspective. J. Interact. Online Learn. 2013, 12, 17–26.
3. Namoun, A.; Alshanqiti, A. Predicting Student Performance Using Data Mining and Learning Analytics Techniques: A Systematic Literature Review. Appl. Sci. 2021, 11, 237.
4. Tao, T.; Sun, C.; Wu, Z.; Yang, J.; Wang, J. Deep Neural Network-Based Prediction and Early Warning of Student Grades and Recommendations for Similar Learning Approaches. Appl. Sci. 2022, 12, 7733.
5. Sclater, N.; Peasgood, A.; Mullan, J. Learning Analytics in Higher Education: A Review of UK and International Practice; Technical Report; Jisc: Bristol, UK, 2016.
6. Baashar, Y.; Alkawsi, G.; Mustafa, A.; Alkahtani, A.A.; Alsariera, Y.A.; Ali, A.Q.; Hashim, W.; Tiong, S.K. Toward Predicting Student’s Academic Performance Using Artificial Neural Networks (ANNs). Appl. Sci. 2022, 12, 1289.
7. Bienkowski, M.; Brecht, J.; Klo, J. The Learning Registry: Building a Foundation for Learning Resource Analytics. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge, Vancouver, BC, Canada, 29 April–2 May 2012; ACM: New York, NY, USA, 2012; pp. 208–211.
8. Gray, C.C.; Perkins, D. Utilizing early engagement and machine learning to predict student outcomes. Comput. Educ. 2019, 131, 22–32.
9. Williams, K.C.; Lowendahl, J.M.; Thayer, T.L.; Morgan, G. Predicts 2017: Education Gets Personal. 2021. Available online: https://www.gartner.com/doc/3519719?ref=mrktg-srch (accessed on 20 April 2022).
10. Nunn, S.; Avella, J.; Kanai, T.; Kebritchi, M. Learning Analytics Methods, Benefits, and Challenges in Higher Education: A Systematic Literature Review. Online Learn. 2016, 20, 13–29.
11. Sclater, N. Learning Analytics Explained, 1st ed.; Routledge: Oxfordshire, UK, 2017.
12. Lepi, K. The 4 Levels Of Learning Analytics. 2021. Available online: https://schoolleadership20.com/forum/topics/the-4-levels-of-learning-analytics-by-katie-lepi (accessed on 20 April 2022).
13. Fuentes, S.; Rosário, P. Mediar Para la Autorregulación del Aprendizaje: Un Desafío Educativo Para el Siglo XXI; Facultad de Ciencias de la Educación, Universidad Central de Chile e Instituto Internacional para el Desarrollo Cognitivo, INDESCO: Santiago de Chile, Chile, 2013.
14. Gonzalez-Nucamendi, A.; Noguez, J.; Neri, L.; Robledo-Rella, V.; García-Castelán, R.M.G.; Escobar-Castillejos, D. The prediction of academic performance using engineering student’s profiles. Comput. Electr. Eng. 2021, 93, 107288.
15. van Leeuwen, A.; Janssen, J.; Erkens, G.; Brekelmans, M. Teacher regulation of cognitive activities during student collaboration: Effects of learning analytics. Comput. Educ. 2015, 90, 80–94.
16. Sousa-Vieira, M.E.; López-Ardao, J.C.; Fernández-Veiga, M.; Ferreira-Pires, O.; Rodríguez-Pérez, M.; Rodríguez-Rubio, R.F. Prediction of learning success/failure via pace of events in a social learning network platform. Comput. Appl. Eng. Educ. 2018, 26, 2047–2057.
17. Teo, H.J.; Johri, A.; Lohani, V. Analytics and patterns of knowledge creation: Experts at work in an online engineering community. Comput. Educ. 2017, 112, 18–36.
18. Kim, D.; Yoon, M.; Jo, I.H.; Branch, R.M. Learning analytics to support self-regulated learning in asynchronous online courses: A case study at a women’s university in South Korea. Comput. Educ. 2018, 127, 233–251.
19. Mat, U.; Buniyamin, N.; Arsad, P.M.; Kassim, R. An overview of using academic analytics to predict and improve students’ achievement: A proposed proactive intelligent intervention. In Proceedings of the 2013 IEEE 5th Conference on Engineering Education (ICEED), Kuala Lumpur, Malaysia, 4–5 December 2013; pp. 126–130.
20. Zulkifli, F.; Mohamed, Z.; Azmee, N.A. Systematic Research on Predictive Models on Students’ Academic Performance in Higher Education. Int. J. Recent Technol. Eng. 2019, 8, 357–363.
21. Garg, A.; Lilhore, U.K.; Ghosh, P.; Prasad, D.; Simaiya, S. Machine Learning-based Model for Prediction of Student’s Performance in Higher Education. In Proceedings of the 2021 8th International Conference on Signal Processing and Integrated Networks (SPIN), Noida, India, 26–27 August 2021; pp. 162–168.
22. Shahiri, A.M.; Husain, W.; Rashid, N.A. A Review on Predicting Student’s Performance Using Data Mining Techniques. Procedia Comput. Sci. 2015, 72, 414–422.
23. Pandey, M.; Taruna, S. Towards the integration of multiple classifier pertaining to the Student’s performance prediction. Perspect. Sci. 2016, 8, 364–366.
24. Hasan, R.; Palaniappan, S.; Raziff, A.R.A.; Mahmood, S.; Sarker, K.U. Student Academic Performance Prediction by using Decision Tree Algorithm. In Proceedings of the 2018 4th International Conference on Computer and Information Sciences (ICCOINS), Kuala Lumpur, Malaysia, 13–14 August 2018; pp. 1–5.
25. Hamsa, H.; Indiradevi, S.; Kizhakkethottam, J.J. Student Academic Performance Prediction Model Using Decision Tree and Fuzzy Genetic Algorithm. Procedia Technol. 2016, 25, 326–332.
26. Bravo-Agapito, J.; Romero, S.J.; Pamplona, S. Early prediction of undergraduate Student’s academic performance in completely online learning: A five-year study. Comput. Hum. Behav. 2021, 115, 106595.
27. Trujillo-Torres, J.M.; Hossein-Mohand, H.; Gómez-García, M.; Hossein-Mohand, H.; Hinojo-Lucena, F.J. Estimating the Academic Performance of Secondary Education Mathematics Students: A Gain Lift Predictive Model. Mathematics 2020, 8, 2101.
28. Sharabiani, A.; Karim, F.; Sharabiani, A.; Atanasov, M.; Darabi, H. An enhanced bayesian network model for prediction of students’ academic performance in engineering programs. In Proceedings of the 2014 IEEE Global Engineering Education Conference (EDUCON), Istanbul, Turkey, 3–5 April 2014; pp. 832–837.
29. D’Uggento, A.M.; d’Ovidio, F.D.; Toma, E.; Ceglie, R. A Framework for Detecting Factors Influencing Students’ Academic Performance: A Longitudinal Analysis. Soc. Indic. Res. 2020, 156, 389–407.
30. Akhtar, S.; Warburton, S.; Xu, W. The use of an online learning and teaching system for monitoring computer aided design student participation and predicting student success. Int. J. Technol. Des. Educ. 2017, 27, 251–270.
31. Atkinson, S. Factors Influencing Successful Achievement in Contrasting Design and Technology Activities in Higher Education. Int. J. Technol. Des. Educ. 2006, 16, 193–213.
32. Chapell, M.; Blanding, Z.; Silverstein, M.; Takahashi, M.; Newman, B.A.; Gubi, N.M. Test anxiety and academic performance in undergraduate and graduate students. J. Educ. Psychol. 2005, 97, 268–274.
33. Vitasari, P.; Wahab, M.; Othman, A.; Herawan, T.; Sinnadurai, S. The relationship between study anxiety and academic performance among engineering students. Procedia-Soc. Behav. Sci. 2010, 8, 490–497.
34. Balogun, A.; Balogun, S.; Onyencho, C. Test anxiety and academic performance among undergraduates: The moderating role of achievement motivation. Span. J. Psychol. 2017, 20, E14.
35. Alonso-Nuez, M.; Gil-Lacruz, A.; Rosell-Martínez, J. Assessing evaluation: Why student engages or resists to active learning? Int. J. Technol. Des. Educ. 2021, 31, 1001–1017.
36. Noguez, J.; Neri, L.; González-Nucamendi, A.; Robledo-Rella, V. Characteristics of self-regulation of engineering students to predict and improve their academic performance. In Proceedings of the 2016 IEEE Frontiers in Education Conference (FIE), Erie, PA, USA, 12–15 October 2016; pp. 1–8.
37. Gardner, H. Frames of Mind: The Theory of Multiple Intelligences, 3rd ed.; Hachette UK: London, UK, 2011.
38. Sulaiman, T.; Abdurahman, A.R.; Rahim, S.S.A. Teaching Strategies Based on Multiple Intelligences Theory among Science and Mathematics Secondary School Teachers. Procedia-Soc. Behav. Sci. 2010, 8, 512–518.
39. Petruta, G.P. Multiple Intelligences Stimulated within the Lessons by the Practicant Students from the Faculty of Sciences. Procedia-Soc. Behav. Sci. 2013, 76, 676–680.
40. Piaw, C.Y.; Ishak, A.; Yaacob, N.A.; Said, H.; Pee, L.E.; Kadir, Z.A. Can Multiple Intelligence Abilities Predict Work Motivation, Communication, Creativity, and Management Skills of School Leaders? Procedia-Soc. Behav. Sci. 2014, 116, 4870–4874.
41. Constantinescu, R.S. The Theory of Multiple Intelligences—Applications in Mentoring Beginning Teachers. Procedia-Soc. Behav. Sci. 2014, 116, 3345–3349.
42. Armstrong, T. Multiple Intelligences in the Classroom, 4th ed.; Association for Supervision & Curriculum Development: Alexandria, VA, USA, 2017.
43. Arriola, M.A. Relación Entre Estrategias de Aprendizaje y Autorregulación: Un Modelo Explicativo. Ph.D. Thesis, Universidad Iberoamericana, Ciudad de Mexico, Mexico, 2001.
44. Zimmerman, B.; Schunk, D. Self-regulated learning and performance. In Handbook of Self-Regulation of Learning and Performance; Routledge: Oxfordshire, UK, 2016; pp. 1–12.
45. Schunk, D.H. Learning Theories: An Educational Perspective, 6th ed.; Pearson: London, UK, 2012.
46. Gargallo, B.; Suárez-Rodríguez, J.; Pérez-Pérez, C. El cuestionario CEVEAPEU. Un instrumento para la evaluación de las estrategias de aprendizaje de los estudiantes universitarios. Relieve Rev. Electrónica Investig. Y Evaluación Educ. 2009, 15, 1–31.
47. Noguez Monroy, J.; Escárcega Centeno, D.; Escobar Castillejos, D. Validación de instrumento para Inteligencias Múltiples y Estrategias de Aprendizajes. In Memorias CIIE 2015; Tecnologico de Monterrey: Monterrey, Mexico, 2015; pp. 1000–1005.
48. Neri, L.; Noguez Monroy, J.; Alanis Funes, G. Validación de instrumento para determinar Habilidades de Autorregulación de los Alumnos. In Memorias CIIE 2015; Tecnologico de Monterrey: Monterrey, Mexico, 2015; pp. 994–999.
49. Cronbach, L.J. Coefficient alpha and the internal structure of tests. Psychometrika 1951, 16, 297–334.
50. Sedrakyan, G.; Malmberg, J.; Verbert, K.; Järvelä, S.; Kirschner, P.A. Linking learning behavior analytics and learning science concepts: Designing a learning analytics dashboard for feedback to support learning regulation. Comput. Hum. Behav. 2020, 107, 105512.
51. Yousef, A.M.F.; Khatiry, A.R. Cognitive versus behavioral learning analytics dashboards for supporting learner’s awareness, reflection, and learning process. Interact. Learn. Environ. 2021, 1–17.
52. Susnjak, T.; Ramaswami, G.S.; Mathrani, A. Learning analytics dashboard: A tool for providing actionable insights to learners. Int. J. Educ. Technol. High. Educ. 2022, 19, 12.
53. Wiggins, G.; McTighe, J. What is backward design. In Understanding by Design; ASCD: Alexandria, VA, USA, 1998; Chapter 1; pp. 7–19.
54. Rojas Melendez, J.A.; Sedrakyan, G.; Colpaert, P.; Vander Sande, M.; Verborgh, R. Supporting Sustainable Publishing and Consuming of Live Linked Time Series Streams. In Proceedings of The Semantic Web: ESWC 2018 Satellite Events, Crete, Greece, 3–7 June 2018; Gangemi, A., Gentile, A.L., Nuzzolese, A.G., Rudolph, S., Maleshkova, M., Paulheim, H., Pan, J.Z., Alam, M., Eds.; Springer International Publishing: Cham, Switzerland, 2018; pp. 148–152.
55. Naderi, H.; Abdullah, R.; Hamid, T.; Jamaluddin, S. Intelligence and Academic Achievement: An Investigation of Gender Differences. Life Sci. J. 2010, 7, 83–87.
56. Kandeel, R. Multiple Intelligences Patterns among Students at King Saud University and Its Relationship with Mathematics’ Achievement. J. Educ. Learn. 2016, 5, 94–106.
57. Hernández, C.; Prada, R.; Rincon, G. Multiple Intelligences and Academic Performance in Basic Education Students: An Analysis of Main Components. J. Phys. Conf. Ser. 2019, 1388, 012047.
58. Mel, B.N. The Gardner’s Multiple Intelligences and Academic Performance Among the Second-Semester Mechanical Engineering Students in Politeknik Kuching Sarawak: A Correlation Analysis. Int. J. Adv. Res. Educ. Soc. 2021, 3, 132–141.
59. Lee, W. Board 46: Multiple Intelligences and Undergraduate Engineering Education. In Proceedings of the 2019 ASEE Annual Conference & Exposition, Tampa, FL, USA, 16–19 June 2019; p. 6.
60. Matzavela, V.; Alepis, E. Decision tree learning through a Predictive Model for Student Academic Performance in Intelligent M-Learning environments. Comput. Educ. Artif. Intell. 2021, 2, 100035.
61. Aman, F.; Rauf, A.; Ali, R.; Iqbal, F.; Khattak, A.M. A Predictive Model for Predicting Students Academic Performance. In Proceedings of the 2019 10th International Conference on Information, Intelligence, Systems and Applications (IISA), Patras, Greece, 15–17 July 2019; pp. 1–4.
Figure 1. Overview of the methodology.
Figure 2. Example of a basic student dashboard showing the results of the MI (left) and SRLAS (right) questionnaires for one student.
Figure 3. PCA biplot diagrams and correlations of (a) MI and (b) SRLAS dimensions.
Figure 4. (a) MI clusters’ sizes; (b) MI clusters’ average grades.
Figure 5. Comparison of MI clusters: (a) bar diagram; (b) normalized radar.
Figure 6. (a) SRLAS clusters’ sizes; (b) SRLAS clusters’ average final grades.
Figure 7. Comparison of SRLAS clusters: (a) bar diagram; (b) normalized radar.
Table 1. Relation of article attributes, methods, and contribution to the field for Related work.
AuthorArticle Attribute *
ABCDEFGHIJK
Van Leeuwen et al. [15]**
Sousa-Vieira et al. [16]* *
Teo et al. [17] *
Kim et al. [18] *
Mat et al. [19] **
Zulkifli et al. [20] * *
Garg et al. [21] * **
Shahiri et al. [22] * *
Pandey & Tarura [23] * ****
Hasan et al. [24] * *** *
Hamsa et al. [25]* * * * *
Bravo-Agapito et al. [26]* * * * *
Trujillo-Torres et al. [27] ****
Sharabiani et al. [28] * *****
D’Uggento et al. [29] * ** **
Akhtar et al. [30]** * * *
Atkinson [31]* ** *
Chapell et al. [32] ** * *
Vitasari et al. [33] ** *
Balogun et al. [34] ** *
Alonso-Nuez et al. [35] **
* Notes: A = Computer-supported learning environment with student activities; B = Aimed for collaborative group of students; C = Aimed for early and timely intervention; D = Self-regulated learning in asynchronous online courses; E = Use of medium or large student dataset; F = Use of final grades or admission scores or Math scores, or summative learning outcomes, or achievement data, or Grade Point Average (GPA), or active learning; G = Use of predictive models, or Educational Data mining or classifiers; H = Use of information regarding: demographic, or school environment, or gender, or student–teacher relationship, or study time, or teaching resources, or time spent on task, or class attendance, or sitting location, or learning style or prior experience; I = Use of Random Forest tree algorithm, or Decision trees, or Fuzzy logic, or Bayesian networks; J = Analysis using: exploratory factors, or multiple regression, or clustering, or logistic regression, or survival, or Cox regression, or linear regression, or hierarchical multiple regression; K = Anxiety or achievement motivation.
Table 2. Dimensions for the MI and SRLAS constructs.
MI construct: Dimension (Abbreviation) | SRLAS construct: Dimension (Abbreviation)
Linguistic (Lin) | Intrinsic motivation (IntMot)
Logical-Mathematical (LogMath) | Extrinsic motivation (ExtMot)
Spatial (Spa) | Fitness and Mood (FitMood)
Bodily-Kinesthetic (BodKin) | Anxiety (Anx)
Musical (Mus) | Self-Regulation (SelfReg)
Interpersonal (Inter) | Social interaction (SocInt)
Intrapersonal (Intra) | Information search and selection strategies (InfSearch)
Naturalistic (Nat) | Information use and processing strategies (InfProc)
Table 3. Size, Average Final Grade, and Standard Deviation of clusters formed with MI dimensions.
                    | Cluster 1 | Cluster 2 | Cluster 3 | Cluster 4
N                   | 292       | 204       | 34        | 88
Average final grade | 76.3      | 76.6      | 77.6      | 74.1
Std. Dev.           | 11.9      | 14.1      | 14.8      | 15.2
Table 4. p-values for one-tail t-tests’ comparisons among clusters’ average grades (MI dimensions).
                    | Clusters 1 + 2 + 3 | Cluster 4
N                   | 530                | 88
Average final grade | 76.5               | 74.1
Std. Dev.           | 12.9               | 15.2
p-value             | 0.0580
Table 5. Size, Average Final Grade, and Standard Deviation of clusters formed with SRLAS dimensions.
                    | Cluster 1 | Cluster 2 | Cluster 3 | Cluster 4
N                   | 156       | 261       | 174       | 27
Average final grade | 81.3      | 75.6      | 72.7      | 74.4
Std. Dev.           | 11.4      | 13.5      | 13.5      | 12.3
Table 6. p-values for one-tail t-tests comparisons among clusters’ average grades (SRLAS).
                    | Cluster 1 | Clusters 2 + 3 + 4
N                   | 156       | 462
Average final grade | 81.3      | 74.4
Std. Dev.           | 11.4      | 13.5
p-value             | 0.0000
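Tables 4 and 6 report one-tail t-tests comparing the average grades of the grouped clusters. A minimal sketch of such a comparison is given below, assuming Welch's unequal-variance form of the test and placeholder grade lists; the original analysis may have used a different variant.

```python
from scipy import stats

# Placeholder grade lists for illustration only; in the actual analysis these
# would be the final grades of students in clusters 1-3 and in cluster 4.
grades_123 = [82, 75, 68, 90, 77, 71, 85, 79]
grades_4 = [70, 66, 74, 81, 63, 72]

# Welch's t-test (unequal variances), one-tailed: H1 is that clusters 1-3 have a
# higher mean grade than cluster 4 (the `alternative` keyword requires SciPy >= 1.6).
t_stat, p_value = stats.ttest_ind(
    grades_123, grades_4, equal_var=False, alternative="greater"
)
print(f"t = {t_stat:.3f}, one-tailed p = {p_value:.4f}")
```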
Table 7. Correlations table for MI dimensions.
         | Final Grade | Intra | Inter | Lin  | Spa  | LogMath | Mus  | BodKin
Intra    | 0.04
Inter    | 0.06        | 0.39
Lin      | 0.01        | 0.68  | 0.52
Spa      | 0.00        | 0.38  | 0.64  | 0.43
LogMath  | 0.17        | 0.74  | 0.50  | 0.72 | 0.49
Mus      | −0.08       | 0.40  | 0.15  | 0.43 | 0.27 | 0.41
BodKin   | 0.07        | 0.55  | 0.44  | 0.54 | 0.46 | 0.61    | 0.49
Nat      | −0.04       | 0.57  | 0.28  | 0.45 | 0.33 | 0.53    | 0.37 | 0.52
Table 8. Correlations table for SRLAS dimensions.
          | Final Grade | IntMot | ExtMot | FitMood | Anx   | SelfReg | SocInt | InfSearch
IntMot    | 0.130
ExtMot    | −0.182      | 0.108
FitMood   | 0.014       | 0.505  | 0.267
Anx       | −0.208      | 0.090  | 0.500  | 0.186
SelfReg   | 0.096       | 0.751  | 0.247  | 0.627   | 0.181
SocInt    | −0.029      | 0.515  | 0.159  | 0.453   | 0.191 | 0.548
InfSearch | 0.025       | 0.604  | 0.284  | 0.525   | 0.130 | 0.720   | 0.445
InfProc   | 0.104       | 0.705  | 0.019  | 0.412   | 0.026 | 0.697   | 0.561  | 0.632
Table 9. Correlations table for SRLAS versus MI dimensions.
          | Final Grade | Lin    | LogMath | Mus   | Nat
IntMot    | 0.130       | 0.212  | 0.237   | 0.169 | 0.108
ExtMot    | −0.182      | 0.038  | −0.014  | 0.226 | 0.188
Anx       | −0.208      | −0.031 | −0.111  | 0.116 | 0.103
SelfReg   | 0.096       | 0.264  | 0.250   | 0.255 | 0.205
SocInt    | −0.029      | 0.165  | 0.184   | 0.162 | 0.154
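Correlation tables such as Tables 7–9 can be reproduced directly from the per-student dimension scores. The following sketch uses pandas with hypothetical file and column names to compute the MI, SRLAS, and cross-construct Pearson correlation matrices; it illustrates the kind of computation involved rather than the exact script used in this study.

```python
import pandas as pd

# Hypothetical data frame with one row per student: final grade plus the MI and
# SRLAS dimension scores (column names are illustrative).
df = pd.read_csv("student_profiles.csv")

# Pearson correlations of the MI dimensions with the final grade (cf. Table 7).
mi = ["final_grade", "intra", "inter", "lin", "spa", "log_math", "mus", "bod_kin", "nat"]
print(df[mi].corr().round(2))

# Pearson correlations of the SRLAS dimensions with the final grade (cf. Table 8).
srlas = ["final_grade", "int_mot", "ext_mot", "fit_mood", "anx",
         "self_reg", "soc_int", "inf_search", "inf_proc"]
print(df[srlas].corr().round(3))

# Cross-correlations of selected SRLAS dimensions with the final grade and
# selected MI dimensions (cf. Table 9).
cross = df[mi + srlas[1:]].corr().loc[
    ["int_mot", "ext_mot", "anx", "self_reg", "soc_int"],
    ["final_grade", "lin", "log_math", "mus", "nat"],
]
print(cross.round(3))
```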
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
