Adaptive and Personalized Learning in Higher Education: An Artificial Intelligence-Based Approach
Abstract
1. Introduction
2. Theoretical Framework
2.1. Foundations of Adaptive Learning and Scalability Mechanisms
2.2. Ethical, Technical, and Governance Challenges
2.3. Teacher Agency and Pedagogical Limits
3. Materials and Methods
3.1. Phase 1: Diagnostic Analysis
- Selection of numeric variables relevant to higher education.
- Removal of variables with more than 50% missing values.
- Replacement of infinite values and imputation of remaining missing data using constant-value substitution (zero-filling), consistent with the categorical nature of the ENAPE indicators.
- Elimination of zero-variance variables.
- Standardization of all retained variables using z-score normalization.
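The cleaning pipeline above can be sketched with pandas and scikit-learn (the libraries pinned in Appendix A). This is a minimal illustration only: the example frame and column names are invented stand-ins, not the actual ENAPE variables or thresholds beyond those stated above.

```python
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler

def preprocess(df: pd.DataFrame) -> pd.DataFrame:
    """Apply the Phase 1 cleaning steps to a numeric survey frame."""
    # 1. Keep only numeric variables.
    df = df.select_dtypes(include=[np.number])
    # 2. Remove variables with more than 50% missing values.
    df = df.loc[:, df.isna().mean() <= 0.50]
    # 3. Replace infinite values, then zero-fill remaining missing data
    #    (constant-value substitution, per the categorical ENAPE indicators).
    df = df.replace([np.inf, -np.inf], np.nan).fillna(0)
    # 4. Eliminate zero-variance variables.
    df = df.loc[:, df.std(ddof=0) > 0]
    # 5. Standardize all retained variables (z-scores).
    z = StandardScaler().fit_transform(df)
    return pd.DataFrame(z, columns=df.columns, index=df.index)

# Illustrative frame: one column is mostly missing, one has zero variance.
raw = pd.DataFrame({
    "PB3_6": [1, 2, 3, 4],
    "mostly_missing": [np.nan, np.nan, np.nan, 1.0],
    "const": [5, 5, 5, 5],
    "PA3_6": [1.0, np.inf, 2.0, np.nan],
})
clean = preprocess(raw)
print(list(clean.columns))  # 'mostly_missing' and 'const' are dropped
```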
3.1.1. Instruments and Variables
- Impact of Education on Daily Life (Factor 1): Measured through Likert-scale items assessing the perceived utility of education in decision-making, quality of life, and employment opportunities (e.g., items PB3_6 to PB3_9).

- Access and Continuity (Factor 2): Comprising indicators of enrollment status and semester progression (e.g., PA3_3, PA3_6), operationalized as categorical variables reflecting retention.
- Educational Performance (Factor 3): Aggregating variables related to academic remediation, such as the need for supplemental training or extraordinary exams (e.g., PA3_7 series).
- Perception of Assessment (Factor 4): Grouping items related to evaluation formats, including project-based and multimedia assessments (e.g., PA3_8 series).
3.1.2. Analysis Strategy
- Descriptive Analysis. Univariate statistics and frequency distributions were computed to assess the distributional characteristics and identify potential anomalies prior to multivariate modeling.
- Dimensionality Reduction and Factor Extraction. An Exploratory Factor Analysis (EFA) was conducted using the MinRes extraction method and Varimax rotation. Sampling adequacy was verified through Bartlett’s Test of Sphericity and the Kaiser–Meyer–Olkin (KMO) index. A fixed statistical significance level was applied to all inferential tests in this diagnostic phase. The number of factors was determined through the convergence of (i) the eigenvalue criterion, (ii) inspection of the scree plot, (iii) theoretical interpretability, and (iv) retention of salient loadings. This procedure resulted in a four-factor solution representing latent dimensions of learning impact, access and progression, academic trajectories, and evaluation practices.
- Correlation Analysis. Pearson’s correlation coefficients were computed among the factor scores to examine the interrelationships between latent constructs. These associations provide empirical grounding for understanding how adaptive learning models might engage distinct dimensions of students’ educational experiences.
3.2. Phase 2: Quasi-Experimental Pilot
3.2.1. Participants and Context
- Civil Engineering Cluster (): Students enrolled in the “Sustainable Road Project” course (Proyecto Vial Sustentable), characterized by project-based learning requirements.
- Nutrition Cluster (): Students enrolled in the “Cellular and Molecular Biology” course (Biología Celular y Molecular), focused on theoretical and factual knowledge acquisition.
3.2.2. Intervention Procedure: The ActivAI Workflow
- Diagnostic Input. Prior to the instructional sessions, the instructors entered qualitative diagnostic observations into a cloud-based database (Airtable). These entries included the group identifier, course subject, and specific behavioral or cognitive traits observed (e.g., “The group is quiet but works well individually,” or “They struggle with team integration”).
- Automated Orchestration. The instructor accessed the ActivAI interface and requested an activity for a specific group. The system utilized a custom API action to retrieve the stored diagnostic comments from Airtable.
- Content Generation (RAG). Using the retrieved diagnostic data as a contextual prompt, the GPT generated a personalized teaching–learning activity. To prevent “hallucinations” and ensure educational quality, the model was restricted to referencing three uploaded evidence-based frameworks: authentic assessment elements, active learning strategies, and peer instruction methodologies.
- Human Validation and Application. The instructor reviewed the AI-generated proposal for safety and accuracy, following the decision tree protocols for safe AI use of Sabzalieva and Valentini (2023), and subsequently applied the tailored activity in the classroom environment.
- Average age of the audience.
- Type of focus (e.g., workshop, higher-education activity, etc.).
- Learning objective.
- Topic.
- Subtopic.
- Prior knowledge (if required).
- Activity duration.
- Group name (to retrieve teachers’ comments).
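The parameter list above feeds the request sent to the custom GPT. A minimal, hypothetical sketch of how those fields and the retrieved Airtable comments could be assembled into a prompt is shown below; the function name, field keys, and wording are illustrative assumptions, since the actual ActivAI action configuration is not published.

```python
from typing import Dict, List

def build_activity_prompt(params: Dict[str, str], comments: List[str]) -> str:
    """Assemble a generation request from the instructor's parameters plus
    the diagnostic comments retrieved for the group (field names mirror the
    input list above but are illustrative)."""
    lines = [
        f"Average age of the audience: {params['average_age']}",
        f"Type of focus: {params['focus']}",
        f"Learning objective: {params['objective']}",
        f"Topic: {params['topic']} / Subtopic: {params['subtopic']}",
        f"Prior knowledge: {params.get('prior_knowledge', 'none required')}",
        f"Activity duration: {params['duration']}",
        f"Group: {params['group_name']}",
        "Diagnostic observations from the instructor:",
    ]
    lines += [f"- {c}" for c in comments]
    lines.append("Design one teaching-learning activity grounded only in the "
                 "uploaded frameworks (authentic assessment, active learning, "
                 "peer instruction).")
    return "\n".join(lines)

prompt = build_activity_prompt(
    {"average_age": "21", "focus": "higher-education activity",
     "objective": "apply sustainability criteria to road design",
     "topic": "Sustainable Road Project", "subtopic": "materials selection",
     "duration": "50 min", "group_name": "CivilEng-A"},
    ["The group is quiet but works well individually."],
)
print(prompt.splitlines()[0])
```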
3.2.3. Data Collection and Measures
- Satisfaction with Personalization (4 items): Assessed the perception of tailored feedback and pacing (e.g., “The pace of some sessions was adjusted to our understanding”).
- Relevance and Utility (5 items): Measured the connection between activities and real-world contexts or student interests (e.g., “Examples connected with real situations we mentioned”).
- Perceived Impact on Learning (4 items): Evaluated progress awareness and the utility of feedback for improvement (e.g., “Feedback helped me improve the next submission”).
- Participation and Engagement (4 items): Gauged behavioral involvement when choices were offered versus rigid pathways (e.g., “I participated more when I could choose the work approach”).
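Subscale scores like those reported in Section 4.4 (mean and standard deviation per dimension and cluster) can be computed with a pandas group-by. The responses below are synthetic placeholders on an assumed 1-5 Likert scale, not the study's data.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
# Illustrative subscale scores (1-5 Likert) for the two course clusters;
# the real instrument aggregates 17 items across four dimensions.
n = 30
df = pd.DataFrame({
    "cluster": ["Civil Engineering"] * 18 + ["Nutrition"] * 12,
    "satisfaction": rng.integers(3, 6, size=n),
    "relevance": rng.integers(3, 6, size=n),
    "impact": rng.integers(3, 6, size=n),
    "participation": rng.integers(3, 6, size=n),
})

# Mean and SD per dimension and cluster, as in the pilot results table.
summary = df.groupby("cluster").agg(["mean", "std"]).round(2)
print(summary)
```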
3.2.4. Statistical Analysis Strategy
4. Results
4.1. Descriptive Analysis
4.2. Component and Factor Analysis Structure
- Factor 1 (Impact of Education on Daily Life): Perceived utility of education in decision-making, quality of life, and employment opportunities.
- Factor 2 (Access and Continuity): Enrollment status, semester progression, and academic retention.
- Factor 3 (Educational Performance): Behaviors related to academic remediation, including supplemental training, extraordinary exams, and course repetition.
- Factor 4 (Perception of Assessment): Evaluation practices, such as written/oral assessments and the submission of multimedia evidence.
4.3. Correlation Analysis
- (1) Factor 1–Factor 4 (Impact of Education on Daily Life × Perception of Assessment): r = 0.72
- (2) Factor 2–Factor 3 (Access and Continuity × Educational Performance): r = 0.91
4.4. Results of the Quasi-Experimental Pilot (Phase 2)
Instructor Observations (Engagement)
5. Discussion
5.1. Inputs: The Strategic Foundation
- Student Diagnostics: These capture dynamic learner characteristics such as prior knowledge and motivation (Shemshack & Spector, 2020), alongside structural access conditions identified in the ENAPE analysis. Importantly, this component avoids discredited learning-styles approaches (Goswami & Bryant, 2012).
- Pedagogical Parameters: Defined by the instructor, who acts as the strategic orchestrator (Molenaar, 2022). These parameters include learning goals, relevance criteria, and curriculum alignment.
- Ethical Guardrails: This foundational layer establishes strict protocols for data privacy, bias mitigation, and content verification (Zhu et al., 2025) prior to any AI processing.
5.2. Orchestration and Design: The Human-in-the-Loop
5.2.1. Diagnostic Interpretation
5.2.2. Master Prompt Design (Instructional Design & Prompting)
5.2.3. Safety Guardrails Configuration
- “Do not use real student names in examples.”
- “Verify factual content using the validated RAG database.”
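A guardrail configuration of this kind reduces to a small decision function applied before any output reaches students. The sketch below encodes the two rules quoted above; the `Draft` type, the roster of protected names, and the `rag_verified` flag are hypothetical, standing in for whatever the retrieval layer actually provides.

```python
from dataclasses import dataclass

@dataclass
class Draft:
    text: str
    rag_verified: bool  # assumed to be set upstream by the RAG check

ROSTER = {"Ana García", "Luis Pérez"}  # hypothetical protected names

def passes_guardrails(draft: Draft) -> bool:
    """Apply the configured rules: no real student names in examples, and
    factual content must have been verified against the validated RAG
    database. Drafts failing either rule are flagged for human review."""
    if any(name in draft.text for name in ROSTER):
        return False
    if not draft.rag_verified:
        return False
    return True

ok = Draft("Design a peer-instruction round on cell membranes.", True)
bad = Draft("Ana García will present first.", True)
print(passes_guardrails(ok), passes_guardrails(bad))
```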
5.3. Scalability Toolkit
- Content Variation Engine: Transforms the teacher’s base instruction into multiple differentiated versions (simplified, analogy-based, critical-analysis, etc.).
- Large Language Model (LLM): Executes the variation by processing teacher-defined prompts (e.g., GPT or Claude).
- GPT Filter (RAG & Safety Decision Tree): Inspired by Sabzalieva and Valentini (2023), this layer validates whether the content is factual and safe before reaching students. Outputs failing verification are flagged.
- Teacher Dashboard: Presents aggregated progress patterns rather than isolated interactions, enabling data-informed pedagogical decisions.
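The Content Variation Engine described above can be pictured as a template layer that wraps the teacher's base instruction into one prompt per differentiated version, each of which would then be sent to the LLM. The variant directives and function below are illustrative assumptions; the actual engine's prompt wording is not published, and the LLM call itself is omitted.

```python
from typing import Dict

VARIANTS: Dict[str, str] = {
    "simplified": "Rewrite the task in plain language with one worked example.",
    "analogy-based": "Introduce the task through an everyday analogy.",
    "critical-analysis": "Reframe the task so students must justify and "
                         "critique their approach.",
}

def vary(base_instruction: str) -> Dict[str, str]:
    """Produce one differentiated prompt per version of the teacher's base
    instruction; each would then go to the LLM (e.g., GPT or Claude)."""
    return {
        name: f"{directive}\nBase instruction: {base_instruction}"
        for name, directive in VARIANTS.items()
    }

prompts = vary("Estimate the asphalt volume for a 2 km sustainable road.")
print(sorted(prompts))  # ['analogy-based', 'critical-analysis', 'simplified']
```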
5.4. Classroom Implementation
5.5. Diagnostic & Clustering Tool
5.6. Feedback Loop
5.7. Limitations and Ethical Considerations
5.8. Significance of the Proposed Model
6. Conclusions
6.1. Theoretical and Practical Contributions
6.2. Recommendations
- For Institutions: Shift professional development from basic digital literacy to Pedagogical AI Orchestration. Training should prioritize enabling teachers to interpret diagnostic data and design “Master Prompts,” rather than simply operating software.
- For Policy: Establish institutional Ethical Guardrails (as defined in the Input Phase) prior to AI deployment. Policies must explicitly prohibit the use of “black-box” AI for high-stakes assessment, ensuring that all AI-generated feedback passes through a human validation filter.
- For Instructional Design: Adopt a Cluster-First approach for scalability. Instead of attempting unsustainable one-to-one personalization, educators should use AI to target dynamic subgroups—as evidenced by the Civil Engineering vs. Nutrition variance—allowing efficient yet tailored scaffolding.
6.3. Limitations and Future Research
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
Appendix A
| Variable | F1 | F2 | F3 | F4 |
|---|---|---|---|---|
| PA3_2 | −0.013 | −0.013 | −0.028 | 0.039 |
| PA3_3_NIVEL | −0.036 | 0.011 | 0.006 | −0.015 |
| PA3_3_SEMESTRE | −0.051 | 0.077 | −0.021 | −0.048 |
| PA3_4 | −0.144 | −0.960 | −0.001 | −0.097 |
| PA3_6 | 0.147 | 0.849 | 0.003 | 0.094 |
| PA3_7_1 | −0.005 | 0.246 | 0.245 | 0.715 |
| PA3_7_2 | 0.004 | 0.256 | 0.191 | 0.827 |
| PA3_7_3 | 0.008 | 0.283 | 0.199 | 0.800 |
| PA3_8_1 | 0.051 | 0.636 | 0.036 | 0.046 |
| PA3_8_2 | 0.073 | 0.490 | 0.093 | 0.085 |
| PA3_8_3 | 0.071 | 0.604 | 0.026 | 0.031 |
| PA3_8_4 | 0.060 | 0.632 | 0.094 | 0.036 |
| PA3_8_5 | 0.061 | 0.596 | 0.033 | 0.020 |
| PA3_8_6 | 0.074 | 0.691 | 0.085 | 0.040 |
| PA3_8_7 | 0.154 | 0.919 | −0.008 | 0.099 |
| PA3_8_8 | 0.147 | 0.951 | −0.010 | 0.090 |
| PB3_1 | −0.971 | −0.091 | −0.218 | −0.011 |
| PB3_3 | 0.780 | 0.062 | 0.164 | 0.036 |
| PB3_5_NIVEL | 0.966 | 0.091 | 0.220 | 0.008 |
| PB3_6 | 0.831 | 0.094 | 0.205 | −0.014 |
| PB3_8 | 0.708 | 0.085 | 0.164 | −0.014 |
| PB3_9_1 | 0.306 | 0.003 | 0.474 | 0.281 |
| PB3_9_2 | 0.324 | −0.016 | 0.528 | 0.311 |
| PB3_9_3 | 0.329 | 0.007 | 0.516 | 0.273 |
| PB3_10_1 | 0.327 | 0.062 | 0.587 | 0.003 |
| PB3_10_2 | 0.289 | 0.065 | 0.677 | −0.010 |
| PB3_10_3 | 0.231 | 0.062 | 0.766 | 0.017 |
| PB3_10_4 | 0.255 | 0.068 | 0.718 | 0.075 |
| PB3_10_5 | 0.377 | 0.052 | 0.570 | 0.081 |
| PB3_11_1 | 0.760 | 0.064 | 0.214 | 0.044 |
| PB3_11_2 | 0.769 | 0.090 | 0.213 | 0.005 |
| PB3_11_3 | 0.755 | 0.093 | 0.193 | 0.025 |
| PB3_11_4 | 0.862 | 0.100 | 0.216 | −0.023 |
| PB3_11_5 | 0.971 | 0.090 | 0.214 | 0.012 |
| PB3_12_1 | 0.790 | 0.090 | 0.200 | 0.012 |
| PB3_12_2 | 0.765 | 0.069 | 0.189 | −0.001 |
| PB3_12_3 | 0.868 | 0.082 | 0.209 | 0.015 |
| PB3_12_4 | 0.938 | 0.088 | 0.219 | 0.011 |
| PB3_12_5 | 0.959 | 0.090 | 0.214 | 0.006 |
| PB3_12_6 | 0.774 | 0.118 | 0.210 | −0.016 |
| PB3_12_7 | 0.970 | 0.090 | 0.216 | 0.013 |
| PB3_12_8 | 0.970 | 0.091 | 0.219 | 0.011 |
| PB3_16_1 | 0.811 | 0.076 | 0.181 | 0.050 |
| PB3_16_2 | 0.925 | 0.084 | 0.200 | 0.034 |
| PB3_16_3 | 0.838 | 0.091 | 0.192 | 0.040 |
| PB3_16_4 | 0.957 | 0.087 | 0.211 | 0.021 |
| PB3_16_5 | 0.753 | 0.070 | 0.181 | −0.034 |
| PD3_1 | 0.267 | 0.049 | 0.049 | 0.021 |
References
- Adhikari, D. P., & Pandey, G. P. (2025). Integrating AI in higher education: Transforming teachers’ roles in boosting student agency. Educational Technology Quarterly, 2025(2), 151–168. [Google Scholar] [CrossRef]
- Aguilera-Hermida, A. P., Quiroga-Garza, A., Gómez-Mendoza, S., Del Río Villanueva, C. A., Avolio Alecchi, B., & Avci, D. (2021). Comparison of students’ use and acceptance of emergency online learning due to COVID-19 in the USA, Mexico, Peru, and Turkey. Education and Information Technologies, 26(6), 6823–6845. [Google Scholar] [CrossRef]
- Alamri, H. A., Watson, S., & Watson, W. (2021). Learning technology models that support personalization within blended learning environments in higher education. TechTrends, 65(1), 62–78. [Google Scholar] [CrossRef]
- Almeida, L. M. C. G., Münzer, S., & Kühl, T. (2024). More personal, but not better: The personalization effect in learning neutral and aversive health information. Journal of Computer Assisted Learning, 40(5), 2248–2260. [Google Scholar] [CrossRef]
- Almusaed, A., Almssad, A., Yitmen, I., & Homod, R. Z. (2023). Enhancing student engagement: Harnessing “AIED”’s power in hybrid education—A review analysis. Education Sciences, 13(7), 632. [Google Scholar] [CrossRef]
- Amemasor, S. K., Oppong, S. O., Ghansah, B., Benuwa, B.-B., & Agbeko, M. (2025). The influence of digital professional development and professional learning communities on STEM teachers’ digital competency. PLoS ONE, 20(1), e0328883. [Google Scholar] [CrossRef]
- An, Q., Yang, J., Xu, X., Zhang, Y., & Zhang, H. (2024). Decoding AI ethics from users’ lens in education: A systematic review. Heliyon, 10(20), e39357. [Google Scholar] [CrossRef]
- Angrist, N., Bergman, P., & Matsheng, M. (2022). Experimental evidence on learning using low-tech when school is out. Nature Human Behaviour, 6(7), 941–950. [Google Scholar] [CrossRef] [PubMed]
- Baidoo-Anu, D., & Owusu Ansah, L. (2023). Education in the era of generative artificial intelligence (AI): Understanding the potential benefits of ChatGPT in promoting teaching and learning. Journal of AI, 7(1), 52–62. [Google Scholar] [CrossRef]
- Bittle, K., & El-Gayar, O. (2025). Generative AI and academic integrity in higher education: A systematic review and research agenda. Information, 16(4), 296. [Google Scholar] [CrossRef]
- Bombaerts, G., & Vaessen, B. (2022). Motivational dynamics in basic needs profiles: Toward a person-centered motivation approach in engineering education. Journal of Engineering Education, 111(2), 357–375. [Google Scholar] [CrossRef]
- Bond, M., Khosravi, H., De Laat, M., Bergdahl, N., Negrea, V., Oxley, E., Pham, P., Chong, S. W., & Siemens, G. (2024). A meta systematic review of artificial intelligence in higher education: A call for increased ethics, collaboration, and rigour. International Journal of Educational Technology in Higher Education, 21(1), 4. [Google Scholar] [CrossRef]
- Chans, G. M., Orona-Navar, A., & Orona-Navar, C. (2023). Higher education in Mexico: The effects and consequences of the COVID-19 pandemic. Sustainability, 15(12), 9476. [Google Scholar] [CrossRef]
- Chiu, T. K., Moorhouse, B. L., Chai, C. S., & Ismailov, M. (2024). Teacher support and student motivation to learn with artificial intelligence (AI) based chatbot. Interactive Learning Environments, 32(7), 3240–3256. [Google Scholar] [CrossRef]
- Du, H. T., & Wang, X. (2025). Research on the role transformation of teachers in the AI Era. International Journal on Social and Education Sciences, 7(4), 346–359. [Google Scholar] [CrossRef]
- Du, X., Du, M., Zhou, Z., & Bai, Y. (2025). Facilitator or hindrance? The impact of AI on university students’ higher-order thinking skills in complex problem solving. International Journal of Educational Technology in Higher Education, 22(1), 39. [Google Scholar] [CrossRef]
- El-Sabagh, H. A. (2021). Adaptive e-learning environment based on learning styles and its impact on development students’ engagement. International Journal of Educational Technology in Higher Education, 18(1), 53. [Google Scholar] [CrossRef]
- Garzón, J., Patiño, E., & Marulanda, C. (2025). Systematic review of artificial intelligence in education: Trends, benefits, and challenges. Multimodal Technologies and Interaction, 9(8), 84. [Google Scholar] [CrossRef]
- Gligorea, I., Cioca, M., Oancea, R., Gorski, A. T., & Gorski, H. (2023). Adaptive learning using artificial intelligence in e-learning: A literature review. Education Sciences, 13(12), 1216. [Google Scholar] [CrossRef]
- Gopal, R., Singh, V., & Aggarwal, A. (2021). Impact of online classes on the satisfaction and performance of students during the pandemic period of COVID 19. Education and Information Technologies, 26(6), 6923–6947. [Google Scholar] [CrossRef]
- Goswami, U., & Bryant, P. (2012). Children’s cognitive development and learning. In The Cambridge primary review research surveys (pp. 141–169). Routledge. [Google Scholar]
- Hadar Shoval, D. (2025). Artificial intelligence in higher education: Bridging or widening the digital divide? Education Sciences, 15(5), 637. [Google Scholar] [CrossRef]
- Hevia, F. J., Vergara-Lope, S., Velásquez-Durán, A., & Calderón, D. (2022). Estimation of the fundamental learning loss and learning poverty related to COVID-19 pandemic in Mexico. International Journal of Educational Development, 88, 102515. [Google Scholar] [CrossRef]
- Kabudi, T., Pappas, I., & Olsen, D. H. (2021). AI-enabled adaptive learning systems: A systematic mapping of the literature. Computers and Education: Artificial Intelligence, 2, 100017. [Google Scholar] [CrossRef]
- Karam, J. (2023). Reforming higher education through AI. In N. Azoury, & G. Yahchouchi (Eds.), Governance in higher education (pp. 275–306). Springer Nature. [Google Scholar] [CrossRef]
- Li, B., Tan, Y. L., Wang, C., & Lowell, V. (2025). Two years of innovation: A systematic review of empirical generative AI research in language learning and teaching. Computers and Education: Artificial Intelligence, 9, 100445. [Google Scholar] [CrossRef]
- Lim, W. M., Gunasekara, A., Pallant, J. L., Pallant, J. I., & Pechenkina, E. (2023). Generative AI and the future of education: Ragnarök or reformation? A paradoxical perspective from management educators. The International Journal of Management Education, 21(2), 100790. [Google Scholar] [CrossRef]
- Melisa, R., Ashadi, A., Triastuti, A., Hidayati, S., Salido, A., Luansi Ero, P., Marlini, C., Zefrin, Z., & Al Fuad, Z. (2025). Critical thinking in the age of AI: A systematic review of AI’s effects on higher education. Educational Process International Journal, 14, e2025031. [Google Scholar] [CrossRef]
- Minn, S. (2022). AI-assisted knowledge assessment techniques for adaptive learning environments. Computers and Education: Artificial Intelligence, 3, 100050. [Google Scholar] [CrossRef]
- Miranda, J., Navarrete, C., Noguez, J., Molina-Espinosa, J.-M., Ramírez-Montoya, M.-S., Navarro-Tuch, S. A., Bustamante-Bello, M.-R., Rosas-Fernández, J.-B., & Molina, A. (2021). The core components of education 4.0 in higher education: Three case studies in engineering education. Computers & Electrical Engineering, 93, 107278. [Google Scholar] [CrossRef]
- Molenaar, I. (2022). Towards hybrid human-AI learning technologies. European Journal of Education, 57(4), 632–645. [Google Scholar] [CrossRef]
- Núñez-Canal, M., de Obesso, M. d. l. M., & Pérez-Rivero, C. A. (2022). New challenges in higher education: A study of the digital competence of educators in COVID times. Technological Forecasting and Social Change, 174, 121270. [Google Scholar] [CrossRef]
- Ortiz-Gallegos, T. (2020). Student academic performance, learning modality, gender and ethnicity at a four-year university in New Mexico [Unpublished doctoral dissertation]. Grand Canyon University.
- Peng, H., Ma, S., & Spector, J. M. (2019). Personalized adaptive learning: An emerging pedagogical approach enabled by a smart learning environment. Smart Learning Environments, 6, 9. [Google Scholar] [CrossRef]
- Peña, J. M., Moreno, O. B., Herrera, Á. L. O., & Moreno, T. E. B. (2024). Balancing privacy and ethics in the use of artificial intelligence in education. Sapiens International Multidisciplinary Journal, 1(3), 148–170. [Google Scholar] [CrossRef]
- Pozo, J. I., Cabellos, B., & Echeverría, M. d. P. (2024). Has the educational use of digital technologies changed after COVID-19? A longitudinal study. PLoS ONE, 19(10), e0311695. [Google Scholar] [CrossRef] [PubMed]
- Ramírez Montoya, M. S., Romero Rodríguez, J. M., Glasserman Morales, L. D., & Navas Parejo, M. R. (2023). Collaborative online international learning between Spain and Mexico: A microlearning experience to enhance creativity in complexity. Education + Training, 65(2), 340–354. [Google Scholar] [CrossRef]
- Rifah, L., & Zamahsari, G. K. (2022, November 22–23). Can technology replace the teachers’ role in higher education settings? A systematic literature review. 7th International Conference on Sustainable Information Engineering and Technology (pp. 217–221), Malang, Indonesia. [Google Scholar] [CrossRef]
- Sabzalieva, E., & Valentini, A. (2023). ChatGPT and artificial intelligence in higher education: Quick start guide. UNESCO International Institute for Higher Education in Latin America and the Caribbean (IESALC), Caracas, Venezuela. Available online: https://etico.iiep.unesco.org/en/chatgpt-and-artificial-intelligence-higher-education-quick-start-guide (accessed on 10 September 2025).
- Salido, A., Syarif, I., Sitepu, M. S., Suparjan, Wana, P. R., Taufika, R., & Melisa, R. (2025). Integrating critical thinking and artificial intelligence in higher education: A bibliometric and systematic review of skills and strategies. Social Sciences & Humanities Open, 12, 101924. [Google Scholar] [CrossRef]
- Shemshack, A., Kinshuk, & Spector, J. M. (2021). A comprehensive analysis of personalized learning components. Journal of Computers in Education, 8(4), 485–503. [Google Scholar] [CrossRef]
- Shemshack, A., & Spector, J. M. (2020). A systematic literature review of personalized learning terms. Smart Learning Environments, 7(1), 33. [Google Scholar] [CrossRef]
- Srinivasa, K., Kurni, M., & Saritha, K. (2022). Learning, teaching, and assessment methods for contemporary learners. Springer. [Google Scholar]
- Tretiak, O., Smolnykova, H., Fedorova, Y., Yakunin, Y., & Shopina, M. (2025). Optimization of the educational process through the use of artificial intelligence in teachers’ work. Revista Eduweb, 19(1), 105–119. [Google Scholar] [CrossRef]
- Wang, S., Christensen, C., Cui, W., Tong, R., Yarnall, L., Shear, L., & Feng, M. (2023). When adaptive learning is effective learning: Comparison of an adaptive learning system to teacher-led instruction. Interactive Learning Environments, 31(2), 793–803. [Google Scholar] [CrossRef]
- Xia, Q., Weng, X., Ouyang, F., Lin, T. J., & Chiu, T. K. (2024). A scoping review on how generative artificial intelligence transforms assessment in higher education. International Journal of Educational Technology in Higher Education, 21(1), 40. [Google Scholar] [CrossRef]
- Zhu, H., Sun, Y., & Yang, J. (2025). Towards responsible artificial intelligence in education: A systematic review on identifying and mitigating ethical risks. Humanities and Social Sciences Communications, 12(1), 1111. [Google Scholar] [CrossRef]







| Python Library | Application in the Research |
|---|---|
| pandas == 2.2.2 | Data handling; Pearson correlations computed with corr(). |
| factor_analyzer == 0.5.1 | Exploratory factor analysis for identifying latent factors in the data. |
| scikit-learn == 1.5.0 | Machine learning and data mining algorithms. |
| scipy == 1.13.1 | Advanced scientific and mathematical tools and algorithms. |
| Factor | Factor 1 | Factor 2 | Factor 3 | Factor 4 |
|---|---|---|---|---|
| Factor 1 | 1.0 | −0.783965 | −0.702753 | 0.716720 |
| Factor 2 | −0.783965 | 1.0 | 0.914637 | −0.808960 |
| Factor 3 | −0.702753 | 0.914637 | 1.0 | −0.759052 |
| Factor 4 | 0.716720 | −0.808960 | −0.759052 | 1.0 |
| Dimension | Civil Engineering () | Nutrition () | Total () |
|---|---|---|---|
| Satisfaction (Personalization) | 4.68 (0.45) | 4.19 (0.78) | 4.49 (0.64) |
| Relevance (Utility of Activities) | 4.61 (0.51) | 4.24 (0.81) | 4.47 (0.66) |
| Perceived Impact (Learning Process) | 4.55 (0.58) | 4.08 (0.92) | 4.37 (0.76) |
| Participation (Behavioral Engagement) | 4.64 (0.48) | 4.11 (0.89) | 4.43 (0.71) |
© 2026 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license.
Share and Cite
Hernández-Herrera, J.R.; Ortiz-Bejar, J.; Ortiz-Bejar, J. Adaptive and Personalized Learning in Higher Education: An Artificial Intelligence-Based Approach. Educ. Sci. 2026, 16, 109. https://doi.org/10.3390/educsci16010109

