Sustainability
  • Article
  • Open Access

10 January 2026

Knowledge Graphs as Cognitive Scaffolding for Sustainable Engineering Education: A Quasi-Experimental Study in Structural Geology

1 College of Safety and Environmental Engineering, Shandong University of Science and Technology, Qingdao 266590, China
2 College of Earth Science and Engineering, Shandong University of Science and Technology, Qingdao 266590, China
* Author to whom correspondence should be addressed.
This article belongs to the Special Issue AI for Sustainable and Creative Learning in Education

Abstract

The transition to Outcome-Based Education (OBE) in engineering demands instructional tools that bridge theoretical knowledge and practical engineering competencies. However, traditional Learning Management Systems (LMS) primarily function as static resource repositories, lacking the semantic structure necessary to support deep learning and precise competency tracking. To address this gap, this study developed a three-layer domain Knowledge Graph (KG) for Structural Geology and integrated it into the ChaoXing LMS (a widely used Learning Management System in Chinese higher education). A semester-long quasi-experimental study (N = 84) was conducted to evaluate its impact on student performance and specific graduation attribute achievement compared to a conventional folder-based approach. Empirical results demonstrate that the KG-integrated group significantly outperformed the control group (p < 0.01, Cohen’s d = 0.74). Notably, while performance on rote memorization tasks was similar, the experimental group showed marked improvement in identifying and solving complex engineering problems. LMS log analysis confirmed a strong positive correlation (r = 0.68) between graph navigation depth and academic success. The KG effectively bridged the gap between theoretical knowledge and practical engineering applications (e.g., geohazard analysis). This research confirms that explicit semantic visualization acts as vital cognitive scaffolding, effectively enhancing higher-order thinking and ensuring the rigorous alignment of instruction with engineering accreditation standards. Ultimately, this approach promotes sustainable learning capabilities and prepares future engineers to address complex, interdisciplinary challenges in sustainable development.

1. Introduction

Under the paradigm of Industry 4.0 and the United Nations’ Sustainable Development Goals (specifically SDG 4: Quality Education), engineering education is undergoing a fundamental shift from static knowledge transmission to dynamic capability cultivation. To achieve sustainable engineering education, it is crucial to equip students not only with theoretical knowledge but also with lifelong learning skills and the ability to solve complex, non-standard problems in changing environments. This necessitates a rigorous implementation of the Outcome-Based Education (OBE) philosophy, which emphasizes measurable student outcomes and the “substantial equivalence” of global engineering competencies [1,2,3].
Structural Geology serves as a critical case study for this educational transformation. It is not merely a theoretical discipline describing rock deformation; it constitutes the fundamental scientific basis for sustainable engineering practices. In the context of global sustainability, the professional competencies derived from this course are prerequisites for deep-earth exploration, geo-hazard mitigation, and environmental protection. For instance, the accurate 3D interpretation of fracture systems is vital for the efficiency of Enhanced Geothermal Systems (EGS) and ensuring the seal integrity of Carbon Capture and Storage (CCS) sites [4,5]. Similarly, in civil infrastructure, the ability to identify hidden fault zones is critical for preventing catastrophic water inrush accidents in tunnel construction and selecting safe, long-term repositories for nuclear waste disposal [6]. Therefore, the primary challenge in structural geology education is fostering the high-order spatial reasoning required to solve these complex environmental and engineering problems, rather than simple rote memorization.
However, applying OBE in Structural Geology remains challenging due to the discipline’s inherent complexity, which requires students to reconstruct four-dimensional evolutionary processes (3D space + time) from limited two-dimensional observations [7]. Novice learners often struggle with “spatial penetration”—the ability to mentally visualize subsurface structures based on surface outcrops—leading to a gap between academic learning and professional adaptability. While current Learning Management Systems (LMS) facilitate digital resource distribution, they often fall short in bridging this cognitive gap. In China, as in many other countries, universities rely heavily on LMS platforms (e.g., ChaoXing, which is functionally equivalent to Blackboard or Canvas, widely used in Western institutions) to manage course content [8]. Despite their popularity, these platforms typically organize content in linear, folder-based structures. This creates an “epistemological mismatch”: while sustainable underground development requires systemic, interconnected reasoning to navigate complex geological bodies, traditional LMS presents knowledge in fragmented silos, hindering the development of cross-concept links necessary for engineering decision-making.
To address this mismatch, Knowledge Graphs (KGs) have emerged as a promising educational technology. By representing knowledge as a network of entities and relationships, KGs can visualize the semantic structure of a discipline, acting as a “cognitive scaffold” for learners [9]. Although KGs have been applied in various educational settings, few studies have explored their integration into the instructional design specifically for Engineering Education Accreditation, nor have they sufficiently evaluated how KG-based navigation influences the achievement of defined engineering competency indicators in a blended learning environment.
Addressing these gaps, this study proposes an OBE-oriented instructional framework that integrates Knowledge Graphs into the teaching of Structural Geology. Utilizing the widely used ChaoXing platform as the technical base, we constructed a domain-specific knowledge graph that explicitly maps course content to engineering accreditation indicators. Unlike previous studies that focus solely on algorithmic construction, this research emphasizes pedagogical intervention: using the KG as a navigational and cognitive tool to transform the learning process. Specifically, this paper aims to: (1) propose a methodology for constructing a Structural Geology Knowledge Graph that embeds OBE accreditation standards; (2) develop a blended learning model where the KG serves as a non-linear navigation tool to support pre-class inquiry, in-class elaboration, and post-class consolidation; and (3) validate the effectiveness of this approach through a quasi-experimental study, providing empirical data on how KG-integrated instruction improves students’ academic performance and their competency in addressing sustainable engineering challenges.

2. Literature Review

This section critically reviews the existing literature on engineering education paradigms, the cognitive challenges in geology education, and the technological evolution from traditional Learning Management Systems (LMS) to intelligent Knowledge Graphs. It aims to identify the gaps between current instructional delivery and the requirements of sustainable engineering competency.

2.1. OBE and Cognitive Challenges in Engineering Education

The global shift towards Outcome-Based Education (OBE) has redefined engineering pedagogy. Unlike traditional input-oriented models, OBE focuses on “organizing everything in an educational system around what is essential for all students to be able to do successfully at the end of their learning experiences” [1,10]. This paradigm aligns with the Washington Accord and the United Nations’ Sustainable Development Goals (SDGs), particularly SDG 4, emphasizing that future engineers must possess high-order thinking skills to address complex sustainability challenges [11,12,13].
However, implementing OBE in Structural Geology presents unique cognitive hurdles. Research in geoscience education highlights that “spatial thinking”—the ability to visualize 3D structures from 2D data—is a critical predictor of success [14]. Kastens and Ishikawa [15] argue that novice learners often suffer from cognitive overload when mentally reconstructing geological evolution. Traditional teaching methods, which often rely on static textbook illustrations, fail to provide the necessary “scaffolding” for students to bridge the gap between abstract concepts and practical engineering applications [16,17]. Consequently, although students may memorize definitions, they frequently lack the “deep understanding” required for solving real-world geo-hazard or resource exploration problems [18].

2.2. The “Linear Trap” of Current Learning Management Systems

To support large-scale engineering education, LMS platforms such as Moodle, Canvas, Blackboard, and ChaoXing (the latter being dominant in Chinese higher education) have become ubiquitous [19,20]. These platforms have successfully digitized the delivery of course materials, enabling blended learning modes.
Despite their administrative efficiency, the pedagogical architecture of these LMSs has faced growing criticism. As highlighted by Dabbagh et al. and recent empirical studies, most LMS platforms employ a file-folder metaphor that organizes content linearly (e.g., chapters, weeks). This structure creates “information silos,” where knowledge points are isolated in separate digital containers [7,21].
For a reticulated discipline like geology, where a fault structure (Chapter 4) is genetically related to stress mechanics (Chapter 2) and regional tectonics (Chapter 8), this linear presentation causes an “epistemological mismatch” [22]. Students navigating these linear paths often struggle to perceive cross-chapter connections, leading to fragmented knowledge structures that are insufficient for complex engineering decision-making [23]. While some platforms offer keyword search functions, they lack a semantic map to guide learners through the logical relationships of the discipline.

2.3. Knowledge Graphs: From Algorithm to Pedagogical Intervention

To address the issue of knowledge fragmentation, Knowledge Graphs (KGs) have emerged as a powerful tool for Knowledge Visualization. A KG represents knowledge as a directed graph G = (E, R), where nodes E represent concepts (entities) and edges R represent semantic relationships [24].
Current literature on Educational Knowledge Graphs (EKGs) can be categorized into three streams:
  • Automated Construction: studies focusing on the use of NLP and deep learning to automatically extract entities from textbooks and build graphs efficiently [24,25,26].
  • Resource Recommendation: research utilizing KGs to power adaptive algorithms that recommend learning paths or resources based on student profiles [27,28].
  • Concept Mapping: psychological studies showing that visual maps can reduce cognitive load and improve memory retention [29].
However, significant gaps remain. Most existing studies treat the KG as a “black box” backend technology for algorithms, rather than a frontend “cognitive tool” for students. There is a paucity of empirical research that explicitly integrates KGs into the instructional design of engineering courses to satisfy accreditation standards [30]. Furthermore, few studies have quantified how KG-based navigation specifically impacts the attainment of sustainability-related competencies compared to traditional linear LMS navigation [31].
This study aims to fill these gaps by moving beyond algorithm optimization to pedagogical validation. We propose a framework that uses KGs not just to recommend resources, but to visualize the logic of Structural Geology, thereby transforming the LMS from a file repository into a dynamic cognitive map.

3. Methodology

3.1. Research Context and Participants

This study was conducted during the Fall 2023 semester at Shandong University of Science and Technology, focusing on the core undergraduate course “Structural Geology”. This course is a foundational pillar for Geological Engineering majors, requiring students to master complex spatial reasoning and mechanical principles of rock deformation.
The participants consisted of 84 third-year undergraduate students (Mean age = 21.4 years, SD = 1.2; 78.6% Male, 21.4% Female). All participants had completed prerequisite coursework in Advanced Mathematics and General Geology, ensuring a homogenous baseline of domain knowledge. The students were assigned to two parallel classes based on administrative scheduling: an Experimental Group (EG, N = 42) and a Control Group (CG, N = 42). Both groups were taught by the same instructor to eliminate teacher-effect bias.
The instructional intervention was implemented via the ChaoXing (Superstar) Learning Management System. While functionally equivalent to global platforms such as Blackboard, Canvas, or Moodle, ChaoXing is the dominant digital learning infrastructure in Chinese higher education, serving over 90% of local universities. It supports standard LMS features—including file hosting, discussion forums, and quizzes—while also offering a unique plugin environment that allows for the integration of our custom Knowledge Graph module.

3.2. The KG-Integrated Instructional Design

3.2.1. Ontology Construction and Semantic Definition

To ensure the replicability of the knowledge graph and its alignment with the OBE framework, we defined a standardized three-layer ontology schema termed “Concept-Method-Application” (CMA). This schema serves as the semantic skeleton of the graph, categorizing knowledge into three distinct cognitive levels. The foundational Concept Layer comprises theoretical entities (e.g., “Stress,” “Strain,” “Fold”), representing the declarative knowledge students must memorize. The intermediate Method Layer consists of procedural entities, such as “Stereographic Projection” or “Mohr Circle Analysis,” which represent the analytical tools required to interpret geological structures. The highest level, the Application Layer, integrates real-world engineering scenarios—such as “Tunnel Water Inrush” or “Dam Foundation Stability”—requiring students to synthesize concepts and methods for complex problem-solving.
The semantic connectivity between these layers is governed by three specific relationship types defined to mirror engineering logic. First, Hierarchical Relations (e.g., is a) provide taxonomic structure, classifying specific entities under broader categories (e.g., a “Normal Fault” is a “Fault”). Second, Logical Relations (e.g., prerequisite of) define the learning sequence, ensuring students master foundational theories before attempting advanced analysis (e.g., “Rock Mechanics” is a prerequisite of “Slope Stability”). Finally, and most critically for the OBE approach, Application Relations (e.g., applied to) bridge the gap between theory and practice, explicitly linking analytical methods to their practical engineering uses. This structured ontology resulted in a network of 345 nodes and 762 edges, creating a comprehensive semantic map of the course syllabus.
The construction of the graph followed a rigorous, iterative “Expert-in-the-loop” process designed to ensure both content validity and methodological replicability. Initially, the primary entities and relationships were extracted from standard textbooks by the course instructor to create a preliminary network. This draft was subsequently evaluated by an external expert panel consisting of two senior geology professors and one geotechnical engineer. The inclusion of an industry practitioner was crucial for specifically validating the relevance of the “Application” nodes to current engineering practices and standards. Through a consensus-based approach, the panel refined the semantic relationships, achieving a high inter-rater agreement (>85%) to minimize subjective bias. Finally, before full-scale implementation, the graph underwent a pilot validation with a small cohort of senior students to confirm the logical flow and navigational usability of the learning paths.
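The CMA schema described above can be sketched as a typed edge list of (head, relation, tail) triples. The nodes and triples below are a small illustrative subset, not the actual 345-node graph; the three relation names mirror the hierarchical, logical, and application relation types defined in this section.

```python
# Minimal sketch of the three-layer CMA ontology as a typed edge list.
# Node names are examples from the text; the full course graph is far larger.

NODES = {
    # node -> layer (Concept | Method | Application)
    "Stress": "Concept",
    "Fault": "Concept",
    "Normal Fault": "Concept",
    "Mohr Circle Analysis": "Method",
    "Tunnel Water Inrush": "Application",
}

EDGES = [
    # (head, relation, tail) triples, one per relation type
    ("Normal Fault", "is_a", "Fault"),                              # hierarchical
    ("Stress", "prerequisite_of", "Mohr Circle Analysis"),          # logical
    ("Mohr Circle Analysis", "applied_to", "Tunnel Water Inrush"),  # application
]

def neighbors(node, relation=None):
    """Return tail nodes reachable from `node`, optionally filtered by relation type."""
    return [t for h, r, t in EDGES if h == node and (relation is None or r == relation)]

print(neighbors("Mohr Circle Analysis", "applied_to"))  # ['Tunnel Water Inrush']
```

In a production deployment the same triples would live in a graph database (the paper later mentions a Neo4j-based ontology); the plain-Python form here is only meant to make the schema concrete.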

3.2.2. The KG-Integrated Blended Learning Framework

To bridge the gap between abstract geological concepts and the rigorous competency requirements of engineering accreditation, we designed a KG-Integrated Blended Learning Model (Figure 1). Grounded in Social Constructivism, this model posits that meaningful learning occurs when learners actively construct new knowledge upon their existing cognitive structures. Unlike traditional blended learning, which often presents resources in isolated lists, our approach utilizes the Knowledge Graph as a persistent “cognitive scaffold” throughout the learning lifecycle. The KG functions as an “Advance Organizer”, a concept introduced by Ausubel (1960), providing a global semantic overview that helps students anchor new details within a broader conceptual framework [32]. This intervention transforms the passive consumption of LMS resources into an active, navigation-driven inquiry process, explicitly targeting the “Complex Problem Solving” graduate attributes.
Figure 1. Conceptual framework of the proposed instructional model. The architecture integrates the three-layer domain Knowledge Graph (Concept-Method-Application) into the pre-, in-, and post-class phases of blended learning to support data-driven engineering competency evaluation.
To operationalize this model, the online learning component was implemented using the ChaoXing (Superstar) Learning Management System (LMS). However, it is crucial to emphasize that the Knowledge Graph-driven framework proposed in this study is platform-agnostic. The architectural logic—specifically the integration of the semantic graph with front-end learning modules—is transferable and can be seamlessly adapted to other global LMS platforms such as Moodle, Canvas, or Blackboard. The ChaoXing platform served merely as the experimental environment to validate the framework’s efficacy in fostering sustainable engineering competencies.

3.2.3. Pre-Class Phase: Guided Inquiry and Pathfinding

Prior to the face-to-face sessions, the pedagogical objective is to facilitate structured inquiry and reduce the cognitive load associated with new terminology. In traditional settings, students often feel overwhelmed by the density of textbook content. In our intervention, students are required to navigate the KG on the LMS to visualize the logical path of the upcoming lesson. For instance, before the lecture on “Fault Mechanics,” students trace the prerequisite relationships (R_prerequisite) in the graph, identifying that “Stress Analysis” and “Rock Failure Criteria” are necessary antecedents. By clicking on these nodes, they access bite-sized micro-videos to reactivate prior knowledge. This graph-guided preparation ensures that students enter the classroom with a clear mental map of the knowledge structure, significantly enhancing the efficacy of the Flipped Classroom approach [33].
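The pathfinding step above amounts to a transitive closure over prerequisite edges. The sketch below traces direct and indirect prerequisites from (prerequisite, topic) pairs; the pair format and topic names are assumptions for demonstration, not the ChaoXing module's actual data model.

```python
# Hypothetical sketch: collecting all prerequisites of a lesson topic,
# assuming the KG's logical relations are exported as (prerequisite, topic) pairs.

PREREQ = [
    ("Stress Analysis", "Fault Mechanics"),
    ("Rock Failure Criteria", "Fault Mechanics"),
    ("Force and Vectors", "Stress Analysis"),
]

def prerequisites(topic):
    """Depth-first collection of all direct and indirect prerequisites of `topic`."""
    found = []
    stack = [topic]
    while stack:
        current = stack.pop()
        for pre, post in PREREQ:
            if post == current and pre not in found:
                found.append(pre)
                stack.append(pre)
    return found

print(sorted(prerequisites("Fault Mechanics")))
# ['Force and Vectors', 'Rock Failure Criteria', 'Stress Analysis']
```

Note that the traversal also surfaces second-order antecedents ("Force and Vectors"), which is exactly the behavior a student reproduces manually when clicking backward through prerequisite nodes before class.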

3.2.4. In-Class Phase: Structured Scaffolding and Visualization

During the instructional phase, the instructor utilizes the dynamic KG as the primary presentation tool, replacing linear slide decks for structural explanations. Structural Geology requires students to toggle between microscopic mechanisms and macroscopic tectonic patterns, a cognitive leap that is often difficult for novices. The instructor demonstrates this linkage by dynamically traversing the graph, visually zooming out from a specific node like “En Echelon Fractures” to its parent tectonic setting, “Shear Zone”. This visual demonstration serves as Instructional Scaffolding, explicitly modeling the expert thinking process required for engineering analysis [34]. Furthermore, when discussing complex engineering cases (e.g., tunnel stability in faulted zones), the instructor highlights the specific nodes involved, visually connecting theoretical concepts to the “Engineering Problem Analysis” accreditation indicator, thereby reinforcing the relevance of the curriculum to professional practice.

3.2.5. Post-Class Phase: Self-Regulated Consolidation

In the post-class phase, the KG transforms into a tool for Self-Regulated Learning (SRL). Unlike linear homework assignments, the graph supports personalized remediation. When a student struggles with a specific problem in the chapter quiz (e.g., interpreting a geological map), they are guided to consult the KG to identify the specific knowledge gaps. The semantic links allow them to trace back from the “Application” node (the map problem) to the underlying “Concept” nodes (e.g., “Rule of Vs”) and access targeted remedial resources. This mechanism encourages students to take ownership of their learning process, shifting the focus from merely “getting the right answer” to understanding the underlying knowledge network, a critical trait for lifelong learning in engineering [35].

3.3. Data Collection and Instruments

3.3.1. Quasi-Experimental Design and Participants

To empirically validate the effectiveness of the KG-integrated instructional model, a quasi-experimental study was conducted over a full semester (16 weeks) in the Structural Geology course. The participants consisted of 84 junior undergraduates majoring in Geological Engineering at Shandong University of Science and Technology, divided into two parallel classes based on administrative assignment. One class was designated as the Experimental Group, which adopted the KG-driven blended learning model described in Section 3.2. The other served as the Control Group, receiving instruction via the traditional LMS interface (folder-based resources) and standard PowerPoint lectures. To ensure the internal validity of the experiment, both groups were taught by the same instructor, adhered to identical syllabi, and underwent the same assessments. An independent t-test on the pre-course GPA showed no statistically significant difference between the two groups (p > 0.05), indicating comparable baseline academic ability [36].

3.3.2. Data Collection Instruments

Data collection was rigorously designed to capture both the process of learning engagement and the outcome of competency achievement.
First, LMS Log Data were extracted from the backend of the ChaoXing platform to quantify student interaction patterns. For the Experimental Group, specific behavioral metrics included “KG Node Click Frequency” and “Navigation Path Depth,” representing the utilization of the semantic structure. In contrast, for the Control Group, standard metrics such as “Resource Download Frequency” and “Video Completion Rate” were monitored.
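A minimal sketch of how a per-session engagement metric such as "Navigation Path Depth" might be derived from raw click logs is shown below; the log schema (session-id, clicked-node pairs) is hypothetical, since the actual ChaoXing export format is not specified here.

```python
# Illustrative derivation of a navigation-depth metric from KG click logs.
# The log format below is an assumption, not the real ChaoXing export schema.

session_log = [
    # (session_id, clicked_node)
    ("s1", "Tectonics"), ("s1", "Faulting"), ("s1", "Mechanics"),
    ("s2", "Fold"),
]

def navigation_depth(log):
    """Count sequential node visits per session (a simple proxy for hop depth)."""
    depth = {}
    for session, _node in log:
        depth[session] = depth.get(session, 0) + 1
    return depth

print(navigation_depth(session_log))  # {'s1': 3, 's2': 1}
```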
Second, Assessment Data were aggregated from three distinct sources to ensure a holistic evaluation: (1) Regular Assignments (20%), (2) Chapter Quizzes (30%), and (3) Final Examination (50%). Crucially, every assessment item—whether a quiz question or an exam problem—was explicitly tagged with a corresponding Graduate Attribute (GA) indicator prior to administration. This granular tagging strategy allows for the disaggregation of student scores specifically mapped to accreditation standards [37].
To specifically evaluate Indicator 2.3: Design/Development of Solutions for Complex Engineering Problems, we focused on the comprehensive analysis section of the final exam. Unlike traditional questions that test rote memorization, this section featured a real-world engineering scenario:
Case Example: The Tunnel-Fault Intersection Challenge.
Problem: A high-speed railway tunnel is projected to cross a regional normal fault. Students were required to (1) predict the intersection geometry based on surface data, (2) assess geohazards (e.g., water inrush), and (3) propose a sustainable lining reinforcement strategy.
This specific item serves as a critical proxy for “Sustainable Engineering Competency,” requiring students to activate the semantic chain (e.g., Fault Geometry → Stress Field → Engineering Stability) fostered by the KG. Grading was conducted using a standardized rubric by two independent instructors to ensure reliability.

3.3.3. Calculation Method of Indicator Achievement

Although the knowledge graph itself serves as a pedagogical tool rather than a direct calculation engine, the achievement of OBE indicators was rigorously computed using the assessment data generated. Following the standard evaluation mechanism of Engineering Education Accreditation, the achievement value (A) for a specific indicator (Indk) is calculated using a weighted summation formula:
A(Indk) = [ Σ_{i=1}^{n} (Si × Wi) ] / [ Σ_{i=1}^{n} (Ti × Wi) ]
where
  • Si represents the average score obtained by the student cohort on assessment item i associated with indicator k.
  • Ti is the total possible score for assessment item i.
  • Wi is the weight of the assessment item in the overall evaluation scheme.
  • n is the total number of assessment items supporting indicator k.
This calculation was performed externally using Python scripts (Version: 3.13) to process the raw score data exported from the LMS. By comparing the calculated A (Indk) values between the Experimental and Control groups, we can quantitatively assess whether the KG-based intervention led to a significant improvement in specific engineering competencies, particularly those related to structural analysis and spatial reasoning.
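The weighted-summation step can be sketched in a few lines of Python, in the spirit of the external scripts mentioned above; the item scores and weights below are invented for illustration and do not reproduce the course's actual assessment data.

```python
# Sketch of the indicator-achievement computation A(Ind_k):
# weighted mean scores over weighted maximum scores.

def achievement(items):
    """items: list of (mean_score S_i, max_score T_i, weight W_i) for one indicator."""
    num = sum(s * w for s, t, w in items)
    den = sum(t * w for s, t, w in items)
    return num / den

# Hypothetical example: one quiz (30% weight) and one exam problem (50% weight)
# both tagged to the same indicator.
items_ind_2_1 = [
    (16.0, 20.0, 0.3),  # quiz: cohort mean 16 out of 20
    (39.0, 50.0, 0.5),  # exam: cohort mean 39 out of 50
]
print(round(achievement(items_ind_2_1), 3))  # 0.784
```

Because both numerator and denominator carry the same weights, A(Indk) stays in [0, 1] and is directly comparable across indicators with different score scales.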

4. Results

4.1. Overall Academic Performance

To evaluate the general impact of the KG-integrated instruction, we compared the final comprehensive scores of the two groups. The descriptive statistics reveal that the Experimental Group achieved a higher mean score (M = 82.4, SD = 7.8) compared to the Control Group (M = 76.1, SD = 9.2). An independent-samples t-test confirmed that this difference is statistically significant (t (82) = 3.38, p < 0.01, d = 0.74) (Table 1), with a medium-to-large effect size [38]. Notably, the variance in the Experimental Group was lower, suggesting that the structured scaffolding provided by the knowledge graph helped bridge the gap for lower-performing students, resulting in a more uniform mastery of the course content. In contrast, the Control Group exhibited a bi-modal distribution, indicating that while high achievers could navigate the traditional folder-based resources effectively, weaker students struggled to synthesize the fragmented information without semantic guidance.
Table 1. Comparison of Academic Performance between Experimental and Control Groups.
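For transparency, the reported effect size can be recomputed from the summary statistics above (mean, SD, and n per group) without access to the raw scores; the sketch below implements Cohen's d with the pooled standard deviation.

```python
# Recomputing Cohen's d from the reported group summary statistics.
import math

def cohens_d(m1, s1, n1, m2, s2, n2):
    """Cohen's d using the pooled standard deviation of two independent groups."""
    pooled_sd = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Experimental Group: M = 82.4, SD = 7.8, n = 42
# Control Group:      M = 76.1, SD = 9.2, n = 42
d = cohens_d(82.4, 7.8, 42, 76.1, 9.2, 42)
print(round(d, 2))  # 0.74
```

Running this on the reported values yields d ≈ 0.74, consistent with the statistics in Table 1.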

4.2. Achievement of Engineering Competency Indicators

A more granular analysis was conducted based on the specific Graduation Attribute indicators defined in Section 3.3.3. The results demonstrate a non-uniform improvement across different competency dimensions (Figure 2).
Figure 2. Comparison of Graduation Attribute achievement values between the Experimental and Control groups across different cognitive levels.
(1) Memorization-based Indicators (e.g., Ind 1.2): For indicators requiring simple recall of geological definitions, the difference between groups was marginal (p > 0.05). This suggests that traditional LMS resources are sufficient for rote learning.
(2) Complex Problem Solving Indicators (e.g., Ind 2.1, Ind 3.2): The Experimental Group showed a marked superiority in higher-order indicators. Specifically, for Indicator 2.1 (Identify and formulate complex engineering problems), the Experimental Group’s achievement value was 0.78, significantly outperforming the Control Group’s 0.65 (p < 0.01).
This divergence supports the hypothesis that the knowledge graph’s primary value lies in Cognitive Integration. By explicitly visualizing the links between mechanical principles and geological structures, the KG enabled students to perform the multi-step reasoning required for complex indicators, whereas the Control Group often failed to connect isolated concepts during problem-solving tasks [39,40].

4.3. Correlation Between KG Engagement and Learning Outcomes

To establish a direct link between the instructional intervention and the observed outcomes, we analyzed the LMS log data of the Experimental Group. A Pearson correlation analysis was performed between the “KG Node Access Frequency” and the “Final Exam Score” (Figure 3). The analysis revealed a strong positive correlation (r = 0.68, p < 0.001). Furthermore, we analyzed the “Navigation Depth”—defined as the number of sequential hops a student makes in the graph during a single session. Students who frequently traversed 3 or more layers (e.g., from Tectonics → Faulting → Mechanics → Engineering Case) scored significantly higher on the comprehensive analysis questions. This behavioral pattern provides empirical evidence that the knowledge graph was not merely used as a resource repository but as a cognitive tool for Deep Learning. The visualization of semantic relationships successfully prompted students to engage in the “relational processing” of information, which is critical for transferring geological knowledge to new problem contexts.
Figure 3. Correlation analysis between Knowledge Graph node access frequency and final comprehensive scores for the Experimental Group. Note: The solid line represents the linear regression trend. The Pearson correlation coefficient is r = 0.68 (p < 0.001).
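The correlation analysis itself is standard; as a self-contained sketch, Pearson's r can be computed directly from paired engagement and score lists. The data points below are invented for illustration and do not reproduce the study's reported r = 0.68.

```python
# Pure-Python Pearson correlation between KG node-access counts and exam scores.
# The data points are illustrative only, not the study's actual log data.
import math

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient for paired samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

clicks = [12, 30, 45, 60, 75]   # hypothetical per-student KG node accesses
scores = [70, 74, 80, 85, 88]   # hypothetical final comprehensive scores
print(round(pearson_r(clicks, scores), 2))
```

In practice the same analysis is a one-liner with scipy.stats.pearsonr, which also returns the p-value reported in the figure caption.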

5. Discussion

5.1. Implications for Sustainable Engineering Education

The primary objective of this study was to evaluate whether a Knowledge Graph (KG) driven intervention could enhance the competencies required for sustainable engineering practice. Our findings affirm that this pedagogical shift extends beyond mere academic improvement to align directly with SDG 4 (Quality Education).
Specifically, the marked improvement in Indicator 2.1 (Complex Problem Solving)—where the Experimental Group achieved 0.78 compared to the Control Group’s 0.65—validates the role of the KG as a cognitive transformation tool. As Zhu et al. [41] argue, in the AI era, the value of education lies not in information storage but in structuralization. Traditional rote learning is insufficient for modern engineering challenges, such as predicting geohazards or designing eco-friendly tunnels. By compelling students to traverse the semantic path from “Geological Phenomenon” to “Mechanical Mechanism” and finally to “Engineering Consequence,” the KG framework effectively cultivates the systems thinking necessary for sustainability.
This cognitive transformation is theoretically supported by the observed reduction in extraneous load. Drawing on Sweller’s Cognitive Load Theory [42], the visualization allows students to reallocate mental resources from “searching for information” to “constructing schemas”. This explains why the EG demonstrated higher accuracy in identifying complex geological structures. Such a shift is critical for sustainable engineering education because it equips future engineers with the “germane load” capacity to handle interdisciplinary challenges (e.g., environmental ethics and safety), rather than being overwhelmed by basic terminology. Thus, the KG acts not just as a technological novelty, but as a catalyst for the deep learning required by the Washington Accord [43].
Crucially, this shift toward deep conceptual understanding also directly addresses the challenge of knowledge retention and skills transfer. While this study primarily assessed immediate competency achievement at the semester’s end, the ‘Transfer of Learning’ theory posits that deep structural knowledge—as opposed to surface rote memorization—is the primary driver for applying academic skills to novel professional contexts [39]. Although a longitudinal assessment was outside the scope of this study, the significant improvement in complex problem-solving (Indicator 2.1) serves as a strong proxy. This suggests that students supported by the KG scaffolding are better positioned to retain these core competencies and effectively transfer them to future engineering projects, thereby fulfilling the long-term sustainability goals of engineering education.

5.2. International Applicability and Comparison

A critical question regarding this study concerns its generalizability beyond the specific context of a Chinese engineering university and the ChaoXing platform. While the implementation was local, the pedagogical and epistemological implications are inherently global. Fundamentally, the cognitive challenges in Structural Geology—specifically spatial reasoning and stress analysis—represent universal barriers for engineering students, whether in China, Europe, or the United States [44]. The Knowledge Graph constructs a “visual language” that transcends linguistic boundaries; semantic relationships such as “Faulting implies Stress Release” remain invariant regardless of the language of instruction [45]. Consequently, the “Graph-Navigated” learning model offers a replicable solution for international engineering programs facing similar issues of curriculum fragmentation.
Moreover, from a technical standpoint, although this study utilized the ChaoXing LMS (prevalent in Asia), the underlying architecture is platform-agnostic. The “Cloud-Edge-End” framework and the Neo4j-based ontology can be seamlessly adapted to global LMS ecosystems such as Canvas, Blackboard, or Moodle [46]. The core methodology—mapping curriculum ontologies via graph databases—is transferable, suggesting that future research could readily validate this framework in diverse cross-cultural educational settings to further explore its efficacy.
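The platform-agnostic claim rests on the fact that the curriculum ontology reduces to (concept, relation, concept) triples, which any graph database can ingest. The sketch below, under assumed triple data and a hypothetical `Concept` label, renders such triples as idempotent Cypher `MERGE` statements; it only builds the strings, since executing them would require a live Neo4j instance via the official Python driver:

```python
# Illustrative triples; the paper's actual ontology schema is not shown here.
TRIPLES = [
    ("Fold", "driven_by", "Compressive Stress"),
    ("Fault", "implies", "Stress Release"),
    ("Joint Set", "controls", "Rock Mass Permeability"),
]

def triple_to_cypher(subj, rel, obj):
    """Render one curriculum triple as an idempotent Cypher MERGE.
    Relationship types cannot be parameterized in Cypher, so the
    (pre-validated) relation name is embedded directly."""
    rel_type = rel.upper()
    return (
        f"MERGE (a:Concept {{name: '{subj}'}}) "
        f"MERGE (b:Concept {{name: '{obj}'}}) "
        f"MERGE (a)-[:{rel_type}]->(b)"
    )

statements = [triple_to_cypher(*t) for t in TRIPLES]
print(statements[1])
# In a real deployment, each statement would be run through the
# neo4j driver, e.g. session.run(stmt) inside driver.session().
```

Because the triples themselves are plain data, the same list could equally be exported to an LTI tool or a relational schema for Canvas, Blackboard, or Moodle, which is what makes the methodology transferable.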

5.3. Implementation Challenges and Educational Equity

A primary institutional challenge identified in this study is the significant ‘front-loaded’ investment required for Knowledge Graph construction. Developing the initial CMA ontology and manually validating the 762 semantic edges required substantial expertise and time commitment from the instructor—approximately 60 h of dedicated development time. While this initial cost is high compared to preparing traditional slides, it aligns with the principles of sustainable education through ‘reusability.’ Once the core semantic structure is established, the marginal cost of updating or expanding the graph in subsequent semesters is negligible. Furthermore, emerging generative AI tools can be integrated in future iterations to semi-automate the ontology extraction process, thereby significantly reducing this barrier for widespread adoption.
Beyond the aggregate improvement in scores, a granular analysis of learner diversity reveals a dual narrative regarding educational equity. On one hand, the intervention demonstrated a significant “leveling up” effect aligned with SDG 10 (Reduced Inequalities). As evidenced in Table 1, the Standard Deviation (SD) of the Experimental Group was significantly lower than that of the Control Group. This statistical convergence implies that the Knowledge Graph functions effectively as a “cognitive scaffold” for lower-achieving students [47]. While high-performing students often intuitively construct mental maps of complex concepts, struggling students lack this implicit capability. By externalizing the disciplinary logic into a visual network, the system helps these students bridge the gap between fragmented facts and structural understanding, thereby reducing the “competency divide” within the classroom.
However, this pedagogical benefit is not unconditional; it is heavily contingent upon the students’ digital agency [48]. The correlation analysis (Figure 3) highlighted a strong positive relationship (r = 0.68) between active system usage and academic outcomes, suggesting that the “non-linear” nature of the graph can be a double-edged sword. Qualitative feedback indicated that a minority of students with lower self-regulated learning (SRL) skills felt initially overwhelmed by the complexity of the semantic network. This observation resonates with prior findings that engagement and self-regulation are primary predictors of online learning success [49]. Consequently, while the KG lowers the cognitive barrier to understanding geology, it may inadvertently raise the technical barrier for engagement. This suggests that future iterations must incorporate stronger “nudge” mechanisms—such as automated reminders or simplified “guided tour” modes—to ensure that the technology supports, rather than alienates, learners who are less digitally confident [50].
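The engagement–outcome relationship reported above can, in principle, be recomputed from raw LMS logs. The sketch below implements Pearson’s r from first principles; the two vectors are fabricated placeholders standing in for per-student navigation depth and final score, not the study’s actual log data:

```python
import math

def pearson_r(x, y):
    """Pearson correlation between two equal-length numeric sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Placeholder vectors standing in for per-student LMS log features:
depth = [3, 5, 2, 8, 6, 4, 7, 1]          # graph navigation depth
score = [62, 75, 58, 90, 80, 70, 85, 55]  # final course score
print(round(pearson_r(depth, score), 2))
```

In practice, `scipy.stats.pearsonr` would be preferred because it also returns a p-value, which matters for small samples like the N = 84 cohort here.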
In the current implementation, we addressed this challenge through a ‘human-in-the-loop’ strategy within the blended learning framework. To mitigate cognitive overload for students with lower self-regulation, the instructor provided explicit ‘navigation templates’ (e.g., pre-defined paths for novices) during face-to-face sessions. Furthermore, when students with low digital competencies struggled with the non-linear structure, peer-assisted learning groups were utilized as a compensatory scaffold. This suggests that for ‘at-risk’ learners, the Knowledge Graph should not function as a solitary tool but must be embedded within a supportive social learning context to ensure equitable access to deep learning.

6. Conclusions

This study addresses the critical challenge of aligning digital learning environments with Outcome-Based Education (OBE) standards to foster sustainable engineering competencies. By constructing a Domain Knowledge Graph (DKG) within a “Cloud-Edge-End” collaborative architecture, we developed an instructional model that transforms static resource repositories into dynamic cognitive maps. The empirical results confirm that while the knowledge graph intervention had a moderate effect on basic memory retention, it significantly amplified students’ ability to analyze complex engineering problems (d = 0.74). This indicates that explicit semantic visualization serves as an effective cognitive scaffold, helping students bridge the gap between theoretical geology and practical engineering analysis. Furthermore, by validating a workflow that leverages LMS log data to quantify specific Graduation Attributes, this framework promotes the social sustainability of education, narrowing the performance gap between high- and low-achieving learners and fostering the resilience required by international engineering accords.
Despite the promising outcomes, several limitations should be acknowledged. First, the study was conducted with a relatively small sample size (n = 84) at a single engineering university in China; while representative of the local context, the results may vary across different cultural or institutional settings. Furthermore, the educational culture represents a significant boundary condition. This study was conducted in a context where students are traditionally accustomed to teacher-centered, lecture-based instruction. Consequently, the novelty of the KG-driven autonomous exploration might have produced a ‘novelty effect.’ It remains unclear whether the same efficacy would be observed in Western educational contexts that already emphasize inquiry-based learning. Future cross-cultural studies are needed to validate whether the KG acts as a universal scaffold or if its benefits are culturally specific to learners transitioning from rote memorization to constructivist learning. Second, the evaluation period was limited to one academic semester (16 weeks), focusing on immediate competency achievement. The long-term retention of these skills and their transferability to professional practice remain to be verified through longitudinal studies. Third, the current construction of the ontology relied heavily on manual extraction by experts. This “human-in-the-loop” process, while accurate, is labor-intensive and presents a bottleneck for rapidly scaling the framework to other disciplines.
Future work will focus on addressing these challenges through technological and pedagogical expansion. To improve scalability, we aim to integrate Large Language Models (LLMs) to semi-automatically extract entities and relationships from geological textbooks, significantly reducing the cost of graph construction. Additionally, we plan to extend the validation to cross-institutional contexts, testing the robustness of the framework in diverse educational ecosystems. Ultimately, research will move beyond static navigation towards deep reinforcement learning, developing algorithms that dynamically recommend learning paths based on real-time performance to realize true “Personalized Adaptive Learning”.

Author Contributions

Conceptualization, J.N.; methodology, J.N.; formal analysis, Q.C.; investigation, J.N. and Y.M.; resources, Y.M. and L.Z.; data curation, L.Z.; writing—original draft preparation, X.T.; writing—review and editing, all authors. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by: Teaching Reform Project from the Education Department of Shandong Province of China, Grant Number ZHK202406; Teaching Reform Project from Shandong University of Science and Technology of China, Grant Number MS20231204; National Natural Science Foundation of China, Grant Number 42072226.

Institutional Review Board Statement

Ethical review and approval were waived for this study by the Institutional Committee, as it complies with Article 32(3) of the “Measures for the Ethical Review of Life Sciences and Medical Research Involving Humans” (China, 2023). This exemption applies because the study involved anonymous surveys and was conducted as part of standard educational practices, posing no risk to participants.

Data Availability Statement

The original contributions presented in this study are included in the article. Further inquiries can be directed to the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Spady, W.G. Outcome-Based Education: Critical Issues and Answers; American Association of School Administrators (AASA): Alexandria, VA, USA, 1994; p. 212. [Google Scholar]
  2. Passow, H.J.; Passow, W.E. What competencies should undergraduate engineering programs emphasize? A systematic review. J. Eng. Educ. 2017, 106, 475–526. [Google Scholar] [CrossRef]
  3. Chen, Q.; Yin, H.; Feng, J. Construction of Course Content Integrating Ideas of Engineering Education Accreditation for Higher Education in China: An Example of Geochemistry Course. Sustainability 2023, 15, 12709. [Google Scholar] [CrossRef]
  4. Jonassen, D.H. Supporting problem solving in PBL. Interdiscip. J. Probl. Based Learn. 2011, 5, 95–119. [Google Scholar] [CrossRef]
  5. Libarkin, J.C.; Anderson, S.W. Assessment of learning in entry-level geoscience courses: Results from the Geoscience Concept Inventory. J. Geosci. Educ. 2005, 53, 394–401. [Google Scholar] [CrossRef]
  6. Kastens, K.A.; Manduca, C.A.; Cervato, C. How geoscientists think and learn. Eos. Trans. Am. Geophys. Union 2009, 90, 265–266. [Google Scholar] [CrossRef]
  7. Dabbagh, N.; Kitsantas, A. Personal Learning Environments, social media, and self-regulated learning: A natural formula for connecting formal and informal learning. Internet High. Educ. 2012, 15, 3–8. [Google Scholar] [CrossRef]
  8. Moeck, I.S. Catalog of geothermal play types based on geologic controls. Renew. Sustain. Energy Rev. 2014, 37, 867–882. [Google Scholar] [CrossRef]
  9. Alcalde, J.; Flude, S.; Wilkinson, M.; Johnson, G.; Edlmann, K.; Bond, C.E.; Scott, V.; Gilfillan, S.M. Estimating geological CO2 storage security to deliver on climate mitigation. Nat. Commun. 2018, 9, 2201. [Google Scholar] [CrossRef] [PubMed]
  10. Brundiers, K.; Barth, M.; Cebrián, G. Key competencies in sustainability in higher education—Toward an agreed-upon reference framework. Sustain. Sci. 2021, 16, 13–29. [Google Scholar] [CrossRef]
  11. UNESCO. Engineering for Sustainable Development: Delivering on the Sustainable Development Goals; UNESCO Publishing: Paris, France, 2021; p. 185. [Google Scholar]
  12. Barth, M.; Godemann, J.; Rieckmann, M.; Stoltenberg, U. Developing key competencies for sustainable development in higher education. Int. J. Sustain. High. Educ. 2007, 8, 416–430. [Google Scholar] [CrossRef]
  13. Chankseliani, M.; McCowan, T. Higher education and the Sustainable Development Goals. High. Educ. 2021, 81, 1–8. [Google Scholar] [CrossRef] [PubMed]
  14. Uttal, D.H.; Meadow, N.G.; Tipton, E.; Hand, L.L.; Alden, A.R.; Warren, C.; Newcombe, N.S. The malleability of spatial skills: A meta-analysis of training studies. Psychol. Bull. 2013, 139, 352–402. [Google Scholar] [CrossRef]
  15. Kastens, K.A.; Ishikawa, T. Spatial thinking in the geosciences and cognitive sciences: A cross-disciplinary look at the intersection of the two fields. Spec. Pap. Geol. Soc. Am. 2006, 413, 53–76. [Google Scholar]
  16. Almulla, M.A. The effectiveness of the project-based learning (PBL) approach as a way to engage students in learning. Sage Open 2020, 10, 2158244020938702. [Google Scholar] [CrossRef]
  17. Freeman, S.; Eddy, S.L.; McDonough, M.; Smith, M.K.; Okoroafor, N.; Jordt, H.; Wenderoth, M.P. Active learning increases student performance in science, engineering, and mathematics. Proc. Natl. Acad. Sci. USA 2014, 111, 8410–8415. [Google Scholar] [CrossRef]
  18. Bond, C.E.; Cawood, A.J. A role for virtual outcrop models in structural geology field instruction: The impact on 3D spatial thinking. J. Geogr. High. Educ. 2021, 69, 161–172. [Google Scholar]
  19. Ellis, R.A.; Goodyear, P.; Calvo, R.A.; Prosser, M. University students’ conceptions of learning through blended learning: A phenomenographic analysis. Br. J. Educ. Technol. 2011, 42, 171–185. [Google Scholar]
  20. Haleem, A.; Javaid, M.; Qadri, M.A.; Suman, R. Understanding the role of digital technologies in education: A review. Sustain. Oper. Comput. 2022, 3, 275–285. [Google Scholar] [CrossRef]
  21. Bond, M.; Buntins, K.; Bedenlier, S.; Zawacki-Richter, O.; Kerres, M. Mapping research in student engagement and educational technology in higher education: A systematic evidence map. Int. J. Educ. Technol. High Educ. 2020, 17, 2–30. [Google Scholar] [CrossRef]
  22. Novak, J.D.; Cañas, A.J. The Theory Underlying Concept Maps and How to Construct and Use Them; Institute for Human and Machine Cognition: Pensacola, FL, USA, 2008. [Google Scholar]
  23. Van Merriënboer, J.J.; Sweller, J. Cognitive load theory in health professional education: Design principles and strategies. Med. Educ. 2010, 44, 85–93. [Google Scholar] [CrossRef]
  24. Chen, L.; Chen, P.; Lin, Z. Artificial intelligence in education: A review. IEEE Access 2020, 8, 75264–75278. [Google Scholar] [CrossRef]
  25. Dessì, D.; Fenu, G.; Marras, M.; Recupero, D.R. Bridging learning analytics and cognitive computing for big data classification in micro-learning. Future Gener. Comput. Syst. 2019, 92, 468–477. [Google Scholar] [CrossRef]
  26. Chi, Y.; Qin, Y.; Song, R.; Xu, H. Knowledge Graph in Smart Education: A Case Study of Entrepreneurship Education. Sustainability 2018, 10, 995. [Google Scholar] [CrossRef]
  27. Abu-Salih, B.; Wongthongtham, P.; Chan, K.Y. Knowledge graphs in education: A systematic review. Appl. Sci. 2024, 11, e25383. [Google Scholar]
  28. Duan, C.; Yang, J.; Cui, Q.; Zhang, W.; Wan, X.; Zhang, M. Enhancing the Recommendation of Learning Resources for Learners via an Advanced Knowledge Graph. Appl. Sci. 2023, 15, 4204. [Google Scholar] [CrossRef]
  29. Nesbit, J.C.; Adesope, O.O. Learning with concept and knowledge maps: A meta-analysis. Rev. Educ. Res. 2006, 76, 413–448. [Google Scholar] [CrossRef]
  30. Zawacki-Richter, O.; Marín, V.I.; Bond, M.; Gouverneur, F. Systematic review of research on artificial intelligence applications in higher education—Where are the educators? Int. J. Educ. Technol. High. Educ. 2019, 16, 39. [Google Scholar] [CrossRef]
  31. Biasutti, M.; Frate, S. A validity and reliability study of the Attitudes towards Sustainable Development scale. Environ. Educ. Res. 2017, 23, 214–230. [Google Scholar] [CrossRef]
  32. Ausubel, D.P. The use of advance organizers in the learning and retention of meaningful verbal material. J. Educ. Psychol. 1960, 51, 267–272. [Google Scholar] [CrossRef]
  33. Tucker, B. The flipped classroom. Educ. Next 2012, 12, 82–83. [Google Scholar]
  34. Van de Pol, J.; Volman, M.; Beishuizen, J. Scaffolding in teacher–student interaction: A decade of research. Educ. Psychol. Rev. 2010, 22, 271–296. [Google Scholar] [CrossRef]
  35. Zimmerman, B.J. Becoming a self-regulated learner: An overview. Theor. Pract. 2002, 41, 64–70. [Google Scholar] [CrossRef]
  36. Creswell, J.W.; Creswell, J.D. Research Design: Qualitative, Quantitative, and Mixed Methods Approaches, 5th ed.; SAGE Publications: Thousand Oaks, CA, USA, 2017. [Google Scholar]
  37. Spurlin, J.E.; Rajala, S.A.; Lavelle, J.P. Designing Better Engineering Education Through Assessment: A Practical Resource for Faculty and Department Chairs on Using Assessment and ABET Criteria; Stylus Publishing: Sterling, VA, USA, 2008; p. 384. [Google Scholar]
  38. Cohen, J. Statistical Power Analysis for the Behavioral Sciences, 2nd ed.; Lawrence Erlbaum Associates: Mahwah, NJ, USA, 1988; p. 567. [Google Scholar]
  39. Biggs, J.B.; Tang, C. Teaching for Quality Learning at University, 4th ed.; McGraw-Hill Education: Columbus, OH, USA, 2011; p. 416. [Google Scholar]
  40. Siemens, G.; Long, P. Penetrating the fog: Analytics in learning and education. Educ. Rev. 2011, 46, 30–32. [Google Scholar]
  41. Hwang, G.J.; Yang, L.H.; Wang, S.Y. A concept map-embedded educational computer game for improving students’ learning performance and decreasing their cognitive load. Comput. Educ. 2013, 69, 121–133. [Google Scholar] [CrossRef]
  42. Sweller, J. Cognitive load theory. Psychol. Learn. Motiv. 2011, 55, 37–76. [Google Scholar]
  43. González-Pérez, L.I.; Ramírez-Montoya, M.S. Components of Education 4.0 in 21st Century Skills Frameworks: Systematic Review. Sustainability 2022, 14, 1493. [Google Scholar] [CrossRef]
  44. Carbonell-Carrera, C.; Saorín, J.L. Geospatial Google Street View with Virtual Reality: A Motivational Approach for Spatial Training Education. ISPRS Int. J. Geo-Inf. 2017, 6, 261. [Google Scholar] [CrossRef]
  45. Zhang, J.; Zain, A.M.; Zhou, K.; Chen, X.; Zhang, R. A review of recommender systems based on knowledge graph embedding. Expert Syst. Appl. 2024, 250, 123876. [Google Scholar] [CrossRef]
  46. Turnbull, D.; Chugh, R.; Luck, J. Learning Management Systems, an Overview. In Encyclopedia of Education and Information Technologies; Tatnall, A., Ed.; Springer International Publishing: Cham, Switzerland, 2020; pp. 1052–1058. [Google Scholar]
  47. Kalyuga, S. Expertise Reversal Effect and Its Implications for Learner-Tailored Instruction. Educ. Psychol. Rev. 2007, 19, 509–539. [Google Scholar] [CrossRef]
  48. Hooshyar, D.; Tammets, K.; Ley, T.; Aus, K.; Kollom, K. Learning Analytics in Supporting Student Agency: A Systematic Review. Sustainability 2023, 15, 13662. [Google Scholar] [CrossRef]
  49. Wong, J.; Baars, M.; Davis, D.; Van Der Zee, T.; Houben, G.; Paas, F. Supporting Self-Regulated Learning in Online Learning Environments and MOOCs: A Systematic Review. Int. J. Hum. Comput. Interact. 2019, 35, 356–373. [Google Scholar] [CrossRef]
  50. Weijers, R.J.; de Koning, B.B.; Paas, F. Nudging in education: From theory towards guidelines for successful implementation. Eur. J. Psychol. Educ. 2021, 36, 883–902. [Google Scholar] [CrossRef]