1. Introduction
Distance learning has become a defining feature of contemporary higher education, offering flexibility, scalability, and broader access to knowledge [
1,
2]. Yet, its application in military officer training introduces distinct academic and operational complexities that differ significantly from civilian contexts [
3,
4]. For Lithuania’s future officers, educational programs must not only convey theoretical knowledge but also cultivate decision-making, leadership, and operational readiness—competencies traditionally developed in immersive, in-person environments [
5,
6]. This dual requirement creates inherent tensions between pedagogical effectiveness, technological infrastructure, and the security-sensitive nature of military training.
From a disciplinary perspective, distance education for cadets spans multiple problem domains. The transition from classroom-based to virtual instruction disrupts established teaching models, challenging the delivery of practical skills, situational awareness, and collaborative problem-solving [
7,
8]. Ensuring assessment reliability and academic integrity is particularly difficult in distributed environments, especially when high-stakes evaluations underpin professional accreditation and readiness [
9]. Sustaining motivation, discipline, and engagement—already complex in higher education—is intensified by the strictly structured yet physically decentralized cadet context [
10]. Moreover, the integration of classified or sensitive operational content into online platforms demands stringent cybersecurity measures, which are often absent from civilian e-learning systems [
11]. These challenges are further amplified by the nature of military higher education as a complex adaptive system, where pedagogical processes are tightly coupled with operational demands, personnel rotations, and evolving national security contexts [
12,
13]. Research in system dynamics and risk management has demonstrated the value of adaptive modelling for understanding interdependent subsystems such as instructional design, learner performance, and technological reliability [
14,
15,
16]. However, applications of such modelling approaches to the specific requirements of military distance learning remain limited, despite the fact that performance outcomes in this domain have direct implications for national defence readiness.
In Lithuania, the development of distance education for armed forces personnel is shaped by the National Security Strategy [
17], the Law on Higher Education and Research [
18], and the Long-Term Development Programme of the Lithuanian Armed Forces [
19]. These policy documents emphasize the need for flexible, technology-enabled training solutions to ensure operational readiness, lifelong learning opportunities for cadets and officers, and alignment with NATO standards for interoperability. The General Jonas Žemaitis Military Academy of Lithuania (MAL) has integrated e-learning and blended formats into officer training, especially during limited mobility, international deployments, and the COVID-19 pandemic. Under a Commandant’s order, the Academy shifted operations to Moodle, enabling remote lectures, tasks, and exams to ensure continuity [
20]. It also delivered the Troop Leading Procedures in Movement Course for Ukrainian cadets under NATO’s Defence Education Enhancement Program entirely online, showcasing MAL’s capacity to apply distance learning in both national and international contexts [
21]. Within this framework, distance military education is not only a pedagogical innovation but also a security requirement, ensuring continuity of professional development under hybrid threat conditions and during periods of national emergency.
The existing literature reveals three converging but under-integrated approaches that inform the present study. First, system dynamics modelling offers a means to capture feedback loops, delays, and non-linear effects within educational systems, providing tools for scenario simulation and overall system resilience analysis [
22,
23]. Second, contemporary risk management frameworks, including ISO 31000:2022 and the Dynamic Risk Management (DRM) framework, provide structured processes for identifying, assessing, and adapting to evolving threats and uncertainties in both pedagogical and operational contexts [
24,
25]. Third, learning analytics enables continuous performance monitoring, the detection of early-warning indicators of learning gaps, and evidence-based instructional interventions [
26,
27].
Despite advances in each of these areas, their integration into a unified decision-making framework for security-sensitive distance education has not been systematically explored. The purpose of this study is to address this gap by integrating system dynamics modelling, modern risk management standards, and learning analytics into a tailored decision-making methodology for Lithuanian military cadet distance learning. This integration helps proactively identify and mitigate academic and operational risks, ranging from skill loss to attrition, while enabling the dynamic allocation of instructional and technological resources. By combining these three methodological strands, the study aims to identify the main causal factors, prioritize intervention points, and increase overall system resilience in military e-learning environments, thereby contributing both to the theoretical understanding of distance learning in complex, security-sensitive domains and to the practical development of resilient, adaptive education systems.
The article is structured as follows.
Section 2 reviews the relevant literature and justifies the selection of study criteria.
Section 3 details the methodology, including research design, classification of dimensions, justification for fuzzy DEMATEL (Decision-Making Trial and Evaluation Laboratory), and data collection.
Section 4 reports the analytical results, while
Section 5 discusses their implications for distance military education in complex, risk-sensitive environments.
Section 6 concludes with key findings, practical implications, and future research directions.
2. Literature Review
Military distance education presents challenges that extend beyond those typically addressed in civilian e-learning [
28]. Whereas civilian contexts prioritize efficiency, learner satisfaction, and employability, military programs must simultaneously ensure operational readiness, interoperability with allied standards, and continuity under deployment or crisis conditions [
29]. Instructional materials often involve classified or security-sensitive content, requiring strict adherence to information assurance frameworks and overall system resilience against cyber intrusion. Cadets also operate within rigid routines and readiness demands, intensifying difficulties in sustaining motivation, discipline, and engagement. Moreover, officer training must cultivate leadership and decision-making competencies that are not easily transferable through conventional online methods [
30,
31]. These distinct requirements position military distance learning not merely as a pedagogical adaptation but as a strategic capability tied to force development and national security. Accordingly, analytical frameworks for this domain must integrate operational risk, technological robustness, and pedagogical dynamics, rather than relying on models derived from civilian distance-education contexts [
32].
Designing a risk-aware, resilient distance-learning model for military officer education requires criteria grounded in three complementary knowledge streams: system dynamics of complex socio-technical systems, contemporary risk management, and learning analytics for data-driven decision-making. These domains are not independent but mutually reinforcing: system dynamics provides the modelling grammar for causal structures and feedback loops, risk management standards define the evaluative lens through which uncertainties are classified and mitigated, and learning analytics offers the continuous empirical signals that inform adaptive interventions. The integration of these three elements establishes a theoretical mechanism where models, norms, and data converge to guide resilient decision-making.
From a system dynamics perspective, cadet distance learning can be conceptualized as a tightly coupled complex system in which instructional design, technology reliability, and organizational processes co-evolve. Seminal system dynamics studies emphasize the importance of feedback loops, delays, nonlinearity, and policy resistance, highlighting how interventions in one part of the system can trigger unexpected effects elsewhere. For example, stricter assessment rules may discourage collaboration, or increased online monitoring could reduce student trust and engagement [
33,
34,
35]. Strategy support and decision-focused modelling further motivate criteria on adaptivity (the capacity to adjust processes in response to change), overall system resilience, and resource scalability, because policy levers in education frequently create unintended consequences that must be revealed through causal maps and simulation [
36]. In this study, system dynamics is positioned as the structural modelling core into which risk management requirements and learning analytics indicators can be embedded.
Contemporary risk management prioritizes structured identification, analysis, and treatment of uncertainty. ISO 31000 provides organization-level guidance for establishing risk criteria, controls, and continual improvement [
37], while Aven clarifies risk conceptualization under deep uncertainty—informing the inclusion of assessment integrity, technology reliability, and cybersecurity/IA as explicit risk domains [
38]. Resilience engineering extends this to everyday performance—emphasizing an organization’s capacity to anticipate, monitor, respond, and learn—which directly motivates criteria on adaptive risk monitoring and organizational learning loops in distance programs [
39].
Learning analytics (LA) contributions offer data-driven capability to surface early-warning indicators for disengagement, skill decay, and academic integrity risks. Foundational LA work and systematic reviews establish the role of behavioural traces, dashboards, and interventions in improving outcomes, but also stress ethics, privacy, and governance constraints that must be built into any monitoring architecture [
40,
41,
42]. Meta-analytic evidence from online and blended learning links teaching/social/cognitive presence to actual learning and satisfaction [
43], while recent scoping reviews show LA research tends to privilege behavioural measures (clicks, time-on-task), highlighting the need to include multi-dimensional engagement criteria (behavioural, cognitive, affective) [
44].
Regarding assessment fairness, mixed but converging evidence suggests that online exam misconduct remains a material risk; proctoring can reduce inflated scores, though equity, privacy, and validity concerns persist [
45,
46,
47,
48,
49,
50]. Given the defence context, cybersecurity and information assurance are non-negotiable. NIST SP 800-171 [
51] specifies concrete safeguards for Controlled Unclassified Information that often appears in military curricula and training artefacts [
51,
52,
53,
54]. Vulnerability analyses of widely used learning management system (LMS) platforms confirm persistent exposure to Common Vulnerabilities and Exposures (CVE)-tracked issues, justifying criteria on platform security posture and secure interoperability with defence systems [
55]. At the same time, technology/service quality and instructor digital competence affect learning effectiveness at scale in online programs, reinforcing criteria on platform reliability/availability and instructor readiness for data-informed pedagogy [
56].
Taken together, this literature base establishes a triangulated theoretical mechanism: system dynamics provides the structural logic of causal interactions, risk management supplies the evaluative criteria for handling uncertainty, and learning analytics ensures continuous data-driven feedback to adjust both models and controls. This synthesis justifies the application of Fuzzy Decision-Making Trial and Evaluation Laboratory to structure causal criteria into clusters, which can then be embedded in system dynamics modelling and used for adaptive risk treatment planning [
57].
3. Methodology
3.1. Research Design
The present study adopts a Fuzzy Decision-Making Trial and Evaluation Laboratory (Fuzzy DEMATEL) approach to model and prioritize the complex causal relationships among critical factors affecting the overall system resilience and effectiveness of distance military education. The methodology integrates System Dynamics Modelling (SDM), Contemporary Risk Management Standards (CRMS), and Learning Analytics (LA), reflecting the study’s three primary dimensions identified in the introduction. This integration supports both qualitative expert judgment and quantitative network analysis within a multi-criteria decision-making (MCDM) framework.
The three dimensions of the study are operationalized into twelve evaluation criteria that address pedagogical, technological, operational, and security-specific challenges (
Table 1). Each criterion is based on previous research, ensuring conceptual validity.
3.2. Classification of Study Dimensions and Criteria
The study is structured around three interconnected dimensions that reflect the complex nature of distance military education. System Dynamics Modelling (SDM) captures the feedback loops, interdependencies, and time delays inherent in military training systems, enabling simulation of instructional interventions under operational constraints. Contemporary Risk Management Standards (CRMS), specifically ISO 31000:2022 and the Dynamic Risk Management Framework (DRMF), provide a structured approach for identifying, assessing, and mitigating both pedagogical and operational risks in a security-sensitive context. Learning Analytics (LA) leverages real-time educational data to monitor cadet performance, predict potential failures, and inform adaptive instructional strategies.
Details of the specific criteria selected for this study, belonging to the three dimensions discussed, are presented in
Table 2.
These three dimensions are represented by twelve criteria, each justified by the literature sources to ensure theoretical validity and practical significance. Moreover, to analyze the complex cause–effect relationships among these criteria, the study employs a multi-stage multi-criteria decision-making (MCDM) approach to evaluate and optimize the overall system resilience of distance learning for Lithuanian military cadets, integrating system dynamics modelling, contemporary risk management standards, and learning analytics. The methodological design follows a complex systems perspective, acknowledging the interdependence of educational, technological, and operational subsystems.
3.3. Justification for Fuzzy DEMATEL
The DEMATEL (Decision-Making Trial and Evaluation Laboratory) technique is well-suited to uncovering cause–effect structures in complex systems, while its fuzzy extension was chosen because it can handle the uncertainty and subjectivity of expert judgments while revealing the most influential and dependent factors in the system [
57,
58]. In military education, the interplay between operational constraints, technology readiness, and pedagogical outcomes is influenced by subjective judgments, making crisp numerical evaluations insufficient. The logical procedures of the Fuzzy DEMATEL method are presented in previous studies [
59,
60,
61].
Accordingly, the full fuzzy-triangular DEMATEL method procedure can be carried out in eight sequential steps of expert opinion examination (see
Figure 1). First, the problem context is defined, and relevant evaluation criteria are established based on the literature and expert validation. Second, a panel of domain experts is purposively selected and asked to provide pairwise influence assessments between criteria using predefined linguistic terms. Third, the experts’ linguistic assessments are translated into triangular fuzzy numbers to capture the uncertainty and imprecision of human judgement and to ensure adequate knowledge coverage. Fourth, individual fuzzy matrices are aggregated to obtain a consolidated group direct-relation matrix. Fifth, normalization is applied to ensure comparability of scales across expert responses. Sixth, the total-relation matrix is computed, which incorporates both direct and indirect influences among criteria. Seventh, prominence (Ri + Ci) and relation (Ri − Ci) values are derived, enabling the classification of factors into causes (drivers) and effects (dependents). Finally, a structural model in the form of a cause-and-effect diagram is constructed. This structured approach ensures methodological transparency, while the fuzzy-triangular representation maintains robustness by handling the vagueness characteristic of expert-based evaluations.
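The computational core of these steps (normalization, the total-relation matrix, and the Ri/Ci sums) can be sketched compactly. The example below is an illustrative toy with a 3-criterion crisp matrix and invented influence scores, not the study's data; it uses one common normalization choice (division by the largest row sum) and approximates the total-relation matrix by truncating the power series.

```python
def matmul(a, b):
    """Multiply two square matrices given as lists of lists."""
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def total_relation(direct):
    """Normalize the direct-relation matrix and accumulate indirect
    influences: T = X + X^2 + X^3 + ...; the series converges because
    the normalized matrix has spectral radius below 1."""
    n = len(direct)
    s = max(sum(row) for row in direct)            # largest row sum
    x = [[v / s for v in row] for row in direct]   # normalized matrix X
    t = [row[:] for row in x]
    power = [row[:] for row in x]
    for _ in range(200):                           # truncate the series
        power = matmul(power, x)
        t = [[t[i][j] + power[i][j] for j in range(n)] for i in range(n)]
    return t

# Toy 3-criterion example with illustrative crisp influence scores.
direct = [[0.0, 3.0, 2.0],
          [1.0, 0.0, 3.0],
          [2.0, 1.0, 0.0]]
T = total_relation(direct)
n = len(T)
R = [sum(row) for row in T]                        # influence exerted (Ri)
C = [sum(T[i][j] for i in range(n)) for j in range(n)]  # influence received (Ci)
prominence = [r + c for r, c in zip(R, C)]         # Ri + Ci
relation = [r - c for r, c in zip(R, C)]           # Ri - Ci
```

Because every unit of influence exerted is received somewhere, the relation values always sum to zero, which is a useful sanity check on any implementation.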
From the third step onward, the procedure is grounded in the mathematical formulations of the fuzzy-triangular DEMATEL method. These formulations are presented in detail through a structured, stepwise explanation below.
Step 3. At the beginning, the direct-relation matrix must be designed. First, the set of criteria for evaluating and optimizing the overall system resilience of distance learning for Lithuanian military cadets is defined as C = {C1, C2, …, Cn}, where each Ci characterizes the i-th criterion (i = 1, 2, …, n). The set of experts is denoted as E = {E1, E2, …, EK}, where Ek corresponds to the k-th expert (k = 1, 2, …, K). To capture expert opinions, a set of linguistic terms L = {l1, l2, …, lS} is introduced, where each ls represents the s-th linguistic term. Each expert Ek completes an individual direct-relation matrix, which can be expressed as Equation (1):

Z(k) = [zij(k)]n×n, with zii(k) = 0, (1)

where zij(k) denotes expert Ek’s linguistic assessment of the direct influence of criterion Ci on criterion Cj.
Step 4. The linguistic terms in the direct-relation matrices are then converted into triangular fuzzy numbers using the established scale [
61], shown in
Table 3.
where z̃ij(k) = (lij(k), mij(k), uij(k)) characterizes the triangular fuzzy number describing the strength of the relation between criteria Ci and Cj provided by expert Ek (i, j = 1, 2, …, n; k = 1, 2, …, K).
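The linguistic-to-fuzzy conversion of Step 4 can be sketched as a simple lookup. The triangular values below are a commonly used five-level scale and stand in for the exact values of Table 3, which are not reproduced here.

```python
# Illustrative mapping from the five linguistic terms to triangular
# fuzzy numbers (l, m, u). These values are an assumed, commonly used
# scale, not necessarily the exact entries of Table 3.
TFN_SCALE = {
    "No influence":        (0.00, 0.00, 0.25),
    "Very low influence":  (0.00, 0.25, 0.50),
    "Low influence":       (0.25, 0.50, 0.75),
    "High influence":      (0.50, 0.75, 1.00),
    "Very high influence": (0.75, 1.00, 1.00),
}

def to_fuzzy_matrix(linguistic):
    """Convert one expert's linguistic direct-relation matrix into a
    matrix of triangular fuzzy numbers (Step 4)."""
    return [[TFN_SCALE[term] for term in row] for row in linguistic]

# A 2x2 fragment of one hypothetical expert's assessments.
expert_1 = [["No influence", "High influence"],
            ["Low influence", "No influence"]]
fuzzy_1 = to_fuzzy_matrix(expert_1)
```

Each resulting triple satisfies l ≤ m ≤ u, which any valid triangular fuzzy scale must guarantee.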
Step 9. The causal diagram is built with the horizontal axis (Ri + Ci) and the vertical axis (Ri − Ci). The horizontal axis, named Prominence, represents the overall degree of importance assigned to each criterion. The vertical axis, denoted as Relation, captures the magnitude of its causal influence within the system. In this framework, positive Relation values identify criteria functioning as causal drivers that shape system behaviour, while negative values indicate effect criteria that are primarily influenced by others. This distinction enables the prioritization of driver factors as leverage points for strengthening system resilience.
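As a concrete illustration of Step 9, the driver/effect classification can be reproduced directly from prominence and relation values. The sketch below uses three of the criteria and the (Ri + Ci, Ri − Ci) values reported in Table 6; it is a simplified rendering, not the study's full computation.

```python
def classify(criteria, prominence, relation):
    """Return (name, Ri+Ci, Ri-Ci, role) tuples sorted by prominence,
    labelling each criterion as a causal driver or a dependent effect."""
    rows = [
        (name, p, r, "driver" if r > 0 else "effect")
        for name, p, r in zip(criteria, prominence, relation)
    ]
    return sorted(rows, key=lambda row: row[1], reverse=True)

names = ["C2", "C4", "C12"]
prom = [0.823, 0.742, 0.743]   # Ri + Ci values from Table 6
rel = [0.101, 0.115, 0.137]    # Ri - Ci values from Table 6
for name, p, r, role in classify(names, prom, rel):
    print(f"{name}: prominence={p:.3f}, relation={r:+.3f} -> {role}")
```

All three criteria have positive relation values, so all three land in the driver quadrant of the causal diagram.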
3.4. Data Collection Procedure
Prior to the evaluation process, a set of twelve criteria was developed based on an extensive review of the scientific literature and aligned with the three analytical dimensions of the study. These criteria were subjected to expert validation, during which participants assessed their significance, clarity, and comprehensiveness. Minor refinements in terminology were introduced to ensure alignment with operational and policy-specific language in the military education context. Following validation, experts conducted pairwise comparisons to evaluate the direct influence of each criterion on every other criterion. These assessments were expressed on a five-point linguistic scale ranging from ‘No influence’ to ‘Very high influence’ (see
Table 3).
The empirical stage of the study relied on expert-based judgments to capture the complex interdependencies among the criteria relevant to military distance education. Given the security-sensitive and domain-specific nature of the research, a purposive sampling strategy was employed to ensure that participants possessed the necessary knowledge, experience, and operational insight to provide reliable judgments. Selection criteria included: (1) a minimum of ten years of professional experience in military education, instructional design, or defence-related IT and cybersecurity; (2) direct involvement in the planning, implementation, or evaluation of distance or blended learning programs; and (3) familiarity with risk management practices in military contexts. Data collection took place in May 2025 using a questionnaire designed as a relational matrix of twelve factors, where experts assessed the degree of influence between each pair. In total, fifteen experts participated: eight instructional staff members from the Lithuanian Military Academy, four defence information technology and cybersecurity specialists, and three military education policy and risk management officers. Each expert possessed at least ten years of professional experience, thereby providing both operational familiarity and strategic insight.
The experts’ linguistic evaluations were subsequently transformed into triangular fuzzy numbers in accordance with established fuzzy DEMATEL protocols [
59]. This enabled systematic management of uncertainty and subjectivity inherent in human judgment. The overall methodological sequence is illustrated in
Figure 1, which demonstrates the structured process of expert selection, criteria validation, pairwise influence assessment, fuzzy number conversion, and group matrix aggregation. This procedure ensured that the results were both analytically robust and representative of the domain expertise required in the military education context.
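The group-matrix aggregation mentioned above can be sketched as component-wise averaging of the experts' triangular fuzzy numbers, which is one standard aggregation choice in fuzzy DEMATEL; the matrices below are small hypothetical fragments, not the study's data.

```python
def aggregate(expert_matrices):
    """Combine individual fuzzy direct-relation matrices into one group
    matrix by averaging each (l, m, u) component across experts."""
    k = len(expert_matrices)
    n = len(expert_matrices[0])
    return [[tuple(sum(m[i][j][c] for m in expert_matrices) / k
                   for c in range(3))
             for j in range(n)] for i in range(n)]

# Two hypothetical experts' 2x2 fuzzy assessments.
m1 = [[(0.0, 0.0, 0.25), (0.50, 0.75, 1.0)],
      [(0.25, 0.50, 0.75), (0.0, 0.0, 0.25)]]
m2 = [[(0.0, 0.0, 0.25), (0.75, 1.00, 1.0)],
      [(0.25, 0.50, 0.75), (0.0, 0.0, 0.25)]]
group = aggregate([m1, m2])
```

Averaging preserves the triangular shape of each fuzzy number while smoothing disagreement between experts; with fifteen experts, the same function applies unchanged.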
4. Results
This study applies Fuzzy System Theory in combination with the Decision-Making Trial and Evaluation Laboratory (DEMATEL) method to develop a structured and systematic approach for assessing the overall system resilience and effectiveness of distance learning for Lithuanian military cadets. The integration of fuzzy logic with DEMATEL enables the modelling of expert judgments under uncertainty, allowing for a nuanced and quantitative analysis of the complex interdependencies among critical factors. This approach is particularly suited to military education contexts, where both operational and pedagogical variables are interlinked, and where decision-making often occurs under conditions of incomplete or imprecise information.
As described in the Methodology Section, the analysis is grounded in expert-based evaluation. Fifteen subject-matter experts with extensive experience in military education, operational planning, and risk management participated in structured pairwise influence assessments. They evaluated twelve essential criteria: C1—Instructional Design Adaptability; C2—Feedback Loop Effectiveness; C3—Learner Engagement Dynamics; C4—Scenario Simulation Capability; C5—Risk Identification Completeness; C6—Risk Assessment Accuracy; C7—Contingency Planning Robustness; C8—Real-Time Risk Monitoring; C9—Early Warning Indicators; C10—Data-Driven Personalization; C11—Performance Monitoring Granularity; C12—Predictive Intervention Effectiveness, that together shape the sustainability and operational readiness potential of distance learning for military cadets (see
Table 2). These criteria encompass pedagogical adaptability, operational risk management, and analytics-driven decision-making, reflecting the multidimensional nature of military distance education. Using the fuzzy DEMATEL approach, the experts’ linguistic assessments were first compiled (see
Table 4).
The assessments presented in
Table 4 were converted into a fuzzy direct-relation matrix (see
Table 5) in line with the fuzzy DEMATEL methodology, ensuring a rigorous mathematical basis for modelling and prioritizing the complex causal relationships among the critical factors affecting the overall system resilience and effectiveness of distance military education. Next, Equation (3) was applied to normalize the direct-relation matrix, producing the normalized direct-relation matrix used in the study’s multi-stage MCDM analysis. Specifically, each fuzzy value was divided by the maximum entry identified in the matrix, thereby standardizing the scale of influence judgments while preserving the relative significance of inter-criterion relationships. This step is critical in the analysis of complex systems, as it refines the input data to facilitate reliable cross-dimensional integration of system dynamics modelling, contemporary risk management standards, and learning analytics.
Following normalization, the generalized relation matrix was derived (Step 5) and subsequently processed through a defuzzification procedure (Step 6), wherein triangular fuzzy values were systematically converted into crisp values. This transformation enhances analytical precision and enables quantitative evaluation across interdependent educational, technological, and operational subsystems.
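The defuzzification of Step 6 reduces each triangular fuzzy number to a crisp value. The study cites established protocols for this; shown below is the simple graded-mean formula (l + 4m + u) / 6, one common choice (the CFCS method is another), used here purely as an illustrative stand-in.

```python
def defuzzify(tfn):
    """Reduce a triangular fuzzy number (l, m, u) to a crisp value using
    the graded-mean formula, which weights the modal value most heavily.
    This is an assumed illustrative choice, not necessarily the exact
    protocol applied in the study."""
    l, m, u = tfn
    return (l + 4 * m + u) / 6

crisp = defuzzify((0.25, 0.50, 0.75))  # symmetric TFN -> its modal value
```

For a symmetric triangular number the graded mean coincides with the modal value m, while skewed numbers are pulled toward their longer tail.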
In Step 7, the study continued to identify causal structures within the system by calculating the row and column sums of the generalized relation matrix. These values (Ri + Ci and Ri − Ci) provide a quantitative basis for distinguishing between causal drivers and dependent effects, thus aligning with the core objective of the DEMATEL methodology: to uncover directional cause–effect dynamics within complex decision environments. In the specific context of distance military education, this process reveals the hierarchical influence structure of the twelve criteria, offering insights into how overall system resilience can be optimized through targeted interventions (see
Table 6).
Thus, the hierarchical influence structure of the study criteria provides two critical measures for understanding the systemic role of each criterion: degree of centrality (Ri + Ci) and degree of causality (Ri − Ci). Centrality indicates the overall importance of a criterion within the system by capturing both the influence exerted (Ri) and the influence received (Ci). Higher Ri + Ci values reflect criteria that are highly interconnected and thus structurally central to system resilience and adaptability.
Consequently, according to the calculations, the criterion with the highest centrality score is Feedback Loop Effectiveness (C2, 0.823), highlighting the central role of adaptive information exchange in maintaining instructional and operational resilience. It is followed by Predictive Intervention Effectiveness (C12, 0.743), Scenario Simulation Capability (C4, 0.742), and Early Warning Indicators (C9, 0.734). These findings suggest that the ability to anticipate risks, simulate operational scenarios, and establish early detection mechanisms is essential for shaping the overall effectiveness of distance military education.
Conversely, Risk Assessment Accuracy (C6, 0.600) and Learner Engagement Dynamics (C3, 0.615) are positioned at the lower end of centrality, suggesting that while important, they are more peripheral outcomes that depend strongly on upstream drivers.
The causality (Ri − Ci) measure distinguishes driver (cause) criteria from dependent (effect) criteria. Positive values (Ri − Ci > 0) identify drivers that exert net influence on the system, while negative values indicate dependent factors that are primarily influenced by others:
Driver criteria (Ri − Ci > 0): Six criteria fall into this group, including Predictive Intervention Effectiveness (C12, 0.137), Scenario Simulation Capability (C4, 0.115), Performance Monitoring Granularity (C11, 0.112), and Early Warning Indicators (C9, 0.092). These represent leverage points within the system: strengthening them drives improvements across the dependent factors. Notably, these drivers span all three analytical dimensions, System Dynamics Modelling (SDM), Risk Management Standards (RMS), and Learning Analytics (LA), underscoring the interdependent nature of the decision environment.
Effect criteria (Ri − Ci < 0): The remaining six criteria, including Instructional Design Adaptability (C1, −0.146), Learner Engagement Dynamics (C3, −0.155), and Risk Assessment Accuracy (C6, −0.204), are classified as outcomes. These reflect the end-state performance of the distance education system, improving only when upstream drivers are effectively enhanced.
The final step of the analysis involved constructing the cause-and-effect relationship diagram (see
Figure 2), which provides a visual representation of the interdependencies among educational, technological, and operational subsystems.
The diagram (see
Figure 2) provides a structured basis for decision-makers to identify leverage points and formulate evidence-based strategies aimed at strengthening the sustainability and overall resilience of distance learning in military education. The classification of criteria into causal factors (C12 > C4 > C11 > C2 > C9 > C10 > C7) and effect factors (C8 > C5 > C1 > C3 > C6) is particularly significant, as it identifies which criteria require practical intervention to generate system-wide improvements. Because causal criteria exert direct influence over the dependent effect criteria, they should be accorded priority in decision-making and resource allocation.
Furthermore, the dual interpretation of centrality and causality reveals a layered structure in which driver criteria act as enablers of overall system resilience and adaptability, while effect criteria represent performance indicators. The results show that strengthening Feedback Loop Effectiveness (C2) and Scenario Simulation Capability (C4) not only enhances real-time adaptability (C1, Instructional Design Adaptability) but also improves learner motivation (C3, Learner Engagement Dynamics). Similarly, bolstering Early Warning Indicators (C9) and Predictive Intervention Effectiveness (C12) enhances downstream operational safeguards such as Risk Identification Completeness (C5) and Real-Time Risk Monitoring (C8), thereby contributing to the overall resilience of the distance-learning system.
Additionally, this analysis demonstrates that strategic priority should first target the highest-ranked driver criteria: Predictive Intervention Effectiveness (C12, Ri + Ci = 0.743, Ri − Ci = 0.137), Scenario Simulation Capability (C4, Ri + Ci = 0.742, Ri − Ci = 0.115), and Performance Monitoring Granularity (C11, Ri + Ci = 0.715, Ri − Ci = 0.112). All three show both high centrality and positive causality values, confirming their role as systemic drivers. Because these criteria exert influence on multiple downstream effects (Instructional Design Adaptability, Learner Engagement Dynamics, and Risk Identification Completeness), investments here generate the strongest cascading impact across the entire system.
Also, effect criteria such as Instructional Design Adaptability (C1, Ri − Ci = −0.146), Learner Engagement Dynamics (C3, Ri − Ci = −0.155), and Risk Assessment Accuracy (C6, Ri − Ci = −0.204) exhibit negative causality, meaning they are dependent outcomes rather than drivers. Thus, efforts to directly refine pedagogy or engagement strategies are unlikely to succeed unless supported by the upstream reinforcement of feedback loops (C2, Ri − Ci = 0.101), simulation environments (C4), and predictive analytics (C12). This confirms the systems-thinking principle that improving outcomes requires strengthening their causal drivers rather than addressing symptoms in isolation.
Finally, the top-ranked driver criteria emerge from all three dimensions of the study: System Dynamics Modelling (Scenario Simulation Capability (C4), Feedback Loop Effectiveness (C2)), Risk Management Standards (Contingency Planning Robustness (C7)), and Learning Analytics (Predictive Intervention Effectiveness (C12), Performance Monitoring Granularity (C11), Early Warning Indicators (C9)). Notably, this result empirically confirms that overall system resilience in distance military education cannot be achieved through a single disciplinary aspect. Instead, a unified decision-support framework that integrates system dynamics, risk management, and learning analytics is essential to ensure adaptability of instructional processes, structural robustness of platforms and procedures, and operational readiness and security in training environments.
5. Discussion
The Fuzzy DEMATEL analysis reveals a clear cause-and-effect structure among the twelve criteria, each aligned with one of the study’s three conceptual dimensions: System Dynamics Modelling, Contemporary Risk Management Standards, and Learning Analytics. This structure not only confirms theoretical expectations from the prior literature but also provides a domain-specific interpretation for distance military education systems operating in complex, risk-sensitive environments.
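For readers less familiar with the method, the Ri and Ci quantities used throughout this discussion are row and column sums of the DEMATEL total-relation matrix T = D(I − D)⁻¹, where D is the normalized direct-influence matrix. A minimal sketch of that standard arithmetic on a hypothetical 3 × 3 matrix (illustrative only; the study itself used twelve criteria and defuzzified fuzzy expert ratings):

```python
import numpy as np

# Hypothetical 3x3 direct-influence matrix (illustrative only; the
# study's analysis used twelve criteria and fuzzy expert ratings).
A = np.array([
    [0.0, 3.0, 2.0],
    [1.0, 0.0, 3.0],
    [2.0, 1.0, 0.0],
])

# Normalize by the largest row sum, then compute the total-relation
# matrix T = D (I - D)^(-1), the core DEMATEL step.
D = A / A.sum(axis=1).max()
T = D @ np.linalg.inv(np.eye(len(D)) - D)

R = T.sum(axis=1)   # influence exerted by each criterion
C = T.sum(axis=0)   # influence received by each criterion
prominence = R + C  # centrality (Ri + Ci)
relation = R - C    # causality (Ri - Ci): > 0 driver, < 0 effect

for i, (p, r) in enumerate(zip(prominence, relation), start=1):
    role = "driver" if r > 0 else "effect"
    print(f"C{i}: R+C = {p:.3f}, R-C = {r:+.3f} ({role})")
```

By construction, the Ri − Ci values sum to zero across all criteria, which is why every DEMATEL map partitions cleanly into a cause group and an effect group.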
From a system dynamics perspective, the identification of strong causal variables, particularly C2 (Feedback Loop Effectiveness), C4 (Scenario Simulation Capability), and C12 (Predictive Intervention Effectiveness), is consistent with previous modelling studies, which emphasize feedback-rich leverage points as determinants of system adaptability and performance under dynamic conditions [14,15,16]. In this study, these criteria form a reinforcing mechanism: feedback loops (C2) capture real-time learning signals, scenario simulations (C4) operationalize those signals in a training context, and predictive interventions (C12) ensure timely corrective action. Prior studies in defence simulation training similarly conclude that simulation-based adaptive design accelerates skill acquisition while enhancing decision-making under uncertainty [28].
In the risk management dimension, causal criteria such as C7 (Contingency Planning Robustness) and C9 (Early Warning Indicators) strongly influence effect variables such as C5 (Risk Identification Completeness), C6 (Risk Assessment Accuracy), and C8 (Real-Time Risk Monitoring). This aligns with ISO 31000:2022 principles, in which upstream contingency planning and early warning systems are prerequisites for precise risk identification and assessment [24]. The results of this study extend those principles into a learning environment, showing that well-structured contingency protocols not only mitigate operational risks but also enhance the responsiveness of instructional systems in the event of disruptions, a finding supported by recent research in digital education risk governance [62].
The learning analytics dimension is marked by the causal positioning of C10 (Data-Driven Personalization), C11 (Performance Monitoring Granularity), and C12 (Predictive Intervention Effectiveness). These findings align with earlier studies demonstrating that robust analytics infrastructures do not merely provide passive reporting but actively shape instructional strategies and learner engagement trajectories [27,43]. In this study’s context, analytics-driven personalization ensures that cadets’ learning paths are dynamically aligned with their performance data, thereby reducing attrition risk and maintaining training quality across geographically distributed cohorts.
When interpreted collectively, the causal group represents strategic control levers that, when optimized, trigger improvements in effect variables that serve as system performance indicators. This interplay reinforces the need for an integrated governance model that combines feedback-rich system design, proactive risk control, and real-time analytics. Such a model is essential for educational institutions tasked with preparing military officers for unpredictable, high-stakes environments.
Beyond the immediate military training setting, these findings resonate with the governance of other complex socio-technical systems where adaptability, overall system resilience, and data-driven decision-making are essential, such as critical infrastructure management, emergency response planning, and large-scale remote education. The emphasis on feedback loops, scenario simulation, and predictive interventions mirrors strategies adopted in sectors like civil aviation and disaster management, underscoring the transferability of this study’s recommendations.
While the present study provides a robust causal map grounded in expert consensus, it is limited by its reliance on a single institutional context. Future research should conduct longitudinal studies across multiple military academies and operational scenarios to test the stability of the identified relationships. Moreover, simulation-based optimization and AI-enhanced predictive analytics could further refine intervention strategies, enabling real-time system reconfiguration in response to emerging threats or performance anomalies.
6. Conclusions
This study applied a structured fuzzy multi-criteria causal analysis to assess twelve interrelated criteria shaping the effectiveness and overall system resilience of distance military education. Across three interconnected dimensions, namely System Dynamics Modelling, Contemporary Risk Management Standards, and Learning Analytics, we distinguished causal drivers (Feedback Loop Effectiveness (C2), Scenario Simulation Capability (C4), Contingency Planning Robustness (C7), Early Warning Indicators (C9), Data-Driven Personalization (C10), Performance Monitoring Granularity (C11), and Predictive Intervention Effectiveness (C12)) from effect indicators (Instructional Design Adaptability (C1), Learner Engagement Dynamics (C3), Risk Identification Completeness (C5), Risk Assessment Accuracy (C6), and Real-Time Risk Monitoring (C8)).
The dominance of feedback-rich and simulation-oriented drivers underscores that adaptability depends on timely data exchange and scenario-based rehearsal. Risk management functions, particularly upstream planning and early warning, are critical for accurate risk identification, while granular learning analytics and predictive interventions sustain learner engagement and operational readiness.
To put these insights into practice at the Lithuanian Military Academy, we propose a three-tiered implementation strategy:
Short-term efforts should enhance feedback loops and pilot scenario simulations to improve adaptive instructional processes.
Medium-term steps should develop predictive analytics dashboards and train instructors in learning analytics tools to operationalize monitoring and early warning.
Long-term priorities should integrate these systems with secure defence IT infrastructure and extend the framework to allied military academies, ensuring interoperability and scalability.
This approach prioritizes high-impact causal drivers, aligning resource allocation with systemic influence to maximize improvement outcomes. The integration of system dynamics, risk governance, and advanced analytics strengthens overall system resilience and adaptability in complex, high-stakes training environments. Furthermore, the framework is transferable to other security-sensitive domains such as emergency management, critical infrastructure protection, and large-scale online education, where robust, data-driven decision-making is essential.
Overall, this study provides a practical roadmap for designing resilient, adaptive, and data-informed military learning ecosystems capable of operating effectively under uncertainty and rapidly evolving conditions, bridging theoretical insights and actionable implementation.