Review

Active Learning in University Physics for Sustainable Higher Education: Effective Components, Mechanisms, and SDG-Aligned Competency Pathways—A Multidimensional Review

1 College of Educational Sciences, Harbin Normal University, Harbin 150025, China
2 College of Educational Sciences, Harbin University, Harbin 150086, China
* Author to whom correspondence should be addressed.
These authors contributed equally to this work.
Sustainability 2026, 18(6), 2791; https://doi.org/10.3390/su18062791
Submission received: 6 January 2026 / Revised: 8 February 2026 / Accepted: 13 February 2026 / Published: 12 March 2026
(This article belongs to the Special Issue STEM Education and Innovative Methodologies for Sustainability)

Abstract

Active learning has increasingly been adopted as an evidence-aligned approach to improving learning quality in university physics—a domain characterized by high conceptual abstraction, persistent misconceptions, and substantial variability in student performance. Evidence from physics education research indicates that active-learning designs can outperform lecture-dominant instruction in conceptual learning and student engagement; however, reported effects vary substantially across instructional settings and implementation models. Here, empirical studies and review-level syntheses are integrated to delineate (i) the instructional components that most reliably underpin successful active learning, (ii) the mechanisms through which these components influence learning processes and outcomes, and (iii) the boundary conditions that moderate effectiveness across higher-education contexts. The synthesis is further situated within sustainability-oriented higher education by linking physics active-learning designs to competence development relevant to quality education, climate literacy, and collaborative problem solving. Evidence spanning flipped classroom implementations, peer instruction, collaborative problem solving, inquiry- and project-based approaches, and technology-enhanced formats is organized into a component–mechanism–outcome framework structured along cognitive, affective, and behavioral pathways. Two deliverables are advanced: an integrative mechanism model connecting instructional components to mediating processes, learning outcomes, and sustainability-aligned competencies, and an operational toolbox that translates the evidence into actionable design heuristics, measurement options, and scaling considerations. By redirecting attention from “which strategy works” to “which components work, how, and under what conditions,” the review aims to support instructors, departments, and institutions seeking scalable, evidence-aligned active learning in university physics.

1. Introduction

1.1. Context and Rationale for Active Learning in University Physics

University physics is widely positioned as a high-stakes gateway within STEM degree pathways, yet several instructional difficulties remain persistent. Chief among these are heavy representational demands, mathematically intensive forms of reasoning, and misconceptions that tend to remain stable when instruction is dominated by lectures [1,2,3,4,5]. Active learning—defined here as structured instructional sequences that require learners to generate, test, and refine ideas—has therefore been foregrounded in physics education research as a practical means of strengthening conceptual understanding and engagement in large, heterogeneous undergraduate cohorts [6,7,8]. Across peer instruction, collaborative problem solving, and inquiry-oriented implementations, reported gains have been attributable less to the mere inclusion of “activity” than to the organization of class time in ways that elicit reasoning, supply timely feedback, and facilitate knowledge integration [9,10,11,12].
Following the post-pandemic expansion of blended and technology-mediated teaching, flipped and hybrid formats have been adopted more widely in university physics [12,13,14,15,16]. This shift has enlarged the instructional design repertoire, enabling pre-class content delivery alongside in-class application routines, real-time polling, simulation-based exploration, learning analytics, and structured collaboration. However, a recurrent limitation has become more salient: outcomes remain heterogeneous across settings, and this variability has been closely associated with differences in implementation quality, facilitation practices, and learner readiness [14,17]. Consequently, syntheses organized primarily around strategy labels (e.g., “flipped classroom versus lecture”) have often yielded limited operational guidance for sustaining high-fidelity implementation in authentic university contexts.
A review organized around components and mechanisms is therefore justified on two grounds. First, many active-learning formats rely on overlapping functional elements—including preparatory work, in-class sensemaking tasks, feedback cycles, and collaborative accountability—suggesting that reusable components may be more informative than strategy labels for explaining effectiveness across implementations [6,9,10]. Second, sustainability-oriented higher education prioritizes approaches that remain robust under constraints commonly observed in real courses, such as large enrollments, uneven access to technology, limited instructional support, and equity-related considerations [15,16,17]. Accordingly, the present review concentrates on identifying effective components, clarifying mechanisms of action, and delineating moderating conditions that shape scalability and context sensitivity in university physics active learning.

1.2. Objectives and Scope of the Review

Three aims guide this synthesis. First, effective components that consistently support successful active learning in university physics are identified across common formats, including flipped designs, peer instruction, collaborative problem solving, inquiry- and project-based learning, and technology-enhanced approaches [6,8,9,10,11,12]. Second, mechanisms of action are synthesized by integrating evidence on cognitive, affective, and behavioral pathways that mediate the relationship between instructional components and learning outcomes [11,14]. Third, boundary conditions are consolidated by organizing moderating factors at the classroom, instructor, and learner levels, including technology infrastructure, class size, implementation fidelity, prior knowledge, motivation, and inclusion-related factors that shape both effectiveness and sustainability [9,10,15,16,17,18,19,20,21,22].
The scope is restricted to higher-education physics, with emphasis on undergraduate university physics and closely related introductory sequences. Priority is given to peer-reviewed empirical studies and evidence syntheses reporting measurable outcomes relevant to physics learning and/or the learning processes supporting it [8,9,10,11,12,14,23,24,25,26,27]. Where relevant, work addressing blended or digitally supported physics instruction is included to reflect contemporary instructional conditions [15,16,28,29,30,31,32].

1.3. Alignment with Sustainability-Oriented Competencies

Sustainability-oriented higher education frequently emphasizes competency development that equips students to address complex, socially consequential problems. While the main focus of the manuscript remains the component–mechanism–boundary-conditions synthesis developed in Section 2, Section 3, Section 4 and Section 5, sustainability is used in this review primarily as an implementation and assessment lens rather than as an additional theoretical framework. It specifies what “successful” active learning should support in real departments—maintainable instructional routines, scalable designs, and equitable participation.
University physics is well positioned to support these goals, both because it provides foundational concepts for energy systems, climate-related technologies, and quantitative modeling, and because active-learning designs can foster higher-order thinking and cooperative problem solving when appropriate support structures are in place [13,23,32,33,34,35]. To limit conceptual overload, the competency emphasis is expressed in terminology common in higher education—21st-century skills/soft skills—that is directly observable in physics learning activities and assessable in course settings. This review operationalizes the alignment through three competence clusters: (i) critical thinking and problem solving; (ii) collaboration and communication; and (iii) responsible decision making in socioscientific contexts [13,23]. To connect the synthesis to SDG 5 (Gender Equality), equity, specifically gender-sensitive participation and learning environments, is also addressed as part of sustainability-oriented implementation. These domains are frequently foregrounded in university physics active-learning designs that extend beyond procedural problem solving toward evidence-based argumentation, group accountability, and application to authentic contexts [11,13,19,23,36,37,38,39,40].

1.3.1. Critical Thinking and Problem Solving

Active learning in physics can support critical thinking by requiring learners to articulate assumptions, evaluate evidence, and reconcile conceptual conflicts during explanation and problem-solving tasks. Peer instruction, structured discussion prompts, and conceptual-question cycles are commonly used to elicit reasoning and surface misconceptions for targeted feedback [11,12,41,42,43,44]. Technology-enhanced environments, including simulations and interactive platforms, can further support exploration of trade-offs and model-based reasoning when tasks are structured around interpretation and justification rather than mere manipulation [23,26,27,45,46,47,48]. In this review, critical thinking is treated as an observable set of behaviors (e.g., justification quality, reasoning steps, model evaluation) and is linked to measurable indicators and instruments (Table 4) and to competency-aligned activity–assessment mapping (Table 5).

1.3.2. Collaboration and Communication

Collaboration is a defining feature of many active-learning implementations in university physics, particularly in collaborative problem solving, team-based learning, and project-oriented formats [19,20,49,50,51,52]. Beyond dividing labor, productive collaboration requires communication norms, role clarity, shared accountability, and feedback structures that sustain equitable participation. Digital tools (e.g., online collaboration spaces, breakout-room structures, shared artifacts) can extend collaboration across time and space, but they also increase the need for intentional orchestration to prevent disengagement and unequal participation [3,15,18,53,54,55,56]. The design of collaboration-related components, their hypothesized mechanisms, and scalable assessment options are synthesized in Section 3, Section 4, Section 5 and Section 6 and are summarized in the implementation and moderator toolboxes (Tables 4–6).

1.3.3. Responsible Decision Making in Socio-Scientific Contexts

Physics education increasingly intersects with societal questions, including energy transitions, risk assessment, and climate-relevant technology choices. Active-learning formats can support responsible decision making by embedding physics principles in scenario-based tasks, debates, projects, and reflective activities that require decisions to be justified under uncertainty and evaluated with respect to consequences [13,23,57,58,59,60,61]. In this review, responsible decision making is treated as an outcome domain linking conceptual physics understanding to contextualized reasoning and evaluative judgment. Feasible assessment approaches and task families aligned to this competence cluster are consolidated in Table 5.

1.3.4. How the Competence Lens Is Used in This Review

To translate sustainability alignment into actionable course design, a Competency–Activity–Assessment perspective is adopted: targeted competence domains are specified, active-learning components are selected to elicit competence-relevant behaviors, and assessment tools are aligned to capture both outcomes and underlying processes [11,13,19,23,62,63,64,65]. This mapping is consolidated in Table 5 and is intended to support instructors and institutions seeking to connect university physics active learning to competence development while maintaining feasibility, measurement clarity, and scalability.

1.4. Review Approach and Organization of the Review

This article is a traditional evidence synthesis with explicit transparency regarding search and screening. A targeted process was used to identify peer-reviewed research and synthesis studies relevant to active learning in university physics and closely related contexts. Searches were guided by combinations of terms including physics education, active learning, flipped classroom, peer instruction, collaborative problem solving, inquiry-based learning, project-based learning, simulation, blended learning, and engagement/motivation/self-efficacy, supplemented by backward and forward reference checking to capture foundational and recent high-impact studies [8,9,10,11,12,14,15,16,17,65,66,67,68,69,70]. Studies were prioritized when instructional design was described with sufficient granularity to identify reusable components, when measurable learning or process outcomes were reported, and/or when mechanisms or moderating factors relevant to higher-education implementation were examined.

Related Reviews and Positioning of the Present Review

Previous review-level work has mapped important aspects of the current pedagogical landscape in physics and STEM higher education, although usually by intervention family or subject strand rather than by reusable design components. For instance, reviews have compiled evidence on specific teaching strategies, such as problem-based learning in physics [21] and the relationship between active learning and flipped learning across higher-education settings [56]. Technology- and representation-oriented syntheses have likewise summarized the use of multiple representations in undergraduate physics learning, as well as simulation-supported learning and related digital techniques [4,57]. In addition, broader higher-education syntheses (often spanning STEM disciplines) have reviewed constructs closely related to active learning, such as computer-supported collaborative learning outcomes and collaborative learning structures [61], and the measurement and conceptualization of active learning in adjacent domains [17]. Taken together, these reviews provide valuable coverage of particular strategies, tool families, or outcome strands and a crucial foundation for analyzing the effects of active learning in physics-relevant settings.
By providing an integrative organizational language for design and implementation, the present review aims to complement this earlier review literature rather than to replace it. Specifically, our contribution is to translate heterogeneous interventions into a unified component–mechanism–boundary-conditions synthesis that (i) identifies reusable functional components that can be combined across instructional formats (Section 3; Table 2), (ii) articulates mechanism propositions together with observable indicators and measurement options (Section 4; Table 3), and (iii) consolidates boundary conditions into implementable design responses that support sustainable scaling (Section 5; Table 6). While remaining grounded in the review-level evidence outlined above, this framing is intended to be useful for departments and instructors who must adapt implementations under routine constraints.

1.5. Methods of the Review

1.5.1. Data Source and Search Strategy

To identify studies on active learning in university/undergraduate physics and closely related higher-education physics contexts, a targeted literature search was carried out in the Web of Science Core Collection using the Topic field (TS). The search covered 1 January 2019 to 22 December 2025, with the final update made on 22 December 2025. This time frame was chosen to capture contemporary implementations and reporting practices in physics education research, including the widespread adoption of blended, flipped, and technology-supported active-learning formats.
Search terms were combined using Boolean logic into two concept blocks: (i) discipline/context (e.g., physics; university; undergraduate; higher education; introductory) and (ii) pedagogy/approach (e.g., active learning; flipped; peer instruction; collaborative problem solving; inquiry; problem-/project-based learning; formative assessment; feedback; simulations; AR/VR; learning assistants). Relevance and reporting adequacy were enforced during screening, but no document-type restriction was applied at the search stage in order to maximize recall.
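As a concrete illustration, the two-block structure described above can be sketched as a small script that assembles a Topic (TS) query. The helper function and the specific term lists below are illustrative assumptions, not the exact search string used in this review.

```python
# Illustrative sketch only: assembling a Web of Science Topic (TS) query from
# the two concept blocks described above. Terms within a block are ORed;
# the two blocks are ANDed. Term lists are examples, not the exact query used.

def ts_query(context_terms, pedagogy_terms):
    """Combine two concept blocks with AND; OR terms within each block."""
    block = lambda terms: "(" + " OR ".join(terms) + ")"
    return "TS=(" + block(context_terms) + " AND " + block(pedagogy_terms) + ")"

context = ["physics", "universit*", "undergraduate*",
           '"higher education"', "introductory"]
pedagogy = ['"active learning"', "flipped", '"peer instruction"',
            '"collaborative problem solving"', "inquiry",
            '"problem-based learning"', '"project-based learning"',
            '"formative assessment"', "feedback", "simulation*",
            '"augmented reality"', '"virtual reality"', '"learning assistant*"']

query = ts_query(context, pedagogy)
print(query)
```

Quoted phrases and wildcard truncation (`*`) follow common Web of Science query conventions; a production search would, of course, be validated against the database interface itself.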
Web of Science was chosen as the main entry database to ensure a reproducible and interdisciplinary starting point. To keep the screening workflow auditable under a single-index sampling frame, a parallel Scopus search was not conducted in the current revision. We nevertheless acknowledge this as a limitation and note that future extensions could use Scopus/ERIC to test the robustness of the evidence map.

1.5.2. Study Identification and Selection

Retrieved records were screened for relevance to the scope of this review, with a focus on (i) university-level physics teaching and learning contexts, (ii) identifiable active-learning components (e.g., readiness assurance, structured sensemaking tasks, feedback loops, collaboration structures, example-based reasoning tasks, or simulation-supported inquiry), and (iii) interpretable outcomes and/or process evidence aligned with the component–mechanism framework developed in Section 2, Section 3, Section 4 and Section 5. Studies mainly concerned with non-physics disciplines, non-tertiary (K–12) settings, or interventions lacking adequate instructional specificity were excluded from the core synthesis.
Screening process. The search identified 830 records in the Web of Science Core Collection (search date: 22 December 2025). All records were exported and imported into Zotero for screening. After 1 duplicate record was removed, 829 records remained for title/abstract screening, which eliminated 652 records that were clearly outside the scope of this review. Consistent with the integrative narrative aim of this work, full-text evaluation was carried out only for records that could not be reliably included or excluded on the basis of title/abstract alone (e.g., ambiguous population/setting, insufficient instructional specification to identify active-learning components, or unclear reporting of outcomes/process evidence); full texts were assessed for 47 records. The final core synthesis retained 150 studies aligned with the component–mechanism framework developed in Section 2, Section 3, Section 4 and Section 5. Forward and backward reference checking was used selectively to strengthen coverage of key components, mechanisms, and boundary conditions.
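The reported record counts can be audited with simple arithmetic. In the sketch below, the variable names are hypothetical, and the split between records included directly at title/abstract and records added after full-text review is an inferred reading consistent with the reported totals, not a figure stated in the text.

```python
# Auditing the screening counts reported above. Variable names are hypothetical;
# the clearly_included / included_after_full_text split is inferred from the
# reported totals rather than stated explicitly in the review.
identified = 830                          # records retrieved (22 December 2025)
after_dedup = identified - 1              # 829 records for title/abstract screening
after_title_abstract = after_dedup - 652  # records carried forward
full_text_assessed = 47                   # records ambiguous at title/abstract
clearly_included = after_title_abstract - full_text_assessed
retained = 150                            # final core synthesis
included_after_full_text = retained - clearly_included

# Sanity check: records added at the full-text stage cannot exceed those assessed.
assert 0 <= included_after_full_text <= full_text_assessed
print(after_title_abstract, clearly_included, included_after_full_text)  # → 177 130 20
```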

1.5.3. Synthesis Approach and Reporting Emphasis

Because active-learning formats and outcome measures vary widely, the evidence was synthesized through a component-based analytical lens rather than by strategy labels alone. The findings support a reusable component taxonomy (Section 3), mechanism propositions with operational indicators (Section 4), and boundary conditions relevant to scalable and equitable implementation (Section 5). Extraction therefore prioritized functional design features—what was implemented, how it was enacted, and what learners were required to do—alongside outcome patterns and the contextual constraints needed to evaluate variability across settings.
Descriptive coverage statement. In response to reviewer requests for an “overall picture” of the evidence base, we include a cautious descriptive coverage statement. Because this paper is a traditional integrative narrative review spanning diverse designs, outcomes, and levels of reporting quality, we do not present a potentially misleading vote-count tally (positive/negative/undecided). Instead, we highlight key dependencies that account for inconsistent results and indicate where evidence is relatively rich versus thin across taxonomy components (Section 3) and mechanism propositions (Section 4).
Support statements and framework extraction. Because the manuscript is a traditional integrative narrative review, the synthesis was developed through iterative framework extraction rather than effect-size pooling. During full-text assessment, a constant-comparative approach was used to merge overlapping labels and retain minimal, reusable elements while coding (i) implementable instructional components (what was enacted), (ii) mechanism propositions (why it should work), and (iii) boundary conditions/moderators (when and for whom it works). To prevent the tables from reading as unsupported assertions, we qualitatively grade the strength of support (well-supported, moderate, emerging/limited) based on the extent and consistency of evidence across contexts, the availability of interpretable outcome/process indicators, and the completeness of reporting in primary studies.
Construction logic for Figure 1. Figure 1 presents an integrative synthesis created for this traditional narrative review. Rather than relying on a single quantitative model, the connections in the figure are based on the integration of qualitative evidence. Specifically, a link was created when at least one of the following was present in the full-text evaluated literature: (i) explicit theoretical or mechanism-oriented argumentation linking a component to a mediator/outcome (e.g., explanation, feedback, SRL processes); (ii) empirical process evidence (e.g., readiness completion signals, discourse or interaction indicators, participation/engagement traces) consistent with the proposed pathway; and/or (iii) convergent outcome patterns reported across multiple studies that clearly specified the relevant component and a direction of change aligned with the proposed mechanism. Where evidence was indirect, context-dependent, or primarily conceptual, we represent the connection as a propositional link (i.e., a plausible mechanism pathway) rather than a definitive causal claim, and we interpret it together with the boundary conditions synthesized in Section 5 and the operational indicators summarized in Tables 3–5.
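The link-construction rule for Figure 1 can be expressed as a small decision function. The function name and return labels below are our own hypothetical shorthand for the criteria listed above, not terminology drawn from the figure itself.

```python
# Hypothetical sketch of the Figure 1 link-construction rule described above:
# a link is drawn when at least one evidence type is present, and is kept as a
# propositional (plausible-pathway) link when support is indirect,
# context-dependent, or primarily conceptual.
def classify_link(theoretical_argument: bool,
                  process_evidence: bool,
                  convergent_outcomes: bool,
                  direct_support: bool) -> str:
    """Classify a candidate component -> mediator/outcome pathway."""
    if not (theoretical_argument or process_evidence or convergent_outcomes):
        return "no link"
    return "supported link" if direct_support else "propositional link"

# A pathway backed only by mechanism-oriented argumentation, without direct
# empirical support, remains a propositional link rather than a causal claim.
print(classify_link(True, False, False, direct_support=False))  # propositional link
```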

1.6. Novelty and Practical Contributions of This Review

Although this article is a traditional integrative narrative review, it makes four practical contributions beyond a strategy-by-strategy overview.
First, it reframes “active learning” from strategy labels (e.g., flipped, peer instruction) into a component-based taxonomy of reusable, implementable design elements, enabling transfer across formats and constraints (Section 3; Tables 1 and 2).
Second, it links these components to mechanism propositions with operational indicators, clarifying what should be observed if a component is functioning as intended and how mixed results can be interpreted (Section 4; Table 3).
Third, it consolidates moderators into actionable design responses (rather than a descriptive list of “things that matter”), supporting audit-friendly implementation and scalable adoption under routine constraints (Section 5; Table 6).
Finally, the review translates these insights into a Competency–Activity–Assessment mapping aligned with sustainability-oriented higher education, offering a feasible way to connect physics active learning to SDG-relevant competence development and assessment without adding conceptual overhead (Section 6; Table 5).

2. Theoretical Foundations of Active Learning in University Physics

2.1. Cognitive Architecture, Cognitive Load, and Conceptual Change

Active-learning designs in university physics impose cognitive demands that differ in kind from those of lecture-dominant instruction. Physics learning frequently requires coordination of multiple representations—graphs, vectors, equations, and verbal explanations—and integration of these representations into stable schemas. Cognitive Load Theory (CLT) offers a compact account of how instructional design shapes the allocation of limited working-memory resources and, consequently, the likelihood that coherent schemas will be constructed and retained [33,34].
Within CLT, the cognitive demands of university physics can be differentiated into intrinsic, extraneous, and germane load. Intrinsic load is set by element interactivity—for instance, in vector superposition, multi-step modeling, or field reasoning, where multiple elements must be coordinated concurrently. Extraneous load is introduced by avoidable features of instructional design, including split attention, ambiguous prompts, and unscaffolded shifts between representations. Germane load, in contrast, denotes productive effort invested in schema construction and refinement, such as comparing solution approaches, articulating justifications, and mapping principles across contexts [33,34]. The resulting design implication for physics is therefore not a generic imperative to “reduce information,” but to minimize avoidable extraneous load while preserving—and, where appropriate, increasing—germane processing in ways that are aligned with learners’ prior knowledge and task complexity [34].
CLT is particularly informative for physics learning when interpreted alongside conceptual change perspectives. Physics misconceptions often persist because they operate as coherent, intuitive frameworks rather than isolated factual errors. When new statements are merely appended, extraneous load may increase without reliably destabilizing prior schemas. By contrast, active-learning designs can be structured to support conceptual change by eliciting an initial commitment (prediction or explanation), introducing anomalous evidence (from experiments, simulations, or counterexamples), and enabling reconciliation through guided reflection and peer comparison [33,36,37]. Simulation- and inquiry-supported tasks are especially relevant in this regard because they can externalize otherwise invisible dynamics, reduce representational burden, and redirect effort toward evaluating competing models and revising explanations [35,36,37,38,39].
These accounts converge on concrete design consequences that recur throughout the evidence base. Representational coordination is most effective when it is made explicit (e.g., prompts that require linking equations to graphs), when tasks are sequenced from scaffolded to generative, and when feedback loops are rapid enough to support timely misconception diagnosis and correction [34,35,36,37,38,39]. Correspondingly, studies have indexed these processes using concept inventories and diagnostic items, cognitive-load instruments (global or task-specific), and process measures such as explanation quality and error-diagnosis performance (summarized in Tables 3 and 4; placed at the end of the manuscript) [33,34,35,36,37,38,39]. Operational heuristics for time allocation and preparation–application alignment are consolidated in Table 1.

2.2. Motivation, Self-Efficacy, and Value: Why Students Engage

Even when tasks are cognitively well designed, learning benefits will not accrue if students do not persist long enough to complete the sensemaking work that physics requires. Motivation and self-beliefs therefore constitute a second explanatory layer. In Bandura’s framework, self-efficacy shapes whether difficulty is interpreted as a signal to invest effort (“improvement is possible”) or as evidence of incapacity (“I am not a physics person”) [40]. This distinction is particularly consequential in gateway physics courses, where repeated impasses in problem solving are common and where public error can amplify anxiety and suppress participation [43].
Active-learning environments can strengthen self-efficacy through the canonical sources: mastery experiences, vicarious experiences, social persuasion, and physiological/affective states [40,42,43,44]. Technology-enhanced implementations (e.g., simulations, AR-supported visualization, structured online practice) can provide low-stakes mastery opportunities and immediate feedback. Meanwhile, well-structured collaboration can scale vicarious learning and credible social persuasion, provided that norms and roles are made explicit [35,37,41,42].
Self-efficacy should not be conflated with the quality of motivation. Self-Determination Theory (SDT) offers a complementary account in which sustained engagement is supported when autonomy, competence, and relatedness are jointly satisfied. In university physics, SDT is especially useful for specifying conditions under which ostensibly “active” tasks may be counterproductive: challenge can undermine perceived competence when scaffolding is insufficient, and participation norms can elevate social-evaluative threat in ways that erode relatedness [37]. Conversely, active-learning designs are more likely to sustain productive effort during demanding conceptual work when meaningful choice is enabled (autonomy), feedback is interpretable and actionable (competence), and struggle is normalized through structured peer interaction (relatedness) [37]. Complementary motivational perspectives further highlight perceived utility and identity relevance; accordingly, connecting physics principles to authentic applications can increase perceived value and reduce the “abstract and irrelevant” framing that often suppresses engagement [15,39,40,45].
From a sustainability-oriented design standpoint, two implications follow. First, psychological safety should be treated as a core design requirement rather than an incidental by-product, because fear of negative evaluation can constrain the participation through which active-learning mechanisms are enacted [43,44]. Second, motivation-sensitive scaffolds—feedback routines, role structures, and low-stakes practice—function as enabling conditions for equitable uptake and scalable implementation.

2.3. Engagement as a Multidimensional Process and the Role of Self-Regulated Learning

A third foundation concerns engagement, conceptualized not as a single attribute but as a coupled system of behavioral, emotional, and cognitive processes that determines whether students enact the learning opportunities embedded in instruction [46]. In university physics, superficial participation (attendance without sustained sensemaking) is common, and activity completion does not necessarily imply engagement. The Fredricks model is therefore useful for distinguishing: behavioral engagement (participation in tasks and interactions), emotional engagement (interest, anxiety, belonging), and cognitive engagement (strategic and metacognitive investment such as planning, monitoring, and elaboration) [46].
Active learning can increase behavioral and emotional engagement by reshaping interaction norms (discussion, polling, collaborative artifacts) and reducing social threat through anonymous response systems or structured group roles [43,47,49]. Nonetheless, durable learning in physics is frequently contingent on cognitive engagement, particularly self-regulated learning (SRL). SRL is typically operationalized as learners’ capacity to plan, monitor, and evaluate learning and problem-solving processes—competencies that are central in physics, where performance depends on diagnosing errors, selecting governing principles, and checking the coherence of proposed solutions [48,50,51,52].
SRL development may be supported through two complementary routes: (i) explicit strategy instruction, such as metacognitive prompts and checklists, and (ii) task designs that necessitate self-regulation while providing structured support, including readiness-oriented pre-class preparation with low-stakes checks, guided inquiry sequences, and scaffolded project work [50,51,52]. In blended and online contexts, SRL demands often increase precisely when support can decrease, owing to reduced instructor immediacy and uneven access. The design of feedback and guidance structures therefore becomes central to both sustainability and equity [48,53]. Technology-enhanced tools can contribute through immediate feedback and trace data for learning analytics, but benefits are most likely when analytics are coupled to instructional responses rather than treated as passive dashboards [35,48,49].
These theoretical accounts imply that sustainable active learning requires clear participation structures, low-stakes feedback that guides effort, and explicit supports for monitoring and reflection during problem solving [43,48,49,50,51,52]. Empirically, engagement has been indexed using validated engagement scales and behavioral proxies (e.g., participation logs, response rates), while SRL has been measured via SRL questionnaires, metacognitive prompt performance, and process traces from digital platforms [46,48,49,50,51,52].
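To make the behavioral proxies above concrete, the sketch below computes two simple indicators (average response rate and participation spread) from a hypothetical clicker/LMS event log. The log format, function name, and field layout are illustrative assumptions rather than any platform's actual API.

```python
from collections import Counter

def engagement_proxies(events, n_prompts, roster):
    """Simple behavioral engagement indicators from a hypothetical
    event log, where each event is a (student_id, prompt_id) pair."""
    responses = Counter(sid for sid, _ in events)
    # Average response rate: answered prompts per student, over the roster.
    rate = sum(min(responses[s], n_prompts) for s in roster) / (n_prompts * len(roster))
    # Participation spread: fraction of students with at least one response.
    spread = sum(1 for s in roster if responses[s] > 0) / len(roster)
    return rate, spread

# Fabricated log: student "c" never responds.
log = [("a", 1), ("a", 2), ("b", 1)]
rate, spread = engagement_proxies(log, n_prompts=2, roster=["a", "b", "c"])
print(round(rate, 2), round(spread, 2))  # 0.5 0.67
```

Indicators of this kind are proxies only; as noted above, activity completion does not imply cognitive engagement, so they are best triangulated with validated scales and SRL measures.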

2.4. Summary: Rationale for a Component–Mechanism–Outcome Synthesis

Single-strategy inference is often fragile in authentic classrooms because each strategy label (e.g., “flipped classroom”, “peer instruction”) bundles multiple instructional components and is susceptible to implementation-specific variation and construct ambiguity. A component–mechanism–outcome synthesis aims to improve robustness and explanatory power by integrating complementary evidence across components, mediating processes, and outcomes over time and context [3,37,52,53,54,55]. In university physics education, such a synthesis is particularly relevant because engagement is multidimensional and can manifest differently across students, tasks, and instructional formats (lecture, problem session, laboratory) [3,11].

3. Effective Components of Active Learning Strategies

Active learning in university physics is often described using strategy labels (e.g., flipped classroom, peer instruction, inquiry, project-based learning). Across physics education and broader STEM research, however, effectiveness is more consistently accounted for by reusable instructional components that can be recombined across formats—particularly those shaping preparation quality, in-class sensemaking, feedback loops, and the deliberate use of examples to support reasoning [55,56,60,61,68]. The synthesis in this section therefore adopts a component lens and foregrounds design implications that remain feasible under the practical constraints of sustainable implementation in diverse higher-education settings. Operational definitions and implementation heuristics are summarized in Table 2.

3.1. Pre-Class Preparation and Flipped Classroom Design

3.1.1. Preparation as a Readiness Component Rather than an Add-On

Across flipped and blended formats, pre-class preparation functions as a readiness component that determines whether in-class time can be devoted to higher-order sensemaking rather than first exposure to content [55,56]. Consistent findings indicate that preparation is most effective when it is aligned with in-class tasks, bounded in length and cognitive scope, and made accountable through low-stakes checks that clarify expectations while providing diagnostic feedback [55,56,68]. When these conditions are not satisfied, flipped designs can shift workload to students without improving the quality of in-class reasoning.

3.1.2. Content Formats: Short Videos, Structured Readings, and Interactive Resources

Pre-class preparation commonly relies on short instructional videos, structured readings, and guided question sets. In blended physics courses, video-based preparation can support learning when extraneous complexity is minimized and when representations and concepts are explicitly primed for subsequent application [55,56]. Interactive resources, most prominently PhET-based activities, are also used to build intuition and to prepare students for in-class exploration by externalizing complex phenomena and enabling rapid hypothesis testing [56,57]. Reviews of PhET integration suggest that conceptual benefits are most likely when simulations are embedded in purposeful tasks rather than positioned as optional, unstructured exploration [55,56,57].

3.1.3. Accountability and Readiness Assurance in Scalable Designs

At scale, adoption tends to hinge less on the availability of materials than on the feasibility of readiness assurance routines. Low-stakes quizzes, guided worksheets, and short pre-class prompts can simultaneously motivate completion and generate information that supports adaptive in-class instruction [55,68]. Accordingly, the pre-class component is most appropriately treated as a system—materials coupled to prompts, short checks, and feedback—rather than as content provision alone.

3.2. In-Class Sensemaking Tasks and Peer Interaction

3.2.1. Repurposing Class Time from Reception to Reasoning

Effective active learning in physics is characterized by a systematic repurposing of in-class time from passive reception to sensemaking. In such settings, students are prompted to explain, compare, justify, and revise ideas, while instruction is organized around elicitation and feedback rather than continuous exposition [55,56,60,64]. The functional mechanism is not increased activity per se, but the creation of structured opportunities to articulate reasoning under constraints, such that misconceptions become visible and correctable.

3.2.2. Peer Instruction, Collaborative Problem Solving, and Cooperative Structures

Peer interaction is repeatedly identified as a high-leverage component because it increases the frequency of explanation, provides vicarious learning opportunities, and can normalize struggle during conceptual work [59,60,63,66]. In university physics, cooperative and collaborative formats—including cooperative learning structures and collaborative problem solving—are most robust when roles, shared artifacts, and accountability mechanisms are specified, rather than assumed to emerge spontaneously [61,62,64]. Student perceptions also function as enabling conditions: enjoyment and perceived value have been associated with participation and persistence, particularly under high conceptual demand [63].

3.2.3. Peer Assessment and Public Reasoning Artifacts

Peer assessment and peer feedback can deepen learning when students are required to apply criteria to reasoning quality, not merely to final answers, and when rubrics are calibrated to physics representations and common misconceptions [59,60,61]. Public reasoning artifacts (e.g., group solution boards or shared problem representations) can facilitate collective comparison and support instructor diagnosis of conceptual bottlenecks. Their implementation nonetheless introduces practical constraints—space, time, and facilitation demands—that must be managed if effectiveness is to be maintained in large-enrollment settings [64,65]. As a result, lightweight substitutes are frequently adopted, including digital shared boards and structured worksheets, which preserve the central function of making reasoning visible while lowering logistical burden.

3.3. Feedback Mechanisms and Formative Assessment

3.3.1. Feedback as the Engine of Revision Loops

Formative assessment and feedback operationalize a learning cycle that is foundational to active learning: student thinking is elicited, evaluated against explicit criteria, and then leveraged to support revision [68]. In university physics, feedback tends to be most consequential when it targets the structure of reasoning, coordination of representations, and diagnosis of misconceptions, rather than correctness in isolation. Across higher education, formative assessment has been associated with improved learning outcomes; however, effects are contingent on timing, specificity, and—critically—whether feedback is taken up and acted upon by students [68].

3.3.2. Timing and Modality: Immediate Versus Delayed Feedback

Immediate feedback—delivered via in-class response systems or rapid checks embedded in problem solving—can enable error correction before misconceptions consolidate and can provide instructors with information needed to adjust instruction to emergent difficulties [60,68]. Delayed feedback (e.g., post-class comments or returned graded work) is more likely to support deeper reflection when guidance is actionable and when follow-up structures require its use, such as error corrections, brief reflections, or resubmissions. For sustainability at scale, feedback architectures must remain feasible within instructor workload constraints, which typically necessitates a hybrid approach combining automated feedback, structured peer review, and targeted instructor intervention [60,68].
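As a deliberately minimal sketch of the hybrid approach just described, the toy triage rule below routes student submissions to automated feedback, structured peer review, or targeted instructor intervention. The field names and threshold are invented for illustration and are not drawn from any cited implementation.

```python
def route_feedback(submission):
    """Toy triage for a hybrid feedback architecture (illustrative only):
    items missed repeatedly are escalated to the instructor, auto-gradable
    checks get immediate automated feedback, and open-ended explanation
    tasks go to rubric-based peer review."""
    if submission.get("repeat_errors", 0) >= 2:
        return "instructor"   # targeted intervention on persistent errors
    if submission.get("auto_gradable", False):
        return "automated"    # immediate feedback before misconceptions consolidate
    return "peer_review"      # structured peer review for open-ended reasoning

print(route_feedback({"auto_gradable": True}))                      # automated
print(route_feedback({"auto_gradable": True, "repeat_errors": 3}))  # instructor
print(route_feedback({"auto_gradable": False}))                     # peer_review
```

The design point is the division of labor, not the specific rule: automation handles high-volume checks, peers handle reasoning quality, and scarce instructor time is reserved for persistent difficulties.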

3.3.3. Feedback Literacy and Meta-Feedback

A persistent limitation in higher education is that feedback may be received yet not used effectively. Meta-feedback—guidance on interpreting criteria, diagnosing errors, and planning improvement—directly addresses this uptake gap and is particularly salient in physics, where difficulty is often misattributed to fixed ability rather than to strategy selection or model choice [60,63,68]. In large courses, feedback literacy can be supported with scalable tools, including structured rubrics and calibrated exemplar comparisons, without requiring extensive individualized commentary.

3.3.4. Equity and Differential Effects of Feedback Systems

Formative assessment and feedback systems may yield differential benefits across learner groups as a function of prior knowledge, self-efficacy, and classroom climate. Evidence from calculus-based physics settings indicates that gender, self-efficacy, and perceptions of the learning environment can be associated with differences in experiences and outcomes, underscoring the need for feedback routines that are transparent, supportive, and psychologically safe [69,70,71,72]. Such considerations are central to sustainability, because large-scale implementation should not systematically widen existing disparities.

3.4. Example-Based Learning: Worked, Error, and Contrast Examples

3.4.1. Examples as a Distinct Component Within Active Learning

Examples are ubiquitous in physics instruction, yet their role within active learning is often under-specified. Example-based learning functions as an active-learning component when examples are used for sensemaking—error diagnosis, comparison of solution structures, identification of deep features, and explanation generation—rather than for passive imitation [54]. Under this view, examples operate as engineered prompts that reduce unproductive search while directing attention to principle selection and representation mapping.

3.4.2. Error Examples and Contrast Examples

Physics education research has emphasized the utility of error examples (analysis of incorrect solutions) and contrast examples (comparison of alternative solutions or approaches) for surfacing misconceptions and improving discrimination among solution strategies [54]. Error examples can support conceptual conflict and correction when guidance directs attention to where and why reasoning fails. Contrast examples can support transfer by promoting recognition of deep structural cues—such as conservation principles or symmetry constraints—that determine which model applies [54].

3.4.3. Boundary Conditions and Design Implications

Example-based components are not uniformly effective. Outcomes depend on prior knowledge, guidance level, and task structure. When conceptual grounding is limited, error diagnosis can devolve into guesswork; when guidance is excessive, contrast tasks can reduce to passive confirmation. These boundary conditions are developed further in Section 5. For sustainable implementation, example-based tasks are most appropriately tiered from guided to independent and integrated with feedback routines, such that insights from example analysis translate into improved problem solving [54,68]. Mechanism linkages and measurable indicators for example-based components are summarized in Tables 3 and 4.

3.5. Section Summary and Transition

Section 3 delineates four component families that recur across effective active-learning designs in university physics: pre-class readiness systems, in-class sensemaking with peer interaction, formative assessment and feedback loops, and example-based reasoning tasks [54,55,56,57,60,61,62,63,64,65,68,70,71,72,73,74]. These components are necessary but not sufficient; their influence is mediated by cognitive, affective, and behavioral processes and is moderated by context and implementation quality. Section 4 therefore synthesizes mechanisms of action and introduces the integrative mechanism framework (Figure 1), supported by the measurement toolbox summarized in Tables 3 and 4.
Overall picture across components. Scholarly attention is unevenly distributed across the components in the papers we evaluated in full text. The evidence base is relatively dense for components centered on structured peer sensemaking and feedback-supported revision, which are most often reported with sufficient instructional detail to interpret implementation and results. By contrast, components requiring more detailed specification—role-structured collaboration routines, explicit participation architectures, or prolonged revision cycles—are reported less frequently and therefore form thinner evidence clusters. We consequently treat the taxonomy as both a synthesis and a research agenda: it highlights “workhorse” components with broad coverage while identifying under-specified components that require more robust reporting and mechanism-linked metrics.

4. Mechanisms of Action: How Active-Learning Components Produce Learning and Competency Outcomes

Section 3 delineates recurring component families in university physics active learning, including pre-class readiness systems, in-class sensemaking with peer interaction, formative assessment and feedback loops, and example-based reasoning tasks [54,55,56,57,59,60,61,62,63,64,65,68,75,76,77,78]. The present section specifies how these components translate into learning gains by synthesizing evidence into a mechanism account organized along three interdependent pathways: cognitive processes, affective–motivational processes, and behavioral–social/self-regulatory processes [33,34,35,36,37,38,39,40,41,42,43,44,45,46,48,49,50,51,52,79,80,81,82,83,84,85]. Mechanisms are treated here as testable mediators rather than as abstract theory. Accordingly, each pathway is articulated in terms of observable processes and measurable indicators that can be aligned with course-level evaluation (Figure 1, Table 3).
Figure 1. Component–mechanism–boundary-conditions framework developed in this review.
The figure summarizes how reusable instructional components are expected to activate learning mechanisms (including engagement as a proximal mediator/process indicator) under specific boundary conditions, which in turn shape downstream learning outcomes (e.g., conceptual understanding, problem solving, transfer, and persistence-related outcomes). The framework is an integrative synthesis developed for the present traditional narrative review, drawing on recurring constructs in physics education research and learning sciences while providing a unified component- and mechanism-oriented organizing structure. Links represent qualitatively synthesized mechanism propositions supported by reported process and/or outcome evidence (Tables 3–5), rather than a fitted quantitative model.
Solid arrows: mechanism links; dashed arrows: SDG competency mapping; moderators influence link strength. Engagement is treated as a proximal indicator/mediator.
Note: Engagement is discussed as a proximal process indicator/mediator rather than a distal learning outcome, whereas persistence is treated as an outcome endpoint.
Table 3. Mechanism pathway matrix: mechanisms, component levers, observable indicators, recommended measures, and representative evidence.
Pathway | Mechanism (what changes) | Component levers (what you implement) | Operational indicators (what you should observe) | Recommended measures (how to measure) | Representative evidence
Cognitive | Representational integration (MR coordination) | Tasks that require translating between equations–graphs–diagrams; peer explanation using ≥2 representations | Accuracy/consistency across representations; fewer representation-specific errors; improved transfer to novel contexts | Topic-aligned concept inventories; rubric-coded explanation quality; representation-translation items | [4,66]
Cognitive | Interactive model testing (simulation-supported reasoning) | Pre-class or in-class simulation exploration with prediction–test–explain prompts (not “watch-only”) | More correct predictions; improved model-based reasoning; reduced “plug-and-chug” behavior | Pre/post conceptual test; log traces (attempts, variables changed); written prediction–explanation prompts | [56,83]
Cognitive | Conceptual conflict → revision | ConcepTests + peer discussion; structured explanation + revision cycles; two-stage testing | Increased correction after discussion; higher-quality justifications; durable retention on misconception-prone concepts | Two-stage quiz score shifts; error-analysis prompts; misconception-focused items | [51]
Cognitive | Feedback-driven error correction (formative assessment loop) | Immediate feedback + explanation; formative checks embedded in activities (not only summative grading) | Faster misconception repair; higher revision rate; reduced repeated errors across tasks | Formative assessment instruments; revision-required assignments; item-level error tracking | [9,83]
Affective | Self-efficacy stabilization (competence signals under challenge) | Low-stakes practice; mastery-oriented feedback; structured peer support; reduce “public failure” costs | Increased self-efficacy over time; higher willingness to attempt difficult problems; reduced avoidance | Physics self-efficacy scales; weekly micro-surveys; persistence/attempt rates | [67,69,84]
Affective | Belonging/reduced stereotype threat (psychological safety) | Inclusive participation routines (roles, turn-taking, anonymity options); norming respectful critique | Higher belonging; more equitable participation; reduced anxiety about speaking | Belonging measures; participation distribution metrics (who speaks/how often); climate surveys | [79,80,81]
Affective | Motivation regulation (value/autonomy/engagement) | Coherent flipped coupling (pre-class directly “needed” in class); meaningful choices in tasks; relevance cues | Higher time-on-task; better preparation compliance; improved engagement–performance coupling | LMS logs; engagement scales; completion rates tied to in-class performance | [37,47,76]
Behavioral | Participation density and equitable interaction | Peer instruction routines; small-group CPS with explicit roles; structured accountability | Broader distribution of talk; fewer “silent attenders”; more help-seeking and peer explanation | Observation protocol (structured coding); participation counts; group artifact analysis | [11,58,80]
Behavioral | SRL enactment (planning–monitoring–revision behaviors) | Pre-class quizzes with feedback; reflection prompts; revision-required homework/labs | More planning and monitoring; increased revision after feedback; reduced last-minute cramming patterns | SRL scales; reflection log coding; trace indicators (spacing, retries, revisions) | [16,50]
Note: Ensure abbreviations (e.g., MR, SRL, ARS, LMS) match the manuscript Abbreviations list and are defined consistently. The evidence supporting each element varies in density and reporting quality; the tables summarize recurring elements, while Section 3, Section 4 and Section 5 indicate where evidence is comparatively denser versus thinner.
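For the pre/post conceptual tests listed among the recommended measures in Table 3, a common summary statistic in physics education research is the class-average normalized gain, ⟨g⟩ = (post − pre)/(100 − pre). The sketch below uses fabricated class means purely for illustration.

```python
def normalized_gain(pre_pct, post_pct):
    """Class-average normalized gain <g> = (post - pre) / (100 - pre),
    with scores as percentages; undefined when the pre-score is 100%."""
    if pre_pct >= 100:
        raise ValueError("normalized gain is undefined at pre = 100%")
    return (post_pct - pre_pct) / (100.0 - pre_pct)

# Fabricated class means on a misconception-focused concept inventory:
g = normalized_gain(pre_pct=40.0, post_pct=70.0)
print(round(g, 2))  # 0.5
```

Because ⟨g⟩ conditions on the pre-score, it allows rough comparison across cohorts with different starting points, though it does not replace subgroup-sensitive analyses.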

4.1. Mechanism Logic: From Components to Mediators to Outcomes

Under the mechanism perspective adopted in this review, active learning is not effective because learners are “busy,” but because instructional components systematically alter what students attend to, attempt, and revise during learning episodes. Across implementations, several proximal mediating processes recur. First, thinking is elicited and externalized when students are required to commit to predictions or explanations, rendering reasoning visible and therefore addressable [36,37,38,39,59,60]. Second, revision loops are enabled when feedback is sufficiently rapid and interpretable to support diagnosis and correction of errors and misconceptions [60,68]. Third, deep-feature discrimination and representation coordination are strengthened when tasks require principle selection and alignment among equations, graphs, diagrams, and verbal models [33,34,35,36,37,38,39,54]. Fourth, persistence and strategic effort under challenge are shaped by motivational supports and self-regulatory structures that determine whether students sustain engagement during difficult problem solving [40,41,42,43,44,45,46,48,49,50,51,52].
These mediators map onto outcome domains commonly reported in the literature: conceptual understanding, problem-solving performance and transfer, and engagement/persistence [11,46]. When learning designs explicitly target broader educational aims, the same mediating processes can also support competency development aligned with sustainability-oriented higher education, including evidence-based reasoning, collaboration and communication, and responsible decision making in socio-scientific contexts [13,19,23,74,75].

4.2. Cognitive Pathway Mechanisms

4.2.1. Cognitive Load Management and Schema Construction

From a cognitive architecture perspective, effective active learning reduces avoidable extraneous load while preserving—often increasing—germane processing directed toward meaning-making and schema refinement [33,34]. Components that support this shift include structured prompts, scaffolded task sequences, and representation-bridging questions that require interpretation rather than algebraic completion [34,35,36,37,38,39]. In university physics, this design feature is consequential because novices frequently allocate working memory to surface manipulations (e.g., procedural algebra) while neglecting conceptual structure; prompts requiring justification and interpretation reallocate attention toward the conceptual level [33,34,35,36,37,38,39]. As a design implication, tasks are most likely to promote durable learning when completion cannot be achieved through rote substitution alone and when timely feedback is available to stabilize correct structures and destabilize misconceptions [33,34,35,36,37,38,39,60,68].

4.2.2. Conceptual Conflict, Explanation, and Reconciliation

Physics misconceptions often function as coherent, intuitive models. Active-learning designs can therefore promote conceptual change when a three-step sequence is instantiated: commitment (prediction/explanation), conflict (anomalous outcome through experiment, simulation, or peer comparison), and reconciliation (guided revision of model and explanation) [36,37,38,39]. Peer instruction and inquiry-oriented tasks are particularly well matched to this mechanism because competing explanations are made salient and reasoning must be articulated rather than passively received [59,60,63,66]. The mechanism is boundary-sensitive: conceptual conflict can degrade into confusion when scaffolding is insufficient to support reconciliation. Moderation by prior knowledge and guidance level is therefore expected (formalized in Section 5) [33,34,35,36,37,38,39].

4.2.3. Deep-Feature Discrimination and Transfer Through Example-Based Reasoning

Example-based components (worked, error, and contrast examples) can support learning by reducing unproductive search and directing attention toward principle selection, representation mapping, and structural cues that determine which model applies [54]. Error examples are expected to elicit diagnostic reasoning concerning where and why an argument fails, whereas contrast examples are expected to strengthen discrimination of deep features (e.g., conservation constraints; symmetry) that support transfer beyond familiar templates [54]. Transfer gains are most plausible when example analysis is paired with explanation and comparison prompts rather than treated as passive demonstration [33,34,35,36,37,38,39,54].

4.3. Affective–Motivational Pathway Mechanisms

4.3.1. Self-Efficacy and Perceived Competence as Determinants of Persistence

Active learning requires intellectual risk-taking: partial understanding is exposed, tentative claims are advanced, and ideas are revised in public. Self-efficacy influences whether challenge is interpreted as controllable through effort and strategy or as identity-threatening failure [40,43,44]. Well-designed active-learning environments can strengthen self-efficacy by providing frequent mastery opportunities, interpretable feedback, and vicarious learning through peer comparison [40,41,42,43,44]. In large-enrollment settings, this mediator is particularly consequential because perceived public risk is often elevated; without protective norms and supportive feedback structures, participation may be suppressed, and disparities may widen [43,44,70,86,87,88].

4.3.2. Autonomy, Relatedness, and Psychological Safety

Motivation quality is determined not merely by the amount of effort expended, but by the perceived conditions under which effort is mobilized and sustained. Within Self-Determination Theory, durable engagement is more likely when autonomy, competence, and relatedness are concurrently supported [37,76,77]. In university physics, ostensibly “active” tasks may become counterproductive when challenge undermines perceived competence in the absence of adequate scaffolding, or when participation norms heighten evaluative anxiety and thereby erode relatedness [37,76,77]. Conversely, designs that normalize error, separate critique of ideas from critique of persons, and employ low-stakes response channels when appropriate (e.g., anonymous polling) are more likely to maintain psychological safety and participation [43,47,49,70]. These design requirements are directly relevant to sustainable implementation because disengagement and attrition are recurrent failure modes in high-demand gateway courses [15,70].

4.3.3. Value and Relevance Through Authentic Contexts

Sustained investment in physics learning is less likely when the discipline is perceived as abstract and irrelevant. Contextualized tasks and scenario-based activities can increase perceived value and identity relevance, supporting persistence and deeper engagement [15,39,40,45,88,89]. This mechanism also provides a direct bridge to sustainability-aligned competencies: socio-scientific contexts (e.g., energy systems; climate-related decisions) can reposition physics learning as practice in evidence-based judgment under uncertainty and in collaborative problem solving [13,23].

4.4. Behavioral–Social and Self-Regulatory Pathway Mechanisms

4.4.1. Engagement as Enacted Opportunity

Engagement is multidimensional—behavioral, emotional, and cognitive—and learning benefits are most consistently associated with cognitive engagement (strategic effort, elaboration, monitoring) rather than activity completion alone [46]. Components expected to promote cognitive engagement include explanation-requiring prompts, peer comparison routines that force justification, and formative assessment structures that provide actionable feedback [46,60,68].
Clarification regarding engagement. Rather than being viewed as a distal learning outcome, engagement is treated in this review as a mechanism-level mediator (a proximal process through which design elements translate into learning). We interpret engagement measures (behavioral, cognitive, or affective) reported in primary studies as markers that the intended mechanism pathway has been activated, and we distinguish them from downstream outcomes such as transfer, conceptual learning, and problem-solving performance.

4.4.2. Self-Regulated Learning During Physics Problem Solving

Self-regulated learning (SRL) is central in physics because students must select principles, monitor coherence, diagnose errors, and revise strategies. SRL can be strengthened explicitly through metacognitive prompts, checklists, and reflection questions, or implicitly through task designs that require monitoring and revision while providing sufficient support [48,50,51,52]. Pre-class readiness systems can also function as SRL supports by establishing routine cycles of preparation, checking, and adjustment [55,56,68]. SRL support becomes especially consequential in blended and hybrid formats, where learning demands shift beyond structured class time and inequities in resources or prior experience may widen [48,53].

4.4.3. Social Regulation and Collaborative Knowledge Building

Collaboration in university physics is not reducible to division of labor; it functions as social regulation of learning through co-construction of explanations, negotiation of models, and mutual accountability to reasoning norms [61,62,64]. When roles, shared artifacts, and accountability are specified, opportunities for explanation and feedback are increased and both cognitive and motivational mechanisms may be amplified [59,60,61,62,63,64]. When collaboration is weakly structured, inequitable participation and superficial consensus are more likely, undermining learning and sustainability [63,70,88,89,90,91].

4.5. Cross-Pathway Interactions: Coupled Mechanisms Rather Than Isolated Effects

The pathways described above are coupled. Cognitively demanding tasks may depress engagement when competence threat and anxiety are elevated, producing withdrawal despite task quality [33,34,35,36,37,38,39,40,41,42,43,44,45,46]. Conversely, well-designed feedback loops can simultaneously reduce extraneous load (by clarifying what matters), strengthen self-efficacy (by making progress visible), and support SRL (by guiding next steps) [60,68]. Mechanisms are therefore most appropriately conceptualized as a system. This systems view clarifies why outcomes are heterogeneous when one pathway is neglected—for example, high-quality tasks implemented without psychological safety, or motivation supports provided without structured conceptual conflict and feedback [43,68,70].

4.6. Mechanism Propositions and Evaluation Implications

For actionable synthesis, five mechanism propositions are stated with corresponding evaluation options (Tables 3–5):
Proposition 1 (Externalization–Revision Loop). When students are required to articulate predictions/explanations and receive rapid, interpretable feedback, conceptual learning is expected to increase via revision cycles [36,37,38,39,60,68]. Evaluation may be conducted using concept inventories/diagnostic items, explanation-quality rubrics, response-system traces, and error-correction tasks [60,68].
Proposition 2 (Representation Coordination). Tasks that force alignment across representations (equations–graphs–diagrams–verbal models) are expected to strengthen deep understanding and transfer by reducing representational fragmentation [33,34,35,36,37,38,39]. Evaluation may use multi-representation items, reasoning-step analyses, and transfer problems with novel surface features [33,34,35,36,37,38,39].
Proposition 3 (Efficacy–Persistence). Mastery opportunities accompanied by supportive feedback are expected to increase self-efficacy, which mediates persistence and participation under high challenge [40,41,42,43,44]. Evaluation may use self-efficacy instruments, persistence indicators, participation patterns, and subgroup-sensitive analyses of differential effects [70].
Proposition 4 (SRL Strengthening). Readiness assurance and metacognitive prompting are expected to strengthen SRL behaviors (planning, monitoring, reflection), thereby improving problem solving and sustained engagement [48,50,51,52,55,56]. Evaluation may include SRL questionnaires, reflection quality indicators, and learning analytics coupled to targeted interventions [48,53].
Proposition 5 (Structured Collaboration). Collaboration structures specifying roles, shared artifacts, and accountability are expected to increase explanation frequency and social regulation, strengthening both learning and engagement outcomes [61,62,64]. Evaluation may use collaboration rubrics, peer evaluation, discourse/interaction indicators, and equity audits of participation [63,70].

4.7. Section Summary and Transition

Section 4 explains how active-learning components can produce learning gains by activating coupled mechanisms across cognitive load management and conceptual change, affective–motivational supports (self-efficacy, autonomy/relatedness, value), and engagement/self-regulated learning processes [33,34,35,36,37,38,39,40,41,42,43,44,45,46,48,49,50,51,52,60,68,70]. These mechanisms are not invariant: their magnitude and stability depend on contextual and implementation conditions. Section 5 therefore synthesizes boundary conditions and moderators—including implementation fidelity, class size, technology constraints, learner preparedness, and equity-related factors—that determine whether active learning remains effective and sustainable across higher-education contexts [15,16,17,63,70].
Viewed together, the mechanism propositions show a similar distributional trend in their evidence base. Mechanisms requiring specialized process measures (e.g., fine-grained regulation dynamics within groups or sustained transfer-oriented reasoning) are documented less frequently than mechanisms that are readily observable in typical course data, such as explanation-based sensemaking and feedback-driven error correction. Rather than making uniform causal claims, we therefore present the mechanism account as a collection of testable propositions with variable evidence densities, and we highlight boundary conditions that plausibly explain mixed results.

5. Boundary Conditions and Moderators: When, for Whom, and Under What Conditions Active Learning Works

Active-learning effects in university physics are well documented, yet variability remains substantial across implementations, institutions, and student populations [9,10,11,12,15,16,17,81,82,83,90,91]. This heterogeneity is not ancillary; it is central to sustainability-oriented higher education because scalable adoption requires designs that remain effective under routine constraints, including large enrollments, uneven preparation, limited instructional support, and differential access to technology [15,16,17,83,84,92,93,94,95,96]. The present section synthesizes boundary conditions and moderators that shape the magnitude—and in some cases the direction—of active-learning effects. A consolidated moderator-to-design response toolbox is provided in Table 4.

5.1. Implementation Fidelity and the “Active Ingredient” Problem

5.1.1. Fidelity as Preservation of Functional Components

A recurring source of heterogeneity is implementation fidelity, defined here as the extent to which enacted instruction preserves the functional components required for mechanisms to operate (Section 4). In practice, instructional labels may be adopted (e.g., “flipped”) while core active ingredients are weakened or omitted, including readiness assurance, in-class sensemaking, and feedback-driven revision loops [55,56,60,68]. When preparation is optional, when in-class time reverts to extended exposition, or when feedback is weak or delayed, the component–mechanism chain is disrupted and outcome gains attenuate [55,56,68]. For sustainable scaling, fidelity is therefore most appropriately treated as functional rather than procedural: adaptation across contexts is expected, but the mechanism-enabling components must be preserved. To make fidelity and dosage practically checkable, Section 5 uses low-cost implementation indicators (time allocation, cycle frequency, completion signals, and revision opportunities) as audit proxies for whether the mechanism-enabling components are actually enacted.

5.1.2. Dosage and Time Allocation: Sufficient Activity Without Overload

Although “dosage” is rarely recorded in a manner directly comparable across trials, effects are nonetheless sensitive to it. We therefore treat dosage as an auditable set of implementation indicators rather than as a single universal threshold. In practice, under-dosing occurs when active-learning labels are adopted but the mechanism-enabling opportunities for preparation, peer sensemaking, and revision occur too infrequently or too briefly to initiate cycles of explanation, feedback, and knowledge updating [55,56,60,68,94,95,96].
To make this diagnosis feasible, instructors can monitor a small set of low-cost indicators: (i) the proportion of class time spent on structured student work rather than extended exposition (active-time ratio); (ii) whether each session includes repeated complete sensemaking cycles (prompt → attempt → peer discussion → feedback); (iii) completion rates for pre-class preparation and readiness checks; (iv) the availability of explicit revision opportunities following feedback; and (v) whether participation is broadly distributed rather than concentrated in a small subset of students (e.g., response rates, group contribution traces) [43,55,56,63,68,70,96]. Persistently low values on these indicators typically signal “thin” activity and the need for additional scaffolding or time reallocation to preserve mechanism integrity under routine constraints.
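As a rough illustration, the indicator audit described above can be operationalized in a few lines of analysis code. The sketch below is hypothetical: the record fields (e.g., `active_minutes`, `responses_per_student`) and the threshold defaults are illustrative assumptions for demonstration, not validated cut-offs from the cited literature.

```python
from dataclasses import dataclass

@dataclass
class SessionLog:
    """Hypothetical per-session record; field names are illustrative,
    not drawn from any standard course platform."""
    active_minutes: float       # time in structured student work
    total_minutes: float        # total class time
    sensemaking_cycles: int     # complete prompt -> attempt -> discussion -> feedback cycles
    readiness_completed: int    # students completing the pre-class readiness check
    enrolled: int
    responses_per_student: list # per-student response counts in this session

def active_time_ratio(s: SessionLog) -> float:
    return s.active_minutes / s.total_minutes

def readiness_rate(s: SessionLog) -> float:
    return s.readiness_completed / s.enrolled

def participation_concentration(counts) -> float:
    """Share of all responses contributed by the top 20% of responders.
    Values near 1.0 indicate participation concentrated in a small subset."""
    ordered = sorted(counts, reverse=True)
    top_k = max(1, len(ordered) // 5)
    total = sum(ordered)
    return sum(ordered[:top_k]) / total if total else 0.0

def audit(sessions, min_ratio=0.3, min_cycles=2, min_readiness=0.7, max_conc=0.5):
    """Flag sessions whose indicators suggest 'thin' activity.
    Threshold defaults are illustrative, not empirically validated."""
    flags = []
    for i, s in enumerate(sessions):
        issues = []
        if active_time_ratio(s) < min_ratio:
            issues.append("low active-time ratio")
        if s.sensemaking_cycles < min_cycles:
            issues.append("too few sensemaking cycles")
        if readiness_rate(s) < min_readiness:
            issues.append("low readiness completion")
        if participation_concentration(s.responses_per_student) > max_conc:
            issues.append("participation concentrated")
        if issues:
            flags.append((i, issues))
    return flags
```

In practice, thresholds would be calibrated locally; the value of such an audit lies in tracking trends across sessions rather than in any single cut-off.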

5.1.3. Coherence Across Components

Active learning operates as a system, and misalignment across components is a common failure mode. Misalignment occurs when pre-class preparation is not used in class, when in-class tasks assume skills that were not scaffolded, when assessment rewards procedural reproduction rather than reasoning, or when group work is assigned without accountability and feedback [55,56,61,62,64,68]. Such incoherence weakens mediators (Section 4) and can produce negative student attributions (“busywork”), thereby undermining engagement and sustainability [63,70].

5.2. Instructor and Facilitation Factors

5.2.1. Facilitation Quality as a Moderator

In university physics, facilitation behaviors—prompting explanation, diagnosing misconceptions, managing discussion, and providing timely feedback—can moderate outcomes even when tasks appear similar [60,64,68,97]. Effective orchestration maintains cognitive demand while preventing unproductive confusion and treats student contributions as instructional data rather than as noise [60,68,98]. Under weaker facilitation, cognitively demanding tasks may fail to trigger revision loops or may be experienced as disorganized, reducing both learning and buy-in.

5.2.2. Instructor Beliefs and Instructional Goals

Instructor beliefs about learning (e.g., transmission versus construction) shape enactment. When active learning is treated as an engagement add-on rather than as a mechanism-driven design, implementation may become superficial and effects may diminish [9,10,11,12,60]. Sustainable adoption therefore depends on alignment between instructional goals and the mechanism logic: making thinking visible, sustaining revision loops, and valuing reasoning in both tasks and assessment.

5.2.3. Training, TA Support, and Instructional Teams

Large-enrollment physics courses frequently depend on teaching assistants and coordinated instructional teams. TA preparation and alignment can influence fidelity and equity, particularly through small-group facilitation, feedback quality, and management of group dynamics [61,62,64,99]. Institutions seeking sustainable scaling have typically relied on lightweight but consistent supports (shared rubrics, facilitation scripts, common feedback standards) that preserve active ingredients without imposing excessive labor [100].

5.3. Learner Factors: Prior Knowledge, Preparedness, and Psychological Constraints

5.3.1. Prior Knowledge and Cognitive Readiness

Prior knowledge moderates response to active tasks. Novices often benefit from guided examples and structured prompts, whereas more advanced learners may benefit from greater open-endedness and productive struggle [33,34,54]. When tasks exceed learners’ current schemas without scaffolding, cognitive load may shift from productive revision to confusion and withdrawal, reducing learning gains [33,34,36,37,38,39,101,102,103].

5.3.2. Motivation, Self-Efficacy, and Fear of Being Wrong

As established in Section 4, self-efficacy and psychological safety shape whether learners engage in explanation and revision—behaviors that function as a practical gateway for the activation of active-learning mechanisms [40,41,42,43,44]. In university physics, fear of negative evaluation has been associated with reduced participation in peer instruction and discussion, particularly when classroom norms prioritize speed or correctness over reasoning quality [43,70]. Under such conditions, uptake may become differential: active learning may be present in name, yet enacted primarily by students with greater confidence.

5.3.3. Self-Regulated Learning Capacity

Flipped and blended formats shift a substantial portion of planning and monitoring to learning episodes outside class, thereby increasing self-regulated learning (SRL) demands. When SRL skills are weaker, pre-class preparation is more likely to be incomplete, students may enter class without the readiness required for sensemaking tasks, and active learning may be experienced as overwhelming or inequitable [48,50,51,52,53,55,56]. This moderating pattern does not imply that flipped designs should be avoided; rather, it indicates that SRL supports—readiness checks, structured guidance, and metacognitive prompts—should be treated as core design components in scalable implementations [48,55,56,68,104]. In practice, SRL under-support can be detected through simple signals such as persistently low preparation completion rates, weak alignment between pre-class tasks and in-class use, and large gaps between intended and observed readiness.

5.4. Classroom and Institutional Context: Size, Resources, and Assessment Regimes

5.4.1. Class Size and Logistical Constraints

Large-enrollment settings can amplify both advantages (expanded opportunities for peer explanation; polling at scale) and risks (reduced individualized feedback, uneven participation, and anonymity that can facilitate disengagement) [60,63,68]. To preserve mechanism integrity in such cohorts, response systems, structured worksheets, and modular group routines are often required as enabling implementation choices [47,49,60,105,106,107].

5.4.2. Resource Constraints and Workload Sustainability

Sustainable implementation depends on feasibility under instructor workload and institutional resource constraints. High-touch feedback and individualized coaching may yield strong outcomes, but such approaches are often difficult to maintain at scale. Robust designs therefore tend to rely on blended feedback architectures that combine automated low-stakes checks, structured peer feedback, and targeted instructor intervention, supported by reuse of high-quality task banks [68]. A consequential contextual moderator is the extent to which institutions provide enabling infrastructure—TA allocation, classroom layout, and digital platforms—so that active components can operate without erosion [15,16,17].

5.4.3. Assessment Alignment and Incentive Structures

Assessment regimes systematically shape student behavior. When grading emphasizes procedural speed and answer reproduction, explanation, sensemaking, and collaboration are rationally deprioritized, thereby weakening active-learning mechanisms [60,68]. When assessment instead values reasoning quality, representation use, and transfer, intended mechanisms are reinforced and student buy-in is more likely to be sustained [54,68]. This alignment becomes particularly consequential for sustainability-oriented competency development (Section 6), where assessment must capture more than correctness, including justification quality, decision rationale, and collaboration artifacts [13,23].

5.5. Equity, Inclusion, and Differential Effects

5.5.1. Equity Effects Depend on Design

Equity-related effects are not uniform across implementations. Although active learning can improve outcomes broadly, disparities may be amplified when participation structures disproportionately reward confidence, when group work reproduces status hierarchies, or when access to feedback differs systematically across students [63,70]. Evidence from physics contexts has documented associations among perceptions of the learning environment, self-efficacy, demographic factors, and differential outcomes, underscoring the need for inclusion-sensitive implementation [70,105,106,107,108].

5.5.2. Participation Architecture: Roles, Norms, and Accountability

Equitable implementation requires explicit participation architecture, including rotating roles, structured turn-taking, shared artifacts, and accountability mechanisms that limit free-riding and protect less confident students from exclusion [61,62,63,64]. Psychological safety should be treated as a design feature rather than as an emergent property. Anonymous polling can reduce social risk, but benefits are most likely when norms explicitly value reasoning and revision rather than speed [43,47,49].

5.5.3. Technology Access and the Digital Divide

Technology-enhanced active learning can broaden opportunities (simulations, analytics, collaboration tools), yet unequal access to devices, connectivity, or prior digital experience can moderate who benefits [48,53]. Sustainable scaling therefore requires low-tech fallbacks, device-agnostic platforms, and tool choices that do not make participation contingent on privileged access [15,16,17,48,53].

5.5.4. Gender Disparities and Gender-Sensitive Implementation (SDG 5)

In university physics, gender disparities remain a persistent concern, including well-documented gaps in self-efficacy and sense of belonging that can produce divergent participation and persistence even when achievement is comparable [9,10,70]. Crucially, the outcomes of active learning depend on how participation is organized and how social risk is managed during group projects, peer discussions, and public reasoning [43,63,70]. From a component–mechanism perspective, gender-sensitive implementation can be viewed as a boundary condition that determines who receives timely feedback, whose thinking becomes visible, and who feels psychologically safe to make and revise errors in the classroom [43,63,70].
These mechanisms are reflected in practical design responses: explicit norms for critique that focus on reasoning quality; feedback that emphasizes strategy use and growth rather than fixed-ability attributions; low-risk response channels (e.g., anonymous polling) that reduce fear of negative evaluation; and structured turn-taking with rotating group roles that prevent status hierarchies from solidifying [11,43,47,63,81]. Monitoring should also go beyond aggregate outcomes: disaggregated measures of persistence, self-efficacy, and participation provide instructors with actionable signals for adjusting task design and facilitation [9,10,70]. By making inclusive engagement and equitable learning conditions an explicit implementation aim, equity-sensitive participation within active learning directly supports SDG 5 (Gender Equality).
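The call for disaggregated monitoring can be made concrete with a small analysis sketch. Assuming hypothetical record fields (`group`, `responses`) derived from polling or LMS logs, group-wise means and a simple low-to-high ratio can flag uneven uptake; this is an illustrative computation, not a prescribed metric from the cited studies.

```python
from collections import defaultdict
from statistics import mean

def disaggregate(records, key="group", value="responses"):
    """Group-wise means of an engagement signal.
    Record fields ('group', 'responses') are illustrative placeholders."""
    buckets = defaultdict(list)
    for r in records:
        buckets[r[key]].append(r[value])
    return {g: mean(v) for g, v in buckets.items()}

def participation_gap(records, key="group", value="responses"):
    """Ratio of the lowest to the highest group mean; values well
    below 1.0 signal uneven uptake across student groups."""
    means = disaggregate(records, key, value)
    return min(means.values()) / max(means.values())
```

A persistent gap on such a ratio would prompt a review of participation architecture (roles, turn-taking, response channels) rather than of individual students.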

5.6. Modality and Technology: Blended/Online Versus Face-to-Face

Active learning has been implemented across modalities; modality moderates mechanisms by reshaping feedback immediacy, social cues, and SRL demands. In online or hybrid contexts, reduced instructor immediacy can weaken feedback loops and relatedness unless digital routines are intentionally designed (structured discussion, rapid checks, and explicit feedback cycles) [48,49,53]. Conversely, technology can strengthen mechanisms when it increases practice and feedback opportunities, supports visualization (e.g., simulations), and provides trace data for targeted interventions—provided that analytics are coupled to instructional response rather than treated as passive monitoring [35,48,49].

5.7. Sustainability Lens: Scaling Without Losing Effectiveness

From a sustainable higher-education perspective, the core question is not only whether active learning can work, but whether it can be maintained, adapted, and equitably delivered over time. Across the moderators above, three implementation principles recur:
Preserve functional fidelity. Readiness assurance, in-class sensemaking, and feedback loops should be retained even when surface features are adapted to constraints [55,56,60,68].
Treat equity as a core requirement. Participation architecture, psychological safety, and access considerations should be built into the design rather than appended after implementation [43,63,70].
Align incentives and measurement. Assessment should reward reasoning and competency-relevant performances; otherwise, student behavior will diverge from intended mechanisms [13,23,54,68].
These principles motivate the competency-oriented mapping in Section 6, where sustainability-aligned competencies are linked to active-learning components and feasible assessment options (Table 5).

5.8. Section Summary and Transition

Section 5 explains why active-learning effects in university physics vary across settings: outcomes are moderated by implementation fidelity, facilitation quality, learner preparedness and self-efficacy, SRL capacity, class size and resource constraints, assessment alignment, modality, and equity-related participation dynamics [33,34,40,41,42,43,44,48,49,50,51,52,53,54,55,56,60,61,62,63,64,68,70]. These moderators are decisive for sustainable higher education because they determine whether active learning remains effective when scaled and maintained over time [15,16,17]. Section 6 therefore operationalizes sustainability alignment by mapping physics active-learning designs to competency development and assessment options relevant to sustainability-oriented higher education [13,23].

6. Aligning University Physics Active Learning with SDG-Oriented Competencies

Sustainability-oriented higher education emphasizes competence development that enables graduates to engage with complex socio-scientific problems, collaborate across disciplinary boundaries, and make reasoned decisions under uncertainty [12,13,23]. University physics contributes foundational knowledge relevant to energy systems, climate-relevant technologies, and quantitative modeling; however, competence development is not an automatic by-product of content coverage. It depends on intentional instructional design that links physics reasoning to competence-relevant practices and dispositions. In addition to SDG-related competencies broadly, equity-oriented aims—including gender equality (SDG 5)—are relevant insofar as sustainable implementation requires inclusive participation and fair learning conditions across student groups. Alignment is operationalized here using a Competency–Activity–Assessment logic: targeted competence domains are specified, active-learning components are selected to elicit observable competence-relevant behaviors, and assessment tools are aligned to capture both outcomes and underlying processes [11,13,19,23,109,110,111]. The consolidated mapping is provided in Table 5.

6.1. Conceptualizing SDG-Oriented Competencies in the Context of Physics

Although the SDGs represent broad societal goals, educational translation commonly converges on cross-cutting competencies relevant to quality education, climate literacy, and collaborative problem solving [13,23]. For university physics, three competence clusters provide a practical mapping layer that is both defensible within sustainability-competence frameworks and actionable within physics curricula:
Critical thinking and evidence-based problem solving, expressed as model-based reasoning, evaluation of assumptions, and argumentation with quantitative evidence.
Collaboration and communication, expressed as explanation of reasoning, negotiation of models, coordination of tasks, and communication of uncertainty and trade-offs.
Responsible decision making in socio-scientific contexts, expressed as physics-informed judgment in authentic decisions, explicit consideration of constraints, and attention to consequences and equity implications (including participation equity and gender-related disparities).
These clusters are directly linkable to observable behaviors and assessable products in problem solving, explanation, and project-based artifacts [13,23]. The remainder of this section translates each cluster into designable learning activities and measurable indicators (Table 5).

6.2. Competency 1: Critical Thinking and Evidence-Based Problem Solving

6.2.1. Components That Elicit Model-Based Reasoning

In university physics, critical thinking is most appropriately conceptualized as model-based reasoning: selecting principles, articulating assumptions, coordinating representations, and evaluating solution coherence. Multiple active-learning components reliably elicit these behaviors. Conceptual-question cycles characteristic of peer instruction require prediction and justification followed by peer discussion and revision, thereby externalizing reasoning and activating conceptual conflict and reconciliation processes [59,60,63,112,113,114,115]. Collaborative problem-solving tasks can increase strategic reasoning when problems require explanation and principle selection rather than direct substitution [61,62,64]. Inquiry- and simulation-supported activities can strengthen hypothesis testing and interpretation when prompts require explanation and reconciliation rather than unstructured exploration [35,36,37,38,39,57,115]. In addition, error and contrast example analyses can elicit diagnostic reasoning and deep-feature discrimination by requiring learners to identify where reasoning fails and why alternative solution structures differ [54,115,116,117,118]. Collectively, these components operationalize critical thinking as a process of claim–evidence–revision, aligning with both the cognitive mechanisms in Section 4 and sustainability-relevant capacities for evidence evaluation under uncertainty [13,23,118,119,120].

6.2.2. Assessment Alignment: Capturing Reasoning and Transfer

Critical thinking cannot be inferred reliably from correctness alone. Assessment therefore requires tools that make reasoning structure and representation use observable. Feasible approaches include rubrics that score justification quality, principle selection, representation coordination, and coherence checks; transfer-oriented problems with novel surface features that test deep-feature discrimination [54]; short written explanations or oral defenses embedded in quizzes or assignments; and concept inventories/diagnostic items, particularly when paired with explanation prompts to reveal conceptual structure rather than guessing [36,37,38,39,60]. From a sustainability standpoint, scalability can be supported through structured rubrics, sampling strategies (grading subsets of explanations), and peer calibration routines, thereby reducing instructor workload while preserving measurement validity [68,121,122,123]. The activity–assessment mapping for this competence cluster is summarized in Table 5.
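The sampling strategy mentioned above (grading a subset of explanations in detail) can be sketched as a seeded random draw, so that the selection is reproducible and auditable. The function name and parameters below are illustrative assumptions, not an established grading tool.

```python
import random

def sample_for_grading(submission_ids, fraction=0.25, seed=0):
    """Return a seeded random subset of submissions for detailed rubric
    scoring; the remainder might receive completion credit plus
    calibrated peer review. 'fraction' and 'seed' are illustrative."""
    rng = random.Random(seed)  # fixed seed makes the draw reproducible
    k = max(1, round(len(submission_ids) * fraction))
    return sorted(rng.sample(submission_ids, k))
```

Publishing the seed alongside the sampled IDs lets students verify that the subset was drawn fairly, which supports the buy-in considerations discussed in Section 5.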

6.3. Competency 2: Collaboration and Communication

6.3.1. Components That Build Collaborative Reasoning

Collaboration competence in sustainability contexts requires more than teamwork; it requires coordinated reasoning, communication of uncertainty, and joint decision making. Physics active learning can cultivate these skills when collaboration is designed as a social regulation of learning. Role-based routines with rotating functions (e.g., explainer, skeptic, checker, connector) can make contributions visible and reduce status domination [61,62,64,124,125,126]. Shared reasoning artifacts—solution boards, diagrams, or shared digital workspaces—externalize thinking and support comparison across groups [64,65]. Peer feedback and peer assessment can deepen learning when evaluation criteria target reasoning quality rather than surface agreement [59,60,61,125]. Structured discussion protocols that require all members to articulate and compare models can further stabilize productive collaboration norms [63,70]. These components are aligned with sustainability-oriented emphases on collaboration across perspectives and communication of technical reasoning in accessible forms [13,23]. They are also inseparable from the equity moderators emphasized in Section 5: collaboration competence is unlikely to develop when participation is uneven or psychologically unsafe [43,63,70,127,128,129,130].

6.3.2. Assessment Alignment: Evaluating Collaborative Processes and Products

Collaboration and communication are assessable using scalable tools when criteria are tied to observable behaviors and artifacts. Collaboration rubrics can capture turn-taking, justification quality, constructive critique, and group regulation behaviors. Peer evaluation instruments can be used when ratings are anchored in observable contributions and combined with safeguards against bias and free-riding [63]. Group artifacts may be assessed for clarity of reasoning, representation use, and explicit treatment of uncertainty and trade-offs. Short reflective prompts can capture how feedback was interpreted and how reasoning was revised, providing a bridge to SRL. For sustainable implementation, these assessments should be embedded within the design rather than treated as optional additions, because assessment alignment reinforces buy-in and stabilizes collaboration norms [68]. Consolidated options are provided in Table 5.

6.4. Competency 3: Responsible Decision Making in Socio-Scientific Contexts

6.4.1. Designing Physics Tasks That Support Responsible Decisions

Responsible decision making requires integration of physics principles with contextual constraints, uncertainty, and consequences. In university physics, this competence can be elicited through scenario-based problems focused on energy, climate, risk, or technology choices in which decisions must be justified using quantitative reasoning and explicit assumptions [13,23]. Mini-projects and project-based modules can be used to foreground data interpretation, model selection, and the communication of uncertainty to a specified audience [19,20]. Debate and argumentation tasks similarly lend themselves to evaluating competing claims with evidence and to making trade-offs explicit. Case-based simulation activities can be structured to support exploration of system behavior, after which decisions are articulated and defended with reference to model outputs and their limitations [57,131,132,133,134]. Collectively, these designs situate physics learning in authentic applications and may strengthen perceived value and identity relevance—an affective pathway associated with persistence and deeper engagement (Section 4) [15,40,45]. Importantly, competence development need not be pursued by displacing physics content with policy content; physics remains central, serving as the evidentiary basis for judgment.

6.4.2. Assessment Alignment: Decision Rationale and Uncertainty Handling

Assessment of responsible decision making should capture both technical justification and reasoning about contextual constraints. Decision memos or short policy briefs can be used to elicit a stated decision supported by physics-based evidence, explicit assumptions, and a discussion of uncertainty and limitations. Rubrics may be designed to prioritize transparency, trade-off reasoning, and evidence quality. Oral presentations or poster artifacts can be used to evaluate communication clarity and evidence integration, while brief reflection prompts can document how decisions are revised in response to feedback or new evidence. At scale, these formats can remain feasible through structured rubrics, sampling approaches, and calibrated peer review, thereby preserving validity in large courses [68]. The mapping of task families to assessable indicators is summarized in Table 5.

6.5. Integrating the Competency–Activity–Assessment Logic Across Course Levels

Competence development is more sustainable when distributed across curricula rather than concentrated into a single capstone requirement. A pragmatic progression aligns competence clusters with increasing task complexity. Introductory courses can emphasize conceptual-question cycles, structured explanation prompts, basic collaboration routines, and small scenario-based tasks. Intermediate courses can incorporate more open-ended problem solving, error/contrast example analyses, structured laboratory sequences requiring hypothesis testing, and short decision memo assignments. Advanced courses can embed project-based modules, data-driven modeling tasks, and explicit communication products (reports/posters) that integrate uncertainty and trade-offs. Such progression supports scalability by distributing competence development across a program rather than overloading a single course. It may also support equity by building SRL and collaboration skills early, thereby reducing disparities in later high-demand tasks [48,50,51,52,53,63,70,135,136,137,138,139].

6.6. Implementation Notes: Preserving Alignment and Sustainability

Three implementation cautions follow directly from Section 3, Section 4 and Section 5. First, competence rhetoric without assessment alignment is unlikely to change student behavior; when examinations reward procedural correctness alone, reasoning, collaboration, and decision justification will be rationally de-emphasized [54,68]. Second, psychological safety and participation architecture are enabling conditions for competence development, because public reasoning and collaboration will not occur equitably without structured norms and low-risk participation options [43,63,70]. Third, sustainability requires feasible and scalable assessment: structured rubrics, calibrated peer review, and sampling approaches can reduce workload while supporting valid measurement of competence-relevant behaviors [68].

6.7. Section Summary and Transition

Section 6 operationalizes sustainability alignment by mapping SDG-oriented competence clusters to reusable active-learning components and scalable assessment approaches in university physics [13,23]. This mapping supports the applied contribution of the review by enabling designs that improve conceptual and problem-solving outcomes while developing competencies relevant to sustainability-oriented higher education. Section 7 translates the synthesis into practical guidelines for sustainable implementation and scaling, followed by a research agenda and conclusions in Section 8 [15,16,17,68,70].

7. Practical Implications and Scaling Guidelines for Sustainable Higher Education

Evidence synthesized in Section 2, Section 3, Section 4, Section 5 and Section 6 indicates that active learning in university physics is most effective when functional components are preserved (readiness assurance, in-class sensemaking, feedback loops), coupled mechanisms are activated (cognitive, affective, behavioral), and designs are adapted to boundary conditions (fidelity, resources, equity, modality) [33,34,35,36,37,38,39,40,41,42,43,44,45,46,48,49,50,51,52,53,54,55,56,60,61,62,63,64,65,68,70]. For sustainability-oriented higher education, the central challenge is scale: designs must be expanded without degrading mechanism integrity or amplifying inequities [15,16,17,63,70]. The guidance below translates the synthesis into actionable recommendations across three levels—instructor/course design, departmental/program infrastructure, and institutional policy and support systems—aligned with the moderator toolbox summarized in Table 4.

7.1. Instructor-Level Guidance: Designing for Mechanisms Rather Than Labels

7.1.1. A Minimum Viable Active-Learning System

Sustainable adoption is often enabled by implementing a minimum viable active-learning system that preserves active ingredients while maintaining a manageable workload. Across formats, three elements recur as mechanism-enabling components: brief pre-class preparation coupled to a low-stakes readiness check that generates diagnostic information [55,56,68]; structured in-class sensemaking routines (e.g., concept questions or short problem segments) that require prediction/explanation and peer comparison [59,60,63]; and a feedback loop in which student thinking is rapidly interpreted (polling, short checks, group artifacts) and followed by targeted instructor/TA response [60,68]. This approach addresses a common failure mode in which strategy labels are adopted (e.g., “flipped”) without implementing the readiness–sensemaking–feedback chain required for mechanisms to operate [55,56,68].

7.1.2. Task Design That Forces Reasoning and Representation Coordination

In physics, tasks should be engineered to elicit mediators most predictive of durable learning. Design features include prompts that require justification of principle selection and explicit linking across representations (equation–graph–diagram–verbal model) [33,34,35,36,37,38,39]; structured use of error and contrast examples to build deep-feature discrimination and transfer [54]; and routines that instantiate conceptual conflict and reconciliation (prediction → anomaly → revision) with scaffolding sufficient to prevent unproductive confusion [36,37,38,39]. These features operationalize the cognitive pathway mechanisms summarized in Section 4 and provide concrete levers for course redesign without requiring wholesale changes to course structure.

7.1.3. Psychological Safety and Equitable Participation as Core Design Properties

Because active learning requires public reasoning, psychological safety should be treated as a design requirement rather than an aspirational outcome. Error and revision should be normalized as expected behaviors, and evaluation signals should reward reasoning quality rather than speed [43,70]. Low-risk participation channels (e.g., anonymous response systems) can reduce threat when integrated into revision routines rather than used as isolated participation tools [47,49]. Collaboration should also be structured through roles and shared artifacts to reduce status hierarchies and uneven participation [61,62,63,64]. These practices align with the equity moderators identified in Section 5 and are central to sustainable scaling in large courses [63,70,140,141,142].

7.1.4. Aligning Grading and Assessment with Intended Mechanisms and Competencies

Assessment alignment is a primary determinant of sustained student buy-in. Course grading should include evaluation of explanation quality, representation use, and coherence checks rather than correctness alone [54,68]. Workload feasibility can be supported through structured rubrics and selective sampling strategies [68]. Where sustainability-oriented competency development is targeted (Section 6), assessment should capture decision rationale and uncertainty handling using short memos, structured reflections, or calibrated peer review, as appropriate to course level and class size [13,23,68]. The Competency–Activity–Assessment mapping is summarized in Table 5.

7.2. Course-Team and Department-Level Guidance: Infrastructure for Fidelity and Sustainability

7.2.1. Standardizing Core Components Without Prescribing Surface Pedagogy

Departments can facilitate sustainable scaling by standardizing functional components while allowing instructors flexibility in surface implementation. Shared banks of concept questions, short sensemaking tasks, and example-based activities aligned with course learning objectives can reduce redesign burden and improve consistency across sections [54,59,60]. Common readiness-check routines and feedback standards can preserve mechanism operation across instructors [55,56,68]. In addition, shared assessment principles valuing reasoning and transfer can reduce cross-section misalignment that otherwise undermines student incentives [54,68].

7.2.2. TA Training and Facilitation Supports for Large-Enrollment Courses

In large physics courses, teaching assistants frequently determine whether collaboration, feedback, and participation norms function as intended. Lightweight but consistent supports can improve fidelity and equity: training in questioning strategies and misconception diagnosis, routines for equitable participation, and shared rubrics and exemplars for feedback and peer assessment [59,60,61,63,64,68,143,144,145,146,147]. Coordination protocols (e.g., common prompts and weekly planning) can maintain fidelity without imposing excessive oversight.

7.2.3. Curriculum-Level Staging of Competencies and SRL Supports

Competency development is more sustainable when distributed across curricula. Departments can introduce structured collaboration and SRL supports early and increase open-endedness and authenticity later [48,50,51,52,53,63,70]. Prerequisite sequencing that builds representation coordination and explanation norms may reduce later inequities in participation and performance, particularly in high-demand tasks that presume these practices [33,34,35,36,37,38,39,48,148,149,150].

7.3. Institutional Guidance: Policy, Incentives, and Resource Provision

7.3.1. Incentives for Evidence-Aligned Teaching Innovation

Sustainable implementation is more likely when institutional incentive structures make sustained investment in active learning both rational and visibly recognized. Promotion and evaluation systems can be designed to acknowledge evidence-based teaching practice and pedagogical innovation [15,16,17]. In parallel, professional development and course-redesign support can be oriented toward mechanism-informed pedagogy rather than framed primarily as technology adoption [60,68]. Peer-observation communities, together with protected time for structured redesign, can further lower adoption costs and strengthen continuous improvement cycles.

7.3.2. Enabling Infrastructure and Equitable Access

Institutional infrastructure functions as a contextual moderator of whether active learning can be delivered equitably. Group-work-supportive classroom layouts, accessible response systems, and reliable digital platforms can enable implementation fidelity in large-enrollment courses [15,16,17,47,49]. To limit the effects of digital divides, device-agnostic tools and low-tech fallback options should be available, particularly in hybrid settings [48,53]. In addition, support services that build students’ SRL and academic skills can reduce readiness gaps that would otherwise moderate outcomes in flipped and blended designs [48,50,51,52,53].

7.3.3. Quality Assurance and Continuous Improvement Loops

Sustainability is strengthened when implementation is coupled with continuous improvement. Actionable data—concept inventories, engagement indicators, equity audits of participation, and evidence of feedback uptake—can be collected and linked to instructional response [48,49]. Monitoring differential outcomes and iterating participation structures and supports are particularly important to prevent inequity amplification during scale-up [63,70].

7.4. Practical Checklist for Sustainable Implementation

To summarize the moderator-to-design response logic in operational terms, sustainable active-learning implementation in university physics should satisfy the following conditions: readiness is assured through aligned, bounded, and accountable pre-class work [55,56,68]; in-class time is dominated by sensemaking rather than passive reception [59,60]; feedback loops are rapid and interpretable such that revision is expected and supported [60,68]; tasks require representation coordination (equation–graph–diagram–verbal model) [33,34,35,36,37,38,39]; collaboration is structured and equitable through roles, accountability, and psychological safety [43,61,62,63,64,70]; assessment rewards reasoning, transfer, and (where relevant) decision rationale [13,23,54,68]; and adaptations preserve functional fidelity under constraints of class size, resources, and modality [15,16,17,55,56]. A consolidated version is provided in Table 6.

8. Future Research Agenda and Conclusions

8.1. Future Research Agenda: Evidence Gaps and Priority Directions

Despite strong evidence that active learning can improve learning in university physics, several gaps constrain the field’s ability to support sustainable scaling across diverse institutions.

8.1.1. Mechanism Testing Using Mediation and Process-Tracing Designs

Outcome gains are frequently reported, yet direct testing of mediating processes remains comparatively limited. Future work should operationalize mechanisms with validated measures—including cognitive load, explanation quality, SRL indicators, and self-efficacy—and test mechanism pathways using mediation or process-tracing approaches [33,34,35,36,37,38,39,40,41,42,43,44,48,49,50,51,52]. Cross-pathway interactions also warrant direct attention, particularly conditions under which psychological safety moderates the impact of cognitively demanding tasks [43,70,150,151,152,153,154].

8.1.2. Fidelity Metrics and Functional Component Coding for Explaining Heterogeneity

To interpret heterogeneous outcomes, shared fidelity frameworks are needed that capture functional components rather than surface strategy labels. Priorities include development and validation of practical fidelity instruments for readiness assurance, sensemaking task quality, feedback-loop strength, and collaboration structure [55,56,60,68]. Reporting standards that support replication and synthesis—time allocation, task types, feedback routines, and assessment alignment—should also be strengthened to enable cross-study comparability.

8.1.3. Equity-Focused Research and Differential Impact Analyses

Sustainability depends on equitable effectiveness. Future studies should evaluate differential impacts across demographic groups, prior preparation, and motivational profiles using designs that can identify mechanism-based explanations rather than attributing differences to learner deficits [63,70,154,155,156,157,158]. Participation architectures and psychological-safety interventions should be tested as hypothesized moderators with measurable outcomes [43,70,159,160,161,162].

8.1.4. Modality-Sensitive Designs and Technology as a Mechanism Amplifier

Hybrid and online implementations increase SRL demands and can weaken the immediacy of feedback and relatedness unless routines are intentionally designed. Research should identify which digital routines preserve feedback immediacy and relatedness and how SRL supports should be adapted for blended contexts [48,53,163]. In parallel, analytics-driven interventions—rather than analytics alone—should be evaluated by linking trace data to targeted instructional actions and assessing downstream learning and equity outcomes [48,49,164,165,166,167,168].

8.1.5. Competency-Oriented Outcomes and Longer-Horizon Evaluation

To substantiate SDG-oriented alignment, evaluation should extend beyond short-term exam performance. Priorities include development of validated, scalable assessments of sustainability-relevant competencies in physics (decision rationale, uncertainty handling, collaborative reasoning) [13,23,169,170,171] and longitudinal studies examining retention, identity development, and downstream application of physics reasoning in authentic contexts [15,45,169,172,173,174].

8.2. Conclusions

To promote sustainable implementation in higher education, this review synthesized the evidence on active learning in university physics through a component–mechanism–boundary-condition lens. Reusable component configurations, including readiness assurance, structured sensemaking with peer interaction, feedback-driven revision loops, and example-based reasoning tasks, explain effectiveness more consistently than strategy labels do [33,34,35,36,37,38,39,40,41,42,43,44,45,46,48,49,50,51,52,53,54,55,56,59,60,61,62,63,64,65,68,70,175,176,177,178]. When boundary conditions are made explicit, including implementation fidelity, facilitation quality, learner readiness and self-efficacy, SRL capacity, resource constraints, assessment alignment, modality, and equity-sensitive participation dynamics, outcome heterogeneity becomes expected and easier to interpret [15,16,17,55,56,63,68,70,178]. Finally, sustainability alignment is operationalized through a Competency–Activity–Assessment approach that links physics active learning to the development and measurement of competencies for evidence-based problem solving, collaboration, and responsible decision making in socio-scientific contexts [13,23,178,179,180,181,182,183]. By translating these insights into scalable guidance for instructors, departments, and institutions, the review supports the design of university physics active learning that is effective, maintainable, and aligned with sustainability-oriented higher-education goals (Figure 2).
Evidence base and extent of interpretation. The synthesis tables (Table 1, Table 2, Table 3, Table 4, Table 5 and Table 6) compile recurrent design elements and propositions drawn from the full-text reviewed literature, although the strength of support varies across elements. Repeatedly documented “workhorse” components (e.g., structured sensemaking with peer interaction and feedback-supported revision) carry comparatively stronger evidence, while other elements, especially those requiring more detailed reporting or process measures, remain more contingent or emerging. The framework is therefore offered not as a definitive ranking of strategies but as a set of testable propositions and a shared organizing vocabulary.
Practical implications. The framework can serve as a design checklist for instructors: select a few essential components (structured peer sensemaking, feedback-supported revision, readiness assurance), verify that mechanism-enabling cycles occur regularly, and audit boundary conditions such as participation equity, facilitation quality, and preparation completion. When departments scale active learning, the tables provide a common vocabulary for TA training, task banks, and assessment rubrics that can maintain functional fidelity across sections. Researchers can use the synthesis to identify areas of comparatively dense versus thin evidence and to prioritize boundary-condition testing and mechanism-linked measurement.
Limitations and outlook. This paper is an integrative narrative review intended to provide a component–mechanism framework for active learning in university physics, rather than a PRISMA-style systematic review or meta-analysis. Mapping heterogeneous interventions to functional components and mechanism premises therefore inevitably involves interpretive judgment. Although we have made the eligibility criteria explicit and the search and screening processes more transparent, the evidence base remains constrained by the reporting completeness of primary studies, which may overrepresent interventions with more operationally detailed descriptions. Moreover, the framework prioritizes conceptual coherence and actionable design elements over comprehensive coverage and pooled effect estimation. Future research could build on this foundation by combining framework development with systematic evidence mapping and, where feasible, quantitative synthesis; replicating the search in Scopus and education-focused databases (e.g., ERIC) could assess coverage gaps and improve recall.

Author Contributions

Conceptualization, F.X. and C.W.; methodology, F.X. and C.W.; investigation, F.X.; data curation, F.X.; writing—original draft preparation, F.X.; writing—review and editing, C.W. and J.J.; visualization, F.X.; supervision, J.J.; project administration, C.W.; funding acquisition, C.W. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Heilongjiang Province Higher Education Teaching Reform Research Project, Exploration and Practice of Teacher Education Talent Cultivation under the New Liberal Arts Context: “Industry–Education Integration + Multi-University Collaboration” (Grant No. SJGZ20220132).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

No new data were created or analyzed in this study. Data sharing is not applicable to this article.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Pattiserlihun, A.; Setiadi, S.J. Blended-flipped classroom learning for physics students with the topic of the photoelectric effect. J. Inov. Pendidik. IPA 2020, 6, 28109. [Google Scholar] [CrossRef]
  2. Dewantara, D.; Mısbah, M.; Wati, M. The implementation of blended learning in analog electronic learning. J. Phys. Conf. Ser. 2020, 1422, 012002. [Google Scholar] [CrossRef]
  3. Suryani, Y.; Ningrum, A.; Hidayah, N.; Dewi, N.R. The effectiveness of blended learning-based scaffolding strategy assisted by google classroom toward the learning outcomes and students’ self-efficacy. J. Phys. Conf. Ser. 2021, 1796, 012031. [Google Scholar] [CrossRef]
  4. Hasas, A.; Enayat, W.; Hakimi, M.; Ahmady, E. A Comprehensive Review of ICT Integration in Enhancing Physics Education. Magneton J. Inov. Pembelajaran Fis. 2024, 2, 36–44. [Google Scholar] [CrossRef]
  5. Haris; Wibawa, B.; Mahdiyah. Knowledge-Based Flipped Classroom Model to Improve Physics Learning Outcomes. J. Phys. Conf. Ser. 2024, 2866, 012108. [Google Scholar] [CrossRef]
  6. Karagöl, İ.; Esen, E. The Effect of Flipped Learning Approach on Academic Achievement: A Meta-Analysis Study. Hacet. Univ. J. Educ. 2019, 34, 1–20. [Google Scholar] [CrossRef]
  7. Zhang, Q.; Cheung, E.S.; Cheung, C.S. The Impact of Flipped Classroom on College Students’ Academic Performance: A Meta-Analysis Based on 20 Experimental Studies. Sci. Insights Educ. Front. 2021, 8, 1059–1080. [Google Scholar] [CrossRef]
  8. Deslauriers, L.; McCarty, L.S.; Miller, K.; Callaghan, K.; Kestin, G. Measuring actual learning versus feeling of learning in response to being actively engaged in the classroom. Proc. Natl. Acad. Sci. USA 2019, 116, 19251–19257. [Google Scholar] [CrossRef]
  9. Kalender, Z.Y.; Marshman, E.; Schunn, C.D.; Nokes-Malach, T.J.; Singh, C. Damage caused by women’s lower self-efficacy on physics learning. Phys. Rev. Phys. Educ. Res. 2020, 16, 010118. [Google Scholar] [CrossRef]
  10. Marshman, E.M.; Kalender, Z.Y.; Nokes-Malach, T.; Schunn, C.; Singh, C. Female students with A’s have similar physics self-efficacy as male students with C’s in introductory courses: A cause for alarm? Phys. Rev. Phys. Educ. Res. 2018, 14, 020123. [Google Scholar] [CrossRef]
  11. Tullis, J.G.; Goldstone, R.L. Why does peer instruction benefit student learning? Cogn. Res. 2020, 5, 15. [Google Scholar] [CrossRef]
  12. Marzoli, I.; Colantonio, A.; Fazio, C.; Giliberti, M.; Scotti di Uccio, U.; Testa, I. Effects of emergency remote instruction during the COVID-19 pandemic on university physics students in Italy. Phys. Rev. Phys. Educ. Res. 2021, 17, 020130. [Google Scholar] [CrossRef]
  13. Rahayu, S.; Setyosari, P.; Hidayat, A.; Kuswandi, D. The Effectiveness of Creative Problem Solving-Flipped Classroom for Enhancing Students’ Creative Thinking Skills of Online Physics Educational Learning. J. Pendidik. IPA Indones. 2022, 11, 649–656. [Google Scholar] [CrossRef]
  14. Jang, H.Y.; Kim, H.J. A Meta-Analysis of the Cognitive, Affective, and Interpersonal Outcomes of Flipped Classrooms in Higher Education. Educ. Sci. 2020, 10, 115. [Google Scholar] [CrossRef]
  15. Lai, J.W.; Cheong, K.H. Educational Opportunities and Challenges in Augmented Reality: Featuring Implementations in Physics Education. IEEE Access 2022, 10, 43143–43158. [Google Scholar] [CrossRef]
  16. Krasnova, L.A.; Shurygin, V.Y. Blended Learning of Physics in the Context of the Professional Development of Teachers. Int. J. Emerg. Technol. Learn. (IJET) 2019, 14, 17. [Google Scholar] [CrossRef]
  17. Hartikainen, S.; Rintala, H.; Pylväs, L.; Nokelainen, P. The Concept of Active Learning and the Measurement of Learning Outcomes: A Review of Research in Engineering Higher Education. Educ. Sci. 2019, 9, 276. [Google Scholar] [CrossRef]
  18. Yusro, A.C.; Sasono, M.; Primayoga, G. The influence of active involvement on learning outcomes of physics pre-service teachers: A case study of blended learning on statistics course. Momentum Phys. Educ. J. 2020, 6, 30–37. [Google Scholar] [CrossRef]
  19. Parappilly, M.; Woodman, R.; Randhawa, S. Feasibility and Effectiveness of Different Models of Team-Based Learning Approaches in STEMM-Based Disciplines. Res. Sci. Educ. 2019, 51, 391–405. [Google Scholar] [CrossRef]
  20. Suana, W. Inquiry-based Blended Learning Design for Physics Course: The Effectiveness and Students’ Satisfaction. Berk. Ilm. Pendidik. Fis. 2022, 10, 126. [Google Scholar] [CrossRef]
  21. Nicholus, G.; Muwonge, C.M.; Joseph, N. The Role of Problem-Based Learning Approach in Teaching and Learning Physics: A Systematic Literature Review. F1000Research 2023, 12, 951. [Google Scholar] [CrossRef] [PubMed]
  22. Fidan, M.; Tuncel, M. Integrating augmented reality into problem based learning: The effects on learning achievement and attitude in physics education. Comput. Educ. 2019, 142, 103635. [Google Scholar] [CrossRef]
  23. Prayogi, S.; Verawati, N.N. Physics Learning Technology for Sustainable Development Goals (SDGs): A Literature Study. Int. J. Ethnosci. Technol. Educ. 2024, 1, 155. [Google Scholar] [CrossRef]
  24. Yaşar, M.; Polat, M. A MOOC-based Flipped Classroom Model: Reflecting on pre-service English language teachers’ experience and perceptions. Particip. Educ. Res. 2021, 8, 103–123. [Google Scholar] [CrossRef]
  25. Atwa, Z.; Sulayeh, Y.; Abdelhadi, A.; Jazar, H.A.; Eriqat, S. Flipped Classroom Effects on Grade 9 Students’ Critical Thinking Skills, Psychological Stress, and Academic Achievement. Int. J. Instr. 2022, 15, 737–750. [Google Scholar] [CrossRef]
  26. Hamilton, D.; McKechnie, J.; Edgerton, E.; Wilson, C. Immersive virtual reality as a pedagogical tool in education: A systematic literature review of quantitative learning outcomes and experimental design. J. Comput. Educ. 2020, 8, 1–32. [Google Scholar] [CrossRef]
  27. Lampropoulos, G.; Kinshuk. Virtual reality and gamification in education: A systematic review. Educ. Technol. Res. Dev. 2024, 72, 1691–1785. [Google Scholar] [CrossRef]
  28. Talan, T.; Batdı, V. Evaluating the flipped classroom model through the multi-complementary approach. Turk. Online J. Distance Educ. 2020, 21, 31–67. [Google Scholar] [CrossRef]
  29. Hu, Y.; Huang, J.; Kong, F. College students’ learning perceptions and outcomes in different classroom environments: A community of inquiry perspective. Front. Psychol. 2022, 13, 1047027. [Google Scholar] [CrossRef]
  30. Styers, M.L.; Van Zandt, P.A.; Hayden, K.L. Active Learning in Flipped Life Science Courses Promotes Development of Critical Thinking Skills. CBE Life Sci. Educ. 2018, 17, ar39. [Google Scholar] [CrossRef]
  31. Martin, F.; Wu, T.; Wan, L.; Xie, K. A Meta-Analysis on the Community of Inquiry Presences and Learning Outcomes in Online and Blended Learning Environments. Online Learn. 2022, 26, 2604. [Google Scholar] [CrossRef]
  32. Rasmitadila, R.; Widyasari, W.; Humaira, M.; Tambunan, A.; Rachmadtullah, R.; Samsudin, A. Using Blended Learning Approach (BLA) in Inclusive Education Course: A Study Investigating Teacher Students’ Perception. Int. J. Emerg. Technol. Learn. (IJET) 2020, 15, 72–85. [Google Scholar] [CrossRef]
  33. Mbonyiryivuze, A.; Yadav, L.L.; Amadalo, M.M. Students’ conceptual understanding of electricity and magnetism and its implications: A review. Afr. J. Educ. Stud. Math. Sci. 2019, 15, 55–67. [Google Scholar] [CrossRef]
  34. Marshman, E.; DeVore, S.; Singh, C. Holistic framework to help students learn effectively from research-validated self-paced learning tools. Phys. Rev. Phys. Educ. Res. 2020, 16, 020108. [Google Scholar] [CrossRef]
  35. Awuor, F.M.; Okono, E. ICT Integration in Learning of Physics in Secondary Schools in Kenya: Systematic Literature Review. Open J. Soc. Sci. 2022, 10, 421–461. [Google Scholar] [CrossRef]
  36. Nerantzi, C. The Use of Peer Instruction and Flipped Learning to Support Flexible Blended Learning During and After the COVID-19 Pandemic. Int. J. Manag. Appl. Res. 2020, 7, 184–195. [Google Scholar] [CrossRef]
  37. Radulović, B.; Dorocki, M.; Olić Ninković, S.; Adamov, J. The effects of blended learning approach on student motivation for learning physics. J. Balt. Sci. Educ. 2023, 22, 73–82. [Google Scholar] [CrossRef]
  38. AlArabi, K.; Tairab, H.; Wardat, Y.; Belbase, S.; Alabidi, S. Enhancing the learning of newton’s second law of motion using computer simulations. J. Balt. Sci. Educ. 2022, 21, 946–966. [Google Scholar] [CrossRef]
  39. Zakaria, N.; Phang, F.; Pusppanathan, J. Physics on the Go: A Mobile Computer-Based Physics Laboratory for Learning Forces and Motion. Int. J. Emerg. Technol. Learn. (IJET) 2019, 14, 167–183. [Google Scholar] [CrossRef]
  40. Chala, A.A.; Kedir, I.; Wami, S. Secondary School Students’ Beliefs Towards Learning Physics and Its Influencing Factors. Res. Humanit. Soc. Sci. 2020, 10, 37–49. [Google Scholar] [CrossRef]
  41. Radu, I.; Schneider, B. What Can We Learn from Augmented Reality (AR)? In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, UK, 4–9 May 2019; pp. 1–12. [Google Scholar] [CrossRef]
  42. Blajvaz, B.K.; Bogdanović, I.Z.; Jovanović, T.S.; Stanisavljević, J.D.; Pavkov-Hrvojević, M.V. The jigsaw technique in lower secondary physics education: Students’ achievement, metacognition and motivation. J. Balt. Sci. Educ. 2022, 21, 545–557. [Google Scholar] [CrossRef]
  43. Downing, V.R.; Cooper, K.M.; Cala, J.M.; Gin, L.E.; Brownell, S.E. Fear of Negative Evaluation and Student Anxiety in Community College Active-Learning Science Courses. CBE Life Sci. Educ. 2020, 19, ar20. [Google Scholar] [CrossRef]
  44. Doucette, D.; Clark, R.; Singh, C. Professional development combining cognitive apprenticeship and expectancy-value theories improves lab teaching assistants’ instructional views and practices. Phys. Rev. Phys. Educ. Res. 2020, 16, 020102. [Google Scholar] [CrossRef]
  45. Rizki, I.A.; Saphira, H.V.; Alfarizy, Y.; Saputri, A.D.; Ramadani, R.; Suprapto, N. Adventuring Physics: Integration of Adventure Game and Augmented Reality Based on Android in Physics Learning. Int. J. Interact. Mob. Technol. (IJIM) 2023, 17, 4–21. [Google Scholar] [CrossRef]
  46. Shroff, R.H.; Ting, F.S.; Lam, W.H.; Cecot, T.; Yang, J.; Chan, L.K. Conceptualization, Development and Validation of an Instrument to Measure Learners’ Perceptions of their Active Learning Strategies within an Active Learning Context. Int. J. Educ. Methodol. 2021, 7, 201–223. [Google Scholar] [CrossRef]
  47. Bawaneh, A.K.; Moumene, A.B. Flipping the Classroom for Optimizing Undergraduate Students’ Motivation and Understanding of Medical Physics Concepts. Eurasia J. Math. Sci. Technol. Educ. 2020, 16, em1899. [Google Scholar] [CrossRef]
  48. Abtokhi, A.; Jatmiko, B.; Wasis, W. Evaluation of self-regulated learning on problem-solving skills in online basic Physics learning during the COVID-19 pandemic. J. Technol. Sci. Educ. 2021, 11, 541–555. [Google Scholar] [CrossRef]
  49. Owen, H.E.; Licorish, S.A. Game-Based Student Response System: The Effectiveness of Kahoot! on Junior and Senior Information Science Students’ Learning. J. Inf. Technol. Educ. Res. 2020, 19, 511–553. [Google Scholar] [CrossRef] [PubMed]
  50. Dignath, C.; Veenman, M.V. The Role of Direct Strategy Instruction and Indirect Activation of Self-Regulated Learning—Evidence from Classroom Observation Studies. Educ. Psychol. Rev. 2020, 33, 489–533. [Google Scholar] [CrossRef]
  51. Dessie, E.; Gebeyehu, D.; Eshetu, F. Enhancing critical thinking, metacognition, and conceptual understanding in introductory physics: The impact of direct and experiential instructional models. Eurasia J. Math. Sci. Technol. Educ. 2023, 19, em2287. [Google Scholar] [CrossRef]
  52. Evendi, E.; Verawati, N.N. Evaluation of Student Learning Outcomes in Problem-Based Learning: Study of Its Implementation and Reflection of Successful Factors. J. Penelit. Pendidik. IPA 2021, 7, 69–76. [Google Scholar] [CrossRef]
  53. Rosen, D.; Kelly, A. Epistemology, socialization, help seeking, and gender-based views in in-person and online, hands-on undergraduate physics laboratories. Phys. Rev. Phys. Educ. Res. 2020, 16, 020116. [Google Scholar] [CrossRef]
  54. Sokoloff, D.R.; Yüksel, T. Physics Education Research and the Development of Active Learning Strategies in Introductory Physics. In The International Handbook of Physics Education Research: Learning Physics; AIP Publishing: Melville, NY, USA, 2023; pp. 23–26. [Google Scholar] [CrossRef]
  55. Kannan, V.; Kuromiya, H.; Gouripeddi, S.; Majumdar, R.; Madathil Warriem, J.; Ogata, H. Flip & Pair—A strategy to augment a blended course with active-learning components: Effects on engagement and learning. Smart Learn. Environ. 2020, 7, 34. [Google Scholar] [CrossRef]
  56. Banda, H.; Nzabahimana, J. Effect of integrating physics education technology simulations on students’ conceptual understanding in physics: A review of literature. Phys. Rev. Phys. Educ. Res. 2021, 17, 023108. [Google Scholar] [CrossRef]
  57. Rafon, J.E.; Mistades, V.M. Interactive Engagement in Rotational Motion via Flipped Classroom and 5E Instructional Model. Int. J. Inf. Educ. Technol. 2020, 10, 905–910. [Google Scholar] [CrossRef]
  58. Bozzi, M.; Raffaghelli, J.E.; Zani, M. Peer Learning as a Key Component of an Integrated Teaching Method: Overcoming the Complexities of Physics Teaching in Large Size Classes. Educ. Sci. 2021, 11, 67. [Google Scholar] [CrossRef]
  59. Borda, E.; Schumacher, E.; Hanley, D.; Geary, E.; Warren, S.; Ipsen, C.; Stredicke, L. Initial implementation of active learning strategies in large, lecture STEM courses: Lessons learned from a multi-institutional, interdisciplinary STEM faculty development program. Int. J. STEM Educ. 2020, 7, 4. [Google Scholar] [CrossRef]
  60. Møgelvang, A.; Nyléhn, J. Co-operative Learning in Undergraduate Mathematics and Science Education: A Scoping Review. Int. J. Sci. Math. Educ. 2022, 21, 1935–1959. [Google Scholar] [CrossRef]
  61. Mercier, E.; Goldstein, M.H.; Baligar, P.; Rajarathinam, R.J. Collaborative Learning in Engineering Education. In International Handbook of Engineering Education Research; Routledge: New York, NY, USA, 2023; pp. 402–432. [Google Scholar] [CrossRef]
  62. Hood, S.; Barrickman, N.; Djerdjian, N.; Farr, M.; Magner, S.; Roychowdhury, H.; Gerrits, R.; Lawford, H.; Ott, B.; Ross, K.; et al. “I Like and Prefer to Work Alone”: Social Anxiety, Academic Self-Efficacy, and Students’ Perceptions of Active Learning. CBE Life Sci. Educ. 2021, 20, ar12. [Google Scholar] [CrossRef]
  63. Beichner, R.J.; Saul, J.M.; Allain, R.J.; Deardorff, D.L.; Abbott, D.S. Introduction to Scale Up: Student Centered Activities for Large Enrollment University Physics. 2000, pp. 1–12. Available online: https://peer.asee.org/introduction-to-scale-up-student-centered-activities-for-large-enrollment-university-physics.pdf (accessed on 5 January 2026).
  64. Barlow, A.; Brown, S. Correlations between modes of student cognitive engagement and instructional practices in undergraduate STEM courses. Int. J. STEM Educ. 2020, 7, 22. [Google Scholar] [CrossRef]
  65. Fakoya, A.O.; Ndrio, M.; McCarthy, K.J. Facilitating Active Collaborative Learning in Medical Education; a Literature Review of Peer Instruction Method. Adv. Med. Educ. Pract. 2023, 14, 1087–1099. [Google Scholar] [CrossRef]
  66. Nasution, E.S.; Nasution, F.; Harahap, T.R.; Tambunan, E.E. Language and Visual Representation in Physics: Enhancing Understanding Through Multimedia. Int. J. Educ. Res. Excell. (IJERE) 2025, 4, 1–9. [Google Scholar] [CrossRef]
  67. Morris, R.; Perry, T.; Wardle, L. Formative assessment and feedback for learning in higher education: A systematic review. Rev. Educ. 2021, 9, 3292. [Google Scholar] [CrossRef]
  68. Holmes, N.; Lewandowski, H. Investigating the landscape of physics laboratory instruction across North America. Phys. Rev. Phys. Educ. Res. 2020, 16, 020162. [Google Scholar] [CrossRef]
  69. Li, Y.; Singh, C. Effect of gender, self-efficacy, and interest on perception of the learning environment and outcomes in calculus-based introductory physics courses. Phys. Rev. Phys. Educ. Res. 2021, 17, 010143. [Google Scholar] [CrossRef]
  70. Munfaridah, N.; Avraamidou, L.; Goedhart, M. The Use of Multiple Representations in Undergraduate Physics Education: What Do we Know and Where Do we Go from Here? Eurasia J. Math. Sci. Technol. Educ. 2021, 17, em1934. [Google Scholar] [CrossRef]
  71. Pacala, F. Combining Active Learning Strategies: Performances and Experiences of Grade School Filipino Students. Int. J. Soc. Learn. (IJSL) 2021, 2, 84–104. [Google Scholar] [CrossRef]
  72. Kryjevskaia, M.; Stetzer, M.R.; Lindsey, B.A.; McInerny, A.; Heron, P.R.; Boudreaux, A. Designing research-based instructional materials that leverage dual-process theories of reasoning: Insights from testing one specific, theory-driven intervention. Phys. Rev. Phys. Educ. Res. 2020, 16, 020140. [Google Scholar] [CrossRef]
  73. de Jong, T. Moving towards engaged learning in STEM domains; there is no simple answer, but clearly a road ahead. J. Comput. Assist. Learn. 2019, 35, 153–167. [Google Scholar] [CrossRef]
  74. Liang, Y.W.; Wu, M.W.; Pan, Z. Information and Communication Technology Enabled Active Learning in College Physics Experiment. In Proceedings of the 2021 16th International Conference on Computer Science & Education (ICCSE), Lancaster, UK, 17–21 August 2021; Volume 24, pp. 1014–1018. [Google Scholar] [CrossRef]
  75. Moore, M.E.; Vega, D.M.; Wiens, K.M.; Caporale, N. Connecting Theory to Practice: Using Self-Determination Theory To Better Understand Inclusion in STEM. J. Microbiol. Biol. Educ. 2020, 21, 1955. [Google Scholar] [CrossRef] [PubMed]
  76. Bøe, M.V.; Lauvland, A.; Henriksen, E.K. How Motivation for Undergraduate Physics Interacts With Learning Activities in a System With Built-In Autonomy. Sci. Educ. 2024, 109, 506–522. [Google Scholar] [CrossRef]
  77. Wang, N.; Tan, A.L.; Xiao, W.R.; Zeng, F.; Xiang, J.; Duan, W. The effect of learning experiences on interest in stem careers: A structural equation model. J. Balt. Sci. Educ. 2021, 20, 651–663. [Google Scholar] [CrossRef]
  78. Chen, T.I.; Lin, S.K.; Chung, H.C. Gamified educational robots lead an increase in motivation and creativity in stem education. J. Balt. Sci. Educ. 2023, 22, 427–438. [Google Scholar] [CrossRef]
  79. Master, A.; Meltzoff, A. Cultural Stereotypes and Sense of Belonging Contribute to Gender Gaps in STEM. Int. J. Gend. Sci. Technol. 2020, 12, 152–198. [Google Scholar]
  80. Johnson, K. Implementing inclusive practices in an active learning STEM classroom. Adv. Physiol. Educ. 2019, 43, 207–210. [Google Scholar] [CrossRef]
  81. Fuesting, M.A.; Diekman, A.B.; Boucher, K.L.; Murphy, M.C.; Manson, D.L.; Safer, B.L. Growing STEM: Perceived faculty mindset as an indicator of communal affordances in STEM. J. Personal. Soc. Psychol. 2019, 117, 260–281. [Google Scholar] [CrossRef]
  82. O’Connell, K.; Hoke, K.; Berkowitz, A.; Branchaw, J.; Storksdieck, M. Undergraduate learning in the field: Designing experiences, assessing outcomes, and exploring future opportunities. J. Geosci. Educ. 2020, 69, 387–400. [Google Scholar] [CrossRef]
  83. O’Connell, K.; Hoke, K.L.; Giamellaro, M.; Berkowitz, A.R.; Branchaw, J. A Tool For Designing And Studying Student-Centered Undergraduate Field Experiences: The Ufern Model. BioScience 2021, 72, 189–200. [Google Scholar] [CrossRef]
  84. Jones, D.; Lotz, N.; Holden, G. A longitudinal study of virtual design studio (VDS) use in STEM distance design education. Int. J. Technol. Des. Educ. 2020, 31, 839–865. [Google Scholar] [CrossRef]
  85. van de Heyde, V.; Siebrits, A. The ecosystem of e-learning model for higher education. S. Afr. J. Sci. 2019, 115, 5808. [Google Scholar] [CrossRef] [PubMed]
  86. Capone, R. Blended Learning and Student-centered Active Learning Environment: A Case Study with STEM Undergraduate Students. Can. J. Sci. Math. Technol. Educ. 2022, 22, 210–236. [Google Scholar] [CrossRef]
  87. Zeng, H.; Zhou, S.N.; Hong, G.R.; Li, Q.Y.; Xu, S.Q. Evaluation of interactive game-based learning in physics domain. J. Balt. Sci. Educ. 2020, 19, 484–498. [Google Scholar] [CrossRef]
  88. Jamaluddin, F.; Razak, A.Z.; Rahim, S.S. Navigating the challenges and future pathways of STEM education in Asia-Pacific region: A comprehensive scoping review. STEM Educ. 2024, 5, 53–88. [Google Scholar] [CrossRef]
  89. Taşar, M.F.; Heron, P.R. The International Handbook of Physics Education Research: Teaching Physics; AIP Publishing: Melville, NY, USA, 2023. [Google Scholar] [CrossRef]
  90. Kumaş, A.; Kan, S. Infographic applications in cooperative groups in physics teaching. Can. J. Phys. 2022, 101, 30–42. [Google Scholar] [CrossRef]
  91. Hamed, G.; Aljanazrah, A. The Effectiveness of Using Virtual Experiments on Students’ Learning in the General Physics Lab. J. Inf. Technol. Educ. Res. 2020, 19, 977–996. [Google Scholar] [CrossRef]
  92. Campos, E.; Hidrogo, I.; Zavala, G. Impact of virtual reality use on the teaching and learning of vectors. Front. Educ. 2022, 7, 965640. [Google Scholar] [CrossRef]
  93. Castillo, J.; Santiago, L.; Martínez, S. Optimization of Physics Learning Through Immersive Virtual Reality: A Study on the Efficacy of Serious Games. Appl. Sci. 2025, 15, 3405. [Google Scholar] [CrossRef]
  94. Tanjung, Y.I.; Festiyed, F.; Diliarosta, S. Developing the Physics Learning Management System (PLMS) to Support Blended Learning Models. Int. J. Inf. Educ. Technol. 2025, 15, 18–29. [Google Scholar] [CrossRef]
  95. Buday Benzar, M.; Dalisay, C.N.; Emralino Blaisie, S.; Laurio Shiela Lyn, R. Exploring students’ motivation and academic performance in learning Ohm’s law using PhET simulations. World J. Adv. Res. Rev. 2023, 20, 287–294. [Google Scholar] [CrossRef]
  96. Kusumaningtyas, D.A.; Manyunu, M.; Kurniasari, E.; Awalin, A.N.; Rahmaniati, R.; Febriyanti, A. Enhancing Learning Outcomes: A Study on the Development of Higher Order Thinking Skills based Evaluation Instruments for Work and Energy in High School Physics. Indones. J. Learn. Adv. Educ. (IJOLAE) 2024, 6, 14–31. [Google Scholar] [CrossRef]
  97. Doyan, A.; Susilawati, S.; Hadisaputra, S.; Muliyadi, L. Effectiveness of Quantum Physics Learning Tools Using Blended Learning Models to Improve Critical Thinking and Generic Science Skills of Students. J. Penelit. Pendidik. IPA 2022, 8, 1030–1033. [Google Scholar] [CrossRef]
  98. Good, M.; Maries, A.; Singh, C. Impact of traditional or evidence-based active-engagement instruction on introductory female and male students’ attitudes and approaches to physics problem solving. Phys. Rev. Phys. Educ. Res. 2019, 15, 020129. [Google Scholar] [CrossRef]
  99. Richter, K.; Kickmeier-Rust, M. Gamification in Physics Education: Play Your Way to Better Learning. Int. J. Serious Games 2025, 12, 59–81. [Google Scholar] [CrossRef]
  100. Nasir, M.; Fakhruddin, Z. Design and Analysis of Multimedia Mobile Learning Based on Augmented Reality to Improve Achievement in Physics Learning. Int. J. Inf. Educ. Technol. 2023, 13, 993–1000. [Google Scholar] [CrossRef]
  101. Dewantara, D.; Wati, M.; Mahtari, S.; Haryandi, S. Blended Learning to Improve Learning Outcomes in Digital Electronics Courses. In Proceedings of the 1st South Borneo International Conference on Sport Science and Education (SBICSSE 2019); Atlantis Press: Dordrecht, The Netherlands, 2020. [Google Scholar] [CrossRef]
  102. Bitzenbauer, P.; Hennig, F. Flipped classroom in physics teacher education: (How) can students’ expectations be met? Front. Educ. 2023, 8, 1194963. [Google Scholar] [CrossRef]
  103. Erlina, N.; Prayekti, P.; Wicaksono, I. Atomic physics teaching materials in blended learning to improve self-directed learning skills in distance education. Turk. Online J. Distance Educ. 2022, 23, 20–38. [Google Scholar] [CrossRef]
  104. Widyaningsih, S.W.; Yusuf, I.; Prasetyo, Z.K.; Istiyono, E. Online Interactive Multimedia Oriented to HOTS through E-Learning on Physics Material about Electrical Circuit. JPI (J. Pendidik. Indones.) 2020, 9, 17667. [Google Scholar] [CrossRef]
  105. Husnaini, S.; Chen, S. Effects of guided inquiry virtual and physical laboratories on conceptual understanding, inquiry performance, scientific inquiry self-efficacy, and enjoyment. Phys. Rev. Phys. Educ. Res. 2019, 15, 010119. [Google Scholar] [CrossRef]
  106. Novitra, F.; Festiyed; Yohandri; Asrizal. Development of Online-based Inquiry Learning Model to Improve 21st-Century Skills of Physics Students in Senior High School. Eurasia J. Math. Sci. Technol. Educ. 2021, 17, em2004. [Google Scholar] [CrossRef]
  107. Agyei, E.D.; Agyei, D.D. Promoting Interactive Teaching with ICT: Features of Intervention for the Realities in the Ghanaian Physics Senior High School Classroom. Int. J. Interact. Mob. Technol. (IJIM) 2021, 15, 93. [Google Scholar] [CrossRef]
  108. Al-Kamzari, F.; Alias, N. A systematic literature review of project-based learning in secondary school physics: Theoretical foundations, design principles, and implementation strategies. Humanit. Soc. Sci. Commun. 2025, 12, 286. [Google Scholar] [CrossRef]
  109. Kämpf, L.; Stallmach, F. Spiral-curricular blended learning for the mathematics education in physics teacher training courses. Front. Educ. 2024, 9, 1450607. [Google Scholar] [CrossRef]
  110. Koumpouros, Y. Revealing the true potential and prospects of augmented reality in education. Smart Learn. Environ. 2024, 11, 2. [Google Scholar] [CrossRef]
  111. Sujanem, R.; Suwindra, I. Problem-based Interactive Physics E-Module in Physics Learning Through Blended PBL to Enhance Students’ Critical Thinking Skills. J. Pendidik. IPA Indones. 2023, 12, 135–145. [Google Scholar] [CrossRef]
  112. Bao, L.; Koenig, K. Physics education research for 21st century learning. Discip. Interdiscip. Sci. Educ. Res. 2019, 1, 2. [Google Scholar] [CrossRef]
  113. Hanč, J.; Borovský, D.; Hančová, M. Blended learning: A data-literate science teacher is a better teacher. J. Phys. Conf. Ser. 2024, 2715, 012012. [Google Scholar] [CrossRef]
  114. Agyare, B.; Asare, J.; Kraishan, A.; Nkrumah, I.; Adjekum, D.K. A cross-national assessment of artificial intelligence (AI) Chatbot user perceptions in collegiate physics education. Comput. Educ. Artif. Intell. 2025, 8, 100365. [Google Scholar] [CrossRef]
  115. Xenakis, A.; Kalovrektis, Κ.; Theodoropoulou, K.; Karampelas, A.; Giannakas, G.; Sotiropoulos, D.J.; Vavougios, D. Using Sensors and Digital Data Collection/Analysis Technologies in K–12 Physics Education Under the STEM Perspective. In The International Handbook of Physics Education Research: Teaching Physics; Taşar, M.F., Heron, P.R.L., Eds.; AIP Publishing: New York, NY, USA, 2023; Chapter 6. [Google Scholar] [CrossRef]
  116. Herayanti, L.; Widodo, W.; Susantini, E.; Gunawan, G. The effectiveness of blended learning model based on inquiry collaborative tutorial toward students’ problem-solving skills in physics. J. Educ. Gift. Young Sci. 2020, 8, 959–972. [Google Scholar] [CrossRef]
  117. Tuveri, M.; Steri, A.; Fadda, D.; Stefanizzi, R.; Fanti, V.; Bonivento, W.M. Fostering the Interdisciplinary Learning of Contemporary Physics Through Digital Technologies: The “Gravitas” Project. Digital 2024, 4, 971–989. [Google Scholar] [CrossRef]
  118. Silva, R.; Rodrigues, R.; Leal, C. Gamification in Management Education: A Systematic Literature Review. BAR—Braz. Adm. Rev. 2019, 16, 180103. [Google Scholar] [CrossRef]
  119. Tumangkeng, J.; Muya, A. Enhancing Student Learning Activities Through Interactive Learning Design in Basic Physics I. Phys. Educ. 2024, 6, 00173. [Google Scholar] [CrossRef]
  120. Suma, K.; Suwindra, I.N.; Sujanem, R. The Effectiveness of Blended Learning in Increasing Prospective Physics Teacher Students’ Learning Motivation and Problem-Solving Ability. JPI (J. Pendidik. Indones.) 2020, 9, 436–445. [Google Scholar] [CrossRef]
  121. Ruggieri, C. Students’ use and perception of textbooks and online resources in introductory physics. Phys. Rev. Phys. Educ. Res. 2020, 16, 020123. [Google Scholar] [CrossRef]
  122. Papaioannou, G.; Volakaki, M.-G.; Kokolakis, S.; Vouyioukas, D. Learning Spaces in Higher Education: A State-of-the-Art Review. Trends High. Educ. 2023, 2, 526–545. [Google Scholar] [CrossRef]
  123. Cui, T.; Wang, J. Empowering active learning: A social annotation tool for improving student engagement. Br. J. Educ. Technol. 2023, 55, 712–730. [Google Scholar] [CrossRef]
  124. Shofiyah, A.; Suprianto, V.R.; Robiz, M.N.Z. Meta-analisis Kemampuan Kognitif Siswa dalam Pembelajaran Fisika dengan Model Problem Based Learning (PBL). Mutiara J. Ilm. Multidisiplin Indones. 2024, 2, 204–216. [Google Scholar] [CrossRef]
  125. Darmaji, D.; Kurniawan, D.; Astalini, A.; Lumbantoruan, A.; Samosir, S. Mobile Learning in Higher Education for The Industrial Revolution 4.0: Perception and Response of Physics Practicum. Int. J. Interact. Mob. Technol. (IJIM) 2019, 13, 4. [Google Scholar] [CrossRef]
  126. Makiyah, Y.; Nurdiansah, I.; Mahmudah, I.; Maulidah, R.A. Implementation of Circuit Wizard Software in Basic Electronics Course to Improving Student Motivation and Learning Outcomes. Radiasi J. Berk. Pendidik. Fis. 2022, 15, 22–27. [Google Scholar] [CrossRef]
  127. Jamil, M.; Hafeez, F.; Muhammad, N. Critical Thinking Development for 21st Century: Analysis of Physics Curriculum. J. Soc. Organ. Matters 2024, 3, 01–10. [Google Scholar] [CrossRef]
  128. Jamil, M.; Chohan, I.; Ali, M. Unpacking the 4cs: The qualitative content analysis of 21st century learning skills in physics textbook (grade ix). J. Soc. Res. Dev. 2025, 6, 37–49. [Google Scholar] [CrossRef]
  129. Marnita, M.; Taufiq, M.; Iskandar, I.; Rahmi, R. The Effect of Blended Learning Problem-Based Instruction Model on Students’ Critical Thinking Ability in Thermodynamic Course. J. Pendidik. IPA Indones. 2020, 9, 430–438. [Google Scholar] [CrossRef]
  130. İnce, E. Implementation and Results of a New Problem Solving Approach in Physics Teaching. Momentum Phys. Educ. J. 2019, 3, 58–68. [Google Scholar] [CrossRef]
  131. Djudin, T. An Easy Way to Solve Problems of Physics by Using Metacognitive Strategies: A Quasy-Experimental Study on Prospective Teachers in Tanjungpura University-Indonesia. J. Teach. Teach. Educ. 2020, 8, 19–27. [Google Scholar] [CrossRef]
  132. Susilawati, A.; Yusrizal, Y.; Halim, A.; Syukri, M.; Khaldun, I.; Susanna, S. Effect of Using Physics Education Technology (PhET) Simulation Media to Enhance Students’ Motivation and Problem-Solving Skills in Learning Physics. J. Penelit. Pendidik. IPA 2022, 8, 1157–1167. [Google Scholar] [CrossRef]
  133. Hidayatulloh, M.; Wiryokusumo, I.; Walujo, D.A. Remidiasi Miskonsepsi Siswa Pada Materi Listrik Dinamis Menggunakan Ebook Interaktif. J. Pendidik. Fis. Dan Teknol. 2019, 5, 30–39. [Google Scholar] [CrossRef]
  134. Muhammad, S.; Sami, M.; Bano, N.; Rida, B.; Muhammad, R. Artificial Intelligence in Physics Education: Transforming Learning from Primary to University Level. Indus J. Soc. Sci. 2025, 3, 717–733. [Google Scholar] [CrossRef]
  135. Xing, H.; Zhai, Y.; Han, S.; Zhao, Y.; Gong, W.; Wang, Y.; Han, J.; Liu, Q. The measuring instrument of primitive physics problem for upper-secondary school students: Compilation and exploration. J. Balt. Sci. Educ. 2022, 21, 305–324. [Google Scholar] [CrossRef]
  136. Ilma, A.; Adhelacahya, K.; Ekawati, E. Assessment for Learning Model in Competency Assessment of 21st Century Student Assisted by Google Classroom. J. Phys. Conf. Ser. 2021, 1805, 012005. [Google Scholar] [CrossRef]
  137. Hidayatullah, Z.; Wilujeng, I.; Nurhasanah, N.; Gusemanto, T.G.; Makhrus, M. Synthesis of the 21st Century Skills (4C) Based Physics Education Research In Indonesia. JIPF (J. Ilmu Pendidik. Fis.) 2021, 6, 88–97. [Google Scholar] [CrossRef]
  138. Rahayu, S.M.; Rosidin, U.; Herlina, K. Development of collaboration and communication skills assessment tools based on project based learning in improving high school students the soft skills. In International Conference on Educational Assessment and Policy (ICEAP 2020); Atlantis Press: Dordrecht, The Netherlands, 2021; pp. 163–166. [Google Scholar] [CrossRef]
  139. Siahaan, P.; Dewi, E.; Suhendi, E. Introduction, connection, application, reflection, and extension (ICARE) learning model: The impact on students’ collaboration and communication skills. J. Ilm. Pendidik. Fis. Al-Biruni 2020, 9, 109. [Google Scholar] [CrossRef]
  140. Putri, D.A.; Asrizal, A.; Festiyed, F. The Effects of Science Teaching Materials on Students’ 21st-Century Skills: A Meta-Analysis. J. Penelit. Pembelajaran Fis. 2023, 9, 104. [Google Scholar] [CrossRef]
  141. McKenna, A.; McMartin, F.; Terada, Y.; Sirivedhin, P.; Agogino, A. A Framework For Interpreting Students’ Perceptions of an Integrated Curriculum. In Proceedings of the 2001 Annual Conference, Albuquerque, NM, USA, 24–27 June 2001; pp. 6–32. [Google Scholar] [CrossRef]
  142. Zabolotna, K.; Nøhr, L.; Iwata, M.; Spikol, D.; Malmberg, J.; Järvenoja, H. How does collaborative task design shape collaborative knowledge construction and group-level regulation of learning? A study of secondary school students’ interactions in two varied tasks. Int. J. Comput.—Support. Collab. Learn. 2025, 20, 171–199. [Google Scholar] [CrossRef]
  143. Paminto, J.; Yulianto, A.; Linuwih, S. Development of PJBL-Based Physics Edu Media to Improve The 21st Century Learning Skills of High School Students. J. Pendidik. Fis. Indones. 2023, 19, 180–192. [Google Scholar] [CrossRef]
  144. Kharki, K.; Berrada, K.; Burgos, D. Design and Implementation of a Virtual Laboratory for Physics Subjects in Moroccan Universities. Sustainability 2021, 13, 3711. [Google Scholar] [CrossRef]
  145. Radu, I.; Schneider, B. How Augmented Reality (AR) Can Help and Hinder Collaborative Learning: A Study of AR in Electromagnetism Education. IEEE Trans. Vis. Comput. Graph. 2022, 29, 3734–3745. [Google Scholar] [CrossRef]
  146. Spirin, O.; Oleksiuk, V.; Balyk, N.; Lytvynova, S.H.; Sydorenko, S. The Blended Methodology of Learning Computer Networks: Cloud-Based Approach; Digital Library NAES of Ukraine (National Academy of Educational Sciences of Ukraine): Kyiv, Ukraine, 2019; pp. 68–80. [Google Scholar]
  147. Sundstrom, M.; Wu, D.; Walsh, C.; Heim, A.B.; Holmes, N.G. Examining the effects of lab instruction and gender composition on intergroup interaction networks in introductory physics labs. Phys. Rev. Phys. Educ. Res. 2022, 18, 010102. [Google Scholar] [CrossRef]
  148. Scutt, H.; Gilmartin, S.; Sheppard, S.; Brunhaver, S.R. Informed Practices for Inclusive Science, Technology, Engineering, and Math (STEM) Classrooms: Strategies for Educators to Close the Gender Gap. In Proceedings of the 2013 ASEE Annual Conference & Exposition, Atlanta, GA, USA, 23–26 June 2013; pp. 1–17. [Google Scholar] [CrossRef]
  149. Zhan, Z.; Li, T.; Ye, Y. Effect of jigsaw-integrated task-driven learning on students’ motivation, computational thinking, collaborative skills, and programming performance in a high-school programming course. Comput. Appl. Eng. Educ. 2024, 32, 22793. [Google Scholar] [CrossRef]
  150. Brundage, M.; Malespina, A.; Singh, C. Peer interaction facilitates co-construction of knowledge in quantum mechanics. Phys. Rev. Phys. Educ. Res. 2023, 19, 020133. [Google Scholar] [CrossRef]
  151. Jeong, H.; Hmelo-Silver, C.; Jo, K. Ten years of Computer-Supported Collaborative Learning: A meta-analysis of CSCL in STEM education during 2005–2014. Educ. Res. Rev. 2019, 28, 100284. [Google Scholar] [CrossRef]
  152. Lourakis, E.; Petridis, K. Applying Scrum in an Online Physics II Undergraduate Course: Effect on Student Progression and Soft Skills Development. Educ. Sci. 2023, 13, 126. [Google Scholar] [CrossRef]
  153. Malahito, J.; Quimbo, M. Creating G-Class: A gamified learning environment for freshman students. E-Learn. Digit. Media 2020, 17, 94–110. [Google Scholar] [CrossRef]
  154. Budi, G.; Farcis, F. Students’ Critical Thinking Skills in Innovating Problem Solving in the Physics Entrepreneurship Course. Berk. Ilm. Pendidik. Fis. 2021, 9, 39. [Google Scholar] [CrossRef]
  155. Zhai, X.; Krajcik, J. Uses of Artificial Intelligence in STEM Education; Oxford University Press: Oxford, UK, 2024. [Google Scholar] [CrossRef]
  156. Ángeles, D.; Genaro, Z.; Juan, A.A. Integrated Physics and Math course for engineering students: A First Experience. In Proceedings of the 2013 ASEE Annual Conference & Exposition, Atlanta, GA, USA, 23–26 June 2013; pp. 1–9. [Google Scholar] [CrossRef]
  157. Dominguez, A.; De la Garza, J.; Quezada-Espinoza, M.; De Meester, J. Integration of Physics and Mathematics in STEM Education: Use of Modeling. Educ. Sci. 2023, 14, 20. [Google Scholar] [CrossRef]
  158. Spikic, S.; Passel, W.; Deprez, H.; De Meester, J. Measuring and Activating iSTEM Key Principles among Student Teachers in STEM. Educ. Sci. 2022, 13, 12. [Google Scholar] [CrossRef]
  159. Ouyang, F.; Dai, X.; Chen, S. Applying multimodal learning analytics to examine the immediate and delayed effects of instructor scaffoldings on small groups’ collaborative programming. Int. J. STEM Educ. 2022, 9, 45. [Google Scholar] [CrossRef]
  160. Jackson, M.A.; Moon, S.; Doherty, J.H.; Wenderoth, M.P. Which evidence-based teaching practices change over time? Results from a university-wide STEM faculty development program. Int. J. STEM Educ. 2022, 9, 22. [Google Scholar] [CrossRef]
  161. Knowles, J.; Brooks, A.; Clement, E.; Shekhar, P.; Brown, S.A.; Aljabery, M. A Qualitative Exploration of Resource-Related Barriers Associated with EBIP Implementation in STEM Courses. In Proceedings of the 2023 ASEE Annual Conference & Exposition, Baltimore, MD, USA, 25–28 June 2023. [Google Scholar] [CrossRef]
  162. Kennedy, S.A.; Balija, A.M.; Bibeau, C.; Fuhrer, T.J.; Huston, L.A.; Jackson, M.S.; Lane, K.T.; Lau, J.K.; Liss, S.; Monceaux, C.J.; et al. Faculty Professional Development on Inclusive Pedagogy Yields Chemistry Curriculum Transformation, Equity Awareness, and Community. J. Chem. Educ. 2021, 99, 291–300. [Google Scholar] [CrossRef]
  163. Kotsis, K. Bridging Pedagogical Gaps: How Teachers Can Use ChatGPT to Support Physics Experiments. Int. J. Adv. Multidiscip. Res. Stud. 2025, 5, 9–15. [Google Scholar] [CrossRef]
  164. Nawaz, S.; Alghamdi, E.; Srivastava, N.; Lodge, J.; Corrin, L. Understanding the Role of AI and Learning Analytics Techniques in Addressing Task Difficulties in STEM Education. In Artificial Intelligence in STEM Education; CRC Press: Boca Raton, FL, USA, 2022; pp. 241–258. [Google Scholar] [CrossRef]
  165. Zhang, Y.; Wang, P.; Jia, W.; Zhang, A.; Chen, G. Dynamic visualization by GeoGebra for mathematics learning: A meta-analysis of 20 years of research. J. Res. Technol. Educ. 2023, 57, 437–458. [Google Scholar] [CrossRef]
  166. Sommers, A.; White, H.; Dauer, J.; Forbes, C. Impacts of Faculty Development on Interdisciplinary Undergraduate Teaching and Learning in the Food-Energy-Water Nexus. J. Coll. Sci. Teach. 2022, 51, 66–74. [Google Scholar] [CrossRef]
  167. Lin, J. The CLEAR Framework to Implement Active Learning in STEM Education. In Proceedings of the 2021 IEEE International Conference on Engineering, Technology & Education (TALE), Wuhan, China, 5–8 December 2021. [Google Scholar] [CrossRef]
  168. Ayeni, O.; Unachukwu, C.; Osawaru, B.; Chisom, O.N.; Adewusi, O.E. Innovations in STEM education for students with disabilities: A critical examination. Int. J. Sci. Res. Arch. 2024, 11, 1797–1809. [Google Scholar] [CrossRef]
  169. Ankeny, C.; Mayled, L.; Ross, L.; Hjelmstad, K.D.; Krause, S.J.; Middleton, J.A.; Culbertson, R.J. Creating and Scaling an Evidence-Based Faculty Development Program. In Proceedings of the 2018 ASEE Annual Conference & Exposition, Salt Lake City, UT, USA, 23–27 June 2018. [Google Scholar] [CrossRef]
  170. Schnittka, C.; Turner, G.; Colvin, R.; Ewald, M.L. A State-Wide Professional Development Program in Engineering with Science and Math Teachers in Alabama: Fostering Conceptual Understandings of STEM. In Proceedings of the 2014 ASEE Annual Conference & Exposition, Indianapolis, IN, USA, 15–18 June 2014; pp. 1–24. [Google Scholar]
  171. Morales, G.; Noël, R. Work in Progress: Examining the Impact of a Faculty Development Program in Engineering Instructors’ Teaching Practices and Perceptions on Active Learning Methodologies. In Proceedings of the ASEE Annual Conference & Exposition, Baltimore, MD, USA, 25–28 June 2023. [Google Scholar] [CrossRef]
  172. Malavoloneque, G.; Costa, N. Physics Education and Sustainable Development: A Study of Energy in a Glocal Perspective in an Angolan Initial Teacher Education School. Front. Educ. 2022, 6, 639388. [Google Scholar] [CrossRef]
  173. Changkui, C. Applications of Large Multimodal Models (LMMs) in STEM Education: From Visual Explanations to Virtual Experiments. Artif. Intell. Educ. Stud. 2025, 1, 010201. [Google Scholar] [CrossRef]
  174. Van Dusen, B.; Nissen, J. Associations between learning assistants, passing introductory physics, and equity: A quantitative critical race theory investigation. Phys. Rev. Phys. Educ. Res. 2020, 16, 010117. [Google Scholar] [CrossRef]
  175. Bazelais, P.; Breuleux, A.; Doleck, T. Investigating a blended learning context that incorporates two-stage quizzes and peer formative feedback in STEM education. Knowl. Manag. E-Learn. Int. J. 2022, 14, 395–414. [Google Scholar] [CrossRef]
  176. Park, M. Effects of Simulation-based Formative Assessments on Students’ Conceptions in Physics. Eurasia J. Math. Sci. Technol. Educ. 2019, 15, 103586. [Google Scholar] [CrossRef]
  177. Espinosa, T.; Miller, K.; Araújo, I.; Mazur, E. Reducing the gender gap in students’ physics self-efficacy in a team- and project-based introductory physics class. Phys. Rev. Phys. Educ. Res. 2019, 15, 010132. [Google Scholar] [CrossRef]
  178. Williams, E.; Zwolak, J.; Dou, R.; Brewe, E. Linking engagement and performance: The social network analysis perspective. Phys. Rev. Phys. Educ. Res. 2019, 15, 020150. [Google Scholar] [CrossRef]
  179. Kuromiya, H.; Majumdar, R.; Ogata, H. Fostering Evidence-Based Education with Learning Analytics: Capturing Teaching-Learning Cases from Log Data; Kyoto University Research Information Repository (Kyoto University): Kyoto, Japan, 2020. [Google Scholar]
  180. Bradford, B.; Beier, M.; Oswald, F. A Meta-analysis of University STEM Summer Bridge Program Effectiveness. CBE Life Sci. Educ. 2021, 20, ar21. [Google Scholar] [CrossRef]
  181. Li, Y.; Singh, C. Sense of belonging is an important predictor of introductory physics students’ academic performance. Phys. Rev. Phys. Educ. Res. 2023, 19, 020137. [Google Scholar] [CrossRef]
  182. Dancy, M.; Lau, A.; Rundquist, A.; Henderson, C. Faculty online learning communities: A model for sustained teaching transformation. Phys. Rev. Phys. Educ. Res. 2019, 15, 020147. [Google Scholar] [CrossRef]
  183. Zabriskie, C.; Yang, J.; DeVore, S.; Stewart, J. Using machine learning to predict physics course outcomes. Phys. Rev. Phys. Educ. Res. 2019, 15, 020120. [Google Scholar] [CrossRef]
Figure 2. Research agenda for mechanism-informed, equity-sensitive scaling of active learning in university physics. Legend: The figure summarizes five priority directions: mechanism testing, fidelity coding, equity-focused moderation analyses, modality-sensitive design with intervention-coupled analytics, and competency-oriented long-horizon evaluation. The agenda links each direction to representative measures and reporting standards to support cumulative evidence building.
Table 1. Operational definitions and implementation heuristics of major active-learning formats in university physics.
Table columns: active-learning format (university physics); core functional components; typical implementation features (heuristics); scalable readiness assurance; in-class dominant task family; feedback routine; representative evidence (Ref#).

1. Flipped/blended readiness-based instruction
- Core functional components: readiness assurance; in-class sensemaking; feedback loops.
- Typical implementation features: pre-class bounded preparation aligned to class tasks; class time shifts to application and explanation.
- Scalable readiness assurance: low-stakes pre-class check (quiz/prompt) with rapid feedback; use results to target misconceptions.
- In-class dominant task family: concept questions; short problem segments; guided inquiry.
- Feedback routine: immediate checks + targeted clarification; revision tasks; selective grading.
- Representative evidence: [1,17,55,57].

2. Peer instruction/audience-response-supported discussion
- Core functional components: elicitation → peer discussion → revote; explanation norms.
- Typical implementation features: short conceptual prompts; structured discussion; focus on reasoning, not speed.
- Scalable readiness assurance: optional brief pre-class priming; in-class readiness via warm-up diagnostic.
- In-class dominant task family: conceptual conflict questions; reasoning comparison.
- Feedback routine: immediate feedback through polling + instructor synthesis; follow-up justification prompts.
- Representative evidence: [11,36,49].

3. Collaborative problem solving (CPS)/cooperative learning
- Core functional components: structured group work; shared artifacts; accountability; feedback.
- Typical implementation features: small groups with roles; multi-step problems; intermediate checkpoints.
- Scalable readiness assurance: short pre-task readiness prompt; role assignment and expectations.
- In-class dominant task family: multi-principle problem solving; whiteboarding/shared solutions.
- Feedback routine: TA/instructor roaming feedback; peer checking; brief whole-class debrief.
- Representative evidence: [19,62,68].

4. Inquiry-based/simulation-supported learning (e.g., PhET-guided)
- Core functional components: prediction–test–explain cycles; scaffolding; feedback.
- Typical implementation features: guided prompts; explicit representation linking; anomaly resolution.
- Scalable readiness assurance: pre-class conceptual priming or short simulation preview.
- In-class dominant task family: hypothesis testing; model evaluation; explanation writing.
- Feedback routine: prompted reflection + targeted feedback; revision of models/explanations.
- Representative evidence: [4,56,68].

5. Project-based/scenario-based modules (socio-scientific contexts)
- Core functional components: authentic tasks; collaboration; decision rationale; assessment alignment.
- Typical implementation features: short, bounded projects; staged milestones; rubric-based evaluation.
- Scalable readiness assurance: checkpoint submissions; readiness guides; exemplars.
- In-class dominant task family: decision memos; modeling tasks; argumentation.
- Feedback routine: milestone feedback; peer review with calibration; final rubric scoring.
- Representative evidence: [21,45,66].
Note: Implementation features are presented as design heuristics to preserve functional fidelity (readiness–sensemaking–feedback) and should be adapted to local constraints (class size, resources, modality). The evidence supporting each element varies in density and reporting quality; the tables summarize recurring elements, while Section 3, Section 4 and Section 5 indicate where evidence is comparatively denser versus thinner.
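The "scalable readiness assurance" elements in Table 1 share a common operational step: a low-stakes pre-class check whose item-level results steer which misconceptions receive class time. As a minimal illustrative sketch, not drawn from any of the reviewed studies, the following Python snippet flags readiness-check items whose error rate crosses a threshold; the data shape (`(student_id, item_id, is_correct)` tuples) and the 40% threshold are assumptions chosen for the example.

```python
# Illustrative sketch only: flag pre-class readiness-check items whose error
# rates exceed a threshold, so limited class time targets likely misconceptions.
# The tuple format and the 0.4 threshold are hypothetical, not from the review.
from collections import defaultdict

def item_error_rates(responses):
    """responses: iterable of (student_id, item_id, is_correct) tuples."""
    counts = defaultdict(lambda: [0, 0])  # item_id -> [errors, attempts]
    for _student, item, correct in responses:
        counts[item][1] += 1
        if not correct:
            counts[item][0] += 1
    return {item: errs / total for item, (errs, total) in counts.items()}

def items_to_target(responses, threshold=0.4):
    """Return item_ids at or above the error threshold, worst first."""
    rates = item_error_rates(responses)
    flagged = [item for item, rate in rates.items() if rate >= threshold]
    return sorted(flagged, key=lambda item: -rates[item])

responses = [
    ("s1", "q1", True), ("s2", "q1", True), ("s3", "q1", False),
    ("s1", "q2", False), ("s2", "q2", False), ("s3", "q2", True),
]
print(items_to_target(responses))  # q2 (2/3 error rate) is flagged; q1 (1/3) is not
```

A routine of this kind is what makes the readiness loop scalable: the instructor reviews a short ranked list rather than every response, consistent with the "rapid feedback" heuristic in the table.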
Table 2. Component-based taxonomy: operational definitions, minimal design heuristics, mechanisms, and representative evidence.
Component (What Is Implemented) | Operational Definition (What Students Do) | Design Heuristics/Minimal Implementation Features (Non-Prescriptive) | Primary Mechanisms Targeted (Why It Works) | Evidence Exemplars [Ref#]
Pre-class preparation (flipped readiness building) | Structured work completed before class to shift first exposure and low-order processing out of class; used to enable in-class application/explanation. | Bound workload (often ~20–30 min); in-class time predominantly active (often ≥60%); tightly couple pre-class tasks to in-class retrieval/application; provide low-bandwidth alternatives where access is constrained. | Cognitive: reduce extraneous load, prime prior knowledge. Affective: early mastery/competence signals. Behavioral: readiness for participation. | [1,17,54,55,57]
In-class social sensemaking routines (peer interaction + reasoning accountability) | Structured peer-mediated reasoning in class (e.g., CPS, whiteboarding, peer assessment) in which students explain, justify, critique, and revise ideas and solutions. | CPS: groups commonly 3–4; multi-principle tasks; role/norm structures for psychological safety. Peer assessment: explicit rubrics + brief calibration; instructor oversight (platform-supported if needed). Whiteboarding: externalize reasoning; allocate time for presentation/critique cycles. | Cognitive: conflict detection + representational integration. Affective: belonging/efficacy via safe participation. Behavioral: participation density + help-seeking. | [58,59,60,63,64]
Feedback & formative assessment architectures (immediate, delayed, meta-feedback) | Feedback cycles that diagnose misconceptions, trigger revision, and structure subsequent practice; includes immediate (e.g., clickers), delayed (homework/labs with revision), and meta-feedback (self-evaluation supports). | Immediate: diagnostic items + explanation prompts; follow with peer discussion where feasible. Delayed: diagnostic comments + required (not optional) revision. Meta-feedback: integrate rubrics/logs into the graded workflow; avoid overload in large classes; plan access pathways for tool-delivered feedback. | Cognitive: error-based revision. Affective: competence signals. Behavioral: sustained engagement + self-regulation. | [9,60,68]
Example-based learning (error examples & contrast examples) | Worked-example designs that externalize expert reasoning and reduce unproductive search; includes misconception-aligned error examples and feature-contrast examples requiring discrimination of critical cues. | Error examples: use common misconceptions; require learners to diagnose/explain. Contrast examples: paired problems differing in critical features; prompt learners to explain why solution paths diverge. Sequence/scaffold by prior knowledge; avoid overload in highly abstract content. | Cognitive: conflict monitoring (error) + feature discrimination (contrast) + schema refinement. Behavioral: guided explanation/checking routines. | [54,56,70]
Table 4. Design parameters and minimum implementation requirements for key components (including common failure modes and moderators).
Component (Section 3) | What It Is (Operational Definition) | Key Design Parameters (Implementation "Knobs") | Minimum Implementation Requirements (to Avoid Pseudo-Active Use) | Common Failure Modes (Why Effects Become Null/Negative) | Boundary Conditions/Moderators (When It Works Best) | Representative Evidence [Ref#]
Pre-class preparation (flipped readiness) | Structured work completed before class that shifts first exposure outside class and prepares students for in-class retrieval and application | Time-on-task bounded; alignment to in-class tasks; format choice (microlectures vs. simulations vs. generic videos) | Pre-class tasks must be required for in-class success (retrieval/apply), not optional "preview"; clear accountability (quiz/log) | Weak coupling between phases; passive viewing without prompts; access barriers (connectivity/devices) | Access/technology readiness; instructor workflow coherence; students' SRL readiness | [1,17,55,57]
Interactive simulations (e.g., PhET) as preparation or activity | Manipulable representations enabling variable control, prediction/testing, and immediate feedback | Prediction–test–explain prompts; guided exploration tasks; integration with MR translation | Students must generate predictions/explanations, not only "play"; prompts must target misconceptions/model structure | Novelty-only use; unguided exploration overload; tool use not linked to assessment or in-class sensemaking | Topic fit (abstract/visualizable phenomena); prior knowledge; access/technical readiness | [56,83]
Peer instruction/ConcepTests (often with ARS/clickers) | Short conceptual questions + vote + peer discussion + revote + explanation | Question quality (misconception-relevant); discussion time; explanation follow-up | Must include peer discussion + explanation, not vote-only; questions must represent conceptual nuance | Oversimplified items in highly abstract domains; no discussion; instructor skips explanation | Topic abstraction; class size (interaction bandwidth); instructor facilitation skill | [11,36,49]
Collaborative problem solving (CPS) | Small-group work on multi-step physics tasks requiring derivation/application/justification | Group size (small); task complexity (multi-principle); role structure; time allocation | Tasks must require justification and integration (not parallel individual work); roles/norms to ensure psychological safety | Social loafing; anxiety/participation threat; tasks too hard without scaffolds; "answer sharing" without reasoning | Topic difficulty; student anxiety/self-efficacy; instructor monitoring; classroom layout | [19,58]
Whiteboarding (public reasoning + critique) | Groups externalize solutions on shared boards, followed by presentation, critique, and feedback | Requirement to show steps + concepts; time for presentation/critique; facilitation moves | Must make reasoning publicly inspectable and include a critique/revision cycle; adequate time/materials | Insufficient time/materials; low-quality facilitation; large-class logistics prevent feedback cycles | Class size; room/resources; facilitation capacity; norms for respectful critique | [60,63,64]
Peer assessment (rubric-guided peer review) | Students evaluate peers' physics work using explicit criteria; benefits arise via calibration and metacognition | Rubric specificity (conceptual criteria); calibration session; number of reviews; instructor oversight | Must include criteria + calibration; prompts should require explanation of feedback | Inadequate domain expertise in abstract topics; unreliable ratings; low trust in peer feedback | Topic abstraction; student readiness; rubric quality; workload/time | [59,60]
Formative feedback architectures (immediate, delayed, meta-feedback) | Feedback loops that diagnose misconceptions and trigger revision (immediate via ARS; delayed via homework/labs; meta via self-assessment) | Feedback timing; explanation quality; revision requirement; integration into grading/workflow | Feedback must be actionable and linked to required (not optional) revision | Feedback given but no revision opportunity; delayed correction for novices; online access inequities | Prior knowledge; task type (conceptual vs. procedural/lab); access to tools; time constraints | [17,68]
Example-based learning (error + contrast examples) | Worked examples designed to elicit conflict (error examples) or discrimination of critical features (contrast examples) | Misconception-aligned errors; prompts to diagnose/explain; feature-explicit contrasts; sequencing | Students must explain their diagnosis/discrimination, not only read; demands calibrated to readiness | Overload in highly abstract content; too easy for experts; examples detached from practice tasks | Prior knowledge; task complexity; scaffolding; assessment alignment | [54,68]
Note: This table supports Section 5 and Section 7 (fidelity, failure modes, and sustainable scaling).
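The accountability requirement in the pre-class row above ("quiz/log") can be operationalized with a simple readiness check feeding the in-class "catch-up" buffer. The sketch below is a hypothetical illustration that assumes a learning platform exports per-student time-on-task and readiness-quiz scores; the record fields and thresholds are invented for the example and would need local calibration.

```python
from dataclasses import dataclass
from typing import Iterable, List

@dataclass
class PrepRecord:
    """One student's pre-class activity, as a platform export might provide it."""
    student_id: str
    minutes_on_task: float
    quiz_score: float  # readiness quiz, scaled 0..1

def flag_for_catch_up(records: Iterable[PrepRecord],
                      min_minutes: float = 10.0,
                      min_quiz: float = 0.5) -> List[str]:
    """Return IDs of students whose pre-class work suggests routing them to the
    in-class catch-up activity. Thresholds are illustrative, not evidence-based."""
    return [r.student_id for r in records
            if r.minutes_on_task < min_minutes or r.quiz_score < min_quiz]
```

A design point worth preserving from the table: such a flag should trigger support (a catch-up buffer, targeted prompts), not a penalty, so that access barriers do not translate into grade penalties.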
Table 5. Moderators of effectiveness and practical design responses for sustainable scaling in higher-education physics.
Moderator Domain | Moderator (What Varies) | Operational Indicator/Threshold (Design-Relevant) | Typical Risk If Not Engineered | Practical Design Response (What to Do) | Representative Evidence [Ref#]
Classroom feasibility | Class size/interaction bandwidth | Effects can attenuate as class size increases unless scalable interaction and feedback routines are in place. | Reduced feedback frequency; talk-time concentration; lower visibility of student thinking | Use scalable routines (ARS + peer discussion), structured small-group roles, tighter timeboxing; increase diagnostic sampling | [11,74,91]
Technology conditions | Access and reliability (internet/devices) | When reliable access to devices/internet is limited, flipped/online elements may underperform unless low-tech alternatives and access supports are provided. | Incomplete preparation; inequitable participation; negative effects under access constraints | Provide low-bandwidth alternatives (downloadable materials); keep a single workflow; build in-class "catch-up" buffers | [15,48,53]
Technology conditions | Redundancy/platform overload | Multiple unintegrated platforms increase navigation burden (extraneous load) | Lower engagement and continuity; "tool fatigue" | Consolidate tools; align each tool to a mechanism (visualization/feedback/interaction) | [2,95]
Technology conditions | Dosage of interactive use | Interactive tools often require roughly twice-weekly use for transfer to emerge (heuristic) | One-off novelty with minimal learning transfer | Plan repeated cycles (predict–test–explain; retrieve–apply–revise) | [56,96]
Implementation quality | Teacher fidelity (principle fidelity vs. scripting) | Fidelity predicts outcomes when defined as preserving explanation–feedback–revision mechanisms, not strict scripts | Over-scripting suppresses autonomy/exploration; under-fidelity leads to pseudo-active tasks | Observe enactment; coach facilitation moves; permit adaptive responsiveness while preserving mechanisms | [33]
Learner readiness | Prior conceptual knowledge | Higher prior conceptual knowledge can be associated with stronger gains on higher-order outcomes; novices often require additional scaffolding. | Novices overload; shallow participation; confusion in explanation-heavy tasks | Pre-teach prerequisite schemas; use worked/error examples; increase scaffolding and guided prompts | [33,34,54]
Learner readiness | Motivation/self-efficacy | Self-efficacy mediates engagement and can mediate gender gaps even with similar preparation | Withdrawal from participation; reduced persistence; inequitable gains | Low-stakes mastery cycles; psychological safety norms; structured participation (reduce volunteer bias) | [9,10,11]
Cognitive capacity | Working memory/cognitive resources | High-load activities (discussion + multi-step solving) disproportionately tax lower-WM students | Overload; unproductive search; disengagement | Use guided active learning; segment tasks; provide partial worked examples and checklists | [74,90]
Cognitive capacity | Spatial reasoning (visualization-intensive topics) | AR/VR benefits can be conditional; without scaffolding they may amplify gaps (e.g., ~20% advantage for higher spatial ability) | Technology helps some students while widening disparities | Add scaffolded manipulation prompts; pair MR translation tasks; avoid novelty-only VR | [93,101]
Design specification | Blended ratio (F2F vs. online) | A balanced mix of face-to-face and online elements is often more robust than extreme modality shifts; online-heavy designs generally increase SRL demands. | Reduced social presence; insufficient clarification for complex concepts | Preserve in-person sensemaking/labs; use online time for bounded preparation + practice; engineer interaction online | [102,104]
Design specification | Task difficulty/integration demand | When tasks require integration of multiple concepts, scaffolding and intermediate checkpoints become essential. | Either overload (too hard) or pseudo-active engagement (too easy) | Calibrate difficulty; use error/contrast examples under high difficulty; demand justification under low difficulty | [54,90,105]
Cultural/institutional context | Participation norms & local constraints | Peer critique may be less acceptable in some collectivist contexts; institutional ceilings (labs, bandwidth) constrain design options | Low-quality feedback; reduced participation; smaller effects | Use criterion-referenced rubrics; emphasize respectful critique routines; adapt tool choices to constraints | [17,91]
Note: Where exact numeric thresholds could not be verified against the cited sources, they are stated as qualitative heuristics.
Table 6. Competency–Activity–Assessment matrix mapping SDG-aligned competencies to physics active-learning design and evaluation tools.
Competency Cluster (SDG-Aligned) | Physics-Aligned Active-Learning Activities (Examples) | Primary Evidence/Learning Artifacts | Assessment Options (Scalable) | Feasibility & Scaling Notes
Critical thinking & evidence-based problem solving | Concept-question cycles with justification; multi-representation tasks; error/contrast example diagnosis | Justified solutions; representation translations; corrected error analyses | Reasoning rubrics; short explanation items; transfer problems; sampled grading | Use structured rubrics + selective sampling to manage workload; align exams to reasoning/transfer
Collaboration & communication | Role-based CPS; shared whiteboards/digital artifacts; structured peer feedback | Group solution artifacts; peer feedback records; brief reflective memos | Collaboration rubric; calibrated peer evaluation; artifact-based scoring | Define roles and accountability to protect equity; audit participation patterns in large classes
Responsible decision making under uncertainty (socio-scientific contexts) | Scenario-based problems (energy/climate/technology); short decision memos; debates with evidence | Decision memos; argument maps; presentations/posters | Memo/policy-brief rubric (assumptions, evidence, uncertainty, trade-offs); peer review + calibration | Keep tasks bounded (mini-projects); provide exemplars; integrate checkpoints for SRL support
Self-regulated learning (supporting competence across SDG tasks) | Readiness assurance routines; metacognitive checklists; revision assignments | Preparation logs; reflection prompts; revision submissions | SRL prompts scored with a light rubric; platform trace checks with targeted interventions | Essential in flipped/hybrid formats; provide low-tech alternatives and explicit guidance to reduce inequity
Note: This matrix is intended to be used together with the component–mechanism synthesis (Section 3 and Section 4) and moderator guidance (Section 5) to design competency-aligned learning that remains feasible at scale.
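Several rows above call for peer evaluation "with calibration." One minimal way to operationalize calibration, sketched below under assumptions not drawn from the cited sources, is to weight each reviewer by agreement with instructor scores on shared anchor artifacts and then aggregate peer ratings with those weights. The function names and the specific weighting formula are illustrative choices, not a prescribed method.

```python
from typing import Sequence

def reviewer_weight(reviewer_anchor_scores: Sequence[float],
                    instructor_anchor_scores: Sequence[float]) -> float:
    """Weight in (0, 1]: closer agreement with the instructor's scores on shared
    anchor artifacts yields a higher weight (1.0 = perfect agreement).
    Uses mean absolute deviation; other calibration formulas are possible."""
    if not reviewer_anchor_scores or len(reviewer_anchor_scores) != len(instructor_anchor_scores):
        raise ValueError("anchor score lists must be non-empty and of equal length")
    mad = sum(abs(a - b) for a, b in
              zip(reviewer_anchor_scores, instructor_anchor_scores)) / len(reviewer_anchor_scores)
    return 1.0 / (1.0 + mad)

def weighted_peer_score(peer_scores: Sequence[float],
                        weights: Sequence[float]) -> float:
    """Aggregate one artifact's peer ratings, weighting calibrated reviewers more."""
    total_w = sum(weights)
    if total_w == 0:
        raise ValueError("weights must not sum to zero")
    return sum(s * w for s, w in zip(peer_scores, weights)) / total_w
```

In use, an instructor scores two or three anchor artifacts, every reviewer scores the same anchors, and the resulting weights then down-weight poorly calibrated raters without discarding their feedback, which keeps the metacognitive benefit of reviewing while protecting grade reliability.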

Share and Cite

MDPI and ACS Style

Xiao, F.; Wang, C.; Jiang, J. Active Learning in University Physics for Sustainable Higher Education: Effective Components, Mechanisms, and SDG-Aligned Competency Pathways—A Multidimensional Review. Sustainability 2026, 18, 2791. https://doi.org/10.3390/su18062791

