Review Reports
- Sharon L. Burton
Reviewer 1: Anonymous; Reviewer 2: Anonymous; Reviewer 3: Hsin-Yuan Chen
Round 1
Reviewer 1 Report
Comments and Suggestions for Authors
The manuscript examines designing and implementing cyber security laboratories, using a real-world pilot case study as the empirical material. The topic is relevant, but I also have numerous concerns:
* The structure is very weird, making me suspect an undisclosed use of an
LLM. For instance, the literature review and conceptual framework in
Section 7 would typically precede the materials and methods Section 5.
* There is a striking hallucination in Fig. 1. The wrong label and
misspellings for the top rectangle are a telltale sign of hallucinations.
Furthermore, Google does not find anything about [47]; i.e., it does not
seem to exist on the Internet. There is also an empty reference [48],
which is also not referenced in the text.
* The reference [26] is impossible to find.
* Many references are misrepresented. For instance, [15] does not make the claimed argument about a limitation; in fact, it argues the opposite in general.
* The term andragogy should be explained. Upon checking [16], it simply
means adult learning, I think.
* The methodology is a little vaguely described. While I do not question the
adequacy of the methodology, I would suggest sharpening its description
rather than basically just saying that qualitative methods were used. For
instance, based on what I read in Section 5, it seems the methodology was
quite close to what is known as action research. Field methods and their methodologies could also be mentioned specifically.
* The methodology does not support the results in Section 10. In particular,
it is written that "learners consistently exhibited gains in post-training
evaluations, frequently surpassing their baseline performance by a
significant margin". How were the evaluations done? As I suspect some
quantification was present, a mixed methods setup would be needed.
* The introduction notes that laboratories support cross-disciplinary
collaboration. I don't see any evidence for this claim. It is
also mentioned in the results section; i.e., the "laboratory has catalyzed
interdisciplinary collaboration, promoting shared projects and research
initiatives that span departments and academic units [36]". Now, it is
*the* laboratory, the particular laboratory you are studying, so why on earth
is there a reference here? And upon checking [36], it does not say
anything about laboratories or cyber security. In fact, even the
words do not appear in it.
* It is noted that in-person training was delivered, but then only video
lectures are listed in Subsection 5.1.1. Which was it?
* The virtual practice environment needs to be explained better. It is
unclear to me now whether the laboratory is a physical or virtual space.
If it is a virtual space, I would argue that it should be compared to a
physical laboratory; i.e., this limitation (if so) should be acknowledged.
Author Response
The manuscript examines designing and implementing cyber security laboratories, using a real-world pilot case study as the empirical material. The topic is relevant, but I also have numerous concerns:
Comment 1: * The structure is very weird, making me suspect an undisclosed use of an
LLM. For instance, the literature review and conceptual framework in
Section 7 would typically precede the materials and methods Section 5.
Response 1: The template is my own and not one from an LLM. These types of statements should not appear in a professional review. I made the following changes.
Abstract
I removed the following: Using a qualitative, multi-stage case study approach, the research examined institutional practices, instructional methods, and technical considerations impacting lab development.
I removed the following: This information is followed by a review of the research gap, an outline of the qualitative methodology and design, an extensive literature review, development and critique of the conceptual framework, an original synthesis of best practices, and separate sections for the results, discussion, and conclusions.
I added this information: This information is followed by a review of the research gap, an outline of the mixed methods methodology and design, an extensive literature review, development and critique of the conceptual framework, an original synthesis of best practices, and separate sections for the results, discussion, and conclusions.
Materials and Methods Section
I added this information: This study employed a mixed-methods, multi-stage case study approach, incorporating elements of action research to support iterative improvement and learner participation throughout laboratory development. The methodology included (a) semi-structured interviews with facilitators and participants; (b) direct participant observation during lab sessions; (c) structured document analysis of training and session logs; and (d) administration and statistical analysis of standardized pre- and post-training assessments. This combined fieldwork and data collection design offered insight into lived experiences, institutional dynamics, and real-world effectiveness of the lab, supporting both meaning-making (qualitative) and outcome measurement (quantitative).
I removed the following: This research employed a qualitative inquiry to explore how cybersecurity laboratories can be deliberately designed, deployed, and refined within academic institutions. The aim was to interpret complex educational environments, examine institutional decision-making, and illuminate the lived experiences of facilitators and technical practitioners involved in laboratory design. Qualitative methods are particularly suited for studies that emphasize meaning-making, context-sensitivity, and the identification of complex process dynamics rather than quantifiable variables.
I added this information: Combining qualitative methods (i.e., meaning-making and context-sensitive exploration) with quantitative data collection (i.e., pre- and post-training assessments) permitted an inclusive evaluation of educational processes and measurable outcomes.
I removed the following: The multi-stage, qualitative case study approach selected for this research was optimal for capturing real-world conditions under which cybersecurity laboratories evolve. Data collection employed multiple methods, literature analysis, structured document review, and direct observation. This process ensured data credibility and a multidimensional comprehending of pedagogical, andragogical, technological, and organizational conditions.
I added this information: The multi-stage, mixed methods case study incorporated action research cycles encompassing planning, intervention, participant observation, structured reflection, and revision of lab processes. Fieldwork included semi-structured interviews, participant observation in live training sessions, and structured document review of curriculum materials and resource deployments. Quantitative learning outcome data was collected at multiple stages using pre- and post-training assessments, and analyzed via descriptive statistics and comparative methods. Together, these processes ensured research credibility and a multidimensional understanding of educational, technical, and organizational factors.
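To make the descriptive and comparative analysis named above concrete, the following is a minimal Python sketch of how pre- and post-training assessment scores might be summarized and compared; the score values are hypothetical placeholders, and the use of scipy's paired t-test is an illustrative assumption rather than the study's documented analysis pipeline.

```python
# Minimal sketch: descriptive statistics and a paired comparison of
# pre- and post-training assessment scores. All values below are
# hypothetical placeholders, not data from the study.
from statistics import mean, stdev
from scipy.stats import ttest_rel  # paired-samples t-test

pre_scores = [55, 62, 48, 70, 66, 59]    # hypothetical pre-training scores
post_scores = [72, 80, 65, 88, 81, 77]   # hypothetical post-training scores

# Descriptive statistics for each assessment stage
print(f"pre:  mean={mean(pre_scores):.1f}, sd={stdev(pre_scores):.1f}")
print(f"post: mean={mean(post_scores):.1f}, sd={stdev(post_scores):.1f}")

# Paired comparison: the same learners measured before and after training
t_stat, p_value = ttest_rel(post_scores, pre_scores)
print(f"paired t = {t_stat:.2f}, p = {p_value:.4f}")

# Per-learner relative gain, e.g., to verify a 20% improvement threshold
gains = [(post - pre) / pre for pre, post in zip(pre_scores, post_scores)]
print("all gains >= 20%:", all(g >= 0.20 for g in gains))
```

A paired test is sketched here because the same learners are measured twice; with small cohorts, a non-parametric alternative such as the Wilcoxon signed-rank test would be a comparable choice.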
I removed the following: In cybersecurity education, where curricula, infrastructure, policy, and learner behavior intersect, qualitative research offers the explanatory richness and depth that quantitative metrics may overlook.
I added this information: In cybersecurity education, where curricula, infrastructure, policy, and learner behavior intersect, a combination of field research methods (i.e., participant observation, interviews, and action cycles) offers explanatory richness and depth while quantitative metrics (test scores, completion rates) offer complementary evidence of measurable learning outcomes.
I removed the following: By foregrounding qualitative inquiry and a case-study design, this article ensures its findings are robust, relevant, and transferable to institutions seeking to enhance cybersecurity education.
I added this information: By employing a mixed methods inquiry built on action research principles, and emphasizing qualitative field methods and quantitative assessment within a case study structure, this article ensures its findings are relevant, robust, and transferable to organizations seeking to amplify cybersecurity education.
I removed the following information: The case study design supported detailed engagement with a specific educational setting [21]. It enabled the exploration of how cybersecurity labs were conceptualized and adapted in response to technical challenges, stakeholder needs, and institutional constraints. By examining design phases, implementation barriers, and user feedback loops, the research surfaced why and how certain practices succeeded or required adjustment, insights crucial for model replication and scaling. In addition, quantitative pre- and post-training assessments provided direct measures of skill and knowledge gains, complementing the qualitative insights with objective evidence of learning impact.
I added this information: The action-oriented case study approach permitted an all-encompassing engagement with the educational setting, incorporating real-time field observations, insightful sessions with participants, and regular cycles of evaluation and adjustment (action research). Design, implementation, and feedback phases were documented using observation protocols, session logs, and interview transcripts, with data thematically coded using qualitative data analysis software. Quantitative pre- and post-training assessments were administered and compared statistically, directly supporting evaluation of knowledge and skill acquisition.
I removed the following: The methodology combined qualitative sources to achieve a rich, contextualized perspective. Documents such as training materials, session logs, resource inventories, and curriculum guides were reviewed by the facilitator and researcher alongside observational data collected during pilot implementation cycles. Interviews with facilitators and technical staff offered additional perspectives on instructional design, platform functionality, and adaptation mechanisms. This multi-source triangulation revealed latent variables (i.e., such as communication challenges and adaptability under constraint) that influenced lab effectiveness.
I added this information: Integration of Multiple Data Sources: The methodology combined field methods (i.e., semi-structured interviews, direct participant observation, periodic facilitator focus groups, and document analysis of course materials and session logs) with quantitative assessment of pre- and post-training outcomes. Thematic coding of qualitative data and descriptive statistical analysis of learning assessments were triangulated to reveal both subtle and overt influences on laboratory effectiveness, improving the depth and reliability of findings.
I removed the following: This flexibility ensured responsiveness to unforeseen disruptions (e.g., infrastructure changes), providing relevant data on system resilience, instructional effectiveness, and the impact of real-world constraints.
I added this information: This flexibility ensured responsiveness to unforeseen disruptions (e.g., infrastructure changes), providing relevant qualitative and quantitative data on system resilience, instructional effectiveness, and the impact of real-world restrictions.
I added this section:
Detailed Data Collection Procedures and Analysis Protocols: The detailed data collection procedures and analysis protocols were incorporated as a distinct subsection within the Materials and Methods to enhance methodological transparency and rigor. Semi-structured interviews were scheduled with both facilitators and laboratory participants at three strategic points in each pilot cycle: before the start of the laboratory, at the mid-point, and upon completion. Interview guides solicited participant perspectives on the instructional value, the relevance of laboratory experiences, encountered challenges, and suggestions for improvement. In addition to interviews, focus groups were convened at the end of each pilot session to promote collaborative reflection and to gather group-level insights on laboratory processes and learning culture.
Throughout all laboratory sessions, participant observation was conducted using a standardized observation checklist. This instrument was designed to systematically capture participant engagement, collaborative dynamics, problem-solving strategies, and the efficacy of facilitator interventions. Observers recorded both quantifiable actions (such as the frequency of active participation and the resolution of technical tasks) and qualitative impressions of instructional resilience and learner adaptability. Session logs, resource inventories, and training artifacts were also collected for supplemental analysis.
Interview and focus group sessions were audio-recorded and transcribed verbatim. All qualitative data, including observation notes, were uploaded into qualitative analysis software for inductive coding. Thematic codes were iteratively developed to identify patterns relating to instructional effectiveness, laboratory adaptability, and emergent barriers. To ensure reliability, subsets of the qualitative data were double-coded by two independent researchers, and discrepancies were resolved through consensus meetings. Member checking was employed by sharing preliminary findings with participants to validate interpretations and incorporate their perspectives.
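To illustrate the double-coding reliability check described above, here is a minimal sketch that quantifies agreement between two coders with Cohen's kappa and flags segments for consensus discussion; the thematic labels and the use of scikit-learn are illustrative assumptions, not the actual codes or software used in the study.

```python
# Hypothetical illustration of a double-coding agreement check using
# Cohen's kappa; the labels are placeholders, not codes from the study.
from sklearn.metrics import cohen_kappa_score

# Thematic code assigned by each independent coder to the same ten segments
coder_a = ["engagement", "barrier", "adaptability", "engagement", "barrier",
           "adaptability", "engagement", "barrier", "engagement", "adaptability"]
coder_b = ["engagement", "barrier", "adaptability", "barrier", "barrier",
           "adaptability", "engagement", "engagement", "engagement", "adaptability"]

kappa = cohen_kappa_score(coder_a, coder_b)
print(f"Cohen's kappa = {kappa:.2f}")  # values near 1 indicate strong agreement

# Segments where the coders disagree would go to a consensus meeting
disagreements = [i for i, (a, b) in enumerate(zip(coder_a, coder_b)) if a != b]
print("segments needing consensus discussion:", disagreements)
```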
The study followed explicit action research cycles: after each laboratory pilot, data from interviews, focus groups, and observations were rapidly analyzed and shared in debriefing sessions with facilitators and key participants. Feedback and reflective input were then used to refine the laboratory structure, update instructional materials, and adjust both assessment tools and research protocols before the next pilot iteration. This systematic cycle of planning, action, observation, and reflection ensured that both the laboratory environment and the research design remained adaptive, stakeholder-responsive, and continuously improved in alignment with evolving educational and technological demands.
Research Gaps
I removed the following: Scholarly literature offers a patchwork of case studies and isolated technical interventions, but lacks a comprehensive, scalable model adaptable to diverse institutional contexts, and pedagogical objectives [15] and andragogical objectives [16].
I added this information: Scholarly literature offers a patchwork of case studies and isolated technical interventions but lacks comprehensive written analyses and scalable models adaptable to diverse institutional contexts, pedagogical and andragogical objectives [15,16].
Case Study
I removed the following: Pre-training and post-training assessments were used to measure both knowledge gains and broader learning outcomes, while session observations and facilitator reflections provided qualitative process data.
I added this information: Pre-training and post-training assessments were used to quantitatively measure knowledge increases and broader learning outcomes, while session observations and facilitator reflections offered qualitative process data, permitting a comprehensive mixed methods evaluation.
I removed the following: Assessment mechanisms included quizzes, demonstrations, and applied system interaction tasks designed to measure practical capability acquisition.
I added this information: Assessment mechanisms included quizzes, demonstrations, and applied system interaction tasks designed to quantitatively measure real-world ability attainment and offer numerical evidence of learning.
Results
I removed the following: Qualitative assessment findings indicated that learners consistently exhibited gains in post-training evaluations, frequently surpassing their baseline performance by a significant margin.
I added this information: Mixed methods assessment findings indicated that learners steadily showed numerically noteworthy gains in post-training evaluations, recurrently exceeding their baseline performance by a significant margin. Quantitative results confirmed at least a 20% enhancement in test scores for all participants, while qualitative data contextualized these enhancements through observed behaviors and participant feedback.
Limitations
I removed the following: Limitations of this work are primarily rooted in resource constraints and the challenge of keeping laboratory infrastructure current with technological advancements.
I added this information: Limitations of this work comprise resource constraints, challenges in keeping laboratory infrastructure current with technological advancements, and the limited numerical generalizability associated with the sample size and lack of a control group for the quantitative assessments.
The Modular Adaptive Cybersecurity Laboratory Image
Comment 2: * There is a striking hallucination in Fig. 1. The wrong label and misspellings for the top rectangle are a telltale sign of hallucinations. Furthermore, Google does not find anything about [47]; i.e., it does not seem to exist on the Internet. There is also an empty reference [48], which is also not referenced in the text.
Response 2: The image was updated. Regarding [47]: The Modular Adaptive Cybersecurity Laboratory Framework (MACLF) by Dr. S. L. Burton (2025) was created by me, Dr. Sharon L. Burton. It seems as though the paper was not read well. I have been doing this type of work for over 30 years.
References
Comment 3: * The reference [26] is impossible to find.
Response 3: Reference [26] can be found. Here is the link: https://centreforfacdev.ca/virtual-simulation-an-educators-toolkit/
Comment 4: * Many references are misrepresented. For instance, [15] does not make the claimed argument about a limitation; in fact, it argues the opposite in general.
Response 4: Reference [15] does not concern an argument or a limitation. None of these references are misrepresented; I have checked them all.
Introduction
Comment 5: * The term andragogy should be explained. Upon checking [16], it simply means adult learning, I think.
Response 5: I defined andragogy: Addressing this gap requires not just technological investment, but an andragogical (adult learning) vision, one that recognizes the lab as a mechanism for ongoing organizational learning.
Comment 6: * The methodology is a little vaguely described. While I do not question the adequacy of the methodology, I would suggest sharpening its description rather than basically just saying that qualitative methods were used. For instance, based on what I read in Section 5, it seems the methodology was quite close to what is known as action research. Field methods and their methodologies could also be mentioned specifically.
Response 6: Action research can be qualitative. I added more definition. My focus was not on action research; it was a multi-stage, qualitative case study, which is mentioned in the text.
Comment 7: * The methodology does not support the results in Section 10. In particular, it is written that "learners consistently exhibited gains in post-training evaluations, frequently surpassing their baseline performance by a significant margin". How were the evaluations done? As I suspect some quantification was present, a mixed methods setup would be needed.
Response 7: I updated the research from solely qualitative to a mixed methods approach, combining qualitative components (interviews, observations, document analysis) with quantitative components (pre- and post-test score analysis, descriptive statistics).
Comment 8: * The introduction notes that laboratories support cross-disciplinary collaboration. I don't see any evidence for this claim. It is also mentioned in the results section; i.e., the "laboratory has catalyzed interdisciplinary collaboration, promoting shared projects and research initiatives that span departments and academic units [36]". Now, it is *the* laboratory, the particular laboratory you are studying, so why on earth is there a reference here? And upon checking [36], it does not say anything about laboratories or cyber security. In fact, even the words do not appear in it.
Response 8: I changed the reference to [45] for the following sentence. According to this research team, virtual laboratories contribute to the improvement and progression of collaborative learning by incorporating mechanisms that facilitate communication and foster group awareness. Such environments permit consequential engagement between learners and instructors across diverse disciplines, thereby encouraging the co-construction of knowledge and collective problem-solving. Nonetheless, existing research underscores persistent limitations in the availability of structured tools to guide and regulate collaboration, highlighting critical directions for future investigation and design [45].
The following supporting paragraphs were added.
Institutional stakeholders reported ancillary benefits beyond student learning. The laboratory has catalyzed interdisciplinary collaboration, promoting shared projects and research initiatives that span departments and academic units [45]. According to this research team, virtual laboratories contribute to the improvement and progression of collaborative learning by incorporating mechanisms that facilitate communication and foster group awareness. Such environments permit consequential engagement between learners and instructors across diverse disciplines, thereby encouraging the co-construction of knowledge and collective problem-solving. Nonetheless, existing research underscores persistent limitations in the availability of structured tools to guide and regulate collaboration, highlighting critical directions for future investigation and design [45].
Another example is the work of the team that researched a multi-institutional cybersecurity lab platform designed for hands-on learning [46]. The work gives emphasis to interdisciplinary collaboration across IT, business, and e-commerce domains. It addresses the global cybersecurity workforce gap through outcome-based education and platform-driven training [46].
In 2025, Ohio University’s SecureAcademy Partnership exemplifies the integration of interdisciplinary collaboration and experiential learning within cybersecurity education [47]. Through its B.S. in Cybersecurity Operations program, the initiative merged technical, analytical, and operational competencies by engaging departments such as Communication, Business, and Engineering. The program featured immersive threat simulation laboratories, industry-recognized certifications, and gamified learning environments that mirror real-world scenarios. These elements were strategically designed to align with evolving industry expectations while fostering cross-disciplinary teaching and curricular innovation [47].
The Cybersecurity Field Training Experience (C-FTE), developed by the Commonwealth Cyber Initiative in 2025, addressed the critical shortage of cybersecurity professionals in Virginia, where over 51,000 job openings remain unfilled [48]. Recognizing that the current pipeline of degree-seeking candidates remains insufficient, the program strategically engaged students from nontraditional backgrounds (i.e., majors in IT, mathematics, and business) by equipping them with hands-on cybersecurity skills. The initiative pairs graduate student subject matter experts and project supervisors with undergraduate students, forming interdisciplinary teams that collaborate with industry and government partners. Graduate students continue to serve as technical mentors, guiding undergraduates through real-world project tasks while simultaneously cultivating their own leadership capabilities. This dual-impact model prepares undergraduates for entry-level cybersecurity roles and fosters the development of mid-level managerial competencies among graduate participants, thereby expanding and diversifying the cybersecurity workforce [48].
Comment 9: * It is noted that in-person training was delivered, but then only video lectures are listed in Subsection 5.1.1. Which was it?
Response 9: I explained the in-person training under 5.1, Case Studies. Directly below, I copied and pasted some of the documented information from the article. Please re-read the information.
A pilot laboratory initiative was conducted across five separate six-week sessions to evaluate design feasibility and instructional effectiveness. Each pilot enrolled 15 to 25 adult learners, ages 23 to 55, all of whom had prior technical or operational work experience (minimum three years) and represented diverse professional and vocational backgrounds. Pre-training and post-training assessments were used to quantitatively measure knowledge increases and broader learning outcomes, while session observations and facilitator reflections offered qualitative process data, permitting a comprehensive mixed methods evaluation. The in-person training delivered eight-hour sessions (with one-hour daily breaks) over multiple weeks, incorporating structured formative and summative assessments. Participants engaged in hands-on scenarios that involved system simulation, vulnerability testing, software use, classroom dialogue, and procedural walkthroughs [23].
Comment 10: * The virtual practice environment needs to be explained better. It is unclear to me now whether the laboratory is a physical or virtual space. If it is a virtual space, I would argue that it should be compared to a physical laboratory; i.e., this limitation (if so) should be acknowledged.
Response 10: I added this information to explain a sandbox / virtual practice environment. The sandbox, or virtual practice environment, refers to a computer-based simulation designed to facilitate the development and testing of specific knowledge, skills, and capabilities [51]. While sandbox environments have long been used in technical training and cybersecurity education, recent advancements have expanded their scope to include more complex, interdisciplinary competencies such as threat detection, ethical hacking, and strategic response planning. What distinguishes contemporary sandbox platforms is not their architecture but the evolving nature of the skills they assess, reflecting the dynamic demands of the cybersecurity workforce and the increasing emphasis on experiential learning [51].
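For readers unfamiliar with sandboxing, the following minimal sketch shows the general idea of an isolated, disposable practice container, using the Docker SDK for Python; the image name, resource limits, and tooling are illustrative assumptions and not the specific platform described in the article.

```python
# Minimal sketch of an isolated, disposable practice container of the
# kind a sandbox environment provides. The image and settings below are
# illustrative assumptions, not the platform used in the study.
import docker

client = docker.from_env()

# Run a throwaway container with no network access, so exercises such as
# vulnerability testing cannot reach production systems.
sandbox = client.containers.run(
    "ubuntu:22.04",           # hypothetical base image for the exercise
    command="sleep infinity", # keep the container alive for the session
    detach=True,
    network_mode="none",      # fully isolated from any network
    mem_limit="512m",         # cap resources per learner
    auto_remove=True,         # discard all state when the session ends
)

# A learner (or grader) executes exercise commands inside the sandbox
exit_code, output = sandbox.exec_run("uname -a")
print(output.decode())

sandbox.kill()  # tear down; auto_remove discards the container
```

The design point is isolation plus disposability: each learner gets a clean environment, and no exercise state persists between sessions.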
Reviewer 2 Report
Comments and Suggestions for Authors
This paper presents a modular framework for the design, implementation, and evaluation of cybersecurity laboratories in higher education. It responds to a well-documented gap between theoretical cybersecurity education and practical workforce readiness by proposing the Modular Adaptive Cybersecurity Laboratory Framework (MACLF). Grounded in experiential learning theory and supported by a multi-stage qualitative case study, the paper synthesizes institutional, pedagogical, and technological best practices to offer a scalable, flexible approach to lab design. It provides empirical findings from five pilot sessions involving adult learners, demonstrating measurable skill improvement, adaptability to infrastructural challenges, and institutional benefits such as interdisciplinary collaboration and curricular alignment.
However, several critical issues must be addressed before the manuscript can be considered for publication:
- Figure 1 (MACLF) is mentioned but not described in detail. Ensure the visual is clearly labeled, well-integrated into the text, and explained so that readers unfamiliar with the model can fully understand its structure and use.
- Phrases such as “marked improvement,” “heightened enthusiasm,” or “institutional synergy” require substantiation. Clarify what indicators were used to assess student engagement and institutional benefit. Were there surveys, interviews, or follow-ups with stakeholders?
- For applications of Cybersecurity Laboratory Design, some ideas in these articles can be used as references, including Real-time tracker of chicken for poultry based on attention mechanism-enhanced YOLO-Chicken algorithm and Individualizing cybersecurity lab exercises with labtainers. Note that if you refer to relevant literature, please add citations appropriately.
- The article contains too many chapters. Please merge them as appropriate. Usually 5-6 chapters are appropriate.
- The last reference is missing.
Author Response
This paper presents a modular framework for the design, implementation, and evaluation of
cybersecurity laboratories in higher education. It responds to a well-documented gap between
theoretical cybersecurity education and practical workforce readiness by proposing the Modular
Adaptive Cybersecurity Laboratory Framework (MACLF). Grounded in experiential learning theory and supported by a multi-stage qualitative case study, the paper synthesizes institutional,
pedagogical, and technological best practices to offer a scalable, flexible approach to lab design.
It provides empirical findings from five pilot sessions involving adult learners, demonstrating
measurable skill improvement, adaptability to infrastructural challenges, and institutional benefits such as interdisciplinary collaboration and curricular alignment.
However, several critical issues must be addressed before the manuscript can be considered for
publication:
Reviewer 2, Comment 1: Figure 1 (MACLF) is mentioned but not described in detail. Ensure the visual is clearly labeled, well-integrated into the text, and explained so that readers unfamiliar with the model can fully understand its structure and use.
Author’s Response: The article addresses Reviewer 2’s comment by introducing the Modular Adaptive Cybersecurity Laboratory Framework (MACLF) in the conceptual framework section. Figure 1 is mentioned in the section introducing the conceptual framework with language such as, “See Figure 1 for the framework.” I added the following information.
Figure 1 visually represents the MACLF, a cyclical and feedback-driven model designed to guide the development and continuous improvement of academic cybersecurity laboratories. At the core of MACLF are four interrelated stages, adapted from Kolb’s experiential learning cycle: concrete experience, reflective observation, abstract conceptualization, and active experimentation [51]. These stages are depicted as a continuous loop, underscoring that effective laboratory practice is inherently iterative; each learning cycle generates insights that inform the next phase of design or instruction.
Each part of the model plays a critical role in the instructional process. Concrete experience involves learners’ direct engagement with hands-on lab activities, supported by modular technologies and adaptable environments. Reflective observation encourages participants and facilitators to systematically review these experiences, identifying strengths and areas for growth. Abstract conceptualization enables the synthesis of observations into updated curricular strategies or lab protocols. Finally, active experimentation puts these new concepts into practice through amended activities, scenarios, or assessments, closing the feedback loop and launching a new cycle of enhancement. The framework also visually integrates feedback channels (i.e., assessment results, participant feedback, and technological advancements) that inform each stage, ensuring the laboratory adapts to evolving threats and educational goals in a structured yet responsive manner.
For practitioners and institutions, MACLF’s modular approach allows for incremental implementation and adaptation, whether expanding lab capacity, integrating new technologies, or updating general education objectives. By plainly depicting the cyclical flow and feedback mechanisms, Figure 1 ensures that readers comprehend how all components collaboratively drive sustainable and effective laboratory innovation.
Reviewer 2, Comment 2: Phrases such as “marked improvement,” “heightened enthusiasm,” or “institutional synergy” require substantiation. Clarify what indicators were used to assess student engagement and institutional benefit. Were there surveys, interviews, or follow-ups with stakeholders?
Author’s Response: The following passage contains the phrasing the reviewer flagged: Implementation of the proposed cybersecurity laboratory framework has yielded a constellation of concrete and meaningful outcomes for academic institutions. Foremost among these was a marked increase in student engagement, observed through robust participation in experiential exercises, heightened enthusiasm for solving complex cybersecurity challenges, and a demonstrable improvement in practical competencies.
The phrase “marked increase” was addressed with the following: Mixed methods assessment findings indicated that learners steadily showed numerically noteworthy gains in post-training evaluations, recurrently exceeding their baseline performance by a significant margin. Quantitative results confirmed at least a 20% enhancement in test scores for all participants, while qualitative data contextualized these enhancements through observed behaviors and participant feedback.
The phrase “heightened enthusiasm” was addressed with the following: Foremost among these was a noticeable increase in student engagement, evidenced by robust participation in experiential exercises, increased voluntary involvement in advanced problem-solving tasks, and quantifiable improvement in practical competencies as measured by pre- and post-training assessments, facilitator observation logs, and post-session surveys.
Reviewer 2, Comment 3: For applications of Cybersecurity Laboratory Design, some ideas in these articles can be used as references, including Real-time tracker of chicken for poultry based on attention mechanism-enhanced YOLO-Chicken algorithm and Individualizing cybersecurity lab exercises with labtainers. Note that if you refer to relevant literature, please add citations appropriately.
Author’s Response: Thank you for your comment. I appreciate your feedback. The request is not clear. I would welcome further clarification so I can address it as fully as possible.
Reviewer 2, Comment 4: The article contains too many chapters. Please merge them as appropriate. Usually 5-6 chapters are appropriate.
Author’s Response: I partially satisfied the reviewer’s request by making Section 8, Conceptual Framework Critiques, into Subsection 7.1, and Section 10, Discussion, into Subsection 9.1. Keep in mind that another reviewer asked me to add the heading Materials and Methods.
Note that I reviewed numerous journals and found these headings to be standard.
Reviewer 2, Comment 5: The last reference is missing.
Author’s Response: All references are listed.
Reviewer 3 Report
Comments and Suggestions for Authors
The topic is timely and highly relevant to cybersecurity education. The manuscript attempts to offer a modular and scalable framework for lab development, drawing from experiential learning theory and case-based qualitative data.
(1) The repeated pairing of “pedagogical and andragogical” throughout the manuscript borders on excessive, often in places where one term would suffice. Be more selective. Clarify where adult learning principles are truly distinct from general educational strategies. Otherwise, simplify by using “instructional design” or “learning principles” where appropriate.
(2) While Kolb’s Experiential Learning Theory is introduced, the manuscript lacks critical application of this framework beyond summary. Figure 1 is referenced but not described in sufficient detail, and its visual or structural contribution to the framework is vague. Deepen the analysis. Clearly map each component of the lab framework to the Kolb cycle. Improve the figure and its explanatory text to strengthen the conceptual clarity.
(3) The case study is based on a single institution and one facilitator, which weakens the transferability of the conclusions. The manuscript does not sufficiently address this limitation or discuss how contextual factors might influence replicability. Clearly state these limitations and discuss possible differences in implementation across various higher education environments. Suggest directions for broader application or future comparative studies.
(4) The manuscript frequently cites “measurable gains,” “increased engagement,” or “improved competencies” without providing numerical or statistical evidence beyond anecdotal observation. Include clearer data presentation. Even within qualitative design, limited quantification can improve rigor.
(5) Revise for conciseness and clarity. Aim for precision over repetition. Avoid restating the same ideas in multiple sections.
(6) The “Originality of the Text” section makes ambitious claims but doesn’t adequately contrast this model with existing frameworks or tools such as NICE, cyber ranges, or Labtainers. Explicitly compare the MACLF with at least one or two existing models in the literature. Clarify what is actually new: modularity? integration of Kolb? facilitation protocol?
(7) Several citations (e.g., [16], [47]) are referenced repeatedly without adding new substance.
(8) The modular, scalable design approach is relevant and valuable for practical implementation. The connection to experiential learning is appropriate and strengthens educational value.
Comments on the Quality of English Language
The modular, scalable design approach is relevant and valuable for practical implementation. The connection to experiential learning is appropriate and strengthens educational value. The English could be improved to more clearly express the research.
Author Response
The topic is timely and highly relevant to cybersecurity education. The manuscript attempts to offer a modular and scalable framework for lab development, drawing from experiential learning theory and case-based qualitative data.
Reviewer 3, Comment 1: The repeated pairing of “pedagogical and andragogical” throughout the manuscript borders on excessive, often in places where one term would suffice. Be more selective. Clarify where adult learning principles are truly distinct from general educational strategies. Otherwise, simplify by using “instructional design” or “learning principles” where appropriate.
Author’s Response: I removed the following: This study aims to fill this gap by proposing a comprehensive model for lab construction and management that emphasizes adaptability, scalability, and integration with pedagogical plus andragogical goals.
I replaced the sentence with this information to add clarity. This study aims to fill the existing gap by proposing a comprehensive model for laboratory construction and management that emphasizes adaptability, scalability, and integration with instructional design strategies tailored to general educational (pedagogical) and adult learning (andragogical) goals. Specifically, pedagogical goals focus on structured, teacher-directed activities suitable for traditional undergraduate learners, while andragogical goals incorporate problem-centered, experience-driven principles essential for self-directed adult learners. By distinguishing between these approaches, the framework ensures that laboratory experiences are relevant and practical for a diverse student population.
Author’s Response: Old sentence: Scholarly literature offers a patchwork of case studies and isolated technical interventions but lacks comprehensive written analyses and scalable models adaptable to diverse institutional contexts, pedagogical and andragogical objectives [15, 16].
New sentence: Scholarly literature offers a patchwork of case studies and isolated technical interventions but lacks comprehensive written analyses and scalable models adaptable to diverse institutional contexts, general education and adult learning objectives [15, 16].
Author’s Response: Where applicable, I changed pedagogical to general education, and changed andragogical to adult learning.
Reviewer 3, Comment 2: While Kolb’s Experiential Learning Theory is introduced, the manuscript lacks critical application of this framework beyond summary. Figure 1 is referenced but not described in sufficient detail, and its visual or structural contribution to the framework is vague. Deepen the analysis. Clearly map each component of the lab framework to the Kolb cycle. Improve the figure and its explanatory text to strengthen the conceptual clarity.
Author’s Response: The following information was added in the Conceptual frameworks section. To deepen the application of Kolb’s Experiential Learning Theory within this cybersecurity laboratory framework, each stage of the Kolb cycle directly informs corresponding elements and processes of lab design and facilitation.
- Concrete Experience: In the MACLF, this stage is embodied by structured, hands-on laboratory activities and simulations that immerse participants in realistic cybersecurity scenarios. Learners engage in live system manipulation, guided attack-and-defense exercises, and immediate practical problem-solving.
- Reflective Observation: Following experiential activities, students and facilitators participate in debrief sessions, reflective journaling, and guided discussions. This stage is captured in the framework as scheduled feedback moments and analysis of individual and group performance, fostering critical examination of decisions, strategies, and outcomes.
- Abstract Conceptualization: Insights gained from reflection are systematically analyzed to generate actionable takeaways. In the MACLF, this means integrating learners’ reflections and assessment data into revised lab protocols, frameworks, and instructional strategies. This is operationalized through collaborative curriculum review, thematic data analysis, and continuous adaptation of learning objectives.
- Active Experimentation: Guided by what was learned and conceptualized, facilitators and students iteratively revise and apply new strategies in subsequent laboratory sessions. The framework channels these cycles into continual prototyping, testing of new tools or scenarios, and iterative improvement loops documented within the lab’s operational processes.
This explicit mapping ensures that the laboratory model functions as an ongoing learning cycle, with each session not only imparting technical skills but also cultivating adaptive thinking, self-assessment, and the capacity to integrate lessons learned into future practice. The cyclical and feedback-driven structure of Figure 1 visually and operationally mirrors Kolb’s theory, highlighting the centrality of iterative improvement and evidence-based evolution in the laboratory environment.
The Figure 1 caption was changed from: Figure 1: Modular, adaptive framework for academic cybersecurity laboratory design, highlighting core components and dynamic feedback integration.
The Figure 1 caption was changed to: Modular Adaptive Cybersecurity Laboratory Framework (MACLF) mapped to Kolb’s Experiential Learning Cycle. Each quadrant corresponds to one of Kolb’s four stages (Concrete Experience, Reflective Observation, Abstract Conceptualization, Active Experimentation), systematically guiding lab activities and continuous curricular refinement.
The figure was updated.
Reviewer 3, Comment 3: The case study is based on a single institution and one facilitator, which weakens the transferability of the conclusions. The manuscript does not sufficiently address this limitation or discuss how contextual factors might influence replicability. Clearly state these limitations and discuss possible differences in implementation across various higher education environments. Suggest directions for broader application or future comparative studies.
Reviewer 3, Comment 4: The manuscript frequently cites “measurable gains,” “increased engagement,” or “improved competencies” without providing numerical or statistical evidence beyond anecdotal observation. Include clearer data presentation. Even within qualitative design, limited quantification can improve rigor.
Author’s Response: The phrase measurable gains is supported by a minimum of 20%. This phrase was added to the abstract and is in the article.
Author’s Response: The phrase high engagement levels is supported by a minimum of 85%. This percentage was added to the abstract and is in the article.
Reviewer 3, Comment 5: Revise for conciseness and clarity. Aim for precision over repetition. Avoid restating the same ideas in multiple sections.
Author’s Response: Updates were made.
Reviewer 3, Comment 6: The “Originality of the Text” section makes ambitious claims but doesn’t adequately contrast this model with existing frameworks or tools such as NICE, cyber ranges, or Labtainers. Explicitly compare the MACLF with at least one or two existing models in the literature. Clarify what is actually new: modularity? integration of Kolb? facilitation protocol?
Author’s Response: This information was added to the Originality of the Text section.
Labtainers deliver containerized cybersecurity labs focused on technical skill acquisition through parameterized scenarios. While effective for hands-on learning, they lack explicit alignment with adult learning theory or built-in curriculum integration. Traditional cyber ranges emphasize simulation fidelity and scalable network environments, often serving as standalone platforms for assessment and training. However, they typically do not embed feedback mechanisms, facilitator protocols, or direct integration with institutional learning objectives. The proposed MACLF is original in several respects:
- Modularity: It enables incremental lab construction and revision, allowing for flexible adaptation to technological advances and evolving educational needs.
- Integration of Kolb’s Experiential Learning Cycle: Each module deliberately maps core lab activities to Kolb’s four stages—concrete experience, reflective observation, abstract conceptualization, and active experimentation—embedding a continuous, evidence-based feedback loop into lab operations and curriculum design.
- Facilitation Protocol: MACLF incorporates a formal facilitation protocol that supports ongoing professional development, collaborative curriculum review, and stakeholder engagement, which are typically absent in NICE, Labtainers, and most cyber range implementations.
By explicitly contrasting these elements and situating MACLF within recent literature, the research demonstrates that the originality of the model lies in bridging pedagogical and andragogical best practices with technical and institutional scalability, thereby advancing practical, sustainable, and curriculum-integrated cybersecurity laboratory design.
Reviewer 3, Comment 7: Several citations (e.g., [16], [47]) are referenced repeatedly without adding new substance.
Author’s Response: Each repeated reference was used because it was appropriate for that area.
Reviewer 3, Comment 8: The modular, scalable design approach is relevant and valuable for practical implementation. The connection to experiential learning is appropriate and strengthens educational value.
Round 2
Reviewer 1 Report
Comments and Suggestions for Authors
The manuscript did improve substantially. Given the points about likely LLM use in the previous review round, I wonder what the reason for it was. Why use hallucinating models when knowing the risks involved? For now, I can only trust that the references and referencing are correct as I do not have time to verify them. Otherwise, it looks good.
Author Response
Reviewer’s Comments: The manuscript did improve substantially. Given the points about likely LLM use in the previous review round, I wonder what the reason for it was. Why use hallucinating models when knowing the risks involved?
Author’s Response: I appreciate the reviewer’s comments. Allow me to clarify: this manuscript does not discuss or utilize hallucinating AI models. This researcher’s article mentions simulation-based learning environments and virtual laboratory platforms. These two are different from AI language models that show hallucination behaviors. The sandbox environments and virtualization technologies I mentioned are controlled, educational simulations devised for cybersecurity training; they are not generative AI models that produce probabilistic outputs (in other words, sampling from a distribution, allowing for variability and creativity). My focus is on structured and controlled environments for learning. Generative AI models, which are unpredictable, are not my focus.
Reviewer’s Comments: For now, I can only trust that the references and referencing are correct as I do not have time to verify them. Otherwise, it looks good.
Author’s Response: The references have been re-checked and are accurate.
Reviewer 2 Report
Comments and Suggestions for Authors
My concerns have been addressed, and I appreciate the authors' careful attention to my previous comments and the substantial efforts put forth in revising the manuscript. I am pleased with the updated version, which now provides comprehensive insights and valuable conclusions.
Author Response
My concerns have been addressed, and I appreciate the authors' careful attention to my previous comments and the substantial efforts put forth in revising the manuscript. I am pleased with the updated version, which now provides comprehensive insights and valuable conclusions.
Author’s Response: Thank you for this message regarding the updates for this manuscript.
Reviewer 3 Report
Comments and Suggestions for Authors
The manuscript is a timely and relevant contribution to cybersecurity education, proposing a modular and scalable lab framework with good potential impact. Overall, the work is promising, but a few areas would benefit from refinement before publication:
(1) The phrase pedagogical and andragogical is used too often. It would help to simplify the wording and only highlight adult learning principles when they are truly distinct. Streamlining the language for conciseness and clarity will also improve the flow.
(2) Kolb’s Experiential Learning Theory is introduced, but the application could be made more explicit. In particular, Figure 1 would be stronger if each element of the framework was clearly mapped to Kolb’s cycle and described in more detail.
(3) Since the study is based on one institution and one facilitator, it would be good to acknowledge this limitation and briefly discuss how the framework might look in different contexts.
(4) Phrases like measurable gains and improved competencies should be backed up with a bit more data or clearer examples, even if limited. This will add credibility.
(5) The section on originality could be sharpened by contrasting this model with existing ones and pointing out what’s truly new, whether it’s the modularity, the integration of Kolb, or the facilitation process. Also, avoid repeating the same citations without adding fresh points.
Comments on the Quality of English Language
The English in the manuscript is generally understandable but could be improved to more clearly and precisely express the research. At times, sentences are overly long or repetitive, which may reduce readability. Refining the grammar, simplifying complex phrasing, and ensuring consistent use of technical terms would help the arguments flow more smoothly. A careful language edit would enhance clarity, making the theoretical framework, methodology, and findings easier for readers to follow.
Author Response
The manuscript is a timely and relevant contribution to cybersecurity education, proposing a modular and scalable lab framework with good potential impact. Overall, the work is promising, but a few areas would benefit from refinement before publication:
(1) The phrase pedagogical and andragogical is used too often. It would help to simplify the wording and only highlight adult learning principles when they are truly distinct. Streamlining the language for conciseness and clarity will also improve the flow.
Author’s Response: This researcher changed most of the phrasing regarding pedagogical and andragogical methodologies. I have further reduced the phrasing from 10 occurrences to two. The phrase “pedagogical and andragogical” appears in the Abstract section. “Pedagogy and andragogy” appear in the Research Gaps section.
Old sentence (Background Section): This study aims to fill this gap by proposing a comprehensive model for lab construction and management that emphasizes adaptability, scalability, and integration with Structured, teacher-guided instructional approaches plus adult learning goals.
Updated sentence (Background Section): This study aims to fill this gap by proposing a comprehensive model for lab construction and management that emphasizes adaptability, scalability, and integration with comprehensive learning objectives.
Old sentence (Research Gaps): Scholarly literature offers a patchwork of case studies and isolated technical interventions but lacks comprehensive written analyses and scalable models adaptable to diverse institutional contexts, pedagogical, and andragogical objectives [15, 16].
Updated sentence (Research Gaps): Scholarly literature offers a patchwork of case studies and isolated technical interventions but lacks comprehensive written analyses and scalable models adaptable to diverse institutional and educational contexts [15, 16].
Literature Review:
Old sentence (Evolution of Cybersecurity Education): As a response to these deficiencies, the introduction of laboratory environments has become an increasingly vital pedagogical and andragogical strategy.
Updated sentence (Evolution of Cybersecurity Education): As a response to these deficiencies, the introduction of laboratory environments has become an increasingly vital learning methodology.
Literature Review: Multiple instances in sections 6.1 and 6.2
Old sentence (Literature Review): 6.1 Virtualization stands out as a cornerstone of contemporary lab design, offering dynamic network simulations without the cost and rigidity of physical infrastructure. Leveraging virtual machines and containerized environments enables the recreation of complex cyber ecosystems using minimal hardware, thereby maximizing resource efficiency plus pedagogical and andragogical relevance [28].
Updated sentence (Literature Review): 6.1 Virtualization stands out as a cornerstone of contemporary lab design, offering dynamic network simulations without the cost and rigidity of physical infrastructure. Leveraging virtual machines and containerized environments enables the recreation of complex cyber ecosystems using minimal hardware, thereby maximizing resource efficiency and educational effectiveness [28].
Old sentence (Literature Review): 6.2 A practical cybersecurity laboratory does not function in isolation; its value emerges through intentional alignment with curricular outcomes plus pedagogical and andragogical design.
Updated sentence (Literature Review): 6.2 A practical cybersecurity laboratory does not function in isolation; its value emerges through intentional alignment with curricular outcomes and comprehensive instructional design.
Old sentence (Discussion): It could encourage the breakdown of silos, facilitating collaborative research projects, knowledge exchange, and shared pedagogical and andragogical strategies across departments.
Updated sentence (Discussion): It could encourage the breakdown of silos, facilitating collaborative research projects, knowledge exchange, and shared instructional approaches across departments.
Old sentence (Results): These findings contributed to a deeper comprehension of how pedagogical and andragogical intent, technical infrastructure, and institutional dynamics interact to influence the effectiveness and sustainability of cybersecurity laboratory environments.
Updated sentence (Results): These findings contributed to a deeper comprehension of how educational design, technical infrastructure, and institutional dynamics interact to influence the effectiveness and sustainability of cybersecurity laboratory environments.
Old sentence (Conceptual Framework): This framework, created by Dr. S. L. Burton [47], incorporates modular design, virtualization, and continuous feedback, ensuring that the lab remains responsive to technological advancements as well as pedagogical and andragogical needs.
Updated sentence (Conceptual Framework): This framework, created by Dr. S. L. Burton [47], incorporates modular design, virtualization, and continuous feedback, ensuring that the lab remains responsive to technological advancements as well as evolving learning requirements.
Old sentence (Originality of the Text): For example, comprehensive surveys and guidance documents highlight the significance of modular, scalable lab environments that support hands-on, experiential learning and align with current industry standards, and pedagogical and andragogical objectives [18, 37].
Updated sentence (Originality of the Text): For example, comprehensive surveys and guidance documents highlight the significance of modular, scalable lab environments that support hands-on, experiential learning and align with current industry standards, and instructional frameworks [18, 37].
Reviewer’s Comments: (2) Kolb’s Experiential Learning Theory is introduced, but the application could be made more explicit. In particular, Figure 1 would be stronger if each element of the framework was clearly mapped to Kolb’s cycle and described in more detail.
Author’s Response (Conceptual Framework): I added the following: Each element of the modular adaptive framework directly corresponds to Kolb’s experiential learning cycle. See Table 1 for the comparison.
Reviewer’s Comments: (3) Since the study is based on one institution and one facilitator, it would be good to acknowledge this limitation and briefly discuss how the framework might look in different contexts.
Author’s Response: This information was added to the section, Limitations.
The MACLF framework development and evaluation was conducted within a single institutional context with one primary facilitator, which may limit the generalizability of findings across diverse educational settings. Institutions with varying resource levels, technological infrastructure, or facilitator expertise may experience different implementation outcomes. Additionally, cultural, organizational, and instructional methodology differences across institutions could influence the framework's effectiveness and require contextual adaptations.
Reviewer’s Comments: (4) Phrases like measurable gains and improved competencies should be backed up with a bit more data or clearer examples, even if limited. This will add credibility.
Author’s Response:
Current sentence (Abstract): Case study data revealed measurable gains in participant competency, high engagement levels, and successful adaptation to logistical and technological barriers.
Updated Sentence (Abstract): Case study data revealed measurable gains in participant competency, with all participants achieving at least a 20% improvement in post-training test scores, high engagement levels demonstrated through consistent session attendance and active participation in hands-on exercises, and successful adaptation to logistical and technological barriers, including facility relocations and system downtime incidents.
The phrase “at least a 20% improvement” explains the two other areas that mention measurable gains.
Reviewer’s Comments: (5) The section on originality could be sharpened by contrasting this model with existing ones and pointing out what’s truly new-whether it’s the modularity, the integration of Kolb, or the facilitation process. Also, avoid repeating the same citations without adding fresh points.
Author’s Response: Several paragraphs were added to include two new references [51] and [52]. One reference [9] was used in a different manner.
This is the newly added information – Originality of the Text.
For example, a research team [9] offered a critique of cybersecurity labs that emphasize technical depth (i.e., the use of virtual machines and attack-defense scenarios) without sufficient attention to instructional design. They argue that such labs often prioritize high-fidelity simulations at the expense of learning structure, resulting in environments that replicate real-world complexity without effectively supporting learner engagement.
A separate example of an infrastructure-heavy approach can be found in the 2023 research by [52]. This initiative centers on simulating cyber-attacks on smart grids using virtual environments, with a strong emphasis on technical fidelity and system realism. The lab was designed to support various training and experimental scenarios through virtual machines and network configurations. However, the model offered limited integration with pedagogical frameworks or learner-centered design, reinforcing concerns that technically robust environments may lack the instructional scaffolding necessary to support diverse educational outcomes. As such, it exemplifies an isolated technical solution, prioritizing simulation capabilities while underemphasizing educational adaptability and engagement.
In 2025, the research team [51] examined how cybersecurity frameworks have been implemented across educational institutions, supporting a tailored security model. Their analysis reveals that many existing frameworks prioritize technical controls while overlooking educational alignment (the fit between cybersecurity practices and the diverse teaching and learning needs of different learners and contexts) [51]. This information places their work within the grouping of isolated technical solutions; it illustrates how a strong technical emphasis may undermine instructional effectiveness and contextual adaptability.
The MACLF's originality lies in three specific innovations that are lacking in existing models. First, existing models typically fall into two categories, technical approaches that prioritize simulation fidelity or educational frameworks that emphasize learning outcomes. This MACLF framework bridges both domains through explicit alignment with Kolb's experiential learning cycle, creating systematic learning grounding absent in either approach alone. Second, unlike static, institution-specific models, the MACLF introduces true modularity through component-based architecture that enables customization while maintaining educational effectiveness across diverse institutional contexts. Third, this framework pioneers embedded continuous feedback mechanisms that facilitate real-time adaptation, contrasting sharply with conventional models where assessment occurs post-implementation rather than being integrated into the design process.
Reviewer’s Comments: The English in the manuscript is generally understandable but could be improved to more clearly and precisely express the research. At times, sentences are overly long or repetitive, which may reduce readability. Refining the grammar, simplifying complex phrasing, and ensuring consistent use of technical terms would help the arguments flow more smoothly. A careful language edit would enhance clarity, making the theoretical framework, methodology, and findings easier for readers to follow.
Author’s Response: The grammar was updated and the language was edited. The information connects to the context of the manuscript.