Article

Learning to Hack, Playing to Learn: Gamification in Cybersecurity Courses

by Pierre-Emmanuel Arduin 1,* and Benjamin Costé 2
1 Université Paris-Dauphine—PSL, DRM UMR CNRS 7088, 75775 Paris, France
2 Airbus Cybersecurity, 35208 Rennes, France
* Author to whom correspondence should be addressed.
J. Cybersecur. Priv. 2026, 6(1), 16; https://doi.org/10.3390/jcp6010016
Submission received: 29 September 2025 / Revised: 8 December 2025 / Accepted: 29 December 2025 / Published: 7 January 2026

Abstract

Cybersecurity education requires practical activities such as malware analysis, phishing detection, and Capture the Flag (CTF) challenges. These exercises enable students to actively apply theoretical concepts in realistic scenarios, fostering experiential learning. This article introduces an innovative pedagogical approach relying on gamification in cybersecurity courses, combining technical problem-solving with human factors such as social engineering and risk-taking behavior. Integrating interactive challenges into the courses enhanced engagement and motivation while addressing both the technological and managerial dimensions of cybersecurity. Observations from course implementation indicate that students demonstrate higher involvement when participating in supervised offensive security tasks and social engineering simulations within controlled environments. These findings highlight the potential of gamified strategies to strengthen cybersecurity competencies and promote ethical awareness, paving the way for future research on long-term cybersecurity learning outcomes.

1. Introduction

Information systems security may be regarded through the lens of technical and externally centered solutions allowing the prevention of intrusions [1], the detection of denial-of-service attacks [2], and the strengthening of firewalls [3]. Industry professionals as well as the academic literature observe that a prevailing threat is neither technical nor external but comes from employees, the insiders of organizations [4,5,6,7]. This insider threat can be intentional or unintentional, malicious or non-malicious [8,9,10,11], leading practitioners, researchers, and teachers in higher education to consider cybersecurity as a group and social phenomenon [12].
While societies are still far from seeing all the consequences of the crisis caused by the COVID-19 outbreak, some authors draw up a “balance sheet” of the sudden shift of their higher-education classes online [13,14]. Authors such as [15] propose to measure user satisfaction and efficiency from a student’s point of view (factors influencing the preference for online classes, for example), while others, such as [16], analyze the results of a study on French students and show that students would like to keep 35% of their classes online after experiencing lockdown. For [17], it cannot be neglected that the COVID-19 outbreak affected the mental health of students. Ref. [18] went further by observing the effects of the anxiety resulting from such a sudden switch online. The issue of knowledge sharing in higher education, remote in general and online in particular, is crucial and has long been widely discussed in the literature [19,20,21], including for cybersecurity [22,23].
Indeed, teaching cybersecurity cannot be reduced to teaching techniques of encryption/decryption, network analysis, or brute-force attacks. The intention individuals have to perform a behavior is notably determined by whether they think a group or other individuals support or condemn such a behavior [24]. This demonstrates the “need to understand how not only individuals perceive cybersecurity risks, but also how they think other people perceive those risks” [12] (p. 2). We go deeper in this article by presenting how an innovative group learning pedagogy relying on gamification influences learners’ decision to engage with interest in information system security classes requiring technical as well as social engineering techniques. This article examines gamified approaches, such as Capture the Flag, in cybersecurity education, adopting a pedagogical perspective on managing both external and insider threats. Building on this perspective, the article introduces a pedagogical framework that leverages gamification to strengthen cybersecurity education. By combining technical problem-solving with human-centric dimensions, such as social engineering and risk-oriented decision-making, it aims to foster both technical proficiency and behavioral awareness in realistic learning scenarios.
In the second section of this article, we present the background theory: first, the crucial elements of information system security we wanted to share with our students, namely external and insider threats; then, pedagogy through gamification. In the third section, we present the scenarios we designed for information system security classes with MSc students from Université Paris-Dauphine—PSL, as well as a discussion of limits and ethical implications. This article aims to share pedagogical scenarios for teaching information systems security, and to argue that awareness, training, and higher education can benefit from another way of teaching, by learning to hack and playing to learn: Hack us if you can!

2. Theoretical Background

In this section, we first present crucial elements of information systems security, i.e., cybersecurity, that we wanted to share with our students: external and insider threats. Second, we outline gamification as a pedagogical approach.

2.1. Mastering Technology as a Prerequisite for Cybersecurity

As shown in [25], an information system (IS) is not solely composed of technological devices but also includes humans, insofar as they contribute to the overall security level of the global system. The IS is thus threatened on two sides: on the one hand, external attackers use their own resources to penetrate and then compromise the IS; on the other hand, internal users can break security, intentionally or not. Moreover, the emergence of the Internet of Things (IoT) [26], Bring Your Own Device (BYOD) [27], and zero-trust models [28] reinforces the importance of considering security from both points of view.
Going further, the rise of new devices (smartphones, connected devices, home assistants, etc.) brings new habits and an evolving need for digital access from everywhere. Since we are nearly always digitally connected, our data are exposed at all times. Protection of these new devices can be set up in the following three ways:
  • paid products from well-known vendors, such as anti-viruses, Virtual Private Networks (VPNs), firewalls, web protections, etc., which offer a high security level [29].
  • free products, whose trade-off is their low effectiveness. These can be free versions of well-known products or newcomers seeking a marketplace. However, they can also be fake products that install adware and/or pop-ups to gather (or even steal) personal data, all while introducing new vulnerabilities [30].
  • custom device configurations without any additional security software. These, however, require IS security awareness and deep knowledge of the devices’ technical aspects. For example, most people leave wireless interfaces open, spreading their data around and encouraging opportunistic attackers, who range from novices (often called ‘script kiddies’) to experts [31]. Depending on the attackers’ skills and maliciousness, the device may be enrolled in a cryptocurrency-mining farm (Bitcoin, Monero, etc.) or participate in a Distributed Denial-of-Service (DDoS) attack [32].
Each protection option has its own risks, and most users do not need to understand the underlying technical details, although they do require appropriate security measures. Computer science students, however, need to understand the risks of their decisions, especially when they use their skills for a company, since their decisions affect several users. In addition, cybercriminals act differently when they target companies’ IS rather than personal devices. The course we proposed aims to help students act appropriately to mitigate personal or company-wide risks. With that in mind, we propose to focus on external threats through the following three main pedagogical objectives:
  • Think as a cybercriminal—it is necessary to identify vulnerabilities and threats. Indeed, a criminal mind is not really intuitive: it circumvents tools or habits to obtain what it is looking for. Phreakers used whistles to make free phone calls [33]; some hackers used optical LEDs to exfiltrate data from air-gapped computers [34] or even turned power supplies into speakers [34,35]. Devices may allow actions their designers never intended. A first objective of our course was to share this idea with our students.
  • Compare yourself to successful cybercriminals—students need to analyze successful attacks to understand the techniques and strategies commonly employed by attackers.
  • Reconstruct the sequence of events—to find pieces of information about an attacker through students’ understanding of the criminalistic modus operandi.
During this course, students acquire a sound appreciation of risks and criminalistic procedures. As they gained new knowledge of cybersecurity and ways to hack, we observed an increase in students’ involvement, self-esteem, and enthusiasm. Nevertheless, as notably observed by [36], several motivations for hacking include a social element: hacking for prestige, ideological hacking (known as hacktivism), and insider threats that may be motivated by revenge. While mastering technology is essential for building a strong foundation in cybersecurity, it is equally important to address the human dimension by managing individuals in the face of emerging threats.

2.2. Managing Individuals in the Face of Emerging Threats

Disturbingly, authors such as [10] remain topical today, more than thirty years after their publication. Indeed, the consequences of an attack that they highlighted in 1992 remain the same in 2025: (1) disclosure of bankable information; (2) modification or (3) destruction of sensitive information; and (4) denial of use by hindering access to resources. In particular, these authors highlighted the existence of external as well as insider threats (Figure 1).
Delineating what is “considered right” is a crucial issue when managing IS security. Ref. [37] proposed the concept of “ethical work climates” to define the perception employees have of institutionalized values, policies and reward systems. Such ethical work climates have been shown to influence information security policy noncompliance [38]. For example, some social cues lead to internalized norms and a kind of social censure [39,40], but it is possible that “by a subtle alchemy, the delinquent moves himself into the position of an avenger and the victim is transformed into a wrong-doer” [41] (p. 668) through what [41] call “neutralization techniques”.
Ref. [42] noted that social influence strongly impacts end-user security-related intentions. For [24], the intention to perform a desired behavior depends on the perception one has of whether others will support or condemn such behavior. Ref. [12] (p. 2) insists that it is important to understand not only how “individuals perceive cybersecurity risks”, but also “how they think other people perceive those risks”. Should I share my doubts about this suspicious email with my colleagues? Should I trust, without asking, this assistant urging me to send a file? Authors such as [43,44,45] showed that cognitive and cultural biases may be identified and activated maliciously to influence security-related intentions [43]. Insider threats can be categorized along two dimensions: (1) the intentional character of the threat and (2) its maliciousness [7,46]. For employees who are users of an IS and may as such constitute an entry point into the system, insider threats fall into the following three categories [8]:
  • Unintentional: wrong actions from inexperienced, negligent or influenced employees; for example, inattentive clicks, input errors, accidental deletions of sensitive data, etc. [47].
  • Intentional and non-malicious: deliberate actions from employees winning a benefit but without a desire to cause harm; for example, deferring backups, choosing a weak password, leaving doors open during sensitive discussions, etc. [48].
  • Intentional and malicious: deliberate actions from employees with a desire to cause harm; for example, divulging sensitive data, introducing malicious software, etc. [49].
The course we proposed integrates the technological aspects of cybersecurity, as presented in Section 2.1, but also the group and social aspects, and relies on these three categories of insider threats. Indeed, as concluded by [50], there is a need to train people to counteract human deception in social engineering attacks. A social engineer is an attacker who targets a legitimate user from whom they obtain some direct (rights of access, harmful link visited, etc.) or indirect (vital information, relationship of trust, etc.) means to get into the system [33]. The proposed scenarios and pedagogical content discussed in Section 3.3 notably show how our students learned to think as social engineers.
Building on the technological considerations and the human-centric management strategies discussed earlier (Section 2.1 and Section 2.2, respectively), the next section explores gamification as a pedagogical lever in cybersecurity education. By combining technical skills with behavioral insights, gamified approaches offer an engaging way to reinforce threat awareness in realistic scenarios.

2.3. Gamification as a Pedagogical Lever in Cybersecurity Education

For [51], there are four levels of learning within organizations: (1) individual, intuition and subconscious processes relying on perception [52]; (2) teams, social processes of interaction with others through forums [53]; (3) organizational, changes in the organization’s routines due to new knowledge [52]; and (4) inter-organizational, strategic alliances with other organizations to acquire new knowledge [54].
From traditional instructional teaching methods to Massive Open Online Courses (MOOCs), the spectrum of ways to share knowledge and to teach is rather large. If MOOCs provide pedagogical content online and are presented as empowering individuals to learn independently at their own pace [55], it should be remembered that the intention to stay in the course needs to be better monitored and understood, particularly during the COVID-19 outbreak [56].
While learning relies on knowledge sharing, knowledge sharing faces difficulties within organizations, notably due to, first, a lack of employee willingness to share knowledge; second, a lack of employee awareness of knowledge-sharing procedures; and third, a lack of trust in colleagues [57]. In the context of education in general and higher education in particular, the literature criticizes the disengagement induced by traditional instructional methods [58,59], which neither share practical knowledge nor challenge the students [60,61]. Relying on cognitive load theory, ref. [62] explain that examples of ineffective teaching are found when learners are required to unnecessarily mentally integrate disparate sources of information. Cognitive load theory analyzes the ways in which individuals focus and use their cognitive resources during learning and problem solving [63,64]: information must be mentally integrated before the learning process begins, and [62] (p. 331) present experiments they conducted in the early 1990s. Their conclusions corroborate the aspirations of game-based learning: (1) restructure information sources into integrated formats, (2) rely on worked examples, and (3) mix a large number of worked examples with conventional problems. “I hear and I forget, I see and I remember, I do and I understand”: this sentence, attributed to Confucius in the fifth century BC, illustrates the idea of “learning by doing” formalized by [65] in the 1980s. Such game-based learning leads not only to improved critical thinking and practical problem-solving skills [66], but also—and maybe most importantly—makes the subject matter more interesting [67]. We are then quite far from the idea of “cognitive frustration and learning” proposed by [20] in the early 1970s.
Indeed, authors such as [68] observed cybersecurity teaching experiences in escape-the-room environments. They humbly concluded that these experiences may work—or not—but have always led the participants to enjoy the moment. When satisfied, individuals are more likely to be committed within organizations [69], and a quite similar phenomenon has been observed here. Ref. [70] identified three important components of game-based learning when designing an escape game for hardware security teaching: (1) the importance of the scenario, (2) the posture of the teacher, and (3) the need for a debriefing. Ref. [71] proposed a systematic review of game-changing frameworks for cybersecurity training and awareness, whereas ref. [72] relied on action design research to propose game-based learning artifacts for cybersecurity processes. Ref. [73] considered both the online and international cases. Security gamification is defined as the application of “game-like design artifacts and system processes to strengthen employees’ motivations to encourage learning, efficacy, and increased employee compliance with organizational security initiatives” [74] (p. 131). Gamification of teaching in the case of cybersecurity thus appears to make sense regarding [75]’s request for “new methods facilitating collective appreciation of security objectives” [76], even if such a question was already partially tackled in the late 1990s with the rise of hackathons, i.e., intensive computer-programming events [77]. A study by [78] concludes that people participate in hackathons for learning (86%), networking (82%), or for advancing social change (38%). Indeed, hacking code to achieve a kind of “social betterment” is one of the origins of hackathons for these authors. This has also been crucial in the design of our gamified cybersecurity course.

3. Design and Implementation of a Gamified Cybersecurity Course

Rather than training security experts, the course we propose aims to give a basic understanding of the most common security concepts through the security lifecycle: identification, protection, detection, response, and recovery [79]. For each part, a brief description of problems encountered by security experts is presented, relying on our own experiences or on security incidents reported in newspapers. The basics of cryptography are also taught, as they are an essential part of cybersecurity knowledge. Every concept is approached from both the attacker (also known as “red team”) and defender (also known as “blue team”) sides, with respect to pedagogical objectives 1 and 3 (cf. Section 2.1) as well as for ethical reasons (see Section 3.4).
The course is organized as follows:
  • Three theoretical lessons about information systems, social engineering, insider threats, the security lifecycle, and cryptography.
  • Three practical lessons, or tutorials, split into two themes: external and insider threats. These are described in Section 3.1 and Section 3.2, respectively, and rely on Section 2.1 and Section 2.2, respectively. After choosing a theme, students focus on it.
  • A gamified project detailed in Section 3.3 and relying on Section 2.3.
The course materials are available from the corresponding author upon reasonable request.

3.1. From Technology Mastery to the Management of External Cyber Threats

After the theoretical lessons, students are divided into groups for tutorial classes. Each practical lesson relies on a scenario in which students are given a different security role with respect to the pedagogical objectives: tutorial 1—cybercriminal, tutorial 2—cyber threat analyst, and tutorial 3—incident handler. At the end of each lesson, students must write a report describing the methodology they followed. Since students are not assumed to be security experts at the beginning, awareness of security issues is prioritized over technical skills. This influenced the lessons’ design: the first scenario only needs web investigation, the second needs an understanding of technical vocabulary, and the third requires basic technical skills.

3.1.1. Tutorial 1—Let Us Hack It!

In the first scenario, students must imagine a path to compromise a common network, depicted in Figure 2. Their attack must follow the MITRE ATT&CK framework [80], which describes attackers’ methodologies. Experts use this framework when they detect an attack [81], and the examples presented on MITRE’s website (https://attack.mitre.org/matrices/enterprise/, accessed on 1 September 2025) lead students to discover several attackers’ methodologies.
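As an illustration, a student report in this tutorial could document the imagined attack path as a sequence of ATT&CK tactic/technique pairs. The Python sketch below shows one way to structure such a mapping; the attack path and the scenario notes are invented for illustration, although the technique identifiers are real ATT&CK IDs:

```python
from dataclasses import dataclass

@dataclass
class AttackStep:
    tactic: str        # ATT&CK tactic (the attacker's "why")
    technique_id: str  # ATT&CK technique identifier
    technique: str     # ATT&CK technique name
    note: str          # scenario-specific detail (invented here)

# A hypothetical attack path against the tutorial network.
attack_path = [
    AttackStep("Initial Access", "T1566", "Phishing",
               "malicious attachment sent to an employee"),
    AttackStep("Execution", "T1059", "Command and Scripting Interpreter",
               "document macro spawns a scripted payload"),
    AttackStep("Lateral Movement", "T1021", "Remote Services",
               "reused credentials to reach the file server"),
    AttackStep("Exfiltration", "T1041", "Exfiltration Over C2 Channel",
               "data sent out through the command-and-control tunnel"),
]

def render(path):
    """Format the path the way students might lay it out in a report."""
    return "\n".join(f"{s.tactic}: {s.technique_id} ({s.technique}) - {s.note}"
                     for s in path)

print(render(attack_path))
```

Laying out each step as a tactic/technique pair mirrors the structure of the ATT&CK Enterprise matrix itself and makes the resulting reports directly comparable between groups.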

3.1.2. Tutorial 2—Know the Attackers

In the second scenario, students are given the role of experts in Cyber Threat Intelligence (CTI). Each group has to analyze a different threat report, summarize the methodology employed by attackers, and propose countermeasures.

3.1.3. Tutorial 3—Find the Cat

The third scenario is far more technical. Students must retrieve pieces of information about a kidnapper using digital forensics and incident response skills. In this scenario, their manager’s cat has been kidnapped (see Figure 3), and they have to find the location of the kidnapper and the secret data encrypted by him. Students receive three files: the encrypted secret data and two ransom demands, one for the cat and one for the data.
Each file represents a different challenge: (i) the cat ransom file holds the probable location of the kidnapper, hidden in its metadata; and (ii) the second ransom file contains the secure password that is needed to decrypt the secret data (i.e., the third file) and that can be neither guessed nor brute-forced.
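To illustrate challenge (i) without disclosing the actual course material, metadata leaks of this kind can often be spotted with a simple scan for printable byte runs, in the spirit of the classic `strings` forensic utility. The Python sketch below uses a fabricated ransom-note byte string, with invented GPS fields standing in for the hidden location:

```python
import re

def extract_strings(data: bytes, min_len: int = 8):
    """Return printable ASCII runs of at least min_len bytes,
    mimicking the classic `strings` forensic utility."""
    return [m.group().decode("ascii")
            for m in re.finditer(rb"[\x20-\x7e]{%d,}" % min_len, data)]

# Fabricated ransom note: opaque binary content surrounding a metadata
# field that leaks the author's position (coordinates invented here).
ransom_note = (b"\x89PNG\x00\x01\x02"
               b"GPSLatitude=48.1173;GPSLongitude=-1.6778"
               b"\x00\xff\xfe")

print(extract_strings(ransom_note))
```

Short binary runs such as the file signature fall below the length threshold, so only the human-readable metadata field surfaces, which is exactly the triage behavior students need for this tutorial.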

3.2. From Individual Management to Awareness and Mitigation of Insider Threats

Students who chose to join the tutorial classes on insider threats are also divided into groups. Each practical lesson relies on a specific case of insider threat, as mentioned in Section 2.2: tutorial 1—unintentional, tutorial 2—intentional and non-malicious, and tutorial 3—intentional and malicious. Each tutorial concludes with a report written by the students and assessed by the teacher from one lesson to the next.

3.2.1. Tutorial 1—Unintentional Insider Threats

The first tutorial is composed of four parts: (i) the students are confronted with phishing emails in an experiment room with copy-protection walls (see Figure 4); they have no information about the emails’ legitimacy and have to indicate the areas and elements leading them to trust or distrust the emails. (ii) The results of the experiment are discussed with the teacher in a lecture hall, particularly the most trust-eliciting and distrust-eliciting areas (see Figure 5). (iii) The students have to propose a “trust-optimized” phishing email relying on the results of the experiment and explain their choices in a written report. (iv) The works are finally presented and discussed.

3.2.2. Tutorial 2—Intentional and Non-Malicious Insider Threats

During the second tutorial, the class is divided into two groups: (A) the paranoid Chief Information Security Officers (CISOs) group and (B) the smart-and-lazy employees group, specialists in workarounds that minimize their effort. As the reader may have understood, group (A) represents CISOs deploying inadequate information security policies, whereas group (B) represents employees working around such inadequate policies and thereby creating intentional and non-malicious insider threats; e.g., frequent password changes may lead to post-it notes with written passwords, complex backup procedures may lead to deferred backups, etc. This second tutorial is composed of three parts: (i) a preparation phase, during which the students in group (A) prepare rough and complex information security policies, whereas students in group (B) anticipate such inadequate policies and how they could work around them; (ii) a battle phase, during which students from group (A) present their policies and students from group (B) counter them with workarounds; and (iii) a debriefing phase, during which the policies and workarounds are discussed.

3.2.3. Tutorial 3—Intentional and Malicious Insider Threats

The third and last tutorial focuses on one of the greatest fears of CISOs: employees with privileged access and intimate knowledge of processes becoming intentional and malicious insider threats. During this tutorial, the groups of students follow a design thinking procedure, the DKCP (Define, Knowledge, Concept, Project) method [82], to imagine and design innovative forthcoming practices in which users intentionally and maliciously become attackers. After the definition and knowledge-collection phases, the students connect their ideas to propose new concepts of attacks coming from intentional and malicious insiders, and then try to describe the procedure as a project. At the end of the tutorial, these procedures are presented, as well as the design process leading to them. A debriefing follows, in which groups are asked to think about potential countermeasures to the identified threats.

3.3. Capture the Flag as a Gamified Learning Scenario

The final part of this course is designed as a Capture The Flag (CTF) challenge, as notably discussed by [68] (cf. Section 2.3). Students have to find a part of the final exam that has previously been hidden in the university. The challenge is divided into several parts; some are common to all groups, and some depend on the group. Each year, about 50 students follow this course. They are randomly assigned to a group, and several parts of the final exam, i.e., several flags, are hidden in the university, so that not all groups seek a unique flag: two or three groups seek the same flag. We observed that some groups captured more than one flag.

3.3.1. Part 1—Hacking a Laptop (Common to All Groups)

In the first part, a laptop is given to each group of students. The laptop is locked, and the students have no additional clues or support from us. As in a real-life attack, they have to unlock the session in order to find clues about the exam location. As presented in Section 2.3, three components are important for game-based learning: (1) the importance of the scenario, (2) the posture of the teacher, and (3) the need for a debriefing [70]. As the reader can guess, the importance of the scenario was crucial here, as was the posture we adopted when asking students to hack us—if they can. We were very clear with them from the beginning, explaining that the final exam would be very difficult and that they could find parts of it that we had hidden in the university by discovering clues after unlocking the laptop. Of course, they knew that they would be assessed on the method they employed to do so. Finally, the groups had to prepare a debriefing on the threats they exploited to unlock the laptop.
Several hints were left, such as an X (formerly Twitter) account number hidden in a fake barcode we prepared (Figure 6a). Once the session had been unlocked, students had to perform a steganalysis of the local user’s files to reveal several URLs, which depended on the group number. Then the second phase begins: the virtual game-based learning, teaching technological aspects of cybersecurity (cf. Section 2.1), becomes real game-based learning, teaching social aspects of cybersecurity (cf. Section 2.2).
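Without disclosing the actual hiding technique used in the challenge, a minimal illustration of the kind of scheme such a steganalysis uncovers is least-significant-bit (LSB) embedding. The Python sketch below hides and recovers a hypothetical per-group URL in synthetic cover data standing in for image bytes:

```python
def hide(cover: bytes, secret: bytes) -> bytes:
    """Embed secret into the least significant bit of each cover byte."""
    bits = [(b >> i) & 1 for b in secret for i in range(7, -1, -1)]
    assert len(bits) <= len(cover), "cover too small"
    out = bytearray(cover)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit
    return bytes(out)

def reveal(stego: bytes, n_bytes: int) -> bytes:
    """Recover n_bytes of hidden data from the LSBs."""
    bits = [b & 1 for b in stego[:n_bytes * 8]]
    return bytes(sum(bit << (7 - i) for i, bit in enumerate(bits[k*8:(k+1)*8]))
                 for k in range(n_bytes))

url = b"https://example.org/g7"   # hypothetical per-group URL
cover = bytes(range(256)) * 2     # stand-in for image pixel data
stego = hide(cover, url)
assert reveal(stego, len(url)) == url
```

Because only the lowest bit of each byte changes, the cover looks statistically almost unchanged, which is why students must actively analyze the files rather than merely inspect them.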

3.3.2. Part 2—Deceiving Individuals (Different from One Group to Another)

The URLs discovered in the first part give access to different challenges depending on the group and lead students to search for parts of the final exam at the university by giving them room numbers. For some groups, hidden HTML code contained information about an office number. For others, the URL asked for an “administrator login”, which was quite easy to guess. Some groups had to open assistants’ offices locked with codes or keys (Figure 6b). Others had to discover our birth dates, used as passwords, to open lecture rooms. A sheet of paper was then hidden in the office or the lecture room (behind the door, behind a fire instructions poster, on the desktop of the assistant, etc.). Students had to exploit insider threats (cf. Section 2.2) by leading academic or administrative staff to open a door for them. Time was limited from the end of the first part, and the students had to prepare a debriefing on the threats they exploited to unlock the door and obtain their flag, which was a part of the final exam.
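For the groups whose clue sat in hidden HTML code, an HTML comment is a typical hiding place: invisible in the rendered page but plainly readable in the source. The Python sketch below, using a fabricated page and a hypothetical office number, shows how such a clue can be recovered programmatically:

```python
from html.parser import HTMLParser

class CommentHunter(HTMLParser):
    """Collect HTML comments, a classic hiding spot for CTF clues."""
    def __init__(self):
        super().__init__()
        self.comments = []

    def handle_comment(self, data):
        self.comments.append(data.strip())

# Fabricated challenge page: the clue is invisible in the rendered
# page but plainly readable in the source (office number invented here).
page = """<html><body>
<h1>Nothing to see here</h1>
<!-- flag hint: office P414, second floor -->
</body></html>"""

hunter = CommentHunter()
hunter.feed(page)
print(hunter.comments)  # clue recovered from the page source
```

In practice, students reach the same result simply by viewing the page source in their browser; the point of the exercise is to build the reflex of looking beneath the rendered surface.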

3.4. Discussion, Limits and Ethical Implications

We observed that interaction during and outside of classes, whether intra-group in tutorials or inter-group in debates, was significant. Leading our students to hack us—their teachers—is a real challenge. In this information system security course, we observed the enthusiasm and interest that such an exercise generated, sharpening students’ appetite for mastering technology and for understanding risky behaviors.
We even observed students sharing their responses or using social engineering techniques on other students or administrative staff (who had given prior consent to participate but were not informed about the details of the exercise). We can say that they enjoyed the moment, as mentioned by [68] (cf. Section 2.3). Such student well-being made them more likely to be committed to our course, as discussed by [69], who bridged job satisfaction and organizational commitment.
The proposed teaching approach relies on “learning by doing” [65] (cf. Section 2.3). Indeed, authors such as [83] already called for a better defense of Internet users by informing them about attackers’ tactics. For others, such as [84], individuals often struggle to engage in effortful and careful thinking, making them more vulnerable to threats such as online weapons of influence [12,85].
It cannot be neglected that a global reliance on the Internet as a strategic tool can cause reputational harm, management distraction, and more damage than a simple financial loss [86]. For authors such as [87] (p. 1128), “the academic community must endeavor to work with all parties to offer ongoing best practice”. Raising awareness and providing training on information systems security threats to students—and even to schoolchildren [88,89,90]—has become more crucial than ever.
Authors such as [21] studied the impact of the jigsaw method or cloud platforms on learning through group collaboration. We argue that such group collaboration has been a key factor in increasing students’ commitment, enthusiasm, and well-being, particularly following the pandemic, which induced so much social isolation.
Final course feedback questionnaires showed that students were highly satisfied with the course. They appreciated the challenge (“it was particularly interesting to find the clues on our own”) and asked for deeper technical training (“maybe go a little deeper into what is going on the network, ports, etc.”). The group as a whole appreciated the continuous assessment (“it is an evaluation method that suits me”). It should be noted that the methodology presented here does not yet include extensive empirical data or detailed evaluation metrics: the questionnaire used in this study was designed to collect qualitative feedback only, which precludes statistical analysis within the scope of this research. Future iterations of the course now include questionnaires with quantitative components, and their results will be analyzed in future research.
The observed performance reversal between offline and online evaluation requires further investigation. Future work will incorporate quantitative analysis and address ethical dimensions such as fairness, stress management, and accessibility. Prior research has shown that offline and online evaluations can yield contradictory results, calling for methodological prudence [91], while ref. [92] highlighted the existence of distinct biases in these two evaluation modes.
The risk of dual use of this research and pedagogical approach cannot be neglected [93,94]. Even if there is a risk that attackers use the material presented in this article as a “how to” guide, we consider that the benefits in terms of training and future training material outweigh the risks. Indeed, we argue that training students, employees, and citizens remains one of the most effective defenses.
Teaching offensive cybersecurity skills, such as system exploitation and vulnerability analysis, undeniably raises important ethical questions. The academic literature emphasizes the need for clear boundaries between legitimate educational objectives and potential misuse of acquired skills [95]. Recent studies advocate integrating ethics and legal frameworks into curricula to ensure responsible practice [96]. Furthermore, research highlights that students’ perception of ethics improves when reflective activities are embedded in hands-on exercises [97]. Establishing institutional policies and designing controlled environments are considered essential to mitigate risks while fostering professional competence [98].
In addition, gamification can mask the consequences of abusing real systems (including psychological consequences for employees). We balance such effects with legal reminders (students remain responsible for their actions) and by encouraging students to put the studied attacks into perspective, using them as a basis for countermeasures such as detection systems. In this way, we ensure that students focus on defensive rather than offensive aspects.
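As a toy illustration of the last point—turning a studied attack into the seed of a detection countermeasure—students can write a deliberately naive keyword heuristic against the phishing tactics they practiced (urgency pressure, credential requests). The phrase list, function names, and threshold below are hypothetical assumptions for the sketch, not material from the course:

```python
# Illustrative sketch only: a naive phishing detector built from the
# attacker tactics studied in class. Phrases and threshold are
# assumptions chosen for this example.

SUSPICIOUS_PHRASES = (
    "urgent",             # urgency pressure
    "immediately",        # urgency pressure
    "verify your",        # credential harvesting
    "password",           # credential harvesting
    "account suspended",  # fear appeal
)

def phishing_score(email_text: str) -> int:
    """Count how many distinct suspicious phrases appear in the email."""
    text = email_text.lower()
    return sum(1 for phrase in SUSPICIOUS_PHRASES if phrase in text)

def is_suspicious(email_text: str, threshold: int = 2) -> bool:
    """Flag an email when it matches at least `threshold` phrases."""
    return phishing_score(email_text) >= threshold
```

For example, `is_suspicious("Urgent: verify your password immediately")` matches four phrases and is flagged, while an ordinary scheduling message matches none. Real detection systems rely on far richer features, but the exercise makes the defensive reframing of the attack concrete.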

4. Conclusions

In this article, we have shared pedagogical scenarios, strengthening the idea that awareness, training, and higher education could benefit from another way of teaching information systems security, learning to hack and playing to learn: hack us if you can!
In the second section, we presented external and insider threats to information system security, as well as innovative pedagogy initiatives such as gamification. In the third section, we presented an innovative pedagogical approach and discussed the limits and ethical implications of such a “hack us if you can” group learning pedagogy.
The initial approach was qualitatively assessed through feedback questionnaires. Further improvements are underway, supported by quantitative evaluations, to keep the course aligned with the ongoing evolution of cybersecurity and pedagogical practices.
As mentioned in this article, the course materials are available from the corresponding author upon reasonable request. While broader dissemination is desirable to foster portability and adaptation in other academic or corporate contexts, unrestricted online access could compromise the integrity of future learning activities by exposing students to solutions in advance. Therefore, controlled sharing remains essential to preserve the pedagogical value of the exercises.

Author Contributions

Conceptualization, P.-E.A. and B.C.; investigation, P.-E.A. and B.C.; resources, P.-E.A. and B.C.; writing—original draft preparation, P.-E.A. and B.C.; writing—review and editing, P.-E.A. and B.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The original contributions presented in this study are included in the article. The course materials are available from the corresponding author upon reasonable request. Further inquiries can be directed to the corresponding author.

Acknowledgments

The authors thank their colleagues for discussions during conferences on earlier versions of this work. They warmly thank the students who participated and kept controversy open. They also thank the anonymous reviewers for their helpful comments and guidance.

Conflicts of Interest

Author Benjamin Costé was employed by the company Airbus Cybersecurity; this work was conducted in a personal capacity. The views and opinions expressed are solely those of the authors and do not reflect those of their employers or their affiliates.

References

  1. Hansen, J.V.; Lowry, P.B.; Meservy, R.D.; McDonald, D.M. Genetic programming for prevention of cyberterrorism through dynamic and evolving intrusion detection. Decis. Support Syst. 2007, 43, 1362–1374. [Google Scholar] [CrossRef]
  2. Zhi-Jun, W.; Hai-Tao, Z.; Ming-Hua, W.; Bao-Song, P. MSABMS-based approach of detecting LDoS attack. Comput. Secur. 2012, 31, 402–417. [Google Scholar] [CrossRef]
  3. Ayuso, P.N.; Gasca, R.M.; Lefevre, L. FT-FW: A cluster-based fault-tolerant architecture for stateful firewalls. Comput. Secur. 2012, 31, 524–539. [Google Scholar] [CrossRef]
  4. Hassandoust, F.; Techatassanasoontorn, A.A.; Singh, H. Information Security Behaviour: A Critical Review and Research Directions. In Proceedings of the European Conference on Information Systems, ECIS 2020, Online, 15–17 June 2020. [Google Scholar]
  5. Sasse, M.A.; Brostoff, S.; Weirich, D. Transforming the ‘weakest link’—A human/computer interaction approach to usable and effective security. BT Technol. J. 2001, 19, 122–131. [Google Scholar] [CrossRef]
  6. Vroom, C.; Von Solms, R. Towards information security behavioural compliance. Comput. Secur. 2004, 23, 191–198. [Google Scholar] [CrossRef]
  7. Willison, R.; Warkentin, M. Beyond Deterrence: An Expanded View of Employee Computer Abuse. MIS Quarterly 2013, 37, 1–20. [Google Scholar] [CrossRef]
  8. Arduin, P.E. Insider Threats; John Wiley & Sons: Hoboken, NJ, USA, 2018. [Google Scholar]
  9. Leach, J. Improving user security behaviour. Comput. Secur. 2003, 22, 685–692. [Google Scholar] [CrossRef]
  10. Loch, K.D.; Carr, H.H.; Warkentin, M.E. Threats to information systems: Today’s reality, yesterday’s understanding. MIS Q. 1992, 16, 173–186. [Google Scholar] [CrossRef]
  11. Warkentin, M.; Willison, R. Behavioral and policy issues in information systems security: The insider threat. Eur. J. Inf. Syst. 2009, 18, 101–105. [Google Scholar] [CrossRef]
  12. McAlaney, J.; Benson, V. Cybersecurity as a social phenomenon. In Cyber Influence and Cognitive Threats; Academic Press: Cambridge, MA, USA, 2020; pp. 1–8. [Google Scholar]
  13. Huang, R.; Tlili, A.; Chang, T.W.; Zhang, X.; Nascimbeni, F.; Burgos, D. Disrupted classes, undisrupted learning during COVID-19 outbreak in China: Application of open educational practices and resources. Smart Learn. Environ. 2020, 7, 19. [Google Scholar] [CrossRef]
  14. Mohmmed, A.O.; Khidhir, B.A.; Nazeer, A.; Vijayan, V.J. Emergency remote teaching during Coronavirus pandemic: The current trend and future directive at Middle East College Oman. Innov. Infrastruct. Solut. 2020, 5, 72. [Google Scholar]
  15. Nariman, D. Impact of the Interactive e-Learning Instructions on Effectiveness of a Programming Course. In Proceedings of the 14th International Conference on Complex, Intelligent and Software Intensive Systems (CISIS-2020), Lodz, Poland, 1–3 July 2020; pp. 588–597. [Google Scholar]
  16. Adam-Ledunois, S.; Arduin, P.E.; David, A.; Parguel, B. Basculer soudain aux cours en ligne: Le regard des étudiants. In Le Libellio d’Aegis; AEGIS: Paris, France, 2020; Volume 16—Série Spéciale Coronam, Semaine 5; pp. 55–67. Available online: http://lelibellio.com/ (accessed on 28 December 2025).
  17. Lee, J. Mental health effects of school closures during COVID-19. Lancet Child Adolesc. Health 2020, 4, 421. [Google Scholar] [CrossRef] [PubMed]
  18. Baloran, E.T. Knowledge, Attitudes, Anxiety, and Coping Strategies of Students during COVID-19 Pandemic. In Loss and Trauma in the COVID-19 Era; Routledge: Oxfordshire, UK, 2020; pp. 1–8. [Google Scholar]
  19. Burian, S.; Horsburgh, J.; Rosenberg, D.; Ames, D.; Hunter, L.; Strong, C. Using Interactive Video Conferencing for Multi-Institution, Team-Teaching. Proc. ASEE Annu. Conf. Expo. 2013, 23, 1. [Google Scholar]
  20. Cangelosi, V.E.; Usrey, G.L. Cognitive Frustration and Learning. Decis. Sci. 1970, 1, 275–295. [Google Scholar] [CrossRef]
  21. Chang, W.L.; Benson, V. Jigsaw teaching method for collaboration on cloud platforms. Innov. Educ. Teach. Int. 2022, 59, 24–36. [Google Scholar]
  22. Micco, M.; Rossman, H. Building a cyberwar lab: Lessons learned: Teaching cybersecurity principles to undergraduates. In Proceedings of the 33rd SIGCSE Technical Symposium on Computer Science Education, Cincinnati, KY, USA, 27 February–3 March 2002; pp. 23–27. [Google Scholar]
  23. Serik, M.; Tleumagambetova, D.; Tutkyshbayeva, S.; Zakirova, A. Integration of Cybersecurity into Computer Science Teachers’ Training: A Systematic Review. Int. J. Eng. Pedagog. 2025, 15, 57. [Google Scholar]
  24. Venkatesh, V.; Morris, M.G.; Davis, G.B.; Davis, F.D. User acceptance of information technology: Toward a unified view. MIS Q. 2003, 27, 425–478. [Google Scholar]
  25. Reix, R. Systemes d’Information et Management des Organisations; Vuibert: Paris, France, 2000. [Google Scholar]
  26. Atzori, L.; Iera, A.; Morabito, G. The Internet of Things: A survey. Comput. Netw. 2010, 54, 2787–2805. [Google Scholar] [CrossRef]
  27. Thomson, G. BYOD: Enabling the chaos. Netw. Secur. 2012, 2012, 5–8. [Google Scholar] [CrossRef]
  28. Ward, R.; Beyer, B. BeyondCorp: A new approach to enterprise security. login Mag. Usenix Sage 2014, 39, 6–11. [Google Scholar]
  29. Ahvanooey, M.T.; Li, Q.; Rabbani, M.; Rajput, A.R. A Survey on Smartphones Security: Software Vulnerabilities, Malware, and Attacks. Int. J. Adv. Comput. Sci. Appl. (IJACSA) 2017, 8, 30–45. [Google Scholar] [CrossRef]
  30. Wu, F.; Narang, H.; Clarke, D. An Overview of Mobile Malware and Solutions. J. Comput. Commun. 2014, 2, 8–17. [Google Scholar] [CrossRef]
  31. Barber, R. Hackers Profiled—Who Are They and What Are Their Motivations? Comput. Fraud Secur. 2001, 2, 14–17. [Google Scholar] [CrossRef]
  32. Antonakakis, M.; April, T.; Bailey, M.; Bernhard, M.; Bursztein, E.; Cochran, J.; Durumeric, Z.; Halderman, J.A.; Invernizzi, L.; Kallitsis, M.; et al. Understanding the Mirai Botnet. In Proceedings of the 26th USENIX Security Symposium (USENIX Security 17), Vancouver, BC, Canada, 16–18 August 2017; pp. 1093–1110. [Google Scholar]
  33. Mitnick, K.; Simon, W. The Art of Deception: Controlling the Human Element of Security; John Wiley and Sons: Hoboken, NJ, USA, 2003. [Google Scholar]
  34. Guri, M.; Hasson, O.; Kedma, G.; Elovici, Y. An optical covert-channel to leak data through an air-gap. In Proceedings of the 2016 14th Annual Conference on Privacy, Security and Trust (PST), Auckland, New Zealand, 12–14 December 2016. [Google Scholar] [CrossRef]
  35. Guri, M. POWER-SUPPLaY: Leaking Data from Air-Gapped Systems by Turning the Power-Supplies Into Speakers. arXiv 2020, arXiv:2005.00395. [Google Scholar]
  36. Seebruck, R. A typology of hackers: Classifying cyber malfeasance using a weighted arc circumplex model. Digit. Investig. 2015, 14, 36–45. [Google Scholar] [CrossRef]
  37. Victor, B.; Cullen, J.B. The organizational bases of ethical work climates. Adm. Sci. Q. 1988, 33, 101–125. [Google Scholar] [CrossRef]
  38. Gwebu, K.L.; Wang, J.; Hu, M.Y. Information security policy noncompliance: An integrative social influence model. Inf. Syst. J. 2020, 30, 220–269. [Google Scholar] [CrossRef]
  39. Bandura, A. Social cognitive theory of self-regulation. Organ. Behav. Hum. Decis. Process. 1991, 50, 248–287. [Google Scholar] [CrossRef]
  40. Herath, T.; Yim, M.S.; D’Arcy, J.; Nam, K.; Rao, H.R. Examining employee security violations: Moral disengagement and its environmental influences. Inf. Technol. People 2018, 31, 1135–1162. [Google Scholar] [CrossRef]
  41. Sykes, G.M.; Matza, D. Techniques of neutralization: A theory of delinquency. Am. Sociol. Rev. 1957, 22, 664–670. [Google Scholar] [CrossRef]
  42. Johnston, A.C.; Warkentin, M. Fear appeals and information security behaviors: An empirical study. MIS Q. 2010, 34, 549–566. [Google Scholar] [CrossRef]
  43. Arduin, P.E. A cognitive approach to the decision to trust or distrust phishing emails. Int. Trans. Oper. Res. 2023, 30, 1263–1298. [Google Scholar] [CrossRef]
  44. Benson, V.; McAlaney, J. Emerging Cyber Threats and Cognitive Vulnerabilities; Elsevier Science: Amsterdam, The Netherlands, 2019. [Google Scholar]
  45. Triplett, W.J. Addressing Human Factors in Cybersecurity Leadership. J. Cybersecur. Priv. 2022, 2, 573–586. [Google Scholar] [CrossRef]
  46. McAlaney, J.; Frumkin, L.; Benson, V. Psychological and Behavioral Examinations in Cyber Security; Advances in Digital Crime, Forensics, and Cyber Terrorism (2327-0381); IGI Global: Hershey, PA, USA, 2018. [Google Scholar]
  47. Stanton, J.; Stam, K.; Mastrangelo, P.; Jolton, J. Analysis of end user security behaviors. Comput. Secur. 2005, 24, 124–133. [Google Scholar] [CrossRef]
  48. Guo, K.; Yuan, Y.; Archer, N.; Connely, C. Understanding Nonmalicious security violations in the workplace: A composite behavior model. J. Manag. Inf. Syst. 2011, 28, 203–236. [Google Scholar] [CrossRef]
  49. Shropshire, J. A canonical analysis of intentional information security breaches by insiders. Inf. Manag. Comput. Secur. 2009, 17, 221–234. [Google Scholar] [CrossRef]
  50. Campbell, C.C. Solutions for counteracting human deception in social engineering attacks. Inf. Technol. People 2019, 32, 1130–1152. [Google Scholar] [CrossRef]
  51. Sun, P.Y.T.; Scott, J.L. An investigation of barriers to knowledge transfer. J. Knowl. Manag. 2005, 9, 75–90. [Google Scholar] [CrossRef]
  52. Crossan, M.M.; Lane, H.W.; White, R.E. An organizational learning framework: From intuition to institution. Acad. Manag. Rev. 1999, 24, 522–537. [Google Scholar] [CrossRef]
  53. Senge, P.M. The Fifth Discipline: The Art and Practice of the Learning Organization; Doubleday/Currency: New York, NY, USA, 2006. [Google Scholar]
  54. Hatten, K.J.; Rosenthal, S.R. Reaching for the Knowledge Edge: How the Knowing Corporation Seeks, Shares and Uses Knowledge for Strategic Advantage; American Management Association, Inc.: New York, NY, USA, 2001. [Google Scholar]
  55. Razmerita, L.; Kirchner, K.; Hockerts, K.; Tan, C.W. Modeling collaborative intentions and behavior in Digital Environments: The case of a Massive Open Online Course (MOOC). Acad. Manag. Learn. Educ. 2020, 19, 469–502. [Google Scholar] [CrossRef]
  56. Prenkaj, B.; Stilo, G.; Madeddu, L. Challenges and Solutions to the Student Dropout Prediction Problem in Online Courses. In Proceedings of the 29th ACM International Conference on Information & Knowledge Management, Online, 19–23 October 2020; pp. 3513–3514. [Google Scholar]
  57. Maitlo, A.; Ameen, N.; Peikari, H.R.; Shah, M. Preventing identity theft. Inf. Technol. People 2019, 32, 1184–1214. [Google Scholar] [CrossRef]
  58. Daspit, J.J.; D’Souza, D.E. Using the community of inquiry framework to introduce wiki environments in blended-learning pedagogies: Evidence from a business capstone course. Acad. Manag. Learn. Educ. 2012, 11, 666–683. [Google Scholar] [CrossRef]
  59. Siala, H.; Kutsch, E.; Jagger, S. Cultural influences moderating learners’ adoption of serious 3D games for managerial learning. Inf. Technol. People 2020, 33, 424–455. [Google Scholar] [CrossRef]
  60. Mustar, P. Technology management education: Innovation and entrepreneurship at MINES ParisTech, a leading French engineering school. Acad. Manag. Learn. Educ. 2009, 8, 418–425. [Google Scholar]
  61. Salas, E.; Wildman, J.L.; Piccolo, R.F. Using simulation-based training to enhance management education. Acad. Manag. Learn. Educ. 2009, 8, 559–573. [Google Scholar]
  62. Chandler, P.; Sweller, J. Cognitive load theory and the format of instruction. Cogn. Instr. 1991, 8, 293–332. [Google Scholar] [CrossRef]
  63. Sweller, J. Cognitive load during problem solving: Effects on learning. Cogn. Sci. 1988, 12, 257–285. [Google Scholar] [CrossRef]
  64. Sweller, J. Cognitive technology: Some procedures for facilitating learning and problem solving in mathematics and science. J. Educ. Psychol. 1989, 81, 457. [Google Scholar] [CrossRef]
  65. Kolb, D.A. Experience as the Source of Learning and Development; Prentice Hall: Upper Saddle River, NJ, USA, 1984. [Google Scholar]
  66. Kapp, K.M. The Gamification of Learning and Instruction: Game-Based Methods and Strategies for Training and Education; John Wiley & Sons: Hoboken, NJ, USA, 2012. [Google Scholar]
  67. Tobias, S.; Fletcher, J.D.; Wind, A.P. Game-based learning. In Handbook of Research on Educational Communications and Technology; Springer: Cham, Switzerland, 2014; pp. 485–503. [Google Scholar]
  68. Snyder, J. A Framework and Exploration of a Cybersecurity Education Escape Room. Ph.D. Thesis, Brigham Young University, Provo, UT, USA, 2018. [Google Scholar]
  69. Saridakis, G.; Lai, Y.; Muñoz Torres, R.I.; Gourlay, S. Exploring the relationship between job satisfaction and organizational commitment: An instrumental variable approach. Int. J. Hum. Resour. Manag. 2020, 31, 1739–1769. [Google Scholar] [CrossRef]
  70. Bruguier, F.; Lecointre, E.; Pradarelli, B.; Dalmasso, L.; Benoit, P.; Torres, L. Teaching Hardware Security: Earnings of an Introduction proposed as an Escape Game. In Proceedings of the International Conference on Remote Engineering and Virtual Instrumentation, Athens, GA, USA, 26–28 February 2020; Springer: Cham, Switzerland, 2020; pp. 729–741. [Google Scholar]
  71. Amjad, K.; Ishaq, K.; Nawaz, N.A.; Rosdi, F.; Dogar, A.B.; Khan, F.A. Unlocking cybersecurity: A game-changing framework for training and awareness—A systematic review. Hum. Behav. Emerg. Technol. 2025, 2025, 9982666. [Google Scholar] [CrossRef]
  72. Rajendran, D.P.D.; Sundarraj, R.P. Designing game-based learning artefacts for cybersecurity processes using action design research. Bus. Inf. Syst. Eng. 2025, 67, 409–428. [Google Scholar] [CrossRef]
  73. Lui, A.; Womack, C.; Orton, P. Collaborative online international learning as a third space to improve students’ awareness of cybersecurity. Educ. Inf. Technol. 2025, 30, 13835–13856. [Google Scholar] [CrossRef]
  74. Silic, M.; Lowry, P.B. Using Design-Science Based Gamification to Improve Organizational Security Training and Compliance. J. Manag. Inf. Syst. 2020, 37, 129–161. [Google Scholar] [CrossRef]
  75. Benson, V.; McAlaney, J.; Frumkin, L.A. Emerging threats for the human element and countermeasures in current cyber security landscape. In Cyber Law, Privacy, and Security: Concepts, Methodologies, Tools, and Applications; IGI Global: Hershey, PA, USA, 2019; pp. 1264–1269. [Google Scholar]
  76. Taherdoost, H. Towards an Innovative Model for Cybersecurity Awareness Training. Information 2024, 15, 512. [Google Scholar] [CrossRef]
  77. Maaravi, Y. Using hackathons to teach management consulting. Innov. Educ. Teach. Int. 2020, 57, 220–230. [Google Scholar] [CrossRef]
  78. Briscoe, G.; Mulligan, C. Digital Innovation: The Hackathon Phenomenon. Creativeworks London. 2014. Available online: http://qmro.qmul.ac.uk/xmlui/handle/123456789/11418 (accessed on 28 December 2025).
  79. NIST. Framework for Improving Critical Infrastructure Cybersecurity, Version 1.1; Technical Report; National Institute of Standards and Technology: Gaithersburg, MD, USA, 2018. [CrossRef]
  80. Strom, B.E.; Applebaum, A.; Miller, D.P.; Nickels, K.C.; Pennington, A.G.; Thomas, C.B. MITRE ATT&CK: Design and Philosophy; Technical Report; MITRE Corp: Bedford, MA, USA, 2018. [Google Scholar]
  81. Al-Shaer, R.; Spring, J.M.; Christou, E. Learning the Associations of MITRE ATT&CK Adversarial Techniques. arXiv 2020, arXiv:2005.01654. [Google Scholar] [CrossRef]
  82. Damart, S.; David, A.; Klasing Chen, M.; Laousse, D. Turning managers into management designers: An experiment. In Proceedings of the XXVIIème Conférence de l’AIMS, Montpellier, France, 6–8 June 2018. [Google Scholar]
  83. Sagarin, B.J.; Mitnick, K.D. The path of least resistance. In Six Degrees Of Social Influence: Science, Application, and the Psychology of Robert Cialdini; Oxford University Press: New York, NY, USA, 2012; pp. 27–38. [Google Scholar]
  84. Fiske, S.T.; Taylor, S.E. Social Cognition: From Brains to Culture; Sage: Thousand Oaks, CA, USA, 2013. [Google Scholar]
  85. Muscanell, N.L.; Guadagno, R.E.; Murphy, S. Weapons of influence misused: A social influence analysis of why people fall prey to internet scams. Soc. Personal. Psychol. Compass 2014, 8, 388–396. [Google Scholar] [CrossRef]
  86. Mohammed, A.M.; Idris, B.; Saridakis, G.; Benson, V. Information and communication technologies: A curse or blessing for SMEs? In Emerging Cyber Threats and Cognitive Vulnerabilities; Elsevier: Amsterdam, The Netherlands, 2020; pp. 163–174. [Google Scholar]
  87. Shah, M.H.; Jones, P.; Choudrie, J. Cybercrimes prevention: Promising organisational practices. Inf. Technol. People 2019, 32, 1125–1129. [Google Scholar] [CrossRef]
  88. Khoja, Z.; Dintakurthy, Y. Gamification of Cybersecurity Education for K-12 Teachers. J. Comput. Sci. Coll. 2025, 40, 70–79. [Google Scholar]
  89. Videnovik, M.; Filiposka, S.; Trajkovik, V. A novel methodological approach for learning cybersecurity topics in primary schools. Multimed. Tools Appl. 2025, 84, 22949–22969. [Google Scholar] [CrossRef]
  90. Xu, Y.; Li, H. Cybersecurity Matters for Primary School Students: A Scoping Review of the Trends, Challenges, and Opportunities. IEEE Trans. Learn. Technol. 2025, 18, 513–529. [Google Scholar] [CrossRef]
  91. Beel, J.; Langer, S.; Genzmehr, M.; Gipp, B.; Nürnberger, A. A Comparison of Offline Evaluations, Online Evaluations, and User Studies in the Context of Research-Paper Recommender Systems. In Proceedings of the 19th International Conference on Theory and Practice of Digital Libraries (TPDL), Poznań, Poland, 14–18 September 2015. [Google Scholar]
  92. Laupper, E.; Balzer, L.; Berger, J.L. Online vs. Offline Course Evaluation Revisited: Testing the Invariance of a Course Evaluation Questionnaire Using a Multigroup Confirmatory Factor Analysis Framework. Educ. Assess. Eval. Account. 2020, 32, 481–498. [Google Scholar] [CrossRef]
  93. Mirkovic, J.; Peterson, P.A.H. Class Capture-the-Flag Exercises. In Proceedings of the 2014 USENIX Summit on Gaming, Games, and Gamification in Security Education (3GSE 14), San Diego, CA, USA, 8 August 2014. [Google Scholar]
  94. Rath, J.; Ischi, M.; Perkins, D. Evolution of different dual-use concepts in international and national law and its implications on research ethics and governance. Sci. Eng. Ethics 2014, 20, 769–790. [Google Scholar] [CrossRef]
  95. Al-Tawil, T.N. Ethical implications for teaching students to hack to combat cybercrime and money laundering. J. Money Laund. Control 2024, 27, 21–33. [Google Scholar] [CrossRef]
  96. Morgan, S.; Goel, S. Improving Ethics Surrounding Collegiate-Level Hacking Education: Recommended Implementation Plan. MCA J. 2024, 7, 8. [Google Scholar]
  97. Hynninen, T. Student Perceptions of Ethics in Cybersecurity Education. In Proceedings of the Conference on Technology Ethics (Tethics), Turku, Finland, 18–19 October 2023. [Google Scholar]
  98. Chew, J. A Study on Ethical Hacking in Cybersecurity Education Within the United States. Ph.D. Thesis, California Polytechnic State University, San Luis Obispo, CA, USA, 2024. [Google Scholar]
Figure 1. Taxonomy of IS security threats (inspired from Loch et al. [10]).
Figure 2. IS targeted by the students during a tutorial on external threats.
Figure 3. Tutorial on external threats—find the cat. Ransom demand with hidden data. “I took your cat 1000 euro in cash not talk to police or I kill him.”.
Figure 4. Tutorial on insider threats—phish me if you can. Students in the experimental laboratory with copy-protection walls.
Figure 5. Tutorial on insider threats—phish me if you can. Phishing email with identified trust-eliciting areas (inspired by Arduin [43]).
Figure 6. Fake barcode (a): students have to hack a laptop. Door lock with programmable code (b): students have to deceive people.