Article

Information Security Awareness in the Insurance Sector: Cognitive and Internal Factors and Combined Recommendations

1
Consultant Process & Information Management, Dux Group, 3011 TA Rotterdam, The Netherlands
2
Management Sciences and Marketing, The University of Manchester, Manchester M15 6PB, UK
*
Author to whom correspondence should be addressed.
Information 2024, 15(8), 505; https://doi.org/10.3390/info15080505
Submission received: 18 July 2024 / Revised: 3 August 2024 / Accepted: 15 August 2024 / Published: 21 August 2024

Abstract

Cybercrime is developing rapidly, increasing the demand for information security knowledge. Attackers are becoming more sophisticated and complex in their assault tactics. Employees are a focal point since humans remain the ‘weakest link’ and are vital to prevention. This research investigates which cognitive and internal factors influence information security awareness (ISA) among employees, through quantitative empirical research using a survey conducted at a Dutch financial insurance firm. The research question of “How and to what extent do cognitive and internal factors contribute to information security awareness (ISA)?” has been answered, using the theory of situation awareness as the theoretical lens. The constructs of Security Complexity, Information Security Goals (InfoSec Goals), and SETA Programs (security education, training, and awareness) significantly contribute to ISA. The most important research recommendations are to seek novel explanatory variables for ISA, to further investigate the roots of Security Complexity and what influences InfoSec Goals, and to venture into qualitative and experimental research methodologies for more depth. The practical recommendations are to (1) minimize the complexity of information security topics (e.g., by contextualizing them more for specific employee groups) and (2) integrate these simplifications into various SETA methods (e.g., gamification and online training).

1. Introduction

Cyber-attacks pose a growing threat to information security as data usage, digital footprints, and (internet) consumption rates rise continuously [1,2,3]. Since the COVID-19 pandemic, cyberthreat volumes have increased by 25% [4,5,6]. It is predicted that by 2025, 40% of cybersecurity programs will utilize socio-behavioral principles influencing security culture across organizations due to the ineffectiveness of traditional security awareness programs [7]. Gartner expects the share of cybersecurity programs that have deployed socio-behavioral principles (such as nudge techniques) to influence security culture across the organization to grow from less than 5% in 2021 to 40% in 2025 [3]. This shift is supported by recent academic research, which highlights the significant role of incorporating behavioral and social science principles to enhance cybersecurity outcomes, underscoring that traditional technical defences alone are insufficient [8]. Human aspects remain present in most data breaches and phishing incidents, yet security awareness efforts are often too static to prepare technologists and others for effective security decision-making. Many breaches result from preventable human behaviors, emphasising the need for security behavior and awareness programs [9,10,11,12]. Thus, attention should be focused on holistic behavior and change programs rather than just compliance-based awareness efforts [7,10,13,14].
Organizations are encouraged to adopt new secure behaviors for cyber resilience due to the increasing sophistication of cyberattacks. Information is one of the most valuable assets of an organization, which is why employees are obligated to undergo various forms of education and other training efforts. As a result, organizations often use standardized security education, training, and awareness (SETA) programs to develop ISA, defined as user awareness of and commitment to information security guidelines [15,16,17]. SETA efforts, being top-down and supply-driven, fail to account for internal characteristics and personal behavior in ISA development. Information Security Policies (ISP) serve as controls, but their effectiveness is compromised when employees lack sufficient awareness, potentially leading to violations of the policies. Raising ISA is thus seen as a crucial first step [18,19,20]. Employees must also be aware of security risks to form an effective first line of defense. A narrow focus on technical aspects is insufficient given the multidisciplinary nature of information security, where human aspects are crucial. Hence, ISA is diverse, requiring various awareness capabilities to address different threat categories, underscoring its ongoing importance. Effective education approaches for continuous ISA serve as crucial complements to regular monitoring efforts. The financial and insurance industry, the second most attacked sector, faces a need for robust security awareness efforts despite its professionalism and compliance obligations [21,22].
Recent Information System (IS) literature predominantly focuses on security behavior, policies, violation and compliance, and education tools for security awareness, with limited attention given to ISA [23]. Current research lacks an emphasis on ISA; factors such as individual awareness, cognition, and organizational culture remain underexplored. Notably, there is a scarcity of IS security literature relating to cognitive and learning mechanisms, such as behaviorism, cognitivism, or constructivism. Opportunities for future research include exploring the promotion of learning and awareness programs and expanding the understanding of ISA capabilities [24]. Hence, a more concentrated focus on these factors is needed in IS research. There is a recognized need for more research on ISA, considering factors like industry, organizational types, employee roles [25,26,27,28,29], personality traits [14,30], and cognitive and behavioral mechanisms [25,31,32,33]. Additionally, there is no common understanding of the decisive factors of ISA [32,33]. Hence, this research distinguishes itself by exploring cognitive and internal organizational mechanisms, departing from traditional behavioral theories, and focusing on ISA as a dependent variable to understand its formulation for improving the effectiveness of the ‘human firewall.’ Overall, the broader literature reflects the need for additional research, particularly focused on understanding how ISA shapes itself and its role in establishing the ‘human firewall,’ making it academically and practically relevant.
This research seeks to develop a validated theory to enhance ISA in the financial and insurance sectors, with broader applicability. The goal is to understand employees’ mental models regarding security threats, enabling more effective ISA education. The insights obtained can strengthen the ‘human firewall’ across different industries. The research emphasizes addressing ‘how’ to engage employees in ISA education. The findings can be used to tailor security awareness programs, modify processes like onboarding, and target ISA education to meet organizational needs [10,34].
This study makes academic contributions, particularly in the insurance and financial sectors, by examining internal factors influencing ISA. A smaller group of research papers investigates antecedents of ISA [19,35,36,37]; most existing research focuses on ISP compliance or violations, neglecting the understanding of ISA among employees [25,38,39,40,41,42,43,44,45]. This study addresses the gap by delving into several employee factors, providing a unique approach to building a research and conceptual model. With this in mind, Haeussinger and Kranz [36] argue that further research is needed to seek antecedents of ISA; they emphasize institutional, individual, and socio-environmental antecedents as well. Furthermore, Hwang et al. [19] echo this call by stressing the need to research cognitive and perceptual factors in how ISA is accomplished. The findings can be valuable for researchers in IS security and other social science fields, offering insights applicable in various contexts and types of awareness beyond information security.

2. Literature Review

A deep understanding of the antecedents of ISA is essential for ensuring the security of organizations and employees. In the 1990s, scholars began recognizing the significance of human factors in information security, moving beyond a purely technological focus [46]. Since then, behavioral information security has emerged as a field with distinct research streams in need of further exploration, including researching ISA [47]. ISA is defined as “a state of having knowledge allowing a person to: (1) recognize a threat when one is encountered, (2) assess the type and the magnitude of damage it can cause, (3) identify what vulnerabilities a threat can exploit, (4) identify what countermeasures can be employed to avoid or mitigate the threat, (5) recognize one’s responsibilities for threat reporting and avoidance, and (6) implement recommended protective behaviours” ([48], p. 109). This aligns with the holistic frameworks of Maalem Lahcen et al. [49] and Stanton et al. [50], which also encompass human factors and behavioral cybersecurity.

2.1. Information Security Behavior

Proper and compliant information security behavior is achieved by transforming awareness into correct behavioral intentions, as evidenced by several studies [25,38,39,40,41,42,43,44,51,52,53]. The investigation of employee behavior in the context of IS has advanced considerably this century. A prevalent theme within the domain of information security behavior is the concept of ISPs, which employees either adhere to or violate. The above-referenced extensive research has aimed to understand the predictors and motivations behind employee behavior. Internal factors such as self-efficacy and attitude have been found to play significant roles in shaping compliance behavior. Table 1 provides an overview of (1) studies’ research objectives, (2) the role of self-efficacy and/or attitude, and (3) the role of ISA.
In analyzing ten empirical studies, it is apparent that while only two did not include awareness as a construct [51,54], all ten demonstrated the direct (included in the research question) or indirect (used as a core element but not in the research question) impact of ISA on compliant behavior. ISA positively influences individuals’ attitudes towards secure behavior and their motivation for security goals, crucial for predicting information security behavior or compliance. Employees’ perceptions of misbehavior consequences also clearly shape compliance intentions [25,40]. ISA emerges as a fundamental determinant of employees’ adherence to information security practices, underscoring the importance of maintaining sufficient awareness levels [38,53,55]. As per the references in Table 1, ISA consistently shows a positive impact on secure behavior. These findings align with our research model, which focuses on awareness. Fear has also been studied in the context of explaining information security behavior. It is often indirectly addressed through the concept of deterrence, which instils doubt or fear of potential consequences for non-compliance. Research has shown that deterrence affects malicious computer abuse and can heighten perceptions of threat severity and vulnerability [28,56]. Furthermore, fear-inducing threats that employees are not yet aware of should be covered in more depth so that their perceived severity is acknowledged [26]. Fear can also lead to attitudinal ambivalence towards ISPs, influencing protection-motivated behavior. The lack of security awareness contributes to this ambivalence and subsequent misbehavior [29]. Similarly, building employee awareness of information security threats is essential for promoting secure behavior [57,58]. Moreover, a greater presence of deterrence increases awareness of ISP and security measures [56].
Overall, while fear plays a role in the broader context of information security, our research model specifically highlights the critical importance of ISA. The insights from both Table 1 and the additional studies relating to deterrence and fear appeals underscore the necessity of maintaining high levels of awareness to shape employee behavior and attitudes towards security measures. Hence, there is a critical need to improve awareness programs to shape the attitudes and behavior of employees in any organization regarding information security.

2.2. From Information Security Behavior to Awareness

Theoretical perspectives have been utilized to explain security behavior in the existing literature. However, these theories may not align well with the context of this research, which focuses on awareness rather than behavior. The most commonly used theories to explain behavior include the (a) Theory of Reasoned Action/Planned Behavior [35,36,37,59,60,61,62,63], (b) Technology Acceptance Model [59], (c) General Deterrence Theory [61,62,63], and (d) Protection Motivation Theory [54,61,62,64,65,66,67,68]. These studies provide context for the situation awareness framework selected for this study. Additionally, the social cognitive theory and social learning theory, particularly focusing on self-efficacy, have been used to explain behavior and ISA in studies [19,35,38,41,52,68,69]. Furthermore, theories such as relational awareness [35], affective absorption and affective flow [70], routine activity theory [66], social judgment theory [71], collectivism [72], and situation awareness [42,73] have been explored.
However, following the suggestions of Lebek et al. (2014) [74], this research argues that while most of these theories may explain behavior well, they may not adequately explain awareness. Research is needed to develop measures and process models that influence ISA, rather than relying solely on established relationships from commonly used theories. Notably, situation awareness is one of the few exceptions in explaining ISA, highlighting a shift towards dissecting awareness rather than behavior.
Situation awareness (SA), as defined by Endsley [75], involves perceiving, comprehending, and projecting elements in the environment to make informed decisions. Initially developed for aviation, SA applies to various dynamic situations, including everyday activities like walking on the street. In environments with constantly changing security risks, such as those in IS, SA becomes crucial [75]. The individual must recognize relevant knowledge and important environmental facts to make safe judgments [76]. SA comprises three levels: (1) perception, (2) comprehension, and (3) projection. Perception involves sensing environmental attributes, while comprehension involves understanding them, and projection involves foreseeing future developments [73,75].
Factors like information-processing mechanisms and memory influence SA [73]. Experimental studies explore the impact of information richness in SETA programs and phishing experienced with situation ISA, respectively [42,73]. This research differs from these studies in terms of methodology and included constructs. SA has been utilized in the cybersecurity literature for proposed frameworks but is less explored at the individual level [77,78,79]. Furthermore, Kannelønning and Katsikas [20] suggest that there is room for the further exploration of situation awareness at the level of individual employees within (cyber)security environments.
Endsley’s [75] definition of ISA aligns with the essence of SA’s three levels. While SA’s application in the IS literature is limited, this study explores new perspectives on awareness influences, motivating its inclusion in the IS literature.

2.3. Information Security Awareness

Bulgurcu et al. [25] (p. 532) combined general ISA and ISP awareness, defining it as “an employee’s knowledge of information security and their organization’s ISP requirements”. This highlights the two key dimensions. Hanus et al.’s [17] (p. 109) research adopts a broader definition based on security awareness: “A state of having knowledge allowing a person to: (1) recognize threats, (2) assess their impact, (3) identify exploitable vulnerabilities, (4) employ countermeasures, (5) recognize reporting responsibilities, and (6) implement protective behaviours.” This definition emphasizes that ISA goes beyond possessing knowledge, requiring the application of that knowledge. Furthermore, it aligns with Siponen’s [15] view. ISA is described as a cognitive state of mind [62]. Additionally, ISA involves knowledge about information security, influenced by experiences or external sources [25]. ISA is positioned as the foundation of SETA [67] and is crucial for information security compliance [19].
Research into ISA and its antecedents has been conducted, with Haeussinger and Kranz [38,39] categorizing them into (1) individual, (2) institutional, and (3) socio-environmental predictors. Subsequent studies [35,42,80] built on these dimensions, exploring awareness antecedents further. Jaeger [41] added a fourth dimension: technological factors. The majority of studies contain institutional (management and firm) antecedents [35,40,42,45,72,81,82], such as leadership and information and education channels.
The individual dimension is equally represented, although its antecedents are more varied, covering topics such as personality traits and demographic factors, including age, predicting ISA [45,55,83,84,85]. Additional individual antecedents include self-efficacy cues and behavioral traits like the tendency for risky behavior [38,41,52,64].
In Appendix A, an extensive table details all the studies and the dimensions utilized for their research constructs. Other studies examine awareness in the context of severity and susceptibility awareness [37] and countermeasure and threat awareness [66].
This research stresses the need to address human information processing, decision-making, the needs of employees, security culture, and the cognitive mechanisms that generate awareness, as these aspects are currently underexplored in the existing studies according to researchers [21,45,86]. This research aims to contribute to filling that gap to provide a better understanding of how to enhance ISA.

2.4. Proposition Development & Research Model

In this research, five propositions are developed: three individual propositions (Negative Experience, SETA Program, and InfoSec Goals) and two task and environmental propositions (Security Complexity and SETA Design). The following subsections detail the five propositions based on the literature.

2.4.1. Negative Experience

Endsley [75,87] suggests that experiencing an environment contributes to the development of expectations regarding novel events. Experience is a key factor influencing the aforementioned information-processing mechanisms. While work experience may not directly impact eventual behavior [44], individuals with greater security experience utilize their ISA by recognizing specific familiar cues. This indicates that experience fosters the development of schemata and mental models, enabling individuals to identify patterns that contribute to awareness [42]. Additionally, negative experiences with incidents affect one’s awareness, and experience also eventually affects one’s behavior [36]. In this research, negative experience is defined as past encounters with any kind of information security incidents such as worms, viruses, and phishing attacks in both private and work contexts [36,42]. This assertion is echoed in another study, which demonstrates that experiences with breaches positively influence security-related behavior, with behavior being determined both directly and indirectly by ISA [88]. Frank and Kohn [89] posited that when co-workers share information security experiences, it could raise awareness.
Proposition 1.
Negative experiences positively influence information security awareness.

2.4.2. SETA Program

SETA is an overarching term encompassing programs or activities that involve awareness, training, and education. Awareness should be created to stimulate and motivate individuals to understand what is expected of them regarding information security and organizational measures/policies, providing them with knowledge. Training is a process aimed at teaching individuals a skill or demonstrating how to use specific tools. Education entails more in-depth schooling as a career development initiative or to support the aforementioned tools [90]. In this research, SETA (programs) is defined as “any endeavour that is undertaken to ensure that every employee is equipped with the information security skills and information security knowledge specific to their roles and responsibilities by using practical instructional methods” [91] (p. 250). In addition to internal organizational objectives, SETA initiatives are influenced by external authorities such as regulatory and compliance drivers [92]. General observations regarding the positive effects of SETA are apparent, such as a reduction in weak passwords and an increase in ISP compliance [93]. There is a clear negative relationship between SETA and cybersecurity-related incidents [94], such as a reduction in phishing susceptibilities [95]. Other studies also found that SETA efforts raise awareness and discourage IS misuse and risks among employees [40,93]. Again, in general, SETA programs raise the knowledge of employees [96].
A variety of educational tools have been utilized, including online games and short animation films [97] and gamification, whose effectiveness has been examined [98,99,100]. Furthermore, the media and message types are important and have different levels of effectiveness [101], which are linked to internal personality traits [102]. Furthermore, it is important to integrate cognitive and psychological factors, such as cultural and cognitive bias [86], psychological ownership [103], and motivation to process information systematically and cognitively [104]. However, not all SETA programs are currently adequate, and many require more socio-behavioral principles.
Training efforts are one of the individual factors impacting SA [75,87]. Within these efforts, the essence extends beyond training alone: receiving information for better and more conscious decision-making [19] is vital for comprehension and for having the right amount of knowledge to handle future situations (Projection). Effective SETA programs instil confidence in employees to address security threats [103]. Consequently, the impact of previous general security training undertaken by an employee becomes apparent. Therefore, this research posits that security education and training initiatives, collectively referred to as the SETA program, exert a positive influence on ISA.
Proposition 2.
A SETA program positively affects information security awareness.

2.4.3. InfoSec Goals

Goals largely affect SA by directing attention and shaping the perception and interpretation of information, leading to comprehensive understanding. According to Endsley, a pilot’s goals, such as defeating enemies or ensuring survival, impact SA through either a top-down or bottom-up process [75]. In a top-down process, a person’s goals and objectives give direction to what environmental elements a person will be aware of and notice, shaping SA and informing comprehension. Conversely, in a bottom-up process, environmental cues may alter goals for safety reasons. Meaning is given when that information is put in a context related to one’s goals, resulting in the individual acting upon activities aligned with that goal. While both processes are effective in dynamic environments, the top-down approach is often more applicable in information security contexts. This research posits that goals are an extension of organizational commitment, with employees demonstrating greater attentiveness to information security when their goals align with those of their organization [105]. Goo et al. [106] and Liu et al. [43] support the positive impact of affective and organizational commitment on compliance intention and ISP compliance. Hence, this research defines InfoSec Goals as feelings of identification with, attachment to, and involvement in organizational InfoSec performance. InfoSec Commitment manifests as valuing one’s role in organizational information security, taking personal responsibility for organizational InfoSec performance, and dedication to remaining competent in it [107]. Davis et al. [107] use this description to define InfoSec Commitment. However, as this research uses the SA theory, it aligns with this theory by using the term goals rather than commitment. InfoSec Commitment serves as a foundation for InfoSec Goals; in this research, the transformation from InfoSec Commitment to InfoSec Goals involves careful consideration of the underlying principles.
Specifically within the information security context, InfoSec Commitment reflects involvement, attachment, and identification with the information security performance of a firm [107]. Furthermore, Cavallari [108] posited that motivation and intent have a positive effect on compliant behavior in shaping the information security plan of the organization. Motivation, conveyed from managers to employees, shapes the overarching goals of information security [108]. Similarly, in later studies, it was found that commitment and support for the organization have a positive effect on compliant behavior [109].
To conclude, these insights align with Endsley’s framework [75], indicating that commitment to information security involves recognizing opportunities to act safely and acting upon these values, primarily through top-down processing.
Proposition 3.
InfoSec goals positively affect information security awareness.

2.4.4. Security Complexity

Stress factors, encompassing both physical and social-psychological aspects, can significantly influence SA [75,110]. Physical stressors, such as noise, fatigue, and lighting, as well as social psychological stressors like fear, anxiety, and time pressure, can narrow an individual’s field of attention, potentially causing them to overlook critical elements. Endsley highlights that stress can disrupt Perception, the first level of SA, and early decision-making stages [110]. It is found that lower job stress correlates with ISA, leading to higher employee productivity [111]. Conversely, information security stress, particularly work overload, can decrease ISA and lead to non-compliance with information security policies (ISPs) [112,113]. This finding is shared by another study; security fatigue symptoms make employees prone to ignoring ISPs and minimize their security efforts [114].
Complexity also plays a crucial role in SA, representing the number of system components, their degree of interaction, and the rate of change. Security warnings, which call employees’ attention to potential threats, add a layer of complexity to SA by requiring more mental effort to process [42]. This complexity can increase the mental workload and diminish SA, especially in an information security context, where employees must navigate complex security issues and ISP requirements. Complexity is closely linked with security-related stress, requiring employees to invest additional time and effort to understand and address security challenges, such as technical jargon, security issues, and ISPs. Complexity is one of the stress-related factors contributing to ISP violations [115]. Hence, complexity in this research is defined as situations where information security requirements and threats are perceived as complex and force employees to invest time and effort in learning and understanding information security, which can lead to stress (the result of an interaction between external environmental stimuli and an individual’s reactions to them). This definition builds on other studies [112]. We posit that complex security requirements are difficult to follow. This could result in work overload, which aligns with other findings [112,113]. This in turn can lead to emotion-focused coping strategies such as moral disengagement, which may increase the likelihood of non-compliant behavior [115]. Although awareness was not explicitly addressed in that study, this research has established, as discussed earlier, that awareness is a driver of behavior. Therefore, the research posits that complexity also influences awareness. In summary, stress and complexity are intertwined factors that influence ISA. This research argues that minimized security efforts result from inadequate levels of ISA, affected by complexity including stress-related factors, as noted by Endsley [75].
Proposition 4.
Security complexity has a negative impact on information security awareness.

2.4.5. SETA Design

In the context of SA, the effectiveness of an individual’s perception and comprehension is influenced by two key design elements: system design and interface design [75,87]. System design refers to how information is gathered and presented to the individual, such as the display of data by aircraft systems to pilots. This design aspect plays a crucial role in shaping SA by determining the amount and manner of information available to the individual [87,116]. On the other hand, interface design focuses on how information is presented to the user, including factors like quantity, accuracy, and compatibility with the user’s needs [75]. A study discusses how effective interface design reduces the cognitive load by tailoring information to specific situations and adapting to current conditions, thereby enhancing situational awareness [111]. To optimize SA, design guidelines should prioritize aligning with SA goals rather than focusing solely on technological aspects. Critical cues should be emphasized in interface designs to facilitate effective SA mechanisms [75].
In the realm of information security, system and interface design act as intermediaries through which employees receive security-related information via SETA methods. Various SETA methods are employed, such as text-based, video-based, or game-based delivery, but their effectiveness can vary significantly [73,99,101,102]. Some studies suggest that SETA initiatives can be perceived as burdensome by employees, leading to careless completion [100,103]. To link this to the research gap, since each SETA method can have a different impact on employees, it is possible that one method may be more effective than others in raising awareness. SETA is a tool used to raise awareness among employees. The importance of SETA has only increased over time, with the emphasis shifting from its mere use to its effectiveness. Another observation is that there is no universal perfect SETA; it has versatile delivery methods, and not all methods work the same in all contexts and demographics. It has also been posited that effectiveness is dependent on the target audience [113]. They argue that certain methods are most popular, and a combination of methods could be effective; however, passive methods remain ineffective [113]. Furthermore, there are still challenges in differentiating between the various types of methods and programs and assessing their effectiveness regarding knowledge, attitude, intention, and behavioral outcomes at both individual and organizational levels [117]. The lack of differentiation is attributed to an approach to ISA measurement that is broad and heterogeneous, making it difficult to determine specific effects of different SETA programs. This results in the definition of SETA Design as the various delivery methods and techniques, with their own attributes and characteristics, used to transmit information security knowledge to employees for educational purposes, which builds on the work of the previously mentioned studies [75,113].
Thus, the effectiveness of different SETA methods varies, impacting ISA [102]. This was also underscored by Nwachukwu et al. [118], although more research is required to understand the effects of different methods, considering how information is presented to employees (system design) and the quality, quantity, and relevance of the information provided (interface design) and how it shapes ISA.
Proposition 5.
Conforming to the SETA design interventions positively affects information security awareness.
Based on the constructed research variables, the following model in Figure 1 represents the research in a visual manner.

3. Materials and Methods

This research applies a critical literature review [119,120]. The literature review is descriptive in nature; the focus is on the identification of the relevant perspectives of ISA [121]. The search focused on papers from 1995 onwards, using the keywords “security awareness” and “information security awareness”. The literature review facilitated relevant perspectives on ISA and formed the foundation for the research model.
This research is an empirical study. The main research methodology is quantitative—cross-sectional survey research (N = 156) at one of the largest Dutch insurance firms. The pilot survey, shared with a diverse group of employees, resulted in an improved survey, especially in terms of brevity and accessibility, with a reduced number of questions. The survey questions are based on the literature, supplemented with survey-specific questions. Perceptual questions were measured using a seven-point Likert scale, while other questions had dichotomous and categorical answers. Table 2 provides an overview of all items.
The survey was conducted anonymously and in Dutch, reflecting the main language of the organization. The survey was distributed via email to various groups and through a newsletter to 317 employees; thus, convenience sampling was used. Following Roscoe’s [122] guidelines, the sample includes two subgroups (non-IT employees and IT employees and managers), each of which includes over 30 subjects, except for managers (N-managers = 26); this influences the external validity of the research. The average time needed for the respondents to complete the survey was 12 min, and the data were collected in the first two weeks of December 2022.
The literature suggests targeting specific groups for SETA initiatives due to the inadequacy of generic activities [123,124]. Furthermore, an existing research gap concerns whether managers and employees from the same firm differ in their security behavior [20]. Hence, two perspectives will be considered: management/non-management and non-IT/IT roles. Research indicates that IT-related personnel generally have higher awareness levels, regardless of industry specifics [8,12]. Therefore, the study will examine these perspectives, each comprising two groups, to discover differences for targeted and effective SETA initiatives, subsequently enhancing the overall ISA.

3.1. Data Analysis

The collected data are analyzed in IBM SPSS (version 29.0.2.0) by means of descriptive and inferential statistics. For the inferential part, different techniques have been used. The correlation between construct items has been demonstrated by means of a Pearson’s r correlation matrix. Correlations also assisted in assessing possible multicollinearity [125], which has been checked with both the correlation matrix and the variance inflation factor diagnostic [126]. To test the effect of the propositions and their significance, a multiple linear regression has been used, with a significance level of 0.05. Before the regressions were conducted, the assumptions associated with multiple linear regression were checked and corrective measures were taken. Subsequently, separate regressions for the subgroups of the two group perspectives have been conducted to compare the groups with each other, with an adjusted significance level to prevent family-wise errors. Additionally, independent sample t-tests and non-parametric Mann–Whitney U tests have been utilized to compare the means of the different groups [127].
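To illustrate the correlation and multicollinearity checks described above, the following is a minimal sketch in Python rather than the SPSS procedure used by the authors; the file name and construct column names are hypothetical placeholders.

```python
# A minimal sketch (not the authors' SPSS procedure) of the correlation and
# multicollinearity checks. File and column names are hypothetical placeholders.
import pandas as pd
from statsmodels.stats.outliers_influence import variance_inflation_factor
from statsmodels.tools.tools import add_constant

df = pd.read_csv("construct_scores.csv")  # hypothetical export of construct scores
predictors = ["seta_program", "security_complexity", "negative_experience",
              "infosec_goals", "seta_design"]

# Pearson's r correlation matrix across the predictors and the dependent variable.
print(df[predictors + ["isa"]].corr(method="pearson").round(3))

# Variance inflation factors; values close to 1 indicate no problematic
# multicollinearity among the predictors.
X = add_constant(df[predictors])
vif = pd.Series(
    [variance_inflation_factor(X.values, i) for i in range(1, X.shape[1])],
    index=predictors,
)
print(vif.round(2))
```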

3.2. Reliability & Validity

Cronbach’s Alpha is used for the internal reliability of measurement items. Techniques contributing to construct validity are the compound use of factor loading and average variance extracted. A confirmatory factor analysis (CFA) has been conducted in SmartPLS, rather than an exploratory factor analysis (EFA), as the propositions are supported by theory, and the focus of this research is not on discovering factors [128,129].
The factor analysis is used for convergent and discriminant validity, subsets of construct validity. Construct validity indicates that measurement items are strongly correlated with their research construct. Discriminant validity is indicated when items correlate weakly with the items of other constructs while correlating strongly with the items of their own construct (critical value: equal to or higher than 0.5). The square root of the average variance extracted, following the Fornell and Larcker criterion [130], has been applied.
To address potential Common Method Variance (CMV) bias, the Harman’s single factor test is conducted. This step ensures that the results are not influenced by methodological biases in the data collection process.
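As an illustration of the reliability and bias checks named in this subsection, the sketch below computes Cronbach’s Alpha per construct and approximates Harman’s single factor test with the first unrotated principal component; the item file and column-name prefixes are hypothetical, and this is not the authors’ SPSS/SmartPLS procedure.

```python
# A minimal sketch of Cronbach's Alpha and Harman's single factor test.
# The item file and column-name prefixes are hypothetical placeholders.
import pandas as pd
from sklearn.decomposition import PCA

items = pd.read_csv("survey_items.csv")  # hypothetical file of Likert item scores

def cronbach_alpha(item_df: pd.DataFrame) -> float:
    # alpha = k/(k-1) * (1 - sum of item variances / variance of the summed scale)
    k = item_df.shape[1]
    item_vars = item_df.var(axis=0, ddof=1).sum()
    total_var = item_df.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

for prefix in ["seta_prog", "complexity", "goals", "seta_design", "isa"]:
    cols = [c for c in items.columns if c.startswith(prefix)]
    print(prefix, round(cronbach_alpha(items[cols]), 2))

# Harman's single factor test, approximated with the first unrotated principal
# component: its explained-variance share should stay below the 50% threshold.
standardized = (items - items.mean()) / items.std(ddof=1)
share = PCA(n_components=1).fit(standardized).explained_variance_ratio_[0]
print(f"First factor explains {share:.1%} of total variance")
```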

4. Results

4.1. Descriptive Statistics

The survey reached an n of 156. All sample size targets mentioned in the previous section were met, except for one. The groups were IT, non-IT, managers, and non-managers. The subset of management did not reach the preferred minimum of participants—26 instead of 30. Thus, all calculations for this group are based on 26 cases, which influences the external validity of the research. The sample’s largest groups are the age brackets of 25–34 and 45–54. Employees under 25 and employees of 65 and older constitute a small minority. Regarding the subgroups, the following frequencies are present: IT employees (54.5%) exceed the number of non-IT employees (45.5%). For the level of participants, 130 respondents are non-management (83.3%), and 26 are managers (16.7%).
In terms of construct means, Table 3 presents the means of the combined items belonging to each construct, along with their standard deviations, for all constructs.
ISA and InfoSec Goals have high means, M = 6.35 and M = 6.53, respectively. SETA Design carries the highest variability (SD = 1.14).
Regarding negative experiences, 10.3% of respondents reported having experienced malware, while the remaining 89.7% did not. The incidence of phishing is higher, with 19.2% of respondents having encountered phishing, compared to 80.8% who have not.

4.1.1. Malware Experience

The frequency of malware experience is similar between IT and non-IT respondents (seven and nine, respectively). Among management, two respondents reported malware experience, whereas fourteen non-managers did. This notable difference is expected given the smaller number of managers. Approximately 11% of non-managers have experienced malware, compared to approximately 8% of managers. Despite these observations, the differences for these two perspectives are minimal in terms of percentage points.

4.1.2. Phishing Experience

Phishing is a more prevalent issue compared to malware. Approximately 23% of non-IT employees have experienced phishing, in contrast to 16% of IT employees. Among management, nearly 27% have been targeted by phishing, compared to almost 18% of non-managers. These findings indicate that the differences in phishing experience percentages are more pronounced than those for malware. Non-IT employees and managers are more frequently affected by phishing than their counterparts.

4.1.3. Measurement Item Statistics

The statistical analysis of the survey provides insight into the central tendencies and variability of the items. Table 4 demonstrates the means and standard deviation for each measurement item.

4.2. Reliability & Validity of Results

This section details the reliability and validity of the survey instrument. It covers correlation, internal consistency by means of Cronbach’s Alpha, and, subsequently, the convergent and discriminant validity.
Cronbach’s Alpha was calculated to assess reliability for each construct. Values of 0.6 and above are acceptable [131]. Although values below 0.7 are generally questioned, all item-total correlations exceeded 0.3, meeting the criteria [132]. All constructs are reliable, with the following values: SETA Program α = 0.64, Security Complexity α = 0.63, InfoSec Goals α = 0.90, SETA Design α = 0.83, and ISA α = 0.89.
A correlation matrix (Appendix B) revealed significant associations between ISA and all dependent variables except Negative Experience (r = 0.045). Potential multicollinearity issues were assessed with a Variance Inflation Factor (VIF) analysis, confirming no high multicollinearity (Appendix C) [133,134], meeting one of the assumptions for linear regression.
For the convergent and discriminant validity, the Kaiser–Meyer–Olkin (KMO) test indicated sampling adequacy for factor analysis (KMO = 0.875, p < 0.001), meeting Kaiser’s (1974) criterion of >0.80. Confirmatory factor analysis in SmartPLS confirmed convergent validity, with AVE values exceeding 0.5, per Fornell and Larcker [130]. See Appendix D for detailed results.
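For illustration, the Fornell and Larcker checks applied here can be sketched as follows: the AVE per construct is computed from the CFA loadings (convergent validity, threshold 0.5), and the square root of each AVE is compared with that construct’s correlations with all other constructs (discriminant validity, as described in Section 3.2). The loadings file, scores file, and column names are hypothetical placeholders; the study’s actual figures are in Appendix D.

```python
# A minimal sketch of the convergent/discriminant validity checks.
# `loadings` (standardized CFA loadings) and `scores` are hypothetical inputs.
import numpy as np
import pandas as pd

loadings = pd.read_csv("cfa_loadings.csv", index_col="item")  # columns: construct, loading
scores = pd.read_csv("construct_scores.csv")                  # construct-level scores

# Convergent validity: AVE per construct (mean squared loading) should exceed 0.5.
ave = loadings.groupby("construct")["loading"].apply(lambda l: (l ** 2).mean())
print(ave.round(2))

# Discriminant validity (Fornell-Larcker): sqrt(AVE) must exceed the construct's
# correlations with every other construct.
corr = scores[list(ave.index)].corr().abs()
for c in ave.index:
    max_r = corr[c].drop(c).max()
    verdict = "passes" if np.sqrt(ave[c]) > max_r else "fails"
    print(f"{c}: sqrt(AVE)={np.sqrt(ave[c]):.2f}, max |r|={max_r:.2f} -> {verdict}")
```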
The Harman’s single factor test for potential Common Method Variance bias showed that a single factor explained 32.738% of the total variance. Since this is well below the 50% threshold, it suggests that CMV is unlikely to significantly affect the results. The full results are detailed in Appendix E.

4.3. Inferential Statistics

In this section, the tested propositions will be demonstrated with an additional group comparison. Before the propositions were tested, the assumptions of (multiple) linear regression were checked (with the exception of multicollinearity, since that has already been checked for in the previous section). Furthermore, additional comparisons were executed between the four different demographic groups (IT employees, non-IT employees, managers, and non-managers).
The assumptions of linearity, the independence of errors, and the normal distribution of errors have been satisfactorily met. Linearity was confirmed through scatter plots, and the independence of errors was validated with a Durbin–Watson statistic of 2.24. Despite some noticeable deviations at the lower and upper ends of the distribution, the P-P plot indicated a generally normal distribution of residuals. However, the assumption of homoscedasticity was violated due to a cone-shaped pattern in the residuals plot. To address this, a weighted regression was applied to correct the heteroscedasticity, using weights based on logarithmically transformed unstandardized residuals.
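The correction described here can be sketched as follows, under the assumption (not fully specified in the text) that the weights are derived from a regression of the log-transformed absolute residuals on the predictors; variable and file names are hypothetical placeholders.

```python
# A minimal sketch of the heteroscedasticity correction: OLS first, a
# Durbin-Watson check, then WLS with weights from log-transformed residuals.
# The exact weighting scheme of the original analysis is an assumption here.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson

df = pd.read_csv("construct_scores.csv")  # hypothetical construct scores
predictors = ["seta_program", "security_complexity", "negative_experience",
              "infosec_goals", "seta_design"]
X = sm.add_constant(df[predictors])
y = df["isa"]

ols_fit = sm.OLS(y, X).fit()
print("Durbin-Watson:", round(durbin_watson(ols_fit.resid), 2))  # independence of errors

# Predict the log absolute residuals and use their inverse (squared) as weights.
aux = sm.OLS(np.log(np.abs(ols_fit.resid)), X).fit()
weights = 1.0 / np.exp(aux.fittedvalues) ** 2

wls_fit = sm.WLS(y, X, weights=weights).fit()
print(wls_fit.summary())
```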

4.3.1. Proposition Results

To test the propositions and regress the predictors on ISA, the following model has been used for the weighted regression:
Information Security Awareness = β0 + β1 × SETA Program + β2 × Security Complexity + β3 × Negative Experience + β4 × InfoSec Goals + β5 × SETA Design + ε
The regression model examines the factors that influence ISA. The coefficients (β1 to β5) indicate the influence of each independent variable on ISA. The error term (ε) captures the variation in ISA that is not explained by the independent variables. The regression analysis shows how these factors collectively determine employees’ ISA.
The regression model predicted approximately 41% of the variance in ISA based on an R2 of 0.413; the R2adj value of 0.394 indicates that approximately 39% of the variation in ISA is jointly explained by the predictors. Thus, roughly 61% of the variation in ISA remains to be explained by other variables. The model summary indicates that the overall model is significant, F(5, 150) = 21.15, p < 0.001.
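As a consistency check, assuming the full sample of N = 156 and k = 5 predictors (i.e., the F(5, 150) degrees of freedom reported above), R2adj = 1 − (1 − R2)(N − 1)/(N − k − 1) = 1 − (1 − 0.413)(155/150) ≈ 0.393, consistent with the reported value of 0.394 once rounding of R2 is taken into account.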
Proposition 1.
Negative experience positively affects information security awareness.
The regression indicated that negative experience has no significant effect on ISA (β = 0.058, t = 0.477, p = 0.634); the p-value exceeds the 5% significance cut-off. This proposition is thus not supported.
Proposition 2.
A SETA program positively affects information security awareness.
It was found that SETA Programs in general significantly predict ISA (β = 0.136, t = 2.402, p = 0.018). Since the associated p-value is less than the 0.05 significance level, it is concluded that SETA Programs have a significant positive impact on ISA. Hence, if the SETA Program score increases by one unit, ISA increases by 0.136 units on average.
Proposition 3.
Information security goals positively affect information security awareness.
InfoSec Goals positively contribute to ISA (β = 0.222, t = 3.089, p = 0.002). Since 0.002 < 0.05, it means that this proposition is significant and supported. The coefficient of InfoSec Goals is positive—that is to say, with a one-unit increase, the ISA will increase by 0.222 on average.
Proposition 4.
Security complexity has a negative impact on information security awareness.
It was found that complexity has a negative contribution to ISA (β = −0.249, t = −7.807, p < 0.001). This contribution was found to be significant, and this proposition is supported. It implies that when complexity increases by one unit, ISA decreases by 0.249 units on average.
Proposition 5.
Conforming to the SETA design interventions positively affects information security awareness.
SETA Design as a construct was not found to be significant (β = −0.038, t = −1.431, p = 0.155), indicating that SETA interventions that coincide with employees’ information needs and use various techniques do not significantly contribute to ISA, since the associated p-value exceeds 0.05.
Table 5 presents the regression analysis results for all propositions, with significance levels indicating whether each proposition is supported.
Based on the research model, the sequence of impact on ISA (from strongest to weakest) is: Security Complexity, InfoSec Goals, and SETA Program (ceteris paribus). Figure 2 demonstrates the path coefficients, where each value indicates the strength of the relationship. From this, theoretical and practical implications follow in the next section.

4.3.2. Additional Separate Regressions per Group

Separate regressions were conducted for each group (IT, non-IT, managers, and non-managers). This was executed to identify potential group differences in the relationship between each independent variable and ISA. However, a series of comparisons increases the possibility of a Type I error. An alpha level of 0.05 was used in the prior section, but it cannot be used for this section due to the increased risk of errors [135]. Therefore, the Bonferroni Correction is applied to prevent family-wise error, which would otherwise have a 26% chance of occurring. For the following comparisons, an alpha level below 0.008 is used. Although this value is close to 0.01, using a less conservative value also helps prevent Type II errors.
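For context, with m = 6 comparisons at α = 0.05, the family-wise error rate is 1 − (1 − 0.05)^6 ≈ 0.265, i.e., the 26% chance referred to above, and the Bonferroni-adjusted level is 0.05/6 ≈ 0.0083, consistent with the cut-off of 0.008 used here; the value m = 6 is inferred from these reported figures rather than stated explicitly.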
Each of the three different perspectives includes two groups. All models were significant. The adjusted R2 values of the separate models range between 0.337 and 0.452. For a concise overview, Table 6 presents the unstandardized beta coefficients and p-value indicators.

4.3.3. Additional Mean Comparisons of ISA per Group

Lastly, the means of ISA were compared for each group. The assumptions for the t-tests were checked first. The cases in the sample are independent of each other and all represent individual employees, thus meeting the first assumption of independent observations. The IT and non-IT groups have sample sizes of ≥30, so under the Central Limit Theorem the sampling distribution of the mean can be treated as approximately normal; this assumption is therefore also met. The group of managers does not meet this assumption (n = 26); hence, the non-parametric Mann–Whitney U test was conducted for this group, since normality cannot be assumed.
The third assumption of homogeneity of variances was checked using Levene’s test. The IT and non-IT groups meet this assumption (F = 0.126, p = 0.723).
IT vs. non-IT employees: The t-test revealed no significant differences in ISA between IT (M = 6.36, SD = 0.70) and non-IT employees (M = 6.33, SD = 0.57). The associated two-sided p-value was 0.802.
Managers vs. non-Managers: The differences between managers (Mean Rank = 80.21) and non-managers (Mean Rank = 78.16) were also not significant (U = 1645.5, z = −0.213, p = 0.831).
Based on the conducted t-tests and Mann–Whitney U tests, no significant differences were found in ISA between IT and non-IT employees and managers and non-managers. This implies that, within the sample and variables studied, these demographic factors did not influence the awareness of information security.
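The group comparisons reported in this subsection can be reproduced in outline as follows; the group indicator columns and file name are hypothetical, and this is a sketch rather than the SPSS output used by the authors.

```python
# A minimal sketch of the Levene, t-test, and Mann-Whitney U comparisons.
# Column names ("is_it", "is_manager", "isa") are hypothetical placeholders.
import pandas as pd
from scipy import stats

df = pd.read_csv("construct_scores.csv")  # hypothetical construct scores

it = df.loc[df["is_it"] == 1, "isa"]
non_it = df.loc[df["is_it"] == 0, "isa"]
mgr = df.loc[df["is_manager"] == 1, "isa"]
non_mgr = df.loc[df["is_manager"] == 0, "isa"]

# Homogeneity of variances, then an independent-samples t-test for IT vs. non-IT.
print(stats.levene(it, non_it))
print(stats.ttest_ind(it, non_it, equal_var=True))

# Non-parametric comparison for the small managers group (n = 26).
print(stats.mannwhitneyu(mgr, non_mgr, alternative="two-sided"))
```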

5. Discussion

This section presents the key findings of this research by interpreting them and relating them to the literature. Afterward, the limitations of this research, the implications for theory and practice, and future research directions are discussed.

5.1. Key Findings

This research investigated the factors contributing to ISA based on an SA perspective. The literature review identified several antecedents forming a unique research model: SETA Program, Negative Experience, Security Complexity, SETA Design, and InfoSec Goals. Significant contributions to ISA were found from Security Complexity, SETA Program, and InfoSec Goals [36,45,56,67]. The study also used two group perspectives, as recommended by the literature, to tailor SETA efforts in order to improve ISA [20]. Additionally, it explored ISA differences across groups and their significance. The findings for each antecedent and their impacts are discussed below.
Contrary to expectations, prior negative experiences did not significantly affect employees’ ISA. This finding deviates from two studies referred to in this research, which found that negative experiences enhanced ISA [36,88]. A possible explanation is that cyber-attacks have become more sophisticated over time [136], making past experiences less relevant against current threats. The descriptive statistics indicated that managers and non-IT employees encountered more negative experiences compared to their counterparts. This finding suggests that the role of past experiences in shaping ISA may diminish over time as threats become more sophisticated. Hence, it emphasizes the need for continuous updates to SETA efforts.
As for InfoSec Goals, they positively contributed to ISA. Employees who care about and engage in their firm’s information security tend to have higher ISA. This aligns with the SA theory, where comprehension is based on environmental perceptions [75]. Employees with strong InfoSec Goals develop a comprehensive understanding of information security and consider it a priority, leading to greater attentiveness and caution regarding potential risks [107]. This finding is consistent with the literature emphasizing the importance of employee commitment and involvement in information security [43,105,106]. This research adds to the literature by linking InfoSec Goals to ISA.
A remarkable finding was that Security Complexity had a significant negative impact on ISA, with a strong adverse effect and a moderate negative correlation (r(154) = −0.49). Security Complexity was the only factor significant across all group perspectives and individual groups, with the highest negative impact observed in employees with lower salary grades (β = −0.290, p < 0.001). An increased workload due to higher complexity not only negatively affects SA by jeopardizing perception but also negatively affects IS compliance intentions [137], which is similar to our findings. This finding underscores the importance of minimizing Security Complexity for improved ISA. This research also aligns with the findings in other research [42] on phishing and on job stress [111], showing that complexity generally reduces ISA and can lead to lower productivity and security fatigue [40,112,114].
SETA programs were found to have a significant contribution to ISA, although smaller than that of Security Complexity and InfoSec Goals. This aligns with previous studies highlighting the positive impact of SETA on ISA [19,40,93,94,95,103]. According to the SA theory, training primarily enhances comprehension, providing employees with the necessary knowledge to handle potential risks. Thus, SETA remains a critical and standardized component of information security for employees.
On the other hand, SETA Design, encompassing system and interface design, is complex to interpret. The construct measures employees’ experiences with methods (system design) and the quality and amount of information provided (interface design) to form their perception. While it was expected that better methods and information would increase ISA, the results were insignificant, likely due to varied employee opinions, reflected in the wide data variance (SD = 1.14). Effective SETA Design should match employees’ needs and foster long-term awareness, which may vary across different employee groups. Thus, while SETA is generally effective, its methods and information transfer require further investigation to foster long-term awareness. This finding is in line with other research [23].
In terms of mean comparisons, there was no significant difference in ISA for the group perspectives, as indicated. Had differences been present, the separate regression models could have been analysed further to find factors explaining why the levels of ISA differ, based on the antecedents, their significance levels, and the strength of the regression coefficients.

5.2. ISA and Findings in Relation to SA

In Figure 3, ISA is visually demonstrated from an SA perspective. Under the taxonomy, the three significant variables are placed, showing their relationship with the three levels of SA.
First, Security Complexity is positioned under level 1 because, according to SA theory, it poses a threat to the first level, although it can also extend to levels 2 and 3. This positioning highlights the importance of minimizing complexity as a primary focus for ISA, which aligns with the goal of this research.
Second, InfoSec Goals are placed under comprehension and projection. Based on the literature, goals impact how attention is focused and how information is processed and interpreted to form comprehension. InfoSec Goals enable the employee to use the perceived elements from level 1 to assess the threat in the context of the organization/firm and understand the potential damage (level 3, projection). Thus, InfoSec Goals have a vital role in shaping ISA.
Lastly, SETA Program spans the entire taxonomy horizontally, indicating its contribution to all levels and forming a foundational basis. This implies that SETA Programs are crucial for level 1, which fuels the remaining two levels. At the same time, SETA Program for level 1 can be contrasted with Security Complexity and its potential to jeopardize level 1. This practical implication will be further explored in the next sections.

5.3. Research Limitations

This research is subject to several limitations, both methodological and theoretical. Methodologically, the reliability of measurement items posed a challenge. Although the Cronbach’s Alpha value of all items together was 0.828 (20 measurement items), the items of Security Complexity and SETA Program were 0.629 and 0.641, respectively. This indicates potential room for improvement. Additionally, the use of convenience sampling, while chosen for practical reasons, limits the external validity and generalizability of the findings. Notably, the underrepresentation of managers in the sample highlights a potential bias that could have been mitigated with a more robust sampling technique.
Additionally, the measurement of ISA and the other constructs was based on perceptions. The data obtained from the survey responses were self-reported, which can lead to an overestimation of capabilities. Research indicates that employees often overestimate their ISA, with a greater likelihood of the Dunning–Kruger effect as threats become more complex [138]. This can be a limitation and is addressed in the research recommendations in the next section. The last limitation concerns theoretical development and the content of SA theory. Endsley details information-processing mechanism limitations such as attention and short-, working-, and long-term memory [75]. These structures were not explicitly integrated into the development and execution of this research.
Having established the significant factors influencing ISA and the limitations of this research, the next section details the theoretical implications of these findings.

5.4. Implications for Theory

The previous section has already discussed the extent to which the findings align with prior research. This study contributes to theory by applying SA theory at the employee level within the context of information security. While existing research on SA and security has focused on cyber SA and organizational levels such as management, networks, and firms, this research introduces novel constructs (SETA Design, InfoSec Goals, and Security Complexity) and propositions to explain ISA, which can serve as a foundation for future research. By shedding light on the cognitive factors influencing awareness in information security, this study provides new insights into the field of behavioral information security. Moreover, it opens avenues for critique and refinement, fostering new perspectives. As ISA is a crucial starting point for information security behavior, this research enriches the underexplored theoretical investigation of ISA.

5.5. Implications for Practice

This research has also led to several practical implications. In this section, four main practical implications will be covered and positioned in the context of other relevant studies.
Practical Implication I—Simplify Complexity: Minimizing complexity is crucial, given its strong negative impact on ISA in this research. Beyond content and materials, complexity encompasses factors such as workload, pressure, and knowledge gaps. Organizations can improve level 1 awareness by simplifying communication, avoiding technical jargon, and ensuring that security education is accessible and relevant to specific employee groups. Tailoring security initiatives to different real-life contexts and employee roles enhances relevance and understanding. For instance, employees with specialized business knowledge can better grasp security concepts within their own work domains. This aligns with other findings [139], which suggest that individuals may continue engaging in risky behavior if they do not perceive the threat as undesirable and the solution as practical, rather than responding to a rational fear-based appeal. Hence, positioning SETA content as relevant to employees’ own work could increase that perceived undesirability. Furthermore, from an SA-theory perspective, this makes information more comprehensible. Customized complexity-reduction strategies for various employee groups are essential, since employees can become overwhelmed by all the possible consequences of threats [58]. Additionally, making security training interactive and enjoyable can enhance engagement, and taking employee feedback into account is vital for further improving SETA programs.
Practical Implication II—Enhance InfoSec Goals: Increasing employee commitment to the IS of their organization is vital for strengthening ISA and establishing a robust human firewall, enabling the democratization of ISA. Organizations should prioritize fostering a culture of commitment to IS, aligning it with broader organizational objectives. This can be achieved through strategies such as emphasizing rewards over punishment for compliant behavior and embedding IS culture within the organizational culture. Fear appeals, by contrast, may lose their effect as employees become desensitized to them, whereas commitment and organizational citizenship create a positive reinforcement culture in which employees are rewarded for compliant behavior [109]. In general, organizational culture and organizational commitment are positively related [140]. By integrating InfoSec Goals into the organizational culture, we believe employees are more likely to be intrinsically motivated to prioritize information security.
Practical Implication III—Set priorities for SETA Programs: Organizations should assess the effectiveness of their SETA programs and prioritize areas of improvement based on employee feedback and performance metrics. They should also allocate resources to address other security concerns identified by employees, which includes diversifying SETA methods to accommodate employee preferences. This implication aligns with other research [141], which reaches the same conclusion.
Practical Implication IV—Use target groups for SETA practices: Organizations that want to interpret the results of their SETA practices (such as phishing simulations) in more detail can group employees and examine the differences between groups. Based on the Negative Experience results from the data, managers fell for phishing attempts more often than non-managers (27% vs. 18%). It is difficult to explain why this is the case, but it implies that such differences can be investigated further per group and that practices can be tailored to each target group based on its results. This recommendation is in line with another study, in which the differences between employees and management of the same organisation were investigated [20].

5.6. Future Research Directions

The research offers several recommendations for future studies. The adjusted R² indicated that all variables collectively explain 39% of the variation in ISA, which implies that researchers can seek additional constructs to explain ISA. Several of the recommendations below follow from this.
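For completeness, the adjusted R² referred to here follows the standard definition, with n the number of respondents and p the number of predictors:

$$ R^2_{\mathrm{adj}} = 1 - \left(1 - R^2\right)\frac{n-1}{n-p-1} $$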
Further exploration of InfoSec Goals is recommended, considering its significance in this study. Research could delve into how these goals are formed and influenced, examining their relationship with information security climate and culture. Building on other findings, investigating organizational, cognitive, personal, and cultural factors can provide valuable insights into improving information security behavior [106,142]. Furthermore, it was found that top management can influence the behavior of employees by forming goal-oriented organizational cultures [142]. This matches the findings of this research and simultaneously calls for new studies to yield novel findings. These findings [142] are also stressed as a research recommendation in a more recent study [108], which constructed several propositions, based on pre-existing work, concerning how managers’ behavior and organizational culture affect compliant behavior.
For Security Complexity, there is potential for deeper exploration through the lens of psychological distance. Research in this area, inspired by earlier work [143], could illuminate how employees’ psychological distance to information security incidents influences their perceptions of complexity. Incorporating psychological perspectives can enhance understanding and yield practical implications for minimizing obstacles to ISA.
In terms of methodology, qualitative approaches are recommended, particularly for complex constructs like SETA Design, to gain deeper insights beyond quantitative surveys. Experimental research could also offer more direct measurements of awareness, especially for abstract constructs like ISA, since subjects cannot then adjust their reported behavior. Alternatively, ISA could be measured by means of the Human Aspects of Information Security Questionnaire (HAIS-Q) [144]. This is recommended because, as noted earlier, perception-based ISA measures can lead to overestimated capabilities.
Conducting research in diverse firms across various sectors and sizes is advised to enrich the field of ISA. This allows for the broader application of theoretical frameworks across different domains.
Scholars are also encouraged to explore alternative research models that account for high correlations between the explanatory constructs, since the correlations between SETA Design and SETA Program, r(154) = 0.66, p < 0.01, and between SETA Program and InfoSec Goals, r(154) = 0.57, p < 0.01, raised concerns about potential multicollinearity. Different models with varied relationships between constructs can offer new insights into ISA.
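As a hedged illustration of how such concerns could be screened before re-specifying a model, the sketch below computes pairwise correlations and variance inflation factors for the explanatory constructs. The DataFrame and column names are assumptions for demonstration only.

```python
# Illustrative sketch: multicollinearity screening of the explanatory constructs.
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

df = pd.read_csv("survey_construct_scores.csv")    # hypothetical construct scores
predictors = df[["SETA", "COMP", "NEG", "GOAL", "SETAD"]]

print(predictors.corr().round(2))                  # pairwise Pearson correlations

X = sm.add_constant(predictors)                    # intercept needed for meaningful VIFs
vif = pd.Series(
    [variance_inflation_factor(X.values, i) for i in range(1, X.shape[1])],
    index=predictors.columns,
)
print(vif.round(2))                                # values above ~5 warrant attention
```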

6. Conclusions

This research investigated which individual factors directly influence an employee’s ISA. To identify and form the antecedents, the theory of SA was utilized as the theoretical lens. Subsequently, a survey was developed to quantitatively test the following main research question: “How and to what extent do cognitive and internal factors contribute to information security awareness (ISA)?”.
  • Research Constructs & Method: Five explanatory constructs were incorporated in the research to answer the research question, and their direct relationships with ISA were tested by means of weighted regression analyses. The constructs are Negative Experience, SETA Program, InfoSec Goals, Security Complexity, and SETA Design.
  • Impact of constructs on ISA:
    InfoSec Goals: Have a positive impact on ISA, demonstrating that employees with a strong commitment to information security are more aware.
    SETA Program: Has a positive impact on ISA by providing necessary education and training.
    Security Complexity: Has a negative impact on ISA, indicating that higher complexity in security measures can decrease awareness and increase cognitive overload.
  • Significant Insights: Security Complexity emerged as the most significant factor, followed by InfoSec Goals and SETA Program, aligning with the three levels of the SA framework.
  • Group Analyses:
    Additional group analyses and separate regressions have been conducted with the help of two group perspectives, (1) IT and non-IT employees and (2) managers and non-managers.
    The separate regressions, evaluated at a stricter significance level, demonstrated that Security Complexity makes a significant contribution for each group.
    Mean comparisons did not yield notable findings in ISA per group.
This research has established novel constructs and propositions to explain how cognitive and internal factors influence employees’ ISA. These findings can inform future studies and help organizations, ultimately enhancing their overall security posture.

Author Contributions

Conceptualization, M.D. and E.B.; methodology, M.D. and E.B.; software, M.D.; validation, M.D.; formal analysis, M.D. and E.B.; investigation, M.D.; data curation, M.D.; writing—original draft preparation, M.D.; writing—review and editing, E.B.; visualization, M.D.; supervision, E.B. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available in this article.

Acknowledgments

We highly appreciate the opportunity to conduct this research in a large Dutch insurance firm that places a strong focus on security awareness.

Conflicts of Interest

The authors declare no conflicts of interest.

Appendix A

Table A1. Summary of ISA-focused research—Antecedent Dimensions: INST = institutional, IND = individual, ENV = (socio-)environmental, TECH = technological. Unit of Analysis: employee/end user. Dimensions derived from studies [36,81].
StudyINSTINDENVTECHTheoretical Lens/Angle to Explain AwarenessObservation
[52] XX Social Learning Theory-
[36]XXX Combines elements of general deterrence theory and social psychology-
[35]X X Relational Awareness-
[80]XX Innovation Diffusion Theory-
[42] X XSituation AwarenessExperimental, phishing context. Personality trait integration.
[37]X Leadership StylesConsistent results: leadership positively influences ISA
[63]XX Theory of Planned Behavior
[19]X Social Learning TheoryEducational information and channels positively influence ISA
[82]X Theory of Reasoned Action
[45]X Organizational and security culture-
[84] X Big Five Personality Traits ModelConsistent and similar results in demographic factors
[83] X
[55] X Demographic differences
[72] X Collectivism-
[85] XX Demographic attributes and socioeconomic resources-

Appendix B

Table A2. Pearson’s Correlation Matrix.

| Construct | SETA | COMP | NEG | GOAL | SETAD | ISA |
| SETA Program (SETA) | -- | | | | | |
| Security Complexity (COMP) | −0.102 | -- | | | | |
| Negative Experience (NEG) | 0.152 | 0.152 | -- | | | |
| InfoSec Goals (GOAL) | 0.568 ** | −0.112 | 0.160 * | -- | | |
| SETA Design (SETAD) | 0.661 ** | 0.040 | 0.100 | 0.453 ** | -- | |
| Information Security Awareness (ISA) | 0.402 ** | −0.493 ** | 0.045 | 0.550 ** | 0.184 * | -- |
**. Correlation is significant at the 0.01 level (two-tailed). *. Correlation is significant at the 0.05 level (two-tailed).

Appendix C

Table A3. Collinearity Diagnostics.

| Construct (Model 1) | Tolerance | VIF |
| SETA | 0.460 | 2.172 |
| Complexity | 0.930 | 1.075 |
| Negative Experience | 0.937 | 1.068 |
| InfoSec Goals | 0.654 | 1.529 |
| SETA Design | 0.540 | 1.852 |
Dependent Variable: ISA.
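As a reading aid for the table, tolerance and VIF follow their standard definitions, where R²_j is the R² obtained by regressing predictor j on all other predictors (a general formula, not specific to this study):

$$ \mathrm{Tolerance}_j = 1 - R_j^2, \qquad \mathrm{VIF}_j = \frac{1}{\mathrm{Tolerance}_j} $$

For example, the SETA row satisfies 1 / 0.460 ≈ 2.17, matching the reported VIF of 2.172.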

Appendix D. Convergent & Discriminant Validity

Initially, the Kaiser–Meyer–Olkin (KMO) test was used to assess the adequacy of sampling for factor analysis. A KMO value exceeding 0.80 indicates excellent suitability for factor analysis [145]. In this study, the KMO value for all items collectively was 0.875, and Bartlett’s Test of Sphericity returned a significant result (p < 0.001).
Subsequently, confirmatory factor analysis was conducted in SmartPLS. Convergent validity was evaluated using the Average Variance Extracted (AVE), which, according to the criterion of Fornell and Larcker [130], should exceed 0.5. The table below presents the square root of the AVE values on the diagonal.
Table A4. Square Root of AVE Values and Inter-Construct Correlations (diagonal values are the square roots of the AVE).

| Construct | AVE | COMP | GOAL | ISA | SETA | SETAD |
| COMP | 0.568 | 0.753 | | | | |
| GOAL | 0.858 | −0.145 | 0.926 | | | |
| ISA | 0.658 | −0.495 | 0.607 | 0.811 | | |
| SETA | 0.579 | −0.286 | 0.582 | 0.53 | 0.761 | |
| SETAD | 0.601 | −0.012 | 0.468 | 0.259 | 0.557 | 0.775 |
The table shows that all AVE values exceed 0.5 and that each construct’s square root of AVE (the diagonal value) exceeds its correlations with the other constructs in the corresponding column and row. Hence, convergent and discriminant validity are indicated.
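For readers who want to reproduce the sampling-adequacy checks reported above, a minimal sketch is given below. It assumes the `factor_analyzer` Python package and an item-level DataFrame with one column per survey item; the file name is a placeholder.

```python
# Illustrative sketch: KMO and Bartlett's test of sphericity before factor analysis.
import pandas as pd
from factor_analyzer.factor_analyzer import calculate_bartlett_sphericity, calculate_kmo

items = pd.read_csv("survey_items.csv")             # hypothetical item-level responses

chi_square, p_value = calculate_bartlett_sphericity(items)
kmo_per_item, kmo_overall = calculate_kmo(items)

print(f"Bartlett's test: chi2 = {chi_square:.1f}, p = {p_value:.4f}")
print(f"Overall KMO = {kmo_overall:.3f}")            # > 0.80 indicates good adequacy
```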

Appendix E. Harman’s Single Factor Test Results

The results of Harman’s single-factor test indicated that a single factor accounted for 32.738% of the variance. CMV values below 50% are considered acceptable, so the value obtained here falls within the acceptable range [146]. A minimal sketch for reproducing this check is shown after Table A5.
Table A5. Principal Component Analysis—Total Variance Explained.

| Component | Initial Eigenvalues: Total | % of Variance | Cumulative % | Extraction Sums of Squared Loadings: Total | % of Variance | Cumulative % |
| 1 | 7.202 | 32.738 | 32.738 | 7.202 | 32.738 | 32.738 |
| 2 | 3.646 | 16.571 | 49.309 | | | |
| 3 | 1.677 | 7.625 | 56.933 | | | |
| 4 | 1.134 | 5.155 | 62.088 | | | |
| 5 | 1.044 | 4.745 | 66.833 | | | |
| 6 | 0.912 | 4.144 | 70.977 | | | |
| 7 | 0.766 | 3.483 | 74.460 | | | |
| 8 | 0.686 | 3.119 | 77.579 | | | |
| 9 | 0.647 | 2.941 | 80.520 | | | |
| 10 | 0.584 | 2.655 | 83.175 | | | |
| 11 | 0.555 | 2.521 | 85.696 | | | |
| 12 | 0.465 | 2.115 | 87.811 | | | |
| 13 | 0.432 | 1.962 | 89.773 | | | |
| 14 | 0.387 | 1.758 | 91.532 | | | |
| 15 | 0.358 | 1.629 | 93.161 | | | |
| 16 | 0.307 | 1.396 | 94.557 | | | |
| 17 | 0.306 | 1.392 | 95.949 | | | |
| 18 | 0.239 | 1.084 | 97.033 | | | |
| 19 | 0.215 | 0.978 | 98.012 | | | |
| 20 | 0.196 | 0.890 | 98.902 | | | |
| 21 | 0.160 | 0.728 | 99.629 | | | |
| 22 | 0.082 | 0.371 | 100.000 | | | |
Extraction Method: Principal Component Analysis.
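A minimal sketch of the Harman’s single-factor check, assuming standardized items and the same hypothetical item-level DataFrame as above, could look as follows; a single unrotated component explaining less than 50% of the variance is taken as evidence against severe common method variance.

```python
# Illustrative sketch: Harman's single-factor test as a common-method-variance check.
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

items = pd.read_csv("survey_items.csv")            # hypothetical item-level responses
X = StandardScaler().fit_transform(items)          # mirror correlation-matrix extraction

share = PCA(n_components=1).fit(X).explained_variance_ratio_[0] * 100
print(f"First unrotated component explains {share:.1f}% of the variance")
```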

Appendix F

Table A6. Basic Concepts of INTRODUCTION.

| Term | Definition | Reference |
| Cybersecurity | “Prevention of damage to, protection of, and restoration of computers, electronic communications systems, electronic communications services, wire communication, and electronic communication, including information contained therein, to ensure its availability, integrity, authentication, confidentiality, and nonrepudiation”. | [147] |
| Cyber resilience | “Cyber resilience refers to the ability to continuously deliver the intended outcome despite adverse cyber events.” | [148] (p. 2) |
| Information security | “The protection of information, which is an asset, from possible harm resulting from various threats and vulnerabilities”. | [149] (p. 4) |
| Information security policies (ISPs) | Information security policies are formalized documents that outline the rules and guidelines for protecting an organization’s information and technology resources. These policies aim to ensure that employees understand their roles and responsibilities in maintaining the security of the organization’s information systems. | [25] |
| Information systems | “Information systems are interrelated components working together to collect, process, store, and disseminate information to support decision making, coordination, control, analysis, and visualization in an organization.” | [150] (p. 44) |

References

  1. Admass, W.S.; Munaye, Y.Y.; Diro, A. Cyber security: State of the art, challenges and future directions. Cyber Secur. Appl. 2023, 2, 100031. [Google Scholar] [CrossRef]
  2. Thakur, M. Cyber Security Threats and Countermeasures in Digital Age. J. Appl. Sci. Educ. (JASE) 2024, 4, 1–20. [Google Scholar]
  3. Gartner. Top Trends in Cybersecurity for 2024; Gartner: Stamford, CT, USA, 2024; Available online: https://www.gartner.com/en/cybersecurity/trends/cybersecurity-trends (accessed on 23 June 2024).
  4. Borkovich, D.; Skovira, R. Working from Home: Cybersecurity in the Age of Covid-19. Issues Inf. Syst. 2020, 21, 234–246. [Google Scholar] [CrossRef]
  5. Weil, T.; Murugesan, S. IT risk and resilience—Cybersecurity response to COVID-19. IT Prof. 2020, 22, 4–10. [Google Scholar] [CrossRef]
  6. Saleous, H.; Ismail, M.; AlDaajeh, S.H.; Madathil, N.; Alrabaee, S.; Choo, K.K.R.; Al-Qirim, N. COVID-19 pandemic and the cyberthreat landscape: Research challenges and opportunities. Digit. Commun. Netw. 2023, 9, 211–222. [Google Scholar]
  7. Gartner. Top Trends in Cybersecurity 2022; Gartner: Stamford, CT, USA, 2022. [Google Scholar]
  8. Almansoori, A.; Al-Emran, M.; Shaalan, K. Exploring the Frontiers of Cybersecurity Behaviour: A Systematic Review of Studies and Theories. Appl. Sci. 2023, 13, 5700. [Google Scholar] [CrossRef]
  9. Bowen, B.M.; Devarajan, R.; Stolfo, S. Measuring the human factor of cyber security. In Proceedings of the 2011 IEEE International Conference on Technologies for Homeland Security (HST), Waltham, MA, USA, 15–17 November 2011; pp. 230–235. [Google Scholar]
  10. Onumo, A.; Ullah-Awan, I.; Cullen, A. Assessing the moderating effect of security technologies on employees compliance with cybersecurity control procedures. ACM Trans. Manag. Inf. Syst. 2021, 12, 11. [Google Scholar] [CrossRef]
  11. Jeong, C.Y.; Lee, S.Y.T.; Lim, J.H. Information security breaches and IT security investments: Impacts on competitors. Inf. Manag. 2019, 56, 681–695. [Google Scholar]
  12. Alsharida, R.A.; Al-rimy, B.A.S.; Al-Emran, M.; Zainal, A. A systematic review of multi perspectives on human cybersecurity behaviour. Technol. Soc. 2023, 73, 102258. [Google Scholar]
  13. Cram, W.A.; D’Arcy, J. ‘What a waste of time’: An examination of cybersecurity legitimacy. Inf. Syst. J. 2023, 33, 1396–1422. [Google Scholar]
  14. Baltuttis, D.; Teubner, T.; Adam, M.T. A typology of cybersecurity behaviour among knowledge workers. Comput. Secur. 2024, 140, 103741. [Google Scholar]
  15. Siponen, M.T. A conceptual foundation for organizational information security awareness. Inf. Manag. Comput. Secur. 2000, 8, 31–41. [Google Scholar] [CrossRef]
  16. Wang, W.; Harrou, F.; Bouyeddou, B.; Senouci, S.M.; Sun, Y. Cyber-attacks detection in industrial systems using artificial intelligence-driven methods. Int. J. Crit. Infrastruct. Prot. 2022, 38, 100542. [Google Scholar]
  17. Alyami, A.; Sammon, D.; Neville, K.; Mahony, C. Critical success factors for Security Education, Training and Awareness (SETA) programme effectiveness: An empirical comparison of practitioner perspectives. Inf. Comput. Secur. 2024, 32, 53–73. [Google Scholar]
  18. Aldawood, S.; Skinner, G. Reviewing Cyber Security Social Engineering Training and Awareness Programs—Pitfalls and Ongoing Issues. Future Internet 2019, 11, 73. [Google Scholar] [CrossRef]
  19. Hwang, I.; Wakefield, R.; Kim, S.; Kim, T. Security Awareness: The First Step in Information Security Compliance Behaviour. J. Comput. Inf. Syst. 2021, 61, 345–356. [Google Scholar] [CrossRef]
  20. Kannelønning, I.H.; Katsikas, S.K. A systematic literature review of how cybersecurity-related behaviour has been assessed. Inf. Comput. Secur. 2023, 31, 463–477. [Google Scholar]
  21. Gulyás, A.; Kiss, A. Impact of Cyber-Attacks on the Financial Institutions. Procedia Comput. Sci. 2023, 219, 84–90. [Google Scholar] [CrossRef]
  22. Kuraku, D.S.; Kalla, D.; Smith, N.; Samaah, F. Safeguarding FinTech: Elevating Employee Cybersecurity Awareness in Financial Sector. Int. J. Appl. Inf. Syst. (IJAIS) 2023, 12, 43–47. [Google Scholar]
  23. Rohan, R.; Pal, D.; Hautamäki, J.; Funilkul, S.; Chutimaskul, W.; Thapliyal, H. A systematic literature review of cybersecurity scales assessing information security awareness. Heliyon 2023, 9, e08671. [Google Scholar] [CrossRef]
  24. Donalds, B.; Barclay, S. Beyond Technical Measures: A Value-Focused Thinking Appraisal of Strategic Drivers in Improving Information Security Policy Compliance. Eur. J. Inf. Syst. 2021, 31, 58–73. [Google Scholar] [CrossRef]
  25. Bulgurcu, B.; Cavusoglu, H.; Benbasat, I. Information security policy compliance: An empirical study of rationality-based beliefs and information security awareness. MIS Q. 2010, 34, 523–548. [Google Scholar]
  26. Chen, Y.; Galletta, D.F.; Lowry, P.B.; Luo, X.R.; Moody, G.D.; Willison, R. Understanding Inconsistent Employee Compliance with Information Security Policies through the Lens of the Extended Parallel Process Model. Inf. Syst. Res. 2021, 32, 1043–1065. [Google Scholar] [CrossRef]
  27. Fertig, T.; Schütz, A.E.; Weber, K. Current Issues of Metrics for Information Security Awareness. In Proceedings of the 28th European Conference on Information Systems (ECIS), An AIS Conference, Online, 15–17 June 2020. [Google Scholar]
  28. Schuetz, S.W.; Lowry, P.B.; Pienta, D.A. The effectiveness of abstract versus concrete fear appeals in information security. J. Manag. Inf. Syst. 2020, 37, 723–757. [Google Scholar] [CrossRef]
  29. Ng, K.C.; Zhang, X.; Thong, J.Y.L.; Tam, K.Y. Protecting against threats to information security: An attitudinal ambivalence perspective. J. Manag. Inf. Syst. 2021, 38, 732–764. [Google Scholar] [CrossRef]
  30. Cram, W.A.; D’Arcy, J.; Proudfoot, J.G. Seeing the Forest and the Trees: A Meta-Analysis of the Antecedents to Information Security Policy Compliance. MIS Q. 2019, 43, 525–554. [Google Scholar] [CrossRef]
  31. Dhillon, G.; Smith, K.; Dissanayaka, I. Information systems security research agenda: Exploring the gap between research and practice. J. Strateg. Inf. Syst. 2021, 30, 101693. [Google Scholar] [CrossRef]
  32. Ko, A.; Tarján, G.; Mitev, A. Information security awareness maturity: Conceptual and practical aspects in Hungarian organizations. Inf. Technol. People 2023, 36, 174–195. [Google Scholar]
  33. Li, W.; Leung, A.; Yue, W. Where is IT in Information Security? The Interrelationship among IT Investment, Security Awareness, and Data Breaches. MIS Q. 2023, 47, 317–342. [Google Scholar] [CrossRef]
  34. Alahmari, A.; Renaud, K.; Omoronyia, I. Moving Beyond Cyber Security Awareness and Training to Engendering Security Knowledge Sharing. Inf. Syst. E-Bus. Manag. 2023, 21, 123–158. [Google Scholar] [CrossRef]
  35. Ahlan, A.R.; Lubis, M.; Lubis, A.R. Information Security Awareness at the Knowledge-Based Institution: Its Antecedents and Measures. Procedia Comput. Sci. 2015, 72, 361–373. [Google Scholar] [CrossRef]
  36. Haeussinger, F.; Kranz, J. Antecedents of employees’ information security awareness: Review, synthesis, and directions for future research. In Proceedings of the 25th European Conference on Information Systems (ECIS), Guimarães, Portugal, 5–10 June 2017; pp. 1–20. [Google Scholar]
  37. Humaidi, N.; Balakrishnan, V. Leadership styles and information security compliance behaviour: The mediator effect of information security awareness. Int. J. Inf. Educ. Technol. 2015, 5, 311. [Google Scholar]
  38. Al-Omari, A.; El-Gayar, O.; Deokar, A. Information security policy compliance: The role of information security awareness. In Proceedings of the Eighteenth Americas Conference on Information Systems, Seattle, WA, USA, 9–12 August 2012; pp. 1–10. [Google Scholar]
  39. Al-Omari, A.; El-Gayar, O.; Deokar, A. Security policy compliance: User acceptance perspective. In Proceedings of the 2012 45th Hawaii International Conference on System Sciences, Maui, HI, USA, 4–7 January 2012; pp. 1–10. [Google Scholar]
  40. D’Arcy, J.; Hovav, A.; Galletta, D. User awareness of security countermeasures and its impact on information systems misuse: A deterrence approach. Inf. Syst. Res. 2009, 20, 79–98. [Google Scholar]
  41. Guzman, I.R.; Galvez, S.M.; Stanton, J.M.; Stam, K.R. Information Security Awareness and Information Security Practices of Internet Users in Bolivia: A Socio-Cognitive View. RELCASI 2014, 6, 2. [Google Scholar]
  42. Jaeger, L.; Eckhardt, A. Eyes wide open: The role of situational information security awareness for security-related behaviour. Inf. Syst. J. 2021, 31, 429–472. [Google Scholar]
  43. Liu, C.; Wang, N.; Liang, H. Motivating information security policy compliance: The critical role of supervisor-subordinate guanxi and organizational commitment. Int. J. Inf. Manag. 2020, 54, 102152. [Google Scholar]
  44. Alanazi, M.; Freeman, M.; Tootell, H. Exploring the factors that influence the cybersecurity behaviors of young adults. J. Comput. Hum. Behav. 2022, 136, 107376. [Google Scholar] [CrossRef]
  45. Wiley, J.; McCormac, A.; Calic, D. More Than the Individual: Examining the Relationship Between Culture and Information Security Awareness. Comput. Secur. 2020, 88, 101640. [Google Scholar] [CrossRef]
  46. Hitchings, J. Deficiencies of the traditional approach to information security and the requirements for a new methodology. Comput. Secur. 1995, 14, 377–383. [Google Scholar] [CrossRef]
  47. Crossler, R.E.; Johnston, A.C.; Lowry, P.B.; Hu, Q.; Warkentin, M.; Baskerville, R. Future directions for behavioural information security research. Comput. Secur. 2013, 32, 90–101. [Google Scholar] [CrossRef]
  48. Hanus, B.; Windsor, J.C.; Wu, Y. Definition and multidimensionality of security awareness: Close encounters of the second order. ACM SIGMIS Database DATABASE Adv. Inf. Syst. 2018, 49, 103–133. [Google Scholar] [CrossRef]
  49. Maalem Lahcen, R.A.; Caulkins, B.; Mohapatra, R.; Kumar, M. Review and insight on the behavioural aspects of cybersecurity. Cybersecurity 2020, 3, 10. [Google Scholar] [CrossRef]
  50. Stanton, J.M.; Stam, K.R.; Mastrangelo, P.M.; Jolton, J.A. Behavioral information security: An overview, results, and research agenda. Hum. Comput. Interact. Manag. Inf. Syst. 2015, 12, 276–294. [Google Scholar]
  51. Chan, R.Y. K Woon, I.; Kankanhalli, A. Perceptions of Information Security in the Workplace: Linking Information Security Climate to Compliant Behavior. J. Inf. Priv. Secur. 2005, 1, 18–41. [Google Scholar] [CrossRef]
  52. Johnston, A.C.; Wech, B.; Jack, E.; Beavers, M. Reigning in the Remote Employee: Applying Social Learning Theory to Explain Information Security Policy Compliance Attitudes. In Proceedings of the AMCIS 2010, 15–18 August; p. 493.
  53. Duzenci, D.; Kitapci, H.; Gok, M.S. The Role of Decision-Making Styles in Shaping Cybersecurity Compliance Behavior. Appl. Sci. 2023, 13, 8731. [Google Scholar] [CrossRef]
  54. Warkentin, M.; Johnston, A.C.; Shropshire, J. The influence of the informal social learning environment in information security awareness programs. Eur. J. Inf. Syst. 2011, 20, 259–272. [Google Scholar] [CrossRef]
  55. Chua, H.N.; Chua, S.F.; Low, Y.C.; Chang, Y. Impact of Employees’ Demographic Characteristics on the Awareness and Compliance of Information Security Policy in Organizations. Telematics Inf. 2018, 35, 1770–1780. [Google Scholar] [CrossRef]
  56. Luo, X.R.; Li, H.; Hu, Q.; Xu, H. Why Individual Employees Commit Malicious Computer Abuse: A Routine Activity Theory Perspective. J. Assoc. Inf. Syst. 2020, 21. [Google Scholar] [CrossRef]
  57. Shah, P.; Agarwal, A. Cyber Suraksha: A Card Game for Smartphone Security Awareness. Inf. Comput. Secur. 2023, 31, 576–600. [Google Scholar] [CrossRef]
  58. Choi, H.; Park, S.; Kang, J. Enhancing Participatory Security Culture in Public Institutions: An Analysis of Organizational Employees’ Security Threat Recognition Processes. IEEE Access 2024, 12, 47543–47558. [Google Scholar] [CrossRef]
  59. Lebek, B.; Uffen, J.; Breitner, M.H.; Neumann, M.; Hohler, B. Employees’ Information Security Awareness and Behavior: A Literature Review. In Proceedings of the 2013 46th Hawaii International Conference on System Sciences, Wailea, HI, USA, 7-10 January 2013; pp. 2978–2987. [Google Scholar] [CrossRef]
  60. Rocha Flores, W.; Ekstedt, M. Shaping intention to resist social engineering through transformational leadership, information security culture and awareness. Comput. Secur. 2016, 59, 26–44. [Google Scholar] [CrossRef]
  61. Moody, G.D.; Siponen, M.; Pahnila, S. Toward a Unified Model of Information Security Policy Compliance. MIS Q. 2018, 42, 285–312. [Google Scholar] [CrossRef]
  62. Hutchinson, G.; Ophoff, J. A descriptive review and classification of organizational information security awareness research. In Proceedings of the 18th International Information Security Conference 2019, Johannesburg, South Africa, 15 August 2019. [Google Scholar]
  63. Grassegger, T.; Nedbal, D. The role of employees’ information security awareness on the intention to resist social engineering. Procedia Comput. Sci. 2021, 181, 59–66. [Google Scholar]
  64. Jaeger, L.; Eckhardt, A. Making cues salient: The Role of Security Awareness in shaping Threat and Coping Appraisals. In Proceedings of the 25th European Conference on Information Systems (ECIS) 2017, Guimarães, Portugal, 5–10 June 2017; pp. 2525–2535, ISBN 978-0-9915567-0-0. Available online: https://aisel.aisnet.org/ecis2017_rip/5 (accessed on 14 August 2024).
  65. Torten, R.; Reaiche, C.; Boyle, S. The Impact of Security Awareness on Information Technology Professionals’ Behavior. Comput. Secur. 2018, 79, 68–79. [Google Scholar] [CrossRef]
  66. Li, L.; He, W.; Xu, L.; Ash, I.; Anwar, M.; Yuan, X. Investigating the impact of cybersecurity policy awareness on employees’ cybersecurity behaviour. Int. J. Inf. Manag. 2019, 45, 13–24. [Google Scholar]
  67. Hu, S.; Hsu, C.; Zhou, Z. Security Education, Training, and Awareness Programs: Literature Review. J. Comput. Inf. Syst. 2021, 62, 752–764. [Google Scholar] [CrossRef]
  68. Bandura, A.; Walters, R.H. Social Learning Theory; Englewood Cliffs: Prentice Hall, NJ, USA, 1977; Volume 1. [Google Scholar]
  69. Zainal, N.C.; Puad, M.; Sani, N. Moderating Effect of Self-Efficacy in the Relationship Between Knowledge, Attitude and Environment Behavior of Cybersecurity Awareness. Asian Social Science. 2022, 18, 1–55. [Google Scholar]
  70. Ormond, D.; Warkentin, M.; Crossler, R.E. Integrating Cognition with an Affective Lens to Better Understand Information Security Policy Compliance. J. Assoc. Inf. Syst. 2019, 20. [Google Scholar] [CrossRef]
  71. Jensen, M.L.; Durcikova, A.; Wright, R.T. Using susceptibility claims to motivate behaviour change in IT security. Eur. J. Inf. Syst. 2021, 30, 27–45. [Google Scholar] [CrossRef]
  72. Park, E.H.; Kim, J.; Wiles, L. The Role of Collectivism and Moderating Effect of IT Proficiency on Intention to Disclose Protected Health Information. Inf. Technol. Manag. 2023, 24, 177–193. [Google Scholar] [CrossRef]
  73. Shaw, R.S.; Chen, C.C.; Harris, A.L.; Huang, H. The Impact of Information Richness on Information Security Awareness Training Effectiveness. Comput. Educ. 2009, 52, 92–100. [Google Scholar] [CrossRef]
  74. Lebek, B.; Uffen, J.; Neumann, M.; Hohler, B.; Breitner, M.H. Information security awareness and behaviour: A theory-based literature review. Manag. Res. Rev. 2014, 37, 256–276. [Google Scholar] [CrossRef]
  75. Endsley, M.R. Toward a theory of situation awareness in dynamic systems. J. Hum. Factors Ergon. Soc. 1995, 37, 32–64. [Google Scholar] [CrossRef]
  76. Stubbings, L.; Chaboyer, W.; McMurray, A. Nurses’ use of situation awareness in decision-making: An integrative review. J. Adv. Nurs. 2012, 68, 1443–1453. [Google Scholar]
  77. Franke, U.; Brynielsson, J. Cyber situational awareness–a systematic review of the literature. Comput. Secur. 2014, 46, 18–31. [Google Scholar] [CrossRef]
  78. Renaud, J.; Ophoff, J. A cyber situational awareness model to predict the implementation of cyber security controls and precautions by SMEs. Organ. Cybersecur. J. Pract. Process People 2021. [Google Scholar] [CrossRef]
  79. Tianfield, H. Towards integrating a task allocation mechanism into a cyber security situation awareness system. In Proceedings of the Cyber and Information Security Research Conference (CISRC) 2016, Oak Ridge, TN, USA, 5–7 April 2016; pp. 60–66. [Google Scholar]
  80. Alshboul, Y.; Streff, K. Beyond cybersecurity awareness: Antecedents and satisfaction. In Proceedings of the 2017 International Conference on Software and e-Business, Hong Kong, China, 28–30 December 2017. [Google Scholar]
  81. Jaeger, L. Information security awareness: Literature review and integrative framework. In Proceedings of the 51st Hawaii International Conference on System Sciences, Hilton Waikoloa Village, HI, USA, 3–6 January 2018; pp. 4703–4712. [Google Scholar]
  82. Bauer, S.; Bernroider, E.W. From information security awareness to reasoned compliant action: Analyzing information security policy compliance in a large banking organization. ACM SIGMIS Database DATABASE Adv. Inf. Syst. 2017, 48, 44–68. [Google Scholar]
  83. McCormac, A.; Zwaans, T.; Parsons, K.; Calic, D.; Butavicius, M.; Pattinson, M. Individual differences and Information Security Awareness. Comput. Hum. Behav. 2017, 69, 151–156. [Google Scholar] [CrossRef]
  84. van der Schyff, S.; Flowerday, S.V. Proposing a user-centric and context-aware conceptual model for enhancing cybersecurity behaviour. Behav. Inf. Technol. 2021, 40, 354–369. [Google Scholar]
  85. Lyon, G. Informational inequality: The role of resources and attributes in information security awareness. Inf. Comput. Secur. 2024, 32, 197–217. [Google Scholar]
  86. Tsohou, A.; Karyda, M.; Kokolakis, S. Analyzing the role of cognitive and cultural biases in the internalization of information security policies: Recommendations for information security awareness programs. Comput. Secur. 2015, 52, 128–141. [Google Scholar] [CrossRef]
  87. Endsley, M.R. Design and evaluation for situation awareness enhancement. In Proceedings of the Human Factors Society Annual Meeting; Sage Publications: Los Angeles, CA, USA, 1988. [Google Scholar]
  88. Kovačević, A.; Putnik, N.; Tošković, O. Factors related to cyber security behaviour. IEEE Access 2020, 8, 125140–125148. [Google Scholar] [CrossRef]
  89. Frank, M.; Kohn, M. Understanding Extra-Role Security Behaviors: An Integration of the Self-Determination Theory and Construal Level Theory. Computers & Security 2023, 132, 103386. [Google Scholar] [CrossRef]
  90. Peltier, T.R. Information Security Policies, Procedures, and Standards: Guidelines for Effective Information Security Management; CRC Press: Boca Raton, FL, USA, 2005. [Google Scholar]
  91. Amankwa, E.; Loock, M.; Kritzinger, E. A conceptual analysis of information security education, information security training and information security awareness definitions. In Proceedings of the 9th International Conference for Internet Technology and Secured Transactions (ICITST-2014), London, UK, 8–10 December 2014. [Google Scholar]
  92. Tsohou, A.; Kokolakis, S.; Karyda, M. Understanding information security awareness: A systematic literature review. Comput. Secur. 2015, 49, 8–27. [Google Scholar] [CrossRef]
  93. Eminağaoğlu, M.; Uçar, E.; Eren, Ş. The positive outcomes of information security awareness training in companies—A case study. Inf. Secur. Tech. Rep. 2009, 14, 223–229. [Google Scholar] [CrossRef]
  94. Kweon, E.; Lee, H.; Chai, S.; Yoo, K. The Utility of Information Security Training and Education on Cybersecurity Incidents: An empirical evidence. Inf. Syst. Front. 2021, 23, 361–373. [Google Scholar] [CrossRef]
  95. Sikolia, D.; Biros, D.; Zhang, T. How Effective Are SETA Programs Anyway: Learning and Forgetting in Security Awareness Training. J. Cybersecurity Educ. Res. Pract. 2023, 2023. [Google Scholar] [CrossRef]
  96. Alkhazi, B.; Alshaikh, M.; Alkhezi, S.; Labbaci, H. Assessment of the impact of information security awareness training methods on knowledge, attitude, and behaviour. IEEE Access 2022, 10, 132132–132143. [Google Scholar] [CrossRef]
  97. Zhang-Kennedy, L.; Chiasson, S. A Systematic Review of Multimedia Tools for Cybersecurity Awareness and Education. ACM Comput. Surv. 2021, 54, 12. [Google Scholar] [CrossRef]
  98. Silic, M.; Lowry, P.B. Using design-science based gamification to improve organizational security training and compliance. J. Manag. Inf. Syst. 2020, 37, 129–161. [Google Scholar]
  99. Dincelli, E.; Chengalur-Smith, I. Choose your own training adventure: Designing a gamified SETA artefact for improving information security and privacy through interactive storytelling. Eur. J. Inf. Syst. 2020, 29, 669–687. [Google Scholar] [CrossRef]
  100. Emm, D. Gamification—Can it be applied to security awareness training? Netw. Secur. 2021, 4, 16–18. [Google Scholar] [CrossRef]
  101. Abawajy, J. User preference of cyber security awareness delivery methods. Behav. Inf. Technol. 2014, 33, 237–248. [Google Scholar]
  102. Kajzer, M.; D’Arcy, J.; Crowell, C.R.; Striegel, A.; Van Bruggen, D. An exploratory investigation of message-person congruence in information security awareness campaigns. Comput. Secur. 2014, 43, 64–76. [Google Scholar]
  103. Yoo, C.W.; Sanders, G.L.; Cerveny, R.P. Exploring the Influence of Flow and Psychological Ownership on Security Education, Training and Awareness Effectiveness and Security Compliance. Decis. Support Syst. 2018, 108, 107–118. [Google Scholar] [CrossRef]
  104. Puhakainen, P.; Siponen, M. Improving employees’ compliance through information systems security training: An action research study. MIS Q. 2010, 34, 757–778. [Google Scholar]
  105. Chu, A.M.; So, M.K. Organizational information security management for sustainable information systems: An unethical employee information security behaviour perspective. Sustainability 2020, 12, 3163. [Google Scholar]
  106. Goo, J.; Yim, M.-S.; Kim, D.J. A path to successful management of employee security compliance: An empirical study of information security climate. IEEE Trans. Prof. Commun. 2014, 57, 286–308. [Google Scholar]
  107. Davis, J.; Agrawal, D.; Guo, X. Enhancing users’ security engagement through cultivating commitment: The role of psychological needs fulfilment. Eur. J. Inf. Syst. 2023, 32, 195–206. [Google Scholar] [CrossRef]
  108. Cavallari, M. Organizational Determinants and Compliance Behaviour to Shape Information Security Plan. Acad. J. Interdiscip. Stud. 2023, 12, 1. [Google Scholar] [CrossRef]
  109. Vedadi, A.; Warkentin, M.; Straub, D.W.; Shropshire, J. Fostering Information Security Compliance as Organizational Citizenship Behavior. Inf. Manage. 2024, 61, 103968. [Google Scholar] [CrossRef]
  110. Price, W.; Price, T.; Tenan, M.; Head, J.; Maslin, W.; LaFiandra, M. Acute Stress Causes Overconfidence in Situation Awareness. In Proceedings of the 2016 IEEE International Multi-Disciplinary Conference on Cognitive Methods in Situation Awareness and Decision Support (CogSIMA), San Diego, CA, USA, 15–18 March 2016; IEEE: New York, NY, USA, 2016; pp. 1–6. [Google Scholar] [CrossRef]
  111. McCormac, A.; Calic, D.; Parsons, K.; Butavicius, M.; Pattinson, M.; Lillie, M. The effect of resilience and job stress on information security awareness. Inf. Comput. Secur. 2018, 26, 463–483. [Google Scholar] [CrossRef]
  112. Lee, C.; Lee, C.C.; Kim, S. Understanding information security stress: Focusing on the type of information security compliance activity. Comput. Secur. 2016, 59, 60–70. [Google Scholar] [CrossRef]
  113. D’Arcy, J.; Teh, P.-L. Predicting employee information security policy compliance on a daily basis: The interplay of security-related stress, emotions, and neutralization. Inf. Manag. 2019, 56, 103151. [Google Scholar] [CrossRef]
  114. Cram, W.A.; D’Arcy, J.; Proudfoot, J.G. When enough is enough: Investigating the antecedents and consequences of information security fatigue. Inf. Syst. J. 2021, 31, 521–549. [Google Scholar] [CrossRef]
  115. D’Arcy, J.; Herath, T.; Shoss, M.K. Understanding employee responses to stressful information security requirements: A coping perspective. J. Manag. Inf. Syst. 2014, 31, 285–318. [Google Scholar]
  116. Harper, A.; Mustafee, N.; Pitt, M. Increasing situation awareness in healthcare through real-time simulation. J. Oper. Res. Society 2023, 74, 2339–2349. [Google Scholar]
  117. Bolger, C.; Brummel, B.; Aurigemma, S.; Moore, T.; Baskin, M. Information security awareness: Identifying gaps in current measurement tools. In Proceedings of the 22nd Annual Security Conference (ASC), Las Vegas, NV, USA, 29–30 April 2023. [Google Scholar]
  118. Nwachukwu, U.; Vidgren, J.; Niemimaa, M.; Järveläinen, J. Do SETA Interventions Change Security Behavior? A Literature Review. In Proceedings of the 56th Annual Hawaii International Conference on System Sciences (HICSS 2023); Bui, T.X., Ed.; University of Hawaii, Mānoa: Honolulu, HI, USA, 2023; pp. 6300–6309. [Google Scholar]
  119. Hart, C. Doing a Literature Review: Releasing the Social Science Research Imagination; Sage: London, UK, 1998. [Google Scholar]
  120. Kraus, S.; Breier, M.; Lim, W.M.; Dabić, M.; Kumar, S.; Kanbach, D.; Mukherjee, D.; Corvello, V.; Piñeiro-Chousa, J.; Liguori, E. Literature reviews as independent studies: Guidelines for academic practice. Rev. Manag. Sci. 2022, 16, 2577–2595. [Google Scholar]
  121. Letts, L.; Wilkins, S.; Law, M.C.; Stewart, D.A.; Bosch, J.; Westmorland, M.G. Guidelines for Critical Review Form—Qualitative Studies (Version 2.0); McMaster University Occupational Therapy Evidence-Based Practice Research Group: Hamilton, ON, Canada, 2007; pp. 1–12. [Google Scholar]
  122. Roscoe, J.T. Fundamental Research Statistics for the Behavioural Sciences, 2nd ed.; Holt, Rinehart & Winston: New York, NY, USA, 1975. [Google Scholar]
  123. Chandarman, R.; Van Niekerk, B. Students’ cybersecurity awareness at a private tertiary educational institution. Afr. J. Inf. Commun. 2017, 20, 133–155. [Google Scholar]
  124. Sarkar, S.; Vance, A.; Ramesh, B.; Demestihas, M.; Wu, D.T. The Influence of Professional Subculture on Information Security Policy Violations: A Field Study in a Healthcare Context. Inf. Syst. Res. 2020, 31, 1240–1259. [Google Scholar] [CrossRef]
  125. Forthofer, R.N.; Lee, E.S.; Hernandez, M. Biostatistics: A Guide to Design, Analysis and Discovery; Elsevier: Amsterdam, The Netherlands, 2006. [Google Scholar]
  126. Salmerón, R.; García, C.; García, J. Overcoming the inconsistences of the variance inflation factor: A redefined VIF and a test to detect statistical troubling multicollinearity. arXiv 2020, arXiv:2005.02245. [Google Scholar]
  127. Sijtsma, K.; Emons, W. Nonparametric Statistical Methods. Int. Encycl. Educ. 2010, 3, 347–353. [Google Scholar]
  128. Hoyle, R. Confirmatory Factor Analysis. In Handbook of Applied Multivariate Statistics and Mathematical Modeling; Tinsley, H.E.A., Brown, S.D., Eds.; Academic Press: San Diego, CA, USA, 2000; pp. 465–497. [Google Scholar]
  129. Suhr, D. Exploratory or confirmatory factor analysis? In The Reviewer’s Guide to Quantitative Methods in the Social Sciences; Hancock, G.R., Mueller, R.O., Eds.; Routledge: London, UK, 2006; pp. 111–142. [Google Scholar]
  130. Fornell, C.; Larcker, D.F. Evaluating structural equation models with unobservable variables and measurement error. J. Mark. Res. 1981, 18, 39–50. [Google Scholar] [CrossRef]
  131. Hair, J.F.; Anderson, R.E.; Tatham, R.L.; Black, W.C. Multivariate Data Analysis, 5th ed.; Prentice Hall: Upper Saddle River, NJ, USA, 2003. [Google Scholar]
  132. Field, A. Discovering Statistics Using SPSS; Sage Publications: Thousand Oaks, CA, USA, 2005. [Google Scholar]
  133. Kutner, M.H.; Nachtsheim, C.J.; Neter, J.; Li, W. Applied Linear Regression Models, 4th ed.; McGraw-Hill Irwin: Boston, MA, USA, 2004. [Google Scholar]
  134. O’Brien, R.M. A caution regarding rules of thumb for variance inflation factors. Qual. Quant. 2007, 41, 673–690. [Google Scholar] [CrossRef]
  135. Borenstein, M.; Hedges, L.V.; Higgins, J.P.; Rothstein, H.R. Introduction to Meta-Analysis; John Wiley & Sons: Hoboken, NJ, USA, 2021. [Google Scholar] [CrossRef]
  136. Cisco. The Top Cybersecurity Threats in 2022. Available online: https://umbrella.cisco.com/blog/top-cybersecurity-threats-2022 (accessed on 14 April 2022).
  137. Chen, H.; Hai, Y.; Tu, L.; Fan, J. Not All Information Security-Related Stresses Are Equal: The Effects of Challenge and Hindrance Stresses on Employees’ Compliance with Information Security Policies. Behav. Inf. Technol. 2023, 1–16. [Google Scholar] [CrossRef]
  138. Ament, C.; Jaeger, L. Unconscious on their own ignorance: Overconfidence in information security. J. Inf. Sci. 2017, 50, 254–272. [Google Scholar]
  139. Mady, A.; Gupta, S.; Warkentin, M. The effects of knowledge mechanisms on employees’ information security threat construal. Inf. Syst. J. 2023, 33, 790–841. [Google Scholar] [CrossRef]
  140. Azizollah, A.; Abolghasem, F.; Amin, D.M. The relationship between organizational culture and organizational commitment in Zahedan University of Medical Sciences. Glob. J. Health Sci. 2016, 8, 195. [Google Scholar]
  141. Kävrestad, J.; Nohlberg, M.; Furnell, S. A taxonomy of SETA methods and linkage to delivery preferences. ACM SIGMIS Database DATABASE Adv. Inf. Syst. 2023, 54, 107–133. [Google Scholar]
  142. Hu, Q.; Dinev, T.; Hart, P.; Cooke, D. Managing employee compliance with information security policies: The critical role of top management and organizational culture. Decis. Sci. 2012, 43, 615–660. [Google Scholar]
  143. Jaeger, L.; Ament, C.; Eckhardt, A. The closer you get the more aware you become–a case study about psychological distance to information security incidents. In Proceedings of the ICIS 2017: Transforming Society with Digital Innovation, Seoul, Republic of Korea, 10-13 December 2017; Association for Information Systems: Atlanta, GA, USA, 2018; pp. 1–18. [Google Scholar]
  144. Kritzinger, E.; Da Veiga, A.; van Staden, W. Measuring organizational information security awareness in South Africa. Inf. Secur. J. A Glob. Perspect. 2023, 32, 120–133. [Google Scholar]
  145. Kaiser, H.F. An index of factorial simplicity. Psychometrika 1974, 39, 31–36. [Google Scholar]
  146. Fuller, C.M.; Simmering, M.J.; Atinc, G.; Atinc, Y.; Babin, B.J. Common methods variance detection in business research. J. Bus. Res. 2016, 69, 3192–3198. [Google Scholar]
  147. NIST. Computer Security Resource Center. Available online: https://csrc.nist.gov/glossary/term/cybersecurity (accessed on 2 August 2024).
  148. Björck, F.; Henkel, M.; Stirna, J.; Zdravkovic, J. Cyber resilience—Fundamentals for a definition. Adv. Intell. Syst. Comput. 2015, 353, 311–316. [Google Scholar] [CrossRef]
  149. Von Solms, R.; Van Niekerk, J. From information security to cyber security. Comput. Secur. 2013, 38, 97–102. [Google Scholar]
  150. Laudon, K.C.; Laudon, J.P. Management Information Systems, 12th ed.; Prentice-Hall: Upper Saddle River, NJ, USA, 2012; p. 44. [Google Scholar]
Figure 1. Research Model.
Figure 2. Regression Path Analysis—* p < 0.05, ** p < 0.01, *** p < 0.001.
Figure 3. Information Security Awareness from a Situation Awareness Perspective.
Table 1. Ten studies on information security behavior compliance and the role of Information Security Awareness (unit of analysis: employee/user).

| Reference | Research Objective | Self-Efficacy and/or Attitude | Information Security Awareness |
| [51] | Examining the social contextual effects on ISP compliance. Utilizing safety climate literature. | Self-efficacy positively affects compliant behavior. | Not a research construct, but it is concluded that policy guidelines and awareness program lessons should be applied when employees carry out work. |
| [25] | Investigating rational factors that drive ISP compliance. Utilizing the theory of planned behavior. | Self-efficacy and attitude positively affect intention to comply with ISP. | Predecessor, positive effect on attitude and outcome beliefs. |
| [52] | Explaining compliance intention, utilizing social learning theory. | Self-efficacy positively affects compliance intention. | Mediating role and directly positively affects compliance intention. |
| [54] | Investigating the antecedents of privacy policy compliance, utilizing social learning theory. | Self-efficacy positively affects behavioral intent. | Not a research construct |
| [38] | Significance of self-learning and awareness on attitudes toward ISP compliance. Utilizing theory of planned behavior. | Self-efficacy and attitude positively affect intention to comply. | General ISA and technology awareness: predecessors. Positively affects attitude and self-efficacy. |
| [39] | Constructing a measurement tool for the prediction and explanation of ISP compliance. Based on the security acceptance model. | Self-efficacy posited to affect perceived usefulness of protection and perceived ease of use (mediating) towards compliance intention. | ISA distributed in awareness of information security, ISP, and SETA. Posited to influence mediating constructs. |
| [41] | Factors influencing internet information security practices. Social cognitive theory utilization. | Self-efficacy explains a small amount in information security practices variance. | Predecessor, higher ISA in users report higher means in safe internet practices. |
| [43] | Influence of subordinate guanxi and organizational commitment on information security behavior. | Self-efficacy positively affects compliant behavior. | Control variable on compliance behavior. |
| [44] | Investigating various factors influencing information security compliance behavior. | Self-efficacy being one of the most significant factors on compliance behavior. | Technology awareness: mediator. Positively influences compliant behavior. |
| [53] | Investigating how individual decision-making styles impact cybersecurity compliance behavior to enhance security measures. | Self-efficacy positively affects compliant behavior. | Security awareness has a direct positive effect on compliant behavior. |
Table 2. Table of Measurement Items—(A): Interval 1–7 Likert Scale, (B): nominal dichotomy (Yes/No, 9 or below/above 9), and (C): nominal age categories (younger than 25, 25–34, 35–44, 45–54, 55–64, older than 65).

| Construct | Code | Item | Scale | Reference |
| ISA | ISA1 | I understand the importance of information security. | A | Adapted from [25] |
| | ISA2 | I am aware of the negative consequences of a threat. | | |
| | ISA3 | I am able to recognize a threat when I encounter one. | | |
| | ISA4 | I know what measures I can take to avoid negative consequences. | | |
| | ISA5 | I exhibit safe behavior during my daily routine. | | |
| | ISA6 | I exhibit safe behavior when faced with a threat. | | |
| Negative Experience (NEG) | NEG1 | Have you had any issues with malware at any point in the last two years? (e.g., viruses, spyware, ransomware) | B | Adapted from [36] |
| | NEG2 | Have you been phished at any point in the last two years (in every possible form)? | | |
| SETA | SETA1 | Security awareness activities increase my knowledge about information security. | A | [103] |
| | SETA2 | I understand the security awareness activities. | | |
| | SETA3 | I try to apply the knowledge of security awareness activities. | | |
| InfoSec Goals (GOAL) | GOAL1 | I want to contribute to information security. | A | Adapted from [107] |
| | GOAL2 | I would like to handle information securely for information security. | | |
| | GOAL3 | The information security of the firm means a lot to me. | | |
| Complexity (COMP) | COMP1 | I experience pressure in my work because I find security awareness topics complex. | A | Adapted from [115] |
| | COMP2 | I find it difficult to understand security awareness topics. | | |
| | COMP3 | I know too little about information security to keep the firm safe. | | |
| SETA Design (SETAD) | SETAD1 | Communication tools help me to handle information securely. | A | Survey-specific |
| | SETAD2 | Gamification helps me to handle information securely. | | |
| | SETAD3 | Phishing simulations help me to handle information securely. | | |
| | SETAD4 | The amount of information that has been offered helps me to handle information securely. | | |
| | SETAD5 | The information that is offered is of good quality which helps me to deal with information in a secure way. | | |
| Items for demography (AGE) and data segmentation (IT/MAN) | AGE | What is your age group? | C | |
| | IT | Are you IT staff? | B | Survey-specific |
| | MAN | Are you management or non-management? | | |
Table 3. Construct means and standard deviations.

| Construct | Mean | Standard Deviation |
| SETA Program | 6.18 | 0.85 |
| Security Complexity | 1.99 | 0.96 |
| Negative Experience | 0.15 | 0.27 |
| InfoSec Goals | 6.53 | 0.82 |
| SETA Design | 5.46 | 1.14 |
| Information Security Awareness | 6.35 | 0.64 |
Table 4. Measurement Item Statistics (N = 156).

| Item | Wording | Mean | SD |
| SETA1 | Security awareness activities increase my knowledge about information security. | 5.81 | 1.362 |
| SETA2 | I understand the security awareness activities. | 6.38 | 0.960 |
| SETA3 | I try to apply the knowledge of security awareness activities. | 6.35 | 0.975 |
| COMP1 | I experience pressure in my work because I find security awareness topics complex. | 2.12 | 1.411 |
| COMP2 | I find it difficult to understand security awareness topics. | 1.85 | 1.148 |
| COMP3 | I know too little about information security to keep the firm safe. | 2.01 | 1.231 |
| NEG1 | Have you had any issues with malware at any point in the last two years? (e.g., viruses, spyware, ransomware) | 0.10 | 0.304 |
| NEG2 | Have you been phished at any point in the last two years (in every possible form)? | 0.19 | 0.395 |
| GOAL1 | I want to contribute to the information security. | 6.65 | 0.816 |
| GOAL2 | I would like to handle information securely for information security. | 6.66 | 0.775 |
| GOAL3 | The information security of the firm means a lot to me. | 6.28 | 1.082 |
| SETAD1 | Communication tools help me to handle information securely. | 5.64 | 1.334 |
| SETAD2 | Gamification helps me to handle information securely. | 4.56 | 1.891 |
| SETAD3 | Phishing simulations help me to handle information securely. | 6.04 | 1.456 |
| SETAD4 | The amount of information that has been offered helps me to handle information securely. | 5.50 | 1.346 |
| SETAD5 | The information that is offered is of good quality, which helps me deal with information in a secure way. | 5.57 | 1.296 |
| ISA1 | I understand the importance of information security. | 6.83 | 0.599 |
| ISA2 | I am aware of the negative consequences of a threat. | 6.74 | 0.710 |
| ISA3 | I am able to recognize a threat when I encounter one. | 5.89 | 0.862 |
| ISA4 | I know what measures I can take to avoid negative consequences. | 6.08 | 0.865 |
| ISA5 | I exhibit safe behavior during my daily routine. | 6.28 | 0.816 |
| ISA6 | I exhibit safe behavior when faced with a threat. | 6.25 | 0.840 |
Table 5. Proposition Results.

| Proposition | β | SE | t-Value | p-Value | Result |
| NEG → ISA | 0.058 | 0.121 | 0.477 | | Not supported |
| SETA → ISA | 0.136 | 0.057 | 2.402 | * | Supported |
| InfoSec Goals → ISA | 0.222 | 0.072 | 3.089 | ** | Supported |
| Complexity → ISA | −0.249 | 0.032 | −7.807 | *** | Supported |
| SETA Design → ISA | −0.038 | 0.026 | −1.431 | | Not supported |
* p < 0.05, ** p < 0.01, *** p < 0.001.
Table 6. Regression Coefficients for Group Comparison.

Perspective 1: IT & Non-IT
| Path | IT β | IT p-Value | Non-IT β | Non-IT p-Value |
| NEG → ISA | 0.057 | | 0.119 | |
| SETA → ISA | 0.147 | | 0.104 | |
| InfoSec Goals → ISA | 0.251 | | 0.297 | |
| Complexity → ISA | −0.252 | ** | −0.245 | ** |
| SETA Design → ISA | 0.006 | | −0.097 | |

Perspective 2: Management & Non-Management
| Path | Management β | Management p-Value | Non-Management β | Non-Management p-Value |
| NEG → ISA | 0.353 | | −0.40 | |
| SETA → ISA | 0.099 | | 0.128 | |
| InfoSec Goals → ISA | 0.497 | | 0.195 | |
| Complexity → ISA | −0.203 | * | −0.263 | ** |
| SETA Design → ISA | −0.47 | | −0.032 | |
* p < 0.05, ** p < 0.01.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
