Article

Introducing the Privacy Aspect to Systems Thinking Assessment Method

Industrial Engineering and Management Department, Faculty of Engineering, Ariel University, Ariel 4077625, Israel
* Author to whom correspondence should be addressed.
Systems 2021, 9(2), 36; https://doi.org/10.3390/systems9020036
Submission received: 9 March 2021 / Revised: 13 May 2021 / Accepted: 17 May 2021 / Published: 20 May 2021
(This article belongs to the Section Systems Engineering)

Abstract: Systems thinking is a valuable skill that may be required for an individual to be promoted to managerial or leading positions in the business arena. Assessing systems thinking skills is therefore an essential task for decision makers in the organization as a step preceding the promotion decision. One of the well-known and validated tools for this task is a questionnaire. However, because some of the questions invade the employee's or candidate's privacy, the answers may be biased. In this paper, we consider this potential bias, a phenomenon that is becoming ever more significant as privacy concerns and awareness continuously increase in the modern digital world. We propose a formal methodology to optimize the questionnaire based on the privacy sensitivity of each question, thereby providing a more reliable assessment. We conducted an empirical study (n = 142) and showed that a systems skills questionnaire can be enhanced. This research makes a significant contribution to improving the systems skills assessment process in particular, and lays the foundations for improving the evaluation of other skills or traits.

1. Introduction

As the pace of technological and global change continues to accelerate, the landscape of the business environment is changing with it. Information and knowledge are becoming ever more widely available. Due to automation and innovation, systems are becoming more efficient and more adaptive, yet more complex. This complexity of modern systems requires employees to deal with issues well beyond their discipline of expertise. To achieve that, we need to implement the principles of systems thinking, for example, seeing the whole system beyond its parts and thinking about each component as part of the whole system [1,2,3,4].
Systems thinking is a skill, and, more broadly, a concept that reflects a comprehensive perception, focusing more on the “big picture” than on the details. It emphasizes the understanding of the effect of changing one part of the system on the other parts, as well as on the whole system [5,6]. Systems thinking is closely related to system architecting [7], a term that defines different viewpoints of a system, and in particular “reflects the outside or gray-box view of a system” [8]. For example, consider a project of implementing an information system. One of the main concerns of the project’s DBA (Database Administrator) is how to configure the database in a way that will provide satisfactory performance—a purely technical task. However, the project manager has to consider budgets, timelines, manpower, etc., as well as many technical aspects like the one mentioned above, though not at the detailed resolution required of the DBA. The project manager’s approach must be based on systems thinking. Thus, systems thinking is an effective approach to dealing with complex challenges in organizations [9].
Richmond [10,11] described systems thinking as seeing “both the forest and the trees (one eye on each)”, and defined systems thinking as “the art and science of making reliable inferences about behavior by developing an increasingly deep understanding of underlying structure”. According to his claim, formal education encourages us to focus on details and analysis, does not develop the ability for inclusive perception, and thus suppresses systems thinking [12]. Senge [13] applied systems thinking to organization management. According to Senge [14], there are five disciplines of a learning organization: personal mastery, mental models, building shared vision, team learning, and the most important discipline is the fifth one which integrates the other four—systems thinking. Balle [15] pointed out the importance of systems thinking abilities for maximizing an organization’s performance. Moreover, systems thinking is relevant to a wide range of disciplines such as healthcare [16,17], education [18,19], and systems engineering [20,21,22]. Checkland [23] suggested Soft Systems Methodology to deal with Soft Systems Thinking problems such as managerial dilemmas. In order to deal with such problems, Checkland emphasized the importance of integrating divergent stakeholder perspectives. Applying systems thinking approaches for identifying problems and designing solutions enables organizations to confront the complexity of systems in the current era. Jackson [24] presented systems approaches for technical, process, structural, organizational, and people complexity.
While systems thinking is a personal trait [25] rather than a skill that may simply be evaluated and improved, some tools are available to enhance the systems thinking process. These tools are applicable to various systems, for example, systems archetypes, causal loop diagrams, stock and flow diagrams, and root cause analysis. Banson et al. [26] used causal loop diagrams to develop new structural systems models in the agricultural sector in Ghana, enabling people to determine the components of, and the interactions between, the structure, conduct, and performance of systems. The DSRP theory and method offered by Cabrera et al. is a form of metacognition in which the foundations of all human thought are described by four patterns: Distinctions, Systems, Relationships, and Perspectives [27]. Cabrera et al. claimed that we can improve our systems thinking skills by learning to explicitly recognize and explicate the distinctions, systems, relationships, and perspectives underlying anything we wish to understand more deeply and clearly.
Systems thinking is a valuable skill that may be required for an individual to be promoted in the business arena. In the abovementioned information systems project example, systems thinking is a prerequisite for the DBA to be promoted to a project manager position, and even to a more senior role, e.g., team manager. This implies that the systems thinking trait is not dichotomous and may be graded on a continuous, or at least a multilevel, scale. In order to make a decision on promoting/recruiting an employee to a position that requires systems thinking (or better systems thinking), a validated and reliable systems thinking estimation tool is required. Moreover, every organization is interested in filling positions according to their requirements and in matching the right employee to the right job. The systems thinking competency of an employee can serve many goals, for example, sustainability assessment [28]. Naturally, when reviewing candidates for systems engineering positions, the evaluation of systems thinking skills is a central parameter [29]. Assessing systems thinking skills is also highly important in the education field, where it is considered necessary for implementing a productive teaching program on the subject [30]. An effective systems thinking assessment tool is important not only for recruitment/promotion decisions, but also for evaluating and enhancing training programs [31].
In order to evaluate the individual’s capabilities in general, and to assess their preference for positions that require systems thinking skills in particular, we need a validated and reliable tool. The literature describes several tools aimed at evaluating systems thinking skills, mainly in the domain of education. Assaraf and Orion [32] applied customized quandaries to test the ability of school students studying an earth systems-based curriculum to deal with complex systems. Stave and Hooper [33] suggested an assessment of systems thinking skills based on different levels, for example, the ability to list the system parts as a predictor for the recognizing-interconnections level, or the ability to justify why a given action is expected to solve a problem as a predictor for the using-conceptual-models level. Another approach seeks to define the complete set of skills required for systems thinking, e.g., the use of mental modeling and abstraction; by quantitative assessment of these skills, the systems thinking trait may be evaluated [34]. Lavi et al. [35] classified systems thinking attributes into system function, structure, and behavior, and evaluated systems thinking skills by scoring proposed system models based on object-process methodology. Buckle [36] examined the use of a maturity model to assess a person’s competence in handling complex systems, i.e., systems thinking skills (coined MMSTC—Maturity Model of Systems Thinking Competence).
In the framework of this study, we chose a tool for assessing systems thinking skills that was developed by Frank [29]. Frank presented four different aspects of systems thinking: knowledge, individual traits, cognitive characteristics, and capabilities. A model describing the systems thinking aspects according to Frank’s suggestion was presented by Koral-Kordova and Frank [37]. According to this model, systems thinking is a theoretical latent variable that cannot be measured directly. Thus, in order to measure systems thinking, predictive indicators that match the relevant latent variables of systems thinking are applied. The model illustrates the systems thinking skills of each aspect, how these skills interrelate with each other, and how they relate to systems thinking. Examples of predictive indicators for each aspect are: (a) Knowledge—interdisciplinary and multidisciplinary knowledge; (b) Individual traits—managerial skills, group leadership, good interpersonal skills; (c) Cognitive characteristics—understanding the overall system, getting the big picture, understanding the synergy between different systems, considering non-engineering factors; (d) Capabilities—abstract thinking, seeing the future, vision of the future. This framework relies on the concept of higher-order thinking skills, and may line up with Bloom’s taxonomy of educational objectives [38]. According to this taxonomy, the simplest thinking skills are learning facts and recall, while higher-order skills include critical thinking, creative thinking, analysis, and problem solving. Systems thinking may be regarded as one of the higher-order thinking skills.
Frank’s tool is a questionnaire aimed at assessing the individual’s interest in positions requiring systems thinking. The questionnaire reflects the four aspects of this model and thereby presents the conceptual framework for systems thinking skills. Such a tool is essential in the screening and decision-making processes regarding employee placement. The main reason for choosing Frank’s tool was that it had been tested and implemented in previous studies to examine its reliability and validity. These tests included two types of reliability (inter-judge reliability and inter-item consistency reliability) and three types of validity (content validity, contrasted group validity, and construct validity) [39].
A questionnaire is a useful estimation tool when no other means are available, e.g., for a new candidate or an employee who lacks a track record in the evaluated field. However, because it is answered by the employee/candidate, who is not objective and may also be in a conflict of interest, it may be biased [40]. A bias may be the outcome of many factors, one of them being the privacy aspect. Privacy and security are major and increasing concerns in the digital era [41,42,43]. Privacy is considered an essential value of liberal society and the individual’s autonomy [44,45], and is extensively regulated by governments and intergovernmental organizations, e.g., through the EU’s General Data Protection Regulation (GDPR) [46,47,48]. A privacy violation may arise even from an unexpected source, e.g., a music selection [49]; thus, privacy awareness is continuously increasing. This phenomenon may bias answers to questions with high privacy sensitivity [50], as evasive questions have long been known to have the potential to do so [51]. Notably, a bias is a hypothesis that should be tested whenever a suspect factor is included in the questionnaire. Privacy is definitely a significant factor, as the questionnaire may invade privacy [52]. In this research, we tested this suspicion by conducting an independent survey which eliminates the existing bias (as described in Section 2.1), thus objectively indicating the level of privacy invasion.
One way to address this problem is anonymization, a method that enables people to answer questions honestly, as demonstrated, for example, with sensitive information on sexual orientation [53]. Inherently, however, anonymization is not applicable when assessing employees or candidates, because by the nature of this process we wish to attach the assessment to an identified individual, rather than to a statistical aggregate, as with surveys and research.
In this paper, we consider the privacy aspect when assessing the systems thinking of an individual. Aware that answers to privacy-sensitive questions may be biased, we propose a methodology to optimize the questionnaire, providing a more accurate estimation. We formalize the methodology and demonstrate its usefulness in an empirical study. The discussion on privacy is almost always conducted from the point of view of preserving and protecting. As bad as it may sound, preserving privacy (from the evaluator’s point of view) in an evaluation process is not a good idea. We introduce here a novel approach in which we do not seek to protect the individual’s privacy, but to increase the reliability of the assessment by suppressing the effects of privacy concerns. This is the first time that the privacy factor has been introduced to this type of evaluation model, accommodated in the methodology (described in Section 2), and empirically evaluated (described in Section 3). We believe that beyond raising the accuracy of systems thinking evaluation, this methodology may also provide a sense of comfort to the candidate or employee, and may thus contribute to improving working relationships.

2. Optimizing Systems Thinking Assessment Based on the Privacy Aspect

This section describes the methodology of optimizing a systems thinking questionnaire by minimizing biases caused by privacy concerns. We first introduce the concepts of the model, and then the formal mathematical model.

2.1. Optimization Concept

The assessment of systems thinking skills, as mentioned previously, is based on a questionnaire. Each question consists of two propositions which are actually statements—one of them indicates a systems thinking approach, while the other does not. The participant (the assessed individual) has to select the statement according to their preferences, i.e., the one that better describes them. For example [54]:
A.
When I take care of a product, it is important for me to concentrate on this product, assuming that other engineers will take care of the other parts of the system.
B.
When I take care of a product, it is important for me to see how it functions as a part of the system.
In this example, answer B shows a more holistic view (rather than a reductionistic one), so answer B indicates a systems thinking approach. Naturally, there are no “correct” or “right” statements (answers); each one is just an indication of a preference. The number of questions answered with a “systems thinking answer”, divided by the total number of questions, yields the systems thinking grade on a scale of 0 to 1, indicating minimal and maximal systems thinking skills, respectively (for convenience, in this paper we linearly normalized the grade to a scale of 0 to 100).
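As a concrete illustration, the scoring rule can be written in a few lines of code. The following is a minimal sketch (ours, not from the paper), assuming each answer is encoded as a boolean that is true when the selected statement is the systems thinking one:

```python
# Minimal sketch of the scoring rule (assumed encoding: True means the
# participant selected the "systems thinking" statement of that question).
answers = [True, False, True, True, False, True, True, False, True, True]

grade = sum(answers) / len(answers)   # scale of 0 to 1
normalized = 100 * grade              # linear normalization to a 0-100 scale

print(f"grade = {grade:.2f}, normalized = {normalized:.0f}")  # grade = 0.70, normalized = 70
```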
The above grade is a measurement that reflects the individual’s systems thinking skills. However, the reliability of a measurement is defined by the extent to which it is accurate [55]. Each item (question) in the questionnaire has a degree of privacy sensitivity that may bias the answer, and thus reduce the reliability of the questionnaire. The level of privacy sensitivity can be measured by conducting an isolated survey, this time asking the participant how sensitive the question is, rather than asking them to answer it. In this case, there is no concern of bias for the following reasons: (a) The survey is anonymous—note that we seek here to classify the questions, not the individual; (b) The participant is not affected by the answer, i.e., there is no privacy violation; and (c) This type of survey informs more about the question itself than about the respondent. Isolating the privacy sensitivity measurement from the actual assessment, along with the use of a multi-item, diverse questionnaire, decreases the probability of having a significant number of confounding factors in the model. However, this can be tested in further research that includes other biasing factors, as described at the end of Section 3.
Based on the answers to the systems thinking questionnaire and the privacy sensitivity answers, the accuracy of the systems thinking questionnaire can be increased as follows: First, we define an indicator to measure the reliability of the questionnaire based on testing the inter-item consistency among questions dealing with similar issues (which are expected to be correlated with one another). We use the most common test score reliability index, Cronbach’s alpha (noted as $\rho_T$), also known as Tau-equivalent reliability [56]. Then, the questions are classified into subgroups, where each subgroup deals with a similar issue. For example, a question concerning the preference for seeing the “whole picture of the project” and a question concerning the preference for being aware of tasks that are not under the employee’s responsibility may be classified in the same subgroup. Then, all answers in each subgroup are reordered along the same scale, so that answer A indicates systems thinking, thereby enabling data processing such as computing $\rho_T$. For each question, $\rho_T$ is calculated within the subgroup. A question is considered for dropping if (a) its privacy sensitivity exceeds a predetermined threshold; and (b) dropping the question increases $\rho_T$ beyond another threshold. The rationale behind this methodology is that the questions that meet these criteria, and are therefore dropped, cause more “damage” to the accuracy than benefit. The entire process is depicted in Figure 1.

2.2. Formal Model

To formalize this methodology, let $Q$ be a set of $n$ questions $(q_1, q_2, \ldots, q_n)$. Let $S$ be a collection of $m$ subsets of questions of the same sub-subject, such that together they include all questions in $Q$ (i.e., $\forall q_i \in Q: \exists j \le m, k \le |s_j|: q_i = s_{j,k}$), and each question appears exactly once over all subsets of $S$, i.e., $\forall i, j \le m, i \ne j: \forall t \in s_i, \forall u \in s_j: t \ne u$.
Now let $a_{ij}$ be the answer to question $i$ in $s_j$ ($j \le m$, $i \in s_j$), converted by the function $f_c$ to a common scale, such that $a_{ij} = f_c(\text{original } a_{ij})$.
For each subset $s_j$, the Tau-equivalent reliability $\rho_T$ (Cronbach’s alpha) is calculated by Equation (1):

$$\rho_T(s_j) = \frac{|s_j|}{|s_j| - 1} \cdot \left(1 - \frac{\sum_{i=1}^{|s_j|} \sigma_{a_{ij}}^2}{\sigma_{X_j}^2}\right) \tag{1}$$

where $\sigma_{a_{ij}}^2$ is the variance of $a_{ij}$, and $\sigma_{X_j}^2$ is the variance of the sum $X_j = \sum_{i=1}^{|s_j|} a_{ij}$.
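Equation (1) can be computed directly from the answer matrix of a subset. The following sketch (ours, not the authors’; it uses NumPy and a hypothetical 0/1 answer matrix) shows one straightforward implementation:

```python
import numpy as np

def cronbach_alpha(answers: np.ndarray) -> float:
    """Tau-equivalent reliability (Cronbach's alpha), Equation (1).

    answers: matrix of shape (participants, items), each cell a converted answer a_ij.
    """
    k = answers.shape[1]                          # |s_j|: number of items in the subset
    item_vars = answers.var(axis=0, ddof=1)       # variance of each a_ij across participants
    total_var = answers.sum(axis=1).var(ddof=1)   # variance of the per-participant item sums
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 0/1 answers of 6 participants to the 4 items of one subset s_j.
A = np.array([[1, 1, 0, 1],
              [1, 1, 1, 1],
              [0, 0, 0, 1],
              [1, 0, 1, 0],
              [0, 0, 0, 0],
              [1, 1, 1, 1]])
print(round(cronbach_alpha(A), 3))
```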
We denote by $\rho_T(s_j \setminus i)$ the Tau-equivalent reliability of subset $s_j$ when question $i$ is dropped. Notice that $\rho_T(s_j \setminus i)$ is a feature of question $i$, while $\rho_T(s_j)$ is a feature of the whole set $s_j$. Let $p_{ij}$ be the privacy concern level of question $i$ in set $j$. We define a question to be sensitive with high concern if $p_{ij} - \bar{p_j} > Th$, where $Th$ is a preset threshold. Each question $s_{j,i}$ is decided to be kept or dropped (noted as $include(s_{j,i})$) according to Equation (2):

$$include(s_{j,i}) = \begin{cases} drop & p_{ij} - \bar{p_j} > Th \;\wedge\; \rho_T(s_j \setminus i) \ge \rho_T(s_j) \\ keep & \text{otherwise} \end{cases} \tag{2}$$

Equation (2) describes the dropping criteria, which are based on two cumulative conditions: (a) the privacy concern level of the question ($p_{ij}$) exceeds the average privacy concern level of the whole set ($\bar{p_j}$) by more than the threshold $Th$; and (b) the Tau-equivalent reliability obtained by removing the question ($\rho_T(s_j \setminus i)$) is higher than when the question is kept ($\rho_T(s_j)$).
The optimized questionnaire is now defined by Equation (3), which is in fact the union of all subsets after dropping the questions:

$$Q' = \bigcup_{j \le m} \{\, s_{j,i} : i \in s_j,\; include(s_{j,i}) = keep \,\} \tag{3}$$
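To make the procedure concrete, here is a sketch of Equations (2) and (3) in code, reusing the `cronbach_alpha` function from the previous snippet. The names and the exact sensitivity test are our reading of the reconstructed equations, not the authors’ published code:

```python
import numpy as np

def optimize_subset(answers: np.ndarray, privacy: np.ndarray, Th: float) -> list:
    """Equation (2): decide keep/drop for each question i of one subset s_j.

    answers: (participants, |s_j|) matrix of converted answers a_ij
    privacy: average privacy concern level p_ij of each question in s_j
    Th:      preset privacy sensitivity threshold
    """
    rho_full = cronbach_alpha(answers)          # rho_T(s_j), the whole subset
    kept = []
    for i in range(answers.shape[1]):
        rho_without = cronbach_alpha(np.delete(answers, i, axis=1))  # rho_T(s_j \ i)
        too_sensitive = privacy[i] - privacy.mean() > Th  # one reading of the sensitivity test
        if too_sensitive and rho_without >= rho_full:
            continue                            # drop the question
        kept.append(i)
    return kept

# Equation (3): the optimized questionnaire Q' is the union of the questions kept
# in every subset (the subsets are disjoint, so each is processed independently).
# Hypothetical names:
# Q_opt = [questions[j][i] for j in range(m)
#          for i in optimize_subset(answers_by_subset[j], privacy_by_subset[j], Th)]
```

Note that in the empirical study (Section 3), the sensitivity condition was applied to the item’s average sensitivity directly against $Th = 2.4$ on the 1-to-5 scale; under that reading, the `too_sensitive` line becomes `privacy[i] > Th`.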

3. Empirical Study and Results

This section describes how we tested the methodology empirically, and provides some estimates of its effectiveness.

3.1. Empirical Study

In order to test the methodology, we used a questionnaire for assessing systems thinking skills. The questionnaire was developed and validated by Frank [29], and was also validated by Koral-Kordova and Frank [37]. As described in Section 2.1, each question consists of two propositions, which are actually statements; one of them indicates a systems thinking approach, while the other does not. The questionnaire was originally designed for engineers. For use in the current study, some propositions were modified in a way that does not change the concept, but is clearer to the general population. The revised questionnaire was then validated by two experts in systems thinking, chosen for their well-known professional and proven academic expertise in the field. One is a researcher in this field, while the other is a senior industry practitioner who deals with the subject in the course of his work. The experts were informed of the subject and goals of this research, and were asked to rank each question of the revised questionnaire according to two criteria: (a) clarity of the question, i.e., the extent to which the question will be understood by the subject; and (b) relevancy of the question, i.e., how well the question indicates systems thinking skills. We used a Likert scale of 1 (very low) to 5 (very high), and eliminated questions ranked lower than 4 on either criterion. The final version of the questionnaire comprises 21 sets of two propositions. In order to avoid arbitrary answering (e.g., answering without first reading the propositions carefully), the propositions were displayed in random order, i.e., sometimes statement A reflected the systems thinking tendency and at other times statement B, without any obvious pattern.
The questionnaire was used in two different ways, which we refer to as actions. Action A—estimating the privacy concern level of each set in the questionnaire; in this action, participants were asked, rather than answering the questionnaire, to indicate the privacy sensitivity level of each set on a Likert scale of 1 to 5. Action B—estimating systems thinking skills; in this action, participants were asked to answer the questionnaire, i.e., to select in each question the one statement that describes them well or represents their preference. The study was authorized by the institutional ethics committee.

3.2. Participants

We conducted an experiment with two different, independent populations. The first population, noted as $PAR_a$, included $n_a = 72$ participants, whose purpose was to provide the privacy concern levels ($p_{ij}$), i.e., to perform action A. The second population, noted as $PAR_b$, included $n_b = 70$ participants, whose purpose was to provide the preferences, i.e., to perform action B. As mentioned above, $PAR_a$ and $PAR_b$ are disjoint sets ($PAR_a \cap PAR_b = \emptyset$), a necessity required to avoid dependencies that might create internal noise. The participants of $PAR_a$ were asked to perform action A only, while the participants of $PAR_b$ were asked to perform action B only. When a research subject chose a sentence that gave evidence of systems thinking (i.e., in action B), one point was awarded; otherwise, none. The score of the questionnaire for an individual is the sum of points gained, divided by the number of questions. Therefore, the maximum score is 1 (or 100, if normalized to a 0-to-100 scale), which reflects maximal systems thinking skills.
The participants of both groups were recruited among workers in selected industries. Of the participants in $PAR_a$, 54% were male and 46% were female; 19% were 18–25 years old, 39% were 26–30 years old, 31% were 31–40 years old, and the remaining 11% were over 40. With regard to education, 89% had gained a bachelor’s degree or higher, 9% had a high school diploma, and 2% had no diploma at all; 37% were engineers. All of the participants were employees. Of the participants in $PAR_b$, 57% were male and 43% were female; 14% were 18–25 years old, 36% were 26–30 years old, 34% were 31–40 years old, and the remaining 16% were over 40. In terms of education, 86% had gained a bachelor’s degree or higher, 13% had a high school diploma, and 1% had no diploma at all; 38% were engineers. All of the participants were employees. The occupation distributions of $PAR_a$ and $PAR_b$ are depicted in Figure 2a,b, respectively. Please note that in this part of the demography, the actual occupation is reported. For example, if a person is an engineer by education but works as a manager, they are reported here under “Management position” rather than “Engineering”. This clarification addresses an apparent inconsistency with the above-reported proportion of engineers in the populations.
The demography of the empirical study population is characterized by diversity: the age distribution covers the vast majority of the employee age range, both genders are well represented, and the occupations are varied.

3.3. Results and Analysis

The average privacy concern level (based on $PAR_a$) over all questions, measured on a scale of one (lowest concern) to five (highest concern), was 2.47 ($\sigma = 0.26$). The distribution of the privacy concern levels is depicted in Figure 3.
The systems thinking scores (based on $PAR_b$) were normalized linearly to a scale of 0 (no skills) to 100 (maximal skills); the average was 60.6 ($\sigma = 15.23$). The distribution of the systems thinking scores across all participants is depicted in Figure 4. It can be seen that the scores have a left-skewed distribution (long left tail), and the average score is located to the left of the peak. This indicates that the distribution of systems thinking scores in $PAR_b$ is not normal, and that most of the participants are probably systems thinkers in their preferences.
The analysis of the $PAR_b$ questionnaire included an exploratory factor analysis, confirmed by experts, according to which two groups of items (subsets of the questionnaire) were identified:
$s_1$: items 3, 5, 6, 7, 11, 12, 14, 16, 17, 18, 20.
$s_2$: items 1, 2, 4, 8, 9, 10, 13, 15, 19, 21.
The first group ($s_1$) deals with the preferences of the individual on issues related to the individual’s interaction with themselves. The second group ($s_2$) deals with preferences on issues related to the individual as part of a group/team/project, and is also relevant to the concepts of leadership and management. Questions in the first group ($s_1$) covered topics such as gaining interdisciplinary and multidisciplinary knowledge, a personal preference to focus also on non-core topics, and awareness of non-profession-related considerations such as business and financial matters. Questions in the second group ($s_2$) covered topics such as familiarity with the responsibilities of colleagues on the project, being part of a team involved in large projects, and involvement in all stages of the project.
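The paper does not name the software used for the exploratory factor analysis; as one possible reconstruction, a two-factor analysis over the answer matrix could look like the following sketch (scikit-learn, hypothetical data):

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Hypothetical (70 participants x 21 items) matrix of converted 0/1 answers.
X = np.random.default_rng(0).integers(0, 2, size=(70, 21)).astype(float)

fa = FactorAnalysis(n_components=2, random_state=0).fit(X)
loadings = fa.components_                 # shape (2, 21): loading of each item on each factor

# Assign each item to the factor on which it loads most strongly; in the study,
# the resulting grouping was then confirmed by experts.
factor_of_item = np.abs(loadings).argmax(axis=0)
s1 = [i + 1 for i in range(21) if factor_of_item[i] == 0]
s2 = [i + 1 for i in range(21) if factor_of_item[i] == 1]
print("s1:", s1)
print("s2:", s2)
```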
In order to optimize the questionnaire, we implemented the process depicted in Figure 1 and presented in Section 2.1. We used $\rho_T$ as the indicator of the reliability of the questionnaire. The value of $\rho_T$ over all the items of the questionnaire or a subgroup (noted as pre-$\rho_T$) for $PAR_b$ was 0.595. Following stage P4 (in Figure 1), we defined a threshold for the privacy sensitivity of the questionnaire, $Th = 2.4$. Several items were dropped according to two conditions:
a.
The average privacy sensitivity of the item exceeded the threshold $Th$.
b.
The value of $\rho_T$ increased when the item was dropped.
For example, the following questions (for illustration, one of the two statements is shown) indicated high sensitivity and significantly reduced $\rho_T$: (a) question 15: “To resolve a problem, I prefer to use innovative practices”; (b) question 18: “It is my nature to present many questions to my colleagues and/or subordinates”; and (c) question 19: “It is important for me to continuously think about what else can be improved”.
We repeated this process for each of the two groups found in the exploratory factor analysis. The new $\rho_T$ of the partial questionnaire or subgroup was then calculated (noted as post-$\rho_T$). The results of this process are presented in Table 1.
The selection of the threshold is subjective, but it has a limited range, since an extreme value would drop almost none or almost all of the questions. Further research on a larger sample may support a sensitivity analysis. The average grade (the assessment of systems thinking skills on a scale of 0 to 100) of the whole questionnaire ($Q$), considering all items ($\mu = 60.6$, $\sigma = 15.3$), was significantly different from the grade obtained when questions were dropped ($\mu = 63.7$, $\sigma = 19.1$); $t(69) = 2.73$, $p < 0.01$. The average grade of the Group 1 questions ($s_1$), considering all items in the group ($\mu = 52.5$, $\sigma = 18.7$), was significantly different from the grade when questions were dropped ($\mu = 48.3$, $\sigma = 26.6$); $t(69) = 2.18$, $p < 0.05$. The average grade of the Group 2 questions ($s_2$), considering all items in the group ($\mu = 69.6$, $\sigma = 8.3$), was significantly different from the grade when questions were dropped ($\mu = 76.9$, $\sigma = 8.8$); $t(69) = 6.24$, $p < 0.01$. In all of the groups, the results indicate a significant change in the assessment of systems thinking skills before and after questions were dropped due to privacy sensitivity. The absolute differences between the grades range from 3.1 to 7.3 points. We know from other studies (and it is also reflected in this study) that the vast majority of the population scores roughly between 30 and 90. Thus, the differences reflect a deviation of 5% to 12% within the relevant grade range, a difference that is significant not only in statistical terms, but also in semantic ones.
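The reported statistics are consistent with paired t-tests over the 70 participants of $PAR_b$ (hence 69 degrees of freedom). A sketch of such a comparison with SciPy, on hypothetical per-participant grades, could be:

```python
import numpy as np
from scipy import stats

# Hypothetical per-participant grades (0-100) before and after dropping items;
# the real study would use the actual questionnaire answers of PAR_b.
rng = np.random.default_rng(1)
grades_all_items = rng.normal(60.6, 15.3, size=70)
grades_after_drop = grades_all_items + rng.normal(3.1, 8.0, size=70)

# Paired t-test: the same 70 participants are graded twice, so df = 69.
t, p = stats.ttest_rel(grades_after_drop, grades_all_items)
print(f"t(69) = {t:.2f}, p = {p:.4f}")
```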

4. Discussion

Systems thinking skills have become a necessity for navigating the modern multidisciplinary business environment, especially in managerial roles. However, assessing systems thinking skills is not a trivial task, especially when dealing with a new candidate, or when promoting an employee from a position that does not require such skills (so that no relevant prior record is available) to a new position that requires holistic thinking. A common way to carry out the assessment is to ask the person being assessed to fill in a questionnaire—a method that is subject to a few obstacles. One of these obstacles is the privacy aspect: since some of the questions may be sensitive from the point of view of privacy concerns, the answers may be biased. Awareness of privacy is constantly increasing in the digital era, and privacy protection is supported by research [57,58] as well as by regulators [43]. Therefore, the impact of privacy concerns on the accuracy of systems thinking assessment is set to become more and more significant. In this paper, we address this issue and propose a methodology to optimize a questionnaire by accommodating the privacy concern parameter in the model. We conducted an empirical experiment (n = 142) and showed that the questionnaire can be optimized.
This study spans two disciplines: privacy and systems thinking evaluation. Privacy is usually discussed from the preserving and protecting point of view. Many studies offer methods to protect privacy, e.g., in data mining and machine learning processes [59], when applying new technologies like IoT [60], and in the medical field [61]. Privacy protection is also a significant issue for legislators and regulators, the most prominent example being the EU’s General Data Protection Regulation (GDPR) [48]. However, while privacy protection holds many benefits, it may also carry some costs. These costs are not considered in current methods for evaluating systems thinking skills, which focus more on identifying the cornerstones of these traits, for example, the four patterns of the DSRP theory [27], or the classification of systems thinking attributes into system function, structure, and behavior [35]. This study, in contrast, addresses the cost of biasing an assessment process, and seeks to minimize it. The purpose of the proposed methodology is to enhance the accuracy of existing evaluation methods, such as the customized quandaries used to test the ability of school students studying an earth-systems-based curriculum to deal with complex systems [32], or Frank’s questionnaire [29], which was the subject of the empirical test. While we act here against the popular direction of privacy protection, this is a legitimate strategy that works to the benefit of all sides, because: (a) it is done with the full consent of the person assessed, and participation is not forced; (b) the consequences are clear to the participant; and (c) dropping highly privacy-sensitive questions may avoid inconvenience among the participants, and may even increase their privacy.
While this study discusses the optimization of systems thinking skills assessment, the concept presented here may be adopted, or sometimes even used directly, in other fields. One example is the education field, e.g., as a standardized creativity measurement procedure and for developing a tool for creativity assessment [62]. Another example can be found in the human resource management field, for evaluating the relationship between organizational culture, leadership behavior, and the job satisfaction of employees [63]. The model is general enough to address other domains, though a few modifications may be required: (a) when the participant population is diverse, some refinement might be required to evaluate the privacy sensitivity; (b) the metrics vary from one questionnaire to another, in which case some mathematical adjustments are required; (c) subgrouping is domain-dependent, and specific methodologies can be developed; and (d) when the questionnaire is not balanced, i.e., not all questions have the same weight, this parameter must be accommodated in the optimization model. These modifications can be investigated in further research, thereby extending the methodology to other fields.
It is noteworthy that the segmentation into subgroups was made first by exploratory factor analysis and was then confirmed by experts. While this approach is legitimate and proven empirically to yield positive results, other approaches can be proposed in further research. Furthermore, an overlap between subgroups may be considered, as some questions are expected to indicate high correlations with more than one group.
Another aspect that should be considered is the statistical significance of the assessment tool. Our method is based on omitting sensitive questions, i.e., questions with a negative contribution to the calculated index. However, the more questions the questionnaire includes, the more accurate the end results. In this view, the questionnaire may be regarded as a survey, and the number of questions as the sample size, a variable well known to be positively correlated with evaluation accuracy. This issue can be handled in the early stages of questionnaire design, when a redundant bulk of questions is formulated and some are then dropped while a sufficient sample size remains. This approach is the equivalent of the Privacy by Design (PbD) approach [64], but viewed from the point of view of minimizing the costs of privacy protection, rather than of privacy protection itself as described above. Another extension of this research can be made by adopting a more specific level of privacy concern for an individual or for subgroups of individuals. Privacy concerns vary from one individual to another and can be measured [65]. Thus, instead of applying an average level to the whole population, specific levels can be attached.
As shown both in the Introduction and in the empirical study (the independent privacy sensitivity questionnaire), assessing systems skills may be biased due to privacy concerns. Thus, this research makes a significant contribution to improving the systems skills assessment process in particular, and also lays the foundations for improving the evaluation of other skills/traits. With the growing awareness of privacy, the importance of this method is also set to increase. Furthermore, this research does not examine correlations between demographic factors and the sensitivity of certain questions. This relationship may be investigated in further research, motivated by the goal of also optimizing the questionnaire according to the subject’s profile.
Finally, this research examines the privacy aspect in systems thinking assessment methods. However, there could be many other aspects that might bias the responses (not necessarily to systems thinking evaluation), such as cultural sensitivities, religious sensitivities, race sensitivities, etc. Future studies might examine the implementation of the suggested methodology in order to optimize the questionnaire in a more holistic frame.

5. Conclusions

Systems thinking skills are an important personal trait in many domains of the modern working environment. Pursuant to this significance, assessing systems thinking skills is also a process of high importance. However, current evaluation methods, which usually rely on questioning the tested subject, are exposed to bias as a result of the subject’s privacy concerns. This research accommodates these concerns in the model, and by doing so enhances the estimation accuracy. The empirical study, which proves the feasibility of the proposed methodology, opens a window onto a much wider world of personal trait evaluation in which the privacy aspect is in force.

Author Contributions

Conceptualization, methodology, validation, formal analysis, investigation, resources, data curation, writing—original draft preparation, and writing—review and editing were all handled by R.S.H. and S.K. Both authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was authorized by the institutional (Ariel University) ethics committee, authorization number AU-ENG-SK-20200810.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Checkland, P. Systems thinking. In Rethinking Management Information Systems; Currie, W., Galliers, B., Eds.; Oxford University Press: Oxford, UK, 1999; pp. 45–56.
2. Midgley, G. Systems Thinking; SAGE Publishing: Thousand Oaks, CA, USA, 2003.
3. Cabrera, D.; Colosi, L.; Lobdell, C. Systems thinking. Eval. Program Plan. 2008, 31, 299–310.
4. Monat, J.P.; Gannon, T.F. Applying systems thinking to engineering and design. Systems 2018, 6, 34.
5. Richmond, B.; Peterson, S. An Introduction to Systems Thinking; High Performance Systems Inc.: Lebanon, VA, USA, 2001.
6. Arnold, R.D.; Wade, J.P. A definition of systems thinking: A systems approach. Procedia Comput. Sci. 2015, 44, 669–678.
7. Gharajedaghi, J. Systems architecture. In Systems Thinking: Managing Chaos and Complexity: A Platform for Designing Business Architecture; Elsevier: Amsterdam, The Netherlands, 2011; pp. 213–214.
8. Jaakkola, H.; Thalheim, B. Architecture-driven modelling methodologies. In Information Modelling and Knowledge Bases; IOS Press: Amsterdam, The Netherlands, 2010; pp. 97–116.
9. Wilson, B.; van Haperen, K. Soft Systems Thinking, Methodology and the Management of Change; Palgrave: London, UK, 2015.
10. Richmond, B. System dynamics/systems thinking: Let’s just get on with it. Syst. Dyn. Rev. 1994, 10, 135–157.
11. Richmond, B. The “Thinking” in Systems Thinking: Seven Essential Skills; Pegasus Communications: Bala Cynwyd, PA, USA, 2000.
12. Richmond, B. Systems Thinking: Four Key Questions; High Performance Systems (HPS): Watkinsville, GA, USA, 1991.
13. Senge, P.M. The Art and Practice of the Learning Organization; Doubleday: New York, NY, USA, 1990.
14. Senge, P. The Fifth Discipline; Currency: New York, NY, USA, 1990.
15. Balle, M. Managing with systems thinking: Making dynamics work for you in business decision-making. J. Mark. Res. Soc. 1997, 39, 628–630.
16. Kenett, R.S.; Lavi, Y. Integrated models in healthcare systems. In Systems Thinking: Foundation, Uses and Challenges; Moti, F., Haim, S., Sigal, K.-K., Eds.; Nova Science Publishers: Hauppauge, NY, USA, 2016.
17. Leshno, M.; Menachem, Y. A system thinking in medicine and health care: Multidisciplinary Team (MDT) approach for Hepatocellular carcinoma (HCC). In Systems Thinking: Foundation, Uses and Challenges; Moti, F., Shaked, H., Sigal, K.-K., Eds.; Nova Science Publishers: Hauppauge, NY, USA, 2016.
18. Goldstein, E. Identity Economics, System Thinking, and Education. In Systems Thinking: Foundation, Uses and Challenges; Moti, F., Haim, S., Sigal, K.-K., Eds.; Nova Science Publishers: Hauppauge, NY, USA, 2016.
19. Koral-Kordova, S.; Frank, M.; Miller, A.N. Systems thinking education—Seeing the forest through the trees. Systems 2018, 6, 29.
20. Davidz, H.L.; Nightingale, D.J. Enabling systems thinking to accelerate the development of senior systems engineers. Syst. Eng. 2008, 11, 1–14.
21. McDermott, T.; Freeman, D. Systems thinking in the systems engineering process: New methods and tools. In Systems Thinking: Foundation, Uses and Challenges; Moti, F., Shaked, H., Sigal, K.-K., Eds.; Nova Science Publishers: Hauppauge, NY, USA, 2016.
22. Zhang, Y. Dynamic Systems with Multiple Elements. In Systems Thinking: Foundation, Uses and Challenges; Moti, F., Shaked, H., Sigal, K.-K., Eds.; Nova Science Publishers: Hauppauge, NY, USA, 2016.
23. Checkland, P. Soft systems methodology. Hum. Syst. Manag. 1989, 8, 273–289.
24. Jackson, M. Critical Systems Thinking and the Management of Complexity; John Wiley & Sons: Oxford, UK, 2019.
25. Nagahi, M.; Jaradat, R.; Goerger, S.R.; Hamilton, M.; Buchanan, R.K.; Abutabenjeh, S.; Ma, J. The impact of practitioners’ personality traits on their level of systems-thinking skills preferences. Eng. Manag. J. 2020, 1–18.
26. Banson, K.; Nguyen, N.; Bosch, O. A systems thinking approach to the structure, conduct and performance of the agricultural sector in Ghana. Syst. Res. Behav. Sci. 2018, 35, 39–57.
27. Cabrera, D.; Cabrera, L.; Powers, E. A unifying theory of systems thinking with psychosocial applications. Syst. Res. Behav. Sci. 2015, 32, 534–545.
28. Moldavska, A.; Welo, T. Development of manufacturing sustainability assessment using systems thinking. Sustainability 2016, 8, 5.
29. Frank, M. Assessing the interest for systems engineering positions and other engineering positions’ required capacity for engineering systems thinking (CEST). Syst. Eng. 2010, 13, 161–174.
30. Plate, R.; Monroe, M. A structure for assessing systems thinking. Creat. Learn. Exch. 2014, 23, 1–3.
31. Grohs, J.R.; Kirk, G.R.; Soledad, M.M.; Knight, D.B. Assessing systems thinking: A tool to measure complex reasoning through ill-structured problems. Think. Skills Creat. 2018, 28, 110–130.
32. Assaraf, O.; Orion, N. Development of system thinking skills in the context of earth system education. J. Natl. Assoc. Res. Sci. Teach. 2005, 42, 518–560.
33. Hooper, M.; Stave, K. Assessing the effectiveness of systems thinking interventions in the classroom. In Proceedings of the 26th International Conference of the System Dynamics Society, Athens, Greece, 20–24 July 2008.
34. Arnold, R.D. A complete set of systems thinking skills. Insight 2017, 20, 9–17.
35. Lavi, R.; Dori, Y.J.; Wengrowicz, N.; Dori, D. Model-based systems thinking: Assessing engineering student teams. IEEE Trans. Educ. 2019, 63, 39–47.
36. Buckle, P. Maturity models for systems thinking. Systems 2018, 6, 23.
37. Koral-Kordova, S.; Frank, M. Model for describing the systems thinking factors. In Systems Thinking: Foundation, Uses, and Challenges; Moti, F., Shaked, H., Sigal, K.-K., Eds.; Nova Science Publishers: Hauppauge, NY, USA, 2016.
38. Bloom, B.; Engelhart, M.; Furst, E.; Hill, W.; Krathwohl, D. Taxonomy of Educational Objectives: The Classification of Educational Goals. Handbook I: Cognitive Domain; David McKay Company: New York, NY, USA, 1956.
39. Kordova, S.; Frank, M. Systems thinking as an engineering language. Am. J. Syst. Sci. 2018, 6.
40. Jain, S.; Dubey, S.; Jain, S. Designing and validation of questionnaire. Int. Dent. Med. J. Adv. Res. 2016, 2.
41. Bo, C.; Shen, G.; Liu, J.; Li, X.-Y.; Zhang, Y.; Zhao, F. Privacy tag: Privacy concern expressed and respected. In Proceedings of the 12th ACM Conference on Embedded Network Sensor Systems, Memphis, TN, USA, 3–6 November 2014; pp. 163–176.
42. Anic, I.-D.; Budak, J.; Rajh, E.; Recher, V.; Skare, V.; Skrinjaric, B. Extended model of online privacy concern: What drives consumers’ decisions? Online Inf. Rev. 2019, 43, 799–817.
43. Culnane, C.; Leins, K. Misconceptions in privacy protection and regulation. Law in Context Socio Legal J. 2019, 36, 2.
44. Regan, P.M. Privacy as a common good in the digital world. Inf. Commun. Soc. 2002, 5, 382–405.
45. Mokrosinska, D. Privacy and autonomy: On some misconceptions concerning the political dimensions of privacy. Law Philos. 2018, 37, 117–143.
46. Dorraji, S.E.; Barcys, M. Privacy in digital age: Dead or alive?! Regarding the new EU data protection regulations. Soc. Technol. 2014, 4, 292–305.
47. Li, H.; Yu, L.; He, W. The impact of GDPR on global technology development. J. Glob. Inf. Technol. Manag. 2019, 22.
48. European Commission. Data Protection. 2021. Available online: https://ec.europa.eu/info/law/law-topic/data-protection_en (accessed on 2 March 2021).
49. Hirschprung, R.S.; Leshman, O. Privacy disclosure by de-anonymization using music preferences and selections. Telemat. Inform. 2021, 59, 101564.
50. Redmiles, E.M.; Kross, S.; Mazurek, M.L. How well do my results generalize? Comparing security and privacy survey results from MTurk, web, and telephone samples. In Proceedings of the 2019 IEEE Symposium on Security and Privacy (SP), San Francisco, CA, USA, 19–23 May 2019; IEEE: New York, NY, USA, 2019; pp. 1326–1343.
51. Warner, S.L. Randomized response: A survey technique for eliminating evasive answer bias. J. Am. Stat. Assoc. 1965, 60, 63–69.
52. Choi, B.C.; Pak, A.W. A catalog of biases in questionnaires. Prev. Chronic Dis. 2005, 2, A13.
53. Robertson, R.E.; Tran, F.W.; Lewark, L.N.; Epstein, R. Estimates of non-heterosexual prevalence: The roles of anonymity and privacy in survey methodology. Arch. Sex. Behav. 2018, 47, 1069–1084.
54. Kordova, S.K.; Frank, M. Can we train management students to be systems thinkers—Additional results. In Proceedings of the 61st Meeting of the International Society for the Systems Sciences, Vienna, Austria, 9–14 July 2017.
55. Anastasi, A. Psychological Testing, 6th ed.; Macmillan: New York, NY, USA, 1988.
56. Cronbach, L.J. Coefficient alpha and the internal structure of tests. Psychometrika 1951, 16.
57. Malandrino, D.; Petta, A.; Scarano, V.; Serra, L.; Spinelli, R.; Krishnamurthy, B. Privacy awareness about information leakage: Who knows what about me? In Proceedings of the 12th ACM Workshop on Privacy in the Electronic Society, Berlin, Germany, 4–8 November 2013; Association for Computing Machinery: New York, NY, USA, 2013; pp. 279–284.
58. Kani-Zabihi, E.; Helmhout, M. Increasing service users’ privacy awareness by introducing on-line interactive privacy features. In Proceedings of the 16th Nordic Conference on Secure IT Systems, NordSec 2011, Tallinn, Estonia, 26–28 October 2011; Springer: Berlin, Germany, 2011; pp. 131–148.
59. Torra, V. Privacy in data mining. In Data Mining and Knowledge Discovery Handbook; Oded, M., Lior, R., Eds.; Springer: Boston, MA, USA, 2010.
60. Hamza, R.; Yan, Z.; Muhammad, K.; Bellavista, P.; Titouna, F. A privacy-preserving cryptosystem for IoT E-healthcare. Inf. Sci. 2020, 527, 493–510.
61. Li, W.; Milletarì, F.; Xu, D.; Rieke, N.; Hancox, J.; Zhu, W.; Baust, M.; Cheng, Y.; Ourselin, S.; Cardoso, M.J.; et al. Privacy-preserving federated brain tumour segmentation. In International Workshop on Machine Learning in Medical Imaging; Springer: Cham, Switzerland, 2019; pp. 133–141.
62. Trisnayanti, Y.; Khoiri, A.; Miterianifa; Ayu, H. Development of Torrance test creativity thinking (TTCT) instrument in science learning. In Proceedings of the AIP Conference Proceedings 2194, Surakarta, Indonesia, 26–28 July 2019; AIP Publishing LLC: College Park, MD, USA, 2019.
63. Tsai, Y. Relationship between organizational culture, leadership behavior and job satisfaction. BMC Health Serv. Res. 2011, 11, 1–9.
64. Hustinx, P. Privacy by design: Delivering the promises. Identity Inf. Soc. 2010, 3, 253–255.
65. Hirschprung, R.; Toch, E.; Bolton, F.; Maimon, O. A methodology for estimating the value of privacy in information disclosure systems. Comput. Hum. Behav. 2016, 61, 443–453.
Figure 1. The process of optimizing a systems thinking questionnaire in order to minimize inaccuracies caused by biased answers due to privacy concerns.
Figure 2. The distributions of occupations among the empirical study populations. (a) The privacy-assessed group ($PAR_a$); (b) the systems-thinking-assessed group ($PAR_b$).
Figure 3. The distribution of the privacy concern level across questions.
Figure 4. The distribution of the scores of systems thinking skills.
Table 1. Pre- and post-$\rho_T$ of the whole questionnaire and of each subgroup.

| Group | Items | Threshold | Pre-$\rho_T$ | Items That Were Dropped | Post-$\rho_T$ |
|---|---|---|---|---|---|
| All items ($Q$) | 1–21 | 2.4 | 0.595 | 11, 12, 15, 17–21 | 0.633 |
| Group 1 ($s_1$) | 3, 5, 6, 7, 11, 12, 14, 16, 17, 18, 20 | 2.4 | 0.443 | 11, 12, 17–20 | 0.562 |
| Group 2 ($s_2$) | 1, 2, 4, 8, 9, 10, 13, 15, 19, 21 | 2.4 | 0.497 | 15, 19, 21 | 0.582 |