Maturity Models for Systems Thinking

Robert B. Willumstad School of Business, Adelphi University, 1 South Avenue, Garden City, NY 11530, USA
Systems 2018, 6(2), 23;
Submission received: 27 March 2018 / Revised: 27 May 2018 / Accepted: 31 May 2018 / Published: 11 June 2018
(This article belongs to the Special Issue Systems Thinking)


Recent decades have seen a rapid increase in the complexity of goods, products, and services that society has come to demand. This has necessitated a corresponding growth in the requirements demanded of organizational systems and the people who work in them. The competence a person requires to be effective in working in such systems has become an area of increased interest to scholars and practitioners in many disciplines. How can we assess the degree to which a person is executing the competencies required to do good systems work? Several industries now utilize maturity models in the attempt to evaluate and cultivate people’s ability to effectively execute complex tasks. This paper will examine current thought regarding the value and pitfalls of maturity models. It will identify principles and exemplars that could guide the development of a Maturity Model of Systems Thinking Competence (MMSTC) for the varied roles people inhabit in systems contexts.

1. Introduction

There are many aspects of systems science that occupy the minds of its practitioners. Increasing the widespread use of high-quality systems thinking is among them. Enthusiasm for this goal is based on the conviction that thinking systemically and applying systems knowledge to critical issues are of evolutionary significance to our world. Fueled by this conviction, the systems body of knowledge is growing, being codified, and evaluated for its progress in occupying a rightful place among other scientific disciplines.
Recent work on systemology (e.g., [1,2]) is working to formalize systems knowledge so that educational efforts can give people “clear concepts and a common language that gives them the capability to articulate and reflect on this (systemic) sensibility, and act upon it in a considered way” [3] (p. 3). Further, formalization and other efforts to increase the quality of systems knowledge can enable systems thinkers to “gain influence in supporting organizations, and through that influence to better enable systems thinking and acting of individuals and groups, which may (in turn) lead to more quality in how people deal with complex challenges” [3] (p. 5). At a macro level, the systemology initiative is working to organize our understanding of the body of knowledge about systems relative to other scientific disciplines and to identify key gaps in that body of knowledge.
At a micro level, individual people who consciously work with systems inhabit varying roles—systems engineers, systems scientists, systems dynamicists, and systems practitioners among them. People in each of these roles deal with varying issues and situations and therefore face varying technical demands; a great range of systems competencies is required for the different roles that systems workers play. Future research would be useful in identifying the range of roles people play with respect to systems and in clarifying the characteristics of the varied personae they require to do systems work. The present paper addresses a different matter. Alongside the ongoing macro-level work to identify the maturity of the systems body of knowledge, there is a need to address, at a more micro level, the levels of competence required for systems thinkers to be effective.
From a “resource-based view of the firm” [4], the unique competencies of people are a resource as crucial as the combination of capital, physical, and other tangible resources that make up a human system. Measuring competence levels, it is reasoned, should be as important to an organization as measuring operational efficiency or financial performance [5]. In business, things get measured so that action can be taken to manage and improve them. Increasingly, this premise is being applied to the degree of maturity that an organization’s people exhibit, i.e., the degree to which they are applying the knowledge and behaviours necessary to achieve excellence. There is increasing interest in identifying reliable ways of measuring such competence. Without the ability to reliably measure competence, it is difficult to strategically focus organizational attention to things such as “staff development, recruitment and selection, professional registration, training needs, analysis and planning, job descriptions, assessment and appraisal” [6] (p. 16). Without this measurement ability, pragmatic action to increase competence maturity is difficult to take.
From the academic standpoint, Rašula, Vukšić, and Štemberger have noted that without reliable ways to measure competence maturity, “a comprehensive theory of knowledge or knowledge assets is very difficult to develop. Consequently, there is no visible progress in the effort to treat knowledge as a variable to be researched, or asset to be managed” [5] (p. 48). The argument here is that maturity assessment is important in developing a common understanding of what comprises fundamental concepts such as knowledge and development. Creating a maturity assessment demands “refinement of a general, unified representation” of such concepts across specialties [7] (p. 83), which can enable consistency in communicating and operationalizing these ideas. These considerations, arising from organizational theory, are relevant to the issues identified by the editors of this special issue on systems thinking. Without reliable ways to measure the maturity of anyone’s competence in systems thinking, it is difficult to develop a theory of systems knowledge. It is difficult to argue for the treatment of systems thinking as an asset for an individual, an organization, a profession, or an academic discipline. It is difficult to agree on how to define exactly what is systems thinking and difficult to know how to develop it.
We can consider another difficulty. The presence of maturity models can assist the development of a comprehensive theory about systems thinking. At the same time, a comprehensive theoretical foundation would be invaluable in developing a maturity model. Which should come first: models of systems thinking maturity, or theories about systems thinking maturity on which models could be based? Perspectives on how such efforts should be sequenced differ according to stakeholder group. Businesses want to focus on what is known about success, what works, and what can be improved. They want models that enable employees to take pragmatic action on present-day demands. They do not have the luxury to await fully developed theories and formalized knowledge before action can be taken. Yet, given the pressing nature of many of the problems to which systems thinking is being applied, undertheorized systems thinking also seems to be a risk. Here, the academic community can help to mitigate the risk, clarifying knowledge and other relevant constructs, thus helping industry to have confidence in the validity of what gets measured and is believed to create success. Scholarly efforts can improve the advancement of science (while also complementing business’ concerns) by focusing on the development of understanding while being less constrained by demands for short-term focus and commercial profitability [8]. Thus, the question of how to prioritize systems thinking maturity models versus theory is a question of objectives: is the goal to make contributions to systems thinking knowledge, or contributions to successful systems projects? Academics and industrialists would answer this differently.
Despite assumptions that people who run organizational systems and people who study them might have similar interests [9], disjunctions between academics and practitioners have long existed.1 “The gap is not restricted to the organizational sciences, but rather it is found in nearly all fields in which there are both researchers and practitioners” [8] (p. 340). There are reasons for this. Among them, systems scholar Ian Mitroff has observed that academics and practitioners differ in the types of information they believe constitute valid bases for action [11]. Such differences notwithstanding, the efforts of those interested in research and applied practice have been shown to benefit both [9,10]. Studies have shown that both quality and rate of knowledge creation are enhanced when tensions are present between differing stakeholders, disciplines, hypotheses, theories, prototypes, and implementation strategies [12,13,14,15,16]. This would suggest that two parallel streams of focus—pragmatic maturity model development, and scholarly development of theory about systems thinking maturity—would both be valuable; each would be incomplete without the other, and each would be strengthened by continued improvements in the other.
Thus, given the prospect of gains for both theory and practice, maturity models have emerged in many disciplines: psychology, business processes, project management, and knowledge management among them. In each case, the models have enabled comparisons among people and have had the effect of normalizing specific skills and behaviours deemed important to effectiveness [17]. As such, maturity models are having increased significance in aiding those interested in pursuing performance excellence in many realms of human endeavour [18]. Were systems communities to embark on the task of developing a means of assessing systems thinking maturity, we would be closer to clarifying what comprises systems competence across the widely varying subspecialties of systems theory and practice. Identifying unified, disciplined ways of representing systems thinking competence would go far to promote understanding and cohesion among these varied schools of systems thought. Likewise, it could contribute to the communities’ ability to promote the value of systems knowledge in society and improve the effectiveness of cross-disciplinary dialogue that systems communities could have with other scientific disciplines.
The purpose of this paper is not an exhaustive survey of existing maturity models. Rather, it will discuss construct definitions, types of maturity models, critiques, and design considerations with an eye to examining how maturity models could be useful evaluation tools for systems thinking proponents to consider.

2. Maturity Models: Fundamentals

Maturity models are premised on the idea that successful performance is the result of effectively used knowledge. As such, models are tools for practitioners striving for excellence, groups interested in promoting collective expertise, and for scholars working to build developmental theories [19,20]. However, when applied to systems thinking, an immediate problem presents itself: what is systems thinking actually? The term is ubiquitous [21], and over the past 50 years many scholars have posited many definitions, e.g., [22,23,24,25]. Additionally, a sizable popular literature has emerged to teach people what it is and how to do it [26]. A central feature of systems thinking articles and books, in most cases, is the claim that becoming a skilled systems thinker rests on understanding what a system is and why systems tend to operate as they do.
According to social psychology, a fundamental first step in any epistemic is a knowledge domain [27]. In many fields, such a requirement is institutionally recognized: to be an accountant, one must have accounting knowledge; to be an historian, one must study history; to be an engineer, one must know one of several core disciplines; etc. [28]. It follows that the same logic would apply to an epistemic of systems thinking. However, regarding systems thinking, Checkland has been dismissive about anything that might be considered a knowledge domain:
This is about the limit of what we can say about every example of systems thinking… there will be an observer who gives an account of the world, or part of it, in systems terms; his purpose in so doing; his definition of his system or systems; the principle which makes them coherent entities; the means and mechanism by which they tend to maintain their integrity; their boundaries, inputs, outputs, and components; their structure….
[24] (p. 102)
Despite this pessimistic tone, a case can be made that a knowledge domain about systems themselves does exist.
For example, the literature pertaining to sociotechnical systems (e.g., [29] and elsewhere) is consistent in describing fundamental tenets such as these:
People exist in relationship [21] and “systems … are intrinsically concerned with relationships” [24] (p. A24). Anyone attempting a systemic view must attend to “detail complexity” (i.e., the number and individual characteristics of a system’s members [30]). Also important are the number and qualities of relationships that a system contains.
Members of systems require one another to achieve their goals. They are connected in relationships of interdependence [31,32]. The strength of interdependence (or “interconnection” [33]) among members varies, affecting the degree to which a system can be viewed as cohesive.
Human systems are purposive [24]. Such purposes arise from the ways members organize meanings [25]. A system’s behaviour expresses explicitly espoused purposes [34] and also tacit purposes which may be unrecognized by members themselves [35,36].
The manner in which systems are organized (intentionally or otherwise) arises from interactions among the system’s members [23,37]. While such interactions tend to be patterned, they are also dynamic [21,38]. Hence, emergence is a feature of human systems [39].
Sociotechnical systems contain dichotomies and tensions [26]. Working effectively with these requires, for example, that one consider both a system’s parts [25,38] and its nature as a coherent whole [24,36,39], akin to working with both “figure and ground”, as described by Gestalt principles of perception [40].
Though not exhaustive of the principles elucidated in systems books and articles, these tenets illustrate central ideas in the systems knowledge domain. Still, how can we know when a person is relating to such knowledge maturely or not?
Designers of maturity model tools imply that the way one uses knowledge reflects one’s location on a scale of immaturity-to-maturity.2 Popular encyclopedias such as Wikipedia convey that maturity involves:
  • The ability to appropriately respond to the environment;
  • Demonstrated capacity for effective decision making, suitable to context;
  • The attainment of sophistication in the flexible use of knowledge and performance of behaviours;
  • Appropriate levels of self-reliance and autonomy (in contrast to dependence on the oversight of authority figures);
  • The ability to consider options and seek relevant advice, when appropriate; and
  • The ability to exercise appropriate temperance and discipline, taking calculated risks without undue impulsivity.
People exhibit these abilities to varying degrees (that is, at various levels of maturity). The means by which maturity is demonstrated is competence, a concept that has attracted growing attention since a seminal publication in the educational testing literature in the 1970s [43]. Since then, it has become a construct of increasing interest in theories of organizational behaviour, popularized by Boyatzis [44] and other scholars. A comprehensive review of competence is beyond the scope of this paper, but for our purposes, in the realm of maturity models, competence at systems thinking would encompass a collection of traits, motives, self-image, and perceptions of social norms and behaviour enabling a person or group to direct systems knowledge in such a way that desired results are consistently achieved [6,45].
“Desirability” of the results achieved is readily measured in organizational settings. For example, in education, a curriculum gets delivered that is shown to enable students to meet intended learning objectives; in systems engineering, a complex structure is delivered that is designed to function effectively across its life cycle; in artificial intelligence, machines are created to accurately mimic human cognition. In any of these cases, success is measured in how well a team identifies and solves necessary problems in such a way that they develop and execute projects that meet sponsors’ intended goals and success criteria [29,38,46].
However, binary thinking (e.g., desirable results vs. undesirable results, maturity vs. immaturity) could readily lead us to view systems thinking as something one can do or cannot. Writers have argued to the contrary, that systems thinking can be developed [21,29]. The maturity model literature takes pains to state that the use of specific knowledge is deemed important both for the capacity to operate in a competent manner [6] and for improving that capacity: “Improvement … require[s] some guidance on what to improve, and in identifying improvement efforts that will provide the most value … Conducting assessments provides guidance in terms of current capabilities and identification of performance gaps, helping to identify where improvement is possible, necessary, or desirable” [47]. As we can speak of improving the capacity to deliver desirable results, so too does the maturity model literature speak of improving maturity. It does this by conceptualizing maturity in terms of gradations.
A central assumption of maturity models is that development of mature performance occurs in predictable patterns. This assumption is evident throughout scholarly papers in many disciplines that address maturity, for example:
  • In motivational theory: e.g., that human drives are arranged from basic to ultimate [48];
  • In management theory: e.g., that firms develop in stages along logical, predictable paths [49];
  • In agile software development: e.g., that practices evolve from ad hoc to continuous improvement [50].
Proponents of maturity models believe that maturity is comprised of “tightly defined, repeatable, and predictable processes [that] directly contribute” to capable behaviour [51] (p. 147). The sense-making patterns underlying capable behaviour with respect to systems are largely unknown at this time. If they were understood, then models could be built to diagnose the maturity of a person’s (or organization’s) current systems thinking practices (i.e., the stage at which practice has stabilized [52]), in order to understand that person’s (or organization’s) current standing relative to others who are competent in systems thinking [5,53].
Maturity models are guided by the assumption that specifically interlinked collections of competently used knowledge and skills [52] comprise coherent levels or stages of maturity. Levels are ordered, thereby creating a hierarchical concept system that enables comparative ranking of different persons (or groups) and models a process of evolution by which a person (or group) can move toward increasingly sophisticated and reliable performance [17,54]. Different maturity stages are understood to be appropriate for achieving tasks of varying levels of complexity, giving rise to the notion of competence fit, which is another facet of maturity models. Rather than assuming that the highest possible level of maturity is inherently necessary, adherents of maturity models generally agree that maturity level should be matched to the difficulty of the task at hand, problems to be solved, and environmental context in which one is operating. At present, there are no consensually agreed-upon levels of sense-making based on systems thinking skills that could inform the assignment of systems thinkers to systemically complex projects.
If it is true that developmental behaviours are predictable, then maturity models are diagnostic, anticipating the likelihood of success a systems thinker would have when faced with a systems problem and given that thinker’s current maturity level. Maturity models also claim to be prognostic, taking the idea of developmental predictability to mean that likelihood of success can be reliably increased by identifying areas of improvement that will progress one toward excellence. They do this by identifying areas of consensually defined weakness (or “fragility” [20]) that hinder optimal functioning [52]. As such, maturity models could facilitate planning, guidance, and control over future systems thinking performance by outlining what sense-making approaches in systems thinkers should be reinforced and prescribing the sense-making approaches of more mature stages as areas for prioritized development. “Organizations regularly invest in capability development; the capability maturity model aims to provide valuable guidance” in targeting training investment [20] (p. 146), in such a way as to strategically exploit existing capabilities and to strengthen potential ones [55]. Description and prescription lie at the heart of the maturity model’s purpose and promise.
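The dual diagnostic and prescriptive roles described above can be sketched as a small data structure. The level names, competencies, and enabling factors below are hypothetical illustrations invented for this sketch, not drawn from any published maturity model:

```python
# A minimal, hypothetical sketch of a staged maturity model:
# ordered levels, each with required competencies (descriptive use)
# and enabling factors that drive movement to the next level
# (prescriptive use). All names here are invented examples.

from dataclasses import dataclass, field

@dataclass
class Level:
    rank: int
    name: str
    competencies: set                              # behaviours expected at this level
    enablers: list = field(default_factory=list)   # drivers toward the next level

LEVELS = [
    Level(1, "ad hoc", {"identifies parts"}),
    Level(2, "defined", {"identifies parts", "maps relationships"},
          enablers=["training in interdependence analysis"]),
    Level(3, "managed", {"identifies parts", "maps relationships",
                         "anticipates emergence"},
          enablers=["reflective practice on dynamic patterns"]),
]

def diagnose(demonstrated: set) -> Level:
    """Descriptive use: the highest level whose competencies are all demonstrated."""
    current = LEVELS[0]
    for level in LEVELS:
        if level.competencies <= demonstrated:
            current = level
    return current

def prescribe(demonstrated: set):
    """Prescriptive use: competencies still missing for the next level up,
    together with the current level's enabling factors."""
    current = diagnose(demonstrated)
    nxt = next((l for l in LEVELS if l.rank == current.rank + 1), None)
    if nxt is None:
        return set(), []                           # already at the top level
    return nxt.competencies - demonstrated, current.enablers
```

For a thinker who demonstrates only the level-2 competencies, `prescribe` would report the missing level-3 competency and the level-2 enablers, mirroring how a model both locates a user and names drivers for advancement.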
As indicated above, Checkland’s claim was that every example of systems thinking is merely a matter of an observer with a purpose who defines a system and identifies the mechanisms that make its structure coherent [24] (p. 102). This view obscures the fact that observers differ in their capacities to define a system, identify its mechanisms, and understand its structural cohesion. Systems thinkers vary widely in their knowledge and skill, as do professionals of every sort. Accordingly, diverse parties have embarked on maturity model initiatives.

3. Varieties of Maturity Model

A variety of groups have made claims to understanding systems thinking maturity and codifying it. For example:
  • In the United States, Carnegie Mellon University developed a model for the Hewlett-Packard Consulting company. Sabre Airlines has developed an in-house maturity model [56], as has NASA [57,58]. MITRE formulated a competency model reflecting its particular branded approach to government-funded research and development [59].
  • Maturity models have been built based on analyses of human and business processes in European corporations such as Nokia in Finland and British Telecom in the United Kingdom [53].
  • INCOSE UK has constructed a systems engineering competency framework to guide individuals and teams working in enterprises represented by its advisory board [60].
In each instance, maturity was defined in terms of a particular application or domain of human activity, ranging from innovation-generating situations like project management, the collaborative development of new computer software, and the ability to leverage Big Data [51,52,61], to routine organizational operations where effectiveness-enhancing actions such as practices of reflection are said to signify and enhance an organization’s functioning [52]. Oriented toward different fields of human endeavour, what all of these maturity models have in common is the intent of formalizing and institutionalizing particular knowledge, skills, behaviours, values, and practices that are considered necessary for effective (i.e., mature) modes of operating in a particular context. The knowledge, skills, behaviours, values, and practices demanded in different fields of systems theory and practice are vast. However, the different contexts of human activity of interest to maturity modelers give rise to a variety of common characteristics in their models.
All maturity models aim to provide users with conceptual schemas for understanding how maturity is multifaceted in nature. Language used in models includes:
  • “pillar factors”, “dimensions”, and “axes” [17];
  • “structural components”, “phases”, “key agents”, and “externalities” [54];
  • “drivers” [50];
  • “input competencies” [6]; and
  • “tools” [53].
Models vary in the granularity with which they conceptualize these elements and their varied permutations. The degree to which models claim to be “tools” seems related to whether or not they portray maturity as a state resulting from tangible factors conducive to quantitative measurement by Likert scales or intangible factors better suited to qualitative description (the latter ranging from models using broad-based qualitative descriptions to those utilizing detailed descriptions that management theorists would characterize as “thick” or “rich” [62]). The ambition of some maturity models is to elucidate different modes of behaviour with respect to maturity, each useful in their own right. For example, models like these would describe clustered themes of systemic sense-making and the resulting behaviours. Such models would enable users to conceptualize qualitatively different ways that systems thinkers can function (not better or worse; merely different).
In contrast to maturity models that focus on qualities, the ambition of other models is to rank quantity—to rate different modes of functioning in terms of greater or lesser desirability. Such models seek to identify “poor” systems thinking in contrast to “good” systems thinking. Gradations are a feature common to these maturity models: they identify a range of human capabilities and behaviours and locate each in terms of its proximity to what designers understand as a state of ideal systems thinking maturity, thus creating a representation of distance and nearness to that state. Aspirational models like these vary in complexity, typically involving four or more levels (depending on whether or not stage zero is accorded any merit [17]), with each model including distinct components, dimensions, behaviours, or capabilities ranging from a few to upwards of 75 [20]. Such models are said to describe an ordered arrangement of levels, each understood as prerequisite to the next step along a singular evolutionary pathway oriented toward optimal performance (i.e., matureness) [18]. Users of these models can identify their location along the path via rating keys, or in some cases, exemplar situations given to represent how behaviour at each stage should look.
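The gradation idea can be illustrated with a toy rating key, in which Likert-style scores on a few dimensions are averaged and mapped to a named level representing proximity to an idealized end state. The dimensions, thresholds, and CMM-style level names below are invented for illustration only:

```python
# Toy illustration of a quantitative rating key. Likert scores (1-5)
# on several hypothetical dimensions are averaged, and the mean is
# mapped to a named maturity level. Thresholds and names are invented.

THRESHOLDS = [(4.5, "optimizing"), (3.5, "managed"),
              (2.5, "defined"), (1.5, "repeatable")]

def locate(ratings: dict) -> str:
    """Map per-dimension Likert ratings to a named maturity level."""
    mean = sum(ratings.values()) / len(ratings)
    for cutoff, level in THRESHOLDS:
        if mean >= cutoff:
            return level
    return "initial"

# Example: mean of 3.0 falls between the "defined" and "managed"
# cutoffs under these invented thresholds.
scores = {"holism": 4, "interdependence": 3, "emergence": 2}
```

A real instrument would of course need validated dimensions and empirically justified cutoffs; the sketch only shows the mechanical form that “distance and nearness” to an ideal state takes in quantitative models.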
Maturity models with a developmental orientation have an explicitly forward-moving telos, clarifying the meanings of desirable states such as superiority, mastery, and excellence. Where forward-moving progress is the aim, models vary in the degree to which they explicitly assist users in advancing along the path. Some focus on within-stage characteristics, clarifying each level in sometimes great detail; they facilitate progress only indirectly, by merely naming the subsequent stage to be achieved (“benchmarking models” are examples of this [53]). Developmentally oriented models take a more active and direct role in facilitating users’ progress, describing constraining forces that account for limited functioning (i.e., explaining why one is at a current level of maturity and not a higher one), and naming drivers or enabling factors that would facilitate movement toward each next stage [50].
While all maturity models describe different stages of maturity, considerably fewer make claims to have uncovered the mechanisms of movement necessary to ascend between stages. That is, models differ in the claims they make to be descriptive of different states of maturity, comparative of modes of maturity exhibited by different people, or prescriptive of what actions to take in order to better one’s level of maturity [54]. Thus, maturity models present themselves for two distinctly different purposes of use: (1) understanding how one operates, and (2) directing how to change that in favour of different or more useful ways of operating.
A final variance in existing maturity models is worth emphasizing. Few models portray maturity as context-free. Indeed, maturity itself should be defined in terms of one’s skillful engagement with contextual factors. Most maturity models identify contextual factors that are pertinent to users and provide descriptions of how effective engagement with those factors typically looks at each level of maturity [47]. Maturity models in different disciplines vary in the attention they draw to environmental factors as central to the development and display of maturity. Systems thinkers work in wide-ranging settings. Amidst other factors, systems thinkers deal with myriad customers, audiences, organizational cultures, and leadership dynamics [51]. All are exigencies that demand a systems thinker’s engagement if one were to perform at effective levels of maturity.

4. Criticisms

Fundamental to maturity models’ raison d’être is the claim that adoption of a maturity model will translate into actual value for individuals or organizations. It is not clear the degree to which these models deliver on that promise [47].
For a maturity model to claim efficacy, there must be comprehensible underlying theoretical constructs and mechanisms of action. Central to the matter is the assertion that concepts called best practices do exist, and that the assumptions and conditions necessary to attain them are clearly understood. Saying that maturity is the state necessary for this attainment is insufficient; models claiming to address maturity should present a well-developed characterization of the construct. Unfortunately, “the central term of maturity is seldom defined explicitly”, despite the number of entrants into the field of maturity models [54] (p. 338). It is unsurprising, then, that questions have been raised about how accurately maturity is being codified [47], how effectively it can be measured [50], and whether claims of its existence in hierarchical structures are an accurate representation of reality [20]. Maturity is a complex phenomenon involving many intricately related factors. As systems science has demonstrated, interaction among factors is central to understanding complex phenomena. Theorists and developers of maturity models have been criticized for having inadequately considered such interactions in developing the maturity construct and developing models that claim to account for it [20]. To best practices and maturity itself, we can add notions of transformation to the list of undertheorized aspects of maturity models. The allure of these models is the idea that they help a person or organization achieve ever-greater levels of maturity. Models vary in characterizing this transformation as change, development, or evolution. However, the mechanisms of action driving such transformations toward maturity are little understood.
From psychology, for example, “[o]bserved relations between stages of moral development and various forms of social conduct do not establish that the structures of moral reasoning that define stages of moral development exert a significant causal impact on moral behaviour” [63] (p. 672). In the field of business process management, dynamics that generate movement among stages seem even less well understood: “All models implicitly expect organizations to eventually reach the top of the maturity ladder” [54] (p. 339), reflecting little understanding of the processes by which this destination is to be reached. Where mechanisms of action that drive the maturation process are described, maturity modelers argue that these, and not others, are to be followed. This has the effect of devaluing alternative approaches and rendering the modes of thinking used by all but a certain (i.e., “mature”) segment of people “deviant and atypical, rather than reasonable and relevant” [47] (p. 172).3 Enthusiasm for the idea of maturity models notwithstanding, a slipshod approach to building a solid theoretical infrastructure has left such models open to justified criticism.
Maturity models are particularly vulnerable in this respect: the literature generally agrees that maturity models do a good job of describing various degrees of maturity, but where models claim prescriptive insight, they often fail to meet users’ expectations. We see this disappointment in users’ claims that maturity models are oversimplifications of the lived complexities that users experience [53]—surely a concern of systems thinkers—presenting optimistic messages that maturity is a state eventually reached, yet remaining vague on the details of how this actually occurs [67]. Where such details are forthcoming, maturity models attract criticism that the attention they do give to movement between maturity stages is vague or prescribes “step-by-step recipes” [54] that often do not work. Conversely, some models are so complex that they likewise fail to provide the promised rewards [20]. Whether too simple or too complex, if maturity models are to achieve fitness for use, the complexity of the frameworks they offer must reflect the needs of users. Likewise, the prescriptions they offer must fit the resources available to individuals or organizations. When models are too costly to adopt relative to the rewards they claim to offer, no one wins. When they rigidify the maturity pathways they espouse, individuals or organizations whose problems and environments differ from those envisioned by model designers are left to try force-fitting their way to maturity, usually unsuccessfully, or to customize models in ad hoc ways that may also fail [20]. Describing levels of maturity is relatively easy. Recommending pragmatic pathways by which it is to be developed demands attention to real-world impediments to mature behaviour, the difficulties involved in overcoming such impediments, and the need for feasible, flexible guidance that works.
“What works” is, of course, a matter of evidentiary support, and here is where the most damning critiques of maturity models arise. The increasing numbers of maturity models suggest interest in authoritative insight and expertise on how people in varying jobs can operate more maturely; this enthusiasm has, however, been unmatched by actual scientific study to validate such claims. Despite the intuitive appeal of maturity models and anecdotal confirmation that they are useful, research that studies their rigor, validity, or usefulness in correlating model prescriptions with actual success is scarce [45]. When enthusiasm outweighs empirical evidence, the value of a particular model, and maturity models in general, is called into question.
The criticisms that maturity models face are fundamental and appropriate. If such models are to achieve what they set out to achieve, academic communities must undertake serious reflection about the characteristics of maturation to replace the vague belief that it is associated with development in a good direction. (For this reason, scholars in fields such as information science have called for the development of research standards for model designers [53].) Despite the warranted misgivings, there are models that are believed to be relevant and worthwhile: “Certain maturity and competency models might be robust enough to become the global standard for certification purposes” [6] (p. 11). This possibility, that maturity models can stand as international standards of how effective functioning can be measured and developed, has inspired this brief overview.

5. Design Considerations for a Maturity Model of Systems Thinking Competence (MMSTC)

As discussed earlier, knowledge about systems is foundational to systems thinking. Broadly speaking, knowledge tends to be associated with thinking. However, it has been argued that a person can know facts about systems without being a systems thinker [26]. This argument is consistent with Kruglanski’s epistemic theory, which recognizes that possessing a knowledge domain is necessary, yet insufficient [27]. Another key element of epistemics is the particular modes of thinking that are conducive to perceiving something accurately. Kruglanski calls these “welcoming cognitive conditions”—the mental skills and cognitive stances one requires to focus one’s understanding in order to apprehend a thing. Here, we would say that being a systems thinker requires knowing facts and also utilizing particular ways of perceiving those facts [37]. Coming to know systems facts is a task readily handled by universities; institutions training one in how to do systems thinking are lacking [68].
This is not to say that all existing maturity models for systems thinking have overemphasized knowledge and neglected cognition. The engineering field has several exemplars of models that include cognitive skills, e.g., [33,38,58] and others. However, fields of systems thinking beyond systems engineering are largely deficient in articulating what we might term welcoming cognitive conditions for systems thinking. A Maturity Model for Systems Thinking Competence beyond engineering, then, would incorporate key elements of the domain of general systems knowledge as it is presently understood. For any model to be considered worthwhile, it should also include cognitive orientations deemed necessary for systems thinkers in any discipline to perform systems thinking in competent ways. The facility and sophistication with which one uses them would be indicative of one’s level of systems thinking maturity, and without such orientations, it is questionable whether one could be said to be using systems thinking at all. What might be welcoming cognitive conditions necessary for systems thinking? The following could be considered:
An orientation toward causality: A system’s structure is made up of causally linked variables [31]. A focus of systems thinking is identifying both those variables and the nature of the causal relations among them. To Checkland, focusing one’s attention on simple cause and effect sequences is an inferior form of thinking [24], or at least, not one to be understood as systems thinking. Rather, the degree to which systems thinkers succeed in orienting themselves to multiplicities of causal relationships suggests the degree to which a system’s complexity will be appropriately understood.
An orientation toward logic: Systems thinkers regularly face phenomena that do not make sense, or rather, the sense underlying a system’s behaviour is not always readily apparent. The logic of a system—the “set of principles underlying the arrangement of elements” (Oxford Dictionary)—is what must be grasped if one is to understand the way a particular system coheres, what makes it robust, and how it maintains its equilibrium [24,31]. In the face of complexity, people tend to oversimplify why a system is behaving as it is or to dismiss it as illogical [36]. Neither cognitive strategy supports the ability to accurately discern the logic underlying a particular system’s behaviour.
An orientation toward explicit and implicit structures: Systems can appear explicit, composed of obvious rules giving rise to understandable behaviours; much of what shapes behaviour, however, lies in implicit structures. The awareness that structure generates behaviour [30] enables a systems thinker to avoid attribution errors common among people who do not understand systems (for example, assuming that a particular person’s motivation or actions have caused a system’s “problem”). Knowing that behaviour is expressive of structure, a systems thinker wanting to change counterproductive behaviour will imagine potential changes to structural design, rather than assuming that a simple substitution of “problematic” elements for others will overcome the confluence of implicit structural factors delimiting how those elements will likely behave [69,70].
An orientation toward subjectivity: Subjectivities are particularly potent features of human systems [71]. Each person possesses mental models of reality that, taken together, create and sustain a system’s identity [70] and the degree to which that system is able to learn and change. Mental models eliminate feelings of ambiguity and influence how a system’s members think and act; “unless you understand them, you will not understand the system” [70] (p. 147). Yet, inherently, mental models are an aspect of human subjectivity not available for direct measurement [72]. Thus, systems thinkers must orient themselves toward the difficult work of eliciting communication about mental models, translating people’s tacit perceptions into discussable language [31,73]. While doing this, they must also refrain from imposing their own judgments on the models of others [25]—striving for a “rigorous approach to the subjective” [24] (p. A43).
An orientation toward self-reflection: In systems thinking, understanding the mental models of members of a system—from their perspective—is necessary [70], and necessarily difficult. Always, it must be acknowledged that human subjectivity—others’ and our own—is incomplete. While subjectivity enables us to function in reality, it does not provide a full or fully accurate representation of a system in which we are operating. Systems thinkers themselves hold incomplete (and thus inaccurate) understandings of a system. They, no less than anyone, hold preconceived values, “taken as given” assumptions, and their own personalized logics [24,31], all of which can obstruct understanding and clear communication. As such, systems thinkers must orient themselves outwardly toward the systems they seek to understand and also inwardly toward themselves [26]. Discomfort is an inherent part of such dual orientation; oppositional emotions (i.e., disbelief, disagreement, etc.) that emerge in systems thinkers are reliable signals that their personal subjectivities are being challenged [69] and must be consciously reflected upon [24] so that they can be usefully discussed with others.
These orientations toward causality, logic, explicit/implicit structures, subjectivity, and self-reflection could be considered mental stances without which systems thinking is not possible. With verification (to be discussed below), these orientations and/or others could come to be understood as welcoming cognitive conditions uniquely important to systems thinkers and as standards without which a Maturity Model for Systems Thinking Competence would be insufficient.
Were a Maturity Model of Systems Thinking Competence to be developed that would be relevant and worthwhile to systems thinkers in various professions worldwide, systems research communities would do well to glean lessons from modelers and theorists from other disciplines. I turn my attention now to highlighting key considerations to be addressed in any future initiatives to develop competence models for systems thinkers.
Consistent with the move underway to codify the nascent science of systemology [1,2], this author agrees with arguments against maturity models that rely solely on anecdotal assertions (including the welcoming cognitive conditions synthesized above). As Edson and Metcalf [74] have written, good systems research responds to the need to marry scientific discernment with lived experience. Any research initiative to establish a model of systems research competence must consider this. Models that describe levels of maturity solely based on anecdote will fail to meet the rigours of good science and will run the risk of misleading systems thinking practitioners who trust them. There is irony in the fact that most maturity models claim that one cannot skip steps on the path to mature standards of competency, while modelers in most disciplines have skipped the crucial step of empirically validating their own models. Without such validation, claims that experts understand the nature of systems thinking maturity, and that systems communities should measure themselves by those claims, tread on shaky ground. Thus, the development of a maturity model with an explicit theoretical base is vital.
A Maturity Model of Systems Thinking Competence should define constructs like maturity and maturation and must identify observable indicators of maturity levels and the characteristics of the paths that lie between them [54]. Numerous bodies of theory could provide useful guidance in the development of a systems thinking maturity model:
  • theories of human development in cognitive psychology can inform a theory of the maturation of systems thinking competence;
  • educational theories can inform understanding of systems thinking skills improvement;
  • convergence and divergence theories can help explain path dependencies among systems thinking maturity levels; and
  • theories of bounded rationality and information symmetry can inform understanding of how actors at varying levels of maturity make decisions and exert agency in both ineffective (i.e., immature) and effective (i.e., maturity-building) ways [18].
Much discussion about maturity could lead to the implicit assumption that maturity models focus on the maturity of individuals. While many do, it is also the case that some models focus on the maturity of organizations with respect to their competency in business processes, project management, agile software development, etc. It may be possible to create maturity models for individual systems thinkers working in education, management, or the social sciences, for example. It may also be useful to create maturity models for teams, for example, those working in fields such as artificial intelligence. Were there an initiative to develop a broadly applicable maturity model of collective competency in systems thinking, organizational theories could be useful in developing models for organizations in which systems thinking takes place [53]:
  • the resource-based view of the firm [4] has been mentioned earlier as useful for conceptualizing knowledge and skills as organizational assets;
  • organizational change theories [49] provide understandings about how change initiatives can be regulated as well as expectable conflicts, drivers, and impediments; and
  • life cycle theories and teleological theories of goal formation and implementation [75] can be useful in theorizing the development of organizational capabilities.
Systems thinking communities have at their disposal numerous theory candidates that can assist in the development of sound maturity models for both individuals and groups.
Building on a solid theoretical base, all the strategies and methods demanded of good systems research should be applied to any initiative to develop a Maturity Model of Systems Thinking Competence. Model development must take particular care to rigorously differentiate relationships of inference from relationships of causality [45], anticipating the criticisms leveled at maturity models in other fields, whose claims about what actions can reliably move one to greater maturity rely on scant evidence or none at all.
In any scientific endeavour, care must be taken to avoid generalizing findings from one instance to all conceivable contexts. Röglinger, Pöppelbuß, and Becker have noted that maturity models often do not translate well to all the situational contexts their users face [54]. Maturity models have struggled to account for the idiosyncrasies of the problem spaces in which users work. Differences in the size of projects, technical complexity, and organizational culture greatly affect the work people do and the ways they do or do not develop maturity [6]. In particular, work that demands unique processes is hard for maturity modelers to predict and take into account. This makes it difficult to imagine the kinds of skills and behaviours to be called forth from users, which makes it difficult to legitimize certain skills and behaviours as exemplars of maturity [51]. While it is problematic to overstate the number of settings to which a maturity model should apply, so too is it problematic to prescribe qualities—in the name of maturity—that implicitly privilege too narrow a segment of people based on moral typologies [76], gender roles [77], reputations of being proven stellar in particular environments [21], or preference toward particular schools of systems work (e.g., system dynamics, systems modeling, etc.) [38].
A case can be made that the discipline of systemology is uniquely well-placed to develop frameworks that can be generalized in rigorously defensible ways. As Midgley [78] and others have written, a strength of the systems field is the way it encompasses a very diverse collection of perspectives, priorities, and tools. However, since the caution against overgeneralizing applies also to systems science endeavours (W. Varey, personal communication), one might conclude that any initiative to develop maturity measures for systems thinkers would require different models for every one of the widely differing systems approaches. Recently, however, Hammond [79] reminded us that the modern systems movement was motivated by the desire to identify patterns common across the boundaries that typically divide academic inquiry. (A central text in the field does, after all, characterize the movement as a quest for a general systems theory [22], and organizations like the International Society for the Systems Sciences have been established “to foster the investigation of the analogy or isomorphy of concepts, laws, and models in various disciplines and professions”.) Further, it has been argued that certain perceptual and behavioural competencies are common across multiple systems traditions and methodologies [80].
A credible case can be made that a unified Maturity Model of Systems Thinking Competency is possible. In its creation, designers should be aided by the contributions of systems theorists who have contributed to the field by calling for implicit biases to be surfaced and critiqued in systems work (e.g., [81,82]). For a Maturity Model of Systems Thinking Competencies to be ethical and effective, such biases must be a focus of attention.
Other characteristics of good maturity models would serve systems practitioners well. User-friendly design is important. Systems thinkers operating in different cultures and problem domains should have assessment tools that are accessible and comprehensible. Model theorists have stressed the importance of well-structured and easily applicable self-assessment tools [54]. Some have advocated for tests that are “quick” [83]. Others have pointed out the usefulness of models that include templates and checklists for users to collect evidence and artifacts of competent activity at each level of maturity [17]. Should systems communities elect to computerize a maturity model, it should feature an intuitive graphical interface and easy report-generating capabilities aligned with principles of good software design [17]. Should systems communities choose to go beyond a descriptive model to actual evidence-based recommendations on advancing one’s level of systems competence, then “relevant drivers and best practices for a roadmap to [increasing] maturity” [50] (p. 141) in systems thinking should be provided in concrete, actionable language, at a level of granularity suitable to each maturity level [54]. An emphasis on pragmatic tools, technology, and developmental plans for a Maturity Model of Systems Thinking Competence would have the effect of meeting systems practitioners in their lived experience, while providing transparency about the qualities and components believed to be indicative of competent skills and behaviours at each stage of systems thinking maturity [54].
It is worthwhile to remember critiques that maturity models imply that adherence to particular schemes of behaviour, uniform techniques, and particular decision-making strategies can automatize and guarantee sure progress toward maturity (e.g., [63]). This trivializes the situational complexities users face and would do systems practitioners ill service. It is axiomatic that systems workers grapple with systems that are messy—wicked, even [84,85]. The grappling would be no less for those attempting to develop a maturity model for competencies relevant to systems thinkers working in complex contexts. Competent systems thinking cannot be routinized; the nature of systems work defies this possibility. Mature systems thinkers are aware of the ways the systems they study are interdependent with the environment and aware of the ways in which they themselves are likewise interdependent [86].
The competing forces of unity and plurality that are central to systems work are mirrored in the structure of maturity development evident in existing models. Every model presents its maturity stages as composed of multiple interacting factors. Those factors include knowledge, skills, and metacognitive abilities [52]; they involve the interplay of cognition, emotional development, moral development, and decision-making capacities able to resolve difficult psychosocial conflicts (“Maturity”). In other words, any single stage of maturity operates as a system of interdependent elements. Maturity models are complex, involving dynamic interactions unfolding in ways that can shift a person into progressively more mature levels of functioning—i.e., the development of maturity is a phenomenon involving the emergence of successively higher orders of coherence in a person’s capabilities.
“A static or prescriptive model of maturity cannot hope to provide the level of guidance that organizations require in making effective choices” [47] (p. 181). Similarly, “the development and refinement of a [theoretical] construct is an ongoing process that requires attention to clarifying the constructs’ definition and parts” [87]. The work of developing a maturity model for competent systems thinking must be iterative. Research design for a maturity model project should be both rigorously planned and intentionally modified throughout the research life cycle [88], acknowledging that systems research is a circular process that builds upon previously obtained knowledge and responds to experience gained through the course of the study [89]. The project of developing a maturity model for systems thinkers ought to proceed as would any sound systems research initiative. Careful attention should be paid to problem structuring [89]. How the task is framed should be adjusted as the project unfolds and modelers reflect on what they are learning [90]. Central to the development of a Maturity Model for Systems Thinking Competence would be identification of success factors—for example, education, knowledge networks, use of systems tools and techniques, organizational climate, and the support of leaders are all factors identified as conducive or obstructive to maturity in other domains of knowledge work [5]. The relative contribution of these and other factors to systems thinking would need to be evaluated [6], enabling us to clarify the nature of maturity as it pertains to systems work.

6. Conclusions

If there is to be a more systems-literate world, people working in different roles must play a part. In each role, particular systems competencies must be brought to bear, and those competencies will vary in maturity within each person. Learning theory tells us that experience in doing something does not translate into maturation unless we reflect on it against our existing understandings and assumptions [52]. Thousands of intelligent, committed systems thinkers have contributed their expertise to pressing world problems for decades now. Are those experiences maturing into increased competence in the practice of systems thinking worldwide? This is a matter for thoughtful consideration.
In several industries and academic disciplines, maturity models have been a way to address the question. A maturity model for competence in systems thinking would be a difficult undertaking. The number of situational contingencies and mediating factors one typically encounters in systems projects is considerable. Identifying the competencies that actually contribute to project success is not easy, as scholars working in other fields have discovered. Navigating the tension between a maturity model’s formality and flexibility is a challenge [20]. Beyond these, engaging in critical self-reflection—which lies at the heart of maturity assessment—opens the possibility of unexpected and possibly uncomfortable discoveries about one’s own immaturity [52]. Building such a model would be a formidable task, but this is not to suggest it ought to be a task left undone.
The task ahead would need to begin by developing clarity about key concepts:
  • What is immature (i.e., rudimentary) systems thinking?
  • What does mature (i.e., advanced) systems thinking look like?
  • What competencies contribute to maturity in the systems thinker?
  • By what means could these competencies be measured?
  • How do people translate systems knowledge into effective systems thinking skills and behaviours?
  • In what ways do systems thinking competencies stabilize (i.e., what levels of systems thinking maturity could be said to exist)?
  • How does one develop from one level of maturity to another?
  • What are the relationships between competent use of systems knowledge, systemic sense-making skills, and successful project outcomes?
Even once we develop answers to fundamental questions such as these, the work of clarifying, refining, and enhancing a maturity model would be ongoing—cumulative work that scholars in many disciplines have struggled to do well [54]. Empirical studies to establish the validity and usefulness of the model would be necessary, particularly with regard to its ability to predict and guide ways of increasing maturity to greater levels of effectiveness [45]. If a Maturity Model for Systems Thinking Competence is to be worthwhile, its accuracy and applicability must gain widespread acceptance among the systems sciences scholarly communities and systems practitioners alike.
In all this work, the underlying premises for creating a maturity model for systems thinking must be clarified and kept at the forefront. Those premises are yet to be determined. However, some broad-based possibilities can be mentioned here. People who participate in international systems organizations share a vested interest in contributing to more accurate understanding of, and more effective solutions to, systemically complex problems. Systems thinking, we believe, is central to that aspiration. Systems thinking involves unique, or at least uniquely combined, human competencies. The competence people exhibit in doing systems thinking varies in maturity. Competence in systems thinking is a developmental process and can progress beyond the ad hoc approaches typical of new systems thinkers. The academic disciplines of systems science, and the constituents they serve, would benefit if the field could clarify the competencies and skills universally necessary to doing good systems work. This would legitimize systems thinking competence and differentiate systems thinkers from those using other kinds of thinking, enabling recognition of the unique contributions that systems thinking makes. In a variety of settings, the approaches, intelligences, knowledge domains, and welcoming cognitive conditions associated with systems thinking would come to be better recognized and valued. Maturity Models for Systems Thinking Competence could accomplish important things: the development of systemology, an increase in the value that systems theory and practice can deliver to pressing world problems, and a strengthening of the legitimacy of systems knowledge as a branch of science of equal merit to other established disciplines.
A Maturity Model of Systems Thinking Competence could contribute to our understanding of the different kinds of systems thinking work that people do. Generating such a model would engage members of systems communities in dialogue about the sociocultural and political realities that impact effective systems work. The unanalyzed processes of “adaptation and negotiation within organizations” that impact systems thinking would be surfaced [52] (p. 19). The ways in which competent systems thinkers secure budgetary support, the way their work gets evaluated, and the way they generate lessons for the future would be important in assessing the factors that contribute to the development of systems thinking competence. The ways in which systems thinkers’ intellectual capital is or is not transferred to others within organizations and industries would need to be addressed [5]; the impact of mentor relationships on the maturation of systems thinkers’ competence could be investigated. A systematic process of collective reflection about factors such as these would clarify important situational contingencies that mediate the development of maturity in systems thinkers.
A Maturity Model of Systems Thinking Competence would make transparent the assumptions underpinning current understandings about what constitutes effective behaviour in meeting the challenges of complex systems. It would facilitate the scientific imperative of enabling assumptions underlying a maturity model to be intersubjectively verified by scientists and practitioners. It would mobilize the sharing of interpretations about the sense-making and problem solving practices that systems communities espouse. Greater understanding about the work we do as systems thinkers will not be gained in social isolation. A Maturity Model for Systems Thinking Competence would become a shared analytical lens through which we could understand and judge the competent use of systems science knowledge, skills, and behaviours. It would, thereby, act as a force for community identity building, with the potential to substantially affect the impact that systems thinkers can make in the future.


Acknowledgments

The author wishes to thank Kajal Akruwala, MBA, for her assistance in this research.

Conflicts of Interest

The author declares no conflict of interest.

References

  1. Rousseau, D.; Wilby, J.; Billingham, J.; Blachfellner, S. A typology for the systems field. Syst. Connect. Matter Life Cult. Technol. 2016, 4, 15–47. [Google Scholar]
  2. Rousseau, D.; Wilby, J.; Billingham, J.; Blachfellner, S. General Systemology: Transdisciplinarity for Discovery, Insight, and Innovation; Springer: Singapore, 2018; ISBN 978-981-10-0891-7. [Google Scholar]
  3. Edson, M.; Buckle, P.; Ferris, T.; Hieronymi, A.; Ison, R.; Metcalf, G.; Mobus, G.; Nguyen, N.; Rousseau, D.; Sankaran, S.; et al. Systems research: A foundation for systems literacy. In Systems Literacy—Proceedings of the Eighteenth IFSR Conversation 2016; Edson, M., Metcalf, G., Tuddenham, P., Chroust, G., Eds.; Books on Demand: Norderstedt, Germany, 2017; pp. 64–76. ISBN 374317913X. [Google Scholar]
  4. Wernerfelt, B. A resource-based view of the firm. Strateg. Manag. J. 1984, 5, 171–180. [Google Scholar] [CrossRef]
  5. Rašula, J.; Vukšić, V.; Štemberger, M. The integrated knowledge management maturity model. Zagreb Int. Rev. Econ. Bus. 2008, 11, 47–62. [Google Scholar]
  6. Skulmoski, G. Project maturity and competence interface. Cost Eng. 2001, 43, 11–18. [Google Scholar]
  7. Kemp, K.; Goldstein, N.; Zelle, H.; Viljoen, J.; Heilbrun, K.; DeMatteo, D. Building consensus on the characteristics of developmental maturity: A cross-disciplinary survey of psychologists. Int. J. Forensic Ment. Health 2017, 16, 83–91. [Google Scholar] [CrossRef]
  8. Domingues, P.; Sampaio, P.; Arezes, P. Integrated management systems assessment: A maturity model proposal. J. Clean. Prod. 2016, 124, 164–174. [Google Scholar] [CrossRef]
  9. Van Looy, A.; de Backer, M.; Poels, G.; Snoeck, M. Choosing the right business process maturity model. Inf. Manag. 2013, 50, 466–488. [Google Scholar] [CrossRef]
  10. Cohen, W.; Levinthal, D. Absorptive capacity: A new perspective on learning and innovation. Adm. Sci. Q. 1990, 35, 128–152. [Google Scholar] [CrossRef]
  11. Killen, C.; Hunt, R. Robust project portfolio management: Capability evolution and maturity. Int. J. Manag. Proj. Bus. 2013, 6, 131–151. [Google Scholar] [CrossRef]
  12. Davidz, H.; Nightingale, D. Enabling systems thinking to accelerate the development of senior systems engineers. Syst. Eng. 2008, 11, 1–14. [Google Scholar] [CrossRef] [Green Version]
  13. Von Bertalanffy, L. General System Theory: Foundations, Development, Applications; George Braziller: New York, NY, USA, 1968; ISBN 0807604534. [Google Scholar]
  14. Richmond, B. Systems thinking: Critical thinking skills for the 1990s and beyond. Syst. Dyn. Rev. 1993, 9, 113–133. [Google Scholar] [CrossRef] [Green Version]
  15. Checkland, P. Systems Thinking, Systems Practice (30 Year Retrospective); Wiley: Oxford, UK, 1999; ISBN 978-0-471-98606-5. [Google Scholar]
  16. Gharajedaghi, J. Systems Thinking: Managing Chaos and Complexity: A Platform for Designing Business Architecture, 2nd ed.; Butterworth-Heinemann: Maryland Heights, MO, USA, 2005; ISBN 978-0123859150. [Google Scholar]
  17. Buckle Henning, P.; Chen, W.-C. Systems thinking: Common ground or untapped territory? Syst. Res. Behav. Sci. 2012, 29, 470–483. [Google Scholar] [CrossRef]
  18. Kruglanski, A. Lay Epistemics and Human Knowledge: Cognitive and Motivational Bases; Springer: New York, NY, USA, 1989; ISBN 978-1-4899-0924-4. [Google Scholar]
  19. Kasser, J.; Hitchins, D.; Huynh, T. Reengineering Systems Engineering. In Proceedings of the 3rd Annual Asia-Pacific Conference on Systems Engineering (APCOSE), Singapore, 20–23 July 2009. [Google Scholar]
  20. Frank, M.; Kasser, J. Assessing the Capacity for Engineering Systems Thinking (CEST) and Other Competencies of Systems Engineers. Available online: (accessed on 21 April 2018).
  21. Sterman, J. Business Dynamics: Systems Thinking and Modeling for A Complex World; Irwin McGraw-Hill: New York, NY, USA, 2000; ISBN 978-0072389159. [Google Scholar]
  22. Anderson, V.; Johnson, L. Systems Thinking Basics: From Concepts to Causal Loops; Pegasus Communications: Waltham, MA, USA, 1997; ISBN 978-1883823122. [Google Scholar]
  23. Dettmer, W. The Logical Thinking Process: A Systems Approach to Complex Problem Solving; American Society for Quality: Milwaukee, WI, USA, 2007; ISBN 978-0873897235. [Google Scholar]
  24. Frank, M. Knowledge, abilities, cognitive characteristics and behavioral competencies of engineers with high capacity for engineering systems thinking (CEST). Syst. Eng. 2006, 9, 91–103. [Google Scholar] [CrossRef]
  25. Bonnema, G.; Broenink, J. Thinking tracks for multidisciplinary system design. Systems 2016, 4, 36. [Google Scholar] [CrossRef]
  26. Munns, A.; Bjeirmi, B. The role of project management in achieving project success. Int. J. Proj. Manag. 1996, 14, 81–87. [Google Scholar] [CrossRef]
  27. Argyris, C.; Schön, D. Theory in Practice: Increasing Professional Effectiveness; Jossey-Bass: San Francisco, CA, USA, 1976; ISBN 1555424465. [Google Scholar]
  28. Buckle, P. Uncovering system teleology: A case for reading unconscious patterns of purposive intent in organizations. Syst. Res. Behav. Sci. 2003, 20, 435–444. [Google Scholar] [CrossRef]
  29. Boardman, J.; Sauser, B. Systems Thinking: Coping with 21st Century Problems; CRC Press Taylor & Francis Group: Boca Raton, FL, USA, 2008; ISBN 978-1420054910. [Google Scholar]
  30. Minati, G.; Abram, M.; Pessa, E. Processes of Emergence of Systems and Systemic Properties: Towards a General Theory of Emergence. In Proceedings of the International Conference of the Italian Systems Society, Castel Ivano, Italy, 18–20 October 2007. [Google Scholar]
  31. Weinberg, G. An Introduction to General Systems Thinking; Dorset House Publishing: New York, NY, USA, 2001; ISBN 978-0932633491. [Google Scholar]
  32. Banerjee, J. Gestalt theory of perception. In Encyclopaedic Dictionary of Psychological Terms; M.D. Publications: Delhi, India, 1994; pp. 107–109. ISBN 9788185880280. [Google Scholar]
  33. Buckle, P. Recognizing Complex Patterns in Unexpected Workplace Behaviour and Events: A Grounded Theory. Ph.D. Thesis, University of Calgary, Calgary, AB, Canada, 2005. [Google Scholar]
  34. McClelland, D. Testing for competence rather than for “intelligence”. Am. Psychol. 1973, 28, 1–14. [Google Scholar] [CrossRef] [PubMed]
  35. Boyatzis, R. The Competent Manager: A Model for Effective Performance; John Wiley and Sons: New York, NY, USA, 1982; ISBN 047109031X. [Google Scholar]
  36. Tarhan, A.; Turetken, O.; Reijers, H. Business process maturity models: A systematic literature review. Inf. Softw. Technol. 2016, 75, 122–134. [Google Scholar] [CrossRef] [Green Version]
  37. Gómez, J. The role of immaturity in development and evolution: Theme and variations. In Jerome S. Bruner beyond 100: Cultivating Possibilities; Marsico, G., Ed.; Springer: New York, NY, USA, 2015; pp. 123–134. ISBN 3319255355. [Google Scholar]
  38. Mullaly, M. If maturity is the answer, then exactly what was the question? Int. J. Manag. Proj. Bus. 2014, 7, 169–185. [Google Scholar] [CrossRef]
  39. Maslow, A. A theory of human motivation. Psychol. Rev. 1943, 50, 370–396. [Google Scholar] [CrossRef]
  40. Van de Ven, A.; Poole, M. Explaining development and change in organizations. Acad. Manag. Rev. 1995, 20, 510–540. [Google Scholar] [CrossRef]
  41. Pöppelbuß, J.; Niehaves, B.; Simons, A.; Becker, J. Maturity models in information systems research: Literature search and analysis. Commun. Assoc. Inf. Syst. 2011, 29, 505–532. [Google Scholar]
  42. Röglinger, M.; Pöppelbuß, J.; Becker, J. Maturity models in business process management. Bus. Process Manag. J. 2012, 18, 328–346. [Google Scholar] [CrossRef] [Green Version]
  43. Fontana, R.; Fontana, E.; Garbuio, P.; Reinehr, S.; Malucelli, A. Processes versus people: How should agile software development maturity be defined? J. Syst. Softw. 2014, 97, 140–155. [Google Scholar] [CrossRef]
  44. Pasian, B.; Sankaran, S.; Boydell, S. Project management maturity: A critical analysis of existing and emergent factors. Int. J. Manag. Proj. Bus. 2012, 5, 146–157. [Google Scholar] [CrossRef]
  45. Pekkola, S.; Hildén, S.; Rämö, J. A maturity model for evaluating an organisation’s reflective practices. Measur. Bus. Excell. 2015, 19, 17–29. [Google Scholar] [CrossRef]
  46. NASA Systems Engineering Competencies. Available online: (accessed on 21 April 2018).
  47. Jansma, P.; Jones, R. Advancing the Practice of Systems Engineering. In Proceedings of the 2006 IEEE JPL Aerospace Conference, Big Sky, MT, USA, 4–11 March 2006. [Google Scholar]
  48. MITRE Institute. MITRE Systems Engineering (SE) Competency Model Version 1.13E. Available online: (accessed on 27 April 2018).
  49. INCOSE UK. 2010 INCOSE UK Systems Engineering Competencies Framework. Available online: (accessed on 21 April 2018).
  50. Comuzzi, M.; Patil, A. How organisations leverage big data: A maturity model. Ind. Manag. Data Syst. 2016, 116, 1468–1492. [Google Scholar] [CrossRef]
  51. Mintzberg, H. An emerging strategy of “direct” research. Adm. Sci. Q. 1979, 24, 582–589. [Google Scholar] [CrossRef]
  52. Krebs, D.; Denton, K. Explanatory limitations of cognitive-developmental approaches to morality. Psychol. Rev. 2006, 113, 672–675. [Google Scholar] [CrossRef] [PubMed]
  53. King, J.; Kraemer, K. Evolution and organizational information systems: An assessment of Nolan’s stage model. Commun. ACM 1984, 27, 466–475. [Google Scholar] [CrossRef]
  54. Buckle, P.; Thomas, J. Deconstructing project management: A gender analysis of project management guidelines. Int. J. Proj. Manag. 2003, 21, 433–441. [Google Scholar] [CrossRef]
  55. Thomas, J.; Buckle Henning, P. Dancing in the white spaces: Exploring gendered assumptions in successful project managers. Int. J. Proj. Manag. 2007, 25, 552–559. [Google Scholar] [CrossRef]
  56. Thomas, J.; George, S.; Buckle Henning, P. Re-situating expert project managers’ praxis within multiple logics of practice. Int. J. Manag. Proj. Bus. 2012, 5, 377–399. [Google Scholar] [CrossRef]
  57. Cabrera, D. Systems Thinking: Four Universal Patterns of Thinking; VDM Verlag: Saarbrücken, Germany, 2009; ISBN 3639156730. [Google Scholar]
  58. O’Connor, J.; McDermott, I. The Art of Systems Thinking: Essential Skills for Creativity and Problem Solving; Thorsons: London, UK, 1997; ISBN 1591596173. [Google Scholar]
  59. Jackson, M. Systems Thinking: Creative Holism for Managers; Wiley: Chichester, UK, 2003; ISBN 0-470-84522-8. [Google Scholar]
  60. Edson, M.; Metcalf, G. Evaluating the impact of systems research. In A Guide to Systems Research: Philosophy, Processes and Practice; Edson, M., Buckle Henning, P., Sankaran, S., Eds.; Springer: Singapore, 2017; pp. 199–234. ISBN 9811002622. [Google Scholar]
  61. Lee, J.; Kim, Y. A stage model of organizational knowledge management: A latent content analysis. Expert Syst. Appl. 2001, 20, 299–311. [Google Scholar] [CrossRef]
  62. Walker, L.; Pitts, R. Naturalistic conceptions of moral maturity. Dev. Psychol. 1998, 34, 403–419. [Google Scholar] [CrossRef] [PubMed]
  63. Wark, G.; Krebs, D. Gender and dilemma differences in real-life moral judgment. Dev. Psychol. 1996, 32, 220–230. [Google Scholar] [CrossRef]
  64. Hammond, D. Philosophical foundations of systems research. In A Guide to Systems Research: Philosophy, Processes and Practice; Edson, M., Buckle Henning, P., Sankaran, S., Eds.; Springer: Singapore, 2017; pp. 1–20. ISBN 9811002622. [Google Scholar]
  65. Buckle Henning, P. Competencies necessary for systems research. In A Guide to Systems Research: Philosophy, Processes and Practice; Edson, M., Buckle Henning, P., Sankaran, S., Eds.; Springer: Singapore, 2017; pp. 177–198. ISBN 9811002622. [Google Scholar]
  66. Stephens, A. Ecofeminism and systems thinking: Shared ethics of care for action research. In Sage Handbook of Action Research; Bradbury, H., Ed.; Sage Publications: London, UK, 2015; pp. 564–572. ISBN 1446294544. [Google Scholar]
  67. Midgley, G. Systemic Intervention: Philosophy, Methodology, and Practice; Kluwer Academic Publishers: New York, NY, USA, 2000; ISBN 978-0-306-46488-1. [Google Scholar]
  68. Ulrich, W. Critical Heuristics of Social Planning: A New Approach to Practical Philosophy; Wiley: Chichester, UK, 1983; ISBN 0471953458. [Google Scholar]
  69. Netland, T.; Alfnes, E. Proposing a quick best practice maturity test for supply chain operations. Measur. Bus. Excell. 2011, 15, 66–76. [Google Scholar] [CrossRef]
  70. Churchman, C.W. Wicked problems. Manag. Sci. 1967, 14, 141–142. [Google Scholar]
  71. Rittel, H.; Webber, M. Wicked problems. Man-Made Futures 1974, 26, 272–280. [Google Scholar]
  72. Von Foerster, H. Ethics and second-order cybernetics. In Understanding Understanding: Essays on Cybernetics and Cognition; Springer: New York, NY, USA, 2003; ISBN 0-387-95392-2. [Google Scholar]
  73. Clark, L.; Watson, D. Constructing validity: Basic issues in objective scale development. Psychol. Assess. 1995, 7, 309–319. [Google Scholar] [CrossRef]
  74. Sankaran, S. Taking action using systems research. In A Guide to Systems Research: Philosophy, Processes and Practice; Edson, M., Buckle Henning, P., Sankaran, S., Eds.; Springer: Singapore, 2017; pp. 111–142. ISBN 9811002622. [Google Scholar]
  75. Edson, M.; Klein, L. Problem structuring and research design. In A Guide to Systems Research: Philosophy, Processes and Practice; Edson, M., Buckle Henning, P., Sankaran, S., Eds.; Springer: Singapore, 2017; pp. 59–80. ISBN 9811002622. [Google Scholar]
  76. Mingers, J.; Rosenhead, J. Problem structuring methods in action. Eur. J. Oper. Res. 2004, 152, 530–554. [Google Scholar] [CrossRef]
  77. Bruner, J. Nature and uses of immaturity. Am. Psychol. 1972, 27, 687–708. [Google Scholar] [CrossRef]
  78. March, J. Exploration and exploitation in organizational learning. Organ. Sci. 1991, 2, 71–87. [Google Scholar] [CrossRef]
  79. Fontana, R.; Meyer, V., Jr.; Reinehr, S.; Malucelli, A. Progressive outcomes: A framework for maturing in agile software development. J. Syst. Softw. 2015, 102, 88–108. [Google Scholar] [CrossRef]
  80. Rynes, S.; Bartunek, J.; Daft, R. Across the great divide: Knowledge creation and transfer between practitioners and academics. Acad. Manag. J. 2001, 44, 340–355. [Google Scholar] [CrossRef]
  81. Rynes, S.; McNatt, B.; Bretz, R. Academic research inside organizations: Inputs, processes and outcomes. Pers. Psychol. 1999, 52, 869–898. [Google Scholar] [CrossRef]
  82. Shrivastava, P.; Mitroff, I. Enhancing organizational research utilization: The role of decision makers’ assumptions. Acad. Manag. Rev. 1984, 9, 18–26. [Google Scholar] [CrossRef]
  83. Rynes, S.; Brown, K.; Colbert, A. Seven common misconceptions about human resource practices: Research findings versus practitioner beliefs. Acad. Manag. Exec. 2002, 16, 92–103. [Google Scholar] [CrossRef]
  84. Eisenhardt, K.; Tabrizi, B. Accelerating adaptive processes: Product innovation in the global computer industry. Adm. Sci. Q. 1995, 40, 84–110. [Google Scholar] [CrossRef]
  85. Lewin, K. Action research and minority problems. J. Soc. Issues 1946, 2, 34–46. [Google Scholar] [CrossRef]
  86. Platt, J. Strong inference. Science 1964, 146, 347–353. [Google Scholar] [CrossRef] [PubMed]
  87. Latham, G.; Erez, M.; Locke, E. Resolving scientific disputes by the joint design of crucial experiments by the antagonists: Application to the Erez-Latham dispute regarding participation in goal-setting. J. Appl. Psychol. 1988, 73, 753–772. [Google Scholar] [CrossRef]
  88. Bylinsky, G. America’s hot young scientists. Fortune 1990, 122, 56–69. [Google Scholar]
  89. Jones, N.; Ross, H.; Lynam, T.; Perez, P.; Leitch, A. Mental models: An interdisciplinary synthesis of theory and methods. Ecol. Soc. 2011, 16, 46. [Google Scholar] [CrossRef]
  90. Argyris, C.; Schön, D. Organizational Learning II: Theory, Method and Practice; Addison-Wesley: Reading, MA, USA, 1996; ISBN 0201629836. [Google Scholar]
For instance, most organizations do not employ theoretically sound management practices, even though mounting evidence shows that certain empirically tested business practices are associated with greater employee performance, productivity, and overall firm financial performance [9,10].
Despite connotations that immaturity involves undesirable deficits (e.g., [7]), psychologists have proposed that immaturity is an important period of experimentation [41,42], valuable both to the development of individuals and to the evolution of a field’s theoretical understanding of a developmental phenomenon [41].
For example, theorists have discussed such concerns in the project management discipline’s critique of its own attempts to codify best practice and the means by which it is to be attained (e.g., [64,65,66]).
A variety of techniques have been used to elicit mental models so that they can be more openly communicated and understood. These range from diagrammatic techniques such as fuzzy cognitive mapping and system dynamics approaches, to rich pictures and other participatory modeling approaches that both surface differences and enable consensus analysis [72].
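As a rough illustration (not drawn from this article), a fuzzy cognitive map of the kind mentioned above can be simulated in a few lines: concepts hold activation levels, signed weights encode causal influence between concepts, and a squashing function keeps activations bounded. The sketch below uses one common update rule (a logistic squashing of weighted sums); the three-concept map, its weights, and its initial activations are entirely hypothetical.

```python
import math

def fcm_step(state, weights, lam=1.0):
    """One update of a fuzzy cognitive map: each concept's next
    activation is a squashed, weighted sum of the others' activations."""
    n = len(state)
    nxt = []
    for j in range(n):
        total = sum(state[i] * weights[i][j] for i in range(n))
        nxt.append(1.0 / (1.0 + math.exp(-lam * total)))  # logistic squashing
    return nxt

def simulate(state, weights, steps=50, tol=1e-6):
    """Iterate until activations stabilize (a fixed point) or steps run out."""
    for _ in range(steps):
        nxt = fcm_step(state, weights)
        if max(abs(a - b) for a, b in zip(nxt, state)) < tol:
            return nxt
        state = nxt
    return state

# Hypothetical three-concept map: workload -> stress -> errors,
# with errors feeding back into workload (rework).
concepts = ["workload", "stress", "errors"]
W = [
    [0.0, 0.7, 0.0],   # workload increases stress
    [0.0, 0.0, 0.6],   # stress increases errors
    [0.4, 0.0, 0.0],   # errors increase workload
]
final = simulate([0.8, 0.1, 0.1], W)
```

Running such a map to a fixed point makes the holder's assumed causal structure explicit and discussable, which is the point of elicitation: two stakeholders can compare weight matrices rather than argue from intuition.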

Buckle, P. Maturity Models for Systems Thinking. Systems 2018, 6, 23.

Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.