Review

Method to Address Complexity in Organizations Based on a Comprehensive Overview

1 Chair of Information and Communication Management, Faculty of Economics and Management, Technical University of Berlin, 10623 Berlin, Germany
2 Faculty of Economics, Brandenburg University of Applied Sciences, 14770 Brandenburg an der Havel, Germany
3 Department of Information and Computing Sciences, Utrecht University, Princetonplein 5, 3584 CC Utrecht, The Netherlands
* Author to whom correspondence should be addressed.
Current address: Faculty of Economics, Brandenburg University of Applied Sciences, 14770 Brandenburg an der Havel, Germany.
Information 2021, 12(10), 423; https://doi.org/10.3390/info12100423
Submission received: 20 September 2021 / Revised: 11 October 2021 / Accepted: 12 October 2021 / Published: 16 October 2021
(This article belongs to the Section Review)

Abstract:
Digitalization increasingly compels organizations to accommodate change and build resilience. Emerging technologies, changing organizational structures, and dynamic work environments bring opportunities and pose new challenges to organizations. Such developments, together with the growing volume and variety of exchanged data, chiefly yield complexity. This complexity often represents a solid barrier to efficiency and impedes understanding, controlling, and improving processes in organizations. Hence, organizations predominantly seek to identify and avoid unnecessary complexity, which arises from a mixture of different factors. Similarly, in research, much effort has been put into measuring, reviewing, and studying complexity. However, these efforts are highly fragmented and lack a joint perspective, which hampers the acceptance of complexity research among practitioners. In this study, we extend the body of knowledge on complexity research and practice by addressing this high fragmentation. In particular, we conduct a comprehensive literature analysis of complexity research to capture the different types of complexity in organizations. The results are comparatively analyzed, and a morphological box containing three aspects and ten features is developed. In addition, an established multi-dimensional complexity framework is employed to synthesize the results. Using the findings from these analyses and adopting the Goal Question Metric, we propose a method for complexity management. This method provides key insights and decision support in the form of extensive guidelines for addressing complexity. Thus, our findings can assist organizations in their complexity management initiatives.

1. Introduction

As organizations develop and digitize their businesses, the number of interactions and dependencies between their processes, information systems, and organizational units increases dramatically [1]. To deal with this growth, organizations often enhance the technology supporting their businesses, which can also affect their structure [2]. Such dynamics are likely to bring significant challenges [3]. One prominent challenge organizations have to tackle is complexity, which impedes decision-making and leads to unreasonably high and largely hidden costs. For example, according to a 2019 study by The Hackett Group, a world-class strategic consultancy [4], the operating costs of low-complexity companies as a percentage of overall revenue were almost 60% lower than those of companies operating in highly complex settings. The low-complexity companies also employed 66% fewer staff and spent 30% less on technology.
Complexity research has always attracted much interest from both academia and industry. Many disciplines have established their own complexity subfields, adjusting complexity concepts to their specific goals [5], for example, social complexity [6], software complexity [7], or managerial complexity [8], in addition to generic studies. While the contributions of these subfields are recognized, an agreed-upon definition is hard to find. The missing conceptual alignment and differing interpretations of complexity complicate the common acceptance of the field [5]. Further, there remains a considerable lack of clarity and much fragmentation regarding what contributes to complexity and how to not only measure and reduce it but, more importantly, benefit from it. Especially in the organizational domain, the discussion of complexity is either rather generic [9] or devoted to a narrow topic, like task complexity [10]. Moreover, the methods for measuring complexity are predominantly designed for a particular type of complexity [11] and, therefore, are hardly transferable to other types [12]. However, the organizational domain can be characterized as a complex environment embracing different, often implicit, types of complexity, for example, related to people, structures, technology, or processes. Hence, in such settings, managers lack comprehensive guidelines on how to approach complexity management initiatives, including steps, measurements, and the necessary data and information.
Additionally, ongoing technological developments and the use of diverse information systems increase the variety of data sources and types. Such advancements create a demand for a continual review of the existing studies in complexity research that can offer new perspectives, highlight essential gaps, and suggest approaches for complexity management. Moreover, we observed that the literature lacks a structured outcome providing an integrated overview of complexity research.
To address these shortcomings, we conduct a Systematic Literature Review (SLR) and create a comprehensive overview of complexity research relevant to organizations. As the basis for this summary, we take the People Process Technology (PPT) framework [13,14], as well as the typical information and data sources that exist in organizations. With this, we aim to cover the complexity related to the main components of organizations, namely people, technology, and processes. Hence, we focus on the typically observed complexities related to an organization as a whole (organizational complexity), to technology (technological complexity), and to people, in particular communication. We analyze the latter through the lens of the generated textual data serving different purposes (textual complexity).
In fact, complexity can be influenced by a myriad of factors and may arise in various forms. For example, in the case of organizational complexity, ranging from complex projects to organizational structures, managers and executives face inefficiency, delays, decreased performance, or even the inability to run their businesses. Similarly, unstructured textual data, like complex and even confusing instructions, project communication and documentation, or textual task descriptions, cause ambiguity, errors, and rework. Furthermore, complex technology, although intended to support the processes in organizations, is likely to be not only a source of high costs but also an obstacle to successful business functioning. Hence, dynamic business processes, computing platforms, applications, and services can cause massive complexity encompassing many concepts, technologies, and data and information sources, which are interlinked in diverse organizational interactions.
Afterwards, to address the confusion impeding a common acceptance of the field, we structure and classify the literature findings in the form of a morphological box and integrate them into a multi-dimensional complexity framework. The morphological box [15] aims at summarizing the fundamental aspects of complexity research, whereas the multi-dimensional complexity framework [5] incorporates the complexity types identified in the literature into one integrated structure contributing to the standardization efforts of the complexity research. Based on these two analyses, we derive our main contribution, that is, a method to apply our study findings by extending the Goal Question Metric (GQM) [16].
Hence, our study contributes to the common acceptance and understanding of complexity research in organizations by addressing its high fragmentation. It provides researchers with a comprehensive overview based on the PPT framework and structured outcome of research results in the form of a morphological box and multi-dimensional framework.
As a practical contribution, we address the lack of instructions for selecting a specific approach to address complexity in organizations. Hence, the method can serve as practical guidance for organizations in their complexity management projects.
The remainder of the paper is structured as follows: Section 2 provides an overview of the related work in the literature and highlights the gaps. In Section 3, we present the research methods used for conducting the literature review and analyzing and classifying its results. Section 4 gives an overview of the results. Section 5 analyzes and classifies the results providing structured outcomes in the form of the morphological box and integrated complexity framework. In Section 6, we present the key contribution of our study, that is, the method to address complexity in organizations. We discuss our findings, their implications and limitations in Section 7. Finally, conclusions and future work are presented in Section 8.

2. Related Work

The term complexity has always received the attention of scholars in different fields, such as Computer Science, Organizational Science, and Linguistics. For example, in Computer Science, the term complexity, as a rule, refers to the complexity of an algorithm, that is, the amount of resources required to execute it [17]. Organizational Science mostly adapts concepts from Complexity Theory and defines an organization as a complex dynamic system, which consists of elements interacting with each other and their environment [18]. In Linguistics, complexity is studied from the language perspective and comprises phonological, morphological, syntactic, and semantic complexities [19]. Hence, complexity reveals various definitions and implications. It can be expressed in exact terms, such as McCabe's software complexity, defined as the number of linearly independent paths through a program's code [7]. Alternatively, it can cover broad ideas serving as an umbrella term for many concepts. For example, Edmonds defines complexity as a property of a language expression that makes it troublesome to formulate its behavior, even given nearly complete information regarding its atomic parts and their interrelations [20]. Such heterogeneity of complexity concepts hampers the field's wide recognition [5]. Besides, this high fragmentation becomes apparent when searching for literature review studies conducted in the field of complexity, as in the following reviews: ref. [21] on task complexity, ref. [22] on innovation complexity, and ref. [23] on interorganizational complexity from employer and safety perspectives.
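To make the graph-based notion behind McCabe's measure concrete, the following sketch (our own illustration, not code from any of the reviewed works) computes the cyclomatic number V(G) = E - N + 2P for a control-flow graph given as an adjacency list; the example graph models a routine with a single if/else decision.

```python
def cyclomatic_complexity(graph, components=1):
    """V(G) = E - N + 2P for a control-flow graph.

    graph: dict mapping each node to a list of successor nodes.
    components: number of connected components P (1 for a single routine).
    """
    nodes = set(graph)
    for succs in graph.values():
        nodes.update(succs)
    edges = sum(len(succs) for succs in graph.values())
    return edges - len(nodes) + 2 * components

# Control-flow graph of a routine with a single if/else decision:
# entry -> cond; cond branches to then/else; both rejoin at exit.
branching = {
    "entry": ["cond"],
    "cond": ["then", "else"],
    "then": ["exit"],
    "else": ["exit"],
}
print(cyclomatic_complexity(branching))  # one decision point -> V(G) = 2
```

A straight-line routine yields V(G) = 1, and each additional decision point raises the value by one, which is why the measure is often read as a count of linearly independent paths.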
As with the term complexity itself, there is much ambiguity and fragmentation regarding what contributes to complexity and the ways to reduce and benefit from it. For example, ref. [24], while researching complexity drivers in manufacturing companies, observe that the studies mostly focus on one definition (perspective) of complexity drivers and provide no comprehensive analyses. Ref. [25] states that the task complexity drivers or contributors found in the literature are the consequence of combining diverse settings and subjective preferences; as a result, they are ambiguous and difficult to grasp and analyze. Ref. [26] developed guidelines on how to reduce complexity, focusing on the rather specific field of product and process complexity in a case study of a consumer products manufacturer. The authors justify the necessity of such recurring complexity studies by the increasingly changing nature of complexity. Besides, another group of studies aiming to deal with process complexity in organizations considers it from the specific perspective of process models, workflows, and event logs [27,28,29]. Another rather unconventional approach to process complexity is suggested in [30]. The authors study the complexity of IT Service Management processes through the lens of the textual data massively generated in organizations and the related readability and understandability of textual work instructions. This represents a promising but rather narrow research direction. Further, addressing the process complexity of software development, ref. [31] also emphasize the limited scope of previous studies focusing on code-related measures.
Moreover, this narrow focus causes certain industries to lag behind in the complexity management field [12]. Likewise, despite their abundance, complexity measures, such as the software complexity measures that are essential for managing complexity, have not been sufficiently described and evaluated in the literature [32]. This is also confirmed by recent studies evidencing the poor usage of code complexity measures in industry [33].
Further, a number of recent studies investigate the notion of complexity in companies empirically based on a bottom-up approach, which indicates that there is still no alignment and much dissonance on how to approach complexity in organizations. Accordingly, ref. [34] study how project complexity has been perceived by practitioners in different industry sectors. In total, five sectors and more than 140 projects have been researched. It has been concluded that a comprehensive project complexity framework and guidelines could aid the management of complex projects by raising awareness of the (anticipated) complexities [34]. Ref. [35] also declare the need for developing approaches to measure and manage complex projects. While studying the complexity of supply chains in four case studies, ref. [36] highlight the necessity of frameworks that can assist managers in building overarching complexity management strategies and practices supporting them [37].
In fact, as fairly noted in the recent work [38], organizations comprise many different interconnected elements, which makes studying complexity challenging and gives it a "negative" connotation. Interestingly, the authors mention that this often inherent and inevitable complexity yields essential benefits, particularly in dynamic and unpredictable conditions. In addition, they provide some strategic leadership guidance on complexity management in organizations. Similarly, in other works [2,39,40,41], the attempts to suggest guidance for addressing organizational complexity are limited to high-level strategic leadership recommendations.
Hence, in our study aiming to develop a method to comprehensively address complexity in organizations, the following gaps (IG: Identified Gap) serve as a motivation:
  • IG1: Scarce comprehensive analyses and high fragmentation of the complexity field impeding its common acceptance;
  • IG2: Considerable lack of clarity and confusion about what contributes to complexity and how to measure, reduce, and benefit from it;
  • IG3: Lack of extensive guidelines for managers on how to approach complexity;
  • IG4: Organizational domain embracing different types of complexity which are often implicit.

3. Research Approach

In this section, we present the research approach we follow to propose a method to deal with complexity. It includes the following: (i) an SLR on complexity in organizations and (ii) a morphological box and a multi-dimensional framework to classify, analyze, and synthesize the SLR findings. Based on (i) and (ii), we extend the GQM approach and propose practical guidelines for managers to address complexity in organizations.

3.1. Systematic Literature Review

As the starting point in the method development, we perform an SLR on complexity management in organizations, which we describe in the subsections below.

3.1.1. Scope

As complexity is a very broad subject area, conducting an SLR solely on complexity would have yielded an incomparably large set of papers, which would be difficult to analyze and from which it would be hard to derive valuable insights. Hence, we set the focus on complexity management in organizations.
Organizations are commonly described by three main components: people, technology, and the processes connecting them [13]. This viewpoint is also known as the People Process Technology (PPT) framework [14]. Despite being a popular concept, especially in industry, its origin is not straightforward. The oldest and most prominent source using the logic of these three elements is Leavitt's diamond model [13]. This model focuses on problem-solving in organizations and highlights three types of solutions: structure (by means of organizational charts or responsibilities), technology (by means of technologies), and people (by means of Human Resources). Later on, the PPT components made up the fundamentals of the prominent Information Technology Infrastructure Library (ITIL) framework launched in the 1980s [42]. Based on the underlying assumptions in ITIL, any technology solution is only as good as the processes it supports. Similarly, processes are only as good as the people who follow them. Hence, we focus on these three PPT components (people, technology, and process) and build up our complexity types accordingly. In particular, we take organizational, technological, and textual complexity types as the starting point. For each type, we explain our motivation below.
  • Organizational complexity: For the first type of complexity, organizational, we consider the broad organizational perspective discussed by Mintzberg [43] and related studies [44]. Among others, the authors consider organizations from the viewpoint of the allocation of tasks and resources. Such a configuration is enabled by major organizational components and common assets, such as tasks, projects, and processes in relation to projects and organizations [45]. In this regard, task complexity research, going back to the 1980s [21,46], can be considered the most well-studied one in the organizational context [25]. Consequently, task, project, and process complexities are considered important constituents and subtypes of organizational complexity.
  • Technological complexity: As the second type of complexity, we study technological complexity. Its oldest and most popular example is arguably software complexity, such as McCabe's cyclomatic complexity [7], which is based on control flows represented in the form of graphs. McCabe's cyclomatic complexity served as a basis for several other technological complexity measures related to process models and event logs [47];
  • Textual complexity: To address the people component, we focus on communication, that is, on how people exchange information in organizations and receive their tasks. Subsequently, we pose the question of how we can capture this information. Textual data generated inside and outside organizations remain one of the most valuable types of unstructured data [30,48,49,50]. Hence, we consider textual complexity as the third type of complexity, which, among others, reflects the people component in organizations. Indeed, beyond structured program code and event logs, analysts estimate that upward of 80% of enterprise data today is unstructured, whereby the lion's share is occupied by textual data [48]. There is a great variety of textual data types relevant for organizations; emails, files, instant messages, and posts and comments on social media are some examples.
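As a hedged illustration of how textual complexity can be quantified, the sketch below approximates the widely used Flesch Reading Ease score; the naive vowel-group syllable counter is our own simplification, so the absolute scores are rough and only comparisons between texts should be trusted.

```python
import re

def flesch_reading_ease(text):
    """Approximate Flesch Reading Ease: higher scores mean easier text.

    Syllables are estimated by counting vowel groups per word, which is a
    crude approximation; real readability tools use dictionaries or rules.
    """
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(max(1, len(re.findall(r"[aeiouyAEIOUY]+", w)))
                    for w in words)
    n_sent, n_words = max(1, len(sentences)), max(1, len(words))
    return 206.835 - 1.015 * (n_words / n_sent) - 84.6 * (syllables / n_words)
```

For instance, a work instruction written in short, common words scores markedly higher (easier) than one built from long nominalizations, which is the kind of contrast readability-based complexity studies exploit.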
To provide an integrated overview of the research on complexity from the organizational, technological, and textual perspectives, it is necessary to analyze how complexity is measured in each perspective. Accordingly, we cover the following topics: what complexity metrics are available in the related literature and whether any tool support is provided for them. To capture what the existing studies focus on, apart from complexity measurement, we investigate the motivations and declared novelty of the studies. Likewise, we seek to unveil the practical contributions of the studies in the complexity research on organizations. Another objective is to examine the future work avenues mentioned in the related literature so that potential contribution areas can be identified.
Based on the discussion of the scope above and the gaps identified in the related work section, we design five research questions (RQs):
  • RQ1: What concepts of complexity are available in the literature from the organizational, technological, and textual perspectives?
  • RQ2: What metrics are proposed to measure the complexity concepts in organizations?
  • RQ3: What are common motivations and declared novelty areas of the complexity research on organizations?
  • RQ4: Which focus areas and application cases are typically used in the related literature on complexity in organizations?
  • RQ5: What future research directions are communicated in the complexity research on organizations?
To answer the RQs and identify and analyze the existing studies in the related literature, we conduct an SLR. For this purpose, we follow the well-established guidelines by Kitchenham [51]. The main reason is that these guidelines are rigorously applied in the literature, cover the major steps of a typical SLR, and can be used to design a review protocol in various fields. For guiding and evaluating literature reviews, we point to the framework proposed in [52]. In addition, we complement our analysis with the tool-support guidelines presented in [53]. Accordingly, the retrieval and selection mechanism for the papers used in the SLR is given in the subsection below.

3.1.2. Paper Retrieval and Selection

Search strategy: To retrieve papers in the literature, we create search strings that are generic enough to include the studies discussing at least one of the three complexity types. Specifically, based on the complexity types and subtypes that we observed while identifying the gaps in the related work, we outline the following set of search strings:
  • for organizational complexity: “task complexity” or “project complexity” or “process complexity”;
  • for technological complexity: “software complexity” or “process model complexity” or “event log complexity” or “workflow complexity” or “control flow complexity”;
  • for textual complexity: “textual complexity” or “readability” or “understandability”.
Having separate search strings for each complexity type has advantages. One of them is balancing the granularity of the terms for the complexity types; this way, we mitigate the risk of retrieving an unequal number of papers for each complexity type. Another advantage is identifying the papers that are mostly devoted to the study of a particular complexity type.
We applied the search strings in the search engine Google Scholar. The reason for this choice is twofold. Firstly, Google Scholar is the world’s largest academic search engine and provides an integrated search environment [54] by encompassing other academic databases, like ACM Digital Library and IEEE Xplore, Scopus, and Web of Science. Secondly and more importantly, Google Scholar ranks search results by their relevance considering all major features of papers, such as full text, authors, published source, and how often each paper has been cited in academic databases. The papers that have at least one of the above search strings in their title, keywords, or in their main body are retrieved. For the three complexity types, our search resulted in a large number of papers, that is, more than 5000.
Inclusion and exclusion criteria: To exclude irrelevant papers, we defined and applied the inclusion and exclusion criteria listed in Table 1. Specifically, we aimed at those papers closely related to organizational, technological, and textual complexities in organizations. We added a citation minimum to select meaningful papers recognized by other researchers. Hereby, the citation minimum was not applied to recently published papers. Papers dealing with how people in organizations are grouped were excluded, as they mostly focus on organizational structure. Likewise, papers without a scientific basis that contain assumptions or expectations, that is, theoretical speculations, were filtered out. Lastly, as papers on the same topic by the same authors overlap heavily, the ones with extended content were considered more relevant.
As a result of filtering based on the inclusion and exclusion criteria, in total, 130 papers were selected and combined into the final set. Due to the practical orientation of complexity and for relevance, we included a limited number of technical reports and Ph.D. theses, which constituted less than 5% of the final set of papers. Using the reference management tool Zotero (https://www.zotero.org/, accessed on 15 August 2021), metadata were obtained for each paper in the final set. Paper metadata included the reference data about a paper, for example, authors, publication date, authors' keywords, and abstract.
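The citation-based part of this filtering can be sketched as follows; note that the concrete citation threshold and recency window below are hypothetical, since the paper does not state the exact values used.

```python
from datetime import date

# Hypothetical thresholds: the paper does not report its exact citation
# minimum or recency cutoff, so these values are illustrative only.
MIN_CITATIONS = 10
RECENT_YEARS = 2

def passes_citation_filter(paper, today=date(2021, 8, 15)):
    """Keep recently published papers regardless of citation count;
    otherwise require the citation minimum to be met.

    paper: dict with integer fields "year" and "citations".
    """
    if today.year - paper["year"] <= RECENT_YEARS:
        return True  # citation minimum waived for recent publications
    return paper["citations"] >= MIN_CITATIONS
```

The waiver for recent papers mirrors the stated rationale: new publications have not yet had time to accumulate citations, so a hard minimum would unfairly exclude them.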

3.2. Literature Classification

In this subsection, we explain the two approaches, that is, morphological box and multi-dimensional complexity framework, that we used to classify and synthesize the SLR findings obtained following the process described in the previous subsection.
Morphological box: To summarize the primary aspects of complexity research on organizations, we analyzed and classified the literature review results. For this, we used a morphological box, which is a commonly used (for example, in [55,56]) method to present a set of relationships inherent in the multi-dimensional non-quantifiable problem complexes [15]. Developing a morphological box to describe complexity aspects requires the identification of relevant features of papers and grouping these features. To do so, we focused on the three main themes: (1) common characteristics that can be observed in any research paper; (2) complexity quantification; and (3) tangible research artifacts for quantifying complexity. With the last two themes, we aim to identify reusable and distributed complexity research outputs that we can exploit for developing our method on addressing complexity.
Regarding the first theme, we took the identified gaps IG1 and IG4 listed in the related work section, aiming to show the fragmented state of complexity studies. Hence, the following features were identified: motivation, novelty, focus area, application cases, and future research. We grouped them under the name generic aspects, as they can be observed in any research paper. Based on general quantification considerations and the second gap, IG2, four features were identified for the second theme: metrics origin, input, output, and validation. We combined these features into complexity metrics and analysis aspects as the second group. Aligned with the third theme and IG3, it can be critical to know the specific conditions of implementation, such as whether a tool is openly accessible or distributed freely or commercially. Moreover, the implementation of complexity analyses and metrics deserves close attention, as implementation is the only way to apply a particular approach in an evaluation or a real-world scenario. Thus, we included tool support as another feature in the morphological box. For this feature, we considered the usage of existing tools, the development of own tools, or no tool support. Table 2 shows the three defined aspects with their features.
Based on the identified features, each paper in the final set was coded to extract the topics necessary to answer the RQs. For this, deductive coding was applied [51]. A predefined set of codes, observed while identifying the gaps in the literature, was taken as the starting point for coding. While assigning the codes to each paper, newly observed codes were added to the code set; in other words, concepts observed in the papers were used as codes. In deductive coding, it is important to avoid researcher bias, that is, the researcher's preferences influencing the coding of papers [51]. Therefore, the topics in the papers were identified and coded by two researchers independently. Afterwards, a discussion session was carried out to align on differences in coding. The coding was conducted using the qualitative data analysis tool NVivo (https://www.qsrinternational.com/nvivo-qualitative-data-analysis-software, accessed on 15 August 2021) to ensure consistency. The results obtained from coding served as the basis for the morphological box development.
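The alignment step between the two independent coders can be supported by a simple comparison such as the sketch below (our own illustration; the actual coding was done in NVivo, not in code). It flags every paper whose assigned code sets differ, so the discussion session can focus on exactly those cases.

```python
def coding_disagreements(coder_a, coder_b):
    """Flag papers whose assigned code sets differ between two coders.

    coder_a, coder_b: dicts mapping paper id -> set of assigned codes.
    Returns a dict of disagreements showing the codes unique to each coder.
    """
    disagreements = {}
    for paper in set(coder_a) | set(coder_b):
        a = coder_a.get(paper, set())
        b = coder_b.get(paper, set())
        if a != b:
            disagreements[paper] = {"only_a": a - b, "only_b": b - a}
    return disagreements
```

The paper ids and code names here are placeholders; in practice the code set would contain the morphological-box features such as motivation, novelty, or tool support.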
Multi-dimensional complexity framework: In a variety of disciplines, the term complexity has been applied in specific contexts, which prompted the establishment of standalone complexity research subfields [5]. This variety impairs the general acceptance of complexity research (see IG1 in the identified gaps). To address this problem, [5] suggest a four-dimensional framework unifying the most prevalent views on complexity. Hereby, each dimension comprises two opposing complexity notions derived from the established Complexity Science literature: objective and subjective (D1, observer perspective), structural and dynamic (D2, time perspective), qualitative and quantitative (D3, measures perspective), and organized and disorganized (D4, perspective of dynamics predictability). In Table 3, the dimensions and their complexity notions are listed. These dimensions allowed us to synthesize the SLR findings based on an agreed-upon vocabulary. As a result, we obtained an integrated multi-dimensional complexity framework.
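The four dimensions can be represented as a small validating structure; the dimension names follow the listing above, while the example classification of a graph-based software complexity measure is our own illustrative guess, not a result taken from the paper.

```python
# The four dimensions with their opposing notions, after the framework in [5].
DIMENSIONS = {
    "D1_observer": {"objective", "subjective"},
    "D2_time": {"structural", "dynamic"},
    "D3_measures": {"qualitative", "quantitative"},
    "D4_predictability": {"organized", "disorganized"},
}

def classify(notions):
    """Validate a classification assigning exactly one notion per dimension."""
    for dim, allowed in DIMENSIONS.items():
        if notions.get(dim) not in allowed:
            raise ValueError(f"{dim} must be one of {sorted(allowed)}")
    return notions

# Illustrative (not authoritative) placement of a graph-based code metric:
code_metric = classify({
    "D1_observer": "objective",
    "D2_time": "structural",
    "D3_measures": "quantitative",
    "D4_predictability": "organized",
})
```

Such a structure makes it easy to place each complexity type found in the SLR into the shared vocabulary and to reject incomplete or contradictory classifications.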

3.3. Goal Question Metric

The morphological box and multi-dimensional complexity framework described in the previous subsection served to structure and standardize our efforts in reviewing complexity research. However, these structured outcomes of the SLR findings lack practical application value for managers who have to deal with complexity in their daily business. Hence, two questions remain: (i) how to select a suitable approach to complexity analysis and measurement; and (ii) which type of complexity is relevant for a given problem or in a specific situation.
To address this kind of challenge, several approaches exist in the literature. For example, one popular approach originating from strategic management is the Balanced Scorecard (BSC) [57], embracing four perspectives: financial (shareholders' view), customer (value-adding view), internal (process-based view), and learning and growth (future view). Initially developed in the business domain, BSC has been adapted to the software domain, specifically in relation to GQM [58,59]. GQM is a well-established and widely used approach to determining metrics for goals and is also known as the most goal-oriented approach [16]. GQM was originally developed for the evaluation of defects in a series of projects at the NASA Goddard Space Flight Center. Despite this original specificity, its application has been extended to broader contexts, software development among others [16].
GQM has a hierarchical structure beginning with a goal definition at the corporate, unit, or project level. The goal should contain information on the measurement purpose, the object to be measured, the problem to be measured, and the point of view from which the measure is taken. The goal is refined into several questions that typically break the problem down into its main components. Next, various metrics and measurements are proposed to address each question. Finally, one needs to develop the data collection methods, including validation and analysis [16].
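The goal-question-metric hierarchy described above can be sketched as a small data structure; the goal, questions, and metric names below are invented placeholders for illustration, not the concrete instantiation developed later in this paper.

```python
from dataclasses import dataclass, field

@dataclass
class Question:
    text: str
    metrics: list = field(default_factory=list)  # metrics answering this question

@dataclass
class Goal:
    purpose: str       # why we measure (e.g., reduce, understand)
    object: str        # what is measured
    viewpoint: str     # whose perspective the measure is taken from
    questions: list = field(default_factory=list)

# Hypothetical example of a complexity-management goal refined into
# questions and metrics (all content here is illustrative).
goal = Goal(
    purpose="reduce",
    object="process complexity in order handling",
    viewpoint="operations manager",
    questions=[
        Question("How complex are the process models?",
                 metrics=["control-flow complexity", "number of gateways"]),
        Question("How understandable are the work instructions?",
                 metrics=["readability score"]),
    ],
)
```

The top-down refinement is the key design idea: metrics are never chosen in isolation but always justified by a question, which in turn is justified by the goal and its viewpoint.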
Critics of GQM find it too flexible and prone to generating a large number of metrics [60]. To address these shortcomings, several approaches have been proposed, such as the Goal Argument Metric (GAM) [61], or generic approaches as in [62]. Accordingly, GQM should be applied based on the organization’s process maturity level to select the most suitable metrics [62]. Ref. [63] suggests including a prioritization step in GQM to minimize the number of generated metrics; however, this approach carries the risk that certain perspectives remain neglected. To sum up, in the GQM-related literature, we have observed a common practice of extending GQM to adjust it to particular research needs, as in [64] for agile software development and [65] for data warehouse quality assessment.
Hence, in our study, we extend GQM to develop a method that comprehensively addresses complexity in organizations. When adapting GQM to our study needs, we also set out to overcome the above-mentioned limitations. By introducing step-by-step guidance leading from the problem statement to a solution specification, we aim to assist organizations in identifying what is actually needed to deal with complexity.
In the following sections, we describe in detail the application of the discussed research approach. Accordingly, we start with reporting the SLR findings followed by their classification. The latter includes the morphological box development and integration of the results into a multi-dimensional complexity framework. Lastly, a GQM extension is proposed using an illustrative example from a real-world setting.

4. Systematic Literature Review

This section describes the results of our literature review. We address each RQ in a separate subsection and elaborate on our findings. In the visualization of our findings, we use a standard coloring for the three complexity types for consistency: organizational (RGB: 199,199,199), technological (RGB: 127,127,127), and textual (RGB: 65,65,65).

4.1. Complexity Concepts

We present our findings on the complexity concepts of the three complexity types that we focus on, that is, organizational, technological, and textual. In particular, we elaborate on the trend of papers over the years, the components of complexity types, and their distribution.
Of the 130 analyzed papers, the majority (54 papers) discuss technological complexity, 40 are devoted to textual complexity, and the remaining 36 address organizational complexity.
In Figure 1, the trend of publications over time is presented. The distribution of papers follows a similar trend for all three types of complexity, with a peak in the period between 2011 and 2015. This indicates that, despite their different nature, all three complexities are triggered by the developments of today’s information age. The most common examples of these developments are fast-expanding technology solutions, complex processes, and new organizational skills needed to deliver products and services faster, with higher quality, and at lower costs than before [66].
Based on the topics that appeared in the reviewed papers, we developed a mind map of complexity concepts, which is depicted below in Figure 2.
Key topics mentioned in the definitions of the organizational type of complexity are task, project, process, and product complexities. In task complexity, which takes up the major part (36%) of all papers in the organizational complexity group, the key topics are task complexity elements and models [11,46,67,68,69], the cognitive resources necessary to perform tasks [70], the identification and utilization of complexity factors [71] and effects [72,73,74,75], and performing reviews and analyses as well as building frameworks [10,21,25]. Very often, one and the same study addresses a bundle of topics. In project complexity (25%), apart from reviews [76], the core topics are building frameworks and models [77,78,79]. Process complexity (14%) is characterized either by the study of specific processes such as (IT) services [80,81,82] or by the generic topic of complexity factors and effects [83]. Product (and production) complexity (5.56%) is studied as part of project complexity in the context of new product development projects [84] or through generic approaches to production complexity [85]. As product and production complexities largely depend on the type of the product itself, we include only very few such works in our analysis.
In the technological type of complexity, we identified several key topics. Topics related to business processes include business process models (41%), such as in [86], and event logs and workflows (9%), like [87]. Topics related to software include software programs in general (22%), for example, ref. [7], and user interfaces specifically (13%), as in [88]. Furthermore, there are topics related to systems, the most common being rule-based systems (4%), for example, ref. [89], enterprise systems (4%), like [90], and IT architectures (6%), as discussed in [5].
In the textual complexity type, the most studied topic appeared to be corporate and accounting narratives and legislative documentation in general, such as contracts or institutional mission statements, making up 68% [91,92]. This can be attributed to a strong need for clarity in communication between institutional management and various stakeholders. Other, rather rare topics fall into two groups: (1) textual complexity of webpages and online reviews (13%) [93,94] and (2) management textbooks [95], texts used for reading comprehension [96], and news articles (10%) [97].

4.2. Complexity Metrics and Analysis

In this subsection, to answer RQ2 “What metrics are proposed to measure the complexity concepts in organizations?” and gain a deep understanding of existing approaches to measuring complexity, we analyze the following guiding questions:
  • Which theories and disciplines laid the foundation of complexity metrics, that is, metrics origin?
  • What kind of information and data serve as input for complexity metrics?
  • What kind of output is expected?
  • Is tool support provided?
  • How are the proposed metrics validated?
In this analysis, we filtered out the literature and critical review papers (24 in total) since, as a rule, they do not offer any metric. In Figure 3, the metrics origin is shown for each type of complexity. Organizational Sciences is the prominent origin of metrics in organizational complexity. In technological complexity, Software Engineering is the dominating origin, whereas textual complexity metrics are, naturally, mainly driven by Linguistics. In general, Cognitive Sciences can be considered a significant common driver of metrics origin across the three complexity types.
To understand how respective complexities are measured, we provide the analysis of inputs and outputs related to complexity metrics. In Figure 4 and Figure 5, commonly used inputs and outputs in each complexity type are shown. Since business processes are one of the main assets of organizations, various data types related to business processes (for example, business process model, event log, and business process descriptions) are used to measure organizational and technological complexities.
Answering the question regarding validation, we identified that most of the complexity research (66%) is validated empirically. This indicates the practical value and applicability of complexity metrics. However, tool support can fairly be considered a catalyst for the reproducibility and applicability of research findings.

4.3. Complexity Research Motivations and Novelty

For novice as well as advanced researchers, it may be useful to get an overview of the motivating considerations and successful research “selling points”, or declared novelty, prominent in the field. In Figure 6, we present the motivations per complexity type over time, excluding literature and critical review papers. One can observe whether a motivation gains importance over time for a complexity type. For example, in organizational complexity, there is a continuous interest in complexity factors and effects.
In Figure 7, we present motivations in relation to the metrics origin for each complexity type. We excluded literature and critical review papers. One can observe those metrics origin areas that are essential for particular motivations related to complexity metrics. For example, in the development of new complexity metrics, Decision-making and Organizational Sciences are the two prominent areas to consider.
In Figure 8, we present the absolute numbers regarding the research novelty aspect. The most attractive topic is selecting a specific application area. Metrics evaluation and tool support are the two topics that attracted less interest in this context.

4.4. Complexity Research Focus Areas and Application Cases

In Figure 9 and Figure 10, the absolute distributions of specific focus areas and application cases according to complexity type are shown. The papers in which no application is specified are excluded from the application case distribution. To keep the most frequent values of application cases, the threshold for the minimum number of papers is set to two.

4.5. Complexity Research Future Research Directions

In this subsection, we share our findings regarding the future research directions we identified in the reviewed papers.
In Figure 11, we summarize the absolute distributions of the future research directions over time. The most significant future research directions are new validation studies, approaches, and metrics extensions. Such directions as guidelines development, metrics comparison, complexity factors, and complexity effects are rarely mentioned.

5. Literature Classification

In this section, we first classify the SLR results explained in the previous section and develop a morphological box. Second, we exploit a multi-dimensional framework to synthesize the SLR findings.

5.1. Morphological Box

To build the morphological box, the three aspects and ten features described in the research approach section are employed. Initially, each paper in the final set of 130 papers is assigned to the three aspects based on the values of its ten features. Then, thematically related feature values are grouped, and the relative number of papers in each group is calculated. For example, in the novelty feature of the generic aspects, the “new approach”, “new framework”, and “new metrics” novelties are grouped; as they are mentioned in 30 out of the 130 papers, the relative frequency for that group is 23%. Using the frequency distribution of each group in the three aspects, the morphological box shown in Table 4 is formed. As one paper may discuss multiple topics (that is, have multiple values in a single feature), the percentage values of the groups in a feature do not necessarily add up to 100.
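The relative-frequency computation behind the morphological box is simple; a minimal sketch (the grouping shown is the one from the text, the function name is ours):

```python
# Relative frequency of a thematic group in the morphological box.
# A paper may carry several values in one feature, so the per-feature
# percentages need not sum to 100.
TOTAL_PAPERS = 130
novelty_group = {"new approach", "new framework", "new metrics"}

def relative_frequency(papers_with_value: int, total: int) -> int:
    """Share of papers mentioning at least one value of the group, in percent."""
    return round(100 * papers_with_value / total)

# 30 of the 130 papers mention a value from the grouped novelties.
print(relative_frequency(30, TOTAL_PAPERS))  # -> 23
```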
For features with a small number of values, like validation or tool support, no groups were formed. Hence, the relative number of papers containing a certain value can be directly derived from the table. For example, an empirical validation has been performed in 66% of the papers. The three aspects of the created morphological box are explained below.

5.1.1. Generic Aspects

The motivations for performing complexity research can be as multifaceted as the term complexity itself. As our review shows, they range from measurement and reviews of complexity studies to the development of new complexity metrics and the investigation of complexity factors and effects. An overview of these motivations helps make newcomers familiar with complexity research. Furthermore, for researchers already working on the topic, it is advantageous to explore the studies related to complexity metrics development as well as practical studies. Practitioners can gain insights into complexity factors and effects, as well as reasons for applying complexity metrics.
The overview of the most frequent and also uncommon focus areas and application cases will help researchers to propose fruitful directions for case study-oriented complexity research and beyond. Business Process and Project Management, Corporate Finance, Software Engineering, and IT Service Management are some of the examples of focus areas. We use Statistical Classification of Economic Activities in the European Community [98] (RAMON) to provide a consistent overview of application areas, like Healthcare, Governmental Services, and Automotive.
Diverse novelty aspects (for example, a new approach or framework, new metrics or the adaptation of existing ones, findings of empirical studies, a new application area, and comparative evaluations) may serve as a source of inspiration for both complexity novices and expert researchers. Similarly, the future research aspect can point to significant limitations of the studies and potential further research directions. Additionally, gaps or problems mentioned in future research can be further investigated to contribute to the existing body of knowledge.

5.1.2. Complexity Metrics and Analysis Aspects

Metrics are common instruments for measurement. They are often used as a communication tool among stakeholders to control and assess the quality and status of artifacts. Moreover, various complexity metrics are applied to measure complexity broadly and in distinct areas. Such complexity analysis and metrics are characterized by certain input and output data. For example, depending on the type of complexity and application case, textual documentation, event logs, software programs, expert interviews, or even image data can serve as an input for the complexity analysis and measurements. The output is directly related to the research motivation in general and associated input in particular.
In the metrics development process, another important aspect is their theoretical base or origin. As a rule, theories are used to name observed concepts and explain the relationships between them. Theory is a well-known tool that helps to identify a problem and plan a solution. Hence, metrics origin is included as a feature in the proposed morphological box. Accordingly, the disciplines that are widely used for metrics development are summarized to give an overview and support the metrics development process. Some frequent examples are Organizational Sciences, Cognitive Sciences, Cognitive Informatics, Human Sciences, Software Engineering, Process Mining, and Mathematics.
As a matter of fact, metrics can be of different quality, depending on how precisely they describe an attribute of an entity. Hereby, metrics validity is among the most critical quality characteristics [86]. Validation is essential to guarantee that the outcomes of the metric application are legitimate. There are two types of validation methods: theoretical and empirical [99,100]. Theoretical validation is conducted in one of the following typical ways: (i) metrics development exclusively based on a theory; (ii) metrics development solely based on existing studies; (iii) checking compliance with a standard framework (for example, Briand’s framework properties [101] and Weyuker’s properties [102]); or (iv) illustration of the metrics application with the help of an example. In general, empirical validation of metrics complements theoretical validation. For empirical validation, different strategies are used, for example, case studies, surveys, or experiments. The objective of empirical validation is to find out whether the given metric measures what it is supposed to. Thus, for a metric to be structurally sound and useful, both theoretical and empirical validation are required [103].

5.1.3. Implementation Aspects

As can be seen in Table 4, approximately a third of the papers (34%) provide information about tool support. In particular, 23% of the papers mention that they use existing tools. In the remaining 11%, a tool was developed to support complexity measurement. We observed that in organizational complexity, there is a high demand for tool development, whereas, in technological complexity, tool support is more common. Notably, the studies providing tool support are prevailingly motivated by complexity metrics development.

5.2. Integrated Multi-Dimensional Complexity Framework

While performing the comprehensive literature analysis, we identified multiple specific approaches focusing on complexity and, using them, developed the mind map of complexity concepts shown in Figure 2. Although these approaches somewhat reflect a generic, commonly accepted complexity definition, that is, the quantity and variety of elements and their relationships, they are loosely coupled and focus on narrow areas. Addressing only one particular area, they fall short of embracing the whole variety of elements in the environment and of supporting complexity management at a large scale. Hence, motivated by the identified gaps IG1 and IG2, we synthesize the SLR results in the multi-dimensional complexity framework explained in the research approach section. In this subsection, we elaborate on that synthesis.
We consolidate the most common complexity concepts with the four dimensions of the framework. Specifically, we integrate the identified concepts into the complexity notions of the framework. Then, we calculate the relative distributions of the complexity notions per complexity type. As in the comprehensive literature analysis, the synthesis of the SLR results was performed using the qualitative data analysis tool NVivo. Hereby, each paper was analyzed and assigned to the complexity dimensions and respective notions by two researchers independently. Discrepancies were then discussed, and a common decision was taken. The resulting synthesis is depicted in Table 5 and explained below using an illustrative example.
The business process model is one of the complexity concepts we identified in technological complexity. As can be seen in the first row of technological complexity in Table 5, 91% of the technological complexity papers focus on the objective observer perspective, whereas 36% focus on the subjective observer perspective. In the remainder of this subsection, we elaborate on the synthesis for the three complexity types, namely organizational, technological, and textual.
In technological complexity, the identified complexity concepts are the business process model, event logs and workflows, and software. Though technological complexity subtypes are expected to be similar in their nature and, hence, in their complexity, we observed a diversity of combinations, especially compared to organizational and textual complexities (see Table 5). Owing to its technological nature, we expected technological complexity to be objective (D1) and quantitatively measured (D3). Interestingly, both subjective (D1) and qualitative (D3) complexity approaches gained popularity in this type of complexity. Subjective approaches to analyzing complexity are exemplified below:
  • Business process models, event logs and workflows: understandability, cognitive load perspectives [104] and quality measure of a process based on the number of generated process logs [105];
  • IT services: complexity theory-based conceptualization [106];
  • Enterprise systems: defining case study-based complexity factors [107];
  • Rule-based systems: difficulty of problems that can be solved [108];
  • User interfaces: case study-based evaluation [109];
  • Software programs: dependency on the programmer’s skills [110].
In the time perspective (D2) for technological complexity, all but one concept are either purely or substantially structural in the analyzed papers. As can be seen in Table 5, IT services is the only concept considered completely dynamic (100%). Similarly, in D4, it is the only disorganized technological complexity concept. Additionally, rule-based systems is the concept with a considerable disorganized relative distribution value.
Due to its intrinsic diversity, that is, the diversity of tasks, projects, and processes, as well as of the approaches to analyzing their complexity, organizational complexity mostly covers all complexity notions of the four dimensions, except for product(-ion) complexity, which, as a rule, reflects a defined number of elements (structural notion, D2) that interact in a specifically designed way (organized notion, D4) [85].
Regarding textual complexity, we observed the same complexity notions in most textual complexity concepts, that is, legislative documentation, news articles, webpages, online reviews, and textbooks. Notably, they are considered purely structural (D2) and organized (D4). Moreover, except for legislative documentation and textbooks and other teaching materials, textual complexity concepts are characterized as objective (D1) and quantitative (D3).

6. Method to Address Complexity

In this section, we present our method to address complexity in organizations with an illustrative example taken from a real-life setting.
For developing the method, GQM is taken as the basis and extended using the complexity analysis approaches obtained from the SLR, the developed morphological box, and the integrated multi-dimensional complexity framework. Aligned with the GQM hierarchy, the starting point of our method is a goal definition. In other words, the method exploits a top-down mechanism for tackling complexity. This is also necessary to provide staged complexity guidance, starting with a goal statement and reaching a specific solution. In our case, the specific solution is a collection, each element of which is a map with the following key-value pair: a question derived from the given goal, and a set of complexity measurement and analysis approaches relevant to the problem origin of that question.
At the question level, the morphological box is used to refine an expressed goal into several questions. In particular, except for metrics origin, all features in the morphological box and the groups in these features are employed to facilitate deriving specific questions. For example, the groups in the input feature can be used to break down a given goal and define specific questions about the available inputs regarding complexity in the organization.
Next, at the measurement level, each question is analyzed with the help of the integrated multi-dimensional complexity framework. In the analysis, complexity types, complexity concepts, complexity dimensions, and notions are investigated to determine which of the complexity approaches obtained from the SLR can be beneficial for answering each question. With this, the aim is to distill the comprehensive analysis and provide a relevant subset of approaches mapping to each question. To build such a subset, initially, related complexity types are identified for each question. Then, per complexity type, matching complexity concepts, complexity dimensions, and notions are discovered. Based on these obtained inputs, the complexity approaches retrieved in the SLR are filtered, and a subset is created. The created subsets are merged into a final subset per question.
At the final level, that is, the data level, each subset is further filtered considering the data available for each question. Importantly, in the case of data unavailability, data transformation possibilities can be investigated based on the input groups in the morphological box and the input attributes of the approaches in the created subsets. With this, our method enables organizations to consider and evaluate alternatives that may be beneficial depending on their context.
Based on the explanation of each level above, we summarize the steps of the proposed method:
1. Define a goal;
2. Formulate specific questions from the goal using the morphological box;
   For each question:
3. Analyze the question using the integrated multi-dimensional complexity framework;
4. Identify the complexity types related to the question;
   For each complexity type:
   4.1. Using the integrated multi-dimensional complexity framework, find matching:
        (i) complexity concepts;
        (ii) complexity dimensions;
        (iii) notions;
   4.2. Create a complexity approaches subset based on the complexity type, (i), (ii), and (iii);
5. Merge the complexity approaches subsets into a final subset;
6. Filter the final subset considering the available data;
7. Check the need and possibilities for data transformation using the morphological box.
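The filtering steps above (identify complexity types, match approaches, then filter by available data) can be sketched as a small pipeline. This is a minimal illustration under our own assumptions; the `Approach` data model and the matching predicates are hypothetical simplifications of the SLR catalogue, not part of the method's formal definition:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Approach:
    """One complexity analysis approach as characterized in the SLR (simplified)."""
    name: str
    complexity_type: str   # "organizational" | "technological" | "textual"
    concept: str           # e.g., "legislative documentation"
    required_input: str    # e.g., "email texts"

def approaches_for_question(question_types: set, available_data: set, catalogue: list) -> set:
    # Steps 4 and 4.2: keep approaches matching the question's complexity types.
    by_type = {a for a in catalogue if a.complexity_type in question_types}
    # Steps 5-6: the merged subset is filtered by the data actually available.
    return {a for a in by_type if a.required_input in available_data}

catalogue = [
    Approach("Flesch Reading Ease", "textual", "documents", "email texts"),
    Approach("process model metric", "technological", "process model", "event log"),
]
result = approaches_for_question({"textual"}, {"email texts"}, catalogue)
assert {a.name for a in result} == {"Flesch Reading Ease"}
```

Step 7 (data transformation) would extend the second filter: instead of discarding an approach whose input is unavailable, one checks whether an available input can be transformed into the required one.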
In the remainder of this section, we provide an example of addressing complexity (see Table 6). The setting of the example is taken from a company operating in the healthcare sector that provides Business Intelligence (BI) solutions to various healthcare institutions. To gather requirements and change requests, the business analysts of the company visit its clients. However, due to the Covid-19 pandemic and its consequent restrictions, visits were not possible. Moreover, healthcare practitioners were deeply in need of adequate dashboards to better understand the pandemic and sent multiple immediate requests to the company via email. Hence, the business analysis department has to tackle mostly unstructured and ambiguous texts in the received emails. Given this situation, the company is looking for a way to deal with the complexity that emerged from the changes in its work environment.
Following our illustrative example in Table 6, with the growing popularity of remote working, the employees receive their tasks prevailingly in textual form. The goal is to analyze the workload of the employees. This goal is broken down into four questions.
In Question 1, the required data are email texts. Hence, the complexity type in focus is the textual one. Based on Table 5, we can identify the most typical complexity dimensions analyzed within this complexity type. Afterward, one can make use of the literature analysis results (for the details of this study, please check out our project on GitHub (https://github.com/complexityreview/comprehensiveOverview, accessed on 11 October 2021)) we obtained with the help of NVivo. Based on the complexity type (and subtype if applicable) and the selected complexity dimensions, one can determine the relevant papers and, this way, derive the necessary complexity analysis approaches and metrics. In the case of Question 1 of the illustrative example, these are standard readability formulae such as the Flesch Reading Ease score or the Gunning Fog Index [111].
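To illustrate such a readability formula, a minimal implementation of the Flesch Reading Ease score is sketched below, using the standard formula; the syllable counter is a rough vowel-group heuristic, so scores will deviate slightly from dictionary-based tools:

```python
import re

def count_syllables(word: str) -> int:
    # Rough heuristic: count vowel groups; every word has at least one syllable.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text: str) -> float:
    """Flesch Reading Ease = 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words)."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (len(words) / sentences) - 84.6 * (syllables / len(words))

# Hypothetical email fragment; higher scores indicate easier reading.
score = flesch_reading_ease("The dashboard shows daily admissions. Update it weekly.")
assert 0 <= score <= 121
```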
In Question 2, specific recent complexity analysis approaches, such as [112], are more relevant to answer the question and extract the exact activities from the textual data.
In Questions 3 and 4, different data and complexity types are required. In Question 3, the organizational complexity type and the task subtype are identified. Due to the inherent variability of organizational complexity, all possible complexity dimensions and notions are observed in this case (see Table 5), which complicates the choice. Thus, individual managerial decisions need to be taken to estimate the efforts and further actions. Additionally, gathering the necessary data, that is, specific information from employees regarding task complexity, can be problematic and time-consuming. However, one can rely on solid research works for guidance, for example, [11]. In Question 4, by contrast, the required data and the derived complexity type, dimensions, and notions are rather straightforward, and the complexity analysis is easy to implement.
To summarize, GQM provides an opportunity to demonstrate the direct practical value of the study findings, that is, the comprehensive literature analysis, the obtained complexity types, the complexity concepts mind map, the morphological box, and the integrated multi-dimensional complexity framework. We extended the standard GQM process with additional elements based on our study findings: data, complexity type, complexity dimensions, notions, and analysis approaches.

7. Discussion

In this section, we discuss the findings of our study in accordance with the demands we have inferred from the gaps listed in the related work section and which served as a motivation for our study. These are:
  • Demand for an extensive literature analysis embracing different types of complexity in organizations (IG1, IG4);
  • Demand for a structured overview and integration of findings based on existing standard approaches, frameworks, and vocabularies to promote the common acceptance of complexity research in organizations (IG1);
  • Demand for practical guidance supporting managers in addressing the complexity and planning comprehensive complexity management initiatives in companies (IG2, IG3).

7.1. Demand for an Extensive Literature Analysis

In the related literature, we have noted a high fragmentation in research on complexity in general and in organizations in particular. This tendency has been observed both in regular and review papers. For example, the literature review works consider different types of complexity implicitly related to organizations, such as task complexity [10], project complexity [113], or interorganizational complexity [23]. However, each of these works focuses on a specific aspect of an organization. Our study sets itself apart by addressing multiple types of complexity in an extensive SLR of complexity studies from the People, Process, Technology perspective (the PPT framework [14]). Moreover, our SLR has shown that the peak popularity of complexity research in organizations was in 2011–2015, which highlights the demand for an up-to-date review.

7.2. Demand for a Structured Overview

The related work shows that the mentioned problem of high fragmentation applies not only to the term complexity but also to the complexity drivers [25], measurements, and approaches to address complexity [26]. We address this limitation while classifying the SLR findings using the morphological box and the integrated multi-dimensional complexity framework. The developed morphological box (see Table 4) provides a logically structured outcome highlighting the most interesting and important points with the help of the three groups of aspects: generic, complexity metrics and analysis, and implementation aspects.
The generic aspects represent a summary of research motivations, novelty, focus areas, application cases, and future research on complexity in organizations, that is, information typical of any research study. This way, practitioners can get a quick overview of how complexity research is structured and approached in the academic community. Researchers can also draw valuable insights from the information on motivations or future research directions. For example, some complexity studies are driven by research on the reasons for performing complexity analysis (16%) and on complexity effects (12%), that is, the hidden benefits of working on complexity, an interesting insight for both researchers and practitioners.
Accordingly, our study aims to facilitate and promote efforts in this direction while streamlining complexity research projects. Thus, the less frequent motivations, for example, complexity metrics analysis (5%) and complexity studies review (7%), may point to potential gaps for further complexity studies. Likewise, the information about novelty areas can be useful for better positioning and justifying research projects in terms of envisioned contributions. The summary of focus areas and application cases is valuable for researchers and practitioners in two ways. First, the information on the popularity of focus areas can help them identify relevant areas requiring more research. Second, similar application cases can be closely studied for comparison purposes.
In addition, our findings can serve to demonstrate the reusability of existing application cases. Regarding future research directions, interested researchers can either follow trending topics or pick up rarely studied cases on which few research activities have been conducted so far. For example, one can clearly see that the suggested complexity analysis approaches, metrics, and frameworks lack thorough validation across all three complexity types.
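To make the structure of such a morphological box concrete, the following Python sketch encodes it as a nested mapping of aspects to features to options and validates the coding of a single paper. The feature and option names are abbreviated examples in the spirit of Table 4, not its full vocabulary, and the `classify` helper is our own illustrative construction.

```python
# A morphological box as a nested mapping: aspect -> feature -> options.
# The option lists below are abbreviated examples, not the full
# vocabulary of Table 4.
MORPHOLOGICAL_BOX = {
    "generic": {
        "motivation": ["complexity metrics development", "complexity factors",
                       "complexity effects", "complexity studies review"],
        "future research": ["new validation studies", "tool support",
                            "metrics extension", "not specified"],
    },
    "complexity metrics and analysis": {
        "input": ["business process models", "event logs and workflows",
                  "software and architectures", "business textual information"],
        "validation": ["case study", "experiment", "none"],
    },
    "implementation": {
        "tool support": ["existing tool", "prototype", "no tool"],
    },
}

def classify(paper: dict) -> dict:
    """Check that a paper's coding only uses known options and return
    one value (or None) per feature of the box."""
    coded = {}
    for aspect, features in MORPHOLOGICAL_BOX.items():
        for feature, options in features.items():
            value = paper.get(feature)
            if value is not None and value not in options:
                raise ValueError(f"unknown option {value!r} for {feature}")
            coded[feature] = value
    return coded

paper = {"motivation": "complexity metrics development",
         "input": "business process models", "tool support": "no tool"}
print(classify(paper))
```

Coding each surveyed paper against such a fixed vocabulary is what makes the frequency counts reported above (e.g., 16% and 12% for the motivation feature) directly comparable across studies.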
The complexity metrics and analysis aspects include the inputs, resulting outputs, origins of complexity metrics, and their validation. Whereas the origin of a metric provides theoretical background, the input and output types, together with the validation approaches, offer practical insights into complexity measurement for both researchers and practitioners.
Aside from that, tool support, that is, the implementation aspect, is a topic that needs special attention, since the largest share of the complexity metrics (35%, see Table 4) do not use any tool. In other words, there is a clear lack of tool support in complexity measurement, which should be addressed in future research.
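To illustrate how lightweight such tool support can be, the following sketch computes McCabe's cyclomatic complexity [7] for Python source code using only the standard-library `ast` module. The counting convention (one plus the number of decision points) is the common one; the function name and the example snippet are our own.

```python
import ast

# Node types that add a decision point under the common counting
# convention for McCabe's cyclomatic complexity.
_DECISION_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler,
                   ast.BoolOp, ast.IfExp)

def cyclomatic_complexity(source: str) -> int:
    """Approximate McCabe complexity of a piece of Python source."""
    tree = ast.parse(source)
    complexity = 1  # one path through straight-line code
    for node in ast.walk(tree):
        if isinstance(node, ast.BoolOp):
            # 'a and b and c' adds len(values) - 1 extra decisions
            complexity += len(node.values) - 1
        elif isinstance(node, _DECISION_NODES):
            complexity += 1
    return complexity

example = """
def classify(x):
    if x < 0:
        return "negative"
    for _ in range(3):
        if x > 10 and x < 100:
            return "medium"
    return "other"
"""
print(cyclomatic_complexity(example))  # prints 5
```

Even a small script like this removes the manual counting step, which is exactly the kind of gap the 35% figure points to.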

7.3. Demand for an Integration of Findings

We integrated our findings into an established complexity framework [5], applying its four universal dimensions based on time, observer, measures, and dynamics and predictability. Such a generic classification approach allows for the independent documentation of diverse complexity concepts. Moreover, it enables more transparency in presenting and understanding existing work on complexity. In our study, we demonstrated the possibility of integrating seemingly different complexity types into one complexity framework. This also enables other researchers to use their complexity concepts in the context of such a framework without needing to define them precisely [5].
In classifying our study findings, we identified various combinations of dimension values within technological complexity that indicate approaches not yet covered by the specific complexity concept. For example, the complexities of event logs and workflows, IT architectures, and IT services reveal a lack of subjective approaches (see Table 5). While organizational complexity covers nearly all dimensions and values, product(ion) complexity lacks dynamic and disorganized approaches. Being structural and organized in nature, textual complexity shows no subjective approaches for the analysis of news articles, webpages, and reviews. All these open points offer potential for further exploration and, hence, future work. Furthermore, although the classification is not always straightforward, we highlight the importance of formalization in the context of complexity measurement.
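The gap-finding step described above can be sketched as a small enumeration: classify each surveyed approach along the four dimensions of the framework [5] and list the combinations no approach covers. The value sets below are simplified placeholders (the framework's real vocabulary is richer), and the two classified approaches are hypothetical.

```python
from dataclasses import dataclass
from itertools import product

# The four dimensions of the complexity framework [5]; the value sets
# here are simplified placeholders, not the framework's full vocabulary.
DIMENSIONS = {
    "time": ["static", "dynamic"],
    "observer": ["subjective", "objective"],
    "measures": ["qualitative", "quantitative"],
    "dynamics_predictability": ["organized", "disorganized"],
}

@dataclass(frozen=True)
class Classification:
    time: str
    observer: str
    measures: str
    dynamics_predictability: str

# Hypothetical classifications of two surveyed approaches.
classified = {
    Classification("static", "objective", "quantitative", "organized"),
    Classification("dynamic", "objective", "quantitative", "organized"),
}

def uncovered_combinations():
    """Enumerate dimension-value combinations not covered by any
    classified approach; such gaps suggest future research directions."""
    for values in product(*DIMENSIONS.values()):
        candidate = Classification(*values)
        if candidate not in classified:
            yield candidate

gaps = list(uncovered_combinations())
print(len(gaps))  # 16 combinations in total, 2 covered, so 14 gaps
```

In this toy setting, every gap containing "subjective" mirrors the missing subjective approaches we observed for event logs, workflows, and IT architectures.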

7.4. Demand for a Practical Guidance

As mentioned in Section 2, [36,37] state the need for frameworks and practices that assist managers in developing broad-scope complexity management strategies. In our work, we address this shortcoming by proposing a method for complexity management that adopts GQM, a well-known approach for deriving and selecting metrics for a specific task in a goal-oriented manner [16]. In GQM, goals and metrics are adjusted to a specific setting. Stating the problem, and hence the goals, upfront allows for selecting only the metrics relevant to achieving these goals, which considerably reduces the data collection effort. The interpretation of the measurements also becomes straightforward due to the traceability between data and metrics; hence, wrong interpretations can be prevented [114]. Despite such advantages, GQM reveals certain limitations, such as its high flexibility and the generation of a large number of solutions [60], which were discussed in Section 3. Attempts to deal with this limitation introduce other shortcomings, such as dismissing important perspectives [63]. Accordingly, multiple extensions of GQM tailored to particular study purposes can be observed. In our work, we adopt and extend GQM with complexity types, dimensions, and notions, making it more specific to our objectives and study setting. In this way, we address the high flexibility of GQM. Further, in the metric selection stage of GQM, we consider not only metrics but also the comprehensive complexity analysis approaches we found in the literature. In doing so, we deal with another shortcoming, that is, the risk of missing relevant solutions.
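The goal-oriented derivation in GQM [16] and our complexity-type extension can be sketched as a small goal–question–metric hierarchy. The class names, the tiny metric catalogue, and the selection helper below are hypothetical illustrations, not the actual artifact of our method.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Metric:
    name: str
    complexity_type: str  # extension: organizational, technological, or textual

@dataclass
class Question:
    text: str
    metrics: List[Metric]

@dataclass
class Goal:
    purpose: str
    complexity_type: str  # extension: narrows the metric search space
    questions: List[Question] = field(default_factory=list)

# Hypothetical metric catalogue distilled from an SLR.
CATALOGUE = [
    Metric("control-flow complexity", "technological"),
    Metric("Flesch reading ease", "textual"),
    Metric("task interdependence", "organizational"),
]

def relevant_metrics(goal: Goal) -> List[Metric]:
    """Goal-oriented selection: keep only the catalogue metrics matching
    the goal's complexity type, reducing the data collection effort."""
    return [m for m in CATALOGUE if m.complexity_type == goal.complexity_type]

goal = Goal("reduce process model complexity", "technological")
goal.questions.append(
    Question("How complex are our process models?", relevant_metrics(goal))
)
print([m.name for m in goal.questions[0].metrics])  # ['control-flow complexity']
```

Filtering the catalogue by complexity type is what counters GQM's high flexibility: only metrics traceable to the stated goal ever enter data collection.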
Although our method addresses the shortcomings of GQM [60,63], we are aware that it also has some limitations. For example, it uses the SLR results as input. Hence, it naturally inherits typical SLR limitations, such as the authors' bias in building the search strings, defining the exclusion and inclusion criteria, coding, and synthesizing the results [115]. Further, although our SLR provides substantial groundwork, such reviews need to be repeated to keep the list of complexity metrics and analysis approaches up to date. Moreover, in its current state, our method is a conceptually designed artifact that is manually applied to a real-world example to illustrate its relevance. Hence, automation solutions such as [116,117] should be considered as part of future work.

8. Conclusions and Future Work

In this paper, we proposed a method to address complexity in organizations. With this method, our main goal was to develop practical guidelines for selecting complexity analysis approaches for a particular complexity management problem in organizations. To achieve this goal, we extended the Goal Question Metric approach [16] using an SLR on complexity and its results, which we comparatively analyzed and synthesized.
In particular, to capture the body of knowledge on complexity research and practice, we conducted an SLR on complexity research considering the three main complexity types, that is, organizational, technological, and textual, as the starting point. We analyzed 130 papers that discuss complexity, taking the PPT framework [14] as the basis. Then, to provide structured outcomes and address the problem of high fragmentation, we designed and implemented two classification approaches: a morphological box and an integrated multi-dimensional framework. The developed morphological box summarizes three fundamental aspects of complexity research, which comprise ten features capturing the information extracted from the analyzed papers. To contribute to the standardization efforts in the complexity research field, we synthesized our SLR findings by integrating them into an established multi-dimensional complexity framework [5].
Next, the knowledge and findings obtained from the classification of the SLR results were used to extend GQM, yielding the method to address complexity in organizations. With an illustrative example taken from a company, we demonstrated the practical value of the method. Hence, as a practical contribution in the form of comprehensive guidance, the method can assist organizations in their complexity management initiatives. Thus, our study sets itself apart from existing work by serving as a guideline for both complexity researchers and practitioners who want to perform complexity analysis in an organization. It is important to note that the structuring and conceptualization of complexity presented in our study is a first attempt to align diverse types of complexity.
For future work, two prominent avenues addressing the limitations of the method can be highlighted. First, to maintain an up-to-date set of complexity analysis approaches, the currently manual SLR on complexity needs to be automated. For this, we aim to use automated systematic review solutions that employ machine learning technologies, for example, ASReview [117]. Such solutions may help enrich the results, serving as a reusable basis and better mitigating researcher bias. Second, we plan to conduct case studies to investigate the usefulness of the method in organizations, as it is currently demonstrated only with an illustrative example from a real-life setting. Moreover, through case studies, we would like to collect practitioners' opinions on how the method fits into their daily routine.

Author Contributions

Conceptualization, A.R.; methodology, A.R.; software, Ü.A.; formal analysis, A.R. and Ü.A.; investigation, A.R. and Ü.A.; resources, V.G.M.; data curation, Ü.A.; writing—original draft preparation, A.R. and Ü.A.; writing—review and editing, A.R.; visualization, Ü.A.; supervision, V.G.M.; project administration, A.R. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available on https://github.com/complexityreview/comprehensiveOverview (accessed on 15 October 2021).

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A. List of Papers Analyzed in This Study

Below, in Table A1, we list the papers analyzed in this study. For the sake of simplicity and due to limited space, we only provide the most relevant features of the papers in addition to their title and authors. In particular, the complexity type, motivation, input, and future research features are kept in the list. For the detailed list with all features and encoding, we refer to the project page on GitHub (https://github.com/complexityreview/comprehensiveOverview, accessed on 11 October 2021).
Table A1. Analyzed papers in this study.
Reference | Complexity Type | Title | Motivation | Input | Future Research
[72] | Organizational | Task complexity and contingent processing in decision making: An information search and protocol analysis | Reasons for applying complexity metrics | Other; Case studies and interviews | Tool support
[46] | Organizational | Task complexity: Definition of the construct | Complexity metrics development | Other | Framework extension
[21] | Organizational | Task complexity: A review and analysis | Complexity factors; Complexity metrics development | Not Applicable | New validation studies; Approach extension
[118] | Organizational | Review of concepts and approaches to complexity | Complexity studies review | Not Applicable | Not Specified
[11] | Organizational | A model of the effects of audit task complexity | Complexity effects | Not Applicable | Approach extension
[119] | Organizational | Task complexity affects information seeking and use | Complexity effects | Other; Case studies and interviews | New validation studies
[120] | Organizational | The impact of knowledge and technology complexity on information systems development | Complexity metrics development | Software and Architectures | Approach extension; New validation studies
[68] | Organizational | Task and technology interaction (TTI): a theory of technological support for group tasks | Complexity metrics development | Other | New validation studies
[121] | Organizational | Perspective: Complexity theory and organization science | Complexity metrics development | Not Applicable | Approach extension
[84] | Organizational | Sources and assessment of complexity in NPD projects | Complexity factors | Business textual information | New validation studies
[81] | Organizational | Quantifying the Complexity of IT Service Management Processes | Complexity metrics development | Business textual information | Metrics extension; New validation studies
[122] | Organizational | Complexity of megaprojects | Complexity factors | Not Applicable | Not Specified
[80] | Organizational | Estimating business value of IT services through process complexity analysis | Reasons for applying complexity metrics | Business textual information | Approach extension; Tool support
[123] | Organizational | The inherent complexity of large scale engineering projects | Complexity metrics development | Not Applicable | Not Specified
[70] | Organizational | Complexity of Proceduralized Tasks | Complexity metrics development; Reasons for applying complexity metrics | Not Applicable | Not Specified
[66] | Organizational | Finding and reducing needless complexity | Complexity factors | Other | Not Specified
[79] | Organizational | Revisiting project complexity: Towards a comprehensive model of project complexity | Complexity metrics development | Not Applicable | Approach extension
[71] | Organizational | Model-based identification and use of task complexity factors of human integrated systems | Complexity factors | Other | Framework extension; Guidelines development
[10] | Organizational | Task complexity: A review and conceptualization framework | Complexity studies review; Complexity metrics development | Not Applicable | New validation studies; Framework extension
[85] | Organizational | Testing complexity index-a method for measuring perceived production complexity | Complexity effects | Business textual information | Not Specified
[83] | Organizational | The impact of business process complexity on business process standardization | Complexity factors | Other | Not Specified
[124] | Organizational | Relationships between project complexity and communication | Complexity effects | Other; Case studies and interviews | Approach extension
[78] | Organizational | Building up a project complexity framework using an international Delphi study | Complexity metrics development | Other; Case studies and interviews | New validation studies
[76] | Organizational | An extended literature review of organizational factors impacting project management complexity | Complexity factors | Not Applicable | Not Specified
[125] | Organizational | Revisiting complexity in the digital age | Reasons for applying complexity metrics | Not Applicable | Not Specified
[77] | Organizational | Complexity in the Context of Systems Approach to Project Management | Complexity effects | Not Applicable | New validation studies; Framework extension
[126] | Organizational | Review of complexity drivers in enterprise | Complexity factors | Not Applicable | Not Specified
[45] | Organizational | Measurement model of project complexity for large-scale projects from task and organization perspective | Complexity metrics development | Business textual information | Metrics extension
[75] | Organizational | Work Autonomy and Workplace Creativity: Moderating Role of Task Complexity | Reasons for applying complexity metrics | Other; Case studies and interviews | New validation studies
[127] | Organizational | Managing complexity in service processes. The case of large business organizations | Complexity metrics development | Business textual information | New validation studies; Approach extension
[69] | Organizational | Modeling task complexity in crowdsourcing | Complexity factors | Business textual information | Approach extension
[128] | Organizational | Construction project complexity: research trends and implications | Complexity studies review | Not Applicable | Complexity factors; Complexity effects
[129] | Organizational | Revisiting complexity theory to achieve strategic intelligence | Reasons for applying complexity metrics | Other | Not Specified
[25] | Organizational | Revisiting task complexity: A comprehensive framework | Complexity metrics development | Other | Framework extension
[130] | Organizational | Complexity drivers in digitalized work systems: implications for cooperative forms of work | Complexity factors | Other; Case studies and interviews | Approach extension
[74] | Organizational | Robotic Process Automation Contemporary themes and challenges | Reasons for applying complexity metrics | Not Applicable | Complexity factors; Complexity effects; Guidelines development
[7] | Technological | A Complexity Measure | Complexity metrics development | Software and Architectures | Not Specified
[131] | Technological | A measure of control flow complexity in program text | Complexity metrics development | Software and Architectures | Not Specified
[132] | Technological | Software structure metrics based on information flow | Reasons for applying complexity metrics; Complexity metrics development | Software and Architectures | Approach extension
[133] | Technological | Measuring the quality of structured designs | Reasons for applying complexity metrics; Complexity metrics development | Software and Architectures | Metrics extension; New validation studies
[110] | Technological | An empirical study of a syntactic complexity family | Complexity metrics development | Software and Architectures | Not Specified
[134] | Technological | System structure and software maintenance performance | Complexity effects | Software and Architectures | Approach extension; New validation studies
[108] | Technological | Verifying, validating, and measuring the performance of expert systems | Complexity effects | Software and Architectures | Approach extension; Approach implementation
[135] | Technological | Software complexity and maintenance costs | Complexity effects | Software and Architectures | Not Specified
[136] | Technological | Complexity metrics for rule-based expert systems | Complexity metrics analysis | Software and Architectures | New validation studies
[137] | Technological | An information theory-based approach for quantitative evaluation of user interface complexity | Complexity metrics development | Software and Architectures | New validation studies
[138] | Technological | Software metrics by architectural pattern mining | Reasons for applying complexity metrics; Complexity metrics development | Software and Architectures | Tool support
[86] | Technological | Finding a complexity measure for business process models | Complexity metrics development | Not Applicable | Metrics extension
[139] | Technological | A new measure of software complexity based on cognitive weights | Complexity metrics development | Software and Architectures | Not Specified
[140] | Technological | Measures of information complexity and the implications for automation design | Complexity metrics development | Not Applicable | New validation studies
[141] | Technological | Complexity and Automation Displays of Air Traffic Control: Literature Review and Analysis | Complexity metrics development | Not Applicable | Metrics extension
[47] | Technological | A discourse on complexity of process models | Complexity metrics analysis; Complexity metrics development | Not Applicable | New validation studies
[142] | Technological | Business process quality metrics: Log-based complexity of workflow patterns | Complexity metrics development | Event log; workflows | Not Specified
[105] | Technological | Complexity analysis of BPEL web processes | Complexity factors | Event log; workflows | New validation studies
[143] | Technological | Approaches for business process model complexity metrics | Complexity metrics development | Software and Architectures | Metrics extension
[144] | Technological | Error Metrics for Business Process Models | Reasons for applying complexity metrics | Business process models | New validation studies
[145] | Technological | A weighted coupling metric for business process models | Complexity metrics development | Business process models | New validation studies
[90] | Technological | A metric for ERP complexity | Complexity metrics analysis | Not Applicable | Tool support
[146] | Technological | Business process control-flow complexity: Metric, evaluation, and validation | Complexity metrics development | Event log; workflows | Not Specified
[147] | Technological | Evaluating workflow process designs using cohesion and coupling metrics | Complexity metrics development; Reasons for applying complexity metrics | Business process models | Approach extension; New validation studies
[148] | Technological | On a quest for good process models: the cross-connectivity metric | Complexity metrics development | Business process models | New validation studies; Complexity factors; Guidelines development
[149] | Technological | Complex network model for software system and complexity measurement | Complexity metrics development | Software and Architectures | Metrics extension; Approach extension
[150] | Technological | Complexity metrics for Workflow nets | Complexity metrics development | Business process models | New validation studies
[103] | Technological | A Survey of Business Process Complexity Metrics | Complexity studies review | Not Applicable | Metrics extension; New validation studies; Tool support
[151] | Technological | Prediction of business process model quality based on structural metrics | Complexity metrics analysis | Business process models | New validation studies
[107] | Technological | Enterprise systems complexity and its antecedents: a grounded-theory approach | Complexity factors | Software and Architectures | Complexity effects; New validation studies
[152] | Technological | Optimizing the trade-off between complexity and conformance in process reduction | Complexity effects | Business process models | New validation studies
[153] | Technological | A simpler model of software readability | Complexity metrics development | Software and Architectures; Case studies and interviews | Not Specified
[154] | Technological | Integrated framework for business process complexity analysis | Complexity metrics development | Business textual information | Metrics extension; New validation studies
[155] | Technological | Complexity in Enterprise Architectures-Conceptualization and Introduction of a Measure from a System Theoretic Perspective | Complexity metrics development | Software and Architectures | Approach extension; New validation studies
[109] | Technological | GUIEvaluator: A Metric-tool for Evaluating the Complexity of Graphical User Interfaces | Complexity metrics development | Not Applicable | Metrics extension; Metrics comparison
[156] | Technological | Examining case management demand using event log complexity metrics | Complexity metrics development | Event log; workflows | Metrics extension; New validation studies; Tool support
[106] | Technological | A complexity theory approach to IT-enabled services (IESs) and service innovation: Business analytics as an illustration of IES | Complexity metrics development | Not Applicable | New validation studies
[157] | Technological | Square complexity metrics for business process models | Complexity metrics development | Business process models | New validation studies
[158] | Technological | Quantification of interface visual complexity | Complexity metrics development | Other | Metrics extension; Tool support
[5] | Technological | Adopting Notions of Complexity for Enterprise Architecture Management | Complexity metrics development | Not Applicable | Not Specified
[159] | Technological | An exploratory study on the relation between user interface complexity and the perceived quality | Complexity effects | Software and Architectures | New validation studies
[160] | Technological | A systematic literature review of studies on business process modeling quality | Complexity studies review | Not Applicable | Framework extension
[161] | Technological | Metrics and performance indicators to evaluate workflow processes on the cloud | Complexity measurements | Event log; workflows | Approach extension; New validation studies
[162] | Technological | Measuring complexity of business process models integrated with rules | Complexity metrics development | Business process models | Not Specified
[163] | Technological | Metrics for the case management modeling and notation (CMMN) specification | Complexity metrics development | Business process models | New validation studies
[88] | Technological | UI-CAT: calculating user interface complexity metrics for mobile applications | Complexity measurements | Event log; Software and Architectures | Not Specified
[164] | Technological | Complexity-aware generation of workflows by process-oriented case-based reasoning | Reasons for applying complexity metrics | Event log; workflows | Metrics extension; New validation studies
[165] | Technological | How visual cognition influences process model comprehension | Complexity metrics development | Business process models | New validation studies
[29] | Technological | Complexity metrics for process models-A systematic literature review | Complexity studies review | Not Applicable | Metrics extension
[166] | Technological | Decision support for reducing unnecessary IT complexity of application architectures | Complexity factors | Software and Architectures | Tool support
[104] | Technological | Dealing with Process Complexity: A Multiperspective Approach | Complexity factors | Business process models | New validation studies
[167] | Technological | Towards understanding code readability and its impact on design quality | Complexity effects | Software and Architectures | Metrics extension
[168] | Technological | Integrating Business Process Models with Rules | Reasons for applying complexity metrics | Business process models | Not Specified
[169] | Technological | Complexity metrics for DMN decision models | Complexity metrics analysis | Business process models | Approach extension
[170] | Textual | Accounting narratives: A review of empirical studies of content and readability | Complexity studies review | Not Applicable | New validation studies; Approach extension
[91] | Textual | Readability of annual reports: Western versus Asian evidence | Complexity factors | Business textual information | Not Specified
[171] | Textual | Readability of annual reports: Western versus Asian evidence-a comment to contexualize | Complexity studies review | Not Applicable | New validation studies
[95] | Textual | The application of the marketing concept in textbook selection: Using the Cloze procedure | Reasons for applying complexity metrics | Business textual information | Metrics extension
[172] | Textual | Annual report readability variability: tests of the obfuscation hypothesis | Complexity factors | Business textual information | Metrics extension; New validation studies
[173] | Textual | Communication in auditors’ reports: Variations in readability and the effect of audit firm structure | Complexity factors | Business textual information | Not Specified
[174] | Textual | A texture index for evaluating accounting narratives | Complexity metrics development | Business textual information | New validation studies
[175] | Textual | The effect of thematic structure on the variability of annual report readability | Complexity factors | Business textual information | New validation studies
[176] | Textual | An approach to evaluating accounting narratives: a corporate social responsibility perspective | Reasons for applying complexity metrics | Business textual information | Metrics extension
[177] | Textual | E-comprehension: Evaluating B2B websites using readability formulae | Complexity measurements | Business textual information | Approach extension
[178] | Textual | Obfuscation, textual complexity and the role of regulated narrative accounting disclosure in corporate governance | Reasons for applying complexity metrics | Business textual information | Not Specified
[179] | Textual | Evaluating a measure of content quality for accounting narratives (with an empirical application to narratives from Australia, Hong Kong, and the United States) | Complexity factors | Business textual information | Approach extension
[180] | Textual | Readability of corporate annual reports of top 100 Malaysian companies | Complexity factors | Business textual information | New validation studies
[181] | Textual | Readability of financial statement footnotes of Kuwaiti corporations | Complexity measurements | Business textual information | New validation studies
[182] | Textual | Voluntary narrative disclosures by local governments: A comparative analysis of the textual complexity of mayoral and chairpersons’ letters in annual reports | Complexity factors | Business textual information | Approach extension; New validation studies
[183] | Textual | The textual complexity of annual report narratives: A comparison of high-and low-performance companies | Complexity effects | Business textual information | New validation studies; Approach extension
[92] | Textual | Are business school mission statements readable?: Evidence from the top 100 | Complexity measurements | Business textual information | Not Specified
[184] | Textual | Enhancing compliance through improved readability: Evidence from New Zealand’s rewrite “experiment” | Reasons for applying complexity metrics | Business textual information | New validation studies
[185] | Textual | How readable are mission statements? An exploratory study | Complexity measurements | Business textual information | Approach extension; New validation studies
[186] | Textual | Readability of accountants’ communications with small business—Some Australian evidence | Complexity measurements | Business textual information | Metrics extension; New validation studies
[111] | Textual | The readability of managerial accounting and financial management textbooks | Reasons for applying complexity metrics | Business textual information | New validation studies
[187] | Textual | Evaluating content quality and helpfulness of online product reviews: The interplay of review helpfulness vs. review content | Complexity effects | Business textual information | Metrics extension; New validation studies
[93] | Textual | Reading between the vines: analyzing the readability of consumer brand wine web sites | Complexity measurements | Business textual information | Approach extension; New validation studies
[188] | Textual | Essays on the issues of readability in business disciplines | Complexity studies review | Not Applicable | Approach extension
[96] | Textual | Revisiting the role of linguistic complexity in ESL reading comprehension | Complexity factors | Business textual information | Not Specified
[189] | Textual | Textual complexity of standard conditions used in the construction industry | Complexity factors | Business textual information | Not Specified
[190] | Textual | Tourism websites in the Middle East: readable or not? | Complexity measurements | Business textual information | New validation studies; Approach extension
[191] | Textual | Developing the Flesch reading ease formula for the contemporary accounting communications landscape | Complexity metrics analysis; Complexity metrics development | Not Applicable | Metrics extension
[192] | Textual | Text complexity: State of the art and the conundrums it raises | Complexity studies review | Not Applicable | Approach extension
[193] | Textual | Traditional and alternative methods of measuring the understandability of accounting narratives | Complexity metrics analysis | Business textual information | New validation studies
[97] | Textual | When complexity becomes interesting | Complexity effects | Business textual information | New validation studies
[194] | Textual | Readability and Thematic Manipulation in Corporate Communications: A Multi-Disclosure Investigation | Reasons for applying complexity metrics | Business textual information | New validation studies
[195] | Textual | Guiding through the Fog: Does annual report readability reveal earnings management? | Complexity effects | Business textual information | Metrics extension
[196] | Textual | From Accountability to Readability in the Public Sector: Evidence from Italian Universities | Complexity factors | Business textual information | Approach extension
[197] | Textual | The readability of integrated reports | Complexity measurements | Business textual information | Metrics extension; New validation studies
[198] | Textual | Readability of Mission Statements: A Look at Fortune 500 | Complexity measurements | Business textual information | Metrics extension; New validation studies
[199] | Textual | Assessing social and environmental performance through narrative complexity in CSR reports | Reasons for applying complexity metrics | Business textual information | New validation studies
[200] | Textual | A conceptual model for measuring the complexity of spreadsheets | Complexity metrics development | Other | Metrics extension
[201] | Textual | The influence of business strategy on annual report readability | Complexity factors | Business textual information | New validation studies
[94] | Textual | Roles of review numerical and textual characteristics on review helpfulness across three different types of reviews | Complexity effects | Business textual information | Approach extension; New validation studies

References

  1. Benbya, H.; Nan, N.; Tanriverdi, H.; Yoo, Y. Complexity and Information Systems Research in the Emerging Digital World. MIS Q. 2020, 44, 1–17. [Google Scholar]
  2. Törnblom, O. Managing Complexity in Organizations: Analyzing and Discussing a Managerial Perspective on the Nature of Organizational Leadership. Behav. Dev. 2018, 23, 51–62. [Google Scholar] [CrossRef]
  3. Legner, C.; Eymann, T.; Hess, T.; Matt, C.; Böhmann, T.; Drews, P.; Maedche, A.; Urbach, N.; Ahlemann, F. Digitalization: Opportunity and Challenge for the Business and Information Systems Engineering Community. Bus. Inf. Syst. Eng. 2017, 59, 301–308. [Google Scholar] [CrossRef]
  4. Essaides, N. The Impact of Organizational Complexity on Finance Performance. Available online: thehackettgroup.com/blog/the-impact-of-organizational-complexity-on-finance-performance/ (accessed on 15 August 2021).
  5. Schneider, A.; Zec, M.; Matthes, F. Adopting Notions of Complexity for Enterprise Architecture Management. In Proceedings of the 20th Americas Conference on Information Systems, Savannah, GA, USA, 7–9 August 2014; Association for Information Systems: Atlanta, GA, USA, 2014; pp. 1–10. [Google Scholar]
  6. Bergman, T.J.; Beehner, J.C. Measuring social complexity. Anim. Behav. 2015, 103, 203–209. [Google Scholar] [CrossRef]
  7. McCabe, T.J. A Complexity Measure. IEEE Trans. Softw. Eng. 1976, 2, 308–320. [Google Scholar] [CrossRef]
  8. Chronéer, D.; Bergquist, B. Managerial Complexity in Process Industrial R&D Projects: A Swedish Study. Proj. Manag. J. 2012, 43, 21–36. [Google Scholar]
  9. Daryani, S.M.; Amini, A. Management and Organizational Complexity. Procedia—Soc. Behav. Sci. 2016, 230, 359–366. [Google Scholar] [CrossRef] [Green Version]
  10. Liu, P.; Li, Z. Task complexity: A review and conceptualization framework. Int. J. Ind. Ergon. 2012, 42, 553–568. [Google Scholar] [CrossRef]
  11. Bonner, S.E. A model of the effects of audit task complexity. Account. Organ. Soc. 1994, 19, 213–234. [Google Scholar] [CrossRef]
  12. Kohr, D.; Budde, L.; Friedli, T. Identifying Complexity Drivers in Discrete Manufacturing and Process Industry. Procedia CIRP 2017, 63, 52–57. [Google Scholar] [CrossRef]
  13. Leavitt, H. Applied organizational change in industry: Structural, technological, and humanistic approaches. In Handbook of Organizations; Routledge: London, UK, 1965; Volume 264. [Google Scholar]
  14. Prodan, M.; Prodan, A.; Purcarea, A. Three new dimensions to people, process, technology improvement model. In New Contributions in Information Systems and Technologies; Springer: Cham, Switzerland, 2015; Volume 353, pp. 481–490. [Google Scholar]
  15. Zwicky, F. Discovery, Invention, Research through the Morphological Approach; The Macmillan Company: Toronto, ON, Canada, 1969. [Google Scholar]
  16. Van Solingen, R.; Basili, V.; Caldiera, G.; Rombach, D. Goal question metric (GQM) approach. In Encyclopedia of Software Engineering; Wiley: New York, NY, USA, 2002. [Google Scholar]
  17. Arora, S.; Barak, B. Computational Complexity—A Modern Approach; Cambridge University Press: Cambridge, UK, 2009. [Google Scholar]
  18. Grobman, G. Complexity theory: A new way to look at organizational change. Public Adm. Q. 2005, 29, 350–382. [Google Scholar]
  19. Miestamo, M.; Sinnemäki, K.; Karlsson, F. Language Complexity: Typology, Contact, Change; John Benjamins Publishing: Amsterdam, The Netherlands, 2008; Volume 94. [Google Scholar]
  20. Edmonds, B. What is complexity? The philosophy of complexity per se with application to some examples in evolution. In The Evolution of Complexity; Kluwer: Dordrecht, The Netherlands, 1995. [Google Scholar]
  21. Campbell, D. Task complexity: A review and analysis. Acad. Manag. Rev. 1988, 13, 40–52. [Google Scholar] [CrossRef]
  22. Poutanen, P.; Soliman, W.; Ståhle, P. The complexity of innovation: An assessment and review of the complexity perspective. Eur. J. Innov. Manag. 2016, 19, 189–213. [Google Scholar] [CrossRef]
  23. Milch, V.; Laumann, K. Interorganizational complexity and organizational accident risk: A literature review. Saf. Sci. 2016, 82, 9–17. [Google Scholar] [CrossRef]
  24. Vogel, W.; Lasch, R. Complexity drivers in manufacturing companies: A literature review. Logist. Res. 2016, 9, 25. [Google Scholar] [CrossRef]
  25. Efatmaneshnik, M.; Handley, H. Revisiting task complexity: A comprehensive framework. In Proceedings of the 2018 Annual IEEE International Systems Conference, Vancouver, BC, Canada, 23–26 April 2018; pp. 1–4. [Google Scholar]
  26. Hvam, L.; Hansen, C.L.; Forza, C.; Mortensen, N.H.; Haug, A. The reduction of product and process complexity based on the quantification of product complexity costs. Int. J. Prod. Res. 2020, 58, 350–366. [Google Scholar] [CrossRef]
  27. Augusto, A.; Mendling, J.; Vidgof, M.; Wurm, B. The Connection between Process Complexity of Event Sequences and Models discovered by Process Mining. arXiv 2021, arXiv:2106.07990. [Google Scholar]
  28. Fernández-Cerero, D.; Varela-Vaca, Á.J.; Fernández-Montes, A.; López, M.T.G.; Álvarez-Bermejo, J.A. Measuring data-centre workflows complexity through process mining: The Google cluster case. J. Supercomput. 2020, 76, 2449–2478. [Google Scholar] [CrossRef]
  29. Polancic, G.; Cegnar, B. Complexity metrics for process models—A systematic literature review. Comput. Stand. Interfaces 2017, 51, 104–117. [Google Scholar] [CrossRef]
  30. Rizun, N.; Revina, A.; Meister, V.G. Assessing business process complexity based on textual data: Evidence from ITIL IT ticket processing. Bus. Process. Manag. J. 2021. [Google Scholar] [CrossRef]
  31. Damasiotis, V.; Fitsilis, P.; O’Kane, J.F. Modeling Software Development Process Complexity. Int. J. Inf. Technol. Proj. Manag. 2018, 9, 17–40. [Google Scholar] [CrossRef]
  32. Zuse, H. Software Complexity: Measures and Methods; W. de Gruyter: New York, NY, USA, 1991. [Google Scholar]
  33. Antinyan, V.; Staron, M.; Sandberg, A. Evaluating code complexity triggers, use of complexity measures and the influence of code complexity on maintenance time. Empir. Softw. Eng. 2017, 22, 3057–3087. [Google Scholar] [CrossRef] [Green Version]
  34. Bosch-Rekveldt, M.; Bakker, H.; Hertogh, M. Comparing Project Complexity across Different Industry Sectors. Complexity 2018, 2018, 3246508. [Google Scholar] [CrossRef] [Green Version]
  35. Morcov, S.; Pintelon, L.; Kusters, R. Definitions, characteristics and measures of IT Project Complexity—A Systematic Literature Review. Int. J. Inf. Syst. Proj. Manag. 2020, 8, 5–21. [Google Scholar]
  36. Campos, P.F.; Trucco, P.; Huatuco, L.H. Managing structural and dynamic complexity in supply chains: Insights from four case studies. Prod. Plan. Control 2019, 30, 611–623. [Google Scholar] [CrossRef]
  37. Aitken, J.; Bozarth, C.; Garn, W. To eliminate or absorb supply chain complexity: A conceptual model and case study. Prod. Plan. Control 2016, 21, 759–774. [Google Scholar] [CrossRef]
  38. Reeves, M.; Levin, S.; Fink, T.; Levina, A. Taming Complexity. Make Sure the Benefits of Any Addition to an Organization’s Systems Outweigh Its Costs. Published January-February 2020. Available online: https://hbr.org/2020/01/taming-complexity (accessed on 9 October 2021).
  39. Uhl-Bien, M.; Arena, M. Complexity leadership: Enabling people and organizations for adaptability. Organ. Dyn. 2017, 46, 4–20. [Google Scholar] [CrossRef]
  40. Watkins, D.; Earnhardt, M.; Pittenger, L.; Roberts, R.; Rietsema, K.; Cosman-Ross, J. Thriving in Complexity: A Framework for Leadership Education. J. Leadersh. Educ. 2017, 16, 148. [Google Scholar] [CrossRef] [Green Version]
  41. Afsar, B.; Umrani, W.A. Transformational leadership and innovative work behavior: The role of motivation to learn, task complexity and innovation climate. Eur. J. Innov. Manag. 2020, 23, 402–428. [Google Scholar] [CrossRef]
  42. Axelos. Information Technology Infrastructure Library; Technical Report; The Stationery Office: London, UK, 2019. [Google Scholar]
  43. Mintzberg, H. The Structuring of Organizations. In Readings in Strategic Management; Macmillan Education UK: London, UK, 1989; pp. 322–352. [Google Scholar]
  44. Lunenburg, F.C. Organizational Structure: Mintzberg’s Framework. Int. J. Sch. Acad. Intellect. Divers. 2012, 14, 1–7. [Google Scholar]
  45. Lu, Y.; Luo, L.; Wang, H.; Le, Y.; Shi, Q. Measurement model of project complexity for large-scale projects from task and organization perspective. Int. J. Proj. Manag. 2015, 33, 610–622. [Google Scholar] [CrossRef]
  46. Wood, R. Task complexity: Definition of the construct. Organ. Behav. Hum. Decis. Process. 1986, 37, 60–82. [Google Scholar] [CrossRef]
  47. Cardoso, J.; Mendling, J.; Neumann, G.; Reijers, H. A discourse on complexity of process models. In Proceedings of the International Conference on Business Process Management, Vienna, Austria, 4–7 September 2006; Springer: Berlin/Heidelberg, Germany, 2006; pp. 117–128. [Google Scholar]
  48. Rizkallah, J. The Big (Unstructured) Data Problem. Available online: https://www.forbes.com/sites/forbestechcouncil/2017/06/05/the-big-unstructured-data-problem/#1ccea420493a (accessed on 15 August 2021).
  49. Revina, A. Business Process Management: Integrated Data Perspective. A Framework and Research Agenda. In Proceedings of the Information Systems Development: Crossing Boundaries between Development and Operations (DevOps) in Information Systems (ISD2021 Proceedings), Valencia, Spain, 8–10 September 2021; Association for Information Systems: Atlanta, GA, USA, 2021. [Google Scholar]
  50. Revina, A. Considering Business Process Complexity through the Lens of Textual Data. In Proceedings of the Sixteenth International Multi-Conference on Computing in the Global Information Technology (ICCGI 2021), Nice, France, 18–22 July 2021. [Google Scholar]
  51. Kitchenham, B. Procedures for Performing Systematic Reviews; Technical Report; Keele University: Keele, UK, 2004. [Google Scholar]
  52. Templier, M.; Paré, G. A framework for guiding and evaluating literature reviews. Commun. Assoc. Inf. Syst. 2015, 37, 113–137. [Google Scholar] [CrossRef]
  53. Bandara, W.; Furtmueller, E.; Gorbacheva, E.; Miskon, S.; Beekhuyzen, J. Achieving rigor in literature reviews: Insights from qualitative data analysis and tool-support. Commun. Assoc. Inf. Syst. 2015, 34, 155–204. [Google Scholar] [CrossRef] [Green Version]
  54. Gusenbauer, M. Google Scholar to overshadow them all? Comparing the sizes of 12 academic search engines and bibliographic databases. Scientometrics 2019, 118, 177–214. [Google Scholar] [CrossRef] [Green Version]
  55. Schoknecht, A.; Thaler, T.; Fettke, P.; Oberweis, A.; Laue, R. Similarity of business process models—A state-of-the-art analysis. ACM Comput. Surv. (CSUR) 2017, 50, 1–33. [Google Scholar] [CrossRef]
  56. Thaler, T.; Ternis, S.F.; Fettke, P.; Loos, P. A Comparative Analysis of Process Instance Cluster Techniques. Wirtschaftsinformatik 2015, 2015, 423–437. [Google Scholar]
  57. Kaplan, R.; Norton, D. The Balanced Scorecard: Measures That Drive Performance. Available online: https://hbr.org/1992/01/the-balanced-scorecard-measures-that-drive-performance-2 (accessed on 15 August 2021).
  58. Buglione, L.; Abran, A. Balanced Scorecards and GQM: What are the differences. In Proceedings of the Third European Software Measurement Conference (FESMA-AEMES 2000), Madrid, Spain, 18–20 October 2000; pp. 18–20. [Google Scholar]
  59. Martinsons, M.; Davison, R.; Tse, D. The balanced scorecard: A foundation for the strategic management of information systems. Decis. Support Syst. 1999, 25, 71–88. [Google Scholar] [CrossRef]
  60. Ghani, A.A.; Wei, K.T.; Muketha, G.M.; Wen, W.P. Complexity metrics for measuring the understandability and maintainability of business process models using goal-question-metric (GQM). Int. J. Comput. Sci. Netw. Secur. 2008, 8, 219–225. [Google Scholar]
  61. Cyra, Ł.; Górski, J. Extending GQM by argument structures. In Proceedings of the IFIP Central and East European Conference on Software Engineering Techniques, Poznan, Poland, 10–12 October 2007; Springer: Berlin/Heidelberg, Germany, 2007; pp. 26–39. [Google Scholar]
  62. Fenton, N.; Pfleeger, S. Software Metrics: A Rigorous and Practical Approach; PWS Publishing Co.: Boston, MA, USA, 1997. [Google Scholar]
  63. Berander, P.; Jönsson, P. A Goal Question Metric Based Approach for Efficient Measurement Framework Definition. In Proceedings of the 2006 ACM/IEEE International Symposium on Empirical Software Engineering, Rio de Janeiro, Brazil, 21–22 September 2006; Association for Computing Machinery: New York, NY, USA, 2006; pp. 316–325. [Google Scholar]
  64. Sanaa, H.; Afifi, W.; Darwish, N.R. The goal questions metrics for agile business intelligence. Egypt. Comput. Sci. J. 2016, 40, 24–42. [Google Scholar]
  65. Jarke, M.; Jeusfeld, M.A.; Quix, C.; Vassiliadis, P. Architecture and Quality in Data Warehouses: An Extended Repository Approach. Inf. Syst. 1999, 24, 229–253. [Google Scholar] [CrossRef] [Green Version]
  66. Olson, E.G.; Reger, S.J.M.; Singer, D.D. Finding and reducing needless complexity. Horizon 2010, 18, 53–61. [Google Scholar] [CrossRef]
  67. Murthy, U.; Kerr, D. Comparing audit team effectiveness via alternative modes of computer-mediated communication. Audit. J. Pract. Theory 2004, 23, 141–152. [Google Scholar] [CrossRef]
  68. Rana, A.; Turoff, M.; Hiltz, S.R. Task and technology interaction (TTI): A theory of technological support for group tasks. In Proceedings of the Thirtieth Hawaii International Conference on System Sciences, Wailea, HI, USA, 7–10 January 1997; Volume 2, pp. 66–75. [Google Scholar]
  69. Yang, J.; Redi, J.; Demartini, G.; Bozzon, A. Modeling task complexity in crowdsourcing. In Proceedings of the Fourth AAAI Conference on Human Computation and Crowdsourcing, Austin, TX, USA, 30 October–3 November 2016; AAAI Press: Menlo Park, CA, USA, 2016; pp. 249–258. [Google Scholar]
  70. Park, J. Complexity of Proceduralized Tasks. In Series in Reliability Engineering; Springer: London, UK, 2009; pp. 13–21. [Google Scholar]
  71. Ham, D.; Park, J.; Jung, W. Model-based identification and use of task complexity factors of human integrated systems. Reliab. Eng. Syst. Saf. 2012, 100, 33–47. [Google Scholar] [CrossRef]
  72. Payne, J. Task complexity and contingent processing in decision making: An information search and protocol analysis. Organ. Behav. Hum. Perform. 1976, 16, 366–387. [Google Scholar] [CrossRef]
  73. Saastamoinen, M.; Kumpulainen, S.; Vakkari, P.; Järvelin, K. Task complexity affects information use: A questionnaire study in city administration. Inf. Res. 2013, 19, 592. [Google Scholar]
  74. Syed, R.; Suriadi, S.; Adams, M.; Bandara, W.; Leemans, S.; Ouyang, C.; ter Hofstede, A.; van de Weerd, I.; Wynn, M.T.; Reijers, H. Robotic Process Automation: Contemporary themes and challenges. Comput. Ind. 2020, 115, 103162. [Google Scholar] [CrossRef]
  75. Sia, S.K.; Appu, A.V. Work Autonomy and Workplace Creativity: Moderating Role of Task Complexity. Glob. Bus. Rev. 2015, 16, 772–784. [Google Scholar] [CrossRef]
  76. Gutierrez, C.; Hussein, B. An extended literature review of organizational factors impacting project management complexity. In Proceedings of the 28th IPMA World Congress: Innovation through Dialogue (IPMA), Rotterdam, The Netherlands, 29 September–1 October 2014; IPMA: Amsterdam, The Netherlands, 2014; pp. 1–10. [Google Scholar]
  77. Botchkarev, A.; Finnigan, P. Complexity in the Context of Systems Approach to Project Management. Organ. Proj. Manag. 2015, 2, 15–34. [Google Scholar]
  78. Vidal, L.A.; Marle, F.; Bocquet, J.C. Building up a project complexity framework using an international Delphi study. Int. J. Technol. Manag. 2013, 62, 251–283. [Google Scholar] [CrossRef]
  79. Gul, S.; Khan, S. Revisiting project complexity: Towards a comprehensive model of project complexity. In Proceedings of the 2nd International Conference on Construction and Project Management, Singapore, 16–18 September 2011; IACSIT Press: Singapore, 2011; pp. 148–155. [Google Scholar]
  80. Diao, Y.; Bhattacharya, K. Estimating business value of IT services through process complexity analysis. In Proceedings of the IEEE Network Operations and Management Symposium, Salvador, Brazil, 7–11 April 2008; IEEE: New York, NY, USA, 2008; pp. 208–215. [Google Scholar]
  81. Diao, Y.; Keller, A. Quantifying the Complexity of IT Service Management Processes. In Proceedings of the 17th IFIP/IEEE International Workshop on Distributed Systems: Operations and Management, Dublin, Ireland, 23–25 October 2006; Springer: Berlin/Heidelberg, Germany, 2006; Volume 4269, pp. 61–73. [Google Scholar]
  82. Rennung, F.; Luminosu, C.; Draghici, A.; Paschek, D. An Evaluation of Strategic Methods of Complexity Management to Manage Large Outsourcing Projects Successfully. In Managing Innovation and Diversity in Knowledge Society through Turbulent Time, Proceedings of the MakeLearn and TIIM Joint International Conference, Timisoara, Romania, 25–27 May 2016; ToKnowPress: Bangkok, Thailand; Celje, Slovenia; Lublin, Poland, 2016; pp. 79–88. [Google Scholar]
  83. Schäfermeyer, M.; Rosenkranz, C.; Holten, R. The impact of business process complexity on business process standardization. Bus. Inf. Syst. Eng. 2012, 4, 261–270. [Google Scholar] [CrossRef]
  84. Kim, J.; Wilemon, D. Sources and assessment of complexity in NPD projects. R D Manag. 2003, 33, 15–30. [Google Scholar] [CrossRef]
  85. Mattsson, S.; Gullander, P.; Harlin, U.; Bäckstrand, G.; Fasth, Å.; Davidsson, A. Testing complexity index—A method for measuring perceived production complexity. Procedia CIRP 2012, 3, 394–399. [Google Scholar] [CrossRef] [Green Version]
  86. Latva-Koivisto, A. Finding a Complexity Measure for Business Process Models; Technical Report; Helsinki University of Technology, Systems Analysis Laboratory: Helsinki, Finland, 2001. [Google Scholar]
  87. Hinz, G.; Chen, G.; Aafaque, M.; Röhrbein, F.; Conradt, J.; Bing, Z.; Qu, Z.; Stechele, W.; Knoll, A. Online multi-object tracking-by-clustering for intelligent transportation system with neuromorphic vision sensor. In Proceedings of the Joint German/Austrian Conference on Artificial Intelligence (Künstliche Intelligenz), Dortmund, Germany, 25–29 September 2017; Springer: Cham, Switzerland, 2017; pp. 142–154. [Google Scholar]
  88. Riegler, A.; Holzmann, C. UI-CAT: Calculating user interface complexity metrics for mobile applications. In Proceedings of the 14th International Conference on Mobile and Ubiquitous Multimedia, Linz, Austria, 30 November–2 December 2015; Association for Computing Machinery: New York, NY, USA, 2015; pp. 390–394. [Google Scholar]
  89. Chen, Z.; Suen, C. Measuring the complexity of rule-based expert systems. Expert Syst. Appl. 1994, 7, 467–481. [Google Scholar] [CrossRef]
  90. Bansal, V.; Negi, T. A metric for ERP complexity. In Proceedings of the International Conference on Business Information Systems, Innsbruck, Austria, 5–7 May 2008; Springer: Berlin/Heidelberg, Germany, 2008; pp. 369–379. [Google Scholar]
  91. Courtis, J.K. Readability of annual reports: Western versus Asian evidence. Account. Audit. Account. J. 1995, 8, 4–17. [Google Scholar] [CrossRef]
  92. Pitt, L.; Sattari, S.; Bevelander, D. Are business school mission statements readable?: Evidence from the top 100. J. Strateg. Manag. Educ. 2010, 6, 1–16. [Google Scholar]
  93. Mills, A.; Pitt, L.; Sattari, S. Reading between the vines: Analyzing the readability of consumer brand wine web sites. Int. J. Wine Bus. Res. 2012, 24, 169–182. [Google Scholar] [CrossRef]
  94. Zhou, Y.; Yang, S. Roles of review numerical and textual characteristics on review helpfulness across three different types of reviews. IEEE Access 2019, 7, 27769–27780. [Google Scholar] [CrossRef]
  95. Rugimbana, R.; Patel, C. The application of the marketing concept in textbook selection: Using the Cloze procedure. J. Mark. Educ. 1996, 18, 14–20. [Google Scholar] [CrossRef]
  96. Barrot, J.S. Revisiting the role of linguistic complexity in ESL reading comprehension. 3L Southeast Asian J. Engl. Lang. Stud. 2013, 19, 5–18. [Google Scholar]
  97. Van der Sluis, F.; van den Broek, E.; Glassey, R.; van Dijk, E.; de Jong, F. When complexity becomes interesting. J. Assoc. Inf. Sci. Technol. 2014, 65, 1478–1500. [Google Scholar] [CrossRef]
  98. Eurostat. Statistical Classification of Economic Activities in the European Community; Technical Report; Eurostat: Luxembourg, 2020. [Google Scholar]
  99. Serrano, M.; Calero, C.; Piattini, M. Validating metrics for data warehouses. IEE Proc.-Softw. 2002, 149, 161–166. [Google Scholar] [CrossRef]
  100. Devpriya, S.; Ritu, S.; Kumar, M. A Framework for Validation of Object-Oriented Design Metrics. Int. J. Comput. Sci. Inf. Secur. 2009, 6, 46–52. [Google Scholar]
  101. Briand, L.; Differding, C.; Rombach, D. Practical guidelines for measurement-based process improvement. Softw. Process. Improv. Pract. 1996, 2, 253–280. [Google Scholar] [CrossRef]
  102. Weyuker, E. Evaluating software complexity measures. IEEE Trans. Softw. Eng. 1988, 14, 1357–1365. [Google Scholar] [CrossRef]
  103. Muketha, G.; Ghani, A.A.A.; Selamat, M.H.; Atan, R. A Survey of Business Process Complexity Metrics. Inf. Technol. J. 2010, 9, 1336–1344. [Google Scholar] [CrossRef] [Green Version]
  104. Krenn, F. Dealing with Process Complexity: A Multiperspective Approach. In Proceedings of the 10th International Conference on Subject-Oriented Business Process Management, Linz, Austria, 5–6 April 2018; Association for Computing Machinery: New York, NY, USA, 2018; pp. 1–10. [Google Scholar]
  105. Cardoso, J. Complexity analysis of BPEL web processes. Softw. Process. Improv. Pract. 2007, 12, 35–49. [Google Scholar] [CrossRef]
  106. Chae, B. A complexity theory approach to IT-enabled services (IESs) and service innovation: Business analytics as an illustration of IES. Decis. Support Syst. 2014, 57, 1–10. [Google Scholar] [CrossRef]
  107. Schoenherr, T.; Hilpert, D.; Soni, A.; Venkataramanan, M.; Mabert, V. Enterprise systems complexity and its antecedents: A grounded-theory approach. Int. J. Oper. Prod. Manag. 2010, 30, 639–668. [Google Scholar] [CrossRef]
  108. Suen, C.Y.; Grogono, P.D.; Shinghal, R.; Coallier, F. Verifying, validating, and measuring the performance of expert systems. Expert Syst. Appl. 1990, 1, 93–102. [Google Scholar] [CrossRef]
  109. Alemerien, K.; Magel, K. GUIEvaluator: A Metric-tool for Evaluating the Complexity of Graphical User Interfaces. In Proceedings of the 26th International Conference on Software Engineering and Knowledge Engineering, Vancouver, BC, Canada, 1–3 July 2013; KSI Research Inc.: Skokie, IL, USA, 2014; pp. 13–18. [Google Scholar]
  110. Basili, V.; Hutchens, D. An empirical study of a syntactic complexity family. IEEE Trans. Softw. Eng. 1983, SE-9, 664–672. [Google Scholar] [CrossRef]
  111. Bargate, K. The readability of managerial accounting and financial management textbooks. Meditari Account. Res. 2012, 20, 4–20. [Google Scholar] [CrossRef]
  112. Leopold, H.; van der Aa, H.; Reijers, H. Identifying candidate tasks for robotic process automation in textual process descriptions. In Enterprise, Business-Process and Information Systems Modeling, Proceedings of the BPMDS 2018, EMMSAD 2018, Tallinn, Estonia, 11–12 June 2018; Lecture Notes in Business Information Processing; Gulden, J., Reinhartz-Berger, I., Schmidt, R., Guerreiro, S., Guedria, W., Bera, P., Eds.; Springer: Cham, Switzerland, 2018; pp. 67–81. [Google Scholar]
  113. Baccarini, D. The concept of project complexity—A review. Int. J. Proj. Manag. 1996, 14, 201–204. [Google Scholar] [CrossRef] [Green Version]
  114. Koziolek, H. Goal, question, metric. In Dependability Metrics; Springer: Cham, Switzerland, 2008; pp. 39–42. [Google Scholar]
  115. Kraus, S.; Breier, M.; Dasí-Rodríguez, S. The art of crafting a systematic literature review in entrepreneurship research. Int. Entrep. Manag. J. 2020, 16, 1023–1042. [Google Scholar] [CrossRef] [Green Version]
  116. Kilpi, T. Implementing a software metrics program at Nokia. IEEE Softw. 2001, 18, 72–77. [Google Scholar] [CrossRef]
  117. Van de Schoot, R.; de Bruin, J.; Schram, R.; Zahedi, P.; de Boer, J.; Weijdema, F.; Kramer, B.; Huijts, M.; Hoogerwerf, M.; Ferdinands, G.; et al. An open source machine learning framework for efficient and transparent systematic reviews. Nat. Mach. Intell. 2021, 3, 125–133. [Google Scholar] [CrossRef]
  118. Modrick, J. Review of concepts and approaches to complexity. In Proceedings of the Human Factors Society Annual Meeting, Atlanta, GA, USA, 12–16 October 1992; SAGE Publications: Los Angeles, CA, USA, 1992; Volume 36, pp. 1166–1170. [Google Scholar]
  119. Byström, K.; Järvelin, K. Task complexity affects information seeking and use. Inf. Process. Manag. 1995, 31, 191–213. [Google Scholar] [CrossRef]
  120. Meyer, M.; Curley, K.F. The impact of knowledge and technology complexity on information systems development. Expert Syst. Appl. 1995, 8, 111–134. [Google Scholar] [CrossRef]
  121. Anderson, P. Perspective: Complexity theory and organization science. Organ. Sci. 1999, 10, 216–232. [Google Scholar] [CrossRef] [Green Version]
  122. Brockmann, C.; Girmscheid, G. Complexity of megaprojects. In Proceedings of the CIB World Building Congress: Construction for Development, Cape Town, South Africa, 14–17 May 2007; CIB: Ottawa, ON, Canada, 2007; pp. 219–230. [Google Scholar]
  123. Girmscheid, G.; Brockmann, C. The inherent complexity of large scale engineering projects. Proj. Perspect. 2008, 29, 22–26. [Google Scholar]
  124. Senescu, R.R.; Guillermo, A.M.; Haymaker, J.R. Relationships between project complexity and communication. J. Manag. Eng. 2013, 29, 183–197. [Google Scholar] [CrossRef]
  125. Mocker, M.; Weill, P.; Woerner, S. Revisiting complexity in the digital age. MIT Sloan Manag. Rev. 2014, 55, 73–81. [Google Scholar] [CrossRef]
  126. Gorzeń-Mitka, I.; Okręglicka, M. Review of complexity drivers in enterprise. In Proceedings of the Liberec Economic Forum, Liberec, Czech Republic, 16–17 September 2015; Technical University of Liberec: Liberec, Czech Republic, 2015; pp. 253–260. [Google Scholar]
  127. Rennung, F.M. Managing Complexity in Service Processes. The Case of Large Business Organizations. Ph.D. Thesis, Politehnica University of Timisoara, Timisoara, Romania, 2016. [Google Scholar]
  128. Luo, L.; He, Q.; Jaselskis, E.; Xie, J. Construction project complexity: Research trends and implications. J. Constr. Eng. Manag. 2017, 143, 04017019. [Google Scholar] [CrossRef]
  129. Basile, G.; Kaufmann, H.R.; Savastano, M. Revisiting complexity theory to achieve strategic intelligence. Int. J. Foresight Innov. Policy 2018, 13, 57–70. [Google Scholar] [CrossRef]
  130. Latos, B.A.; Harlacher, M.; Burgert, F.; Nitsch, V.; Przybysz, P.; Mütze-Niewöhner, S. Complexity drivers in digitalized work systems: Implications for cooperative forms of work. Adv. Sci. Technol. Eng. Syst. 2018, 3, 171–185. [Google Scholar] [CrossRef] [Green Version]
  131. Woodward, M.R.; Hennell, M.A.; Hedley, D. A measure of control flow complexity in program text. IEEE Trans. Softw. Eng. 1979, SE-5, 45–50. [Google Scholar] [CrossRef]
  132. Henry, S.; Kafura, D. Software structure metrics based on information flow. IEEE Trans. Softw. Eng. 1981, SE-7, 510–518. [Google Scholar] [CrossRef]
  133. Troy, D.; Zweben, S. Measuring the quality of structured designs. J. Syst. Softw. 1981, 2, 113–120. [Google Scholar] [CrossRef]
  134. Gibson, V.; Senn, J. System structure and software maintenance performance. Commun. ACM 1989, 32, 347–358. [Google Scholar] [CrossRef]
  135. Banker, R.; Datar, S.; Kemerer, C.; Zweig, D. Software complexity and maintenance costs. Commun. ACM 1993, 36, 81–95. [Google Scholar] [CrossRef]
  136. Chen, Z.; Suen, C.Y. Complexity metrics for rule-based expert systems. In Proceedings of the 1994 International Conference on Software Maintenance, Victoria, BC, Canada, 19–23 September 1994; IEEE: New York, NY, USA, 1994; pp. 382–391. [Google Scholar]
  137. Kang, H.; Seong, P. An information theory-based approach for quantitative evaluation of user interface complexity. IEEE Trans. Nucl. Sci. 1998, 45, 3165–3174. [Google Scholar] [CrossRef]
  138. Paakki, J.; Karhinen, A.; Gustafsson, J.; Nenonen, L.; Verkamo, I. Software metrics by architectural pattern mining. In Proceedings of the International Conference on Software: Theory and Practice (16th IFIP World Computer Congress), Beijing, China, 21–24 August 2000; pp. 325–332. [Google Scholar]
  139. Shao, J.; Wang, Y. A new measure of software complexity based on cognitive weights. Can. J. Electr. Comput. Eng. 2003, 28, 69–74. [Google Scholar] [CrossRef]
  140. Xing, J. Measures of Information Complexity and the Implications for Automation Design; Technical Report; Federal Aviation Administration: Washington, DC, USA, 2004. [Google Scholar]
  141. Xing, J.; Manning, C.A. Complexity and Automation Displays of Air Traffic Control: Literature Review and Analysis; Technical Report; Federal Aviation Administration: Washington, DC, USA, 2005. [Google Scholar]
  142. Cardoso, J. Business process quality metrics: Log-based complexity of workflow patterns. In Proceedings of the OTM Confederated International Conferences “On the Move to Meaningful Internet Systems”, Vilamoura, Portugal, 25–30 November 2007; Springer: Berlin/Heidelberg, Germany, 2007; pp. 427–434. [Google Scholar]
  143. Gruhn, V.; Laue, R. Approaches for Business Process Model Complexity Metrics. In Technologies for Business Information Systems; Springer: Dordrecht, The Netherlands, 2007; pp. 13–24. [Google Scholar]
  144. Mendling, J.; Neumann, G. Error Metrics for Business Process Models. In Proceedings of the International Conference on Advanced Information Systems Engineering Forum, Trondheim, Norway, 11–15 June 2007; Springer: Berlin/Heidelberg, Germany, 2007; Volume 247, pp. 53–56. [Google Scholar]
  145. Vanderfeesten, I.; Cardoso, J.; Reijers, H. A weighted coupling metric for business process models. In Proceedings of the International Conference on Advanced Information Systems Engineering Forum, Trondheim, Norway, 11–15 June 2007; Springer: Berlin/Heidelberg, Germany, 2007; Volume 247, pp. 41–44. [Google Scholar]
  146. Cardoso, J. Business process control-flow complexity: Metric, evaluation, and validation. Int. J. Web Serv. Res. 2008, 5, 49–76. [Google Scholar] [CrossRef] [Green Version]
  147. Vanderfeesten, I.; Reijers, H.; van der Aalst, W. Evaluating workflow process designs using cohesion and coupling metrics. Comput. Ind. 2008, 59, 420–437. [Google Scholar] [CrossRef] [Green Version]
  148. Vanderfeesten, I.; Reijers, H.; Mendling, J.; van der Aalst, W.; Cardoso, J. On a quest for good process models: The cross-connectivity metric. In Proceedings of the International Conference on Advanced Information Systems Engineering, Montpellier, France, 18–20 June 2008; Springer: Berlin/Heidelberg, Germany, 2008; pp. 480–494. [Google Scholar]
  149. Gao, S.; Li, C. Complex network model for software system and complexity measurement. In Proceedings of the WRI World Congress on Computer Science and Information Engineering, Los Angeles, CA, USA, 31 March–2 April 2009; Volume 7, pp. 624–628. [Google Scholar]
  150. Lassen, K.B.; van der Aalst, W. Complexity metrics for workflow nets. Inf. Softw. Technol. 2009, 51, 610–626. [Google Scholar] [CrossRef]
  151. Sánchez-González, L.; García, F.; Mendling, J.; Ruiz, F.; Piattini, M. Prediction of business process model quality based on structural metrics. In Proceedings of the International Conference on Conceptual Modeling, Vancouver, BC, Canada, 1–4 November 2010; Springer: Berlin/Heidelberg, Germany, 2010; pp. 458–463. [Google Scholar]
  152. Marchetto, A.; Francescomarino, C.D.; Tonella, P. Optimizing the trade-off between complexity and conformance in process reduction. In Proceedings of the International Symposium on Search Based Software Engineering, Szeged, Hungary, 10–12 September 2011; Springer: Berlin/Heidelberg, Germany, 2011; pp. 158–172. [Google Scholar]
  153. Posnett, D.; Hindle, A.; Devanbu, P. A simpler model of software readability. In Proceedings of the 8th Working Conference on Mining Software Repositories, Honolulu, HI, USA, 21–22 May 2011; Association for Computing Machinery: New York, NY, USA, 2011; pp. 73–82. [Google Scholar]
  154. Setiawan, M.A.; Sadiq, S. Integrated framework for business process complexity analysis. In Proceedings of the 21st European Conference on Information Systems, Utrecht, The Netherlands, 5–8 June 2013; Association for Information Systems: Atlanta, GA, USA, 2013; pp. 1–12. [Google Scholar]
  155. Schütz, A.; Widjaja, T.; Kaiser, J. Complexity in Enterprise Architectures-Conceptualization and Introduction of a Measure from a System Theoretic Perspective. In Proceedings of the 21st European Conference on Information Systems, Utrecht, The Netherlands, 5–8 June 2013; Association for Information Systems: Atlanta, GA, USA, 2013; pp. 1–12. [Google Scholar]
  156. Benner-Wickner, M.; Book, M.; Brückmann, T.; Gruhn, V. Examining case management demand using event log complexity metrics. In Proceedings of the IEEE 18th International Enterprise Distributed Object Computing Conference Workshops and Demonstrations, Ulm, Germany, 1–2 September 2014; pp. 108–115. [Google Scholar]
  157. Kluza, K.; Nalepa, G.; Lisiecki, J. Square complexity metrics for business process models. In Advances in Business ICT; Advances in Intelligent Systems and Computing; Mach-Król, M., Pełech-Pilichowski, T., Eds.; Springer: Cham, Switzerland, 2014; Volume 257, pp. 89–107. [Google Scholar]
  158. Miniukovich, A.; Angeli, A.D. Quantification of interface visual complexity. In Proceedings of the 2014 International Working Conference on Advanced Visual Interfaces, Como, Italy, 27–29 May 2014; Association for Computing Machinery: New York, NY, USA, 2014; pp. 153–160. [Google Scholar]
  159. Taba, S.E.S.; Keivanloo, I.; Zou, Y.; Ng, J.; Ng, T. An exploratory study on the relation between user interface complexity and the perceived quality. In Proceedings of the International Conference on Web Engineering, Toulouse, France, 1–4 July 2014; Springer: Cham, Switzerland, 2014; pp. 370–379. [Google Scholar]
  160. De Oca, I.M.M.; Snoeck, M.; Reijers, H.; Rodríguez-Morffi, A. A systematic literature review of studies on business process modeling quality. Inf. Softw. Technol. 2015, 58, 187–205. [Google Scholar] [CrossRef] [Green Version]
  161. Debnath, N.; Peralta, M.; Salgado, C.; Baigorria, L.; Montejano, G.; Riesco, D. Metrics and performance indicators to evaluate workflow processes on the cloud. In Proceedings of the IEEE/ACS 12th International Conference of Computer Systems and Applications, Marrakech, Morocco, 17–20 November 2015; pp. 1–6. [Google Scholar]
  162. Kluza, K. Measuring complexity of business process models integrated with rules. In Proceedings of the International Conference on Artificial Intelligence and Soft Computing, Zakopane, Poland, 14–18 June 2015; Springer: Cham, Switzerland, 2015; pp. 649–659. [Google Scholar]
  163. Marin, M.; Lotriet, H.; van der Poll, J. Metrics for the case management modeling and notation (CMMN) specification. In Proceedings of the 2015 Annual Research Conference on South African Institute of Computer Scientists and Information Technologists, Stellenbosch, South Africa, 28–30 September 2015; Association for Computing Machinery: New York, NY, USA, 2015; pp. 1–10. [Google Scholar]
  164. Müller, G.; Bergmann, R. Complexity-aware generation of workflows by process-oriented case-based reasoning. In Proceedings of the Joint German/Austrian Conference on Artificial Intelligence (Künstliche Intelligenz), Dortmund, Germany, 25–29 September 2017; Springer: Cham, Switzerland, 2017; pp. 207–221. [Google Scholar]
  165. Petrusel, R.; Mendling, J.; Reijers, H. How visual cognition influences process model comprehension. Decis. Support Syst. 2017, 96, 1–16. [Google Scholar] [CrossRef]
  166. Wehling, K.; Wille, D.; Seidl, C.; Schaefer, I. Decision support for reducing unnecessary IT complexity of application architectures. In Proceedings of the 2017 IEEE International Conference on Software Architecture Workshops, Gothenburg, Sweden, 5–7 April 2017; pp. 161–168. [Google Scholar]
  167. Mannan, U.A.; Ahmed, I.; Sarma, A. Towards understanding code readability and its impact on design quality. In Proceedings of the 4th ACM SIGSOFT International Workshop on NLP for Software Engineering, Lake Buena Vista, FL, USA, 4 November 2018; Association for Computing Machinery: New York, NY, USA, 2018; pp. 18–21. [Google Scholar]
  168. Nalepa, G. Integrating Business Process Models with Rules. In Modeling with Rules Using Semantic Knowledge Engineering; Springer: Cham, Switzerland, 2018; pp. 313–337. [Google Scholar]
  169. Hasić, F.; Vanthienen, J. Complexity metrics for DMN decision models. Comput. Stand. Interfaces 2019, 65, 15–37. [Google Scholar] [CrossRef]
  170. Jones, M.J.; Shoemaker, P.A. Accounting narratives: A review of empirical studies of content and readability. J. Account. Lit. 1994, 13, 142–184. [Google Scholar]
  171. Jones, M.J. Readability of annual reports: Western versus Asian evidence—A comment to contextualize. Account. Audit. Account. J. 1996, 9, 86–91. [Google Scholar] [CrossRef]
  172. Courtis, J.K. Annual report readability variability: Tests of the obfuscation hypothesis. Account. Audit. Account. J. 1998, 11, 459–472. [Google Scholar] [CrossRef]
  173. Hay, D. Communication in auditors’ reports: Variations in readability and the effect of audit firm structure. Asia-Pac. J. Account. 1998, 5, 179–197. [Google Scholar] [CrossRef]
  174. Sydserff, R.; Weetman, P. A texture index for evaluating accounting narratives: An alternative to readability formulas. Account. Audit. Account. J. 1999, 12, 459–488. [Google Scholar] [CrossRef]
  175. Clatworthy, M.; Jones, M.J. The effect of thematic structure on the variability of annual report readability. Account. Audit. Account. J. 2001, 14, 311–326. [Google Scholar] [CrossRef]
  176. Iu, J.; Clowes, C. Approaches to evaluating accounting narratives: A corporate social responsibility perspective. In Proceedings of the Governance and Social Responsibility Conference, Burwood, Australia, 26–27 November 2001; Deakin University: Burwood, Australia, 2001; pp. 111–123. [Google Scholar]
  177. Leong, E.; Ewing, M.; Pitt, L. E-comprehension: Evaluating B2B websites using readability formulae. Ind. Mark. Manag. 2002, 31, 125–131. [Google Scholar] [CrossRef]
  178. Rutherford, B. Obfuscation, textual complexity and the role of regulated narrative accounting disclosure in corporate governance. J. Manag. Gov. 2003, 7, 187–210. [Google Scholar] [CrossRef]
  179. Iu, J.; Clowes, C. Evaluating a measure of content quality for accounting narratives (with an empirical application to narratives from Australia, Hong Kong, and the United States). In Proceedings of the Fourth Asia Pacific Interdisciplinary Research in Accounting Conference APIRA 2004 Proceedings, Singapore, 4–6 July 2004; Nanyang Technological University: Singapore, 2004; pp. 1–21. [Google Scholar]
  180. Mohamad, R.; Rahman, A.A. Readability of corporate annual reports of top 100 Malaysian companies. Malays. Manag. J. 2006, 10, 33–47. [Google Scholar] [CrossRef]
  181. Hewaidy, A. Readability of financial statement footnotes of Kuwaiti corporations. Eur. J. Econ. Financ. Adm. Sci. 2007, 8, 18–28. [Google Scholar]
  182. Clarke, D.; Hrasky, S.; Tan, C. Voluntary narrative disclosures by local governments: A comparative analysis of the textual complexity of mayoral and chairpersons’ letters in annual reports. Aust. J. Public Adm. 2009, 68, 194–207. [Google Scholar] [CrossRef]
  183. Hrasky, S.; Mason, C.; Wills, D. The textual complexity of annual report narratives: A comparison of high-and low-performance companies. N. Z. J. Appl. Bus. Res. 2009, 7, 31–45. [Google Scholar]
  184. Sawyer, A. Enhancing compliance through improved readability: Evidence from New Zealand’s rewrite “experiment”. In Recent Research on Tax Administration and Compliance; Gangi, M.E., Plumley, A., Eds.; IRS Research Bulletin: Washington, DC, USA, 2010; pp. 31–56. [Google Scholar]
  185. Sattari, S.; Pitt, L.; Caruana, A. How readable are mission statements? An exploratory study. Corp. Commun. Int. J. 2011, 16, 282–292. [Google Scholar] [CrossRef]
  186. Stone, G.W. Readability of accountants’ communications with small business—Some Australian evidence. Account. Forum 2011, 35, 247–261. [Google Scholar] [CrossRef]
  187. Korfiatis, N.; García-Bariocanal, E.; Sánchez-Alonso, S. Evaluating content quality and helpfulness of online product reviews: The interplay of review helpfulness vs. review content. Electron. Commer. Res. Appl. 2012, 11, 205–217. [Google Scholar] [CrossRef]
  188. Sattari, S. Essays on the Issues of Readability in Business Disciplines. Ph.D. Thesis, Luleå University of Technology, Luleå, Sweden, 2012. [Google Scholar]
  189. Rameezdeen, R.; Rodrigo, A. Textual complexity of standard conditions used in the construction industry. Constr. Econ. Build. 2013, 13, 1–12. [Google Scholar] [CrossRef] [Green Version]
  190. Sattari, S.; Wallström, Å. Tourism websites in the Middle East: Readable or not? Int. J. Leis. Tour. Mark. 2013, 3, 201–214. [Google Scholar] [CrossRef]
  191. Stone, G.; Parker, L. Developing the Flesch reading ease formula for the contemporary accounting communications landscape. Qual. Res. Account. Manag. 2013, 10, 31–59. [Google Scholar] [CrossRef]
  192. Goldman, S.R.; Lee, C.D. Text complexity: State of the art and the conundrums it raises. Elem. Sch. J. 2014, 115, 290–300. [Google Scholar] [CrossRef]
  193. Jones, M.; Smith, M. Traditional and alternative methods of measuring the understandability of accounting narratives. Account. Audit. Account. J. 2014, 27, 183–208. [Google Scholar] [CrossRef]
  194. Richards, G.; Fisher, R.; van Staden, C. Readability and Thematic Manipulation in Corporate Communications: A Multi-Disclosure Investigation. In Proceedings of the 2015 Conference Accounting and Finance Association of Australia and New Zealand, Hobart, Australia, 5–7 July 2015; Accounting and Finance Association of Australia and New Zealand: Melbourne, Australia, 2015; pp. 1–38. [Google Scholar]
  195. Ajina, A.; Laouiti, M.; Msolli, B. Guiding through the Fog: Does annual report readability reveal earnings management? Res. Int. Bus. Financ. 2016, 38, 509–516. [Google Scholar] [CrossRef]
  196. Allini, A.; Ferri, L.; Maffei, M.; Zampella, A. From Accountability to Readability in the Public Sector: Evidence from Italian Universities. Int. J. Bus. Manag. 2017, 12, 27–35. [Google Scholar] [CrossRef] [Green Version]
  197. Toit, E.D. The readability of integrated reports. Meditari Account. Res. 2017, 25, 629–653. [Google Scholar] [CrossRef] [Green Version]
  198. Khan, M.S.; Ahmed, I.; Khan, Z.I. Readability of mission statements: A look at fortune 500. J. Qual. Technol. Manag. 2017, 13, 1–14. [Google Scholar]
  199. Nazari, J.; Hrazdil, K.; Mahmoudian, F. Assessing social and environmental performance through narrative complexity in CSR reports. J. Contemp. Account. Econ. 2017, 13, 166–178. [Google Scholar] [CrossRef]
  200. Reschenhofer, T.; Waltl, B.; Shumaiev, K.; Matthes, F. A conceptual model for measuring the complexity of spreadsheets. In Proceedings of the EuSpRIG 2016 Conference “Spreadsheet Risk Management”, Institute of Chartered Accountants in England and Wales (ICAEW), London, UK, 7–8 July 2016; pp. 37–48. [Google Scholar]
  201. Lim, E.K.; Chalmers, K.; Hanlon, D. The influence of business strategy on annual report readability. J. Account. Public Policy 2018, 37, 65–81. [Google Scholar] [CrossRef]
Figure 1. Publication trend per complexity type.
Figure 2. Complexity concepts mind map.
Figure 3. Metrics origin distribution per complexity type.
Figure 4. Types of inputs per complexity type.
Figure 5. Types of outputs per complexity type.
Figure 6. Motivations over time per complexity type.
Figure 7. Motivations in relation to metrics origin per complexity type.
Figure 8. Novelty per complexity type.
Figure 9. Focus areas per complexity type.
Figure 10. Application cases per complexity type.
Figure 11. Future research directions over time per complexity type.
Table 1. Inclusion and exclusion criteria.
Inclusion criteria:
  • related to organizational, technological, and textual complexity in organizations;
  • cited at least five times (except for publications later than 2018);
  • written in English.
Exclusion criteria:
  • exclusive focus on product complexity, system and architecture complexity, or network complexity;
  • complexity related to organizational structure;
  • theoretical speculations;
  • studies of the same author on the same topic (only the most relevant ones are selected).
Table 2. Identified aspects and features for the morphological box.
Aspect: Generic
  • motivation: factors that encourage the researchers to conduct their research
  • novelty: new and interesting contributions within the research; originality; a distinctive research contribution
  • focus area: the broad area where the research is conducted
  • application case: the narrow, specific area where research artifacts are tested
  • future research: potential directions for further studies
Aspect: Complexity Metrics and Analysis
  • metrics origin: the theories and disciplines that laid the foundation of the complexity metrics
  • input: all data and information required to perform the complexity analysis and calculation
  • output: the results of the complexity analysis and calculation
  • validation: how the proposed metrics are validated
Aspect: Implementation
  • tool support: whether any tool is used or developed
Table 3. Complexity dimensions and notions.
  D1 (observer): objective, subjective
  D2 (time): structural, dynamic
  D3 (measures): qualitative, quantitative
  D4 (dynamics, predictability): organized, disorganized
Table 4. Morphological box describing three complexity aspects (feature values with their relative frequencies).
Generic aspects:
  motivation: Complexity metrics development (38%); Complexity factors (19%); Reasons for complexity metrics (16%); Complexity effects (12%); Complexity measurements (9%); Complexity studies review (8%); Complexity metrics analysis (5%)
  novelty: Specific application area (31%); New approach, New framework, New metrics (23%); Complexity studies review (16%); Empirical study findings (9%); Complexity effect analysis, Complexity factor analysis (11%); Metrics evaluation, Metrics adaptation, Tool support (5%); Specific research artifacts (5%)
  focus area: Business Administration (27%); Business Process Management (22%); Corporate Finance (20%); Software Engineering (19%); eCommerce (5%); Enterprise Architecture Management, IT Service Management (5%); Information and Innovation Management (2%)
  application case: Information and Communication (29%); Professional, scientific and technical activities (28%); Manufacturing (9%); Financial and insurance activities (6%); Wholesale and retail trade (5%); Education (4%); Others (23%)
  future research: Approach extension, Metrics extension, Framework extension (46%); New validation studies (22%); Tool support (6%); Complexity effects, Complexity factors (3%); Approach implementation, Guidelines development, Metrics comparison (3%); Complexity reduction method (3%)
Complexity metrics and analysis aspects:
  input: Business information (32%); Software and architectures (17%); Business process models (11%); Event logs, Workflows (10%); Case studies and interviews (5%); Others (12%)
  output: Complexity measurements (62%); Complexity factors (15%); Complexity effects (14%); Complexity studies review findings (6%); Metrics selection, Metrics evaluation (5%)
  metrics origin: Cognitive Informatics, Cognitive Sciences, Human Sciences, Organizational Sciences (63%); Mathematics, Process Mining, Software Engineering (39%); Linguistics (32%); Decision making (17%); Graph Theory, System Theory (13%); Information Theory (5%); Others: Psychology, Complexity Theory, Complexity Sciences (7%)
  validation: Empirical (66%); Theoretical (17%)
Implementation aspects:
  tool support: No (35%); Using existing tools (23%); Yes (11%)
Table 5. Relative distribution of the complexity notions per complexity type (D1: observer, objective/subjective; D2: time, structural/dynamic; D3: measures, quantitative/qualitative; D4: dynamics and predictability, organized/disorganized; complexity concepts as in Figure 2).
Technological complexity (%):
  Business process model: D1 Obj (91), Subj (36); D2 Str (100), Dyn (18); D3 Quan (77), Qual (50); D4 Org (95), Dorg (27)
  Event logs and workflows: D1 Obj (100); D2 Str (100); D3 Quan (80), Qual (20); D4 Org (100)
  Software:
    Enterprise systems: D1 Obj (100), Subj (50); D2 Str (100); D3 Quan (100), Qual (50); D4 Org (100)
    IT architectures: D1 Obj (100); D2 Str (100); D3 Quan (100); D4 Org (100)
    IT services: D1 Obj (100); D2 Dyn (100); D3 Qual (100); D4 Dorg (100)
    Rule-based systems: D1 Obj (50), Subj (50); D2 Str (100), Dyn (50); D3 Qual (100); D4 Org (100), Dorg (50)
    User interfaces: D1 Obj (100), Subj (57); D2 Str (100); D3 Quan (100), Qual (43); D4 Org (100), Dorg (29)
    Programs: D1 Obj (92), Subj (17); D2 Str (100), Dyn (8); D3 Quan (75), Qual (42); D4 Org (100)
Organizational complexity (%):
  Organization as a whole: D1 Obj (20), Subj (100); D2 Str (100), Dyn (40); D3 Quan (40), Qual (80); D4 Org (80), Dorg (40)
  Task: D1 Obj (86), Subj (79); D2 Str (100), Dyn (7); D3 Quan (64), Qual (71); D4 Org (100), Dorg (14)
  Project: D1 Obj (70), Subj (70); D2 Str (100), Dyn (20); D3 Quan (50), Qual (90); D4 Org (100), Dorg (10)
  Process: D1 Obj (100), Subj (20); D2 Str (100), Dyn (20); D3 Quan (80), Qual (40); D4 Org (100), Dorg (20)
  Product(-ion): D1 Obj (50), Subj (50); D2 Str (100); D3 Quan (50), Qual (50); D4 Org (100)
Textual complexity (%):
  Legislative documentation: D1 Obj (83), Subj (41); D2 Str (100); D3 Quan (83), Qual (41); D4 Org (100)
  News articles: D1 Obj (100); D2 Str (100); D3 Quan (100); D4 Org (100)
  Webpages: D1 Obj (100); D2 Str (100); D3 Quan (100); D4 Org (100)
  Online reviews: D1 Obj (100); D2 Str (100); D3 Quan (100); D4 Org (100)
  Textbooks and other teaching materials: D1 Obj (33), Subj (100); D2 Str (100); D3 Quan (33), Qual (100); D4 Org (100)
Table 6. An example illustrating the use of the proposed method.
Goal: In the currently prevailing remote way of working, managers very often assign tasks to employees in textual form, for example, by email. Analyze the workload of employees.
Question 1: How difficult is it for employees to read and comprehend the manager’s emails?
  a. Data: textual email data
  b. Complexity type: textual
  c. Complexity dimensions and notions: D1: objective, D2: structural, D3: quantitative, D4: organized
  d. Complexity analysis approach: Flesch Reading Ease score, Gunning Fog Index, etc. [111]
Question 2: How many activities does an email contain?
  a. Data: textual email data
  b. Complexity type: textual
  c. Complexity dimensions and notions: D1: objective, D2: structural, D3: quantitative, D4: organized
  d. Complexity analysis approach: count of verbs, specific approaches suggested in recent research [69,112]
Question 3: What is the task complexity from the employee’s point of view?
  a. Data: textual email data, specific information from employees regarding task complexity
  b. Complexity type: organizational, tasks
  c. Complexity dimensions and notions: D1: objective, subjective; D2: structural, dynamic; D3: quantitative, qualitative; D4: organized, disorganized
  d. Complexity analysis approach: amount and clarity of inputs, processing, and output [11]; objective (size, distance functions) and subjective (experience, motivation, etc.) factors [25]
Question 4: How often does an employee receive such emails per day?
  a. Data: email event log data
  b. Complexity type: technological, event logs and workflows
  c. Complexity dimensions and notions: D1: objective, D2: structural, D3: quantitative, D4: organized
  d. Complexity analysis approach: count of specific case event IDs per day
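Two of the quantitative approaches listed in Table 6 can be sketched directly in code: the Flesch Reading Ease score for Question 1 and the per-day count of case IDs for Question 4. The sketch below is a minimal illustration, not part of the cited works: the vowel-group syllable heuristic and the (case_id, ISO timestamp) event-log shape are our own simplifying assumptions.

```python
import re
from collections import defaultdict
from datetime import datetime

def flesch_reading_ease(text):
    """Flesch Reading Ease:
    206.835 - 1.015*(words/sentences) - 84.6*(syllables/words).
    Higher scores mean easier text. Syllables are approximated by
    counting groups of consecutive vowels, so scores are estimates."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    if not sentences or not words:
        return 0.0

    def syllables(word):
        groups = re.findall(r"[aeiouy]+", word.lower())
        n = len(groups)
        if word.lower().endswith("e") and n > 1:
            n -= 1  # drop a (likely) silent trailing 'e'
        return max(n, 1)

    return (206.835
            - 1.015 * len(words) / len(sentences)
            - 84.6 * sum(syllables(w) for w in words) / len(words))

def emails_per_day(event_log):
    """Question 4: count distinct case IDs per calendar day.
    `event_log` is a list of (case_id, ISO-format timestamp) pairs."""
    cases = defaultdict(set)
    for case_id, ts in event_log:
        cases[datetime.fromisoformat(ts).date()].add(case_id)
    return {day: len(ids) for day, ids in cases.items()}
```

For instance, a short plain sentence such as "The cat sat on the mat." scores above 100 (very easy to read), while a dense, polysyllabic sentence drops sharply; this is exactly the contrast Question 1 exploits when comparing managers' emails.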
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

MDPI and ACS Style

Revina, A.; Aksu, Ü.; Meister, V.G. Method to Address Complexity in Organizations Based on a Comprehensive Overview. Information 2021, 12, 423. https://doi.org/10.3390/info12100423
