1. Introduction
In 1979, Russell Ackoff, the operational research (OR) specialist who also taught philosophy, published his well-known paper “The future of operational research is past” [1]. In this paper, he argues that the classic OR methodology is inappropriate in general and that deficiencies exist in the concept and practice of optimisation (which is the basis of classical OR). Ackoff also suggests that there are problems with the way in which operational researchers pursue objectivity in their work. He proposes that systems thinking, with the associated concept of expansionism (as opposed to reductionism), be applied to OR. The importance of systems thinking and other potential paradigmatic dilemmas had already been pointed out in earlier work by Ackoff [2] and others [3,4]. A large number of studies in different application areas have been performed in the area of systems thinking, drawing on theories such as the general systems approach, cybernetics, system dynamics, problem structuring methods, critical systems and multi-methodologies. Mingers and White [5] provide a review of the contribution of these theories and methodologies to OR. A more recent development in OR that also attempts to address some of the reductionist deficiencies is behavioural OR, which studies the behavioural aspects of the use of OR in problem solving and decision support [6].
Despite a number of recent studies that aimed to address these reductionist deficiencies, it appears that OR is still mainly inspired by a Newtonian framework, which claims that the universe can be understood through a process of analysis and reduction of systems into parts in order to understand how the whole works. As mentioned above, this viewpoint was identified and critiqued as early as 1962 by OR researchers who argued that the discipline of OR is based on a mechanistic worldview that is supported by methodologies and practices that share and propagate these assumptions. Ackoff [1] lists six deficiencies to support these arguments and contends that the OR methodology does not take into account the complexity of the large number of role players and their interactions and intricate relationships. The deficiencies (which will be revisited and explained in Section 2) include the absence of systems that can learn and adapt; a lack of quality-of-life values; inappropriate model abstraction; the absence of a synthesising planning paradigm; a failure to acknowledge the interdisciplinary nature of systems; and an insufficient level of objectivity in OR.
In addition to the above, there is also a strong argument for methodological pluralism (also known as multimethodology) as a valuable approach to deal effectively with stakeholder concerns in OR-related problem situations [7]. Midgley describes methodological pluralism as engagement at a methodology level (respecting others’ methodological ideas and obtaining insight to inform one’s own methodology) and a method level (using a wide range of methods). The value of adopting a methodological pluralism outlook is based on the notion that no one theory, or set of theories, can ever be comprehensive. The purpose of learning from other methodologies is therefore, according to Midgley, to provide for a continued evolution of one’s own methodology. This is achieved by reflecting on similarities and differences between one’s own methodology and other methodologies. The idea of systems thinking is central to the methodological pluralism view and Midgley [7] explains systems thinking in terms of three “waves” that influenced OR methods over the years. The first wave (1945–1970s) focused on quantitative applied science informing hard science (positivist) OR methods such as System Dynamics, Systems Engineering and Viable Systems Modelling. The second wave (1970s–1980s) was based on criticisms of the first wave—typically criticism from thinkers such as Ackoff. The focus moved towards debate rather than expert modelling techniques and included methods such as Interactive Planning and Soft Systems Methodology. The third wave (1980s–present) emerged following critique that second-wave systems thinking does not account for conflicts and power relationships. The so-called third wave, also referred to as Critical Systems Thinking (CST), was built on the argument for methodological pluralism [8] as well as the Critical Systems Heuristics methodology developed by Ulrich [9]. Both of these cornerstones of CST will be briefly revisited later in this paper. The nature of Critical Systems Thinking is described by Jackson [10] (p. 136) as follows: “The purpose of contemporary critical systems thinking and practice is to learn about and harness the various systems methodologies, methods and models so that they can best be used by managers to respond to the complexity, turbulence and heterogeneity of the problem situations they face”. Jackson also delivers a strong motivation for methodological pluralism and reiterates that “Pluralism would respect the strength of the various trends in systems thinking, encouraging their theoretical development and suggesting ways in which they could be appropriately attuned to the complexity of managerial problem situations” [10] (p. 135). Further support for methodological pluralism can be found in References [11,12].
Taking into account the strong argument for methodological pluralism and learning from different theories, as well as the reductionist approach sometimes followed by traditional OR practitioners, it becomes clear that other paradigms may be considered not merely to supplement but to complement OR methodologies. One such paradigm that is appropriate to reflect on may be found in complexity theory.
Although difficult to define [13] (Scholars agree that there is no unified ‘theory of complexity’ with one central definition and set of first principles [13]; they also agree that several conceptual origins can be traced, rooted in different disciplines, which have been combined to form a collective understanding of what has come to be known as ‘complexity theory.’ Checkland [14] even suggests that it might be better to think of all the endeavours that have notions of complexity (and the study of complex phenomena as their main purpose) as processes that embrace a ‘complexity approach’ rather than wanting to unite these efforts in a ‘grand theory’ of complexity. With this acknowledged, the term ‘complexity theory’ will be used in this paper to refer to the main theoretical underpinnings that inform this broad field of study.), complexity theory offers a supplementary and different theoretical framework for the way in which certain systems can be understood. Paul Cilliers [15], the late South African philosopher, describes complexity in terms of ten general system properties; this allows one to identify and characterise phenomena as complex systems, should they exhibit these properties. According to Cilliers, these characteristics form a lean ontological foundation for arguing that complexity is a systems property that arises due to the dynamic interaction between the parts of a system and the way it is embedded in an environment with which it interacts in non-linear ways. Complex systems also exhibit other features such as emergence, self-organisation and other net-like causal structures [15,16,17]. These characteristics and properties clearly indicate that a complex system cannot be assessed and engaged with in the same paradigm of order in which OR traditionally operates.
This paper argues that the majority of typical OR applications function in a complex reality that can be described in terms of the characteristics of a complex system. Furthermore, in light of the importance of methodological pluralism and the idea of learning from other theories, this study proposes that OR methodologies can be complemented by considering a complexity theory approach. This goal may be translated into the following hypothesis: By acknowledging the importance and advantages of a methodological pluralism approach, together with an understanding that OR-related decision making takes place in a complex reality, new insights into the methods for engaging with decision making in a real-life context could be obtained—i.e., the underlying theory that defines the field of OR can be complemented by considering other theories such as complexity theory.
Mingers and White [5] listed complexity theory as one of the recently developed theories that overlaps with systems thinking. However, complexity theory is often linked to chaos theory and the two terms are sometimes used as synonyms for the same theory [18]. In clarifying the relation between the fields, it is important to acknowledge that systems thinking and complexity theory have similarities and differences and that they represent interacting and overlapping research communities [19]. There are also other research communities, for example cybernetics, that have an interacting relationship with complexity theory. Midgley and Richardson [19] (p. 171) emphasised this interrelationship as follows: “There are differences in the agendas [of systems thinking, cybernetics and complexity theory], so the separate identities are worth preserving … but there is sufficient similarity to make mutual respect and learning across community boundaries worthwhile”. Within these three distinct but overlapping research communities (which coalesced as such in the 1940s), there has been a great deal of learning across boundaries. Some researchers contribute to more than one of these communities, although they keep their distinct emphases and tend to reference different key texts.
As stated above, the discussion of complexity theory in this study will be based on the philosophical interpretation developed by Cilliers [15], as complexity theory is viewed, within a methodological pluralism view, as an opportunity to expand the understanding of systems. No claim is made that the computational and mathematical models in OR are wrong or of little use. These computational models are of particular significance in deterministic situations where variables and parameters are known and where the problem context is clear and the desired outcomes well defined. Computational and mathematical models are necessary for the development and progress of OR and fulfil an important role in many circumstances. They have produced excellent results in the past and will continue to do so in a great variety of application areas. The argument is rather that by adopting a methodological pluralism approach and reflecting on complexity theory principles, these models and their underlying theories can be enriched and complemented.
By aligning OR epistemologies with the acknowledgment of real-world complexity, new methods for modelling decision making could be developed. In light of complexity theory, these methods should be cognisant of emergence, boundary setting, provisional knowledge claims and the (ethical) responsibilities that accompany the practical implementation of such methods. Some concrete examples will be presented as part of the study to elucidate how OR could be enhanced when applying a complexity lens. Moreover, complexity theory is chosen as the comparative paradigm and epistemology to OR, because complexity theory advocates and offers explanations for concepts such as non-linear interactions in complex environments and non-hierarchical structures, which are regularly experienced in OR applications.
2. Operational Research: Epistemological Questions
An extended discussion of the definition of OR and of the differences between the classical modelling methods (termed the “first wave of systems thinking” by Midgley [7]) and alternative approaches that are focused on participation and human relationships (termed the “second wave of systems thinking” by Midgley [7]) and that can be addressed through problem structuring methods, can be found in Rosenhead and Mingers [20].
As mentioned in the introduction, Ackoff [1] noted six deficiencies with regard to the focus on optimisation concepts used in first-wave OR approaches. Ackoff’s main critique of these methods was that the models assumed that the problem context was predominantly closed and mechanistic in nature and that cause and effect resulted in a deterministic concept of reality. In response to these assumptions he proposes a systems thinking approach in which “purposeful systems that contain purposeful parts with different roles or functions and that are themselves parts of larger purposeful systems” can be designed [1] (p. 96). This approach will then facilitate a process whereby systems can effectively serve their own purposes (self-control), the purposes of their parts (humanisation) and the purposes of larger systems of which they are part (environmentalisation).
In the context of an OR problem, these concepts may be explained in terms of the well-known facility location problem, which can be solved by means of a linear programming model. Suppose that a number of warehouses have to be built on predetermined potential sites. The objective is to determine the appropriate number of warehouses and the sites where they should be built; this should be done in such a way that the demands of all customers are met at a minimum cost. If the only ultimate goal is to build the warehouses in order to serve all the customers at an absolute minimum cost, without taking any other factors into account, the problem and solution are focused on their own purposes (self-control). If other factors are taken into consideration, for example the impact on people and communities living in the area where the warehouse will be erected, the solution will also serve the purposes of their parts (humanisation). For example, a warehouse may cause unemployment in a community where people (small businesses) were employed to store and transport products that may now be available from a nearby warehouse. If the influence on the larger region and environment is also taken into account, the purposes of larger systems (environmentalisation) are considered. Quality of life may be reduced due to pollution, noise and the destruction of eco-systems caused by building activities, to name but a few.
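To make this example concrete, the sketch below shows how such a facility location problem could be formulated as a small mixed-integer linear programme. It is a minimal illustration only: the site names, customer demands, costs and capacities are hypothetical placeholders, and the open-source PuLP library is merely one possible choice of modelling interface.

```python
# A minimal, illustrative facility location model (hypothetical data, not from the paper).
# Requires the open-source PuLP library: pip install pulp
import pulp

sites = ["s1", "s2", "s3"]                      # candidate warehouse sites
customers = ["c1", "c2", "c3", "c4"]            # customers whose demand must be met
fixed_cost = {"s1": 100, "s2": 120, "s3": 90}   # cost of building a warehouse at each site
capacity = {"s1": 30, "s2": 25, "s3": 35}       # maximum throughput of each warehouse
demand = {"c1": 10, "c2": 15, "c3": 20, "c4": 5}
ship_cost = {("s1", "c1"): 2, ("s1", "c2"): 4, ("s1", "c3"): 5, ("s1", "c4"): 3,
             ("s2", "c1"): 3, ("s2", "c2"): 1, ("s2", "c3"): 3, ("s2", "c4"): 2,
             ("s3", "c1"): 5, ("s3", "c2"): 4, ("s3", "c3"): 2, ("s3", "c4"): 6}

model = pulp.LpProblem("facility_location", pulp.LpMinimize)

# y[s] = 1 if a warehouse is built at site s; x[(s, c)] = quantity shipped from s to c
y = pulp.LpVariable.dicts("open", sites, cat=pulp.LpBinary)
x = pulp.LpVariable.dicts("ship", [(s, c) for s in sites for c in customers], lowBound=0)

# Objective: total building cost plus total transport cost
model += (pulp.lpSum(fixed_cost[s] * y[s] for s in sites)
          + pulp.lpSum(ship_cost[s, c] * x[(s, c)] for s in sites for c in customers))

# Each customer's demand must be met exactly
for c in customers:
    model += pulp.lpSum(x[(s, c)] for s in sites) == demand[c]

# A site can only ship goods if its warehouse is built, and only up to its capacity
for s in sites:
    model += pulp.lpSum(x[(s, c)] for c in customers) <= capacity[s] * y[s]

model.solve()
print("Warehouses to build:", [s for s in sites if y[s].varValue > 0.5])
print("Total cost:", pulp.value(model.objective))
```

Solving such a deterministic model captures only the self-control objective; the humanisation and environmentalisation concerns discussed above would have to enter either as additional constraints and penalty terms or, more realistically, through the participatory approaches discussed later in the paper.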
For Ackoff [1], OR is almost exclusively concerned with self-control and thus with serving its own purpose. Based on this motivation, he identifies the following deficiencies of classic OR:
A need for decision-making systems that can learn and adapt—something that an optimising system cannot do.
Taking into account values that are relevant to quality of life (called aesthetic values by Ackoff).
Model abstraction of systems of problems (called messes by Ackoff). According to Ackoff, problems cannot be treated effectively by decomposing them analytically into separate problems.
A synthesising planning paradigm should be adopted, as opposed to the problem-solving paradigm of “predict and prepare” that is employed by OR.
Interdisciplinary nature and interaction are requirements to deal with messes (complex systems).
In addition to these aspects that are related to optimisation, Ackoff also noted a sixth deficiency pertaining to the concept of objectivity in OR. This issue relates to all those who can be affected by the outcome of a decision-making process.
It may be said that all six issues raised by Ackoff [1] are fundamental issues in OR. They indicate that OR as a discipline was built on conceptual foundations that assumed that reality works like a clock or machine, adhering to principles such as order, controllability and linear interactions (for example, a certain amount of input guarantees a proportional effect in output) and initial conditions that lead to predictable outcomes (a deterministic model).
The deficiencies of OR raised by Ackoff have a common denominator in the sense that they are focused on typical human characteristics and require a more holistic, integrated approach to address them. As one of the founders of the new OR paradigm of problem structuring methods (also referred to as soft OR), and as an answer to his own critique, Ackoff suggested that the problem-solving paradigm be replaced with an “interactive planning” method—this approach represents one of the earlier examples of a problem structuring method [21]. Having informed the main assumptions of the second-wave OR approach, he introduced keywords such as “learn and adapt,” “quality of life,” “messes,” “synthesised planning” and “interdisciplinary and interaction” to account for the complexity of large numbers of role players, their interactions and intricate relationships.
A number of other researchers performed groundbreaking work in the area of problem structuring methods and are also regarded as founders (along with Ackoff) of the new soft (second- and third-wave) OR approach. Some of the more prominent authors on problem structuring methods include [7,22,23,24,25,26,27]. For a review and evaluation of the use of problem structuring methods in practice, the work of Mingers and Rosenhead [28] may be consulted. A key epistemological aspect of problem structuring methods that is regularly mentioned by many of the well-known researchers in this area is that of multiple perspectives and the challenge of navigating human relationships. The seminal work of Rosenhead and Mingers [29] serves as an example of this.
Furthermore, Flood and Jackson [30] constructed a “critical systems thinking” framework to group problem contexts; this framework (which will be revisited in Section 4) consists of two dimensions, called systems and participants. The systems dimension can range from simple to complex, whereas the relationships between participants are classified as unitary, pluralist or coercive. According to Flood and Jackson, problem structuring methods are aligned with the complex-pluralist problem context. They describe the first dimension, namely systems, only in terms of the characteristics of a simple or a complex system. It is at this point that complexity theory may play a significant role; using the complexity theory principles (as put forward by Cilliers) may help orientate OR practitioners to better understand the (complex) systems dimension that is so often referred to in problem structuring methods. It therefore seems permissible to assume that there exists a relationship between problem structuring methods and complexity theory and that the latter (as presented in this paper) is complementary to the use of problem structuring methods. It may also enhance the ideas that are already used and implemented in problem structuring methods and critical systems thinking approaches, as developed for example by Ulrich [9] through his critical systems heuristics framework. The sketch below illustrates the structure of this two-dimensional framework.
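The following minimal sketch (purely illustrative and not taken from the cited sources) encodes the two dimensions of the framework as a simple data structure. The only methodology alignment drawn from the text is the complex-pluralist cell; the other entries are hypothetical placeholders used to show how the grid could be populated.

```python
# Illustrative encoding of the Flood and Jackson problem-context grid.
# Only the complex-pluralist alignment is stated in the text; other entries are placeholders.
from itertools import product

systems_dimension = ["simple", "complex"]                       # nature of the system
participants_dimension = ["unitary", "pluralist", "coercive"]   # relationships between participants

# Every (systems, participants) pairing defines a distinct problem context
problem_contexts = [f"{s}-{p}" for s, p in product(systems_dimension, participants_dimension)]

# Suggested approaches per context (hypothetical except where noted in the comment)
suggested_approach = {
    "simple-unitary": "classical optimisation models",     # placeholder assumption
    "complex-pluralist": "problem structuring methods",    # alignment stated by Flood and Jackson
    "complex-coercive": "critical systems approaches",     # placeholder assumption
}

def describe(system_type: str, participant_relation: str) -> str:
    """Return the problem-context label and any suggested approach for a pairing."""
    context = f"{system_type}-{participant_relation}"
    return f"{context}: {suggested_approach.get(context, 'context-dependent choice of methods')}"

if __name__ == "__main__":
    for context in problem_contexts:
        system_type, participant_relation = context.split("-")
        print(describe(system_type, participant_relation))
```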
4. Operational Research in the Context of Complexity
In this section, the complex reality in which a significant number of typical OR applications function is described in terms of the characteristics of a complex system, as presented by Cilliers [15]. A prerequisite for a discussion of this nature is to first consider the important question of whether all OR applications can be classified as complex. For those problems in which the context is simple, complexity theory would have little or no utility.
From the literature, specifically the work of researchers who proposed frameworks to align OR methodologies with different problem contexts, it is clear that the categorisation of problems as ‘complex’ or ‘simple’ is perfectly legitimate. The work of Jackson and Keys [8] is of particular interest in this regard. They proposed a framework, called a system of systems methodologies, to classify problem contexts based on two dimensions, namely systems (simple or complex) and the relations between participants or stakeholders. Initially, they described the relations between participants as unitary (when all agree on a common set of goals) and pluralist (when there are differing views and objectives). This categorisation framework was later formalised further by adding a third option to the possible relations between participants, namely coercive (irreconcilable views and objectives) [30,42]. These authors also pointed out that, in Ackoff’s terminology, the ‘simple’ and ‘complex’ problem contexts can be referred to as ‘mechanical’ and ‘systemic.’ This approach was not free from criticism and questions were raised on issues such as problems that cannot be fitted unambiguously into one of the proposed categories; participants who may disagree on whether the context is unitary, pluralist or coercive; inconsistencies; and a lack of objectivity [7,43].
From a complexity viewpoint, the work of Snowden and Boone [44] offers insight into how problem contexts can be classified. They developed a framework (called the Cynefin framework) to assist decision makers in understanding the context in which they are operating. The framework is based on the relationship between cause and effect and consists primarily of five different contexts. The first two contexts, namely the simple context (characterised by stability and clear cause-and-effect relationships) and the complicated context (“known unknowns”), occur in an ordered world in which fact-based management and decisions can be taken. The next two contexts are the complex (with no apparent cause-and-effect relationship) and chaotic (the unknowable) contexts. These two contexts occur in an unordered world and call for the use of patterns to make decisions. The fifth context is the one marked by disorder and is particularly difficult to recognise when one is in it, owing to multiple opposing perspectives that compete for prominence. As suggested by Snowden and Boone [44] (p. 4), the way out of this realm is “to break down the situation into constituent parts and assign each to one of the other four realms”. An extended list of the characteristics of each problem context can be found in their work.
In the context of this paper, it is acknowledged that not all OR applications display the properties of a complex system. The ensuing discussion refers to those OR applications that would have been classified as complex by the Flood and Jackson [30] system of systems methodologies—specifically the complex-pluralist and complex-coercive problem contexts. Following the Cynefin framework of Snowden and Boone [44], the discussion refers to OR applications that would typically fall into the complex and chaotic and, to a certain extent, the disorder contexts. Applications whose objective is simply to seek an optimal answer to an obvious mechanistic problem (e.g., the optimal loading sequence of different items onto a truck) are not considered in the discussion. The facility location problem, where a number of warehouses have to be built on predetermined possible sites, will be used as a representative example of a typical OR application in a complex, real-world context. This example, introduced earlier in the paper when reference was made to the work of Ackoff, is appropriate for illustrating the complexity dynamics of a typical OR application [45].
4.1. OR Applications Consist of a Large Number of Elements
OR as a discipline comprises different kinds of analytical models. Jackson [46] distinguishes between classical OR and ‘enhanced’ OR. The newer, enhanced OR provides for different methodological approaches (e.g., problem structuring methods) in different problem contexts and involves a large number of conceptual models, made up of equally large numbers of concepts, elements and ideas. The classical OR discipline is grounded in mathematics and mathematical statistics and is therefore also made up of a large number of concepts, ideas and techniques that are used and applied in an even larger number of combinations. If only these elements are taken into consideration, this implies a finite number of elements that imposes some sort of boundary on an OR system. However, according to [47,48], it is unavoidable (and, in fact, required) that a study of a complex system will have some sort of artificial boundary from the observer’s viewpoint.
In her study on complexity and information systems, Merali [18] pointed out that a network economy and society have emerged and that the world should currently be seen as a networked world. This view strongly supports the idea that an OR application does not exist in isolation, nor does the set of techniques and concepts on its own constitute an OR system. Other interacting elements are always present, such as individuals, societies and the environment. The OR application described earlier is also strongly related to an economic application that interacts with a large number of economic elements such as people, the economy itself (prices, supply, demand, labour, interest rates, etc.) and the environment. All these external interactions do not contribute to the OR application in a deterministic way but rather interact and merge with the application. It is difficult to identify all components of all elements and to fit them into a coherent whole to provide an exact description of the OR application and all its elements.
4.2. The Elements in an OR Application Interact Dynamically
An OR model can only be meaningful in the real-life context in which it is applied. In other words, if the model does not interact with the environment in which it operates, it becomes meaningless. The variables in an OR model do not only interact mathematically with each other; they also represent relationships with reality, with economic variables, resource variables and a host of others, all present in the warehouse example, and they imply an interaction with the environment and with individuals that may be shifting and changing continually. The OR application, and specifically the warehouse location problem, is therefore shaped by the combined effects of the relationships caused by the dynamic interactions of a large number of elements. This interaction between the OR application and reality is not necessarily of a physical nature but may also occur as the “transfer of information” [15] (p. 3).
4.3. The Level of Interaction among Elements in an OR Application Is Fairly Rich
Advances in the field of OR are the results of relationships among different elements, which include mathematical concepts as well as application concepts. New developments are therefore formed from interactions among existing concepts and elements. The implementation of an OR model, as described in the example, will result in a rich (multi-level, cross-scale) interaction with, for example, the environment. As a result of the OR model implementation, an economic activity will cause a large number of interactions with economic and other agents.
4.4. The Interactions in an OR Application are Non-Linear
Non-linearity guarantees that small causes can have large results and vice versa [
15]. This is exactly what happens in the type of OR application that is used here as an example. The implementation of a relatively small and simple OR model may produce large economic and/or social returns. Conversely, a comprehensive and significantly large model associated with costly development (in terms of time and money) may have no or very little impact if the model proves to be trivial, incorrect or inadequate. Cilliers also states that the principle of asymmetry is closely related to the principle of non-linearity. The OR application described here contains strong elements of a competitive nature, that is, the best location at the lowest cost. This competitive nature causes an asymmetrical system of relationships. If there were a symmetrical relationship amongst the variables (e.g., no difference in the cost to deliver goods at different locations), there would have been no need for an OR or any other model. According to Cilliers, non-linearity, asymmetry and competition (which are all present in OR applications) are inevitable components of complex systems.
Considering the OR discipline itself, a single (small) idea may also cause large-scale changes and may even lead to new knowledge, schools of thought and applications. One example of this is the work of Farrell [49] on the application of activity analysis to the measurement of productive efficiency, which was generalised by Charnes, et al. [50] into a new OR approach known as data envelopment analysis (DEA). Today, DEA is an established subfield within the OR discipline that has produced literally thousands of research papers and has led to specialist international conferences and journals. A small sketch of the basic DEA model is given below.
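To give a concrete impression of the technique, the following minimal sketch formulates the classic input-oriented CCR multiplier model for a single decision-making unit as a linear programme. The data (three units, two inputs, one output) are hypothetical, and PuLP is again used purely as an illustrative modelling interface rather than as the method of the cited authors.

```python
# Illustrative input-oriented CCR (DEA) multiplier model for one decision-making unit.
# Hypothetical data: three units, two inputs (staff, floor space) and one output (transactions).
import pulp

units = ["u1", "u2", "u3"]
inputs = {"u1": [8, 140], "u2": [6, 90], "u3": [10, 160]}    # [staff, floor space]
outputs = {"u1": [300], "u2": [280], "u3": [320]}            # [transactions]
target = "u1"                                                # unit whose efficiency is evaluated

model = pulp.LpProblem("dea_ccr", pulp.LpMaximize)

# Multiplier (weight) variables for each output and input
u = [pulp.LpVariable(f"u_{r}", lowBound=0) for r in range(len(outputs[target]))]
v = [pulp.LpVariable(f"v_{i}", lowBound=0) for i in range(len(inputs[target]))]

# Objective: weighted output of the target unit (its efficiency score)
model += pulp.lpSum(u[r] * outputs[target][r] for r in range(len(u)))

# Normalise the weighted input of the target unit to one
model += pulp.lpSum(v[i] * inputs[target][i] for i in range(len(v))) == 1

# No unit may obtain an efficiency score greater than one under the chosen weights
for j in units:
    model += (pulp.lpSum(u[r] * outputs[j][r] for r in range(len(u)))
              <= pulp.lpSum(v[i] * inputs[j][i] for i in range(len(v))))

model.solve()
print(f"Efficiency of {target}: {pulp.value(model.objective):.3f}")
```

Repeating the optimisation with each unit in turn as the target yields a relative efficiency score for every unit, which is the essence of the DEA applications referred to later in the paper.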
4.5. Interactions in an OR Application Have a Fairly Short Range
The formulation and implementation of an OR model may have a profound impact on elements locally or close to the application area. The aim of an OR model is to bring about change that will (hopefully) result in improvement. This change occurs mainly at a local level and close to the application. For example, building a warehouse (by using an OR model) will first and foremost have an impact on elements close to the building activity, for instance economic activity, social activity (employment, communities etc.), pollution and the eco-system.
Although spatial and temporal location plays the most significant role, the interactions do not necessarily have to be of a short range only—they may be wide-ranging too [15]. The success or failure of an OR model can have a wider effect than just on the immediate environment. Economic activity at a regional level that also has an impact at a national level is an example of such a wide-ranging interaction. The ideas, concepts and techniques used in an initial model may permeate other problem areas, which may result in new OR model formulations; this may provide another dimension in the interactions in OR models and OR applications.
4.6. There Are Loops in the Interactions among the Elements in an OR Application
In OR as a discipline, there exist continuous feedback and interconnected loops. For example, the results of the basic linear programming model have led to extensions and new developments such as fractional and goal programming. The development and application of an OR model also have clear and definite loops and interactions that feed back onto themselves. A successful OR model will deliver good results, whereas the opposite is true for an unsuccessful model. If there were no feedback on the model, the model would not be necessary and its formulation would have been a useless exercise—OR models would not be formulated if there were no feedback or results. In certain cases, feedback may not be immediate but may occur only after some time has passed. The building of a new warehouse, for example, may influence the economy in a positive or negative way after some time—this is unpredictable in the context of the real world. Further examples of this are the complexities of job creation and social communities.
4.7. An OR Application Functions in an Open System
An OR model formulation on its own should be regarded as a closed system that is described or defined within a formal description. An OR model can therefore not be called a ‘complex system’ without qualification. However, to formulate an OR model in a proper way, the modeller or decision maker is confronted with an open system in real life where there are a large number of elements that may have an influence on the formulation. The model may be bombarded with input from the physical, cultural and intellectual application environment. Once the model has been implemented and applied to a specific problem area, the application becomes a truly open system, as it becomes impossible to identify a precise boundary where the impact of the model stops. To confine the results of an OR model to a set of variables that take on certain values would therefore be a gross oversimplification.
4.8. OR Applications Operate under Conditions That Are Far from an Equilibrium
As a result of the non-linear character of the interactions between the components of a system, the environment and the system as a whole, a system is asymmetrical and functions in a state of non-equilibrium [51,52]. The relation between non-linearity and conditions that are far from equilibrium produces a ‘multiplicity of stable states’ in a system, which is more robust than the single static state of systems that exist in a state of equilibrium [52]. Systems that are far from equilibrium survive and change through a process of adaptation and self-organisation. Moreover, De Villiers-Botha and Cilliers [53] explain that our interaction with the world is dynamic and that, as the environment changes, adaptations are made. The majority of OR applications are aimed at the improvement and/or adaptation of how we interact with the world. The warehouse model may be a result of the dynamics of supply and demand, a recession, a political system, technology et cetera. In a symmetric or equilibrium state, there would be no need for change, adaptations or management science models. Cilliers [15] rightly pointed out that equilibrium is another word for death.
4.9. OR Applications Have a History
OR applications are greatly influenced by their history. Successful models may be re-used or improved, whereas unsuccessful models are changed or discarded. The history of an OR application and the context in which it was applied determine the future, nature and, in a sense, identity of similar models and applications. The result of an OR model, as in the warehouse example, can no longer be seen as a once-off, objectively given result obtained by merely looking at the optimal answer generated by the model. Such an optimal answer does not determine the use and history of a model on its own, as there are a host of other effects and influences that have to be taken into account, for example the roles of different stakeholders and unforeseen developments before or during model implementation. Moreover, traces of the application of the OR model persist long after the actual implementation; the meaning of the OR model is therefore dependent on past interactions of similar models with other elements. This is consistent with the view of Cilliers [15] that the history of a complex system is a collection of traces distributed over the system and is always open to multiple interpretations.
Associated with history is the concept of memory. It is doubtful whether mathematics has a memory [54]; by implication, this would mean that OR does not have a memory either. Although this may be true for an OR model in isolation, such a model does show signs of memory once it has been implemented, as was indicated earlier in the discussion of the influence of successful models and failures. This memory also appears to be contingent and dynamic (a characteristic of a complex system [54]), as the memory will vary from application to application, even when the same type of model is applied in more than one case.
4.10. Each Element in an OR Application Is Ignorant of the Behaviour of the System as a Whole
Although there exists a mathematical relationship between the variables in an OR model, the elements in the model are ignorant of the behaviour of the system as a whole. An element reacts only to information available to itself and remains unaware of what happens to other elements or their behaviour. The building of a warehouse at location x is based on information close to the activity, for example the building cost locally at location x; the number of clients at or in close proximity to location x; and storing costs at location x. Other elements or their behaviour are unknown, for example the impact on the economy (interest rates, confidence etc.).
Cilliers [15] warns that this characteristic should be carefully considered. Other characteristics have already indicated that elements respond only to local information (short range); that this information is rich; and that single elements are not significant in themselves. According to Cilliers, the point being made with the ignorance characteristic is that single elements cannot contain the complexity of the whole system—if they could, all the complexity would have to be present in that single element. It goes without saying that no single element in a mathematical OR model can contain the complexity of the whole system.
The alignment of OR with the ten characteristics of complexity thinking allows for the acceptance and accommodation of criticism such as the six concerns raised by Ackoff (listed in Section 2). A summarised overview of how these concerns are linked to the complexity principles is given below; a more general discussion of how the epistemology of OR may be enriched through an acknowledgment of the existence of complexity properties is presented in Section 5, where concepts such as emergence, boundaries, knowledge and responsibility are highlighted. Although all ten complexity characteristics are related to a certain extent to all the concerns raised by Ackoff, the brief review that follows focusses only on specific characteristics that may be more directly associated with Ackoff’s six concerns.
The first concern raised by Ackoff refers to systems that can learn and adapt. The complexity characteristics of non-equilibrium (Section 4.8) and history (Section 4.9) are of particular interest here. Conditions far from equilibrium cause a system (OR application) to be more robust (adaptable) than a static system in a state of equilibrium. It was further pointed out in Section 4.8 that systems operating far from equilibrium survive through a process of change and adaptation. The history property of complexity indicates that the history of a system determines whether the system changes (adapts) or is discarded.
Ackoff’s next concern focusses on the lack of quality-of-life values in OR applications. The rich interaction of elements (Section 4.3) provides an opportunity for OR applications to interact with a broader range of stakeholders, for example, society, the environment and so forth. A larger number of interactions with a larger number of stakeholders will lead to a greater appreciation and acceptance of quality-of-life values. In addition, implemented OR models are open systems (Section 4.7) that are subjected to input from different environments, which further contributes to the establishment and acceptance of quality-of-life values.
The third concern mentioned by Ackoff is that systems of problems (messes) cannot be treated effectively by decomposing them analytically. Acknowledging that OR applications consist of large numbers of elements (Section 4.1) and that these elements interact dynamically (Section 4.2) contributes to an explanation of this concern. An OR application does not exist in isolation and can therefore not be broken down into isolated sub-applications. Furthermore, an OR application operates in real-world situations and cannot be free from interacting with its environment; isolating it from its environment may render the application meaningless.
The fourth concern raised by Ackoff is the argument in favour of a synthesising planning paradigm. All ten complexity characteristics are of significance here, but perhaps the large number of elements (Section 4.1) and their dynamic interaction (Section 4.2), as well as the open systems principle (Section 4.7), can be singled out as clear indicators that a classical OR application should not be viewed as a linear “predict and prepare” problem. Classical OR problems should rather be addressed using a more holistic approach, such as what Ackoff termed a synthesising planning paradigm.
The penultimate concern refers to the interdisciplinary nature of, and the interaction required to deal with, real-world problems. Of particular importance in this case are the complexity concepts of loops in the interactions among elements in an OR application (Section 4.6), the non-equilibrium conditions under which OR applications operate (Section 4.8) and the open systems characteristic (Section 4.7). Feedback loops amongst elements emphasise the interaction that takes place between elements and provide for a basic understanding of the intricacies of OR applications. Owing to the non-equilibrium state in which OR applications operate and the open systems idea, it is also clear that OR applications have an interdisciplinary nature and interact with a number of environments, systems, technological factors and so forth.
The sixth and last concern mentioned by Ackoff is related to responsibility and asks who can be affected by the outcome of a decision. The dynamic interaction of elements in an OR application (Section 4.2) emphasises the many role players that may be affected by the application and by decisions based on its outcome. These interactions of elements have a fairly short range (Section 4.5) and OR practitioners and decision makers need to ensure that all stakeholders who may be influenced are identified and considered. The fact that an OR application functions in an open system (Section 4.7) with no clear boundaries increases the risk of not fully understanding the responsibilities associated with the impact on different stakeholders.
In a complex real world, it is rarely possible to simplify things according to a list of characteristics or properties, although, based on the ten characteristics above, one can indicate the complexity of the context in which OR is applied. In addition, Cilliers [54] emphasised that the ten characteristics identified by him constitute only part of the process that is necessary to claim complexity; for example, an important element in complexity is the notion of emergence. Cilliers states that complex systems have emergent properties, which cannot simply be reduced to another point on a list of characteristics. However, acknowledging the complexity of the real-world context in which OR is applied (based on the ten characteristics) is important, because it challenges the field of OR to develop a new epistemology and related methods to capture and engage with the complex nature of the systems and processes that it aims to understand and influence.
5. Aligning the Epistemology of Operational Research with Complexity
In the preceding section it was indicated that a typical OR application, which generally operates with a controlled and reductionist epistemology, functions in terms of the characteristics of a complex system. It should therefore be acknowledged that OR functions in a system that has a significant number of complex (disorderly) properties and that OR applications cannot simply rely on a linear, sequential approach that assumes that all phenomena are context-free. Such an acknowledgement will open up new avenues, not just for the development of OR epistemologies but also for the acceptance and accommodation of criticism, for example the criticism raised by Ackoff. Studies in other disciplines that acknowledged the existence and value of complexity theory have benefitted greatly from such an acknowledgement. Examples of such studies in music, mathematics, education, staff development and information systems are briefly discussed here in order to indicate how a deeper understanding of the subject was gained by linking the various epistemologies to complexity theory.
A study by Crowe [55] (p. 18) on complexity science and music therapy emphasises that complexity science “offers music therapy a scientific model that brings greater understanding to the immensely intricate process that occurs in (the) therapeutic discipline”. Crowe states that it has changed music therapy to be less concerned with predicting results and more concerned with the process. It also now seeks to understand, rather than to look for causes.
Mowat and Davis [48] concluded that it is appropriate to consider a system of mathematical ideas as a network structure that exhibits the properties of a complex system. Based on this, they claim that insights from network theory (and the associated complexity principles) will assist in the understanding of teaching mathematics and also lead to more effective teaching of mathematics.
In a paper on complexity theory and staff development, it is argued that complexity theory has provided insight into how to facilitate and plan complexity-based staff development. It is also particularly useful for understanding change and preparing for change; in some instances, complexity-driven staff development may even initiate a change process [56].
In order to understand and accept that the dynamics of information systems (IS) in a networked world can no longer be treated with traditional scientific methods, Merali [18] posits that a paradigm shift in the IS discipline is necessary. Through an exploration of the concepts of complexity theory and its usefulness for developing IS theory and practice, it is then claimed that “complexity science furnishes us with the concepts and tools for building multi-level representations of the world and for making sense of the dynamics of emergence” [18] (p. 226).
Finally, in a study regarding decision making, with specific reference to the bee algorithm (an OR technique), Paul, et al. [57] concluded that a complexity perspective is highly appropriate and assists with the explanation of different properties of decision making in a particle-swarm optimisation context.
The acknowledgment of the existence of complexity properties (identified in a list of ten characteristics) in the context in which OR is applied links OR to complexity theory. Enriching and broadening OR’s epistemologies through complexity theory, especially as developed by Cilliers, may also open up new perspectives in OR (as is the case for the other disciplines listed above) and may provide a less mechanistic approach to and understanding of OR than its epistemological nature generally allows for. New perspectives on OR might be gained, for example, in terms of (1) emergence; (2) the setting of boundaries; (3) the lack of complete knowledge; and (4) the responsibility (ethics) for choices and consequences regarding definitions of boundaries.
5.1. Emergence
Cilliers [54] claims that one of the defining characteristics of a complex system is that it will have emergent properties that cannot simply be reduced to the properties of components in the system. This characteristic of emergence is defined by Checkland [14] (p. 314) as “the principle that whole entities exhibit properties which are meaningful only when attributed to the whole, not to its parts—e.g., the smell of ammonia”. Having indicated that OR functions within complexity, the question arises as to what extent OR applications relate to emergent properties; in other words, how can emergence broaden OR’s epistemology? What can be expected to emerge in an OR application and what are the benefits of such emergence?
Complexity emerges as a result of the patterns of interaction between elements in a system [15]. In an OR application (such as the facility location problem described earlier), the dynamic and non-linear interaction amongst elements may lead to a number of emergent properties. To start with, one would hope that, after some time, the interaction of elements would lead to the emergence of a better understanding of OR and the application. As time passes, stakeholders and communities may develop a consciousness of the application, its interaction with the environment and its consequences. The building of a warehouse in a specific region may impact on economic activities, which may lead to some form of competition amongst economic units that may ultimately change each of them into a more efficient and effective unit, thereby driving the overall economic structure or system towards greater efficiency and effectiveness. Other examples of properties that may emerge over time include operational changes (the way things are done), changes in social structures and the development of new eco-systems. Paul, et al. [57] warn that the magnitude of an emergent property cannot be quantified. This is due to the open-boundary and non-linear characteristics of a complex system. It is also true in the case of an OR application, as it is not possible to incorporate all elements that will interact with or have an effect on the application. The non-linear interactions of elements in an OR system (e.g., a small model with a large economic or social impact) make the quantification of emergent properties impossible.
5.2. Setting of Boundaries
Other specific issues that need to be mentioned when dealing with complexity as an alternative epistemology are, firstly, the issue of boundaries, which leads directly to the issue of knowledge. An OR application implies solving a specific real-world problem. To solve such a real-world problem, it has to be framed in a specific way, as it would be impossible to try to solve the problem by involving all of reality. Furthermore, for something to be recognisable as a system, it must be bounded in some way [58]. The OR specialist, together with a team of stakeholders, therefore needs to determine the extent of the system to be studied or modelled. The only way to achieve this is by setting boundaries and, according to Audouin, et al. [59] (p. 3), “such boundaries, whether conceptual, spatial or temporal, for example, are essential, as they enable the generation of knowledge”. However, drawing boundaries is not a simple, straightforward exercise. Cilliers [47] warns that closure by a boundary should not be over-emphasised, as one can never describe it objectively. According to him, a boundary should be thought of as something that constitutes that which is bounded and not as something that separates one thing from another.
A second point concerning boundaries raised by Cilliers is that a system should not be visualised as something contiguous in space; parts of a system may exist in totally different spatial locations. The idea of boundaries is not something new, not even in the OR discipline. The work of Midgley, et al. [34] serves as an example of boundaries in an OR application, while Velez-Castiblanco, et al. [60] provide details on how teams of OR practitioners may explore the boundaries of intervention in an OR application. The work of Midgley [7] and Ulrich [9] introduced in Section 3.1 also confirms the existence and importance of boundaries in the work of other researchers. This reiterates how well complexity principles are aligned with the boundary arguments presented by other researchers, such as Midgley and Ulrich, and shows the complementary nature of theories such as complexity theory. On this point, the new insight gained (or at least emphasised) through Cilliers’ work on complexity theory is that OR’s epistemology should accommodate (or be broadened by) the fact that boundary setting is artificial, not objective and temporary.
5.3. Lack of Complete Knowledge
Closely related to the issue of boundaries is the aspect of knowledge. Cilliers [47] argues that to fully understand a complex system, one has to understand all its complexity as well as the system’s complete environment. This is, of course, not possible and is the reason why boundary setting is necessary. The implication is that one can never have complete knowledge of a complex system but only knowledge in terms of a certain framework. Cilliers [61] further explains that the generation of knowledge in a complex system is an exploratory process and that the knowledge is always provisional. This is important for OR applications: by accepting the boundary and knowledge principles of a complex system, the OR professional will remain aware of the existence of a diverse set of stakeholders and other elements that may impact the OR application, while setting boundaries at the same time to make the practical problem and possible solutions tractable. As with boundaries, Cilliers’ view of complexity aligns well with the ideas of Midgley concerning knowledge. Midgley, et al. [34] (p. 160) state, for example, “if we accept the systems idea that everything is ultimately connected, then no theoretical knowledge, however well elaborated, can accurately reflect reality”.
The argument concerning the provisional nature of knowledge in a ‘bounded’ system may lead to a charge of relativism but this charge is rejected by Cilliers and others. Woermann [62] pointed out that relativism (being relative to other things) only makes sense when it is contrasted with absolutism (standing in no relation to anything); the fear of relativism is therefore something that haunts absolutism. Cilliers’ view is that “limited knowledge is not ‘any’ knowledge” [61] (p. 260) and that complexity is not an excuse for relativism but rather a challenge to develop a new kind of scientific understanding, one that does not argue that “sloppy” work is acceptable [63].
5.4. Responsibility (Ethics) for Choices and Consequences Regarding Definitions of Boundaries
Another issue that flows from the concepts of boundaries and provisional knowledge is that of ethics or, more specifically in this case, responsibility. When a specific framework is chosen (boundary setting) to investigate and interpret an application area or system, one cannot escape the fact that the complexity of the system has been reduced and that a certain level of uncertainty in knowledge will prevail. Cilliers [47] warns that, as a result of such boundary setting, one cannot blame the outcomes of decisions and actions on the procedure itself and that responsibility should be assumed. Woermann and Cilliers [40] confirm this by arguing that whenever models do not correspond with reality (due to boundaries), responsibility should be taken for both the intended and unintended consequences. Similarly, Audouin, et al. [59] note that boundary definitions involve choices that are essentially value based; this is also in line with the work of Ulrich [9] and Midgley [7] presented in Section 3.1. Ethics and ethical considerations are clearly important in complexity theory and more insight from a philosophical viewpoint on ethics and complexity can be found in Reference [40]. The idea of ethical considerations and responsibility in OR is widely accepted. Examples of an awareness of this can be found in References [64,65]. Both these studies focus on responsibility, with the latter concentrating not just on responsibility towards an OR client but also towards a wider audience (e.g., stakeholders, society and nature) that may be affected directly or indirectly by the results of an OR application. The work of Ormerod and Ulrich [66] also provides a comprehensive review of operational research and ethics.
To illustrate the implications of complexity thinking for OR, a facility location problem was described earlier as a typical example of an OR problem. Given the four characterising aspects of complexity (Section 5.1, Section 5.2, Section 5.3 and Section 5.4), it is fairly easy to name other well-known classical OR problems that serve as further concrete examples of how OR could be influenced by applying a complexity lens. For example, the portfolio selection problem is a fundamental OR model in modern finance that can be solved with a standard non-linear mathematical program. The problem entails the selection of a set of stocks, given a limited budget and subject to a certain level of risk and/or an expected return of the chosen portfolio. The well-known data envelopment analysis problems, where the efficiency of a set of homogeneous decision-making units (e.g., bank branches) is evaluated based on input and output variables, represent another practical example. A third example may be found in the area of manufacturing applications (e.g., a standard product mix problem); these problems use linear programming models to help plan the optimal mix of different products to be manufactured, subject to resource constraints. There are also many other standard OR problems in a wide variety of application areas that could be cited as examples for this discussion. A brief sketch of the product mix formulation is given below.
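As with the earlier examples, the following minimal sketch shows how such a product mix problem might be expressed as a linear programme. The products, resources, profits and capacities are hypothetical placeholders chosen purely for illustration, and PuLP is again used only as a convenient modelling interface.

```python
# Illustrative product mix linear programme with hypothetical data.
import pulp

products = ["p1", "p2", "p3"]
profit = {"p1": 25, "p2": 40, "p3": 30}          # profit per unit manufactured

resources = ["machine_hours", "labour_hours"]
available = {"machine_hours": 400, "labour_hours": 300}
usage = {("machine_hours", "p1"): 2, ("machine_hours", "p2"): 4, ("machine_hours", "p3"): 3,
         ("labour_hours", "p1"): 3, ("labour_hours", "p2"): 2, ("labour_hours", "p3"): 1}

model = pulp.LpProblem("product_mix", pulp.LpMaximize)
quantity = pulp.LpVariable.dicts("make", products, lowBound=0)

# Objective: maximise total profit of the chosen mix
model += pulp.lpSum(profit[p] * quantity[p] for p in products)

# Each resource is limited by its available capacity
for r in resources:
    model += pulp.lpSum(usage[r, p] * quantity[p] for p in products) <= available[r]

model.solve()
for p in products:
    print(p, quantity[p].varValue)
print("Total profit:", pulp.value(model.objective))
```

The point of the discussion that follows is that solving such a model is only part of the engagement: the boundaries, provisional knowledge and responsibilities identified above still apply once the recommended mix is implemented.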
If one considers the three examples above from a complexity perspective, it becomes clear that the majority of conventional OR problems should not simply be treated as mathematical models that seek an optimal answer while neglecting the characteristics and associated intricacies highlighted by complexity theory. All three examples are related to economic applications in which a large number of elements interact continuously. The nature of this interaction is non-linear, as a small change in one of the elements may lead to large changes in the application area. The interaction will, no doubt, lead to emergent properties (e.g., learning and a deeper understanding of the problem) that may change or render initial solutions less useful. To model the three problems, each one will have to be framed to a certain extent; this means that artificial boundaries will have to be set and defined. Owing to these boundaries, the solutions and knowledge offered by the OR models are only conditional, as the impact and consequences of the application cannot be fully known. For example, the consequences of a specific financial portfolio, a specific product mix or a recommendation on the efficiency or inefficiency of a decision-making unit may be far-reaching once the OR solutions are implemented. Finally, the consequences and the lack of complete knowledge result in ethical responsibilities for each of the three applications. This ethical responsibility applies both to the OR specialist (whose models recommend a financial portfolio, a product mix or an efficiency rating for a business unit) and to the OR client who implements the OR recommendations, with the associated consequences for humans, the environment, society et cetera.
In general, the characteristics, which are prerequisites for complexity, are to a large extent present in the context of OR applications. The non-linear interactions of elements in an OR application will lead to emergent properties, for example a deeper understanding and consciousness of the problem and its consequences. The setting of boundaries forms part of any OR application and is necessary to frame a problem properly in order to produce possible solutions. Furthermore, there is an increasing acknowledgement in OR that knowledge produced through the setting of boundaries is only provisional; a complete OR application with its impact and consequences cannot be fully known. Lastly, the issue of boundaries and the associated lack of complete knowledge emphasise and broaden the ethical responsibility of OR.
The acknowledgement of the above complexity aspects in OR and the alignment of OR’s epistemology with complexity theory could help to develop new methods for modelling decision making. Furthermore, this acknowledgement that decision making happens in a complex reality opens OR up to the benefits that other disciplines have experienced through their link with complexity. For example, classical OR could also become less concerned with predicting results and more concerned with the process (this will enable classical OR to take into account other stakeholders, society, nature etc.); be taught more effectively (i.e., with a greater sensitivity for setting boundaries and ethical implications); understand change better (and also initiate change and a willingness to adapt models and methodologies); make sense of the dynamics of emergence (i.e., expect unexpected emergent properties); and assist with the explanation of the different properties of decision making and the building of multi-level representations of the world.
The acceptance of complexity and complexity principles as a framework for understanding (epistemology), or for changing the way in which one thinks about applications or systems, is not free from criticism. Some examples of challenges to complexity theory are highlighted in the work of Morrison [67], who raised some of the difficulties of complexity theory in education. The issues raised are, however, generally applicable and not limited to the area of education (and are therefore relevant for OR as well).
One of the first issues raised is whether complexity theory is a novel theory or not. Concepts such as the ten characteristics used to describe complexity are not new and may simply be a reformulation of known aspects. In addition, it may also appear as if complexity is simply a statement of the obvious (using old concepts). Morrison also asks questions about the usefulness of complexity theory: it is mainly regarded as a post hoc explanation with limited prospective or predictive utility. It also has the disadvantage of being non-optimal, non-controllable, non-understandable and non-immediate. These characteristics do not fit with systems, applications or practical situations that seek efficiency, control, comprehensibility and immediate solutions. This also raises further questions about responsibility; for example, if one cannot predict the consequences of one’s actions, in what sense can one be held responsible for what happens after one’s actions? These criticisms should be taken into account when dealing with OR in the context of complexity, because they can further enrich the new directions that OR may follow.