Proceeding Paper

Complexity of Legal Processes and Systems †

Mark Burgin 1 and Kees (C.N.J.) de Vey Mestdagh 2,*
1 Department of Mathematics, University of California, Los Angeles, CA 90095, USA
2 Foundation for Law & ICT, 9700 Groningen, The Netherlands
* Author to whom correspondence should be addressed.
† Presented at the Conference on Theoretical Information Studies (TIS), Berkeley, CA, USA, 2–6 June 2019.
Proceedings 2020, 47(1), 17; https://doi.org/10.3390/proceedings2020047017
Published: 20 May 2020
(This article belongs to the Proceedings of IS4SI 2019 Summit)

Abstract

Complexity measures allow for reflecting on critical issues and estimating the efficiency of various processes and systems. To better organize the functioning of the legal domain, it is rational to use complexity measures. In essence, any legal process is an information process in the legal domain, involving one or several legal systems. Thus, the goal of this work is the development of theoretical and practical tools for estimating the complexity of information processes, that is, complexity measures in the legal domain, with the aim of improving the functioning of legal systems.

1. Introduction

Legal systems are evolutionary social systems, i.e., in some sense, natural behavioral systems, although many like to think that they are merely artificial constructions created by people. The latter idea has resulted in the amorphous areas of normative ethics and legal studies, although the legal domain needs meta-ethics, descriptive ethics, ethnology and ethology. With this in mind, if we analyze the essential characteristics of legal systems, we can see that they are the result and product of millions of years of evolution in a changing world. Law consists of formal rules describing expectations concerning a class of human behavior. The origin of these formal rules is an individual or collective process that translates values into individual and general norms, norms into rules, and rules into social standards and formal laws (the law). A value is an innate or social need. The argument for the evolutionary character of legal systems is made in [1,2]. The fact that many locally parallel and temporally sequential legal systems have the same characteristics does not prove that they are designed like, for example, cars; rather, it provides evidence that they have developed on the basis of the same natural characteristics intrinsic to different societies.
To study legal systems with the goal of improving their functioning, we use complexity theory, because mathematical models of complexity describe efficiency. Indeed, complexity measures allow for reflecting on critical issues and estimating the efficiency of various processes and systems. Thus, to better organize the functioning of the legal domain, it is rational to use complexity measures, which have been constructed in complexity theory and applied in a variety of scientific and practical areas. In essence, any legal process is an information process in the legal domain, involving one or several legal systems. Thus, our goal is the development of theoretical and practical tools for estimating the complexity of information processes, that is, complexity measures, in the legal domain, with the aim of improving the functioning of the legal domain. Knowledge of the dynamic and structural characteristics reflecting the complexity of information processes in the legal domain can improve the results of these processes, decrease their complexity and better satisfy the needs of the society in which these processes take place.
To elaborate legal complexity measures, it is necessary to have formal descriptions of legal processes. Here, we use the Logic of Reasonable Inferences (LRI) as the tool for building a formal model of legal knowledge and formalizing descriptions of legal processes. The LRI was introduced in [3] and utilized for the development of the computer program Argumentator, which was implemented and empirically validated against a multitude of real-life legal cases [4]. In [5], a formal model of the factors that determine complexity in the legal domain is presented.
To develop legal complexity measures, we use the theory of direct and dual complexity measures developed in [6,7], as well as inductive complexity, introduced and studied in [8]. According to these theories, complexity is first measured for separate processes and then integrated into the complexity of the system in question.

2. Complexity of Systems and Processes

The complexity of a system R is determined by the resources used by this system in the processes of its functioning. Note that these resources can be used by the system itself and/or by another system interacting with, e.g., using, the system R.
Systems use different kinds of resources:
  • Natural resources consumed by the system R: time, space, information, energy, power, minerals and so on;
  • Social resources consumed by the system R: people involved, their time, efforts, expertise, knowledge and so on;
  • Artificial resources consumed by the system R: system time, system space, data, knowledge, memory, system units, system actions and so on.
There are three types of direct complexity measures of a system or a process: static, functional and processual complexity measures.
Definition 1.
Static complexity measures depend on the system in which the process goes on, or on the system that performs (realizes) the process, and are estimated by the resources given as input to the process.
Examples of such resources are the system itself, the program of its functioning or the input information.
Definition 2.
Functional complexity measures depend both on the input, e.g., an algorithm or program, and on the output of the process that goes on in the system or is realized by it.
As examples, we can take such measures as the quality of the result obtained by the system or the ratio of the size of the input information to the size of the output information.
Definition 3.
Processual complexity measures depend on the process itself.
As examples, we can take such measures as the time needed to process some given data or the volume of memory demanded by this processing.
In general, there are three classes of complexity measures: direct, dual, and mixed complexity measures. Let us consider some examples of complexities for such systems as algorithms and programs; a small computational sketch follows these lists.
Direct complexity measures:
  • The length of an algorithm or program (static complexity);
  • The quality of the result (functional complexity);
  • Time of the computation directed by the algorithm or program (processual complexity).
Dual complexity measures:
  • The length of the shortest algorithm or program needed to obtain the given system (static complexity);
  • The best quality of the possible result (functional complexity);
  • The least time needed to obtain the given system (processual complexity).
Mixed complexity measures:
  • Computational complexity, e.g., time, of the shortest program that produces the given system.
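To make the distinction concrete, here is a minimal Python sketch; the toy "computer", its program format and all names are illustrative assumptions, not constructions from the paper. It contrasts a direct static measure (the length of a given program) with its dual (the length of a shortest program that obtains a given result).

def run(program: str) -> str:
    """Toy computer: a program such as '3xab' means 'repeat "ab" three times'."""
    count, body = int(program[0]), program[2:]
    return body * count

def static_complexity(program: str) -> int:
    # Direct static measure: a resource given as input, here the program length.
    return len(program)

def dual_static_complexity(result: str, candidates: list) -> int:
    # Dual measure: the minimal direct measure over all candidates obtaining the result.
    return min(static_complexity(p) for p in candidates if run(p) == result)

programs = ["2xabab", "4xab", "1xabababab"]          # all compute "abababab"
print([static_complexity(p) for p in programs])      # direct: [6, 4, 10]
print(dual_static_complexity("abababab", programs))  # dual: 4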
It is possible to define the complexity of a system (organization) by the processes, which either go on in this system (organization) or are performed (organized) by this system (organization); a numerical illustration follows this list.
  • Worst-case complexity of a system (organization) is the maximal measure of the processes, which either go on in this system (organization) or are performed (organized) by this system (organization);
  • Average-case complexity of a system (organization) is the average measure of the processes, which either go on in this system (organization) or are performed (organized) by this system (organization);
  • Best-case complexity of a system (organization) is the minimal measure of the processes, which either go on in this system (organization) or are performed (organized) by this system (organization).
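As a numerical illustration, a minimal sketch with assumed, purely illustrative process measures: the maximum, mean and minimum of the process measures give the worst-case, average-case and best-case complexity of the system.

from statistics import mean

# Assumed time measures of the processes going on in one system (illustrative units).
process_times = {"filing": 3.0, "hearing": 12.0, "appeal": 40.0}

worst_case = max(process_times.values())      # 40.0
average_case = mean(process_times.values())   # about 18.3
best_case = min(process_times.values())       # 3.0
print(worst_case, average_case, best_case)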
Given a direct complexity measure μ, we define its dual complexity measure μ°(R) of a system R as the minimal value μ(A) over all systems, programs or processes A that allow obtaining R, e.g., by construction or computation.
For instance, algorithmic (Kolmogorov) complexity is a paradigmatic example of a dual complexity measure. The algorithmic (Kolmogorov) complexity of a string of symbols R is the minimal length of a program (algorithm) that allows computing (recognizing) R [9,10]. In general, algorithmic (Kolmogorov) complexity measures have been defined and explored in an axiomatic setting. This allows the application of these measures in a variety of domains (such as medicine, biology, neurophysiology, physics, economics, and hardware and software engineering) in general, and in the legal domain in particular.
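Although Kolmogorov complexity itself is incomputable, the compressed size of a string gives a computable upper bound that is widely used as a practical proxy (cf. [13]). The following sketch of this standard approximation uses illustrative inputs; it is not a construction from the paper.

import random
import zlib

def compressed_size(s: str) -> int:
    """Computable upper-bound proxy (in bytes) for the algorithmic complexity of s."""
    return len(zlib.compress(s.encode("utf-8"), 9))

regular = "ab" * 500                       # highly regular string
random.seed(0)
irregular = "".join(random.choice("ab") for _ in range(1000))

print(compressed_size(regular))    # small: the compressor exploits the regularity
print(compressed_size(irregular))  # much larger: little structure to exploit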
There are also temporally related complexity measures:
  • The estimated complexity of a system or process reflects opinions about the complexity of this system or process;
  • The projected complexity of a system or process reflects intentions or requirements concerning the complexity of this system or process;
  • The actualized complexity of a system or process is the complexity of this system or process measured a posteriori.
Analyzing social processes in general and legal processes in particular, we see that almost any process P is based on the process org P of preparation and organization of P. In addition, there is the process act P of putting the results of P into action.
For instance, taking such a process as a court trial, we see that its preparation includes investigation, while putting its decision into action includes law enforcement.
As a result, we have three temporal legal complexity measures:
  • The temporal organizational complexity of a legal process is the time used for organization of the legal process;
  • The temporal operational complexity of a legal process is the time used for conducting the legal process to a definite decision;
  • The temporal execution complexity of a legal process is the time used for putting the legal decision into action.
In turn, there are three extent legal complexity measures:
  • The extent organizational complexity of a legal process is the number of items used for organization of the legal process;
  • The extent operational complexity of a legal process is the number of items used for conducting the legal process to a definite decision;
  • The extent execution complexity of a legal process is the number of items used for putting the legal decision into action.
Both temporal and extent complexity measures are direct; dual complexity measures are obtained by applying a minimization operation to direct complexity measures, as the sketch below illustrates.
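A minimal sketch of these measures under assumed data structures (all names and numbers are illustrative): each admissible realization of a legal process records the time and the number of items used in its phases; direct measures are read off a given realization, while dual measures minimize over all realizations.

from dataclasses import dataclass

@dataclass
class Realization:
    org_time: float    # temporal organizational complexity
    op_time: float     # temporal operational complexity
    exec_time: float   # temporal execution complexity
    op_items: int      # extent operational complexity

# Two admissible realizations of the same legal process (assumed numbers).
realizations = [
    Realization(org_time=10.0, op_time=30.0, exec_time=5.0, op_items=40),
    Realization(org_time=14.0, op_time=22.0, exec_time=4.0, op_items=31),
]

# Dual temporal operational complexity: the least time to reach a decision.
print(min(r.op_time for r in realizations))   # 22.0
# Dual extent operational complexity: the fewest items to reach a decision.
print(min(r.op_items for r in realizations))  # 31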

3. Complexity in the Legal Domain

Items utilized in legal processes, which we will call legal articles, can be:
  • instances of material evidence, e.g., a weapon or fingerprints;
  • participants of the process, e.g., attorneys, judges, witnesses;
  • those who prepare (organize) the process, e.g., detectives, attorneys, judges;
  • testimonies, depositions, speeches of participants during the process, negotiations, subpoenas, requests, interrogatories, affidavits, transcripts, objections, statements;
  • legal rules and laws.
Rules for conducting legal processes, e.g., the laws of the country, are basic items (legal articles) utilized in legal processes. This delineates an important class of legal complexity measures, namely of extent operational complexities: imperative complexity measures.
The imperative complexity of a legal process is the number of legal rules used for conducting the legal process to a definite decision.
There are three modalities of legal rules (laws); a small data-modeling sketch follows this list:
  • Obligatory rules (laws) describe what is required to do in a legal process;
  • Discretionary rules (laws) describe what is permitted (possible) to do in a legal process;
  • Veto rules (laws) describe what is prohibited to do in a legal process.
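As a data-modeling sketch (with illustrative names and rules, not taken from the paper), the three modalities can be encoded as a simple type, so that a rule base can be filtered by modality and the imperative complexity obtained as a count of the rules used:

from dataclasses import dataclass
from enum import Enum

class Modality(Enum):
    OBLIGATORY = "required"       # what is required to do
    DISCRETIONARY = "permitted"   # what is permitted (possible) to do
    VETO = "prohibited"           # what is prohibited to do

@dataclass(frozen=True)
class Rule:
    name: str
    modality: Modality

# Assumed rules used in reaching a decision (illustrative).
used_rules = [
    Rule("file_charges", Modality.OBLIGATORY),
    Rule("call_expert", Modality.DISCRETIONARY),
    Rule("no_hearsay", Modality.VETO),
]

imperative_complexity = len(used_rules)  # number of legal rules used: 3
print(imperative_complexity)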
Now let us look at dual complexity measures of legal processes.
  • The obligatory (requisite) organizational algorithmic complexity of a legal process is the minimal number of items (legal articles) necessary (sufficient) to organize the legal process;
  • The obligatory (requisite) operational algorithmic complexity of a legal process is the minimal number of items (legal articles) necessary (sufficient) to conduct the legal process to a definite decision;
  • The obligatory (requisite) execution algorithmic complexity of a legal process is the minimal number of items (legal articles) necessary (sufficient) for putting the legal decision into action.
It is possible to consider only some types of legal articles, e.g., rules or laws, estimating the complexity of a legal process by the (minimal) number of rules or laws used. However, for the sides involved in a legal process, it is important not only to organize, conduct and put into action a legal process, but also to achieve definite results in this process. These considerations bring us to new types of algorithmic complexity; a brute-force sketch of computing such minimal sets follows the list below.
Let us suppose that there are sides (participants) A1, … , An in a legal process P.
  • The Ak-winning organizational algorithmic complexity of a legal process is the minimal number of items (legal articles) necessary (sufficient) to organize a legal process that is successful for Ak.
  • The Ak-winning operational algorithmic complexity of a legal process is the minimal number of items (legal articles) necessary (sufficient) to conduct a legal process that is successful for Ak.
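A brute-force sketch of such minimal-set computations (our illustration, not the paper's algorithm): it returns the size of a smallest set of legal articles satisfying a goal predicate, where reaches_decision and wins_for_k stand for predicates that a formal model of the process, such as the LRI model [3], would have to supply.

from itertools import combinations

def minimal_articles(articles, goal):
    """Size of a smallest subset of articles for which goal(subset) holds."""
    for size in range(len(articles) + 1):
        for subset in combinations(articles, size):
            if goal(set(subset)):
                return size
    return None  # the goal is unreachable with these articles

articles = {"rule1", "rule2", "rule3", "statute7"}
reaches_decision = lambda s: {"rule1", "statute7"} <= s      # toy predicate
wins_for_k = lambda s: reaches_decision(s) and "rule3" in s  # toy predicate

print(minimal_articles(articles, reaches_decision))  # operational complexity: 2
print(minimal_articles(articles, wins_for_k))        # Ak-winning complexity: 3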
Further research will focus first of all on the legal articles distinguished in the formal model of the factors that determine complexity in the legal domain [5].

4. Conclusions

We have elaborated a systemic approach to estimating the complexity of information processes in the legal domain, aimed at upgrading and modernizing the functioning of the legal domain on the basis of information technology.
An interesting problem for future research is the construction of legal complexity measures based on Shannon's entropy [11] and Fisher's statistical information measures [12], as there are intrinsic relations between the complexity measures introduced and studied in this paper and the above-mentioned information measures [13,14].
Communication is an important component of diverse legal processes. Consequently, it would be important to construct and study communication legal complexity, taking into account the communication complexity of algorithmic processes [15].

Funding

This research received no external funding.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. de Vey Mestdagh, C.N.J.; Hoepman, J.H. Inconsistent Knowledge as a Natural Phenomenon: The Ranking of Reasonable Inferences as a Computational Approach to Naturally Inconsistent (Legal) Theories. In Information and Computation, Essays on Scientific and Philosophical Understanding of Foundations of Information and Computation; Dodig-Crnkovic, G., Burgin, M., Eds.; World Scientific: Hackensack, NJ, USA, 2011; pp. 439–476.
  2. de Vey Mestdagh, C.N.J. What Is Law? Paper for the Workshop Governance Meets Law, 23–24 June 2011; University of Groningen: Het Kasteel, Groningen, The Netherlands, 2011.
  3. de Vey Mestdagh, C.N.J.; Verwaard, W.; Hoepman, J.H. The Logic of Reasonable Inferences. In Legal Knowledge Based Systems, Model-Based Legal Reasoning; Breuker, J.A., de Mulder, R.V., Hage, J.C., Eds.; Vermande: Lelystad, The Netherlands, 1991; pp. 60–76.
  4. de Vey Mestdagh, C.N.J. Legal Expert Systems. Experts or Expedients? The Law in the Information Society. In Proceedings of the Fifth International Conference of the Italian National Research Council, Florence, Italy, 2–5 December 1998; Ciampi, C., Marinai, E., Eds.; Istituto per la documentazione giuridica del Centro Nazionale delle Ricerche: Firenze, Italy, 1998; p. 8.
  5. de Vey Mestdagh, C.N.J. A Model of Complexity for the Legal Domain. In Proceedings of the IS4SI 2017 Summit Digitalisation for a Sustainable Society, Gothenburg, Sweden, 12–16 June 2017; Volume 1, p. 192.
  6. Burgin, M.S. Generalized Kolmogorov Complexity and other Dual Complexity Measures. Cybernetics 1990, 26, 481–490.
  7. Burgin, M. Algorithmic Complexity of Computational Problems. Int. J. Comput. Inf. Technol. 2010, 2, 149–187.
  8. Burgin, M. Algorithmic Complexity of Recursive and Inductive Algorithms. Theor. Comput. Sci. 2004, 317, 31–60.
  9. Kolmogorov, A.N. Three approaches to the definition of the quantity of information. Probl. Inf. Transm. 1965, 1, 3–11.
  10. Chaitin, G.J. On the Length of Programs for Computing Finite Binary Sequences. J. Assoc. Comput. Mach. 1966, 13, 547–569.
  11. Shannon, C.E. A Mathematical Theory of Communication. Bell Syst. Tech. J. 1948, 27, 379–423.
  12. Fisher, R.A. Statistical Methods and Scientific Inference; Oliver and Boyd: Edinburgh, UK, 1956.
  13. Li, M.; Vitanyi, P. An Introduction to Kolmogorov Complexity and Its Applications; Springer: New York, NY, USA, 1997.
  14. Burgin, M. Inductive Complexity and Shannon Entropy. In Information and Complexity; World Scientific: New York, NY, USA; London, UK; Singapore, 2016; pp. 16–32.
  15. Hromkovic, J. Communication Complexity and Parallel Computing; Springer: New York, NY, USA, 1997.
