# Triadic Automata and Machines as Information Transformers

## Abstract


## 1. Introduction

## 2. Algorithms, Automata and Machines

**Definition 1.**

- Physically represented algorithms, e.g., the hardware of computers;
- Structurally represented algorithms, e.g., structures of computer programs or of transition rules of finite automata;

- Instrumental algorithms, e.g., algorithms in the form of an automaton;
- Textual algorithms, e.g., a system of instructions or rules;
- Numeric algorithms, e.g., algorithms as weights in neural networks.

- Operationally expressed textual algorithms, e.g., systems of instructions;
- Functionally expressed textual algorithms, e.g., partially recursive functions;
- Intentionally expressed textual algorithms giving only descriptions of what is necessary to do, e.g., programs in functional or relational programming languages.

- Parametric algorithms, e.g., weights in neural networks;
- Instruction algorithms, e.g., rules in Turing machines;
- Description algorithms, e.g., programs in functional or relational programming languages.
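The contrast between operationally expressed and parametric representations can be made concrete with a small sketch (Python; all function names are illustrative and not from the paper): the same Boolean function is given once as explicit instructions and once as numeric weights, in the spirit of the neural-network example above.

```python
def textual_and(x, y):
    # Textual (operational) algorithm: explicit instructions say what to do.
    return 1 if x == 1 and y == 1 else 0

def parametric_and(x, y):
    # Parametric algorithm: the "program" is the numbers w1, w2, b.
    # A single threshold neuron; changing the weights changes the algorithm.
    w1, w2, b = 1.0, 1.0, -1.5
    return 1 if w1 * x + w2 * y + b > 0 else 0

# Both representations compute the same function.
for x in (0, 1):
    for y in (0, 1):
        assert textual_and(x, y) == parametric_and(x, y)
```

The design point is that in the second form the algorithm is carried entirely by data (the weights), which is what makes retraining a network a change of algorithm without a change of text.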

**Definition 2.**

**Definition 3.**

**Definition 4.**

## 3. Inner Structure of Abstract Automata

**Definition 5.**

**Definition 6.**

**Definition 7.**

**Definition 8.**

**Definition 9.**

- The control device C_{M}, which is a finite automaton and represents states of the machine (automaton) M;
- The memory W_{M}, which stores data;
- The processor P_{M}, which transforms (processes) information (data) from the input and the memory W_{M}.

The memory W_{M} consists of cells and connections between them. Each cell can be empty or contain a symbol from the alphabet A_{M} of the machine (automaton) M.

The processor P_{M} observes one cell from the memory W_{M} at a time, and can change the symbol in this cell and go to another cell using connections in the memory W_{M}. These operations are performed according to the instructions R_{M} for the processor P_{M}. These instructions R_{M} can be stored in the processor P_{M} or in the memory W_{M}.
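The triad of control device, memory of connected cells, and processor acting on one observed cell under instructions R can be pictured as a minimal simulator. This is only a sketch: the rule format, the rule set R, and all names below are illustrative, not the paper's construction.

```python
def run(rules, tape, state="q0", final="halt"):
    # memory W: cells indexed by position; the control device is the state.
    memory = dict(enumerate(tape))
    head = 0  # the processor observes one cell at a time
    while state != final:
        symbol = memory.get(head, "_")
        state, new_symbol, move = rules[(state, symbol)]
        memory[head] = new_symbol                 # change the observed cell
        head += {"R": 1, "L": -1, "N": 0}[move]   # follow a connection
    return "".join(memory[i] for i in sorted(memory) if memory[i] != "_")

# Instructions R: invert every bit of a binary word, then halt at the blank.
R = {
    ("q0", "0"): ("q0", "1", "R"),
    ("q0", "1"): ("q0", "0", "R"),
    ("q0", "_"): ("halt", "_", "N"),
}
print(run(R, "1011"))  # -> 0100
```

Here the instructions R are stored outside the processor and passed in, matching the remark that R_{M} may reside either in the processor or in the memory.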

**Example 1.**

- The linguistic structure L = (Σ, Q, Ω), where Σ is a finite set of input symbols, Q is a finite set of states, and Ω is a finite set of output symbols of the automaton G;
- The state structure S = (Q, q_{0}, F), where q_{0} is an element from Q that is called the start state and F is a subset of Q that is called the set of final (in some cases, accepting) states of the automaton G;
- The action structure, which is traditionally called the transition function of G and has the following form:

δ_{1}: Σ × Q → Q

δ_{2}: Σ × Q → Ω
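The two-part transition function of such an automaton can be sketched directly as lookup tables. The parity-checking machine below is an illustrative example (state names, output symbols, and the keying of the tables as (state, symbol) pairs are my choices, not the paper's):

```python
def mealy(word, q0="even"):
    # delta1: next-state function (keyed here as (state, symbol)).
    delta1 = {("even", "0"): "even", ("even", "1"): "odd",
              ("odd", "0"): "odd", ("odd", "1"): "even"}
    # delta2: output function into the output alphabet Omega = {e, o}.
    delta2 = {("even", "0"): "e", ("even", "1"): "o",
              ("odd", "0"): "o", ("odd", "1"): "e"}
    q, out = q0, []
    for a in word:
        out.append(delta2[(q, a)])  # emit an output symbol
        q = delta1[(q, a)]          # move to the next state
    return "".join(out), q

print(mealy("1101"))  # -> ('oeeo', 'odd')
```

The separation of δ_{1} and δ_{2} is exactly the separation of the state structure from the output behavior: deleting delta2 leaves an acceptor, deleting the input handling leaves a generator.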

**Example 2.**

**Example 3.**

- Straightforward or prescriptive instructions directly tell what is necessary to do.
- Descriptive instructions describe what result it is necessary to obtain.
- Implicit instructions have a form of data that can be interpreted as instructions.
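The three kinds of instructions can be contrasted in a short sketch (Python; the stack-machine format of the implicit program is an illustrative choice): prescriptive code states each step, descriptive code names the result, and implicit instructions are plain data until an evaluator interprets them.

```python
def prescriptive(xs):
    # Prescriptive instructions: directly tell what is necessary to do.
    total = 0
    for x in xs:
        total += x
    return total

# Descriptive instructions: describe what result it is necessary to obtain.
descriptive = sum

# Implicit instructions: data that can be interpreted as instructions.
program_as_data = [("push", 2), ("push", 3), ("add", None)]

def evaluate(program):
    # An evaluator that turns the data above into behavior.
    stack = []
    for op, arg in program:
        if op == "push":
            stack.append(arg)
        elif op == "add":
            stack.append(stack.pop() + stack.pop())
    return stack.pop()

print(prescriptive([2, 3]), descriptive([2, 3]), evaluate(program_as_data))  # -> 5 5 5
```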

**Example 4.**

**Example 5.**

**Example 6.**

## 4. Structure of Triadic Automata and Machines

**Definition 10.**

- a hardware modification machine (automaton) if it transforms only infware and hardware,
- a software modification or symmetric machine (automaton) if it transforms only infware and software,
- a transducer if it transforms only infware and has input and output,
- a generator if it transforms only infware and has only output,
- an acceptor if it transforms only infware and has only input,
- a hardware expansion machine (automaton) if it only expands its hardware,
- a software expansion machine (automaton) if it only expands its software, and
- a symmetric expansion machine (automaton) if it only expands its hardware and software.

**Definition 11.**

- The control device C_{A}, which is a finite automaton and represents states of the machine (automaton) A;
- The data memory W_{A}, which stores data and includes input and output registers;
- The software memory V_{A}, which stores the software of the machine (automaton) A;
- The data processor P_{M}, which transforms (processes) information (data) from the memory W_{M};
- The software processor D_{M}, which transforms (processes) the software of A stored in the memory V_{M};
- The metaprocessor P_{A}, which transforms (e.g., builds or deletes connections in) the hardware H_{A} and/or changes the control device C_{A}.
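The division of labor between the data processor and the software processor can be pictured with a toy self-modifying machine. This is only a sketch under stated assumptions (the class name, the single-rule software memory, and the doubling modification are all illustrative, not the paper's construction):

```python
class TriadicMachine:
    def __init__(self):
        self.W = []                          # data memory
        self.V = {"step": lambda x: x + 1}   # software memory: current program

    def data_processor(self, x):
        # Applies the currently stored software to data.
        return self.V["step"](x)

    def software_processor(self):
        # Transforms the software itself: wraps the stored rule in a new one.
        old = self.V["step"]
        self.V["step"] = lambda x: old(x) * 2

m = TriadicMachine()
m.W.append(m.data_processor(3))   # 3 + 1 = 4, under the original software
m.software_processor()            # the machine rewrites its own program
m.W.append(m.data_processor(3))   # (3 + 1) * 2 = 8, under the modified software
print(m.W)  # -> [4, 8]
```

A metaprocessor would act one level further up, changing the machine's own hardware or control device rather than its stored program; that level is not modeled in this sketch.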

**Definition 12.**

- The control device C_{H}, which is a finite automaton and represents states of the machine (automaton) H;
- The data memory W_{H}, which stores data;
- The instruction memory V_{H}, which stores instructions;
- The data processor P_{M}, which transforms (processes) information (data) from the memory W_{M};
- The instruction processor D_{M}, which transforms (processes) information (instructions) from the memory V_{M};
- The memory processor P_{W}, which transforms (builds or deletes connections and/or cells in) the memory W_{M};
- The memory processor P_{V}, which transforms (e.g., builds or deletes connections and/or cells in) the memory V_{M}.

**Definition 13.**

Machines that:

- do not have processors that transform memory are called symmetric instruction machines,
- have only processor(s) that transform data are called instruction machines,
- have only processor(s) that transform instructions are called translation machines or translators,
- have only processor(s) that transform memory are called construction machines or constructors,
- do not have processors that transform data are called constructors (construction machines) with translators, and
- do not have processors that transform instructions are called generative instruction machines.

- In the static program (static software) of the machine M, everything is constructed before M starts working.
- In the growing program (growing software) of the machine M, parts are constructed while M is working, but no parts are deleted.
- In the dynamic program (dynamic software) of the machine M, some parts are constructed and some parts are deleted, when it is necessary, while M is working.

- In the static hardware of the machine M, everything is constructed before M starts working.
- In the growing hardware of the machine M, parts are constructed while M is working, but no parts are deleted.
- In the dynamic hardware of the machine M, some parts are constructed and some parts are deleted, when it is necessary, while M is working.
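The difference between growing and dynamic memory can be sketched in a few lines (an illustrative model; the class names and the dictionary-of-cells representation are my assumptions): a growing memory may only construct cells while working, whereas a dynamic memory may also delete them.

```python
class GrowingMemory:
    def __init__(self):
        self.cells = {}

    def write(self, addr, value):
        self.cells[addr] = value   # new cells may be constructed while working...

    def delete(self, addr):
        raise RuntimeError("growing memory never deletes cells")

class DynamicMemory(GrowingMemory):
    def delete(self, addr):
        self.cells.pop(addr, None)  # ...and, when necessary, deleted

g, d = GrowingMemory(), DynamicMemory()
g.write(0, "a")
d.write(0, "a")
d.delete(0)                         # allowed only for the dynamic memory
print(len(g.cells), len(d.cells))  # -> 1 0
```

The same two disciplines apply, one level down, to hardware: a growing machine only adds components, while a dynamic machine may also dismantle them.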

## 5. The Dynamics of Triadic Automata and Machines

**Definition 14.** Two classes **H** and **K** are operationally equivalent if each automaton in **H** is operationally equivalent to an automaton in **K** and vice versa.

**Definition 15.** Two classes **H** and **K** are strictly operationally equivalent if each automaton in **H** is strictly operationally equivalent to an automaton in **K** and vice versa.

**Definition 16.** Two classes **H** and **K** are functionally equivalent if the automata, machines or algorithms from **H** and **K** compute the same class of functions.

**Lemma 1.**

**Definition 17.** Two classes **H** and **K** are linguistically equivalent if they compute (or accept) the same class of languages.
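For concrete classes of finite automata, linguistic equivalence of two given acceptors is decidable. A minimal sketch (Python; the product-construction check and both example DFAs are illustrative, not from the paper): two DFAs accept the same language exactly when no reachable pair of states in their product disagrees on acceptance.

```python
from collections import deque

def equivalent(d1, d2, alphabet):
    # Each DFA is (start_state, transition dict, set of accepting states).
    (s1, t1, f1), (s2, t2, f2) = d1, d2
    seen, queue = {(s1, s2)}, deque([(s1, s2)])
    while queue:
        p, q = queue.popleft()
        if (p in f1) != (q in f2):
            return False        # some word reaching (p, q) separates the languages
        for a in alphabet:
            pair = (t1[(p, a)], t2[(q, a)])
            if pair not in seen:
                seen.add(pair)
                queue.append(pair)
    return True

# Two 2-state DFAs over {0, 1}, both accepting words with an even number of 1s.
A = ("e", {("e", "0"): "e", ("e", "1"): "o", ("o", "0"): "o", ("o", "1"): "e"}, {"e"})
B = ("p", {("p", "0"): "p", ("p", "1"): "q", ("q", "0"): "q", ("q", "1"): "p"}, {"p"})
print(equivalent(A, B, "01"))  # -> True
```

For richer models such decisions are in general impossible, which is why the equivalences below are studied at the level of whole classes rather than individual machines.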

**Lemma 2.**

**Corollary 1.**

**Definition 18.** A class **R** is called linguistically (functionally) recursive if it is linguistically (functionally) equivalent to the class of all Turing machines.

**Example 7.**

**Example 9.**

**Theorem 1.** A class **H** is linguistically recursive if and only if it is linguistically equivalent to the class {U}, where U is a universal Turing machine.

**Definition 19.** A class **W** is called linguistically (functionally) subrecursive if not all languages (functions) computable/acceptable in the class of all Turing machines are computable/acceptable in **W**.

**Example 11.**

**Example 12.**

**Example 13.**

**Definition 20.** A class **U** is called linguistically (functionally) superrecursive if not all languages (functions) computable/acceptable in **U** are computable/acceptable in the class of all Turing machines.

**Example 14.**

**Example 15.**

**Example 16.**

**Definition 21.** A class **U** is called strictly linguistically (functionally) superrecursive if all languages (functions) computable/acceptable in the class of all Turing machines are also computable/acceptable in **U**.

**Example 17.**

**Example 18.**

**Example 19.**

**Example 20.**

**Lemma 3.**

**Definition 22.**

**Example 21.**

**Example 22.**

**Theorem 2.**

**Definition 23.**

**Example 23.**

**Proposition 1.**

**Definition 24.**

**Example 24.**

**Definition 25.**

**Example 25.**

**Theorem 3.**

**Corollary 2.**

**Corollary 3.**

**Definition 26.**

**Theorem 4.**

**Theorem 5.**

**Corollary 4.**

**Definition 27.** A class **D** is called linguistically (functionally) plainly inductive if it is linguistically (functionally) equivalent to the class **IM**_{1} of all inductive Turing machines of the first order.
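The inductive mode of computation can be pictured with a toy sketch: an inductive machine keeps emitting candidate outputs, and its result is the value on which the output stabilizes, even though the machine itself need not halt. Observing only a bounded number of steps, as below, is an illustrative simplification (a real inductive computation has no such cutoff), and all names are my own.

```python
def inductive_result(producer, steps=50, window=10):
    # Watch a bounded prefix of the output sequence; report the last output
    # provided it has stopped changing for `window` consecutive steps.
    outputs = [producer(t) for t in range(steps)]
    tail = outputs[-window:]
    return tail[-1] if all(x == tail[-1] for x in tail) else None

# Candidate outputs 0, 7, 14, 21, 28, 35, then constantly 42: stabilizes.
print(inductive_result(lambda t: 42 if t > 5 else t * 7))  # -> 42

# An output sequence that never stabilizes yields no result here.
print(inductive_result(lambda t: t))  # -> None
```

The crucial feature this sketch preserves is that the result is defined by stabilization rather than by halting; what it cannot capture is that, in general, no finite observer can certify stabilization.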

**Example 26.**

**Example 27.**

**Example 28.**

**Theorem 6.** A class **H** is linguistically plainly inductive if and only if it is linguistically equivalent to the class {W}, where W is a universal inductive Turing machine of the first order.

**Definition 28.** A class **W** is called linguistically (functionally) plainly superinductive if not all languages (functions) computable/acceptable in **W** are computable/acceptable in the class of all inductive Turing machines of the first order.

**Theorem 7.** A class **H** is linguistically (functionally) plainly superinductive if it is linguistically (functionally) equivalent to the class **IM**_{2} of all inductive Turing machines of the second or higher order.

**Problem 1.** Are there subclasses of the class **IM**_{2} of all inductive Turing machines of the second order that are computationally weaker than **IM**_{2} but are still linguistically or functionally plainly superinductive?

## 6. Conclusions

## Funding

## Acknowledgments

## Conflicts of Interest

## References


© 2020 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

Burgin, M. Triadic Automata and Machines as Information Transformers. *Information* **2020**, *11*, 102. https://doi.org/10.3390/info11020102