Living systems are dynamically stable by computing themselves at the quantum level

Abstract: The smallest details of living systems are molecular devices that operate between the classical and quantum levels, i.e. between the potential dimension (microscale) and the actual three-dimensional space (macroscale). They realize non-demolition quantum measurements in which time appears as a mesoscale dimension separating contradictory statements in the course of actualization. These smaller devices form larger devices (macromolecular complexes), up to the living body. The quantum device possesses its own potential internal quantum state (IQS), which is maintained for a prolonged time via error-correction, the latter being a reflection over this state. A decoherence-free IQS can exhibit itself by a creative generation of iteration limits in the real world. To avoid a collapse of the quantum information in the process of correcting errors, it is possible to make a partial measurement that extracts only the error information and leaves the encoded state untouched. In natural quantum computers, which are living systems, the error-correction is internal. It is a result of reflection, given as a sort of subjective process allotting optimal limits of iteration. The IQS resembles a quasi-particle, which interacts with its surroundings by applying decoherence commands to them. In this framework, enzymes are molecular automata of the extremal quantum computer, the set of which maintains a stable, highly ordered coherent state, and the genome represents a concatenation of error-correcting codes into a single reflective set. Biological systems, being autopoietic in physical space, control quantum measurements in the physical universe. Biological evolution is really a functional evolution of measurement constraints in which limits of iteration are established that possess criteria of perfection and have selective values.


Quantum Measurement is a Non-Local Actualization
The quantum measurement process percolates micro- and macroscales. In quantum measurements, all interactions are mediated by a holistic reflective factor (an observer) measuring the interaction. This factor addresses a superposition of all possible states of the microsystem, realizing a choice between the potential states, which occurs via a macroscopic measuring device embedded within the system as its actual part ("body").
As a result of quantum measurement, a new actualized macrostate appears, evolving non-locally from the previous macrostate, since its points are not defined before the quantum measurement. This means that quantum measurement includes a reflection to a field not determined beforehand, i.e. addressing the potential field at the microlevel. Local assembly takes place when the field to which reflection is realized is already defined in the actual three-dimensional space, i.e., when a device is external to the assembling system. It corresponds to a temporal evolution of the system. When the device is embedded into the system that is measured, the positions of all points are just rearranged and singled out in the course of measurement. This is a poiesis, in which the temporal process itself is established. The same point taken before and after measurement becomes split into the image and its reflection. Quantum measurement can be expressed as a process generating a contradiction and representing a logical jump via such a contradiction. It is described as a logical procedure of inducing and addressing a fixed point [1].
An emergent system is always relevant to resolving a paradox or a logical jump. Solving it and obtaining a reflective domain is used as a new transition rule. Resolution of the paradox perpetually proceeds along time, through the flow of which micro and macro levels are connected, and any solution is relative. This type of model can be illustrated as an iterative algorithm, using a dynamically changing contraction mapping as the interface of a state and a transition rule [2]. It describes a non-local structural unfolding where contradictory consequent realizations (quantum reductions) are separated within the internal time-space.
During actualization, an uncountably infinite number of assembling states unfolds into regular series of spatial events with basically simple and reproducible structures. The selection of an appropriate solution of wave function reduction should satisfy certain limit conditions. Actualization can be viewed as a limit of the recursive process originating in quantum uncertainty. It generates an iterative process of reflection to this uncertainty via allotting it a certain value. It corresponds to a non-local assembly, which is realized as a reduction of the uncertainty in quantum measurement. The measuring device is a part of the reflective system included in it as an embedding (body). Reduction from the field of potentialities assumes the existence of alternative realizations that represent different projections into real numbers. Quantum complementarity arises as a set of these different projections that cannot exist simultaneously, where contradictory states generate the appearance of uncertainties in the coordinate/momentum or energy/time observables.

Separation of Contradictory Statements Between Scales
The structure of physical space-time follows from the basic reflective structure appearing in the actualization process. It includes a dimension of infinite extension V (from vacuum), which consists of superposed states described by imaginary numbers. It is actualized via the reduction of potentialities forming the classical three-dimensional space (3D), in which contradictory realizations are separated as series in a temporal process (T). On the edge between V and 3D+T, the measuring device R (from reduction), a subset of 3D+T ("body"), is operating.
Modern models of space-time in superstring theory explicitly follow this approach. According to Randall and Sundrum [3], an extra dimension of infinite extent supplements the three spatial dimensions we observe. The observed 3D space is actually the evolution in time of a three-brane moving through an ambient space-time of higher dimension (the infinitely large space-time). In such 'brane-world' scenarios, particle physics is confined to the brane, but the particles can interact with the ambient space-time through gravitational interactions. When all extra spatial dimensions of the ambient space-time are compact (being a subset of the potential dimension of infinite extension), these interactions can be so weak as to have escaped detection by experiments thus far. Thus, our Universe is viewed as a domain wall in this five-dimensional space.
The fifth dimension is not a single coordinate. It is the infinite dimension where the 3D and the other (compactified) dimensions are embedded. The compactified dimensions represent a subset within this potential set (vacuum), forming a mesoscale border between the quantum vacuum and the classical 3D. The actual structure of the 3D+T is deduced from its suitability for the presence of an observer. With more or less than one time dimension, the partial differential equations of nature would lack the hyperbolicity property that enables observers to make predictions. In a space with more than three dimensions, there can be no traditional atoms and no stable space structures. A space with fewer than three dimensions allows no gravitational force and is too simple and barren to contain observers [4].
The relation of the infinite and the finite is always a choice of an alternative out of a set, and the parameters of this choice cannot be deduced from initial conditions. Thus, a statement is needed that narrows the number of combinations of the values. This statement is an internal arbitrary signification realized by a device through the process of measurement. It is a reflection whose parameters are determined by the possibility of the reflection itself via a specific possible construction of the measuring device operating as a quantum computer, particularly by the introduction of its temporal parameter (the time of observation).
When contradictory statements appearing during actualization are separated by time intervals, we descend from the mathematical into the physical world and face an infinite regression that avoids the simultaneous existence of opposite definitions. A separation (selection) of contradictory states occurs via the measurement process. The temporal process is represented as a series of computable events, but it is not our computation (by which we count) but an external natural computation (which can be counted by us, i.e. represented as an objective dependence of the spatial coordinates on the time coordinate, i.e. as a physical law).

Maintenance of Hierarchical Space-Time Structure - Computation
Any measuring (computing) device uses an extra dimension to organize structures in the 3D space. Classical computers use the electromagnetic field, which is a compactified dimension in the framework of the Kaluza-Klein approach, to glue together separate points. Quantum computers will use the total extra dimension of infinite extension (vacuum) for binding separate points. As a result of measurement (computation) the observed space-time is organized, and the 3D corresponds to the optimal variant for embedding of measuring devices ("bodies") that compute (organize) the V-reality into the actual reality (3D). According to modern views, quantum computers, in order to operate without errors, should maintain decoherence-free subspaces via the implementation of error-correcting codes. The lifetime of a decoherence-free subspace is determined within the frame of Heisenberg's energy-time uncertainty relation [5]. It determines, e.g., the turnover rates of enzymes and the conformational movements of other biomacromolecules. Continuous measurement holds a decoherence-free state via the quantum Zeno effect between levels.
Decoherence-free subspaces themselves (without error-correction) are stable to perturbation in time only to first order. They fit ideally for quantum memory applications. For quantum computation, however, the stability result does not extend beyond first order. To perform robust quantum computation in decoherence-free subspaces, they must be supplemented with quantum error-correcting codes [6]. As a result, a power law appears in the system, which is introduced via hyperlinks (Gödel numbers) in the set of real numbers.
A quantum computer can be protected against decoherence for an arbitrary length of time, provided a certain threshold error rate can be achieved. Encoding the state of a quantum computer for error correction has the effect of making its operating states macroscopically indistinguishable: the more "stable" the code is, the more errors it can correct in each pass. Any two possible operating states would have to be macroscopically indistinguishable (to a degree), from the point of view of the values of averages of macroscopic variables. Quantum computers can be made stable by encoding them.
The encoded state would appear just like a state that has no information at all. For non-degenerate codes the expectation values of macroscopic observables in such encoded states are identical to those that would be obtained for a totally mixed state of maximum entropy, where again the degree of indistinguishability increases with the number of errors corrected by the code. The use of encoded states prevents one from being able to construct coherent superpositions of states that look very different macroscopically, which are the special states characterized by large decoherence rates. Instead, the more errors the code can correct, the more the possible states of the system which one might construct (and superimpose) look alike [7].
Concatenated codes involve re-encoding already encoded bits. This process reduces the effective error rate at each level, with the final accuracy being dependent on how many levels of the hierarchy are used. It is not possible to clone unknown quantum states, and the act of measuring would destroy any quantum coherence of the state. It is, however, possible to exploit entangled states supported by additional bits. To avoid a collapse of the quantum information in the process of correcting errors, it is possible to make a partial measurement that extracts only the error information and leaves the encoded state untouched.
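The level-by-level suppression of errors by concatenation can be sketched numerically with the standard threshold model from quantum computing textbooks (a general result, not specific to this article); the particular error rates below are assumed purely for illustration:

```python
# Standard threshold model of concatenated quantum codes: each level of
# concatenation maps a physical error rate p to roughly p_th * (p/p_th)^2,
# so below threshold the effective rate falls doubly exponentially.

def effective_error_rate(p: float, p_threshold: float, levels: int) -> float:
    """Effective logical error rate after `levels` of concatenation."""
    return p_threshold * (p / p_threshold) ** (2 ** levels)

# Assumed rates, for illustration only: physical error rate 1e-3,
# threshold 1e-2 (a typical order of magnitude in the literature).
p, p_th = 1e-3, 1e-2
for level in range(4):
    print(level, effective_error_rate(p, p_th, level))
```

Each added level squares the ratio p/p_th, which is the sense in which "the final accuracy depends on how many levels of the hierarchy are used".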
Quantum error-correcting methods protect information in memory. Concatenation involves applying this combination of techniques hierarchically [8]. Engineering the environment (reservoir), and therefore decoherence, may be a way to avoid complex error-correcting schemes. The decoherence rate scales with the square of a quantity describing the amplitude of the superposition states [9]. The best solution of the reservoir problem is a squeezed reservoir, where all initial states asymptotically relax to a squeezed state of motion [10].

Structure of Computation Device
Leibniz [11] defined living systems as automata exceeding infinitely all artificial automata. The machines of nature, i.e. living bodies, are machines down to their smallest details ('Monadology', § 64). In modern science it is realized that the smallest details of living systems are molecular automata [12] that operate between the classical (3D) and the quantum levels, i.e. between the potential dimension (microscale) and the actual 3D space (macroscale); they are thus mesoscale quantum devices. These smaller devices form larger devices (macromolecular complexes), up to the living body.
The quantum device possesses its own potential internal quantum state (IQS), which is maintained for a prolonged time via a reflective error-correction. It is a part representing a superposition of the potential contradictory reality (vacuum), i.e. it belongs to the microscale. The error-correction is a reflection over this state. It is concatenated within the 3D space as a molecular computer (MC). The IQS cannot be cloned, but it can exhibit itself by a creative generation of limits of iteration in the 3D world. Superpositions can exist only in quantum systems that are free from external influences. Thus the external influence should be restricted to error correction only, without disturbing the IQS. A decay from a superposition to a statistical mixture of states is called decoherence. The rate of decoherence scales exponentially with the size of the superposition.
In artificial quantum computers, whose principal basis is currently under extensive theoretical consideration, error-correction is allotted by the human constructing the device. In natural quantum computers, which are living systems, the error-correction is internal. It is a result of reflection, given as an estimation of the "state of affairs", i.e., as a sort of internal process allotting optimal limits of iteration. The IQS by its internal "decision" causes decoherence while being decoherence-free itself. The IQS is a decoherence-free subspace, which can apply decoherence to its envelope (body). This decoherence should be error-corrected. The decoherence-free state is maintained by the error-correcting code from the quantum computer. The error-correcting code is concatenated by the encoding in the genome.
The internal quantum state resembles a quasi-particle, which interacts with its surroundings by applying decoherence commands to them. For this it should possess a certain curvature, possibly of the order of the Planck mass value. The IQS is maintained by the program of error-correcting codes. In this framework, enzymes are molecular automata of the extremal quantum computer, the set of which maintains a highly ordered coherent state, and the genome represents a concatenation of error-correcting codes into a single reflective set.
The MC operates with molecular words (DNA, RNA) having definite addresses. The MC functions when an operator acts as an enzyme. The set of operators forms the program of calculation, where operators collide by Brownian movement. A program can be rearranged in the course of computation. The long-term memory of the MC is based on DNA, the short-term memory on RNA [13].
Signals (transmitted by bosonic fields) of molecular structures can displace a probability distribution in the IQS. The molecular computer is an input and output device of the IQS. A search of address is realized by directed mechanical transition [14]. Thus, the molecular computer maintains the IQS and governs its operation. The entering (input) into the IQS should be realized by a code of minimal influence on the system (i.e., by the error-correcting code recognizing only a wrong decision). The code should be optimal also on the output [15].
The other projection of the IQS, complementary to the body, is the construction of a space-time image. It is possible only when the IQS reaches a very high capacity for decision-making. In Freudian terms, it includes language (superego) and ego (the reflection of the IQS on itself by means of the superego). How are the different IQSs linked, besides via exhibiting their objectivation and signification? This question, stated already by Pythagoras in connection with the harmony of the observed world, has no explicit answer in the scientific paradigm. The monads have no windows, according to Leibniz, but they are synchronized via a harmonic objectivation based on the uniformity of fundamental constants. This synchronization is achieved at certain values of physical constants, which are substantiated as appearing to be a unique solution within the reflective loop, corresponding to its stable self-maintenance [16,17]. Physical laws operating with fundamental constants represent a basis of the natural computation. They are optimized within a reflective process in such a way that allows the appearance of higher levels of reflection, including such phenomena as free will and consciousness. Biological systems, being autopoietic in physical space [18], control what, when and where measurements are made on the physical universe [19]. Biological evolution is really a functional evolution of measurement constraints, from cells to brains [19].

Exhibition of Internal Activity - Iteration and Limits
How can we distinguish a subjective internal process from an external non-generic phenomenon? There should be something in the generated structure, namely a limit of iteration, that exhibits an internal process. Any internal (subjective) choice exhibits the structure of the semantic paradox going back to Epimenides [16]. The paradox results from mixing the notion of indicating an element with the act of indicating a set consisting of elements. According to Kitabayashi et al. [20] it is possible to formalize this approach via introducing the notion of a fixed point. The fixed point x (the point of coincidence of the image and its reflection) of the operation of determination of A and A⁻, denoted by F, can be expressed as an infinite recursion, x = F(F(F( … F(x) … ))), by mapping x = F(x) onto x = F(x).
The fixed point can be considered as a point in a two-dimensional space. The operation F is a contraction in a two-dimensional domain, indicating either A or A⁻.
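The fixed point of a contraction can be made concrete with a minimal numerical sketch (my illustration, not the formalism of [20]): for any contraction of the plane, the infinite recursion x, F(x), F(F(x)), … converges to the unique point satisfying x = F(x).

```python
# A concrete contraction of the plane (Lipschitz constant 0.5, chosen
# for illustration) and the iterative search for its fixed point.

def F(p):
    """Contract the point p = (x, y) toward the fixed point (2, -2)."""
    x, y = p
    return (0.5 * x + 1.0, 0.5 * y - 1.0)

def fixed_point(F, p, tol=1e-12, max_iter=1000):
    """Iterate p -> F(p) until the step is smaller than tol."""
    for _ in range(max_iter):
        q = F(p)
        if abs(q[0] - p[0]) + abs(q[1] - p[1]) < tol:
            return q
        p = q
    return p

fp = fixed_point(F, (0.0, 0.0))
print(fp)   # converges to (2.0, -2.0), where x = F(x)
```

Any starting point yields the same limit, which is the sense in which the fixed point is determined by F itself rather than by initial conditions.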
If the faithfulness of A is denoted by m, the invariance of faithfulness with respect to contraction is expressed as f(m)·m = constant, where m is the value of faithfulness and f(m) is the probability of m. If the distribution f(m) does not have an off-set peak, m directly means the rank. Then f(m)·m = c represents what is called Zipf's law, i.e. log f(m) = -log m + log c (for details see [20]).
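The invariance f(m)·m = c can be checked directly in an idealized case (my sketch, with an assumed constant c): the product of rank and frequency stays constant, so the rank-frequency plot is a straight line in log-log coordinates.

```python
# Ideal Zipf distribution: frequency is inversely proportional to rank,
# f(m) = c / m, so the product m * f(m) equals the constant c for all m.
c = 1000.0                       # constant of the law (assumed value)
ranks = range(1, 11)
freqs = [c / m for m in ranks]   # f(m) = c / m

products = [m * f for m, f in zip(ranks, freqs)]
print(products)   # every entry equals c, i.e. log f(m) = -log m + log c
```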
A similar formula was introduced by Mandelbrot [21] for the fractal structure; actually, a fractal is an iteration arising from the set of complex numbers by squaring them, i.e. by reflecting them onto the two-dimensional space. An observer cannot detect Zipf's law until some tool appears, which is a hyperlink between the other objects. It allows realization of the combinatorial game between the objects connected by the hyperlink. The third dimension is a reflection over this 2D domain. It appears if we estimate the actualization domain (brane) for the error-correction. This is possible only through the introduction of the internal time of the observer. As a result, the 3D+T structure appears.
According to Zipf's law, the probability of occurrence of words or other items starts high and tapers off, so a few occur very often while many others occur rarely. The distribution of words is often an inverse exponential like e^(-an). Power laws can be indicative of self-organized criticality. The linear iteration with the power law leads to the golden ratio limit. The golden ratio appears as a threshold for establishing a connection between local and global periods of a word. The local period at any position in the word is defined as the shortest repetition (a square) 'centered' at that position. The shortest repetition from that position is described by the golden ratio [22]. The power law and the fractal structure appear in systems exhibiting quantum computation as a consequence of the reflective control.
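The emergence of the golden ratio as a limit of a simple linear iteration can be shown in a few lines (a standard numerical fact, used here only to illustrate the notion of an iteration limit):

```python
# The iteration x -> 1 + 1/x converges to the golden ratio
# (1 + sqrt(5))/2 = 1.6180339887..., regardless of the positive
# starting value: a fixed limit emerging from repeated self-reflection.
x = 1.0
for _ in range(50):
    x = 1.0 + 1.0 / x
print(x)   # ~1.6180339887
```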
In biological morphogenesis, which we observe in the actual 3D+T space, the preceding motif unit is transferred into the subsequent one by a certain fixed similarity transformation g, i.e., S_{k+1} = g·S_k, where g may be linear, cyclic, Möbius or fractal. If the generating transformation g is unfolded m times from a motif unit S_k, a component S_{k+m} is obtained, and the group of transformations G will contain the elements g⁰, g¹, g², …, g^m. The concrete form of g is determined by the internal timing within the reflective loop. The non-Euclidean effects correspond to a time rescaling within the system. Time appears as a tool for the reduction of uncertainty in quantum measurements, i.e. as a computation-generating tool. During long times of observation, coherent quanta coexist in the whole structure of the device, providing its precise operation as a whole entity [23].
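The motif recursion S_{k+1} = g·S_k can be sketched with a concrete linear similarity g (my choice, purely for illustration): a complex multiplier that rotates and scales, so that repeated application lays motif units along a logarithmic spiral, a pattern common in morphogenesis.

```python
# Iterating S_{k+1} = g * S_k with a linear similarity g represented as
# a complex number: rotation by 30 degrees combined with scaling by 1.1.
import cmath, math

g = 1.1 * cmath.exp(1j * math.radians(30))   # assumed similarity g
S = 1 + 0j                                   # initial motif unit S_0
positions = []
for k in range(8):
    positions.append(S)
    S = g * S                                # S_{k+1} = g * S_k

# After m steps the accumulated transformation is the group element g^m
# from G = {g^0, g^1, ..., g^m}: S_m = g^m * S_0.
print(positions[3], g ** 3)
```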
Geometry, being a set of invariant rules of transformation, is a finite representation of a measurement result. The domains (growing aperiodically in the general case) are hierarchically embedded into one another and function at every level with different clock time periods. The limit of actualization fits the optimality of the structure being actualized; thus it provides the existence of 'fundamental constants', e.g., the golden ratio, which are the most optimal solutions for design. It was proposed that the 'golden wurf', being the limit of the series of lengths of three sequential stretches divided by three neighboring numbers of the Fibonacci series, i.e., the constant characteristic for the actualized triadic structure, is an even more general characteristic of non-Euclidean systems than the golden ratio [24]. These coordinate scales can be transformed by simple recursive rules via rescaling [25].
In the internal evolutionary process, which includes the formation of self-referential loops, the evolving state is determined by the two (in the simplest case) contradictory values of the system separated by a time interval, and the value in the future acquired after addressing them. Addressing the fixed point means that the two contradictory statements, taken as sequential values separated by a time interval and equally probable, are composed to get the third statement. Thus the next statement (quantitatively modelled as having a correspondent value) is composed from the two previous statements when they are memorized within the reflective loop: F_{n+2} = F_n + F_{n+1}. This leads to important evolutionary consequences: in the transformation of a non-local incursive system to a local recursive system, certain recursive limits will appear as fundamental canons of perfection formed as memorization within reflective loops.
The Fibonacci series represents a recurrent sequence of values (at n = 0, 1, 2, 3, …) where n may correspond to the values at discrete times of generating and addressing the fixed point. In many cases of biological morphogenesis the following configuration is realized as a limit (n → ∞) of infinite recursion:

lim F_n / F_{n+1} = Φ = 0.618… (the golden section)

Other useful series appear when three neighbouring elements F_n, F_{n+1}, F_{n+2} of the Fibonacci series are taken as the lengths of three sequential segments (as appearing at the sequential past (t-1), present (t) and future (t+1) times). In this case we get the wurf W:

W = lim (F_n + F_{n+1})(F_{n+1} + F_{n+2}) / (F_{n+1}·(F_n + F_{n+1} + F_{n+2})) = 1.309… (the golden wurf)

The value of the golden wurf as a limit of the recursive process equals the wurf of three sequential segments with the values 1, Φ and Φ², i.e. it follows from the memorization of the limits of recursion in the Fibonacci series [24,26]. The golden ratio and the golden wurf constants represent fundamental values of infinite recursion in which the next element is formed by an operation on the two previous sequentially appearing elements memorized within the reflective loop. They always occur in morphogenetic patterns appearing as limits of the infinite process of recursive embedding arising from reflective action (internal quantum measurement).
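The golden-wurf limit can be checked numerically (my sketch, using the wurf of three segments a, b, c defined as (a+b)(b+c)/(b(a+b+c)), which is the form consistent with the limit value 1.309…):

```python
# The wurf of three consecutive Fibonacci segments tends to the golden
# wurf, which equals Phi**2 / 2 = (1 + Phi)/2 = 1.3090169... where
# Phi = (1 + sqrt(5))/2 is the golden ratio.
def wurf(a, b, c):
    return (a + b) * (b + c) / (b * (a + b + c))

fib = [1, 1]
for _ in range(30):
    fib.append(fib[-1] + fib[-2])

w = wurf(*fib[-3:])
phi = (1 + 5 ** 0.5) / 2
print(w, phi ** 2 / 2)   # both ~1.3090169
```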
The neighboring members of the Fibonacci series are linked by the relation F_n·F_{n+2} = F_{n+1}² + (-1)^n (Cassini's identity). According to Petukhov [26], a deviation from this symmetrical relation is described as the incorporation of a defect ∆: F_n·F_{n+2} = F_{n+1}² + (-1)^n·∆. This deviation (dissymmetrization) can generate a higher-order symmetry at the next step of evolution, corresponding to a sequence of canons in evolution.
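The undisturbed relation (defect ∆ = 1) is easy to verify numerically (my sketch; indexing starts at n = 0 on the series 1, 1, 2, 3, …, for which the sign alternates as (-1)^n):

```python
# Cassini's identity for the Fibonacci series: F_n * F_{n+2} - F_{n+1}^2
# alternates between +1 and -1, i.e. the defect of the undisturbed
# series has magnitude 1.
fib = [1, 1]
for _ in range(20):
    fib.append(fib[-1] + fib[-2])

defects = [fib[n] * fib[n + 2] - fib[n + 1] ** 2 for n in range(10)]
print(defects)   # alternating +1, -1, +1, ...
```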

Reflective Structure of Living System
The reflective system of living beings (hypercycle) consists of catalysts, substrates and an embedded subset of substrates serving as a matrix for the catalysts' reproduction [27]. In the simplest case (RNA catalysis - ribozymes), a single molecule can hold all these properties (of catalyst, substrate, and matrix). Enzyme catalysis is based on the phenomenon that enzymes provide precise specific recognition via a prolongation of the relaxation time [5,23], which is relevant to the quantum non-demolition measurement model [28]. Enzymes decrease the uncertainty of inorganic catalysis, paying for it with very long relaxation times according to the energy-time uncertainty relation [5]. A specific recognition is characterized by the minimum dissipation of energy during the interaction between the measuring device and the measured object.
The code interacts with the whole reflective system as its embedded digital description, which limits its development to simple recursive rules. It is a computable part of the non-computable system, similar to the set of Gödel numbers within the arithmetic system that are necessary for its description [29]. The pattern of the genetic code could be formed on the basis of a search for the optimal variant of the reflective structure.
The 'central dogma' of molecular biology claims the irreversibility of information transfer. The quantum superpositions in the genome may be reversible, and during the reduction corresponding to the internal measurement they enter into an irreversible process, which corresponds to 'making a decision' within the reflective loop at the molecular level. An uncertainty on the genetic level may be provided by base tautomery, transitions of a proton from one place of a nucleotide to another, etc. [30]. But the main uncertainty, which is reduced in the irreversible process, is connected with combinatorial transformations using molecular addresses at all levels of informational transfer (mobility of the genome, splicing, post-translational processing). During this process, single events corresponding to the realization of interacting individual programs form a percolating network, and this can lead to a concrete spatial pattern constructed using an optimal coordinate scale.
DNA folding leads to the formation of alternate structures (within the general right-handed and left-handed helical types) differing in curvatures and topologies that could exist in a superposition before their (internal) observation (measurement). It was discovered that DNA possesses a scale-invariant property consisting in the existence of a long-range power-law correlation [31], which is expressed mostly in intron-containing genes and in non-transcribed regulatory DNA sequences [32]. Combinatorial events drive the system into an out-of-equilibrium steady state characterized by a power-law size distribution [33]. It was shown that the coding part of the genome seems to have a smaller fractal dimension and longer correlations than the non-coding parts [34]. Fractal properties of DNA, particularly in its non-coding regions, may reflect important properties providing combinatorial power for the developmental and evolutionary dynamics of the genetic material, particularly for specific recognitions (as in the case of enzymes) during genome rearrangements. They are connected with the existence of quasi-particles and coherent quanta inside the helical structure of the DNA molecule that change their orientation during topological reconstructions and rearrangements. This may provide for the existence of the genome as a permanently changing superposition of potential states that are reduced in the course of interaction with the changing environment.
The genomic superposition is reduced via the transformational generative grammar of genetic texts in the sense of Chomsky [35]. The principles of generative transformations of genetic texts will form a set of interactions based on molecular addresses. Such a generative grammar will represent a language game (an open process) with limits (constraints).
The reflective control in the genome is realized by tools (molecular addresses) organizing combinatorial events. Thus, the molecular addresses establish the set of rules for a language game corresponding to such a hierarchical organization. According to Head [36], the genetic structure can be viewed as consisting of two complementary sets. The first set consists of double-stranded DNA molecules, the second set of recombinant behaviors allowed by specific classes of enzymatic activities. The associated language consists of strings of symbols that represent the primary structures of the DNA molecules under the given enzymatic activities. Thus we can say that the recombinant (splicing) system possesses a generative formalism. Further, Paun [37] showed the closure properties of the Chomsky language families under the splicing operations. The generative capacity of splicing grammar systems is provided by their components. Any linear language can be generated by a splicing grammar system with two regular components. Any context-free language can be generated by a splicing grammar system with three regular components. Any recursively enumerable language can be generated by a splicing grammar system with four regular components [38].
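The splicing operation itself can be illustrated with a toy sketch (a deliberate simplification of Head's formalism, not his full definition): two DNA-like strings are each cut at a shared recognition site, restriction-enzyme style, and the halves are recombined.

```python
# Toy splicing operation: cut each string at the first occurrence of a
# recognition site and swap the right-hand parts, producing two
# recombinant strings (a crude model of enzymatic recombination).
def splice(x: str, y: str, site: str):
    """Return the two recombinants of x and y cut after `site`."""
    i, j = x.index(site), y.index(site)
    left_x, right_x = x[: i + len(site)], x[i + len(site):]
    left_y, right_y = y[: j + len(site)], y[j + len(site):]
    return left_x + right_y, left_y + right_x

a, b = splice("AAGGTTCC", "TTGGAACC", "GG")
print(a, b)   # AAGGAACC TTGGTTCC
```

Iterating such operations over a set of strings generates a language of recombinants, which is the sense in which the splicing system is a generative formalism.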
The computation strategy of the genome is an example of the self-assembly mode of computing [39]. The self-assembly may be realized as a computation by carving [40], which represents a computation strategy that generates a large set of candidate solutions of a problem, then removes the non-solutions such that what remains is the set of solutions. During this strategy the error-correction is realized, and this takes place in the potential field. We can suppose that the whole organism possesses the ability to forecast the splicing result before it is actualized, i.e. it can realize error-correction in the potential field by eliminating wrong potential possibilities, by applying error-correcting codes. This means that living systems realize computation at the quantum level, a process maintaining their dynamic stability at the macroscopic time level.
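The generate-then-carve strategy can be sketched on a toy problem (my choice, not from [40]): enumerate all candidates, then carve away every string violating a constraint, leaving exactly the solution set.

```python
# Computation by carving: generate the full candidate set, then remove
# the non-solutions; what remains is the set of solutions.
from itertools import product

# Toy constraint: 3-bit strings that contain no "11" substring.
candidates = {"".join(bits) for bits in product("01", repeat=3)}
solutions = {s for s in candidates if "11" not in s}   # the carving step
print(sorted(solutions))
```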
Reality can be described as a set of self-maintained reflective systems exhibiting themselves externally (on macroscales) and interacting via a perpetual process of signification (reducing the microscale), which introduces universal computable laws harmonizing their interaction. The evolutionary growth of information occurs via a language game of interacting programs, an open process without frames.