An Intensional Probability Theory: Investigating the Link between Classical and Quantum Probabilities

Abstract: The link between classical and quantum theories is discussed in terms of extensional and intensional viewpoints. The paper aims to bring evidence that classical and quantum probabilities are related by intensionalization, which means that by abandoning sets from classical probability one should obtain quantum theory. Unlike the extensional concept of a set, the intensional probability is attributed to the quantum ensemble, which is contextually dependent. The contextuality offers a consistent realization of the measurement problem, which should require the existence of the time operator. The time continuum by Brouwer satisfies such a requirement, which makes it fundamental to mathematical physics. The statistical model it provides has proven tremendously useful in a variety of applications.


Introduction
Émile Meyerson considers the overall history of modern science as a progressive realization of the fundamental bias in human reasoning, which concerns reducing difference and change to identity and constancy [1]. He designated such a tendency the elimination of time, meaning its reduction to a mere parameter. It had already been initiated via the substitution of the genus with the concept of class, which excluded the ontological context related to generation and succession [2]. Classes have been used in the von Neumann–Bernays–Gödel theory to provide a finite axiomatization of sets and handle the set-theoretic paradoxes.
Such a viewpoint is referred to as extensional, considering that each entity is represented by its extension which is a set of incident elements. The hierarchy is based upon cardinality which ignores both the structure of elements and their relations, making a collection of individual objects [3]. An alternative viewpoint, termed intensional, concerns the time continuum which is regarded to be a fundamental intuition of consciousness [4]. Brouwer, who endorsed such intuitionism, is a forerunner of the postmodern science [5].
The paper is concentrated on the foundation of probability, wherein a shift from classical to quantum theory concerns intensionalization in which the concept of the set has been replaced by the ensemble. The intensional viewpoint implies contextuality, which is a term originating from the Latin verb contexere, meaning to weave together [6]. The contextual probability should, therefore, correspond to an entanglement, which is the fundamental conception of quantum theory [7], the reason why the definition of context is often left implicit in the widespread literature [8]. Some attempts to elucidate such a concept have led to very significant insights.
In developmental theory, the context is defined as the totality of interrelations that give meaning to a particular object [9]. Anderson has considered the role of context primarily with respect to word recognition within sentences and in regard to memory [10]. The multidisciplinary research across quantum probability and natural language processing has received growing attention in recent years [11]. Estes has stressed the importance of context for memory retrieval and visualization [12], which encroaches on the scope of quantum cognitive models [13]. Cohen and Grossberg have used the term in reference to the spatiotemporal pattern [14], which is also evident in quantum gravity [15]. Stewart indicates a link to the holistic paradigm and emergent phenomena [16]. Cohen has considered it complementary to the content, which is an extensional view [17]. Bueno has underscored the contextuality of mathematics in general [18].
Ohmadahl relates contexts to states of a system [19], depending on the perception of an essential situation one is in [20]. In that respect, the intensional probability corresponds to a distinction of the internal state which was analyzed in treatments of the problem up to Poisson [21]. The law of evidence that involved such probabilistic reasoning was an integral part of court disputes [22]. The abandonment of the intensional viewpoint significantly contributed to the radical change in the concept of causality which occurred during the late XVIII century [23]. However, it did not end the perplexity that has persisted to the present day in the gap between the objectivist interpretation of quantum theory and the subjectivist one which is prominent via QBism and the many worlds interpretation [24]. The issue has by no means been unequivocally resolved, which validates a claim by Heisenberg that "the history of physics is not only a sequence of experimental discoveries and observations, followed by their mathematical description; it is also a history of concepts" [25]. An argument against the extensional view on quantum probability was recently presented by Peppe [26].
Khrennikov defines the context in terms of conditions under which the measurement is performed [27], p. ix, as already discussed by Kolmogorov, who considered the mathematical formalization of the experiment [28]. That is the reason why contextuality in classical theory is coupled to conditionalization, which becomes generally misleading for the contextual probability [27], p. x. It has been noticed, however, that the context of conditionals might elicit various estimates of their probabilities [29]. In quantum theory, the conditional is a projector whose action on the context makes it a state. These conceptions should, therefore, be separated at the very root and presented in a progressive manner that allows some subtleties to emerge, which are applied not only in physics but in cognition, psychology, artificial intelligence, and other sciences as well [30]. In that manner, fundamental applications act by a feedback loop onto the foundations of mathematics, reconsidering the concepts it is based upon [31]. The range of such a loop might exceed any assumption in stipulating structures on the socio-historical level [32].
After the introduction, Section 2 considers the intensional viewpoint. It reflects the geometrical structure of quantum states, which is the ultimate reach of intensionalizing classical probability. Section 3 concerns the measurement problem which has occurred due to the contextuality of quantum theory. A resolution is offered by the time continuum, which is a consistent realization of the measurement process. Results are discussed in Section 4, and the last one contains concluding remarks.

Intensionality of Geometrical Structuring
The intensional viewpoint relies on a premodern tradition that corresponds to geometrical structures. Geometry should therefore be a paradigm of intensionality that is reflected by hierarchical structuring. In order to straightforwardly expose the above, the opposition between extensional and intensional viewpoints is presented through the style in which modern art relates to traditional iconography (Figure 1). The modern painting corresponds to a set of incident elements, each representing an individual object, incorporated in a sensory vivid and realistic composition. Structural relations, which are systematically suppressed to the background, are evident in an observer position that has been delimited by the picture frame. In contrast, iconography does not portray objects but their relations [33], pp. 315-319. An individual object might be thoroughly distorted, but the image transmits structural information involving the observer [34]. The icon is a nested structure that constitutes the spatio-temporal geometry, unlike modern paintings whose geometrical structure has eliminated time [35]. Iconography favors structuring over individuals, whose significance is subordinated, just as in category theory where the object is not the basic concept.
The intensional viewpoint is associated with, but not limited to, the manner in which the entity has been constructed or defined [36], p. 6. It is directly applied to geometry, which has always been processually definable [37]. The seventh axiom of the Elements by Euclid explicitly refers to a movement that is derived through the relationship between line and circle, which are basic concepts of the theory as well [38], p. 131. It is in opposition to the extensional viewpoint, whereat the line is not basic at all, which requires axioms of arrangement to be placed before axioms of incidence [38], p. 17. The intension of geometry has been damaged in that manner, since the point-line relationship (the incidence ∈) and the line-plane one (the inclusion ⊂) are not represented by the same symbol. A sensible solution considers the incidence to be a purely geometrical relation, with no reference to set theory [39].
Geometry is principally independent of the extensional viewpoint, which is disputable only for the continuity axiom that requires a second-order predication. In that regard, the Cantor-Dedekind axiom of continuity considers the line to be a complete order of discrete points. However, many mathematicians and philosophers opposed such a discretization of the continuum, including Aristotle, Leibniz, Kant, Poincaré, Weyl, Brouwer, and René Thom [40], pp. 1-2. The conception of a line that is continually smooth disqualifies the claim that it is a mere set of some points, no matter how compact a packing is provided. The continuum is actually irreducible to a nonenumerable cardinality, the reason why the intensional view should be preferred. Such a viewpoint is reflected by quantum states which constitute the geometry of the Hilbert space. It is the ultimate reach of intensionalizing classical probability.

Foundation of Probability Theory
Common knowledge among mathematicians attributes the establishment of classical probability to Andrei Nikolaevich Kolmogorov in the 1930s. However, it is generally unknown that in the same years, John von Neumann founded quantum physics, which was also considered a probability theory. Khrennikov demonstrated the existence of a model of quantum theory in the classical one [41], pp. 261-262, though Accardi had already pointed out an analogy to non-Euclidean geometry [42]. This paper aims to bring evidence that classical and quantum probabilities are related by intensionalization. It means that by abandoning sets, which are the base of classical probability, one should obtain quantum theory.
Classical probability implies the measurable space that is an algebra of sets M_Ω, whose elements, termed events, are included in the outcomes of a random trial, which is the set denoted by Ω. The algebra M_Ω ≤ P(Ω) is closed under the set operations of complement, intersection, and enumerable union. If an event X ∈ M_Ω happened, the measurable space should collapse to the conditional subspace M_X = {XY | Y ∈ M_Ω}, which is a subalgebra of M_Ω. In the preliminary notation, the product of sets denotes intersection, the order relation ≤ concerns inclusion, and P(Ω) = {A | A ≤ Ω} is the partitive set containing all subsets of outcomes.
The operation P : Y → XY is a projector of M_Ω onto M_X, which means that it is idempotent, i.e., P² = P. Furthermore, it is an orthogonal projector, which implies that the distance of an event Y ∈ M_Ω to its projection PY ∈ M_X is not greater than the distance to any other element of the conditional subspace M_X = {PZ | Z ∈ M_Ω}. In that regard, it is necessary to define a distance between sets A ∆ B = A \ B + B \ A, whereby the sum denotes the disjoint union of symmetric differences. In terms of the set distance, the orthogonality comes down to the relation Y ∆ PY ≤ Y ∆ PZ for any Z ∈ M_Ω. Respecting that, events correspond to conditional subspaces and orthogonal projectors, which concern geometrical structuring.
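The projector and set-distance relations above can be checked exhaustively on a small example. The following sketch assumes a hypothetical five-element Ω and the full partitive set as the algebra; "closeness" is compared by inclusion of symmetric differences.

```python
from itertools import combinations

# Hypothetical finite illustration: outcomes Ω, the conditioning event X,
# and the projector P : Y -> X ∩ Y on the algebra of all subsets of Ω.
Omega = frozenset({1, 2, 3, 4, 5})
X = frozenset({1, 2, 3})

def powerset(s):
    """All subsets of s, i.e., the partitive set P(s)."""
    s = list(s)
    return [frozenset(c) for r in range(len(s) + 1) for c in combinations(s, r)]

def P(Y):
    """Conditional projector: restricts the event Y to the context X."""
    return X & Y

def dist(A, B):
    """Set distance: the symmetric difference (A \\ B) ∪ (B \\ A)."""
    return (A - B) | (B - A)

# Idempotence P² = P
for Y in powerset(Omega):
    assert P(P(Y)) == P(Y)

# Orthogonality: Y is at least as close to its projection PY as to any
# other element PZ of the conditional subspace (closeness = inclusion).
for Y in powerset(Omega):
    for Z in powerset(Omega):
        assert dist(Y, P(Y)) <= dist(Y, P(Z))
```

Here `dist(Y, P(Y))` is exactly Y \ X, which is included in every Y ∆ XZ, so the orthogonality relation holds on all 1024 pairs.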
The set distance satisfies the axioms of (i) indiscernibility A ∆ B = 0 ⇔ A = B, (ii) symmetry A ∆ B = B ∆ A, and (iii) subadditivity A ∆ C ≤ A ∆ B + B ∆ C. The axiom (i) is equivalent to the formula ω ∈ B ∨ ω ∉ B, whose validity comes because sets are discrete and the incidence is, therefore, a decidable relation. Following the monotonicity of the order, one gets subadditivity even in the case when the left side of (iii) is not a disjoint union. Null in the axiom (i) denotes the empty set.
The issue arises as to why one should implement a real-valued measure if the measurable space, equipped with a set distance, conditional subspaces, and projectors, already exists. There are two main reasons for that. The first one concerns the fact that inclusion in set theory is a partial order which does not allow comparison of all events. However, such a gain is not crucial. Brouwer has demonstrated that the equality of real numbers is not decidable, which implies that there is just a partial order among them. The situation might appear easier than in set theory, but rational numbers are a much better solution if the total order has been pursued.
The second and more significant reason is the intensionalization of probability theory. It has turned out that set theory is not an adequate framework for probability consideration. The event is regarded classically to be a set of some outcomes ω ∈ Ω, which is agreeable if their cardinality is finite. In that case, the algebra M_Ω might coincide with the partitive set P(Ω), which corresponds to the discrete topology of Ω. However, there are random trials wherein it is not easy to designate outcomes at all. An instance concerns the continuum case, whereat the algebra does not involve all of the numerous subsets. The measurable space M_Ω is, therefore, an attempt to follow the topology of Ω, which is not a discrete one. Confusion also occurs in considering the contextual probability, which should not reduce to conditionalizing by a set of certain outcomes [27], p. x.
For that reason, the set distance is replaced by a real-valued metric which concerns the measurement process. The metrical distance d(A, B) = ‖A ∆ B‖ ≥ 0 is induced by a norm whose square ‖X‖² = ⟨X|X⟩ corresponds to the probability measure µ(X). The norm should satisfy the axioms of (i) positive definiteness ‖X‖ = 0 ⇔ X = 0, (ii) homogeneity ‖aX‖ = a‖X‖, and (iii) subadditivity ‖X + Y‖ ≤ ‖X‖ + ‖Y‖. The last axiom is a consequence of the measure subadditivity, which has been satisfied even if the left side of (iii) is not the norm of a disjoint union. The axiom (ii) demands elucidation on what the scalar multiplication of a set should refer to. In that respect, scalars are regarded to be just the 0 and 1 values. Multiplication of a set by 1 results in the same set, but multiplying by 0 collapses it to a singleton * which consists of any element from Ω and is termed null.
The probability measure µ(X) = ‖X‖² is induced by the inner product ⟨X|Y⟩ = µ(XY), wherewith the additivity rule µ(∑^⊥ X_i) = ∑ µ(X_i) comes to be an enumerable instance of the Pythagorean theorem for orthogonal sets. One does not require standardization per unit, since it is feasible for any real-valued measure that is non-trivial. The departure from standardization makes it much easier to consider the conditional probability, which is not required to be unitary.
The axiom (i) implies positive definiteness of the measure µ(X) = 0 ⇔ X = 0, and (ii) requires homogeneity µ(aX) = aµ(X) for a = 0, 1, considering that a² = a. It follows that the measure of the singleton is null, µ(*) = 0, which contradicts the claim that only the empty set is nullary. The resolution requires an almost equality ≡ between events, which is defined to be A ≡ B ⇔ µ(A ∆ B) = 0. The quotient structure of the measurable space M_Ω according to the almost equality ≡ in the measure µ is the probability space M_µ = M_Ω/≡, whose elements are equivalence classes of almost equal sets. In such a space, all axioms of the norm and the distance are satisfied.
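A minimal numerical sketch of the quotientation, assuming hypothetical outcome weights in which one outcome carries zero measure and thus plays the role of the null singleton *:

```python
# Sketch on a finite outcome set with illustrative weights; the outcome 0
# has weight 0 and behaves as the null singleton * of the text.
weights = {0: 0.0, 1: 0.2, 2: 0.3, 3: 0.5}
Omega = frozenset(weights)

def mu(A):
    """Probability measure as a sum of outcome weights."""
    return sum(weights[w] for w in A)

def almost_equal(A, B):
    """A ≡ B iff the measure of the symmetric difference vanishes."""
    return mu((A - B) | (B - A)) == 0

# The singleton {0} is null, so events differing by it fall in one class
assert mu({0}) == 0
assert almost_equal({0, 1, 2}, {1, 2})

# Pythagorean (additivity) rule for disjoint, hence orthogonal, events
A, B = {1}, {2, 3}
assert set(A).isdisjoint(B)
assert abs(mu(A | B) - (mu(A) + mu(B))) < 1e-12

# Subadditivity holds even when the union is not disjoint
C, D = {1, 2}, {2, 3}
assert mu(C | D) <= mu(C) + mu(D) + 1e-12
```

The equivalence `almost_equal` is what turns the measurable space into the probability space: {0, 1, 2} and {1, 2} represent the same class.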
An immediate consequence of the quotientation is that events do not consist of certain outcomes anymore. Each outcome corresponds to the elementary event which is a singleton, and it might be eliminated, remaining thereby in the same class. The probability space M_µ ≤ P²(Ω) is, therefore, not included in the partitive set of outcomes but in the partitive set squared, and the matter should no longer be traced in an extensional manner. The final step of intensionalization concerns the transition from an event to the random variable which is a measurable morphism of M_µ onto the Borel algebra B over real or complex numbers. The measurability of a morphism f is defined in terms of the inverse image f⁻¹ : B → M_µ, which implies congruence in respect to the set operations of complement, intersection, and union. In that regard, one has jumped from objects to morphisms by means of category theory. Considering that the Borel algebra is generated by the topology T of real or complex numbers, the measurability reduces to the evidence f⁻¹(T) ≤ M_µ, which means that each variable is continuous in a topology that is included by the probability space. An arising issue concerns the geometry of measurable morphisms.

Geometrical Structure of Probability
A measure affecting the probability space M_µ has to be absolutely continuous since µ is positive definite. According to the Radon–Nikodym theorem, there is the derivative ρ ≥ 0, which corresponds to a density function from L¹_µ. Considering that it is non-negative, the density is represented by ρ = |φ|², wherein the random variable φ is a root function from L²_µ. The measure is, therefore, representable in the form dµ_φ = |φ|² dµ, which means that µ_φ = ∫ |φ|² dµ, whereby the event from the probability space has been implied by the domain of integration.
In that manner, the contextual probability µ comes to be a generator of measures µ_φ which are parametrized by the random variable φ. It does not depend on events only, but as well on contexts corresponding to states of a quantum system. Quantum states constitute the Hilbert space L²_µ whose geometry is induced by the inner product ⟨u|v⟩ = ∫ u†v dµ, wherewith † denotes the complex conjugate of a variable. One defines the expectation value E φ = ⟨1|φ⟩ = ∫ φ dµ. An event X ∈ M_µ is identified by its characteristic x(ω) = 1 for ω ∈ X and x(ω) = 0 for ω ∉ X, which is the random variable whose L²_µ-norm squared coincides with a measure value, considering that ‖x‖² = ∫ |x|² dµ = ∫ x² dµ = ∫ x dµ = µ(X). However, such a variable has been ill-defined since M_µ is the quotient space whose elements do not consist of certain outcomes. One should, therefore, regard the event to be a variable that satisfies x² = x and x† = x, which has been used in the proof of the previous relation. These are properties of the orthogonal projector, which is the operator satisfying P² = P and P† = P, wherewith † denotes the Hermitian adjoint. The projector of an event x is the operator Pφ = φx that restricts any variable to the concerned event. The invariant subspace J : φ = φx consists of states satisfying a fixed point property φ = Pφ for the orthogonal projector. It contains random variables whose support supp φ = φ/φ (implying an arrangement 0/0 = 0) is included by the event, which means that supp φ ≤ x. However, not all projectors are related to multiplication by an event, but by any operator being self-adjoint and idempotent.
The probability measure of an event x in the context φ is equal to µ_φ(x) = ⟨|φ|²|x⟩ = ⟨φx|φx⟩, which implies µ_φ(x) = ‖Pφ‖², wherein Pφ is a state from the invariant subspace of an orthogonal projector. It follows that the event takes the place of conditionalization, which concerns a projection onto the Hilbert space. The probability is related to states through a quadratic form of the norm squared, which makes it nonadditive in general. However, the additivity rule µ_{x+y+...} = µ_x + µ_y + ... is satisfied if contexts are mutually orthogonal events, which confirms that classical probability has been included in the quantum one.
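The relation µ_φ(x) = ‖Pφ‖² can be illustrated on a discretized outcome space. The sketch below assumes a hypothetical uniform base measure, a random complex context, and an event given by an indicator variable.

```python
import numpy as np

rng = np.random.default_rng(7)

# Discrete sketch: Ω has n outcomes with a hypothetical base measure w,
# a complex context (state) phi, and an event given by an indicator x.
n = 8
w = np.full(n, 1.0 / n)                              # base measure µ
phi = rng.normal(size=n) + 1j * rng.normal(size=n)   # context φ
x = np.array([1, 1, 1, 0, 0, 0, 0, 0], dtype=float)  # event X

def inner(u, v):
    """Inner product <u|v> = ∫ u† v dµ."""
    return np.sum(np.conj(u) * v * w)

# Contextual measure µ_φ(x) = <|φ|²|x> ...
mu_phi = inner(np.abs(phi) ** 2, x).real
# ... equals the norm squared of the projected state Pφ = φx
assert np.isclose(mu_phi, inner(phi * x, phi * x).real)

# The event acts as an orthogonal projector: x² = x and x† = x
assert np.allclose(x * x, x) and np.allclose(np.conj(x), x)
```

The two expressions agree because x² = x, so ∫ |φ|² x dµ = ∫ |φx|² dµ pointwise.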
The double-slit experiment is regarded to be an instance of quantum probability. It was first performed in 1802 by Thomas Young, who aimed to demonstrate the wavelike behavior of light. Clinton Davisson and Lester Germer demonstrated in 1927 that electrons behave in the same manner, which was extended to general matter following the de Broglie principle. An important interpretation of the experiment involves a single particle, considering the probability of its occurrence on the screen. A trace of the particle is mediated by two slits that might be opened or closed (Figure 2). In classical theory, one should consider probabilities that the particle has passed through a slit, which are therefore disjoint and additive [43]. The quantum probability, however, depends on a context corresponding to the variable which is termed wave function. Every point which is reached by a wave becomes its source, due to the Huygens principle, which means that contexts of both slits might be summed up in a resultant form. It does not mean that probabilities are summable in the same manner, since µ_{φ+ψ} ≠ µ_φ + µ_ψ for random variables φ and ψ which are not events but contexts, not satisfying the additivity rule. Quantum probability behaves wavelike, which makes it subject to geometrical structuring [41], pp. 120-127. The measurement problem which has occurred due to the contextuality of quantum theory requires an intensional concept of the ensemble.
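The failure of additivity for summed contexts can be made explicit numerically. In the sketch below, the two slit waves are illustrative plane waves on a discretized screen, not derived from any concrete experimental geometry; the point is only that the cross (interference) term survives.

```python
import numpy as np

# Double-slit sketch: two hypothetical single-slit contexts φ and ψ on a
# discretized screen, with a uniform base measure.
n = 64
w = np.full(n, 1.0 / n)
s = np.linspace(0.0, 1.0, n)
phi = np.exp(2j * np.pi * 3 * s)   # wave through slit 1 (illustrative)
psi = np.exp(2j * np.pi * 5 * s)   # wave through slit 2 (illustrative)
x = (s < 0.3).astype(float)        # an event: hitting part of the screen

def mu(ctx, ev):
    """Contextual probability µ_ctx(ev) = ∫ |ctx|² ev dµ."""
    return np.sum(np.abs(ctx) ** 2 * ev * w)

# Contexts sum up (Huygens), but probabilities do not:
lhs = mu(phi + psi, x)
rhs = mu(phi, x) + mu(psi, x)
interference = 2 * np.sum((np.conj(phi) * psi).real * x * w)
assert np.isclose(lhs, rhs + interference)   # cross term accounts for the gap
assert abs(lhs - rhs) > 1e-3                 # additivity fails
```

The identity |φ+ψ|² = |φ|² + |ψ|² + 2 Re(φ†ψ) holds pointwise, so the discrepancy between the two sides is exactly the interference term.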

Quantum Ensembles and Measurement
The foundation of probability concerns one of the twenty-three problems which David Hilbert presented at the International Congress of Mathematicians in 1900, considering them a testament to the XX century. An original formulation takes probability and mechanics side by side, which should become substantially related due to the emergence of quantum physics thirty years later. The problem that was presented by Hilbert explicitly asked for the axiomatization of physical theories, and it had not been clear what features might be incorporated, which is the main reason why probability was axiomatized so late [41], pp. 2-3.
The model highlighted for such an activity has been the Foundations of Geometry, which significantly elucidates his conception of the axiomatic method. Such an announcement is considered by Ulrich Majer to be the birth of mathematical physics in terms of a unification theory that should involve geometry, mechanics, probability, etc. [44]. Hilbert's statement that "physics is too hard for physicists" was actually kept in mind by John von Neumann, who founded quantum probability in 1932. Moreover, he suggested consideration of the completeness issue, tracking Kurt Gödel's results, with which von Neumann was very familiar [45]. There is also his establishment that the relation between properties of a physical system and subspaces of the theory makes possible a sort of logical calculus, which is not the classical one but concerns the concept of simultaneous decidability [46], pp. 252-254.
Andrei Nikolaevich Kolmogorov constituted the axiomatic probability theory in 1933 [28]. It had originally been published in German, and the English translation was released only in 1952, whilst the complete Russian translation appeared only in 1974. The absence of an English translation at the time when the German language lost its international dimension led to a representation of classical probability in which the interpretation issue by Kolmogorov disappeared, since it was considered a philosophical remark with no mathematical relevance. The consequence of such ignorance was the oblivion of contextuality, considering that Kolmogorov designed probability as a mathematical formalization of an experiment, which was already indicated by the conception of random trials. Any experimental context generates an inherent measure of its own probability space, which makes it intrinsically related to the measurement problem. The contextuality is a significant relation to quantum theory, wherein the context corresponds to a state of the system.
Events are related to some, but not all, projectors, which represent conditionalization concerning a system state. It is, therefore, necessary to assign a probability not only to events but to any orthogonal projector that corresponds to the invariant subspace consisting of states satisfying a fixed point property. The quantum ensemble is defined to be a measure assigning a non-negative value to each of the orthogonal projectors. According to the Gleason theorem, there is a density operator ρ ≥ 0 from L¹_M that is self-adjoint, which implies ρ = ΦΦ† for the root Φ from L²_M. The ensemble has taken the form dM_Φ = ΦΦ† dM, wherein M = µ ⊗ µ is the probability measure on the product algebra M_µ ⊗ M_µ. The definition means M_Φ(P) = ⟨ρ|P⟩ for any orthogonal projector P, implying the inner product ⟨U|V⟩ = Tr U†V, wherewith † denotes an adjoint followed by the operator multiplication and trace [47], pp. 131-134.
In that manner, the contextual probability M comes to be a generator of measures M_Φ, which are parametrized by the operator Φ representing a context that corresponds to the state of a quantum system. The probability measure should be equal to M_Φ(P) = ⟨ΦΦ†|P⟩ = ‖PΦ‖², wherein Φ → PΦ is the orthogonal superprojector, that is, the projection onto the invariant subspace consisting of states satisfying the fixed point property J : Φ = PΦ, which has been satisfied by P as well. The ensemble theory offers a consistent realization of the measurement problem, which is intrinsically related to contextual probability.
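A finite-dimensional sketch of the ensemble measure, assuming a hypothetical random root operator Φ; it checks that M_Φ(P) = Tr ρP coincides with ‖PΦ‖² for a projector onto a coordinate subspace.

```python
import numpy as np

rng = np.random.default_rng(3)

# Finite-dimensional sketch of M_Φ(P) = <ρ|P> = Tr ρP, with a hypothetical
# random context Φ and a projector P onto a subspace.
d = 4
Phi = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
Phi /= np.linalg.norm(Phi)          # normalize so that Tr ρ = 1
rho = Phi @ Phi.conj().T            # density operator ρ = ΦΦ† ≥ 0

# Orthogonal projector onto the span of the first two basis vectors
P = np.diag([1.0, 1.0, 0.0, 0.0]).astype(complex)

M_Phi = np.trace(rho @ P).real      # ensemble measure of the projector
assert M_Phi >= 0
assert np.isclose(M_Phi, np.linalg.norm(P @ Phi) ** 2)  # = ‖PΦ‖²
assert np.isclose(np.trace(rho).real, 1.0)
```

The equality follows from the cyclicity of the trace: Tr ΦΦ†P = Tr Φ†PΦ = Tr (PΦ)†(PΦ), which is the Frobenius norm squared of PΦ.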
The role of the random variable is played by self-adjoint operators in the form A = ∑ a_i P_i, wherein a_1, . . . , a_i, . . . are different eigenvalues and P_1, . . . , P_i, . . . , which satisfy ∑ P_i = I, are mutually orthogonal projections onto invariant eigenspaces. The orthogonality P_i ⊥ P_j is equivalent to P_i P_j = 0, which means that the projectors generate a measurable space according to the Stone representation theorem for Boolean algebras. It makes it possible to consider the operator to be a variable in terms of classical theory, whereby mutually orthogonal projectors correspond to disjoint sets which are inverse images of single values. Such an extensional viewpoint is inherent to the measurement problem which actually concerns classical probability.
Measuring A in a context Φ, one should regard the density operator ρ = ΦΦ† to be a variable defined on the same space, which implies P_i ρ = ρP_i. The measurement process is, therefore, represented by the conditionalization Mρ = ∑ P_i ρ P_i, that is, a superprojection onto the invariant subspace consisting of operators which commute with each of the projectors [48]. It is also presentable in the form M : ΦΦ† → ∑ (P_i Φ)(P_i Φ)†, which has involved superprojectors Φ → P_i Φ acting on the root operator.
If P_1, . . . , P_i, . . . are single properties, which are projections whose invariant subspaces are one-dimensional, the process corresponds to Mρ = ∑ ⟨ρ|P_i⟩ P_i, wherein the coefficients ⟨ρ|P_i⟩ = M_Φ(P_i) are probabilities of P_i in a context Φ. Such an instance applies as well to the density operator ρ = ∑ M_Φ(P_i) P_i, which is a fixed point of M, since the measurement procedure has actually intended to stipulate probabilities M_Φ(P_i) = ⟨ρ|P_i⟩ notwithstanding what the values of the variable are. One might consider, therefore, that the optimal measurement is applied to a density having satisfied ρ = Mρ. However, the measurement is generally an irreversible process, which is evident in the fact that a single property might be superprojected to a mixture of properties having some probabilities. The situation is explained in an elegant manner by von Neumann, who defines the entropy increase when one goes from a single property to a mixture [46], pp. 379-426.
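The superprojection Mρ = ∑ P_i ρ P_i and the accompanying entropy increase can be verified numerically. The sketch below assumes rank-one projectors onto the standard basis of a hypothetical four-dimensional space, so M amounts to decoherence onto the diagonal.

```python
import numpy as np

rng = np.random.default_rng(5)

# Sketch of the measurement superprojection Mρ = Σ P_i ρ P_i for rank-one
# projectors onto the standard basis (hypothetical finite dimension).
d = 4
A = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
rho = A @ A.conj().T
rho /= np.trace(rho).real            # a generic density operator

projectors = [np.outer(e, e.conj()) for e in np.eye(d, dtype=complex)]
M_rho = sum(P @ rho @ P for P in projectors)

def entropy(r):
    """Von Neumann entropy S = -Tr r log r."""
    vals = np.linalg.eigvalsh(r)
    vals = vals[vals > 1e-12]
    return float(-np.sum(vals * np.log(vals)))

assert np.isclose(np.trace(M_rho).real, 1.0)        # trace preserved
assert np.allclose(M_rho, np.diag(np.diag(M_rho)))  # decohered to the eigenbase
assert np.allclose(M_rho, sum(P @ M_rho @ P for P in projectors))  # fixed point
assert entropy(M_rho) >= entropy(rho) - 1e-9        # irreversibility
```

The last assertion reflects von Neumann's observation: the diagonal part of ρ is majorized by its spectrum, so the measured mixture never has lower entropy than the original state.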
He identified two fundamentally diverse types of intervention in the quantum system, the first one corresponding to the reversible evolvement by the Schrödinger equation and the second one to an irreversible measurement. Von Neumann was puzzled by the fact that the entropy increase follows the measurement process, not representing any temporal evolution, which is totally opposite to thermodynamics, which relates the increase of entropy to an evolution in the temporal domain. The reason for such an odd situation dwells in the fake concept of time, which is represented by a mere parameterization, as in Newtonian mechanics [46], pp. 351-354. He admitted an essential weakness of quantum theory, which concerns the fact that it is non-relativistic, whereas spatial coordinates are represented by Hermitian operators and time is just a parameter. The time operator, which should be a resolution of the measurement problem, is, therefore, a chief link between quantum and relativity theories as well [49].
The uncertainty between time and energy, which has frequently been discussed, is an effective definition of the time operator [50]. In the wave function formulation of quantum theory, however, there is no operator that satisfies the uncertainty relation respecting a Hamiltonian, which corresponds to the energy of a system. The reason for that is the fact that the Hamiltonian rules the evolution of stationary states, which are analogous to orbits in classical theory. Nevertheless, the time operator is definable in an ensemble formulation, which relates states to operators whose evolution is governed by the Liouvillian of a quantum system [51].

Time Continuum and Intuitionism
The intuitionist mathematics, which considers time to be the primordial intuition of consciousness, is an adequate framework to support the intensional viewpoint [36], pp. 92-94. A fundamental structure in that regard is the time continuum, which corresponds to a skeletal category able to embed other ones within itself [52]. Such an embedment of the probability space is a random variable, which recovers its structure in terms of measurable morphisms. In order to represent quantum states, variables are confined to the Hilbert space, where geometry has been induced by an inner product. The concept of a state extends to quantum ensembles, which are represented by operators. The measurement process, corresponding to a superprojection in the space of ensembles, concerns a realization of the time continuum.
The concept of measurement has originated from the geometrical algebra by Euclid, which is presented in the Elements, whose fifth book elaborates a doctrine of proportion considering commensuration of magnitudes. According to the Euclidean algorithm, magnitudes a and b measure each other in the manner of a continued fraction a/b = 1/(n₁ + 1/(n₂ + 1/⋯)), having the spectrum n₁, n₂, . . . . The proportion a/b = c/d, which is indicated by the matching of respective terms in both spectra, defines the identity on the time continuum. In that respect, it corresponds to the continued fraction expansion, implying the measurement process which takes place step by step over time.
The expansion concerns Diophantine approximations ξ_i = h_i/k_i whose numerators and denominators satisfy the recurrence relations h_{i+1} = n_{i+1} h_i + h_{i−1} and k_{i+1} = n_{i+1} k_i + k_{i−1}, considering the initial conditions h₀ = 0, h₁ = 1 and k₀ = 1, k₁ = n₁, which makes a continued fraction to be the alternating series ∆ξ₀ + ∆ξ₁ + ⋯ = 1/(k₀k₁) − 1/(k₁k₂) + ⋯, that is, a sparse representation composed of terms from the redundant dictionary 1/1, 1/2, . . . , containing fractions of a unit numerator.
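The spectrum and the convergent recurrences admit a direct computational check. The sketch below uses exact rational arithmetic and an illustrative ratio 43/100; both the recurrence and the alternating series recover the original proportion.

```python
from fractions import Fraction

def spectrum(a, b):
    """Euclidean algorithm: partial quotients n1, n2, ... of a/b < 1."""
    assert 0 < a < b
    ns = []
    while a:
        ns.append(b // a)
        a, b = b % a, a
    return ns

def convergents(ns):
    """Approximations ξ_i = h_i/k_i via h_{i+1} = n_{i+1} h_i + h_{i-1},
    k_{i+1} = n_{i+1} k_i + k_{i-1}, with h0 = 0, h1 = 1, k0 = 1, k1 = n1."""
    h_prev, h = 0, 1
    k_prev, k = 1, ns[0]
    out = [Fraction(h, k)]
    for n in ns[1:]:
        h_prev, h = h, n * h + h_prev
        k_prev, k = k, n * k + k_prev
        out.append(Fraction(h, k))
    return out

ns = spectrum(43, 100)               # 43/100 = 1/(2 + 1/(3 + 1/14))
xs = convergents(ns)
assert xs[-1] == Fraction(43, 100)   # expansion recovers the ratio

# Alternating series 1/(k0 k1) - 1/(k1 k2) + ... telescopes to a/b
ks = [1] + [x.denominator for x in xs]
series = sum((-1) ** i * Fraction(1, ks[i] * ks[i + 1]) for i in range(len(ns)))
assert series == Fraction(43, 100)
```

For 43/100 the spectrum is 2, 3, 14 and the convergents are 1/2, 3/7, 43/100, so the series reads 1/2 − 1/14 + 1/700.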
The series corresponds to a binary code wherein 0 is assigned to terms of the dictionary that do not participate in the series, and 1 to those that do participate, proceeded by an alternating ± sign. Such a representation of the measurement process is highly redundant since the entire dictionary cannot be involved in a series. One should, therefore, eliminate excess zeros, which is achieved by coding the spectrum n₁, n₂, . . . . The ultimate representation is composed of alternating ± values different from zero at positions n₁, n₁ + n₂, . . . , which gives rise to the question mark function ? by Minkowski. States of the measurement process are represented by random variables from the Hilbert space L²_µ, implying the Lebesgue measure µ on the Borel algebra over the unit interval. The event X evolves according to the inverse image in the Rényi map R⁻¹(X) = X/2 + (X+1)/2, whereat the sum between fractions denotes disjoint union and the one within the fraction is the pointwise addition. Such an evolution induces the operator U : x(ω) → x(Rω), which is applied to any variable. It is coincident to the action of an adjoint operator onto the density function ρ = |φ|², considering that µ_φ(Ux) = ⟨ρ|Ux⟩ = ⟨U†ρ|x⟩. The adjoint operator U†, which also applies to any variable, is the left inverse of the evolutionary operator, i.e., U†U = I.
The measurement process is reflected by wavelets $\psi_{j,k}$, which correspond to orthonormal bases of $L^2_\mu$. The Haar base is paradigmatically designed by translations and normalized dilatations of the mother wavelet $\chi(\omega) = -1$ for $0 \le \omega < \frac{1}{2}$ and $+1$ for $\frac{1}{2} < \omega \le 1$, in the manner of $\psi_{j,k}(\omega) = 2^{j/2}\,\chi(2^j \omega - k + 1)$ for $1 \le k \le 2^j$, which implies that basic variables are zero-valued elsewhere.
Wavelets on the unit interval should satisfy the axioms: (v) basicity, $\phi = A + \sum_{j \ge 0} \sum_{k=1}^{2^j} D_{j,k} \psi_{j,k}$ for every variable $\phi$ from $L^2_\mu$ and some coefficients of approximation $A$ and details $D_{j,k}$; (vi) orthonormality, the $\psi_{j,k}$, $1 \le k \le 2^j$, and the constant variable $1$ constitute an orthonormal base of $L^2_\mu$. It follows from the last axiom that the base consists of decorrelated variables, considering that $E[\psi_{j,k}] = \langle 1 | \psi_{j,k} \rangle = 0 = E[1]\,E[\psi_{j,k}]$, and $E[\psi^\dagger_{j,k} \psi_{l,m}] = \langle \psi_{j,k} | \psi_{l,m} \rangle = 0 = E[\psi^\dagger_{j,k}]\,E[\psi_{l,m}]$ for $(j,k) \neq (l,m)$. In terms of the evolutionary operator, axiom (iv) gives rise to $U\psi_{j,k} = \frac{1}{\sqrt{2}}\psi_{j+1,k} + \frac{1}{\sqrt{2}}\psi_{j+1,k+2^j}$. Since the Rényi map is a measure-preserving transformation of the time continuum, $U$ should preserve the distribution of random variables across scales. In addition, the translation axiom (iii) claims that constituents of the base are equally distributed within a common scale [53].
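On a dyadic grid, the orthonormality axiom and the action of $U$ on the Haar base can be verified directly (an illustrative sketch; the discretization and names are ours):

```python
import numpy as np

J = 6
N = 2 ** J
omega = np.arange(N) / N   # left endpoints of a dyadic grid on [0, 1)

def haar(j, k):
    """Haar wavelet psi_{j,k}, 1 <= k <= 2**j: a normalized dilatation and
    translation of the mother wavelet (-1 on the left half, +1 on the right)."""
    lo, mid, hi = (k - 1) / 2 ** j, (k - 0.5) / 2 ** j, k / 2 ** j
    return 2 ** (j / 2) * (np.where((omega >= lo) & (omega < mid), -1.0, 0.0)
                           + np.where((omega >= mid) & (omega < hi), 1.0, 0.0))

def inner(f, g):
    return np.mean(f * g)   # <f|g> with respect to Lebesgue measure

# axiom (vi): the psi_{j,k} together with the constant variable 1 are orthonormal
assert abs(inner(haar(2, 3), haar(2, 3)) - 1) < 1e-12
assert abs(inner(haar(2, 3), haar(3, 1))) < 1e-12
assert abs(inner(haar(2, 3), np.ones(N))) < 1e-12

# evolution U x(w) = x(Rw) under the Renyi (doubling) map R w = 2w mod 1,
# exact on the dyadic grid for scales j <= J - 2
def U(x):
    return x[(2 * np.arange(N)) % N]

# U psi_{j,k} = (psi_{j+1,k} + psi_{j+1,k+2^j}) / sqrt(2)
assert np.allclose(U(haar(2, 3)), (haar(3, 3) + haar(3, 3 + 2 ** 2)) / np.sqrt(2))
```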
Wavelets $\psi_{j,k}$ are designated by two indices, the first of which corresponds to the scale in the binary hierarchy. It is transmitted to the detail coefficients, which form the binary tree $D = (D_{j,k})$, wherein each coefficient $D_{j,k} = \langle \psi_{j,k} | f \rangle$ at the scale $j$ is inherited by two of them sharing the same position at the next scale $j+1$ (Figure 3). Steps of the measurement process correspond to invariant subspaces $\Delta_j$ of the orthogonal projectors $P_j : \phi \mapsto \sum_{k=1}^{2^j} D_{j,k}\psi_{j,k}$, whose wandering generates a multiresolution analysis due to the shift property $\Delta_j = U^{-1}(\Delta_{j+1})$. In that respect, time is represented by the operator $T\psi_{j,k} = j\psi_{j,k}$, whose eigenvalues correspond to scales of the eigenbase $\psi_{j,k}$. The operator $T = \sum_{j \ge 0} j P_j$ acts on a dense subset of $L^2_\mu \ominus 1$, which is the Hilbert space reduced by the subspace of constant variables. The uncertainty relation $[T, U] = U$, which follows from the shift property, is the definition of the time operator [54].
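The uncertainty relation $[T, U] = U$ can be checked in the wavelet domain, representing a variable by its detail coefficients (a sketch under the stated definitions of $T$ and $U$; names are ours):

```python
from math import sqrt, isclose

def U(D):
    """Evolution on detail coefficients: U psi_{j,k} = (psi_{j+1,k}
    + psi_{j+1,k+2^j}) / sqrt(2), so each coefficient is passed down
    to two positions at the next scale."""
    out = {}
    for (j, k), d in D.items():
        for kk in (k, k + 2 ** j):
            out[(j + 1, kk)] = out.get((j + 1, kk), 0.0) + d / sqrt(2)
    return out

def T(D):
    """Time operator T = sum_j j P_j: multiplies each coefficient by its scale."""
    return {(j, k): j * d for (j, k), d in D.items()}

x = {(2, 3): 1.0, (0, 1): -0.5}   # an arbitrary variable in the wavelet domain
lhs = {jk: T(U(x))[jk] - U(T(x)).get(jk, 0.0) for jk in U(x)}   # [T, U] x
assert all(isclose(lhs[jk], U(x)[jk]) for jk in U(x))           # [T, U] = U
```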

Wavelet-Domain Hidden Markov Model
In order to elucidate the evolution of projectors, one requires an invertible extension of the operator $U$. The natural extension concerns the baker map $B$, inducing the evolutionary operator $U_\chi : \varphi(\omega) \mapsto \varphi(B\omega)$ [55]. In that regard, the Hilbert space $L^2_\mu$ should be extended to $L^2_M = L^2_\mu \otimes L^2_\mu$, wherein $M = \mu \otimes \mu$ is the Lebesgue measure on the Borel algebra over the unit square. The time operator $T_\chi$ concerning the evolution by $U_\chi$ has been explicitly constructed, and the uncertainty relation is easily generalized [47], pp. 47–60. Its projection onto $L^2_\mu$ corresponds to the multiresolution analysis generated by the Haar base. The time operator of any other wavelets is obtained through the conjugation $T = C T_\chi C^{-1}$ by the operator $C$ which transforms the Haar base into the other one. It corresponds to the evolutionary operator $U = C U_\chi C^{-1}$, which is also an extension of $U : x(\omega) \mapsto x(R\omega)$ and is, for that reason, designated by the same letter.
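The baker map and its inverse can be written down directly; projecting onto the first coordinate recovers the Rényi map (a minimal sketch; names are ours):

```python
def baker(x, y):
    """Baker map B on the unit square: an invertible extension of the Renyi map."""
    return (2 * x, y / 2) if x < 0.5 else (2 * x - 1, (y + 1) / 2)

def baker_inv(x, y):
    """Inverse of the baker map (stack the halves back vertically)."""
    return (x / 2, 2 * y) if y < 0.5 else ((x + 1) / 2, 2 * y - 1)

x, y = 5 / 16, 11 / 16                      # dyadic point, exact in floating point
assert baker_inv(*baker(x, y)) == (x, y)    # B is invertible on the square
assert baker(x, y)[0] == (2 * x) % 1        # first coordinate evolves by R x = 2x mod 1
```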
The orthogonal projection $P : \varphi \mapsto \varphi x$ onto an event $x$ evolves into $\mathcal{U}P = U P U^\dagger$ concerning the event $Ux$, which is an evolutionary superoperator that should be applied to all projectors. The density operator $\rho = \Phi\Phi^\dagger$ represents the probability measure, whose evolution corresponds to an action of $U^\dagger$ on the root operator $\Phi$. In that respect, the ensemble evolves according to the rule $M_\Phi(\mathcal{U}P) = M_{U^\dagger\Phi}(P)$.
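The superoperator and the duality rule can be illustrated with finite matrices (a sketch in a real four-dimensional space, where the adjoint is the transpose; the dimension and names are arbitrary assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
U, _ = np.linalg.qr(rng.normal(size=(4, 4)))   # a random real unitary (orthogonal) U
v = np.array([0.0, 1.0, 0.0, 0.0])
P = np.outer(v, v)                             # projector onto the span of v

evolved = U @ P @ U.T                          # superoperator: P -> U P U^dagger
assert np.allclose(evolved, np.outer(U @ v, U @ v))   # projector onto the evolved vector

Phi = rng.normal(size=(4, 4))                  # root operator, density rho = Phi Phi^dagger
rho = Phi @ Phi.T
lhs = np.trace(rho @ evolved)                  # M_Phi(U P)
Psi = U.T @ Phi                                # U^dagger Phi
rhs = np.trace(Psi @ Psi.T @ P)                # M_{U^dagger Phi}(P)
assert np.isclose(lhs, rhs)                    # duality by cyclicity of the trace
```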
The measurement corresponding to wavelets $\psi_{j,k}$ is represented by $P_{j,k} = \psi_{j,k} \otimes D_{j,k}$, which is a complete system of orthogonal projectors on $L^2_\mu \ominus 1$, evolving by $\mathcal{U}P_{j,k} = P_{j+1,k} + P_{j+1,k+2^j}$. The probability measure of a property $P_{j,k}$ in the context $\Phi$ is $M_\Phi(P_{j,k}) = \langle P_{j,k}\Phi | \Phi \rangle$. In the optimal base the density operator should take a diagonal form, from which it follows that detail coefficients in the optimal base are decorrelated, i.e., $E[D^\dagger_{j,k} D_{l,m}] = 0$ for $(j,k) \neq (l,m)$.
In another base $\psi_{l,m}$ that is suboptimal, detail coefficients take the form of expansions over the optimal ones, which implies an approximate decorrelation of the ensemble. Considering that $\psi^o_{j,k}$ and $\psi_{l,m}$ are almost entirely localized in $[\frac{k-1}{2^j}, \frac{k}{2^j}]$ and $[\frac{m-1}{2^l}, \frac{m}{2^l}]$, respectively, so that $\langle \psi_{l,m} | \psi^o_{j,k} \rangle$ is negligible if the domains do not intersect, correlations predominantly concern inheritance along branches of the binary tree. The time operator indicates an irreversibility that leads to the wavelet-domain hidden Markov model, which has been proven tremendously useful in a variety of applications including speech recognition and artificial intelligence (Figure 4).
The model claims that correlations between detail coefficients are realized through hidden variables, forming a Markovian tree $S = (S_{j,k})$ due to inheritance along its branches [56]. The information contained in detail coefficients is independent of the wavelets, since $H(CD) = H(D) + \log|\det C| = H(D)$ for a unitary operator $C$ representing the base substitution. It is decomposed by the canonical relation $H(D) = H(S) + H(D|S)$, wherein the first term represents structural information, and the second one is the irreducible randomness that remains even when all correlations have been recognized. The structural information is base-dependent, and the optimal base is characterized by its maximization, which concerns the most significant increase of entropy in the temporal domain [57].
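The hidden Markov tree can be simulated in a few lines (an illustrative sketch; the transition matrix, emission scales, depth, and child indexing by nested support are arbitrary assumptions, not taken from [56]):

```python
import numpy as np

rng = np.random.default_rng(1)
DEPTH = 4
TRANS = np.array([[0.9, 0.1],   # rows: parent state; columns: child state
                  [0.3, 0.7]])  # state 0 = "small", 1 = "large"
SIGMA = [0.1, 2.0]              # emission scale of D_{j,k} given the hidden state

def grow(j, k, state, tree):
    """Hidden states S_{j,k} form a Markov chain along each branch of the binary
    tree; detail coefficients D_{j,k} are emitted independently given the state."""
    tree[(j, k)] = (state, rng.normal(0.0, SIGMA[state]))
    if j + 1 < DEPTH:
        for child in (2 * k - 1, 2 * k):   # two children with nested support
            grow(j + 1, child, rng.choice(2, p=TRANS[state]), tree)

tree = {}
grow(0, 1, 1, tree)   # root starts in the "large" state
print(len(tree), "nodes;", sum(s for s, _ in tree.values()), "in the 'large' state")
```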

The Art of Memory
Discussing the measurement problem, von Neumann made a reference to Bohr, who was the first to have pointed out a link between quantum theory and the principle of psychophysical parallelism [46], p. 207. Bohr has adopted Fechner's psychophysics, which is termed the identity view, since the observer is not to be considered a conglomerate of two substances but one single entity [58]. The most significant sources for psychophysical parallelism are the foreword and the introduction from the Elements of Psychophysics [59]. The outer psychophysics, which is a link between sensation and stimulation, is realized through the neuroaesthetical computation that relates sensation to neural activity, which is regarded by Fechner as the inner psychophysics [35].
An important repercussion of von Neumann's solution to the measurement problem is that the irreversibility takes place in the presence of the observer's mind, which seems to play an active role in the process. The only manner of making such an unpleasant situation compatible with psychophysical parallelism concerns switching into the inner psychophysics, due to a change in representation $\Lambda = \lambda(T)$, which is an operator function of time [60]. In that regard, the inner psychophysics is represented by a Markovian tree of hidden variables, which corresponds to the recognition of structural information. It concerns a temporally based hierarchy whose paradigm is a memory that has taken an artistic form [61]. The traditional iconography is an instance of the time continuum, whose intensionality is reflected in hierarchical structures of geometry (Figure 5). A temporal organization also characterized primordial painting, whose origins have been traced back to the Stone Age [33], pp. 336–337, as well as the postmodern art of the XX century [62]. The art of memory was an occupation of Gottfried Wilhelm Leibniz, whose Dissertatio de arte combinatoria relies on various theories of that discipline. In that respect, one should consider his intent of finding a universal language through a combination of significant symbols, presented in the Characteristica universalis [63]. It led him to the invention of the infinitesimal calculus, derived from a continuous search for a system of symbolic representation [64]. The time continuum indicated in such a manner takes on complete realization in the intuitionist theory of infinitesimal analysis, which is an intensional one [40].

Postmodern Science
Charles Sanders Peirce admits that it is quite difficult to explain the fact of memory and of apparently perceiving the flow of time unless we suppose immediate consciousness to extend beyond a single instant [65]. There is an evident incompatibility between the extensional view and a conscious mind whose primordial intuition concerns the time flow. Memory should be related to inheritance in a hierarchical structure that is temporally organized [61]. The concept of genus originally implied generation and succession, but it has been substituted by the class, which is an abstract collection in the manner of set theory [2]. That substitution was the germ of the modernism to which science has been sentenced, owing to such neglect of the generative unity.
Postmodern science has greatly overcome the extensional viewpoint [5]. First of all, Gödel's theorems demonstrated the incompleteness of such a view by the use of self-referential sentences [36], pp. 39–47. However, Gödel has pointed out that Leibniz, in his writings about the Characteristica universalis, did not speak of a Utopian project at all [66]. As a matter of fact, in the Monadology, he exposed a hierarchical structure of entities resembling quantum states. It indicates an intensionalization, by which the classical theory is related to the quantum one [67].
The phenomenology of Mihailo Petrović is also reminiscent of the mathesis universalis, beginning from Pythagoras and going through Descartes and Leibniz, which represents a universal calculus that should involve everything in existence [68]. In order to give methodological support to such a tendency, he established mathematical spectra, which correspond to the method of optical spectroscopy [69]. It applies an embedding of various phenomena into the time continuum, whose digital positions are analogous to spectral lines [70]. In that regard, intuitionism is fundamental to mathematical physics, standing for the unity of mathematics and applications.
Fixed points in the capacity of self-referential sentences, which ascribe themselves some property within a certain theory, have motivated general research into such a conception. Restoration of the intensional viewpoint implies the requirement that the projector designate a self-attributed property in the form of a fixed point [36], p. 63. The fact that the projector itself is a fixed point elucidates the property of properties corresponding to self-reference [46], p. 249. It fits well with a resolved form that is termed the reproductive one [71], which indicates an interrelation not only with fractal geometry but also with biology and other sciences [72].
The measurement is also represented by a superprojection that is self-referential, and the density operator is a fixed point in the optimal instance. The time operator provides an opportunity to consider the measurement process in terms of a temporal evolution, which unites two fundamentally diverse types of intervention in the quantum system. A change in representation provides switching between the outer and the inner psychophysics, due to the principle of psychophysical parallelism.

Conclusions
A link between classical and quantum probabilities has been elaborated in terms of intensionalization, which concerns the abandonment of sets in favor of ensembles. In that respect, the probability is not assigned to events but to contexts, which correspond to the states of a quantum system. The intensional probability theory is reflected by a geometrical structure of quantum states, corresponding to operators on the Hilbert space. The contextuality gives rise to the measurement problem, which should require the existence of the time operator. The time continuum by Brouwer has satisfied such a requirement, which makes intuitionism fundamental to mathematical physics.
The concept of measurement is related to the Euclidean algorithm, which considers the commensuration of magnitudes. Its step, in terms of binary digits, concerns the Rényi map, which extends to the baker map inducing the evolutionary operator. The time operator is defined by wavelets, which correspond to an orthonormal base reflecting the hierarchy of the measurement process. Its existence indicates an irreversibility that leads to the wavelet-domain hidden Markov model, which has been proven tremendously useful in a variety of applications. The model provides switching between the outer and the inner psychophysics, due to a change in representation, which is an operator function of time. In that manner, the inner psychophysics should recognize structural information presenting a temporally based hierarchy whose paradigm is memory.