Quantum Randomness is Chimeric

If quantum mechanics is taken for granted, the randomness derived from it may be vacuous or even delusional, yet sufficient for many practical purposes. “Random” quantum events are intimately related to the emergence of both space-time and the identification of physical properties through which so-called objects are aggregated. We also present a brief review of the metaphysics of indeterminism.

In what follows, we shall discuss randomness "extracted" from measurements of coherent superpositions of classically mutually exclusive states, then proceed to multipartite and mixed states. No quantum field theoretic many-particle effects such as stimulated or spontaneous emission or decay will be mentioned. In the later parts of the paper, we shall attempt a brief history of physical events that have been deemed "random" and, in particular, their relationship to the metaphysical ideas implied.
A. Quantum randomness through the measurement problem

Quantum mechanics allows the coherent superposition (or, by another denomination, linear combination) of states which correspond to mutually exclusive outcomes. The question arises: what kind of physical meaning can be given to such "self-contradictory" states?
For the sake of an example, take |ψ⟩ = ψ₀|0⟩ + ψ₁|1⟩ = (ψ₀, ψ₁)⊺ with |ψ₀|² + |ψ₁|² = 1 (⊺ stands for transposition), |0⟩ = (1, 0)⊺ and |1⟩ = (0, 1)⊺. Note that if we prefer to measure a "rotated", transformed observable |ψ⟩⟨ψ|, the state |ψ⟩ is perfectly determined and value definite, with the respective outcome always occurring. No randomness or value indefiniteness can be ascribed to such a configuration. With respect to |ψ⟩⟨ψ| and its orthogonal projection operator 1₂ − |ψ⟩⟨ψ| there is no uncertainty, and no possibility to obtain randomness. In this way of perception, randomness comes about if "detuned experiments" are performed, such as, for instance, the ones "measuring observables" corresponding to the orthogonal projection operators |0⟩⟨0| and |1⟩⟨1| = 1₂ − |0⟩⟨0|. An immediate question arises: why should such "detuned experiments" yield any results at all, and if so, in what way do outcomes of such "wrong experiments" reflect any intrinsic property of the state |ψ⟩? It is rather mind-boggling that one should get any answer at all from such queries or "detuned" measurements. But this may be as confounding as it is deceptive, because one might get the impression that there is a physical property "out there", "sticking" to and being associated with the state. I believe that mistakenly interpreting an experimental outcome, such as a detector click, as some inherent property constitutes a major epistemological issue that underlies many ill-posed claims and confusions about such quantum states. Indeed, these misconceptions may epitomize the erroneous claims upon which random number generators based on "quantum coin tosses" rest.
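The contrast between "tuned" and "detuned" measurements can be made concrete in a minimal numerical sketch. The Born rule and the bases are standard; the concrete equal-weight amplitudes are an illustrative assumption:

```python
import math

# Amplitudes of an equal-weight superposition |psi> = psi0|0> + psi1|1>;
# the concrete values are an illustrative assumption.
psi = (1 / math.sqrt(2), 1 / math.sqrt(2))

def born_probability(state, basis_vector):
    """Born rule for real amplitudes: |<basis|state>|**2."""
    amplitude = state[0] * basis_vector[0] + state[1] * basis_vector[1]
    return amplitude ** 2

# "Tuned" measurement in the basis {|psi>, |psi_perp>}: value definite.
psi_perp = (psi[1], -psi[0])
print(born_probability(psi, psi))       # ~1: this outcome always occurs
print(born_probability(psi, psi_perp))  # ~0: this outcome never occurs

# "Detuned" measurement in the computational basis {|0>, |1>}:
# each outcome occurs with probability ~1/2, the alleged "quantum coin toss".
print(born_probability(psi, (1.0, 0.0)))
print(born_probability(psi, (0.0, 1.0)))
```

The same state is thus value definite relative to one observable and yields maximally "random" outcomes relative to another; the randomness is relative to the chosen context, not an absolute property of |ψ⟩.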
The quantum measurement problem is relevant for any judgment or certification or opinion on quantum randomness: "extracting" or "reducing" such states as |ψ⟩ by "measuring" them in the "wrong and detuned" basis |0⟩ and |1⟩, different from |ψ⟩ and its orthogonal vector, lies at the heart of the quantum measurement problem. The respective "process", just as taking (partial) traces, is non-unitary because it is postulated to be "many-to-one" and irreversible. Therefore, such "processes" are inconsistent with the unitary quantum evolution, which is "one-to-one" and reversible. (See Section 1.8 of Ref. [5] for a nice presentation.) This inconsistency is an old issue already raised by von Neumann [23,24], Schrödinger [25], London and Bauer [26,27], Everett [28-30] and Wigner [31]. It can be developed as a "nesting" or "inverse Russian doll" type argument by ever-increasing the domain of unitarity, including the measurement apparatus and the measured state, and hence the interface or cut "between" them. This has been proposed and operationalized in quantum optical experiments reconstructing the coherent superposition of states after "measurements" [32-40], as well as in discussions about the insurmountable practical difficulties in doing so [41,42].
Strictly speaking, by assuming irreversible many-to-one "processes" one has to go beyond quantum mechanics in an ad hoc fashion. Presently there is no evidence suggesting that this is necessary or even consistent with empirical data. Should quantum mechanics be extended against all experimental evidence, just because it is theoretically convenient and saves primitive notions of "measurement"?

B. Objectification by emergent context translation
In what follows, it will be argued that any kind of measurement, in particular also one associated with "detuned experiments", constitutes an object or reality construction, whereby the conventionality of measurement plays an essential role. In this process, the very notion of objects or physical properties becomes conventionalized. Objects or the properties constituting them may be real or chimeric; in the latter, chimeric case the experiments relate to properties the system is fantasized to have but does not encode [43].
The term chimeric will be associated with coherent superpositions or linear combinations of different (mutually orthogonal) states, relative to those states or their associated observable propositions. For instance, |ψ⟩ = ψ₀|0⟩ + ψ₁|1⟩ is chimeric relative to the propositions |0⟩⟨0| and |1⟩⟨1|. States are not chimeric relative to the propositions associated with those exact states; that is, |ψ⟩ is value definite or "real" and not chimeric relative to |ψ⟩⟨ψ|.
The emergent process of "creating chimeras" will be called objectification or object emergence or (re)construction. Objectification is related to an ancient conundrum [44]: the Ship of Theseus or, more generally, what in philosophy is called "the problem of identity" [45,46]. In the physical measurement process, it is the question of how, through "mediation" of its environment and the measurement apparatus, a physical state or system which initially is unprepared to answer a particular query (or, stated differently, is value indefinite and chimeric) "translates" the respective "detuned" query such that it can respond to the request. Through this "context translation" it may have acquired signals and information exterior to itself, which may render the answer stochastic relative to itself (because of an influx from the open environment) and to the experimental means available [47,48] (containing or severing that open environment).
This has consequences for the stochasticity of chimeras: they are not only based on some property intrinsic to the object but on the combined context by which the object, as well as the apparatus, is defined [49]. Stochasticity enters through the many degrees of freedom of such a combined system. This kind of emergence of an "experimental outcome" associated with a counter reading of a (macroscopic) measurement apparatus has already been modeled (i) by a coupling of the object with the apparatus and its environment [50], and (ii) by "attenuating" a quantum signal from a single state into a "noisy multitude" of imperfect clones of this state [51,52] (it is always possible to clone two fixed orthogonal states) "as much as possible" (that is, for a single unknown state, not at all) within the framework of the no-cloning theorem (cf. Section 2.1 of Ref. [5]).
For the sake of understanding on which basis claims of absolute randomness are raised beyond evangelical confessions [53,54], let me reconstruct current "best-practice arguments" for quantum indeterminacy [55] and value indefiniteness [22,56] and their counterfactual [57,58] character. There "exist" collections of (counterfactual [57]) observables comprising intertwining contexts (formalized by orthonormal bases or maximal operators in dimension three or higher) with two terminal point states, one serving as preselection or preparation, the other one for postselection or "measurement", with the following inconsistent properties: upon pre-selection or preparation of a particular state |Ψ⟩, (i) one such collection of observables enforces the non-occurrence of some post-selected state |Φ⟩, associated with a certain negative experimental result; (ii) another such collection of observables enforces the occurrence of some post-selected state |Φ′⟩, associated with a certain positive experimental result [59,60]; (iii) both post- and pre-selected states are the same, say, |Ψ⟩ = (1, 0, 0)⊺ [56,61]. Figure 1 sketches such a configuration. The classical inconsistency arises from the fact that, depending on the arrangement of the quantum observables, the same observable must be both false (snake-decorated path) and true (zigzag-decorated path) at the same time, a complete contradiction amounting to the absurd prediction that a detector associated with such a binary observable simultaneously registers a click and does not do so. Relative to the assumptions made, |Φ⟩ given |Ψ⟩ cannot have a classical value definite truth assignment, meaning any such truth assignment is undefined at least for |Φ⟩. This renders the truth assignment a partial function, a notion well known in theoretical computer science [62] (relative to the assumptions made for classical truth assignments, in particular, its independence of the context in which it is measured).
The argument can be extended to any state not collinear with or orthogonal to the pre-selected state |Ψ⟩ [22].
Another implicit assumption that is seldom mentioned, because it is assumed evident, is the omni-existence of the collection of complementary observables (the argument involves different contexts). Indeed, the coexistence of counterfactual, complementary observables is (mostly implicitly) assumed without further discussion. One common response to critical doubts about their existence is that "they can be measured". That is, a particular state |ψ⟩ can be prepared or pre-selected and subsequently the proposition corresponding to another "mismatching" state |ϕ⟩ (which should be neither orthogonal to nor collinear with |ψ⟩) can be measured or post-selected. This, of course, is omni-realism pure and simple.
Coming back to the argument sketched in Figure 1, it is evident that, due to pre-selection or preparation, the state |Ψ⟩ and its associated observable proposition |Ψ⟩⟨Ψ| are value definite relative to measurements of |Ψ⟩⟨Ψ|. But should this be assumed for all the other observables entering the argument? In particular, should value definiteness be expected from some state |Φ⟩ given |Ψ⟩? After all, |Φ⟩ and all other observables entering as counterfactual "intermediaries" in the argument need to be in coherent superpositions of states different from the pre-selected state |Ψ⟩, which makes them chimeric relative to |Ψ⟩.

C. Entanglement and emergence of space-time
Einstein's primary intent [63-65] in writing a paper with Podolsky and Rosen [66] (EPR) was to present a separation principle or separation hypothesis: given two, possibly space-like separated, subsystems A and B of a joint system (A B), then B (my translation, see also [65]) "and everything related to its content is independent of what happens with regard to" A. Thereby, Einstein's presumption has been that, after any interaction between A and B in the past (quoted from the same letter, my translation, see also [65]) "the real state of (A B) consists of the real state of A and the real state of B, which two states have nothing to do with one another". This latter assumption, at least for Einstein, is one pillar of the EPR argument. However, suppose that we are not inclined to follow Einstein's critique of quantum mechanics but propose that, rather than quantum theory, space-time physics and relativity theory would need to adapt in case there is a collision with quantum mechanics. Then the separation principle should be considered incorrect and not be applied to entangled quantum states, introduced by Schrödinger [25,67,68] around the same time (1935 and 1936) as the EPR paper. In particular, there exist entangled states of two subsystems A and B which are indecomposable; that is, they cannot be written as the product of the states of the two "separated" systems A and B; more formally, |Ψ(A B)⟩ ≠ |ψ(A)⟩ ⊗ |φ(B)⟩, where ⊗ stands for the tensor product.
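For a two-qubit pure state, indecomposability can be checked directly: writing the state as a sum of amplitudes c_ij over the product basis |i⟩⊗|j⟩, it factorizes exactly when the 2×2 coefficient matrix (c_ij) has rank one, i.e. when its determinant vanishes. A minimal sketch (the determinant criterion is standard; the example amplitudes are chosen for illustration):

```python
import math

def is_separable(c00, c01, c10, c11, tol=1e-12):
    """A two-qubit pure state sum_ij c_ij |i>|j> is a product state
    |psi(A)> (x) |phi(B)> iff the 2x2 coefficient matrix has rank 1,
    i.e. its determinant c00*c11 - c01*c10 vanishes."""
    return abs(c00 * c11 - c01 * c10) < tol

s = 1 / math.sqrt(2)

# Product state |0> (x) (|0> + |1>)/sqrt(2): decomposable.
print(is_separable(s, s, 0.0, 0.0))   # True

# Bell singlet (|01> - |10>)/sqrt(2): entangled, hence indecomposable.
print(is_separable(0.0, s, -s, 0.0))  # False
```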
This inseparability, as discussed by Schrödinger in the measurement context (between object and measurement apparatus), has been re-interpreted in terms of relational properties [43] for multi-partite configurations. It comprises two parts, a restrictive and an extensive property as compared to classical physical systems: (i) quantum mechanics limits the amount of information encodable in a quantized system from above, and (ii) it allows the storage, resampling [69] or scrambling of such limited information "across quanta". Both properties can be viewed as direct consequences of the unitary transformations postulated as formalizations of quantum state evolution, because entangled systems are merely "a unitary transformation apart" from separable states [70, Section 12.8.2].
Let us pursue a very radical, iconoclastic deviation from the Kantian idea that space-time is an a priori theatrical frame, a sort of scaffolding, in which physics takes place. Rather, suppose that (i) in reversing Einstein's verdict mentioned earlier, for (maximally) entangled states of a composite system (A B), its constituents share a common identity; that is, they "are tied together" and can be considered "aspects of a single entity" and, in particular, "not spatio-temporally separated at all", so much so that any individuality or separateness vanishes.
(ii) Space-time needs to be derived from quantum effects as an (emergent) epiphenomenon, a secondary effect or byproduct that arises vis-à-vis quantized systems and does not stand separate from or independent of them.
In this view distances are gradual and a matter of disentanglement: two events such as detector clicks are "apart" if their corresponding states are (for all practical purposes) factorizable and decomposable, and thus disentangled. Spatio-temporal separations and distances are to be understood more like the second law of thermodynamics [71]: they are not absolute but relative to the (entanglement) means involved. This creates a "patchwork" of clocks and rulers, associated with the respective entanglements. Such emergent space-time frames need not necessarily be consistent with one another, but rather form a mesh of spatio-temporal networks.
Most radically, what may be considered "far apart" in the old Kantian-Einsteinian framework may not be separated at all in the new scheme. For most practical purposes [72,73], the two notions of spatio-temporal distance may coincide, because entanglement and "nonlocality" with respect to the old "absolute" theatrical framework of space-time (for all practical purposes) "happen locally" and, again according to the Ancien Régime of Kantian-Einsteinian space-time frames, not "far away". This radical departure from the Kantian-Einsteinian framework of space-time by emergence from entanglement has been discussed in entanglement-induced gravity [74-81]. See also Ref. [82] for another approach to emergent space-time. This is a new and active research program.
A lot of questions arise immediately. One issue that needs to be addressed is that of the finite speed of light, as compared to instantaneous entanglement: can some finite speed of information transfer be derived from an infinite property? One Ansatz is given in Ref. [83]. What is (inertial) motion, and what type of kinematics results from entanglement? Entanglement swapping comes to mind immediately, but it lacks any notion of inertia. Indeed, we might be tempted to speculate that the absence of inertia, rather than being a problematic feature, might be an advantage, suggesting possibilities of inertialess motion [84], and motion beyond the relativistic speed limit. It might not appear too unreasonable to speculate that, if entanglement swapping takes place instantaneously, so may motion or signaling in space and time, even despite the following discussion.

D. Peaceful coexistence is not accidental but unavoidable
The argument stated by Einstein in his letter [63-65] to Schrödinger quoted earlier amounts to the aforementioned separation principle: measurement of a subsystem A of (A B) cannot alter the state of the subsystem B; in particular, not if the two subsystems are spatially separated. As noted earlier, Einstein attacked quantum mechanics for failing this principle for entangled multi-partite states. But as our approach considers the emergence of space-time as secondary to quantization, rather than questioning the validity of quantum mechanics we might as well respond with an "upside-down" question: why not? Why is space-time not challenged by these issues? To answer such questions it might be prudent to compare a similar classical EPR-type configuration with classical and more general resources. We can imagine at least two scenarios:

(i) Value definiteness of the individual constituents A and B and fixing of their respective local shares at the creation point: For this scenario Peres gave a most insightful analysis [85]. Classical "singlet" states (e.g., obtained by the conservation of angular momentum) may exhibit certain (dis)similar behaviors as compared to the quantum case. Classically the joint system (A B) "carries" some "common share", e.g., a hidden parameter such as the opposite angular momentum pseudovectors of the particles [86-88] along one and the same direction. These angular momentum pseudovectors are fixed and value definite for both parties or subsystems A and B already after their interaction. Therefore, the local information can in principle be used to produce local "copies" or "clones" of A and B. This is consistent with relativity theory because those shares remain fixed after their creation, so that whatever manipulation happens on one side does not alter the respective state or share on the other side.
(ii) Value indefiniteness of the individual constituents A and B but fixing of their respective global shares at the creation point: This may, for instance, be achieved by assuming a global value definite share or state of (A B), and yet by not allowing or "granting" definite states to the individual constituents A and B. Therefore, any attempt to copy them fails because of the absence of value definiteness. Quantum mechanics "guarantees" or realizes such a scenario by demanding that any entangled quantized pair (A B) exhibits a relational encoding. The states of the individual constituents A and B are not value definite: they lack "definiteness" or "memory" or information about the individual properties of the constituents; the value definiteness "resides" in the relational (not the individual), holistic, global, "collective" properties among the constituents [43]. If such individual properties are "enforced" upon the constituents through measurement, they react with a context translation which, through nesting, introduces stochasticity because of the many degrees of freedom introduced from the "outside" environment. As a result one obtains outcome independence although one still obtains parameter dependence; but the latter is only "recoverable" after the outcomes from both sides are compared [89,90].
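A small simulation illustrates this second scenario with the textbook singlet statistics. The correlation E(a, b) = −cos(a − b) and the uniform local marginals are standard quantum values; the sampling scheme below is just one convenient way to reproduce those statistics, not a local hidden-variable model. No local statistic reflects the remote setting; the dependence on both settings is recoverable only by comparing the outcomes from both sides:

```python
import math
import random

def singlet_pair(angle_a, angle_b, rng):
    """Sample a pair of +/-1 outcomes reproducing the quantum singlet
    statistics: uniform marginals and correlation E(a,b) = -cos(a-b).
    (Standard textbook values; this sampler is a convenient realization
    of the statistics, not a local model.)"""
    p_same = 0.5 * (1.0 - math.cos(angle_a - angle_b))  # P(outcomes agree)
    out_a = 1 if rng.random() < 0.5 else -1
    out_b = out_a if rng.random() < p_same else -out_a
    return out_a, out_b

rng = random.Random(1)
n = 100_000

# A's local statistics do not depend on B's (remote) setting:
for angle_b in (0.0, math.pi / 3):
    freq = sum(singlet_pair(0.0, angle_b, rng)[0] == 1 for _ in range(n)) / n
    print(round(freq, 2))  # ~0.50 for either remote setting

# Yet the joint correlation, recoverable only by comparing both sides,
# depends on both settings:
corr = sum(a * b for a, b in (singlet_pair(0.0, math.pi / 3, rng)
                              for _ in range(n))) / n
print(round(corr, 2))  # ~ -cos(pi/3) = -0.50
```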
Per se, both scenarios could be extended to any type of two-partite expectation function, which need not be linear as in the classical case, but can take on any functional form; in particular also the quantum stronger-than-classical, nonlinear (trigonometric, because of the projective character of the quantum probabilities) form. Indeed, by the same argument expectations and correlations might be even "stronger" than classical and quantum ones [88,91-93] without violating Einstein locality.
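The three classes of correlation strength can be compared with the CHSH expression. The bounds 2, 2√2 and 4 are standard; the sawtooth, cosine, and PR-box-style assignments below are illustrative stand-ins for the classical, quantum, and stronger-than-quantum cases:

```python
import math

# Two measurement settings per side, at angles standard for maximal
# quantum violation of the CHSH inequality.
a, a2, b, b2 = 0.0, math.pi / 2, math.pi / 4, -math.pi / 4

def chsh(E):
    """CHSH combination S = E(a,b) + E(a,b') + E(a',b) - E(a',b')."""
    return E(a, b) + E(a, b2) + E(a2, b) - E(a2, b2)

# Classical "singlet" correlation, linear (sawtooth) in the relative angle:
classical = lambda x, y: -1.0 + 2.0 * abs(x - y) / math.pi
# Quantum singlet correlation, trigonometric (projective probabilities):
quantum = lambda x, y: -math.cos(x - y)
# A Popescu-Rohrlich-box-style assignment on these four setting pairs:
pr_box = lambda x, y: 1.0 if (x, y) == (a2, b2) else -1.0

print(abs(chsh(classical)))  # ~2: the classical (local realist) bound
print(abs(chsh(quantum)))    # ~2.828: Tsirelson's bound 2*sqrt(2)
print(abs(chsh(pr_box)))     # 4: stronger than quantum, yet no-signaling
```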
Some argue that random outcomes "save" quantum mechanics from violating relativistic causality: if it were possible to somehow use the relational encoding of entangled inseparable states, either by duplicating nonorthogonal states [94] or by stimulated emission [95], then B could infer information about A's settings even before knowing A's outcomes post factum and in retrospect (after combining the knowledge of both outcomes). The random outcomes on A's side assure that B cannot know what happens on A's side. This argument can be extended to stronger-than-quantum correlations.
However, this kind of "peaceful coexistence" [89,90] may also be seen as a characterization of the second scenario (ii) discussed earlier; in particular, if one considers the "common share" accessible to A and B: it is, say, a pure entangled state of (A B); more formally, an indecomposable vector. As it is not decomposable, there is no meaning associated with individual properties of A and B. In this form, quantum entanglement defines spatio-temporal proximity, yet cannot produce any means of communication between the entangled parties: the "more entangled" the parties get, the "fewer individual" properties they carry. Their common share, such as an indecomposable vector, is useless for any form of classical communication between the entangled parties.
I, therefore, suggest that rather than speaking about a "peaceful coexistence" between relativity and quantum theory we should speak of this no-signaling constraint as an unavoidable feature of emergent space-time from entanglement. The value definiteness of the common indecomposable vector share of (A B) entails value indefiniteness of the individual states of A and B, which in turn results in stochasticity if individuality is forced upon those subsystems; very much in the same way as stochasticity emerges (by context translation) from coherent superpositions or linear combinations of states, when measured "along detuned, twisted contexts", as sketched earlier.

II. HISTORIC PERCEPTION OF RANDOMNESS
In what follows, randomness will be discussed in its historic context. This is important because of the lessons one could learn for the contemporary debate and perception of lawlessness and randomness. According to an influential narrative, the European Enlightenment developed as a courageous, thorough, and highly successful 1 exorcism of transcendence; in particular, the rejection of law-defying miracles [96]; moreover, the empirical sciences "established natural laws" of regular, reliable spatio-temporal coincidences which appear to be trustworthy and therefore of great utility.
The denial of any direct breach or "rupture" of the laws of nature [97,98,Sect. III,10] has pushed the boundaries of conceivable transcendental real-time interventions, and, in particular, divine providence, to the fringe of "gaps" [97,98,Sect. III,12] in the laws of nature-indeterminate situations where applicable laws, and thus the Principle of Sufficient Reason [99] have not (yet?) been identified.
As effective as the formal [100] and natural sciences are in terms of utility, they turn out to be as means and context relative 2 as any construct of thought: those imaginations of the human mind cannot deliver any "Archimedean point" or "ontological anchor" upon which an "objective reality" (whatever that is) can be based. Indeed, it is my idealistic [101-104] observation, or rather stance or conviction, that all our physical narratives [105-107], doubles [108,109], images [110,111] and, more optimistically, representations [112] of what we experience as "Nature" are metaphysical, or at least amalgamated with metaphysical components, and ultimately can be denounced as being suspended in our free thought. Therefore, historically we experience a succession of incongruent, incommensurable [113-118] scientific research programs [119,120]; a lineup which should make us humble when it comes to the mind-boggling effectiveness [100] of some of our formalisms in predicting, programming, manipulating and instrumentalizing physical systems 3.
An obvious counter-response to such idealistic positions is to contend that physics is firmly grounded in empirical data drawn from the observation of experimental outcomes. Support of theoretical physical models in the form of corroboration or falsification [125,126] by empirical evidence [127] is indispensable. As an extreme demand, physical theory should strive to include only operational entities which are physically realizable in terms of achievable actions and measurements [128-132]. However, the history of science presents ample evidence that it has never been possible to resort to empirical evidence alone for the advancement or discrimination of theoretical models [115,119,120]. Indeed, as stated by Einstein [63] (reprinted as Letter 206 in [64], my translation), there is a metaphysical circularity because "the real difficulty lies in the fact that physics is a kind of metaphysics; physics describes 'reality'. But we don't know what 'reality' is; we only know it through the physical description!" And although both the prediction and the willful reproduction of phenomena appear to be the cornerstone of the current natural sciences, the "empirical evidence" relating to "scientific facts" is often indirect and fragile, deserving a nuanced and careful analysis [133,134].

1 The criterion of success is taken relative to and in terms of full-spectrum dominance compared to alternative worldviews grounded in esoteric thought.
2 Means relativity of an entity such as an idea or a physical theory is the dependence (e.g., validity, existence) of this entity on the means, conventions, or assumptions employed. Context relativity relates to whatever circumstances form the setting for an event, in terms of which it can be fully understood. Perhaps means and context relativity are equivalent notions, yet the emphasis lies on different aspects of a situation.
3 The desperation, if not nihilism, that results from the deconstruction of long-held beliefs and narratives has been very vividly described by Schopenhauer [121], as well as through Nietzsche's Übermensch [122,123] and Camus' Sisyphe [124].
I shall offer three examples of the type of problems encountered in quantum mechanics, all three related to the occurrence of certain "clicks" of detectors. Arguably the occurrence or non-occurrence of such a click is the most elementary, binary observable one could think of. But while the (non)registration of detector clicks may be considered indisputable (for all practical purposes [73], and notwithstanding quantum erasures or haunted measurements [32-40]), the "meaning" of such clicks [135] remains open to a great variety of perceptions, interpretations, and understandings.
The first example is about measurements [136] of violations of classical locality with time-varying analyzers [137] if the periodic switching is synchronized with photon emissions [138]. A second example is about a debate [139,140] on quantum teleportation [141,142]. A third example is about the contingencies [61] arising from counterfactual arguments of Hardy-type configurations [60,143]. These cases document well the different claims and aspects derived from single detector clicks, as perceived by the different participating discussants.
Other aspects related to very general limits on symbolic representations need to be acknowledged. Any formalization of physical (in)determinism by (in)computability, of physical randomness as algorithmic incompressibility, and of general induction [144-148] would require transfinite means not available [149] in this Universe [150-152]. This is because the associated formal proofs are blocked by the aforementioned Gödel-Turing-type incompleteness and incomputability results. Therefore, one cannot expect the formal and natural sciences to offer absolute corroboration of any type of semantic statement. All they allow is the systematic exploitation of syntax and narratives which are true relative to the means and purposes involved.
In what follows, we shall first discuss what general options of randomness can be imagined; and then proceed with a discussion of their concrete physical modi operandi.

A. Bowler-type scenario of a clockwork universe
The assumption of a "clockwork universe", that is, "stuff" such as matter and energy, together with its assorted evolution laws which are uniformly valid and unique (leaving no room for alternatives), entails a "bowler"-type situation 4. The Principle of Sufficient Reason [99] rules; nothing occurs without a "reason" or "cause". Once this universe is created ex nihilo and put into motion there is no further or additional interference with it, as all necessary and sufficient conditions exist to determine its evolution uniquely and completely from a "previous" state into a "later" one 5.
How could physics facilitate and support such a view?
• The description of a unique physical state as a function of some operational physical quantity such as time; indeed, the very notion of a total function (as opposed to partiality [62]), Laplace's demon, causal [153] determinism and the Principle of Sufficient Reason are scientific tropes and schemes signifying clockwork universes. They were widely held in the pre-statistical-physics and pre-quantum eras, until around the fin de siècle.
In ordinary differential equations of classical continuum mechanics and classical electrodynamics the semantic notion of "determinism" is formalized by the uniqueness of the solutions, which is guaranteed by a Lipschitz continuity condition [70, Chapter 17].
• The quantum state evolution is postulated to be unique and deterministic 6 . However, in general-in the case of coherent superposition or mixed states-the quantum state is not operationally accessible. Therefore this sort of quantum determinacy cannot be given any direct empirical meaning.
• Deterministic chaos is characterized by a unique initial value-a "seed" supposed to be taken from the mathematical continuum and thus incomputable and even random 7 with probability one-whose information or digits are "revealed" by some suitable deterministic temporal evolution. To be suitable a temporal evolution needs to be very sensitive to changes of initial seeds such that very small fluctuations may produce very large effects. This is like Maxwell's gap scenario discussed later.
Like quantum evolution, deterministic chaos might be considered both an argument for and against classical determinism: the assumption of the continuum renders almost all seeds formally random [20], thereby passing all statistical tests of randomness; in particular "elementary" tests such as Borel normality, certifying that all sequences of arbitrary length occur with the expected frequency 8, but also much stronger ones. In this respect classical machinery designed to use extreme sensitivities of the temporal evolution to the initial seed, such as the Athenian [155] κληρωτήριον (kleroterion), for all practical purposes is not inferior to a quantum oracle for randomness, such as QUANTIS [18], based on the "evangelical" belief in irreducible quantum randomness [53].

4 In what follows, "god" or "deity" is understood as an entity creating existence; a sort of "programmer of the Universe."
5 In such a scenario free will appears to be illusory, as per assumption choices are merely fictitious and delusional.
6 Formally it is represented by a unitary transformation, that is, a generalized rotation mapping one orthonormal basis into another one. Such a state evolution is one-to-one and thus reversible and unique. However, if the preparation context differs from the measurement context, the quantum state does not identify outcomes uniquely, thereby allowing one particular kind of quantum indeterminacy.
7 Randomness of an infinite string is taken to be algorithmic incompressibility [20].
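The point can be illustrated with a small sketch: a deterministic chaotic map "reveals" the digits of its seed, and an elementary Borel-normality-style count over the resulting bit string checks whether short blocks occur with the expected frequencies. The logistic map at parameter 4 and the concrete seed are illustrative assumptions; floating-point arithmetic only approximates the ideal continuum dynamics:

```python
# Deterministic chaos as a randomness source: the logistic map x -> 4x(1-x)
# "reveals" the digits of its (with probability one, algorithmically random)
# seed; the concrete seed below is an illustrative assumption.
def logistic_bits(seed, n):
    x, bits = seed, []
    for _ in range(n):
        x = 4.0 * x * (1.0 - x)
        bits.append(1 if x > 0.5 else 0)
    return bits

bits = logistic_bits(0.1234567891, 100_000)

# Borel-normality-style check: each block of length k should occur with
# relative frequency close to 2**(-k).
def block_frequency(bits, block):
    k = len(block)
    windows = len(bits) - k + 1
    return sum(bits[i:i + k] == block for i in range(windows)) / windows

for block in ([0], [1], [0, 1], [1, 1]):
    print(block, round(block_frequency(bits, block), 2))
```

For a typical seed the length-1 blocks occur with frequency near 1/2 and the length-2 blocks near 1/4, which is exactly what such a deterministic "pseudo-oracle" is designed to deliver.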
• In system science or virtual physics, this modus could be referred to as a very restricted virtual reality, computational gaming environment or simulation [156][157][158][159] (aka simulacrum), whereby it is assumed that there is no interference from "the outside" (aka beyond): the respective universe is hermetic. No participation is possible; only passive (without interference) observation.
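The way a deterministic chaotic evolution "reveals" the digits of its seed can be sketched with the Bernoulli shift map x → 2x mod 1, which shifts the binary expansion of the seed one place per step. The seeds and the perturbation size below are arbitrary illustrative choices; the sketch only shows the two features claimed above: the orbit exposes the seed's digits one by one, and a tiny perturbation of the seed is doubled at every step until it dominates the output.

```python
# Bernoulli shift map x -> (2*x) mod 1: each step shifts the binary
# expansion of the seed one place, so the orbit "reveals" the seed's
# digits one bit per iteration.

def shift_map_bits(seed, n):
    """Return the first n binary digits of the seed via the shift map."""
    x, bits = seed, []
    for _ in range(n):
        bits.append(1 if x >= 0.5 else 0)
        x = (2 * x) % 1.0
    return bits

# Sanity check: 0.6875 = 0.1011 in binary.
assert shift_map_bits(0.6875, 4) == [1, 0, 1, 1]

# Sensitivity to the seed: perturbing the seed by 2**-30 leaves the
# leading bits untouched, but the perturbation doubles at every step
# and eventually dominates the output.
a = shift_map_bits(0.123456789, 60)
b = shift_map_bits(0.123456789 + 2**-30, 60)
first_diff = next(i for i, (u, v) in enumerate(zip(a, b)) if u != v)
assert a[:8] == b[:8]
print(first_diff)  # the two bit streams part after roughly 30 steps
```

Note that the "randomness" of the output is entirely inherited from the seed: the map itself is perfectly deterministic.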
How does physics contradict such a view?
• Classical gaps are characterized by instabilities at singular points, such that very small fluctuations may produce very large effects. To quote Maxwell [160, pp. 211, 212]: "for example, the rock loosed by frost and balanced on a singular point of the mountain-side, the little spark which kindles the great forest . . . At these points, influences whose physical magnitude is too small to be taken account of by a finite being, may produce results of the greatest importance."
• In some physical situations Lipschitz continuity is violated, so that the equations of motion do not have unique solutions. The Norton dome [161,162] is a contemporary example of such a situation.
• Spontaneous symmetry breaking, a physical (re)source of nonuniqueness, is a spontaneous process by which a physical system in a symmetric state ends up in an asymmetric state. This is facilitated by some appropriate "Mexican hat" potential, not dissimilar to Norton's dome or Maxwell's [160, pp. 211,212] "rock loosed by frost and balanced on a singular point" mentioned earlier.
In particle physics the Higgs mechanism, the spontaneous symmetry breaking of gauge symmetries, plays an important role in the origin of particle masses in the standard model of particle physics. All of these ruptures or breaches of uniqueness depend on the assumptions and models involved.
• Quantum indeterminacy, in particular, complementarity, contextuality (aka value indefiniteness), and aspects (such as the exact decay time) of the occurrence of certain single events are postulated to signify indeterminism.
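The non-uniqueness arising from a violated Lipschitz condition, as in Norton's dome above, can be checked directly. In suitable units the dome's equation of motion is r'' = √r with initial data r(0) = 0, r'(0) = 0; since √r is not Lipschitz at r = 0, the standard uniqueness theorem fails, and both the trivial solution and a "spontaneous slide" satisfy the same equation and initial data. A minimal numerical verification:

```python
# Norton's dome: r'' = sqrt(r), r(0) = 0, r'(0) = 0.  Because sqrt(r)
# is not Lipschitz continuous at r = 0, uniqueness fails: r(t) = 0 and
# the spontaneous slide r(t) = t**4/144 (or shifted by any delay T)
# both solve the equation with the same initial data.

import math

def residual(r, r_dotdot):
    """|r'' - sqrt(r)|: vanishes iff the pair solves the dome equation."""
    return abs(r_dotdot - math.sqrt(r))

# The trivial solution r = 0 has r'' = 0:
assert residual(0.0, 0.0) == 0.0

# The nontrivial solution r = t^4/144 has r'' = t^2/12 = sqrt(t^4/144):
for t in (0.5, 1.0, 2.0, 3.7):
    assert residual(t**4 / 144, t**2 / 12) < 1e-12

print("two distinct solutions share the initial data r(0) = r'(0) = 0")
```

The time T at which the sliding solution "takes off" is entirely unconstrained by the dynamics, which is precisely the indeterminism at issue.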
For both formal and empirical reasons these scenarios might be interrelated rather than separate: for instance, one might suspect that Maxwell's instabilities at singular points could be formalized by the "Mexican hat" type potentials discussed in spontaneous symmetry breaking, or by ordinary differential equations yielding Norton dome-type configurations. One might even speculate that all violations of Lipschitz continuity amount to some kind of symmetry breaking.
Empirically one might argue that, for all practical purposes [73], Maxwell's scenario, Norton dome-type configurations (related to violations of Lipschitz continuity), and spontaneous symmetry breaking never "actually" happen: for all practical purposes a rock loosed by frost is never (that is, with probability zero) symmetrically balanced at a singular point; rather the position of its center of gravity will fluctuate around the tip, thereby spoiling the symmetry. One may also argue that, due to (vacuum) fluctuations, singular points make no operational sense; they are (over)idealized concepts invented by the human mind for mere convenience. In particular, microscopic quantum zero-point fluctuations and thermal fluctuations [163] ultimately spoil symmetries. Therefore, all such exploitations of singularities might confuse epistemic convenience with an ontology that has no physical, operational grounds.
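The "Mexican hat" structure invoked above can be sketched with the simplest such potential, V(x) = -a x² + b x⁴ (the coefficients a = b = 1 below are arbitrary illustrative choices): the potential is symmetric under x → -x, yet its minima sit at the asymmetric positions x = ±√(a/(2b)), while the symmetric point x = 0 is an unstable local maximum, so any fluctuation pushes the system into one of the two symmetry-breaking minima.

```python
# Minimal "Mexican hat" potential V(x) = -A*x**2 + B*x**4 with A, B > 0.
# V is symmetric under x -> -x, yet the symmetric point x = 0 is an
# unstable local maximum; the two degenerate minima at x = +/- sqrt(A/(2B))
# are asymmetric, so relaxing into one of them breaks the symmetry.

import math

A, B = 1.0, 1.0

def V(x):
    return -A * x**2 + B * x**4

x_min = math.sqrt(A / (2 * B))   # stationary point of V: V'(x_min) = 0

assert V(0.7) == V(-0.7)              # the potential is symmetric
assert V(1e-3) < V(0.0)               # x = 0 is a local maximum
assert V(x_min) < V(x_min + 1e-3)     # x_min is a genuine minimum
assert V(x_min) < V(x_min - 1e-3)
print(x_min)  # about 0.7071: one of the two symmetry-breaking "vacua"
```

Which of the two minima is chosen is not determined by the symmetric potential itself, mirroring the role of fluctuations discussed in the text.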

B. Scenario of a stochastic, disorganized universe
The "converse" of a Laplacean determinism governed by a unique state evolution "tied to" previous states, as mentioned in the previous section, is one in which any given state is independent 9 of the respective previous (and future) states. In such a scenario, the most extreme among many conceivable degrees of stochasticity, the universe is "completely" stochastic and disorganized at the most fundamental level. From the embedded observer's intrinsic perspective, due to irreducible contingency and chance, it appears as if such a world is constantly created anew by throwing some sort of dice 10.
Whether and how some sort of structural continuity of existence can emerge and be maintained under such circumstances is a fascinating question. As in such a scenario space and time, as much as notions of causality and the laws, are emergent concepts, continuity might emerge with them.

9 Two events A and B are statistically independent if their joint probability P(A ∩ B) can be written as the product of their single probabilities P(A) and P(B); that is, P(A ∩ B) = P(A)P(B). It turns out that this leads down a rabbit hole, as the concept of probability is a nontrivial one [164].

10 This may be considered an extreme form of creatio continua. However, extrinsically, that is, from an external perspective, this may be considered creatio ex nihilo, as no active, real-time participation is assumed. Indeed, one may speculate that if the temporal ordering of events (and causality) turns out to be epistemic, an intrinsically emerging concept/observable of (self-)cognition/observation, then any differentiation based on temporal creation, such as creatio continua versus creatio ex nihilo, turns out to be a "red herring." Alas, without granting "time" some ontology, the differentiation between a "bowling" and a "curling" god also collapses.
Indeed, one might speculate that "the laws" are some sort of expressions of chaos 11, that the formation of matter and genes are expressions of these laws, that the individuals carrying those genes are expressions thereof [166], and that the ideas about the world are expressions of these individuals. In that transitive way, the Universe contemplates itself through our ideas, ideas such as religion, mathematics, ethics, and so on.
Contemporary physics supports such a view in postulating that many elementary events, such as the spontaneous or stimulated emission of photons, occur acausally and irreducibly, pure and simple [53,167]. Indeed, both classical statistical physics at finite resolution 12 and quantum mechanics support such a view.
Already Exner [168,169], motivated by statistical physics and the radiation law [170], suggested that [171, p. 7,18] ". . . laws do not exist in nature, those are only formulated by man, he makes use of it as a linguistic and computational aid and only wants to say that the processes in nature run as if matter, like a sentient being, would obey these laws. . . . So we must understand all so-called exact laws only as average laws, which are not valid with absolute certainty, but with the higher probability the more individual processes they result from. All physical laws go back to molecular processes of random nature and from them follows the result according to the laws of probability calculus . . . ." Even in totally "random" datasets, some sort of structure must necessarily emerge by the law of large numbers: for instance, if two dice are thrown sufficiently often, the number seven appears to be the most likely sum of their two faces. Modern arguments for the emergence of laws from chaos employ, among other methods [172][173][174][175][176][177][178][179], Ramsey theory for structure formation and structural continuity through spurious correlations [180]. It is irrelevant whether these events occur "absolutely randomly"-indeed, as has been pointed out earlier, on an individual level and with finitistic means, "absolute randomness" appears to be a vacuous concept.
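The two-dice example of structure emerging by the law of large numbers can be checked in a few lines: among the 36 equally likely ordered pairs of faces, six sum to seven, more than for any other sum, and in a long run the empirical frequency of the sum seven approaches 6/36. The sample size and seed below are arbitrary illustrative choices.

```python
# Structure emerging from pure chance: among the 36 equally likely
# ordered outcomes of two fair dice, six pairs sum to 7, more than any
# other sum.  By the law of large numbers, the empirical frequency of
# the sum 7 converges to 6/36 in a long run of throws.

import random
from collections import Counter

# Exact combinatorics: count how many of the 36 pairs yield each sum.
exact = Counter(i + j for i in range(1, 7) for j in range(1, 7))
assert exact[7] == 6 and max(exact, key=exact.get) == 7

# Empirical check with a long (pseudo-)random run:
random.seed(1)                       # fixed seed for reproducibility
n = 100_000
throws = Counter(random.randint(1, 6) + random.randint(1, 6)
                 for _ in range(n))
assert max(throws, key=throws.get) == 7
print(throws[7] / n)                 # close to 6/36 = 0.1666...
```

Note that the regularity ("seven is most frequent") holds regardless of whether the individual throws are "absolutely random" or merely pseudo-random, in keeping with the point made above.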

C. The intermediate curler case
Intuitively the curler case [181] is one in which the natural laws-whatever their form and origin-predominate, but there are situations in which such laws do not exist, or if laws exist they are violated. The first "weak" case of indeterminism can be realized by gaps 13 .
Theologically this could be perceived as a mild form of creatio continua 14: god has created laws that are not violated, but god also left "some room" to communicate via gaps.

11 This is not dissimilar to the impossible choice not to communicate [165].

12 A Laplacian demon with unbounded resources might be able to determine future states from present ones with arbitrary precision.

13 As mentioned earlier [97,98, Sect. III,10], "stronger" forms of curling involve a "rupture" of the laws of nature, as they are in direct violation of those laws, as mentioned in Voltaire's Philosophical Dictionary [182, Chapter 330]. Although nobody can a priori exclude such latter cases, we shall henceforth stick with Hume's attitude towards miracles [134, Section X] and neglect them.

14 Cf. my earlier remarks on creatio continua in footnote 10.
A "god of the gaps" has been rephrased in many ways. This concept is also quite popular since, on the one hand, the obvious regularities of experience and life express correlations or laws which appear evident: the daily cycle of the sun, the yearly cycle of the seasons, life, death; apples and other stuff falling down and not up, and so on. So denial of regularities appears futile. On the other hand, humans experience fate and uncontrollable circumstances quite often. In a similar reaction, the primitive mind (re)interpreted such "evidence" as god's signal.
As more and more "fateful" behaviors became "understood" and even controllable 15, it is not unreasonable to speculate that, maybe, eventually, there will be no such gaps left, in which case one recovers the bowler, ex nihilo, scenario. Alternatively, some "pure" gaps in the causal fabric of our universe might "turn out", that is, relative to the assumptions and means employed, to be irreducible and final: those gaps cannot be eliminated and might remain forever. In secular terms, this could be suspected to signify irreducible indeterminism or randomness [53]. But there exist other, possibly transcendental, interpretations involving intentionality across gaps.
That these latter scenarios are not purely speculative can be demonstrated by an interactive gaming scenario: if one is considering an interactive virtual reality environment [183,184], one usually assumes that the virtual reality is "sustained" or "supported" by a computational process "running" on some kind of computer whose physical characteristics are not directly related 16 to the simulacrum. To be interactive, the two universes need to be intertwined and connected by some sort of (bidirectional) gap through which information flows in both "directions" 17. For an intrinsic [185] observer embedded [156] in the virtual environment and bound by its operational capacities, the capacity to send an arbitrary signal through the interface, from the simulating universe (aka "the beyond") to the simulacrum, can only be realized by a gap: without a gap, the signal must remain immanent; that is, it reduces to either lawful or chaotic behavior.
Gaps potentially allow some "transcendental" exchange of signals but do not necessarily imply such a conversation.

15 Think of medical treatments, and also of volcanic eruptions, floods, or weather phenomena such as lightning and thunder.

16 To be feasible and nonmonotonic, it can be assumed without loss of generality that both the universe in which the simulation is implemented and the simulated universe are capable of universal computation in the sense of Church-Turing.

17 This could result in a sort of dialogue between those realms, leading to a "backflow" from the simulacrum to the universe in which the simulation takes place, such that the simulacrum performs "empirical studies" on the latter, thereby fully and actively participating in it. In this very speculative scenario, "transcendence becomes immanence." Think of an evolving artificial intelligence in a computer simulation becoming aware of its situation and asking online players questions about its situation and the general situation. However symmetric an exchange through the interface may appear, it is asymmetric in one aspect: whereas the simulacrum cannot exist without the world in which the simulation takes place, the latter can exist without the former.
How does physics support gaps? Or can physics rule them out? The following is an update and extension of Frank's discussion on physical gaps.
• As has been mentioned earlier, in the classical domain of ordinary differential equations a breach of the Lipschitz continuity condition may spoil the uniqueness of solutions.

• As has also been discussed earlier, quantum complementarity and, as an extension thereof, quantum contextuality (aka value indefiniteness) can be interpreted as the impossibility to co-represent [22,86,187] certain (even finite) sets of quantum observables, which are necessarily counterfactual because they are complementary, relative to the assumptions 18. This is problematic, as the corresponding experimental protocols ("prepare a pure state and measure a different one") seem to suggest that they "reveal" some pre-existing property, indicated by the (non)occurrence of a detector click. Alas, this could be misleading, as the respective click might either be subject to debate and interpretation 19, or merely signify the capacity of the measurement apparatus to "translate an improper question," thereby introducing stochastic noise [47]. This appears to be related to notorious inconsistencies in quantum physics proper [23,24,28,31,188] due to the assumption of irreversible quantum measurements.
• Aspects of certain individual, single events in quantized systems such as the time of emission or absorption of single quanta of light, are postulated to be indeterministic.

III. THE (UN)KNOWN (UN)KNOWNS
The relativity of the considerations on the respective assumptions and means invested or taken for granted results in an echo-chamber of sorts: whatever one puts in, one gets out. As mentioned earlier, there is no "firm (meta)physical ground," no undisputable "Archimedean ontological anchor," upon which such speculations can be based. And the tendency of the mind to rationalize, project [189][190][191], and empathically embrace opinions that are favorable to one's ego-investments increases delusions about particular beliefs, and corroborations thereof, even further. At this point, the Reader might get frustrated: a negative message (akin to a negative theology) has been delivered 20. Alas, this is all that can be safely stated.

18 One assumption entering those proofs is the (context) independence of outcomes of measurements for "intertwined" observables occurring in more than one context. For reasons of being able to intertwine contexts formalized by orthonormal bases, this can only happen in vector spaces of dimension higher than two.

19 A debate [139,140] on the alleged "a posteriori teleportation" is an example of such a nonunique semantic perception of syntactically undisputed detector clicks.
Therefore we should accept the sober fact that there is certainty only in our uncertainty. This has been expressed by many insightful individuals of many philosophical traditions and religions, and at various times. Aurelius Augustinus, for instance, writes [

Quantum randomness appears epistemic: identical pre- and postselected states and observables yield definite outcomes, because the respective vector or projection operator shares are identical. If there is a mismatch between preparation and measurement, then the measurement apparatus, as part of the environment, "contributes" to the respective outcomes. Therefore, randomness extracted from coherent superpositions or linear combinations of the quantum state is based on the complexity of the environment rather than on any intrinsic, ontological "oracle" character of the state. "Objectification," the emergence of a property which is not encoded in the original state, is associated with this influx of information from the environment.
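The asymmetry summarized above, that aligned measurements are value definite while "detuned" ones are stochastic, can be sketched with the Born rule for the two-dimensional state |ψ⟩ = ψ₀|0⟩ + ψ₁|1⟩ from the introduction; the real amplitudes 0.6 and 0.8 below are arbitrary illustrative choices for a normalized state.

```python
# Born-rule sketch for |psi> = psi_0|0> + psi_1|1> with real amplitudes.
# Measuring the aligned projector |psi><psi| yields outcome 1 with
# certainty; the "detuned" projector |0><0| yields 1 only with
# probability |psi_0|^2, so the outcome is stochastic.

psi = (0.6, 0.8)   # arbitrary example state: 0.36 + 0.64 = 1 (normalized)

def prob_aligned(state):
    """<psi| (|psi><psi|) |psi> = |<psi|psi>|^2 (= 1 if normalized)."""
    ip = state[0] * state[0] + state[1] * state[1]
    return ip * ip

def prob_basis0(state):
    """<psi| (|0><0|) |psi> = |psi_0|^2."""
    return state[0] * state[0]

print(prob_aligned(psi))  # 1 up to rounding: no randomness at all
print(prob_basis0(psi))   # about 0.36: the "detuned" click is stochastic
```

On the view argued here, the 0.36/0.64 statistics of the detuned measurement reflect the contribution of the environment rather than a pre-existing property of the state.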
This readily translates into the entanglement context, where I have argued that, as quantum shares are inseparable vectors, entangled states, by their relational encoding, cannot render individual value definiteness of their constituents, which would be necessary for A-to-B communication. This relates to the concept of emergent space-time from entanglement.
In the second part of the paper, a wealth of historic resources on random physical outcomes has been reviewed. The emphasis has been on the "evangelical" side of the perception of value indefiniteness, as it has emerged historically.