1. Introduction
Black holes, enigmatic cosmic entities, encode their entropy in ways that challenge the understanding of spacetime and quantum information. This puzzle was first illuminated by Bekenstein’s 1973 proposal [
1]. The Bekenstein–Hawking formula ties this entropy to the horizon area, a connection deepened by Hawking’s 1975 discovery of thermal radiation [
2]. Yet, this radiation introduced the information loss paradox, raising questions about quantum information’s fate [
3,
4,
5,
6]. Despite significant progress, a fundamental question persists: what is the statistical origin of black hole entropy, and how does it arise from quantum degrees of freedom near the horizon? Natural units ($\hbar = c = G = k_B = 1$) are adopted for simplicity, setting the Planck length $\ell_P = 1$.
Recent observational breakthroughs, such as the Event Horizon Telescope’s 2019 black hole image [
7] and LIGO–Virgo’s gravitational wave detections [
8], confirm black hole theory. Analogue experiments in Bose–Einstein condensates [
9] and optical systems [
10] replicate Hawking radiation, providing empirical support. Theoretical advances, including the holographic principle [
11,
12,
13] and unitary models [
14,
15], suggest that information survives. However, the microscopic basis of entropy remains unresolved [
16,
17]. Entanglement calculations reproduce the area law but lack a mechanism to encode modes into a statistical ensemble, leaving the physical basis of entropy unclear. Similarly, holographic models rely on quantum correlations yet fail to detail how degrees of freedom produce the thermal state observed externally [
18]. These gaps persist despite earlier classical approaches, e.g., the relativistic-plasma H-theorem around black holes [
19], which assigns a Boltzmann–Shannon entropy to accreted matter but leaves the horizon microstructure unspecified. Our measurement-driven model fills that microphysical gap by linking each $\ell_P^2$ cell to a specific quantum mode.
We present a semi-classical framework in which the horizon is a classical measurement apparatus. Planck-scale patches actively measure quantum field modes, and the resulting bookkeeping recovers the Bekenstein–Hawking area law without relying on horizon entanglement. We stress that we do not claim a new derivation of $S_{\mathrm{BH}} = A/4$; rather, we propose and analyse a concrete non-gravitational model that reproduces the standard thermodynamic result. This perspective not only elucidates the statistical origin of entropy but also provides a concrete mechanism for how quantum information is encoded and potentially lost at the horizon, offering fresh insights into the information paradox. This paper is structured as follows:
Section 2 presents the Unruh state and model,
Section 3 fixes the Hawking temperature using the standard Christensen–Fulling argument to ensure self-consistency,
Section 4 recovers the entropy and its implications,
Section 5 details quantum simulations,
Section 6 outlines experimental predictions for near-term analogue systems,
Section 7 refines the frequency spectrum and analyses how patch-induced entanglement suppression drives a quantum-to-classical transition with a concrete decoherence timescale at the horizon, and
Section 8 discusses significance and future directions. Beyond reproducing the area law, the model also outlines measurable signatures in analogue experiments.
2. Unruh State and Horizon as a Classical Apparatus
Understanding black hole entropy requires a mechanism to bridge quantum field dynamics and classical horizon properties. This model proposes that the horizon itself acts as such a bridge, functioning as a classical apparatus. This classical apparatus can be likened to a macroscopic measuring device in quantum mechanics, which collapses a quantum state upon observation, here transforming the entangled Unruh state into a thermal ensemble observable from the exterior.
The Schwarzschild black hole, characterised by mass M, has an event horizon at radius $r_s = 2M$ (in natural units), with horizon area $A = 4\pi r_s^2 = 16\pi M^2$. The quantum fields near the horizon are described by the Unruh state—the vacuum state for an accelerated observer, which appears thermal due to the acceleration, mirroring the Hawking radiation spectrum [
20] (see also [
21,
22,
23]). For a mode labelled by k (e.g., frequency $\omega_k$), the Unruh state is
$$|U_k\rangle = \sqrt{1 - e^{-\beta\omega_k}}\,\sum_{n=0}^{\infty} e^{-\beta\omega_k n/2}\,|n\rangle_{\mathrm{int}}\,|n\rangle_{\mathrm{ext}},$$
where $|n\rangle_{\mathrm{int}}$ and $|n\rangle_{\mathrm{ext}}$ are Fock states with n particles in the interior and exterior modes, respectively. The normalisation factor ensures $\langle U_k|U_k\rangle = 1$. The exterior modes follow a thermal distribution with inverse temperature $\beta$.
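The thermal bookkeeping for a single mode can be checked numerically. The following minimal sketch (ours, not part of the original model specification) builds the geometric distribution $p_n = (1 - e^{-\beta\omega})\,e^{-n\beta\omega}$ of the exterior mode and verifies its normalisation and Bose–Einstein mean occupation:

```python
import math

def thermal_probs(x, nmax=400):
    """Geometric probabilities p_n = (1 - e^-x) e^{-n x}, with x = beta*omega."""
    w = math.exp(-x)
    return [(1.0 - w) * w**n for n in range(nmax + 1)]

x = 2.77                  # illustrative beta*omega (the value singled out in Section 4)
p = thermal_probs(x)
mean_n = sum(n * pn for n, pn in enumerate(p))

print(sum(p))                         # ~1: the distribution is normalised
print(mean_n, 1.0 / math.expm1(x))    # both ~0.067: occupation 1/(e^x - 1)
```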
Section 3 recovers $T_H = 1/(8\pi M)$ self-consistently from the regularity of the renormalised stress tensor; no temperature value is assumed at this stage.
A core innovation of this model is the assumption that the horizon acts as a classical apparatus, departing from previous semi-classical approaches that treat the horizon as a quantum boundary, such as Hawking’s original derivation of radiation [
2] or entanglement entropy calculations [
24], which do not model the horizon as an active measurement device. The horizon is discretised into Planck-scale degrees of freedom, with their number given by $N = A/\ell_P^2 = A$. Each degree of freedom, termed a ‘patch’—a Planck-scale region of area $\ell_P^2$ acting as a classical detector—measures an interior mode of the Unruh state. This classical treatment aligns with Bekenstein’s proposal of the horizon encoding classical bits of information per Planck area [
25,
26] and ’t Hooft’s holographic principle [
11,
27]. The measurements of each patch are assumed to be independent, decohering the entangled Unruh state into a thermal ensemble. Planck-area cells are singled out because the local scrambling time (as argued in [28]) is minimal at that scale, so smaller detectors cannot act faster, while larger ones merely coarse-grain several independent channels. This simplification is reasonable in the semi-classical regime, where classical statistics dominate over quantum coherence across macroscopic horizon scales, as supported by the membrane paradigm’s classical treatment of horizon dynamics [
29]. Guided by von Neumann’s measurement postulate [
30] and the stretched-horizon Brown–York fluid [
31], we model each $\ell_P^2$ cell as a dissipative channel that couples to a single near-horizon mode at a fixed rate. The outgoing partner of a virtual pair thus acts as a pointer state, and the infalling partner imprints a classical record on the patch. While a fully quantum horizon might exhibit fluctuations or entanglement across patches, this semi-classical treatment captures the leading-order physics of entropy generation, serving as a foundation for future quantum gravity extensions. The exterior mode is identified with the patch state, so the Unruh state becomes
$$|U_k\rangle = \sqrt{1 - e^{-\beta\omega_k}}\,\sum_{n=0}^{\infty} e^{-\beta\omega_k n/2}\,|n\rangle_{\mathrm{int}}\,|n\rangle_{\mathrm{patch}}. \quad (1)$$
The ‘one mode per patch’ condition ensures that each patch measures a single mode independently, a critical feature for recovering a classical statistical ensemble within the holographic framework. This one-to-one correspondence between patches, modes, and degrees of freedom simplifies the entropy calculation, treating each patch as an independent classical detector, in contrast to a fully quantum horizon where entanglement across patches would complicate the scaling. The condition posits that each Planck-scale patch couples to a single quantum field mode, with the dominant mode selected near $\beta\omega \approx 2.77$, as recovered in Section 4. While quantum fields have a continuum of modes, this model assumes a discretisation where the number of modes matches the number of patches ($N_{\mathrm{modes}} = N$), conceptually consistent with the holographic principle’s area-scaling degrees of freedom. This discretisation is akin to coarse-graining in lattice theories, though a precise mode quantisation rule (e.g., a frequency cutoff) remains to be fully specified in future work.
To test the independence assumption underlying our model, we introduce a nearest-neighbour XX interaction term into the Hamiltonian, representing a weak coupling between adjacent patches on the horizon:
$$H_{XX} = g \sum_{\langle i,j \rangle} \left( \sigma^x_i \sigma^x_j + \sigma^y_i \sigma^y_j \right).$$
Applying perturbation theory to second order, $O(g^2)$, we find that the von Neumann entropy per patch increases by approximately 3% for the weak coupling considered, evaluated for a Planck-mass black hole. This modest enhancement contributes to the average entropy of $\approx 0.257$ nat obtained in Section 7, indicating that weak short-range correlations can partially account for the small excess observed in the entropy calculations. A detailed derivation is provided in Appendix A.
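The qualitative effect can also be seen in an exact two-patch toy computation rather than perturbation theory. The sketch below is our illustration, not the Appendix A calculation: the baseline Hamiltonian, coupling value, and temperature are assumptions chosen to sit at the Section 4 operating point $\beta\omega \approx 2.77$. It thermalises two XX-coupled qubits and reports the fractional change in single-patch entropy:

```python
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
n  = np.diag([0.0, 1.0]).astype(complex)      # number operator for a qubit
I2 = np.eye(2, dtype=complex)

def patch_entropy(beta, omega, g):
    """Single-site entropy (nat) of the thermal state of two XX-coupled qubits."""
    H = omega * (np.kron(n, I2) + np.kron(I2, n)) \
        + g * (np.kron(sx, sx) + np.kron(sy, sy))
    w, v = np.linalg.eigh(H)
    p = np.exp(-beta * (w - w.min()))
    p /= p.sum()
    rho = (v * p) @ v.conj().T                                  # Gibbs state
    rho1 = np.trace(rho.reshape(2, 2, 2, 2), axis1=1, axis2=3)  # trace out qubit 2
    lam = np.linalg.eigvalsh(rho1).real
    lam = lam[lam > 1e-15]
    return float(-(lam * np.log(lam)).sum())

beta, omega = 1.0, 2.77                  # beta*omega fixed at the Section 4 value
s0 = patch_entropy(beta, omega, 0.0)
sg = patch_entropy(beta, omega, 0.15)    # illustrative weak coupling
print(s0)                                # ~0.224 nat for the uncoupled patch
print((sg - s0) / s0)                    # few-per-cent entropy increase
```

The coupling mixes the single-excitation states, raising the excitation probability of each site and hence its entropy, in line with the perturbative trend quoted above.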
3. Fixing the Hawking Temperature (Standard Result; Avoiding Circularity)
For completeness and to avoid circularity (with no novelty claimed here), we recall the standard Christensen–Fulling argument that fixes the Hawking temperature from the pole structure of the renormalised stress tensor, relying on results from [
32] (see also [
33,
34]) while reproducing only the coordinate changes and regularity argument. This inclusion ensures that our subsequent derivations remain self-consistent without presupposing the temperature value.
In the work by Christensen and Fulling [
32], the renormalised stress–energy tensor for a massless scalar field in 1+1-dimensional spacetime, evaluated in an orthonormal basis of unit vectors along the time and radial directions, takes the thermal form
$$T_{\hat\mu\hat\nu} = \frac{\pi T^2}{6}\,\mathrm{diag}(1,\,1).$$
Only the energy density $\rho = \pi T^2/6$ will enter our argument, as it is the quantity that captures the energetic behaviour in the critical near-horizon region.
Let us consider Schwarzschild coordinates $(t, r)$ with the metric $ds^2 = -f(r)\,dt^2 + f(r)^{-1}\,dr^2$, $f(r) = 1 - 2M/r$, which are the standard coordinates for describing the geometry outside a non-rotating black hole. The orthonormal time vector now reads $e_{\hat t} = f(r)^{-1/2}\,\partial_t$, so that a mixed component transforms as
$$T_{tt} = f(r)\,T_{\hat t\hat t} = f(r)\,\frac{\pi T^2}{6}. \quad (2)$$
The $\pi/6$ prefactor is fixed by the conformal anomaly, a quantum effect arising from the trace of the stress tensor in curved spacetime; the single power of $f(r)$ is kinematic and therefore model-independent, stemming purely from the coordinate transformation.
We now switch to Kruskal coordinates $U = -\kappa^{-1} e^{-\kappa u}$, $V = \kappa^{-1} e^{\kappa v}$, with surface gravity $\kappa = 1/4M$ and light-cone variables $u = t - r_*$, $v = t + r_*$, which are particularly useful for analysing behaviour across the horizon without coordinate singularities. Near the horizon, where $U \to 0$, the Schwarzschild component (2) gives
$$T_{UU} = \frac{1}{(\kappa U)^2}\,T_{uu} \sim \frac{1}{U^2}\left(\frac{\pi T^2}{12} - \frac{\kappa^2}{48\pi}\right).$$
Requiring the renormalised tensor to stay finite as $U \to 0$ forces the residue of the pole to vanish, which uniquely sets
$$T = \frac{\kappa}{2\pi} = \frac{1}{8\pi M},$$
ensuring the physical consistency of quantum field theory in this curved background.
In 3+1 dimensions the same cancellation must occur inside each partial wave, ensuring regularity across the full spacetime and avoiding unphysical divergences. Extra centrifugal potentials and grey-body factors modify only the finite part of the stress tensor; the $1/U^2$ pole is untouched. Equivalently, ref. [32] shows that the $1/U^2$ term persists after angular modes are summed, maintaining the divergence behaviour characteristic of the near-horizon region. Therefore an observer at infinity ($r \to \infty$) measures
$$T_H = \frac{1}{8\pi M}.$$
All subsequent entropy and information-balance formulas in this paper employ this $T_H$ (equivalently $\beta = 8\pi M$), providing a consistent foundation for the thermodynamic descriptions that follow.
4. Entropy Accounting in the Horizon-as-Apparatus Model
The statistical entropy of the black hole is recovered using the classical apparatus model. The entropy per mode is the von Neumann entropy of the reduced density matrix for the patch, obtained by tracing over the interior modes of the Unruh state (1):
$$\rho_{\mathrm{patch}} = \sum_{n=0}^{\infty} p_n\,|n\rangle\langle n|,$$
where $p_n = (1 - e^{-\beta\omega})\,e^{-n\beta\omega}$, which sum to 1. This density matrix is diagonal due to the structure of the Unruh state.
The entanglement entropy per mode is given by $s = -\sum_n p_n \ln p_n$, where the logarithm of the probabilities is $\ln p_n = \ln(1 - e^{-\beta\omega}) - n\beta\omega$. Summing over n yields $s = -\ln(1 - e^{-\beta\omega}) + \beta\omega\,\langle n \rangle$. Given that $\langle n \rangle = \sum_n n\,p_n$ and $\langle n \rangle = 1/(e^{\beta\omega} - 1)$, this simplifies to
$$s = \frac{\beta\omega}{e^{\beta\omega} - 1} - \ln\left(1 - e^{-\beta\omega}\right).$$
To match the total entropy, the condition is $N\bar{s} = S_{\mathrm{BH}}$, where $N = A/\ell_P^2 = A$. Requiring $S_{\mathrm{BH}} = A/4$, the condition becomes $\bar{s} = 1/4$ nat, with entropies measured in nat. Assuming uniformity ($\omega_k = \omega$ for all modes), the equation to solve is $s(\beta\omega) = 1/4$. Define $x = \beta\omega$, so the condition is $s(x) = 1/4$, where
$$s(x) = \frac{x}{e^x - 1} - \ln\left(1 - e^{-x}\right).$$
As a first illustration we treat the dominant mode at $x = x_*$; Section 7 lifts this simplification and integrates over the full Planck spectrum. In reality, black hole modes follow a continuous frequency spectrum, which is addressed in Section 7, where the average entropy per mode, $\langle s \rangle$, is computed over the Hawking distribution. Numerical solution yields $x_* \approx 2.77$, close to the Hawking radiation spectrum’s peak at $x \approx 2.82$ (maximising the energy flux $\propto x^3/(e^x - 1)$). This proximity indicates that modes dominating the thermal emission also primarily contribute to the entropy, linking the statistical recovery to observable radiation properties (see Figure 1). Setting $x = x_*$ ensures $s = 1/4$ nat, so the total entropy $S = N\bar{s} = A/4$ matches the Bekenstein–Hawking formula. Alternative values, e.g., $x = 2$, give $s \approx 0.458$ nat, misaligning the total entropy unless $N$ is adjusted, which contradicts the holographic principle’s fixed $N = A$.
Four key insights emerge from this framework: (i) microstate count—by the typical-subspace argument, the number of high-probability microstates is $\Omega \approx e^{N\bar{s}}$, giving $S = \ln\Omega = N\bar{s} = A/4$ and a clear statistical-mechanical interpretation; (ii) area scaling arises naturally because the number of patches, the fundamental units of information storage, scales with the horizon area A, making the entropy depend on surface rather than volume; (iii) the factor of $1/4$ follows from the thermal nature of the Unruh distribution, which fixes $\bar{s} = 1/4$ nat at the dominant frequency; and (iv) the measurement process respects the generalised second law (as demonstrated in Appendix B), maintaining thermodynamic consistency even under quantum considerations. These points collectively show how the model bridges quantum and classical aspects while reproducing established results.
For a horizon divided into $N$ patches (often conceptualised as pixels in the holographic sense), each with an approximately identical reduced density matrix $\rho_1$, the global state is approximated as $\rho \approx \rho_1^{\otimes N}$ to leading order, assuming negligible long-range correlations in the semi-classical regime. By the asymptotic equipartition property—a cornerstone of information theory that partitions the Hilbert space into typical and atypical subspaces—almost all probability mass resides in a typical subspace $\mathcal{T}$ with dimension bounded by $e^{N(S_1 + \epsilon)}$ for arbitrarily small $\epsilon > 0$. Here, $S_1 = -\mathrm{Tr}(\rho_1 \ln \rho_1)$ is the von Neumann entropy of a single patch, quantifying the uncertainty or information content per unit. Thus, the effective microstate count is $\Omega \approx e^{N S_1}$ and the total entropy of the horizon is
$$S = \ln \Omega = N S_1 = \frac{A}{4}.$$
Weak nearest-neighbour correlations (see Appendix A) modify $S_1$ at order $g^2$ for weak coupling, preserving the leading-order scaling without significantly altering the area-law form. This provides the explicit microstate count $\Omega$, within the assumptions of our model, offering a precise statistical underpinning that aligns with thermodynamic expectations while emphasising the role of the horizon patches as information carriers.
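The additivity used in this count can be checked directly for the product state. A small numpy sketch (ours, using the truncated two-level patch spectrum from Section 5 as an illustrative $\rho_1$):

```python
import numpy as np

# Truncated two-level patch spectrum (Section 5 values); S1 is the per-patch entropy.
p1 = np.array([0.941, 0.059])
S1 = float(-(p1 * np.log(p1)).sum())

# For a product state rho1^(tensor N) the spectrum is the N-fold product
# distribution, so the von Neumann entropy is exactly additive: S_N = N * S1.
N = 4
pN = p1.copy()
for _ in range(N - 1):
    pN = np.outer(pN, p1).ravel()
SN = float(-(pN * np.log(pN)).sum())

print(S1)        # ~0.224 nat per patch
print(SN / N)    # equals S1 up to floating-point error
```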
5. Quantum Computer Simulation of Black Hole Entropy
A quantum computer simulation is proposed to model the Unruh state for multiple modes, simulate the measurement process at the horizon across several patches, and compute the total entanglement entropy as a sum of independent contributions. This simulation leverages quantum circuits to represent entangled states, applies projective measurements, and calculates the cumulative entropy to compare with theoretical expectations. Quantum computers excel by naturally encoding entangled states and simulating measurement statistics, offering a direct probe of quantum effects inaccessible to classical methods [
35,
36,
37].
A simplified model of the Unruh state is considered for each mode k, approximated as a qubit system for computational feasibility. Using the value $x_* \approx 2.77$ recovered in Section 4, the probabilities are $p_0 = 1 - e^{-x_*} \approx 0.937$ and $p_1 = p_0\,e^{-x_*} \approx 0.059$, with higher terms ($n \geq 2$) contributing negligibly (e.g., $p_2 \approx 0.004$). Truncating at $n \leq 1$ excludes terms contributing less than 0.4% to the probability. The state for each mode is approximated by truncating to $|0\rangle$ and $|1\rangle$ and normalising the probabilities to 1, $p_0' \approx 0.941$, $p_1' \approx 0.059$, yielding $|\psi\rangle \approx \sqrt{0.941}\,|00\rangle + \sqrt{0.059}\,|11\rangle$.
To illustrate the statistical nature of the entropy, four patches are simulated, each measuring a distinct mode. A quantum circuit is constructed to prepare the state for each mode using a pair of qubits. Each of the four modes is simulated independently using a pair of qubits (one interior and one patch), totalling eight qubits, with no shared entanglement across pairs to reflect the independence assumption of this model. For each mode, a single qubit is initialised in the state $\sqrt{0.941}\,|0\rangle + \sqrt{0.059}\,|1\rangle$, achieved by applying a rotation gate $R_y(\theta)$ to $|0\rangle$, where $\theta = 2\arccos(\sqrt{0.941}) \approx 0.49$ radians. A CNOT gate is then applied with the first qubit (interior) as the control and the second (patch) as the target, entangling the qubits to form $\sqrt{0.941}\,|00\rangle + \sqrt{0.059}\,|11\rangle$. This process is repeated independently for each of the four modes.
Projective measurements are performed on each interior qubit in the computational basis, mimicking the measurement process at the horizon. For each mode, the interior qubit is measured, and the patch qubit’s reduced density matrix is computed from the measurement statistics over 100 runs, yielding $\rho_{\mathrm{patch}} \approx \mathrm{diag}(0.941,\,0.059)$. The entanglement entropy for each patch is $s = -p_0' \ln p_0' - p_1' \ln p_1' \approx 0.2243$ nat. The qubit approximation limits the Hilbert space to two states, reducing the entanglement entropy compared to a bosonic mode’s infinite-dimensional Hilbert space, where $s = 0.25$ nat at $x_* \approx 2.77$. This simplification reduces the entanglement entropy from the theoretical 0.25 nat to 0.2243 nat, as higher Fock states ($n \geq 2$), though small (e.g., $p_2 \approx 0.004$), contribute additional entanglement in the full model. Using qutrits to include $n = 2$ could narrow this 10.3% gap, a feasible extension with future quantum hardware.
To demonstrate the feasibility of this quantum simulation proposal, we performed a classical simulation using probabilistic sampling to emulate the measurement outcomes over 100 runs per patch (equivalent to a noiseless quantum simulator). The simulation yields entropies per patch of approximately [0.227, 0.254, 0.227, 0.254] nat (with statistical variation due to finite sampling), an average of 0.240 nat per patch, and a total cumulative entropy of 0.961 nat for four patches. These results are consistent with the theoretical expectation of 1.0 nat, accounting for sampling noise and the qubit approximation; the slight deficit aligns with the truncation discussed above. With 100 runs, the standard error in probabilities (e.g.,
$\sqrt{p(1-p)/100} \approx 0.024$ for $p_1' \approx 0.059$) yields an entropy uncertainty of approximately ±0.065 nat per patch, aggregating to ±0.13 nat for four patches.
Figure 2 adopts a ±5% uncertainty (±0.048 nat for the average) as an illustrative bound, underrepresenting the full statistical error for clarity. The measurement outcomes for each patch are expected to yield approximately 94 $|0\rangle$ outcomes and 6 $|1\rangle$ outcomes per 100 runs, confirming the theoretical probabilities. The cumulative entropy is plotted as a function of the number of patches in
Figure 2.
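For readers wishing to reproduce the classical emulation described above, a minimal sketch follows (ours, in plain Python rather than a quantum SDK; the seed and run count are arbitrary choices). It samples the interior-qubit measurement with the Born probabilities of the prepared state and accumulates per-patch entropy estimates:

```python
import math
import random

random.seed(7)

def simulate_patch(p0_prime, runs=100):
    """Emulate one patch: prepare sqrt(p0')|00> + sqrt(p1')|11> via R_y + CNOT,
    sample the interior-qubit measurement, and estimate the patch entropy."""
    theta = 2 * math.acos(math.sqrt(p0_prime))   # R_y(theta) angle, ~0.49 rad here
    prob0 = math.cos(theta / 2) ** 2             # Born probability of outcome |0>
    counts = [0, 0]
    for _ in range(runs):
        counts[0 if random.random() < prob0 else 1] += 1
    freqs = [c / runs for c in counts if c > 0]
    return -sum(f * math.log(f) for f in freqs)  # entropy estimate in nat

entropies = [simulate_patch(0.941) for _ in range(4)]
print([round(e, 3) for e in entropies])
print(round(sum(entropies), 3))   # cumulative entropy, near 4 x 0.2243 nat
```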
The simulation demonstrates the potential of quantum computing to probe semi-classical aspects of black hole physics, opening avenues for studying entanglement and measurement effects in controlled quantum systems. Patch independence allows each mode’s entangled state to be simulated on just two qubits, with results combined classically. This efficiency—reducing the demand from eight qubits to two per run—enables scalable simulations of larger horizon areas on near-term devices, supporting tests of entropy scaling with patch number. The circuit’s simplicity—a rotation, CNOT, and measurement—makes it executable on noisy intermediate-scale quantum (NISQ) devices like IBM’s superconducting qubits, though scaling to more patches requires mitigating decoherence and gate errors through basic error mitigation strategies available on current platforms [
38,
39]. While independence is assumed for simplicity, quantum correlations between patches could increase the entropy slightly; future simulations might explore such effects to test the robustness of this semi-classical approximation. The simulation’s linear entropy scaling (0.240 nat per patch) mirrors the Bekenstein–Hawking area law, supporting the hypothesis that entropy arises from independent horizon patches. Despite the qubit model’s deficit relative to the full bosonic model (0.25 nat), this approach captures the statistical essence of this model, aligning with the horizon-as-apparatus framework.
6. Experimental Predictions
This semi-classical framework posits that black hole entropy arises from Planck-scale horizon patches measuring quantum field modes, offering testable predictions for near-term analogue experiments and long-term astrophysical observations.
Bose–Einstein condensates (BECs) provide a controlled platform to test the ‘one mode per patch’ condition and horizon-as-apparatus assumption, where a sonic horizon—a boundary where flow velocity exceeds the local speed of sound [
9]—is created by accelerating the condensate flow using a potential barrier or laser-induced flow, generating entangled phonon modes akin to the Unruh state [
10]. Projective measurements on ‘interior’ modes, implemented via laser transitions [
40], simulate patch measurements, and the ‘exterior’ mode’s entanglement entropy
is computed from phonon correlation measurements across the horizon.
A key testable prediction involves plotting the entropy per mode $s$ versus $\beta\omega$, where $\beta$ (inverse effective temperature) is tuned by the flow velocity (0.3–2 mm/s) and $\omega$ by the phonon frequency (20–200 Hz) [9], with the model predicting $s \approx 0.25$ nat at $\beta\omega \approx 2.77$ within the achievable range of 2–4 (Figure 3). Another test varies the sonic horizon length via trap geometry, where the total entropy should scale linearly with this effective ‘area’, mirroring the Bekenstein–Hawking area law; in quasi-one-dimensional BEC experiments, the effective ‘area’ corresponds to the horizon length, analogous to higher-dimensional horizon areas [
41].
Challenges include isolating small entropy contributions (∼0.25 nat) and distinguishing quantum entanglement from classical correlations, requiring high-fidelity techniques such as quantum state tomography or noise reduction via signal averaging. If residual inter-patch entanglement exists, Bogoliubov theory suggests a possible 5–10% upward shift in $s$ (e.g., 0.2625–0.275 nat), though this effect has not yet been quantified.
Complementary platforms broaden the model’s testability, such as photonic lattices or nonlinear media simulating Hawking radiation [
10], where measuring photon correlations across an optical ‘horizon’ tests
$s(\beta\omega)$ with scalability, and superconducting qubit arrays model entangled states across a simulated horizon, offering precise control for probing small entropy contributions.
Future gamma-ray observations of primordial black holes could potentially detect Planck-scale modulations in the Hawking spectrum—discrete features like step-like intensity changes from patch measurements—though current telescopes lack the resolution to observe such effects. The discrete-patch signature is presently unconstrained, with existing Fermi-LAT data showing no detectable line-like structure [
42], but future MeV instruments (e.g., AMEGO-X) could improve sensitivity and offer a long-term validation avenue [
43].
Unlike entanglement-based models predicting continuous entropy spectra, this framework suggests discrete
spectral features due to patch measurements. In contrast to unitary evaporation models [
14] preserving information in subtle correlations, this semi-classical approach implies measurable information loss, detectable via the absence of long-range correlations in analogue systems.
Validating these predictions would support the horizon-as-apparatus concept, advancing quantum gravity and potentially clarifying aspects of the information paradox—e.g., whether measurement-induced information loss holds or quantum correlations restore unitarity—while future quantum simulations or gravitational analogues like metamaterials could explore patch correlations and unitarity. These predictions leverage analogue systems for near-term validation, despite challenges in detection fidelity and noise, offering a bridge to quantum gravity insights.
7. Relaxation of the Uniform Frequency Assumption
Patch measurements also suppress phase coherence between neighbouring modes. Treating two adjacent $\ell_P^2$ cells as a weakly coupled open system (see Appendix A), we find that the off-diagonal elements of the reduced density matrix decay on the standard fast-scrambling timescale $t_{\mathrm{scr}} \sim (\beta/2\pi)\ln S$ (cf. the scrambling time of Ref. [44]). For a solar-mass black hole, $t_{\mathrm{scr}} \sim 10^{-3}$ s, which is far shorter than the ($\sim 10^{67}$ yr) evaporation time, so the exterior state becomes effectively Gibbsian well before any noticeable mass loss. The same dephasing should be visible in analogue BEC horizons as a rapid loss of off-diagonal phonon coherence, an observable proposed in Section 6.
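These orders of magnitude follow directly from restoring units; a brief numeric sketch (ours; constants rounded to standard SI values) for a solar-mass black hole:

```python
import math

# SI constants (rounded) and the solar mass.
hbar, c = 1.054571817e-34, 2.99792458e8
G, k_B  = 6.67430e-11, 1.380649e-23
M = 1.98892e30   # kg, one solar mass

T_H   = hbar * c**3 / (8 * math.pi * G * M * k_B)   # Hawking temperature (K)
beta  = hbar / (k_B * T_H)                          # thermal timescale (s)
S_BH  = 4 * math.pi * G * M**2 / (hbar * c)         # Bekenstein-Hawking entropy (nat)
t_scr = beta / (2 * math.pi) * math.log(S_BH)       # fast-scrambling estimate (s)

print(S_BH)    # ~1e77 nat
print(t_scr)   # a few milliseconds, vastly shorter than the evaporation time
```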
The initial model assumed a uniform frequency for all modes, with $\beta\omega = x_* \approx 2.77$ (Section 4), simplifying the Hawking radiation spectrum. In a realistic black hole scenario, field modes exhibit a continuous frequency spectrum $\omega \in (0, \infty)$ following a Planck distribution, with $\beta\omega$ varying across modes. To address this, the model is extended by allowing $\omega$ to vary while fixing $\beta = 8\pi M$, consistent with the black hole’s temperature.
The entanglement entropy per mode is $s(x) = \frac{x}{e^x - 1} - \ln(1 - e^{-x})$ with $x = \beta\omega$, which depends on $\omega$, and the total entropy is the sum over modes. In the continuum limit, this becomes an integral over the frequency spectrum, weighted by the energy flux $\propto x^3/(e^x - 1)$. This weighting reflects the energy contribution of each mode as recorded by the horizon patches, consistent with their role as classical detectors of Hawking radiation in our measurement-based framework. With $x = \beta\omega$, the flux-weighted average entropy per mode is
$$\langle s \rangle = \frac{\displaystyle\int_0^\infty \frac{x^3}{e^x - 1}\, s(x)\, dx}{\displaystyle\int_0^\infty \frac{x^3}{e^x - 1}\, dx}.$$
Analytical evaluation via series expansions gives
$$\langle s \rangle = 4 - \frac{2\zeta(5) + \zeta(2)\zeta(3)}{\zeta(4)} \approx 0.257 \text{ nat},$$
where $\zeta$ is the Riemann zeta function (see Appendix C for details). Nat is used throughout this paper, consistent with the use of natural logarithms. This value, incorporating the full spectrum, is within 3.0% of the holographic target of $1/4$ nat, supporting the model’s validity. The small deviation may stem from inter-patch quantum correlations neglected in the semi-classical approximation, which assumes independent mode measurements. This difference is minor compared to typical corrections in string theory models [
45].
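The spectrum-averaged value can be cross-checked by direct quadrature. This sketch (ours) uses a plain Riemann sum over the flux-weighted spectrum:

```python
import math

def s(x):
    """Entropy per mode (nat): x/(e^x - 1) - ln(1 - e^-x)."""
    return x / math.expm1(x) - math.log1p(-math.exp(-x))

def flux(x):
    """Planck energy-flux weight x^3/(e^x - 1)."""
    return x**3 / math.expm1(x)

# Simple Riemann sum on (0, 50]; the integrand vanishes at both ends.
dx = 1e-3
num = den = 0.0
for i in range(1, 50001):
    x = dx * i
    w = flux(x)
    num += w * s(x) * dx
    den += w * dx

print(den, math.pi**4 / 15)  # denominator matches Gamma(4) zeta(4) = pi^4 / 15
print(num / den)             # ~0.257 nat, cf. the zeta-function closed form
```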
Figure 4 shows the integrand’s behaviour, peaking near $x \approx 1.4$, with significant contributions from modes up to and beyond the Hawking spectrum peak at $x \approx 2.82$.
The 3.0% discrepancy suggests an effective mode count adjustment: $N_{\mathrm{eff}} = (A/4)/\langle s \rangle \approx 0.97\,N$, where $\langle s \rangle \approx 0.257$ nat. This hints that horizon patches may not be fully independent, as weak correlations could reduce the effective degrees of freedom. Future work could refine this by modelling such effects. Relaxing the uniform frequency assumption aligns the model with the continuous spectrum of astrophysical black holes, enhancing its relevance.
8. Discussion
In this work, we have presented a horizon-as-apparatus model and demonstrated its consistency with black hole thermodynamics. Our original contributions include a concrete measurement mechanism based on the ‘one mode per patch’ principle that statistically reproduces $S_{\mathrm{BH}} = A/4$, an explicit microstate count derived using the typical-subspace argument, quantitative estimates of decoherence and scrambling times along with weak inter-patch correlations, and a quantum circuit blueprint complemented by proposals for analogue experiments to test the framework. We emphasise that we do not claim a new gravitational derivation of the area law; rather, our goal is to provide a physically motivated non-gravitational model whose information-theoretic bookkeeping aligns with established results.
While this semi-classical approach offers valuable insights, it is not without limitations. By assuming independent patch measurements, it overlooks potential quantum correlations between patches that could influence information transfer. This simplification may conflict with modern unitary evolution perspectives, such as those from the ER=EPR conjecture or the firewall hypothesis [
46,
47]. Future work could incorporate inter-patch correlations to bridge semi-classical and quantum gravity frameworks.
This measurement-driven approach carries several implications. Analogue experiments in Bose–Einstein condensates and optical systems can probe the predicted entropy scaling and test for measurement-induced information loss, providing near-term validation of the model. Insights into the quantum–classical interface at the horizon may inform models of horizon microstructure or spacetime quantisation. Additionally, the model offers a semi-classical perspective on whether information is lost or preserved, with testable differences from unitary models detectable in analogue systems. The interplay between classical measurements and quantum states could also inspire quantum information applications, such as error correction codes or computing protocols.
In summary, this work establishes a statistical foundation for black hole entropy via a measurement-based mechanism, opening pathways for experimental and theoretical advances in understanding black holes, quantum gravity, and the nature of information in the universe.