Hunting Quantum Gravity with Analogs: the case of graphene

Analogs of fundamental physical phenomena can be used in two ways. One way consists in reproducing specific aspects of classical or quantum gravity, of quantum fields in curved space, or of other high-energy scenarios, in corresponding lower-energy systems. The ``reverse way'' consists in building fundamental physical theories, for instance quantum gravity models, inspired by the lower-energy corresponding systems. Here we present the case of graphene and other Dirac materials.


Introduction
Richard Feynman wrote beautiful and visionary pages on analogs, in a famous lecture titled "Electrostatic Analogs", available in [1] (see also [2] for comments and discussions). There he explains how it happens that different physical systems, among which a solid analogy can be established, are all described in a unified manner. In his now famous words, this happens because "the same equations have the same solutions". Therefore, if we have no access to certain regimes of system A, but they correspond to certain reachable regimes of the analogous system B, we can perform experiments on system B, and establish results valid for system A.
In the final, and less known, part of the lecture, he ventures into a visionary attempt to explain why this is so. This goes on till the thrilling hypothesis of the existence of more elementary constituents than the ones we deem to be fundamental. All those systems, including electrostatics itself, would then just be different coarse-grained versions of one dynamics, even more fundamental than quantum electrodynamics. Amazingly, Feynman realizes that the physical properties of space itself play a crucial role in the identification of such fundamental objects (that he calls "little Xons" [1]). It is precisely when space itself, besides matter, is included as part of the emergent phenomenon that questions arise which, probably, cannot be entirely solved via theoretical reasoning alone. See, e.g., [33,34,35,36,37] for different points of view.
In fact, there are plenty of unreachable regimes in fundamental physics, starting from BHs, that we do know to exist, but that are not (easily or at all) reproducible in a laboratory. It is then of tremendous interest to establish solid criteria for such systems to correspond to other systems, within our reach, and to perform experiments on the latter to learn about the former. On the other hand, when such correspondences are solidly established, why not infer from the analog system the most intimate nature of the target system? For instance, if QG behaves like graphene (under certain conditions for graphene and for certain specific regimes of QG), and since we still do not know how QG really is, why not try to guess the whole QG picture from what we learned of the partial overlap between the two systems?
This is a less beaten track, but not a completely empty one. For instance, inspired by the findings of [13,14,15,16,17,27,28,29], in [6] the authors propose the existence of fundamental, high-energy constituents underlying both matter and space, and that these, at our low energies, exist in an entangled state. This entanglement is there because both matter and space emerge from the dynamics of the same more fundamental objects, whose existence can be inferred from the celebrated upper bound on the entropy of any system, conjectured by Bekenstein [3]. Quoting Feynman and paraphrasing Bekenstein, those objects are called "Xons" [2]. If such a view is correct, even matter that we deem to be fundamental, i.e., elementary, is in fact "quasi-matter", just like the massless quasiparticles, ψ, of graphene [30] owe their properties to the interaction with the lattice¹. The most noticeable result of this "quasi-particle picture" [6] is that the evaporation of a BH inevitably leads to an information loss, in the sense that, in general, there is a nonzero entanglement entropy associated to the final products of the evaporation. On the other hand, within the same picture, in [40] the authors describe BH evaporation from the point of view of the Xons. They see there that the Bekenstein bound [3,4,5] can be an effect of the Pauli exclusion principle, and that a fully unitary picture, leading to a complete recovery of the initial information, is only possible if one could track the evolution of those fundamental constituents.
The paper is organized in two large Sections and some concluding remarks. Section 2 is dedicated to graphene and Dirac materials (DMs) as analogs of high-energy fundamental physics. Section 3 is dedicated to the QG that the latter research has inspired. Each large Section has many Subsections. As for Section 2: Subsections 2.1, 2.2 and 2.3 explain the main reasons why graphene is good at reproducing scenarios of fundamental physics; then Subsection 2.4 tells about old, new and future developments of this line of research, dedicating to each topic a brief Subsubsection; finally, Subsection 2.5 comments on the experimental search. As for Section 3: Subsection 3.1 introduces the quasi-particle picture in the QG context; then Subsections 3.2 and 3.3 deal with BH evaporation as seen from the quasi-particles and as seen from the Xons, respectively; the last Subsection, 3.4, comments on recent work on how (classical) space emerges from the underlying (quantum) dynamics of Xons during BH evaporation. Section 4 is dedicated to our concluding remarks, and is a chance to point to future developments of the whole analog enterprise, in general, and of that based on graphene, in particular.

Analog gravity on graphene
Graphene is an allotrope of carbon. It is one-atom-thick; hence it is the closest to a two-dimensional object in nature. Its existence was theoretically speculated [41,42] and, decades later, it was experimentally confirmed [43]. Its honeycomb lattice is made of two intertwined triangular sub-lattices, L_A and L_B, see Fig. 1. As is now well known, this structure is behind a natural description of the electronic properties of the π electrons² in terms of massless, (2+1)-dimensional, Dirac (hence, relativistic-like) quasi-particles.

First scale, E < E_ℓ: from the tight-binding to the Dirac Hamiltonian
Such electrons, in the tight-binding low-energy approximation, are customarily described by the Hamiltonian (as here we use natural units, the reduced Planck constant is ℏ = 1)

H = −η Σ_{r ∈ L_A} Σ_{i=1,2,3} [a†(r) b(r + s_i) + b†(r + s_i) a(r)],

where the nearest-neighbour hopping energy is η ≈ 2.8 eV, the s_i are the three vectors connecting a site of L_A to its nearest neighbours in L_B, and a, a† (b, b†) are the anticommuting annihilation and creation operators, respectively, for the planar π electrons in the sub-lattice L_A (L_B), see Fig. 1. All the vectors are two-dimensional, r = (x, y), and, for the choice of basis vectors made in Fig. 1, if we Fourier transform, a(r) = Σ_k a(k) e^{ik·r}, the Hamiltonian becomes H = −η Σ_k [f(k) a†(k) b(k) + h.c.], with f(k) = Σ_i e^{ik·s_i}, where ℓ ≈ 1.4 Å is the graphene honeycomb lattice length (see Fig. 1). Solving E(k) = ±η|f(k)| ≡ 0 tells us if, in the first Brillouin zone (FBZ), conduction and valence bands touch and where. Indeed, this does happen for graphene, pointing to a gapless spectrum, for which we expect massless excitations to emerge. Furthermore, the solution is not a Fermi line (the (2+1)-dimensional version of the Fermi surface of (3+1) dimensions): instead, there are two Fermi points, k_D^± = ±(4π/(3√3 ℓ), 0). Even if the mathematical solution to |f(k)| = 0 has six points, only the two indicated are inequivalent [30].
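Everything above follows from the structure factor f(k), so the gapless points and their location can be checked numerically. Below is a minimal sketch; the specific nearest-neighbour vectors s_i are an assumed basis choice, made so that the two inequivalent Fermi points lie on the k_x axis.

```python
import numpy as np

# Assumed nearest-neighbour vectors of the honeycomb lattice (lattice
# length ell); with this basis the Fermi points sit on the k_x axis.
ell = 1.0
eta = 2.8  # nearest-neighbour hopping energy, eV
s = ell * np.array([[np.sqrt(3) / 2, 0.5],
                    [-np.sqrt(3) / 2, 0.5],
                    [0.0, -1.0]])

def f(k):
    """Structure factor f(k) = sum_i exp(i k . s_i)."""
    return np.sum(np.exp(1j * (s @ k)))

def E(k):
    """Positive branch of the tight-binding spectrum, E(k) = eta |f(k)|."""
    return eta * abs(f(k))

# One of the two inequivalent Fermi (Dirac) points
kD = np.array([4 * np.pi / (3 * np.sqrt(3) * ell), 0.0])
print(E(kD))  # ~ 0: the bands touch, the spectrum is gapless

# Linear dispersion near the Dirac point: E(kD + p) ~ v_F |p|,
# with v_F = 3 * eta * ell / 2
vF = 3 * eta * ell / 2
p = 1e-4
print(E(kD + np.array([p, 0.0])) / (vF * p))  # ~ 1
```

The same check at k = −k_D confirms the second gapless point; the four remaining solutions of |f(k)| = 0 are copies of these two under reciprocal-lattice translations.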
The label "D" on the Fermi points stands for "Dirac". That refers to the all-important fact that, near those points, the spectrum is linear, as can be seen from Fig. 2: E_± ≈ ±v_F |k|, where v_F = 3ηℓ/2 ∼ c/300 is the Fermi velocity. This behaviour is expected in a relativistic theory, whereas, in a non-relativistic system, the dispersion relations are usually quadratic. (Fig. 2, which shows this linear dispersion, is taken from [16]; the conventions are those used in [17].)
Expanding around the two Fermi points, k = k_D^± + p, with |p|ℓ ≪ 1, the Hamiltonian takes the Dirac form

H ≃ v_F Σ_p [ψ†_+(p) σ·p ψ_+(p) + ψ†_−(p) σ*·p ψ_−(p)],

where ψ_± ≡ (b_±, a_±)^T are two-component Dirac spinors, and σ ≡ (σ¹, σ²), σ* ≡ (σ¹, −σ²), with σ^i the Pauli matrices. Notice that here the 1/2-spinor description emerges from the two-sublattice honeycomb structure, not from the intrinsic spin of the π electron. Hence, if one considers the linear/relativistic-like regime only, the first scale is

E_ℓ ≡ v_F/ℓ.    (4)

Notice that E_ℓ ∼ 1.5η, and that the associated wavelength, λ = 2π/|p| ≈ 2πv_F/E, at E = E_ℓ is 2πℓ. The electrons' wavelength, at energies below E_ℓ, is large compared to the lattice length, λ > 2πℓ. Those electrons see the graphene sheet as a continuum. The two spinors are connected by the inversion of the full momentum, k_D^± + p → −(k_D^± + p). Whether one needs one or both such spinors to describe the physics strongly depends on the given set-up. For instance, when only strain is present, one Dirac point is enough (see, e.g., [27]); similarly (see below here) when certain approximations on the curvature are valid [15,16,17]. The importance and relevance of the two Dirac points for emergent descriptions of scenarios of high-energy theoretical research has been discussed at length in [28], where the role of grain boundaries, and the related necessity for two Dirac points, was explained in terms of a relation to spacetime torsion, see below. The full focus on torsion, though, is in [44].
When only one Dirac point is necessary over the whole linear regime, the following Hamiltonian well captures the physics of undeformed (planar and unstrained) graphene:

H = −i v_F ∫ d²x ψ† σ·∂ ψ,

where the two-component spinor is, e.g., ψ ≡ ψ_+, we moved back to configuration space, p → −i∂, and sums turned into integrals because of the continuum limit. In various papers, this regime was exploited to a great extent, up to the inclusion of curvature and torsion in the geometric background. On the other hand, the regimes beyond the linear one were also investigated. There, granular effects associated with the lattice structure emerge, see [45] and also the related [46]. When both Dirac points are necessary, one needs to consider four-component spinors in a reducible representation [47,17,48], Ψ ≡ (ψ_+, ψ_−)^T, and 4 × 4 Dirac matrices, e.g., α^i = diag(σ^i, −σ^{*i}). These matrices satisfy all the standard properties, see, e.g., [28] and [17].
With these, the Hamiltonian is

H = −i v_F ∫ d²x Ψ† α^i ∂_i Ψ.

2.2 Second scale, E < E_r < E_ℓ: from the flat space to the curved space Dirac Hamiltonian

In [13], the goal was to identify the conditions for graphene to get as close as possible to a full-power QFT in curved spacetime. Therefore, key issues had to be faced, such as the proper inclusion of the time variable in a relativistic-like description, and the role of the nontrivial vacua and their relation to different quantization schemes for different observers. All this finds its synthesis in the Unruh or the Hawking effects, the clearest and unmistakable signatures of QFT in curved spacetime. Therefore, starting from [13,14], this road was pursued in [15,16]. Let us explain here the main issues and the approximations made there. Besides the scale (4), when we introduce curvature, we also have a second scale. When this happens, E_ℓ is our "high energy regime", as we ask the curvature to be small compared to a maximal limiting curvature, 1/ℓ²; otherwise: i) it would make no sense to consider a smooth metric, and ii) r < ℓ (where 1/r² measures the intrinsic curvature) would mean that we should bend the very strong σ-bonds, an instance that does not occur. Therefore, our second scale is E_r, with E_r = v_F/r ≪ E_ℓ. To have a quantitative handle on these scales, let us take, e.g., r ≈ 10ℓ as a small radius of curvature (high intrinsic curvature). To this corresponds an energy E_r ∼ 0.4 eV, whereas to r ∼ 1 mm ∼ 10⁶ ℓ corresponds E_r ∼ 4 µeV. The "high energy" to compare with is E_ℓ ∼ 4 eV. When energies are within E_r (wavelengths comparable to 2πr), the electrons experience the global effects of curvature. That is to say, at those wavelengths, they can distinguish between a flat and a curved surface and, in particular, between, e.g., a sphere and a pseudosphere. Therefore, whichever curvature r > ℓ we consider, the curvature effects are felt until the wavelength becomes comparable to 2πℓ.
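The hierarchy of scales just described is simple arithmetic, which can be verified directly (numbers as in the text, with E_ℓ = 1.5η and E_r = E_ℓ ℓ/r):

```python
# Graphene energy scales in natural units (hbar = 1):
# E_ell = v_F / ell = 3*eta/2, and E_r = v_F / r = E_ell * (ell / r).
eta = 2.8            # nearest-neighbour hopping energy, eV
E_ell = 1.5 * eta    # ~ 4.2 eV, the "high energy" scale of the linear regime
for ratio in (10, 10**6):  # r = 10 ell (high curvature), r ~ 10^6 ell
    E_r = E_ell / ratio
    print(f"r/ell = {ratio:.0e} -> E_r = {E_r:.2e} eV")
# prints 4.20e-01 eV and 4.20e-06 eV, i.e. ~0.4 eV and ~4 µeV as in the text
```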
The formalism we have used, though, considers all deformations of the geometric kind, except for torsion. Hence, this includes intrinsic curvature and elastic strain of the membrane (on the latter, see [27]). However, its validity stops before E_ℓ, because there local effects (such as the actual structure of the defects) play a role that must be taken into account in a QG type of theory. On the latter, the first steps were taken in [45] and also in [46] and in the forthcoming [49].
The intrinsic curvature is taken here as produced by disclination defects, that are customarily described in elasticity theory (see, e.g., [50]) by the (smooth) derivative of the (non-continuous) SO(2)-valued rotational angle, ∂_i ω ≡ ω_i, where i = 1, 2 is a "curved" spatial index³. The corresponding (spatial) Riemann curvature tensor is easily obtained,

R_{ijkl} = K (g_{ik} g_{jl} − g_{il} g_{jk}),    (8)

where K is the Gaussian (intrinsic) curvature of the surface. In this approach we have included time, although the metric we adopted is

ds² = dt² − g_{ij}(x) dx^i dx^j,    (9)

i.e., the curvature is all in the spatial part, and ∂_t g_{ij} = 0. Since the time dimension is included, the SO(2)-valued (abelian) disclination field has to be lifted-up to an SO(1,2)-valued (non-abelian) disclination field⁴, ω_µ^a, a = 0, 1, 2, with ω^a_µ = e^b_µ ω^a_b. It enters the covariant derivative of spinors through D_µ = ∂_µ + (1/2) ω_µ^{ab} J_{ab}, where (J_{ab})^α_β are the Lorentz generators in spinor space, and ω_µ^{ab} is the spin connection, whose relation to the Christoffel connection comes from the full metricity condition ∇_µ e^a_ν = ∂_µ e^a_ν − Γ^λ_{µν} e^a_λ + ω^a_{µ b} e^b_ν = 0. We also introduced the Vielbein e^a_µ (and its inverse E^µ_a), satisfying η_{ab} e^a_µ e^b_ν = g_{µν}, e^a_µ E^ν_a = δ^ν_µ, e^a_µ E^µ_b = δ^a_b, where η_{ab} = diag(1, −1, −1). (³ The Weyl dimension of the Dirac field ψ in n dimensions is d_ψ = (1 − n)/2. Here n = 3, and we can move one dimension up (embedding), or down (boundary). More notations can be found in [13]. ⁴ Recall that in three dimensions ω_µ^{ab} = ε^{abc} ω_{µ c}.)
The last expression gives the relation between the disclination field and the metric (dreibein). All the information about intrinsic curvature is retained: for instance, the Riemann curvature tensor, R^λ_{µνρ}, has only one independent component, proportional to K, just like in (8) (see [13]).
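This single-independent-component statement can be checked symbolically. A small sketch with SymPy, using the simplest constant-curvature surface (a sphere of radius r, an illustrative choice) as the spatial metric:

```python
import sympy as sp

# For a sphere of radius r, check that the spatial Riemann tensor has a
# single independent component, R_1212 = K (g_11 g_22 - g_12^2), with
# Gaussian curvature K = 1/r^2.
th, ph, r = sp.symbols('theta phi r', positive=True)
x = [th, ph]
g = sp.Matrix([[r**2, 0], [0, r**2 * sp.sin(th)**2]])
ginv = g.inv()

def Gamma(a, m, n):
    """Christoffel symbols of the Levi-Civita connection of g."""
    return sp.simplify(sum(ginv[a, s] * (sp.diff(g[s, m], x[n])
                                         + sp.diff(g[s, n], x[m])
                                         - sp.diff(g[m, n], x[s]))
                           for s in range(2)) / 2)

def R_up(a, b, c, d):
    """Riemann tensor R^a_{bcd}."""
    expr = sp.diff(Gamma(a, b, d), x[c]) - sp.diff(Gamma(a, b, c), x[d])
    expr += sum(Gamma(a, e, c) * Gamma(e, b, d)
                - Gamma(a, e, d) * Gamma(e, b, c) for e in range(2))
    return sp.simplify(expr)

R_1212 = sp.simplify(sum(g[0, e] * R_up(e, 1, 0, 1) for e in range(2)))
K = sp.simplify(R_1212 / (g[0, 0] * g[1, 1] - g[0, 1]**2))
print(K)  # K = 1/r**2, as expected
```

Swapping in the metric of a constant-negative-curvature surface would return a constant negative K; only the matrix g needs to be changed.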
With all of the above in mind, the hypothesis is that, when only curvature is important, the long wavelength/small energy electronic properties of graphene are well described by the following action,

𝒜 = i ∫ d³x √g ψ̄ E^µ_a γ^a (∂_µ + Ω_µ) ψ,    (11)

with Ω_µ ≡ ω_µ^a J_a, and J_a the generators of SO(1,2), the local Lorentz transformations in this lower-dimensional setting. Notice that J_a can never take into account mixing of the ψ_±, because they do not mix the two Dirac points (they are block diagonal). This point was discussed at length in [28], within the Witten approach [51]. In that approach, the most general gauge field that takes into account curvature (intrinsic and extrinsic) and torsion has the structure A_µ = Ω_µ + K_µ, where K_µ ≡ e^a_µ K_a, with K_a the remaining (translation-like) generators; hence one has a Poincaré (ISO(2,1)) or (A)dS type of gauge theory, depending on the role played there by the cosmological constant (on this see [15,16], and the review [17]). The matter, though, might be approached by taking an alternative view, for which the gauge fields are internal rather than spatiotemporal. In this case, a link with the supersymmetry (SUSY) introduced in [52] (a SUSY without superpartners, often referred to as unconventional SUSY (USUSY)) can be established, as shown in [28] and in [53,54,55], and as briefly discussed in Section 2.4.2.
Let us clarify here an important point. Within this scenario, a nontrivial g_{00} in (9), hence a clean nontrivial general relativistic effect (recall that g_{00} ∼ V_{grav}), can only happen if specific symmetries and set-ups map the lab system into the wanted one. A lot of work went into this, e.g., [15,16], and it went as far as producing measurable predictions of a Hawking/Unruh effect, for certain specific shapes. Let us recall here the main ideas behind this approach, which we may call the "Weyl symmetry approach" [17].

The importance of Weyl symmetry
First of all, one notices that the action (11) enjoys local Weyl symmetry,

g_{µν} → φ²(x) g_{µν},  ψ → φ^{−1}(x) ψ,

which is an enormous symmetry among fields/spacetimes [56]. As explained in [13,14], to make the most of the Weyl symmetry of (11), we better focus on conformally flat metrics. The simplest metric to obtain in a laboratory is of the kind (9). For this metric the Ricci tensor is R^µ_ν = diag(0, K, K), and the only nonzero components of the Cotton tensor are then proportional to derivatives of K. Since conformal flatness in (2+1) dimensions amounts to C_{µν} = 0, this shows that all surfaces of constant K give rise in (9) to conformally flat (2+1)-dimensional spacetimes. This points the spotlight to surfaces of constant Gaussian curvature.
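The weight counting behind this invariance can be sketched as follows (a sketch only: the O(∂φ) terms are cancelled by the transformation of the spin connection inside Ω_µ, see [13,56]):

```latex
% Weyl rescaling in n = 3 dimensions, with d_\psi = (1 - n)/2 = -1:
g_{\mu\nu} \to \varphi^{2} g_{\mu\nu}\,, \quad
\sqrt{g} \to \varphi^{3} \sqrt{g}\,, \quad
E^{\mu}_{\ a} \to \varphi^{-1} E^{\mu}_{\ a}\,, \quad
\psi \to \varphi^{-1} \psi\,,
% so the integrand of the massless action (11) scales as
\sqrt{g}\, \bar{\psi}\, E^{\mu}_{\ a} \gamma^{a} \partial_{\mu} \psi
\;\to\;
\varphi^{3}\, \varphi^{-1}\, \varphi^{-2}\,
\sqrt{g}\, \bar{\psi}\, E^{\mu}_{\ a} \gamma^{a} \partial_{\mu} \psi
+ O(\partial\varphi)\,,
% i.e. the powers of \varphi cancel and the action is invariant.
```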
The result C_{µν} = 0 is intrinsic (it is a tensorial equation, true in any frame), but to exploit Weyl symmetry to extract non-perturbative exact results, we need to find the coordinate frame, call it Q^µ ≡ (T, X, Y), where the conformally flat form of the metric is manifest, g_{µν}(Q) = φ²(Q) η_{µν}. Besides the technical problem of finding these coordinates, the issue to solve is the physical meaning of the coordinates Q^µ, and their practical feasibility. See [17], and [57]. Tightly related to the previous point is the conformal factor that makes the model globally predictive, over the whole surface/spacetime. The simplest possible solution would be a single-valued, time-independent φ(q), already in the original coordinate frame, q^µ ≡ (t, u, v), where t is the laboratory time and, e.g., u, v are the meridian and parallel coordinates of the surface.
Here we are dealing with a spacetime that is embedded into flat (3+1)-dimensional Minkowski spacetime. Although, as said, the focus is on intrinsic curvature effects, just like in a general relativistic context, issues related to the embedding, even just for the spatial part, are important. For instance, when the surface has negative curvature, one needs to move from the abstract objects of non-Euclidean geometry to objects measurable in a Euclidean real laboratory. This involves the last point above about global predictability, and, in the case of negative curvature, necessarily leads to singular boundaries for the surfaces, as proved in a theorem by Hilbert, see, e.g., [17] and [58]. Even the latter fact is, once more, a coordinates effect, due to our insisting on embedding a negative-curvature surface in R³, and it clarifies the hybrid nature of these emergent relativistic settings. The quantum vacuum of the field that properly takes into account the measurement process, as for any QFT on a curved spacetime, was also identified, including how the graphene hybrid situation can realize it [15,16]. As is well known, this is crucial in QFT, in general, and on curved space, in particular.
The above led us to propose a variety of set-ups, the most promising being the one obtained by shaping graphene as a Beltrami pseudosphere [15,16,17], a configuration that can be put in correspondence with three key spacetimes with horizon: Rindler, de Sitter, and the Bañados-Teitelboim-Zanelli (BTZ) BH [59]. The predicted impact on measurable quantities is reported in the first papers, and was then explored in subsequent computer-based simulations.

Ramifications
Many other high-energy scenarios can be reached with graphene and related systems that go under the name of Dirac materials (DMs) [31]. Here we list some of these directions.

Generalized Uncertainty Principles on DMs
In [45] (see also [46]), the realization in DMs of specific generalized uncertainty principles (GUPs), associated with the existence of a fundamental length scale, was studied. The scenario one wants to reproduce there is that for which the commutation relations [x_i, p_j] = i δ_{ij} are modified, by quantum gravity effects, through momentum-dependent corrections controlled by a parameter A = Ã ℓ_P/ℏ (see, e.g., [60,61,62,63,64,65,66,67,68,69] and references therein), where Ã is a phenomenological dimensionless parameter and ℓ_P ∼ 10^{−35} m is the Planck length.
In [45], it is shown that a generalized Dirac structure survives beyond the linear regime of the low-energy dispersion relations. Also, a GUP compatible with (14), related to QG scenarios with a fundamental minimal length (there, the graphene lattice spacing) and Lorentz violation (there, the particle/hole asymmetry, the trigonal warping, etc.), is naturally obtained. It is then shown that the corresponding emergent field theory is a table-top realization of such scenarios, by explicitly computing the third-order Hamiltonian and giving the general recipe for any order. Remarkably, these results imply that going beyond the low-energy approximation does not spoil the well-known correspondence with analog massless quantum electrodynamics phenomena (as usually believed). Instead, it is a way to obtain experimental signatures of quantum-gravity-like corrections to such phenomena.
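The seed of these corrections is visible directly in the tight-binding bands: the relative deviation of E from the linear law v_F|p| grows linearly in |p|ℓ. A minimal numerical sketch (the nearest-neighbour vectors s_i are an assumed basis choice, and the direction of p is an arbitrary choice, since the size of the correction is direction-dependent through the trigonal warping):

```python
import numpy as np

# Deviation from the linear (Dirac) law near a Fermi point: the relative
# deviation of E(kD + p) from v_F |p| is O(|p| ell), the lattice-scale
# correction underlying the emergent GUP.
ell, eta = 1.0, 2.8
s = ell * np.array([[np.sqrt(3) / 2, 0.5],
                    [-np.sqrt(3) / 2, 0.5],
                    [0.0, -1.0]])
kD = np.array([4 * np.pi / (3 * np.sqrt(3) * ell), 0.0])
vF = 3 * eta * ell / 2

def E(k):
    return eta * abs(np.sum(np.exp(1j * (s @ k))))

for p in (1e-3, 1e-2, 1e-1):
    dev = E(kD + np.array([p, 0.0])) / (vF * p) - 1
    print(f"|p| ell = {p:.0e}:  relative deviation = {dev:+.2e}")
```

Each tenfold increase in |p| grows the deviation roughly tenfold, as expected from a linear-in-|p|ℓ correction.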
In [46], the authors investigated the structure of the gravity-induced GUP in (2+1) dimensions. They showed that the event horizon of the M = 0 BTZ micro-black-hole furnishes the most consistent limiting "gravitational radius" R_g (that is, the fundamental minimal length induced by gravitational effects). A suitable formula for the GUP could be obtained, along with estimates of the corrections it induces on the Hawking temperature and the Bekenstein entropy. As for the role of graphene, it is shown that the extremal M = 0 case, with the natural unit of length introduced by the cosmological constant, ℓ = 1/√(−Λ), is a possible alternative to R_g, and DMs, when shaped as hyperbolic pseudospheres, represent condensed-matter analog realizations of this scenario with ℓ = ℓ_DM. Due to the peculiarities of three-dimensional gravity [70], this configuration can still be regarded as a BH, even though M = 0; on this see, e.g., [71,72,73].
More work in this QG phenomenology direction is forthcoming [49]. There it is found that even more GUPs are at work at different energy scales, and a link is established between the abstract coordinates satisfying the GUPs and the coordinates one measures in the lab.
With this in mind, one sees that our scales here are much more within reach than those of (14). Indeed, ℓ_P needs to be traded for the lattice spacing ℓ, which, e.g., for graphene is ℓ_graphene ∼ 1.4 × 10^{−10} m. Therefore, we have much more hope to see in DMs the effects of the modifications to [x_i, p_j] = i δ_{ij} than the direct effects of O(ℓ_P).
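In numbers (order-of-magnitude values only):

```python
# The analog "minimal length" in a Dirac material is the lattice spacing,
# some 25 orders of magnitude larger than the Planck length; this ratio
# quantifies why GUP-like corrections are far more accessible in DMs.
l_P = 1.6e-35          # Planck length, m (order of magnitude)
l_graphene = 1.4e-10   # graphene lattice length, m
enhancement = l_graphene / l_P
print(f"l_graphene / l_P ~ {enhancement:.1e}")  # ~ 10^25
```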

Grain boundaries on DMs and two scenarios: Witten 3D gravity, and USUSY
In [28], two different high-energy-theory correspondences on DMs, associated with grain boundaries (GBs), are proposed. We recall here that a GB can be realized as a line of disclinations of opposite curvature, for instance pentagons and heptagons, arranged so that two regions (grains) of the membrane match. These grains have different relative orientations, given by the so-called misorientation angle θ, which characterizes the GB defect. Each side of the GB corresponds to one of the Dirac points (and the other is related by a parity transformation, see Appendix B of [28] for details) in the continuum π-electron description. Therefore, the continuum-limit description of the π electrons living in a honeycomb with a GB needs the two inequivalent Dirac points. Even more, as the θ angle is related to a non-zero Burgers vector b through the Frank formula, and a nonzero b implies non-zero torsion in the continuum limit⁵, such a description should take torsion into account. The first correspondence points to a (3+1)-dimensional theory with spatiotemporal gauge group SO(3,1), with nonzero torsion, locally isomorphic to the Lorentz group in (3+1) dimensions, or the de Sitter group in (2+1) dimensions, in the spirit of (2+1)-dimensional gravity à la Witten [51]. The other correspondence treats the two Dirac fields as an internal symmetry doublet, and is linked there with USUSY [52] with SU(2) internal symmetry [53]. One of the properties of USUSY is the absence of gravitini, although it includes gravity and supersymmetry. Even if in (2+1) dimensions it is constructed from a Chern-Simons connection containing fermion fields, the only propagating local degrees of freedom are the fermions [75]. Notice that in USUSY the torsion of the geometric backgrounds appears naturally, and its totally antisymmetric part is coupled to the fermions.
Those results pave the way for the inclusion of GB in the emergent field theory picture associated with these materials, whereas disclinations and dislocations have already been well explored.

Particle-Hole pairs in graphene to spot spatiotemporal torsion
In [44], assuming that dislocations can be meaningfully described by torsion, a scenario is proposed based on the role of time in the low-energy regime of two-dimensional DMs, for which the coupling of the fully antisymmetric component of the torsion with the emergent spinor is not necessarily zero. That approach is based on the realization of an exotic time-loop, that could be seen as oscillating particle-hole pairs. Although that is a theoretical paper, the first steps were taken toward testing the laboratory realization of these scenarios, by envisaging Gedankenexperiments on the interplay between an external electromagnetic field (to excite the particle-hole pair and realize the time-loops) and a suitable distribution of dislocations described as torsion (responsible for the measurable holonomy in the time-loop, hence a current). The general analysis establishes that we need to move to a nonlinear response regime. The authors then conclude by pointing to recent results on the laser-graphene interaction that could be used to look for manifestations of the torsion-induced holonomy of the time-loop, e.g., as specific patterns of suppression/generation of higher harmonics. As said before, USUSY takes into account torsion and couples its totally antisymmetric component with fermions in a very natural way. Therefore, it could play a significant role also in this exotic time-loop [76].

Vortex solutions of Liouville equation and quasi spherical surfaces
In [57], the authors identified the two-dimensional surfaces corresponding to specific solutions of the Liouville equation of importance for mathematical physics, the non-topological Chern-Simons (or Jackiw-Pi [77,78]) vortex solutions, characterized by an integer [79] N ≥ 1. Such surfaces, called there S²(N), have positive constant Gaussian curvature, K, but are spheres only when N = 1. They have edges and, for any fixed K, have a maximal radius c, found there to be c = N/√K. If such surfaces are constructed in a laboratory using DMs, these findings could be of interest to realize table-top Dirac massless excitations on nontrivial backgrounds. The types of three-dimensional spacetimes obtained as the product S²(N) × R are also briefly discussed there.

Realization in the labs
Besides the theoretical work just outlined, one should always aim at the actual realization of the necessary structures in real laboratories. See, e.g., the work [58], where Lobachevsky geometry was realized via simulations, by producing a carbon-based, mechanically stable molecular structure arranged in the shape of a Beltrami pseudosphere. It was found there that this structure: i) corresponds to a non-Euclidean crystallographic group, namely a loxodromic subgroup of SL(2, Z); ii) has an unavoidable singular boundary that is fully taken into account. That approach, substantiated by extensive numerical simulations of Beltrami pseudospheres of different sizes, might be applied to other surfaces of constant negative Gaussian curvature, and points to a general procedure to generate them. Such results pave the way for future experiments. More work is currently under way.
3 Graphene-inspired quantum gravity: the quasiparticle picture

If the entropy of any physical system of volume V, including the entropy associated to space itself, is never bigger than the entropy of the BH whose event horizon coincides with the boundary of V, this means that the associated Hilbert space, H, has finite dimension, dim(H) ∼ e^{S_BH}. This simple consideration poses serious questions.
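To appreciate the numbers involved, here is an order-of-magnitude sketch of the bound for a macroscopic region (the 1 cm radius is an arbitrary illustrative choice):

```python
import math

# Bekenstein bound for a region of boundary area A: S_BH = A / (4 l_P^2),
# so dim(H) ~ exp(S_BH) -- enormous, yet finite.
l_P = 1.6e-35              # Planck length, m (order of magnitude)
r = 1e-2                   # radius of the region, m (illustrative)
A = 4 * math.pi * r**2     # area of the bounding sphere
S_BH = A / (4 * l_P**2)    # entropy bound, in nats
print(f"S_BH ~ {S_BH:.1e}")  # ~ 10^66: finite, but astronomically large
```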
In fact, at our energy scales, the world is well described by fields (matter) and the space they live in. Quantum fields, as we know them, act on infinite-dimensional Hilbert spaces, to which one should add the degrees of freedom surely carried by (the quanta of) space itself. How can it then be that the ultimate Hilbert space, which must include all degrees of freedom, is not only separable, like for a single harmonic oscillator, but is actually finite-dimensional?
This logic points to the existence of something more fundamental, making up both matter and space. Hence, the elementary particles of the Standard Model (leptons, quarks, etc.) would be, in fact, quantum quasi-particles, whose physical properties (spin, mass, etc.) are the effect of the interaction with a lattice whose emergent picture is, in turn, (classical) space. Inspired by Feynman [1] (see the Introduction here), these objects were called Xons [2]. To access the Xons one needs resolutions of the order of the Planck length, which might be not only technically unfeasible, but actually impossible, see, e.g., [80].
In [6], and later in [40], general arguments are provided regarding the connection between our low-energy quantum-matter-on-classical-space description and a hypothetical fundamental theory of the Xons. The reshuffling of the fundamental degrees of freedom during the unitary evolution then leads to an entanglement between space and matter. The consequences of such a scenario are considered in the context of BH evaporation, see, e.g., [81,82,83], and the related information loss: a simple toy model is provided in which an average loss of information is obtained as a consequence of the entanglement between matter and space. Pivotal for the previous study is the work of [84], where the Hawking-Unruh phenomenon is studied within an entropy-operator approach, à la Thermo-Field Dynamics (TFD) [85,86], that discloses the thermal properties of BHs.

The universal quasiparticle picture
Emergent, nonequivalent descriptions of the same underlying dynamics are ubiquitous in QFT [87], as, in general, the vacuum has a nontrivial structure with nonequivalent⁶ "phases" [86]. That is, for a given basic dynamics (governed by a Hamiltonian or a Lagrangian), one should expect several different Hilbert spaces, representing different "phases" of the system with distinct physical properties. Distinct excitations play the role of the elementary excitations for the given "phase", but their general character is that of the quasiparticles of condensed matter [85,86].
What is added here to that QFT picture is that:
• the degrees of freedom are finite, hence fields are necessarily emergent;
• spacetime is also emergent.
Taking this view, the continuum of fields and space is then only the result of an approximation, of a limiting process. In general, there must be (many!) microscopic configurations of the Xons giving rise to the same emergent space but to different/non-equivalent fields. With this in mind (for details see [6]), the generic state |ψ⟩ ∈ H can be written as

|ψ⟩ = Σ_{i=1}^{N_T} Σ_{I,n} c^{(i)}_{In} |I_i⟩ ⊗ |n_i⟩,

where the vectors |I_i⟩ and |n_i⟩ form a basis of H^{p_i}_G and H^{q_i}_F, the Hilbert space of the "spatial degrees of freedom" (Geometry) of dimension p_i and the Hilbert space of the "matter degrees of freedom" (Fields) of dimension q_i, respectively, and the c^{(i)}_{In} are numerical coefficients. Notice that N_T is the number of specific rearrangements (Topologies) of the degrees of freedom. (⁶ Let us explain why we use the word phase in quotation marks. Given the general vacuum of a QFT, one can identify several vacua that cannot be obtained one from the other through a smooth unitary transformation. Starting from each of these "sub-vacua", and acting with the appropriate creation operators, one builds several (infinite) sectors, sometimes called superselection sectors. Not all of them correspond to a phase of the system in the proper statistical-mechanical/thermodynamical sense. On the other hand, all such phases need be described by a superselection sector or by a set of them. On this, see, e.g., [88].)
By denoting with P^{(i)} : H → T^{(i)} the projector onto T^{(i)}, a subspace with a given "Topology", the associated density matrix, representing the state of the field, is

ρ^{(i)} = Tr_G |ψ_i⟩⟨ψ_i| ,   (17)

where |ψ_i⟩ ∼ P^{(i)}|ψ⟩, and we trace away the degrees of freedom of the gravitational field. Correspondingly, the entropy of entanglement between matter and space, for a given topology of the lattice, is the usual expression 7

S^{(i)} = −Tr ( ρ^{(i)} ln ρ^{(i)} ) .   (18)

This picture needs to be compared to the standard QFT picture, recalled earlier, of the non-equivalent field configurations, or "phases" à la TFD [85,86], where the mirror degrees of freedom that characterize TFD (often called there the tilde degrees of freedom) model the degrees of freedom of the geometry. These degrees of freedom are then traced away, leaving us with quantities all referring solely to matter (fields). Indeed, the vacuum of TFD can be written as [85]

|0(θ)⟩ = Σ_n √(w_n(θ)) |n, ñ⟩ ,   (19)

where θ is a physical parameter labeling the different "phases", w_n are probabilities such that Σ_n w_n = 1, and the states |n, ñ⟩ (infinite in number) are the components of the condensate, each made of pairs of n quanta and their n mirror counterparts (ñ). Therefore, such vacuum is clearly an entangled state. Notice that [85]

⟨0(θ)|0(θ′)⟩ → 0 ,   (20)

in the field limit, which formalizes the inequivalence we have discussed. Notice also that, if one fixes θ, there is no unitary evolution to disentangle the vacuum, as the interaction with the environment and non-unitarity are the basis for the generation and the stability of such entanglement [84]. The expected values of the field's observables, O, are obtained by tracing away the mirror modes, ñ. In the TFD formalism this corresponds to taking the vacuum expectation value over the vacuum (19)

⟨O⟩ ≡ ⟨0(θ)|O|0(θ)⟩ = Σ_n w_n(θ) ⟨n|O|n⟩ .   (21)
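As a minimal numerical sketch of the tracing procedure behind Eqs. (17)-(18) (ours, not taken from the cited works): for a pure bipartite state written as a coefficient matrix c_{In}, the reduced density matrix of the fields is ρ_F = c†c, and the entanglement entropy follows from its eigenvalues. The function and variable names are our own illustrative choices.

```python
import numpy as np

def entanglement_entropy(psi, dim_G, dim_F):
    """Von Neumann entropy of the matter (F) sector after tracing out
    the geometry (G) sector, for a pure state psi of the G x F system.

    psi is ordered so that psi.reshape(dim_G, dim_F)[I, n] = c_{In}.
    """
    c = psi.reshape(dim_G, dim_F)
    rho_F = c.conj().T @ c                 # rho_F = Tr_G |psi><psi|
    w = np.linalg.eigvalsh(rho_F)          # eigenvalues = Schmidt weights
    w = w[w > 1e-12]                       # 0 ln 0 = 0 by continuity
    return float(-np.sum(w * np.log(w)))

# A maximally entangled 2x2 state gives S = ln 2; a product state gives 0.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2.0)
S_bell = entanglement_entropy(bell, 2, 2)
S_prod = entanglement_entropy(np.array([1.0, 0.0, 0.0, 0.0]), 2, 2)
```

The same routine, scaled up, is the kind of computation needed once the geometry sector is high-dimensional and the entropy is no longer analytic.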
In particular, there is always an entanglement entropy associated to any field, given by, e.g.,

S = − Σ_k [ ⟨n_k⟩ ln⟨n_k⟩ + (1 − ⟨n_k⟩) ln(1 − ⟨n_k⟩) ] ,   (22)

where ⟨n_k⟩ = ⟨N_k⟩ is the expected value of the number operator for the given (fermionic, in this example) mode k. The analogy of (22) with (18) is even stronger if we think that in TFD the process of taking statistical averages through tracing is replaced, by construction [85], by taking vacuum expectation values (vevs) over the vacuum (19). Furthermore, as is well known, in the basis where the density matrix in the entropy (18) is diagonal, the entropy can be written as

S = − Σ_n w_n ln w_n ,

as shown, e.g., in [86]. In this comparison, the mirror (tilde) image of the field mimics the effects of the entanglement with the space where the field lives, even when the space is flat. This happens on a level that is both emergent and effective. This would have far-reaching consequences, surely worth a serious exploration. For instance, the entanglement entropy associated to any field would never be zero. Furthermore, this would explain why the attempt to quantize gravity as we quantize the matter fields cannot make much sense.
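Equation (22) is straightforward to evaluate for any set of mode occupations; here is a short sketch (ours), applied both to thermal Fermi-Dirac occupations and to the maximally mixed, half-filled case:

```python
import math

def xlogx(x):
    # 0 ln 0 = 0 by continuity
    return x * math.log(x) if x > 0.0 else 0.0

def fermionic_entropy(occupations):
    """S = -sum_k [ n_k ln n_k + (1 - n_k) ln(1 - n_k) ], as in Eq. (22)."""
    return -sum(xlogx(n) + xlogx(1.0 - n) for n in occupations)

# Thermal (Fermi-Dirac) occupations for a few modes of energy w_k,
# at inverse temperature beta (illustrative values).
beta, modes = 2.0, [0.5, 1.0, 1.5]
n_thermal = [1.0 / (math.exp(beta * w) + 1.0) for w in modes]
S_thermal = fermionic_entropy(n_thermal)

# Half-filled modes carry maximal entropy: each contributes ln 2.
S_max = fermionic_entropy([0.5, 0.5, 0.5])   # = 3 ln 2
```

Note that S vanishes only when every mode is exactly empty or exactly full, in line with the remark that the entanglement entropy of a field is generically nonzero.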
To compare TFD entropies with the entropies obtained in the quasiparticle picture, a different point of view is taken in [40]. There the authors focus on BH evaporation as seen from the point of view of the fundamental Xons, and are able to establish formulae and structures indeed similar to those of TFD. The main difference with TFD is that, at the level of the discrete structures related to Xons, the quantum field theoretical considerations illustrated above are only an approximation. In Subsection 3.3 we shall recall those results. Before doing so, let us focus on BH evaporation as seen from the point of view of the emergent quantum fields and emergent space.

Effects of the quasiparticle picture on black hole evaporation
When applied to BH evaporation, the immediate consequence of the above is that it is impossible to retrieve, after the evaporation, the very same "phase" we had before the BH was formed. Hence, the information associated to the quantum fields before the formation of the BH is, in general, lost after the BH has evaporated, due to the entanglement between matter and space. Even when the emergent spaces before the formation and after the evaporation are the same (say, both Minkowski spacetimes), the emergent fields belong, in general, to non-equivalent Hilbert spaces. Therefore, even assuming unitary evolution at the level of the Xons, the initial and final Hilbert spaces of fields cannot be the same. There is always a relic matter-space entanglement entropy.
Looking at Eq. (16), it is clear that the Hilbert space H can be written as

H = ⊕_{i=1}^{N_T} ( H^{p_i}_G ⊗ H^{q_i}_F ) ,

where we can now introduce measures, R_F's and R_G's, of the "degeneracies": p_i = N_G R^i_G, with N_G the number of classical geometries available (they represent the BH with mass M^{(a)} = a ε, where a = 0, 1, . . . , N_G − 1), and each classical geometry can be realized by R^i_G microstates. On the other hand, q_i = N_F R^i_F, that is, each emergent field state can be realized by R^i_F indistinguishable microstates. The analytic computation of the entanglement entropy exacts a heavy toll, so in [6] the authors proceeded numerically. The case we present here is for the following choice: N_G = 30, N_T = 2, and R^i_F = 1 for each topology. The plots in Fig. 3 show the entanglement entropies, corresponding to the three sets of values given in the box, as functions of the discrete evolution parameter k.
As can be seen from the figure, the residual entropies are never zero, with values corresponding to the sets in the box, going from the top to the bottom, respectively. The more microscopic realizations of the same macroscopic geometry (i.e., the bigger the degeneracy R_G), the higher the relic entanglement entropy. This is as it must be. The fact that, at the end of the evaporation, the entanglement entropy remains finite signals a dramatic departure from the information conservation scenario of the famous Page curve [82], presented here in Fig. 4. There the total Hilbert space has dimension mn, and consists of two subsystems: the BH subsystem, of dimension n ∼ e^{A/4}, where A is the area of the event horizon, and the radiation subsystem, of dimension m ∼ e^{S_th}, where S_th is the thermodynamic entropy of the radiation. In Page's picture there is no explicit mention of the degrees of freedom of space, and the evolution is taken to be unitary. Thus, in that picture one sees that, when the BH is formed, there is no Hawking radiation outside, hence m = 1 and n = dim H. The BH-radiation entanglement entropy, S_{m,n}, is trivially zero. As the BH evaporates, m increases, while n decreases, keeping mn constant. Since the emitted photons are entangled with the particles under the horizon, S_{m,n} increases, but only up to, approximately, half of the evaporation process. There, the information stored below the horizon starts to leak out of the BH, so that S_{m,n} decreases until full evaporation, when n = 1, m = dim H, and S_{m,n} returns to zero.
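Page's average entanglement entropy for a random pure state of an m × n bipartite system has the closed form S_{m,n} = Σ_{k=n+1}^{mn} 1/k − (m−1)/(2n) for m ≤ n [82]; a short sketch (ours) reproduces the rise and fall of the curve just described:

```python
def page_entropy(m, n):
    """Page's average entanglement entropy <S_{m,n}> of a random pure
    state on an m x n bipartite Hilbert space (exact formula, m <= n)."""
    if m > n:                      # S_{m,n} is symmetric in m and n
        m, n = n, m
    return sum(1.0 / k for k in range(n + 1, m * n + 1)) - (m - 1) / (2.0 * n)

# 10 "qubits" in total: m = 2**j radiation modes, n = 2**(10-j) BH modes,
# with the product mn held fixed as the BH evaporates.
curve = [page_entropy(2 ** j, 2 ** (10 - j)) for j in range(11)]
# S vanishes at both ends and peaks at the halfway point j = 5 (m = n = 32).
```

This is exactly the unitary scenario that the relic entanglement entropy of the quasiparticle picture departs from: there the curve would not return to zero.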
From the point of view of the quasiparticle picture, we may say that, even if one takes a conservative view in which the Xons evolve unitarily, nonunitarity is unavoidable:
• the unitary evolution may well be only formally possible, but physically impossible to measure, due to some form of generalized uncertainty principle forbidding the necessary Planck-scale localization/resolution (see, e.g., [80]);
• the emergent description of the evolution is that of the combined system gravity+matter, hence there is inevitably information loss, due to the relic entanglement of the matter field with the space;
• this description should apply also to standard nonunitary features of QFT, and we evoke here the possibility that the tilde degrees of freedom of TFD could be interpreted as "how the emergent fields see the degrees of freedom of space with which they are entangled".
Notice that this description does allow for an arbitrary number of different fields, hence naturally includes the possibility of yet unknown ("dark") kinds of matter.

BH evaporation as seen from the Xons and the unification of the entropies
In [40] the authors describe BH evaporation from the point of view of the fundamental constituents, assuming they are fermions, so that only one excitation per quantum level is permitted. Because the Xons must be responsible for the formation of both matter and space, no geometric notions can be used. For example, it is assumed that only a finite number N of quantum states/slots is available to the system. This last condition is a non-geometric way of requiring that the system is localized in space. Moreover, it is not meaningful to refer to the interior and the exterior of a BH. Instead, the authors there distinguish between free and interacting Xons, respectively: BH evaporation is the process in which the number of free Xons decreases, N → (N − 1) → (N − 2) → · · · , while interacting Xons form matter (quasiparticles) and space (geometry), i.e., the environment. The Hilbert space of physical states H is a subspace of a larger kinematical Hilbert space K ≡ H_I ⊗ H_II, and it has dimension Σ ≡ dim H = 2^N. Here I and II refer to BH and environment, respectively, in the sense explained above.
The state of the system |Ψ(σ)⟩ ∈ H is [40]

|Ψ(σ)⟩ = Π_{n=1}^{N} ( cos σ b†_n + sin σ a†_n ) |0⟩ ,

where a and b are environment and BH ladder operators, respectively, and σ is an interpolating parameter going from 0 to π/2. We can also define TFD-like entropy operators

S_I(σ) = − Σ_{n=1}^{N} [ a†_n a_n ln sin²σ + a_n a†_n ln cos²σ ] ,

and analogously S_II(σ) in terms of the b operators, so that their averages on |Ψ(σ)⟩ give the von Neumann entropy of the two subsystems:

⟨S_I(σ)⟩ = ⟨S_II(σ)⟩ = −N ( sin²σ ln sin²σ + cos²σ ln cos²σ ) .   (30)
Such entropy quantifies the entanglement between the environment and the BH. As for the original Page result, the entropy (30) shows that BH evaporation at such a fundamental level is a unitary process, with ⟨S(0)⟩ = ⟨S(π/2)⟩ = 0 and a maximum value S_max = N ln 2 = ln Σ, so that Σ = e^{S_max}. S_max quantifies the maximum information necessary to describe the BH and should be identified with the BH entropy before the evaporation. When the BH evaporates, the mean number N_II(σ) = N cos²σ of free Xons decreases, while the mean number of interacting Xons, N_I(σ) = N sin²σ, increases. Then, the BH and environment entropies should be

S_BH = N ln 2 cos²σ ,  S_env = N ln 2 sin²σ .
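The three entropies above are elementary functions of σ, so their key features are easy to verify directly; a quick numerical check (ours, with an illustrative value of N):

```python
import math

N = 100  # number of Xon slots (illustrative value, ours)

def entanglement_entropy(sigma):
    """<S_I> = <S_II> = -N (sin^2 s ln sin^2 s + cos^2 s ln cos^2 s), Eq. (30)."""
    s2, c2 = math.sin(sigma) ** 2, math.cos(sigma) ** 2
    xlogx = lambda x: x * math.log(x) if x > 0.0 else 0.0
    return -N * (xlogx(s2) + xlogx(c2))

def S_BH(sigma):
    """BH entropy N ln 2 cos^2(sigma): decreases during evaporation."""
    return N * math.log(2) * math.cos(sigma) ** 2

def S_env(sigma):
    """Environment entropy N ln 2 sin^2(sigma): increases during evaporation."""
    return N * math.log(2) * math.sin(sigma) ** 2

# Unitary at the fundamental level: <S_I> vanishes at sigma = 0 and pi/2,
# peaks at N ln 2 (sigma = pi/4), and S_BH + S_env = N ln 2 throughout.
```

In particular, the sum S_BH + S_env stays pinned at S_max = N ln 2 for every σ, which is the bound discussed below.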
Moreover, σ finds a natural explanation as a discrete parameter in the interval [0, π/2], essentially counting the diminishing number of free Xons. The entanglement, BH and environment entropies satisfy ⟨S_I⟩ ≤ S_BH + S_env = S_max: the entropy of both BH and environment is bounded from above, in accordance with the Bekenstein bound. In Figure 5, these three entropies are plotted as functions of the discrete parameter σ. For a full identification of S_max with the entropy S_BH of the initial BH, one would have (for a non-rotating, uncharged black hole)

N ln 2 = 4π G M₀² / (ħ c) ,

where M₀ is the initial BH mass. Finally, identifying the N quantum levels with quanta of area, in the spirit of Refs. [89,90,91,92], one gets

A = α N l_P² ,

for the BH horizon area. Notice that the value of α ≡ A/(N l_P²) could be inferred from measurements on BH quasinormal modes [93].
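To get a feel for the numbers, under the identification S_max = N ln 2 = 4πGM₀²/(ħc) (with k_B = 1) one can estimate N for a solar-mass Schwarzschild BH; the rounded constants and the snippet are ours:

```python
import math

# Rounded SI constants (k_B = 1, so entropies are dimensionless)
G, hbar, c = 6.674e-11, 1.055e-34, 2.998e8
l_P2 = hbar * G / c ** 3                      # Planck length squared

def schwarzschild_entropy(M):
    """S_BH = A / (4 l_P^2) = 4 pi G M^2 / (hbar c), Schwarzschild case."""
    A = 16.0 * math.pi * (G * M / c ** 2) ** 2    # horizon area, A = 4 pi r_s^2
    return A / (4.0 * l_P2)

M_sun = 1.989e30                              # kg
S = schwarzschild_entropy(M_sun)              # of order 1e77
N = S / math.log(2)                           # Xon slots, from S_max = N ln 2
```

A solar-mass BH thus corresponds to roughly 10^77 fermionic slots, a number that makes the finiteness of the degrees of freedom entirely invisible at macroscopic scales.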

Topological phases and the emergence of space from evaporating BHs
How is the existence of different phases of matter compatible with the finiteness of the degrees of freedom? This issue is closely related to the evasion of the Stone-von Neumann theorem [94,95,96]. In fact, it is known that in quantum mechanics all continuous, irreducible representations of the Weyl-Heisenberg (for bosons) or Clifford (for fermions) algebra are unitarily equivalent. However, as recalled earlier, such a theorem does not apply to QFT, where systems with an infinite number of degrees of freedom are studied [97,98,99,100]. The existence of unitarily inequivalent representations of the canonical (anti)commutation relations makes it possible to describe transitions among disjoint phases of the same system within the QFT framework. However, it is also possible to evade the Stone-von Neumann theorem by relaxing the continuity hypothesis [101]. This has been shown in quantum mechanical systems with a multiply-connected configuration space [102,103,104] and in polymer quantum mechanics [105,106,107].
In Ref. [108] the authors studied an example where both thermodynamically and topologically disjoint phases are realized: a vortex solution in a QFT with a spontaneously broken U(1) symmetry was analyzed by means of the boson transformation method [109,110,111]. Such an idea, first developed by H. Umezawa and collaborators, makes it possible to describe classical extended objects emerging from an underlying QFT, by means of a canonical transformation performed on bosonic quasiparticle fields, which induces an inhomogeneous condensate in the vacuum. The authors then showed that spontaneous symmetry breaking (SSB) is indeed possible even when the volume of the system stays finite [108]. This represents a first step towards understanding the emergence of different phases in the Xons model.
The above method also sheds some light on the mechanism of formation of space and quasiparticles from an underlying Xon dynamics. In Ref. [112] the authors face the delicate and fascinating issue of how space itself might be viewed as a classical extended object stemming from the SSB of an underlying quantum dynamics, with the associated Goldstone bosons. In that case the discrete Xons Ψ_j are approximated by a field Ψ(x), and the space structure and geometric tensors (metric, curvature, torsion) emerge as a result of the condensation of Goldstone bosons, while quasiparticles are described by fields on a classical (curved) space.
Concluding remarks and future perspectives of the graphene analog enterprise

QG and other fundamental scenarios can be tested also with analog experiments. In fact, the exciting and rapidly evolving field of analog physics is entering a new era. The interest is shifting from the reproduction of the kinematical aspects of the Hawking/Unruh phenomenon, which has reached a climax of precision and accuracy, to the realization of some form of BH (thermo)dynamics. The latter is a challenging problem, but given its importance, even a partial solution is surely worth the effort. The primary goals of the research in this field should then be: to search for realizations of such dynamical aspects, and to learn from the above about QG. Here we have described the results found following the road of graphene. Let us now collect the many directions we see departing from there.

Hunting for analog BH (thermo)dynamics
A conservative approach to BH evaporation [82,81] assumes that the evolution of the collapsing matter that produces a BH, and its subsequent evaporation, is a unitary process. This is what we would like to test in our analog systems. Indeed, current ongoing work [112] primarily focuses on the emergence of space in a QG scenario, as described in Subsection 3.4, and from there we are learning how to hunt for a BH dynamics on graphene and other DMs. In fact, the results of that general work will help us construct an experimentally sound geometry/gravity theory that describes the dynamics of the elastic DM membrane, and explore the relations to existing gravity models. Having an action, we would be able to compute the Wald entropy in the usual way [113].
With this in mind, we are studying the realization of BHs on DMs, based on the discoveries summarized in Subsection 2.4. One important case under scrutiny is that of the BTZ BH realized using hyperbolic pseudospheres [46]. We shall proceed through theoretical investigations, but will interact more and more with experimentalists to test the formulae obtained in [15,114] (or variants, obtained by refining earlier computations in the light of the new results, see, e.g., [28]), and we shall produce more predictions of this kind for different sample morphologies and various graphene observables.
In the "time-wise" approach, the focus will be on reproducing BH (and other nontrivial) emergent metrics, by suitably engineering the interaction of the electromagnetic field with the appropriate DM. The study builds on two kinds of results obtained in earlier investigations and discussed here. On the one hand, the emergence of the Hawking/Unruh effect for specific spatial geometries. On the other hand, the great level of accuracy reached with laser pulses in controlling the spatial and temporal resolution of graphene's electron dispersion relation [115]. The latter results have inspired Ref. [44], where important details are obtained that will pave the way to a full understanding of how to engineer suitable temporal components of the emergent metric, and how to control their dynamics. The two approaches are, of course, tightly related, as one goal will be to rephrase the spatial analysis of previous work into a temporal language, namely by identifying the appropriate transformations between spatial and temporal nontriviality of the emergent metric, and by envisaging the physical setups that could realize those metrics in a laboratory, for instance in the laser-DM interaction.
On the more proper QG side, we expect that lattice effects will play a role even within the continuous approximation regime [27], but surely at the "very high energy" regimes, where the linear approximation no longer works, these aspects become dominant. In the latter regimes the (pseudo-)relativistic structure of the Dirac field will be deformed, and the discrete nature of the space(time) becomes so important that the continuum description, in terms of smooth metrics, will no longer be valid. This will become an important point to strengthen the analogy with QG scenarios of discrete spacetime. Indeed, the results in [45], where the natural analog of the Planck length is the lattice spacing of the material, point in that direction.

BH entropy, the information paradox and the Xons model
Having in our hands a suitable emergent gravitational dynamics, along the lines of what is explained in Subsection 3.4, will surely be a great advancement and a necessary step towards analog BH thermodynamics. Still, it would not be enough, as a suitable and reliable analog of a BH entropy is the key problem to be solved. In this respect, we have two roads in mind, one easier than the other: i) the entanglement entropy of the Dirac fields, on the given dynamical emergent BH background; ii) the computation of the Wald entropy through standard classical calculations, based on the experimentally sound geometry/gravity theory that describes the dynamics of the elastic graphene membrane. The first approach is easier in two respects. First, one does not need an action for the geometry/gravity theory in point. Second, there are many results at our disposal on the entanglement entropy, from the general ones on generic bipartite systems [116], to the specific ones on BH thermodynamics [117,118,84]. With these in our hands, we can surely attempt various things, and it will be exciting to see how certain issues of the theoretical side are solved here in practice. Given the results on the granular regimes beyond the linear theory [45,119] (see also [49]), we are also in the position to compute QG corrections to the formula, and to compare theoretical predictions on the QG side with experiments on the condensed matter side. The second approach is more difficult; nonetheless, we plan to also take steps in that direction, because of its more direct link with purely gravitational scenarios. An exciting perspective is that these two approaches are complementary. It will be illuminating to compare the two entropies, Bekenstein/Wald and entanglement.
Having these aspects under control is clearly necessary to face, within this approach, the long-standing issue of information loss. In [6] the impact on the Page curve of a picture born in analogy with condensed matter, named there the "quasi-particle picture", was investigated. In this picture, more fundamental entities exist (we might call them Xons, with Feynman), and they make particles and spacetime at once: hence the (information preserving) unitarity of the BH evaporation of the Page curve is not tenable. In [40] it is shown how entanglement, Bekenstein and thermodynamic entropies all stem from the same operator, whose structure is the one typical of Takahashi and Umezawa's TFD [86]. We expect that the several interesting new insights gained from this work will substantially help to reach the goals.
Finally, in [46], taking advantage of the peculiarities of the BTZ BH [59], the extremal M = 0 case was identified as furnishing an alternative way to the emergence in DMs of a maximal resolution/minimal length, given by the lattice length ℓ, and related to the (negative) cosmological constant as ℓ = 1/√(−Λ). Noticeably, a similar independent proposal emerged in the discussion of the entanglement entropy of the BTZ, see [120]. There, the AdS length is promoted to the typical length below which spatial quantum correlations are traced out. Clearly, this road has the potential to produce very interesting results.

Other hep-th scenarios on DMs
Many other aspects that will contribute towards the main goal of testing QG scenarios with DMs, but that are important in their own right, are in sight. Let us mention one.
The action of graphene can be recast [28] in a form very similar to the action of USUSY, for an external non-Abelian SU(2) gauge field and a fixed curved background [52]. Indeed, if the geometric background is fixed and the non-Abelian gauge field is external (there is no dynamics for the phonons and gauge fields), then the only difference between such actions is the coefficient in front of the torsion term. Interestingly enough, the vacuum sector is defined by configurations that are locally Lorentz flat, as is the case for BTZ BHs [121], and by SU(2) connections carrying nontrivial global charges [122].

HELIOS
Let us close by making the case for a laboratory where QG and other fundamental theories of nature are tested with analogs. Having in mind what we discussed at length in this review, and that these are the days of the AdS/CFT correspondence (see, e.g., [123]), relating gravity and matter, we believe that the times are mature for a dedicated laboratory, entirely devoted to testing fundamental theories by using analogs [124,125,126]. A laboratory built in the same spirit as CERN would unify, systematize and organize those efforts, but would also raise the status of the analog enterprise to a quest to reach beyond the known. The other side of the story is that analogs are often important materials for technological applications, as in the case of graphene discussed in this review. Such a laboratory would then be an invaluable think-tank, where unconventional thinking would be routinely applied to create new technology and to solve fundamental problems. For a long while now, within our research group in Prague, we have been calling this future facility HELIOS, for High Energy Lab for Indirect ObservationS.