Information-Theoretic Descriptors of Molecular States and Electronic Communications between Reactants

The classical (modulus/probability) and nonclassical (phase/current) components of molecular states are reexamined and their information contributions are summarized. The state and information continuity relations are discussed, and the nonclassical character of the resultant gradient-information source is emphasized. The states of noninteracting and interacting subsystems in a model donor-acceptor reactive system are compared, and configurations of the mutually-closed and mutually-open equidensity orbitals are examined. The density matrices of subsystems in reactive complexes are used to describe the entangled molecular fragments and electron communications in donor-acceptor systems, which determine the entropic multiplicity and composition of chemical bonds between reactants.

The quantum electronic states of molecular systems and their dynamics are determined by the Schrödinger equation (SE). These (complex) wavefunctions are specified by their modulus and phase components, which generate the probability and current distributions of the system electrons. Such physical attributes respectively reflect the complementary classical (static) and nonclassical (dynamic) structures of "being" and "becoming", which both contribute to the state overall entropy and information content. It is of interest to examine their continuity relations in order to establish the net productions of these properties and to identify the origins of their sources.
In quantum mechanics (QM), the wavefunction phase, or its gradient determining the effective velocity of the probability density, gives rise to nonclassical information and entropy supplements to inter-reactant electronic communications, which shape the entropic multiplicities of chemical bonds between reactants.
The effective velocity V(r, t) of the probability "fluid" measures its current-per-particle and reflects the state phase gradient: V(r, t) = j_p(r, t)/p(r, t) = (ħ/m)∇ϕ(r, t).
In the molecular scenario, an electron moves in the external potential v(r) due to the "frozen" nuclear frame of the familiar Born-Oppenheimer approximation. The electronic Hamiltonian ultimately determines the Schrödinger dynamics of the electronic state: iħ(∂ψ/∂t) = Hψ.
This SE of molecular QM also implies specific time evolutions of both components of the complex wavefunction of Equation (1) (see Section 3). The time derivatives of the modulus and phase parts of electronic states ultimately reflect the relevant continuity equations associated with the physical descriptors of the particle probability and current densities. It directly follows from the SE (6) and its complex conjugate that the quantum dynamics implies the probability continuity relation of Equation (7), expressing the vanishing source of this distribution. This relation, in turn, determines the time evolution of the state modulus component (Equation (8)) and the associated phase dynamics (Equation (9)). The sourceless character of the probability flux in Equation (7) also implies the vanishing divergence of the velocity field V (Equation (11)). In the probability continuity relation of Equation (7), the negative divergence term, −∇·j_p(r, t), represents the local probability outflow, and σ_p(r, t) stands for its (vanishing) local "source". The total time derivative of the distribution p(r, t) = p[r(t), t] thus determines the vanishing net production of p(r, t). This total time derivative measures the rate of change in an infinitesimal volume element of the probability fluid moving with velocity V = dr/dt, whereas the partial derivative ∂p[r(t), t]/∂t refers to a volume element around a fixed point in space.
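As a hedged reconstruction of the relations invoked above (the standard Madelung-type forms consistent with p = R², j_p = pV, and the references to Equations (7)-(9); signs and factors should be checked against the original display equations):

```latex
% Probability continuity (Eq. (7)): vanishing local source
\sigma_p(\mathbf{r},t) \equiv \frac{dp}{dt}
  = \frac{\partial p}{\partial t} + \nabla\cdot\mathbf{j}_p = 0,
\qquad
\mathbf{j}_p = p\,\mathbf{V} = \frac{\hbar}{m}\,p\,\nabla\varphi .
% Modulus dynamics (Eq. (8)), from the imaginary part of the SE:
\frac{\partial R}{\partial t}
  = -\frac{\hbar}{m}\Big[\nabla R\cdot\nabla\varphi
      + \tfrac{1}{2}\,R\,\nabla^{2}\varphi\Big].
% Phase dynamics (Eq. (9)), from the real part of the SE:
\frac{\partial \varphi}{\partial t}
  = \frac{\hbar}{2m}\Big[\frac{\nabla^{2}R}{R} - (\nabla\varphi)^{2}\Big]
    - \frac{v}{\hbar}.
```

One verifies directly that ∂p/∂t = 2R(∂R/∂t) = −∇·j_p, recovering the sourceless probability balance.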
One also realizes that the effective velocity V of the probability current, j_p = pV, determines the phase flux j_ϕ = ϕV (Equation (13)) and its divergence (see Equation (11)). This complementary flow descriptor ultimately generates a nonvanishing phase source (Equation (16)), which Equation (9) expresses in terms of the state components and the external potential. To summarize, the effective velocity of the probability current also determines the phase flux in molecules. The source (net production) of the classical (probability) variable of the electronic state identically vanishes, while that of its nonclassical (phase) component remains finite. The phase source is seen to be determined by both wavefunction components and the external potential due to the system nuclei.
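A hedged sketch of this phase-flux bookkeeping, assuming the text's convention ∇·V = 0 of Equation (11) and the phase dynamics quoted above:

```latex
% Phase flux and its (nonvanishing) source:
\mathbf{j}_\varphi = \varphi\,\mathbf{V},
\qquad
\sigma_\varphi = \frac{\partial\varphi}{\partial t}
  + \nabla\cdot\mathbf{j}_\varphi
  = \frac{\partial\varphi}{\partial t} + \mathbf{V}\cdot\nabla\varphi .
% Substituting the phase dynamics (Eq. (9)) and V = (hbar/m) grad(phi):
\sigma_\varphi
  = \frac{\hbar}{2m}\Big[\frac{\nabla^{2}R}{R} + (\nabla\varphi)^{2}\Big]
    - \frac{v}{\hbar}.
```

The final line exhibits the dependence on both state components (R and ϕ) and on the external potential v, in accord with the closing remark of this section.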

Resultant Entropy/Information Descriptors and State Continuity
Equation (1) identifies the additive components of the wavefunction logarithm (Equation (17)). Together they generate the so-called resultant measures of the global and gradient contents of the state overall entropy or information [53-62]. For example, for a given time t, the complex global entropy descriptor [59,62], the quantum expectation value of the (non-Hermitian) multiplicative operator of the complex global entropy, S(r) = −2 ln ψ(r), where S(r) stands for the entropy density per electron, combines the classical (Shannon) entropy S[p] ≡ ∫p(r)S_p(r)dr as its real part and the nonclassical phase supplement S[ϕ] ≡ ∫p(r)S_ϕ(r)dr, which determines its imaginary component. The latter reflects the state average phase ϕ[ψ] = ∫p(r)ϕ(r)dr. The corresponding Fisher-type gradient measure of the state resultant information I[ψ] is defined by the quantum expectation I[ψ] = ⟨ψ|I|ψ⟩ of the (Hermitian) operator in the position representation (Equation (20)), with the associated gradient measure of the resultant entropy given by Equation (21). The generalized information measure I[ψ], related to the average kinetic energy T[ψ] = ⟨ψ|T|ψ⟩, combines the classical Fisher information in the probability distribution, I[p] ≡ ∫p(r)I_p(r)dr, and its nonclassical complement I[ϕ] ≡ ∫p(r)I_ϕ(r)dr, due to inhomogeneities in the state phase distribution. These equations confirm the symmetrical role played by the additive components of Equation (17) in generating the overall entropy and information content of the quantum electronic state. The local density of the resultant gradient information reflects the density of the electronic kinetic energy. One also observes that the densities-per-electron of the gradient information and complex global entropy are mutually related: the gradient of the latter constitutes the (coherent) quantum amplitude of the former.
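A plausible reconstruction of the quoted functionals, following the usual conventions of the resultant-information literature (the constants and signs should be checked against the original Equations (18), (20), and (21)):

```latex
% Complex global entropy, from the density-per-electron S(r) = -2 ln psi(r):
S[\psi] = S[p] + i\,S[\varphi],
\quad
S[p] = -\int p\,\ln p\;d\mathbf{r},
\quad
S[\varphi] = -2\int p\,\varphi\;d\mathbf{r} = -2\,\varphi[\psi].
% Resultant gradient (Fisher-type) information and gradient entropy:
I[\psi] = 4\int |\nabla\psi|^{2}\,d\mathbf{r}
        = I[p] + I[\varphi]
        = \frac{8m}{\hbar^{2}}\,T[\psi],
\quad
I[p] = \int \frac{(\nabla p)^{2}}{p}\,d\mathbf{r},
\quad
I[\varphi] = 4\int p\,(\nabla\varphi)^{2}\,d\mathbf{r},
\qquad
M[\psi] = I[p] - I[\varphi].
```

The sum I[p] + I[ϕ] and the difference I[p] − I[ϕ] exhibit the symmetrical roles of the modulus and phase components emphasized in the text.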
Both resultant densities of the entropic content of the electronic state are seen to include the nonclassical (phase/velocity) complements of the classical (modulus/probability) contributions. The preceding relation constitutes a natural (complex) generalization of the corresponding classical link between the local information and entropy descriptors of Fisher and Shannon. It should also be observed that the (noncoherent) classical density of Shannon's global entropy is devoid of any phase content. A reference to Equation (19) indicates that the state information functional I[ψ] is proportional to the average kinetic energy of electrons, T[ψ] = ⟨ψ|T|ψ⟩. One also recalls that the average electronic energy in the state |ψ(t)⟩ combines the kinetic and potential contributions, where we have used the relevant integration by parts and V_ne[ψ] denotes the average energy of the electron-nuclei attraction. Expressing the SE in terms of the state modulus and phase components R and ϕ gives Equation (22). Its imaginary part determines the continuity equation for the wavefunction modulus component (see Equation (8)), where we have introduced the modulus flux associated with the real part of the state logarithm (17). The real components of Equation (22) similarly recover the phase dynamics of Equation (9). One also observes that combining the preceding equation (multiplied by i) with the modulus continuity of Equation (23) gives the state logarithmic continuity relation, which can also be recast to express the state logarithmic source. The (complex) logarithmic continuity relation further emphasizes the classical (real) character of the modulus and probability descriptors and the nonclassical (imaginary) nature of the phase and current state variables. It introduces the (complex) wavefunction current and identifies the (nonclassical) state source, stressing the phase origin of the state production of its overall information and entropy density.
This combined treatment also reveals two independent sources of the resultant entropy and information descriptors of Equations (18), (20), and (21), the additive components of the logarithmic separation of Equation (17).

Integral Productions of Information and Entropy Descriptors
It is also of interest to examine the integral sources of the resultant measures of the quantum global entropy or gradient information. The average production of the state complex entropy is obtained by recognizing the probability continuity of Equation (7) and then using Equation (30). The resulting expression again confirms the nonclassical (phase) origin of the complex-entropy source, which reflects the average of the local production σ_ϕ[ψ] of the state phase (see Equation (16)).
The integral production σ_S[ψ] of the state overall entropy can also be discussed in terms of the local source contribution per electron, σ_S(r), and the associated continuity relation, where J_S(r) stands for the density of the entropy current carried by the probability flux j_p(r). It then again follows from Equations (16) and (18) that the local source of the complex entropy has a purely nonclassical, phase origin: σ_S(r) = −2iσ_ϕ(r). One similarly determines the corresponding total time derivatives of the overall gradient measures of the state resultant information (Equation (20)) and entropy (Equation (21)). Using the probability and phase continuity relations, one ultimately obtains expressions for the average productions of these state functionals. These expressions reveal the complementary character of the gradient information and entropy, with a positive source of one implying a negative production of the other. The time derivatives of these overall functionals manifest the nonclassical (phase) origins of the molecular productions of the electronic gradient entropy and information descriptors [64]. These integral sources can also be expressed in terms of the electron current density j_p of Equation (3). In close analogy to irreversible thermodynamics [83], they are seen to be determined by the product of the local flux and affinity densities, j_p(r) and ∇σ_ϕ(r), respectively. Using Equations (11) and (16) finally gives an explicit expression for the phase-source gradient, from which it follows that only the wavefunction modulus and the shape of the external potential influence the affinity factor in the resultant information and entropy production of Equation (37). It should be recalled, however, that the phase gradient, ∇ϕ, determines the flux factor, j_p, in this product.
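The flux-affinity structure described here can be sketched as follows; this is a hedged reconstruction, up to the numerical factor of the original Equation (37), with the phase-dependent contribution to the affinity assumed to drop out via Equation (11), in line with the statement above:

```latex
% Integral information/entropy productions as flux x affinity:
\sigma_I[\psi] = \frac{dI[\psi]}{dt}
  \;\propto\; \int \mathbf{j}_p(\mathbf{r})\cdot
     \nabla\sigma_\varphi(\mathbf{r})\;d\mathbf{r},
\qquad
\sigma_M[\psi] = \frac{dM[\psi]}{dt} = -\,\sigma_I[\psi].
% Affinity factor: only the modulus R and the external potential v enter,
\nabla\sigma_\varphi(\mathbf{r})
  = \frac{\hbar}{2m}\,\nabla\!\Big(\frac{\nabla^{2}R}{R}\Big)
    - \frac{1}{\hbar}\,\nabla v .
```

The opposite signs of σ_I and σ_M express the complementarity of the gradient information and entropy productions noted in the text.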
The integral source of the gradient information (Equation (36)) can also be interpreted in terms of the local information source σ_I(r). The local continuity equation for the information density I(r) of Equation (20) also involves the information current density. The local continuity relations of Equation (37) again emphasize the nonclassical (phase) origin of the information source. One observes that only the presence of the state's local phase contribution generates a finite probability flow and a nonvanishing information production.

Isolated/Interacting and Open/Closed Subsystems
Consider the simplest case of the two-electron reactive complex consisting of the B⁺ (base, electron-donor) and A⁺ (acid, electron-acceptor) subsystems, each containing a single electron, at the polarization (P) stage R⁺ of the reactive system [63-65,67-71]. This model system involves the mutually-closed substrates at a finite distance R_AB between the two subsystems, with both (interacting) electrons, g(1,2) = r₁₂⁻¹ (a.u.), moving in the external potential v = v_A + v_B due to the fixed nuclei of the two geometrically "frozen" reactants. Their infinite separation, R_AB → ∞, results in the sum of isolated reactants {X⁰}, while the mutual opening of these molecular fragments in R* = (A*¦B*), at a finite distance R_AB, results in the global equilibrium state of R as a whole, after the optimum B→A charge transfer (CT): R*(1, 2) = A*(1, 2) + B*(1, 2).
As indicated above, such open interacting subsystems assume an effective two-electron character, since in R* the two electrons are indistinguishable. Therefore, both mutually-open subsystems effectively explore the probability distribution p(r) of the whole complex. The equilibrium subsystems {X*} are then characterized by the subsystem densities {ρ_X*(r)}, exhibiting fractional average numbers of electrons {N_X* = ∫ρ_X*(r)dr}. They generate the final, equilibrium molecular distribution of the whole reactive complex and define the optimum amount of the B→A CT. One further recalls that in the theory of chemical reactivity this (global) N_CT measure results from electronegativity-equalization (EE) considerations [84-90], based upon the chemical potential [91-95] and hardness/softness [96] or Fukui-function [97] derivative descriptors of the (mutually-closed) polarized subsystems in R⁺. This model scenario thus involves the one-electron Hamiltonians {h⁰(X)} of the isolated, infinitely separated fragments {X⁰(i)} in R∞ (see Equation (5)). Their eigenvalue problems define the (one-electron) orthonormal bases of the alternative complete sets of stationary states of the isolated fragments, also capable of representing any (two-electron) state Ψ(A, B) = Ψ(1, 2) of the whole reactive complex, for example, in the A expansion of Equation (43). The mutually- and externally-open, interacting parts of this model reactive system are in the mixed states described by the corresponding density matrices of subsystems [82]. Indeed, for the mutually-open, interacting fragments, a simple product representation of this (pure) quantum state, with each reactant described by a substrate wavefunction dependent exclusively on its own internal coordinates, is not available.
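The EE estimate of N_CT mentioned above is easy to illustrate numerically. In the quadratic model of subsystem energies, equalizing µ_A + η_A·N and µ_B − η_B·N gives N_CT = (µ_B − µ_A)/(η_A + η_B). The sketch below uses hypothetical chemical potentials and hardnesses (illustrative values, not taken from the source):

```python
# Electronegativity-equalization (EE) estimate of the B -> A charge transfer,
# in the quadratic (parabolic) model of subsystem energies:
#   mu_X(dN) = mu_X + eta_X * dN   (chemical potential mu, hardness eta).
# All numerical values below are hypothetical, for illustration only.

def n_ct(mu_a: float, mu_b: float, eta_a: float, eta_b: float) -> float:
    """Optimum B -> A charge transfer from the EE condition mu_A* = mu_B*."""
    return (mu_b - mu_a) / (eta_a + eta_b)

# Hypothetical descriptors (a.u.): the acid A has the lower chemical potential.
mu_a, eta_a = -0.30, 0.40   # acceptor A
mu_b, eta_b = -0.20, 0.35   # donor B

n = n_ct(mu_a, mu_b, eta_a, eta_b)
mu_eq_a = mu_a + eta_a * n   # chemical potential of A after gaining n electrons
mu_eq_b = mu_b - eta_b * n   # chemical potential of B after losing n electrons

print(f"N_CT = {n:.4f},  equalized mu: {mu_eq_a:.4f} = {mu_eq_b:.4f}")
```

The positive N_CT confirms the electron flow from the donor B (higher µ) to the acceptor A (lower µ), with both fragments ending at a common, equalized chemical potential.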
It exists only for the (disentangled) states of noninteracting subsystems in R∞ ≡ R⁰, Ψ⁰(1, 2) = ψ_A⁰(1)ψ_B⁰(2), with the distinguishable electrons attributed to the mutually-closed (c) reactants in the polarized reactive complex R⁺. The (two-electron) Hamiltonians describing the interacting subsystems in R_c⁺, at finite distances between reactants, generate the complete sets of their stationary states {Θ_uX(1, 2)}, which then define the alternative (two-electron) bases for expanding the "molecular" state Ψ(1, 2). The equilibrium reactive complex, R* = (A*¦B*), at a finite distance between the two mutually-open substrates, corresponds to the electronic Hamiltonian of the reactive complex as a whole, where h(1, 2) stands for the overall perturbation relative to the reference Hamiltonian H_R⁰(1, 2) in the separated-reactant limit (SRL). In Appendix A the energetic implications of the mutual opening of reactants are briefly examined using first-order perturbation theory. This molecular Hamiltonian determines the stationary states of the whole reactive complex R*. Their phase component is purely time dependent, thus giving rise to a vanishing phase gradient and hence zero probability current. A general "molecular" state Ψ(r₁, r₂) ≡ Ψ(1, 2) of two indistinguishable electrons, which determines the electron density ρ(r) = 2∫|Ψ(r, r₂)|²dr₂ = 2p(r), also characterizes all equilibrium reactants {X*} in R* = (A*¦B*), since all mutually-open fragments explore the same "molecular" probability distribution. Their (mixed) quantum states are represented by the corresponding density operators, for example, those corresponding to the applied (external) thermodynamic conditions. The reactive system coupled to an external heat bath B(T) and electron reservoir R(µ) would be represented by the equilibrium grand-ensemble establishing the statistical mixture of {Ψ_s(1, 2)}.
The state probabilities are then related to the absolute temperature T of B and the chemical potential µ of R [94,95]. One next expands a general (pure) state Ψ(1, 2) of R* in the stationary molecular basis {Ψ_s(1, 2)}. In this molecular state the expectation value of a property F_A of subsystem A, represented by the associated quantum operator F_A(1), is given by an ensemble-average expression involving the partial (fragment) trace operation and the subsystem density matrix ρ(A) [82,98]; this follows directly from the fragment expansion of Equation (44). The diagonal elements of this subsystem density matrix (see the normalization condition of Equation (44)) define the effective density operator of A in the molecular state Ψ, which determines the effective (mixed) state of this fragment in the reactive system. Its representation in the basis {ψ_w^A} of Equation (43) is diagonal. Therefore, by selecting F_A = ϕ_A in Equation (55), one obtains the representative average phase of A in Ψ.
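The partial-trace construction of the subsystem density matrix can be sketched numerically. This is a minimal example, assuming a two-fragment state expanded as Ψ = Σ C[w, w′] |w(A)⟩|w′(B)⟩ in orthonormal product bases; the coefficient matrix and the observable below are hypothetical:

```python
import numpy as np

# Hypothetical expansion coefficients C[w, w'] of an entangled two-fragment
# state in the orthonormal product basis {|w(A)> |w'(B)>}.
C = np.array([[0.8, 0.1],
              [0.2, 0.5]], dtype=complex)
C /= np.linalg.norm(C)               # normalize: sum of |C|^2 equals 1

# Reduced density matrix of fragment A: rho(A) = Tr_B |Psi><Psi| = C C^dagger.
rho_A = C @ C.conj().T

# Ensemble average of a (hypothetical) one-fragment observable F_A:
F_A = np.array([[0.0, 1.0],
                [1.0, 0.0]])
mean_F = np.trace(rho_A @ F_A).real

purity = np.trace(rho_A @ rho_A).real  # < 1 signals a mixed (entangled) state
print(f"Tr rho(A) = {np.trace(rho_A).real:.6f}, purity = {purity:.6f}, "
      f"<F_A> = {mean_F:.6f}")
```

Since the chosen C has rank 2, the fragment A is entangled with B and its reduced state is mixed (purity below one), even though the whole complex is in a pure state.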

Equidensity Orbital Systems
As an illustration, consider the EO configurations defined by Slater determinants of orbitals conserving the specified molecular probability distribution. In the HZM construction [76,77] of modern DFT [40-45], of the wavefunctions yielding the prescribed electron density, one introduces plane-wave-type EO which exactly reproduce the system probability distribution p(r). The density-dependent vector function f(r) = f_x(r)i + f_y(r)j + f_z(r)k = f[p; r], whose Jacobian determinant then assures the orbital orthonormality, enters the EO phases. Here, q_l = (q_{l,x}i + q_{l,y}j + q_{l,z}k) denotes the (constant) reduced momentum (wave-number) vector of the EO, and Φ_l(r) stands for its resultant phase. The latter is defined by the sum of the orthogonality phase F_q(r) and its local "thermodynamic" supplement ϕ(r), common to all occupied EO of the electron configuration under consideration. Notice that in this HZM representation all orbital components are described by the local resultant phases {Φ_l(r) = Φ_l[p; r], l = 1, 2, . . . , N} originating from the same overall probability density p(r). The resultant local phase Φ_l(r) also generates the associated orbital current of Equation (3). The optimum "thermodynamic" contribution ϕ_opt.(r), common to all occupied EO reconstructing the given electron density ρ(r) = Np(r), is determined from the subsidiary minimum-information principle [53-57], which relates this phase contribution to the average wave vector of the configuration under consideration [78,79]. The occupied EO in the HZM product state represent the closed (c) orbital system Ψ_c(N), with each EO containing a single (distinguishable) electron, {n_l⁺ = 1}.
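The orthonormality mechanism of such equidensity orbitals can be demonstrated in one dimension, where the analog of f[p; r] reduces to the (scaled) cumulative distribution. This is a hedged numerical sketch (the Gaussian p(x) is chosen arbitrarily): the orbitals φ_k(x) = √p(x)·exp[i k f(x)], with f(x) = 2π∫_{−∞}^x p dx′ and integer k, all reproduce p(x) yet are mutually orthonormal:

```python
import numpy as np

# Grid and an arbitrary normalized 1D probability density p(x).
x = np.linspace(-8.0, 8.0, 8001)
dx = x[1] - x[0]
p = np.exp(-x**2) / np.sqrt(np.pi)

# 1D analog of the density-dependent HZM function f[p; x]: it maps the real
# line onto [0, 2*pi) with "Jacobian" 2*pi*p(x) (trapezoid-like cumulative sum).
f = 2.0 * np.pi * dx * (np.cumsum(p) - 0.5 * p)

# Equidensity orbitals phi_k(x) = sqrt(p) * exp(i * k * f(x)), integer k.
ks = np.arange(-2, 3)
phi = np.sqrt(p) * np.exp(1j * np.outer(ks, f))

# Overlap matrix <phi_k|phi_l>: substituting u = f(x) turns each entry into
# (1/2pi) * integral of exp(i(l-k)u) over [0, 2pi], i.e., the Kronecker delta.
S = (phi.conj() * dx) @ phi.T
print(np.round(np.abs(S), 3))
```

Every orbital has |φ_k(x)|² = p(x), so each occupied EO indeed "carries" the same probability density, while the phases alone enforce orthogonality.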
These (disentangled) nonbonded orbital components are distinguished by their EO phases, with a different wave vector attributed to each orbital (see Equation (63)). Their mutual opening in Ψ*(N), although still precluding the net electron flows between orbitals, due to the limiting occupations {n_l* = 1}, now formally opens electronic exchanges, since in the determinantal state all electrons are indistinguishable (see also Appendix A).
The mutually-open state of orbital subsystems can also involve the external (thermodynamic) coupling of these orbital components to the heat bath B(T) and the ("molecular") electron reservoir R(µ) in the (macroscopic) composite system M(N). This mutual and external opening of the EO fragments in M(N) implies their effectively "bonded" (entangled) character. It is reflected by their fractional orbital occupations {0 < n_l*(µ, T) < 1}, marking partial electron outflows to the initially unoccupied (virtual) EO. Indeed, the externally open, thermodynamic orbital fragments must be described by the statistical mixture of the EO states {|φ_l⟩}, defined by the equilibrium density operator with the equilibrium orbital probabilities {P_l(µ, T)} reflecting the applied thermodynamic conditions, i.e., the chemical potential µ of the reservoir and the absolute temperature T of the heat bath. This mixed state of the mutually-open (bonded, entangled) orbital components corresponds to an equalized (average) phase intensity and a common level of the chemical potential, fixed by the electron reservoir. The equilibrium probability of the φ_l "subsystem" in such an EO grand-ensemble, P_l(µ, T) = Ξ_EO⁻¹ exp[β(µn_l* − e_l)], is determined by the thermodynamic parameters µ and T, the equilibrium (fractional) orbital occupations {0 < n_l* < 1}, and the orbital energies {e_l = ⟨φ_l|H|φ_l⟩}. Here, Ξ_EO(µ, T) = Σ_l exp[β(µn_l* − e_l)] denotes the EO grand-partition function, β = (k_B T)⁻¹, and k_B is the Boltzmann constant.
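The EO grand-ensemble probabilities quoted above are straightforward to evaluate. A minimal sketch with hypothetical orbital energies and fractional occupations (illustrative values only, in atomic units):

```python
import math

# Hypothetical EO data: orbital energies e_l and fractional occupations n_l*.
e = [-0.50, -0.30, -0.10]      # orbital energies (a.u.)
n = [0.95, 0.60, 0.45]         # equilibrium occupations, 0 < n_l* < 1
mu, kT = -0.25, 0.05           # reservoir chemical potential, k_B * T (a.u.)

beta = 1.0 / kT
weights = [math.exp(beta * (mu * nl - el)) for el, nl in zip(e, n)]
Xi = sum(weights)                       # EO grand-partition function
P = [w / Xi for w in weights]           # equilibrium EO probabilities P_l(mu, T)

print([round(pl, 4) for pl in P])
```

As expected, the low-energy, strongly occupied orbital dominates the mixture, while the probabilities remain properly normalized by the grand-partition function.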

Electron Communications
Let us now examine the molecular electron communications between the states of isolated subsystems in the donor-acceptor reactive system, R = A--B. For reasons of specificity and simplicity we again refer to the two-electron scenario of Section 5. The full information system of the probability scattering between stationary states of isolated reactants in such a molecular complex involves four blocks of conditional probabilities, defining the internal (diagonal) and external (off-diagonal) blocks of electronic communications within and between reactants, respectively. In the Communication Theory of the Chemical Bond [9-11,22-32], the former determine the intra-reactant bonds, i.e., the substrate polarization and activation accompanying the chemical reaction, while the latter reflect the inter-reactant bond pattern, which directly probes the reactivity behavior. In what follows, we focus on this external part of electronic communications alone, examining the communications described by the molecular probabilities P(A→B).
In accordance with the superposition principle (SP) of QM [99,100], the conditional probability P[w′(B)|w(A)] ≡ P_{w→w′}(A→B) of observing the output state |ψ_{w′}^B⟩ ≡ |w′(B)⟩ of the "receiver" part B of this partial reactive network, given the input state |ψ_w^A⟩ ≡ |w(A)⟩ in its "source" part A, is determined by the squared modulus of the corresponding scattering amplitude A[w′(B)|w(A)] ≡ A_{w→w′}(A→B), measuring their mutual projection in the molecular Hilbert space [62,100]. These probabilities satisfy the relevant normalization involving summation over the complete set of all monitoring states in this inter-reactant communication "device", since the sum of the state projections {P_{w′}(B) = |w′(B)⟩⟨w′(B)|} then amounts to the identity operator. Of interest also is the doubly conditional probability scattering of the |w(A)⟩ → |w′(B)⟩ communication in the specified "parameter" state |Ψ⟩ of the whole reactive complex. This communication involves the intermediate state |Ψ⟩ in the bridge communication [11,47,51,62]. Its amplitude can thus be regarded as that of a single-cascade communication, determined by the product of the two-stage amplitudes, where P_Ψ stands for the projection operator onto the "molecular" reference state. The associated conditional probabilities then satisfy the intermediate ("bridge") normalization condition of Equation (80). Let us now examine more closely such A→B communications in the molecular state Ψ = Ψ(1, 2) between the stationary states {ψ_w^A(1)} and {ψ_{w′}^B(2)} of the isolated reactants.
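The single-cascade (bridge) probability product can be illustrated numerically. This is a hedged sketch with random finite-dimensional stand-ins for the states involved: the input |w(A)⟩, the bridge |Ψ⟩, and a complete orthonormal output basis {|w′(B)⟩} are all modeled as vectors in one d-dimensional space:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4

def random_state(dim: int) -> np.ndarray:
    """Random normalized complex vector standing in for a quantum state."""
    v = rng.normal(size=dim) + 1j * rng.normal(size=dim)
    return v / np.linalg.norm(v)

w_in = random_state(d)            # input state |w(A)>
psi = random_state(d)             # molecular "parameter" state |Psi>

# Complete orthonormal output basis {|w'(B)>}: rows of a random unitary.
M = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
Q, _ = np.linalg.qr(M)
outs = Q.T

# Two-stage cascade A(w -> Psi -> w') = <w'|Psi><Psi|w>, and the stage
# probabilities as squared moduli of the projections:
p_in_bridge = abs(np.vdot(psi, w_in)) ** 2          # P(w -> Psi)
p_bridge_out = np.abs(outs.conj() @ psi) ** 2       # P(Psi -> w'), all w'

# Bridge normalization: the second stage sums to 1 over the complete output
# set, so the cascade probabilities sum back to the first-stage probability.
p_cascade = p_in_bridge * p_bridge_out
print(p_bridge_out.sum(), p_cascade.sum() / p_in_bridge)
```

Completeness of the monitoring output set is what enforces the bridge normalization, mirroring the resolution-of-identity argument in the text.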
They determine the corresponding stage projections, the associated (local) bridge amplitude, and the resulting density of two-electron probabilities. Their global analogs, measuring probabilities between states rather than between locations within states, involve integrations of such local scattering densities over all possible locations of Electron 1 in the network source state, ψ_w^A(1), and of Electron 2 in the system receiver state, ψ_{w′}^B(2); these determine the stage probabilities in the cascade of Equation (83). The double-conditional scattering probabilities between states of isolated reactants indeed observe the bridge normalization of Equation (80). This inter-reactant, external communication system, defined by the blocks P(A→B) and P(B→A) of the conditional probabilities in this resolution of stationary states of isolated reactants, ultimately generates the entropic multiplicities (in bits) of the chemical bonds between the two substrates [9-11,22-32,62]. For a given molecular state of the whole reactive system, the conditional entropy of the output states, given the input states, defines the overall IT covalency of the inter-reactant bonds, a measure of the information noise in the underlying communication system, while the complementary descriptor of the overall bond IT ionicity reflects the mutual information in these reactant states, a measure of the information flow between the two subsystems. Elsewhere [101] we have examined the internal communications {P(X→X)} in interacting subsystems, which shape the electronic structure of the polarized reactants. They have been shown to be determined by the fragment density matrices of Equations (55), (84), and (85).
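The IT covalency/ionicity split can be computed from any joint probability block of this kind. A minimal sketch with a hypothetical 2×2 joint distribution P(a, b) of input/output states, showing the standard identity S(B) = S(B|A) + I(A:B) behind the covalent/ionic complementarity (all logarithms base 2, descriptors in bits):

```python
import math

# Hypothetical joint probabilities P(a, b) of input states a (rows, in A)
# and output states b (columns, in B); they sum to 1.
P = [[0.40, 0.10],
     [0.05, 0.45]]

pa = [sum(row) for row in P]               # marginal input distribution P(a)
pb = [sum(col) for col in zip(*P)]         # marginal output distribution P(b)

# IT covalency: conditional entropy S(B|A), the communication "noise".
S_cond = -sum(P[a][b] * math.log2(P[a][b] / pa[a])
              for a in range(2) for b in range(2))

# IT ionicity: mutual information I(A:B), the information "flow".
I_mut = sum(P[a][b] * math.log2(P[a][b] / (pa[a] * pb[b]))
            for a in range(2) for b in range(2))

S_B = -sum(q * math.log2(q) for q in pb)   # total output entropy S(B)
print(round(S_cond, 4), round(I_mut, 4))
```

The noise and flow contributions exactly exhaust the output entropy, which is the information-theoretic content of the covalency/ionicity complementarity invoked above.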

Conclusions
Due to Heisenberg's uncertainty principle of QM, the sharply specified locations of electrons in the position representation of quantum states, which defines the molecular wavefunction and the associated probability distribution in physical space, preclude the corresponding precise specification of electronic momenta. Therefore, only an effective measure of the latter, consistent with the probability-flux definition, is available in the quantum description. The current-per-particle measure of the probability velocity, which itself combines the incompatible position and momentum variables of electrons, appears as a natural choice for such an effective local "velocity" descriptor, which gives rise to its vanishing divergence in molecular QM [101]. This simplifies the local continuity considerations for the electronic probability "fluid" and separates the "static" aspect of the electronic probability density, determined by the modulus component of the molecular wavefunction, from the "dynamic" feature of the electronic current distribution, reflecting the phase gradient of the molecular quantum state. To paraphrase Prigogine [102], the former reflects the state's (static) electronic structure "of being" while the latter constitutes its (dynamic) structure "of becoming". The classical (probability) and nonclassical (current) degrees-of-freedom of molecular states thus respectively determine these two structural patterns. Both patterns carry the information contained in the system's (complex) quantum electronic state and contribute to the overall (resultant) entropy and information descriptors.
The distributions of electrons and their current in a molecule determine the classical (modulus) and nonclassical (phase) contributions to the overall information content of the system quantum state. The minimum of the average resultant gradient measure of information, the expectation value of (dimensionless) kinetic energy of electrons, then, establishes the information equilibria in the whole molecular system and its fragments, reflected by the local ("thermodynamic") phase contribution. The phase aspect of such generalized, phase-transformed equilibrium states is vital for the coherent propagation of electronic communications in molecules. It also distinguishes the bonded (entangled) and nonbonded (disentangled) states of reactants.
In the present analysis, we have reexamined the probability and phase continuities in QM and summarized the resultant measures of the information and entropy content, combining the classical and nonclassical contributions. The additive resolution of the wavefunction logarithm has generated the (complex) state continuity relation, with the relevant source contribution identified as the (imaginary) phase production. We have also discussed the sources of such overall entropy and information descriptors.
The states of isolated and interacting reactants in a simple model of the reactive (donor-acceptor) system have been explored in some detail. We have emphasized the mixed character of electronic states in the entangled (interacting) molecular fragments. Indeed, such subsystems have been shown to be described by their partial density operators, with the quantum expectations of reactant properties determined by the subsystem density matrices for the specified (pure) molecular state. Information principles using the resultant entropy and information measures have also been used to determine phase equilibria [51-57] in molecular systems and their constituent parts, marking the extreme values of alternative overall measures of electronic entropy (uncertainty, "disorder") or information (determinicity, "order") content in electronic wavefunctions. These "thermodynamic" states represent phase-transforms of molecular wavefunctions and generate finite equilibrium currents.
As an illustration, the disentangled (mutually closed) and thermodynamically entangled (mutually open) EO systems of the HZM construction have been examined. In this "plane-wave" type representation the fixed electron densities of molecular fragments generate finite electronic currents due to nonvanishing (local) EO phases, and hence also finite nonclassical contributions to the resultant IT descriptors.
Funding: This research received no external funding.

Conflicts of Interest:
The authors declare no conflict of interest.

Appendix A. Mutual Opening of Reactants
In what follows, for simplicity, we "freeze" the (real, positively overlapping) orbitals of the reactants in the model system of Section 5, {ψ_X⁰ = ψ_X⁺ ≡ ψ_X}, S_AB = ⟨ψ_A|ψ_B⟩ ≡ ⟨A|B⟩ ≥ 0, with the two electrons, exhibiting the same spin orientation, moving in the molecular external potential v = v_A + v_B due to the fixed nuclei of both complementary fragments. One first evaluates the expectation value (in atomic units) of the perturbation h(1, 2) (Equation (49)), reflecting the average interaction relative to the SRL, in the state describing the mutually-closed (c) subsystems. The mutual opening (o) of these orbital subsystems then gives rise to the associated determinantal state, for which the average interaction between reactants corrects the Coulomb repulsion energy by the effect of the Fermi correlation between electrons. Therefore, the energetic effect associated with the mutual opening of these orbital subsystems contains the one-electron contributions, due to the external-potential terms, and the two-electron exchange stabilization. This "opening" energy is symmetrical with respect to the A↔B exchange. It should be negative, since the mutual opening of the subsystems introduces a higher degree of variational flexibility compared to the product state describing the polarized fragments.
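A hedged sketch of the standard one-determinant algebra behind these statements, for two same-spin electrons in orthogonalized orbitals (the overlap and normalization corrections of the original equations are omitted here):

```latex
% Closed (product) state: bare Coulomb repulsion between reactant densities,
\langle \Psi_c | g(1,2) | \Psi_c \rangle = J_{AB}
  = \int\!\!\int \frac{|\psi_A(1)|^{2}\,|\psi_B(2)|^{2}}{r_{12}}\;d1\,d2 .
% Open (determinantal) state: Fermi correlation adds the exchange term,
\langle \Psi_o | g(1,2) | \Psi_o \rangle = J_{AB} - K_{AB},
\qquad
K_{AB} = \int\!\!\int
  \frac{\psi_A(1)\,\psi_B(1)\;\psi_B(2)\,\psi_A(2)}{r_{12}}\;d1\,d2 .
```

The two-electron part of the "opening" energy is thus the exchange stabilization −K_AB, which is symmetric under A↔B and lowers the energy, consistent with the sign argument above.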