Article

Resultant Information Descriptors, Equilibrium States and Ensemble Entropy

by Roman F. Nalewajski
Department of Theoretical Chemistry, Jagiellonian University, Gronostajowa 2, 30-387 Cracow, Poland
The following notation is adopted: A denotes a scalar, A stands for a vector, A represents a matrix, while Â is the quantum-mechanical operator of property A. The logarithm of Shannon’s information measure is taken to an arbitrary but fixed base: log = log2 corresponds to the information content measured in bits (binary digits), while log = ln expresses the amount of information in nats (natural units): 1 nat ≈ 1.44 bits.
Professor Emeritus.
Entropy 2021, 23(4), 483; https://doi.org/10.3390/e23040483
Submission received: 1 March 2021 / Revised: 14 April 2021 / Accepted: 16 April 2021 / Published: 19 April 2021
(This article belongs to the Special Issue Entropic and Complexity Measures in Atomic and Molecular Systems)

Abstract:
In this article, sources of information in electronic states are reexamined and a need for the resultant measures of the entropy/information content, combining contributions due to probability and phase/current densities, is emphasized. Probability distribution reflects the wavefunction modulus and generates classical contributions to Shannon’s global entropy and Fisher’s gradient information. The phase component of molecular states similarly determines their nonclassical supplements, due to probability “convection”. The local-energy concept is used to examine the phase equalization in the equilibrium, phase-transformed states. Continuity relations for the wavefunction modulus and phase components are reexamined, the convectional character of the local source of the resultant gradient information is stressed, and latent probability currents in the equilibrium (stationary) quantum states are related to the horizontal (“thermodynamic”) phase. The equivalence of the energy and resultant gradient information (kinetic energy) descriptors of chemical processes is stressed. In the grand-ensemble description, the reactivity criteria are defined by the populational derivatives of the system average electronic energy. Their entropic analogs, given by the associated derivatives of the overall gradient information, are shown to provide an equivalent set of reactivity indices for describing the charge transfer phenomena.

1. Introduction

In this conceptual work we focus on the overall entropy/information content of electronic wavefunctions in the position representation of quantum mechanics (QM). Such quantum states are described by (complex) vectors in the molecular Hilbert space or by their statistical mixtures. Each state vector is characterized by its modulus (“length”) and phase (“orientation”) components in the complex plane. The square of the former determines the classical descriptor of the state probability distribution, while the gradient of the latter generates the density of electronic current and the associated velocity field reflecting the probability convection. These physical descriptors summarize different aspects of the state electronic structure: the probability density represents the static “structure of being”, while its flux characterizes the dynamic “structure of becoming” [1]. Indeed, in the underlying continuity equation for the (sourceless) probability distribution, the divergence of electronic flow, which shapes its time dependence, determines the local outflow of the probability density. The fundamental Schrödinger equation (SE) of QM ultimately determines the time evolutions of the state wavefunction itself, its components, and expectation values of all physical observables.
As complementary descriptors of electronic structure and reactivity phenomena, both the modulus and phase parts of molecular states contribute to the overall (resultant) content of their entropy (uncertainty) and information (determinicity) descriptors [2,3,4,5,6,7,8,9]. The need for such generalized information-theoretic (IT) measures of the entropy/information content in electronic states has been emphasized elsewhere [10,11,12,13,14]. Such descriptors combine the classical terms due to wavefunction modulus (or probability density), and the nonclassical contributions generated by the state phase (or its gradient determining the convection velocity). The overall gradient information, the quantum extension of Fisher’s intrinsic accuracy functional for locality events, then represents the dimensionless measure of the state electronic kinetic energy [2,3,4,5,6,7,8,9,10,11,12,13,14,15]. This proportionality relation between the state resultant information content and the average kinetic energy of electrons ultimately allows applications of the molecular virial theorem [16,17,18,19,20,21,22,23,24,25,26,27,28,29] in an information interpretation of the chemical bond and reactivity phenomena [4,6,9,30].
In principle, all these entropic contributions can be extracted by an experimental removal of the position and momentum uncertainties in the system quantum state [31,32]. For the parametrically specified particle location in the position representation of QM, both the probability distribution and its effective convectional velocity (current-per-particle) are uniquely specified by the system wavefunction. Therefore they both constitute bona fide sources of the information contained in electronic states, fully accessible in the separate position and momentum experiments.
In the stationary bound states, for the sharply specified energy, time-independent probability distribution and purely time-dependent phase, the local phase component and probability convection identically vanish. It will be argued, however, that their (phase-transformed) equilibrium analogs exhibit latent electronic fluxes along probability contours, which do not affect the stationary probability density. These flows are related to the state local (“thermodynamic”) phase component, proportional to the negative logarithm of probability density, for which the internal resultant IT descriptor of electronic state vanishes. Therefore, for this equilibrium criterion the average entropy measure in thermodynamic states becomes identical with von Neumann’s entropy [33], the function of external state probabilities defining the density operator of the ensemble mixed state.
In this analysis the quantum dynamics and continuity relations for the modulus (probability) and phase (current) degrees-of-freedom of electronic states are reexamined and their contributions to the resultant entropy/information descriptors are identified. The convection character of the net source of resultant gradient information is stressed, and equivalence of the energy and information criteria of chemical reactivity is emphasized. A distinction between classical (probability) and quantum (wavefunction) mappings is briefly discussed and the convection velocity of probability “fluid” is used to define fluxes of general physical and information properties. In such an approach, the system electrons thus act as carriers of the property densities. The latent electronic flows in the quantum stationary equilibrium, which do not affect the probability distribution, are also examined in some detail. Their quantum dynamics is examined and related to the “horizontal” phase component of “thermodynamic” equilibrium states. The local energy, probability acceleration, and force concepts are related to the state phase equalization and production. It is stressed that, contrary to the sourceless classical IT measures, the resultant descriptors exhibit finite local productions due to their nonclassical contributions.

2. Local Energy and Phase Equalization

Consider, for simplicity, the quantum state |ψ(t)〉 of a single electron at time t, and the associated (complex) wavefunction in position representation,
ψ(r, t) = 〈r|ψ(t)〉 = R(r, t) exp[iφ(r, t)],
defined by its modulus R(r, t) and phase φ(r, t) ≥ 0 parts. The state logarithm then additively separates these two independent components:
2lnψ(r, t) = 2lnR(r, t) + 2iφ(r, t) = lnp(r, t) + 2iφ(r, t),
where p(r, t) = R(r, t)2 denotes the particle spatial probability density. Its real part determines the logarithm of the state classical (probability) component, while the imaginary part accounts for the nonclassical (phase) distribution:
Re[2ln ψ(r, t)] = 2lnR(r, t) = lnp(r, t) and Im[lnψ(r, t)] = φ(r, t).
The electron is moving in the external potential v(r), due to the fixed positions of the system constituent nuclei. In this Born–Oppenheimer (BO) approximation the (Hermitian) electronic Hamiltonian
H(r) = −[ħ2/(2m)]Δ + v(r) ≡ T(r) + v(r)
determines the quantum dynamics of this molecular state, in accordance with the time-dependent SE:
iħ [∂ψ(r, t)/∂t] = H(r) ψ(r, t).
This fundamental equation and its complex conjugate ultimately imply the associated dynamic equations for the wavefunction components or temporal evolutions of the associated physical distributions of the spatial probability and current densities (see the next section).
Consider the stationary state corresponding to the sharply specified energy Est.,
ψst.(r, t) = Rst.(r) exp[−i(Est./ħ)t] = Rst.(r) exp(−iωst.t)
where φst.(r, t) = −ωst.t ≡ φst.(t). In this state the probability distribution is time-independent,
pst.(r, t) = |ψst.(r, t)|2 = Rst.(r)2 ≡ pst.(r),
and the probability current exactly vanishes:
jst.(r, t) = (ħ/2mi) {ψst.(r, t)* ∇ψst.(r, t) − ψst.(r, t) ∇ψst.(r, t)*}
= (ħ/m) pst.(r) ∇φst.(t) = 0.
These eigenstates of electronic Hamiltonian,
H(r) ψst.(r, t) = Est. ψst.(r, t) or H(r) Rst.(r) = Est. Rst.(r)
correspond to the spatially equalized local energy
E(r, t) ≡ ψ(r, t)−1 H(r) ψ(r, t),
Est.(r) ≡ Rst.(r)−1 H(r)Rst.(r) = Est..
This equalization principle can be also interpreted as the related equalization rule for the state spatial phase. Indeed, introducing the local wave-number/phase concepts,
ω(r, t) ≡ E(r, t)/ħ and φ(r, t) = −ω(r, t) t,
directly implies their spatial equalization in the stationary electronic state:
ωst.(r, t) = Est./ħ = ωst. = const. and
φst.(r, t) = −(Est./ħ)t = −ωst.t = φst.(t).
The stationary equilibrium in QM is thus marked by the local phase equalization throughout the whole physical space. It should be realized that due to the complex nature of wavefunctions, the local energy of Equation (10) is also complex in character: E(r, t) ≠ E(r, t)*. This further implies the complex concepts of the local phase or wave-number,
ω(r, t) = c(r, t) + i b(r, t),
c(r, t) = Re[ω(r, t)] = [ω(r, t) + ω(r, t)*]/2, b(r, t) = Im[ω(r, t)] = [ω(r, t) − ω(r, t)*]/(2i),
which determines dynamic equations for the additive components of the state wavefunction of Equations (2) and (3). Rewriting SE in terms of complex wave-number components gives:
∂lnψ(r, t)/∂t = ψ(r, t)−1 [∂ψ(r, t)/∂t] = ∂lnR(r, t)/∂t + i ∂φ(r, t)/∂t
= −iω(r, t) = −ic(r, t) + b(r, t).
The real terms in this complex equation determine the modulus dynamics,
∂lnR(r, t)/∂t = b(r, t),
while its imaginary terms determine the time evolution of the wavefunction phase:
∂φ(r, t)/∂t = −c(r, t).
For the explicit SE identification of these wave-number components, the reader is referred to Equations (65) and (66) in Section 5.
To summarize, the (complex) local energy generates a transparent description of the time evolution of wave-function components: its real contribution shapes the phase dynamics, while the modulus dynamics is governed by the imaginary components of E(r, t) or ω(r, t). In QM the spatial equalization of these wave-number or local-phase concepts marks the stationary state corresponding to the sharply specified energy, purely time-dependent phase, and time-independent probability distribution. We argue in Section 7 and Section 8 that these equilibrium states may still exhibit finite “hidden” flows of electrons, along probability contours, which can be associated with the local “horizontal” phase defining the phase-transformed, “thermodynamic” states.
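The spatial equalization of the local energy of Equation (10) in a stationary state is easy to verify numerically. The following is a minimal sketch, not taken from the article: assuming units ħ = m = ω = 1, the harmonic-oscillator ground-state modulus Rst.(x) = π−1/4 exp(−x2/2) should yield a spatially constant local energy Est. = 1/2.

```python
import numpy as np

# Harmonic-oscillator ground state in assumed units hbar = m = omega = 1,
# with R_st(x) = pi^(-1/4) exp(-x^2/2) and exact eigenvalue E_st = 1/2.
x = np.linspace(-4.0, 4.0, 2001)
h = x[1] - x[0]
R = np.pi**-0.25 * np.exp(-x**2 / 2)
v = 0.5 * x**2                                   # external potential v(x)

# Laplacian of R by central differences (interior grid points only)
lapR = (R[2:] - 2.0*R[1:-1] + R[:-2]) / h**2

# Local energy E(x) = R(x)^{-1} H R(x), with H = -(1/2) d^2/dx^2 + v(x)
E_local = (-0.5*lapR + v[1:-1]*R[1:-1]) / R[1:-1]

# Spatial equalization: E(x) = E_st = 1/2 at every point
print(E_local.min(), E_local.max())   # both ≈ 0.5
```

The constancy of E_local across the grid (up to discretization error) is the numerical counterpart of the phase-equalization rule of Equations (15) and (16).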

3. Origins of Information Content in Electronic States

The independent (real) parts of the complex electronic wavefunction of an electron in Equation (1) ultimately define the state physical descriptors of the spatial probability density p(r, t) = R(r, t)2 and its current
j(r, t) = (ħ/m) p(r, t) ∇φ(r, t) ≡ p(r, t) V(r, t).
The effective probability velocity introduced in the preceding equation measures a density of the current-per-particle,
V(r, t) ≡ P(r, t)/m = (ħ/m) ∇φ(r, t) ≡ j(r, t)/p(r, t),
and reflects the local convection momentum P(r, t) ≡ ħ k(r, t), with k(r, t) = ∇φ(r, t) standing for its wave-vector factor.
The real and imaginary components of Equation (3), in the wavefunction logarithm of Equation (2), determine the independent probability and velocity densities, respectively. They account for the “static” and “dynamic” (convection) aspects of the state probability distribution, which we call the molecular structures of “being” and “becoming”. Both these organization levels ultimately contribute to the overall entropy or gradient-information contents in quantum electronic states and their thermodynamic mixtures [2,10,11,12,13,14].
The probability IT functionals S[p] and I[p], due to the logarithm of the state probability density of Equation (2), constitute the classical IT concepts of Shannon’s global entropy [34,35],
S[p] = −∫p(r, t) lnp(r, t) dr,
and Fisher’s information functional for locality events [36,37]:
I[p] = ∫p(r, t) [∇lnp(r, t)]2dr = ∫p(r, t)−1 [∇p(r, t)]2 dr.
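Both classical functionals are readily evaluated on a grid. A minimal numerical sketch, not from the article, assuming an illustrative one-dimensional normal density with standard deviation σ, for which analytically S[p] = ½ ln(2πeσ2) and I[p] = 1/σ2:

```python
import numpy as np

# Illustrative normal probability density with standard deviation sigma
sigma = 1.3
x = np.linspace(-12.0, 12.0, 4001)
dx = x[1] - x[0]
p = np.exp(-x**2 / (2*sigma**2)) / (sigma*np.sqrt(2*np.pi))

# Shannon global entropy S[p] = -int p ln p dx
S = -np.sum(p * np.log(p)) * dx

# Fisher locality information I[p] = int (grad p)^2 / p dx
dp = np.gradient(p, x)
I = np.sum(dp**2 / p) * dx

print(S, 0.5*np.log(2*np.pi*np.e*sigma**2))   # both ≈ 1.681
print(I, 1.0/sigma**2)                        # both ≈ 0.592
```

The two descriptors behave oppositely under narrowing of p: S[p] decreases (less spread) while I[p] grows (more compactness), as the text's "width" versus "height" reading suggests.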
In the associated resultant measures [2,10,11,12,13,14] these probability functionals are supplemented by the average nonclassical contributions S[φ] and I[φ] = I[j], due to the state phase or its gradient generating the probability velocity:
S[ψ] = S[p] − 2∫p(r, t) φ(r, t) dr ≡ S[p] − 2〈φ〉ψ = S[p] + S[φ] = S[p, φ], and
I[ψ] = I[p] + 4∫p(r, t) [∇φ(r, t)]2dr = I[p] + I[φ] = I[p, φ]
= I[p] + (2m/ħ)2 ∫p(r, t)−1 j(r, t)2 dr = I[p] + I[j] = I[p, j].
We also introduce the combined measure of the gradient-entropy,
M[ψ] = M[p] + M[φ] ≡ I[p] − I[φ].
The nonclassical entropy terms S[φ] and M[φ] ≡ −I[φ] = −I[j] are negative since the current pattern introduces an extra dynamic “order” into the system electronic “organization”, compared to the corresponding classical descriptors S[p] and M[p] = I[p], thus decreasing the state overall “uncertainty” content. These generalized descriptors of the resultant uncertainty (entropy) content S[ψ] in the quantum state ψ, or of its overall (gradient) information I[ψ] [2,10,11,12,13,14], have been used to describe the phase equilibria in the substrate subsystems and to monitor electronic reconstructions in chemical reactions [3,4,5,13,14,38,39,40].
To summarize, in the resultant IT descriptors of the pure quantum state ψ, the classical probability functionals, of Shannon’s global entropy or Fisher’s intrinsic accuracy for locality events, are supplemented by the corresponding nonclassical complements S[φ] or I[φ] = I[j], respectively, due to the wavefunction phase or the electronic current it generates. In the overall (“scalar”) entropy [2,10], the (positive) classical descriptor is combined with the (negative) average phase contribution,
S[ψ] = −∫p(r, t)[ln p(r, t) + 2φ(r, t)] dr ≡ ∫p(r, t) S(r, t) dr,
while the complex (“vector”) entropy [2,12] represents the expectation value of the state (non-Hermitian) entropy operator S = −2lnψ:
S[ψ] = 〈ψ|S|ψ〉 = −∫p(r, t)[ln p(r, t) + 2iφ(r, t)] dr ≡ ∫p(r, t) S(r, t) dr ≡ S[p] + i S[φ] = S[p, φ].
Therefore, the negative nonclassical entropy effectively lowers the state classical uncertainty measure S[p]. Indeed, the presence of finite currents implies more state spatial “order”, i.e., less electronic “disorder”. The resultant measure of the state average gradient information [2,10,11,12,13,14,15],
I[ψ] = 4〈∇ψ|∇ψ〉 = −4〈ψ|Δ|ψ〉 = (8m/ħ2)〈ψ|T|ψ〉 ≡ κT[ψ]
= ∫p(r, t){[∇lnp(r, t)]2 + 4[∇φ(r, t)]2} dr ≡ ∫p(r, t) I(r, t) dr,
then reflects the (dimensionless) kinetic energy of electrons: T[ψ] = 〈ψ|T|ψ〉 = κ−1 I[ψ].
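The additive split of the resultant gradient information into classical and current contributions, and hence its proportionality to the kinetic energy, can likewise be checked on a grid. A sketch under assumed units ħ = m = 1, for an illustrative Gaussian-modulus state carrying the linear phase φ(x) = kx (constant probability velocity):

```python
import numpy as np

# Gaussian-modulus wavefunction with linear phase phi = k x (hbar = m = 1)
sigma, k = 0.9, 2.0
x = np.linspace(-10.0, 10.0, 4001)
dx = x[1] - x[0]
R = (2*np.pi*sigma**2)**-0.25 * np.exp(-x**2 / (4*sigma**2))
psi = R * np.exp(1j*k*x)
p = R**2

# Resultant gradient information I[psi] = 4 <grad psi | grad psi>
dpsi = np.gradient(psi, x)
I_psi = 4.0 * np.sum(np.abs(dpsi)**2) * dx

# Classical term I[p] plus nonclassical current term 4 int p (grad phi)^2 dx
dp = np.gradient(p, x)
I_p = np.sum(dp**2 / p) * dx
I_phi = 4.0 * k**2 * np.sum(p) * dx

print(I_psi, I_p + I_phi)   # I[psi] = I[p] + I[phi]
```

With ħ = m = 1 the electronic kinetic energy is then simply I_psi/8, the dimensionless proportionality stated above.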
In both classical IT and the position representation of QM, the admissible locations {r} of an electron exhaust the whole physical space and constitute the complete set of elementary particle-position events. The associated infinite and continuous probability scheme of the classical mapping {r → p(r)} in Figure 1 thus describes a state of the position indeterminacy (uncertainty). It is best reflected by Shannon’s global entropy S[p], measuring a “spread” (width) of the probability distribution, since we know only the probabilities p(r) = |ψ(r)|2 of possible definite outcomes of the underlying localization experiment in the pure quantum state ψ. Another suitable classical probe of the average information content in p(r) is provided by Fisher’s probability functional I[p]. This gradient measure of the position determinacy reflects the “compactness” (height) of the probability distribution, thus complementing the Shannon global descriptor.
The information given us by carrying out the given experiment consists of removing the uncertainty existing before the experiment [32]. If we carry out the particle-localization probe we obtain some information, since its outcome means that we then know exactly, which position has actually been detected. This implies that, after repeated trials performed for the specified quantum state, the initial uncertainty contained in the position probability scheme has been completely eliminated. The average information gained by such tests thus amounts to the removed position uncertainty. The larger the uncertainty in p(r), the larger the amount of information obtained when we eventually find out which electron position has actually been detected after the experiment. In other words, the amount of information given us by the realization of the classical, probability scheme alone equals the global entropy in the classical probability scheme of Figure 1 [31,32].
In QM, however, one deals with the wavefunction scheme {r → ψ(r)} of Figure 1, in which the classical probability map {r → p(r)} constitutes only a part of the overall (complex) mapping. In fact, the wavefunction mapping implies a simultaneous ascription to the parametrically specified electron position of the local modulus (static) and phase/current (dynamic) arguments of the state wavefunction, or the related local probability and probability-velocity descriptors. This two-level scheme in QM ultimately calls for the resultant measures of the entropy/information content in quantum states, combining classical (probability) and nonclassical (phase/current/velocity) contributions. The difference between the resultant and classical information contents can be best compared to that between the (phase-dependent) hologram and (phase-independent) ordinary photograph.
The resultant IT measures are in principle experimentally accessible, since the local probability velocity in physical space, defined by the velocity of probability current, is uniquely specified in QM. In other words, the static and dynamic arguments of the resultant IT descriptors are all sharply specified by the corresponding expectation values of the associated observables. However, the localization experiment alone cannot remove all the uncertainty contained in a general electronic state, which exhibits a nonvanishing local phase component φ(r, t) and hence gives rise to a finite current density j(r, t). This probability flux vanishes only in the stationary state of Equation (6), for the purely time-dependent stationary phase φst.(t): jst.(r, t) = Vst.(r, t) = 0. For such states an experimental determination of the electronic position removes completely all the uncertainty contained in the spatial wavefunction Rst.(r) and the probability distribution pst.(r) = Rst.(r)2. Indeed, the quantum scheme of Figure 1 then reduces to the classical mapping alone.
Since the current operator j(r) includes the momentum operator of an electron, P(r) = −iħ∇,
j(r) = (2m)−1[Pp(r) + p(r)P], p(r) = |r〉〈r|, {r |r’〉 = r’|r’〉},
which does not commute with the position operator r(r) = r,
PrrP ≡ [P, r] = −iħ,
the incompatible observables r and j(r) do not have common eigenstates. In other words, these quantities cannot be simultaneously defined sharply, in accordance with Heisenberg’s uncertainty principle of QM. Therefore, the position dispersion σr cannot be simultaneously eliminated with the current dispersion σj in a single type of experiment, e.g., that of the particle localization. Indeed, a removal of σj ultimately calls for an additional momentum experimental setup, which is incompatible with that required for determining the electronic position. Only the repeated, separate localization and momentum experiments, performed on molecular systems in the same quantum state, can fully eliminate the position and current uncertainties contained in a general electronic state. Nevertheless, both the particle position r and the local convection velocity V(r) of the probability distribution are precisely defined as expectation values of the associated Hermitian operators. Therefore, their resultant IT functionals are all uniquely specified, with their densities exhibiting vanishing spatial dispersions.
The nonclassical uncertainty S[φ], proportional to the state average phase 〈φ〉ψ, effectively lowers the information received from the localization-only experiment. The removable uncertainty in ψ(r) is then less than its classical content S[p] or M[p] = I[p]. In other words, the nonvanishing current pattern introduces an extra (dynamic) determinacy in the system electronic structure, which diminishes its resultant uncertainty (indeterminacy) descriptors.
The phase equilibria corresponding to phase-transformed quantum states,
ψeq.(r) = ψ(r) exp{iφeq.[p, r]},
have been explored elsewhere [2,10,11,12,13,14]. The optimum local (“thermodynamic”) phase component φeq.[p, r] ≡ φeq.(r) for the specified probability density p(r) = pst.(r) in the stationary state ψ = ψst. of Equation (6) marks the exact cancellation of the state classical (S[p]) and nonclassical (S[φeq.]) entropy contributions:
S[ψeq.] = S[p] + S[φeq.] = − ∫p(r) [lnp(r) + 2φeq.(r)]dr = 0.
We argue in the next section that this exact reduction of the “internal” (resultant) entropy content in the equilibrium “thermodynamic” state is essential for the consistency between the von Neumann thermodynamic entropy [33] and the overall IT entropy in the grand ensemble.
The above condition determines the equilibrium (“thermodynamic”, horizontal) local phase for the conserved (stationary) probability distribution,
peq.(r) = |ψeq.(r)|2 = |ψ(r)|2 = p(r),
proportional to the negative logarithm of probability density:
φeq.[p, r] = − ½ lnp(r) ≥ 0.
The same prediction follows from the condition of the vanishing gradient measure of the resultant entropy content in ψeq.:
M[ψeq.] = M[p] + M[φeq.] = I[p] − I[φeq.]
= ∫p(r) {[∇lnp(r)]2 − 4[∇φeq.(r)]2} dr = 0.
Indeed, solving this equation for φeq. ≥ 0 (phase convention) gives:
[∇lnp(r)]2 − 4[∇φeq.(r)]2 = [∇lnp(r) − 2∇φeq.(r)] [∇ln p(r) + 2∇φeq. (r)] = 0 or
∇lnp(r) + 2∇φeq.(r) = 0 ⇒ φeq.(r) = − ½ lnp(r).
We can also observe that writing the average functionals for resultant entropy measures as expectations of the corresponding (multiplicative) operators,
S[ψ] = −∫p(r) [lnp(r) + 2φ(r)] dr ≡ −〈ψ|lnp + 2φ|ψ
and
M[ψ] = ∫p(r) {[∇lnp(r)]2 − 4[∇φ(r)]2} dr = 〈ψ|(∇lnp)2 − 4(∇φ)2|ψ〉,
makes it possible to formally interpret the equilibrium phase of Equations (32) and (34) as the optimum solution defined by the extrema of these wavefunction functionals:
{δS[ψ]/δψ(r)* = 0 or δM[ψ]/δψ(r)* = 0} ⇒ φeq.(r) = −½ lnp(r).
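The exact cancellation predicted by Equations (30) and (32) can be confirmed directly on a grid. A minimal sketch, assuming an illustrative Gaussian stationary density:

```python
import numpy as np

# Illustrative stationary probability density (normalized Gaussian)
x = np.linspace(-8.0, 8.0, 3001)
dx = x[1] - x[0]
p = np.exp(-x**2 / 2) / np.sqrt(2*np.pi)

# Equilibrium ("thermodynamic") phase of Equation (32)
phi_eq = -0.5 * np.log(p)      # nonnegative wherever p <= 1

# Resultant global entropy S[p] + S[phi_eq] of Equation (30)
S_p = -np.sum(p * np.log(p)) * dx
S_phi = -2.0 * np.sum(p * phi_eq) * dx
S_res = S_p + S_phi

# Gradient measure M = I[p] - I[phi_eq]
dlnp = np.gradient(np.log(p), x)
dphi = np.gradient(phi_eq, x)
M_res = np.sum(p * (dlnp**2 - 4.0*dphi**2)) * dx

print(S_res, M_res)   # both vanish: classical and nonclassical parts cancel
```

Both resultant descriptors vanish identically, term by term, since 2φeq.(r) = −lnp(r) at every grid point.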

4. Equilibrium States and Thermodynamic Entropy

Consider now the mixed quantum state in the grand ensemble, the statistical mixture of molecular stationary states {|Ψji〉 ≡ |Ψj(Ni)〉} for different numbers of electrons {Ni}, defined by the corresponding density operator,
D = ∑ij |Ψji〉 Pji 〈Ψji| ≡ ∑ij Pji Oji, ∑ij Oji = 1, ∑ij Pji ≡ ∑i Pi = 1,
where Oji = |Ψji〉〈Ψji| stands for the state projector. The average of a resultant IT quantity G, represented by the associated (possibly state-dependent) operator G = G[Ψji] ≡ Gji, is given by the weighted average of the property state-expectations {Gji = 〈Ψji|G|Ψji〉}:
Gens. = tr(DG) = ∑ij Pji 〈Ψji|G|Ψji〉 ≡ ∑ij Pji Gji ≡ 𝓖(D).
For example, the ensemble entropy of von Neumann [33],
Sens. = −kBij Pji lnPji ≡ 𝓢(D),
corresponds to the state entropy operator Sji = Sji Oji and the expectation value of entropy in state Ψji
Sji = 〈Ψji|Sji|Ψji〉 = −kB lnPji.
This average value depends solely on the state external probability Pji in the mixture, shaped by thermodynamic conditions, and is devoid of any local (internal) content of the constituent wavefunction distributions.
One would expect a similar feature in the overall IT description of molecular ensembles. In the pure quantum state |Ψji〉 the probability of finding an electron at the specified location r is given by the state internal distribution,
pji(r) = ρji(r)/Ni ≡ 〈Ψji|p(r)|Ψji〉,
the shape factor of the associated electron density ρji(r). In the thermodynamic ensemble it is given by the weighted average over such internal state densities {pji(r)}, with the state (external) probability weights {Pji}:
〈p(r)〉ens. = tr(Dp) = ∑ij Pji pji(r) ≡ ∑ij P(ji, r).
The probability product P(ji, r) represents the normalized joint probability of finding in state Ψji an electron at r, with both its factors thus acquiring the status of conditional probabilities:
Pji ≡ P(ji|r) = P(ji, r)/pji(r), ∑ij P(ji|r) = 1;
pji(r) ≡ P(r|ji) = P(ji, r)/Pji, ∫P(r|ji) dr = 1.
The Shannon entropy in the ensemble joint distribution then separates into the “external” entropy S[{Pji}] of von Neumann and the weighted average of “internal” state contributions
{S[pji] = − ∫pji(r) lnpji(r) dr},
S[{P(ji, r)}] = − ∑ij ∫ P(ji, r) lnP(ji, r) dr
= − ∑ij Pji lnPji − ∑ij Pji ∫pji(r) lnpji(r) dr
= S[{Pji}] + ∑ij Pji S[pji].
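The separation of Equation (45) is the familiar grouping property of the Shannon entropy, which a small discrete example confirms (the external probabilities {Pji} and internal distributions {pji} below are illustrative assumptions, with the integral over r replaced by a sum):

```python
import numpy as np

def shannon(q):
    """Discrete Shannon entropy -sum q ln q (in nats)."""
    q = np.asarray(q, dtype=float)
    return float(-np.sum(q * np.log(q)))

# External (ensemble) probabilities P_j and internal state distributions p_j
P = np.array([0.5, 0.3, 0.2])
p_int = [np.array([0.7, 0.3]),
         np.array([0.2, 0.8]),
         np.array([0.25, 0.25, 0.5])]

# Joint probabilities P(j, r) = P_j p_j(r)
joint = np.concatenate([Pj * pj for Pj, pj in zip(P, p_int)])

# Grouping of Equation (45): S[joint] = S[{P_j}] + sum_j P_j S[p_j]
lhs = shannon(joint)
rhs = shannon(P) + sum(Pj * shannon(pj) for Pj, pj in zip(P, p_int))
print(lhs, rhs)   # equal
```

The "external" (von Neumann-type) and averaged "internal" terms thus exhaust the joint uncertainty exactly, which is what makes the vanishing of the internal equilibrium contribution a consistent requirement.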
For a consistent IT description of the equilibrium mixed states of the open reactive complexes and their substrate subsystems, it would be desirable that in each phase-transformed pure state,
Ψeq.[pji] = Ψji exp{iφeq.[pji]},
defined by its local (horizontal) phase φeq.[pji, r] ≡ φ(h)(r) (see also Section 7 and Section 8), the equilibrium phase for the specified state probability density pji(r), the second (internal) contribution of Equation (45) exactly vanishes. This is indeed the case when the internal entropy of each equilibrium state is exactly zero:
S[Ψeq.[pji]] = S[pji] − 2 ∫pji(r) φeq.[pji, r] dr ≡ 0.
In statistical mixtures of the equilibrium stationary states the only source of uncertainty is then generated by von Neumann’s ensemble entropy, determined by the “external” probabilities alone. This consistency requirement thus identifies the state equilibrium phase of Equation (32) [2,10,11,12,13,14]:
φeq.[pji, r] = − ½ lnpji(r) ≥ 0.
In such “horizontally” phase-transformed states the thermodynamic and resultant equilibrium entropies are thus consistent with one another:
Sens. = kB S[{P(ji, r)}] = kB S[{Pji}].
To summarize, the equilibrium “thermodynamic” (horizontal) phase is proportional to the local probability logarithm. This is very much in the spirit of density-functional theory (DFT) [41,42,43,44,45,46]: the equilibrium stationary state is the unique functional of the system electron distribution ρji(r) = Ni pji(r), Ψji,eq. = Ψeq.[ρji], since both Ψji = Ψji[ρji], by the first Hohenberg–Kohn (HK) [41] theorem, and the equilibrium “thermodynamic” phase φeq. = φeq.[pji].
Therefore, when the state “thermodynamic” phase satisfies the “equilibrium” criterion of Equation (30), the introduction of the phase-transformed states for conserved (stationary) probability distribution generates the mutual consistency between the external (ensemble) and internal (resultant) entropy descriptors. It implies that for the single stationary state the resultant global and gradient uncertainty descriptors of the specified wavefunction vanish in equilibrium, as indeed does von Neumann’s [33] entropy of the pure quantum state. In such states, the internal nonclassical (phase/current) contribution exactly cancels out the classical (probability) term. The equilibrium-phase condition of the state vanishing “internal” (resultant) IT descriptor then consistently predicts the equilibrium (horizontal) phase being related to the negative logarithm of the stationary probability distribution [2,10,11,12,13,14]:
{M[pst., φ(h)] = 0 or S[pst., φ(h)] = 0} ⇒ φopt.(h)(r) = − ½ lnpst.(r) ≡ φeq.(pst.).

5. Continuity Relations

It is of crucial importance for continuity laws of QM to distinguish between the reference frame moving with the particle (Lagrangian frame) and the reference frame fixed to the prescribed coordinate system (Eulerian frame). The total derivative d/dt is the time change appearing to an observer who moves with the probability flux, while the partial derivative ∂/∂t is the local time rate of change observed from a fixed point in the Eulerian reference. These derivatives are related to each other by the chain-rule transformation,
d/dt = ∂/∂t + V(r, t)⋅∇,
where the velocity-dependent part V(r, t) ⋅∇ generates the probability “convection” term.
In Schrödinger’s dynamical picture the state vector |ψ(t)〉 introduces an explicit time dependence of the system wavefunction, while the dynamics of the basis vector |r(t)〉 of the position representation is the source of an additional, implicit time dependence of the electronic wavefunction ψ(r, t) = ψ[r(t), t], due to the moving reference (monitoring) point. This separation applies to wavefunctions, their components, and expectation values of physical observables. In Table 1 we summarize the dynamic equations for the wavefunction modulus and phase components together with the continuity relations for the state probability, current, and information densities, which directly follow from the wavefunction dynamics of SE.
It directly follows from the SE that the probability field is sourceless:
∂p(r, t)/∂t = 2R(r, t) [∂R(r, t)/∂t] = −∇⋅ j(r, t) = − V(r, t)⋅∇p(r, t) or
σp(r, t) ≡ dp(r, t)/dt = ∂p(r, t)/∂t + ∇⋅j(r, t) = ∂p(r, t)/∂t + ∇p(r, t)⋅V(r, t) = 0.
Indeed, separating the explicit and implicit time dependencies in probability density p(r, t) = p[r(t), t] gives:
σp(r, t) = ∂p[r(t), t]/∂t + (dr/dt)⋅∂p(r, t)/∂r = ∂p(r, t)/∂t + V(r, t)⋅∇p(r, t)
= ∂p(r, t)/∂t + ∇⋅j(r, t).
Above, the total time derivative dp(r, t)/dt determines the vanishing local probability “source”: σp(r, t) = 0. It measures the time rate of change in an infinitesimal volume element of probability fluid moving with probability velocity V(r, t) = dr(t)/dt, while the partial derivative ∂p[r(t), t]/∂t refers to a volume element around the fixed point in space. The divergence of probability flux in the preceding equation,
∇⋅j(r, t) = ∇p(r, t)⋅V(r, t) + p(r, t)∇⋅V(r, t) = ∇p(r, t)⋅V(r, t),
thus implies the vanishing divergence of the velocity field V(r, t), related to the phase Laplacian ∇2φ(r, t) = Δφ(r, t):
∇⋅V(r, t) = (ħ/m) Δφ(r, t) = 0 or Δφ(r, t) = 0.
As in fluid dynamics, in these transport equations the operators (V⋅∇) and ∇2 = Δ represent the “convection” and “diffusion”, respectively. Thus, in Equation (52), the local evolution of the particle probability is governed by the density “convection”, while the preceding equation implies the vanishing “diffusion” of the phase distribution.
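The sourceless character of Equation (52) can be verified for a rigidly convected profile. A sketch, assuming an illustrative Gaussian density drifting with constant velocity V, so that j = pV and the Eulerian rate ∂p/∂t must equal −∇⋅j pointwise:

```python
import numpy as np

def p(xg, t, V=1.5, sigma=1.0):
    """Gaussian probability density rigidly convected with velocity V."""
    return np.exp(-(xg - V*t)**2 / (2*sigma**2)) / (sigma*np.sqrt(2*np.pi))

V = 1.5
xg = np.linspace(-10.0, 10.0, 2001)
t, dt = 0.3, 1e-5

# Eulerian rate dp/dt at fixed x, by a central difference in time
dpdt = (p(xg, t + dt) - p(xg, t - dt)) / (2*dt)

# Divergence of the probability flux j = p V (constant convection velocity)
div_j = np.gradient(p(xg, t) * V, xg)

# Sourceless continuity of Equation (52): dp/dt + div j = 0 everywhere
residual = np.max(np.abs(dpdt + div_j))
print(residual)   # ≈ 0, up to finite-difference accuracy
```

Here ∇⋅V = 0 by construction, so the local probability evolution is pure convection, −V⋅∇p, exactly as the text describes.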
In Table 1 we summarize local continuity equations for the wavefunction components, the state physical descriptors, and information densities. For example, it follows from the table that the resultant gradient information exhibits a nonvanishing net production σI(t) due to a finite phase source σφ(r, t). The classical contribution to σI(t) identically vanishes due to the probability continuity of Equation (52). These relations directly follow from the molecular SE and identify the relevant local sources of the distributions of interest.
As an example, consider continuities of the wavefunction components. When expressed in terms of the state modulus and phase parts the SE reads:
iħ [∂ψ(r, t)/∂t] = iħ {[∂R(r, t)/∂t] + i R(r, t) [∂φ(r, t)/∂t]} exp[i φ(r, t)] = H(r) ψ(r, t)
= [−ħ2(2m)−1{ΔR(r, t) + 2i∇R(r, t)⋅∇φ(r, t) − R(r, t) [∇φ(r, t)]2}
+ v(r) R(r, t)] exp[iφ(r, t)],
where we have used Equation (55). Dividing both sides by ħR(r, t) and multiplying by exp[−iφ(r, t)] gives the following (complex) dynamic relation linking the wavefunction components:
i [∂lnR(r, t)/∂t] − ∂φ(r, t)/∂t
= −[ħ/(2m)]{R(r, t)−1ΔR(r, t) + 2i[∇lnR(r, t)]⋅∇φ(r, t) − [∇φ(r, t)]2} + v(r)/ħ.
Comparing its imaginary parts generates the time evolution of the modulus part of electronic state,
∂lnR(r, t)/∂t = − (ħ/m) ∇φ(r, t) ⋅∇lnR(r, t) = − V(r, t) ⋅∇lnR(r, t),
which can be directly transformed into the probability continuity equation
∂p(r, t)/∂t = −∇⋅j(r, t) or σp(r, t) = dp(r, t)/dt = 0.
Equating the real parts of Equation (57) similarly determines the phase dynamics
∂φ(r, t)/∂t = [ħ/(2m)] {R(r, t)−1ΔR(r, t) − [∇φ(r, t)]2} − v(r)/ħ.
The preceding equation ultimately determines the production term σφ(r, t) = dφ(r, t)/dt in the phase-continuity relation
∂φ(r, t)/∂t = −∇⋅J(r, t) + σφ(r, t),
since the effective velocity V(r, t) of the probability current j(r, t) = p(r, t) V(r, t) also determines the phase flux and its divergence, the convection term in the continuity Equation (61):
J(r, t) = φ(r, t) V(r, t) and ∇⋅J(r, t) = V(r, t) ⋅∇φ(r, t) = (ħ/m) [∇φ(r, t)]2.
This complementary flow descriptor ultimately identifies the finite phase production
σφ(r, t) ≡ dφ(r, t)/dt = ∂φ(r, t)/∂t + V(r, t) ⋅ ∇φ(r, t) ≠ 0.
Finally, using Equation (60) gives the following expression for the phase source:
σφ(r, t) = [ħ/(2m)]{R(r, t)−1ΔR(r, t) + [∇φ(r, t)]2} − v(r)/ħ.
This production of the local phase is seen to group the probability-diffusion and phase-convection terms supplemented by the external potential contribution.
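The algebra leading from the phase dynamics of Equation (60) to this phase-source expression can be verified symbolically. The one-dimensional sketch below is an independent consistency check (not part of the original derivation): it confirms that σφ = ∂φ/∂t + V ∂φ/∂x indeed flips the sign of the phase-convection term:

```python
import sympy as sp

x, hbar, m = sp.symbols('x hbar m', positive=True)
R = sp.Function('R', positive=True)(x)
phi = sp.Function('phi')(x)
v = sp.Function('v')(x)
# 1D phase dynamics, Equation (60): dphi/dt = (hbar/2m){R''/R - (phi')^2} - v/hbar
phi_t = hbar/(2*m)*(sp.diff(R, x, 2)/R - sp.diff(phi, x)**2) - v/hbar
V = (hbar/m)*sp.diff(phi, x)               # probability-flux velocity
sigma_phi = phi_t + V*sp.diff(phi, x)      # sigma_phi = dphi/dt + V*grad(phi)
# expected phase source, Equation (63): (hbar/2m){R''/R + (phi')^2} - v/hbar
expected = hbar/(2*m)*(sp.diff(R, x, 2)/R + sp.diff(phi, x)**2) - v/hbar
assert sp.simplify(sigma_phi - expected) == 0
```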
The component SE (57) also allows one to identify the wave-number distributions introduced in Equations (14)–(17):
c(r, t) = −∂φ(r, t)/∂t = −[ħ/(2m)] {R(r, t)−1ΔR(r, t) − [∇φ(r, t)]2} + v(r)/ħ
= −[ħ/(2m)] {ΔlnR(r, t) + [∇lnR(r, t)]2 − [∇φ(r, t)]2} + v(r)/ħ
and
b(r, t) = ∂lnR(r, t)/∂t = − (ħ/m) ∇φ(r, t) ⋅∇lnR(r, t) = − V(r, t) ⋅∇lnR(r, t).
To summarize, the effective velocity of the probability current also determines the phase flux in molecular states. The source (net production) of the classical probability variable of electronic states identically vanishes, while that of their nonclassical phase part remains finite. In overall descriptors of the state information or entropy contents they ultimately generate finite production terms. For example, the nonclassical information I[φ] generates the nonvanishing (integral) source of the average resultant gradient information I[ψ]:
σI(t) = dI[φ]/dt ≡ ∫p(r, t)σI(r, t) dr = (8m/ħ) ∫ j(r, t) ⋅∇σφ(r, t) dr.
Its density-per-electron σI(r, t) is determined by the product of the local probability “flux” j(r, t) and an “affinity” factor proportional to the gradient of the phase source. The entry for this local information source in Table 1 likewise shows that it is governed by the “convection” of the phase source σφ(r, t):
σI(r, t) = (8m/ħ) V(r, t) ⋅∇σφ(r, t).

6. Principle of Stationary Resultant Information and Charge-Transfer Descriptors of Open Systems

The equilibrium subsystems in the specified (pure) state of the molecular system as a whole require the mixed-state description in terms of ensemble-average physical quantities [31,47,48,49,50]. The same applies to the (externally) open microscopic systems in the applied thermodynamic conditions. In reactivity problems the specified temperature T of the “heat bath” 𝕭(T) and electronic chemical potential μ (or electronegativity χ = −μ) of the macroscopic “electron reservoir” 𝓡(μ) call for the grand-ensemble approach [44,51,52]. The equilibrium quantum state is then represented by the statistical mixture of the system pure (stationary) states, defined by the externally imposed (equilibrium) state probabilities. Indeed, only the ensemble-average value of the overall number of electrons 𝓝 ≡ 〈N〉ens. exhibits a continuous (fractional) spectrum of values justifying the populational derivatives defining the reactivity criteria [44,51,52]. The externally open molecule M(v), identified by its external potential v(r) due to the system fixed nuclei, then constitutes a part of the composed system 𝓜 = [M(v)¦𝓡(μ)] consisting of the mutually open (microscopic) molecular fragment M(v) and an external (macroscopic) electron reservoir 𝓡(μ). In the theory of chemical reactivity one adopts such populational derivatives of the system ensemble-average energy and its underlying Taylor expansion in predicting reactivity behavior of molecules (single-reactant criteria) or bimolecular reactive systems (two-reactant criteria in situ) [44,53,54,55,56,57,58,59].
Such 𝓝-derivatives of electronic energy are indeed involved in definitions of several reactivity criteria, e.g., the chemical potential/electronegativity [44,52,53,54,55,56,57,58,59,60,61,62] or hardness/softness [46,56,57,58,59,63] and Fukui function (FF) [44,56,57,58,59,64] descriptors of the reaction complex. In IT treatments one introduces analogous concepts of the populational derivatives of the ensemble average (resultant) gradient information. Since reactivity phenomena involve electron flows between the mutually open (polarized) substrates, only in such a generalized, ensemble framework can one precisely define the relevant reactivity criteria, determine the hypothetical states of the promoted subsystems, and eventually predict effects of their chemical coordination. It has been demonstrated that, in such an ensemble approach, the energetic and information principles are exactly equivalent, giving rise to identical predictions of thermodynamic equilibria, charge relaxation, and average descriptors of molecular systems and their fragments [9,10,30,65,66].
The populational derivatives of the average energy and resultant information in reactive systems thus invoke the composite representation 〈M(v)〉ens. of the equilibrium state of the molecular system M(v) in the grand ensemble. Thermodynamic conditions in the (microscopic) molecular system are thus imposed by the hypothetical (macroscopic) heat bath 𝕭(T) and external electron reservoir 𝓡(μ). The mixed state then corresponds to the equilibrium probabilities P(μ, T; v) ≡ {Pji(μ, T; v)} of the pure (stationary) states {|Ψji〉 ≡ |Ψj(Ni)〉}, with |Ψji〉 denoting the j-th state for Ni (integer) number of electrons, which define the equilibrium density operator of Equation (37):
D(μ, T; v) = ∑ij |Ψji〉 Pji(μ, T; v) 〈Ψji|, ∑ij Pji(μ, T; v) ≡ ∑i Pi(μ, T; v) = 1.
This statistical mixture of molecular states gives rise to the ensemble average values of the system electronic energy and its resultant gradient information. The former is defined by the quantum expectations of electronic Hamiltonians {Hi = H(Ni, v)},
〈E〉ens. = ∑ij Pji(μ, T; v) 〈Ψji|Hi|Ψji〉 ≡ ∑ij Pji(μ, T; v) Eji ≡ 𝓔(μ, T; v) ≡ 𝓔(D),
while the latter corresponds to the quantum expectation of (Hermitian) operator for the resultant gradient information of Ni electrons, {Ii ≡ I(Ni) ≡ ∑k I(k)}, related to the corresponding kinetic-energy operators {Ti ≡ T(Ni) = ∑k T(k)}, k = 1, 2, …, Ni,
Ii = −4∑k Δk ≡ ∑k I(k) = (8m/ħ2) Ti ≡ κ Ti = κ ∑k T(k), T(k) = −[ħ2/(2m)] Δk,
〈I〉ens. = ∑ij Pji(μ, T; v) 〈Ψji|Ii|Ψji〉 ≡ ∑ij Pji(μ, T; v) Iji ≡ 𝓘(μ, T; v) ≡ 𝓘(D).
Thus the average gradient information 𝓘(D) reflects the (dimensionless) average kinetic energy
〈T〉ens. = ∑ij Pji(μ, T; v) 〈Ψji|Ti|Ψji〉 ≡ ∑ij Pji(μ, T; v) Tji ≡ 𝓣(μ, T; v) = 𝓣(D) = κ−1 𝓘(D),
Tji = 〈Ψji|Ti|Ψji〉 = κ−1 Iji.
The equilibrium probabilities P(μ, T; v) result from the minimum principle of the grand potential Ω(D):
δ[𝓔(D) − μ 𝓝(D) − T Տ(D)]|P(μ,T; v) ≡ δΩ(D)|P(μ,T; v) = 0.
Here, the average number of electrons
〈N〉ens. = ∑i Ni [∑j Pji(μ, T; v)] ≡ ∑i Ni Pi(μ, T; v) = 𝓝(D)
and the thermodynamic entropy of the ensemble
〈S〉ens. = −kB ∑ij Pji(μ, T; v) lnPji(μ, T; v) ≡ Տ(D),
with kB denoting the Boltzmann constant.
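The grand-potential minimum principle can be illustrated numerically. The sketch below uses a small hypothetical spectrum {Eji, Ni} and thermodynamic conditions (all values illustrative, in arbitrary units): it builds the gibbsian equilibrium probabilities Pji ∝ exp[−(Eji − μNi)/kBT] and checks that probability-conserving perturbations never lower Ω = 𝓔 − μ𝓝 − TՏ:

```python
import numpy as np

kB, Temp, mu = 1.0, 0.5, -0.3                # illustrative temperature and chemical potential
N = np.array([1, 1, 2, 2])                   # electron numbers Ni of the pure states
E = np.array([-0.5, -0.1, -0.8, -0.3])       # hypothetical state energies Eji

def grand_potential(P):
    S = -kB*np.sum(P*np.log(P))              # ensemble entropy S(D)
    return P @ E - mu*(P @ N) - Temp*S       # Omega = E - mu*N - T*S

w = np.exp(-(E - mu*N)/(kB*Temp))            # gibbsian weights
P_eq = w/w.sum()                             # equilibrium probabilities P(mu, T; v)

# any normalization-preserving perturbation raises the grand potential
for d in (np.array([1, -1, 0, 0]), np.array([0, 1, -1, 0]), np.array([1, 1, -1, -1])):
    assert grand_potential(P_eq + 1e-3*d) >= grand_potential(P_eq)
```

Because Ω is convex in the probabilities and stationary at P_eq, the first-order variation vanishes and every admissible displacement increases Ω, in line with Equation (74).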
The entropy-constrained energy principle of Equation (74) can be also interpreted as an equivalent (potential-energy constrained) information rule [5,6,9,67,68,69], for the minimum of the ensemble resultant gradient-information 𝓘(D):
δ[𝓘(D) − λ 𝓦(D) − ζ 𝓝(D) − τ Տ(D)]|P(μ,T; v) = 0.
It contains the additional constraint of the fixed overall potential energy, 〈W〉ens. = 𝓦(D), multiplied by the Lagrange multiplier λ = −κ, and includes the “scaled” information intensities associated with the remaining constraints:
potential ζ = κ μ, enforcing the prescribed electron population 𝓝(D) = N;
temperature τ ≡ κ T, for the subsidiary entropy condition, Տ(D) = S.
The extrema of the ensemble principles of Equations (74) and (77) determine the same equilibrium probabilities P(μ, T; v) of electronic states. The physical equivalence of the energy and information principles indicates that energetic and information reactivity concepts are mutually related, being both capable of describing charge-transfer (CT) phenomena in acid(A)–base(B) systems.
The ensemble interpretation applies to all populational, 𝓝-derivatives of the average energy or information functionals. For example, in energy representation the global chemical hardness [44,63] reflects the 𝓝-derivative of the chemical potential,
η = ∂2𝓔/∂𝓝2 = ∂μ/∂𝓝 > 0,
while the information hardness measures the 𝓝-derivative of the information potential:
ω = ∂2𝓘/∂𝓝2 = ∂ζ/∂𝓝 = κ η > 0.
The positive signs of these “diagonal” (hardness) derivatives assure the external stability of 〈M(v)〉ens., with respect to charge flows between the molecular system M(v) and its electron reservoir, in accordance with the Le Châtelier and Le Châtelier–Braun principles of thermodynamics [70].
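In practice such populational derivatives are estimated by finite differences of the energy over integer electron numbers, μ ≈ −(I + A)/2 and η ≈ I − A, the standard operational definitions of conceptual DFT. A minimal sketch, with illustrative integer-N energies and κ simply set to unity:

```python
# Finite-difference estimates from integer-N energies (all values illustrative)
E_Nm1, E_N, E_Np1 = -99.0, -100.0, -100.2    # E(N0-1), E(N0), E(N0+1)
I = E_Nm1 - E_N                               # ionization potential
A = E_N - E_Np1                               # electron affinity
mu = -(I + A)/2                               # chemical potential, mu = dE/dN
eta = I - A                                   # hardness, eta = d2E/dN2 > 0
kappa = 1.0                                   # kappa = 8m/hbar^2, set to unity here
omega = kappa*eta                             # information hardness, omega = kappa*eta
assert eta > 0 and omega > 0                  # external stability of <M(v)>ens.
```

With these numbers μ = −0.6 and η = 0.8 > 0, so the hypothetical system is stable against spontaneous charge flow to or from the reservoir.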
The global FF [44,56,57,58,59,64] is defined by the “mixed” second derivative of the ensemble average energy:
f(r) = ∂/∂𝓝[δ𝓔/δv(r)] = ∂ρ(r)/∂𝓝 = δ/δv(r) (∂𝓔/∂𝓝) = δμ/δv(r),
where we have applied the Maxwell cross-differentiation identity. It can be thus interpreted as either the density response per unit populational displacement, or as the response in the global chemical potential to unit displacement in the local external potential. The analogous derivative of the average gradient information similarly reads:
ϕ(r) = ∂/∂𝓝[δ𝓘/δv(r)] = δ/δv(r) (∂𝓘/∂𝓝) = κ f(r) = δζ/δv(r).
The in situ CT derivatives of the average resultant gradient information in the reactive system R = A–B include the CT potential quantity, related to μCT,
ζCT = ∂𝓘(NCT)/∂NCT = κ μCT,
and the CT hardness descriptor, related to ηCT = SCT−1,
ωCT = ∂2𝓘(NCT)/∂NCT2 = ∂ζ(NCT)/∂NCT = κ ηCT ≡ θCT−1,
which is the inverse of the CT softness θCT = ∂NCT/∂ζ. In terms of these CT descriptors, the optimum amount of the B→A electron transfer in the donor–acceptor reactive system,
NCT = 𝓝A − NA0 = NB0 − 𝓝B > 0,
thus reads:
NCT = − μCT/ηCT = −μCT SCT = − ζCT/ωCT = −ζCT θCT.
Above, {NX0} and {𝓝X} denote electron populations of the mutually closed and open reactants in M+ = (A+|B+) and M* = (A*¦B*) = M, respectively.
Therefore, the in situ derivatives {ζCT, ωCT = θCT−1} of the average content of the resultant gradient information provide alternative reactivity descriptors, equivalent to the chemical potential and hardness or softness indices {μCT, ηCT = SCT−1} of the classical, energy-centered theory of chemical reactivity. This again demonstrates the physical equivalence of the energy and information principles in describing the CT phenomena in molecular systems. One thus concludes that the resultant gradient information, the quantum generalization of the classical Fisher measure, constitutes a reliable basis for an “entropic” description of reactivity phenomena.
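This equivalence is easy to exercise numerically. The sketch below assumes hypothetical reactant descriptors and the commonly used additive in situ combinations, μCT = μA − μB and ηCT = ηA + ηB (assumed here for illustration; the article defines the in situ quantities abstractly), and checks that the energy and information routes give the same optimum charge transfer:

```python
# Illustrative reactant descriptors (hypothetical values, arbitrary units)
mu_A, eta_A = -0.30, 0.50       # acid A: chemical potential, hardness
mu_B, eta_B = -0.10, 0.40       # base B
mu_CT = mu_A - mu_B             # in situ CT potential (additive-reactant form assumed)
eta_CT = eta_A + eta_B          # in situ CT hardness (assumed additive form)
N_CT = -mu_CT/eta_CT            # optimum B -> A electron transfer
kappa = 1.0                     # kappa = 8m/hbar^2, set to unity here
zeta_CT, omega_CT = kappa*mu_CT, kappa*eta_CT   # information analogs
assert N_CT > 0                                  # B -> A direction, mu_B > mu_A
assert abs(N_CT + zeta_CT/omega_CT) < 1e-12      # -zeta_CT/omega_CT = -mu_CT/eta_CT
```

Since ζCT and ωCT differ from μCT and ηCT only by the common factor κ, the ratio −ζCT/ωCT reproduces −μCT/ηCT identically, whatever the value of κ.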

7. Latent Probability Flows in Stationary Equilibrium

Consider again the stationary state ψst.(r, t) of an electron (Equation (6)) corresponding to the sharply specified energy Est.. The wavefunction phase is then purely time dependent, φst.(r, t) = −ωst.t ≡ φst.(t), with the state local aspect being described solely by its modulus part Rst.(r), the eigenfunction (see Equation (9)) of the electronic Hamiltonian of Equation (4). This stationary “equilibrium” thus generates the vanishing probability current jst.(r) = pst.(r)Vst.(r), where the time-independent probability distribution pst.(r) = Rst.(r)2 and the vanishing flux-velocity Vst.(r) = (ħ/m) ∇φst.(t) = 0. As indicated in Section 2, the eigenstates of the electronic Hamiltonian correspond to the equalized local energy, Est.(r, t) = Est., marking the equalized local phase: φst.(r, t) = φst.(t).
Clearly, the stationary probability distribution and its vanishing current/velocity in such states do not imply that the particle is then at rest. The electrons are incessantly moving around the fixed nuclei, with the experimentally (sharply) unobserved instantaneous particle velocity W(r, t) = dr(t)/dt = P(r, t)/m reflecting its momentum P(r, t) = ħ k(r, t). Indeed, the system stability requires that centrifugal forces of these fast movements compensate for the nuclear attraction as, e.g., in Bohr’s historic, “planetary” model of the hydrogen atom. The tightly bound inner (“core”) electrons have to move faster than less confined outer (“valence”) electrons. The natural question then arises: how to describe the presence of these unceasing (latent) instantaneous motions in the dynamics of the probability “fluid”?
One observes that, for the probability density pst.(r) to remain conserved in time, its latent flows must follow the probability contours pst.(r) = p(0) = const. (see Figure 2). Any motion in the direction perpendicular to the probability line passing a given location in space would imply a change in time of the probability value at this point, and hence the nonstationary character of the whole distribution. In other words, the latent flows of the stationary position-probability distribution must be “horizontal”, directed along the constant-probability lines. Such probability fluxes in ψst., along the probability contours for the vanishing “vertical” velocity component, preserve in time the stationary character of the spatial probability distribution, which determines the vanishing probability flux. Therefore, the stationary character of the molecular electronic state does not preclude the latent local flows of electronic probability in horizontal directions generating the atomic vortices of Figure 3.
The instantaneous resultant (r) velocity V(r)(r, t) of probability “fluid” thus involves two independent components (see Figure 2): the “vertical” (current) velocity along the phase gradient,
V(v)(r, t) ≡ V(r, t) = j(r, t)/p(r, t) = (ħ/m) ∇φ(r, t),
perpendicular to the local direction of probability contour at time t, V ⊥ (p = const.), and hence parallel to ∇p(r, t), V||(∇φ, ∇p); and the “horizontal” velocity V(h)(r, t), along the probability contour,
V(r)(r, t) = V(r, t) +V(h)(r, t).
The horizontal velocity V(h)(r, t) of probability motions along the constant-probability lines, V(h)||(p = const.), can also remain finite in the stationary electronic states of atomic or molecular systems, since it does not affect the conserved probability distribution. The vertical component V of the probability current then reflects a common direction of gradients ∇φ and ∇p, of the distributions’ fastest increase, with a horizontal supplement perpendicular to both these gradients: V(h)⊥(∇φ, ∇p).
These components of probability velocity imply the associated combination rules for the resultant probability and phase currents:
j(r)(r, t) = p(r, t) V(r)(r, t) = p(r, t)V(r, t) + p(r, t)V(h)(r, t) ≡ j(r, t) +j(h)(r, t),
J(r)(r, t) = φ(r, t)V(r)(r, t) = φ(r, t)V(r, t) + φ(r, t)V(h)(r, t) ≡ J(r, t) +J(h)(r, t).
The above directional properties of the vertical and horizontal components then confirm the validity of the (vertical) continuity relations for the probability and phase distributions:
∂p/∂t = −∇⋅j(r) = −∇p⋅V(r) = −∇p⋅[V + V(h)] = −∇p⋅V = −∇⋅j,
∂φ/∂t = −∇⋅J(r) + σφ = −∇φ⋅V(r) + σφ = −∇φ⋅[V + V(h)] + σφ = −∇φ⋅V + σφ = −∇⋅J + σφ,
where we have recognized Equation (55) and observed that horizontal currents j(h) and J(h) generate vanishing divergences,
∇⋅j(h) = ∇p⋅V(h) = 0 and ∇⋅J(h) = ∇φ⋅V(h) = 0,
since V(h) is perpendicular with respect to both ∇p and ∇φ.
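The orthogonality underlying these vanishing divergences can be made concrete with a model density. For a two-dimensional Gaussian p(x, y) = exp[−(x² + y²)] (an assumed illustration, with circular probability contours), the tangential “horizontal” field (−y, x) is exactly perpendicular to ∇p everywhere:

```python
import numpy as np

def grad_p(x, y):
    """Gradient of the model density p(x, y) = exp(-(x^2 + y^2))."""
    p = np.exp(-(x**2 + y**2))
    return np.array([-2.0*x*p, -2.0*y*p])

# tangential ("horizontal") direction along the circular contour through (x, y)
for x, y in [(1.0, 0.5), (-0.7, 2.0), (0.3, -1.1)]:
    V_h = np.array([-y, x])                  # velocity along the contour
    assert abs(grad_p(x, y) @ V_h) < 1e-12   # grad p . V(h) = 0
```

Any velocity field tangent to the probability contours therefore carries no divergence of the probability flux, leaving the stationary distribution intact.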
Therefore, the phase and probability gradients are both perpendicular to the probability contour and, hence, ∇φ(r, t) ∝ ∇p(r, t). This directional character of the current velocity V(r, t) suggests that the local aspect of the phase function itself should be related to the probability density:
φ(r, t) = φ[p(r, t), t] ⇒ ∇φ = (∂φ/∂p) ∇p.
Such a directional feature indeed characterizes the IT equilibrium (“thermodynamic”) phase of Equation (32) (see also Section 4), resulting from extrema of the phase entropy/information functionals,
φeq.(r, t) = −(1/2) lnp(r, t) ≥ 0,
for which
∇φeq.(r, t) = −[2p(r, t)]−1 ∇p(r, t).
The velocity of the latent, “horizontal” flows along the probability contours can be then attributed to the additional (local) horizontal phase φ(h)(r) component, a “thermodynamic” addition to the purely time-dependent stationary phase φst.(t) in the resultant phase of the transformed state:
Φ(r, t) = φst.(t) + φ(h)(r),
V(r)(r, t) = (ħ/m) ∇Φ(r, t) = (ħ/m) ∇φ(h)(r) = V(h)(r).
In order to study the time-dependent flows in liquids, the separate concepts of “streamline” and “pathline” are introduced [71]. At the specified time, the former are tangential to the directional field of velocity “arrows”. Since the particles move in the direction of the streamlines, there is no motion perpendicular to the streamlines and the property flux per unit time between two streamlines remains constant. Patterns of streamlines describe the instantaneous state of a flow, indicating the direction of motion of all particles at a given time. For the time-dependent flows, the velocity field changes in time, with pathlines no longer coinciding with streamlines. Only for the time-independent flows do the particles move along streamlines, so that pathlines and streamlines coincide.
In the stationary quantum mechanics the contours of molecular probability “fluid” at time t = t0, p(r, t0) = pst.(r) similarly determine the streamlines of the latent (horizontal) flows of electronic probability, which preserve the “static” probability distribution pst.(r) of the stationary quantum state. They generate “vortices” of the latent “horizontal” velocity in spherical probability distributions of free atoms of the promolecule M0, the deformed AIM distributions in the polarized system M+, and in the equilibrium density of the molecule M = M* (see Figure 3).

8. Component Dynamics in Equilibrium Stationary States

Consider again a general (complex) state of an electron (Equation (1)) and its quantum dynamics in Equation (5), determined by the Hamiltonian of Equation (4). Let us separate the local “vertical” r(v) and “horizontal” r(h) components of a general displacement in electronic position (see Figure 2), dr = dr(v) + dr(h), in directions perpendicular and parallel to the probability contour p(r) = const., respectively. The former is consistent with the probability gradient ∇ p(r), which reflects the direction of the distribution fastest increase.
The stationary (ground) state of an electron ψ0, for the sharply specified energy,
E[ψ0] = 〈ψ0|H|ψ0〉 = 〈R0|H|R0〉 = E0 ≡ Ev[p0],
corresponds to the time-independent modulus function R0(r) and the time-dependent phase component φ0(t) = −(E0/ħ)t ≡ −ω0 t. The associated equilibrium state then corresponds to the locally (horizontally) modified resultant phase,
Φ(r, t) = φ0(t) + φ(h)(r),
in the phase-transformed wavefunction,
Ψ0(r, t) = ψ0(r, t) exp[iφ(h)(r)] = R0(r) exp{i[−ω0 t + φ(h)(r)]}
≡ R0(r) exp[iΦ(r, t)] ≡ ϕ0(r) exp[iφ0(t)],
which conserves the stationary probability distribution:
p0(r) = |Ψ0(r, t)|2 = |ψ0(r)|2 = |ϕ0(r)|2 = [R0(r)]2.
However, the expectation value of the energy,
E0] = 〈Ψ0|H|Ψ0〉 = 〈ϕ0|H|ϕ0〉 = E0 + T[φ(h)],
differs from E0 of Equation (97) by the “horizontal” kinetic energy,
T[φ(h)] = κ−1I[φ(h)] = ∫p0(r){(m/2)[(ħ/m)∇φ(h)(r)]2} dr
≡ ∫p0(r){m[V(h)(r)]2/2} dr ≡ ∫p0(r) T(h)(r) dr,
related to the (horizontal) nonclassical information,
I[φ(h)] = 4∫p0(r) [∇φ(h)(r)]2dr.
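The proportionality T[φ(h)] = κ−1I[φ(h)], κ = 8m/ħ2, can be confirmed by direct quadrature. The sketch below assumes a normalized Gaussian p0 and an arbitrary horizontal-phase gradient (both illustrative), and evaluates the two integrals on a grid:

```python
import numpy as np

hbar = m = 1.0
kappa = 8*m/hbar**2                        # kappa = 8m/hbar^2
x = np.linspace(-8.0, 8.0, 20001)
p0 = np.exp(-x**2)/np.sqrt(np.pi)          # normalized stationary density (illustrative)
dphi = np.sin(x)                           # assumed horizontal-phase gradient
# simple trapezoid-rule quadrature
integrate = lambda f: float(np.sum(0.5*(f[1:] + f[:-1])*np.diff(x)))
I_h = 4.0*integrate(p0*dphi**2)                        # I[phi(h)] = 4*Int p0 (dphi)^2 dx
T_h = integrate(p0*0.5*m*((hbar/m)*dphi)**2)           # T[phi(h)] = Int p0 (m/2)(hbar*dphi/m)^2 dx
assert abs(T_h - I_h/kappa) < 1e-12
```

The two integrands coincide term by term, so the check holds to machine precision for any choice of p0 and ∇φ(h).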
The normalization-constrained minimum principle for this average energy gives the following stationary SE, including the horizontal kinetic-energy contribution:
δ{E[Ψ] − λ 〈Ψ|Ψ〉}|Ψ0 = 0 or
0 = {E0 + T[φ(h)]}Ψ0 − [ħω̃0] Ψ0, ħω̃0 Ψ0 = ħ {ω0 + T[φ(h)]/ħ} Ψ0 ≡ ħ {ω0 + ω(h)} Ψ0.
This horizontally-generalized stationary SE thus includes the additional wave-number contribution ω(h) = T[φ(h)]/ħ. The DFT minimum principle of Ev[p], equivalent to the ordinary (stationary) SE,
HR0 = {−[ħ2/(2m)]Δ + v} R0 = E0 R0 = ħω0 R0,
determines the optimum probability distribution, popt. = p0 = R02, and energy Eopt. = Ev[p0] = E0, while the equilibrium horizontal (“thermodynamic”) phase is determined by a supplementary IT rule (see Section 3).
In the stationary equilibrium,
∂R0/∂t = ∂φ(h)/∂t = 0 and ∇Φ = ∇φ(h),
and the horizontal velocity of probability flux reflects the gradient of φ(h):
V(h) = dr(h)/dt = (ħ/m) ∇φ(h).
The resultant probability velocity is then exclusively of a horizontal origin,
V0 = (ħ/m)∇Φ = (ħ/m)∇φ(h) ≡ V(h),
and both components of Φ contribute to the resultant phase source in the associated continuity equation:
σΦ = dΦ/dt = dφ0/dt + dφ(h)/dt ≡ σ0 + σh = σ0 + V(h) ⋅∇φ(h) = σ0 + ∇⋅J(h).
Therefore, in the stationary equilibrium, the vertical source of the wavefunction phase remains constant, σ0 = dφ0/dt = −ω0, while the local horizontal-phase source assumes a purely convectional character:
σh = dφ(h)/dt = [dr(h)/dt] ⋅ [∂φ(h)/∂r(h)] = V(h) ⋅∇φ(h) = ∇⋅J(h).
The SE for components of Ψ0 reads:
∂Φ/∂t = ∂φ0/∂t = −ω0 = ħ(2m)−1{R0−1ΔR0 + 2i[∇lnR0]⋅∇φ(h) − [∇φ(h)]2} − v/ħ.
Its imaginary part confirms that V(h) = (ħ/m)∇φ(h), and hence also the associated probability current, j(h) = p0V(h), are indeed perpendicular to the probability gradient ∇p0 = 2R0∇R0,
∇p0 ⋅ V(h) = ∇⋅j(h) = 0.
The real part of Equation (110) generates the associated phase dynamics,
∂Φ/∂t = −ω0 = ħ(2m)−1{R0−1ΔR0 − [∇φ(h)]2} − v/ħ = −∇⋅J(h) + σ0,
where the horizontal phase current J(h) = φ(h)V(h), its divergence ∇⋅J(h) = ∇φ(h)⋅V(h), and the resultant phase source is defined in Equation (108).

9. Conclusions

In this conceptual overview we have first examined the spatial equalization of the electronic phase in molecules, using the (complex) local-energy concept. The real component of E(r, t) was shown to shape the dynamics of wavefunction phase, while the time evolution of the state modulus was shown to be governed by the imaginary component of local energy. In QM the spatial equalization of the local wave-number or phase concepts marks the system stationary state. The resultant IT descriptors, combining the modulus/probability and phase/current contributions, were revisited and the wave-function mapping in QM was compared with the probability scheme of classical IT. The nonclassical, phase/current supplements in the resultant IT measures effectively lower the classical entropic uncertainty and increase the spatial information determinicity of quantum states.
The phase-transformed (“thermodynamic”) states were introduced and their IT optimum phases were determined from the auxiliary entropic principle. These equilibrium states were shown to exhibit the exactly vanishing internal (resultant) IT descriptors of electronic states. Therefore, in statistical mixtures of such states, the overall entropy content reduces to the external ensemble entropy of von Neumann. This brings more consistency into the quantum IT description of open molecular systems. We have also summarized the local continuity relations for the wavefunction components, state physical descriptors, and information densities. They follow from the component-resolved SE and the realization that flows of molecular properties are carried by (convection) fluxes of electronic probability. Therefore, in such treatments, the electrons are carriers of densities of both the system physical and information properties.
The principal variational principle for the minimum of the grand potential was interpreted as an equivalent information rule. In an ensemble description of chemical reactions in the acid–base systems, the populational derivatives of the ensemble-average resultant information were shown to constitute adequate entropic criteria for diagnosing the molecular CT phenomena, fully equivalent to their energy analogs. Latent electronic fluxes in the stationary molecular states were identified. These hidden (“horizontal”) electronic flows, along the constant-probability contours, do not affect the stationary probability distribution and generate velocity vortices in molecules. Using the SE for wavefunction components, their local velocity was related to the “thermodynamic” phase of the phase-transformed equilibrium states.

Funding

This research received no external funding.

Data Availability Statement

Not applicable.

Conflicts of Interest

The author declares no conflict of interest.

References

  1. Prigogine, I. From Being to Becoming: Time and Complexity in the Physical Sciences; W. H. Freeman: San Francisco, CA, USA, 1980. [Google Scholar]
  2. Nalewajski, R.F. Quantum Information Theory of Molecular States; Nova Science Publishers: New York, NY, USA, 2016. [Google Scholar]
  3. Nalewajski, R.F. Information-theoretic descriptors of molecular states and electronic communications between reactants. Entropy 2020, 22, 749. [Google Scholar] [CrossRef]
  4. Nalewajski, R.F. Understanding electronic structure and chemical reactivity: Quantum-information perspective. Appl. Sci. 2019, 9, 1262. [Google Scholar] [CrossRef] [Green Version]
  5. Nalewajski, R.F. Phase equalization, charge transfer, information flows and electron communications in donor-acceptor systems. Appl. Sci. 2020, 10, 3615. [Google Scholar] [CrossRef]
  6. Nalewajski, R.F. Information-theoretic concepts in theory of electronic structure and chemical reactivity. In Chemical Reactivity Theories: Principles and Approaches; Kaya, S., von Szentpály, L., Eds.; Taylor & Francis: London, UK, 2020; in press. [Google Scholar]
  7. Nalewajski, R.F. Phase modeling in donor-acceptor systems, continuity relations and resultant entropy/information descriptors. In Chemical Reactivity Theories: Principles and Approaches; Kaya, S., von Szentpály, L., Eds.; Taylor & Francis: London, UK, 2020; in press. [Google Scholar]
  8. Nalewajski, R.F. Equidensity orbitals in resultant-information description of electronic states. Theor. Chem. Acc. 2019, 138, 108–123. [Google Scholar] [CrossRef] [Green Version]
  9. Nalewajski, R.F. Resultant information description of electronic states and chemical processes. J. Phys. Chem. A 2019, 123, 9737–9752. [Google Scholar] [CrossRef]
  10. Nalewajski, R.F. Exploring molecular equilibria using quantum information measures. Ann. Phys. 2013, 525, 256–268. [Google Scholar] [CrossRef]
  11. Nalewajski, R.F. On phase/current components of entropy/information descriptors of molecular states. Mol. Phys. 2014, 112, 2587–2601. [Google Scholar] [CrossRef]
  12. Nalewajski, R.F. Complex entropy and resultant information measures. J. Math. Chem. 2016, 54, 1777–1782. [Google Scholar] [CrossRef] [Green Version]
  13. Nalewajski, R.F. Quantum information measures and their use in chemistry. Curr. Phys. Chem. 2017, 7, 94–117. [Google Scholar] [CrossRef]
  14. Nalewajski, R.F. Information equilibria, subsystem entanglement, and dynamics of the overall entropic descriptors of molecular electronic structure. J. Mol. Model. 2018, 24, 1–15. [Google Scholar] [CrossRef] [Green Version]
  15. Nalewajski, R.F. Use of Fisher information in quantum chemistry. Int. J. Quantum Chem. 2008, 108, 2230–2252. [Google Scholar] [CrossRef]
  16. Marc, G.; McMillan, W.G. The virial theorem. Adv. Chem. Phys. 1985, 58, 209–361. [Google Scholar]
  17. Ruedenberg, K. The physical nature of the chemical bond. Rev. Mod. Phys. 1962, 34, 326–376. [Google Scholar] [CrossRef]
  18. Kutzelnigg, W. The physical mechanism of the chemical bond. Angew. Chem. Int. Ed. 1973, 12, 546–562. [Google Scholar] [CrossRef]
  19. Feinberg, M.; Ruedenberg, K. Paradoxical role of the kinetic-energy operator in the formation of the covalent bond. J. Chem. Phys. 1971, 54, 1495–1511. [Google Scholar] [CrossRef]
  20. Feinberg, M.J.; Ruedenberg, K. Heteropolar one-electron bond. J. Chem. Phys. 1971, 55, 5805–5818. [Google Scholar] [CrossRef]
  21. Bacskay, G.B.; Nordholm, S.; Ruedenberg, K. The virial theorem and covalent bonding. J. Phys. Chem. A 2018, 122, 7880–7893. [Google Scholar] [CrossRef] [PubMed]
  22. Nalewajski, R.F. On the Fues potential and its improvement. Chem. Phys. 1977, 22, 257–265. [Google Scholar] [CrossRef]
  23. Nalewajski, R.F.; Parr, R.G. Use of the virial theorem in construction of potential energy functions for diatomic molecules. J. Chem. Phys. 1977, 67, 1324–1334. [Google Scholar] [CrossRef]
  24. Nalewajski, R.F. Some implications of the virial theorem for molecular force fields. Chem. Phys. Lett. 1978, 54, 502–505. [Google Scholar] [CrossRef]
  25. Nalewajski, R.F. Use of the virial theorem in construction of potential energy functions for diatomic molecules. 3. Improved potentials from the normalized kinetic field functions. J. Phys. Chem. 1978, 82, 1439–1449. [Google Scholar] [CrossRef]
  26. Nalewajski, R.F. Virial theorem implications for the minimum energy reaction paths. Chem. Phys. 1980, 50, 127–136. [Google Scholar] [CrossRef]
  27. Nalewajski, R.F. Use of the virial theorem in modeling the potential energy surfaces for triatomic collinear reactions. Int. J. Q. Chem. Symp. 1980, 14, 483–492. [Google Scholar] [CrossRef]
  28. Nalewajski, R.F.; Pastewski, R. Normalized kinetic field potentials for the atom-diatom reactions. Testing the collinear surfaces. Int. J. Q. Chem. 2009, 20, 595–610. [Google Scholar] [CrossRef]
  29. Nalewajski, R.F.; Pastewski, R. Normalized kinetic field potentials for atom-diatom reactions. Three-dimensional surfaces from the relaxed bond-energy-bond-order model. J. Phys. Chem. 1981, 85, 3618–3628. [Google Scholar] [CrossRef]
  30. Nalewajski, R.F. Role of electronic kinetic energy and resultant gradient information in chemical reactivity. J. Mol. Model. 2019, 25, 259–278. [Google Scholar] [CrossRef] [Green Version]
  31. Nalewajski, R.F. Independent sources of information-theoretic descriptors of electronic states. Acad. J. Chem. 2020, 5, 106–115. [Google Scholar] [CrossRef]
  32. Khinchin, A.I. Mathematical Foundations of Information Theory; Dover Publications: New York, NY, USA, 1957. [Google Scholar]
  33. von Neumann, J. Mathematical Foundations of Quantum Mechanics; Princeton University Press: Princeton, NJ, USA, 1955. [Google Scholar]
  34. Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423, 623–656. [Google Scholar] [CrossRef] [Green Version]
  35. Shannon, C.E.; Weaver, W. The Mathematical Theory of Communication; University of Illinois Press: Urbana, IL, USA, 1949. [Google Scholar]
  36. Fisher, R.A. Theory of Statistical Estimation. Math. Proc. Camb. Philos. Soc. 1925, 22, 700–725. [Google Scholar] [CrossRef] [Green Version]
  37. Frieden, B.R. Physics from Fisher Information; Cambridge University Press: Cambridge, UK, 2004. [Google Scholar]
  38. López-Rosa, S. Information Theoretic Measures of Atomic and Molecular Systems. Ph.D. Thesis, University of Granada, Granada, Spain, 2010. [Google Scholar]
  39. López-Rosa, S.; Esquivel, R.O.; Angulo, J.C.; Antolín, J.; Dehesa, J.S.; Flores-Gallegos, N. Fisher information study in position and momentum spaces for elementary chemical reactions. J. Chem. Theory Comput. 2010, 6, 145–154. [Google Scholar] [CrossRef]
  40. Esquivel, R.O.; Liu, S.; Angulo, J.C.; Dehesa, J.S.; Antolín, J.; Molina-Espíritu, M. Fisher information and steric effect: Study of the internal rotation barrier of ethane. J. Phys. Chem. A 2011, 115, 4406–4415. [Google Scholar] [CrossRef] [PubMed]
41. Hohenberg, P.; Kohn, W. Inhomogeneous electron gas. Phys. Rev. 1964, 136, B864–B871. [Google Scholar] [CrossRef] [Green Version]
  42. Kohn, W.; Sham, L.J. Self-consistent equations including exchange and correlation effects. Phys. Rev. 1965, 140, A1133–A1138. [Google Scholar] [CrossRef] [Green Version]
  43. Levy, M. Universal variational functionals of electron densities, first-order density matrices, and natural spin-orbitals and solution of the v-representability problem. Proc. Natl. Acad. Sci. USA 1979, 76, 6062–6065. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  44. Parr, R.G.; Yang, W. Density Functional Theory of Atoms and Molecules, 1st ed.; Oxford University Press: Oxford, UK, 1989. [Google Scholar]
  45. Dreizler, R.M.; Gross, E.K.U. Density Functional Theory: An Approach to the Quantum Many-Body Problem; Springer: Berlin/Heidelberg, Germany, 1990. [Google Scholar]
46. Nalewajski, R.F. (Ed.) Density Functional Theory I–IV; Topics in Current Chemistry, Vols. 180–183; Springer: Heidelberg, Germany, 1996. [Google Scholar]
  47. Davydov, A.S. Quantum Mechanics; Pergamon Press: Oxford, UK, 1965. [Google Scholar]
  48. Cohen-Tannoudji, C.; Diu, B.; Laloë, F. Quantum Mechanics; Wiley: New York, NY, USA, 1977. [Google Scholar]
  49. Nalewajski, R.F. Interacting subsystems and their molecular ensembles. Acad. J. Chem. 2020, 5, 25–30. [Google Scholar] [CrossRef]
  50. Nalewajski, R.F. Continuity relations, probability acceleration, current sources and internal communications in interacting fragments. Acad. J. Chem. 2020, 5, 58–68. [Google Scholar] [CrossRef]
  51. Nalewajski, R.F. Quantum information perspective on chemical reactivity. In Mathematics Applied to Engineering in Action: Advanced Theories, Methods, and Models; Islam, N., Singh, S.B., Ranjan, P., Haghi, A.K., Eds.; Apple Academic Press: Palm Bay, FL, USA, 2021; pp. 1–40. [Google Scholar]
52. Nalewajski, R.F. N-dependence of electronic energies in atoms and molecules: Mulliken and exponential interpolations. J. Math. Chem. 2010, 47, 1068–1076. [Google Scholar] [CrossRef]
  53. Nalewajski, R.F. Sensitivity analysis of charge transfer systems: In situ quantities, intersecting state model and its implications. Int. J. Quantum Chem. 1994, 49, 675–703. [Google Scholar] [CrossRef]
  54. Gyftopoulos, E.P.; Hatsopoulos, G.N. Quantum-thermodynamic definition of electronegativity. Proc. Natl. Acad. Sci. USA 1968, 60, 786–793. [Google Scholar] [CrossRef] [Green Version]
  55. Perdew, J.P.; Parr, R.G.; Levy, M.; Balduz, J.L. Density-functional theory for fractional particle number: Derivative discontinuities of the energy. Phys. Rev. Lett. 1982, 49, 1691–1694. [Google Scholar] [CrossRef]
  56. Nalewajski, R.F.; Korchowiec, J.; Michalak, A. Reactivity criteria in charge sensitivity analysis. Top. Curr. Chem. 1996, 183, 25–141. [Google Scholar]
57. Nalewajski, R.F.; Korchowiec, J. Charge Sensitivity Approach to Electronic Structure and Chemical Reactivity; World Scientific: Singapore, 1997. [Google Scholar]
  58. Nalewajski, R.F. Chemical reactivity concepts in charge sensitivity analysis. Int. J. Quantum Chem. 1994, 56, 453–476. [Google Scholar] [CrossRef]
  59. Geerlings, P.; De Proft, F.; Langenaeker, W. Conceptual density functional theory. Chem. Rev. 2003, 103, 1793–1873. [Google Scholar] [CrossRef]
60. Mulliken, R.S. A new electroaffinity scale; together with data on valence states and on valence ionization potentials and electron affinities. J. Chem. Phys. 1934, 2, 782–793. [Google Scholar] [CrossRef]
  61. Iczkowski, R.P.; Margrave, J.L. Electronegativity. J. Am. Chem. Soc. 1961, 83, 3547–3551. [Google Scholar] [CrossRef]
  62. Parr, R.G.; Donnelly, R.A.; Levy, M.; Palke, W.E. Electronegativity: The density functional viewpoint. J. Chem. Phys. 1978, 69, 4431–4439. [Google Scholar] [CrossRef]
  63. Parr, R.G.; Pearson, R.G. Absolute hardness: Companion parameter to absolute electronegativity. J. Am. Chem. Soc. 1983, 105, 7512–7516. [Google Scholar] [CrossRef]
  64. Parr, R.G.; Yang, W. Density functional approach to the frontier-electron theory of chemical reactivity. J. Am. Chem. Soc. 1984, 106, 4049–4050. [Google Scholar] [CrossRef]
  65. Nalewajski, R.F. On entropy/information description of reactivity phenomena. In Advances in Mathematics Research; Baswell, A.R., Ed.; Nova Science Publishers: New York, NY, USA, 2019; Volume 26, pp. 97–157. [Google Scholar]
  66. Nalewajski, R.F. Resultant information approach to donor-acceptor systems. In An Introduction to Electronic Structure Theory; Paulsen, N.T., Ed.; Nova Science Publishers: New York, NY, USA, 2020; pp. 1–58. [Google Scholar]
  67. Nalewajski, R.F. Information Theory of Molecular Systems; Elsevier BV: Amsterdam, The Netherlands, 2006. [Google Scholar]
  68. Nalewajski, R.F. Information Origins of the Chemical Bond; Nova Science Publishers: New York, NY, USA, 2010. [Google Scholar]
  69. Nalewajski, R.F. Perspectives in Electronic Structure Theory; Springer: Heidelberg, Germany, 2012. [Google Scholar]
  70. Callen, H.B. Thermodynamics: An Introduction to the Physical Theories of Equilibrium Thermostatics and Irreversible Thermodynamics; Wiley: New York, NY, USA, 1962. [Google Scholar]
  71. Lugt, H.J. Vortex Flow in Nature and Technology; Wiley: New York, NY, USA, 1983. [Google Scholar]
Figure 1. Classical (probability) and quantum (wavefunction) information schemes in molecular QM. The quantum mapping {r → ψ(r)} implies both the classical attribution {r → p(r)} and the nonclassical attributions {r → [φ(r), j(r) or V(r)]}.
Figure 2. Local “vertical” (v) and “horizontal” (h) directions.
Figure 3. Schematic diagrams of atomic and molecular vortices of “horizontal” flows of electronic probability density in atomic fragments of diatomic promolecule M0, the polarized system M+, and in molecule M.
Table 1. Summary of wavefunction components of the quantum state |ψ(t)〉 of an electron, their dynamics, physical descriptors and local sources.
Schrödinger equation: H|ψ(t)〉 = iħ ∂|ψ(t)〉/∂t
Wavefunction: ψ[r(t), t] = 〈r(t)|ψ(t)〉 ≡ ψ(r, t) = R(r, t) exp[iφ(r, t)]
 modulus: R(r, t), ∂R(r, t)/∂t = −V(r, t)⋅∇R(r, t)
 phase: φ(r, t), ∂φ(r, t)/∂t = ħ(2m)⁻¹ {R(r, t)⁻¹ ΔR(r, t) − [∇φ(r, t)]²} − v(r)/ħ
 time dependence: explicit, due to |ψ(t)〉, and implicit, due to |r(t)〉
 logarithm: lnψ(r, t) = lnR(r, t) + iφ(r, t) = ½ lnp(r, t) + iφ(r, t)
Descriptors of the electron probability density p(r, t) = R(r, t)²:
 current: j(r, t) = (ħ/m) p(r, t) ∇φ(r, t) = p(r, t) V(r, t)
 velocity: V(r, t) ≡ j(r, t)/p(r, t), ∇⋅V(r, t) = (ħ/m) Δφ(r, t) = 0
 acceleration: a(r, t) = dV(r, t)/dt = (ħ/m) ∇σ_φ(r, t)
 force: F(r, t) = m a(r, t) ≡ −∇W(r, t)
 potential: W(r, t) = −∫F(r, t)⋅dr = −ħ σ_φ(r, t)
Resultant gradient information: I[ψ] = ∫p(r, t) {[∇lnp(r, t)]² + 4[∇φ(r, t)]²} dr ≡ ∫p(r, t) I(r, t) dr
Convection operator: V(r, t)⋅∇ = d/dt − ∂/∂t
Sources:
 probability: σ_p(r, t) = dp(r, t)/dt = ∂p(r, t)/∂t + ∇⋅j(r, t) = 0
 phase: σ_φ(r, t) = dφ(r, t)/dt = ∂φ(r, t)/∂t + ∇⋅J(r, t) = ħ(2m)⁻¹ {R(r, t)⁻¹ ΔR(r, t) + [∇φ(r, t)]²} − v(r)/ħ, with the phase current J(r, t) = φ(r, t) V(r, t)
 current: σ_J(r, t) = dJ(r, t)/dt = σ_φ(r, t) V(r, t) + φ(r, t) a(r, t)
 information: σ_I(t) = κ ∫j(r, t)⋅∇v(r) dr = κħ ∫j(r, t)⋅∇σ_φ(r, t) dr, κ = 8m/ħ²
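The resultant gradient information of Table 1 lends itself to a direct grid check. The sketch below (not from the article; the wave packet, the parameters a and k, and the grid are illustrative assumptions) uses a 1D Gaussian wave packet ψ(x) = (2a/π)^(1/4) exp(−ax²) exp(ikx), for which the classical (Fisher) term integrates analytically to 4a, the nonclassical (phase/current) term to 4k², and hence I = 4(a + k²) = (8m/ħ²)T:

```python
import numpy as np

# Hypothetical test case: resultant gradient information
#   I[psi] = ∫ p {(∇ln p)² + 4(∇φ)²} dx
# for psi(x) = (2a/pi)^(1/4) exp(-a x²) exp(i k x); analytic I = 4(a + k²).
a, k = 0.7, 1.3                        # Gaussian width and phase-gradient parameters
x = np.linspace(-12.0, 12.0, 20001)
dx = x[1] - x[0]

p = np.sqrt(2 * a / np.pi) * np.exp(-2 * a * x**2)  # probability density, ∫p dx = 1
phi = k * x                                         # linear phase, ∇φ = k

grad_lnp = np.gradient(np.log(p), x)   # ∇ln p = -4 a x (central differences)
grad_phi = np.gradient(phi, x)         # ∇φ = k

I_classical = np.sum(p * grad_lnp**2) * dx          # Fisher (probability) term -> 4a
I_nonclassical = np.sum(p * 4 * grad_phi**2) * dx   # phase/current term        -> 4k²
I_total = I_classical + I_nonclassical              # resultant information     -> 4(a + k²)

print(round(I_classical, 4), round(I_nonclassical, 4), round(I_total, 4))
```

The split of I_total into its two terms mirrors the classical/nonclassical decomposition emphasized in the abstract: switching off the phase (k = 0) leaves only the Fisher contribution of the probability distribution.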