Optimal Nonlinear Estimation in Statistical Manifolds with Application to Sensor Network Localization
*Entropy* **2017**, *19*(7), 308; doi:10.3390/e19070308 - 28 June 2017
**Abstract**

Information geometry enables a deeper understanding of the methods of statistical inference. In this paper, the problem of nonlinear parameter estimation is considered from a geometric viewpoint using natural gradient descent on statistical manifolds. It is demonstrated that nonlinear estimation for curved exponential families can be viewed simply as a deterministic optimization problem with respect to the structure of a statistical manifold. In this way, information geometry offers an elegant geometric interpretation of the solution to the estimator, as well as of the convergence of gradient-based methods. The theory is illustrated via the analysis of a distributed mote network localization problem in which Radio Interferometric Positioning System (RIPS) measurements are used for free mote location estimation. The results demonstrate the computational advantages of the presented methodology.
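The natural gradient idea in this abstract can be sketched in a few lines: precondition the ordinary log-likelihood gradient by the inverse Fisher information. The toy model below (estimating a Gaussian mean with known variance, where the Fisher information is simply 1/σ²) is an assumed illustration, not the paper's RIPS setup.

```python
import numpy as np

def natural_gradient_mean(samples, sigma=2.0, lr=0.5, steps=50):
    # Natural-gradient ascent on the average log-likelihood of a Gaussian
    # mean with known variance. The Fisher information for mu is 1/sigma^2,
    # so the natural gradient rescales the ordinary gradient by sigma^2.
    mu = 0.0
    n = len(samples)
    for _ in range(steps):
        grad = np.sum(samples - mu) / sigma**2 / n   # ordinary gradient
        mu += lr * sigma**2 * grad                   # precondition by F^{-1}
    return mu

rng = np.random.default_rng(0)
data = rng.normal(3.0, 2.0, size=1000)
est = natural_gradient_mean(data)   # converges to the sample mean
```

For this flat (non-curved) toy family the natural-gradient iteration converges geometrically to the maximum-likelihood estimate, which is the deterministic-optimization view the abstract describes.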
Full article

A Novel Geometric Dictionary Construction Approach for Sparse Representation Based Image Fusion
*Entropy* **2017**, *19*(7), 306; doi:10.3390/e19070306 - 27 June 2017
**Abstract**

Sparse-representation based approaches have been integrated into image fusion methods in the past few years and have shown great performance in image fusion. Training an informative and compact dictionary is a key step for a sparsity-based image fusion method. However, it is difficult to balance “informative” and “compact”. In order to obtain sufficient information for sparse representation in dictionary construction, this paper classifies image patches from the source images into different groups based on morphological similarities. Stochastic coordinate coding (SCC) is used to extract the corresponding image-patch information for dictionary construction. Using the constructed dictionary, the image patches of the source images are converted to sparse coefficients by the simultaneous orthogonal matching pursuit (SOMP) algorithm. Finally, the sparse coefficients are fused by the Max-L1 fusion rule and inverted to obtain a fused image. Comparison experiments are conducted to evaluate the fused images in terms of image features, information, structural similarity, and visual perception. The results confirm the feasibility and effectiveness of the proposed image fusion solution.
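The Max-L1 fusion rule mentioned above can be illustrated with a toy sketch: for each patch, keep the sparse coefficient vector (from either source image) with the larger l1 norm, then reconstruct from the shared dictionary. The dictionary and codes below are made up for illustration.

```python
import numpy as np

def max_l1_fuse(coeffs_a, coeffs_b):
    # coeffs_*: (n_atoms, n_patches) sparse coefficient matrices.
    # For each patch (column), pick the coefficient vector with larger l1 norm.
    pick_a = np.abs(coeffs_a).sum(axis=0) >= np.abs(coeffs_b).sum(axis=0)
    return np.where(pick_a, coeffs_a, coeffs_b)

D = np.eye(2)                            # toy 2-atom dictionary
ca = np.array([[0.9, 0.1], [0.0, 0.0]])  # patch codes from source image A
cb = np.array([[0.2, 0.0], [0.0, 0.8]])  # patch codes from source image B
fused = D @ max_l1_fuse(ca, cb)          # reconstruct fused patches
```

Patch 0 is taken from image A (l1 norm 0.9 vs. 0.2) and patch 1 from image B (0.8 vs. 0.1), mirroring the "keep the more active representation" intuition behind Max-L1.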
Full article

Effects of Task Demands on Kinematics and EMG Signals during Tracking Tasks Using Multiscale Entropy
*Entropy* **2017**, *19*(7), 307; doi:10.3390/e19070307 - 27 June 2017
**Abstract**

Target-directed elbow movements are essential in daily life; however, how different task demands affect motor control is seldom reported. In this study, the relationship between task demands and the complexity of kinematics and electromyographic (EMG) signals in healthy young individuals was investigated. Tracking tasks with four levels of task demands were designed, and participants were instructed to track the target trajectories by extending or flexing their elbow joint. The actual trajectories and EMG signals from the biceps and triceps were recorded simultaneously. Multiscale fuzzy entropy was utilized to analyze the complexity of the actual trajectories and EMG signals over multiple time scales. Results showed that the complexity of both the actual trajectories and the EMG signals increased as task demands increased. As the time scale increased, there was a monotonic rise in the complexity of the actual trajectories, while the complexity of the EMG signals rose first and then fell. Noise abatement may account for the decreasing entropy of the EMG signals at larger time scales. This study confirmed the uniqueness of multiscale entropy, which may be useful in the analysis of electrophysiological signals.
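The multiscale procedure described above amounts to coarse-graining the signal at each scale and then estimating an entropy. The sketch below uses classical sample entropy rather than the fuzzy variant used in the paper (fuzzy entropy replaces the hard tolerance with a smooth membership function), with the commonly assumed parameters m = 2 and r = 0.2·std.

```python
import numpy as np

def coarse_grain(x, scale):
    # Average consecutive non-overlapping windows of length `scale`.
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

def sample_entropy(x, m=2, r=0.2):
    # Classical sample entropy: -ln(A/B), where B counts template pairs of
    # length m within tolerance r (Chebyshev distance), and A does the same
    # for length m + 1.
    x = np.asarray(x, float)
    r *= x.std()
    def count(mm):
        templ = np.array([x[i:i + mm] for i in range(len(x) - m)])
        d = np.abs(templ[:, None] - templ[None, :]).max(axis=2)
        return (d <= r).sum() - len(templ)   # exclude self-matches
    return -np.log(count(m + 1) / count(m))

noise = np.random.default_rng(0).normal(size=1000)
entropies = [sample_entropy(coarse_grain(noise, s)) for s in (1, 2)]
```

Running the estimator over several scales of the coarse-grained signal yields the multiscale entropy curve the abstract analyzes.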
Full article

Complex Dynamics of an SIR Epidemic Model with Nonlinear Saturate Incidence and Recovery Rate
*Entropy* **2017**, *19*(7), 305; doi:10.3390/e19070305 - 27 June 2017
**Abstract**

Susceptible-infectious-removed (SIR) epidemic models are proposed to consider the impact of the available resources of the public health care system, in terms of the number of hospital beds. Both the incidence rate and the recovery rate are considered as nonlinear functions of the number of infectious individuals, and the recovery rate incorporates the influence of the number of hospital beds. It is shown that backward bifurcation and saddle-node bifurcation may occur when the number of hospital beds is insufficient. In such cases, it is critical to provide an appropriate number of hospital beds, because merely reducing the basic reproduction number below unity is not enough to eradicate the disease. When the basic reproduction number is larger than unity, the model may undergo forward bifurcation and Hopf bifurcation. Increasing the number of hospital beds can decrease the number of infectious individuals, but it cannot, by itself, eliminate the disease. Therefore, maintaining enough hospital beds is important for the prevention and control of infectious disease. Numerical simulations are presented to illustrate and complement the theoretical analysis.
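A minimal numerical sketch of such a model follows, with assumed functional forms: a saturated incidence βSI/(1 + αI) and a recovery rate that falls from μ1 toward μ0 as the infectious load exceeds the b available beds. The paper's exact functional forms and parameter values may differ; the forward-Euler step is purely illustrative.

```python
def simulate_sir(beta=0.4, alpha=0.1, mu0=0.05, mu1=0.3, b=5.0,
                 S0=95.0, I0=5.0, dt=0.01, t_end=200.0):
    # Forward-Euler integration of an SIR model with saturated incidence
    # and a bed-limited recovery rate (illustrative forms, see lead-in).
    S, I, R = S0, I0, 0.0
    for _ in range(int(t_end / dt)):
        mu = mu0 + (mu1 - mu0) * b / (I + b)       # bed-limited recovery
        new_inf = beta * S * I / (1.0 + alpha * I)  # saturated incidence
        dS, dI, dR = -new_inf, new_inf - mu * I, mu * I
        S, I, R = S + dt * dS, I + dt * dI, R + dt * dR
    return S, I, R

final_state = simulate_sir()   # total population is conserved
```

Varying b in such a simulation shows the qualitative effect the abstract describes: more beds speed recovery and lower the infectious peak.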
Full article

Large Scale Emerging Properties from Non Hamiltonian Complex Systems
*Entropy* **2017**, *19*(7), 302; doi:10.3390/e19070302 - 26 June 2017
**Abstract**

The concept of “large scale” depends obviously on the phenomenon we are interested in. For example, in the field of the foundation of Thermodynamics from microscopic dynamics, the relevant large spatial and time scales are on the order of fractions of millimetres and microseconds, respectively, or smaller, and are defined in relation to the spatial and time scales of the microscopic systems. In large scale oceanography or global climate dynamics problems, the spatial scales of interest are on the order of thousands of kilometres and the time scales span many years, to be compared with the local and daily/monthly scales of atmosphere and ocean dynamics. In all these cases a Zwanzig projection approach is, at least in principle, an effective tool to obtain classes of universal smooth “large scale” dynamics for the few degrees of freedom of interest, starting from the complex dynamics of the whole (usually many-degrees-of-freedom) system. The projection approach leads to a very complex calculus with differential operators, which is drastically simplified when the basic dynamics of the system of interest is Hamiltonian, as happens in foundation of Thermodynamics problems. However, in geophysical Fluid Dynamics, Biology, and most physical problems, the fundamental building-block equations of motion have a non-Hamiltonian structure. Thus, to continue to apply the useful projection approach in these cases as well, we exploit the generalization of the Hamiltonian formalism given by the Lie algebra of dissipative differential operators. In this way, we are able to deal analytically with the series of differential operators stemming from the projection approach applied to these general cases. We then apply this formalism to obtain some relevant results concerning the statistical properties of the El Niño Southern Oscillation (ENSO).
Full article

Quantum Probabilities and Maximum Entropy
*Entropy* **2017**, *19*(7), 304; doi:10.3390/e19070304 - 26 June 2017
**Abstract**
Probabilities in quantum physics can be shown to originate from a maximum entropy principle.
Full article

Node Importance Ranking of Complex Networks with Entropy Variation
*Entropy* **2017**, *19*(7), 303; doi:10.3390/e19070303 - 26 June 2017
**Abstract**

The heterogeneous nature of a complex network means that the roles played by its individual nodes differ considerably. Mechanisms of complex networks such as spreading dynamics, cascading reactions, and network synchronization are strongly affected by a tiny fraction of so-called important nodes. Node importance ranking is thus of great theoretical and practical significance. Network entropy is usually utilized to characterize the amount of information encoded in the network structure and to measure the structural complexity at the graph level. We find that entropy can also serve as a local-level metric to quantify node importance. We propose an entropic metric, Entropy Variation, which defines the importance of a node as the variation of network entropy before and after its removal, under the assumption that the removal of a more important node causes more structural variation. Like other state-of-the-art methods for ranking node importance, the proposed entropic metric also utilizes structural information, but at the system level rather than the local level. Empirical investigations on real-life networks, the Snake Idioms Network, and several other well-known networks demonstrate the superiority of the proposed entropic metric, which notably outperforms other centrality metrics in identifying the top-*k* most important nodes.
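The remove-and-recompute idea can be sketched directly. The sketch below assumes one common choice of network entropy, the Shannon entropy of the normalized degree sequence; the paper's exact entropy definition may differ, but the ranking recipe (entropy variation under node removal) is the one described above.

```python
import math

def degree_entropy(adj):
    # Shannon entropy of the normalized degree sequence (assumed definition).
    deg = [len(ns) for ns in adj.values()]
    total = sum(deg)
    if total == 0:
        return 0.0
    return -sum((d / total) * math.log(d / total) for d in deg if d)

def entropy_variation_ranking(adj):
    # Score each node by |H(G) - H(G without the node)|, rank descending.
    base = degree_entropy(adj)
    scores = {}
    for v in adj:
        rest = {u: ns - {v} for u, ns in adj.items() if u != v}
        scores[v] = abs(base - degree_entropy(rest))
    return sorted(scores, key=scores.get, reverse=True)

# Star graph: removing the hub changes the structure most, so it ranks first.
star = {0: {1, 2, 3, 4}, 1: {0}, 2: {0}, 3: {0}, 4: {0}}
ranking = entropy_variation_ranking(star)
```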
Full article

Entropic Measure of Epistemic Uncertainties in Multibody System Models by Axiomatic Design
*Entropy* **2017**, *19*(7), 291; doi:10.3390/e19070291 - 26 June 2017
**Abstract**

In this paper, the use of the MaxInf Principle in real optimization problems is investigated for engineering applications, where the current design solution is actually an engineering approximation. In industrial manufacturing, multibody system simulations can be used to develop new machines and mechanisms through virtual prototyping, where axiomatic design can be employed to analyze the independence of elements and the complexity of the connections forming a general mechanical system. In the classic theories of Fisher and Wiener-Shannon, information is a measure of only probabilistic and repetitive events. However, the idea of information is broader than the field of probability alone. Thus, the Wiener-Shannon axioms can be extended to non-probabilistic events, and it is possible to introduce a theory of information for non-repetitive events as a measure of the reliability of data for complex mechanical systems. To this end, one can devise engineering solutions consistent with the values of the design constraints by analyzing the complexity of the relation matrix and using the idea of information in the metric space. The final solution gives the entropic measure of epistemic uncertainties, which can be used in multibody system models analyzed with axiomatic design.
Full article

Test of the Pauli Exclusion Principle in the VIP-2 Underground Experiment
*Entropy* **2017**, *19*(7), 300; doi:10.3390/e19070300 - 24 June 2017
**Abstract**

The validity of the Pauli exclusion principle—a building block of Quantum Mechanics—is tested for electrons. The VIP (violation of Pauli exclusion principle) experiment and its follow-up VIP-2 experiment at the Laboratori Nazionali del Gran Sasso search for X-rays from copper atomic transitions that are prohibited by the Pauli exclusion principle. The candidate events—if they exist—originate from the transition of a $2p$ electron to the ground state, which is already occupied by two electrons. The present limit on the probability of a Pauli exclusion principle violation for electrons, set by the VIP experiment, is $4.7\times {10}^{-29}$. We report a first result from the VIP-2 experiment that improves on the VIP limit, a step toward the final goal of a two-orders-of-magnitude gain in the long run.
Full article

Measurement Uncertainty Relations for Position and Momentum: Relative Entropy Formulation
*Entropy* **2017**, *19*(7), 301; doi:10.3390/e19070301 - 24 June 2017
**Abstract**

Heisenberg’s uncertainty principle has recently led to general measurement uncertainty relations for quantum systems: incompatible observables can be measured jointly or in sequence only with some unavoidable approximation, which can be quantified in various ways. The relative entropy is the natural theoretical quantifier of the information loss when a ‘true’ probability distribution is replaced by an approximating one. In this paper, we provide a lower bound for the amount of information that is lost by replacing the distributions of the sharp position and momentum observables, as they could be obtained with two separate experiments, by the marginals of any smeared joint measurement. The bound is obtained by introducing an entropic error function, and optimizing it over a suitable class of covariant approximate joint measurements. We work out two cases of target observables in full: (1) *n*-dimensional position and momentum vectors; (2) two components of position and momentum along different directions. In (1), we connect the quantum bound to the dimension *n*; in (2), going from parallel to orthogonal directions, we show the transition from highly incompatible observables to compatible ones. For simplicity, we develop the theory only for Gaussian states and measurements.
Full article

The Legendre Transform in Non-Additive Thermodynamics and Complexity
*Entropy* **2017**, *19*(7), 298; doi:10.3390/e19070298 - 23 June 2017
**Abstract**

We present an argument which purports to show that the use of the standard Legendre transform in non-additive Statistical Mechanics is not appropriate. For concreteness, we use as a paradigm the case of systems conjecturally described by the (non-additive) Tsallis entropy. We point out the form of the modified Legendre transform that should be used instead in the non-additive thermodynamics induced by the Tsallis entropy. We comment on more general implications of this proposal for the thermodynamics of “complex systems”.
Full article

Critical Behavior in Physics and Probabilistic Formal Languages
*Entropy* **2017**, *19*(7), 299; doi:10.3390/e19070299 - 23 June 2017
**Abstract**

We show that the mutual information between two symbols, as a function of the number of symbols between the two, decays exponentially in any probabilistic regular grammar, but can decay like a power law for a context-free grammar. This result about formal languages is closely related to a well-known result in classical statistical mechanics that there are no phase transitions in dimensions fewer than two. It is also related to the emergence of power law correlations in turbulence and cosmological inflation through recursive generative processes. We elucidate these physics connections and comment on potential applications of our results to machine learning tasks like training artificial recurrent neural networks. Along the way, we introduce a useful quantity, which we dub the rational mutual information, and discuss generalizations of our claims involving more complicated Bayesian networks.
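The regular-grammar half of the claim is easy to check numerically: symbols generated by a Markov chain (equivalently, a probabilistic regular grammar) have mutual information I(X_0; X_k) that decays exponentially with separation k. A small sketch with an illustrative two-state chain:

```python
import numpy as np

def mutual_information(T, k):
    # T: row-stochastic transition matrix. Stationary distribution is the
    # leading left eigenvector; the joint law of (X_0, X_k) is pi_i (T^k)_ij.
    evals, evecs = np.linalg.eig(T.T)
    pi = np.real(evecs[:, np.argmax(np.real(evals))])
    pi /= pi.sum()
    Tk = np.linalg.matrix_power(T, k)
    joint = pi[:, None] * Tk                 # P(X_0 = i, X_k = j)
    indep = pi[:, None] * pi[None, :]        # product of marginals
    mask = joint > 0
    return float((joint[mask] * np.log(joint[mask] / indep[mask])).sum())

T = np.array([[0.9, 0.1], [0.2, 0.8]])       # illustrative two-state chain
mis = [mutual_information(T, k) for k in (1, 2, 4)]
```

The decay rate is governed by the subleading eigenvalue of T (here 0.7), which is the exponential behavior contrasted with the power-law decay possible for context-free grammars.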
Full article

Two Approaches to Obtaining the Space-Time Fractional Advection-Diffusion Equation
*Entropy* **2017**, *19*(7), 297; doi:10.3390/e19070297 - 23 June 2017
**Abstract**

Two approaches resulting in two different generalizations of the space-time-fractional advection-diffusion equation are discussed. The Caputo time-fractional derivative and Riesz fractional Laplacian are used. The fundamental solutions to the corresponding Cauchy and source problems in the case of one spatial variable are studied using the Laplace transform with respect to time and the Fourier transform with respect to the spatial coordinate. The numerical results are illustrated graphically.
Full article

Analytical Approximate Solutions of (*n* + 1)-Dimensional Fractal Heat-Like and Wave-Like Equations
*Entropy* **2017**, *19*(7), 296; doi:10.3390/e19070296 - 22 June 2017
**Abstract**

In this paper, we propose a new type of (*n* + 1)-dimensional reduced differential transform method (RDTM) based on a local fractional derivative (LFD) to solve (*n* + 1)-dimensional local fractional partial differential equations (PDEs) in Cantor sets. The presented method is named the (*n* + 1)-dimensional local fractional reduced differential transform method (LFRDTM). First, the theory, its proofs, and some basic properties of the procedure are given. To illustrate the introduced method clearly, we apply it to the (*n* + 1)-dimensional fractal heat-like equations (HLEs) and wave-like equations (WLEs). The applications show that this new technique is efficient, simple to apply, and effective for (*n* + 1)-dimensional local fractional problems.
Full article

An Exploration Algorithm for Stochastic Simulators Driven by Energy Gradients
*Entropy* **2017**, *19*(7), 294; doi:10.3390/e19070294 - 22 June 2017
**Abstract**

In recent work, we have illustrated the construction of an exploration geometry on free energy surfaces: the adaptive, computer-assisted discovery of an approximate low-dimensional manifold on which the effective dynamics of the system evolves. Constructing such an exploration geometry involves geometry-biased sampling (through both appropriately-initialized unbiased molecular dynamics and restraining potentials) and machine learning techniques to organize the intrinsic geometry of the data resulting from the sampling (in particular, diffusion maps, possibly enhanced through an appropriate Mahalanobis-type metric). In this contribution, we detail a method for exploring the conformational space of a stochastic gradient system whose effective free energy surface depends on a smaller number of degrees of freedom than the dimension of the phase space. Our approach comprises two steps. First, we study the local geometry of the free energy landscape using diffusion maps on samples computed through stochastic dynamics. This allows us to automatically identify the relevant coarse variables. Next, we use the information garnered in the previous step to construct a new set of initial conditions for subsequent trajectories. These initial conditions are computed so as to explore the accessible conformational space more efficiently than by continuing the previous, unbiased simulations. We showcase this method on a representative test system.
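The diffusion-maps step described above can be sketched as follows: build a Gaussian kernel on pairwise distances, normalize it into a Markov matrix, and take the leading non-trivial eigenvectors as coarse variables. The kernel bandwidth and the toy one-dimensional data set are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def diffusion_map(X, eps=1.0, n_coords=2):
    # Pairwise squared distances and Gaussian kernel.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / eps)
    P = K / K.sum(axis=1, keepdims=True)      # row-stochastic Markov matrix
    evals, evecs = np.linalg.eig(P)
    order = np.argsort(-np.real(evals))
    idx = order[1:1 + n_coords]               # skip the trivial constant mode
    return np.real(evecs[:, idx] * evals[idx])

X = np.linspace(0.0, 5.0, 20)[:, None]        # toy 1-D point cloud
coords = diffusion_map(X)
```

For data sampled along a curve, the first diffusion coordinate orders the points along that curve, which is how the method identifies a coarse variable automatically.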
Full article

Bodily Processing: The Role of Morphological Computation
*Entropy* **2017**, *19*(7), 295; doi:10.3390/e19070295 - 22 June 2017
**Abstract**

The integration of embodied and computational approaches to cognition requires that non-neural body parts be described as parts of a computing system which realizes cognitive processing. In this paper, based on research on morphological computation and the ecology of vision, I argue that non-neural body parts could be described as parts of a computational system, but that they do not realize computation autonomously; they do so only in connection with some kind of central control system, even in its simplest form. Finally, I integrate the proposal defended in the paper with the contemporary mechanistic approach to wide computation.
Full article

The Entropic Linkage between Equity and Bond Market Dynamics
*Entropy* **2017**, *19*(6), 292; doi:10.3390/e19060292 - 21 June 2017
**Abstract**

An alternative derivation of the yield curve, based on entropy or the loss of information as it is communicated through time, is introduced. Given this focus on entropy growth in communication, the Shannon entropy will be utilized. Additionally, Shannon entropy’s close relationship to the Kullback–Leibler divergence is used to provide a more precise understanding of this new yield curve. The derivation of the entropic yield curve is completed with the use of the Burnashev reliability function, which serves as a weighting between the true and error distributions. The deep connections between the entropic yield curve and the popular Nelson–Siegel specification are also examined. Finally, this entropically derived yield curve is used to provide an estimate of the economy’s implied information processing ratio. This information-theoretic ratio offers a new causal link between bond and equity markets, and is a valuable new tool for the modeling and prediction of stock market behavior.
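The two information measures the construction rests on, Shannon entropy and the Kullback–Leibler divergence between a true and an approximating distribution, can be computed as follows. This is a background sketch of the standard definitions, not the entropic yield curve itself; the example distributions are made up.

```python
import numpy as np

def shannon_entropy(p):
    # H(p) = -sum p_i log2 p_i, in bits; zero-probability terms contribute 0.
    p = np.asarray(p, float)
    return -np.sum(p[p > 0] * np.log2(p[p > 0]))

def kl_divergence(p, q):
    # D(p || q) = sum p_i log2(p_i / q_i): information lost when q replaces p.
    p, q = np.asarray(p, float), np.asarray(q, float)
    m = p > 0
    return np.sum(p[m] * np.log2(p[m] / q[m]))

true_dist = [0.5, 0.25, 0.25]     # illustrative 'true' distribution
approx = [1 / 3, 1 / 3, 1 / 3]    # illustrative approximating distribution
```

The divergence is non-negative and vanishes only when the approximation matches the true distribution, which is what makes it a natural quantifier of information loss.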
Full article

Enthalpy of Mixing in Al–Tb Liquid
*Entropy* **2017**, *19*(6), 290; doi:10.3390/e19060290 - 21 June 2017
**Abstract**

The liquid-phase enthalpy of mixing for Al–Tb alloys is measured for 3, 5, 8, 10, and 20 at% Tb at selected temperatures in the range from 1364 to 1439 K. Methods include isothermal solution calorimetry and isoperibolic electromagnetic levitation drop calorimetry. Mixing enthalpy is determined relative to the unmixed pure (Al and Tb) components. The required formation enthalpy for the Al_{3}Tb phase is computed from first-principles calculations. Based on our measurements, three different semi-empirical solution models are offered for the excess free energy of the liquid, including regular, subregular, and associate model formulations. These models are also compared with the Miedema model prediction of mixing enthalpy.
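The regular and subregular solution models named above have simple closed forms that can be sketched directly. The interaction parameters below are made up for illustration; the paper's fitted Al–Tb values are not reproduced here.

```python
def regular_mixing(x_b, omega=-80000.0):
    # Regular model: dH_mix = omega * x_A * x_B, a single
    # composition-independent interaction parameter (J/mol, illustrative).
    return omega * x_b * (1.0 - x_b)

def subregular_mixing(x_b, l0=-80000.0, l1=20000.0):
    # Subregular (two-term Redlich-Kister) model:
    # dH_mix = x_A * x_B * (L0 + L1 * (x_A - x_B)), asymmetric in composition.
    x_a = 1.0 - x_b
    return x_a * x_b * (l0 + l1 * (x_a - x_b))

dh_regular = regular_mixing(0.2)      # at 20 at% solute
dh_subreg = subregular_mixing(0.2)
```

The regular form is symmetric about the equiatomic composition, while the subregular form is not; that asymmetry is what the extra Redlich-Kister term buys when fitting calorimetry data.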
Full article

The Mehler-Fock Transform in Signal Processing
*Entropy* **2017**, *19*(6), 289; doi:10.3390/e19060289 - 20 June 2017
**Abstract**

Many signals can be described as functions on the unit disk (ball). In the framework of group representations it is well known how to construct Hilbert spaces of such functions that have the groups SU(1,N) as their symmetry groups. One illustration of this construction is three-dimensional color spaces, in which chroma properties are described by points on the unit disk. A combination of principal component analysis and the Perron-Frobenius theorem can be used to show that perspective projections map positive signals (i.e., functions with positive values) to a product of the positive half-axis and the unit ball. The representation theory (harmonic analysis) of the group SU(1,1) leads to an integral transform, the Mehler-Fock transform (MFT), that decomposes functions depending on the radial coordinate only into combinations of associated Legendre functions. This transformation is applied to kernel density estimators of probability distributions on the unit disk. It is shown that the transform separates the influence of the kernel function from that of the measured data. The application of the transform is illustrated by studying the statistical distribution of RGB vectors obtained from a common set of object points under different illuminants.
Full article

Inconsistency of Template Estimation by Minimizing of the Variance/Pre-Variance in the Quotient Space
*Entropy* **2017**, *19*(6), 288; doi:10.3390/e19060288 - 20 June 2017
**Abstract**

We tackle the problem of template estimation when data have been randomly deformed under a group action in the presence of noise. In order to estimate the template, one often minimizes the variance when the influence of the transformations has been removed (computation of the Fréchet mean in the quotient space). The consistency bias is defined as the distance (possibly zero) between the orbit of the template and the orbit of an element which minimizes the variance. In the first part, we restrict ourselves to isometric group actions, in which case the Hilbertian distance is invariant under the group action. We establish an asymptotic behavior of the consistency bias which is linear with respect to the noise level. As a result, the inconsistency is unavoidable as soon as the noise level is large enough. In practice, template estimation with a finite sample is often done with an algorithm called “max-max”. In the second part, again for a finite isometric group, we show the convergence of this algorithm to an empirical Karcher mean. Our numerical experiments show that the bias observed in practice cannot be attributed to the small sample size or to a convergence problem, but is indeed due to the previously studied inconsistency. In the third part, we present some insights into the case of a distance that is not invariant under the group action. We will see that the inconsistency still holds as soon as the noise level is large enough. Moreover, we prove the inconsistency even when a regularization term is added.
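The “max-max” algorithm discussed above can be sketched for a concrete finite isometric group: the cyclic shifts of a discretized signal. The algorithm alternates between aligning each noisy observation to the current template estimate (the "max" over the group) and averaging the aligned observations. The signal, noise level, and sample size below are an illustrative setup, not the paper's experiments.

```python
import numpy as np

def best_shift(y, template):
    # Return the cyclic shift of y closest (in squared distance) to template.
    shifts = [np.roll(y, s) for s in range(len(y))]
    errors = [np.sum((s - template) ** 2) for s in shifts]
    return shifts[int(np.argmin(errors))]

def max_max(observations, n_iter=10):
    # Alternate alignment and averaging, starting from one observation.
    template = observations[0].copy()
    for _ in range(n_iter):
        aligned = [best_shift(y, template) for y in observations]
        template = np.mean(aligned, axis=0)
    return template

rng = np.random.default_rng(1)
true_template = np.array([0.0, 0.0, 5.0, 0.0, 0.0, 0.0])
obs = [np.roll(true_template, rng.integers(6)) + 0.3 * rng.normal(size=6)
       for _ in range(200)]
est = max_max(obs)   # recovers the spike up to a shift, plus a small bias
```

Raising the noise level in this sketch makes the residual bias visible even with many samples, which is the inconsistency phenomenon the paper analyzes.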
Full article