Entropy doi: 10.3390/e20070490

Authors: Amin Hosseinpoor Milaghardan Rahim Ali Abbaspour Christophe Claramunt

The rapid proliferation of sensors and big data repositories offers many new opportunities for data science. Among many application domains, the analysis of large trajectory datasets generated from people’s movements at the city scale is one of the most promising research avenues still to explore. Extracting trajectory patterns and outliers in urban environments remains an open problem for many management and planning tasks. The research developed in this paper introduces a spatio-temporal framework, termed STE-SD (Spatio-Temporal Entropy for Similarity Detection), based on the concept of entropy as introduced by Shannon in his seminal theory of information and as recently extended to the spatial and temporal dimensions. Our approach considers several complementary trajectory descriptors whose distributions in space and time are quantitatively evaluated. The trajectory primitives considered include curvatures, stop-points, self-intersections and velocities. These primitives are identified and then qualified using the notion of entropy as applied to the spatial and temporal dimensions. The whole approach is evaluated on urban trajectories derived from the Geolife dataset, a reference benchmark collected in the city of Beijing.
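
As an informal illustration of the entropy computation that underlies such descriptor evaluations, the following sketch (not the authors' code; the bin count and placeholder data are assumptions) computes the Shannon entropy of the distribution of one trajectory primitive, such as per-point speed:

import numpy as np

def descriptor_entropy(values, bins=16):
    # Shannon entropy (in bits) of the histogram of a trajectory descriptor.
    counts, _ = np.histogram(values, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]  # drop empty bins to avoid log(0)
    return -np.sum(p * np.log2(p))

# e.g., entropy of the speed distribution along one trajectory
speeds = np.abs(np.random.randn(500))  # placeholder data
print(descriptor_entropy(speeds))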

Entropy doi: 10.3390/e20070489

Authors: N. Alex Cayco-Gajic Joel Zylberberg Eric Shea-Brown

Correlations in neural activity have been demonstrated to have profound consequences for sensory encoding. To understand how neural populations represent stimulus information, it is therefore necessary to model how pairwise and higher-order spiking correlations between neurons contribute to the collective structure of population-wide spiking patterns. Maximum entropy models are an increasingly popular method for capturing collective neural activity by including successively higher-order interaction terms. However, incorporating higher-order interactions in these models is difficult in practice due to two factors. First, the number of parameters increases exponentially as higher orders are added. Second, because triplet (and higher) spiking events occur infrequently, estimates of higher-order statistics may be contaminated by sampling noise. To address this, we extend previous work on the Reliable Interaction class of models to develop a normalized variant that adaptively identifies the specific pairwise and higher-order moments that can be estimated from a given dataset for a specified confidence level. The resulting “Reliable Moment” model is able to capture cortical-like distributions of population spiking patterns. Finally, we show that, compared with the Reliable Interaction model, the Reliable Moment model infers fewer strong spurious higher-order interactions and is better able to predict the frequencies of previously unobserved spiking patterns.

Entropy doi: 10.3390/e20070488

Authors: Sephira Riva Shahin Mehraban Nicholas P. Lavery Stefan Schwarzmüller Oliver Oeckler Stephen G. R. Brown Kirill V. Yusenko

We investigate the effect of alloying with scandium on the microstructure, high-temperature phase stability, electron transport, and mechanical properties of the Al2CoCrFeNi, Al0.5CoCrCuFeNi, and AlCoCrCu0.5FeNi high-entropy alloys. Of the three model alloys, Al2CoCrFeNi adopts a disordered CsCl structure type. Both six-component alloys contain a mixture of body-centered cubic (bcc) and face-centered cubic (fcc) phases. The comparison between in situ high-temperature powder diffraction data and ex situ data from heat-treated samples highlights the presence of a reversible bcc to fcc transition. The precipitation of a MgZn2-type intermetallic phase along grain boundaries following scandium addition affects all systems differently, but especially enhances the properties of Al2CoCrFeNi. It causes grain refinement, increases hardness and electrical conductivity (by up to 20% and 14%, respectively), and shifts the CsCl-type → fcc equilibrium to noticeably higher temperatures. The maximum dimensionless thermoelectric figure of merit (ZT) of 0.014 is reached for Al2CoCrFeNi alloyed with 0.3 wt.% Sc at 650 °C.

Entropy doi: 10.3390/e20070487

Authors: Sicheng Zhai Wen Wang Juan Xu Shuai Xu Zitang Zhang Yan Wang

FeSiBAlNi (W5), FeSiBAlNiCo (W6-Co), and FeSiBAlNiGd (W6-Gd) high entropy alloys (HEAs) were prepared using a copper-mold casting method. The effects of Co and Gd additions combined with subsequent annealing on microstructures and magnetism were investigated. The as-cast W5 consists of a BCC solid solution and a FeSi-rich phase. The Gd addition induces the formation of body-centered cubic (BCC) and face-centered cubic (FCC) solid solutions in the W6-Gd HEA. In contrast, the as-cast W6-Co is composed of the FeSi-rich phase. During annealing, no new phases arise in the W6-Co HEA, indicating good phase stability. The as-cast W5 has the highest hardness (1210 HV), which is mainly attributed to the strengthening effect of the FeSi-rich phase evenly distributed in the solid solution matrix. The tested FeSiBAlNi-based HEAs possess soft magnetism. The saturated magnetization and remanence ratio of W6-Gd are distinctly enhanced from 10.93 emu/g to 62.78 emu/g and from 1.44% to 15.50% after the annealing treatment, respectively. The good magnetism of the as-annealed W6-Gd can be ascribed to the formation of Gd-oxides.

Entropy doi: 10.3390/e20070486

Authors: Zhiyuan Li Juan Du Xavier Ottavy Hongwu Zhang

A local loss model and an integral loss model are proposed to study the irreversible flow loss mechanism in a linear compressor cascade. The detached eddy simulation model based on the Menter shear stress transport turbulence model (SSTDES) was used to perform the high-fidelity simulations. The flow losses in the cascade at incidence angles of 2°, 4° and 7° were analyzed. The contours of the local loss coefficient can be explained well by the three-dimensional flow structures. The trend of flow loss varying with incidence angle predicted by the integral loss is the same as that calculated by the total pressure loss coefficient. The integral loss model was used to evaluate the irreversible loss generated in different regions and its varying trend with the flow condition. It was found that the boundary layer shear losses generated near the endwall, the pressure surface and the suction surface are almost identical for the three incidence angles. The secondary flow loss in the wake-flow and blade-passage regions changes dramatically with the flow condition due to the occurrence of corner stall. For this cascade, the secondary flow loss accounts for 26.1%, 48.3% and 64.3% of the total loss when the incidence angles are 2°, 4° and 7°, respectively. Lastly, the underlying reason for the variation of the secondary flow loss with the incidence angle is explained using the Lc iso-surface method.

Entropy doi: 10.3390/e20070485

Authors: Angelo Carollo Bernardo Spagnolo Davide Valenti

In this article, we derive a closed form expression for the symmetric logarithmic derivative of Fermionic Gaussian states. This provides a direct way of computing the quantum Fisher information for Fermionic Gaussian states. Applications range from quantum metrology with thermal states to non-equilibrium steady states with Fermionic many-body systems.

Entropy doi: 10.3390/e20070484

Authors: Mohammad H. Ahmadi Mirhadi S. Sadaghiani Fathollah Pourfayaz Mahyar Ghazvini Omid Mahian Mehdi Mehrpooya Somchai Wongwises

An exergy analysis of a novel integrated power system is presented in this study. A Solid Oxide Fuel Cell (SOFC), assisted by a Gas Turbine (GT) and an Organic Rankine Cycle (ORC) employing liquefied natural gas (LNG) as a heat sink in a combined power system, is simulated and investigated. First, the integrated power system and the primary concepts of the simulation are described. Subsequently, the results of the simulation, the exergy analysis, and the composite curves of the heat exchangers are presented and discussed. The equations of exergy efficiency and destruction for the main cycle’s units, such as compressors, expanders, pumps, evaporators, condensers, reformers, and reactors, are presented. According to the results, the highest exergy destruction is attributed to the SOFC reactor, despite its acceptable exergy efficiency of 75.7%. Moreover, the exergy efficiencies of the ORC cycle and the whole plant are determined to be 64.9% and 39.9%, respectively. It is worth noting that the rational efficiency of the integrated power system is 53.5%. Among all units, the LNG pump has the lowest exergy efficiency, determined to be 11.7%, indicating a great potential for improvement.

Entropy doi: 10.3390/e20070483

Authors: Louis H. Kauffman

This paper reviews results about discrete physics and non-commutative worlds and explores further the structure and consequences of constraints linking classical calculus and discrete calculus formulated via commutators. In particular, we review how the formalism of generalized non-commutative electromagnetism follows from a first order constraint and how, via the Kilmister equation, relationships with general relativity follow from a second order constraint. It is remarkable that a second order constraint, based on interlacing the commutative and non-commutative worlds, leads to an equivalent tensor equation at the pole of geodesic coordinates for general relativity.

Entropy doi: 10.3390/e20070482

Authors: Bin Pang Yuling He Guiji Tang Chong Zhou Tian Tian

The impulsive fault feature signal of rolling bearings at the early failure stage is easily contaminated by the fundamental frequency (i.e., the rotation frequency of the shaft) signal and background noise. To address this problem, this paper puts forward a rolling bearing weak fault diagnosis method combining an optimal notch filter and enhanced singular value decomposition. Firstly, in order to eliminate the interference of the fundamental frequency signal, the original signal was processed by a notch filter with the fundamental frequency as the center frequency and a varying bandwidth, producing a series of corresponding notch-filtered signals. Secondly, the Teager energy entropy index was adopted to adaptively determine the optimal bandwidth, completing the optimal notch filter analysis of the raw vibration signal and yielding the corresponding optimal notch-filtered signal. Thirdly, an enhanced singular value decomposition de-noising method was employed to de-noise the optimal notch-filtered signal. Finally, envelope spectrum analysis was conducted on the de-noised signal to extract the fault characteristic frequencies. The effectiveness of the presented method was demonstrated via simulation and experimental verifications. In addition, the minimum entropy deconvolution, Kurtogram and Infogram methods were employed for comparison to show the advantages of the presented method.
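
To make the pipeline concrete, here is a minimal sketch of the notch filtering and envelope spectrum steps in Python (the Teager energy entropy bandwidth selection and the enhanced SVD de-noising are omitted; the sampling rate fs and shaft frequency f_rot are assumed values, not from the paper):

import numpy as np
from scipy.signal import iirnotch, filtfilt, hilbert

fs = 12000.0   # sampling rate in Hz (assumed)
f_rot = 30.0   # shaft rotation frequency in Hz (assumed)

def notch_filtered(x, bandwidth):
    # Suppress the fundamental (shaft) frequency with a notch of the given bandwidth.
    q = f_rot / bandwidth          # quality factor sets the notch width
    b, a = iirnotch(f_rot, q, fs=fs)
    return filtfilt(b, a, x)

def envelope_spectrum(x):
    # Amplitude spectrum of the signal envelope, where fault characteristic
    # frequencies appear as discrete peaks.
    env = np.abs(hilbert(x))
    return np.abs(np.fft.rfft(env - env.mean()))

In the paper's scheme, the bandwidth passed to notch_filtered would be chosen adaptively by evaluating the Teager energy entropy index over a set of candidate bandwidths.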

Entropy doi: 10.3390/e20070481

Authors: Philip Tee George Parisis Luc Berthouze Ian Wakeman

Combinatoric measures of entropy capture the complexity of a graph but rely upon the calculation of its independent sets, or collections of non-adjacent vertices. This decomposition of the vertex set is a known NP-complete problem, and for most real-world graphs the calculation is inaccessible. Recent work by Dehmer et al. and Tee et al. identified a number of vertex-level measures that do not suffer from this pathological computational complexity, but that can be shown to be effective at quantifying graph complexity. In this paper, we consider whether these local measures are fundamentally equivalent to global entropy measures. Specifically, we investigate the existence of a correlation between vertex-level and global measures of entropy for a narrow subset of random graphs. We use the greedy algorithm approximation for calculating the chromatic information and therefore Körner entropy. We are able to demonstrate strong correlation for this subset of graphs and outline how this may arise theoretically.
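
The greedy approximation mentioned above can be sketched in a few lines; the following is an illustrative computation (assuming networkx, and using the entropy of the color-class size distribution as a stand-in for the exact chromatic information), not the authors' implementation:

import math
import networkx as nx

def chromatic_entropy_greedy(g):
    # Greedy coloring partitions the vertices into color classes, each an
    # independent set; the entropy of the class-size distribution serves as a
    # tractable approximation of the chromatic information.
    coloring = nx.greedy_color(g, strategy="largest_first")
    n = g.number_of_nodes()
    sizes = {}
    for color in coloring.values():
        sizes[color] = sizes.get(color, 0) + 1
    return -sum((s / n) * math.log2(s / n) for s in sizes.values())

g = nx.gnp_random_graph(100, 0.05, seed=1)
print(chromatic_entropy_greedy(g))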

Entropy doi: 10.3390/e20060480

Authors: Saran Chen Xin Lu Zhong Liu Zhongwei Jia

With the increasing use of online social networking platforms, online surveys are widely used in many fields, e.g., public health, business and sociology, to collect samples and to infer population characteristics through the self-reported data of respondents. Although online surveys can protect the privacy of respondents, self-reporting is challenged by a low response rate and unreliable answers when the survey contains sensitive questions, such as drug use, sexual behaviors, abortion or criminal activity. To overcome this limitation, this paper develops an approach that collects second-order information from the respondents, i.e., asking them about the characteristics of their friends, instead of asking for the respondents’ own characteristics directly. Then, we generate the inference about the population variable with the Hansen-Hurwitz estimator for the two classic sampling strategies (simple random sampling or random walk-based sampling). The method is evaluated by simulations on both artificial and real-world networks. Results show that the method is able to generate population estimates with high accuracy without knowing the respondents’ own characteristics, and the biases of the estimates under various settings are relatively small and within acceptable limits. The new method offers an alternative way of implementing surveys online and is expected to collect more reliable data with improved population inference on sensitive variables.
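
As a rough sketch of the estimation idea under random walk-based sampling (respondents are reached with probability roughly proportional to their degree, so inverse-degree weights correct the bias), the following toy example is illustrative only; the paper's exact estimator and its variance corrections may differ:

import numpy as np

def hh_estimate(d, y):
    # d[i]: number of friends reported by respondent i
    # y[i]: how many of those friends have the sensitive trait
    # Hansen-Hurwitz-style weighting: down-weight high-degree respondents,
    # who are over-represented in a random walk sample.
    d = np.asarray(d, dtype=float)
    y = np.asarray(y, dtype=float)
    w = 1.0 / d
    return np.sum(w * (y / d)) / np.sum(w)

print(hh_estimate(d=[10, 4, 7, 3, 8], y=[2, 1, 3, 0, 2]))  # toy data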

Entropy doi: 10.3390/e20060479

Authors: Yongqi Wang Kolumban Hutter

n/a

Entropy doi: 10.3390/e20060478

Authors: Daniel Rohrlich Guy Hetzroni

We review an argument that bipartite “PR-box” correlations, though designed to respect relativistic causality, in fact violate relativistic causality in the classical limit. As a test of this argument, we consider Greenberger–Horne–Zeilinger (GHZ) correlations as a tripartite version of PR-box correlations, and ask whether the argument extends to GHZ correlations. If it does, i.e., if it shows that GHZ correlations violate relativistic causality in the classical limit, then the argument must be incorrect (since GHZ correlations do respect relativistic causality in the classical limit). However, we find that the argument does not extend to GHZ correlations. We also show that both PR-box correlations and GHZ correlations can be retrocausal, but the retrocausality of PR-box correlations leads to self-contradictory causal loops, while the retrocausality of GHZ correlations does not.

Entropy doi: 10.3390/e20060477

Authors: Alejandro Ramírez-Rojas Elsa Leticia Flores-Márquez Nicholas V. Sarlis Panayiotis A. Varotsos

We analyse seismicity during the 6-year period 2012–2017 in the new time domain termed natural time in the Chiapas region, Mexico, where the M8.2 earthquake, Mexico’s largest earthquake in more than a century, occurred, in order to study the complexity measures associated with fluctuations of entropy as well as with entropy change under time reversal. We find that almost three months before the M8.2 earthquake, i.e., on 14 June 2017, the complexity measure associated with the fluctuations of the entropy change under time reversal shows an abrupt increase, which, however, does not hold for the complexity measure associated with the fluctuations of entropy in forward time. On the same date, the entropy change under time reversal has been previously found to exhibit a minimum [Physica A 506, 625–634 (2018)]; we thus find here that this minimum is also accompanied by increased fluctuations of the entropy change under time reversal. In addition, we find a simultaneous increase of the Tsallis entropic index q.

Entropy doi: 10.3390/e20060476

Authors: Ming Li Wendan Wei Jialin Wang Xiaoyu Qi

Accounting informatization is an important part of enterprise informatization. It affects the operational efficiency of accounting and finance. With a comprehensive evaluation of accounting informatization from multiple aspects, we can find the strengths and weaknesses of corporate accounting informatization, which can then be improved precisely. In this paper, an evaluation approach for accounting informatization is proposed. Firstly, the evaluation index system is constructed from the aspects of strategic position, infrastructure construction, implementation of accounting informatization, informatization guarantee, and application efficiency. Considering the complexity and ambiguity of the index, experts are required to give linguistic ratings, which are then converted into intuitionistic fuzzy numbers. Then, an entropy and cross-entropy method based on intuitionistic fuzzy sets is proposed to derive the weights of experts so as to reduce the error caused by personal bias. By combining the weights of the index and the weighted ratings, the evaluation results are obtained. Finally, a case of accounting informatization evaluation is given to illustrate the feasibility of the proposed approach.

Entropy doi: 10.3390/e20060475

Authors: Alejandro Linares-Barranco Hongjie Liu Antonio Rios-Navarro Francisco Gomez-Rodriguez Diederik P. Moeys Tobi Delbruck

Taking inspiration from biology to solve engineering problems using the organizing principles of biological neural computation is the aim of the field of neuromorphic engineering. This field has demonstrated success in sensor-based applications (vision and audition) as well as in cognition and actuators. This paper is focused on mimicking the approaching detection functionality of the retina, which is computed by one type of Retinal Ganglion Cell (RGC), and its application to robotics. These RGCs transmit action potentials when an expanding object is detected. In this work we compare the software and hardware logic FPGA implementations of this approaching function, and the hardware latency when applied to robots as an attention/reaction mechanism. The visual input for these cells comes from an asynchronous event-driven Dynamic Vision Sensor, which leads to an end-to-end event-based processing system. The software model has been developed in Java and runs with an average processing time per event of 370 ns on a NUC embedded computer. The output firing rate for an approaching object depends on the cell parameters that represent the number of input events needed to reach the firing threshold. For the hardware implementation, on a Spartan 6 FPGA, the processing time is reduced to 160 ns/event with the clock running at 50 MHz. The entropy has been calculated to demonstrate that the system is not totally deterministic in its response to approaching objects because of several bioinspired characteristics. It has been measured that a Summit XL mobile robot can react to an approaching object in 90 ms, which can be used as an attentional mechanism. This is faster than similar event-based approaches in robotics and equivalent to human reaction latencies to visual stimuli.

Entropy doi: 10.3390/e20060474

Authors: Thomas Filk

The article describes an interpretation of the mathematical formalism of standard quantum mechanics in terms of relations. In particular, the wave function ψ(x) is interpreted as a complex-valued relation between an entity (often called “particle”) and a second entity x (often called “spatial point”). Such complex-valued relations can also be formulated for classical physical systems. Entanglement is interpreted as a relation between two entities (particles or properties of particles). Such relations define the concept of “being next to each other”, which implies that entangled entities are close to each other, even though they might appear to be far away with respect to a classical background space. However, when space is also considered to be a network of relations (of which the classical background space is a large-scale continuum limit), such nearest neighbor configurations are possible. The measurement problem is discussed from the perspective of this interpretation. It should be emphasized that this interpretation is not meant to be a serious attempt to describe the ontology of our world, but its purpose is to make it obvious that, besides Bohmian mechanics, presumably many other ontological interpretations of quantum theory exist.

Entropy doi: 10.3390/e20060473

Authors: Claudia Zander Angel Ricardo Plastino

We revisit the concept of entanglement within the Bohmian approach to quantum mechanics. Inspired by Bohmian dynamics, we introduce two partial measures for the amount of entanglement corresponding to a pure state of a pair of quantum particles. One of these measures is associated with the statistical correlations exhibited by the joint probability density of the two Bohmian particles in configuration space. The other partial measure corresponds to the correlations associated with the phase of the joint wave function, and describes the non-separability of the Bohmian velocity field. The sum of these two components is equal to the total entanglement of the joint quantum state, as measured by the linear entropy of the single-particle reduced density matrix.

Entropy doi: 10.3390/e20060472

Authors: Jan Naudts

Quantum information geometry studies families of quantum states by means of differential geometry. A new approach is followed with the intention to facilitate the introduction of a more general theory in subsequent work. To this purpose, the emphasis is shifted from a manifold of strictly positive density matrices to a manifold of faithful quantum states on the C*-algebra of bounded linear operators. In addition, ideas from the parameter-free approach to information geometry are adopted. The underlying Hilbert space is assumed to be finite-dimensional. In this way, technicalities are avoided so that strong results are obtained, which one can hope to prove later on in a more general context. Two different atlases are introduced, one in which it is straightforward to show that the quantum states form a Banach manifold, the other which is compatible with the inner product of Bogoliubov and which yields affine coordinates for the exponential connection.

Entropy doi: 10.3390/e20060471

Authors: Fanrong Meng Xiaobin Rui Zhixiao Wang Yan Xing Longbing Cao

Attributed networks consist of not only a network structure but also node attributes. Most existing community detection algorithms focus only on network structures and ignore node attributes, which are also important. Although some algorithms using both node attributes and network structure information have been proposed in recent years, the complex hierarchical coupling relationships within and between attributes, nodes and network structure have not been considered. Such hierarchical couplings are driving factors in community formation. This paper introduces a novel coupled node similarity (CNS) to involve and learn attribute and structure couplings and compute the similarity within and between nodes with categorical attributes in a network. CNS learns and integrates the frequency-based intra-attribute coupled similarity within an attribute, the co-occurrence-based inter-attribute coupled similarity between attributes, and the coupled attribute-to-structure similarity based on the homophily property. CNS is then used to generate the weights of edges and transform a plain graph into a weighted graph. Clustering algorithms detect community structures that are topologically well-connected and semantically coherent on the weighted graphs. Extensive experiments on several data sets verify the effectiveness of CNS-based community detection algorithms by comparison with state-of-the-art node similarity measures, both with and without node attribute information and hierarchical interactions, across various levels of network structure complexity.

Entropy doi: 10.3390/e20060470

Authors: Ting Yang Shujun Liu Wenguo Liu Jishun Guo Pin Wang

In this paper, a noise-enhanced binary hypothesis-testing problem was studied for a variable detector under certain constraints, in which the detection probability can be increased and the false-alarm probability decreased simultaneously. According to the constraints, three alternative cases are proposed: the first two cases concern minimization of the false-alarm probability and maximization of the detection probability, respectively, without deterioration of one by the other, while the third case is achieved by a randomization of the two optimal noise-enhanced solutions obtained in the first two limit cases. Furthermore, the noise-enhanced solutions that satisfy the three cases were determined both when randomization between different detectors was allowed and when it was not. In addition, the practicality of the third case was proven from the perspective of Bayes risk. Finally, numerous examples and conclusions are presented.

Entropy doi: 10.3390/e20060469

Authors: Gerardo Valadez Huerta Vincent Flasbart Tobias Marquardt Pablo Radici Stephan Kabelac

The calculation of the entropy production rate within an operational high-temperature solid oxide fuel cell (SOFC) is necessary to design and improve heating and cooling strategies. However, due to a lack of information, most studies are limited to empirical relations, which are not in line with the more general approach given by non-equilibrium thermodynamics (NET). The SOFC 1D-model presented in this study is based on non-equilibrium thermodynamics, and we parameterize it with experimental data and data from molecular dynamics (MD). The validation of the model shows that it can effectively describe the behavior of a SOFC at 1300 K. Moreover, we show that the highest entropy production occurs in the electrolyte and the catalyst layers, and that the Peltier heat transfer is considerable for the calculation of the heat flux in the electrolyte and cannot be neglected. To our knowledge, this is the first validated model of a SOFC based on non-equilibrium thermodynamics, and this study can be extended to analyze SOFCs with other solid oxide electrolytes, with perovskite electrolytes or even other electrochemical systems such as solid oxide electrolysis cells (SOECs).

Entropy doi: 10.3390/e20060468

Authors: Jonathan N. Blakely Marko S. Milosavljevic Ned J. Corron

Chaotic evolution is generally too irregular to be captured in an analytic solution. Nonetheless, some dynamical systems do have such solutions, enabling more rigorous analysis than can be achieved with numerical solutions. Here, we introduce a method of coupling solvable chaotic oscillators that maintains solvability. In fact, an analytic solution is given for an entire network of coupled oscillators. Importantly, a valid chaotic solution is shown even when the coupling topology is complex and the population of oscillators is heterogeneous. We provide a specific example of a solvable chaotic network with star topology and a hub that oscillates much faster than its leaves. We present analytic solutions as the coupling strength is varied, showing states of varying degrees of global organization. The covariance of the network is derived explicitly from the analytic solution, characterizing the degree of synchronization across the network as the coupling strength varies. This example suggests that analytic solutions may constitute a new tool in the study of chaotic network dynamics generally.

Entropy doi: 10.3390/e20060467

Authors: Jaume del Olmo Alos Javier Rodríguez Fonollosa

Polar coding schemes that asymptotically achieve the secrecy capacity are proposed for the memoryless degraded broadcast channel under different reliability and secrecy requirements: layered decoding or layered secrecy. In these settings, the transmitter wishes to send multiple messages to a set of legitimate receivers while keeping them masked from a set of eavesdroppers. The layered decoding structure requires receivers with better channel quality to reliably decode more messages, while the layered secrecy structure requires eavesdroppers with worse channel quality to be kept ignorant of more messages. Practical constructions for the proposed polar coding schemes are discussed and their performance evaluated by means of simulations.

Entropy doi: 10.3390/e20060466

Authors: Dennis Dieks

A consensus seems to have developed that the Gibbs paradox in classical thermodynamics (the discontinuous drop in the entropy of mixing when the mixed gases become equal to each other) is unmysterious: in any actual situation, two gases can either be separated or not, and the associated harmless discontinuity from “yes” to “no” accounts for the discontinuity in the entropy. By contrast, the Gibbs paradox in statistical physics continues to attract attention. Here, the problem is that standard calculations in statistical mechanics predict a non-vanishing value of the entropy of mixing even when two gases of the same kind are mixed, in conflict with thermodynamic predictions. This version of the Gibbs paradox is often seen as a sign that there is something fundamentally wrong with either the traditional expression S = k ln W or with the way W is calculated. It is the aim of this article to review the situation from the orthodox (as opposed to information-theoretic) standpoint. We demonstrate how the standard formalism is not only fully capable of dealing with the paradox, but also provides an intuitively clear picture of the relevant physical mechanisms. In particular, we pay attention to the explanatory relevance of the existence of particle trajectories in the classical context. We also discuss how the paradox survives the transition to quantum mechanics, in spite of the symmetrization postulates.
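
The textbook numbers behind the paradox can be stated in one line. Mixing n moles each of two ideal gases at equal temperature and pressure gives (a standard result, stated here for orientation, not taken from this article):

\Delta S_{\mathrm{mix}} =
\begin{cases}
2nR\ln 2 & \text{different gases},\\
0 & \text{same gas (thermodynamics)},
\end{cases}
\qquad S = k\ln W,

whereas a naive classical count of W that distinguishes permuted identical particles yields 2nR ln 2 even for the same gas; this mismatch is the statistical version of the paradox.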

Entropy doi: 10.3390/e20060465

Authors: Tim Maudlin

Quantum physics demands some radical revision of our fundamental beliefs about physical reality. We know that because there are certain verified physical phenomena (two-slit interference, the disappearance of interference upon monitoring, violations of Bell’s inequality) that have no classical analogs. But the exact nature of that revision has been under dispute since the foundation of quantum theory. I offer a method of clarifying what the commitments of a clearly formulated physical theory are, and apply it to a discussion of some options available to account for another non-classical phenomenon: the Aharonov–Bohm effect.

Entropy doi: 10.3390/e20060464

Authors: Marianthi Markatou Yang Chen

One natural way to measure model adequacy is by using statistical distances as loss functions. A related fundamental question is how to construct loss functions that are scientifically and statistically meaningful. In this paper, we investigate non-quadratic distances and their role in assessing the adequacy of a model and/or ability to perform model selection. We first present the definition of a statistical distance and its associated properties. Three popular distances, total variation, the mixture index of fit and the Kullback-Leibler distance, are studied in detail, with the aim of understanding their properties and potential interpretations that can offer insight into their performance as measures of model misspecification. A small simulation study exemplifies the performance of these measures and their application to different scientific fields is briefly discussed.
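
For concreteness, two of the three distances studied can be computed for discrete distributions as follows (a minimal sketch; the mixture index of fit involves an optimization and is omitted):

import numpy as np

def total_variation(p, q):
    # Total variation distance: half the L1 distance between the distributions.
    p, q = np.asarray(p, float), np.asarray(q, float)
    return 0.5 * np.abs(p - q).sum()

def kl_distance(p, q):
    # Kullback-Leibler distance D(p || q); infinite if q misses support of p.
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    if np.any(q[mask] == 0):
        return np.inf
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

data = [0.40, 0.30, 0.20, 0.10]    # empirical distribution
model = [0.25, 0.25, 0.25, 0.25]   # candidate model
print(total_variation(data, model), kl_distance(data, model))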

]]>Entropy doi: 10.3390/e20060463

Authors: Shouliang Li Weikang Ding Benshun Yin Tongfeng Zhang Yide Ma

With the popularity of the Internet, the transmission of images has become more frequent. It is thus of great significance to study efficient and secure image encryption algorithms. Based on traditional Logistic maps and consideration of time delay, we propose a new one-dimensional (1D) delayed and linearly coupled Logistic chaotic map (DLCL) in this paper. Time delay is a common phenomenon in various complex systems in nature, and it greatly changes the dynamic characteristics of a system. The map is analyzed in terms of trajectory, Lyapunov exponent (LE) and permutation entropy (PE). The results show that this map has a wide chaotic range, better ergodicity and a larger maximum LE in comparison with some existing chaotic maps. A new method of color image encryption is put forward based on the DLCL. Analysis shows that the proposed encryption algorithm has good encryption performance, and the key used for scrambling is related to the original image. Simulation results illustrate that images ciphered with our method have good pseudo-randomness. The proposed encryption algorithm has a large key space and can effectively resist differential attack and chosen-plaintext attack.
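
The abstract does not give the explicit form of the DLCL map, so the following sketch uses one plausible delayed, linearly coupled variant of the logistic map purely for illustration; the paper's definition may differ:

import numpy as np

def dlcl_orbit(n, r=3.99, eps=0.5, tau=3, x0=0.37):
    # Illustrative delayed coupled logistic map (hypothetical form):
    #   x[t+1] = (1 - eps) * r * x[t] * (1 - x[t])
    #            + eps * r * x[t - tau] * (1 - x[t - tau])
    # For x in [0, 1] and r <= 4, the orbit stays in [0, 1].
    x = np.empty(n + tau)
    x[: tau + 1] = x0  # constant history for the delayed term
    for t in range(tau, n + tau - 1):
        f_now = r * x[t] * (1.0 - x[t])
        f_del = r * x[t - tau] * (1.0 - x[t - tau])
        x[t + 1] = (1.0 - eps) * f_now + eps * f_del
    return x[tau:]

orbit = dlcl_orbit(1000)  # e.g., feed into Lyapunov or permutation entropy tests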

]]>Entropy doi: 10.3390/e20060462

Authors: Roderich Tumulka

The biggest and most lasting among David Bohm’s (1917–1992) many achievements is to have proposed a picture of reality that explains the empirical rules of quantum mechanics. This picture, known as pilot wave theory or Bohmian mechanics among other names, is still the simplest and most convincing explanation available. According to this theory, electrons are point particles in the literal sense and move along trajectories governed by Bohm’s equation of motion. In this paper, I describe some more recent developments and extensions of Bohmian mechanics, concerning in particular relativistic space-time and particle creation and annihilation.

]]>Entropy doi: 10.3390/e20060461

Authors: Yijun Wang Xudong Wang Duan Huang Ying Guo

We show that a noiseless linear amplifier (NLA) can be placed properly at the receiver’s end to improve the performance of self-referenced (SR) continuous variable quantum key distribution (CV-QKD) when the reference pulses are weak. In SR CV-QKD, the imperfections of the amplitude modulator limit the maximal amplitude of the reference pulses, while the performance of SR CV-QKD is positively related to the amplitude of the reference pulses. An NLA can compensate for the impact of the large phase noise introduced by weak reference pulses. Simulation results derived from collective attacks show that this scheme can improve the performance of SR CV-QKD with weak reference pulses in terms of extending the maximum transmission distance. An NLA with a gain of g can increase the maximum transmission distance by the equivalent of 20 log10(g) dB of losses.
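
To unpack the final claim with a worked number (the 0.2 dB/km figure is the typical loss of standard telecom fiber, assumed here purely for illustration):

20\log_{10} g \ \text{dB}; \qquad g = 3:\ 20\log_{10} 3 \approx 9.5\ \text{dB} \approx 48\ \text{km of fiber at } 0.2\ \text{dB/km}.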

]]>Entropy doi: 10.3390/e20060460

Authors: George Ruppeiner

Black holes pose great difficulties for theory since gravity and quantum theory must be combined in some as yet unknown way. An additional difficulty is that detailed black hole observational data to guide theorists is lacking. In this paper, I sidestep the difficulties of combining gravity and quantum theory by employing black hole thermodynamics augmented by ideas from the information geometry of thermodynamics. I propose a purely thermodynamic agenda for choosing correct candidate black hole thermodynamic scaled equations of state, parameterized by two exponents. These two adjustable exponents may be set to accommodate additional black hole information, either from astrophysical observations or from some microscopic theory, such as string theory. My approach assumes implicitly that the as yet unknown microscopic black hole constituents have strong effective interactions between them, of a type found in critical phenomena. In this picture, the details of the microscopic interaction forces are not important, and the essential macroscopic picture emerges from general assumptions about the number of independent thermodynamic variables, types of critical points, boundary conditions, and analyticity. I use the simple Kerr and Reissner-Nordström black holes for guidance, and find candidate equations of state that embody several of the features of these purely gravitational models. My approach may offer a productive new way to select black hole thermodynamic equations of state representing both gravitational and quantum properties.

]]>Entropy doi: 10.3390/e20060459

Authors: Yuli Zhao Yin Zhang Bin Zhang Kening Gao Pengfei Li

Since complex search tasks are usually divided into subtasks, providing subtask-oriented query recommendations is an effective way to support complex search tasks. Currently, most subtask-oriented query recommendation methods extract subtasks from plain-form search logs consisting of only queries and clicks, which provide limited clues for identifying subtasks. Meanwhile, for several decades, the Computer Human Interface (CHI)/Human Computer Interaction (HCI) communities have been working on new complex search tools for the purpose of supporting rich user interactions beyond just queries and clicks, thus providing rich-form search logs with more clues for subtask identification. In this paper, we investigated the provision of subtask-oriented query recommendations by extracting thematic experiences from the rich-form search logs of complex search tasks recorded in a proposed visual data structure. We introduce the tree structure of the visual data structure and propose a visual-based subtask identification method built on it. We then introduce a personalized PageRank-based method to recommend queries by ranking nodes on the network derived from the identified subtasks. We evaluated the proposed methods in experiments consisting of informative and tentative search tasks.
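
A minimal sketch of the personalized PageRank ranking step (the toy query graph and query strings are invented for illustration; the paper builds the network from identified subtasks):

import networkx as nx

# Toy graph for one subtask: nodes are queries, edges link queries that
# co-occur in the same session.
g = nx.Graph()
g.add_edges_from([
    ("hotel paris", "cheap hotel paris"),
    ("cheap hotel paris", "paris hostel"),
    ("hotel paris", "paris metro map"),
    ("paris metro map", "paris attractions"),
])

# Bias the random walk toward the query of the user's current subtask.
personalization = {node: 0.0 for node in g}
personalization["hotel paris"] = 1.0

scores = nx.pagerank(g, alpha=0.85, personalization=personalization)
print(sorted(scores, key=scores.get, reverse=True)[:3])  # top recommendations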

]]>Entropy doi: 10.3390/e20060458

Authors: Gerhard Grössing Siegfried Fussy Johannes Mesa Pascasio Herbert Schwabl

In the quest for an understanding of nonlocality with respect to an appropriate ontology, we propose a “cosmological solution”. We assume that from the beginning of the universe each point in space has been the location of a scalar field representing a zero-point vacuum energy that nonlocally vibrates at a vast range of different frequencies across the whole universe. A quantum, then, is a nonequilibrium steady state in the form of a “bouncer” coupled resonantly to one of those (particle type dependent) frequencies, in remote analogy to the bouncing oil drops on an oscillating oil bath as in Couder’s experiments. A major difference to the latter analogy is given by the nonlocal nature of the vacuum oscillations. We show with the examples of double- and n-slit interference that the assumed nonlocality of the distribution functions alone suffices to derive the de Broglie–Bohm guiding equation for N particles with otherwise purely classical means. In our model, no influences from configuration space are required, as everything can be described in 3-space. Importantly, the setting up of an experimental arrangement limits and shapes the forward and osmotic contributions and is described as vacuum landscaping.

]]>Entropy doi: 10.3390/e20060457

Authors: Michal Pavelka Václav Klika Miroslav Grmela

Landau damping is the tendency of solutions to the Vlasov equation towards spatially homogeneous distribution functions. The distribution functions, however, approach the spatially homogeneous manifold only weakly, and Boltzmann entropy is not changed by the Vlasov equation. On the other hand, density and kinetic energy density, which are integrals of the distribution function, approach spatially homogeneous states strongly, which is accompanied by growth of the hydrodynamic entropy. Such a behavior can be seen when the Vlasov equation is reduced to the evolution equations for density and kinetic energy density by means of the Ehrenfest reduction.

]]>Entropy doi: 10.3390/e20060456

Authors: Branko Ristic Jennifer L. Palmer

This short note addresses the problem of autonomous on-line path-planning for exploration and occupancy-grid mapping using a mobile robot. The underlying algorithm for simultaneous localisation and mapping (SLAM) is based on random-finite-set (RFS) modelling of ranging sensor measurements, implemented as a Rao-Blackwellised particle filter. Path-planning in general must trade off exploration (which reduces the uncertainty in the map) against exploitation (which reduces the uncertainty in the robot pose). In this note we propose a reward function based on the Rényi divergence between the prior and the posterior densities, with RFS modelling of sensor measurements. This approach results in a joint map-pose uncertainty measure without the need to scale and tune their weights.
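
As a toy illustration of the reward (the paper computes the divergence between joint map-pose RFS densities; this sketch shows only the Rényi divergence itself on a single occupancy cell, with made-up numbers):

import numpy as np

def renyi_divergence(p, q, alpha=0.5):
    # Renyi divergence D_alpha(p || q) for discrete distributions, alpha != 1;
    # it tends to the Kullback-Leibler divergence as alpha -> 1.
    p, q = np.asarray(p, float), np.asarray(q, float)
    return np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0)

prior = [0.5, 0.5]        # unknown cell: P(occupied), P(free)
posterior = [0.9, 0.1]    # after a hypothetical range measurement
print(renyi_divergence(posterior, prior))  # larger reward = more informative action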

]]>Entropy doi: 10.3390/e20060455

Authors: Mingtao Ge Jie Wang Fangfang Zhang Ke Bai Xiangyang Ren

According to the dynamic characteristics of the rolling bearing vibration signal and the distribution characteristics of its noise, a fault identification method based on the adaptive filtering empirical wavelet transform (AFEWT) and a kernel density estimation mutual information (KDEMI) classifier is proposed. First, we use AFEWT to extract the features of the rolling bearing vibration signal. A hypothesis test of the Gaussian distribution is carried out for the sub-modes obtained by applying the EWT decomposition twice, and Gaussian noise is filtered out according to the test results. In this way, we can overcome noise interference and avoid the mode selection problem when extracting signal features. Then we combine the advantages of kernel density estimation (KDE) and mutual information (MI) and put forward the KDEMI classifier. The mutual information between the probability density of the unknown signal's feature vector and the probability density of each known signal type is calculated, and the type of the unknown signal is determined from the value of the mutual information, so as to achieve fault identification of the rolling bearing. In order to verify the effectiveness of AFEWT in feature extraction, we extract signal features using three methods, AFEWT, EWT, and EMD, and then use the same classifier to identify the fault signals. Experimental results show that features extracted with AFEWT give the highest recognition rate. At the same time, in order to verify the performance of the AFEWT-KDEMI method, we compare two classical fault signal identification methods, SVM and BP neural network, with the AFEWT-KDEMI method. Through experimental analysis, we found that the AFEWT-KDEMI method is more stable and effective.

]]>Entropy doi: 10.3390/e20060454

Authors: Fabricio Toscano Daniel S. Tasca Łukasz Rudnicki Stephen P. Walborn

Uncertainty relations involving incompatible observables are one of the cornerstones of quantum mechanics. Aside from their fundamental significance, they play an important role in practical applications, such as detection of quantum correlations and security requirements in quantum cryptography. In continuous variable systems, the spectra of the relevant observables form a continuum and this necessitates the coarse graining of measurements. However, these coarse-grained observables do not necessarily obey the same uncertainty relations as the original ones, a fact that can lead to false results when considering applications. That is, one cannot naively replace the original observables in the uncertainty relation for the coarse-grained observables and expect consistent results. As such, several uncertainty relations that are specifically designed for coarse-grained observables have been developed. In recognition of the 90th anniversary of the seminal Heisenberg uncertainty relation, celebrated last year, and all the subsequent work since then, here we give a review of the state of the art of coarse-grained uncertainty relations in continuous variable quantum systems, as well as their applications to fundamental quantum physics and quantum information tasks. Our review is meant to be balanced in its content, since both theoretical considerations and experimental perspectives are put on an equal footing.

]]>Entropy doi: 10.3390/e20060453

Authors: Georg J. Schmitz

Different notions of entropy can be identified in different scientific communities: (i) the thermodynamic sense; (ii) the information sense; (iii) the statistical sense; (iv) the disorder sense; and (v) the homogeneity sense. Especially the “disorder sense” and the “homogeneity sense” relate to and require the notion of space and time. One of the few prominent examples relating entropy to both geometry and space is the Bekenstein-Hawking entropy of a black hole. Although this was developed for describing a physical object, a black hole, having a mass, a momentum, a temperature, an electrical charge, etc., absolutely no information about this object's attributes can ultimately be found in the final formulation. In contrast, the Bekenstein-Hawking entropy in its dimensionless form is a positive quantity comprising only geometric attributes: an area A (the area of the event horizon of the black hole), a length L_P (the Planck length), and a factor 1/4. A purely geometric approach to this formulation is presented here. The approach is based on a continuous 3D extension of the Heaviside function which draws on the phase-field concept of diffuse interfaces. Entropy enters into the local and statistical description of contrast or gradient distributions in the transition region of the extended Heaviside function definition. The structure of the Bekenstein-Hawking formulation is ultimately derived for a geometric sphere based solely on geometric-statistical considerations.
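
For reference, the formulation referred to above is the standard Bekenstein-Hawking result, whose dimensionless form contains only the horizon area, the Planck length and the factor 1/4:

\frac{S_{\mathrm{BH}}}{k_B} = \frac{A}{4 L_P^{2}}, \qquad L_P = \sqrt{\frac{\hbar G}{c^{3}}}.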

]]>Entropy doi: 10.3390/e20060452

Authors: Ran Gao Xiaolong Yin Zhiping Li

In multiphase (≥3) equilibrium calculations, when the Newton method is used to solve the material balance (Rachford-Rice) equations, a poorly conditioned Jacobian can lead to false convergence. We present a robust successive substitution method that solves the multiphase Rachford-Rice equations sequentially using the method of bisection, while considering the monotonicity of the equations and the locations of singular hyperplanes. Although this method is slower than the Newton solution, as it does not rely on Jacobians that can become poorly conditioned, it can be inserted into Newton iterations upon the detection of a poorly conditioned Jacobian. Testing shows that embedded successive substitution steps effectively improve the robustness. The benefit of the Newton method in the speed of convergence is maintained.
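
For the two-phase case, the monotonicity argument is easy to see and to code; the sketch below (illustrative, not the paper's multiphase implementation, which additionally handles singular hyperplanes) solves the single Rachford-Rice equation by bisection:

import numpy as np

def rachford_rice_bisection(z, K, tol=1e-12, max_iter=200):
    # Find the vapor fraction V solving
    #   f(V) = sum_i z_i (K_i - 1) / (1 + V (K_i - 1)) = 0.
    # f is monotonically decreasing between its singularities, so bisection on
    # a bracketing interval such as (0, 1) cannot falsely converge.
    z, K = np.asarray(z, float), np.asarray(K, float)
    f = lambda V: np.sum(z * (K - 1.0) / (1.0 + V * (K - 1.0)))
    lo, hi = 0.0, 1.0  # assumes f(0) > 0 > f(1), i.e., two phases present
    for _ in range(max_iter):
        mid = 0.5 * (lo + hi)
        if f(mid) > 0.0:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

print(rachford_rice_bisection(z=[0.5, 0.3, 0.2], K=[4.0, 1.5, 0.2]))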

]]>Entropy doi: 10.3390/e20060451

Authors: Ángel Sanz

Bohmian mechanics, widely known within the field of quantum foundations, has been a quite useful resource for computational and interpretive purposes in a wide variety of practical problems. Here, it is used to establish a comparative analysis at different levels of approximation in the problem of the diffraction of helium atoms from a substrate consisting of a defect with axial symmetry on top of a flat surface. The motivation behind this work is to determine which aspects of one level survive in the next level of refinement and, therefore, to get a better idea of what we usually denote as quantum-classical correspondence. To this end, a quantum treatment of the problem is first performed with an approximate hard-wall model and then with a realistic interaction potential model. The interpretation and explanation of the features displayed by the corresponding diffraction intensity patterns are then revisited with a series of trajectory-based approaches: Fermatian trajectories (optical rays), Newtonian trajectories and Bohmian trajectories. As is seen, while Fermatian and Newtonian trajectories show some similarities, Bohmian trajectories behave quite differently due to their implicit non-classicality.

]]>Entropy doi: 10.3390/e20060450

Authors: Robert H. Swendsen

Two distinct puzzles, which are both known as Gibbs’ paradox, have interested physicists since they were first identified in the 1870s. They each have significance for the foundations of statistical mechanics and have led to lively discussions with a wide variety of suggested resolutions. Most proposed resolutions have involved quantum mechanics, although the original puzzles were entirely classical and were posed before quantum mechanics was invented. In this paper, I show that, contrary to what has often been suggested, quantum mechanics is not essential for resolving the paradoxes. I present a resolution that does not depend on quantum mechanics and that includes the case of colloidal solutions, for which quantum mechanics is not relevant.

]]>Entropy doi: 10.3390/e20060449

Authors: Anastasiia Bakhchina Karina Arutyunova Alexey Sozinov Alexander Demidovsky Yurii Alexandrov

Cardiac activity is involved in the processes of organization of goal-directed behaviour. Each behavioural act is aimed at achieving an adaptive outcome and it is subserved by the actualization of functional systems consisting of elements distributed across the brain and the rest of the body. This paper proposes a system-evolutionary view on the activity of the heart and its variability. We have compared the irregularity of the heart rate, as measured by sample entropy (SampEn), in behaviours that are subserved by functional systems formed at different stages of individual development, which implement organism-environment interactions with different degrees of differentiation. The results have shown that SampEn of the heart rate was higher during performing tasks that included later acquired knowledge (foreign language vs. native language; mathematical vocabulary vs. general vocabulary) and decreased in the stress and alcohol conditions, as well as at the beginning of learning. These results are in line with the hypothesis that irregularity of the heart rate reflects the properties of a set of functional systems subserving current behaviour, with higher irregularity corresponding to later acquired and more complex behaviour.
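
For readers unfamiliar with the measure, a compact sample entropy implementation follows (a standard formulation with Chebyshev distance and the common tolerance r = 0.2 times the standard deviation; the placeholder series stands in for real RR-interval data):

import numpy as np

def sample_entropy(x, m=2, r=None):
    # SampEn(m, r) = -ln(A / B): B counts template pairs of length m within
    # tolerance r, A counts the same for length m + 1; self-matches excluded.
    x = np.asarray(x, float)
    if r is None:
        r = 0.2 * x.std()

    def count_matches(length):
        templates = np.array([x[i:i + length] for i in range(len(x) - length)])
        count = 0
        for i in range(len(templates)):
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(dist <= r)
        return count

    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

rr = np.random.randn(300)  # placeholder for an RR-interval series
print(sample_entropy(rr))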

]]>Entropy doi: 10.3390/e20060448

Authors: Bing Li Mingliang Liu Zijian Guo Yamin Ji

The mechanical fault diagnosis results of high voltage circuit breakers (HVCBs) are mainly determined by the feature vector and classifier used. In order to obtain more remarkable signal characteristics and a robust classifier suitable for small-sample classification, a new mechanical fault diagnosis method is proposed in this paper. Firstly, the vibration signals of HVCBs are collected by a purpose-built acquisition system, and the noise in the signals is eliminated by a soft-threshold de-noising method. Secondly, the empirical wavelet transform (EWT) is adopted to decompose the signals into a series of physically meaningful modes, and then the improved time-frequency entropy (ITFE) method is used to extract the characteristics of the vibration signals. Finally, a generalized regression neural network (GRNN) is used to identify four types of vibration signals of HVCBs, with the smoothing parameter δ of the GRNN optimized by a loop traversal method. The experimental results show that the fault diagnosis method with this optimal classifier has better generalization performance, with a recognition rate for unknown samples of over 95%, and that the signal features obtained by the ITFE method are more significant than those of the traditional TFE method.

]]>Entropy doi: 10.3390/e20060447

Authors: Yubing Huang Wei Dai Lianxi Liu Yu Zhao

During the assembly process, there are inevitable variations and noise factors in the material properties, process parameters and screening scheme, which may affect the quality of the product. Using the stress-strength model, a method for evaluating screening schemes by analyzing the variation of the defect density in the assembly process is proposed and discussed. The influence of screening stress on product defects is considered to determine the screening scheme. We performed the defect stream analysis by calculating the recursive relations of residual defect density under multi-stress conditions. We find that the probability density function, which describes how defects change from latent to dominant over time, agrees very well with the historical data. We also calculate the risk as the entropy of the assembly task. Finally, we verify our method by analyzing the assembly process of a certain product.

]]>Entropy doi: 10.3390/e20060446

Authors: Wei Zhou Jin Chen Bingqing Ding

One important element of military supply transportation is concealment, especially during war preparations and warfare periods. By introducing entropy to calculate the degree of transportation concealment, we investigate the issue of concealed military supply transportation over a whole road network and propose an optimal flow distribution model. This model’s objective function is to maximize the concealment of military supply transportation. After analyzing the road network, classifying the different nodes, summarizing the constraint conditions based on the properties of and assumptions about the transportation process, and incorporating general parameter limits, the optimal flow distribution model is further transformed into a calculable non-linear programming model. Thus, based on this non-linear programming model, we can obtain the optimal distribution scheme for military supply transportation from the perspectives of network analysis and concealment measurement. Lastly, an example of military supply transportation in Jiangsu province, China is presented to demonstrate the feasibility of the proposed model. The managerial implication is that, by utilizing the proposed flow distribution model, military supplies can be efficiently transported to the required destinations while maximizing the degree of concealment. Not only can this model be utilized in real military supply transportation, it can also be applied in other transportation fields that require time efficiency and concealment.

]]>Entropy doi: 10.3390/e20060445

Authors: Chunlei Fan Qun Ding

In this paper, a novel image encryption scheme is proposed for the secure transmission of image data. A self-synchronous chaotic stream cipher is designed with the purpose of resisting active attacks and ensuring limited error propagation of the image data. A two-dimensional discrete wavelet transform and Arnold mapping are used to scramble the pixel values of the original image. A four-dimensional hyperchaotic system with four positive Lyapunov exponents serves as the chaotic sequence generator of the self-synchronous stream cipher in order to enhance the security and complexity of the image encryption system. Finally, the simulation experiment results show that this image encryption scheme is both reliable and secure.

]]>Entropy doi: 10.3390/e20060444

Authors: Mirko Polato Ivano Lauriola Fabio Aiolli

Kernel-based classifiers, such as SVM, are considered state-of-the-art algorithms and are widely used for many classification tasks. However, such methods are hardly interpretable and for this reason are often considered black-box models. In this paper, we propose a new family of Boolean kernels for categorical data where features correspond to propositional formulas applied to the input variables. The idea is to create human-readable features to ease the extraction of interpretation rules directly from the embedding space. Experiments on artificial and benchmark datasets show the effectiveness of the proposed family of kernels with respect to established ones, such as RBF, in terms of classification accuracy.

]]>Entropy doi: 10.3390/e20060443

Authors: Olivier Darrigol

This article is a detailed history of the Gibbs paradox, with philosophical morals. It purports to explain the origins of the paradox, to describe and criticize solutions of the paradox from the early times to the present, to use the history of statistical mechanics as a reservoir of ideas for clarifying foundations and removing prejudices, and to relate the paradox to broad misunderstandings of the nature of physical theory.

]]>Entropy doi: 10.3390/e20060442

Authors: Jack Jewson Jim Q. Smith Chris Holmes

When it is acknowledged that all candidate parameterised statistical models are misspecified relative to the data generating process, the decision maker (DM) must currently concern themselves with inference for the parameter value minimising the Kullback–Leibler (KL) divergence between the model and this process (Walker, 2013). However, it has long been known that minimising the KL-divergence places a large weight on correctly capturing the tails of the sample distribution. As a result, the DM is required to worry about the robustness of their model to tail misspecifications if they want to conduct principled inference. In this paper we alleviate these concerns for the DM. We advance recent methodological developments in general Bayesian updating (Bissiri, Holmes & Walker, 2016) to propose a statistically well-principled Bayesian updating of beliefs targeting the minimisation of more general divergence criteria. We improve both the motivation and the statistical foundations of existing Bayesian minimum divergence estimation (Hooker & Vidyashankar, 2014; Ghosh & Basu, 2016), allowing the well-principled Bayesian to target predictions from the model that are close to the genuine model in terms of some alternative divergence measure to the KL-divergence. Our principled formulation allows us to consider a broader range of divergences than previously considered. In fact, we argue that defining the divergence measure forms an important, subjective part of any statistical analysis, and we aim to provide some decision-theoretic rationale for this selection. We illustrate how targeting alternative divergence measures can impact the conclusions of simple inference tasks, and discuss how our methods might apply to more complicated, high-dimensional models.

]]>Entropy doi: 10.3390/e20060441

Authors: Ingrid Rotter

The aim of this paper is to study the question of whether or not equilibrium states exist in open quantum systems that are embedded in at least two environments and are described by a non-Hermitian Hamilton operator H. The eigenfunctions of H contain the influence of exceptional points (EPs) and external mixing (EM) of the states via the environment. As a result, equilibrium states exist (far from EPs). They are different from those of the corresponding closed system. Their wavefunctions are orthogonal even though the Hamiltonian is non-Hermitian.

]]>Entropy doi: 10.3390/e20060440

Authors: Oliver Passon

We discuss a common misconception regarding the de Broglie&ndash;Bohm (dBB) theory; namely, that it not only assigns a position to each quantum object but also contains the momenta as &ldquo;hidden variables&rdquo;. Sometimes this alleged property of the theory is even used to argue that the dBB theory is inconsistent with quantum theory. We explain why this claim is unfounded and show in particular how this misconception veils the true novelty of the dBB theory.

]]>Entropy doi: 10.3390/e20060439

Authors: Hongzhi Zhang Xiao Liang Guangluan Xu Kun Fu Feng Li Tinglei Huang

Automatic question answering (QA), which can greatly facilitate access to information, is an important task in artificial intelligence. Recent years have witnessed the development of QA methods based on deep learning. However, a great amount of data is needed to train deep neural networks, and it is laborious to annotate training data for factoid QA of new domains or languages. In this paper, a distantly supervised method is proposed to automatically generate QA pairs. Additional effort is devoted to making the generated questions reflect the query interests and expression styles of users by exploring community QA. Specifically, the generated questions are selected according to the estimated probabilities that they are asked. Diverse paraphrases of questions are mined from community QA data, considering that a model trained on monotonous synthetic questions is very sensitive to variants of question expressions. Experimental results show that the model solely trained on data generated via distant supervision and mined paraphrases could answer real-world questions with an accuracy of 49.34%. When limited annotated training data is available, significant improvements can be achieved by incorporating the generated data. An improvement of 1.35 absolute points is still observed on WebQA, a dataset with large-scale annotated training samples.

]]>Entropy doi: 10.3390/e20060438

Authors: Tatsuaki Tsuruyama

Kullback&ndash;Leibler divergence (KLD) is a type of extended mutual entropy, which is used as a measure of information gain when transferring from a prior distribution to a posterior distribution. In this study, KLD is applied to the thermodynamic analysis of cell signal transduction cascades and serves as an alternative to mutual entropy. When KLD is minimized, the divergence is given by the ratio of the prior selection probability of the signaling molecule to the posterior selection probability. Moreover, the information gain over the entire channel is shown to be adequately described by the average KLD production rate. Thus, this approach provides a framework for the quantitative analysis of signal transduction. Furthermore, the proposed approach can identify an effective cascade for a signaling network.
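For reference, the standard discrete form on which the analysis builds, with p the posterior and q the prior selection distribution over signaling states:

```latex
D_{\mathrm{KL}}(p \,\|\, q) \;=\; \sum_{i} p_i \,\log \frac{p_i}{q_i}
```

Each summand involves the log-ratio of posterior to prior selection probabilities, which is the ratio the abstract refers to at the minimum.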

]]>Entropy doi: 10.3390/e20060437

Authors: António M. Lopes J. A. Tenreiro Machado

Climate has complex dynamics due to the plethora of phenomena underlying its evolution. These characteristics pose challenges to conducting solid quantitative analysis and reaching assertive conclusions. In this paper, the global temperature time series (TTS) is viewed as a manifestation of the climate evolution, and its complexity is calculated by means of four different indices, namely the Lempel&ndash;Ziv complexity, sample entropy, signal harmonics power ratio, and fractal dimension. In the first phase, the monthly mean TTS is pre-processed by means of empirical mode decomposition, and the TTS trend is calculated. In the second phase, the complexity of the detrended signals is estimated. The four indices capture distinct features of the TTS dynamics in a four-dimensional space. Hierarchical clustering is adopted for dimensionality reduction and visualization in two dimensions. The results show that TTS complexity exhibits space-time variability, suggesting the presence of distinct climate forcing processes in both dimensions. Numerical examples with real-world data demonstrate the effectiveness of the approach.
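Of the four indices, sample entropy is the one whose implementation details vary most across papers; below is a minimal sketch of one common formulation (embedding dimension m, tolerance r times the standard deviation, Chebyshev distance, self-matches excluded). Edge-handling conventions differ slightly between implementations.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy: -log of the conditional probability that sequences
    similar for m points (within r*std, Chebyshev distance) remain
    similar for m+1 points. Self-matches are excluded."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()

    def count_matches(dim):
        emb = np.array([x[i:i + dim] for i in range(len(x) - dim + 1)])
        count = 0
        for i in range(len(emb) - 1):
            d = np.max(np.abs(emb[i + 1:] - emb[i]), axis=1)
            count += np.sum(d <= tol)
        return count

    B, A = count_matches(m), count_matches(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

rng = np.random.default_rng(0)
print(sample_entropy(rng.standard_normal(1000)))         # high: irregular
print(sample_entropy(np.sin(np.linspace(0, 60, 1000))))  # low: regular
```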

]]>Entropy doi: 10.3390/e20060436

Authors: Antonio M. Scarfone Hiroshi Matsuzoe Tatsuaki Wada

In this paper, we present a review of recent developments on the &kappa;-deformed statistical mechanics in the framework of information geometry. Three different geometric structures are introduced in the &kappa;-formalism, which are obtained starting from three, not equivalent, divergence functions, corresponding to the &kappa;-deformed versions of the Kullback&ndash;Leibler, &ldquo;Kerridge&rdquo; and Bregman divergences. The first statistical manifold, derived from the &kappa;-Kullback&ndash;Leibler divergence, forms an invariant geometry with a positive curvature that vanishes in the &kappa; &rarr; 0 limit. The other two statistical manifolds are related to each other by means of a scaling transform and are both dually flat. They have a dualistic Hessian structure endowed with a deformed Fisher metric and an affine connection that are consistent with a statistical scalar product based on the &kappa;-escort expectation. These flat geometries admit dual potentials corresponding to the thermodynamic Massieu and entropy functions that induce a Legendre structure of &kappa;-thermodynamics in the picture of information geometry.

]]>Entropy doi: 10.3390/e20060435

Authors: Alessandro Bisio Giacomo Mauro D’Ariano Nicola Mosco Paolo Perinotti Alessandro Tosini

We study the solutions of an interacting Fermionic cellular automaton which is the analogue of the Thirring model with both space and time discrete. We present a derivation of the two-particle solutions of the automaton recently reported in the literature, which exploits the symmetries of the evolution operator. In the two-particle sector, the evolution operator is given by a sequence of two steps, the first corresponding to a unitary interaction activated by a two-particle excitation at the same site, and the second to two independent one-dimensional Dirac quantum walks. The interaction step can be regarded as the discrete-time version of the interacting term of some Hamiltonian integrable system, such as the Hubbard or the Thirring model. The present automaton exhibits scattering solutions with nontrivial momentum transfer, jumping between different regions of the Brillouin zone, which can be interpreted as Fermion-doubled particles, in stark contrast with the customary momentum exchange of one-dimensional Hamiltonian systems. A further difference compared to the Hamiltonian model is that there exist bound states for every value of the total momentum and of the coupling constant. Even in the special case of vanishing coupling, the walk manifests bound states for finitely many isolated values of the total momentum. As a complement to the analytical derivations, we show numerical simulations of the interacting evolution.

]]>Entropy doi: 10.3390/e20060434

Authors: Jiuli Yin Cui Su Yongfen Zhang Xinghua Fan

Carbon markets provide a market-based way to reduce climate pollution. Subject to general market regulations, the major existing emission trading markets present complex characteristics. This paper analyzes the complexity of carbon markets by using multi-scale entropy, taking the pilot carbon markets in China as an example. A moving average is adopted to extract the scales, owing to the short length of the data set. Results show a low level of complexity, indicating that China&rsquo;s pilot carbon markets are quite immature and lack market efficiency. However, the complexity varies across time scales. China&rsquo;s carbon markets (except for the Chongqing pilot) are more complex over short periods than in the long term. Furthermore, the complexity level in most pilot markets increases as the markets develop, showing an improvement in market efficiency. All these results demonstrate that an effective carbon market is required for the full function of emission trading.
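The scale-extraction step is simple enough to sketch. Moving-average coarse-graining (below) keeps more points per scale than the usual non-overlapping means, which is the stated reason for preferring it on short series; the entropy estimator applied afterwards is a separate choice.

```python
import numpy as np

def moving_average(x, scale):
    """Moving-average coarse-graining: overlapping windows of width `scale`
    retain len(x) - scale + 1 points, unlike non-overlapping means, which
    retain only len(x) // scale points (a problem for short series)."""
    return np.convolve(np.asarray(x, dtype=float),
                       np.ones(scale) / scale, mode="valid")

# Multiscale entropy then applies an entropy estimator (e.g., sample
# entropy) to moving_average(x, s) for each scale s = 1, 2, 3, ...
x = np.cumsum(np.random.default_rng(0).standard_normal(300))  # toy price path
print([moving_average(x, s).size for s in (1, 2, 5, 10)])
```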

]]>Entropy doi: 10.3390/e20060433

Authors: Nabor O. Castillo Diego I. Gallardo Heleno Bolfarine Héctor W. Gómez

This paper focuses on studying a truncated positive version of the power-normal (PN) model considered in Durrans (1992). The truncation point is considered to be zero so that the resulting model is an extension of the half normal distribution. Some probabilistic properties are studied for the proposed model along with maximum likelihood and moments estimation. The model is fitted to two real datasets and compared with alternative models for positive data. Results indicate good performance of the proposed model.

]]>Entropy doi: 10.3390/e20060432

Authors: Yanguang Chen Bin Jiang

Hierarchies can be modeled by a set of exponential functions, from which we can derive a set of power laws indicative of scaling. The solution to a scaling relation equation is always a power law. Scaling laws are followed by many natural and social phenomena such as cities, earthquakes, and rivers. This paper reveals the power-law behaviors in systems of natural cities by reconstructing the urban hierarchy with a cascade structure. Cities of the U.S.A., Britain, France, and Germany are taken as examples to perform empirical analyses. The hierarchical scaling relations can be well fitted to the data points within the scaling ranges of the number, size and area of the natural cities. The size-number and area-number scaling exponents are close to 1, and the size-area allometric scaling exponent is slightly less than 1. The results show that natural cities follow hierarchical scaling laws very well. The principle of entropy maximization of urban evolution is then employed to explain the hierarchical scaling laws, and different entropy-maximizing processes are used to interpret the scaling exponents. This study helps scientists understand the power-law behavior in the development of cities and systems of cities.
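To make the kind of fit concrete: on synthetic Zipf-like city sizes (an illustrative stand-in for the natural-city data, not the paper's datasets), a log-log least-squares fit within the scaling range recovers an exponent close to 1, as reported.

```python
import numpy as np

# Synthetic rank-size data: size ~ rank^(-q) with multiplicative noise.
rng = np.random.default_rng(1)
ranks = np.arange(1, 501)
sizes = 1e6 * ranks ** -1.05 * np.exp(rng.normal(0, 0.05, ranks.size))

# A power law is linear in log-log coordinates; the slope is -q.
slope, intercept = np.polyfit(np.log(ranks), np.log(sizes), 1)
print(f"estimated scaling exponent: {-slope:.3f}")  # close to the true 1.05
```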

]]>Entropy doi: 10.3390/e20060431

Authors: Sang Won Choi Won-Yong Shin Juyeop Kim

Near-optimal transmit beamformers are designed for multiuser multiple-input single-output interference channels with slowly time-varying block fading. The main contribution of this article is to provide a method for deriving closed-form solutions to effective beamforming in both low and high signal-to-noise ratio regimes. The proposed method basically leverages side information obtained from the channel correlation between adjacent coding blocks. More specifically, our methodology is based on a linear algebraic approach, which is more efficient than the optimal scheme based on the Gaussian input in the sense of reducing the average number of search space dimensions for designing the near-optimal transmit beamformers. The proposed method is shown to exhibit near-optimal performance via computer simulations in terms of the average sum-rate.

]]>Entropy doi: 10.3390/e20060430

Authors: Lorenzo Magnani

Eco-cognitive computationalism sees computation in context, exploiting the ideas developed in those projects that have originated the recent views on embodied, situated, and distributed cognition. Turing&rsquo;s original intellectual perspective has already clearly depicted the evolutionary emergence in humans of information, meaning, and of the first rudimentary forms of cognition, as the result of a complex interplay and simultaneous coevolution, in time, of the states of brain/mind, body, and external environment. This cognitive process played a fundamental heuristic role in Turing&rsquo;s invention of the universal logical computing machine. It is by extending this eco-cognitive perspective that we can see that the recent emphasis on the simplification of cognitive and motor tasks generated in organic agents by morphological aspects implies the construction of appropriate &ldquo;mimetic bodies&rdquo;, able to render the accompanied computation simpler, according to a general appeal to the &ldquo;simplexity&rdquo; of animal embodied cognition. I hope it will become clear that eco-cognitive computationalism does not aim at furnishing a final and stable definition of the concept of computation, such as a textbook or a different epistemological approach could provide: I intend to take into account the historical and dynamical character of the concept, to propose an intellectual framework that depicts how we can understand not only the change of its meaning, but also the &ldquo;emergence&rdquo; of new forms of computations.

]]>Entropy doi: 10.3390/e20060429

Authors: Ehsan Gholamalizadeh Jae Dong Chung

Recent developments in solar thermal systems have aroused considerable interest in several countries with high solar potential. One of the most promising solar driven technologies is the solar thermal dish-Stirling system. One of the main issues of the solar dish&ndash;Stirling system is thermal losses from its components. The majority of the thermal losses of the system occur through its receiver before the thermal energy is converted to electrical energy by the Stirling engine. The goal of this investigation is to analyze the thermal performance of the receiver of a standalone pilot solar dish&ndash;Stirling system installed in Kerman City, Iran, to be used in remote off-grid areas of the Kerman Province. An analytical model was developed to predict the input energy, thermal losses, and thermal efficiency of the receiver. The receiver thermal model was first validated by comparing simulation results to experimental measurements for the EuroDish project. Then, the incident flux intensity intercepted by the receiver aperture, the thermal losses through the receiver (including conduction, convection, and radiation losses), and the power output during daytime hours (average day of each month for a year) were predicted. The results showed that the conduction loss was small, while the convection and radiation losses played major roles in the total thermal losses through the receiver. The convection loss is dominant during the early morning and later evening hours, while radiation loss reaches its highest value near midday. Finally, the thermal efficiency of the receiver and the power output for each working hour throughout the year were calculated. The maximum performance of the system occurred at midday in the middle of July, with a predicted power output of 850 W, and a receiver efficiency of about 60%. At this time, a conduction loss of about 266 W, a convection loss of 284 W, and a radiation loss of about 2000 W were estimated.

]]>Entropy doi: 10.3390/e20060428

Authors: Hui Chen Yuanchun Chen Shaotong Sun Ying Hu Jun Feng

According to the fact that high frequencies are abnormally attenuated when seismic signals travel across reservoirs, a new method, named high-precision time-frequency entropy based on the synchrosqueezing generalized S-transform, is proposed for hydrocarbon reservoir detection in this paper. First, the proposed method obtains the time-frequency spectra by the synchrosqueezing generalized S-transform (SSGST), which are concentrated around the real instantaneous frequency of the signals. Then, considering the characteristics and effects of noise, we give a frequency constraint condition to calculate the entropy based on the time-frequency spectra. A synthetic example verifies that the entropy will be abnormally high when seismic signals undergo abnormal attenuation. In addition, compared with the GST time-frequency entropy and the original SSGST time-frequency entropy on field data, the results of the proposed method show higher precision. Moreover, the proposed method can not only accurately detect and locate hydrocarbon reservoirs, but also effectively suppress the impact of random noise.

]]>Entropy doi: 10.3390/e20060427

Authors: Yonghua You Zhongda Wu Yong Yang Jie Yu Dong Zhang Zhuang Zhang

In the current work, a novel 2D numerical model of stationary grids was developed for reciprocating magnetic refrigerators with Gd plates, in which the magneto-caloric properties derived from the Weiss molecular field theory were adopted for the built-in energy source of the magneto-caloric effect. The numerical simulation was conducted under different structural and operational parameters, and the effects of the relative fluid displacement (&phi;) on the specific refrigeration capacity (qref) and the Coefficient of Performance (COP) were obtained. In addition, the variations of the entropy generation rate and entropy generation number were studied, and the contours of the local entropy generation rate are presented for discussion. From the current work, it is found that with an increase in &phi;, both qref and COP follow a convex variation trend, while the entropy generation number (Ns) varies concavely. For the current cases, the maximal qref and COP were 151.2 kW/m3 and 9.11, respectively, while the lowest Ns was 2.4 &times; 10&minus;4 K&minus;1. However, the optimal &phi; for the largest qref and COP and for the lowest Ns were inconsistent; thus, some compromises need to be made in the optimization of magnetic refrigerators.

]]>Entropy doi: 10.3390/e20060426

Authors: Giorgio Kaniadakis Dionissios T. Hristopulos

Master equations define the dynamics that govern the time evolution of various physical processes on lattices. In the continuum limit, master equations lead to Fokker&ndash;Planck partial differential equations that represent the dynamics of physical systems in continuous spaces. Over the last few decades, nonlinear Fokker&ndash;Planck equations have become very popular in condensed matter physics and in statistical physics. Numerical solutions of these equations require the use of discretization schemes. However, the discrete evolution equation obtained by the discretization of a Fokker&ndash;Planck partial differential equation depends on the specific discretization scheme. In general, the discretized form is different from the master equation that has generated the respective Fokker&ndash;Planck equation in the continuum limit. Therefore, the knowledge of the master equation associated with a given Fokker&ndash;Planck equation is extremely important for the correct numerical integration of the latter, since it provides a unique, physically motivated discretization scheme. This paper shows that the Kinetic Interaction Principle (KIP) that governs the particle kinetics of many body systems, introduced in G. Kaniadakis, Physica A 296, 405 (2001), univocally defines a very simple master equation that in the continuum limit yields the nonlinear Fokker&ndash;Planck equation in its most general form.

]]>Entropy doi: 10.3390/e20060425

Authors: Zhe Chen Yaan Li Hongtao Liang Jing Yu

The classification performance of passive sonar can be improved by extracting the features of ship-radiated noise. Traditional feature extraction methods neglect the nonlinear features in ship-radiated noise, such as entropy. The multiscale sample entropy (MSE) algorithm has been widely used for quantifying the entropy of a signal, but there are still some limitations. To remedy this, the hierarchical cosine similarity entropy (HCSE) is proposed in this paper. Firstly, the hierarchical decomposition is utilized to decompose a time series into some subsequences. Then, the sample entropy (SE) is modified by utilizing Shannon entropy rather than conditional entropy and employing angular distance instead of Chebyshev distance. Finally, the complexity of each subsequence is quantified by the modified SE. Simulation results show that the HCSE method overcomes some limitations in MSE. For example, undefined entropy is not likely to occur in HCSE, and it is more suitable for short time series. Compared with MSE, the experimental results illustrate that the classification accuracy of real ship-radiated noise is significantly improved from 75% to 95.63% by using HCSE. Consequently, the proposed HCSE can be applied in practical applications.
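A minimal sketch of the two ingredients described, under simplifying assumptions: one level of hierarchical decomposition (low- and high-frequency components via pairwise averages and differences), and an entropy step that replaces Chebyshev distance with angular distance and conditional entropy with a binary Shannon entropy of the similarity probability. Parameter names and the multi-level bookkeeping are illustrative, not the authors' exact specification.

```python
import numpy as np

def cosine_similarity_entropy(x, m=2, r=0.1):
    """Entropy step sketch: embed the series, use angular distance between
    embedding vectors (instead of Chebyshev distance), then take the binary
    Shannon entropy of the global similarity probability (instead of a
    conditional entropy). Assumes nonzero embedding vectors."""
    x = np.asarray(x, dtype=float)
    emb = np.array([x[i:i + m] for i in range(len(x) - m + 1)])
    norms = np.linalg.norm(emb, axis=1)
    cos = (emb @ emb.T) / np.outer(norms, norms)
    ang = np.arccos(np.clip(cos, -1.0, 1.0)) / np.pi  # angular distance in [0, 1]
    iu = np.triu_indices(len(emb), k=1)               # distinct pairs only
    p = np.mean(ang[iu] < r)                          # probability of similarity
    if p in (0.0, 1.0):
        return 0.0
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

def hierarchical_split(x):
    """One decomposition level: pairwise averages keep the low-frequency
    component, pairwise differences keep the high-frequency component."""
    x = np.asarray(x, dtype=float)[: len(x) // 2 * 2]
    return (x[::2] + x[1::2]) / 2, (x[::2] - x[1::2]) / 2

rng = np.random.default_rng(0)
sig = rng.standard_normal(500)
low, high = hierarchical_split(sig)
print([cosine_similarity_entropy(s) for s in (sig, low, high)])
```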

]]>Entropy doi: 10.3390/e20060424

Authors: Ferdinando Di Martino Salvatore Sessa

We present a new method for assessing the strength of fuzzy rules with respect to a dataset, based on the measures of the greatest energy and smallest entropy of a fuzzy relation. Considering a fuzzy automaton (relation), in which A is the input fuzzy set and B the output fuzzy set, the fuzzy relation R1 with greatest energy provides information about the greatest strength of the input-output, and the fuzzy relation R2 with the smallest entropy provides information about uncertainty of the input-output relationship. We consider a new index of the fuzziness of the input-output based on R1 and R2. In our method, this index is calculated for each pair of input and output fuzzy sets in a fuzzy rule. A threshold value is set in order to choose the most relevant fuzzy rules with respect to the data.

]]>Entropy doi: 10.3390/e20060423

Authors: Jen-Tsung Hsiang Bei-Lok Hu

Identifying or constructing a fine-grained microscopic theory that will emerge under specific conditions to a known macroscopic theory is always a formidable challenge. Thermodynamics is perhaps one of the most powerful theories and best understood examples of emergence in physical sciences, which can be used for understanding the characteristics and mechanisms of emergent processes, both in terms of emergent structures and the emergent laws governing the effective or collective variables. Viewing quantum mechanics as an emergent theory requires a better understanding of all this. In this work we aim at a very modest goal, not quantum mechanics as thermodynamics, not yet, but the thermodynamics of quantum systems, or quantum thermodynamics. We will show why even with this minimal demand, there are many new issues which need to be addressed and new rules formulated. The thermodynamics of small quantum many-body systems strongly coupled to a heat bath at low temperatures with non-Markovian behavior contains elements, such as quantum coherence, correlations, entanglement and fluctuations, that are not well recognized in traditional thermodynamics, built on large systems vanishingly weakly coupled to a non-dynamical reservoir. For quantum thermodynamics at strong coupling, one needs to reexamine the meaning of the thermodynamic functions, the viability of the thermodynamic relations and the validity of the thermodynamic laws anew. After a brief motivation, this paper starts with a short overview of the quantum formulation based on Gelin &amp; Thoss and Seifert. We then provide a quantum formulation of Jarzynski&rsquo;s two representations. We show how to construct the operator thermodynamic potentials, the expectation values of which provide the familiar thermodynamic variables. Constructing the operator thermodynamic functions and verifying or modifying their relations is a necessary first step in the establishment of a viable thermodynamics theory for quantum systems. We mention noteworthy subtleties for quantum thermodynamics at strong coupling, such as issues related to energy and entropy, and possible ambiguities of their operator forms. We end by indicating some fruitful pathways for further developments.

]]>Entropy doi: 10.3390/e20060422

Authors: Travis Norsen

The de Broglie-Bohm pilot-wave theory promises not only a realistic description of the microscopic world (in particular, a description in which observers and observation play no fundamental role) but also the ability to derive and explain aspects of the quantum formalism that are, instead, (awkwardly and problematically) postulated in orthodox versions of quantum theory. Chief among these are the various &ldquo;measurement axioms&rdquo; and in particular the Born rule expressing the probability distribution of measurement outcomes. Compared to other candidate non-orthodox quantum theories, the pilot-wave theory suffers from something of an embarrassment of riches in regard to explaining the Born rule statistics, in the sense that there exist, in the literature, not just one but two rather compelling proposed explanations. This paper is an attempt to critically review and clarify these two competing arguments. We summarize both arguments and also survey some objections that have been given against them. In the end, we suggest that there is somewhat less conflict between the two approaches than existing polemics might suggest, and that indeed elements from both arguments may be combined to provide a unified and fully-compelling explanation, from the postulated dynamical first principles, of the Born rule.

]]>Entropy doi: 10.3390/e20060420

Authors: Gregg Jaeger

In the Copenhagen approach to quantum mechanics as characterized by Heisenberg, probabilities relate to the statistics of measurement outcomes on ensembles of systems and to individual measurement events via the actualization of quantum potentiality. Here, brief summaries are given of a series of key results of different sorts that have been obtained since the final elements of the Copenhagen interpretation were offered and it was explicitly named so by Heisenberg&mdash;in particular, results from the investigation of the behavior of quantum probability since that time, the mid-1950s. This review shows that these developments have increased the value to physics of notions characterizing the approach which were previously either less precise or mainly symbolic in character, including complementarity, indeterminism, and unsharpness.

]]>Entropy doi: 10.3390/e20060421

Authors: Guomin Luo Changyuan Yao Yinglin Liu Yingjie Tan Jinghan He

Protection based on transient information is the primary protection of high voltage direct current (HVDC) transmission systems. As a major part of the protection function, accurate identification of transient surges is crucial to ensure the performance and accuracy of protection algorithms. Recognition of transient surges in an HVDC system faces two challenges: signal distortion and a small number of samples. Entropy, which is stable in representing frequency distribution features, and the support vector machine (SVM), which is good at dealing with limited numbers of samples, are adopted and combined in this paper to solve the transient recognition problem. Three commonly detected transient surges&mdash;single-pole-to-ground fault (GF), lightning fault (LF), and lightning disturbance (LD)&mdash;are simulated in various scenarios and recognized with the proposed method. The proposed method is shown to be effective in both feature extraction and type classification and shows great potential in protection applications.
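A hedged sketch of the pipeline described, entropy-based features feeding an SVM. The spectral entropy below stands in for whichever entropy variant the authors extract, and the damped-sinusoid "surges" and labels are synthetic placeholders, not simulated HVDC data.

```python
import numpy as np
from sklearn.svm import SVC

def spectral_entropy(sig):
    """Shannon entropy of the normalized power spectrum of a transient."""
    psd = np.abs(np.fft.rfft(sig)) ** 2
    p = psd / psd.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 256)

def make_surges(freq, n=200):
    """Damped sinusoids as crude stand-ins for one surge class."""
    return [np.sin(2 * np.pi * freq * t) * np.exp(-5 * t)
            + 0.2 * rng.standard_normal(t.size) for _ in range(n)]

signals = make_surges(5) + make_surges(40) + make_surges(120)
X = np.array([[spectral_entropy(s), np.abs(s).max()] for s in signals])
y = np.repeat([0, 1, 2], 200)        # placeholder labels: GF, LF, LD

clf = SVC(kernel="rbf").fit(X, y)    # entropy features + SVM classifier
print(clf.score(X, y))
```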

]]>Entropy doi: 10.3390/e20060419

Authors: Rui Liu Bharat Karumuri Joshua Adkinson Timothy Noah Hutson Ioannis Vlachos Leon Iasemidis

Quantification of the complexity of signals recorded concurrently from multivariate systems, such as the brain, plays an important role in the study and characterization of their state and state transitions. Multivariate analysis of the electroencephalographic signals (EEG) over time is conceptually most promising in unveiling the global dynamics of dynamical brain disorders such as epilepsy. We employed a novel methodology to study the global complexity of the epileptic brain en route to seizures. The developed measures of complexity were based on Multivariate Matching Pursuit (MMP) decomposition of signals in terms of time&ndash;frequency Gabor functions (atoms) and Shannon entropy. The measures were first validated on simulation data (Lorenz system) and then applied to EEGs from preictal (before seizure onsets) periods, recorded by intracranial electrodes from eight patients with temporal lobe epilepsy and a total of 42 seizures, in search of global trends of complexity before seizures onset. Out of five Gabor measures of complexity we tested, we found that our newly defined measure, the normalized Gabor entropy (NGE), was able to detect statistically significant (p &lt; 0.05) nonlinear trends of the mean global complexity across all patients over 1 h periods prior to seizures&rsquo; onset. These trends pointed to a slow decrease of the epileptic brain&rsquo;s global complexity over time accompanied by an increase of the variance of complexity closer to seizure onsets. These results show that the global complexity of the epileptic brain decreases at least 1 h prior to seizures and imply that the employed methodology and measures could be useful in identifying different brain states, monitoring of seizure susceptibility over time, and potentially in seizure prediction.

]]>Entropy doi: 10.3390/e20060418

Authors: Jingbo Liu Thomas A. Courtade Paul W. Cuff Sergio Verdú

Inspired by the forward and the reverse channels from the image-size characterization problem in network information theory, we introduce a functional inequality that unifies both the Brascamp-Lieb inequality and Barthe&rsquo;s inequality, which is a reverse form of the Brascamp-Lieb inequality. For Polish spaces, we prove its equivalent entropic formulation using the Legendre-Fenchel duality theory. Capitalizing on the entropic formulation, we elaborate on a &ldquo;doubling trick&rdquo; used by Lieb and Geng-Nair to prove the Gaussian optimality in this inequality for the case of Gaussian reference measures.

]]>Entropy doi: 10.3390/e20060417

Authors: Petre Caraiani

In this paper, I propose a methodology to study the comovement between the entropy of different financial markets. The entropy is derived using singular value decomposition of the components of stock market indices in financial markets from selected developed economies, i.e., France, Germany, the United Kingdom, and the United States. I study how a shock in the entropy in the United States affects the entropy in the other financial markets. I also model the entropy using a dynamic factor model and derive a common factor behind the entropy movements in these four markets.
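One standard way to derive an entropy from the SVD of a panel of index components is sketched below; the paper's exact construction may differ. Here the squared singular values are normalized into variance shares before taking Shannon entropy, and the returns matrix is a synthetic placeholder.

```python
import numpy as np

def svd_entropy(returns):
    """Normalized Shannon entropy of the variance shares given by the
    singular values of a (time x components) returns matrix; values near 1
    indicate diffuse structure, values near 0 a dominant common factor."""
    s = np.linalg.svd(returns - returns.mean(axis=0), compute_uv=False)
    p = s ** 2 / np.sum(s ** 2)
    return -np.sum(p * np.log(p)) / np.log(len(p))

rng = np.random.default_rng(0)
R = rng.standard_normal((250, 30)) * 0.01    # placeholder daily returns
print(svd_entropy(R))                        # near 1: no common structure
common = rng.standard_normal((250, 1)) * 0.01
print(svd_entropy(R * 0.2 + common))         # lower: strong common factor
```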

]]>Entropy doi: 10.3390/e20060416

Authors: Tarannom Parhizkar Samaneh Balali Ali Mosleh

Oil pipeline network system health monitoring is important primarily due to the high cost of failure consequences. Optimal sensor selection helps provide more effective system health information from the perspective of economic and technical constraints. Optimization models confront different issues. For instance, many oil pipeline system performance models are inherently nonlinear, requiring nonlinear modelling. Optimization also confronts modeling uncertainties. Oil pipeline systems are among the most complicated and uncertain dynamic systems, as they include human elements, complex failure mechanisms, control systems, and most importantly component interactions. In this paper, an entropy-based Bayesian network optimization methodology for sensor selection and placement under uncertainty is developed. Entropy, a commonly used measure of information, is often used to characterize uncertainty, particularly to quantify the effectiveness of the measured signals of sensors in system health monitoring contexts. The entropy-based Bayesian network optimization outlined herein also incorporates the effect that sensor reliability has on the system information entropy content, which can also be related to the sensor cost. This approach is developed further by incorporating system information entropy and sensor costs in order to evaluate the performance of sensor combinations. The paper illustrates the approach using a simple oil pipeline network example. The particle swarm optimization algorithm is used to solve the multi-objective optimization model, establishing the Pareto frontier.

]]>Entropy doi: 10.3390/e20060415

Authors: Lei Zhang Lingen Chen Shaojun Xia Chao Wang Fengrui Sun

Thermal design and optimization for reverse water gas shift (RWGS) reactors is particularly important to fuel synthesis in naval or commercial scenarios. The RWGS reactor with irreversibilities of heat transfer, chemical reaction and viscous flow is studied based on finite time thermodynamics or entropy generation minimization theory in this paper. The total entropy generation rate (EGR) in the RWGS reactor with different boundary conditions is minimized subject to specific feed compositions and chemical conversion using optimal control theory, and the optimal configurations obtained are compared with three reference reactors with linear, constant reservoir temperature and constant heat flux operations, which are commonly used in engineering. The results show that a drastic EGR reduction of up to 23% can be achieved by optimizing the reservoir temperature profile, the inlet temperature of feed gas and the reactor length simultaneously, compared to that of the reference reactor with the linear reservoir temperature. These optimization efforts are mainly achieved by reducing the irreversibility of heat transfer. Optimal paths have subsections of relatively constant thermal force, chemical force and local EGR. A conceptual optimal design of sandwich structure for the compact modular reactor is proposed, without elaborate control tools or excessive interstage equipment. The results can provide guidelines for designing industrial RWGS reactors in naval or commercial scenarios.

]]>Entropy doi: 10.3390/e20060414

Authors: Vasily Tarasov Valentina Tarasova

In this paper, we propose criteria for the existence of power-law type (PLT) memory in economic processes. We give a criterion for the existence of power-law long-range dependence in time by using the analogy with the concept of the long-range alpha-interaction. We also suggest a criterion for the existence of PLT memory in the frequency domain by using the concept of non-integer dimensions. For an economic process in which an endogenous variable is known to depend on an exogenous variable, the proposed criteria make it possible to identify the presence of PLT memory. The suggested criteria are illustrated by various examples. The use of the proposed criteria allows us to apply fractional calculus to construct dynamic models of economic processes. These criteria can also be used to identify the linear integro-differential operators that can be considered as fractional derivatives and integrals of non-integer orders.

]]>Entropy doi: 10.3390/e20060413

Authors: Rodolfo Gambini Jorge Pullin

The Montevideo interpretation of quantum mechanics, which consists of supplementing environmental decoherence with fundamental limitations in measurement stemming from gravity, has been described in several publications. However, some of them appeared before the full picture provided by the interpretation was developed. As such, it can be difficult to get a good understanding via the published literature. Here, we summarize it in a self-contained brief presentation including all its principal elements.

]]>Entropy doi: 10.3390/e20060412

Authors: Mohammad Ishaq Gohar Ali Zahir Shah Saeed Islam Sher Muhammad

This research paper investigates entropy generation analysis of the two-dimensional nanofluid film flow of an Eyring&ndash;Powell fluid with heat and mass transmission over an unsteady porous stretching sheet in the presence of a uniform magnetic field (MHD). The flow of the liquid film is taken under the impact of thermal radiation. The basic time-dependent equations of heat transfer, momentum and mass transfer are modeled and converted to a system of differential equations by employing an appropriate similarity transformation with unsteady dimensionless parameters. Entropy analysis is the main focus of this work, and the impact of physical parameters on the entropy profile is discussed in detail. The influence of thermophoresis and Brownian motion is included in the nanofluid model. An optimal approach has been applied to acquire the solution of the modeled problem, and the convergence of the HAM (Homotopy Analysis Method) has been presented numerically. The variation of the Nusselt number, skin friction and Sherwood number, and their influence on the velocity, heat and concentration fields, has been scrutinized. Moreover, for comprehension, the physical behavior of the embedded parameters is explored analytically for entropy generation and discussed.

]]>Entropy doi: 10.3390/e20060411

Authors: Allen Parks Scott Spence

Simple intuitive models are presented for the capacity and entropy of retro-causal channels in measured ensembles of quantum systems which can be represented as statistical mixtures of pre-selected only and pre- and post-selected systems. Measurement data from a twin Mach-Zehnder interferometer experiment are used in these models to discuss the capacity and entropy of an apparent retro-causal channel observed in the experimental data. It is noted that low capacity/low entropy retro-causal channels can exist in strong measurement systems.

]]>Entropy doi: 10.3390/e20060410

Authors: Ken Wharton

Globally-constrained classical fields provide an unexplored framework for modeling quantum phenomena, including apparent particle-like behavior. By allowing controllable constraints on unknown past fields, these models are retrocausal but not retro-signaling, respecting the conventional block universe viewpoint of classical spacetime. Several example models are developed that resolve the most essential problems with using classical electromagnetic fields to explain single-photon phenomena. These models share some similarities with Stochastic Electrodynamics, but without the infinite background energy problem, and with a clear path to explaining entanglement phenomena. Intriguingly, the average intermediate field intensities share a surprising connection with quantum &ldquo;weak values&rdquo;, even in the single-photon limit. This new class of models is hoped to guide further research into spacetime-based accounts of weak values, entanglement, and other quantum phenomena.

]]>Entropy doi: 10.3390/e20060409

Authors: D. Elusaí Millán-Ocampo Arianna Parrales-Bahena J. Gonzalo González-Rodríguez Susana Silva-Martínez Jesús Porcayo-Calderón J. Alfredo Hernández-Pérez

In this work, three models based on Artificial Neural Networks (ANN) were developed to describe the inhibition behavior for the corrosion of bronze in 3.5% NaCl + 0.1 M Na2SO4, using experimental data from Electrochemical Impedance Spectroscopy (EIS). The database was divided randomly into training, validation, and test sets. The process parameters used as inputs of the ANN models were frequency, temperature, and inhibitor concentration. Each ANN model predicted one of the components of the EIS spectrum (Zre, Zim, and Zmod). The transfer functions used for the learning process were the hyperbolic tangent sigmoid in the hidden layer and linear in the output layer, while the Levenberg&ndash;Marquardt algorithm was applied to determine the optimum values of the weights and biases. The statistical analysis of the results revealed that the ANN models for Zre, Zim, and Zmod can successfully predict the inhibition corrosion behavior of bronze under different conditions, considering variability in temperature, frequency, and inhibitor concentration. In addition, a sensitivity analysis showed that these three input parameters were all key to describing the behavior.
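A hedged sketch of the stated architecture (one tanh hidden layer, linear output). scikit-learn has no Levenberg&ndash;Marquardt solver, so L-BFGS stands in here, and all data below are placeholders rather than EIS measurements.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
freq = 10 ** rng.uniform(-2, 5, 300)      # Hz, placeholder frequency sweep
temp = rng.uniform(25, 60, 300)           # deg C, placeholder
conc = rng.uniform(0.0, 1e-3, 300)        # M, placeholder inhibitor level
X = np.column_stack([np.log10(freq), temp, conc])
Zre = 1e3 / (1 + (freq * 1e-3) ** 2) + rng.normal(0, 5, 300)  # toy target

# tanh hidden layer + linear output, as described; one such network
# would be trained per output component (Zre, Zim, Zmod).
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(10,), activation="tanh",
                 solver="lbfgs", max_iter=5000, random_state=0),
)
model.fit(X, Zre)
print(model.score(X, Zre))
```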

]]>Entropy doi: 10.3390/e20060408

Authors: Jun-Lin Lin

Measuring the consensus for a group of ordinal-type responses is of practical importance in decision making. Many consensus measures appear in the literature, but they sometimes provide inconsistent results. Therefore, it is crucial to compare these consensus measures, and analyze their relationships. In this study, we targeted five consensus measures: &Phi;e (from entropy), &Phi;1 (from absolute deviation), &Phi;2 (from variance), &Phi;3 (from skewness), and &Phi;mv (from conditional probability). We generated 316,251 probability distributions, and analyzed the relationships among their consensus values. Our results showed that &Phi;1, &Phi;e, &Phi;2, and &Phi;3 tended to provide consistent results, and the ordering &Phi;1 &le; &Phi;e &le; &Phi;2 &le; &Phi;3 held at a high probability. Although &Phi;mv had a positive correlation with &Phi;1, &Phi;e, &Phi;2, and &Phi;3, it had a much lower tolerance for even a small proportion of extreme opposite opinions than &Phi;1, &Phi;e, &Phi;2, and &Phi;3 did.
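For orientation, one standard entropy-based consensus measure (close in spirit to, but not necessarily identical with, the paper's &Phi;e) normalizes the Shannon entropy of the response distribution, as sketched below. The polarized case in the last line is the kind of distribution on which measures disagree most: &Phi;mv, per the abstract, is far less tolerant of such extreme opposite opinions.

```python
import numpy as np

def consensus_entropy(p):
    """Entropy-based consensus for an ordinal response distribution p:
    1 at unanimity, 0 at the uniform distribution (one common form;
    the paper's exact Phi_e may differ)."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    H = -np.sum(nz * np.log(nz))
    return 1.0 - H / np.log(len(p))

print(consensus_entropy([1.0, 0, 0, 0, 0]))            # 1.0: unanimity
print(consensus_entropy([0.2, 0.2, 0.2, 0.2, 0.2]))    # 0.0: maximal dissent
print(consensus_entropy([0.5, 0, 0, 0, 0.5]))          # ~0.57: polarized split
```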

]]>Entropy doi: 10.3390/e20060407

Authors: Wentao Ma Dongqiao Zheng Zhiyu Zhang Jiandong Duan Jinzhe Qiu Xianzhi Hu

To address the sparse system identification problem under noisy input and non-Gaussian output measurement noise, two novel types of sparse bias-compensated normalized maximum correntropy criterion algorithms are developed, which are capable of eliminating the impact of non-Gaussian measurement noise and noisy input. The first is developed by using the correntropy-induced metric as the sparsity penalty constraint, which is a smoothed approximation of the ℓ0 norm. The second is designed using the proportionate update scheme, which facilitates the close tracking of system parameter change. Simulation results confirm that the proposed algorithms can effectively improve the identification performance compared with other algorithms presented in the literature for the sparse system identification problem.

]]>Entropy doi: 10.3390/e20060406

Authors: Karl Svozil

Extensions of the Kochen&ndash;Specker theorem use quantum logics whose classical interpretation suggests a true-implies-value indefiniteness property. This can be interpreted as an indication that any view of a quantum state beyond a single context is epistemic. A remark by Gleason about the ad hoc construction of probability measures in Hilbert spaces as a result of the Pythagorean property of vector components is interpreted platonically. Unless there is a total match between preparation and measurement contexts, information about the former from the latter is not ontic, but epistemic. This is corroborated by configurations of observables and contexts with a true-implies-value indefiniteness property.

]]>Entropy doi: 10.3390/e20060405

Authors: Thomas J. Salez Sawako Nakamae Régine Perzynski Guillaume Mériguet Andrejs Cebers Michel Roger

An analytical model describing thermoelectric potential production in magnetic nanofluids (dispersions of magnetic and charged colloidal particles in liquid media) is presented. The two major entropy sources, the thermogalvanic and thermodiffusion processes, are considered. The thermodiffusion term is described in terms of three physical parameters: the diffusion coefficient, the Eastman entropy of transfer and the electrophoretic charge number of the colloidal particles, all of which depend on the particle concentration and on the applied magnetic field strength and direction. The results are combined with the well-known formulation of the thermoelectric potential in thermogalvanic cells and compared to the recent observation of Seebeck coefficient enhancement/diminution in magnetic nanofluids in polar media.

]]>Entropy doi: 10.3390/e20060404

Authors: Zhenqing Wang Mingfang Shi Xiaojun Tang Hongqing Lv Lidan Xu

A high-order finite difference method was used to simulate the hypersonic flow field over a blunt cone with roughness elements of different heights. The unsteady flow field induced by pulse disturbances was analyzed and compared with that under continuous disturbances. The temporal and spatial evolution characteristics of disturbances in the boundary layer were investigated, and the propagation of different disturbance modes in the boundary layer was studied through the fast Fourier transform (FFT) method. The effect of the roughness element on the receptivity characteristics of the hypersonic boundary layer under pulse entropy disturbances was explored. The results showed that the different mode disturbances near the roughness in the boundary layer were amplified in the upstream half of the roughness element and suppressed in the downstream half. However, the effect of the roughness weakened gradually as the disturbance frequency in the boundary layer increased. A phenomenon of mode competition existed in the downstream region of the roughness element. As the disturbances propagated downstream, the fundamental mode gradually became the dominant mode. A certain promotion effect on the mode competition was induced by the roughness element, and the effect was enhanced with increasing roughness element height.

]]>Entropy doi: 10.3390/e20060402

Authors: Shixue Sun Yu Jin Chang Chen Baoqing Sun Zhixin Cao Iek Lo Qi Zhao Jun Zheng Yan Shi Xiaohua Zhang

Asthma is a chronic respiratory disease characterized by unpredictable flare-ups, for which continuous lung function monitoring is key to symptom control. To find new indices to individually classify severity and predict disease prognosis, continuous physiological data collected from monitoring devices are being studied from different perspectives. Entropy, as an analysis method for quantifying the inner irregularity of data, has been widely applied to physiological signals. However, to our knowledge, no study has summarized the complexity differences of various physiological signals in asthmatic patients. Therefore, we organized a systematic review to summarize the complexity differences of important signals in patients with asthma. We searched several medical databases and systematically reviewed existing asthma clinical trials in which entropy changes in physiological signals were studied. In conclusion, we find that, for airflow, heart rate variability, center of pressure and respiratory impedance, entropy values decrease significantly in asthma patients compared to those of healthy people, while, for respiratory sound and airway resistance, entropy values increase along with the progression of asthma. Entropy of some signals, such as the respiratory inter-breath interval, shows strong potential as a novel index of asthma severity. These results give valuable guidance for the utilization of entropy in physiological signals and should promote the development of asthma management and diagnosis using continuous monitoring data in the future.

]]>Entropy doi: 10.3390/e20060403

Authors: Ganeshsree Selvachandran Harish Garg Shio Gai Quek

The complex vague soft set (CVSS) model is a hybrid of complex fuzzy sets and soft sets that has the ability to accurately represent and model two-dimensional information for real-life phenomena that are periodic in nature. In existing studies of fuzzy sets and their extensions, the uncertainties present in the data are handled with the help of a membership degree, which is a subset of the real numbers. In the present work, this condition is relaxed to degrees whose ranges are a subset of the unit disc in the complex plane, which handles the information in a better way. Under this environment, we develop some entropy measures of the CVSS model induced by the axiomatic definition of distance measure. Some desirable relations between them are also investigated. A numerical example related to the detection of an image by a robot is given to illustrate the proposed entropy measure.

]]>Entropy doi: 10.3390/e20060401

Authors: Rui She Shanyun Liu Pingyi Fan

Information transfer that characterizes the variation of information features can have a crucial impact on big data analytics and processing. In fact, a measure of information transfer can reflect system change statistically by using variable distributions, in the manner of the Kullback-Leibler (KL) divergence and the R&eacute;nyi divergence. Furthermore, to some degree, small probability events may carry the most important part of the total message in an information transfer of big data. Therefore, it is significant to propose an information transfer measure with respect to message importance from the viewpoint of small probability events. In this paper, we present the message importance transfer measure (MITM) and analyze its performance and applications in three aspects. First, we discuss the robustness of the MITM by using it to measure information distance. Then, we present a message importance transfer capacity by resorting to the MITM and give an upper bound for the information transfer process with disturbance. Finally, we apply the MITM to discuss queue length selection, a fundamental problem in caching operations for mobile edge computing.

]]>Entropy doi: 10.3390/e20060400

Authors: Iqbal M. Batiha Reyad El-Khazali Ahmed AlSaedi Shaher Momani

This paper introduces a general solution of singular fractional-order linear time-invariant (FoLTI) continuous systems using the Adomian Decomposition Method (ADM) based on Caputo's definition of the fractional-order derivative. The complexity of their entropy lies in defining the complete solution of such systems, which depends on introducing a method of decomposing their dynamic states from their static states. The solution is formulated by converting the singular system of regular pencils into a recursive form using a sequence of transformations, which separates the dynamic variables from the algebraic variables. The main idea of this work is demonstrated via numerical examples.

]]>Entropy doi: 10.3390/e20060399

Authors: Jesús Gutiérrez-Gutiérrez Marta Zárraga-Rodríguez Fernando M. Villar-Rosety Xabier Insausti

In this paper, we give upper bounds for the rate-distortion function (RDF) of any Gaussian vector, and we propose coding strategies to achieve such bounds. We use these strategies to reduce the computational complexity of coding Gaussian asymptotically wide sense stationary (AWSS) autoregressive (AR) sources. Furthermore, we also give sufficient conditions for AR processes to be AWSS.

]]>Entropy doi: 10.3390/e20060398

Authors: Chaojun Wang Hongrui Zhao

Distinguishing and characterizing different landscape patterns have long been primary concerns of quantitative landscape ecology. Information theory and entropy-related metrics have provided the deepest insights in complex system analysis, and have high relevance in landscape ecology. However, ideal methods to compare different landscape patterns from an entropy view are still lacking. The overall aim of this research is to propose a new form of spatial entropy (Hs) in order to distinguish and characterize different landscape patterns. Hs is an entropy-related index based on information theory, and integrates proximity as a key spatial component into the measurement of spatial diversity. Proximity comprises two aspects, total edge length and distance, and including both gives richer information about spatial pattern than metrics that consider only one aspect. Thus, Hs provides a novel way to study the spatial structures of landscape patterns where both the edge length and distance relationships are relevant. We compare the performance of Hs and other similar approaches on both simulated and real-life landscape patterns. Results show that Hs is more flexible and objective in distinguishing and characterizing different landscape patterns. We believe that this metric will facilitate the exploration of relationships between landscape patterns and ecological processes.

]]>Entropy doi: 10.3390/e20060397

Authors: Jorge F. Silva

This work addresses the problem of Shannon entropy estimation in countably infinite alphabets by studying and adopting some recent convergence results for the entropy functional, which is known to be a discontinuous function in the space of probabilities on &infin;-alphabets. Sufficient conditions for the convergence of the entropy are used in conjunction with some deviation inequalities (including scenarios with both finitely and infinitely supported assumptions on the target distribution). From this perspective, four plug-in histogram-based estimators are studied, showing that the convergence results are instrumental to derive new strongly consistent estimators for the entropy. The main application of this methodology is a new data-driven partition (plug-in) estimator. This scheme uses the data to restrict the support where the distribution is estimated by finding an optimal balance between estimation and approximation errors. The proposed scheme offers a consistent (distribution-free) estimator of the entropy on &infin;-alphabets, and optimal rates of convergence under certain regularity conditions (a finitely supported but unknown distribution, and tail-bounded conditions on the target distribution).
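The plug-in idea with a data-driven support restriction is easy to sketch; the simple count threshold below is an illustrative stand-in for the paper's optimal estimation/approximation balance, not its actual rule.

```python
import numpy as np
from collections import Counter

def plugin_entropy(samples, min_count=2):
    """Histogram plug-in Shannon entropy over a data-restricted support:
    symbols seen fewer than min_count times are dropped, and the empirical
    distribution is renormalized before computing -sum p log p."""
    counts = Counter(samples)
    kept = {k: c for k, c in counts.items() if c >= min_count}
    n = sum(kept.values())
    p = np.array([c / n for c in kept.values()])
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(0)
# Geometric source on a countably infinite alphabet; true entropy is finite.
data = rng.geometric(0.3, size=10000)
print(plugin_entropy(data))
q = 0.3  # closed-form entropy of Geometric(q) for comparison
print((-(1 - q) * np.log(1 - q) - q * np.log(q)) / q)
```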

]]>Entropy doi: 10.3390/e20060395

Authors: Rongrong Tian Yanbin Tang

This paper considers the existence and uniqueness of stochastic entropy solution for a nonlinear transport equation with a stochastic perturbation. The uniqueness is based on the doubling variable method. For the existence, we develop a new scheme of parabolic approximation motivated by the method of vanishing viscosity given by Feng and Nualart (J. Funct. Anal. 2008, 255, 313–373). Furthermore, we prove the continuous dependence of stochastic strong entropy solutions on the coefficient b and the nonlinear function f.

]]>Entropy doi: 10.3390/e20060396

Authors: Sihan Xiong Yiwei Fu Asok Ray

This paper presents a nonparametric regression model of categorical time series in the setting of conditional tensor factorization and Bayes network. The underlying algorithms are developed to provide a flexible and parsimonious representation for fusion of correlated information from heterogeneous sources, which can be used to improve the performance of prediction tasks and infer the causal relationship between key variables. The proposed method is first illustrated by numerical simulation and then validated with two real-world datasets: (1) experimental data, collected from a swirl-stabilized lean-premixed laboratory-scale combustor, for detection of thermoacoustic instabilities and (2) publicly available economics data for causal inference-making.

]]>Entropy doi: 10.3390/e20060394

Authors: Masanari Asano Irina Basieva Emmanuel M. Pothos Andrei Khrennikov

In the formalism of quantum theory, the state of a system is represented by a density operator. Mathematically, a density operator can be decomposed into a weighted sum of (projection) operators representing an ensemble of pure states (a state distribution), but such a decomposition is not unique. Various pure-state distributions are mathematically described by the same density operator. These distributions are categorized into classical ones, obtained from the Schatten decomposition, and other, non-classical, ones. In this paper, we define a quantity called the state entropy. It can be considered a generalization of the von Neumann entropy evaluating the diversity of states constituting a distribution. Further, we apply the state entropy to the analysis of non-classical states created at the intermediate stages of the process of quantum measurement. To do this, we employ the model of differentiation, where a system experiences step-by-step state transitions under the influence of environmental factors. This approach can be used for modeling various natural and mental phenomena: cell differentiation, the evolution of biological populations, and decision making.

]]>Entropy doi: 10.3390/e20060393

Authors: Diogo Pratas Raquel M. Silva Armando J. Pinho

An efficient DNA compressor furnishes an approximation to measure and compare the information quantities present in, between and across DNA sequences, regardless of the characteristics of the sources. In this paper, we directly compare two information measures, the Normalized Compression Distance (NCD) and the Normalized Relative Compression (NRC). These measures answer different questions: the NCD measures how similar both strings are (in terms of information content), and the NRC (which, in general, is nonsymmetric) indicates the fraction of one of them that cannot be constructed using information from the other. This leads to the problem of finding out which measure (or question) is more suitable for the answer we need. For computing both, we use a state-of-the-art DNA sequence compressor that we benchmark against some top compressors in different compression modes. Then, we apply the compressor to DNA sequences of different scales and natures, first using synthetic sequences and then real DNA sequences. The latter include mitochondrial DNA (mtDNA), messenger RNA (mRNA) and genomic DNA (gDNA) of seven primates. We provide several insights into evolutionary acceleration rates at different scales, namely the observation and confirmation, across whole genomes, of a higher variation rate of the mtDNA relative to the gDNA. We also show the importance of relative compression for localizing similar information regions using mtDNA.
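For orientation, the NCD has the standard form sketched below. zlib serves as a stand-in for the specialized DNA compressor benchmarked in the paper, and the NRC line uses a common approximation of relative compression, C(xy) - C(y), normalized by the 2 bits per base of raw DNA, rather than the authors' exact conditional compressor.

```python
import zlib

def C(s: bytes) -> int:
    """Compressed size in bits (zlib as a stand-in general compressor)."""
    return 8 * len(zlib.compress(s, 9))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized Compression Distance: symmetric, near 0 for similar
    strings and near 1 for unrelated ones."""
    cx, cy, cxy = C(x), C(y), C(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

def nrc(x: bytes, y: bytes) -> float:
    """Approximate Normalized Relative Compression (nonsymmetric):
    fraction of x not describable from y; 2*len(x) bits is the cost of
    storing x raw over the 4-symbol DNA alphabet."""
    return max(C(x + y) - C(y), 0) / (2 * len(x))

seq1 = b"ACGT" * 1000
seq2 = b"ACGT" * 900 + b"TTTT" * 100
print(ncd(seq1, seq2), nrc(seq1, seq2), nrc(seq2, seq1))
```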

]]>Entropy doi: 10.3390/e20060392

Authors: Andrea Puglisi Alessandro Sarracino Angelo Vulpiani

A challenging frontier in modern statistical physics is concerned with systems with a small number of degrees of freedom, far from the thermodynamic limit.[...]

]]>Entropy doi: 10.3390/e20050391

Authors: Luis Herrera

The fact that real dissipative (entropy producing) processes may be detected by non-comoving (tilted) observers, in systems that appear to be isentropic to comoving observers in general relativity, is explained in terms of information theory, in analogy with the explanation of Maxwell&rsquo;s demon paradox.

]]>