Entropy
http://www.mdpi.com/journal/entropy
Latest open access articles published in Entropy at http://www.mdpi.com/journal/entropy
<![CDATA[Entropy, Vol. 18, Pages 343: Fuzzy Shannon Entropy: A Hybrid GIS-Based Landslide Susceptibility Mapping Method]]>
http://www.mdpi.com/1099-4300/18/10/343
Assessing Landslide Susceptibility Mapping (LSM) contributes to reducing the risk of living with landslides. Handling the vagueness associated with LSM is a challenging task. Here we show the application of a hybrid GIS-based LSM. The hybrid approach combines fuzzy membership functions (FMFs) with Shannon entropy, a well-known information theory-based method. Nine landslide-related criteria, along with an inventory of 108 recent and historic landslide points, are used to prepare a susceptibility map. A random split into training (≈70%) and testing (≈30%) samples is used for training and validation of the LSM model. The study area, Izeh, is located in the Khuzestan province of Iran, a highly landslide-susceptible zone. The performance of the hybrid method is evaluated using receiver operating characteristic (ROC) curves together with the area under the curve (AUC). With an AUC of 0.934, the proposed hybrid method outperforms a previous study that applied an extended fuzzy multi-criteria evaluation, built on decision makers' subjective judgments, to the same dataset and study area (AUC = 0.894).
Entropy 2016, 18(10), 343 (Article); doi: 10.3390/e18100343. Published 27 September 2016. Authors: Majid Shadman Roodposhti, Jagannath Aryal, Himan Shahabi, Taher Safarrad.
<![CDATA[Entropy, Vol. 18, Pages 349: Recognition of Abnormal Uptake through 123I-mIBG Scintigraphy Entropy for Paediatric Neuroblastoma Identification]]>
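The landslide-susceptibility abstract above pairs fuzzy membership functions with Shannon entropy. The paper does not spell out its weighting scheme here, but a common Shannon entropy-weight construction for multi-criteria data looks like the following minimal sketch (function name, data and all values are illustrative, not from the paper):

```python
import numpy as np

def entropy_weights(X):
    """Objective criterion weights from Shannon entropy.

    X: (n_samples, n_criteria) matrix of non-negative criterion scores.
    Criteria whose values vary more across samples carry more information
    and therefore receive larger weights.
    """
    n, _ = X.shape
    P = X / X.sum(axis=0)                      # normalize each column to a distribution
    with np.errstate(divide="ignore", invalid="ignore"):
        logs = np.where(P > 0, np.log(P), 0.0)
    H = -(P * logs).sum(axis=0) / np.log(n)    # entropy per criterion, scaled to [0, 1]
    d = 1.0 - H                                # degree of diversification
    return d / d.sum()                         # weights sum to 1

# Toy example: 4 samples, 3 hypothetical criteria; the third is constant
# and so carries no information (weight ~ 0).
scores = np.array([[0.2, 0.9, 0.5],
                   [0.4, 0.9, 0.5],
                   [0.6, 0.8, 0.5],
                   [0.8, 0.9, 0.5]])
w = entropy_weights(scores)
```

A constant criterion column yields maximum entropy and hence zero weight, which is the intended behavior of this scheme.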
http://www.mdpi.com/1099-4300/18/10/349
Whole-body 123I-Metaiodobenzylguanidine (mIBG) scintigraphy is used as the primary imaging modality to visualize neuroblastoma tumours and metastases because it is the most sensitive and specific radioactive tracer for staging the disease and evaluating the response to treatment. However, especially in paediatric neuroblastoma, information from mIBG scans is difficult to extract because of acquisition difficulties that produce low-definition images with poor contours, resolution and contrast. These problems limit physician assessment. Current oncological guidelines are based on qualitative, observer-dependent analysis, which makes it difficult to compare results taken at different moments of therapy or in different institutions. In this paper, we present a computerized method that processes an image and calculates a quantitative measurement, regarded as its entropy, suitable for identifying abnormal uptake regions for which there is sufficient suspicion of a tumour or metastatic site. This measurement can also be compared with future scintigraphies of the same patient. The procedure was tested on 46 scintigraphies of 22 anonymous patients; it identified 96.7% of regions of abnormal uptake and showed a low overall false negative rate of 3.3%. This method assists physicians in diagnosing tumours and also allows the monitoring of patients' evolution.
Entropy 2016, 18(10), 349 (Article); doi: 10.3390/e18100349. Published 27 September 2016. Authors: Milagros Martínez-Díaz, Rafael Martínez-Díaz, Luis Sánchez-Ruiz, Guillermo Peris-Fajarnés.
<![CDATA[Entropy, Vol. 18, Pages 345: A Novel Operational Matrix of Caputo Fractional Derivatives of Fibonacci Polynomials: Spectral Solutions of Fractional Differential Equations]]>
http://www.mdpi.com/1099-4300/18/10/345
Herein, two numerical algorithms for solving some linear and nonlinear fractional-order differential equations are presented and analyzed. For this purpose, a novel operational matrix of fractional-order derivatives of Fibonacci polynomials was constructed and employed along with the application of the tau and collocation spectral methods. The convergence and error analysis of the suggested Fibonacci expansion were carefully investigated. Some numerical examples with comparisons are presented to demonstrate the efficiency, applicability and high accuracy of the proposed algorithms. The result is two accurate semi-analytic polynomial solutions for linear and nonlinear fractional differential equations.
Entropy 2016, 18(10), 345 (Article); doi: 10.3390/e18100345. Published 23 September 2016. Authors: Waleed Abd-Elhameed, Youssri Youssri.
<![CDATA[Entropy, Vol. 18, Pages 344: The Analytical Solution of Parabolic Volterra Integro-Differential Equations in the Infinite Domain]]>
http://www.mdpi.com/1099-4300/18/10/344
This article focuses on obtaining analytical solutions for d-dimensional, parabolic Volterra integro-differential equations with different types of frictional memory kernel. Based on Laplace transform and Fourier transform theories, the properties of the Fox-H function and convolution theorem, analytical solutions for the equations in the infinite domain are derived under three frictional memory kernel functions. The analytical solutions are expressed by infinite series, the generalized multi-parameter Mittag-Leffler function, the Fox-H function and the convolution form of the Fourier transform. In addition, graphical representations of the analytical solution under different parameters are given for one-dimensional parabolic Volterra integro-differential equations with a power-law memory kernel. It can be seen that the solution curves are subject to Gaussian decay at any given moment.
Entropy 2016, 18(10), 344 (Article); doi: 10.3390/e18100344. Published 23 September 2016. Authors: Yun Zhao, Fengqun Zhao.
<![CDATA[Entropy, Vol. 18, Pages 347: Boltzmann Complexity: An Emergent Property of the Majorization Partial Order]]>
http://www.mdpi.com/1099-4300/18/10/347
Boltzmann macrostates, which are in 1:1 correspondence with the partitions of integers, are investigated. Integer partitions, unlike entropy, uniquely characterize Boltzmann states, but their use has been limited. Integer partitions are well known to be partially ordered by majorization. It is less well known that this partial order is fundamentally equivalent to the "mixedness" of the set of microstates that comprise each macrostate. Thus, integer partitions represent the fundamental property of the mixing character of Boltzmann states. Applying the standard definition of incomparability in partial orders, we calculate for each partition (or macrostate) the number C of other macrostates with which it is incomparable. We show that the value of C complements the value of the Boltzmann entropy, S, obtained in the usual way. Results for C and S are obtained for Boltzmann states composed of up to N = 50 microstates, for which there are 204,226 Boltzmann macrostates. We note that, unlike mixedness, neither C nor S uniquely characterizes macrostates. Plots of C vs. S are shown. The results are surprising and support the authors' earlier suggestion that C be regarded as the complexity of the Boltzmann states. From this we propose that complexity may generally arise from incomparability in other systems as well.
Entropy 2016, 18(10), 347 (Article); doi: 10.3390/e18100347. Published 23 September 2016. Authors: William Seitz, A. Kirwan.
<![CDATA[Entropy, Vol. 18, Pages 346: Entropy for the Quantized Field in the Atom-Field Interaction: Initial Thermal Distribution]]>
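The Boltzmann-complexity abstract above counts, for each integer partition (macrostate), the number C of partitions incomparable to it under majorization. A minimal sketch of that computation follows; the `boltzmann_entropy` convention shown is a standard ln W choice and an assumption on my part, not necessarily the paper's exact normalization:

```python
from itertools import combinations
from math import factorial, log

def partitions(n, max_part=None):
    """Generate integer partitions of n as non-increasing tuples."""
    if max_part is None:
        max_part = n
    if n == 0:
        yield ()
        return
    for first in range(min(n, max_part), 0, -1):
        for rest in partitions(n - first, first):
            yield (first,) + rest

def majorizes(p, q):
    """True if partition p majorizes q (partial sums of p dominate those of q)."""
    s_p = s_q = 0
    for i in range(max(len(p), len(q))):
        s_p += p[i] if i < len(p) else 0
        s_q += q[i] if i < len(q) else 0
        if s_p < s_q:
            return False
    return True

def incomparability_counts(n):
    """For each partition of n, count the partitions incomparable to it."""
    parts = list(partitions(n))
    C = {p: 0 for p in parts}
    for p, q in combinations(parts, 2):
        if not majorizes(p, q) and not majorizes(q, p):
            C[p] += 1
            C[q] += 1
    return C

def boltzmann_entropy(part, n):
    """S = ln W with W = n! / prod(part_i!) microstates (a common convention)."""
    w = factorial(n)
    for part_i in part:
        w //= factorial(part_i)
    return log(w)

parts6 = list(partitions(6))      # p(6) = 11 partitions
C6 = incomparability_counts(6)    # first incomparable pairs appear at n = 6
```

For n ≤ 5 the majorization order is a total order (every C is zero); n = 6 gives the first incomparable pair, e.g. (4,1,1) and (3,3).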
http://www.mdpi.com/1099-4300/18/10/346
We study the entropy of a quantized field in interaction with a two-level atom (in a pure state) when the field is initially in a mixture of two number states. We then generalise the result for a thermal state; i.e., an (infinite) statistical mixture of number states. We show that for some specific interaction times, the atom passes its purity to the field and therefore the field entropy decreases from its initial value.
Entropy 2016, 18(10), 346 (Article); doi: 10.3390/e18100346. Published 23 September 2016. Authors: Luis Andrade-Morales, Braulio Villegas-Martínez, Hector Moya-Cessa.
<![CDATA[Entropy, Vol. 18, Pages 340: Application of Information Theory for an Entropic Gradient of Ecological Sites]]>
http://www.mdpi.com/1099-4300/18/10/340
The present study was carried out to compute straightforward formulations of information entropy for ecological sites and to arrange their locations along the ordination axes using the values of those entropic measures. Data on plant communities from six sites in the Dedegül Mountain sub-district and the Sultan Mountain sub-district, located in the Beyşehir Watershed, were examined. First, entropic measures (i.e., marginal entropy, joint entropy, conditional entropy and mutual entropy) were computed for each of the sites. Next, principal component analysis (PCA) was applied to the data composed of the values of those entropic measures. The resulting arrangement of the sites along the first component axis of the PCA was found meaningful from an ecological point of view, because it illustrated the climatic differences between the sub-districts.
Entropy 2016, 18(10), 340 (Article); doi: 10.3390/e18100340. Published 22 September 2016. Author: Kürşad Özkan.
<![CDATA[Entropy, Vol. 18, Pages 342: The Differential Entropy of the Joint Distribution of Eigenvalues of Random Density Matrices]]>
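The ecological-sites study above computes marginal, joint, conditional and mutual entropies before applying PCA. Assuming the entropies are taken from a two-way abundance table (an assumption for illustration; the paper's exact inputs may differ), the four measures can be sketched as:

```python
import numpy as np

def entropic_measures(table):
    """Marginal, joint, conditional and mutual entropies (nats) from a
    two-way abundance table (e.g., species x plot counts for one site)."""
    p = table / table.sum()           # joint distribution
    px = p.sum(axis=1)                # row marginal
    py = p.sum(axis=0)                # column marginal

    def H(q):
        q = q[q > 0]
        return float(-(q * np.log(q)).sum())

    h_x, h_y, h_xy = H(px), H(py), H(p.ravel())
    return {"H(X)": h_x, "H(Y)": h_y, "H(X,Y)": h_xy,
            "H(Y|X)": h_xy - h_x,            # chain rule
            "I(X;Y)": h_x + h_y - h_xy}      # mutual information

# Toy example: a uniform 2x2 table, i.e., independent uniform marginals.
m = entropic_measures(np.ones((2, 2)))
```

For the uniform table the identities H(X,Y) = ln 4, H(Y|X) = ln 2 and I(X;Y) = 0 hold exactly, which is a quick sanity check on the chain-rule relations.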
http://www.mdpi.com/1099-4300/18/9/342
We derive exactly the differential entropy of the joint distribution of eigenvalues of Wishart matrices. Based on this result, we calculate the differential entropy of the joint distribution of eigenvalues of random mixed quantum states, which is induced by taking the partial trace over the environment of Haar-distributed bipartite pure states. Then, we investigate the differential entropy of the joint distribution of diagonal entries of random mixed quantum states. Finally, we investigate the relative entropy between these two kinds of distributions.
Entropy 2016, 18(9), 342 (Article); doi: 10.3390/e18090342. Published 21 September 2016. Authors: Laizhen Luo, Jiamei Wang, Lin Zhang, Shifang Zhang.
<![CDATA[Entropy, Vol. 18, Pages 341: Effects of Fatty Infiltration of the Liver on the Shannon Entropy of Ultrasound Backscattered Signals]]>
http://www.mdpi.com/1099-4300/18/9/341
This study explored the effects of fatty infiltration on the signal uncertainty of ultrasound backscattered echoes from the liver. Standard ultrasound examinations were performed on 107 volunteers. For each participant, raw ultrasound image data of the right lobe of liver were acquired using a clinical scanner equipped with a 3.5-MHz convex transducer. An algorithmic scheme was proposed for ultrasound B-mode and entropy imaging. Fatty liver stage was evaluated using a sonographic scoring system. Entropy values constructed using the ultrasound radiofrequency (RF) and uncompressed envelope signals (denoted by HR and HE, respectively) as a function of fatty liver stage were analyzed using the Pearson correlation coefficient. Data were expressed as the median and interquartile range (IQR). Receiver operating characteristic (ROC) curve analysis with 95% confidence intervals (CIs) was performed to obtain the area under the ROC curve (AUC). The brightness of the entropy image typically increased as the fatty stage varied from mild to severe. The median value of HR monotonically increased from 4.69 (IQR: 4.60–4.79) to 4.90 (IQR: 4.87–4.92) as the severity of fatty liver increased (r = 0.63, p < 0.0001). Concurrently, the median value of HE increased from 4.80 (IQR: 4.69–4.89) to 5.05 (IQR: 5.02–5.07) (r = 0.69, p < 0.0001). In particular, the AUCs obtained using HE (95% CI) were 0.93 (0.87–0.99), 0.88 (0.82–0.94), and 0.76 (0.65–0.87) for fatty stages ≥mild, ≥moderate, and ≥severe, respectively. The sensitivity, specificity, and accuracy were 93.33%, 83.11%, and 86.00%, respectively (≥mild). Fatty infiltration increases the uncertainty of backscattered signals from livers. Ultrasound entropy imaging has potential for the routine examination of fatty liver disease.
Entropy 2016, 18(9), 341 (Article); doi: 10.3390/e18090341. Published 21 September 2016. Authors: Po-Hsiang Tsui, Yung-Liang Wan.
<![CDATA[Entropy, Vol. 18, Pages 339: Quantum Computation and Information: Multi-Particle Aspects]]>
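The fatty-liver study above builds entropy images from ultrasound backscattered envelope signals. A minimal sketch of windowed Shannon-entropy imaging follows; the window size, bin count and non-overlapping stride are illustrative assumptions, not the paper's settings:

```python
import numpy as np

def shannon_entropy(samples, bins=64):
    """Shannon entropy (bits) of a signal's amplitude histogram."""
    hist, _ = np.histogram(samples, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def entropy_image(envelope, win=32):
    """Map an envelope image to local entropy values using a sliding
    square window (stride = window size, for simplicity)."""
    rows, cols = envelope.shape
    out = np.zeros((rows // win, cols // win))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            block = envelope[i * win:(i + 1) * win, j * win:(j + 1) * win]
            out[i, j] = shannon_entropy(block.ravel())
    return out

# Demo on synthetic data: a uniform-noise image as a stand-in for an
# uncompressed envelope image (purely illustrative, not clinical data).
rng = np.random.default_rng(0)
env = rng.uniform(size=(64, 64))
ent_map = entropy_image(env, win=32)
```

A uniform-noise window gives entropy near log2(bins), while a constant region gives zero, which is the brightness contrast the entropy image exploits.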
http://www.mdpi.com/1099-4300/18/9/339
This editorial explains the scope of the special issue and provides a thematic introduction to the contributed papers.
Entropy 2016, 18(9), 339 (Editorial); doi: 10.3390/e18090339. Published 20 September 2016. Authors: Demosthenes Ellinas, Giorgio Kaniadakis, Jiannis Pachos, Antonio Scarfone.
<![CDATA[Entropy, Vol. 18, Pages 338: The Constant Information Radar]]>
http://www.mdpi.com/1099-4300/18/9/338
The constant information radar, or CIR, is a tracking radar that modulates target revisit time by maintaining a fixed mutual information measure. For highly dynamic targets that deviate significantly from the path predicted by the tracking motion model, the CIR adjusts by illuminating the target more frequently than it would for well-modeled targets. If SNR is low, the radar delays revisit to the target until the state entropy overcomes noise uncertainty. As a result, we show that the information measure is highly dependent on target entropy and target measurement covariance. A constant information measure maintains a fixed spectral efficiency to support the RF convergence of radar and communications. The result is a radar implementing a novel target scheduling algorithm based on information instead of heuristic or ad hoc methods. The CIR mathematically ensures that spectral use is justified.
Entropy 2016, 18(9), 338 (Article); doi: 10.3390/e18090338. Published 19 September 2016. Authors: Bryan Paul, Daniel Bliss.
<![CDATA[Entropy, Vol. 18, Pages 337: Entropy Minimizing Curves with Application to Flight Path Design and Clustering]]>
http://www.mdpi.com/1099-4300/18/9/337
Air traffic management (ATM) aims at providing companies with a safe and ideally optimal aircraft trajectory planning. Air traffic controllers act on flight paths in such a way that no pair of aircraft come closer than the regulatory separation norms. With the increase of traffic, it is expected that the system will reach its limits in the near future: a paradigm change in ATM is planned with the introduction of trajectory-based operations. In this context, sets of well-separated flight paths are computed in advance, tremendously reducing the number of unsafe situations that must be dealt with by controllers. Unfortunately, automated tools used to generate such planning generally issue trajectories not complying with operational practices or even flight dynamics. In this paper, a means of producing realistic air routes from the output of an automated trajectory design tool is investigated. For that purpose, the entropy of a system of curves is first defined, and a means of iteratively minimizing it is presented. The resulting curves form a route network that is suitable for use in a semi-automated ATM system with a human in the loop. The tool introduced in this work is quite versatile and may also be applied to unsupervised classification of curves: an example is given for French traffic.
Entropy 2016, 18(9), 337 (Article); doi: 10.3390/e18090337. Published 15 September 2016. Authors: Stéphane Puechmorel, Florence Nicol.
<![CDATA[Entropy, Vol. 18, Pages 336: Combined Forecasting of Streamflow Based on Cross Entropy]]>
http://www.mdpi.com/1099-4300/18/9/336
In this study, we developed a model of combined streamflow forecasting based on cross entropy to solve the problems of streamflow complexity and random hydrological processes. First, we analyzed the streamflow data obtained from Wudaogou station on the Huifa River, which is the second tributary of the Songhua River, and found that the streamflow was characterized by fluctuations and periodicity, and it was closely related to rainfall. The proposed method involves selecting similar years based on the gray correlation degree. The forecasting results obtained by the time series model (autoregressive integrated moving average), improved grey forecasting model, and artificial neural network model (a radial basis function) were used as a single forecasting model, and from the viewpoint of the probability density, the method for determining weights was improved by using the cross entropy model. The numerical results showed that compared with the single forecasting model, the combined forecasting model improved the stability of the forecasting model, and the prediction accuracy was better than that of conventional combined forecasting models.
Entropy 2016, 18(9), 336 (Article); doi: 10.3390/e18090336. Published 15 September 2016. Authors: Baohui Men, Rishang Long, Jianhua Zhang.
<![CDATA[Entropy, Vol. 18, Pages 335: Enhanced Energy Distribution for Quantum Information Heat Engines]]>
http://www.mdpi.com/1099-4300/18/9/335
A new scenario for energy distribution, security and shareability is presented that assumes the availability of quantum information heat engines and a thermal bath. It is based on the convertibility between entropy and work in the presence of a thermal reservoir. Our approach to the informational content of physical systems that are distributed between users is complementary to the conventional perspective of quantum communication. The latter places the value on the unpredictable content of the transmitted quantum states, while our interest focuses on their certainty. Some well-known results in quantum communication are reused in this context. Particularly, we describe a way to securely distribute quantum states to be used for unlocking energy from thermal sources. We also consider some multi-partite entangled and classically correlated states for a collaborative multi-user sharing of work extraction possibilities. In addition, the relation between the communication and work extraction capabilities is analyzed and written as an equation.
Entropy 2016, 18(9), 335 (Article); doi: 10.3390/e18090335. Published 14 September 2016. Authors: Jose Diaz de la Cruz, Miguel Martin-Delgado.
<![CDATA[Entropy, Vol. 18, Pages 332: Study on the Inherent Complex Features and Chaos Control of IS–LM Fractional-Order Systems]]>
http://www.mdpi.com/1099-4300/18/9/332
The traditional IS–LM economic theory describes the relationship between interest rates and output in the goods-and-services and money markets in macroeconomics. Based on it, we established a four-dimensional IS–LM model involving four variables and, using Caputo fractional calculus theory, extended it to a fractional-order nonlinear model, whose complexity and stability we analyzed. The existence conditions of attractors under different orders are compared, and the orders at which the system reaches a stable state are obtained. Dynamic phenomena such as strange attractors and sensitivity to initial values are analyzed in detail through phase diagrams and power spectra. The order changes in two ways: all orders change synchronously, or a single order changes. The results show that in either case the economic system passes through multiple states, such as strong divergence, a strange attractor and convergence, and finally enters a stable state below a certain order; parameter changes have similar effects on the economic system. Therefore, selecting an appropriate order is significant for an economic system, as it guarantees steady development. Furthermore, this paper constructs chaos control for the IS–LM fractional-order macroeconomic model by means of a linear feedback control method; by calculating and adjusting the feedback coefficient, we return the system to the convergent state.
Entropy 2016, 18(9), 332 (Article); doi: 10.3390/e18090332. Published 14 September 2016. Authors: Junhai Ma, Wenbo Ren, Xueli Zhan.
<![CDATA[Entropy, Vol. 18, Pages 261: Fuzzy Adaptive Repetitive Control for Periodic Disturbance with Its Application to High Performance Permanent Magnet Synchronous Motor Speed Servo Systems]]>
http://www.mdpi.com/1099-4300/18/9/261
For reducing the steady-state speed ripple, especially in high-performance speed servo system applications, steady-state precision is increasingly important for real servo systems. This paper investigates the periodic-disturbance problem underlying steady-state speed ripple for a permanent magnet synchronous motor (PMSM) servo system; a fuzzy adaptive repetitive controller is designed in the speed loop, based on repetitive control and fuzzy information theory, to reduce periodic disturbance. First, the various sources of the PMSM speed ripple problem are described and analyzed. Then, the mathematical model of the PMSM is given. Subsequently, a fuzzy adaptive repetitive controller based on repetitive control and fuzzy logic control is designed for the PMSM speed servo system, and the system stability analysis is also deduced. Finally, the simulation and experimental implementations are based on MATLAB/Simulink and a Texas Instruments TMS320F2808 DSP (digital signal processor) hardware platform, respectively. Compared to a proportional-integral (PI) controller, simulation and experimental results show that the proposed fuzzy adaptive repetitive controller has better periodic-disturbance rejection and higher steady-state precision.
Entropy 2016, 18(9), 261 (Article); doi: 10.3390/e18090261. Published 14 September 2016. Author: Junxiao Wang.
<![CDATA[Entropy, Vol. 18, Pages 327: Sparse Trajectory Prediction Based on Multiple Entropy Measures]]>
http://www.mdpi.com/1099-4300/18/9/327
Trajectory prediction is an important problem with a large number of applications. A common approach to trajectory prediction is based on historical trajectories. However, existing techniques suffer from the "data sparsity problem": the available historical trajectories are far from enough to cover all possible query trajectories. We propose the sparse trajectory prediction algorithm based on multiple entropy measures (STP-ME) to address the data sparsity problem. First, the moving region is iteratively divided into a two-dimensional plane grid graph, and each trajectory is represented as a grid sequence with temporal information. Second, trajectory entropy is used to evaluate a trajectory's regularity, the L–Z entropy estimator is implemented to calculate trajectory entropy, and a new trajectory space is generated through trajectory synthesis. We define location entropy and time entropy to measure the popularity of locations and timeslots, respectively. Finally, a second-order Markov model that contains a temporal dimension is adopted to perform sparse trajectory prediction. The experiments show that as the trip-completed percentage increases towards 90%, the coverage of the baseline algorithm decreases to almost 25%, while the STP-ME algorithm copes with this as expected, with only an unnoticeable drop in coverage, and can constantly answer almost 100% of query trajectories. The STP-ME algorithm improves the prediction accuracy by as much as 8%, 3%, and 4% compared to the baseline algorithm, the second-order Markov model (2-MM), and the sub-trajectory synthesis (SubSyn) algorithm, respectively. At the same time, the prediction time of the STP-ME algorithm is negligible (10 μs), greatly outperforming the baseline algorithm (100 ms).
Entropy 2016, 18(9), 327 (Article); doi: 10.3390/e18090327. Published 14 September 2016. Authors: Lei Zhang, Leijun Liu, Zhanguo Xia, Wen Li, Qingfu Fan.
<![CDATA[Entropy, Vol. 18, Pages 334: Special Issue on Entropy-Based Applied Cryptography and Enhanced Security for Ubiquitous Computing]]>
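The STP-ME abstract above ends with a second-order Markov model over grid-cell sequences. A minimal sketch of that final prediction step follows (the grid symbolization, temporal dimension, and entropy-based synthesis stages are omitted; cell labels are illustrative):

```python
from collections import defaultdict

def train_second_order(trajectories):
    """Count transitions (cell[t-2], cell[t-1]) -> cell[t] over grid-cell sequences."""
    counts = defaultdict(lambda: defaultdict(int))
    for traj in trajectories:
        for a, b, c in zip(traj, traj[1:], traj[2:]):
            counts[(a, b)][c] += 1
    return counts

def predict_next(counts, prev2, prev1):
    """Most likely next cell given the two preceding cells, or None if unseen."""
    options = counts.get((prev2, prev1))
    if not options:
        return None
    return max(options, key=options.get)

# Toy training set of grid-cell sequences.
trips = [["A", "B", "C", "D"],
         ["A", "B", "C", "E"],
         ["A", "B", "C", "E"]]
model = train_second_order(trips)
```

Unseen state pairs return None, which is exactly the coverage failure mode the abstract's trajectory-synthesis step is designed to mitigate.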
http://www.mdpi.com/1099-4300/18/9/334
Entropy is a basic and important concept in information theory. It is also often used as a measure of the unpredictability of a cryptographic key in cryptography research. Ubiquitous computing (Ubi-comp) has emerged rapidly as an exciting new paradigm. For this special issue, we mainly selected and discussed papers related to core theories based on graph theory for solving computational problems in cryptography and security, as well as practical technologies, applications and services for Ubi-comp, including secure encryption techniques; identity and authentication; credential cloning attacks and countermeasures; a switching generator with resistance against algebraic and side-channel attacks; entropy-based network anomaly detection; applied cryptography using chaos functions; information hiding and watermarking; secret sharing; message authentication; detection and modeling of cyber attacks with Petri nets; and quantum flows for secret key distribution.
Entropy 2016, 18(9), 334 (Editorial); doi: 10.3390/e18090334. Published 13 September 2016. Authors: James Park, Wanlei Zhou.
<![CDATA[Entropy, Vol. 18, Pages 333: Design of Light-Weight High-Entropy Alloys]]>
http://www.mdpi.com/1099-4300/18/9/333
High-entropy alloys (HEAs) are a new class of solid-solution alloys that have attracted worldwide attention for their outstanding properties. Owing to the demand from transportation and defense industries, light-weight HEAs have also garnered widespread interest from scientists for use as potential structural materials. Great efforts have been made to study the phase-formation rules of HEAs to accelerate and refine the discovery process. In this paper, many proposed solid-solution phase-formation rules are assessed, based on a series of known and newly-designed light-weight HEAs. The results indicate that these empirical rules work for most compositions but also fail for several alloys. Light-weight HEAs often involve the additions of Al and/or Ti in great amounts, resulting in large negative enthalpies for forming solid-solution phases and/or intermetallic compounds. Accordingly, these empirical rules need to be modified with the new experimental data. In contrast, the CALPHAD (calculation of phase diagrams) method is demonstrated to be an effective approach to predict the phase formation in HEAs as a function of composition and temperature. Future perspectives on the design of light-weight HEAs are discussed in light of CALPHAD modeling and physical metallurgy principles.
Entropy 2016, 18(9), 333 (Article); doi: 10.3390/e18090333. Published 13 September 2016. Authors: Rui Feng, Michael Gao, Chanho Lee, Michael Mathes, Tingting Zuo, Shuying Chen, Jeffrey Hawk, Yong Zhang, Peter Liaw.
<![CDATA[Entropy, Vol. 18, Pages 330: Short Term Electrical Load Forecasting Using Mutual Information Based Feature Selection with Generalized Minimum-Redundancy and Maximum-Relevance Criteria]]>
http://www.mdpi.com/1099-4300/18/9/330
A feature selection method based on the generalized minimum redundancy and maximum relevance (G-mRMR) is proposed to improve the accuracy of short-term load forecasting (STLF). First, mutual information is calculated to analyze the relations between the original features and the load sequence, as well as the redundancy among the original features. Second, a weighting factor selected by statistical experiments is used to balance the relevance and redundancy of features when using the G-mRMR. Third, each feature is ranked in a descending order according to its relevance and redundancy as computed by G-mRMR. A sequential forward selection method is utilized for choosing the optimal subset. Finally, a STLF predictor is constructed based on random forest with the obtained optimal subset. The effectiveness and improvement of the proposed method was tested with actual load data.
Entropy 2016, 18(9), 330 (Article); doi: 10.3390/e18090330. Published 8 September 2016. Authors: Nantian Huang, Zhiqiang Hu, Guowei Cai, Dongfeng Yang.
<![CDATA[Entropy, Vol. 18, Pages 329: Solution of Higher Order Nonlinear Time-Fractional Reaction Diffusion Equation]]>
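The G-mRMR abstract above ranks features by mutual-information relevance to the target minus weighted redundancy against already-selected features. A toy greedy sketch follows; the discrete plug-in MI estimator and the scoring form are simplified assumptions, not the paper's exact criterion:

```python
import numpy as np

def mutual_information(x, y):
    """Plug-in MI (nats) between two discrete sequences via their joint histogram."""
    xi = np.unique(x, return_inverse=True)[1]
    yi = np.unique(y, return_inverse=True)[1]
    joint = np.zeros((xi.max() + 1, yi.max() + 1))
    np.add.at(joint, (xi, yi), 1)
    p = joint / joint.sum()
    px = p.sum(axis=1, keepdims=True)
    py = p.sum(axis=0, keepdims=True)
    mask = p > 0
    return float((p[mask] * np.log(p[mask] / (px @ py)[mask])).sum())

def g_mrmr_rank(features, target, weight=1.0):
    """Greedy ranking: relevance to target minus weighted mean redundancy
    against the features selected so far."""
    remaining = list(range(features.shape[1]))
    selected = []
    while remaining:
        def score(f):
            rel = mutual_information(features[:, f], target)
            if not selected:
                return rel
            red = np.mean([mutual_information(features[:, f], features[:, s])
                           for s in selected])
            return rel - weight * red
        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy data: features 0 and 1 are exact copies of the target (maximally
# relevant but mutually redundant); feature 2 is random noise.
rng = np.random.default_rng(2)
y = np.array([0, 1] * 50)
F = np.column_stack([y, y, rng.integers(0, 2, 100)])
order = g_mrmr_rank(F, y)
```

The first pick is one of the two target copies; after that, the redundancy penalty kicks in against the duplicate, which is the behavior the G-mRMR weighting factor is meant to tune.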
http://www.mdpi.com/1099-4300/18/9/329
The approximate analytical solution of fractional-order nonlinear reaction-diffusion equations with a given initial condition is obtained using the homotopy analysis method. As a demonstration of a good mathematical model, the present article gives graphical presentations of the effect of the reaction terms on the solution profile for various anomalous exponents of particular cases, to predict damping of the field variable. Numerical computations of the convergence control parameter, used to evaluate the convergence of the approximate series solution by minimizing error, are also presented graphically for these cases.
Entropy 2016, 18(9), 329 (Article); doi: 10.3390/e18090329. Published 8 September 2016. Authors: Neeraj Tripathi, Subir Das, Seng Ong, Hossein Jafari, Maysaa Al Qurashi.
<![CDATA[Entropy, Vol. 18, Pages 331: Network Entropies of the Chinese Financial Market]]>
http://www.mdpi.com/1099-4300/18/9/331
Based on the data from the Chinese financial market, this paper focuses on analyzing three types of network entropies of the financial market, namely, Shannon, Renyi and Tsallis entropies. The findings suggest that Shannon entropy can reflect the volatility of the financial market, that Renyi and Tsallis entropies also have this function when their parameter has a positive value, and that Renyi and Tsallis entropies can reflect the extreme case of the financial market when their parameter has a negative value.
Entropy 2016, 18(9), 331 (Article); doi: 10.3390/e18090331. Published 8 September 2016. Authors: Shouwei Li, Jianmin He, Kai Song.
<![CDATA[Entropy, Vol. 18, Pages 325: Paraconsistent Probabilities: Consistency, Contradictions and Bayes’ Theorem]]>
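The market-network study above compares Shannon, Renyi and Tsallis entropies. Computed from a normalized distribution p (for a network, e.g., its degree distribution, an illustrative choice here), the three measures can be sketched as follows; both generalized entropies recover Shannon in the limit q → 1:

```python
import numpy as np

def shannon(p):
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

def renyi(p, q):
    """Renyi entropy: log(sum p^q) / (1 - q), for q != 1."""
    p = p[p > 0]
    return float(np.log((p ** q).sum()) / (1.0 - q))

def tsallis(p, q):
    """Tsallis entropy: (1 - sum p^q) / (q - 1), for q != 1."""
    p = p[p > 0]
    return float((1.0 - (p ** q).sum()) / (q - 1.0))

def degree_distribution(adj):
    """Normalized degree distribution of an undirected adjacency matrix."""
    deg = adj.sum(axis=1)
    _, counts = np.unique(deg, return_counts=True)
    return counts / counts.sum()

# Sanity checks on a uniform distribution over 4 states.
p_uniform = np.full(4, 0.25)
# Degree distribution of a 3-node star graph: degrees (2, 1, 1).
p_deg = degree_distribution(np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]]))
```

On a uniform distribution, Renyi equals ln n for every q, while Tsallis does not, which is one concrete way the two families differ away from the Shannon limit.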
http://www.mdpi.com/1099-4300/18/9/325
This paper represents the first steps towards constructing a paraconsistent theory of probability based on the Logics of Formal Inconsistency (LFIs). We show that LFIs encode very naturally an extension of the notion of probability able to express sophisticated probabilistic reasoning under contradictions employing appropriate notions of conditional probability and paraconsistent updating, via a version of Bayes’ theorem for conditionalization. We argue that the dissimilarity between the notions of inconsistency and contradiction, one of the pillars of LFIs, plays a central role in our extended notion of probability. Some critical historical and conceptual points about probability theory are also reviewed.
Entropy 2016, 18(9), 325 (Article); doi: 10.3390/e18090325. Published 7 September 2016. Authors: Juliana Bueno-Soler, Walter Carnielli.
<![CDATA[Entropy, Vol. 18, Pages 328: Inferring Weighted Directed Association Networks from Multivariate Time Series with the Small-Shuffle Symbolic Transfer Entropy Spectrum Method]]>
http://www.mdpi.com/1099-4300/18/9/328
Complex network methodology is very useful for exploring complex systems. However, the relationships among variables in complex systems are usually not clear; therefore, inferring association networks among variables from their observed data has been a popular research topic. We propose a method, named the small-shuffle symbolic transfer entropy spectrum (SSSTES), for inferring association networks from multivariate time series. The method solves four problems in inferring association networks: strong correlation identification, correlation quantification, direction identification and temporal relation identification. The method can be divided into four layers. The first layer is the data layer, where data are input and processed. In the second layer, we symbolize the model data, original data and shuffled data from the previous layer and cyclically calculate transfer entropy with different time lags for each pair of time series variables. Third, from the previous layer's output, a list of transfer entropy matrices, we compose transfer entropy spectra for pairwise time series and identify the correlation level between variables. In the last layer, we build a weighted adjacency matrix, with the value of each entry representing the correlation level between pairwise variables, and then obtain the weighted directed association network. Three sets of numerically simulated data, from a linear system, a nonlinear system and a coupled Rössler system, are used to show how the proposed approach works. Finally, we apply SSSTES to a real industrial system and obtain a better result than with two other methods.
Entropy 2016, 18(9), 328 (Article); doi: 10.3390/e18090328. Published 7 September 2016. Authors: Yanzhu Hu, Huiyang Zhao, Xinbo Ai.
<![CDATA[Entropy, Vol. 18, Pages 319: Quantum Hysteresis in Coupled Light–Matter Systems]]>
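The SSSTES abstract above is built around transfer entropy between pairs of time series. A minimal plug-in estimator for symbol sequences with history length 1 follows; the symbolization scheme, small-shuffle surrogate step and the spectrum over time lags are omitted, and the demo data are synthetic:

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """TE (nats) from symbol sequence x to y with history length 1:
    TE = sum p(y1, y0, x0) * log[ p(y1 | y0, x0) / p(y1 | y0) ]."""
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))
    pairs_yx = Counter(zip(y[:-1], x[:-1]))   # (y0, x0) counts
    pairs_yy = Counter(zip(y[1:], y[:-1]))    # (y1, y0) counts
    singles_y = Counter(y[:-1])               # y0 counts
    n = len(y) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(y0, x0)]
        p_cond_hist = pairs_yy[(y1, y0)] / singles_y[y0]
        te += p_joint * np.log(p_cond_full / p_cond_hist)
    return te

# Demo: y is a one-step delayed copy of a random binary sequence x,
# so information flows x -> y but not y -> x.
rng = np.random.default_rng(1)
x = rng.integers(0, 2, 5000)
y = np.empty_like(x)
y[0] = 0
y[1:] = x[:-1]
te_xy = transfer_entropy(x, y)
te_yx = transfer_entropy(y, x)
```

For this driven pair, te_xy approaches ln 2 (one bit per step transferred) while te_yx stays near zero apart from small-sample bias, which is the asymmetry the method uses to assign edge directions.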
http://www.mdpi.com/1099-4300/18/9/319
We investigate the non-equilibrium quantum dynamics of a canonical light–matter system—namely, the Dicke model—when the light–matter interaction is ramped up and down through a cycle across the quantum phase transition. Our calculations reveal a rich set of dynamical behaviors determined by the cycle times, ranging from the slow, near adiabatic regime through to the fast, sudden quench regime. As the cycle time decreases, we uncover a crossover from an oscillatory exchange of quantum information between light and matter that approaches a reversible adiabatic process, to a dispersive regime that generates large values of light–matter entanglement. The phenomena uncovered in this work have implications in quantum control, quantum interferometry, as well as in quantum information theory.Entropy2016-09-07189Article10.3390/e180903193191099-43002016-09-07doi: 10.3390/e18090319Fernando Gómez-RuizOscar AcevedoLuis QuirogaFerney RodríguezNeil Johnson<![CDATA[Entropy, Vol. 18, Pages 324: Market Sentiments Distribution Law]]>
http://www.mdpi.com/1099-4300/18/9/324
The Stock Exchange is basically ruled by the extreme market sentiments of euphoria and fear. The type of sentiment is given by the color of the candlestick (white = bullish sentiment, black = bearish sentiment), while the intensity of the sentiment is given by its size. In this paper we show that the intensity of any sentiment is distributed in a strikingly robust, systematic and universal way, following a law of exponential decay; this conclusion is supported by the analysis of the Lyapunov exponent, the information entropy and the frequency distribution of candlestick size.Entropy2016-09-07189Article10.3390/e180903243241099-43002016-09-07doi: 10.3390/e18090324Jorge Reyes-Molina<![CDATA[Entropy, Vol. 18, Pages 326: Bayesian Dependence Tests for Continuous, Binary and Mixed Continuous-Binary Variables]]>
http://www.mdpi.com/1099-4300/18/9/326
Tests for dependence of continuous, discrete and mixed continuous-discrete variables are ubiquitous in science. The goal of this paper is to derive Bayesian alternatives to frequentist null hypothesis significance tests for dependence. In particular, we will present three Bayesian tests for dependence of binary, continuous and mixed variables. These tests are nonparametric and based on the Dirichlet Process, which allows us to use the same prior model for all of them. Therefore, the tests are “consistent” among each other, in the sense that the probabilities that variables are dependent computed with these tests are commensurable across the different types of variables being tested. By means of simulations with artificial data, we show the effectiveness of the new tests.Entropy2016-09-06189Article10.3390/e180903263261099-43002016-09-06doi: 10.3390/e18090326Alessio BenavoliCassio de Campos<![CDATA[Entropy, Vol. 18, Pages 320: Using Graph and Vertex Entropy to Compare Empirical Graphs with Theoretical Graph Models]]>
http://www.mdpi.com/1099-4300/18/9/320
Over the years, several theoretical graph generation models have been proposed. Among the most prominent are: the Erdős–Rényi random graph model, Watts–Strogatz small world model, Albert–Barabási preferential attachment model, Price citation model, and many more. Often, researchers working with real-world data are interested in understanding the generative phenomena underlying their empirical graphs. They want to know which of the theoretical graph generation models would most probably generate a particular empirical graph. In other words, they expect some similarity assessment between the empirical graph and graphs artificially created from theoretical graph generation models. Usually, in order to assess the similarity of two graphs, centrality measure distributions are compared. For a theoretical graph model this means comparing the empirical graph to a single realization of a theoretical graph model, where the realization is generated from the given model using an arbitrary set of parameters. The similarity between centrality measure distributions can be measured using standard statistical tests, e.g., the Kolmogorov–Smirnov test of distances between cumulative distributions. However, this approach is error-prone and can lead to incorrect conclusions, as we show in our experiments. Therefore, we propose a new method for graph comparison and type classification by comparing the entropies of centrality measure distributions (degree centrality, betweenness centrality, closeness centrality). We demonstrate that our approach can help assign the empirical graph to the most similar theoretical model using a simple unsupervised learning method.Entropy2016-09-05189Article10.3390/e180903203201099-43002016-09-05doi: 10.3390/e18090320Tomasz KajdanowiczMikołaj Morzy<![CDATA[Entropy, Vol. 18, Pages 322: Mechanical Fault Diagnosis of High Voltage Circuit Breakers with Unknown Fault Type Using Hybrid Classifier Based on LMD and Time Segmentation Energy Entropy]]>
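The comparison idea above can be illustrated with a toy version: summarize each graph by the Shannon entropy of its degree distribution and assign the empirical graph to the model whose realization has the closest entropy. The stdlib generators below are simplified stand-ins for the cited models, not the paper's experimental setup:

```python
import random
from collections import Counter
from math import log2

def degree_entropy(degrees):
    """Shannon entropy of a graph's degree distribution."""
    n = len(degrees)
    return -sum((c / n) * log2(c / n) for c in Counter(degrees).values())

def erdos_renyi_degrees(n, p, rng):
    """Degree sequence of one G(n, p) realization."""
    deg = [0] * n
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                deg[i] += 1
                deg[j] += 1
    return deg

def ring_lattice_degrees(n, k):
    """Degenerate stand-in model: every node has exactly 2k neighbours."""
    return [2 * k] * n

rng = random.Random(42)
h_er = degree_entropy(erdos_renyi_degrees(500, 0.02, rng))
h_ring = degree_entropy(ring_lattice_degrees(500, 2))
h_empirical = degree_entropy(erdos_renyi_degrees(500, 0.02, rng))

# Assign the "empirical" graph (here: a second ER sample) to the closest model.
best = min([("ER", h_er), ("ring", h_ring)], key=lambda m: abs(m[1] - h_empirical))
print(best[0])   # the ER model wins, since the ring's degree entropy is 0
```

A single scalar per centrality measure, rather than a full distribution, is what makes the entropy comparison robust to the arbitrary parameters of any one realization.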
http://www.mdpi.com/1099-4300/18/9/322
In order to improve the identification accuracy of mechanical fault types in high voltage circuit breakers (HVCBs) without training samples, a novel mechanical fault diagnosis method for HVCBs is proposed, using a hybrid classifier constructed with Support Vector Data Description (SVDD) and the fuzzy c-means (FCM) clustering method, based on Local Mean Decomposition (LMD) and time segmentation energy entropy (TSEE). Firstly, LMD is used to decompose the nonlinear and non-stationary vibration signals of HVCBs into a series of product functions (PFs). Secondly, TSEE is chosen as the feature vector, combining the strengths of energy entropy with the time-delay characteristics of HVCB faults. Then, SVDD trained with normal samples is applied to judge mechanical faults of HVCBs. If a mechanical fault is confirmed, the new fault sample and all known fault samples are clustered by FCM, with the cluster number equal to the number of known fault types. Finally, another SVDD, trained by the specific fault samples, is used to judge whether the fault sample belongs to an unknown type or not. The results of experiments carried out on a real SF6 HVCB validate that the proposed fault-detection method is effective for known faults with training samples and unknown faults without training samples.Entropy2016-09-03189Article10.3390/e180903223221099-43002016-09-03doi: 10.3390/e18090322Nantian HuangLihua FangGuowei CaiDianguo XuHuaijin ChenYonghui Nie<![CDATA[Entropy, Vol. 18, Pages 323: Entropy Base Estimation of Moisture Content of the Top 10-m Unsaturated Soil for the Badain Jaran Desert in Northwestern China]]>
http://www.mdpi.com/1099-4300/18/9/323
Estimation of the soil moisture distribution in desert regions is challenged by the deep unsaturated zone and the extreme natural environment. In this study, an entropy-based method, consisting of information entropy, the principle of maximum entropy (PME), solutions to PME with constraints, and the determination of parameters, is used to estimate the soil moisture distribution in the 10 m deep vadose zone of a desert region. Firstly, the soil moisture distribution is described as a scaled probability density function (PDF), which is solved by PME with the constraints of normalization and known arithmetic and geometric means; the solution is the general form of the gamma distribution. A constant arithmetic mean is determined by considering the stable average recharge rate at the thousand-year scale, and an approximately constant geometric mean is determined by the low flow rate (about 1 cm per year). Next, the parameters of the scaled gamma PDF are determined by local environmental factors such as terrain and vegetation: multivariate linear equations are established to quantify the relationship between the parameters and the environmental factors on the basis of nineteen random soil moisture depth profiles, through the application of fuzzy mathematics. Finally, the accuracy is tested using the correlation coefficient (CC) and the relative error. The method yields a CC larger than 0.9 in more than half of the profiles, and larger than 0.8 in most; the relative errors are less than 30% in most soil moisture profiles and can drop below 15% when the parameters are fitted appropriately. Therefore, this study provides an alternative method to estimate the soil moisture distribution in the top 0–10 m of the Badain Jaran Desert based on local terrain and vegetation factors instead of drilling sand samples; this method should be useful in desert regions with extreme natural conditions, since these environmental factors can be obtained from remote sensing data. Meanwhile, it should be borne in mind that this method is challenged in humid regions, where more intensive and frequent precipitation and denser vegetation cover make the system much more complex.Entropy2016-09-03189Article10.3390/e180903233231099-43002016-09-03doi: 10.3390/e18090323Xiangyang ZhouWenjuan LeiJinzhu Ma<![CDATA[Entropy, Vol. 18, Pages 321: Lattice Distortions in the FeCoNiCrMn High Entropy Alloy Studied by Theory and Experiment]]>
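The PME step above (maximum entropy under fixed arithmetic and geometric means yields a gamma PDF) implies that the two means determine the gamma parameters via ln(A/G) = ln k − ψ(k). The sketch below, with a finite-difference digamma approximation and synthetic data in place of soil-moisture profiles, is an illustration of that relation rather than the authors' fitting procedure:

```python
import random
from math import exp, lgamma, log

def digamma(k, h=1e-5):
    """Central-difference approximation of the digamma function psi(k)."""
    return (lgamma(k + h) - lgamma(k - h)) / (2 * h)

def fit_gamma(samples):
    """Recover gamma (shape, scale) from arithmetic and geometric means."""
    a = sum(samples) / len(samples)                        # arithmetic mean A
    g = exp(sum(log(s) for s in samples) / len(samples))   # geometric mean G
    target = log(a / g)                                    # equals ln k - psi(k)
    lo, hi = 1e-3, 1e3
    for _ in range(200):                                   # bisection on the shape k
        mid = (lo + hi) / 2
        if log(mid) - digamma(mid) > target:
            lo = mid                                       # f(k) decreases in k
        else:
            hi = mid
    k = (lo + hi) / 2
    return k, a / k                                        # shape, scale = A / k

rng = random.Random(3)
data = [rng.gammavariate(2.5, 1.2) for _ in range(20000)]  # true shape 2.5, scale 1.2
k, theta = fit_gamma(data)
print(k, theta)   # close to 2.5 and 1.2
```

The identity follows from E[X] = kθ and E[ln X] = ψ(k) + ln θ for a gamma variable, so the scale θ cancels in ln A − ln G.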
http://www.mdpi.com/1099-4300/18/9/321
Lattice distortions constitute one of the main features characterizing high entropy alloys. Local lattice distortions have, however, only rarely been investigated in these multi-component alloys. We, therefore, employ a combined theoretical electronic structure and experimental approach to study the atomistic distortions in the FeCoNiCrMn high entropy (Cantor) alloy by means of density-functional theory and extended X-ray absorption fine structure spectroscopy. Particular attention is paid to element-resolved distortions for each constituent. The individual mean distortions are small on average, <1%, but their fluctuations (i.e., standard deviations) are an order of magnitude larger, in particular for Cr and Mn. Good agreement between theory and experiment is found.Entropy2016-09-02189Article10.3390/e180903213211099-43002016-09-02doi: 10.3390/e18090321Hyun OhDuancheng MaGerard LeysonBlazej GrabowskiEun ParkFritz KörmannDierk Raabe<![CDATA[Entropy, Vol. 18, Pages 297: Generalized Robustness of Contextuality]]>
http://www.mdpi.com/1099-4300/18/9/297
Motivated by the importance of contextuality and a work on the robustness of the entanglement of mixed quantum states, the robustness of contextuality (RoC) R_C(e) of an empirical model e against non-contextual noises was introduced and discussed in Science China Physics, Mechanics and Astronomy (59(4) and 59(9), 2016). Because noises are not always non-contextual, this paper introduces and discusses the generalized robustness of contextuality (GRoC) R_g(e) of an empirical model e against general noises. It is proven that R_g(e) = 0 if and only if e is non-contextual. This means that the quantity R_g can be used to distinguish contextual empirical models from non-contextual ones. It is also shown that the function R_g is convex on the set of all empirical models and continuous on the set of all no-signaling empirical models. For any two empirical models e and f such that the generalized relative robustness of e with respect to f is finite, a fascinating relationship between the GRoCs of e and f is proven, which reads R_g(e)R_g(f) ≤ 1. Lastly, for any n-cycle contextual box e, a relationship between the GRoC R_g(e) and the extent Δ_e of violating the non-contextual inequalities is established.Entropy2016-09-01189Article10.3390/e180902972971099-43002016-09-01doi: 10.3390/e18090297Huixian MengHuaixin CaoWenhua WangYajing FanLiang Chen<![CDATA[Entropy, Vol. 18, Pages 318: Entropy-Based Modeling of Velocity Lag in Sediment-Laden Open Channel Turbulent Flow]]>
http://www.mdpi.com/1099-4300/18/9/318
In the last few decades, a wide variety of instruments with laser-based techniques have been developed that enable experimentally measuring particle velocity and fluid velocity separately in particle-laden flow. Experiments have revealed that stream-wise particle velocity is different from fluid velocity, and this velocity difference is commonly known as “velocity lag” in the literature. A number of experimental, as well as theoretical investigations have been carried out to formulate deterministic mathematical models of velocity lag, based on several turbulent features. However, a probabilistic study of velocity lag does not seem to have been reported, to the best of our knowledge. The present study therefore focuses on the modeling of velocity lag in open channel turbulent flow laden with sediment using the entropy theory along with a hypothesis on the cumulative distribution function. This function contains a parameter η, which is shown to be a function of specific gravity, particle diameter and shear velocity. The velocity lag model is tested using a wide range of twenty-two experimental runs collected from the literature and is also compared with other models of velocity lag. Then, an error analysis is performed to further evaluate the prediction accuracy of the proposed model, especially in comparison to other models. The model is also able to explain the physical characteristics of velocity lag caused by the interaction between the particles and the fluid.Entropy2016-08-30189Article10.3390/e180903183181099-43002016-08-30doi: 10.3390/e18090318Manotosh KumbhakarSnehasis KunduKoeli GhoshalVijay Singh<![CDATA[Entropy, Vol. 18, Pages 317: Entropy Complexity and Stability of a Nonlinear Dynamic Game Model with Two Delays]]>
http://www.mdpi.com/1099-4300/18/9/317
In this paper, a duopoly game model with double delays in the hydropower market is established, and the research focuses on the influence of the time delay parameter on the complexity of the system. Firstly, a game model is established for enterprises that consider both current and historical output when making decisions. Secondly, the existence and stability of Hopf bifurcation are analyzed, and the conditions and main conclusions of Hopf bifurcation are given. Thirdly, numerical simulation and analysis are carried out to verify the conclusions of the theoretical analysis. The effect of the delay parameter on the stability of the system is simulated by a bifurcation diagram, the Lyapunov exponent, and an entropic diagram; in addition, the stability region of the system is given by a 2D parameter bifurcation diagram and a 3D parameter bifurcation diagram. Finally, the method of delayed feedback control is used to control the chaotic system. The research results can provide a guideline for enterprise decision-making.Entropy2016-08-30189Article10.3390/e180903173171099-43002016-08-30doi: 10.3390/e18090317Zhihui HanJunhai MaFengshan SiWenbo Ren<![CDATA[Entropy, Vol. 18, Pages 314: Exergy Analysis of a Syngas-Fueled Combined Cycle with Chemical-Looping Combustion and CO2 Sequestration]]>
http://www.mdpi.com/1099-4300/18/9/314
Fossil fuels are still widely used for power generation. Nevertheless, it is possible to attain a substantial short- and medium-term reduction of greenhouse gas emissions to the atmosphere through the sequestration of the CO2 produced in fuel oxidation. The chemical-looping combustion (CLC) technique is based on a chemical intermediate agent, which gets oxidized in an air reactor and is then conducted to a separate fuel reactor, where it oxidizes the fuel in turn. Thus, the oxidation products CO2 and H2O are obtained in an output flow in which the only non-condensable gas is CO2, allowing the subsequent sequestration of CO2 without an energy penalty. Furthermore, with shrewd configurations, a lower exergy destruction in the combustion chemical transformation can be achieved. This paper focuses on a second law analysis of a CLC combined cycle power plant with CO2 sequestration using syngas from coal and biomass gasification as fuel. The key thermodynamic parameters are optimized via the exergy method. The proposed power plant configuration is compared with a similar gas turbine system with conventional combustion, finding a notable increase in power plant efficiency. Furthermore, the influence of syngas composition on the results is investigated by considering fuels with different H2 contents.Entropy2016-08-25189Article10.3390/e180903143141099-43002016-08-25doi: 10.3390/e18090314Álvaro Urdiales MontesinoÁngel Jiménez ÁlvaroJavier Rodríguez MartínRafael Nieto Carlier<![CDATA[Entropy, Vol. 18, Pages 316: Stationary Stability for Evolutionary Dynamics in Finite Populations]]>
http://www.mdpi.com/1099-4300/18/9/316
We demonstrate a vast expansion of the theory of evolutionary stability to finite populations with mutation, connecting the theory of the stationary distribution of the Moran process with the Lyapunov theory of evolutionary stability. We define the notion of stationary stability for the Moran process with mutation and generalizations, as well as a generalized notion of evolutionary stability that includes mutation called an incentive stable state (ISS) candidate. For sufficiently large populations, extrema of the stationary distribution are ISS candidates and we give a family of Lyapunov quantities that are locally minimized at the stationary extrema and at ISS candidates. In various examples, including for the Moran and Wright–Fisher processes, we show that the local maxima of the stationary distribution capture the traditionally-defined evolutionarily stable states. The classical stability theory of the replicator dynamic is recovered in the large population limit. Finally we include descriptions of possible extensions to populations of variable size and populations evolving on graphs.Entropy2016-08-25189Article10.3390/e180903163161099-43002016-08-25doi: 10.3390/e18090316Marc HarperDashiell Fryer<![CDATA[Entropy, Vol. 18, Pages 315: Description of Seizure Process for Gas Dynamic Spray of Metal Powders from Non-Equilibrium Thermodynamics Standpoint]]>
http://www.mdpi.com/1099-4300/18/9/315
The seizure process has been considered from the standpoints of non-equilibrium thermodynamics and self-organization theory. It is shown that, to intensify the seizure of powder mix particles with the substrate during spraying, the relatively light components of the powder mix should be preferentially transferred into the friction zone. The theoretical inferences have been confirmed experimentally, as exemplified by the gas dynamic spray of a copper-zinc powder mix.Entropy2016-08-25189Article10.3390/e180903153151099-43002016-08-25doi: 10.3390/e18090315Iosif GershmanEugeniy GershmanGerman Fox-RabinovichStephen Veldhuis<![CDATA[Entropy, Vol. 18, Pages 313: Application of Entropy-Based Features to Predict Defibrillation Outcome in Cardiac Arrest]]>
http://www.mdpi.com/1099-4300/18/9/313
Prediction of defibrillation success is of vital importance to guide therapy and improve the survival of patients suffering out-of-hospital cardiac arrest (OHCA). Currently, the most efficient methods to predict shock success are based on the analysis of the electrocardiogram (ECG) during ventricular fibrillation (VF), and recent studies suggest the efficacy of waveform indices that characterize the underlying non-linear dynamics of VF. In this study we introduce, adapt and fully characterize six entropy indices for VF shock outcome prediction, based on the classical definitions of entropy to measure the regularity and predictability of a time series. Data from 163 OHCA patients comprising 419 shocks (107 successful) were used, and the performance of the entropy indices was characterized in terms of embedding dimension (m) and matching tolerance (r). Six classical predictors were also assessed as baseline prediction values. The best prediction results were obtained for fuzzy entropy (FuzzEn) with m = 3 and an amplitude-dependent tolerance of r = 80 μV. This resulted in a balanced sensitivity/specificity of 80.4%/76.9%, which improved by over five points the results obtained for the best classical predictor. These results suggest that a FuzzEn approach for a joint quantification of VF amplitude and its non-linear dynamics may be a promising tool to optimize OHCA treatment.Entropy2016-08-24189Article10.3390/e180903133131099-43002016-08-24doi: 10.3390/e18090313Beatriz ChicoteUnai IrustaRaúl AlcarazJosé RietaElisabete AramendiIraia IsasiDaniel AlonsoKarlos Ibarguren<![CDATA[Entropy, Vol. 18, Pages 311: Distribution Entropy Boosted VLAD for Image Retrieval]]>
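A minimal fuzzy entropy (FuzzEn) computation in the spirit of the index characterized above: length-m templates with baseline (mean) removal, Chebyshev distance, and a Gaussian membership function. The parameter values and the synthetic test signals below are illustrative assumptions, not the paper's tuned ECG settings:

```python
import random
from math import exp, log

def _phi(x, m, r):
    """Average Gaussian similarity over all pairs of mean-removed m-templates."""
    templates = [x[i:i + m] for i in range(len(x) - m + 1)]
    templates = [[v - sum(t) / m for v in t] for t in templates]  # baseline removal
    sims, count = 0.0, 0
    for i in range(len(templates)):
        for j in range(i + 1, len(templates)):
            d = max(abs(a - b) for a, b in zip(templates[i], templates[j]))
            sims += exp(-(d / r) ** 2)   # Gaussian membership function
            count += 1
    return sims / count

def fuzzy_entropy(x, m=2, r=0.2):
    """FuzzEn = -ln(phi_{m+1} / phi_m); higher means less predictable."""
    return log(_phi(x, m, r)) - log(_phi(x, m + 1, r))

rng = random.Random(0)
noise = [rng.gauss(0, 1) for _ in range(300)]
regular = [float((-1) ** i) for i in range(300)]   # perfectly predictable
print(fuzzy_entropy(noise) > fuzzy_entropy(regular))
```

The graded membership is what distinguishes FuzzEn from sample entropy's hard threshold, and it is also what lets an amplitude-dependent tolerance fold VF amplitude information into the index.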
http://www.mdpi.com/1099-4300/18/8/311
Several recent works have shown that aggregating local descriptors to generate a global image representation results in great efficiency for retrieval and classification tasks. The most popular method following this approach is VLAD (Vector of Locally Aggregated Descriptors). We present a novel image representation called Distribution Entropy Boosted VLAD (EVLAD), which extends the original vector of locally aggregated descriptors. The original VLAD adopts only residuals to depict the distribution information of every visual word and neglects other statistical clues, so its discriminative power is limited. To address this issue, this paper proposes the use of the distribution entropy of each cluster as supplementary information to enhance the search accuracy. To fuse the two feature sources organically, two fusion methods, applied after a new power-law normalization stage, are also investigated, generating identically sized and double-sized vectors relative to the original VLAD. We validate our approach in image retrieval and image classification experiments. Experimental results demonstrate the effectiveness of our algorithm.Entropy2016-08-24188Article10.3390/e180803113111099-43002016-08-24doi: 10.3390/e18080311Qiuzhan ZhouCheng WangPingping LiuQingliang LiYeran WangShuozhang Chen<![CDATA[Entropy, Vol. 18, Pages 310: SU(2) Yang–Mills Theory: Waves, Particles, and Quantum Thermodynamics]]>
http://www.mdpi.com/1099-4300/18/9/310
We elucidate how Quantum Thermodynamics at temperature T emerges from pure and classical SU(2) Yang–Mills theory on a four-dimensional Euclidean spacetime slice S^1 × R^3. The concept of a (deconfining) thermal ground state, composed of certain solutions to the fundamental, classical Yang–Mills equation, allows for a unified treatment of both (classical) wave- and (quantum) particle-like excitations thereof. More definitely, the thermal ground state represents the interplay between nonpropagating, periodic configurations which are electric-magnetically (anti)selfdual in a non-trivial way and possess topological charge modulus unity. Their trivial-holonomy versions—Harrington–Shepard (HS) (anti)calorons—yield an accurate a priori estimate of the thermal ground state in terms of spatially coarse-grained centers, each containing one quantum of action ℏ localized at its inmost spacetime point, which induce an inert adjoint scalar field ϕ (with |ϕ| spatio-temporally constant). The field ϕ, in turn, implies an effective pure-gauge configuration, a_μ^gs, accurately describing HS (anti)caloron overlap. Spatial homogeneity of the thermal ground-state estimate ϕ, a_μ^gs demands that (anti)caloron centers are densely packed, thus representing a collective departure from (anti)selfduality. Effectively, such a “nervous” microscopic situation gives rise to two static phenomena: finite ground-state energy density ρ_gs and pressure P_gs with ρ_gs = −P_gs, as well as the (adjoint) Higgs mechanism. The peripheries of HS (anti)calorons are static and resemble (anti)selfdual dipole fields whose apparent dipole moments are determined by |ϕ| and T, protecting them against deformation potentially caused by overlap. Such a protection extends to the spatial density of HS (anti)caloron centers. Thus the vacuum electric permittivity ϵ_0 and magnetic permeability μ_0, supporting the propagation of wave-like disturbances in the U(1) Cartan subalgebra of SU(2), can be reliably calculated for disturbances which do not probe HS (anti)caloron centers. Both ϵ_0 and μ_0 turn out to be temperature independent in thermal equilibrium but also for an isolated, monochromatic U(1) wave. HS (anti)caloron centers, on the other hand, react to wave-like disturbances, which would resolve their spatio-temporal structure, by indeterministic emissions of quanta of energy and momentum. Thermodynamically seen, such events are Boltzmann weighted and occur independently at distinct locations in space and instants in (Minkowskian) time, entailing the Bose–Einstein distribution. Small correlative ramifications associate with effective radiative corrections, e.g., in terms of polarization tensors. We comment on an SU(2) × SU(2) based gauge-theory model, describing wave- and particle-like aspects of electromagnetic disturbances within the so far experimentally/observationally investigated spectrum.Entropy2016-08-23189Article10.3390/e180903103101099-43002016-08-23doi: 10.3390/e18090310Ralf Hofmann<![CDATA[Entropy, Vol. 18, Pages 312: Combinatorial Intricacies of Labeled Fano Planes]]>
http://www.mdpi.com/1099-4300/18/9/312
Given a seven-element set X = {1, 2, 3, 4, 5, 6, 7}, there are 30 ways to define a Fano plane on it. Let us call a line of such a Fano plane—that is to say, an unordered triple from X—ordinary or defective, according to whether the sum of the two smaller integers from the triple is or is not equal to the remaining one, respectively. A point of the labeled Fano plane is said to be of order s, 0 ≤ s ≤ 3, if there are s defective lines passing through it. With such structural refinement in mind, the 30 Fano planes are shown to fall into eight distinct types. Out of the total of 35 lines, the nine ordinary lines are of five different kinds, whereas the remaining 26 defective lines yield as many as ten distinct types. It is shown that no labeled Fano plane can have all points of zeroth order, or feature just one point of order two. A connection with prominent configurations in Steiner triple systems is also pointed out.Entropy2016-08-23189Letter10.3390/e180903123121099-43002016-08-23doi: 10.3390/e18090312Metod Saniga<![CDATA[Entropy, Vol. 18, Pages 272: Sleep Stage Classification Using EEG Signal Analysis: A Comprehensive Survey and New Investigation]]>
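Both counts quoted above are easy to verify by brute force: relabeling one reference Fano plane with all 7! permutations yields exactly 30 distinct labeled planes (since the plane's automorphism group has order 168), and exactly 9 of the 35 triples from X are ordinary, i.e., their two smaller entries sum to the third:

```python
from itertools import combinations, permutations

# Lines of one reference Fano plane on X = {1, ..., 7}.
BASE = [(1, 2, 3), (1, 4, 5), (1, 6, 7), (2, 4, 6), (2, 5, 7), (3, 4, 7), (3, 5, 6)]

def relabel(plane, perm):
    """Apply the relabeling 1..7 -> perm to every line of the plane."""
    m = dict(zip(range(1, 8), perm))
    return frozenset(frozenset(m[p] for p in line) for line in plane)

# Distinct labeled Fano planes obtained from all 5040 relabelings.
planes = {relabel(BASE, perm) for perm in permutations(range(1, 8))}

# Ordinary triples: a + b = c with a < b < c.
ordinary = [t for t in combinations(range(1, 8), 3) if t[0] + t[1] == t[2]]

print(len(planes), len(ordinary))   # prints: 30 9
```

The count 30 = 5040/168 also follows directly from the orbit-stabilizer theorem applied to the relabeling action.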
http://www.mdpi.com/1099-4300/18/9/272
Sleep specialists often conduct manual sleep stage scoring by visually inspecting the patient’s neurophysiological signals collected at sleep labs. This is, generally, a very difficult, tedious and time-consuming task. The limitations of manual sleep stage scoring have escalated the demand for developing Automatic Sleep Stage Classification (ASSC) systems. Sleep stage classification refers to identifying the various stages of sleep and is a critical step in an effort to assist physicians in the diagnosis and treatment of related sleep disorders. The aim of this paper is to survey the progress and challenges in various existing Electroencephalogram (EEG) signal-based methods used for sleep stage identification at each phase; including pre-processing, feature extraction and classification; in an attempt to find the research gaps and possibly introduce a reasonable solution. Many of the prior and current related studies use multiple EEG channels, and are based on 30 s or 20 s epoch lengths which affect the feasibility and speed of ASSC for real-time applications. Thus, in this paper, we also present a novel and efficient technique that can be implemented in an embedded hardware device to identify sleep stages using new statistical features applied to 10 s epochs of single-channel EEG signals. In this study, the PhysioNet Sleep European Data Format (EDF) Database was used. The proposed methodology achieves an average classification sensitivity, specificity and accuracy of 89.06%, 98.61% and 93.13%, respectively, when the decision tree classifier is applied. Finally, our new method is compared with those in recently published studies, which reiterates the high classification accuracy performance.Entropy2016-08-23189Review10.3390/e180902722721099-43002016-08-23doi: 10.3390/e18090272Khald AboalayonMiad FaezipourWafaa AlmuhammadiSaeid Moslehpour<![CDATA[Entropy, Vol. 
18, Pages 307: Weighted-Permutation Entropy Analysis of Resting State EEG from Diabetics with Amnestic Mild Cognitive Impairment]]>
http://www.mdpi.com/1099-4300/18/8/307
Diabetes is a significant public health issue as it increases the risk for dementia and Alzheimer’s disease (AD). In this study, we aim to investigate whether weighted-permutation entropy (WPE) and permutation entropy (PE) of resting-state EEG (rsEEG) could be applied as potential objective biomarkers to distinguish type 2 diabetes patients with amnestic mild cognitive impairment (aMCI) from those with normal cognitive function. rsEEG series were acquired from 28 patients with type 2 diabetes (16 aMCI patients and 12 controls), and neuropsychological assessments were performed. The rsEEG signals were analysed using WPE and PE methods. The correlations between the PE or WPE of the rsEEG and the neuropsychological assessments were analysed as well. The WPE in the right temporal (RT) region of the aMCI diabetics was lower than the controls, and the WPE was significantly positively correlated to the scores of the Auditory Verbal Learning Test (AVLT) (AVLT-Immediate recall, AVLT-Delayed recall, AVLT-Delayed recognition) and the Wechsler Adult Intelligence Scale Digit Span Test (WAIS-DST). These findings were not obtained with PE. We concluded that the WPE of rsEEG recordings could distinguish aMCI diabetics from normal cognitive function diabetic controls among the current sample of diabetic patients. Thus, the WPE could be a potential index for assisting diagnosis of aMCI in type 2 diabetes.Entropy2016-08-22188Article10.3390/e180803073071099-43002016-08-22doi: 10.3390/e18080307Zhijie BianGaoxiang OuyangZheng LiQiuli LiLei WangXiaoli Li<![CDATA[Entropy, Vol. 18, Pages 401: Exploring the Key Risk Factors for Application of Cloud Computing in Auditing]]>
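Weighted-permutation entropy, as used above, differs from plain PE by weighting each ordinal pattern with its window's variance, so large-amplitude structures contribute more. The sketch below follows this common definition; the normalization by log2(m!) and the synthetic test signals are assumptions for illustration, not the study's rsEEG settings:

```python
from collections import defaultdict
from math import factorial, log2
import random

def wpe(x, m=3, delay=1):
    """Weighted-permutation entropy, normalized to [0, 1] by log2(m!)."""
    weights = defaultdict(float)
    total = 0.0
    for i in range(len(x) - (m - 1) * delay):
        window = x[i:i + m * delay:delay]
        pattern = tuple(sorted(range(m), key=window.__getitem__))  # ordinal pattern
        mean = sum(window) / m
        w = sum((v - mean) ** 2 for v in window) / m               # window variance
        weights[pattern] += w
        total += w
    probs = [v / total for v in weights.values()]
    return -sum(p * log2(p) for p in probs) / log2(factorial(m))

rng = random.Random(7)
noise = [rng.gauss(0, 1) for _ in range(3000)]
trend = [i / 3000 for i in range(3000)]   # monotone: a single ordinal pattern
print(wpe(trend), wpe(noise) > 0.9)       # near 0 for the trend, near 1 for noise
```

Setting every weight w to 1 recovers plain permutation entropy, which makes the two indices easy to compare on the same recordings.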
http://www.mdpi.com/1099-4300/18/8/401
In the cloud computing information technology environment, cloud computing has advantages such as lower cost, immediate access to hardware resources, lower IT barriers to innovation and higher scalability, but for financial audit information flow and processing in the cloud system, CPA (Certified Public Accountant) firms need to give special consideration to, for example, system problems, information security and other related issues. Auditing cloud computing applications is a future trend for CPA firms; given its importance and the fact that very few studies have investigated this issue, this study seeks to explore the key risk factors of cloud computing and the associated audit considerations. The dimensions/perspectives of cloud computing audit considerations are broad and cover many criteria/factors, and these risk factors are becoming increasingly complex and interdependent. If the dimensions can be established, the mutually influential relations of the dimensions and criteria determined, and the current execution performance assessed, then a prioritized improvement strategy can be constructed to serve as a reference for CPA firm management decision making, as well as to provide CPA firms with a reference for building cloud computing auditing systems. Empirical results show that the key risk factors to consider when using cloud computing in auditing are, in order of priority for improvement: Operations (D), Automating user provisioning (C), Technology Risk (B) and Protection system (A).Entropy2016-08-22188Article10.3390/e180804014011099-43002016-08-22doi: 10.3390/e18080401Kuang-Hua HuFu-Hsiang ChenWei-Jhou We<![CDATA[Entropy, Vol. 18, Pages 403: Interplay between Lattice Distortions, Vibrations and Phase Stability in NbMoTaW High Entropy Alloys]]>
http://www.mdpi.com/1099-4300/18/8/403
Refractory high entropy alloys (HEA), such as BCC NbMoTaW, represent a promising materials class for next-generation high-temperature applications, due to their extraordinary mechanical properties. A characteristic feature of HEAs is the formation of single-phase solid solutions. For BCC NbMoTaW, recent computational studies revealed, however, a B2(Mo,W;Nb,Ta)-ordering at ambient temperature. This ordering could impact many materials properties, such as thermodynamic, mechanical, or diffusion properties, and hence be of relevance for practical applications. In this work, we theoretically address how the B2-ordering impacts thermodynamic properties of BCC NbMoTaW and how the predicted ordering temperature itself is affected by vibrations, electronic excitations, lattice distortions, and relaxation energies.Entropy2016-08-20188Article10.3390/e180804034031099-43002016-08-20doi: 10.3390/e18080403Fritz KörmannMarcel Sluiter<![CDATA[Entropy, Vol. 18, Pages 402: Analytical Solutions of the Electrical RLC Circuit via Liouville–Caputo Operators with Local and Non-Local Kernels]]>
http://www.mdpi.com/1099-4300/18/8/402
In this work we obtain analytical solutions for the electrical RLC circuit model defined with the Liouville–Caputo and Caputo–Fabrizio derivatives and the new fractional derivative based on the Mittag-Leffler function. Numerical simulations of alternative models are presented for evaluating the effectiveness of these representations. Different source terms are considered in the fractional differential equations. The classical behaviors are recovered when the fractional order α is equal to 1.Entropy2016-08-20188Article10.3390/e180804024021099-43002016-08-20doi: 10.3390/e18080402José Gómez-AguilarVictor Morales-DelgadoMarco Taneco-HernándezDumitru BaleanuRicardo Escobar-JiménezMaysaa Al Qurashi<![CDATA[Entropy, Vol. 18, Pages 400: Optimal Noise Benefit in Composite Hypothesis Testing under Different Criteria]]>
http://www.mdpi.com/1099-4300/18/8/400
The detectability of a noise-enhanced composite hypothesis testing problem under different criteria is studied. In this work, the noise-enhanced detection problem is formulated as a noise-enhanced classical Neyman–Pearson (NP), Max–min, or restricted NP problem when the prior information is completely known, completely unknown, or partially known, respectively. Next, the detection performances are compared and the feasible range of the constraint on the minimum detection probability is discussed. Under certain conditions, the noise-enhanced restricted NP problem is equivalent to a noise-enhanced classical NP problem with a modified prior distribution. Furthermore, the corresponding theorems and algorithms are given to search for the optimal additive noise in the restricted NP framework. In addition, the relationship between the optimal noise-enhanced average detection probability and the constraint on the minimum detection probability is explored. Finally, numerical examples and simulations are provided to illustrate the theoretical results.Entropy2016-08-19188Article10.3390/e180804004001099-43002016-08-19doi: 10.3390/e18080400Shujun LiuTing YangMingchun TangHongqing LiuKui ZhangXinzheng Zhang<![CDATA[Entropy, Vol. 18, Pages 309: Potential of Entropic Force in Markov Systems with Nonequilibrium Steady State, Generalized Gibbs Function and Criticality]]>
http://www.mdpi.com/1099-4300/18/8/309
In this paper, we revisit the notion of the “minus logarithm of stationary probability” as a generalized potential in nonequilibrium systems and attempt to illustrate its central role in an axiomatic approach to stochastic nonequilibrium thermodynamics of complex systems. It is demonstrated that this quantity arises naturally through both monotonicity results of Markov processes and as the rate function when a stochastic process approaches a deterministic limit. We then undertake a more detailed mathematical analysis of the consequences of this quantity, culminating in a necessary and sufficient condition for the criticality of stochastic systems. This condition is then discussed in the context of recent results about criticality in biological systems.Entropy2016-08-18188Article10.3390/e180803093091099-43002016-08-18doi: 10.3390/e18080309Lowell ThompsonHong Qian<![CDATA[Entropy, Vol. 18, Pages 308: Soft Magnetic Properties of High-Entropy Fe-Co-Ni-Cr-Al-Si Thin Films]]>
http://www.mdpi.com/1099-4300/18/8/308
Soft magnetic properties of Fe-Co-Ni-Al-Cr-Si thin films were studied. As-deposited Fe-Co-Ni-Al-Cr-Si nano-grained thin films showing no magnetic anisotropy were subjected to field-annealing at different temperatures to induce magnetic anisotropy. The optimized magnetic and electrical properties of Fe-Co-Ni-Al-Cr-Si films annealed at 200 °C are: saturation magnetization 9.13 × 10⁵ A/m, coercivity 79.6 A/m, out-of-plane uniaxial anisotropy field 1.59 × 10³ A/m, and electrical resistivity 3.75 μΩ·m. Based on these excellent properties, we employed such films to fabricate a magnetic thin-film inductor. The performance of the high-entropy alloy thin-film inductors is superior to that of an air-core inductor.Entropy2016-08-18188Article10.3390/e180803083081099-43002016-08-18doi: 10.3390/e18080308Pei-Chung LinChun-Yang ChengJien-Wei YehTsung-Shune Chin<![CDATA[Entropy, Vol. 18, Pages 306: Contact-Free Detection of Obstructive Sleep Apnea Based on Wavelet Information Entropy Spectrum Using Bio-Radar]]>
http://www.mdpi.com/1099-4300/18/8/306
Judgment and early warning of obstructive sleep apnea (OSA) are meaningful for the diagnosis of sleep illness. This paper proposes a novel method based on the wavelet information entropy spectrum to make an apnea judgment, in the wavelet domain, of the OSA respiratory signal detected by bio-radar. It makes full use of the strong irregularity and disorder of the respiratory signal that result from brain stimulation by the real, low airflow during apnea. The experimental results demonstrate that the proposed method is effective for detecting the occurrence of sleep apnea and is also able to detect some apnea cases that the energy spectrum method cannot. Ultimately, the comprehensive judgment accuracy over 10 groups of OSA data is 93.1%, which is promising for the non-contact aided diagnosis of OSA.Entropy2016-08-18188Article10.3390/e180803063061099-43002016-08-18doi: 10.3390/e18080306Fugui QiChuantao LiShuaijie WangHua ZhangJianqi WangGuohua Lu<![CDATA[Entropy, Vol. 18, Pages 305: An Efficient Method to Construct Parity-Check Matrices for Recursively Encoding Spatially Coupled LDPC Codes †]]>
http://www.mdpi.com/1099-4300/18/8/305
Spatially coupled low-density parity-check (LDPC) codes have attracted considerable attention due to their promising performance. Recursive encoding of the codes with low delay and low complexity has been proposed in the literature but with constraints or restrictions. In this manuscript we propose an efficient method to construct parity-check matrices for recursively encoding spatially coupled LDPC codes with arbitrarily chosen node degrees. A general principle is proposed, which provides feasible and practical guidance for the construction of parity-check matrices. According to the specific structure of the matrix, each parity bit at a coupling position is jointly determined by the information bits at the current position and the encoded bits at former positions. Performance analysis in terms of design rate and density evolution has been presented. It can be observed that, in addition to the feature of recursive encoding, selected code structures constructed by the newly proposed method may lead to better belief-propagation thresholds than the conventional structures. Finite-length simulation results are provided as well, which verify the theoretical analysis.Entropy2016-08-17188Article10.3390/e180803053051099-43002016-08-17doi: 10.3390/e18080305Zhongwei SiSijie WangJunyang Ma<![CDATA[Entropy, Vol. 18, Pages 304: Entropy as a Metric Generator of Dissipation in Complete Metriplectic Systems]]>
http://www.mdpi.com/1099-4300/18/8/304
This lecture is a short review on the role entropy plays in those classical dissipative systems whose equations of motion may be expressed via a Leibniz Bracket Algebra (LBA). This means that the time derivative of any physical observable f of the system is calculated by putting this f in a “bracket” together with a “special observable” F, referred to as a Leibniz generator of the dynamics. While conservative dynamics is given an LBA formulation in the Hamiltonian framework, so that F is the Hamiltonian H of the system that generates the motion via classical Poisson brackets or quantum commutation brackets, an LBA formulation can be given to classical dissipative dynamics through the Metriplectic Bracket Algebra (MBA): the conservative component of the dynamics is still generated via Poisson algebra by the total energy H, while S, the entropy of the degrees of freedom statistically encoded in friction, generates dissipation via a metric bracket. The motivation of expressing through a bracket algebra and a motion-generating function F is to endow the theory of the system at hand with all the powerful machinery of Hamiltonian systems in terms of symmetries that become evident and readable. Here a (necessarily partial) overview of the types of systems subject to MBA formulation is presented, and the physical meaning of the quantity S involved in each is discussed. Here the aim is to review the different MBAs for isolated systems in a synoptic way. At the end of this collection of examples, the fact that dissipative dynamics may be constructed also in the absence of friction with microscopic degrees of freedom is stressed. This reasoning is a hint to introduce dissipation at a more fundamental level.Entropy2016-08-16188Review10.3390/e180803043041099-43002016-08-16doi: 10.3390/e18080304Massimo Materassi<![CDATA[Entropy, Vol. 18, Pages 303: A Geographically Temporal Weighted Regression Approach with Travel Distance for House Price Estimation]]>
http://www.mdpi.com/1099-4300/18/8/303
Previous studies have demonstrated that non-Euclidean distance metrics can improve model fit in the geographically weighted regression (GWR) model. However, the GWR model considers only spatial nonstationarity and does not address local temporal variation. Therefore, this paper explores a geographically temporal weighted regression (GTWR) approach that accounts for both spatial and temporal nonstationarity simultaneously to estimate house prices based on travel-time distance metrics. Using house price data collected between 1980 and 2016, the house price response and explanatory variables are modeled using both the GWR and the GTWR approaches. Compared with the GWR model with Euclidean and travel distance metrics, the GTWR model with travel distance obtains the highest value of the coefficient of determination (R²) and the lowest value of the Akaike information criterion (AIC). The results show that the GTWR model provides a relatively high goodness of fit and sufficient space-time explanatory power with non-Euclidean distance metrics. The results of this study can be used to formulate more effective policies for real estate management.Entropy2016-08-16188Article10.3390/e180803033031099-43002016-08-16doi: 10.3390/e18080303Jiping LiuYi YangShenghua XuYangyang ZhaoYong WangFuhao Zhang<![CDATA[Entropy, Vol. 18, Pages 301: Thermal Analysis of Shell-and-Tube Thermoacoustic Heat Exchangers]]>
http://www.mdpi.com/1099-4300/18/8/301
Heat exchangers are of key importance to the overall performance and commercialization of thermoacoustic devices. The main goal in designing efficient thermoacoustic heat exchangers (TAHXs) is the achievement of the required heat transfer rate in conjunction with low acoustic energy dissipation. A numerical investigation is performed to examine the effects of geometry on both the viscous and thermal-relaxation losses of shell-and-tube TAHXs. Further, the impact of the drive ratio, as well as of the temperature difference between the oscillating gas and the TAHX tube wall, on acoustic energy dissipation is explored. While viscous losses decrease with d_i/δ_κ, thermal-relaxation losses increase; however, thermal-relaxation effects mainly determine the acoustic power dissipated in TAHXs. The results indicate the existence of an optimal configuration for which the acoustic energy dissipation is minimized, depending on both the TAHX metal temperature and the drive ratio.Entropy2016-08-16188Article10.3390/e180803013011099-43002016-08-16doi: 10.3390/e18080301Mohammad GholamrezaeiKaveh Ghorbanian<![CDATA[Entropy, Vol. 18, Pages 299: Determining the Entropic Index q of Tsallis Entropy in Images through Redundancy]]>
http://www.mdpi.com/1099-4300/18/8/299
The Boltzmann–Gibbs and Tsallis entropies are essential concepts in statistical physics, which have found multiple applications in many engineering and science areas. In particular, we focus our interest on their applications to image processing through information theory. We present in this article a novel numeric method to calculate the Tsallis entropic index q characteristic of a given image, considering the image as a non-extensive system. The entropic index q is calculated through q-redundancy maximization, a methodology that comes from information theory. We obtain better results in grayscale image processing by using the Tsallis entropy and its thresholding index q than by using the Shannon entropy.Entropy2016-08-15188Article10.3390/e180802992991099-43002016-08-15doi: 10.3390/e18080299Abdiel Ramírez-ReyesAlejandro Hernández-MontoyaGerardo Herrera-CorralIsmael Domínguez-Jiménez<![CDATA[Entropy, Vol. 18, Pages 300: Assessing the Exergy Costs of a 332-MW Pulverized Coal-Fired Boiler]]>
http://www.mdpi.com/1099-4300/18/8/300
In this paper, we analyze the exergy costs of a real large industrial boiler with the aim of improving efficiency. Specifically, the 350-MW front-fired, natural circulation, single reheat and balanced draft coal-fired boiler forms part of a 1050-MW conventional power plant located in Spain. We start with a diagram of the power plant, followed by a formulation of the exergy cost allocation problem to determine the exergy cost of the product of the boiler as a whole and the expenses of the individual components and energy streams. We also define a productive structure of the system. Furthermore, a proposal for including the exergy of radiation is provided in this study. Our results show that the unit exergy cost of the product of the boiler goes from 2.352 to 2.5, and that the maximum values are located in the ancillary electrical devices, such as induced-draft fans and coil heaters. Finally, radiation does not have an effect on the electricity cost, but affects at least 30% of the unit exergy cost of the boiler’s product.Entropy2016-08-15188Article10.3390/e180803003001099-43002016-08-15doi: 10.3390/e18080300Victor Rangel-HernandezCesar Damian-AscencioJuan Belman-FloresAlejandro Zaleta-Aguilar<![CDATA[Entropy, Vol. 18, Pages 302: Heat Transfer and Entropy Generation of Non-Newtonian Laminar Flow in Microchannels with Four Flow Control Structures]]>
http://www.mdpi.com/1099-4300/18/8/302
Flow characteristics and heat transfer performance of carboxymethyl cellulose (CMC) aqueous solutions in microchannels with flow control structures were investigated in this study. The research was carried out with various flow rates and concentrations of the CMC aqueous solutions. The results reveal that the pin-finned microchannel has the most uniform temperature distribution on the structured walls, and the average temperature on the structured wall reaches the minimum value in cylinder-ribbed microchannels at the same flow rate and CMC concentration. Moreover, the protruded microchannel obtains the minimum relative Fanning friction factor f/f0, while the maximum f/f0 is observed in the cylinder-ribbed microchannel. Furthermore, the minimum f/f0 is reached in the cases with CMC2000, and the relative Nusselt number Nu/Nu0 of the CMC2000 cases is larger than that of the other cases in the four structured microchannels. Therefore, 2000 ppm is the recommended concentration of CMC aqueous solutions for all the cases with different flow rates and flow control structures. Pin-finned microchannels are preferred in low-flow-rate cases, while V-grooved microchannels have the minimum relative entropy generation S’/S0’ and the best thermal performance TP with CMC2000 at high flow rates.Entropy2016-08-12188Article10.3390/e180803023021099-43002016-08-12doi: 10.3390/e18080302Ke YangDi ZhangYonghui XieGongnan Xie<![CDATA[Entropy, Vol. 18, Pages 298: Voice Activity Detection Using Fuzzy Entropy and Support Vector Machine]]>
http://www.mdpi.com/1099-4300/18/8/298
This paper proposes support vector machine (SVM)-based voice activity detection using FuzzyEn to improve detection performance under noisy conditions. The proposed voice activity detection (VAD) uses fuzzy entropy (FuzzyEn) as a feature extracted from noise-reduced speech signals to train an SVM model for speech/non-speech classification. The proposed VAD method was tested by conducting various experiments in which real background noises at different signal-to-noise ratios (SNR), ranging from −10 dB to 10 dB, were added to actual speech signals collected from the TIMIT database. The analysis proves that the FuzzyEn feature shows better results in discriminating noise from corrupted noisy speech. The efficacy of the SVM classifier was validated using 10-fold cross validation. Furthermore, the results obtained by the proposed method were compared with those of previous standardized VAD algorithms as well as recently developed methods. The performance comparison suggests that the proposed method is more efficient in detecting speech under various noisy environments, with an accuracy of 93.29%, and that the FuzzyEn feature detects speech efficiently even at low SNR levels.Entropy2016-08-12188Article10.3390/e180802982981099-43002016-08-12doi: 10.3390/e18080298R. Johny EltonP. VasukiJ. Mohanalin<![CDATA[Entropy, Vol. 18, Pages 295: Information Theoretical Measures for Achieving Robust Learning Machines]]>
http://www.mdpi.com/1099-4300/18/8/295
Information theoretical measures are used to design, from first principles, an objective function that can drive a learning machine process to a solution that is robust to perturbations in parameters. Full analytic derivations are given and tested with computational examples showing that indeed the procedure is successful. The final solution, implemented by a robust learning machine, expresses a balance between Shannon differential entropy and Fisher information. This is also surprising in being an analytical relation, given the purely numerical operations of the learning machine.Entropy2016-08-12188Article10.3390/e180802952951099-43002016-08-12doi: 10.3390/e18080295Pablo ZegersB. FriedenCarlos AlarcónAlexis Fuentes<![CDATA[Entropy, Vol. 18, Pages 296: Temporal Predictability of Online Behavior in Foursquare]]>
http://www.mdpi.com/1099-4300/18/8/296
With the widespread use of Internet technologies, online behaviors play a more and more important role in humans’ daily lives. Knowing the times when humans perform their next online activities can be quite valuable for developing better online services, which prompts us to wonder whether the times of users’ next online activities are predictable. In this paper, we investigate the temporal predictability in human online activities through exploiting the dataset from the social network Foursquare. Through discretizing the inter-event times of users’ Foursquare activities into symbols, we map each user’s inter-event time sequence to a sequence of inter-event time symbols. By applying the information-theoretic method to the sequences of inter-event time symbols, we show that for a user’s Foursquare activities, knowing the time interval between the current activity and the previous activity decreases the entropy of the time interval between the next activity and current activity, i.e., the time of the user’s next Foursquare activity is predictable. Much of the predictability is explained by the equal-interval repeat; that is, users perform consecutive Foursquare activities with approximately equal time intervals. On the other hand, the unequal-interval preference, i.e., the preference of performing Foursquare activities with a fixed time interval after another given time interval, is also an origin for predictability. Furthermore, our results reveal that the Foursquare activities on weekdays have a higher temporal predictability than those on weekends and that users’ Foursquare activity is more temporally predictable if his/her previous activity is performed in a location that he/she visits more frequently.Entropy2016-08-12188Article10.3390/e180802962961099-43002016-08-12doi: 10.3390/e18080296Wang ChenQiang GaoHuagang Xiong<![CDATA[Entropy, Vol. 18, Pages 293: Characterization of Seepage Velocity beneath a Complex Rock Mass Dam Based on Entropy Theory]]>
http://www.mdpi.com/1099-4300/18/8/293
Owing to the randomness in the fracture flow system, the seepage system beneath a complex rock mass dam is inherently complex and highly uncertain; investigating dam leakage by estimating the spatial distribution of the seepage field with conventional methods is therefore quite difficult. In this paper, entropy theory, as a relation between definiteness and probability, is used to probabilistically analyze the characteristics of the seepage system in a complex rock mass dam. Based on the principle of maximum entropy, an equation for the vertical distribution of the seepage velocity in a dam borehole is derived. The derived distribution is tested against actual field data, and the results show good agreement. From the entropy of the flow velocity in boreholes, the rupture degree of the dam bedrock has been successfully estimated. Moreover, a new sampling scheme is presented in which the sampling frequency is negatively correlated with the distance to the site of minimum velocity, which is preferable to the traditional scheme. This paper demonstrates the significant advantage of applying entropy theory to seepage velocity analysis in a complex rock mass dam.Entropy2016-08-11188Article10.3390/e180802932931099-43002016-08-11doi: 10.3390/e18080293Xixi ChenJiansheng ChenTao WangHuaidong ZhouLinghua Liu<![CDATA[Entropy, Vol. 18, Pages 294: Understanding Gating Operations in Recurrent Neural Networks through Opinion Expression Extraction]]>
http://www.mdpi.com/1099-4300/18/8/294
Extracting opinion expressions from text is an essential task of sentiment analysis, which is usually treated as one of the word-level sequence labeling problems. In such problems, compositional models with multiplicative gating operations provide efficient ways to encode the contexts, as well as to choose critical information. Thus, in this paper, we adopt Long Short-Term Memory (LSTM) recurrent neural networks to address the task of opinion expression extraction and explore the internal mechanisms of the model. The proposed approach is evaluated on the Multi-Perspective Question Answering (MPQA) opinion corpus. The experimental results demonstrate improvement over previous approaches, including the state-of-the-art method based on simple recurrent neural networks. We also provide a novel micro perspective to analyze the run-time processes and gain new insights into the advantages of LSTM selecting the source of information with its flexible connections and multiplicative gating operations.Entropy2016-08-11188Article10.3390/e180802942941099-43002016-08-11doi: 10.3390/e18080294Xin WangYuanchao LiuMing LiuChengjie SunXiaolong Wang<![CDATA[Entropy, Vol. 18, Pages 292: On Multi-Scale Entropy Analysis of Order-Tracking Measurement for Bearing Fault Diagnosis under Variable Speed]]>
http://www.mdpi.com/1099-4300/18/8/292
The research objective in this paper is to investigate the feasibility and effectiveness of utilizing envelope extraction combining the multi-scale entropy (MSE) analysis for identifying different roller bearing faults. The features were extracted from the angle-domain vibration signals that were measured through the hardware-implemented order-tracking technique, so that the characteristics of bearing defects are not affected by the rotating speed. The envelope analysis was employed to the vibration measurements as well as the selected intrinsic mode function (IMF) that was separated by the empirical mode decomposition (EMD) method. By using the coarse-grain process, the entropy of the envelope signals in the different scales was calculated to form the MSE distributions that represent the complexity of the signals. The decision tree was used to distinguish the entropy-related features which reveal the different classes of bearing faults.Entropy2016-08-10188Article10.3390/e180802922921099-43002016-08-10doi: 10.3390/e18080292Tian-Yau WuChang-Ling YuDa-Chun Liu<![CDATA[Entropy, Vol. 18, Pages 291: Indicators of Evidence for Bioequivalence]]>
http://www.mdpi.com/1099-4300/18/8/291
Some equivalence tests are based on two one-sided tests, where in many applications the test statistics are approximately normal. We define and find evidence for equivalence in Z-tests and then one- and two-sample binomial tests as well as for t-tests. Multivariate equivalence tests are typically based on statistics with non-central chi-squared or non-central F distributions in which the non-centrality parameter λ is a measure of heterogeneity of several groups. Classical tests of the null λ ≥ λ₀ versus the equivalence alternative λ &lt; λ₀ are available, but simple formulae for power functions are not. In these tests, the equivalence limit λ₀ is typically chosen by context. We provide extensions of classical variance stabilizing transformations for the non-central chi-squared and F distributions that are easy to implement and which lead to indicators of evidence for equivalence. Approximate power functions are also obtained via simple expressions for the expected evidence in these equivalence tests.Entropy2016-08-09188Article10.3390/e180802912911099-43002016-08-09doi: 10.3390/e18080291Stephan MorgenthalerRobert Staudte<![CDATA[Entropy, Vol. 18, Pages 290: Control of Self-Organized Criticality through Adaptive Behavior of Nano-Structured Thin Film Coatings]]>
http://www.mdpi.com/1099-4300/18/8/290
In this paper, we develop a strategy for controlling the self-organized critical process, using the example of extreme tribological conditions caused by intensive build-up edge (BUE) formation that takes place during machining of hard-to-cut austenitic superduplex stainless steel SDSS UNS32750. From a tribological viewpoint, machining of this material involves intensive seizure and build-up edge formation at the tool/chip interface, which can result in catastrophic tool failure. Build-up edge is considered to be a very damaging process in the system. The periodical breakage of the build-ups may eventually result in tool tip breakage and thereby lead to a catastrophe (complete loss of workability) in the system. The dynamic process of build-up edge formation is similar to an avalanche. It is governed by the stick-slip phenomenon during friction and is associated with the self-organized critical process. Investigation of wear patterns on the frictional surfaces of cutting tools using the Scanning Electron Microscope (SEM), combined with chip undersurface characterization and frictional (cutting) force analyses, confirms this hypothesis. The control of self-organized criticality is accomplished through application of a nano-multilayer TiAl60CrSiYN/TiAlCrN thin film Physical Vapor Deposition (PVD) coating with elevated aluminum content on a cemented carbide tool. The suggested coating enhanced the formation of protective nano-scale tribo-films on the friction surface under operation. Moreover, machining process optimization contributed to further enhancement of this beneficial process, as evidenced by X-ray Photoelectron Spectroscopy (XPS) studies of the tribo-films. This resulted in a reduction of the scale of the build-ups, leading to an overall improvement in wear performance. A new thermodynamic analysis is proposed concerning entropy production during friction in machining with build-up edge formation. This model is able to predict various phenomena and shows good agreement with experimental results. In the presented research, we demonstrated a novel experimental approach for controlling self-organized criticality using the example of machining with build-up edge formation, which is similar to avalanches. This was done through enhanced adaptive performance of the surface-engineered tribo-system, with the aim of reducing the scale and frequency of the avalanches.Entropy2016-08-09188Article10.3390/e180802902901099-43002016-08-09doi: 10.3390/e18080290German Fox-RabinovichJose PaivaIosif GershmanMaryam ArameshDanielle CavelliKenji YamamotoGoulnara DosbaevaStephen Veldhuis<![CDATA[Entropy, Vol. 18, Pages 289: Correction to Yao, H.; Qiao, J.-W.; Gao, M.C.; Hawk, J.A.; Ma, S.-G.; Zhou, H. MoNbTaV Medium-Entropy Alloy. Entropy 2016, 18, 189]]>
http://www.mdpi.com/1099-4300/18/8/289
The authors wish to make the following correction to their paper [1].[...]Entropy2016-08-09188Correction10.3390/e180802892891099-43002016-08-09doi: 10.3390/e18080289Hongwei YaoJun-Wei QiaoMichael GaoJeffrey HawkSheng-Guo MaHefeng Zhou<![CDATA[Entropy, Vol. 18, Pages 287: Hawking-Like Radiation from the Trapping Horizon of Both Homogeneous and Inhomogeneous Spherically Symmetric Spacetime Model of the Universe]]>
http://www.mdpi.com/1099-4300/18/8/287
The present work deals with the semi-classical tunnelling approach and the Hamilton–Jacobi method to study Hawking radiation from the dynamical horizon of both the homogeneous Friedmann–Robertson–Walker (FRW) model and the inhomogeneous Lemaitre–Tolman–Bondi (LTB) model of the Universe. In the tunnelling prescription, radial null geodesics are used to visualize particles from behind the trapping horizon and the Hawking-like temperature has been calculated. On the other hand, in the Hamilton–Jacobi formulation, quantum corrections have been incorporated by solving the Klein–Gordon wave equation. In both the approaches, the temperature agrees at the semiclassical level.Entropy2016-08-08188Article10.3390/e180802872871099-43002016-08-08doi: 10.3390/e18080287Subenoy ChakrabortySubhajit SahaChristian Corda<![CDATA[Entropy, Vol. 18, Pages 288: Microstructures of Al7.5Cr22.5Fe35Mn20Ni15 High-Entropy Alloy and Its Polarization Behaviors in Sulfuric Acid, Nitric Acid and Hydrochloric Acid Solutions]]>
http://www.mdpi.com/1099-4300/18/8/288
This paper investigates the microstructures and the polarization behaviors of Al7.5Cr22.5Fe35Mn20Ni15 high-entropy alloy in 1M (1 mol/L) deaerated sulfuric acid (H2SO4), nitric acid (HNO3), and hydrochloric acid (HCl) solutions at temperatures of 30–60 °C. The three phases of the Al7.5Cr22.5Fe35Mn20Ni15 high-entropy alloy are body-centered cubic (BCC) dendrites, face-centered cubic (FCC) interdendrites, and ordered BCC precipitates uniformly dispersed in the BCC dendrites. The different phases were corroded in different acidic solutions. The passivation regions of the Al7.5Cr22.5Fe35Mn20Ni15 alloy are divided into three and two sub-regions in the solutions of H2SO4 and HNO3 at 30–60 °C, respectively. The passivation region of the Al7.5Cr22.5Fe35Mn20Ni15 alloy is also divided into two sub-domains in 1M deaerated HCl solution at 30 °C. The Al7.5Cr22.5Fe35Mn20Ni15 alloy has corrosion resistance almost equal to that of 304 stainless steel (304SS) in both the 1M H2SO4 and 1M HCl solutions. The polarization behaviors indicated that the Al7.5Cr22.5Fe35Mn20Ni15 alloy possessed much better corrosion resistance than 304SS in 1M HNO3 solution. However, in 1M NaCl solution, the corrosion resistance of the Al7.5Cr22.5Fe35Mn20Ni15 alloy was lower than that of 304SS.Entropy2016-08-08188Article10.3390/e180802882881099-43002016-08-08doi: 10.3390/e18080288Chun-Huei TsauPo-Yen Lee<![CDATA[Entropy, Vol. 18, Pages 284: A Five Species Cyclically Dominant Evolutionary Game with Fixed Direction: A New Way to Produce Self-Organized Spatial Patterns]]>
http://www.mdpi.com/1099-4300/18/8/284
Cyclically dominant systems are hot issues in academia, and they play an important role in explaining biodiversity in Nature. In this paper, we construct a five-strategy cyclically dominant system. Each individual in our system changes its strategy along a fixed direction. The dominant strategy can promote a change in the dominated strategy, and the dominated strategy can block a change in the dominant strategy. We use mean-field theory and cellular automaton simulation to discuss the evolving characters of the system. In the cellular automaton simulation, we find the emergence of spiral waves on spatial patterns without a migration rate, which suggests a new way to produce self-organized spatial patterns.Entropy2016-08-08188Article10.3390/e180802842841099-43002016-08-08doi: 10.3390/e18080284Yibin KangQiuhui PanXueting WangMingfeng He<![CDATA[Entropy, Vol. 18, Pages 286: Parametric Analysis of the Exergoeconomic Operation Costs, Environmental and Human Toxicity Indexes of the MF501F3 Gas Turbine]]>
http://www.mdpi.com/1099-4300/18/8/286
This work presents an energetic, exergoeconomic, environmental, and toxicity analysis of the simple gas turbine M501F3 based on a parametric analysis of energetic (thermal efficiency, fuel and air flow rates, and specific work output), exergoeconomic (exergetic efficiency and exergoeconomic operation costs), environmental (global warming, smog formation, acid rain indexes), and human toxicity indexes, by taking the compressor pressure ratio and the turbine inlet temperature as the operating parameters. The aim of this paper is to provide an integral, systematic, and powerful diagnostic tool to establish possible operation and maintenance actions to improve the gas turbine’s exergoeconomic, environmental, and human toxicity indexes. Despite the continuous changes in the price of natural gas, the compressor, combustion chamber, and turbine always contribute 18.96%, 53.02%, and 28%, respectively, to the gas turbine’s exergoeconomic operation costs. The application of this methodology can be extended to other simple gas turbines using the pressure drops and isentropic efficiencies, among others, as the degradation parameters, as well as to other energetic systems, without loss of generality.Entropy2016-08-06188Article10.3390/e180802862861099-43002016-08-06doi: 10.3390/e18080286Edgar Torres-GonzálezRaul Lugo-LeyteHelen Lugo-MéndezMartin Salazar-PereyraAlejandro Torres-Aldaco<![CDATA[Entropy, Vol. 18, Pages 283: A Critical Reassessment of the Hess–Murray Law]]>
http://www.mdpi.com/1099-4300/18/8/283
The Hess–Murray law is a correlation between the radii of successive branchings in bi/trifurcated vessels in biological tissues. First proposed by the Swiss physiologist and Nobel laureate Walter Rudolf Hess in his 1914 doctoral thesis and published in 1917, the law was “rediscovered” by the American physiologist Cecil Dunmore Murray in 1926. The law is based on the assumption that blood or lymph circulation in living organisms is governed by a “work minimization” principle that—under a certain set of specified conditions—leads to an “optimal branching ratio” of r_{i+1}/r_i = 1/2^{1/3} ≈ 0.7937. This “cubic root of 2” correlation underwent extensive theoretical and experimental reassessment in the second half of the 20th century, and the results indicate that—under a well-defined series of conditions—the law is sufficiently accurate for the smallest vessels (r of the order of fractions of a millimeter) but fails for the larger ones; moreover, it cannot be successfully extended to turbulent flows. Recent comparisons with numerical investigations of branched flows led to similar conclusions. More recently, the Hess–Murray law came back into the limelight when it was taken as a founding paradigm of the Constructal Law, a theory that employs physical intuition and mathematical reasoning to derive “optimal paths” for the transport of matter and energy between a source and a sink, regardless of the mode of transportation (continuous, like in convection and conduction, or discrete, like in the transportation of goods and people). This paper examines the foundation of the law and argues that both for natural flows and for engineering designs, a minimization of the irreversibility under physically sound boundary conditions leads to somewhat different results. 
It is also shown that, in the light of an exergy-based resource analysis, an amended version of the Hess–Murray law may still hold an important position in engineering and biological sciences.Entropy2016-08-05188Article10.3390/e180802832831099-43002016-08-05doi: 10.3390/e18080283Enrico Sciubba<![CDATA[Entropy, Vol. 18, Pages 285: ECG Classification Using Wavelet Packet Entropy and Random Forests]]>
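The cube-root-of-2 branching ratio at the heart of the law is easy to check numerically. The sketch below (plain Python, purely illustrative; function and variable names are my own) generates daughter radii across successive bifurcations and verifies Murray's cube-law balance, r_parent³ = 2·r_daughter³, at each split.

```python
# Hess–Murray optimal branching: each daughter radius is 2^(-1/3) of the parent.
ratio = 2 ** (-1 / 3)  # ≈ 0.7937, the "cubic root of 2" correlation

def daughter_radii(r_parent, generations):
    """Radii of successive generations under the Hess–Murray law."""
    return [r_parent * ratio ** g for g in range(generations + 1)]

radii = daughter_radii(1.0, 3)  # parent radius 1.0, three bifurcations
# Murray's law in cube form: r_parent^3 == 2 * r_daughter^3 at every split
assert all(abs(radii[g] ** 3 - 2 * radii[g + 1] ** 3) < 1e-9 for g in range(3))
```

Note that this only reproduces the idealized ratio; as the abstract argues, the correlation holds for the smallest vessels and breaks down for larger ones and for turbulent flow.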
http://www.mdpi.com/1099-4300/18/8/285
The electrocardiogram (ECG) is one of the most important techniques for heart disease diagnosis. Many traditional methodologies of feature extraction and classification have been widely applied to ECG analysis. However, the effectiveness and efficiency of such methodologies remain to be improved, and much existing research did not consider the separation of training and testing samples from the same set of patients (the so-called inter-patient scheme). To cope with these issues, in this paper, we propose a method to classify ECG signals using wavelet packet entropy (WPE) and random forests (RF) following the Association for the Advancement of Medical Instrumentation (AAMI) recommendations and the inter-patient scheme. Specifically, we first decompose the ECG signals by wavelet packet decomposition (WPD), then calculate entropy from the decomposed coefficients as representative features, and finally use RF to build an ECG classification model. To the best of our knowledge, this is the first time that WPE and RF have been used to classify ECG signals following the AAMI recommendations and the inter-patient scheme. Extensive experiments are conducted on the publicly available MIT–BIH Arrhythmia database, and the influence on performance of the mother wavelet and level of decomposition for WPD, the type of entropy, and the number of base learners in RF is also discussed. The experimental results are superior to those of several state-of-the-art competing methods, showing that WPE and RF are promising for ECG classification.Entropy2016-08-05188Article10.3390/e180802852851099-43002016-08-05doi: 10.3390/e18080285Taiyong LiMin Zhou<![CDATA[Entropy, Vol. 18, Pages 269: Traceability Analyses between Features and Assets in Software Product Lines]]>
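As a rough illustration of the feature-extraction stage described above, the following NumPy-only sketch computes wavelet packet entropy with a Haar filter. The paper's actual mother wavelet, decomposition level, entropy variant, and classifier (a random forest, e.g. scikit-learn's `RandomForestClassifier`, would consume these features) are not reproduced here, so every detail below is an assumption for illustration.

```python
import numpy as np

def haar_step(x):
    # one-level Haar analysis: approximation and detail coefficients
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    return a, d

def wavelet_packet(x, level):
    # full wavelet packet tree: every node is split again, unlike the plain DWT
    nodes = [np.asarray(x, dtype=float)]
    for _ in range(level):
        nxt = []
        for n in nodes:
            a, d = haar_step(n)
            nxt.extend([a, d])
        nodes = nxt
    return nodes  # 2**level terminal nodes

def shannon_entropy(c, eps=1e-12):
    # Shannon entropy of the normalized coefficient energies of one node
    p = c ** 2 / (np.sum(c ** 2) + eps)
    return -np.sum(p * np.log2(p + eps))

def wpe_features(beat, level=3):
    """One entropy value per terminal node -> a 2**level feature vector."""
    return np.array([shannon_entropy(n) for n in wavelet_packet(beat, level)])
```

A beat segment of length 2^k yields, e.g., an 8-dimensional feature vector at level 3; these vectors would then be fed to the ensemble classifier.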
http://www.mdpi.com/1099-4300/18/8/269
In a Software Product Line (SPL), the central notion of implementability provides the requisite connection between specifications and their implementations, leading to the definition of products. While it appears to be a simple extension of the traceability relation between components and features, it involves several subtle issues that were overlooked in the existing literature. In this paper, we introduce a precise and formal definition of implementability over a fairly expressive traceability relation. The consequent definition of products in the given SPL naturally entails a set of useful analysis problems that are either refinements of known problems or are completely novel. We also propose a new approach to solve these analysis problems by encoding them as Quantified Boolean Formulae (QBF) and solving them through Quantified Satisfiability (QSAT) solvers. QBF can represent more complex analysis operations, which cannot be represented by using propositional formulae. The methodology scales much better than the SAT-based solutions hinted at in the literature and was demonstrated through a tool called SPLAnE (SPL Analysis Engine) on a large set of SPL models.Entropy2016-08-03188Article10.3390/e180802692691099-43002016-08-03doi: 10.3390/e18080269Ganesh NarwaneJosé GalindoShankara KrishnaDavid BenavidesJean-Vivien MilloS. Ramesh<![CDATA[Entropy, Vol. 18, Pages 276: A Novel Image Encryption Scheme Using the Composite Discrete Chaotic System]]>
http://www.mdpi.com/1099-4300/18/8/276
The composite discrete chaotic system (CDCS) is a complex chaotic system that combines two or more discrete chaotic systems. This system holds the chaotic characteristics of different chaotic systems in a random way and has more complex chaotic behaviors. In this paper, we aim to provide a novel image encryption algorithm based on a new two-dimensional (2D) CDCS. The proposed scheme consists of two parts: first, we propose a new 2D CDCS and analyze its chaotic behaviors; then, we introduce the bit-level permutation and pixel-level diffusion encryption architecture with the new CDCS to form the full proposed algorithm. Random values and the total information of the plain image are added into the diffusion procedure to enhance the security of the proposed algorithm. Both the theoretical analysis and simulations confirm the security of the proposed algorithm.Entropy2016-08-01188Article10.3390/e180802762761099-43002016-08-01doi: 10.3390/e18080276Hegui ZhuXiangde ZhangHai YuCheng ZhaoZhiliang Zhu<![CDATA[Entropy, Vol. 18, Pages 282: How Is a Data-Driven Approach Better than Random Choice in Label Space Division for Multi-Label Classification?]]>
http://www.mdpi.com/1099-4300/18/8/282
We propose using five data-driven community detection approaches from social networks to partition the label space in the task of multi-label classification as an alternative to random partitioning into equal subsets as performed by RAkELd. We evaluate modularity maximization using the fast greedy and leading eigenvector approximations, as well as the infomap, walktrap, and label propagation algorithms. For this purpose, we propose to construct a label co-occurrence graph (both weighted and unweighted versions) based on training data and perform community detection to partition the label set. Then, each partition constitutes a label space for separate multi-label classification sub-problems. As a result, we obtain an ensemble of multi-label classifiers that jointly covers the whole label space. Based on the binary relevance and label powerset classification methods, we compare community detection methods to label space divisions against random baselines on 12 benchmark datasets over five evaluation measures. We discover that data-driven approaches are more efficient and more likely to outperform RAkELd than binary relevance or label powerset is, in every evaluated measure. For all measures, apart from Hamming loss, data-driven approaches are significantly better than RAkELd (α = 0.05), and at least one data-driven approach is more likely to outperform RAkELd than a priori methods in the case of RAkELd’s best performance. This is the largest RAkELd evaluation published to date, with 250 samplings per value for 10 values of the RAkELd parameter k on 12 datasets.Entropy2016-07-30188Article10.3390/e180802822821099-43002016-07-30doi: 10.3390/e18080282Piotr SzymańskiTomasz KajdanowiczKristian Kersting<![CDATA[Entropy, Vol. 18, Pages 281: Acoustic Detection of Coronary Occlusions before and after Stent Placement Using an Electronic Stethoscope]]>
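The label co-occurrence graph construction described above can be sketched minimally as follows; the community detection step (infomap, walktrap, etc.) would then run on this graph and is omitted, and the helper name and toy data are purely illustrative.

```python
from collections import Counter
from itertools import combinations

def label_cooccurrence_graph(label_sets, weighted=True):
    """Edge (i, j) whenever labels i and j appear together in a training row;
    the weight counts how often the pair co-occurs."""
    edges = Counter()
    for labels in label_sets:
        for i, j in combinations(sorted(set(labels)), 2):
            edges[(i, j)] += 1
    if weighted:
        return dict(edges)
    return {e: 1 for e in edges}  # unweighted variant keeps only adjacency

# toy multi-label training data: one label set per instance
Y = [{"a", "b"}, {"b", "c"}, {"a", "b"}, {"c"}]
g = label_cooccurrence_graph(Y)
```

The communities found on `g` would each define one label subspace, giving one multi-label sub-classifier per community instead of RAkELd's random equal-size subsets.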
http://www.mdpi.com/1099-4300/18/8/281
More than 370,000 Americans die every year from coronary artery disease (CAD). Early detection and treatment are crucial to reducing this number. Current diagnostic and disease-monitoring methods are invasive, costly, and time-consuming. Using an electronic stethoscope and spectral and nonlinear dynamics analysis of the recorded heart sound, we investigated the acoustic signature of CAD in subjects with only a single coronary occlusion before and after stent placement, as well as subjects with clinically normal coronary arteries. The CAD signature was evaluated by estimating power ratios of the total power above 150 Hz over the total power below 150 Hz of the FFT of the acoustic signal. Additionally, approximate entropy values were estimated to assess the differences induced by the stent placement procedure to the acoustic signature of the signals in the time domain. The groups were identified with this method with 82% sensitivity and 64% specificity (using the power ratio method) and 82% sensitivity and 55% specificity (using the approximate entropy). Power ratios and approximate entropy values after stent placement are not statistically different from those estimated from subjects with no coronary occlusions. Our approach demonstrates that the effect of stent placement on coronary occlusions can be monitored using an electronic stethoscope.Entropy2016-07-29188Article10.3390/e180802812811099-43002016-07-29doi: 10.3390/e18080281Andrei DragomirAllison PostYasemin AkayHani JneidDavid PaniaguaAli DenktasBiykem BozkurtMetin Akay<![CDATA[Entropy, Vol. 18, Pages 280: Acoustic Entropy of the Materials in the Course of Degradation]]>
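The power-ratio feature used above can be sketched directly from its definition: total spectral power above 150 Hz divided by total power below 150 Hz. The 150 Hz cutoff comes from the abstract; the sampling rate and the synthetic test signals below are invented purely for illustration.

```python
import numpy as np

def power_ratio(signal, fs, cutoff=150.0):
    """Power above the cutoff over power below it, from the FFT power spectrum."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    high = spectrum[freqs > cutoff].sum()
    low = spectrum[(freqs > 0) & (freqs <= cutoff)].sum()  # skip the DC term
    return high / low

fs = 2000.0                                   # assumed sampling rate
t = np.arange(0, 1.0, 1.0 / fs)
clean = np.sin(2 * np.pi * 60 * t)            # energy below 150 Hz only
murmur = clean + 0.5 * np.sin(2 * np.pi * 400 * t)  # added high-frequency content
```

The ratio rises when high-frequency content (the putative CAD signature) is present, which is the basis of the group separation reported above.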
http://www.mdpi.com/1099-4300/18/8/280
We report experimental observations on the evolution of acoustic entropy in the course of cyclic loading as degradation occurs due to fatigue. The measured entropy results from the materials’ microstructural changes that occur as degradation progresses under cyclic mechanical loading. Experimental results demonstrate that the maximum acoustic entropy emanating from the materials over the course of degradation remains similar. Experiments are shown for two different types of materials: Aluminum 6061 (a metallic alloy) and glass/epoxy (a composite laminate). The evolution of the acoustic entropy demonstrates a persistent trend over the course of degradation.Entropy2016-07-28188Article10.3390/e180802802801099-43002016-07-28doi: 10.3390/e18080280Ali KahirdehM. Khonsari<![CDATA[Entropy, Vol. 18, Pages 279: Entropy Generation through Non-Equilibrium Ordered Structures in Corner Flows with Sidewall Mass Injection]]>
http://www.mdpi.com/1099-4300/18/8/279
Additional entropy generation rates through non-equilibrium ordered structures are predicted for corner flows with sidewall mass injection. Well-defined non-equilibrium ordered structures are predicted at a normalized vertical station of approximately eighteen percent of the boundary-layer thickness. These structures are in addition to the ordered structures previously reported at approximately thirty-eight percent of the boundary-layer thickness. The computational procedure is used to determine the entropy generation rate for each spectral velocity component at each of several streamwise stations and for each of several injection velocity values. Application of the procedure to possible thermal system processes is discussed. These results indicate that cooling sidewall mass injection into a horizontal laminar boundary layer may actually increase the heat transfer to the horizontal surface.Entropy2016-07-28188Article10.3390/e180702792791099-43002016-07-28doi: 10.3390/e18070279LaVar Isaacson<![CDATA[Entropy, Vol. 18, Pages 278: Expected Logarithm of Central Quadratic Form and Its Use in KL-Divergence of Some Distributions]]>
http://www.mdpi.com/1099-4300/18/8/278
In this paper, we develop three different methods for computing the expected logarithm of central quadratic forms: a series method, an integral method and a fast (but inexact) set of methods. The approach used for deriving the integral method is novel and can be used for computing the expected logarithm of other random variables. Furthermore, we derive expressions for the Kullback–Leibler (KL) divergence of elliptical gamma distributions and angular central Gaussian distributions, which turn out to be functions dependent on the expected logarithm of a central quadratic form. Through several experimental studies, we compare the performance of these methods.Entropy2016-07-28188Article10.3390/e180802782781099-43002016-07-28doi: 10.3390/e18080278Pourya Habib ZadehReshad Hosseini<![CDATA[Entropy, Vol. 18, Pages 277: A Proximal Point Algorithm for Minimum Divergence Estimators with Application to Mixture Models]]>
http://www.mdpi.com/1099-4300/18/8/277
Estimators derived from a divergence criterion such as φ-divergences are generally more robust than maximum likelihood ones. We are interested in particular in the so-called minimum dual φ-divergence estimator (MDφDE), an estimator built using a dual representation of φ-divergences. We present in this paper an iterative proximal point algorithm that permits the calculation of such an estimator. The algorithm contains by construction the well-known Expectation Maximization (EM) algorithm. Our work is based on the paper of Tseng on the likelihood function. We provide some convergence properties by adapting the ideas of Tseng. We improve Tseng’s results by relaxing the identifiability condition on the proximal term, a condition which is not verified for most mixture models and is hard to verify for “non-mixture” ones. Convergence of the EM algorithm in a two-component Gaussian mixture is discussed in the spirit of our approach. Several experimental results on mixture models are provided to confirm the validity of the approach.Entropy2016-07-27188Article10.3390/e180802772771099-43002016-07-27doi: 10.3390/e18080277Diaa Al MohamadMichel Broniatowski<![CDATA[Entropy, Vol. 18, Pages 275: Symmetric Fractional Diffusion and Entropy Production]]>
http://www.mdpi.com/1099-4300/18/7/275
The discovery of the entropy production paradox (Hoffmann et al., 1998) raised basic questions about the nature of irreversibility in the regime between diffusion and waves. First studied in the form of spatial movements of moments of H functions, pseudo propagation is the pre-limit propagation-like movements of skewed probability density function (PDFs) in the domain between the wave and diffusion equations that goes over to classical partial differential equation propagation of characteristics in the wave limit. Many of the strange properties that occur in this extraordinary regime were thought to be connected in some manner to this form of proto-movement. This paper eliminates pseudo propagation by employing a similar evolution equation that imposes spatial unimodal symmetry on evolving PDFs. Contrary to initial expectations, familiar peculiarities emerge despite the imposed symmetry, but they have a distinct character.Entropy2016-07-23187Article10.3390/e180702752751099-43002016-07-23doi: 10.3390/e18070275Janett PrehlFrank BoldtKarl HoffmannChristopher Essex<![CDATA[Entropy, Vol. 18, Pages 273: Efficiency Bound of Local Z-Estimators on Discrete Sample Spaces]]>
http://www.mdpi.com/1099-4300/18/7/273
Many statistical models over a discrete sample space often face the computational difficulty of the normalization constant. Because of this, the maximum likelihood estimator does not work. In order to circumvent the computational difficulty, alternative estimators such as pseudo-likelihood and composite likelihood that require only a local computation over the sample space have been proposed. In this paper, we present a theoretical analysis of such localized estimators. The asymptotic variance of localized estimators depends on the neighborhood system on the sample space. We investigate the relation between the neighborhood system and the estimation accuracy of localized estimators. Moreover, we derive the efficiency bound. The theoretical results are applied to investigate the statistical properties of existing estimators and some extended ones.Entropy2016-07-23187Article10.3390/e180702732731099-43002016-07-23doi: 10.3390/e18070273Takafumi Kanamori<![CDATA[Entropy, Vol. 18, Pages 271: Thermal Characteristic Analysis and Experimental Study of a Spindle-Bearing System]]>
http://www.mdpi.com/1099-4300/18/7/271
In this paper, a thermo-mechanical coupling analysis model of the spindle-bearing system based on Hertz’s contact theory and a point contact non-Newtonian thermal elastohydrodynamic lubrication (EHL) theory are developed. In this model, the effect of preload, centrifugal force, the gyroscopic moment, and the lubrication state of the spindle-bearing system are considered. According to heat transfer theory, the mathematical model for the temperature field of the spindle system is developed and the effect of the spindle cooling system on the spindle temperature distribution is analyzed. The theoretical simulations and the experimental results indicate that the bearing preload has a great effect on the frictional heat generation, and the cooling fluid has a great effect on the heat balance of the spindle system. If a steady-state heat balance between the frictional heat generation and the cooling system cannot be reached, thermally-induced preload will lead to a further increase of the frictional heat generation and then cause the thermal failure of the spindle.Entropy2016-07-22187Article10.3390/e180702712711099-43002016-07-22doi: 10.3390/e18070271Li WuQingchang Tan<![CDATA[Entropy, Vol. 18, Pages 274: An Entropy-Based Kernel Learning Scheme toward Efficient Data Prediction in Cloud-Assisted Network Environments]]>
http://www.mdpi.com/1099-4300/18/7/274
With the recent emergence of wireless sensor networks (WSNs) in the cloud computing environment, it is now possible to monitor and gather physical information via many sensor nodes to meet the requirements of cloud services. Generally, those sensor nodes collect data and send them to the sink node, where end-users can query all the information and realize cloud applications. Currently, one of the main disadvantages of sensor nodes is that they have limited physical resources, with little memory for storage and a limited power supply. Therefore, in order to avoid such limitations, it is necessary to develop an efficient data prediction method in WSNs. To serve this purpose, by reducing the redundant data transmission between sensor nodes and the sink node while maintaining the required acceptable errors, this article proposes an entropy-based learning scheme for data prediction through the use of the kernel least mean square (KLMS) algorithm. The proposed scheme, called E-KLMS, develops a mechanism to keep the predicted data synchronous at both sides. Specifically, the kernel-based method is able to adjust the coefficients adaptively in accordance with every input, which will achieve a better performance with smaller prediction errors, while information entropy is employed to remove the data which may cause relatively large errors. E-KLMS can effectively solve the tradeoff problem between prediction accuracy and computational effort while greatly simplifying the training structure compared with some other data prediction approaches. Moreover, the kernel-based method and the entropy technique ensure the prediction effect by both improving the accuracy and reducing errors. 
Experiments with some real data sets have been carried out to validate the efficiency and effectiveness of the E-KLMS learning scheme, and the experimental results show the advantages of our method in prediction accuracy and computational time.Entropy2016-07-22187Article10.3390/e180702742741099-43002016-07-22doi: 10.3390/e18070274Xiong LuoJi LiuDandan ZhangWeiping WangYueqin Zhu<![CDATA[Entropy, Vol. 18, Pages 270: Toward Improved Understanding of the Physical Meaning of Entropy in Classical Thermodynamics]]>
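The KLMS core of the scheme above can be sketched in a few lines: the predictor is a growing kernel expansion f(x) = Σ_j a_j k(x, c_j), and each new sample adds a center weighted by the prediction error. This is a bare-bones sketch with a Gaussian kernel; the entropy-based input screening that distinguishes E-KLMS, and all parameter values, are assumptions not taken from the paper.

```python
import numpy as np

def gauss_kernel(x, y, sigma=1.0):
    return np.exp(-np.sum((x - y) ** 2) / (2 * sigma ** 2))

class KLMS:
    """Kernel least-mean-square: f(x) = sum_j a_j * k(x, c_j)."""

    def __init__(self, eta=0.5, sigma=1.0):
        self.eta, self.sigma = eta, sigma
        self.centers, self.coeffs = [], []

    def predict(self, x):
        return sum(a * gauss_kernel(x, c, self.sigma)
                   for a, c in zip(self.coeffs, self.centers))

    def update(self, x, d):
        e = d - self.predict(x)          # prediction error on the new sample
        self.centers.append(np.asarray(x, dtype=float))
        self.coeffs.append(self.eta * e) # new center weighted by the error
        return e
```

In the E-KLMS setting, a sample whose error entropy contribution is too large would be filtered out before the `update` step, and only samples whose prediction error exceeds the acceptable bound need to be transmitted to the sink.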
http://www.mdpi.com/1099-4300/18/7/270
The year 2015 marked the 150th anniversary of “entropy” as a concept in classical thermodynamics. Despite its central role in the mathematical formulation of the Second Law and most of classical thermodynamics, its physical meaning continues to be elusive and confusing. This is especially true when we seek a reconstruction of the classical thermodynamics of a system from the statistical behavior of its constituent microscopic particles or vice versa. This paper sketches the classical definition by Clausius and offers a modified mathematical definition that is intended to improve its conceptual meaning. In the modified version, the differential of specific entropy appears as a non-dimensional energy term that captures the invigoration or reduction of microscopic motion upon addition or withdrawal of heat from the system. It is also argued that heat transfer is a better model process to illustrate entropy; the canonical heat engines and refrigerators often used to illustrate this concept are not very relevant to new areas of thermodynamics (e.g., thermodynamics of biological systems). It is emphasized that entropy changes, as invoked in the Second Law, are necessarily related to the non-equilibrium interactions of two or more systems that might have initially been in thermal equilibrium but at different temperatures. The overall direction of entropy increase indicates the direction of naturally occurring heat transfer processes in an isolated system that consists of internally interacting (non-isolated) sub systems. We discuss the implication of the proposed modification on statements of the Second Law, interpretation of entropy in statistical thermodynamics, and the Third Law.Entropy2016-07-22187Article10.3390/e180702702701099-43002016-07-22doi: 10.3390/e18070270Ben Akih-Kumgeh<![CDATA[Entropy, Vol. 18, Pages 268: Mechanothermodynamic Entropy and Analysis of Damage State of Complex Systems]]>
http://www.mdpi.com/1099-4300/18/7/268
Mechanics, on the one hand, and thermodynamics, on the other, consider the evolution of complex systems, including the Universe. The classical thermodynamic theory of evolution has one important drawback: it predicts an inevitable heat death of the Universe, which is unlikely to take place according to modern perceptions. Attempts to create a generalized theory of evolution in mechanics were unsuccessful, since mechanical equations do not discriminate between future and past. It is natural that the union of mechanics and thermodynamics has been difficult to realize, since they are based on different methodologies. We make an attempt to propose a generalized theory of evolution based on the concept of tribo-fatigue entropy. The essence of the proposed approach is that tribo-fatigue entropy is determined by the processes of damageability conditioned by thermodynamic and mechanical effects that cause changes in the states of any system. The law of entropy increase is formulated analytically in general form. A mechanothermodynamical function is constructed for the specific case of fatigue damage of materials under temperatures varying from 3 K to 0.8 of the melting temperature, based on the analysis of 136 experimental results.Entropy2016-07-20187Article10.3390/e180702682681099-43002016-07-20doi: 10.3390/e18070268Leonid SosnovskiySergei Sherbakov<![CDATA[Entropy, Vol. 18, Pages 266: Complex Dynamics of a Continuous Bertrand Duopoly Game Model with Two-Stage Delay]]>
http://www.mdpi.com/1099-4300/18/7/266
This paper studies a continuous Bertrand duopoly game model with two-stage delay. Our aim is to investigate the influence of delay and weight on the complex dynamic characteristics of the system. We calculate the bifurcation point of the system with respect to the delay parameter. In addition, the dynamic properties of the system are simulated by power spectrum, attractor, bifurcation diagram, the largest Lyapunov exponent, 3D surface chart, 4D cubic chart, 2D parameter bifurcation diagram, and 3D parameter bifurcation diagram. The results show that the stability of the system depends on the delay and weight; in order to maintain price stability and ensure firm profit, the firms must keep the parameters in a reasonable region. Otherwise, the system will lose stability, and even fall into chaos, which will cause fluctuations in prices and leave the firms unprofitable. Finally, chaos control of the system is carried out by a control strategy of state-variable feedback and parameter variation, which effectively avoids the damage of chaos to the economic system. Therefore, the results of this study have important practical significance for making decisions with multi-stage delay for oligopoly firms.Entropy2016-07-20187Article10.3390/e180702662661099-43002016-07-20doi: 10.3390/e18070266Junhai MaFengshan Si<![CDATA[Entropy, Vol. 18, Pages 267: Novel Criteria for Deterministic Remote State Preparation via the Entangled Six-Qubit State]]>
http://www.mdpi.com/1099-4300/18/7/267
In this paper, our concern is to design some criteria for deterministic remote state preparation for preparing an arbitrary three-particle state via a genuinely entangled six-qubit state. First, we put forward two schemes in the real and complex Hilbert spaces, respectively. Using an appropriate set of eight-qubit measurement bases, the remote three-qubit preparation is completed with unit success probability. Departing from previous research, our protocol has a salient feature in that the serviceable measurement basis only contains the initial coefficients and their conjugate values. By utilizing the permutation group, it is convenient to provide the permutation relationship between coefficients. Second, our ideas and methods can also be generalized to the situation of preparing an arbitrary N-particle state in the complex case by taking advantage of Bell states as quantum resources. More importantly, the conditions that the criteria must satisfy for preparation with 100% success probability in complex Hilbert space are summarized. Third, the classical communication costs of our scheme are calculated to determine the classical resources required. It is also worth mentioning that our protocol has higher efficiency and lower resource costs compared with previous protocols.Entropy2016-07-20187Article10.3390/e180702672671099-43002016-07-20doi: 10.3390/e18070267Gang XuXiu-Bo ChenZhao DouJing LiXin LiuZongpeng Li<![CDATA[Entropy, Vol. 18, Pages 264: The Structure of the Class of Maximum Tsallis–Havrda–Chavát Entropy Copulas]]>
http://www.mdpi.com/1099-4300/18/7/264
A maximum entropy copula is the copula associated with the joint distribution, with prescribed marginal distributions on [0, 1], which maximizes the Tsallis–Havrda–Chavát entropy with q = 2. We find necessary and sufficient conditions for each maximum entropy copula to be a copula in the class introduced in Rodríguez-Lallena and Úbeda-Flores (2004), and we also show that each copula in that class is a maximum entropy copula.Entropy2016-07-19187Article10.3390/e180702642641099-43002016-07-19doi: 10.3390/e18070264Jesús GarcíaVerónica González-LópezRoger Nelsen<![CDATA[Entropy, Vol. 18, Pages 265: Noise Suppression in 94 GHz Radar-Detected Speech Based on Perceptual Wavelet Packet]]>
http://www.mdpi.com/1099-4300/18/7/265
A millimeter wave (MMW) radar sensor is employed in our laboratory to detect human speech because it provides a new non-contact speech acquisition method that is suitable for various applications. However, the speech detected by the radar sensor is often degraded by combined noise. This paper proposes a new perceptual wavelet packet method that is able to enhance the speech acquired using a 94 GHz MMW radar system by suppressing the noise. The process is as follows. First, the radar speech signal is decomposed using a perceptual wavelet packet. Then, an adaptive wavelet threshold and new modified thresholding function are employed to remove the noise from the detected speech. The results obtained from the speech spectrograms, listening tests and objective evaluation show that the new method significantly improves the performance of the detected speech.Entropy2016-07-19187Article10.3390/e180702652651099-43002016-07-19doi: 10.3390/e18070265Fuming ChenChuantao LiQiang AnFulai LiangFugui QiSheng LiJianqi Wang<![CDATA[Entropy, Vol. 18, Pages 263: Positive Sofic Entropy Implies Finite Stabilizer]]>
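The thresholding stage of such wavelet denoising pipelines can be illustrated with the classical soft-shrinkage rule and Donoho's universal threshold. The paper's adaptive threshold and modified thresholding function differ in detail and are not specified in the abstract, so this is only a generic sketch.

```python
import numpy as np

def soft_threshold(coeffs, thr):
    """Classical soft shrinkage: shrink every coefficient toward zero by thr."""
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - thr, 0.0)

def universal_threshold(coeffs):
    # Donoho's universal threshold; noise level estimated via the
    # median absolute deviation of the (detail) coefficients
    sigma = np.median(np.abs(coeffs)) / 0.6745
    return sigma * np.sqrt(2.0 * np.log(len(coeffs)))
```

Applied node by node to the perceptual wavelet packet coefficients, small (noise-dominated) coefficients are zeroed while large (speech-dominated) ones are retained, after which the signal is reconstructed.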
http://www.mdpi.com/1099-4300/18/7/263
We prove that, for a measure preserving action of a sofic group with positive sofic entropy, the stabilizer is finite on a set of positive measure. This extends the results of Weiss and Seward for amenable groups and free groups, respectively. It follows that the action of a sofic group on its subgroups by inner automorphisms has zero topological sofic entropy, and that a faithful action that has completely positive sofic entropy must be free.Entropy2016-07-18187Article10.3390/e180702632631099-43002016-07-18doi: 10.3390/e18070263Tom Meyerovitch<![CDATA[Entropy, Vol. 18, Pages 262: Greedy Algorithms for Optimal Distribution Approximation]]>
http://www.mdpi.com/1099-4300/18/7/262
The approximation of a discrete probability distribution t by an M-type distribution p is considered. The approximation error is measured by the informational divergence D(t‖p), which is an appropriate measure, e.g., in the context of data compression. Properties of the optimal approximation are derived and bounds on the approximation error are presented, which are asymptotically tight. A greedy algorithm is proposed that solves this M-type approximation problem optimally. Finally, it is shown that different instantiations of this algorithm minimize the informational divergence D(p‖t) or the variational distance ‖p − t‖_1.Entropy2016-07-18187Article10.3390/e180702622621099-43002016-07-18doi: 10.3390/e18070262Bernhard GeigerGeorg Böcherer<![CDATA[Entropy, Vol. 18, Pages 252: Three Strategies for the Design of Advanced High-Entropy Alloys]]>
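Under a natural reading of the abstract, an M-type distribution has probabilities p_i = c_i/M with integer c_i, and minimizing D(t‖p) amounts to maximizing the concave separable objective Σ t_i log c_i subject to Σ c_i = M, for which a greedy unit-by-unit allocation is optimal. The sketch below assumes all t_i > 0 and M ≥ len(t); it is an illustrative reconstruction, not the paper's exact algorithm.

```python
import math

def greedy_mtype(t, M):
    """Greedy M-type approximation of distribution t (all t_i > 0, M >= len(t)).

    Minimizes D(t || p) over p_i = c_i / M with integers c_i summing to M, by
    repeatedly granting one unit to the index with the largest marginal gain
    t_i * log((c_i + 1) / c_i) of the concave objective sum_i t_i * log(c_i).
    """
    n = len(t)
    c = [1] * n  # every support point of t needs c_i >= 1, else D(t||p) is infinite
    for _ in range(M - n):
        i = max(range(n), key=lambda j: t[j] * math.log((c[j] + 1) / c[j]))
        c[i] += 1
    return [ci / M for ci in c]

p = greedy_mtype([0.7, 0.2, 0.1], M=10)
```

With M = 10 the target [0.7, 0.2, 0.1] is itself 10-type, so the greedy allocation recovers it exactly; for targets that are not M-type, the output is the divergence-minimizing rounding.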
http://www.mdpi.com/1099-4300/18/7/252
High-entropy alloys (HEAs) have recently become a vibrant field of study in the metallic materials area. In the early years, the design of HEAs was largely exploratory. The selection of compositions was somewhat arbitrary, and there was typically no specific goal to be achieved in the design. Very recently, however, the development of HEAs has gradually entered a different stage. Unlike the early alloys, HEAs developed nowadays are usually designed to meet clear goals, and have carefully chosen components, deliberately introduced multiple phases, and tailored microstructures. These alloys are referred to as advanced HEAs. In this paper, the progress in advanced HEAs is briefly reviewed. The design strategies for these materials are examined and are classified into three categories. Representative works in each category are presented. Finally, important issues and future directions in the development of advanced HEAs are pointed out and discussed.Entropy2016-07-15187Review10.3390/e180702522521099-43002016-07-15doi: 10.3390/e18070252Ming-Hung Tsai<![CDATA[Entropy, Vol. 18, Pages 255: Coupled Thermoelectric Devices: Theory and Experiment]]>
http://www.mdpi.com/1099-4300/18/7/255
In this paper, we address theoretically and experimentally the optimization problem of the heat transfer occurring in two coupled thermoelectric devices. A simple experimental setup is used. The optimization parameters are the applied electric currents. When one thermoelectric is analysed, the temperature difference ΔT between the thermoelectric boundaries shows a parabolic profile with respect to the applied electric current. This behaviour agrees qualitatively with the corresponding experimental measurement. The global entropy generation shows a monotonic increase with the electric current. In the case of two coupled thermoelectric devices, elliptic isocontours of ΔT are obtained when an electric current is applied through each of the thermoelectrics. The isocontours also fit well with measurements. An optimal figure of merit is found for a specific set of values of the applied electric currents. The relationship between entropy generation and the thermal figure of merit is studied. It is shown that, given a value of the thermal figure of merit, the device can be operated in a state of minimum entropy production.Entropy2016-07-14187Article10.3390/e180702552551099-43002016-07-14doi: 10.3390/e18070255Jaziel RojasIván RiveraAldo FigueroaFederico Vázquez<![CDATA[Entropy, Vol. 18, Pages 260: Maximum Entropy Closure of Balance Equations for Miniband Semiconductor Superlattices]]>
http://www.mdpi.com/1099-4300/18/7/260
Charge transport in nanosized electronic systems is described by semiclassical or quantum kinetic equations that are often costly to solve numerically and difficult to reduce systematically to macroscopic balance equations for densities, currents, temperatures and other moments of macroscopic variables. The maximum entropy principle can be used to close the system of equations for the moments, but its accuracy and range of validity are not always clear. In this paper, we compare numerical solutions of balance equations for nonlinear electron transport in semiconductor superlattices. The equations have been obtained from Boltzmann–Poisson kinetic equations very far from equilibrium for strong fields, either by the maximum entropy principle or by a systematic Chapman–Enskog perturbation procedure. Both approaches produce the same current-voltage characteristic curve for uniform fields. When the superlattices are DC voltage biased in a region where there are stable time-periodic solutions corresponding to recycling and motion of electric field pulses, the differences between the numerical solutions of the two types of balance equations are smaller than the expansion parameter used in the perturbation procedure. These results and possible new research avenues are discussed.Entropy2016-07-14187Article10.3390/e180702602601099-43002016-07-14doi: 10.3390/e18070260Luis BonillaManuel Carretero<![CDATA[Entropy, Vol. 18, Pages 257: Using Wearable Accelerometers in a Community Service Context to Categorize Falling Behavior]]>
http://www.mdpi.com/1099-4300/18/7/257
In this paper, Multiscale Entropy (MSE) analysis of acceleration data collected from a wearable inertial sensor is compared with other features reported in the literature for observing falling behavior from acceleration data, and with traditional clinical scales for evaluating falling behavior. We use a fall risk assessment over a four-month period to examine participants aged &gt;65 years in a community service context using simple clinical tests, including the Short Form Berg Balance Scale (SFBBS), the Timed Up and Go test (TUG), and the Short Portable Mental Status Questionnaire (SPMSQ), with wearable accelerometers for the TUG test. We classified participants into fallers and non-fallers to (1) compare the features extracted from the accelerometers and (2) categorize fall risk using statistics from TUG test results. Combined TUG and SFBBS results revealed that the defining features were test time, slope(A) and slope(B) in sit(A)-to-stand(B), and range(A) and slope(B) in stand(B)-to-sit(A). Among the (1) SPMSQ; (2) TUG and SPMSQ; and (3) BBS and SPMSQ results, only range(A) in stand(B)-to-sit(A) was a defining feature. From the MSE indicators, we found that, whether in the X, Y or Z direction, TUG, BBS, and the combined TUG and SFBBS are all distinguishable, showing that MSE can effectively classify participants in these clinical tests using behavioral actions. This study highlights the advantages of body-worn sensors as ordinary, low-cost tools available outside the laboratory. The results indicate that MSE analysis of acceleration data can be used as an effective metric to categorize the falling behavior of community-dwelling elderly people.
In addition to its clinical applications, (1) our approach requires no expert physical therapist, nurse, or doctor for evaluations, and (2) fallers can be categorized irrespective of the critical value from clinical tests.Entropy2016-07-13187Article10.3390/e180702572571099-43002016-07-13doi: 10.3390/e18070257Chia-Hsuan LeeTien-Lung SunBernard JiangVictor Choi<![CDATA[Entropy, Vol. 18, Pages 258: Structures in Sound: Analysis of Classical Music Using the Information Length]]>
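Multiscale entropy itself admits a compact sketch: coarse-grain the signal at each scale, then compute the sample entropy of each coarse-grained series. The pure-Python sketch below uses common default parameters (m = 2, tolerance r taken as an absolute value); the paper's exact preprocessing and parameter choices may differ:

```python
import math

def coarse_grain(x, scale):
    """Average consecutive non-overlapping windows of length `scale`."""
    return [sum(x[i:i + scale]) / scale
            for i in range(0, len(x) - scale + 1, scale)]

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r) = -ln(A / B): B counts pairs of length-m templates within
    Chebyshev distance r, and A does the same for length m + 1."""
    def matches(mm):
        tpl = [x[i:i + mm] for i in range(len(x) - mm + 1)]
        return sum(1 for i in range(len(tpl)) for j in range(i + 1, len(tpl))
                   if max(abs(a - b) for a, b in zip(tpl[i], tpl[j])) <= r)
    b, a = matches(m), matches(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

def multiscale_entropy(x, scales=(1, 2, 3), m=2, r=0.2):
    """MSE curve: sample entropy of the coarse-grained series at each scale."""
    return [sample_entropy(coarse_grain(x, s), m, r) for s in scales]
```

In practice r is usually set relative to the standard deviation of the signal (e.g., 0.2 sd), and the MSE curve over scales, rather than a single value, serves as the feature.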
http://www.mdpi.com/1099-4300/18/7/258
We show that music is represented by fluctuations away from the minimum path through statistical space. Our key idea is to envision music as the evolution of a non-equilibrium system and to construct probability distribution functions (PDFs) from musical instrument digital interface (MIDI) files of classical compositions. Classical music is then viewed through the lens of generalized position and velocity, based on the Fisher metric. Through these statistical tools we discuss a way to quantitatively discriminate between music and noise.Entropy2016-07-13187Article10.3390/e180702582581099-43002016-07-13doi: 10.3390/e18070258Schuyler NicholsonEun-jin Kim<![CDATA[Entropy, Vol. 18, Pages 256: The Logical Consistency of Simultaneous Agnostic Hypothesis Tests]]>
http://www.mdpi.com/1099-4300/18/7/256
Simultaneous hypothesis tests can fail to provide results that meet logical requirements. For example, if A and B are two statements such that A implies B, there exist tests that, based on the same data, reject B but not A. Such outcomes are generally inconvenient to statisticians (who want to communicate the results to practitioners in a simple fashion) and non-statisticians (confused by conflicting pieces of information). Based on this inconvenience, one might want to use tests that satisfy logical requirements. However, Izbicki and Esteves show that the only tests that are in accordance with three logical requirements (monotonicity, invertibility and consonance) are trivial tests based on point estimation, which generally lack statistical optimality. As a possible solution to this dilemma, this paper adapts the above logical requirements to agnostic tests, in which one can accept, reject or remain agnostic with respect to a given hypothesis. Each of the logical requirements is characterized in terms of a Bayesian decision theoretic perspective. Contrary to the results obtained for regular hypothesis tests, there exist agnostic tests that satisfy all logical requirements and also perform well statistically. In particular, agnostic tests that fulfill all logical requirements are characterized as region estimator-based tests. Examples of such tests are provided.Entropy2016-07-13187Article10.3390/e180702562561099-43002016-07-13doi: 10.3390/e18070256Luís EstevesRafael IzbickiJulio SternRafael Stern<![CDATA[Entropy, Vol. 18, Pages 259: Ensemble Equivalence for Distinguishable Particles]]>
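A region estimator-based agnostic test can be sketched for a simple normal-mean setting. This is an illustrative construction in the spirit of the abstract, not the paper's general decision-theoretic formulation: accept when a confidence interval lies entirely inside the hypothesis region, reject when the two are disjoint, and remain agnostic otherwise:

```python
import math

def agnostic_test(sample_mean, n, sigma, h0_low, h0_high, z=1.96):
    """Region-estimator agnostic test for a normal mean with known sigma.
    Accept H0 if the confidence interval lies inside [h0_low, h0_high],
    reject if interval and hypothesis region are disjoint, else stay agnostic."""
    half = z * sigma / math.sqrt(n)
    lo, hi = sample_mean - half, sample_mean + half
    if h0_low <= lo and hi <= h0_high:
        return "accept"
    if hi < h0_low or lo > h0_high:
        return "reject"
    return "agnostic"
```

Because every decision is read off one region estimate, nested hypotheses cannot produce contradictions: if one hypothesis region contains another, accepting the smaller forces accepting the larger, and rejecting the larger forces rejecting the smaller.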
http://www.mdpi.com/1099-4300/18/7/259
The statistics of distinguishable particles has become relevant in systems of colloidal particles and in the context of applications of statistical mechanics to complex networks. In this paper, we present evidence that a commonly used expression for the partition function of a system of distinguishable particles leads to huge fluctuations of the number of particles in the grand canonical ensemble and, consequently, to nonequivalence of statistical ensembles. We show that the alternative definition of the partition function including, naturally, Boltzmann’s correct counting factor for distinguishable particles solves the problem and restores ensemble equivalence. Finally, we also show that this choice for the partition function does not produce any inconsistency for a system of distinguishable localized particles, where the monoparticular partition function is not extensive.Entropy2016-07-13187Article10.3390/e180702592591099-43002016-07-13doi: 10.3390/e18070259Antonio Fernández-PeraltaRaúl Toral<![CDATA[Entropy, Vol. 18, Pages 254: Link between Lie Group Statistical Mechanics and Thermodynamics of Continua]]>
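The role of the counting factor can be made explicit with a short textbook-style sketch (not reproduced from the paper): writing $z$ for the single-particle partition function and $x = z\,e^{\beta\mu}$, omitting the $1/N!$ factor makes the grand partition function a geometric series with macroscopic relative number fluctuations, whereas including it yields Poissonian fluctuations:

```latex
% Without Boltzmann's counting factor, Z_N = z^N:
\Xi = \sum_{N=0}^{\infty} e^{\beta\mu N} z^{N} = \frac{1}{1-x},
\qquad
\frac{\langle \Delta N^{2}\rangle}{\langle N\rangle^{2}} = \frac{1}{x}
\quad \text{(relative fluctuations of order one).}
% With the factor, Z_N = z^N / N!:
\Xi = \sum_{N=0}^{\infty} \frac{x^{N}}{N!} = e^{x},
\qquad
\langle \Delta N^{2}\rangle = \langle N\rangle
\quad \text{(Poissonian, vanishing relative fluctuations).}
```

In the first case the relative fluctuations do not vanish in the thermodynamic limit, which is the ensemble-nonequivalence symptom the abstract describes; in the second they scale as $1/\sqrt{\langle N\rangle}$ and equivalence is restored.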
http://www.mdpi.com/1099-4300/18/7/254
In this work, we consider the value of the momentum map of symplectic mechanics as an affine tensor called the momentum tensor. From this point of view, we analyze the underlying geometric structure of the theories of Lie group statistical mechanics and relativistic thermodynamics of continua, formulated by Souriau independently of each other. We bridge the gap between them in the classical Galilean context. These geometric structures of thermodynamics are rich, and we think they might be a source of inspiration for the geometric theory of information based on the concept of entropy.Entropy2016-07-12187Article10.3390/e180702542541099-43002016-07-12doi: 10.3390/e18070254Géry de Saxcé