Entropy doi: 10.3390/e21020200

Authors: Alex Dytso, Mario Goldenbaum, H. Vincent Poor, Shlomo Shamai (Shitz)

In this work, the capacity of multiple-input multiple-output channels that are subject to constraints on the support of the input is studied. The paper consists of two parts. The first part focuses on the general structure of capacity-achieving input distributions. Known results are surveyed and several new results are provided. With regard to the latter, it is shown that the support of a capacity-achieving input distribution is a small set in both a topological and a measure theoretical sense. Moreover, explicit conditions on the channel input space and the channel matrix are found such that the support of a capacity-achieving input distribution is concentrated on the boundary of the input space only. The second part of this paper surveys known bounds on the capacity and provides several novel upper and lower bounds for channels with arbitrary constraints on the support of the channel input symbols. As an immediate practical application, the special case of multiple-input multiple-output channels with amplitude constraints is considered. The bounds are shown to be within a constant gap to the capacity if the channel matrix is invertible and are tight in the high amplitude regime for arbitrary channel matrices. Moreover, in the regime of high amplitudes, it is shown that the capacity scales linearly with the minimum between the number of transmit and receive antennas, similar to the case of average power-constrained inputs.
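For comparison, the classical average-power result that the abstract alludes to can be written out explicitly. Under an average power constraint P with an isotropic Gaussian input (a standard result from MIMO information theory, stated here as background rather than taken from this paper), the achievable rate is

```latex
C_{\mathrm{avg}} \;=\; \log\det\!\left(\mathbf{I}_{n_r} + \frac{P}{n_t}\,\mathbf{H}\mathbf{H}^{\dagger}\right) \;\approx\; \min(n_t, n_r)\,\log P \qquad (P \to \infty),
```

which exhibits the same linear scaling in min(n_t, n_r) that the paper establishes for amplitude-constrained inputs in the high-amplitude regime.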

Entropy doi: 10.3390/e21020199

Authors: Soheil Keshmiri, Hidenobu Sumioka, Ryuji Yamazaki, Hiroshi Ishiguro

Today's communication media virtually impact and transform every aspect of our daily communication, and yet the extent of their embodiment in our brain is unexplored. The study of this topic becomes more crucial considering the rapid advances in such fields as socially assistive robotics, which envision the use of intelligent and interactive media for providing assistance through social means. In this article, we utilize multiscale entropy (MSE) to investigate the effect of physical embodiment on older people's prefrontal cortex (PFC) activity while listening to stories. We provide evidence that physical embodiment induces a significant increase in the MSE of older people's PFC activity and that such a shift in the dynamics of their PFC activation significantly reflects their perceived feeling of fatigue. Our results benefit researchers in age-related cognitive function and rehabilitation who seek to adapt these media for robot-assisted cognitive training of older people. In addition, they offer complementary information to the field of human-robot interaction by providing evidence that MSE can enable interactive learning algorithms to utilize the brain's activation patterns as feedback for improving their level of interactivity, thereby forming a stepping stone toward a rich and usable human mental model.
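Multiscale entropy itself is a standard algorithm (Costa et al.): coarse-grain the series at increasing scales and compute sample entropy at each scale. A minimal Python sketch, assuming the usual parameter choices (m = 2, tolerance r = 0.2 times the standard deviation of the original series); this is illustrative, not the authors' code:

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """SampEn: negative log of the conditional probability that sequences
    matching for m points (within tolerance r) also match for m + 1 points."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    def count_matches(m):
        # all overlapping templates of length m
        templates = np.array([x[i:i + m] for i in range(n - m)])
        count = 0
        for i in range(len(templates)):
            # Chebyshev distance; subtract 1 to exclude the self-match
            d = np.max(np.abs(templates - templates[i]), axis=1)
            count += np.sum(d <= r) - 1
        return count
    b = count_matches(m)
    a = count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def multiscale_entropy(x, scales, m=2, r_factor=0.2):
    """Coarse-grain the series at each scale (non-overlapping window means)
    and compute SampEn of each coarse-grained series."""
    x = np.asarray(x, dtype=float)
    r = r_factor * np.std(x)  # tolerance fixed from the original series
    mse = []
    for s in scales:
        n_windows = len(x) // s
        coarse = x[:n_windows * s].reshape(n_windows, s).mean(axis=1)
        mse.append(sample_entropy(coarse, m=m, r=r))
    return mse
```

For white noise, MSE decreases with scale; for long-range-correlated signals it stays flat or rises, which is what makes the curve a useful complexity signature.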

Entropy doi: 10.3390/e21020198

Authors: Huaining Sun, Xuegang Hu, Yuhong Zhang

Uncertainty evaluation based on statistical probabilistic information entropy is a commonly used mechanism for constructing heuristic methods in decision tree learning. The entropy kernel potentially links its deviation to decision tree classification performance. This paper presents a decision tree learning algorithm based on constrained gain and depth-induction optimization. Firstly, the uncertainty distributions of information entropy for single- and multi-value events are calculated and analyzed, revealing an enhanced property of the single-value event entropy kernel, the peaks of multi-value event entropy, and a reciprocal relationship between peak location and the number of possible events. Secondly, this study proposes an estimation method for information entropy in which the entropy kernel is replaced with a peak-shift sine function, and establishes a constrained-gain decision tree (CGDT) learning algorithm on this basis. Finally, by combining branch convergence and fan-out indices under the inductive depth of a decision tree, we build a constrained-gain and depth-induction improved decision tree (CGDIDT) learning algorithm. Results show the benefits of the CGDT and CGDIDT algorithms.
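As background for the entropy-kernel discussion, the conventional Shannon-entropy information gain that standard decision tree learners maximize can be sketched as follows (the paper's constrained-gain variant replaces the entropy kernel; this shows only the classical baseline):

```python
import numpy as np
from collections import Counter

def shannon_entropy(labels):
    """H(Y) = -sum p * log2(p) over the class frequencies."""
    counts = np.array(list(Counter(labels).values()), dtype=float)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def information_gain(labels, feature_values):
    """Gain(Y; A) = H(Y) - sum_v (|Y_v| / |Y|) * H(Y_v),
    i.e. entropy reduction from splitting on feature A."""
    total = shannon_entropy(labels)
    n = len(labels)
    cond = 0.0
    for v in set(feature_values):
        subset = [y for y, f in zip(labels, feature_values) if f == v]
        cond += len(subset) / n * shannon_entropy(subset)
    return total - cond
```

A feature that perfectly separates the classes achieves a gain equal to the full label entropy; an uninformative feature achieves zero gain.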

Entropy doi: 10.3390/e21020197

Authors: Jin Li, Jin Cai, Yiqun Peng, Xian Zhang, Cong Zhou, Guang Li, Jingtian Tang

Natural magnetotelluric signals are extremely weak and susceptible to various types of noise pollution. To obtain more useful magnetotelluric data for further analysis and research, effective signal-noise identification and separation is critical. To this end, we propose a novel method of magnetotelluric signal-noise identification and separation based on ApEn-MSE and stagewise orthogonal matching pursuit (StOMP). Parameters with good irregularity metrics are introduced: approximate entropy (ApEn) and multiscale entropy (MSE), in combination with k-means clustering, can be used to accurately identify the data segments that are disturbed by noise. StOMP is then used for noise suppression only in the data segments identified as containing strong interference. Finally, the signal is reconstructed. The results show that, compared with using StOMP alone, the proposed method better preserves the low-frequency slow-change information of the magnetotelluric signal, thus avoiding the loss of useful information due to over-processing, while producing a smoother and more continuous apparent resistivity curve. Moreover, the results more accurately reflect the inherent electrical structure information of the measured site itself.
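Approximate entropy, used here to flag noisy segments, can be sketched directly from its definition (a generic implementation with common parameter defaults, not the authors' code):

```python
import numpy as np

def approximate_entropy(x, m=2, r=0.2):
    """ApEn(m, r): a regularity statistic; lower values indicate more
    regular (predictable) data. Unlike SampEn, self-matches are included."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    def phi(m):
        templates = np.array([x[i:i + m] for i in range(n - m + 1)])
        # C_i: fraction of templates within Chebyshev distance r of template i
        c = [np.mean(np.max(np.abs(templates - t), axis=1) <= r)
             for t in templates]
        return np.mean(np.log(c))
    return phi(m) - phi(m + 1)
```

A clean periodic signal scores much lower than broadband noise, which is what makes ApEn usable as a per-segment interference flag before clustering.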

Entropy doi: 10.3390/e21020196

Authors: Auxiliadora Sarmiento, Irene Fondón, Iván Durán-Díaz, Sergio Cruces

Centroid-based clustering is a widely used technique within unsupervised learning algorithms in many research fields. The success of any centroid-based clustering relies on the choice of the similarity measure in use. In recent years, most studies have focused on including various divergence measures in the traditional hard k-means algorithm. In this article, we consider the problem of centroid-based clustering using the family of αβ-divergences, which is governed by two parameters, α and β. We propose a new iterative algorithm, αβ-k-means, giving closed-form solutions for the computation of the sided centroids. The algorithm can be fine-tuned by means of this pair of values, yielding a wide range of the most frequently used divergences. Moreover, it is guaranteed to converge to local minima for a wide range of values of the pair (α, β). Our theoretical contribution has been validated by several experiments performed with synthetic and real data, exploring the (α, β) plane. The numerical results obtained confirm the quality of the algorithm and its suitability for use in several practical applications.
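A generic centroid-based clustering loop with a pluggable divergence illustrates the structure of such algorithms. Note the hedge: the paper derives dedicated closed-form sided-centroid updates for αβ-divergences, whereas this sketch uses the plain mean, which is exact only for Bregman-type divergences such as squared Euclidean:

```python
import numpy as np

def divergence_kmeans(X, k, divergence, n_iter=50, seed=0):
    """Lloyd-style iteration with a pluggable divergence.
    The mean-based centroid update below is a placeholder; alpha-beta
    divergences require the dedicated sided-centroid formulas."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # assignment step: nearest centroid under the chosen divergence
        d = np.array([[divergence(x, c) for c in centroids] for x in X])
        labels = d.argmin(axis=1)
        # update step: mean of each non-empty cluster
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return labels, centroids

# squared Euclidean distance as the placeholder divergence
sq_euclid = lambda x, c: float(np.sum((x - c) ** 2))
```

Swapping `sq_euclid` for another divergence changes only the assignment step; the paper's contribution is making the update step correct for the whole (α, β) family.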

Entropy doi: 10.3390/e21020195

Authors: Guillermo de Anda-Jáuregui, Jesús Espinal-Enriquez, Enrique Hernández-Lemus

Gene regulation may be studied from an information-theoretic perspective. Gene regulatory programs are representations of the complete regulatory phenomenon associated with each biological state. In diseases such as cancer, these programs exhibit major alterations, which have been associated with the spatial organization of the genome into chromosomes. In this work, we analyze intrachromosomal (cis-) and interchromosomal (trans-) gene regulatory programs in order to assess the differences that arise in the context of breast cancer. Using information-theoretic approaches, we find that it is possible to differentiate cis- and trans-regulatory programs in terms of the changes that they exhibit in the breast cancer context, indicating that in breast cancer there is a loss of trans-regulation. Finally, we use these programs to reconstruct a possible spatial relationship between chromosomes.
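Information-theoretic comparison of regulatory programs typically rests on the mutual information between gene expression profiles. A minimal histogram-based MI estimator, for illustration only (the paper's inference pipeline is more elaborate):

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """I(X;Y) in bits, from a 2-D histogram estimate of the joint density."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of X, shape (bins, 1)
    py = pxy.sum(axis=0, keepdims=True)   # marginal of Y, shape (1, bins)
    nz = pxy > 0
    # sum over non-zero joint cells of p(x,y) * log2(p(x,y) / (p(x) p(y)))
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))
```

Histogram estimators carry a small positive bias for independent variables; serious pipelines correct for it or use permutation nulls.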

Entropy doi: 10.3390/e21020194

Authors: Juan P. Ugarte, Catalina Tobón, Andrés Orozco-Duque

Catheter ablation of critical electrical propagation sites is a promising tool for reducing the recurrence of atrial fibrillation (AF). The spatial identification of the arrhythmogenic mechanisms sustaining AF requires the evaluation of electrograms (EGMs) recorded over the atrial surface. This work aims to characterize functional reentries using measures of entropy to track and detect a reentry core. To this end, different AF episodes are simulated using a 2D model of atrial tissue. Modified Courtemanche human action potential and Fenton–Karma models are implemented. Action potential propagation is modeled by a fractional diffusion equation, and virtual unipolar EGMs are calculated. Episodes with stable and meandering rotors, figure-of-eight reentry, and disorganized propagation with multiple reentries are generated. Shannon entropy (ShEn), approximate entropy (ApEn), and sample entropy (SampEn) are computed from the virtual EGMs, and entropy maps are built. Phase singularity maps are implemented as references. The results show that ApEn and SampEn maps are able to detect and track the reentry core of rotors and figure-of-eight reentry, while the ShEn results are not satisfactory. Moreover, ApEn and SampEn consistently highlight a reentry core by high entropy values for all of the studied cases, while the ability of ShEn to characterize the reentry core depends on the propagation dynamics. Such features make the ApEn and SampEn maps attractive tools for the study of AF reentries that persist for a period of time that is similar to the length of the observation window, and reentries could be interpreted as AF-sustaining mechanisms. Further research is needed to determine and fully understand the relation of these entropy measures with fibrillation mechanisms other than reentries.
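The entropy-map construction can be sketched generically: one ShEn value per recording site, computed from the amplitude histogram of that site's EGM. An illustrative sketch, not the authors' implementation (bin count and array layout are assumptions):

```python
import numpy as np

def shannon_entropy(signal, bins=16):
    """ShEn of a signal's amplitude histogram, in bits."""
    hist, _ = np.histogram(signal, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def entropy_map(egms, bins=16):
    """One entropy value per recording site; egms has shape
    (n_sites, n_samples). High values mark irregular activity."""
    return np.array([shannon_entropy(row, bins) for row in egms])
```

ApEn/SampEn maps follow the same per-site pattern with the regularity statistics substituted for ShEn, which is why the three maps are directly comparable.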

Entropy doi: 10.3390/e21020193

Authors: Alexander K. Hartmann, Oliver Melchert, Christoph Norrenbrock

Spin glasses are prototypical random systems modelling magnetic alloys. One important way to investigate spin glass models is to study domain walls. In two dimensions, this can be understood algorithmically as the calculation of a shortest path that allows for negative distances or weights. This led to the creation of the negative-weight percolation (NWP) model, which is presented here along with all necessary basics from spin glasses, graph theory and the corresponding algorithms. The algorithmic approach involves a mapping to the classical matching problem for graphs. In addition, a summary is given of results obtained during the past decade. This includes the study of percolation transitions in dimensions from d = 2 up to and beyond the upper critical dimension d_u = 6, also for random graphs. It is shown that NWP is in a different universality class than standard percolation. Furthermore, the question of whether NWP exhibits properties of Stochastic Loewner Evolution is addressed, and recent results for directed NWP are presented.

Entropy doi: 10.3390/e21020192

Authors: Mike Yuliana, Wirawan, Suwadi

Limitations of the computational and energy capabilities of IoT devices present new challenges in securing communication between devices. Physical layer security (PHYSEC) is one of the solutions that can be used to address these challenges. In this paper, we investigate PHYSEC schemes that utilize channel reciprocity in generating a secret key, commonly known as secret key generation (SKG) schemes. Our research focuses on obtaining a simple SKG scheme by eliminating the information reconciliation stage so as to reduce the high computational and communication costs. We exploit pre-processing by proposing a modified Kalman (MK) method and combining it with multilevel quantization, i.e., combined multilevel quantization (CMQ). Our approach produces a simple SKG scheme with a significant increase in reciprocity, so that an identical secret key between two legitimate users can be obtained without going through the information reconciliation stage.
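Multilevel quantization of channel measurements into key bits can be sketched as follows. The quantile thresholds, Gray labels, and function names are illustrative assumptions and do not reproduce the paper's CMQ scheme:

```python
import numpy as np

# Gray-coded labels for 4 levels: adjacent levels differ in a single bit,
# so a measurement that lands just across a threshold costs only one bit error.
GRAY_2BIT = {0: "00", 1: "01", 2: "11", 3: "10"}

def multilevel_quantize(samples, n_levels=4):
    """Map each channel measurement (e.g. filtered RSS) to a Gray-coded
    bit pair using equiprobable (quantile) thresholds, so every level is
    used equally often. This sketch assumes exactly 4 levels."""
    qs = np.quantile(samples, np.linspace(0, 1, n_levels + 1)[1:-1])
    levels = np.searchsorted(qs, samples)  # level index 0 .. n_levels-1
    return "".join(GRAY_2BIT[int(l)] for l in levels)
```

Two parties quantizing highly reciprocal measurement sequences independently obtain (nearly) identical bit strings, which is why improving reciprocity in pre-processing can make reconciliation unnecessary.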

Entropy doi: 10.3390/e21020191

Authors: Jundika C. Kurnia, Desmond C. Lim, Lianjun Chen, Lishuai Jiang, Agus P. Sasmito

Owing to its relatively high heat transfer performance and simple configuration, liquid cooling remains the preferred choice for electronic cooling and other applications. In this cooling approach, channel design plays an important role in dictating the cooling performance of the heat sink. Most cooling channel studies evaluate performance only from the perspective of the first law of thermodynamics. This study investigates the flow behaviour and heat transfer performance of an incompressible fluid in a cooling channel with oblique fins with regard to both the first and second laws of thermodynamics. The effects of oblique fin angle and inlet Reynolds number are investigated. In addition, the performance of the cooling channels for different heat fluxes is evaluated. The results indicate that the oblique fin channel with a 20° angle yields the highest figure of merit, especially at higher Re (250–1000). The entropy generation is found to be lowest for an oblique fin channel with a 90° angle, which is about twice that of a conventional parallel channel. Increasing Re decreases the entropy generation, while increasing the heat flux increases it.

Entropy doi: 10.3390/e21020190

Authors: Billy Peralta, Ariel Saavedra, Luis Caro, Alvaro Soto

Today, there is growing interest in the automatic classification of a variety of tasks, such as weather forecasting, product recommendations, intrusion detection, and people recognition. "Mixture-of-experts" is a well-known classification technique: a probabilistic model consisting of local expert classifiers weighted by a gate network, typically based on softmax functions, that can learn complex patterns in data. In this scheme, one data point is influenced by only one expert; as a result, the training process can be misguided in real datasets for which complex data need to be explained by multiple experts. In this work, we propose a variant of the regular mixture-of-experts model. In the proposed model, the classification cost is penalized by the Shannon entropy of the gating network in order to avoid a "winner-takes-all" output for the gating network. Experiments show the advantage of our approach using several real datasets, with improvements in mean accuracy of 3–6% in some datasets. In future work, we plan to embed feature selection into this model.
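The entropy penalty on the gate can be sketched in a few lines. The sign convention shown (subtracting λ times the gate entropy from the loss, so that training favors higher-entropy, less winner-takes-all gates) is one common choice, and all names here are illustrative rather than the authors' formulation:

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def entropy_penalized_loss(ce_loss, gate_logits, lam=0.1):
    """Total loss = classification loss - lam * mean Shannon entropy of the
    gate distribution. Maximizing gate entropy spreads responsibility
    across experts instead of collapsing onto a single winner."""
    g = softmax(gate_logits)
    h = -np.sum(g * np.log(g + 1e-12), axis=-1)  # per-example entropy (nats)
    return ce_loss - lam * np.mean(h)
```

With λ = 0 the model reduces to the regular mixture-of-experts objective; increasing λ trades a little classification cost for softer expert assignments.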

Entropy doi: 10.3390/e21020189

Authors: Bicao Li, Huazhong Shu, Zhoufeng Liu, Zhuhong Shao, Chunlei Li, Min Huang, Jie Huang

This paper introduces a new nonrigid registration approach for medical images that applies an information-theoretic measure based on Arimoto entropy with gradient distributions. A normalized dissimilarity measure based on Arimoto entropy is presented, which is employed to measure the independence between two images. In addition, a regularization term is integrated into the cost function to obtain a smooth elastic deformation. To take the spatial information between voxels into account, a distance between gradient distributions is constructed. The goal of nonrigid alignment is to find the optimal solution of a cost function comprising the dissimilarity measure, the regularization term, and the distance between the gradient distributions of the two images to be registered; the cost function is minimized with the limited-memory Broyden–Fletcher–Goldfarb–Shanno (L-BFGS) optimization scheme and attains its minimum when the two initially misaligned images are perfectly registered. To evaluate our algorithm for non-rigid medical image registration, experiments on simulated three-dimensional (3D) brain magnetic resonance (MR) images, real 3D thoracic computed tomography (CT) volumes and 3D cardiac CT volumes were carried out using the elastix package. Comparison studies with mutual information (MI) and with the approach that does not consider spatial information were conducted. The results demonstrate a slight improvement in the accuracy of non-rigid registration.

Entropy doi: 10.3390/e21020188

Authors: Fang Yuan, Yuxia Li, Guangyi Wang, Gang Dou, Guanrong Chen

In this paper, a new memcapacitor model and its corresponding circuit emulator are proposed, based on which a chaotic oscillator is designed and the system's dynamic characteristics are investigated, both analytically and experimentally. Extreme multistability and coexisting attractors are observed in this complex system. The basins of attraction, multistability, bifurcations, Lyapunov exponents, and initial-condition-triggered similar bifurcations are analyzed. Finally, the memcapacitor-based chaotic oscillator is realized via circuit implementation, with experimental results presented.

Entropy doi: 10.3390/e21020187

Authors: António M. Lopes, J. A. Tenreiro Machado

This paper adopts information-theoretic and fractional calculus tools for studying the dynamics of a national soccer league. A soccer league season is treated as a complex system (CS) with a state observable at discrete time instants, that is, at the time of rounds. The CS state, consisting of the goals scored by the teams, is processed by means of different tools, namely entropy, mutual information and the Jensen–Shannon divergence. The CS behavior is visualized in 3-D maps generated by multidimensional scaling. The points on the maps represent rounds, and their relative positioning allows for a direct interpretation of the results.
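The Jensen–Shannon divergence between two rounds' goal distributions follows directly from its definition (a generic sketch; how the goal data are binned into distributions is not specified here):

```python
import numpy as np

def shannon_entropy(p):
    """H(P) in bits; zero-probability entries contribute nothing."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def jensen_shannon_divergence(p, q):
    """JSD(P, Q) = H((P+Q)/2) - (H(P) + H(Q)) / 2.
    Symmetric, always finite, and bounded by 1 bit."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return shannon_entropy(m) - 0.5 * (shannon_entropy(p) + shannon_entropy(q))
```

Unlike the KL divergence, the JSD stays finite even when the two distributions have disjoint support, which makes it robust for sparse per-round goal histograms.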

Entropy doi: 10.3390/e21020186

Authors: Dai Hou, Xu Yang, Zhu Chen, Huang Yan

Polymer nanocomposite materials, consisting of a polymer matrix embedded with nanoscale fillers or additives that reinforce the inherent properties of the matrix polymer, play a key role in many industrial applications. Understanding the relation between thermodynamic interactions and the macroscopic morphologies of the composites allows for the optimization of design and mechanical processing. This review article summarizes recent advances in various aspects of entropic effects in polymer nanocomposites, and highlights molecular methods used to perform numerical simulations, the morphologies and phase behaviors of polymer matrices and fillers, and characteristic parameters that significantly correlate with entropic interactions in polymer nanocomposites. Experimental findings and insights obtained from theory and simulation are combined to understand how entropic effects are turned into effective interparticle interactions that can be harnessed for tailoring the nanostructure of polymer nanocomposites.

Entropy doi: 10.3390/e21020185

Authors: Yeong-Cherng Liang, Yanbao Zhang

The device-independent approach to physics is one where conclusions about physical systems (and hence of Nature) are drawn directly and solely from the observed correlations between measurement outcomes. This operational approach to physics arose as a byproduct of Bell's seminal work to distinguish, via a Bell test, quantum correlations from the set of correlations allowed by local-hidden-variable theories. In practice, since one can only perform a finite number of experimental trials, deciding whether an empirical observation is compatible with some class of physical theories has to be carried out via the task of hypothesis testing. In this paper, we show that the prediction-based-ratio method, initially developed for performing a hypothesis test of local-hidden-variable theories, can equally well be applied to test many other classes of physical theories, such as those constrained only by the nonsignaling principle, and those constrained to produce any of the outer approximations to the quantum set of correlations due to Navascués, Pironio, and Acín. We numerically simulate Bell tests using hypothetical nonlocal sources of correlations to illustrate the applicability of the method in both the independent and identically distributed (i.i.d.) scenario and the non-i.i.d. scenario. As a further application, we demonstrate how this method allows us to unveil an apparent violation of the nonsignaling conditions in certain experimental data collected in a Bell test. This, in turn, highlights the importance of the randomization of measurement settings, as well as of a consistency check of the nonsignaling conditions, in a Bell test.

Entropy doi: 10.3390/e21020184

Authors: Patrick L. McDermott, Christopher K. Wikle

Recurrent neural networks (RNNs) are nonlinear dynamical models commonly used in the machine learning and dynamical systems literature to represent complex dynamical or sequential relationships between variables. Recently, as deep learning models have become more common, RNNs have been used to forecast increasingly complicated systems. Dynamical spatio-temporal processes represent a class of complex systems that can potentially benefit from these types of models. Although the RNN literature is expansive and highly developed, uncertainty quantification is often ignored. Even when considered, the uncertainty is generally quantified without the use of a rigorous framework, such as a fully Bayesian setting. Here we attempt to quantify uncertainty in a more formal framework while maintaining the forecast accuracy that makes these models appealing, by presenting a Bayesian RNN model for nonlinear spatio-temporal forecasting. Additionally, we make simple modifications to the basic RNN to help accommodate the unique nature of nonlinear spatio-temporal data. The proposed model is applied to a Lorenz simulation and two real-world nonlinear spatio-temporal forecasting applications.

Entropy doi: 10.3390/e21020183

Authors: Manuel Lozano-García, Luis Estrada, Raimon Jané

Fixed sample entropy (fSampEn) has been successfully applied to myographic signals for inspiratory muscle activity estimation, attenuating interference from cardiac activity. However, several values have been suggested for the fSampEn parameters depending on the application, and there is no consensus on optimum values. This study aimed to perform a thorough evaluation of the performance of the most relevant fSampEn parameters in myographic respiratory signals, and to propose, for the first time, a set of optimal general fSampEn parameters for a proper estimation of inspiratory muscle activity. Different combinations of fSampEn parameters were used to calculate fSampEn in both non-invasive and gold-standard invasive myographic respiratory signals. All signals were recorded in a heterogeneous population of healthy subjects and chronic obstructive pulmonary disease patients during loaded breathing, thus allowing the performance of fSampEn to be evaluated for a variety of inspiratory muscle activation levels. The performance of fSampEn was assessed by means of the cross-covariance between fSampEn time series and both the mouth and transdiaphragmatic pressures generated by the inspiratory muscles. A set of optimal general fSampEn parameters was proposed, allowing the fSampEn of different subjects to be compared and contributing to improving the assessment of inspiratory muscle activity in health and disease.

Entropy doi: 10.3390/e21020182

Authors: Tarsha Eason, Wen Ching-Chuang, Shana Sundstrom, Heriberto Cabezas

Given the intensity and frequency of environmental change, the linked and cross-scale nature of social-ecological systems, and the proliferation of big data, methods that can help synthesize complex system behavior over a geographical area are of great value. Fisher information evaluates order in data and has been established as a robust and effective tool for capturing changes in system dynamics, including the detection of regimes and regime shifts. The methods developed to compute Fisher information can accommodate multivariate data of various types and require no a priori decisions about system drivers, making the approach a unique and powerful tool. However, it has primarily been used to evaluate temporal patterns. In its sole application to spatial data, Fisher information successfully detected regimes in terrestrial and aquatic systems over transects. Although the selection of adjacently positioned sampling stations provided a natural means of ordering the data, such an approach limits the types of questions that can be answered in a spatial context. Here, we expand the approach to develop a method for more fully capturing spatial dynamics. The results reflect changes in the index that correspond with geographical patterns and demonstrate the utility of the method in uncovering hidden spatial trends in complex systems.

Entropy doi: 10.3390/e21020181

Authors: Muñoz-Cobo Berna

In this paper, we first review the physical basis of chemical reaction networks as Markov processes in a multidimensional vector space. We then study the chemical reactions from a microscopic point of view, to obtain the expressions for the propensities of the different reactions that can happen in the network. These chemical propensities, at a given time, depend on the system state at that time, and do not depend on the state at any earlier time, indicating that we are dealing with Markov processes. The Chemical Master Equation (CME) is then deduced for an arbitrary chemical network from a probability balance, and it is expressed in terms of the reaction propensities. This CME governs the dynamics of the chemical system. Because of the difficulty of solving this equation, two methods are studied. The first is the probability generating function method, or z-transform, which permits obtaining the evolution of the factorial moments of the system with time in an easy way or, after some manipulation, the evolution of the polynomial moments. The second method is the expansion of the CME in terms of an order parameter (the system volume). In this case, we first study the expansion of the CME using the propensities obtained previously, splitting the molecular concentration into a deterministic part and a random part. An expression in terms of multinomial coefficients is obtained for the evolution of the probability of the random part. We then study how to reconstruct the probability distribution from the moments using the maximum entropy principle. Finally, the previous methods are applied to simple chemical networks and their consistency is studied.
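The propensity-driven Markov dynamics governed by the CME can be sampled exactly with Gillespie's stochastic simulation algorithm, a standard method mentioned here as background; the function and variable names are illustrative:

```python
import numpy as np

def gillespie(x0, stoich, rate_consts, propensity_fn, t_max, seed=0):
    """Exact stochastic simulation of a chemical network (Gillespie, 1977).
    x0: initial molecule counts; stoich: (n_reactions, n_species) count
    changes; propensity_fn(x, k): propensity of each reaction at state x."""
    rng = np.random.default_rng(seed)
    t, x = 0.0, np.array(x0, dtype=int)
    times, states = [t], [x.copy()]
    while t < t_max:
        a = propensity_fn(x, rate_consts)   # propensities at current state
        a0 = a.sum()
        if a0 <= 0:                         # no reaction can fire any more
            break
        t += rng.exponential(1.0 / a0)      # waiting time to next reaction
        j = rng.choice(len(a), p=a / a0)    # which reaction fires
        x = x + stoich[j]
        times.append(t)
        states.append(x.copy())
    return np.array(times), np.array(states)

# Example: pure decay A -> 0 with propensity k * A (illustrative).
decay_stoich = np.array([[-1]])
decay_prop = lambda x, k: np.array([k[0] * x[0]])
```

Each trajectory is one exact sample of the CME's solution; averaging many trajectories recovers the moments that the generating-function and volume-expansion methods describe analytically.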

Entropy doi: 10.3390/e21020180

Authors: Xiaoyi Chen, Hongbo Xu, Mengyao Pan, Jiupeng Zhao, Yao Li, Ying Song

Cracks and defects, which result in lower reflectivity and larger full width at half maximum (FWHM), are the major obstacles to obtaining highly ordered structures of colloidal crystals (CCs). High-quality CCs with high reflectivity (more than 90%) and a narrow FWHM of 9.2 nm were successfully fabricated using a fixed proportion of a soft matter system composed of silica particles (SPs), polyethylene glycol diacrylate (PEGDA), and ethanol. The influences of the refractivity difference, volume fractions, and particle dimension on the FWHM were elucidated. Firstly, we clarified the influences of a planar interface and a bending interface on the self-assembly: CCs were successfully fabricated on the planar interface but gave unfavorable results on the bending interface. Secondly, a hard sphere system consisting of SPs, PEGDA, and ethanol was established, and the entropy-driven phase transition mechanism of a polydisperse system was expounded. The FWHM and reflectivity of the CCs showed an increasing trend with increasing temperature. Consequently, high-quality CCs were obtained by adjusting temperatures (the ordered structure formed at 90 °C and solidified at 0 °C) based on the surface phase rule of the system. We acquired a profound understanding of the principle and process of self-assembly, which is significant for the preparation and application of CCs in devices such as optical filters.

Entropy doi: 10.3390/e21020179

Authors: Ramon F. Álvarez-Estrada

We review and improve previous work on non-equilibrium classical and quantum statistical systems, subject to potentials, without ab initio dissipation. We treat classical closed three-dimensional many-particle interacting systems without any "heat bath" (hb), evolving through the Liouville equation for the non-equilibrium classical distribution W_c, with initial states describing thermal equilibrium at large distances but non-equilibrium at finite distances. We use Boltzmann's Gaussian classical equilibrium distribution W_c,eq as the weight function to generate orthogonal polynomials (H_n's) in momenta. The moments of W_c, implied by the H_n's, fulfill a non-equilibrium hierarchy. Under long-term approximations, the lowest moment dominates the evolution towards thermal equilibrium. A non-increasing Liapunov function characterizes the long-term evolution towards equilibrium. Non-equilibrium chemical reactions involving two and three particles in an hb are studied classically and quantum-mechanically (by using Wigner functions W). Difficulties related to the non-positivity of W are bypassed. Equilibrium Wigner functions W_eq generate orthogonal polynomials, which yield non-equilibrium moments of W and hierarchies. In regimes typical of chemical reactions (short thermal wavelength and long times), non-equilibrium hierarchies yield approximate Smoluchowski-like equations displaying dissipation and quantum effects. The study of three-particle chemical reactions is new.
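As background (a standard fact, not specific to this paper): for a Gaussian weight in a single momentum variable p, the orthogonal polynomials generated are the Hermite polynomials; in the probabilists' normalization,

```latex
W_{c,eq}(p) \propto e^{-p^2/(2mk_BT)}, \qquad
\int_{-\infty}^{\infty} \mathrm{He}_m(u)\,\mathrm{He}_n(u)\,
\frac{e^{-u^2/2}}{\sqrt{2\pi}}\,du = n!\,\delta_{mn},
\qquad u = \frac{p}{\sqrt{mk_BT}},
```

so the non-equilibrium moments of W_c are, in effect, its Hermite expansion coefficients.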

Entropy doi: 10.3390/e21020178

Authors: Garland Culbreth, Bruce J. West, Paolo Grigolini

In this paper, we establish a clear distinction between two processes yielding anomalous diffusion and 1/f noise. The first process is called Stationary Fractional Brownian Motion (SFBM) and is characterized by the use of stationary correlation functions. The second process rests on the action of crucial events generating ergodicity breakdown and aging effects. We refer to the latter as Aging Fractional Brownian Motion (AFBM). To settle the confusion between these different forms of Fractional Brownian Motion (FBM), we use an entropic approach properly updated to incorporate recent advances in the biological and psychological sciences of cognition. We show that although the joint action of crucial and non-crucial events may have the effect of making the crucial events virtually invisible, the entropic approach allows us to detect their action. The results of this paper lead us to the conclusion that the communication between the heart and the brain is accomplished by AFBM processes.

Entropy doi: 10.3390/e21020177

Authors: Andrea Auconi, Andrea Giansanti, Edda Klipp

The entropy production in stochastic dynamical systems is linked to the structure of their causal representation in terms of Bayesian networks. Such a connection was formalized for bipartite (or multipartite) systems with an integral fluctuation theorem in [Phys. Rev. Lett. 111, 180603 (2013)]. Here we introduce the information thermodynamics for time series, which are non-bipartite in general, and we show that the link between irreversibility and information can only result from an incomplete causal representation. In particular, we consider a backward transfer entropy lower bound on the conditional time series irreversibility that is induced by the absence of feedback in signal-response models. We study this relation in a linear signal-response model providing analytical solutions, and in a nonlinear biological model of receptor-ligand systems, where the time series irreversibility measures the signaling efficiency.
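For reference, the standard (forward) transfer entropy from X to Y, of which the backward variant used here is the time-reversed analogue, is defined as

```latex
T_{X \to Y} \;=\; \sum_{y_{t+1},\, y_t,\, x_t} p(y_{t+1}, y_t, x_t)\,
\log \frac{p(y_{t+1} \mid y_t, x_t)}{p(y_{t+1} \mid y_t)},
```

i.e. the extra predictability of Y's next state gained by conditioning on X's present state.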

Entropy doi: 10.3390/e21020176

Authors: Guohui Li, Zhichao Yang, Hong Yang

To improve the recognition accuracy of ship-radiated noise, a feature extraction method based on regenerated phase-shifted sinusoid-assisted empirical mode decomposition (RPSEMD), mutual information (MI), and differential symbolic entropy (DSE) is proposed in this paper. RPSEMD is an improved empirical mode decomposition (EMD) that alleviates the mode mixing problem of EMD. DSE is a new tool for quantifying the complexity of nonlinear time series. It not only has high computational efficiency, but can also measure the nonlinear complexity of short time series. Firstly, the ship-radiated noise is decomposed into a series of intrinsic mode functions (IMFs) by RPSEMD, and the DSE of each IMF is calculated. Then, the MI between each IMF and the original signal is calculated; the sum of the MIs is taken as the denominator, and each normalized MI (norMI) is obtained. Finally, each norMI is used as a weight coefficient for the corresponding DSE, yielding the weighted DSE (WDSE). The WDSEs are sent to a support vector machine (SVM) classifier to classify and recognize three types of ship-radiated noise. The experimental results demonstrate that the recognition rate of the proposed method reaches 98.3333%. Consequently, the proposed WDSE method can effectively classify ships.

]]>Entropy doi: 10.3390/e21020175

Authors: Hung T. Diep

In this review, we outline some principal theoretical knowledge of the properties of frustrated spin systems and magnetic thin films. The two points we would like to emphasize are: (i) the physics in low dimensions, where exact solutions can be obtained; (ii) the physics at phase boundaries, where interesting phenomena can occur due to competing interactions of the two phases around the boundary. This competition causes frustration. We will concentrate our attention on magnetic thin films and phenomena occurring near the boundary of two phases of different symmetries. Two-dimensional (2D) systems are in fact the limiting case of thin films with a monolayer. Naturally, we will treat this case at the beginning. We begin by defining frustration and giving examples of frustrated 2D Ising systems that we can exactly solve by transforming them into vertex models. We will show that these simple systems already contain most of the striking features of frustrated systems, such as the high degeneracy of the ground state (GS), many phases in the GS phase diagram in the space of interaction parameters, the reentrance occurring near the boundaries of these phases, the disorder lines in the paramagnetic phase, and the partial disorder coexisting with the order at equilibrium. Thin films are then presented under different aspects: surface elementary excitations (surface spin waves), surface phase transition, and criticality. Several examples are shown and discussed. New results on skyrmions in thin films and superlattices are also displayed. Through the examples presented in this review, we show that frustration, when combined with surface effects in low dimensions, gives rise to striking phenomena, observed in particular near phase boundaries.

]]>Entropy doi: 10.3390/e21020174

Authors: Penglei Li Lingen Chen Shaojun Xia Lei Zhang

The methanol synthesis via CO2 hydrogenation (MSCH) reaction is a useful CO2 utilization strategy, and this synthesis path has also been widely applied commercially for many years. In this work, the performance of a MSCH reactor with the minimum entropy generation rate (EGR) as the objective function is optimized by using finite-time thermodynamics and optimal control theory. The exterior wall temperature (EWT) is taken as the control variable, and the fixed methanol yield and the conservation equations are taken as the constraints in the optimization problem. Compared with the reference reactor with a constant EWT, the total EGR of the optimal reactor decreases by 20.5%, and the EGR caused by heat transfer decreases by 68.8%. In the optimal reactor, the total EGR is mainly distributed over the first 30% of the reactor length, and the EGR caused by the chemical reaction accounts for more than 84% of the total. The selectivity of CH3OH can be enhanced by increasing the inlet molar flow rate of CO, and the CO2 conversion rate can be enhanced by removing H2O from the reaction system. The results obtained herein can inform the optimal design of practical tubular MSCH reactors.

]]>Entropy doi: 10.3390/e21020173

Authors: Eugenio Lippiello Cataldo Godano Lucilla de Arcangelis

An increase of seismic activity is often observed before large earthquakes. The events responsible for this increase are usually named foreshocks, and their occurrence probably represents the most reliable precursory pattern. Many statistical features of foreshocks can be interpreted in terms of the standard mainshock-to-aftershock triggering process and are recovered in the Epidemic Type Aftershock Sequence (ETAS) model. Here we present a statistical study of instrumental seismic catalogs from four different geographic regions. We focus on some common features of foreshocks in the four catalogs which cannot be reproduced by the ETAS model. In particular, we find in instrumental catalogs a significantly larger number of foreshocks than that predicted by the ETAS model. We show that this foreshock excess cannot be attributed to catalog incompleteness. We therefore propose a generalized formulation of the ETAS model, the ETAFS model, which explicitly includes foreshock occurrence. Statistical features of aftershocks and foreshocks in the ETAFS model are in very good agreement with instrumental results.

]]>Entropy doi: 10.3390/e21020172

Authors: Chao-Qiang Geng Wei-Cheng Hsu Jhih-Rong Lu Ling-Wei Luo

We study thermodynamics in f(R) gravity with the disformal transformation. The transformation applied to the matter Lagrangian has the form γ_{μν} = A(ϕ, X) g_{μν} + B(ϕ, X) ∂_μϕ ∂_νϕ, with the assumption of the Minkowski matter metric γ_{μν} = η_{μν}, where ϕ is the disformal scalar and X is the corresponding kinetic term of ϕ. We verify the generalized first and second laws of thermodynamics in this disformal type of f(R) gravity in the Friedmann-Lemaître-Robertson-Walker (FLRW) universe. In addition, we show that the Hubble parameter contains the disformally induced terms, which define the effectively varying equations of state for matter.

]]>Entropy doi: 10.3390/e21020171

Authors: Emmanuel Zambrini Cruzeiro Nicolas Gisin

We study Bell scenarios with binary outcomes supplemented by one bit of classical communication. We develop a method to find facet inequalities for such scenarios even when direct facet enumeration is not possible, or at least difficult. Using this method, we partially solve the scenario where Alice and Bob choose between three inputs, finding a total of 668 inequivalent facet inequalities (with respect to relabelings of inputs and outputs). We also show that some of these inequalities are constructed from facet inequalities found in scenarios without communication, that is, the well-known Bell inequalities.

]]>Entropy doi: 10.3390/e21020170

Authors: Xianzhi Wang Shubin Si Yu Wei Yongbo Li

Multi-scale permutation entropy (MPE) is a statistical indicator for detecting nonlinear dynamic changes in time series, with the merits of high calculation efficiency, good robustness, and independence from prior knowledge. However, the performance of MPE depends on the selection of the embedding dimension and the time delay. To automate the parameter selection of MPE, a novel parameter optimization strategy is proposed, namely optimized multi-scale permutation entropy (OMPE). In the OMPE method, an improved Cao method is proposed to adaptively select the embedding dimension, while the time delay is determined based on mutual information. To verify the effectiveness of the OMPE method, a simulated signal and two experimental signals are used for validation. Results demonstrate that the proposed OMPE method has a better feature extraction ability compared with existing MPE methods.

]]>Entropy doi: 10.3390/e21020169

Authors: Haas Manzoni Krieg Glatzel

High entropy or compositionally complex alloys provide opportunities for optimization towards new high-temperature materials. Improvements in the equiatomic alloy Al17Co17Cr17Cu17Fe17Ni17 (at.%) led to the base alloy for this work, with the chemical composition Al10Co25Cr8Fe15Ni36Ti6 (at.%). Characterization of the beneficial particle-strengthened microstructure by scanning electron microscopy (SEM) and the observation of good mechanical properties at elevated temperatures gave rise to the need for further optimization steps. For this purpose, the refractory metals hafnium and molybdenum were added in small amounts (0.5 and 1.0 at.%, respectively) because of their well-known positive effects on the mechanical properties of Ni-based superalloys. By correlating microstructural examinations using SEM with tensile tests in the range from room temperature up to 900 °C, conclusions could be drawn for further optimization steps.

]]>Entropy doi: 10.3390/e21020168

Authors: Chang Wang Zongya Zhao Qiongqiong Ren Yongtao Xu Yi Yu

Various retinal vessel segmentation methods based on convolutional neural networks were proposed recently, and Dense U-net, a new semantic segmentation network, was successfully applied to scene segmentation. Retinal vessels are tiny, and their features can be learned effectively by a patch-based learning strategy. In this study, we propose a new retinal vessel segmentation framework based on Dense U-net and the patch-based learning strategy. In the training process, training patches are obtained by a random extraction strategy, Dense U-net is adopted as the training network, and random transformation is used as a data augmentation strategy. In the testing process, test images are divided into image patches, the test patches are predicted by the trained model, and the segmentation result is reconstructed by an overlapping-patches sequential reconstruction strategy. The proposed method was applied to the public datasets DRIVE and STARE, and retinal vessel segmentation was performed. Sensitivity (Se), specificity (Sp), accuracy (Acc), and area under the curve (AUC) were adopted as evaluation metrics to verify the effectiveness of the proposed method. Compared with state-of-the-art methods, including unsupervised, supervised, and convolutional neural network (CNN) methods, the results demonstrate that our approach is competitive in these evaluation metrics. This method can obtain a better segmentation result than specialists and has clinical application value.

]]>Entropy doi: 10.3390/e21020167

Authors: Fei-Quan Tu Yi-Xin Chen Qi-Hong Huang

It has previously been shown that the evolution of the universe can be described based on the emergence of space and the energy balance relation. Here we investigate the thermodynamic properties of the universe described by such a model. We show that the first law of thermodynamics and the generalized second law of thermodynamics (GSLT) are both satisfied, and that the weak energy condition is also fulfilled, for two typical examples. Finally, we examine the physical consistency of the present model. The results show that there exists a good thermodynamic description for such a universe.

]]>Entropy doi: 10.3390/e21020166

Authors: Bukovsky Kinsner Homma

Recently, a novel concept of a non-probabilistic novelty detection measure, based on a multi-scale quantification of unusually large learning efforts of machine learning systems, was introduced as learning entropy (LE). The key finding with LE is that the learning effort of a learning system is quantifiable as a novelty measure for each individually observed data point of otherwise complex dynamic systems, while model accuracy is not a necessary requirement for novelty detection. This brief paper extends the explanation of LE from an informatics approach toward a cognitive (learning-based) information measure, emphasizing the distinction from Shannon's concept of probabilistic information. Fundamental derivations of learning entropy and of its practical estimations are recalled and further extended. The potentials, limitations, and thus the current challenges of LE are discussed.

]]>Entropy doi: 10.3390/e21020165

Authors: Xiantao Jiang Tian Song Daqi Zhu Takafumi Katayama Lu Wang

Perceptual video coding (PVC) can provide a lower bitrate at the same visual quality compared with traditional H.265/high efficiency video coding (HEVC). In this work, a novel H.265/HEVC-compliant PVC framework is proposed based on a video saliency model. Firstly, an effective and efficient spatiotemporal saliency model is used to generate a video saliency map. Secondly, a perceptual coding scheme is developed based on the saliency map, in which a saliency-based quantization control algorithm is proposed to reduce the bitrate. Finally, simulation results demonstrate that the proposed perceptual coding scheme is superior in objective and subjective tests, achieving up to a 9.46% bitrate reduction with negligible subjective and objective quality loss. The advantage of the proposed method is its high quality, which makes it well suited to high-definition video applications.

]]>Entropy doi: 10.3390/e21020164

Authors: Wei Zhang Mingyang Zhang Yingbo Peng Fangzhou Liu Yong Liu Songhao Hu Yang Hu

In this study, an effective way of applying Ti/Ni deposited coatings to the surface of diamond single-crystal particles by magnetron sputtering was proposed, and novel high-entropy alloy (HEA)/diamond composites were prepared by spark plasma sintering (SPS). The results show that the interfacial bonding state of the coated diamond composite is obviously better than that of the uncoated diamond composite. Corresponding mechanical properties such as hardness, density, transverse fracture strength, and friction properties of the coated diamond composite were also found to be better than those of the uncoated diamond composite. The effects of interface structure and defects on the mechanical properties of HEA/diamond composites were investigated, and research directions for further improving the structure and properties of HEA/diamond composites are proposed.

]]>Entropy doi: 10.3390/e21020163

Authors: Qian Pan Deyun Zhou Yongchuan Tang Xiaoyang Li Jichuan Huang

Dempster-Shafer evidence theory (DST) has shown great advantages in tackling uncertainty in a wide variety of applications. However, how to quantify the information-based uncertainty of a basic probability assignment (BPA) with belief entropy in the DST framework is still an open issue. The main work of this study is to define a new belief entropy for measuring the uncertainty of a BPA. The proposed belief entropy has two components. The first component is based on the summation of the probability mass function (PMF) of the single events contained in each BPA, obtained using the plausibility transformation. The second component is the same as the weighted Hartley entropy. The two components effectively measure the discord uncertainty and the non-specificity uncertainty found in the DST framework, respectively. The proposed belief entropy is proved to satisfy the majority of the desired properties for an uncertainty measure in the DST framework. In addition, when the BPA is a probability distribution, the proposed method degenerates to Shannon entropy. The feasibility and superiority of the new belief entropy are verified by the results of numerical experiments.
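One plausible reading of the two-component construction can be sketched as follows, assuming the discord component is the Shannon entropy of the plausibility-transform PMF; the exact functional form in the paper may differ, so treat this as an illustration of the ingredients (plausibility transformation plus weighted Hartley entropy), not the authors' definition:

```python
import math

def plausibility_transform(bpa, frame):
    """Map a BPA (dict: frozenset -> mass) to a PMF over singletons via plausibility."""
    pl = {x: sum(m for A, m in bpa.items() if x in A) for x in frame}
    total = sum(pl.values())
    return {x: v / total for x, v in pl.items()}

def belief_entropy(bpa, frame):
    """Discord term: Shannon entropy of the plausibility-transform PMF.
    Non-specificity term: weighted Hartley entropy, sum of m(A) * log2|A|."""
    pmf = plausibility_transform(bpa, frame)
    discord = -sum(p * math.log2(p) for p in pmf.values() if p > 0)
    nonspec = sum(m * math.log2(len(A)) for A, m in bpa.items() if m > 0)
    return discord + nonspec
```

For a Bayesian BPA (all mass on singletons) the Hartley term vanishes and the measure reduces to Shannon entropy, matching the degeneration property stated in the abstract.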

]]>Entropy doi: 10.3390/e21020162

Authors: Qian Luo Wang

The Hammerstein adaptive filter using the maximum correntropy criterion (MCC) has been shown to be more robust to outliers than those using the traditional mean square error (MSE) criterion. As there is no report on robust Hammerstein adaptive filters in the complex domain, in this paper we extend the robust Hammerstein adaptive filter under MCC to the complex domain and propose the Hammerstein maximum complex correntropy criterion (HMCCC) algorithm. Thus, the new Hammerstein adaptive filter can directly handle complex-valued data. Additionally, we analyze the stability and steady-state mean square performance of HMCCC. Simulations illustrate that the proposed HMCCC algorithm is convergent in impulsive noise environments, and achieves a higher accuracy and faster convergence speed than the Hammerstein complex least mean square (HCLMS) algorithm.

]]>Entropy doi: 10.3390/e21020161

Authors: Wenying Zhang Xifu Wang Kai Yang

In the management of intermodal transportation, the incentive contract design problem has a significant impact on the benefit of a multimodal transport operator (MTO). In this paper, we analyze a typical water-rail-road (WRR) intermodal transportation system that is composed of three serial transportation stages: water, rail, and road. In particular, the entire transportation process is planned, organized, and funded by an MTO that outsources the transportation task at each stage to independent carriers (subcontractors). Due to the variability of transportation conditions, the travel time of each transportation stage, which depends on the respective carrier's effort level, is unknown (asymmetric information) and characterized as an uncertain variable via experts' estimations. Considering the decentralized decision-making process, we interpret the incentive contract design problem for WRR intermodal transportation as a Stackelberg game in which the risk-neutral MTO serves as the leader and the risk-averse carriers serve as the followers. Within the framework of uncertainty theory, we formulate an uncertain bi-level programming model for the incentive contract design problem under expectation and entropy decision criteria. Subsequently, we provide analytical results for the proposed model and analyze the optimal time-based incentive contracts by developing a hybrid solution method which combines a decomposition approach and an iterative algorithm. Finally, we give a simulation example to investigate the impact of asymmetric information on the optimal time-based incentive contracts and to identify the value of information for WRR intermodal transportation.

]]>Entropy doi: 10.3390/e21020159

Authors: Gustavo M. Bosyk Sebastian Fortin Pedro W. Lamberti Federico Holik

The VII Conference on Quantum Foundations: 90 years of uncertainty (https://sites [...]

]]>Entropy doi: 10.3390/e21020160

Authors: Marcos Herrera Jesus Mur Manuel Ruiz

The practice of spatial econometrics revolves around a weighting matrix, which is often supplied by the user based on prior knowledge. This is the so-called W issue. The aprioristic approach is probably not the best solution although, at present, there are few alternatives for the user. Our contribution focuses on the problem of selecting a W matrix from among a finite set of matrices, all of them considered appropriate for the case. We develop a new and simple method based on the entropy corresponding to the probability distribution estimated for the data. Other alternatives, which are common in current applied work, are also reviewed. The paper includes a large Monte Carlo study to calibrate the effectiveness of our approach compared to others. A well-known case study is also included.

]]>Entropy doi: 10.3390/e21020158

Authors: Albert No

We establish a universal property of logarithmic loss in the successive refinement problem. If the first decoder operates under logarithmic loss, we show that any discrete memoryless source is successively refinable under an arbitrary distortion criterion for the second decoder. Based on this result, we propose a low-complexity lossy compression algorithm for any discrete memoryless source.

]]>Entropy doi: 10.3390/e21020157

Authors: Andrei Khrennikov Alexander Alodjants

We start with a review of classical probability representations of quantum states and observables. We show that the correlations of the observables involved in the Bohm–Bell type experiments can be expressed as correlations of classical random variables. The main part of the paper is devoted to the conditional probability model with conditioning on the selection of the pairs of experimental settings. From the viewpoint of quantum foundations, this is a local contextual hidden-variables model. Following the recent works of Dzhafarov and collaborators, we apply our conditional probability approach to characterize (no-)signaling. Consideration of the Bohm–Bell experimental scheme in the presence of signaling is important for applications outside quantum mechanics, e.g., in psychology and social science. The main message of this paper (rooted in Ballentine's work) is that quantum probabilities, and more generally probabilities related to the Bohm–Bell type experiments (not only in physics, but also in psychology, sociology, game theory, economics, and finance), can be classically represented as conditional probabilities.

]]>Entropy doi: 10.3390/e21020156

Authors: Hadi Jahanshahi Maryam Shahriari-Kahkeshi Raúl Alcaraz Xiong Wang Vijay P. Singh Viet-Thanh Pham

Today, four-dimensional chaotic systems are attracting considerable attention because of their special characteristics. This paper presents a non-equilibrium four-dimensional chaotic system with hidden attractors and investigates its dynamical behavior using a bifurcation diagram as well as three well-known entropy measures: approximate entropy, sample entropy, and fuzzy entropy. In order to stabilize the proposed chaotic system, an adaptive radial-basis function neural network (RBF-NN)-based control method is proposed to represent the model of the uncertain nonlinear dynamics of the system. The Lyapunov direct method-based stability analysis of the proposed approach guarantees that all of the closed-loop signals are semi-globally uniformly ultimately bounded. Adaptive learning laws are also proposed to tune the weight coefficients of the RBF-NN. The proposed adaptive control approach requires neither prior information about the uncertain dynamics nor the parameter values of the considered system. Simulation results validate the performance of the proposed control method.

]]>Entropy doi: 10.3390/e21020155

Authors: Sun Zhang Xu Zhang

Attribute reduction is an important preprocessing step for data mining and has become a hot research topic in rough set theory. Neighborhood rough set theory can overcome the shortcoming that classical rough set theory may lose some useful information in the process of discretizing continuous-valued data sets. In this paper, to improve the classification performance for complex data, a novel attribute reduction method using neighborhood entropy measures, combining the algebra view with the information view in neighborhood rough sets, is proposed; it can deal with continuous data whilst maintaining the classification information of the original attributes. First, to efficiently analyze the uncertainty of knowledge in neighborhood rough sets, by combining neighborhood approximate precision with neighborhood entropy, a new average neighborhood entropy, based on the strong complementarity between the algebraic definition of attribute significance and the definition from the information view, is presented. Then, a concept of decision neighborhood entropy is investigated for handling the uncertainty and noisiness of neighborhood decision systems, which integrates the credibility degree with the coverage degree of neighborhood decision systems to fully reflect the decision ability of attributes. Moreover, some of their properties are derived and the relationships among these measures are established, which helps in understanding the essence of knowledge content and the uncertainty of neighborhood decision systems. Finally, a heuristic attribute reduction algorithm is proposed to improve the classification performance of complex data sets. The experimental results on an illustrative example and several public data sets demonstrate that the proposed method is very effective for selecting the most relevant attributes with great classification performance.

]]>Entropy doi: 10.3390/e21020154

Authors: Pragna Mannam Alexander Volkov Robert Paolini Chirikjian Mason

This paper is a study of 2D manipulation without sensing and planning that explores the effects of unplanned randomized action sequences on 2D object pose uncertainty. Our approach follows Erdmann and Mason's work on the sensorless reorienting of an object into a completely determined pose, regardless of its initial pose. While Erdmann and Mason proposed a method using Newtonian mechanics, this paper shows that under some circumstances, a long enough sequence of random actions will also converge toward a determined final pose of the object. This is verified through several simulation and real robot experiments in which randomized action sequences are shown to reduce the entropy of the object pose distribution. The effects of varying object shapes, action sequences, and surface friction are also explored.

]]>Entropy doi: 10.3390/e21020153

Authors: Damien Foster Ralph Kenna Claire Pinettes

The complex zeros of the canonical (fixed walk-length) partition function are calculated for both the self-avoiding trails model and the vertex-interacting self-avoiding walk model, both in bulk and in the presence of an attractive surface. The finite-size behavior of the zeros is used to estimate the location of phase transitions: the collapse transition in the bulk and the adsorption transition in the presence of a surface. The bulk and surface cross-over exponents, ϕ and ϕ_S, are estimated from the scaling behavior of the leading partition function zeros.

]]>Entropy doi: 10.3390/e21020152

Authors: Nibaldo Rodriguez Pablo Alvarez Lida Barba Guillermo Cabrera-Guerrero

Discriminative feature extraction and rolling element bearing failure diagnostics are very important to ensure the reliability of rotating machines. Therefore, in this paper, we propose multi-scale wavelet Shannon entropy as a discriminative fault feature to improve the diagnosis accuracy of bearing faults under variable work conditions. To compute the multi-scale wavelet entropy, we consider integrating the stationary wavelet packet transform with both dispersion (SWPDE) and permutation (SWPPE) entropies. The multi-scale entropy features extracted by our proposed methods are then passed on to the kernel extreme learning machine (KELM) classifier to diagnose bearing failure types with different severities. In the end, both the SWPDE–KELM and the SWPPE–KELM methods are evaluated on two bearing vibration signal databases. We compare these two feature extraction methods to a recently proposed method called stationary wavelet packet singular value entropy (SWPSVE). Based on our results, we can say that the diagnosis accuracy obtained by the SWPDE–KELM method is slightly better than that of the SWPPE–KELM method, and they both significantly outperform the SWPSVE–KELM method.

]]>Entropy doi: 10.3390/e21020151

Authors: Andrés Aragoneses Yingqi Ding

We study the time series of the output intensity of a Raman fiber laser using an ordinal patterns analysis in the laminar-turbulent transition. We look for signatures among consecutive events that indicate when the system changes from triggering low-intensity events to high-intensity events. We set two thresholds, a low one and a high one, to distinguish low-intensity from high-intensity events. We find that while the time series produces low-intensity events (below the low threshold), it shows some preferred temporal patterns before triggering high-intensity events (above the high threshold). The preferred temporal patterns remain the same throughout the pump current range studied, even though two clearly different dynamical regimes are covered (the laminar regime for low pump currents and the turbulent regime for high pump currents). We also find that the turbulent regime shows clearer signatures of determinism than the laminar regime.
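The kind of thresholded ordinal-pattern bookkeeping described above can be sketched as follows; the pattern length m and the single-threshold conditioning are simplifying assumptions rather than the paper's exact protocol:

```python
from collections import Counter

def ordinal_pattern(window):
    """Ordinal pattern of a short window of consecutive events: the argsort of its values."""
    return tuple(sorted(range(len(window)), key=lambda k: window[k]))

def patterns_before_extremes(series, high, m=3):
    """Count the ordinal patterns of the m events that immediately precede each
    high-intensity event (a value above the `high` threshold)."""
    hits = Counter()
    for i in range(m, len(series)):
        if series[i] > high:
            hits[ordinal_pattern(series[i - m:i])] += 1
    return hits
```

A strongly peaked histogram of such patterns, as opposed to a flat one, is the sort of "preferred temporal pattern" signature of determinism the abstract refers to.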

]]>Entropy doi: 10.3390/e21020150

Authors: Podoshvedov

We propose and develop a theory of quantum teleportation of an unknown qubit based on the interaction mechanism between discrete-variable (DV) and continuous-variable (CV) states on a highly transmissive beam splitter (HTBS). This DV-CV interaction mechanism is based on the simultaneous displacement of the DV state, by the coherent components of the hybrid, through displacement amplitudes that are equal in absolute value but opposite in sign, in such a way that all information about the displacement amplitudes is lost upon subsequent registration of photons in the auxiliary modes. In the case of the negative displacement amplitude, the relative phase of the displaced unknown qubit in the measurement number-state basis can flip sign depending on the parity of the basis states, which is akin to the action of a nonlinear effect on the teleported qubit. All measurement outcomes of the quantum teleportation are distinguishable, but the teleported state at Bob's disposal may acquire a predetermined amplitude-distorting factor. Two methods of getting rid of these factors are considered. The quantum teleportation is considered in various interpretations, and a method for increasing the efficiency of the quantum teleportation of an unknown qubit is proposed.

]]>Entropy doi: 10.3390/e21020148

Authors: Boudewijn van Milligen Benjamin Carreras Luis García Javier Nicolau

Heat transport is studied in strongly heated fusion plasmas, far from thermodynamic equilibrium. The radial propagation of perturbations is studied using a technique based on the transfer entropy. Three different magnetic confinement devices are studied, and similar results are obtained. “Minor transport barriers” are detected that tend to form near rational magnetic surfaces, thought to be associated with zonal flows. Occasionally, heat transport “jumps” over these barriers, and this “jumping” behavior seems to increase in intensity when the heating power is raised, suggesting an explanation for the ubiquitous phenomenon of “power degradation” observed in magnetically confined plasmas. Reinterpreting the analysis results in terms of a continuous time random walk, “fast” and “slow” transport channels can be discerned. The cited results can partially be understood in the framework of a resistive Magneto-HydroDynamic model. The picture that emerges shows that plasma self-organization and competing transport mechanisms are essential ingredients for a fuller understanding of heat transport in fusion plasmas.

]]>Entropy doi: 10.3390/e21020149

Authors: X. San Liang

A fundamental problem regarding the storm–jet stream interaction in the extratropical atmosphere is how energy and information are exchanged between scales. While energy transfer has been extensively investigated, the latter has been mostly overlooked, mainly due to a lack of appropriate theory and methodology. Using a recently established rigorous formalism of information flow, this study attempts to examine the problem in the setting of a three-dimensional quasi-geostrophic zonal jet, with storms excited by a set of optimal perturbation modes. We choose for this study a period when the self-sustained oscillation is in quasi-equilibrium, and when the energetics mimic the mid-latitude atmospheric circulation, where available potential energy is cascaded downward to smaller scales and kinetic energy is inversely transferred upward toward larger scales. By inverting a three-dimensional elliptic differential operator, the model is first converted into a low-dimensional dynamical system, whose components correspond to different time scales. The information exchange between the scales is then computed through ensemble prediction. For this particular problem, the resulting cross-scale information flow is mostly from smaller scales to larger scales. That is to say, during this period, this model extratropical atmosphere is dominated by bottom-up causation, as collective patterns emerge out of independent entities and macroscopic thermodynamic properties evolve from random molecular motions. This study makes a first step toward understanding the eddy–mean flow interaction in weather and climate phenomena such as atmospheric blocking, storm tracks, and the North Atlantic Oscillation, to name a few.

]]>Entropy doi: 10.3390/e21020147

Authors: Abdel-Baset A. Mohamed Shoukry S. Hassan Rania A. Alharbey

Wehrl entropy and its density are used to investigate the dynamics of loss of coherence and information in a phase space for an atomic model of a two-photon two-level atom coupled to different radiation reservoirs (namely, normal vacuum (NV), thermal field (TF) and squeezed vacuum (SV) reservoirs). In particular, the quantum interference (QI) effect, due to the 2-photon transition decay channels, has a paramount role in: (i) the atomic inversion decay in the NV case, which behaves as quantum Zeno and anti-Zeno decay effect; (ii) the coherence and information loss in the phase space; and (iii) identifying temporal information entropy squeezing. Results are also sensitive to the initial atomic state.

]]>Entropy doi: 10.3390/e21020146

Authors: Wei-Bing Liao Hongti Zhang Zhi-Yuan Liu Pei-Feng Li Jian-Jun Huang Chun-Yan Yu Yang Lu

Recently, high-entropy alloy thin films (HEATFs) with nanocrystalline structures and high hardness were developed by the magnetron sputtering technique and have exciting potential for making small-structure devices and precision instruments with sizes ranging from nanometers to micrometers. However, the strength and deformation mechanisms are still unclear. In this work, nanocrystalline Al0.3CoCrFeNi HEATFs with a thickness of ~4 &mu;m were prepared. The microstructures of the thin films were comprehensively characterized, and the mechanical properties were systematically studied. It was found that the thin film was smooth, with a roughness of less than 5 nm. The chemical composition of the high entropy alloy thin film was homogeneous with a main single face-centered cubic (FCC) structure. Furthermore, it was observed that the hardness and the yield strength of the high-entropy alloy thin film were about three times those of the bulk samples, and the plastic deformation was inhomogeneous. Our results could provide an in-depth understanding of the mechanics and deformation mechanism for future design of nanocrystalline HEATFs with desired properties.

]]>Entropy doi: 10.3390/e21020145

Authors: Tra Duong Kim Sohaib Kim

This paper proposes a reliable fault diagnosis model for a spherical storage tank. The proposed method first used a blind source separation (BSS) technique to de-noise the input signals so that the signals acquired from a spherical tank under two types of conditions (i.e., normal and crack conditions) were easily distinguishable. BSS split the signals into different sources that provided information about the noise and useful components of the signals. Therefore, an unimpaired signal could be restored from the useful components. From the de-noised signals, wavelet-based fault features, i.e., the relative energy (REWPN) and entropy (EWPN) of a wavelet packet node, were extracted. Finally, these features were used to train one-against-all multiclass support vector machines (OAA MCSVMs), which classified the instances of normal and faulty states of the tank. The efficiency of the proposed fault diagnosis model was examined by visualizing the de-noised signals obtained from the BSS method and its classification performance. The proposed fault diagnostic model was also compared to existing techniques. Experimental results showed that the proposed method outperformed conventional techniques, yielding average classification accuracies of 97.25% and 98.48% for the two datasets used in this study.
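
The two wavelet features named above (the relative energy REWPN and the entropy EWPN of a wavelet packet node) can be sketched directly on coefficient arrays. Here the decomposition itself is assumed done elsewhere (e.g., by a wavelet packet transform), and the normalization is a common textbook choice rather than necessarily the paper's exact definition:

```python
from math import log

def wavelet_node_features(nodes):
    """For each wavelet-packet node (a list of coefficients), return
    (relative energy, Shannon entropy of the normalized squared coefficients)."""
    energies = [sum(c * c for c in node) for node in nodes]
    total = sum(energies)
    features = []
    for node, e in zip(nodes, energies):
        rel = e / total                             # REWPN-style relative energy
        probs = [c * c / e for c in node if c != 0.0]
        ent = sum(p * log(1.0 / p) for p in probs)  # EWPN-style entropy (nats)
        features.append((rel, ent))
    return features

# toy nodes: an impulsive one (low entropy) vs. an evenly spread one (high entropy)
feats = wavelet_node_features([[3.0, 0.0, 0.0, 0.0], [1.0, 1.0, 1.0, 1.0]])
print([(round(r, 3), round(h, 3)) for r, h in feats])  # [(0.692, 0.0), (0.308, 1.386)]
```

Low node entropy flags impulsive (crack-like) energy concentration, which is why such features feed the SVM classifier.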

]]>Entropy doi: 10.3390/e21020144

Authors: Jorge Rosenblatt

We describe society as an out-of-equilibrium probabilistic system: in it, N individuals occupy W resource states and produce entropy S over definite time periods. The resulting thermodynamics are however unusual, because a second entropy, H, measures inequality or diversity (a typically social feature) in the distribution of available resources. A symmetry phase transition takes place at Gini values of 1/3, where realistic distributions become asymmetric. Four constraints act on S: N and W, and two new ones, diversity and interactions between individuals; the latter are determined by the coordinates of a single point in the data, the peak. The occupation number of a job is either zero or one, suggesting Fermi&ndash;Dirac statistics for employment. Contrariwise, an indefinite number of individuals can occupy a state defined as a quantile of income or of age, so Bose&ndash;Einstein statistics may be required. Indistinguishability rather than anonymity of individuals and resources is thus needed. Interactions between individuals define classes of equivalence that happen to coincide with acceptable definitions of social classes or periods in human life. The entropy S is non-extensive and obtainable from data. Theoretical laws are compared to empirical data in four different cases of economic or physiological diversity. Acceptable fits are found for all of them.

]]>Entropy doi: 10.3390/e21020143

Authors: Zhipeng Lin Yuhua Tang Yongjun Zhang

The Recommender System (RS) plays a pivotal role in e-commerce. To improve the performance of RS, review text information has been extensively utilized. However, it is still a challenge for RS to extract the most informative features from a tremendous number of reviews. Another significant issue is the modeling of user&ndash;item interaction, for which capturing high- and low-order interactions simultaneously is rarely considered. In this paper, we design a multi-level attention mechanism to learn the usefulness of reviews and the significance of words by Deep Neural Networks (DNN). In addition, we develop a hybrid prediction structure that integrates Factorization Machine (FM) and DNN to model low-order user&ndash;item interactions as in FM and capture the high-order interactions as in DNN. Based on these two designs, we build a Multi-level Attentional and Hybrid-prediction-based Recommender (MAHR) model for recommendation. Extensive experiments on Amazon and Yelp datasets showed that our approach provides more accurate recommendations than the state-of-the-art recommendation approaches. Furthermore, the verification experiments and explainability study, including the visualization of attention modules and the review-usefulness prediction test, also validated the reasonability of our multi-level attention mechanism and hybrid prediction.

]]>Entropy doi: 10.3390/e21020142

Authors: Ernesto Benítez Rodríguez Luis Manuel Arévalo Aguilar

The concept of disturbance is of transcendental importance in Quantum Mechanics (QM). This key concept has been described in two different ways: the first considers that the disturbance affects observables like x and p, as in Heisenberg&rsquo;s analysis of the measurement process, while the other considers that the disturbance affects the state of the system instead. Entropic information measures have provided a path for studying disturbance in both of these approaches; in fact, the concept was initially studied by employing such entropic measures. In addition, in the last decade, an extensive amount of analysis has appeared and several new definitions of the disturbance concept have emerged. These developments have inspired this concise paper, which gathers the different concepts and definitions that have emerged over time, for a better understanding of this topic.

]]>Entropy doi: 10.3390/e21020141

Authors: Gregg Jaeger

The question of whether virtual quantum particles exist is considered here in light of previous critical analysis and under the assumption that there are particles in the world as described by quantum field theory. The relationship of the classification of particles to quantum-field-theoretic calculations and the diagrammatic aids that are often used in them is clarified. It is pointed out that the distinction between virtual particles and others and, therefore, judgments regarding their reality have been made on the basis of these methods rather than on their physical characteristics. As such, it has obscured the question of their existence. It is here argued that the most influential arguments against the existence of virtual particles but not other particles fail because they are either arguments against the existence of particles in general rather than virtual particles per se, or are dependent on the imposition of classical intuitions on quantum systems, or are simply beside the point. Several reasons are then provided for considering virtual particles real, such as their descriptive, explanatory, and predictive value, and a clearer characterization of virtuality&mdash;one in terms of intermediate states&mdash;that also applies beyond perturbation theory is provided. It is also pointed out that in the role of force mediators, they serve to preclude action-at-a-distance between interacting particles. For these reasons, it is concluded that virtual particles are as real as other quantum particles.

]]>Entropy doi: 10.3390/e21020140

Authors: Simon Wing Jay R. Johnson

Characterizing and modeling processes at the Sun and in the space plasma of our solar system are difficult because the underlying physics is often complex, nonlinear, and not well understood. The drivers of a system are often nonlinearly correlated with one another, which makes it a challenge to understand the relative effects caused by each driver. However, entropy-based information theory can be a valuable tool that can be used to determine the information flow among various parameters, establish causalities, untangle the drivers, and provide observational constraints that can help guide the development of the theories and physics-based models. We review two examples of the application of information theoretic tools to the Sun and the near-Earth space environment. In the first example, the solar wind drivers of radiation belt electrons are investigated using mutual information (MI), conditional mutual information (CMI), and transfer entropy (TE). As previously reported, radiation belt electron flux (Je) is anticorrelated with solar wind density (nsw) with a lag of 1 day. However, this lag time and anticorrelation can be attributed mainly to the Je(t + 2 days) correlation with solar wind velocity (Vsw)(t) and nsw(t + 1 day) anticorrelation with Vsw(t). Analyses of solar wind driving of the magnetosphere need to consider the large lag times, up to 3 days, in the (Vsw, nsw) anticorrelation. Using CMI to remove the effects of Vsw, the response of Je to nsw is 30% smaller and has a lag time &lt;24 h, suggesting that the loss mechanism due to nsw or solar wind dynamic pressure has to start operating in &lt;24 h. Nonstationarity in the system dynamics is investigated using windowed TE. The triangle distribution in Je(t + 2 days) vs. Vsw(t) can be better understood with TE.
In the second example, the previously identified causal parameters of the solar cycle in the Babcock&ndash;Leighton type model such as the solar polar field, meridional flow, polar faculae (proxy for polar field), and flux emergence are investigated using TE. The transfer of information from the polar field to the sunspot number (SSN) peaks at lag times of 3&ndash;4 years. Both the flux emergence and the meridional flow contribute to the polar field, but at different time scales. The polar fields from at least the last 3 cycles contain information about SSN.
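
The difference between MI and CMI that drives the analyses above can be illustrated with a minimal plug-in sketch on synthetic binary data (not the solar wind series): plain MI can miss a dependence that only appears once a third variable is conditioned on.

```python
import random
from collections import Counter
from math import log2

def mutual_info(pairs):
    """Plug-in estimate of I(X;Y) in bits from (x, y) samples."""
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(a for a, _ in pairs)
    py = Counter(b for _, b in pairs)
    return sum((c / n) * log2(c * n / (px[a] * py[b])) for (a, b), c in pxy.items())

def cond_mutual_info(triples):
    """Plug-in estimate of I(X;Y|Z) in bits from (x, y, z) samples."""
    n = len(triples)
    pxyz = Counter(triples)
    pxz = Counter((a, c) for a, _, c in triples)
    pyz = Counter((b, c) for _, b, c in triples)
    pz = Counter(t[2] for t in triples)
    return sum((k / n) * log2(k * pz[c] / (pxz[(a, c)] * pyz[(b, c)]))
               for (a, b, c), k in pxyz.items())

random.seed(1)
xs = [random.randint(0, 1) for _ in range(4000)]
zs = [random.randint(0, 1) for _ in range(4000)]
ys = [x ^ z for x, z in zip(xs, zs)]           # y depends on x only jointly with z
mi = mutual_info(list(zip(xs, ys)))            # near 0 bits: x alone says nothing
cmi = cond_mutual_info(list(zip(xs, ys, zs)))  # near 1 bit once z is conditioned on
```

This is the same logic as removing the effect of Vsw via CMI in the radiation belt example, though the real analysis involves binned continuous data and lags.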

]]>Entropy doi: 10.3390/e21020139

Authors: Noor Khan Zahir Shah Saeed Islam Ilyas Khan Tawfeeq Alkanhal Iskander Tlili

Chemical reaction in mixed convection magnetohydrodynamic second grade nanoliquid thin film flow through a porous medium containing nanoparticles and gyrotactic microorganisms is considered with entropy generation. The stratification phenomena and heat and mass transfer simultaneously take place within the system. Microorganisms are utilized to stabilize the suspended nanoparticles through bioconvection. Due to the chemical reaction of species, the mass transfer increases. The governing equations of the problem are transformed into nonlinear differential equations through similarity variables, which are solved through the well-known homotopy analysis method. The solution is expressed through graphs and illustrations, which show the influences of all the parameters. The residual error graphs confirm the validity of the present work.

]]>Entropy doi: 10.3390/e21020138

Authors: Sun Wang Xu Zhang

For continuous numerical data sets, neighborhood rough sets-based attribute reduction is an important step for improving classification performance. However, most of the traditional reduction algorithms can only handle finite sets, and yield low accuracy and high cardinality. In this paper, a novel attribute reduction method using Lebesgue and entropy measures in neighborhood rough sets is proposed, which has the ability of dealing with continuous numerical data whilst maintaining the original classification information. First, the Fisher score method is employed to eliminate irrelevant attributes to significantly reduce computation complexity for high-dimensional data sets. Then, Lebesgue measure is introduced into neighborhood rough sets to investigate uncertainty measure. In order to analyze the uncertainty and noise of neighborhood decision systems well, based on Lebesgue and entropy measures, some neighborhood entropy-based uncertainty measures are presented, and by combining the algebra view with the information view in neighborhood rough sets, a neighborhood roughness joint entropy is developed in neighborhood decision systems. Moreover, some of their properties are derived and the relationships are established, which help to understand the essence of knowledge and the uncertainty of neighborhood decision systems. Finally, a heuristic attribute reduction algorithm is designed to improve the classification performance of large-scale complex data. The experimental results under an instance and several public data sets show that the proposed method is very effective for selecting the most relevant attributes with high classification accuracy.
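
As a rough sketch of the neighborhood rough set machinery the method builds on, the following uses the generic textbook form (delta-neighborhoods and the positive-region dependency measure); it does not implement the paper's Lebesgue- or entropy-based variants:

```python
def neighborhood(data, i, delta):
    """Indices of samples within Euclidean distance delta of sample i."""
    xi = data[i]
    return [j for j, xj in enumerate(data)
            if sum((a - b) ** 2 for a, b in zip(xi, xj)) ** 0.5 <= delta]

def dependency(data, labels, delta):
    """Fraction of samples whose delta-neighborhood is pure in the decision
    label, i.e. the positive region of the neighborhood decision system."""
    pos = sum(1 for i in range(len(data))
              if all(labels[j] == labels[i] for j in neighborhood(data, i, delta)))
    return pos / len(data)

# two well-separated 1-D classes: fully consistent at a small delta
data = [(0.0,), (0.1,), (0.2,), (1.0,), (1.1,), (1.2,)]
labels = [0, 0, 0, 1, 1, 1]
print(dependency(data, labels, 0.3))   # 1.0: every neighborhood is pure
print(dependency(data, labels, 1.05))  # 0.0: neighborhoods mix the classes
```

Attribute reduction then searches for the smallest attribute subset that keeps this dependency (or an entropy-based analogue) unchanged.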

]]>Entropy doi: 10.3390/e21020137

Authors: Murtadha D. Hssayeni Joohi Jimenez-Shahed Behnaz Ghoraani

The success of medication adjustment in Parkinson&rsquo;s disease (PD) patients with motor fluctuation relies on the knowledge about their fluctuation severity. However, because of the temporal and spatial variability in motor fluctuations, a single clinical examination often fails to capture the spectrum of motor impairment experienced in routine daily life. In this study, we developed an algorithm to estimate the degree of motor fluctuation severity from two wearable sensors&rsquo; data during subjects&rsquo; free body movements. Specifically, we developed a new hybrid feature extraction method to represent the longitudinal changes of motor function from the sensor data. Next, we developed a classification model based on random forest to learn the changes in the patterns of the sensor data as the severity of the motor function changes. We evaluated our algorithm using data from 24 subjects with idiopathic PD as they performed a variety of daily routine activities. A leave-one-subject-out assessment of the algorithm resulted in 83.33% accuracy, indicating that our approach holds great promise for passively detecting the degree of motor fluctuation severity from continuous monitoring of an individual&rsquo;s free body movements. Such a sensor-based assessment system and algorithm combination could provide the objective and comprehensive information about the fluctuation severity that can be used by the treating physician to effectively adjust therapy for PD patients with troublesome motor fluctuation.

]]>Entropy doi: 10.3390/e21020136

Authors: Shize Xiao Xiaohui Cheng Zhou Yang

This paper establishes a non-equilibrium thermodynamic constitutive model that can predict the undrained shear behavior of saturated sand. Starting from the basic laws of thermodynamics, the model does not require the classical concepts in elasto-plastic models, such as the yield function, the flow rule, and the hardening rule. It is also different from the existing thermodynamic constitutive models in the soil mechanics literature. The model does not use a complex nonlinear elastic potential, as is usual, and instead introduces a coupling energy dissipative mechanism between the viscosity and elasticity relaxation, which is essential in granular materials. This model was then used to simulate the undrained shear test of Toyoura sand. The model can predict the critical state, dilatancy-contraction and hardening-softening characteristics of sand during undrained triaxial shearing.

]]>Entropy doi: 10.3390/e21020135

Authors: Zezhong Feng Jun Ma Xiaodong Wang Jiande Wu Chengjiang Zhou

The Empirical Wavelet Transform (EWT), which has a reliable mathematical derivation process and can adaptively decompose signals, has been widely used in mechanical applications, EEG, seismic detection and other fields. However, the EWT still faces the problem of how to optimally divide the Fourier spectrum during the application process. When there is noise interference in the analyzed signal, the parameterless scale-space histogram method will divide the spectrum into a variety of narrow bands, which will weaken or even fail to extract the fault modulation information. To accurately determine the optimal resonant demodulation frequency band, this paper proposes a method for applying Adaptive Average Spectral Negentropy (AASN) to EWT analysis (AEWT): Firstly, the spectrum is segmented by the parameterless clustering scale-space histogram method to obtain the corresponding empirical mode. Then, by comprehensively considering the Average Spectral Negentropy (ASN) index and correlation coefficient index on each mode, the correlation coefficient is used to adjust the ASN value of each mode, and the IMF with the highest value is used as the center frequency band of the fault information. Finally, a new resonant frequency band is reconstructed for the envelope demodulation analysis. The experimental results of different background noise intensities show that the proposed method can effectively detect the repetitive transients in the signal.

]]>Entropy doi: 10.3390/e21020134

Authors: Kishor Bharti Maharshi Ray Leong-Chuan Kwek

Quantum communication and quantum computation form the two crucial facets of quantum information theory. While entanglement and its manifestation as Bell non-locality have been proved to be vital for communication tasks, contextuality (a generalisation of Bell non-locality) has been shown to be the crucial resource behind various models of quantum computation. The practical and fundamental aspects of these non-classical resources are still poorly understood despite decades of research. We explore non-classical correlations exhibited by some of these quantum as well as super-quantum resources in the n-cycle setting. In particular, we focus on correlations manifested by the Kochen&ndash;Specker&ndash;Klyachko box (KS box), scenarios involving n-cycle non-contextuality inequalities and Popescu&ndash;Rohrlich boxes (PR box). We provide the criteria for optimal classical simulation of a KS box of arbitrary dimension n. The non-contextuality inequalities are analysed for the n-cycle setting, and the condition for the quantum violation for odd as well as even n-cycles is discussed. We offer a simple extension of even-cycle non-contextuality inequalities to the phase space case. Furthermore, we simulate a generalised PR box using a KS box and provide some interesting insights. Towards the end, we discuss a few possible interesting open problems for future research. Our work connects generalised PR boxes, arbitrary dimensional KS boxes, and n-cycle non-contextuality inequalities and thus provides the pathway for the study of these contextual and nonlocal resources at their junction.
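
To make the PR-box correlations concrete, here is a minimal sketch using the standard textbook definition (outputs uniform subject to a XOR b = x AND y) and the CHSH expression; it is illustrative and not code from the paper:

```python
from itertools import product

def pr_box(a, b, x, y):
    """PR-box conditional distribution p(a,b|x,y): uniform over the
    output pairs satisfying a XOR b = x AND y."""
    return 0.5 if (a ^ b) == (x & y) else 0.0

def local_box(a, b, x, y):
    """Deterministic local strategy: both parties always output 0."""
    return 1.0 if (a, b) == (0, 0) else 0.0

def chsh(box):
    """CHSH value S = E(0,0) + E(0,1) + E(1,0) - E(1,1)."""
    def E(x, y):
        return sum(box(a, b, x, y) * (1 if a == b else -1)
                   for a, b in product((0, 1), repeat=2))
    return E(0, 0) + E(0, 1) + E(1, 0) - E(1, 1)

print(chsh(pr_box))    # 4.0: the algebraic maximum, above Tsirelson's quantum bound
print(chsh(local_box)) # 2.0: the classical (local) bound
```

Quantum correlations sit strictly between the two, at most 2*sqrt(2), which is what makes the PR box a super-quantum resource.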

]]>Entropy doi: 10.3390/e21020133

Authors: Junjie Ren Qiao Zheng Ping Guo Chunlan Zhao

In the development of tight gas reservoirs, gas flow through porous media usually takes place deep underground with multiple mechanisms, including gas slippage and stress sensitivity of permeability and porosity. However, little work has been done to simultaneously incorporate these mechanisms in the lattice Boltzmann model for simulating gas flow through porous media. This paper presents a lattice Boltzmann model for gas flow through porous media with a consideration of these effects. The apparent permeability and porosity are calculated based on the intrinsic permeability, intrinsic porosity, permeability modulus, porosity sensitivity exponent, and pressure. Gas flow in a two-dimensional channel filled with a homogeneous porous medium is simulated to validate the present model. Simulation results reveal that gas slippage can enhance the flow rate in tight porous media, while stress sensitivity of permeability and porosity reduces the flow rate. The simulation results of gas flow in a porous medium with different mineral components show that the gas slippage and stress sensitivity of permeability and porosity not only affect the global velocity magnitude, but also have an effect on the flow field. In addition, gas flow in a porous medium with fractures is also investigated. It is found that the fractures along the pressure-gradient direction significantly enhance the total flow rate, while the fractures perpendicular to the pressure-gradient direction have little effect on the global permeability of the porous medium. For the porous medium without fractures, the gas-slippage effect is a major influence factor on the global permeability, especially under low pressure; for the porous medium with fractures, the stress-sensitivity effect plays a more important role in gas flow.

]]>Entropy doi: 10.3390/e21020132

Authors: Gengxi Zhang Zhenghong Zhou Xiaoling Su Olusola O. Ayantobo

Streamflow forecasting is vital for reservoir operation, flood control, power generation, river ecological restoration, irrigation and navigation. Although monthly streamflow time series are stochastic, they also exhibit seasonal and periodic patterns. Using maximum Burg entropy spectral analysis (BESA), maximum configurational entropy spectral analysis (CESA) and minimum relative entropy spectral analysis (RESA), forecasting models for monthly streamflow series were constructed for five hydrological stations in northwest China. The evaluation criteria of average relative error (RE), root mean square error (RMSE), correlation coefficient (R) and determination coefficient (DC) were selected as performance metrics. Results indicated that the RESA model had the highest forecasting accuracy, followed by the CESA model. However, the BESA model had the highest forecasting accuracy in a low-flow period, and the prediction accuracies of the RESA and CESA models in the flood season were relatively higher. In future research, these entropy spectral analysis methods can further be applied to other rivers to verify their applicability to the forecasting of monthly streamflow in China.
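
The four evaluation criteria are standard; a small sketch of one common set of definitions follows. Treating DC as the Nash-Sutcliffe-style determination coefficient is an assumption here, since the abstract does not spell out the formulas:

```python
from math import sqrt

def forecast_metrics(obs, pred):
    """Average relative error (RE), root mean square error (RMSE),
    correlation coefficient (R) and determination coefficient (DC)."""
    n = len(obs)
    re = sum(abs(p - o) / o for o, p in zip(obs, pred)) / n
    rmse = sqrt(sum((p - o) ** 2 for o, p in zip(obs, pred)) / n)
    mo, mp = sum(obs) / n, sum(pred) / n
    cov = sum((o - mo) * (p - mp) for o, p in zip(obs, pred))
    r = cov / sqrt(sum((o - mo) ** 2 for o in obs) * sum((p - mp) ** 2 for p in pred))
    dc = 1.0 - sum((p - o) ** 2 for o, p in zip(obs, pred)) / sum((o - mo) ** 2 for o in obs)
    return re, rmse, r, dc

obs = [10.0, 20.0, 30.0, 40.0]    # hypothetical monthly flows
pred = [11.0, 19.0, 31.0, 41.0]
re, rmse, r, dc = forecast_metrics(obs, pred)
print(round(rmse, 3), round(dc, 3))   # 1.0 0.992
```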

]]>Entropy doi: 10.3390/e21020131

Authors: Michael Widom Michael Gao

The information required to specify a liquid structure equals, in suitable units, its thermodynamic entropy. Hence, an expansion of the entropy in terms of multi-particle correlation functions can be interpreted as a hierarchy of information measures. Utilizing first principles molecular dynamics simulations, we simulate the structure of liquid aluminum to obtain its density, pair and triplet correlation functions, allowing us to approximate the experimentally measured entropy and relate the excess entropy to the information content of the correlation functions. We discuss the accuracy and convergence of the method.
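
The pair (two-body) term of that entropy expansion is commonly written as S2/(N k_B) = -2*pi*rho * integral of [g ln g - (g - 1)] r^2 dr. The sketch below evaluates this generic textbook form numerically; it is not the authors' first-principles pipeline:

```python
import math

def s2_excess_entropy(r, g, rho):
    """Two-body excess entropy per particle, in units of k_B, from the pair
    correlation function g(r) sampled on grid r (trapezoidal rule)."""
    def f(ri, gi):
        g_ln_g = gi * math.log(gi) if gi > 0.0 else 0.0  # x ln x -> 0 as x -> 0
        return (g_ln_g - (gi - 1.0)) * ri * ri
    vals = [f(ri, gi) for ri, gi in zip(r, g)]
    integral = sum(0.5 * (vals[i] + vals[i + 1]) * (r[i + 1] - r[i])
                   for i in range(len(r) - 1))
    return -2.0 * math.pi * rho * integral

r = [i * 0.001 for i in range(5001)]
# ideal gas, g(r) = 1 everywhere: the integrand vanishes and S2 = 0
assert s2_excess_entropy(r, [1.0] * len(r), rho=1.0) == 0.0
# hard-core step: g = 0 for r < 1, g = 1 beyond; analytically S2 = -2*pi*rho/3
g = [0.0 if ri < 1.0 else 1.0 for ri in r]
s2 = s2_excess_entropy(r, g, rho=1.0)   # close to -2*pi/3, about -2.09
```

The triplet term adds the next correction in the hierarchy; each term can be read as the information carried by the corresponding correlation function.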

]]>Entropy doi: 10.3390/e21020130

Authors: Entropy Editorial Office

On behalf of the Editor-in-Chief, Prof. Dr. Kevin H. Knuth, we are pleased to announce the Entropy Best Paper Award for 2018 [...]

]]>Entropy doi: 10.3390/e21020129

Authors: Florian Ries Yongxiang Li Kaushal Nishad Johannes Janicka Amsini Sadiki

In this work, entropy generation analysis is applied to characterize and optimize a turbulent impinging jet on a heated solid surface. In particular, the influence of plate inclinations and Reynolds numbers on the turbulent heat and fluid flow properties and its impact on the thermodynamic performance of such flow arrangements are numerically investigated. For this purpose, novel model equations are derived in the frame of Large Eddy Simulation (LES) that allows calculation of local entropy generation rates in a post-processing phase including the effect of unresolved subgrid-scale irreversibilities. From this LES-based study, distinctive features of heat and flow dynamics of the impinging fluid are detected and optimal operating designs for jet impingement cooling are identified. It turned out that (1) the location of the stagnation point and that of the maximal Nusselt number differ in the case of plate inclination; (2) predominantly the impinged wall acts as a strong source of irreversibility; and (3) a flow arrangement with a jet impinging normally on the heated surface allows the most efficient use of energy which is associated with lowest exergy lost. Furthermore, it is found that increasing the Reynolds number intensifies the heat transfer and upgrades the second law efficiency of such thermal systems. Thereby, the thermal efficiency enhancement can overwhelm the frictional exergy loss.

]]>Entropy doi: 10.3390/e21020128

Authors: Aline Viol Fernanda Palhano-Fontes Heloisa Onias Draulio B. de Araujo Philipp Hövel Gandhi M. Viswanathan

With the aim of further advancing the understanding of the human brain&rsquo;s functional connectivity, we propose a network metric which we term the geodesic entropy. This metric quantifies the Shannon entropy of the distance distribution to a specific node from all other nodes. It allows one to characterize the influence exerted on a specific node, considering statistics of the overall network structure. The measurement and characterization of this structural information have the potential to greatly improve our understanding of sustained activity and other emergent behaviors in networks. We apply this method to study how the psychedelic infusion Ayahuasca affects the functional connectivity of the human brain in the resting state. We show that the geodesic entropy is able to differentiate functional networks of the human brain associated with two different states of consciousness in the awake resting state: (i) the ordinary state and (ii) a state altered by ingestion of Ayahuasca. The functional brain networks from subjects in the altered state have, on average, a larger geodesic entropy compared to the ordinary state. Finally, we discuss why the geodesic entropy may bring even further valuable insights into the study of the human brain and other empirical networks.
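
Since the metric is defined as the Shannon entropy of the shortest-path distance distribution seen from a node, a minimal sketch on toy graphs (adjacency lists, BFS distances) may help:

```python
from collections import deque, Counter
from math import log2

def geodesic_entropy(adj, node):
    """Shannon entropy (bits) of the distribution of shortest-path
    distances from `node` to every other reachable node."""
    dist = {node: 0}
    queue = deque([node])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    counts = Counter(d for n, d in dist.items() if n != node)
    total = sum(counts.values())
    return sum((c / total) * log2(total / c) for c in counts.values())

star = {0: [1, 2, 3, 4], 1: [0], 2: [0], 3: [0], 4: [0]}
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
print(geodesic_entropy(star, 0))   # 0.0: every node sits at distance 1 from the hub
print(geodesic_entropy(path, 0))   # 2.0: distances 1..4 are equally likely
```

A central, well-connected node thus sees a narrow distance distribution (low entropy), while a peripheral node sees a broad one (high entropy).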

]]>Entropy doi: 10.3390/e21020127

Authors: Jose Diazdelacruz Miguel Angel Martin-Delgado

A physical system out of thermal equilibrium is a resource for obtaining useful work when a heat bath at some temperature is available. Information Heat Engines are the devices which generalize the Szilard cylinders and make use of the celebrated Maxwell demons to this end. In this paper, we consider a thermo-chemical reservoir of electrons which can be exchanged for entropy and work. Qubits are used as messengers between electron reservoirs to implement long-range voltage transformers with neither electrical nor magnetic interactions between the primary and secondary circuits. When they are at different temperatures, the transformers work according to Carnot cycles. A generalization is carried out to consider an electrical network where quantum techniques can furnish additional security.

]]>Entropy doi: 10.3390/e21020126

Authors: Martina Formichini Giulio Cimini Emanuele Pugliese Andrea Gabrielli

In this work we aim at identifying combinations of technological advancements that reveal the presence of local capabilities for a given industrial production. To this end, we generated a multilayer network using country-level patent and trade data, and performed motif-based analysis on this network using a statistical-validation approach derived from maximum-entropy arguments. We show that in many cases the signal far exceeds the noise, providing robust evidence of synergies between different technologies that can lead to a competitive advantage in specific markets. Our results can be highly useful for policymakers to inform industrial and innovation policies.

]]>Entropy doi: 10.3390/e21020125

Authors: James A. Rodger

This paper investigates the underlying driving force in strategic decision-making. From a conceptual standpoint, few studies have empirically examined the decision-maker&rsquo;s intrinsic state composed of entropy and uncertainty. This study examines a mutual information theory approach integrated into a state of qualia complexity that minimizes exclusion and maximizes the interactions of the information system and its dynamic environment via logical metonymy, illusion, and epigenetics. The article questions whether decision-makers at all levels of the organization are responding from the consciousness of an objective quale or from a more subjective qualia awareness in the narrow-sense perspective of individual instances of their conscious experience. To quantify this research question, we explore several hypotheses revolving around strategic information system decisions. In this research, we posit that the eigenvalues of factor analysis along with the reduction in the uncertainty coefficients of the qualia entropy will be balanced by the quale enthalpy of our information theory structural equation model of trust, flexibility, expertise, top management support, and competitive advantage performance. We operationalize the integration of the aforementioned top management support, information systems competencies, and competitive advantage performance concepts into the qualia consciousness awareness and information theory quale framework.

]]>Entropy doi: 10.3390/e21020124

Authors: Václav Uruba

The role of energy and entropy in the decomposition of turbulent velocity flow-fields is shown in this paper. Decomposition methods based on the energy concept are taken into account&mdash;proper orthogonal decomposition (POD) and its extension bi-orthogonal decomposition (BOD). The methods are well known; however, various versions are used and the interpretation of results is not straightforward. To make this clearer, the specific definition of modes is suggested and specified; moreover, energy- and entropy-motivated views on the decomposed modes are presented. This concept could offer new possibilities in the physical interpretation of modes and in reduced-order modeling (ROM) strategy efficiency evaluation.

]]>Entropy doi: 10.3390/e21020123

Authors: Zhongfan Zhu Jingshan Yu

In the research field of river dynamics, the thickness of the bed-load layer is an important parameter in determining sediment discharge in open channels. Some studies have estimated the bed-load thickness from theoretical and/or experimental perspectives. This study attempts to propose a mathematical formula for the bed-load thickness by using the Tsallis entropy theory. Assuming the bed-load thickness is a random variable and using the method of maximization of the entropy function, the present study derives an explicit expression for the thickness of the bed-load layer as a function of non-dimensional shear stress, by adopting a hypothesis regarding the cumulative distribution function of the bed-load thickness. This expression is verified against six experimental datasets and is also compared with existing deterministic models and the Shannon entropy-based expression. It has been found that there is good agreement between the derived expression and the experimental data, and that the derived expression has a better fitting accuracy than some existing deterministic models. It has also been found that the derived Tsallis entropy-based expression has a prediction ability for experimental data comparable to that of the Shannon entropy-based expression. Finally, the impacts of the mass density of the particle and the particle diameter on the bed-load thickness in open channels are also discussed based on this derived expression.
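
For reference, the Tsallis entropy functional being maximized has the standard form S_q = (1 - sum_i p_i^q)/(q - 1). The sketch below implements only this functional, not the paper's constrained maximization or the resulting thickness formula:

```python
from math import log

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum p_i**q) / (q - 1);
    reduces to the Shannon entropy (in nats) in the limit q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return -sum(pi * log(pi) for pi in p if pi > 0.0)
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

uniform = [0.25] * 4
print(tsallis_entropy(uniform, 2.0))             # 0.75
print(round(tsallis_entropy(uniform, 1.0), 4))   # 1.3863, i.e. ln 4 (Shannon limit)
```

The extra parameter q is what gives the Tsallis-based derivation more flexibility than the Shannon-based one when fitting the shear-stress dependence.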

]]>Entropy doi: 10.3390/e21020122

Authors: Yuchen Sun Boren Ke Yulin Li Kai Yang Mingqi Yang Wei Ji Zhengyi Fu

In this study, an equiatomic CoCrNiCuZn high-entropy alloy (HEA) was prepared by mechanical alloying (MA), and the phases, microstructures, and thermal properties of the alloy powder were explored. The results suggest that a solid solution with a body-centered cubic (BCC) phase and a crystallite size of 10 nm formed after 60 h of milling. Subsequently, the alloy powder was consolidated by spark plasma sintering (SPS) at different temperatures (600 &deg;C, 700 &deg;C, 800 &deg;C, and 900 &deg;C). Two kinds of face-centered cubic (FCC) phases co-existed in the as-sintered samples. In addition, the Vickers hardness and compressive strength of the alloy sintered at 900 &deg;C were 615 HV and 2121 MPa, respectively, indicating excellent mechanical properties.

]]>Entropy doi: 10.3390/e21020121

Authors: Yongsheng Qi Xuebin Meng Chenxi Lu Xuejin Gao Lin Wang

Multiple phases with phase-to-phase transitions are an important characteristic of many batch processes. Traditional algorithms account for the linear characteristics between phases but neglect nonlinearities, which can lead to inaccurate and inefficient monitoring. The focus of this paper is nonlinear multi-phase batch processes. A similarity metric is defined based on kernel entropy component analysis (KECA), and a KECA similarity-based method is proposed for phase division and fault monitoring. First, nonlinear characteristics are extracted in feature space by performing KECA on each preprocessed time-slice data matrix, and phase division is achieved from the similarity variation of the extracted feature information. Then, a series of KECA models and slide-KECA models are established for steady and transition phases, respectively, which reflect the diversity of transitional characteristics objectively and deal well with the stage-transition monitoring problem in multistage batch processes. Next, to overcome the problem that the traditional contribution plot cannot be applied to the kernel mapping space, a nonlinear contribution plot diagnosis algorithm is proposed, which is simpler, more intuitive, and easier to implement than the traditional one. Finally, simulations are performed on penicillin fermentation and an industrial application. Specifically, the proposed method detects the abnormal agitation power and the abnormal substrate supply at 47 h and 86 h, respectively. Compared with traditional methods, it has better real-time performance and higher efficiency. The results demonstrate the ability of the proposed method to detect faults accurately and effectively in practice.
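The KECA step referenced above can be sketched as follows (a generic formulation; the RBF kernel width, the placeholder data, and the component count are illustrative assumptions, and the paper's slide-KECA models and similarity metric are not reproduced). Eigenpairs of the kernel matrix are ranked by their contribution to a Rényi entropy estimate rather than by eigenvalue alone:

```python
import numpy as np

def keca(X, n_components=2, sigma=1.0):
    """Kernel entropy component analysis (generic sketch).

    Ranks kernel eigenpairs by their contribution to the Renyi entropy
    estimate V = (1/N^2) * sum_i lambda_i * (1^T e_i)^2 and projects the
    data onto the top-ranked components."""
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq / (2.0 * sigma**2))          # RBF kernel matrix
    lam, E = np.linalg.eigh(K)                  # ascending eigenvalues
    lam, E = lam[::-1], E[:, ::-1]              # sort descending
    contrib = lam * E.sum(axis=0) ** 2          # entropy contribution per pair
    idx = np.argsort(contrib)[::-1][:n_components]
    return E[:, idx] * np.sqrt(np.maximum(lam[idx], 0.0))

rng = np.random.default_rng(0)
Z = keca(rng.standard_normal((30, 3)))          # 30 samples -> 30 x 2 scores
print(Z.shape)
```

The key difference from kernel PCA is the ranking by entropy contribution, so a component with a smaller eigenvalue can outrank a larger one if it carries more of the entropy estimate.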

]]>Entropy doi: 10.3390/e21020120

Authors: Ping Hou Jun Hu Jie Gao Peican Zhu

In this paper, the problem of stability analysis for memristor-based complex-valued neural networks (MCVNNs) with time-varying delays is investigated, with a focus on exponential stability. By means of Brouwer&rsquo;s fixed-point theorem and the M-matrix, the existence, uniqueness, and exponential stability of the equilibrium point of MCVNNs are studied, and several sufficient conditions are obtained. In particular, these results apply to general MCVNNs, whether or not the activation functions can be explicitly separated into real and imaginary parts. Two numerical simulation examples are provided to illustrate the effectiveness of the theoretical results.

]]>Entropy doi: 10.3390/e21020119

Authors: Yusuke Uchiyama Takanori Kadoya Kei Nakagawa

Risk diversification is one of the dominant concerns for portfolio managers. Various portfolio constructions have been proposed to minimize the risk of the portfolio under constraints, including constraints on expected returns. We propose a portfolio construction method that incorporates complex-valued principal component analysis into risk diversification portfolio construction. The proposed method was verified to outperform the conventional risk parity and risk diversification portfolio constructions.

]]>Entropy doi: 10.3390/e21020118

Authors: Qinghua Huang

Seismicity pattern changes associated with strong earthquakes are an interesting topic with potential applications for natural hazard mitigation. As a retrospective case study of the Ms7.3 Yutian earthquake, an inland normal faulting event that occurred on 21 March 2008, the Region-Time-Length (RTL) method is applied to the seismological data of the China Earthquake Administration (CEA) to analyze the features of the seismicity pattern changes before the Yutian earthquake. The temporal variations of the RTL parameters at the earthquake epicenter showed that a quiescence anomaly of seismicity appeared in 2005. The Yutian main shock did not occur immediately after the local seismicity recovered to the background level, but with a delay of about two years. The spatial variations of seismic quiescence indicated that an anomalous zone of seismic quiescence appeared near the Yutian epicentral region in 2005, consistent with the result obtained from the temporal changes of seismicity. These spatio-temporal seismicity changes prior to the inland normal faulting Yutian earthquake show features similar to those reported for some past strong earthquakes with inland strike-slip or thrust faulting. This study may provide useful information for understanding the seismogenic evolution of strong earthquakes.

]]>Entropy doi: 10.3390/e21020117

Authors: Li Zou Yibo Sun Xinhua Yang

In order to obtain a comprehensive assessment of the factors influencing fatigue life and to further improve the accuracy of fatigue life prediction for welded joints, soft computing methods, including an entropy-based neighborhood rough set reduction algorithm, the particle swarm optimization (PSO) algorithm, and the support vector regression machine (SVRM), are combined to construct a fatigue life prediction model for titanium alloy welded joints. Using the entropy-based neighborhood rough set reduction algorithm, the factors influencing the fatigue life of titanium alloy welded joints, such as joint type, plate thickness, etc., are analyzed and reduction results are obtained. Fatigue characteristic domains are then proposed and determined according to the reduction results, and the PSO-SVRM model for fatigue life prediction of titanium alloy welded joints is established in the suggested fatigue characteristic domains. Experimental results show that, by taking the impact of joint type into account, the PSO-SVRM model can better predict the fatigue life of titanium alloy welded joints. Unlike the conventional least-squares S-N curve fitting method, the PSO-SVRM model captures the relationship between fatigue life and its influencing factors in multidimensional space; it can therefore predict the fatigue life of titanium alloy welded joints more accurately and thus aids the reliability design of the structure.

]]>Entropy doi: 10.3390/e21020116

Authors: Hakan F. Oztop Mohammed A. Almeshaal Lioua Kolsi Mohammed Mehdi Rashidi Mohamed E. Ali

A numerical study of natural convection in a cubical cavity with partial top and bottom openings is performed in this paper. One of the vertical walls of the cavity is held at a higher temperature than the opposite one; the remaining walls are perfectly insulated. Three-dimensional simulations of the governing equations have been performed using a finite volume technique. The results are presented for different parameters, such as opening length and Rayleigh number. It is observed that the heat transfer rate and fluid flow can be controlled via the opening ratio and the Rayleigh number.

]]>Entropy doi: 10.3390/e21020115

Authors: Bin Ju Haijiao Zhang Yongbin Liu Donghui Pan Ping Zheng Lanbing Xu Guoli Li

In this study, a nonlinear analysis method called improved information entropy (IIE) is proposed, based on constructing a special probability mass function for the normalized Shannon entropy analysis of a time series. The definition is applied directly to several typical time series, and the characteristics of IIE are analyzed. The method can distinguish different kinds of signals and reflects the complexity of one-dimensional time series with high sensitivity to changes in the signal. The method is then applied to the fault diagnosis of a rolling bearing. Experimental results show that the method can effectively extract the sensitive characteristics of the bearing running state, runs quickly, and has minimal parameter requirements.
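For context, a plain Shannon entropy computed from a histogram-based probability mass function looks like this (a baseline illustration only; the paper's IIE constructs its own special probability mass function, and the bin count here is an arbitrary choice):

```python
import numpy as np

def hist_entropy(x, bins=16):
    """Shannon entropy of a 1-D time series via a normalized histogram.

    Baseline illustration only; the paper's improved information
    entropy (IIE) builds a special probability mass function instead."""
    hist, _ = np.histogram(x, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]                      # treat 0 * log(0) as 0
    return float(-np.sum(p * np.log(p)))

rng = np.random.default_rng(1)
flat = hist_entropy(np.full(5000, 1.0))         # constant signal: zero entropy
noisy = hist_entropy(rng.standard_normal(5000)) # noise: positive entropy
print(flat, noisy)
```

Any entropy of this family is bounded above by log(bins), which is why a normalization step, as in the paper, is useful when comparing signals.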

]]>Entropy doi: 10.3390/e21020114

Authors: Jiří Zýka Jaroslav Málek Jaroslav Veselý František Lukáč Jakub Čížek Jan Kuriplach Oksana Melikhova

Refractory high entropy alloys (HEAs) are promising materials for high temperature applications. This work presents investigations of the room temperature tensile mechanical properties of selected three- and four-element medium entropy alloys (MEAs) derived from the HfNbTaTiZr system. Tensile testing was combined with fractographic and microstructure analysis using scanning electron microscopy (SEM), wavelength dispersive spectroscopy (WDS), and X-ray powder diffraction (XRD). The five-element HEA HfNbTaTiZr exhibits the best combination of strength and elongation, while the four- and three-element MEAs have lower strength. Some of them are ductile and some brittle, depending on the microstructure. The simultaneous presence of Ta and Zr in the alloy resulted in a significant reduction of ductility, caused by a reduction of the BCC phase content. Precipitation of Ta-rich particles on grain boundaries further reduces the maximum elongation to failure, down to zero.

]]>Entropy doi: 10.3390/e21020113

Authors: Jan Walleczek Gerhard Grössing Paavo Pylkkänen Basil Hiley

Emergent quantum mechanics (EmQM) explores the possibility of an ontology for quantum mechanics. The resurgence of interest in realist approaches to quantum mechanics challenges the standard textbook view, which represents an operationalist approach. The possibility of an ontological, i.e., realist, quantum mechanics was first introduced with the original de Broglie&ndash;Bohm theory, which has also been developed in another context as Bohmian mechanics. This Editorial introduces a Special Issue featuring contributions which were invited as part of the David Bohm Centennial symposium of the EmQM conference series (www.emqm17.org). Questions directing the EmQM research agenda are: Is reality intrinsically random or fundamentally interconnected? Is the universe local or nonlocal? Might a radically new conception of reality include a form of quantum causality or quantum ontology? What is the role of the experimenter agent in ontological quantum mechanics? The Special Issue also includes research examining ontological propositions that are not based on Bohm-type nonlocality; these include, for example, local yet time-symmetric ontologies, such as quantum models based upon retrocausality. This Editorial provides topical overviews of the thirty-one contributions, which are organized into seven categories to provide orientation.

]]>Entropy doi: 10.3390/e21020112

Authors: Jan Korbel Rudolf Hanel Stefan Thurner

In the world of generalized entropies&mdash;which, for example, play a role in physical systems with sub- and super-exponential phase space growth per degree of freedom&mdash;there are two ways of implementing constraints in the maximum entropy principle: linear and escort constraints. Both appear naturally in different contexts. Linear constraints appear, e.g., in physical systems, when additional information about the system is available through higher moments. Escort distributions appear naturally in the context of multifractals and information geometry. It was shown recently that there exists a fundamental duality that relates both approaches on the basis of the corresponding deformed logarithms (deformed-log duality). Here, we show that there exists another duality that arises in the context of information geometry, relating the Fisher information of ϕ-deformed exponential families that correspond to linear constraints (as studied by J. Naudts) to those that are based on escort constraints (as studied by S.-I. Amari). We explicitly demonstrate this information geometric duality for the case of (c,d)-entropy, which covers all situations compatible with the first three Shannon&ndash;Khinchin axioms and includes Shannon, Tsallis, and Anteneodo&ndash;Plastino entropy, and many more, as special cases. Finally, we discuss the relation between the deformed-log duality and the information geometric duality, and note that the escort distributions arising in these two dualities are generally different and only coincide in the case of the Tsallis deformation.

]]>Entropy doi: 10.3390/e21020111

Authors: Sheng-Wen Li

Macroscopic many-body systems always exhibit irreversible behavior. In principle, however, the underlying microscopic dynamics of a many-body system, either the (quantum) von Neumann equation or the (classical) Liouville equation, guarantees that the entropy of an isolated system does not change with time, which seems to contradict macroscopic irreversibility. We notice that the macroscopic entropy increase in standard thermodynamics is in fact associated with the production of correlation inside the full ensemble state of the whole system. In open systems, the irreversible entropy production of the open system can be proved equivalent to the correlation production between the open system and its environment. During the free diffusion of an isolated ideal gas, the correlation between the spatial and momentum distributions increases monotonically and well reproduces the entropy increase of standard thermodynamics. In the presence of particle collisions, the single-particle distribution always approaches the Maxwell-Boltzmann distribution as its steady state, and its entropy increase indeed indicates correlation production between the particles. In all these examples, the total entropy of the whole isolated system remains constant, while the correlation production reproduces the irreversible entropy increase of standard macroscopic thermodynamics. In this sense, macroscopic irreversibility and microscopic reversibility no longer contradict each other.

]]>Entropy doi: 10.3390/e21020110

Authors: Stephen Taylor

Information geometry provides a correspondence between differential geometry and statistics through the Fisher information matrix. In particular, given two models from the same parametric family of distributions, one can define the distance between these models as the length of the geodesic connecting them in a Riemannian manifold whose metric is given by the model&rsquo;s Fisher information matrix. One limitation that has hindered the adoption of this similarity measure in practical applications is that the Fisher distance is typically difficult to compute in a robust manner. We review such complications and provide a general form of the distance function for one-parameter models. We next focus on higher-dimensional extreme value models, including the generalized Pareto and generalized extreme value distributions, which will be used in financial risk applications. Specifically, we first develop a technique to identify the nearest neighbors of a target security, in the sense that their best-fit model distributions have minimal Fisher distance to the target. Second, we develop a hierarchical clustering technique that utilizes the Fisher distance. Specifically, we compare generalized extreme value distributions fit to block maxima of a set of equity loss distributions and group together securities whose worst single-day yearly loss distributions exhibit similarities.
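For a one-parameter family, the geodesic distance reduces to a one-dimensional integral of the square root of the Fisher information, which is straightforward to evaluate numerically. A minimal sketch (illustrative only; the exponential distribution with rate lam, where I(lam) = 1/lam^2 gives the closed form |ln(lam2/lam1)|, is used as a check):

```python
import numpy as np

def fisher_distance_1d(fisher_info, t1, t2, n=100001):
    """Fisher-Rao distance d = | integral_{t1}^{t2} sqrt(I(t)) dt |
    for a one-parameter family, via trapezoidal integration."""
    t = np.linspace(t1, t2, n)
    f = np.sqrt(fisher_info(t))
    dt = t[1] - t[0]
    return abs(float(np.sum((f[:-1] + f[1:]) * dt / 2.0)))

# Exponential distribution with rate lam: I(lam) = 1 / lam**2,
# so the distance between rates 1 and 2 should equal log 2.
d = fisher_distance_1d(lambda lam: 1.0 / lam**2, 1.0, 2.0)
print(d)
```

The robustness issues the abstract mentions arise in higher dimensions, where the geodesic itself must be solved for; in one dimension the path is the parameter interval and only quadrature is needed.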

]]>Entropy doi: 10.3390/e21020109

Authors: Mohit Thakur Gerhard Kramer

Standard upper and lower bounds on the capacity of relay channels are cut-set (CS), decode-forward (DF), and quantize-forward (QF) rates. For real additive white Gaussian noise (AWGN) multicast relay channels with one source node and one relay node, these bounds are shown to be quasi-concave in the receiver signal-to-noise ratios and the squared source-relay correlation coefficient. Furthermore, the CS rates are shown to be quasi-concave in the relay position for a fixed correlation coefficient, and the DF rates are shown to be quasi-concave in the relay position. The latter property characterizes the optimal relay position when using DF. The results extend to complex AWGN channels with random phase variations.

]]>Entropy doi: 10.3390/e21020108

Authors: Chuan-Hao Guo Yuan Guo Bei-Bei Liu

The densest k-subgraph (DkS) maximization problem is to find a set of k vertices with maximum total weight of the edges in the subgraph induced by this set. This problem is NP-hard in general. In this paper, two relaxation methods for solving the DkS problem are presented. One is a doubly nonnegative relaxation, and the other is a semidefinite relaxation that is tighter than the standard semidefinite relaxation. The two relaxation problems are equivalent under suitable conditions. Moreover, the corresponding approximation ratio results are given for these relaxation problems. Finally, some numerical examples are tested to compare the relaxation problems, and the numerical results show that the doubly nonnegative relaxation is more promising than the semidefinite relaxation for solving some DkS problems.

]]>Entropy doi: 10.3390/e21020107

Authors: Katarzyna Harezlak Dariusz R. Augustyn Pawel Kasprowski

Analysis of eye movement has attracted a lot of attention recently in terms of exploring people&rsquo;s areas of interest, cognitive abilities, and skills. The basis for the use of eye movement in these applications is the detection of its main components&mdash;namely, fixations and saccades&mdash;which facilitate understanding of the spatiotemporal processing of a visual scene. In the presented research, a novel approach for the detection of eye movement events is proposed, based on the concept of approximate entropy. By using a multiresolution time-domain scheme, a structure entitled the Multilevel Entropy Map was developed for this purpose. The dataset was collected during an experiment utilizing the &ldquo;jumping point&rdquo; paradigm. Eye positions were registered at a 1000 Hz sampling rate. For event detection, the k-nearest neighbors (kNN) classifier was applied. The best classification efficiency in recognizing the saccadic period ranged from 83% to 94%, depending on the sample size used. These promising outcomes suggest that the proposed solution may be used as a potential method for describing eye movement dynamics.
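The approximate entropy statistic underlying the proposed detector can be sketched as follows (a textbook ApEn implementation, not the paper's Multilevel Entropy Map; m = 2 and r = 0.2 times the standard deviation are conventional defaults):

```python
import numpy as np

def approx_entropy(x, m=2, r=None):
    """Approximate entropy ApEn(m, r) of a 1-D signal (textbook form)."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * np.std(x)
    def phi(mm):
        n = len(x) - mm + 1
        emb = np.array([x[i:i + mm] for i in range(n)])   # overlapping templates
        # Chebyshev distance between every pair of templates
        d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=-1)
        c = np.mean(d <= r, axis=1)                       # match fractions
        return np.mean(np.log(c))
    return phi(m) - phi(m + 1)

t = np.linspace(0.0, 8.0 * np.pi, 400)
rng = np.random.default_rng(0)
apen_sine = approx_entropy(np.sin(t))                   # regular: low ApEn
apen_noise = approx_entropy(rng.standard_normal(400))   # irregular: high ApEn
print(apen_sine, apen_noise)
```

Saccades and fixations produce windows of differing regularity, which is why an entropy profile over multiple time resolutions can serve as a feature for an event classifier.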

]]>Entropy doi: 10.3390/e21020106

Authors: Qingfeng He Zhihao Xu Shaojun Li Renwei Li Shuai Zhang Nianqin Wang Binh Thai Pham Wei Chen

Landslides are a major geological hazard worldwide. Landslide susceptibility assessments are useful to mitigate human casualties, loss of property, and damage to natural resources, ecosystems, and infrastructures. This study aims to evaluate landslide susceptibility using a novel hybrid intelligence approach with the rotation forest-based credal decision tree (RF-CDT) classifier. First, 152 landslide locations and 15 landslide conditioning factors were collected from the study area. Then, these conditioning factors were assigned values using an entropy method and subsequently optimized using correlation attribute evaluation (CAE). Finally, the performance of the proposed hybrid model was validated using the receiver operating characteristic (ROC) curve and compared with two well-known ensemble models, bagging (bag-CDT) and MultiBoostAB (MB-CDT). Results show that the proposed RF-CDT model had better performance than the single CDT model and hybrid bag-CDT and MB-CDT models. The findings in the present study overall confirm that a combination of the meta model with a decision tree classifier could enhance the prediction power of the single landslide model. The resulting susceptibility maps could be effective for enforcement of land management regulations to reduce landslide hazards in the study area and other similar areas in the world.

]]>Entropy doi: 10.3390/e21020105

Authors: Will Hicks

The Accardi–Boukas quantum Black–Scholes framework provides a means by which one can apply the Hudson–Parthasarathy quantum stochastic calculus to problems in finance. Solutions to these equations can be modelled using nonlocal diffusion processes via a Kramers–Moyal expansion, and this provides useful tools for understanding their behaviour. In this paper we develop further links between quantum stochastic processes and nonlocal diffusions by inverting the question, and showing how certain nonlocal diffusions can be written as quantum stochastic processes. We then go on to show how one can use path integral formalism and PT-symmetric quantum mechanics to build a non-Gaussian kernel function for the Accardi–Boukas quantum Black–Scholes equation. Behaviours observed in the real market emerge as a natural model output, rather than something that must be deliberately included.

]]>Entropy doi: 10.3390/e21020104

Authors: Alberto Montina Stefan Wolf

In view of the importance of quantum non-locality in cryptography, quantum computation, and communication complexity, it is crucial to decide whether a given correlation exhibits non-locality or not. As proved by Pitowsky, this problem is NP-complete, and is thus computationally intractable unless NP is equal to P. In this paper, we first prove that the Euclidean distance of given correlations from the local polytope can be computed in polynomial time with arbitrary fixed error, granted access to a certain oracle; namely, given a fixed error, we derive two upper bounds on the running time. The first bound is linear in the number of measurements; the second scales with the number of measurements to the sixth power. The former holds only for a very high number of measurements and is never observed in the performed numerical tests. We then introduce a simple algorithm for simulating the oracle. In all of the considered numerical tests, the simulation of the oracle contributes a multiplicative factor to the overall running time and thus does not affect the sixth-power law of the oracle-assisted algorithm.

]]>Entropy doi: 10.3390/e21020103

Authors: Wael Al-Kouz Ahmad Al-Muhtady Wahib Owhaib Sameer Al-Dahidi Montasir Hader Rama Abu-Alghanam

Computational Fluid Dynamics (CFD) is utilized to study entropy generation for the rarefied, steady-state, laminar 2-D flow of an air-Al2O3 nanofluid in a square cavity equipped with two solid fins at the hot wall. Such flows are of great importance in industrial applications, such as the cooling of electronic equipment and nuclear reactors. In the current study, the effects of the Knudsen number (Kn), the Rayleigh number (Ra), and the nano solid particle volume fraction (ϕ) on entropy generation were investigated. The parameter ranges considered in this work were 0 &le; Kn &le; 0.1, 10^3 &le; Ra &le; 10^6, and 0 &le; ϕ &le; 0.2. The length of the fins (LF) was fixed at 0.5 m, whereas the location of the fins with respect to the lower wall (HF) was set to 0.25 and 0.75 m. Simulations demonstrated an inverse effect of Kn on the entropy generation. Moreover, it was found that for Ra less than 10^4 the entropy generation due to the flow increased as ϕ increased, whereas for Ra greater than 10^4 it decreased as ϕ increased. The entropy generation due to heat increases as both ϕ and Ra increase. In addition, a correlation model for the total entropy generation as a function of all of the investigated parameters is proposed. Finally, an optimization technique was adapted to find the conditions at which the total entropy generation is minimized.

]]>Entropy doi: 10.3390/e21020102

Authors: Daniel Traian Pele Miruna Mazurencu-Marinescu-Pele

In this paper we investigate the ability of several econometric models to forecast value at risk (VaR) for a sample of daily time series of cryptocurrency returns. Using high-frequency data for Bitcoin, we estimate the entropy of the intraday distribution of log-returns through symbolic time series analysis (STSA), producing low-resolution data from high-resolution data. Our results show that entropy has strong explanatory power for the quantiles of the distribution of the daily returns. Based on Christoffersen&rsquo;s tests for VaR backtesting, we conclude that the VaR forecast built upon the entropy of intraday returns outperforms the forecasts provided by the classical GARCH models.
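The symbolization-plus-entropy idea can be sketched as follows (illustrative only: the alphabet size, word length, and quantile coding are placeholder choices rather than the paper's exact STSA scheme):

```python
import numpy as np
from collections import Counter

def stsa_entropy(returns, n_symbols=2, word_len=3):
    """Shannon entropy (in bits) of a symbolized return series.

    Returns are discretized into n_symbols states by quantile, then
    overlapping words of length word_len are counted."""
    cuts = np.quantile(returns, np.linspace(0, 1, n_symbols + 1)[1:-1])
    symbols = np.searchsorted(cuts, returns)
    words = [tuple(symbols[i:i + word_len])
             for i in range(len(symbols) - word_len + 1)]
    counts = np.array(list(Counter(words).values()), dtype=float)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

rng = np.random.default_rng(42)
h = stsa_entropy(rng.standard_normal(1000))
print(h)
```

For i.i.d. returns the entropy approaches the maximum of word_len * log2(n_symbols) bits; persistent intraday patterns pull it below that ceiling, which is what makes the statistic informative about the return distribution.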

]]>Entropy doi: 10.3390/e21020101

Authors: Aixian Zhang Zhe Ji

Maximum distance separable (MDS) self-dual codes have useful properties due to their optimality with respect to the Singleton bound and their self-duality. An MDS self-dual code is completely determined by its length n, so constructing q-ary MDS self-dual codes of various lengths is a very interesting problem. Recently, X. Fang et al. constructed several classes of new MDS self-dual codes through (extended) generalized Reed-Solomon codes. In this paper, based on their method, we obtain several further classes of MDS self-dual codes.

]]>