Table of Contents

Entropy, Volume 18, Issue 9 (September 2016)

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the table of contents of newly released issues.
  • PDF is the official format for papers, which are published in both HTML and PDF forms. To view a paper in PDF format, click on the "PDF Full-text" link and use the free Adobe Reader to open it.
Displaying articles 1-34

Editorial

Jump to: Research, Review, Other

Open Access Editorial: Special Issue on Entropy-Based Applied Cryptography and Enhanced Security for Ubiquitous Computing
Entropy 2016, 18(9), 334; doi:10.3390/e18090334
Received: 7 September 2016 / Accepted: 7 September 2016 / Published: 13 September 2016
PDF Full-text (157 KB) | HTML Full-text | XML Full-text
Abstract
Entropy is a basic and important concept in information theory. It is also often used as a measure of the unpredictability of a cryptographic key in cryptography research areas. Ubiquitous computing (Ubi-comp) has emerged rapidly as an exciting new paradigm. In this special issue, we mainly selected and discussed papers related to core theories based on graph theory for solving computational problems in cryptography and security, and to practical technologies, applications and services for Ubi-comp, including secure encryption techniques; identity and authentication; credential cloning attacks and countermeasures; a switching generator with resistance against algebraic and side-channel attacks; entropy-based network anomaly detection; applied cryptography using chaos functions; information hiding and watermarking; secret sharing; message authentication; detection and modeling of cyber attacks with Petri nets; and quantum flows for secret key distribution. Full article
Open Access Editorial: Quantum Computation and Information: Multi-Particle Aspects
Entropy 2016, 18(9), 339; doi:10.3390/e18090339
Received: 14 September 2016 / Accepted: 14 September 2016 / Published: 20 September 2016
PDF Full-text (180 KB) | HTML Full-text | XML Full-text
Abstract
This editorial explains the scope of the special issue and provides a thematic introduction to the contributed papers. Full article
(This article belongs to the Special Issue Quantum Computation and Information: Multi-Particle Aspects)

Research

Jump to: Editorial, Review, Other

Open Access Article: SU(2) Yang–Mills Theory: Waves, Particles, and Quantum Thermodynamics
Entropy 2016, 18(9), 310; doi:10.3390/e18090310
Received: 14 April 2016 / Revised: 1 August 2016 / Accepted: 16 August 2016 / Published: 23 August 2016
Cited by 2 | PDF Full-text (306 KB) | HTML Full-text | XML Full-text
Abstract
We elucidate how Quantum Thermodynamics at temperature T emerges from pure and classical SU(2) Yang–Mills theory on a four-dimensional Euclidean spacetime slice S^1 × R^3. The concept of a (deconfining) thermal ground state, composed of certain solutions to the fundamental, classical Yang–Mills equation, allows for a unified treatment of both (classical) wave-like and (quantum) particle-like excitations thereof. More precisely, the thermal ground state represents the interplay between nonpropagating, periodic configurations which are electric-magnetically (anti)selfdual in a non-trivial way and possess topological charge of modulus unity. Their trivial-holonomy versions, Harrington–Shepard (HS) (anti)calorons, yield an accurate a priori estimate of the thermal ground state in terms of spatially coarse-grained centers, each containing one quantum of action localized at its inmost spacetime point, which induce an inert adjoint scalar field ϕ (|ϕ| spatio-temporally constant). The field ϕ, in turn, implies an effective pure-gauge configuration, a_μ^gs, accurately describing HS (anti)caloron overlap. Spatial homogeneity of the thermal ground-state estimate ϕ, a_μ^gs demands that (anti)caloron centers are densely packed, thus representing a collective departure from (anti)selfduality. Effectively, such a “nervous” microscopic situation gives rise to two static phenomena: a finite ground-state energy density ρ_gs and pressure P_gs with ρ_gs = -P_gs, as well as the (adjoint) Higgs mechanism. The peripheries of HS (anti)calorons are static and resemble (anti)selfdual dipole fields whose apparent dipole moments are determined by |ϕ| and T, protecting them against deformation potentially caused by overlap. Such protection extends to the spatial density of HS (anti)caloron centers.
Thus the vacuum electric permittivity ϵ_0 and magnetic permeability μ_0, supporting the propagation of wave-like disturbances in the U(1) Cartan subalgebra of SU(2), can be reliably calculated for disturbances which do not probe HS (anti)caloron centers. Both ϵ_0 and μ_0 turn out to be temperature independent in thermal equilibrium, but also for an isolated, monochromatic U(1) wave. HS (anti)caloron centers, on the other hand, react to wave-like disturbances that would resolve their spatio-temporal structure by indeterministic emissions of quanta of energy and momentum. Thermodynamically, such events are Boltzmann weighted and occur independently at distinct locations in space and instants in (Minkowskian) time, entailing the Bose–Einstein distribution. Small correlative ramifications are associated with effective radiative corrections, e.g., in terms of polarization tensors. We comment on an SU(2) × SU(2) based gauge-theory model describing wave- and particle-like aspects of electromagnetic disturbances within the experimentally/observationally investigated spectrum. Full article
(This article belongs to the Special Issue Quantum Thermodynamics)
Open Access Article: Application of Entropy-Based Features to Predict Defibrillation Outcome in Cardiac Arrest
Entropy 2016, 18(9), 313; doi:10.3390/e18090313
Received: 19 July 2016 / Revised: 11 August 2016 / Accepted: 19 August 2016 / Published: 24 August 2016
Cited by 1 | PDF Full-text (968 KB) | HTML Full-text | XML Full-text
Abstract
Prediction of defibrillation success is of vital importance to guide therapy and improve the survival of patients suffering out-of-hospital cardiac arrest (OHCA). Currently, the most efficient methods to predict shock success are based on the analysis of the electrocardiogram (ECG) during ventricular fibrillation (VF), and recent studies suggest the efficacy of waveform indices that characterize the underlying non-linear dynamics of VF. In this study we introduce, adapt and fully characterize six entropy indices for VF shock outcome prediction, based on the classical definitions of entropy to measure the regularity and predictability of a time series. Data from 163 OHCA patients comprising 419 shocks (107 successful) were used, and the performance of the entropy indices was characterized in terms of embedding dimension (m) and matching tolerance (r). Six classical predictors were also assessed as baseline prediction values. The best prediction results were obtained for fuzzy entropy (FuzzEn) with m = 3 and an amplitude-dependent tolerance of r = 80 μV. This resulted in a balanced sensitivity/specificity of 80.4%/76.9%, which improved by over five points the results obtained for the best classical predictor. These results suggest that a FuzzEn approach for a joint quantification of VF amplitude and its non-linear dynamics may be a promising tool to optimize OHCA treatment. Full article
(This article belongs to the Special Issue Symbolic Entropy Analysis and Its Applications)
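Entropy indices of this family score a series as more irregular when length-m templates that match each other stop matching at length m+1. A minimal fuzzy entropy (FuzzEn) sketch along those lines is shown below; the exponential membership function, per-template baseline removal and default parameters are common choices from the FuzzEn literature, not necessarily the authors' exact implementation:

```python
import numpy as np

def fuzzy_entropy(x, m=3, r=0.2, n=2):
    """Fuzzy entropy of a 1-D series: negative log-ratio of the average
    template similarity at lengths m and m+1, using a smooth exponential
    membership degree instead of a hard tolerance threshold."""
    x = np.asarray(x, dtype=float)

    def phi(dim):
        # All overlapping templates of length `dim`, each with its mean removed.
        templ = np.array([x[i:i + dim] for i in range(len(x) - dim)])
        templ = templ - templ.mean(axis=1, keepdims=True)
        # Chebyshev distance between every pair of templates.
        d = np.max(np.abs(templ[:, None, :] - templ[None, :, :]), axis=2)
        sim = np.exp(-(d ** n) / r)            # fuzzy membership degree
        np.fill_diagonal(sim, 0.0)             # exclude self-matches
        return sim.sum() / (len(templ) * (len(templ) - 1))

    return float(np.log(phi(m)) - np.log(phi(m + 1)))
```

A predictable waveform (e.g., a clean sine) yields a much lower FuzzEn than white noise, which is the regularity contrast the shock-outcome predictor exploits.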
Open Access Article: Exergy Analysis of a Syngas-Fueled Combined Cycle with Chemical-Looping Combustion and CO2 Sequestration
Entropy 2016, 18(9), 314; doi:10.3390/e18090314
Received: 30 May 2016 / Revised: 29 July 2016 / Accepted: 17 August 2016 / Published: 25 August 2016
PDF Full-text (1293 KB) | HTML Full-text | XML Full-text
Abstract
Fossil fuels are still widely used for power generation. Nevertheless, it is possible to attain a short- and medium-term substantial reduction of greenhouse gas emissions to the atmosphere through a sequestration of the CO2 produced in fuels’ oxidation. The chemical-looping combustion (CLC) technique is based on a chemical intermediate agent, which gets oxidized in an air reactor and is then conducted to a separated fuel reactor, where it oxidizes the fuel in turn. Thus, the oxidation products CO2 and H2O are obtained in an output flow in which the only non-condensable gas is CO2, allowing the subsequent sequestration of CO2 without an energy penalty. Furthermore, with shrewd configurations, a lower exergy destruction in the combustion chemical transformation can be achieved. This paper focuses on a second law analysis of a CLC combined cycle power plant with CO2 sequestration using syngas from coal and biomass gasification as fuel. The key thermodynamic parameters are optimized via the exergy method. The proposed power plant configuration is compared with a similar gas turbine system with conventional combustion, finding a notable increase in power plant efficiency. Furthermore, the influence of syngas composition on the results is investigated by considering different H2-content fuels. Full article
Open Access Article: Description of Seizure Process for Gas Dynamic Spray of Metal Powders from Non-Equilibrium Thermodynamics Standpoint
Entropy 2016, 18(9), 315; doi:10.3390/e18090315
Received: 14 June 2016 / Revised: 11 August 2016 / Accepted: 19 August 2016 / Published: 25 August 2016
PDF Full-text (3337 KB) | HTML Full-text | XML Full-text
Abstract
The seizure process has been considered from the standpoints of non-equilibrium thermodynamics and self-organization theory. It has been shown that, to intensify the seizure of powder-mix particles with the substrate during spraying, the relatively light components of the powder mix should preferentially be transferred into the friction zone. These theoretical inferences have been confirmed experimentally, as exemplified by the gas dynamic spray of a copper-zinc powder mix. Full article
(This article belongs to the Special Issue Entropy Application in Tribology)
Open Access Article: Stationary Stability for Evolutionary Dynamics in Finite Populations
Entropy 2016, 18(9), 316; doi:10.3390/e18090316
Received: 3 April 2016 / Revised: 24 July 2016 / Accepted: 16 August 2016 / Published: 25 August 2016
Cited by 1 | PDF Full-text (3619 KB) | HTML Full-text | XML Full-text | Supplementary Files
Abstract
We demonstrate a vast expansion of the theory of evolutionary stability to finite populations with mutation, connecting the theory of the stationary distribution of the Moran process with the Lyapunov theory of evolutionary stability. We define the notion of stationary stability for the Moran process with mutation and generalizations, as well as a generalized notion of evolutionary stability that includes mutation called an incentive stable state (ISS) candidate. For sufficiently large populations, extrema of the stationary distribution are ISS candidates and we give a family of Lyapunov quantities that are locally minimized at the stationary extrema and at ISS candidates. In various examples, including for the Moran and Wright–Fisher processes, we show that the local maxima of the stationary distribution capture the traditionally-defined evolutionarily stable states. The classical stability theory of the replicator dynamic is recovered in the large population limit. Finally we include descriptions of possible extensions to populations of variable size and populations evolving on graphs. Full article
(This article belongs to the Special Issue Information and Entropy in Biological Systems)
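The stationary-distribution machinery can be illustrated on a toy 2-type Moran process with mutation: the chain on the count of type-A players is birth-death, so its stationary distribution follows from detailed balance. The payoff values below are illustrative assumptions chosen so the game has an interior equilibrium, not parameters from the paper:

```python
import numpy as np

def moran_stationary(N=50, mu=0.01, a=1.0, b=2.0, c=2.0, d=1.0):
    """Stationary distribution of a 2-type Moran process with mutation rate mu
    and payoff matrix [[a, b], [c, d]], on states k = number of A-players."""
    def up_down(k):
        i, j = k, N - k
        fA = (a * (i - 1) + b * j) / (N - 1)   # mean payoff to an A player
        fB = (c * i + d * (j - 1)) / (N - 1)   # mean payoff to a B player
        fbar = (i * fA + j * fB) / N
        # Probability the fitness-proportional reproduction event, with
        # mutation, produces an A offspring.
        birthA = (i * fA * (1 - mu) + j * fB * mu) / (N * fbar)
        return birthA * j / N, (1 - birthA) * i / N   # P(k->k+1), P(k->k-1)

    log_pi = np.zeros(N + 1)
    for k in range(N):                  # detailed balance:
        up_k, _ = up_down(k)            # pi[k+1]/pi[k] = up(k)/down(k+1)
        _, down_k1 = up_down(k + 1)
        log_pi[k + 1] = log_pi[k] + np.log(up_k) - np.log(down_k1)
    pi = np.exp(log_pi - log_pi.max())
    return pi / pi.sum()
```

For these Hawk-Dove-like payoffs the corresponding replicator equilibrium sits at x* = 1/2, and the stationary distribution indeed peaks near k = N/2 rather than at the boundary states, matching the paper's point that stationary extrema track evolutionarily stable states.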
Open Access Article: Entropy Complexity and Stability of a Nonlinear Dynamic Game Model with Two Delays
Entropy 2016, 18(9), 317; doi:10.3390/e18090317
Received: 13 June 2016 / Revised: 17 August 2016 / Accepted: 17 August 2016 / Published: 30 August 2016
Cited by 3 | PDF Full-text (3453 KB) | HTML Full-text | XML Full-text
Abstract
In this paper, a duopoly game model with double delays in the hydropower market is established, and the research focuses on the influence of the time delay parameter on the complexity of the system. Firstly, we established a game model for the enterprises considering both the current and the historical output when making decisions. Secondly, the existence and stability of Hopf bifurcation are analyzed, and the conditions and main conclusions of Hopf bifurcation are given. Thirdly, numerical simulation and analysis are carried out to verify the conclusions of the theoretical analysis. The effect of the delay parameter on the stability of the system is simulated by a bifurcation diagram, the Lyapunov exponent, and an entropic diagram; in addition, the stability region of the system is given by a 2D parameter bifurcation diagram and a 3D parameter bifurcation diagram. Finally, the method of delayed feedback control is used to control the chaotic system. The research results can provide a guideline for enterprise decision-making. Full article
(This article belongs to the Special Issue Computational Complexity)
Open Access Article: Entropy-Based Modeling of Velocity Lag in Sediment-Laden Open Channel Turbulent Flow
Entropy 2016, 18(9), 318; doi:10.3390/e18090318
Received: 31 May 2016 / Revised: 21 August 2016 / Accepted: 23 August 2016 / Published: 30 August 2016
Cited by 1 | PDF Full-text (665 KB) | HTML Full-text | XML Full-text
Abstract
In the last few decades, a wide variety of instruments with laser-based techniques have been developed that enable experimentally measuring particle velocity and fluid velocity separately in particle-laden flow. Experiments have revealed that stream-wise particle velocity is different from fluid velocity, and this velocity difference is commonly known as “velocity lag” in the literature. A number of experimental, as well as theoretical investigations have been carried out to formulate deterministic mathematical models of velocity lag, based on several turbulent features. However, a probabilistic study of velocity lag does not seem to have been reported, to the best of our knowledge. The present study therefore focuses on the modeling of velocity lag in open channel turbulent flow laden with sediment using the entropy theory along with a hypothesis on the cumulative distribution function. This function contains a parameter η, which is shown to be a function of specific gravity, particle diameter and shear velocity. The velocity lag model is tested using a wide range of twenty-two experimental runs collected from the literature and is also compared with other models of velocity lag. Then, an error analysis is performed to further evaluate the prediction accuracy of the proposed model, especially in comparison to other models. The model is also able to explain the physical characteristics of velocity lag caused by the interaction between the particles and the fluid. Full article
(This article belongs to the Section Statistical Mechanics)
Open Access Article: Quantum Hysteresis in Coupled Light–Matter Systems
Entropy 2016, 18(9), 319; doi:10.3390/e18090319
Received: 28 June 2016 / Revised: 18 August 2016 / Accepted: 29 August 2016 / Published: 7 September 2016
Cited by 1 | PDF Full-text (6766 KB) | HTML Full-text | XML Full-text
Abstract
We investigate the non-equilibrium quantum dynamics of a canonical light–matter system—namely, the Dicke model—when the light–matter interaction is ramped up and down through a cycle across the quantum phase transition. Our calculations reveal a rich set of dynamical behaviors determined by the cycle times, ranging from the slow, near adiabatic regime through to the fast, sudden quench regime. As the cycle time decreases, we uncover a crossover from an oscillatory exchange of quantum information between light and matter that approaches a reversible adiabatic process, to a dispersive regime that generates large values of light–matter entanglement. The phenomena uncovered in this work have implications in quantum control, quantum interferometry, as well as in quantum information theory. Full article
(This article belongs to the Special Issue Quantum Nonequilibrium Dynamics)
Open Access Article: Using Graph and Vertex Entropy to Compare Empirical Graphs with Theoretical Graph Models
Entropy 2016, 18(9), 320; doi:10.3390/e18090320
Received: 31 May 2016 / Revised: 19 July 2016 / Accepted: 24 August 2016 / Published: 5 September 2016
PDF Full-text (1647 KB) | HTML Full-text | XML Full-text
Abstract
Over the years, several theoretical graph generation models have been proposed. Among the most prominent are: the Erdős–Rényi random graph model, Watts–Strogatz small world model, Albert–Barabási preferential attachment model, Price citation model, and many more. Often, researchers working with real-world data are interested in understanding the generative phenomena underlying their empirical graphs. They want to know which of the theoretical graph generation models would most probably generate a particular empirical graph. In other words, they expect some similarity assessment between the empirical graph and graphs artificially created from theoretical graph generation models. Usually, in order to assess the similarity of two graphs, centrality measure distributions are compared. For a theoretical graph model this means comparing the empirical graph to a single realization of a theoretical graph model, where the realization is generated from the given model using an arbitrary set of parameters. The similarity between centrality measure distributions can be measured using standard statistical tests, e.g., the Kolmogorov–Smirnov test of distances between cumulative distributions. However, this approach is both error-prone and leads to incorrect conclusions, as we show in our experiments. Therefore, we propose a new method for graph comparison and type classification by comparing the entropies of centrality measure distributions (degree centrality, betweenness centrality, closeness centrality). We demonstrate that our approach can help assign the empirical graph to the most similar theoretical model using a simple unsupervised learning method. Full article
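A stripped-down version of the idea is to compare graphs through the Shannon entropy of a centrality distribution; the sketch below uses only degree centrality, and the two generators are minimal stdlib re-implementations for illustration, not the paper's experimental setup:

```python
import math
import random
from collections import Counter

def shannon_entropy(values):
    """Shannon entropy (bits) of the empirical distribution of `values`."""
    counts, n = Counter(values), len(values)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def er_degrees(n, p, rng):
    """Degree sequence of an Erdős–Rényi G(n, p) random graph."""
    deg = [0] * n
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                deg[i] += 1
                deg[j] += 1
    return deg

def ba_degrees(n, m, rng):
    """Degree sequence of a Barabási–Albert graph: each new node links to m
    targets drawn (approximately) proportionally to current degree."""
    deg = [0] * n
    targets = list(range(m))
    repeated = []                 # node v appears deg(v) times in this list
    for v in range(m, n):
        for t in set(targets):
            deg[v] += 1
            deg[t] += 1
        repeated.extend(targets)
        repeated.extend([v] * m)
        targets = [rng.choice(repeated) for _ in range(m)]
    return deg

# Entropy of the degree distribution as a one-number graph fingerprint.
rng = random.Random(0)
h_er = shannon_entropy(er_degrees(300, 0.03, rng))
h_ba = shannon_entropy(ba_degrees(300, 4, rng))
```

Classifying an empirical graph would then amount to choosing the model whose entropy vector (over several centrality measures, not just degree) lies closest.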
Open Access Article: Lattice Distortions in the FeCoNiCrMn High Entropy Alloy Studied by Theory and Experiment
Entropy 2016, 18(9), 321; doi:10.3390/e18090321
Received: 8 July 2016 / Revised: 17 August 2016 / Accepted: 29 August 2016 / Published: 2 September 2016
Cited by 4 | PDF Full-text (1131 KB) | HTML Full-text | XML Full-text
Abstract
Lattice distortions constitute one of the main features characterizing high entropy alloys. Local lattice distortions have, however, only rarely been investigated in these multi-component alloys. We, therefore, employ a combined theoretical electronic structure and experimental approach to study the atomistic distortions in the FeCoNiCrMn high entropy (Cantor) alloy by means of density-functional theory and extended X-ray absorption fine structure spectroscopy. Particular attention is paid to element-resolved distortions for each constituent. The individual mean distortions are small on average, <1%, but their fluctuations (i.e., standard deviations) are an order of magnitude larger, in particular for Cr and Mn. Good agreement between theory and experiment is found. Full article
(This article belongs to the Special Issue High-Entropy Alloys and High-Entropy-Related Materials)
Open Access Article: Mechanical Fault Diagnosis of High Voltage Circuit Breakers with Unknown Fault Type Using Hybrid Classifier Based on LMD and Time Segmentation Energy Entropy
Entropy 2016, 18(9), 322; doi:10.3390/e18090322
Received: 13 June 2016 / Revised: 22 August 2016 / Accepted: 30 August 2016 / Published: 3 September 2016
Cited by 2 | PDF Full-text (2996 KB) | HTML Full-text | XML Full-text
Abstract
In order to improve the identification accuracy of the high voltage circuit breakers’ (HVCBs) mechanical fault types without training samples, a novel mechanical fault diagnosis method of HVCBs using a hybrid classifier constructed with Support Vector Data Description (SVDD) and fuzzy c-means (FCM) clustering method based on Local Mean Decomposition (LMD) and time segmentation energy entropy (TSEE) is proposed. Firstly, LMD is used to decompose nonlinear and non-stationary vibration signals of HVCBs into a series of product functions (PFs). Secondly, TSEE is chosen as feature vectors with the superiority of energy entropy and characteristics of time-delay faults of HVCBs. Then, SVDD trained with normal samples is applied to judge mechanical faults of HVCBs. If the mechanical fault is confirmed, the new fault sample and all known fault samples are clustered by FCM with the cluster number of known fault types. Finally, another SVDD trained by the specific fault samples is used to judge whether the fault sample belongs to an unknown type or not. The results of experiments carried out on a real SF6 HVCB validate that the proposed fault-detection method is effective for the known faults with training samples and unknown faults without training samples. Full article
(This article belongs to the Special Issue Information Theoretic Learning)
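The TSEE feature itself is straightforward to sketch: split a signal (in the paper, each LMD product function) into equal time segments, treat each segment's share of the total energy as a probability, and take the Shannon entropy of that distribution. The segment count below is an illustrative choice, not a value from the paper:

```python
import numpy as np

def time_segment_energy_entropy(signal, n_segments=8):
    """Time-segmentation energy entropy (TSEE): Shannon entropy of the
    per-segment shares of the signal's total energy."""
    segments = np.array_split(np.asarray(signal, dtype=float), n_segments)
    energies = np.array([float(np.sum(s ** 2)) for s in segments])
    p = energies / energies.sum()
    p = p[p > 0]                         # skip zero-energy segments
    return float(-np.sum(p * np.log(p)))
```

A signal whose energy is spread evenly over time maximizes TSEE (ln K for K segments), while an impulsive signature concentrated in one segment drives it toward zero; that sensitivity to where in time the energy sits is what makes the feature suited to the time-delay faults the abstract mentions.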
Open Access Article: Entropy Base Estimation of Moisture Content of the Top 10-m Unsaturated Soil for the Badain Jaran Desert in Northwestern China
Entropy 2016, 18(9), 323; doi:10.3390/e18090323
Received: 12 July 2016 / Revised: 24 August 2016 / Accepted: 31 August 2016 / Published: 3 September 2016
PDF Full-text (14659 KB) | HTML Full-text | XML Full-text
Abstract
Estimation of soil moisture distribution in desert regions is challenged by the deep unsaturated zone and the extreme natural environment. In this study, an entropy-based method, consisting of information entropy, the principle of maximum entropy (PME), solutions to PME with constraints, and the determination of parameters, is used to estimate the soil moisture distribution in the 10 m deep vadose zone of a desert region. Firstly, the soil moisture distribution is described as a scaled probability density function (PDF), which is solved by PME with the constraints of normalization and known arithmetic and geometric means; the solution is the general form of the gamma distribution. A constant arithmetic mean is determined by considering the stable average recharge rate at the thousand-year scale, and an approximately constant geometric mean is determined by the low flow rate (about 1 cm per year). Next, the parameters of the scaled gamma PDF are determined by local environmental factors such as terrain and vegetation: multivariate linear equations are established to quantify the relationship between the parameters and the environmental factors on the basis of nineteen random soil moisture profiles, through the application of fuzzy mathematics. Finally, the accuracy is tested using the correlation coefficient (CC) and relative error. The method yields CC larger than 0.9 in more than half of the profiles, and larger than 0.8 in most; the relative errors are less than 30% in most soil moisture profiles and can fall below 15% when the parameters are fitted appropriately. Therefore, this study provides an alternative way to estimate the soil moisture distribution in the top 0–10 m of the Badain Jaran Desert from local terrain and vegetation factors instead of drilled sand samples; it should be useful in desert regions with extreme natural conditions, since these environmental factors can be obtained from remote sensing data.
Meanwhile, we should bear in mind that the method is challenged in humid regions, where more intensive and frequent precipitation and denser vegetation cover make the system much more complex. Full article
(This article belongs to the Special Issue Applications of Information Theory in the Geosciences)
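The abstract's central PME step, that fixing the arithmetic and geometric means yields a gamma-family density, can be checked numerically: the shape parameter k solves ln(AM/GM) = ln k - ψ(k), where ψ is the digamma function. The bisection solver below is a stdlib-only sketch of that relation, not the authors' fitting procedure:

```python
import math

def digamma(x, h=1e-6):
    """Digamma psi(x) via a central difference of log-gamma (stdlib only)."""
    return (math.lgamma(x + h) - math.lgamma(x - h)) / (2 * h)

def gamma_from_means(am, gm):
    """Shape and scale of the maximum-entropy density with fixed arithmetic
    mean `am` and geometric mean `gm` (a gamma distribution): the shape k
    solves ln(am/gm) = ln k - psi(k), a decreasing function of k, solved
    here by bisection."""
    s = math.log(am / gm)                 # > 0 by the AM-GM inequality
    lo, hi = 1e-6, 1e6
    for _ in range(200):
        mid = (lo + hi) / 2
        if math.log(mid) - digamma(mid) > s:
            lo = mid                      # f(mid) too large -> need bigger k
        else:
            hi = mid
    k = (lo + hi) / 2
    return k, am / k                      # shape k, scale theta = am / k
```

For a gamma density with shape k and scale θ, AM = kθ and GM = θ·exp(ψ(k)), so feeding those two means back in recovers k and θ.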
Open Access Article: Market Sentiments Distribution Law
Entropy 2016, 18(9), 324; doi:10.3390/e18090324
Received: 6 August 2016 / Revised: 28 August 2016 / Accepted: 1 September 2016 / Published: 7 September 2016
PDF Full-text (931 KB) | HTML Full-text | XML Full-text
Abstract
The Stock Exchange is basically ruled by the extreme market sentiments of euphoria and fear. The type of sentiment is given by the color of the candlestick (white = bullish sentiment, black = bearish sentiment), while the intensity of the sentiment is given by its size. In this paper we show that the intensity of any sentiment is, astonishingly, distributed in a robust, systematic and universal way, according to a law of exponential decay; this conclusion is supported by analysis of the Lyapunov exponent, the information entropy and the frequency distribution of candlestick size. Full article
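An exponential-decay law of this kind is easy to probe with a maximum-likelihood fit, since for an exponential density the fitted decay rate is just the reciprocal of the sample mean. The candlestick sizes below are synthetic stand-ins drawn from a known exponential law, not market data:

```python
import random

def exp_decay_rate(sizes):
    """MLE decay rate for an exponential law p(s) = lam * exp(-lam * s):
    the maximum-likelihood estimate is the reciprocal of the sample mean."""
    return len(sizes) / sum(sizes)

# Synthetic stand-in for candlestick body sizes (|close - open|); real inputs
# would come from OHLC market data, which the abstract does not reproduce here.
rng = random.Random(42)
sizes = [rng.expovariate(2.0) for _ in range(10_000)]
lam = exp_decay_rate(sizes)          # should recover a rate near 2.0
```

On real candlestick data, a roughly straight line in a log-frequency plot of sizes against this fitted rate is the signature of the decay law the paper reports.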
Open Access Article: Paraconsistent Probabilities: Consistency, Contradictions and Bayes’ Theorem
Entropy 2016, 18(9), 325; doi:10.3390/e18090325
Received: 10 June 2016 / Revised: 9 August 2016 / Accepted: 19 August 2016 / Published: 7 September 2016
PDF Full-text (296 KB) | HTML Full-text | XML Full-text
Abstract
This paper represents the first steps towards constructing a paraconsistent theory of probability based on the Logics of Formal Inconsistency (LFIs). We show that LFIs encode very naturally an extension of the notion of probability able to express sophisticated probabilistic reasoning under contradictions employing appropriate notions of conditional probability and paraconsistent updating, via a version of Bayes’ theorem for conditionalization. We argue that the dissimilarity between the notions of inconsistency and contradiction, one of the pillars of LFIs, plays a central role in our extended notion of probability. Some critical historical and conceptual points about probability theory are also reviewed. Full article
(This article belongs to the Special Issue Statistical Significance and the Logic of Hypothesis Testing)
Open Access Article: Bayesian Dependence Tests for Continuous, Binary and Mixed Continuous-Binary Variables
Entropy 2016, 18(9), 326; doi:10.3390/e18090326
Received: 20 June 2016 / Revised: 25 August 2016 / Accepted: 26 August 2016 / Published: 6 September 2016
PDF Full-text (1611 KB) | HTML Full-text | XML Full-text
Abstract
Tests for dependence of continuous, discrete and mixed continuous-discrete variables are ubiquitous in science. The goal of this paper is to derive Bayesian alternatives to frequentist null hypothesis significance tests for dependence. In particular, we will present three Bayesian tests for dependence of binary, continuous and mixed variables. These tests are nonparametric and based on the Dirichlet Process, which allows us to use the same prior model for all of them. Therefore, the tests are “consistent” among each other, in the sense that the probabilities that variables are dependent computed with these tests are commensurable across the different types of variables being tested. By means of simulations with artificial data, we show the effectiveness of the new tests. Full article
(This article belongs to the Special Issue Statistical Significance and the Logic of Hypothesis Testing)
Open AccessArticle Sparse Trajectory Prediction Based on Multiple Entropy Measures
Entropy 2016, 18(9), 327; doi:10.3390/e18090327
Received: 9 June 2016 / Revised: 29 August 2016 / Accepted: 30 August 2016 / Published: 14 September 2016
PDF Full-text (1965 KB) | HTML Full-text | XML Full-text
Abstract
Trajectory prediction is an important problem with a large number of applications. A common approach to trajectory prediction is based on historical trajectories, but existing techniques suffer from the “data sparsity problem”: the available historical trajectories are far from enough to cover all possible query trajectories. We propose the sparse trajectory prediction algorithm based on multiple entropy measures (STP-ME) to address the data sparsity problem. Firstly, the moving region is iteratively divided into a two-dimensional plane grid graph, and each trajectory is represented as a grid sequence with temporal information. Secondly, trajectory entropy is used to evaluate a trajectory’s regularity, the L-Z entropy estimator is implemented to calculate trajectory entropy, and a new trajectory space is generated through trajectory synthesis. We define location entropy and time entropy to measure the popularity of locations and timeslots, respectively. Finally, a second-order Markov model that contains a temporal dimension is adopted to perform sparse trajectory prediction. The experiments show that as the trip completed percentage increases towards 90%, the coverage of the baseline algorithm decreases to almost 25%, while the STP-ME algorithm copes with this as expected, with only a negligible drop in coverage, and can consistently answer almost 100% of query trajectories. The STP-ME algorithm improves prediction accuracy by as much as 8%, 3%, and 4% compared to the baseline algorithm, the second-order Markov model (2-MM), and the sub-trajectory synthesis (SubSyn) algorithm, respectively. At the same time, the prediction time of the STP-ME algorithm is negligible (10 μs), greatly outperforming the baseline algorithm (100 ms). Full article
(This article belongs to the Special Issue Information Theoretic Learning)
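The L-Z entropy estimator used above to score a trajectory’s regularity can be sketched for a generic symbol sequence. This is an illustrative Lempel–Ziv-style match-length estimator, not the authors’ implementation; the function names and toy sequences are ours.

```python
import math

def _occurs(prefix, pattern):
    """True if `pattern` appears as a contiguous block inside `prefix`."""
    m = len(pattern)
    return any(prefix[j:j + m] == pattern for j in range(len(prefix) - m + 1))

def lz_entropy_rate(symbols):
    """Lempel-Ziv entropy-rate estimate (bits/symbol) for a sequence of
    length n >= 2: n*log2(n) divided by the summed lengths of the shortest
    substrings starting at each position that have not appeared earlier."""
    s = list(symbols)
    n = len(s)
    total = 0
    for i in range(n):
        k = 1  # grow the window until it is no longer found in s[:i]
        while i + k <= n and _occurs(s[:i], s[i:i + k]):
            k += 1
        total += k
    return n * math.log2(n) / total
```

A perfectly regular symbol stream yields a low estimate, while an irregular one scores higher, which is what lets a method like STP-ME rank trajectories by regularity.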
Open AccessArticle Inferring Weighted Directed Association Networks from Multivariate Time Series with the Small-Shuffle Symbolic Transfer Entropy Spectrum Method
Entropy 2016, 18(9), 328; doi:10.3390/e18090328
Received: 14 June 2016 / Revised: 12 August 2016 / Accepted: 2 September 2016 / Published: 7 September 2016
PDF Full-text (2252 KB) | HTML Full-text | XML Full-text
Abstract
Complex network methodology is very useful for complex system exploration. However, the relationships among variables in complex systems are usually not clear. Therefore, inferring association networks among variables from their observed data has been a popular research topic. We propose a method, named the small-shuffle symbolic transfer entropy spectrum (SSSTES), for inferring association networks from multivariate time series. The method solves four problems in inferring association networks: strong correlation identification, correlation quantification, direction identification and temporal relation identification. The method can be divided into four layers. The first layer is the data layer, where data input and processing take place. In the second layer, we symbolize the model data, i.e., the original data and the shuffled data from the previous layer, and cyclically calculate the transfer entropy with different time lags for each pair of time series variables. Thirdly, we compose transfer entropy spectra for pairwise time series from the previous layer’s output, a list of transfer entropy matrices, and also identify the correlation level between variables in this layer. In the last layer, we build a weighted adjacency matrix, with the value of each entry representing the correlation level between pairwise variables, and thus obtain the weighted directed association network. Three sets of numerically simulated data, from a linear system, a nonlinear system and a coupled Rössler system, are used to show how the proposed approach works. Finally, we apply SSSTES to a real industrial system and get a better result than with two other methods. Full article
(This article belongs to the Special Issue Transfer Entropy II)
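Transfer entropy, the quantity at the heart of the second layer, can be sketched for already-symbolized sequences at lag 1. This is a generic illustration, not the SSSTES pipeline: the ordinal symbolization, multiple lags, and the small-shuffle surrogate test are omitted, and all names are ours.

```python
import math
from collections import Counter

def transfer_entropy(x, y):
    """Lag-1 transfer entropy T(Y -> X) in bits for symbol sequences:
    sum over p(x1, x0, y0) * log2[ p(x1 | x0, y0) / p(x1 | x0) ]."""
    triples = Counter(zip(x[1:], x[:-1], y[:-1]))   # (x_{t+1}, x_t, y_t)
    pairs_xy = Counter(zip(x[:-1], y[:-1]))         # (x_t, y_t)
    pairs_xx = Counter(zip(x[1:], x[:-1]))          # (x_{t+1}, x_t)
    singles = Counter(x[:-1])                       # x_t
    n = len(x) - 1
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_xy[(x0, y0)]        # p(x1 | x0, y0)
        p_cond_self = pairs_xx[(x1, x0)] / singles[x0]  # p(x1 | x0)
        te += p_joint * math.log2(p_cond_full / p_cond_self)
    return te
```

When `x` simply copies `y` with a one-step delay, knowing `y` resolves nearly all of `x`’s next-step uncertainty, so the estimate approaches the entropy rate of `y`; when `x` is already self-predictable, adding `y` contributes nothing.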
Open AccessArticle Solution of Higher Order Nonlinear Time-Fractional Reaction Diffusion Equation
Entropy 2016, 18(9), 329; doi:10.3390/e18090329
Received: 8 July 2016 / Revised: 30 August 2016 / Accepted: 31 August 2016 / Published: 8 September 2016
PDF Full-text (2008 KB) | HTML Full-text | XML Full-text
Abstract
The approximate analytical solution of fractional-order nonlinear reaction–diffusion equations with a given initial condition is obtained using the homotopy analysis method. As a demonstration of a good mathematical model, the present article gives graphical presentations of the effect of the reaction terms on the solution profile for various anomalous exponents in particular cases, to predict damping of the field variable. Numerical computations of the convergence control parameter, which is used to evaluate the convergence of the approximate series solution by minimizing the error, are also presented graphically for these cases. Full article
(This article belongs to the Special Issue Wavelets, Fractals and Information Theory II)
Open AccessArticle Short Term Electrical Load Forecasting Using Mutual Information Based Feature Selection with Generalized Minimum-Redundancy and Maximum-Relevance Criteria
Entropy 2016, 18(9), 330; doi:10.3390/e18090330
Received: 21 July 2016 / Revised: 5 September 2016 / Accepted: 5 September 2016 / Published: 8 September 2016
PDF Full-text (2218 KB) | HTML Full-text | XML Full-text
Abstract
A feature selection method based on the generalized minimum redundancy and maximum relevance (G-mRMR) is proposed to improve the accuracy of short-term load forecasting (STLF). First, mutual information is calculated to analyze the relations between the original features and the load sequence, as well as the redundancy among the original features. Second, a weighting factor selected by statistical experiments is used to balance the relevance and redundancy of features when using the G-mRMR. Third, each feature is ranked in a descending order according to its relevance and redundancy as computed by G-mRMR. A sequential forward selection method is utilized for choosing the optimal subset. Finally, a STLF predictor is constructed based on random forest with the obtained optimal subset. The effectiveness and improvement of the proposed method were tested with actual load data. Full article
(This article belongs to the Section Information Theory)
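The greedy ranking step described above can be sketched as follows. This is a hypothetical minimal version of generalized mRMR for discrete features: a candidate’s score is its relevance I(f; load) minus a weighting factor α times its mean redundancy with the already-selected features. The function names, the plug-in MI estimator, and the toy data are ours, not the paper’s.

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Plug-in estimate of discrete mutual information I(X;Y) in bits."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px, py = Counter(xs), Counter(ys)
    return sum((c / n) * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

def g_mrmr_rank(features, target, alpha=1.0):
    """Greedy generalized-mRMR ranking: repeatedly pick the feature
    maximizing relevance minus alpha * mean redundancy with the
    features already selected."""
    remaining = dict(features)  # name -> sample list
    selected = []
    while remaining:
        def score(name):
            rel = mutual_information(remaining[name], target)
            if not selected:
                return rel
            red = sum(mutual_information(remaining[name], features[s])
                      for s in selected) / len(selected)
            return rel - alpha * red
        best = max(remaining, key=score)
        selected.append(best)
        del remaining[best]
    return selected
```

A sequential forward search over the resulting ranking would then pick the prefix that minimizes validation error, as in the paper’s pipeline.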
Open AccessArticle Network Entropies of the Chinese Financial Market
Entropy 2016, 18(9), 331; doi:10.3390/e18090331
Received: 23 July 2016 / Revised: 22 August 2016 / Accepted: 3 September 2016 / Published: 8 September 2016
Cited by 1 | PDF Full-text (1082 KB) | HTML Full-text | XML Full-text
Abstract
Based on the data from the Chinese financial market, this paper focuses on analyzing three types of network entropies of the financial market, namely, Shannon, Renyi and Tsallis entropies. The findings suggest that Shannon entropy can reflect the volatility of the financial market, that Renyi and Tsallis entropies also have this function when their parameter has a positive value, and that Renyi and Tsallis entropies can reflect the extreme case of the financial market when their parameter has a negative value. Full article
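For reference, the three entropy measures compared in the paper reduce, for a probability vector p and parameter q ≠ 1, to the expressions below. This is a generic sketch with natural logarithms; the paper computes them on distributions derived from the financial network, which is not reproduced here.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H = -sum p_i ln p_i, ignoring zero entries."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def renyi_entropy(p, q):
    """Renyi entropy H_q = ln(sum p_i^q) / (1 - q), for q != 1."""
    return math.log(sum(pi ** q for pi in p if pi > 0)) / (1.0 - q)

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum p_i^q) / (q - 1), for q != 1."""
    return (1.0 - sum(pi ** q for pi in p if pi > 0)) / (q - 1.0)
```

Both Rényi and Tsallis entropies recover the Shannon value in the limit q → 1, while negative q weights rare (extreme) events heavily, which is the regime the abstract associates with extreme market states.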
Open AccessArticle Study on the Inherent Complex Features and Chaos Control of IS–LM Fractional-Order Systems
Entropy 2016, 18(9), 332; doi:10.3390/e18090332
Received: 24 June 2016 / Revised: 24 August 2016 / Accepted: 5 September 2016 / Published: 14 September 2016
Cited by 1 | PDF Full-text (3512 KB) | HTML Full-text | XML Full-text
Abstract
Based on the traditional IS–LM economic theory, which describes the relationship between interest rates and output in the goods-and-services and money markets in macroeconomics, we establish a four-dimensional IS–LM model involving four variables. Using Caputo fractional calculus, we extend it to a fractional-order nonlinear model and analyze the complexity and stability of the fractional-order system. The existence conditions of attractors under different orders are compared, and the orders at which the system reaches a stable state are obtained. Dynamic phenomena such as the strange attractor and sensitivity to initial values are analyzed in detail through phase diagrams and the power spectrum. The order is varied in two ways: all orders change synchronously, or a single order changes. The results show that, in either case, the economic system passes through multiple states, such as strong divergence, a strange attractor, and convergence, before finally entering a stable state at a certain order; parameter changes have similar effects on the economic system. Therefore, selecting an appropriate order is significant for an economic system, as it guarantees steady development. Furthermore, chaos control of the IS–LM fractional-order macroeconomic model is constructed by means of a linear feedback control method; by calculating and adjusting the feedback coefficient, the system is returned to the convergent state. Full article
(This article belongs to the Special Issue Wavelets, Fractals and Information Theory II)
Open AccessArticle Design of Light-Weight High-Entropy Alloys
Entropy 2016, 18(9), 333; doi:10.3390/e18090333
Received: 2 July 2016 / Revised: 21 August 2016 / Accepted: 5 September 2016 / Published: 13 September 2016
Cited by 10 | PDF Full-text (4197 KB) | HTML Full-text | XML Full-text
Abstract
High-entropy alloys (HEAs) are a new class of solid-solution alloys that have attracted worldwide attention for their outstanding properties. Owing to demand from the transportation and defense industries, light-weight HEAs have also garnered widespread interest from scientists for use as potential structural materials. Great efforts have been made to study the phase-formation rules of HEAs to accelerate and refine the discovery process. In this paper, many proposed solid-solution phase-formation rules are assessed, based on a series of known and newly-designed light-weight HEAs. The results indicate that these empirical rules work for most compositions but fail for several alloys. Light-weight HEAs often involve additions of Al and/or Ti in great amounts, resulting in large negative enthalpies for forming solid-solution phases and/or intermetallic compounds. Accordingly, these empirical rules need to be modified with the new experimental data. In contrast, the CALPHAD (calculation of phase diagrams) method is demonstrated to be an effective approach for predicting phase formation in HEAs as a function of composition and temperature. Future perspectives on the design of light-weight HEAs are discussed in light of CALPHAD modeling and physical metallurgy principles. Full article
(This article belongs to the Special Issue High-Entropy Alloys and High-Entropy-Related Materials)
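The “high entropy” in HEAs usually refers to the ideal configurational entropy of mixing, ΔS_mix = −R Σ cᵢ ln cᵢ, which is maximized for equiatomic compositions; a common rule of thumb classifies alloys with ΔS_mix ≥ 1.5R as high-entropy. A minimal sketch of that background formula (the function name is ours; this is not the paper’s CALPHAD modeling):

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def mixing_entropy(concentrations):
    """Ideal configurational entropy of mixing, dS_mix = -R * sum(c_i ln c_i),
    for mole fractions c_i summing to 1; zero entries are skipped."""
    return -R * sum(c * math.log(c) for c in concentrations if c > 0)
```

For an equimolar N-component alloy this reduces to R ln N, so a five-element equiatomic alloy has ΔS_mix = R ln 5 ≈ 1.61R, above the conventional 1.5R threshold.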
Open AccessArticle Enhanced Energy Distribution for Quantum Information Heat Engines
Entropy 2016, 18(9), 335; doi:10.3390/e18090335
Received: 5 August 2016 / Revised: 6 September 2016 / Accepted: 12 September 2016 / Published: 14 September 2016
PDF Full-text (903 KB) | HTML Full-text | XML Full-text
Abstract
A new scenario for energy distribution, security and shareability is presented that assumes the availability of quantum information heat engines and a thermal bath. It is based on the convertibility between entropy and work in the presence of a thermal reservoir. Our approach to the informational content of physical systems that are distributed between users is complementary to the conventional perspective of quantum communication. The latter places the value on the unpredictable content of the transmitted quantum states, while our interest focuses on their certainty. Some well-known results in quantum communication are reused in this context. Particularly, we describe a way to securely distribute quantum states to be used for unlocking energy from thermal sources. We also consider some multi-partite entangled and classically correlated states for a collaborative multi-user sharing of work extraction possibilities. In addition, the relation between the communication and work extraction capabilities is analyzed and written as an equation. Full article
(This article belongs to the Special Issue Quantum Information 2016)
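The entropy–work convertibility underlying this scenario can be illustrated with the textbook Szilard-engine bound: a two-level system whose state is known with binary entropy H (in bits) can yield at most W = k_B·T·ln 2·(1 − H) of work from a bath at temperature T. A back-of-envelope sketch (our naming; this is standard background, not the paper’s protocol):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def extractable_work(p, temperature):
    """Maximum work (J) extractable from one two-level system whose state
    is known with probability distribution (p, 1-p), in contact with a
    bath at `temperature`: W = k_B * T * ln(2) * (1 - H2(p)),
    where H2 is the binary entropy in bits."""
    h2 = 0.0
    for q in (p, 1.0 - p):
        if q > 0:
            h2 -= q * math.log2(q)
    return K_B * temperature * math.log(2) * (1.0 - h2)
```

A perfectly known bit (p = 0 or 1) is worth the full k_B·T·ln 2, while a maximally uncertain one (p = 0.5) is worth nothing, which is why securely distributing low-entropy states amounts to distributing unlockable energy.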
Open AccessArticle Combined Forecasting of Streamflow Based on Cross Entropy
Entropy 2016, 18(9), 336; doi:10.3390/e18090336
Received: 23 June 2016 / Revised: 6 September 2016 / Accepted: 6 September 2016 / Published: 15 September 2016
PDF Full-text (1893 KB) | HTML Full-text | XML Full-text
Abstract
In this study, we developed a combined streamflow forecasting model based on cross entropy to address the complexity of streamflow and the randomness of hydrological processes. First, we analyzed the streamflow data obtained from Wudaogou station on the Huifa River, the second tributary of the Songhua River, and found that the streamflow was characterized by fluctuations and periodicity and was closely related to rainfall. The proposed method involves selecting similar years based on the grey correlation degree. The forecasting results obtained by a time series model (autoregressive integrated moving average), an improved grey forecasting model, and an artificial neural network (radial basis function) were used as the single forecasting models, and, from the viewpoint of the probability density, the method for determining weights was improved by using the cross entropy model. The numerical results showed that, compared with the single forecasting models, the combined forecasting model improved stability, and its prediction accuracy was better than that of conventional combined forecasting models. Full article
Open AccessArticle Entropy Minimizing Curves with Application to Flight Path Design and Clustering
Entropy 2016, 18(9), 337; doi:10.3390/e18090337
Received: 26 July 2016 / Revised: 8 September 2016 / Accepted: 8 September 2016 / Published: 15 September 2016
PDF Full-text (1018 KB) | HTML Full-text | XML Full-text
Abstract
Air traffic management (ATM) aims at providing companies with safe and ideally optimal aircraft trajectory planning. Air traffic controllers act on flight paths in such a way that no pair of aircraft come closer than the regulatory separation norms. With the increase of traffic, it is expected that the system will reach its limits in the near future: a paradigm change in ATM is planned with the introduction of trajectory-based operations. In this context, sets of well-separated flight paths are computed in advance, tremendously reducing the number of unsafe situations that must be dealt with by controllers. Unfortunately, automated tools used to generate such planning generally issue trajectories not complying with operational practices or even with flight dynamics. In this paper, a means of producing realistic air routes from the output of an automated trajectory design tool is investigated. For that purpose, the entropy of a system of curves is first defined, and a means of iteratively minimizing it is presented. The resulting curves form a route network that is suitable for use in a semi-automated ATM system with a human in the loop. The tool introduced in this work is quite versatile and may also be applied to the unsupervised classification of curves: an example is given for French traffic. Full article
(This article belongs to the Special Issue Differential Geometrical Theory of Statistics) Printed Edition available
Open AccessArticle The Constant Information Radar
Entropy 2016, 18(9), 338; doi:10.3390/e18090338
Received: 12 August 2016 / Revised: 9 September 2016 / Accepted: 14 September 2016 / Published: 19 September 2016
Cited by 4 | PDF Full-text (510 KB) | HTML Full-text | XML Full-text
Abstract
The constant information radar, or CIR, is a tracking radar that modulates target revisit time by maintaining a fixed mutual information measure. For highly dynamic targets that deviate significantly from the path predicted by the tracking motion model, the CIR adjusts by illuminating the target more frequently than it would for well-modeled targets. If the signal-to-noise ratio (SNR) is low, the radar delays revisiting the target until the state entropy overcomes the noise uncertainty. As a result, we show that the information measure is highly dependent on target entropy and target measurement covariance. A constant information measure maintains a fixed spectral efficiency to support the RF convergence of radar and communications. The result is a radar implementing a novel target scheduling algorithm based on information instead of heuristic or ad hoc methods. The CIR mathematically ensures that spectral use is justified. Full article
(This article belongs to the Special Issue Radar and Information Theory)
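For a scalar linear-Gaussian tracking model, the mutual information between the target state and a measurement has the familiar closed form I = ½·log₂(1 + P/R), with prior variance P and measurement-noise variance R, so a constant-information revisit rule waits until process noise has inflated P enough to make a look worth a fixed number of bits. A toy sketch of that scheduling idea (our simplification, not the CIR algorithm itself):

```python
import math

def gaussian_mi_bits(p_var, r_var):
    """Mutual information (bits) between a scalar Gaussian state with
    prior variance p_var and a measurement with noise variance r_var:
    I = 0.5 * log2(1 + p_var / r_var)."""
    return 0.5 * math.log2(1.0 + p_var / r_var)

def revisit_steps(p0, q, r_var, i_target):
    """Steps until growing prior variance makes a measurement worth
    i_target bits (a constant-information revisit rule); assumes q > 0
    so the target is eventually reached."""
    p, steps = p0, 0
    while gaussian_mi_bits(p, r_var) < i_target:
        p += q      # process noise inflates state uncertainty each step
        steps += 1
    return steps
```

A well-modeled target (small q) accumulates uncertainty slowly and is revisited rarely; a poorly modeled one reaches the information threshold quickly, reproducing the qualitative behavior described in the abstract.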
Open AccessArticle Effects of Fatty Infiltration of the Liver on the Shannon Entropy of Ultrasound Backscattered Signals
Entropy 2016, 18(9), 341; doi:10.3390/e18090341
Received: 20 June 2016 / Revised: 7 September 2016 / Accepted: 19 September 2016 / Published: 21 September 2016
PDF Full-text (5802 KB) | HTML Full-text | XML Full-text
Abstract
This study explored the effects of fatty infiltration on the signal uncertainty of ultrasound backscattered echoes from the liver. Standard ultrasound examinations were performed on 107 volunteers. For each participant, raw ultrasound image data of the right lobe of liver were acquired using a clinical scanner equipped with a 3.5-MHz convex transducer. An algorithmic scheme was proposed for ultrasound B-mode and entropy imaging. Fatty liver stage was evaluated using a sonographic scoring system. Entropy values constructed using the ultrasound radiofrequency (RF) and uncompressed envelope signals (denoted by HR and HE, respectively) as a function of fatty liver stage were analyzed using the Pearson correlation coefficient. Data were expressed as the median and interquartile range (IQR). Receiver operating characteristic (ROC) curve analysis with 95% confidence intervals (CIs) was performed to obtain the area under the ROC curve (AUC). The brightness of the entropy image typically increased as the fatty stage varied from mild to severe. The median value of HR monotonically increased from 4.69 (IQR: 4.60–4.79) to 4.90 (IQR: 4.87–4.92) as the severity of fatty liver increased (r = 0.63, p < 0.0001). Concurrently, the median value of HE increased from 4.80 (IQR: 4.69–4.89) to 5.05 (IQR: 5.02–5.07) (r = 0.69, p < 0.0001). In particular, the AUCs obtained using HE (95% CI) were 0.93 (0.87–0.99), 0.88 (0.82–0.94), and 0.76 (0.65–0.87) for fatty stages ≥mild, ≥moderate, and ≥severe, respectively. The sensitivity, specificity, and accuracy were 93.33%, 83.11%, and 86.00%, respectively (≥mild). Fatty infiltration increases the uncertainty of backscattered signals from livers. Ultrasound entropy imaging has potential for the routine examination of fatty liver disease. Full article
(This article belongs to the Special Issue Symbolic Entropy Analysis and Its Applications)
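The entropy imaging described above amounts to computing, within each sliding window of the backscattered data, the Shannon entropy of the amplitude histogram. A minimal sketch for one window (the binning choice and names are ours; the paper’s windowing and B-mode processing are not reproduced):

```python
import math
from collections import Counter

def signal_entropy(samples, bins=32):
    """Shannon entropy (bits) of a signal's amplitude histogram,
    with samples binned uniformly between their min and max."""
    lo, hi = min(samples), max(samples)
    width = (hi - lo) / bins
    if width == 0.0:
        width = 1.0  # constant signal: everything falls into one bin
    counts = Counter(min(int((s - lo) / width), bins - 1) for s in samples)
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())
```

A more disordered amplitude distribution spreads probability across more bins and raises the entropy, which is the mechanism by which fatty infiltration brightens the entropy image.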
Open AccessArticle The Differential Entropy of the Joint Distribution of Eigenvalues of Random Density Matrices
Entropy 2016, 18(9), 342; doi:10.3390/e18090342
Received: 5 August 2016 / Revised: 11 September 2016 / Accepted: 19 September 2016 / Published: 21 September 2016
PDF Full-text (330 KB) | HTML Full-text | XML Full-text
Abstract
We derive exactly the differential entropy of the joint distribution of eigenvalues of Wishart matrices. Based on this result, we calculate the differential entropy of the joint distribution of eigenvalues of random mixed quantum states, which is induced by taking the partial trace over the environment of Haar-distributed bipartite pure states. Then, we investigate the differential entropy of the joint distribution of diagonal entries of random mixed quantum states. Finally, we investigate the relative entropy between these two kinds of distributions. Full article
(This article belongs to the Section Information Theory)
Open AccessArticle Fuzzy Adaptive Repetitive Control for Periodic Disturbance with Its Application to High Performance Permanent Magnet Synchronous Motor Speed Servo Systems
Entropy 2016, 18(9), 261; doi:10.3390/e18090261
Received: 12 May 2016 / Revised: 8 July 2016 / Accepted: 8 July 2016 / Published: 14 September 2016
PDF Full-text (2310 KB) | HTML Full-text | XML Full-text
Abstract
Steady-state precision matters increasingly for real servo systems, especially in high-performance speed servo applications, where reducing the steady-state speed ripple is essential. This paper investigates the periodic-disturbance problem behind the steady-state speed ripple of a permanent magnet synchronous motor (PMSM) servo system; a fuzzy adaptive repetitive controller is designed in the speed loop, based on repetitive control and fuzzy information theory, to reduce the periodic disturbance. Firstly, the various sources of PMSM speed ripple are described and analyzed. Then, the mathematical model of the PMSM is given. Subsequently, a fuzzy adaptive repetitive controller based on repetitive control and fuzzy logic control is designed for the PMSM speed servo system, and the system stability analysis is also derived. Finally, the simulation and the experimental implementation are based on MATLAB/Simulink and a Texas Instruments TMS320F2808 DSP (digital signal processor) hardware platform, respectively. Compared to a proportional–integral (PI) controller, simulation and experimental results show that the proposed fuzzy adaptive repetitive controller has better periodic-disturbance rejection and higher steady-state precision. Full article
Open AccessArticle Generalized Robustness of Contextuality
Entropy 2016, 18(9), 297; doi:10.3390/e18090297
Received: 16 May 2016 / Revised: 4 August 2016 / Accepted: 9 August 2016 / Published: 1 September 2016
Cited by 1 | PDF Full-text (1325 KB) | HTML Full-text | XML Full-text
Abstract
Motivated by the importance of contextuality and by a work on the robustness of the entanglement of mixed quantum states, the robustness of contextuality (RoC) R_C(e) of an empirical model e against non-contextual noises was introduced and discussed in Science China Physics, Mechanics and Astronomy (59(4) and 59(9), 2016). Because noises are not always non-contextual, this paper introduces and discusses the generalized robustness of contextuality (GRoC) R_g(e) of an empirical model e against general noises. It is proven that R_g(e) = 0 if and only if e is non-contextual. This means that the quantity R_g can be used to distinguish contextual empirical models from non-contextual ones. It is also shown that the function R_g is convex on the set of all empirical models and continuous on the set of all no-signaling empirical models. For any two empirical models e and f such that the generalized relative robustness of e with respect to f is finite, a fascinating relationship between the GRoCs of e and f is proven, which reads R_g(e)·R_g(f) ≤ 1. Lastly, for any n-cycle contextual box e, a relationship between the GRoC R_g(e) and the extent Δ_e of violating the non-contextual inequalities is established. Full article
(This article belongs to the Special Issue Quantum Information 2016)
Review

Jump to: Editorial, Research, Other

Open AccessReview Sleep Stage Classification Using EEG Signal Analysis: A Comprehensive Survey and New Investigation
Entropy 2016, 18(9), 272; doi:10.3390/e18090272
Received: 26 June 2016 / Revised: 13 August 2016 / Accepted: 17 August 2016 / Published: 23 August 2016
Cited by 2 | PDF Full-text (3998 KB) | HTML Full-text | XML Full-text
Abstract
Sleep specialists often conduct manual sleep stage scoring by visually inspecting the patient’s neurophysiological signals collected at sleep labs. This is, generally, a very difficult, tedious and time-consuming task. The limitations of manual sleep stage scoring have escalated the demand for developing Automatic Sleep Stage Classification (ASSC) systems. Sleep stage classification refers to identifying the various stages of sleep and is a critical step in assisting physicians in the diagnosis and treatment of related sleep disorders. The aim of this paper is to survey the progress and challenges of the various existing Electroencephalogram (EEG) signal-based methods used for sleep stage identification at each phase, including pre-processing, feature extraction and classification, in an attempt to find the research gaps and possibly introduce a reasonable solution. Many of the prior and current related studies use multiple EEG channels and are based on 30 s or 20 s epoch lengths, which affects the feasibility and speed of ASSC for real-time applications. Thus, in this paper, we also present a novel and efficient technique that can be implemented in an embedded hardware device to identify sleep stages using new statistical features applied to 10 s epochs of single-channel EEG signals. In this study, the PhysioNet Sleep European Data Format (EDF) Database was used. The proposed methodology achieves an average classification sensitivity, specificity and accuracy of 89.06%, 98.61% and 93.13%, respectively, when the decision tree classifier is applied. Finally, our new method is compared with those in recently published studies, which confirms its high classification accuracy. Full article
(This article belongs to the Special Issue Entropy and Electroencephalography II)
Other

Jump to: Editorial, Research, Review

Open AccessLetter Combinatorial Intricacies of Labeled Fano Planes
Entropy 2016, 18(9), 312; doi:10.3390/e18090312
Received: 20 July 2016 / Revised: 5 August 2016 / Accepted: 19 August 2016 / Published: 23 August 2016
PDF Full-text (247 KB) | HTML Full-text | XML Full-text
Abstract
Given a seven-element set X = {1, 2, 3, 4, 5, 6, 7}, there are 30 ways to define a Fano plane on it. Let us call a line of such a Fano plane—that is to say, an unordered triple from X—ordinary or defective, according to whether the sum of the two smaller integers in the triple is or is not equal to the remaining one. A point of the labeled Fano plane is said to be of order s, 0 ≤ s ≤ 3, if there are s defective lines passing through it. With this structural refinement in mind, the 30 Fano planes are shown to fall into eight distinct types. Out of the total of 35 lines, the nine ordinary lines are of five different kinds, whereas the remaining 26 defective lines yield as many as ten distinct types. It is shown that no labeled Fano plane can have all points of zeroth order, or feature just one point of order two. A connection with prominent configurations in Steiner triple systems is also pointed out. Full article
(This article belongs to the collection Advances in Applied Statistical Mechanics)
Journal Contact

MDPI AG
Entropy Editorial Office
St. Alban-Anlage 66, 4052 Basel, Switzerland
E-Mail: 
Tel. +41 61 683 77 34
Fax: +41 61 302 89 18
Editorial Board