Table of Contents

Entropy, Volume 16, Issue 4 (April 2014), Pages 1842-2383

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the tables of contents of newly released issues.
  • Papers are published in both HTML and PDF forms, but PDF is the official format. To view a paper in PDF format, click its "PDF Full-text" link and open it with the free Adobe Reader.
Displaying articles 1-29

Research

Jump to: Review, Other

Open Access Article: An Entropy-Based Contagion Index and Its Sampling Properties for Landscape Analysis
Entropy 2014, 16(4), 1842-1859; doi:10.3390/e16041842
Received: 6 July 2012 / Revised: 18 March 2014 / Accepted: 19 March 2014 / Published: 26 March 2014
Cited by 1 | PDF Full-text (1794 KB) | HTML Full-text | XML Full-text
Abstract
Studies of spatial patterns of landscapes are useful to quantify human impact, predict wildlife effects, or describe various landscape features. A robust landscape index should quantify two components of landscape diversity: composition and configuration. One category of landscape index is the contagion index. Some landscape ecologists promote the use of relative contagion indices; it is demonstrated, using simulated landscapes, that relativized contagion indices are mathematically untenable. A new entropy contagion index (Γ) is developed, and the distributional properties of its estimator Γ̂ are derived: it is shown to be asymptotically unbiased, consistent, and asymptotically normally distributed, with a variance formula obtained via the delta method. As an application, the pattern of and changes in forest types across four soil-geologic landform strata were analyzed on the 80,000 ha Savannah River Site in South Carolina, USA. One-way analysis of variance was used for hypothesis testing of contagion among strata. The differences in contagion across the strata provide insight for managers seeking to meet structural objectives.
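For orientation, contagion-style indices measure the entropy of cell-class adjacencies in a categorical raster. The sketch below implements the classical O'Neill-style relative contagion (the very family the paper critiques, and not the paper's Γ, whose definition is in the article); the grids and class counts are illustrative:

```python
import math
from collections import Counter

def adjacency_contagion(grid, num_classes):
    """O'Neill-style relative contagion: 1 - H(adjacency pairs) / ln(m^2).

    Counts ordered pairs of 4-neighbour cell classes; values near 1 mean
    large clumped patches, values near 0 mean fine-grained interspersion.
    """
    rows, cols = len(grid), len(grid[0])
    pairs = Counter()
    for i in range(rows):
        for j in range(cols):
            for di, dj in ((0, 1), (1, 0)):      # right and down neighbours
                ni, nj = i + di, j + dj
                if ni < rows and nj < cols:
                    a, b = grid[i][j], grid[ni][nj]
                    pairs[(a, b)] += 1           # count both orderings
                    pairs[(b, a)] += 1
    total = sum(pairs.values())
    h = -sum(c / total * math.log(c / total) for c in pairs.values())
    return 1.0 - h / math.log(num_classes ** 2)

# A one-class landscape is maximally aggregated; a two-class
# checkerboard is maximally interspersed.
uniform = [[0] * 4 for _ in range(4)]
checker = [[(i + j) % 2 for j in range(4)] for i in range(4)]
```

On the uniform grid the adjacency entropy is zero, giving contagion 1; the checkerboard yields contagion 0.5 for two classes.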

Open Access Article: A Study of Fractality and Long-Range Order in the Distribution of Transposable Elements in Eukaryotic Genomes Using the Scaling Properties of Block Entropy and Box-Counting
Entropy 2014, 16(4), 1860-1882; doi:10.3390/e16041860
Received: 11 December 2013 / Revised: 5 February 2014 / Accepted: 13 March 2014 / Published: 26 March 2014
Cited by 1 | PDF Full-text (2071 KB) | HTML Full-text | XML Full-text | Supplementary Files
Abstract
Repeats or Transposable Elements (TEs) are highly repeated sequence stretches present in virtually all eukaryotic genomes. We explore the distribution of representative TEs from all major classes in entire chromosomes across various organisms, employing two complementary approaches: the scaling of block entropy and box-counting. Both converge to the conclusion that well-developed fractality is typical of small genomes, while in large genomes it appears sporadically and in some cases is rudimentary. The human genome is particularly prone to develop this pattern, as TE chromosomal distributions therein are often highly clustered and inhomogeneous. Comparing with previous work on the occurrence of power-law-like size distributions in inter-repeat distances, we conclude that fractality in entire chromosomes is a more stringent (thus less often encountered) condition. We have formulated a simple evolutionary scenario for the genomic dynamics of TEs, which may account for their fractal distribution in real genomes. The observed fractality and long-range properties of TE genomic distributions have probably contributed to the formation of the “fractal globule”, a model for the confined chromatin organization of the eukaryotic nucleus proposed on the basis of experimental evidence.
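Box-counting, one of the two approaches named above, can be illustrated on a synthetic fractal whose dimension is known in closed form: the middle-third Cantor set, with dimension ln 2 / ln 3 ≈ 0.631. This is a generic sketch of the method, not the authors' genome pipeline:

```python
import math

def cantor_points(depth):
    """Cantor-set interval endpoints as exact base-3 integers in [0, 3**depth)."""
    pts = [0]
    for _ in range(depth):
        pts = [3 * p for p in pts] + [3 * p + 2 for p in pts]
    return pts

def box_counting_dimension(points, depth, max_level):
    """Least-squares slope of log N(eps) vs log(1/eps) for eps = 3**-k."""
    xs, ys = [], []
    for k in range(1, max_level + 1):
        boxes = {p // 3 ** (depth - k) for p in points}  # occupied box indices
        xs.append(k * math.log(3))                        # log(1/eps)
        ys.append(math.log(len(boxes)))                   # log N(eps)
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

dim = box_counting_dimension(cantor_points(10), 10, 8)  # ~ ln 2 / ln 3
```

Integer base-3 arithmetic keeps the box assignment exact, so the fitted slope matches the theoretical dimension to floating-point precision.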
Open Access Article: District Heating Mode Analysis Based on an Air-cooled Combined Heat and Power Station
Entropy 2014, 16(4), 1883-1901; doi:10.3390/e16041883
Received: 27 December 2013 / Revised: 25 February 2014 / Accepted: 13 March 2014 / Published: 26 March 2014
Cited by 2 | PDF Full-text (1195 KB) | HTML Full-text | XML Full-text
Abstract
As an important research subject, district heating with combined heat and power (CHP) has significant potential for energy conservation. This paper utilised a 200 MW air-cooled unit as an actual case and presented a design scheme and energy consumption analysis of three typical CHP modes: the low vacuum mode (LVM), the extraction condensing mode (ECM), and the absorbing heat pump mode (AHPM). The advantages and disadvantages of each mode, including their practical problems, were analysed, and the best mode was suggested. The energy consumption of the three heating modes changed with the heating load: as the heating load increased, the net power of the entire system decreased to different degrees. The energy conservation effect of the LVM was the most favourable, followed by the ECM and the AHPM. Moreover, the LVM and AHPM were able to supply larger heat loads than the ECM, which was limited by the minimum cooling flow of the low pressure cylinder. Furthermore, to draw a more general conclusion, a similar case with an air-cooled 300 MW unit was studied, showing that the relative fuel consumption levels of the ECM and AHPM change.
(This article belongs to the Special Issue Advances in Applied Thermodynamics)
Open Access Article: A Dynamic Dark Information Energy Consistent with Planck Data
Entropy 2014, 16(4), 1902-1916; doi:10.3390/e16041902
Received: 16 February 2014 / Revised: 10 March 2014 / Accepted: 19 March 2014 / Published: 27 March 2014
PDF Full-text (762 KB) | HTML Full-text | XML Full-text
Abstract
The 2013 cosmology results from the European Space Agency's Planck spacecraft provide new limits on the dark energy equation of state parameter. Here we show that Holographic Dark Information Energy (HDIE), a dynamic dark energy model, achieves an optimal fit to the published datasets in which Planck data are combined with other astrophysical measurements. HDIE uses Landauer's principle to account for dark energy through the energy equivalent of the information, or entropy, of stellar-heated gas and dust. Combining Landauer's principle with the holographic principle yields an equation of state parameter determined solely by star formation history, effectively solving the “cosmic coincidence problem”. While HDIE mimics a cosmological constant at low redshifts, z < 1, the small difference from a cosmological constant expected at higher redshifts will only be resolved by the next generation of dark energy instrumentation. The HDIE model is shown to provide a viable alternative to the main cosmological constant/vacuum energy and scalar field/quintessence explanations.
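Landauer's principle, which HDIE invokes, sets a minimum energy cost of k_B·T·ln 2 per bit of information erased. A back-of-envelope evaluation of that bound (standard physics, not specific to the paper's cosmological application):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact 2019 SI value)

def landauer_energy_per_bit(temperature_k):
    """Minimum energy (J) dissipated when one bit is erased at temperature T."""
    return K_B * temperature_k * math.log(2)

e_room = landauer_energy_per_bit(300.0)  # roughly 2.9e-21 J near room temperature
```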
Open Access Article: The Entropy Production Distribution in Non-Markovian Thermal Baths
Entropy 2014, 16(4), 1917-1930; doi:10.3390/e16041917
Received: 16 December 2013 / Revised: 15 February 2014 / Accepted: 17 March 2014 / Published: 28 March 2014
Cited by 4 | PDF Full-text (209 KB) | HTML Full-text | XML Full-text
Abstract
In this work we study the distribution function for the total entropy production of a Brownian particle embedded in a non-Markovian thermal bath. The problem is studied in the overdamped approximation of the generalized Langevin equation, which accounts for a friction memory kernel characteristic of a Gaussian colored noise. Two physical situations are considered: (i) the particle in the harmonic trap is subjected to an arbitrary time-dependent driving force; and (ii) the minimum of the harmonic trap is arbitrarily dragged out of equilibrium by an external force. By assuming a natural non-Markovian canonical distribution for the initial conditions, the distribution function for the total entropy production becomes non-Gaussian; it is then characterized through its first three cumulants.
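For a sample, the first three cumulants used in such a characterization are simply the mean, the variance, and the third central moment (whose non-zero value signals the departure from Gaussianity). A generic estimator sketch with an illustrative dataset:

```python
def first_three_cumulants(samples):
    """Sample estimates of the cumulants k1, k2, k3.

    k1 = mean, k2 = variance, k3 = third central moment;
    k3 != 0 indicates a skewed, non-Gaussian distribution.
    """
    n = len(samples)
    k1 = sum(samples) / n
    k2 = sum((x - k1) ** 2 for x in samples) / n
    k3 = sum((x - k1) ** 3 for x in samples) / n
    return k1, k2, k3

k1, k2, k3 = first_three_cumulants([0.0, 0.0, 3.0])  # skewed toy data
```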
Open Access Article: Information in Biological Systems and the Fluctuation Theorem
Entropy 2014, 16(4), 1931-1948; doi:10.3390/e16041931
Received: 17 January 2014 / Revised: 27 March 2014 / Accepted: 28 March 2014 / Published: 1 April 2014
PDF Full-text (297 KB) | HTML Full-text | XML Full-text
Abstract
Some critical trends in information theory, its role in living systems, and its utilization in fluctuation theory are discussed. The mutual information of thermodynamic coupling is incorporated into the generalized fluctuation theorem by using information theory and nonequilibrium thermodynamics. Thermodynamically coupled dissipative structures in living systems are capable of degrading more energy and of processing complex information through developmental and environmental constraints. The generalized fluctuation theorem can quantify the hysteresis observed in the amount of irreversible work in nonequilibrium regimes in the presence of information and thermodynamic coupling.
(This article belongs to the Special Issue Advances in Methods and Foundations of Non-Equilibrium Thermodynamics)
Open Access Article: Topological Classification of Limit Cycles of Piecewise Smooth Dynamical Systems and Its Associated Non-Standard Bifurcations
Entropy 2014, 16(4), 1949-1968; doi:10.3390/e16041949
Received: 2 September 2013 / Revised: 20 March 2014 / Accepted: 20 March 2014 / Published: 1 April 2014
Cited by 1 | PDF Full-text (1553 KB) | HTML Full-text | XML Full-text
Abstract
In this paper, we propose a novel strategy for the synthesis and classification of nonsmooth limit cycles and their bifurcations (named non-standard bifurcations or discontinuity-induced bifurcations, DIBs) in n-dimensional piecewise-smooth dynamical systems, particularly continuous PWS and discontinuous (Filippov-type) PWS systems. The proposed qualitative approach explicitly includes two main aspects: multiple discontinuity boundaries (DBs) in the phase space and multiple intersections between DBs (corner manifolds, CMs). Previous classifications of DIBs of limit cycles have been restricted to generic cases with a single DB or a single CM. We use the definition of piecewise topological equivalence in order to synthesize all possibilities of nonsmooth limit cycles. Families, groups and subgroups of cycles are defined depending on the smoothness zones and discontinuity boundaries involved. The synthesized cycles are used to define bifurcation patterns when the system is perturbed with parametric changes. Four families of DIBs of limit cycles are defined depending on the properties of the cycles involved. Well-known and novel bifurcations can be classified using this approach.
(This article belongs to the Special Issue Dynamical Systems; print edition available)

Open Access Article: Stochastic Dynamics of Proteins and the Action of Biological Molecular Machines
Entropy 2014, 16(4), 1969-1982; doi:10.3390/e16041969
Received: 10 January 2014 / Revised: 27 February 2014 / Accepted: 25 March 2014 / Published: 1 April 2014
Cited by 2 | PDF Full-text (496 KB) | HTML Full-text | XML Full-text
Abstract
It is now well established that most if not all enzymatic proteins display slow stochastic dynamics of transitions between a variety of conformational substates composing their native state. A hypothesis is stated that protein conformational transition networks, just like the higher-level biological networks, the protein interaction network and the metabolic network, have evolved through a process of self-organized criticality. Here, criticality means that all three classes of networks are scale-free and, moreover, display a transition from a fractal organization on small length scales to a small-world organization on large length scales. Good mathematical models of such networks are stochastic critical branching trees extended by long-range shortcuts. Biological molecular machines are proteins that operate under isothermal conditions and hence are referred to as free energy transducers. They can be formally considered as enzymes that simultaneously catalyze two chemical reactions: the free energy-donating (input) reaction and the free energy-accepting (output) one. The far-from-equilibrium degree of coupling between the output and input reaction fluxes has been studied both theoretically and by means of Monte Carlo simulations on model networks. For single input and output gates the degree of coupling cannot exceed unity. Simulations of random walks on model networks involving more extended gates indicate that a degree of coupling higher than one is realized on the aforementioned critical branching trees extended by long-range shortcuts.
(This article belongs to the Special Issue Advances in Methods and Foundations of Non-Equilibrium Thermodynamics)
Open Access Article: Intersection Information Based on Common Randomness
Entropy 2014, 16(4), 1985-2000; doi:10.3390/e16041985
Received: 25 October 2013 / Revised: 27 March 2014 / Accepted: 28 March 2014 / Published: 4 April 2014
Cited by 8 | PDF Full-text (516 KB) | HTML Full-text | XML Full-text
Abstract
The introduction of the partial information decomposition generated a flurry of proposals for defining an intersection information that quantifies how much of “the same information” two or more random variables specify about a target random variable. As of yet, none is wholly satisfactory. A palatable measure of intersection information would provide a principled way to quantify slippery concepts such as synergy. Here, we introduce an intersection information measure based on the Gács-Körner common random variable that is the first to satisfy the coveted target monotonicity property. Our measure is imperfect, too, and we suggest directions for improvement.
(This article belongs to the Special Issue Entropy Methods in Guided Self-Organization)
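The Gács-Körner common random variable underlying the proposed measure has a concrete construction for finite alphabets: connect x and y whenever p(x, y) > 0, and the common variable is the connected component of the resulting bipartite graph. A sketch of that textbook construction (not the paper's full intersection-information measure):

```python
import math
from collections import defaultdict

def gacs_korner_common_information(joint):
    """H(Q) in bits, where Q is the connected component of the bipartite
    graph linking x and y whenever the joint probability p(x, y) > 0."""
    parent = {}  # union-find over nodes ('x', x) and ('y', y)

    def find(a):
        parent.setdefault(a, a)
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path halving
            a = parent[a]
        return a

    for (x, y), p in joint.items():
        if p > 0:
            parent[find(('x', x))] = find(('y', y))
    comp_prob = defaultdict(float)
    for (x, y), p in joint.items():
        comp_prob[find(('x', x))] += p
    return -sum(p * math.log2(p) for p in comp_prob.values() if p > 0)

# X = Y (perfect copy): the common variable captures everything (1 bit).
copy = {(0, 0): 0.5, (1, 1): 0.5}
# Uniform with full support: one component, so GK = 0 bits.
indep = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}
```

The two extremes show why GK common information is so conservative: any noise that fills out the joint support collapses the graph to one component and sends the measure to zero.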
Open Access Article: A Fuzzy Parallel Processing Scheme for Enhancing the Effectiveness of a Dynamic Just-in-time Location-aware Service System
Entropy 2014, 16(4), 2001-2022; doi:10.3390/e16042001
Received: 26 February 2014 / Revised: 31 March 2014 / Accepted: 1 April 2014 / Published: 8 April 2014
Cited by 5 | PDF Full-text (542 KB) | HTML Full-text | XML Full-text
Abstract
Location-aware service systems are a hot topic in diverse research fields including mobile commerce, ambient intelligence, remote sensing and ubiquitous computing. However, the timeliness and efficiency of such systems are two issues that have rarely been emphasized. For this reason, this study establishes a location-aware service system in which both the timeliness and efficiency of service provision are addressed. To this end, several innovative treatments are used in the proposed methodology. First, the uncertainty of detecting a user's location with the global positioning system is handled by modeling the location and speed of the user with fuzzy numbers. Subsequently, a fuzzy integer-nonlinear programming model is formulated to address the problem of finding the dynamic just-in-time service location and path for the user. To help solve the problem, the maximum entropy weighting function and the basic defuzzification distribution (BADD) method are applied to defuzzify the fuzzy variables. In addition, to enhance the efficiency of solving the problem, a fuzzy parallel processing scheme is proposed that decomposes the problem into smaller pieces handled by separate processing modules. A worked example demonstrates the proposed methodology, and its effectiveness is confirmed with an experiment: according to the results, the waiting time could be reduced by 60%.
(This article belongs to the Special Issue Complex Systems and Nonlinear Dynamics)
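The BADD method named in the abstract maps a fuzzy set to a crisp value via a power-weighted centroid: γ = 1 recovers the ordinary centroid, and γ → ∞ approaches the maximum-membership point. A minimal sketch (the γ values and memberships here are illustrative, not taken from the paper):

```python
def badd_defuzzify(xs, memberships, gamma):
    """BADD defuzzification: x* = sum(x_i * mu_i**gamma) / sum(mu_i**gamma)."""
    weights = [mu ** gamma for mu in memberships]
    return sum(x * w for x, w in zip(xs, weights)) / sum(weights)

xs = [1.0, 2.0, 3.0]
mu = [0.2, 1.0, 0.5]          # membership grades of the fuzzy set
centroid = badd_defuzzify(xs, mu, 1.0)    # gamma = 1: ordinary centroid
near_max = badd_defuzzify(xs, mu, 200.0)  # large gamma: concentrates on argmax mu
```

Tuning γ therefore interpolates between averaging and mode-seeking behaviour, which is why BADD is a convenient single-parameter defuzzifier.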
Open Access Article: Matrix Algebraic Properties of the Fisher Information Matrix of Stationary Processes
Entropy 2014, 16(4), 2023-2055; doi:10.3390/e16042023
Received: 12 February 2014 / Revised: 11 March 2014 / Accepted: 24 March 2014 / Published: 8 April 2014
PDF Full-text (365 KB) | HTML Full-text | XML Full-text
Abstract
This survey paper summarizes results to be found in a series of papers on the matrix algebraic properties of the Fisher information matrix (FIM) of stationary processes. The FIM is an ingredient of the Cramér-Rao inequality and belongs to the basics of asymptotic estimation theory in mathematical statistics. The FIM is interconnected with the Sylvester, Bezout and tensor Sylvester matrices; through these interconnections it is shown that the FIM of scalar and multiple stationary processes fulfills the resultant matrix property. A statistical distance measure involving entries of the FIM is presented. In quantum information, a different statistical distance measure is set forth; it is related to the Fisher information but considers the information about one parameter in a particular measurement procedure. The FIM of scalar stationary processes is also interconnected with the solutions of appropriate Stein equations, and conditions for the FIM to satisfy certain Stein equations are formulated. The presence of Vandermonde matrices is also emphasized.
(This article belongs to the Special Issue Information Geometry)
Open Access Article: Entropy and Exergy Analysis of a Heat Recovery Vapor Generator for Ammonia-Water Mixtures
Entropy 2014, 16(4), 2056-2070; doi:10.3390/e16042056
Received: 6 January 2014 / Revised: 7 April 2014 / Accepted: 9 April 2014 / Published: 11 April 2014
Cited by 4 | PDF Full-text (314 KB) | HTML Full-text | XML Full-text
Abstract
Recently, power generation systems using ammonia-water binary mixtures as a working fluid have been attracting much attention for their efficient conversion of low-grade heat sources into useful energy forms. This paper presents a First and Second Law thermodynamic analysis of a heat recovery vapor generator (HRVG) for ammonia-water mixtures when the heat source is low-temperature energy in the form of sensible heat. In the analysis, key parameters such as the ammonia mass concentration and the pressure of the binary mixture are studied to investigate their effects on the system performance, including the effectiveness of heat transfer, entropy generation, and exergy efficiency. The results show that the ammonia concentration and the pressure of the mixture have significant effects on the system performance of the HRVG.
(This article belongs to the Special Issue Entropy and the Second Law of Thermodynamics)
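The Second Law bookkeeping in such an analysis reduces to the entropy generated across the heat exchanger and the corresponding exergy destruction T0·S_gen (Gouy-Stodola). A simplified sketch for two single-phase streams with constant heat capacity; the real HRVG involves a boiling ammonia-water mixture, which this deliberately ignores, and all numbers are illustrative:

```python
import math

def heat_exchanger_entropy_generation(mcp_hot, th_in, th_out,
                                      mcp_cold, tc_in, tc_out):
    """S_gen (kW/K) for two streams with constant m*cp (kW/K), T in kelvin."""
    ds_hot = mcp_hot * math.log(th_out / th_in)    # negative: hot stream cools
    ds_cold = mcp_cold * math.log(tc_out / tc_in)  # positive: cold stream heats
    return ds_hot + ds_cold

# Energy-balanced example: hot 500 -> 400 K, cold 300 -> 400 K, m*cp = 1 kW/K each.
s_gen = heat_exchanger_entropy_generation(1.0, 500.0, 400.0, 1.0, 300.0, 400.0)
exergy_destroyed = 298.15 * s_gen  # kW of work potential lost, T0 = 298.15 K
```

S_gen is strictly positive whenever heat crosses a finite temperature difference, which is exactly the irreversibility the paper's exergy efficiency tracks.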
Open Access Article: The Entropic Potential Concept: A New Way to Look at Energy Transfer Operations
Entropy 2014, 16(4), 2071-2084; doi:10.3390/e16042071
Received: 4 February 2014 / Revised: 7 April 2014 / Accepted: 10 April 2014 / Published: 14 April 2014
Cited by 5 | PDF Full-text (227 KB) | HTML Full-text | XML Full-text
Abstract
Energy transfer operations or processes are systematically analyzed with respect to the way they can be assessed. It turns out that the energy transfer should not only be characterized by the operation or process itself but should be seen in a wider context. This context is introduced as the entropic potential of the energy that is transferred. It takes into account the overall transfer of the energy between its initial and final states, i.e., starting as pure exergy when it is a primary energy, for example, and ending as pure anergy when it has become part of the internal energy of the ambient. With this concept an energy devaluation number can be defined which has several properties with a reasonable physical background. Two examples of different complexity are given and discussed with respect to the physical meaning of the new energy devaluation number.
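On one plausible reading of the abstract, the entropic potential of an energy amount Q is Q/T_ambient (the entropy eventually generated once Q has fully degraded to anergy), and a process's devaluation number is the fraction of that potential it uses up. A sketch under that assumption, for steady heat conduction between two reservoirs; the paper's exact definition should be taken from the article itself:

```python
def energy_devaluation_number(q, t_hot, t_cold, t_ambient):
    """Fraction of Q's entropic potential (Q / t_ambient) consumed by
    conducting heat q from t_hot to t_cold (temperatures in kelvin).

    ASSUMED definition: N = t_ambient * S_gen / q, bounded in [0, 1].
    """
    s_gen = q * (1.0 / t_cold - 1.0 / t_hot)  # entropy generation rate
    return t_ambient * s_gen / q

# Conduction from 600 K to 400 K with the ambient at 300 K uses a quarter
# of the heat's entropic potential.
n = energy_devaluation_number(1.0, 600.0, 400.0, 300.0)
```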
Open Access Article: Cross Layer Interference Management in Wireless Biomedical Networks
Entropy 2014, 16(4), 2085-2104; doi:10.3390/e16042085
Received: 10 November 2013 / Revised: 4 March 2014 / Accepted: 4 April 2014 / Published: 14 April 2014
Cited by 4 | PDF Full-text (1706 KB) | HTML Full-text | XML Full-text
Abstract
Interference in wireless networks is a central phenomenon when multiple uncoordinated links share a common communication medium. The study of the interference channel was initiated by Shannon in 1961, and since then the problem has been thoroughly elaborated at the information-theoretic level, but its characterization still remains an open issue. When multiple uncoordinated links share a common medium, the effect of interference is a crucial limiting factor for network performance. In this work, using cross-layer cooperative communication techniques, we study how to compensate for interference in the context of wireless biomedical networks, where many links transferring biomedical or other health-related data may be formed and suffer from all other interfering transmissions, so as to allow successful receptions and improve the overall network performance. We define the interference-limited communication range as the critical region around a receiver, with a number of surrounding interfering nodes, within which a successful communication link can be formed. Our results indicate that more transmissions succeed when the transmission rate and power are adapted to the path loss exponent and to the selected mode of the underlying communication technique, allowing interference mitigation and, where possible, lower power consumption and higher achievable transmission rates.
(This article belongs to the Special Issue Advances in Information Theory)
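The interference-limited communication range defined above can be probed with a textbook SINR model: received power decays as d^-α, and a link succeeds while the SINR stays above a threshold. A generic sketch with power-law path loss and fixed interferer positions (all parameters are illustrative, not the paper's):

```python
import math

def sinr(d, interferer_distances, alpha, tx_power=1.0, noise=1e-9):
    """SINR at a receiver distance d from its transmitter, with co-channel
    interferers (same tx power) at the given distances from the receiver."""
    signal = tx_power * d ** -alpha
    interference = sum(tx_power * di ** -alpha for di in interferer_distances)
    return signal / (noise + interference)

def shannon_rate(d, interferer_distances, alpha, bandwidth_hz=1.0):
    """Achievable rate (bit/s; bit/s/Hz when bandwidth_hz = 1)."""
    return bandwidth_hz * math.log2(1.0 + sinr(d, interferer_distances, alpha))

# SINR falls as the desired link stretches toward its interference-limited
# range, while the interference floor stays fixed.
near = sinr(1.0, [5.0, 8.0], alpha=2.0)
far = sinr(3.0, [5.0, 8.0], alpha=2.0)
```

Sweeping d until the SINR crosses a decoding threshold yields the critical communication region the abstract describes; raising α attenuates distant interferers faster than the (nearer) desired signal.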
Open Access Article: Wiretap Channel with Information Embedding on Actions
Entropy 2014, 16(4), 2105-2130; doi:10.3390/e16042105
Received: 11 November 2013 / Revised: 28 March 2014 / Accepted: 10 April 2014 / Published: 14 April 2014
Cited by 3 | PDF Full-text (348 KB) | HTML Full-text | XML Full-text
Abstract
Information embedding on actions is a new channel model in which a specific decoder observes the actions taken by the encoder and retrieves part of the message intended for the receiver. We revisit this model and consider a different scenario in which a secrecy constraint is imposed. By adding a wiretapper to the model, we aim to send the confidential message to the receiver while keeping it secret from the wiretapper as much as possible. We characterize inner and outer bounds on the capacity-equivocation region of such a channel with noncausal (and causal) channel state information. Furthermore, lower and upper bounds on the sum secrecy capacity are also obtained. In addition, by eliminating the specific decoder, we obtain a new outer bound on the capacity-equivocation region of the wiretap channel with action-dependent states and prove that it is tighter than the existing outer bound. A binary example is presented to illustrate the tradeoff between the sum secrecy rate and the information embedding rate under the secrecy constraint. We find that the secrecy constraint and the communication requirements of information embedding have a negative impact on the achievable secrecy transmission rate of the given communication link.
Open Access Article: Information Geometry of Positive Measures and Positive-Definite Matrices: Decomposable Dually Flat Structure
Entropy 2014, 16(4), 2131-2145; doi:10.3390/e16042131
Received: 14 February 2014 / Revised: 9 April 2014 / Accepted: 10 April 2014 / Published: 14 April 2014
Cited by 7 | PDF Full-text (246 KB) | HTML Full-text | XML Full-text
Abstract
Information geometry studies the dually flat structure of a manifold, highlighted by the generalized Pythagorean theorem. The present paper studies a class of Bregman divergences called the (ρ,τ)-divergence. A (ρ,τ)-divergence generates a dually flat structure in the manifold of positive measures, as well as in the manifold of positive-definite matrices. The class is composed of decomposable divergences, which are written as a sum of componentwise divergences. Conversely, a decomposable dually flat divergence is shown to be a (ρ,τ)-divergence. A (ρ,τ)-divergence is determined by two monotone scalar functions, ρ and τ. The class includes the KL-divergence and the α-, β- and (α,β)-divergences as special cases. The transformation between an affine parameter and its dual is easily calculated in the case of a decomposable divergence; therefore, such a divergence is useful for obtaining the center of a cluster of points, which can be applied to classification and information retrieval in vision. For the manifold of positive-definite matrices, in addition to dual flatness and decomposability, we require invariance under linear transformations, in particular under orthogonal transformations. This opens a way to define a new class of divergences, called the (ρ,τ)-structure, in the manifold of positive-definite matrices.
(This article belongs to the Special Issue Information Geometry)
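The KL-divergence's membership in the Bregman family can be checked numerically: with generator F(p) = Σ p_i ln p_i, the Bregman divergence between positive measures reduces to the generalized KL-divergence. A small check of this standard information-geometry fact (not the paper's (ρ,τ) machinery):

```python
import math

def bregman_neg_entropy(p, q):
    """D_F(p, q) = F(p) - F(q) - <grad F(q), p - q> with F(p) = sum p ln p.

    Algebraically this equals the generalized KL-divergence
    sum(p ln(p/q) - p + q), a decomposable (componentwise) divergence.
    """
    f = lambda v: sum(x * math.log(x) for x in v)
    grad_f = [math.log(x) + 1.0 for x in q]
    return f(p) - f(q) - sum(g * (pi - qi) for g, pi, qi in zip(grad_f, p, q))

def generalized_kl(p, q):
    return sum(pi * math.log(pi / qi) - pi + qi for pi, qi in zip(p, q))

p = [0.2, 0.5, 0.3]
q = [0.4, 0.4, 0.2]  # for probability vectors the -p + q terms cancel
```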
Open Access Article: Quantifying Unique Information
Entropy 2014, 16(4), 2161-2183; doi:10.3390/e16042161
Received: 15 January 2014 / Revised: 24 March 2014 / Accepted: 4 April 2014 / Published: 15 April 2014
Cited by 15 | PDF Full-text (277 KB) | HTML Full-text | XML Full-text
Abstract
We propose new measures of shared information, unique information and synergistic information that can be used to decompose the mutual information of a pair of random variables (Y, Z) with a third random variable X. Our measures are motivated by an operational idea of unique information, which suggests that shared information and unique information should depend only on the marginal distributions of the pairs (X, Y) and (X, Z). Although this invariance property has not been studied before, it is satisfied by other proposed measures of shared information. The invariance property does not uniquely determine our new measures, but it implies that the functions that we define are bounds on any other measures satisfying the same invariance property. We study the properties of our measures and compare them to other candidate measures.
Open Access Article: A Two-stage Method for Solving Multi-resident Activity Recognition in Smart Environments
Entropy 2014, 16(4), 2184-2203; doi:10.3390/e16042184
Received: 14 January 2014 / Revised: 21 March 2014 / Accepted: 4 April 2014 / Published: 15 April 2014
Cited by 3 | PDF Full-text (455 KB) | HTML Full-text | XML Full-text
Abstract
To recognize individual activities in multi-resident environments with pervasive sensors, some researchers have pointed out that finding data associations can contribute to activity recognition; previous methods either need or infer the data association when recognizing new multi-resident activities from
[...] Read more.
To recognize individual activities in multi-resident environments with pervasive sensors, some researchers have pointed out that finding data associations can contribute to activity recognition; previous methods either need or infer the data association when recognizing new multi-resident activities from new sensor observations. However, data associations are often difficult to find, and available approaches to multi-resident activity recognition degrade when the data association is not given or is induced with low accuracy. This paper exploits some simple knowledge of multi-resident activities by defining combined labels and their state set, and proposes a two-stage method for multi-resident activity recognition. We define the combined-label states in the model-building phase with the help of data association, and learn the combined-label states in the recognition phase without it. Our two-stage method is embodied in the recognition phase, where we infer multi-resident activities in the second stage after learning the combined-label states in the first stage. Experiments using the multi-resident CASAS data demonstrate that our method can increase recognition accuracy by approximately 10%. Full article
Open AccessArticle Co and In Doped Ni-Mn-Ga Magnetic Shape Memory Alloys: A Thorough Structural, Magnetic and Magnetocaloric Study
Entropy 2014, 16(4), 2204-2222; doi:10.3390/e16042204
Received: 20 March 2014 / Revised: 10 April 2014 / Accepted: 10 April 2014 / Published: 16 April 2014
Cited by 17 | PDF Full-text (1932 KB) | HTML Full-text | XML Full-text
Abstract
In Ni-Mn-Ga ferromagnetic shape memory alloys, Co-doping plays a major role in determining a peculiar phase diagram where, besides a change in the critical temperatures, a change of number, order and nature of phase transitions (e.g., from ferromagnetic to paramagnetic or from paramagnetic
[...] Read more.
In Ni-Mn-Ga ferromagnetic shape memory alloys, Co-doping plays a major role in determining a peculiar phase diagram where, besides a change in the critical temperatures, a change of the number, order and nature of phase transitions (e.g., from ferromagnetic to paramagnetic or from paramagnetic to ferromagnetic, on heating) can be obtained, together with a change in the giant magnetocaloric effect from direct to inverse. Here we present a thorough study of the intrinsic magnetic and structural properties, including their dependence on hydrostatic pressure, that are at the basis of the multifunctional behavior of Co- and In-doped alloys. We study in depth their magnetocaloric properties, taking advantage of complementary calorimetric and magnetic techniques, and show that if a proper measurement protocol is adopted they all merge to the same values, even in the case of first-order transitions. A simplified model for the estimation of the adiabatic temperature change that relies only on indirect measurements is proposed, allowing for the quick and reliable evaluation of the magnetocaloric potential of new materials starting from readily available magnetic measurements. Full article
(This article belongs to the Special Issue Entropy in Shape Memory Alloys)
Open AccessArticle An Extended Result on the Optimal Estimation Under the Minimum Error Entropy Criterion
Entropy 2014, 16(4), 2223-2233; doi:10.3390/e16042223
Received: 24 January 2014 / Revised: 2 April 2014 / Accepted: 4 April 2014 / Published: 17 April 2014
PDF Full-text (338 KB) | HTML Full-text | XML Full-text
Abstract
The minimum error entropy (MEE) criterion has been successfully used in fields such as parameter estimation, system identification and supervised machine learning. There is in general no explicit expression for the optimal MEE estimate unless some constraints on the conditional distribution are
[...] Read more.
The minimum error entropy (MEE) criterion has been successfully used in fields such as parameter estimation, system identification and supervised machine learning. There is in general no explicit expression for the optimal MEE estimate unless some constraints on the conditional distribution are imposed. A recent paper proved that if the conditional density is conditionally symmetric and unimodal (CSUM), then the optimal MEE estimate (with Shannon entropy) equals the conditional median. In this study, we extend this result to generalized MEE estimation, where the optimality criterion is the Rényi entropy or, equivalently, the α-order information potential (IP). Full article
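The α-order information potential mentioned above is easy to illustrate for α = 2. The sketch below shows the standard Parzen-window estimator of the quadratic information potential with a Gaussian kernel; the function names and the bandwidth choice are illustrative assumptions, not the paper's derivation:

```python
import numpy as np

def information_potential(errors, sigma=1.0):
    # Quadratic (alpha = 2) information potential: the pairwise kernel mean
    # V(e) = (1/N^2) * sum_ij G(e_i - e_j), with G a Gaussian of variance
    # 2*sigma^2 (two Parzen kernels of width sigma convolved together).
    e = np.asarray(errors, dtype=float)
    diffs = e[:, None] - e[None, :]
    return np.mean(np.exp(-diffs**2 / (4 * sigma**2)) / np.sqrt(4 * np.pi * sigma**2))

def renyi_quadratic_entropy(errors, sigma=1.0):
    # H_2(e) = -log V(e): minimizing H_2 is equivalent to maximizing the IP.
    return -np.log(information_potential(errors, sigma))

rng = np.random.default_rng(0)
h_small = renyi_quadratic_entropy(rng.normal(0.0, 0.1, 500))  # concentrated errors
h_large = renyi_quadratic_entropy(rng.normal(0.0, 2.0, 500))  # spread-out errors
```

Concentrated errors yield the lower quadratic entropy (h_small < h_large), which is why minimizing H_2, or equivalently maximizing the IP, drives the error distribution toward a sharp peak.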
Open AccessArticle Finite-Time Chaos Suppression of Permanent Magnet Synchronous Motor Systems
Entropy 2014, 16(4), 2234-2243; doi:10.3390/e16042234
Received: 12 September 2013 / Revised: 11 April 2014 / Accepted: 11 April 2014 / Published: 21 April 2014
Cited by 3 | PDF Full-text (303 KB) | HTML Full-text | XML Full-text
Abstract
This paper considers the problem of chaos suppression for the Permanent Magnet Synchronous Motor (PMSM) system via finite-time control. Based on Lyapunov stability theory, finite-time controllers are developed such that the chaotic behaviors of the PMSM system can be suppressed.
[...] Read more.
This paper considers the problem of chaos suppression for the Permanent Magnet Synchronous Motor (PMSM) system via finite-time control. Based on Lyapunov stability theory, finite-time controllers are developed such that the chaotic behaviors of the PMSM system can be suppressed. The effectiveness and accuracy of the proposed methods are shown in numerical simulations. Full article
(This article belongs to the Special Issue Dynamical Systems) Print Edition available
Open AccessArticle Parameter Estimation for Spatio-Temporal Maximum Entropy Distributions: Application to Neural Spike Trains
Entropy 2014, 16(4), 2244-2277; doi:10.3390/e16042244
Received: 19 February 2014 / Revised: 28 March 2014 / Accepted: 8 April 2014 / Published: 22 April 2014
Cited by 1 | PDF Full-text (1785 KB) | HTML Full-text | XML Full-text
Abstract
We propose a numerical method to learn maximum entropy (MaxEnt) distributions with spatio-temporal constraints from experimental spike trains. This is an extension of two papers, [10] and [4], which proposed the estimation of parameters where only spatial constraints were taken into account. The
[...] Read more.
We propose a numerical method to learn maximum entropy (MaxEnt) distributions with spatio-temporal constraints from experimental spike trains. This is an extension of two papers, [10] and [4], which proposed the estimation of parameters where only spatial constraints were taken into account. The extension we propose allows one to properly handle memory effects in spike statistics, for large-sized neural networks. Full article
(This article belongs to the Special Issue Maximum Entropy and Its Application)
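As a toy illustration of the purely spatial (memoryless) case that the paper extends, a pairwise MaxEnt model can be fitted to binary spike patterns by moment matching. The sketch below enumerates all 2^n states, which is feasible only for a handful of neurons; it is a minimal didactic version, not the authors' scalable spatio-temporal method:

```python
import itertools
import numpy as np

def fit_spatial_maxent(data, n_iter=3000, lr=0.2):
    # Fit a pairwise (Ising-like) MaxEnt model
    #   P(x) proportional to exp(sum_i h_i x_i + sum_{i<j} J_ij x_i x_j)
    # to binary patterns by gradient ascent on the log-likelihood,
    # enumerating all 2^n states (feasible only for small n).
    n = data.shape[1]
    pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]

    def features(x):
        pair_terms = np.stack([x[:, i] * x[:, j] for i, j in pairs], axis=1)
        return np.hstack([x, pair_terms])

    states = np.array(list(itertools.product([0, 1], repeat=n)), dtype=float)
    f_states = features(states)
    f_emp = features(data.astype(float)).mean(axis=0)  # empirical moments
    theta = np.zeros(f_states.shape[1])
    for _ in range(n_iter):
        logits = f_states @ theta
        p = np.exp(logits - logits.max())
        p /= p.sum()
        theta += lr * (f_emp - p @ f_states)  # move model moments toward data
    return theta, p, states

rng = np.random.default_rng(1)
spikes = (rng.random((500, 3)) < 0.3).astype(int)  # toy binary spike patterns
theta, p, states = fit_spatial_maxent(spikes)
```

Since the problem is convex, the fitted distribution reproduces the empirical firing rates and pairwise coincidence rates; handling temporal constraints, as in the article, requires replacing the state space with blocks of successive time bins.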
Open AccessArticle Going Round in Circles: Landauer vs. Norton on the Thermodynamics of Computation
Entropy 2014, 16(4), 2278-2290; doi:10.3390/e16042278
Received: 26 September 2013 / Revised: 9 January 2014 / Accepted: 16 January 2014 / Published: 22 April 2014
Cited by 1 | PDF Full-text (191 KB) | HTML Full-text | XML Full-text
Abstract
There seems to be a consensus among physicists that there is a connection between information processing and thermodynamics. In particular, Landauer’s Principle (LP) is widely assumed as part of the foundation of information theoretic/computational reasoning in diverse areas of physics including cosmology. It
[...] Read more.
There seems to be a consensus among physicists that there is a connection between information processing and thermodynamics. In particular, Landauer’s Principle (LP) is widely assumed as part of the foundation of information-theoretic/computational reasoning in diverse areas of physics, including cosmology. It is also often appealed to in discussions about Maxwell’s demon and the status of the Second Law of Thermodynamics. However, LP has been challenged. In his 2005 paper, Norton argued that LP has not been proved. LPSG offered a new proof of LP. Norton argued that the LPSG proof is unsound, and Ladyman and Robertson defended it. However, Norton’s latest work also generalizes his critique to argue for a no-go result that he purports to be the end of the thermodynamics of computation. Here we review the dialectic as it currently stands and consider Norton’s no-go result. Full article
(This article belongs to the Special Issue Maxwell’s Demon 2013)
Open AccessArticle On the Clausius-Duhem Inequality and Maximum Entropy Production in a Simple Radiating System
Entropy 2014, 16(4), 2291-2308; doi:10.3390/e16042291
Received: 5 January 2014 / Revised: 10 March 2014 / Accepted: 14 April 2014 / Published: 22 April 2014
Cited by 2 | PDF Full-text (342 KB) | HTML Full-text | XML Full-text
Abstract
A black planet irradiated by a sun serves as the archetype for a simple radiating two-layer system admitting of a continuum of steady states under steadfast insolation. Steady entropy production rates may be calculated for different opacities of one of the layers, explicitly
[...] Read more.
A black planet irradiated by a sun serves as the archetype for a simple radiating two-layer system admitting of a continuum of steady states under steadfast insolation. Steady entropy production rates may be calculated for different opacities of one of the layers, explicitly so for the radiative interactions, and indirectly for all the material irreversibilities involved in maintaining thermal uniformity in each layer. The second law of thermodynamics is laid down in two versions, one of which is the well-known Clausius-Duhem inequality, the other being a modern version known as the entropy inequality. By maximizing the material entropy production rate, a state may be selected that always fulfills the Clausius-Duhem inequality. Some formally possible steady states, while violating the latter, still obey the entropy inequality. In terms of Earth’s climate, global entropy production rates exhibit extrema for any “greenhouse effect”. However, and only insofar as the model be accepted as representative of Earth’s climate, the extrema will not be found to agree with observed (effective) temperatures assignable to both the atmosphere and surface. This notwithstanding, the overall entropy production for the present greenhouse effect on Earth is very close to the maximum entropy production rate of a uniformly warm steady state at the planet’s effective temperature. For an Earth with a weak(er) greenhouse effect the statement is no longer true. Full article
(This article belongs to the Special Issue Maximum Entropy Production)
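The steady-state radiative balance behind such layered models can be illustrated with the textbook single-layer grey atmosphere, a deliberate simplification of the paper's two-layer continuum; the solar constant, albedo, and emissivity values below are illustrative assumptions:

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def effective_temperature(solar_constant=1361.0, albedo=0.3):
    # Planetary energy balance: absorbed shortwave (averaged over the
    # sphere) equals emitted longwave, sigma * T_e**4.
    absorbed = solar_constant * (1.0 - albedo) / 4.0
    return (absorbed / SIGMA) ** 0.25

def surface_temperature(t_eff, epsilon):
    # Single-layer grey atmosphere with longwave emissivity epsilon:
    # balancing the surface and layer budgets gives
    # T_s = T_e * (2 / (2 - epsilon)) ** 0.25.
    return t_eff * (2.0 / (2.0 - epsilon)) ** 0.25

t_e = effective_temperature()          # roughly 255 K for Earth-like values
t_s = surface_temperature(t_e, 0.78)   # a stronger greenhouse raises T_s
```

Varying epsilon plays the role of the layer opacity in the abstract: the effective temperature is fixed by the insolation, while the surface temperature, and hence the internal entropy production, changes with the greenhouse strength.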
Open AccessArticle Measures of Causality in Complex Datasets with Application to Financial Data
Entropy 2014, 16(4), 2309-2349; doi:10.3390/e16042309
Received: 8 January 2014 / Revised: 10 March 2014 / Accepted: 8 April 2014 / Published: 24 April 2014
Cited by 2 | PDF Full-text (410 KB) | HTML Full-text | XML Full-text
Abstract
This article investigates the causality structure of financial time series. We concentrate on three main approaches to measuring causality: linear Granger causality, kernel generalisations of Granger causality (based on ridge regression and the Hilbert–Schmidt norm of the cross-covariance operator) and transfer entropy, examining
[...] Read more.
This article investigates the causality structure of financial time series. We concentrate on three main approaches to measuring causality: linear Granger causality, kernel generalisations of Granger causality (based on ridge regression and the Hilbert–Schmidt norm of the cross-covariance operator) and transfer entropy, examining each method and comparing their theoretical properties, with special attention given to the ability to capture nonlinear causality. We also present the theoretical benefits of applying non-symmetrical measures rather than symmetrical measures of dependence. We apply the measures to a range of simulated and real data. The simulated data sets were generated with linear and several types of nonlinear dependence, using bivariate, as well as multivariate settings. An application to real-world financial data highlights the practical difficulties, as well as the potential of the methods. We use two real data sets: (1) U.S. inflation and one-month Libor; (2) S&P data and exchange rates for the following currencies: AUDJPY, CADJPY, NZDJPY, AUDCHF, CADCHF, NZDCHF. Overall, we reach the conclusion that no single method can be recognised as the best in all circumstances, and each of the methods has its domain of best applicability. We also highlight areas for improvement and future research. Full article
(This article belongs to the collection Advances in Applied Statistical Mechanics)
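The simplest of the three approaches compared above, linear Granger causality, can be sketched in a few lines. The order-1 regression below on synthetic data is a minimal illustration under stated assumptions, not the article's implementation:

```python
import numpy as np

def granger_1(x, y):
    # Order-1 linear Granger causality from x to y: log ratio of the
    # residual variance of the restricted regression (y on its own past)
    # to that of the full regression (y on its own past and x's past).
    target = y[1:]
    restricted = np.column_stack([np.ones(len(target)), y[:-1]])
    full = np.column_stack([np.ones(len(target)), y[:-1], x[:-1]])
    res_r = target - restricted @ np.linalg.lstsq(restricted, target, rcond=None)[0]
    res_f = target - full @ np.linalg.lstsq(full, target, rcond=None)[0]
    return np.log(res_r.var() / res_f.var())

# Synthetic pair where x drives y but not vice versa.
rng = np.random.default_rng(2)
n = 2000
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.normal()
```

On this data the measure correctly finds granger_1(x, y) large and granger_1(y, x) near zero; capturing nonlinear causality, as the article stresses, requires the kernel or transfer-entropy variants.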
Open AccessArticle Fractional Order Generalized Information
Entropy 2014, 16(4), 2350-2361; doi:10.3390/e16042350
Received: 3 April 2014 / Revised: 16 April 2014 / Accepted: 21 April 2014 / Published: 24 April 2014
Cited by 22 | PDF Full-text (4266 KB) | HTML Full-text | XML Full-text
Abstract
This paper formulates a novel expression for entropy inspired by the properties of Fractional Calculus. The characteristics of the generalized fractional entropy are tested both on standard probability distributions and on real-world data series. The results reveal that tuning the fractional order allows
[...] Read more.
This paper formulates a novel expression for entropy inspired by the properties of Fractional Calculus. The characteristics of the generalized fractional entropy are tested both on standard probability distributions and on real-world data series. The results reveal that tuning the fractional order allows a high sensitivity to the signal evolution, which is useful in describing the dynamics of complex systems. The concepts are also extended to relative distances and tested with several sets of data, confirming the value of the generalization. Full article
(This article belongs to the Special Issue Complex Systems and Nonlinear Dynamics)
Open AccessArticle Gyarmati’s Variational Principle of Dissipative Processes
Entropy 2014, 16(4), 2362-2383; doi:10.3390/e16042362
Received: 30 November 2013 / Revised: 11 April 2014 / Accepted: 16 April 2014 / Published: 24 April 2014
Cited by 6 | PDF Full-text (238 KB) | HTML Full-text | XML Full-text
Abstract
Like in mechanics and electrodynamics, the fundamental laws of the thermodynamics of dissipative processes can be compressed into Gyarmati’s variational principle. This variational principle, in both its differential (local) and integral (global) forms, was formulated by Gyarmati in 1965. The consistent application
[...] Read more.
Like in mechanics and electrodynamics, the fundamental laws of the thermodynamics of dissipative processes can be compressed into Gyarmati’s variational principle. This variational principle, in both its differential (local) and integral (global) forms, was formulated by Gyarmati in 1965. The consistent application of both the local and the global forms of Gyarmati’s principle provides, in developing the theory of irreversible thermodynamics, all the advantages that the corresponding classical variational principles, e.g., Gauss’ differential principle of least constraint or Hamilton’s integral principle, provide in the study of mechanics and electrodynamics. Full article
(This article belongs to the Special Issue Advances in Methods and Foundations of Non-Equilibrium Thermodynamics)

Review

Jump to: Research, Other

Open AccessReview Possible Further Evidence for the Thixotropic Phenomenon of Water
Entropy 2014, 16(4), 2146-2160; doi:10.3390/e16042146
Received: 27 November 2013 / Revised: 14 February 2014 / Accepted: 4 April 2014 / Published: 14 April 2014
Cited by 4 | PDF Full-text (283 KB) | HTML Full-text | XML Full-text
Abstract
In this work we review the literature for possible confirmation of a phenomenon that was proposed to develop when water is left to stand for some time undisturbed in closed vessels. The phenomenon has been termed thixotropy of water due to the weak
[...] Read more.
In this work we review the literature for possible confirmation of a phenomenon that was proposed to develop when water is left to stand for some time undisturbed in closed vessels. The phenomenon has been termed the thixotropy of water due to the weak gel-like behaviour which may develop spontaneously over time, and in which ions and contact with hydrophilic surfaces seem to play important roles. Thixotropy is a property of certain gels and liquids that under normal conditions are highly viscous, whereas during mechanical processing their viscosity diminishes. We found experiments indicating water’s self-organizing properties, long-lived inhomogeneities and time-dependent changes in the spectral parameters of aqueous systems. The large-scale inhomogeneities in aqueous solutions seem to occur in a vast number of systems. Long-term spectral changes of aqueous systems were observed even after the source of radiation was switched off or removed, and water was considered to be an active excitable medium in which appropriate conditions for self-organization can be established. In short, the thixotropic phenomenon of water is further indicated by different experimental techniques and may be triggered by large-scale ordering of water in the vicinity of nucleating solutes and hydrophilic surfaces. Full article

Other

Jump to: Research, Review

Open AccessRetraction Retraction: Zheng, T. et al. Effect of Heat Leak and Finite Thermal Capacity on the Optimal Configuration of a Two-Heat-Reservoir Heat Engine for Another Linear Heat Transfer Law. Entropy 2003, 5, 519–530
Entropy 2014, 16(4), 1983-1984; doi:10.3390/e16041983
Received: 28 March 2014 / Accepted: 4 April 2014 / Published: 4 April 2014
PDF Full-text (120 KB) | HTML Full-text | XML Full-text
Abstract
The editors were recently made aware that a paper published in Entropy in 2003 [1] exhibited characteristics of duplication and self-plagiarism. After investigating the matter, and discussing the situation with the authors, they have offered to retract this paper (http://www.mdpi.com/1099-4300/5/5/519).
[...] Read more.
The editors were recently made aware that a paper published in Entropy in 2003 [1] exhibited characteristics of duplication and self-plagiarism. After investigating the matter, and discussing the situation with the authors, they have offered to retract this paper (http://www.mdpi.com/1099-4300/5/5/519). Full article
