Table of Contents

Entropy, Volume 18, Issue 8 (August 2016)

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive table of contents of newly released issues.
  • PDF is the official format for papers, which are published in both HTML and PDF forms. To view the papers in PDF format, click on the "PDF Full-text" link, and use the free Adobe Reader to open them.
Displaying articles 1-39

Research

Jump to: Review, Other

Open Access Article Optimal Noise Benefit in Composite Hypothesis Testing under Different Criteria
Entropy 2016, 18(8), 400; doi:10.3390/e18080400
Received: 30 May 2016 / Revised: 14 August 2016 / Accepted: 16 August 2016 / Published: 19 August 2016
PDF Full-text (958 KB) | HTML Full-text | XML Full-text
Abstract
The detectability for a noise-enhanced composite hypothesis testing problem according to different criteria is studied. In this work, the noise-enhanced detection problem is formulated as a noise-enhanced classical Neyman–Pearson (NP), Max–min, or restricted NP problem when the prior information is completely known, completely unknown, or partially known, respectively. Next, the detection performances are compared and the feasible range of the constraint on the minimum detection probability is discussed. Under certain conditions, the noise-enhanced restricted NP problem is equivalent to a noise-enhanced classical NP problem with modified prior distribution. Furthermore, the corresponding theorems and algorithms are given to search the optimal additive noise in the restricted NP framework. In addition, the relationship between the optimal noise-enhanced average detection probability and the constraint on the minimum detection probability is explored. Finally, numerical examples and simulations are provided to illustrate the theoretical results. Full article
(This article belongs to the Special Issue Statistical Significance and the Logic of Hypothesis Testing)

Open Access Article Assessing the Exergy Costs of a 332-MW Pulverized Coal-Fired Boiler
Entropy 2016, 18(8), 300; doi:10.3390/e18080300
Received: 4 May 2016 / Revised: 8 August 2016 / Accepted: 9 August 2016 / Published: 15 August 2016
Cited by 1 | PDF Full-text (1262 KB) | HTML Full-text | XML Full-text
Abstract
In this paper, we analyze the exergy costs of a real large industrial boiler with the aim of improving efficiency. Specifically, the 350-MW front-fired, natural circulation, single reheat and balanced draft coal-fired boiler forms part of a 1050-MW conventional power plant located in Spain. We start with a diagram of the power plant, followed by a formulation of the exergy cost allocation problem to determine the exergy cost of the product of the boiler as a whole and the expenses of the individual components and energy streams. We also define a productive structure of the system. Furthermore, a proposal for including the exergy of radiation is provided in this study. Our results show that the unit exergy cost of the product of the boiler goes from 2.352 to 2.5, and that the maximum values are located in the ancillary electrical devices, such as induced-draft fans and coil heaters. Finally, radiation does not have an effect on the electricity cost, but affects at least 30% of the unit exergy cost of the boiler’s product. Full article
(This article belongs to the Special Issue Thermoeconomics for Energy Efficiency)

Open Access Article Exploring the Key Risk Factors for Application of Cloud Computing in Auditing
Entropy 2016, 18(8), 401; doi:10.3390/e18080401
Received: 21 April 2016 / Revised: 30 July 2016 / Accepted: 15 August 2016 / Published: 22 August 2016
PDF Full-text (1649 KB) | HTML Full-text | XML Full-text
Abstract
In the cloud computing information technology environment, cloud computing offers advantages such as lower cost, immediate access to hardware resources, lower IT barriers to innovation, and higher scalability. For financial audit information flowing and processed in the cloud, however, CPA (Certified Public Accountant) firms must give special consideration to issues such as system problems and information security. Auditing cloud computing applications is a future trend for CPA firms; because this issue is important to them and very few studies have investigated it, this study explores the key risk factors of cloud computing and the related audit considerations. The dimensions/perspectives of cloud computing audit considerations are broad and cover many criteria/factors, and these risk factors are becoming increasingly complex and interdependent. If the dimensions could be established, the mutually influential relations among the dimensions and criteria determined, and the current execution performance assessed, a prioritized improvement strategy could be constructed as a reference for CPA firm management decision making, as well as for building cloud computing auditing systems. Empirical results show that the key risk factors to consider when using cloud computing in auditing are, in order of priority for improvement: Operations (D), Automating user provisioning (C), Technology Risk (B), and Protection system (A). Full article

Open Access Article Thermal Analysis of Shell-and-Tube Thermoacoustic Heat Exchangers
Entropy 2016, 18(8), 301; doi:10.3390/e18080301
Received: 7 June 2016 / Revised: 2 August 2016 / Accepted: 8 August 2016 / Published: 16 August 2016
Cited by 1 | PDF Full-text (1852 KB) | HTML Full-text | XML Full-text
Abstract
Heat exchangers are of key importance in the overall performance and commercialization of thermoacoustic devices. The main goal in designing efficient thermoacoustic heat exchangers (TAHXs) is the achievement of the required heat transfer rate in conjunction with low acoustic energy dissipation. A numerical investigation is performed to examine the effects of geometry on both the viscous and thermal-relaxation losses of shell-and-tube TAHXs. Further, the impact of the drive ratio as well as the temperature difference between the oscillating gas and the TAHX tube wall on acoustic energy dissipation are explored. While viscous losses decrease with d_i/δ_κ, thermal-relaxation losses increase; however, thermal relaxation effects mainly determine the acoustic power dissipated in TAHXs. The results indicate the existence of an optimal configuration for which the acoustic energy dissipation is minimized, depending on both the TAHX metal temperature and the drive ratio. Full article
(This article belongs to the Section Thermodynamics)
Figures

Open Access Article Analytical Solutions of the Electrical RLC Circuit via Liouville–Caputo Operators with Local and Non-Local Kernels
Entropy 2016, 18(8), 402; doi:10.3390/e18080402
Received: 22 June 2016 / Revised: 9 August 2016 / Accepted: 17 August 2016 / Published: 20 August 2016
Cited by 4 | PDF Full-text (572 KB) | HTML Full-text | XML Full-text
Abstract
In this work we obtain analytical solutions for the electrical RLC circuit model defined with the Liouville–Caputo, Caputo–Fabrizio and the new fractional derivative based on the Mittag-Leffler function. Numerical simulations of the alternative models are presented to evaluate the effectiveness of these representations. Different source terms are considered in the fractional differential equations. The classical behaviors are recovered when the fractional order α is equal to 1. Full article
(This article belongs to the Special Issue Wavelets, Fractals and Information Theory II)
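For context, this is the classical series RLC equation such fractional models generalise, together with the Liouville–Caputo derivative that replaces the integer-order derivative. The dimensional auxiliary parameter σ is an assumption about how models of this kind are usually kept dimensionally consistent, not a detail taken from this abstract.

```latex
% Classical series RLC circuit (charge q, source voltage E):
L\,\frac{d^{2}q}{dt^{2}} + R\,\frac{dq}{dt} + \frac{q}{C} = E(t)

% Liouville–Caputo fractional derivative of order 0 < \alpha \le 1:
{}^{C}D^{\alpha}_{t}f(t) = \frac{1}{\Gamma(1-\alpha)}
  \int_{0}^{t} (t-\tau)^{-\alpha}\, f'(\tau)\, d\tau

% A common dimensionally consistent replacement (assumed, \sigma has units of time):
\frac{d}{dt} \;\rightarrow\; \frac{1}{\sigma^{1-\alpha}}\,{}^{C}D^{\alpha}_{t},
\qquad {}^{C}D^{1}_{t}f(t) = f'(t) \quad \text{(classical case at } \alpha = 1\text{)}
```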

Open Access Article Heat Transfer and Entropy Generation of Non-Newtonian Laminar Flow in Microchannels with Four Flow Control Structures
Entropy 2016, 18(8), 302; doi:10.3390/e18080302
Received: 28 June 2016 / Revised: 31 July 2016 / Accepted: 8 August 2016 / Published: 12 August 2016
PDF Full-text (4958 KB) | HTML Full-text | XML Full-text
Abstract
Flow characteristics and heat transfer performances of carboxymethyl cellulose (CMC) aqueous solutions in microchannels with flow control structures were investigated in this study. The investigations were carried out with various flow rates and concentrations of the CMC aqueous solutions. The results reveal that the pin-finned microchannel has the most uniform temperature distribution on the structured walls, and the average temperature on the structured wall reaches its minimum value in cylinder-ribbed microchannels at the same flow rate and CMC concentration. Moreover, the protruded microchannel obtains the minimum relative Fanning friction factor f/f0, while the maximum f/f0 is observed in the cylinder-ribbed microchannel. Furthermore, the minimum f/f0 is reached in the cases with CMC2000, and the relative Nusselt number Nu/Nu0 of the CMC2000 cases is larger than that of the other cases in all four structured microchannels. Therefore, 2000 ppm is the recommended concentration of CMC aqueous solutions for all the cases with different flow rates and flow control structures. Pin-finned microchannels are preferred in low flow rate cases, while V-grooved microchannels have the minimum relative entropy generation S'/S0 and the best thermal performance TP at CMC2000 in high flow rates. Full article
(This article belongs to the Special Issue Advances in Applied Thermodynamics II)

Open Access Article Interplay between Lattice Distortions, Vibrations and Phase Stability in NbMoTaW High Entropy Alloys
Entropy 2016, 18(8), 403; doi:10.3390/e18080403
Received: 26 July 2016 / Revised: 17 August 2016 / Accepted: 18 August 2016 / Published: 20 August 2016
Cited by 3 | PDF Full-text (1181 KB) | HTML Full-text | XML Full-text
Abstract
Refractory high entropy alloys (HEA), such as BCC NbMoTaW, represent a promising materials class for next-generation high-temperature applications, due to their extraordinary mechanical properties. A characteristic feature of HEAs is the formation of single-phase solid solutions. For BCC NbMoTaW, recent computational studies revealed, however, a B2(Mo,W;Nb,Ta)-ordering at ambient temperature. This ordering could impact many materials properties, such as thermodynamic, mechanical, or diffusion properties, and hence be of relevance for practical applications. In this work, we theoretically address how the B2-ordering impacts thermodynamic properties of BCC NbMoTaW and how the predicted ordering temperature itself is affected by vibrations, electronic excitations, lattice distortions, and relaxation energies. Full article
(This article belongs to the Special Issue High-Entropy Alloys and High-Entropy-Related Materials)

Open Access Article A Geographically Temporal Weighted Regression Approach with Travel Distance for House Price Estimation
Entropy 2016, 18(8), 303; doi:10.3390/e18080303
Received: 12 May 2016 / Revised: 10 August 2016 / Accepted: 10 August 2016 / Published: 16 August 2016
Cited by 1 | PDF Full-text (1217 KB) | HTML Full-text | XML Full-text
Abstract
Previous studies have demonstrated that non-Euclidean distance metrics can improve model fit in the geographically weighted regression (GWR) model. However, the GWR model often considers spatial nonstationarity and does not address variations in local temporal issues. Therefore, this paper explores a geographically temporal weighted regression (GTWR) approach that accounts for both spatial and temporal nonstationarity simultaneously to estimate house prices based on travel time distance metrics. Using house price data collected between 1980 and 2016, the house price response and explanatory variables are then modeled using both the GWR and the GTWR approaches. Compared with the GWR model under both Euclidean and travel distance metrics, the GTWR model with travel distance obtains the highest value for the coefficient of determination (R²) and the lowest value for the Akaike information criterion (AIC). The results show that the GTWR model provides a relatively high goodness of fit and sufficient space-time explanatory power with non-Euclidean distance metrics. The results of this study can be used to formulate more effective policies for real estate management. Full article
(This article belongs to the Special Issue Applications of Information Theory in the Geosciences)
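As background, a minimal sketch of the space-time kernel-weighting idea behind GTWR. The Gaussian kernel, the function names, and the balance parameter `lam` are illustrative assumptions, not the paper's formulation; in the paper the spatial distance would be travel time rather than Euclidean distance.

```python
import numpy as np

def gtwr_weights(d_space, d_time, h_s, h_t, lam=1.0):
    """Gaussian space-time kernel: each observation is down-weighted by
    its (travel) distance and its temporal distance from the regression
    point; lam trades off the spatial and temporal scales."""
    return np.exp(-(d_space**2 / h_s**2 + lam * d_time**2 / h_t**2))

def local_fit(X, y, w):
    """Locally weighted least squares at one regression point:
    solve (X' W X) b = X' W y for the local coefficients b."""
    Xw = X * w[:, None]                      # rows of X scaled by the weights
    return np.linalg.solve(X.T @ Xw, Xw.T @ y)
```

Fitting `local_fit` at every observation, each with its own weight vector, yields the spatially and temporally varying coefficient surfaces that GWR/GTWR report.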

Open Access Article An Efficient Method to Construct Parity-Check Matrices for Recursively Encoding Spatially Coupled LDPC Codes †
Entropy 2016, 18(8), 305; doi:10.3390/e18080305
Received: 15 June 2016 / Revised: 23 July 2016 / Accepted: 15 August 2016 / Published: 17 August 2016
PDF Full-text (439 KB) | HTML Full-text | XML Full-text
Abstract
Spatially coupled low-density parity-check (LDPC) codes have attracted considerable attention due to their promising performance. Recursive encoding of the codes with low delay and low complexity has been proposed in the literature but with constraints or restrictions. In this manuscript we propose an efficient method to construct parity-check matrices for recursively encoding spatially coupled LDPC codes with arbitrarily chosen node degrees. A general principle is proposed, which provides feasible and practical guidance for the construction of parity-check matrices. According to the specific structure of the matrix, each parity bit at a coupling position is jointly determined by the information bits at the current position and the encoded bits at former positions. Performance analysis in terms of design rate and density evolution has been presented. It can be observed that, in addition to the feature of recursive encoding, selected code structures constructed by the newly proposed method may lead to better belief-propagation thresholds than the conventional structures. Finite-length simulation results are provided as well, which verify the theoretical analysis. Full article
(This article belongs to the Section Information Theory)

Open Access Article Contact-Free Detection of Obstructive Sleep Apnea Based on Wavelet Information Entropy Spectrum Using Bio-Radar
Entropy 2016, 18(8), 306; doi:10.3390/e18080306
Received: 14 June 2016 / Revised: 6 August 2016 / Accepted: 15 August 2016 / Published: 18 August 2016
Cited by 3 | PDF Full-text (2206 KB) | HTML Full-text | XML Full-text
Abstract
Judgment and early danger warning of obstructive sleep apnea (OSA) are meaningful to the diagnosis of sleep illness. This paper proposes a novel method based on the wavelet information entropy spectrum to make an apnea judgment, in the wavelet domain, of the OSA respiratory signal detected by bio-radar. It makes full use of the strong irregularity and disorder of the respiratory signal that result from brain stimulation by the real, low airflow during apnea. The experimental results demonstrate that the proposed method is effective for detecting the occurrence of sleep apnea and is also able to detect some apnea cases that the energy spectrum method cannot. Ultimately, the comprehensive judgment accuracy over 10 groups of OSA data is 93.1%, which is promising for the non-contact aided diagnosis of OSA. Full article
(This article belongs to the Special Issue Entropy on Biosignals and Intelligent Systems)

Open Access Article Weighted-Permutation Entropy Analysis of Resting State EEG from Diabetics with Amnestic Mild Cognitive Impairment
Entropy 2016, 18(8), 307; doi:10.3390/e18080307
Received: 24 May 2016 / Revised: 6 July 2016 / Accepted: 8 August 2016 / Published: 22 August 2016
PDF Full-text (1274 KB) | HTML Full-text | XML Full-text
Abstract
Diabetes is a significant public health issue as it increases the risk for dementia and Alzheimer’s disease (AD). In this study, we aim to investigate whether weighted-permutation entropy (WPE) and permutation entropy (PE) of resting-state EEG (rsEEG) could be applied as potential objective biomarkers to distinguish type 2 diabetes patients with amnestic mild cognitive impairment (aMCI) from those with normal cognitive function. rsEEG series were acquired from 28 patients with type 2 diabetes (16 aMCI patients and 12 controls), and neuropsychological assessments were performed. The rsEEG signals were analysed using WPE and PE methods. The correlations between the PE or WPE of the rsEEG and the neuropsychological assessments were analysed as well. The WPE in the right temporal (RT) region of the aMCI diabetics was lower than the controls, and the WPE was significantly positively correlated to the scores of the Auditory Verbal Learning Test (AVLT) (AVLT-Immediate recall, AVLT-Delayed recall, AVLT-Delayed recognition) and the Wechsler Adult Intelligence Scale Digit Span Test (WAIS-DST). These findings were not obtained with PE. We concluded that the WPE of rsEEG recordings could distinguish aMCI diabetics from normal cognitive function diabetic controls among the current sample of diabetic patients. Thus, the WPE could be a potential index for assisting diagnosis of aMCI in type 2 diabetes. Full article
(This article belongs to the Special Issue Entropy and Electroencephalography II)
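The abstract contrasts weighted-permutation entropy (WPE) with plain permutation entropy (PE). As background, a minimal sketch of the variance-weighted variant; the function name, parameter defaults, and normalisation by log(m!) are illustrative choices, not taken from this paper.

```python
import math
from collections import defaultdict

import numpy as np

def weighted_permutation_entropy(x, m=3, tau=1):
    """Weighted-permutation entropy: count ordinal patterns of embedding
    dimension m and delay tau, weight each occurrence by the variance of
    its window, and normalise the Shannon entropy to [0, 1] by log(m!)."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (m - 1) * tau
    weights = defaultdict(float)
    for i in range(n):
        window = x[i:i + (m - 1) * tau + 1:tau]
        pattern = tuple(np.argsort(window))   # rank order of the window
        weights[pattern] += np.var(window)    # variance weighting (WPE vs. PE)
    w = np.array(list(weights.values()))
    p = w / w.sum()                           # weighted pattern probabilities
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)) / math.log(math.factorial(m)))
```

Dropping the variance weight (counting each pattern once per occurrence) recovers ordinary permutation entropy, which is the comparison the study makes.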

Open Access Article Soft Magnetic Properties of High-Entropy Fe-Co-Ni-Cr-Al-Si Thin Films
Entropy 2016, 18(8), 308; doi:10.3390/e18080308
Received: 20 July 2016 / Revised: 6 August 2016 / Accepted: 16 August 2016 / Published: 18 August 2016
PDF Full-text (1758 KB) | HTML Full-text | XML Full-text
Abstract
Soft magnetic properties of Fe-Co-Ni-Al-Cr-Si thin films were studied. As-deposited Fe-Co-Ni-Al-Cr-Si nano-grained thin films showing no magnetic anisotropy were subjected to field-annealing at different temperatures to induce magnetic anisotropy. The optimized magnetic and electrical properties of Fe-Co-Ni-Al-Cr-Si films annealed at 200 °C are a saturation magnetization of 9.13 × 10⁵ A/m, a coercivity of 79.6 A/m, an out-of-plane uniaxial anisotropy field of 1.59 × 10³ A/m, and an electrical resistivity of 3.75 μΩ·m. Based on these excellent properties, we employed such films to fabricate a magnetic thin-film inductor. The performance of the high-entropy-alloy thin-film inductors is superior to that of an air-core inductor. Full article
(This article belongs to the Special Issue High-Entropy Alloys and High-Entropy-Related Materials)

Open Access Article Potential of Entropic Force in Markov Systems with Nonequilibrium Steady State, Generalized Gibbs Function and Criticality
Entropy 2016, 18(8), 309; doi:10.3390/e18080309
Received: 3 May 2016 / Revised: 6 August 2016 / Accepted: 15 August 2016 / Published: 18 August 2016
PDF Full-text (797 KB) | HTML Full-text | XML Full-text
Abstract
In this paper, we revisit the notion of the “minus logarithm of stationary probability” as a generalized potential in nonequilibrium systems and attempt to illustrate its central role in an axiomatic approach to stochastic nonequilibrium thermodynamics of complex systems. It is demonstrated that this quantity arises naturally through both monotonicity results of Markov processes and as the rate function when a stochastic process approaches a deterministic limit. We then undertake a more detailed mathematical analysis of the consequences of this quantity, culminating in a necessary and sufficient condition for the criticality of stochastic systems. This condition is then discussed in the context of recent results about criticality in biological systems. Full article
(This article belongs to the Special Issue Information and Self-Organization)
Open Access Article Distribution Entropy Boosted VLAD for Image Retrieval
Entropy 2016, 18(8), 311; doi:10.3390/e18080311
Received: 26 February 2016 / Revised: 12 July 2016 / Accepted: 16 August 2016 / Published: 24 August 2016
PDF Full-text (1057 KB) | HTML Full-text | XML Full-text
Abstract
Several recent works have shown that aggregating local descriptors to generate a global image representation results in great efficiency for retrieval and classification tasks. The most popular method following this approach is VLAD (Vector of Locally Aggregated Descriptors). We present a novel image representation called Distribution Entropy Boosted VLAD (EVLAD), which extends the original vector of locally aggregated descriptors. The original VLAD adopts only residuals to depict the distribution information of every visual word and neglects other statistical clues, so its discriminative power is limited. To address this issue, this paper proposes the use of the distribution entropy of each cluster as supplementary information to enhance the search accuracy. To fuse the two feature sources organically, two fusion methods following a new power-law normalization stage are also investigated, which generate vectors of the same size and of double the size of the original VLAD. We validate our approach in image retrieval and image classification experiments. Experimental results demonstrate the effectiveness of our algorithm. Full article
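For readers unfamiliar with the baseline being extended here, a minimal sketch of standard VLAD aggregation (hard assignment, residual sums, signed-square-root and L2 normalisation). The EVLAD extension described in the abstract would additionally append a per-cluster distribution entropy, which this sketch does not attempt.

```python
import numpy as np

def vlad(descriptors, centers):
    """Baseline VLAD: hard-assign each local descriptor to its nearest
    visual word, sum the residuals per word, then apply signed-square-root
    (power-law) and global L2 normalisation."""
    k, d = centers.shape
    # distance from every descriptor to every visual word
    dists = np.linalg.norm(descriptors[:, None, :] - centers[None, :, :], axis=2)
    assign = dists.argmin(axis=1)
    v = np.zeros((k, d))
    for j in range(k):
        members = descriptors[assign == j]
        if len(members):
            v[j] = (members - centers[j]).sum(axis=0)   # residual sum per word
    v = v.ravel()
    v = np.sign(v) * np.sqrt(np.abs(v))                 # power-law normalisation
    norm = np.linalg.norm(v)
    return v / norm if norm > 0 else v
```

The output is a fixed-length k·d vector regardless of how many local descriptors the image produced, which is what makes the representation convenient for retrieval.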

Open Access Article Traceability Analyses between Features and Assets in Software Product Lines
Entropy 2016, 18(8), 269; doi:10.3390/e18080269
Received: 11 February 2016 / Revised: 22 June 2016 / Accepted: 4 July 2016 / Published: 3 August 2016
PDF Full-text (1258 KB) | HTML Full-text | XML Full-text
Abstract
In a Software Product Line (SPL), the central notion of implementability provides the requisite connection between specifications and their implementations, leading to the definition of products. While it appears to be a simple extension of the traceability relation between components and features, it involves several subtle issues that were overlooked in the existing literature. In this paper, we have introduced a precise and formal definition of implementability over a fairly expressive traceability relation. The consequent definition of products in the given SPL naturally entails a set of useful analysis problems that are either refinements of known problems or are completely novel. We also propose a new approach to solve these analysis problems by encoding them as Quantified Boolean Formulae (QBF) and solving them through Quantified Satisfiability (QSAT) solvers. QBF can represent more complex analysis operations, which cannot be represented by using propositional formulae. The methodology scales much better than the SAT-based solutions hinted at in the literature, as demonstrated through a tool called SPLAnE (SPL Analysis Engine) on a large set of SPL models. Full article
(This article belongs to the Section Information Theory)

Open Access Article A Novel Image Encryption Scheme Using the Composite Discrete Chaotic System
Entropy 2016, 18(8), 276; doi:10.3390/e18080276
Received: 21 April 2016 / Revised: 17 July 2016 / Accepted: 21 July 2016 / Published: 1 August 2016
Cited by 2 | PDF Full-text (9806 KB) | HTML Full-text | XML Full-text
Abstract
The composite discrete chaotic system (CDCS) is a complex chaotic system that combines two or more discrete chaotic systems. This system holds the chaotic characteristics of different chaotic systems in a random way and has more complex chaotic behaviors. In this paper, we aim to provide a novel image encryption algorithm based on a new two-dimensional (2D) CDCS. The proposed scheme consists of two parts: firstly, we propose a new 2D CDCS and analyze its chaotic behaviors; then, we introduce the bit-level permutation and pixel-level diffusion encryption architecture with the new CDCS to form the full proposed algorithm. Random values and the total information of the plain image are added into the diffusion procedure to enhance the security of the proposed algorithm. Both the theoretical analysis and simulations confirm the security of the proposed algorithm. Full article
(This article belongs to the Section Information Theory)

Open Access Article A Proximal Point Algorithm for Minimum Divergence Estimators with Application to Mixture Models
Entropy 2016, 18(8), 277; doi:10.3390/e18080277
Received: 11 June 2016 / Revised: 20 July 2016 / Accepted: 21 July 2016 / Published: 27 July 2016
PDF Full-text (347 KB) | HTML Full-text | XML Full-text
Abstract
Estimators derived from a divergence criterion such as φ-divergences are generally more robust than the maximum likelihood ones. We are interested in particular in the so-called minimum dual φ-divergence estimator (MDφDE), an estimator built using a dual representation of φ-divergences. We present in this paper an iterative proximal point algorithm that permits the calculation of such an estimator. The algorithm contains by construction the well-known Expectation Maximization (EM) algorithm. Our work is based on the paper of Tseng on the likelihood function. We provide some convergence properties by adapting the ideas of Tseng. We improve Tseng's results by relaxing the identifiability condition on the proximal term, a condition which is not verified for most mixture models and is hard to verify for "non-mixture" ones. Convergence of the EM algorithm in a two-component Gaussian mixture is discussed in the spirit of our approach. Several experimental results on mixture models are provided to confirm the validity of the approach. Full article
(This article belongs to the Special Issue Differential Geometrical Theory of Statistics)
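The abstract notes that the proximal point algorithm contains EM as a special case. As background for the two-component Gaussian mixture it discusses, here is a plain EM sketch; the initialisation and fixed iteration count are arbitrary choices for the illustration, not the paper's algorithm.

```python
import numpy as np

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) evaluated elementwise."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def em_two_gaussians(x, iters=200):
    """EM for a two-component 1D Gaussian mixture: alternate posterior
    responsibilities (E-step) and weighted ML updates (M-step)."""
    w, mu1, mu2 = 0.5, x.min(), x.max()        # crude but separating init
    s1 = s2 = x.std()
    for _ in range(iters):
        # E-step: responsibility of component 1 for each point
        p1 = w * normal_pdf(x, mu1, s1)
        p2 = (1 - w) * normal_pdf(x, mu2, s2)
        r = p1 / (p1 + p2)
        # M-step: weighted maximum-likelihood updates
        w = r.mean()
        mu1 = np.sum(r * x) / r.sum()
        mu2 = np.sum((1 - r) * x) / (1 - r).sum()
        s1 = np.sqrt(np.sum(r * (x - mu1) ** 2) / r.sum())
        s2 = np.sqrt(np.sum((1 - r) * (x - mu2) ** 2) / (1 - r).sum())
    return w, mu1, mu2, s1, s2
```

The proximal point view adds a penalty term to the M-step objective; with the conditional-entropy proximal term the iteration above is recovered unchanged.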

Open Access Article Expected Logarithm of Central Quadratic Form and Its Use in KL-Divergence of Some Distributions
Entropy 2016, 18(8), 278; doi:10.3390/e18080278
Received: 10 May 2016 / Revised: 13 July 2016 / Accepted: 21 July 2016 / Published: 28 July 2016
PDF Full-text (477 KB) | HTML Full-text | XML Full-text
Abstract
In this paper, we develop three different methods for computing the expected logarithm of central quadratic forms: a series method, an integral method and a fast (but inexact) set of methods. The approach used for deriving the integral method is novel and can be used for computing the expected logarithm of other random variables. Furthermore, we derive expressions for the Kullback–Leibler (KL) divergence of elliptical gamma distributions and angular central Gaussian distributions, which turn out to be functions dependent on the expected logarithm of a central quadratic form. Through several experimental studies, we compare the performance of these methods. Full article
(This article belongs to the Section Information Theory)

Open AccessArticle Entropy Generation through Non-Equilibrium Ordered Structures in Corner Flows with Sidewall Mass Injection
Entropy 2016, 18(8), 279; doi:10.3390/e18080279
Received: 22 June 2016 / Revised: 24 July 2016 / Accepted: 25 July 2016 / Published: 28 July 2016
PDF Full-text (3798 KB) | HTML Full-text | XML Full-text
Abstract
Additional entropy generation rates through non-equilibrium ordered structures are predicted for corner flows with sidewall mass injection. Well-defined non-equilibrium ordered structures are predicted at a normalized vertical station of approximately eighteen percent of the boundary-layer thickness. These structures are in addition to the ordered structures previously reported at approximately thirty-eight percent of the boundary-layer thickness. The computational procedure is used to determine the entropy generation rate for each spectral velocity component at each of several streamwise stations and for each of several injection velocity values. Application of the procedure to possible thermal system processes is discussed. These results indicate that cooling sidewall mass injection into a horizontal laminar boundary layer may actually increase the heat transfer to the horizontal surface. Full article
(This article belongs to the Special Issue Advances in Applied Thermodynamics II)

Open AccessArticle Acoustic Entropy of the Materials in the Course of Degradation
Entropy 2016, 18(8), 280; doi:10.3390/e18080280
Received: 1 July 2016 / Revised: 21 July 2016 / Accepted: 25 July 2016 / Published: 28 July 2016
Cited by 1 | PDF Full-text (3240 KB) | HTML Full-text | XML Full-text
Abstract
We report experimental observations on the evolution of acoustic entropy over the course of cyclic loading as fatigue degradation occurs. The measured entropy results from the microstructural changes that the material undergoes under cyclic mechanical loading. Experimental results demonstrate that the maximum acoustic entropy emitted over the course of degradation is similar across materials. Experiments are shown for two different types of materials: Aluminum 6061 (a metallic alloy) and glass/epoxy (a composite laminate). In both cases, the evolution of the acoustic entropy exhibits a persistent trend over the course of degradation. Full article

Open AccessArticle Acoustic Detection of Coronary Occlusions before and after Stent Placement Using an Electronic Stethoscope
Entropy 2016, 18(8), 281; doi:10.3390/e18080281
Received: 27 April 2016 / Revised: 18 July 2016 / Accepted: 23 July 2016 / Published: 29 July 2016
PDF Full-text (1752 KB) | HTML Full-text | XML Full-text
Abstract
More than 370,000 Americans die every year from coronary artery disease (CAD). Early detection and treatment are crucial to reducing this number. Current diagnostic and disease-monitoring methods are invasive, costly, and time-consuming. Using an electronic stethoscope and spectral and nonlinear dynamics analysis of the recorded heart sound, we investigated the acoustic signature of CAD in subjects with only a single coronary occlusion before and after stent placement, as well as subjects with clinically normal coronary arteries. The CAD signature was evaluated by estimating power ratios of the total power above 150 Hz over the total power below 150 Hz of the FFT of the acoustic signal. Additionally, approximate entropy values were estimated to assess the differences induced by the stent placement procedure to the acoustic signature of the signals in the time domain. The groups were identified with this method with 82% sensitivity and 64% specificity (using the power ratio method) and 82% sensitivity and 55% specificity (using the approximate entropy). Power ratios and approximate entropy values after stent placement are not statistically different from those estimated from subjects with no coronary occlusions. Our approach demonstrates that the effect of stent placement on coronary occlusions can be monitored using an electronic stethoscope. Full article
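The power-ratio feature above can be sketched in a few lines. This is a minimal illustration, not the authors' pipeline: the naive DFT, the synthetic two-tone "heart sound", and the sampling parameters are all assumptions of ours.

```python
import cmath
import math

def power_ratio(signal, fs, cutoff=150.0):
    """Spectral power above `cutoff` Hz divided by power below it (naive DFT)."""
    n = len(signal)
    p_hi = p_lo = 0.0
    for k in range(1, n // 2):                 # positive frequencies, skip DC
        xk = sum(x * cmath.exp(-2j * math.pi * k * i / n)
                 for i, x in enumerate(signal))
        if k * fs / n > cutoff:
            p_hi += abs(xk) ** 2
        else:
            p_lo += abs(xk) ** 2
    return p_hi / p_lo

# Synthetic signal: a 50 Hz component plus a weaker 300 Hz component,
# both landing on exact DFT bins (fs/n = 5 Hz) to avoid leakage.
fs, n = 1200.0, 240
sig = [math.sin(2 * math.pi * 50 * i / fs) +
       0.5 * math.sin(2 * math.pi * 300 * i / fs) for i in range(n)]
print(power_ratio(sig, fs))   # ~0.25: high-band amplitude is half the low-band
```

In practice one would use an FFT routine and a windowed estimate rather than this O(n²) DFT; the point is only the above/below-150 Hz power split.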

Open AccessArticle How Is a Data-Driven Approach Better than Random Choice in Label Space Division for Multi-Label Classification?
Entropy 2016, 18(8), 282; doi:10.3390/e18080282
Received: 1 February 2016 / Revised: 12 July 2016 / Accepted: 19 July 2016 / Published: 30 July 2016
PDF Full-text (2969 KB) | HTML Full-text | XML Full-text
Abstract
We propose using five data-driven community detection approaches from social networks to partition the label space in the task of multi-label classification as an alternative to random partitioning into equal subsets as performed by RAkELd. We evaluate modularity-maximizing using fast greedy and leading eigenvector approximations, infomap, walktrap and label propagation algorithms. For this purpose, we propose to construct a label co-occurrence graph (both weighted and unweighted versions) based on training data and perform community detection to partition the label set. Then, each partition constitutes a label space for separate multi-label classification sub-problems. As a result, we obtain an ensemble of multi-label classifiers that jointly covers the whole label space. Based on the binary relevance and label powerset classification methods, we compare community detection methods for label space division against random baselines on 12 benchmark datasets over five evaluation measures. We discover that data-driven approaches are more efficient and more likely to outperform RAkELd than binary relevance or label powerset is, in every evaluated measure. For all measures apart from Hamming loss, data-driven approaches are significantly better than RAkELd (α = 0.05), and at least one data-driven approach is more likely to outperform RAkELd than a priori methods in the case of RAkELd's best performance. This is the largest RAkELd evaluation published to date, with 250 samplings per value for 10 values of the RAkELd parameter k on 12 datasets. Full article
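The label co-occurrence graph that the community detection algorithms operate on can be sketched as below. The label sets, data, and function name are hypothetical, and the subsequent community detection step (e.g., walktrap or infomap on this edge list) is not shown.

```python
from collections import Counter
from itertools import combinations

def label_cooccurrence_edges(training_labels):
    """Weighted edges of the label co-occurrence graph: the weight of
    edge (a, b) counts training examples labeled with both a and b."""
    edges = Counter()
    for labels in training_labels:
        for a, b in combinations(sorted(set(labels)), 2):
            edges[(a, b)] += 1
    return edges

# Hypothetical multi-label training data (one label set per example).
data = [{"news", "sports"}, {"news", "sports", "tv"}, {"music", "tv"}]
edges = label_cooccurrence_edges(data)
print(edges[("news", "sports")])   # 2: the pair co-occurs in two examples
```

Dropping the weights (treating every observed pair as weight 1) gives the unweighted variant mentioned in the abstract.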

Open AccessArticle A Critical Reassessment of the Hess–Murray Law
Entropy 2016, 18(8), 283; doi:10.3390/e18080283
Received: 10 March 2016 / Revised: 11 July 2016 / Accepted: 25 July 2016 / Published: 5 August 2016
PDF Full-text (4767 KB) | HTML Full-text | XML Full-text
Abstract
The Hess–Murray law is a correlation between the radii of successive branchings in bi/trifurcated vessels in biological tissues. First proposed by the Swiss physiologist and Nobel laureate Walter Rudolf Hess in his 1914 doctoral thesis and published in 1917, the law was “rediscovered” by the American physiologist Cecil Dunmore Murray in 1926. The law is based on the assumption that blood or lymph circulation in living organisms is governed by a “work minimization” principle that—under a certain set of specified conditions—leads to an “optimal branching ratio” of r_{i+1}/r_i = 1/∛2 ≈ 0.7937. This “cubic root of 2” correlation underwent extensive theoretical and experimental reassessment in the second half of the 20th century, and the results indicate that—under a well-defined series of conditions—the law is sufficiently accurate for the smallest vessels (r of the order of fractions of a millimeter) but fails for the larger ones; moreover, it cannot be successfully extended to turbulent flows. Recent comparisons with numerical investigations of branched flows led to similar conclusions. More recently, the Hess–Murray law came back into the limelight when it was taken as a founding paradigm of the Constructal Law, a theory that employs physical intuition and mathematical reasoning to derive “optimal paths” for the transport of matter and energy between a source and a sink, regardless of the mode of transportation (continuous, like in convection and conduction, or discrete, like in the transportation of goods and people). This paper examines the foundation of the law and argues that both for natural flows and for engineering designs, a minimization of the irreversibility under physically sound boundary conditions leads to somewhat different results. It is also shown that, in the light of an exergy-based resource analysis, an amended version of the Hess–Murray law may still hold an important position in engineering and biological sciences. Full article
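The "cubic root of 2" ratio follows from two ingredients: Murray's work-minimization result that flow rate scales as Q ∝ r³, and conservation of flow at a symmetric bifurcation. A small numeric check under exactly those two assumptions:

```python
# Murray's "work minimization" gives flow Q proportional to r^3; at a
# symmetric bifurcation the parent flow splits in two, so
#   r_parent^3 = 2 * r_daughter^3   =>   r_daughter / r_parent = 2**(-1/3).
ratio = 2.0 ** (-1.0 / 3.0)
parent_r = 1.0
daughter_r = parent_r * ratio
print(round(ratio, 4))                                     # 0.7937
assert abs(parent_r ** 3 - 2.0 * daughter_r ** 3) < 1e-12  # flow conserved
```

The paper's point is that replacing the Q ∝ r³ assumption with irreversibility minimization under physically sound boundary conditions shifts this ratio.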
(This article belongs to the Special Issue Advances in Applied Thermodynamics II)

Open AccessArticle A Five Species Cyclically Dominant Evolutionary Game with Fixed Direction: A New Way to Produce Self-Organized Spatial Patterns
Entropy 2016, 18(8), 284; doi:10.3390/e18080284
Received: 21 April 2016 / Revised: 25 July 2016 / Accepted: 1 August 2016 / Published: 8 August 2016
PDF Full-text (2091 KB) | HTML Full-text | XML Full-text
Abstract
Cyclically dominant systems are an active research topic, and they play an important role in explaining biodiversity in Nature. In this paper, we construct a five-strategy cyclically dominant system. Each individual in our system changes its strategy along a fixed direction. The dominant strategy can promote a change in the dominated strategy, and the dominated strategy can block a change in the dominant strategy. We use mean-field theory and cellular automaton simulation to discuss the evolving characteristics of the system. In the cellular automaton simulation, we find the emergence of spiral waves on spatial patterns without a migration rate, which suggests a new way to produce self-organized spatial patterns. Full article
(This article belongs to the Section Statistical Mechanics)

Open AccessArticle ECG Classification Using Wavelet Packet Entropy and Random Forests
Entropy 2016, 18(8), 285; doi:10.3390/e18080285
Received: 20 June 2016 / Revised: 25 July 2016 / Accepted: 2 August 2016 / Published: 5 August 2016
PDF Full-text (800 KB) | HTML Full-text | XML Full-text
Abstract
The electrocardiogram (ECG) is one of the most important techniques for heart disease diagnosis. Many traditional methodologies of feature extraction and classification have been widely applied to ECG analysis. However, the effectiveness and efficiency of such methodologies remain to be improved, and much existing research did not consider the separation of training and testing samples from the same set of patients (the so-called inter-patient scheme). To cope with these issues, in this paper, we propose a method to classify ECG signals using wavelet packet entropy (WPE) and random forests (RF) following the Association for the Advancement of Medical Instrumentation (AAMI) recommendations and the inter-patient scheme. Specifically, we first decompose the ECG signals by wavelet packet decomposition (WPD), then calculate entropy from the decomposed coefficients as representative features, and finally use RF to build an ECG classification model. To the best of our knowledge, this is the first time that WPE and RF have been used to classify ECG following the AAMI recommendations and the inter-patient scheme. Extensive experiments are conducted on the publicly available MIT–BIH Arrhythmia database, and the influence on performance of the mother wavelet and level of decomposition for WPD, the type of entropy, and the number of base learners in RF is also discussed. The experimental results are superior to those of several state-of-the-art competing methods, showing that WPE and RF are promising for ECG classification. Full article
(This article belongs to the Special Issue Entropy on Biosignals and Intelligent Systems)

Open AccessArticle Parametric Analysis of the Exergoeconomic Operation Costs, Environmental and Human Toxicity Indexes of the MF501F3 Gas Turbine
Entropy 2016, 18(8), 286; doi:10.3390/e18080286
Received: 1 June 2016 / Revised: 25 July 2016 / Accepted: 2 August 2016 / Published: 6 August 2016
PDF Full-text (3430 KB) | HTML Full-text | XML Full-text
Abstract
This work presents an energetic, exergoeconomic, environmental, and toxicity analysis of the simple gas turbine M501F3 based on a parametric analysis of energetic (thermal efficiency, fuel and air flow rates, and specific work output), exergoeconomic (exergetic efficiency and exergoeconomic operation costs), environmental (global warming, smog formation, acid rain indexes), and human toxicity indexes, by taking the compressor pressure ratio and the turbine inlet temperature as the operating parameters. The aim of this paper is to provide an integral, systematic, and powerful diagnostic tool to establish possible operation and maintenance actions to improve the gas turbine’s exergoeconomic, environmental, and human toxicity indexes. Despite the continuous changes in the price of natural gas, the compressor, combustion chamber, and turbine always contribute 18.96%, 53.02%, and 28%, respectively, to the gas turbine’s exergoeconomic operation costs. The application of this methodology can be extended to other simple gas turbines using the pressure drops and isentropic efficiencies, among others, as the degradation parameters, as well as to other energetic systems, without loss of generality. Full article
(This article belongs to the Section Thermodynamics)

Open AccessArticle Hawking-Like Radiation from the Trapping Horizon of Both Homogeneous and Inhomogeneous Spherically Symmetric Spacetime Model of the Universe
Entropy 2016, 18(8), 287; doi:10.3390/e18080287
Received: 7 June 2016 / Revised: 21 July 2016 / Accepted: 28 July 2016 / Published: 8 August 2016
Cited by 2 | PDF Full-text (273 KB) | HTML Full-text | XML Full-text
Abstract
The present work deals with the semi-classical tunnelling approach and the Hamilton–Jacobi method to study Hawking radiation from the dynamical horizon of both the homogeneous Friedmann–Robertson–Walker (FRW) model and the inhomogeneous Lemaitre–Tolman–Bondi (LTB) model of the Universe. In the tunnelling prescription, radial null geodesics are used to visualize particles emerging from behind the trapping horizon, and the Hawking-like temperature has been calculated. In the Hamilton–Jacobi formulation, on the other hand, quantum corrections have been incorporated by solving the Klein–Gordon wave equation. In both approaches, the temperatures agree at the semiclassical level. Full article
(This article belongs to the Special Issue Entropy in Quantum Systems and Quantum Field Theory (QFT))
Open AccessArticle Microstructures of Al7.5Cr22.5Fe35Mn20Ni15 High-Entropy Alloy and Its Polarization Behaviors in Sulfuric Acid, Nitric Acid and Hydrochloric Acid Solutions
Entropy 2016, 18(8), 288; doi:10.3390/e18080288
Received: 15 June 2016 / Revised: 26 July 2016 / Accepted: 1 August 2016 / Published: 8 August 2016
Cited by 1 | PDF Full-text (17050 KB) | HTML Full-text | XML Full-text
Abstract
This paper investigates the microstructures and the polarization behaviors of Al7.5Cr22.5Fe35Mn20Ni15 high-entropy alloy in 1 M (1 mol/L) deaerated sulfuric acid (H2SO4), nitric acid (HNO3), and hydrochloric acid (HCl) solutions at temperatures of 30–60 °C. The three phases of the Al7.5Cr22.5Fe35Mn20Ni15 high-entropy alloy are body-centered cubic (BCC) dendrites, face-centered cubic (FCC) interdendrites, and ordered BCC precipitates uniformly dispersed in the BCC dendrites. The different phases corroded differently in the different acidic solutions. The passivation regions of the Al7.5Cr22.5Fe35Mn20Ni15 alloy are divided into three and two sub-regions in the H2SO4 and HNO3 solutions at 30–60 °C, respectively. The passivation region of the Al7.5Cr22.5Fe35Mn20Ni15 alloy is also divided into two sub-domains in 1 M deaerated HCl solution at 30 °C. The Al7.5Cr22.5Fe35Mn20Ni15 alloy has corrosion resistance almost equal to that of 304 stainless steel (304SS) in both the 1 M H2SO4 and 1 M HCl solutions. The polarization behaviors indicated that the Al7.5Cr22.5Fe35Mn20Ni15 alloy possessed much better corrosion resistance than 304SS in 1 M HNO3 solution. However, in 1 M NaCl solution, the corrosion resistance of the Al7.5Cr22.5Fe35Mn20Ni15 alloy was inferior to that of 304SS. Full article
(This article belongs to the Special Issue High-Entropy Alloys and High-Entropy-Related Materials)

Open AccessArticle Control of Self-Organized Criticality through Adaptive Behavior of Nano-Structured Thin Film Coatings
Entropy 2016, 18(8), 290; doi:10.3390/e18080290
Received: 21 June 2016 / Revised: 21 July 2016 / Accepted: 2 August 2016 / Published: 9 August 2016
Cited by 1 | PDF Full-text (4061 KB) | HTML Full-text | XML Full-text
Abstract
In this paper, we develop a strategy for controlling the self-organized critical process, using the example of the extreme tribological conditions caused by intensive build-up edge (BUE) formation that takes place during machining of hard-to-cut austenitic superduplex stainless steel SDSS UNS32750. From a tribological viewpoint, machining of this material involves intensive seizure and build-up edge formation at the tool/chip interface, which can result in catastrophic tool failure. Build-up edge formation is considered to be a very damaging process in the system. The periodic breakage of the build-ups may eventually result in tool tip breakage and thereby lead to a catastrophe (complete loss of workability) in the system. The dynamic process of build-up edge formation is similar to an avalanche: it is governed by the stick-slip phenomenon during friction and is associated with the self-organized critical process. Investigation of wear patterns on the frictional surfaces of cutting tools using a Scanning Electron Microscope (SEM), combined with chip undersurface characterization and frictional (cutting) force analyses, confirms this hypothesis. The control of self-organized criticality is accomplished through the application of a nano-multilayer TiAl60CrSiYN/TiAlCrN thin film Physical Vapor Deposition (PVD) coating with elevated aluminum content on a cemented carbide tool. The suggested coating enhanced the formation of protective nano-scale tribo-films on the friction surface during operation. Moreover, machining process optimization contributed to further enhancement of this beneficial process, as evidenced by X-ray Photoelectron Spectroscopy (XPS) studies of the tribo-films. This resulted in a reduction of the scale of the build-ups, leading to overall improvement in wear performance. A new thermodynamic analysis is proposed concerning entropy production during friction in machining with build-up edge formation. The model is able to predict various phenomena and shows good agreement with experimental results. In the presented research, we demonstrate a novel experimental approach for controlling self-organized criticality, using the example of machining with build-up edge formation, which is similar to avalanches. This was done through enhanced adaptive performance of the surface-engineered tribo-system, with the aim of reducing the scale and frequency of the avalanches. Full article
(This article belongs to the Special Issue Entropy Application in Tribology)

Open AccessArticle Indicators of Evidence for Bioequivalence
Entropy 2016, 18(8), 291; doi:10.3390/e18080291
Received: 29 May 2016 / Revised: 22 July 2016 / Accepted: 2 August 2016 / Published: 9 August 2016
PDF Full-text (806 KB) | HTML Full-text | XML Full-text
Abstract
Some equivalence tests are based on two one-sided tests, where in many applications the test statistics are approximately normal. We define and find evidence for equivalence in Z-tests and then in one- and two-sample binomial tests, as well as for t-tests. Multivariate equivalence tests are typically based on statistics with non-central chi-squared or non-central F distributions, in which the non-centrality parameter λ is a measure of heterogeneity of several groups. Classical tests of the null λ ≥ λ0 versus the equivalence alternative λ < λ0 are available, but simple formulae for power functions are not. In these tests, the equivalence limit λ0 is typically chosen by context. We provide extensions of classical variance stabilizing transformations for the non-central chi-squared and F distributions that are easy to implement and which lead to indicators of evidence for equivalence. Approximate power functions are also obtained via simple expressions for the expected evidence in these equivalence tests. Full article
(This article belongs to the Special Issue Statistical Significance and the Logic of Hypothesis Testing)

Open AccessArticle On Multi-Scale Entropy Analysis of Order-Tracking Measurement for Bearing Fault Diagnosis under Variable Speed
Entropy 2016, 18(8), 292; doi:10.3390/e18080292
Received: 20 June 2016 / Revised: 2 August 2016 / Accepted: 8 August 2016 / Published: 10 August 2016
Cited by 1 | PDF Full-text (2970 KB) | HTML Full-text | XML Full-text
Abstract
The research objective of this paper is to investigate the feasibility and effectiveness of combining envelope extraction with multi-scale entropy (MSE) analysis for identifying different roller bearing faults. The features were extracted from angle-domain vibration signals measured through a hardware-implemented order-tracking technique, so that the characteristics of the bearing defects are not affected by the rotating speed. Envelope analysis was applied to the vibration measurements as well as to the selected intrinsic mode function (IMF) separated by the empirical mode decomposition (EMD) method. Using the coarse-graining process, the entropy of the envelope signals at different scales was calculated to form the MSE distributions that represent the complexity of the signals. A decision tree was then used to distinguish the entropy-related features that reveal the different classes of bearing faults. Full article
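The MSE computation at its core is a coarse-graining step followed by an entropy estimate at each scale. Below is a simplified sketch under our own assumptions: it omits the paper's order tracking, envelope extraction, and EMD stages, and the sample entropy estimator is a simplified version of the standard template-matching definition.

```python
import math
import random

def coarse_grain(x, scale):
    """MSE coarse-graining: averages over non-overlapping windows."""
    return [sum(x[i:i + scale]) / scale
            for i in range(0, len(x) - scale + 1, scale)]

def sample_entropy(x, m=2, r=None):
    """Simplified sample entropy: -ln(matches at m+1 / matches at m)."""
    if r is None:
        mu = sum(x) / len(x)
        r = 0.2 * math.sqrt(sum((v - mu) ** 2 for v in x) / len(x))
    def matches(dim):
        t = [x[i:i + dim] for i in range(len(x) - dim)]
        return sum(1 for i in range(len(t)) for j in range(i + 1, len(t))
                   if max(abs(a - b) for a, b in zip(t[i], t[j])) <= r)
    b, a = matches(m), matches(m + 1)
    return -math.log(a / b) if a and b else float("inf")

random.seed(1)
noise = [random.gauss(0.0, 1.0) for _ in range(300)]       # toy signal
mse = [sample_entropy(coarse_grain(noise, s)) for s in (1, 2)]
print(mse)
```

In the paper's setting, `noise` would be the envelope of an angle-domain vibration signal (or a selected IMF), and the resulting entropy-vs-scale curve feeds the decision tree.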
(This article belongs to the Section Complexity)

Open AccessArticle Characterization of Seepage Velocity beneath a Complex Rock Mass Dam Based on Entropy Theory
Entropy 2016, 18(8), 293; doi:10.3390/e18080293
Received: 24 May 2016 / Revised: 29 July 2016 / Accepted: 8 August 2016 / Published: 11 August 2016
PDF Full-text (1589 KB) | HTML Full-text | XML Full-text
Abstract
Owing to the randomness of the fracture flow system, the seepage system beneath a complex rock mass dam is inherently complex and highly uncertain; investigating dam leakage by estimating the spatial distribution of the seepage field with conventional methods is therefore quite difficult. In this paper, entropy theory, as a relation between definiteness and probability, is used to probabilistically analyze the characteristics of the seepage system in a complex rock mass dam. Based on the principle of maximum entropy, an equation for the vertical distribution of the seepage velocity in a dam borehole is derived. The derived distribution is tested against actual field data, and the results show good agreement. From the entropy of the flow velocity in boreholes, the rupture degree of the dam bedrock has been successfully estimated. Moreover, a new sampling scheme is presented, in which the sampling frequency is negatively correlated with the distance to the site of minimum velocity; this scheme is preferable to the traditional one. This paper demonstrates the significant advantage of applying entropy theory to seepage velocity analysis in a complex rock mass dam. Full article
(This article belongs to the Special Issue Applications of Information Theory in the Geosciences)

Open AccessArticle Understanding Gating Operations in Recurrent Neural Networks through Opinion Expression Extraction
Entropy 2016, 18(8), 294; doi:10.3390/e18080294
Received: 23 March 2016 / Revised: 18 July 2016 / Accepted: 8 August 2016 / Published: 11 August 2016
PDF Full-text (756 KB) | HTML Full-text | XML Full-text
Abstract
Extracting opinion expressions from text is an essential task of sentiment analysis, which is usually treated as a word-level sequence labeling problem. In such problems, compositional models with multiplicative gating operations provide efficient ways to encode the contexts, as well as to choose critical information. Thus, in this paper, we adopt Long Short-Term Memory (LSTM) recurrent neural networks to address the task of opinion expression extraction and explore the internal mechanisms of the model. The proposed approach is evaluated on the Multi-Perspective Question Answering (MPQA) opinion corpus. The experimental results demonstrate improvement over previous approaches, including the state-of-the-art method based on simple recurrent neural networks. We also provide a novel micro perspective to analyze the run-time processes and gain new insights into the advantages of LSTM in selecting the source of information with its flexible connections and multiplicative gating operations. Full article

Open AccessArticle Information Theoretical Measures for Achieving Robust Learning Machines
Entropy 2016, 18(8), 295; doi:10.3390/e18080295
Received: 13 May 2016 / Revised: 6 August 2016 / Accepted: 8 August 2016 / Published: 12 August 2016
PDF Full-text (520 KB) | HTML Full-text | XML Full-text
Abstract
Information theoretical measures are used to design, from first principles, an objective function that can drive a learning machine process to a solution that is robust to perturbations in parameters. Full analytic derivations are given and tested with computational examples showing that the procedure is indeed successful. The final solution, implemented by a robust learning machine, expresses a balance between Shannon differential entropy and Fisher information. Remarkably, this balance emerges as an analytical relation, despite the purely numerical operations of the learning machine. Full article
(This article belongs to the Special Issue Information Theoretic Learning)

Open AccessArticle Temporal Predictability of Online Behavior in Foursquare
Entropy 2016, 18(8), 296; doi:10.3390/e18080296
Received: 16 June 2016 / Revised: 2 August 2016 / Accepted: 8 August 2016 / Published: 12 August 2016
PDF Full-text (1336 KB) | HTML Full-text | XML Full-text
Abstract
With the widespread use of Internet technologies, online behaviors play a more and more important role in humans’ daily lives. Knowing the times when humans perform their next online activities can be quite valuable for developing better online services, which prompts us to wonder whether the times of users’ next online activities are predictable. In this paper, we investigate the temporal predictability in human online activities through exploiting the dataset from the social network Foursquare. Through discretizing the inter-event times of users’ Foursquare activities into symbols, we map each user’s inter-event time sequence to a sequence of inter-event time symbols. By applying the information-theoretic method to the sequences of inter-event time symbols, we show that for a user’s Foursquare activities, knowing the time interval between the current activity and the previous activity decreases the entropy of the time interval between the next activity and current activity, i.e., the time of the user’s next Foursquare activity is predictable. Much of the predictability is explained by the equal-interval repeat; that is, users perform consecutive Foursquare activities with approximately equal time intervals. On the other hand, the unequal-interval preference, i.e., the preference of performing Foursquare activities with a fixed time interval after another given time interval, is also an origin for predictability. Furthermore, our results reveal that the Foursquare activities on weekdays have a higher temporal predictability than those on weekends and that users’ Foursquare activity is more temporally predictable if his/her previous activity is performed in a location that he/she visits more frequently. Full article
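The core information-theoretic comparison can be sketched with plug-in entropy estimates: discretize inter-event times into symbols, then compare the marginal entropy of the next symbol with its entropy conditioned on the current symbol. The symbol sequence below is invented for illustration; the paper's estimators on real Foursquare data are more involved.

```python
import math
from collections import Counter

def entropy(seq):
    """Plug-in Shannon entropy (bits) of a sequence of symbols."""
    n = len(seq)
    return -sum(c / n * math.log2(c / n) for c in Counter(seq).values())

def conditional_entropy(seq):
    """H(next symbol | current symbol) = H(pairs) - H(firsts)."""
    return entropy(list(zip(seq, seq[1:]))) - entropy(seq[:-1])

# Hypothetical inter-event-time symbols: S = short, M = medium, L = long gap.
sym = list("SSMSSMSSLSSMSSM")
h_marginal = entropy(sym[1:])
h_conditional = conditional_entropy(sym)
print(h_marginal, h_conditional)   # conditioning lowers the entropy
```

A drop from `h_marginal` to `h_conditional` is exactly the effect the abstract describes: knowing the previous inter-event time reduces the uncertainty about the next one.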
(This article belongs to the Section Information Theory)
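The entropy argument in the abstract above can be illustrated with a small sketch (not the authors' code): discretize inter-event times into symbols and check that conditioning on the current symbol lowers the entropy of the next one. The bin labels and toy data below are invented for illustration.

```python
import math
from collections import Counter

def entropy(symbols):
    """Shannon entropy (bits) of a symbol sequence."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def conditional_entropy(symbols):
    """Entropy of the next symbol given the current one: H(X_{t+1} | X_t)."""
    pairs = list(zip(symbols, symbols[1:]))
    # H(X_{t+1} | X_t) = H(X_t, X_{t+1}) - H(X_t)
    return entropy(pairs) - entropy(symbols[:-1])

# Toy inter-event times (hours), discretized into coarse symbols.
times = [1, 2, 1, 1, 25, 1, 2, 1, 24, 1, 1, 2]
bins = ["short" if t < 12 else "long" for t in times]

h = entropy(bins)
h_cond = conditional_entropy(bins)
# Temporal predictability shows up as h_cond < h: knowing the current
# inter-event time reduces uncertainty about the next one.
```

The paper's "equal-interval repeat" corresponds to the diagonal of the pair distribution being overrepresented, which is exactly what drives the entropy drop in this toy example.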

Open AccessArticle Voice Activity Detection Using Fuzzy Entropy and Support Vector Machine
Entropy 2016, 18(8), 298; doi:10.3390/e18080298
Received: 27 April 2016 / Revised: 27 July 2016 / Accepted: 8 August 2016 / Published: 12 August 2016
Cited by 1 | PDF Full-text (4576 KB) | HTML Full-text | XML Full-text
Abstract
This paper proposes support vector machine (SVM) based voice activity detection using FuzzyEn to improve detection performance under noisy conditions. The proposed voice activity detection (VAD) uses fuzzy entropy (FuzzyEn) as a feature extracted from noise-reduced speech signals to train an SVM model for speech/non-speech classification. The proposed VAD method was tested in experiments that added real background noises at signal-to-noise ratios (SNRs) ranging from −10 dB to 10 dB to speech signals collected from the TIMIT database. The analysis shows that the FuzzyEn feature discriminates well between noise and noise-corrupted speech. The efficacy of the SVM classifier was validated using 10-fold cross-validation. Furthermore, the results obtained by the proposed method were compared with those of previous standardized VAD algorithms as well as recently developed methods. The performance comparison shows that the proposed method detects speech more effectively in various noisy environments, with an accuracy of 93.29%, and that the FuzzyEn feature detects speech efficiently even at low SNR levels. Full article
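As an illustration of the kind of feature the abstract above describes, here is a minimal FuzzyEn sketch (not the authors' implementation; the parameter values m, r, n and the toy frames are assumptions). In the paper these features then train an SVM classifier; the sketch only shows that FuzzyEn separates an irregular, noise-like frame from a periodic, voiced-like one.

```python
import numpy as np

def fuzzy_entropy(x, m=2, r=0.2, n=2):
    """FuzzyEn of a 1-D frame: like sample entropy, but with an exponential
    fuzzy membership exp(-d^n / r) instead of a hard similarity threshold."""
    x = np.asarray(x, dtype=float)
    r = r * x.std()

    def phi(dim):
        # Embed into `dim`-dimensional template vectors, baseline removed.
        vecs = np.array([x[i:i + dim] - x[i:i + dim].mean()
                         for i in range(len(x) - dim)])
        # Chebyshev distance between every pair of templates.
        d = np.abs(vecs[:, None, :] - vecs[None, :, :]).max(axis=2)
        sim = np.exp(-(d ** n) / r)   # fuzzy similarity degree
        np.fill_diagonal(sim, 0.0)    # exclude self-matches
        return sim.sum() / (len(vecs) * (len(vecs) - 1))

    return np.log(phi(m) / phi(m + 1))

rng = np.random.default_rng(0)
noise_frame = rng.normal(size=200)                     # noise-like frame
speech_frame = np.sin(np.linspace(0, 8 * np.pi, 200))  # periodic, voiced-like

fe_noise = fuzzy_entropy(noise_frame)
fe_speech = fuzzy_entropy(speech_frame)
# The irregular noise frame yields a markedly higher FuzzyEn than the
# periodic frame, which is what makes FuzzyEn usable as a VAD feature.
```

In a full VAD pipeline, one FuzzyEn value per frame would be collected into a feature vector and fed to an SVM for speech/non-speech classification.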

Open AccessArticle Determining the Entropic Index q of Tsallis Entropy in Images through Redundancy
Entropy 2016, 18(8), 299; doi:10.3390/e18080299
Received: 7 June 2016 / Revised: 3 August 2016 / Accepted: 8 August 2016 / Published: 15 August 2016
PDF Full-text (2251 KB) | HTML Full-text | XML Full-text | Supplementary Files
Abstract
The Boltzmann–Gibbs and Tsallis entropies are essential concepts in statistical physics that have found numerous applications in engineering and science. In particular, we focus on their applications to image processing through information theory. We present in this article a novel numerical method to calculate the Tsallis entropic index q characteristic of a given image, treating the image as a non-extensive system. The entropic index q is calculated through q-redundancy maximization, a methodology that comes from information theory. We find that grayscale image thresholding gives better results with the Tsallis entropy and the computed index q than with the Shannon entropy. Full article
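A minimal sketch of the q-selection idea from the abstract above (not the authors' algorithm): compute the Tsallis entropy of a gray-level histogram, define the q-redundancy against the uniform-histogram maximum, and scan q for the maximizer. The toy histogram and the scan range for q are assumptions.

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum p_i^q) / (q - 1); S_1 is the Shannon limit."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if abs(q - 1.0) < 1e-12:
        return float(-np.sum(p * np.log(p)))   # Boltzmann–Gibbs limit
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))

def q_redundancy(p, q):
    """q-redundancy R_q = 1 - S_q / S_q^max, where S_q^max is the Tsallis
    entropy of the uniform distribution over the same number of gray levels."""
    w = len(p)
    s_max = tsallis_entropy(np.full(w, 1.0 / w), q)
    return 1.0 - tsallis_entropy(p, q) / s_max

# Gray-level histogram of a toy 8-bit "image" with a skewed intensity profile.
rng = np.random.default_rng(1)
pixels = np.clip(rng.poisson(60, size=10_000), 0, 255)
hist = np.bincount(pixels, minlength=256) / pixels.size

# Scan q and keep the value that maximizes the q-redundancy.
qs = np.linspace(0.1, 3.0, 60)
q_star = qs[np.argmax([q_redundancy(hist, q) for q in qs])]
```

The selected q_star would then parametrize Tsallis-entropy thresholding of the image, in place of the usual Shannon-entropy criterion.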

Review

Jump to: Research, Other

Open AccessReview Entropy as a Metric Generator of Dissipation in Complete Metriplectic Systems
Entropy 2016, 18(8), 304; doi:10.3390/e18080304
Received: 9 June 2016 / Revised: 31 July 2016 / Accepted: 11 August 2016 / Published: 16 August 2016
Cited by 1 | PDF Full-text (937 KB) | HTML Full-text | XML Full-text
Abstract
This lecture is a short review of the role entropy plays in those classical dissipative systems whose equations of motion may be expressed via a Leibniz Bracket Algebra (LBA). This means that the time derivative of any physical observable f of the system is calculated by putting this f in a “bracket” together with a “special observable” F, referred to as a Leibniz generator of the dynamics. While conservative dynamics is given an LBA formulation in the Hamiltonian framework, so that F is the Hamiltonian H of the system that generates the motion via classical Poisson brackets or quantum commutation brackets, classical dissipative dynamics can be given an LBA formulation through the Metriplectic Bracket Algebra (MBA): the conservative component of the dynamics is still generated via Poisson algebra by the total energy H, while S, the entropy of the degrees of freedom statistically encoded in friction, generates dissipation via a metric bracket. The motivation for expressing the dynamics through a bracket algebra and a motion-generating function F is to endow the theory of the system at hand with all the powerful machinery of Hamiltonian systems, so that its symmetries become evident and readable. A (necessarily partial) overview of the types of systems subject to MBA formulation is presented, and the physical meaning of the quantity S involved in each is discussed, the aim being to review the different MBAs for isolated systems in a synoptic way. At the end of this collection of examples, it is stressed that dissipative dynamics may be constructed even in the absence of friction with microscopic degrees of freedom, a hint that dissipation can be introduced at a more fundamental level. Full article
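The metriplectic structure sketched in this abstract can be summarized in a few standard formulas (a textbook-style sketch, with {·,·} the antisymmetric Poisson bracket and (·,·) the symmetric, positive semi-definite metric bracket):

```latex
% Metriplectic dynamics: Hamiltonian part generated by H, dissipative part by S
\frac{df}{dt} = \{f, H\} + (f, S)

% Degeneracy conditions: S is a Casimir of the Poisson bracket,
% and H lies in the null space of the metric bracket
\{f, S\} = 0, \qquad (f, H) = 0 \quad \text{for all observables } f

% Consequences: energy conservation and entropy production
\frac{dH}{dt} = \{H, H\} + (H, S) = 0, \qquad
\frac{dS}{dt} = \{S, H\} + (S, S) = (S, S) \ge 0
```

The two degeneracy conditions are what make H a pure generator of the conservative motion and S a pure generator of dissipation, yielding an H-theorem by construction.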

Other

Jump to: Research, Review

Open AccessCorrection Correction to Yao, H.; Qiao, J.-W.; Gao, M.C.; Hawk, J.A.; Ma, S.-G.; Zhou, H. MoNbTaV Medium-Entropy Alloy. Entropy 2016, 18, 189
Entropy 2016, 18(8), 289; doi:10.3390/e18080289
Received: 4 August 2016 / Accepted: 5 August 2016 / Published: 9 August 2016
PDF Full-text (153 KB) | HTML Full-text | XML Full-text
Abstract The authors wish to make the following correction to their paper [1].[...] Full article
(This article belongs to the Special Issue High-Entropy Alloys and High-Entropy-Related Materials)

Journal Contact

MDPI AG
Entropy Editorial Office
St. Alban-Anlage 66, 4052 Basel, Switzerland
E-Mail: 
Tel. +41 61 683 77 34
Fax: +41 61 302 89 18