Maximum Entropy

A special issue of Entropy (ISSN 1099-4300).

Deadline for manuscript submissions: closed (30 September 2009) | Viewed by 123331

Special Issue Editor


Guest Editor
Copenhagen Business College, Rønne Alle 1, st., 2860 Søborg, Denmark
Interests: cause and effect; entropy; exponential families; graphical models; information divergence; minimum description length; quantum information; statistical mechanics


Published Papers (10 papers)


Research


Article
Estimation of Seismic Wavelets Based on the Multivariate Scale Mixture of Gaussians Model
by Jing-Huai Gao and Bing Zhang
Entropy 2010, 12(1), 14-33; https://doi.org/10.3390/e12010014 - 28 Dec 2009
Cited by 15 | Viewed by 8351
Abstract
This paper proposes a new method for estimating seismic wavelets. Suppose a seismic wavelet can be modeled by a formula with three free parameters (scale, frequency and phase); estimating the wavelet then reduces to determining these three parameters. The phase of the wavelet is estimated by applying a constant-phase rotation to the seismic signal, while the other two parameters are obtained by a higher-order statistics (HOS, fourth-order cumulant) matching method. To derive the HOS estimator, the multivariate scale mixture of Gaussians (MSMG) model is applied to formulate the multivariate joint probability density function (PDF) of the seismic signal. In this way, the HOS can be represented as a polynomial function of second-order statistics, improving noise robustness and accuracy. In addition, the proposed method works well for short time series.
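The scale-mixture construction at the heart of the paper can be sketched numerically. The following is a minimal univariate illustration (with an assumed gamma mixing density, not the paper's seismic model) of why fourth-order statistics of such a mixture reduce to lower-order moments:

```python
import numpy as np

rng = np.random.default_rng(0)

# Univariate sketch of a scale mixture of Gaussians: x = sqrt(z) * g,
# with g ~ N(0, 1) and z a positive mixing variable. The gamma mixing
# density here is an assumption chosen purely for illustration.
n = 200_000
z = rng.gamma(shape=2.0, scale=1.0, size=n)
g = rng.standard_normal(n)
x = np.sqrt(z) * g

# Conditionally on z, x is Gaussian, so E[x^2] = E[z] and E[x^4] = 3 E[z^2]:
# the fourth-order statistic is a polynomial in moments of the scale
# variable, which is the kind of reduction the MSMG model exploits.
m2 = np.mean(x**2)
m4 = np.mean(x**4)
print(m2, np.mean(z))         # both close to E[z]
print(m4, 3 * np.mean(z**2))  # both close to 3 E[z^2]
```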
(This article belongs to the Special Issue Maximum Entropy)

Article
Entropy-Based Wavelet De-noising Method for Time Series Analysis
by Yan-Fang Sang, Dong Wang, Ji-Chun Wu, Qing-Ping Zhu and Ling Wang
Entropy 2009, 11(4), 1123-1147; https://doi.org/10.3390/e11041123 - 22 Dec 2009
Cited by 59 | Viewed by 12294
Abstract
The existence of noise strongly influences the real features of observed time series, so noise reduction is a necessary and significant task in many practical applications. Traditional de-noising methods often cannot meet practical needs due to their inherent shortcomings. In this paper, a set of key but difficult wavelet de-noising problems is first discussed; then, by applying information entropy theories to the wavelet de-noising process, i.e., using the principle of maximum entropy (POME) to describe the random character of the noise and wavelet energy entropy to describe the degree of complexity of the main series in the original data, a new entropy-based wavelet de-noising method is proposed. Analysis of several synthetic series and typical observed time series verifies the performance of the new method. A comprehensive discussion of the results indicates that, compared with traditional wavelet de-noising methods, the proposed method is more effective and more widely applicable. Furthermore, because it uses information entropy to describe the clearly different characteristics of the noise and the main series, and the series is analyzed before being de-noised, the analysis process has a more reliable physical basis and yields more reasonable, globally optimal results. Moreover, the method is simple and easy to implement, making it applicable and useful in applied sciences and practical engineering works.
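The wavelet energy entropy used here to separate noise from structure can be sketched with a hand-rolled Haar transform (a simplified stand-in; the paper's wavelet choice and test series are not reproduced here). White noise spreads its energy across decomposition levels and therefore has a higher energy entropy than a smooth series:

```python
import numpy as np

def haar_level(x):
    # One level of the orthonormal Haar transform: approximation and detail.
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    return a, d

def wavelet_energy_entropy(x, levels=4):
    # Shannon entropy of the per-level energy distribution.
    energies = []
    a = np.asarray(x, dtype=float)
    for _ in range(levels):
        a, d = haar_level(a)
        energies.append(np.sum(d**2))
    energies.append(np.sum(a**2))          # final approximation energy
    p = np.array(energies) / np.sum(energies)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(1)
t = np.arange(1024)
signal = np.sin(2 * np.pi * t / 64)    # smooth series: energy concentrated
noise = rng.standard_normal(1024)      # white noise: energy spread over levels
print(wavelet_energy_entropy(signal), wavelet_energy_entropy(noise))
```

The entropy gap is what lets a threshold distinguish noise-dominated coefficients from signal-dominated ones.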

Article
Best Probability Density Function for Random Sampled Data
by Donald J. Jacobs
Entropy 2009, 11(4), 1001-1024; https://doi.org/10.3390/e11041001 - 04 Dec 2009
Cited by 7 | Viewed by 10792
Abstract
The maximum entropy method is a theoretically sound approach to construct an analytical form for the probability density function (pdf) given a sample of random events. In practice, numerical methods employed to determine the appropriate Lagrange multipliers associated with a set of moments are generally unstable in the presence of noise due to limited sampling. A robust method is presented that always returns the best pdf, where the tradeoff in smoothing a highly varying function due to noise can be controlled. An unconventional adaptive simulated annealing technique, called funnel diffusion, determines the expansion coefficients for Chebyshev polynomials in the exponential function.
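The functional form described here, an exponential of a Chebyshev expansion, is easy to evaluate directly. A minimal sketch with hypothetical coefficients (the paper's funnel-diffusion annealing, not shown, would fit them to sampled data):

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# Hypothetical expansion coefficients, for illustration only:
# p(x) is proportional to exp(sum_k c_k T_k(x)) on [-1, 1].
coeffs = [0.0, 0.3, -1.5]               # multipliers of T0, T1, T2

x = np.linspace(-1.0, 1.0, 4001)
dx = x[1] - x[0]
unnorm = np.exp(C.chebval(x, coeffs))   # exponential of the Chebyshev series
Z = unnorm.sum() * dx                   # numerical normalization constant
p = unnorm / Z

print(p.sum() * dx)                     # ≈ 1: a valid density on [-1, 1]
print((x * p).sum() * dx)               # first moment implied by the coefficients
```

Because the density is strictly positive by construction, the fit can never produce negative probabilities, one practical appeal of the exponential form.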

Article
A Weighted Generalized Maximum Entropy Estimator with a Data-driven Weight
by Ximing Wu
Entropy 2009, 11(4), 917-930; https://doi.org/10.3390/e11040917 - 26 Nov 2009
Cited by 26 | Viewed by 8179
Abstract
The method of Generalized Maximum Entropy (GME), proposed in Golan, Judge and Miller (1996), is an information-theoretic approach that is robust to the multicollinearity problem. It uses an objective function that is the sum of the entropies of the coefficient distributions and the disturbance distributions. This method can be generalized to the weighted GME (W-GME), where different weights are assigned to the two entropies in the objective function. We propose a data-driven method, based on least squares cross-validation, to select the weights in the entropy objective function. Monte Carlo simulations demonstrate that the proposed W-GME estimator is comparable to, and often outperforms, the conventional GME estimator, which places equal weights on the entropies of the coefficient and disturbance distributions.
Article
Maximum Entropy Estimation of Transition Probabilities of Reversible Markov Chains
by Erik Van der Straeten
Entropy 2009, 11(4), 867-887; https://doi.org/10.3390/e11040867 - 17 Nov 2009
Cited by 9 | Viewed by 11056
Abstract
In this paper, we develop a general theory for the estimation of the transition probabilities of reversible Markov chains using the maximum entropy principle. A broad range of physical models can be studied within this approach. We use one-dimensional classical spin systems to illustrate the theoretical ideas. The examples studied in this paper are: the Ising model, the Potts model and the Blume-Emery-Griffiths model.
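As a toy version of this setting, the entropy rate of a two-state (hence automatically reversible) Markov chain can be computed directly and, absent further constraints, is maximized by uniform transition probabilities. A sketch, not the paper's spin-system examples:

```python
import numpy as np

def entropy_rate(p, q):
    # Entropy rate of the two-state chain with transition matrix
    # [[1-p, p], [q, 1-q]] and stationary distribution pi = (q, p)/(p+q).
    P = np.array([[1 - p, p], [q, 1 - q]])
    pi = np.array([q, p]) / (p + q)
    terms = np.where(P > 0, P * np.log(P), 0.0)
    return -np.sum(pi[:, None] * terms)

# Grid search confirms the unconstrained maximum is the fully random chain:
grid = np.linspace(0.05, 0.95, 19)
rates = [[entropy_rate(p, q) for q in grid] for p in grid]
i, j = np.unravel_index(np.argmax(rates), (19, 19))
print(grid[i], grid[j])   # p = q = 0.5: uniform transitions
```

The maximum-entropy estimates of the paper arise when this maximization is instead carried out subject to macroscopic constraints (e.g., fixed magnetization or correlations).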

Article
What is Fair Pay for Executives? An Information Theoretic Analysis of Wage Distributions
by Venkat Venkatasubramanian
Entropy 2009, 11(4), 766-781; https://doi.org/10.3390/e11040766 - 03 Nov 2009
Cited by 11 | Viewed by 20629
Abstract
The high pay packages of U.S. CEOs have raised serious concerns about what constitutes fair pay. Since present economic models do not adequately address this fundamental question, we propose a new theory based on statistical mechanics and information theory. We use the principle of maximum entropy to show that the maximally fair pay distribution is lognormal under ideal conditions. This prediction agrees with observed data for the bottom 90%–95% of the working population. The theory estimates that the top 35 U.S. CEOs were overpaid by about 129 times their ideal salaries in 2008. We also offer an interpretation of entropy as a measure of fairness in an economic system, one that is maximized at equilibrium.
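The lognormal result follows the standard maximum-entropy pattern; a one-line sketch of the argument (not the paper's full treatment): maximizing the differential entropy of p over (0, ∞) subject to fixed mean and variance of ln x gives

```latex
\begin{aligned}
\max_{p}\;& -\int_0^\infty p(x)\,\ln p(x)\,dx
\quad\text{s.t.}\quad \int_0^\infty p(x)\,dx = 1,\;
\mathbb{E}[\ln x] = \mu,\; \mathbb{E}\!\left[(\ln x - \mu)^2\right] = \sigma^2 \\[4pt]
\Rightarrow\;& p(x) \propto e^{-\lambda_1 \ln x \,-\, \lambda_2 (\ln x - \mu)^2}
= \frac{1}{x\,\sigma\sqrt{2\pi}}\,
\exp\!\left(-\frac{(\ln x - \mu)^2}{2\sigma^2}\right),
\end{aligned}
```

with the multipliers fixed by the constraints at $\lambda_1 = 1$ and $\lambda_2 = 1/(2\sigma^2)$, recovering the lognormal density.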
Article
The Maximum Entropy Rate Description of a Thermodynamic System in a Stationary Non-Equilibrium State
by Marco Favretti
Entropy 2009, 11(4), 675-687; https://doi.org/10.3390/e11040675 - 29 Oct 2009
Cited by 6 | Viewed by 7578
Abstract
In this paper we present a simple model to describe a rather general system in a stationary non-equilibrium state, which is an open system traversed by a stationary flux. The probabilistic description is provided by a non-homogeneous Markov chain, which is not assumed on the basis of a model of the microscopic interactions but rather derived from the knowledge of the macroscopic fluxes traversing the system through a maximum entropy rate principle.

Review


Review
Fisher Information and Semiclassical Treatments
by Flavia Pennini, Gustavo Ferri and Angelo Plastino
Entropy 2009, 11(4), 972-992; https://doi.org/10.3390/e11040972 - 03 Dec 2009
Cited by 13 | Viewed by 8493
Abstract
We review here the difference between quantum statistical treatments and semiclassical ones, using as the main concomitant tool a semiclassical, shift-invariant Fisher information measure built up with Husimi distributions. Its semiclassical character notwithstanding, this measure also contains abundant information of a purely quantal nature. Such a tool allows us to refine the celebrated Lieb bound for Wehrl entropies and to discover thermodynamic-like relations that involve the degree of delocalization. Fisher-related thermal uncertainty relations are developed and the degree of purity of canonical distributions, regarded as mixed states, is connected to this Fisher measure as well.

Review
Use of Maximum Entropy Modeling in Wildlife Research
by Roger A. Baldwin
Entropy 2009, 11(4), 854-866; https://doi.org/10.3390/e11040854 - 16 Nov 2009
Cited by 543 | Viewed by 25896
Abstract
Maximum entropy (Maxent) modeling has great potential for identifying distributions and habitat selection of wildlife given its reliance on only presence locations. Recent studies indicate Maxent is relatively insensitive to spatial errors associated with location data, requires few locations to construct useful models, and performs better than other presence-only modeling approaches. Further advances are needed to better define model thresholds, to test model significance, and to address model selection. Additionally, development of modeling approaches is needed when using repeated sampling of known individuals to assess habitat selection. These advancements would strengthen the utility of Maxent for wildlife research and management.
Review
The Maximum Entropy Formalism and the Prediction of Liquid Spray Drop-Size Distribution
by Christophe Dumouchel
Entropy 2009, 11(4), 713-747; https://doi.org/10.3390/e11040713 - 02 Nov 2009
Cited by 34 | Viewed by 9316
Abstract
The efficiency of any application involving a liquid spray is known to depend strongly on the spray characteristics and, mainly, on the drop-diameter distribution. There is therefore a crucial need for models that can predict this distribution. However, atomization processes are only partially understood, and a universal model is not yet available. For almost thirty years, models based on the Maximum Entropy Formalism have been proposed to fulfill this task. This paper reviews these models, emphasizing their similarities and differences, and discusses what can be expected from using this formalism to model spray drop-size distributions.