Search Results (7)

Search Parameters:
Keywords = Cressie-Read divergence

24 pages, 1153 KB  
Article
Correspondence Analysis for Assessing Departures from Perfect Symmetry Using the Cressie–Read Family of Divergence Statistics
by Eric J. Beh and Rosaria Lombardo
Symmetry 2024, 16(7), 830; https://doi.org/10.3390/sym16070830 - 2 Jul 2024
Viewed by 1375
Abstract
Recently, Beh and Lombardo (2022, Symmetry, 14, 1103) showed how to perform a correspondence analysis on a two-way contingency table where Bowker's statistic lies at the numerical heart of the analysis. There, we showed how this statistic could be used to visually identify departures from perfect symmetry. Interestingly, Bowker's statistic is a special case of the symmetry version of the Cressie–Read family of divergence statistics. Therefore, this paper presents a new framework for visually assessing departures from perfect symmetry using a second-order Taylor series approximation of the Cressie–Read family of divergence statistics.
(This article belongs to the Section Mathematics)
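Bowker's statistic, which the abstract above places at the numerical heart of the analysis, has a standard closed form for a square I × I contingency table. A minimal sketch follows (the function name and toy tables are ours; the paper's correspondence-analysis machinery is not reproduced here):

```python
import numpy as np

def bowker_statistic(table):
    """Bowker's test statistic of symmetry for a square contingency table:

        B = sum_{i<j} (n_ij - n_ji)^2 / (n_ij + n_ji),

    asymptotically chi-square with I(I-1)/2 degrees of freedom under
    perfect symmetry (n_ij = n_ji for all i, j).
    """
    n = np.asarray(table, dtype=float)
    stat, df = 0.0, 0
    for i in range(n.shape[0]):
        for j in range(i + 1, n.shape[0]):
            if n[i, j] + n[j, i] > 0:  # skip empty symmetric pairs
                stat += (n[i, j] - n[j, i]) ** 2 / (n[i, j] + n[j, i])
                df += 1
    return stat, df

# A perfectly symmetric table gives B = 0; asymmetry inflates B.
B_sym, _ = bowker_statistic([[10, 4, 6], [4, 8, 2], [6, 2, 12]])
B_asym, _ = bowker_statistic([[10, 9, 6], [1, 8, 2], [6, 2, 12]])
```

Large values of B relative to the chi-square reference distribution signal departure from perfect symmetry, which is the quantity the paper's framework visualizes.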

14 pages, 690 KB  
Article
On Representations of Divergence Measures and Related Quantities in Exponential Families
by Stefan Bedbur and Udo Kamps
Entropy 2021, 23(6), 726; https://doi.org/10.3390/e23060726 - 8 Jun 2021
Cited by 1 | Viewed by 2996
Abstract
Within exponential families, which may consist of multi-parameter and multivariate distributions, a variety of divergence measures, such as the Kullback–Leibler divergence, the Cressie–Read divergence, the Rényi divergence, and the Hellinger metric, can be explicitly expressed in terms of the respective cumulant function and mean value function. Moreover, the same applies to related entropy and affinity measures. We compile representations scattered in the literature and present a unified approach to their derivation in exponential families. As a statistical application, we highlight their use in the construction of confidence regions in a multi-sample setup.
(This article belongs to the Special Issue Measures of Information)
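As an illustration of the kind of representation the abstract describes: for a one-parameter exponential family in natural parameterization with cumulant function psi and mean value function mu = psi', the Kullback–Leibler divergence is psi(theta2) - psi(theta1) - (theta2 - theta1) * mu(theta1). A sketch checking this against the closed-form Poisson KL divergence (function names are ours, not the paper's notation):

```python
import math

# Poisson in natural parameterization: theta = log(lambda),
# cumulant function psi(theta) = exp(theta),
# mean value function mu(theta) = psi'(theta) = exp(theta).
psi = math.exp
mu = math.exp

def kl_expfam(theta1, theta2):
    """KL(P_theta1 || P_theta2) = psi(theta2) - psi(theta1) - (theta2 - theta1) * mu(theta1)."""
    return psi(theta2) - psi(theta1) - (theta2 - theta1) * mu(theta1)

def kl_poisson(lam1, lam2):
    """Closed-form Poisson KL divergence: lam1 * log(lam1/lam2) - lam1 + lam2."""
    return lam1 * math.log(lam1 / lam2) - lam1 + lam2

lam1, lam2 = 2.5, 4.0
kl_a = kl_expfam(math.log(lam1), math.log(lam2))
kl_b = kl_poisson(lam1, lam2)
# the two expressions agree to floating-point precision
```

The point of such representations is that only the cumulant and mean value functions are needed, with no integration over the sample space.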

16 pages, 1467 KB  
Concept Paper
Permutation Entropy and Information Recovery in Nonlinear Dynamic Economic Time Series
by Miguel Henry and George Judge
Econometrics 2019, 7(1), 10; https://doi.org/10.3390/econometrics7010010 - 12 Mar 2019
Cited by 56 | Viewed by 11202
Abstract
The focus of this paper is an information-theoretic, symbolic-logic approach to extracting information from complex economic systems and unlocking their dynamic content. Permutation Entropy (PE) is used to capture the permutation patterns (ordinal relations) among the individual values of a given time series, to obtain a probability distribution of the accessible patterns, and to quantify the degree of complexity of an economic behavior system. Ordinal patterns are used to describe the intrinsic patterns hidden in the dynamics of the economic system. Empirical applications involving the Dow Jones Industrial Average are presented to indicate the information recovery value and the applicability of the PE method. The results demonstrate the ability of the PE method to detect the extent of complexity (irregularity) and to discriminate and classify admissible and forbidden states.
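The Permutation Entropy described above follows the standard Bandt–Pompe construction: map each length-m window of the series to its ordinal pattern, then take the Shannon entropy of the empirical pattern distribution, normalized by log(m!). A minimal sketch (our own implementation, not the authors' code; ties are broken by position):

```python
import math
from collections import Counter

def permutation_entropy(x, m=3):
    """Normalized permutation entropy of time series x with embedding order m.

    Each length-m window is mapped to its ordinal pattern (the index order
    that sorts the window); PE is the Shannon entropy of the empirical
    pattern distribution, normalized by log(m!) so that 0 <= PE <= 1.
    """
    patterns = Counter(
        tuple(sorted(range(m), key=lambda k: x[i + k]))
        for i in range(len(x) - m + 1)
    )
    total = sum(patterns.values())
    h = -sum((c / total) * math.log(c / total) for c in patterns.values())
    return h / math.log(math.factorial(m))

pe_mono = permutation_entropy(list(range(100)))         # single pattern -> 0
pe_mixed = permutation_entropy([4, 7, 9, 10, 6, 11, 3])  # several patterns
```

Patterns that never occur ("forbidden" ordinal patterns, in the paper's terminology) simply contribute nothing to the sum, which is what lets PE discriminate admissible from forbidden states.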

11 pages, 779 KB  
Article
Econometric Information Recovery in Behavioral Networks
by George Judge
Econometrics 2016, 4(3), 38; https://doi.org/10.3390/econometrics4030038 - 14 Sep 2016
Cited by 3 | Viewed by 6286
Abstract
In this paper, we suggest an approach to recovering behavior-related, preference-choice network information from observational data. We model the process as a self-organized, behavior-based, random exponential network-graph system. To address the unknown nature of the sampling model in recovering behavior-related network information, we use the Cressie–Read (CR) family of divergence measures and the corresponding information-theoretic entropy basis for estimation, inference, model evaluation, and prediction. Examples are included to clarify how entropy-based information-theoretic methods are directly applicable to recovering the behavioral network probabilities in this fundamentally underdetermined, ill-posed inverse recovery problem.

12 pages, 272 KB  
Article
Information Recovery in a Dynamic Statistical Markov Model
by Douglas J. Miller and George Judge
Econometrics 2015, 3(2), 187-198; https://doi.org/10.3390/econometrics3020187 - 25 Mar 2015
Cited by 12 | Viewed by 4818
Abstract
Although economic processes and systems are in general simple in nature, the underlying dynamics are complicated and seldom understood. Recognizing this, in this paper we use a nonstationary-conditional Markov process model of observed aggregate data to learn about and recover causal influence information associated with the underlying dynamic micro-behavior. Estimating equations are used as a link to the data and to model the dynamic conditional Markov process. To recover the unknown transition probabilities, we use an information-theoretic approach to model the data and derive a new class of conditional Markov models. A quadratic loss function is used as a basis for selecting the optimal member from the family of possible likelihood-entropy functionals. The asymptotic properties of the resulting estimators are demonstrated, and a range of potential applications is discussed.
12 pages, 232 KB  
Article
Implications of the Cressie-Read Family of Additive Divergences for Information Recovery
by George G. Judge and Ron C. Mittelhammer
Entropy 2012, 14(12), 2427-2438; https://doi.org/10.3390/e14122427 - 3 Dec 2012
Cited by 20 | Viewed by 6830
Abstract
To address the unknown nature of probability-sampling models, in this paper we use information-theoretic concepts and the Cressie–Read (CR) family of information divergence measures to produce a flexible family of probability distributions, likelihood functions, estimators, and inference procedures. The usual case in statistical modeling is that the noisy indirect data are observed and known, while the sampling model (the error distribution-probability space) consistent with the data is unknown. To address the unknown sampling process underlying the data, we consider a convex combination of two or more estimators derived from members of the flexible CR family of divergence measures and optimize that combination to select an estimator that minimizes expected quadratic loss. Sampling experiments are used to illustrate the finite-sample properties of the resulting estimator and the nature of the recovered sampling distribution.
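For reference, the CR family that recurs throughout these results has the standard power-divergence form 2I(lam) = 2/(lam*(lam+1)) * sum_i O_i * ((O_i/E_i)^lam - 1), with Pearson's chi-square at lam = 1 and the likelihood-ratio statistic G^2 as the lam -> 0 limit. A minimal sketch of the statistic itself (toy counts are ours; this is not the paper's convex-combination estimator):

```python
import numpy as np

def cressie_read(observed, expected, lam):
    """Cressie-Read power-divergence statistic
    2*I(lam) = 2/(lam*(lam+1)) * sum O_i * ((O_i/E_i)**lam - 1).

    lam = 1 recovers Pearson's chi-square (when totals match);
    the lam -> 0 and lam -> -1 limits give the likelihood-ratio
    statistics based on O and E respectively.
    """
    O = np.asarray(observed, dtype=float)
    E = np.asarray(expected, dtype=float)
    if lam == 0:   # continuous limit: G^2
        return 2.0 * float(np.sum(O * np.log(O / E)))
    if lam == -1:  # continuous limit: reversed G^2
        return 2.0 * float(np.sum(E * np.log(E / O)))
    return 2.0 / (lam * (lam + 1)) * float(np.sum(O * ((O / E) ** lam - 1.0)))

O = np.array([30.0, 14.0, 6.0])   # observed counts
E = np.array([25.0, 15.0, 10.0])  # expected counts (same total)
pearson = float(np.sum((O - E) ** 2 / E))
# cressie_read(O, E, 1.0) reproduces Pearson's chi-square here
```

Varying lam sweeps through a one-parameter family of estimation criteria, which is exactly the flexibility the paper exploits when combining estimators.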
10 pages, 181 KB  
Article
Generalized Measure of Departure from No Three-Factor Interaction Model for 2 × 2 × K Contingency Tables
by Kouji Yamamoto, Yohei Ban and Sadao Tomizawa
Entropy 2008, 10(4), 776-785; https://doi.org/10.3390/e10040776 - 22 Dec 2008
Cited by 1 | Viewed by 8837
Abstract
For 2 × 2 × K contingency tables, Tomizawa considered a Shannon-entropy-type measure to represent the degree of departure from a log-linear model of no three-factor interaction (the NOTFI model). This paper proposes a generalization of Tomizawa's measure for 2 × 2 × K tables. The proposed measure is expressed using the Patil–Taillie diversity index or the Cressie–Read power divergence, and it includes Tomizawa's measure as a special case. The proposed measure would be useful for comparing the degrees of departure from the NOTFI model in several tables.
(This article belongs to the Special Issue Information and Entropy)