2,657 Results Found

  • Article
  • Open Access
2 Citations
5,491 Views
12 Pages

Mutual Information between Order Book Layers

  • Daniel Libman,
  • Gil Ariel,
  • Mary Schaps and
  • Simi Haber

27 February 2022

The order book is a list of all current buy or sell orders for a given financial security. The rise of electronic stock exchanges introduced a debate about the relevance of the information it encapsulates about the activity of traders. Here, we approach...

  • Proceeding Paper
  • Open Access
13 Citations
5,135 Views
9 Pages

On the Estimation of Mutual Information

  • Nicholas Carrara and
  • Jesse Ernst

In this paper we focus on the estimation of mutual information from finite samples (X × Y). The main concern with estimations of mutual information (MI) is their robustness under the class of transformations for which it remains invar...

  • Article
  • Open Access
19 Citations
8,028 Views
16 Pages

1 April 2017

We propose a novel feature selection method based on quadratic mutual information which has its roots in Cauchy–Schwarz divergence and Rényi entropy. The method uses the direct estimation of quadratic mutual information from data samples using Gaussi...

  • Article
  • Open Access
4 Citations
5,440 Views
13 Pages

29 July 2021

Task-nuisance decomposition describes why the information bottleneck loss I(z;x) − βI(z;y) is a suitable objective for supervised learning. The true category y is predicted for input x using latent variables z. When n is a nuisance independent from y,...
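For fully discrete toy distributions, the bottleneck objective quoted in this snippet can be evaluated in closed form from the joint table. The sketch below is an illustration of the objective only, not the paper's task-nuisance derivation; the joint table `pxy`, the encoder `q_z_given_x`, and all function names are made-up examples:

```python
import numpy as np

def mi_from_joint(pxy):
    """I(X;Y) in nats, computed from a joint probability table p(x, y)."""
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    mask = pxy > 0
    return float(np.sum(pxy[mask] * np.log(pxy[mask] / (px @ py)[mask])))

def ib_loss(pxy, q_z_given_x, beta):
    """Information bottleneck objective I(Z;X) - beta * I(Z;Y)
    for a stochastic encoder q(z|x); all variables discrete."""
    px = pxy.sum(axis=1)              # marginal p(x)
    pzx = q_z_given_x * px[None, :]   # joint p(z, x), shape (Z, X)
    pzy = q_z_given_x @ pxy           # joint p(z, y), shape (Z, Y)
    return mi_from_joint(pzx) - beta * mi_from_joint(pzy)

# Toy case: x determines y, and the identity encoder z = x keeps all of I(X;Y),
# so with beta = 1 the two terms cancel exactly.
pxy = np.array([[0.5, 0.0],
                [0.0, 0.5]])
identity = np.eye(2)
print(ib_loss(pxy, identity, beta=1.0))
```

Sweeping `beta` trades compression of x (the first term) against preserving information about y (the second), which is the tension the snippet's loss expresses.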

  • Article
  • Open Access
9 Citations
5,892 Views
13 Pages

3 March 2017

The main aim of this contribution is to define the notions of Kullback-Leibler divergence and conditional mutual information in fuzzy probability spaces and to derive the basic properties of the suggested measures. In particular, chain rules for mutu...

  • Article
  • Open Access
129 Citations
16,298 Views
14 Pages

22 November 2017

Starting with a new formulation for the mutual information (MI) between a pair of events, this paper derives alternative upper bounds and extends those to the case of two discrete random variables. Normalized mutual information (NMI) measures are the...
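Normalized mutual information comes in several variants depending on the chosen normalizer. As a minimal plug-in sketch of one common choice, I(X;Y)/√(H(X)·H(Y)) — an assumption for illustration, not necessarily the bound derived in the paper above — with all function names invented here:

```python
import numpy as np
from collections import Counter

def entropy(labels):
    """Shannon entropy (nats) of a discrete label sequence."""
    counts = np.array(list(Counter(labels).values()), dtype=float)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log(p)))

def mutual_information(x, y):
    """Plug-in MI estimate via I(X;Y) = H(X) + H(Y) - H(X,Y)."""
    joint = list(zip(x, y))
    return entropy(x) + entropy(y) - entropy(joint)

def normalized_mi(x, y):
    """NMI with the geometric-mean normalizer; lies in [0, 1]."""
    hx, hy = entropy(x), entropy(y)
    if hx == 0 or hy == 0:
        return 0.0
    return mutual_information(x, y) / np.sqrt(hx * hy)

x = [0, 0, 1, 1, 2, 2]
print(normalized_mi(x, x))                  # identical variables give NMI ~ 1
print(normalized_mi(x, [0, 1, 0, 1, 0, 1]))  # independent pattern gives NMI ~ 0
```

Other normalizers (min, max, or arithmetic mean of the entropies) give different upper-bound behavior, which is exactly why papers like the one above compare NMI measures.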

  • Article
  • Open Access
30 Citations
11,053 Views
14 Pages

Tsallis Mutual Information for Document Classification

  • Màrius Vila,
  • Anton Bardera,
  • Miquel Feixas and
  • Mateu Sbert

14 September 2011

Mutual information is one of the most widely used measures for evaluating image similarity. In this paper, we investigate the application of three different Tsallis-based generalizations of mutual information to analyze the similarity between scanned docu...

  • Article
  • Open Access
17 Citations
4,608 Views
25 Pages

4 October 2019

Rényi-type generalizations of entropy, relative entropy and mutual information have found numerous applications throughout information theory and beyond. While there is consensus that the ways A. Rényi generalized entropy and relative e...

  • Article
  • Open Access
2 Citations
2,994 Views
18 Pages

A Measurement Model of Mutual Influence for Information Dissemination

  • Liang Zhang,
  • Yong Quan,
  • Bin Zhou,
  • Yan Jia and
  • Liqun Gao

30 June 2020

The recent development of the mobile Internet and the rise of social media have significantly enriched the way people access information. Accurate modeling of the probability of information propagation between users is essential for studying informat...

  • Article
  • Open Access
13 Citations
8,399 Views
24 Pages

Non–Parametric Estimation of Mutual Information through the Entropy of the Linkage

  • Maria Teresa Giraudo,
  • Laura Sacerdote and
  • Roberta Sirovich

26 November 2013

A new, non–parametric and binless estimator for the mutual information of a d–dimensional random vector is proposed. First of all, an equation that links the mutual information to the entropy of a suitable random vector with uniformly distributed com...

  • Article
  • Open Access
9 Citations
8,554 Views
16 Pages

6 September 2013

The correlation distance quantifies the statistical independence of two classical or quantum systems, via the distance from their joint state to the product of the marginal states. Tight lower bounds are given for the mutual information between pairs...

  • Article
  • Open Access
28 Citations
10,197 Views
11 Pages

Exact Test of Independence Using Mutual Information

  • Shawn D. Pethel and
  • Daniel W. Hahs

23 May 2014

Using a recently discovered method for producing random symbol sequences with prescribed transition counts, we present an exact null hypothesis significance test (NHST) for mutual information between two random variables, the null hypothesis being th...
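The exact test above relies on generating symbol sequences with prescribed transition counts. As a simpler, inexact stand-in that conveys the same idea, a standard permutation NHST using plug-in MI as the test statistic can be sketched as follows; this is not the authors' construction, and all names (`plugin_mi`, `permutation_test`) are invented for illustration:

```python
import numpy as np

def plugin_mi(x, y):
    """Plug-in mutual information (nats) for discrete samples."""
    x, y = np.asarray(x), np.asarray(y)
    mi = 0.0
    for xv in np.unique(x):
        for yv in np.unique(y):
            pxy = np.mean((x == xv) & (y == yv))
            if pxy > 0:
                px, py = np.mean(x == xv), np.mean(y == yv)
                mi += pxy * np.log(pxy / (px * py))
    return mi

def permutation_test(x, y, n_perm=999, seed=0):
    """P-value for H0: X and Y are independent, with MI as the statistic.
    Shuffling y destroys any dependence while preserving both marginals."""
    rng = np.random.default_rng(seed)
    observed = plugin_mi(x, y)
    y = np.asarray(y)
    exceed = sum(
        plugin_mi(x, rng.permutation(y)) >= observed for _ in range(n_perm)
    )
    return (1 + exceed) / (1 + n_perm)

rng = np.random.default_rng(1)
x = rng.integers(0, 2, 200)
print(permutation_test(x, x))                        # dependent: small p-value
print(permutation_test(x, rng.integers(0, 2, 200)))  # independent: typically large p
```

Unlike this Monte Carlo approximation, the paper's method enumerates the null distribution exactly, which matters for short sequences where permutation p-values are coarse.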

  • Article
  • Open Access
1 Citation
2,041 Views
8 Pages

21 July 2023

We study the time evolution of mutual information between mass distributions in spatially separated but causally connected regions in an expanding universe. The evolution of mutual information is primarily determined by the configuration entropy rate...

  • Article
  • Open Access
4 Citations
6,056 Views
15 Pages

8 May 2013

We introduce a new estimate of mutual information between a dataset and a target variable that can be maximised analytically and has broad applicability in the field of machine learning and statistical pattern recognition. This estimate has previousl...

  • Article
  • Open Access
1 Citation
2,616 Views
13 Pages

Testing Nonlinearity with Rényi and Tsallis Mutual Information with an Application in the EKC Hypothesis

  • Elif Tuna,
  • Atıf Evren,
  • Erhan Ustaoğlu,
  • Büşra Şahin and
  • Zehra Zeynep Şahinbaşoğlu

31 December 2022

The nature of dependence between random variables has always been the subject of many statistical problems for over a century. Yet today, there is a great deal of research on this topic, especially focusing on the analysis of nonlinearity. Shannon mu...

  • Article
  • Open Access
12 Citations
5,089 Views
16 Pages

11 December 2017

We explored the dynamics of two interacting information systems. We show that for the Markovian marginal systems, the driving force for information dynamics is determined by both the information landscape and information flux. While the information l...

  • Article
  • Open Access
4 Citations
4,283 Views
23 Pages

On the α-q-Mutual Information and the α-q-Capacities

  • Velimir M. Ilić and
  • Ivan B. Djordjević

1 June 2021

The measures of information transfer which correspond to non-additive entropies have intensively been studied in previous decades. The majority of the work includes the ones belonging to the Sharma–Mittal entropy class, such as the Rényi, the Tsallis...

  • Article
  • Open Access
9 Citations
5,080 Views
18 Pages

10 June 2017

The purpose of the paper is to introduce, using the known results concerning the entropy in product MV algebras, the concepts of mutual information and Kullback–Leibler divergence for the case of product MV algebras and examine algebraic properties o...

  • Article
  • Open Access
11 Citations
4,029 Views
15 Pages

Weighted Mutual Information for Aggregated Kernel Clustering

  • Nezamoddin N. Kachouie and
  • Meshal Shutaywi

18 March 2020

Background: A common task in machine learning is clustering data into different groups based on similarities. Clustering methods can be divided in two groups: linear and nonlinear. A commonly used linear clustering method is K-means. Its extension, k...

  • Article
  • Open Access
15 Citations
6,461 Views
21 Pages

30 September 2018

Autoregressive processes play a major role in speech processing (linear prediction), seismic signal processing, biological signal processing, and many other applications. We consider the quantity defined by Shannon in 1948, the entropy rate power, an...

  • Article
  • Open Access
10 Citations
7,242 Views
18 Pages

29 March 2016

This paper proposes a novel estimator of mutual information for discrete and continuous variables. The main feature of this estimator is that it is zero for a large sample size n if and only if the two variables are independent. The estimator can be...

  • Article
  • Open Access
9 Citations
4,049 Views
19 Pages

Mutual Information Boosted Precipitation Nowcasting from Radar Images

  • Yuan Cao,
  • Danchen Zhang,
  • Xin Zheng,
  • Hongming Shan and
  • Junping Zhang

17 March 2023

Precipitation nowcasting has long been a challenging problem in meteorology. While recent studies have introduced deep neural networks into this area and achieved promising results, these models still struggle with the rapid evolution of rainfall and...

  • Article
  • Open Access
30 Citations
10,366 Views
29 Pages

16 February 2017

Williams and Beer (2010) proposed a nonnegative mutual information decomposition, based on the construction of information gain lattices, which allows separating the information that a set of variables contains about another variable into components,...

  • Article
  • Open Access
34 Citations
11,605 Views
42 Pages

19 April 2011

Feature selection is an important step in building accurate classifiers and provides better understanding of the data sets. In this paper, we propose a feature subset selection method based on high-dimensional mutual information. We also propose to u...

  • Article
  • Open Access
2,229 Views
8 Pages

3 October 2023

Zebra finches are a model animal used in the study of audition. They are adept at recognizing zebra finch songs, and the neural pathway involved in song recognition is well studied. Here, this example is used to illustrate the estimation of mutual in...

  • Article
  • Open Access
15 Citations
4,580 Views
15 Pages

In recent times, there has been a swift advancement in the field of cryptocurrency. The advent of cryptocurrency has provided us with convenience and prosperity, but has also given rise to certain illicit and unlawful activities. Unlike classical cur...

  • Article
  • Open Access
14 Citations
3,729 Views
14 Pages

28 October 2018

Information is often described as a reduction of uncertainty associated with a restriction of possible choices. Despite appearing in Hartley’s foundational work on information theory, there is a surprising lack of a formal treatment of this int...

  • Article
  • Open Access
22 Citations
6,164 Views
18 Pages

Measuring Independence between Statistical Randomness Tests by Mutual Information

  • Jorge Augusto Karell-Albo,
  • Carlos Miguel Legón-Pérez,
  • Evaristo José Madarro-Capó,
  • Omar Rojas and
  • Guillermo Sosa-Gómez

4 July 2020

The analysis of independence between statistical randomness tests has had great attention in the literature recently. Dependency detection between statistical randomness tests allows one to discriminate statistical randomness tests that measure simil...

  • Article
  • Open Access
58 Citations
8,935 Views
15 Pages

Sensitivity Analysis for Urban Drainage Modeling Using Mutual Information

  • Chuanqi Li,
  • Wei Wang,
  • Jianzhi Xiong and
  • Pengyu Chen

3 November 2014

The intention of this paper is to evaluate the sensitivity of the Storm Water Management Model (SWMM) output to its input parameters. A global parameter sensitivity analysis is conducted in order to determine which parameters mostly affect the model...

  • Article
  • Open Access
17 Citations
5,343 Views
14 Pages

We write the mutual information between an input speech utterance and its reconstruction by a code-excited linear prediction (CELP) codec in terms of the mutual information between the input speech and the contributions due to the short-term predicto...

  • Article
  • Open Access
3 Citations
2,612 Views
15 Pages

Mutual Information in Molecular and Macromolecular Systems

  • Antonio Tripodo,
  • Francesco Puosi,
  • Marco Malvaldi and
  • Dino Leporini

3 September 2021

The relaxation properties of viscous liquids close to their glass transition (GT) have been widely characterised by the statistical tool of time correlation functions. However, the strong influence of ubiquitous non-linearities calls for new, alterna...

  • Communication
  • Open Access
14 Citations
6,172 Views
24 Pages

7 August 2023

In this paper, we provide geometric insights with visualization into the multivariate Gaussian distribution and its entropy and mutual information. In order to develop the multivariate Gaussian distribution with entropy and mutual information, severa...

  • Article
  • Open Access
3 Citations
2,595 Views
14 Pages

20 August 2024

Information theoretic quantities such as entropy, entropy rate, information gain, and relative entropy are often used to understand the performance of intelligent agents in learning applications. Mean squared error has not played a role in these anal...

  • Article
  • Open Access
494 Views
22 Pages

Uncovering Neural Learning Dynamics Through Latent Mutual Information

  • Arianna Issitt,
  • Alex Merino,
  • Lamine Deen,
  • Ryan T. White and
  • Mackenzie J. Meni

19 January 2026

We study how convolutional neural networks reorganize information during learning in natural image classification tasks by tracking mutual information (MI) between inputs, intermediate representations, and labels. Across VGG-16, ResNet-18, and ResNet...

  • Article
  • Open Access
17 Citations
9,826 Views
20 Pages

25 June 2019

Determining the strength of nonlinear, statistical dependencies between two variables is a crucial matter in many research fields. The established measure for quantifying such relations is the mutual information. However, estimating mutual informatio...

  • Article
  • Open Access
4 Citations
5,095 Views
21 Pages

4 March 2019

Although Shannon mutual information has been widely used, its effective calculation is often difficult for many practical problems, including those in neural population coding. Asymptotic formulas based on Fisher information sometimes provide accurat...

  • Article
  • Open Access
4 Citations
4,553 Views
20 Pages

Normalized-Mutual-Information-Based Mining Method for Cascading Patterns

  • Cunjin Xue,
  • Jingyi Liu,
  • Xiaohong Li and
  • Qing Dong

A cascading pattern is a sequential pattern characterized by an item following another item in order. Recent research has investigated a challenge of dealing with cascading patterns, namely, the exponential time dependence of database scanning with r...

  • Article
  • Open Access
6 Citations
9,595 Views
27 Pages

11 October 2010

Mutual information between a target variable and a feature subset is extensively used as a feature subset selection criterion. This work contributes to a more thorough understanding of the evolution of the mutual information as a function of the numb...

  • Article
  • Open Access
2 Citations
1,798 Views
17 Pages

3 August 2024

In rational decision-making processes, the information interaction among individual robots is a critical factor influencing system stability. We establish a game-theoretic model based on mutual information to address division of labor decision-making...

  • Article
  • Open Access
39 Citations
8,887 Views
17 Pages

2 July 2019

Feature interaction is a newly proposed type of feature relevance relationship, and the unintentional removal of interactive features can result in poor classification performance. However, traditional feature selection algorithms mainl...

  • Article
  • Open Access
7 Citations
3,846 Views
16 Pages

17 June 2022

Belavkin–Staszewski relative entropy can naturally characterize the effects of the possible noncommutativity of quantum states. In this paper, two new conditional entropy terms and four new mutual information terms are first defined by replacin...

  • Article
  • Open Access
2,297 Views
18 Pages

30 December 2024

Deep neural networks, despite their remarkable success in computer vision tasks, often face deployment challenges due to high computational demands and memory usage. Addressing this, we introduce a probabilistic framework for automated model compress...

  • Article
  • Open Access
1 Citation
2,858 Views
20 Pages

Modeling Categorical Variables by Mutual Information Decomposition

  • Jiun-Wei Liou,
  • Michelle Liou and
  • Philip E. Cheng

4 May 2023

This paper proposed the use of mutual information (MI) decomposition as a novel approach to identifying indispensable variables and their interactions for contingency table analysis. The MI analysis identified subsets of associative variables based o...

  • Article
  • Open Access
21 Citations
12,159 Views
21 Pages

Classification Active Learning Based on Mutual Information

  • Jamshid Sourati,
  • Murat Akcakaya,
  • Jennifer G. Dy,
  • Todd K. Leen and
  • Deniz Erdogmus

5 February 2016

Selecting a subset of samples to label from a large pool of unlabeled data points, such that a sufficiently accurate classifier is obtained using a reasonably small training set is a challenging, yet critical problem. Challenging, since solving this...

  • Review
  • Open Access
39 Citations
12,693 Views
33 Pages

27 December 2012

Mutual information (MI) is useful for detecting statistical independence between random variables, and it has been successfully applied to solving various machine learning problems. Recently, an alternative to MI called squared-loss MI (SMI) was intr...

  • Article
  • Open Access
2 Citations
1,696 Views
18 Pages

21 April 2025

The choice of constellations largely affects the performance of both wireless and optical communications. To address increasing capacity requirements, constellation shaping, especially for high-order modulations, is imperative in high-speed coherent...

  • Article
  • Open Access
7 Citations
3,504 Views
25 Pages

The Role of Mutual Information Estimator Choice in Feature Selection: An Empirical Study on mRMR

  • Nikolaos Papaioannou,
  • Georgios Myllis,
  • Alkiviadis Tsimpiris and
  • Vasiliki Vrana

25 August 2025

Maximum Relevance Minimum Redundancy (mRMR) is a widely used feature selection method that is applied in a wide range of applications in various fields. mRMR adds to the optimal subset the features that have high relevance to the target variable whil...
