
360 Results Found

  • Article
  • Open Access
50 Citations
17,033 Views
18 Pages

24 January 2017

We start with a clear distinction between Shannon’s Measure of Information (SMI) and the Thermodynamic Entropy. The first is defined on any probability distribution and is therefore a very general concept; Entropy, on the other hand, is defined on...

  • Comment
  • Open Access
12 Citations
4,211 Views
3 Pages

6 March 2019

The goal of this comment note is to express our considerations about the recent paper by A. Ben Naim (Entropy 2017, 19, 48). We strongly support the distinction between the Shannon measure of information and the thermodynamic entropy suggested in...

  • Article
  • Open Access
95 Citations
17,595 Views
13 Pages

A Characterization of Entropy in Terms of Information Loss

  • John C. Baez,
  • Tobias Fritz and
  • Tom Leinster

24 November 2011

There are numerous characterizations of Shannon entropy and Tsallis entropy as measures of information obeying certain properties. Using work by Faddeev and Furuichi, we derive a very simple characterization. Instead of focusing on the entropy of a p...

  • Article
  • Open Access
779 Views
19 Pages

Information-Theoretic Analysis of Selected Water Force Fields: From Molecular Clusters to Bulk Properties

  • Rodolfo O. Esquivel,
  • Hazel Vázquez-Hernández and
  • Alexander Pérez de La Luz

15 October 2025

We present a comprehensive information-theoretic evaluation of three widely used rigid water models (TIP3P, SPC, and SPC/ε) through systematic analysis of water clusters ranging from single molecules to 11-molecule aggregates. Five fundamenta...

  • Article
  • Open Access
1 Citation
4,581 Views
19 Pages

31 March 2024

Xylella fastidiosa has recently been detected for the first time in southern Italy. It is a very dangerous phytobacterium capable of inducing severe diseases in many plants; in particular, the disease induced in olive trees is called olive qui...

  • Review
  • Open Access
9 Citations
2,748 Views
31 Pages

14 May 2021

The spreading of the stationary states of the multidimensional single-particle systems with a central potential is quantified by means of Heisenberg-like measures (radial and logarithmic expectation values) and entropy-like quantities (Fisher, Shanno...

  • Article
  • Open Access
5 Citations
7,996 Views
18 Pages

In this study, we examine whether the forecast errors obtained by the ANN models affect the outbreak of financial crises. Additionally, we investigate how much asymmetric information and forecast errors are reflected in the output v...

  • Review
  • Open Access
15 Citations
11,690 Views
16 Pages

16 September 2022

Physical roots, exemplifications and consequences of periodic and aperiodic ordering (represented by Fibonacci series) in biological systems are discussed. The physical and biological roots and role of symmetry and asymmetry appearing in biological p...

  • Article
  • Open Access
30 Citations
4,499 Views
17 Pages

15 May 2020

We study the flow of a substance in a network channel, which consists of nodes of the network and edges that connect these nodes, forming pathways for the motion of the substance. The channel can have an arbitrary number of arms, and each arm can contain an arbitrary number o...

  • Feature Paper
  • Article
  • Open Access
6 Citations
2,927 Views
12 Pages

Shannon (Information) Measures of Symmetry for 1D and 2D Shapes and Patterns

  • Edward Bormashenko,
  • Irina Legchenkova,
  • Mark Frenkel,
  • Nir Shvalb and
  • Shraga Shoval

21 January 2022

In this paper, informational (Shannon) measures of symmetry are introduced and analyzed for patterns built of 1D and 2D shapes. The informational measure of symmetry Hsym(G) characterizes the averaged uncertainty in the presence of symmetry elements...

  • Feature Paper
  • Article
  • Open Access
1 Citation
1,553 Views
69 Pages

3 June 2025

Stationary quantum information sources emit sequences of correlated qudits—that is, structured quantum stochastic processes. If an observer performs identical measurements on a qudit sequence, the outcomes are a realization of a classical stoch...

  • Article
  • Open Access
12 Citations
9,110 Views
19 Pages

9 May 2014

The theory of Shannon entropy was applied to the Choi-Williams time-frequency distribution (CWD) of time series in order to extract entropy information in both time and frequency domains. In this way, four novel indexes were defined: (1) partial inst...

  • Article
  • Open Access
15 Citations
10,583 Views
18 Pages

Entropy and Time

  • Arieh Ben-Naim

10 April 2020

The idea that entropy is associated with the “arrow of time” has its roots in Clausius’s statement on the Second Law: “Entropy of the Universe always increases.” However, the explicit association of the entropy with time...

  • Proceeding Paper
  • Open Access
619 Views
11 Pages

The Value of Information in Economic Contexts

  • Stefan Behringer and
  • Roman V. Belavkin

This paper explores the application of the Value of Information, (VoI), based on the Claude Shannon/Ruslan Stratonovich framework within economic contexts. Unlike previous studies that examine circular settings and strategic interactions, we focus on...

  • Review
  • Open Access
4 Citations
3,459 Views
19 Pages

Measurements of Entropic Uncertainty Relations in Neutron Optics

  • Bülent Demirel,
  • Stephan Sponar and
  • Yuji Hasegawa

6 February 2020

The emergence of the uncertainty principle has celebrated its 90th anniversary recently. For this occasion, the latest experimental results of uncertainty relations quantified in terms of Shannon entropies are presented, concentrating only on outcome...

  • Article
  • Open Access
12 Citations
5,929 Views
8 Pages

Symmetry and Shannon Measure of Ordering: Paradoxes of Voronoi Tessellation

  • Edward Bormashenko,
  • Irina Legchenkova and
  • Mark Frenkel

30 April 2019

The Voronoi entropy for random patterns and patterns demonstrating various elements of symmetry was calculated. The symmetric patterns were characterized by the values of the Voronoi entropy being very close to those inherent to random ones. This con...
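The Voronoi entropy used in this line of work is conventionally computed as S = −Σ Pₙ ln Pₙ, where Pₙ is the fraction of Voronoi cells with n edges. A minimal Python sketch of that standard formula (the function name is ours, not from the paper):

```python
import math
from collections import Counter

def voronoi_entropy(edge_counts):
    """Voronoi entropy S = -sum P_n * ln(P_n), where P_n is the fraction
    of Voronoi cells with n edges (natural log, as is conventional)."""
    total = len(edge_counts)
    s = sum((c / total) * math.log(c / total)
            for c in Counter(edge_counts).values())
    return -s + 0.0  # "+ 0.0" normalizes IEEE negative zero

# A perfectly ordered mosaic (all hexagons) has zero Voronoi entropy,
# while mixed polygon populations give S > 0.
print(voronoi_entropy([6] * 100))
print(voronoi_entropy([4, 5, 5, 6, 6, 6, 7]))
```

Only the multiset of cell edge counts enters the measure, which is why symmetric and random patterns can yield nearly equal values, as the abstract notes.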

  • Article
  • Open Access
34 Citations
8,218 Views
30 Pages

Prediction of Multi-Target Networks of Neuroprotective Compounds with Entropy Indices and Synthesis, Assay, and Theoretical Study of New Asymmetric 1,2-Rasagiline Carbamates

  • Francisco J. Romero Durán,
  • Nerea Alonso,
  • Olga Caamaño,
  • Xerardo García-Mera,
  • Matilde Yañez,
  • Francisco J. Prado-Prado and
  • Humberto González-Díaz

24 September 2014

In a multi-target complex network, the links (Lij) represent the interactions between the drug (di) and the target (tj), characterized by different experimental measures (Ki, Km, IC50, etc.) obtained in pharmacological assays under diverse boundary c...

  • Article
  • Open Access
30 Citations
11,940 Views
28 Pages

14 April 2021

We generalize the Jensen-Shannon divergence and the Jensen-Shannon diversity index by considering a variational definition with respect to a generic mean, thereby extending the notion of Sibson’s information radius. The variational definition applies...
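The classical Jensen–Shannon divergence that this paper generalizes uses the arithmetic mean of the two distributions; a short sketch of that textbook definition, not of the paper's variational one (helper names are ours):

```python
import math

def entropy_bits(p):
    """Shannon entropy in bits, skipping zero-probability terms."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def jensen_shannon_divergence(p, q):
    """Classical JSD with the arithmetic mean m = (p + q) / 2:
    JSD(p, q) = H(m) - (H(p) + H(q)) / 2, bounded in [0, 1] in base 2."""
    m = [(a + b) / 2 for a, b in zip(p, q)]
    return entropy_bits(m) - (entropy_bits(p) + entropy_bits(q)) / 2

# Identical distributions diverge by 0; disjoint ones by the maximum, 1.
print(jensen_shannon_divergence([0.5, 0.5], [0.5, 0.5]))  # 0.0
print(jensen_shannon_divergence([1.0, 0.0], [0.0, 1.0]))  # 1.0
```

Replacing the arithmetic mean m with another generic mean is exactly the degree of freedom the variational definition in the paper exploits.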

  • Article
  • Open Access
6 Citations
5,598 Views
59 Pages

1 March 2020

Fano’s inequality is one of the most elementary, ubiquitous, and important tools in information theory. Using majorization theory, Fano’s inequality is generalized to a broad class of information measures, which contains those of Shannon...
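In its standard Shannon form, Fano's inequality bounds the conditional entropy as H(X|Y) ≤ h(Pe) + Pe·log₂(m−1) for an m-ary alphabet with error probability Pe. A small numeric check under an assumed symmetric-channel model, where the bound is tight (all names here are ours):

```python
import math

def h2(p):
    """Binary entropy h(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def fano_bound(p_err, m):
    """Fano's upper bound on H(X|Y): h(Pe) + Pe * log2(m - 1)."""
    return h2(p_err) + p_err * math.log2(m - 1)

# Hypothetical symmetric channel on m symbols: given Y, the true X
# equals Y with probability 1 - e and is uniform over the rest.
m, e = 4, 0.1
cond = [1 - e] + [e / (m - 1)] * (m - 1)
h_x_given_y = -sum(p * math.log2(p) for p in cond if p > 0)

# For this channel Fano's inequality holds with equality.
assert h_x_given_y <= fano_bound(e, m) + 1e-12
```

The paper's contribution is to generalize this Shannon-entropy bound to a broader family of information measures via majorization.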

  • Article
  • Open Access
4 Citations
3,305 Views
10 Pages

18 March 2021

In this work, we first consider the discrete version of Fisher information measure and then propose Jensen–Fisher information, to develop some associated results. Next, we consider Fisher information and Bayes–Fisher information measures for mixing p...

  • Article
  • Open Access
6 Citations
2,752 Views
23 Pages

On Entropy of Some Fractal Structures

  • Haleemah Ghazwani,
  • Muhammad Faisal Nadeem,
  • Faiza Ishfaq and
  • Ali N. A. Koam

Shannon entropy, also known as information entropy or simply entropy, measures the uncertainty or randomness of a probability distribution. Entropy is measured in bits, quantifying the average amount of information required to identify an event from the distr...
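The quantity this abstract describes is the standard Shannon entropy in bits; a minimal sketch (the function name is ours):

```python
import math

def shannon_entropy_bits(probs):
    """Shannon entropy H = -sum p * log2(p), in bits; zero-probability
    outcomes contribute nothing by the convention 0 * log2(0) = 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair four-sided die: identifying one of four equally likely
# outcomes takes exactly 2 bits on average.
print(shannon_entropy_bits([0.25, 0.25, 0.25, 0.25]))  # 2.0
```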

  • Article
  • Open Access
4 Citations
4,122 Views
19 Pages

6 November 2020

A definition of three-variable cumulative residual entropy is introduced, and then used to obtain expressions for higher order or triple-wise correlation measures, that are based on cumulative residual densities. These information measures are calcul...

  • Review
  • Open Access
4 Citations
3,362 Views
45 Pages

24 April 2025

Does semantic communication require a semantic information theory parallel to Shannon’s information theory, or can Shannon’s work be generalized for semantic communication? This paper advocates for the latter and introduces a semantic gen...

  • Review
  • Open Access
85 Citations
18,096 Views
34 Pages

15 July 2010

This article highlights advantages of entropy-based genetic diversity measures, at levels from gene expression to landscapes. Shannon’s entropy-based diversity is the standard for ecological communities. The exponentials of Shannon’s and the related...

  • Feature Paper
  • Article
  • Open Access
3 Citations
2,274 Views
16 Pages

Fisher and Shannon Functionals for Hyperbolic Diffusion

  • Manuel O. Cáceres,
  • Marco Nizama and
  • Flavia Pennini

6 December 2023

The complexity measure for the distribution in space-time of a finite-velocity diffusion process is calculated. Numerical results are presented for the calculation of Fisher’s information, Shannon’s entropy, and the Cramér–Ra...

  • Article
  • Open Access
49 Citations
6,969 Views
25 Pages

2 January 2018

The quality of an image affects its utility and image quality assessment has been a hot research topic for many years. One widely used measure for image quality assessment is Shannon entropy, which has a well-established information-theoretic basis....

  • Review
  • Open Access
52 Citations
9,347 Views
15 Pages

25 July 2018

Information-theoretic-based measures have been useful in quantifying network complexity. Here we briefly survey and contrast (algorithmic) information-theoretic methods which have been used to characterize graphs and networks. We illustrate the stren...

  • Article
  • Open Access
6 Citations
3,766 Views
27 Pages

26 April 2019

Different probabilities of events attract different attention in many scenarios such as anomaly detection and security systems. To characterize the events’ importance from a probabilistic perspective, the message importance measure (MIM) is pro...

  • Article
  • Open Access
12 Citations
10,955 Views
18 Pages

Measuring the Complexity of Continuous Distributions

  • Guillermo Santamaría-Bonfil,
  • Nelson Fernández and
  • Carlos Gershenson

26 February 2016

We extend previously proposed measures of complexity, emergence, and self-organization to continuous distributions using differential entropy. Given that the measures were based on Shannon’s information, the novel continuous complexity measures descr...
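The differential entropy underlying these continuous measures has a closed form for the Gaussian, h = ½ log₂(2πeσ²) bits. A quick sanity check of that formula against direct numerical integration (helper names are ours, not from the paper):

```python
import math

def gaussian_differential_entropy(sigma):
    """Closed form for a normal density: h = 0.5 * log2(2*pi*e*sigma^2) bits."""
    return 0.5 * math.log2(2 * math.pi * math.e * sigma ** 2)

def numeric_differential_entropy(pdf, lo, hi, n=100000):
    """Midpoint-rule estimate of -integral pdf(x) * log2(pdf(x)) dx."""
    dx = (hi - lo) / n
    total = 0.0
    for i in range(n):
        p = pdf(lo + (i + 0.5) * dx)
        if p > 0:
            total -= p * math.log2(p) * dx
    return total

sigma = 1.0
pdf = lambda x: math.exp(-x * x / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))
closed = gaussian_differential_entropy(sigma)
approx = numeric_differential_entropy(pdf, -10.0, 10.0)
assert abs(closed - approx) < 1e-3  # the two estimates agree
```

Note that, unlike discrete Shannon entropy, differential entropy can be negative (e.g. for small σ), one of the subtleties continuous complexity measures must handle.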

  • Review
  • Open Access
8 Citations
8,187 Views
11 Pages

Extreme Fisher Information, Non-Equilibrium Thermodynamics and Reciprocity Relations

  • Silvana Flego,
  • Felipe Olivares,
  • Angelo Plastino and
  • Montserrat Casas

14 January 2011

In employing MaxEnt, a crucial role is assigned to the reciprocity relations that relate the quantifier to be extremized (Shannon’s entropy S), the Lagrange multipliers that arise during the variational process, and the expectation values that consti...

  • Article
  • Open Access
37 Citations
2,673 Views
18 Pages

Parameterization of the Stochastic Model for Evaluating Variable Small Data in the Shannon Entropy Basis

  • Oleh Bisikalo,
  • Vyacheslav Kharchenko,
  • Viacheslav Kovtun,
  • Iurii Krak and
  • Sergii Pavlov

17 January 2023

The article analytically summarizes the idea of applying Shannon’s principle of entropy maximization to sets that represent the results of observations of the “input” and “output” entities of the stochastic model for eva...

  • Review
  • Open Access
32 Citations
8,570 Views
9 Pages

15 August 2019

In ecology and evolution, entropic methods are now used widely and increasingly frequently. Their use can be traced back to Ramon Margalef’s first attempt 70 years ago to use log-series to quantify ecological diversity, including searching for...

  • Article
  • Open Access
3 Citations
2,862 Views
21 Pages

12 February 2022

The Shannon entropy in an LS-coupled configuration space has been calculated through a transformation from that in a jj-coupled configuration space for a Ni-like isoelectronic sequence. The sudden change of Shannon entropy, information exchange, eige...

  • Article
  • Open Access
4 Citations
3,311 Views
20 Pages

A Multi-Feature Framework for Quantifying Information Content of Optical Remote Sensing Imagery

  • Luo Silong,
  • Zhou Xiaoguang,
  • Hou Dongyang,
  • Nawaz Ali,
  • Kang Qiankun and
  • Wang Sijia

20 August 2022

Quantifying the information content of remote sensing images is considered to be a fundamental task in quantitative remote sensing. Traditionally, the grayscale entropy designed by Shannon’s information theory cannot capture the spatial structu...

  • Article
  • Open Access
4 Citations
6,247 Views
15 Pages

19 April 2022

The informational energy of Onicescu is a positive quantity that measures the amount of uncertainty of a random variable. However, contrary to Shannon’s entropy, the informational energy is strictly convex and increases when randomness decrease...
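The informational energy of Onicescu mentioned here is conventionally defined as E = Σ pᵢ². A short sketch illustrating the behavior the abstract describes, namely that it increases as randomness decreases (the function name is ours):

```python
def informational_energy(probs):
    """Onicescu informational energy E = sum p_i**2: minimal (1/n) for
    the uniform distribution, maximal (1) when one outcome is certain,
    so it grows as randomness decreases."""
    return sum(p * p for p in probs)

print(informational_energy([0.25] * 4))       # 0.25 (uniform over 4 outcomes)
print(informational_energy([1.0, 0.0, 0.0]))  # 1.0 (deterministic)
```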

  • Feature Paper
  • Article
  • Open Access
2 Citations
1,826 Views
20 Pages

29 June 2024

The Information Causality principle was proposed to re-derive the Tsirelson bound, an upper limit on the strength of quantum correlations, and has been suggested as a candidate law of nature. The principle states that the Shannon information about Al...

  • Article
  • Open Access
7 Citations
3,911 Views
32 Pages

16 January 2020

Information theory provides a mathematical foundation to measure uncertainty in belief. Belief is represented by a probability distribution that captures our understanding of an outcome’s plausibility. Information measures based on Shannon’s...

  • Article
  • Open Access
40 Citations
13,922 Views
25 Pages

14 December 2004

Entropy has been the main tool in the analysis of the concept of information since information theory was conceived in the work of Shannon more than fifty years ago. There were some attempts to find a more general measure of information, but their outc...

  • Article
  • Open Access
1 Citation
2,537 Views
13 Pages

Testing Nonlinearity with Rényi and Tsallis Mutual Information with an Application in the EKC Hypothesis

  • Elif Tuna,
  • Atıf Evren,
  • Erhan Ustaoğlu,
  • Büşra Şahin and
  • Zehra Zeynep Şahinbaşoğlu

31 December 2022

The nature of dependence between random variables has always been the subject of many statistical problems for over a century. Yet today, there is a great deal of research on this topic, especially focusing on the analysis of nonlinearity. Shannon mu...
