
553 Results Found

  • Review
  • Open Access
3 Citations
8,532 Views
32 Pages

8 July 2017

The present age, which can be called the Information Age, has a core technology constituted by bits transported by photons. Both concepts, bit and photon, originated in the past century: the concept of photon was introduced by Planck in 1900 when he...

  • Article
  • Open Access
50 Citations
17,033 Views
18 Pages

24 January 2017

We start with a clear distinction between Shannon’s Measure of Information (SMI) and the Thermodynamic Entropy. The first is defined on any probability distribution; and therefore it is a very general concept. On the other hand Entropy is defined on...

  • Article
  • Open Access
6 Citations
3,065 Views
7 Pages

11 August 2020

The influence of shielding on the Shannon information entropy for atomic states in strong coupled plasma is investigated using the perturbation method and the Ritz variational method. The analytic expressions for the Shannon information entropies of...

  • Comment
  • Open Access
12 Citations
4,211 Views
3 Pages

6 March 2019

The goal of this comment note is to express our considerations about the recent paper by A. Ben Naim (Entropy 2017, 19, 48). We strongly support the distinction between the Shannon measure of information and the thermodynamic entropy, suggested in...

  • Article
  • Open Access
49 Citations
6,969 Views
25 Pages

2 January 2018

The quality of an image affects its utility and image quality assessment has been a hot research topic for many years. One widely used measure for image quality assessment is Shannon entropy, which has a well-established information-theoretic basis....

  • Article
  • Open Access
29 Citations
4,747 Views
15 Pages

18 July 2019

Knowledge of the electronic structures of atomic and molecular systems deepens our understanding of the desired system. In particular, several information-theoretic quantities, such as Shannon entropy, have been applied to quantify the extent of elec...

  • Article
  • Open Access
2 Citations
692 Views
10 Pages

4 October 2025

The degree of informational redundancy is often examined in genetic studies but not yet detailed for taxa conceived as minimally monophyletic groups (microgenus). Evolutionary processes in microgenera were reviewed, detailing critical sets of traits,...

  • Article
  • Open Access
10 Citations
5,382 Views
11 Pages

21 September 2015

In the present work, we report an investigation on quantum entanglement in the doubly excited 2s2 1Se resonance state of the positronium negative ion by using highly correlated Hylleraas type wave functions, determined by calculation of the density o...

  • Article
  • Open Access
28 Citations
5,873 Views
15 Pages

24 March 2017

We provide benchmark values for Shannon information entropies in position space for the ground state and 1s2s 1Se excited state of helium confined with finite confinement potentials by employing the highly correlated Hylleraas-type wave functions. Fo...

  • Article
  • Open Access
12 Citations
9,110 Views
19 Pages

9 May 2014

The theory of Shannon entropy was applied to the Choi-Williams time-frequency distribution (CWD) of time series in order to extract entropy information in both time and frequency domains. In this way, four novel indexes were defined: (1) partial inst...

  • Review
  • Open Access
4 Citations
3,411 Views
26 Pages

Information Theory Meets Quantum Chemistry: A Review and Perspective

  • Yilin Zhao,
  • Dongbo Zhao,
  • Chunying Rong,
  • Shubin Liu and
  • Paul W. Ayers

16 June 2025

In this survey, we begin with a concise introduction to information theory within Shannon’s framework, focusing on the key concept of Shannon entropy and its related quantities: relative entropy, joint entropy, conditional entropy, and mutual i...

  • Article
  • Open Access
4 Citations
6,247 Views
15 Pages

19 April 2022

The informational energy of Onicescu is a positive quantity that measures the amount of uncertainty of a random variable. However, contrary to Shannon’s entropy, the informational energy is strictly convex and increases when randomness decrease...

  • Article
  • Open Access
95 Citations
17,595 Views
13 Pages

A Characterization of Entropy in Terms of Information Loss

  • John C. Baez,
  • Tobias Fritz and
  • Tom Leinster

24 November 2011

There are numerous characterizations of Shannon entropy and Tsallis entropy as measures of information obeying certain properties. Using work by Faddeev and Furuichi, we derive a very simple characterization. Instead of focusing on the entropy of a p...

  • Article
  • Open Access
24 Citations
7,017 Views
14 Pages

31 March 2015

We study the correlation of the ground state of an N-particle Moshinsky model by computing the Shannon entropy in both position and momentum spaces. We have derived the Shannon entropy and mutual information with analytical forms of such an N-particl...

  • Article
  • Open Access
6 Citations
1,875 Views
19 Pages

26 September 2022

In this paper we recall, extend and compute some information measures for the concomitants of the generalized order statistics (GOS) from the Farlie–Gumbel–Morgenstern (FGM) family. We focus on two types of information measures: some rela...

  • Article
  • Open Access
4 Citations
3,311 Views
20 Pages

A Multi-Feature Framework for Quantifying Information Content of Optical Remote Sensing Imagery

  • Luo Silong,
  • Zhou Xiaoguang,
  • Hou Dongyang,
  • Nawaz Ali,
  • Kang Qiankun and
  • Wang Sijia

20 August 2022

Quantifying the information content of remote sensing images is considered to be a fundamental task in quantitative remote sensing. Traditionally, the grayscale entropy designed by Shannon’s information theory cannot capture the spatial structu...

  • Feature Paper
  • Article
  • Open Access
3 Citations
2,274 Views
16 Pages

Fisher and Shannon Functionals for Hyperbolic Diffusion

  • Manuel O. Cáceres,
  • Marco Nizama and
  • Flavia Pennini

6 December 2023

The complexity measure for the distribution in space-time of a finite-velocity diffusion process is calculated. Numerical results are presented for the calculation of Fisher’s information, Shannon’s entropy, and the Cramér–Ra...

  • Article
  • Open Access
12 Citations
8,289 Views
25 Pages

15 August 2019

In conventional textbook thermodynamics, entropy is a quantity that may be calculated by different methods, for example experimentally from heat capacities (following Clausius) or statistically from numbers of microscopic quantum states (following Bo...

  • Article
  • Open Access
2 Citations
2,205 Views
18 Pages

Entropy and Stability in Blockchain Consensus Dynamics

  • Aristidis G. Anagnostakis and
  • Euripidis Glavas

13 February 2025

Every Blockchain architecture relies upon two major pillars: (a) the hash-based, block-binding mechanism and (b) the consensus-achievement mechanism. While the entropic behavior of (a) has been extensively studied in literature over the past decades,...

  • Article
  • Open Access
4 Citations
3,305 Views
10 Pages

18 March 2021

In this work, we first consider the discrete version of Fisher information measure and then propose Jensen–Fisher information, to develop some associated results. Next, we consider Fisher information and Bayes–Fisher information measures for mixing p...

  • Article
  • Open Access
3,212 Views
11 Pages

13 May 2020

Novel measures of symbol dominance (dC1 and dC2), symbol diversity (DC1 = N (1 − dC1) and DC2 = N (1 − dC2)), and information entropy (HC1 = log2 DC1 and HC2 = log2 DC2) are derived from Lorenz-consistent statistics that I had previously...

  • Article
  • Open Access
7 Citations
3,069 Views
23 Pages

30 June 2023

Symbolic encoding of information is the foundation of Shannon’s mathematical theory of communication. The concept of the informational efficiency of capital markets is closely related to the issue of information processing by equity market part...

  • Article
  • Open Access
1 Citation
4,939 Views
20 Pages

Information Theoretical Measures for Achieving Robust Learning Machines

  • Pablo Zegers,
  • B. Roy Frieden,
  • Carlos Alarcón and
  • Alexis Fuentes

12 August 2016

Information theoretical measures are used to design, from first principles, an objective function that can drive a learning machine process to a solution that is robust to perturbations in parameters. Full analytic derivations are given and tested wi...

  • Article
  • Open Access
43 Citations
8,934 Views
14 Pages

Determining the Entropic Index q of Tsallis Entropy in Images through Redundancy

  • Abdiel Ramírez-Reyes,
  • Alejandro Raúl Hernández-Montoya,
  • Gerardo Herrera-Corral and
  • Ismael Domínguez-Jiménez

15 August 2016

The Boltzmann–Gibbs and Tsallis entropies are essential concepts in statistical physics, which have found multiple applications in many engineering and science areas. In particular, we focus our interest on their applications to image processing thro...

  • Article
  • Open Access
25 Citations
5,339 Views
12 Pages

On the Use of Entropy to Improve Model Selection Criteria

  • Andrea Murari,
  • Emmanuele Peluso,
  • Francesco Cianfrani,
  • Pasquale Gaudio and
  • Michele Lungaroni

12 April 2019

The most widely used forms of model selection criteria, the Bayesian Information Criterion (BIC) and the Akaike Information Criterion (AIC), are expressed in terms of synthetic indicators of the residual distribution: the variance and the mean-square...

  • Article
  • Open Access
635 Views
27 Pages

Information-Theoretic Medical Image Encryption via LLE-Verified Chaotic Keystreams and DNA Diffusion

  • Ibrahim Al-dayel,
  • Muhammad Faisal Nadeem,
  • Yasir Bashir and
  • Ayesha Shabbir

12 November 2025

We propose an information-theoretic encryption scheme consisting of a four-dimensional chaotic map driver in combination with a prediction model using an LSTM neural net to generate a keystream, which was limited only after passing a test based on th...

  • Article
  • Open Access
2,455 Views
29 Pages

26 June 2025

Friston proposed the Minimum Free Energy Principle (FEP) based on the Variational Bayesian (VB) method. This principle emphasizes that the brain and behavior coordinate with the environment, promoting self-organization. However, it has a theoretical...

  • Article
  • Open Access
20 Citations
2,694 Views
11 Pages

Quantum Information of the Aharanov–Bohm Ring with Yukawa Interaction in the Presence of Disclination

  • Collins Okon Edet,
  • Francisco Cleiton E. Lima,
  • Carlos Alberto S. Almeida,
  • Norshamsuri Ali and
  • Muhammad Asjad

31 July 2022

We investigate quantum information by a theoretical measurement approach of an Aharanov–Bohm (AB) ring with Yukawa interaction in curved space with disclination. We obtained the so-called Shannon entropy through the eigenfunctions of the system...

  • Article
  • Open Access
15 Citations
9,448 Views
22 Pages

Coarse Graining Shannon and von Neumann Entropies

  • Ana Alonso-Serrano and
  • Matt Visser

3 May 2017

The nature of coarse graining is intuitively “obvious”, but it is rather difficult to find explicit and calculable models of the coarse graining process (and the resulting entropy flow) discussed in the literature. What we would like to have at hand...

  • Article
  • Open Access
4 Citations
2,348 Views
23 Pages

17 January 2023

In this paper, we study the connections between generalized mean operators and entropies, where the mean value operators are related to the strictly monotone logical operators of fuzzy theory. Here, we propose a new entropy measure based on the famil...

  • Proceeding Paper
  • Open Access
1 Citation
1,833 Views
10 Pages

20 November 2017

The information content of a polymeric macromolecule can be calculated in bits by multiplying the number of building blocks spanning the entire length of the macromolecule by the Shannon entropy of each building block, which could be determi...

  • Article
  • Open Access
4 Citations
4,122 Views
19 Pages

6 November 2020

A definition of three-variable cumulative residual entropy is introduced, and then used to obtain expressions for higher order or triple-wise correlation measures, that are based on cumulative residual densities. These information measures are calcul...

  • Review
  • Open Access
28 Citations
14,088 Views
18 Pages

10 November 2009

In this review we integrate results of long term experimental study on ant “language” and intelligence which were fully based on fundamental ideas of Information Theory, such as the Shannon entropy, the Kolmogorov complexity, and the Shannon’s equati...

  • Article
  • Open Access
778 Views
19 Pages

Information-Theoretic Analysis of Selected Water Force Fields: From Molecular Clusters to Bulk Properties

  • Rodolfo O. Esquivel,
  • Hazel Vázquez-Hernández and
  • Alexander Pérez de La Luz

15 October 2025

We present a comprehensive information-theoretic evaluation of three widely used rigid water models (TIP3P, SPC, and SPC/ε) through systematic analysis of water clusters ranging from single molecules to 11-molecule aggregates. Five fundamenta...

  • Article
  • Open Access
1 Citation
3,691 Views
18 Pages

13 September 2020

In this paper, we study the information lost when a real-valued statistic is used to reduce or summarize sample data from a discrete random variable with a one-dimensional parameter. We compare the probability that a random sample gives a particular...

  • Article
  • Open Access
21 Citations
10,975 Views
11 Pages

1 February 2001

Certain aspects of the history, derivation, and physical application of the information-theoretic entropy concept are discussed. Pre-dating Shannon, the concept is traced back to Pauli. A derivation from first principles is given, without use of appr...

  • Review
  • Open Access
32 Citations
8,570 Views
9 Pages

15 August 2019

In ecology and evolution, entropic methods are now used widely and increasingly frequently. Their use can be traced back to Ramon Margalef’s first attempt 70 years ago to use log-series to quantify ecological diversity, including searching for...

  • Review
  • Open Access
9 Citations
2,748 Views
31 Pages

14 May 2021

The spreading of the stationary states of the multidimensional single-particle systems with a central potential is quantified by means of Heisenberg-like measures (radial and logarithmic expectation values) and entropy-like quantities (Fisher, Shanno...

  • Article
  • Open Access
19 Citations
2,528 Views
17 Pages

12 March 2021

In this paper, we study the concomitants of dual generalized order statistics (and consequently generalized order statistics) when the parameters γ1,…,γn are assumed to be pairwise different from Huang–Kotz Farlie–Gumbel–Morgenstern bivariate distrib...

  • Article
  • Open Access
1,556 Views
27 Pages

6 November 2025

This paper presents an integrated approach to social network analysis that combines graph theory, social network analysis (SNA), and Shannon’s information theory, applied to a real-world Twitter network built around the political hashtag Zandbe...

  • Article
  • Open Access
33 Citations
3,714 Views
25 Pages

2 March 2024

The main objective of this study is to assess the prediction and success rate based on bivariate frequency ratio (FR), weight of evidence (WoE), Shannon entropy (SE), and information value (IV) models for landslide susceptibility in the sedimentary t...

  • Article
  • Open Access
3 Citations
4,537 Views
14 Pages

14 June 2019

We propose a framework to convert the protein intrinsic disorder content to structural entropy (H) using Shannon’s information theory (IT). The structural capacity (C), which is the sum of H and structural information (I), is equal to the amino acid...

  • Article
  • Open Access
35 Citations
7,935 Views
14 Pages

Quantitative EEG Markers of Entropy and Auto Mutual Information in Relation to MMSE Scores of Probable Alzheimer’s Disease Patients

  • Carmina Coronel,
  • Heinrich Garn,
  • Markus Waser,
  • Manfred Deistler,
  • Thomas Benke,
  • Peter Dal-Bianco,
  • Gerhard Ransmayr,
  • Stephan Seiler,
  • Dieter Grossegger and
  • Reinhold Schmidt

17 March 2017

Analysis of nonlinear quantitative EEG (qEEG) markers describing complexity of signal in relation to severity of Alzheimer’s disease (AD) was the focal point of this study. In this study, 79 patients diagnosed with probable AD were recruited from the...

  • Article
  • Open Access
1 Citation
1,196 Views
14 Pages

The Application of Spectral Entropy to P-Wave Detection in Continuous Seismogram Analysis

  • Alisher Skabylov,
  • Aldiyar Agishev,
  • Dauren Zhexebay,
  • Margulan Ibraimov,
  • Serik Khokhlov and
  • Alua Maksutova

7 August 2025

This work aims to develop approaches to processing and interpreting spectral entropy outcomes in the context of seismic data, as well as to establish a methodological foundation for subsequent integration into practical monitoring solutions. The obje...

  • Review
  • Open Access
40 Citations
6,830 Views
14 Pages

21 September 2018

This article discusses how entropy/information methods are well-suited to analyzing and forecasting the four processes of innovation, transmission, movement, and adaptation, which are the common basis to ecology and evolution. Macroecologists study a...
