
Search Results (48)

Search Parameters:
Keywords = generalised entropy

19 pages, 474 KB  
Article
Wavelet Energy Entropy for Predictability and Cross-Market Similarity in Crude Oil Benchmarks
by Maria Carannante and Alessandro Mazzoccoli
Axioms 2026, 15(4), 253; https://doi.org/10.3390/axioms15040253 - 28 Mar 2026
Viewed by 353
Abstract
We study the predictability and cross-market structural similarity of Brent, WTI, and Dubai crude oil futures by means of a wavelet-based Sharma–Mittal energy entropy measure. The proposed framework combines multiresolution wavelet decomposition with a parametric generalised entropy, allowing the characterisation of informational complexity across scales and entropic parameters. We show that predictability is jointly scale- and parameter-dependent. Despite this dependence, the resulting wavelet entropy surfaces exhibit a high degree of geometric similarity across the three benchmarks. A discrepancy analysis further indicates that cross-market differences are localised in restricted regions of the parameter space, whereas intermediate scales are associated with maximal entropy values. Outside such regions, the entropy surfaces converge. Overall, the results provide evidence of a common multi-scale entropic structure underlying crude oil benchmarks, with regional effects affecting predictability without altering the global structural properties. These findings are consistent with the hypothesis of strong informational integration in global oil markets. Full article
(This article belongs to the Special Issue Advances in Financial Mathematics)
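The entropy measure named in the abstract can be illustrated with a short numerical sketch. The Haar decomposition and the (q, r) parametrisation below are standard textbook forms, not the authors' exact implementation; note that for a uniform energy distribution over n scales the Sharma–Mittal value collapses to (n^(1-r) - 1)/(1 - r), independent of q, which makes a handy sanity check.

```python
import numpy as np

def haar_energies(x, levels):
    """Relative energy per scale from a plain (orthonormal) Haar decomposition."""
    x = np.asarray(x, dtype=float)
    energies = []
    for _ in range(levels):
        approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)
        detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)
        energies.append(np.sum(detail ** 2))
        x = approx
    energies.append(np.sum(x ** 2))  # coarsest approximation energy
    e = np.array(energies)
    return e / e.sum()

def sharma_mittal(p, q, r):
    """Sharma-Mittal entropy; tends to Renyi as r -> 1 and to Tsallis as r -> q."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return (np.sum(p ** q) ** ((1.0 - r) / (1.0 - q)) - 1.0) / (1.0 - r)
```

Sweeping (q, r) over a grid of wavelet energy distributions yields the kind of entropy surface the abstract compares across the three benchmarks.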
37 pages, 4917 KB  
Article
Transformer and Pre-Transformer Model-Based Sentiment Prediction with Various Embeddings: A Case Study on Amazon Reviews
by Ismail Duru and Ayşe Saliha Sunar
Entropy 2025, 27(12), 1202; https://doi.org/10.3390/e27121202 - 27 Nov 2025
Cited by 1 | Viewed by 1915
Abstract
Sentiment analysis is essential for understanding consumer opinions, yet selecting the optimal models and embedding methods remains challenging, especially when handling ambiguous expressions, slang, or mismatched sentiment–rating pairs. This study provides a comprehensive comparative evaluation of sentiment classification models across three paradigms: traditional machine learning, pre-transformer deep learning, and transformer-based models. Using the Amazon Magazine Subscriptions 2023 dataset, we evaluate a range of embedding techniques, including static embeddings (GloVe, FastText) and contextual transformer embeddings (BERT, DistilBERT, etc.). To capture predictive confidence and model uncertainty, we include categorical cross-entropy as a key evaluation metric alongside accuracy, precision, recall, and F1-score. In addition to detailed quantitative comparisons, we conduct a systematic qualitative analysis of misclassified samples to reveal model-specific patterns of uncertainty. Our findings show that FastText consistently outperforms GloVe in both traditional and LSTM-based models, particularly in recall, due to its subword-level semantic richness. Transformer-based models demonstrate superior contextual understanding and achieve the highest accuracy (92%) and lowest cross-entropy loss (0.25) with DistilBERT, indicating well-calibrated predictions. To validate the generalisability of our results, we replicated our experiments on the Amazon Gift Card Reviews dataset, where similar trends were observed. We also adopt a resource-aware approach by reducing the dataset size from 25 K to 20 K to reflect real-world hardware constraints. This study contributes to both sentiment analysis and sustainable AI by offering a scalable, entropy-aware evaluation framework that supports informed, context-sensitive model selection for practical applications. Full article
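The abstract's use of categorical cross-entropy as a calibration-aware companion to accuracy is easy to reproduce; the toy predictions below are invented for illustration. Two models with identical accuracy can differ sharply in cross-entropy when one is more confident in its correct answers.

```python
import numpy as np

def categorical_cross_entropy(y_true, probs, eps=1e-12):
    """Mean negative log-probability assigned to the true class."""
    probs = np.clip(probs, eps, 1.0)
    return -np.mean(np.log(probs[np.arange(len(y_true)), y_true]))

def accuracy(y_true, probs):
    """Fraction of samples whose argmax class matches the label."""
    return np.mean(np.argmax(probs, axis=1) == y_true)
```

A well-calibrated, confident model drives the cross-entropy down even when the accuracy is already saturated, which is the distinction the abstract draws for DistilBERT.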
13 pages, 314 KB  
Article
Thermodynamic Hamiltonian and Entropy Production
by Umberto Lucia and Giulia Grisolia
Mathematics 2025, 13(19), 3214; https://doi.org/10.3390/math13193214 - 7 Oct 2025
Viewed by 1068
Abstract
The variational method holds considerable significance within mathematical and theoretical physics. Its importance stems from its capacity to characterise natural systems through physical quantities, irrespective of the chosen frame of reference. This characteristic makes it a powerful tool for understanding the behaviour of diverse physical phenomena. A global and statistical approach originating from the principles of non-equilibrium thermodynamics has been developed. This approach culminates in the principle of maximum entropy generation, specifically tailored for open systems. The principle itself arises as a direct consequence of applying the Lagrangian approach to open systems. The work focuses on a generalised method for deriving the thermodynamic Hamiltonian. This Hamiltonian is essential to the dynamical analysis of open systems, allowing for a detailed examination of their time evolution. The analysis suggests that irreversibility appears to be a fundamental process related to the evolution of states within open systems. Full article
41 pages, 7199 KB  
Article
Entropy, Irreversibility, and Time-Series Deep Learning of Kinematic and Kinetic Data for Gait Classification in Children with Cerebral Palsy, Idiopathic Toe Walking, and Hereditary Spastic Paraplegia
by Alfonso de Gorostegui, Massimiliano Zanin, Juan-Andrés Martín-Gonzalo, Javier López-López, David Gómez-Andrés, Damien Kiernan and Estrella Rausell
Sensors 2025, 25(13), 4235; https://doi.org/10.3390/s25134235 - 7 Jul 2025
Cited by 2 | Viewed by 1402
Abstract
The use of gait analysis to differentiate among paediatric populations with neurological and developmental conditions such as idiopathic toe walking (ITW), cerebral palsy (CP), and hereditary spastic paraplegia (HSP) remains challenging due to the insufficient precision of current diagnostic approaches, leading in some cases to misdiagnosis. Existing methods often isolate the analysis of gait variables, overlooking the whole complexity of biomechanical patterns and variations in motor control strategies. While previous studies have explored the use of statistical physics principles for the analysis of impaired gait patterns, gaps remain in integrating both kinematic and kinetic information or benchmarking these approaches against Deep Learning models. This study evaluates the robustness of statistical physics metrics in differentiating between normal and abnormal gait patterns and quantifies how the data source affects model performance. The analysis was conducted using gait data sets from two research institutions in Madrid and Dublin, with a total of 81 children with ITW, 300 with CP, 20 with HSP, and 127 typically developing children as controls. From each kinematic and kinetic time series, Shannon’s entropy, permutation entropy, weighted permutation entropy, and time irreversibility metrics were derived and used with Random Forest models. The classification accuracy of these features was compared to a ResNet Deep Learning model. Further analyses explored the effects of inter-laboratory comparisons and the spatiotemporal resolution of time series on classification performance and evaluated the impact of age and walking speed with linear mixed models. The results revealed that statistical physics metrics were able to differentiate among impaired gait patterns, achieving classification scores comparable to ResNet. The effects of walking speed and age on gait predictability and temporal organisation were observed as disease-specific patterns. However, performance differences across laboratories limit the generalisation of the trained models. These findings highlight the value of statistical physics metrics in the classification of children with different toe walking conditions and point towards the need for multimetric integration to improve diagnostic accuracy and gain a more comprehensive understanding of gait disorders. Full article
(This article belongs to the Special Issue Sensor Technologies for Gait Analysis: 2nd Edition)
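Of the statistical physics metrics listed (Shannon's entropy, permutation entropy, weighted permutation entropy, time irreversibility), permutation entropy is the easiest to sketch. This is the standard Bandt–Pompe construction, not the study's specific parameter choices; a strictly monotone series scores 0 and white noise scores near 1 after normalisation.

```python
import numpy as np
from math import factorial

def permutation_entropy(x, order=3, delay=1, normalise=True):
    """Shannon entropy of ordinal patterns (Bandt-Pompe construction)."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (order - 1) * delay
    patterns = {}
    for i in range(n):
        window = x[i:i + order * delay:delay]
        key = tuple(np.argsort(window))       # ordinal pattern of the window
        patterns[key] = patterns.get(key, 0) + 1
    p = np.array(list(patterns.values()), dtype=float) / n
    h = -np.sum(p * np.log2(p))
    return h / np.log2(factorial(order)) if normalise else h
```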
25 pages, 2711 KB  
Article
Enhancing Multi-User Activity Recognition in an Indoor Environment with Augmented Wi-Fi Channel State Information and Transformer Architectures
by MD Irteeja Kobir, Pedro Machado, Ahmad Lotfi, Daniyal Haider and Isibor Kennedy Ihianle
Sensors 2025, 25(13), 3955; https://doi.org/10.3390/s25133955 - 25 Jun 2025
Cited by 3 | Viewed by 2487
Abstract
Human Activity Recognition (HAR) is crucial for understanding human behaviour through sensor data, with applications in healthcare, smart environments, and surveillance. While traditional HAR often relies on ambient sensors, wearable devices or vision-based systems, these approaches can face limitations in dynamic settings and raise privacy concerns. Device-free HAR systems, utilising the sensitivity of Wi-Fi Channel State Information (CSI) to human movements, have emerged as a promising privacy-preserving alternative for next-generation health activity monitoring and smart environments, particularly for multi-user scenarios. However, current research faces challenges such as the need for substantial annotated training data, class imbalance, and poor generalisability in complex, multi-user environments where labelled data is often scarce. This paper addresses these gaps by proposing a hybrid deep learning approach which integrates signal preprocessing, targeted data augmentation, and a customised integration of CNN and Transformer models, designed to address the challenges of multi-user recognition and data scarcity. A random transformation technique to augment real CSI data, followed by hybrid feature extraction involving statistical, spectral, and entropy-based measures to derive suitable representations from temporal sensory input, is employed. Experimental results show that the proposed model outperforms several baselines in single-user and multi-user contexts. Our findings demonstrate that combining real and augmented data significantly improves model generalisation in scenarios with limited labelled data. Full article
(This article belongs to the Special Issue Sensors and Data Analysis for Biomechanics and Physical Activity)
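The "random transformation technique to augment real CSI data" is not specified in detail here, so the transforms below (amplitude scaling, additive jitter, circular time shift) are a plausible minimal sketch rather than the paper's actual pipeline; the function and parameter names are invented.

```python
import numpy as np

def augment_csi(x, rng, sigma=0.05, scale_range=(0.9, 1.1), max_shift=10):
    """Hypothetical random-transformation augmentation for a CSI amplitude
    series: random amplitude scaling, additive Gaussian jitter, and a small
    circular time shift."""
    x = np.asarray(x, dtype=float)
    x = x * rng.uniform(*scale_range)                        # amplitude scaling
    x = x + rng.normal(0.0, sigma, size=x.shape)             # additive jitter
    x = np.roll(x, rng.integers(-max_shift, max_shift + 1))  # time shift
    return x
```

Each call produces a new labelled variant of the same activity window, which is how augmentation stretches a scarce labelled set.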
34 pages, 435 KB  
Review
Black Hole Thermodynamics and Generalised Non-Extensive Entropy
by Emilio Elizalde, Shin’ichi Nojiri and Sergei D. Odintsov
Universe 2025, 11(2), 60; https://doi.org/10.3390/universe11020060 - 11 Feb 2025
Cited by 20 | Viewed by 2506
Abstract
The first part of this work provides a review of recent research on generalised entropies and their origin, as well as their application to black hole thermodynamics. To start, it is shown that the Hawking temperature and the Bekenstein–Hawking entropy are, respectively, the only possible thermodynamical temperature and entropy of the Schwarzschild black hole. Moreover, it is investigated whether the other known generalised entropies, which include Rényi’s entropy, Tsallis entropy, and the four- and five-parameter generalised entropies, could correctly yield the Hawking temperature and the ADM mass. The possibility that generalised entropies could describe hairy black hole thermodynamics is also considered, both for the Reissner–Nordström black hole and for Einstein’s gravity coupled with two scalar fields. Two possibilities are investigated, namely, the case when the ADM mass does not yield the Bekenstein–Hawking entropy, and the case in which the effective mass expressing the energy inside the horizon does not yield the Hawking temperature. For the model with two scalar fields, the radii of the photon sphere and of the black hole shadow are calculated, which gives constraints on the BH parameters. These constraints are seen to be consistent, provided that the black hole is of the Schwarzschild type. Subsequently, the origin of the generalised entropies is investigated, by using their microscopic particle descriptions in the frameworks of a microcanonical ensemble and canonical ensemble, respectively. Finally, the McLaughlin expansion for the generalised entropies is used to derive, in each case, the microscopic interpretation of the generalised entropies, via the canonical and the grand canonical ensembles. Full article
(This article belongs to the Section Gravitation)
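The review's starting claim, that the Hawking temperature and Bekenstein–Hawking entropy form the consistent thermodynamic pair for the Schwarzschild black hole, can be checked numerically: in units G = c = ħ = k_B = 1, T = 1/(8πM) and S = 4πM² satisfy the first law dM = T dS.

```python
import numpy as np

def hawking_temperature(M):
    """Schwarzschild Hawking temperature in units G = c = hbar = k_B = 1."""
    return 1.0 / (8.0 * np.pi * M)

def bekenstein_hawking_entropy(M):
    """S = A/4 with horizon radius r_s = 2M, hence A = 16*pi*M**2."""
    return 4.0 * np.pi * M ** 2

def first_law_residual(M, h=1e-6):
    """|T * dS/dM - 1|, which vanishes when dM = T dS holds."""
    dS = (bekenstein_hawking_entropy(M + h)
          - bekenstein_hawking_entropy(M - h)) / (2 * h)
    return abs(hawking_temperature(M) * dS - 1.0)
```

Swapping a generalised entropy into `bekenstein_hawking_entropy` breaks this residual unless the temperature is redefined accordingly, which is essentially the consistency question the review pursues.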
15 pages, 2734 KB  
Article
Engineering the Mechanics and Thermodynamics of Ti3AlC2, Hf3AlC2, Hf3GaC2, (ZrHf)3AlC2, and (ZrHf)4AlN3 MAX Phases via the Ab Initio Method
by Adel Bandar Alruqi
Crystals 2025, 15(1), 87; https://doi.org/10.3390/cryst15010087 - 17 Jan 2025
Cited by 3 | Viewed by 2543
Abstract
When combined with ceramics, ternary carbides, nitrides, and borides form a class of materials known as MAX phases. These materials exhibit a multilayer hexagonal structure and are very strong, damage tolerant, and thermally stable. Further, they have a low thermal expansion and exhibit outstanding resistance to corrosion and oxidation. However, despite the numerous MAX phases that have been identified, the search for better MAX phases is ongoing, including the recently discovered Zr3InC2 and Hf3InC2. The properties of MAX phases are still being tailored in order to lower their ductility. This study investigated Ti3AlC2 alloyed with nitrogen, gallium, hafnium, and zirconium with the aim of achieving better mechanical and thermal performances. Density functional theory within the Quantum ESPRESSO package was used in the computations. The Perdew–Burke–Ernzerhof generalised gradient approximation functionals were utilised. (ZrHf)4AlN3 exhibited enhanced bulk and Young’s moduli, entropy, specific heat, and melting temperature. The best thermal conductivity was observed in the case of (ZrHf)3AlC2. Further, Ti3AlC2 exhibited the highest shear modulus, Debye temperature, and electrical conductivity. These samples can thus form part of the group of MAX phases that are used in areas wherein the above properties are crucial. These include structural components in aerospace and automotive engineering applications, turbine blades, and heat exchangers. However, the samples need to be synthesised and their properties require verification. Full article
(This article belongs to the Special Issue Modern Technologies in the Manufacturing of Metal Matrix Composites)
7 pages, 220 KB  
Article
An Information-Theoretic Proof of a Hypercontractive Inequality
by Ehud Friedgut
Entropy 2024, 26(11), 966; https://doi.org/10.3390/e26110966 - 11 Nov 2024
Viewed by 1538
Abstract
The famous hypercontractive estimate discovered independently by Gross, Bonami and Beckner has had a great impact on combinatorics and theoretical computer science since it was first used in this setting in a seminal paper by Kahn, Kalai and Linial. The usual proofs of this inequality begin with the two-point space, where some elementary calculus is used, and the result is then generalised immediately by introducing another dimension using submultiplicativity (Minkowski’s integral inequality). In this paper, we prove this inequality using information theory. We compare the entropy of a pair of correlated vectors in {0,1}n to their separate entropies, analysing them bit by bit (not as a figure of speech, but as the bits are revealed) using the chain rule of entropy. Full article
(This article belongs to the Section Information Theory, Probability and Statistics)
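The bit-by-bit chain-rule argument the abstract alludes to rests on the identity H(X,Y) = H(X) + H(Y|X). A direct numerical check on small joint distributions (the distributions below are arbitrary examples, not from the paper):

```python
import numpy as np

def H(p):
    """Shannon entropy in bits of a probability array."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def chain_rule_check(joint):
    """Return (H(X,Y), H(X) + H(Y|X)) for a 2-D joint distribution,
    mirroring the 'reveal Y after X' decomposition."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1)
    h_y_given_x = sum(px[i] * H(joint[i] / px[i])
                      for i in range(len(px)) if px[i] > 0)
    return H(joint), H(px) + h_y_given_x
```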
14 pages, 314 KB  
Entry
Count Random Variables
by Sandra Mendonça, António Alberto Oliveira, Dinis Pestana and Maria Luísa Rocha
Encyclopedia 2024, 4(3), 1367-1380; https://doi.org/10.3390/encyclopedia4030089 - 23 Sep 2024
Viewed by 1758
Definition
The observation of randomness patterns serves as guidance for the craft of probabilistic modelling. The most used count models—Binomial, Poisson, Negative Binomial—are the discrete Morris’ natural exponential families whose variance is at most quadratic in the mean, as well as the solutions of the Katz–Panjer recurrence relation; they are also members of the generalised power series and hypergeometric distribution families, which accounts for their many advantageous characteristics. Some other basic count models are also described, as well as models with less obvious but useful randomness patterns in connection with maximum entropy characterisations, such as the Zipf and Good models. Simple tools, such as truncation, thinning, or parameter randomisation, are straightforward ways of constructing other count models. Full article
(This article belongs to the Section Mathematics & Computer Science)
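The Katz–Panjer recurrence mentioned above states p_k = (a + b/k) p_{k-1} for k ≥ 1; the Poisson family corresponds to a = 0, b = λ (the Binomial and Negative Binomial arise from other (a, b) pairs). A minimal check against the closed-form Poisson mass function:

```python
import math

def katz_panjer(a, b, p0, kmax):
    """Generate p_0 .. p_kmax from the recurrence p_k = (a + b/k) * p_{k-1}."""
    p = [p0]
    for k in range(1, kmax + 1):
        p.append((a + b / k) * p[-1])
    return p

# Poisson(lam) corresponds to a = 0, b = lam, p0 = exp(-lam)
lam = 2.5
p = katz_panjer(0.0, lam, math.exp(-lam), 20)
```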
19 pages, 436 KB  
Review
Different Aspects of Entropic Cosmology
by Shin’ichi Nojiri, Sergei D. Odintsov and Tanmoy Paul
Universe 2024, 10(9), 352; https://doi.org/10.3390/universe10090352 - 3 Sep 2024
Cited by 26 | Viewed by 2703
Abstract
We provide a short review of the recent developments in entropic cosmology based on two thermodynamic laws of the apparent horizon, namely the first and the second laws of thermodynamics. The first law essentially provides the change in entropy of the apparent horizon during the cosmic evolution of the universe; in particular, it is expressed by TdS=d(ρV)+WdV (where W is the work density and other quantities have their usual meanings). In this way, the first law actually links various theories of gravity with the entropy of the apparent horizon. This leads to a natural question—“What is the form of the horizon entropy corresponding to a general modified theory of gravity?”. The second law of horizon thermodynamics states that the change in total entropy (the sum of horizon entropy + matter fields’ entropy) with respect to cosmic time must be positive, where the matter fields behave like an open system characterised by a non-zero chemical potential. The second law of horizon thermodynamics importantly provides model-independent constraints on entropic parameters. Finally, we discuss the standpoint of entropic cosmology on inflation (or bounce), reheating and primordial gravitational waves from the perspective of a generalised entropy function. Full article
(This article belongs to the Special Issue Universe: Feature Papers 2024—'Cosmology')
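For reference, the first law quoted in the abstract is usually written with the work density W = (ρ − p)/2 and, in the standard Bekenstein–Hawking case, the apparent-horizon entropy and temperature below; the conventions are those common in the horizon-thermodynamics literature rather than anything specific to this review.

```latex
T\,\mathrm{d}S = \mathrm{d}(\rho V) + W\,\mathrm{d}V,
\qquad W = \tfrac{1}{2}(\rho - p),
\qquad S_{\mathrm{BH}} = \frac{A}{4G} = \frac{\pi R_A^{2}}{G},
\qquad T = \frac{1}{2\pi R_A},
```

where R_A is the apparent-horizon radius; replacing S_BH by a generalised entropy function is what generates the modified Friedmann equations the review surveys.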
21 pages, 14010 KB  
Article
A Time-Series Feature-Extraction Methodology Based on Multiscale Overlapping Windows, Adaptive KDE, and Continuous Entropic and Information Functionals
by Antonio Squicciarini, Elio Valero Toranzo and Alejandro Zarzo
Mathematics 2024, 12(15), 2396; https://doi.org/10.3390/math12152396 - 31 Jul 2024
Viewed by 2902
Abstract
We propose a new methodology to transform a time series into an ordered sequence of any entropic and information functionals, providing a novel tool for data analysis. To achieve this, a new algorithm has been designed to optimize the Probability Density Function (PDF) associated with a time signal in the context of non-parametric Kernel Density Estimation (KDE). We illustrate the applicability of this method for anomaly detection in time signals. Specifically, our approach combines a non-parametric kernel density estimator with overlapping windows of various scales. Regarding the parameters involved in the KDE, it is well-known that bandwidth tuning is crucial for the kernel density estimator. To optimize it for time-series data, we introduce an adaptive solution based on Jensen–Shannon divergence, which adjusts the bandwidth for each window length to balance overfitting and underfitting. This solution selects unique bandwidth parameters for each window scale. Furthermore, it is implemented offline, eliminating the need for online optimization for each time-series window. To validate our methodology, we designed a synthetic experiment using a non-stationary signal generated by the composition of two stationary signals and a modulation function that controls the transitions between a normal and an abnormal state, allowing for the arbitrary design of various anomaly transitions. Additionally, we tested the methodology on real scalp-EEG data to detect epileptic crises. The results show our approach effectively detects and characterizes anomaly transitions. The use of overlapping windows at various scales significantly enhances detection ability, allowing for the simultaneous analysis of phenomena at different scales. Full article
(This article belongs to the Special Issue Advances in Computational Mathematics and Applied Mathematics)
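The Jensen–Shannon-based bandwidth rule is only described at a high level here, so the selector below is one plausible reading, not the authors' algorithm: fit a Gaussian KDE on one half of the data and score each candidate bandwidth by the JS divergence against a histogram of the held-out half, so that both spiky (overfit) and flat (underfit) estimates score badly. All names are invented.

```python
import numpy as np

def gaussian_kde(samples, grid, h):
    """Gaussian KDE evaluated on a grid, normalised to sum to one."""
    z = (grid[:, None] - samples[None, :]) / h
    pdf = np.exp(-0.5 * z ** 2).sum(axis=1)
    return pdf / pdf.sum()

def js_divergence(p, q, eps=1e-12):
    """Jensen-Shannon divergence between two discrete distributions."""
    m = 0.5 * (p + q)
    kl = lambda a, b: np.sum(a * np.log((a + eps) / (b + eps)))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def select_bandwidth(x, bandwidths, rng, bins=50):
    """Choose the bandwidth whose train-half KDE best matches a histogram
    of the held-out half (minimum JS divergence)."""
    idx = rng.permutation(len(x))
    train, test = x[idx[: len(x) // 2]], x[idx[len(x) // 2:]]
    edges = np.linspace(x.min() - 1.0, x.max() + 1.0, bins + 1)
    centers = 0.5 * (edges[:-1] + edges[1:])
    hist = np.histogram(test, bins=edges)[0].astype(float)
    hist /= hist.sum()
    scores = [js_divergence(gaussian_kde(train, centers, h), hist)
              for h in bandwidths]
    return bandwidths[int(np.argmin(scores))]
```

As in the paper, the selection can be run once per window scale offline, so no per-window optimisation is needed at analysis time.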
26 pages, 7408 KB  
Article
A Hybrid Model for Carbon Price Forecasting Based on Improved Feature Extraction and Non-Linear Integration
by Yingjie Zhu, Yongfa Chen, Qiuling Hua, Jie Wang, Yinghui Guo, Zhijuan Li, Jiageng Ma and Qi Wei
Mathematics 2024, 12(10), 1428; https://doi.org/10.3390/math12101428 - 7 May 2024
Cited by 16 | Viewed by 2987
Abstract
Accurately predicting the price of carbon is an effective way of ensuring the stability of the carbon trading market and reducing carbon emissions. Aiming at the non-smooth and non-linear characteristics of carbon price, this paper proposes a novel hybrid prediction model based on improved feature extraction and non-linear integration, which is built on complete ensemble empirical mode decomposition with adaptive noise (CEEMDAN), fuzzy entropy (FuzzyEn), improved random forest using particle swarm optimisation (PSORF), extreme learning machine (ELM), long short-term memory (LSTM), non-linear integration based on multiple linear regression (MLR) and random forest (MLRRF), and error correction with the autoregressive integrated moving average model (ARIMA), named CEEMDAN-FuzzyEn-PSORF-ELM-LSTM-MLRRF-ARIMA. Firstly, CEEMDAN is combined with FuzzyEn in the feature selection process to improve extraction efficiency and reliability. Secondly, at the critical prediction stage, PSORF, ELM, and LSTM are selected to predict high, medium, and low complexity sequences, respectively. Thirdly, the reconstructed sequences are assembled by applying MLRRF, which can effectively improve the prediction accuracy and generalisation ability. Finally, error correction is conducted using ARIMA to obtain the final forecasting results, and the Diebold–Mariano test (DM test) is introduced for a comprehensive evaluation of the models. With respect to carbon prices in the pilot regions of Shenzhen and Hubei, the results indicate that the proposed model has higher prediction accuracy and robustness. The main contributions of this paper are the improved feature extraction and the innovative combination of multiple linear regression and random forests into a non-linear integrated framework for carbon price forecasting. However, further optimisation is still a work in progress. Full article
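Among the many components in the hybrid model, fuzzy entropy (FuzzyEn) is the most self-contained to illustrate. The sketch below uses the common exponential membership exp(−(d/r)²) in place of sample entropy's hard threshold; the parameter defaults are conventional choices, not necessarily the paper's.

```python
import numpy as np

def _phi(x, m, r):
    """Average fuzzy similarity of length-m templates (baseline removed)."""
    n = len(x) - m
    templates = np.array([x[i:i + m] - np.mean(x[i:i + m]) for i in range(n)])
    total = 0.0
    for i in range(n):
        d = np.max(np.abs(templates - templates[i]), axis=1)  # Chebyshev distance
        sim = np.exp(-(d / r) ** 2)                           # fuzzy membership
        total += (np.sum(sim) - 1.0) / (n - 1)                # exclude self-match
    return total / n

def fuzzy_entropy(x, m=2, r=0.2):
    """FuzzyEn: like sample entropy, but with an exponential membership
    function instead of a hard threshold."""
    x = np.asarray(x, dtype=float)
    r = r * np.std(x)
    return np.log(_phi(x, m, r)) - np.log(_phi(x, m + 1, r))
```

A regular signal scores low, a noisy one high, which is what lets FuzzyEn rank the CEEMDAN modes by complexity before routing them to different predictors.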
27 pages, 522 KB  
Article
Higher-Order Interactions and Their Duals Reveal Synergy and Logical Dependence beyond Shannon-Information
by Abel Jansma
Entropy 2023, 25(4), 648; https://doi.org/10.3390/e25040648 - 12 Apr 2023
Cited by 4 | Viewed by 4193
Abstract
Information-theoretic quantities reveal dependencies among variables in the structure of joint, marginal, and conditional entropies while leaving certain fundamentally different systems indistinguishable. Furthermore, there is no consensus on the correct higher-order generalisation of mutual information (MI). In this manuscript, we show that a recently proposed model-free definition of higher-order interactions among binary variables (MFIs) is, like mutual information, a Möbius inversion on a Boolean algebra, but of surprisal instead of entropy. This provides an information-theoretic interpretation to the MFIs, and by extension to Ising interactions. We study the objects dual to mutual information and the MFIs on the order-reversed lattices. We find that dual MI is related to the previously studied differential mutual information, while dual interactions are interactions with respect to a different background state. Unlike (dual) mutual information, interactions and their duals uniquely identify all six 2-input logic gates, the dy- and triadic distributions, and different causal dynamics that are identical in terms of their Shannon information content. Full article
(This article belongs to the Section Information Theory, Probability and Statistics)
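The abstract's point that Shannon quantities leave fundamentally different systems indistinguishable is classically illustrated by XOR: with two fair input bits and Z = X ⊕ Y, every pairwise mutual information vanishes while the joint inputs fully determine the output. A small numerical check (the distribution is the textbook example, not one from the paper):

```python
import numpy as np
from itertools import product

def entropy(p):
    """Shannon entropy in bits."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Joint distribution of (X, Y, Z): X, Y fair coins, Z = X xor Y
joint = np.zeros((2, 2, 2))
for x, y in product([0, 1], repeat=2):
    joint[x, y, x ^ y] = 0.25

H_X = entropy(joint.sum(axis=(1, 2)))
H_Z = entropy(joint.sum(axis=(0, 1)))
H_XZ = entropy(joint.sum(axis=1))
H_XY = entropy(joint.sum(axis=2))
H_XYZ = entropy(joint)

I_X_Z = H_X + H_Z - H_XZ      # pairwise MI: 0 bits for XOR
I_XY_Z = H_XY + H_Z - H_XYZ   # joint inputs vs output: 1 bit (pure synergy)
```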
16 pages, 2242 KB  
Review
The Scientific Contribution of the Kaniadakis Entropy to Nuclear Reactor Physics: A Brief Review
by Aquilino Senra Martinez and Willian Vieira de Abreu
Entropy 2023, 25(3), 478; https://doi.org/10.3390/e25030478 - 9 Mar 2023
Cited by 5 | Viewed by 2324
Abstract
In nuclear reactors, tracking the loss and production of neutrons is crucial for the safe operation of such devices. In this regard, the microscopic cross section with the Doppler broadening function is a way to represent the thermal agitation movement in a reactor core. This function usually considers the Maxwell–Boltzmann statistics for the velocity distribution. However, this distribution cannot be applied on every occasion, i.e., in conditions outside the thermal equilibrium. In order to overcome this potential limitation, Kaniadakis entropy has been used over the last seven years to generate generalised nuclear data. This short review article summarises what has been done so far and what remains to be done. Full article
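The Kaniadakis deformation behind these generalised data replaces the Boltzmann factor with the κ-exponential exp_κ(x) = (√(1 + κ²x²) + κx)^(1/κ), which recovers exp(x) as κ → 0 and gives power-law tails away from thermal equilibrium. A minimal sketch (the unnormalised Maxwellian weight is for illustration only):

```python
import numpy as np

def kaniadakis_exp(x, kappa):
    """Kaniadakis kappa-exponential; reduces to exp(x) as kappa -> 0."""
    if kappa == 0:
        return np.exp(x)
    return (np.sqrt(1.0 + kappa ** 2 * x ** 2) + kappa * x) ** (1.0 / kappa)

def kappa_maxwellian(v, kappa, m_over_kT=1.0):
    """Kappa-deformed Maxwellian speed weight (unnormalised sketch)."""
    return kaniadakis_exp(-0.5 * m_over_kT * v ** 2, kappa)
```

For κ > 0 the deformed weight decays more slowly than the Maxwell–Boltzmann factor at high speeds, which is what shifts the Doppler-broadened cross sections.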
9 pages, 276 KB  
Article
Rényi Entropy in Statistical Mechanics
by Jesús Fuentes and Jorge Gonçalves
Entropy 2022, 24(8), 1080; https://doi.org/10.3390/e24081080 - 5 Aug 2022
Cited by 19 | Viewed by 6719
Abstract
Rényi entropy was originally introduced in the field of information theory as a parametric relaxation of Shannon (in physics, Boltzmann–Gibbs) entropy. This has also fuelled different attempts to generalise statistical mechanics, although mostly skipping the physical arguments behind this entropy and instead tending to introduce it artificially. However, as we will show, modifications to the theory of statistical mechanics are needless to see how Rényi entropy automatically arises as the average rate of change of free energy over an ensemble at different temperatures. Moreover, this notion is extended by considering distributions for isospectral, non-isothermal processes, resulting in relative versions of free energy, in which the Kullback–Leibler divergence or the relative version of Rényi entropy appear within the structure of the corrections to free energy. These generalisations of free energy recover the ordinary thermodynamic potential whenever isothermal processes are considered. Full article
(This article belongs to the Special Issue Rényi Entropy: Sixty Years Later)
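The parametric relaxation the abstract starts from is H_α(p) = ln(Σ_i p_i^α)/(1 − α), recovering Shannon entropy in the α → 1 limit; the snippet below checks that limit and the monotone decrease in α (standard properties, independent of the paper's free-energy construction).

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Renyi entropy of order alpha (natural log); alpha = 1 is the Shannon limit."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(alpha, 1.0):
        return -np.sum(p * np.log(p))
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)
```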