Entropy, Volume 26, Issue 6 (June 2024) – 93 articles

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the table of contents of newly released issues.
  • Papers are published in both HTML and PDF forms; PDF is the official format. To view a paper in PDF form, click the "PDF Full-text" link and open it with the free Adobe Reader.
20 pages, 1816 KiB  
Article
Design of a Robust Synchronization-Based Topology Observer for Complex Delayed Networks with Fixed and Adaptive Coupling Strength
by Yanqin Sun, Huaiyu Wu, Zhihuan Chen, Yang Chen and Xiujuan Zheng
Entropy 2024, 26(6), 525; https://doi.org/10.3390/e26060525 - 18 Jun 2024
Abstract
Network topology plays a key role in determining the characteristics and dynamical behaviors of a network. But in practice, network topology is sometimes hidden or uncertain ahead of time because of network complexity. In this paper, a robust-synchronization-based topology observer (STO) is proposed and applied to solve the problem of identifying the topology of complex delayed networks (TICDNs). In comparison to the existing literature, the proposed STO does not require any prior knowledge about the range of topological parameters and does not have strict limits on topology type. Furthermore, the proposed STO is suitable not only for networks with fixed coupling strength, but also for networks with adaptive coupling strength. Finally, a few comparison examples for TICDNs are used to verify the feasibility and efficiency of the proposed STO, and the results show that the proposed STO outperforms the other methods. Full article
(This article belongs to the Section Complexity)
23 pages, 2186 KiB  
Article
Effect of Private Deliberation: Deception of Large Language Models in Game Play
by Kristijan Poje, Mario Brcic, Mihael Kovac and Marina Bagic Babac
Entropy 2024, 26(6), 524; https://doi.org/10.3390/e26060524 - 18 Jun 2024
Abstract
Integrating large language model (LLM) agents within game theory demonstrates their ability to replicate human-like behaviors through strategic decision making. In this paper, we introduce an augmented LLM agent, called the private agent, which engages in private deliberation and employs deception in repeated games. Utilizing the partially observable stochastic game (POSG) framework and incorporating in-context learning (ICL) and chain-of-thought (CoT) prompting, we investigated the private agent’s proficiency in both competitive and cooperative scenarios. Our empirical analysis demonstrated that the private agent consistently achieved higher long-term payoffs than its baseline counterpart and performed similarly or better in various game settings. However, we also found inherent deficiencies of LLMs in certain algorithmic capabilities crucial for high-quality decision making in games. These findings highlight the potential for enhancing LLM agents’ performance in multi-player games using information-theoretic approaches of deception and communication with complex environments. Full article
(This article belongs to the Section Information Theory, Probability and Statistics)
2 pages, 146 KiB  
Correction
Correction: Sandlersky et al. Multispectral Remote Sensing Data Application in Modelling Non-Extensive Tsallis Thermodynamics for Mountain Forests in Northern Mongolia. Entropy 2023, 25, 1653
by Robert Sandlersky, Nataliya Petrzhik, Tushigma Jargalsaikhan and Ivan Shironiya
Entropy 2024, 26(6), 523; https://doi.org/10.3390/e26060523 - 18 Jun 2024
Abstract
There were some errors in the original publication [...] Full article
(This article belongs to the Section Entropy and Biology)
34 pages, 10821 KiB  
Article
Unambiguous Models and Machine Learning Strategies for Anomalous Extreme Events in Turbulent Dynamical System
by Di Qi
Entropy 2024, 26(6), 522; https://doi.org/10.3390/e26060522 - 17 Jun 2024
Abstract
Data-driven modeling methods are studied for turbulent dynamical systems with extreme events under an unambiguous model framework. New neural network architectures are proposed to effectively learn the key dynamical mechanisms including the multiscale coupling and strong instability, and gain robust skill for long-time prediction resistant to the accumulated model errors from the data-driven approximation. The machine learning model overcomes the inherent limitations in traditional long short-term memory networks by exploiting a conditional Gaussian structure informed of the essential physical dynamics. The model performance is demonstrated under a prototype model from idealized geophysical flow and passive tracers, which exhibits analytical solutions with representative statistical features. Many attractive properties are found in the trained model in recovering the hidden dynamics using a limited dataset and sparse observation time, showing uniformly high skill with persistent numerical stability in predicting both the trajectory and statistical solutions among different statistical regimes away from the training regime. The model framework is promising for application to a wider class of turbulent systems with complex structures. Full article
(This article belongs to the Special Issue An Information-Theoretical Perspective on Complex Dynamical Systems)
15 pages, 3678 KiB  
Article
Tsallis Entropy-Based Complexity-IPE Causality Plane: A Novel Method for Complex Time Series Analysis
by Zhe Chen, Changling Wu, Junyi Wang and Hongbing Qiu
Entropy 2024, 26(6), 521; https://doi.org/10.3390/e26060521 - 17 Jun 2024
Abstract
Due to its capacity to unveil the dynamic characteristics of time series data, entropy has attracted growing interest. However, traditional entropy feature extraction methods, such as permutation entropy, fall short in concurrently considering both the absolute amplitude information of signals and the temporal correlation between sample points. Consequently, this limitation leads to inadequate differentiation among different time series and susceptibility to noise interference. In order to augment the discriminative power and noise robustness of entropy features in time series analysis, this paper introduces a novel method called Tsallis entropy-based complexity-improved permutation entropy causality plane (TC-IPE-CP). TC-IPE-CP adopts a novel symbolization approach that preserves both absolute amplitude information and inter-point correlations within sequences, thereby enhancing feature separability and noise resilience. Additionally, by incorporating Tsallis entropy and weighting the probability distribution with parameter q, it integrates with statistical complexity to establish a feature plane of complexity and entropy, further enriching signal features. Through the integration of multiscale algorithms, a multiscale Tsallis-improved permutation entropy algorithm is also developed. The simulation results indicate that TC-IPE-CP requires a small amount of data, exhibits strong noise resistance, and possesses high separability for signals. When applied to the analysis of heart rate signals, fault diagnosis, and underwater acoustic signal recognition, experimental findings demonstrate that TC-IPE-CP can accurately differentiate between electrocardiographic signals of elderly and young subjects, achieve precise bearing fault diagnosis, and identify four types of underwater targets. Particularly in underwater acoustic signal recognition experiments, TC-IPE-CP achieves a recognition rate of 96.67%, surpassing the well-known multi-scale dispersion entropy and multi-scale permutation entropy by 7.34% and 19.17%, respectively. This suggests that TC-IPE-CP is highly suitable for the analysis of complex time series. Full article
(This article belongs to the Special Issue Ordinal Pattern-Based Entropies: New Ideas and Challenges)
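The abstract's two ingredients, ordinal-pattern probabilities and Tsallis entropy with parameter q, follow standard definitions. The sketch below illustrates only those textbook ingredients, not the authors' TC-IPE-CP method; all function names and defaults are illustrative.

```python
import math
from collections import Counter
from itertools import permutations

def ordinal_pattern_probs(x, m=3):
    """Probability of each ordinal (permutation) pattern of order m."""
    counts = Counter(
        tuple(sorted(range(m), key=lambda k: x[i + k]))  # rank order of the window
        for i in range(len(x) - m + 1)
    )
    total = sum(counts.values())
    return [counts.get(p, 0) / total for p in permutations(range(m))]

def shannon_permutation_entropy(x, m=3):
    """Normalised Shannon permutation entropy in [0, 1]."""
    probs = ordinal_pattern_probs(x, m)
    h = -sum(p * math.log(p) for p in probs if p > 0)
    return h / math.log(math.factorial(m))

def tsallis_permutation_entropy(x, m=3, q=2.0):
    """Tsallis entropy of the ordinal-pattern distribution (q != 1)."""
    probs = ordinal_pattern_probs(x, m)
    return (1 - sum(p ** q for p in probs)) / (q - 1)
```

A monotone series yields zero entropy under both measures, while noisy series approach the maximum; plotting a Tsallis-type entropy against a statistical complexity measure gives the kind of causality plane the abstract describes.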
13 pages, 428 KiB  
Article
A Possibilistic Formulation of Autonomous Search for Targets
by Zhijin Chen, Branko Ristic and Du Yong Kim
Entropy 2024, 26(6), 520; https://doi.org/10.3390/e26060520 - 17 Jun 2024
Abstract
Autonomous search is an ongoing cycle of sensing, statistical estimation, and motion control with the objective to find and localise targets in a designated search area. Traditionally, the theoretical framework for autonomous search combines sequential Bayesian estimation with information theoretic motion control. This paper formulates autonomous search in the framework of possibility theory. Although the possibilistic formulation is slightly more involved than the traditional method, it provides a means for quantitative modelling and reasoning in the presence of epistemic uncertainty. This feature is demonstrated in the paper in the context of partially known probability of detection, expressed as an interval value. The paper presents an elegant Bayes-like solution to sequential estimation, with the reward function for motion control defined to take into account the epistemic uncertainty. The advantages of the proposed search algorithm are demonstrated by numerical simulations. Full article
(This article belongs to the Special Issue Advances in Uncertain Information Fusion)
21 pages, 561 KiB  
Article
Comparative Analysis of Deterministic and Nondeterministic Decision Trees for Decision Tables from Closed Classes
by Azimkhon Ostonov and Mikhail Moshkov
Entropy 2024, 26(6), 519; https://doi.org/10.3390/e26060519 - 17 Jun 2024
Abstract
In this paper, we consider classes of decision tables with many-valued decisions closed under operations of the removal of columns, the changing of decisions, the permutation of columns, and the duplication of columns. We study relationships among three parameters of these tables: the complexity of a decision table (if we consider the depth of the decision trees, then the complexity of a decision table is the number of columns in it), the minimum complexity of a deterministic decision tree, and the minimum complexity of a nondeterministic decision tree. We consider the rough classification of functions characterizing relationships and enumerate all possible seven types of relationships. Full article
(This article belongs to the Section Information Theory, Probability and Statistics)
12 pages, 845 KiB  
Article
The Universal Optimism of the Self-Evidencing Mind
by Elizabeth L. Fisher and Jakob Hohwy
Entropy 2024, 26(6), 518; https://doi.org/10.3390/e26060518 - 17 Jun 2024
Abstract
Karl Friston’s free-energy principle casts agents as self-evidencing through active inference. This implies that decision-making, planning and information-seeking are, in a generic sense, ‘wishful’. We take an interdisciplinary perspective on this perplexing aspect of the free-energy principle and unpack the epistemological implications of wishful thinking under the free-energy principle. We use this epistemic framing to discuss the emergence of biases for self-evidencing agents. In particular, we argue that this elucidates an optimism bias as a foundational tenet of self-evidencing. We allude to a historical precursor to some of these themes, interestingly found in Machiavelli’s oeuvre, to contextualise the universal optimism of the free-energy principle. Full article
34 pages, 23635 KiB  
Article
FFT-Based Probability Density Imaging of Euler Solutions
by Shujin Cao, Peng Chen, Guangyin Lu, Zhiyuan Ma, Bo Yang and Xinyue Chen
Entropy 2024, 26(6), 517; https://doi.org/10.3390/e26060517 - 15 Jun 2024
Abstract
When using traditional Euler deconvolution optimization strategies, it is difficult to distinguish between anomalies and their corresponding Euler tails (those solutions are often distributed outside the anomaly source, forming “tail”-shaped spurious solutions, i.e., misplaced Euler solutions, which must be removed or marked) with only the structural index. The nonparametric estimation method based on the normalized B-spline probability density (BSS) is used to separate the Euler solution clusters and mark different anomaly sources according to the similarity and density characteristics of the Euler solutions. For display purposes, the BSS needs to map the samples onto the estimation grid at the points where density will be estimated in order to obtain the probability density distribution. However, if the size of the samples or the estimation grid is too large, this process can lead to high levels of memory consumption and excessive computation times. To address this issue, a fast linear binning approximation algorithm is introduced in the BSS to speed up the computation process and save time. Subsequently, the sample data are quickly projected onto the estimation grid to facilitate the discrete convolution between the grid and the density function using a fast Fourier transform. A method involving multivariate B-spline probability density estimation based on the FFT (BSSFFT), in conjunction with fast linear binning approximation, is proposed in this paper. The results of two random normal distributions show the correctness of the BSS and BSSFFT algorithms, which is verified via a comparison with the true probability density function (pdf) and Gaussian kernel smoothing estimation algorithms. Then, the Euler solutions of the two synthetic models are analyzed using the BSS and BSSFFT algorithms. The results are consistent with their theoretical values, which verify their correctness regarding Euler solutions. Finally, the BSSFFT is applied to Bishop 5X data, and the numerical results show that the comprehensive analysis of the 3D probability density distributions using the BSSFFT algorithm, derived from the Euler solution subset of (x0, y0, z0), can effectively separate and locate adjacent anomaly sources, demonstrating strong adaptability. Full article
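The binning-plus-FFT pipeline the abstract describes is a standard pattern in density estimation: spread sample mass linearly onto a grid, then convolve with the kernel via FFT. The one-dimensional sketch below illustrates that pattern only; it uses a Gaussian kernel rather than the paper's B-spline density, and all names are illustrative.

```python
import numpy as np

def linear_binning(samples, grid):
    """Distribute each sample's unit mass linearly between its two nearest grid points."""
    counts = np.zeros(len(grid))
    dx = grid[1] - grid[0]
    idx = np.clip(((samples - grid[0]) / dx).astype(int), 0, len(grid) - 2)
    frac = (samples - grid[idx]) / dx
    np.add.at(counts, idx, 1 - frac)       # mass to the left grid point
    np.add.at(counts, idx + 1, frac)       # mass to the right grid point
    return counts

def fft_kde(samples, grid, bandwidth):
    """Kernel density estimate via FFT convolution of binned counts with a Gaussian kernel."""
    counts = linear_binning(samples, grid)
    dx = grid[1] - grid[0]
    half = len(grid) // 2
    k = np.arange(-half, len(grid) - half) * dx
    kernel = np.exp(-0.5 * (k / bandwidth) ** 2) / (bandwidth * np.sqrt(2 * np.pi))
    kernel = np.roll(kernel, -half)        # place the kernel peak at index 0
    density = np.real(np.fft.ifft(np.fft.fft(counts) * np.fft.fft(kernel)))
    return density / len(samples)
```

The FFT reduces the grid-kernel convolution from O(G²) to O(G log G), which is the speed-up the abstract claims for large sample sets and estimation grids.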
18 pages, 482 KiB  
Article
Dual-Tower Counterfactual Session-Aware Recommender System
by Wenzhuo Song and Xiaoyu Xing
Entropy 2024, 26(6), 516; https://doi.org/10.3390/e26060516 - 14 Jun 2024
Abstract
In the complex dynamics of modern information systems such as e-commerce and streaming services, managing uncertainty and leveraging information theory are crucial in enhancing session-aware recommender systems (SARSs). This paper presents an innovative approach to SARSs that combines static long-term and dynamic short-term preferences within a counterfactual causal framework. Our method addresses the shortcomings of current prediction models that tend to capture spurious correlations, leading to biased recommendations. By incorporating a counterfactual viewpoint, we aim to elucidate the causal influences of static long-term preferences on next-item selections and enhance the overall robustness of predictive models. We introduce a dual-tower architecture with a novel data augmentation process and a self-supervised training strategy, tailored to tackle inherent biases and unreliable correlations. Extensive experiments demonstrate the effectiveness of our approach, outperforming existing benchmarks and paving the way for more accurate and reliable session-based recommendations. Full article
(This article belongs to the Section Complexity)
13 pages, 756 KiB  
Article
Underwater Wavelength Attack on Discrete Modulated Continuous-Variable Quantum Key Distribution
by Kangyi Feng, Yijun Wang, Yin Li, Yuang Wang, Zhiyue Zuo and Ying Guo
Entropy 2024, 26(6), 515; https://doi.org/10.3390/e26060515 - 14 Jun 2024
Abstract
The wavelength attack utilizes the dependence of beam splitters (BSs) on wavelength to cause legitimate users Alice and Bob to underestimate their excess noise so that Eve can steal more secret keys without being detected. Recently, the wavelength attack on Gaussian-modulated continuous-variable quantum key distribution (CV-QKD) has been researched in both fiber and atmospheric channels. However, the wavelength attack may also pose a threat to the case of ocean turbulent channels, which are vital for the secure communication of both ocean sensor networks and submarines. In this work, we propose two wavelength attack schemes on the underwater discrete modulated (DM) CV-QKD protocol, which are effective for the cases with and without a local oscillator (LO) intensity monitor, respectively. In terms of the transmittance properties of the fused biconical taper (FBT) BS, two sets of wavelengths are determined for Eve’s pulse manipulation, all located in the so-called blue–green band. The derived success criterion shows that both attack schemes can keep Alice and Bob’s estimated excess noise close to zero by selecting the corresponding condition parameters based on channel transmittance. Additionally, our numerical analysis shows that Eve can steal more bits when the wavelength attack controls the value of the estimated excess noise closer to zero. Full article
(This article belongs to the Special Issue Quantum Communications Networks: Trends and Challenges)
11 pages, 1561 KiB  
Article
A Symmetric Form of the Clausius Statement of the Second Law of Thermodynamics
by Ti-Wei Xue, Tian Zhao and Zeng-Yuan Guo
Entropy 2024, 26(6), 514; https://doi.org/10.3390/e26060514 - 14 Jun 2024
Abstract
Bridgman once observed that the laws of thermodynamics were formulated in their present form by the great founders of thermodynamics, Kelvin and Clausius, before all the essential physical facts were in, and that there has been no adequate reexamination of the fundamentals since. Thermodynamics still has unknown possibilities waiting to be explored. This paper begins with a brief review of Clausius’s work on the second law of thermodynamics and a reassessment of the content of Clausius’s statement. The review shows that what Clausius originally referred to as the second law of thermodynamics was, in fact, the theorem of equivalence of transformations (TET) in a reversible cycle. On this basis, a new symmetric form of Clausius’s TET is proposed. This theorem says that the two transformations, i.e., the transformation of heat to work and the transformation of work from high pressure to low pressure, should be equivalent in a reversible work-to-heat cycle. New thermodynamic cyclic laws are developed on the basis of the cycle with two work reservoirs (two pressures), which enriches the foundations of the second law of thermodynamics. Full article
(This article belongs to the Special Issue Trends in the Second Law of Thermodynamics)
12 pages, 274 KiB  
Article
Building Test Batteries Based on Analyzing Random Number Generator Tests within the Framework of Algorithmic Information Theory
by Boris Ryabko
Entropy 2024, 26(6), 513; https://doi.org/10.3390/e26060513 - 14 Jun 2024
Abstract
The problem of testing random number generators is considered and a new method for comparing the power of different statistical tests is proposed. It is based on the definitions of random sequence developed in the framework of algorithmic information theory and allows comparing the power of different tests in some cases when the available methods of mathematical statistics do not distinguish between tests. In particular, it is shown that tests based on data compression methods using dictionaries should be included in test batteries. Full article
(This article belongs to the Special Issue Complexity, Entropy and the Physics of Information II)
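The abstract's recommendation of dictionary-based compressors as statistical tests rests on a simple fact from algorithmic information theory: a truly random sequence is incompressible. The sketch below is a minimal illustration of that idea, not the paper's construction; the 0.95 threshold is an arbitrary illustrative choice.

```python
import zlib

def compression_ratio(data: bytes) -> float:
    """Compressed size over original size; a ratio near 1.0 means the data
    resisted compression and so looks random to this dictionary coder."""
    return len(zlib.compress(data, level=9)) / len(data)

def looks_random(data: bytes, threshold: float = 0.95) -> bool:
    """Crude rejection rule: flag data as non-random if it compresses
    noticeably below the threshold."""
    return compression_ratio(data) >= threshold
```

A periodic byte string compresses to a tiny fraction of its size and is rejected, while output from a cryptographic generator stays near ratio 1.0; a real test battery would turn the compressed length into a calibrated p-value rather than a fixed cutoff.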
14 pages, 1090 KiB  
Article
New Quantum Private Comparison Using Four-Particle Cluster State
by Min Hou, Yue Wu and Shibin Zhang
Entropy 2024, 26(6), 512; https://doi.org/10.3390/e26060512 - 14 Jun 2024
Abstract
Quantum private comparison (QPC) enables two users to securely conduct private comparisons in a network characterized by mutual distrust while guaranteeing the confidentiality of their private inputs. Most previous QPC protocols were primarily used to determine the equality of private information between two users, which constrained their scalability. In this paper, we propose a QPC protocol that leverages the entanglement correlation between particles in a four-particle cluster state. This protocol can compare the information of two groups of users within one protocol execution, with each group consisting of two users. A semi-honest third party (TP), who will not deviate from the protocol execution or conspire with any participant, is involved in assisting users to achieve private comparisons. Users encode their inputs into specific angles of rotational operations performed on the received quantum sequence, which is then sent back to TP. Security analysis shows that both external attacks and insider threats are ineffective at stealing private data. Finally, we compare our protocol with some previously proposed QPC protocols. Full article
(This article belongs to the Special Issue Entropy, Quantum Information and Entanglement)
29 pages, 1342 KiB  
Article
Exergoeconomic Analysis and Optimization of a Biomass Integrated Gasification Combined Cycle Based on Externally Fired Gas Turbine, Steam Rankine Cycle, Organic Rankine Cycle, and Absorption Refrigeration Cycle
by Jie Ren, Chen Xu, Zuoqin Qian, Weilong Huang and Baolin Wang
Entropy 2024, 26(6), 511; https://doi.org/10.3390/e26060511 - 12 Jun 2024
Abstract
Adopting biomass energy as an alternative to fossil fuels for electricity production presents a viable strategy to address the prevailing energy deficits and environmental concerns, although it faces challenges related to suboptimal energy efficiency levels. This study introduces a novel combined cooling and power (CCP) system, incorporating an externally fired gas turbine (EFGT), steam Rankine cycle (SRC), absorption refrigeration cycle (ARC), and organic Rankine cycle (ORC), aimed at boosting the efficiency of biomass integrated gasification combined cycle systems. Through the development of mathematical models, this research evaluates the system’s performance from both thermodynamic and exergoeconomic perspectives. Results show that the system could achieve the thermal efficiency, exergy efficiency, and levelized cost of exergy (LCOE) of 70.67%, 39.13%, and 11.67 USD/GJ, respectively. The analysis identifies the combustion chamber of the EFGT as the component with the highest rate of exergy destruction. Further analysis on parameters indicates that improvements in thermodynamic performance are achievable with increased air compressor pressure ratio and gas turbine inlet temperature, or reduced pinch point temperature difference, while the LCOE can be minimized through adjustments in these parameters. Optimized operation conditions demonstrate a potential 5.7% reduction in LCOE at the expense of a 2.5% decrease in exergy efficiency when compared to the baseline scenario. Full article
(This article belongs to the Special Issue Thermodynamic Optimization of Industrial Energy Systems)
16 pages, 597 KiB  
Article
A Bayesian Measure of Model Accuracy
by Gabriel Hideki Vatanabe Brunello and Eduardo Yoshio Nakano
Entropy 2024, 26(6), 510; https://doi.org/10.3390/e26060510 - 12 Jun 2024
Abstract
Ensuring that the proposed probabilistic model accurately represents the problem is a critical step in statistical modeling, as choosing a poorly fitting model can have significant repercussions on the decision-making process. The primary objective of statistical modeling often revolves around predicting new observations, highlighting the importance of assessing the model’s accuracy. However, current methods for evaluating predictive ability typically involve model comparison, which may not guarantee a good model selection. This work presents an accuracy measure designed for evaluating a model’s predictive capability. This measure, which is straightforward and easy to understand, includes a decision criterion for model rejection. The development of this proposal adopts a Bayesian perspective of inference, elucidating the underlying concepts and outlining the necessary procedures for application. To illustrate its utility, the proposed methodology was applied to real-world data, facilitating an assessment of its practicality in real-world scenarios. Full article
(This article belongs to the Section Multidisciplinary Applications)
15 pages, 374 KiB  
Article
Violations of Hyperscaling in Finite-Size Scaling above the Upper Critical Dimension
by A. Peter Young
Entropy 2024, 26(6), 509; https://doi.org/10.3390/e26060509 - 12 Jun 2024
Abstract
We consider how finite-size scaling (FSS) is modified above the upper critical dimension, d_u = 4, due to hyperscaling violations, which in turn arise from a dangerous irrelevant variable. In addition to the commonly studied case of periodic boundary conditions, we also consider new effects that arise with free boundary conditions. Some numerical results are presented in addition to theoretical arguments. Full article
12 pages, 996 KiB  
Article
Characterizing Complex Spatiotemporal Patterns from Entropy Measures
by Luan Orion Barauna, Rubens Andreas Sautter, Reinaldo Roberto Rosa, Erico Luiz Rempel and Alejandro C. Frery
Entropy 2024, 26(6), 508; https://doi.org/10.3390/e26060508 - 12 Jun 2024
Abstract
In addition to their importance in statistical thermodynamics, probabilistic entropy measurements are crucial for understanding and analyzing complex systems, with diverse applications in time series and one-dimensional profiles. However, extending these methods to two- and three-dimensional data still requires further development. In this study, we present a new method for classifying spatiotemporal processes based on entropy measurements. To test and validate the method, we selected five classes of similar processes related to the evolution of random patterns: (i) white noise; (ii) red noise; (iii) weak turbulence from reaction to diffusion; (iv) hydrodynamic fully developed turbulence; and (v) plasma turbulence from MHD. Considering seven possible ways to measure entropy from a matrix, we present the method as a parameter space composed of the two best separating measures of the five selected classes. The results highlight better combined performance of Shannon permutation entropy (SHp) and a new approach based on Tsallis Spectral Permutation Entropy (Sqs). Notably, our observations reveal the segregation of reaction terms in this SHp×Sqs space, a result that identifies specific sectors for each class of dynamic process, and it can be used to train machine learning models for the automatic classification of complex spatiotemporal patterns. Full article
24 pages, 3000 KiB  
Article
Fault Diagnosis of Wind Turbine Gearbox Based on Modified Hierarchical Fluctuation Dispersion Entropy of Tan-Sigmoid Mapping
by Xiang Wang and Yang Du
Entropy 2024, 26(6), 507; https://doi.org/10.3390/e26060507 - 11 Jun 2024
Abstract
Vibration monitoring and analysis are important methods in wind turbine gearbox fault diagnosis, and determining how to extract fault characteristics from the vibration signal is of primary importance. This paper presents a fault diagnosis approach based on modified hierarchical fluctuation dispersion entropy of tan-sigmoid mapping (MHFDE_TANSIG) and northern goshawk optimization–support vector machine (NGO–SVM) for wind turbine gearboxes. The tan-sigmoid (TANSIG) mapping function replaces the normal cumulative distribution function (NCDF) of the hierarchical fluctuation dispersion entropy (HFDE) method. Additionally, the hierarchical decomposition of the HFDE method is improved, resulting in the proposed MHFDE_TANSIG method. The vibration signals of wind turbine gearboxes are analyzed using the MHFDE_TANSIG method to extract fault features. The constructed fault feature set is used to intelligently recognize and classify the fault type of the gearboxes with the NGO–SVM classifier. The fault diagnosis methods based on MHFDE_TANSIG and NGO–SVM are applied to the experimental data analysis of gearboxes with different operating conditions. The results show that the fault diagnosis model proposed in this paper has the best performance with an average accuracy rate of 97.25%. Full article
(This article belongs to the Special Issue Entropy Applications in Condition Monitoring and Fault Diagnosis)
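The core of the feature extractor can be sketched at a single scale: map the signal through a tan-sigmoid instead of the NCDF, discretize into classes, and take the Shannon entropy of the class-difference patterns. This is a simplified illustration of fluctuation dispersion entropy, not the full modified hierarchical method; the class-assignment rule and names are our assumptions:

```python
import math

def tansig(x):
    # tan-sigmoid mapping into (-1, 1), replacing the NCDF of standard HFDE
    return 2.0 / (1.0 + math.exp(-2.0 * x)) - 1.0

def fluctuation_dispersion_entropy(series, m=2, c=3):
    """Single-scale fluctuation dispersion entropy with tan-sigmoid mapping."""
    mu = sum(series) / len(series)
    sd = (sum((x - mu) ** 2 for x in series) / len(series)) ** 0.5 or 1.0
    # Assign each standardized sample to one of c classes via tansig
    classes = [min(c, max(1, round(c * (tansig((x - mu) / sd) + 1) / 2 + 0.5)))
               for x in series]
    # Fluctuation patterns: successive class differences within each window
    counts, n = {}, len(classes) - m + 1
    for i in range(n):
        pat = tuple(classes[i + j + 1] - classes[i + j] for j in range(m - 1))
        counts[pat] = counts.get(pat, 0) + 1
    h = sum((k / n) * math.log(n / k) for k in counts.values())
    return h / math.log((2 * c - 1) ** (m - 1))  # normalize to [0, 1]
```

The hierarchical variant applies this measure to low- and high-frequency subsequences of the signal at several decomposition levels.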
8 pages, 226 KiB  
Article
Multimodel Approaches Are Not the Best Way to Understand Multifactorial Systems
by Benjamin M. Bolker
Entropy 2024, 26(6), 506; https://doi.org/10.3390/e26060506 - 11 Jun 2024
Abstract
Information-theoretic (IT) and multi-model averaging (MMA) statistical approaches are widely used but suboptimal tools for pursuing a multifactorial approach (also known as the method of multiple working hypotheses) in ecology. (1) Conceptually, IT encourages ecologists to perform tests on sets of artificially simplified models. (2) MMA improves on IT model selection by implementing a simple form of shrinkage estimation (a way to make accurate predictions from a model with many parameters relative to the amount of data, by “shrinking” parameter estimates toward zero). However, other shrinkage estimators such as penalized regression or Bayesian hierarchical models with regularizing priors are more computationally efficient and better supported theoretically. (3) In general, the procedures for extracting confidence intervals from MMA are overconfident, providing overly narrow intervals. If researchers want to use limited data sets to accurately estimate the strength of multiple competing ecological processes along with reliable confidence intervals, the current best approach is to use full (maximal) statistical models (possibly with Bayesian priors) after making principled, a priori decisions about model complexity. Full article
23 pages, 3411 KiB  
Article
Code Similarity Prediction Model for Industrial Management Features Based on Graph Neural Networks
by Zhenhao Li, Hang Lei, Zhichao Ma and Fengyun Zhang
Entropy 2024, 26(6), 505; https://doi.org/10.3390/e26060505 - 9 Jun 2024
Abstract
The code of industrial management software typically features few system API calls and a high number of customized variables and structures. This makes the similarity of such codes difficult to compute using text features or traditional neural network methods. In this paper, we propose an FSPS-GNN model, which is based on graph neural networks (GNNs), to address this problem. The model categorizes code features into two types, outer graph and inner graph, and conducts training and prediction with four stages—feature embedding, feature enhancement, feature fusion, and similarity prediction. Moreover, differently structured GNNs were used in the embedding and enhancement stages, respectively, to increase the interaction of code features. Experiments with code from three open-source projects demonstrate that the model achieves an average precision of 87.57% and an F0.5 Score of 89.12%. Compared to existing similarity-computation models based on GNNs, this model exhibits a Mean Squared Error (MSE) that is approximately 0.0041 to 0.0266 lower and an F0.5 Score that is 3.3259% to 6.4392% higher. It broadens the application scope of GNNs and offers additional insights for the study of code-similarity issues. Full article
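The basic operation the four FSPS-GNN stages build on is message passing over a code graph. The one-round mean-aggregation sketch below is our simplification for orientation, not the paper's layers:

```python
def message_pass(features, edges):
    """One round of mean-aggregation message passing on an undirected graph.
    features: list of per-node feature vectors; edges: (u, v) index pairs."""
    neigh = {i: [] for i in range(len(features))}
    for u, v in edges:
        neigh[u].append(v)
        neigh[v].append(u)
    out = []
    for i, f in enumerate(features):
        msgs = [features[j] for j in neigh[i]] or [f]  # isolated node keeps itself
        # Update: average the node's feature with the mean of its neighbors'
        out.append([(fi + sum(m[k] for m in msgs) / len(msgs)) / 2
                    for k, fi in enumerate(f)])
    return out
```

Stacking such rounds lets a node's representation absorb structure from progressively larger neighborhoods, which is what makes graph features of custom variables comparable across code bases.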
11 pages, 282 KiB  
Article
Derivation of Bose’s Entropy Spectral Density from the Multiplicity of Energy Eigenvalues
by Arnaldo Spalvieri
Entropy 2024, 26(6), 504; https://doi.org/10.3390/e26060504 - 9 Jun 2024
Abstract
The modern textbook analysis of the thermal state of photons inside a three-dimensional reflective cavity is based on the three quantum numbers that characterize the photon's energy eigenvalues that emerge when the boundary conditions are imposed. The crucial passage from the quantum numbers to the continuous frequency is made by introducing a three-dimensional continuous version of the three discrete quantum numbers, which leads to the energy spectral density and to the entropy spectral density. This standard analysis obscures the role of the multiplicity of energy eigenvalues associated with the same eigenfrequency. In this paper, we review past derivations of Bose's entropy spectral density and present a new analysis of the energy spectral density and entropy spectral density based on the multiplicity of energy eigenvalues. Our analysis explicitly defines the eigenfrequency distribution of energy and entropy and uses it as the starting point for the passage from the discrete eigenfrequencies to the continuous frequency. Full article
(This article belongs to the Section Thermodynamics)
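For orientation, the standard continuous-frequency passage the abstract alludes to can be written in two lines (a textbook sketch, with V = L³ the cavity volume and two polarization states per mode):

```latex
% Eigenfrequencies \nu = \frac{c}{2L}\sqrt{n_1^2 + n_2^2 + n_3^2} fill one
% octant of a sphere in (n_1, n_2, n_3) space; with two polarizations per mode,
g(\nu)\,\mathrm{d}\nu
  = 2 \cdot \tfrac{1}{8} \cdot 4\pi n^2\,\mathrm{d}n
  = \frac{8\pi V}{c^3}\,\nu^2\,\mathrm{d}\nu ,
  \qquad n = \frac{2L\nu}{c},\quad V = L^3 .
```

The paper's point is that this continuum step hides how many discrete eigenvalue triples share each eigenfrequency, which is exactly the multiplicity it puts back at the center of the derivation.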
15 pages, 283 KiB  
Article
Refinements and Extensions of Ziv’s Model of Perfect Secrecy for Individual Sequences
by Neri Merhav
Entropy 2024, 26(6), 503; https://doi.org/10.3390/e26060503 - 9 Jun 2024
Abstract
We refine and extend Ziv’s model and results regarding perfectly secure encryption of individual sequences. According to this model, the encrypter and the legitimate decrypter share a common secret key that is not shared with the unauthorized eavesdropper. The eavesdropper is aware of the encryption scheme and has some prior knowledge concerning the individual plaintext source sequence. This prior knowledge, combined with the cryptogram, is harnessed by the eavesdropper, who implements a finite-state machine as a mechanism for accepting or rejecting attempted guesses of the plaintext source. The encryption is considered perfectly secure if the cryptogram does not provide any new information to the eavesdropper that may enhance their knowledge concerning the plaintext beyond their prior knowledge. Ziv has shown that the key rate needed for perfect secrecy is essentially lower bounded by the finite-state compressibility of the plaintext sequence, a bound that is clearly asymptotically attained through Lempel–Ziv compression followed by one-time pad encryption. In this work, we consider some more general classes of finite-state eavesdroppers and derive the respective lower bounds on the key rates needed for perfect secrecy. These bounds are tighter and more refined than Ziv’s bound, and they are attained using encryption schemes that are based on different universal lossless compression schemes. We also extend our findings to the case where side information is available to the eavesdropper and the legitimate decrypter but may or may not be available to the encrypter. Full article
(This article belongs to the Collection Feature Papers in Information Theory)
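Ziv's attainability argument (Lempel-Ziv compression followed by a one-time pad, so the key rate tracks the plaintext's compressibility) can be illustrated with a toy sketch. Here `zlib` stands in for the LZ compressor and the function names are ours:

```python
import os
import zlib

def encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    """LZ-compress, then apply a one-time pad of matching length."""
    compressed = zlib.compress(plaintext, 9)  # Lempel-Ziv-style compressor
    key = os.urandom(len(compressed))         # one fresh key byte per output byte
    cipher = bytes(a ^ b for a, b in zip(compressed, key))
    return cipher, key

def decrypt(cipher: bytes, key: bytes) -> bytes:
    return zlib.decompress(bytes(a ^ b for a, b in zip(cipher, key)))

msg = b"abracadabra " * 64
cipher, key = encrypt(msg)
assert decrypt(cipher, key) == msg
assert len(key) < len(msg)  # redundant text needs fewer key bits than plaintext
```

The pad makes the cryptogram statistically independent of the compressed output, while compression shortens the key to roughly the sequence's finite-state compressibility, which is the bound the paper refines for stronger eavesdropper classes.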
9 pages, 599 KiB  
Article
Modelling Heterogeneous Anomalous Dynamics of Radiation-Induced Double-Strand Breaks in DNA during Non-Homologous End-Joining Pathway
by Nickolay Korabel, John W. Warmenhoven, Nicholas T. Henthorn, Samuel Ingram, Sergei Fedotov, Charlotte J. Heaven, Karen J. Kirkby, Michael J. Taylor and Michael J. Merchant
Entropy 2024, 26(6), 502; https://doi.org/10.3390/e26060502 - 8 Jun 2024
Abstract
The process of end-joining during nonhomologous repair of DNA double-strand breaks (DSBs) after radiation damage is considered. Experimental evidence has revealed that the dynamics of DSB ends exhibit subdiffusive motion rather than simple diffusion with rare directional movement. Traditional models often overlook the rare long-range directed motion. To address this limitation, we present a heterogeneous anomalous diffusion model consisting of subdiffusive fractional Brownian motion interchanged with short periods of long-range movement. Our model sheds light on the underlying mechanisms of heterogeneous diffusion in DSB repair and could be used to quantify the DSB dynamics on a time scale inaccessible to single particle tracking analysis. The model predicts that the long-range movement of DSB ends is responsible for the misrepair of DSBs in the form of dicentric chromosome lesions. Full article
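The two-state motion the model describes (long stretches of confined motion interchanged with rare long-range directed runs) can be caricatured in a few lines. Plain Gaussian jitter stands in for the paper's subdiffusive fractional Brownian motion, and every parameter below is an illustrative assumption:

```python
import random

def two_state_trajectory(n_steps, p_switch=0.01, run_len=20, seed=1):
    """Toy 1-D DSB-end track: confined jitter broken by rare directed runs."""
    random.seed(seed)
    x, path, i = 0.0, [], 0
    while i < n_steps:
        if random.random() < p_switch:
            v = random.choice([-1.0, 1.0])   # rare long-range directed run
            for _ in range(min(run_len, n_steps - i)):
                x += v
                path.append(x)
                i += 1
        else:
            x += random.gauss(0.0, 0.3)      # stand-in for subdiffusive motion
            path.append(x)
            i += 1
    return path
```

Even in this caricature, the rare runs dominate the long-time displacement, which is the mechanism the paper links to DSB ends meeting the wrong partner and forming dicentric lesions.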
27 pages, 3937 KiB  
Article
Simultaneous Optimization and Integration of Multiple Process Heat Cascade and Site Utility Selection for the Design of a New Generation of Sugarcane Biorefinery
by Victor Fernandes Garcia and Adriano Viana Ensinas
Entropy 2024, 26(6), 501; https://doi.org/10.3390/e26060501 - 8 Jun 2024
Abstract
Biorefinery plays a crucial role in the decarbonization of the current economic model, but its high investments and costs make its products less competitive. Identifying the best technological route to maximize operational synergies is crucial for its viability. This study presents a new superstructure model based on mixed integer linear programming to identify an ideal biorefinery configuration. The proposed formulation considers process selection and scale adjustment, utility selection, and heat integration through the integration of heat cascades from different processes. The formulation is tested in a study that evaluates the impact of new technologies on the energy efficiency and total annualized cost of a sugarcane biorefinery. As a result, the energy efficiency of the biorefinery increased from 50.25% to 74.5% with methanol production through bagasse gasification, mainly due to its high heat availability that can be transferred to the distillery, which made it possible to shift the bagasse flow from the cogeneration to the gasification process. Additionally, the production of DME yields outcomes comparable to methanol production. However, CO2 hydrogenation negatively impacts profitability and energy efficiency due to its significant electricity consumption and cost. Nonetheless, it is advantageous for surface power density as it increases biofuel production without expanding the biomass area. Full article
(This article belongs to the Special Issue Thermodynamic Optimization of Industrial Energy Systems)
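The structure of the selection problem (binary technology choices constrained by a shared heat balance) can be shown with a brute-force miniature. A real superstructure model uses MILP with continuous scales and heat cascades; here the process names and all numbers are invented for illustration:

```python
from itertools import product

# Toy superstructure selection: choose the subset of candidate processes that
# maximizes product value while the shared heat cascade stays balanced.
techs = {                          # (net heat demand (+) / surplus (-), value)
    "cogeneration":      (-30.0, 10.0),
    "gasification":      (-10.0, 40.0),
    "distillery":        ( 25.0, 35.0),
    "co2_hydrogenation": ( 20.0, 15.0),
}
best, best_val = None, float("-inf")
for choice in product([0, 1], repeat=len(techs)):
    heat = sum(on * techs[t][0] for on, t in zip(choice, techs))
    value = sum(on * techs[t][1] for on, t in zip(choice, techs))
    if heat <= 0 and value > best_val:   # feasible: no unmet heat demand
        best, best_val = choice, value
```

In this miniature the heat surplus of gasification is what makes the heat-demanding distillery selectable, mirroring the synergy the paper reports between bagasse gasification and the distillery.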
28 pages, 4312 KiB  
Article
Intermediate Judgments and Trust in Artificial Intelligence-Supported Decision-Making
by Scott Humr and Mustafa Canan
Entropy 2024, 26(6), 500; https://doi.org/10.3390/e26060500 - 8 Jun 2024
Abstract
Human decision-making is increasingly supported by artificial intelligence (AI) systems. From medical imaging analysis to self-driving vehicles, AI systems are becoming organically embedded in a host of different technologies. However, incorporating such advice into decision-making entails a human rationalization of AI outputs for supporting beneficial outcomes. Recent research suggests intermediate judgments in the first stage of a decision process can interfere with decisions in subsequent stages. For this reason, we extend this research to AI-supported decision-making to investigate how intermediate judgments on AI-provided advice may influence subsequent decisions. In an online experiment (N = 192), we found a consistent bolstering effect in trust for those who made intermediate judgments over those who did not. Furthermore, violations of total probability were observed at all timing intervals throughout the study. We further analyzed the results by demonstrating how quantum probability theory can model these types of behaviors in human–AI decision-making and deepen the understanding of the interaction dynamics at the confluence of human factors and information features. Full article
(This article belongs to the Section Quantum Information)
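What a "violation of total probability" means can be stated in a few lines of arithmetic. All numbers below are made up for illustration; only the identity itself is standard:

```python
# Classical law of total probability for trust T after intermediate judgment J:
#   P(T) = P(T|J) P(J) + P(T|not J) P(not J)
# Quantum probability models allow the directly measured P(T), taken without
# eliciting the judgment, to deviate from this sum (an interference effect).
p_j = 0.6                        # P(advice judged reliable)      (made up)
p_t_given_j, p_t_given_nj = 0.8, 0.3
classical = p_t_given_j * p_j + p_t_given_nj * (1.0 - p_j)
observed = 0.7                   # hypothetical trust measured without judging
violation = observed - classical
print(classical, violation)      # a nonzero violation breaks total probability
```

A classical model forces the violation to zero; observing a systematic nonzero value at every timing interval is what motivates the quantum-probability treatment.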
28 pages, 3502 KiB  
Review
On Casimir and Helmholtz Fluctuation-Induced Forces in Micro- and Nano-Systems: Survey of Some Basic Results
by Daniel Dantchev
Entropy 2024, 26(6), 499; https://doi.org/10.3390/e26060499 - 7 Jun 2024
Abstract
Fluctuations are omnipresent; they exist in any matter, due either to its quantum nature or to its nonzero temperature. In the current review, we briefly cover the quantum electrodynamic Casimir (QED) force as well as the critical Casimir (CC) and Helmholtz (HF) forces. In the QED case, the medium is usually a vacuum and the massless excitations are photons, while in the CC and HF cases the medium is usually a critical or correlated fluid and the fluctuations of the order parameter are the cause of the force between the macroscopic or mesoscopic bodies immersed in it. We discuss the importance of the presented results for nanotechnology, especially for devising and assembling micro- or nano-scale systems. Several important problems for nanotechnology following from the currently available experimental findings are spelled out, and possible strategies for overcoming them are sketched. Regarding the example of HF, we explicitly demonstrate that when a given integral quantity characterizing the fluid is conserved, it has an essential influence on the behavior of the corresponding fluctuation-induced force. Full article
(This article belongs to the Collection Foundations of Statistical Mechanics)
17 pages, 463 KiB  
Article
A Semiparametric Bayesian Approach to Heterogeneous Spatial Autoregressive Models
by Ting Liu, Dengke Xu and Shiqi Ke
Entropy 2024, 26(6), 498; https://doi.org/10.3390/e26060498 - 7 Jun 2024
Abstract
Many semiparametric spatial autoregressive (SSAR) models have been used to analyze spatial data in a variety of applications; however, heteroscedasticity commonly occurs in spatial data analysis. Therefore, the SSAR models considered in this paper allow the variance parameters to depend on the explanatory variables; these are called heterogeneous semiparametric spatial autoregressive models. In order to estimate the model parameters, a Bayesian estimation method is proposed for heterogeneous SSAR models based on B-spline approximations of the nonparametric function. Then, we develop an efficient Markov chain Monte Carlo sampling algorithm on the basis of the Gibbs sampler and Metropolis–Hastings algorithm that can be used to generate posterior samples from posterior distributions and perform posterior inference. Finally, some simulation studies and a real data analysis of the Boston housing data demonstrate the excellent performance of the proposed Bayesian method. Full article
(This article belongs to the Special Issue Markov Chain Monte Carlo for Bayesian Inference)
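The Metropolis–Hastings step at the heart of such a sampler fits in a dozen lines. The 1-D random-walk sketch below is a generic building block, not the paper's full Gibbs-within-MH scheme, and all names are ours:

```python
import math
import random

def metropolis_hastings(log_post, x0, n_iter=5000, step=1.0, seed=0):
    """Random-walk Metropolis sampler for a 1-D log-posterior."""
    random.seed(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(n_iter):
        prop = x + random.gauss(0.0, step)     # symmetric proposal
        lp_prop = log_post(prop)
        if math.log(random.random()) < lp_prop - lp:   # accept or reject
            x, lp = prop, lp_prop
        samples.append(x)
    return samples

# Target: a standard normal log-posterior; after burn-in the mean is near 0.
draws = metropolis_hastings(lambda t: -0.5 * t * t, x0=3.0)
mean = sum(draws[2000:]) / len(draws[2000:])
```

In a Gibbs scheme, each block of parameters with a non-conjugate conditional (here, those entering the heteroscedastic variance function) is updated by one such step while the conjugate blocks are drawn directly.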
16 pages, 2640 KiB  
Article
Finite-Time Dynamics of an Entanglement Engine: Current, Fluctuations and Kinetic Uncertainty Relations
by Jeanne Bourgeois, Gianmichele Blasi, Shishir Khandelwal and Géraldine Haack
Entropy 2024, 26(6), 497; https://doi.org/10.3390/e26060497 - 7 Jun 2024
Abstract
Entanglement engines are autonomous quantum thermal machines designed to generate entanglement from the presence of a particle current flowing through the device. In this work, we investigate the functioning of a two-qubit entanglement engine beyond the steady-state regime. Within a master equation approach, we derive the time-dependent state, the particle current, as well as the associated current correlation functions. Our findings establish a direct connection between coherence and internal current, elucidating the existence of a critical current that serves as an indicator for entanglement in the steady state. We then apply our results to investigate kinetic uncertainty relations (KURs) at finite times. We demonstrate that there is more than one possible definition for KURs at finite times. Although the two definitions agree in the steady-state regime, they lead to different parameter ranges for violating KUR at finite times. Full article
(This article belongs to the Special Issue Advances in Quantum Thermodynamics)
12 pages, 588 KiB  
Article
Purported Self-Organized Criticality of the Cardiovascular Function: Methodological Considerations for Zipf’s Law Analysis
by Jacques-Olivier Fortrat
Entropy 2024, 26(6), 496; https://doi.org/10.3390/e26060496 - 7 Jun 2024
Abstract
Self-organized criticality is a universal theory for dynamical systems that has recently been applied to the cardiovascular system. Precise methodological approaches are essential for understanding the dynamics of cardiovascular self-organized criticality. This study examines how the duration and quality of data recording affect the analysis of cardiovascular self-organized criticality, with a focus on the beat-by-beat heart rate variability time series obtained from seven healthy subjects in a standing position. Drawing a Zipf diagram, we evaluated the distribution of cardiovascular events of bradycardia and tachycardia. We identified tipping points for the distribution of both bradycardia and tachycardia events. By varying the recording durations (1, 2, 5, 10, 20, 30, and 40 min) and sampling frequencies (500, 250, and 100 Hz), we investigated their influence on the observed distributions. While shorter recordings can effectively capture cardiovascular events, they may underestimate the variables describing their distribution. Additionally, the tipping point of the Zipf distribution differs between bradycardia and tachycardia events. Comparisons of the distribution of bradycardia and tachycardia events should be conducted using long data recordings. Utilizing devices with lower sampling frequencies may compromise data fidelity. These insights contribute to refining experimental protocols and advancing our understanding of the complex dynamics underlying cardiovascular regulation. Full article
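The Zipf diagram the study relies on is just a rank-size plot in log-log coordinates. A minimal sketch (the function name is ours; event magnitudes here are synthetic, not heart-rate data):

```python
import math

def zipf_rank_size(events):
    """Log-log (rank, size) pairs for a list of event magnitudes, as plotted
    when drawing a Zipf diagram of bradycardia/tachycardia distributions."""
    sizes = sorted(events, reverse=True)
    return [(math.log(rank), math.log(size))
            for rank, size in enumerate(sizes, start=1) if size > 0]

# An exact power law s(r) = 1/r gives a straight line of slope -1 in log-log.
pts = zipf_rank_size([1.0 / r for r in range(1, 101)])
slope = (pts[-1][1] - pts[0][1]) / (pts[-1][0] - pts[0][0])
```

A tipping point of the kind the study reports shows up as a change of slope partway along this line, which is why short recordings, with few large events, can misplace it.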