Entropy, Volume 26, Issue 2 (February 2024) – 79 articles

Cover Story: A fundamental feature underpinning phenomenological thermodynamics is that we accept limited knowledge and lack of control over all degrees of freedom of a system. However, modern quantum thermodynamics, which grows from information theory, requires the opposite. To bridge the two, we propose an intermediate route inspired by gauge theories known from fundamental physics, treating certain information content as the gauge. Consequently, we arrive at different notions of work and heat for quantum systems, consistent with classical intuition. In particular, we exhibit heat present in closed quantum systems, which we associate with energy that cannot be used to perform work due to insufficient control over the system.
  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the tables of contents of newly released issues.
  • PDF is the official format for papers published in both HTML and PDF forms. To view a paper in PDF format, click on the "PDF Full-text" link and use the free Adobe Reader to open it.
17 pages, 3959 KiB  
Article
A Eulerian Numerical Model to Predict the Enhancement Effect of the Gravity-Driven Motion Melting Process for Latent Thermal Energy Storage
by Shen Tian, Bolun Tan, Yuchen Lin, Tieying Wang and Kaiyong Hu
Entropy 2024, 26(2), 175; https://doi.org/10.3390/e26020175 - 19 Feb 2024
Abstract
Latent thermal energy storage (LTES) devices can efficiently store renewable energy in thermal form and guarantee a stable-temperature thermal energy supply. The gravity-driven motion melting (GDMM) process improves the overall melting rate for packaged phase-change material (PCM) by constructing an enhanced flow field in the liquid phase. However, due to the complex mechanisms involved in fluid–solid coupling and liquid–solid phase transition, numerical simulation studies that demonstrate physical details are necessary. In this study, a simplified numerical model based on the Eulerian method is proposed. We aimed to introduce a fluid deformation yield stress equation to the “solid phase” based on the Bingham fluid assumption. As a result, fluid–solid coupling and liquid–solid phase transition processes become continuously solvable. The proposed model is validated by the referenced experimental measurements. The enhanced performance of liquid-phase convection and the macroscopic settling of the “solid phase” are numerically analyzed. The results indicate that the enhanced liquid-phase fluidity allows for a stronger heat transfer process than natural convection for the pure liquid phase. The gravity-driven pressure difference is directly proportional to the vertical melting rate, which indicates the feasibility of controlling the pressure difference to improve the melting rate. Full article
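A common way to make a yield-stress "solid phase" continuously solvable within a single Eulerian momentum equation, as this abstract describes, is a regularized Bingham effective viscosity. The sketch below is a generic regularization with placeholder parameter values, not necessarily the paper's exact yield-stress equation:

```python
def bingham_effective_viscosity(shear_rate, mu=1.0, tau_y=50.0, eps=1e-3):
    """Regularized Bingham-fluid viscosity mu_eff = mu + tau_y / (|gamma_dot| + eps).

    Below the yield stress (vanishing shear rate) the effective viscosity
    becomes very large, so the material behaves as an almost-rigid solid;
    at high shear rates it relaxes toward the plastic viscosity mu.
    mu, tau_y, and eps are illustrative placeholders.
    """
    return mu + tau_y / (abs(shear_rate) + eps)
```

With this regularization the same momentum equation covers both the melt and the settling "solid", which is what makes fluid-solid coupling continuously solvable on a fixed Eulerian grid.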
13 pages, 1214 KiB  
Review
Solvation Thermodynamics and Its Applications
by Arieh Ben-Naim
Entropy 2024, 26(2), 174; https://doi.org/10.3390/e26020174 - 18 Feb 2024
Abstract
In this article, we start by describing a few “definitions” of the solvation processes, which were used in the literature until about 1980. Then, we choose one of these definitions and show that it has a simple molecular interpretation. This fact led to a new definition of the solvation process and the corresponding thermodynamic quantities. The new measure of the solvation Gibbs energy has a simple interpretation. In addition, the thermodynamic quantities associated with the new solvation process have several other advantages over the older measures. These will be discussed briefly in the third section. In the fourth section, we discuss a few applications of the new solvation process. Full article
(This article belongs to the Special Issue Solvation Thermodynamics and Its Applications)
28 pages, 6126 KiB  
Article
Gas Kinetic Scheme Coupled with High-Speed Modifications for Hypersonic Transition Flow Simulations
by Chengrui Li, Wenwen Zhao, Hualin Liu, Youtao Xue, Yuxin Yang and Weifang Chen
Entropy 2024, 26(2), 173; https://doi.org/10.3390/e26020173 - 18 Feb 2024
Abstract
The issue of hypersonic boundary layer transition prediction is a critical aerodynamic concern that must be addressed during the aerodynamic design process of high-speed vehicles. In this context, we propose an advanced mesoscopic method that couples the gas kinetic scheme (GKS) with the Langtry–Menter transition model, including its three high-speed modification methods, tailored for accurate predictions of high-speed transition flows. The new method incorporates the turbulent kinetic energy term into the Maxwellian velocity distribution function, and it couples the effects of high-speed modifications on turbulent kinetic energy within the computational framework of the GKS solver. This integration elevates both the transition model and its high-speed enhancements to the mesoscopic level, enhancing the method’s predictive capability. The GKS-coupled mesoscopic method is validated through a series of test cases, including supersonic flat plate simulation, multiple hypersonic cone cases, the Hypersonic International Flight Research Experimentation (HIFiRE)-1 flight test, and the HIFiRE-5 case. The computational results obtained from these cases exhibit favorable agreement with experimental data. In comparison with the conventional Godunov method, the new approach encompasses a broader range of physical mechanisms, yielding computational results that closely align with the true physical phenomena and marking a notable elevation in computational fidelity and accuracy. This innovative method potentially satisfies the compelling demand for developing a precise and rapid method for predicting hypersonic boundary layer transition, which can be readily used in engineering applications. Full article
(This article belongs to the Special Issue Kinetic Theory-based Methods in Fluid Dynamics II)
29 pages, 7999 KiB  
Article
Wetting and Spreading Behavior of Axisymmetric Compound Droplets on Curved Solid Walls Using Conservative Phase Field Lattice Boltzmann Method
by Yue Wang and Jun-Jie Huang
Entropy 2024, 26(2), 172; https://doi.org/10.3390/e26020172 - 17 Feb 2024
Abstract
Compound droplets have received increasing attention due to their applications in several areas, including medicine and materials. Previous works mostly focused on compound droplets on planar surfaces and, as such, the effects of curved walls have not been studied thoroughly. In this paper, the influence of the properties of curved solid walls (including the shape, curvature, and contact angle) on the wetting behavior of compound droplets is explored. The axisymmetric lattice Boltzmann method, based on the conservative phase field formulation for ternary fluids, was used to numerically study the wetting and spreading of a compound droplet of the Janus type on various curved solid walls at large density ratios, focusing on whether the separation of compound droplets occurs. Several types of wall geometries were considered, including a planar wall, a concave wall with constant curvature, and a convex wall with fixed or variable curvature (specifically, a prolate or oblate spheroid). The effects of surface wettability, interfacial angles, and the density ratio (of droplet to ambient fluid) on the wetting process were also explored. In general, it was found that, under otherwise identical conditions, droplet separation is more likely on more hydrophilic walls, under larger interfacial angles (measured inside the droplet), and at larger density ratios. On convex walls, a larger radius of curvature of the surface near the droplet was found to be helpful to split the Janus droplet. On concave walls, as the radius of curvature increases from a small value, the possibility to observe droplet separation first increases and then decreases. Several phase diagrams on whether droplet separation occurs during the spreading process were produced for different kinds of walls to illustrate the influences of various factors. Full article
(This article belongs to the Special Issue Kinetic Theory-based Methods in Fluid Dynamics II)
19 pages, 318 KiB  
Article
Probability Turns Material: The Boltzmann Equation
by Lamberto Rondoni and Vincenzo Di Florio
Entropy 2024, 26(2), 171; https://doi.org/10.3390/e26020171 - 17 Feb 2024
Abstract
We review, under a modern light, the conditions that render the Boltzmann equation applicable. These are conditions that permit probability to behave like mass, thereby possessing clear and concrete content, whereas generally, this is not the case. Because science and technology are increasingly interested in small systems that violate the conditions of the Boltzmann equation, probability appears to be the only mathematical tool suitable for treating them. Therefore, Boltzmann’s teachings remain relevant, and the present analysis provides a critical perspective useful for accurately interpreting the results of current applications of statistical mechanics. Full article
(This article belongs to the Special Issue 180th Anniversary of Ludwig Boltzmann)
23 pages, 331 KiB  
Article
Memory Systems, the Epistemic Arrow of Time, and the Second Law
by David H. Wolpert and Jens Kipper
Entropy 2024, 26(2), 170; https://doi.org/10.3390/e26020170 - 16 Feb 2024
Abstract
The epistemic arrow of time is the fact that our knowledge of the past seems to be both of a different kind and more detailed than our knowledge of the future. Just like with the other arrows of time, it has often been speculated that the epistemic arrow arises due to the second law of thermodynamics. In this paper, we investigate the epistemic arrow of time using a fully formal framework. We begin by defining a memory system as any physical system whose present state can provide information about the state of the external world at some time other than the present. We then identify two types of memory systems in our universe, along with an important special case of the first type, which we distinguish as a third type of memory system. We show that two of these types of memory systems are time-symmetric, able to provide knowledge about both the past and the future. However, the third type of memory systems exploits the second law of thermodynamics, at least in all of its instances in our universe that we are aware of. The result is that in our universe, this type of memory system only ever provides information about the past. We also argue that human memory is of this third type, completing the argument. We end by scrutinizing the basis of the second law itself. This uncovers a previously unappreciated formal problem for common arguments that try to derive the second law from the “Past Hypothesis”, i.e., from the claim that the very early universe was in a state of extremely low entropy. Our analysis is indebted to prior work by one of us but expands and improves upon this work in several respects. Full article
(This article belongs to the Special Issue Time and Temporal Asymmetries)
29 pages, 971 KiB  
Article
A New Logic, a New Information Measure, and a New Information-Based Approach to Interpreting Quantum Mechanics
by David Ellerman
Entropy 2024, 26(2), 169; https://doi.org/10.3390/e26020169 - 15 Feb 2024
Abstract
The new logic of partitions is dual to the usual Boolean logic of subsets (usually presented only in the special case of the logic of propositions) in the sense that partitions and subsets are category-theoretic duals. The new information measure of logical entropy is the normalized quantitative version of partitions. The new approach to interpreting quantum mechanics (QM) is showing that the mathematics (not the physics) of QM is the linearized Hilbert space version of the mathematics of partitions. Or, putting it the other way around, the math of partitions is a skeletal version of the math of QM. The key concepts throughout this progression from logic, to logical information, to quantum theory are distinctions versus indistinctions, definiteness versus indefiniteness, or distinguishability versus indistinguishability. The distinctions of a partition are the ordered pairs of elements from the underlying set that are in different blocks of the partition and logical entropy is defined (initially) as the normalized number of distinctions. The cognate notions of definiteness and distinguishability run throughout the math of QM, e.g., in the key non-classical notion of superposition (=ontic indefiniteness) and in the Feynman rules for adding amplitudes (indistinguishable alternatives) versus adding probabilities (distinguishable alternatives). Full article
(This article belongs to the Special Issue Information-Theoretic Concepts in Physics)
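The logical-entropy measure described in this abstract can be computed directly: it is the number of "distinctions" (ordered pairs of elements in different blocks) normalized by the total number of ordered pairs. The sketch below assumes a partition given as a list of blocks over hashable elements:

```python
from itertools import product

def logical_entropy(partition):
    """Logical entropy h(pi): the number of distinctions (ordered pairs
    of elements lying in different blocks) divided by n^2 ordered pairs."""
    block_of = {x: i for i, block in enumerate(partition) for x in block}
    n = len(block_of)
    distinctions = sum(1 for x, y in product(block_of, repeat=2)
                       if block_of[x] != block_of[y])
    return distinctions / n ** 2

def logical_entropy_closed(partition):
    """Equivalent closed form: 1 - sum over blocks B of (|B|/n)^2."""
    n = sum(len(block) for block in partition)
    return 1 - sum((len(block) / n) ** 2 for block in partition)
```

For the partition {{0,1},{2,3}} of a four-element set, both forms give 0.5; the discrete partition of all singletons maximizes the measure at 1 - 1/n.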
27 pages, 395 KiB  
Article
The Circumstance-Driven Bivariate Integer-Valued Autoregressive Model
by Huiqiao Wang and Christian H. Weiß
Entropy 2024, 26(2), 168; https://doi.org/10.3390/e26020168 - 15 Feb 2024
Abstract
The novel circumstance-driven bivariate integer-valued autoregressive (CuBINAR) model for non-stationary count time series is proposed. The non-stationarity of the bivariate count process is defined by a joint categorical sequence, which expresses the current state of the process. Additional cross-dependence can be generated via cross-dependent innovations. The model can also be equipped with a marginal bivariate Poisson distribution to make it suitable for low-count time series. Important stochastic properties of the new model are derived. The Yule–Walker and conditional maximum likelihood methods are adopted to estimate the unknown parameters. The consistency of these estimators is established, and their finite-sample performance is investigated by a simulation study. The scope and application of the model are illustrated by a real-world data example on sales counts, in which a soap product sold in different stores under a common circumstance factor is investigated. Full article
(This article belongs to the Section Information Theory, Probability and Statistics)
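The thinning-based recursion underlying INAR-type count models can be illustrated with a minimal univariate Poisson INAR(1) simulator. This is a deliberately simplified sketch, not the bivariate circumstance-driven model proposed in the paper, and the parameter values are arbitrary:

```python
import numpy as np

def simulate_inar1(alpha, lam, T, seed=0):
    """Simulate a univariate INAR(1) process X_t = alpha o X_{t-1} + eps_t,
    where 'o' is binomial thinning (each of the X_{t-1} counts survives
    with probability alpha) and eps_t ~ Poisson(lam)."""
    rng = np.random.default_rng(seed)
    x = np.empty(T, dtype=int)
    x[0] = rng.poisson(lam / (1 - alpha))      # start near the stationary mean
    for t in range(1, T):
        survivors = rng.binomial(x[t - 1], alpha)  # binomial thinning
        x[t] = survivors + rng.poisson(lam)        # count innovation
    return x
```

With Poisson innovations the stationary marginal is Poisson with mean lam / (1 - alpha); a circumstance-driven extension would let alpha and lam switch with an observed categorical state sequence.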
17 pages, 874 KiB  
Article
Effect of Longitudinal Fluctuations of 3D Weizsäcker–Williams Field on Pressure Isotropization of Glasma
by Hidefumi Matsuda and Xu-Guang Huang
Entropy 2024, 26(2), 167; https://doi.org/10.3390/e26020167 - 15 Feb 2024
Abstract
We investigate the effects of boost invariance breaking on the isotropization of pressure in the glasma, using a 3+1D glasma simulation. The breaking is attributed to spatial fluctuations in the classical color charge density along the collision axis. We present numerical results for pressure and energy density at mid-rapidity and across a wider rapidity region. It is found that, despite varying longitudinal correlation lengths, the behaviors of the pressure isotropizations are qualitatively similar. The numerical results suggest that, in the initial stage, longitudinal color electromagnetic fields develop, similar to those in the boost invariant glasma. Subsequently, these fields evolve into a dilute glasma, expanding longitudinally in a manner akin to a dilute gas. We also show that the energy density at mid-rapidity exhibits a 1/τ decay in the dilute glasma stage. Full article
(This article belongs to the Special Issue Nonequilibrium Quantum Field Processes and Phenomena)
22 pages, 4753 KiB  
Article
An N-Shaped Lightweight Network with a Feature Pyramid and Hybrid Attention for Brain Tumor Segmentation
by Mengxian Chi, Hong An, Xu Jin and Zhenguo Nie
Entropy 2024, 26(2), 166; https://doi.org/10.3390/e26020166 - 15 Feb 2024
Abstract
Brain tumor segmentation using neural networks presents challenges in accurately capturing diverse tumor shapes and sizes while maintaining real-time performance. Additionally, addressing class imbalance is crucial for achieving accurate clinical results. To tackle these issues, this study proposes a novel N-shaped lightweight network that combines multiple feature pyramid paths and U-Net architectures. Furthermore, we ingeniously integrate hybrid attention mechanisms into various locations of the depth-wise separable convolution module to improve efficiency, with channel attention found to be the most effective for skip connections in the proposed network. Moreover, we introduce a combination loss function that incorporates a newly designed weighted cross-entropy loss and dice loss to effectively tackle the issue of class imbalance. Extensive experiments are conducted on four publicly available datasets, i.e., UCSF-PDGM, BraTS 2021, BraTS 2019, and MSD Task 01, to evaluate the performance of different methods. The results demonstrate that the proposed network achieves superior segmentation accuracy compared to state-of-the-art methods. The proposed network not only improves the overall segmentation performance but also provides favorable computational efficiency, making it a promising approach for clinical applications. Full article
(This article belongs to the Special Issue Methods in Artificial Intelligence and Information Processing II)
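The general shape of a combined weighted cross-entropy plus Dice loss, of the kind this abstract describes, can be sketched for a single foreground class. The class weight and mixing factor below are placeholders, not the values designed in the paper:

```python
import numpy as np

def combined_loss(probs, target, w_pos=2.0, alpha=0.5, eps=1e-7):
    """Illustrative combination of a weighted binary cross-entropy and a
    soft Dice loss; w_pos up-weights the (rare) foreground voxels and
    alpha mixes the two terms. All values are placeholders."""
    probs = np.clip(probs, eps, 1 - eps)
    # weighted cross-entropy: foreground term scaled by w_pos
    wce = -np.mean(w_pos * target * np.log(probs)
                   + (1 - target) * np.log(1 - probs))
    # soft Dice loss: 1 - overlap / total mass
    inter = np.sum(probs * target)
    dice = 1 - (2 * inter + eps) / (np.sum(probs) + np.sum(target) + eps)
    return alpha * wce + (1 - alpha) * dice
```

The cross-entropy term dominates per-voxel gradients while the Dice term directly targets region overlap, which is why such combinations are popular for imbalanced segmentation.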
16 pages, 20800 KiB  
Article
A Robust Method for the Unsupervised Scoring of Immunohistochemical Staining
by Iván Durán-Díaz, Auxiliadora Sarmiento, Irene Fondón, Clément Bodineau, Mercedes Tomé and Raúl V. Durán
Entropy 2024, 26(2), 165; https://doi.org/10.3390/e26020165 - 15 Feb 2024
Abstract
Immunohistochemistry is a powerful technique that is widely used in biomedical research and clinics; it allows one to determine the expression levels of some proteins of interest in tissue samples using color intensity due to the expression of biomarkers with specific antibodies. As such, immunohistochemical images are complex and their features are difficult to quantify. Recently, we proposed a novel method, including a first separation stage based on non-negative matrix factorization (NMF), that achieved good results. However, this method was highly dependent on the parameters that control sparseness and non-negativity, as well as on algorithm initialization. Furthermore, the previously proposed method required a reference image as a starting point for the NMF algorithm. In the present work, we propose a new, simpler and more robust method for the automated, unsupervised scoring of immunohistochemical images based on bright field. Our work is focused on images from tumor tissues marked with blue (nuclei) and brown (protein of interest) stains. The new proposed method represents a simpler approach that, on the one hand, avoids the use of NMF in the separation stage and, on the other hand, circumvents the need for a control image. This new approach determines the subspace spanned by the two colors of interest using principal component analysis (PCA) with dimension reduction. This subspace is a two-dimensional space, allowing for color vector determination by considering the point density peaks. A new scoring stage is also developed in our method that, again, avoids reference images, making the procedure more robust and less dependent on parameters. Semi-quantitative image scoring experiments using five categories exhibit promising and consistent results when compared to manual scoring carried out by experts. Full article
(This article belongs to the Section Multidisciplinary Applications)
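The PCA step described in this abstract, projecting pixel colors onto the two-dimensional subspace spanned by the two stain directions, can be sketched as follows (a generic PCA projection, not the authors' full pipeline, which also includes point-density peak detection and a scoring stage):

```python
import numpy as np

def color_subspace(pixels):
    """Project RGB pixels onto the 2-D principal subspace that captures
    the two dominant colour (stain) directions, via an SVD of the
    centred pixel matrix. Returns 2-D coordinates and the two axes."""
    X = pixels.reshape(-1, 3).astype(float)
    Xc = X - X.mean(axis=0)
    # principal axes = leading right singular vectors of the centred data
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    axes = Vt[:2]              # two dominant colour directions
    coords = Xc @ axes.T       # 2-D coordinates of every pixel
    return coords, axes
```

Once every pixel lives in this plane, the two stain vectors (blue nuclei, brown marker) can be located as density peaks in the 2-D point cloud without any reference image.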
18 pages, 4042 KiB  
Article
Federated Learning Backdoor Attack Based on Frequency Domain Injection
by Jiawang Liu, Changgen Peng, Weijie Tan and Chenghui Shi
Entropy 2024, 26(2), 164; https://doi.org/10.3390/e26020164 - 14 Feb 2024
Abstract
Federated learning (FL) is a distributed machine learning framework that enables scattered participants to collaboratively train machine learning models without revealing information to other participants. Due to its distributed nature, FL is susceptible to being manipulated by malicious clients. These malicious clients can launch backdoor attacks by contaminating local data or tampering with local model gradients, thereby damaging the global model. However, existing backdoor attacks in distributed scenarios have several vulnerabilities. For example, (1) the triggers in distributed backdoor attacks are mostly visible and easily perceivable by humans; (2) these triggers are mostly applied in the spatial domain, inevitably corrupting the semantic information of the contaminated pixels. To address these issues, this paper introduces a frequency-domain injection-based backdoor attack in FL. Specifically, by performing a Fourier transform, the trigger and the clean image are linearly mixed in the frequency domain, injecting the low-frequency information of the trigger into the clean image while preserving its semantic information. Experiments on multiple image classification datasets demonstrate that the attack method proposed in this paper is stealthier and more effective in FL scenarios compared to existing attack methods. Full article
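A generic frequency-domain blending step of the kind this abstract describes can be sketched with NumPy's FFT: mix the low-frequency amplitude spectrum of the trigger into the clean image while keeping the clean image's phase (and hence its semantics). The mixing ratio beta and the low-frequency radius are illustrative assumptions, not the paper's settings:

```python
import numpy as np

def inject_low_freq_trigger(clean, trigger, beta=0.2, radius=0.1):
    """Blend the low-frequency amplitude spectrum of a (single-channel)
    trigger image into a clean image; phase is kept from the clean image.
    beta and radius are placeholder values."""
    Fc, Ft = np.fft.fft2(clean), np.fft.fft2(trigger)
    amp_c, phase_c = np.abs(Fc), np.angle(Fc)
    amp_t = np.abs(Ft)
    # boolean mask selecting low frequencies around the spectrum centre
    h, w = clean.shape
    cy, cx = h // 2, w // 2
    yy, xx = np.ogrid[:h, :w]
    mask = np.fft.ifftshift((np.abs(yy - cy) < radius * h)
                            & (np.abs(xx - cx) < radius * w))
    # linear amplitude mix inside the low-frequency band only
    amp = np.where(mask, (1 - beta) * amp_c + beta * amp_t, amp_c)
    return np.real(np.fft.ifft2(amp * np.exp(1j * phase_c)))
```

Because only low-frequency amplitude is perturbed and the clean phase is untouched, the poisoned image stays visually close to the original, which is the stealth property the paper targets.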
11 pages, 2914 KiB  
Article
Entropic Dynamics of Mutations in SARS-CoV-2 Genomic Sequences
by Marco Favretti
Entropy 2024, 26(2), 163; https://doi.org/10.3390/e26020163 - 14 Feb 2024
Abstract
In this paper, we investigate a certain class of mutations in genomic sequences by studying the evolution of the entropy and relative entropy associated with the base frequencies of a given genomic sequence. Even if the method is, in principle, applicable to every sequence which varies randomly, the case of SARS-CoV-2 RNA genome is particularly interesting to analyze, due to the richness of the available sequence database containing more than a million sequences. Our model is able to track known features of the mutation dynamics like the Cytosine–Thymine bias, but also to reveal new features of the virus mutation dynamics. We show that these new findings can be studied using an approach that combines the mean field approximation of a Markov dynamics within a stochastic thermodynamics framework. Full article
(This article belongs to the Special Issue Entropy and Information in Biological Systems)
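The two quantities tracked in this abstract, the entropy and the relative entropy of the base frequencies of a sequence, have short direct implementations (a minimal sketch on raw A/C/G/T strings, not the paper's mean-field Markov machinery):

```python
import math

def base_entropy(seq):
    """Shannon entropy (in bits) of the A/C/G/T frequencies of a sequence."""
    n = len(seq)
    freqs = [seq.count(b) / n for b in "ACGT"]
    return -sum(p * math.log2(p) for p in freqs if p > 0)

def relative_entropy(seq, ref):
    """Kullback-Leibler divergence D(p || q) between the base frequencies
    of a sequence and those of a reference sequence (assumes q_b > 0
    wherever p_b > 0)."""
    p = [seq.count(b) / len(seq) for b in "ACGT"]
    q = [ref.count(b) / len(ref) for b in "ACGT"]
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)
```

Tracking D(p_t || p_0) of sampled genomes against a reference over time is one way to see directional drifts such as the Cytosine-to-Thymine bias the abstract mentions.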
16 pages, 365 KiB  
Article
FSN: Joint Entity and Relation Extraction Based on Filter Separator Network
by Qicai Dai, Wenzhong Yang, Fuyuan Wei, Liang He and Yuanyuan Liao
Entropy 2024, 26(2), 162; https://doi.org/10.3390/e26020162 - 12 Feb 2024
Abstract
Joint entity and relation extraction methods have attracted an increasing amount of attention recently due to their capacity to extract relational triples from intricate texts. However, most of the existing methods ignore the association and difference between the Named Entity Recognition (NER) subtask features and the Relation Extraction (RE) subtask features, which leads to an imbalance in the interaction between these two subtasks. To solve the above problems, we propose a new joint entity and relation extraction method, FSN. It contains a Filter Separator Network (FSN) module that employs a two-direction LSTM to filter and separate the information contained in a sentence and merges similar features through a splicing operation, thus solving the problem of the interaction imbalance between subtasks. In order to better extract the local feature information for each subtask, we designed a Named Entity Recognition Generation (NERG) module and a Relation Extraction Generation (REG) module by adopting the design idea of the decoder in Transformer and average pooling operations to better capture the entity boundary information in the sentence and the entity pair boundary information for each relation in the relational triple, respectively. Additionally, we propose a dynamic loss function that dynamically adjusts the learning weights of each subtask in each epoch according to the proportionality between each subtask, thus narrowing down the difference between the ideal and realistic results. We thoroughly evaluated our model on the SciERC dataset and the ACE2005 dataset. The experimental results demonstrate that our model achieves satisfactory results compared to the baseline model. Full article
23 pages, 3863 KiB  
Article
Weighted Signed Networks Reveal Interactions between US Foreign Exchange Rates
by Leixin Yang, Haiying Wang, Changgui Gu and Huijie Yang
Entropy 2024, 26(2), 161; https://doi.org/10.3390/e26020161 - 12 Feb 2024
Abstract
Correlations between exchange rates are valuable for illuminating the dynamics of international trade and the financial dynamics of countries. This paper explores the changing interactions of the US foreign exchange market based on detrended cross-correlation analysis. First, we propose an objective way to choose a time scale parameter appropriate for comparing different samples by maximizing the summed magnitude of all DCCA coefficients. We then build weighted signed networks under this optimized time scale, which can clearly display the complex relationships between different exchange rates. Our study shows negative cross-correlations have become pyramidally rare in the past three decades. Both the number and strength of positive cross-correlations have grown, paralleling the increase in global interconnectivity. The balanced strong triads are identified subsequently after the network centrality analysis. Generally, while the strong development links revealed by foreign exchange have begun to spread to Asia since 2010, Europe is still the center of world finance, with the euro and Danish krone consistently maintaining the closest balanced development relationship. Finally, we propose a fluctuation propagation algorithm to investigate the propagation pattern of fluctuations in the inferred exchange rate networks. The results show that, over time, fluctuation propagation patterns have become simpler and more predictable. Full article
(This article belongs to the Special Issue New Challenges in Contemporary Statistical Physics)
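The detrended cross-correlation (DCCA) coefficient that underlies this analysis can be sketched in a simplified textbook form, using non-overlapping windows and linear detrending (not the authors' optimized time-scale-selection pipeline):

```python
import numpy as np

def dcca_coefficient(x, y, s):
    """Detrended cross-correlation coefficient rho_DCCA(s): detrended
    covariance of the integrated profiles over windows of length s,
    normalized by the corresponding detrended variances."""
    X = np.cumsum(x - np.mean(x))          # integrated profiles
    Y = np.cumsum(y - np.mean(y))
    n = len(X)
    f_xx = f_yy = f_xy = 0.0
    for start in range(0, n - s + 1, s):   # non-overlapping windows
        t = np.arange(s)
        xs, ys = X[start:start + s], Y[start:start + s]
        rx = xs - np.polyval(np.polyfit(t, xs, 1), t)  # remove local trend
        ry = ys - np.polyval(np.polyfit(t, ys, 1), t)
        f_xx += np.mean(rx * rx)
        f_yy += np.mean(ry * ry)
        f_xy += np.mean(rx * ry)
    return f_xy / np.sqrt(f_xx * f_yy)
```

The coefficient lies in [-1, 1]; the paper's time-scale selection amounts to scanning s and keeping the value that maximizes the summed magnitude of all pairwise coefficients.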
13 pages, 327 KiB  
Article
Estimation of Weighted Extropy with Focus on Its Use in Reliability Modeling
by Muhammed Rasheed Irshad, Krishnakumar Archana, Radhakumari Maya and Maria Longobardi
Entropy 2024, 26(2), 160; https://doi.org/10.3390/e26020160 - 11 Feb 2024
Abstract
In the literature, estimation of weighted extropy is infrequently considered. In this paper, some non-parametric estimators of weighted extropy are given. The validation and comparison of the estimators are implemented with the help of simulation study and data illustration. The usefulness of the estimators is demonstrated using real data sets. Full article
(This article belongs to the Section Information Theory, Probability and Statistics)
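For a continuous nonnegative variable, the weighted extropy is commonly written as J^w(X) = −(1/2) ∫ x f(x)² dx. As a hedged sketch (not one of the paper's estimators), a simple plug-in estimate replaces f with a Gaussian kernel density estimate; the bandwidth rule, grid quadrature, and function name are our assumptions.

```python
import math
import statistics

def weighted_extropy_kde(sample, grid_n=800):
    """Plug-in estimate of J^w(X) = -1/2 * integral of x f(x)^2 dx,
    using a Gaussian KDE for f and trapezoidal quadrature."""
    n = len(sample)
    sd = statistics.pstdev(sample)
    h = 1.06 * sd * n ** (-1 / 5)            # Silverman's rule-of-thumb bandwidth
    lo, hi = min(sample) - 4 * h, max(sample) + 4 * h
    xs = [lo + (hi - lo) * i / grid_n for i in range(grid_n + 1)]

    def f(x):                                 # Gaussian kernel density estimate
        return sum(math.exp(-0.5 * ((x - s) / h) ** 2)
                   for s in sample) / (n * h * math.sqrt(2 * math.pi))

    ys = [x * f(x) ** 2 for x in xs]
    dx = (hi - lo) / grid_n
    return -0.5 * (sum(ys) - 0.5 * (ys[0] + ys[-1])) * dx   # trapezoid rule
```

For Uniform(0, 1), the exact value is −1/4; the plug-in estimate lands nearby, with some boundary bias from the KDE.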
25 pages, 7879 KiB  
Article
Entropy-Based Node Importance Identification Method for Public Transportation Infrastructure Coupled Networks: A Case Study of Chengdu
by Ziqiang Zeng, Yupeng Sun and Xinru Zhang
Entropy 2024, 26(2), 159; https://doi.org/10.3390/e26020159 - 11 Feb 2024
Viewed by 770
Abstract
Public transportation infrastructure is a typical complex coupled network, usually composed of connected bus lines and subway networks. This study proposes an entropy-based node importance identification method for this type of coupled network that is helpful for the integrated planning of urban public transport and traffic flows, as well as for enhancing network information dissemination and maintaining network resilience. The proposed method develops a systematic entropy-based metric from five centrality metrics, namely the degree centrality (DC), betweenness centrality (BC), closeness centrality (CC), eigenvector centrality (EC), and clustering coefficient (CCO). It then identifies the most important nodes in the coupled networks by considering the information entropy of each node and its neighbors. To evaluate the performance of the proposed method, a bus–subway coupled network in Chengdu, containing 10,652 nodes and 15,476 edges, is employed as a case study. Four network resilience assessment metrics, namely the maximum connectivity coefficient (MCC), network efficiency (NE), susceptibility (S), and natural connectivity (NC), were used to conduct group experiments. The experimental results demonstrate the following: (1) the multi-functional fitting analysis improves the analytical accuracy by 30% as compared to fitting with power law functions only; (2) for both CC and CCO, the entropy-based metric greatly improves important-node identification and demonstrates good network resilience. Full article
(This article belongs to the Section Complexity)
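The paper's construction combines five centralities into its entropy metric; as a simplified, hypothetical sketch of the core idea (a node's information entropy computed from its neighbors), one can weight neighbors by degree alone:

```python
import math

def neighbor_entropy(adj):
    """Local information entropy of each node from its neighbors' degrees.

    adj: dict mapping node -> set of neighbor nodes (undirected; every
    neighbor must also be a key of adj). A node whose neighbors' degrees
    are spread evenly gets higher entropy.
    """
    deg = {v: len(nbrs) for v, nbrs in adj.items()}
    ent = {}
    for v, nbrs in adj.items():
        total = sum(deg[u] for u in nbrs)
        if total == 0:                       # isolated node
            ent[v] = 0.0
            continue
        h = 0.0
        for u in nbrs:
            p = deg[u] / total               # neighbor's share of local degree mass
            h -= p * math.log2(p)
        ent[v] = h
    return ent
```

In a star graph, the hub sees three interchangeable leaves (entropy log2 3) while each leaf sees only the hub (entropy 0), matching the intuition that hubs aggregate more information.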
22 pages, 16652 KiB  
Review
Reminiscences of Half a Century of Life in the World of Theoretical Physics
by Constantino Tsallis
Entropy 2024, 26(2), 158; https://doi.org/10.3390/e26020158 - 11 Feb 2024
Viewed by 833
Abstract
Selma Lagerlöf said that culture is what remains when one has forgotten everything one had learned. With no guarantee, through my ongoing research tasks, that I will ever attain this high level of wisdom, I simply share here reminiscences that have played, during my life, an important role in my incursions into science, mainly in theoretical physics. I end by presenting some perspectives for future developments. Full article
14 pages, 307 KiB  
Article
Fried-Yennie Gauge in Pseudo-QED
by Ana Mizher, Alfredo Raya and Khépani Raya
Entropy 2024, 26(2), 157; https://doi.org/10.3390/e26020157 - 11 Feb 2024
Viewed by 684
Abstract
The Fried-Yennie gauge is a covariant gauge for which the mass-shell renormalization procedure can be performed without introducing spurious infrared divergences to the theory. It is usually applied in calculations in regular Quantum Electrodynamics (QED), but it is particularly interesting when employed in the framework of pseudo-QED (PQED), where fermions are constrained to 2 + 1 dimensions while the dynamical fields interacting with these fermions live in the bulk of a 3 + 1 space. In this context, the gauge parameter can be adjusted to match the power of the external momentum in the denominator of the photon propagator, simplifying the infrared region without the need for a photon mass. In this work, we apply this machinery, for the first time, to PQED, generalizing the procedure to calculate the self energy in arbitrary dimensions, allowing, of course, for different dimensionalities of fermions and gauge fields. Full article
(This article belongs to the Special Issue PQED: 30 Years of Reduced Quantum Electrodynamics)
16 pages, 734 KiB  
Article
Fluctuation Relation for the Dissipative Flux: The Role of Dynamics, Correlations and Heat Baths
by Xubin Lin, Lamberto Rondoni and Hong Zhao
Entropy 2024, 26(2), 156; https://doi.org/10.3390/e26020156 - 11 Feb 2024
Viewed by 788
Abstract
The fluctuation relation stands as a fundamental result in nonequilibrium statistical physics. Its derivation, particularly in the stationary state, places stringent conditions on the physical systems of interest. On the other hand, numerical analyses usually do not directly reveal any specific connection with such physical properties. This study proposes an investigation of such a connection with the fundamental ingredients of the derivation of the fluctuation relation for the dissipation, which includes the decay of correlations, in the case of heat transport in one-dimensional systems. The role of the heat baths in connection with the system’s inherent properties is then highlighted. A crucial discovery of our research is that different lattice models obeying the steady-state fluctuation relation may do so through fundamentally different mechanisms, characterizing their intrinsic nature. Systems with normal heat conduction, such as the lattice ϕ4 model, comply with the theorem after surpassing a certain observational time window, irrespective of lattice size. In contrast, systems characterized by anomalous heat conduction, such as Fermi–Pasta–Ulam–Tsingou-β and harmonic oscillator chains, require extended observation periods for theoretical alignment, particularly as the lattice size increases. In these systems, the heat bath’s fluctuations significantly influence the entire lattice, linking the system’s fluctuations with those of the bath. Here, the current autocorrelation function allows us to discern the varying conditions under which different systems satisfy the fluctuation relation. Our findings significantly expand the understanding of the stationary fluctuation relation and its broader implications in the field of nonequilibrium phenomena. Full article
(This article belongs to the Section Statistical Physics)
15 pages, 5469 KiB  
Article
A Fast Algorithm for Estimating Two-Dimensional Sample Entropy Based on an Upper Confidence Bound and Monte Carlo Sampling
by Zeheng Zhou, Ying Jiang, Weifeng Liu, Ruifan Wu, Zerong Li and Wenchao Guan
Entropy 2024, 26(2), 155; https://doi.org/10.3390/e26020155 - 10 Feb 2024
Viewed by 697
Abstract
The two-dimensional sample entropy marks a significant advance in evaluating the regularity and predictability of images in the information domain. Unlike the direct computation of sample entropy, which incurs a time complexity of O(N2) for a series of length N, the Monte Carlo-based algorithm for computing one-dimensional sample entropy (MCSampEn) markedly reduces computational costs by minimizing the dependence on N. This paper extends MCSampEn to two dimensions, referred to as MCSampEn2D. This new approach substantially accelerates the estimation of two-dimensional sample entropy, outperforming the direct method by more than a thousandfold. Despite these advancements, MCSampEn2D encounters challenges with significant errors and slow convergence rates. To counter these issues, we have incorporated an upper confidence bound (UCB) strategy in MCSampEn2D. This strategy involves assigning varied upper confidence bounds in each Monte Carlo experiment iteration to enhance the algorithm’s speed and accuracy. Our evaluation of this enhanced approach, dubbed UCBMCSampEn2D, involved the use of medical and natural image data sets. The experiments demonstrate that UCBMCSampEn2D achieves a 40% reduction in computational time compared to MCSampEn2D. Furthermore, the errors with UCBMCSampEn2D are only 30% of those observed in MCSampEn2D, highlighting its improved accuracy and efficiency. Full article
(This article belongs to the Special Issue Entropy in Biomedical Engineering II)
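The Monte Carlo idea, shown here in one dimension for brevity (the paper's 2D and UCB variants build on it), estimates the match counts A and B from randomly sampled template pairs instead of enumerating all O(N²) pairs. This sketch is ours, not the authors' code.

```python
import math
import random
import statistics

def mc_sampen(x, m=2, r=0.2, n_pairs=30000, seed=0):
    """Monte Carlo estimate of sample entropy SampEn(m, r) = -ln(A/B).

    Samples random template pairs rather than all O(N^2) of them,
    following the MCSampEn idea."""
    tol = r * statistics.pstdev(x)
    rng = random.Random(seed)
    n = len(x) - m            # valid start indices for length-(m+1) templates
    a = b = 0
    for _ in range(n_pairs):
        i, j = rng.randrange(n), rng.randrange(n)
        if i == j:
            continue          # skip degenerate self-matches
        # Chebyshev distance between the length-m templates
        if max(abs(x[i + k] - x[j + k]) for k in range(m)) <= tol:
            b += 1
            if abs(x[i + m] - x[j + m]) <= tol:
                a += 1        # the extended templates also match
    return -math.log(a / b) if a and b else float("inf")
```

A regular signal (e.g. a slow sinusoid) yields a lower estimate than random noise, as expected of sample entropy.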
24 pages, 7314 KiB  
Article
A Hybrid Cryptosystem Incorporating a New Algorithm for Improved Entropy
by Víctor Manuel Silva-García, Rolando Flores-Carapia and Manuel Alejandro Cardona-López
Entropy 2024, 26(2), 154; https://doi.org/10.3390/e26020154 - 10 Feb 2024
Viewed by 857
Abstract
Today, safeguarding sensitive content through encryption is crucial. This work presents a hybrid cryptosystem for images that employs both asymmetric and symmetric encryption. The asymmetric component involves applying the Diffie–Hellman protocol and the ElGamal cryptosystem to securely transmit two constants. These constants are necessary for the symmetrical aspect to generate dynamic permutations, substitution boxes, and round keys. Following an encryption process with fourteen rounds, the encrypted images are processed by an algorithm proposed to enhance entropy, a critical metric for assessing encryption quality. It increases the frequencies of the basic colors to achieve a histogram closely resembling a uniform distribution, but it increases the image size by approximately 8%. This improves the entropy values achieved by the hybrid cryptosystem, bringing them remarkably close to the ideal value of 8.0. In specific instances, the entropy values were elevated from 7.99926 to 8.0. The proposed method exhibits resilience against various attacks, including differential, linear, brute force, and algebraic attacks, as evaluated through the entropy, correlation, goodness of fit, Discrete Fourier Transform (DFT), Number of Pixels Change Rate (NPCR), Unified Average Changing Intensity (UACI), Avalanche Criteria (AC), contrast, energy, and homogeneity. Further, encrypted images are subjected to noise attacks with noise levels ranging from 20% to 50%, including additive, multiplicative, and occlusion noise, as well as the newly introduced χ2 noise. The noise damage is quantified using the proposed Similarity Parameter (SP), and a 3 × 3 median filter is employed to enhance the visual quality. Full article
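The entropy metric here is the Shannon entropy of the color histogram, whose ideal value for 8-bit channels is 8.0 bits per byte; a minimal computation (ours, for illustration) is:

```python
import math
from collections import Counter

def byte_entropy(data):
    """Shannon entropy (bits per byte) of a byte sequence, e.g. one
    image channel. A perfectly uniform histogram attains 8.0."""
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())
```

A uniform histogram over all 256 byte values gives exactly 8.0, while a constant channel gives 0, which is why encrypted images should score as close to 8.0 as possible.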
0 pages, 1818 KiB  
Article
Quantum Switch as a Thermodynamic Resource in the Context of Passive States
by Otavio A. D. Molitor and Łukasz Rudnicki
Entropy 2024, 26(2), 153; https://doi.org/10.3390/e26020153 - 10 Feb 2024
Viewed by 686
Abstract
In recent years, many works have explored possible advantages of indefinite causal order, with the main focus on its controlled implementation known as quantum switch. In this paper, we tackle advantages in quantum thermodynamics, studying whether quantum switch is capable of activating a passive state, either alone or with extra resources (active control state) and/or operations (measurement of the control system). By disproving the first possibility and confirming the second one, we show that quantum switch is not a thermodynamic resource in the discussed context, though it can facilitate work extraction given external resources. We discuss our findings by considering specific examples: a qubit system subject to rotations around the x and y axes in the Bloch sphere, as well as general unitaries from the U(2) group; and the system as a quantum harmonic oscillator with displacement operators, as well as with a combination of displacement and squeeze operators. Full article
(This article belongs to the Special Issue Advances in Quantum Thermodynamics)
14 pages, 394 KiB  
Article
A Multi-Information Spreading Model for One-Time Retweet Information in Complex Networks
by Kaidi Zhao, Dingding Han, Yihong Bao, Jianghai Qian and Ruiqi Yang
Entropy 2024, 26(2), 152; https://doi.org/10.3390/e26020152 - 09 Feb 2024
Viewed by 684
Abstract
In the realm of online social networks, the spreading of information is influenced by a complex interplay of factors. To explore the dynamics of one-time retweet information spreading, we propose a Susceptible–Infected–Completed (SIC) multi-information spreading model. This model captures how multiple pieces of information interact in online social networks by introducing inhibiting and enhancement factors. The SIC model considers the completed state, where nodes cease to spread a particular piece of information after transmitting it. It also takes into account the impact of past and present information received from neighboring nodes, dynamically calculating the probability of nodes spreading each piece of information at any given moment. To analyze the dynamics of multiple information pieces in various scenarios, such as mutual enhancement, partial competition, complete competition, and coexistence of competition and enhancement, we conduct experiments on BA scale-free networks and the Twitter network. Our findings reveal that competing information decreases the likelihood of its spread while cooperating information amplifies the spreading of mutually beneficial content. Furthermore, the strength of the enhancement factor between different information pieces determines their spread when competition and cooperation coexist. These insights offer a fresh perspective for understanding the patterns of information propagation in multiple contexts. Full article
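A minimal sketch of the single-message SIC mechanics follows; this is our simplification for illustration — the paper couples multiple messages through inhibition and enhancement factors, which we omit.

```python
import random

def sic_spread(adj, seeds, beta=0.3, seed=0, max_steps=100):
    """Minimal SIC (Susceptible-Infected-Completed) cascade for one message.

    Each Infected node tries once to pass the message to every Susceptible
    neighbor with probability beta, then moves to Completed and never
    spreads again (the one-time-retweet assumption).
    adj: dict mapping node -> iterable of neighbor nodes."""
    rng = random.Random(seed)
    state = {v: "S" for v in adj}
    for s in seeds:
        state[s] = "I"
    for _ in range(max_steps):
        infected = [v for v in state if state[v] == "I"]
        if not infected:
            break                       # cascade has died out
        for v in infected:
            for u in adj[v]:
                if state[u] == "S" and rng.random() < beta:
                    state[u] = "I"      # neighbor retweets next round
            state[v] = "C"              # retweeted once, done forever
    return state
```

A multi-message version would make `beta` per-message and per-node, scaled up or down by the enhancement or inhibition factors of the messages already received, as the abstract describes.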
53 pages, 9198 KiB  
Article
Theoretical Improvements in Enzyme Efficiency Associated with Noisy Rate Constants and Increased Dissipation
by Davor Juretić and Željana Bonačić Lošić
Entropy 2024, 26(2), 151; https://doi.org/10.3390/e26020151 - 09 Feb 2024
Viewed by 800
Abstract
Previous studies have revealed the extraordinarily large catalytic efficiency of some enzymes. High catalytic proficiency is an essential accomplishment of biological evolution. Natural selection led to the increased turnover number, kcat, and enzyme efficiency, kcat/KM, of uni–uni enzymes, which convert a single substrate into a single product. We added or multiplied random noise with chosen rate constants to explore the correlation between dissipation and catalytic efficiency for ten enzymes: beta-galactosidase, glucose isomerase, β-lactamases from three bacterial strains, ketosteroid isomerase, triosephosphate isomerase, and carbonic anhydrase I, II, and T200H. Our results highlight the role of biological evolution in accelerating thermodynamic evolution. The catalytic performance of these enzymes is proportional to overall entropy production—the main parameter from irreversible thermodynamics. That parameter is also proportional to the evolutionary distance of β-lactamases PC1, RTEM, and Lac-1 when natural or artificial evolution produces the optimal or maximal possible catalytic efficiency. De novo enzyme design and attempts to speed up the rate-limiting catalytic steps may profit from the described connection between kinetics and thermodynamics. Full article
(This article belongs to the Special Issue Entropy, Time and Evolution II)
17 pages, 3100 KiB  
Article
Fast Model Selection and Hyperparameter Tuning for Generative Models
by Luming Chen and Sujit K. Ghosh
Entropy 2024, 26(2), 150; https://doi.org/10.3390/e26020150 - 09 Feb 2024
Viewed by 740
Abstract
Generative models have gained significant attention in recent years. They are increasingly used to estimate the underlying structure of high-dimensional data and artificially generate various kinds of data similar to those from the real world. The performance of generative models depends critically on a good set of hyperparameters. Yet, finding the right hyperparameter configuration can be an extremely time-consuming task. In this paper, we focus on speeding up the hyperparameter search through adaptive resource allocation, early stopping underperforming candidates quickly and allocating more computational resources to promising ones by comparing their intermediate performance. The hyperparameter search is formulated as a non-stochastic best-arm identification problem where resources like iterations or training time constrained by some predetermined budget are allocated to different hyperparameter configurations. A procedure which uses hypothesis testing coupled with Successive Halving is proposed to make the resource allocation and early stopping decisions and compares the intermediate performance of generative models by their exponentially weighted Maximum Means Discrepancy (MMD). The experimental results show that the proposed method selects hyperparameter configurations that lead to a significant improvement in the model performance compared to Successive Halving for a wide range of budgets across several real-world applications. Full article
(This article belongs to the Section Information Theory, Probability and Statistics)
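The core Successive Halving loop can be sketched as below; the paper's hypothesis-testing and exponentially weighted MMD components are omitted, and the function names are ours.

```python
def successive_halving(configs, evaluate, min_resource=1, eta=3):
    """Keep the best 1/eta of the configurations each round, multiplying
    the per-configuration resource (e.g. training iterations) by eta for
    the survivors. evaluate(config, resource) -> loss, lower is better."""
    rung, resource = list(configs), min_resource
    while len(rung) > 1:
        rung.sort(key=lambda c: evaluate(c, resource))  # cheap partial runs
        rung = rung[:max(1, len(rung) // eta)]          # early-stop the rest
        resource *= eta                                 # survivors train longer
    return rung[0]
```

For generative models, `evaluate` would train the candidate for `resource` iterations and return an MMD-style discrepancy between generated and real samples, so weak configurations are abandoned after only a little training.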
33 pages, 30151 KiB  
Article
Comparison of Graph Distance Measures for Movie Similarity Using a Multilayer Network Model
by Majda Lafhel, Hocine Cherifi, Benjamin Renoust and Mohammed El Hassouni
Entropy 2024, 26(2), 149; https://doi.org/10.3390/e26020149 - 08 Feb 2024
Viewed by 880
Abstract
Graph distance measures have emerged as an effective tool for evaluating the similarity or dissimilarity between graphs. Recently, there has been a growing trend in the application of movie networks to analyze and understand movie stories. Previous studies focused on computing the distance between individual characters in narratives and identifying the most important ones. Unlike previous techniques, which often relied on representing movie stories through single-layer networks based on characters or keywords, a new multilayer network model was developed to allow a more comprehensive representation of movie stories, including character, keyword, and location aspects. To assess the similarities among movie stories, we propose a methodology that utilizes this multilayer network model and layer-to-layer distance measures, quantifying the similarity between movie networks across the components of the movie story. We explore how five graph distance measures reveal the similarity between movie stories in two respects: (i) finding the order of similarity among movies within the same genre, and (ii) classifying movie stories by genre. We select movies from various genres: sci-fi, horror, romance, and comedy. We extract movie stories from movie scripts in terms of character, keyword, and location entities, and then compute the distance between movie networks using the network portrait divergence, the network Laplacian spectra descriptor (NetLSD), network embedding as matrix factorization (NetMF), the Laplacian spectra, and the D-measure. The study shows the effectiveness of the different methods for identifying similarities within genres and for classifying movies across genres. The results suggest that the efficiency of an approach on a specific network type depends on its capacity to capture the inherent structure of that type. We propose incorporating the approach into movie recommendation systems. Full article
14 pages, 5658 KiB  
Article
Multifractal Multiscale Analysis of Human Movements during Cognitive Tasks
by Andrea Faini, Laurent M. Arsac, Veronique Deschodt-Arsac and Paolo Castiglioni
Entropy 2024, 26(2), 148; https://doi.org/10.3390/e26020148 - 08 Feb 2024
Viewed by 1089
Abstract
Continuous adaptations of the movement system to changing environments or task demands rely on superposed fractal processes exhibiting power laws, that is, multifractality. The estimators of the multifractal spectrum potentially reflect the adaptive use of perception, cognition, and action. To observe time-specific behavior in multifractal dynamics, a multiscale multifractal analysis based on DFA (MFMS-DFA) has recently been proposed and applied to cardiovascular dynamics. Here we aimed to evaluate whether MFMS-DFA allows identifying multiscale structures in the dynamics of human movements. Thirty-six (12 female) participants pedaled freely, after a metronomic initiation of the cadence at 60 rpm, against a light workload for 10 min in three conditions: cycling alone (C), cycling while playing “Tetris” on a computer alone (CT), or playing collaboratively (CTC) with another pedaling participant. Pedal revolution period (PRP) series were examined with MFMS-DFA and compared to linearized surrogates, which attested to the presence of multifractality at almost all scales. A marked alteration in multifractality when playing Tetris was evidenced at two scales, τ ≈ 16 and τ ≈ 64 s, yet less marked at τ ≈ 16 s when playing collaboratively. Playing Tetris in collaboration attenuated these alterations, especially in the best Tetris players. This observation suggests the high sensitivity of MFMS-DFA estimators to cognitive demand, extending to the assessment of skill/demand interplay from individual behavior. So, by identifying scale-dependent multifractal structures in movement dynamics, MFMS-DFA has obvious potential for examining brain–movement coordinative structures, likely with sufficient sensitivity to find echo in diagnosing disorders and monitoring the progress of diseases that affect cognition and movement control. Full article
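As a rough sketch of the multifractal-DFA machinery underlying MFMS-DFA (the multiscale extension itself scans h(q) over moving windows of scales; this simplified single-window version is ours):

```python
import numpy as np

def mfdfa_hq(x, scales, qs=(-2, 0, 2)):
    """Generalized Hurst exponents h(q) via multifractal DFA with
    order-1 detrending. A multifractal signal shows h(q) varying
    with q; monofractal white noise stays near h = 0.5 for all q."""
    prof = np.cumsum(x - np.mean(x))          # integrated profile
    fq = {q: [] for q in qs}
    for s in scales:
        n_seg = len(prof) // s
        t = np.arange(s)
        f2 = []
        for v in range(n_seg):
            seg = prof[v * s:(v + 1) * s]
            res = seg - np.polyval(np.polyfit(t, seg, 1), t)  # detrend
            f2.append(np.mean(res ** 2))      # squared fluctuation per segment
        f2 = np.array(f2)
        for q in qs:
            if q == 0:                        # q = 0 needs the log-average form
                fq[q].append(np.exp(0.5 * np.mean(np.log(f2))))
            else:
                fq[q].append(np.mean(f2 ** (q / 2)) ** (1 / q))
    logs = np.log(scales)
    # h(q) is the log-log slope of F_q(s) versus scale s
    return {q: np.polyfit(logs, np.log(fq[q]), 1)[0] for q in qs}
```

MFMS-DFA would repeat this fit over sliding scale windows, yielding a surface h(q, τ) like the scale-specific alterations at τ ≈ 16 s and τ ≈ 64 s reported above.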
12 pages, 363 KiB  
Article
Relativistic Heat Conduction in the Large-Flux Regime
by Lorenzo Gavassino
Entropy 2024, 26(2), 147; https://doi.org/10.3390/e26020147 - 08 Feb 2024
Viewed by 651
Abstract
We propose a general procedure for evaluating, directly from microphysics, the constitutive relations of heat-conducting fluids in regimes of large fluxes of heat. Our choice of hydrodynamic formalism is Carter’s two-fluid theory, which happens to coincide with Öttinger’s GENERIC theory for relativistic heat conduction. This is a natural framework, as it should correctly describe the relativistic “inertia of heat” as well as the subtle interplay between reversible and irreversible couplings. We provide two concrete applications of our procedure, where the constitutive relations are evaluated, respectively, from maximum entropy hydrodynamics and Chapman–Enskog theory. Full article
(This article belongs to the Special Issue Causal Relativistic Hydrodynamics for Viscous Fluids)
22 pages, 347 KiB  
Article
Bilocal Field Theory for Composite Scalar Bosons
by Christopher T. Hill
Entropy 2024, 26(2), 146; https://doi.org/10.3390/e26020146 - 08 Feb 2024
Cited by 1 | Viewed by 635
Abstract
We give a bilocal field theory description of a composite scalar with an extended binding potential that reduces to the Nambu–Jona-Lasinio (NJL) model in the pointlike limit. This provides a description of the internal dynamics of the bound state and features a static internal wave function, ϕ(r), in the center-of-mass frame that satisfies a Schrödinger–Klein–Gordon equation with eigenvalues m2. We analyze the “coloron” model (single perturbative massive gluon exchange), which yields a UV completion of the NJL model. This has a BCS-like enhancement of its interaction by Nc, the number of colors, and is classically critical, with gcritical remarkably close to the NJL quantum critical coupling. Negative eigenvalues for m2 lead to spontaneous symmetry breaking, and the Yukawa coupling of the bound state to constituent fermions is emergent. Full article