Entropy, Volume 27, Issue 7 (July 2025) – 93 articles

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the tables of contents of newly released issues.
  • PDF is the official format for papers published in both HTML and PDF forms. To view a paper in PDF format, click on the "PDF Full-text" link and open it with the free Adobe Reader.
14 pages, 10668 KiB  
Article
Effect of Grain Size Distribution on Frictional Wear and Corrosion Properties of (FeCoNi)₈₆Al₇Ti₇ High-Entropy Alloys
by Qinhu Sun, Pan Ma, Hong Yang, Kaiqiang Xie, Shiguang Wan, Chunqi Sheng, Zhibo Chen, Hongji Yang, Yandong Jia and Konda Gokuldoss Prashanth
Entropy 2025, 27(7), 747; https://doi.org/10.3390/e27070747 - 12 Jul 2025
Abstract
Optimizing the grain size distribution of high-entropy alloys (HEAs) is a promising design strategy for improving their wear and corrosion resistance. In this study, a (FeCoNi)₈₆Al₇Ti₇ high-entropy alloy with customized isometric and heterogeneous structures, as well as a fine-crystal isometric design, prepared by spark plasma sintering (SPS), is investigated with respect to microstructure, surface morphology, hardness, frictional wear, and corrosion resistance. The effects of the SPS process on the microstructure and mechanical behavior are elucidated, and the frictional wear and corrosion resistance of the alloys are improved through heterogeneous-structure fine-grain strengthening and uniform fine-grain strengthening. The wear and corrosion mechanisms of (FeCoNi)₈₆Al₇Ti₇ HEAs with different phase-structure designs are elaborated. This work highlights the potential of powder metallurgy to efficiently and precisely control and optimize the multi-scale microstructure of high-entropy alloys, thereby improving their frictional wear and corrosion properties in demanding applications.
(This article belongs to the Special Issue Recent Advances in High Entropy Alloys)
23 pages, 3614 KiB  
Article
A Multimodal Semantic-Enhanced Attention Network for Fake News Detection
by Weijie Chen, Yuzhuo Dang and Xin Zhang
Entropy 2025, 27(7), 746; https://doi.org/10.3390/e27070746 - 12 Jul 2025
Abstract
The proliferation of social media platforms has triggered an unprecedented increase in multimodal fake news, creating pressing challenges for content authenticity verification. Current fake news detection systems predominantly rely on isolated unimodal analysis (text or image), failing to exploit critical cross-modal correlations or to leverage latent social context cues. To bridge this gap, we introduce SCCN (Semantic-enhanced Cross-modal Co-attention Network), a novel framework that synergistically combines multimodal features with refined social graph signals. Our approach innovatively fuses text, image, and social relation features through a hierarchical fusion framework. First, we extract modality-specific features and enhance semantics by identifying entities in both text and visual data. Second, an improved co-attention mechanism selectively integrates social relations, removing irrelevant connections to reduce noise while exploring latent informative links. Finally, the model is optimized via a cross-entropy loss with entropy minimization. Experimental results on benchmark datasets (PHEME and Weibo) show that SCCN consistently outperforms existing approaches, achieving relative accuracy improvements of 1.7% and 1.6% over the best-performing baseline on each dataset.
(This article belongs to the Section Multidisciplinary Applications)
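As a rough sketch of the optimization objective named in the abstract above—cross-entropy combined with entropy minimization—the following PyTorch-style snippet may help; the weighting `lam` is a hypothetical parameter, since the abstract does not specify one.

```python
import torch
import torch.nn.functional as F

def ce_with_entropy_min(logits, labels, lam=0.1):
    """Cross-entropy plus an entropy-minimization term on the predictions.

    `lam` is a hypothetical weight; the abstract states the combination
    but not the coefficient.
    """
    ce = F.cross_entropy(logits, labels)                       # supervised term
    p = F.softmax(logits, dim=-1)
    ent = -(p * torch.log(p.clamp_min(1e-12))).sum(-1).mean()  # prediction entropy
    return ce + lam * ent
```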
32 pages, 735 KiB  
Article
Dynamic Balance: A Thermodynamic Principle for the Emergence of the Golden Ratio in Open Non-Equilibrium Steady States
by Alejandro Ruiz
Entropy 2025, 27(7), 745; https://doi.org/10.3390/e27070745 - 11 Jul 2025
Abstract
We develop a symmetry-based variational theory showing that the coarse-grained balance of work inflow to heat outflow in a driven, dissipative system relaxes to the golden ratio. Two order-2 Möbius transformations—a self-dual flip and a self-similar shift—generate a discrete non-abelian subgroup of PGL(2, ℚ(√5)). Requiring any smooth, strictly convex Lyapunov functional to be invariant under both maps enforces a single non-equilibrium fixed point: the golden mean. We confirm this result by (i) a gradient-flow partial differential equation, (ii) a birth–death Markov chain whose continuum limit is a Fokker–Planck equation, (iii) a Martin–Siggia–Rose field theory, and (iv) exact Ward identities that protect the fixed point against noise. Microscopic kinetics merely set the approach rate; three parameter-free invariants emerge: a 62%:38% split between entropy production and useful power, an RG-invariant diffusion coefficient D_α = ξ^z/τ linking relaxation time and correlation length, and a ϑ = 45° eigen-angle that maps to the golden logarithmic spiral. The same dual symmetry underlies scaling laws in rotating turbulence, plant phyllotaxis, cortical avalanches, quantum critical metals, and even de Sitter cosmology, providing a falsifiable, unifying principle for pattern formation far from equilibrium.
(This article belongs to the Section Entropy and Biology)
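The abstract does not display its two Möbius generators, but the mechanism can be illustrated with a standard example: composing the shift x ↦ x + 1 with the flip x ↦ 1/x gives x ↦ 1 + 1/x, whose positive fixed point is the golden mean. A quick numerical check (illustrative only, not the paper's construction):

```python
import numpy as np

phi = (1 + np.sqrt(5)) / 2  # golden mean, the claimed fixed point
x = 2.0
for _ in range(50):
    x = 1.0 + 1.0 / x  # iterate the composed Moebius map x -> 1 + 1/x
print(x, phi)  # both print 1.6180339887...
```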
37 pages, 2539 KiB  
Article
Enhancing s-CO₂ Brayton Power Cycle Efficiency in Cold Ambient Conditions Through Working Fluid Blends
by Paul Tafur-Escanta, Luis Coco-Enríquez, Robert Valencia-Chapi and Javier Muñoz-Antón
Entropy 2025, 27(7), 744; https://doi.org/10.3390/e27070744 - 11 Jul 2025
Abstract
Supercritical carbon dioxide (s-CO₂) Brayton cycles have emerged as a promising technology for high-efficiency power generation, owing to their compact architecture and favorable thermophysical properties. However, their performance degrades significantly under cold-climate conditions—such as those encountered in Greenland, Russia, Canada, Scandinavia, and Alaska—due to the proximity to the fluid's critical point. This study investigates the behavior of the recompression Brayton cycle (RBC) under subzero ambient temperatures through the incorporation of low-critical-temperature additives to create CO₂-based binary mixtures. The working fluids examined include methane (CH₄), tetrafluoromethane (CF₄), nitrogen trifluoride (NF₃), and krypton (Kr). Simulation results show that CH₄- and CF₄-rich mixtures can achieve thermal efficiency improvements of up to 10 percentage points over pure CO₂. NF₃-containing blends yield solid performance in moderately cold environments, while Kr-based mixtures provide modest but consistent efficiency gains. At low compressor inlet temperatures, the high-temperature recuperator (HTR) becomes the dominant performance-limiting component. Optimal distribution of recuperator conductance (UA) favors increased HTR sizing when mixtures are employed, ensuring effective heat recovery across larger temperature differentials. The study concludes with a comparative exergy analysis between pure CO₂ and mixture-based cycles in the RBC architecture. The findings highlight the potential of custom-tailored working fluids to enhance the thermodynamic performance and operational stability of s-CO₂ power systems under cold-climate conditions.
(This article belongs to the Section Thermodynamics)
21 pages, 1362 KiB  
Article
Decentralized Consensus Protocols on SO(4)^N and TSO(4)^N with Reshaping
by Eric A. Butcher and Vianella Spaeth
Entropy 2025, 27(7), 743; https://doi.org/10.3390/e27070743 - 11 Jul 2025
Abstract
Consensus protocols for a multi-agent networked system consist of strategies that align the states of all agents that share information according to a given network topology, despite challenges such as communication limitations, time-varying networks, and communication delays. The special orthogonal group SO(n) plays a key role in applications from rigid body attitude synchronization to machine learning on Lie groups, particularly in fields like physics-informed learning and geometric deep learning. In this paper, N-agent consensus protocols are proposed on the Lie group SO(4) and the corresponding tangent bundle TSO(4), in which the state spaces are SO(4)^N and TSO(4)^N, respectively. In particular, when using communication topologies such as a ring graph, for which the local stability of non-consensus equilibria is retained in the closed loop, a consensus protocol that leverages a reshaping strategy is proposed to destabilize non-consensus equilibria and produce consensus with almost global stability on SO(4)^N or TSO(4)^N. Lyapunov-based stability guarantees are obtained, and simulations are conducted to illustrate the advantages of these proposed consensus protocols.
(This article belongs to the Special Issue Lie Group Machine Learning)
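For orientation, here is a generic discrete-time consensus step on SO(n)^N using SciPy matrix logarithms and exponentials. This is only the standard Lie-group consensus pattern such protocols build on, not the paper's reshaped protocol.

```python
import numpy as np
from scipy.linalg import expm, logm

def consensus_step(rotations, adjacency, gain=0.1):
    """One generic consensus step on SO(n)^N (sketch).

    Each agent moves toward its neighbors along the Lie algebra direction
    log(R_i^T R_j); `gain` is a hypothetical step size.
    """
    updated = []
    for i, Ri in enumerate(rotations):
        xi = np.zeros_like(Ri)
        for j, Rj in enumerate(rotations):
            if adjacency[i, j]:
                xi += np.real(logm(Ri.T @ Rj))  # relative attitude error
        updated.append(Ri @ expm(gain * xi))    # stay on the manifold
    return updated
```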
26 pages, 4823 KiB  
Article
Robust Fractional Low Order Adaptive Linear Chirplet Transform and Its Application to Fault Analysis
by Junbo Long, Changshou Deng, Haibin Wang and Youxue Zhou
Entropy 2025, 27(7), 742; https://doi.org/10.3390/e27070742 - 11 Jul 2025
Abstract
Time-frequency analysis (TFA) is an important tool for analyzing non-Gaussian mechanical fault vibration signals. Against the complex background of infinite-variance process noise and Gaussian colored noise, it is difficult for traditional methods to obtain a highly concentrated time-frequency representation (TFR) of fault vibration signals. Based on the insensitivity of fractional lower-order statistics to infinite-variance and Gaussian processes, robust fractional lower-order adaptive linear chirplet transform (FLOACT) and fractional lower-order adaptive scaling chirplet transform (FLOASCT) methods are proposed in this paper to suppress such mixed complex noise. The calculation steps and flow of the algorithms are deduced and summarized in detail. The simulation results show that the improved FLOACT and FLOASCT methods perform well on multi-component signals with short frequency intervals, and even with crossing frequency trajectories, in strongly impulsive background noise. Finally, the proposed methods are applied to the feature analysis and extraction of mechanical outer-race fault vibration signals in complex background environments, and the results show that they have good estimation accuracy and effectiveness at low MSNR, indicating their robustness and adaptability.
(This article belongs to the Section Signal and Data Analysis)
12 pages, 843 KiB  
Article
Thermalization in Asymmetric Harmonic Chains
by Weicheng Fu, Sihan Feng, Yong Zhang and Hong Zhao
Entropy 2025, 27(7), 741; https://doi.org/10.3390/e27070741 - 11 Jul 2025
Abstract
The symmetry of the interparticle interaction potential (IIP) plays a critical role in determining the thermodynamic and transport properties of solids. This study investigates the isolated effect of IIP asymmetry on thermalization. Asymmetry and nonlinearity are typically intertwined; to isolate the effect of asymmetry, we introduce a one-dimensional asymmetric harmonic (AH) model whose IIP possesses asymmetry but no nonlinearity, as evidenced by energy-independent vibrational frequencies. Extensive numerical simulations confirm a power-law relationship between the thermalization time T_eq and the perturbation strength for the AH chain, revealing an exponent larger than the previously observed inverse-square law in the thermodynamic limit. Upon adding symmetric quartic nonlinearity to the AH model, we systematically study thermalization under combined asymmetry and nonlinearity. Matthiessen's rule provides a good estimate of T_eq in this case. Our results demonstrate that asymmetry plays a distinct role in enhancing higher-order effects and governing relaxation dynamics.
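In symbols, Matthiessen's rule for the combined thermalization time reads as below (our notation; the abstract names the rule without displaying a formula):

```latex
\frac{1}{T_{\mathrm{eq}}} \;\approx\;
  \frac{1}{T_{\mathrm{eq}}^{\text{asym}}} \;+\; \frac{1}{T_{\mathrm{eq}}^{\text{quartic}}}
```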
14 pages, 2812 KiB  
Perspective
The Generation of Wind Velocity via Scale Invariant Gibbs Free Energy: Turbulence Drives the General Circulation
by Adrian F. Tuck
Entropy 2025, 27(7), 740; https://doi.org/10.3390/e27070740 - 10 Jul 2025
Abstract
The mechanism for the upscale deposition of energy into the atmosphere, from molecules and photons up to organized wind systems, is examined. This analysis rests on the statistical multifractal analysis of airborne observations. The results show that the persistence of molecular velocity after collision, by breaking the continuous translational symmetry of an equilibrated gas, is causative. The symmetry breaking may be caused by excited photofragments with the associated persistence of molecular velocity after collision, by interaction with condensed-phase surfaces (solid or liquid), or, in a scaling environment, by an adjacent scale having a different velocity and temperature. The relevance of these factors to the solution of the Navier–Stokes equation in an atmospheric context is considered. The scale-invariant version of Gibbs free energy, carried by the most energetic molecules, enables the acceleration of organized flow (winds) from the smallest planetary scales by virtue of the nonlinearity of the mechanism, subject to dissipation by the more numerous average molecules maintaining an operational temperature via infrared radiation to the cold sink of space. The fastest-moving molecules also affect the transfer of infrared radiation because their higher kinetic energy and the associated more-energetic collisions contribute more to the far wings of the spectral lines, where the collisional displacement from the central energy-level gap is greatest and the lines are less self-absorbed. The relationship of events at these scales to macroscopic variables such as the thermal wind equation and its components is considered in the Discussion section. An attempt is made to synthesize the mechanisms by which winds are generated and sustained, on all scales, by appealing to published works since 2003. This synthesis produces a view of the general circulation that includes thermodynamics and the defining role of turbulence in driving it.
(This article belongs to the Section Statistical Physics)
20 pages, 1765 KiB  
Article
Can Informativity Effects Be Predictability Effects in Disguise?
by Vsevolod Kapatsinski
Entropy 2025, 27(7), 739; https://doi.org/10.3390/e27070739 - 10 Jul 2025
Abstract
Recent work in corpus linguistics has observed that informativity predicts articulatory reduction of a linguistic unit above and beyond the unit's predictability in the local context, i.e., the unit's probability given the current context. The informativity of a unit is the inverse of its average (log-scaled) predictability and corresponds to its information content. Research in the field has interpreted informativity effects as speakers being sensitive to the information content of a unit in deciding how much effort to put into pronouncing it, or as the accumulation of memories of pronunciation details in long-term memory representations. However, average predictability can improve the estimate of the local predictability of a unit above and beyond the observed predictability in that context, especially when that context is rare. Therefore, informativity can contribute to explaining variance in a dependent variable like reduction above and beyond local predictability simply because informativity improves the (inherently noisy) estimate of local predictability. This paper shows how to estimate, via simulation, the proportion of an observed informativity effect that is likely to be artifactual, i.e., due entirely to informativity improving the estimates of predictability. The proposed simulation approach can be used to investigate, in any real dataset, whether an effect of informativity is likely to be real, under the assumption that corpus probabilities are an unbiased estimate of the probabilities driving reduction behavior, and how much of it is likely to be due to noise in the predictability estimates.
(This article belongs to the Special Issue Complexity Characteristics of Natural Language)
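A toy simulation in the spirit of the argument above: reduction is generated from true local predictability only, yet informativity, being a less noisy proxy of average predictability, still earns a reliably nonzero regression coefficient once the local estimate is noisy. All names and numbers are illustrative stand-ins, not the paper's simulation.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20_000
true_lp = rng.normal(-3.0, 1.0, n)              # true local log-predictability
reduction = 1.0 + 0.5 * true_lp + rng.normal(0.0, 0.3, n)  # driven ONLY by true_lp
obs_lp = true_lp + rng.normal(0.0, 0.8, n)      # noisy corpus estimate (rare contexts)
informativity = -(true_lp + rng.normal(0.0, 0.2, n))  # less noisy averaged proxy

# OLS with both predictors: informativity "helps" although the generative
# process never referenced it -- a purely artifactual effect.
X = np.column_stack([np.ones(n), obs_lp, informativity])
beta, *_ = np.linalg.lstsq(X, reduction, rcond=None)
print(beta)  # the coefficient on informativity is reliably nonzero
```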
19 pages, 24556 KiB  
Article
Harmonic Aggregation Entropy: A Highly Discriminative Harmonic Feature Estimator for Time Series
by Ye Wang, Zhentao Yu, Cheng Chi, Bozhong Lei, Jianxin Pei and Dan Wang
Entropy 2025, 27(7), 738; https://doi.org/10.3390/e27070738 - 10 Jul 2025
Abstract
Harmonics are a common phenomenon widely present in power systems. Their presence not only increases the energy consumption of equipment but also poses hidden risks to the safety and stealth performance of large ships. Thus, there is an urgent need for a method to detect the harmonic characteristics of time series. We propose a novel harmonic feature estimation method, termed Harmonic Aggregation Entropy (HaAgEn), which effectively discriminates harmonic features against background noise. The method is based on bispectrum analysis: utilizing the distribution characteristics of harmonic signals in the bispectrum matrix, a new Diagonal Bi-directional Integral Bispectrum (DBIB) method is employed to effectively extract harmonic features within the bispectrum matrix. This approach addresses the issues associated with traditional time–frequency analysis methods, such as the large computational burden and the lack of specificity in feature extraction. The integration results of DBIB on the different frequency axes, I_x and I_y, are combined via cross-entropy to derive HaAgEn. It is verified that HaAgEn is significantly more sensitive to harmonic components in the signal than other types of entropy, thereby better addressing harmonic detection and reducing feature redundancy. The detection accuracy for harmonic components in the shaft-rate electromagnetic field signal, as evidenced by sea trial data, reaches 96.8%, significantly higher than that of other detection methods. This provides a novel technical approach to harmonic detection in industrial applications.
(This article belongs to the Section Signal and Data Analysis)
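A minimal sketch of the final step described above—combining two normalized integration profiles via cross-entropy; the DBIB integration over the bispectrum matrix itself is not reproduced here.

```python
import numpy as np

def cross_entropy_of_profiles(ix, iy, eps=1e-12):
    """Cross-entropy H(p, q) of two normalized integration profiles
    (stand-ins for the DBIB results I_x and I_y)."""
    p = ix / ix.sum()
    q = iy / iy.sum()
    return -(p * np.log(q + eps)).sum()
```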
14 pages, 1999 KiB  
Article
Asymmetric Protocols for Mode Pairing Quantum Key Distribution with Finite-Key Analysis
by Zhenhua Li, Tianqi Dou, Yuheng Xie, Weiwen Kong, Yang Liu, Haiqiang Ma and Jianjun Tang
Entropy 2025, 27(7), 737; https://doi.org/10.3390/e27070737 - 9 Jul 2025
Abstract
The mode pairing quantum key distribution (MP-QKD) protocol has attracted considerable attention for its capability to ensure high secure key rates over long distances without requiring global phase locking. However, ensuring symmetric channels for the MP-QKD protocol is challenging in practical quantum communication networks. Previous studies on the asymmetric MP-QKD protocol have relied on ideal decoy-state assumptions and infinite-key analysis, which are unattainable in real-world deployments. In this paper, we conduct a security analysis of the asymmetric MP-QKD protocol with finite-key analysis, discarding the previously impractical assumptions made in the decoy-state method. Combined with statistical fluctuation analysis, we globally optimize the 10 independent parameters of the asymmetric MP-QKD protocol using a modified particle swarm optimization algorithm. The simulation results demonstrate that our work achieves improved secure key rates and transmission distances compared to the strategy with additional attenuation. We further investigate how the intensities and probabilities of the signal, decoy, and vacuum states vary with transmission distance, facilitating their more efficient deployment in future quantum networks.
(This article belongs to the Section Quantum Information)
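The abstract mentions globally optimizing 10 protocol parameters with a modified particle swarm optimizer. For background, a plain (unmodified) PSO sketch follows; the authors' specific modification is not reproduced.

```python
import numpy as np

def pso_minimize(f, bounds, n_particles=40, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Plain particle swarm minimization over box bounds (sketch).

    bounds: sequence of (low, high) pairs, e.g. for 10 protocol parameters.
    """
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    d = len(bounds)
    x = rng.uniform(lo, hi, (n_particles, d))
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), np.array([f(p) for p in x])
    g = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, d))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)  # velocity update
        x = np.clip(x + v, lo, hi)
        vals = np.array([f(p) for p in x])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = x[better], vals[better]
        g = pbest[pbest_val.argmin()].copy()                   # global best
    return g, pbest_val.min()

# Usage sketch: best, value = pso_minimize(lambda p: ((p - 0.3) ** 2).sum(), [(0, 1)] * 10)
```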
26 pages, 3087 KiB  
Article
Pre-Warning for the Remaining Time to Alarm Based on Variation Rates and Mixture Entropies
by Zijiang Yang, Jiandong Wang, Honghai Li and Song Gao
Entropy 2025, 27(7), 736; https://doi.org/10.3390/e27070736 - 9 Jul 2025
Abstract
Alarm systems play a crucial role in industrial process safety. To support tackling an accident that is about to occur after an alarm, a pre-warning method is proposed for a special class of industrial process variables to alert operators to the remaining time to alarm. The main idea of the proposed method is to estimate the remaining time to alarm based on the variation rates and mixture entropies of qualitative trends in univariate variables. If the remaining time to alarm is no longer than the pre-warning threshold and its mixture entropy is small enough, a warning is generated to alert the operators. One challenge for the proposed method is how to determine an optimal pre-warning threshold in view of the uncertainties induced by the sample distribution of the remaining time to alarm, subject to the constraint of the required false warning rate. This challenge is addressed by utilizing Bayesian estimation theory to estimate confidence intervals for all candidate pre-warning thresholds; the optimal one is selected as the threshold whose upper confidence bound is nearest to the required false warning rate. Another challenge is how to measure the possibility of the current trend segment increasing to the alarm threshold; this is overcome by adopting the mixture entropy as a possibility measure. Numerical and industrial examples illustrate the effectiveness of the proposed method and its advantages over existing methods.
(This article belongs to the Special Issue Failure Diagnosis of Complex Systems)
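As an illustration of the kind of Bayesian interval estimation involved, here is a sketch of a credible interval for a false-warning rate under a uniform Beta(1, 1) prior; the paper's estimator for the remaining-time-to-alarm distribution is more specific than this.

```python
from scipy import stats

def false_rate_interval(n_false, n_total, level=0.95):
    """Bayesian credible interval for a false-warning rate (sketch).

    Uses a Beta(1, 1) prior; the threshold whose upper bound sits nearest
    the required rate would then be selected.
    """
    post = stats.beta(1 + n_false, 1 + n_total - n_false)
    tail = (1 - level) / 2
    return post.ppf(tail), post.ppf(1 - tail)
```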
18 pages, 70320 KiB  
Article
RIS-UNet: A Multi-Level Hierarchical Framework for Liver Tumor Segmentation in CT Images
by Yuchai Wan, Lili Zhang and Murong Wang
Entropy 2025, 27(7), 735; https://doi.org/10.3390/e27070735 - 9 Jul 2025
Abstract
The deep learning-based analysis of liver CT images is expected to assist clinicians in the diagnostic decision-making process. However, the accuracy of existing methods still falls short of clinical requirements and needs further improvement. In this work, we therefore propose a novel multi-level hierarchical framework for liver tumor segmentation. At the first level, we integrate inter-slice spatial information via a 2.5D network to resolve the accuracy–efficiency trade-off inherent in conventional 2D/3D segmentation strategies. The second level then extracts intra-slice global and local features to enhance feature representation: we propose the Res-Inception-SE block, which combines residual connections, multi-scale Inception modules, and squeeze-and-excitation attention to capture comprehensive global and local features. Furthermore, we design a hybrid loss function combining binary cross-entropy (BCE) and Dice loss to address the class imbalance problem and accelerate convergence. Extensive experiments on the LiTS17 dataset demonstrate the effectiveness of our method in terms of accuracy, efficiency, and visual quality for liver tumor segmentation.
(This article belongs to the Special Issue Cutting-Edge AI in Computational Bioinformatics)
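A minimal sketch of a BCE-plus-Dice hybrid loss of the kind described, assuming NCHW logits and binary masks; the weighting `bce_weight` is a hypothetical parameter.

```python
import torch
import torch.nn.functional as F

def bce_dice_loss(logits, target, bce_weight=0.5, eps=1e-6):
    """Hybrid loss: binary cross-entropy plus soft Dice (sketch)."""
    bce = F.binary_cross_entropy_with_logits(logits, target)  # stable BCE on logits
    probs = torch.sigmoid(logits)
    inter = (probs * target).sum(dim=(1, 2, 3))
    union = probs.sum(dim=(1, 2, 3)) + target.sum(dim=(1, 2, 3))
    dice = 1.0 - ((2.0 * inter + eps) / (union + eps)).mean()  # soft Dice loss
    return bce_weight * bce + (1.0 - bce_weight) * dice
```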
17 pages, 7786 KiB  
Article
Video Coding Based on Ladder Subband Recovery and ResGroup Module
by Libo Wei, Aolin Zhang, Lei Liu, Jun Wang and Shuai Wang
Entropy 2025, 27(7), 734; https://doi.org/10.3390/e27070734 - 8 Jul 2025
Abstract
With the rapid development of video encoding technology in the field of computer vision, the demand for tasks such as video frame reconstruction, denoising, and super-resolution has been continuously increasing. However, traditional video encoding methods typically focus on extracting spatial- or temporal-domain information and often face insufficient accuracy and information loss when reconstructing high-frequency details, edges, and textures. To address this issue, this paper proposes an innovative LadderConv framework, which combines the discrete wavelet transform (DWT) with spatial and channel attention mechanisms and enhances video frame encoding quality by progressively recovering wavelet subbands. Specifically, the LadderConv framework adopts a stepwise recovery approach for the wavelet subbands: it first processes the high-frequency detail subbands, which carry relatively little information, then enhances the interaction between these subbands, and ultimately synthesizes a high-quality reconstructed image through the inverse wavelet transform. Moreover, the framework introduces spatial and channel attention mechanisms, which further strengthen the focus on key regions and channel features, leading to notable improvements in detail restoration and image reconstruction accuracy. To optimize the performance of the LadderConv framework, particularly in detail recovery and high-frequency information extraction, this paper also designs an innovative ResGroup module. By using multi-layer convolution operations along with feature map compression and recovery, the ResGroup module enhances the network's expressive capability and effectively reduces computational complexity. It captures multi-level features from low to high level and retains rich feature information through residual connections, improving the overall reconstruction performance of the model. In experiments, the combination of the LadderConv framework and the ResGroup module demonstrates superior performance in video frame reconstruction tasks, particularly in recovering high-frequency information, image clarity, and detail representation.
(This article belongs to the Special Issue Rethinking Representation Learning in the Age of Large Models)
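As context for the subband recovery described above, a minimal PyWavelets sketch of a one-level 2-D DWT split into the LL/LH/HL/HH subbands and its inverse; the random frame is a stand-in.

```python
import numpy as np
import pywt

frame = np.random.rand(256, 256).astype(np.float32)  # stand-in video frame
ll, (lh, hl, hh) = pywt.dwt2(frame, "haar")     # approximation + detail subbands
recon = pywt.idwt2((ll, (lh, hl, hh)), "haar")  # inverse wavelet transform
print(np.allclose(recon, frame, atol=1e-5))     # True: perfect reconstruction
```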
27 pages, 958 KiB  
Article
AQEA-QAS: An Adaptive Quantum Evolutionary Algorithm for Quantum Architecture Search
by Yaochong Li, Jing Zhang, Rigui Zhou, Yi Qu and Ruiqing Xu
Entropy 2025, 27(7), 733; https://doi.org/10.3390/e27070733 - 8 Jul 2025
Abstract
Quantum neural networks (QNNs) are an emerging technology that uses a quantum computer for neural network computations. QNNs have demonstrated potential advantages over classical neural networks in certain tasks. As a core component of a QNN, the parameterized quantum circuit (PQC) plays a crucial role in determining the QNN's overall performance. However, quantum circuit architectures designed manually, based on experience or on specific hardware structures, can suffer from inefficiency due to the introduction of redundant quantum gates, which amplifies the impact of noise on system performance. Recent studies have suggested that the advantages of quantum evolutionary algorithms (QEAs) in precision and convergence speed can provide an effective solution to quantum circuit architecture problems. Currently, most QEAs adopt a fixed rotation mode in the evolution process, and the lack of an adaptive update mode can cause a QEA to fall into a local optimum and struggle to converge. To address these problems, this study proposes an adaptive quantum evolutionary algorithm (AQEA). First, an adaptive mechanism is introduced into the evolution process, adopting a strategy that combines two dynamic rotation angles. Second, to prevent fluctuations among the population's offspring, elite retention of the parents is used to ensure the inheritance of good genes. Finally, when the population falls into a local optimum, a quantum catastrophe mechanism is employed to break the current population state. The experimental results show that, compared with QNN structures based on manual design and QEA search, the proposed AQEA can reduce the number of network parameters by up to 20% and increase accuracy by 7.21%. Moreover, in noisy environments, the AQEA-optimized circuit outperforms traditional circuits in maintaining high fidelity, and its excellent noise resistance provides strong support for the reliability of quantum computing.
(This article belongs to the Special Issue Quantum Information and Quantum Computation)
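For background, the rotation-gate update at the heart of quantum-inspired evolutionary algorithms, sketched below; the AQEA's adaptive choice of the two dynamic angles is not reproduced.

```python
import numpy as np

def rotate_qubit(alpha, beta, delta_theta):
    """Rotation-gate update of a qubit's amplitudes (alpha, beta) in a QEA.

    delta_theta would be chosen adaptively in the paper's algorithm; here
    it is simply an input.
    """
    c, s = np.cos(delta_theta), np.sin(delta_theta)
    return c * alpha - s * beta, s * alpha + c * beta
```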
44 pages, 507 KiB  
Article
Compositional Causal Identification from Imperfect or Disturbing Observations
by Isaac Friend, Aleks Kissinger, Robert W. Spekkens and Elie Wolfe
Entropy 2025, 27(7), 732; https://doi.org/10.3390/e27070732 - 8 Jul 2025
Abstract
The usual inputs for a causal identification task are a graph representing qualitative causal hypotheses and a joint probability distribution for some of the causal model's variables when they are observed rather than intervened on. Alternatively, the available probabilities sometimes come from a combination of passive observations and controlled experiments. It also makes sense, however, to consider causal identification with data collected via schemes more generic than (perfect) passive observation or perfect controlled experiments. For example, observation procedures may be noisy, may disturb the variables, or may yield only coarse-grained specification of the variables' values. In this work, we investigate identification of causal quantities when the probabilities available for inference are the probabilities of outcomes of these more generic schemes. Using process theories (aka symmetric monoidal categories), we formulate graphical causal models as second-order processes that respond to such data collection instruments. We pose the causal identification problem relative to arbitrary sets of available instruments. Perfect passive observation instruments—those that produce the usual observational probabilities used in causal inference—satisfy an abstract process-theoretic property called marginal informational completeness. This property also holds for other (sets of) instruments. The main finding is that in the case of Markovian models, as long as the available instruments satisfy this property, the probabilities they produce suffice for identification of interventional quantities, just as those produced by perfect passive observations do. This finding sharpens the distinction between the Markovianity of a causal model and that of a probability distribution, suggesting a more extensive line of investigation of causal inference within a process-theoretic framework.
(This article belongs to the Special Issue Causal Graphical Models and Their Applications)
24 pages, 1917 KiB  
Article
Empirical Evaluation of the Relative Range for Detecting Outliers
by Dania Dallah, Hana Sulieman, Ayman Al Zaatreh and Firuz Kamalov
Entropy 2025, 27(7), 731; https://doi.org/10.3390/e27070731 - 7 Jul 2025
Abstract
Outlier detection plays a key role in data analysis by improving data quality, uncovering data entry errors, and spotting unusual patterns, such as fraudulent activities. Choosing the right detection method is essential, as some approaches may be too complex or ineffective depending on the data distribution. In this study, we explore a simple yet powerful approach that uses the distribution of the sample range to identify outliers in univariate data. We compare the effectiveness of two range statistics—the range normalized by the standard deviation (σ) and by the interquartile range (IQR)—across different types of distributions, including the normal, logistic, Laplace, and Weibull distributions, with varying sample sizes (n) and error rates (α). Evaluating the behavior of the range across multiple distributions allows threshold values for identifying potential outliers to be determined. Through extensive experimental work, the accuracy of both statistics in detecting outliers is investigated under various contamination strategies, sample sizes, and error rates (α = 0.1, 0.05, 0.01). The results demonstrate the flexibility of the proposed statistic, which adapts well to different underlying distributions and maintains robust detection performance under a variety of conditions. Our findings underscore the value of an adaptive method for reliable anomaly detection in diverse data environments.
(This article belongs to the Special Issue Information-Theoretic Methods in Data Analytics, 2nd Edition)
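A minimal sketch of the two statistics compared above, computed for one sample; the decision thresholds themselves come from the paper's simulations.

```python
import numpy as np

def relative_ranges(x):
    """Return (range/sigma, range/IQR) for a 1-D sample."""
    x = np.asarray(x, dtype=float)
    r = x.max() - x.min()
    q75, q25 = np.percentile(x, [75, 25])
    return r / x.std(ddof=1), r / (q75 - q25)
```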
17 pages, 10694 KiB  
Article
Entropy-Inspired Aperture Optimization in Fourier Optics
by Marcos Miotti and Daniel Varela Magalhães
Entropy 2025, 27(7), 730; https://doi.org/10.3390/e27070730 - 7 Jul 2025
Abstract
The trade-off between resolution and contrast is a transcendental problem in optical imaging, spanning from artistic photography to technoscientific applications. For the latter, Fourier-optics-based filters, such as the 4f system, are well known for their image-enhancement properties, removing high spatial frequencies from an optically Fourier-transformed light signal through simple aperture adjustment. Nonetheless, assessing the contrast–resolution balance in optical imaging remains a challenging task, often requiring complex mathematical treatment and controlled laboratory conditions to match theoretical predictions. With that in mind, we propose a simple yet robust analytical technique to determine the optimal aperture in a 4f imaging system for static and quasi-static objects. Our technique employs the mathematical formalism of the H-theorem, enabling us to directly access the information content of an imaged object. By varying the aperture at the Fourier plane of the 4f system, we empirically found an optimal aperture region where the imaging entropy is maximal, provided that the object fits the imaged area. In that region, the image is bright and well resolved, and no further aperture decrease improves on this, as the information of the whole assembly (object plus imaging system) is at a maximum. With this analysis, we were also able to investigate how imperfections in an object affect the entropy during imaging. Despite its simplicity, our technique is generally applicable and amenable to automation, making it attractive for many imaging-based optical devices.
(This article belongs to the Special Issue Insight into Entropy)
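A minimal sketch of the quantity maximized over aperture settings—Shannon entropy of an image's gray-level histogram—assuming intensities normalized to [0, 1]; the paper's H-theorem formalism is not reproduced.

```python
import numpy as np

def image_entropy(img, bins=256):
    """Shannon entropy (bits) of an image's gray-level histogram."""
    hist, _ = np.histogram(img, bins=bins, range=(0.0, 1.0))
    p = hist[hist > 0] / hist.sum()
    return -(p * np.log2(p)).sum()
```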
13 pages, 836 KiB  
Article
Numerical Generation of Trajectories Statistically Consistent with Stochastic Differential Equations
by Mykhaylo Evstigneev
Entropy 2025, 27(7), 729; https://doi.org/10.3390/e27070729 - 6 Jul 2025
Abstract
A weak second-order numerical method for generating trajectories governed by stochastic differential equations (SDEs) is developed. The proposed approach bypasses direct noise realization by updating the system's state using independent Gaussian random variables so as to reproduce the first three cumulants of the state variable at each time step to second order in the time-step size. The update rule for the state variable is derived from the system's Fokker–Planck equation in an arbitrary number of dimensions. The high accuracy of the method compared to the standard Milstein algorithm is demonstrated using the example of Büttiker's ratchet. While the method is second-order accurate in the time step, it can be extended to systematically generate higher-order terms of the stochastic Taylor expansion approximating the solution of the SDE.
(This article belongs to the Section Statistical Physics)
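For reference, one step of the standard Milstein scheme that the method is benchmarked against (1-D case); the paper's cumulant-matching update itself is not reproduced.

```python
import numpy as np

def milstein_step(x, drift, diff, diff_dx, dt, rng):
    """One Milstein update for dX = a(X) dt + b(X) dW in one dimension."""
    dw = rng.normal(0.0, np.sqrt(dt))  # Wiener increment
    return (x + drift(x) * dt + diff(x) * dw
            + 0.5 * diff(x) * diff_dx(x) * (dw * dw - dt))
```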
18 pages, 292 KiB  
Article
Motion of Quantum Particles in Terms of Probabilities of Paths
by Emilio Santos
Entropy 2025, 27(7), 728; https://doi.org/10.3390/e27070728 - 6 Jul 2025
Abstract
The Feynman path integral formalism for non-relativistic quantum mechanics is revisited. A comparison is made with the cases of light propagation (Huygens' principle) and Brownian motion. The difficulties of a physical model based on Feynman's formalism are pointed out. A reformulation is proposed in which the transition probability of a particle from one space-time point to another is the sum of the probabilities of the possible paths. As an application, the Born approximation for scattering is derived within the formalism, suggesting an interpretation that involves the stochastic motion of a particle rather than the square of a wavelike amplitude.
(This article belongs to the Special Issue Quantum Probability and Randomness V)
26 pages, 543 KiB  
Article
Bounds on the Excess Minimum Risk via Generalized Information Divergence Measures
by Ananya Omanwar, Fady Alajaji and Tamás Linder
Entropy 2025, 27(7), 727; https://doi.org/10.3390/e27070727 - 5 Jul 2025
Abstract
Given finite-dimensional random vectors Y, X, and Z that form a Markov chain in that order (Y → X → Z), we derive upper bounds on the excess minimum risk using generalized information divergence measures. Here, Y is a target vector to be estimated from an observed feature vector X or its stochastically degraded version Z. The excess minimum risk is defined as the difference between the minimum expected loss in estimating Y from X and from Z. We present a family of bounds that generalize a prior bound based on mutual information, using the Rényi and α-Jensen–Shannon divergences, as well as Sibson's mutual information. Our bounds are similar to recently developed bounds for the generalization error of learning algorithms. However, unlike those works, our bounds do not require the sub-Gaussian parameter to be constant and therefore apply to a broader class of joint distributions over Y, X, and Z. We also provide numerical examples under both constant and non-constant sub-Gaussianity assumptions, illustrating that our generalized divergence-based bounds can be tighter than those based on mutual information for certain regimes of the parameter α.
(This article belongs to the Special Issue Information Theoretic Learning with Its Applications)
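In symbols, the definition given in the abstract reads (loss ℓ, estimators g and h):

```latex
\Delta \;=\; \inf_{g}\,\mathbb{E}\!\left[\ell\big(Y, g(Z)\big)\right]
  \;-\; \inf_{h}\,\mathbb{E}\!\left[\ell\big(Y, h(X)\big)\right] \;\ge\; 0,
\qquad Y \to X \to Z .
```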
25 pages, 9127 KiB  
Article
Applicability and Design Considerations of Chaotic and Quantum Entropy Sources for Random Number Generation in IoT Devices
by Wieslaw Marszalek, Michał Melosik, Mariusz Naumowicz and Przemysław Głowacki
Entropy 2025, 27(7), 726; https://doi.org/10.3390/e27070726 - 4 Jul 2025
Abstract
This article presents a comparative analysis of two types of random sequence generators: one based on a discrete chaotic system, the logistic map, and the other a commercial quantum random number generator, the QUANTIS-USB-4M. The results of the analysis serve as a guide for selecting the type of generator better suited to a specific IoT solution, depending on the functional profile of the target application and the amount of random data required in the cryptographic process. The article discusses both the theoretical foundations of the chaotic phenomena underlying the logistic-map-based pseudorandom number generator and the theoretical principles of photon detection used in quantum random number generators. A hardware IP core implementing the logistic map was developed, suitable for direct implementation either as a standalone ASIC in the SkyWater PDK process or on an FPGA. The bitstreams generated by the implemented IP core were evaluated for randomness. The entropy levels and randomness of both the logistic map and the quantum random number generator were analyzed using the ent tool and the NIST test suite.
(This article belongs to the Section Multidisciplinary Applications)
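A minimal software sketch of a logistic-map bitstream generator; thresholding at 0.5 is one common bit-extraction scheme, and the hardware IP core may use a different one.

```python
import numpy as np

def logistic_bits(n_bits, x0=0.618, r=4.0):
    """Bitstream from the chaotic logistic map x -> r*x*(1-x); r = 4 is the
    fully chaotic regime."""
    x = x0
    bits = np.empty(n_bits, dtype=np.uint8)
    for i in range(n_bits):
        x = r * x * (1.0 - x)
        bits[i] = 1 if x >= 0.5 else 0
    return bits
```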
17 pages, 630 KiB  
Article
Mining Complex Ecological Patterns in Protected Areas: An FP-Growth Approach to Conservation Rule Discovery
by Ioan Daniel Hunyadi and Cristina Cismaș
Entropy 2025, 27(7), 725; https://doi.org/10.3390/e27070725 - 4 Jul 2025
Abstract
This study introduces a data-driven framework for enhancing the sustainable management of fish species in Romania's Natura 2000 protected areas through ecosystem modeling and association rule mining (ARM). Drawing on seven years of ecological monitoring data for 13 fish species of ecological and socio-economic importance, we apply the FP-Growth algorithm to extract high-confidence co-occurrence patterns among 19 codified conservation measures. By encoding expert habitat assessments into binary transactions, the analysis revealed 44 robust association rules, highlighting interdependent management actions that collectively improve species resilience and habitat conditions. These results provide actionable insights for integrated, evidence-based conservation planning. The approach demonstrates the interpretability, scalability, and practical relevance of ARM in biodiversity management, offering a replicable method for supporting adaptive ecological decision making across complex protected area networks.
(This article belongs to the Section Multidisciplinary Applications)
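A minimal sketch of the mining step with mlxtend's FP-Growth implementation (argument names vary slightly across mlxtend versions); the one-hot matrix and measure codes below are hypothetical stand-ins for the 19 codified measures.

```python
import pandas as pd
from mlxtend.frequent_patterns import fpgrowth, association_rules

# Hypothetical transactions: rows = habitat assessments, columns = measures.
df = pd.DataFrame(
    [[1, 1, 0], [1, 1, 1], [0, 1, 1], [1, 1, 0]],
    columns=["M01", "M02", "M03"],
).astype(bool)

itemsets = fpgrowth(df, min_support=0.5, use_colnames=True)
rules = association_rules(itemsets, metric="confidence", min_threshold=0.8)
print(rules[["antecedents", "consequents", "support", "confidence"]])
```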
1 page, 137 KiB  
Correction
Correction: Bianconi, G. The Quantum Relative Entropy of the Schwarzschild Black Hole and the Area Law. Entropy 2025, 27, 266
by Ginestra Bianconi
Entropy 2025, 27(7), 724; https://doi.org/10.3390/e27070724 - 4 Jul 2025
Abstract
The new version of [...]
30 pages, 1423 KiB  
Article
Entropy-Based Correlation Analysis for Privacy Risk Assessment in IoT Identity Ecosystem
by Kai-Chih Chang and Suzanne Barber
Entropy 2025, 27(7), 723; https://doi.org/10.3390/e27070723 - 3 Jul 2025
Abstract
As the Internet of Things (IoT) expands, robust tools for assessing privacy risk are increasingly critical. This research introduces a quantitative framework for evaluating IoT privacy risks, centered on two algorithmically derived scores: the Personalized Privacy Assistant (PPA) score and the PrivacyCheck score, both developed by the Center for Identity at The University of Texas. We analyze the correlation between these scores across multiple types of sensitive data—including email, social security numbers, and location—to understand their effectiveness in detecting privacy vulnerabilities. Our approach leverages Bayesian networks with cycle decomposition to capture complex dependencies among risk factors and applies entropy-based metrics to quantify informational uncertainty in privacy assessments. Experimental results highlight the strengths and limitations of each tool and demonstrate the value of combining data-driven risk scoring, information-theoretic analysis, and network modeling for privacy evaluation in IoT environments.
(This article belongs to the Section Multidisciplinary Applications)
23 pages, 7163 KiB  
Article
Entropy-Regularized Attention for Explainable Histological Classification with Convolutional and Hybrid Models
by Pedro L. Miguel, Leandro A. Neves, Alessandra Lumini, Giuliano C. Medalha, Guilherme F. Roberto, Guilherme B. Rozendo, Adriano M. Cansian, Thaína A. A. Tosta and Marcelo Z. do Nascimento
Entropy 2025, 27(7), 722; https://doi.org/10.3390/e27070722 - 3 Jul 2025
Abstract
Deep learning models such as convolutional neural networks (CNNs) and vision transformers (ViTs) perform well in histological image classification but often lack interpretability. We introduce a unified framework that adds an attention branch and CAM Fostering, an entropy-based regularizer, to improve Grad-CAM visualizations. Six backbone architectures (ResNet-50, DenseNet-201, EfficientNet-b0, ResNeXt-50, ConvNeXt, CoatNet-small) were trained, with and without our modifications, on five H&E-stained datasets. We measured explanation quality using coherence, complexity, confidence drop, and their harmonic mean (ADCC). Our method increased the ADCC in five of the six backbones; ResNet-50 saw the largest gain (+15.65%), and CoatNet-small achieved the highest overall score (+2.69%), peaking at 77.90% on the non-Hodgkin lymphoma set. Classification accuracy remained stable or improved in four models. These results show that combining attention and entropy produces clearer, more informative heatmaps without degrading performance. Our contributions include a modular architecture for both convolutional and hybrid models and a comprehensive, quantitative explainability evaluation suite.
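A sketch of the harmonic-mean aggregation named above, assuming—as in the ADCC literature—that complexity and confidence drop enter as 1 − x so that larger is better for every term; the paper may normalize differently.

```python
def adcc(coherence, complexity, confidence_drop):
    """Harmonic mean of three explanation-quality terms (sketch)."""
    vals = [coherence, 1.0 - complexity, 1.0 - confidence_drop]
    return 3.0 / sum(1.0 / v for v in vals)
```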
12 pages, 1785 KiB  
Article
Fisher–Shannon Analysis of Sentinel 1 Time Series from 2015 to 2023: Revealing the Impact of Toumeyella Parvicornis Infection in a Pilot Site of Central Italy
by Luciano Telesca, Nicodemo Abate, Michele Lovallo and Rosa Lasaponara
Entropy 2025, 27(7), 721; https://doi.org/10.3390/e27070721 - 3 Jul 2025
Abstract
This study investigates the capability of Sentinel-1 (S1) SAR time series to identify vegetation sites affected by pest infestations. For this purpose, the statistical method of Fisher–Shannon analysis was employed to discern infected from uninfected forest trees. The analysis was performed on a case study (Castel Porziano) located in the urban and peri-urban areas of Rome (Italy), which have been significantly impacted by Toumeyella parvicornis (TP) in recent years. For comparison, the area of Follonica (Italy), which has not yet been affected by this insect, was also analyzed. Two polarizations (VV and VH) and two orbit types (Ascending and Descending) were analyzed. The results, supported by Receiver Operating Characteristic (ROC) analysis, demonstrate that VH polarization in the Descending orbit provides the best performance in identifying TP-infected sites.
(This article belongs to the Section Entropy and Biology)
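A crude histogram-based sketch of the two coordinates of the Fisher–Shannon plane (Shannon entropy power and the Fisher information measure); published analyses typically use kernel density estimators, so this is illustrative only.

```python
import numpy as np

def fisher_shannon(x, bins=64):
    """Histogram estimates of (entropy power N_X, Fisher information FIM)."""
    f, edges = np.histogram(x, bins=bins, density=True)
    dx = edges[1] - edges[0]
    p = f[f > 0]
    h = -(p * np.log(p)).sum() * dx                 # differential entropy (nats)
    n_x = np.exp(2.0 * h) / (2.0 * np.pi * np.e)    # Shannon entropy power
    df = np.gradient(f, dx)
    fim = (df[f > 0] ** 2 / f[f > 0]).sum() * dx    # Fisher information measure
    return n_x, fim
```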
13 pages, 2884 KiB  
Article
Entropy-Based Human Activity Measure Using FMCW Radar
by Hak-Hoon Lee and Hyun-Chool Shin
Entropy 2025, 27(7), 720; https://doi.org/10.3390/e27070720 - 3 Jul 2025
Abstract
Existing activity measurement methods, such as gas analyzers, activity trackers, and camera-based systems, have limitations in accuracy, convenience, and privacy. To address these issues, this study proposes an improved activity estimation algorithm using a 60 GHz frequency-modulated continuous-wave (FMCW) radar. Unlike conventional methods that rely solely on distance variations, the proposed method incorporates both distance and velocity information, enhancing measurement accuracy. The algorithm quantifies activity levels using Shannon entropy to reflect the spatio-temporal variation in range signatures. The proposed method was validated through experiments comparing estimated activity levels with motion sensor-based ground truth data. The results demonstrate that the proposed approach significantly improves accuracy, achieving a lower root mean square error (RMSE) and higher correlation with ground truth values than conventional methods. This study highlights the potential of FMCW radar for non-contact, unrestricted activity monitoring and suggests future research directions using multi-channel radar systems for enhanced motion analysis.
(This article belongs to the Section Multidisciplinary Applications)
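A minimal sketch of the entropy step described above: Shannon entropy of a normalized range (or range–velocity) magnitude map; the paper's exact range-signature construction is not reproduced.

```python
import numpy as np

def activity_entropy(range_map):
    """Shannon entropy (bits) of a normalized radar magnitude map."""
    p = np.asarray(range_map, dtype=float)
    p = p / p.sum()
    p = p[p > 0]
    return -(p * np.log2(p)).sum()
```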
82 pages, 1294 KiB  
Review
Mock Modularity at Work, or Black Holes in a Forest
by Sergei Alexandrov
Entropy 2025, 27(7), 719; https://doi.org/10.3390/e27070719 - 2 Jul 2025
Abstract
Mock modular forms, first invented by Ramanujan, provide a beautiful generalization of the usual modular forms. In recent years, it was found that they capture the generating functions of the number of microstates of BPS black holes appearing in compactifications of string theory with 8 and 16 supercharges. This review describes these results and their applications, which range from the actual computation of these generating functions for both compact and non-compact compactification manifolds (encoding, respectively, Donaldson–Thomas and Vafa–Witten topological invariants) to the construction of new non-commutative structures on moduli spaces of Calabi–Yau threefolds.
(This article belongs to the Section Astrophysics, Cosmology, and Black Holes)
22 pages, 323 KiB  
Article
Bridge, Reverse Bridge, and Their Control
by Andrea Baldassarri and Andrea Puglisi
Entropy 2025, 27(7), 718; https://doi.org/10.3390/e27070718 - 2 Jul 2025
Abstract
We investigate the bridge problem for stochastic processes; that is, we analyze the statistical properties of trajectories constrained to begin and terminate at a fixed position within a time interval τ. Our primary focus is the time-reversal symmetry of these trajectories: under which conditions do their statistical properties remain invariant under the transformation t → τ − t? To address this question, we compare the stochastic differential equation describing the bridge, derived equivalently via Doob's transform or stochastic optimal control, with the corresponding equation for the time-reversed bridge. We aim to provide a concise overview of these well-established derivation techniques and subsequently obtain a local condition for time-reversal asymmetry that is specifically valid for the bridge. We are particularly interested in cases in which detailed balance is not satisfied, and we aim to eventually quantify the bridge asymmetry and understand how to use it to derive useful information about the underlying out-of-equilibrium dynamics. To this end, we derive a necessary condition for time-reversal symmetry, expressed in terms of the current velocity of the original stochastic process and a quantity linked to detailed balance. As expected, this formulation demonstrates that the bridge is symmetric when detailed balance holds, a sufficient condition that was already known. However, it also suggests that a bridge can exhibit symmetry even when the underlying process violates detailed balance. While we did not identify a specific instance of complete symmetry under broken detailed balance, we present an example of partial symmetry, in which some, but not all, components of the bridge display time-reversal symmetry. This example is drawn from a minimal non-equilibrium model, namely Brownian gyrators, which are linear stochastic processes. We examine non-equilibrium systems driven by a "mechanical" force, specifically those in which the linear drift cannot be expressed as the gradient of a potential. While Gaussian processes like Brownian gyrators offer valuable insights, it is known that they can be overly simplistic, even in their time-reversal properties. Therefore, we transform the model into polar coordinates, obtaining a non-Gaussian process representing the squared modulus of the original process. Despite this increased complexity and the violation of detailed balance in the full process, we demonstrate through exact calculations that the bridge of the squared modulus in the isotropic case, constrained to start and end at the origin, exhibits perfect time-reversal symmetry.
(This article belongs to the Special Issue Control of Driven Stochastic Systems: From Shortcuts to Optimality)
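As a concrete reference point, a sketch simulating the classic Brownian bridge via its Doob-transform SDE, dX = (x_T − X)/(τ − t) dt + dW; the paper's bridges concern more general, possibly non-equilibrium processes.

```python
import numpy as np

def brownian_bridge(x0, xT, tau, n, rng):
    """Euler simulation of a Brownian bridge pinned at x0 and xT (sketch)."""
    dt = tau / n
    x = np.empty(n + 1)
    x[0] = x0
    for k in range(n):
        t = k * dt
        drift = (xT - x[k]) / (tau - t)  # Doob-transform drift
        x[k + 1] = x[k] + drift * dt + np.sqrt(dt) * rng.normal()
    return x
```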