Search Results (188)

Search Parameters:
Keywords = non-Shannon entropy

16 pages, 1444 KB  
Article
Longitudinal Analysis of Vulvovaginal Bacteriome Following Use of Water- and Silicone-Based Personal Lubricants: Stability, Spatial Specificity, and Clinical Implications
by Jose A. Freixas-Coutin, Jin Seo, Lingyao Su and Sarah Hood
Microorganisms 2026, 14(1), 82; https://doi.org/10.3390/microorganisms14010082 (registering DOI) - 30 Dec 2025
Abstract
The vulvovaginal microbiome is a complex and dynamic ecosystem of microorganisms. The potential effects of common personal lubricants on its balance, which have implications for reproductive health, are still unknown. This study longitudinally assessed the impact of two commercially available lubricants on the composition and stability of the vaginal and vulvar bacteriome. Paired vaginal and vulvar swabs were collected at baseline and after repeated lubricant use, and the bacteriome was assessed using 16S rRNA gene amplicon sequencing. Alpha and beta diversity were assessed using Shannon entropy and Bray–Curtis dissimilarity, respectively. The results showed that the vaginal bacteriome was dominated by Lactobacillus and Firmicutes, while vulvar communities were more diverse and had higher abundances of Prevotella, Finegoldia, and Peptoniphilus. Both alpha and beta diversity measures indicated that the vaginal and vulvar bacteriome remained largely stable even after repeated lubricant use. Minor and non-significant changes in genus-level composition were observed, particularly in the vulvar samples. A moderate but significant correlation (Mantel r = 0.274, p = 0.001) was also observed between the vaginal and vulvar bacteriome. Overall, this study shows that short-term, repeated use of the water-based lubricant and the silicone-based lubricant tested in this study does not significantly disrupt the vaginal or vulvar bacteriome. Full article
(This article belongs to the Special Issue The Vaginal Microbiome in Health and Disease)
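The two diversity measures named in this abstract, Shannon entropy (alpha diversity) and Bray–Curtis dissimilarity (beta diversity), can be sketched directly on taxon abundance vectors. This is a minimal illustration, not the authors' 16S analysis pipeline, and the function names are illustrative:

```python
import numpy as np

def shannon_entropy(counts):
    """Alpha diversity: Shannon entropy (nats) of a community's taxon counts."""
    p = np.asarray(counts, dtype=float)
    p = p[p > 0] / p.sum()          # drop absent taxa, normalize to a distribution
    return float(-(p * np.log(p)).sum())

def bray_curtis(x, y):
    """Beta diversity: Bray-Curtis dissimilarity between two abundance vectors."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    return float(np.abs(x - y).sum() / (x + y).sum())
```

A perfectly even community maximizes Shannon entropy, while identical samples have zero Bray–Curtis dissimilarity; stability over repeated lubricant use would show up as small shifts in both quantities.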
22 pages, 4884 KB  
Article
Integrating Microtopographic Engineering with Native Plant Functional Diversity to Support Restoration of Degraded Arid Ecosystems
by Yassine Fendane, Mohamed Djamel Miara, Hassan Boukcim, Sami D. Almalki, Shauna K. Rees, Abdalsamad Aldabaa, Ayman Abdulkareem and Ahmed H. Mohamed
Land 2025, 14(12), 2445; https://doi.org/10.3390/land14122445 - 18 Dec 2025
Viewed by 254
Abstract
Active restoration structures such as microtopographic water-harvesting designs are widely implemented in dryland ecosystems to improve soil moisture, reduce erosion, and promote vegetation recovery. We assessed the combined effects of planted species identity, planting diversity (mono-, bi- and multi-species mixtures), and micro-catchment (half-moon) structures on seedling performance and spontaneous natural regeneration in a hyper-arid restoration pilot site in Sharaan National Park, northwest Saudi Arabia. Thirteen native plant species, of which four—Ochradenus baccatus, Haloxylon persicum, Haloxylon salicornicum, and Acacia gerrardii—formed the dominant planted treatments, were established in 18 half-moons and monitored for survival, growth, and natural recruitment. Seedling survival after 20 months differed significantly among planting treatments, increasing from 58% in mono-plantings to 69% in bi-plantings and 82% in multi-plantings (binomial GLMM, p < 0.001), indicating a positive effect of planting diversity on establishment. Growth traits (height, collar diameter, and crown dimensions) were synthesized into an Overall Growth Index (OGI) and an entropy-weighted OGI (EW-OGI). Mixed-effects models revealed strong species effects on both indices (F12,369 ≈ 7.2, p < 0.001), with O. baccatus and H. persicum outperforming other taxa and cluster analysis separating “fast expanders”, “moderate growers”, and “decliners”. Trait-based modeling showed that lateral crown expansion was the main driver of overall performance, whereas stem thickening and fruit production contributed little. Between 2022 and 2024, half-moon soils exhibited reduced electrical conductivity and exchangeable Na, higher organic carbon, and doubled available P, consistent with emerging positive soil–plant feedbacks. 
Spontaneous recruits were dominated by perennials (≈67% of richness), with perennial dominance increasing from mono- to multi-plantings, although Shannon diversity differences among treatments were small and non-significant. The correlation between OGI and spontaneous richness was positive but weak (r = 0.29, p = 0.25), yet plots dominated by O. baccatus hosted nearly two additional spontaneous species relative to other plantings, highlighting its strong facilitative role. Overall, our results show that half-moon micro-catchments, especially when combined with functionally diverse native plantings, can simultaneously improve soil properties and promote biotic facilitation, fostering a transition from active intervention to passive, self-sustaining restoration in hyper-arid environments. Full article
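The entropy-weighted growth index (EW-OGI) presumably uses the standard entropy-weight method, in which growth criteria with more dispersion across plants receive larger weights. A sketch under that assumption (the matrix and names are illustrative, not the authors' data):

```python
import numpy as np

def entropy_weights(X):
    """Entropy-weight method for a decision matrix X (alternatives x criteria,
    positive entries): criteria with more dispersion get larger weights."""
    X = np.asarray(X, dtype=float)
    m = X.shape[0]
    P = X / X.sum(axis=0)                       # each alternative's share per criterion
    with np.errstate(divide="ignore", invalid="ignore"):
        plogp = np.where(P > 0, P * np.log(P), 0.0)
    e = -plogp.sum(axis=0) / np.log(m)          # per-criterion entropy in [0, 1]
    d = 1.0 - e                                 # divergence degree
    return d / d.sum()                          # weights summing to 1
```

A criterion on which all plants score alike carries no discriminating information (entropy 1) and receives zero weight; the EW-OGI is then the weighted sum of the normalized traits.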

20 pages, 6876 KB  
Article
Real-Time Inductance Estimation of Sensorless PMSM Drive System Using Wavelet Denoising and Least-Order Observer with Time-Delay Compensation
by Gwangmin Park and Junhyung Bae
Machines 2025, 13(12), 1102; https://doi.org/10.3390/machines13121102 - 28 Nov 2025
Viewed by 288
Abstract
In this paper, the inductance of a sensorless PMSM (Permanent Magnet Synchronous Motor) drive system equipped with a periodic load torque compensator based on wavelet denoising and a least-order observer with time-delay compensation is estimated in real-time. In a sensorless PMSM system with constant load torque, the magnetically saturated inductance value remains constant. This constant inductance error causes minor performance degradation, such as a constant rotor position estimation error and non-optimal torque current, but it does not introduce a speed estimation error. Conversely, in a sensorless PMSM system subjected to periodic load torque, the magnetically saturated inductance error fluctuates periodically. This fluctuation leads to periodic variations in both the estimated position error and the speed error, ultimately degrading the load torque compensation performance. This paper applies the maximum energy-to-Shannon entropy criterion to select the optimal mother wavelet for the wavelet transform, removing motor signal noise and achieving more accurate inductance estimation. Additionally, a coherence- and correlation-based method is proposed to compensate for the time delay in the least-order observer. A self-saturation compensation method is also proposed to minimize periodic speed fluctuations and improve control accuracy through inductance parameter estimation. Finally, experiments were conducted on a sensorless PMSM drive system to verify the inductance estimation performance and validate the effectiveness of vibration reduction. Full article
(This article belongs to the Special Issue Advanced Sensorless Control of Electrical Machines)
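The maximum energy-to-Shannon-entropy criterion scores each candidate mother wavelet by the ratio of its coefficients' total energy to the entropy of their relative energy distribution. A hedged sketch operating on precomputed coefficient arrays (producing the coefficients themselves, e.g. via a DWT, is outside this snippet, and the function name is illustrative):

```python
import numpy as np

def energy_to_entropy_ratio(coeffs):
    """Energy-to-Shannon-entropy ratio of a set of wavelet coefficients.
    Higher values mean the energy is concentrated in few coefficients."""
    c = np.asarray(coeffs, dtype=float)
    energy = float((c ** 2).sum())
    p = c ** 2 / energy                 # relative energy distribution
    p = p[p > 0]
    entropy = float(-(p * np.log(p)).sum())
    return energy / entropy
```

Among candidate mother wavelets, the one yielding the largest ratio packs the signal's energy into the fewest coefficients and is selected, which is what makes subsequent thresholding-based denoising effective.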

33 pages, 2581 KB  
Article
Information-Theoretic ESG Index Direction Forecasting: A Complexity-Aware Framework
by Kadriye Nurdanay Öztürk and Öyküm Esra Yiğit
Entropy 2025, 27(11), 1164; https://doi.org/10.3390/e27111164 - 17 Nov 2025
Viewed by 1181
Abstract
Sustainable finance exhibits non-linear dynamics, regime shifts, and distributional drift that challenge conventional forecasting, particularly in volatile emerging markets. Conventional models, which often overlook this structural complexity, can struggle to produce stable or reliable probabilistic forecasts. To address this challenge, this study introduces a complexity-aware forecasting framework that operationalizes information-theoretic meta-features, namely Shannon entropy (SE), permutation entropy (PE), and Kullback–Leibler (KL) divergence, to make Environmental, Social, and Governance (ESG) index forecasting more stable, probabilistically accurate, and operationally reliable. Applied in an emerging-market setting using Türkiye’s ESG index as a natural stress test, the framework was benchmarked against a macro-technical baseline with a calibrated XGBoost classifier under a strictly chronological, leakage-controlled nested cross-validation protocol and evaluated on a strictly held-out test set. In development, the framework achieved statistically significant improvements in both stability and calibration, reducing fold-level dispersion (by 40.4–66.6%) across all metrics and enhancing probability-level alignment, with the Brier score reduced by 0.0140 and the ECE by 0.0287. Furthermore, a meta-analytic McNemar’s test confirmed a significant reduction in misclassifications across the development folds. On the strictly held-out test set, the framework’s superiority was confirmed by a statistically significant reduction in classification errors (exact McNemar p < 0.001), alongside strong gains in imbalance-robust metrics such as BAcc (0.618, +12.8%) and the MCC (0.288, +38.5%), achieving an F1-score of 0.719.
Overall, the findings of the complexity-aware framework indicate that explicitly representing the market’s informational state and transitions yields more stable, well-calibrated, and operationally reliable forecasts in regime-shifting financial environments, supporting enhanced robustness and practical deployability. Full article
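Two of the three meta-features named in this abstract have compact definitions. A minimal sketch of permutation entropy and KL divergence (a rolling-window Shannon entropy is analogous; the windowing and binning the authors used are not specified here, and the function names are illustrative):

```python
import math
import numpy as np
from collections import Counter

def permutation_entropy(x, order=3):
    """Normalized Bandt-Pompe permutation entropy of a 1-D series, in [0, 1]."""
    x = np.asarray(x, dtype=float)
    # ordinal pattern of each sliding window of length `order`
    patterns = [tuple(np.argsort(x[i:i + order])) for i in range(len(x) - order + 1)]
    counts = np.array(list(Counter(patterns).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log(p)).sum() / math.log(math.factorial(order)))

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) of discrete distributions (nats)."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return float((p[mask] * np.log(p[mask] / q[mask])).sum())
```

A monotone price path has zero permutation entropy (one ordinal pattern), while KL divergence between the recent and historical return distributions serves as the drift signal.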

21 pages, 5261 KB  
Article
Real-Time Defect Identification in Automotive Brake Calipers Using PCA-Optimized Feature Extraction and Machine Learning
by Juwon Lee, Ukyong Woo, Myung-Hun Lee, Jin-Young Kim, Hajin Choi and Taekeun Oh
Sensors 2025, 25(21), 6753; https://doi.org/10.3390/s25216753 - 4 Nov 2025
Viewed by 679
Abstract
This study aims to develop a non-contact automated impact-acoustic measurement system (AIAMS) for real-time detection of manufacturing defects in automotive brake calipers, a key component of the Electric Parking Brake (EPB) system. Calipers hold brake pads in contact with discs, and defects caused by repeated loads and friction can lead to reduced braking performance and abnormal vibration and noise. To address this issue, an automated impact hammer and a microphone-based measurement system were designed and implemented. Feature extraction was performed using Fast Fourier Transform (FFT) and Principal Component Analysis (PCA), followed by defect classification through machine learning algorithms including Support Vector Machine (SVM), k-Nearest Neighbor (KNN), and Decision Tree (DT). Experiments were conducted on five normal and six defective caliper specimens, each subjected to 200 repeated measurements, yielding a total of 2200 datasets. Twelve statistical and spectral features were extracted, and PCA revealed that Shannon Entropy (SE) was the most discriminative feature. Based on SE-centric feature combinations, the SVM, KNN, and DT models achieved classification accuracies of at least 99.2%/97.5%, 98.8%/98.0%, and 99.2%/96.5% for normal and defective specimens, respectively. Furthermore, GUI-based software (version 1.0.0) was implemented to enable real-time defect identification and visualization. Field tests also demonstrated an average defect classification accuracy of over 95%, demonstrating its applicability as a real-time quality control system. Full article
(This article belongs to the Special Issue Sensors for Fault Diagnosis of Electric Machines)
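The most discriminative feature reported, Shannon entropy (SE), is here sketched as the entropy of the normalized FFT magnitude spectrum of an impact response. This is one plausible definition, not necessarily the authors' exact feature:

```python
import numpy as np

def spectral_shannon_entropy(signal):
    """Shannon entropy (nats) of the normalized FFT magnitude spectrum."""
    mag = np.abs(np.fft.rfft(np.asarray(signal, dtype=float)))
    p = mag / mag.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())
```

A clean resonant response concentrates its spectral energy in a few bins (low SE), whereas a defective, rattling part spreads energy across the spectrum (high SE), which is what makes SE separate the two classes.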

24 pages, 31209 KB  
Article
Characterisation of GPS Horizontal Positioning Errors and Dst Using Recurrence Plot Analysis in Sub-Equatorial Ionospheric Conditions
by Lucija Žužić, Luka Škrlj, Aleksandar Nešković and Renato Filjar
Urban Sci. 2025, 9(11), 451; https://doi.org/10.3390/urbansci9110451 - 31 Oct 2025
Viewed by 521
Abstract
The Global Navigation Satellite System (GNSS) positioning performance may be degraded due to the effects of various natural and adversarial causes, most notably those related to space weather, geomagnetic, and ionospheric conditions and disturbances. Here we present a contribution to understanding the nature of geomagnetic and ionospheric conditions in terms of the effects on the GPS positioning performance through the comparative time-series analysis of the long-term annual (Year 2014) non-linear properties of Disturbance storm-time (Dst) index, an indicator of geomagnetic conditions, and the single-frequency commercial-grade GPS horizontal positioning errors as derived from raw single-frequency commercial-grade GPS observations taken at the International GNSS Service (IGS) reference station at Darwin, Northern Territory (NT), Australia. The analysis reveals candidate non-linear property indicators for future assessments and modelling, as potential descriptors of the long-term non-linear association between geomagnetic/ionospheric disturbances and GNSS positioning performance degradation: recurrence rate (RR), total number of lines in the recurrent plot, Shannon entropy, and trapping time (TT). The inference presented may serve as a framework for introducing advanced GNSS PNT correction procedures to mitigate environmental ionospheric effects on GNSS positioning performance, thereby offering more resilient and robust PNT services for GNSS applications in urban mobility, systems, and services. Full article
(This article belongs to the Special Issue Human, Technologies, and Environment in Sustainable Cities)
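Of the candidate indicators listed, the recurrence rate (RR) is the simplest to state: the fraction of recurrent points in the thresholded distance matrix. A minimal sketch for a scalar series (no phase-space embedding; function names are illustrative):

```python
import numpy as np

def recurrence_matrix(x, eps):
    """Binary recurrence matrix of a scalar series: 1 where |x_i - x_j| <= eps."""
    x = np.asarray(x, dtype=float)
    return (np.abs(x[:, None] - x[None, :]) <= eps).astype(int)

def recurrence_rate(R):
    """Recurrence rate: fraction of recurrent points in the plot."""
    return float(R.mean())
```

The other indicators named in the abstract (total line count, Shannon entropy of diagonal-line lengths, trapping time) are likewise derived from this same binary matrix by counting diagonal and vertical line structures.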

19 pages, 321 KB  
Article
Entropy Production and Irreversibility in the Linearized Stochastic Amari Neural Model
by Dario Lucente, Giacomo Gradenigo and Luca Salasnich
Entropy 2025, 27(11), 1104; https://doi.org/10.3390/e27111104 - 25 Oct 2025
Viewed by 982
Abstract
One of the most intriguing results from the application of statistical mechanics to the study of the brain is the understanding that the brain, as a dynamical system, is inherently out of equilibrium. In the realm of non-equilibrium statistical mechanics and stochastic processes, the standard observable computed to determine whether a system is at equilibrium or not is the entropy produced along the dynamics. For this reason, we present here a detailed calculation of the entropy production in the Amari model, a coarse-grained model of the brain neural network, consisting of an integro-differential equation for the neural activity field, when stochasticity is added to the original dynamics. Since the way to add stochasticity is always to some extent arbitrary, particularly for coarse-grained models, there is no general prescription to do so. We investigate precisely the interplay between noise properties and the original model features, discussing in which cases the stationary state is in thermal equilibrium and in which cases it is out of equilibrium, providing explicit and simple formulae. Following the derivation for the particular case considered, we also show how the entropy production rate is related to the variation in time of the Shannon entropy of the system. Full article
(This article belongs to the Section Non-equilibrium Phenomena)
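The stated link between the entropy production rate and the time variation of the Shannon entropy follows the standard Fokker–Planck balance. Sketched here for a one-dimensional overdamped Langevin system (a textbook identity, not the paper's specific Amari-model calculation):

```latex
\frac{dS}{dt} = \Pi(t) - \Phi(t), \qquad
S(t) = -\int p(x,t)\,\ln p(x,t)\,dx, \qquad
\Pi(t) = \int \frac{J(x,t)^{2}}{D\,p(x,t)}\,dx \;\ge\; 0, \qquad
\Phi(t) = \frac{1}{D}\int J(x,t)\,F(x)\,dx,
```

where J(x,t) = F(x) p(x,t) − D ∂ₓ p(x,t) is the probability current. At thermal equilibrium J ≡ 0, so both the entropy production rate Π and the entropy flux Φ vanish and the Shannon entropy is stationary.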
14 pages, 843 KB  
Article
A Scalarized Entropy-Based Model for Portfolio Optimization: Balancing Return, Risk and Diversification
by Florentin Șerban and Silvia Dedu
Mathematics 2025, 13(20), 3311; https://doi.org/10.3390/math13203311 - 16 Oct 2025
Viewed by 904
Abstract
Portfolio optimization is a cornerstone of modern financial decision-making, traditionally based on the mean–variance model introduced by Markowitz. However, this framework relies on restrictive assumptions—such as normally distributed returns and symmetric risk preferences—that often fail in real-world markets, particularly in volatile and non-Gaussian environments such as cryptocurrencies. To address these limitations, this paper proposes a novel multi-objective model that combines expected return maximization, mean absolute deviation (MAD) minimization, and entropy-based diversification into a unified optimization structure: the Mean–Deviation–Entropy (MDE) model. The MAD metric offers a robust alternative to variance by capturing the average magnitude of deviations from the mean without inflating extreme values, while entropy serves as an information-theoretic proxy for portfolio diversification and uncertainty. Three entropy formulations are considered—Shannon entropy, Tsallis entropy, and cumulative residual Sharma–Taneja–Mittal entropy (CR-STME)—to explore different notions of uncertainty and structural diversity. The MDE model is formulated as a tri-objective optimization problem and solved via scalarization techniques, enabling flexible trade-offs between return, deviation, and entropy. The framework is empirically tested on a cryptocurrency portfolio composed of Bitcoin (BTC), Ethereum (ETH), Solana (SOL), and Binance Coin (BNB), using daily data over a 12-month period. The empirical setting reflects a high-volatility, high-skewness regime, ideal for testing entropy-driven diversification. Comparative outcomes reveal that entropy-integrated models yield more robust weightings, particularly when tail risk and regime shifts are present. 
Comparative results against classical mean–variance and mean–MAD models indicate that the MDE model achieves improved diversification, enhanced allocation stability, and greater resilience to volatility clustering and tail risk. This study contributes to the literature on robust portfolio optimization by integrating entropy as a formal objective within a scalarized multi-criteria framework. The proposed approach offers promising applications in sustainable investing, algorithmic asset allocation, and decentralized finance, especially under high-uncertainty market conditions. Full article
(This article belongs to the Section E5: Financial Mathematics)
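The entropy formulations enter the MDE objective as diversification terms over the portfolio weight vector. Shannon and Tsallis entropy are sketched below (CR-STME omitted; the normalization choices are illustrative, not the paper's):

```python
import numpy as np

def shannon_diversification(w):
    """Shannon entropy of portfolio weights; log(n) means equal weighting."""
    w = np.asarray(w, dtype=float)
    w = w[w > 0]
    return float(-(w * np.log(w)).sum())

def tsallis_entropy(w, q=2.0):
    """Tsallis entropy of order q; the limit q -> 1 recovers Shannon entropy."""
    w = np.asarray(w, dtype=float)
    return float((1.0 - (w ** q).sum()) / (q - 1.0))
```

Maximizing either term pushes the allocation away from concentrated corner solutions, which is why the scalarized objective trades it off against expected return and MAD.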

11 pages, 1274 KB  
Proceeding Paper
The Value of Information in Economic Contexts
by Stefan Behringer and Roman V. Belavkin
Phys. Sci. Forum 2025, 12(1), 6; https://doi.org/10.3390/psf2025012006 - 23 Sep 2025
Viewed by 492
Abstract
This paper explores the application of the Value of Information (VoI), based on the Claude Shannon/Ruslan Stratonovich framework, within economic contexts. Unlike previous studies that examine circular settings and strategic interactions, we focus on a non-strategic linear setting. We employ standard economically motivated utility functions, including linear, quadratic, constant absolute risk aversion (CARA), and constant relative risk aversion (CRRA), across various priors of the stochastic environment, and analyse the resulting specific VoI forms. The curvature of these VoI functions plays a decisive role in determining whether acquiring additional costly information enhances the efficiency of the decision-making process. We also outline potential implications for broader decision-making frameworks. Full article

24 pages, 3314 KB  
Article
Entropy as a Lens: Exploring Visual Behavior Patterns in Architects
by Renate Delucchi Danhier, Barbara Mertins, Holger Mertins and Gerold Schneider
J. Eye Mov. Res. 2025, 18(5), 43; https://doi.org/10.3390/jemr18050043 - 16 Sep 2025
Viewed by 982
Abstract
This study examines how architectural expertise shapes visual perception, extending the “Seeing for Speaking” hypothesis into a non-linguistic domain. Specifically, it investigates whether architectural training influences unconscious visual processing of architectural content. Using eye-tracking, 48 architects and 48 laypeople freely viewed 15 still images of built, mixed, and natural environments. Visual behavior was analyzed using Shannon’s entropy scores based on dwell times within 16 × 16 grids during the first six seconds of viewing. Results revealed distinct visual attention patterns between groups. Architects showed lower entropy, indicating more focused and systematic gaze behavior, and their attention was consistently drawn to built structures. In contrast, laypeople exhibited more variable and less organized scanning patterns, with greater individual differences. Moreover, architects demonstrated higher intra-group similarity in their gaze behavior, suggesting a shared attentional schema shaped by professional training. These findings highlight that domain-specific expertise deeply influences perceptual processing, resulting in systematic and efficient attention allocation. Entropy-based metrics proved effective in capturing these differences, offering a robust tool for quantifying expert vs. non-expert visual strategies in architectural cognition. The visual patterns exhibited by architects are interpreted to reflect a “Grammar of Space”, i.e., a structured way of visually parsing spatial elements. Full article
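The entropy score over a 16 × 16 grid reduces to Shannon entropy of the dwell-time distribution, normalized by the grid's maximum entropy so that 0 means a single fixated cell and 1 means perfectly dispersed gaze. A sketch (the normalization choice is an assumption, not necessarily the authors' exact score):

```python
import numpy as np

def gaze_entropy(dwell_grid):
    """Normalized Shannon entropy of dwell times over a spatial grid, in [0, 1]."""
    g = np.asarray(dwell_grid, dtype=float).ravel()
    p = g[g > 0] / g.sum()                 # dwell-time distribution over visited cells
    h = float(-(p * np.log2(p)).sum())
    return h / np.log2(g.size)             # divide by max entropy of the grid
```

Under this score, the architects' lower entropy corresponds to dwell mass piled onto a few grid cells, and the laypeople's higher entropy to mass spread across many cells.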

28 pages, 2519 KB  
Article
On the Entropy-Based Localization of Inequality in Probability Distributions
by Rajeev Rajaram, Nathan Ritchey and Brian Castellani
Entropy 2025, 27(9), 945; https://doi.org/10.3390/e27090945 - 10 Sep 2025
Viewed by 641
Abstract
We present a novel method for localizing inequality within probability distributions by applying a recursive Hahn decomposition to the degree of uniformity—a measure derived from the exponential of Shannon entropy. This approach partitions the probability space into disjoint regions exhibiting progressively sharper deviations from uniformity, enabling structural insights into how and where inequality is concentrated. To demonstrate its broad applicability, we apply the method to both standard and contextualized systems: the discrete binomial and continuous exponential distributions serve as canonical cases, while two hypothetical examples illustrate domain-specific applications. In the first, we analyze localized risk concentrations in disease contraction data, revealing targeted zones of epidemiological disparity. In the second, we uncover stress localization in a non-uniformly loaded beam, demonstrating the method’s relevance to physical systems with spatial heterogeneity. This decomposition reveals aspects of structural disparity that are often obscured by scalar summaries. The resulting recursive tree offers a multi-scale representation of informational non-uniformity, capturing the emergence and localization of inequality across the distribution. The framework may have implications for understanding entropy localization, transitions in informational structure, and the dynamics of heterogeneous systems. Full article
(This article belongs to the Section Information Theory, Probability and Statistics)
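The degree of uniformity is the exponential of Shannon entropy, i.e. the effective number of equally likely outcomes, and one recursion level of the Hahn-style decomposition splits the support by the sign of p_i − 1/n. A loose sketch of this reading of the construction (function names are illustrative):

```python
import numpy as np

def degree_of_uniformity(p):
    """exp(Shannon entropy): effective number of equally likely outcomes."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(np.exp(-(p * np.log(p)).sum()))

def hahn_split(p):
    """One level of a Hahn-style split: indices at/above vs. below the uniform level."""
    p = np.asarray(p, dtype=float)
    u = 1.0 / p.size
    return np.where(p >= u)[0], np.where(p < u)[0]
```

Recursing on each part (renormalized) yields the tree of progressively sharper deviations from uniformity that the abstract describes; inequality is "localized" in the branches whose degree of uniformity falls well below their size.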

13 pages, 3943 KB  
Proceeding Paper
Emergent Behavior and Computational Capabilities in Nonlinear Systems: Advancing Applications in Time Series Forecasting and Predictive Modeling
by Kárel García-Medina, Daniel Estevez-Moya, Ernesto Estevez-Rams and Reinhard B. Neder
Comput. Sci. Math. Forum 2025, 11(1), 17; https://doi.org/10.3390/cmsf2025011017 - 11 Aug 2025
Viewed by 582
Abstract
Natural dynamical systems can often display various long-term behaviours, ranging from entirely predictable decaying states to unpredictable, chaotic regimes or, more interestingly, highly correlated and intricate states featuring emergent phenomena. That, of course, imposes a level of generality on the models we use to study them. Among those models, coupled oscillators and cellular automata (CA) present a unique opportunity to advance the understanding of complex temporal behaviours because of their conceptual simplicity but very rich dynamics. In this contribution, we review the work completed by our research team over the last few years in the development and application of an alternative information-based characterization scheme to study the emergent behaviour and information handling of nonlinear systems, specifically Adler-type oscillators under different types of coupling: local phase-dependent (LAP) coupling and Kuramoto-like local (LAK) coupling. We thoroughly studied the long-term dynamics of these systems, identifying several distinct dynamic regimes, ranging from periodic to chaotic and complex. The systems were analysed qualitatively and quantitatively, drawing on entropic measures and information theory. Measures such as entropy density (Shannon entropy rate), effective complexity measure, and Lempel–Ziv complexity/information distance were employed. Our analysis revealed similar patterns and behaviours between these systems and CA, which are computationally capable systems, for some specific rules and regimes. These findings further reinforce the argument around computational capabilities in dynamical systems, as understood by information transmission, storage, and generation measures. Furthermore, the edge of chaos hypothesis (EOC) was verified in coupled oscillators systems for specific regions of parameter space, where a sudden increase in effective complexity measure was observed, indicating enhanced information processing capabilities. 
Given the potential for exploiting this non-anthropocentric computational power, we propose this alternative information-based characterization scheme as a general framework to identify a dynamical system’s proximity to computationally enhanced states. Furthermore, this study advances the understanding of emergent behaviour in nonlinear systems. It explores the potential for leveraging the features of dynamical systems operating at the edge of chaos by coupling them with computationally capable settings within machine learning frameworks, specifically by using them as reservoirs in Echo State Networks (ESNs) for time series forecasting and predictive modeling. This approach aims to enhance the predictive capacity, particularly that of chaotic systems, by utilising EOC systems’ complex, sensitive dynamics as the ESN reservoir. Full article
(This article belongs to the Proceedings of The 11th International Conference on Time Series and Forecasting)
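Of the measures employed, Lempel–Ziv complexity is the most self-contained: the number of phrases in an exhaustive LZ76-style parsing of a symbolized trajectory. A minimal sketch of one common variant of the parsing (not necessarily the authors' implementation):

```python
def lempel_ziv_complexity(s):
    """LZ76-style complexity: number of phrases in an exhaustive parsing of
    string s, where each phrase is the shortest new substring at position i."""
    i, c, n = 0, 0, len(s)
    while i < n:
        l = 1
        # extend the phrase while it already occurs in the preceding text
        while i + l <= n and s[i:i + l] in s[:i + l - 1]:
            l += 1
        c += 1
        i += l
    return c
```

Constant and periodic sequences parse into very few phrases, chaotic ones into many; complex regimes near the edge of chaos sit in between, which is how the measure separates the dynamical regimes discussed above.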

27 pages, 5776 KB  
Review
From “Information” to Configuration and Meaning: In Living Systems, the Structure Is the Function
by Paolo Renati and Pierre Madl
Int. J. Mol. Sci. 2025, 26(15), 7319; https://doi.org/10.3390/ijms26157319 - 29 Jul 2025
Viewed by 1880
Abstract
In this position paper, we argue that the conventional understanding of ‘information’ (as generally conceived in science, in a digital fashion) is overly simplistic and not consistently applicable to living systems, which are open systems that cannot be reduced to any kind of ‘portion’ (building block) ascribed to the category of quantity. Instead, it is a matter of relationships and qualities in an indivisible analogical (and ontological) relationship between any presumed ‘software’ and ‘hardware’ (information/matter, psyche/soma). Furthermore, in biological systems, contrary to Shannon’s definition, which is well-suited to telecommunications and informatics, any kind of ‘information’ is the opposite of internal entropy, as it depends directly on order: it is associated with distinction and differentiation, rather than flattening and homogenisation. Moreover, the high degree of structural compartmentalisation of living matter prevents its energetics from being thermodynamically described by using a macroscopic, bulk state function. This requires the Second Principle of Thermodynamics to be redefined in order to make it applicable to living systems. For these reasons, any static, bit-related concept of ‘information’ is inadequate, as it fails to consider the system’s evolution, it being, in essence, the organized coupling to its own environment. From the perspective of quantum field theory (QFT), where many vacuum levels, symmetry breaking, dissipation, coherence and phase transitions can be described, a consistent picture emerges that portrays any living system as a relational process that exists as a flux of context-dependent meanings. This epistemological shift is also associated with a transition away from the ‘particle view’ (first quantisation) characteristic of quantum mechanics (QM) towards the ‘field view’ possible only in QFT (second quantisation). 
This crucial transition must take place in the life sciences, particularly in their methodological approaches, foremost because biological systems cannot be conceived of as ‘objects’, but rather as non-confinable processes and relationships. Full article

25 pages, 654 KB  
Article
Entropy-Regularized Federated Optimization for Non-IID Data
by Koffka Khan
Algorithms 2025, 18(8), 455; https://doi.org/10.3390/a18080455 - 22 Jul 2025
Viewed by 1192
Abstract
Federated learning (FL) struggles under non-IID client data when local models drift toward conflicting optima, impairing global convergence and performance. We introduce entropy-regularized federated optimization (ERFO), a lightweight client-side modification that augments each local objective with a Shannon entropy penalty on the per-parameter update distribution. ERFO requires no additional communication, adds a single scalar hyperparameter λ, and integrates seamlessly into any FedAvg-style training loop. We derive a closed-form gradient for the entropy regularizer and provide convergence guarantees: under μ-strong convexity and L-smoothness, ERFO achieves the same O(1/T) (or linear) rates as FedAvg (with only O(λ) bias for fixed λ and exact convergence when λt → 0); in the non-convex case, we prove stationary-point convergence at O(1/T). Empirically, on five-client non-IID splits of the UNSW-NB15 intrusion-detection dataset, ERFO yields a +1.6 pp gain in accuracy and +0.008 in macro-F1 over FedAvg with markedly smoother dynamics. On a three-of-five split of PneumoniaMNIST, a fixed λ matches or exceeds FedAvg, FedProx, and SCAFFOLD—achieving 90.3% accuracy and 0.878 macro-F1—while preserving rapid, stable learning. ERFO’s gradient-only design is model-agnostic, making it broadly applicable across tasks. Full article
(This article belongs to the Special Issue Advances in Parallel and Distributed AI Computing)
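The entropy penalty and its closed-form gradient can be sketched by taking the per-parameter update distribution to be the normalized squared updates. This is one plausible choice; the paper's exact normalization may differ, and the function name is illustrative:

```python
import numpy as np

def entropy_penalty_and_grad(u, eps=1e-12):
    """Shannon entropy of the normalized squared-update distribution
    p_i = u_i^2 / sum_j u_j^2, and its closed-form gradient w.r.t. u:
    dH/du_k = -(2 u_k / S) * (log p_k + H)."""
    u = np.asarray(u, dtype=float)
    S = (u ** 2).sum() + eps
    p = u ** 2 / S
    logp = np.log(p + eps)
    H = float(-(p * logp).sum())
    grad = -(2.0 * u / S) * (logp + H)
    return H, grad
```

With this closed form, each client simply adds λ times the gradient term to its local gradient step, consistent with the gradient-only, zero-extra-communication design described above.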