Search Results (5,772)

Search Parameters:
Keywords = system entropy

48 pages, 1830 KB  
Article
An Information–Theoretic Model of Abduction for Detecting Hallucinations in Explanations
by Boris Galitsky
Entropy 2026, 28(2), 173; https://doi.org/10.3390/e28020173 - 2 Feb 2026
Abstract
We present an Information–Theoretic Model of Abduction for Detecting Hallucinations in Generative Models, a neuro-symbolic framework that combines entropy-based inference with abductive reasoning to identify unsupported or contradictory content in large language model outputs. Our approach treats hallucination detection as a dual optimization problem: minimizing the information gain between source-conditioned and response-conditioned belief distributions, while simultaneously selecting the minimal abductive hypothesis capable of explaining discourse-salient claims. By incorporating discourse structure through RST-derived EDU weighting, the model distinguishes legitimate abductive elaborations from claims that cannot be justified under any computationally plausible hypothesis. Experimental evaluation across medical, factual QA, and multi-hop reasoning datasets demonstrates that the proposed method outperforms state-of-the-art neural and symbolic baselines in both accuracy and interpretability. Qualitative analysis further shows that the framework successfully exposes plausible-sounding but abductively unsupported model errors, including real hallucinations generated by GPT-5.1. Together, these results indicate that integrating Information–Theoretic divergence and abductive explanation provides a principled and effective foundation for robust hallucination detection in generative systems. Full article
(This article belongs to the Special Issue Information Theory in Artificial Intelligence)
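The abstract above frames detection around minimizing an information gain between source-conditioned and response-conditioned belief distributions. The paper's exact objective is not reproduced here; as a hedged sketch, the Python snippet below computes a KL divergence between two hypothetical categorical belief distributions over the same candidate claims, the kind of divergence such a criterion could threshold. The distributions and the flagging logic are invented for illustration.

import math

def kl_divergence(p, q, eps=1e-12):
    """KL(P || Q) for two categorical distributions given as lists."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

# Hypothetical belief distributions over the same candidate claims:
# one conditioned on the source document, one on the model's response.
p_source   = [0.70, 0.20, 0.05, 0.05]
p_response = [0.10, 0.15, 0.40, 0.35]

gain = kl_divergence(p_response, p_source)
print(f"information gain: {gain:.3f} nats")
# A large divergence flags response content the source supports poorly;
# any real threshold would have to be calibrated on labeled data.
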
56 pages, 2923 KB  
Article
FileCipher: A Chaos-Enhanced CPRNG-Based Algorithm for Parallel File Encryption
by Yousef Sanjalawe, Ahmad Al-Daraiseh, Salam Al-E’mari and Sharif Naser Makhadmeh
Algorithms 2026, 19(2), 119; https://doi.org/10.3390/a19020119 - 2 Feb 2026
Abstract
The exponential growth of digital data and the escalating sophistication of cyber threats have intensified the demand for secure yet computationally efficient encryption methods. Conventional algorithms (e.g., AES-based schemes) are cryptographically strong and widely deployed; however, some implementations can face performance bottlenecks in large-scale or real-time workloads. While many modern systems seed from hardware entropy sources and employ standardized cryptographic PRNGs/DRBGs, security can still be degraded in practice by weak entropy initialization, misconfiguration, or the use of non-cryptographic deterministic generators in certain environments. To address these gaps, this study introduces FileCipher. This novel file-encryption framework integrates a chaos-enhanced Cryptographically Secure Pseudorandom Number Generator (CPRNG) based on the State-Based Tent Map (SBTM). The proposed design achieves a balanced trade-off between security and efficiency through dynamic key generation, adaptive block reshaping, and structured confusion–diffusion processes. The SBTM-driven CPRNG introduces adaptive seeding and multi-key feedback, ensuring high entropy and sensitivity to initial conditions. A multi-threaded Java implementation demonstrates approximately 60% reduction in encryption time compared with AES-CBC, validating FileCipher’s scalability in parallel execution environments. Statistical evaluations using NIST SP 800-22, SP 800-90B, Dieharder, and TestU01 confirm superior randomness with over 99% pass rates, while Avalanche Effect analysis indicates bit-change ratios near 50%, proving strong diffusion characteristics. The results highlight FileCipher’s novelty in combining nonlinear chaotic dynamics with lightweight parallel architecture, offering a robust, platform-independent solution for secure data storage and transmission. Ultimately, this paper contributes a reproducible, entropy-stable, and high-performance cryptographic mechanism that redefines the efficiency–security balance in modern encryption systems. Full article
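The SBTM construction itself is not specified in the abstract, so the following is only a toy sketch of the underlying idea: iterating a tent map to derive a keystream and XOR-ing it with the data. It illustrates chaos-based stream encryption in general, not the SBTM CPRNG, and would not be secure as written.

def tent_map_keystream(seed: float, n: int, mu: float = 1.9999):
    """Toy keystream: iterate the tent map and quantize each state to a byte.
    Illustration only -- a real CPRNG needs proper seeding and state mixing."""
    x = seed
    out = bytearray()
    for _ in range(n):
        x = mu * x if x < 0.5 else mu * (1.0 - x)
        out.append(int(x * 256) & 0xFF)  # crude quantization of the state
    return bytes(out)

plaintext = b"entropy-driven toy cipher"
ks = tent_map_keystream(seed=0.3141592653, n=len(plaintext))
ciphertext = bytes(p ^ k for p, k in zip(plaintext, ks))
recovered  = bytes(c ^ k for c, k in zip(ciphertext, ks))
assert recovered == plaintext  # XOR with the same keystream decrypts
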
12 pages, 1058 KB  
Article
Inforpower: Quantifying the Informational Power of Probability Distributions
by Hening Huang
AppliedMath 2026, 6(2), 19; https://doi.org/10.3390/appliedmath6020019 - 2 Feb 2026
Abstract
In many scientific and engineering fields (e.g., measurement science), a probability density function often models a system comprising a signal embedded in noise. Conventional measures, such as the mean, variance, entropy, and informity, characterize signal strength and uncertainty (or noise level) separately. However, the true performance of a system depends on the interaction between signal and noise. In this paper, we propose a novel measure, called “inforpower”, which quantifies a system’s informational power by explicitly capturing the interaction between signal and noise. We also propose a new measure of central tendency, called “information-energy center”. Closed-form expressions for inforpower and information-energy center are provided for ten well-known continuous distributions. Moreover, we propose a maximum inforpower criterion, which can complement the Akaike information criterion (AIC), the minimum entropy criterion, and the maximum informity criterion for selecting the best distribution from a set of candidate distributions. Two examples (synthetic Weibull distribution data and Tana River annual maximum streamflow) are presented to demonstrate the effectiveness of the proposed maximum inforpower criterion and compare it with existing goodness-of-fit criteria. Full article
(This article belongs to the Section Probabilistic & Statistical Mathematics)
22 pages, 1100 KB  
Article
Statistical Distribution and Entropy of Multi-Scale Returns: A Coarse-Grained Analysis and Evidence for a New Stylized Fact
by Alejandro Raúl Hernández-Montoya
Entropy 2026, 28(2), 172; https://doi.org/10.3390/e28020172 - 2 Feb 2026
Abstract
Financial time series often show periods during which market index values or asset prices increase or decrease monotonically. These events are known as price runs, uninterrupted trends, or simply runs. By identifying such runs in the daily DJIA and IPC indices from 2 January 1990 to 17 October 2025, we construct their associated returns to obtain a non-arbitrary sample of multi-scale returns, which we call trend returns (TReturns). The timescale of each multi-scale return is determined by the exponentially distributed duration of its corresponding run. We empirically show that the distribution of these coarse-grained returns exhibits distinctive statistical properties: the central region displays an exponential decay, likely resulting from the exponential distribution of trend durations, while the tails follow a power-law decay. This combination of exponential central behavior and asymptotic power-law decay has also been observed in other complex systems, and our findings provide additional evidence of its natural emergence. We also explore the informational properties of multi-scale returns using three measures: Shannon entropy, permutation entropy, and compression-based complexity. We find that Shannon entropy increases with coarse-graining, indicating a wider range of values; permutation entropy drops sharply, revealing underlying temporal patterns; and compression ratios improve, reflecting suppressed randomness. Overall, these findings suggest that constructing TReturns filters out microscopic noise, reveals structured temporal patterns, and provides a complementary and clear view of market behavior. Full article
(This article belongs to the Special Issue Entropy, Econophysics, and Complexity)
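As a rough companion to the three informational measures named above, here is a minimal Python sketch computing a histogram-based Shannon entropy, a normalized permutation entropy, and a zlib compression ratio for a return series. The synthetic Gaussian returns and the parameter choices (16 bins, embedding order 3) are assumptions, not the paper's settings.

import math, random, zlib
from collections import Counter

def shannon_entropy(x, bins=16):
    """Histogram-based Shannon entropy (nats)."""
    lo, hi = min(x), max(x)
    w = (hi - lo) / bins or 1.0
    counts = Counter(min(int((v - lo) / w), bins - 1) for v in x)
    n = len(x)
    return -sum(c / n * math.log(c / n) for c in counts.values())

def permutation_entropy(x, m=3):
    """Entropy of ordinal patterns of order m, normalized to [0, 1]."""
    pats = Counter(tuple(sorted(range(m), key=lambda k: x[i + k]))
                   for i in range(len(x) - m + 1))
    n = sum(pats.values())
    h = -sum(c / n * math.log(c / n) for c in pats.values())
    return h / math.log(math.factorial(m))

random.seed(0)
returns = [random.gauss(0, 1) for _ in range(5000)]
raw = repr(returns).encode()
print("Shannon H:", round(shannon_entropy(returns), 3))
print("Permutation H (norm.):", round(permutation_entropy(returns), 3))
print("Compression ratio:", round(len(zlib.compress(raw)) / len(raw), 3))
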
18 pages, 1308 KB  
Article
A New Chaotic Interval-Based Multi-Objective Honey Badger Algorithm for Real-Time Fire Localization
by Khedija Arour, Hadhami Kaabi, Mohamed Ben Farah and Raouf Abozariba
Information 2026, 17(2), 144; https://doi.org/10.3390/info17020144 - 2 Feb 2026
Abstract
Real-time fire localization in urban environments remains a significant challenge due to sparse IoT sensor deployments, measurement uncertainties, and the computational demands of AI-based estimation techniques. To address these limitations, this paper proposes a Chaotic Interval-Based Multi-Objective Honey Badger Algorithm (CI-MOHBA) designed to improve the accuracy and reliability of fire source localization under uncertain and limited sensor data. The approach formulates localization as a multi-objective optimization problem that simultaneously minimizes source estimation error, false alarm rates, and computation time. CI-MOHBA integrates a new chaotic map to improve global search capability and interval arithmetic to effectively manage sensor uncertainty within sparse measurement environments. Experimental evaluation of the proposed chaotic map, supported by entropy convergence analysis and Lyapunov exponent verification, demonstrates the stability and robustness of the proposed technique. Results indicate that CI-MOHBA achieves an average localization error of 0.73 m and a false alarm rate of 8.2%, while maintaining high computational efficiency. These results show that the proposed algorithm is well suited for real-time fire localization in urban IoT-based monitoring systems. Full article
(This article belongs to the Special Issue AI and Data Analysis in Smart Cities)
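The paper's new chaotic map is not given in the abstract, so the sketch below estimates the Lyapunov exponent of the classic logistic map instead, purely to illustrate the kind of verification mentioned (a positive exponent signals chaos); the map and all constants are stand-ins.

import math

def lyapunov_logistic(r=4.0, x0=0.2, burn_in=1000, n=100_000):
    """Estimate the Lyapunov exponent of x -> r*x*(1-x) as the
    long-run average of ln|f'(x)| = ln|r*(1 - 2x)| along the orbit."""
    x = x0
    for _ in range(burn_in):  # discard the transient
        x = r * x * (1.0 - x)
    acc = 0.0
    for _ in range(n):
        x = r * x * (1.0 - x)
        acc += math.log(abs(r * (1.0 - 2.0 * x)) + 1e-300)  # guard log(0)
    return acc / n

print(lyapunov_logistic())  # ~ ln 2 = 0.693 for r = 4, signalling chaos
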
17 pages, 2638 KB  
Article
Evaluation of Geotourism Potential Based on Spatial Pattern Analysis in Jiangxi Province, China
by Qiuxiang Cao, Haixia Deng, Lanshu Zheng, Qing Wang and Kai Xu
Sustainability 2026, 18(3), 1449; https://doi.org/10.3390/su18031449 - 1 Feb 2026
Abstract
To provide essential information on geoheritage and geotourism potential in Jiangxi Province—a key region for geoheritage distribution in China—this study summarizes and categorizes the types, grades, and distribution characteristics of geoheritage within local communities. The primary analytical methods included average nearest neighbour analysis, kernel density estimation, and spatial autocorrelation to explore spatial distribution patterns. A total of 202 significant geoheritage sites were identified in Jiangxi Province. Furthermore, an evaluation index system was established using the entropy weight TOPSIS model to assess the geotourism potential of each city. The findings reveal the following: (1) Geoheritage sites in Jiangxi Province exhibit an overall aggregated spatial distribution, although clustering intensity varies among different geoheritage types and grades. (2) Considering both grade and category, the core distribution area of geoheritage is located in eastern Shangrao City, while global-level geoheritage sites are mainly concentrated in the Poyang Lake Plain. (3) Spatial autocorrelation analysis indicates that, except for global-level geoheritage sites, other geoheritage sites display significant spatial agglomeration with positive spatial correlation. Moreover, local-scale spatial association characteristics differ notably according to geoheritage type and grade. (4) The geotourism development potential across Jiangxi Province shows clear spatial differentiation, with higher potential concentrated in the eastern and southern regions. Full article
18 pages, 10981 KB  
Article
Ensemble Entropy with Adaptive Deep Fusion for Short-Term Power Load Forecasting
by Yiling Wang, Yan Niu, Xuejun Li, Xianglong Dai, Xiaopeng Wang, Yong Jiang, Chenghu He and Li Zhou
Entropy 2026, 28(2), 158; https://doi.org/10.3390/e28020158 - 31 Jan 2026
Abstract
Accurate power load forecasting is crucial for ensuring the safety and economic operation of power systems. However, the complex, non-stationary, and heterogeneous nature of power load data presents significant challenges for traditional prediction methods, particularly in capturing instantaneous dynamics and effectively fusing multi-feature information. This paper proposes a novel framework—Ensemble Entropy with Adaptive Deep Fusion (EEADF)—for short-term multi-feature power load forecasting. The framework introduces an ensemble instantaneous entropy extraction module to compute and fuse multiple entropy types (approximate, sample, and permutation entropies) in real-time within sliding windows, creating a sensitive representation of system states. A task-adaptive hierarchical fusion mechanism is employed to balance computational efficiency and model expressivity. For time-series forecasting tasks with relatively structured patterns, feature concatenation fusion is used that directly combines LSTM sequence features with multimodal entropy features. For complex multimodal understanding tasks requiring nuanced cross-modal interactions, multi-head self-attention fusion is implemented that dynamically weights feature importance based on contextual relevance. A dual-branch deep learning model is constructed that processes both raw sequences (via LSTM) and extracted entropy features (via MLP) in parallel. Extensive experiments on a carefully designed simulated multimodal dataset demonstrate the framework’s robustness in recognizing diverse dynamic patterns, achieving MSE of 0.0125, MAE of 0.0794, and R² of 0.9932. Validation on the real-world ETDataset for power load forecasting confirms that the proposed method significantly outperforms baseline models (LSTM, TCN, transformer, and informer) and traditional entropy methods across standard evaluation metrics (MSE, MAE, RMSE, MAPE, and R²). Ablation studies further verify the critical roles of both the entropy features and the fusion mechanism. Full article
(This article belongs to the Section Multidisciplinary Applications)
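As a hedged illustration of the sliding-window entropy features described above, this sketch computes sample entropy over fixed-length windows of a load-like series. Window length, m = 2, and the tolerance r = 0.2·std are common defaults rather than the paper's settings, and the approximate and permutation entropies would be extracted analogously.

import math, random, statistics

def sample_entropy(x, m=2, r=None):
    """SampEn(m, r): -ln(A/B), with B = matching template pairs of
    length m and A = pairs of length m+1 under the Chebyshev distance."""
    if r is None:
        r = 0.2 * statistics.pstdev(x)
    def count_matches(k):
        n, c = len(x) - k + 1, 0
        for i in range(n - 1):
            for j in range(i + 1, n):
                if max(abs(x[i + t] - x[j + t]) for t in range(k)) <= r:
                    c += 1
        return c
    b, a = count_matches(m), count_matches(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

random.seed(1)
load = [math.sin(i / 12) + 0.1 * random.gauss(0, 1) for i in range(600)]
window = 120  # one entropy feature per sliding window
features = [sample_entropy(load[s:s + window])
            for s in range(0, len(load) - window + 1, window)]
print([round(f, 3) for f in features])
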
19 pages, 657 KB  
Article
Entropy-Based Patent Valuation: Decoding “Costly Signals” in the Food Industry via a Robust Entropy–TOPSIS Framework
by Xiaoman Li, Wei Liu, Xiaohe Liang and Ailian Zhou
Entropy 2026, 28(2), 159; https://doi.org/10.3390/e28020159 - 31 Jan 2026
Abstract
Accurate patent valuation remains a persistent challenge in intellectual property management, particularly in the food industry, where technological homogeneity and rapid innovation cycles introduce substantial noise into observable performance indicators. Traditional valuation approaches, whether based on subjective expert judgment or citation-based metrics, often struggle to effectively reduce information uncertainty in this context. To address this limitation, this study proposes an objective, data-driven patent valuation framework grounded in information theory. We construct a multidimensional evaluation system comprising nine indicators across technological, legal, and economic dimensions and apply it to a large-scale dataset of 100,648 invention patents. To address the heavy-tailed nature of patent indicators without sacrificing the information contained in high-impact outliers, we introduce a square-root transformation strategy that stabilizes dispersion while preserving ordinal relationships. Indicator weights are determined objectively via Shannon entropy, capturing the relative scarcity and discriminatory information content of each signal, after which comprehensive value scores are derived using the TOPSIS method. Empirical results reveal that the entropy-based model assigns dominant weights to so-called “costly signals”, specifically PCT applications (29.53%) and patent transfers (24.36%). Statistical correlation analysis confirms that these selected indicators are significantly associated with patent value (p<0.001), while bootstrapping tests demonstrate the robustness of the resulting weight structure. The model’s validity is further evaluated using an external benchmark (“ground truth”) dataset comprising 55 patents recognized by the China Patent Award. The proposed framework demonstrates substantially stronger discriminatory capability than baseline methods: awarded patents achieve an average score 2.64 times higher than that of ordinary patents, and the enrichment factor for award-winning patents within the Top-100 ranking reaches 91.5. Additional robustness analyses, including benchmarking against the Weighted Sum Model (WSM), further confirm the methodological stability of the framework, with sensitivity analysis revealing an exceptional enrichment factor of 183.1 for the Top-50 patents. These findings confirm that the Entropy–TOPSIS framework functions as an effective information-filtering mechanism, amplifying high-value patent signals in noise-intensive environments. Consequently, the proposed model serves as a generalizable and theoretically grounded tool for objective patent valuation, with particular relevance to industries characterized by heavy-tailed data and high information uncertainty. Full article
(This article belongs to the Section Multidisciplinary Applications)
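The entropy-weight + TOPSIS pipeline the abstract describes has a standard form, sketched below on an invented 4-patent × 3-indicator matrix with the square-root transform applied first. The data are hypothetical and all indicators are treated as benefit-type; the paper's nine real indicators are not reproduced here.

import math

X = [[9.0, 1.0, 4.0],   # hypothetical patents x indicators
     [1.0, 0.0, 1.0],   # (e.g., citations, transfers, family size)
     [4.0, 4.0, 0.0],
     [16.0, 1.0, 9.0]]
X = [[math.sqrt(v) for v in row] for row in X]  # tame heavy tails

n, k = len(X), len(X[0])
cols = list(zip(*X))

# Shannon-entropy weights: low-entropy (more discriminating) columns weigh more.
weights = []
for col in cols:
    s = sum(col)
    p = [v / s for v in col]
    e = -sum(pi * math.log(pi) for pi in p if pi > 0) / math.log(n)
    weights.append(1.0 - e)
w_sum = sum(weights)
weights = [w / w_sum for w in weights]

# TOPSIS: vector-normalize, weight, then score by closeness to the ideal.
norms = [math.sqrt(sum(v * v for v in col)) for col in cols]
V = [[weights[j] * X[i][j] / norms[j] for j in range(k)] for i in range(n)]
best = [max(col) for col in zip(*V)]
worst = [min(col) for col in zip(*V)]

def dist(row, ref):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(row, ref)))

scores = [dist(r, worst) / (dist(r, worst) + dist(r, best)) for r in V]
print([round(s, 3) for s in scores])  # higher = closer to the ideal patent
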
38 pages, 1612 KB  
Article
The Mechanism and Spatiotemporal Variations in Digital Economy in Enhancing Resilience of the Cotton Industry Chain
by Muhabaiti Pareti, Sixue Qin, Yang Su, Jiao Zhang and Jiangtao Zhang
Systems 2026, 14(2), 152; https://doi.org/10.3390/systems14020152 - 31 Jan 2026
Abstract
In the era of the digital economy, enhancing the resilience of industrial chains is a core task in building a modern industrial system. This paper views the cotton industrial chain as a system composed of multiple segments and entities, aiming to explore how the digital economy drives the collaborative evolution of the chain’s constituent elements, organizational structure, and overall functions, ultimately enhancing its resilience to respond to shocks and adapt to changes. The study focuses on the cotton industrial chain, systematically analyzing the mechanisms and spatiotemporal characteristics of the digital economy’s impact on its resilience, aiming to provide theoretical support and practical pathways for constructing a secure, efficient, and sustainable cotton industrial chain. Based on panel data from nine provinces in China’s three major cotton-producing regions from 2013 to 2022, the study uses the entropy method to measure the technological innovation vitality and the resilience of the cotton industrial chain, employing a semi-parametric panel model to empirically test the systemic association between them, and utilizing a mediation effect model to identify the roles of market information utilization and the scale of planting in this relationship. The findings indicate the following: (1) The development of the digital economy significantly enhances the resilience of the cotton industrial chain and exhibits an inverted U-shaped nonlinear relationship. (2) The digital economy enhances the overall resilience and synergy of the cotton industrial chain through two key pathways: improving the technological innovation vitality and increasing the level of planting scale. (3) The influence of the digital economy on the resilience of the cotton industrial chain shows geographical heterogeneity, with the order being “Yangtze River Basin cotton areas > Northwest Inland cotton areas > Yellow River Basin cotton areas.” The impact of the digital economy on the resilience of the cotton industrial chain also exhibits temporal heterogeneity, with “2013–2017 > 2018–2022.” From the perspective of system optimization, future efforts should focus on constructing regionally differentiated collaborative mechanisms, improving the integrated platform for market information services, strengthening incentives for large-scale planting policies, enhancing the digital literacy of practitioners, and conducting skills training, in order to strengthen the overall resilience and sustainable evolution of China’s cotton industrial chain. Full article
(This article belongs to the Section Supply Chain Management)
27 pages, 2073 KB  
Article
SparseMambaNet: A Novel Architecture Integrating Bi-Mamba and a Mixture of Experts for Efficient EEG-Based Lie Detection
by Hanbeot Park, Yunjeong Cho and Hunhee Kim
Appl. Sci. 2026, 16(3), 1437; https://doi.org/10.3390/app16031437 - 30 Jan 2026
Abstract
Traditional lie detection technologies, such as the polygraph and event-related potential (ERP)-based approaches, often face limitations in real-world applicability due to their sensitivity to psychological states and the complex, nonlinear nature of electroencephalogram (EEG) signals. In this study, we propose SparseMambaNet, a novel neural architecture that integrates the recently developed Bi-Mamba model with a Sparsely Activated Mixture of Experts (MoE) structure to effectively model the intricate spatio-temporal dynamics of EEG data. By leveraging the near-linear computational complexity of Mamba and the bidirectional contextual modeling of Bi-Mamba, the proposed framework efficiently processes long EEG sequences while maximizing representational power through the selective activation of expert networks tailored to diverse input characteristics. Experiments were conducted with 46 healthy subjects using a simulated criminal scenario based on the Comparison Question Technique (CQT) with monetary incentives to induce realistic psychological tension. We extracted nine statistical and neural complexity features, including Hjorth parameters, Sample Entropy, and Spectral Entropy. The results demonstrated that Sample entropy and Hjorth parameters achieved exceptional classification performance, recording F1 scores of 0.9963 and 0.9935, respectively. Statistical analyses further revealed that the post-response “answer” interval provided significantly higher discriminative power compared to the “question” interval. Furthermore, channel-level analysis identified core neural loci for deception in the frontal and fronto-central regions, specifically at channels E54 and E63. These findings suggest that SparseMambaNet offers a highly efficient and precise solution for EEG-based lie detection, providing a robust foundation for the development of personalized brain–computer interface (BCI) systems in forensic and clinical settings. Full article
(This article belongs to the Special Issue Brain-Computer Interfaces: Development, Applications, and Challenges)
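Two of the feature families named above have textbook definitions, so a small sketch is easy to give: the Hjorth parameters (activity, mobility, complexity) and the spectral entropy of a normalized power spectrum. The synthetic signal and sampling rate are assumptions; channel selection and the classifier itself are out of scope.

import numpy as np

def hjorth(x):
    """Hjorth activity, mobility, complexity of a 1-D signal."""
    dx, ddx = np.diff(x), np.diff(np.diff(x))
    act = np.var(x)
    mob = np.sqrt(np.var(dx) / act)
    comp = np.sqrt(np.var(ddx) / np.var(dx)) / mob
    return act, mob, comp

def spectral_entropy(x):
    """Shannon entropy of the normalized FFT power spectrum, in [0, 1]."""
    psd = np.abs(np.fft.rfft(x)) ** 2
    p = psd / psd.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum() / np.log(len(psd)))

rng = np.random.default_rng(0)
t = np.arange(0, 2, 1 / 250)  # 2 s of signal at an assumed 250 Hz
eeg_like = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
print(hjorth(eeg_like))
print(spectral_entropy(eeg_like))
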
28 pages, 854 KB  
Article
Study on Coupling Coordination Between Ecotourism and Economic Development in Hainan Free Trade Port
by Gang Liu, Jingyao Chen and Shaohui Wang
Sustainability 2026, 18(3), 1403; https://doi.org/10.3390/su18031403 - 30 Jan 2026
Abstract
Coordinating ecotourism development with economic growth is central to achieving sustainability in regions where natural assets are both a comparative advantage and a binding constraint. This study assesses ecotourism–economy coupling coordination in Hainan Free Trade Port (China) during 2017–2023. Building on sustainable development theory, systems theory, and the tourism-led growth hypothesis, we conceptualize three coordination pathways (industrial structure upgrading, clustering effects, and urban–rural linkages) and operationalize them through an 18-indicator evaluation system covering ecotourism and economic subsystems. Indicator weights are determined using the entropy weight method, and the coupling coordination degree model is applied to quantify the interaction intensity and coordination level. Gray Relational Analysis is further used as a robustness-oriented complement to identify the factors most associated with coordination changes. Results show that both subsystems improved overall with noticeable fluctuations: the ecotourism index rose from 0.239 to 0.719, while the economic development index increased from 0.370 to 0.610. The coupling coordination degree advanced from moderate dysregulation (0.230 in 2017) to near quality coordination (0.995 in 2023), while shock-sensitive years highlight the vulnerability of tourism-related performance. The findings suggest that improving industrial structure and strengthening tourism-related productive capacity and external connectivity are key levers for sustaining coordination without compromising ecological efficiency. Full article
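The coupling coordination degree model invoked above has a standard two-subsystem form, sketched below. The equal contribution weights (alpha = beta = 0.5) and the example indices are assumptions; the paper's exact formulation is not given in the abstract, so this sketch will not reproduce its reported D values.

import math

def coupling_coordination(u1, u2, alpha=0.5, beta=0.5):
    """Standard two-subsystem CCD: C measures interaction strength,
    T the overall development level, and D = sqrt(C*T) the coordination."""
    c = 2.0 * math.sqrt(u1 * u2) / (u1 + u2)
    t = alpha * u1 + beta * u2
    return math.sqrt(c * t)

# Hypothetical subsystem index pairs, purely illustrative:
print(round(coupling_coordination(0.30, 0.40), 3))
print(round(coupling_coordination(0.70, 0.65), 3))
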
24 pages, 6704 KB  
Article
Exploratory Assessment of Short-Term Antecedent Modeled Flow Memory in Shaping Macroinvertebrate Diversity: Integrating Satellite-Derived Precipitation and Rainfall-Runoff Modeling in a Remote Andean Micro-Catchment
by Gonzalo Sotomayor, Raúl F. Vázquez, Marie Anne Eurie Forio, Henrietta Hampel, Bolívar Erazo and Peter L. M. Goethals
Biology 2026, 15(3), 257; https://doi.org/10.3390/biology15030257 - 30 Jan 2026
Abstract
Estimating runoff in ungauged catchments remains a major challenge in hydrology, particularly in remote Andean headwaters where limited accessibility and budgetary constraints hinder the long-term operation of monitoring networks. This study integrates satellite-derived rainfall data, hydrological modeling, and benthic macroinvertebrate diversity analysis to explore how short-term antecedent flow conditions relate to temporal variation in community structure. The research was conducted in a pristine 0.26 km2 micro-catchment of the upper Collay basin (southern Ecuador). Daily simulated discharge was used to compute antecedent flow descriptors representing short-term variability and cumulative changes in stream conditions, which were related to taxonomic (i.e., H = Shannon diversity, E = Pielou evenness, and D = Simpson dominance) and functional indices (i.e., Rao = Rao’s quadratic entropy, FAD1 = Functional Attribute Diversity, and wFDc = weighted functional dendrogram-based diversity) using Generalized Additive Models. Results showed progressively higher hydrology–biology associations with increasing antecedent flow integration length, suggesting that biological variability responds more strongly to cumulative than to instantaneous flow conditions. Among hydrological descriptors, the cumulative magnitude of negative flow changes was consistently associated with taxonomic diversity. H and E showed more coherent and robust patterns than functional metrics, indicating a faster response of community composition to short-term hydrological variability, whereas functional diversity integrates slower ecological processes. While based on modeled discharge under severe hydrometeorological data limitations, this study provides a practical ecohydrological starting point for identifying short-term hydrological memory signals potentially relevant to aquatic biodiversity in ungauged headwater systems. Full article
(This article belongs to the Section Marine and Freshwater Biology)
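The three taxonomic indices named above have textbook forms, sketched here on an invented abundance vector: Shannon diversity H, Pielou evenness E = H / ln S, and Simpson dominance D.

import math

def diversity_indices(abundances):
    """Shannon diversity H, Pielou evenness E, Simpson dominance D."""
    n = sum(abundances)
    p = [a / n for a in abundances if a > 0]
    h = -sum(pi * math.log(pi) for pi in p)
    e = h / math.log(len(p)) if len(p) > 1 else 0.0
    d = sum(pi * pi for pi in p)
    return h, e, d

counts = [34, 12, 9, 5, 2, 1]  # hypothetical macroinvertebrate taxon counts
h, e, d = diversity_indices(counts)
print(f"H={h:.3f}  E={e:.3f}  D={d:.3f}")
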
30 pages, 4008 KB  
Article
Path-Dependent Infrastructure Planning: A Network Science-Driven Decision Support System with Iterative TOPSIS
by Senbin Yu, Haichen Chen, Nina Xu, Xinxin Yu, Zeling Fang, Gehui Liu and Jun Yang
Symmetry 2026, 18(2), 258; https://doi.org/10.3390/sym18020258 - 30 Jan 2026
Abstract
Expressway networks represent evolving complex systems whose topological properties significantly impact regional development. This paper presents a decision support framework for addressing the expressway infrastructure sequencing problem using computational intelligence. We develop a novel framework that models expressways as L-space networks and evaluates how construction sequences create path-dependent evolutionary trajectories, introducing network science principles into infrastructure planning decisions. Our decision support framework quantifies project impacts on accessibility, connectivity, and reliability using nine topological metrics and a hybrid weighting mechanism that combines domain expertise with entropy-based uncertainty quantification. The system employs a hybrid TOPSIS algorithm that relies on geometric symmetry to simulate network evolution, capturing emergent properties in which each decision restructures possibilities for subsequent choices—a computational challenge that conventional planning approaches have not addressed. The system was validated with real-world Chongqing expressway planning data, demonstrating its ability to identify sequences that maximize synergistic network effects. Results reveal how topologically equivalent projects produce dramatically different system-wide outcomes depending on implementation order. Analysis shows that network science-informed sequencing substantially enhances system performance by exploiting structural synergies. This research advances decision support frameworks by bridging complex network theory with computational decision-making, creating a novel analytical tool that enables transportation authorities to implement evidence-based infrastructure sequencing strategies beyond the reach of conventional planning methods. Full article
(This article belongs to the Section Physics)
39 pages, 7869 KB  
Article
Research on an Ultra-Short-Term Wind Power Forecasting Model Based on Multi-Scale Decomposition and Fusion Framework
by Daixuan Zhou, Yan Jia, Guangchen Liu, Junlin Li, Kaile Xi, Zhichao Wang and Xu Wang
Symmetry 2026, 18(2), 253; https://doi.org/10.3390/sym18020253 - 30 Jan 2026
Abstract
Accurate wind power prediction is of great significance for the dispatch, security, and stable operation of energy systems. It helps enhance the symmetry and coordination between the highly stochastic and volatile nature of the power generation supply side and the stringent requirements for stability and power quality on the grid demand side. To further enhance the accuracy of ultra-short-term wind power forecasting, this paper proposes a novel prediction framework based on multi-layer data decomposition, reconstruction, and a combined prediction model. A multi-stage decomposition and reconstruction technique is first employed to significantly reduce noise interference: the Sparrow Search Algorithm (SSA) is utilized to optimize the parameters for an initial Variational Mode Decomposition (VMD), followed by a secondary decomposition of the high-frequency components using Complete Ensemble Empirical Mode Decomposition with Adaptive Noise (CEEMDAN). The resulting components are then reconstructed based on Sample Entropy (SE), effectively improving the quality of the input data. Subsequently, a hybrid prediction model named IMGWO-BiTCN-BiGRU is constructed to extract spatiotemporal bidirectional features from the input sequences. Finally, simulation experiments are conducted using actual measurement data from the Sotavento wind farm in Spain. The results demonstrate that the proposed hybrid model outperforms benchmark models across all evaluation metrics, validating its effectiveness in improving forecasting accuracy and stability. Full article
23 pages, 7886 KB  
Article
Building Virtual Drainage Systems Based on Open Road Data and Assessing Urban Flooding Risks
by Haowen Li, Chuanjie Yan, Chun Zhou and Li Zhou
Water 2026, 18(3), 341; https://doi.org/10.3390/w18030341 - 29 Jan 2026
Abstract
With accelerating urbanisation, extreme rainfall events have become increasingly frequent, leading to rising urban flooding risks that threaten city operation and infrastructure safety. The rapid expansion of impervious surfaces reduces infiltration capacity and accelerates runoff responses, making cities more vulnerable to short-duration, high-intensity storms. Although the SWMM is widely used for urban stormwater simulation, its application is often constrained by the lack of detailed drainage network data, such as pipe diameters, slopes, and node connectivity. To address this limitation, this study focuses on the main built-up area within the Second Ring Expressway of Chengdu, Sichuan Province, in southwestern China. As a regional core city, Chengdu frequently experiences intense short-duration rainfall during the rainy season, and the coexistence of rapid urbanisation with ageing drainage infrastructure further elevates flood risk. Accordingly, a technical framework of “open road data substitution–automated modelling–SWMM-based assessment” is proposed. Leveraging the spatial correspondence between road layouts and drainage pathways, open road data are used to construct a virtual drainage system. Combined with DEM and land-use data, Python-based automation enables sub-catchment delineation, parameter extraction, and network topology generation, achieving efficient large-scale modelling. Design storms of multiple return periods are generated based on Chengdu’s revised rainfall intensity formula, while socioeconomic indicators such as population density and infrastructure exposure are normalised and weighted using the entropy method to develop a comprehensive flood-risk assessment. Results indicate that the virtual drainage network effectively compensates for missing pipe data at the macro scale, and high-risk zones are mainly concentrated in densely populated and highly urbanised older districts. Overall, the proposed method successfully captures urban flood-risk patterns under data-scarce conditions and provides a practical approach for large-city flood-risk management. Full article