Search Results (269)

Search Parameters:
Keywords = heavy-tailed data

17 pages, 1132 KB  
Article
Multifractal Random Walk Model for Bursty Impulsive PLC Noise
by Steven O. Awino and Bakhe Nleya
Appl. Sci. 2026, 16(1), 49; https://doi.org/10.3390/app16010049 (registering DOI) - 20 Dec 2025
Abstract
The indoor low-voltage power line network is characterized by highly irregular interference, where background noise coexists with bursty impulsive noise originating from household appliances and switching events. Traditional noise models, which are monofractal, often fail to reproduce the clustering, intermittency, and long-range dependence seen in measurement data. In this paper, a Multifractal Random Walk (MRW) framework tailored for Power Line Communication (PLC) noise modelling is developed. The MRW is the continuous-time limit of discrete-time random walks with stochastic log-normal variance. As such, the formulated MRW framework introduces a stochastic volatility component that modulates Gaussian increments, generating heavy-tailed statistics and multifractal scaling laws that are consistent with the measured PLC noise data. Empirical validation is carried out through structure function analysis and the covariance of log-amplitudes, both of which reveal scaling characteristics that align well with MRW-simulated predictions. The proposed model captures both the bursty nature and the correlation structure of impulsive PLC noise more effectively than conventional monofractal approaches, thereby providing a mathematically grounded framework for accurate noise generation and robust system-level performance evaluation of PLC networks.
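The core mechanism described here (Gaussian increments modulated by a correlated log-normal volatility) can be illustrated with a short simulation. The sketch below is a minimal MRW-type noise generator assuming the standard Bacry–Delour–Muzy logarithmic covariance for the log-volatility; the intermittency parameter lam2 and integral scale L are illustrative values, not taken from the paper.

```python
import numpy as np

def mrw_noise(n=1024, lam2=0.05, L=512, sigma=1.0, seed=0):
    """Sample MRW-type increments: white Gaussian noise modulated by a
    log-normal volatility field with logarithmic covariance."""
    rng = np.random.default_rng(seed)
    d = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
    cov = np.where(d < L, lam2 * np.log(L / (1.0 + d)), 0.0)
    # Mean -lam2*ln(L) normalizes E[exp(2*omega)] to 1 (unit-variance increments).
    omega = rng.multivariate_normal(np.full(n, -lam2 * np.log(L)), cov,
                                    check_valid="ignore")
    return sigma * np.exp(omega) * rng.standard_normal(n)

eps = mrw_noise()
print(eps.std(), np.abs(eps).max())  # bursty: extremes far exceed the std
```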

14 pages, 977 KB  
Article
Maximizing Portfolio Diversification via Weighted Shannon Entropy: Application to the Cryptocurrency Market
by Florentin Șerban and Silvia Dedu
Risks 2025, 13(12), 253; https://doi.org/10.3390/risks13120253 - 18 Dec 2025
Abstract
This paper develops a robust portfolio optimization framework that integrates Weighted Shannon Entropy (WSE) into the classical mean–variance paradigm, offering a distribution-free approach to diversification suited for volatile and heavy-tailed markets. While traditional variance-based models are highly sensitive to estimation errors and instability in covariance structures—issues that are particularly acute in cryptocurrency markets—entropy provides a structural mechanism for mitigating concentration risk and enhancing resilience under uncertainty. By incorporating informational weights that reflect asset-specific characteristics such as volatility, market capitalization, and liquidity, the WSE model generalizes classical Shannon entropy and allows for more realistic, data-driven diversification profiles. Analytical solutions derived from the maximum entropy principle and Lagrange multipliers yield exponential-form portfolio weights that balance expected return, variance, and diversification. The empirical analysis examines two case studies: a four-asset cryptocurrency portfolio (BTC, ETH, SOL, and BNB) over January–March 2025, and an extended twelve-asset portfolio over April 2024–March 2025 with rolling rebalancing and proportional transaction costs. The results show that WSE portfolios achieve systematically higher entropy scores, more balanced allocations, and improved downside protection relative to both equal-weight and classical mean–variance portfolios. Risk-adjusted metrics confirm these improvements: WSE delivers higher Sharpe ratios and less negative Conditional Value-at-Risk (CVaR), together with reduced overexposure to highly volatile assets. Overall, the findings demonstrate that Weighted Shannon Entropy offers a transparent, flexible, and robust framework for portfolio construction in environments characterized by nonlinear dependencies, structural breaks, and parameter uncertainty. Beyond its empirical performance, the WSE model provides a theoretically grounded bridge between information theory and risk management, with strong potential for applications in algorithmic allocation, index construction, and regulatory settings where diversification and stability are essential. Moreover, the integration of informational weighting schemes highlights the capacity of WSE to incorporate both statistical properties and market microstructure signals, thereby enhancing its practical relevance for real-world investment decision-making.
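As a numerical illustration of the entropy-regularized allocation described above, the hedged sketch below maximizes expected return minus a variance penalty plus a weighted Shannon entropy term over the simplex. All inputs (returns, covariance, informational weights w_info, and the trade-off parameters gamma and delta) are illustrative placeholders, not values from the paper, and the numerical optimizer stands in for the paper's closed-form Lagrangian solution.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative inputs (not from the paper): expected returns, covariance,
# and informational weights reflecting volatility/liquidity characteristics.
mu = np.array([0.12, 0.10, 0.15, 0.09])          # BTC, ETH, SOL, BNB (hypothetical)
Sigma = np.diag([0.30, 0.25, 0.45, 0.20]) ** 2   # toy diagonal covariance
w_info = np.array([1.0, 1.0, 1.5, 0.8])          # heavier weight -> stronger concentration penalty

def objective(x, gamma=2.0, delta=0.5):
    wse = -np.sum(w_info * x * np.log(x + 1e-12))     # weighted Shannon entropy
    return -(mu @ x - gamma * x @ Sigma @ x + delta * wse)

cons = ({"type": "eq", "fun": lambda x: x.sum() - 1.0},)
res = minimize(objective, np.full(4, 0.25), bounds=[(0, 1)] * 4, constraints=cons)
print(res.x.round(4))   # more balanced than the pure mean-variance solution
```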

22 pages, 2261 KB  
Article
Statistical and Multivariate Analysis of the IoT-23 Dataset: A Comprehensive Approach to Network Traffic Pattern Discovery
by Humera Ghani, Shahram Salekzamankhani and Bal Virdee
J. Cybersecur. Priv. 2025, 5(4), 112; https://doi.org/10.3390/jcp5040112 - 16 Dec 2025
Viewed by 134
Abstract
The rapid expansion of Internet of Things (IoT) technologies has introduced significant challenges in understanding the complexity and structure of network traffic data, which is essential for developing effective cybersecurity solutions. This research presents a comprehensive statistical and multivariate analysis of the IoT-23 dataset to identify meaningful network traffic patterns and assess the effectiveness of various analytical methods for IoT security research. The study applies descriptive statistics, inferential analysis, and multivariate techniques, including Principal Component Analysis (PCA), DBSCAN clustering, and factor analysis (FA), to the publicly available IoT-23 dataset. Descriptive analysis reveals clear evidence of non-normal distributions: for example, the features src_bytes, dst_bytes, and src_pkts have skewness values of −4.21, −3.87, and −2.98, and kurtosis values of 38.45, 29.67, and 18.23, respectively. These values indicate highly skewed, heavy-tailed distributions with frequent outliers. Correlation analysis reveals a strong positive correlation (0.97) between orig_bytes and resp_bytes and a strong negative correlation (−0.76) between duration and resp_bytes, while inferential statistics indicate that linear regression provides optimal modeling of data relationships. Key findings show that PCA is highly effective, capturing 99% of the dataset's variance and enabling significant dimensionality reduction. DBSCAN clustering identifies six distinct clusters, highlighting diverse network traffic behaviors within IoT environments. In contrast, FA explains only 11.63% of the variance, indicating limited suitability for this dataset. These results establish important benchmarks for future IoT cybersecurity research and demonstrate the superior effectiveness of PCA and DBSCAN for analyzing complex IoT network traffic data. The findings offer practical guidance for researchers in selecting appropriate statistical methods for IoT dataset analysis, ultimately supporting the development of more robust cybersecurity solutions.
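The pipeline the abstract describes maps onto a few lines of scikit-learn. In this sketch the file name, feature columns, and DBSCAN settings (eps, min_samples) are assumptions for illustration; IoT-23 itself ships as labeled Zeek flow logs rather than a single CSV.

```python
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import DBSCAN

# Hypothetical loading step; adapt to however the flows were exported.
df = pd.read_csv("iot23_flows.csv")
X = df[["duration", "orig_bytes", "resp_bytes", "orig_pkts"]].fillna(0)

print(X.skew(), X.kurtosis())           # heavy tails show up as large values here

Xs = StandardScaler().fit_transform(X)
pca = PCA(n_components=0.99)            # keep components explaining 99% of variance
Z = pca.fit_transform(Xs)
print(pca.explained_variance_ratio_)

labels = DBSCAN(eps=0.5, min_samples=10).fit_predict(Z)
print(pd.Series(labels).value_counts()) # cluster sizes; -1 marks noise points
```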
(This article belongs to the Special Issue Intrusion/Malware Detection and Prevention in Networks—2nd Edition)

55 pages, 28888 KB  
Article
MECOA: A Multi-Strategy Enhanced Coati Optimization Algorithm for Global Optimization and Photovoltaic Models Parameter Estimation
by Hang Chen and Maomao Luo
Biomimetics 2025, 10(12), 839; https://doi.org/10.3390/biomimetics10120839 - 15 Dec 2025
Viewed by 176
Abstract
To address the limitations of the traditional Coati Optimization Algorithm (COA), such as insufficient global exploration, poor population cooperation, and low convergence efficiency in global optimization and photovoltaic (PV) model parameter identification, this paper proposes a Multi-strategy Enhanced Coati Optimization Algorithm (MECOA). MECOA improves performance through three core strategies: (1) Elite-guided search, which replaces the single global best solution with an elite pool of three top individuals and incorporates the heavy-tailed property of Lévy flights to balance large-step exploration and small-step exploitation; (2) Horizontal crossover, which simulates biological gene recombination to promote information sharing among individuals and enhance cooperative search efficiency; and (3) Precise elimination, which discards 20% of low-fitness individuals in each generation and generates new individuals around the best solution to improve population quality. Experiments on the CEC2017 (30/50/100-dimensional) and CEC2022 (20-dimensional) benchmark suites demonstrate that MECOA achieves superior performance. On CEC2017, MECOA ranks first with average ranks of 1.87, 2.07, and 1.83 across the three dimensionalities, outperforming the second-best LSHADE (2.03, 2.43, and 2.63) and the original COA (9.93, 9.93, and 9.96). On CEC2022, MECOA also maintains the leading position with an average rank of 1.58, far surpassing COA (8.92). Statistical analysis using the Wilcoxon rank-sum test (significance level 0.05) confirms the superiority of MECOA. Furthermore, MECOA is applied to parameter identification of single-diode (SDM) and double-diode (DDM) PV models. Experiments based on real measurement data show that the SDM achieves an RMSE of 9.8610 × 10⁻⁴, only 1/20 of that of COA. For the DDM, the fitted curves almost perfectly overlap with the experimental data, with a total integrated absolute error (IAE) of only 0.021555 A. These results validate the effectiveness and reliability of MECOA in solving complex engineering optimization problems, providing a robust and efficient solution for accurate modeling and optimization of PV systems.
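The heavy-tailed Lévy-flight ingredient of the elite-guided search can be sketched with Mantegna's algorithm, the standard way to draw Lévy-stable step lengths. The elite-pool update below is a hedged reconstruction of the idea, not the paper's exact MECOA update rule.

```python
import numpy as np
from math import gamma, pi, sin

def levy_step(beta=1.5, size=None, rng=np.random.default_rng()):
    """Mantegna's algorithm: heavy-tailed Levy-stable step lengths."""
    sigma_u = (gamma(1 + beta) * sin(pi * beta / 2) /
               (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0, sigma_u, size)
    v = rng.normal(0, 1, size)
    return u / np.abs(v) ** (1 / beta)

def elite_guided_move(x, elite_pool, rng=np.random.default_rng()):
    """Hedged sketch: step toward a random member of a 3-individual elite
    pool, scaled by a Levy step so most moves are small but occasional
    jumps are large (the heavy-tailed exploration the abstract describes)."""
    e = elite_pool[rng.integers(len(elite_pool))]
    return x + levy_step(size=x.shape, rng=rng) * (e - x)
```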

35 pages, 4673 KB  
Article
Advances in Discrete Lifetime Modeling: A Novel Discrete Weibull Mixture Distribution with Applications to Medical and Reliability Studies
by Doha R. Salem, Mai A. Hegazy, Hebatalla H. Mohammad, Zakiah I. Kalantan, Gannat R. AL-Dayian, Abeer A. EL-Helbawy and Mervat K. Abd Elaal
Symmetry 2025, 17(12), 2140; https://doi.org/10.3390/sym17122140 - 12 Dec 2025
Viewed by 139
Abstract
In recent years, there has been growing interest in discrete probability distributions due to their ability to model the complex behavior of real-world count data. In this paper, a new discrete mixture distribution based on two Weibull components is introduced, constructed using the general discretization approach. Several important statistical properties of the proposed distribution, including the survival function, hazard rate function, alternative hazard rate function, moments, quantile function, and order statistics, are derived. The descriptive measures show that the discrete mixture of two Weibull distributions transitions from being positively skewed with heavy tails to a more symmetric and light-tailed form, demonstrating its high flexibility in capturing a wide range of shapes as its parameter values vary. Estimation of the parameters is performed via maximum likelihood under a Type II censoring scheme. A simulation study assesses the performance of the maximum likelihood estimators. Furthermore, the applicability of the proposed distribution is demonstrated using two real-life datasets. In summary, this paper constructs the discrete mixture of two Weibull distributions, investigates its statistical characteristics, and estimates its parameters, demonstrating its flexibility and practical applicability. These results highlight its potential as a powerful tool for modeling complex discrete data.
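Under the general discretization approach mentioned above, the pmf follows directly from the survival function of the continuous mixture: P(X = k) = S(k) − S(k + 1). A minimal sketch, assuming the usual (scale, shape) Weibull parameterization and a mixing weight p (the paper's notation may differ):

```python
import numpy as np

def surv_mix(x, p, a1, b1, a2, b2):
    """Survival function of a two-component continuous Weibull mixture."""
    return p * np.exp(-(x / a1) ** b1) + (1 - p) * np.exp(-(x / a2) ** b2)

def pmf_discrete_weibull_mix(k, p, a1, b1, a2, b2):
    """General discretization: P(X = k) = S(k) - S(k + 1), k = 0, 1, ..."""
    k = np.asarray(k, dtype=float)
    return surv_mix(k, p, a1, b1, a2, b2) - surv_mix(k + 1, p, a1, b1, a2, b2)

k = np.arange(0, 30)
pk = pmf_discrete_weibull_mix(k, p=0.6, a1=2.0, b1=0.8, a2=8.0, b2=2.5)
print(pk.sum())   # approaches 1 as the support is extended
```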

22 pages, 10850 KB  
Article
Characterization and Quantification of Methane Emission Plumes and Super-Emitter Detection Across North-Central Brazil Using Hyperspectral Satellite Data
by Gabriel I. Cotlier, Vitor F. V. V. de Miranda and Juan Carlos Jimenez
Remote Sens. 2025, 17(24), 3973; https://doi.org/10.3390/rs17243973 - 9 Dec 2025
Viewed by 292
Abstract
Methane (CH₄) is a potent greenhouse gas and a key target for near-term climate mitigation, yet major uncertainties remain in quantifying emissions from landfills, particularly in rapidly urbanizing regions of the Global South. Here, we present a systematic satellite-based assessment of CH₄ emissions from landfills and related sites across northern and central Brazil, based on plume detections from the Carbon Mapper public data portal. Using imaging spectroscopy data from the Earth Surface Mineral Dust Source Investigation (EMIT) onboard the International Space Station and the dedicated Tanager-1 satellite, we analyzed 40 plume detections across 16 sites in nine Brazilian states spanning the Amazon forest biome and the Cerrado transition region. An adaptive thresholding algorithm was applied to each detection to quantify plume strength (ppm·m³), areal extent, and recurrence across multiple overpasses. Our results reveal a strongly heavy-tailed distribution of emissions, with most sites exhibiting modest plume strengths in the 10⁶–10⁷ ppm·m³ range, while a small number of facilities dominated the upper tail. Two detections at Brasília (2.22 × 10⁸ and 2.14 × 10⁸ ppm·m³) and one at Marituba (1.66 × 10⁸ ppm·m³) were classified as super-emitters, exceeding all other sites by more than an order of magnitude. These facilities also demonstrated high persistence across overpasses, in contrast to smaller landfills such as Macapá and Boa Vista, where emissions were weaker (<10⁷ ppm·m³) and episodic. Regional contrasts were also evident: sites in the Cerrado transition zone (e.g., Brasília, Campo Grande) generally showed stronger and more frequent emissions than those in the Amazon basin. These findings underscore the disproportionate role of a few persistent super-emitters in shaping the regional CH₄ budget. Targeted mitigation at these high-impact sites could yield rapid and cost-effective emission reductions, directly supporting Brazil's commitments under the Paris Agreement and the Global Methane Pledge. More broadly, this study demonstrates the power of high-resolution satellite imaging spectroscopy for identifying, monitoring, and prioritizing CH₄ mitigation opportunities in the waste sector.
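A hedged sketch of what adaptive thresholding of a plume scene can look like: flag pixels whose column enhancement exceeds a robust background level, then integrate the above-background enhancement over the plume area. The Carbon Mapper retrieval and the paper's exact algorithm are more involved; the MAD-based threshold and the factor k here are assumptions for illustration.

```python
import numpy as np

def plume_strength(enh, pixel_area_m2, k=3.0):
    """Adaptive-threshold plume quantification (illustrative sketch):
    threshold = background median + k * robust std (MAD-based), then
    integrate the above-threshold enhancement (ppm*m) over the masked
    pixels, giving a strength in ppm*m^3."""
    med = np.nanmedian(enh)                         # background level
    mad = 1.4826 * np.nanmedian(np.abs(enh - med))  # robust spread
    mask = enh > med + k * mad                      # adaptive plume mask
    return np.nansum(enh[mask] - med) * pixel_area_m2, mask
```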

28 pages, 683 KB  
Article
A New Topp–Leone Heavy-Tailed Odd Burr X-G Family of Distributions with Applications
by Fastel Chipepa, Bassant Elkalzah, Broderick Oluyede, Neo Dingalo and Abdurahman Aldukeel
Symmetry 2025, 17(12), 2093; https://doi.org/10.3390/sym17122093 - 5 Dec 2025
Viewed by 166
Abstract
This paper introduces the Topp–Leone Heavy-Tailed Odd Burr X-G (TL-HT-OBX-G) family of distributions (FOD), designed to model diverse data patterns. The new distribution is an infinite linear combination of the established exponentiated-G distributions, whose known properties are used to infer those of the new FOD. The properties considered include the quantile function, moments and moment generating functions, probability-weighted moments, order statistics, stochastic orderings, and Rényi entropy. Parameter estimation is performed using multiple techniques, including maximum likelihood, least squares, weighted least squares, Anderson–Darling, Cramér–von Mises, and Right-Tail Anderson–Darling; maximum likelihood produced superior results in the Monte Carlo simulation studies. A special case of the developed model was applied to three real-world datasets, with parameters estimated by maximum likelihood. The selected special model was compared to competing models using several goodness-of-fit statistics and fit the three datasets better than all of them. The new FOD provides a fresh framework for data modeling in the health sciences and reliability studies.
(This article belongs to the Section Mathematics)

22 pages, 1405 KB  
Article
Entropy-Based Evidence Functions for Testing Dilation Order via Cumulative Entropies
by Mashael A. Alshehri
Entropy 2025, 27(12), 1235; https://doi.org/10.3390/e27121235 - 5 Dec 2025
Viewed by 154
Abstract
This paper introduces novel non-parametric entropy-based evidence functions, constructed from cumulative residual entropy and cumulative entropy, together with associated test statistics for assessing the dilation order of probability distributions. The proposed evidence functions are explicitly tuned to questions about distributional variability and stochastic ordering, rather than global model fit, and are developed within a rigorous evidential framework. Their asymptotic distributions are established, providing a solid foundation for large-sample inference. Beyond their theoretical appeal, these procedures act as effective entropy-driven tools for quantifying statistical evidence, offering a compelling non-parametric alternative to traditional approaches such as Kullback–Leibler discrepancies. Comprehensive Monte Carlo simulations highlight their robustness and consistently high power across a wide range of distributional scenarios, including heavy-tailed models, where conventional methods often perform poorly. A real-data example further illustrates their practical utility, showing how cumulative entropies can provide sharper statistical evidence and clarify stochastic comparisons in applied settings. Altogether, these results advance the theoretical foundation of evidential statistics and open avenues for applying cumulative entropies to broader classes of stochastic inference problems.
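The two building blocks named in the abstract have simple plug-in estimators: cumulative residual entropy CRE = −∫ S(t) ln S(t) dt and cumulative entropy CE = −∫ F(t) ln F(t) dt, with the empirical cdf piecewise constant between order statistics. The paper's evidence functions build on such quantities; this sketch computes only the raw empirical estimators.

```python
import numpy as np

def cumulative_entropies(sample):
    """Empirical cumulative residual entropy (CRE) and cumulative entropy (CE),
    with F the empirical cdf (survival S = 1 - F) held constant on each gap
    between consecutive order statistics."""
    x = np.sort(np.asarray(sample, dtype=float))
    n = len(x)
    dx = np.diff(x)                # gaps between order statistics
    F = np.arange(1, n) / n        # empirical cdf value on each gap (never 0 or 1)
    S = 1.0 - F
    cre = -np.sum(dx * S * np.log(S))
    ce = -np.sum(dx * F * np.log(F))
    return cre, ce

print(cumulative_entropies(np.random.default_rng(0).exponential(size=500)))
```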

19 pages, 560 KB  
Article
Modeling PM2.5 Pollution Using a Truncated Positive Student’s-t Distribution: A Case Study in Chile
by Héctor J. Gómez, Karol I. Santoro, Diego I. Gallardo, Paola E. Leal and Tiago M. Magalhães
Mathematics 2025, 13(23), 3838; https://doi.org/10.3390/math13233838 - 30 Nov 2025
Viewed by 170
Abstract
This study revisits a recently proposed member of the truncated positive family of distributions, referred to as the positively truncated Student's-t distribution. The distribution retains the structure of the classical Student's-t distribution while explicitly incorporating a kurtosis parameter, yielding a flexible three-parameter formulation that governs location, scale, and tail behavior. A closed-form quantile function is derived, allowing a novel reparameterization based on the pth quantile and thereby facilitating integration into quantile regression models. The analytical tractability of the quantile function also enables efficient random number generation via the inverse transform method, which supports a comprehensive simulation study demonstrating the strong performance of the proposed estimators, particularly for the degrees-of-freedom parameter. The entire methodology is implemented in the tpn package for R. Finally, two real-data applications involving PM2.5 measurements—one without covariates and another with covariates—highlight the model's robustness and its ability to capture heavy-tailed behavior.
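The closed-form quantile function and the inverse-transform sampler can be written down directly, assuming the distribution is a location-scale Student's-t truncated to the positive half-line (this parameterization is an assumption; the tpn package implements the authors' exact version):

```python
import numpy as np
from scipy.stats import t

def tpt_quantile(p, mu, sigma, nu):
    """Quantile of Student's-t(mu, sigma, nu) truncated to (0, inf):
    invert F(x) = (T(z) - T(z0)) / (1 - T(z0)), z = (x - mu)/sigma, z0 = -mu/sigma."""
    T0 = t.cdf(-mu / sigma, df=nu)
    return mu + sigma * t.ppf(T0 + p * (1.0 - T0), df=nu)

def tpt_rvs(n, mu, sigma, nu, rng=np.random.default_rng()):
    """Inverse-transform sampling, as the abstract describes."""
    return tpt_quantile(rng.uniform(size=n), mu, sigma, nu)

x = tpt_rvs(10_000, mu=20.0, sigma=10.0, nu=4)
print(x.min() > 0, np.quantile(x, 0.5), tpt_quantile(0.5, 20.0, 10.0, 4))
```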
(This article belongs to the Special Issue Mathematical Modelling and Applied Statistics)

18 pages, 684 KB  
Article
A New Goodness-of-Fit Test for Azzalini’s Skew-t Distribution Based on the Energy Distance Framework with Applications
by Joseph Njuki and Abeer M. Hasan
Mathematics 2025, 13(23), 3833; https://doi.org/10.3390/math13233833 - 29 Nov 2025
Viewed by 250
Abstract
In response to the growing need for flexible parametric models for skewed and heavy-tailed data, this paper introduces a novel goodness-of-fit test for Azzalini's Skew-t distribution. Traditional methods often fail to capture the complex behavior of data in fields such as engineering, public health, and the social sciences. Our proposed test, based on energy statistics, provides practitioners with a robust and powerful tool for assessing the suitability of the Skew-t distribution for their data. We present a comprehensive methodological evaluation, including a comparative study that highlights the advantages of our approach over traditional tests. The results of our simulation studies demonstrate a significant improvement in power, leading to more reliable inference. To further showcase the practical utility of our method, we apply the proposed test to three real-world datasets, offering a valuable contribution to both the theoretical and applied aspects of statistical modeling for non-normal data.
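A hedged sketch of an energy-statistic goodness-of-fit check for the Skew-t: sample from the fitted distribution (via the standard skew-normal/chi-square representation of Azzalini's skew-t), compute the two-sample energy statistic, and calibrate it by parametric bootstrap. The paper's test statistic and estimation step differ in detail; the parameter values below are placeholders.

```python
import numpy as np

def rskewt(n, xi, omega, alpha, nu, rng):
    """Sample Azzalini's skew-t via the skew-normal / chi-square representation."""
    delta = alpha / np.sqrt(1 + alpha ** 2)
    u0, u1 = rng.standard_normal(n), rng.standard_normal(n)
    z = delta * np.abs(u0) + np.sqrt(1 - delta ** 2) * u1   # skew-normal draw
    return xi + omega * z / np.sqrt(rng.chisquare(nu, n) / nu)

def energy_stat(x, y):
    """Two-sample energy statistic E = 2 E|X-Y| - E|X-X'| - E|Y-Y'| (1-D)."""
    dxy = np.abs(x[:, None] - y[None, :]).mean()
    dxx = np.abs(x[:, None] - x[None, :]).mean()
    dyy = np.abs(y[:, None] - y[None, :]).mean()
    return 2 * dxy - dxx - dyy

rng = np.random.default_rng(1)
fitted = dict(xi=0.0, omega=1.0, alpha=3.0, nu=5)       # assume params were estimated
data = rskewt(200, **fitted, rng=rng)                   # stand-in data
y_ref = rskewt(1000, **fitted, rng=rng)                 # Monte Carlo reference sample
obs = energy_stat(data, y_ref)
boot = [energy_stat(rskewt(len(data), **fitted, rng=rng), y_ref) for _ in range(200)]
print("p-value ~", np.mean(np.array(boot) >= obs))      # parametric bootstrap calibration
```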

16 pages, 1639 KB  
Article
Event-Driven Average Estimation with Dispersion-Tolerant Poisson–Inverse Gaussian Approach
by Atef F. Hashem, Asmaa S. Al-Moisheer, Ahmet Bekir, Ishfaq Ahmad and Muhammad Raza
Mathematics 2025, 13(23), 3822; https://doi.org/10.3390/math13233822 - 28 Nov 2025
Viewed by 192
Abstract
Overdispersion is a major problem in count data analysis, and classical Poisson regression estimators are generally unreliable because they assume the mean equals the variance. In this article, an event-driven class of average estimators based on the Poisson–inverse Gaussian (P-IG) regression model is formulated to overcome this shortcoming. P-IG regression mixes the Poisson model with an inverse Gaussian component, representing count data by a compound distribution whose heavy-tailed inverse Gaussian part accommodates the overdispersion often found in real data. The suggested estimators are more effective in estimating population means under overdispersion by using auxiliary data in the form of covariates. The design-based framework specifies the statistical properties of the proposed estimators with respect to their bias and mean squared error (MSE). To confirm the effectiveness and strength of the suggested methodology, extensive simulations and real-data applications are carried out, contrasting it with customary Poisson-based estimators. The results indicate that the P-IG-based estimators are superior to their counterparts. The study provides a statistically sound and practically useful advance in survey sampling and count data regression, giving researchers and practitioners a strong alternative to classical Poisson-regression-based mean estimation procedures.
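The mixture structure is easy to simulate: a mean-one inverse Gaussian frailty multiplies the Poisson rate exp(xᵀβ), producing variance well above the mean. This sketch only illustrates the overdispersion mechanism; the paper's estimators and their design-based properties are not reproduced here, and all parameter values are illustrative.

```python
import numpy as np
from scipy.stats import invgauss, poisson

rng = np.random.default_rng(0)
n, beta = 5000, np.array([0.5, 0.8])
X = np.column_stack([np.ones(n), rng.normal(size=n)])   # intercept + one covariate

m = 2.0                                                  # larger -> more overdispersion
frailty = invgauss.rvs(mu=m, size=n, random_state=rng) / m   # mean-1 inverse Gaussian mixing
y = poisson.rvs(frailty * np.exp(X @ beta), random_state=rng)

print(y.mean(), y.var())   # variance well above the mean: Poisson would be misspecified
```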

77 pages, 18938 KB  
Article
Rainfall Disaggregation in Data-Scarce Regions Using the Random Bartlett-Lewis Rectangular Pulse Model
by Sofia Skroufouta and Evangelos Baltas
Climate 2025, 13(12), 242; https://doi.org/10.3390/cli13120242 - 27 Nov 2025
Viewed by 334
Abstract
Rainfall disaggregation is a key challenge in hydrology, especially in regions with limited high-resolution records. This study applies the Random Bartlett–Lewis Rectangular Pulse Model (RBLRPM) to four regions of Hellas to generate hourly rainfall from daily totals. The work is novel in evaluating the model under data-scarce Mediterranean conditions, incorporating a two-tiered uncertainty analysis, testing alternative pulse intensity distributions (Gamma and Exponential), and comparing its performance with a deterministic machine learning (ML) approach. Results show that the RBLRPM reproduces essential rainfall properties such as variance, autocorrelation, skewness, and dry spell probabilities, even when calibrated with as little as three years of data. The ML approach ensures perfect conservation of daily totals and computational efficiency, but it smooths temporal variability and underestimates extremes. By contrast, the stochastic RBLRPM captures clustering, intermittency, and heavy tails more realistically, which is crucial for hydrological design and flood risk analysis. The Gamma distribution consistently outperforms the Exponential form, though both remain applicable. Overall, the Gamma-based RBLRPM offers a robust and transferable method for rainfall disaggregation in data-limited contexts, highlighting the importance of stochastic approaches for water resource management, infrastructure resilience, and climate adaptation.
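The rectangular-pulse mechanism is compact enough to sketch: Poisson storm arrivals, a burst of cells per storm, and each cell contributing a constant intensity over an exponential duration. The sketch below uses a coarse hourly grid and illustrative parameters; the 'random' variant in the paper additionally randomizes cell parameters per storm, and calibration to observed statistics is omitted.

```python
import numpy as np

rng = np.random.default_rng(7)

def blrpm_hourly(T=24 * 365, lam=0.02, beta=0.3, gamma=0.1, eta=2.0,
                 int_shape=1.2, int_scale=1.5):
    """Hedged Bartlett-Lewis rectangular-pulse sketch on an hourly grid:
    Poisson storm arrivals (rate lam); within each storm, cells over an
    exponential activity window (rate gamma, cell rate beta); each cell is
    a rectangular pulse with exponential duration (rate eta) and Gamma
    intensity. Partial-hour coverage is crudely rounded up."""
    rain = np.zeros(T)
    t_storm = rng.exponential(1 / lam)
    while t_storm < T:
        window = rng.exponential(1 / gamma)        # storm activity window
        n_cells = rng.poisson(beta * window) + 1   # include the storm-origin cell
        starts = t_storm + rng.uniform(0, window, n_cells)
        durations = rng.exponential(1 / eta, n_cells)
        intensities = rng.gamma(int_shape, int_scale, n_cells)
        for s, d, i in zip(starts, durations, intensities):
            a, b = int(s), min(int(np.ceil(s + d)), T)
            rain[a:b] += i                         # rectangular pulse
        t_storm += rng.exponential(1 / lam)
    return rain

hourly = blrpm_hourly()
print(hourly.mean(), (hourly == 0).mean())         # mean hourly depth, dry fraction
```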
(This article belongs to the Special Issue Advances of Flood Risk Assessment and Management)

17 pages, 3983 KB  
Article
Reference Static Pressure Effect on Fluctuating Wind Pressure on Roofs of Low-Rise Buildings in Open-Circuit Wind Tunnels
by Mengchang Yang, Enguang Wang and Il-Seung Yang
Buildings 2025, 15(23), 4208; https://doi.org/10.3390/buildings15234208 - 21 Nov 2025
Viewed by 242
Abstract
The structural characteristics of open-circuit wind tunnels result in internal static pressure instability, which can affect the accuracy of wind pressure coefficient measurements on rigid models of low-rise buildings. To address this issue, a Pitot tube and an ESP electronic pressure scanning system were used to collect data on reference static pressure variation and wind pressure on the roof of a low-rise building under Class B wind terrain and different temperature conditions. The results indicate that the reference static pressure decreases with increasing temperature and is significantly influenced by external airflow disturbances at the beginning of the experiment; it stabilizes approximately 5 min after the wind tunnel is activated. The probability density of the reference static pressure under different conditions mostly follows a Gaussian distribution, although a few samples exhibit heavy-tailed or skewed fluctuations. The sliding standard deviation and coefficient of variation of the reference static pressure are both relatively small, but occasional samples show extreme fluctuations; applying filtering techniques or repeated measurements is recommended to reduce experimental errors. At a wind direction angle of 0°, the fluctuating wind pressure coefficients on the roof calculated using reference static pressures from different samples exhibit good consistency. The average mean relative error of the fluctuating wind pressure coefficients across roof zones I–VIII was 3.71%, which is within an acceptable range. These findings provide a useful reference for reducing errors in wind pressure tests conducted in open-circuit wind tunnels.
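The sliding standard deviation and coefficient of variation used to screen the reference-pressure record are one-liners with pandas; the sampling rate, window length, and the stand-in pressure series below are assumptions for illustration (per the abstract, the first ~5 min after tunnel start should be discarded before such screening).

```python
import numpy as np
import pandas as pd

fs, window_s = 100, 10                                   # assumed sampling rate and window
rng = np.random.default_rng(3)
p = pd.Series(rng.normal(5.0, 0.05, 60 * fs))            # stand-in reference-pressure record

roll = p.rolling(window_s * fs)
sliding_std = roll.std()                                 # sliding standard deviation
cv = sliding_std / roll.mean()                           # sliding coefficient of variation
print(sliding_std.max(), cv.max())                       # flag windows with extreme fluctuation
```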
(This article belongs to the Section Building Structures)

25 pages, 515 KB  
Article
Prioritizing Longitudinal Gene–Environment Interactions Using an FDR-Assisted Robust Bayesian Linear Mixed Model
by Xiaoxi Li, Kun Fan and Cen Wu
Algorithms 2025, 18(11), 728; https://doi.org/10.3390/a18110728 - 19 Nov 2025
Viewed by 403
Abstract
Analysis of longitudinal data in high-dimensional gene–environment interaction studies has been extensively conducted using variable selection methods. Despite their success, these studies have been consistently challenged by the lack of uncertainty quantification procedures for identifying main and interaction effects under longitudinal phenotypes that follow heavy-tailed distributions due to disease heterogeneity. In this article, to improve the statistical rigor of variable-selection-based G × E analysis, we propose a robust Bayesian linear mixed-effect model with a false discovery rate (FDR) control procedure to tackle these challenges. The Bayesian mixed model adopts a robust likelihood function to account for skewness in longitudinal phenotypic measurements, and it imposes spike-and-slab priors to detect important main and interaction effects. Leveraging the parallelism between spike-and-slab priors and the Bayesian approach to hypothesis testing, we perform variable selection and uncertainty quantification through a Bayesian FDR-assisted procedure. Numerical analyses have demonstrated the advantage of our proposal over alternative approaches. A case study of a longitudinal cancer prevention study with high-dimensional lipid measures yields main and interaction effects with important biological implications.
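The FDR-assisted selection step can be sketched from posterior inclusion probabilities (PIPs) produced by a spike-and-slab fit: treat 1 − PIP as a local Bayesian fdr and keep the largest set whose running average stays below the target level. This is the generic Bayesian FDR recipe, offered as a hedged sketch rather than the paper's exact procedure.

```python
import numpy as np

def bayes_fdr_select(pip, alpha=0.1):
    """Bayesian FDR control from posterior inclusion probabilities (PIPs):
    sort local fdr = 1 - PIP ascending and keep the largest prefix whose
    running mean stays below alpha."""
    lfdr = 1.0 - np.asarray(pip)
    order = np.argsort(lfdr)
    running = np.cumsum(lfdr[order]) / np.arange(1, len(lfdr) + 1)
    k = np.sum(running <= alpha)                 # size of the selected set
    selected = np.zeros(len(lfdr), dtype=bool)
    selected[order[:k]] = True
    return selected

pip = np.array([0.99, 0.97, 0.90, 0.55, 0.30])   # e.g., spike-and-slab inclusion probs
print(bayes_fdr_select(pip, alpha=0.05))         # keeps the first three effects
```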

20 pages, 873 KB  
Article
Multi-Sensor Recursive EM Algorithm for Robust Identification of ARX Models
by Xin Chen and Jiale Li
Sensors 2025, 25(22), 7060; https://doi.org/10.3390/s25227060 - 19 Nov 2025
Viewed by 403
Abstract
A robust multi-sensor recursive Expectation-Maximization (RMSREM) algorithm is proposed in this paper for autoregressive eXogenous (ARX) models, addressing the challenges of heavy-tailed noise, as well as the difficulty in simultaneously processing multi-sensor information. First, for the potential outliers in industrial processes, the Student’s t-distribution is introduced to model the statistical characteristics of measurement noise, whose heavy-tailed property enhances the algorithm’s robustness. Second, a recursive framework is integrated into the Expectation-Maximization (EM) algorithm to satisfy the real-time requirement of dynamic system identification. Through a recursive scheme of the Q-function and sufficient statistics, model parameters are updated in real-time, allowing them to adapt to time-varying system characteristics. Finally, by exploiting the redundancy and complementarity of multi-sensor data, a multi-sensor information fusion mechanism is designed that adaptively calculates the weight of each sensor based on the noise variances. This mechanism effectively fuses multi-source observation information and mitigates the impact of single-sensor failure or inaccuracy on identification performance. Numerical examples and simulations of the continuous stirred-tank reactor (CSTR) demonstrate the validity of the proposed RMSREM algorithm.
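The robustness mechanism can be sketched as a recursive least-squares update in which each sample is down-weighted by the Student's-t E-step weight w = (ν + 1)/(ν + r²/σ²). This is a simplified single-sensor caricature of the idea; the paper's RMSREM recursifies the full Q-function, updates the noise statistics, and fuses multiple sensors by their estimated noise variances.

```python
import numpy as np

def robust_rls_t(phi, y, theta0, P0, nu=4.0, sigma2=1.0):
    """Robust recursive update for an ARX regression y[k] = phi[k]' theta + e[k]:
    each sample is down-weighted by the Student's-t E-step weight
    w = (nu + 1) / (nu + r^2 / sigma^2), so outliers (large residual r)
    barely move the parameter estimate."""
    theta, P = theta0.copy(), P0.copy()
    for k in range(len(y)):
        f = phi[k]
        r = y[k] - f @ theta                       # prediction residual
        w = (nu + 1.0) / (nu + r ** 2 / sigma2)    # t-distribution weight
        K = P @ f / (sigma2 / w + f @ P @ f)       # weighted gain
        theta = theta + K * r
        P = P - np.outer(K, f) @ P
    return theta

# Example call (Phi: regressor matrix, y: outputs):
# theta = robust_rls_t(Phi, y, np.zeros(2), 1e3 * np.eye(2))
```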
(This article belongs to the Section Industrial Sensors)