Search Results (497)

Search Parameters:
Keywords = empirical likelihood

25 pages, 3746 KiB  
Article
Empirical Modelling of Ice-Jam Flood Hazards Along the Mackenzie River in a Changing Climate
by Karl-Erich Lindenschmidt, Sergio Gomez, Jad Saade, Brian Perry and Apurba Das
Water 2025, 17(15), 2288; https://doi.org/10.3390/w17152288 - 1 Aug 2025
Abstract
This study introduces a novel methodology for assessing ice-jam flood hazards along river channels. It employs empirical equations that relate non-dimensional ice-jam stage to discharge, enabling the generation of an ensemble of longitudinal profiles of ice-jam backwater levels through Monte-Carlo simulations. These simulations produce non-exceedance probability profiles, which indicate the likelihood of various flood levels occurring due to ice jams. The flood levels associated with specific return periods were validated using historical gauge records. The empirical equations require input parameters such as channel width, slope, and thalweg elevation, which were obtained from bathymetric surveys. This approach is applied to assess ice-jam flood hazards by extrapolating data from a gauged reach at Fort Simpson to an ungauged reach at Jean Marie River along the Mackenzie River in Canada’s Northwest Territories. The analysis further suggests that climate change is likely to increase the severity of ice-jam flood hazards in both reaches by the end of the century. This methodology is applicable to other cold-region rivers in Canada and northern Europe, provided similar fluvial geomorphological and hydro-meteorological data are available, making it a valuable tool for ice-jam flood risk assessment in other ungauged areas. Full article
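
As a rough illustration of the Monte-Carlo non-exceedance calculation described above, the sketch below assumes a hypothetical power-law relation between unit discharge and non-dimensional ice-jam stage; the paper's calibrated empirical equations, surveyed channel geometry, and discharge distribution are not reproduced in the abstract, so every numeric value here is a placeholder.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical reach geometry (placeholders, not the paper's surveyed values).
width_m = 1200.0          # channel width
slope = 1.5e-4            # channel slope
thalweg_m = 115.0         # thalweg elevation at the profile point

def ice_jam_stage(discharge, width, slope, thalweg):
    """Illustrative stage-discharge relation.

    Stand-in power law; the paper's calibrated empirical equations relating
    non-dimensional ice-jam stage to discharge are not reproduced in the abstract.
    """
    q_unit = discharge / width                     # unit discharge (m^2/s)
    eta = 0.35 * (q_unit / np.sqrt(slope)) ** 0.6  # non-dimensional stage (assumed)
    return thalweg + eta * 10.0                    # backwater level (m), assumed scaling

# Monte Carlo ensemble: sample break-up discharges (log-normal, assumed).
discharges = rng.lognormal(mean=np.log(8000), sigma=0.35, size=20_000)
levels = ice_jam_stage(discharges, width_m, slope, thalweg_m)

# Non-exceedance probability of candidate flood levels (empirical CDF).
candidate_levels = np.linspace(levels.min(), levels.max(), 9)
for lvl in candidate_levels:
    p_non_exceed = np.mean(levels <= lvl)
    print(f"level {lvl:7.2f} m  non-exceedance P = {p_non_exceed:5.3f}")
```
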
22 pages, 3440 KiB  
Article
Probabilistic Damage Modeling and Thermal Shock Risk Assessment of UHTCMC Thruster Under Transient Green Propulsion Operation
by Prakhar Jindal, Tamim Doozandeh and Jyoti Botchu
Materials 2025, 18(15), 3600; https://doi.org/10.3390/ma18153600 - 31 Jul 2025
Abstract
This study presents a simulation-based damage modeling and fatigue risk assessment of a reusable ceramic matrix composite thruster designed for short-duration, green bipropellant propulsion systems. The thruster is constructed from a fiber-reinforced ultra-high temperature ceramic matrix composite composed of zirconium diboride, silicon carbide, and carbon fibers. Time-resolved thermal and structural simulations are conducted on a validated thruster geometry to characterize the severity of early-stage thermal shock, stress buildup, and potential degradation pathways. Unlike traditional fatigue studies that rely on empirical fatigue constants or Paris-law-based crack-growth models, this work introduces a simulation-derived stress-margin envelope methodology that incorporates ±20% variability in temperature-dependent material strength, offering a physically grounded yet conservative risk estimate. From this, a normalized risk index is derived to evaluate the likelihood of damage initiation in critical regions over the 0–10 s firing window. The results indicate that the convergent throat region experiences a peak thermal gradient rate of approximately 380 K/s, with the normalized thermal shock index exceeding 43. Stress margins in this region collapse by 2.3 s, while margin loss in the flange curvature appears near 8 s. These findings are mapped into green, yellow, and red risk bands to classify operational safety zones. All the results assume no active cooling, representing conservative operating limits. If regenerative or ablative cooling is implemented, these margins would improve significantly. The framework established here enables a transparent, reproducible methodology for evaluating lifetime safety in ceramic propulsion nozzles and serves as a foundational tool for fatigue-resilient component design in green space engines. Full article
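
The stress-margin-envelope idea lends itself to a compact numerical sketch. The snippet below is a hedged illustration only: the stress and strength histories, the ±20% variability band, and the green/yellow/red thresholds are all assumed values, not the paper's simulation output.

```python
import numpy as np

# Illustrative throat-region histories (placeholder values): time in seconds,
# stress and temperature-dependent strength in MPa.
time_s = np.linspace(0.0, 10.0, 101)
stress = 120.0 * (1.0 - np.exp(-time_s / 1.5))      # assumed stress build-up
strength = 260.0 - 12.0 * time_s                    # assumed strength loss with heating

# +/-20% strength variability gives a stress-margin envelope; the lower
# bound is the conservative margin used for risk classification.
margin_hi = 1.2 * strength - stress
margin_lo = 0.8 * strength - stress
risk_index = stress / (0.8 * strength)              # >= 1 means conservative margin collapse

collapsed = np.where(margin_lo <= 0.0)[0]
t_collapse = time_s[collapsed[0]] if collapsed.size else None
print("conservative margin collapse at t =", t_collapse, "s")

# Green / yellow / red operational bands (thresholds assumed for illustration).
bands = np.select([risk_index < 0.6, risk_index < 1.0], ["green", "yellow"], default="red")
for t, m_lo, m_hi, band in zip(time_s[::25], margin_lo[::25], margin_hi[::25], bands[::25]):
    print(f"t={t:4.1f}s  margin=[{m_lo:6.1f}, {m_hi:6.1f}] MPa  band={band}")
```
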
31 pages, 19389 KiB  
Article
Enhancing Prediction by Incorporating Entropy Loss in Volatility Forecasting
by Renaldas Urniezius, Rytis Petrauskas, Vygandas Vaitkus, Javid Karimov, Kestutis Brazauskas, Jolanta Repsyte, Egle Kacerauskiene, Torsten Harms, Jovita Dargiene and Darius Ezerskis
Entropy 2025, 27(8), 806; https://doi.org/10.3390/e27080806 - 28 Jul 2025
Abstract
In this paper, we propose examining Heterogeneous Autoregressive (HAR) models using five different estimation techniques and four different estimation horizons to decide which performs better in terms of forecasting accuracy. Several different estimators are used to determine the coefficients of three selected HAR-type models. Furthermore, model lags, calculated using 5 min intraday data from the Standard & Poor’s 500 (SPX) index and the Chicago Board Options Exchange Volatility (VIX) index as the sole exogenous variable, enrich the models. For comparison and evaluation of the experimental results, we use three metrics: Quasi-Likelihood (QLIKE), Mean Absolute Error (MAE), and Mean Squared Error (MSE). An empirical study reveals that the Entropy Loss Function consistently achieves the best QLIKE results in all the horizons, especially in the weekly horizon. On the other hand, the performance of the Robust Linear Model implies that it can provide an alternative to the Entropy Loss Function when considering the results of the MAE and MSE metrics. Moreover, research shows that adding more informative lags, such as Realized Quarticity for the Heterogeneous Autoregressive model yielding the Realized Quarticity (HARQ) model, and incorporating the VIX index further improve the general results of the models. The results of the proposed Entropy Loss Function and Robust Linear Model suggest that they successfully achieve significant forecasting accuracy for HAR models across multiple forecasting horizons. Full article
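
For readers unfamiliar with the evaluation metrics, the sketch below implements QLIKE (in one common parameterization, which may differ from the authors' exact definition), MAE, and MSE on synthetic realized-variance forecasts.

```python
import numpy as np

def qlike(realized_var, forecast_var):
    """QLIKE loss, one common parameterization:
    mean( RV/h - log(RV/h) - 1 ), which equals 0 for a perfect forecast."""
    ratio = realized_var / forecast_var
    return np.mean(ratio - np.log(ratio) - 1.0)

def mae(y, yhat):
    return np.mean(np.abs(y - yhat))

def mse(y, yhat):
    return np.mean((y - yhat) ** 2)

# Toy example with synthetic realized variance and two competing forecasts.
rng = np.random.default_rng(0)
rv = rng.lognormal(mean=-9.0, sigma=0.5, size=1000)      # realized variance
h_good = rv * np.exp(rng.normal(0.0, 0.1, size=1000))    # small multiplicative error
h_poor = rv * np.exp(rng.normal(0.3, 0.4, size=1000))    # biased, noisier forecast

for name, h in [("good", h_good), ("poor", h_poor)]:
    print(f"{name}: QLIKE={qlike(rv, h):.4f}  MAE={mae(rv, h):.3e}  MSE={mse(rv, h):.3e}")
```
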
14 pages, 281 KiB  
Article
Optimising Regimen of Co-Amoxiclav (ORCA)—The Safety and Efficacy of Intravenous Co-Amoxiclav at Higher Dosing Frequency in Patients with Diabetic Foot Infection
by Jun Jie Tan, Peijun Yvonne Zhou, Jia Le Lim, Fang Liu and Lay Hoon Andrea Kwa
Antibiotics 2025, 14(8), 758; https://doi.org/10.3390/antibiotics14080758 - 28 Jul 2025
Abstract
Background: With increasing pharmacokinetic evidence suggesting the inadequacy of conventional-dose intravenous co-amoxiclav (IVCA) 1.2 g Q8H in targeting Enterobacterales, our institution's antibiotic guidelines optimised the dosing recommendation for diabetic foot infection (DFI) management to 1.2 g Q6H in August 2023. In this study, we aim to evaluate the efficacy and safety of the optimised IVCA dose in DFI treatment. Methods: In this single-centre cohort study, patients ≥ 21 years with DFI, creatinine clearance ≥ 50 mL/min, and weight > 50 kg, who were prescribed IVCA 1.2 g Q8H (standard group (SG)), were compared with those prescribed IVCA 1.2 g Q6H (optimised group (OG)). Patients who were pregnant, were immunocompromised, had nosocomial exposure in the last 3 months, or received < 72 h of IVCA were excluded. The primary efficacy outcome was clinical deterioration at the end of IVCA monotherapy. The secondary efficacy outcomes included 30-day readmission and mortality, empiric escalation of antibiotics, lower limb amputation, and length of hospitalisation. The safety outcomes included hepatotoxicity, renal toxicity, and diarrhoea. Results: A total of 189 patients were included (94 in SG; 95 in OG). Patients in the SG (31.9%) were about twice as likely to experience clinical deterioration as those in the OG (16.8%) (odds ratio: 2.31, 95% confidence interval: 1.16–4.62, p < 0.05). Thirty-day all-cause mortality was significantly higher in the SG (5.3%) than in the OG (0%) (p < 0.05), and 30-day readmission due to DFI was also higher in the SG (26.6%) than in the OG (11.6%) (p < 0.05). Empiric escalation of IV antibiotics was required for 14.9% of patients in the SG and 6.3% of patients in the OG (p = 0.06). There was no statistically significant difference between the groups in lower limb amputation (p = 0.72), length of hospitalisation (p = 0.13), or the occurrence of safety outcomes. Conclusions: This study suggests that IVCA 1.2 g Q6H is associated with a decreased likelihood of clinical deterioration and is likely as safe as IVCA 1.2 g Q8H. The optimised IVCA dose may help reduce the use of broad-spectrum antibiotics prompted by clinical deterioration. Full article
(This article belongs to the Special Issue Antimicrobial Stewardship—from Projects to Standard of Care)
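
The reported odds ratio and confidence interval can be reproduced from the group sizes and deterioration rates, assuming the underlying counts were 30/94 (SG) and 16/95 (OG) (rounded from the percentages) and a standard Wald interval on the log-odds scale; the authors' exact computation is not stated in the abstract.

```python
import math

# Counts implied by the reported percentages (assumed rounding):
# 31.9% of 94 ~= 30 deteriorations in the standard group (Q8H),
# 16.8% of 95 ~= 16 deteriorations in the optimised group (Q6H).
a, b = 30, 94 - 30      # SG: events, non-events
c, d = 16, 95 - 16      # OG: events, non-events

odds_ratio = (a / b) / (c / d)

# Standard Wald 95% CI on the log odds-ratio scale.
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
log_or = math.log(odds_ratio)
lo, hi = math.exp(log_or - 1.96 * se_log_or), math.exp(log_or + 1.96 * se_log_or)

print(f"OR = {odds_ratio:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
# -> OR = 2.31, 95% CI (1.16, 4.62), matching the reported values.
```
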
27 pages, 856 KiB  
Article
Equivalence Test and Sample Size Determination Based on Odds Ratio in an AB/BA Crossover Study with Binary Outcomes
by Shi-Fang Qiu, Xue-Qin Yu and Wai-Yin Poon
Axioms 2025, 14(8), 582; https://doi.org/10.3390/axioms14080582 - 27 Jul 2025
Abstract
Crossover trials are specifically designed to evaluate treatment effects within individual participants through within-subject comparisons. In a standard AB/BA crossover trial, participants are randomly allocated to one of two treatment sequences: either the AB sequence (where patients receive treatment A first and then cross over to treatment B after a washout period) or the BA sequence (where patients receive B first and then cross over to A after a washout period). Asymptotic and approximate unconditional test procedures, based on two Wald-type statistics, the likelihood ratio statistic, and the score test statistic for the odds ratio (OR), are developed to evaluate the equality of treatment effects in this trial design. Additionally, confidence intervals for OR are constructed, accompanied by an approximate sample size calculation methodology to control the interval width at a pre-specified precision. Empirical analyses demonstrate that asymptotic test procedures exhibit robust performance in moderate to large sample sizes, though they occasionally yield unsatisfactory type I error rates when the sample size is small. In such cases, approximate unconditional test procedures emerge as a rigorous alternative. All proposed confidence intervals achieve satisfactory coverage probabilities, and the approximate sample size estimation method demonstrates high accuracy, as evidenced by empirical coverage probabilities aligning closely with pre-specified confidence levels under estimated sample sizes. To validate practical utility, two real examples are used to illustrate the proposed methodologies. Full article
(This article belongs to the Special Issue Recent Developments in Statistical Research)
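
As a hedged illustration of the sample-size idea, the sketch below finds the smallest per-group n for which a Wald confidence interval for the odds ratio meets a pre-specified width; it uses a simplified two-independent-group approximation and therefore ignores the paired AB/BA structure that the paper's procedure accounts for.

```python
import math
from statistics import NormalDist

def sample_size_for_or_ci(p1, p2, width, conf=0.95):
    """Smallest per-group n so that the Wald CI for the odds ratio
    (computed on the log scale and back-transformed) is no wider than `width`.

    Simplified two-independent-group approximation; the paper's method
    additionally accounts for the within-subject crossover correlation.
    """
    z = NormalDist().inv_cdf(1 - (1 - conf) / 2)
    log_or = math.log((p1 / (1 - p1)) / (p2 / (1 - p2)))
    var_per_subject = 1 / (p1 * (1 - p1)) + 1 / (p2 * (1 - p2))
    n = 2
    while True:
        se = math.sqrt(var_per_subject / n)
        ci_width = math.exp(log_or + z * se) - math.exp(log_or - z * se)
        if ci_width <= width:
            return n
        n += 1

# Anticipated response rates 60% vs 40%, target CI width of 1.0 on the OR scale.
print(sample_size_for_or_ci(p1=0.60, p2=0.40, width=1.0))
```
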
20 pages, 437 KiB  
Article
A Copula-Driven CNN-LSTM Framework for Estimating Heterogeneous Treatment Effects in Multivariate Outcomes
by Jong-Min Kim
Mathematics 2025, 13(15), 2384; https://doi.org/10.3390/math13152384 - 24 Jul 2025
Abstract
Estimating heterogeneous treatment effects (HTEs) across multiple correlated outcomes poses significant challenges due to complex dependency structures and diverse data types. In this study, we propose a novel deep learning framework integrating empirical copula transformations with a CNN-LSTM (Convolutional Neural Networks and Long Short-Term Memory networks) architecture to capture nonlinear dependencies and temporal dynamics in multivariate treatment effect estimation. The empirical copula transformation, a rank-based nonparametric approach, preprocesses input covariates to better represent the underlying joint distributions before modeling. We compare this method with a baseline CNN-LSTM model lacking copula preprocessing and a nonparametric tree-based approach, the Causal Forest, grounded in generalized random forests for HTE estimation. Our framework accommodates continuous, count, and censored survival outcomes simultaneously through a multitask learning setup with customized loss functions, including Cox partial likelihood for survival data. We evaluate model performance under varying treatment perturbation rates via extensive simulation studies, demonstrating that the Empirical Copula CNN-LSTM achieves superior accuracy and robustness in average treatment effect (ATE) and conditional average treatment effect (CATE) estimation. These results highlight the potential of copula-based deep learning models for causal inference in complex multivariate settings, offering valuable insights for personalized treatment strategies. Full article
(This article belongs to the Special Issue Current Developments in Theoretical and Applied Statistics)
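
The empirical copula transformation itself is a simple rank-based preprocessing step; a minimal sketch (illustrative data, not the study's) is:

```python
import numpy as np
from scipy.stats import rankdata

def empirical_copula_transform(X):
    """Rank-based empirical copula (probability integral) transform.

    Each column of X is mapped to rank / (n + 1), so all covariates live on
    (0, 1) and only their dependence structure (the copula) is retained.
    """
    X = np.asarray(X, dtype=float)
    n = X.shape[0]
    return np.column_stack([rankdata(X[:, j]) / (n + 1) for j in range(X.shape[1])])

# Toy example: skewed and heavy-tailed covariates acquire uniform margins,
# while their correlation pattern is preserved for the downstream CNN-LSTM.
rng = np.random.default_rng(1)
z = rng.multivariate_normal([0, 0], [[1, 0.7], [0.7, 1]], size=500)
X = np.column_stack([np.exp(z[:, 0]), z[:, 1] ** 3])    # lognormal + cubed normal
U = empirical_copula_transform(X)
print(U.min(axis=0), U.max(axis=0))                     # inside (0, 1)
print(np.corrcoef(U.T)[0, 1])                           # dependence retained
```
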
18 pages, 1029 KiB  
Article
Processing Fruits and Vegetables as a Way to Prevent Their Waste
by Ksenia Juszczak-Szelągowska, Iwona Kowalczuk, Dawid Olewnicki, Małgorzata Kosicka-Gębska and Dagmara Stangierska-Mazurkiewicz
Sustainability 2025, 17(14), 6610; https://doi.org/10.3390/su17146610 - 19 Jul 2025
Abstract
The aim of the current study was to determine the scale and underlying causes of the waste of raw and processed fruits and vegetables in Polish households. A survey was conducted on a representative sample of 1100 respondents. The collected empirical data were analyzed using statistical tools such as non-parametric tests, multiple regression methods, and logistic regression. This study assessed the level and determinants of waste of raw and processed fruits and vegetables, identified the reasons for this waste and their impact on its extent, and analyzed the effect of waste prevention methods (including processing) on the scale of product losses. This study showed that the scale of waste of processed fruits and vegetables in Polish consumer households is significantly lower than that of raw products. The level of waste for both raw and processed products varies depending on place of residence, education, income, household size, and, in the case of processed fruits and vegetables, also the age of respondents. The main reason for fruit and vegetable losses in households is overlooking the product's expiration date. Logistic regression analysis showed that the most effective strategies for reducing the waste of raw fruits and vegetables include purchasing the right quantities and freezing them. In contrast, practices such as donating food to others or composting were linked to a statistically significant decrease in the likelihood of reducing waste. Full article
(This article belongs to the Special Issue Future Trends in Food Processing and Food Preservation Techniques)
32 pages, 907 KiB  
Article
A New Exponentiated Power Distribution for Modeling Censored Data with Applications to Clinical and Reliability Studies
by Kenechukwu F. Aforka, H. E. Semary, Sidney I. Onyeagu, Harrison O. Etaga, Okechukwu J. Obulezi and A. S. Al-Moisheer
Symmetry 2025, 17(7), 1153; https://doi.org/10.3390/sym17071153 - 18 Jul 2025
Abstract
This paper presents the exponentiated power Shanker (EPS) distribution, a new three-parameter extension of the standard Shanker distribution able to accommodate a wider class of data behaviors, including right-skewed and heavy-tailed phenomena. The structural properties of the distribution, namely complete and incomplete moments, entropy, and the moment generating function, are derived and examined in a formal manner. Maximum likelihood estimation (MLE) is used for parameter estimation, and a Monte Carlo simulation study assesses estimator performance across varying sample sizes and parameter values. The EPS model is also extended to a regression framework to include covariates, with estimation again conducted via MLE. The practical utility and flexibility of the EPS distribution are demonstrated through two real examples: one on the duration of repairs and another on HIV/AIDS mortality in Germany. Comparisons with existing distributions, namely the power Zeghdoudi, power Ishita, power Prakaamy, and logistic-Weibull, are made using goodness-of-fit statistics such as the log-likelihood, AIC, BIC, and the Kolmogorov–Smirnov statistic. Graphical diagnostics, including PP plots, QQ plots, TTT plots, and empirical CDFs, further confirm the high modeling capacity of the EPS distribution. The results confirm the goodness-of-fit and flexibility of the EPS model, making it a valuable tool for reliability and biomedical modeling. Full article
(This article belongs to the Section Mathematics)
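
The abstract does not reproduce the EPS density, so the sketch below only illustrates the fitting-and-comparison workflow it describes (MLE, AIC/BIC, Kolmogorov–Smirnov), using a two-parameter Weibull as a stand-in model on synthetic repair-time data.

```python
import numpy as np
from scipy import stats
from scipy.optimize import minimize

rng = np.random.default_rng(7)
repair_times = rng.weibull(1.3, size=100) * 2.0   # synthetic "repair duration" data

def neg_log_lik(params, x):
    """Negative log-likelihood for a stand-in two-parameter Weibull model.
    The paper fits the EPS density here instead; it is not given in the
    abstract, so a Weibull is used only to illustrate the workflow."""
    shape, scale = params
    if shape <= 0 or scale <= 0:
        return np.inf
    return -np.sum(stats.weibull_min.logpdf(x, c=shape, scale=scale))

res = minimize(neg_log_lik, x0=[1.0, 1.0], args=(repair_times,), method="Nelder-Mead")
shape_hat, scale_hat = res.x
k = len(res.x)
loglik = -res.fun
aic = 2 * k - 2 * loglik
bic = k * np.log(len(repair_times)) - 2 * loglik
ks = stats.kstest(repair_times, stats.weibull_min(c=shape_hat, scale=scale_hat).cdf)

print(f"MLE shape={shape_hat:.3f} scale={scale_hat:.3f}  "
      f"logL={loglik:.2f} AIC={aic:.2f} BIC={bic:.2f} KS p={ks.pvalue:.3f}")
```
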
25 pages, 434 KiB  
Article
The Impact of Digitalization on Carbon Emission Efficiency: An Intrinsic Gaussian Process Regression Approach
by Yongtong Hu, Jiaqi Xu and Tao Liu
Sustainability 2025, 17(14), 6551; https://doi.org/10.3390/su17146551 - 17 Jul 2025
Abstract
This study introduces, for the first time, an intrinsic Gaussian Process Regression (iGPR) model that incorporates non-Euclidean spatial covariates via a Gaussian process prior, and applies it to analyze the relationship between digitalization and carbon emission efficiency. The iGPR model's hierarchical design embeds a Gaussian process as a flexible spatial random effect with a heat-kernel-based covariance function to capture the manifold geometry of spatial features. To enable tractable inference, we employ a penalized maximum-likelihood estimation (PMLE) approach to jointly estimate regression coefficients and covariance hyperparameters. Using a panel dataset linking a national digitalization (modernization) index to carbon emission efficiency, the empirical analysis demonstrates that digitalization has a significantly positive impact on carbon emission efficiency while accounting for spatial heterogeneity. The iGPR model also exhibits superior predictive accuracy compared to state-of-the-art machine learning methods (including XGBoost, random forest, support vector regression, ElasticNet, and a standard Gaussian process regression), achieving the lowest mean squared error (MSE = 0.0047) and an average prediction error near zero. Robustness checks include instrumental-variable GMM estimation to address potential endogeneity across the efficiency distribution and confirm the stability of the estimated positive effect of digitalization. Full article
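
A heat-kernel covariance on a manifold is not available in standard libraries, so the sketch below substitutes scikit-learn's ordinary Gaussian process regression with an anisotropic RBF kernel over (covariate, spatial) inputs, purely to show the shape of the regression-with-spatial-random-effect workflow; the data, kernel, and unpenalized marginal-likelihood fit are all stand-ins for the paper's iGPR and PMLE.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(3)

# Synthetic panel: a digitalization index plus 2-D spatial coordinates.
n = 200
digital = rng.uniform(0, 1, n)
coords = rng.uniform(0, 1, (n, 2))
spatial_effect = np.sin(3 * coords[:, 0]) * np.cos(2 * coords[:, 1])
efficiency = 0.6 * digital + 0.3 * spatial_effect + rng.normal(0, 0.05, n)

X = np.column_stack([digital, coords])

# Stand-in for the iGPR: an RBF kernel over (covariate, spatial) inputs with a
# noise term; hyperparameters are fitted by (unpenalized) marginal likelihood.
gpr = GaussianProcessRegressor(kernel=RBF(length_scale=[0.5, 0.5, 0.5]) + WhiteKernel(),
                               normalize_y=True, random_state=0)
gpr.fit(X[:150], efficiency[:150])
pred = gpr.predict(X[150:])
mse = np.mean((pred - efficiency[150:]) ** 2)
print("hold-out MSE:", round(mse, 5))
```
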
20 pages, 774 KiB  
Article
Robust Variable Selection via Bayesian LASSO-Composite Quantile Regression with Empirical Likelihood: A Hybrid Sampling Approach
by Ruisi Nan, Jingwei Wang, Hanfang Li and Youxi Luo
Mathematics 2025, 13(14), 2287; https://doi.org/10.3390/math13142287 - 16 Jul 2025
Abstract
Since the advent of composite quantile regression (CQR), its inherent robustness has established it as a pivotal methodology for high-dimensional data analysis. High-dimensional outlier contamination refers to data scenarios where the number of observed dimensions (p) is much greater than the sample size (n) and there are extreme outliers in the response variables or covariates (e.g., p/n > 0.1). Traditional penalized regression techniques, however, exhibit notable vulnerability to data outliers during high-dimensional variable selection, often leading to biased parameter estimates and compromised resilience. To address this critical limitation, we propose a novel empirical likelihood (EL)-based variable selection framework that integrates a Bayesian LASSO penalty within the composite quantile regression framework. By constructing a hybrid sampling mechanism that incorporates the Expectation–Maximization (EM) algorithm and Metropolis–Hastings (M-H) algorithm within the Gibbs sampling scheme, this approach effectively tackles variable selection in high-dimensional settings with outlier contamination. This innovative design enables simultaneous optimization of regression coefficients and penalty parameters, circumventing the need for ad hoc selection of optimal penalty parameters—a long-standing challenge in conventional LASSO estimation. Moreover, the proposed method imposes no restrictive assumptions on the distribution of random errors in the model. Through Monte Carlo simulations under outlier interference and empirical analysis of two U.S. house price datasets, we demonstrate that the new approach significantly enhances variable selection accuracy, reduces estimation bias for key regression coefficients, and exhibits robust resistance to data outlier contamination. Full article
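
The composite quantile regression objective at the core of the method is easy to state: a common slope vector shared across several quantile levels, each with its own intercept, scored by the check loss. A minimal sketch follows (the Bayesian LASSO prior, empirical likelihood, and the EM/M-H-within-Gibbs sampler are omitted).

```python
import numpy as np

def check_loss(u, tau):
    """Quantile (check) loss rho_tau(u) = u * (tau - 1{u < 0})."""
    return u * (tau - (u < 0).astype(float))

def cqr_objective(beta, intercepts, X, y, taus):
    """Composite quantile regression objective: a common slope vector beta
    shared across quantile levels, with one intercept per level."""
    total = 0.0
    for tau, b0 in zip(taus, intercepts):
        residual = y - b0 - X @ beta
        total += np.sum(check_loss(residual, tau))
    return total

# Toy check with heavy-tailed noise, where CQR's robustness pays off.
rng = np.random.default_rng(5)
n, p = 200, 3
X = rng.normal(size=(n, p))
beta_true = np.array([1.5, 0.0, -2.0])
y = X @ beta_true + rng.standard_t(df=2, size=n)     # heavy-tailed errors

taus = np.array([0.25, 0.5, 0.75])
print(cqr_objective(beta_true, np.quantile(y - X @ beta_true, taus), X, y, taus))
```
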
31 pages, 807 KiB  
Article
A Three-Parameter Record-Based Transmuted Rayleigh Distribution (Order 3): Theory and Real-Data Applications
by Faton Merovci
Symmetry 2025, 17(7), 1034; https://doi.org/10.3390/sym17071034 - 1 Jul 2025
Abstract
This paper introduces the record-based transmuted Rayleigh distribution of order 3 (rbt-R), a three-parameter extension of the classical Rayleigh model designed to address data characterized by high skewness and heavy tails. While traditional generalizations of the Rayleigh distribution enhance model flexibility, they often lack sufficient adaptability to capture the complexity of empirical distributions encountered in applied statistics. The rbt-R model incorporates two additional shape parameters, a and b, enabling it to represent a wider range of distributional shapes. Parameter estimation for the rbt-R model is performed using the maximum likelihood method. Simulation studies are conducted to evaluate the asymptotic properties of the estimators, including bias and mean squared error. The performance of the rbt-R model is assessed through empirical applications to four datasets: nicotine yields and carbon monoxide emissions from cigarette data, as well as breaking stress measurements from carbon-fiber materials. Model fit is evaluated using standard goodness-of-fit criteria, including AIC, AICc, BIC, and the Kolmogorov–Smirnov statistic. In all cases, the rbt-R model demonstrates a superior fit compared to existing Rayleigh-based models, indicating its effectiveness in modeling highly skewed and heavy-tailed data. Full article
(This article belongs to the Special Issue Symmetric or Asymmetric Distributions and Its Applications)
25 pages, 378 KiB  
Article
Markov Observation Models and Deepfakes
by Michael A. Kouritzin
Mathematics 2025, 13(13), 2128; https://doi.org/10.3390/math13132128 - 29 Jun 2025
Abstract
Herein, expanded Hidden Markov Models (HMMs) are considered as potential deepfake generation and detection tools. The most specific model is the HMM, while the most general is the pairwise Markov chain (PMC). In between, the Markov observation model (MOM) is proposed, where the observations form a Markov chain conditionally on the hidden state. An expectation-maximization (EM) analog to the Baum–Welch algorithm is developed to estimate the transition probabilities as well as the initial hidden-state-observation joint distribution for all the models considered. This new EM algorithm also includes a recursive log-likelihood equation so that model selection can be performed (after parameter convergence). Once the models have been learnt through the EM algorithm, deepfakes are generated through simulation, while they are detected using the log-likelihood. Our three models were compared empirically in terms of their generative and detection abilities. The PMC and the MOM consistently produced the best deepfake generator and detector, respectively. Full article
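
Detection by log-likelihood reduces to scoring a sequence under competing fitted models. The sketch below shows the scaled forward algorithm for a plain discrete HMM (the MOM and PMC variants additionally condition emissions on the previous observation) with toy parameters.

```python
import numpy as np

def hmm_log_likelihood(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the scaled forward algorithm.

    pi: initial state distribution (K,), A: state transitions (K, K),
    B: emission probabilities (K, M), obs: sequence of symbol indices."""
    alpha = pi * B[:, obs[0]]
    log_lik = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        s = alpha.sum()
        log_lik += np.log(s)
        alpha /= s
    return log_lik

# Toy detector: score a sequence under a "real" model and a "fake" model and
# keep whichever log-likelihood is higher (or threshold the difference).
pi = np.array([0.6, 0.4])
A_real = np.array([[0.9, 0.1], [0.2, 0.8]])
A_fake = np.array([[0.5, 0.5], [0.5, 0.5]])
B = np.array([[0.7, 0.3], [0.1, 0.9]])

seq = [0, 0, 1, 0, 0, 1, 1, 0]
print(hmm_log_likelihood(seq, pi, A_real, B), hmm_log_likelihood(seq, pi, A_fake, B))
```
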
24 pages, 3774 KiB  
Article
A Novel Stochastic SVIR Model Capturing Transmission Variability Through Mean-Reverting Processes and Stationary Reproduction Thresholds
by Yassine Sabbar and Saud Fahad Aldosary
Mathematics 2025, 13(13), 2097; https://doi.org/10.3390/math13132097 - 26 Jun 2025
Abstract
This study presents a stochastic SVIR epidemic model in which disease transmission rates fluctuate randomly over time, driven by independent, mean-reverting processes with multiplicative noise. These dynamics capture environmental variability and behavioral changes affecting disease spread. We derive analytical expressions for the conditional moments of the transmission rates and establish the existence of their stationary distributions under broad conditions. By averaging over these distributions, we define a stationary effective reproduction number that enables a probabilistic classification of outbreak scenarios. Specifically, we estimate the likelihood of disease persistence or extinction based on transmission uncertainty. Sensitivity analyses reveal that the shape and intensity of transmission variability play a decisive role in epidemic outcomes. Monte Carlo simulations validate our theoretical findings, showing strong agreement between empirical distributions and theoretical predictions. Our results underscore how randomness in disease transmission can fundamentally alter epidemic trajectories, offering a robust mathematical framework for risk assessment under uncertainty. Full article
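
A hedged sketch of the kind of mean-reverting, multiplicative-noise transmission rate the model uses is below; the drift/diffusion specification, parameter values, and recovery rate are illustrative assumptions, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(11)

# Euler-Maruyama simulation of a mean-reverting transmission rate with
# multiplicative noise:  d(beta) = kappa * (theta - beta) dt + sigma * beta dW.
# (Illustrative specification; the paper's exact SDEs are not reproduced here.)
kappa, theta, sigma = 2.0, 0.30, 0.25
dt, T = 0.01, 200.0
steps = int(T / dt)

beta = np.empty(steps)
beta[0] = theta
for t in range(1, steps):
    dW = rng.normal(0.0, np.sqrt(dt))
    beta[t] = beta[t - 1] + kappa * (theta - beta[t - 1]) * dt + sigma * beta[t - 1] * dW
    beta[t] = max(beta[t], 1e-8)          # keep the rate positive

# Empirical stationary moments of the transmission rate (burn-in discarded):
# the kind of quantity averaged to form a stationary reproduction number.
stationary = beta[steps // 2:]
gamma = 0.2                               # assumed recovery rate
print("mean beta:", stationary.mean(), " R_s ~", stationary.mean() / gamma)
```
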
15 pages, 234 KiB  
Article
Cultural Dimensions of Trade Fairs: A Longitudinal Analysis of Urban Development and Destination Loyalty in Thessaloniki
by Dimitris Kourkouridis and Asimenia Salepaki
Urban Sci. 2025, 9(7), 237; https://doi.org/10.3390/urbansci9070237 - 24 Jun 2025
Abstract
Trade fairs are not only commercial platforms but also catalysts for urban development, city branding, and international engagement. This longitudinal study analyzes data from trade fair exhibitors from China, the United Arab Emirates (UAE), and Germany to examine how cultural differences influence their experiences, satisfaction, and destination loyalty within the urban landscape of Thessaloniki. By adopting Social Exchange Theory (S.E.T.) as a framework, this research applies a mixed-methods approach, combining surveys and in-depth interviews conducted over multiple years (2017–2024) at the 82nd, 86th, and 88th Thessaloniki International Fair (T.I.F.). The empirical material consists of 226 survey responses (116 from China, 44 from the UAE, and 84 from Germany) and 52 semi-structured interviews, analyzed using descriptive and non-parametric statistics, alongside thematic interpretation of qualitative data. Findings reveal distinct exhibitor expectations across the three cultural groups, and these cultural distinctions shape their perceptions of Thessaloniki’s infrastructure, services, and overall urban experience, influencing their likelihood to revisit or recommend the city. This study underscores the long-term role of trade fairs in shaping urban economies and offers insights into how cities can leverage international exhibitions for sustainable urban growth. Policy recommendations highlight the need for tailored infrastructural improvements, strategic city branding initiatives, and cultural adaptations to enhance exhibitor engagement and maximize the economic impact of global events. Full article
17 pages, 3073 KiB  
Article
Forecast of Aging of PEMFCs Based on CEEMD-VMD and Triple Echo State Network
by Jie Sun, Shiyuan Pan, Qi Yang, Yiming Wang, Lei Qin, Wang Han, Ruixiang Wang, Lei Gong, Dongdong Zhao and Zhiguang Hua
Sensors 2025, 25(13), 3868; https://doi.org/10.3390/s25133868 - 21 Jun 2025
Abstract
Accurately forecasting the degradation trajectory of proton exchange membrane fuel cells (PEMFCs) across a spectrum of operational scenarios is indispensable for effective maintenance scheduling and robust health surveillance. However, this task is highly intricate due to the fluctuating nature of dynamic operating conditions and the limitations inherent in short-term forecasting techniques, which collectively pose significant challenges to achieving reliable predictions. To enhance the accuracy of PEMFC degradation forecasting, this research proposes an integrated approach that combines the complete ensemble empirical mode decomposition with the variational mode decomposition (CEEMD-VMD) and triple echo state network (TriESN) to predict the deterioration process precisely. Decomposition can filter out high-frequency noise and retain low-frequency degradation information effectively. Among data-driven methods, the echo state network (ESN) is capable of estimating the degradation performance of PEMFCs. To tackle the problem of low prediction accuracy, this study proposes a novel TriESN that builds upon the classical ESN. The proposed enhancement method seeks to refine the ESN architecture by reducing the impact of surrounding neurons and sub-reservoirs on active neurons, thus realizing partial decoupling of the ESN. On this basis of decoupling, the method takes into account the multi-timescale aging characteristics of PEMFCs to achieve precise prediction of remaining useful life. Overall, combining CEEMD-VMD with the TriESN strengthens feature depiction, fosters sparsity, diminishes the likelihood of overfitting, and augments the network’s capacity for generalization. It has been shown that the TriESN markedly improved the accuracy of long-term PEMFC degradation predictions in three different dynamic contexts. Full article
(This article belongs to the Section Fault Diagnosis & Sensors)
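
As background for the TriESN, the sketch below implements a minimal single-reservoir echo state network with a leaky-integrator update and a ridge-regression readout on a toy degradation-like signal; the TriESN's three partially decoupled, multi-timescale reservoirs and the CEEMD-VMD preprocessing are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Minimal single-reservoir echo state network: random input and reservoir
# weights, leaky-integrator state update, ridge-regression readout.
n_in, n_res, leak, rho = 1, 200, 0.3, 0.9
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in + 1))
W = rng.normal(0, 1, (n_res, n_res))
W *= rho / max(abs(np.linalg.eigvals(W)))          # scale spectral radius

def run_reservoir(u_seq):
    x = np.zeros(n_res)
    states = []
    for u in u_seq:
        pre = W_in @ np.concatenate(([1.0], np.atleast_1d(u))) + W @ x
        x = (1 - leak) * x + leak * np.tanh(pre)
        states.append(x.copy())
    return np.array(states)

# Toy degradation-like signal: slow drift plus noise; train a one-step-ahead readout.
t = np.arange(2000)
signal = 1.0 - 0.0003 * t + 0.02 * np.sin(0.05 * t) + 0.005 * rng.normal(size=t.size)
X = run_reservoir(signal[:-1])
y = signal[1:]
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)
pred = X @ W_out
print("train RMSE:", np.sqrt(np.mean((pred - y) ** 2)))
```
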