Search Results (486)

Search Parameters:
Keywords = tail probabilities

23 pages, 1511 KB  
Article
Estimator Statistics from Simulation-Free Dirichlet Block-Bootstrap Resampling
by Tillmann Rosenow
Stats 2026, 9(2), 32; https://doi.org/10.3390/stats9020032 (registering DOI) - 20 Mar 2026
Abstract
Since the introduction of two variants of the bootstrap method by Efron and Rubin in the late 1970s, a variety of advancements have emerged in the literature. The subsampling of blocks enabled the estimation of the actual variance of the sample mean. The equivalence of data-level and estimator-level resampling is easily established for the sample mean and similar estimators. For Rubin’s variant of the bootstrap, we apply an algorithm by Diniz et al. that allows for the numerically stable computation of the sample-based cumulative distribution function of the estimator under investigation. No actual Monte Carlo resampling is necessary in this setting, and we demonstrate how to access the very small tail probabilities and, moreover, confidence intervals. We illustrate this with a well-known test model that exhibits geometrically decaying spatial correlations. The analysis naturally applies to temporally correlated systems and to the correlations occurring in Markov chains as well. Full article
(This article belongs to the Section Time Series Analysis)
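Rubin's variant is the Bayesian bootstrap, which reweights the observations with flat-Dirichlet draws instead of resampling them. The paper's contribution is to evaluate the resulting estimator distribution without Monte Carlo; the sketch below (illustrative data and parameters) shows the plain Monte Carlo version of the object being targeted:

```python
import random

def bayesian_bootstrap_means(data, n_draws=10000, seed=0):
    """Rubin-style (Bayesian) bootstrap: reweight the observations with
    Dirichlet(1, ..., 1) weights and record the weighted mean of each
    draw.  The paper computes the estimator's CDF without this Monte
    Carlo loop; the loop just illustrates what is being computed."""
    rng = random.Random(seed)
    n = len(data)
    means = []
    for _ in range(n_draws):
        # Dirichlet(1, ..., 1) via normalized Exponential(1) variates
        g = [rng.expovariate(1.0) for _ in range(n)]
        total = sum(g)
        means.append(sum(gi / total * xi for gi, xi in zip(g, data)))
    return means

sample = [1.2, 0.7, 2.4, 1.9, 0.3, 1.1]
draws = bayesian_bootstrap_means(sample)
# A (noisy) tail probability of the resampled mean, e.g. P(mean > 2.0)
p_tail = sum(m > 2.0 for m in draws) / len(draws)
```

For very small tail probabilities this estimator becomes unusable, which is exactly the motivation for the simulation-free approach.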

24 pages, 611 KB  
Article
Discrete Asymmetric Double Lindley Distribution on Z: Theory, Likelihood Inference, and Applications
by Hugo S. Salinas, Hassan S. Bakouch, Sudeep R. Bapat, Amira F. Daghestani and Anhar S. Aloufi
Symmetry 2026, 18(3), 533; https://doi.org/10.3390/sym18030533 - 20 Mar 2026
Abstract
We introduce the discrete asymmetric double Lindley distribution, a new two-parameter family on the integer line designed to model signed counts and net changes with flexible asymmetric tail behavior. This statistical model is obtained by merging two Lindley-type linear-geometric kernels on the negative and non-negative half-lines, with tail decay rates that are coupled through a simple two-parameter mechanism. This construction yields an analytically tractable probability mass function with an explicit normalizing constant, as well as closed-form expressions for the cumulative distribution function and one-sided tail probabilities. We further provide a transparent stochastic representation based solely on Bernoulli and geometric random variables, leading to an exact and efficient simulation algorithm that is convenient for Monte Carlo studies and validating numerical likelihood routines. Graphical illustrations highlight the role of the asymmetry parameter in controlling the imbalance between the two tails and the resulting skewness on Z. The proposed family offers a practical and interpretable alternative to existing integer-line models for asymmetric discrete data, with direct applicability to likelihood-based inference and real-world datasets. Full article
(This article belongs to the Section Mathematics)
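The abstract's stochastic representation (a Bernoulli side choice plus geometric magnitudes with side-specific decay rates) can be illustrated with a generic two-sided geometric sampler. The kernels and parameters below are simplified stand-ins, not the authors' exact Lindley-type construction:

```python
import random

def two_sided_geometric(p_pos, q_pos, q_neg, rng):
    """Signed-integer draw in the style the abstract describes: a
    Bernoulli(p_pos) side choice, then a geometric magnitude whose
    decay rate (q_pos vs. q_neg) depends on the side.  These kernels
    are illustrative stand-ins, not the authors' Lindley-type ones."""
    if rng.random() < p_pos:
        k = 0                      # non-negative side: P(k) = (1-q)*q**k
        while rng.random() < q_pos:
            k += 1
        return k
    k = 0                          # negative side, shifted to -1, -2, ...
    while rng.random() < q_neg:
        k += 1
    return -(k + 1)

rng = random.Random(42)
draws = [two_sided_geometric(0.6, 0.5, 0.3, rng) for _ in range(20000)]
# Right tail: P(X >= 3) = p_pos * q_pos**3 = 0.6 * 0.125 = 0.075
p_right_tail = sum(d >= 3 for d in draws) / len(draws)
```

Decoupling the two decay rates is what produces the asymmetric tails; in the paper the two rates are coupled through the two-parameter mechanism.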

12 pages, 368 KB  
Article
On the Integro-Differential Equation Arising in the Ruin Problem for Non-Life Insurance Models with Investment
by Viktor Antipov and Yuri Kabanov
Mathematics 2026, 14(6), 1035; https://doi.org/10.3390/math14061035 - 19 Mar 2026
Abstract
In the classical non-life insurance models, the capital reserve of an insurance company increases at a constant rate and decreases by downward jumps. We consider a generalization of this model by supposing that a fixed portion of the capital reserve is continuously invested in a risky asset whose price follows a geometric Brownian motion, while the complementary part is placed in a bank account with a constant rate of return. The quantity of interest is the ruin probability on the infinite time horizon as a function of the initial capital. In the present note, we assume only the continuity of the distribution of claims together with a standard moment restriction called “light tails.” Our main contribution is that we reveal, under such “minimalistic” hypotheses, that the ruin probability is smooth and satisfies a second-order integro-differential equation in the classical sense. We obtain the exact asymptotics for large values of the initial capital with “computable” constants and present results of numerical experiments. In contrast with other methods used in the theory, we rely upon only standard mathematics, allowing implementation in lecture courses for master’s students. Full article
(This article belongs to the Section E5: Financial Mathematics)
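As a baseline for the model discussed, the ruin probability of the classical (investment-free) Cramér–Lundberg model is easy to estimate by simulation; all parameters here are illustrative:

```python
import random

def ruin_probability(u0, premium_rate, claim_rate, mean_claim,
                     horizon=200.0, n_paths=4000, seed=1):
    """Monte Carlo estimate of the finite-horizon ruin probability in
    the classical Cramer-Lundberg model (no investment): capital grows
    at premium_rate between claims and drops by exponentially
    distributed jumps at Poisson(claim_rate) arrival times."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(n_paths):
        t, x = 0.0, u0
        while t < horizon:
            dt = rng.expovariate(claim_rate)        # time to next claim
            t += dt
            if t >= horizon:
                break
            x += premium_rate * dt                  # premiums collected
            x -= rng.expovariate(1.0 / mean_claim)  # claim payment
            if x < 0:
                ruined += 1
                break
    return ruined / n_paths

# Positive safety loading (1.2 > 1.0 * 1.0), initial capital u0 = 10
psi = ruin_probability(10.0, 1.2, 1.0, 1.0)
```

The paper's object is the infinite-horizon analogue of this quantity as a function of the initial capital, characterized by an integro-differential equation rather than by simulation.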

28 pages, 2545 KB  
Article
Modeling Rank Distribution and the Relative Importance Factor Index in Discrete Power-Law Models: Application to Social Resilience Using the Scopus Database
by Brian Llinas, Jose Padilla, Humberto Llinas, Erika Frydenlund and Katherine Palacio
Mathematics 2026, 14(6), 966; https://doi.org/10.3390/math14060966 - 12 Mar 2026
Abstract
Prior research on power-law distributions has primarily focused on modeling frequency patterns, with less attention given to rank distributions and how ranked positions reflect relative importance among elements. In discrete power-law distributions, frequency-based metrics often provide limited discrimination in the tail, where elements may exhibit similar counts but differ in relative dominance. These patterns are especially evident, for instance, in academic publishing, where keywords, affiliations, and citations commonly exhibit power-law behavior. To address this limitation, we introduce the Relative Importance Factor (RIF) Index, a statistical measure derived from the estimated discrete power-law rank distribution rather than an additional independent parameter. The RIF Index compares the probability of an element at a given rank with its probabilities at lower ranks, enabling explicit pairwise statistical comparison, particularly within the tail. We formalize the mathematical framework for discrete rank modeling and apply RIF to synthetic data and a Scopus dataset on social resilience. Our results show that RIF clarifies dominance relationships among ranked elements, providing stronger discrimination in the tail than frequency-based measures alone. We further introduce the RIF matrix and RIF network to represent these pairwise relationships structurally, supporting interpretation of prominence patterns. Although demonstrated in academic publishing, the method generalizes to domains where categorical variables follow discrete power-law behavior under appropriate model-fit validation. Full article
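The abstract does not give the RIF formula, but for a rank distribution of the assumed Zipf form p(r) ∝ r^(-α), pairwise probability ratios between ranks depend only on the fitted exponent, since the normalizing constant cancels; a minimal sketch of that kind of comparison:

```python
def rank_prob_ratio(r, s, alpha):
    """For a rank law p(r) proportional to r**(-alpha), the pairwise
    ratio p(r) / p(s) = (s / r)**alpha: the normalizing constant
    cancels, so dominance comparisons need only the fitted exponent."""
    return (s / r) ** alpha

# How much more probable is rank 1 than rank 10 under alpha = 1.1?
ratio = rank_prob_ratio(1, 10, 1.1)
```

Raw counts in the tail may coincide even when such ratios discriminate clearly, which is the gap the RIF Index is designed to fill.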

13 pages, 3283 KB  
Article
Comprehensive Comparison of Front- and Back-Illuminated Single-Photon Avalanche Diodes in 110 nm Standard CMOS Image Sensor Technology
by Doyoon Eom, Won-Yong Ha, Eunsung Park, Jung-Hoon Chun, Jaehyuk Choi, Woo-Young Choi and Myung-Jae Lee
Sensors 2026, 26(5), 1664; https://doi.org/10.3390/s26051664 - 6 Mar 2026
Abstract
This paper presents a process-controlled study of illumination engineering in single-photon avalanche diodes (SPADs) fabricated in a 110 nm standard CMOS image sensor (CIS) technology. Front-illuminated (FI) and back-illuminated (BI) SPADs were implemented with identical front-end-of-line (FEOL) structures, including the junction and guard-ring configurations, enabling the isolation of the effects of illumination direction and back-end-of-line (BEOL) configuration without modifying the junction structure. Through TCAD simulations and comprehensive experimental characterizations, including current–voltage, light-emission, dark count rate (DCR), photon detection probability (PDP), and timing-jitter measurements, we systematically analyze the performance trade-offs introduced by the BI configuration. The BI SPAD exhibits enhanced near-infrared PDP and a broader spectral response due to its deeper absorption region and the incorporation of a metal reflector, while maintaining identical avalanche characteristics, as evidenced by an unchanged 72 ps full-width-at-half-maximum (FWHM) timing jitter. However, the backside illumination increases the diffusion tail, indicating a trade-off between near-infrared sensitivity and diffusion-related timing performance. These results provide design guidelines for optimizing SPAD performance through illumination-direction and BEOL engineering while preserving the FEOL design and demonstrate a useful approach for SPAD integration in standard CMOS technology. Full article
(This article belongs to the Special Issue Advances in Single Photon Detectors)

30 pages, 2528 KB  
Article
A Two-Dimensional Cloud Model for Early Warning of Tailings Dam Failure Risk Considering Probability and Consequence Coupling
by Zhengjun Ji, Guocai Yan, Yaoyao Meng, Menglong Wu and Lizhen Zhao
Appl. Sci. 2026, 16(5), 2324; https://doi.org/10.3390/app16052324 - 27 Feb 2026
Abstract
The accurate assessment of tailings dam operational status and timely risk warnings are critical for ensuring their safe operation. To address the limitations of existing models in managing complex environments and multidimensional risk factors, this study proposes an early warning model for tailings dam operational status based on a two-dimensional cloud model. First, a comprehensive early warning system is developed to assess the probability and consequences of dam failure, using risk probability and consequences as two-dimensional coordinates, incorporating the randomness and fuzziness of uncertainty described by cloud theory, and transforming qualitative data into quantitative conclusions. Next, a genetic algorithm optimizes the projection pursuit model to determine weights, and weighted numerical features are utilized to enhance the classification of early warning levels. Furthermore, the two-dimensional cloud model is enhanced by introducing a proximity coefficient to replace the membership function, with the resulting cloud map visualized using a forward cloud generator. Finally, the early warning level of the tailings dam’s operational status is determined based on the clustering of cloud droplets and the proximity coefficient. Empirical application to five tailings dams in Hubei Province confirms the model’s effectiveness and practicality. The results demonstrate that the model effectively addresses the complexity and uncertainty of tailings dam operational status, delivers accurate warnings, and provides robust decision support for emergency response. Full article
(This article belongs to the Section Energy Science and Technology)

20 pages, 545 KB  
Article
Environmental Risks of Talc Mining
by Henrieta Pavolová, Mária Kaňuchová, Tomáš Bakalár, Ľubica Kozáková and Edyta Nartowska
Appl. Sci. 2026, 16(5), 2317; https://doi.org/10.3390/app16052317 - 27 Feb 2026
Abstract
This study examines the environmental risks associated with talc mining in Slovakia, focusing on various aspects. It applies a structured risk assessment methodology to evaluate the probability and severity of environmental impacts stemming from talc extraction, flotation, and tailings pond operations. Key stressors include chemical pollutants such as oils, diesel, and flotation reagents, as well as physical disruptions like georelief alteration and vegetation loss. The findings highlight high environmental risks from technical infrastructure leaks and tailings pond operations, particularly regarding groundwater contamination and landscape modification. Moderate risks were identified in diesel and oil substance leakage, while flotation processes posed minimal risk. The research underscores the need for improved risk mitigation strategies, such as enhanced monitoring and containment systems, to protect local ecosystems and water resources. The study contributes to a better understanding of the long-term environmental impacts of mineral resource exploitation and provides a foundation for more sustainable mining practices. Full article
(This article belongs to the Special Issue Environmental Pollution and Wastewater Treatment Strategies)

31 pages, 1361 KB  
Article
Risk Modeling and Robust Resource Allocation in Complex Aviation Networks: A Wasserstein Distributionally Robust Optimization Approach
by Jingxiao Wen, Yiming Chen, Wenbing Chang, Jiankai Wang and Shenghan Zhou
Appl. Sci. 2026, 16(4), 1959; https://doi.org/10.3390/app16041959 - 16 Feb 2026
Abstract
Aircraft routing networks are complex systems vulnerable to cascading delays triggered by weather disruptions and airspace constraints. This paper proposes a Distributionally Robust Aircraft Routing (DRAR) model for systemic risk assessment. Conventional robust or stochastic optimization methods often rely on specific assumptions about delay distributions (e.g., fixed probability distributions or scenario sets). However, due to the suddenness and multi-source nature of flight delays, their true distribution is difficult to accurately characterize, limiting the effectiveness of these methods in real-world uncertain conditions. By constructing a Wasserstein-metric ambiguity set, the proposed model captures distributional uncertainty without assuming fixed probabilities, thereby handling delay risks more robustly. The study incorporated chance constraints to bound extreme delay probabilities and reformulated the model as a tractable mixed-integer program. Experiments on real airline data demonstrate that DRAR outperforms traditional benchmarks, reducing propagation delays by 4–6%, volatility by 7–9%, and extreme delay risks by up to 15.7%. Thus, the model provides a practical tool for aviation decision-makers: airlines can leverage it to optimize aircraft scheduling and routing, systematically mitigate delay propagation risk, control the probability of extreme delays, and consequently reduce indirect operational costs arising from crew overtime and airport scheduling conflicts, thereby enhancing overall resource efficiency and operational resilience. These results validate DRAR as an effective tool for controlling tail risks and ensuring sustainable operations in uncertain aviation environments. Full article
(This article belongs to the Special Issue Risk Models, Analysis, and Assessment of Complex Systems)

28 pages, 2384 KB  
Article
Bayesian Estimation of Spatial Lagged Panel Quantile Regression Model
by Man Zhao, Rushan Huang, Hanfang Li, Youxi Luo and Qiming Liu
Appl. Sci. 2026, 16(4), 1927; https://doi.org/10.3390/app16041927 - 14 Feb 2026
Abstract
This paper proposes a Bayesian estimation method for spatial lagged panel quantile models. The proposed model simultaneously considers spatial lag effects of the dependent variable and the quantile regression framework, enabling effective capture of spatial dependence and conditional distribution heterogeneity. The research constructs a Bayesian estimation framework based on the asymmetric Laplace distribution by decomposing the random disturbance term into a combination of normal and exponential distributions, successfully developing a probabilistic model with both thick tail robustness and computational efficiency. On this basis, the study derives the full conditional posterior probability distributions of model parameters and designs a hybrid Markov Chain Monte Carlo (MCMC) sampling algorithm integrating Gibbs sampling and Metropolis–Hastings algorithm for parameter estimation. Numerical simulation experiments demonstrate that, compared with traditional estimation methods, the proposed Bayesian estimation approach exhibits superior estimation accuracy and robustness across different quantiles, with particularly pronounced advantages in small sample and heavy-tailed distribution scenarios. This methodology provides a more reliable theoretical tool for analyzing panel data with spatial dependencies. This method can not only accurately quantify the spatial spillover effect, but also identify the different effects of the same influencing factor at different emission levels, which provides a strong methodological support for formulating differentiated and precise emission reduction policies. Full article
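The normal–exponential decomposition of the asymmetric Laplace error is standard in the Bayesian quantile regression literature (the Kozumi–Kobayashi representation); a minimal sampler checks its defining property, that the τ-quantile of the error sits at zero:

```python
import math
import random

def ald_draw(tau, rng):
    """One draw from the standard asymmetric Laplace error via the
    normal-exponential mixture used in Bayesian quantile regression:
    u = a*z + b*sqrt(z)*xi, with z ~ Exp(1), xi ~ N(0, 1),
    a = (1 - 2*tau) / (tau*(1 - tau)), b = sqrt(2 / (tau*(1 - tau)))."""
    a = (1 - 2 * tau) / (tau * (1 - tau))
    b = math.sqrt(2.0 / (tau * (1 - tau)))
    z = rng.expovariate(1.0)
    return a * z + b * math.sqrt(z) * rng.gauss(0.0, 1.0)

rng = random.Random(7)
tau = 0.25
draws = [ald_draw(tau, rng) for _ in range(40000)]
# Defining property: the tau-quantile of the error sits at zero
frac_below_zero = sum(d <= 0 for d in draws) / len(draws)
```

Conditioning on z turns the check-loss likelihood into a conditionally normal one, which is what makes the Gibbs steps of the hybrid MCMC sampler tractable.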

42 pages, 10041 KB  
Article
Probabilistic Prediction of Concrete Compressive Strength Using Copula Functions: A Novel Framework for Uncertainty Quantification
by Cheng Zhang, Senhao Cheng, Shanshan Tao, Shuai Du and Zhengjun Wang
Buildings 2026, 16(4), 754; https://doi.org/10.3390/buildings16040754 - 12 Feb 2026
Abstract
Traditional machine learning models for concrete compressive strength prediction provide only single-value estimates without quantifying the probability of meeting design requirements, leaving engineers unable to make risk-informed decisions. This study addresses this critical limitation by developing a novel probabilistic prediction framework that integrates explainable machine learning with Copula-based joint distribution modeling. Using a dataset of 1030 concrete samples with curing ages ranging from 1 to 365 days, we first established an XGBoost 2.1.4 prediction model achieving R2 = 0.9211 (RMSE = 4.51 MPa) on the test set. SHAP 0.49.1 (SHapley Additive exPlanations) analysis identified curing age (33.3%) and water–cement ratio (28.8%) as the dominant features, together accounting for 62.1% of predictive importance. These two controllable engineering parameters were then selected as core variables for probabilistic modeling. The key innovation lies in integrating Copula-based dependence modeling with explainable machine learning (XGBoost–SHAP) to quantify the compliance probability of concrete strength under specific mix designs and curing conditions, thereby supporting risk-informed quality control decisions. Through systematic comparison of five Copula families (Gaussian, Student t, Clayton, Gumbel, and Frank), we identified optimal dependence structures: Gaussian Copula (ρ = −0.54) for the water–cement ratio–strength relationship and Clayton Copula for the age–strength relationship, revealing asymmetric tail dependence patterns invisible to conventional correlation analysis. The three-dimensional Copula model enables engineers to estimate compliance probability—the likelihood of concrete achieving target strength under specific mix designs and curing conditions. 
We propose an illustrative three-tier decision rule for construction quality management based on the compliance probability P: P ≥ 0.95 (high-confidence approval), 0.80 ≤ P < 0.95 (warning zone requiring enhanced monitoring), and P < 0.80 (high risk suggesting corrective actions such as mix adjustment or extended curing), noting that these thresholds can be recalibrated to project-specific risk tolerance and local specifications. This framework supports a paradigm shift from reactive “mix-then-test” quality control to proactive “predict-then-decide” construction management, providing quantitative risk assessment tools previously unavailable in deterministic prediction approaches. Full article
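On the copula scale, a Gaussian copula with the reported ρ = −0.54 gives a closed-form conditional exceedance probability. The sketch below works purely on marginal quantiles; the actual strength and w/c marginals, and the three-dimensional model, are abstracted away:

```python
from statistics import NormalDist

def compliance_probability(u_wc, v_target, rho=-0.54):
    """P(strength exceeds its v_target marginal quantile, given the
    water-cement ratio sits at its u_wc marginal quantile) under a
    Gaussian copula with correlation rho.  rho = -0.54 is the value
    the abstract reports for the w/c-strength pair; the marginals
    themselves are abstracted away by working on the quantile scale."""
    nd = NormalDist()
    x = nd.inv_cdf(u_wc)      # w/c ratio mapped to the normal scale
    y = nd.inv_cdf(v_target)  # target strength quantile, normal scale
    # Given X = x, the strength score is N(rho*x, 1 - rho**2):
    return 1.0 - nd.cdf((y - rho * x) / (1.0 - rho * rho) ** 0.5)

# A high w/c ratio (90th percentile) depresses the chance of clearing
# the median strength; a low one (10th percentile) raises it
p_high_wc = compliance_probability(0.9, 0.5)
p_low_wc = compliance_probability(0.1, 0.5)
```

A value of P from such a calculation is what the three-tier decision rule (P ≥ 0.95, 0.80 ≤ P < 0.95, P < 0.80) would act on.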

11 pages, 3464 KB  
Article
Pre-Programming Thermal Sensors Improves Detection During Drone-Based Nocturnal Wildlife Surveys in Warm Weather
by Lori Massey, Aaron M. Foley, Jeremy Baumgardt, Randy W. DeYoung and Humberto L. Perotto-Baldivieso
Drones 2026, 10(2), 127; https://doi.org/10.3390/drones10020127 - 11 Feb 2026
Abstract
Improvements in thermal infrared imaging provide new opportunities for drone-based wildlife surveys. The use of thermal sensors can be limited by ambient temperatures and vegetation cover, which can limit opportunities to survey during optimal biological seasons. Pre-programming isotherm settings in thermal cameras has the potential to allow surveys during warmer environmental conditions. We evaluated night-time surveys of white-tailed deer (Odocoileus virginianus) using isotherm settings in a 102 ha enclosed property in South Texas during February (winter) and July (summer) 2022. Detection probabilities were 0.84 and 0.65 during winter and summer, respectively. Percent woody cover was 48.1% and 60.7% during these seasons, respectively. The seasonal pattern in detection probabilities met expectations in terms of visibility bias caused by canopy cover. Despite different detection probabilities among seasons, population estimates were similar because distance sampling accounted for visibility bias. The use of isotherm settings allowed us to survey during temperatures previously thought to be too warm for ideal contrast (~21 °C vs. 30 °C), which provides more opportunities to survey during biologically important seasons typically associated with warm temperatures (i.e., fawning and antlerogenesis). We recommend the use of distance sampling methods to evaluate and correct for visibility bias during thermal-based drone surveys because detections of focal species may vary with vegetation. Full article
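The visibility-bias correction behind the similar seasonal population estimates is the canonical N̂ = n/p adjustment; here with the detection probabilities reported in the abstract and hypothetical raw counts:

```python
def corrected_abundance(n_detected, detection_prob):
    """Canonical visibility-bias correction underlying distance
    sampling: if each animal is detected with probability p, the raw
    count n underestimates abundance by that factor, so N-hat = n / p."""
    return n_detected / detection_prob

# Detection probabilities from the abstract (winter 0.84, summer 0.65)
# with hypothetical raw counts from the same underlying population:
winter_estimate = corrected_abundance(84, 0.84)
summer_estimate = corrected_abundance(65, 0.65)
```

Lower summer detection yields a lower raw count, but the corrected estimates coincide, which is the mechanism the abstract describes.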

20 pages, 3878 KB  
Article
Emergency Medical Logistics of Helicopter Air Ambulance Response-Time Reliability: A Monte Carlo Simulation
by James Cline and Dothang Truong
Logistics 2026, 10(2), 44; https://doi.org/10.3390/logistics10020044 - 11 Feb 2026
Abstract
Background: Rapid helicopter air ambulance (HAA) response is a cornerstone of emergency medical logistics, yet the “time-to-care” metric remains highly sensitive to uncertainties in base posture, readiness, and operational disruptions. This study evaluates how these factors jointly influence response-time reliability and identifies strategies for improving service performance. Methods: A Monte Carlo simulation was developed to model the end-to-end HAA mission chain, including dispatch, wheels-up delay, en-route flight, and patient handoff, while accounting for uncertainty from weather, airspace congestion, and flight dynamics. Scenario experiments incorporated training improvements and alternative response protocols (Ground vs. Airborne Standby). Results: Simulation results indicate that operational factors reduced mean and tail response times, with Airborne Standby reducing the probability of exceeding a 45 min threshold by over 90% in urban night scenarios. Performance gains were most prominent in rural service areas and night operations, where disruption risks were highest. Conclusions: The findings offer evidence-based guidance for EMS logistics planners by clarifying how standby policies and readiness enhancements mitigate logistical risks. Full article
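The exceedance probability such a simulation reports, e.g. P(response time > 45 min), can be sketched with a toy mission chain; every stage distribution and parameter below is an assumption for illustration, not the study's calibrated model:

```python
import random

def exceedance_probability(threshold_min=45.0, n_runs=10000, seed=3):
    """Tail probability P(total response time > threshold) for a toy
    HAA mission chain (dispatch, wheels-up delay, en-route flight,
    patient handoff); all stage distributions are illustrative
    assumptions, not the study's calibrated inputs."""
    rng = random.Random(seed)
    exceed = 0
    for _ in range(n_runs):
        dispatch = rng.lognormvariate(1.0, 0.4)   # call handling, min
        wheels_up = rng.lognormvariate(1.8, 0.5)  # wheels-up delay, min
        enroute = rng.lognormvariate(2.9, 0.3)    # flight time, min
        handoff = rng.expovariate(1 / 4.0)        # patient handoff, min
        if dispatch + wheels_up + enroute + handoff > threshold_min:
            exceed += 1
    return exceed / n_runs

p_45 = exceedance_probability()        # P(T > 45 min)
p_60 = exceedance_probability(60.0)    # looser threshold, thinner tail
```

Scenario comparisons (e.g. Ground vs. Airborne Standby) amount to re-running such a chain with shifted or removed delay stages and comparing the resulting tail probabilities.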

44 pages, 940 KB  
Article
A Two-Level Relative-Entropy Theory for Isotropic Turbulence Spectra: Fokker–Planck Semigroup Irreversibility and WKB Selection of Dissipation Tails
by Shin-ichi Inage
Mathematics 2026, 14(4), 620; https://doi.org/10.3390/math14040620 - 10 Feb 2026
Abstract
We propose a two-level theory that connects Lin-equation-based dynamical coarse-graining of the turbulence cascade with an information-theoretic selection principle in logarithmic wavenumber space. This framework places the dissipation-range spectral shape on a verifiable logical basis rather than on ad hoc fitting. At the first (dynamical) level, we formulate an autonomous conservative Fokker–Planck equation for the normalized density and probability current. Under sufficient boundary decay and a strictly positive effective diffusion, the sign-reversed Kullback–Leibler divergence is shown to be a Lyapunov functional, yielding a rigorous H-theorem and fixing the arrow of time in scale space. At the second (selection) level, the dissipation range is treated as a stationary boundary-value problem for an open system by introducing a killing term for an unnormalized scale density. A WKB (Liouville–Green) analysis restricts the admissible tail to a stretched-exponential form and links the tail exponent to the high-wavenumber scaling of the effective diffusion. The exponential prefactor is fixed by dissipation-rate consistency, and the remaining degree of freedom is determined by one-dimensional Kullback–Leibler minimization (Hyper-MaxEnt) against a globally constructed reference distribution. The resulting exponent range is validated against the high-resolution DNS spectra reported in the literature. Full article
(This article belongs to the Special Issue Mathematical Fluid Dynamics: Theory, Analysis and Emerging Trends)

27 pages, 6812 KB  
Article
Probability Distribution and Extreme Characteristics of Tree Wind-Induced Responses Under Various Approaching Flow Turbulences
by Yanfeng Hao, Bin Huang, Xijie Liu, Zichun Zhou and Yueyue Pan
Forests 2026, 17(2), 217; https://doi.org/10.3390/f17020217 - 5 Feb 2026
Abstract
Trees play a critical role in urban ecological protection and wind disaster mitigation, yet conventional Gaussian-based wind engineering models often underestimate extreme tree motions under turbulent flows. This study aims to clarify the statistical characteristics of tree wind-induced responses and develop a quantitative framework to distinguish Gaussian and non-Gaussian behaviors. Scaled aeroelastic tree models were tested in a boundary-layer wind tunnel under controlled turbulence intensity (0.05–0.19), mean wind speeds of 3.9–9.3 m/s, and leaf area index (LAI) of 0–2.46. Acceleration and displacement time histories of branches, crown center, and trunk were recorded. A Gaussian discrimination criterion was established using cumulative probability thresholds of skewness and kurtosis, supplemented by time-history and probability density verification. Results reveal that branch accelerations exhibit strong non-Gaussianity with heavy-tailed and asymmetric distributions, crown displacements show moderate non-Gaussianity, while trunk responses remain near-Gaussian due to higher stiffness. Under weak turbulence, Gamma and Lognormal distributions fit best; under strong turbulence, the Generalized Extreme Value (GEV) distribution prevails. A high-quantile GEV-based framework markedly reduces extreme response prediction bias compared with Gaussian assumptions. These findings provide a probabilistic basis for more accurate assessment of tree wind stability and the design of wind-resistant urban vegetation and shelterbelts. Full article
(This article belongs to the Section Natural Hazards and Risk Management)
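High quantiles of the GEV distribution, the basis of the proposed framework, follow from the standard inverse CDF; illustrative parameters show how a positive shape parameter inflates extreme-response predictions relative to the Gumbel case:

```python
import math

def gev_quantile(p, mu, sigma, xi):
    """Inverse CDF of the Generalized Extreme Value distribution:
    x_p = mu + (sigma / xi) * ((-ln p)**(-xi) - 1)  for xi != 0,
    x_p = mu - sigma * ln(-ln p)                    for the Gumbel case."""
    if xi == 0:
        return mu - sigma * math.log(-math.log(p))
    return mu + (sigma / xi) * ((-math.log(p)) ** (-xi) - 1.0)

# With a heavy right tail (xi = 0.2), the 99.9th percentile sits far
# above the Gumbel (xi = 0) prediction for the same mu and sigma
q_heavy = gev_quantile(0.999, 0.0, 1.0, 0.2)
q_gumbel = gev_quantile(0.999, 0.0, 1.0, 0.0)
```

This gap between heavy-tailed and Gaussian-style predictions at high quantiles is exactly the bias the abstract attributes to conventional Gaussian-based models.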

22 pages, 2055 KB  
Article
Time-Dependent Route Optimization for Multimodal Hazardous Materials Transport Using Conditional Value-at-Risk Under Uncertainty
by Song Liu, Jingjing Li, Yazhi Lin, Dennis Z. Yu, Yong Peng, Yi Liu and Xianting Ma
Symmetry 2026, 18(2), 292; https://doi.org/10.3390/sym18020292 - 5 Feb 2026
Abstract
Transporting hazardous materials has low accident probabilities but potentially catastrophic consequences, making effective risk management essential in uncertain conditions such as population distribution, weather, traffic, and multimodal scheduling constraints. This study develops a Conditional Value-at-Risk (CVaR)-based optimization model for multimodal hazardous materials transportation that incorporates transportation and transshipment risks, population exposure uncertainty, fixed departure schedules for rail and waterway transport, dual time-window constraints, and limits on the number of transshipments. The model also reflects the decision-maker’s risk aversion and time-varying travel times. To solve this NP-hard problem, an improved chaotic simulated annealing-ant colony optimization (CSAACO) algorithm is proposed. Numerical experiments show that CSAACO outperforms the standard ACO in terms of solution quality and stability. The results demonstrate that the model effectively captures tail risk in dynamic environments and that both the risk aversion coefficient μ and departure time significantly influence route selection. The proposed approach provides an efficient and practical decision-support tool for hazardous materials multimodal transportation planning under uncertainty. Full article
(This article belongs to the Special Issue The Fusion of Fuzzy Sets and Optimization Using Symmetry)
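The empirical CVaR used as the model's risk measure is the mean of the worst (1 − α) share of outcomes beyond the VaR cut-off; a minimal sketch with toy loss values:

```python
def cvar(losses, alpha=0.95):
    """Empirical Conditional Value-at-Risk: the mean of the worst
    (1 - alpha) fraction of the losses, i.e. the average loss beyond
    the alpha-quantile (VaR) cut-off."""
    ordered = sorted(losses)
    k = int(alpha * len(ordered))  # index of the VaR cut-off
    tail = ordered[k:]             # the worst (1 - alpha) share
    return sum(tail) / len(tail)

losses = [1, 2, 2, 3, 3, 3, 4, 5, 8, 20]  # toy route-risk outcomes
risk = cvar(losses, alpha=0.8)            # mean of the worst 20%: (8+20)/2
```

Unlike VaR, CVaR accounts for the magnitude of losses beyond the threshold, which is why it suits the low-probability, high-consequence profile of hazardous materials transport.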
