Search Results (940)

Search Parameters:
Keywords = Monte Carlo randomization

26 pages, 10818 KB  
Article
Public Health Safety Governance and System Resilience in Petrochemical Plants Based on STAMP/STPA and Complex Networks: A Case Study from China
by Zhiqian Hu, Jie Hou, Yunsheng Su, Yuqing Wang, Wei Dai and Jie Yang
Sustainability 2026, 18(8), 3754; https://doi.org/10.3390/su18083754 - 10 Apr 2026
Abstract
As a highly integrated and increasingly complex high-risk process industry, the petrochemical sector plays a critical role in industrial continuity and social stability, yet faces significant governance adaptability challenges under normalized public health emergencies. Taking a Chinese petrochemical enterprise as a case study, this paper develops an integrated framework combining STAMP/STPA, complex network analysis, and robustness analysis. Based on a reconstructed four-level hierarchical control and feedback structure, STPA was applied to identify 20 unsafe control actions (UCAs). These UCAs and their precursor factors were further abstracted into a relational network of control deficiencies for topological analysis and Monte Carlo-based robustness testing under random failure and targeted attack. The results show pronounced small-world and core–periphery structural characteristics, with vulnerability concentrated in a limited number of high-centrality source and hub nodes. Systemic resilience constraints mainly arise from governmental deficiencies in response experience and training, enterprise-level amplification at hub nodes, and pressure accumulation at frontline execution nodes. Accordingly, three resilience protocols are proposed: distributed authorization for source nodes; digitized dual-channel feedback for hub nodes; and minimum operational redundancy with cross-replacement for terminal nodes. This study provides theoretical basis and strategies for high-risk industrial systems to enhance resilience and sustainable development in uncertain environments. Full article
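The paper's Monte Carlo robustness testing under random failure versus targeted attack can be sketched in miniature. Everything below is illustrative, not taken from the study: a hypothetical star network stands in for the control-deficiency network, and largest-connected-component size after node removal serves as the robustness metric.

```python
import random

def largest_component(adj, removed):
    """Size of the largest connected component after deleting `removed` nodes."""
    seen, best = set(), 0
    for start in adj:
        if start in removed or start in seen:
            continue
        stack, size = [start], 0
        seen.add(start)
        while stack:
            node = stack.pop()
            size += 1
            for nb in adj[node]:
                if nb not in removed and nb not in seen:
                    seen.add(nb)
                    stack.append(nb)
        best = max(best, size)
    return best

def after_removal(adj, order, fraction):
    """Largest component left after removing the first `fraction` of `order`."""
    k = max(1, int(len(adj) * fraction))
    return largest_component(adj, set(order[:k]))

# Toy star network: node 0 is the single high-centrality hub,
# a stand-in for the hub nodes identified in the abstract.
adj = {0: list(range(1, 11)), **{i: [0] for i in range(1, 11)}}

# Targeted attack: remove nodes in decreasing degree order.
by_degree = sorted(adj, key=lambda n: len(adj[n]), reverse=True)
targeted = after_removal(adj, by_degree, 0.1)

# Random failure: average over shuffled removal orders.
rng = random.Random(0)
nodes, sizes = list(adj), []
for _ in range(200):
    rng.shuffle(nodes)
    sizes.append(after_removal(adj, nodes, 0.1))
random_failure = sum(sizes) / len(sizes)
```

On this toy topology, removing one node at random usually leaves the network intact, while removing the highest-degree node shatters it — the core–periphery vulnerability the abstract describes.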

27 pages, 4581 KB  
Article
Assessing Climate Efficiency with Random Forest, DEA, and SHAP in the Eastern Black Sea Region, Türkiye
by Mehmet Ali Çelik, Yakup Kızılelma, Melahat Batu Ağırkaya, İsmet Güney, Dündar Dagli and Volkan Duran
Atmosphere 2026, 17(4), 381; https://doi.org/10.3390/atmos17040381 - 9 Apr 2026
Abstract
The study uses Land Surface Temperature (LST) and Air Temperature data and the nonparametric Data Envelopment Analysis (DEA) technique to evaluate heat efficiency and detect anomalies in the thermal regime in the Eastern Black Sea Region, particularly in Hopa and Artvin, during the period 2000–2024. The regulating role of the Black Sea has resulted in Hopa having the warmest and most stable temperature patterns, with daytime temperatures 1.8 to 3.7 °C higher than Artvin. Previous DEA analysis of daytime temperatures has shown that the 2018–2020 period had the highest daily temperatures, while the 2001–2010 decade was characterized by the highest nighttime temperatures. A future heat map based on Monte Carlo simulation using six climate change scenarios indicates that in the most optimistic case, assuming a temperature increase of +0.8 °C, efficiency scores could rise to as high as 0.995. On the other hand, if global warming leads to a sudden temperature increase above +7.2 °C, a 21.7% loss of climate efficiency is expected. Sensitivity analysis showed that technological innovation and good governance are the main positive factors affecting climate efficiency. Random Forest (RF) and SHapley Additive Explanations (SHAP) analyses were applied to determine the impact of climate factors on DEA scores and also indicated areas requiring risk assessment. The findings highlight the importance of considering location-specific climate adaptation strategies. Based on the observed thermal contrasts between coastal and inland environments, potential adaptation considerations may include urban heat management and agricultural water stress in coastal areas such as Hopa, and cold-climate resilience and energy-efficient infrastructure in inland locations such as Artvin. Full article
(This article belongs to the Special Issue Machine Learning for Hydrological Prediction and Water Management)

26 pages, 2531 KB  
Article
Underwater Acoustic Source DOA Estimation for Non-Uniform Circular Arrays Based on EMD and PWLS Correction
by Chuang Han, Boyuan Zheng and Tao Shen
Symmetry 2026, 18(4), 627; https://doi.org/10.3390/sym18040627 - 9 Apr 2026
Abstract
Uniform circular arrays (UCAs) are widely used in underwater source localization due to their omnidirectional coverage. However, random sensor position errors caused by installation inaccuracies and environmental disturbances convert UCAs into non-uniform circular arrays (NCAs), severely degrading the performance of high-resolution direction of arrival (DOA) estimation algorithms. To address this issue, this paper proposes a robust DOA estimation method that integrates empirical mode decomposition (EMD) denoising with prior-weighted iterative least squares (PWLS) correction. The method first applies EMD to adaptively denoise received signals by selecting intrinsic mode functions based on a combined energy-correlation criterion. An initial DOA estimate is then obtained using the MUSIC algorithm. Finally, a PWLS correction algorithm leverages prior knowledge of deviated sensors to iteratively fit the circle center and gradually pull sensor positions toward the ideal circumference, using a differentiated relaxation mechanism to suppress outliers while preserving geometric features. Systematic Monte Carlo simulations compare five correction algorithms under multi-frequency and wideband signals. The results show that both multi-frequency and wideband signals reduce estimation errors to below 0.1°, with the proposed PWLS achieving the best accuracy under multi-frequency signals, while all algorithms approach zero error under wideband signals. The PWLS algorithm converges in about 10 iterations with high computational efficiency, providing a reliable solution for practical underwater NCA applications. Full article
(This article belongs to the Section Engineering and Materials)
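The core geometric step — fitting a circle centre to perturbed sensor positions and pulling the sensors back onto the fitted circumference — can be sketched as follows. This is a simplified, unweighted gradient-descent fit, not the authors' prior-weighted iterative least squares; the array size, noise level, and learning rate are assumptions for illustration.

```python
import math, random

def fit_circle(points, iters=500, lr=0.1):
    """Gradient-descent fit of a circle centre and radius to noisy points
    (an unweighted stand-in for the paper's PWLS correction)."""
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    for _ in range(iters):
        ds = [math.hypot(x - cx, y - cy) for x, y in points]
        r = sum(ds) / len(ds)              # radius minimising sum((d - r)^2)
        gx = gy = 0.0
        for (x, y), d in zip(points, ds):
            if d > 0:                       # gradient of (d - r)^2 w.r.t. centre
                gx += 2 * (d - r) * (cx - x) / d
                gy += 2 * (d - r) * (cy - y) / d
        cx -= lr * gx / len(points)
        cy -= lr * gy / len(points)
    return cx, cy, r

def project(points, cx, cy, r):
    """Pull each sensor back onto the fitted circumference."""
    out = []
    for x, y in points:
        d = math.hypot(x - cx, y - cy)
        out.append((cx + (x - cx) * r / d, cy + (y - cy) * r / d))
    return out

random.seed(0)
ideal = [(math.cos(2 * math.pi * k / 8), math.sin(2 * math.pi * k / 8))
         for k in range(8)]                 # 8-element unit-radius UCA
noisy = [(x + random.gauss(0, 0.05), y + random.gauss(0, 0.05)) for x, y in ideal]
cx, cy, r = fit_circle(noisy)
corrected = project(noisy, cx, cy, r)
```

The PWLS method in the paper additionally weights sensors by prior knowledge of which ones deviated, so outliers are relaxed rather than projected uniformly.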

24 pages, 4284 KB  
Article
Spatial Distribution, Source Apportionment and Risk Assessment of Heavy Metal Pollution in Typical Redevelopment Sites in Pudong New District, Shanghai
by Cheng Shen, Jian Wu and Ye Li
Toxics 2026, 14(4), 315; https://doi.org/10.3390/toxics14040315 - 8 Apr 2026
Abstract
To investigate the characteristics and health risks of heavy metal (HM) contamination in soils of typical industrial sites during urban renewal, this study selected Pudong New District, Shanghai, as a case. Seven HMs (Cd, Pb, Cu, Zn, Ni, Hg, and As) were analyzed for their concentrations, ecological risks, spatial patterns, and potential sources. Inverse Distance Weighted (IDW) interpolation was used to assess spatial distribution, Random Forest (RF) regression to predict HM concentrations, and a two-dimensional Monte Carlo simulation to evaluate human health risks. The results showed that all HMs except As exceeded Shanghai background values in surface soils, with varying levels observed in subsoil and saturated layers. The Index of Geoaccumulation (Igeo) and Risk Index (RI) indicated low contamination and moderate ecological risk. Pearson correlation combined with Positive Matrix Factorization (PMF) identified four major sources: traffic emissions dominated by Cd and Zn, combustion-related sources dominated by Pb and Hg, industry-related inputs dominated by Cu and Ni, and a natural source dominated by As. The RF model demonstrated strong predictive accuracy for Cd, Pb, Hg, and As (R2 = 0.80–0.94), and predicted values were consistent with observations. Monte Carlo results showed that non-carcinogenic risks for children and adults were within acceptable limits, while carcinogenic risks reached “notable” levels with probabilities of 62.06%, 55.65%, and 22.49% for children, adult females, and adult males, respectively. Cd and As were identified as key contributors. This work provides scientific support for soil pollution prevention and remediation during urban renewal. Full article
(This article belongs to the Special Issue Fate and Transport of Heavy Metals in Polluted Soils)
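The two-dimensional Monte Carlo structure used for the health-risk assessment separates two kinds of randomness: an outer loop over parameter *uncertainty* and an inner loop over inter-individual *variability*. The sketch below shows that nesting; every distribution and number is illustrative, not from the study.

```python
import random, statistics

def two_d_monte_carlo(outer=200, inner=500, seed=1):
    """2-D Monte Carlo: outer loop propagates parameter uncertainty,
    inner loop propagates variability across exposed individuals.
    Distributions and magnitudes are hypothetical placeholders."""
    rng = random.Random(seed)
    exceed_probs = []
    for _ in range(outer):
        # uncertain quantities: mean soil concentration and reference dose
        conc_mean = rng.lognormvariate(3.0, 0.3)   # mg/kg (illustrative)
        rfd = rng.uniform(0.8, 1.2)                # mg/kg-day (illustrative)
        count = 0
        for _ in range(inner):
            # variable quantities: soil intake rate and body weight
            intake = rng.lognormvariate(-7.0, 0.5)  # kg soil/day
            bw = max(rng.gauss(60, 10), 20)         # kg
            dose = conc_mean * intake / bw
            if dose / rfd > 1:                      # hazard quotient above 1
                count += 1
        exceed_probs.append(count / inner)
    return statistics.mean(exceed_probs), max(exceed_probs)

mean_p, worst_p = two_d_monte_carlo()
```

Reporting the distribution of exceedance probabilities across the outer loop, rather than a single pooled probability, is what distinguishes 2-D Monte Carlo from an ordinary one-dimensional simulation.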

19 pages, 6970 KB  
Article
Reliability Research of Natural Gas Pipeline Units Based on Mechanistic Modeling
by Huirong Huang, Chen Wu, Jie Zhong, Huishu Liu, Qian Huang, Xueyuan Long, Yuan Tian, Weichao Yu, Shangfei Song and Jing Gong
Processes 2026, 14(7), 1183; https://doi.org/10.3390/pr14071183 - 7 Apr 2026
Abstract
Due to long-term burial underground, oil and gas pipelines are susceptible to external surface corrosion influenced by time and soil conditions, which can lead to leakage and burst failures. Pipeline failure not only results in significant economic losses but also has catastrophic impacts on human safety and the environment. Therefore, modeling and analyzing the corrosion failure of these pipelines is of critical practical importance to ensure their safe operation during service. Addressing the insufficient research on correlation effects in current reliability evaluations of corroded pipelines, this paper proposes a calculation method for the failure probability of corroded oil and gas pipelines that considers the influence of two-layer correlations. Taking a specific segment of the Shaanxi–Beijing pipeline as a case study, this paper employs the Monte Carlo sampling algorithm to calculate the impact of two-layer correlations and the number of defects on the pipeline's failure probability. Furthermore, a sensitivity analysis of the correlation coefficients is conducted. The results indicate that the influence of defect correlation on pipeline failure probability is significantly more pronounced than that of random variable correlation. The probabilities of pinhole leakage and burst failure decrease as the correlation coefficient between defects increases, while they increase with the number of defects. Random variable correlation exhibits no impact on pinhole leakage probability; however, the burst failure probability decreases with an increasing correlation coefficient between wall thickness and pipe diameter, but increases as the correlation between initial defect length and depth grows. 
Furthermore, the correlation coefficient between axial and radial defect growth rates exerts a bidirectional effect on burst failure probability: during the first 25 years of the prediction period, the failure probability increases with the correlation coefficient, whereas it subsequently decreases after approximately 25 years. These findings are applicable to the reliability evaluation of oil and gas pipelines containing multiple corrosion defects, providing valuable technical references for ensuring safe operation and the steady supply of energy resources. Full article
(This article belongs to the Section Petroleum and Low-Carbon Energy Process Engineering)
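The key mechanic — Monte Carlo sampling of *correlated* random variables and its effect on failure probability — can be shown with two correlated lognormal inputs generated from a shared standard normal. The limit state and all parameters below are hypothetical, not the paper's burst model; note the sign of the effect depends entirely on the chosen limit state.

```python
import math, random

def burst_probability(rho, n=50_000, seed=7):
    """Monte Carlo failure probability with correlated wall thickness t
    and defect depth d (illustrative limit state: fail when d/t > 0.8)."""
    rng = random.Random(seed)
    fails = 0
    for _ in range(n):
        z1 = rng.gauss(0, 1)
        # induce correlation rho between the two underlying normals
        z2 = rho * z1 + math.sqrt(1 - rho * rho) * rng.gauss(0, 1)
        t = 10 * math.exp(0.05 * z1)    # wall thickness, mm (hypothetical)
        d = 6 * math.exp(0.20 * z2)     # defect depth, mm (hypothetical)
        if d / t > 0.8:                 # failure: defect consumes 80% of wall
            fails += 1
    return fails / n

p_indep = burst_probability(0.0)
p_corr = burst_probability(0.9)
```

Here positive correlation makes thickness and depth move together, so the dangerous combination of a thin wall with a deep defect becomes rarer and the failure probability drops — the kind of correlation sensitivity the paper quantifies.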

29 pages, 1107 KB  
Article
Secure Uplink Transmission in UAV-Assisted Dual-Orbit SAGIN over Mixed RF-FSO Links
by Zhan Xu and Chunshuai Ma
Aerospace 2026, 13(4), 341; https://doi.org/10.3390/aerospace13040341 - 4 Apr 2026
Abstract
To meet the need for global coverage, space–air–ground integrated networks (SAGINs) are crucial, but the openness of wireless links makes communications vulnerable to eavesdropping. This paper investigates the physical layer security (PLS) of uplink transmissions in a cooperative dual-hop SAGIN. The system comprises a ground source with a directional antenna, an unmanned aerial vehicle (UAV) relay cluster, and a low Earth orbit (LEO) satellite. Utilizing stochastic geometry, we model the spatial randomness of terrestrial eavesdroppers and the multi-layered dual-orbital LEO destination. To combat mixed radio-frequency (RF) and free-space optical (FSO) fading, multiple relay selection and maximum ratio combining (MRC) are integrated into the UAV cluster. We analytically derive the piecewise probability density function for the FSO link distance, obtaining exact closed-form expressions for the end-to-end secrecy outage probability (SOP). Monte Carlo simulations strictly validate the derivations. The results demonstrate that while increasing available relays and antennas enhances PLS via spatial diversity, a security bottleneck restricts the RF-FSO architecture under high-transmit power regimes, generating asymptotic secrecy floors. These findings provide explicit theoretical guidelines for the secure design and parameter optimization of future SAGINs. Full article
(This article belongs to the Section Astronautics & Space Science)
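The Monte Carlo validation of secrecy outage probability (SOP) can be sketched for a single Rayleigh-faded hop — a toy stand-in for the paper's dual-hop RF-FSO chain, with illustrative SNRs and secrecy rate. Rayleigh fading makes the channel power gain exponentially distributed, which is all the simulation needs.

```python
import math, random

def secrecy_outage(snr_main_db, snr_eve_db, rate=0.5, n=100_000, seed=3):
    """Monte Carlo SOP over Rayleigh fading: outage when the secrecy
    capacity (main-link rate minus eavesdropper rate) falls below `rate`."""
    rng = random.Random(seed)
    g_m = 10 ** (snr_main_db / 10)      # average main-link SNR (linear)
    g_e = 10 ** (snr_eve_db / 10)       # average eavesdropper SNR (linear)
    outages = 0
    for _ in range(n):
        snr_m = g_m * rng.expovariate(1.0)   # Rayleigh => exponential power
        snr_e = g_e * rng.expovariate(1.0)
        cs = max(0.0, math.log2(1 + snr_m) - math.log2(1 + snr_e))
        if cs < rate:
            outages += 1
    return outages / n

sop_low = secrecy_outage(10, 0)     # 10 dB advantage over the eavesdropper
sop_high = secrecy_outage(30, 0)    # more transmit power helps...
sop_floor = secrecy_outage(40, 10)  # ...but not if the eavesdropper gains too
```

The last two cases illustrate the asymptotic secrecy floor the abstract reports: once both links scale with transmit power, the SOP stops improving.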

40 pages, 6580 KB  
Article
Self-Organized Criticality and Multifractal Characteristics of Power-System Blackouts: A Long-Term Empirical Study of China’s Power System
by Qun Yu, Zhiyi Zhou, Jiongcheng Yan, Weimin Sun and Yuqing Qu
Fractal Fract. 2026, 10(4), 239; https://doi.org/10.3390/fractalfract10040239 - 3 Apr 2026
Abstract
Power system blackouts represent typical manifestations of instability in complex systems, whose evolution often exhibits non-stationarity, long-range correlations, and nonlinear scaling behavior. Most reliability assessment methods widely used in engineering practice are built on the core assumptions of event independence and light-tailed distribution, which will inevitably lead to systematic underestimation of extreme tail risks when blackouts actually present long-range memory and power-law heavy-tailed characteristics. Based on long-cycle historical blackout records of China’s power grid spanning 1981–2025, this paper develops an integrated framework combining Self-Organized Criticality (SOC) theory, Hurst exponent analysis, symbolic time-series methods, and Multifractal Detrended Fluctuation Analysis (MFDFA). This study systematically characterizes the evolution law and inherent dependence structure of blackout events from four dimensions: statistical scaling, temporal correlation, nonlinear structure, and multi-scale fractal spectrum. The results show that both the load-loss magnitudes and inter-event intervals of blackouts follow strict power-law distributions, with the system exhibiting scaling behavior consistent with SOC theory. The blackout event sequence presents significant long-range positive correlation and self-similarity, confirming a persistent long-term memory effect in the system evolution. Symbolic analysis further reveals the nonlinear fluctuation patterns and burst clustering behavior of the blackout process, reflecting the intermittency and complexity of blackout risks. MFDFA results verify that the blackout sequence has a broad-spectrum multifractal structure across different temporal scales, and Monte Carlo shuffle tests demonstrate that this multifractality mainly arises from intrinsic long-range temporal correlations, rather than being driven solely by heavy-tailed distribution. 
This study confirms that blackouts in China’s power grid are not random independent events, but present fractal statistical characteristics consistent with the self-organized critical mechanism. The findings provide a novel fractal perspective and quantitative framework for the statistical characterization, operational security assessment, and multi-scale early-warning modeling of blackout risks in China’s large-scale power systems. Full article
(This article belongs to the Special Issue Multifractal Analysis and Complex Systems)
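The Monte Carlo shuffle test used to separate correlation-driven structure from heavy-tail effects works because shuffling preserves the marginal distribution (heavy tails included) but destroys temporal ordering. The sketch below uses lag-1 autocorrelation as a deliberately simplified stand-in for the paper's multifractal spectrum width, on a synthetic AR(1) series.

```python
import random, statistics

def lag1_autocorr(x):
    """Lag-1 autocorrelation of a sequence."""
    mu = statistics.fmean(x)
    var = sum((v - mu) ** 2 for v in x)
    return sum((x[i] - mu) * (x[i + 1] - mu) for i in range(len(x) - 1)) / var

def shuffle_test(series, n_surrogates=200, seed=5):
    """Monte Carlo shuffle test: p-value = fraction of shuffled surrogates
    whose statistic is at least as extreme as the observed one."""
    rng = random.Random(seed)
    observed = lag1_autocorr(series)
    s, exceed = list(series), 0
    for _ in range(n_surrogates):
        rng.shuffle(s)                       # destroys temporal correlations
        if abs(lag1_autocorr(s)) >= abs(observed):
            exceed += 1
    return observed, exceed / n_surrogates

rng = random.Random(11)
x = [0.0]
for _ in range(999):                         # persistent AR(1) process
    x.append(0.8 * x[-1] + rng.gauss(0, 1))
obs, p = shuffle_test(x)
```

A small p-value says the observed statistic cannot be reproduced by the marginal distribution alone, so it must come from temporal correlations — the same logic the paper applies to multifractality.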

31 pages, 8837 KB  
Article
Design and Pricing of Weather Index Insurance for Alpine Grasslands Under Climate Extremes: A Case Study in the Source Region of the Yellow River
by Zhenying Zhou, Xinyu Wang, Jinxi Su and Huilong Lin
Agriculture 2026, 16(7), 798; https://doi.org/10.3390/agriculture16070798 - 3 Apr 2026
Abstract
The alpine grassland ecosystem in the Source Region of the Yellow River (SRYR) faces the dual pressures of ecological protection and economic development. Its ecological fragility and climate sensitivity make local animal husbandry susceptible to meteorological disasters. To overcome adverse selection and moral hazard in traditional animal husbandry insurance, this study integrates 963 field sampling observations, over 400 valid herdsman survey responses, and long-term environmental time series. A random forest model (R2 = 0.59, RMSE = 65.84 g/m2, superior to the artificial neural network in this paper) was used to estimate grass yield. Hodrick–Prescott (HP) filtering was used to separate meteorological yield per unit area and derive the yield loss rate. A joint distribution model of meteorological indicators and loss rate was constructed using a Copula function to capture tail-dependent structures, providing a basis for determining trigger thresholds and actuarial pricing of pure insurance premiums. The study reveals the transmission mechanism of climate disasters to feeding costs and designs regional drought and snow disaster index insurance. Compensation is triggered when meteorological indicators fall below the trigger threshold and the yield reduction rate exceeds 5%. Using 10,000 Monte Carlo simulations, the drought premium rates for zones I–IV are determined to be 2.03–6.03% and the snow premium rates to be 2.25–5.42%, corresponding to premiums of RMB 5.21–9.61 per mu for drought and RMB 5.78–8.64 per mu for snow. This design reduces basis risk through zoning and composite triggering, providing a scientific tool for climate risk management in alpine grasslands. Full article
(This article belongs to the Section Ecosystem, Environment and Climate Change in Agriculture)
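The Monte Carlo pricing step reduces to estimating the expected payout under the composite trigger (index below threshold *and* loss rate above 5%). The sketch below is a burn-rate style illustration with a hypothetical rainfall index and payout schedule — it does not reproduce the paper's Copula-based joint distribution.

```python
import random

def pure_premium(trigger, sum_insured, n=100_000, seed=9):
    """Monte Carlo pure premium = expected payout per policy.
    Index distribution, trigger and payout schedule are illustrative."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        rain = rng.lognormvariate(4.0, 0.5)      # seasonal rainfall index, mm
        loss_rate = rng.betavariate(2, 8)        # simulated yield loss rate
        if rain < trigger and loss_rate > 0.05:  # composite trigger fires
            # payout scales with the rainfall deficit, capped at sum insured
            total += sum_insured * min(1.0, (trigger - rain) / trigger)
    return total / n

premium = pure_premium(trigger=40.0, sum_insured=100.0)
rate = premium / 100.0      # premium rate per RMB 100 of cover
```

In the paper, the joint behaviour of index and loss rate comes from a fitted Copula rather than independent draws, which is what lets the pricing respect tail dependence.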

31 pages, 12121 KB  
Article
Momentum-Accelerated Phase Synchronization for UAV Swarm Collaborative Beamforming
by Fei Xie, Longqing Li, Chan Liu, Zhiping Huang, Yongjie Zhao and Junyu Wei
Drones 2026, 10(4), 254; https://doi.org/10.3390/drones10040254 - 2 Apr 2026
Abstract
Distributed beamforming in UAV swarms requires fast and accurate carrier-phase alignment under sparse connectivity and propagation-induced phase bias. This paper proposes a physics-aware decentralized synchronization framework for quasi-static UAV swarm beamforming by integrating momentum-accelerated Metropolis–Hastings consensus with position-aided phase pre-compensation. To preserve phase evolution on the circular manifold, a sinusoidal coupling law is adopted, while the momentum term improves convergence in sparse random geometric graphs. A propagation model is further established to characterize how geometric separation and ranging uncertainty translate into residual phase error and coherent power loss. Under small-signal conditions, local stability is analyzed, and Monte Carlo simulations are conducted to evaluate convergence, synchronization accuracy, robustness, and beam-focusing performance. Results show that, at 2.4 GHz with low-centimeter ranging uncertainty, the proposed method achieves sub-wavelength synchronization accuracy while providing an effective balance among convergence speed, accuracy, and complexity. Compared with standard Metropolis–Hastings, fixed-weight, and other accelerated consensus methods, the proposed scheme converges faster over most sparse topologies. Although its steady-state accuracy is slightly lower than that of filter-based predictive methods such as KF-DFPC in some cases, those schemes incur higher implementation and computational overhead. Therefore, from the perspectives of decentralized realization and practical deployment, the proposed method is more suitable for lightweight phase synchronization in distributed UAV swarms. Full article
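The core update — Metropolis–Hastings consensus weights, sinusoidal coupling to stay on the circular manifold, and a heavy-ball momentum term — can be sketched on a small ring topology. This is a minimal decentralized-consensus sketch under assumed parameters (8 nodes, coupling gain folded into the MH weights, momentum 0.6), without the paper's position-aided pre-compensation.

```python
import math, random

def mh_weight(di, dj):
    """Metropolis-Hastings consensus weight for a pair of neighbours."""
    return 1.0 / (1.0 + max(di, dj))

def momentum_sync(adj, theta0, beta=0.6, steps=200):
    """Momentum-accelerated phase consensus with sinusoidal coupling."""
    theta, vel = list(theta0), [0.0] * len(theta0)
    deg = {i: len(adj[i]) for i in adj}
    for _ in range(steps):
        # sin() keeps the update well-defined on the circular manifold
        drive = [sum(mh_weight(deg[i], deg[j]) * math.sin(theta[j] - theta[i])
                     for j in adj[i])
                 for i in adj]
        for i in adj:
            vel[i] = beta * vel[i] + drive[i]   # heavy-ball momentum term
            theta[i] += vel[i]
    return theta

rng = random.Random(2)
ring = {i: [(i - 1) % 8, (i + 1) % 8] for i in range(8)}  # sparse topology
theta = momentum_sync(ring, [rng.uniform(-0.5, 0.5) for _ in range(8)])
spread = max(theta) - min(theta)   # residual phase disagreement
```

On sparse graphs like this ring, the momentum term is what speeds up the slow consensus modes — the acceleration effect the abstract reports over standard Metropolis–Hastings.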

16 pages, 1425 KB  
Article
On the Classification–Causal Tradeoff in Neural Network Propensity Score Estimation
by Seungman Kim, Jaehoon Lee and Kwanghee Jung
Stats 2026, 9(2), 37; https://doi.org/10.3390/stats9020037 - 31 Mar 2026
Abstract
Observational studies serve as a vital alternative to randomized experiments but are highly susceptible to selection bias. Propensity score (PS) methods address this by balancing covariates between groups. Although including all relevant covariates is theoretically ideal, high dimensionality often destabilizes traditional estimation models. This study evaluates the efficacy of deep neural networks (DNN) and convolutional neural networks (CNN) for PS estimation compared to traditional logistic regression (LR), leveraging their capacity to handle complex nonlinear relationships and interactions. Using a Monte Carlo simulation across 36 conditions, model performance was evaluated based on bias and imbalance reduction. Results indicate that DNNs and CNNs significantly outperform LR. Specifically, while LR increased outcome bias by 17% and reduced covariate imbalance by only 5%, DNNs and CNNs reduced outcome bias by 13% and 16%, respectively, while decreasing covariate imbalance by 18% and 21%. We conclude that despite requiring specialized computational resources, neural networks offer substantial advantages for high-dimensional PS estimation. However, their reliable application necessitates stability-aware training and proper error rate thresholds to prevent probability degeneracy. Full article

20 pages, 8455 KB  
Article
Reliability Analysis of Landslide Dam Slope Against Seepage Failure Considering Spatial Variability of Material Composition
by Zhe Zhang, Hengwei Zhang, Ning He, Qiming Zhong and Yi Luo
Water 2026, 18(7), 832; https://doi.org/10.3390/w18070832 - 31 Mar 2026
Abstract
Landslide dams, as a special type of earth dams, are characterized by complex geomorphological features and geotechnical properties. The failure of landslide dams induced by seepage should not be overlooked. This study introduces a calculation method for analyzing the slope stability of landslide dams with three different material compositions under seepage conditions. Furthermore, the influence of spatial heterogeneity in particle size on the stability of landslide dam slopes subjected to unsaturated seepage is investigated using the random finite element method combined with Monte Carlo simulation. This paper provides a reference for the reliability evaluation of landslide dams with different material types. Full article

24 pages, 6618 KB  
Article
Automated Identification and Quantification of 3D Failure Domains in Spatially Variable Soil Slopes Under Rectangular Footings
by Qinji Jia, Xiaoming Liu, Xin Kang and Changfu Chen
Buildings 2026, 16(7), 1321; https://doi.org/10.3390/buildings16071321 - 26 Mar 2026
Abstract
Accurate identification of slope failure mechanisms under shallow foundations is essential for reliable risk assessment and reinforcement design. However, existing studies often neglect the spatial variability of soil properties and the influence of footing shape. This study develops a non-intrusive stochastic finite difference framework integrating random field theory, Monte Carlo simulation, and a Gaussian mixture model to automatically characterize three-dimensional slope failure domains under rectangular footing loads. Results show that slope failure mechanisms are primarily governed by the footing aspect ratio and the scale of fluctuation in soil strength. Square footings mainly induce shallow slope face failure, whereas rectangular footings significantly increase the probability of deep toe failure as the scale of fluctuation increases. Stochastic analyses generally yield larger mean failure volumes than deterministic analyses. Risk assessment further indicates that risk levels are primarily controlled by the absolute failure volume at low safety factors, whereas failure variability becomes increasingly influential at higher safety factors. Full article
(This article belongs to the Special Issue New Reinforcement Technologies Applied in Slope and Foundation)

16 pages, 2916 KB  
Article
Deep Learning-Based Relay Selection in a Decode-and-Forward Cooperative System with Energy Harvesting and Signal Space Diversity
by Ahmed Oun, Divyessh Maheshwari and Ahmed Ammar
Electronics 2026, 15(7), 1363; https://doi.org/10.3390/electronics15071363 - 25 Mar 2026
Abstract
Deep learning techniques have been widely applied in wireless communication systems to enhance resilience and reduce computational complexity. This paper investigates both traditional and deep learning-based approaches for real-time relay selection in a cooperative communication system with multiple energy-harvesting relays and signal space diversity. The assumed relay decoding scheme is decode-and-forward (DF), with selection criteria based on successful decoding from the source, sufficient energy availability, and the best channel to the destination. The system performance is evaluated in terms of outage probability. Monte Carlo simulations are used to determine the exact outage probability of the system and to generate datasets for training machine learning models. The traditional machine learning models implemented include Decision Tree (DT), Logistic Regression (LR), K-Nearest Neighbor (KNN), and Support Vector Machines (SVMs). The deep learning-based method used is the deep neural network (DNN). Two datasets—one with six features and another with nine features—were used for training and testing. The 6-feature datasets are comparatively less random and complex than the 9-feature datasets. The results indicate that among traditional models KNN achieves the highest accuracy and is thus used as a benchmark to compare against DNN performance. For the 9-feature datasets, both KNN and DNN struggle to accurately approximate the exact outage probability, suggesting that the 9-feature datasets are too complex and noisy for effective modeling. However, on the 6-feature datasets, KNN achieves 77% accuracy, while DNN achieves a significantly higher accuracy of 99%. Due to its high accuracy, the DNN model closely approximates the exact outage probability while offering greater computational efficiency compared to the KNN model. 
These results underscore the potential of deep learning in optimizing real-time relay selection for energy-harvesting cooperative communication systems. Full article
(This article belongs to the Special Issue Advances in Networked Systems and Communication Protocols)
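The Monte Carlo outage simulation behind the training data can be sketched directly from the stated selection criteria: a relay is eligible if it decodes the source and has enough harvested energy, and the eligible relay with the best channel to the destination forwards. The Rayleigh-fading channels, energy model, and thresholds below are illustrative assumptions.

```python
import random

def outage_probability(n_relays, snr_db, th_db=3.0, e_th=0.5,
                       n=50_000, seed=4):
    """Monte Carlo outage for DF relay selection with energy harvesting.
    Channel gains are Rayleigh (exponential power); energy is a proxy draw."""
    rng = random.Random(seed)
    g = 10 ** (snr_db / 10)       # average link SNR (linear)
    th = 10 ** (th_db / 10)       # decoding SNR threshold (linear)
    outages = 0
    for _ in range(n):
        best = 0.0
        for _ in range(n_relays):
            src = g * rng.expovariate(1.0)     # source -> relay SNR
            energy = rng.expovariate(1.0)      # harvested-energy proxy
            dst = g * rng.expovariate(1.0)     # relay -> destination SNR
            if src >= th and energy >= e_th:   # decoding + energy criteria
                best = max(best, dst)          # keep best eligible channel
        if best < th:                          # no relay closes the link
            outages += 1
    return outages / n

p2 = outage_probability(2, 10)
p6 = outage_probability(6, 10)
```

Runs like these generate both the exact outage estimates and, with the per-trial channel and energy draws logged as features, the datasets on which the classifiers are trained.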

25 pages, 1505 KB  
Article
Food Security–Climate Change–National Income Nexus: Insights from GCC Countries
by Raga M. Elzaki
Foods 2026, 15(6), 1099; https://doi.org/10.3390/foods15061099 - 20 Mar 2026
Abstract
Food insecurity is felt particularly deeply in vulnerable regions impacted by climate change. Therefore, this study aims to examine the impact of climate change and gross national income on food security in the Gulf Cooperation Council (GCC) countries. The study utilized cross-country panel data for GCC countries from 2000 to 2024, with food access acting as the dependent variable for food security. Annual meteorological temperature, energy-related carbon emissions, and gross national income are included as independent variables representing climate change and economic growth. The Pedroni and Johansen–Fisher panel cointegration tests were applied. Furthermore, the study employs Bayesian random-effects (BRE) and Bayesian mixed-effects (BME) models, estimated through Markov Chain Monte Carlo (MCMC) methods, to obtain posterior distributions of the model's parameters. The results confirm the existence of a long-term cointegrating relationship among the selected variables. Gross national income has a positive impact on food security, whereas carbon emissions exert a negative effect. The findings reveal that food security is shaped by interconnected economic and climate factors, with notable differences between countries. These results underline the importance of regional cooperation and country-specific policies that focus on enhancing income, mitigating emissions, and investing in food systems. Full article
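The core MCMC idea behind the Bayesian random-effects estimation can be illustrated with a minimal random-walk Metropolis sampler. This is a schematic sketch on synthetic data standing in for the GCC panel, with a simple random-intercept regression and flat priors; all names, dimensions, and values are illustrative, not the paper's model.

```python
# Minimal random-walk Metropolis sketch of MCMC estimation for a
# random-intercept panel regression y_it = a_i + b * x_it + noise.
# Synthetic data and flat priors; everything here is illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic panel: 6 "countries", 25 "years"
n_countries, n_years = 6, 25
a_true = rng.normal(0, 1, n_countries)          # country intercepts
b_true = 0.5                                    # common slope
x = rng.normal(size=(n_countries, n_years))
y = a_true[:, None] + b_true * x + rng.normal(0, 0.3, x.shape)

def log_post(params):
    # Gaussian likelihood with known noise sd 0.3; flat priors
    a, b = params[:n_countries], params[n_countries]
    resid = y - a[:, None] - b * x
    return -0.5 * np.sum(resid ** 2) / 0.3 ** 2

params = np.zeros(n_countries + 1)
lp = log_post(params)
samples = []
for step in range(20_000):
    prop = params + rng.normal(0, 0.05, params.size)  # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:           # Metropolis accept
        params, lp = prop, lp_prop
    if step >= 5_000:                                 # discard burn-in
        samples.append(params[n_countries])

print(np.mean(samples))   # posterior mean of the slope, near 0.5
```

The post-burn-in draws approximate the joint posterior of the country intercepts and the slope; in the paper's setting the same machinery yields posterior distributions for the climate and income coefficients.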
(This article belongs to the Section Food Security and Sustainability)
24 pages, 611 KB  
Article
Discrete Asymmetric Double Lindley Distribution on Z: Theory, Likelihood Inference, and Applications
by Hugo S. Salinas, Hassan S. Bakouch, Sudeep R. Bapat, Amira F. Daghestani and Anhar S. Aloufi
Symmetry 2026, 18(3), 533; https://doi.org/10.3390/sym18030533 - 20 Mar 2026
Abstract
We introduce the discrete asymmetric double Lindley distribution, a new two-parameter family on the integer line designed to model signed counts and net changes with flexible asymmetric tail behavior. This statistical model is obtained by merging two Lindley-type linear-geometric kernels on the negative and non-negative half-lines, with tail decay rates that are coupled through a simple two-parameter mechanism. This construction yields an analytically tractable probability mass function with an explicit normalizing constant, as well as closed-form expressions for the cumulative distribution function and one-sided tail probabilities. We further provide a transparent stochastic representation based solely on Bernoulli and geometric random variables, leading to an exact and efficient simulation algorithm that is convenient for Monte Carlo studies and for validating numerical likelihood routines. Graphical illustrations highlight the role of the asymmetry parameter in controlling the imbalance between the two tails and the resulting skewness on Z. The proposed family offers a practical and interpretable alternative to existing integer-line models for asymmetric discrete data, with direct applicability to likelihood-based inference and real-world datasets. Full article
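The Bernoulli-plus-geometric stochastic representation mentioned in the abstract can be sketched as follows. This is a schematic sampler assuming a Lindley-style mixture on each half-line (a geometric with probability θ/(θ+1), otherwise the sum of two independent geometrics); the paper's exact parameterization, coupling mechanism, and mixture weights may differ, and all parameter names here are illustrative.

```python
# Schematic sampler: Bernoulli sign choice, then a Lindley-style
# linear-geometric magnitude on the chosen half-line. The mixture form
# (Geom with prob theta/(theta+1), else a sum of two Geoms) is assumed,
# not taken from the paper.
import numpy as np

rng = np.random.default_rng(1)

def _lindley_magnitude(theta, q):
    # Both geometric components live on {0, 1, 2, ...}
    if rng.random() < theta / (theta + 1):
        return rng.geometric(1 - q) - 1
    return (rng.geometric(1 - q) - 1) + (rng.geometric(1 - q) - 1)

def sample_signed(p_pos, theta, q_pos, q_neg, size=1):
    """Bernoulli sign, then a linear-geometric magnitude per side."""
    out = np.empty(size, dtype=np.int64)
    for i in range(size):
        if rng.random() < p_pos:
            out[i] = _lindley_magnitude(theta, q_pos)         # 0, 1, 2, ...
        else:
            out[i] = -(_lindley_magnitude(theta, q_neg) + 1)  # -1, -2, ...
    return out

draws = sample_signed(0.6, 1.5, 0.5, 0.7, size=10_000)
print(draws.min(), draws.max(), float(np.mean(draws >= 0)))
```

Because each draw reduces to Bernoulli and geometric variates, the sampler is exact (no rejection step) and cheap, which is what makes it convenient for large Monte Carlo studies and for cross-checking numerical likelihood code.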
(This article belongs to the Section Mathematics)
