Search Results (211)

Search Parameters:
Keywords = normalized Gamma distribution

41 pages, 8144 KB  
Article
Statistical Development of Rainfall IDF Curves and Machine Learning-Based Bias Assessment: A Case Study of Wadi Al-Rummah, Saudi Arabia
by Ibrahim T. Alhbib, Ibrahim H. Elsebaie and Saleh H. Alhathloul
Hydrology 2026, 13(3), 96; https://doi.org/10.3390/hydrology13030096 - 16 Mar 2026
Abstract
Reliable estimation of extreme rainfall is essential for hydraulic design and flood risk mitigation, particularly in arid regions where rainfall exhibits strong temporal and spatial variability. This study presents a statistical framework for developing rainfall intensity-duration-frequency (IDF) curves, complemented by a machine learning-based assessment of model bias and performance. The analysis was conducted using data from ten rainfall stations located within or near the Wadi Al-Rummah Basin. Annual maximum series (AMS) from 1969 to 2024 were first reconstructed to address missing years using a modified normal ratio method (NRM) combined with nearest-station selection, ensuring spatial consistency while preserving station-specific rainfall characteristics. Six probability distributions (Weibull, Gumbel, gamma, lognormal, generalized extreme value (GEV), and generalized Pareto) were fitted to each station, and the best-fit distribution was identified using multiple goodness-of-fit (GOF) criteria, including the Kolmogorov–Smirnov (K-S) test, Anderson–Darling (A-D) test, root mean square error (RMSE), chi-square (χ²) statistic, Akaike information criterion (AIC), Bayesian information criterion (BIC), and the coefficient of determination (R²). Statistical IDF curves were then developed for durations ranging from 5 to 1440 min and return periods from 2 to 1000 years. To evaluate the robustness of the statistically derived IDF curves, three machine learning (ML) models, multiple linear regression (MLR), regression random forest (RRF), and multilayer feed-forward neural network (MFFNN), were trained as surrogate models using duration, return period, and station geographic attributes as predictor variables. Model performance was evaluated using RMSE, MAE, and mean bias metrics across stations and return periods. The lognormal distribution emerged as the best-fit model for four stations, while the Gumbel and gamma distributions were selected for two stations each.
Overall, no single probability distribution consistently outperformed others, indicating station-dependent behavior. Among the machine learning models, the MFFNN achieved the closest agreement with statistical IDF estimates (RMSE = 0.97, MAE = 0.65, bias = 0.02), followed by RRF and MLR based on global average performance across all stations and return periods. The proposed framework offers a reliable approach for rainfall IDF development and evaluation in arid-region watersheds. Full article
(This article belongs to the Section Statistical Hydrology)
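The fit-and-select procedure the abstract describes (fit several candidate distributions to an annual maximum series, then rank them by goodness-of-fit criteria such as AIC and the K-S statistic) can be sketched as follows. This is a minimal illustration, not the authors' code; the data are synthetic stand-ins for a 1969–2024 AMS.

```python
# Sketch: fit candidate distributions to an annual-maximum series and rank
# them by AIC and the Kolmogorov-Smirnov statistic. Synthetic data only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
ams = rng.gamma(shape=2.0, scale=15.0, size=56)  # stand-in for 56 annual maxima

candidates = {
    "weibull": stats.weibull_min,
    "gumbel": stats.gumbel_r,
    "gamma": stats.gamma,
    "lognormal": stats.lognorm,
    "GEV": stats.genextreme,
}

results = {}
for name, dist in candidates.items():
    # fix the location at 0 for the positive-support families
    if name in {"weibull", "gamma", "lognormal"}:
        params = dist.fit(ams, floc=0)
    else:
        params = dist.fit(ams)
    ll = np.sum(dist.logpdf(ams, *params))
    aic = 2 * len(params) - 2 * ll
    ks = stats.kstest(ams, dist.cdf, args=params).statistic
    results[name] = (aic, ks)

best = min(results, key=lambda k: results[k][0])  # lowest AIC wins
print(best, results[best])
```

A real study would add the further criteria the paper lists (A-D, RMSE, χ², BIC, R²) and cross-check that the criteria agree before declaring a best-fit distribution.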

18 pages, 1850 KB  
Article
A Median-Centered Sequential Monitoring Scheme Based on Golden Ratio Weighting for Skewed Distributions
by Elif Kozan
Mathematics 2026, 14(6), 941; https://doi.org/10.3390/math14060941 - 11 Mar 2026
Viewed by 74
Abstract
Detecting small location shifts in stochastic processes is a fundamental problem in sequential statistical monitoring. Classical procedures such as Shewhart-type schemes, exponentially weighted moving average (EWMA), and cumulative sum (CUSUM) methods are known to perform well under normality or near-symmetry assumptions; however, their effectiveness may deteriorate in the presence of right-skewed distributions. In such settings, mean-based monitoring statistics can be highly sensitive to tail behavior, which may lead to delayed detection of small shifts or unstable false alarm performance. This paper introduces a monitoring scheme referred to as the Golden Ratio (GR) control chart, designed for detecting small location shifts in right-skewed distributions. The proposed method is constructed using a median-centered statistic combined with a geometrically decaying weighting mechanism derived from the golden ratio. Unlike classical time-based weighting schemes, the GR chart assigns weights according to the rank-based distance from the sample median, thereby attenuating the influence of isolated extreme observations while preserving sensitivity to persistent distributional changes. The run-length performance of the proposed chart is investigated using Monte Carlo simulation experiments. All competing procedures are calibrated to achieve comparable in-control average run lengths. The GR chart is compared with classical EWMA and CUSUM charts under several skewed distributions, including Gamma, Lognormal, and Weibull models. Simulation results indicate that the proposed approach provides a robust and stable monitoring alternative for skewed processes. In particular, the GR chart demonstrates competitive performance for detecting small location shifts while reducing the influence of extreme observations commonly encountered in right-skewed environments. Full article
(This article belongs to the Section D1: Probability and Statistics)
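For context, one of the benchmarks the abstract compares against, the classical EWMA chart, can be sketched in a few lines (the golden-ratio weighting itself is the paper's contribution and is not reproduced here). The shift size, smoothing constant, and control-limit multiplier below are illustrative choices, not values from the paper.

```python
# Sketch of a classical EWMA location chart: z_t = lam*x_t + (1-lam)*z_{t-1},
# signaling when z_t leaves the asymptotic control limits. Illustrative only.
import numpy as np

def ewma_run_length(x, mu0, sigma, lam=0.1, L=2.7):
    """Return the first index at which the EWMA statistic signals, or None."""
    z = mu0
    half_width = L * sigma * np.sqrt(lam / (2.0 - lam))  # asymptotic limits
    for t, xt in enumerate(x, start=1):
        z = lam * xt + (1.0 - lam) * z
        if abs(z - mu0) > half_width:
            return t
    return None

rng = np.random.default_rng(1)
shifted = rng.normal(loc=1.0, scale=1.0, size=1000)  # a one-sigma upward shift
print(ewma_run_length(shifted, mu0=0.0, sigma=1.0))
```

Run-length comparisons like the paper's Monte Carlo study repeat this over many replications, in-control and out-of-control, after calibrating each chart to a common in-control average run length.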

65 pages, 1161 KB  
Article
The Empirical Bayes Estimators of the Variance Parameter of the Normal Distribution with a Normal-Inverse-Gamma Prior Under Stein’s Loss Function
by Ying-Ying Zhang
Axioms 2026, 15(2), 127; https://doi.org/10.3390/axioms15020127 - 10 Feb 2026
Viewed by 225
Abstract
For the hierarchical normal and normal-inverse-gamma model, we derive the Bayesian estimator of the variance parameter in the normal distribution under Stein’s loss function—a penalty function that treats gross overestimation and underestimation equally—and compute the associated Posterior Expected Stein’s Loss (PESL). Additionally, we determine the Bayesian estimator of the same variance parameter under the squared error loss function, along with its corresponding PESL. We further develop empirical Bayes estimators for the variance parameter using a conjugate normal-inverse-gamma prior, employing both the method of moments and Maximum Likelihood Estimation (MLE). Theoretical properties, including posterior and marginal distributions, two inequalities that relate two Bayes estimators and their corresponding PESLs, and consistencies of hyperparameter estimators and empirical Bayes estimators, are established. The simulation results demonstrate that MLEs outperform moment estimators in estimating hyperparameters, particularly with respect to consistency and model fit. Finally, we apply our methodology to real-world data on poverty levels—specifically, the percentage of individuals living below the poverty line—to validate and illustrate our theoretical findings. Full article
(This article belongs to the Section Mathematical Analysis)
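For orientation, the standard textbook forms of Stein's loss and the resulting Bayes estimators (not taken from the paper itself) are:

```latex
% Stein's loss for estimating \sigma^2 by \delta, and the Bayes estimators
% under Stein's loss and under squared error loss:
L(\sigma^2, \delta) = \frac{\delta}{\sigma^2} - \ln\frac{\delta}{\sigma^2} - 1,
\qquad
\delta^{\mathrm{Stein}} = \frac{1}{\mathbb{E}\left[\sigma^{-2} \mid x\right]},
\qquad
\delta^{\mathrm{SE}} = \mathbb{E}\left[\sigma^{2} \mid x\right].
% With an inverse-gamma posterior \sigma^2 \mid x \sim IG(a, b):
\delta^{\mathrm{Stein}} = \frac{b}{a},
\qquad
\delta^{\mathrm{SE}} = \frac{b}{a-1} \quad (a > 1),
\qquad \text{so } \delta^{\mathrm{Stein}} \le \delta^{\mathrm{SE}}.
```

This ordering of the two Bayes estimators is the kind of inequality the abstract refers to when relating the estimators and their posterior expected Stein's losses.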

11 pages, 1620 KB  
Article
Frequency Distribution of Sward Heights and Forage Species Composition in Different Integrated Crop–Livestock Systems
by Renata Franciéli Moraes, Daniela Maria Martin, Arthur Pontes Prates, Carolina Bremm, Paulo Cesar de Faccio Carvalho, Lucas Aquino Alves, Leandro Bittencourt de Oliveira and Anibal de Moraes
Grasses 2026, 5(1), 8; https://doi.org/10.3390/grasses5010008 - 9 Feb 2026
Viewed by 298
Abstract
Sward height is a practical indicator for defining management targets that reflect pasture structure. The complexity of integrated systems, including the coexistence of trees, crops, and livestock, can modify animal grazing distribution and microhabitat conditions, leading to different degrees of sward heterogeneity and botanical composition. This study investigated sward-height distribution and species composition in four systems: livestock (L), livestock–forestry (LF), crop–livestock (CL), and crop–livestock–forestry (CLF). Data were collected over two years in pastures of black oat (Avena strigosa Schreb.), Aries grass (Megathyrsus maximus cv. Aries), Italian ryegrass (Lolium multiflorum Lam.), and other tropical grasses during summer, transition, and winter. Sward heights were classified into three categories (low, optimal, high) according to seasonal thresholds (winter: <18.0; 18–29.9; >30 cm; summer: <15.0; 15–24.9; >25 cm) and fitted to four probability distributions (normal, log-normal, Gamma, Weibull). Management based on target-height maintained 46% of observations within the optimal class, a satisfactory proportion for continuous stocking systems where structural heterogeneity is inherent. The CL system presented greater species diversity due to a higher frequency of Italian ryegrass and other grasses. Across systems and seasons, the Gamma distribution provided the best fit for sward-height frequencies. These findings offer a practical statistical tool for evaluating grazing management efficiency. Full article

25 pages, 2745 KB  
Article
A Natural Generalization of the XLindley Distribution and Its First-Order Autoregressive Process with Applications to Non-Gaussian Time Series
by Emrah Altun, Soheyla A. Ghomeishi and Hana N. Alqifari
Axioms 2026, 15(2), 107; https://doi.org/10.3390/axioms15020107 - 31 Jan 2026
Viewed by 339
Abstract
The natural generalization of the XLindley distribution is proposed. The mathematical properties of the generalized XLindley distribution are derived. The importance of the proposed model is evaluated on the first-order autoregressive process, and compared with its counterparts. Extensive simulation studies are carried out to demonstrate the suitability of the estimation methods. Empirical findings reveal that the first-order autoregressive process with generalized XLindley innovations produces better forecasting results than those of the gamma, weighted Lindley, and normal innovations. Additionally, a web-tool application of the proposed model is developed and deployed on a free server that is accessible for practitioners. Full article
(This article belongs to the Special Issue Advances in the Theory and Applications of Statistical Distributions)

27 pages, 586 KB  
Article
Symmetric Double Normal Models for Censored, Bounded, and Survival Data: Theory, Estimation, and Applications
by Guillermo Martínez-Flórez, Hugo Salinas and Javier Ramírez-Montoya
Mathematics 2026, 14(2), 384; https://doi.org/10.3390/math14020384 - 22 Jan 2026
Viewed by 185
Abstract
We develop a unified likelihood-based framework for limited outcomes built on the two-piece normal family. The framework includes a censored specification that accommodates boundary inflation, a doubly truncated specification on (0,1) for rates and proportions, and a survival formulation with a log-two-piece normal baseline and Gamma frailty to account for unobserved heterogeneity. We derive closed-form building blocks (pdf, cdf, survival, hazard, and cumulative hazard), full log-likelihoods with score functions and observed information, and stable reparameterizations that enable routine optimization. Monte Carlo experiments show a small bias and declining RMSE with increasing sample size; censoring primarily inflates the variability of regression coefficients; the scale parameter remains comparatively stable, and the shape parameter is most sensitive under heavy censoring. Applications to HIV-1 RNA with a detection limit, household food expenditure on (0,1), labor-supply hours with a corner solution, and childhood cancer times to hospitalization demonstrate improved fit over Gaussian, skew-normal, and beta benchmarks according to AIC/BIC/CAIC and goodness-of-fit diagnostics, with model-implied censoring closely matching the observed fraction. The proposed formulations are tractable, flexible, and readily implementable with standard software. Full article
(This article belongs to the Section D1: Probability and Statistics)

20 pages, 5273 KB  
Article
Investigation of the Vertical Microphysical Characteristics of Rainfall in Guangzhou Based on Phased-Array Radar
by Jingxuan Zhu, Jun Zhang, Duanyang Ji, Qiang Dai and Changjun Liu
Remote Sens. 2026, 18(2), 322; https://doi.org/10.3390/rs18020322 - 18 Jan 2026
Viewed by 353
Abstract
The accurate retrieval of the raindrop size distribution (DSD) is a longstanding objective in meteorology because it underpins reliable quantitative precipitation estimation. Among remote sensors, weather radars are the primary tool for mapping DSD over wide areas, and phased-array systems in particular have demonstrated unique advantages owing to their high temporal and spatial resolution together with agile beam steering. Exploiting the underused high-resolution capability of an X-band phased-array radar, this study introduced a Rainfall Regression Model (RRM). The RRM assumes a normalized gamma DSD model and retrieves its three parameters. It was then applied to a rain event influenced by the remnant circulation of Typhoon Haikui that affected Guangzhou on 8 September 2023. First, collocated disdrometer observations and T-matrix scattering simulations are used to build polynomial regressions between DSD parameters (D0, Nw, μ) and the polarimetric variables. Validation against independent disdrometer samples yields Nash–Sutcliffe efficiencies of 0.93 for D0 and 0.91 for log10(Nw). The RRM is then applied to the full volumetric radar data. Horizontal maps reveal that the surface elevation angle consistently exhibited the largest standard deviation for all three parameters. A vertical profile analysis shows that large-drop cores (D0 > 2 mm) can reside above 2 km and that iso-value contours tilt rather than align vertically, implying an appreciable horizontal drift of raindrops within the complex remnant typhoon–monsoon wind field. By demonstrating the ability of X-band phased-array radar to resolve the three-dimensional microphysical structure of remnant typhoon precipitation, this study advances our understanding of the vertical characteristics of raindrops and provides high-resolution DSD information that can be directly ingested into severe weather monitoring and nowcasting systems. Full article
(This article belongs to the Section Environmental Remote Sensing)
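The normalized gamma DSD the abstract assumes (and the keyword behind this search) is commonly written as N(D) = Nw f(μ) (D/D0)^μ exp(−(3.67 + μ)D/D0), with f(μ) = (6/3.67⁴)(3.67 + μ)^(μ+4)/Γ(μ + 4). A minimal evaluation of this standard form, with illustrative parameter values rather than anything retrieved in the paper:

```python
# Evaluate the normalized gamma raindrop size distribution
# N(D) = Nw * f(mu) * (D/D0)^mu * exp(-(3.67 + mu) * D / D0).
import numpy as np
from scipy.special import gamma as Gamma

def normalized_gamma_dsd(D, Nw, D0, mu):
    """Drop concentration N(D) at diameter D (mm); Nw sets the overall scale,
    D0 is the median volume diameter, mu the shape parameter."""
    f_mu = (6.0 / 3.67**4) * (3.67 + mu) ** (mu + 4) / Gamma(mu + 4)
    return Nw * f_mu * (D / D0) ** mu * np.exp(-(3.67 + mu) * D / D0)

D = np.linspace(0.1, 6.0, 200)                      # diameters in mm
nd = normalized_gamma_dsd(D, Nw=8000.0, D0=1.5, mu=3.0)
```

The spectrum peaks at D = D0 · μ/(3.67 + μ), which is one way to sanity-check an implementation against the retrieved (D0, Nw, μ) triplets.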

31 pages, 13946 KB  
Article
The XLindley Survival Model Under Generalized Progressively Censored Data: Theory, Inference, and Applications
by Ahmed Elshahhat and Refah Alotaibi
Axioms 2026, 15(1), 56; https://doi.org/10.3390/axioms15010056 - 13 Jan 2026
Viewed by 224
Abstract
This paper introduces a novel extension of the classical Lindley distribution, termed the X-Lindley model, obtained by a specific mixture of exponential and Lindley distributions, thereby substantially enriching the distributional flexibility. To enhance its inferential scope, a comprehensive reliability analysis is developed under a generalized progressive hybrid censoring scheme, which unifies and extends several traditional censoring mechanisms and allows practitioners to accommodate stringent experimental and cost constraints commonly encountered in reliability and life-testing studies. Within this unified censoring framework, likelihood-based estimation procedures for the model parameters and key reliability characteristics are derived. Fisher information is obtained, enabling the establishment of asymptotic properties of the frequentist estimators, including consistency and normality. A Bayesian inferential paradigm using Markov chain Monte Carlo techniques is proposed by assigning a conjugate gamma prior to the model parameter under the squared error loss, yielding point estimates, highest posterior density credible intervals, and posterior reliability summaries with enhanced interpretability. Extensive Monte Carlo simulations, conducted under a broad range of censoring configurations and assessed using four precision-based performance criteria, demonstrate the stability and efficiency of the proposed estimators. The results reveal low bias, reduced mean squared error, and shorter interval lengths for the XLindley parameter estimates, while maintaining accurate coverage probabilities. The practical relevance of the proposed methodology is further illustrated through two real-life data applications from engineering and physical sciences, where the XLindley model provides a markedly improved fit and more realistic reliability assessment. 
By integrating an innovative lifetime model with a highly flexible censoring strategy and a dual frequentist–Bayesian inferential framework, this study offers a substantive contribution to modern survival theory. Full article
(This article belongs to the Special Issue Recent Applications of Statistical and Mathematical Models)

32 pages, 907 KB  
Article
Performance Analysis of Uplink Opportunistic Scheduling for Multi-UAV-Assisted Internet of Things
by Long Suo, Zhichu Zhang, Lei Yang and Yunfei Liu
Drones 2026, 10(1), 18; https://doi.org/10.3390/drones10010018 - 28 Dec 2025
Viewed by 479
Abstract
Due to their high mobility, flexibility, and low cost, unmanned aerial vehicles (UAVs) can provide an efficient way of provisioning data communication and computing offloading services for massive Internet of Things (IoT) devices, especially in remote areas with limited infrastructure. However, current transmission schemes for unmanned aerial vehicle-assisted Internet of Things (UAV-IoT) predominantly employ polling scheduling, thus not fully exploiting the potential multiuser diversity gains offered by a vast number of IoT nodes. Furthermore, conventional opportunistic scheduling (OS) or opportunistic beamforming techniques are predominantly designed for downlink transmission scenarios. When applied directly to uplink IoT data transmission, these methods can incur excessive uplink training overhead. To address these issues, this paper first proposes a low-overhead multi-UAV uplink OS framework based on channel reciprocity. To avoid explicit massive uplink channel estimation, two scheduling criteria are designed: minimum downlink interference (MDI) and the maximum downlink signal-to-interference-plus-noise ratio (MD-SINR). Second, for a dual-UAV deployment scenario over Rayleigh block fading channels, we derive closed-form expressions for both the average sum rate and the asymptotic sum rate based on the MDI criterion. A degrees-of-freedom (DoF) analysis demonstrates that when the number of sensors, K, scales as ρ^α, the system can achieve a total of 2α DoF, where α ∈ (0, 1] is the user-scaling factor and ρ is the transmitted signal-to-noise ratio (SNR). Third, for a three-UAV deployment scenario, the Gamma distribution is employed to approximate the uplink interference, thereby yielding a tractable expression for the average sum rate. Simulations confirm the accuracy of the performance analysis for both dual- and three-UAV deployments. The normalized error between theoretical and simulation results falls below 1% for K > 30.
Furthermore, the impact of fading severity on the system’s sum rate and DoF performance is systematically evaluated via simulations under Nakagami-m fading channels. The results indicate that more severe fading (a smaller m) yields greater multiuser diversity gain. Both the theoretical and simulation results consistently show that within the medium-to-high SNR regime, the dual-UAV deployment outperforms both the single-UAV and three-UAV schemes in both Rayleigh and Nakagami-m channels. This study provides a theoretical foundation for the adaptive deployment and scheduling design of UAV-assisted IoT uplink systems under various fading environments. Full article
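The Gamma approximation of aggregate interference mentioned in the abstract typically rests on moment matching: choose the shape k and scale θ so the Gamma's mean kθ and variance kθ² equal the interference's first two moments. A minimal sketch of that step (synthetic data, not the paper's channel model):

```python
# Sketch: moment-match a Gamma distribution to aggregate interference power.
# Shape k = mean^2 / var, scale theta = var / mean. Synthetic data only.
import numpy as np

def gamma_moment_match(samples):
    m, v = np.mean(samples), np.var(samples)
    return m**2 / v, v / m   # (shape k, scale theta)

rng = np.random.default_rng(2)
# Stand-in for interference summed over three Rayleigh-faded links:
# exponential powers, whose sum is exactly Gamma(3, 1).
interference = rng.exponential(scale=1.0, size=(100_000, 3)).sum(axis=1)
k, theta = gamma_moment_match(interference)
```

Because a sum of i.i.d. exponential powers is exactly Gamma distributed, the fitted (k, θ) here should land near (3, 1), which makes the example self-checking.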

39 pages, 23728 KB  
Article
Parametric Inference of the Power Weibull Survival Model Using a Generalized Censoring Plan: Three Applications to Symmetry and Asymmetry Scenarios
by Refah Alotaibi and Ahmed Elshahhat
Symmetry 2025, 17(12), 2142; https://doi.org/10.3390/sym17122142 - 12 Dec 2025
Viewed by 362
Abstract
Generalized censoring, combined with a power-based distribution, improves inferential efficiency by capturing more detailed failure-time information in complex testing scenarios. Conventional censoring schemes may discard substantial failure-time information, leading to inefficiencies in parameter estimation and reliability prediction. To address this limitation, we develop a comprehensive inferential framework for the alpha-power Weibull (APW) distribution under a generalized progressive hybrid Type-II censoring scheme, a flexible design that unifies classical, hybrid, and progressive censoring while guaranteeing test completion within preassigned limits. Both maximum likelihood and Bayesian estimation procedures are derived for the model parameters, reliability function, and hazard rate. Associated uncertainty quantification is provided through asymptotic confidence intervals (normal and log-normal approximations) and Bayesian credible intervals obtained via Markov chain Monte Carlo (MCMC) methods with independent gamma priors. In addition, we propose optimal censoring designs based on trace, determinant, and quantile-variance criteria to maximize inferential efficiency at the design stage. Extensive Monte Carlo simulations, assessed using four precision measures, demonstrate that the Bayesian MCMC estimators consistently outperform their frequentist counterparts in terms of bias, mean squared error, robustness, and interval coverage across a wide range of censoring levels and prior settings. Finally, the proposed methodology is validated using real-life datasets from engineering (electronic devices), clinical (organ transplant), and physical (rare metals) studies, demonstrating the APW model’s superior goodness-of-fit, reliability prediction, and inferential stability. 
Overall, this study demonstrates that combining generalized censoring with the APW distribution substantially enhances inferential efficiency and predictive performance, offering a robust and versatile tool for complex life-testing experiments across multiple scientific domains. Full article

21 pages, 26649 KB  
Article
A Hybrid Deep Learning-Based Modeling Methods for Atmosphere Turbulence in Free Space Optical Communications
by Yuan Gao, Bingke Yang, Shasha Fan, Leheng Xu, Tianye Wang, Boxian Yang and Shichen Jiang
Photonics 2025, 12(12), 1210; https://doi.org/10.3390/photonics12121210 - 8 Dec 2025
Viewed by 794
Abstract
Free-space optical (FSO) communication provides high-capacity and secure links but is strongly impaired by atmospheric turbulence, which induces multi-scale irradiance fluctuations. Traditional approaches such as adaptive optics, multi-aperture and multiple-input multiple-output FSO schemes offer limited robustness under rapidly varying turbulence, while statistical fading models such as log-normal and Gamma–Gamma cannot represent multi-scale temporal correlations. This work proposes a hybrid deep learning framework that explicitly separates high-frequency scintillation and low-frequency power drift through a conditional variational autoencoder and a bidirectional long short-term memory dual-branch architecture with an adaptive gating mechanism. Trained on OptiSystem-generated datasets, the model accurately reconstructs irradiance distributions and temporal dynamics. For model-assisted signal compensation, it achieves an average 79% bit-error-rate (BER) reduction across all simulated scenarios compared with conventional thresholding and Gamma–Gamma maximum a posteriori detection. Transfer learning further enables efficient adaptation to new turbulence conditions with minimal retraining. Experimental validation shows that the compensated BER approaches near-zero, yielding significant improvement over traditional detection. These results demonstrate an effective and adaptive solution for turbulence-impaired FSO links. Full article
(This article belongs to the Special Issue Advances in Free-Space Optical Communications)
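The Gamma–Gamma fading model the abstract uses as a baseline describes irradiance as the product of two independent unit-mean Gamma variates (large- and small-scale eddies). A minimal sampling sketch, with illustrative α, β values rather than parameters from the paper:

```python
# Sketch: sample Gamma-Gamma turbulence-induced irradiance as the product of
# two independent unit-mean Gamma variates. Parameter values are illustrative.
import numpy as np

def gamma_gamma_samples(alpha, beta, n, rng):
    x = rng.gamma(shape=alpha, scale=1.0 / alpha, size=n)  # large-scale, mean 1
    y = rng.gamma(shape=beta, scale=1.0 / beta, size=n)    # small-scale, mean 1
    return x * y

rng = np.random.default_rng(3)
I = gamma_gamma_samples(alpha=4.0, beta=2.0, n=200_000, rng=rng)
# Scintillation index; analytically 1/alpha + 1/beta + 1/(alpha*beta).
scint_index = np.var(I) / np.mean(I) ** 2
```

Matching the empirical scintillation index to its closed form 1/α + 1/β + 1/(αβ) is a quick check that a fading generator of this kind is wired correctly before it feeds a BER simulation.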

49 pages, 2669 KB  
Article
On a Three-Parameter Bounded Gamma–Gompertz Distribution, with Properties, Estimation, and Applications
by Tassaddaq Hussain, Mohammad Shakil and Mohammad Ahsanullah
AppliedMath 2025, 5(4), 177; https://doi.org/10.3390/appliedmath5040177 - 8 Dec 2025
Viewed by 580
Abstract
A novel statistical model, the Bounded Gamma–Gompertz Distribution (BGGD), is presented alongside a full characterization of its properties. Our investigation identifies maximum-likelihood estimation (MLE) as the most effective fitting procedure, proving it to be more consistent and efficient than alternative approaches like L-moments and Bayesian estimation. Empirical validation on Tesla (TSLA) financial records—spanning open, high, low, close prices, and trading volume—showcased the BGGD’s superior performance. It delivered a better fit than several competing heavy-tailed distributions, including Student-t, Log-Normal, Lévy, and Pareto, as indicated by minimized AIC and BIC statistics. The results substantiate the distribution’s robustness in capturing extreme-value behavior, positioning it as a potent tool for financial modeling applications. Full article

20 pages, 1589 KB  
Article
A Computational Framework for Reproducible Generation of Synthetic Grain-Size Distributions for Granular and Geoscientific Applications
by Seweryn Lipiński
Geosciences 2025, 15(12), 464; https://doi.org/10.3390/geosciences15120464 - 4 Dec 2025
Viewed by 598
Abstract
Particle size distribution (PSD), also referred to as grain-size distribution (GSD), is a fundamental characteristic of granular materials, influencing packing density, porosity, permeability, and mechanical behavior across soils, sediments, and industrial powders. Accurate and reproducible representation of PSD is essential for computational modeling, digital twin development (i.e., virtual replicas of physical systems), and machine learning applications in geosciences and engineering. Despite the widespread use of classical distributions (log-normal, Weibull, Gamma), there remains a lack of systematic frameworks for generating synthetic datasets with controlled statistical properties and reproducibility. This paper introduces a unified computational framework for generating virtual PSDs/GSDs with predefined statistical characteristics and a specified number of grain-size fractions. The approach integrates parametric modeling with two histogram-based allocation strategies: the equal-width method, maintaining uniform bin spacing, and the equal-probability method, distributing grains according to quantiles of the target distribution. Both methods ensure statistical representativeness, reproducibility, and scalability across material classes. The framework is demonstrated on representative cases of soils (Weibull), sedimentary and industrial materials (Gamma), and food powders (log-normal), showing its generality and adaptability. The generated datasets can support sensitivity analyses, experimental validation, and integration with discrete element modeling, computational fluid dynamics, or geostatistical simulations. Full article
(This article belongs to the Section Geomechanics)
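The two allocation strategies named in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the Weibull shape and scale values, the quantile cut-offs, and the bin count are arbitrary placeholders.

```python
import numpy as np
from scipy import stats

def equal_width_bins(dist, n_bins, lo_q=0.001, hi_q=0.999):
    """Equal-width method: bin edges uniformly spaced over a central
    quantile range; per-bin mass taken from the target CDF."""
    lo, hi = dist.ppf(lo_q), dist.ppf(hi_q)
    edges = np.linspace(lo, hi, n_bins + 1)
    mass = np.diff(dist.cdf(edges))
    return edges, mass / mass.sum()  # renormalize truncated tails to 1

def equal_probability_bins(dist, n_bins, lo_q=0.001, hi_q=0.999):
    """Equal-probability method: bin edges at quantiles of the target
    distribution, so every grain-size fraction carries equal mass."""
    edges = dist.ppf(np.linspace(lo_q, hi_q, n_bins + 1))
    return edges, np.full(n_bins, 1.0 / n_bins)

# Weibull-distributed grain sizes (soil-like case); parameters illustrative.
psd = stats.weibull_min(c=1.8, scale=0.25)  # sizes in mm
ew_edges, ew_mass = equal_width_bins(psd, 10)
ep_edges, ep_mass = equal_probability_bins(psd, 10)
```

Swapping `weibull_min` for `stats.gamma` or `stats.lognorm` covers the other two material classes the paper demonstrates.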

24 pages, 3213 KB  
Article
The UG-EM Lifetime Model: Analysis and Application to Symmetric and Asymmetric Survival Data
by Omalsad H. Odhah, Saba M. Alwan and Sarah Aljohani
Symmetry 2025, 17(12), 2027; https://doi.org/10.3390/sym17122027 - 26 Nov 2025
Abstract
This paper introduces the UG-EM (Unconditional Gamma-Exponential Model) as a new compound lifetime model designed to enhance flexibility in tail behavior compared to traditional distributions. The UG-EM model provides a unified framework for analyzing deviations from symmetry in survival data, effectively capturing right-skewed patterns, which are commonly observed in real-world lifetime phenomena. The main analytical properties are derived, including the probability density, cumulative distribution, hazard and reversed-hazard functions, mean residual life, and several measures of dispersion and uncertainty. The effects of the UG-EM parameters (α and λ) are examined, showing that increasing either parameter can cause a temporary reduction in entropy H(T) at early times followed by a long-term increase; in some cases, the influence of α is stronger than that of λ. Parameter estimation is carried out using the maximum likelihood method and assessed through Monte Carlo simulations to evaluate estimator bias and variability, highlighting the significant role of sample size in estimation accuracy. The proposed model is applied to three survival datasets (Lung, Veteran, and Kidney) and compared with classical alternatives such as Exponential, Weibull, and Log-normal distributions using standard goodness-of-fit criteria. Results indicate that the UG-EM model offers superior flexibility and can capture patterns that simpler models fail to represent, although the empirical results do not demonstrate a clear, consistent superiority over standard competitors across all tested datasets. The paper also discusses identifiability issues, estimation challenges, and practical implications for reliability and medical survival analysis. Recommendations for further theoretical development and broader model comparison are provided.
(This article belongs to the Section Mathematics)
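The abstract's exact UG-EM parameterization is not reproduced here, but the general gamma-exponential compounding idea can be sketched: if the exponential rate is itself Gamma(α, rate = λ), the marginal (unconditional) survival function has the closed Lomax form S(t) = (λ/(λ + t))^α, which a Monte Carlo check confirms. Whether this matches the paper's model term for term is an assumption; the parameter values below are placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, lam = 2.5, 1.5  # gamma shape and rate of the random exponential rate

# Hierarchical sampling: rate ~ Gamma(alpha, rate=lam), then T | rate ~ Exp(rate)
rates = rng.gamma(shape=alpha, scale=1.0 / lam, size=200_000)
t = rng.exponential(scale=1.0 / rates)

def survival_closed_form(x, alpha, lam):
    """Marginal survival of the gamma-exponential compound (Lomax form)."""
    return (lam / (lam + x)) ** alpha

x0 = 1.0
mc = (t > x0).mean()                         # empirical survival at x0
exact = survival_closed_form(x0, alpha, lam)  # closed-form value
```

The heavy (polynomial) tail of the compound, versus the exponential tail of its conditional pieces, is what buys the extra tail flexibility the abstract refers to.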

36 pages, 560 KB  
Review
A Review: Construction of Statistical Distributions
by Kai-Tai Fang, Yu-Xuan Lin and Yu-Hui Deng
Entropy 2025, 27(12), 1188; https://doi.org/10.3390/e27121188 - 23 Nov 2025
Abstract
Statistical modeling is fundamentally based on probability distributions, which can be discrete or continuous and univariate or multivariate. This review focuses on the methods used to construct these distributions, covering both traditional and newly developed approaches. We first examine classic distributions such as the normal, exponential, gamma, and beta for univariate data, and the multivariate normal, elliptical, and Dirichlet for multidimensional data. We then address how, in recent decades, the demand for more flexible modeling tools has led to the creation of complex meta-distributions built using copula theory.
(This article belongs to the Special Issue Number Theoretic Methods in Statistics: Theory and Applications)
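The copula construction the review describes can be sketched in its simplest Gaussian-copula form: sample correlated standard normals, map them to uniforms through the normal CDF, then push the uniforms through the inverse CDFs of any chosen marginals. The correlation value and the gamma/beta marginals below are illustrative choices, not taken from the review.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
rho = 0.7
cov = [[1.0, rho], [rho, 1.0]]

# 1) Gaussian copula: correlated normals -> dependent uniforms on [0, 1]
z = rng.multivariate_normal([0.0, 0.0], cov, size=100_000)
u = stats.norm.cdf(z)

# 2) Impose arbitrary marginals via their inverse CDFs (quantile functions)
x = stats.gamma(a=2.0).ppf(u[:, 0])        # gamma marginal
y = stats.beta(a=2.0, b=5.0).ppf(u[:, 1])  # beta marginal
```

The resulting pair (x, y) has the prescribed gamma and beta marginals while inheriting its dependence structure from the copula, which is exactly the separation of marginals from dependence that makes meta-distributions flexible.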
