Search Results (76)

Search Parameters:
Keywords = Empirical Cumulative Distribution Function

25 pages, 5011 KB  
Article
New Insights into Meteorological and Hydrological Drought Modeling: A Comparative Analysis of Parametric and Non-Parametric Distributions
by Ahmad Abu Arra and Eyüp Şişman
Atmosphere 2025, 16(7), 846; https://doi.org/10.3390/atmos16070846 - 11 Jul 2025
Cited by 4 | Viewed by 443
Abstract
Accurate drought monitoring depends on selecting an appropriate cumulative distribution function (CDF) to model the original data, from which standardized drought indices are derived. In many research studies, rigorous validation is not performed: model assumptions and uncertainties in identifying theoretical drought CDF models go unscrutinized, and such oversights lead to biased representations of drought evaluation and characteristics. This research compares parametric theoretical and empirical CDFs for a comprehensive evaluation of standardized drought indices. Additionally, it examines the advantages, disadvantages, and limitations of both empirical and theoretical distribution functions in drought assessment. Three drought indices, the Standardized Precipitation Index (SPI), the Streamflow Drought Index (SDI), and the Standardized Precipitation Evapotranspiration Index (SPEI), cover meteorological and hydrological droughts. The assessment spans diverse applications, covering different climates and regions: Durham, United Kingdom (SPEI, 1868–2021); Konya, Türkiye (SPI, 1964–2022); and Lüleburgaz, Türkiye (SDI, 1957–2015). The findings reveal notable discrepancies between theoretical and empirical CDFs, particularly in long-term hydrological drought assessments, where underestimations reached up to 50%, posing risks of misinformed conclusions that may affect critical drought-related decisions and policymaking. The Root Mean Squared Error (RMSE) for SPI3 between the empirical and best-fitted CDF was 0.087, and between the empirical and Gamma CDF it was 0.152; for SDI, it ranged between 0.09 and 0.143. The Mean Absolute Error (MAE) for SPEI was approximately 0.05 for all timescales. The study concludes that empirical CDFs provide more reliable and conservative drought assessments and are free from the constraints of model assumptions. Regarding drought characteristics, both approaches gave approximately the same drought durations but different intensities. Because drought events are complex processes with differing definitions, each event must be studied separately, considering its effects on different sectors.
(This article belongs to the Special Issue Drought Monitoring, Prediction and Impacts (2nd Edition))
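As a rough illustration of the comparison made in this paper, the sketch below (Python, with an assumed gamma parameterization and Weibull plotting positions, not the authors' exact procedure) computes a standardized index both through a fitted gamma CDF and through the ECDF on synthetic precipitation totals, then reports the RMSE between the two.

```python
import numpy as np
from scipy import stats

def spi_from_gamma(precip):
    """SPI via a fitted gamma CDF: fit a gamma to accumulated precipitation,
    map values through the CDF, then through the inverse standard-normal CDF."""
    shape, loc, scale = stats.gamma.fit(precip, floc=0)
    cdf = stats.gamma.cdf(precip, shape, loc=loc, scale=scale)
    return stats.norm.ppf(np.clip(cdf, 1e-6, 1 - 1e-6))

def spi_from_ecdf(precip):
    """SPI via the empirical CDF, using the Weibull plotting position
    i/(n+1) to keep probabilities strictly inside (0, 1)."""
    ranks = stats.rankdata(precip)          # ranks 1..n
    ecdf = ranks / (len(precip) + 1)
    return stats.norm.ppf(ecdf)

rng = np.random.default_rng(0)
precip = rng.gamma(shape=2.0, scale=30.0, size=600)   # synthetic 3-month totals
rmse = np.sqrt(np.mean((spi_from_gamma(precip) - spi_from_ecdf(precip)) ** 2))
print(f"RMSE between gamma-based and ECDF-based SPI: {rmse:.3f}")
```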

23 pages, 3151 KB  
Article
Should We Use Quantile-Mapping-Based Methods in a Climate Change Context? A “Perfect Model” Experiment
by Mathieu Vrac, Harilaos Loukos, Thomas Noël and Dimitri Defrance
Climate 2025, 13(7), 137; https://doi.org/10.3390/cli13070137 - 1 Jul 2025
Cited by 1 | Viewed by 2748
Abstract
This study assesses the use of Quantile-Mapping methods for bias correction and downscaling in climate change studies. A “Perfect Model Experiment” is conducted using high-resolution climate simulations as pseudo-references and coarser versions as biased data. The focus is on European daily temperature and precipitation under the RCP 8.5 scenario. Six methods are tested: an empirical Quantile-Mapping approach, the “Cumulative Distribution Function—transform” (CDF-t) method, and four CDF-t variants with different parameters. Their performance is evaluated based on univariate and multivariate properties over the calibration period (1981–2010) and a future period (2071–2100). The results show that while Quantile Mapping and CDF-t perform similarly during calibration, significant differences arise in future projections. Quantile Mapping exhibits biases in the means, standard deviations, and extremes, failing to capture the climate change signal. CDF-t and its variants show smaller biases, with one variant proving particularly robust. The choice of discretization parameter in CDF-t is crucial, as a low number of bins increases the biases. This study concludes that Quantile Mapping is not appropriate for adjustments in a climate change context, whereas CDF-t, especially a variant that stabilizes extremes, offers a more reliable alternative.
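A minimal empirical quantile-mapping sketch, on assumed synthetic data, is given below; note how values outside the calibration range are clamped toward the historical observed quantiles, which is exactly the kind of behavior that distorts the climate-change signal and motivates CDF-t.

```python
import numpy as np

def empirical_quantile_mapping(model_hist, obs_hist, model_future):
    """Basic empirical quantile mapping: find each future model value's
    quantile in the historical model distribution, then return the
    observed value at that same quantile."""
    quantiles = np.linspace(0.0, 1.0, 101)
    model_q = np.quantile(model_hist, quantiles)
    obs_q = np.quantile(obs_hist, quantiles)
    # Map future values through the model ECDF, then the observed quantiles.
    future_p = np.interp(model_future, model_q, quantiles)
    return np.interp(future_p, quantiles, obs_q)

rng = np.random.default_rng(1)
obs = rng.normal(15.0, 3.0, 1000)          # pseudo-observations (deg C)
model = rng.normal(13.0, 4.0, 1000)        # biased model, same period
future = rng.normal(16.0, 4.0, 1000)       # model under warming
corrected = empirical_quantile_mapping(model, obs, future)
print(f"raw future mean {future.mean():.2f}, corrected {corrected.mean():.2f}")
```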

18 pages, 803 KB  
Article
Gaussian Process with Vine Copula-Based Context Modeling for Contextual Multi-Armed Bandits
by Jong-Min Kim
Mathematics 2025, 13(13), 2058; https://doi.org/10.3390/math13132058 - 21 Jun 2025
Cited by 1 | Viewed by 517
Abstract
We propose a novel contextual multi-armed bandit (CMAB) framework that integrates copula-based context generation with Gaussian Process (GP) regression for reward modeling, addressing complex dependency structures and uncertainty in sequential decision-making. Context vectors are generated using Gaussian and vine copulas to capture nonlinear dependencies, while arm-specific reward functions are modeled via GP regression with Beta-distributed targets. We evaluate three widely used bandit policies—Thompson Sampling (TS), ε-Greedy, and Upper Confidence Bound (UCB)—on simulated environments informed by real-world datasets, including Boston Housing and Wine Quality. The Boston Housing dataset exemplifies heterogeneous decision boundaries relevant to housing-related marketing, while the Wine Quality dataset introduces sensory feature-based arm differentiation. Our empirical results indicate that the ε-Greedy policy consistently achieves the highest cumulative reward and lowest regret across multiple runs, outperforming both GP-based TS and UCB in high-dimensional, copula-structured contexts. These findings suggest that combining copula theory with GP modeling provides a robust and flexible foundation for data-driven sequential experimentation in domains characterized by complex contextual dependencies.
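The sketch below illustrates two ingredients of this framework in stripped-down form: Gaussian-copula context generation and an ε-Greedy arm-selection loop. The reward functions are hypothetical, and the GP regression layer used in the paper is omitted for brevity.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n_arms, n_rounds, dim = 3, 2000, 2

# Gaussian-copula context generation: correlated normals pushed through
# the standard normal CDF become dependent Uniform(0, 1) features.
corr = np.array([[1.0, 0.6], [0.6, 1.0]])
z = rng.multivariate_normal(np.zeros(dim), corr, size=n_rounds)
contexts = stats.norm.cdf(z)

# Hypothetical arm reward functions, unknown to the policy.
weights = rng.normal(size=(n_arms, dim))

def true_reward(arm, x):
    return 1.0 / (1.0 + np.exp(-weights[arm] @ x)) + rng.normal(0.0, 0.05)

eps = 0.1
totals, counts = np.zeros(n_arms), np.zeros(n_arms)
cumulative = 0.0
for t in range(n_rounds):
    x = contexts[t]
    if counts.min() == 0 or rng.random() < eps:
        arm = int(rng.integers(n_arms))          # explore
    else:
        arm = int(np.argmax(totals / counts))    # exploit empirical means
    r = true_reward(arm, x)
    totals[arm] += r
    counts[arm] += 1
    cumulative += r
print(f"cumulative reward over {n_rounds} rounds: {cumulative:.1f}")
```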

16 pages, 8067 KB  
Article
Asymmetry in Distributions of Accumulated Gains and Losses in Stock Returns
by Hamed Farahani and Rostislav A. Serota
Economies 2025, 13(6), 176; https://doi.org/10.3390/economies13060176 - 17 Jun 2025
Viewed by 476
Abstract
We studied decades-long (1980 to 2024) historic distributions of accumulated S&P500 returns, from daily returns to those over several weeks. The time series of the returns emphasize major upheavals in the markets—Black Monday, Tech Bubble, Financial Crisis, and the COVID pandemic—which are reflected in the tail ends of the distributions. De-trending the overall gain, we concentrated on comparing distributions of gains and losses. Specifically, we compared the tails of the distributions, which are believed to exhibit a power-law behavior and possibly contain outliers. To this end, we determined confidence intervals of the linear fits of the tails of the complementary cumulative distribution functions on a log–log scale and conducted a statistical U-test in order to detect outliers. We also studied probability density functions of the full distributions of the returns with an emphasis on their asymmetry. The key empirical observations are that the mean of de-trended distributions increases near-linearly with the number of days of accumulation while the overall skew is negative—consistent with the heavier tails of losses—and depends little on the number of days of accumulation. At the same time, the variance of the distributions exhibits near-perfect linear dependence on the number of days of accumulation; that is, it remains constant if scaled to the latter. Finally, we discuss the theoretical framework for understanding accumulated returns. Our main conclusion is that the current state of theory, which predicts symmetric or near-symmetric distributions of returns, cannot explain the aggregate of empirical results.
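A hedged sketch of the tail-fitting step, on synthetic Pareto losses rather than S&P500 data: the complementary CDF is formed from sorted samples, and a line is fitted to its upper tail on a log-log scale.

```python
import numpy as np

rng = np.random.default_rng(3)
# Heavy-tailed synthetic "losses" with a Pareto tail, exponent 3.
losses = rng.pareto(3.0, 5000) + 1.0

# Complementary CDF: estimate of P(X > x) at each sorted sample point,
# using n+1 in the denominator so the largest point stays above zero.
x = np.sort(losses)
ccdf = 1.0 - np.arange(1, len(x) + 1) / (len(x) + 1)

# Fit a line to the upper tail (top 5%) on a log-log scale;
# the negative slope estimates the power-law exponent.
tail = x > np.quantile(x, 0.95)
slope, intercept = np.polyfit(np.log(x[tail]), np.log(ccdf[tail]), 1)
print(f"estimated tail exponent: {-slope:.2f} (true value 3)")
```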

27 pages, 1182 KB  
Article
The New Gompertz Distribution Model and Applications
by Ayşe Metin Karakaş and Fatma Bulut
Symmetry 2025, 17(6), 843; https://doi.org/10.3390/sym17060843 - 28 May 2025
Viewed by 974
Abstract
The Gompertz distribution has long been a cornerstone for analyzing growth processes and mortality patterns across various scientific disciplines. However, as the intricacies of real-world phenomena evolve, there is a pressing need for more versatile probability distributions that can accurately capture a wide array of data characteristics. In response to this demand, we introduce the Marshall–Olkin Power Gompertz (MOPG) distribution, an innovative and powerful extension of the traditional Gompertz model. The MOPG distribution is crafted by enhancing the Power Gompertz cumulative distribution function through the Marshall–Olkin transformation. This distribution yields two pivotal contributions: a power parameter (c) that significantly increases the model's adaptability to diverse data patterns and the Marshall–Olkin transformation, which modifies tail behavior to enhance predictive accuracy. Furthermore, we derive the distribution's essential statistical properties and evaluate its performance through extensive Monte Carlo simulations, along with a maximum likelihood estimation of model parameters. Our empirical validation on three real-world data sets demonstrates that the MOPG distribution surpasses several well-established lifetime distributions in flexibility and tail-behavior characterization. Across the competing models considered, the MOPG delivers the most precise fit to the data.
(This article belongs to the Section Mathematics)
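The sketch below shows one common form of the Marshall–Olkin transformation applied to an assumed power Gompertz CDF; the paper's exact parameterization may differ.

```python
import numpy as np

def power_gompertz_cdf(x, b, eta, c):
    """Assumed power Gompertz CDF: the classical Gompertz CDF
    1 - exp(-eta * (e^{b t} - 1)) evaluated at t = x**c."""
    return 1.0 - np.exp(-eta * (np.exp(b * x**c) - 1.0))

def mopg_cdf(x, b, eta, c, alpha):
    """Marshall-Olkin transform of a baseline CDF F:
    G(x) = F(x) / (alpha + (1 - alpha) * F(x)), one common form;
    alpha > 0 is the Marshall-Olkin tilt parameter."""
    F = power_gompertz_cdf(x, b, eta, c)
    return F / (alpha + (1.0 - alpha) * F)

x = np.linspace(0.01, 3.0, 5)
print(mopg_cdf(x, b=1.0, eta=0.5, c=1.5, alpha=2.0))
```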

28 pages, 1606 KB  
Article
Modelling Value-at-Risk and Expected Shortfall for a Small Capital Market: Do Fractionally Integrated Models and Regime Shifts Matter?
by Wafa Souffargi and Adel Boubaker
J. Risk Financial Manag. 2025, 18(4), 203; https://doi.org/10.3390/jrfm18040203 - 9 Apr 2025
Viewed by 1124
Abstract
In this study, we examine the relevance of the coexistence of structural change and long memory for modeling and forecasting the volatility of Tunisian stock returns and for delivering a more accurate measure of risk along the lines of VaR and expected shortfall. To this end, we propose three time-series models that incorporate long-term dependence in the level and volatility of returns. In addition, we introduce structural change points, synonymous with stock market turbulence, into the conditional variance equations of the models studied, using the iterated cumulative sums of squares (ICSS) and modified ICSS algorithms. To account for excess kurtosis, we choose a Student's t conditional innovation density rather than the normal distribution. The empirical results show that the inclusion of structural breakpoints in the conditional variance equation and Dual LM provides better short- and long-term predictability. Within such a framework, the ICSS-ARFIMA-HYGARCH model with Student's t distribution was able to account for the long-term dependence in the level and volatility of TUNINDEX index returns, excess kurtosis, and structural changes, delivering an accurate estimator of VaR and expected shortfall.
(This article belongs to the Special Issue Machine Learning-Based Risk Management in Finance and Insurance)
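For orientation only, the sketch below computes a static, unconditional Student-t VaR and expected shortfall on synthetic returns; the paper's ICSS-ARFIMA-HYGARCH framework adds structural breaks and conditional long-memory volatility on top of this basic idea.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
returns = stats.t.rvs(df=4, scale=0.01, size=2500, random_state=rng)

# Fit a Student-t to daily returns and read off the 1% VaR;
# expected shortfall is the mean loss beyond the VaR.
df, loc, scale = stats.t.fit(returns)
alpha = 0.01
var = -stats.t.ppf(alpha, df, loc=loc, scale=scale)
tail = returns[returns <= -var]
es = -tail.mean()
print(f"1% VaR: {var:.4f}, expected shortfall: {es:.4f}")
```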

31 pages, 10483 KB  
Article
Optimal Coordination of Directional Overcurrent Relays Using an Innovative Fractional-Order Derivative War Algorithm
by Bakht Muhammad Khan, Abdul Wadood, Herie Park, Shahbaz Khan and Husan Ali
Fractal Fract. 2025, 9(3), 169; https://doi.org/10.3390/fractalfract9030169 - 11 Mar 2025
Cited by 1 | Viewed by 1236
Abstract
Efficient coordination of directional overcurrent relays (DOCRs) is vital for maintaining the stability and reliability of electrical power systems (EPSs). The task of optimizing DOCR coordination in complex power networks is modeled as an optimization problem. This study aims to enhance the performance of protection systems by minimizing the cumulative operating time of DOCRs. This is achieved by effectively synchronizing primary and backup relays while ensuring that coordination time intervals (CTIs) remain within predefined limits (0.2 to 0.5 s). A novel optimization strategy, the fractional-order derivative war optimizer (FODWO), is proposed to address this challenge. This approach integrates the principles of fractional calculus (FC) into the conventional war optimization (WO) algorithm, significantly improving its optimization properties. The incorporation of fractional-order derivatives (FODs) enhances the algorithm's ability to navigate complex optimization landscapes, avoiding local minima and reaching globally optimal solutions more efficiently. This reduces the cumulative operating time of DOCRs and improves the reliability of the protection system. The FODWO method was rigorously tested on standard EPSs, including the IEEE three-, eight-, and fifteen-bus systems, as well as on eleven benchmark optimization functions encompassing unimodal and multimodal problems. The comparative analysis demonstrates that incorporating FODs into WO enhances its efficiency, enabling it to achieve globally optimal solutions and reduce the cumulative operating time of DOCRs by 3%, 6%, and 3% for the three-, eight-, and fifteen-bus systems, respectively, compared to the traditional WO algorithm. To validate the effectiveness of FODWO, comprehensive statistical analyses were conducted, including box plots, quantile–quantile (QQ) plots, the empirical cumulative distribution function (ECDF), and the evolution of minimal fitness across simulations. These analyses confirm the robustness, reliability, and consistency of the FODWO approach. Comparative evaluations reveal that FODWO outperforms other state-of-the-art nature-inspired algorithms and traditional optimization methods, making it a highly effective tool for DOCR coordination in EPSs.
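One of the statistical diagnostics named above, the ECDF of final objective values across independent runs, can be reproduced in a few lines; the data here are synthetic stand-ins, not the paper's results.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(5)
# Hypothetical final objective values from 50 independent runs
# of a baseline optimizer and an improved variant.
fitness_wo = rng.normal(5.2, 0.40, 50)
fitness_fodwo = rng.normal(4.9, 0.25, 50)

def ecdf(sample):
    """Sorted sample values paired with cumulative proportions i/n."""
    x = np.sort(sample)
    return x, np.arange(1, len(x) + 1) / len(x)

for label, sample in [("baseline", fitness_wo), ("variant", fitness_fodwo)]:
    x, p = ecdf(sample)
    plt.step(x, p, where="post", label=label)
plt.xlabel("final objective value")
plt.ylabel("ECDF")
plt.legend()
plt.show()
```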

16 pages, 5014 KB  
Article
Innovative Long-Term Exploration of the Impact of an Ecological Core Constructed Wetland System on Tailwater Treatment: A Statistical Analysis Incorporating Temperature Factors
by Yanchen Li, Chaozhou Mou and Qigui Niu
Water 2025, 17(5), 667; https://doi.org/10.3390/w17050667 - 25 Feb 2025
Cited by 1 | Viewed by 794
Abstract
The purpose of this article is to evaluate the tailwater treatment capacity of an emerging constructed wetland using statistical methods. For this purpose, the quality of the influent and effluent from the Fangshangou River constructed wetland was monitored and tested for a period of 3 years. The ecological wetland covers a total area of about 59 acres and has been in official use since 2021. Since its operation began, the wetland has maintained good long-term stability in tailwater treatment, pollutant removal, and other respects. Statistically, we used empirical cumulative distribution functions (CDFs), removal rates, and Spearman correlation tests to demonstrate the stable operational efficiency and high purification capacity of this constructed wetland. The high-efficiency ecological core wetland and the surface flow wetland are the main components of the constructed wetland. The average removal rates of chemical oxygen demand (COD), ammonium (NH4+–N), total nitrogen (TN), and total phosphorus (TP) were 36.34%, 57.61%, 48.49%, and 71.47%, respectively. The analysis indicates that temperature affects the tailwater treatment capacity of constructed wetlands to a certain extent. The results of this study provide an important basis for studying the purification capacity of the constructed wetland.
(This article belongs to the Section Wastewater Treatment and Reuse)
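A small sketch of the statistical workflow described, on hypothetical influent/effluent data: removal rates are computed pairwise, and a Spearman test checks their association with temperature.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
# Hypothetical paired influent COD samples (mg/L) and water
# temperature (deg C) over a monitoring period.
influent = rng.normal(60.0, 10.0, 150)
temperature = rng.uniform(5.0, 30.0, 150)
# Removal improves with temperature in this synthetic example.
removal = 0.25 + 0.01 * temperature + rng.normal(0, 0.03, 150)
effluent = influent * (1.0 - removal)

rate = 100.0 * (influent - effluent) / influent
rho, pval = stats.spearmanr(temperature, rate)
print(f"mean removal rate: {rate.mean():.1f}%")
print(f"Spearman rho(temperature, removal) = {rho:.2f} (p = {pval:.1e})")
```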

24 pages, 7152 KB  
Article
Benchmarking Uninitialized CMIP6 Simulations for Inter-Annual Surface Wind Predictions
by Joan Saladich Cubero, María Carmen Llasat and Raül Marcos Matamoros
Atmosphere 2025, 16(3), 254; https://doi.org/10.3390/atmos16030254 - 23 Feb 2025
Viewed by 1261
Abstract
This study investigates the potential of uninitialized global climate projections for providing 12-month (inter-annual) wind forecasts in Europe in light of the increasing demand for long-term climate predictions. This is important in a context where models based on the past climate may not fully account for the implications of current warming trends for climate variability, and where initialized 12-month forecasts are still not widely available (i.e., seasonal forecasts) and/or consolidated (i.e., decadal predictions). To this aim, we use two types of simulations: uninitialized climate projections from CMIP6 (Coupled Model Intercomparison Project Phase 6) and initialized 6-month seasonal forecasts (ECMWF's SEAS5), using the latter as a benchmark. All the predictions are bias-corrected with five distinct approaches (among them quantile delta mapping, empirical quantile mapping, scaling bias-adjustment, and a proprietary quantile mapping) and verified against weather observations from the ECA&D E-OBS project (684 weather stations across Europe). The quantile-mapping techniques outperform the scaling bias-correction algorithm in adjusting the cumulative distribution function (CDF) to the reference weather stations and in reducing the mean bias error closer to zero. However, simple bias correction by scaling improves the time-series predictive accuracy (root mean square error, anomaly correlation coefficient, and mean absolute scaled error) of CMIP6 simulations over quantile-mapping bias corrections. Thus, the results suggest that CMIP6 projections may provide a valuable preliminary framework for understanding wind climate variations over the ensuing 12-month period. Finally, while baseline methods like climatology could still outperform the presented methods in terms of time-series accuracy (i.e., root mean square error), our approach highlights a key advantage: climatology is static, whereas CMIP6 offers a dynamic, evolving view of climatology. The combination of dynamism and bias correction makes CMIP6 projections a valuable starting point for understanding wind climate variations over the next 12 months. Furthermore, workload schedulers within high-performance computing frameworks are essential for effectively handling these complex and ever-evolving datasets, highlighting the critical role of advanced computational methods in fully realizing the potential of CMIP6 for climate analysis.
(This article belongs to the Special Issue High-Performance Computing for Atmospheric Modeling)
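The verification metrics used above are straightforward to compute; the sketch below, on synthetic wind-speed data, shows mean bias error, RMSE, and an anomaly correlation coefficient taken against the observed climatological mean (one common convention among several).

```python
import numpy as np

def verification_metrics(forecast, observed):
    """Point-forecast verification: mean bias error, RMSE, and the
    anomaly correlation coefficient (correlation of departures from
    the observed climatological mean)."""
    mbe = np.mean(forecast - observed)
    rmse = np.sqrt(np.mean((forecast - observed) ** 2))
    clim = observed.mean()
    acc = np.corrcoef(forecast - clim, observed - clim)[0, 1]
    return mbe, rmse, acc

rng = np.random.default_rng(7)
obs = rng.normal(6.0, 2.0, 365)         # daily mean wind speed (m/s)
fc = obs + rng.normal(0.3, 1.5, 365)    # biased, noisy 12-month forecast
mbe, rmse, acc = verification_metrics(fc, obs)
print(f"MBE {mbe:.2f}  RMSE {rmse:.2f}  ACC {acc:.2f}")
```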

23 pages, 9435 KB  
Article
Autonomous Quality Control of High Spatiotemporal Resolution Automatic Weather Station Precipitation Data
by Hongxiang Ouyang, Zhengkun Qin, Xingsheng Xu, Yuan Xu, Jiang Huangfu, Xiaomin Li, Jiahui Hu, Zixuan Zhan and Junjie Yu
Remote Sens. 2025, 17(3), 404; https://doi.org/10.3390/rs17030404 - 24 Jan 2025
Viewed by 894
Abstract
The localized and sudden character of precipitation is the most formidable challenge in the quality control (QC) of precipitation observations. However, with sufficiently high spatiotemporal resolution in observational data, nuanced information can help to distinguish accurately between intense, localized precipitation events and anomalous data. China has deployed over 70,000 automatic weather stations (AWSs) that provide high spatiotemporal resolution surface meteorological observations. This study developed a new QC method for precipitation data based on the high spatiotemporal resolution of observations from surface AWSs in China. The proposed QC algorithm uses the cumulative average method to standardize the probability distribution characteristics of the precipitation data and then uses empirical orthogonal function (EOF) decomposition to identify the small-scale spatial structure of the data. Leveraging the spatial correlation of precipitation, partitioned EOF detection with 0.5° spatial coverage effectively minimizes the influence of local precipitation on quality control. Analysis of the precipitation probability distribution reveals that reconstruction from the first three EOF modes accurately captures the organized structural features of precipitation within the detection area. Because the residuals are essentially random, an observation whose residual exceeds 2.5 times the standard deviation of all residuals in the region can be judged erroneous. Although the quality control primarily targets accumulated precipitation, the randomness of erroneous data means that 84 continuous instances of error data in accumulated precipitation can be traced back to erroneous hourly precipitation observations. This ultimately enables the QC of hourly precipitation data from surface AWSs. Analysis of the QC of precipitation data from 2530 AWSs in Jiangxi Province (China) revealed that the new method can effectively identify incorrect precipitation data under conditions of extreme weather and complex terrain, with an average rejection rate of about 5%. The EOF-based QC method can accurately detect strong precipitation events resulting from small-scale weather disturbances, thereby preventing local heavy rainfall from being incorrectly classified as erroneous data. Comparison with the quality control results of the Tianqing System, an operational QC system of the China Meteorological Administration, revealed that the proposed method has advantages in handling extreme and scattered outliers, and that the quality-controlled precipitation observations exhibit enhanced similarity to the CMAPS merged precipitation data. The new approach raises the average spatial correlation coefficient between the two datasets by 0.01 and reduces the root mean square error by 1 mm.
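A compact sketch of the EOF-based detection step, under assumed synthetic data: the anomaly field is decomposed by SVD, reconstructed from the first three modes, and observations whose residuals exceed 2.5 standard deviations are flagged.

```python
import numpy as np

rng = np.random.default_rng(8)
n_stations, n_times = 40, 200
# Synthetic accumulated precipitation: one shared regional mode plus
# noise, with a gross error injected at one (time, station) cell.
mode = np.outer(np.sin(np.linspace(0, 6, n_times)), rng.random(n_stations))
field = 10.0 + 5.0 * mode + rng.normal(0, 0.5, (n_times, n_stations))
field[50, 3] += 25.0   # injected erroneous observation

# EOF decomposition via SVD of the anomaly matrix; reconstruct the
# organized structure from the first three modes.
anomaly = field - field.mean(axis=0)
U, s, Vt = np.linalg.svd(anomaly, full_matrices=False)
recon = U[:, :3] * s[:3] @ Vt[:3]

# Flag residuals larger than 2.5 standard deviations of all residuals.
residual = anomaly - recon
flags = np.abs(residual) > 2.5 * residual.std()
print("flagged (time, station) pairs:", np.argwhere(flags)[:5])
```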

21 pages, 4725 KB  
Article
Benchmarking Measures for the Adaptation of New Irrigation Solutions for Small Farms in Egypt
by Abousrie A. Farag and Juan Gabriel Pérez-Pérez
Water 2025, 17(2), 137; https://doi.org/10.3390/w17020137 - 7 Jan 2025
Cited by 1 | Viewed by 1077
Abstract
The aim of this study is to construct and validate an expert system to predict the adaptation of irrigation technologies, water-saving strategies, and monitoring tools by small-scale farmers in Egypt. The research investigates the impact of economic, educational, environmental, and social factors on adaptation rates. To build the expert system, extensive knowledge was collected from experts, key concepts were identified, and production rules were created to generate tailored scenarios. These scenarios utilize the empirical cumulative distribution function (ECDF), selecting the scenario with the highest ECDF as the optimal irrigation technology. This approach ensures well-informed, data-driven decisions tailored to specific conditions. The expert system was evaluated under the conditions of ten small farms in Egypt. The results indicate that water cost and availability are significant drivers of technology adaptation. Specifically, subsurface drip irrigation (SDI) demonstrated an adaptation percentage of 75% at high water costs, with probabilities of 0.67 and 0.33, while soil mulching (SM) showed a 75% adaptation rate with a probability of 0.33 in high-cost scenarios. Conversely, when water availability was high, the adaptation percentage for all techniques was zero, but it reached 100% adaptation with a probability of 0.76 for SM and SDI and a probability of 1 for a variable number of drippers (VND) and the use of sensors as monitoring tools during water shortages. Educational attainment and professional networks enhance the adaptation of advanced technologies and monitoring tools, emphasizing the role of knowledge and community engagement. Environmental conditions, including soil texture and salinity levels, directly affect the choice of irrigation methods and water-saving practices, highlighting the need for localized solutions. The source of irrigation water, whether groundwater or surface water, influences the preference for water-saving technologies. The study underscores the importance of tailored approaches to address the challenges and opportunities faced by small farmers in Egypt, promoting sustainable agriculture and efficient water management. The evaluation findings reveal that SDI is the most favored irrigation technology, with a probability of 0.55, followed by VND at 0.38 and ultralow drip irrigation (ULDI) at 0.07 across various scenarios for small farmers. Regulated deficit irrigation (RDI) and SM are equally preferred water-saving strategies, each with a probability of 0.50. Sensors emerged as the preferred monitoring tool, with a high probability of 0.94. The analysis reveals the critical roles of economic pressures, educational levels, environmental conditions, and social networks in shaping the adaptation of sustainable agricultural practices.
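The ECDF-based scenario selection can be sketched in a few lines; the scores below are hypothetical placeholders, not values from the study.

```python
import numpy as np

# Hypothetical expert-system step: each candidate technology receives a
# score from the production rules; a technology's ECDF value at its own
# score, taken against the pooled scores, ranks the scenarios.
scores = {"SDI": 0.78, "VND": 0.64, "ULDI": 0.31, "SM": 0.55}
pooled = np.array(list(scores.values()))

def ecdf_value(x, sample):
    """Proportion of pooled scores less than or equal to x."""
    return np.mean(sample <= x)

ranked = sorted(scores, key=lambda k: ecdf_value(scores[k], pooled), reverse=True)
print("preferred technology:", ranked[0])
```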

22 pages, 1347 KB  
Article
Semi-Empirical Approach to Evaluating Model Fit for Sea Clutter Returns: Focusing on Future Measurements in the Adriatic Sea
by Bojan Vondra
Entropy 2024, 26(12), 1069; https://doi.org/10.3390/e26121069 - 9 Dec 2024
Cited by 1 | Viewed by 928
Abstract
A method for evaluating Kullback–Leibler (KL) divergence and Squared Hellinger (SH) distance between empirical data and a model distribution is proposed. This method exclusively utilises the empirical Cumulative Distribution Function (CDF) of the data and the CDF of the model, avoiding data processing such as histogram binning. The proposed method converges almost surely, with the proof based on the use of exponentially distributed waiting times. An example demonstrates convergence of the KL divergence and SH distance to their true values when utilising the Generalised Pareto (GP) distribution as empirical data and the K distribution as the model. Another example illustrates the goodness of fit of these (GP and K-distribution) models to real sea clutter data from the widely used Intelligent PIxel processing X-band (IPIX) measurements. The proposed method can be applied to assess the goodness of fit of various models (not limited to GP or K distribution) to clutter measurement data such as those from the Adriatic Sea. Distinctive features of this small and immature sea, like the presence of over 1300 islands that affect local wind and wave patterns, are likely to result in an amplitude distribution of sea clutter returns that differs from predictions of models designed for oceans or open seas. However, to the author's knowledge, no data on this specific topic are currently available in the open literature, and such measurements have yet to be conducted.
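For contrast with the paper's CDF-only estimator, the sketch below computes KL divergence and squared Hellinger distance by direct numerical integration of densities; a gamma density stands in for the K distribution, which SciPy does not provide.

```python
import numpy as np
from scipy import stats
from scipy.integrate import trapezoid

# Two absolutely continuous models on a common grid: a Generalised
# Pareto density as the "data" model and a gamma as a stand-in
# for the K-distribution model.
x = np.linspace(1e-4, 20, 20000)
p = stats.genpareto.pdf(x, c=0.3, scale=1.0)
q = stats.gamma.pdf(x, a=1.5, scale=1.2)

# KL(p || q) over the region where both densities are positive.
mask = (p > 0) & (q > 0)
kl = trapezoid(p[mask] * np.log(p[mask] / q[mask]), x[mask])
# Squared Hellinger distance: 0.5 * integral of (sqrt(p) - sqrt(q))^2.
sh = 0.5 * trapezoid((np.sqrt(p) - np.sqrt(q)) ** 2, x)
print(f"KL = {kl:.3f}, squared Hellinger = {sh:.3f}")
```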

22 pages, 3167 KB  
Article
Gumbel–Logistic Unit Distribution with Application in Telecommunications Data Modeling
by Vladica S. Stojanović, Mihailo Jovanović, Brankica Pažun, Zlatko Langović and Željko Grujčić
Symmetry 2024, 16(11), 1513; https://doi.org/10.3390/sym16111513 - 11 Nov 2024
Cited by 3 | Viewed by 1144
Abstract
The manuscript deals with a new unit distribution that depends on two positive parameters. The distribution is obtained by transforming the Gumbel distribution into the unit interval via a generalized logistic mapping. In this way, the so-called Gumbel-logistic unit (abbr. GLU) distribution is obtained, and its key properties, such as the cumulative distribution function, modality, hazard and quantile functions, moment-based characteristics, Bayesian inferences, and entropy, are investigated in detail. Among other results, it is shown that the GLU distribution, unlike the Gumbel distribution, which is always positively asymmetric, can take either asymmetric form. Quantile-based estimation of the GLU parameters is also performed, together with the asymptotic properties of the resulting estimates and their numerical simulation. Finally, the GLU distribution is applied to modeling the empirical distributions of some real-world data related to telecommunications.
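If the mapping were the standard logistic function, the GLU CDF would follow by composing the Gumbel CDF with the logit, as sketched below; the paper's generalized logistic mapping adds shape flexibility beyond this simplified stand-in.

```python
import numpy as np
from scipy import stats

def glu_cdf(y, mu, beta):
    """CDF of Y = 1 / (1 + e^{-X}) with X ~ Gumbel(mu, beta):
    since the logistic map is increasing, F_Y(y) = F_Gumbel(logit(y)).
    A simplified stand-in for the paper's generalized mapping."""
    logit = np.log(y / (1.0 - y))
    return stats.gumbel_r.cdf(logit, loc=mu, scale=beta)

y = np.linspace(0.01, 0.99, 5)
print(glu_cdf(y, mu=0.0, beta=1.0))
```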

17 pages, 5072 KB  
Article
Image Feature Extraction Using Symbolic Data of Cumulative Distribution Functions
by Sri Winarni, Sapto Wahyu Indratno, Restu Arisanti and Resa Septiani Pontoh
Mathematics 2024, 12(13), 2089; https://doi.org/10.3390/math12132089 - 3 Jul 2024
Cited by 2 | Viewed by 1360
Abstract
Symbolic data analysis is an emerging field in statistics with great potential to become a standard inferential technique. This research introduces a new approach to image feature extraction using the empirical cumulative distribution function (ECDF) and the distribution function of distribution values (DFDV) as symbolic data. The main objective is to reduce the dimension of huge pixel data by organizing them into more coherent pixel-intensity distributions. We propose a partitioning method with different breakpoints to capture pixel intensity variations effectively. This results in an ECDF representing the proportion of pixel intensities and a DFDV representing the probability distribution at specific points. The novelty of this approach lies in using the ECDF and DFDV as symbolic features, thus summarizing the data and providing a more informative representation of the pixel value distribution, facilitating image classification analysis based on intensity distribution. The experimental results underscore the potential of this method in distinguishing image characteristics among existing image classes. Features extracted this way promise more informative image representations for classification analysis. In addition, theoretical insights into the properties of DFDV distribution functions are gained.
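A minimal version of the ECDF feature extraction: evaluate the empirical CDF of pixel intensities at a set of breakpoints and use those proportions as the feature vector. Breakpoint placement here is an arbitrary choice, not the paper's partitioning scheme.

```python
import numpy as np

def ecdf_features(image, breakpoints):
    """ECDF-based feature vector: the proportion of pixels with
    intensity <= each breakpoint."""
    pixels = np.asarray(image, dtype=float).ravel()
    return np.array([(pixels <= b).mean() for b in breakpoints])

rng = np.random.default_rng(9)
img = rng.integers(0, 256, (64, 64))       # stand-in grayscale image
breaks = np.linspace(0, 255, 9)[1:-1]      # 7 interior breakpoints
print(ecdf_features(img, breaks).round(3))
```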

18 pages, 6817 KB  
Article
Investigation of Variability of Flaw Strength Distributions on Brittle SiC Ceramic
by Jacques Lamon
Ceramics 2024, 7(2), 759-776; https://doi.org/10.3390/ceramics7020050 - 4 Jun 2024
Cited by 1 | Viewed by 1581
Abstract
The present paper investigates flaw strength distributions established using various flexural tests on batches of SiC bar test specimens, namely four-point bending and three-point bending tests with different span lengths. Flaw strength is given by the elemental stress operating on the critical flaw at the fracture of a test specimen. Fracture-inducing flaws and their locations are identified using fractography. A single population of pores was found to dominate the fracture. Diagrams of p-quantile vs. elemental strength were constructed to assess the Gaussian nature of the flaw strengths. Empirical cumulative distributions of strengths were then constructed using the normal distribution function, and the Weibull distributions of strengths are compared to these normal reference distributions. The parameters of the Weibull cumulative probability distributions are estimated using maximum likelihood and moment methods. The cumulative distributions of flexural strengths for the different bending tests are predicted from the flaw strength density function using the elemental strength model, and from the cumulative distribution of flexural strength using the Weibull function. Flaw strength distributions that include the weaker flaws potentially present in larger test pieces are extrapolated using the p-quantile diagrams. Implications for failure prediction are discussed regarding the pertinence of an intrinsically representative flaw strength distribution. Finally, the influence of the characteristics of fracture-inducing flaw populations, expressed in terms of flaw strength interval, size, dispersion, heterogeneity, and reproducibility with volume change, is examined.
(This article belongs to the Special Issue Advances in Ceramics, 2nd Edition)
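A hedged sketch of the Weibull side of this analysis, on synthetic strengths: maximum-likelihood estimation of the Weibull modulus, followed by a comparison of the fitted CDF against median-rank empirical probabilities.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(10)
# Synthetic flexural strengths (MPa) standing in for a batch of
# SiC bend-test specimens.
strengths = stats.weibull_min.rvs(c=8.0, scale=400.0, size=30, random_state=rng)

# Maximum-likelihood two-parameter Weibull fit (location fixed at 0).
m, loc, s0 = stats.weibull_min.fit(strengths, floc=0)
print(f"Weibull modulus m = {m:.1f}, scale = {s0:.0f} MPa")

# Compare the fitted CDF against empirical median-rank probabilities
# (Bernard's approximation, common in fractography work).
x = np.sort(strengths)
ecdf = (np.arange(1, len(x) + 1) - 0.3) / (len(x) + 0.4)
fitted = stats.weibull_min.cdf(x, m, loc=0, scale=s0)
print(f"max |ECDF - fitted CDF| = {np.abs(ecdf - fitted).max():.3f}")
```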
