Search Results (30)

Search Parameters:
Keywords = empirical CDF

32 pages, 907 KiB  
Article
A New Exponentiated Power Distribution for Modeling Censored Data with Applications to Clinical and Reliability Studies
by Kenechukwu F. Aforka, H. E. Semary, Sidney I. Onyeagu, Harrison O. Etaga, Okechukwu J. Obulezi and A. S. Al-Moisheer
Symmetry 2025, 17(7), 1153; https://doi.org/10.3390/sym17071153 - 18 Jul 2025
Viewed by 534
Abstract
This paper presents the exponentiated power Shanker (EPS) distribution, a new three-parameter extension of the standard Shanker distribution capable of capturing a wider class of data behaviors, including right-skewed and heavy-tailed phenomena. The structural properties of the distribution, namely complete and incomplete moments, entropy, and the moment generating function, are derived and examined formally. Maximum likelihood estimation (MLE) is used for parameter estimation, together with a Monte Carlo simulation study to assess estimator performance across varying sample sizes and parameter values. The EPS model is also generalized to a regression framework that incorporates covariate data, with estimation again via MLE. The practical utility and flexibility of the EPS distribution are demonstrated through two real examples: one on repair durations and another on HIV/AIDS mortality in Germany. Comparisons with existing distributions, namely the power Zeghdoudi, power Ishita, power Prakaamy, and logistic-Weibull, are made using goodness-of-fit statistics such as the log-likelihood, AIC, BIC, and the Kolmogorov–Smirnov statistic. Graphical diagnostics, including PP plots, QQ plots, TTT plots, and empirical CDFs, further confirm the high modeling capacity of the EPS distribution. The results confirm the goodness of fit and flexibility of the EPS model, making it a strong tool for reliability and biomedical modeling.
(This article belongs to the Section Mathematics)
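The goodness-of-fit workflow this abstract describes, comparing a fitted model against the empirical CDF via the Kolmogorov–Smirnov statistic, can be sketched generically. The snippet below is an illustration only, not the authors' code: it uses an exponential model as a stand-in for the EPS distribution, and the helper names (`ecdf`, `ks_statistic`), seed, and sample size are assumptions.

```python
import numpy as np

def ecdf(sample):
    """Return the sorted sample and the empirical CDF values F_n(x) at those points."""
    x = np.sort(sample)
    F = np.arange(1, len(x) + 1) / len(x)
    return x, F

def ks_statistic(sample, model_cdf):
    """Kolmogorov-Smirnov distance between the empirical CDF and a model CDF."""
    x = np.sort(sample)
    n = len(x)
    F = model_cdf(x)
    # sup |F_n - F|: check both the upper and lower step of the empirical CDF
    d_plus = np.max(np.arange(1, n + 1) / n - F)
    d_minus = np.max(F - np.arange(0, n) / n)
    return max(d_plus, d_minus)

rng = np.random.default_rng(0)
data = rng.exponential(scale=2.0, size=500)
# candidate model: exponential with its rate estimated by maximum likelihood (1/mean)
lam = 1.0 / data.mean()
d = ks_statistic(data, lambda x: 1.0 - np.exp(-lam * x))
print(round(d, 3))
```

A small KS distance (relative to the usual critical values, roughly 1.36/√n at the 5% level) indicates a good fit, which is the comparison criterion named in the abstract.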
25 pages, 5011 KiB  
Article
New Insights into Meteorological and Hydrological Drought Modeling: A Comparative Analysis of Parametric and Non-Parametric Distributions
by Ahmad Abu Arra and Eyüp Şişman
Atmosphere 2025, 16(7), 846; https://doi.org/10.3390/atmos16070846 - 11 Jul 2025
Viewed by 196
Abstract
Accurate drought monitoring depends on selecting an appropriate cumulative distribution function (CDF) to model the original data, from which the standardized drought indices are derived. In many research studies, rigorous validation has not been performed by scrutinizing the model assumptions and uncertainties involved in identifying theoretical drought CDF models; such oversights lead to biased representations of drought evaluation and characteristics. This research compares parametric theoretical and empirical CDFs for a comprehensive evaluation of standardized drought indices, and examines the advantages, disadvantages, and limitations of both empirical and theoretical distribution functions in drought assessment. Three drought indices, the Standardized Precipitation Index (SPI), the Streamflow Drought Index (SDI), and the Standardized Precipitation Evapotranspiration Index (SPEI), cover meteorological and hydrological droughts. The assessment spans diverse climates and regions: Durham, United Kingdom (SPEI, 1868–2021); Konya, Türkiye (SPI, 1964–2022); and Lüleburgaz, Türkiye (SDI, 1957–2015). The findings reveal notable discrepancies between theoretical and empirical CDFs, particularly in long-term hydrological drought assessments, where underestimations reached up to 50%, posing risks of misinformed conclusions that may affect critical drought-related decisions and policymaking. The Root Mean Squared Error (RMSE) for SPI3 between the empirical and best-fitted CDFs was 0.087, and between the empirical and Gamma CDFs it was 0.152; for SDI, it ranged between 0.09 and 0.143. The Mean Absolute Error (MAE) for SPEI was approximately 0.05 for all timescales. The study concludes that empirical CDFs provide more reliable and conservative drought assessments and are free from the constraints of model assumptions. Regarding drought characteristics, both approaches gave approximately the same drought durations with different intensities. Because drought events are complex processes with differing definitions, each drought event must be studied separately, considering its effects on different sectors.
(This article belongs to the Special Issue Drought Monitoring, Prediction and Impacts (2nd Edition))
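The core comparison in this abstract, a standardized index computed from an empirical CDF versus one from a fitted gamma CDF, can be sketched on synthetic data. This is a hedged illustration, not the study's code; the seed, sample size, gamma parameters, and use of the Weibull plotting position are assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
precip = rng.gamma(shape=2.0, scale=30.0, size=600)  # synthetic 3-month precipitation totals

# Empirical CDF via the Weibull plotting position i/(n+1) (stays strictly inside (0, 1))
F_emp = stats.rankdata(precip) / (len(precip) + 1)

# Theoretical CDF from a fitted gamma distribution (location fixed at zero)
a, loc, scale = stats.gamma.fit(precip, floc=0)
F_gam = stats.gamma.cdf(precip, a, loc=loc, scale=scale)

# A standardized index maps CDF probabilities through the inverse standard normal CDF
spi_emp = stats.norm.ppf(F_emp)
spi_gam = stats.norm.ppf(F_gam)

# Discrepancy between the two index series, as in the RMSE figures quoted above
rmse = np.sqrt(np.mean((spi_emp - spi_gam) ** 2))
print(round(rmse, 3))
```

With well-behaved synthetic gamma data the two series agree closely; the article's point is that for real long-term hydrological records the parametric choice can diverge much further from the empirical CDF.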

23 pages, 3151 KiB  
Article
Should We Use Quantile-Mapping-Based Methods in a Climate Change Context? A “Perfect Model” Experiment
by Mathieu Vrac, Harilaos Loukos, Thomas Noël and Dimitri Defrance
Climate 2025, 13(7), 137; https://doi.org/10.3390/cli13070137 - 1 Jul 2025
Viewed by 752
Abstract
This study assesses the use of Quantile-Mapping methods for bias correction and downscaling in climate change studies. A “Perfect Model Experiment” is conducted using high-resolution climate simulations as pseudo-references and coarser versions as biased data. The focus is on European daily temperature and precipitation under the RCP 8.5 scenario. Six methods are tested: an empirical Quantile-Mapping approach, the “Cumulative Distribution Function—transform” (CDF-t) method, and four CDF-t variants with different parameters. Their performance is evaluated on univariate and multivariate properties over the calibration period (1981–2010) and a future period (2071–2100). The results show that while Quantile Mapping and CDF-t perform similarly during calibration, significant differences arise in future projections. Quantile Mapping exhibits biases in the means, standard deviations, and extremes, failing to capture the climate change signal. CDF-t and its variants show smaller biases, with one variant proving particularly robust. The choice of discretization parameter in CDF-t is crucial, as a low number of bins increases the biases. The study concludes that Quantile Mapping is not appropriate for adjustments in a climate change context, whereas CDF-t, especially a variant that stabilizes extremes, offers a more reliable alternative.
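A minimal sketch of the empirical Quantile-Mapping baseline evaluated in this study (not the CDF-t method, and not the authors' implementation): each biased value is pushed through the biased calibration CDF and then through the inverse of the reference CDF. The variable names and the 100-quantile discretization are assumptions; as the abstract notes, the number of bins materially affects the result.

```python
import numpy as np

def quantile_map(values, ref_calib, biased_calib, n_quantiles=100):
    """Empirical quantile mapping: replace each value by the reference value
    at the same empirical quantile, estimated over the calibration period."""
    probs = np.linspace(0.005, 0.995, n_quantiles)
    q_biased = np.quantile(biased_calib, probs)
    q_ref = np.quantile(ref_calib, probs)
    # look up each value's quantile in the biased calibration CDF, then invert
    # the reference CDF at that quantile; values outside the range are clamped
    return np.interp(values, q_biased, q_ref)

rng = np.random.default_rng(2)
ref = rng.normal(15.0, 3.0, 3000)    # pseudo-reference daily temperatures
model = rng.normal(13.0, 4.0, 3000)  # biased model output, same calibration period
corrected = quantile_map(model, ref, model)
print(round(corrected.mean(), 1), round(corrected.std(), 1))
```

Over the calibration period the corrected series reproduces the reference mean and spread by construction; the article's finding is that this static mapping misbehaves when the future distribution shifts, which is what CDF-t is designed to handle.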

16 pages, 5014 KiB  
Article
Innovative Long-Term Exploration of the Impact of an Ecological Core Constructed Wetland System on Tailwater Treatment: A Statistical Analysis Incorporating Temperature Factors
by Yanchen Li, Chaozhou Mou and Qigui Niu
Water 2025, 17(5), 667; https://doi.org/10.3390/w17050667 - 25 Feb 2025
Cited by 1 | Viewed by 624
Abstract
The purpose of this article is to evaluate the tailwater treatment capacity of an emerging constructed wetland using statistical methods. To this end, the quality of the influent and effluent of the Fangshangou River constructed wetland was monitored and tested over a period of three years. The wetland plot has a total area of about 59 acres and has been officially in use since 2021. Since the start of its operation, the wetland has maintained good long-term stability in tailwater treatment, pollutant removal, and other aspects. Statistically, we used empirical cumulative distribution functions (CDFs), removal rates, and Spearman correlation tests to demonstrate the stable operational efficiency and high purification capacity of this constructed wetland. The high-efficiency ecological core wetland and the surface flow wetland are the main components of the constructed wetland. The average removal rates of chemical oxygen demand (COD), ammonium (NH4+–N), total nitrogen (TN), and total phosphorus (TP) were 36.34%, 57.61%, 48.49%, and 71.47%, respectively. The analysis indicates that temperature can affect the tailwater treatment capacity of constructed wetlands to a certain extent. These results provide an important basis for studying the purification capacity of constructed wetlands.
(This article belongs to the Section Wastewater Treatment and Reuse)
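The removal-rate and Spearman-correlation analysis named in the abstract can be sketched on synthetic monitoring data. Everything below (seed, pollutant levels, the assumed temperature effect) is invented for illustration and is not the study's data or code.

```python
import numpy as np
from scipy import stats

def removal_rate(inflow, outflow):
    """Percent removal of a pollutant between wetland influent and effluent."""
    return 100.0 * (inflow - outflow) / inflow

rng = np.random.default_rng(3)
temp = rng.uniform(5, 30, 150)          # water temperature samples, deg C
cod_in = rng.normal(60, 8, 150)         # influent COD, mg/L
# synthetic assumption: warmer water improves removal, plus measurement noise
cod_out = cod_in * (0.75 - 0.005 * temp) + rng.normal(0, 2, 150)

rr = removal_rate(cod_in, cod_out)
# rank-based (Spearman) correlation between temperature and removal rate
rho, pval = stats.spearmanr(temp, rr)
print(round(rho, 2))
```

A significantly positive rank correlation is the kind of evidence the study uses to argue that temperature affects treatment capacity.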

24 pages, 7152 KiB  
Article
Benchmarking Uninitialized CMIP6 Simulations for Inter-Annual Surface Wind Predictions
by Joan Saladich Cubero, María Carmen Llasat and Raül Marcos Matamoros
Atmosphere 2025, 16(3), 254; https://doi.org/10.3390/atmos16030254 - 23 Feb 2025
Viewed by 928
Abstract
This study investigates the potential of uninitialized global climate projections for providing 12-month (inter-annual) wind forecasts in Europe, in light of the increasing demand for long-term climate predictions. This is important in a context where models based on the past climate may not fully account for the implications of current warming trends for climate variability, and where initialized 12-month forecasts are still not widely available (i.e., seasonal forecasts) and/or consolidated (i.e., decadal predictions). To this aim, we use two types of simulations: uninitialized climate projections from CMIP6 (Coupled Model Intercomparison Project Phase 6) and initialized 6-month seasonal forecasts (ECMWF’s SEAS5), using the latter as a benchmark. All the predictions are bias-corrected with five distinct approaches (quantile delta mapping, empirical quantile mapping, quantile delta mapping, scaling bias-adjustment and a proprietary quantile mapping) and verified against weather observations from the ECA&D E-OBS project (684 weather stations across Europe). The quantile-mapping techniques outperform the other bias-correction algorithms in adjusting the cumulative distribution function (CDF) to the reference weather stations and in bringing the mean bias error closer to zero. However, a simple bias correction by scaling improves the time-series predictive accuracy (root mean square error, anomaly correlation coefficient, and mean absolute scaled error) of CMIP6 simulations over quantile-mapping bias corrections. Thus, the results suggest that CMIP6 projections may provide a valuable preliminary framework for understanding wind variations over the ensuing 12-month period. Finally, while baseline methods such as climatology can still outperform the presented methods in terms of time-series accuracy (i.e., root mean square error), our approach highlights a key advantage: climatology is static, whereas CMIP6 offers a dynamic, evolving view of climatology. The combination of dynamism and bias correction makes CMIP6 projections a valuable starting point for understanding wind climate variations over the next 12 months. Furthermore, workload schedulers within high-performance computing frameworks are essential for effectively handling these complex and ever-evolving datasets, highlighting the critical role of advanced computational methods in fully realizing the potential of CMIP6 for climate analysis.
(This article belongs to the Special Issue High-Performance Computing for Atmospheric Modeling)

22 pages, 1347 KiB  
Article
Semi-Empirical Approach to Evaluating Model Fit for Sea Clutter Returns: Focusing on Future Measurements in the Adriatic Sea
by Bojan Vondra
Entropy 2024, 26(12), 1069; https://doi.org/10.3390/e26121069 - 9 Dec 2024
Cited by 1 | Viewed by 833
Abstract
A method for evaluating the Kullback–Leibler (KL) divergence and Squared Hellinger (SH) distance between empirical data and a model distribution is proposed. This method exclusively utilises the empirical Cumulative Distribution Function (CDF) of the data and the CDF of the model, avoiding data processing such as histogram binning. The proposed method converges almost surely, with the proof based on the use of exponentially distributed waiting times. An example demonstrates convergence of the KL divergence and SH distance to their true values when utilising the Generalised Pareto (GP) distribution as empirical data and the K distribution as the model. Another example illustrates the goodness of fit of these (GP and K-distribution) models to real sea clutter data from the widely used Intelligent PIxel processing X-band (IPIX) measurements. The proposed method can be applied to assess the goodness of fit of various models (not limited to the GP or K distribution) to clutter measurement data such as those from the Adriatic Sea. Distinctive features of this small and immature sea, like the presence of over 1300 islands that affect local wind and wave patterns, are likely to result in an amplitude distribution of sea clutter returns that differs from the predictions of models designed for oceans or open seas. However, to the author’s knowledge, no data on this specific topic are currently available in the open literature, and such measurements have yet to be conducted.
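For context, a common CDF-based route to divergence estimation (explicitly not the paper's waiting-time construction, which avoids binning) is to difference two CDFs over a grid to obtain per-cell probability masses and compute a discrete KL divergence. The grid, the exponential test distributions, and the function name below are assumptions for illustration.

```python
import numpy as np
from scipy import stats

def kl_from_cdfs(cdf_p, cdf_q, grid):
    """Approximate KL(P||Q) by differencing two CDFs over a common grid,
    turning each CDF into discrete probability masses per grid cell."""
    p = np.diff(cdf_p(grid))
    q = np.diff(cdf_q(grid))
    mask = (p > 0) & (q > 0)      # drop empty cells to keep the log finite
    p, q = p[mask], q[mask]
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

# sanity check against the closed form for two exponentials with scales s1, s2:
# KL = log(s2/s1) + s1/s2 - 1
grid = np.linspace(0.0, 20.0, 2001)
kl = kl_from_cdfs(lambda x: stats.expon.cdf(x, scale=2.0),
                  lambda x: stats.expon.cdf(x, scale=2.5),
                  grid)
print(round(kl, 4))
```

With a fine grid this matches the closed-form value log(1.25) + 0.8 − 1 ≈ 0.0231; the paper's contribution is an estimator that achieves this from an empirical CDF without choosing bins at all.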

24 pages, 1762 KiB  
Article
Deploying Bottleneck Management Strategies for Ameliorating Critical Delays in Building Construction Projects: A Case for Developing Country of Iran
by Hamidreza Karimi, Hadi Sarvari, David J. Edwards, Daniel W. M. Chan and Timothy O. Olawumi
Systems 2024, 12(6), 195; https://doi.org/10.3390/systems12060195 - 5 Jun 2024
Cited by 4 | Viewed by 2905
Abstract
One of the primary concerns and challenges encountered in the construction industry is the emergence of crucial factors instigating project delays throughout the construction project lifecycle (CPL). Critical delay factors (CDFs) are significant factors that not only cause project delays but also create obstacles and bottlenecks for projects. Hence, the current study aims to determine the CDFs affecting project completion and to ameliorate their adverse effects by developing relevant bottleneck management strategies. To achieve this goal, a desktop review of previous research was undertaken to identify the CDFs in the CPL. The brainstorming technique was then used to filter the identified CDFs and match them to the context of developing countries, using Iran as a case example. Finally, an empirical questionnaire was created comprising 22 CDFs divided into three distinct groups. The questionnaire’s validity and reliability were checked and confirmed before large-scale distribution to target respondents. Sixty industry experts appraised the identified CDFs in the CPL against two assessment criteria: severity of impact and probability of occurrence. The findings revealed that the groups with the most significant level of impact (out of 5 points) are project planning and design (2.29), construction and delivery (1.99), and policymaking and legislation (1.72). Similarly, the groups of project planning and design (2.30), construction and delivery (2.20), and policymaking and legislation (1.5) were ranked from first to third by probability of occurrence. According to the survey findings, the project planning and design stage is the most opportune time to mitigate the impact of project delays. The study also offers pragmatic recommendations as bottleneck management strategies for ameliorating the identified CDFs in future projects. The study deliverables can serve as an effective tool for project stakeholders and decision makers to diminish the impact and penetration of CDFs in building construction projects and enhance the delivery path to project success.

24 pages, 4521 KiB  
Article
Calibrated Empirical Neutrosophic Cumulative Distribution Function Estimation for Both Symmetric and Asymmetric Data
by Hareem Abbasi, Usman Shahzad, Walid Emam, Muhammad Hanif, Nasir Ali and Mubeen Mukhtar
Symmetry 2024, 16(5), 633; https://doi.org/10.3390/sym16050633 - 20 May 2024
Cited by 3 | Viewed by 1812
Abstract
The traditional stratification weight is widely used in survey sampling for estimation under stratified random sampling (StRS). A neutrosophic calibration approach is proposed under neutrosophic statistics for the first time, with the aim of improving the conventional stratification weight. This addresses the challenge of estimating the empirical cumulative distribution function (CDF) of a finite population using the neutrosophic technique. The neutrosophic technique extends traditional statistics to deal with indeterminate, vague, and uncertain values. Thus, using additional information, we are able to obtain an effective estimate of the neutrosophic CDF. The suggested estimator yields an interval range in which the population empirical CDF is likely to lie, rather than a single numerical value. The proposed family of neutrosophic estimators is defined under suitable calibration constraints. A simulation study is also conducted to assess the effectiveness of the suggested and adapted neutrosophic estimators using real-life symmetric and asymmetric datasets.
(This article belongs to the Section Mathematics)

7 pages, 231 KiB  
Article
The Law of the Iterated Logarithm for Lp-Norms of Kernel Estimators of Cumulative Distribution Functions
by Fuxia Cheng
Mathematics 2024, 12(7), 1063; https://doi.org/10.3390/math12071063 - 1 Apr 2024
Cited by 1 | Viewed by 1630
Abstract
In this paper, we consider the strong convergence of the Lp-norms (p ≥ 1) of a kernel estimator of a cumulative distribution function (CDF). Under some mild conditions, the law of the iterated logarithm (LIL) for the Lp-norms of empirical processes is extended to the kernel estimator of the CDF.
(This article belongs to the Special Issue Advances in Applied Probability and Statistical Inference)
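The object studied in this paper, a kernel (smoothed) estimator of the CDF, can be written down in a few lines: it replaces the empirical CDF's steps with an average of kernel CDFs centered at the data points. The Gaussian kernel, rule-of-thumb bandwidth, and function name below are illustrative assumptions, not the paper's specific conditions.

```python
import numpy as np
from scipy import stats

def kernel_cdf(sample, x, h=None):
    """Smooth kernel estimator of the CDF: the average of Gaussian CDFs
    centered at the data points, i.e. a smoothed empirical CDF."""
    sample = np.asarray(sample)
    if h is None:
        # Silverman-style rule-of-thumb bandwidth (an assumption, for illustration)
        h = 1.06 * sample.std() * len(sample) ** (-1 / 5)
    x = np.atleast_1d(x)
    return stats.norm.cdf((x[:, None] - sample[None, :]) / h).mean(axis=1)

rng = np.random.default_rng(4)
data = rng.normal(0.0, 1.0, 400)
est = kernel_cdf(data, np.array([0.0]))   # estimate F(0); true value is 0.5
print(round(float(est[0]), 2))
```

The LIL result in the paper controls how the Lp distance between this estimator and the true CDF fluctuates as the sample size grows.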
14 pages, 324 KiB  
Article
A New Test Statistic to Assess the Goodness of Fit of Location-Scale Distribution Based on Progressive Censored Data
by Kyeongjun Lee
Symmetry 2024, 16(2), 202; https://doi.org/10.3390/sym16020202 - 8 Feb 2024
Cited by 2 | Viewed by 1446
Abstract
The problem of examining how well data fit a supposed distribution is very important and must be addressed prior to any data analysis, because many analysis methods assume a specific distribution of the data. For this purpose, histograms or Q-Q plots are employed to assess the data distribution. Additionally, a goodness-of-fit (GoF) test statistic (TstS) uses distance measurements between the empirical distribution function and the theoretical cumulative distribution function (cdf) to evaluate the data distribution. In life-testing or reliability studies, the observed failure times of test units may not be fully recorded in some situations, and GoF TstSs for completely observed data can no longer be used with progressive type II censored data (PrCsD). In this paper, we suggest GoF TstSs and a new plot method for the GoF test of symmetric and asymmetric location-scale distributions (LoScD) based on PrCsD. The power of the suggested TstSs is estimated through Monte Carlo (MC) simulations and compared with that of TstSs using order statistics (OrSt). Furthermore, we analyze real data examples (symmetric and asymmetric data).

31 pages, 2344 KiB  
Article
Properties of a Random Bipartite Geometric Associator Graph Inspired by Vehicular Networks
by Kaushlendra Pandey, Abhishek K. Gupta, Harpreet S. Dhillon and Kanaka Raju Perumalla
Entropy 2023, 25(12), 1619; https://doi.org/10.3390/e25121619 - 4 Dec 2023
Viewed by 2091
Abstract
We consider a point process (PP) generated by superimposing an independent Poisson point process (PPP) on each line of a 2D Poisson line process (PLP). Termed the PLP-PPP, this PP is suitable for modeling networks formed on an irregular collection of lines, such as vehicles on a network of roads or sensors deployed along trails in a forest. Inspired by vehicular networks in which vehicles connect to their nearest wireless base stations (BSs), we consider a random bipartite associator graph in which each point of the PLP-PPP is associated with the nearest point of an independent PPP through an edge. This graph is equivalent to the partitioning of the PLP-PPP by a Poisson Voronoi tessellation (PVT) formed by an independent PPP. We first characterize the exact distribution of the number of points of the PLP-PPP falling inside a ball centered at an arbitrary location in R², as well as at the typical point of the PLP-PPP. Using these distributions, we derive cumulative distribution functions (CDFs) and probability density functions (PDFs) of the kth contact distance (CD) and the nearest-neighbor distance (NND) of the PLP-PPP. As intermediate results, we present the empirical distribution of the perimeter and an approximate distribution of the length of the typical chord of the zero-cell of this PVT. Using these results, we present two close approximations of the distribution of the node degree of the random bipartite associator graph. In a vehicular network setting, this result characterizes the number of vehicles connected to each BS, which models its load. Since each BS has to distribute its limited resources across all the vehicles connected to it, a good statistical understanding of load is important for efficient system design. Several applications of these new results to different wireless network settings are also discussed.
(This article belongs to the Section Statistical Physics)
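Derived contact-distance CDFs of the kind this paper obtains are typically validated by Monte Carlo simulation. The PLP-PPP itself is involved, so the sketch below uses the much simpler plain-PPP case, whose contact-distance CDF is known exactly as 1 − exp(−λπr²); the window size, intensity, and replication count are assumptions.

```python
import numpy as np

def ppp_points(lam, size, rng):
    """Sample a homogeneous Poisson point process on a [0, size]^2 window."""
    n = rng.poisson(lam * size * size)
    return rng.uniform(0, size, (n, 2))

rng = np.random.default_rng(5)
lam, size = 1.0, 30.0
dists = []
for _ in range(200):
    pts = ppp_points(lam, size, rng)
    # contact distance: distance from the window centre to the nearest point
    d = np.min(np.linalg.norm(pts - size / 2, axis=1))
    dists.append(d)
dists = np.array(dists)

# empirical CDF at radius r versus the exact contact-distance CDF
r = 0.5
emp = float(np.mean(dists <= r))
exact = 1.0 - np.exp(-lam * np.pi * r * r)
print(round(emp, 2), round(exact, 2))
```

The same pattern (simulate the process, compare the empirical CDF of distances to the derived formula) carries over to the PLP-PPP distances in the paper, with the line process adding an extra sampling layer.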

24 pages, 569 KiB  
Article
Derivative of Reduced Cumulative Distribution Function and Applications
by Kevin Maritato and Stan Uryasev
J. Risk Financial Manag. 2023, 16(10), 450; https://doi.org/10.3390/jrfm16100450 - 18 Oct 2023
Cited by 1 | Viewed by 2601
Abstract
The reduced cumulative distribution function (rCDF) is the maximal lower bound for the cumulative distribution function (CDF). It is equivalent to the inverse of the conditional value at risk (CVaR), or one minus the buffered probability of exceedance (bPOE). This paper introduces the reduced probability density function (rPDF), the derivative of the rCDF. We first explore the relation between the rCDF and other risk measures. We then describe three ways of calculating the rPDF of a distribution, depending on what is known about it. For distributions with a closed-form formula for bPOE, we derive closed-form formulae for the rPDF. Further, we describe formulae for the rPDF based on numerical bPOE when there is a closed-form formula for CVaR but not for bPOE. Finally, we give a method for numerically calculating the rPDF of an empirical distribution and compare the results with other methods for known distributions. We also conducted a case study using the rPDF for sensitivity analysis and for parameter estimation with a method similar to maximum likelihood.
(This article belongs to the Special Issue Financial Technologies (Fintech) in Finance and Economics)
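The quantities this paper builds on, bPOE, its complement the rCDF, and CVaR, can be computed for an empirical sample. The sketch below uses the standard minimization formula for bPOE (minimizing over candidate thresholds at the sample points) and a simple tail-average CVaR; it is a rough illustration of the rCDF/CVaR inverse relationship, not the paper's rPDF method. Seed, sample, and tolerances are assumptions.

```python
import numpy as np

def bpoe(sample, x):
    """Buffered probability of exceedance at threshold x:
    min over a < x of E[max(X - a, 0)] / (x - a).  rCDF(x) = 1 - bPOE(x)."""
    sample = np.asarray(sample)
    best = 1.0
    for a in sample[sample < x]:
        val = np.mean(np.maximum(sample - a, 0.0)) / (x - a)
        best = min(best, val)
    return best

def cvar(sample, alpha):
    """Empirical conditional value at risk: mean of the worst (1 - alpha) tail."""
    s = np.sort(sample)
    k = int(np.ceil(alpha * len(s)))
    return s[k:].mean()

rng = np.random.default_rng(6)
losses = rng.normal(0.0, 1.0, 5000)
a = 1.0 - bpoe(losses, 1.5)      # rCDF evaluated at x = 1.5
# inverse relationship: CVaR at level rCDF(x) should return (approximately) x
print(round(cvar(losses, a), 2))
```

This round trip makes the abstract's statement concrete: the rCDF is the inverse of CVaR, and the rPDF introduced in the paper is its derivative.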

24 pages, 1216 KiB  
Article
Empirical Squared Hellinger Distance Estimator and Generalizations to a Family of α-Divergence Estimators
by Rui Ding and Andrew Mullhaupt
Entropy 2023, 25(4), 612; https://doi.org/10.3390/e25040612 - 4 Apr 2023
Cited by 4 | Viewed by 4055
Abstract
We present an empirical estimator for the squared Hellinger distance between two continuous distributions, which converges almost surely. We show that the divergence estimation problem can be solved directly using the empirical CDFs, without the intermediate step of estimating the densities. We illustrate the proposed estimator on several one-dimensional probability distributions. Finally, we extend the estimator to a family of estimators for the α-divergences, which also converge almost surely, and discuss the uniqueness of this result. We demonstrate applications of the proposed Hellinger affinity estimators to approximately bounding the Neyman–Pearson regions.
(This article belongs to the Section Information Theory, Probability and Statistics)

15 pages, 292 KiB  
Article
An Efficient Ratio-Cum-Exponential Estimator for Estimating the Population Distribution Function in the Existence of Non-Response Using an SRS Design
by Ayesha Khalid, Aamir Sanaullah, Mohammed M. A. Almazah and Fuad S. Al-Duais
Mathematics 2023, 11(6), 1312; https://doi.org/10.3390/math11061312 - 8 Mar 2023
Cited by 3 | Viewed by 2509
Abstract
To gain insight into various phenomena of interest, cumulative distribution functions (CDFs) can be used to analyze survey data. The purpose of this study was to present an efficient ratio-cum-exponential estimator for estimating a population CDF using auxiliary information under two scenarios of non-response. Expressions for the bias and mean squared error (MSE) were derived up to first-order approximation. The proposed estimator was compared, theoretically and empirically, with the modified estimators and was found to perform better on percent relative efficiency (PRE) and MSE criteria under specific conditions.
(This article belongs to the Special Issue Survey Statistics and Survey Sampling: Challenges and Opportunities)
31 pages, 7450 KiB  
Article
Modelling the Operation Process of Light Utility Vehicles in Transport Systems Using Monte Carlo Simulation and Semi-Markov Approach
by Mateusz Oszczypała, Jarosław Ziółkowski and Jerzy Małachowski
Energies 2023, 16(5), 2210; https://doi.org/10.3390/en16052210 - 24 Feb 2023
Cited by 9 | Viewed by 2653
Abstract
This research paper presents studies on the operation process of the Honker 2000 light utility vehicles that are part of the Polish Armed Forces transport system. The phase space of the process was identified based on the assumption that at any given moment a vehicle remains in one of four states: task execution, awaiting a transport task, periodic maintenance, or repair. Vehicle functional readiness and technical suitability indices were adopted as performance measures for the technical system. A simulation model based on Monte Carlo methods was developed to determine the changes in the operational states. The occurrence of the periodic maintenance state is strictly determined by the planned, preventive operation strategy applied within the analysed system; the other states are realizations of stochastic processes. The original source code was developed in the MATLAB environment to implement the model. Based on estimated probabilistic characteristics, the authors validated 16 simulation models resulting from all possible cumulative distribution functions (CDFs) that satisfied the condition of a proper match to the empirical data. Based on the simulated operation process for a sample of 19 vehicles over an assumed 20-year forecast horizon, the functional readiness and technical suitability indices were determined. The relative differences between the results of all simulation models and those obtained through the semi-Markov model did not exceed 6%. The best-fit model was subjected to sensitivity analysis with respect to the dependence of the functional readiness and technical suitability indices on vehicle operation intensity. The proposed Monte Carlo simulation system thus proved to be a useful tool for analysing the current operation process of means of transport, both for forecasts in the current environment and for extrapolation.
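The general Monte Carlo scheme described here, sampling sojourn times per operational state from fitted CDFs and accumulating time shares into a readiness index, can be sketched compactly. The original model is in MATLAB; the Python below is a toy stand-in, and the exponential sojourn times, the fixed 8-hour maintenance, and the transition probabilities are all invented assumptions, not the paper's fitted values.

```python
import numpy as np

rng = np.random.default_rng(7)

# stand-in sojourn-time samplers for the four operational states (hours);
# the real model fits candidate CDFs to empirical data for each state
sample_time = {
    "task":   lambda: rng.exponential(40.0),
    "await":  lambda: rng.exponential(20.0),
    "maint":  lambda: 8.0,                    # planned maintenance: deterministic
    "repair": lambda: rng.exponential(12.0),
}

def next_state(state):
    # invented transition rule approximating a planned-maintenance policy:
    # 10% of tasks are followed by maintenance; 20% of waits end in repair
    if state == "task":
        return "maint" if rng.random() < 0.1 else "await"
    if state == "await":
        return "repair" if rng.random() < 0.2 else "task"
    return "task"   # after maintenance or repair, back to work

horizon = 200_000.0
t, state = 0.0, "task"
time_in = {s: 0.0 for s in sample_time}
while t < horizon:
    dt = sample_time[state]()
    time_in[state] += dt
    t += dt
    state = next_state(state)

# functional readiness: fraction of time the vehicle is available for tasking
readiness = (time_in["task"] + time_in["await"]) / t
print(round(readiness, 2))
```

Repeating such runs per vehicle and per candidate CDF set is what lets the authors compare 16 simulation variants against the semi-Markov benchmark.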
