Article

Research on Resource Consumption Standards for Highway Electromechanical Equipment Based on Monte Carlo Model

1 School of Business, Xinjiang University, Urumqi 830017, China
2 Xinjiang Transportation Investment (Group) Co., Ltd., Urumqi 830001, China
3 Engineering Technology and Transportation Industry in Arid Desert Areas, Urumqi 830001, China
4 School of Transportation Engineering, Xinjiang University, Urumqi 830017, China
5 Xinjiang Key Laboratory of Green Construction and Maintenance of Transportation Infrastructure and Intelligent Traffic Control, Urumqi 830017, China
* Author to whom correspondence should be addressed.
Sustainability 2025, 17(10), 4640; https://doi.org/10.3390/su17104640
Submission received: 3 March 2025 / Revised: 14 May 2025 / Accepted: 16 May 2025 / Published: 19 May 2025

Abstract
The increasing complexity of highway electromechanical systems has created a critical need to improve the accuracy of resource consumption standards. Traditional deterministic methods often fail to capture inherent variability in resource usage, resulting in significant discrepancies between budget estimates and actual costs. To address this issue for a specific device, this study develops a probabilistic framework based on Monte Carlo simulation, using manual barrier gate installation as a case study. First, probability distribution models for key parameters were established by collecting and statistically analyzing field data. Next, Monte Carlo simulation generated 100,000 pseudo-observations, yielding mean labor consumption of 1.08 workdays (SD 0.29), expansion bolt usage of 6.02 sets (SD 0.97), and equipment shifts of 0.20 (SD 0.10). Comparison with the “Highway Engineering Budget Standards” (JTG/T 3832-2018) revealed deviations of 1% to 4%, and comparison with market bid prices showed errors below 2%. These results demonstrate that the proposed method accurately captures dynamic fluctuations in resource consumption, aligning with both national norms and actual tender data. In conclusion, the framework offers a robust and adaptable tool for cost estimation and resource allocation in highway electromechanical projects, enhancing budgeting accuracy and reducing the risk of cost overruns.

1. Introduction

Over the past few decades, the rapid development of highway infrastructure has expanded road networks and accelerated the digital transformation and modernization of transportation systems [1]. Electromechanical equipment, encompassing systems such as traffic monitoring, toll collection, communication networks, and lighting, is pivotal in enhancing operational efficiency and safety. As investment in electromechanical equipment continues to rise, constituting a substantial portion of project costs, the complexity of managing, maintaining, and estimating the costs of these systems has significantly increased.
Despite significant advances in construction practices and digital technologies, existing methods for establishing resource consumption standards often fall short of capturing the dynamic consumption volumes of modern electromechanical equipment [2]. Traditional approaches to setting these standards do not fully accommodate the rapid evolution in equipment specifications and functionalities, leading to discrepancies between estimated and actual consumption [3]. This gap highlights the need for a more robust probabilistic framework to develop resource consumption standards that accurately reflect actual consumption patterns, thereby supporting more effective resource planning and management in highway electromechanical systems.
In the context of production and operational management, international research has primarily focused on resource consumption standards, that is, norms governing the consumption of production factors. In practice, these standards fall into two main categories. The first pertains to labor consumption standards, typically estimated using work-hour-based methods; international scholars refer to this as the establishment of labor work-hour allocations. For instance, Frederick Taylor, often regarded as the father of scientific management, pioneered time-and-motion studies in 1880 to analyze the work-hour requirements of various tasks, laying the groundwork for systematic work-hour allocation [4]. Later, Lawrence S. Aft [5] proposed using capacity evaluation methods and production models to develop standard work hours. Furthermore, L. Xi et al. [6] applied a multi-input single-output PSO-BP neural network model to segmented work-hour allocation, improving the computational efficiency and accuracy of labor input estimates. In addition, Vigoya et al. [7] employed IoT sensor data and time series modeling techniques to construct an annotated dataset for anomaly detection in data centers, providing a systematic foundation for analyzing labor consumption and identifying performance anomalies. The second category concerns material consumption standards, which govern the use of various physical resources during production. These standards are often derived using activity-based costing approaches, wherein material consumption is allocated to specific operations based on their associated activities. In 1930, Professor Eric Kohler introduced the concept of activity-based costing by incorporating the idea of personal responsibility into cost control [8]. Subsequently, Stobbs further explored the application of activities for calculating standard costs [9]. Moreover, Relph et al. [10] argued that because inventory consumption constitutes a significant part of total cost, the highest and lowest reserve levels should be given equal weight when constructing an optimal model. Finally, Yong-Woo Kim et al. [11] applied time-driven activity-based costing to construction projects, establishing a standard cost model for material consumption. Although the international literature lacks a unified system for resource consumption standards, as fixed quotas are less common outside domestic studies, the distinction between labor and material consumption remains a useful framework for analysis.
Various methods have been developed for establishing resource consumption standards, each with merits and limitations. The main methods include data statistical analysis, technical determination, machine learning, and empirical estimation. To estimate standard values, statistical analysis evaluates historical consumption data across different periods and project phases. It requires sufficient historical records to ensure reliability. For example, panel data cointegration techniques (e.g., CUP-FM and CUP-BC) are widely used to analyze long-term trends in resource consumption standards and emissions, especially in complex scenarios involving globalization and natural resource utilization [12]. However, this approach heavily depends on data quality and availability. Technical determination involves the direct observation and measurement of production processes. Time studies, work sampling, and performance recording capture labor and material consumption under ideal conditions [13]. Although these methods provide granular insights into process dynamics, they are time-intensive and prone to observer bias. Machine learning leverages large datasets to train predictive models, uncovering hidden patterns in consumption behavior. For instance, advanced models integrating renewable energy adoption and urbanization trends have improved emission predictions in high-income countries [14,15]. These models excel in handling non-linear relationships but require extensive computational resources. Empirical estimation relies on operators’ practical experience to set standards. While simple and fast, it tends to overestimate or underestimate values when applied to new technologies or materials [16].
In response to these limitations, there has been growing interest in probabilistic methods that capture uncertainty and variability more effectively. Alternative approaches such as LDA topic modeling, probabilistic latent semantic analysis (pLSA), Hidden Markov Models (HMMs), Gaussian Mixture Models (GMMs), Bayesian networks, and bootstrapping offer different strengths in data analysis: while LDA and pLSA excel at extracting latent topics from large datasets, they typically require extensive data and focus more on feature discovery than on directly quantifying uncertainty [17,18]. HMMs and GMMs are powerful for modeling sequential and clustered data but depend on assumptions that may not hold in all practical applications [19,20]. Bayesian networks provide robust graphical representations of variable relationships yet often require detailed domain expertise [21]. Bootstrapping, though useful as a non-parametric method, may not fully capture the complex randomness inherent in high-dimensional systems [22]. In contrast, Monte Carlo simulation stands out for its simplicity and directness: it bypasses strict model assumptions by generating random samples from observed or assumed probability distributions [23]. This technique not only quantifies inherent uncertainty by establishing confidence intervals and error bounds but also adapts flexibly to diverse and complex scenarios, making it particularly effective in capturing non-linear dependencies and extreme events [24]. In this study, Monte Carlo simulation is enhanced by integrating adaptive parameter calibration with real-time operational data feedback, allowing the model to dynamically capture site-specific variations and complex non-linear interactions in resource consumption, which significantly improves estimation accuracy and broadens its applicability in highway electromechanical projects. Its scalability and ease of integration with modern computational methods further enhance its robustness and efficiency compared to other probabilistic approaches. Monte Carlo simulation models complex uncertainties via random sampling [25]: by generating many simulated data points from probability distributions derived from historical records or expert judgment [26], it quantifies inherent uncertainty, establishes confidence intervals for consumption estimates, and provides insight into the likelihood of extreme consumption scenarios. Darmanto et al. [27] introduced a Monte Carlo Regression Tree (MCRT) algorithm that improved energy consumption prediction accuracy by 2% in small datasets. Similarly, Costa et al. [28] applied Monte Carlo simulation to assess water supply reliability in Brazilian reservoirs, accounting for uncertainties arising from climate change. Moreover, integrating Monte Carlo methods with data-driven techniques, such as machine learning, has led to dynamic, adaptive frameworks that can be updated in real time as new data become available [29]. In this regard, Smith et al. [30] integrated Monte Carlo simulations with deep learning-based memory modules to develop a Memory-Augmented Monte Carlo Tree Search (M-MCTS) framework, demonstrating superior real-time decision making and adaptability to dynamic environments. Xing et al. [31] integrated Monte Carlo simulations with the GM(1,1) model to predict multi-type electric vehicle charging loads, analyzing charging behaviors to optimize grid planning and dispatching.
Collectively, these advancements ensure that resource consumption standards remain robust and relevant in rapidly evolving operational environments.
Against this backdrop, the present study makes several key contributions. First, it develops an innovative and adaptive probabilistic framework that integrates traditional statistical analysis with Monte Carlo simulation, enabling a more accurate characterization of dynamic fluctuations in electromechanical equipment consumption. Unlike conventional models, this framework incorporates real-time site-specific data and dynamically adjusts probability distributions based on observed variability, which constitutes a methodological innovation in the field. Second, by explicitly modeling uncertainty and stochastic variability, the proposed method significantly improves estimation precision compared to existing approaches. Third, the framework is rigorously validated using empirical measurement data and benchmark comparisons, confirming both its reliability and practical value. Finally, a dynamic adjustment mechanism, based on the coefficient of variation and real-time feedback, is included to ensure the proposed standards can evolve with changes in equipment specifications and consumption behavior. Together, these innovations establish a more robust and forward-looking approach to resource planning in highway infrastructure systems.

2. Methods

2.1. Monte Carlo Method Overview

The Monte Carlo method is a statistical technique based on random sampling that estimates the solution of mathematical problems by generating many random samples. Its name originates from the Monte Carlo casino, as the approach relies on randomness in much the same way that games of chance do. The core idea of the Monte Carlo method is to simulate problems through extensive random sampling and to apply statistical principles for numerical estimation. The method is widely used for problems that are difficult to address with traditional approaches because of their complexity or high dimensionality, and it is particularly useful for numerical integration, probability estimation, stochastic optimization, and risk analysis. By generating many random samples and performing statistical analysis, the Monte Carlo method provides approximate solutions to complex problems, especially those involving uncertainty and randomness.
As early as the 18th century, the French mathematician Georges-Louis Leclerc, Comte de Buffon, proposed a method to estimate probability based on an event's occurrence frequency. This technique, known as Buffon's needle problem, represents an early application of Monte Carlo integration. In this problem, a plane is marked with parallel lines spaced at an interval $a$, and a needle of length $x$ ($x \le a$) is randomly dropped onto the plane. By recording the number of times the needle intersects the parallel lines, the value of $\pi$ can be estimated using probability theory. The probability $P$ of the needle intersecting a line can be approximated by the ratio of intersection occurrences to the total number of throws. The relationship between $P$, the needle length $x$, and the line spacing $a$ is expressed as Equation (1):

$$P = \frac{2x}{\pi a} \quad (1)$$

Through actual needle-throwing experiments, the ratio of intersection occurrences to total throws can be used to approximate the value of $\pi$. This example serves as a classical case of Monte Carlo integration, demonstrating how random sampling can be utilized to estimate mathematical solutions. Specifically, when dealing with real-world engineering problems, scientists and engineers analyze the logical relationships between internal and external influencing factors, including environmental conditions and resource configurations. Based on this analysis, a reasonable probabilistic model is constructed using probability theory and statistics principles.
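To make Equation (1) concrete, the following minimal MATLAB sketch (illustrative only, not part of the original study; the line spacing a = 2, needle length x = 1, and throw count N are assumed values) estimates $\pi$ by simulating random needle drops:

a = 2; x = 1; N = 1e6; % assumed spacing, needle length, and number of throws
d = (a/2) * rand(N, 1); % distance from needle midpoint to the nearest line
theta = (pi/2) * rand(N, 1); % acute angle between needle and the lines
hits = d <= (x/2) * sin(theta); % a throw crosses a line when d <= (x/2)sin(theta)
P = mean(hits); % estimated crossing probability
pi_est = 2 * x / (P * a); % invert Equation (1): P = 2x/(pi*a)
fprintf('Estimated pi: %.4f\n', pi_est);

With N = 1e6 throws, the estimate typically agrees with $\pi$ to two or three decimal places, illustrating how sampling error shrinks as the number of random samples grows.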
In practical engineering applications, variations in construction efficiency are influenced by multiple factors, such as construction conditions, individual worker proficiency, and differences between construction projects and work sections. As a result, labor, materials, and machinery consumption per unit of time exhibit inherent randomness. Statistically, this randomness follows a probability distribution, often modeled as a normal distribution or a t-distribution. The normal distribution is symmetric and bell-shaped, while the t-distribution is a modification used for small sample sizes. As the degrees of freedom (df) increase, the t-distribution converges toward the standard normal distribution. Therefore, resource consumption standards can be approximated using a normal distribution.
In summary, the flexibility and applicability of the Monte Carlo method make it a powerful tool for solving various real-world problems. It provides a practical numerical solution approach for complex problems that cannot be addressed using traditional mathematical techniques. Computational tools such as MATLAB R2022b can generate multiple sets of simulated data, construct probabilistic distribution models, and randomly sample large datasets. Through the extraction and analysis of these random samples, the expected values obtained can approximate the true values, yielding a reliable estimate of the solution. A schematic representation of the Monte Carlo method is illustrated in Figure 1.
The Monte Carlo method is theoretically grounded in the law of large numbers and the Central Limit Theorem. The law of large numbers ensures that as the number of samples increases, the sample mean converges to the true expected value, providing statistical reliability for repeated simulations. Meanwhile, the Central Limit Theorem explains why the distribution of sample means tends to approximate a normal distribution, even when the underlying data are not normally distributed. Together, these principles form the foundation for using random sampling to estimate the behavior of complex systems. In this study, the Monte Carlo method is used to repeatedly sample from known probability distributions of input variables, allowing for the statistical estimation of outputs based on their functional relationships.
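A minimal MATLAB sketch can make these two principles concrete; it assumes an arbitrary non-normal input, here a uniform distribution on [0, 1], chosen purely for illustration:

rng(1); % fix the seed for reproducibility
x = rand(1e5, 1); % uniform draws with true mean 0.5
running_mean = cumsum(x) ./ (1:1e5)'; % law of large numbers: running mean converges to 0.5
fprintf('Mean after 1e2, 1e4, 1e5 draws: %.4f %.4f %.4f\n', ...
    running_mean(1e2), running_mean(1e4), running_mean(1e5));
sample_means = mean(rand(1000, 500), 2); % 1000 sample means of 500 draws each
fprintf('Std of sample means: %.4f (theory: %.4f)\n', ...
    std(sample_means), sqrt(1/12)/sqrt(500)); % CLT: means are near-normal with std sigma/sqrt(n)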

2.2. Model Construction for Highway Electromechanical Equipment Resource Consumption Standards

In constructing resource consumption standards models for electromechanical equipment in highway engineering, the Monte Carlo method is an effective stochastic simulation technique with which to estimate resource utilization patterns across diverse projects accurately. This approach relies on in-depth analysis of raw data and incorporates probabilistic distribution models to capture the complexities inherent in real-world scenarios, thereby providing robust data support for resource allocation standards and planning. The process encompasses multiple stages, from data collection and error handling to establishing probabilistic models, ultimately achieving precise simulation and analysis of resource consumption standards dynamics.

2.2.1. Data Collection and Processing

Raw Data Collection

Prior to formulating resource consumption standards, it is essential to collate and collect extensive foundational data, including construction plans, daily logs, work team records, machinery usage logs, and cost materials. A structured framework (e.g., revision principles, scope, and measurement units) was established to streamline data processing. Field data from representative projects were systematically analyzed to derive labor, material, and machinery consumption metrics, which serve as inputs for Monte Carlo simulations. The workflow for developing resource consumption standards is illustrated in Figure 2.

Error Classification and Characteristics

Error processing is required for the collected raw data. In actual data measurement, errors can impact the accuracy of the data. These errors mainly include systematic errors, random errors, and gross errors.
(1) Systematic errors
Systematic errors arise due to inherent defects in measuring equipment, tools, instruments, or environmental conditions, causing all measurement results to deviate consistently from the actual value. Generally, systematic errors follow a fixed pattern and are introduced by the measurement method or the observation and recording system, exhibiting a certain regularity.
(2) Random errors
Random errors result from uncontrolled or unavoidable random variations. They manifest as instability or fluctuations in experimental outcomes. Random errors can originate from various sources, such as instrument precision, changes in environmental conditions, operator skill differences, and other factors that cannot be precisely measured or controlled. Such errors are referred to as random or accidental errors.
Random errors are unpredictable and follow no specific pattern. However, due to their random nature, the mean value of multiple measurements tends to converge toward the actual value, as random errors have an equal probability of occurring in different directions. As the number of measurements n increases indefinitely, the mean error asymptotically approaches zero, as expressed in Equation (2):
$$\lim_{n \to \infty} \frac{1}{n} \sum_{i=1}^{n} \delta_i = 0 \quad (2)$$
Although random errors cannot be eliminated, their impact can be minimized or analyzed using probabilistic and statistical methods.
(3) Gross errors
Gross errors are significant and abnormal deviations during measurements or experiments, typically deviating substantially from the actual value. Unlike random errors, gross errors are often caused by systematic issues rather than purely accidental factors.
In the context of raw data for resource consumption standards, gross errors can arise for various reasons, such as instrument failures, operator transcription errors, changes in construction site conditions, data entry errors or computational errors. These factors can lead to measurement or experimental results that deviate significantly from the actual value, introducing gross errors. The presence of gross errors can significantly affect the accuracy and reliability of experimental results, thereby impacting subsequent calculations and analyses. Therefore, it is crucial to promptly identify, correct, or eliminate the influence of gross errors.

Error Processing Methods

The errors encountered in the resource consumption standard calculation process include systematic errors, random errors, and gross errors. The methods for identifying and handling these errors are as follows:
(1) Identification of systematic errors
The collected consumption data are arranged in ascending order, i.e., $x_1 \le x_2 \le \cdots \le x_n$, and the following statistics are computed:

$$u = \frac{\sigma_1}{\sigma_2} - 1 \quad (3)$$

$$\sigma_1 = \sqrt{\frac{1}{n-1} \sum_{i=1}^{n} v_i^2} \quad (4)$$

$$\sigma_2 = \frac{1.253}{\sqrt{n(n-1)}} \sum_{i=1}^{n} \left| v_i \right|, \quad v_i = x_i - \bar{x} \quad (5)$$

In the above equations, $\sigma_1$ represents Bessel's standard deviation, $\sigma_2$ represents Belyaev's standard deviation, and $v_i$ represents the residual error.
If the condition $|u| \ge 2/\sqrt{n-1}$ is met, it indicates the presence of systematic errors in the data, requiring an analysis of the potential causes and a recollection of the data. Conversely, if the condition is not met, the data are considered free from systematic errors.
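A compact MATLAB sketch of this check, assuming x holds the collected consumption data (a sketch of Equations (3) to (5), not code from the original study), is as follows:

x = sort(x); % arrange the data in ascending order
n = numel(x);
v = x - mean(x); % residual errors v_i
sigma1 = sqrt(sum(v.^2) / (n - 1)); % Bessel's standard deviation, Equation (4)
sigma2 = 1.253 / sqrt(n*(n - 1)) * sum(abs(v)); % Belyaev's standard deviation, Equation (5)
u = sigma1 / sigma2 - 1; % test statistic, Equation (3)
if abs(u) >= 2 / sqrt(n - 1)
    disp('Systematic error suspected: analyze causes and recollect the data.');
else
    disp('No systematic error detected.');
end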
(2) Identification of gross errors
Standard statistical methods for detecting gross errors and outliers include the 3σ rule, Grubbs' test, and Dixon's test. The 3σ rule suits cases with many measurements or a large sample size; when the sample size exceeds 50, it is convenient and does not require table lookup. However, this method becomes ineffective for small sample sizes (≤10) [32]. Grubbs' test is commonly used for outlier elimination in relatively small sample datasets; when the sample size is between 3 and 50 and there is a single outlier, Grubbs' test is convenient. Dixon's test does not require standard deviation calculations and is more effective when multiple outliers exist. The specific calculation methods are as follows.
(1) The 3σ rule
The 3σ rule is a commonly used quality control method for identifying outliers in a dataset. The rule assumes a normal distribution and uses the standard deviation to determine whether a data point deviates from the mean. For a dataset approximately following a normal distribution, about 68% of the data points fall within one standard deviation of the mean, about 95% within two standard deviations, and approximately 99.7% within three standard deviations. Accordingly, data points more than three standard deviations from the mean can be considered outliers.
When the dataset follows a normal distribution, the 3σ rule is suitable for detecting and removing outliers. If $X \sim N(\mu, \sigma^2)$, the probability that it falls within an interval centered on the mean with a radius of three standard deviations is given by Equation (6):

$$P(\mu - 3\sigma \le X \le \mu + 3\sigma) = \int_{\mu - 3\sigma}^{\mu + 3\sigma} \frac{1}{\sqrt{2\pi}\,\sigma} e^{-\frac{(x - \mu)^2}{2\sigma^2}} \, dx = 0.9973 \quad (6)$$

Thus, values that deviate from the mean by more than three standard deviations are marked as outliers. The 3σ rule is simple and convenient, requiring no table lookup, but it imposes stringent data requirements. It achieves satisfactory results when the inspection frequency is high and accuracy requirements are relatively low.
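A minimal MATLAB sketch of the 3σ rule, assuming x is a vector of measurements, might read:

mu = mean(x); sigma = std(x);
outliers = abs(x - mu) > 3 * sigma; % flag points beyond three standard deviations
x_clean = x(~outliers); % retain the remaining observations
fprintf('%d outlier(s) removed out of %d points.\n', sum(outliers), numel(x));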
(2) Grubbs’ test
Grubbs’ test is a statistical method for detecting outliers in a dataset. This method primarily identifies potential outliers in a single sample or a dataset. The fundamental idea behind Grubbs’ test is that outliers typically deviate significantly from other observations in the dataset. Therefore, outliers can be detected by comparing an observation’s deviation from the rest of the data. Grubbs’ test is remarkably accurate when the sample size is small. The general procedure of Grubbs’ test is as follows:
1. Suppose a dataset is given as $x_1, x_2, \ldots, x_n$, with maximum value $x_{\max}$ and minimum value $x_{\min}$. The extreme values are the most likely candidates for outliers.
2. Select a significance level $\alpha$ representing the probability of making an incorrect outlier determination. For example, $\alpha = 0.05$ indicates a 5% probability of an incorrect decision. The appropriate significance level should be chosen based on the required accuracy.
3. Calculate Grubbs’ test statistic by taking the largest absolute deviation in any observation from the sample mean and dividing it by the sample standard deviation.
4. Obtain the critical value for Grubbs’ test based on the chosen significance level and sample size from standard statistical tables. If the test statistic exceeds the critical threshold, the corresponding observation is classified as an outlier.
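The following MATLAB sketch illustrates a two-sided Grubbs' test for a single outlier; instead of a table lookup, it uses the standard t-based closed form for the critical value (an equivalent substitution, and tinv is assumed to be available from the Statistics and Machine Learning Toolbox):

n = numel(x);
alpha = 0.05; % chosen significance level
[G, idx] = max(abs(x - mean(x)) / std(x)); % Grubbs' statistic: largest standardized deviation
tcrit = tinv(1 - alpha / (2*n), n - 2); % upper critical t-value
Gcrit = (n - 1) / sqrt(n) * sqrt(tcrit^2 / (n - 2 + tcrit^2)); % critical value
if G > Gcrit
    fprintf('Observation %d (value %.3f) is flagged as an outlier.\n', idx, x(idx));
end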
(3) Dixon’s Q test
Dixon’s Q test is a method for detecting outliers in small sample datasets. It is beneficial for determining whether a single extreme value in a dataset should be classified as an outlier. The test is applicable when the sample size is small, typically between 3 and 30 observations. The core concept of Dixon’s Q test is to compute the difference between a suspected outlier and its nearest neighbor, standardize this difference, and compare it to a critical threshold. The test follows the following steps:
1. Sort the data: Arrange the dataset in ascending order, so that it consists of values $x_1, x_2, \ldots, x_n$, where $x_1$ is the minimum and $x_n$ is the maximum.
2. Compute the Q statistic:
If the maximum value is suspected to be an outlier, compute $Q = \frac{x_n - x_{n-1}}{x_n - x_1}$; if the minimum value is suspected, compute $Q = \frac{x_2 - x_1}{x_n - x_1}$.
3. Determine the critical value: Refer to Dixon’s Q test table to obtain the critical value based on the sample size and the chosen significance level.
4. Determine if it is an outlier: If the calculated Q value exceeds the critical threshold, the suspected data point is considered an outlier. If the Q value is less than or equal to the critical threshold, the suspected value is not classified as an outlier.
Dixon’s Q test is straightforward and particularly useful for preliminary analyses of small sample datasets. However, its limitations should be considered, such as its sensitivity to small sample sizes and its dependency on distributional assumptions.
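A minimal MATLAB sketch of Dixon's Q statistic for the two extreme values follows; the critical value must still come from Dixon's table, and the figure shown here (0.466, roughly the tabulated value for n = 10 at the 95% level) is illustrative only:

x = sort(x); % ascending order
n = numel(x);
Q_max = (x(n) - x(n-1)) / (x(n) - x(1)); % test the maximum as a suspected outlier
Q_min = (x(2) - x(1)) / (x(n) - x(1)); % test the minimum as a suspected outlier
Qcrit = 0.466; % illustrative tabulated value; look up for the actual n and alpha
if Q_max > Qcrit, fprintf('Maximum %.3f is an outlier.\n', x(n)); end
if Q_min > Qcrit, fprintf('Minimum %.3f is an outlier.\n', x(1)); end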

2.2.2. Establishing the Probability Distribution Function Model and Analysis

One of the key steps of the Monte Carlo method is establishing the probability model distribution function for random variables, typically achieved using objective or subjective estimation. The objective estimation method relies on factual data and observed, measured, and analyzed results to infer probability distribution characteristics. In contrast, the subjective estimation method depends on expert opinions, experience, and subjective judgment to reach a relatively unified consensus.
Due to the difficulty in obtaining historical data for electromechanical engineering projects and the significant variations in historical data from similar projects, the subjective estimation method is often preferred for determining the probability distribution of random variables. The standard theoretical distribution models for resource consumption standards are shown in Table 1.
Through subjective estimation of resource consumption standard data, the probability density function often tends toward a normal distribution. As confirmed by numerous engineering cases and theoretical validations, most natural and social phenomena follow a normal distribution pattern. Thus, the resource consumption standards determined using subjective estimation are assumed to follow a normal distribution model, improving estimation reliability. The probability density function is expressed as Equation (7):
$$f(x) = \frac{1}{\sigma \sqrt{2\pi}} e^{-\frac{1}{2} \left( \frac{x - \mu}{\sigma} \right)^2}, \quad -\infty < x < \infty \quad (7)$$
Prior to selecting a probability distribution model, descriptive statistics and goodness-of-fit tests were conducted using IBM SPSS Statistics 26 to empirically determine the most appropriate distribution type for each resource parameter. Histograms were inspected, skewness and kurtosis coefficients were calculated, and the Kolmogorov–Smirnov test was applied to assess normality. All key consumption variables met the normality criteria (|CS| < T and CE within critical bounds), justifying the use of the normal density function. For any parameter exhibiting significant deviation from normality, alternative distributions (e.g., lognormal, gamma, and triangular) as defined in Table 1 were applied in the MATLAB simulation to ensure the accurate representation of the observed data characteristics.
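For readers working in MATLAB rather than SPSS, an analogous normality screening can be sketched as follows (a rough analogue only: kstest, skewness, and kurtosis are assumed from the Statistics and Machine Learning Toolbox, and the data are standardized before the Kolmogorov-Smirnov test):

z = (x - mean(x)) / std(x); % standardize so the reference is the standard normal
[h, p] = kstest(z); % h = 0 means normality is not rejected at the 5% level
fprintf('K-S p-value: %.3f, skewness: %.3f, kurtosis: %.3f\n', ...
    p, skewness(x), kurtosis(x));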
To extend these empirically validated distribution models into practical cost forecasting under uncertainty, Monte Carlo simulation was employed to translate input variability into probabilistic estimates of resource-related costs. Monte Carlo simulation is an established method for capturing uncertainty in project cost and resource estimation. It generates a probability distribution of possible outcomes by repeatedly sampling from assumed input distributions, rather than relying on a single-point estimate. This approach explicitly incorporates variability and risk factors, providing a clearer picture of potential costs than deterministic forecasts. In practice, Monte Carlo simulations are widely used to model the probability of cost overruns and resource usage under uncertainty. Even when actual historical data are limited (small sample sizes), Monte Carlo methods remain applicable: one can construct reasonable distributions for each cost driver based on available data or expert judgment, and then run a large number of trials. The law of large numbers ensures that as the number of simulation iterations grows, the estimated mean and confidence intervals converge, yielding robust estimates of resource-related costs despite limited initial data. This capability makes Monte Carlo simulation well suited to real-world scenarios of resource planning, where it can help identify inefficiencies and inform contingency planning, reinforcing that once the distributions have been theoretically and empirically validated, the specific probability density function can be obtained by calculating each dataset’s mean and standard deviation.
Once the distribution type has been validated, the specific probability density function is obtained by calculating the mean and standard deviation of the data. However, the normality of the data must be tested before the probability function is used. Several methods exist for normality testing, among which skewness and kurtosis tests are commonly used. The testing method is as follows:
Assuming a set of observations sorted in ascending order, $y_1 < y_2 < \cdots < y_n$, the following formulas apply:

$$\bar{y} = \frac{1}{n} \sum_{i=1}^{n} y_i \quad (8)$$

$$k_2 = \frac{1}{n} \sum_{i=1}^{n} (y_i - \bar{y})^2 \quad (9)$$

$$k_3 = \frac{1}{n} \sum_{i=1}^{n} (y_i - \bar{y})^3 \quad (10)$$

$$k_4 = \frac{1}{n} \sum_{i=1}^{n} (y_i - \bar{y})^4 \quad (11)$$
The skewness coefficient $C_S = k_3 / k_2^{3/2}$ measures the asymmetry of the distribution curve relative to the mean, and the kurtosis coefficient $C_E = k_4 / k_2^2$ assesses the sharpness or flatness of the distribution curve at its peak. If the data follow a normal distribution, the absolute value of the skewness coefficient $C_S$ should be less than the symmetry test critical value $T$, and the kurtosis coefficient $C_E$ should fall within the critical range $(Q_1, Q_2)$. The critical value $T$ and the critical range $(Q_1, Q_2)$ depend on the confidence level and sample size and can be obtained from statistical tables.
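A direct MATLAB transcription of Equations (8) to (11), assuming y holds the sorted sample and T, Q1, and Q2 hold the tabulated critical values, could read:

n = numel(y);
ybar = mean(y); % Equation (8)
k2 = sum((y - ybar).^2) / n; % Equation (9)
k3 = sum((y - ybar).^3) / n; % Equation (10)
k4 = sum((y - ybar).^4) / n; % Equation (11)
CS = k3 / k2^(3/2); % skewness coefficient
CE = k4 / k2^2; % kurtosis coefficient
isNormal = abs(CS) < T && CE > Q1 && CE < Q2; % normality criterion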

2.2.3. Data Simulation

After collecting and processing the original data, a histogram with a fitted normal distribution curve can be generated using IBM SPSS Statistics software to assess the data distribution. The computed skewness coefficient $C_S$ and kurtosis coefficient $C_E$ can then be used to verify whether the data follow a normal distribution.
Once normality is confirmed, the normrnd function in MATLAB can generate large numbers of normally distributed random variables based on the calculated mean and standard deviation.

2.2.4. Monte Carlo Simulation Results and Accuracy Analysis

Before compiling resource consumption standards, an accuracy analysis must be conducted on the simulated data to assess whether they accurately represent the actual consumption observations in engineering projects and whether extreme deviations exist. Since the simulated data are generated using a normal probability model, accuracy analysis can be performed using the sample mean and variance distribution theorem.
Let $x_1, x_2, x_3, \ldots, x_n$ be a random sample from a population $N(\mu, \sigma^2)$ with sample mean $\bar{x}$ and sample variance $s^2$. Then, $\bar{x}$ follows the normal distribution $N(\mu, \sigma^2/n)$, and its confidence interval can be constructed as follows:
(1) When the population variance $\sigma^2$ is known, the confidence interval can be constructed using the standardized variable $Z$, as shown in Equation (12):

$$Z = \frac{\bar{x} - \mu}{\sqrt{\sigma^2 / n}} = \frac{\bar{x} - \mu}{\sigma / \sqrt{n}} \sim N(0, 1), \quad \mu = \bar{x} - \frac{\sigma}{\sqrt{n}} Z \quad (12)$$

Given a significance level $\alpha$, the $1 - \alpha$ confidence interval for $\mu$ can be obtained by substituting the $z$-value from the standard normal table, as shown in Equation (13):

$$\left( \bar{x} - \frac{\sigma}{\sqrt{n}} z_{\alpha/2}, \; \bar{x} + \frac{\sigma}{\sqrt{n}} z_{\alpha/2} \right) \quad (13)$$
(2) When the population variance $\sigma^2$ is unknown, since the sample variance $s^2$ is an unbiased estimator of $\sigma^2$, the t-distribution can be used to construct the confidence interval, as shown in Equation (14):

$$T = \frac{\bar{x} - \mu}{\sqrt{s^2 / n}} = \frac{\bar{x} - \mu}{s / \sqrt{n}} \sim t(n - 1), \quad \mu = \bar{x} - \frac{s}{\sqrt{n}} T \quad (14)$$

Given a significance level $\alpha$, the $1 - \alpha$ confidence interval for $\mu$ can be obtained by substituting the $t$-value from the t-distribution table, as shown in Equation (15):

$$\left( \bar{x} - \frac{s}{\sqrt{n}} t_{\alpha/2}(n-1), \; \bar{x} + \frac{s}{\sqrt{n}} t_{\alpha/2}(n-1) \right) \quad (15)$$
Since the population variance $\sigma^2$ of the software-simulated data is unknown, the second method is adopted for confidence interval calculation. Additionally, to evaluate model accuracy, the absolute error $\beta$ and relative error $\gamma$ are introduced to assess the precision of the simulation results, with the computational formulas given in Equation (16):

$$\beta = \frac{s}{\sqrt{n}} t_{\alpha/2}(n-1), \quad \gamma = \frac{\beta}{\bar{x}} = \frac{s \, t_{\alpha/2}(n-1)}{\bar{x} \sqrt{n}} \quad (16)$$
If the mean of the simulated data falls within the confidence interval, the data are considered reasonable and can be used for the compilation of resource consumption standards. Typically, a confidence level of 95% is chosen, corresponding to a significance level of 0.05.
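A short MATLAB sketch of this accuracy analysis, assuming r is the vector of simulated consumption values (illustrative, using the quantities from Equations (14) to (16)), is given below:

n = numel(r);
alpha = 0.05; % 95% confidence level
xbar = mean(r); s = std(r);
t = tinv(1 - alpha/2, n - 1); % t_{alpha/2}(n-1); about 1.96 for n = 100,000
beta = s / sqrt(n) * t; % absolute error, Equation (16)
gamma = beta / xbar; % relative error, Equation (16)
ci = [xbar - beta, xbar + beta]; % 1-alpha confidence interval for the mean
fprintf('Mean %.4f, 95%% CI [%.4f, %.4f], relative error %.4f%%\n', ...
    xbar, ci(1), ci(2), gamma*100);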

2.3. Validation Analysis of Standardized Cost Benchmarks for Expressway Electromechanical Systems

Accurate, standardized cost benchmarks in highway electromechanical projects critically influence budget rationality and economic viability. This study establishes a dual verification mechanism through comparative analysis with national construction cost standards and real-time market prices, ensuring the developed benchmarks accurately capture resource expenditure patterns. The methodology enhances cost estimation reliability while providing operational guidance for engineering economics.

2.3.1. Compliance with the Price of National Construction Standards

National construction cost standards serve as authoritative references for engineering budgeting. We systematically evaluate their technical congruence and measurement rationality by comparing project-specific unit cost benchmarks with these standardized references. The key verification identifies deviations from prescribed labor–material–equipment ratios, particularly analyzing equipment installation person-hour benchmarks and power consumption metrics against national indices.

2.3.2. Compliance with Market Price

Market prices refer to the actual transaction costs of labor, materials, and equipment within a specific period, reflecting resource costs under current market conditions. Comparing standardized benchmarks with market prices validates their applicability in the current economic environment, ensuring they accurately reflect actual market fluctuations. The deviation between standardized benchmarks and market prices should be maintained within reasonable limits. Excessive deviations may indicate a disconnect between cost standardization and actual market conditions, failing to represent real resource consumption standards accurately. Through market price comparison analysis, standardized benchmarks can be dynamically adjusted to adapt to market changes, ensuring the rationality and timeliness of cost standardization.
The validation analysis of standardized cost benchmarks, through comparison with national standards and market prices, provides a comprehensive assessment of their rationality, standardization, and market adaptability. Through comparisons with national standards, the benchmarks’ compliance with industry norms is ensured; through market price comparison, their practical applicability is verified. Combining these two approaches facilitates the development of more accurate and realistic cost standards, providing reliable budgetary references for engineering projects.

3. Results and Discussion

3.1. Data Sources

The conceptual framework and foundational data of this study originate from the collaborative scientific research project Research on Parameter and Price Suitability of Highway Electromechanical Equipment in Xinjiang Based on Environmental Characteristics, jointly conducted by Xinjiang University and Xinjiang Transportation Investment (Group) Co., Ltd., Urumqi, China. Representative construction projects significantly influenced by environmental factors were selected for analysis, considering Xinjiang’s unique geographical location, climate, and environmental conditions. Through field investigations and data collection, manual barrier gates were chosen as the case study object due to their pronounced variability under different environmental conditions, mainly influenced by climate, geography, and usage frequency. This equipment demonstrated strong representativeness, making it a focal point of the research.

3.2. Results of National Construction Standards for Resource Consumption Standards

The installation process of manual barrier gates involves four sequential steps: equipment positioning → foundation cleaning → column installation → barrier arm assembly. The workflow is streamlined as follows.
Determine the installation location via construction drawings → clean foundation surface → mark and install expansion bolts → fix columns with horizontal level calibration → install barrier arms, support arms, and prohibition signs → final site cleanup.
The resource consumption standards for manual barrier gate installation, as stipulated in the Highway Engineering Budget Standards, are detailed in Table 2.

3.3. Results of Data Collection

Raw data were obtained from on-site work team records for manual barrier gate installations on the S20 and S21 Expressways. The workday method was employed to measure labor and machinery consumption, yielding 40 datasets (Table 3).
Resource consumption standard measurements are often limited by small sample sizes, making Monte Carlo simulation an effective method for quantifying uncertainty in resource use. By repeatedly resampling the empirical distribution to generate a large ensemble of pseudo-observations, this approach accurately estimates both the expected value and the dispersion of resource consumption standards. In accordance with the law of large numbers, the simulated results converge toward the true underlying distribution, thereby diminishing the influence of sampling variability inherent in sparse data. Although rare, extreme events may remain underrepresented; Monte Carlo simulation, nonetheless, provides robust estimates of average consumption and its variability.

3.4. Results of Outlier Identification and Elimination

Systematic error checks, gross error checks, and the elimination of outliers with extreme deviation were conducted on the 40 measured sets of sample data; the analysis results are shown in Table 4.
The results confirm no systematic errors, gross errors, or extreme deviations, validating the dataset for subsequent resource consumption standard compilation.

3.5. Results of Data Simulation and Distribution Analysis

(1) Histogram and normal distribution curve
For the measured sample data, IBM SPSS Statistics was used to draw histograms for labor, expansion bolts, other materials, the 3t electric cart, and small tools, with a normal distribution curve added to each histogram, as shown in Figure 3. Software calculations give normal distribution means $\mu$ for labor, expansion bolts, other materials, the 3t electric cart, and small tools of 1.079, 6.018, 1.905, 0.197, and 11.410, respectively, with standard deviations $\sigma$ of 0.285, 0.968, 0.812, 0.096, and 1.223.
As can be seen from Figure 3, the original data show good symmetry, and the histograms conform to the normal distribution. However, the data are not strongly concentrated, so simulated data must be generated from the fitted distributions to further improve accuracy.
(2) Normality test
Based on practical experience, measured data of this kind generally follow a normal or near-normal distribution, and the histograms suggest normality. IBM SPSS Statistics 26 was therefore used to calculate the kurtosis and skewness coefficients of the data to verify their normality. The verification results are shown in Table 5.
When the number of measurements is $n = 40$ and the confidence level is $P = 95\%$, table lookup gives the asymmetry test critical value $T = 0.59$ and the kurtosis test critical range $(Q_1, Q_2) = (2.01, 4.06)$. The skewness coefficients are all less than the critical value, and the kurtosis coefficients fall within the critical range. Therefore, the data pass the normality test and follow a normal distribution.
(3) Probability distribution model
The means and standard deviations of the normal distribution curves for labor, expansion bolts, other materials, the 3t electric cart, and small tools were substituted into the probability distribution function model, thereby establishing the probability distribution characteristics for each resource item.
(4) Empirical validation and model applicability
The initial validation of the model's practical applicability and distributional assumptions commenced with a statistical analysis of historical bid prices for highway electromechanical equipment. The empirical distribution exhibited close alignment with normal distribution parameters. The subsequent collection of composite bid prices encompassing labor and material costs from five contemporary highway projects enabled comparative mapping against Monte Carlo-simulated cost distributions derived from the sample data in Table 3. All observed bid values resided within 95% confidence intervals, calculated as the mean plus or minus 1.96 times the standard deviation of their respective simulated distributions. This robust empirical correspondence substantiates two critical findings: first, the appropriateness of the normal distribution for equipment price modeling, and second, the framework's capacity to incorporate alternative distributions (e.g., lognormal, gamma) when confronting skewed or bounded cost data. Collectively, these results confirm the Monte Carlo methodology's dual strengths in distributional flexibility and predictive accuracy for infrastructure cost estimation under real-world variability conditions.
(5) Data simulation and accuracy analysis
MATLAB was used to perform data simulation based on the probability distribution function model, generating 500, 1000, 3000, and 5000 simulated data points in sequence, with histograms created for each.
The normal distribution means for labor, expansion bolts, materials, electric cart shifts, and small tools usage fees are 1.079, 6.018, 1.905, 0.197, and 11.410, respectively, with standard deviations of 0.285, 0.968, 0.812, 0.096, and 1.223. For example, to simulate the generation of 100,000 consumptions, the MATLAB program commands are as follows:
r = normrnd(1.079, 0.285, [1, 100000]); % Generate 100,000 normally distributed random numbers
mean_value = mean(r); % Calculate the mean
std_deviation = std(r); % Calculate the standard deviation
fprintf('Mean: %.4f\n', mean_value); % Output the mean
fprintf('Standard Deviation: %.4f\n', std_deviation); % Output the standard deviation
figure;
histfit(r, 1000, 'normal'); % Draw a frequency histogram with a fitted normal density curve
title('Frequency Graph with Probability Density Curve'); % Title
xlabel('Labor/Workday'); % x-axis label
ylabel('Frequency'); % y-axis label
legend('Frequency', 'Normal Distribution Fit'); % Legend
After calculation, the mean of the labor consumption generated via simulation with 100,000 samples is $\mu = 1.0793$, with a standard deviation of $\sigma = 0.2844$. The frequency graph and probability density curve of the manual consumption are shown in Figure 4d. Additionally, MATLAB was used to analyze the histograms and probability densities for 3000, 5000, 10,000, and 100,000 simulated data points. As shown in Figure 4, as the number of simulated data points increases, the error decreases and the distribution more closely approaches a normal distribution.

3.6. Results of Resource Consumption Standard Determination

To ensure a certain level of accuracy in the simulated data, error and precision analysis was conducted on the 100,000 generated simulated data points using the method described above. According to the t-distribution table, $t_{0.025}(100{,}000 - 1) = 1.96$. Substituting these data into the formulas yields the results shown in Table 6.
As shown in the table, the means of simulated consumption data generated by the software all lie within the confidence intervals, exhibit strong normality, and demonstrate minimal absolute and relative errors. These results meet the precision requirements for resource consumption standard compilation, validating the feasibility of formulating resource consumption standards.

3.7. Results of Standard Validation

The simulated construction resource consumption standards can be derived based on the simulated data. The amplitude difference between the construction and budget standards must be considered when calculating the budget resource consumption standards. The budget standards are calculated as budget standards = construction standards × amplitude difference coefficient. The amplitude difference coefficient is adopted from the Highway Engineering Budget Standards, and the calculated budget resource consumption standards are detailed in Table 7.
(1) Comparison with national construction standards
The budget standards were compared with the “Highway Engineering Budget Standards” issued by the Ministry of Transport of the People’s Republic of China (JTG/T 3832-2018) [33]. Deviations for labor, expansion bolts, and small tools were 1.03%, 1.62%, and 4.00%, respectively, all within 5% (Table 7).
It is noted that the deviations for most resource items fall within the acceptable range of 5%, whereas the deviation for the 3t electric cart is observed at 6.79%, slightly surpassing this threshold. This elevated deviation is primarily attributed to the inherent variability in the operational patterns of the electric cart. Unlike fixed consumables such as bolts or labor, the performance of electric carts is subject to several dynamic factors—including transportation distance at the construction site, prevailing terrain conditions, trip frequency, and operator practices. In the context of highway electromechanical installations, certain project sites may necessitate prolonged travel distances or repeated equipment movements, thereby resulting in disproportionately higher usage durations. Consequently, the variability in the number of shifts required for the electric cart is greater, leading to a higher relative deviation. This phenomenon reflects the intrinsic operational variability of transportation equipment and should not be construed as a flaw in the simulation process or in the formulation of the standard.
(2) Comparison with Market Prices
Market price data (21 samples) from bidding documents were analyzed using Grubbs’ test, identifying one outlier. After exclusion, the mean market price was CNY 201.20, with a 0.6% deviation from the budget standards (Table 8).
In addition to bidding document analysis, the proposed budget standards were validated against actual bid prices collected from multiple highway electromechanical projects between 2020 and 2024. These bid prices are updated annually and reflect the actual expenditures incurred in completed projects. Calculations indicate that deviations between the budget standards, national construction standards, and market prices remain below 3 percent, which confirms the overall rationality of the proposed resource consumption standards. Specifically, the average bid price for manual barrier gate installations was CNY 201.20, which represents a deviation of only 0.6 percent from the modeled budget standard of CNY 200. This close correspondence demonstrates the practical reliability of the adaptive Monte Carlo framework in capturing market variability beyond static national norms. In addition, through continuous comparison and annual calibration based on inquiry prices collected from a wide range of tender documents, the simulated standards have been refined to remain consistent with actual pricing trends and to meet practical cost control requirements.

4. Conclusions

This study employed the Monte Carlo method to establish a probabilistic model for estimating resource consumption standards in highway electromechanical equipment, using manual barrier gates as a case study. The methodology systematically integrated raw data collection, error classification and correction, probabilistic distribution modeling, and simulation-based validation. The results demonstrate that the Monte Carlo approach effectively captures the stochastic nature of construction resource consumption standards, providing a robust framework for developing standardized cost benchmarks.
The findings indicate that labor, material, and machinery consumption in installing highway electromechanical equipment exhibit significant variability due to environmental and operational factors. Using manual barrier gates as an example, this study quantified resource consumption standard uncertainties through probabilistic distribution models and validated the estimates against national construction standards and real-time market prices. The comparative analysis confirms that the resource consumption values derived from probabilistic methods align well with industry standards while accommodating site-specific variations.
Furthermore, validation against national construction pricing and real-time market conditions underscores the reliability and applicability of the proposed methodology. The results suggest that integrating stochastic modeling techniques into resource consumption standard estimation can enhance accuracy, reduce cost uncertainties, and optimize engineering budget allocations. Additionally, this study emphasizes the necessity of periodically updating resource consumption standards to reflect dynamic market conditions and technological advancements.
Future research should explore extending this methodology to a broader range of highway electromechanical equipment, incorporating additional variables to improve predictive accuracy. Moreover, integrating advanced machine learning techniques with Monte Carlo simulations could provide deeper insights into the complex interactions affecting resource consumption standards. By adopting such probabilistic approaches, the engineering community can improve the precision and adaptability of resource consumption standards, ultimately contributing to more efficient and cost-effective infrastructure projects.

Author Contributions

L.L.: conceptualization, data curation, formal analysis, methodology, and writing—original draft. W.T.: investigation, software, validation, visualization, and writing—original draft. X.D.: conceptualization, data curation, formal analysis, methodology, investigation, and writing—review and editing. L.S.: methodology, supervision, writing—review and editing, and funding acquisition. All authors have read and agreed to the published version of the manuscript.

Funding

This study was jointly funded by the Key R&D Projects of Xinjiang Communications Investment Group Co., Ltd. (XJJTZKX-FWCG-202401-0044), Xinjiang Key R&D Program Projects (2022B03033-1), and the project of Dr. Tianchi in Xinjiang Uygur Autonomous Region.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data will be made available on request.

Conflicts of Interest

Authors Xiaomin Dai and Liang Song are employed by Xinjiang Key Laboratory of Green Construction and Maintenance of Transportation Infrastructure and Intelligent Traffic Control. Author Wei Tian is employed by Xinjiang Transportation Investment (Group) Co., Ltd., Engineering Technology and Transportation Industry in Arid Desert Areas, and Xinjiang Transportation Investment (Group) Co., Ltd. Author Linxuan Liu declares no financial or non-financial conflicts of interest. The remaining authors certify that no commercial, personal, or professional relationships influenced the research design, data interpretation, or conclusions of this study. All funding sources are disclosed in the Funding section.

References

  1. Cao, S.; Xu, H.; Xu, Y.; Wang, X.; Zheng, Y.; Li, Y. Assessment of the integrated benefits of highway infrastructure and analysis of the spatiotemporal variation: Evidence from 29 provinces in China. Socio-Econ. Plan. Sci. 2023, 90, 101740. [Google Scholar] [CrossRef]
  2. Croce, A.I.; Musolino, G.; Rindone, C.; Vitetta, A. Traffic and Energy Consumption Modelling of Electric Vehicles: Parameter Updating from Floating and Probe Vehicle Data. Energies 2021, 15, 82. [Google Scholar] [CrossRef]
  3. Shi, R.; Gao, Y.; Ning, J.; Tang, K.; Jia, L. Research on Highway Self-Consistent Energy System Planning with Uncertain Wind and Photovoltaic Power Output. Sustainability 2023, 15, 3166. [Google Scholar] [CrossRef]
  4. Ishii, K. A production research to create new value of business output. Int. J. Prod. Res. 2013, 51, 7313–7328. [Google Scholar] [CrossRef]
  5. Aft, L. Work Measurement in the Computer Age. 2024. Available online: https://www.semanticscholar.org/paper/WORK-MEASUREMENT-IN-THE-COMPUTER-AGE-Aft/4b31df04dad492cdeec2c8af6382634dd5f0d389 (accessed on 24 September 2024).
  6. Uddin, N.; Hossain, F. Evolution of Modern Management through Taylorism: An Adjustment of Scientific Management Comprising Behavioral Science. Procedia Comput. Sci. 2015, 62, 578–584. [Google Scholar] [CrossRef]
  7. Vigoya, L.; Fernandez, D.; Carneiro, V.; Cacheda, F. Annotated Dataset for Anomaly Detection in a Data Center with IoT Sensors. Sensors 2020, 20, 3745. [Google Scholar] [CrossRef]
  8. Zhang, R.; Li, J.L. The application of activity-based costing in the cost calculation of thermal-power enterprise. Therm. Sci. 2021, 25, 933–939. [Google Scholar] [CrossRef]
  9. Homburg, C. A note on optimal cost driver selection in ABC. Manag. Account. Res. 2001, 12, 197–205. [Google Scholar] [CrossRef]
  10. Relph, G.; Barrar, P. Overage inventory—How does it occur and why is it important? Int. J. Prod. Econ. 2003, 81–82, 163–171. [Google Scholar] [CrossRef]
  11. Kim, Y.W.; Han, S.H.; Yi, J.S.; Chang, S. Supply chain cost model for prefabricated building material based on time-driven activity-based costing. Can. J. Civ. Eng. 2016, 43, 287–293. [Google Scholar] [CrossRef]
  12. Wang, C.; Mahmood, H.; Khalid, S. Examining the impact of globalization and natural resources on environmental sustainability in G20 countries. Sci. Rep. 2024, 14, 30921. [Google Scholar] [CrossRef] [PubMed]
  13. Proverbs, D.; Holt, G.; Olomolaiye, P. A method for estimating labour requirements and costs for international construction projects at inception. Build. Environ. 1998, 34, 43–48. [Google Scholar] [CrossRef]
  14. Wong, J.M.W.; Chan, A.P.C.; Chiang, Y.H. Modeling and Forecasting Construction Labor Demand: Multivariate Analysis. J. Constr. Eng. Manag. 2008, 134, 664–672. [Google Scholar] [CrossRef]
  15. Sing, C.P.; Love, P.E.D.; Tam, C.M. Multiplier Model for Forecasting Manpower Demand. J. Constr. Eng. Manag. 2012, 138, 1161–1168. [Google Scholar] [CrossRef]
  16. Peng, J.; Jiang, B.; Chen, H.; Liang, S.; Liang, H.; Li, S.; Han, J.; Liu, Q.; Cheng, J.; Yao, Y.; et al. A New Empirical Estimation Scheme for Daily Net Radiation at the Ocean Surface. Remote Sens. 2021, 13, 4170. [Google Scholar] [CrossRef]
  17. Guo, C.; Lu, M.; Wei, W. An improved LDA topic modeling method based on partition for medium and long texts. Ann. Data Sci. 2021, 8, 331–344. [Google Scholar] [CrossRef]
  18. Figuera, P.; García Bringas, P. Revisiting probabilistic latent semantic analysis: Extensions, challenges and insights. Technologies 2024, 12, 5. [Google Scholar] [CrossRef]
  19. Zhang, H.; Deng, J.; Xu, Y.; Deng, Y.; Lin, J.R. An Adaptive Pedestrian Flow Prediction Model Based on First-Order Differential Error Adjustment and Hidden Markov Model. Buildings 2025, 15, 902. [Google Scholar] [CrossRef]
  20. Wang, M.; Li, X.; Chen, L.; Chen, H.; Chen, C.; Liu, M. A deep-based Gaussian mixture model algorithm for large-scale many objective optimization. Appl. Soft Comput. 2025, 172, 112874. [Google Scholar] [CrossRef]
  21. Mahajan, A.; Das, S.; Su, W.; Bui, V.H. Bayesian-Neural-Network-Based Approach for Probabilistic Prediction of Building-Energy Demands. Sustainability 2024, 16, 9943. [Google Scholar] [CrossRef]
  22. Hasni, M.; Babai, M.; Aguir, M.; Jemai, Z. An investigation on bootstrapping forecasting methods for intermittent demands. Int. J. Prod. Econ. 2019, 209, 20–29. [Google Scholar] [CrossRef]
  23. Dao, D.V.; Adeli, H.; Ly, H.B.; Le, L.M.; Le, V.M.; Le, T.T.; Pham, B.T. A sensitivity and robustness analysis of GPR and ANN for high-performance concrete compressive strength prediction using a Monte Carlo simulation. Sustainability 2020, 12, 830. [Google Scholar] [CrossRef]
  24. Yang, C.; Kumar, M. On the effectiveness of Monte Carlo for initial uncertainty forecasting in nonlinear dynamical systems. Automatica 2018, 87, 301–309. [Google Scholar] [CrossRef]
  25. Tamang, N.; Sun, Y. Application of the dynamic Monte Carlo method to pedestrian evacuation dynamics. Appl. Math. Comput. 2023, 445, 127876. [Google Scholar] [CrossRef]
  26. Yu, S.; Wu, W.; Naess, A. Extreme value prediction with modified Enhanced Monte Carlo method based on tail index correction. J. Sea Res. 2023, 192, 102354. [Google Scholar] [CrossRef]
  27. Darmanto, T.; Tjen, J.; Hoendarto, G. Monte Carlo Simulation-Based Regression Tree Algorithm for Predicting Energy Consumption from Scarce Dataset. J. Data Sci. Intell. Syst. 2024, 1–9. [Google Scholar] [CrossRef]
  28. Maia, A.G.; Camargo-Valero, M.A.; Trigg, M.A.; Khan, A. Uncertainty and Sensitivity Analysis in Reservoir Modeling: A Monte Carlo Simulation Approach. Water Resour. Manag. 2024, 38, 2835–2850. [Google Scholar] [CrossRef]
  29. Mao, J.; Wang, H.; Li, J. Bayesian Finite Element Model Updating of a Long-Span Suspension Bridge Utilizing Hybrid Monte Carlo Simulation and Kriging Predictor. KSCE J. Civ. Eng. 2020, 24, 569–579. [Google Scholar] [CrossRef]
  30. Xiao, C.; Mei, J.; Müller, M. Memory-augmented Monte Carlo Tree Search. In Proceedings of the Thirty-Second AAAI Conference on Artificial Intelligence and Thirtieth Innovative Applications of Artificial Intelligence Conference and Eighth AAAI Symposium on Educational Advances in Artificial Intelligence, New Orleans, LA, USA, 2–7 February 2018; pp. 1455–1461. [Google Scholar]
  31. Xing, Y.; Li, F.; Sun, K.; Wang, D.; Chen, T.; Zhang, Z. Multi-type electric vehicle load prediction based on Monte Carlo simulation. Energy Rep. 2022, 8, 966–972. [Google Scholar] [CrossRef]
  32. Eraslan, E.; İç, Y.T. An improved decision support system for ABC inventory classification. Evol. Syst. 2019, 11, 683–696. [Google Scholar] [CrossRef]
  33. JTG/T 3832-2018; Highway Engineering Budget Standards. Ministry of Transport of the People’s Republic of China: Beijing, China, 2018.
Figure 1. Schematic of the Monte Carlo method.
Figure 2. Workflow of resource consumption standard development.
Figure 3. Histograms of resource consumption with standard distribution curves. (a) Histogram of labor; (b) Histogram of expansion bolts; (c) Histogram of other materials; (d) Histogram of 3t electric cart; (e) Histogram of small tools.
Figure 4. Histograms of simulated resource consumption with standard distribution curves. (a) Histogram of 3000 simulated data. (b) Histogram of 5000 simulated data. (c) Histogram of 10,000 simulated data. (d) Histogram of 100,000 simulated data.
Table 1. Common probability distribution models.

| No. | Function Name | Probability Model | Characteristics |
|---|---|---|---|
| 1 | Uniform distribution | $f(x) = \frac{1}{b-a}$ for $a \le x \le b$; $0$ for $x < a$ or $x > b$ | The probability density is constant over the interval [a, b], so any two values within the range are equally likely. Suitable for variables with a known, limited range or cases where the possible values are unclear. |
| 2 | Normal distribution | $f(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^2}$, $-\infty < x < \infty$ | Also called the Gaussian distribution; forms a bell-shaped curve with higher probability in the middle and lower at both ends. A typical distribution pattern in natural phenomena. |
| 3 | Triangular distribution | $f(x) = \frac{2(x-a)}{(b-a)(c-a)}$ for $a \le x < c$; $\frac{2(b-x)}{(b-a)(b-c)}$ for $c \le x \le b$; $0$ for $x < a$ or $x > b$ | A continuous distribution with a triangular shape, increasing from the minimum $a$ to the mode $c$ and then decreasing to the maximum $b$. The mode determines the apex of the triangle. |
| 4 | Step distribution | $f(x) = p_i$ for $x_i < x \le x_{i+1}$, $i = 1, \dots, n$; $0$ otherwise | Consists of several intervals; the probability density remains constant within each interval and is zero elsewhere. |
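To make the role of these models in the simulation concrete, the sketch below draws pseudo-observations from each of the four distributions with NumPy. It is a minimal illustration rather than the study's implementation: the interval bounds, mode, and interval masses are placeholder values, while the normal parameters reuse the labor figures (mean 1.08 workdays, SD 0.29) reported in the Abstract.

```python
import numpy as np

rng = np.random.default_rng(seed=42)
n = 100_000  # simulation size used in the study

# Uniform on [a, b]: constant density 1 / (b - a)
uniform_draws = rng.uniform(low=0.5, high=1.7, size=n)

# Normal with mean mu and standard deviation sigma
normal_draws = rng.normal(loc=1.08, scale=0.29, size=n)

# Triangular with minimum a, mode c, maximum b
triangular_draws = rng.triangular(left=0.5, mode=1.1, right=1.7, size=n)

# Step (piecewise-constant) density: choose an interval with probability
# equal to its mass, then draw uniformly within that interval.
edges = np.array([0.5, 0.8, 1.1, 1.4, 1.7])  # interval boundaries x_i
mass = np.array([0.10, 0.40, 0.35, 0.15])    # probability mass of each interval
idx = rng.choice(len(mass), size=n, p=mass)
step_draws = rng.uniform(edges[idx], edges[idx + 1])
```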
Table 2. National construction standards for manual barrier gate installation.

| Item | Labor (Workdays) | Expansion Bolts (Sets) | Other Materials (CNY) | 3t Electric Cart (Shift) | Small Tools (CNY) |
|---|---|---|---|---|---|
| Manual barrier gate | 1.1 | 6.1 | 1.9 | 0.19 | 11.3 |
Table 3. Summary of labor, material, and machinery consumption for manual barrier gates.

| Item | Labor | Expansion Bolts | Other Materials | 3t Electric Cart | Small Tools | Item | Labor | Expansion Bolts | Other Materials | 3t Electric Cart | Small Tools |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 1 | 0.56 | 4.22 | 0.33 | 0.02 | 9.30 | 21 | 1.06 | 5.97 | 1.80 | 0.20 | 11.26 |
| 2 | 0.60 | 4.40 | 0.53 | 0.03 | 9.44 | 22 | 1.09 | 5.98 | 1.85 | 0.21 | 11.35 |
| 3 | 0.62 | 4.58 | 0.65 | 0.05 | 9.66 | 23 | 1.10 | 6.08 | 1.95 | 0.21 | 11.38 |
| 4 | 0.70 | 4.65 | 0.68 | 0.05 | 9.88 | 24 | 1.11 | 6.14 | 2.03 | 0.22 | 11.44 |
| 5 | 0.71 | 4.77 | 0.80 | 0.07 | 10.05 | 25 | 1.16 | 6.20 | 2.08 | 0.23 | 11.52 |
| 6 | 0.78 | 4.80 | 0.85 | 0.09 | 10.16 | 26 | 1.19 | 6.30 | 2.14 | 0.23 | 11.63 |
| 7 | 0.79 | 4.93 | 1.05 | 0.10 | 10.22 | 27 | 1.20 | 6.34 | 2.21 | 0.24 | 11.70 |
| 8 | 0.80 | 5.02 | 1.24 | 0.11 | 10.39 | 28 | 1.22 | 6.40 | 2.28 | 0.25 | 11.80 |
| 9 | 0.85 | 5.11 | 1.37 | 0.12 | 10.45 | 29 | 1.24 | 6.53 | 2.30 | 0.26 | 11.95 |
| 10 | 0.88 | 5.28 | 1.40 | 0.12 | 10.57 | 30 | 1.27 | 6.61 | 2.42 | 0.27 | 12.18 |
| 11 | 0.90 | 5.42 | 1.42 | 0.13 | 10.62 | 31 | 1.28 | 6.70 | 2.53 | 0.27 | 12.29 |
| 12 | 0.92 | 5.55 | 1.48 | 0.13 | 10.70 | 32 | 1.31 | 6.94 | 2.65 | 0.28 | 12.42 |
| 13 | 0.94 | 5.59 | 1.57 | 0.15 | 10.72 | 33 | 1.35 | 7.01 | 2.76 | 0.29 | 12.60 |
| 14 | 0.95 | 5.62 | 1.59 | 0.15 | 10.77 | 34 | 1.37 | 7.15 | 2.80 | 0.31 | 12.70 |
| 15 | 0.96 | 5.68 | 1.64 | 0.16 | 10.85 | 35 | 1.43 | 7.25 | 2.94 | 0.31 | 12.98 |
| 16 | 0.96 | 5.73 | 1.68 | 0.16 | 10.96 | 36 | 1.49 | 7.40 | 3.02 | 0.32 | 13.10 |
| 17 | 0.98 | 5.90 | 1.70 | 0.17 | 11.00 | 37 | 1.51 | 7.54 | 3.19 | 0.33 | 13.38 |
| 18 | 0.99 | 5.90 | 1.72 | 0.17 | 11.05 | 38 | 1.57 | 7.65 | 3.27 | 0.35 | 13.55 |
| 19 | 1.00 | 5.92 | 1.75 | 0.18 | 11.09 | 39 | 1.60 | 7.71 | 3.33 | 0.37 | 13.85 |
| 20 | 1.04 | 5.95 | 1.80 | 0.19 | 11.15 | 40 | 1.66 | 7.80 | 3.40 | 0.37 | 14.30 |
Table 4. Error testing and outlier elimination results.

| Item | Labor | Expansion Bolts | Other Materials | 3t Electric Cart | Small Tools |
|---|---|---|---|---|---|
| Bessel's standard deviation $\sigma_1$ | 0.285 | 0.968 | 0.812 | 0.096 | 1.224 |
| Bessel's standard deviation $\sigma_2$ | 0.296 | 0.979 | 0.825 | 0.101 | 1.232 |
| $\lvert u \rvert = \lvert \sigma_1/\sigma_2 - 1 \rvert$ | 0.034 | 0.011 | 0.017 | 0.050 | 0.007 |
| Systematic error judgment result | None | None | None | None | None |
| Upper Grubbs statistic $g_{(1)}$ | 1.817 | 1.857 | 1.940 | 1.849 | 1.817 |
| Lower Grubbs statistic $g_{(n)}$ | 1.967 | 1.821 | 1.811 | 1.723 | 1.967 |
| Grubbs' threshold $T(n, a)$ | 2.870 | 2.870 | 2.870 | 2.870 | 2.870 |
| Gross error detection | None | None | None | None | None |
| Maximum limit $L_{\max}$ | 1.849 | 8.524 | 4.054 | 0.442 | 14.910 |
| Minimum limit $L_{\min}$ | 0.309 | 3.512 | −0.244 | −0.048 | 7.910 |
| Outliers | None | None | None | None | None |
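The tests summarized in Table 4 are straightforward to script. The sketch below is a minimal reconstruction, not the study's code: the Grubbs statistics follow their standard definitions, while the second standard deviation estimate $\sigma_2$ used in the ratio test is passed in directly, since the estimator behind it (for example, Peters' formula or a sub-sample estimate) is not restated in this section.

```python
import numpy as np

def bessel_std(x):
    """Sample standard deviation with Bessel's correction (n - 1 denominator)."""
    return np.std(x, ddof=1)

def grubbs_statistics(x):
    """Grubbs statistics for the largest and smallest observations."""
    x = np.asarray(x, dtype=float)
    mean, sigma = x.mean(), bessel_std(x)
    return (x.max() - mean) / sigma, (mean - x.min()) / sigma

def systematic_error_u(sigma_1, sigma_2):
    """Ratio indicator |u| = |sigma_1 / sigma_2 - 1| for systematic error."""
    return abs(sigma_1 / sigma_2 - 1.0)

# Example with the labor column of Table 4:
u = systematic_error_u(0.285, 0.296)   # ~0.04, judged "None" (no systematic error)
gross = max(1.817, 1.967) > 2.870      # False: below Grubbs' threshold, no gross error
```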
Table 5. Skewness and kurtosis coefficients.

| Item | Labor | Expansion Bolts | Other Materials | 3t Electric Cart | Small Tools |
|---|---|---|---|---|---|
| Skewness coefficient | 0.054 | 0.098 | 0.052 | 0.002 | 0.590 |
| Kurtosis coefficient | 2.304 | 2.209 | 2.299 | 2.134 | 2.622 |
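The shape statistics in Table 5 are standard moment estimators; since kurtosis values near 3 indicate a normal-like shape, the paper evidently reports Pearson (non-excess) kurtosis. A minimal check with SciPy on the 40 labor observations from Table 3 (values should land close to the Table 5 entries, up to rounding):

```python
import numpy as np
from scipy import stats

# Labor consumption (workdays) for the 40 observations in Table 3
labor = np.array([
    0.56, 0.60, 0.62, 0.70, 0.71, 0.78, 0.79, 0.80, 0.85, 0.88,
    0.90, 0.92, 0.94, 0.95, 0.96, 0.96, 0.98, 0.99, 1.00, 1.04,
    1.06, 1.09, 1.10, 1.11, 1.16, 1.19, 1.20, 1.22, 1.24, 1.27,
    1.28, 1.31, 1.35, 1.37, 1.43, 1.49, 1.51, 1.57, 1.60, 1.66,
])

skewness = stats.skew(labor)                    # third standardized moment
kurtosis = stats.kurtosis(labor, fisher=False)  # Pearson convention: normal -> 3.0
print(f"skewness = {skewness:.3f}, kurtosis = {kurtosis:.3f}")
```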
Table 6. Accuracy analysis of simulated data (n = 100,000).

| Item | Labor | Expansion Bolts | Other Materials | 3t Electric Cart | Small Tools |
|---|---|---|---|---|---|
| Mean | 1.079 | 6.018 | 1.905 | 0.197 | 11.410 |
| Standard deviation | 0.285 | 0.968 | 0.812 | 0.096 | 1.223 |
| $t_{0.025}$ | 1.96 | 1.96 | 1.96 | 1.96 | 1.96 |
| Minimum limit | 1.077 | 5.999 | 1.889 | 0.195 | 11.386 |
| Maximum limit | 1.081 | 6.037 | 1.921 | 0.199 | 11.434 |
| Absolute error | $1.77 \times 10^{-3}$ | 0.019 | 0.015 | $1.88 \times 10^{-3}$ | 0.024 |
| Relative error | $1.64 \times 10^{-3}$ | $3.15 \times 10^{-3}$ | $8.35 \times 10^{-3}$ | $9.55 \times 10^{-3}$ | $2.10 \times 10^{-3}$ |
| Accuracy compliance | Compliant | Compliant | Compliant | Compliant | Compliant |
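A sketch of this accuracy check for the labor item, assuming for illustration a normal sampling model with the Table 6 mean and standard deviation (the study fits item-specific distributions, so the distribution family and seed here are stand-ins):

```python
import numpy as np

rng = np.random.default_rng(seed=0)
n = 100_000

# Simulated labor consumption (workdays)
draws = rng.normal(loc=1.079, scale=0.285, size=n)

mean = draws.mean()
std = draws.std(ddof=1)
half_width = 1.96 * std / np.sqrt(n)  # 95% confidence half-width (t_0.025 ~ z at this n)
rel_error = half_width / mean         # relative error, cf. Table 6

print(f"mean = {mean:.3f}, limits = [{mean - half_width:.3f}, {mean + half_width:.3f}], "
      f"relative error = {rel_error:.2e}")
```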
Table 7. Comparison between budget standards and national standards.

| Item | National Standards | Budget Standards | Deviation (%) |
|---|---|---|---|
| Labor (workdays) | 1.1 | 1.111 | 1.03 |
| Expansion bolts (sets) | 6.1 | 6.199 | 1.62 |
| Other materials (CNY) | 1.9 | 1.962 | 3.27 |
| 3t electric cart (shift) | 0.19 | 0.203 | 6.79 |
| Small tools (CNY) | 11.3 | 11.752 | 4.00 |
| Price (CNY) | 205 | 200 | −2.44 |
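Each deviation in Table 7 is the relative difference between the proposed budget standard and the national standard, Deviation (%) = (Budget − National) / National × 100; for expansion bolts, for instance, (6.199 − 6.1) / 6.1 × 100 ≈ 1.62%.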
Table 8. Comparison of budget standards, national standards, and market prices.

| Item | National Standards (CNY) | Budget Standards (CNY) | Market Price (CNY) |
|---|---|---|---|
| Base price | 205 | 200 | 201.20 |
| Deviation (%) | −2.44 | – | −0.60 |