Article

Risk Assessment Models to Improve Environmental Safety in the Field of the Economy and Organization of Construction: A Case Study of Russia

by Arkadiy Larionov 1, Ekaterina Nezhnikova 2 and Elena Smirnova 3,*
1 Department of Economics and Management in Construction, Moscow State University of Civil Engineering, 129337 Moscow, Russia
2 Department of National Economy, Peoples’ Friendship University of Russia (RUDN University), 117198 Moscow, Russia
3 Department of Technosphere Safety, Saint Petersburg State University of Architecture and Civil Engineering, 190005 Saint Petersburg, Russia
* Author to whom correspondence should be addressed.
Sustainability 2021, 13(24), 13539; https://doi.org/10.3390/su132413539
Submission received: 14 October 2021 / Revised: 7 November 2021 / Accepted: 9 November 2021 / Published: 7 December 2021

Abstract: This article assesses risks in order to substantiate the economic and organizational efficiency of housing and industrial construction. The topic is relevant because environmental safety is a prerequisite for sustainable development. In Russia, environmental safety in construction, housing, and communal services is poorly developed and not regulated by the legal system. As building construction, housing, and communal services should be based on environmental safety, this area requires rapid development. Methods related to quantifying environmental risk and making decisions under conditions of uncertainty were studied. A quantitative risk assessment was performed using the Monte Carlo method for pessimistic and optimistic options to prevent environmental damage. The model reproduced the distribution derived from the evidence-based fit, and the results of a sensitivity analysis are presented to test the hypothesis. The most appropriate probability density functions for each of the input quantities were selected through settings in a computer program. The simulation results clearly illustrate the choice of the general principle of assessment and the adoption of the optimal decision. Under uncertainty, the decision to choose the optimistic options with higher cost (to maintain the reliability of the technical system) but less risk plays a decisive role in the future environmental safety strategies of construction projects. The Monte Carlo method is preferable for environmental impact assessments. In the future, the amended methodology can be applied to raise environmental safety in the field of construction.

1. Introduction

1.1. Objectives

As a rule, a quantitative study is required after a qualitative risk analysis carried out with tools such as brainstorming, the development of an industrial safety declaration, Delphi techniques, interviews, PESTLE, and SWOT, which can potentially and significantly affect the development of competing project requirements. Determining whether the corresponding level of risk is acceptable is the task of a stochastic estimator, such as the Monte Carlo method (see Appendix A).
The objects of this study are the factors that determine the environmental safety of construction (geographical, hydrological, and geological characteristics of the construction area as risk factors that cause emergencies and threats posed by objects, and technologies used during hazardous waste management). The research subjects are (optimistic and pessimistic) options for event development, defining the level of environmental safety, assessment methods, and a risk management scheme that provides an acceptable level of environmental safety. The main objectives of the study are as follows:
  • To develop scenarios for the development of risky situations, identify their parameters, evaluate each scenario and compare them;
  • To analyze the characteristics of each risk scenario;
  • To identify the most significant risks affecting the development of the scenarios under consideration;
  • To apply the objective methods of Monte Carlo assessment (logistic and lognormal distributions as risk analysis tools);
  • To conduct studies to form a complete algorithm and methods for use in environmental safety and construction.
The economic factors associated with unforeseen expenses are the main obstacles to improving the efficiency of housing and industrial construction. The essence of the problem lies in disproportionate increases in the cost of housing in comparison with the general level of prices for goods and wages. Hired workers can find it difficult to acquire suitable housing, and enterprises cannot provide efficient production indicators. What drives up housing prices and causes inefficient production? Production costs (including organizational and unforeseen costs resulting from risky and irrelevant decisions) are the reason. Therefore, the aim of this article is to substantiate a methodology that helps prevent various kinds of damage by mitigating the likelihood of their occurrence and protecting a certain type of investment. Risks and their consequences are associated with damage and can be foreseen and mitigated; risk management measures aim to reduce the rise in housing prices and increase construction efficiency. This circumstance is itself evidence of an increase in the economic features of construction. In other words, choosing the right risk solutions can help avoid investing in unforeseen damage. However, the authors did not aim to prove the economic and organizational efficiency of housing and industrial construction a priori; rather, they analyzed its applicability for calculating environmental risks in the construction sector to improve the environmental safety of this area.

1.2. The Use of the Monte Carlo Method

Using the Monte Carlo method to evaluate environmental risk is still quite rare. The main difficulty lies in creating independent samples from the target distribution. Kuang et al. applied the Monte Carlo method to the results of studying the concentration of heavy metals in sediments and marine organisms in order to evaluate environmental and public health risks [1]. The uncertainty they encountered was a lack of knowledge of the concentrations of toxic substances and toxic traces in the exposed population. Under these conditions, they turned to a probabilistic approximate solution using Monte Carlo modeling. The authors calculated the average concentrations of heavy metals in sediments and marine organisms, and the sequence of pollutants was established. The Monte Carlo simulation algorithm was as follows: (1) first, random variables were identified and selected for the estimated model, in class intervals for the construction of a frequency histogram; (2) next, a suitable distribution model for a random sample generated by the probability density function was chosen as the best representation of the sample; (3) then, a simulation with the number of iterations, n, was performed, and the results were interpreted. The weak point of such studies is that there is often no information about the sampling rate of the argument. The main difficulty for the authors was conducting an adequate statistical analysis. Although they used the properties of the lognormal distribution (bounded below by 0 with a skew to the right), they had to explain the “left” tail with a sharp exponential. The cumulative distribution function (CDF) plays an important role: it determines the likelihood that the measured value will be less than some threshold Z and that the measurement results will fall into a certain range. The authors later found it extremely difficult to interpret the relationships between the CDF and the characteristics of potential environmental risk. Such a work plan becomes relevant for assessing environmental risk.
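To make the three steps above concrete, the following minimal sketch (in Python, with invented sample values rather than data from the cited study) fits a lognormal model to a small sample and uses its CDF to estimate the probability that a measured value stays below a threshold Z:

```python
import numpy as np
from scipy import stats

# Hypothetical heavy-metal concentrations (mg/kg) -- illustrative only,
# not data from Kuang et al.
sample = np.array([0.8, 1.1, 0.9, 1.7, 2.3, 1.4, 0.7, 1.9, 1.2, 1.5])

# Step 2: fit a lognormal model (location fixed at 0, as the variable is positive)
shape, loc, scale = stats.lognorm.fit(sample, floc=0)

# Step 3: simulate n iterations from the fitted model
n = 100_000
simulated = stats.lognorm.rvs(shape, loc=loc, scale=scale, size=n)

# CDF: probability that a measured value is below some threshold Z
Z = 2.0
print("P(X < Z) from fitted CDF:", stats.lognorm.cdf(Z, shape, loc=loc, scale=scale))
print("P(X < Z) from simulation:", np.mean(simulated < Z))
```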
Pirsaheb et al. chose a lognormal distribution to analyze the concentrations of heavy metals in grain crops, since this distribution is often characteristic of ecotoxicological data and sets the characteristic sequence of critical (hazard) toxicant concentrations, which will be dangerous for the most sensitive types of cereals and ineffective for the rest [2]. The fact that the model itself is unknown or undetermined is a serious, widespread difficulty in risk analysis. The strategy is to determine the distribution having the greatest entropy, consistent with the existing knowledge of the restrictions on the possible values. Such an approach, without additional assumptions about the shape, allows users to select the input distribution in an optimal way using only limited information about the variables. For example, the available information indicates the following choices (a minimal selection rule is sketched after the list):
  • A uniform distribution if only the upper and lower bounds of the values are known;
  • An exponential distribution if only the lower bound and the mean are known;
  • A normal distribution if only the mean and standard deviation are known.
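A minimal selection rule along these lines might look as follows; this is a sketch only, and the simplified mapping from available information to distribution is the one listed above:

```python
from scipy import stats

def max_entropy_dist(lower=None, upper=None, mean=None, std=None):
    """Pick the maximum-entropy distribution consistent with the stated
    constraints (a simplified selection rule, assumed for illustration)."""
    if lower is not None and upper is not None:
        return stats.uniform(loc=lower, scale=upper - lower)
    if lower is not None and mean is not None:
        # exponential shifted to the lower bound, with the given mean
        return stats.expon(loc=lower, scale=mean - lower)
    if mean is not None and std is not None:
        return stats.norm(loc=mean, scale=std)
    raise ValueError("insufficient information to choose a distribution")

# Only bounds known -> uniform; lower bound and mean known -> exponential, etc.
print(max_entropy_dist(lower=0.0, upper=5.0).mean())    # 2.5
print(max_entropy_dist(lower=0.0, mean=1.3).mean())     # 1.3
print(max_entropy_dist(mean=10.0, std=2.0).std())       # 2.0
```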
Due to imperfect measurements and equipment, the bitstream generated by a quantum random number generator contains bias and correlation, two indications of non-randomness. The bias can be reduced: von Neumann’s method uses normalization techniques. Von Neumann points out that there are certain difficulties in implementing the Monte Carlo method [3]. As has been mentioned many times, there are no random numbers—there are only methods for obtaining random numbers. Any physically existing machine has a certain limit of accuracy. Since the average value of |f′(x)| on (0, 1) is equal to 2, in each transformation from xi to xi+1, any error will roughly double on average. In about 33 steps, the first round-off error will grow to about 10^10. No matter how random the sequence {xi} is in theory, after about 33 steps, only the random properties of the round-off error are being tested. The von Neumann procedure consists of dividing the sequence {xi} (i = 1, …, n) into pairs, discarding pairs of equal bits, and replacing each 10 pair with 0 and each 01 pair with 1. From n biased bits, this procedure extracts approximately npq unbiased bits. Von Neumann normalization without bias can both increase and decrease the (algorithmic) randomness of the generated sequences [4]. The method requires a source of uniformly distributed pseudo-random numbers. A commonly used algorithm for generating such numbers is von Neumann’s middle-square method. Here, an arbitrary n-digit integer is squared, and a new integer is formed by extracting the middle n digits from the product. This process is iterated to form a string of integers. Clearly, this chain of numbers will repeat at some point. D. H. Lehmer proposed a scheme based on the Kronecker–Weyl theorem, which generates all possible numbers of n digits before it repeats [5]. Using the Monte Carlo method, one can directly obtain the values of the first few points of the distribution or the first few coefficients: “…if one is interested in the value of U(f) where U is a functional like the above, and f satisfies a certain operator equation ψ(f) = 0, we can in many cases obtain an idea of the value of U(f) directly, without ‘knowing’ f at each point” [6].
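The two generator-related procedures mentioned here are easy to sketch; the following illustrative code (not the original implementations) shows von Neumann unbiasing of a biased bit stream and the middle-square generator:

```python
import random

def von_neumann_extract(bits):
    """Von Neumann unbiasing: split the bit stream into pairs, discard 00 and 11,
    map 10 -> 0 and 01 -> 1. From n biased bits roughly n*p*q unbiased bits remain."""
    out = []
    for a, b in zip(bits[::2], bits[1::2]):
        if a != b:
            out.append(0 if (a, b) == (1, 0) else 1)
    return out

def middle_square(seed, n_digits=4, count=5):
    """Von Neumann's middle-square generator: square the number and keep the
    middle n_digits digits; the sequence eventually enters a cycle."""
    x, out = seed, []
    for _ in range(count):
        sq = str(x * x).zfill(2 * n_digits)
        start = (len(sq) - n_digits) // 2
        x = int(sq[start:start + n_digits])
        out.append(x)
    return out

biased = [1 if random.random() < 0.7 else 0 for _ in range(1000)]  # p = 0.7, q = 0.3
print(len(von_neumann_extract(biased)))   # roughly 1000 * 0.7 * 0.3 = 210 bits
print(middle_square(5731))
```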
Gardiner et al. (1956) addressed the problem of extending coding procedures for computer calculation. There is an infinite number of lucky numbers, and they have many properties similar to prime numbers: their asymptotic density is equal to 1/ln n, that is, it coincides with the asymptotic density of the primes. They could serve as the basis of a random number generator. Remarkably, if a number is lucky, then all members of its sequence are also lucky. Starting with any positive integer (for example, 19), one can replace the number with the sum of the squares of its digits in the decimal notation system and repeat this process until the number either becomes 1 (where the whole process stops) or ends up in an infinite loop not containing 1 (19 → 82 → 68 → 100 → 1). Little is known about the distribution of lucky numbers. Some questions that have not yet been answered are as follows: What is the density of lucky numbers? Are there arbitrarily long sequences of consecutive lucky numbers? How large can the gaps be [7]?
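The digit-square iteration quoted above (19 → 82 → 68 → 100 → 1) can be reproduced with a few lines of illustrative code:

```python
def digit_square_chain(n):
    """Iterate n -> sum of the squares of its decimal digits until 1 is reached
    or a cycle repeats (the example 19 -> 82 -> 68 -> 100 -> 1 from the text)."""
    seen, chain = set(), [n]
    while n != 1 and n not in seen:
        seen.add(n)
        n = sum(int(d) ** 2 for d in str(n))
        chain.append(n)
    return chain

print(digit_square_chain(19))   # [19, 82, 68, 100, 1]
print(digit_square_chain(4))    # enters the infinite loop that does not contain 1
```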

1.3. EIA as an Assessment of the Technogenic Impact on the Environment

In this regard, the concept of risk should be discussed. Environmental legislation requires implementing environmentally sound decisions related to economic and other activities by eliminating possible adverse impacts and environmental consequences and developing measures to reduce and prevent disasters and emergencies.
A number of laws, standards, and guidelines for environmental safety have been established.
The environmental impact assessment (EIA) procedure gives a new impetus to the greening of the environment, use of natural resources, and economic activities in general. In other words, the aforementioned documents allow for the coordination of risk-based management actions, although stakeholder objectives may have different aspects and categories and may be applied at different levels. Environmental impact assessment is inseparable from ranking and comparing the risks faced by those who implement solutions and the public concerned. Any assessment with a technogenic impact on the environment makes sense in the context of changes in its quality and impact on the ecosystem. In most developed countries with well-established social and economic strategies, priority is given to the concept of sustainable development. This meets the goals of environmental activities, which include two interrelated components: (1) ensuring an acceptable level of safety for the population and the natural environment; (2) improving the quality of life determined by the country’s level of economic and social development.
In Russia, state ecological expertise is organized by environmental authorities, while the environmental impact assessment is carried out by customers interested in obtaining the necessary documentation. The process of a project’s environmental assessment includes the following aspects: selection of projects; definition of the objectives and scope of the impact assessment; assessment of the magnitude and significance of impacts; development of mitigation measures; documentation of the impact assessment’s results; preparation of an Environmental Impact Statement (EIS, environmental law); decision making on project implementation; and post-project monitoring.
Preparing an EIS is inseparable from assessing the risks posed by the planned economic activity. The EIS should contain an interpretation of the hazards associated with the implementation of the project and an assessment of measures to mitigate the impact of the planned economic activity. This meets the goals of a green economy. It should be noted that, at the forecasting stage, significant uncertainty in the assessment of potential negative consequences associated with the planned economic activity is reduced if the environmental risk assessment is included in a single decision-making process. Risk assessment makes it possible to determine how significant the risks caused by the planned economic activity are for the environment [8]. Risk assessment provides greater clarity and transparency in environmental decision making and provides a framework for risk management during the project implementation phase [9].

1.4. Study Objectives in the Context of Risk Assessment

The purpose of this article is to analyze the applicability of the Monte Carlo method for making informed decisions about an environmentally friendly construction option (See in Appendix A). The bibliography regarding risk assessment procedures is very extensive, and includes the following:
The basic and traditional definition of risk is the consequence of an event. Risk is a situation or event in which something of value for a person is at stake, but the result is unclear [10]. It is an undefined consequence of an event or activity in relation to something of human value [11]. Campbell (2005) believes that risk is the likelihood of an unwanted event, and it equals expected damage [12]. Wiener and Graham (1997) define risk as the likelihood of an unfavorable outcome [13]. Lowrance (1976) refers to risk as a measure of the likelihood and severity of adverse effects (undesired consequences) [14]. The standard ISO/IEC 27000:2018 (see https://www.iso.org/obp/ui/#iso:std:iso-iec:27000:ed-5:v1:en (accessed on 24 November 2021)) interprets risk as a combination of the likelihood and magnitude of the consequences. Risk is characterized by its reference to potential events and consequences, or a combination of both (ISO Guide 73: 2009, 3.6.1.3). It is expressed as a combination of the consequences of an event (including changes in circumstances) and the associated likelihood of occurrence (ISO Guide 73: 2009, 3.6.1.1). According to Kaplan and Garrick (1981), risk is characterized by a set of triplets ⟨si, pi, xi⟩, where si is a scenario identification or description, pi is the probability of this scenario, and xi is the consequence of the scenario, i.e., the measure of damage, i = 1, 2, …, N [15].
According to GOST R 51897-2011, risk is the consequence of some uncertainty that should be understood in order to make informed decisions and take appropriate actions (see Appendix A). The consequence of uncertainty should be understood as a deviation from the expected result or event (positive and/or negative). Risk is often characterized by describing a possible event and its consequences, or a combination of both. Uncertainty means a lack of knowledge about certain factors, which can be eliminated with additional research. There are three main categories of uncertainty in relation to risk assessments. First, there are uncertainties associated with the conceptual model used as the basis for the study of the object. Risk assessors should describe, in as much detail as possible, what estimates and assumptions are included in the conceptual model. Second, uncertainty in parameter values should be distinguished from the variability that occurs because of real heterogeneity or changes in building site characteristics and the environment. Only some of these indicators can be selected, leaving the true distribution of parameter values undefined. The risk assessor should provide a quantitative or qualitative description of the uncertainties in the distributions of parameter values. Third, there is uncertainty associated with how adequately the model represents the test object. The models currently available are fairly simple. In this regard, it seems appropriate to identify the main assumptions of the model and their potential impact on the risk assessment.
A useful approach for categorizing uncertainties, according to the Society for Risk Analysis, involves two main types of ambiguity: stochastic uncertainty (the uncertainty of a random event), which recognizes the intrinsic variability of some phenomena and cannot be reduced by further research, e.g., throwing a die; and epistemic uncertainty, which usually arises from a lack of knowledge and therefore can be reduced by collecting more data, refining models, improving sampling methods, etc. In many situations, both types of uncertainty are encountered. The probability depends on the assessor’s knowledge K, which is expressed as P(A|K) (https://www.sra.org (accessed on 24 November 2021)).
As the SRA Risk Analysis Quality Test defines, uncertainty in a broader sense can include uncertainty about the validity of assumptions, including assumptions about how systems might behave; variability of the parameters on which the decision should be based; uncertainty about the applicability of models that have been created to predict the future; events (including changes in circumstances) whose occurrence or nature is uncertain; uncertainty associated with disruptive events; uncertainty about the outcome of systemic issues, such as a lack of competent personnel, which can have widespread impacts that cannot be clearly identified; lack of knowledge about any aspect; lack of knowledge, occurring when uncertainty is recognized but not fully understood; unpredictability; the inability of the human mind to recognize complex data; situations with long-term consequences. Recognizing that uncertainty exists in a particular area allows early warning systems to be implemented to detect changes and the mechanisms that need to be used to prevent unforeseen circumstances.

1.5. Research Novelty

The novelty of this article is that this model is designed to optimize the search for a solution under conditions of uncertainty. This aspect of risk assessment for environmental safety in the construction and housing sector has not been the subject of any previous analysis. In the simplest case, when nothing is new or unusual in a situation, the risk is well understood, without any significant consequences or with minor consequences. This article has developed a model for completely new or complex situations where there is high uncertainty and little experience, and traditional assessment technologies may not produce the desired result. This also applies to circumstances where the parties involved have very divergent views on environmental safety. Risk assessment is more concerned with reducing uncertainty. This article provides an understanding of the significance of factors contributing to the occurrence of a specific risk. The scenario conditions, assumptions, constraints, or required resources related to the valuation activity are indicated. However, it is impossible to specify all conditions within one article. The presented model shows the decision-making process when choosing between several options associated with different risks (having positive or negative consequences, or both). This is the logic of the work. The key questions for the analysis stage can be formulated as follows: What are the expected benefits of the chosen solution? Is there a possibility in which the implementation of a decision, while reducing the degree of some risks, results in other risks? This article seeks to provide the most comprehensive possible disclosure of risk assessment information without advocating any ideal quality in the analysis. Each deficiency could be considered an opportunity for correction.
The highlights of the article are as follows:
  • The Monte Carlo method confirms the correctness of the chosen mathematical model;
  • The model is designed to optimize the search for a solution under conditions of uncertainty;
  • A scenario approach, which is the most suitable for assessing risks in the field of environmental safety, was used in the work;
  • When choosing the optimal option, one should take into account the parameter s2, which should receive priority for the sustainable implementation and operation of construction projects;
  • The input parameters of the model, s1 and s2, are important risk factors that determine the environmental safety level of construction.

2. Literature Review

Recently, problems that apply the Monte Carlo method have been intensively developed in various spheres. Researchers have studied the Monte Carlo method for solving systems of linear algebraic equations in several articles [16,17]. Moreover, this is understandable: since the advent of mathematical methods, many models of various degrees of complexity and type have appeared. For example, consider a truncation error in mathematical formulas. The total approximation error is the sum of the rounding error and the truncation error. The truncation error decreases with decreasing step size. However, reducing the step size usually requires more accurate calculations, which in turn increases the round-off error. Consequently, the two errors directly contradict each other—as one decreases, the other increases. In the field of information theory, a simple parity check can detect all single-bit errors. It can also detect burst errors if the total number of changed bits is odd. However, this method cannot detect errors if the total number of modified bits is even [18,19]. In such conditions, assessing the total errors in the results using the techniques recommended in the theory of errors is not precise enough, as it does not consider the real probability distribution functions of the initial data. Thus, the Monte Carlo method should be applied in situations of this kind; it allows users to choose any method to analyze the initial data when these data are set in an interval probabilistic manner. The presentation of the final results as probabilistic curves facilitates the researcher’s work at all stages of data processing to reveal patterns of placement and formation, forecast, search, and assessment of the object under study.
Understanding the solutions of algebraic equations and inversions of large matrices using the Monte Carlo method is very important. Branford et al. studied the applicability of this method to GRID computing to solve scientific, mathematical problems requiring significant computing resources. Effective networking technologies based on Monte Carlo methods (for performing random sampling of a certain random variable) apply to many engineering applications based on coordinated resource sharing and problem solving in dynamically changing virtual organizations with many participants [20].
The simulation and modeling of complex systems, parallel scalable algorithms, and collaborative, cluster, and grid computing require the Monte Carlo method, since it uses technologies that provide access to computing resources and capacities when needed. The problem posed in mathematical modeling is the extent to which the Monte Carlo method can be used for an accurate statistical assessment of the moment of extremely rare events, analyzing the reliability of multidimensional variables, solving time-consuming problems, and reducing the number of computations required for accurately modeling the objects under study [21,22]. The Monte Carlo simulation method is a powerful tool for complex engineering problems with many random variables [23]. The absolute error of measurement results incurred using the proposed method is extremely small; for example, in a study by Chen et al., it does not exceed ±0.0003 m³ [24]. Due to its precision, this method is also used in calculations involving infinity. The developed Monte Carlo code gives satisfactory predictions of the desired values [25].
Mathematical modeling using the Monte Carlo method has affected all branches of science, both theoretical and practical. The method is applied to assess the correctness of the modeling procedure used and to extract the best model parameters [26,27]. In addition, scientific research sets the following assessment objectives: the unknown probability of an event; unknown distribution function; distribution of known parameters; dependence of a random variable on one or more random variables; testing statistical hypotheses about the form of an unknown distribution or the value of the distribution of known parameters, etc. [28,29]. Experts are increasingly using this method in various spheres of engineering, assessing the contribution of uncertainty and sensitivity to model predictions [30].
In the last decade, sensitivity analysis with the Monte Carlo method for ensuring environmental safety has evolved to address uncertain assumptions about the input data. This method has proven itself in assessing the risk of exposing an ecosystem to toxic substances [31]. Another article presents the Monte Carlo method for determining the conservative fuel depletion limit of a pressurized water reactor (PWR) regarding centerline temperature, internal pressure, and strain measurement deviation. The uncertainty of the converted power was introduced to record the uncertainties of various computation models. A specific limitation of the fuel elements’ bundle depletion was not required according to the neutron-physical characteristics of the reactor. The restrictions established by the author have become essential to environmental regulatory authorities [32]. Oh and Nam used the method for predicting changes in the dynamic behavior of offshore structures for safety purposes, since seabed ground motion is stochastic [33]. The Monte Carlo method they used is 5.8 times faster than the conventional method for assessing the likelihood of facility failure. According to Oliver et al., a probabilistic risk modeling chain with continuous simulation can provide a more detailed picture of flood risks [34].
As for the literature on the efficiency of housing and industrial construction, it is almost non-existent, except for a few papers considering the use of the Monte Carlo method in the field of construction. For example, a study by Stewart et al. (2018) describes a risk analysis model of the economic consequences of damage to the metal roof of a typical modern Australian home under extreme wind loads. Monte Carlo simulations and structural reliability methods were used to stochastically simulate spatially varying pressure coefficients, fracture of roof elements, and load redistribution across the roof. Based on the analysis of spatial reliability, brittleness curves were created, which relate the probability and degree of destruction of the roof cover to the wind speed [35]. Many other works only marginally touch on this topic, without referring to the Monte Carlo method for risk assessment, for example, [36,37]. It should be noted that the scarcity of the literature only emphasizes the novelty of our article.
As the literature review has shown, the Monte Carlo method has confirmed its significance and wide range of applications in various spheres of science and technology; however, it has not found wide application in environmental safety due to difficulties with defining the initial type of distribution. Given the capabilities of the method, the authors aim to employ it for calculating environmental risks in the field of construction and increasing its safety.

3. Materials and Methods

The authors faced the task of finding the optimal distribution between funds intended to reduce the likelihood of emergencies and funds intended to prevent possible environmental damage, as well as a method for determining the risk value in design and construction. The authors used a risk minimization formula with controllable parameters for further stochastic modeling of processes and created a specific methodology based on it that assesses the environmental risk of construction projects. The lognormal distribution of risk probability was dictated by the values of the mean and sigma (scale) parameters used when simulating with the Monte Carlo method, as well as by the fact that it can represent a variety of shapes, from almost symmetric to highly skewed. The form of the distribution allows us to confirm that the same impact will cause different consequences depending on whether the object is damaged or new. As is well known, a normal distribution is most often considered a suitable model for describing a process shaped by a large number of independent random causes. In this case, the distribution density of the random variable has a symmetric, bell-shaped form. However, in some cases (when the variable cannot be negative), random variables have an asymmetric distribution. What is the difference between the processes producing normal and lognormal distributions? Both distributions occur when the described object is determined by many random and independent factors. The model of additive interaction with a normal distribution describes the sequential effects of each of the factors on the damaged object, reaching a certain threshold value. In the second case (multiplicative interaction), each subsequent event affects the object in proportion to its current state, as for a new object. The model of multiplicative interaction, i.e., when the random influence of a factor and the state of the system itself (until its destruction) are both considered, is determined by the lognormal distribution, with a location μ and a scale parameter σ. Thus, in contrast to a normal distribution, the form of the lognormal distribution indicates the state of degradation and destruction of the building. In the model of multiplicative interaction, the random influence of a factor is considered in addition to the state of the system itself on which that factor acts [38].
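A compact numerical sketch of this distinction (with arbitrary, assumed parameters, purely for illustration): summing many small independent increments produces an approximately normal result, while multiplying proportional factors produces an approximately lognormal one.

```python
import numpy as np

rng = np.random.default_rng(0)
n_objects, n_factors = 100_000, 50

# Additive interaction: each factor adds an independent increment -> ~normal
additive = rng.uniform(0.0, 1.0, size=(n_objects, n_factors)).sum(axis=1)

# Multiplicative interaction: each factor scales the current state -> ~lognormal
multiplicative = rng.uniform(0.9, 1.1, size=(n_objects, n_factors)).prod(axis=1)

print("additive: skewness ~ 0:", round(float(
    ((additive - additive.mean()) ** 3).mean() / additive.std() ** 3), 3))
log_m = np.log(multiplicative)
print("log of multiplicative is ~normal (mu, sigma):",
      round(float(log_m.mean()), 3), round(float(log_m.std()), 3))
```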
The Monte Carlo method is used to construct a mathematical model of a subject with uncertain parameter values. This makes it possible to obtain the distribution of scenario flows in graphical and/or analytical expression, which has the probability distributions of the project parameters, as well as the relationship between parameter changes.
A common way to include uncertainty in simulations is to define distributions for the uncertain input values, sample these distributions, run the model with the sampled values, and iterate this process many times to create the output distribution. If the uncertainty is represented more accurately, the resulting solution will be more efficient. Couto et al. (2013) addressed the problem of the influence of variable uncertainties on the uncertainty of the resulting function. Through an illustration, they graphically explained the advantages of the Monte Carlo method (Figure 1).
Figure 1a illustrates the propagation of uncertainties. In this case, three input quantities, x1, x2, and x3, are presented together with the corresponding uncertainties of u(x1), u(x2), and u(x3), and y with u(y) are the measurand and its uncertainty. As can be seen, propagation uses only the main points (mathematical expectation and standard deviation) of the input quantities, and thus, a certain amount of information is lost. However, when propagating distributions (Figure 1b, where g(x1), g(x2) and g(x3) are the distribution functions of the input quantities, and g(y) is the distribution function of the measured quantity), no approximations are made, and all information contained in the input distributions is propagated to the outputs [39].
According to most authors, the use of Monte Carlo modeling consists of five stages: (1) placing the data in class intervals to generate a frequency histogram; (2) determining the probability density function of the random variable that best represents the sample; (3) implementing the simulation with N repetitions; (4) assessing whether the number of simulations is appropriate; (5) after performing the simulation, generating a cumulative probability distribution function for analyzing the results (see, for example, [40]). The fact that the model itself is unknown or undefined is a serious and widespread difficulty in risk analysis. The strategy is to determine the distribution with the highest entropy, in accordance with the existing knowledge that forms the constraints on the possible values. This approach allows users to choose the input distribution in an optimal way using only limited information about the variables, without additional assumptions. By definition, a stochastic simulation model must contain at least one random variable. A random variable, which is a numerical representation of the result of a random experiment, is a key term in statistical analysis and an essential element in any stochastic modeling [41].
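As an illustration of the propagation of distributions in Figure 1b, the sketch below assumes arbitrary input distributions g(x1), g(x2), g(x3) and an arbitrary measurement model y = f(x1, x2, x3); it is not the model used later in this article.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 200_000  # number of Monte Carlo repetitions

# Assumed input distributions g(x1), g(x2), g(x3) -- illustrative only
x1 = rng.normal(10.0, 0.5, N)                     # mean 10, u(x1) = 0.5
x2 = rng.uniform(1.8, 2.2, N)                     # only bounds known
x3 = rng.lognormal(mean=0.0, sigma=0.3, size=N)   # positive, right-skewed

# Measurement model (an arbitrary example of y = f(x1, x2, x3))
y = x1 * x2 + x3

# The whole output distribution g(y) is available, not just two moments
print("E[y] =", y.mean(), " u(y) =", y.std(ddof=1))
print("95% coverage interval:", np.percentile(y, [2.5, 97.5]))
```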
There are several different definitions of a random variable. Some scholars define a random variable as a function defined on an elementary event space that assigns a real number with a certain probability to each elementary event. Aczel (1995) states that a random variable is one whose values depend on chance; therefore, the value cannot be predicted in advance since it depends on a random event [42]. Benjamin and Cornell (2018) argue that a random variable is one that takes on numerical values whose outcome cannot be predicted with complete certainty [43]. On the other hand, a random variable is that which, as a result of an experiment, can be assumed with a certain probability to be one of the values of a certain set of real numbers [44].
The technique for assessing scenarios using simulation methods encompasses the following: assessment of the intervals of probable changes in basic variables; determination of the types of probability distributions within the specified intervals; determination of correlation coefficients between dependent variables; multiple computations of the resulting indicators. Mathematical statistical methods, such as mathematical expectation, variance, lognormal distribution function, and probability density, are then applied to the simulation results. The probability of the resulting indicators falling into a particular interval is calculated, as well as the probability of exceeding the boundary values and other necessary parameters [45]. By assessing the values of the resulting indicators of the evaluated scenarios, it is possible to calculate the possible interval of their change in various project conditions and thereby identify the direct and inverse problem for risk assessment. The advantages of using simulation methods are in the ease with which the results of the analysis can be perceived, in their broad practical applicability for coordinating investment decisions, ranking projects, assessing potential losses, and justifying rational simulation models. In cases where the values arranged in order, for objective reasons, slightly deviate from the mean values, the use of the Gaussian curve is quite viable; the objective functions associated with optimizing the indicators of the considered scenarios of environmental protection were also used [46,47] to assess the quantitative value of the risk.
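A sketch of this technique with assumed distributions, an assumed correlation coefficient, and assumed boundary values (all illustrative) is given below; it estimates the probability of the resulting indicator falling into an interval or exceeding a boundary directly from the simulated sample.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000

# Two dependent basic variables with an assumed correlation of 0.6
corr = np.array([[1.0, 0.6],
                 [0.6, 1.0]])
z = rng.multivariate_normal(mean=[0.0, 0.0], cov=corr, size=N)
cost_factor = np.exp(0.2 * z[:, 0] + 1.0)     # lognormal cost multiplier
damage = 50.0 + 10.0 * z[:, 1]                # normally distributed damage term

indicator = cost_factor * damage              # resulting project indicator

boundary = 200.0                              # assumed boundary value
print("mean =", indicator.mean(), " variance =", indicator.var(ddof=1))
print("P(indicator in [100, 200]) =",
      np.mean((indicator >= 100.0) & (indicator <= boundary)))
print("P(indicator > boundary) =", np.mean(indicator > boundary))
```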
To analyze the change in several variable parameters during the construction of a facility, a method for analyzing the scenarios of occurrence and development of undesirable events was also used. The method helped us to determine the probabilistic range of changes in the phenomenon with the most unsuccessful (pessimistic) or the most successful (optimistic) changes in internal or external parameters. In this way, the main idea of this study is to complement the application of the above-mentioned methods with a scenario approach that addresses rare events to quantify construction risks and make decisions in the face of uncertainty associated with these events. The entire flow diagram of the research method is provided, and an explanation of the chart is given (Figure A1; see in Appendix B).

4. Results

In its quantitative measure (particularly in the field of environmental safety), the risk is defined as the product of the investigated hazard factor and the amount of damage caused.
R = P × C        (1)
where R is a quantitative value of the risk;
  • P is the probability of emergencies occurrence;
  • C is the expected damage and consequences in the event of an accident.
This expression is, in fact, an old one, and several parameters that correct the probability and consequence factors are currently in use; this article is not based on this expression alone. In the case of a single scenario, the view that risk is probability multiplied by consequences would equate a low-probability, high-damage scenario with a high-probability, low-damage one, which is clearly not the same. Hence, Formula (1) is extended as follows:
$R = \{(s_i, p_i, x_i)\}, \quad i = 1, 2, \ldots, N$        (2)
where si is the identification or description of the scenario; pi is the probability of that scenario; xi is the consequences, a measure of the assessment of that scenario, a measure of damage.
This approach to risk identification seems suitable for a scenario approach to risk assessment in the field of the environmental safety of construction. Its rationale is based on the work of Kaplan and Garrick (1981) [15]. The authors note that risk is often defined as the expected rate of damage, i.e., the average value of the risk curve; however, the risk is not the mean of the curve but the curve itself. A single number is not a broad enough concept to convey the idea of risk; the whole curve is required. On the other hand, a single curve is also not sufficient to understand the risk: to fully convey the idea of risk, a whole family of curves is necessary. If the scenarios are arranged in ascending order of damage severity, a risk curve can be constructed. Further, the probability curve pi(φi), which is a probability density function for the frequency φi of the ith scenario, is introduced into the formula as follows:
$R = \{(s_i, p_i(\varphi_i), x_i)\}$        (3)
From set (1), a family of risks can be constructed by summing the frequencies from below. If there is uncertainty in the damage, then the following formula is used:
$R = \{(s_i, p_i(\varphi_i), \xi_i(x_i))\}$        (4)
Thus, the scenario approach expands the definition of risk in Formula (1), including uncertainty. This is especially important in risk analysis when there are a number of scenarios, and the underlying baseline data for damage intensity is uncertain. The decision to be made becomes clear from the context of the scenario and the optimal solution associated with it: a much larger risk may be perfectly acceptable if it results in a significant reduction in costs or an increase in benefits.
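A minimal sketch of constructing such a risk curve from a set of triplets is given below; the scenario names, frequencies, and damages are invented purely for illustration.

```python
import numpy as np

# Hypothetical triplets (s_i, phi_i, x_i): scenario, frequency per year, damage
scenarios = [
    ("landscape flooding",   0.050,   5.0),
    ("construction fire",    0.020,  20.0),
    ("hazardous spill",      0.008,  60.0),
    ("seismic destruction",  0.001, 150.0),
]

# Scenarios are already in ascending order of damage severity x_i;
# "summing the frequencies from below" gives the exceedance (risk) curve.
freqs   = np.array([phi for _, phi, _ in scenarios])
damages = np.array([x for _, _, x in scenarios])
exceedance = freqs[::-1].cumsum()[::-1]

for (name, phi, x), e in zip(scenarios, exceedance):
    print(f"frequency of damage >= {x:>5}: {e:.3f}  ({name})")
```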
According to the Russian standard GOST R 51898-2002 (“Safety aspects. Guidelines for their inclusion in standards”), risk must be considered acceptable when an optimal balance between safety and requirements is achieved. In this case, an iterative process of assessing and reducing the expected damage follows. The initiators of economic activity (who at the same time act as stressors, placing the ecosystem under stress) are obliged to compensate for the harm caused to the environment. The purpose of risk assessment is to maintain the equilibrium of an ecosystem exposed to harmful effects from outside. Risk assessment technology is used to ensure an acceptable level of risk in the construction industry [48,49,50].
Ordinarily, construction must consider the multidimensional nature of the impact on environmental components (see Appendix A). That is, the same factor can have both negative (risk-increasing) and positive (risk-reducing) impacts on the ecosystem. The magnitude of risk, associated with possible damage through the probability of the event’s occurrence, must remain within the limits of a normalized permissible value expressed in quantitative form [51]. In what follows, the probable environmental damage is calculated (see Appendix A, Formula (A1)).
To calculate a probable situation, we define that the probability P of an accident (A) over time (t) depends on the analysis of the conditions for the design, construction, and operation of the facility and the statistical data on emergencies (see Appendix A). The time between accidents is exponentially distributed: the exponential distribution describes the intervals between independent events occurring with a given average intensity, while the number of occurrences of such an event over a certain period of time follows a discrete Poisson distribution, which describes accidents as a flow of random events [52]:
$P(A, t) = \dfrac{(\lambda \times t)^{A}}{A!} \times \exp(-\lambda \times t)$        (5)
at $A = 0, 1, 2, \ldots; \quad \lambda \times t \geq 0$,
where λ is an average value of the accident intensity, with mathematical expectation μ = λ and variance σ² = λ of the random variable in the Poisson distribution; e = 2.718281828 is the base of the natural logarithm.
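A small numerical check of Formula (5) under assumed values (λ = 0.2 accidents per year, t = 10 years), comparing the analytic Poisson probabilities with a simulation of exponential inter-arrival times:

```python
import numpy as np
from scipy import stats

lam, t = 0.2, 10.0           # assumed accident intensity (1/yr) and horizon (yr)
mu = lam * t                 # Poisson mean over the period

# Analytic probability of A = 0, 1, 2 accidents over time t (Formula (5))
for A in range(3):
    print(f"P(A = {A}) =", stats.poisson.pmf(A, mu))

# Simulation: exponential inter-arrival times, counting events within t
rng = np.random.default_rng(3)
n_runs = 50_000
counts = np.empty(n_runs, dtype=int)
for i in range(n_runs):
    arrivals, elapsed = 0, rng.exponential(1.0 / lam)
    while elapsed <= t:
        arrivals += 1
        elapsed += rng.exponential(1.0 / lam)
    counts[i] = arrivals

print("simulated P(A = 0):", np.mean(counts == 0))
print("mean:", counts.mean(), " variance:", counts.var(), " (both ~ lambda*t)")
```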
The priority issue is as follows: whether to focus on risk in the form of costs for the technical safety system, in which case the probability of an accident is close to zero, or to direct funds to prevent the anticipated damage (reliability of the technical system’s failure-free operation), in which case, conversely, the probability of an accident will be close to one. The following formula describes the method:
$P(s) = P(s_1) \times \left[1 - P(s_2)\right]$  at  $s = s_1 + s_2$        (6)
where P(s1) is the probability of an accident in the technical safety system, its value depending on the amount of funds, s1, allocated for emergency prevention and environmental safety measures;
P(s2) is the probability of failure-free operation, which depends on the amount of funds, s2, allocated to maintain the reliability of the technical system and reduce the expected damage.
The first case is about the deterioration of the quality of the environment, not due to the regular operation of the facility but due to the risk of emergencies in it. The following measures can improve the reliability of the technical system within the framework of the requirements related to environmental protection (maintaining its conditions and characteristics) and environmental safety (taking into account the impacts and characteristics of the habitat that may be subject to technogenic impact from the construction site; for example, changing the landscape, movement and waste management, hazardous spills due to natural disasters):
  • Monitoring the assessment of the technical condition of system elements;
  • Carrying out and providing organizational and technical solutions aimed at improving the quality and reliability of the system, and ultimately, at a significant improvement in the state of the environment at the construction site;
  • Developing methods for operational control of the system’s operating modes;
  • Providing the system with qualified personnel;
  • Attracting investments to improve the safety of the system’s operation.
In the second case, s2 refers to the best available technologies based on modern advances in science and technology and the best combination of criteria for achieving environmental protection goals, subject to technical feasibility (according to Directive 2010/75/EU of the European Parliament and of the Council of 24 November 2010 on industrial emissions (integrated pollution prevention and control)). The use of this kind of technology is aimed at the comprehensive prevention and minimization of negative impacts on the environment. The normal functioning of the enterprise is qualitatively improved, since the new system is characterized by the lowest level of negative impact on the environment per unit of time or volume of products produced or work performed, the economic efficiency of the implementation and operation of the installed technologies, and the application of resource- and energy-saving methods of work. Thus, investments in such technologies improve the environmental and resource efficiency of production, consistently reducing the negative impact on the environment. After the installation of advanced equipment (IAE), the enterprise receives an incentive, whereby it does not pay for negative environmental impact but receives an investment tax credit. Accordingly, if the parameter s2 is not updated, then surcharges for negative environmental impact and environmental fees become large financial risk factors.
Other technical measures that improve the state of the environment are the design, construction, and reconstruction of water supply systems, sewerage systems, sewer net-works, structures, and installations for the capture and disposal of pollutants; the installation of equipment to improve fuel combustion and waste disposal and automated systems to control the composition and volume or mass of substances polluting the atmosphere and bodies of water.
In Russia, however, the issue is not simply to promote IAE for implementation in the field of construction, housing, and communal services. With the transition to “green technologies,” the volume of revenues to the country’s budgetary system decreases, and the indicators of the effectiveness of state environmental supervision fall. In 2018, these fees amounted to RUB 13 bn (about USD 0.23 bn) compared with RUB 30.8 bn (approximately USD 1 bn) in 2013. In the absence of a dedicated environmental fund, fines received for pollution are not spent on the remediation of environmental damage. The paradox is that companies are paying less and less for environmental pollution. The introduction of an environmental tax instead of an environmental fee, in our opinion, will lead to a deterioration of the investment climate in Russia. Since 2021, a market mechanism has been implemented in the country to regulate greenhouse gas emissions. However, enterprises will have to pay twice: (1) for the negative impact on the environment and an ecological fee, and (2) a new payment for greenhouse gas emissions. Starting in 2025, Russian companies must pay the government USD 100 bn each year as a carbon tax. The introduction of this tax will tighten the regulation of greenhouse gas emissions. It is obvious that the command-and-control method of industrial modernization, particularly construction, to reduce the negative impact on the environment and the introduction of a carbon tax are not closely associated with minimizing risk in environmental safety.
The decrease in the probability P of a negative impact on the ecosystem depends on s1, the amount of funds spent on its reduction, and on s2, the amount allocated to reduce the probable damage C. Based on (5) and (6), the probability of negative impact P and the probable damage C can be expressed as
$P(s_1) = 0.01 \times \exp(-a \times s_1)$        (7)
$C(s_2) = \dfrac{b}{s_2}$        (8)
where a is the frequency of accidents and b is the period (time) during which funds were allocated to reduce damage. As is known, the number e is the base of the natural logarithm; for example, ln 10 denotes the logarithm of 10 to the base e ≈ 2.718.
The number e itself is an indicator of growth for any process, the dependent values of which change continuously with the change of independent values. Processes such as the decay of radioactive substances can serve as examples (knowing the decay coefficient, you can find out how much radioactive substance has already decayed into simpler elements).
Then, the solution to the optimal allocation of funds will be as follows:
$R(s_1^{\min}, s_2^{\min}) = \min_{\{s_1, s_2\}} P(s_1) \times C(s_2) = \min_{\{s_1, s_2\}} \left[0.01 \times \exp(-a \times s_1) \times \dfrac{b}{s_2}\right]$        (9)
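Under assumed values of a, b, and a fixed total budget s = s1 + s2 (all numbers hypothetical), the optimal split in Formula (9) can be found with a simple grid search:

```python
import numpy as np

a, b = 0.8, 12.0         # assumed accident-frequency and damage-period parameters
s_total = 10.0           # assumed total budget, s = s1 + s2

s1 = np.linspace(0.1, s_total - 0.1, 1000)
s2 = s_total - s1

P = 0.01 * np.exp(-a * s1)     # probability of negative impact, Formula (7)
C = b / s2                     # probable damage, Formula (8)
R = P * C                      # risk to be minimized, Formula (9)

i = np.argmin(R)
print(f"optimal split: s1 = {s1[i]:.2f}, s2 = {s2[i]:.2f}, R = {R[i]:.6f}")
```

Under these assumed functional forms, the optimum satisfies s_total − s1 = 1/a, which the grid search reproduces.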
In order to simulate a real situation and assess the choice of protection against hazardous consequences, when the probability of an accident tends to one, we complicate the task by introducing an additional parameter. Suppose that one of the scenarios given below is possible during the construction of an enterprise with a known value of risk probability P:
  • Scenario 1. The facility’s placement caused a change in the landscape, meaning there is a risk of destruction by flooding; the probability P1 = 0.3. According to Formula (9), despite an object’s low risk, its location must be considered (Scenario 1 states that the placement of the object caused the terrain to change. The object itself is not dangerous as such (the anthropogenic factor, in this case, is excluded). However, the object can still be destroyed (the natural factor is considered) due to changes in the landscape. Therefore, the destruction of an object depends directly on its location, which must be considered). The amount of funds (s1) aimed at preventing emergencies and environmental safety measures does not provide a low-risk value a priori due to the possibility of a landscape disturbance (for example, flooding). The investment of s2 funds in maintaining the reliability of the technical system may be unjustified due to the negative effects of external factors, which leads to a pessimistic, unfavorable scenario rather than one that has a relatively low-risk value.
  • Scenario 2. The construction did not meet the parameters of the design documentation due to a building violation—namely, the failure to install fire-extinguishing equipment; therefore, there is a likelihood of fire; the probability P2 = 0.6. According to Formula (9), a high risk of emergencies always arises when project parameters are violated, despite the high investment s2 in the safety of the technical system.
  • Scenario 3. The design costs of construction for this area did not consider the parameter of seismic safety; therefore, there is a risk of damage or destruction of the facility during an earthquake; the probability P3 = 0.1. According to Formula (9), given the values of s1 and s2 and a low value of risk, it is important to understand what is relevant in a scenario associated with a place that has a potential seismic hazard. For example, in St. Petersburg (Russia), the possibility that events in a seismic safety scenario could develop is non-existent. However, in Tbilisi (Georgia), the damaging factors of an earthquake can reach the scale of a natural disaster.
  • Scenario 4. The facility is associated with an environmentally hazardous technological process and can cause an emergency at any time; the probability P4 = 0.7. According to Formula (9), it should be noted that when in full compliance with the requirements for the technological process and corresponding investments (s2) in the safety of the technical system, emergencies with a high value of risk are impossible, and the development of this scenario is likely to be optimistic. In choosing between s1 (funds allocated for the prevention of emergencies and environmental safety measures), and s2 (funds allocated for the failure-free operation of the technical system), the optimal scenario will be the one that ensures the technical safety of production.
In each of the proposed scenarios, it is assumed that an object with a probability P fails and is in a state of emergency, negatively affecting the ecosystem and reducing its ecological safety. An entrepreneur can use income in 100 conventional payment units (I) without accumulating profit to cover the damage. The entrepreneur is faced with the task of allocating funds from the income of his economic activities to improve the environmental safety (s1) and reliability of the technical system (s2), according to the protection measures provided for in each of the four scenarios (in our example, the entrepreneur must choose one of the scenarios). The initial data of the scenarios are presented in Figure 2.
The results of optimizing the risk assessment when choosing a specific scenario were calculated using the formula with controlled variables (I is the annual income of the enterprise in the form of 10 conditional unit costs; D equals 10 contributions to each of the four scenarios; C is the total amount of prevented damage in conditional units in relation to maximum losses—100 units; ci(d), i.e., c1(d), c2(d), c3(d), c4(d), is the ith part of the prevented damage in conventional units; P is the known value of the probability of risk) for an optimal assessment of environmental risk. It should be noted that in this case, the degree of risk R depends on the costs of preventing the expected damage, so all income can be considered as annual damage incurred in an unforeseen situation (See in Appendix A). This function can be expressed as
$R_i(D) = (1 - P_i) \times \dfrac{D}{I} + P_i \times \dfrac{C - c_i(D)}{C}$        (10)
As a result of substituting the values from the table (Figure 2) and risk probability values into Formula (10), the values presented in Figure 3 were found.
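Formula (10) can be evaluated directly. The sketch below uses the scenario probabilities P1–P4 given in the text but purely hypothetical prevented-damage functions ci(D), since the values of Figure 2 are not reproduced here; the resulting numbers therefore do not match Figure 3.

```python
import numpy as np

I, C = 10.0, 100.0            # annual income (conv. units) and maximum losses
P = [0.3, 0.6, 0.1, 0.7]      # scenario probabilities P1..P4 from the text
k = [0.5, 0.7, 0.4, 0.9]      # hypothetical saturation rates of prevented damage

def c(i, D):
    """Hypothetical prevented damage c_i(D); Figure 2 values are not reproduced."""
    return C * (1.0 - np.exp(-k[i] * D))

def R(i, D):
    """Objective function (10): decision risk of scenario i at investment D."""
    return (1.0 - P[i]) * D / I + P[i] * (C - c(i, D)) / C

D = np.linspace(0.0, 10.0, 201)
for i in range(4):
    j = np.argmin(R(i, D))
    print(f"scenario {i+1}: min R = {R(i, D[j]):.3f} at D = {D[j]:.2f}")
```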
Graphs R1(c), R2(c), R3(c), and R4(c) were built using the results of the computational example for the objective function. Figure 3 shows that even with a zero-investment package (D = 0), each project (scenarios 1–4) provides for the costs of ensuring environmental safety. To interpret the calculation results, the scenarios should be viewed as two possible outcomes: pessimistic (the first and second scenarios) and optimistic (the third and fourth scenarios). In the first (pessimistic) variant, the minimum value of the risk assessment R is 0.52 at a cost c of 3.5 conventional units. In the second (optimistic) variant, the minimum value of the risk assessment R does not exceed 0.46 at a cost of 4.2 conventional units. In addition, when choosing scenario 3 or 4, the entrepreneur will face even lower risk scores, corresponding to the amounts invested.
This conclusion was analyzed in more detail (See in Appendix A). If investments in the technical system are higher, then the risk is lower (Table A1 and Table A2; See in Appendix B).
Even under pessimistic conditions, the first scenario wins over the second scenario precisely because investments in the technical system were large, increasing its reliability (Figure A2; See in Appendix B).
In the optimistic scenarios, investment of the second type (s2, in the technical system) is also justified. Scenario 3 has the lowest probability of risk owing to the significant investments in the safety of the technical system (70%). Although its overall risk value (5.65) is greater than in the pessimistic options (5.55 and 5.2), the graph shows that the lowest risk values occur at investments below 4.2. Environmental safety measures, whose payback period is longer, also have their value compared with the higher costs of increasing the reliability of the technical system, which pays back sooner; this circumstance reduces the significance of the risk. Notably, in scenario 4, starting from an investment of 4.2, the risk decreases as environmental safety costs decrease because the payback period is reduced (51.73%) (Figure A3; see Appendix B).
The analysis suggests that if the probability of risk P tends to 0 or 1 under unfavorable conditions, then even small costs may turn out to be unjustified, because the decision-making risk remains quite high. Under favorable conditions, higher costs (even equal to the annual income) are justified, since the decision-making risk is an order of magnitude lower. A quantitative risk assessment or risk analysis (QRA) can be based on a deterministic or a stochastic modeling approach; the difference between the two approaches concerns two issues, namely risk and uncertainty.
Risks are quantitative estimates of the expected likelihood of certain events occurring (ISO 2002). Of course, risk is not the same as probability: risk is often characterized by describing a possible event (3.5.1.3) and its consequences (3.6.1.3), or a combination of both (ISO 2002). For the analyst, however, risk is defined in terms of probability. Federal Law N 184-FZ of 02.07.2020 (https://docs.cntd.ru/document/901836556 (accessed on 24 November 2021)) “On technical regulation” defines risk as follows: “Risk is the likelihood of harm to the life or health of citizens, property of individuals or legal entities, state or municipal property, the environment, life or health of animals and plants, taking into account the severity of this harm.” If risk is a consequence of uncertainty about achieving the set goals [53], then the next question is: what is the probability of minimizing the impact of this uncertainty so that it can be specified by type (for example, type A, “random error”, and type B, “bias”)? The understanding of risk is structured by its probability in different respects. For example, what is the probability of type A uncertainty, and hence the risk of random error? To minimize this risk, the uncertainty must be subjected to statistical processing; ideally, such processing minimizes the influence of random factors on the measurement result (type A uncertainty is quantitatively characterized by the variance and standard deviation). Type B uncertainty, “systematic error”, combines uncertainty factors of a known character, i.e., quantities that change according to known laws; its risk is minimized on the basis of non-statistical information.
However, the overall picture of risk versus opportunity is complex. For example, some scientists propose distinguishing between possibility and probability, asserting that risk is calculated as a measure of possibilities rather than probabilities. Others are convinced that there are methods for calculating “relative risk” as a measure of the relationship between a given state and the occurrence of an event; this is the case of “odds ratios”. If there are several fires under a certain condition and a number of fires in the absence of that condition, the relative risk can be calculated as the relationship between the condition and the fire. Probability is thus an abstract mathematical concept defined for an infinite number of trials; in practice, a calculated point probability is used, approximated, for example, by the Poisson distribution [54]. The fact that risks are connected with probabilities implies that their outcome in any given situation is subject to uncertainty or randomness. This uncertainty is ignored in deterministic modeling, whereas stochastic modeling using the Monte Carlo method actually takes it into account.
The deterministic model will generate only one value for the outcome parameter, while the stochastic model will generate a probability distribution of possible outcomes.
A model was then built using the Monte Carlo method [55]. A probability distribution had to be constructed for the expected risk levels in the range from 0.1 to 0.95 (this data range, for both the pessimistic and the optimistic scenarios, was obtained by assessing the risk with the analytical Formula (10); the specific values were taken from the table in Figure 2, which gives the initial scenario data for the target function and the graph of changes in the prevented damage). This is the logic of obtaining a sample from the general population. Specific distributions (logistic and lognormal) were then selected using the Crystal Ball computer program. The reliability of the selected distributions was proved by ranking them according to goodness-of-fit statistics, for which the Anderson–Darling, Kolmogorov–Smirnov, and Pearson chi-square tests were used. The distributions were also validated with the Shapiro–Wilk criterion. The parameters of the logistic distribution (for the pessimistic scenario) and the lognormal distribution (for the optimistic scenario) were determined via computer simulation (Table 1).
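Crystal Ball performs this fitting and ranking internally; as a rough open-source analogue, the sketch below fits several candidate distributions to a sample of risk levels with scipy and ranks them by the Kolmogorov–Smirnov statistic. The sample here is a synthetic placeholder rather than the values derived from Formula (10), and the p-values are only approximate when the parameters are estimated from the same data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Placeholder sample of expected risk levels in the range 0.1-0.95; in the study
# these values come from Formula (10) and the initial data in Figure 2.
risk_levels = rng.uniform(0.1, 0.95, size=22)

candidates = {"logistic": stats.logistic, "lognormal": stats.lognorm, "normal": stats.norm}

results = []
for name, dist in candidates.items():
    params = dist.fit(risk_levels)                        # maximum-likelihood fit
    ks_stat, ks_p = stats.kstest(risk_levels, dist.cdf, args=params)
    results.append((ks_stat, name, ks_p))

for ks_stat, name, ks_p in sorted(results):               # smaller statistic = better fit
    print(f"{name:10s} KS = {ks_stat:.3f}  p = {ks_p:.3f}")

# Shapiro-Wilk normality check, used in the paper as an additional validation
w_stat, p_value = stats.shapiro(risk_levels)
print(f"Shapiro-Wilk: W = {w_stat:.4f}, p = {p_value:.4f}")
```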
In the Monte Carlo method, we denoted the random variable “level of expected risk” as X. Using the Crystal Ball program (Oracle Crystal Ball Enterprise Performance Management, Ver. 11.1.2.4.850), we selected the most suitable probability distribution for the specified data. At the academic and professional level, the “Fit Distribution” tool is a great help in creating scenarios, as it makes a preliminary selection of the data and assigns the most feasible probability distribution for the model being executed. It is a complete tool, capable of generating summary statistics, goodness-of-fit tests, and the chi-squared test, among others, making it possible to compare distributions and choose the one that best suits the optimization or minimization needs of the model.
Excel spreadsheets are remarkable tools for analysis, but in their original application they have many limitations. The greatest limitation of Excel is that it only allows a single value to be assigned to each cell; therefore, to create scenarios, the value of each cell must be changed manually. Crystal Ball extends Excel and allows uncertain values to be set for different cells, as well as their effect on each variable to be calculated. For this, Crystal Ball adds three new menus and a new toolbar to Excel.
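As a minimal illustration of this “distribution in a cell” idea outside Excel, the Python sketch below propagates three assumed input distributions through a simple damage formula; the input parameters and the formula are placeholders chosen only to show the mechanics of the propagation of distributions.

```python
import numpy as np

rng = np.random.default_rng(3)
n_trials = 10_000

# In a plain spreadsheet each input is a single number; here every "cell"
# holds a whole sample, so the output is a distribution rather than one value.
unit_cost   = rng.normal(loc=120.0, scale=15.0, size=n_trials)    # assumed input distribution
quantity    = rng.triangular(80.0, 100.0, 130.0, size=n_trials)   # assumed input distribution
damage_rate = rng.uniform(0.02, 0.08, size=n_trials)              # assumed input distribution

expected_damage = unit_cost * quantity * damage_rate               # the "formula cell"

print("mean:", expected_damage.mean().round(1))
print("5th-95th percentile:", np.percentile(expected_damage, [5, 95]).round(1))
```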
To use “Fit Distribution”, the first step is to open the file in Excel, having the Crystal Ball bar active, and in the CBTools menu select “Fit Distribution”. The Fit Distribution wizard is then displayed. In step 1 of 2, select all the distributions of the model, and click next. Then, Crystal Ball allows you to fit a probability distribution to certain data.
In step 2, where input options are selected, perform the following tasks:
Click on the icon of the selected cells to locate the field where the data series are located;
Select the data available in the Excel spreadsheet;
Click on the return icon to return to the “tools” dialog.
When the “Fit Distribution” tool is running, it assigns a column of data to each distribution. The best distribution is selected for the model design according to the user’s requirements. The “Fit Distribution” tool simulates the best scenario for the model by iterating between the data and the assumption options; the results correspond to the specified data. The ranking by goodness-of-fit statistic is highlighted in blue, and the optimal distribution is marked in green (Figure 4).
Using the “Fit Distribution” tool, we selected the most suitable probability distribution for the specified data (Figure 5 and Figure 6).
The logistic distribution allows users to calculate the density, probability, and quantiles and to generate pseudo-random numbers distributed according to the logistic law. Its parameters are the mean and the scale (sigma). It describes various laws of development in biology, physics, economics, and other areas.
The lognormal distribution is widely used if values are positively skewed (most values occur around the minimum value). The parameters for the lognormal distribution are the mathematical expectation and standard deviation. The lognormal distribution is based on three conditions: (1) the unknown variable can increase infinitely but is limited from below by a finite value; (2) the unknown variable shows a distribution with positive skewness; (3) the natural logarithm of the unknown variable gives a normal curve.
To test the hypothesis about the distribution law, the main (H0) and alternative (H1) hypotheses must be formulated: H0, the random variable is normally distributed; H1, the random variable is not normally distributed. The alpha level is 0.05. The Shapiro–Wilk criterion was used to test the hypotheses. The W statistic is always positive and represents the discrepancy between the evaluated model and the observations; the Shapiro–Wilk test has good power to reject normality when it does not hold. After verification, the following results were obtained:
  • Hypothesis H0: since the p-value > α, H0 cannot be rejected. The data are assumed to be normally distributed; in other words, the difference between the data sample and a normal distribution is not large enough to be statistically significant;
  • p-value: 0.411559. If H0 were rejected, the probability of a type 1 error (rejection of a correct H0) would be too high: 0.4116 (41.16%);
  • W statistic: 0.955926, within the 95% acceptance region of the critical value [0.9112; 1.0000];
  • Median: 0.53. Average (x): 0.529545. Sample standard deviation (S): 0.0934766. Sample size (n): 22. Skewness: −0.403046. Excess kurtosis: 1.395218.
It is known that the logistic distribution resembles the normal distribution but has heavier tails and a larger kurtosis coefficient. Therefore, the statistical reliability of the results was confirmed. The correctness of the distribution was also confirmed by graphical methods (Figure 7).
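The graphical checks mentioned above (a histogram against a normal curve and a QQ-plot) can be reproduced, for example, with matplotlib and scipy; the sketch below generates a synthetic sample of size 22 from a logistic law with the parameters reported in Figure 5 (μ = 0.54, s = 0.04) and plots both diagnostics.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy import stats

rng = np.random.default_rng(0)
sample = rng.logistic(loc=0.54, scale=0.04, size=22)   # parameters taken from Figure 5

fig, (ax_hist, ax_qq) = plt.subplots(1, 2, figsize=(8, 3))

# Histogram with a fitted normal curve overlaid, mirroring Figure 7
ax_hist.hist(sample, bins=8, density=True, alpha=0.6)
x = np.linspace(sample.min(), sample.max(), 200)
ax_hist.plot(x, stats.norm.pdf(x, sample.mean(), sample.std(ddof=1)))
ax_hist.set_title("Histogram vs. normal curve")

# QQ-plot against the normal distribution
stats.probplot(sample, dist="norm", plot=ax_qq)
ax_qq.set_title("QQ-plot")

plt.tight_layout()
plt.show()
```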
The following results were obtained after testing the hypothesis of a lognormal distribution of data on the optimal risk assessment variant:
  • Hypothesis H0: since the p-value > α, H0 cannot be rejected. The data are assumed to be normally distributed; in other words, the difference between the data sample and a normal distribution is not large enough to be statistically significant;
  • p-value: 0.875764. If H0 were rejected, the probability of a type 1 error (rejection of a correct H0) would be too high: 0.8758 (87.58%);
  • W statistic: 0.977658, within the 95% acceptance region of the critical value [0.9112; 1.0000];
  • Median: 0.455. Average (x): 0.495000. Sample standard deviation (S): 0.219437. Sample size (n): 22. Skewness: 0.336493. Excess kurtosis: −0.408523.
The relationship between the normal and lognormal distributions is as follows: for a random variable X with a lognormal distribution, the logarithm Y = ln X has a normal distribution. The converse is also true: for a random variable Y with a normal distribution, the random variable X = e^Y obeys a lognormal distribution. In this case, the property of the logarithm requires that X be positive. The normal distribution has a symmetrical bell shape, whereas the lognormal distribution is an example of a skewed distribution. Since the Shapiro–Wilk test is a right-tailed test, we can assert that the selected distribution was confirmed. Therefore, the statistical reliability of the results was confirmed, and the correctness of the distribution was also confirmed by graphical methods (Figure 8).
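This logarithmic relationship is easy to verify numerically; the short sketch below draws a lognormal sample (with illustrative parameters), confirms its positive skewness, and checks that the log-transformed values pass a normality test.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# X ~ lognormal  =>  Y = ln(X) ~ normal (parameters below are illustrative only)
X = rng.lognormal(mean=0.0, sigma=0.5, size=5000)
Y = np.log(X)

print("skewness of X:", round(stats.skew(X), 3))          # clearly positive
print("skewness of Y = ln(X):", round(stats.skew(Y), 3))  # close to zero
w, p = stats.shapiro(Y[:500])                              # Shapiro-Wilk on a subsample
print(f"Shapiro-Wilk for ln(X): W = {w:.4f}, p = {p:.4f}")
```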
The priority task in an entrepreneur’s choice of scenario is to decide whether to invest in minimizing the risk or in preventing possible environmental damage. Stochastic modeling using the Monte Carlo method fully confirmed the probability–risk hypothesis studied in the context of the pessimistic (scenarios 1 and 2) and optimistic (scenarios 3 and 4) options. In a pessimistic scenario, an insufficient amount of investment (in s2, to maintain the reliability of the technical system) results in a higher risk value; conversely, the more investment is allocated (in s2, to reduce the expected damage), the lower the degree of risk. The sensitivity diagrams confirm this conclusion (Figures A4–A7; see Appendix B).

5. Discussion

In conditions of uncertainty, the decision to choose the optimistic options (P3 with a value of 0.1 and P4 with a value of 0.7) with high cost (to maintain the reliability of the technical system) but with less risk plays a decisive role in the future environmental safety strategy of a construction project.
Despite a high risk value of 0.7, option 4 is optimistic. Its optimistic character is explained by the fact that funds aimed at reducing damage lower the likelihood of an emergency under high-risk conditions. As the simulation results show, the optimal allocation of funds (s1, reducing the probability of an ecological accident, and s2, maintaining technical safety) is determined by the block of technical safety measures (investment in technology that ensures equipment safety and excludes the possibility of an emergency).
The probability of an accident in the technical safety system depends on the amount of funds (s1) allocated for the prevention of accidents and for environmental safety measures. During construction, the following risks are possible: insufficient analysis and assessment of information about the construction site before the start of design; poorly developed terms of reference for the design of the facility; changes during the design process; project budget overruns; delays in project approval and the introduction of changes; failure to ensure the safety of the facility during construction and its subsequent operation; an unrealistic construction timetable; incorrect logistics of the construction site; irrational use of the territory, etc. Many factors determine the key risks of construction projects, depending on production practice in a particular country. In Russia, the safety culture and related factors such as training, human reliability, environmental conditions, and experience have an insignificant impact on ensuring environmental safety, and assessing them is therefore extremely problematic, if not impossible. The quality of building materials is poor, the paradigm of the urban planning environment is ill conceived, economic risk is high, the legal framework is unstable, construction companies lack their own funds, and the investment infrastructure is underdeveloped. All these factors impede the development of a safety culture in Russia; for this reason, its analysis was not included in the study plan. Various options are currently being proposed for introducing a safety culture (for example, setting up safety training parks).
Each new use of the MC method resembles a small scientific discovery. Modeling is used especially when solving a problem analytically is too difficult. The MC method makes it possible to understand the impact, or rank the importance, of uncertainties, thereby motivating additional measurement, modeling, or research and development. A probability distribution gives, for each possible event, the probability that it will occur. This is why the US Environmental Protection Agency (EPA) considers MC a reliable statistical tool capable of analyzing uncertainty in risk assessments. The EPA (1997) has issued guidance that includes rules on the application of this method for the analysis and estimation of uncertainty (https://www.epa.gov/sites/default/files/2014-11/documents/montecar.pdf (accessed on 24 November 2021)).
Admittedly, results obtained using stochastic values are not very accurate. In the presented case, the optimal scenario was calculated for the environmental safety of a particular construction object over its intended service life; the simulated process will occur in the future, and verification of such comparisons may be difficult or even impossible. According to Zadeh (1999), the principle of incompatibility is inherent in any modeling [56]: incoherent reasoning is caused by a gross violation of the measure, which undermines the rationality of subsequent steps. The more complex the simulation model, the lower the ability to formulate, on the basis of the simulation, important statements about the system being modeled. To this must be added the complexity of the simulation itself: it takes the analyst a long time to prepare a suitable input database, develop a model, and interpret the obtained results.
According to Pappenberger and Beven (2006), some analysts believe that all parameters, boundary conditions, etc. can be determined a priori, and therefore there is no need for uncertainty analysis. Another group of authors argues that it is enough to vary the model parameters within a strictly defined range and to test the models with non-statistical methods, excluding models that do not give satisfactory results. A third reason given by many is that decision makers and management personnel are not mathematically prepared to make sound decisions, as they often confuse the concepts of risk and uncertainty; moreover, uncertainty cannot be integrated into decision-making processes, which, in many cases, are binary. Another reason is that an uncertainty analysis should not be used because it is highly subjective and difficult to perform. The final reason emphasizes that uncertainty has no real impact on the process of reaching a final decision [57].
The less frequent an event is, the less accurately we can estimate its probability. Since large deviations, which Taleb (2007) calls “Extremistan” [45] (pp. 216–228), are extremely rare, their contribution to the result is extremely small, and they are therefore neglected in calculations using the truncated normal probability distribution (±σ is the standard deviation) [58]. Ignoring the possibility of unpredictable large deviations is the main disadvantage of the MC method. The validity of the output data is a key factor in any risk analysis method. Certainly, the danger of assuming that future events will not differ drastically from what was usually observed in the past is also evident when using the MC method.
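The point about neglected large deviations can be made concrete with a simple tail comparison: under a normal law a 4σ deviation is vanishingly rare, whereas under a heavy-tailed law (here an illustrative Student-t with 3 degrees of freedom) it is orders of magnitude more likely.

```python
from scipy import stats

# Tail probability of a 4-sigma deviation under a normal law versus a
# heavy-tailed Student-t law (df = 3); the choice of df is illustrative.
p_normal = stats.norm.sf(4.0)
p_heavy = stats.t.sf(4.0, df=3)

print(f"P(X > 4 sigma), normal:       {p_normal:.2e}")
print(f"P(X > 4 sigma), Student-t(3): {p_heavy:.2e}")
print(f"ratio: about {p_heavy / p_normal:.0f}x more likely under the heavy-tailed law")
```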
When the system is dynamic, the ability to predict events becomes extremely limited. Uncertainty scaling based on a bell curve does not account for the possibility of sharp jumps or discontinuities. Unpredictable large deviations are very rare but should be considered, as their cumulative effect is significant [59]. In an analysis of the most widely used methods, Semenoglou et al. (2021) conclude that the latest and most sophisticated forecasting methods do not necessarily achieve more accurate results than the simplest ones [60].
The following example shows how difficult it is to apply the MC method. Dalal et al. (1989) estimated the likelihood of a shuttle disaster due to O-ring failure at the design launch temperature. The method showed that the probability of a catastrophic O-ring failure at the 31 °F temperature at which the Challenger was launched was 13% (i.e., the probability of a catastrophe was 1 in 8); delaying the launch until the temperature reached 60 °F would have reduced the likelihood to about 2% [61]. The MC method can play an important role in the risk management process for space shuttles. Kelly and Smith (2008) concluded that at an uncertain temperature (taken as Gaussian with a mean of 52 °F and a standard deviation of 13 °F) the shuttle disaster probability due to O-ring failure would be 8% (a probability of catastrophe of 1 in 12) [62]. What risk is considered acceptable? In fact, we cannot judge whether a risk is acceptable in itself, only the combination of the costs and benefits that come with it. Although the problem with the O-rings was known, NASA made the decision (to launch the spacecraft at low temperatures) with a high risk of disaster. Flights with zero incidents were excluded from the graph by the solid rocket booster engineers, as these flights were not believed to have contributed any information about the influence of temperature. The Rogers Commission cited this deficiency as the main reason for the decision to launch (Figure A8a,b; see Appendix B).
Portugués (2021) notes that the trend for incident launches alone appears flat (Figure A8a; see Appendix B) and suggests no correlation with temperature, whereas the trend for all launches indicates a clear negative correlation between temperature and incident rate (Figure A8b; see Appendix B). The minimum temperature for an incident-free launch ever recorded was above 64.4 °F, yet the Challenger was launched at 30.92 °F; since NASA was unaware of the consequences of such cold temperatures, its decision appeared justified [63]. Kelly and Smith argue quite the opposite: panel b shows many successful launches at high temperature (>64.4 °F) with a small percentage of O-ring problems, and only a few low-temperature launches in which 100% of the O-rings were damaged; in some cases, the air temperature was between 50 °F and 64.4 °F, not 31 °F. Had NASA management relied on these data, the launch would most likely have been postponed (Figure A9; see Appendix B).
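For readers who wish to reproduce the reasoning behind these estimates, the sketch below follows the general logistic-regression approach of Dalal et al. [61]: a binomial GLM of O-ring thermal distress against launch temperature, evaluated at 31 °F. The file name and column names are hypothetical placeholders; the per-flight data themselves are available in the sources cited above.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical file layout: one row per pre-Challenger flight with the launch
# temperature (deg F) and the number of O-rings (out of 6) showing thermal distress.
flights = pd.read_csv("challenger_orings.csv")             # placeholder file name
X = sm.add_constant(flights["temp_f"])                      # placeholder column name
y = pd.DataFrame({"distressed": flights["n_distressed"],
                  "ok": 6 - flights["n_distressed"]})       # (successes, failures) pairs

# Binomial GLM: probability of O-ring distress as a function of temperature
model = sm.GLM(y, X, family=sm.families.Binomial()).fit()
print(model.summary())

# Predicted per-O-ring distress probability at the Challenger launch temperature
p_31 = model.predict([[1.0, 31.0]])
print("predicted distress probability at 31 F:", float(p_31[0]))
```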
The main conclusion that follows from the shuttle accident is obvious. Viewed in isolation, no risk is tolerable; a rational person will not take any risk except in return for the accompanying benefits. Even if a risk is acceptable on a certain basis, it is still unacceptable if the same benefit can be obtained in another way with less risk, and if the risk can be mitigated at low cost, then accepting it unmitigated is judged unacceptable. However, a much greater risk may be perfectly acceptable if it entails a significant reduction in costs or an increase in benefits. As expressed by Aven (2009), risk should be understood as a two-dimensional combination of events A, the consequences of these events C, and the associated uncertainties U [53]. In other words, low uncertainty does not necessarily mean low risk, nor does high uncertainty necessarily mean high risk. Since risk is defined as a two-dimensional combination of consequences and uncertainties, any assessment of the level of risk must consider both aspects; however, quantification is often associated with simplifications and assumptions, and as a result important factors can either be ignored or given too much weight. Qualitative or semi-quantitative analyses can provide a more complete picture of risk. In this regard, the scenario approach to risk analysis provides a broad qualitative picture of risks, highlighting possible hazards and accident scenarios. The difference in points of view only suggests that one should not reject but rather integrate into practice all useful techniques for assessing the probability of events and all approaches to risk management.

6. Conclusions

This article analyzed the applicability of the Monte Carlo method to the optimization of environmental safety in the field of construction, housing, and communal services. Judging by the review of the available literature, research of this kind is among the first. The objective of the study was to develop a method for the construction and utility sector and to create a methodology that makes it possible to predict, depending on the impact of anthropogenic and natural factors, how a building or structure will be built and operated at all stages of its life cycle. As an example, the article provided four scenarios reflecting standard situations that can lead to an emergency. The developed scenarios were calculated using standard formulas and were verified by the Monte Carlo method, which allowed the pessimistic and optimistic scenarios among them to be identified.
The result of the Monte Carlo simulation was a graphical representation of the logistic and lognormal probability distribution of possible outcomes for the given scenarios in the form of a curve on a scalable field of the estimated consequences. The model reproduced the distribution derived from a set of actual data. This risk analysis technique has a computer implementation.
The averted environmental harm was considered in the context of the four scenarios. Obviously, if risk analysts calculate a given scenario in advance (at the design stage), they will be able to determine whether the scenario is optimistic or pessimistic and adjust investments in the required area to prevent an accident (emergency) due to technical or natural causes. The possibility of preventing an accident during construction or operation is an investment in the optimization of environmental safety, because the funds needed to liquidate an accident (emergency situation) are, as a rule, many times higher than the investment in any of the project components (according to the scenarios) that would have prevented it.
The following distribution of investments is relevant when applying the method: (1) in technical safety, to prevent damage to the environment and (2) in measures, to reduce the likelihood of risk. The high risk in an optimistic scenario can be explained by the influence of external factors and comparatively high investment of funds aimed at preventing environmental damage. Low risk in developing a pessimistic scenario is due to a violation of project parameters, despite investments in safety.
The above scenarios do not consider options for improving the safety and health of workers, refining the economic conditions of the company, or improving the environmental conditions around the construction area. The Monte Carlo method makes it possible to optimize the environmental safety of construction by minimizing risks (by choosing an optimistic scenario) as a whole and preventing possible damage and costs for its elimination. The future development of the method would lead to a methodology based on it, which could be applied to assess environmental risks in the construction area to increase its environmental safety.

Author Contributions

Data curation, formal analysis, funding acquisition, project administration, resources, visualization, A.L.; data curation, funding acquisition, investigation, project administration; resources, supervision, validation, E.N.; conceptualization, formal analysis, methodology, project administration, visualization, writing—original draft preparation, writing—review and editing, E.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

All the data are available within this manuscript.

Acknowledgments

This paper has been supported by the RUDN University Strategic Academic Leadership Program.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A. Notes

  • In practice, when carrying out a quantitative risk assessment, the following basic classification of methods is commonly used: statistical methods (to assess the probability of a random event occurring on the basis of the relative frequency of this event in a series of observations); analytical methods (to study functional dependences and to model with probabilistic indicators); and expert assessments (to analyze quantitative and qualitative groups of factors). The qualitative and quantitative methods mentioned above are used as instruments within this classification. For example, the Monte Carlo method can be defined as an analytical method for modeling random variables in order to calculate the characteristics of their distribution; it is the construction of an artificial random process using conventional computing means. The main problem with using classical scenario analysis for risk assessment is that the number of possible combinations that can affect the resulting project performance indicator tends to infinity; solving this problem is one of the main advantages of Monte Carlo simulation. At the same time, such modeling also belongs to the statistical methods, because, knowing the distribution laws of the variables, it is possible to obtain not a single value but the distribution of the resulting indicator for an unlimited number of different scenarios.
  • What is an environmentally friendly construction option? Any construction is unsafe and poses a threat to the environment; the point is to minimize the risk of negative impact by the natural and anthropogenic factors of an environmental hazard. The method of work represents a set of measures that ensure, with a given probability, an admissible negative impact by these factors. Risk should always be viewed in the context of a decision scenario; a risk accompanied by the best solution is then acceptable. In our opinion, all other risks are unacceptable, even if they are smaller. This is what we mean by an environmentally friendly construction option.
  • This definition is formulated in the current document adopted by the Federal Agency for Technical Regulation and Metrology. The state standard itself (i.e., GOST) was prepared by the Scientific Research Center for Control and Diagnostics of Technical Systems and submitted for consideration by the Technical Committee for Standardization TC 10 “Risk Management”. The standard is identical (IDT) to ISO Guide 73:2009, “Risk management—Vocabulary—Guidelines for use in standards”.
  • The statement “Ordinarily, construction must consider the multidimensional nature of the impact on environmental components” indicates the multidimensional nature of the environmental hazard. According to this article, several scenarios are considered that deal with natural and anthropogenic impacts on the environment. The multidimensional nature of an environmental hazard is attributed to several factors and has many properties because both the natural environment and the technogenesis itself are multidimensional, and together, they form some natural and technical system on a certain territory. In turn, the multidimensionality of a natural–technical system is determined by the number and difference of types of natural and artificial objects that give the entire system the specific properties of a multidimensional whole. As for the components of the natural environment, these are earth, subsoil, soils, surface and underground waters, atmospheric air, flora, fauna, and other organisms, as well as artificial objects that together provide favorable conditions for the existence of life. Any change in the parameters of the state of an environmental component at one of its levels of organization causes changes in all of its hierarchical levels. The lists of environmental components, whose descriptions are necessary for decision making, generally depend on the type of planned activity and the expected impacts. Indicative lists of this kind may be found in departmental instructions or corporate guidelines. An important role in clarifying which natural conditions and environmental components need to be described for a given type of project can be played by a scenario approach for assessing their environmental safety.
  • According to the “Methodological fundamentals for the analysis of hazards and risk assessment of accidents at hazardous production facilities” (Russia, 2016, https://docs.cntd.ru/document/1200133801 (accessed on 24 November 2021)), environmental damage for various potentially hazardous industrial and construction sites or projects is estimated as the sum of losses inflicted by each type of environmental pollution according to the following formula:
    $C = EC_A + EC_G + EC_L + EC_B + EC_W$
    where ECA is the compensation for damage from air pollution; ECG is the compensation for damage from the pollution of water bodies (hydrosphere); ECL, for damage from soil pollution (lithosphere); ECB, for damage from biosphere pollution; and ECW, for damage caused to the territory by construction waste. As to (1) who is compensated and (2) by whom: (1) representatives of the state technical supervision of Russia monitor the operation of hazardous facilities and assess the damage caused to the environment by accidents at hazardous production facilities; (2) compensation is paid by law offenders and by the employees of organizations operating hazardous production facilities.
  • It can be stated that the probability is determined not statistically or mathematically but on the basis of how the conditions for the design, construction, and operation of the facility are analyzed. This is partly true, since each method of analysis has some limitations in terms of the results obtained; however, the likelihood certainly does not depend solely on the method of analysis used. The probability assessment is a purely systemic issue. The scenario approach was introduced because a deterministic description of the probability of risk is neither practically possible nor expedient. Risk assessment includes the entire range of methodological tools and the entire context in which it is structured and implemented; mathematical and statistical probability is only a part of that context. Does risk analysis involve establishing causal relationships between a hazardous event and its sources and consequences? Of course it does. Is probability related to the frequency of the occurrence of events? Of course, yes—this is its “empirical” definition. It is often said that we cannot use probability because we do not have enough data; considering the current definitions, this is misleading: when there are not enough data, there is no choice but to use probability. Therefore, probability, as a science that works with a lack of data, fits methodologically well into our research. Risk should not be thought of as acceptable in isolation but only in conjunction with the costs and benefits that come with it. Viewed in isolation, no risk is tolerable; a rational person will not take any risk at all, except, perhaps, in return for the benefits that come with it.
  • Initial data are based on the following Russian guidelines:
    “Methodological recommendations for risk assessment accidents of hydraulic structures reservoirs and storages of industrial waste”, N 9-4/02-644 of 14.08.2001, developed by the Research Institute “VODGEO” (Moscow), https://70.mchs.gov.ru/uploads/resource/2021-07-16/metodicheskie-rekomendacii-mchs-rossii_16264287321541523919.pdf (accessed on 24 November 2021).
    “Methodological recommendations for risk assessment accidents at hydraulic structures of water management and industry” of 01.01.2009, https://normativ.kontur.ru/document?moduleId=1&documentId=235742 (accessed on 24 November 2021). Using these guides, the probability of risk was calculated for the four scenarios. Depending on the desired scenario, the value of C can be set arbitrarily (including values generated by a random number generator), but at least 10 values should be used for convenient plotting. Integer sequences defined by sieves are an interesting phenomenon in this respect, since the random sieve model is more natural than the simple stochastic model of the prime number distribution; in the sieve of Eratosthenes, numbers are sieved out as multiples of every number that is not a multiple of some previous sieving number. As a result, a sequence of random numbers determined by “lucky numbers” appears.
  • Each of the four risk curves (four scenarios) presented in Figure 3 (Results of the estimated example for the objective function) is divided into two components. The new ensemble of risk curves has eight segments (second-level scenarios with five micro-scenarios). Information about them is analyzed in Table A1 and Table A2 and on the next four graphs. Table A1 shows that each scenario includes two types of contributions: (1) preventive environmental safety measures, s1, and (2) maintaining the reliability of the technical system, s2. The numerical values make it clear which type is prioritized in the total investment in a particular scenario. The columns indicate the values of risks specific to each of the five micro-scenarios. Further, to assess a specific share of investments (s1 or s2), the set of risk values in the rows is indicated. The following rows have the same information in percentages. A summary of each scenario is presented in the last lines. Table A1 introduces the data for the pessimistic scenario. Table A2 reflects information for an optimistic (one might say, optimal) option.
  • The authors aimed to identify important relationships between observations, model inputs, and predictions and to develop a better scenario model. The sensitivity analysis for the pessimistic options shows the importance of the s2 parameter. It is wise to focus on the first 10 parameters, controlling them and simplifying the model by relegating the other micro-scenarios to the background. It also makes sense to fix the model inputs that do not affect the output and to identify redundant parts of the model structure. Uncertainty can be reduced by identifying the inputs of the scenario model that cause significant uncertainty in the output; these should be the focus when increasing the stability of the forecast, and the most essential initial variables and their changes should be controlled first. Based on the results obtained, it can be concluded that fluctuations in the value of risk when only one scenario variable is changed are rather small; therefore, the forecast risk associated with this variable is low. As a result of the calculation, the risk sensitivity of the optimistic scenarios was determined. The sensitivity diagram shows that s2 and s1 are the key scenario parameters that determine the degree of risk of construction projects; it is therefore important to control them in order to improve the environmental safety of projects. In both the pessimistic and the optimistic scenarios, the s2 parameter is the priority contribution to improving environmental safety in the construction industry (a minimal sketch of the underlying rank-correlation idea is given after this list).
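A minimal sketch of the rank-correlation idea behind such sensitivity diagrams is given below; the input distributions for s1 and s2 and the response formula are assumed for illustration only and do not reproduce the Crystal Ball model used in the study.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n_trials = 10_000

# Assumed input distributions for the two investment shares (placeholders);
# a sensitivity chart of this kind ranks inputs by their rank correlation with the output.
s1 = rng.triangular(left=0.0, mode=2.0, right=5.0, size=n_trials)   # environmental safety measures
s2 = rng.triangular(left=0.0, mode=3.0, right=5.0, size=n_trials)   # technical reliability
P, I, C = 0.7, 10.0, 100.0

# Illustrative response: risk falls as total contributions D = s1 + s2 prevent damage
D = s1 + s2
R = (1 - P) * D / I + P * (C - 8.0 * D) / C   # 8.0 is an assumed damage-prevention rate

for name, x in (("s1", s1), ("s2", s2)):
    rho, _ = stats.spearmanr(x, R)
    print(f"rank correlation of {name} with risk R: {rho:+.3f}")
```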

Appendix B. Figures and Tables

Figure A1. The whole flow diagram of the research method.
Table A1. Risk assessment values for investments in s1 and s2 (pessimistic options).

R1(c), p = 0.3:
  preventive environmental safety measures, s1: 0.36, 0.42, 0.49, 0.55, 0.56; ∑ R1(c) = 2.38 (42.88%)
  maintaining the reliability of the technical system, s2: 0.57, 0.58, 0.62, 0.67, 0.73; ∑ R1(c) = 3.17 (57.12%)
  ∑ R1(c): 0.93, 1.0, 1.11, 1.22, 1.29; total 5.55 (100%)

R2(c), p = 0.6:
  preventive environmental safety measures, s1: 0.58, 0.56, 0.54, 0.52, 0.5; ∑ R2(c) = 2.7 (51.92%)
  maintaining the reliability of the technical system, s2: 0.48, 0.49, 0.5, 0.51, 0.52; ∑ R2(c) = 2.5 (48.08%)
  ∑ R2(c): 1.06, 1.05, 1.04, 1.03, 1.02; total 5.2 (100%)
Table A2. Risk assessment values for investments in s1 and s2 (optimistic options).

R3(c), p = 0.1:
  preventive environmental safety measures, s1: 0.18, 0.27, 0.35, 0.44, 0.52; ∑ R3(c) = 1.76 (31.71%)
  maintaining the reliability of the technical system, s2: 0.61, 0.69, 0.78, 0.86, 0.95; ∑ R3(c) = 3.89 (70.09%)
  ∑ R3(c): 0.79, 0.96, 1.13, 1.3, 1.47; total 5.65 (100%)

R4(c), p = 0.7:
  preventive environmental safety measures, s1: 0.66, 0.62, 0.51, 0.47, 0.43; ∑ R4(c) = 2.69 (51.73%)
  maintaining the reliability of the technical system, s2: 0.39, 0.35, 0.34, 0.34, 0.3; ∑ R4(c) = 1.72 (33.08%)
  ∑ R4(c): 1.05, 0.97, 0.85, 0.81, 0.73; total 4.41 (100%)
Figure A2. The risk value dependence on involved contributions amount (pessimistic scenarios).
Figure A3. The risk value dependence on involved contributions amount (optimistic scenarios).
Figure A4. Statistical indicators after simulation (pessimistic scenarios).
Figure A5. A sensitivity diagram for assessing the main parameters (pessimistic scenarios), after 10,000 trials.
Figure A6. Statistical indicators after simulation (optimistic scenarios).
Figure A7. A sensitivity diagram for assessing the main parameters (optimistic scenarios), after 10,000 trials.
Figure A8. Number of incidents in the O-rings versus temperatures. Panel (a) includes only flights with incidents. Panel (b) contains all flights (with and without incidents) (https://bookdown.org/egarpor/PM-UC3M/glm-challenger.html (accessed on 24 November 2021)).
Figure A9. Posterior distribution for shuttle failure probability at 31° and 60 °F [62].

References

  1. Kuang, Z.; Gu, Y.; Rao, Y.; Huang, H. Biological risk assessment of heavy metals in sediments and health risk assessment in marine organisms from Daya Bay, China. J. Mar. Sci. Eng. 2020, 9, 17. [Google Scholar] [CrossRef]
  2. Pirsaheb, M.; Hadei, M.; Sharafi, K. Human health risk assessment by Monte Carlo simulation method for heavy metals of commonly consumed cereals in Iran: Uncertainty and sensitivity analysis. J. Food Compos. Anal. 2021, 96, 103697. [Google Scholar] [CrossRef]
  3. Von Neumann, J. Various techniques used in connection with random digits. Natl. Bur. Stand. Appl. Math. Ser. 1951, 12, 36–38. [Google Scholar]
  4. Peres, Y. Iterating von Neumann’s procedure for extracting random bits. Ann. Stat. 1992, 20, 590–597. [Google Scholar] [CrossRef]
  5. Metropolis, N. The beginning of the Monte Carlo method. Los Alamos Sci. 1987, 15, 125–130. [Google Scholar]
  6. Metropolis, N.; Ulam, S. The Monte Carlo method. J. Am. Stat. Assoc. 1949, 44, 335–341. [Google Scholar] [CrossRef] [PubMed]
  7. Gardiner, V.; Lazarus, R.; Metropolis, N.; Ulam, S. On certain sequences of integers defined by sieves. Math. Mag. 1956, 29, 117. [Google Scholar] [CrossRef] [Green Version]
  8. Smith, R.L. Use of Monte Carlo simulation for human exposure assessment at a superfund site. Risk Anal. 1994, 14, 433–439. [Google Scholar] [CrossRef] [PubMed]
  9. Sonnemann, G.; Castells, F.; Schuhmacher, M.; Hauschild, M. Integrated Life-Cycle and Risk Assessment for Industrial Processes; CRC Press: Boca Raton, FL, USA, 2004; 391p. [Google Scholar] [CrossRef]
  10. Rosa, E.A. Metatheoretical foundations for post-normal risk. J. Risk Res. 1998, 1, 15–44. [Google Scholar] [CrossRef]
  11. Renn, O.; Klinke, A. Risk governance and resilience: New approaches to cope with uncertainty and ambiguity. In Risk Governance: The Articulation of Hazard, Politics and Ecology; Paleo, U., Ed.; Springer: Dordrecht, The Netherlands, 2015; pp. 19–41. [Google Scholar]
  12. Campbell, S. Determining overall risk. J. Risk Res. 2005, 8, 569–581. [Google Scholar] [CrossRef]
  13. Wiener, J.B.; Graham, J.D. Resolving risk tradeoffs. In Risk versus Risk: Tradeoffs in Protecting Health and the Environment; Graham, J.D., Wiener, J.B., Sunstein, C.R., Eds.; Harvard University Press: Cambridge, MA, USA, 1997; pp. 226–272. [Google Scholar] [CrossRef]
  14. Lowrance, W.W. Of Acceptable Risk: Science and the Determination of Safety; W. Kaufmann: Los Altos, CA, USA, 1976; 180p. [Google Scholar]
  15. Kaplan, S.; Garrick, B.J. On the quantitative definition of risk. Risk Anal. 1981, 1, 11–27. [Google Scholar] [CrossRef]
  16. Fathi-Vajargah, B.; Hassanzadeh, Z. A new Monte Carlo method for solving systems of linear algebraic equations. Comput. Methods Differ. Equ. 2021, 9, 159–179. [Google Scholar]
  17. Wang, M.-J.; Sjoden, G.E. Experimental and computational dose rate evaluation using SN and Monte Carlo method for a packaged 241AmBe neutron source. Nucl. Sci. Eng. 2021, 195, 1154–1175. [Google Scholar] [CrossRef]
  18. Chapra, S.C. Applied Numerical Methods with MATLAB for Engineers and Scientists; McGraw-Hill Education: New York, NY, USA, 2018; 697p. [Google Scholar]
  19. Kushner, H.J.; Dupuis, P.G. Numerical Methods for Stochastic Control Problems in Continuous Time—Applications of Mathematics; Springer: New York, NY, USA, 2014; 475p. [Google Scholar]
  20. Branford, S.; Sahin, C.; Thandavan, A.; Weihrauch, C.; Alexandrov, V.; Dimov, I. Monte Carlo methods for matrix computations on the grid. Futur. Gener. Comput. Syst. 2008, 24, 605–612. [Google Scholar] [CrossRef]
  21. Rashki, M. The soft Monte Carlo method. Appl. Math. Model. 2021, 94, 558–575. [Google Scholar] [CrossRef]
  22. Løvbak, E.; Samaey, G.; Vandewalle, S. A multilevel Monte Carlo method for asymptotic-preserving particle schemes in the diffusive limit. Numer. Math. 2021, 148, 141–186. [Google Scholar] [CrossRef]
  23. Peter, R.; Bifano, L.; Fischerauer, G. Monte Carlo method for the reduction of measurement errors in the material parameter estimation with cavities. Tech. Mess. 2021, 88, 303–310. [Google Scholar] [CrossRef]
  24. Chen, G.; Wan, Y.; Lin, H.; Hu, H.; Liu, G.; Peng, Y. Vertical tank capacity measurement based on Monte Carlo method. PLoS ONE 2021, 16, e0250207. [Google Scholar] [CrossRef]
  25. Huo, X. A compact Monte Carlo method for the calculation of k∞ and its application in analysis of (n,xn) reactions. Nucl. Eng. Des. 2021, 376, 111092. [Google Scholar] [CrossRef]
  26. Choobar, B.G.; Modarress, H.; Halladj, R.; Amjad-Iranagh, S. Electrodeposition of lithium metal on lithium anode surface, a simulation study by: Kinetic Monte Carlo-embedded atom method. Comput. Mater. Sci. 2021, 192, 110343. [Google Scholar] [CrossRef]
  27. Sharma, A.; Sastri, O.S.K.S. Numerical solution of Schrodinger equation for rotating Morse potential using matrix methods with Fourier sine basis and optimization using variational Monte-Carlo approach. Int. J. Quantum Chem. 2021, 121, e26682. [Google Scholar] [CrossRef]
  28. Toropov, A.; Toropova, A.; Lombardo, A.; Roncaglioni, A.; Lavado, G.; Benfenati, E. The Monte Carlo method to build up models of the hydrolysis half-lives of organic compounds. SAR QSAR Environ. Res. 2021, 32, 463–471. [Google Scholar] [CrossRef] [PubMed]
  29. Che, Y.; Wu, X.; Pastore, G.; Li, W.; Shirvan, K. Application of Kriging and Variational Bayesian Monte Carlo method for improved prediction of doped UO2 fission gas release. Ann. Nucl. Energy 2021, 153, 108046. [Google Scholar] [CrossRef]
  30. Pitchai, P.; Jha, N.K.; Nair, R.G.; Guruprasad, P. A coupled framework of variational asymptotic method based homogenization technique and Monte Carlo approach for the uncertainty and sensitivity analysis of unidirectional composites. Compos. Struct. 2021, 263, 113656. [Google Scholar] [CrossRef]
  31. Toropova, A.P.; Toropov, A.A. Can the Monte Carlo method predict the toxicity of binary mixtures? Environ. Sci. Pollut. Res. 2021, 28, 39493–39500. [Google Scholar] [CrossRef]
  32. Lee, E.-K. Determination of burnup limit for CANDU 6 fuel using Monte-Carlo method. Nucl. Eng. Technol. 2021, 53, 901–910. [Google Scholar] [CrossRef]
  33. Oh, K.-Y.; Nam, W. A fast Monte-Carlo method to predict failure probability of offshore wind turbine caused by stochastic variations in soil. Ocean Eng. 2021, 223, 108635. [Google Scholar] [CrossRef]
  34. Oliver, J.; Qin, X.S.; Madsen, H.; Rautela, P.; Joshi, G.C.; Jorgensen, G. A probabilistic risk modelling chain for analysis of regional flood events. Stoch. Environ. Res. Risk Assess. 2019, 33, 1057–1074. [Google Scholar] [CrossRef]
  35. Stewart, M.G.; Ginger, J.D.; Henderson, D.J.; Ryan, P.C. Fragility and climate impact assessment of contemporary housing roof sheeting failure due to extreme wind. Eng. Struct. 2018, 171, 464–475. [Google Scholar] [CrossRef]
  36. Qin, H.; Stewart, M.G. Risk perceptions and economic incentives for mitigating windstorm damage to housing. Civ. Eng. Environ. Syst. 2020, 38, 1–19. [Google Scholar] [CrossRef]
  37. Malmasi, S.; Fam, I.M.; Mohebbi, N. Health, safety and environment risk assessment in gas pipelines. J. Sci. Ind. Res. 2010, 69, 662–666. [Google Scholar]
  38. Karagöz, D. Asymmetric control limits for range chart with simple robust estimator under the non-normal distributed process. Math. Sci. 2018, 12, 249–262. [Google Scholar] [CrossRef] [Green Version]
  39. Couto, P.R.G.; Carreteiro, J.; de Oliveir, S.P. Monte Carlo simulations applied to uncertainty in measurement. In Theory and Applications of Monte Carlo Simulations; Chan, V., Ed.; InTechOpen: London, UK, 2013; pp. 27–51. [Google Scholar]
  40. Kalos, M.H.; Whitlock, P.A. Monte Carlo Methods; Wiley-VCH: Weinheim, Germany, 2009; 203p. [Google Scholar]
  41. Bieda, B. Stochastic approach to municipal solid waste landfill life based on the contaminant transit time modeling using the Monte Carlo (MC) simulation. Sci. Total Environ. 2013, 442, 489–496. [Google Scholar] [CrossRef] [PubMed]
  42. Aczel, A.D. Statistics: Concepts and Applications; Irwin: Chicago, IL, USA, 1995; 533p. [Google Scholar]
  43. Benjamin, J.R.; Cornell, C.A. Probability, Statistics and Decision for Civil Engineers; Dover Publication: Mineola, NY, USA, 2018; 684p. [Google Scholar]
  44. Bieda, B. Stochastic Analysis in Production Process and Ecology under Uncertainty; Springer: Berlin/Heidelberg, Germany; New York, NY, USA, 2012; 189p. [Google Scholar]
  45. Taleb, N.N. The Black Swan: The Impact of the Highly Improbable; Random House: New York, NY, USA, 2007; 366p. [Google Scholar]
  46. Strakhova, N.A.; Karmazin, S.A. Characteristics of the most used methods of risk analysis. Eurasian Sci. J. 2013, 3, 122–128. Available online: http://naukovedenie.ru/PDF/22ergsu313.pdf (accessed on 24 November 2021).
  47. Jones, M.; Silberzahn, P. Constructing Cassandra: Reframing Intelligence Failure at the CIA, 1947–2001; Stanford University Press: Stanford, CA, USA, 2020; 375p. [Google Scholar]
  48. Larionov, A.; Nezhnikova, E. Energy efficiency and the quality of housing projects. ARPN J. Eng. Appl. Sci. 2016, 11, 2023–2029. [Google Scholar]
  49. Smirnova, E.; Larionov, A. Justification of environmental safety criteria in the context of sustainable development of the construction sector. E3S Web Conf. 2020, 157, 06011. [Google Scholar] [CrossRef] [Green Version]
  50. Larionova, Y.; Smirnova, E. Substantiation of ecological safety criteria in construction industry, and housing and communal services. IOP Conf. Ser. Earth Environ. Sci. 2020, 543, 012002. [Google Scholar] [CrossRef]
  51. Smirnova, E. Environmental risk analysis in construction under uncertainty. In Reconstruction and Restoration of Architectural Heritage; Sementsov, S., Leontyev, A., Huerta, S., Menéndez Pidal de Nava, I., Eds.; CRC Press: London, UK, 2020; pp. 222–227. [Google Scholar] [CrossRef]
  52. Kingman, J. Poisson Processes; Oxford Studies in Probability; Clarendon: Oxford, UK, 2002; 104p. [Google Scholar]
  53. Aven, R. Risk analysis and management: Basic concepts and principles. Reliab. Theory Appl. 2009, 1, 57–73. [Google Scholar]
  54. Stoica, G. Relevant coherent measures of risk. J. Math. Econ. 2006, 42, 794–806. [Google Scholar] [CrossRef]
  55. Smirnova, E. Monte Carlo simulation of environmental risks of technogenic impact. In Contemporary Problems of Architecture and Construction; Rybnov, E., Akimov, P., Khalvashi, M., Vardanyan, E., Eds.; CRC Press: London, UK, 2021; pp. 355–360. [Google Scholar] [CrossRef]
  56. Zadeh, L.A. Fuzzy sets as a basis for a theory of possibility. Fuzzy Sets Syst. 1999, 100, 9–34. [Google Scholar] [CrossRef]
  57. Pappenberger, F.; Beven, K.J. Ignorance is bliss: Or seven reasons not to use uncertainty analysis. Water Resour. Res. 2006, 42, W05302. [Google Scholar] [CrossRef]
  58. Silberzahn, P. Welcome to Extremistan: Why Some Things Cannot be Predicted and What That Means for Your Strategy. 2011. Available online: https://silberzahnjones.com/2011/11/10/welcome-to-extremistan/ (accessed on 24 November 2021).
  59. Spiegelhalter, D.; Pearson, M.; Short, I. Visualizing uncertainty about the future. Science 2011, 333, 1393–1400. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  60. Semenoglou, A.-A.; Spiliotis, E.; Makridakis, S.; Assimakopoulos, V. Investigating the accuracy of cross-learning time series forecasting methods. Int. J. Forecast. 2021, 37, 1072–1084. [Google Scholar] [CrossRef]
  61. Dalal, S.R.; Fowlkes, E.B.; Hoadley, B. Risk analysis of the space shuttle: Pre-Challenger prediction of failure. J. Am. Stat. Assoc. 1989, 84, 945–957. [Google Scholar] [CrossRef]
  62. Kelly, D.L.; Smith, C.L. Risk analysis of the space shuttle: Pre-Challenger Bayesian prediction of failure. In Proceedings of the Conference on NASA Systems Safety Engineering and Risk Management, Los Angeles, CA, USA, 20 February 2008; Office of Nuclear Energy, Science, and Technology/Idaho National Laboratory: Washington, DC, USA, 2008; pp. 1–12. [Google Scholar]
  63. Portugués, E.G. Notes for Predictive Modeling. Version 5.9.0. 2021. Available online: https://bookdown.org/egarpor/PM-UC3M/ (accessed on 24 November 2021).
Figure 1. Illustrations of the methodologies of (a) propagation of uncertainties and (b) propagation of distributions. Source: https://www.intechopen.com/chapters/43533 (accessed on 24 November 2021).
Figure 2. Initial data of scenarios for the target function and the graph of changes in the prevented damage.
Figure 3. Results of the estimated example for the objective function.
Figure 4. “Fit Distribution” tool (Crystal Ball program, Ver. 11.1.2.4.850).
Figure 5. Logistic distribution (for scenarios 1 and 2) during Monte Carlo simulation modelling. (a) Logistic distribution with μ = 0.54 and s = 0.04. (b) Parameters: Mode, 0.54; Mean, μ; and Scale, s > 0.
Figure 6. Lognormal distribution (for scenarios 3 and 4) during Monte Carlo simulation modelling. (a) Lognormal distribution with Mode = 0.39, μL = 0.51 and σL = 0.22. (b) Parameters: Mode = 0.39. Mean, μL > 0; and Std. Dev., σL > 0.
Figure 7. Graphical methods checking the logistic distribution: normal distribution, histogram, and QQ-plot chart.
Figure 8. Graphical methods checking the lognormal distribution: normal distribution, histogram, and QQ-plot chart.
Table 1. Data on the parameters of the probability distribution.

Column groups: (1 − Pi) × D/I (columns D1–D4); Ri(D) = (1 − Pi) × D/I + Pi[C − ci(D)]/C (columns R1(c)–R4(c)); Pi[C − ci(D)]/C (columns P1–P4).

D1     D2     D3     D4     R1(c)   R2(c)   R3(c)   R4(c)   P1      P2      P3      P4
0      0      0      0      0.3     0.6     0.1     0.7     0.3     0.6     0.1     0.7
0.07   0.04   0.09   0.03   0.36    0.58    0.185   0.66    0.27    0.54    0.095   0.63
0.14   0.08   0.18   0.06   0.42    0.56    0.27    0.62    0.24    0.48    0.09    0.56
0.21   0.12   0.27   0.09   0.49    0.54    0.355   0.51    0.18    0.42    0.085   0.42
0.28   0.16   0.36   0.12   0.55    0.52    0.44    0.47    0.15    0.36    0.08    0.35
0.35   0.2    0.45   0.15   0.56    0.5     0.525   0.43    0.12    0.3     0.075   0.28
0.42   0.24   0.54   0.18   0.57    0.48    0.61    0.39    0.09    0.24    0.07    0.21
0.49   0.28   0.63   0.21   0.58    0.49    0.695   0.35    0.06    0.21    0.065   0.14
0.56   0.32   0.72   0.24   0.62    0.5     0.78    0.345   0.045   0.18    0.06    0.105
0.63   0.36   0.81   0.27   0.67    0.51    0.865   0.34    0.03    0.15    0.055   0.07
0.7    0.4    0.9    0.3    0.73    0.52    0.95    0.3     0       0.12    0.05    0
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
