Evaluation of Machinery Readiness Using Semi-Markov Processes

This article uses Markov and semi-Markov models, which are among the most popular tools for estimating readiness and reliability. They make it possible to evaluate both individual elements and entire systems, including production systems, as multi-state structures. Distinguishing states with varying degrees of technical readiness in complicated and complex objects (systems) makes it possible to determine their individual impact on the tasks performed, as well as on total reliability. Applying the Markov process requires that the process dwell times in the individual states be random variables with exponential distributions and that the Markov property of the independence of these states be fulfilled. Omitting these assumptions may lead to erroneous results, which the authors set out to show. The article compares the results of examining a process with non-parametric distributions against an analysis in which their exponential form was (groundlessly) assumed; significantly different results were obtained. The aim was to draw attention to the discrepancies obtained and to the importance of a preliminary assessment of the data collected for examination. In addition, the readiness of a machine operating in the studied production company was diagnosed. This allowed its operational potential to be evaluated, especially in the context of solving process optimization problems.


Background Introduction to the Study
Ensuring the desired availability of all machine tools in a production line is an important issue [1,2]. Availability stands for their ability to obtain and maintain the functional state necessary to produce the required performance [3][4][5]. The technical readiness of machines is an important element of company diagnostics and should be estimated, as its evaluation helps shape the capacity of a production line. High machine tool reliability translates into no unnecessary downtime and, consequently, greater process efficiency. Machine tools must be technically sound, adequately controlled, and supplied with the necessary materials, energy and information [6]. The availability of a machine tool is determined using a probability theory-based reliability model. In probability theory, the state of an object is defined as the result of one and only one event in a sequence of trials from a finite or countable set of elementary events that exclude each other in pairs [7]. This makes it possible to use the tools of probability calculus and mathematical statistics to analyze technical systems. When machine tools are in operation, they transit stochastically from one state to another. As a result, transition probabilities are associated with all machine tools in a production line. Therefore, the Markov chain and its derivatives are often used to build reliability models. Some of the relevant articles in which Markov chain-based reliability models are used to study the availability of machine tools in a production line are described below.
The use of Markov processes and their generalization, semi-Markov processes, is popular. Their use is dictated by the multi-state nature of technical objects and by the assumption that assessing the individual functional states an object can be in is a better measure than the readiness of the object as a whole. However, the use of these models is subject to restrictions. First of all, it is necessary to fulfil the Markov property, which states that the probability of a future state is independent of the past states and depends only on the present state. Identifying a model without meeting this assumption may lead to false conclusions, as many authors suggest [8,9]. They point out that ignoring the examination of the Markov property results in incorrect analysis results, e.g., Shi et al. [10], Zhang et al. [8], or Kozłowski et al. [11]. Therefore, it is necessary to examine the randomness of the sequences of subsequent operational states, as is done by Yang et al. [12] or Komorowski and Raffa [13].
In addition, Markov models require the assumption that the unconditional dwell times of the process in the individual states, as well as the conditional durations of an individual state given that the next state is one of the remaining states, are random variables with exponential distributions [14,15]. Many authors point out that the proper matching of distributions affects the reliability of results [13,16]. They use the Markov model for exponential distributions [17,18] and the semi-Markov model for the remaining ones, e.g., Weibull [19] or Gamma [20]. Adopting an assumption about the form of the distribution without a statistical examination of the collected sample may lead to wrong conclusions.

The Aim of the Study
The need to check Markov's property is discussed in more detail in the literature [10,11], while less attention is paid to the distribution of variables studied. Therefore, this publication compares the results of process examination according to the semi-Markov model for variables of non-parametric distribution with the analysis according to the Markov model, in which their exponential form was (falsely) assumed. The differences in the results obtained clearly indicate that it is necessary to carry out a preliminary test before choosing the right model. Failure to meet the assumptions leads to an inaccurate analysis of the process.
The aim of the article was also to evaluate the readiness of a production machine, an important element of the analyzed production process. The results obtained made it possible to determine the probabilities of transitions between the individual states distinguished in the production process, as well as to define the limit probabilities and the technical readiness coefficient. This makes it possible to assess whether the analyzed process functions in accordance with the schedule adopted in the company, and to evaluate its production capabilities. The proposed models can also be used to simulate the production process, e.g., at the design phase.
The article consists of five sections. The first presents an analysis of the literature on the application of Markov models for studying the technical readiness of machine tools in a production line. Section 2 presents a mathematical formulation of the research problem. Section 3 describes the studied company and analyzes the data in terms of the Markov property and the form of the distributions of the variables. Section 4 presents a case study containing the estimation of the Markov and semi-Markov model parameters, as well as a numerical example and detailed calculations according to the developed model. The article ends with conclusions describing the goals achieved and indicating the added value of the study.

Mathematical Modeling
Definition 1. Let (Ω, F, P) be a probability space and {X(t) : t ∈ T} a stochastic process defined on (Ω, F, P), taking values from a finite or countable state space S = {1, ..., s}, s < ∞. The process {X(t) : t ∈ T} is called a Markov process if for each i, j, i_0, i_1, ..., i_{n-1} ∈ S and for each t_0, t_1, ..., t_n, t_{n+1} ∈ T satisfying t_0 < t_1 < ... < t_n < t_{n+1} the following dependency holds:

P(X(t_{n+1}) = j | X(t_n) = i, X(t_{n-1}) = i_{n-1}, ..., X(t_0) = i_0) = P(X(t_{n+1}) = j | X(t_n) = i).

Assuming that t_n = u and t_{n+1} = τ, the conditional probability

p_ij(u, τ) = P(X(τ) = j | X(u) = i), i, j ∈ S,

denotes the probability of transition from state i at time u to state j at time τ. Assuming that t_0, t_1, ..., t_{n-1} denote instants from the past, t_n the present instant, and t_{n+1} an instant in the future, the equation says that the future does not depend on the past when the present is known; thus the probability of a future state depends only on the present state, not on the past states. This property is called the Markov property, and a stochastic process that satisfies it is called memoryless. If the time instants are discrete, T = N_0 = {0, 1, 2, ...}, we are dealing with a Markov chain; when the process is realized in continuous time, T = R_+ = [0, ∞), it is a continuous-time Markov process.
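As a minimal illustration of the Markov property defined above, the following Python sketch simulates a discrete-time Markov chain over a small state space; the transition matrix is an assumption chosen for illustration, not data from the study.

```python
import numpy as np

# Illustrative transition matrix for a 3-state chain (assumed values).
P = np.array([
    [0.7, 0.2, 0.1],
    [0.3, 0.5, 0.2],
    [0.2, 0.3, 0.5],
])

def simulate_chain(P, start, steps, rng):
    """Return a state trajectory: the next state depends only on the
    current one, which is exactly the Markov (memoryless) property."""
    states = [start]
    for _ in range(steps):
        states.append(int(rng.choice(len(P), p=P[states[-1]])))
    return states

rng = np.random.default_rng(0)
trajectory = simulate_chain(P, start=0, steps=10, rng=rng)
```

The conditioning on the full history collapses to conditioning on the last state, so only `states[-1]` enters the sampling step.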
For the stochastic process {X(t) : t ≥ 0} taking values from the finite or countable set S, with piecewise-constant and right-continuous trajectories, let τ_0 = 0 mark the start of the process and τ_1, τ_2, ... denote the successive instants of state changes. The random variable

T_i = τ_{n+1} - τ_n, given that X(τ_n) = i,

denotes the waiting time in state i when the successor state is unknown. From the Chapman-Kolmogorov equation it follows [21,22] that the process dwell times in the individual states are random variables with exponential distributions with parameter λ_i > 0:

G_i(t) = P(T_i ≤ t) = 1 - e^(-λ_i t), t ≥ 0,

where G_i is the cumulative distribution function of the random variable T_i [23] when the successor state is unknown. Semi-Markov processes generalize Markov processes: their dwell times in the individual states can have arbitrary distributions concentrated on the set [0, ∞). Following [24,25], this article defines the semi-Markov process with a finite set of states starting from the Markov renewal process.
In the probability space (Ω, F, P), random variables ξ_n : Ω → S and ϑ_n : Ω → R_+ are defined for each n ∈ N. The two-dimensional sequence of random variables {(ξ_n, ϑ_n) : n ∈ N} is referred to as a Markov renewal process if for each n ∈ N, i, j ∈ S, t ∈ R_+:

P(ξ_{n+1} = j, ϑ_{n+1} ≤ t | ξ_n = i, ϑ_n, ..., ξ_0, ϑ_0) = P(ξ_{n+1} = j, ϑ_{n+1} ≤ t | ξ_n = i)

and

P(ξ_0 = i, ϑ_0 = 0) = P(ξ_0 = i).

This definition shows that the Markov renewal process is a specific case of a two-dimensional Markov process whose transition probabilities depend solely on the discrete coordinate. The Markov renewal process {(ξ_n, ϑ_n) : n ∈ N} is called homogeneous if the probabilities

Q_ij(t) = P(ξ_{n+1} = j, ϑ_{n+1} ≤ t | ξ_n = i)

do not depend on n.
From the above definition it follows that for each pair (i, j) ∈ S × S the function Q_ij(t) is a non-decreasing function of t with Q_ij(0) = 0 and lim_{t→∞} Q_ij(t) = p_ij [24,25]. The functional matrix

Q(t) = [Q_ij(t)], i, j ∈ S,

is called the renewal kernel of the semi-Markov process and, together with the initial distribution

p_i = P(ξ_0 = i), i ∈ S,

characterizes the homogeneous Markov renewal process. The semi-Markov process is defined on the basis of the homogeneous Markov renewal process {(ξ_n, ϑ_n) : n ∈ N}. Let:

τ_0 = ϑ_0 = 0, τ_n = ϑ_1 + ... + ϑ_n, n ∈ N.

The stochastic process {X(t) : t ∈ R_+} that assumes the constant value ξ_n on the interval [τ_n, τ_{n+1}), n ∈ N:

X(t) = ξ_n for t ∈ [τ_n, τ_{n+1}),

is called the semi-Markov process. Markov and semi-Markov models are particularly often used to assess the readiness and reliability of technical facilities or their individual components [26][27][28]. Various systems, including production ones [29,30], are analyzed both in terms of maintaining operability [31] and production organization [32], as well as shaping demand [33]. This article analyzes the production system from the point of view of machine readiness to perform production tasks.
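The construction above can be illustrated with a short simulation: an embedded Markov chain chooses the successor state, while the dwell time in each state is drawn from an arbitrary (here Weibull) distribution rather than the exponential one a Markov process would require. All matrices and parameters below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
# Embedded chain of a 3-state semi-Markov process (assumed values).
P = np.array([[0.0, 0.5, 0.5],
              [0.7, 0.0, 0.3],
              [0.6, 0.4, 0.0]])
shape = np.array([1.5, 0.8, 2.0])   # Weibull shape per state (assumed)
scale = np.array([10.0, 5.0, 8.0])  # Weibull scale per state (assumed)

def simulate_semi_markov(horizon):
    """Return the jump instants tau_n and the states entered at them."""
    t, state = 0.0, 0
    jumps = [(t, state)]
    while t < horizon:
        # Dwell time theta_{n+1}: arbitrary distribution, here Weibull.
        t += scale[state] * rng.weibull(shape[state])
        # Successor state chosen by the embedded Markov chain.
        state = int(rng.choice(3, p=P[state]))
        jumps.append((t, state))
    return jumps

trajectory = simulate_semi_markov(horizon=100.0)
```

Between jumps the process X(t) holds the constant value ξ_n, exactly as in the definition of the semi-Markov process.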

Description of the Company Studied
The subject of the research is a company manufacturing plastic garbage bags. It is a three shift serial production, with 8-h shifts. The roller welding machines, which weld and perforate finished rolls of polyethylene film, constitute a critical element of the whole process. Among all the machines in the production line, the efficiency of the roller welding machines is the lowest, they have the highest failure rate and their downtimes lead to substantial increase in costs, making them a bottleneck in the process. This is why they became the subject of the study. The analysis was carried out on the example of a selected model, marked with the H2 symbol.
The analysis of the activities carried out when operating the roller welding machines allowed the states of the machine to be distinguished. They are presented in Table 1.

Table 1. Operating states of the machine.

S1: Operation
S2: Failure
S3: Downtime due to lack of orders
S4: Maintenance activities (cleaning, reorientation, knife adjustment, inspection, roll change, consultation, raw material preparation)
S5: Scheduled employee breaks
S6: Downtime due to lack of raw materials

Among the selected states, those directly related to the production process should be distinguished: S1, the manufacturing process, and S4, the maintenance activities necessary to keep the machine in good working order and to prepare it for the manufacturing process. Planned employee breaks, resulting from the Labor Code (S5), also constitute a necessary element of the manufacturing process. The other states should be identified as undesirable. These include failure (S2) and the stoppages of the manufacturing process due to lack of orders (S3) and lack of raw materials (S6).
The relationships between the individual states are shown in Figure 1.

Studying Markov's Property
In the first stage, the memorylessness of the process was assessed. The χ2 goodness of fit test was used, defining at the significance level α = 0.05 the null hypothesis that the chain studied meets the Markov property, and the alternative hypothesis that the Markov property is not met [11]. The test statistic was χ2 = 228.7, with an associated p-value of 0.264, which means that there are no grounds to reject the null hypothesis that the chain meets the Markov property.
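One common way to carry out such a χ2 test of the Markov property, sketched below, is to compare second-order transition counts with the counts expected under a first-order chain; this illustrates the idea and is not necessarily the exact procedure of [11]. The test sequence is generated, not the plant data.

```python
import numpy as np
from scipy import stats

def markov_property_test(seq, n_states):
    """Chi-square comparison of second-order transition counts n_ijk
    against the counts expected from first-order probabilities p_jk."""
    # First-order transition counts and probabilities p_jk.
    n2 = np.zeros((n_states, n_states))
    for a, b in zip(seq[:-1], seq[1:]):
        n2[a, b] += 1
    rows = n2.sum(axis=1, keepdims=True)
    p = np.divide(n2, rows, out=np.zeros_like(n2), where=rows > 0)

    # Second-order counts over consecutive triples (i, j, k).
    n3 = np.zeros((n_states, n_states, n_states))
    for a, b, c in zip(seq[:-2], seq[1:-1], seq[2:]):
        n3[a, b, c] += 1

    chi2 = 0.0
    for i in range(n_states):
        for j in range(n_states):
            n_ij = n3[i, j].sum()  # occurrences of the pair (i, j)
            for k in range(n_states):
                expected = n_ij * p[j, k]  # expected n_ijk under H0
                if expected > 0:
                    chi2 += (n3[i, j, k] - expected) ** 2 / expected
    dof = n_states * (n_states - 1) ** 2  # usual dof for this comparison
    return chi2, stats.chi2.sf(chi2, dof)

# Illustrative sequence generated from a first-order Markov chain.
rng = np.random.default_rng(2)
P = np.array([[0.6, 0.3, 0.1], [0.2, 0.5, 0.3], [0.4, 0.2, 0.4]])
seq = [0]
for _ in range(500):
    seq.append(int(rng.choice(3, p=P[seq[-1]])))
chi2_stat, p_value = markov_property_test(seq, 3)
```

A large p-value, as in the article, gives no grounds to reject the null hypothesis that the chain is first-order Markov.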

Fit of Distributions
The next step was to assess the form of the distributions of the individual states. The considerations are presented using the example of state S6; the same procedure was followed for the other states. The goodness of fit to the theoretical distributions considered most likely was verified on the basis of a Cullen and Frey graph, presented for state S6 in Figure 2. For further analysis, the Weibull and Beta distributions were selected; their estimated parameters are presented in Table 2, while the goodness of fit of the empirical data to the individual distributions is presented in Figure 3. For each of them, the AIC (Akaike information criterion) was calculated according to the formula:

AIC = 2k - 2 ln(L),    (16)

where k is the number of parameters in the model and L is the likelihood function; based on the AIC, the distribution with the better fit was selected. The same calculations were made for the other states. The proposed distributions are presented in Table 3. Not all the empirical distributions could be fitted with parametric ones. Moreover, none of the fitted distributions is exponential, which is a condition for using the Markov process [25]. The form of the distributions makes parameter estimation possible only on the basis of the semi-Markov model. In order to compare whether meeting the condition on the form of the distribution affects the results obtained, the subsequent part of the article compares the limit probabilities of the object's dwelling in the states according to the two models.
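The distribution-selection step can be sketched as follows; the dwell-time sample is synthetic and the candidate set (Weibull, Gamma, exponential) is an assumption for illustration, since the article's raw data are not reproduced here. The location parameter is fixed at 0 because dwell times are non-negative.

```python
import numpy as np
from scipy import stats

# Synthetic dwell-time sample standing in for the state S6 data.
rng = np.random.default_rng(1)
dwell_times = stats.weibull_min.rvs(1.5, scale=10.0, size=200,
                                    random_state=rng)

def aic(dist, data):
    """AIC = 2k - 2 ln(L), as in formula (16), for an MLE fit."""
    params = dist.fit(data, floc=0)  # loc fixed at 0, not estimated
    k = len(params) - 1              # number of free parameters
    log_l = np.sum(dist.logpdf(data, *params))
    return 2 * k - 2 * log_l

candidates = {"weibull": stats.weibull_min,
              "gamma": stats.gamma,
              "expon": stats.expon}
scores = {name: aic(dist, dwell_times) for name, dist in candidates.items()}
best = min(scores, key=scores.get)   # lower AIC = better fit
```

The distribution with the lowest AIC is retained for the given state, mirroring the selection reported in Table 3.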

Estimation of the Semi-Markov Model Parameters
First of all, based on the actual relationships between the states defined in Figure 1, the transition probability matrix was calculated. If n_i denotes the number of instants the system waited in state S_i, and n_ij denotes the number of transitions from state S_i to state S_j, then the estimator of the transition probability from state S_i to state S_j is determined from the formula:

p_ij = n_ij / n_i.

The distribution of the probability of changes of the distinguished operating states (in one step), assuming that each arc of the graph representing the exploitation process (Equations (2) and (10)) corresponds to a probability value p_ij, is presented in Table 4. For the process studied, the limits

lim_{n→∞} p_ij(n) = π_j, i, j = 1, 2, ..., 6, i ≠ j,

exist, where p_ij(n) is the probability of transition from state S_i to state S_j in n steps.
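A minimal sketch of this estimator, with an illustrative state sequence in place of the company's data:

```python
import numpy as np

def transition_matrix(seq, n_states):
    """Estimate p_ij = n_ij / n_i from an observed state sequence."""
    counts = np.zeros((n_states, n_states))
    for a, b in zip(seq[:-1], seq[1:]):
        counts[a, b] += 1
    rows = counts.sum(axis=1, keepdims=True)
    # Rows without observations stay all-zero instead of dividing by 0.
    return np.divide(counts, rows, out=np.zeros_like(counts),
                     where=rows > 0)

seq = [0, 3, 0, 1, 3, 0, 4, 0, 2, 3, 4, 0]  # states S1..S6 coded 0..5
P_hat = transition_matrix(seq, n_states=6)
```

Each observed row of P_hat sums to 1, as required of a stochastic matrix.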
The solution is the stationary probabilities presented in Table 5. The analysis of the stationary distribution (Table 5) showed that the highest limit transition probabilities concern the states related to standard activities resulting from the production process technology (S1, S4, S5), all over 27%. The indications of undesirable states, such as failure (S2) or downtime (S3, S6), range from 4% to 8%, which is a good result.
The calculated limit probabilities relate to the frequency of observations in the sample and do not take into account the durations of the individual states; therefore, the limit distribution of the semi-Markov process provides more significant diagnostics. It can be determined using the stationary distribution of the embedded Markov chain and the expected durations of the process states [24,25]. The limit probabilities of the semi-Markov process are then expressed by the formula:

P_j = (π_j T_j) / Σ_{k∈S} (π_k T_k), j ∈ S.

The solution requires calculating from the sample the average conditional durations of the process states:

T_ij = (1/n_ij) Σ_{k=1..n_ij} t_ij^(k),

which are presented in Table 6. Based on the transition probability matrix P = [p_ij] (Table 5) and the matrix T = [T_ij] of the average conditional durations of the process states (Table 6), the average unconditional durations T_j of the process states were determined according to the formula:

T_j = Σ_{k∈S} p_jk T_jk.

For this purpose, the corresponding system of equations was solved. The obtained expected values of the unconditional dwell times T_i of the process X(t) in the individual operational states are presented in Table 7. The calculated random variables T_i have finite, positive expected values. This allows the limit probabilities P_j, presented in Table 8, to be calculated based on theorem (27). The probabilities P_j thus determined are limit probabilities determining that the system will remain, in the long run (t → ∞), in a given operational state. This prognosis is more meaningful than the frequency of state occurrences. The highest value is achieved by state S1, i.e., operation (over 65%), and less than 17% by state S4, which stems from the necessity to perform maintenance activities. The remaining limit values are satisfactorily small, which shows the correct operation of the machines.
The technical readiness factor was also determined as the sum of the appropriate probabilities of reliability states [34]. For the system under analysis, S1, S4 and S5 were considered fitness states, while S2, S3 and S6 unfitness states. The readiness of the six-state semi-Markov model can then be calculated as the sum of the limit probabilities of the respective states:

K = P_1 + P_4 + P_5.    (31)

This gives K = 0.85, which means that the machine is in the readiness state for over 85% of the time, a very good result.
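The chain of computations above, from the stationary distribution π through the semi-Markov limit probabilities P_j to the readiness coefficient K, can be sketched as follows; the transition matrix and mean dwell times are illustrative placeholders, not the article's estimates.

```python
import numpy as np

# Illustrative embedded-chain transition matrix for the six states.
P = np.array([
    [0.0, 0.2, 0.1, 0.5, 0.1, 0.1],
    [0.6, 0.0, 0.0, 0.4, 0.0, 0.0],
    [0.3, 0.0, 0.0, 0.7, 0.0, 0.0],
    [0.5, 0.0, 0.1, 0.0, 0.3, 0.1],
    [0.8, 0.0, 0.0, 0.2, 0.0, 0.0],
    [0.4, 0.0, 0.0, 0.6, 0.0, 0.0],
])
T = np.array([45.0, 20.0, 60.0, 12.0, 15.0, 30.0])  # mean dwell times

# Stationary distribution of the embedded chain: pi P = pi, sum(pi) = 1.
A = np.vstack([P.T - np.eye(6), np.ones(6)])
b = np.concatenate([np.zeros(6), [1.0]])
pi = np.linalg.lstsq(A, b, rcond=None)[0]

# Semi-Markov limit probabilities: pi weighted by the dwell times.
limits = pi * T / np.sum(pi * T)

# Readiness: limit probabilities of the fitness states S1, S4, S5.
K = limits[0] + limits[3] + limits[4]
```

With the article's estimated P and T in place of the placeholders, `limits` reproduces Table 8 and `K` the 0.85 readiness figure.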

Calculations According to the Markov Model
Markov processes assume exponential distributions, the most popular ones in reliability theory [11,35]. They are fully defined by two parameters. The first of these is the already calculated matrix of interstate transition probabilities p_ij (Table 4).
The second, important parameter is the function describing the transitions of objects between states, called the process transition intensity λ ij (t), which characterizes the rate of changes in the probability of transition p ij (t) [36].
For homogeneous Markov processes, the transition intensity is constant and equal to the inverse of the expected duration t_ij of state S_i before S_j [37]:

λ_ij = 1 / E(t_ij),    (33)

where λ_ij is the intensity of transitions from state i to state j, and E(t_ij) is the expected value of the duration t_ij.
The intensities λ_ii ≤ 0 for i = j are defined as the complement making the sum of the transition intensities from state S_i equal to zero:

Σ_{j∈S} λ_ij = 0,    (34)

thus:

λ_ii = -Σ_{j≠i} λ_ij.    (35)

The modules |λ_ii| = -λ_ii are called the exit intensities from state S_i.
The elements λ_ij of the transition intensity matrix Λ, calculated according to the above formulas (33)-(35), are shown in Table 9. Then, using relationship (36), the ergodic probabilities p_j were calculated for the continuous-time Markov model:

p^T · Λ = 0,    (36)

where p^T = [p_1, ..., p_6] is the transposed vector of the limit probabilities p_j, and Λ is the transition intensity matrix. In this way, the matrix Equation (37) was obtained for the process studied. Taking into account the normalization condition Σ_{j=1..6} p_j = 1, we obtain the limit probabilities p_j of the system's dwelling in the states S1-S6, which are shown in Table 10. The results deviate from the values determined for the semi-Markov process, disturbingly indicating that the system studied tends primarily to remain in the downtime state (S3). The state in which production takes place (S1) comes only second, with a value over 35% lower than in the calculations made according to the semi-Markov process. A comparison of the other results is presented in Table 11. The largest difference concerns state S3 and exceeds 534%. The technical readiness coefficient was also calculated based on Equation (31); for the Markov process it amounts to 45% (K = 0.45), almost half the value determined according to the semi-Markov process.
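The continuous-time Markov computation can be sketched in the same way; the mean conditional durations E(t_ij) below are illustrative placeholders, not the values of Table 9.

```python
import numpy as np

# Illustrative E(t_ij); a 0 entry marks an impossible transition.
mean_t = np.array([
    [0, 50, 120, 30, 60, 90],
    [25, 0, 0, 40, 0, 0],
    [70, 0, 0, 55, 0, 0],
    [20, 0, 80, 0, 35, 100],
    [15, 0, 0, 45, 0, 0],
    [60, 0, 0, 50, 0, 0],
], dtype=float)

# Off-diagonal intensities lambda_ij = 1 / E(t_ij), formula (33).
Lam = np.divide(1.0, mean_t, out=np.zeros_like(mean_t),
                where=mean_t > 0)
# Diagonal: lambda_ii = -sum_{j != i} lambda_ij, formulas (34)-(35).
np.fill_diagonal(Lam, -Lam.sum(axis=1))

# Solve p^T Lambda = 0 with the normalization condition appended.
A = np.vstack([Lam.T, np.ones(6)])
b = np.concatenate([np.zeros(6), [1.0]])
p = np.linalg.lstsq(A, b, rcond=None)[0]
```

With the estimated intensities of Table 9 substituted for the placeholders, `p` reproduces the limit probabilities of Table 10.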

Conclusions
The study achieved two important goals. The first was the presentation of a method for evaluating the readiness of a selected machine tool in the production system. The analysis according to the Markov chain allowed the probabilities of interstate transitions, which reflect the frequency of occurrence of the individual states, to be determined. The highest values were achieved for the relations S6-S4, S3-S4, S4-S5. They suggest a high incidence of the unsuitability states S3 and S6, and the need to determine their causes and reduce their occurrence.
The limit values of transition probability were also calculated. The analysis of stationary distribution showed that the greatest indications concern states related to activities resulting from production process technology (S 1 , S 4 , S 5 ), which is a good result.
However, a complete evaluation is only ensured by an analysis according to the semi-Markov process, which takes into account the average dwell times of the object in the individual operating states. The calculated limit probabilities, describing the behavior of the object for t → ∞, were highest for state S1, operation (over 65%), and state S4, service (almost 17%). The remaining limit values were found to be satisfactorily low, which means that the operation of the machine should be considered proper. The calculated technical readiness rate of 85% should also be viewed as positive.
Such an analysis not only provides information on the assessment of the current and expected functioning of the machine, but also reveals areas where modifications can be made in order to increase the level of availability and, as a result, ensure more efficient execution of production orders.
Another goal was to compare the results obtained under different assumptions concerning the forms of the distributions of the examined variables. In the literature this analysis is often omitted and the examined variable is assumed to have an exponential distribution. This allows Markov processes to be used, whose parameter estimation is simpler and described in more detail in publications. Such an assumption, as the study has shown, may lead to different results and, in effect, to an incorrect assessment of the process or system studied. The intention of the authors was to indicate that omitting an important stage of the statistical analysis of the collected data and assuming the form of the distributions a priori does not guarantee the correctness of the obtained analyses.
In the presented study, the differences in the values of the calculated limit probabilities are large, reaching over 530%. The overall evaluation of system readiness is lower by 46% in the case of the Markov process analysis.
However, the problem is not only the value of the calculated probabilities, but also the main aim of the system. According to the semi-Markov process, the system tends primarily to occupy state S 1 (operation) which is a satisfactory result, emphasizing the proper implementation of tasks. The results according to the Markov process show that the system tends to occupy mainly state S 3 -downtime, which indicates mismanagement and system inactivity.
The goals set by the authors have been achieved, but it should be stressed that the results obtained concern only one selected machine. As part of further research, it is worth considering a comprehensive analysis of the entire production system using the method indicated in the article. It will provide complete information on the system's readiness, determine the level of impact of the individual elements (machines), and identify areas for improvement.

Conflicts of Interest:
The authors declare no conflict of interest.