Entropic Measure of Epistemic Uncertainties in Multibody System Models by Axiomatic Design

In this paper, the use of the MaxInf Principle in real optimization problems is investigated for engineering applications, where the current design solution is actually an engineering approximation. In industrial manufacturing, multibody system simulations can be used to develop new machines and mechanisms by using virtual prototyping, where an axiomatic design can be employed to analyze the independence of elements and the complexity of connections forming a general mechanical system. In the classic theories of Fisher and Wiener-Shannon, the idea of information is a measure of only probabilistic and repetitive events. However, this idea is broader than the field of probability alone. Thus, the Wiener-Shannon axioms can be extended to non-probabilistic events, and it is possible to introduce a theory of information for non-repetitive events as a measure of the reliability of data for complex mechanical systems. To this end, one can devise engineering solutions consistent with the values of the design constraints by analyzing the complexity of the relation matrix and using the idea of information in the metric space. The final solution gives the entropic measure of the epistemic uncertainties, which can be used in multibody system models analyzed with an axiomatic design.


Introduction
Multibody systems represent a special class of mechanical systems made of rigid and/or flexible bodies, mutually interconnected by joint constraints, and subjected to external force fields [1][2][3][4][5][6][7][8][9][10]. Several examples of such complex systems can be found in industrial engineering applications [11][12][13][14][15][16][17][18][19][20]. The complexity of the dynamic behavior of such constrained mechanical systems requires the development of advanced analysis and modelling tools for performing virtual prototyping in a multibody framework [21][22][23][24][25][26][27][28]. The design solutions employed in industrial applications are engineering approximations based on technical as well as economic constraints, and are usually obtained by using the ability and the experience of the designers. Considering the optimization problem for the design of a general mechanical system, the interdependencies of the connections between the elements that form the system can be analyzed employing an axiomatic design. For this purpose, a method based on the theory of Wiener-Shannon's axioms can be used. Applying a theory of information for non-repetitive events as a measure of the reliability of data, Wiener-Shannon's axioms can be extended to non-probabilistic events involving complex mechanical systems. By doing so, engineering solutions consistent with the design constraints can be obtained. Analyzing the complexity of the relation matrix and using the idea of information in the metric space, the resulting solution gives the entropic measure of the epistemic uncertainties. In this paper, the application of the MaxInf Principle in real optimization problems is investigated, and the measure of epistemic uncertainties is employed in the design process of multibody system models analyzed within the framework of the axiomatic design.
Uncertainties in design appear in different forms. The two major categories are the aleatory uncertainty and the epistemic uncertainty. The aleatory uncertainties can be taken into account using probability theory. The epistemic uncertainty can be taken into account by incorporating uncertainty in the design analysis. In multibody system simulations, axiomatic design can be used for the qualitative analysis and the quantification of model uncertainty. In general, the design process gives the structure necessary for the transformation of qualitative needs, often stated in non-engineering terms, into real products. This transformation is achieved through the application of scientific knowledge to the problems. In the simulation of real problems, the different sources of data can exhibit a high level of error and, consequently, the final result cannot be considered satisfactory. Using previous design databases, the design process generates several alternatives that must be frequently evaluated. Usually, the design process is divided into a series of phases with given specifications. Evaluations are needed to make educated choices between these phases. Each evaluation determines whether the phase needs to be repeated, or whether the designer needs to go back to one or more previous phases.
In the axiomatic design of highly complex cases, the process optimizes elements using a set {FR_i} of functional requirements and a set {DP_j} of physical parameters. The axiomatic method can also be used for the optimal design of rigid-flexible multibody systems based on new formulation strategies.
In axiomatic design, the transformation of needs into functional requirements is given by the matrix equation {FR} = [A]{DP}. The structure of the matrix [A] gives the epistemic uncertainty. While the aleatory variability is the natural randomness in a process, the epistemic uncertainty is the scientific uncertainty in the model of the process, and it is due to limited data or poor knowledge. To compare two design solutions on the basis of the axiomatic design postulates, one can compare the information content of the two designs that satisfy the functional parameters. The information content can be described by means of a mathematical framework similar to Wiener-Shannon's theory. When the number of DPs is less than the number of FRs, the DPs are insufficient to achieve all of the FRs and the epistemic uncertainty is maximal. In this respect, numerical simulation can help to reduce the model uncertainties. Multibody system simulations are a great help in product development and can be used to evaluate the safety, the human-machine interaction, and the alternative solutions.
In axiomatic design, there are two axioms that help to achieve a good design. The relation of {FR_i} with {DP_j} is mathematically expressed as {FR} = [A]{DP}. The design process is reduced to a series of mappings from the design's functional requirements into the design's parameter space. The mapping process between the domains is repeated several times, so that the previous design parameters determine the next set of functional requirements. The domains are defined by the following vectors: {FR}, the vector of the functional domain; {DP}, the vector of the physical domain; {PV}, the vector of the process domain. In the classical design, before the introduction of the axiomatic design approach, there was a unique domain, whereas in the axiomatic approach all the information can be found in the mapping between the domains. Figure 1 shows the transformation steps from the traditional design to the axiomatic design. The design steps can be described as a system of statements, and the information was chosen as shown in Figure 2.
The relation between these two domains can be written in matrix notation as

{FR} = [A]{DP}, (2)

where [A] is the design matrix ([B] plays the analogous role for the mapping between the physical and the process domains). In problems in which the {FR_i} depend on non-linear functions, Equation (2) can be written in the differential form

{dFR} = [A]{dDP},

where the elements of the design matrix [A] are A_ij = ∂FR_i/∂DP_j. A small change in a parameter causes a deviation in the functional requirements given by

∆FR_i = Σ_j A_ij ∆DP_j.

In a linear design, the coefficients A_ij = ∂FR_i/∂DP_j are constants. In Equation (2), a diagonal design matrix is a special case; in general, the design matrix [A] is a rectangular array of values. In the axiomatic design, two axioms concerning the functional requirements are used to examine the planning actions. The first axiom, the independence axiom, states that the independence of the functional set {FR_i} must always be maintained. The second axiom, the information axiom, states that the best design has the minimum information content and the minimum number of functional requirements.
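The linear mapping above can be sketched numerically. The following is a minimal illustration, with an assumed (hypothetical) lower-triangular design matrix and assumed parameter changes; it only shows how {dFR} = [A]{dDP} propagates small parameter changes into the functional requirements.

```python
import numpy as np

# Hypothetical 3x3 design matrix [A], with A_ij = dFR_i/dDP_j (linear design).
# A diagonal matrix would be an uncoupled design; this lower-triangular one
# is decoupled: each FR_i depends only on the DP_j with j <= i.
A = np.array([
    [2.0, 0.0, 0.0],
    [0.5, 4.0, 0.0],
    [1.0, 0.3, 3.0],
])

# Assumed small changes in the design parameters {dDP}.
dDP = np.array([0.1, 0.05, 0.2])

# Resulting deviations in the functional requirements: {dFR} = [A]{dDP}.
dFR = A @ dDP
print(dFR)  # [0.2, 0.25, 0.715]
```

Because the matrix is triangular, changing only DP_1 would perturb all three functional requirements, while changing only DP_3 would perturb FR_3 alone.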
The elements of a design matrix [A] can be either constants or functions, so that the design may be non-linear. Although mathematical techniques can transform a matrix, the physical significance of the elements A_ij can be lost. An ideal design matrix is a square diagonal matrix, with each FR related one-to-one to a single DP. The uncoupled tolerance for a DP_i is ∆DP_i = ∆FR_i/A_ii. The propagation of tolerance for a decoupled design with a lower triangular n × n matrix is expressed as

∆DP*_i = (∆FR_i − Σ_{j=1}^{i−1} A_ij ∆DP*_j)/A_ii. (9)

From Equation (9), it is evident that ∆DP_i ≥ ∆DP*_i. The consequence of this is that a decoupled design has less tolerance than an uncoupled design, and the increase of the order of the design matrix makes the tolerance of the last DP_i smaller. If the number of DPs is greater than the number of FRs, then the design is redundant. When the number of DPs is less than the number of FRs, then the coupled design cannot be satisfied. Supposing that there is a set of three functional requirements {FR_1, FR_2, FR_3} and a set of two parameters {DP_1, DP_2}, the equation in matrix notation is

{FR_1, FR_2, FR_3}^T = [A]{DP_1, DP_2}^T, (10)

where [A] is a 3 × 2 matrix. In this equation, FR_3 cannot always be satisfied: writing Equation (10) componentwise gives three scalar equations in only two unknowns (11), and it is not possible to obtain a solution of system (11) without making changes to the functional requirements.
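The tolerance-propagation argument can be checked numerically. This is a sketch under assumed values (the matrix and the FR tolerances are hypothetical): it computes the uncoupled tolerances ∆FR_i/A_ii and the decoupled tolerances from Equation (9), and shows that each decoupled tolerance is no larger than the corresponding uncoupled one.

```python
import numpy as np

# Hypothetical lower-triangular (decoupled) design matrix.
A = np.array([
    [2.0, 0.0, 0.0],
    [0.5, 4.0, 0.0],
    [1.0, 0.3, 3.0],
])
dFR = np.array([0.4, 0.8, 0.9])  # assumed allowed tolerances on the FRs

# Uncoupled case: each DP tolerance depends only on its own diagonal term.
dDP_uncoupled = dFR / np.diag(A)

# Decoupled case, Equation (9): the off-diagonal terms consume part of the
# available FR tolerance before DP_i can use it.
n = len(dFR)
dDP_decoupled = np.zeros(n)
for i in range(n):
    consumed = sum(abs(A[i, j]) * dDP_decoupled[j] for j in range(i))
    dDP_decoupled[i] = (dFR[i] - consumed) / A[i, i]

print(dDP_uncoupled)
print(dDP_decoupled)
```

For these values the uncoupled tolerances are (0.2, 0.2, 0.3) while the decoupled ones are (0.2, 0.175, ≈0.2158): the later DPs lose tolerance, as stated in the text.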

Classical Design
The main difficulty is understanding what information can be ignored or rejected by the designer. The design steps can be described by the following system: verify the design statement at the m-th step (Formula (12)). For example, D evolves from N to P, as described in Figure 3. In N, there is general and unclassified information, whereas in P there is accurate, classified, and quantified information. An existence condition of the project is that the criteria initially set (they are present in D) are used by the evaluator to examine the relationships between D_m and D_n. The greater the power of the set D_n, the greater the possibility that the existence condition is verified, as shown in Figure 4.

Non-Probabilistic Information in Metric Space
The idea of information, according to the theories of Fisher and Wiener-Shannon, is a measure of only probabilistic and repetitive events. However, the idea of information is broader than probability alone, and the axioms of Wiener-Shannon can be extended to non-probabilistic and non-repetitive events (Wiener, 1948).
Using the information axioms, it is possible to develop models for information that can be very useful in applications. For every event A ∈ D, it is possible to have a measure of information using the mathematical expression

I(A) = log_2 [δ(D)/δ(A)],

where δ(·) denotes the diameter of a set. This definition of information has a natural application in metric space [29]. Norbert Wiener, in Cybernetics, gives the clearest definition of entropy [30]: "We may conceive this in the following way: we know a priori that a variable lies between 0 and 1, and a posteriori that it lies on the interval (a,b) inside (0,1). Then, the amount of information we have from our a posteriori knowledge is"

I = log_2 [measure of (0,1)/measure of (a,b)] = −log_2 (b − a).

The information can therefore be evaluated by the probability as well as by the non-probabilistic measures of diameters; thus, it is possible to obtain the measure of the information from non-probabilistic data. In non-probabilistic information, instead of probabilities, it is possible to utilize a finite number of appropriate proportions, subject to a set of constraints, that add up to one [31,32]. In observance of the axioms, let d_1, d_2, ..., d_n be non-negative real numbers, and let

ρ_i = d_i / Σ_{j=1}^{n} d_j, with Σ_{i=1}^{n} ρ_i = 1.

It is then possible to define the measure of information by the relation

J(ρ) = −Σ_{i=1}^{n} ρ_i log_2 ρ_i,

so that J(ρ) is maximum when ρ_1 = ρ_2 = ... = ρ_n, and J(ρ) is minimum when only one ρ_i is non-zero (and thus equal to one).
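The extrema of the measure J(ρ) can be verified directly. The following sketch assumes the Shannon-type form of J(ρ) given above, builds the proportions ρ_i from non-negative numbers d_i, and checks that a uniform distribution of proportions maximizes the measure while a fully concentrated one minimizes it.

```python
import math

def J(d):
    """Information measure J(rho) built from the proportions
    rho_i = d_i / sum(d); zero proportions contribute nothing."""
    total = sum(d)
    rho = [di / total for di in d]
    return -sum(r * math.log2(r) for r in rho if r > 0)

uniform = J([1.0, 1.0, 1.0, 1.0])       # all rho_i equal: maximum, log2(4) = 2
concentrated = J([1.0, 0.0, 0.0, 0.0])  # one rho_i = 1: minimum, 0
print(uniform, concentrated)  # 2.0 0.0
```

Any intermediate distribution of the d_i gives a value of J strictly between these two bounds, consistent with the stated extremal properties.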

Entropic Analysis of Matrix
In the probabilistic approach, the analysis of a project must use the probability of success. Suppose we have an uncoupled design, with a diagonal design matrix, so that each FR_i is satisfied by its own DP_i. If p_i is the probability of satisfying FR_i with DP_i, then p_1 and p_2 are the probabilities of satisfying FR_1 and FR_2 with DP_1 and DP_2, respectively, and the probability of achieving the complete solution is P = p_1 p_2. Now, let us consider the problem of choice among three possible solutions in a design. In the simple case of a choice between two solutions, it is assumed that designers select the one, between the two possible choices, that has the highest score in their preferences. In a choice between three solutions, say A, B, and C, we can proceed as follows: if the first group of designers chooses solution A, the second group also prefers solution A, and the third group prefers solution B, then in the three-choice problem solution A wins two-thirds of the ballot, and we can say that solution A is preferred with respect to solution B. Moreover, with respect to the choice between the solutions A, B, and C, we can have the impossibility of a consistent ranking, which is expressed in mathematical terms as the cyclic preference A > B > C > A.
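For the uncoupled design above, the probability of success and its information content can be computed directly. The probability values below are assumed for illustration; the sketch shows that, with independent requirements, the Wiener-Shannon information content I = −log_2 P is additive over the individual requirements.

```python
import math

# Assumed probabilities of satisfying FR1 with DP1 and FR2 with DP2.
p1, p2 = 0.9, 0.8

# Uncoupled design: the events are independent, so the probability of
# achieving the complete solution is the product of the individual ones.
P = p1 * p2

# Information content in the Wiener-Shannon sense.
I = -math.log2(P)
print(P, I)

# Additivity over the independent requirements.
assert abs(I - (-math.log2(p1) - math.log2(p2))) < 1e-12
```

A lower probability of success yields a larger information content, which is why the information axiom selects the design with the minimum information content as the best one.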
Kenneth Arrow mathematically proved that there is no method for constructing the list of preferences starting from arbitrary rules. In other words, to establish design preferences, some precise rules of design are needed. Thus, the proper implementation of a project requires some precise priorities, and their absence may induce some new uncertainty, which is highlighted by Arrow's theorem. In order to simplify the analysis of the relationships between the elements of the array, in this paper we do not analyze Arrow's constraints between the elements. In particular, we assume that the order and the solution of the constraints do not have any influence on the final result. Moreover, we skip the analysis of the entropy which is considered in Arrow's theorem. If we have a coupled design with the conditioning FR_s/FR_j, then we have a complexity associated with a probability of solution.
We can have the following conditionings: for example, a vector with three elements can have 3! = 3 × 2 × 1 = 6 conditionings. In this case, we can define the conditional complexity (19). For the matrix [A], we can write the same relation with {g}_i as a set of conditional elements of the matrix. From the Wiener theorem, the non-normalized a priori set has n! elements, and the a posteriori set that lies in the a priori set has Σ_{i}^{n} {g}_i elements. Therefore, the information is

I = −log_2 [ Σ_{i}^{n} {g}_i / n! ].

At this point, we can introduce the entropy and we can speak of entropic complexity (20).
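The a priori/a posteriori construction above can be sketched numerically. The a posteriori count below is an assumed value, standing in for the number of conditionings actually admitted by the constraints of the design matrix; the sketch only shows how the information −log_2(a posteriori / a priori) is evaluated.

```python
import math

n = 3
a_priori = math.factorial(n)  # 3! = 6 possible conditionings of the vector

# Assumed a posteriori count: conditionings compatible with the constraints
# of the design matrix [A] (hypothetical value for illustration).
g_sum = 2

# Information carried by the restriction of the a priori set.
EC = -math.log2(g_sum / a_priori)
print(EC)  # -log2(2/6) = log2(3), about 1.585
```

When all n! conditionings remain admissible the information is zero, and it grows as the constraints rule out more orderings, which is the sense in which this quantity measures complexity.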

Numerical Example
Suppose that, in the relation, p_3 is the probability of satisfying FR_3 with DP_3 and p_2 is the probability of satisfying FR_2 with DP_2, and that the vector [FR_1, FR_2, FR_3]^T involves the conditioned relation (FR_3/FR_2). Then, in the entropic complexity (20), one has p = p_3 p_2, and the conditional complexity (19) and the entropic stochastic complexity follow accordingly. EC is also a measure of the epistemic uncertainties in a solution: the best solution is the one that minimizes EC, since it has the minimum of epistemic uncertainties. If P_total is the total uncertainty, then the epistemic uncertainty is P_epistemic = P_total − P_stochastic.
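A numerical sketch of this decomposition follows. All values are assumed for illustration (the probabilities p_2 and p_3, and the count of orderings admitted by the conditioning (FR_3/FR_2)); the sketch only shows how the total entropic complexity splits into a stochastic part and an epistemic part, mirroring P_epistemic = P_total − P_stochastic.

```python
import math

# Assumed probabilities of satisfying FR3 with DP3 and FR2 with DP2.
p3, p2 = 0.9, 0.8

# Stochastic part: information of the conditioned pair, p = p3 * p2.
p = p3 * p2
stochastic = -math.log2(p)

# Epistemic part: the conditioning (FR3/FR2) restricts the 3! = 6 possible
# orderings of [FR1, FR2, FR3]; assume 3 of them remain admissible.
epistemic = -math.log2(3 / math.factorial(3))

# Total entropic complexity: the epistemic share is what remains after
# removing the stochastic contribution from the total.
EC_total = stochastic + epistemic
print(stochastic, epistemic, EC_total)
```

Here the epistemic contribution equals exactly 1 bit, since the conditioning halves the set of admissible orderings.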

Conclusions
The main goal of the research of the authors is the development of nonconventional methodologies for the design and analysis of engineering solutions for complex mechanical systems [33][34][35][36][37][38][39][40][41][42][43]. In this paper, stochastic and epistemic uncertainties were analyzed, extending Wiener-Shannon's information theory to non-probabilistic events by introducing the theory of information for non-repetitive events as a measure of data consistency. By analyzing the relation of the elements of the relation matrix and by using the concept of information in metric space, the idea of measuring the complexity and the epistemic uncertainties was introduced, and the entropy of independence of the matrix elements was defined. With this new definition applied to a design matrix, and with the assumption that the order and the solution of the system constraints do not change the final result, one can have two kinds of entropy, namely the probabilistic Shannon-Wiener entropy and the non-probabilistic Shannon-Wiener entropy. The final solution yields the entropic measure of the epistemic uncertainties in multibody system models analyzed using an axiomatic design.

Axiom 1: The independence axiom. Maintain the independence of the functional requirements.
Axiom 2: The information axiom. Minimize the information content of the design.

Fig. 4. Verification of the design statement at step m (Formula (12)).