Special Issue: Mathematical Models for Dealing with Risk in Engineering
Research Article | Open Access
Quantification of Margins and Uncertainties Approach for Structure Analysis Based on Evidence Theory
Quantification of Margins and Uncertainties (QMU) is a decision-support methodology for complex technical decisions, centering on performance thresholds and the associated margins of engineering systems. Uncertainty propagation is a key element of the QMU process for structural reliability analysis in the presence of both aleatory and epistemic uncertainty. To reduce the computational cost of the Monte Carlo method, this paper proposes a mixed uncertainty propagation approach that integrates a Kriging surrogate model into the framework of evidence theory for QMU analysis. A numerical example demonstrates the effectiveness of the mixed uncertainty propagation method.
1. Introduction
The uncertainties of material properties, environmental loads, and design models are inevitable in engineering. Uncertainty is usually classified into aleatory and epistemic types [1, 2], and the presence of uncertain factors introduces uncertainty into the reliability of a structure. Probabilistic approaches that deal with aleatory parameter uncertainty have been extensively investigated in classical structural reliability analysis. When sufficient data are not available, or information is lacking due to ignorance, the concept of subjective probability is well established for quantifying epistemic uncertainty. However, when the probabilities of rare events are very difficult to assess, or when events occur only once, classical probability methods may not be suitable because there are not enough statistical data [3, 4]. To overcome this limitation of probabilistic methods, nonprobabilistic methods based on fuzzy theory, interval theory, evidence theory, and so forth have been proposed and are more suitable for handling epistemic uncertainty. Over the last decade, the theoretical concept and application of the Quantification of Margins and Uncertainties (QMU) methodology have been reported in the certification of the reliability and safety of the nuclear weapons stockpile and in risk-informed decision making under the restriction of limited test data. Recently, the QMU methodology has been applied to more general complex systems, such as commercial nuclear power plants, reactor safety, and missile reliability [5, 6]. Eardley [7] described the main components (performance gates, margins, and uncertainties) of the QMU methodology. Within the key ideas and application procedures of QMU, the uncertainty propagation that determines the output uncertainty from the input uncertainty is a broad research area. Reference [8] shows that evidence theory is a more general theory that can handle both types of uncertainty, but it requires much greater computational cost.
The objective of this paper is to propose an implementation framework of QMU under mixed uncertainty based on the evidence theory. To alleviate the computational costs, a stochastic surrogate model based on Kriging model and adaptive sampling method has been applied for uncertainty propagation for structure performance response. The rest of this paper is organized as follows. Section 2 briefly introduces the basic concept and metric of QMU. Section 3 details the mixed uncertainty propagation using evidence theory in QMU implementation. The new calculation scheme of mixed uncertainty analysis by integrating Kriging model and adaptive sampling is presented. One case study is presented in Section 4 to demonstrate the new developed implementation.
2. The Concept and Metric of QMU
QMU is a decision-support methodology for complex technical decisions centering on performance thresholds and associated margins for engineering systems that are made under uncertainty [9]. The basic concept of QMU is shown in Figure 1. The margin (M) is defined as the distance between the nominal response of the system and the performance gate bound (e.g., a failure threshold not to be exceeded), whereas the uncertainty (U) is described by the range of system responses caused by different sources of variability. A QMU metric, the confidence factor (CF), has to be developed to quantify and certify the confidence in the system reliability; it can be defined as
CF = M/U. (1)
A CF sufficiently larger than one intuitively indicates safe conditions. In order to obtain the value of CF, three key elements should be implemented as follows: (1) identification and specification of the performance threshold(s); (2) identification and specification of the associated performance margin(s), that is, measure(s) of exceeding the performance thresholds; (3) quantification of the uncertainty in the threshold and margin specifications.
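The metric above can be sketched in a few lines of code. All numeric values here are illustrative placeholders, not data from the paper:

```python
# Minimal sketch of the QMU confidence factor CF = M/U of Eq. (1).
def confidence_factor(nominal_response, threshold, uncertainty):
    """CF = margin / uncertainty, where the margin is the distance
    between the nominal response and the performance-gate bound."""
    margin = abs(threshold - nominal_response)
    return margin / uncertainty

# Example: nominal stress 200 MPa, failure threshold 300 MPa,
# response uncertainty band of 50 MPa -> CF = 100 / 50 = 2.0
cf = confidence_factor(200.0, 300.0, 50.0)
print(cf)  # 2.0, i.e. CF > 1, suggesting a safe condition
```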
3. The Mixed Uncertainty Propagation Based on Evidence Theory
3.1. Fundamentals of Evidence Theory
The measures of uncertainty provided by evidence theory [10] are known as belief (Bel) and plausibility (Pl), which lie in the interval [0, 1]. An evidence theory application involves the evidence specification (Θ, F, m), where Θ denotes the universal set, F denotes the collection of subsets, or set of focal elements, of Θ, and m is the basic probability assignment (BPA), which should satisfy the following axioms of evidence theory: (1) m(A) ≥ 0 for any A ⊆ Θ; (2) m(A) = 0 when A ∉ F; (3) Σ_{A∈F} m(A) = 1.
The Bel and Pl of a given event set A ⊆ Θ can be derived from the basic probability assignment by
Bel(A) = Σ_{B⊆A} m(B),  Pl(A) = Σ_{B∩A≠∅} m(B). (2)
Resulting from (2), the belief function, Bel(·), is calculated by summing the BPAs of the focal elements that totally agree with the event A, while the plausibility function, Pl(·), is calculated by summing the BPAs of the focal elements that agree with the event A totally or partially. Bel and Pl play roles similar to distribution functions in standard probability theory, and they give the lower and upper bounds of the probability of the event set.
3.2. Mixed Uncertainty Propagation Using Evidence Theory
The mixed uncertainty propagated from the input parameters to the system output needs to be quantified for structural reliability analysis when aleatory and epistemic uncertainties coexist. The performance function with aleatory and epistemic uncertainties can be given by
Y = g(X, P), (3)
where X represents the vector of aleatory uncertainty variables described by probability distributions. For ease of demonstration, we assume that the elements of X are independent. P is the vector of parameters with epistemic uncertainty described by evidence specifications (Θ_i, F_i, m_i). The uncertainties associated with the model inputs X and P are propagated through the model to the model output Y. The joint evidence specification can be expressed by the Cartesian product of the marginal specifications, and the joint BPA is defined by
m(c) = m_1(c_1) m_2(c_2) ⋯ m_s(c_s), (4)
where c = c_1 × c_2 × ⋯ × c_s is a focal element of the joint space.
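The joint BPA construction above can be sketched as follows; with independent epistemic variables, each joint focal element is a Cartesian product of per-variable intervals and its BPA is the product of the marginal BPAs. The intervals and masses are illustrative:

```python
from itertools import product

# Marginal evidence specifications: lists of (interval, bpa) pairs
m1 = [((0.1, 0.2), 0.4), ((0.2, 0.3), 0.6)]
m2 = [((1.0, 1.5), 0.7), ((1.5, 2.0), 0.3)]

# Joint focal elements with product BPAs
joint = [((c1, c2), b1 * b2) for (c1, b1), (c2, b2) in product(m1, m2)]
total = sum(b for _, b in joint)
print(len(joint), round(total, 12))  # 4 joint focal elements, BPAs sum to 1
```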
Let the number of focal elements c_k in the joint space be N. Based on the total probability formula, the probability that the system output Y is less than the threshold ȳ can be given by
P(Y < ȳ) = Σ_{k=1}^{N} P(Y < ȳ | c_k) m(c_k), (5)
where P(Y < ȳ | c_k) is the probability of the event Y < ȳ given that the epistemic parameters lie in c_k, and m(c_k), the probability of the focal element c_k in the joint space, equals the joint BPA value. Because the focal elements of the epistemic parameters are intervals, the minimum value (belief) and maximum value (plausibility) of P(Y < ȳ) can be expressed by
Bel(Y < ȳ) = Σ_{k=1}^{N} m(c_k) min_{P∈c_k} P(g(X, P) < ȳ), (6)
Pl(Y < ȳ) = Σ_{k=1}^{N} m(c_k) max_{P∈c_k} P(g(X, P) < ȳ). (7)
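A numerical sketch of this double-loop propagation is shown below. The performance function g, the threshold, and the BPA structure are illustrative assumptions; the interval extremes are searched only on the interval endpoints, which is exact only for a g monotone in the epistemic parameter:

```python
import random

random.seed(0)
y_bar = 3.0                                          # performance threshold
X = [random.gauss(1.0, 0.3) for _ in range(20000)]   # aleatory MC samples

def g(x, p):                                         # illustrative response
    return x + p

focal = [((1.0, 1.5), 0.6), ((1.5, 2.5), 0.4)]       # epistemic BPA on p

bel = pl = 0.0
for (lo, hi), m in focal:
    # P(g(X, p) < y_bar) at the interval endpoints of the focal element
    probs = [sum(g(x, p) < y_bar for x in X) / len(X) for p in (lo, hi)]
    bel += m * min(probs)                            # belief bound, Eq. (6)
    pl += m * max(probs)                             # plausibility bound, Eq. (7)
print(round(bel, 3), round(pl, 3))                   # Bel <= Pl
```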
3.3. The Solution Framework of Mixed Uncertainty Propagation
Equations (6)-(7) can be calculated using sampling methods such as MCS, but at a large computational cost. The Kriging surrogate model [12, 13] can be employed to reduce the cost of uncertainty propagation. With n training observations, the predicted response and the predicted mean square error for any given new point x can be expressed as
ŷ(x) = μ̂ + r(x)ᵀR⁻¹(y − μ̂1), (8)
s²(x) = σ̂²[1 − r(x)ᵀR⁻¹r(x) + (1 − 1ᵀR⁻¹r(x))²/(1ᵀR⁻¹1)], (9)
where R represents the correlation matrix of the observed samples, r(x) is the correlation vector between x and the observed samples, and 1 is a unit vector. This surrogate model is used to evaluate the uncertainty distribution of the system output by the traditional MCS method without calling the original performance function. The Maximum Confidence Enhancement adaptive sampling scheme [13] is employed to ensure the accuracy of the surrogate model by adding new training samples.
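An ordinary-Kriging predictor in the form of (8)-(9) can be sketched as below. The Gaussian correlation model and the fixed correlation parameter theta are simplifying assumptions; in practice theta is fitted by maximum likelihood:

```python
import numpy as np

def kriging_fit(X, y, theta=10.0):
    R = np.exp(-theta * (X[:, None] - X[None, :]) ** 2)  # correlation matrix R
    R += 1e-10 * np.eye(len(X))                          # nugget for stability
    one = np.ones(len(X))                                # unit vector
    Ri = np.linalg.inv(R)
    mu = one @ Ri @ y / (one @ Ri @ one)                 # generalized LS mean
    sig2 = (y - mu) @ Ri @ (y - mu) / len(X)             # process variance
    return X, y, Ri, one, mu, sig2, theta

def kriging_predict(model, x):
    X, y, Ri, one, mu, sig2, theta = model
    r = np.exp(-theta * (x - X) ** 2)                    # correlation vector r(x)
    yhat = mu + r @ Ri @ (y - mu * one)                  # predictor mean, Eq. (8)
    u = 1.0 - one @ Ri @ r
    mse = sig2 * (1.0 - r @ Ri @ r + u ** 2 / (one @ Ri @ one))  # Eq. (9)
    return yhat, max(mse, 0.0)

# Train on a smooth 1-D function and predict at an untried point
Xt = np.linspace(0.0, 1.0, 8)
yt = np.sin(2.0 * np.pi * Xt)
model = kriging_fit(Xt, yt)
yhat, mse = kriging_predict(model, 0.5)
print(round(float(yhat), 3), mse >= 0.0)  # interpolates close to sin(pi) = 0
```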
The procedure of the Kriging-based method for solving (7) is briefly introduced as follows:
(1) Calculate the joint BPA of the epistemic uncertainty variables.
(2) Use Latin hypercube sampling (LHS) to generate sample points of the aleatory variables.
(3) For each sample point, calculate the minimum output value of g over the focal element c_k of the joint space.
(4) Build the Kriging model based on the training samples and their minimum output values.
(5) Calculate the maximum confidence point from the MC samples and update the Kriging model by adding the new training sample.
(6) Repeat steps (4)-(5) until the approximation accuracy of the surrogate model is satisfied.
(7) Calculate the probability boundary using the updated Kriging model and the MC samples.
(8) Repeat steps (3)–(7) for each focal element of the joint space.
(9) Compute Pl(·) based on the joint BPA and the probability of each focal element.
For the Bel(·) solution, the minimum output value in step (3) needs to be changed to the maximum output value.
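Step (5) of the procedure can be sketched as a sampling criterion. As a simplified stand-in for the Maximum Confidence Enhancement criterion of [13], the candidate below is the Monte Carlo point whose safe/fail classification against the threshold is least trustworthy under the current surrogate, that is, the point minimizing |ŷ − ȳ|/s; the toy surrogate and numbers are assumptions:

```python
import math

def next_training_point(candidates, predict, y_bar):
    """predict(x) -> (mean, mse) from the current Kriging model.
    Returns the candidate whose predicted response sits closest to the
    threshold in units of predictive standard deviation."""
    def confidence(x):
        yhat, mse = predict(x)
        s = math.sqrt(max(mse, 1e-30))
        return abs(yhat - y_bar) / s
    return min(candidates, key=confidence)

# Toy surrogate: mean x**2 with a constant predictive variance of 0.04
predict = lambda x: (x * x, 0.04)
x_new = next_training_point([0.5, 1.0, 1.5, 2.0], predict, y_bar=2.0)
print(x_new)  # 1.5: its mean (2.25) is closest to the threshold in sigma units
```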
3.4. Calculation of CF under the Evidence Theory Framework
As described in Section 2, the safety or reliability of a structural system is measured by the confidence factor, CF. When the system output uncertainty is represented in terms of the evidence theory measures of belief and plausibility, the uncertainties and margins, expressed as the distance between the system output and the performance gate boundary, are demonstrated in Figure 2. The performance gate boundary is itself a random variable.
The calculations of CF_Bel and CF_Pl for the upper boundary follow the definition CF = M/U of Section 2, with the margin and the uncertainty evaluated at a given confidence level:
CF_Bel = M_α/U_{Bel,α},  CF_Pl = M_α/U_{Pl,α},
where the subscript α corresponds to the quantity at the belief/plausibility/probability level and α is the specified confidence level.
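One way to evaluate an α-level CF numerically is sketched below: read the α-quantiles off the Bel/Pl curves of the output, take the margin from the nominal response to the gate bound, and divide by the α-level uncertainty width. The curve shapes, the quantile convention, and all numbers are assumptions for illustration, not the paper's data:

```python
import numpy as np

def quantile_from_cdf(values, cdf, alpha):
    """Smallest grid value whose CDF reaches alpha (inverse-CDF lookup)."""
    return values[np.searchsorted(cdf, alpha)]

y = np.linspace(100.0, 300.0, 201)                  # output grid (e.g. stress, MPa)
cdf_bel = np.clip((y - 120.0) / 150.0, 0.0, 1.0)    # Bel curve: lower CDF bound
cdf_pl = np.clip((y - 100.0) / 150.0, 0.0, 1.0)     # Pl curve: upper CDF bound

alpha, gate = 0.95, 320.0                           # confidence level, gate bound
y_nom = quantile_from_cdf(y, cdf_pl, 0.5)           # nominal (median) response
y_alpha = quantile_from_cdf(y, cdf_bel, alpha)      # conservative alpha-level response
margin = gate - y_nom                               # M_alpha
uncertainty = y_alpha - y_nom                       # U_alpha
print(round(margin / uncertainty, 2))               # CF > 1 indicates a safe condition
```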
4. Application Example for Structure Analysis
Figure 3 shows a crank-slider mechanism [5]. The length of the crank, the length of the coupler, the external force, the Young's modulus of the coupler material, and the yield strength of the coupler are random variables. The coefficient of friction between the ground and the slider and the offset are epistemic variables. The random variables and the epistemic variables with their BPAs are provided in Tables 1 and 2, respectively. The performance function is defined by the maximum stress, as in (10), and the boundary is the material strength.
The belief and plausibility measures of the maximum stress are calculated by the proposed adaptive-sampling Kriging model approach, and the results are shown in Figure 4. The QMU analysis with different confidence levels α is summarized in Table 3. The results show that the QMU analysis used in risk-informed decision making is very sensitive to the specified confidence level.
5. Conclusion
In this paper, a mixed uncertainty propagation approach for QMU analysis is proposed by integrating an adaptive sampling method with the Kriging model. The technique is demonstrated by a numerical example that illustrates the QMU analysis process and the approach for mixed uncertainty propagation. The results indicate the potential effectiveness of the proposed QMU approach for the evaluation of structural reliability.
The authors declare that there are no conflicts of interest regarding the publication of this paper.
The authors acknowledge China Academy of Engineering Physics (CAEP) Foundation (2013ZK1.2) and Foundation of NSAF, Grant no. U1330130 (NSAF-U1330103).
References
- J. C. Helton, J. D. Johnson, W. L. Oberkampf, and C. J. Sallaberry, “Representation of analysis results involving aleatory and epistemic uncertainty,” International Journal of General Systems, vol. 39, no. 6, pp. 605–646, 2010.
- J. C. Helton and W. L. Oberkampf, “Alternative representations of epistemic uncertainty,” Reliability Engineering & System Safety, vol. 85, no. 1–3, pp. 1–10, 2004.
- R. Rackwitz, “Reliability analysis—a review and some perspectives,” Structural Safety, vol. 23, no. 4, pp. 365–395, 2001.
- S.-K. Au and J. L. Beck, “Estimation of small failure probabilities in high dimensions by subset simulation,” Probabilistic Engineering Mechanics, vol. 16, no. 4, pp. 263–277, 2001.
- X. Du, P. K. Venigella, and D. Liu, “Robust mechanism synthesis with random and interval variables,” Mechanism & Machine Theory, vol. 44, no. 7, pp. 1321–1337, 2009.
- H.-Z. Huang, Z. L. Wang, Y. F. Li, B. Huang, N. C. Xiao, and L. P. He, “A nonprobabilistic set model of structural reliability based on satisfaction degree of interval,” Mechanika, vol. 17, no. 1, pp. 85–92, 2011.
- D. Eardley, “Quantification of margins and uncertainties (QMU),” Tech. Rep. JSR-04-330, JASON, 2005.
- H. Shah, S. Hosder, and T. Winter, “Quantification of margins and mixed uncertainties using evidence theory and stochastic expansions,” Reliability Engineering & System Safety, vol. 138, pp. 59–72, 2015.
- NAS/NRC (National Academy of Science/National Research Council), Evaluation of Quantification of Margins and Uncertainties for Assessing and Certifying the Reliability of the Nuclear Stockpile, National Academy Press, Washington, DC, USA, 2008.
- H.-R. Bae, R. V. Grandhi, and R. A. Canfield, “An approximation approach for uncertainty quantification using evidence theory,” Reliability Engineering & System Safety, vol. 86, no. 3, pp. 215–225, 2004.
- X. P. Du, “Unified uncertainty analysis by the first order reliability method,” Journal of Mechanical Design, vol. 130, no. 9, pp. 1–10, 2008.
- I.-K. Bang, D.-S. Han, G.-J. Han, and K.-H. Lee, “Structural optimization for a jaw using iterative Kriging metamodels,” Journal of Mechanical Science and Technology, vol. 22, no. 9, pp. 1651–1659, 2008.
- Z. Wang and P. Wang, “A maximum confidence enhancement based sequential sampling scheme for simulation-based design,” Journal of Mechanical Design, vol. 136, no. 2, Article ID 021006, 2014.
Copyright © 2016 Chaoyang Xie and Guijie Li. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.