Abstract

Many hazards underlie complex systems and can disrupt the normal execution of engineering events. To reduce this uncertainty, a comprehensive pre-event checkout is required to judge whether an engineering event will be carried out successfully under the current circumstances, so that further improvements can be made. This paper proposes a generic belief rule-based safety evaluation approach for large-scale complex systems. The overall system is first decomposed and filtered into measurable attributes that may contribute to uncertainty. A belief structure is then applied to measure the uncertainty of vagueness and incompleteness and to represent heterogeneous information in a unified scheme. With this scheme, a rule base is established in which antecedents, consequents, and attributes are all expressed in belief degrees; it is used to capture the relationships between attributes, aggregate their influences, and generate the final inference with the evidential reasoning (ER) algorithm. The result is an estimate of uncertainty expressed as a belief distribution, which describes how the system will perform under the given conditions and sources. A numerical case from an aerospace program is provided to illustrate feasibility.

1. Introduction

With increasing interest in space exploration, more and more aerospace programs are on the agenda. Usually, they are large-scale systemic constructions with couplings among various subsystems, as well as a host of unpredictable incidents. In a rocket launch, an uncertain factor in any of the main subsystems (the launch-vehicle system, the control system, or the spaceport system) will affect the execution of the mission. Due to systemic sophistication and practical necessity, uncertainty cannot be completely eliminated. Thus, whether a launch mission will be carried out as expected under given conditions, and how well it will be executed, is unknown, which calls for uncertainty evaluation beforehand.

Basically, an aerospace system is characterized by a hierarchical structure, coupled relationships between parts, and collective behaviors, which defines it as a complex system. The evaluation of uncertainty in an aerospace system can therefore be generalized and categorized as uncertainty or safety analysis of complex engineering systems. From this perspective, the study of uncertainty should go beyond analyzing a specific one-off project; it should instead examine the common sources of fragility across various complex engineering systems from a broader perspective and establish a common evaluation framework. In this way, one can better understand and control uncertainty when facing such systems in the future [1].

The concept of risk was then proposed to deal with the uncertainty underlying hazards. Risk is measured by the product of the probability and the consequence of an incident. This was the first attempt and an evolutionary breakthrough, since it bridged the gap between purely qualitative assessment and elementary quantitative assessment.

However, with increasing demands for high reliability and efficiency, more precise and accurate evaluations need to be conducted, supported specifically by numerical methods. This requires processing the data and knowledge gathered from experiments and past accidents to help evaluate the current state of incidents in a more credible form. Probabilistic risk analysis (PRA) is one typical quantitative method that has been widely applied in risk and safety analysis [2]. Meanwhile, research institutions including NASA, the US Department of Defense, and the European Space Agency have also proposed numerical methods such as FMECA (failure mode, effects, and criticality analysis), FTA (fault tree analysis), ID (influence diagram), and BN (Bayesian network) [2, 3].

Moreover, these basic methods have been continually modified and developed to adapt to systemic complexity, scale, and features from a common complex-systems-engineering perspective. They are usually hybrid frameworks integrating different mathematical methods, statistical or stochastic, to eliminate the deficiencies of any single reliability method, to deal with nonlinearity and uncertainty, to quantify the interactions of decisions in a system, and so forth [4, 5].

Mostly, quantitative risk analysis and evaluation methods are based on the traditional theory of probability, which is to say that uncertainty is generally described by the likelihood of occurrence and a well-parameterized model must be established from substantial information. Though it has taken probability theory centuries to develop into a mature theory with a solid mathematical foundation and complete proofs and to become a commonly used measurement in various areas, some constraints still exist in realistic applications when depicting certain risk variables [6]. Usually, in large-scale, high-tech projects, there are few chances to obtain sufficient information to characterize the changing pattern of the variables, let alone the distributions considered necessary in probability theory, because inference is generated from the known probabilities and distributions of all variables. To cope with situations where empirical data are sparse or lacking, some improvements have been made to existing quantitative approaches; for instance, hierarchical Bayesian models have been proposed to reduce the estimation variance and to address multidimensionality along with cross-classified random effects [7].

Nevertheless, research continues because these models are still insufficient for practical issues. In fact, besides data scarcity, quite a few inputs for analysis are derived from expert knowledge and engineering experience; thus they can be vague, fuzzy, or incomplete, which characterizes epistemic uncertainty as opposed to aleatory uncertainty [8, 9]. The former is very common in real systems engineering. Fuzzy set theory, rough sets, and related techniques have therefore been widely applied to measure the vagueness and imperfection of information [10–12]. Though they provide effective solutions to evaluation, these models are sometimes mathematically sophisticated and require a profound mathematical background; thus, they are not easily accessible to every practitioner.

As a result, the Dempster-Shafer theory of evidence, which arose in the late 1960s, has been proposed to fill this gap. It can be regarded as a modification of probability theory that effectively deals with imprecision and ignorance of information [13]. It has been widely used in decision making, expert systems, data fusion, and so forth [14–16]. The concept of "belief" in the theory of evidence is well suited to describing uncertainty, as it to some extent simulates the way people make subjective judgments when possessing only partial knowledge [17]. It is also capable of aggregating heterogeneous information to generate a final conclusion. Based on the Dempster-Shafer theory, many algorithms have been developed to fuse data and knowledge. Here, the belief rule-based inference methodology using the evidential reasoning algorithm (RIMER) is applied to evaluate the performance of complex systems [18]. The uncertain factors in each related subsystem are filtered and graded, and by combining every influential attribute from the bottom level, the overall evaluation is achieved. Due to its theoretical rationality, practical feasibility, and operational simplicity, it serves as a novel generic safety evaluation approach for large-scale complex engineering systems.

2. Problem Formulation of Safety Evaluation

There are two significant features of uncertainty evaluation for common complex engineering systems: one is that the number of uncertain factors is huge and their types are diverse; the other is that little accurate knowledge is available to evaluate these factors. Thus, a generic methodology framework needs to be established to represent the data in a unified form, deal with incompleteness, and generate a conclusion about system performance.

This problem and its solution can be formulated as follows.

Procedure 1. Decompose the system and filter the uncertainty attributes. The complex systems are decomposed into several segments that may influence the execution of the launch. These segments, containing core devices, facilities, and sources associated with each other, provide the fundamental working environment.

Attributes for evaluating the uncertainty of each segment are then filtered out. The values of these attributes preview the current state of each segment and the sources that can be provided at a specific time, from which an overall numerical judgment is made as to whether the fundamental demand can be met.

Procedure 2. Establish the influence structure. Generally, the uncertain attributes within the systems should be summarized and organized into a hierarchical structure, which is to say that the performance of the overall systems is determined by these potential risk contributors. This hierarchical relationship is one-way; in other words, the nodes in upper levels can only be affected by the nodes in lower levels, and not vice versa.

Procedure 3. Generate conclusions from system inputs. The decomposition and filtering continue until all the uncertainty attributes can be measured either by quantitative data or by linguistic terms from experts, denoted by x_i, i = 1, ..., T. The evaluation consequence derived from these attributes is represented by y and calculated by a logical function f. Thus, the basic evaluation model is

y = f(x_1, x_2, ..., x_T),

where x_i, i = 1, ..., T, are the system inputs for uncertainty evaluation. The inputs reflect the current state of the systems, and y shows the performance of the systems under the given conditions.

Then, the main concern is to determine an appropriate function f that can deal with the nonlinear relationship between the inputs and the consequence and with the heterogeneity of the various inputs. In the next section, the model is extended with belief structures, and a belief rule-based evaluation methodology is proposed.

3. Belief Rule-Based Safety Evaluation Approach Framework

To better illustrate the structure of the proposed belief rule-based safety evaluation approach for large-scale complex engineering systems, an aerospace program is taken as an example to help explain the procedures.

3.1. Systems Input Representation Using Belief Degree

Due to ambiguity and incompleteness, neither the system inputs nor the consequence can be represented in a crisp form. The concept of "belief" is applied to measure the degree assigned to each value. In this scheme, the inputs and the evaluation consequence can be represented as

S(x_i) = {(A_{i,j}, α_{i,j}), j = 1, ..., J_i},    S(y) = {(D_n, β_n), n = 1, ..., N},

respectively, where α_{i,j} indicates the degree to which the input x_i takes the value A_{i,j} and β_n is the belief degree to which the consequence is assessed to the grade D_n. Usually, Σ_{j=1..J_i} α_{i,j} ≤ 1. If Σ_j α_{i,j} = 1, the input is considered to include complete knowledge; otherwise it is incomplete. In particular, Σ_j α_{i,j} = 0 indicates absolute ignorance about the input, and α_{i,j} = 1 means that the input takes the value A_{i,j} with certainty. For example, the design reliability of a piece of equipment may be given as {(0.96, 0.9), (0.94, 0.1)}, which says that the reliability is 0.96 with belief degree 0.9 and 0.94 with belief degree 0.1.
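As a minimal illustration, the belief-structure representation and the completeness check above can be sketched as follows (the helper names are ours, not from the paper):

```python
# Represent an assessment as (referential value, belief degree) pairs and
# classify it as complete or incomplete by the sum of its belief degrees.

def total_belief(distribution):
    """Sum of the belief degrees of a distribution."""
    return sum(degree for _, degree in distribution)

def is_complete(distribution, tol=1e-9):
    """Complete knowledge <=> the belief degrees sum to 1."""
    return abs(total_belief(distribution) - 1.0) < tol

# The equipment-reliability example from the text:
# reliability 0.96 with belief 0.9 and 0.94 with belief 0.1.
reliability = [(0.96, 0.9), (0.94, 0.1)]
assert is_complete(reliability)

# An assessment leaving 0.15 of belief unassigned is incomplete.
partial = [(0.96, 0.6), (0.94, 0.25)]
assert not is_complete(partial) and total_belief(partial) < 1.0
```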

3.2. Construct Belief Rule Base for Evaluation

To generate the consequence, the priority is to formulate the inference mapping f. It is expressed in belief rules, and a belief rule base (BRB) is established. The BRB provides a generic representation of the evaluation problem, made up of attributes, referential values, consequences, and a logical function. Generally, the model to estimate the uncertainty in complex systems is represented as

y = f(x_1, x_2, ..., x_T),

where x_i (i = 1, ..., T) are the uncertainty attributes and y is the evaluation consequence. Set A = {A_1, A_2, ..., A_T} to be the set of estimation referential values whose elements state the different grades of an attribute, where A_i = {A_{i,j}, j = 1, ..., J_i} is the referential set of values for the uncertainty attribute x_i.

The BRB consists of IF-THEN rules that describe the relationship between the uncertainty attributes and the inferred conclusions. The rules give the mapping between the referential values of the attributes and the consequence. Formally, the kth rule with belief structure is denoted by

R_k: IF x_1 is A_1^k ∧ x_2 is A_2^k ∧ ... ∧ x_{T_k} is A_{T_k}^k, THEN y is {(D_1, β_{1,k}), (D_2, β_{2,k}), ..., (D_N, β_{N,k})},

where β_{n,k} is the belief degree with which the consequence D_n is believed to emerge in the kth rule, with the constraint Σ_{n=1..N} β_{n,k} ≤ 1. The relative weight of the kth rule is denoted by θ_k, and the relative weights of the attributes in this rule are represented by δ_1, δ_2, ..., δ_{T_k}.

For example, when judging the uncertainty of the possible tracking sources in the control system during flight, one typical rule reads: "IF the number of tracking ships in service is 2 AND the possible tracking time that the ground control station can provide is 80 min, THEN the uncertainty is {(low, 0.8), (medium, 0.2)}." Here, "the number of tracking ships" and "the possible tracking time" are two uncertainty attributes affecting the uncertainty of the possible tracking sources; "2" and "80 min" are elements of their respective referential sets. If the inputs are exactly the combination of (2, 100%) and (80 min, 100%), then the result {(low, 0.8), (medium, 0.2)} is generated, meaning that the system has low uncertainty with belief degree 0.8 and medium uncertainty with belief degree 0.2. In this rule, θ_k and δ_i are taken to be 1, and Σ_n β_{n,k} = 1 indicates that the rule includes complete knowledge. The rule base is built up from expert knowledge and empirical data in the aerospace area.
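As a sketch, one belief rule can be held in a plain data structure (field names are illustrative, not from the paper):

```python
# One belief rule: antecedent referential values, a consequent belief
# distribution, and the rule/attribute weights theta_k and delta_i.
from dataclasses import dataclass, field

@dataclass
class BeliefRule:
    antecedents: dict            # attribute name -> required referential value
    consequence: dict            # consequence grade -> belief degree beta_{n,k}
    rule_weight: float = 1.0     # theta_k
    attr_weights: dict = field(default_factory=dict)  # delta_i per attribute

# The tracking-sources rule quoted in the text.
rule = BeliefRule(
    antecedents={"tracking ships": 2, "tracking time (min)": 80},
    consequence={"low": 0.8, "medium": 0.2},
)

# The consequent belief degrees sum to 1, so the rule embodies complete knowledge.
assert abs(sum(rule.consequence.values()) - 1.0) < 1e-9
```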

3.3. Generate the Evaluation Consequence

The evaluation is generated by processing the given system inputs with the rule base above. However, the inputs may not exactly equal the referential values; thus, inference cannot be made directly. Input transformation is needed first.

The transformation process determines the degree to which an input matches each referential value. Since the input may not be contained in the referential set, it must be re-represented over the given referential values in belief structure. By transformation, an input x_i is converted into the form

S(x_i) = {(A_{i,j}, α_{i,j}), j = 1, ..., J_i},

where α_{i,j} is the matching degree to which the input belongs to the referential value A_{i,j}, with Σ_j α_{i,j} ≤ 1. For a numeric input lying between two adjacent referential values, A_{i,j} ≤ x_i ≤ A_{i,j+1}, the matching degrees are commonly calculated as

α_{i,j} = (A_{i,j+1} − x_i) / (A_{i,j+1} − A_{i,j}),    α_{i,j+1} = 1 − α_{i,j},    α_{i,l} = 0 for l ≠ j, j + 1.

Take the humidity of the environment on the launch pad as an example. The referential set is {30%, 50%}. For an actual input of 35%, the formula expresses the input as {(30%, 0.75), (50%, 0.25)}. For attributes described using linguistic terms, the formula is applicable as well. The referential set of experimental proof is {highly, moderately, poorly}, representing scores 1, 0.5, and 0, respectively. The input 0.68 is transformed to {(highly, 0.36), (moderately, 0.64), (poorly, 0)}.
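The transformation can be sketched as a small function (a hedged sketch; the linear interpolation is the one given by the formula above):

```python
def transform(value, referentials):
    """Distribute a numeric input over an ordered set of referential values,
    returning matching degrees that sum to 1; a value between two adjacent
    referentials is split by linear interpolation."""
    refs = sorted(referentials)
    if value <= refs[0]:
        return {refs[0]: 1.0, **{r: 0.0 for r in refs[1:]}}
    if value >= refs[-1]:
        return {**{r: 0.0 for r in refs[:-1]}, refs[-1]: 1.0}
    degrees = {r: 0.0 for r in refs}
    for lo, hi in zip(refs, refs[1:]):
        if lo <= value <= hi:
            degrees[lo] = (hi - value) / (hi - lo)
            degrees[hi] = 1.0 - degrees[lo]
            break
    return degrees

# Humidity example from the text: 35% against the referential set {30%, 50%}.
humidity = transform(0.35, [0.30, 0.50])
assert abs(humidity[0.30] - 0.75) < 1e-6 and abs(humidity[0.50] - 0.25) < 1e-6

# Linguistic example: score 0.68 against {poorly: 0, moderately: 0.5, highly: 1}.
score = transform(0.68, [0.0, 0.5, 1.0])
assert abs(score[0.5] - 0.64) < 1e-6 and abs(score[1.0] - 0.36) < 1e-6
```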

Given that an input has been converted into the form {(A_i^k, α_i^k), i = 1, ..., T_k} corresponding to the kth rule, it must now be determined whether this rule is activated by the input. Denote by α_k the total matching degree of the input to the kth rule, which can be calculated by

α_k = φ(α_1^k, α_2^k, ..., α_{T_k}^k).

The function φ aggregates the individual α_i^k so as to reflect the relationship among the attributes as a whole. The selection of φ varies with the connectivity among the attributes in the kth rule. In complex systems, uncertain attributes are mostly connected by the operator "∧" (AND), which means that the rule is not activated unless all of its attributes are satisfied. In this case,

α_k = ∏_{i=1..T_k} (α_i^k)^{δ̄_i},    where δ̄_i = δ_i / max_{j=1..T_k} {δ_j}.

Then, the activation weight of the kth rule is generated by

w_k = θ_k α_k / Σ_{l=1..L} θ_l α_l,

where θ_k is the relative weight of the kth rule. If Σ_{l=1..L} θ_l α_l = 0, none of the rules is activated.
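Under these definitions, the activation step can be sketched as follows (attribute weights are taken as equal here for brevity, so the exponents δ̄_i are all 1):

```python
# Compute rule activation weights: the total matching degree alpha_k of a rule
# under the AND connective is the product of its attributes' matching degrees,
# and w_k is the rule-weighted, normalised alpha_k.

def activation_weights(matching_per_rule, rule_weights):
    """matching_per_rule: per rule, the list of matching degrees alpha_i^k.
    rule_weights: the relative rule weights theta_k."""
    alphas = []
    for degrees in matching_per_rule:
        prod = 1.0
        for a in degrees:
            prod *= a
        alphas.append(prod)
    denom = sum(t * a for t, a in zip(rule_weights, alphas))
    if denom == 0.0:
        return [0.0] * len(alphas)   # no rule is activated
    return [t * a / denom for t, a in zip(rule_weights, alphas)]

# The humidity input (matching degrees 0.75 and 0.25) partially activates
# two single-antecedent rules of equal weight.
w = activation_weights([[0.75], [0.25]], [1.0, 1.0])
assert abs(w[0] - 0.75) < 1e-9 and abs(sum(w) - 1.0) < 1e-9
```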

What is more, in some situations, to preserve the incompleteness of the input and propagate it to the consequence, the belief degree of the consequence of a rule needs to be adjusted by

β̄_{n,k} = β_{n,k} · ( Σ_{i=1..T_k} Σ_{j=1..J_i} α_{i,j} ) / T_k,

where β_{n,k} is the original belief degree in the kth rule. If the input is complete, then β̄_{n,k} = β_{n,k}.

The last step is to combine the multiple uncertainty attributes and generate the inference consequence from all activated rules by applying the ER approach.

Firstly, calculate the basic probability masses using

m_{n,k} = w_k β̄_{n,k}, n = 1, ..., N,    m_{D,k} = 1 − Σ_{n=1..N} m_{n,k} = 1 − w_k Σ_{n=1..N} β̄_{n,k},

where m_{n,k} is the basic probability mass assigned to the nth consequence D_n in the kth rule and m_{D,k} is the mass left unassigned to any consequence. Obviously, Σ_{n=1..N} m_{n,k} + m_{D,k} = 1. The unassigned mass splits into two parts, m̄_{D,k} = 1 − w_k, caused by the relative importance of the rule, and m̃_{D,k} = w_k (1 − Σ_{n=1..N} β̄_{n,k}), caused by the incompleteness of the rule, with m_{D,k} = m̄_{D,k} + m̃_{D,k}.

Secondly, suppose m_{n,I(k)} is the combined probability mass of the consequence D_n after aggregating the first k rules, with m_{n,I(1)} = m_{n,1} and m_{D,I(1)} = m_{D,1}. Then the overall belief degrees β_n of the consequence D_n are derived with the recursive ER algorithm, for k = 1, ..., L − 1:

K_{I(k+1)} = [ 1 − Σ_{n=1..N} Σ_{t=1..N, t≠n} m_{n,I(k)} m_{t,k+1} ]^{−1},
m_{n,I(k+1)} = K_{I(k+1)} [ m_{n,I(k)} m_{n,k+1} + m_{n,I(k)} m_{D,k+1} + m_{D,I(k)} m_{n,k+1} ],
m̃_{D,I(k+1)} = K_{I(k+1)} [ m̃_{D,I(k)} m̃_{D,k+1} + m̃_{D,I(k)} m̄_{D,k+1} + m̄_{D,I(k)} m̃_{D,k+1} ],
m̄_{D,I(k+1)} = K_{I(k+1)} m̄_{D,I(k)} m̄_{D,k+1},

and finally

β_n = m_{n,I(L)} / (1 − m̄_{D,I(L)}),    β_D = m̃_{D,I(L)} / (1 − m̄_{D,I(L)}),

where β_D is the belief degree left unassigned due to input incompleteness.
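The recursive combination can be sketched in a compact routine (a hedged sketch of the standard recursive ER combination; variable names follow the notation above):

```python
# Combine activated rules' consequent belief distributions with the recursive
# evidential reasoning algorithm, tracking separately the mass unassigned due
# to rule weights (m_bar) and due to incompleteness (m_tilde).

def er_combine(activation, beliefs):
    """activation: activation weights w_k; beliefs: per rule, a dict
    grade -> adjusted belief degree. Returns the combined grade -> beta_n."""
    grades = sorted({g for b in beliefs for g in b})
    m = {g: activation[0] * beliefs[0].get(g, 0.0) for g in grades}
    m_bar = 1.0 - activation[0]
    m_tilde = activation[0] * (1.0 - sum(beliefs[0].values()))
    for w, b in zip(activation[1:], beliefs[1:]):
        mk = {g: w * b.get(g, 0.0) for g in grades}
        mk_bar, mk_tilde = 1.0 - w, w * (1.0 - sum(b.values()))
        m_D, mk_D = m_bar + m_tilde, mk_bar + mk_tilde
        # Normalisation constant K discards the conflicting mass.
        K = 1.0 / (1.0 - sum(m[g] * mk[t] for g in grades
                             for t in grades if t != g))
        new_m = {g: K * (m[g] * mk[g] + m[g] * mk_D + m_D * mk[g])
                 for g in grades}
        m_tilde = K * (m_tilde * mk_tilde + m_tilde * mk_bar + m_bar * mk_tilde)
        m_bar = K * m_bar * mk_bar
        m = new_m
    return {g: m[g] / (1.0 - m_bar) for g in grades}

# Two rules activated with weights 0.75 / 0.25 and complete consequents:
# the combined distribution is again complete and favours "low".
combined = er_combine([0.75, 0.25],
                      [{"low": 0.8, "medium": 0.2},
                       {"low": 0.5, "medium": 0.5}])
assert abs(sum(combined.values()) - 1.0) < 1e-9
assert combined["low"] > combined["medium"]
```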

4. Case Study

Consider a launch mission in China to illustrate the feasibility of the proposed method. The estimation deals with the case that the launch vehicle is about to be equipped with a newly developed upper stage. Since it is the first time for the new upper stage to be carried into outer space, a mission-oriented prelaunch checkout of the whole system is required to determine whether the current states of the subsystems are qualified to fulfill the task. The launch process is a large-scale systemic engineering effort that calls for the cooperation of various subsystems, including the launch-vehicle system, the control system, and the spaceport system. Ignoring uncertainty in any of these systems may cause the launch to fail, leading to huge losses. Thus, estimation of the potential uncertain factors in a mission is necessary from a systemic perspective. The estimation proceeds following the procedures proposed above.

4.1. System Decomposition and Uncertainty Attribute Filtering

The decomposition and filtering may not be fully comprehensive and technical, being based on the documents obtained so far, but they reflect the risk posed by a launch to some extent and illustrate the method well.

A launch vehicle is generally composed of three main elements: several rocket engines; guidance, navigation, and control systems; and a structure housing all these components, including the payloads. Among them, the propulsion system is the major contributor to uncertainty during flight [19]. Currently, the primary propulsion systems for launch vehicles are restricted to solid or liquid propellants. The latter, referred to as engines, have a higher specific impulse than the former, usually called motors; thus they provide longer-lasting thrust for the same weight of stored propellant. Their disadvantages, however, are storage complexity and the possibility of fuel leakage. From a mission-oriented perspective, the stronger the power, the heavier the payload the launcher can carry and the higher the orbit it can reach. For example, a three-stage liquid-propelled rocket can take a 1.5 t payload to the sun-synchronous orbit, which is 900 km high, or carry a 1.1 t payload to the geosynchronous orbit, which is about 36000 km high, while a three-stage hybrid rocket is less capable. The issue is selecting the appropriate rocket for the payload and the mission: a less qualified rocket carrying a heavy payload into a high orbit may be overwhelmed, resulting in failure. What is more, a new upper stage is being developed to adapt to new tasks. Though its design reliability is high, it has not been frequently tested in practical scenarios, so there is little experimental proof to guarantee the maturity of the upper-stage technology. In all, the decomposition of the launch-vehicle system is shown in Figure 1.

The control and command system implements monitoring and control tasks with the cooperation of various sources: ground control stations and tracking ships. So far, the tracking facilities within China are plentiful, and the leading constraint is the lack of tracking stations outside the country. The San Diego station is the only source outside China that can provide monitoring service after the separation of the upper stage and the rocket, and only for a limited time. Furthermore, if the rockets are launched from the Hainan spaceport, nothing but the tracking ships can help during certain periods because of the geographical difficulty of setting up stations around the spaceport. Specifically, only 3 tracking ships are in service unless repair is needed. Nevertheless, real-time monitoring and communication are still not available throughout the whole flight, so uncertainty may underlie the disconnected periods. Besides, the timing reference signals among ground stations, between ground stations and the control center, and between tracking ships and the control center are sent in either shortwave or longwave. The former is less precise, which may result in inaccuracy and calculation errors in the observed data exchanged between the parties. The decomposition of the control and command system is shown in Figure 2.

In terms of the spaceport system, environmental factors affect the execution of a launch. Appropriate humidity and temperature ensure the storage of fuel and the functionality of facilities and prevent delays caused by weather; inappropriate conditions have the opposite effect. Though good environmental conditions provide plenty of time per year for the launch pad to implement launches, there is still an interval for recovery and preparation before the next launch, so the number of launches executed every year is limited. Generally speaking, the interval is at least 3 months even on a tight schedule. The decomposition of the spaceport system is shown in Figure 3.

4.2. Belief Rule Base for Aerospace Project Evaluation

In this way, the uncertainty evaluation of the launch process is decomposed and filtered into specific, measurable attributes that allow quantitative assessment. The possible values assigned to each attribute are listed, and the rule base is established with technical inference, practical experience, and expert knowledge, as shown in Figure 4 and Tables 1 and 2.

4.3. System Input and Evaluation Result

Suppose a mission execution proposal has been initially decided. According to the weather forecast, the temperature will be 18°C and the humidity 35% on the launch day, with belief degrees of 90% and 80%, respectively. After 4 months of preparation, including transporting, assembling, and fuel filling, a 3.8 t payload is to be sent into GEO by a three-stage liquid-propelled launch vehicle equipped with the newly developed upper stage. The design reliability of the upper stage is 0.96, and it has been frequently tested in a relevant environment. Due to regular repairs and emergencies, it is guaranteed with 90% belief that 2 tracking ships will stand by at all times. With every effort, the San Diego ground control station can provide 80 min of tracking and monitoring service, but in case of technical failure the probability of full service is estimated at 80%. During the flight in space, real-time communication with the control center is limited by location: only 65% of the time is available for real-time interaction. The communication among the control stations and the control center, however, is working well. Overall, the attributes are summarized in Table 3.

5. Results and Discussion

A programmatic implementation of the belief rule-based algorithm is used to calculate the framework from the bottom level to the top level. The propulsion capability is calculated as {(high, 0.9083), (medium, 0.0917)} under the constraints of rocket type, payload weight, and target orbit. This means that the propulsion system is basically capable of fulfilling the task, and so is the upper stage, whose maturity evaluation is {(high, 0.9697), (medium, 0.0303)}. All the attribute evaluations at level 3 are given in Table 4.

From the results, it is obvious that the information incompleteness has been propagated to “uncertainty in tracking sources” and “environment uncertainty” due to the incompleteness of their antecedents. And this incompleteness will continue to spread to the top level.

In the end, the uncertainty evaluation of the launch can be calculated, and the result is shown in Table 5. The entry (low, 0.9196) means that, according to this execution proposal, the uncertainty is quite low, with a belief degree of 0.9196 under the current conditions. The launch mission is believed to be achievable as expected.

6. Conclusion

The application shows that the belief rule-based safety evaluation approach using evidential reasoning is well qualified to deal with mission estimation and is believed to be feasible for complex systems evaluation in general. The method has several notable features. First, every uncertain factor affecting the performance of the complex systems, as well as every evaluation result, is represented as a distribution over referential values. This follows the way people recognize the world and make subjective judgments. In this way, no matter what the data type is, quantitative or linguistic, accurate or imprecise, even extremely vague, it can be interpreted under a unified criterion. Second, the rule base provides an effective approach to combining inputs with various structures from each subsystem, which solves the problem of fusing heterogeneous data.

Due to limited sources and uncontrollable factors, it is impossible to guarantee that a mission will be completed without any incident. However, by applying the proposed method, uncertainty can be quantified, measured, and compared. From the evaluation results, decision makers can identify the main uncertainty contributors with higher degrees, estimate whether the risk is acceptable, and judge whether the complex systems will perform as expected under the given conditions. What is more, comparison between different systems is also feasible using this method.

The importance of the rule base cannot be neglected. The if-then rules help to deal with the nonlinear causal relationships and heterogeneous data fusion, and they have a great impact on the inference accuracy. So far, the rule base is built up from technical assessment, engineering experience, and expert knowledge, which to some extent is not objective enough. In future work, data training will be applied to self-construct a more reliable rule base [20].

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgment

This work was supported (in part) by the National Natural Science Foundation of China under Grants nos. 71331008 and 71201168.