Abstract

The analytical/deterministic modelling and simulation/probabilistic methods are, as a rule, used separately to analyse physical processes and random or uncertain events. However, in currently used probabilistic safety assessment (PSA) this separation is an issue: the lack of treatment of the dynamic interactions between the physical processes on the one hand and random events on the other hand limits the assessment. In general, there are many mathematical modelling theories, which can be used separately or integrated in order to extend the possibilities of modelling and analysis. The Theory of Probabilistic Dynamics (TPD) and its augmented version based on the concept of stimulus and delay are introduced for dynamic reliability modelling and the simulation of accidents in hybrid (continuous-discrete) systems considering uncertain events. An approach of non-Markovian simulation and uncertainty analysis is discussed in order to adapt the Stimulus-Driven TPD for practical applications. The developed approach and related methods are used as a basis for a test case simulation in view of the application of various methods to severe accident scenario simulation and uncertainty analysis. For this, and for a wider analysis of accident sequences, the initial test case specification is then extended and discussed. Finally, it is concluded that enhancing the modelling of stimulated dynamics with uncertainty and sensitivity analysis allows the detailed simulation of complex system characteristics and the representation of their uncertainty. The developed approach of accident modelling and analysis can be efficiently used to estimate the reliability of hybrid systems and at the same time to analyse and possibly decrease the uncertainty of this estimate.

1. Introduction

A number of different methodologies have been proposed in order to analyze stochastic events and the time intervals that elapse between them. The best-known theoretical background of these methodologies for treating and analyzing dynamic systems is still based on the Markov approach. For instance, the Theory of Probabilistic Dynamics [1–3] was extensively investigated in order to perform analytical modelling and simulation related to the analysis of system reliability and safety. Dynamic reliability techniques [4] have been developed in order to study the reliability parameters of complex dynamic systems having continuous processes and discrete events (e.g., failures) interacting with each other.

In dynamic reliability theory, the concept of reliability includes the interaction existing between the sequence of dynamics and events, such as the crossing of the border of a safety domain in the space of the physical variables and the transitions between dynamics. The large number of states, possible time-dependent delays, and transition probabilities that have to be evaluated may be the most important limitation of using the Markov approach for large sets of system components. In addition, stochastic events and uncertainty in the parameters of the dynamics complicate the analysis even more. Thus, uncertainty analysis becomes necessary, and some general uncertainty estimation and analysis techniques are introduced and discussed further below.

Most commonly used methods for reliability analysis and probabilistic safety assessment (PSA) are based on the assumption that the basic events are functionally independent of each other. This assumption often does not hold, and Markov processes are mainly used to account for the time dependence of the reliability and availability functions.

In this case, it is possible to use the assumption that the transfers from state to state follow a Markov process. The initial equations to be considered may be expressed as follows:
$$\frac{dp_i(t)}{dt} = -\lambda_i \, p_i(t) + \varphi_i(t), \quad (1)$$
$$\lambda_i = \sum_{j \neq i} \lambda_{i \to j}, \quad (2)$$
$$\varphi_i(t) = \sum_{j \neq i} \lambda_{j \to i} \, p_j(t), \quad (3)$$
where $p_i(t)$ is the probability of the system being in state $i$ at time $t$ and $\bar{p}(t)$ is the system state vector, composed of the set of system state probabilities. In (2), the total transition rate $\lambda_i$ out of state $i$ is defined as the sum of all $\lambda_{i \to j}$, which are the transition rates from state $i$ to state $j$. The term $\varphi_i(t)$, as defined in (3), is called the ingoing density, that is, the instantaneous frequency at which state $i$ is entered from any other state at time $t$.
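As a minimal numerical illustration (not part of the referenced methods), equations (1)–(3) can be integrated directly once a transition-rate matrix is given; the 3-state system and all rate values below are hypothetical placeholders.

```python
import numpy as np

# Forward-Euler integration of the Markov state equations (1)-(3)
# for an illustrative 3-state system with constant transition rates.
rates = np.array([[0.0,  1e-3, 5e-4],   # rates[i, j] = rate of i -> j
                  [2e-4, 0.0,  1e-3],
                  [0.0,  0.0,  0.0]])   # state 2 is absorbing

def step(p, dt):
    outgoing = rates.sum(axis=1) * p      # lambda_i * p_i(t), using eq. (2)
    ingoing = rates.T @ p                 # ingoing density phi_i(t), eq. (3)
    return p + dt * (ingoing - outgoing)  # eq. (1)

p = np.array([1.0, 0.0, 0.0])             # system starts in state 0
for _ in range(10_000):
    p = step(p, dt=1.0)
print(p)                                   # state probabilities at t = 1e4 s
```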

In general, each state can be associated with specific evolution equations for the process variables describing the system dynamics. But, in a Markov process, the probability for the system to stay in a given state during a given sojourn time is independent of the time at which the state is entered, so state probabilities are independent of the past history (memoryless stochastic systems).

The assumption of a Markov approach (i.e., the assumption that the probability that a system will transfer from one particular state to another depends only on the initial and final states of the transition) yields a major simplification of the simultaneous equations describing the state space diagrams.

However, even these simpler equations may not be soluble in analytical form, if the transition rates and possible delays between states are time-dependent functions. The analysis is even more complex, if transition rates are uncertain, that is, depend on uncertain parameters.

Initially, for uncertainty analysis and simulation of complex processes, the aggregate approach and the method of control sequences were investigated and widely used; the simulation of dynamic systems and the integration of modelling methods were also considered in this context [5]. According to this approach, the investigated objects are represented as a set of interacting Piecewise Linear Aggregates (PLAs) [6]. The method of control sequences is used for the aggregate specification. Initially, PLA formalism was mainly used for discrete event system specification and the analysis of distributed systems [7]. Later on, exploiting the advantages of PLA, the focus was set on the simulation and analysis of hybrid systems considering stimulated dynamics and interactions with various events [8]. However, the practical application of this approach showed various limitations and a need to look for different approaches or an integration of them.

Due to the strong dependence existing along an accident scenario between stochastic events (e.g., operator actions or component failures) and dynamics (i.e., the time-dependent evolution of physical processes, e.g., a change in temperature during a transient), the traditional simulation using discrete PLA formalism or Markov processes is not able to cope with such time-dependent hybrid system simulation. The reliability analysis of the system is even more complex, if transition rates are uncertain, that is, depend on uncertain parameters. Thus, extended approaches are considered in order to cope with this issue and the uncertainty analysis.

The paper, based on a short article in conference proceedings [9], is constructed as an investigation of issues in probabilistic dynamics, to give the reliability analyst and PSA practitioner a wider and clearer view of how accident sequence analysis considering uncertain events can be performed. This is very relevant for level-1 PSA [10] and especially for level-2 PSA [11]. It is worthwhile to mention that there are various techniques of dynamic event tree generation (e.g., ADAPT [12], MCDET [13]), which are specifically useful for level-2 PSA. The main contribution of this paper to the reliability assessment field lies in a wider discussion of the stimulus-driven treatment of probabilistic dynamics, in the practical application of related methods, and in the development of an approach for severe accident scenario simulation and uncertainty analysis, which is demonstrated by the test case and the extension of its initial specification. This can further be used in the benchmark exercise for comparison with other methods and approaches.

The structure of the paper is as follows. After the introduction and this outline, the considered methods and issues are presented (Sections 2.1 and 2.2), and the formal concept of stimulated dynamics and dynamic systems is presented in Section 2.3; further, in Section 2.4 the modelling and simulation approach as well as the analysis tool is introduced, with specifics on the implementation algorithm and the treatment of stimulated dynamics. The proposed approach for the analysis of uncertainty issues is emphasized in Sections 3.1–3.3, where the uncertainty estimation and analysis taking into account sensitivity measures, as well as the concept for the implementation of an integrated analysis using coupled software, are presented. The practical part of the paper (starting from Section 4.1) is devoted to the test case analysis. Initially, the test case specification is focused on process timing and associated events, in relation to the stimulated dynamics concept presented in Section 2.1. Then, this case study is presented (in Section 4.2) and, for comparison purposes (in Section 4.3), related to the event tree and simplified analytical modelling. The simulation of time-dependent rupture, which was the main concern in the test case, is described in Section 4.4. Finally, the analysis of uncertain rupture frequency is presented in Section 4.5. In the same section, the results of time-dependent uncertainty and sensitivity analysis are demonstrated and related to the idea of how this can be used in order to focus on the rarest sequences with quite severe consequences and possibly reduce the computational time of the simulations performed. The paper is completed by discussion and conclusions, summarizing the related PSA issues and the advantages of applying the proposed approach to accident sequence analysis considering hybrid systems and uncertain events.

2. Stimulated Dynamics and Uncertain Dynamic Systems

2.1. Extensions of the Markov Approach

Extensions of Markov processes were initially developed for cases where the transition rates depend on process variables, that is, when the TPD is valid [14]. Indeed, in some unfortunately frequent cases, there is a stochastic time delay between events (e.g., between the satisfaction of ignition conditions and the explosion itself). For instance, operators introduce delays in taking actions after alarms, which may lead to different further accident developments. In these cases, stochastic delays complicate the treatment of state transitions.

More generally, the same situation occurs whenever, for a transition (event) to occur, some conditions ought first to be fulfilled that depend on the accident transients and timing. These conditions may persist after the transitions to the new states.

A typical example is the occurrence of combustion phenomena only if flammability conditions are met, with delays potentially resulting from stochastic ignition conditions and with the potential for multiple combustions if the flammability conditions persist. For instance, this can be related to hydrogen generation and the possibility of combustion during a severe accident [15]. The more general conditions (including setpoints for thresholds as particular cases) may be considered as stimuli for the transitions (events). When process variables reach those conditions, the stimulus activation, or the start of the delay before the transition, can be considered.

Because stimulus activation conditions the occurrence of events, the history of stimulus activations and subsequent delays during the event sequence does matter in calculating the scenario frequencies; extensions of the Markov process equations accounting for these features are then necessary. Those extensions constitute the so-called Stimulus-Driven Theory of Probabilistic Dynamics (SDTPD).

Indeed, SDTPD [14] provides a mathematical basis to estimate the probabilities per unit time of entering states with specified activated stimuli and subsequent delays. Exceeding safety objectives is a particular case of stimulus, so SDTPD considering various stimuli has the potential for analyzing multiple objectives, including safety [16].

More recently, a simplification of SDTPD, the Theory of Stimulated Dynamics (TSD), was developed for the analytical modelling and the simulation of hybrid (continuous-discrete) systems [8, 17]. The theory at first deals with instantaneous and random variations of process variables; then, it introduces the concept of stimulus and how it can be implemented [14]. Both a semi-Markovian and a non-Markovian treatment may be used in order to adapt TPD for practical applications, mostly in the context of PSA. The development of TSD as well as of the related methods and simulation methodologies has been used by the TSD developers as a basis for applications to PSA and severe accident analysis.

Since the application of TSD-related methods to the traditional PSA concept [18] needs a formal approach, the new definitions and issues of uncertainty analysis are specified and discussed. Then the related investigation of reliability and uncertainty analysis is performed.

2.2. Issues of Uncertainty Analysis

The part of uncertainty related to any estimation can be considered as a spread or distribution in the value of the result estimate. Obviously, the spread in this estimate is related to the spread in the parameters of the probabilistic model used to estimate the result, for example, risk [16].

However, in addition to uncertain model parameters, another cause of uncertainty may arise from incompleteness, that is, from the incomplete modelling or data used in the probabilistic model itself or in the analyses used to derive the model. The uncertainty in inputs may also affect the topology of the probabilistic model or the data and time dependence used to quantify it.

The completeness of the scenario inventory depends on the consideration of each scenario construction, that is, on the way of grouping sequences and assigning to them corresponding frequencies and consequences. The level of conditional risk, given the scenario occurrence, partly represents how rare and important this scenario is from a consequence point of view. However, in order to search for scenarios with almost unpredictable, but possibly severe, consequences, there is a possibility to generate and consider events and dynamics as well as related sequences, which can be very rare.

The technique to search for rare random events is not evident and is not related to traditional PSA; in addition, there is also a concern about how to generate and to consider rare events, which are dependent on the changes in process variables values and timing. Actually, this means that scenarios related to such events are time-dependent and uncertain. Thus, for uncertainty analysis, the scenario development should be considered as well as uncertain parameters.

Uncertainty related to PSA could be classified according to the uncertainty source: the frequency of events and the sequences of dynamics themselves (i.e., dynamics and timing in the process variables space). Taking into account that all sources of uncertainty are important, there is a need for such uncertainty analysis, which considers both sources and at the same time reflects the issue of model incompleteness.

On the basis of this classification, it should be noted that, in the case of the first source of uncertainty, changing the values related to the frequency of event occurrence (e.g., failure rate $\lambda$) will not create new branching situations. It will affect the likelihood of already possible sequences and scenarios [19], without changes in the possible process variables evolutions and the scenarios themselves. Conversely, a change in the value of an uncertain threshold related to a specific event creates a new dynamic trajectory in the process variables space.

Considering this classification, it is easy to conclude that the first source of uncertainty (i.e., fluctuations of failure or recovery rates or of on-demand failure probabilities) can be propagated with no additional deterministic calculations, as all sequences in the process variables space remain valid. However, the second source of uncertainty, in principle, causes a continuum of additional scenarios with different timing, which requires considering probabilistic dynamics [4]. This uncertainty has an effect on the process variables evolutions, and it should therefore be investigated separately in order to save computational resources and represent conditions and scenario-related uncertainty or simulation incompleteness.

Taking into account the main features of dynamic systems, it can be seen that a simulation algorithm relevant for dynamic reliability and uncertainty issues should display the following characteristics:
(i) Search for the rare conditions under consideration.
(ii) Representation of the uncertainty of the conditions considered.

2.3. Formal Concept of Stimulated Dynamics

In the considered case, the modelling and analysis of dynamic systems is related to stimulated dynamics. Dynamics is determined by laws of process variables evolution, which can be indexed by an integer $i$. Process variables can be governed by a set of deterministic equations; that is, $d\bar{x}/dt = \bar{f}_i(\bar{x})$, with $\bar{x} \in \mathbb{R}^n$ and initial condition $\bar{x}(t_0) = \bar{x}_0$. In general, $\bar{f}_i$ is the dynamic model pertaining to the $i$th configuration and driven by the vector of physical variables $\bar{x}$.

An instantaneous change of the dynamics due to stimulus activation and the subsequent elapsing of a delay is associated with an event. An event is defined as a transition between dynamics $i$ and $j$ at a certain time $t$. A random event is an event whose occurrence is related to a complex nature and timing, which is modelled stochastically, for example, a time-distributed failure occurrence. A deterministic event is induced by deterministic rules (analytical equations).

To relate events to stimuli, there is a need to explain that a stimulus covers any situation or conditions whose occurrence, after a time delay, potentially causes an event to occur. An example of such an event can be related to the time moment when, following a given process, a threshold on pressure is reached and safety functions are activated after a delay (e.g., operator reaction) (see Figure 1).

In the usual formulations of the Theory of Probabilistic Dynamics, the change in the dynamic behavior of the physical process variables occurs with no delay. The main concept introduced here is the stochastically determined succession of stimulus activation and delay, which must take place prior to the transition between two dynamics, that is, the actual occurrence of an event.

Formally, for analysis, let $G = \{g_1, g_2, \dots, g_m\}$ be the set of all stimuli to be accounted for in the process evolution following the occurrence of a given event related to the transitions between dynamics. Denote by $e_{ik}(\bar{x}, \tau \mid \bar{x}_0)$ the probability density function of activating stimulus $g_k$ at state $\bar{x}$ after a time $\tau$ spent in dynamics $i$, which was entered at state $\bar{x}_0$ (see Figure 2 [14]).

Also define $w_{ik}(t_2 \mid t_1, \bar{x}_1)$, the probability per unit time of having a time delay $t_2 - t_1$ between the activation of stimulus $g_k$ at time moment $t_1$ and the actual occurrence of the event at time moment $t_2$, that is, the transition out of dynamics $i$, if stimulus $g_k$ was activated at state $\bar{x}_1$:
$$w_{ik}(t_2 \mid t_1, \bar{x}_1) = h_{ik}(t_2 - t_1 \mid \bar{x}_1),$$
where $h_{ik}(\tau \mid \bar{x}_1)$ is the probability density function of the delay $\tau$ between the activation of stimulus $g_k$ at time moment $t_1$ and, in the same conditions, leaving dynamics $i$ at time moment $t_2 = t_1 + \tau$.

2.4. Simulation Approach and Analysis Tool

After the occurrence of an initiating event, the corresponding evolution laws of the process variables are considered. They induce stimuli activations with their corresponding delays and related events. Such a process is carried on until the considered process variables reach one state among the possible final absorbing end states. In reliability analysis, an end state is a consequence expressed as a damage state or as a stable safe situation. The probability estimate (frequency) of any consequence can in practice be calculated as it develops during the simulation of the considered system.

The SDTPD is a good candidate for level-2 PSA because it mitigates the weaknesses of the current methods. But the direct application of this theory requires solving a lot of complex equations. Even in the simple test case presented below, it is not possible to solve them analytically. Moreover, for large systems, it is impractical to write out the associated equations.

When analyzing complex dynamic systems and accidents, analytical methods often cannot be properly applied. In such cases, Monte Carlo simulation, which is based on random number generation, can be used. The accident probability estimate is determined from the ratio of successes/failures to the number of trials. The most important part of this approach is to develop the simulation model of the considered physical system and stochastic process.

For instance, considering a simple dynamic system, which includes one stimulus $g$ and one possible event changing the dynamics, there is a need to simulate the deterministic process (changes of variables $\bar{x}$) under dynamics $i$ and the occurrence of the event related to the activation of stimulus $g$ and the subsequent time delay. Such a system simulation can be expressed using the following algorithm (a code sketch is given after this list):
(i) Process variables $\bar{x}$ follow the dynamics law $d\bar{x}/dt = \bar{f}_i(\bar{x})$.
(ii) At the (possibly random) time moment $t_a$ the stimulus $g$ is activated.
(iii) During the random time delay $\tau$ following the stimulus activation, the process variables still follow the same dynamics.
(iv) At the time moment $t_a + \tau$, the event occurs and the process is changed to another dynamics.
This is a really simple algorithm, which is based on two random variables: the time moment $t_a$ of stimulus $g$'s activation and the time delay $\tau$. In order to generate the values of these random variables, one must know their probability distribution functions $F_a(t)$ of the activation time and $F_\tau(t)$ of the delay. Considering complex systems, such functions usually are not known in closed form, as they depend not only on time but on the process variables values as well.
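The algorithm above can be sketched in code as follows, assuming, purely for illustration, exponential distributions for the activation time and the delay; the dynamics laws and all parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)

def f1(x):                  # dynamics before the event (arbitrary law)
    return -0.01 * x

def f2(x):                  # dynamics after the event (arbitrary law)
    return -0.05 * x

def simulate_history(x0=100.0, dt=0.1, t_end=500.0):
    t_a = rng.exponential(scale=50.0)   # (ii) stimulus activation time
    tau = rng.exponential(scale=20.0)   # (iii) random delay after activation
    t_event = t_a + tau                 # (iv) event = transition of dynamics
    x, t = x0, 0.0
    while t < t_end:
        f = f1 if t < t_event else f2   # (i)/(iv) current dynamics law
        x += f(x) * dt                  # forward-Euler step
        t += dt
    return x

print(simulate_history())
```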

Assuming that the distribution functions of stimulus activation and time delay are known for each dynamics separately, it must be noted that these functions depend on the process variables, and this does not allow us to determine, in advance, when the stimulus will be activated and how long the delay will take. The time moment $t_a$ of stimulus $g$'s activation is generated as follows:
(1) A pseudorandom number $u$ is generated according to the uniform distribution on the interval $[0, 1]$.
(2) According to the considered dynamics $i$ and the related process variables $\bar{x}(t)$, the corresponding distribution function $F_a(t)$ is determined for any time moment $t$.
(3) The time moment of stimulus activation is determined from the equation $F_a(t_a) = u$.
In order to estimate the process variables and the distribution function of stimulus activation for any time moment, many computations are needed. Thus, in order to save resources, the calculations are made within a discrete time framework. $t_a$ is then determined by performing the following actions (a code sketch is given after this list):
(1) A pseudorandom number $u$ is generated according to the uniform distribution on the interval $[0, 1]$.
(2) The time-related probability is initially defined as $P = 0$.
(3) The discrete time $t$ is considered starting from $t_0$, advancing step by step by $\Delta t$.
(4) During each time step the following computations are performed:
(i) The process variables $\bar{x}(t)$ are estimated.
(ii) According to the considered dynamics $i$ and the values of process variables $\bar{x}(t)$, the probability $\Delta P$ that stimulus $g$ will be activated or deactivated during the time interval $(t, t + \Delta t]$ is computed according to SDTPD.
(iii) Consider $P := P + \Delta P$.
(iv) If $P \geq u$, then the time moment of stimulus activation is $t_a = t$.
The described algorithm allows generating the time moments $t_a$ without direct (analytical) consideration of the changes in process variables $\bar{x}$ and distribution function $F_a$. This algorithm can also be used for the generation of the random time delay after stimulus activation.
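A sketch of the discrete-time generation of $t_a$ follows; the dynamics law, the threshold, and the activation intensity are hypothetical stand-ins for the SDTPD quantities, and conditioning the per-step increment on no earlier activation is a modelling choice of this sketch.

```python
import numpy as np

rng = np.random.default_rng(7)

def dynamics(x, dt):
    return x + 0.5 * dt                  # x rises linearly (placeholder law)

def activation_prob(x, dt):
    # probability of activation in (t, t + dt], growing with x above a threshold
    rate = 0.0 if x < 60.0 else 0.02 * (x - 60.0)
    return 1.0 - np.exp(-rate * dt)

def sample_activation_time(x0=0.0, dt=0.1, t_end=1000.0):
    u = rng.uniform()                            # step (1)
    P, t, x = 0.0, 0.0, x0                       # step (2)
    while t < t_end:                             # step (3)
        x = dynamics(x, dt)                      # step (4i)
        P += (1.0 - P) * activation_prob(x, dt)  # steps (4ii)-(4iii)
        if P >= u:                               # step (4iv)
            return t
        t += dt
    return None                                  # stimulus never activated

print(sample_activation_time())
```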

Based on this approach and in relation to the theory of stimulated dynamics, a simulation tool was developed. In this tool, the building of the system model with stimulated dynamics is based on three groups of essential characteristics:
(i) Timing: the independent factor, which increases gradually and is related to every event. Time influences the values of process variables, stimulus activations, delays, system events, and so forth.
(ii) Deterministic characteristics: process variables and dynamics, which indicate the system state at a certain time moment.
(iii) Stochastic characteristics: probability density functions of stimulus activations and of delays before events.
The relationships between these characteristics are shown in the schematic of Figure 3. With reference to this schematic, three different modules are distinguished in the simulation tool (a structural code sketch is given after this list):
(i) Timing and events' control module.
(ii) Deterministic process module.
(iii) Stochastic process module.
In general, the timing control module simulates the system timing. The deterministic process module estimates the process variables evolution and, according to the considered dynamics, defines the system state. The stochastic process module estimates the probabilities of stimulus activations and of the dependent delays defining the following events.
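A structural sketch of these three modules (with hypothetical class and method names, not the actual tool) could look as follows.

```python
import numpy as np

class DeterministicModule:
    """Estimates process variable evolution under the current dynamics."""
    def __init__(self, laws):
        self.laws = laws                         # {dynamics_id: f_i}
    def step(self, dyn, x, dt):
        return x + self.laws[dyn](x) * dt

class StochasticModule:
    """Decides stimulus activations and delays from their probabilities."""
    def __init__(self, rng):
        self.rng = rng
    def event_occurs(self, prob):
        return self.rng.uniform() < prob

class TimingController:
    """Advances time and dispatches events between the two modules."""
    def __init__(self, det, sto, dt):
        self.det, self.sto, self.dt = det, sto, dt
    def run(self, dyn, x, t_end, activation_prob):
        t = 0.0
        while t < t_end:
            x = self.det.step(dyn, x, self.dt)
            if self.sto.event_occurs(activation_prob(x, self.dt)):
                dyn = 1 - dyn                    # toggle between two dynamics
            t += self.dt
        return dyn, x

det = DeterministicModule({0: lambda x: -0.01 * x, 1: lambda x: -0.05 * x})
ctrl = TimingController(det, StochasticModule(np.random.default_rng(0)), dt=0.1)
print(ctrl.run(dyn=0, x=100.0, t_end=200.0, activation_prob=lambda x, dt: 0.001))
```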

TPD enables analyzing systems where the transition between different dynamics is initiated immediately when some critical conditions occur (reaching a threshold). In addition, the application of TSD enables analyzing systems where the delays before events are stochastically determined, as in real systems with uncertain time between the stimulus activation and the start of a new dynamics.

3. Uncertainty Analysis Approach

The approach suggested for uncertainty and sensitivity analysis is based on well-established concepts and tools of probability calculus and statistics. In this paper, it is illustrated by an application of SUSA (Software System for Uncertainty and Sensitivity Analyses) developed by GRS [20]. The uncertainty analysis, in addition to uncertainty estimation, includes the identification of the potentially important contributors to the uncertainty of the model output and the quantification of the respective state of knowledge by subjective probability distributions [21].

In general, the probabilistic model expresses the aleatory (stochastic) uncertainties of the physical process. In addition, for each uncertain input of the model, its probability distribution also expresses how well the input is known (i.e., epistemic uncertainty). Sensitivity analysis, as a main part of the uncertainty analysis, can be used to identify and classify the uncertain parameters that mainly contribute to the variation of results, and to see the uncertain inputs' combined influence on the output due to uncertainty propagation. For this, the quantification of margins and uncertainties is essential [12].

3.1. Uncertainty Estimation and Analysis

The quantitative uncertainty estimation can be expressed using quantiles or percentiles (e.g., 5% and 95%) of the probability distribution. Knowing the distribution law and its parameters, it is possible to estimate the mean, standard deviation, median, quantiles, and other point estimates as well as confidence intervals. In practice, quantiles of the output can be estimated using Monte Carlo simulations (MCS) with a specified number of model runs after input sampling.
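As a minimal illustration of such Monte Carlo quantile estimation (the model and input distributions are placeholders):

```python
import numpy as np

rng = np.random.default_rng(1)

def model(a, b):
    return a * np.exp(b)                 # hypothetical model output

a = rng.normal(1.0, 0.1, size=1000)      # sampled uncertain inputs
b = rng.uniform(0.0, 0.5, size=1000)
y = model(a, b)
print(np.percentile(y, [5, 50, 95]))     # 5%, median, and 95% estimates
print(y.mean(), y.std())                 # point estimates
```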

In addition, the impact of possible sampling error on the output can be considered and related to the number of runs. This can be done by computing a statistical tolerance limit (or two-sided limits treated as an interval). This limit (or interval) separates at least an $\alpha$ part of all possible output values with at least a probability $\beta$ as confidence level. In other words, this means that, with probability $\beta$, an $\alpha$ part of all possible output values will be separated by the specified statistical tolerance limit (or will lie in the considered statistical tolerance interval).

According to the classical statistical approach, the confidence statement expresses the possible influence of the fact that only a limited number of model runs have been performed. For example, according to Wilks' formula [21], 93 runs are sufficient to have a (0.95, 0.95) statistical tolerance interval (upper and lower limits). The required number of runs $n$ for a one-sided tolerance limit and, correspondingly, for a tolerance interval can be expressed as follows:
$$1 - \alpha^n \geq \beta, \qquad 1 - \alpha^n - n(1 - \alpha)\alpha^{n - 1} \geq \beta.$$
The minimum number of model runs needed is independent of the number of uncertain quantities taken into account and depends only on the two quantities $\alpha$ and $\beta$ described above.
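The minimum $n$ satisfying these criteria can be found by direct search; the sketch below reproduces the familiar values of 59 runs (one-sided limit) and 93 runs (two-sided interval) for $\alpha = \beta = 0.95$.

```python
def wilks_one_sided(alpha=0.95, beta=0.95):
    n = 1
    while 1.0 - alpha**n < beta:         # coverage alpha, confidence beta
        n += 1
    return n

def wilks_two_sided(alpha=0.95, beta=0.95):
    n = 2
    while 1.0 - alpha**n - n * (1.0 - alpha) * alpha**(n - 1) < beta:
        n += 1
    return n

print(wilks_one_sided())                 # 59
print(wilks_two_sided())                 # 93
```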

3.2. Uncertain Output Sensitivity Analysis

In general, outputs from models are subject to uncertainty. Usually, uncertainty estimation can provide a statement about the separate or combined influence of potentially important uncertainty (aleatory and epistemic) sources on the model output. However, it is often more important to analyze uncertainty by providing quantitative sensitivity statements that rank the uncertain inputs with respect to their contribution to the model output uncertainty. On the other hand, it is important to note that uncertainty in the model affects the ranking results [22]. In the frame of uncertainty analysis, the purpose of the considered sensitivity analysis is
(i) to analyze the sensitivity of the uncertain output to the uncertain inputs,
(ii) to identify which inputs mostly influence the model output.
In general, sensitivity analysis is used not only to analyze uncertainty but also to examine which epistemic uncertainty sources are better to control.

In order to rank uncertain parameters according to their contribution to model output uncertainty, standardized regression coefficients (SRCs) [23] can be chosen from the many other sensitivity measures available. They are capable of indicating the direction of the contribution (negative means inverse proportion). SRC is supposed to tell by how many standard deviations the model result will change if the uncertain input is changed by one standard deviation.

Additionally, the correlation ratios (CRs) [23] can be computed. The ordinary CR is the square root of the ratio of the variance of the conditional mean of the model output (conditioned on the uncertain input) to the total variance of the model output due to all uncertain inputs taken into account. It serves as a measure of how one uncertain model specification was quantified through a set of alternative specifications. The CR quantifies the degree of the relationship between inputs and output.

How well this is achieved in practice depends on the degree of linearity between the model output and the uncertain input. In case the number of uncertainties is large and the sample size is small, spurious correlations can play a nonnegligible role. The effect of spurious correlations on sensitivity measures may be investigated by comparing the estimates of SRCs and correlation coefficients [21]. Thus, in practical cases SRCs are often applied together with other sensitivity measures, like the Partial Correlation Coefficient (PCC) [24]. Correlation may provide a measure of the strength of a linear association between variables and results. For nonlinear but monotonic relationships between results and variables, measures that work well are based on rank transforms. The PCC provides a measure of variable importance that tends to exclude the effects of other variables [23]. The options described above and chosen for this illustration of the approach are part of the software system SUSA [25], which is applied in various accident cases.
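As an illustration of the SRC computation (not the SUSA implementation), SRCs can be estimated by least-squares regression on standardized inputs and output; the sampled model below is a placeholder.

```python
import numpy as np

rng = np.random.default_rng(3)

n = 200
X = rng.normal(size=(n, 3))                    # three uncertain inputs
y = 2.0 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.3, size=n)

Xs = (X - X.mean(axis=0)) / X.std(axis=0)      # standardize inputs
ys = (y - y.mean()) / y.std()                  # standardize output
src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)  # SRCs = coefficients of the
print(src)                                     # standardized regression
```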

3.3. Integrated Analysis Using Coupled Software

In order to perform the uncertainty analysis of a probabilistic model output, a double randomization MCS scheme can be applied. The dynamics simulation is proposed to be performed using different values of the random inputs defined in the Uncertainty Simulation Software, denoted as USS. The considered Dynamics Simulation System, here denoted as DSS, is used for the dynamics simulation in relation to probabilistic system failures, accidents, and/or consequences. The USS interacts with DSS and performs an additional statistical analysis; that is, outputs from DSS are transferred to USS, which then performs an integrated uncertainty and sensitivity analysis.

The major part of the integrated analysis is based on the coupling translator, which consists of a preprocessor and a postprocessor used for the data flow between the dynamics and the uncertainty simulation software. In general, four main steps are used to perform an integrated analysis (a conceptual sketch follows below):
(1) Develop a probabilistic model (e.g., TSD based).
(2) Sample model inputs (e.g., uncertain delays).
(3) Simulate the stochastic process (e.g., pressure).
(4) Analyze outputs (e.g., rupture frequency).
The integration starts when the DSS input file, which represents the model, is processed by the translator, using the preprocessor to provide USS with the information available in the DSS model.
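A conceptual sketch of the translator's pre- and postprocessing loop is given below; all file names, the placeholder syntax, and the DSS command line are hypothetical.

```python
import re
import subprocess

def preprocess(template_path, values, run_path):
    # replace @NAME@ placeholders in the DSS input template with sampled values
    text = open(template_path).read()
    for name, value in values.items():
        text = text.replace(f"@{name}@", repr(value))
    open(run_path, "w").write(text)

def postprocess(output_path):
    # assume the response appears as "rupture_frequency = <number>"
    m = re.search(r"rupture_frequency\s*=\s*([\d.eE+-]+)", open(output_path).read())
    return float(m.group(1))

def run_case(values):
    preprocess("dss_model.tpl", values, "dss_run.inp")
    subprocess.run(["dss", "dss_run.inp"], check=True)  # hypothetical solver call
    return postprocess("dss_run.out")
```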

Most of the time, while the translator is running, it modifies the identified random variables in the DSS input file according to the values determined by USS and retrieves the values of the response from the DSS output file for the subsequent uncertainty analysis by USS. At the end of the computations, USS may perform the uncertainty analysis. The scheme in Figure 4 graphically shows the interactions between the user, USS, DSS, and the translator.

The first step in using USS with a coupling of DSS is to set up the model uncertainties. This is done by selecting uncertain inputs, specifying distributions, and so forth. This part depends on the type of uncertainty analysis performed. In general, uncertain inputs and specific distributions with their parameters are specified. The list of considered inputs depends on quantities from DSS, which are available in input files and are proposed to be treated either as stochastic or epistemic. The distribution functions and distribution parameters for each uncertain input are specified. Then, prior to each model call (i.e., DSS activation) corresponding values from the DSS input file are modified by the translator according to the values selected by USS. In the final stage, the failure criterion and responses as considered output quantities (e.g., pressure) are analyzed in order to estimate reliability parameters (e.g., failure rate or frequency).

4. Test Case Simulation and Analysis

4.1. Test Case Specification

The extension of the initial test case is specified in relation to a benchmark exercise [26] defined in the framework of SARNET (Network of Excellence for a Sustainable Integration of European Research on Severe Accident Phenomenology) [27], dedicated to severe accident analysis and PSA methodologies development. The main task of the benchmark was to quantify the risk of containment rupture (estimate of containment failure probability) due to hydrogen combustion. As a "basic transient" the following simplified object/situation was considered: a French 900 MWe Pressurized Water Reactor (PWR) with 3 loops and Passive Autocatalytic Recombiners (PAR), operating at nominal power before the initiating event; a Loss of Coolant Accident (LOCA) after a break on a cold leg of the Reactor Coolant System (RCS); failure of all water injection systems and of the spray system. This issue was considered most relevant for level-2 PSA.

The aim of the benchmark was to provide a simple example that demonstrates the limitations of the classical level-2 PSA methods and to assess the expected improvements that could be obtained by dynamic reliability methodologies against the classical PSA approach. This was done in order to test, validate, and compare the different methodologies mentioned above and within the framework of the SARNET [27].

The main part of the benchmark [26] has been divided into two steps with progressive complexity in both the probabilistic and the dynamics aspects of the sequences: Step-1, a first implementation of the problem with a dynamic or classical method, with a simple analytical model for the physics; Step-2, a second implementation of the problem with complements in the analytical models for epistemic uncertainties.

The Steps 1-2 comparison report has already been issued; it indicates that it is essential to know the details of the interpretation of the benchmark specification made by the respective groups. Possible further activities were also considered, like the implementation of ASTEC (Accident Source Term Evaluation Code) modules instead of the simple analytic model. Two main conclusions regarding classical PSA treatment of the various possible chronological events have been drawn from the benchmark:
(1) "the treatment of all chronological issue is difficult,"
(2) "the treatment of multiple combustions is impossible."
Consequently, the summary of this is that it is not possible to solve the benchmark with a classical event tree. The conclusion is that "the only way for a classical approach is a conservative approach."

The study and solution of the benchmark might as well help to gain insight into the differences between the methodological approaches, their advantages, and their limitations, both for the theoretical and the practical issues involved in the development of the different approaches. By doing this, some model deficiencies could also be detected and corrected for future considerations.

In this context, and also as part of the SARNET goals, some activities were extended: for instance, the so-called Step-2-plus was introduced, and Step-3 was defined and additionally partly performed in view of a further benchmark.

The present test case is the result of work done towards two main objectives: to discuss and demonstrate the benchmark exercise and the details of its design, and to acquire a deeper knowledge of SDTPD, Monte Carlo simulation techniques, and uncertainty assessment methods and their particular application to the further benchmark, investigating the potential of these techniques as a complement to the dynamic reliability techniques. This test case is therefore also divided into two main parts.

The first part, presented in this section, is dedicated to recalling the description, specifications, and assumptions of the benchmark, as well as the solutions presented by the partner organizations which used probabilistic dynamics. In particular, it is focused on the solution developed by the Lithuanian Energy Institute (LEI) and the Université libre de Bruxelles (ULB) by means of the SDTPD approach. Finally, some assumptions are revised and reinterpreted, and specifications are extended for the further benchmark.

The other parts of this section concern the performance of the test case, mainly Step-2 and Step-2-plus, where uncertainty and sensitivity analysis has been performed to study the influence of the uncertainties on the final result. The previous section introduced a brief description of the approach used for uncertainty and sensitivity analysis.

The test case as a benchmark exercise was divided into two consecutive steps, which are related to each other. Step-1 is the basic part, used for basic probabilistic model development and point estimate calculations. Step-2 contains more details and extensions regarding random events and uncertain inputs; this makes the modelling and the point estimate more realistic.

A single result as a point estimate can be quite precise due to the large number of histories in a single simulation; however, for the practical analysis of physical behaviour and its application for safety purposes, the investigation of the variation of the results themselves is quite important. Such an investigation is based on sensitivity and uncertainty analysis of sampled results and constitutes a kind of Step-2-plus assessment case. This research can show which uncertain parameters are the most important in order to change the behaviour of aleatory phenomena and decrease the uncertainty of the results.

The input parameters can also be related to the timing; that is, the set of input parameters can include uncertain delays and randomly distributed time moments of event occurrence. However, this was not the case in Step-2 of the benchmark exercise. For such an investigation, the extended benchmark exercise could be used.

Further, in the next step (Step-3) or in a new benchmark, the focus could be more on epistemic uncertainties and timing parameters (e.g., random delays), where the performance of an additional analysis of modelling uncertainty could be a good demonstration of the importance of timing and modelling for the estimation of the safety-related result (in the similar benchmark case, the probability of containment rupture).

The practical test case used for the investigation, as mentioned previously, concerns the containment failure risk assessment due to hydrogen combustion for a French 900 MWe PWR (3 loops with Passive Autocatalytic Recombiners). The specification is based on the results of two transients (transitional processes) computed by the Institut de Radioprotection et de Sûreté Nucléaire (IRSN) with the ASTEC software package (Accident Source Term Evaluation Code), in which there is no water injection before core dewatering, the first one without spray system activation and the second one with spray system activation. The ASTEC calculations also provided the timing and basic information on:
(i) the kinetics of the core degradation process;
(ii) the kinetics of hydrogen and vapour releases in the containment;
(iii) the delay before vessel rupture;
(iv) the pressure evolution in the containment (and atmosphere composition).
As mentioned, the different phases of the test case have been separated into several steps. Step-1 concerns the basic example, changing only the failure criterion. Step-2 contains some more details and extensions regarding some stochastic events in order to express some sources of uncertainty in the initial data. Step-3 could be related to the reflection of epistemic uncertainties, including uncertain modelling and different timing effects.

In general, after the initial Loss of Coolant Accident (LOCA), three safety systems are taken into account: the Passive Autocatalytic Recombiners, the water injection system (i.e., the so-called Safety Injection System), and the spray system (i.e., the Containment Heat Removal System). The benchmark exercise introduces a stochastic character in the three main processes changing the dynamics: water injection, spray system, and H2 combustion, but not in the recombiners, as they act as passive systems, being always available during the transient. The general schema of the processes considered during the accident is presented in Figure 5.

Both the stochastic occurrence of the processes (with the time scale and time variables presented in Figure 6) and their physical effects for the considered test case are specified (arbitrarily assumed) as follows.

Safety (Water) Injection System (SIS)

Time Activation/Actuation. The SIS starts before the vessel rupture, which is related to the end of the core degradation consideration, when the hydrogen mass released in the containment can be maximal. No failure during operation of the system is considered; however, the probability of having the system available is considered. If the injection occurs before the total core uncovering, the situation is supposed to be safe (little hydrogen is produced and the vessel rupture is avoided). The probability of this situation is assumed to be 0.5. Between the total core uncovering and the vessel rupture, the probability that water injection is available is assumed to be 0.5 as a conditional probability; thus, the total probability is equal to 0.25. In addition, the time when it does start follows a uniform PDF. Finally, it is assumed that all the injected water will be entirely converted into hydrogen.

Effects. The only direct assumed effect is an increase in the hydrogen flow rate coming from the primary system, thus, an increase in the number of moles of hydrogen.

Containment Heat Removal (Spray) System (CHRS)

Time Activation/Actuation. The probability that the CHRS can be activated after the core uncovering and before the vessel rupture is assumed to be equal to 0.5. If it can be activated, the time to start is uniformly distributed.

Effects. The actuation of this system produces condensation in the containment. Therefore, the direct effects of this system are the decrease of the number of moles of steam and a decrease of the temperature and the pressure. The spray system cannot be stopped. Once it starts, it continues to work, so those effects have to be considered in the entire transient after the actuation.

Hydrogen Combustion

Time Activation/Actuation. Hydrogen combustion can occur only if the hydrogen concentration inside the containment is sufficient. Two regions are defined for initiating the combustion, the first due to the flammability conditions of the gas mixture and the second due to the ignition capability of the recombiners. A variable delay is defined for each region. The delay before combustion depends on the hydrogen concentration inside the containment and is supposed to be shorter as the molar fraction of hydrogen gets higher.

Effects. Hydrogen combustion is considered as a shock event, only causing sudden variations of some of the state variables. Therefore, combustion is assumed to happen instantaneously with a fraction of burnt hydrogen C given by an assumed uniform distribution between 0.05% and 100%.

Passive Autocatalytic Recombiners (PAR)

Time Activation/Actuation. They are supposed to be working at the beginning of the transient and never stop throughout the transient, so no random time is associated with this system of recombiners.

Effects. They have a similar effect to the combustion, that is, a decrease of the number of moles of hydrogen and oxygen.

Containment Failure Criteria. The containment failure occurs when there is overpressurization due to hydrogen combustion. In order to simplify the analysis of results, two different criteria for failure due to overpressurization were specified. The initial criterion supposes that the containment fails with probability equal to 1 if the pressure inside the containment after combustion exceeds a given value (containment pressure limit). The reference case for the pressure limit was fixed at 0.5 MPa. However, finally, in Step-1 a more realistic assumption for containment failure has been considered, where the containment rupture probability is given by a function of the amplitude of the pressure peak.
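The stochastic part of this specification can be sampled directly, as in the sketch below; the total core uncovering time and the sampling windows are an interpretation of the specification (the actual values belong to Table 2), and only the initial deterministic failure criterion is shown.

```python
import numpy as np

rng = np.random.default_rng(11)

T_UNCOVER, T_RUPTURE = 6_000.0, 14_220.0   # s; T_UNCOVER is a placeholder

def sample_systems():
    # SIS: p = 0.5 before total core uncovering (safe case),
    #      p = 0.25 between uncovering and rupture, p = 0.25 never starts
    u = rng.uniform()
    if u < 0.5:
        t_sis = rng.uniform(0.0, T_UNCOVER)
    elif u < 0.75:
        t_sis = rng.uniform(T_UNCOVER, T_RUPTURE)
    else:
        t_sis = None
    # CHRS: available with p = 0.5; start time uniform in its window
    t_chrs = rng.uniform(T_UNCOVER, T_RUPTURE) if rng.uniform() < 0.5 else None
    return t_sis, t_chrs

def initial_failure(p_peak, p_limit=0.5e6):
    return p_peak > p_limit                 # fails with probability 1 above limit
```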

Scenario Definition. Five scenarios are determined by variation of different events related to the activation of considered safety systems. The occurrence of scenarios is based on the activation of the Safety Injection System and the Containment Heat Removal System (see Table 1).

The probability of each scenario occurrence is related to the activation of the safety systems described above. However, it is not possible to calculate analytically precise theoretical estimates of the scenario occurrence probabilities: the occurrences of the safety systems have discrete distributions and, besides, scenarios C and E can end after the first event due to containment rupture.

The specification of the test case is supposed to be quite easy to implement in order to limit, as far as possible, the effort of comparison with other calculations. For that reason, the proposed specifications and assumptions are rather analytical, to avoid the direct use of complex severe accident codes and, at least partly, to allow a simplified analytical assessment taking into account the timing of various events.

4.2. Accident Dynamics and Timing

The list of "physical" information specified according to the benchmark [26] is as follows:
(i) A representative ASTEC transient without spray and reflooding:
(a) start and finish: beginning of core degradation, vessel rupture;
(b) dynamic effect: hydrogen mass released in containment;
(c) process: containment pressure as a function of time.
(ii) A simple law that allows predicting the pressure evolution as a function of time after the spray system starts.
(iii) A simple law that allows predicting H2 release after core reflooding.
(iv) A simple law that allows predicting recombiner efficiency as a function of H2 and H2O concentrations.
(v) Criteria for hydrogen combustion, based on the Shapiro diagram and the effect of ignition by recombiners.
(vi) The probability of containment failure as a function of the pressure peak.
(vii) The pressure peak in containment due to combustion, evaluated by the so-called PAICC model (provided by IRSN).
In addition, the following assumptions were used: the atmosphere ignition within a short delay is very probable if the recombiner ignition criteria are achieved for average H2 concentration, and local (partial) and multiple ignitions have been taken into account.

The scenarios which reflect the accident dynamics and more realistic conditions for safety system actuation are actually not as simple as an ordinary sequence of events. The specification of the hypothetical transient of the initial accident is related to the following events:
(1) The initiating event of reactor core dewatering (when core uncovering starts).
(2) The core dewatering and vessel rupture.
This transient corresponds to the following situation:
(1) A Loss of Coolant Accident (LOCA) with a break on the cold leg of the RCS occurs.
(2) The Safety Injection System (SIS, or water injection system) and the Containment Heat Removal System (CHRS, or spray system) are not available until the beginning of core dewatering.
(3) The steam generators are available but not used by the operators.
(4) No water injection occurs before core dewatering.
(5) The reactor is operating at nominal power before the initiating event.
(6) The calculated core dewatering occurs at 4080 s (1 h 08 min). The vessel rupture occurs at 14220 s (3 h 57 min) if no action is undertaken.
If no action is taken, the time scale of this transient is illustrated in Figure 6 and the time-related variables and parameters are described in Table 2.

During the core dewatering phase, the situation is supposed to be as follows:
(1) A water injection means is available (with an average flow rate = 0.833 kg/s) and can be used by the operators.
(2) The spray system (CHRS) is available and can be used by the operators.
(3) Water injection after the beginning of clad oxidation causes an increase of the hydrogen flow rate towards the containment.
(4) Hydrogen combustions can occur if the containment gas mixture is flammable; recombiners, because of their high temperature, can initiate a combustion; such combustions can be total (all the hydrogen in the containment is burnt) or not.

4.3. Event Tree and Simplified Modelling

In relation to the dynamic effects in the test case, it is possible to construct a simple event tree (see Figure 7) and to relate all sequences to the previously described scenarios (see Table 1). Note that this event tree reflects the time-dependent sequence of the two safety systems, as the order of these systems is also considered in the scenario definition.

As the test case is relatively simple and the dynamic effects are limited, the initial classical PSA based solution will allow a comparison with the results obtained by SDTPD (see the next sections) and a discussion about the advantages and disadvantages of both approaches.

Once the event tree is constructed, it is quantified. For each branch, its probability is calculated and, then, the conditional rupture probability (given the considered scenario) is estimated. At the end, the total probability of the containment rupture is calculated by combining these results. Note also that the event tree and the values calculated and shown below are related to Step-1 of the test case and to the rupture limit defined as a pressure equal to 5 bar (0.5 MPa), without any stochastic variation.

According to the initial assumptions, the probability that water is injected before the total core uncovering is equal to 0.5, and the probability that this system starts between the total core uncovering and the end of the transient is equal to 0.25. Thus, the probability that water is injected is $P_{SIS} = 0.5 + 0.25 = 0.75$. In addition, it was assumed that the probability to start CHRS is $P_{CHRS} = 0.5$.

During scenario A nothing happens, that is, there is no water injection and no spray system actuation. Thus, the probability of this type of scenario can be estimated as follows:
$$P_A = (1 - P_{SIS}) \cdot (1 - P_{CHRS}) = 0.25 \times 0.5 = 0.125,$$
where $1 - P_{SIS}$ is the probability that no water is injected, that is, SIS does not start, and $1 - P_{CHRS}$ is the probability that CHRS does not start.

During scenario B there is no water injection, but CHRS starts. Thus, the probability of this type of scenario can be expressed and estimated as follows:
$$P_B = (1 - P_{SIS}) \cdot P_{CHRS} = 0.25 \times 0.5 = 0.125.$$
During scenario C, CHRS (at time moment $t_{CHRS}$) starts before water is injected (at time moment $t_{SIS}$). According to the initial assumptions and taking into account the time-related variables and parameters described in Table 2, this scenario reflects one of the following three situations, with the corresponding probabilities estimated:
(1) CHRS and SIS start before the total core uncovering occurs, with probability $P_{C1}$.
(2) CHRS starts before and SIS starts after this time moment, with probability $P_{C2}$.
(3) CHRS and SIS start after the total core uncovering occurs, with probability $P_{C3}$.
Thus, the probability of scenario C, including all three situations, can finally be estimated as $P_C = P_{C1} + P_{C2} + P_{C3}$.
During scenario D, initially SIS starts and CHRS does not start. According to the initial assumptions and taking into account the time-related variables and parameters described in Table 2, this scenario reflects one of the following two situations, with the corresponding probabilities estimated:
(1) SIS starts before the total core uncovering, with probability $P_{D1}$.
(2) SIS starts after the total core uncovering and CHRS does not start, with probability $P_{D2}$.
Thus, the probability of scenario D, including both situations, can finally be estimated as $P_D = P_{D1} + P_{D2}$.
During scenario E, SIS (at time moment $t_{SIS}$) starts before the start of CHRS (at time moment $t_{CHRS}$). So, water has to be injected after the total core uncovering. Indeed, if water is injected before it, the scenario is stopped and the spray system cannot start (this situation corresponds to type D). Thus, the probability $P_E$ of this type of scenario can be expressed and estimated accordingly. On the basis of the analytically estimated scenario probabilities and taking into account the possible transitions from branch E to branch D, the results can finally be presented in Table 3.

Due to the simplifying assumptions in the analytical modelling, the estimate of each scenario probability does not fully correspond to the possible reality, because the calculations do not take into account the occurrence of combustions leading to rupture between the first event and the second one. For example, if SIS is the first system to start, there are three different situations after this start:
(1) Combustion occurs and leads to a rupture (in spite of other conditions).
(2) Combustion occurs but does not lead to a rupture, and then CHRS starts.
(3) Combustion does not occur, as CHRS starts before any combustion.
The first situation may correspond to scenario D, while the second and third situations may correspond to scenario E.

In general, the best-estimate probabilities of the various scenarios, which in some cases lead to containment rupture, depend in simulation on timing and stochastic parameters. The probabilities of combustion and containment rupture cannot be expressed analytically but may be estimated by performing simulations, during which the probability of each scenario is also estimated. The difference between the analytically estimated probability and the simulation-based estimates depends on the precision with which the time-dependent reality and the uncertainty are reflected in the simulations.

4.4. Simulation of Time-Dependent and Stochastic Rupture

The model for the calculations was prepared using the SDTPD approach and various simulation and analysis software. The physical model was based on various deterministic laws and process variables (like hydrogen flow rate; number of moles of hydrogen, steam, and oxygen; temperature and pressure inside the containment; etc.) and combustion phenomena modelling, all combined with the simulation of various stimuli and corresponding delays, dynamics, and random events related to the Passive Autocatalytic Recombiners (PAR), the Safety Injection System (SIS), and the Containment Heat Removal System (CHRS) [26]. The considered containment physical phenomena were mostly related to the containment gas phase and dynamic hydrogen combustions. The pressure and temperature peaks for certain conditions were calculated by the so-called PAICC model (provided by IRSN). The developed time-dependent hybrid (continuous-discrete) physical model was used to perform 100,000 runs for each simulation case.

The final task of simulations was to estimate the containment rupture probability and uncertainty. Different scenarios were separated and analyzed depending on the possible failures of safety systems and related consequences. Using the developed probabilistic model, many histories were simulated with respect to the timing and grouped to the sequences of dynamics as grouped paths of processes [16]. Later on, the sequences of dynamics were grouped depending on the scenarios considered. For each scenario, the estimate of the conditional rupture probability was calculated.

In general, the result of a probabilistic simulation is affected by the internal stochastic variables of the probabilistic model with deterministically defined dynamics. However, during the simulation, the result also depends on the values of uncertain inputs (see Table 4). For instance, the following assumptions and coefficients A and B were used for the calculation of the flow rate of recombined hydrogen (and oxygen). The mass flow rate (g/s) of hydrogen recombined by the PAR system was specified by a law of the form m_H2 = (A·P + B)·[x_H2], with [x_H2] = x_H2/100 and x_H2 the hydrogen concentration (in mole per cent, obtained from the number of moles). For Step-1: A = 3.0 g/s/bar and B = 3.7 g/s; for Step-2, A and B have an uncertainty band between 0.5 and 1.5 times the above reference values (uniform distribution). The mass flow rate (g/s) of recombined oxygen is eight times the mass flow rate of hydrogen, since the recombination 2H2 + O2 consumes 32 g of oxygen per 4 g of hydrogen.
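A small sketch of this law with the Step-2 sampling of A and B follows. The exact form of the recombination law as written above is itself a reconstruction, so the sketch is an assumption-laden illustration rather than the benchmark implementation:

```python
import random

# Sketch of the PAR recombination law with the Step-2 uncertainty band:
# A and B are sampled uniformly in [0.5, 1.5] times their reference values.
A_REF, B_REF = 3.0, 3.7        # g/s/bar and g/s (Step-1 reference values)

def sample_coefficients(rng: random.Random):
    a = A_REF * rng.uniform(0.5, 1.5)
    b = B_REF * rng.uniform(0.5, 1.5)
    return a, b

def h2_recombination_rate(p_bar: float, x_h2_percent: float, a: float, b: float) -> float:
    """Hydrogen mass flow rate (g/s) recombined by the PARs (assumed law form)."""
    return (a * p_bar + b) * x_h2_percent / 100.0

def o2_recombination_rate(h2_rate: float) -> float:
    # 2 H2 + O2 -> 2 H2O: 32 g of O2 per 4 g of H2, hence the factor 8.
    return 8.0 * h2_rate

rng = random.Random(42)
a, b = sample_coefficients(rng)
q_h2 = h2_recombination_rate(p_bar=2.0, x_h2_percent=6.0, a=a, b=b)
print(q_h2, o2_recombination_rate(q_h2))
```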

The difference in the results, the impact of the uncertain parameters on the results, and the importance of the events that affect dynamics and timing in the process variables space (e.g., the automatic activation of CHRS), in comparison to the other model parameters, can be estimated only by performing sensitivity and uncertainty analysis on an additionally calculated sample of results.

The result of one probabilistic simulation with many runs (simulated histories) is an estimate of the frequency of the considered final event (i.e., an estimate of the containment rupture probability) for a specific set of fixed input parameters. Using a result based on point input values, it is not possible to estimate the distribution, as there is only one result from one probabilistic simulation with the specified number of runs (e.g., 10,000 histories); however, it is possible to express the deviation of this result due to the probabilistic model itself.

With more runs, the result of the probabilistic simulation becomes more precise. However, with the same input and the same number of runs, the result of each simulation is expected to be slightly different. This variation does not depend on the possible uncertainty of the inputs. In this case, the distribution and the statistical characteristics (e.g., mean and standard deviation) of the result depend only on the probabilistic model and can be obtained from a statistical analysis of the result deviations over simulations with the same inputs and the same number of runs.
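For instance, assuming that ruptures in different histories occur independently with estimated probability $\hat p$, this model-only deviation can be approximated by the binomial standard error (an illustrative approximation, not the estimate actually used in the benchmark):

$$
\sigma_{\hat p} \approx \sqrt{\frac{\hat p\,(1-\hat p)}{N}},
\qquad \hat p = 0.02,\; N = 10^{4} \;\Rightarrow\; \sigma_{\hat p} \approx 0.0014 .
$$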

One run or history of the simulation, for a single set of model inputs, treats only one possible final event (e.g., containment rupture), which contributes to the result of the whole probabilistic simulation with its many histories. Using one run, it is not possible to estimate the failure frequency (i.e., the considered result) for one specific set of inputs.

Using many (e.g., 10,000) probabilistic simulations based on one run each, with a different set of inputs for each, an averaged influence of the uncertain inputs can be considered. However, in such a case, it is still not possible to analyze the sensitivity of the uncertain result with respect to the uncertainty of the inputs.

If the uncertainty and sensitivity of the result are to be considered, one must obtain a distribution of results, each based on many runs (e.g., 10,000 histories) per simulation (option B in Figure 8). For instance, 100 simulations with different sets of uncertain inputs will give a distribution of 100 results. In this case, the uncertainty of the result is related to the uncertain inputs, and it is possible to perform the sensitivity and uncertainty analysis (SUA).
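A minimal sketch of this two-loop scheme (option B) follows; the function run_history() is a hypothetical stand-in for one SDTPD history and its placeholder physics is an assumption:

```python
import random
import statistics

# Outer loop over uncertain inputs, inner loop of histories per simulation.
def run_history(inputs, rng):
    # Placeholder physics: rupture chance scaled by an uncertain input.
    return rng.random() < inputs["rupture_factor"] * 0.05

def one_simulation(inputs, n_histories, rng):
    ruptures = sum(run_history(inputs, rng) for _ in range(n_histories))
    return ruptures / n_histories          # rupture frequency estimate

rng = random.Random(7)
results = []
for _ in range(100):                                      # 100 simulations
    inputs = {"rupture_factor": rng.uniform(0.5, 1.5)}    # sampled uncertain input
    results.append(one_simulation(inputs, 10_000, rng))   # 10,000 histories each

print(statistics.mean(results), statistics.stdev(results))
```

The spread of the 100 results then reflects the input uncertainty, on top of the purely statistical deviation discussed above.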

The last option, whose results are presented below, means that a random start of CHRS at 2.4 MPa pressure was considered (i.e., the automatic start of spray injection works with probability equal to 0.5). The analysis itself is related to 100 probabilistic simulations with a different set of uncertain parameters for each simulation but the same parameters for all histories within each simulation case.

4.5. Analysis of Uncertain Rupture Frequency

As the sample size of model results in the test case was 100 (i.e., results after 100 simulations with 10,000 histories each), it was possible to evaluate the (0.95, 0.95) tolerance interval. With the upper limit of rupture frequency equal to 0.157 and the lower limit equal to 0.000, it can be predicted with probability 0.95 that 95 percent of model outputs (rupture frequencies) lie in the interval (0.000, 0.157). In addition, the median is equal to 0.013, so half of the model results did not exceed the value 0.013. The variation of the 100 results can be seen in Figure 9, with the frequencies of combustions and the frequencies of possibly following ruptures.
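That a sample of 100 results supports a two-sided (0.95, 0.95) tolerance interval bounded by the sample extremes can be checked with Wilks' formula; a short sketch (assuming the interval is indeed taken between the sample minimum and maximum):

```python
# Confidence that at least a fraction gamma of the output distribution lies
# between the minimum and maximum of n independent results (Wilks' formula):
# P(coverage >= gamma) = 1 - n * gamma**(n - 1) + (n - 1) * gamma**n
n, gamma = 100, 0.95
confidence = 1 - n * gamma ** (n - 1) + (n - 1) * gamma ** n
print(confidence)   # ~0.963 >= 0.95, so (min, max) = (0.000, 0.157) qualifies
```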

There is an observable correlation between the variation of the combustion frequency and that of the rupture frequency, and some simulation results show a very high combustion frequency. In some cases, where the combustion frequency is around 1, one or more combustions can be expected in nearly every history. In all simulations there are some histories with ruptures; however, in some histories there are no ruptures or even no combustions at all.

In addition, some sets of model parameters give quite large deviations in the rupture frequencies. The sample mean of the rupture frequency is 0.0228, the standard deviation is 0.0277, the 5% quantile is 0.0004, and the 95% quantile is 0.0727.

The statistical characteristics (mean and standard deviation) obtained using the results from different probabilistic simulations (taking correspondingly different computing times) are presented in Table 5. It can be noted that increasing the number of histories does not significantly decrease the standard deviation, which remains quite high due to the uncertain parameters.

Sensitivity analysis of the rupture frequency can help to identify which parameters (inputs) most influence the model results. In absolute values, the highest standardized regression coefficients (SRCs) belong to 3 of the 12 considered parameters (see Figure 10).

Since the coefficient of determination is not high enough (0.5511), it is not certain that the uncertainty of these parameters dominates the model results; nor does this reveal a linear relation between model parameters and results. The other correlation coefficients mostly confirm the sensitivity ranking and may fit better than the SRC rating; see, for example, the Partial Correlation Coefficients (PCC) in Figure 11. In this case the steam-related parameter is additionally emphasized.
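The SRC, coefficient of determination, and PCC measures used here can all be computed from the simulation sample with standard regression formulas. A minimal sketch with synthetic placeholder data (the real sample being the 100 results over the 12 uncertain inputs):

```python
import numpy as np

# Synthetic placeholders: 100 simulations x 12 uncertain inputs, and a
# result vector y standing in for the rupture frequencies.
rng = np.random.default_rng(0)
X = rng.uniform(size=(100, 12))
y = X @ rng.normal(size=12) * 0.01 + rng.normal(scale=0.01, size=100)

def srcs(X, y):
    # Standardized regression coefficients: linear regression on standardized data.
    Xs = (X - X.mean(0)) / X.std(0)
    ys = (y - y.mean()) / y.std()
    coef, *_ = np.linalg.lstsq(np.c_[np.ones(len(ys)), Xs], ys, rcond=None)
    return coef[1:]

def r_squared(X, y):
    # Coefficient of determination of the linear regression model.
    A = np.c_[np.ones(len(y)), X]
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return 1 - (y - A @ coef).var() / y.var()

def pcc(X, y, j):
    # Partial correlation of input j and y, with the other inputs removed.
    others = np.c_[np.ones(len(y)), np.delete(X, j, axis=1)]
    rx = X[:, j] - others @ np.linalg.lstsq(others, X[:, j], rcond=None)[0]
    ry = y - others @ np.linalg.lstsq(others, y, rcond=None)[0]
    return np.corrcoef(rx, ry)[0, 1]

print(srcs(X, y), r_squared(X, y), pcc(X, y, 0))
```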

According to the sensitivity analysis of the rupture frequency, the most important uncertain parameters are the number of steam moles, the pressure inside the containment, the factor of the containment rupture limit, and the automatic CHRS availability. The absolute values of the various sensitivity measures for these parameters dominate. Some parameters have negative sensitivity measures, which indicates an inverse dependence between these parameters and the result.

This also points out that a significant hydrogen concentration could be reached, leading to a flammable gas mixture, while the presence of high steam concentrations may nevertheless prevent hydrogen burn. If the atmosphere undergoes rapid condensation, for example, due to spray initiation, a potentially detonable mixture could form rapidly in case of a high concentration of hydrogen. Additionally, depressurization can also take place as a result of exchanges within the atmosphere, as in the case of energy and mass exchange with the containment sprays (e.g., in scenarios B, C, and E).

The probabilities of the various scenarios (for nominal and uncertain parameters) and the related sensitivity measures (obtained by analysing these scenario probabilities) are presented in Figure 12.

In the future, the probabilities of these scenarios (see Table 1) and the sensitivity measures could also be used for additional simulations in order to focus on the rarest sequences with quite severe consequences and to reduce the computation time of the simulations performed with SDTPD.

Having the probabilities of the scenarios, the weight of each scenario is known. It depends on the frequency of the events which correspond to the scenario and can be changed by increasing the values of the parameters with the highest positive sensitivity measures. By increasing the uncertainty of these parameters for the rarest scenario, it is possible to increase the chance that the simulation focuses on sequences with less probable conditions but possibly more severe consequences, as illustrated by the sketch below. The impact of the different parameters on the severity may also be reflected by other sensitivity measures, which express the parameters' effects on the rupture probability.
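A minimal sketch of this weighting idea, in the spirit of importance sampling: the rare branch is forced with a higher sampling probability, and each history carries a likelihood-ratio weight so that the global estimate remains unbiased. All probabilities below are hypothetical:

```python
import random

P_TRUE = 0.01     # true (rare) branch probability, e.g. CHRS unavailability
P_BIASED = 0.5    # forced sampling probability for the rare branch

def history(rng):
    forced_rare = rng.random() < P_BIASED
    # Likelihood-ratio weight: true probability over biased sampling probability.
    weight = (P_TRUE / P_BIASED) if forced_rare else ((1 - P_TRUE) / (1 - P_BIASED))
    # Hypothetical consequence model: rupture is more likely on the rare branch.
    rupture = rng.random() < (0.2 if forced_rare else 0.001)
    return weight * rupture

rng = random.Random(3)
n = 100_000
estimate = sum(history(rng) for _ in range(n)) / n
print(estimate)   # unbiased estimate of the (unconditional) rupture probability
```

With these hypothetical numbers the true rupture probability is 0.01·0.2 + 0.99·0.001 ≈ 0.003, and the weighted estimate converges to it while the rare branch is visited in about half of the histories.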

Similarly, it is possible to define more scenarios or to consider sequence grouping with respect to the sensitivity measures, and vice versa. Hence, it is possible to identify and force rare accidental sequences. After complete simulations of these rare sequences, the results obtained can be weighted and incorporated into the other results in order to obtain the global probability of the uncertain events. In addition, this idea can be extended by applying a time-dependent sensitivity analysis, which may guide the generation of events in the sequence. The computation time may then also be reduced, while keeping the possibility of identifying scenarios likely to be missed by basic Monte Carlo simulation due to their very low probability.

A time-dependent analysis was performed for the rupture frequency. While the scalar analysis considers only the final simulation results, the time-dependent analysis can reveal the uncertainty and sensitivity of the model at any moment of the modelled process; it thus complements the results given by the scalar analysis. Most of the rupture probability estimates over time lie under 0.1, but some particular sets of model parameters give higher values (e.g., the one related to the maximum rupture frequency).

Time-dependent uncertainty and sensitivity analysis can show how model parameters affect the result at each simulation moment (see Figures 13 and 14).
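A hedged sketch of how such time-dependent measures can be obtained: the scalar SRC computation is simply repeated at each output time over the matrix of time-resolved results (the data below are synthetic placeholders, not the benchmark sample):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(size=(100, 12))            # uncertain inputs (placeholder)
t = np.linspace(0.0, 10.0, 50)             # output time grid (placeholder)
# Placeholder time-resolved results: one row per simulation, one column per time.
Y = np.array([X @ (np.sin(ti) * rng.normal(size=12)) for ti in t]).T

def srcs(X, y):
    # Standardized regression coefficients at one output time.
    Xs = (X - X.mean(0)) / X.std(0)
    ys = (y - y.mean()) / (y.std() + 1e-12)
    coef, *_ = np.linalg.lstsq(np.c_[np.ones(len(ys)), Xs], ys, rcond=None)
    return coef[1:]

src_t = np.array([srcs(X, Y[:, k]) for k in range(Y.shape[1])])
print(src_t.shape)   # (50 time points, 12 parameters), as plotted over time
```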

The set of considered parameters can also be extended and related to the timing; that is, the set of input parameters can reflect uncertain delays and stochastically distributed time moments of event occurrences. However, this was not the case in Step-2 of the benchmark exercise. For such an investigation, an extended version of the benchmark exercise could be performed in the future. Additional uncertainty and sensitivity analysis could also be useful to show how important the uncertain timing is for the modelling and analysis of a dynamic system, in an attempt to make it more robust and reliable.

5. Discussions and Conclusions

Even complex system behaviour and the accidental transients involved in level-1 PSA are usually treated as being governed mainly by the failure or success of system functions or by well-prescribed human actions. The chances of occurrence of the various events are relatively well known, and the events are ordered in a chronological way. Consequently, the classical methods based on these assumptions and used in level-1 PSA, the event and fault trees, have been widely developed.

But the situation is very different in the case of the time-dependent hybrid systems and accidental transients involved in level-2 PSA. Indeed, these systems and accidents are mainly governed by physical phenomena and dynamic aspects. It is impossible to predict the chronology of the events, and each event cannot be reduced to a binary situation (failure or success); combined and functional aspects have to be considered. Classical methods developed for level-1 PSA do not cope with all the specificities of probabilistic dynamics in level-2 PSA, and there is currently no single method devoted and specially adapted to the related issues of severe accidents. Nevertheless, first attempts (related to memoryless Markov processes) have been made on the basis of event trees, trying to include some dynamics and uncertainty aspects. They constitute a first step but are not really a satisfactory solution (since each state depends only on the previous state). The most promising attempts are based on dynamic reliability methods, because they explicitly model the dynamic evolution of the system and the time-dependent aspects of this evolution, and they take into account the interactions between the system states, the values of the physical variables, and the human actions.

Recently, the Theory of Stimulated Dynamics (TSD) has been developed as an extension of dynamic reliability and a simplification of the SDTPD. It is a practical approach to the general theory based on dynamic reliability, supplemented by the notions of stimulus and delay. Each physical event is divided into two phases: the stimulus activation (as soon as all the conditions necessary for the event occurrence are met) and the delay (before the actual occurrence of the event), with a possible stimulus deactivation (cancellation) if the conditions are no longer met. This allows modelling in an accurate way the interactions between events, which is one of the advantages of the TSD in comparison to other methods based on Markov processes.

For example, let us consider the hydrogen issue one more time. Hydrogen is released into the containment atmosphere and the gas mixture becomes flammable. There is no heat source and, consequently, no combustion, so nothing happens. But the system continues evolving and the recombiners start (possibly with a delay). Consequently, they reduce the amount of hydrogen in the atmosphere and the hydrogen concentration decreases; the gas mixture is no longer flammable and no combustion is possible, even in case of a spark. This kind of situation, with interactions between events, is impossible to model adequately with classical techniques or methodologies.
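This interaction can be captured directly by the stimulus/delay mechanism. Below is a self-contained sketch with purely illustrative rates, thresholds, and delays (not benchmark values): the flammability stimulus is activated when the hydrogen concentration crosses a threshold, a random ignition delay then runs, and the stimulus is deactivated (the combustion is cancelled) if the recombiners bring the concentration back below the threshold before the delay elapses:

```python
import random

FLAMMABLE_AT = 4.0            # assumed flammability threshold (% H2)
rng = random.Random(11)

x_h2, t, dt = 0.0, 0.0, 1.0   # concentration (%), time (s), time step (s)
ignition_at = None            # end of the ignition delay while stimulus is active
for _ in range(2000):
    release = 0.01 if t < 1000 else 0.0        # assumed H2 source term (%/s)
    recombination = 0.02 if t > 600 else 0.0   # PARs start with a delay (%/s)
    x_h2 = max(0.0, x_h2 + (release - recombination) * dt)

    if x_h2 >= FLAMMABLE_AT and ignition_at is None:
        ignition_at = t + rng.expovariate(1 / 500)  # stimulus activated + delay
    elif x_h2 < FLAMMABLE_AT:
        ignition_at = None                          # stimulus deactivated

    if ignition_at is not None and t >= ignition_at:
        print(f"combustion at t={t:.0f}s, x_H2={x_h2:.2f}%")
        break
    t += dt
else:
    print("no combustion: recombiners deactivated the stimulus in time")
```

Depending on the sampled delay, the same transient either ends in a combustion or in a cancelled stimulus, which is exactly the event interaction described above.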

Considering the drawbacks of time-dependent reliability and uncertainty analysis, the concept of dynamic reliability and stimulated dynamics was applied for the analysis of hybrid systems with uncertain events.

As shown in the test case, the influence of the time step (taking correspondingly different computing time) on the results is very limited. Consequently, it is possible to reduce the computing time by using a larger time step or a smaller number of histories, without a meaningful change of the containment rupture frequency mean and standard deviation, which remains quite high due to the uncertain parameters.

The stimulated dynamics with the considered uncertainty and sensitivity analysis allows a detailed simulation and representation of the dynamic system uncertainty. Taking into account which parameters mostly influence the uncertainty of the system, such simulation can be used to search for rare, but possibly severe, conditions of a dynamic system.

The combinations of uncertain events and process values can lead to various conditions or situations, and the probabilistic results can change. The impact of these uncertainties does not only concern the final results but could also lead to totally new situations. Consequently, further investigations can be carried out.

The considered modelling techniques clearly allow for a systematic analysis of complex system reliability and uncertainty. The developed approach for the analysis of hybrid systems with uncertain events can be efficiently used to estimate the system failure probability or reliability and, at the same time, to analyze the uncertainty of this estimate.

Conflict of Interests

The author declares that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

The author would like to acknowledge the contacts started with the PSA2 group in the Severe Accident Research Network of Excellence (SARNET). The author also thanks Pierre-Etienne Labeau, Agnes Peeters, and Tadas Eimontas for their valuable input via numerous discussions and participation in the related analysis. This research was partly funded by a grant from the Fonds de la Recherche Scientifique in Belgium and by the Research Council of Lithuania.