Abstract

Since the 1988 amendment of the 10 CFR 50.46 rule, Westinghouse has been developing and applying realistic or best-estimate methods to perform LOCA safety analyses. A realistic analysis requires the execution of various realistic LOCA transient simulations where the effects of both model and input uncertainties are ranged and propagated throughout the transients. The outcome is typically a range of results with associated probabilities. The thermal-hydraulic code is the engine of the methodology, but a procedure is also developed to assess the code and determine its biases and uncertainties. In addition, inputs to the simulation are affected by uncertainty, and these uncertainties are incorporated into the process. Several approaches have been proposed and applied in the industry in the framework of best-estimate methods. Most of the implementations, including Westinghouse's, follow the Code Scaling, Applicability and Uncertainty (CSAU) methodology. The Westinghouse methodology is based on the use of the WCOBRA/TRAC thermal-hydraulic code. The paper starts with an overview of the regulations and their interpretation in the context of realistic analysis. The CSAU roadmap is reviewed in the context of its implementation in the Westinghouse evaluation model. An overview of the code (WCOBRA/TRAC) and methodology is provided. Finally, the recent evolution to nonparametric statistics in the current edition of the Westinghouse methodology is discussed. Sample results of a typical large break LOCA analysis for a PWR are provided.

1. Introduction

The 1988 amendment of the 10 CFR 50.46 rule allowed the use of realistic physical models to analyze the loss-of-coolant accident (LOCA). Best-estimate LOCA methods are now extensively employed within the nuclear industry. In particular, Westinghouse has been developing and applying realistic or best-estimate LOCA methods for almost two decades now, and a large amount of experience has been gained in this field.

The Westinghouse realistic (best-estimate) methodology is based on the Code Scaling, Applicability and Uncertainty (CSAU) methodology (Boyack et al. [1]). The methodology was approved by the NRC in 1996 after an extensive review. At that time, this was the first best-estimate (BE) LOCA evaluation model approved (Bajorek et al. [2], Young et al. [3]). In its original version, the Westinghouse BE methodology was applicable to 3- and 4-loop plants with safety injection into the cold leg. Subsequently, the applicability of the methodology was extended to 2-loop plants with upper plenum injection (UPI) in 1999 (Takeuchi et al. [4–6]) and to advanced passive plants such as the AP600 and AP1000 (Frepoli et al. [7]). Since its approval, Westinghouse has applied the methodology to more than 30 nuclear power plants (Muftuoglu et al. [8], Frepoli et al. [9–11]) both in the USA and abroad.

The Westinghouse LOCA methodology is based on the use of the WCOBRA/TRAC computer code. Sections 3 and 4 provide an overview of the code features, its assessment basis, and the identified sources of biases and uncertainties.

A key step in a best-estimate analysis is the assessment of uncertainties associated with physical models, data uncertainties, and plant initial and boundary condition variabilities. As uncertainties are incorporated into the process, a procedure is developed where the results from several calculations are collected to develop a statement where compliance with prescriptive rules or acceptance criteria is demonstrated. Based on the current 10 CFR 50.46 rule, an emergency core cooling system (ECCS) design is required to satisfy three main criteria: (1) the peak clad temperature (PCT) should be less than 2200 F, (2) the local maximum clad oxidation (LMO) should be less than 17%, and (3) the core-wide oxidation (CWO) should be less than 1%. More insights on the regulations and how industry satisfies those rules in the framework of realistic calculations are provided in Section 2.

The technique used to combine those uncertainties evolved over the years. In its original implementation, the Westinghouse methodology strictly followed CSAU, where the use of response surfaces was suggested as a practical means to combine the various uncertainty components. More recently, the methodology was modified toward nonparametric methods. The current methodology is called the Automated Statistical Treatment of Uncertainty Method (ASTRUM) (Nissley et al. [12], Frepoli and Oriani [13]). The main difference between the new and the old techniques is in the evaluation of the final uncertainty, Element III of CSAU. A comparison between the two techniques is discussed by Muftuoglu et al. [8]. A review of these techniques is given in Section 5, while sample results are provided in Section 6.

2. Historical Background and Review of Regulations

A large-break-LOCA event is categorized as a design-basis accident. The current safety regulations of the United States Nuclear Regulatory Commission (US NRC) are stipulated in 10 CFR Part 50, Section 50.46. Based on the 10 CFR 50.46 rule, an emergency core cooling system (ECCS) design is required to satisfy prescriptive criteria. The regulation identifies the following five criteria.

(1) Peak clad temperature (PCT) should be less than 2200 F.
(2) Local maximum oxidation (LMO) should be less than 17%.
(3) Core-wide oxidation (CWO) should be less than 1% (to limit the maximum amount of hydrogen generated).
(4) The core should maintain a coolable geometry.
(5) Long-term cooling should be demonstrated.

Typically, the last two criteria (coolable geometry and long-term cooling) are satisfied outside the LOCA analysis once the LOCA calculation demonstrates compliance with the first three criteria.

The acceptance criteria above were established following an extensive rulemaking in 1973. The regulation at that time was formulated to account for potentially unknown phenomena, recognizing the lack of knowledge of fundamental physical phenomena. Several conservative “required features” were mandated in Appendix K to 10 CFR 50. To cite some, the decay heat was based on the ANS 1971 model plus 20%; the metal-water reaction calculation was based on the conservative Baker-Just model; the heat transfer was limited to steam only for low flooding rates; and so on.

This led to broad international development efforts to better understand LOCA phenomena and processes, in particular the large break LOCA. The effort covered both the experimental side and the analytical side (computer codes, evaluation models). The major contributor to the development effort was the international 2D-3D program, which focused on multidimensional phenomena and scaling considerations. The test facilities were the full-scale upper plenum test facility (UPTF), the large-scale cylindrical core test facility (CCTF), and the slab core test facility (SCTF).

The knowledge gained over the years led the industry to consider a more realistic approach in the analysis of the LOCA scenario (ECCS [14]). In 1988, the USNRC amended its regulations (10 CFR 50.46) to allow the use of realistic physical models (Federal Register [15]), simulated in computer codes, to analyze the loss-of-coolant accident (LOCA) in a PWR. In the amended rule, the acceptance criteria were not changed (PCT = 2200 F, LMO = 17%, and CWO = 1%); however, certain physical models were identified as acceptable but not prescribed. Acceptable data sources were identified and documentation requirements specified (Regulatory Guide 1.157). Any realistic calculation requires the assessment of the uncertainties. Overall requirements for quantifying uncertainties were specified, and the Code Scaling, Applicability and Uncertainty (CSAU) method (Boyack et al. [1]) was cited as an acceptable methodology framework. An overview of the CSAU process is given in the next section.

2.1. Overview of the Code Scaling, Applicability and Uncertainty (CSAU) Roadmap

A group of experts (referred to as the technical program group or TPG) under the sponsorship of the US Nuclear Regulatory Commission (USNRC) undertook an effort to demonstrate that practical methods could be developed which would be acceptable under the new regulations. Shortly after its completion, the CSAU methodology and its demonstration were described in a series of papers appearing in Nuclear Engineering and Design (Boyack et al. [16, 17]).

The CSAU process is divided into three main elements. In Element (1), the scenario is broken down into relevant time periods (e.g., blowdown, refill, and reflood for the large-break scenario) and the nuclear power plant is broken down into relevant regions (e.g., fuel rod, core, lower plenum). Then potentially important phenomena/processes are identified for each time period and region. An expert panel performs the ranking and documents the basis for consensus. Results are compiled in the phenomena identification and ranking table (PIRT). The PIRT is a critical element of CSAU-based methodologies. It is designed to focus the prioritization of code assessment and facilitate the decisions on physical model and methodology development.

Element (2) is the assessment of the code. An assessment matrix is established where separate effect tests (SETs) and integral effect tests (IETs) are selected to validate the code against the important phenomena identified in the PIRT. The code biases and uncertainties are established and the effect of scale determined. A key output from this element is the establishment of probability distributions and biases for the contributors identified in Element (1). In addition to the generation of probability distributions, and perhaps even more important, this element required a thorough assessment of the code’s ability to correctly predict all the dominant physical processes during the transient. This leads to the adequacy decision of the evaluation model.

Element (3) is the actual implementation stage of the methodology. Sensitivity and uncertainty analyses are performed here. This element is probably the most straightforward of all the elements. If the dominant contributors and their probability distributions are properly identified and quantified, and if the computer code, through assessment and comparison with data, is shown to accurately predict the effect of variations in input variables on the output result, then several well-established methods are available to perform the uncertainty propagation step. The choice of method is basically a practical one, controlled by the expense incurred in performing computer calculations. The methods utilized evolved over the last two decades. An overview of the methods for combining the uncertainties is provided in Section 5.

The CSAU is a practical roadmap for developing a realistic methodology, but shortcomings have been recognized since its introduction. In particular, with regard to the PIRT, the human judgment factor and the fact that knowledge gained is not always factored back into the final documentation were seen as points of weakness. Soon after its introduction, the CSAU methodology was reviewed by the technical community, and comments were published in Nuclear Engineering and Design (Hochreiter [18]). Although there was agreement that the methodology described many of the key steps required for an acceptable methodology, there was also technical criticism and some skepticism on the practical applicability of the methodology (Boyack et al. [17]).

One important issue raised was whether the PIRT procedure eliminated too many important processes from consideration. This concern is heightened by the fact that, since every additional process included increases the complexity and cost of subsequent steps, there is the possibility of “rationalizing” a short list of contributors.

However, there are three conditions preventing such an occurrence: First, detailed independent review of the methodology by the USNRC’s experts eventually brings to light important processes which may have initially been ignored. Second, [19] provides a complete list of all the processes known to affect the LOCA transient, and requires a detailed assessment of each one. Third, the CSAU methodology requires thorough assessment of a “frozen” version of the computer code with a wide variety of experiments. Since these experiments are specifically selected to cover the expected range of conditions, important phenomena will be identified.

Overall, an important claim made by the TPG was that the methodology was structured, traceable, and practical and therefore it was ideally suited for application in the regulatory and design arenas. This was definitely demonstrated by several successful implementations of the CSAU-based methodologies currently licensed and applied to safety analysis in the industry.

In the mid 1980s, Westinghouse began development of a best-estimate methodology, in partnership with the Electric Power Research Institute and Consolidated Edison (Calif, USA). Acceptance of the methodology was achieved in 1996 after a rigorous review spanning over 3 years. A summary of the technical review and the conditions of acceptance was issued by the USNRC (Jones and Liparulo [20]). Many of the questions raised by the technical community concerning the CSAU methodology were dealt with during this review.

The PIRT concept has evolved over the years (Wilson et al. [21] and Boyack et al. [22]) and has been extensively used in various areas by the industry. The main areas of application are the development of realistic analysis methodologies (not limited to LOCA) and the development of testing requirements for new plant designs. Recent PIRTs also include the “state of knowledge.” This process puts significant emphasis on processes or phenomena that are flagged as highly important with a low state of knowledge.

The CSAU was endorsed as an acceptable structured process in the recently published Standard Review Plan (NUREG-0800) [23] and Regulatory Guide 1.203 (2005) [24]. In particular, RG 1.203 describes a structured evaluation model development and assessment process (EMDAP) which essentially follows the same principles as the CSAU roadmap, with more emphasis given to the evaluation model development process, which starts from the definition of the objectives, the functional requirements, and the assessment, and leads to the evaluation model adequacy decision. The EMDAP process is depicted in the flowchart of Figure 1.

Figure 1: EMDAP (Reg. Guide 1.203).
2.2. Regulations within a Statistical Framework

While Elements (1) and (2) of the CSAU are generally applied in various forms consistent with the original intent, the techniques used to combine the uncertainties have evolved over the last few years. The CSAU originally suggested the use of response surface methods; however, shortcomings were soon identified in early implementations. The direction in recent years has been toward direct Monte Carlo methods and the use of nonparametric statistics. This generated a debate in the industry since the regulations are not directly suited to a statistical framework. A discussion on the interpretation of the regulations from this perspective is presented in this section.

The key step in a realistic analysis is the assessment of uncertainties associated with physical models, data uncertainties, and plant initial and boundary condition variabilities. The issue is how results are interpreted to demonstrate compliance with the 10 CFR 50.46 requirements. As an additional requirement/clarification, 10 CFR 50.46 states that “[...] uncertainty must be accounted for, so that, when the calculated ECCS cooling performance is compared to the criteria set forth in paragraph (b) of this section, there is a high level of probability that the criteria would not be exceeded.” Paragraph (b) of 10 CFR 50.46 contains the list of the acceptance criteria. 10 CFR 50.46 does not explicitly specify how this probability should be evaluated or what its value should be.

Additional clarification as to the US NRC expectations on the acceptable implementation of the “high probability” requirement is provided in Section 4 of Regulatory Guide 1.157 (Best-estimate Calculations of Emergency Core Cooling System Performance), which states “a 95% probability is considered acceptable by the NRC staff [...].”

The regulatory guide was not developed to the point of explicitly considering a statistical approach to the uncertainties treatment, which would also require a statement with regard to the confidence level associated with a statistical estimate of the uncertainty. Regulatory Guide 1.157 introduced the concept of confidence level as a possible refinement to the uncertainty treatment, but did not expand further on this concept.

As statistical methods are implemented to perform LOCA safety analyses, a statistical statement based on a 95% confidence level has been suggested by the NRC as acceptable. This will be discussed further in Section 5. In practice, a 95% confidence that the 95th percentile of the PCT, LMO, and CWO populations is within the specified acceptance criteria is considered acceptable by the USNRC to demonstrate the required “high probability.” In particular, the safety evaluation report (SER) of the Westinghouse best-estimate large break LOCA methodology (ASTRUM) states the following: “the staff determined that a 95th percentile probability level based on best approximations of the constituent parameter distributions and the statistical approach used in the methodology is appropriately high.”

The main reason that a 95/95 statistical statement is accepted lies in the defense-in-depth philosophy. It is recognized that many other layers of conservatism are included in any licensed realistic evaluation model. For example, the following is stated by the NRC in the ASTRUM SER: “Because this application only applies to LBLOCA design basis analyses (which assume a single failure), a higher probability [...] is not needed to assure a safe design.” Note that the single failure assumption is not the only conservative bias/assumption included in the Westinghouse methodology. The use of this and other conservative assumptions further supports the conclusion that a 95/95 statistical statement is adequate to satisfy the acceptance criteria for the proposed evaluation model.

3. The Engine of W Methodology: WCOBRA/TRAC Computer Code

The Westinghouse large break LOCA evaluation model is based on the use of the WCOBRA/TRAC thermal-hydraulic code, the engine of the methodology. This code was developed from COBRA/TRAC, which was originally developed at Pacific Northwest Laboratory (Thurgood et al. [25]) by combining the COBRA-TF code (Thurgood et al. [26]) and the TRAC-PD2 code (Liles et al. [27]). The COBRA-TF code, which has the capability to model three-dimensional flow behavior in a reactor vessel, was incorporated into TRAC-PD2 to replace its vessel model. TRAC-PD2 is a system-transient code designed to model all major components in the primary system. Westinghouse continued the development and validation of COBRA/TRAC through an extensive assessment against several separate effect tests (SETs) (Paik and Hochreiter [28]) and integral effect tests (IETs).

The COBRA-TF (3D module) is based on a two-fluid, three-field representation of two-phase flow. The three fields are a vapor field, a continuous liquid field, and an entrained liquid drop field. Each field in the vessel uses a set of three-dimensional continuity, momentum, and energy equations, with one exception: a common energy equation is used by both the continuous liquid and the entrained liquid drop fields. The one-dimensional components (TRAC-PD2) consist of all the major components in the primary system, such as pipes, pumps, valves, steam generators, and the pressurizer. The one-dimensional components are represented by a two-phase, five-equation, drift flux model.

Among the new models and improvements incorporated by Westinghouse are (1) an improved dispersed flow film boiling (DFFB) model; (2) bottom/top downflooding (reflood entrainment) models; (3) an accumulator nitrogen model; (4) a new core kinetics model (point kinetics); (5) a spacer grid model which includes heat transfer enhancement, drop breakup, and grid rewet effects; (6) a two-fluid choked flow model based on the TRAC-PF1 formulation (Liles et al. [29]); (7) an improved fuel rod model; (8) upgraded interfacial drag models.

The subchannel formulation included in the 3D module (COBRA) offers large flexibility from the modeling standpoint (Figure 2). The geometric complexity of the vessel internals and hardware can be modeled in great detail with a relatively coarse hydraulic mesh. For example, the capability of explicitly modeling the hot assembly within the core is important.

Figure 2: WCOBRA/TRAC subchannel formulation.

Westinghouse followed the PIRT process to identify and rank dominant phenomena. The important phenomena identified were as follows.

(1) Break flow.
(2) Break path resistance.
(3) Initial stored energy/fuel rod.
(4) Core heat transfer.
(5) Delivery and bypass of ECCS water.
(6) Steam binding/entrainment.
(7) Condensation in cold leg and downcomer.
(8) Noncondensable gases/accumulator nitrogen effects.

Note that several additional contributors not considered important in the CSAU demonstration were identified by Westinghouse. Two examples are the effect of the broken loop resistances, such as the pump and vessel nozzles, on the core flow rate, and the effect of fuel relocation after cladding burst on the local linear power. It was also found that it is important to consider the effect of variations in plant-operating conditions such as the core power distribution and transient peaking factors allowed by the technical specifications for the plant. In the CSAU demonstration, this aspect was not given much attention.

For large break LOCA applications, more than 100 tests from 20 facilities were simulated with WCOBRA/TRAC. Quantifications of model uncertainties such as heat transfer and critical flow were performed via SETs. IETs and large component tests were used to judge the code’s ability to predict system responses. This includes the effect of the noding; in particular, the PWR noding is kept consistent with the noding used in the code assessment as much as practical. Compensating error analyses were performed to investigate the interaction of various models and to identify situations where an apparently good prediction is due to offsetting mispredictions. Figure 3 shows the typical WCOBRA/TRAC vessel noding.

Figure 3: WCOBRA/TRAC typical noding of the reactor vessel.

The influence of the user on the results has been recognized as another potential source of uncertainty (Aksan et al. [30], Glaeser [31–33]). To eliminate such variability, several engineering safeguards or procedures are included as part of the methodology. The calculation of plant-specific inputs and the setup of initial and boundary conditions follow very prescriptive standard guidance which is formulated in standard procedures. Frequent engineering and peer reviews are implemented to assure adherence to this guidance. In this framework, plant-to-plant variations are limited as much as practical. Steady-state criteria are established to minimize the variability of initial conditions. By following these procedures, consistency with the code assessment conclusions is ensured and “user effects” are virtually eliminated.

4. Review of Biases and Uncertainties

The Westinghouse methodology identified more than 30 important uncertainty contributors, as shown in Table 1. The list in Table 1 applies to all the standard Westinghouse 2-, 3-, and 4-loop PWRs. For the 2-loop UPI plants, some additional uncertainty parameters were considered with regard to the upper plenum hydraulics (Takeuchi et al. [5]).

Table 1 is substantially larger than the list developed in the CSAU demonstration. This fact does not indicate a flaw in the CSAU methodology itself, but is indicative of the need to apply the PIRT process thoroughly, and not to rely totally on the CSAU demonstration.

Note also that there are many other parameters beyond the list in Table 1 which may affect the results. However, these are parameters to which the transient results are expected to have a very small or negligible sensitivity. In those circumstances, it is appropriate to consider those parameters at their nominal (expected or midpoint) value without consideration of uncertainty. Typically, this is a good approximation when the variation in the parameter is tightly controlled, such as the pressurizer level, or when the sensitivity to the value of the parameter is known to be negligible, such as a small uncertainty in the vessel and loop dimensions or the secondary side liquid mass.

For some other parameters, a conservative value may be used when the parameter varies gradually as a function of operating history, such as steam generator tube plugging, or when the value of the parameter at the time of the accident is indeterminate, such as the location of the pressurizer relative to the break. A parameter may also be bounded when the sensitivity of the transient results to variations in the parameter is small, such as the moderator temperature coefficient, or when the effort to develop and justify a detailed uncertainty treatment was judged to exceed the benefits of doing so, such as the containment pressure response.

The Westinghouse methodology considers the distinction between global and local variables. Each LOCA transient analysis is divided into two parts as follows.

(1) Predict the nominal behavior of the fuel rods in the high power fuel assembly, as a result of variations in global variables. Global variables are defined as those variables which affect the overall system thermal-hydraulic transient response. By nominal we mean the predicted fuel behavior when the local variables (see below) are at their as-coded or best-estimate values.
(2) For a given reactor coolant system response and nominal (see definition above) hot assembly behavior, predict the behavior of the hot rod as a result of variations in local, or hot spot, variables. Local variables affect the hot spot response, but have a negligible effect on the overall system thermal hydraulics, which allows their impact to be considered only at the local level.

Variables 24 to 37, for example, pertain to the second category.

Most of the uncertainties in Table 1, with only a few exceptions, are explicitly treated and propagated during the uncertainty analysis. One exception is the treatment of the ECCS bypass and the entrainment into the steam generators, where the mild conservative bias observed during the code assessment against full-scale data is accepted. Another example of an accepted conservative bias is the reflood heat transfer coefficient in the core during the initial insurge of water at the end of refill: the heat transfer is limited to a maximum value during reflood due to the lack of data at high reflood rates.

For each contributor in Table 1, the range over which the variable is expected to deviate from the nominal (i.e., as-input or as-coded) value was quantified using SET and IET data or plant operation data. The end result is a probability distribution function for each of the uncertainty parameters. For the plant operating conditions, this quantification was relatively straightforward. For example, the average power in the hot rod is constantly monitored during plant operation. However, uncertainties are introduced by the measurement and the software used in the control room to convert the raw measurement to a linear heat rate. These uncertainties have been thoroughly quantified by Westinghouse in actual reactors.

For the thermal-hydraulic models, the analysis was more difficult. Each process has to be described in terms of a single modeling variable. For example, the ratio of the measured to the WCOBRA/TRAC-predicted critical flow rate (CD) is identified as the modeling variable describing the ability of the code to predict critical flow. The uncertainty probability distribution function of the modeling variable (CD in this case) is determined by generating a scatter plot obtained from the simulation of several critical flow experiments with WCOBRA/TRAC. Then, it had to be demonstrated that the model used to simulate each specific process was sufficiently correct so as not to introduce significant bias or scatter which did not reflect true uncertainty. This was required because the scatter plot used to quantify the uncertainty must not be dominated by effects that do not represent true model uncertainty.
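As an illustration of how such a modeling variable can be characterized, the sketch below builds a simple bias and scatter estimate for a CD-like measured-to-predicted ratio. The data arrays and the normal-style summary are hypothetical placeholders, not the licensed assessment procedure.

```python
# Illustrative sketch only: characterizing a modeling variable such as CD
# from assessment data (hypothetical numbers, not actual test results).
import numpy as np

measured = np.array([10.2, 9.8, 11.5, 10.9, 9.4, 10.6])   # hypothetical measured critical flows
predicted = np.array([9.9, 10.1, 11.0, 11.3, 9.0, 10.8])  # hypothetical code predictions

cd = measured / predicted          # modeling variable: measured-to-predicted ratio
bias = cd.mean() - 1.0             # mean deviation from a perfect prediction
sigma = cd.std(ddof=1)             # scatter, treated here as the model uncertainty

print(f"CD bias = {bias:+.3f}, standard deviation = {sigma:.3f}")
# In a full assessment, the CD samples would also be tested against a candidate
# distribution (e.g., normal) before being used in the uncertainty analysis.
```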

For some parameters, the probability distribution functions were approximated by normal distributions; for other parameters, an “actual” (empirical) distribution was used. In some cases, a uniform distribution was assumed if the information was insufficient to characterize a more appropriate distribution.

Note also that a detailed compensating error analysis was performed to investigate the interaction of various models and identify situations where an apparently good prediction is due to offsetting mispredictions. The analysis was reviewed by the NRC in order to assess the code’s ability to correctly predict all the dominant physical processes during the transient.

5. Review of Uncertainty Analysis Methods: From Response Surface Techniques to the Application of Nonparametric Statistics

Element (3) of the CSAU roadmap addresses how uncertainties are combined and propagated throughout the transient. In Element (2), probability distribution functions were obtained for all uncertainty parameters (about 40 in the Westinghouse methodology). The objective of the uncertainty analysis is to quantify the contributions, or better the combined effects, of all these uncertainty sources on the PCT (or LMO and CWO). The exact solution of the problem would require examining all the possible interactions among these parameters.

For example, let us assume a simple problem where there are only two parameters, X1 and X2. For each of these parameters there are only three discrete values, X1(1), X1(2), and so forth, with a probability of occurrence associated with each value, say P11, P12, and so forth. The exact solution to the problem would require developing an event-and-outcome table which includes 9 possible events and 9 outcomes (9 PCT values). The resulting PCT distribution is obtained by arranging the 9 PCT values into bins and developing a histogram. The 95th percentile (probability) PCT is obtained by counting the number of occurrences in each bin until 95% of all occurrences have been counted.
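The bookkeeping of this two-parameter example can be made concrete with a short script. The discrete values, the probabilities, and the stand-in pct_model function are invented purely for illustration.

```python
# Worked version of the two-parameter example: 3 x 3 = 9 events, each with a
# probability and a PCT outcome (all numbers are invented for illustration).
from itertools import product

x1_values, x1_probs = [0.9, 1.0, 1.1], [0.25, 0.50, 0.25]
x2_values, x2_probs = [0.8, 1.0, 1.2], [0.30, 0.40, 0.30]

def pct_model(x1, x2):
    # stand-in for the transient code: returns a PCT (deg F) for one event
    return 1600.0 + 300.0 * x1 + 200.0 * x2

# Build the event/outcome table: 9 events, 9 PCT values, 9 probabilities.
events = []
for (x1, p1), (x2, p2) in product(zip(x1_values, x1_probs), zip(x2_values, x2_probs)):
    events.append((pct_model(x1, x2), p1 * p2))

# Sort the outcomes and accumulate probability until 95% has been counted.
events.sort()
cumulative = 0.0
for pct, prob in events:
    cumulative += prob
    if cumulative >= 0.95:
        print(f"95th percentile PCT is approximately {pct:.0f} F")
        break
```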

Clearly, the real problem is much more complicated than this example. There are about 40 uncertainty parameters, and a continuous probability distribution function (PDF) is associated with each parameter. This leads to an infinite number of possibilities; the problem cannot be solved exactly, and the solution needs to be approximated to a certain degree. Several approaches have been proposed over the years, and the actual implementation of these methods in industrial applications has evolved over the last decade. An overview of the various methods is provided in the next sections.

5.1. Response Surface Method

A response surface method was suggested by the TPG in an effort to demonstrate that practical methods could be developed within the CSAU framework which would be acceptable to the NRC. Data points are generated by running the code with specific input variables to perform parametric studies on selected uncertainty contributors. Then response surfaces are fit to these data points. The response surfaces are treated as a “surrogate” of the code which reflects the functional relationship between the PCT and the uncertainty attributes. Finally, these response surfaces are used in a Monte Carlo simulation to generate the output distribution (e.g., the PCT PDF).
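A minimal sketch of this idea is given below, assuming a hypothetical nine-point design in two uncertainty attributes and a quadratic surrogate; it is not the Westinghouse run matrix, only an illustration of fitting a surrogate and then Monte Carlo sampling it cheaply.

```python
# Sketch of the response surface approach: fit a quadratic surrogate to a small
# matrix of code results, then Monte Carlo sample the surrogate (all numbers
# are hypothetical).
import numpy as np

# Parametric study results: columns are two normalized uncertainty attributes
X = np.array([[-1, -1], [-1, 0], [-1, 1],
              [ 0, -1], [ 0, 0], [ 0, 1],
              [ 1, -1], [ 1, 0], [ 1, 1]], dtype=float)
pct = np.array([1710., 1760., 1820., 1790., 1840., 1900., 1880., 1930., 2000.])

# Quadratic surface: PCT ~ a0 + a1*x1 + a2*x2 + a3*x1^2 + a4*x2^2 + a5*x1*x2
A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                     X[:, 0]**2, X[:, 1]**2, X[:, 0] * X[:, 1]])
coef, *_ = np.linalg.lstsq(A, pct, rcond=None)

# Monte Carlo on the surrogate instead of on the expensive code itself.
rng = np.random.default_rng(1)
x1, x2 = rng.normal(0.0, 0.5, 100_000), rng.uniform(-1.0, 1.0, 100_000)
samples = (coef[0] + coef[1]*x1 + coef[2]*x2
           + coef[3]*x1**2 + coef[4]*x2**2 + coef[5]*x1*x2)
print(f"Surrogate 95th percentile PCT ~ {np.percentile(samples, 95):.0f} F")
```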

An advantage of this approach is that the generation of response surfaces requires a well organized matrix of calculations in which single and multiple effects are evaluated. These calculations allow the analyst to understand how each important contributor affects the PCT.

On the other hand, the actual implementation is not as straightforward. The uncertainty contributors have to be grouped together to limit the size of the run matrix which is a strong function of the number of parameters ranged in the uncertainty analysis. At the same time, it is important to ensure that the run matrix or matrices can adequately highlight key interactions.

The first Westinghouse realistic LOCA methodology (Young et al. [3]) was based on the use of response surfaces. A number of assumptions were made to solve the problem; they are highlighted in the following. The first main assumption was to divide the problem into two parts.

(1) Predict the overall reactor response and the nominal thermal-hydraulic conditions in the high power fuel assembly, as a result of variations in “global” variables. By nominal we refer to the predicted fuel behavior when the local variables (24–37 in Table 1) are set at their as-coded “best-estimate” values.
(2) For a given reactor response and nominal hot assembly condition, predict the probability distribution of the hot rod behavior as a result of variations in “local” variables.

Step (1) is based on WCOBRA/TRAC simulations, while step (2) is based on local evaluations performed with a one-dimensional conduction code called HOTSPOT. For each WCOBRA/TRAC run, the effect of the local uncertainties is collapsed into a probability distribution by performing a large number (1000) of repeated cladding temperature (HOTSPOT) calculations, or trials, in which the values of the local variables are randomly sampled from their respective distributions as in a Monte Carlo simulation. The process is depicted in Figure 4.

Figure 4: Relationship of local hot spot distribution to global prediction.

It is noted that the HOTSPOT probability distribution is a function of the PCT. For example, an uncertainty on the oxidation reaction will have more effect if the clad temperature is high. In other words, the local probability distribution is a function of the “global” thermal-hydraulic response. The “local” probability distribution is therefore a probability conditional on the “global” outcome.
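A conditional local sampling step of this kind might look like the sketch below, where hotspot_model and the local multiplier distributions are hypothetical stand-ins for the HOTSPOT code and for local variables such as 24–37 of Table 1.

```python
# Sketch of the local (hot spot) treatment: for one global WCOBRA/TRAC result,
# sample the local variables many times to build a conditional PCT distribution.
import numpy as np

rng = np.random.default_rng(42)

def hotspot_model(global_pct, local_factors):
    # stand-in for HOTSPOT: perturbs the global PCT with local effects
    return global_pct * local_factors.prod()

global_pct = 1850.0                  # nominal hot assembly prediction, deg F (hypothetical)
n_trials = 1000
local_pcts = np.empty(n_trials)
for i in range(n_trials):
    # e.g., local peaking, oxidation reaction, clad conductivity multipliers
    local_factors = rng.normal(loc=1.0, scale=[0.02, 0.01, 0.015])
    local_pcts[i] = hotspot_model(global_pct, local_factors)

# Conditional distribution of the hot rod PCT given this global outcome
print(f"Local 95th percentile: {np.percentile(local_pcts, 95):.0f} F")
```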

The segregation of some of the variables into the “local” category reduces the problem somewhat, but the run matrix required to resolve the effect of the remaining global variables would still be too large.

The second main assumption was that the global parameters can be divided into groups. The PCT contributions from each group are assumed to be independent and can be superimposed. The groups identified were as follows.

(1) Initial condition variables (1–7).
(2) Initial core power distribution (8–16).
(3) Physical model and process parameters (17–23).

The variables are grouped with the justification that some interactions between variables are more important than others. In particular, interactions between variables in different groups (e.g., the fluid average temperature in Group (1) and the nominal hot assembly peaking factor in Group (2)) are considered second-order relative to the interactions within a group.

Within each group, some of the parameters were then statistically combined into others as “augmentation” factors, and some were simply bounded and removed from the uncertainty analysis. The uncertainties of some other parameters were statistically “collapsed.” For example, it was shown that the contribution of the initial condition variables could be combined into a single normal distribution (third main assumption). At the end of this process, it was shown that the required WCOBRA/TRAC run matrix contains

(1) 10 initial condition runs;
(2) 15 power distribution runs;
(3) 14 global model runs;
(4) 3-4 split break runs to determine the limiting break area;
(5) 8 additional superposition runs.

The last 8 runs were added to correct for the superposition assumption (second assumption). The run matrix of the additional “superposition step” is defined by combining different values of initial conditions, power distributions, and global models. A bias line is determined by linear regression from the results of the superposition runs. The bias line correlates the PCT obtained from the superposition runs (PCT_S) with the PCT predicted from the response surfaces assuming linear superposition (PCT_L) for a given set of parameters. The end result is an additional PCT penalty which is intended to bound the effect of the nonlinear behavior.
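The bias-line step could be sketched as follows, assuming hypothetical pairs of superposition-run results (PCT_S) and linearly superposed response surface estimates (PCT_L); the penalty definition shown is only one plausible bounding choice for illustration, not the approved procedure.

```python
# Sketch of the superposition correction: fit a bias line between PCT_L (linear
# superposition of response surfaces) and PCT_S (combined "superposition" runs).
# The eight (PCT_L, PCT_S) pairs below are hypothetical.
import numpy as np

pct_l = np.array([1750., 1800., 1850., 1900., 1950., 2000., 2050., 2100.])
pct_s = np.array([1765., 1820., 1880., 1915., 1990., 2030., 2095., 2150.])

slope, intercept = np.polyfit(pct_l, pct_s, 1)            # linear regression bias line
penalty = np.max(pct_s - (slope * pct_l + intercept))     # bounding residual allowance

def corrected_pct(pct_from_surfaces):
    # superposition-corrected PCT estimate plus an allowance for nonlinearity
    return slope * pct_from_surfaces + intercept + penalty

print(f"Corrected PCT for a 2000 F superposed estimate: {corrected_pct(2000.0):.0f} F")
```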

With all these assumptions, the problem is reduced to a manageable size with a run matrix of the order of 50 WCOBRA/TRAC simulations.

A criticism of the use of response surfaces is that polynomials cannot pick up discontinuities in the results or properly identify cliff effects or bifurcations. On the other hand, experience confirms that, at least for the large break LOCA, the output is well behaved over a wide range of input values, and the response surface seems well suited for capturing the local maxima which can occur over the range of variation.

5.2. Direct Monte Carlo Method

The problem could also be solved (approximated) with a direct Monte Carlo method. The implementation of the method is straightforward: it simply requires sampling the input distributions n times and then using the computer code directly to generate n outputs, which are used to estimate the actual output distribution. The issue is to define how many runs are required to accurately define the distribution of the outcome (e.g., the PCT).
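In pseudocode form, the direct Monte Carlo approach reduces to the loop below, where run_transient_code is a hypothetical stand-in for a full transient calculation and the input distributions are purely illustrative.

```python
# Minimal sketch of the direct Monte Carlo approach: sample all inputs n times
# and run the code itself on each sample (placeholder model, not WCOBRA/TRAC).
import numpy as np

rng = np.random.default_rng(0)
n = 5000  # in practice limited by the cost of each transient calculation

def run_transient_code(inputs):
    # placeholder: a real evaluation would be a multi-hour transient simulation
    return 1800.0 + 80.0 * inputs.sum()

pct = np.empty(n)
for i in range(n):
    inputs = rng.normal(0.0, 1.0, size=40)   # ~40 uncertainty parameters
    pct[i] = run_transient_code(inputs)

# With enough samples, the output distribution and its percentiles can be
# estimated directly from the results.
print(f"Estimated 95th percentile PCT: {np.percentile(pct, 95):.0f} F")
```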

Several years ago, this approach was considered totally impractical due to the number of calculations involved (on the order of thousands). This may not be true today, however. While there are still several issues to resolve with this approach, particularly the number of calculations required to adequately represent the output distribution and to extract knowledge about the importance and effect of each contributor, this approach can be considered today.

5.3. Data-Based Code Uncertainty Method

A third approach for estimating uncertainty in the PCT prediction due to uncertainties in the thermal-hydraulic models is to compare the computer code to many tests which simulate conditions in a PWR, which result in a measured PCT. Note that this step was also taken in the CSAU demonstration, but the results were not used directly.

The code bias and uncertainty are then determined directly from a PCT scatter plot. The advantage of this approach is that it effectively encompasses all potential contributors to uncertainty. The disadvantages are that the individual contributors cannot be separated, and the propagation of the dominant contributors at full scale is not adequately represented in the data base (e.g., most tests producing a PCT are single effect tests which do not combine the effects of blowdown and reflood).

5.4. Nonparametric Statistics Method

These methods derive from direct Monte Carlo methods. However, instead of attempting to obtain information with regard to the underlying probability distribution function (PDF) of the measure (say, the PCT), the PDF is ignored (distribution-free) and nonparametric statistics are used to determine a bounding value of the population with a given confidence level.

These alternative methods have been proposed in recent years and have started to be applied in realistic LOCA/ECCS calculations (Wickett et al. [34]). Although there are some conceptual similarities among them, most of these methods started to be employed in Europe in the late 1990s (Glaeser et al. [31–33]).

More recently in the US, both AREVA-NP (in 2003) (Martin and O’Dell [35]) and Westinghouse (in 2004) (Nissley et al. [12]) have developed NRC-licensed best-estimate LOCA evaluation models based on the use of these methods. Other applications in the industry are the extended statistical method (ESM) by AREVA/EDF in France (Sauvage and Keldenich [36]) and the GE application to non-LOCA events (Bolger et al. [37]). While all of these implementations utilized essentially the same technique to combine the uncertainties, there are fundamental differences with regard to the interpretation of how these calculation results are used to satisfy the regulatory acceptance criteria.

The nonparametric statistical sampling technique is sometimes referred to as “distribution-free.” It is possible to determine tolerance limits from unknown distributions by randomly sampling the quantity in question. The consideration of nonparametric tolerance limits was originally presented by Wilks [38]. Wilks showed that the proportion of the population between two order statistics from a random sample is independent of the population sampled; it is only a function of the particular order statistics chosen. Using the well-known Wilks formula, one can determine the sample size for a desired population proportion at a given tolerance interval. Let us say that we are interested in determining a bounding value of the peak clad temperature (the 95th percentile, γ = 0.95) with a 95% confidence level (β = 0.95). The sample size (i.e., the number of computer runs required) is determined by solving the following equation:

1 − γ^N ≥ β.     (1)

By substituting γ = 0.95 and β = 0.95, the number of computer runs N is found to be 59. In this technique, all the uncertainty parameters are sampled simultaneously in each run, similarly to the direct Monte Carlo method discussed in Section 5.2. The method is essentially a crude Monte Carlo simulation used with the minimum trial number needed to stabilize the “estimator.”
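Equation (1) can be solved for the minimum sample size in a single line; the short script below simply reproduces the N = 59 result.

```python
# Minimum sample size from equation (1): smallest N with 1 - gamma**N >= beta.
import math

gamma, beta = 0.95, 0.95
n = math.ceil(math.log(1.0 - beta) / math.log(gamma))
print(n)  # 59
```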

Results are then ranked from the highest PCT to the lowest; the rank 1 result provides a bounding estimate of the 95th percentile PCT with a 95% confidence level.

Besides the PCT, the 10 CFR 50.46 acceptance criteria to be satisfied also include the estimated local maximum clad oxidation (LMO), which needs to be less than 17%, and the estimated value of core-wide oxidation (CWO), which needs to be less than 1%.

A rigorous interpretation of the regulations would require the formulation of a simple singular statement of uncertainty in the form of a tolerance interval for the numerical acceptance criteria of the three attributes contained in the 10 CFR 50.46 (PCT, LMO, and CWO). The singular statement of uncertainty chosen in this case would be based on a 95% tolerance interval with a 95% confidence level for each of the 10 CFR 50.46 criteria, that is, PCT, LMO, and CWO.

According to Guba et al. [39], this requires the extension of the sample size beyond the 59 runs, which are sufficient only if one outcome is measured from the sample. A more general theory, which applies to the case where more than one outcome is considered from the sample, is discussed in the 2003 paper by Guba et al., which provides a more general formula applicable to one-sided populations with multiple outcomes (p > 1). The number of runs can be found by solving the following equation for N:

∑_{j=0}^{N−p} C(N, j) γ^j (1 − γ)^(N−j) ≥ β,     (2)

where C(N, j) is the binomial coefficient. By substituting γ = 0.95, β = 0.95, and p = 3, the number of computer runs N is found to be 124. This method was recently implemented in the Westinghouse realistic large break LOCA evaluation model, also referred to as the “Automated Statistical TReatment of Uncertainty Method” (ASTRUM) (Nissley et al. [12]). The ASTRUM evaluation model and its approach to the treatment of uncertainties were approved by the US NRC in November 2004.
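The sketch below evaluates equation (2) numerically by increasing N until the coverage reaches β; it recovers the 59-run Wilks result for p = 1 and the 124-run ASTRUM sample size for p = 3.

```python
# Minimum sample size from equation (2): smallest N such that
# sum_{j=0}^{N-p} C(N, j) * gamma**j * (1 - gamma)**(N - j) >= beta.
from math import comb

def min_runs(gamma=0.95, beta=0.95, p=1):
    n = p
    while True:
        coverage = sum(comb(n, j) * gamma**j * (1 - gamma)**(n - j)
                       for j in range(n - p + 1))
        if coverage >= beta:
            return n
        n += 1

print(min_runs(p=1))  # 59, recovering the Wilks result of equation (1)
print(min_runs(p=3))  # 124, the ASTRUM sample size for PCT, LMO, and CWO
```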

The implementation or interpretation of order statistics in safety analysis is not fully consistent within the industry. This has led to an extensive public debate among regulators and researchers which can be found in the open literature (Makai and Pál [40], Wallis et al. [41–43], Orechwa [44, 45], and Nutt and Wallis [45]). The focus of this debate has been mostly on the minimum number of runs (sample size) required to satisfy the LOCA licensing criteria (10 CFR 50.46).

The Westinghouse strategy was to take the most generic and robust approach to the issue and minimize licensing risks to its customers. The Westinghouse position is that there are three criteria that need to be satisfied simultaneously with a singular statistical statement in the form of 95/95. Further, no assumption is made with regard to the degree of correlation between the three parameters (PCT, LMO, and CWO) which are measured against the criteria. Based on these assumptions, the sample size is obtained from the Guba and Makai equation (2) and results in 124 calculations.

The maximum values of PCT, LMO, and CWO are extracted from the sample and used as bounding estimators of the 95th percentile of all three quantities with a 95% confidence level. The correct interpretation of the results thus obtained is as follows: there is at least a 95% confidence that the limiting PCT, LMO, and CWO from the sample exceed the “true” 95th percentiles.

In general, this approach has been considered (overly) conservative, and various authors have suggested that a reduced number of runs would be sufficient compared to what is considered in the Westinghouse methodology. For instance, another approach assumes that, while nothing is known about the output variable PDFs, a strong correlation may exist between the output variables. For example, the local maximum oxidation is typically a strong function of the PCT. However, this approach may require that such a correlation be demonstrated and quantified for the specific analysis.

Both methods are considered acceptable, and each presents advantages and disadvantages. Westinghouse feels that the use of the most generic and robust approach simplifies the licensing and approval process, without requiring plant-specific verifications relative to the degree of correlation between the variables or the dominant nature of one of the three criteria. Additionally, oxidation is a function of the clad temperature and its associated time history, not merely of the peak cladding temperature. Westinghouse analyses have shown that, while a high degree of correlation between PCT and LMO exists, this is plant specific and a generic statement of perfect correlation cannot be supported.

An alternative approach was outlined in recent papers by Wallis [43]. Wallis concluded that there is ultimately only one “output” of interest from a safety analysis, namely whether the regulatory criteria that apply to the specific transient under consideration are satisfied. Considering the application to a LOCA analysis, the question that Wallis therefore wants to address is “how many computer code runs are necessary to guarantee, at a 95% confidence, that there is a 95% probability that a LOCA will result in a PCT < 2200 F, an LMO < 17%, and a CWO < 1%?” The Wallis answer is that if 59 runs are performed, all resulting in an acceptable result (i.e., PCT < 2200 F, LMO < 17%, and CWO < 1%), then a positive answer to the above question can be provided.

The Wallis approach combines PCT, LMO, and CWO into a “single output.” The criteria evaluation process is absorbed into the “black box,” which simply gives a binary output indicating whether the run passed or failed the requirement (compliance with the ECCS design acceptance criteria). Wallis answers a simple “logical” question, as depicted in Figure 5, which was extracted directly from his paper (Wallis [43]).

Figure 5: Wallis formulation.
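In this reading, the analysis reduces to the pass/fail check sketched below; the run_passes function and the result triplets are hypothetical, and the point is only that the 95/95 statement applies to the single combined binary outcome rather than to each criterion separately.

```python
# Sketch of the Wallis "single output" reading: each of 59 runs is reduced to a
# pass/fail flag against all three criteria; the 95/95 claim holds only if all pass.
def run_passes(pct, lmo, cwo):
    return pct < 2200.0 and lmo < 0.17 and cwo < 0.01

results = [(1850.0, 0.03, 0.002)] * 59   # hypothetical (PCT, LMO, CWO) triplets

if all(run_passes(*r) for r in results):
    print("95/95 statement: the combined acceptance criterion is met")
else:
    print("No statement can be made without additional runs or analysis")
```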

Wallis’s answer is correct in the context of the question as posed, but it has, in our opinion, some limitations in the real application of the method to nuclear safety. In particular, the sample size (number of runs) derived following Wallis’s formulation is not sufficient to make a singular statement (at the 95/95 level) on the margin that is actually available in the plant design for each of the three criteria. In fact, while there is a 95% confidence that the 95th percentile PCT, LMO, and CWO would be lower than the regulatory limits, the analyst cannot make an estimate (at a 95% confidence level) of how much margin is actually available with respect to the three criteria considered, without decreasing the confidence level or resorting to arguments based on the correlation between oxidation and PCT.

The quantification and tracking of the margin is most often requested by both the plant operator and the regulator, and the Westinghouse approach (ASTRUM) was sensitive to this issue. More specifically, tracking of PCT margin is a regulatory requirement of 10 CFR 50.46 and cannot be well supported without a quantification of the margin available from the analysis of record.

Further insights on the robustness of the statistical method employed in the current Westinghouse realistic large break LOCA methodology (ASTRUM) are provided in a previous paper (Frepoli and Oriani [13]).

As far as the actual implementation is concerned, the ASTRUM evaluation model was grandfathered to the original methodology which was approved in 1996. The extension mainly focused on replacing the method used to combine the uncertainties, from the response surface technique to a direct Monte Carlo sampling method. The code (WCOBRA/TRAC) was essentially unchanged, and most of the uncertainty parameters were retained with their original probability distribution functions.

One main advantage of ASTRUM is that the number of runs (sample size) is fixed (124 runs) and is independent of the number of uncertainty attributes considered in the sampling process. As a result, a few additional uncertainty parameters were directly sampled instead of being bounded as in the 1996 version of the methodology. Some of the new parameters sampled in the procedure are (1) the time in cycle at which the postulated LOCA event is predicted to occur; (2) the break type (a double ended guillotine or a split); (3) the break size for a split break.

The distinction between local and global variables developed in the 1996 version of the methodology was retained in ASTRUM for convenience. However, in ASTRUM only a single HOTSPOT calculation is executed downstream of each WCOBRA/TRAC run, instead of the 1000 HOTSPOT runs of the 1996 methodology. The HOTSPOT calculation is now a single calculation where the local uncertainty coefficients are set at biased values selected by random sampling from their respective distributions. This procedure is required to be consistent with the Monte Carlo approach, where a single value of each uncertainty parameter is randomly sampled from its respective distribution for each simulation, which is composed of a WCOBRA/TRAC and a HOTSPOT calculation. There is no need to obtain the “local distribution” depicted in Figure 4 in this case, but simply a random local case within that local distribution, one for each WCOBRA/TRAC run.
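At a very high level, the sampling loop described above can be sketched as follows; run_wcobra_trac and run_hotspot are hypothetical placeholders for the actual codes, the parameter counts are only indicative, and the numbers produced have no physical meaning.

```python
# High-level sketch of a 124-case nonparametric sampling loop: all global and
# local parameters are sampled once per case, a system calculation is followed
# by a single hot spot calculation, and the maxima are taken as the 95/95
# bounding estimators. The run_* functions are hypothetical stand-ins.
import numpy as np

rng = np.random.default_rng(7)

def run_wcobra_trac(global_params):
    # placeholder for the system thermal-hydraulic calculation
    return {"pct": 1700.0 + 150.0 * global_params.sum(), "cwo": 0.002}

def run_hotspot(global_result, local_params):
    # placeholder for the single downstream hot spot calculation
    pct = global_result["pct"] * (1.0 + 0.02 * local_params.sum())
    lmo = 0.01 + 0.00002 * max(pct - 1800.0, 0.0)
    return pct, lmo

pct_list, lmo_list, cwo_list = [], [], []
for _ in range(124):
    global_params = rng.normal(0.0, 0.2, size=23)   # e.g., variables 1-23
    local_params = rng.normal(0.0, 0.2, size=14)    # e.g., variables 24-37
    global_result = run_wcobra_trac(global_params)
    pct, lmo = run_hotspot(global_result, local_params)
    pct_list.append(pct)
    lmo_list.append(lmo)
    cwo_list.append(global_result["cwo"])

# Limiting values used as 95/95 bounding estimators of the three criteria
print(max(pct_list), max(lmo_list), max(cwo_list))
```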

6. Sample Analysis Results

Since its original approval in 1996, the Westinghouse best-estimate large break LOCA methodology has been applied to perform safety analyses for several PWRs both in the USA and abroad. Currently in the US, 24 plants are licensed or analyzed with the Westinghouse 1996 and 1999 (upper plenum injection) methodologies, and more than 10 plants have been analyzed with the most recent ASTRUM evaluation model, which was approved by the NRC in late 2004.

A best-estimate LOCA safety analysis is an engineering project which encompasses several activities. A flow chart is illustrated in Figure 6. Data is collected and compiled in an input model which describes the plant. ASTRUM represents the central phase where uncertainties are combined as discussed in Section 5.4.

Figure 6: Flow chart of a typical BELOCA analysis.

Sample ASTRUM analysis results are presented for a typical Westinghouse 3-loop PWR. Other results can be found in the literature (Frepoli [7, 9–11]).

Results for a typical 3-loop PWR are shown in Figure 7. Figure 7 is a scatter plot which shows the effect of the effective break area on the final PCT. The effective break area is defined by multiplying the discharge coefficient (CD) by the sampled value of the break area (FA), normalized to the cold leg cross-sectional area. Note that the break area is ranged only for the split breaks (SPLIT), whereas CD is ranged for both split and double-ended-guillotine-cold-leg (DEGCL) breaks. This creates a region in the FA×CD space where both types of break can be found.

Figure 7: Peak clad temperature (PCT) from the ASTRUM 124 run set.

Figure 7 shows that the limiting PCT case is a double-ended-guillotine-cold-leg break transient with a near nominal discharge coefficient CD. It is noted that the limiting case with respect to local maximum oxidation (LMO) has rank 2 in terms of PCT and is a SPLIT case with a lower effective break area. The LMO case can be easily spotted in the scatter plot of Figure 7, since its PCT is relatively higher than that of other cases with a similar value of effective break area.

Figure 8 shows the degree of correlation between the local maximum oxidation and PCT for the various runs. While the correlation degree is high, the figure shows that the maximum LMO case does not necessarily coincide with the maximum PCT case.

Figure 8: Oxidation and PCT from the ASTRUM 124 run set.

Figure 9 shows the clad temperature traces for the top 10 ranked PCT cases. The limiting PCT and LMO cases are shown in red and green, respectively. It is noted that the limiting LMO value is reached in a transient which was affected by a delayed quench. Although its peak clad temperature is lower than that of the limiting PCT case, more oxidation occurred in this second case because high temperatures were sustained for a longer period of time. The limiting case in terms of core-wide oxidation (CWO) had rank 12 in terms of PCT.

Figure 9: Clad temperature traces at the PCT elevation for the top 10 ranked PCT cases in the ASTRUM 124 run set.

Since the limiting PCT, LMO, and CWO values from the run matrix (124 cases) were below the 10 CFR 50.46 limits, a statement can be made that the 95th percentiles of the PCT, LMO, and CWO populations are bounded by these limiting values with a 95% confidence level.

Other sample results obtained with both the 1996 methodology (response surfaces) and the 2004 ASTRUM (nonparametric) methodology are shown in Table 2. Note that for a similar plant, ASTRUM provided at least 150 F of additional PCT margin and significantly more margin in terms of oxidation.

7. Summary and Conclusions

Since the 1988 amendment of the 10 CFR 50.46 rule, which allowed the use of realistic physical models to analyze loss-of-coolant accidents (LOCAs), Westinghouse has been continuously developing and applying its realistic LOCA methodology for the purpose of safety analysis of nuclear power plants. The first version of the methodology was approved by the NRC in 1996 after an extensive review by the NRC and the ACRS.

An overview of the methodology, starting from the thermal-hydraulic code WCOBRA/TRAC and the development of its biases and uncertainties, was provided. The paper illustrated that a key step in any best-estimate or realistic analysis is the process selected to combine those uncertainties. Nonparametric order statistics is now the technique chosen across the industry to address this issue. However, the implementation or interpretation of statistics in safety analysis is still not fully consistent within the industry, in particular with regard to how the analysis satisfies the acceptance criteria set by the regulatory body (i.e., 10 CFR 50.46).

The Westinghouse NRC-approved method (ASTRUM) follows a rigorous implementation of order statistics theory, which leads to the execution of 124 simulations within a large break LOCA analysis. This is a fundamentally sound approach which guarantees that a bounding value of the 95th percentile (at a 95% confidence level) is obtained for each of the three 10 CFR 50.46 ECCS design acceptance criteria (PCT, LMO, and CWO). A 95/95 statistical statement on the three main ECCS design criteria (10 CFR 50.46) is acceptable to the NRC.

In general, the successful approval of the methodology and the several applications to the safety analysis of operating plants that followed are evidence that the CSAU is indeed a workable roadmap for the development and implementation of realistic methods for safety analysis within the boundaries of the current regulatory environment. Some criticism is still present in the scientific community with regard to CSAU-based methodologies. In particular, concerns are expressed with regard to the degree of “engineering judgment” within the process. However, a realistic methodology represents the current state of knowledge, and the CSAU with its PIRT is a systematic process that allows setting priorities and focusing on the most important areas for the purpose of safety analyses. Layers of conservatism are often added to increase the robustness of the method. Reviews of the methodology by the regulatory bodies look at the evaluation model in its entirety and extend well beyond the boundary of what is predicated by the PIRT process. As more information becomes available, it can be used to refine the models. Further improvements typically result in “uncovering” hidden safety margin which may be utilized to improve plant operation performance and economics. Such a process prevents the technology from being “frozen” in a highly regulated environment, and it is in line with risk-informed regulation and the defense-in-depth philosophy.