Abstract

The AREVA NP Inc. realistic large-break loss-of-coolant-accident (LOCA) analysis methodology is based on the 1988 amendment of 10 CFR 50.46, which allows best-estimate calculations of emergency core cooling system performance. This methodology conforms to the code scaling, applicability, and uncertainty (CSAU) methodology developed by the Technical Program Group for the United States Nuclear Regulatory Commission in the late 1980s. In addition, several practical considerations were revealed by the move to a production application. This paper describes the methodology development within the CSAU framework and utility objectives, lessons learned, and insight into current LOCA issues.

1. Introduction

The objective of any methodology for measuring the performance of an emergency core-cooling system (ECCS) during a loss-of-coolant accident (LOCA) is to provide a statement of assurance that the ECCS will preserve fuel integrity. For large-break LOCA analysis, the key measure (among several) is peak cladding temperature (PCT) relative to 2200°F (1200°C). Traditionally, LOCA analyses performed in the U.S. for nuclear power plant design-basis safety analysis were required to comply with the U.S. Code of Federal Regulations Title 10, Part 50 (10 CFR 50), Appendix K, a conservative, deterministic approach. Following several research and development advances in two-phase flow and heat transfer phenomena specifically related to the LOCA, regulations were updated in 1988 to allow best-estimate approaches. Several events leading up to the rule change included the conclusion of the 2D/3D program [1] and the development of NUREG-1230, Compendium of ECCS Research [2]. In addition, during the rule-making process, a committee of experts was convened to develop a paradigm for performing best-estimate LOCA evaluations. These experts came from the USNRC, national laboratories, and academia. This Technical Program Group (TPG) produced the code scaling, applicability, and uncertainty (CSAU) methodology, which is documented in NUREG/CR-5249 [3]. Today, the CSAU methodology is well known in the LOCA community, and many papers have been inspired by both the content and the conclusions of that original work. Accompanying NUREG/CR-5249, the USNRC released Regulatory Guide 1.157, Best-Estimate Calculations of Emergency Core Cooling System Performance, which provides specific guidance on acceptable best-estimate LOCA methodologies [4].

An AREVA NP predecessor company, Siemens Power Corporation, developed and submitted to the USNRC a best-estimate LBLOCA methodology during the early 1990s; however, the USNRC could not provide resources to support the review for several years. As a consequence, Siemens Power Corporation decided to redevelop the methodology and resubmitted a realistic large-break LOCA (RLBLOCA) methodology in August 2001 [5]. In April 2003, AREVA NP received USNRC approval of this S-RELAP5-based realistic large-break LOCA methodology [6].

2. Evolution of BE Methods Since 1988

The development of this methodology is a product of the lessons learned since the 1988 rule change, both within AREVA NP and in the thermal-hydraulic community at large. This is despite the fact that in May 1990, in a special issue of “Nuclear Engineering and Design” [7], the editor declared “the closure of the large-break LOCA issue.” This bold statement did not go unchallenged. In January 1992, a special issue of “Nuclear Engineering and Design” [8] was published providing comment and criticism, in the form of “Letters to the Editor,” of the existing technical understanding of LOCA in general and of the CSAU methodology specifically. Several areas were identified as being incomplete. These can generally be grouped into the following categories [9]:

(i) defining “best-estimate methods”;
(ii) merits of engineering judgment;
(iii) methods for the convolution of uncertainty;
(iv) data to quantify uncertainties.

In order to produce an acceptable, usable methodology, resolution of these and other issues was necessary. “Resolution” is, of course, a negotiated condition involving the methodology developers, an applicant, and the regulatory reviewers. Nonetheless, this paper presents insights from AREVA NP’s experience in the process from the 1988 rule change until USNRC approval in 2003.

2.1. Defining “Best-Estimate” Methods

In the context of thermal-hydraulic safety analysis performed to support nuclear power plant operation, no consensus appears to have been established for defining “best estimate.” The difficulty stems from the many types of uncertainty contributing to a plant-scale accident scenario. Sources of uncertainty associated with a large-break loss-of-coolant accident (LOCA) analysis begin with that which can be observed—measurable quantities reflecting the design or condition of a system, structure, or component. In this context, “best estimate” can be characterized simply as the preferred state of the system, structure, or component, that is, the state to which it returns following any perturbation.

The original problem tackled by the TPG in NUREG/CR-5249 was a double-ended large-break LOCA at a Westinghouse 4-loop PWR operating at steady-state full power. Several uncertainties associated with this problem were recognized in that reference, including those associated with code models, the impact of test facility scaling, epistemic uncertainty resulting in compensating errors, nodalization, and, to a lesser extent, the user effect. On the surface, the TPG appeared to establish a well-defined description; however, even this description, supported by the discussion on uncertainty presented in NUREG/CR-5249, disguises other uncertainties that are much more difficult to quantify and include in a definition of “best-estimate methods.” To identify these additional uncertainty contributors, this application statement can be dissected.

Beginning with “double-ended large-break LOCA,” this identifies a scenario with a particular break configuration. This vision of the large-break LOCA problem either ignores the spectrum of break sizes associated with LOCAs or addresses this uncertainty with conservatism. Incorporating conservatism into the definition of “best-estimate methods” appears to undermine the original move to best-estimate methods. One of the primary criticisms of the Appendix K deterministic approach was that certain so-called “conservative” models could result in nonconservative behavior during a simulation. Best-estimate methods certainly should avoid this situation; however, the question of break size is just one element of the broader uncertainty category associated with the nature of the initiating event. The break type (i.e., guillotine or longitudinal split), break orientation (i.e., necking for a guillotine break and the directional nature of split breaks), break location (i.e., cold or hot leg; pressurizer or other loops; attached pipe), and the assumed single failure (a regulatory requirement) also contribute to the initiating event uncertainty.

The descriptor “Westinghouse 4-loop PWR” identifies a plant design; however, the nature of nuclear power plant development is such that even among Westinghouse 4-loop PWRs there can be significant differences. Component choices, such as reactor coolant pumps, steam generators, core/reactor vessel design (i.e., bypass flows, fuel assembly design, upper head design), and containment response features (i.e., sprays, ice, fan coolers, passive structure surface area), represent elements of the design uncertainty. In addition, operational and maintenance history can impact the performance of “equivalent” systems, structures, and components. As a consequence, there are no “identical” plants.

“Operating at steady-state full power” encompasses all uncertainties associated with the plant operating state and event response. In analysis space, these are often initial or boundary conditions. A plant’s technical specifications and limiting conditions of operation define the operational space enveloping acceptable plant states. Frequently, there is significant latitude in “acceptable” states for system variables, including core axial power and fuel burnup, which can have a strong influence on the acceptance criteria metrics. The challenge for a “best-estimate” analysis is to balance the value of defining the likely plant state at the time of an accident with the need to support the plant’s operational envelope. Dozens of analysis parameters fall into this category.

In recognizing the complexity of the uncertainty problem associated with LOCA safety analysis, the term “best-estimate” as applied to this problem has evolved into “best-estimate plus uncertainty” (BEPU). The problem has always been the management of uncertainty. At the time the CSAU methodology was being developed, a relatively narrow view of uncertainty was necessary because of limitations in computational ability and a limited appreciation of advanced statistical methods. This original CSAU view of uncertainty was criticized as incomplete because relevant contributors to the LOCA safety analysis problem were treated only implicitly and, as a consequence, incorrectly. As such, the conversation moved from BE to BEPU, with the emphasis on uncertainty management.

2.2. The Role of Engineering Judgment

Engineering judgment has always been a necessary part of any engineering task. Engineers, drawing on their experience, have often applied engineering judgment to make large engineering challenges tractable. Confirmation is, of course, necessary when safety is a concern. In developing the CSAU methodology, the TPG formalized this often unappreciated aspect of engineering. Doing so started a debate over the role that engineering judgment should play in the LOCA safety analysis problem.

The manifestation of engineering judgment in the CSAU process is the phenomena identification and ranking table (PIRT). As the name implies, the PIRT reflects qualitative engineering judgment as to the importance of various phenomena relevant to the problem of interest. The intent of the PIRT is to provide a technical basis during the BEPU methodology development process for the many decisions, including the management of uncertainty, required to complete the task.

Resistance to this formalized use of engineering judgment inspired several criticisms, including the following.

(i) Who is qualified to be a part of a PIRT team?
(ii) How do PIRT teams deal with differences of opinion?
(iii) Should uncertainty in the ranking process be incorporated into the PIRT?
(iv) Even after the PIRT is developed, engineering judgment is required to use the results.
(v) How can the absence of knowledge (i.e., unmodeled parameters) be treated in this context?

Despite the initial criticism, the PIRT exercise has found a degree of acceptance. Its foremost value has been in establishing an understanding of the processes and phenomena of interest among a group of peers. Once consensus is achieved, decisions impacting the solution of the task at hand may begin.

In the original CSAU large-break LOCA sample problem, the TPG, applying a PIRT they developed for this problem, established a precedent that the large-break LOCA problem can be well characterized by explicitly addressing a minimum set of very important processes and phenomena. Beyond that set of large-break LOCA contributors, other phenomenological or process parameters were treated as “nominal.” This application of engineering judgment has not found universal acceptance for two reasons: (1) there is a lack of consensus on “important” parameters and (2) it ignores traditional licensing measures defined in plant technical specifications and limiting conditions of operation.

To address this criticism, the BEPU approach recognizes the value of “realistic conservatism,” that is, the explicit treatment of uncertainty by characterizing the uncertainty parameter such that the key output variables are penalized relative to the acceptance criteria. For parameters with low large-break LOCA importance, this may be a trivial distinction; however, as importance increases, scrutiny of that which is proclaimed conservative also increases. Nonetheless, the acceptance of “realistic conservatism” represents a significant departure from the original concept of BE methods; yet, it is absolutely necessary for the complex LOCA analysis problem where engineering judgment is involved.

2.3. Convolution of Uncertainty

A constraint, recognized early by the TPG during the development of the CSAU method, stemmed from the application of statistics to convolve the parameter uncertainties of several individual large-break LOCA contributors into a single uncertainty statement for PCT. Specifically, as the set of uncertainty contributors considered broadens, the number of required LOCA simulations grows exponentially. This is the nature of the response surface methods that the TPG considered state-of-the-art for this application. There is no doubt that this practical constraint influenced their acceptance of the relatively small number of large-break LOCA contributors considered in their uncertainty analysis sample problem. Later, Westinghouse would introduce a clever extension to the response surface approach to expand the number of large-break LOCA contributors that could be considered [10].
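As a rough illustration of this run-count growth (assuming, purely for illustration, a full two-level factorial design, which is only one of several possible response surface designs), the parametric run count can be contrasted with the fixed sample size of the nonparametric approach introduced below (59 runs for a 95/95 statement; see Section 3.3):

\[
N_{\text{response surface}} \approx 2^{k}, \qquad \text{e.g., } 2^{7} = 128,\; 2^{10} = 1024,
\]
\[
N_{\text{nonparametric, 95/95}} = 59 \quad \text{independent of the number of contributors } k .
\]

Under this assumption, each added contributor roughly doubles the simulation burden of the parametric approach, while the nonparametric sample size is unchanged.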

When the CSAU methodology was introduced in 1989, a few organizations in the international thermal-hydraulic community—in particular, Germany’s GRS—recognized that this limitation could be eliminated by considering nonparametric statistical approaches. This counterpoint was not universally appreciated, either because of a lack of understanding of nonparametric statistics or because of nonacceptance of their lack of a definitive uncertainty statement. The uncertainty statement from a nonparametric statistical approach is expressed as an inequality characterized by a confidence level.

Today, nonparametric order statistics (e.g., Wilks’ method) have become the method of choice. However, consensus with regard to their implementation within regulatory guidelines is still evolving. Current regulations in the U.S. and other countries recognize a multivariate acceptance criterion for large-break LOCA analysis. As a consequence, a debate over the number of calculations necessary to provide an acceptable uncertainty statement has resulted in several journal articles on the subject [11–15]. Much of this debate is over the semantics used to present the uncertainty statement: specifically, should each acceptance criterion be measured individually, or is it sufficient to consider the outcome of an analysis as a single statement concerning whether the entire set of acceptance criteria has been satisfied? AREVA NP’s position is the latter.

2.4. Completeness of the Experimental Database

Driven by the recognized gap in knowledge of LOCA phenomena apparent in the early 1970s, which resulted in the early Appendix K rule making, governments around the world invested heavily in experimental programs to rectify this situation. By the late 1980s, a large body of research on many facets of the large-break LOCA problem was complete. Coupled with the CSAU approach for performing BE analysis, was this body of work sufficient to declare the closure of the large-break LOCA problem? Undermining the closure position was the view that much of the thermal-hydraulic phenomenological database was populated empirically and, as such, much remained to be characterized.

While statistics appeared to be the answer for the analyst, the experimentalists were saying that in many areas the data were insufficient for deriving statistical measures. In addition, the possibility of unknown phenomena or undesirable interplay between competing phenomena made any declaration of closure irresponsible. The TPG’s response was simply that a sufficient amount of experimentation focused on both separate and integral effects existed and that the uncertainty associated with scale could be determined. In some areas this uncertainty may be large; however, if it proves too penalizing, that would be a motivation for new test programs.

2.5. AREVA NP’s BEPU Paradigm

Constraining factors that can limit a nuclear power plant’s efficiency include engineering design limits, equipment operability, and regulatory requirements. The acceptance of BE methods has revealed margin for improving plant operating performance. Figure 1 illustrates this view of the plant operating margin provided by BE methods relative to the traditional Appendix K deterministic methods. Margin is characterized by the separation between the design or the licensing limit and the nominal operating point. With regard to regulatory limits, this is measured by recognized metrics relative to the regulatory acceptance criteria, for example, PCT < 2200°F.

Deterministic methods provide a single “analysis of record” that quantifies the acceptance criteria metrics (PCT, total oxidation, and local hydrogen generation). Over the operating history of current generation nuclear power plants, utilities have nearly exhausted the availability of margin provided by this original method and, as a result, the apparent margin is small.

In contrast, BE methods strive to identify the acceptance criteria metrics associated with the real state of the plant. Practical limitations associated with the state of knowledge required to perform analyses force analysts to apply conservatisms that make the calculated BE value bound the real state. In addition, the real margin is never realized because the design basis limits reserve margin to cover uncertainties associated with the actual limits.

For the purpose of reporting plant operating performance margin relative to licensing limits, the goal is not to define this margin relative to the actual state; rather, it is to convolve all key phenomenological and process uncertainties to identify the calculated BEPU value—a conservative estimate of margin incorporating realistic models of the physical processes and associated phenomena.

In preparing the AREVA NP large-break LOCA methodology, the challenge of addressing the expectations of Regulatory Guide 1.157 and the CSAU process—balanced with the known criticisms of the CSAU process—moved the AREVA NP methodology development team towards nonparametric statistical methods and the “realistic conservatism” concept of uncertainty management. By taking this step, the focus of the methodology moves towards the resolution of individual uncertainty contributors.

The main advantage of nonparametric statistical methods is that the number of treatable uncertainty contributors is independent of the number of plant calculations. This characteristic provides flexibility during the development process to explicitly address as many or as few analysis contributors as necessary to resolve the outcome of the PIRT. As this is a product of engineering judgment, the uncertainty associated with this exercise can be reduced by explicitly addressing additional analysis contributors. In addition, this methodology characteristic provides the opportunity to incorporate customer requests for the explicit treatment of plant process uncertainty.

For the remainder of this paper, a description is provided of how AREVA NP’s RLBLOCA methodology conforms to the basic principles of the CSAU methodology while incorporating realistic conservatisms and nonparametric statistics.

3. Reconciling AREVA NP’s RLBLOCA Methodology with CSAU

The development of AREVA NP’s RLBLOCA methodology was primarily an exercise in complying with the main themes of the CSAU methodology. AREVA NP’s interpretation of the CSAU approach is that it represents a framework for deriving a quantifiable degree of assurance from a best-estimate analysis tool. This framework, graphically presented in Figure 2, consists of three elements and 14 steps that build on a qualitative understanding of (in this case) the large-break LOCA problem to define the tasks necessary to derive a quantitative solution. Highlighted components in Figure 2 represent steps that overlap with deterministic Appendix K methodologies. The CSAU framework outlines a procedure that leads from the identification and characterization of the dominant phenomena influencing the key acceptance parameter, PCT, to the quantification of a best estimate of the consequences of an LBLOCA and its associated uncertainty. As with Appendix-K-derived methodologies, the final result is a calculation that provides a PCT to be measured against the 10 CFR 50.46 acceptance criteria and a statement of total uncertainty associated with that result.

3.1. Requirements and Code Capabilities

The first CSAU element sets a foundation of understanding to guide methodology development. Its emphasis is on defining the problem and capturing a knowledge base that will be used to provide the fundamental technical basis for decisions downstream in the methodology development process. Steps 1, 2, 4, and 5 shown in Figure 2 identify the problem through specification of the event scenario, plant type, computer code and version, and computer code documentation, respectively. Historically, this information represented all that would normally be required for evaluation methodologies (EM) based on 10 CFR 50 Appendix K. Table 1 summarizes the AREVA choices. Of particular note is the primary analysis tool S-RELAP5. S-RELAP5 is a modified version of RELAP5/MOD2 [16] with several updates including:

(i) multidimensional modeling capability (two-dimensional hydrodynamics);
(ii) energy equations modified to better conserve transported energy;
(iii) incorporation of a derivative of the CONTEMPT [17] containment analysis code;
(iv) iterative evaluation for choked junctions;
(v) Bankoff CCFL model;
(vi) modeling of noncondensable gases (e.g., nitrogen discharge from accumulators);
(vii) revised two-phase pump degradation based on EPRI data;
(viii) improvements to interphase friction and mass transfer models;
(ix) Sleicher-Rouse correlation used for single-phase vapor heat transfer.

Step 3, identify and rank phenomena, marks a significant departure from traditional evaluation methodology approaches by formalizing engineering judgment to aid both methodology development and regulatory review. This is particularly important given the substantial effort required to develop a CSAU-based methodology. Step 3 acknowledges that plant behavior is not equally influenced by all processes and phenomena that occur during a transient. This provides the basis to reduce the analysis effort to a manageable set of phenomena ranked with respect to their influence or importance on the primary safety criterion (i.e., PCT).

The ranking process employed for the AREVA NP RLBLOCA methodology was accomplished primarily through structured discussions among AREVA NP engineers and recognized nuclear safety and thermal-hydraulics experts from industry and academia. The experts assembled for this task had extensive experience in both the experimental and computational areas of nuclear thermal-hydraulics. The PIRT team started with the original LBLOCA PIRT presented by the TPG [4]. This initial PIRT was reviewed by the three external experts, who offered recommendations for the addition or deletion of phenomena from the PIRT and revisions to the ranking of the phenomena based on the evolution of LBLOCA understanding since the publication of the CSAU methodology and lessons learned from early applications of BE methods. Following this review, a peer review was held with the three experts and four additional AREVA NP personnel to derive a final PIRT that incorporated the input from all seven participants. This final PIRT also benefited from approximately 300 code sensitivity studies that served as a validation of the engineering judgment statements. The outcome of these meetings was an AREVA NP proprietary phenomena identification and ranking table (PIRT) for large-break LOCAs that has many similarities with the original TPG large-break LOCA PIRT [4]. AREVA NP identified the PIRT parameters shown in Table 2 as dominant in a large-break LOCA; these must be explicitly addressed in a CSAU-based methodology. Following PIRT development, nearly 100 unique sensitivity studies were performed to assess consistency between the PIRT and the S-RELAP5 large-break LOCA model response. The outcome of those studies served to motivate further code model upgrades and to validate the PIRT selections.

CSAU Step 6 serves to establish a computer code’s applicability to the analysis problem. This is done by defining a cross-reference of phenomena and plant components to the computer code’s models, correlations, and nodalization capability. With regard to the dominant PIRT parameters, code applicability must also be supported by the documentation provided in Step 5.

3.2. Assessment and Ranging of Parameters

The second CSAU element establishes the methodology’s pedigree to perform a best-estimate analysis. This is done by code-to-data comparisons, sensitivity studies, and uncertainty analysis. It builds on Element 1, which defines a framework for the performance of sensitivity studies and the identification of experimental test programs by relevance to the dominant large-break LOCA phenomena. Step 7 defines the code’s assessment matrix. Thermal-hydraulic computer codes like S-RELAP5 include a large number of closure relationships to address the broad spectrum of possible thermal-hydraulic phenomenological processes. For this reason, it is neither practical nor necessary to assess every code model and correlation to support the subset of important phenomena anticipated during an LBLOCA. The PIRT and the subsequent sensitivity studies were used by AREVA NP to identify the most useful experimental programs for code assessment from the rather extensive knowledge base of experiments supporting PWR LOCA phenomena. Proprietary restrictions reduce this set considerably; however, sufficient data remain in the public domain to support qualification of a best-estimate LOCA code for PWR applications. The AREVA NP RLBLOCA assessment matrix is characterized in Table 3, which identifies each test program, the number of specific tests applied, and the primary phenomenon of interest. The particular tests were selected to address the following:

(i) important LOCA phenomena defined in the PIRT;
(ii) nodalization validation (defined in CSAU Step 8);
(iii) code/model scaling (defined in CSAU Step 10);
(iv) verification of no important compensating effects;
(v) establishing a broad range of applicability.

The CSAU methodology acknowledges that system nodalization is similar to any code model or correlation in that code results are sensitive to model permutations. This is addressed in Step 8, nuclear power plant nodalization definition. System nodalization presents an inherent code uncertainty. Unlike code models and correlations, quantification of nodalization-based code uncertainty is deemed to be of lesser importance relative to the practical requirements of model accuracy and calculation efficiency or economics. The objective is to define the minimum noding needed to capture the important phenomena. The selection process used to arrive at this objective becomes the standard nodalization procedure. The standard nodalization procedure is applied to every code assessment and LBLOCA analysis, thus minimizing nodalization as a contributor to uncertainty.

Code assessment using the test matrix from Step 7 and the nuclear power plant nodalization of Step 8 is used to accomplish Step 9, code and experiment accuracy. Code accuracy is quantified for bias and deviations through confirmatory code uncertainty analysis and benchmarks. This step also serves as a validation of Step 6, code applicability, and sets up the tasks of Element 3, sensitivity and uncertainty analysis. The demonstration of code accuracy—or, for a conservative EM, code adequacy—has always been a required component of LOCA evaluation methodologies. With a CSAU-based evaluation methodology, the emphasis is on evaluating the important individual contributors (i.e., phenomena) to the overall code uncertainty.

For the dominant LBLOCA phenomena (e.g., critical flow, film boiling, condensation, and fuel stored energy), sets of separate effects tests were used to derive the S-RELAP5 code uncertainty as it relates to each individual phenomenon. From the code-to-data comparisons, such as that seen in Figure 3 comparing S-RELAP5 results (xc) to Marviken critical flow test data (xm), the code bias (μx) and the statistical standard deviation (σ) were evaluated.
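As a minimal sketch of how a bias and standard deviation might be extracted from such paired code-to-data comparisons (the arrays, values, and the ratio-based error definition below are illustrative assumptions, not the AREVA NP data or procedure):

```python
import numpy as np

# Hypothetical paired results: code-calculated values (xc) and measured values (xm)
# for a set of separate effects test points (e.g., critical flow data).
xc = np.array([10.2, 11.8, 9.6, 12.4, 10.9, 11.1])   # code predictions (illustrative units)
xm = np.array([10.0, 12.1, 9.9, 12.0, 11.3, 10.8])   # measured data

# One common convention: express the code error at each point as the
# prediction-to-measurement ratio.
error = xc / xm

mu_x = error.mean()         # code bias; a value above 1 indicates average overprediction
sigma = error.std(ddof=1)   # sample standard deviation of the scatter about the bias

print(f"bias (mu_x) = {mu_x:.3f}, standard deviation (sigma) = {sigma:.3f}")
```

In the methodology, of course, the bias and deviation for each dominant phenomenon are derived from the full assessment database for that phenomenon rather than from a handful of points.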

While uncertainty quantification obviously requires data, the process for quantification begins with a clear qualitative understanding of the assumptions associated with measured values. This is the nature of probability and statistics in general. For example, heat transfer is fundamentally dependent on geometry, power, temperatures, fluid properties, and mass flow. In a nuclear power reactor core, heat transfer is complicated by multidimensional effects resulting from core and fuel design and from radial and axial power variations. In addition, potentially dramatic changes in fluid properties can occur as a consequence of both phenomenological responses (e.g., phase change) and plant process responses (e.g., safety injection). However, what we know about core heat transfer has been gathered from data taken from prototypical systems of likely different scale, skewed by limitations in measurement capabilities and data reduction techniques.

The quality of the data, characterized by both quantitative limitations such as the domain of system conditions during testing and qualitative limits associated with measurement factors and data reduction, must be addressed. The ideal nature of measured data would have the following characteristics.

(i) The phenomenon of interest is measurable independent of other phenomena.
(ii) Phenomenological dependencies on a particular system condition are measurable independent of changes in other system conditions.
(iii) Detailed dimensional variations are measurable.
(iv) Scale distortion is eliminated.

Since real data often do not have these characteristics, data reduction techniques have been devised and applied to compensate. Such methods often involve the elimination of “tainted” data and/or the averaging of data. The cost of such techniques is typically seen in the loss of some data and/or the broadening of uncertainty measures. Some examples from a hypothetical reflood heat transfer test are as follows:

(a) the elimination of temperature data for heater rods near a “cold” vessel wall (possible excessive radiation)—a consequence of scale distortion;
(b) an insufficient number of thermocouples to track radial and/or axial temperature variation in the simulated fuel assembly, resulting in the need to track computed average temperature results or just the peak temperature results by eliminating data not considered “peak”;
(c) tracking a total heat transfer measure rather than separate heat transfer mechanisms and other influencing phenomena (i.e., combinations of radiation and convection between walls and liquid and vapor fluids, interfacial drag), that is, tracking the convolution of multiple phenomena to produce an “aggregate phenomenon”;
(d) binning temperature data over a segment of a test condition range (e.g., pressure, void fraction) to assure an adequate depth of data necessary to generate meaningful uncertainty measures (see the sketch below).

Such limitations in data are manageable; however, the implications of such limits should be addressed in the implementation of the uncertainty measures used in BEPU methodologies.
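To make item (d) concrete, the short sketch below bins hypothetical reflood heat transfer data over a pressure range so that each bin holds enough points to yield a meaningful mean and standard deviation; the data, bin edges, and variable names are assumptions made only for illustration.

```python
import numpy as np

# Hypothetical reflood heat transfer measurements: test pressure (MPa) and the
# code-to-data heat transfer ratio at each measured point.
pressure = np.array([0.12, 0.15, 0.22, 0.28, 0.31, 0.35, 0.42, 0.48, 0.55, 0.61])
ht_ratio = np.array([1.05, 0.97, 1.10, 0.92, 1.01, 0.88, 1.07, 0.95, 1.02, 0.99])

# Bin the data over the pressure range so each bin has adequate depth of data.
bin_edges = np.array([0.1, 0.3, 0.5, 0.7])
bin_index = np.digitize(pressure, bin_edges) - 1

for b in range(len(bin_edges) - 1):
    in_bin = ht_ratio[bin_index == b]
    if in_bin.size < 2:
        continue  # too few points for a meaningful uncertainty measure
    print(f"bin {bin_edges[b]:.1f}-{bin_edges[b+1]:.1f} MPa: "
          f"n={in_bin.size}, mean={in_bin.mean():.3f}, sigma={in_bin.std(ddof=1):.3f}")
```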

Completeness requires that the treatment of each important LBLOCA phenomenon be addressed; however, a full quantification of uncertainty for each phenomenon is not necessary and, given the availability of data, may not be possible. “Phenomenological treatment” should describe a method in which the parameter range of each LBLOCA contributor is covered. The use of statistics provides various methods for describing ranges of uncertainty for a given problem; however, the CSAU process does allow for methodology conservatisms to satisfy the objective of defining uncertainty treatment for individual code models and correlations. The practical limitations of economics and data availability are considered when accepting a conservative phenomenological treatment. The trade-off is a reduction in margin relative to the LBLOCA acceptance criteria. Again, engineering judgment can play a role in how to approach this step. Table 4 provides a summary of the parameters for which code uncertainty was quantified. While in most cases AREVA NP developed proprietary analyses to quantify parameter uncertainty, the quantified uncertainty for a few parameters appears in the open literature. In those cases (i.e., metal-water reaction and decay heat), the values used in the AREVA NP RLBLOCA methodology are provided.

Given quantified uncertainty measures, the integrity of the statistics requires the demonstration of sufficient density and breadth of data within the range of applicability. Validation of uncertainty ranges or standard deviations is provided by reserving “control sets” of data and reevaluating the statistics. Data from integral effects tests (e.g., CCTF, LOFT, and Semiscale) were used to demonstrate the acceptability of the code biases developed from the separate effects tests. Figure 4 shows a comparison of a CCTF Test 54 assessment before and after the evaluation of code biases.
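A minimal sketch of the “control set” idea (hypothetical data and names; the actual validation used integral effects test data as described above) is to derive the bias and deviation from one subset of code-to-data errors and then check that a reserved subset is consistent with them.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical code-to-data error ratios for a single phenomenon.
errors = rng.normal(loc=1.02, scale=0.06, size=40)

# Reserve every fourth point as a "control set"; fit statistics to the rest.
control = errors[::4]
fit_set = np.delete(errors, np.arange(0, errors.size, 4))

mu_fit, sigma_fit = fit_set.mean(), fit_set.std(ddof=1)

# Reevaluate the statistics on the reserved data and compare.
mu_ctl, sigma_ctl = control.mean(), control.std(ddof=1)
print(f"fit set:     mu = {mu_fit:.3f}, sigma = {sigma_fit:.3f}")
print(f"control set: mu = {mu_ctl:.3f}, sigma = {sigma_ctl:.3f}")
print("control mean within 2 sigma of fit mean:",
      abs(mu_ctl - mu_fit) <= 2.0 * sigma_fit)
```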

Beyond the uncertainty quantification exercise, the primary challenge of Element 2 is to demonstrate a sufficient range of applicability of the computer code models and correlations. Code models and correlations are best assessed using separate effects test data developed for the explicit purpose of investigating the phenomena described by the code model or correlation. Establishing a sufficient range of applicability is complicated by the fact that conditions present during a PWR LBLOCA span thermal-hydraulic ranges (pressures, temperatures, flows, etc.) that exceed the ranges of any individual separate effects test. Given this inherent limitation, the logical approach to establish the pedigree of a particular code model or correlation must incorporate a broader body of knowledge on the phenomena of interest. Applying an analogy from vector space analysis, the “applicability space” will include not only data from various separate effects test programs, but also analytical solutions and data from various integral effects tests. It is the collection of this full body of phenomenological knowledge (the analytical model, the statistical description of uncertainty from separate effects tests, and validation with integral effects tests), as incorporated within a calculational framework such as S-RELAP5, that provides the technical basis supporting the declared range of applicability of a code model or correlation.

An added complexity to the applicability question is test scalability. This is addressed in Step 10. In the long history of thermal-hydraulic code model and correlation development, computer code models and correlations have often been “tuned” to particular data sets. This approach to computer code development can create a results bias and an uncertainty associated with the scaling of the problem of interest. Scaling uncertainty can be evaluated using data from a suite of test programs generated at various scales. For the specific application to the PWR LBLOCA, there is a motivation to acquire full-scale data for the dominant LBLOCA phenomena. Fortunately, many hydraulic phenomena can be assessed using tests performed at the full-scale upper plenum test facility (UPTF). In addition, heat transfer phenomena can be assessed applying data from the many reflood tests that have been performed with full-scale assemblies. The AREVA NP RLBLOCA methodology utilized the available full-scale data wherever possible. In addition, code-to-data comparisons from scaled test facilities did not show a significant scale bias. With this approach to the scaling issue, no additional accounting for scale is necessary.

3.3. Sensitivity and Uncertainty Analysis

Given the inherent uncertainty and complexity of the thermal-hydraulic processes appearing during a large-break LOCA, a best-estimate statement of assurance must be provided statistically. This CSAU element focuses on setting up, executing, and evaluating an RLBLOCA analysis. As a statistics-based methodology, the problem setup involves implementing the bias and uncertainty for the LBLOCA contributors identified in CSAU Elements 1 and 2. Execution involves the convolution of these uncertainty contributors, and the final result is evaluated from the number of calculations necessary to provide a statistically meaningful set.

While the CSAU methodology through Step 9 is focused on phenomenological contributors to uncertainty, it recognizes in Step 11 that there is also uncertainty associated with the measurable states that define a plant’s operating condition, such as pressures, temperatures, and levels. For utility customers interested in plant-specific application of an approved methodology, this may be the most important step; however, the CSAU methodology [4] discussion provides the least amount of direction for it. In response to the limited amount of guidance provided by the TPG, the AREVA NP approach has been detailed and reported in [61].

The key challenge in addressing the uncertainty associated with plant state is reconciling the requirement for analyses to support a plant’s licensing basis, through the plant’s design and control specifications, while still being “best-estimate.” Traditional deterministic analyses explicitly utilize a plant’s technical specifications when it is clearly conservative to do so; otherwise, a best-estimate value is considered to bound the technical specification. Since no provision is made for BE methods to be exempt from the use of conservative technical specifications in safety analysis, the concept of “realistic conservatism” is unavoidable. That is, this condition is a function of the regulatory process for plant licensing and not an artifact of the developed safety analysis methodology.

AREVA NP’s approach to identifying which plant parameters to treat explicitly as uncertainty parameters, either as a direct bias or sampled, considers the interests of several constituents. The primary regulatory interest requires that the plant be analyzed at technical specification limits. Precedent established by Appendix K methods provides the list of those parameters that are expected to be treated in this fashion. A second interest has been inferred by AREVA NP from the emphasis in the CSAU methodology on important phenomenological contributors to the LOCA acceptance criteria. AREVA NP chose to recognize that plant response to an off-normal event is driven by phenomena. Specifically, plant parameters were correlated to phenomena, and the importance of a plant parameter was judged in relation to any associated phenomenological parameter. For example, accumulator pressure will affect ECCS bypass, and initial flow rate will affect break flow. In effect, the inclusion of a plant parameter’s operational and measurement uncertainty implicitly broadens the range and distribution of the PIRT parameters. The third interest in this regard is the customer, who may be interested in an analysis of some process or condition for which an expanded operational variance is desired, for reasons beyond the normal support of a plant’s limits of operation. The uncertainty treatment for these parameters is handled just like that for the other sampled parameters.

Table 5 presents the list of plant parameters treated in the AREVA NP RLBLOCA 3- and 4-loop sample problems and their relation to important PIRT parameters. Generally, the impact of the plant parameters will be much less than that of the PIRT parameters. Most plant parameters represent initial conditions; hence, their impact diminishes with time. Typically, limiting LBLOCA safety analyses show the PCT during late reflood; hence, the impact of the plant initial state is likely very small. The ECCS parameters influence the simulation throughout the event; hence, greater importance should be given to these plant parameters.

The objective of CSAU Steps 12 and 13 is to combine the bias and uncertainty of the important individual contributors identified in Steps 9 and 11 through the running of a large set of plant simulations. RLBLOCA simulations using the AREVA NP methodology involve two computer codes: RODEX3A and S-RELAP5. RODEX3A is a fuel performance code that provides the fuel material property characteristics that determine a fuel pin’s initial stored energy versus burnup. S-RELAP5, a derivative of the RELAP5/MOD2 and CONTEMPT codes, uses the RODEX3A results to initialize the fuel heat structure models as part of calculating the steady-state solution that initializes the LBLOCA transient simulation. S-RELAP5 is then executed for the transient simulation of the fuel and coolant system response to the break and the containment back-pressure condition.

The convolution of the many LBLOCA uncertainty contributors (Tables 4 and 5) to PCT is an inherently statistical approach. The two common approaches are generally classified as either parametric or nonparametric. The response surface method, a parametric method, was the approach demonstrated in the CSAU sample problem [4]. The objective of that method is the development of a response surface describing peak clad temperature sensitivity to the dominant LBLOCA uncertainty contributors. The number of calculations required for that approach depends on the number of LBLOCA uncertainty contributors considered. AREVA NP chose to apply a nonparametric approach originally recommended in the German Gesellschaft fur Anlagen und Reaktorsicherheit (GRS) methodology [62]. This statistical method is often referred to as Wilks’ method [63]. The nonparametric approach decouples the number of required calculations from the number of uncertainty parameters. The desired quantification of PCT uncertainty is the identification of a specific result that represents coverage of the results domain at or above 95% with 95% confidence. The 95/95 coverage/confidence level has been recognized by the USNRC as having sufficient conservatism for LBLOCA analyses.

The minimum number of sampled cases is given by Wilks’ formula for one-sided tolerance limits. Beginning with the probability statement

\[
P\bigl(F(x_k) \geq \beta\bigr) \;=\; \sum_{j=0}^{n-k} \binom{n}{j}\,\beta^{\,j}\,(1-\beta)^{\,n-j} \;\geq\; \gamma,
\]

where \(P\bigl(F(x_k) \geq \beta\bigr)\) is the probability that the result from a given sample case \(x_k\) (the \(k\)-th smallest of the \(n\) ordered results, with \(F\) the cumulative distribution of the result) exceeds the \(\beta\) percentile. When \(k = n\), that is, the largest value of all of the samples, this relationship reduces to

\[
1 - \beta^{\,n} \;\geq\; \gamma,
\]

where \(\beta\) is the coverage, \(\gamma\) is the confidence, and \(n\) is the minimum number of sampled calculations. For the 95/95 coverage/confidence condition, \(n = 59\). This means that in a random sample of 59 calculations, one case, the highest PCT case, will bound the 95/95 coverage/confidence condition for PCT. A disadvantage of this method is that there may be significant conservatism as a result of bounding the 95/95 condition. Applying Somerville’s generalization of Wilks’ formula for nonparametric tolerance limits [64] can improve the fidelity of the final result through the performance of additional calculations.
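As an illustration only (not part of the licensed methodology), the binomial sum above can be evaluated numerically to recover the minimum run counts; the helper below also covers the use of the k-th highest result rather than the maximum, in the spirit of Somerville’s generalization.

```python
from math import comb

def min_runs(coverage=0.95, confidence=0.95, k=1, n_max=1000):
    """Smallest n such that the k-th largest of n random results bounds the
    'coverage' percentile with probability at least 'confidence'
    (one-sided nonparametric tolerance limit)."""
    for n in range(k, n_max + 1):
        # Probability that no more than n - k of the n samples fall below the
        # coverage quantile, i.e., that the k-th largest result exceeds it.
        gamma = sum(comb(n, j) * coverage**j * (1.0 - coverage)**(n - j)
                    for j in range(n - k + 1))
        if gamma >= confidence:
            return n
    raise ValueError("n_max too small")

# k = 1 uses the highest result and recovers the 59-run case; using the 2nd or
# 3rd highest result is less conservative but requires more runs.
for k in (1, 2, 3):
    print(k, min_runs(k=k))   # prints: 1 59, 2 93, 3 124
```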

Each calculation is set up by first sampling every LBLOCA uncertainty contributor over its derived range. A minimum of 59 calculations is performed. The PCT results from each calculation are sorted to identify the highest PCT. The highest PCT result from the 59 calculations bounds the 95/95 condition.
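The overall sampling-and-sorting flow can be sketched schematically as below. The contributor names, distributions, and the run_plant_case placeholder are hypothetical stand-ins; in the actual methodology the sampled values feed the coupled RODEX3A/S-RELAP5 calculation sequence described above.

```python
import random

# Illustrative uncertainty contributors (name -> sampling function). In the
# methodology these would be the phenomenological (Table 4) and plant-process
# (Table 5) parameters with their derived ranges and distributions.
CONTRIBUTORS = {
    "break_discharge_coeff": lambda: random.uniform(0.8, 1.2),
    "film_boiling_multiplier": lambda: random.gauss(1.0, 0.1),
    "accumulator_pressure_psia": lambda: random.uniform(600.0, 660.0),
    "containment_volume_ft3": lambda: random.uniform(2.0e6, 2.4e6),
}

def run_plant_case(params):
    """Placeholder for one coupled fuel/thermal-hydraulic/containment case;
    returns a PCT in degF. The response below is fabricated so the sketch runs."""
    return (1400.0
            + 300.0 * (params["break_discharge_coeff"] - 0.8)
            + 200.0 * (params["film_boiling_multiplier"] - 1.0))

random.seed(1)
pct_results = []
for _ in range(59):                      # minimum sample size from Wilks' formula
    case = {name: draw() for name, draw in CONTRIBUTORS.items()}
    pct_results.append(run_plant_case(case))

pct_results.sort()
print("95/95 bounding PCT (highest of 59):", round(pct_results[-1], 1), "degF")
print("50/50 estimate (median, 30th of 59):", round(pct_results[29], 1), "degF")
```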

Included in the AREVA NP RLBLOCA methodology topical reports are sample problems demonstrating application of this methodology to both a 3-loop and a 4-loop Westinghouse pressurized water reactor. Some results from the 3-loop sample problem were presented in [65], which culminated in a PCT of 1853°F. For this problem, more than 30 uncertainty parameters were statistically treated using a Monte Carlo sampling procedure for the creation of 59 code input file sets. Each set included four input files describing models for the fuel performance evaluation, thermal-hydraulic steady-state initialization, thermal-hydraulic transient response, and simultaneous containment response.

The final step in the CSAU process is to identify the total uncertainty. If any PCT gains or penalties were identified during the CSAU process, they are to be applied in Step 14. In addition, the total uncertainty can be quantified relative to a “best-estimate” figure of merit. The total uncertainty does not have meaning in relation to the regulatory acceptance criteria. As such, the importance of this measure is somewhat diminished from what the TPG originally envisioned. AREVA NP chose to define total uncertainty using the 50/50 condition, also evaluated from nonparametric statistics. The 50/50 condition is provided by the median calculation of an odd-numbered sample space (for 59 calculations, the 30th ordered result). For the sample problem, the 50/50 condition was identified as 1500°F; hence, the 95/95 condition represents about 350°F of uncertainty.

4. Regulatory Review

The unwritten “Element 4” in the CSAU process is the USNRC regulatory review process. This process spanned over 20 months and required 139 formal “requests for additional information” (RAIs). Plant-specific elements of the generic review were addressed for the first application, which required an additional 12 months and approximately 30 RAIs. The bulk of the review focused on the explicit definition of the range of applicability for the key LBLOCA phenomenological and plant parameters. This was provided following the methods previously discussed in the Element 2 section. In addition, the USNRC requested technical bases supporting the treatment of fuel relocation, downcomer boiling, and rod-to-rod radiation, phenomena not appearing on the AREVA NP PIRT. AREVA NP responded to these concerns by supplying new sensitivity results and/or detailed characterization of how the existing model was adequate.

5. Conclusion

The AREVA NP RLBLOCA methodology is a CSAU-based methodology for performing best-estimate large-break LOCA analysis. The methodology addresses all of the expressed steps of the CSAU process. The key challenge in this process has been the defense of declared engineering judgment and the demonstration of the methodology’s range of applicability. This was accomplished by careful characterization of the dominant LOCA parameters and by emphasis on validation through sensitivity studies and the statistical nature of the methodology.

The generic AREVA NP RLBLOCA methodology was approved by the USNRC in April 2003 and is now being applied to several nuclear power plants serviced by AREVA NP Inc. While the CSAU methodology represents a significant departure from traditional deterministic methods, the AREVA NP methodology applying nonparametric statistics retains an economic viability on par with that of existing methodologies. Throughout the 40+ staff-years of development effort at AREVA NP, the CSAU process has withstood the technical questions and challenges to its foundation. The key benefits realized by AREVA NP during this development are the following.

(i) The move to a realistic LOCA methodology brings a new clarity of understanding of the LBLOCA problem to the industry by demonstrating contrast to the very conservative 10 CFR 50 Appendix K methodologies.
(ii) Through use of statistically-based methods, there is improved characterization of the conditions in which individual LBLOCA uncertainty contributors influence LBLOCA response.
(iii) The reliance on experimental data has revived the importance of the many test programs that have long since been decommissioned.

These rewards alone have validated the CSAU approach.

Acronyms
CSAU: Code scaling, applicability, and uncertainty
ECCS: Emergency core cooling system
GRS: Gesellschaft fur Anlagen und Reaktorsicherheit
HEM: Homogeneous equilibrium model
LOCA: Loss-of-coolant accident
PCT: Peak clad temperature
PIRT: Phenomena identification and ranking table
PWR: Pressurized water reactor
RLBLOCA: Realistic large-break LOCA
TPG: Technical Program Group
USNRC: United States Nuclear Regulatory Commission