Scaling, Uncertainty, and 3D Coupled Code Calculations in Nuclear Technology
Robert P. Martin, Larry D. O'Dell, "Development Considerations of AREVA NP Inc.'s Realistic LBLOCA Analysis Methodology", Science and Technology of Nuclear Installations, vol. 2008, Article ID 239718, 13 pages, 2008. https://doi.org/10.1155/2008/239718
Development Considerations of AREVA NP Inc.'s Realistic LBLOCA Analysis Methodology
Abstract
The AREVA NP Inc. realistic large-break loss-of-coolant accident (LOCA) analysis methodology references the 1988 amendment to 10 CFR 50.46 allowing best-estimate calculations of emergency core cooling system performance. This methodology conforms to the code scaling, applicability, and uncertainty (CSAU) methodology developed by the Technical Program Group for the United States Nuclear Regulatory Commission in the late 1980s. In addition, several practical considerations were revealed with the move to a production application. This paper describes the methodology development within the CSAU framework and utility objectives, lessons learned, and insight into current LOCA issues.
1. Introduction
The objective of any methodology for measuring the performance of an emergency core-cooling system (ECCS) during a loss-of-coolant accident (LOCA) is to provide a statement of assurance that the ECCS will preserve fuel integrity. For large-break LOCA analysis, the key measure (among several) is peak cladding temperature (PCT) relative to 2200°F (1200°C). Traditionally, LOCA analyses performed in the U.S. for nuclear power plant design-basis safety analysis were required to comply with the U.S. Code of Federal Regulations Title 10, Part 50 (10 CFR 50), Appendix K, a conservative, deterministic approach. Following several research and development advances in two-phase flow and heat transfer phenomena specifically related to the LOCA, regulations were updated in 1988 to allow best-estimate approaches. Events leading up to the rule change included the close of the 2D/3D program [1] and the development of NUREG-1230, Compendium of ECCS Research [2]. In addition, during the rulemaking process, a committee of experts was convened to develop a paradigm for performing best-estimate LOCA evaluations. These experts came from the USNRC, national laboratories, and academia. This Technical Program Group (TPG) produced the code scaling, applicability, and uncertainty (CSAU) methodology, which is documented in NUREG/CR-5249 [3]. Today, the CSAU methodology is well known in the LOCA community, and many papers have been inspired by both the content and the conclusions developed from that original work. Accompanying NUREG/CR-5249, the USNRC released Regulatory Guide 1.157, Best-Estimate Calculations of Emergency Core Cooling System Performance, which provides specific detail describing acceptable best-estimate LOCA methodologies [4].
An AREVA NP predecessor company, Siemens Power Corporation, developed and submitted to the USNRC a best-estimate LBLOCA methodology during the early 1990s; however, the USNRC could not provide resources to support the review for several years. As a consequence, Siemens Power Corporation decided to reinvent this methodology and resubmitted a realistic large-break LOCA (RLBLOCA) methodology in August 2001 [5]. In April 2003, AREVA NP received approval of an S-RELAP5-based realistic large-break LOCA methodology from the USNRC [6].
2. Evolution of BE Methods Since 1988
The development of this methodology is a product of the lessons learned since the 1988 rule change, both internal to AREVA NP and by the thermal-hydraulic community at large. This is despite the fact that in May 1990, in a special issue of “Nuclear Engineering and Design” [7], the editor declared “the closure of the large-break LOCA issue.” This bold statement did not go unchallenged. In January 1992, a special issue of “Nuclear Engineering and Design” [8] was published providing comment and criticism, in the form of “Letters to the Editor,” of the existing technical understanding of LOCA in general and the CSAU methodology specifically. Several areas were identified as being incomplete. These can be generally associated with the following categories [9]:
(i) defining “best-estimate methods”;
(ii) merits of engineering judgment;
(iii) methods for the convolution of uncertainty;
(iv) data to quantify uncertainties.
In order to produce an acceptable, usable methodology, resolution of these and other issues was necessary. “Resolution” is, of course, a negotiated condition involving the methodology developers, an applicant, and the regulatory reviewers. Nonetheless, this paper presents insights from AREVA NP’s experience in the process from the 1988 rule change until USNRC approval in 2003.
2.1. Defining “Best-Estimate” Methods
In the context of thermal-hydraulic safety analysis performed to support nuclear power plant operation, no consensus appears to have been established for defining “best estimate.” The difficulty stems from the many types of uncertainty contributing to a plant-scale accident scenario. Sources of uncertainty associated with a large-break loss-of-coolant accident (LOCA) analysis begin with that which can be observed: measurable quantities reflecting the design or condition of a system, structure, or component. In this context, “best estimate” can be simply characterized as the preferred state of the system, structure, or component, to which it returns following any perturbation.
The original problem tackled by the TPG in NUREG/CR-5249 was a double-ended large-break LOCA at a Westinghouse 4-loop PWR operating at steady-state full power. Several uncertainties associated with this problem were recognized in that reference, including those associated with code models, the impact of test facility scaling, epistemic uncertainty resulting in compensating errors, nodalization, and, to a lesser extent, the user effect. On the surface, the TPG appeared to establish a well-defined description; however, even this description, supported by the discussion on uncertainty presented in NUREG/CR-5249, disguises other uncertainties that are much more difficult to quantify and include in a definition of “best-estimate methods.” To identify these additional uncertainty contributors, this application statement can be dissected.
Beginning with “double-ended large-break LOCA,” this phrase identifies a scenario with a particular break configuration. This vision of the large-break LOCA problem either ignores the spectrum of break sizes associated with LOCAs or addresses this uncertainty with conservatism. Incorporating conservatism into the definition of “best-estimate methods” appears to undermine the original move to best-estimate methods. One of the primary criticisms of the Appendix K deterministic approach was that certain so-called “conservative” models could result in nonconservative behavior during a simulation. Best-estimate methods certainly should avoid this situation; however, the question of break size is just one element of the broader uncertainty category associated with the nature of the initiating event. The communicative nature of the break (i.e., guillotine or longitudinal split), break orientation (i.e., necking for a guillotine break and the directional nature of split breaks), break location (i.e., cold or hot leg; pressurizer or other loops; attached pipe), and the assumed single failure (a regulatory requirement) also contribute to the initiating-event uncertainty.
The descriptor “Westinghouse 4-loop PWR” identifies a plant design; however, the nature of nuclear power plant development is such that even among Westinghouse 4-loop PWRs there can be significant differences. Component choices, such as reactor coolant pumps, steam generators, core/reactor vessel design (i.e., bypass flows, fuel assembly design, upper head design), and containment response features (i.e., sprays, ice, fan coolers, passive structure surface area), represent elements of the design uncertainty. In addition, operational and maintenance history can impact the performance of “equivalent” systems, structures, and components. As a consequence, there are no “identical” plants.
“Operating at steady-state full power” encompasses all uncertainties associated with the plant operating state and event response. In analysis space, these are often initial or boundary conditions. A plant’s technical specifications and limiting conditions of operation define the operational space enveloping acceptable plant states. Frequently, there is significant latitude among “acceptable” states for system variables, including core axial power and fuel burnup, that can have a strong influence on the acceptance criteria metrics. The challenge for a “best-estimate” analysis is to balance the value of defining the likely plant state at the time of an accident with the need to support the plant’s operational envelope. Dozens of analysis parameters fall into this category.
In recognizing the complexity of the uncertainty problem associated with LOCA safety analysis, the term “best estimate” as applied to this problem has evolved into “best estimate plus uncertainty” (BEPU). The problem has always been the management of uncertainty. At the time the CSAU methodology was being developed, a relatively narrow view of uncertainty was necessary because of limitations in computational capability and a limited appreciation of advanced statistical methods. This original CSAU view on uncertainty was criticized as incomplete: relevant contributors to the LOCA safety analysis problem were treated implicitly and, as a consequence, incorrectly. As such, the conversation moved from BE to BEPU, with the emphasis on uncertainty management.
2.2. The Role of Engineering Judgment
Engineering judgment has always been a necessary part of any engineering task. Engineers, through the expression of their experience, have often applied engineering judgment to make big engineering challenges workable. Confirmation is, of course, necessary when safety is a concern. In developing the CSAU methodology, the TPG formalized this often unappreciated aspect of engineering. Doing so started a debate over the extent of the role that engineering judgment should play in the LOCA safety analysis problem.
The manifestation of engineering judgment in the CSAU process is the phenomena identification and ranking table (PIRT). As the name implies, the PIRT reflects qualitative engineering judgment as to the importance of various phenomena relevant to the problem of interest. The intent of the PIRT is to provide a technical basis during the BEPU methodology development process for the many decisions, including the management of uncertainty, required to complete the task.
Resistance to this formalized use of engineering judgment inspired several criticisms, including the following.
(i) Who is qualified to be a part of a PIRT team?
(ii) How do PIRT teams deal with differences of opinion?
(iii) Should uncertainty in the ranking process be incorporated into the PIRT?
(iv) Even after the PIRT is developed, engineering judgment is required to use the results.
(v) How can the absence of knowledge (i.e., unmodeled parameters) be treated in this context?
Despite the initial criticism, the PIRT exercise has found a degree of acceptance. Its foremost value has been in establishing an understanding of the processes and phenomena of interest among a group of peers. Once consensus is achieved, decisions impacting the solution of the task at hand may begin.
In the original CSAU large-break LOCA sample problem, the TPG, applying a PIRT they developed for this problem, established a precedent that the large-break LOCA problem can be well characterized by explicitly addressing a minimum set of very important processes and phenomena. Beyond that set of large-break LOCA contributors, other phenomenological or process parameters were treated as “nominal.” This application of engineering judgment has not found universal acceptance for two reasons: (1) there is a lack of consensus on “important” parameters, and (2) it ignores traditional licensing measures defined in plant technical specifications and limiting conditions of operation.
To address this criticism, the BEPU approach recognizes the value of “realistic conservatism,” that is, the explicit treatment of uncertainty by characterizing the uncertain parameter such that the key output variables are penalized relative to the acceptance criteria. For parameters of low large-break LOCA importance, this may be a trivial distinction; however, as importance increases, scrutiny over that which is proclaimed conservative also increases. Nonetheless, the acceptance of “realistic conservatism” represents a significant departure from the original concept of BE methods; yet, it is absolutely necessary for the complex LOCA analysis problem where engineering judgment is involved.
2.3. Convolution of Uncertainty
A constraint, recognized early by the TPG during the development of the CSAU method, stemmed from the application of statistics to convolve the parameter uncertainties of several individual large-break LOCA contributors into a single uncertainty statement for PCT. Specifically, as the set of uncertainty contributors broadens, the number of required LOCA simulations grows exponentially. This is the nature of the response surface methods that the TPG considered state-of-the-art for this application. There is no doubt that this practical constraint influenced their acceptance of the relatively small number of large-break LOCA contributors considered in their uncertainty analysis sample problem. Later, Westinghouse would introduce a clever extension to the response surface approach to expand the number of large-break LOCA contributors that could be considered [10].
When the CSAU method was introduced in 1989, a few organizations in the international thermal-hydraulic community (in particular, Germany’s GRS) recognized that this obvious limitation could be eliminated by considering nonparametric statistical approaches. This counterpoint was not universally appreciated, either because of a lack of understanding of nonparametric statistics or because of nonacceptance of their lack of a definitive uncertainty statement. The uncertainty statement from a nonparametric statistical approach is expressed as an inequality characterized by a confidence level.
Today, nonparametric order statistics (e.g., Wilks’ method) have become the method of choice. However, consensus with regard to implementation within regulatory guidelines is still evolving. Current regulations in the U.S. and other countries recognize a multivariate acceptance criterion for large-break LOCA analysis. As a consequence, a debate over the number of calculations necessary to provide an acceptable uncertainty statement has resulted in several journal articles on the subject [11–15]. Much of this debate concerns the semantics used to present the uncertainty statement: specifically, should each acceptance criterion be measured individually, or is it sufficient to consider the outcome of an analysis as a single statement concerning whether the entire set of acceptance criteria has been satisfied? AREVA NP’s position is the latter.
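The practical appeal of the order-statistics approach can be seen in a short calculation. The sketch below (the function name is our own, and the full-factorial comparison is a generic illustration rather than any methodology's actual design) computes the minimum number of code runs required by the first-order, one-sided Wilks formula and contrasts it with the exponentially growing run count of a hypothetical three-level response-surface design:

```python
import math

def wilks_runs(beta, gamma):
    """Smallest n such that the largest of n random code runs bounds the
    beta quantile of the output with confidence gamma, i.e., the smallest
    n satisfying 1 - beta**n >= gamma (first-order, one-sided Wilks)."""
    return math.ceil(math.log(1.0 - gamma) / math.log(beta))

# The classic 95/95 tolerance statement used in BEPU LOCA analyses:
n95 = wilks_runs(0.95, 0.95)  # 59 runs, independent of how many
                              # uncertainty contributors are sampled

# By contrast, a full-factorial response-surface design with 3 levels
# per uncertainty contributor needs 3**k runs for k contributors:
factorial_runs = {k: 3**k for k in (5, 8, 10)}
```

Note that the 59-run figure applies to a single one-sided tolerance limit; the articles cited above debate how the required count changes when multiple acceptance criteria are treated jointly.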
2.4. Completeness of the Experimental Database
Driven by the gap in knowledge of LOCA phenomena apparent in the early 1970s, which resulted in the early Appendix K rulemaking, governments around the world invested heavily in experimental programs to rectify the situation. By the late 1980s, a large body of research on many facets of the large-break LOCA problem was complete. Coupled with the CSAU approach for performing BE analysis, was this body of work sufficient to declare closure of the large-break LOCA problem? Undermining the closure position was the view that much of the thermal-hydraulic phenomenological database was populated empirically and, as such, much remained to be characterized.
While statistics appeared to be the answer to the analyst, the experimentalist was saying that in many areas the data were insufficient for deriving statistical measures. In addition, the possibility of unknown phenomena or undesirable interplay between competing phenomena made any declaration of closure irresponsible. The TPG’s response was simply that a sufficient body of experimentation focused on both separate and integral effects existed and that the uncertainty associated with scale could be determined. In some areas this uncertainty may be large; however, if it proves too penalizing, that would be a motivation for new test programs.
2.5. AREVA NP’s BEPU Paradigm
Constraining factors that can limit a nuclear power plant’s efficiency include engineering design limits, equipment operability, and regulatory requirements. The acceptance of BE methods has revealed margin for improving plant operating performance. Figure 1 illustrates this view of the plant operating margin provided by BE methods relative to the traditional Appendix K deterministic methods. Margin is characterized by the separation between the design or the licensing limit and the nominal operating point. With regard to regulatory limits, this is measured by recognized metrics relative to the regulatory acceptance criteria, for example, PCT < 2200°F.
Deterministic methods provide a single “analysis of record” that quantifies the acceptance criteria metrics (PCT, total oxidation, and local hydrogen generation). Over the operating history of current generation nuclear power plants, utilities have nearly exhausted the availability of margin provided by this original method and, as a result, the apparent margin is small.
In contrast, BE methods strive to identify the acceptance criteria metrics associated with the real state of the plant. Practical limitations associated with the state of knowledge required to perform analyses force analysts to apply conservatisms that make the calculated BE value bounding of the real state. In addition, the real margin is never realized because the design basis limits reserve margin to cover uncertainties associated with the actual limits.
For the purpose of reporting plant operating performance margin relative to licensing limits, the goal is not to define this margin relative to the actual state; rather, it is to convolve all key phenomenological and process uncertainties to identify the calculated BEPU value—a conservative estimate of margin incorporating realistic models of the physical processes and associated phenomena.
In preparing the AREVA NP large-break LOCA methodology, the challenge of addressing the expectations of Regulatory Guide 1.157 and the CSAU process—balanced with the known criticisms of the CSAU process—moved the AREVA NP methodology development team towards nonparametric statistical methods and the “realistic conservatism” concept of uncertainty management. By taking this step, the focus of the methodology moves towards the resolution of individual uncertainty contributors.
The main advantage of nonparametric statistical methods is that the number of treatable uncertainty contributors is independent of the number of plant calculations. This characteristic provides flexibility during the development process to explicitly address as many or as few analysis contributors as necessary to resolve the outcome of the PIRT. As this is a product of engineering judgment, the uncertainty associated with this exercise can be reduced by explicitly addressing additional analysis contributors. In addition, this methodology characteristic provides the opportunity to incorporate customer requests for the explicit treatment of plant process uncertainty.
For the remainder of this paper, a description is provided of how AREVA NP’s RLBLOCA methodology conforms to the basic principles of the CSAU methodology while incorporating realistic conservatisms and nonparametric statistics.
3. Reconciling AREVA NP’s RLBLOCA Methodology with CSAU
The development of AREVA NP’s RLBLOCA methodology was primarily an exercise in complying with the main themes of the CSAU methodology. AREVA NP’s interpretation of the CSAU approach is that it represents a framework for deriving a quantifiable degree of assurance from a best-estimate analysis tool. This framework, graphically presented in Figure 2, consists of three elements and 14 steps that build on a qualitative understanding of (in this case) the large-break LOCA problem to define the tasks necessary to derive a quantitative solution. Highlighted components in Figure 2 represent steps that overlap with deterministic Appendix K methodologies. The CSAU framework outlines a procedure that leads from the identification and characterization of the dominant phenomena influencing the key acceptance parameter, PCT, to the quantification of a best estimate of the consequences of an LBLOCA and its associated uncertainty. As with Appendix K-derived methodologies, the final result is a calculation that provides a PCT to be measured against the 10 CFR 50.46 acceptance criteria and a statement of total uncertainty associated with that result.
3.1. Requirements and Code Capabilities
The first CSAU element sets a foundation of understanding to guide methodology development. Its emphasis is on defining the problem and capturing a knowledge base that will provide the fundamental technical basis for decisions downstream in the methodology development process. Steps 1, 2, 4, and 5 shown in Figure 2 identify the problem through specification of the event scenario, plant type, computer code and version, and computer code documentation, respectively. Historically, this information represented all that would normally be required for evaluation methodologies (EM) based on 10 CFR 50 Appendix K. Table 1 summarizes the AREVA choices. Of particular note is the primary analysis tool, S-RELAP5. S-RELAP5 is a modified version of RELAP5/MOD2 [16] with several updates, including:
(i) multidimensional modeling capability (two-dimensional hydrodynamics);
(ii) energy equations modified to better conserve transported energy;
(iii) incorporation of a derivative of the CONTEMPT [17] containment analysis code;
(iv) iterative evaluation for choked junctions;
(v) Bankoff CCFL model;
(vi) modeling of noncondensable gases (e.g., nitrogen discharge from accumulators);
(vii) revised two-phase pump degradation based on EPRI data;
(viii) improvements to interphase friction and mass transfer models;
(ix) Sleicher-Rouse correlation for single-phase vapor heat transfer.
Step 3, identify and rank phenomena, marks a significant departure from traditional evaluation methodology approaches by formalizing engineering judgment to aid both methodology development and regulatory review. This is particularly important given the substantial effort required to develop a CSAU-based methodology. Step 3 acknowledges that plant behavior is not equally influenced by all processes and phenomena that occur during a transient. This provides the basis to reduce the analysis effort to a manageable set of phenomena ranked with respect to their influence or importance on the primary safety criterion (i.e., PCT).
The ranking process employed for the AREVA NP RLBLOCA methodology was accomplished primarily through structured discussions among AREVA NP engineers and recognized nuclear safety and thermal-hydraulics experts from industry and academia. The experts assembled for this task had extensive experience in both the experimental and computational areas of nuclear thermal-hydraulics. The PIRT team started with the original LBLOCA PIRT presented by the TPG [4]. This initial PIRT was reviewed by the three external experts, who offered recommendations for the addition or deletion of phenomena and revisions to the rankings based on the evolution of LBLOCA understanding since the publication of the CSAU methodology and on lessons learned from early applications of BE methods. Following this review, a peer review was held with the three experts and four additional AREVA NP personnel to derive a final PIRT incorporating the input from all seven participants. This final PIRT also benefited from approximately 300 code sensitivity studies that served as a validation of the engineering judgment statements. The outcome of these meetings was an AREVA NP-proprietary phenomena identification and ranking table (PIRT) for large-break LOCAs that has many similarities with the original TPG large-break LOCA PIRT [4]. AREVA NP identified the PIRT parameters shown in Table 2 as dominant in a large-break LOCA; these must be explicitly addressed in a CSAU-based methodology. Following PIRT development, nearly 100 unique sensitivity studies were performed to assess consistency between the PIRT and the S-RELAP5 large-break LOCA model response. The outcome of those studies served to motivate further code model upgrades and to validate PIRT selections.

CSAU Step 6 serves to establish a computer code’s applicability to the analysis problem. This is done by cross-referencing phenomena and plant components against the computer code’s models, correlations, and nodalization capability. With regard to the dominant PIRT parameters, code applicability must also be supported by the documentation provided in Step 5.
3.2. Assessment and Ranging of Parameters
The second CSAU element establishes the methodology’s pedigree to perform a best-estimate analysis. This is done through code-to-data comparisons, sensitivity studies, and uncertainty analysis. It builds on Element 1, which defines a framework for the performance of sensitivity studies and the identification of experimental test programs by relevance to the dominant large-break LOCA phenomena. Step 7 defines the code’s assessment matrix. Thermal-hydraulic computer codes like S-RELAP5 include a large number of closure relationships to address the broad spectrum of possible thermal-hydraulic phenomenological processes. For this reason, it is neither practical nor necessary to assess every code model and correlation to support the subset of important phenomena anticipated during an LBLOCA. The PIRT and the subsequent sensitivity studies were used by AREVA NP to identify the most useful experimental programs for code assessment from the rather extensive knowledge base of experiments supporting PWR LOCA phenomena. Proprietary restrictions reduce this set considerably; however, sufficient data remain in the public domain to support qualification of a best-estimate LOCA code for PWR applications. The AREVA NP RLBLOCA assessment matrix is characterized in Table 3, which identifies the test program, the number of specific tests applied to the AREVA NP RLBLOCA assessment matrix, and the primary phenomenon of interest. The particular tests were selected to address the following:

(i) important LOCA phenomena defined in the PIRT;
(ii) nodalization validation (defined in CSAU Step 8);
(iii) code/model scaling (defined in CSAU Step 10);
(iv) verification of no important compensating effects;
(v) establishing a broad range of applicability.
The CSAU methodology acknowledges that system nodalization is similar to any code model or correlation in that code results are sensitive to model permutations. This is addressed in Step 8, nuclear power plant nodalization definition. System nodalization presents an inherent code uncertainty. Unlike code models and correlations, quantification of nodalization-based code uncertainty is deemed of lesser importance relative to the practical requirements of model accuracy and calculation efficiency or economics. The objective is to define the minimum noding needed to capture the important phenomena. The selection process used to arrive at this objective becomes the standard nodalization procedure. The standard nodalization procedure is applied to every code assessment and LBLOCA analysis, thus minimizing nodalization as a contributor to uncertainty.
Code assessment using the test matrix from Step 7 and the nuclear power plant nodalization of Step 8 is used to accomplish Step 9, code and experiment accuracy. Code accuracy is quantified for bias and deviation through confirmatory code uncertainty analysis and benchmarks. This step also serves as a validation of Step 6, code applicability, and sets up the tasks of Element 3, sensitivity and uncertainty analysis. The demonstration of code accuracy (or, for a conservative EM, code adequacy) has always been a required component of LOCA evaluation methodologies. With a CSAU-based evaluation methodology, the emphasis is on evaluating the important individual contributors (i.e., phenomena) to the overall code uncertainty.
For the dominant LBLOCA phenomena (e.g., critical flow, film boiling, condensation, and fuel stored energy), sets of separate effects tests were used to derive the S-RELAP5 code uncertainty as it relates to each individual phenomenon. From the code-to-data comparisons, such as that shown in Figure 3 comparing S-RELAP5 results (x_{c}) to Marviken critical flow test data (x_{m}), the code bias (μ_{x}) and the statistical standard deviation (σ) were evaluated.
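The bias-and-deviation step can be illustrated in a few lines of code. This is a minimal sketch with invented numbers, not Marviken data or the AREVA NP procedure; the error is defined here as predicted minus measured, and sign conventions differ among methodologies:

```python
from statistics import mean, stdev

def code_accuracy(x_code, x_meas):
    """Return the bias (mean error) and sample standard deviation of the
    code-to-data error for paired predictions and measurements."""
    errors = [c - m for c, m in zip(x_code, x_meas)]
    return mean(errors), stdev(errors)

# Hypothetical critical-flow comparison (mass flux values, arbitrary units):
predicted = [4100.0, 3900.0, 4350.0, 4000.0]
measured = [4000.0, 4050.0, 4200.0, 3950.0]
bias, sigma = code_accuracy(predicted, measured)  # bias = 37.5
```

In an actual assessment, the bias and deviation would be evaluated over the full set of applicable tests and then carried into the uncertainty analysis as the code's quantified error for that phenomenon.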
While uncertainty quantification obviously requires data, the process for quantification begins with a clear qualitative understanding of the assumptions associated with measured values. This is the nature of probability and statistics in general. For example, heat transfer is fundamentally dependent on geometry, power, temperatures, fluid properties, and mass flow. In a nuclear power reactor core, heat transfer is complicated by multidimensional effects resulting from core and fuel design and radial and axial power variations. In addition, potentially dramatic changes in fluid properties can occur as a consequence of both phenomenological behavior (e.g., phase change) and plant process response (e.g., safety injection). However, what we know about core heat transfer has been gathered from data taken from prototypical systems, likely of different scale, and skewed by limitations in measurement capabilities and data reduction techniques.
The quality of the data, characterized by both quantitative limitations such as the domain of system conditions during testing and qualitative limits associated with measurement factors and data reduction, must be addressed. The ideal nature of measured data would have the following characteristics.
(i) The phenomenon of interest is measurable independent of other phenomena.
(ii) Phenomenological dependencies on a particular system condition are measurable independent of changes in other system conditions.
(iii) Detailed dimensional variations are measurable.
(iv) Scale distortion is eliminated.
Since real data often do not have these characteristics, data reduction techniques have been devised and applied to compensate. Such methods often involve the elimination of “tainted” data and/or the averaging of data. The cost of such techniques is typically seen in the loss of some data and/or the broadening of uncertainty measures. Some examples from a hypothetical reflood heat transfer test are as follows:
(a) the elimination of temperature data for heater rods near a “cold” vessel wall (possible excessive radiation), a consequence of scale distortion;
(b) an insufficient number of thermocouples to track radial and/or axial temperature variation in the simulated fuel assembly, resulting in the need to track computed average temperature results or just the peak temperature results by eliminating data not considered “peak”;
(c) tracking a total heat transfer measure rather than separate heat transfer mechanisms and other influencing phenomena (i.e., combinations of radiation and convection between walls and liquid and vapor fluids, interfacial drag); that is, tracking the convolution of multiple phenomena to produce an “aggregate phenomenon”;
(d) binning temperature data over a segment of a test condition range (e.g., pressure, void fraction) to assure an adequate depth of data necessary to generate meaningful uncertainty measures.
Such limitations in data are manageable; however, the implications of such limits should be addressed in the implementation of the uncertainty measures used in BEPU methodologies.
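Item (d), binning, can be sketched in a few lines. The bin edges and readings below are invented for illustration only; the point is simply that each bin must hold enough points to yield meaningful statistics, at the cost of averaging away variation within the bin:

```python
from statistics import mean, stdev

def bin_by_condition(records, edges):
    """Group (condition, temperature) pairs into bins on the condition
    (e.g., pressure) and return per-bin count, mean, and standard
    deviation, dropping bins too sparse for statistics."""
    bins = [[] for _ in range(len(edges) - 1)]
    for cond, temp in records:
        for i in range(len(edges) - 1):
            if edges[i] <= cond < edges[i + 1]:
                bins[i].append(temp)
                break
    return {
        (edges[i], edges[i + 1]): (len(ts), mean(ts), stdev(ts))
        for i, ts in enumerate(bins) if len(ts) >= 2
    }

# Hypothetical (pressure, cladding temperature) readings:
records = [(1.0, 500.0), (1.5, 520.0), (2.5, 600.0), (2.8, 610.0), (4.2, 700.0)]
stats = bin_by_condition(records, edges=[0.0, 2.0, 3.0, 5.0])
```

Here the sparsely populated highest-pressure bin is dropped, mirroring the loss of data that the text identifies as the cost of such techniques.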
Completeness requires that the treatment of each important LBLOCA phenomenon be addressed; however, a full quantification of uncertainty for each phenomenon is not necessary and, given the availability of data, may not be possible. “Phenomenological treatment” should describe a method in which the parameter range of each LBLOCA contributor is covered. Statistics provides various methods for describing ranges of uncertainty for a given problem; however, the CSAU process does allow methodology conservatisms to satisfy the objective of defining uncertainty treatment for individual code models and correlations. The practical limitations of economics and data availability are considered when accepting a conservative phenomenological treatment. The trade-off is a reduction in margin relative to the LBLOCA acceptance criteria. Again, engineering judgment can play a role in how to approach this step. Table 4 provides a summary of the parameters for which code uncertainty was quantified. While in most cases AREVA NP developed proprietary analyses to quantify parameter uncertainty, quantified uncertainty for a few parameters appears in the open literature. In those cases (i.e., metal-water reaction and decay heat), the values used in the AREVA NP RLBLOCA methodology are provided.

Given quantified uncertainty measures, the integrity of the statistics requires a demonstration of sufficient density and breadth of data within the range of applicability. Validation of uncertainty ranges or standard deviations is provided by reserving “control sets” of data and reevaluating the statistics. Data from integral effects tests (e.g., CCTF, LOFT, and Semiscale) were used to demonstrate the acceptability of the code biases developed from the separate effects tests. Figure 4 shows a comparison of a CCTF Test 54 assessment before and after the evaluation of code biases.
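The control-set idea can be sketched as follows. This is a schematic with invented numbers (the error distribution, the 100-point control set, and the two-sigma acceptance check are all assumptions), not the proprietary validation procedure:

```python
import random
import statistics

random.seed(1)

# Synthetic model-error data (measured minus predicted, degF); an
# illustrative stand-in for a separate-effects-test comparison database.
errors = [random.gauss(5.0, 20.0) for _ in range(400)]

# Reserve a "control set" and derive the uncertainty measures from the rest.
random.shuffle(errors)
control, derivation = errors[:100], errors[100:]
mu = statistics.mean(derivation)
sigma = statistics.stdev(derivation)

# Revalidate: the reserved data should fall inside the derived two-sigma
# band at a rate consistent with the claimed ~95% coverage.
inside = sum(1 for e in control if abs(e - mu) <= 2.0 * sigma)
print(f"mu={mu:.1f} degF, sigma={sigma:.1f} degF, "
      f"control-set coverage={inside / len(control):.2f}")
```

If the reserved data fell outside the derived band too often, the uncertainty range would be judged too narrow and re-derived, which is the reevaluation step the text describes.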
Beyond the uncertainty quantification exercise, the primary challenge of Element 2 is to demonstrate a sufficient range of applicability of the computer code models and correlations. Code models and correlations are best assessed using separate effects test data developed for the explicit purpose of investigating the phenomena described by the code model or correlation. Establishing a sufficient range of applicability is complicated by the fact that conditions present during a PWR LBLOCA span thermal-hydraulic ranges (pressures, temperatures, flows, etc.) that exceed the ranges of any individual separate effects test. Given this inherent limitation, the logical approach to establishing the pedigree of a particular code model or correlation must incorporate a broader body of knowledge on the phenomena of interest. Applying an analogy from vector space analysis, the “applicability space” includes not only data from various separate effects test programs, but also analytical solutions and data from various integral effects tests. It is the collection of this full body of phenomenological knowledge (the analytical model, the statistical description of uncertainty from separate effects tests, and validation against integral effects tests), as incorporated within a calculational framework such as S-RELAP5, that provides the technical basis supporting the declared range of applicability of a code model or correlation.
An added complexity to the applicability question is test scalability, which is addressed in Step 10. In the long history of thermal-hydraulic code development, computer code models and correlations have often been “tuned” to particular data sets. This approach can create a results bias and uncertainty associated with the scale of the problem of interest. Scaling uncertainty can be evaluated using data from a suite of test programs conducted at various scales. For the specific application to the PWR LBLOCA, there is a motivation to acquire full-scale data for the dominant LBLOCA phenomena. Fortunately, many hydraulic phenomena can be assessed using tests performed at the full-scale upper plenum test facility (UPTF). In addition, heat transfer phenomena can be assessed applying data from the many reflood tests that have been performed with full-scale assemblies. The AREVA NP RLBLOCA methodology utilized the available full-scale data wherever possible. Moreover, code-to-data comparisons from scaled test facilities did not show a significant scale bias. With this approach to the scaling issue, no additional accounting for scale is necessary.
3.3. Sensitivity and Uncertainty Analysis
Given the inherent uncertainty and complexity of the thermal-hydraulic processes appearing during a large-break LOCA, a best-estimate statement of assurance must be provided statistically. This CSAU element focuses on setting up, executing, and evaluating an RLBLOCA analysis. As a statistics-based methodology, the problem setup involves implementing the bias and uncertainty for the LBLOCA contributors identified in CSAU Elements 1 and 2. Execution involves the convolution of these uncertainty contributors, and the final result is evaluated from the number of calculations necessary to provide a statistically meaningful set.
While the CSAU methodology through Step 9 is focused on phenomenological contributors to uncertainty, it recognizes in Step 11 that there is also uncertainty associated with the measurable states that define a plant’s operating condition, such as pressures, temperatures, and levels. For utility customers interested in plant-specific application of an approved methodology, this may be the most important step; however, the CSAU methodology [4] discussion provides the least amount of direction. In response to the limited guidance provided by the TPG, the AREVA NP approach has been detailed and reported in [61].
The key challenge in addressing the uncertainty associated with plant state is reconciling the requirement that analyses support a plant’s licensing basis, through the plant’s design and control specifications, while still being “best estimate.” Traditional deterministic analyses explicitly utilize a plant’s technical specifications when it is clearly conservative to do so; otherwise, a best-estimate value is considered to bound the technical specification. Since no provision exempts best-estimate methods from using conservative technical specifications in safety analysis, the concept of “realistic conservatism” is unavoidable. That is, this condition is a function of the regulatory process for plant licensing and not an artifact of the developed safety analysis methodology.
AREVA NP’s approach to identifying which plant parameters to treat explicitly as uncertainty parameters, either as a direct bias or as sampled quantities, considers the interests of several constituents. The primary regulatory interest requires that the plant be analyzed at technical specification limits; precedent established by Appendix K methods provides the list of parameters expected to be treated in this fashion. A second interest has been inferred by AREVA NP from the emphasis in the CSAU methodology on important phenomenological contributors to the LOCA acceptance criteria. AREVA NP chose to recognize that plant response to an off-normal event is driven by phenomena. Specifically, plant parameters were correlated to phenomena, and the importance of a plant parameter was judged in relation to any associated phenomenological parameter. For example, accumulator pressure will affect ECCS bypass, and initial flow rate will affect break flow. In effect, the inclusion of a plant parameter’s operational and measurement uncertainty implicitly broadens the range and distribution of PIRT parameters. The third interest is the customer, who may desire an analysis of some process or condition with an expanded operational variance, for reasons beyond the normal support of a plant’s limits of operation. The uncertainty treatment for these parameters is handled just like that of other sampled parameters.
Table 5 presents the list of plant parameters treated in the AREVA NP RLBLOCA 3- and 4-loop sample problems and their relation to important PIRT parameters. Generally, the impact of plant parameters is much less than that of PIRT parameters. Most plant parameters represent initial conditions; hence, their impact diminishes with time. Typically, limiting LBLOCA safety analyses show the PCT during late reflood; hence, the impact of the plant’s initial state is likely very small. The ECCS parameters influence the simulation throughout the event; hence, greater importance should be given to these plant parameters.

The objective of CSAU Steps 12 and 13 is to combine the bias and uncertainty of the important individual contributors identified in Steps 9 and 11 through the running of a large set of plant simulations. RLBLOCA simulations using the AREVA NP methodology involve two computer codes: RODEX3A and S-RELAP5. As stated in the introduction, RODEX3A is a fuel performance code that provides the fuel material property characteristics that determine a fuel pin’s initial stored energy versus burnup. S-RELAP5, a derivative of the RELAP5/MOD2 and CONTEMPT codes, uses the RODEX3A results to initialize the fuel heat structure models as part of calculating the steady-state solution that initializes the LBLOCA transient simulation. S-RELAP5 is then executed for the transient simulation of the fuel and coolant system response to the break and the containment backpressure condition.
The convolution of the many LBLOCA uncertainty contributors (Tables 4 and 5) into PCT is an inherently statistical exercise. The two common approaches are generally classified as either parametric or nonparametric. The response surface method, a parametric method, was the approach demonstrated in the CSAU sample problem [4]. The objective of that method is the development of a response surface describing peak cladding temperature sensitivity to the dominant LBLOCA uncertainty contributors; the number of calculations required depends on the number of uncertainty contributors considered. AREVA NP chose to apply a nonparametric approach originally recommended in the German Gesellschaft für Anlagen- und Reaktorsicherheit (GRS) methodology [62]. This statistical method is often referred to as Wilks’ method [63]. The nonparametric approach decouples the number of uncertainty parameters from the number of required calculations. The desired quantification of PCT uncertainty is the identification of a specific result that represents coverage of the results domain at or above 95% with 95% confidence. The 95/95 coverage/confidence condition has been recognized by the USNRC as having sufficient conservatism for LBLOCA analyses.
The minimum number of sampled cases is given by Wilks’ formula for one-sided tolerance limits. The starting point is the probability statement γ = 1 − Σ_{j=k}^{n} C(n, j) β^j (1 − β)^{n−j}, where γ is the probability that the result from a given sample case, F(x_k), exceeds the β percentile. When k = n, that is, the largest value of all of the samples, this relationship reduces to γ = 1 − β^n, where β is the coverage, γ is the confidence, and n is the minimum number of sampled calculations. For the 95/95 coverage/confidence condition, n = 59. This means that in a random sample of 59 calculations, one case, the highest-PCT case, will bound the 95/95 coverage/confidence condition for PCT. A disadvantage of this method is that there may be significant conservatism as a result of bounding the 95/95 condition. Applying Somerville’s generalization of Wilks’ formula on nonparametric tolerance limits [64] can improve the fidelity of the final result through the performance of additional calculations.
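The sample-size arithmetic behind these statements can be reproduced directly from the order-statistic formula above. This sketch (the helper names are ours, standard library only) searches for the smallest n; rank 1 recovers Wilks’ 59 runs, while deeper ranks follow Somerville’s generalization:

```python
from math import comb

def order_stat_confidence(n: int, beta: float, rank: int = 1) -> float:
    """Confidence that the rank-th largest of n random results exceeds the
    beta quantile: 1 - sum_{j<rank} C(n,j) (1-beta)^j beta^(n-j)."""
    return 1.0 - sum(comb(n, j) * (1.0 - beta) ** j * beta ** (n - j)
                     for j in range(rank))

def min_runs(beta: float, gamma: float, rank: int = 1) -> int:
    """Smallest n for which the rank-th largest result meets beta/gamma."""
    n = rank
    while order_stat_confidence(n, beta, rank) < gamma:
        n += 1
    return n

print(min_runs(0.95, 0.95))          # rank 1 (highest result): 59 runs
print(min_runs(0.95, 0.95, rank=2))  # bound with 2nd-highest: 93 runs
print(min_runs(0.95, 0.95, rank=3))  # bound with 3rd-highest: 124 runs
```

Using a deeper rank costs more calculations but yields a less conservative bound, which is the fidelity improvement the text attributes to Somerville’s tables.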
Each calculation is set up by first sampling every LBLOCA uncertainty contributor over its derived range. A minimum of 59 calculations is performed, and the PCT results are sorted; the highest PCT from the 59 calculations bounds the 95/95 condition.
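A schematic of this sample-and-rank procedure is sketched below. The contributor names, their ranges, and the surrogate PCT function are invented placeholders (an actual run evaluates the coupled RODEX3A/S-RELAP5 calculation, not a formula):

```python
import random

random.seed(2)

# Hypothetical uncertainty contributors with derived sampling ranges.
CONTRIBUTORS = {
    "break_discharge_coeff": (0.8, 1.2),
    "accumulator_pressure_psia": (585.0, 665.0),
    "decay_heat_multiplier": (0.94, 1.06),
}

def surrogate_pct(sample):
    """Invented stand-in for the transient PCT calculation (degF)."""
    return (1500.0
            + 400.0 * (sample["decay_heat_multiplier"] - 1.0)
            + 100.0 * random.random())

# Minimum 59 cases: sample every contributor, run, and rank the PCTs.
results = []
for _ in range(59):
    sample = {name: random.uniform(lo, hi)
              for name, (lo, hi) in CONTRIBUTORS.items()}
    results.append(surrogate_pct(sample))

results.sort()
print(f"95/95 bounding PCT (highest of 59): {results[-1]:.0f} degF")
print(f"50/50 estimate (median of 59):      {results[29]:.0f} degF")
```

The same sorted set also yields the 50/50 (median) figure used in Step 14, since 59 is an odd sample size.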
Included in the AREVA NP RLBLOCA methodology topical reports are sample problems demonstrating application of the methodology to both 3- and 4-loop Westinghouse pressurized water reactors. Some results from the 3-loop sample problem were presented in [65], which culminated in a PCT of 1853°F. For this problem, more than 30 uncertainty parameters were statistically treated using a Monte Carlo sampling procedure to create 59 code input file sets. Each set included four input files describing models for the fuel performance evaluation, thermal-hydraulic steady-state initialization, thermal-hydraulic transient response, and simultaneous containment response.
The final step in the CSAU process is to identify the total uncertainty. If any PCT gains or penalties were identified during the CSAU process, they are applied in Step 14. In addition, the total uncertainty can be quantified relative to a “best-estimate” figure of merit. The total uncertainty has no meaning in relation to the regulatory acceptance criteria; as such, the importance of this measure is somewhat diminished from what the TPG originally envisioned. AREVA NP chose to define total uncertainty using the 50/50 condition, also evaluated from nonparametric statistics. The 50/50 condition is provided by the median calculation of an odd-numbered sample. For the sample problem, the 50/50 condition was identified as 1500°F; hence, the 95/95 condition represents about 350°F of uncertainty.
4. Regulatory Review
The unwritten “Element 4” of the CSAU process is the USNRC regulatory review. This process spanned over 20 months and required 139 formal “requests for additional information” (RAIs). Plant-specific elements of the generic review were addressed for the first application, requiring an additional 12 months and approximately 30 RAIs. The bulk of the review focused on the explicit definition of the range of applicability for the key LBLOCA phenomenological and plant parameters, which was provided following the methods previously discussed in the Element 2 section. In addition, the USNRC requested the technical basis supporting the treatment of fuel relocation, downcomer boiling, and rod-to-rod radiation, phenomena not appearing on the AREVA NP PIRT. AREVA NP responded to these concerns by supplying new sensitivity results and/or detailed characterization of how the existing model was adequate.
5. Conclusion
The AREVA NP RLBLOCA methodology is a CSAU-based methodology for performing best-estimate large-break LOCA analysis. The methodology addresses all of the expressed steps of the CSAU process. The key challenge in this process has been the defense of declared engineering judgment and the demonstration of the methodology’s range of applicability. This was accomplished by careful characterization of dominant LOCA parameters and an emphasis on validation through sensitivity studies and the statistical nature of the methodology.
The generic AREVA NP RLBLOCA methodology was approved by the USNRC in April 2003 and is now being applied to several nuclear power plants serviced by AREVA NP Inc. While the CSAU methodology represents a significant departure from traditional deterministic methods, the AREVA NP methodology, applying nonparametric statistics, retains economic viability on par with existing methodologies. Throughout the 40+ staff-years of development effort at AREVA NP, the CSAU process has withstood the technical questions and challenges to its foundation. The key benefits realized by AREVA during this development are as follows.
(i) The move to a realistic LOCA methodology brings a new clarity of understanding of the LBLOCA problem to the industry by demonstrating contrast with the very conservative 10 CFR 50 Appendix K methodologies.
(ii) Through the use of statistically based methods, there is improved characterization of the conditions in which individual LBLOCA uncertainty contributors influence the LBLOCA response.
(iii) The reliance on experimental data has revived the importance of the many test programs that have long since been decommissioned.
These rewards alone have validated the CSAU approach.
Acronyms
CSAU:  Code scaling, applicability, and uncertainty 
ECCS:  Emergency core cooling system 
GRS:  Gesellschaft für Anlagen- und Reaktorsicherheit 
HEM:  Homogeneous equilibrium model 
LOCA:  Loss of coolant accident 
PCT:  Peak clad temperature 
PIRT:  Phenomena identification and ranking table 
PWR:  Pressurized water reactor 
RLBLOCA:  Realistic largebreak LOCA 
TPG:  Technical Program Group 
USNRC:  United States Nuclear Regulatory Commission 
References
 P. S. Damerell and J. W. Simons, “2D/3D program work summary report,” Tech. Rep. NUREG/IA-0126, Nuclear Regulatory Commission, Washington, DC, USA, 1993.
 USNRC, “Compendium of ECCS research for realistic LOCA analysis,” Tech. Rep. NUREG-1230, Nuclear Regulatory Commission, Washington, DC, USA, 1988.
 Technical Program Group (TPG), “Quantifying reactor safety margins,” NUREG/CR-5249, EGG-255, 1989.
 USNRC, “Best estimate calculations of emergency core cooling system performance,” Regulatory Guide 1.157, 1989.
 Framatome ANP Richland Report, “Realistic large break LOCA methodology for pressurized water reactors,” EMF-2103 (proprietary), 2001.
 H. N. Berkow, “Safety evaluation of Framatome ANP topical report EMF-2103 (P) Rev. 0, ‘Realistic large-break loss-of-coolant accident methodology for pressurized water reactors’,” letter to J. F. Malla, April 2003.
 T. G. Theofanous, Ed., “Preface,” Nuclear Engineering and Design, vol. 119, no. 1, p. ix, 1990.
 T. G. Theofanous, Ed., “Preface to the discussion on quantifying reactor safety margins (pp. 405–447),” Nuclear Engineering and Design, vol. 132, no. 3, p. 403, 1992.
 G. E. Wilson, B. E. Boyack, I. Catton et al., “TPG response to the foregoing letters-to-the-editor,” Nuclear Engineering and Design, vol. 132, no. 3, pp. 431–436, 1992.
 M. Y. Young, S. M. Bajorek, M. E. Nissley, and L. E. Hochreiter, “Application of code scaling applicability and uncertainty methodology to the large break loss of coolant accident,” Nuclear Engineering and Design, vol. 186, no. 1-2, pp. 39–52, 1998.
 A. Guba, M. Makai, and L. Pál, “Statistical aspects of best estimate method—I,” Reliability Engineering and System Safety, vol. 80, no. 3, pp. 217–232, 2003.
 W. T. Nutt and G. B. Wallis, “Evaluation of nuclear safety from the outputs of computer codes in the presence of uncertainties,” Reliability Engineering and System Safety, vol. 83, no. 1, pp. 57–77, 2004.
 Y. Orechwa, “Comments on ‘Evaluation of nuclear safety from the outputs of computer codes in the presence of uncertainties’ by W. T. Nutt and G. B. Wallis,” Reliability Engineering and System Safety, vol. 87, no. 1, pp. 133–135, 2005.
 G. B. Wallis and W. T. Nutt, “Reply to ‘Comments on “Evaluation of nuclear safety from the outputs of computer codes in the presence of uncertainties” by W. T. Nutt and G. B. Wallis’ by Y. Orechwa,” Reliability Engineering and System Safety, vol. 87, no. 1, pp. 137–145, 2005.
 G. B. Wallis, “Uncertainties and probabilities in nuclear reactor regulation,” in Proceedings of the 11th International Topical Meeting on Nuclear Reactor Thermal-Hydraulics (NURETH-11), Avignon, France, October 2005.
 V. H. Ransom, R. J. Wagner, J. A. Trapp et al., “RELAP5/MOD2 code manual,” NUREG/CR-4312, EGG-239, 1985.
 L. L. Wheat, “CONTEMPT-LT: a computer program for predicting containment pressure-temperature response to a loss-of-coolant accident,” Tech. Rep. TID-4500, ANCR-1219, Aerojet Nuclear, Idaho Falls, Idaho, USA, 1975.
 Siemens Power Corporation Report, “RODEX3 fuel rod thermal-mechanical response evaluation model, Vol. 1: theoretical manual; Vol. 2: thermal and gas release assessments,” ANF-90-145, Vols. 1 and 2, Suppl. 1 (proprietary), 1996.
 Siemens Power Corporation Report, “S-RELAP5 programmers guide,” EMF-2101, Revision 2 (proprietary), 2001.
 Framatome ANP Richland Report, “S-RELAP5: code verification and validation,” EMF-2102 (proprietary), 2001.
 Framatome ANP Richland Report, “RODEX3A: theory and users manual,” EMF-1557, Revision 4 (proprietary), 2001.
 Framatome ANP Richland Report, “Code input development guidelines for realistic large break LOCA analysis of a pressurized water reactor,” EMF-2054, Revision 2 (proprietary), 2001.
 Framatome ANP Richland Report, “S-RELAP5 realistic large break LOCA analysis guidelines,” EMF-2058(P), Revision 1 (proprietary), 2001.
 Framatome ANP Richland Report, “S-RELAP5 models and correlations code manual,” EMF-2100, Revision 4 (proprietary), 2001.
 Oak Ridge National Laboratory (ORNL), “ORNL small-break LOCA heat transfer test series I: high-pressure reflood analysis,” NUREG/CR-2114, ORNL/NUREG/TM-44, 1981.
 Oak Ridge National Laboratory (ORNL), “Dispersed flow film boiling in rod bundle geometry: steady-state heat transfer data and correlation comparisons,” NUREG/CR-2435, ORNL-582, 1982.
 Oak Ridge National Laboratory (ORNL), “Analysis of transient film boiling of high-pressure water in a rod bundle,” NUREG/CR-2469, ORNL/NUREG-8, 1982.
 Oak Ridge National Laboratory (ORNL), “Experimental investigations of bundle boiloff and reflood under high-pressure low heat flux conditions,” NUREG/CR-2455, ORNL-584, 1982.
 T. M. Anklam, R. J. Miller, and M. D. White, “Experimental investigations of uncovered bundle heat transfer and two-phase mixture level swell under high pressure low heat flux conditions,” Tech. Rep. NUREG/CR-2456, ORNL-5848, Oak Ridge National Laboratory, Oak Ridge, Tenn, USA, 1982.
 J. A. Findlay, “BWR refill-reflood program task 4.8: model qualification task plan,” NUREG/CR-1899, EPRI NP-1527, GEAP-2489, 1981.
 O. Nylund et al., “Hydrodynamic and heat transfer measurements on a full-scale simulated 36-rod Marviken fuel element with uniform heat flux distribution,” ASEA and AB Atomenergi, FRIGG-2, R4-447/RTL-100, 1968.
 A. W. Bennett, G. F. Hewitt, H. A. Kearsey, and R. K. F. Keeys, “Heat transfer to steam-water mixtures flowing in uniformly heated tubes in which the critical heat flux has been exceeded,” UKAEA Research Group Report AERE-R 537, 1967.
 FLECHT SEASET Program, “PWR FLECHT SEASET unblocked bundle, forced and gravity reflood task data report,” Vols. 1 and 2, NUREG/CR-1532, EPRI NP-1459, WCAP-969, 1980.
 F. F. Cadek, D. P. Dominicis, and R. H. Leyse, “PWR FLECHT (full length emergency cooling heat transfer) final report,” Tech. Rep. WCAP-7665, Chicago, Ill, USA, 1971.
 Siemens Power Corporation, “HTP reflood test characterization report,” EMFP60,149 (proprietary), 1998.
 Studsvik Eco & Safety AB, “The Marviken full-scale critical flow tests,” Summary Report, NUREG/CR-2671, MXC-30, 1982.
 G. P. Lilly and L. E. Hochreiter, “Mixing of ECC water with steam: 1/3-scale test and summary,” EPRI Report EPRI-2942, 1975.
 C. L. Tien, K. S. Chung, and C. P. Liu, “Flooding in two-phase countercurrent flows,” Tech. Rep. EPRI NP-1283, Electric Power Research Institute, Chicago, Ill, USA, 1979.
 E. Weiss, R. A. Markley, and A. Battacharyya, “Open duct cooling concept for the radial blanket region of a fast breeder reactor,” Nuclear Engineering and Design, vol. 16, pp. 175–386, 1971.
 Siemens AG UB KWU, “Upper plenum test facility, Test No. 12: tie plate countercurrent flow test,” R515/86/1, 1986.
 Siemens AG UB KWU, “Upper plenum test facility, Test No. 8: cold/hot leg flow pattern test, experimental data report,” U9 316/88/1, 1988.
 Siemens AG UB KWU, “Experimental data report, UPTF Test No. 10: tie plate countercurrent flow test,” U9 316/88/, 1988.
 Siemens AG UB KWU, “Experimental data report, UPTF Test No. 6: downcomer countercurrent flow test,” U9 316/88/1, 1989.
 Siemens AG UB KWU, “Experimental data report, UPTF Test No. 7: downcomer countercurrent flow test,” U9 316/89/1, 1989.
 Siemens AG UB KWU, “Experimental data report, UPTF Test No. 29: entrainment/deentrainment test,” U9 314/90/0, 1990.
 Iguchi et al., “Data report on large scale reflood test 43: CCTF core shakedown test C2-SH2 (Run 54),” JAERI-M-5815, 1983.
 T. Okubo et al., “Evaluation report on CCTF core-II reflood test C2-4 (Run 62): investigation of reproducibility,” JAERI-M-8502, 1985.
 H. Akimoto et al., “Evaluation report on CCTF core-II reflood test C2-8 (Run 67): effect of system pressure,” JAERI-M-8700, 1987.
 H. Akimoto et al., “Evaluation report on CCTF core-II reflood test C2-9 (Run 68): effect of LPCI flow rate,” JAERI-M-8700, 1987.
 MPR Associates, “Research information report of the Slab core test facility (SCTF) core II test series,” MPR-111, 1989.
 A. Ohnuki et al., “Study on ECC injection modes in reflood tests with SCTF core II: comparison between gravity and forced feeds,” JAERI-M-910, 1991.
 B. J. Holmes, “125 comparison report,” NEA/CSNI/R(91)1, AEA-TRS-104, 1991.
 P. G. Pressinos et al., “Experimental data report for LOFT power ascension experiment L2-3,” NUREG/CR-079, 1979.
 J. P. Adams et al., “Quick-look report on LOFT experiment L2-5,” EGG-LOFT-592, 1982.
 J. P. Adams et al., “Quick-look report on OECD LOFT experiment LP-02-6,” OECD LOFT-T-3404, EG&G Idaho, 1983.
 J. P. Adams and J. C. Birchley, “Quick-look report on OECD LOFT experiment LP-LB-1,” OECD LOFT-T-3504, EG&G Idaho, 1984.
 D. L. Reeder, “LOFT system and test description (5.5 ft Nuclear Core 1 LOCEs),” NUREG/CR-TREE-120, 1978.
 USNRC, “Experiment data report for Semiscale Mod-1 Test S-06-3 (LOFT counterpart test),” NUREG/CR-0251, TREE-112, 1978.
 USNRC, “Semiscale Mod-3 test program and system description,” NUREG/CR-0239, TREE-NUREG-121, 1978.
 USNRC, “Experiment data report for Semiscale Mod-3 blowdown heat transfer test S-07-1 (baseline test series),” NUREG/CR-0281, TREE-122, 1978.
 R. P. Martin and B. M. Dunn, “Application and licensing requirements of the Framatome ANP RLBLOCA methodology,” in Proceedings of the International Meeting on Updates in Best Estimate Methods in Nuclear Installation Safety Analysis (BE '04), pp. 60–70, Washington, DC, USA, 2004.
 E. Hofer, “The GRS programme package for uncertainty and sensitivity analysis,” in Proceedings of the Seminar on Methods and Codes for Assessing the Off-Site Consequences of Nuclear Accidents, Athens, Greece, May 1990, EUR 13013, Commission of the European Communities.
 S. S. Wilks, “Determination of sample sizes for setting tolerance limits,” Annals of Mathematical Statistics, vol. 12, no. 1, pp. 91–96, 1941.
 P. N. Somerville, “Tables for obtaining nonparametric tolerance limits,” Annals of Mathematical Statistics, vol. 29, no. 2, pp. 599–601, 1958.
 R. P. Martin and L. D. O'Dell, “Framatome ANP's realistic large break LOCA analysis methodology,” Nuclear Engineering and Design, vol. 235, no. 16, pp. 1713–1725, 2005.
Copyright
Copyright © 2008 Robert P. Martin and Larry D. O'Dell. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.