
Quantitative Components of Nuclear Safeguards

Call for Papers

Nuclear safeguards consist of both qualitative and quantitative components that together aim to confirm that nuclear materials and activities are used for peaceful purposes. This special issue covers quantitative components both for inspected nuclear facilities and for countrywide monitoring. For inspected facilities, important methods include verifying operator measurement data and using sequential statistical testing to detect nuclear material misuse or loss over short or long time periods. Both inspector and operator measurements are evaluated from a metrological viewpoint to estimate random and systematic error components, using both first-principles (“bottom-up”) and empirical (“top-down”) uncertainty quantification.

This topic is important and timely for two main reasons. First, the number of nuclear facilities is increasing, so resource allocation is increasingly important. Quantitative measures such as detection probabilities for nuclear material misuse or loss help allocate resources such as measurement methods and inspection days. Second, advantage should be taken of new computational methods for modeling, simulation, and statistical inference that could improve these detection probabilities. Examples include the following:

  • Modeling: Monte Carlo radiation transport models of detectors continue to improve the development and uncertainty quantification of nondestructive assay methods used by inspectors.
  • Simulation: modern simulation can provide facility models realistic enough to support quantitative assessment of candidate monitoring schemes, such as monitoring spectral data for off-normal operating conditions that could indicate misuse of the facility in violation of treaty agreements.
  • Inference: methods such as approximate Bayesian computation (ABC) allow data analysts to drop conventional assumptions, such as the assumption that random and/or systematic measurement errors are normally distributed.
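To illustrate the third point, the following is a minimal sketch of ABC rejection sampling applied to a toy verification problem. All numbers here (sample size, priors, tolerance, error magnitudes) are hypothetical illustration values, not values from any IAEA procedure: a forward model simulates inspector measurements with multiplicative random error, and parameter draws are kept only when simulated summary statistics land close to the observed ones, with no normality assumption needed in the inference step.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical data: n inspector measurements of nominally identical items,
# with ~1% multiplicative random error and no true loss (loss fraction = 0).
n = 50
observed = 1.0 * (1 + rng.normal(0, 0.01, n))

def simulate(loss, error_sd):
    """Forward model: n measurements under a given loss fraction and error SD."""
    return (1 + loss) * (1 + rng.normal(0, error_sd, n))

def summary(x):
    """Summary statistics compared between observed and simulated data."""
    return np.array([x.mean(), x.std()])

obs_stats = summary(observed)

# ABC rejection: draw (loss, error_sd) from priors, keep draws whose
# simulated summaries are within a tolerance of the observed summaries.
accepted = []
for _ in range(20_000):
    loss = rng.uniform(-0.05, 0.05)   # prior on loss fraction
    sd = rng.uniform(0.001, 0.05)     # prior on random-error SD
    if np.linalg.norm(summary(simulate(loss, sd)) - obs_stats) < 0.005:
        accepted.append(loss)

posterior_loss = np.array(accepted)
print(len(posterior_loss), posterior_loss.mean())
```

Because the true loss fraction is zero here, the accepted draws concentrate near zero; replacing the normal error model inside `simulate` with a heavier-tailed distribution requires no change to the inference loop, which is the practical appeal of ABC noted above.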

High-quality original papers and review articles are invited.

Potential topics include but are not limited to the following:

  • Facility and simulation modeling (including models of facility misuse that track key isotopes through all processing stages) for process monitoring (PM) and nuclear material accounting
  • Modeling for nondestructive assay (NDA) using transport models
  • Unattended quantitative monitoring
  • Near-real-time accounting with sequential testing and pattern recognition for PM with isotopic data
  • Statistical sampling at the facility and country level, including random unannounced inspections
  • Methods such as the IAEA’s difference statistic for quantitative verification
  • Frequentist versus Bayesian inference as applied to testing for nuclear material loss
  • Tolerance intervals for controlling false alarm rates
  • Data-driven choices among estimators, for example, of random and systematic error variances
  • Metrology topics such as bottom-up UQ (first-principles, using a model of the assay procedure) for NDA and for destructive analysis by analytical chemistry; top-down UQ (empirical) using interlaboratory data and paired operator-inspector data; and ABC for both bottom-up and top-down UQ
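As a concrete illustration of the sequential-testing topic above, the sketch below applies a one-sided Page's CUSUM test to a hypothetical sequence of standardized material balances; the change point, shift size, and decision parameters `k` and `h` are illustration values, not recommended safeguards settings.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical sequence of standardized material balances (MUF divided by
# its standard deviation): in control for 10 periods, then a protracted
# loss shifts the mean upward by 2 standard deviations.
balances = np.concatenate([rng.normal(0.0, 1.0, 10),
                           rng.normal(2.0, 1.0, 10)])

def page_cusum(x, k=0.5, h=5.0):
    """One-sided Page's CUSUM: return the first alarm period (1-based), or None.

    k is the reference value (allowance per period), h the decision threshold.
    """
    s = 0.0
    for t, xt in enumerate(x, start=1):
        s = max(0.0, s + xt - k)
        if s > h:
            return t
    return None

alarm = page_cusum(balances)
print(alarm)
```

The cumulative statistic resets toward zero while balances fluctuate around zero and accumulates once the shift begins, so the alarm arrives a few periods after the change, which is the near-real-time detection behavior that motivates sequential testing over one-shot annual evaluation.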

Papers are published upon acceptance, regardless of the Special Issue publication date.

Authors can submit their manuscripts through the journal's Manuscript Tracking System.

Submission Deadline: Friday, 6 July 2018
Publication Date: November 2018


Lead Guest Editor

  • Tom Burr, International Atomic Energy Agency, Vienna, Austria

Guest Editors