Advances in Meteorology
Volume 2010 (2010), Article ID 432160, 10 pages
http://dx.doi.org/10.1155/2010/432160
Review Article

Beating the Uncertainties: Ensemble Forecasting and Ensemble-Based Data Assimilation in Modern Numerical Weather Prediction

Department of Atmospheric Sciences, University of Utah, 135 S 1460 E, Rm. 819, Salt Lake City, UT 84112, USA

Received 1 January 2010; Revised 31 March 2010; Accepted 3 June 2010

Academic Editor: Hann-Ming Henry Juang

Copyright © 2010 Hailing Zhang and Zhaoxia Pu. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Accurate numerical weather forecasting is of great importance. Due to inadequate observations, our limited understanding of the physical processes of the atmosphere, and the chaotic nature of atmospheric flow, uncertainties always exist in modern numerical weather prediction (NWP). Recent developments have shown that ensemble forecasting and ensemble-based data assimilation are promising ways to beat the forecast uncertainties in NWP. This paper gives a brief overview of fundamental problems and recent progress associated with ensemble forecasting and ensemble-based data assimilation. The usefulness of these methods in improving high-impact weather forecasting is also discussed.

1. Introduction

Numerical weather prediction (NWP) is an initial value problem: it forecasts the atmospheric state by integrating a numerical model with given initial conditions. Commonly, two fundamental factors account for an accurate numerical weather forecast: 1) the present state of the atmosphere must be characterized as accurately as possible; 2) the intrinsic laws, according to which the subsequent states develop out of the preceding ones, must be known [1]. These so-called laws are composed of a set of partial differential equations, including the conservation laws of momentum, mass, and energy.

Since the first successful NWP in the early 1950s by Charney et al. [2], much progress has been made in enhancing the skill of NWP. This includes efforts to improve initial conditions through advances in observing systems and the development of atmospheric data assimilation techniques. Many studies have also been devoted to improving numerical modeling with advanced numerical methods, better representation of the dynamical processes of the atmosphere, and improved physical parameterization schemes [3-5]. Today, NWP has become a major forecasting tool in many operational centers around the world.

However, due to inadequate observations, our limited understanding of the physical processes of the atmosphere, and the chaotic nature of the atmospheric flow, uncertainties always exist in both initial conditions and numerical models. Thus, reducing forecast errors caused by these uncertainties remains a large area of research and operational implementation.

Recent developments have proved that ensemble forecasting and ensemble-based data assimilation are promising ways to beat the forecast uncertainties in NWP. The objective of this paper is to give a brief overview of the fundamental problems and recent progress associated with ensemble forecasting and ensemble-based data assimilation. The usefulness of these methods in improving high-impact weather forecasting is also discussed.

The paper is organized as follows. Section 2 addresses the fundamental concepts of atmospheric predictability; Section 3 introduces stochastic theory and ensemble weather forecasting; Section 4 describes Bayes' theorem and ensemble Kalman filtering data assimilation; Section 5 addresses the implementation and practical issues associated with the ensemble Kalman filter; Section 6 briefly summarizes current applications of ensemble forecasting and ensemble-based data assimilation methods to high-impact weather prediction; and a summary and concluding remarks are presented in Section 7.

2. Forecast Uncertainties and Predictability

2.1. Predictability

Predictability refers to the extent to which the future state of the atmosphere or a specific weather system may be predicted given the current capability of NWP. Corresponding to the two fundamental factors mentioned above that influence the numerical forecast, there are two kinds of predictability, as addressed by Lorenz [6]: 1) attainable predictability, which is limited by the inaccuracy of measurements, and 2) practical predictability, which is limited by our inability to express the precise equations of atmospheric motion and physical processes in the numerical model. The errors in measurement include instrumental errors and errors due to interpolation over regions where there are no measurements at all. These errors can be decreased by enlarging our network of observation stations and improving the techniques of interpolation or data assimilation. Errors in the model equations depend largely on the computational methods used to solve the equations, our current ability to understand the physical processes, and the model resolution available to resolve these physical processes.

2.2. The Unpredictable Nature of the Atmosphere

While attainable and practical predictabilities are associated with uncertainties in the initial conditions and imperfect models, what would the predictability be if the model (dynamical and physical processes) were perfect and the initial conditions accurate? Lorenz [6, 7] asserted that the atmosphere, as a kind of unstable dynamical system, has a finite limit of predictability that depends upon the particular flow. As is well known, Lorenz [6] found that slight departures in the initial conditions, however small, would evolve into totally different atmospheric states in the numerical forecasts.

The chaotic nature of the atmosphere means that the predictability of the model depends not only upon the realism of the model and the accuracy of initial conditions but also upon the system itself. Atmospheric motion, as a nonlinear dynamical system, has a finite limit of predictability. These chaotic characteristics determine the extent to which the atmosphere can be predicted. The number of days we can forecast accurately in advance depends upon the evolution of the atmosphere.

Figure 1 shows the motion trajectories of stable and unstable dynamical systems. In Figure 1(a), the trajectories drift away from each other although the initial conditions are very close; in Figure 1(b), the trajectories of a stable system stay close to each other with time. This suggests that intrinsic predictability is lost for the unstable flow shown in Figure 1(a): even two very close initial conditions may result in markedly different outcomes. Since we do not know the true atmospheric state, we have no way to ascertain the true value from those totally different forecast states. This poses a great challenge in numerical weather forecasting.

Figure 1: The evolutions of slightly different initial states (a) unstable trajectories; (b) stable trajectories [courtesy of Lorenz (1963)].

Figure 2 illustrates a concrete scenario in the Lorenz 63 model [6]. Given a cloud of close initial states, the trajectories depart from each other after the model is integrated forward for only 10 seconds. This suggests that, in a nonlinear system, it is almost impossible to predict even in which lobe the states will be.
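As an illustration (a minimal sketch, not the paper's experiment), integrating the Lorenz 63 equations from two nearly identical initial states shows the initial separation growing by many orders of magnitude; the initial state, perturbation size, and integration length below are arbitrary choices:

```python
import numpy as np

def lorenz63(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the Lorenz (1963) system."""
    x, y, z = s
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def integrate(s, dt=0.01, steps=2000):
    """Integrate forward with fourth-order Runge-Kutta."""
    for _ in range(steps):
        k1 = lorenz63(s)
        k2 = lorenz63(s + 0.5 * dt * k1)
        k3 = lorenz63(s + 0.5 * dt * k2)
        k4 = lorenz63(s + dt * k3)
        s = s + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
    return s

x0 = np.array([1.0, 1.0, 1.05])
a = integrate(x0)                               # "control" trajectory
b = integrate(x0 + np.array([1e-6, 0.0, 0.0]))  # tiny initial perturbation
print(np.linalg.norm(a - b))                    # grows by orders of magnitude
```

Despite the perturbation being one part in a million, the two trajectories become essentially uncorrelated over the integration window, which is exactly the sensitivity Figure 2 depicts.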

Figure 2: The evolutions of slightly different initial states in the Lorenz-63 model. Red stars represent initial states; blue circles represent states after 10 seconds. (The figure follows that of Palmer in [8].)

The uncertain properties of the atmospheric system call for more suitable methods to represent the initial conditions and forecast the atmospheric states, instead of the traditional approach of describing the initial values with a single best analysis state and integrating that single best guess forward. This will be illustrated in Sections 3 and 4.

3. The Stochastic Prediction and Ensemble Forecasting

In view of the uncertain properties of the atmospheric system, a theory of stochastic dynamic prediction was proposed by Epstein [9]. In a stochastic context, the initial and forecast states of the atmosphere are represented as probability distributions. That is, the probability density function (PDF) of the present model state should be estimated first according to all the prior information and available observations; then, a method for forecasting the evolution of this PDF forward in time is needed. Based on stochastic dynamic prediction, it is possible to make probabilistic forecasts in addition to a deterministic forecast using a single model with single initial conditions. Although Epstein's early experiments are very different from the ensemble forecasting done today, the theory of stochastic dynamic prediction offers a stepping stone toward the development of ensemble forecasting.

The advance in parallel processing computers and improved operational forecasting systems—improvements in both model physics and data assimilation—led to operational stochastic dynamic prediction at the European Centre for Medium-Range Weather Forecasts (ECMWF), the U.S. National Centers for Environmental Prediction (NCEP), and the Meteorological Service of Canada (MSC) in the early 1990s. These operational stochastic prediction systems are referred to as ensemble forecasting systems. Instead of using only one model with a single set of initial conditions, a group of forecasts with slightly different initial conditions is made in an ensemble forecast. The approach to ensemble prediction used at operational centers exhibits subtle differences when compared with the standard Monte Carlo method used in stochastic dynamic prediction. In the Monte Carlo method, it is assumed that the initial probability density function (PDF) is known and that it is sampled randomly. In most methods used in current ensemble forecasting, the PDF is generally not sampled in a random way. Different operational ensemble systems generate their initial perturbations in different ways, including the following.

(i) Breeding of Growing Modes (BGM): Developed by Toth and Kalnay [10, 11], the BGM scheme is a simple and inexpensive method to generate the initial perturbation. It is designed to capture the fast-growing error modes (growing modes) associated with convection or baroclinic instability. The BGM consists of the following steps: (a) add a small arbitrary perturbation to the atmospheric analysis; (b) integrate the model for 6 hours from both the unperturbed (control) and the perturbed initial conditions; (c) subtract the 6-hour control forecast from the perturbed forecast; (d) scale down the difference field so that it has the same size (in RMS) as the initial perturbation; and (e) repeat the above process in time. Thus, the perturbation evolves along with the time-dependent analysis fields, ensuring that after a few days of cycling the perturbation field consists of a superposition of fast-growing modes corresponding to the contemporaneous atmosphere, akin to local Lyapunov vectors.

(ii) Singular vector (SV) method: ECMWF developed and implemented the singular vector scheme [12, 13], which is based on the observation that perturbations pointing along different axes of the phase space of the system are characterized by different amplification rates. Given an initial uncertainty, perturbations along the directions of maximum growth amplify more than those along other directions. For defining the SVs in the ensemble prediction system, growth is measured by a norm based on total energy. The SVs are computed by solving an eigenvalue problem, defined by an operator that combines the tangent forward and adjoint models, integrated over a time period named the optimization time interval. The advantage of using singular vectors is that, if the forecast error evolves linearly and the proper initial norm is used, the resulting ensemble captures the largest amount of forecast-error variance at optimization time [14].

(iii) Perturbed-observation approach: The MSC perturbed-observation approach attempts to obtain a representative ensemble of perturbations by comprehensively simulating the behavior of errors in the forecasting system. The system is based on an ensemble of data assimilation systems using perturbed observations. Because the analysis and forecast process is repeated several times with different random input, the perturbed-observation method is a classic example of the Monte Carlo approach. Arguments for the use of nonselective, purely random ensemble perturbations are presented by Houtekamer et al. [15] and Anderson [16].

(iv) Ensemble transform Kalman filter (ETKF): The ETKF was first introduced as an adaptive sampling method [17]. Its formulation is based on the application of a Kalman filter, with the forecast and analysis covariance matrices represented by ensembles of forecast and analysis perturbations. Thus, it produces analysis perturbations (initial perturbations for the ensemble) in ensemble representation, based on the ensemble forecast from the previous cycle and the observations. It has been argued that the ETKF makes perturbations more independent and flow dependent [18].
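As a toy illustration (not an operational implementation), the breeding (BGM) cycle described above can be sketched with the Lorenz 63 equations standing in for the forecast model; the perturbation size, integration length per cycle, and number of cycles are arbitrary choices:

```python
import numpy as np

def lorenz63(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def step(s, dt=0.01, n=50):
    # crude RK4 integration standing in for one "6-hour" forecast
    for _ in range(n):
        k1 = lorenz63(s); k2 = lorenz63(s + 0.5 * dt * k1)
        k3 = lorenz63(s + 0.5 * dt * k2); k4 = lorenz63(s + dt * k3)
        s = s + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
    return s

rng = np.random.default_rng(0)
size = 1e-3                                  # fixed RMS size of the bred perturbation
control = np.array([1.0, 1.0, 1.05])
pert = rng.normal(size=3)
pert *= size / np.linalg.norm(pert)          # (a) small arbitrary initial perturbation

for cycle in range(40):
    f_ctrl = step(control)                   # (b) integrate the control ...
    f_pert = step(control + pert)            # ... and the perturbed forecast
    diff = f_pert - f_ctrl                   # (c) subtract the control forecast
    pert = diff * size / np.linalg.norm(diff)  # (d) rescale to the initial size
    control = f_ctrl                         # (e) repeat from the new state

print(pert)  # after cycling, points along a locally fast-growing direction
```

The rescaling in step (d) is what "breeds" the perturbation: slowly growing components are repeatedly damped relative to fast-growing ones, so the surviving vector aligns with the fastest-growing local error mode.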

All of the methods discussed above only include perturbations in the initial conditions, assuming that the error growth due to model deficiencies is small compared to that due to unstable growth of initial errors. However, in reality, uncertainties in model physical parameterizations cannot be ignored in many cases. Therefore, in addition to the aforementioned initial perturbation methods, ensemble forecast systems have also been designed to account for model errors and model uncertainty. Current methods and progress include the multimodel ensemble (see, e.g., [19, 20]), stochastic physical parameterizations (see, e.g., [21-23]), nonlocal stochastic-dynamic parameterization schemes [24], kinetic energy backscatter [25], performing ensemble simulations with different time steps to study the impact of model truncation error [26], and using different parameterizations within the ensemble prediction system [27]. Krishnamurti et al. [19] commented that multimodel ensemble forecasts show superior forecast skill compared to any of the individual models used. Reynolds et al. [23] illustrated that a stochastic convection scheme improves the ensemble performance in the tropics.

Since ensemble forecasting takes account of the uncertainties in NWP, it has major advantages over a single deterministic forecast [28], including the following.

(i) It improves the forecasting skill by reducing the nonlinear error growth and averaging out unpredictable components.

(ii) It predicts the skill by relating it to the agreement among ensemble forecast members. If the ensemble forecasts differ greatly from each other, it is clear that at least some of them are wrong, whereas if there is good agreement among the forecasts, there is more reason to be confident about the forecast.

(iii) It provides an objective basis for forecasts in a probabilistic form. In a chaotic system such as the atmosphere, probabilistic information is recognized as the optimum format for weather forecasts from both a scientific and a user perspective.

In addition, ensemble forecasts also show potential value in new areas of research, such as targeted weather observations (see, e.g., [29]) and data assimilation (see the next section).

4. Bayes Theorem and Ensemble-based Data Assimilation

As mentioned in the previous section, uncertainties of the initial conditions are the major source of error in NWP. Thus, improved data assimilation techniques will be useful to beat the uncertainties in the initial conditions. We continue this subject with the stochastic dynamic prediction.

4.1. Bayes Theorem of Data Assimilation

In a stochastic context, the initial and forecast states of the atmosphere are represented as probability distributions. Therefore, the probability density function of the present model state should be estimated first according to all the prior information and available observations, and then a method for forecasting the evolution of this PDF forward in time is needed. Obtaining the current PDF is usually addressed with Bayesian data assimilation theory [30, 31].

In the application of data assimilation, Bayes' theorem can be expressed as

p(x_t | y_t) = p(y_t | x_t) p(x_t) / p(y_t),   (1)

where p(x_t | y_t) denotes the probability density of the model state at time t given the observations; x and y are the state and observation variables; p(y_t | x_t) denotes the probability density of the observations at time t given the state; and p(x_t) is viewed as a kind of prior and represents the probability density of the prior ensemble forecast at time t. The denominator p(y_t) is a kind of normalization for guaranteeing that the total probability of all possible states is 1.

As shown above, (1) describes the way in which new observations are incorporated to modify the prior conditional probability density available from predictions based on earlier observations.

As an example, for a Gaussian probability density, the prior is

p_b(x) = (1 / (sqrt(2 pi) sigma_b)) exp(-(x - mu_b)^2 / (2 sigma_b^2)),   (2)

where mu_b and sigma_b are the mean and standard deviation, respectively. The subscript "b" denotes the "prior" (background) state.

The observation PDF is given as p(y_o | x), where the Gaussian probability density function, given the observed value y_o and standard deviation (observation error) sigma_o, is

p(y_o | x) = (1 / (sqrt(2 pi) sigma_o)) exp(-(y_o - x)^2 / (2 sigma_o^2)).   (3)

Dividing the product of the prior and the observation probability densities by the normalization denominator gives the posterior PDF, as shown in Figure 3.

Figure 3: The observations (red curve), prior (green curve), and posterior (blue curve) probability density function given Gaussian error [32].

After the above process, we obtain the posterior (analysis) estimate

mu_a = (sigma_o^2 mu_b + sigma_b^2 y_o) / (sigma_b^2 + sigma_o^2),

sigma_a^2 = (1/sigma_b^2 + 1/sigma_o^2)^(-1),

where mu_b and sigma_b are the prior mean and standard deviation, y_o is the observation with error standard deviation sigma_o, and mu_a and sigma_a are the mean and standard deviation of the posterior PDF.
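As an illustration (a minimal sketch, not from the paper), the Gaussian prior-times-likelihood update can be written in a few lines; the numbers below are arbitrary toy values:

```python
import numpy as np

def gaussian_product(mu_b, sigma_b, y_o, sigma_o):
    """Posterior mean and standard deviation for a Gaussian prior
    combined with a Gaussian observation likelihood."""
    var_a = 1.0 / (1.0 / sigma_b**2 + 1.0 / sigma_o**2)   # posterior variance
    mu_a = var_a * (mu_b / sigma_b**2 + y_o / sigma_o**2)  # precision-weighted mean
    return mu_a, np.sqrt(var_a)

# prior N(1, 2^2), observation y_o = 3 with error sigma_o = 1
mu_a, sigma_a = gaussian_product(mu_b=1.0, sigma_b=2.0, y_o=3.0, sigma_o=1.0)
print(mu_a, sigma_a)  # → 2.6 0.894...: the posterior lies between prior and
                      #   observation, closer to the more certain of the two
```

Note that the posterior standard deviation is smaller than both the prior and the observation error, reflecting the information gained by combining the two sources.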

4.2. Monte Carlo Method

Although the posterior PDF can be derived as in the previous section, it is not easy to express the PDFs of the observations and the prior information explicitly in real operational implementations. Therefore, it is difficult to obtain the posterior PDF of the initial conditions directly from Bayes' theorem.

Fortunately, the Monte Carlo method provides an effective approach to simulate the desired PDF with a random sample and, to some extent, to address the uncertainties of the initial conditions. However, the Monte Carlo method is effective only under the assumption that the number of sample members is sufficiently large to represent the PDF suitably. Consequently, the difficulty comes with the "large sample." For instance, for a typical real model with 10^7 degrees of freedom, estimating the PDF involves calculations with 10^7 x 10^7 covariance matrices. That is prohibitively demanding even with the most recent computational advances.
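The dependence of Monte Carlo estimates on sample size can be illustrated with a toy experiment (not from the paper): estimating a known variance from samples of different sizes shows the sampling error shrinking roughly like 1/sqrt(n):

```python
import numpy as np

rng = np.random.default_rng(42)
mean_err = {}

for n in (10, 100, 10000):
    # estimate the variance of a standard normal (true value 1.0)
    # from n samples, repeated 500 times to average the error
    errs = [abs(np.var(rng.standard_normal(n), ddof=1) - 1.0) for _ in range(500)]
    mean_err[n] = float(np.mean(errs))
    print(n, mean_err[n])  # mean error shrinks roughly like 1/sqrt(n)
```

The same scaling applies to covariance estimates: halving the error requires four times as many members, which is why affordable ensemble sizes fall far short of what a faithful PDF estimate for a 10^7-dimensional state would demand.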

Figure 4 shows schematically the forecast results when there are too few sample members for the estimation. The mean forecast drifts away from the truth with time.

Figure 4: Monte Carlo forecast with finite sample.
4.3. Ensemble Kalman Filter

Considering the limitations of the traditional Bayes and Monte Carlo methods, a more practical technique is needed. With recent developments, ensemble Kalman filter data assimilation techniques, originating from the basic idea of Monte Carlo theory and the well-known Kalman filter method, have been successfully applied in many research and operational practices.

4.3.1. Basic Equations of Kalman Filter

As a sequential data assimilation method, the implementation of the Kalman filter [5, 33-35] includes two steps, named the forecast step and the analysis step. The model is integrated forward in time, and the model state is updated by assimilating new observations whenever they become available.

The Kalman filter assumes that the prior conditional probability distribution is Gaussian and expresses it with its mean and covariance.

The analysis equation is

x_a = x_b + K (y_o - H x_b),   (7)

where x_a is the analysis variable, x_b is the background field (prior estimate), and y_o denotes the observations. H, called the observation operator, connects the true state x_t with the observations within particular measurement errors: y_o = H x_t + epsilon_o. K is the so-called gain matrix:

K = P_b H^T (H P_b H^T + R)^(-1),   (8)

where P_b is the background error covariance and R is the observation error covariance.

In an extended Kalman filter, one must integrate the tangent linear model in each forecast step to evolve the flow-dependent forecast error covariance P_b:

x_b(t_{i+1}) = M[x_a(t_i)],   (9)

P_b(t_{i+1}) = L P_a(t_i) L^T + Q,   (10)

where M is the forecast model, L is its tangent linear model, Q is the model error covariance, and the analysis error covariance is given as

P_a = (I - K H) P_b.   (11)

Equations (9) and (10) constitute the forecast step, and (11) and (7) the analysis step.
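As an illustrative sketch (not the paper's code), the analysis step defined by equations (7), (8), and (11) can be written directly; the two-variable state and the matrices H, R, and P_b below are arbitrary toy values:

```python
import numpy as np

def kalman_analysis(x_b, P_b, y_o, H, R):
    """Kalman filter analysis step: gain matrix, state update,
    and analysis error covariance."""
    K = P_b @ H.T @ np.linalg.inv(H @ P_b @ H.T + R)  # gain matrix (8)
    x_a = x_b + K @ (y_o - H @ x_b)                   # analysis equation (7)
    P_a = (np.eye(len(x_b)) - K @ H) @ P_b            # analysis covariance (11)
    return x_a, P_a

# two-variable toy state, observing only the first component
x_b = np.array([1.0, 2.0])
P_b = np.array([[1.0, 0.5], [0.5, 1.0]])
H = np.array([[1.0, 0.0]])
R = np.array([[1.0]])

x_a, P_a = kalman_analysis(x_b, P_b, np.array([2.0]), H, R)
print(x_a)  # → [1.5 2.25]: the unobserved second component is also
            #   adjusted, through the background error covariance
```

The update of the unobserved component via the off-diagonal covariance term is the essential mechanism that ensemble methods later exploit with flow-dependent, ensemble-estimated covariances.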

4.3.2. Ensemble Kalman Filter Theory

There are two main drawbacks of the extended Kalman filter [36, 37]. One is that the simplified closure scheme used for estimating the error covariance, which neglects the third- and higher-order moments, can result in unbounded error growth. The other is that the extended Kalman filter incurs an expensive computational cost because of the requirement to evolve the error covariance matrix of the model forecast. As described in Section 4.3.1, the extended Kalman filter requires integrating the tangent linear model forward to estimate the error covariance, hence the expensive computational cost.

In theory, the error covariance of the forecast estimate (background) is defined as

P_b = E[(x_b - x_t)(x_b - x_t)^T],   (12)

where x_t is the true state and E[.] denotes the expectation.

However, we never know the true atmospheric state x_t. This makes the estimation of the background error covariance very difficult.

From Section 3, we have already learned that the ensemble mean can serve as the best estimate of the true state. Using this, the ensemble Kalman filter employs a group of ensemble members to represent the covariance statistics of the analyzed state. The ensemble is integrated in the nonlinear model to obtain a sample of the prior distribution at the next time when observations are available, and the forecast error covariance is estimated as

P_b ~ (1/(N-1)) sum_{i=1}^{N} (x_b^i - <x_b>)(x_b^i - <x_b>)^T,   (13)

where N is the ensemble size, x_b^i is the ith ensemble forecast, and <x_b> is the ensemble mean.

Equation (13) indicates that a flow-dependent error covariance of forecast estimation can be obtained by using ensemble forecasting in practical implementation.
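A minimal sketch (with an arbitrary two-variable toy state, not from the paper) of the ensemble estimate of the forecast error covariance in (13):

```python
import numpy as np

def ensemble_covariance(X_b):
    """Sample forecast-error covariance from an ensemble, as in (13).

    X_b has shape (n_state, n_members): each column is one ensemble forecast."""
    n = X_b.shape[1]
    perturbations = X_b - X_b.mean(axis=1, keepdims=True)  # deviations from mean
    return perturbations @ perturbations.T / (n - 1)

# draw a 50-member toy ensemble from a known 2-variable distribution
rng = np.random.default_rng(1)
true_cov = np.array([[2.0, 0.8], [0.8, 1.0]])
X_b = rng.multivariate_normal([0.0, 0.0], true_cov, size=50).T

P_b = ensemble_covariance(X_b)
print(P_b)  # approaches true_cov as the ensemble size grows
```

Because the covariance is rebuilt from the latest ensemble forecasts at every cycle, it is automatically flow dependent, in contrast to the static background covariances used in traditional variational schemes.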

5. Implementation and Practical Issues on Ensemble Kalman Filter

5.1. Implementation of Ensemble Kalman Filters

Since the first attempt by Evensen [37], ensemble Kalman filter methods have developed rapidly and are widely used in data assimilation applications. There are two classes of basic approaches to implementing the ensemble Kalman filter, referred to as the method with perturbed observations and the square root filter (without perturbed observations). The perturbed-observation algorithm updates each ensemble member with a different set of observations perturbed with random noise. Because randomness is introduced in every assimilation cycle, the update is considered stochastic. The square root filter methods do not add stochastic noise to the observations and are called deterministic algorithms. Evensen [37], Evensen and Van Leeuwen [38], as well as Houtekamer and Mitchell [39] originally implemented the ensemble Kalman filter with perturbed observations. Anderson [40], Bishop et al. [17], Baek et al. [41], Corazza et al. [42], Hunt et al. [43], Miyoshi and Yamane [44], Harlim and Hunt [45], as well as Yang et al. [46] contributed various square root filter algorithms, including an ensemble adjustment Kalman filter [40], an ensemble transform Kalman filter [17], a local ensemble Kalman filter (LEKF; see [41, 42]), and a local ensemble transform Kalman filter (LETKF; see [43-46]). Whitaker and Hamill [47] indicated that the perturbed-observation approach may introduce an additional kind of sampling error; thus the square root algorithms are more accurate for a given ensemble size.

5.2. Comparison of the Ensemble Kalman Filter with 4DVar

Since the ensemble Kalman filter is becoming part of the operational choice, progress has been made in comparing it with other advanced data assimilation methods that are currently available. Specifically, the four-dimensional variational data assimilation (4DVar) method has been widely adopted in operational centers around the world. Owing to its capability to assimilate asynchronous observations and high-resolution observations such as satellite radiances and radar reflectivity, the 4DVar method is indeed helpful for improving current numerical forecasting [48-50]. However, the requirement for tangent linear and adjoint models makes the 4DVar method complicated to implement. Compared to 4DVar, the major merit of the ensemble Kalman filter is its simplicity of implementation. It does not require the development and maintenance of tangent linear and adjoint models, and it is model independent: one can easily switch to other models using ensemble methods [51, 52]. In addition, the ensemble Kalman filter represents and propagates the forecast covariance using the ensemble sample without much effort. The main disadvantage of the ensemble Kalman filter is the sampling problem: a small ensemble size introduces sampling errors into the estimation of the background error covariance. In practice, covariance inflation is employed to compensate for this sampling error.

Fertig et al. [53] studied the performance of 4DVar and 4D-LETKF in assimilating asynchronous observations using the Lorenz 96 model [54]. Both schemes have comparable errors when the 4D-LETKF is cycled frequently and when 4DVar is performed over a sufficiently long analysis time window. Yang et al. [55] explored the relative advantages and disadvantages of 4DVar and the LEKF using a quasigeostrophic model and asserted that the LEKF did better in both computational cost and accuracy when assimilating the same rawinsonde observations. Buehner et al. [56] evaluated the operational performance of both methods at Environment Canada using the same model and observations and obtained equivalent forecast scores. Kalnay et al. [51] offered a comprehensive comparison between 4DVar and the ensemble Kalman filter. Based on results obtained using operational models and both simulated and real observations, they concluded that the ensemble Kalman filter is becoming competitive with 4DVar, and that the experience acquired with each of these methods can be used to improve the other.

In brief, due to its simple implementation and equivalent ability compared to 4DVar, the ensemble Kalman filter is becoming an attractive operational choice in more centers. However, the current ability of the ensemble Kalman filter is not equal to that of 4DVar in terms of assimilating satellite and radar observations. In order to utilize advantages from both methods, a hybrid approach, originally proposed by Hamill and Snyder [57], has received significant attention. Lorenc [52] asserted that hybrid approaches of variational methods and ensemble methods would be better than either single approach. Buehner et al. [56] showed that a hybrid approach based on 4DVar but using forecast covariance error estimation from the ensemble Kalman filter gave an improvement in 5-day forecasts in the southern hemisphere.

From the results of current studies, the hybrid method of the ensemble Kalman filter and 4DVar has a promising future since it combines the advantages of both methods and eliminates the existing disadvantages.

5.3. Nonlinear Issues in Ensemble Kalman Filter

Previous studies have shown that the ensemble Kalman filter is capable of handling data assimilation in nonlinear systems (e.g., [58]). However, nonlinearity remains an important issue in the implementation of the ensemble Kalman filter. The equations introduced in Section 4.3.2 are valid only when the error PDFs are Gaussian. Unfortunately, in reality, even if the error PDF were Gaussian at the initial time, it would become non-Gaussian as the model is integrated forward, owing to strong model nonlinearity. In that case, the error PDFs cannot be represented by a Gaussian function. Moreover, it is operationally impractical to work with non-Gaussian error PDFs directly, even though estimating them from the current ensemble may appear feasible.

There have been many studies devoted to dealing with the nonlinear and non-Gaussian problem, mainly focusing on the development and implementation of the ensemble Kalman filter. For instance, Van Leeuwen [59] presented a true variance minimizing filter method. Its performance was tested with the Korteweg-de Vries equation and a quasigeostrophic model, and he reported that the method works satisfactorily in a strongly nonlinear system. Hoteit et al. [60] evaluated a new particle-type filter based on a Gaussian mixture representation of the state PDFs using the Lorenz 96 model and discussed its application in real meteorological and oceanographic models. Yang and Kalnay [61] applied an outer loop in the LETKF to handle the nonlinear problem with the Lorenz 63 model. Results indicated that the LETKF with an outer loop can use a longer assimilation window and improve the analysis accuracy during highly nonlinear periods.

6. Applications of Ensemble Forecasting and Ensemble Kalman Filters to High-Impact Weather Prediction

Owing to their advantages in beating the uncertainties and dealing with the nonlinearity, ensemble forecasting and ensemble-based data assimilation have received a lot of attention in the research and operational communities during the last decade. Specifically, they have been applied to high-impact weather forecasting, and many studies have documented results from these applications. Ensemble forecasting has been used in short-range ensemble forecasting (SREF) [62-67], tropical cyclone forecasts [68, 69], flood warning [70], and so forth. Ensemble-based Kalman filtering techniques have also been applied to the study and numerical simulation of hurricanes (see, e.g., [71]) and to storm-scale forecasts at high resolution (see, e.g., [72, 73]).

Du et al. [62] applied ensemble forecasting to quantitative precipitation forecasting (QPF). They found a remarkable reduction in the root-mean-square error of QPF due to the ensemble application and asserted that the improvements from SREF techniques exceed those obtained by doubling the resolution. After a short-range ensemble forecasting system was implemented operationally in real time at NCEP in 2001 [65], Du et al. [66] added another 6 members, generated from the Weather Research and Forecasting (WRF) model, to the ensemble and obtained forecast improvements with increased ensemble spread. Yuan et al. [67] studied QPFs and probabilistic QPFs (PQPFs) over the southwest United States, an area marked by highly heterogeneous topography and diverse vegetation.

The hurricane track forecasting by Zhang and Krishnamurti [68] showed that ensemble forecasts are superior to the results from single-model control experiments and that track position errors are largely reduced by the ensemble prediction. Mackey and Krishnamurti [70] combined ensemble forecasts with a high-resolution regional spectral model to postpredict the track, intensity, and flooding precipitation arising from Typhoon Winnie in August 1997. They evaluated the effectiveness of the ensemble forecasting and found that the ensemble mean track would be superior only if the forecast uncertainty is properly sampled.

Zhang et al. [71] studied Hurricane Humberto (2007) using the ensemble Kalman filter method to assimilate Doppler radar radial velocity. Results indicated that the ensemble Kalman filter analysis improved the representation of the track and intensity of Humberto. Tong and Xue [72] and Xue et al. [73] used the ensemble Kalman filter method and radar reflectivity to correct errors in fundamental microphysical parameters that are of great importance to microphysics schemes. The results show that the ensemble Kalman filter successfully corrected model errors in microphysical parameters.

7. Concluding Remarks

NWP is an initial value problem: it forecasts the atmospheric state by integrating a numerical model with given initial conditions. Due to inadequate observations, our limited understanding of the physical processes of the atmosphere, and the chaotic nature of the atmospheric flow, uncertainties always exist in modern NWP. Enhancing the predictability has therefore become a key issue in improving the skill of NWP.

In this paper, ensemble forecasting and ensemble-based Kalman filter methods, both derived from the concept of stochastic prediction, have been reviewed. The main conclusions are as follows.

(i) Atmospheric motion, as an unstable system, has finite predictability. NWP is strongly sensitive to the initial conditions, and uncertainties in the model physical parameterizations also introduce errors. Owing to the strong nonlinearity and chaotic nature of atmospheric flow, some components remain unpredictable in reality.

(ii) Ensemble forecasting takes uncertainties in the initial conditions and/or model physical parameterizations into account, helping to produce forecasts that improve on a single deterministic forecast and also providing probabilistic forecasts.

(iii) The ensemble Kalman filter refines the Monte Carlo method and the traditional Kalman filter. It uses ensemble forecasts to estimate the flow-dependent error covariance of the forecast. Ensemble Kalman filters thus offer an effective data assimilation approach that improves the model initial conditions while taking uncertainties into account.
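The ensemble Kalman filter update summarized in (iii) can be sketched in a few lines. The following is a minimal stochastic (perturbed-observations) variant on a toy three-variable state; the state size, ensemble size, observation operator, and error statistics are illustrative assumptions, not those of any operational system.

```python
import numpy as np

def enkf_analysis(X, y, H, R, rng):
    """One stochastic (perturbed-observations) EnKF analysis step.

    X : (n, m) forecast ensemble (n state variables, m members)
    y : (p,)   observation vector
    H : (p, n) linear observation operator
    R : (p, p) observation-error covariance
    """
    n, m = X.shape
    A = X - X.mean(axis=1, keepdims=True)    # ensemble anomalies
    Pf = A @ A.T / (m - 1)                   # flow-dependent forecast error covariance
    K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)   # Kalman gain
    # Perturb the observations so the analysis ensemble keeps realistic spread.
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, size=m).T
    return X + K @ (Y - H @ X)               # analysis ensemble

rng = np.random.default_rng(1)
X = rng.normal(0.0, 1.0, size=(3, 20))       # 3 state variables, 20 members
H = np.array([[1.0, 0.0, 0.0]])              # observe the first variable only
R = np.array([[0.1]])                        # observation-error variance
Xa = enkf_analysis(X, np.array([0.5]), H, R, rng)
```

After the update, the spread of the observed variable shrinks and its ensemble mean moves toward the observation, while unobserved variables are adjusted through the sample covariances in Pf.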

Owing to their advantages in beating the uncertainties and handling the nonlinearity in NWP, ensemble forecasting and ensemble-based data assimilation have received considerable attention in the research and operational communities over the last decade. In particular, they have been applied to improve high-impact weather forecasting.

However, several issues remain outstanding. Because ensemble forecasting requires large computational resources, many operational ensemble systems are run at coarser resolutions than the corresponding high-resolution deterministic weather prediction models. Meanwhile, a small ensemble can undersample the background error covariance used in ensemble-based data assimilation. In addition, while perturbed initial conditions and varied physical parameterizations allow ensemble forecasts to account for both initial and model errors, there is as yet no consensus on which of these two approaches is more effective for accurate NWP in general. Moreover, although the ensemble Kalman filter has many advantages over current variational data assimilation systems, its use in operational forecasting has so far remained in a test phase, and more studies are needed to make it a more powerful tool for assimilating real observations. In the meantime, a hybrid variational and ensemble Kalman filter method could be a promising technique in the near future.
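The undersampling of the background covariance by a small ensemble can be seen in a simple synthetic test (illustrative only, unrelated to any specific system): sample correlations between two truly uncorrelated variables are far from zero when estimated from a typical ensemble size, which is why spurious long-distance covariances arise and why localization is commonly applied in practice.

```python
import numpy as np

rng = np.random.default_rng(2)

def mean_spurious_corr(m, trials=500):
    """Mean absolute sample correlation between two independent
    variables, estimated from m-member 'ensembles' (truth is zero)."""
    vals = []
    for _ in range(trials):
        x = rng.normal(size=m)
        y = rng.normal(size=m)
        vals.append(abs(np.corrcoef(x, y)[0, 1]))
    return float(np.mean(vals))

small = mean_spurious_corr(10)    # ensemble sizes common in practice
large = mean_spurious_corr(500)   # a much larger ensemble
print(small, large)               # the 10-member estimate is far noisier
```

The sampling noise in the correlation estimate shrinks roughly with the square root of the ensemble size, so modest increases in ensemble size reduce, but do not eliminate, the problem.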

Acknowledgments

The authors are grateful to three anonymous reviewers for comments that helped improve this manuscript. This study is supported by the U.S. National Science Foundation through Award no. ATM-0833985.

References

  1. V. Bjerknes, Dynamic Meteorology and Hydrography. Part II. Kinematics, Carnegie Institute, Gibson Bros, New York, NY, USA, 1911.
  2. J. G. Charney, R. Fjørtoft, and J. von Neumann, “Numerical integration of the barotropic vorticity equation,” Tellus, vol. 2, pp. 237–254, 1950.
  3. P. Thompson, Numerical Weather Analysis and Prediction, The Macmillan Company, New York, NY, USA, 1961.
  4. G. J. Haltiner, Numerical Weather Prediction, John Wiley & Sons, New York, NY, USA, 1971.
  5. E. Kalnay, Atmospheric Modeling, Data Assimilation and Predictability, Cambridge University Press, Cambridge, UK, 2003.
  6. E. N. Lorenz, “The predictability of hydrodynamic flow,” Transactions of the New York Academy of Sciences, Series II, vol. 25, no. 4, pp. 409–432, 1963.
  7. E. N. Lorenz, “A study of the predictability of a 28-variable atmospheric model,” Tellus, vol. 17, pp. 321–333, 1965.
  8. T. N. Palmer, “Predictability of weather and climate: from theory to practice,” in Predictability of Weather and Climate, chapter 1, pp. 1–29, Cambridge University Press, Cambridge, UK, 2006.
  9. E. S. Epstein, “Stochastic dynamic prediction,” Tellus, vol. 21, pp. 739–759, 1969.
  10. Z. Toth and E. Kalnay, “Ensemble forecasting at NCEP: the generation of perturbations,” Bulletin of the American Meteorological Society, vol. 74, pp. 2317–2330, 1993.
  11. Z. Toth and E. Kalnay, “Ensemble forecasting at NCEP and the breeding method,” Monthly Weather Review, vol. 125, no. 12, pp. 3297–3319, 1997.
  12. R. Buizza and T. N. Palmer, “The singular-vector structure of the atmospheric global circulation,” Journal of the Atmospheric Sciences, vol. 52, no. 9, pp. 1434–1456, 1995.
  13. F. Molteni, R. Buizza, T. N. Palmer, and T. Petroliagis, “The ECMWF ensemble prediction system: methodology and validation,” Quarterly Journal of the Royal Meteorological Society, vol. 122, no. 529, pp. 73–119, 1996.
  14. M. Ehrendorfer and J. J. Tribbia, “Optimal prediction of forecast error covariances through singular vectors,” Journal of the Atmospheric Sciences, vol. 54, no. 2, pp. 286–313, 1997.
  15. P. L. Houtekamer, L. Lefaivre, J. Derome, H. Ritchie, and H. L. Mitchell, “A system simulation approach to ensemble prediction,” Monthly Weather Review, vol. 124, no. 6, pp. 1225–1242, 1996.
  16. J. L. Anderson, “The impact of dynamical constraints on the selection of initial conditions for ensemble predictions: low-order perfect model results,” Monthly Weather Review, vol. 125, no. 11, pp. 2969–2983, 1997.
  17. C. H. Bishop, B. J. Etherton, and S. J. Majumdar, “Adaptive sampling with the ensemble transform Kalman filter. Part I: theoretical aspects,” Monthly Weather Review, vol. 129, no. 3, pp. 420–436, 2001.
  18. M. Wei, Z. Toth, R. Wobus, and Y. Zhu, “Initial perturbations based on the ensemble transform (ET) technique in the NCEP global operational forecast system,” Tellus, vol. 60, no. 1, pp. 62–79, 2008.
  19. T. N. Krishnamurti, C. M. Kishtawal, et al., “Multimodel ensemble forecasts for weather and seasonal climate,” Journal of Climate, vol. 13, no. 23, pp. 4196–4216, 2000.
  20. V. V. Kharin and F. W. Zwiers, “Climate predictions with multimodel ensembles,” Journal of Climate, vol. 15, no. 7, pp. 793–799, 2002.
  21. R. Buizza, M. Miller, and T. N. Palmer, “Stochastic representation of model uncertainties in the ECMWF ensemble prediction system,” Quarterly Journal of the Royal Meteorological Society, vol. 125, no. 560, pp. 2887–2908, 1999.
  22. G. Shutts and T. N. Palmer, “The use of high-resolution numerical simulations of tropical circulation to calibrate stochastic physics schemes,” in Proceedings of the Simulation and Prediction of Intra-seasonal Variability with Emphasis on the MJO (ECMWF/CLIVAR '04), pp. 83–102, European Centre for Medium-Range Weather Forecasts, Reading, UK, 2004.
  23. C. A. Reynolds, J. Teixeira, and J. G. McLay, “Impact of stochastic convection on the ensemble transform,” Monthly Weather Review, vol. 136, no. 11, pp. 4517–4526, 2008.
  24. T. N. Palmer, “A nonlinear dynamical perspective on model error: a proposal for non-local stochastic-dynamic parametrization in weather and climate prediction models,” Quarterly Journal of the Royal Meteorological Society, vol. 127, no. 572, pp. 279–304, 2001.
  25. G. Shutts, “A kinetic energy backscatter algorithm for use in ensemble prediction systems,” Quarterly Journal of the Royal Meteorological Society, vol. 131, no. 612, pp. 3079–3102, 2005.
  26. J. Teixeira, C. A. Reynolds, and K. Judd, “Time step sensitivity of nonlinear atmospheric models: numerical convergence, truncation error growth, and ensemble design,” Journal of the Atmospheric Sciences, vol. 64, no. 1, pp. 175–189, 2007.
  27. P. L. Houtekamer, L. Lefaivre, J. Derome, H. Ritchie, and H. L. Mitchell, “A system simulation approach to ensemble prediction,” Monthly Weather Review, vol. 124, no. 6, pp. 1225–1242, 1996.
  28. M. S. Tracton and E. Kalnay, “Operational ensemble prediction at the National Meteorological Center: practical aspects,” Weather & Forecasting, vol. 8, no. 3, pp. 379–398, 1993.
  29. Z.-X. Pu and E. Kalnay, “Targeting observations with the quasi-inverse linear and adjoint NCEP global models: performance during FASTEX,” Quarterly Journal of the Royal Meteorological Society, vol. 125, no. 561, pp. 3329–3337, 1999.
  30. A. C. Lorenc, “Analysis methods for numerical weather prediction,” Quarterly Journal of the Royal Meteorological Society, vol. 112, pp. 1177–1194, 1986.
  31. T. M. Hamill, “Ensemble-based atmospheric data assimilation,” in Predictability of Weather and Climate, pp. 124–156, Cambridge University Press, Cambridge, UK, 2006.
  32. J. L. Anderson, “Data assimilation research testbed tutorial note,” 2006, http://www.image.ucar.edu/DAReS/DART/DART_Documentation.php#tutorial_simple.
  33. R. E. Kalman, “A new approach to linear filtering and prediction problems,” Transactions of the ASME: Journal of Basic Engineering, Series D, vol. 82, pp. 35–45, 1960.
  34. R. E. Kalman and R. Bucy, “New results in linear filtering and prediction theory,” Transactions of the ASME: Journal of Basic Engineering, Series D, vol. 83, pp. 95–108, 1961.
  35. G. Evensen, “The ensemble Kalman filter: theoretical formulation and practical implementation,” Ocean Dynamics, vol. 53, pp. 343–367, 2003.
  36. G. Evensen, “Using the extended Kalman filter with a multilayer quasi-geostrophic ocean model,” Journal of Geophysical Research, vol. 97, no. C11, pp. 17905–17924, 1992.
  37. G. Evensen, “Sequential data assimilation with a nonlinear quasi-geostrophic model using Monte Carlo methods to forecast error statistics,” Journal of Geophysical Research, vol. 99, no. C5, pp. 10143–10162, 1994.
  38. G. Evensen and P. J. van Leeuwen, “Assimilation of Geosat altimeter data for the Agulhas Current using the ensemble Kalman filter with a quasigeostrophic model,” Monthly Weather Review, vol. 124, no. 1, pp. 85–96, 1996.
  39. P. L. Houtekamer and H. L. Mitchell, “Data assimilation using an ensemble Kalman filter technique,” Monthly Weather Review, vol. 126, no. 3, pp. 796–811, 1998.
  40. J. L. Anderson, “An ensemble adjustment Kalman filter for data assimilation,” Monthly Weather Review, vol. 129, no. 12, pp. 2884–2903, 2001.
  41. S.-J. Baek, B. R. Hunt, E. Kalnay, E. Ott, and I. Szunyogh, “Local ensemble Kalman filtering in the presence of model bias,” Tellus, vol. 58, no. 3, pp. 293–306, 2006.
  42. M. Corazza, E. Kalnay, and S. C. Yang, “An implementation of the local ensemble Kalman filter in a quasigeostrophic model and comparison with 3D-Var,” Nonlinear Processes in Geophysics, vol. 14, no. 1, pp. 89–101, 2007.
  43. B. R. Hunt, E. J. Kostelich, and I. Szunyogh, “Efficient data assimilation for spatiotemporal chaos: a local ensemble transform Kalman filter,” Physica D, vol. 230, no. 1-2, pp. 112–126, 2007.
  44. T. Miyoshi and S. Yamane, “Local ensemble transform Kalman filtering with an AGCM at a T159/L48 resolution,” Monthly Weather Review, vol. 135, no. 11, pp. 3841–3861, 2007.
  45. J. Harlim and B. R. Hunt, “Four-dimensional local ensemble transform Kalman filter: numerical experiments with a global circulation model,” Tellus, vol. 59, no. 5, pp. 731–748, 2007.
  46. S.-C. Yang, E. Kalnay, B. Hunt, and N. E. Bowler, “Weight interpolation for efficient data assimilation with the local ensemble transform Kalman filter,” Quarterly Journal of the Royal Meteorological Society, vol. 135, no. 638, pp. 251–262, 2009.
  47. J. S. Whitaker and T. M. Hamill, “Ensemble data assimilation without perturbed observations,” Monthly Weather Review, vol. 130, no. 7, pp. 1913–1924, 2002.
  48. C. Köpken, G. Kelly, and J.-N. Thépaut, “Assimilation of Meteosat radiance data within the 4D-Var system at ECMWF: assimilation experiments and forecast impact,” Quarterly Journal of the Royal Meteorological Society, vol. 130, no. 601, pp. 2277–2292, 2004.
  49. J.-F. Mahfouf, P. Bauer, and V. Marécal, “The assimilation of SSM/I and TMI rainfall rates in the ECMWF 4D-Var system,” Quarterly Journal of the Royal Meteorological Society, vol. 131, no. 606, pp. 437–458, 2005.
  50. P. Bauer, P. Lopez, A. Benedetti, D. Salmond, and E. Moreau, “Implementation of 1D+4D-Var assimilation of precipitation-affected microwave radiances at ECMWF. I: 1D-Var,” Quarterly Journal of the Royal Meteorological Society, vol. 132, no. 620, pp. 2277–2306, 2006.
  51. E. Kalnay, H. Li, T. Miyoshi, S.-C. Yang, and J. Ballabrera-Poy, “4-D-Var or ensemble Kalman filter?” Tellus, vol. 59, no. 5, pp. 758–773, 2007.
  52. A. C. Lorenc, “The potential of the ensemble Kalman filter for NWP—a comparison with 4D-Var,” Quarterly Journal of the Royal Meteorological Society, vol. 129, no. 595, pp. 3183–3203, 2003.
  53. E. J. Fertig, J. Harlim, and B. R. Hunt, “A comparative study of 4D-VAR and a 4D ensemble Kalman filter: perfect model simulations with Lorenz-96,” Tellus, vol. 59, no. 1, pp. 96–100, 2007.
  54. E. N. Lorenz, “Predictability—a problem partly solved,” in Proceedings of the Seminar on Predictability, ECMWF, September 1996.
  55. S.-C. Yang, M. Corazza, A. Carrassi, E. Kalnay, and T. Miyoshi, “Comparison of ensemble-based and variational-based data assimilation schemes in a quasi-geostrophic model,” in Proceedings of the 10th Symposium on Integrated Observing and Assimilation Systems for the Atmosphere, Oceans, and Land Surface, 2007, http://ams.confex.com/ams/pdfpapers/101581.pdf.
  56. M. Buehner, C. Charette, B. He, et al., “Intercomparison of 4D-Var and EnKF systems for operational deterministic NWP,” 2008, http://4dvarenkf.cima.fcen.uba.ar/Download/Session_7/Intercomparison_4D-Var_EnKF_Buehner.pdf.
  57. T. M. Hamill and C. Snyder, “A hybrid ensemble Kalman filter-3D variational analysis scheme,” Monthly Weather Review, vol. 128, no. 8, pp. 2905–2919, 2000.
  58. Z. Pu and J. Hacker, “Ensemble-based Kalman filters in strongly nonlinear dynamics,” Advances in Atmospheric Sciences, vol. 26, pp. 373–380, 2009.
  59. P. J. van Leeuwen, “A variance-minimizing filter for large-scale applications,” Monthly Weather Review, vol. 131, no. 9, pp. 2071–2084, 2003.
  60. I. Hoteit, D.-T. Pham, G. Korres, and G. Triantafyllou, “Particle Kalman filtering for data assimilation in meteorology and oceanography,” in Proceedings of the 3rd International Conference on Reanalysis (WCRP '08), p. 6, Tokyo, Japan, 2008.
  61. S.-C. Yang and E. Kalnay, “Handling nonlinearity and non-Gaussianity in ensemble Kalman filter: experiments with the three-variable Lorenz model,” 2010, http://www.atmos.umd.edu/~ekalnay/YangKalnay2010Rev.pdf.
  62. J. Du, S. L. Mullen, and F. Sanders, “Short-range ensemble forecasting of quantitative precipitation,” Monthly Weather Review, vol. 125, no. 10, pp. 2427–2459, 1997.
  63. S. L. Mullen, J. Du, and F. Sanders, “The dependence of ensemble dispersion on analysis-forecast systems: implications to short-range ensemble forecasting of precipitation,” Monthly Weather Review, vol. 127, no. 7, pp. 1674–1686, 1999.
  64. H. Yuan, S. L. Mullen, X. Gao, S. Sorooshian, J. Du, and H.-M. H. Juang, “Verification of probabilistic quantitative precipitation forecasts over the southwest United States during winter 2002/03 by the RSM ensemble system,” Monthly Weather Review, vol. 133, no. 1, pp. 279–294, 2005.
  65. J. Du and M. S. Tracton, “Implementation of a real-time short-range ensemble forecasting system at NCEP: an update,” in Proceedings of the 9th Conference on Mesoscale Processes, pp. 355–356, American Meteorological Society, Fort Lauderdale, Fla, USA, 2001.
  66. J. Du, J. McQueen, G. DiMego, et al., “New dimension of NCEP short-range ensemble forecasting (SREF) system: inclusion of WRF members,” WMO Expert Team Meeting on Ensemble Prediction System, Exeter, UK, February 2006, 5 pages, http://www.emc.ncep.noaa.gov/mmb/SREF/reference.html, Preprint.
  67. H. Yuan, S. L. Mullen, X. Gao, S. Sorooshian, J. Du, and H.-M. H. Juang, “Verification of probabilistic quantitative precipitation forecasts over the southwest United States during winter 2002/03 by the RSM ensemble system,” Monthly Weather Review, vol. 133, no. 1, pp. 279–294, 2005.
  68. Z. Zhang and T. N. Krishnamurti, “Ensemble forecasting of hurricane tracks,” Bulletin of the American Meteorological Society, vol. 78, no. 12, pp. 2785–2795, 1997.
  69. S. D. Aberson, M. A. Bender, and R. E. Tuleya, “Ensemble forecasting of tropical cyclone tracks,” in Proceedings of the 12th Conference on Numerical Weather Prediction, pp. 290–292, American Meteorological Society, Phoenix, Ariz, USA, 1998.
  70. B. P. Mackey and T. N. Krishnamurti, “Ensemble forecast of a typhoon flood event,” Weather & Forecasting, vol. 16, no. 4, pp. 399–415, 2001.
  71. F. Zhang, Y. Weng, J. A. Sippel, Z. Meng, and C. H. Bishop, “Cloud-resolving hurricane initialization and prediction through assimilation of Doppler radar observations with an ensemble Kalman filter,” Monthly Weather Review, vol. 137, no. 7, pp. 2105–2125, 2009.
  72. M. Tong and M. Xue, “Simultaneous estimation of microphysical parameters and atmospheric state with simulated radar data and ensemble square root Kalman filter. Part II: parameter estimation experiments,” Monthly Weather Review, vol. 136, no. 5, pp. 1649–1668, 2008.
  73. M. Xue, M. Tong, and G. Zhang, “Simultaneous state estimation and attenuation correction for thunderstorms with radar data using an ensemble Kalman filter: tests with simulated data,” Quarterly Journal of the Royal Meteorological Society, vol. 135, no. 643, pp. 1409–1423, 2009.