Advances in Mathematical Physics
Volume 2010, Article ID 509326, 42 pages
http://dx.doi.org/10.1155/2010/509326
Research Article

Introducing Randomness into First-Order and Second-Order Deterministic Differential Equations

John F. Moxnes1 and Kjell Hausken2

1Department for Protection, Norwegian Defence Research Establishment, P.O. Box 25, 2007 Kjeller, Norway
2Faculty of Social Sciences, University of Stavanger, 4036 Stavanger, Norway

Received 23 June 2009; Revised 1 January 2010; Accepted 1 March 2010

Academic Editor: Luigi Berselli

Copyright © 2010 John F. Moxnes and Kjell Hausken. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

We incorporate randomness into deterministic theories and compare analytically and numerically some well-known stochastic theories: the Liouville process, the Ornstein-Uhlenbeck process, and a process that is Gaussian and exponentially time correlated (Ornstein-Uhlenbeck noise). Different methods of achieving the marginal densities for correlated and uncorrelated noise are discussed. Analytical results are presented for a deterministic linear friction force and a stochastic force that is uncorrelated or exponentially correlated.

1. Introduction

Stochastic theories model systems which develop in time and space in accordance with probabilistic laws. Essential in stochastic theories is how randomness is accounted for. For Markov processes [1], which are an important class of stochastic processes, the state value at time $t+\Delta t$ is given by the state value at time $t$ plus the value of a "random variable" at time $t$. A random "disturbance" in a Markov process may possibly influence all subsequent values of the realization. The influence may decrease rapidly as the time point moves into the future. Five methods to account for randomness are (1) to assume deterministic equations to determine the stochastic process for, say, a particle and apply Monte Carlo simulations or analytical methods [2, 3] to find probability distributions, (2) to use the Liouville equation with or without added terms for the probability density per se, for example, for a particle, (3) to use ordinary differential equations for the statistical moments of the probability distribution, (4) to use the "hydrodynamic approach," which is to specify constitutive relations in an equation set akin to what is used in hydrodynamic formulations of gas flow [4], and (5) in quantum physics to use traditional quantization rules, Nelson's [5] stochastic version of quantum mechanics, or path integration methods. This article mainly confines attention to method 1, addressed in all sections of the article. Method 2 is briefly addressed in the last part of Section 4. Methods 3, 4, and 5 are briefly outlined in Appendix F. We first outline methods 1 and 2 and thereafter specify the contribution of this article towards the end of the introduction.

For method 1, allowing randomness in the initial values for a particle and in a deterministic noise term commonly implies more realistic models of physical situations than, for example, the Liouville process, where a realization of the stochastic process is constructed deterministically, allowing randomness only in the initial conditions of the particle. In the Stratonovich [3] method noise is incorporated into a deterministic equation by adding a deterministic noise term that can be integrated in the normal Riemann sense. The Stratonovich integral occurs as the limit of time-correlated noise (so-called colored noise) when the correlation time of the noise term approaches zero. Monte Carlo simulations or analytical methods can be used to "count up" the probability density. The increment in the Einstein-Smoluchowski theory of Brownian motion during a time interval $\Delta t$ is proportional to the so-called drift, plus a nondeterministic noise in the form of a time-uncorrelated Gaussian random term (called Gaussian white noise) with mean zero and variance proportional to $\Delta t$. This mathematical model was first treated rigorously by Ito [2] (see also [6, 7]). The work by Bachelier [8], Einstein [9, 10], Smoluchowski [11, 12], Wigner [13], Ornstein-Uhlenbeck (1930), and Ito [2] constitutes the foundations. A system with time-uncorrelated noise is usually just a coarse-grained version of a more fundamental microscopic model with correlation. Thus, depending on the problem in consideration, the Stratonovich (deterministic noise) or the Ito model (stochastic noise) could be an appropriate approximation. For additive noise the Ito model gives the same answer as the Stratonovich model, but for multiplicative noise the results are different. (For a recent treatment of different interpretations of stochastic differential equations see, e.g., Lau and Lubensky [14].) Roughly, the reason for this difference is that the Stratonovich integral, which is a Riemann integral, applies to functions of bounded variation, whereas the Ito integral applies to functions without bounded variation (i.e., white noise).
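The difference between the two interpretations can be illustrated numerically. The following is a minimal sketch, with an assumed test equation and parameters not taken from this article: the multiplicative-noise equation dX = sigma*X dW is integrated with an Euler-Maruyama step (the Ito reading) and with a Heun predictor-corrector step (the Stratonovich reading). The sample means differ, whereas for additive noise the two schemes would agree.

```python
import numpy as np

# Minimal sketch (assumed equation and parameters): compare Ito (Euler-Maruyama)
# and Stratonovich (Heun/midpoint) integration of the multiplicative SDE
# dX = sigma * X dW.  For this purely multiplicative noise the two
# interpretations give different means: E[X] = X0 (Ito) versus
# E[X] = X0 * exp(sigma^2 * t / 2) (Stratonovich).

rng = np.random.default_rng(0)
x0, sigma, T, n_steps, n_paths = 1.0, 0.5, 1.0, 1000, 20000
dt = T / n_steps

x_ito = np.full(n_paths, x0)
x_str = np.full(n_paths, x0)
for _ in range(n_steps):
    dW = rng.normal(0.0, np.sqrt(dt), n_paths)
    # Ito: evaluate the integrand at the left endpoint.
    x_ito = x_ito + sigma * x_ito * dW
    # Stratonovich (Heun): predictor-corrector using the midpoint rule.
    x_pred = x_str + sigma * x_str * dW
    x_str = x_str + 0.5 * sigma * (x_str + x_pred) * dW

print("Ito mean         :", x_ito.mean(), " (theory:", x0, ")")
print("Stratonovich mean:", x_str.mean(), " (theory:", x0 * np.exp(0.5 * sigma**2 * T), ")")
```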

In physics or engineering applications second-order ordinary differential equations are often used as models. Unfortunately, second-order processes are more difficult to address than first-order processes. The second-order differential equation is first written as the mathematically equivalent set of two first-order equations, and then randomness is incorporated into the first-order equations in either the Ito or the Stratonovich interpretation by defining two stochastic differential equations for the two random variables, position and velocity [6, 7]. In the well-known Langevin models used in physics, and in any second-order process driven by a noise force, a noise term is added to the velocity but not to the position; the position follows as the integral of the velocity. The Ornstein-Uhlenbeck (1930) theory of Brownian motion is expressed in terms of a Markov process in the bidimensional phase space. Masoliver [15] studied second-order processes driven by a dichotomous exponentially time-correlated noise force, while Heinrichs [16] studied second-order processes driven by a Gaussian exponentially time-correlated noise force. See also Bag [17] and Bag et al. [18] for studies on stochastic processes and entropy production. Langevin models are very fruitful as a starting point for quantum noise phenomena [19].
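A minimal sketch of this construction, with an assumed friction coefficient and noise strength: the second-order equation is split into two first-order equations for position and velocity, the noise term is added to the velocity only, and the position follows as the integral of the velocity.

```python
import numpy as np

# Minimal sketch (assumed parameters): a second-order equation x'' = -beta*x'
# plus a noise force, rewritten as two first-order equations.  The noise term
# is added to the velocity v only; the position x follows as the integral of v.

rng = np.random.default_rng(8)
beta, sigma, dt, n_steps, n_paths = 1.0, 0.5, 1e-3, 10000, 10000

x = np.zeros(n_paths)
v = np.zeros(n_paths)
for _ in range(n_steps):
    v += -beta * v * dt + rng.normal(0.0, sigma * np.sqrt(dt), n_paths)
    x += v * dt          # no noise is added directly to the position

print("stationary velocity variance (sim, theory):", v.var(), sigma**2 / (2 * beta))
```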

In method 2, randomness is implemented without using differential equations and Monte Carlo simulations, by studying the Liouville equation as such, with or without added terms, for example, for a particle. In the Liouville process the probability density is a solution of the so-called Liouville equation. But a traditional picture of applying realistic randomness is through collision integrals that fulfill local conservation laws of mass, momentum, and energy [20, 21]. It is usually considered in direct reference to the Boltzmann kinetic theory. Microscopic collision rules are established based on time symmetry, but then a rapid decay of correlations amounts to assuming friction [22, 23]. For a second-order process driven by dichotomous noise with exponential correlation Masoliver [15] found a third-order partial differential equation for the joint density distribution. For a second-order process driven by Gaussian noise with exponential correlation Heinrichs [16] found a Fokker-Planck equation with time-dependent coefficients for the joint distribution. In the phase-space formulation of quantum physics the Wigner quasijoint distribution is commonly used [24]. The equation for the joint distribution includes additional terms compared to the bidimensional Liouville equation. (See Hillery et al. [25] and Lee [26] for reviews of quantum phase-space distributions.) No corresponding second-order stochastic differential equation is constructed.

A system with time-uncorrelated noise is usually just a coarse-grained version of a more fundamental microscopic model with time correlation. It is therefore of interest to study models with correlation. Bag et al. [18] introduced correlation by increasing the number of differential equations and applying uncorrelated noise throughout. This approach obviously increases the system complexity. This article shows that time-correlated noise can be mimicked by time-uncorrelated, time-dependent noise without increasing the number of equations in the equation set. To provide a benchmark we start in Section 2 by considering a one-dimensional system based on recurrence relations without correlation. We first show how the recurrence relation, which is usually applied for Gaussian noise only (that is, corresponding to the Ito integral), can be used to develop equations for a more general noise that is multifractal. We develop the equations for variances and covariances. For Gaussian processes higher-order moments follow from second-order moments. Our first-order system has not been analyzed in the manner proposed in this paper, which is needed to develop alternative accounts of introducing correlation. We compare different methods of achieving the main equations and study when time-dependent uncorrelated noise can mimic exponentially correlated noise. Section 3 studies first-order systems with correlation, which are compared with the systems without correlation in Section 2. Such a comparison has not been made in the earlier literature. Section 4 considers second-order stochastic processes driven by time-uncorrelated or time-correlated noise. Proceeding to second order allows capturing a larger fraction of real-life processes, which makes the approach more realistic. We show that a second-order system driven by an exponentially time-correlated noise force can be mimicked by adding time-uncorrelated noise both to the position and to the velocity. Instead of expanding the equation set, noise is added separately to each of the two dimensions (exemplified with position and velocity) of the two-dimensional system. Section 5 concludes.

2. A First-Order Time-Uncorrelated Process with Additive Noise

This section provides a first-order process which accounts for randomness. Assume that a deterministic differential equation has been used to describe a physical phenomenon. Assume that this equation is found to no longer hold due to results from a more developed experimental set-up. The question is then as follows: how should this equation be (a) modified or (b) reinterpreted, to be more realistic? Assume that we use (b), reinterpreting the original equation as an equation for the expectation of a stochastic process. (Another possibility is to interpret it as an equation for the expectation of the time derivative of the process. These two interpretations are in general different, since the latter demands a differentiable path.) The next question is then how to construct a stochastic theory such that the expectation, variance, and higher-order moments can be calculated.

Stochastic or nonstochastic integrals can rarely be solved in analytic form, making numerical algorithms an important tool. Assume that $\Delta x(t)$ is the change in the quantity $x$ during a small time interval from $t$ to $t+\Delta t$ (we will let the time spacing approach zero). Assume that this change is proportional to the time interval and to $f(x(t))$, which gives the recurrence relation $x(t+\Delta t) \overset{\text{mod}}{=} x(t)+f(x(t))\,\Delta t$ (2.1), where "mod" means that this is a model assumption and "def" means definition. When constructing a more developed theory accounting for noise, we can, for instance, make the initial values stochastic as in the Liouville approach, or we can change the recurrence relation in (2.1). In the study of nonlinear recurrence relations Glass and Mackey [27] showed that it is possible to construct an infinite number of deterministic relations which are chaotic but which describe a given density distribution. Thus a given density distribution has no unique recurrence relation. It appears that the broad class of Markovian theories incorporating Gaussian white noise input provides a satisfactory approximation for a large variety of phenomena. We consider the more general stochastic equation (2.2a), which we interpret as a stochastic differential equation with additive noise of the Ito form; the noise increment in (2.2a) is not a differential in the Riemann sense and is therefore marked separately. Here $f$ is a function giving the deterministic change during the time step $\Delta t$, and the noise increment is modeled as a stochastic variable drawn from a distribution with expectation zero and a variance specified in (2.2b); $E[\cdot]$ means expectation. The expectation of the noise in (2.2b) is zero, and we assume no correlation between increments at different times. Equation (2.2c) is assumed in order to develop (2.5); it is valid when using Ito calculus (no bounded variation) but is not valid when using Stratonovich calculus. When the noise is set to zero we define, as determined by (2.2a)–(2.2c), what we call a Liouville recurrence relation, which defines the Liouville process, where only the initial values are stochastic.
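Before turning to expectations and the Fokker-Planck equation, the following is a minimal sketch, with an assumed linear friction force and assumed parameters, of how realizations of such a recurrence are generated on a computer, together with the Liouville case where only the initial values are stochastic.

```python
import numpy as np

# Minimal sketch (assumed drift and parameters): realizations of the recurrence
# x_{n+1} = x_n + f(x_n)*dt + noise, with the noise increment drawn from a
# zero-mean distribution of variance sigma^2*dt (the Ito-type scheme), compared
# with the Liouville case where only the initial value is random.

rng = np.random.default_rng(1)

def f(x, beta=1.0):
    # Linear friction force used as the running example in this section.
    return -beta * x

dt, n_steps, n_paths, sigma = 1e-3, 2000, 10000, 0.5
x_ito = rng.normal(1.0, 0.2, n_paths)        # random initial values
x_liouville = x_ito.copy()

for _ in range(n_steps):
    x_ito += f(x_ito) * dt + rng.normal(0.0, sigma * np.sqrt(dt), n_paths)
    x_liouville += f(x_liouville) * dt       # deterministic: no added noise

print("Ito-type variance      :", x_ito.var())
print("Liouville-type variance:", x_liouville.var())
```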

Conceptually, we can easily generate realizations by applying (2.2a)–(2.2c) on a computer. Assume that we perform $N$ (which approaches infinity) different runs up to time $t$ with a constant time spacing $\Delta t$ (which approaches zero). Let $g$ be the arbitrary function that we apply to each number. Thus we achieve the set of numbers $g(x_1(t)),\dots,g(x_N(t))$. Applying a Taylor expansion, the expectation is achieved when $N$ goes to infinity; here the "dot" above a variable means the time derivative and "D" means the space derivative. We can conceptually collect all tracks that pass through a given point at a given time. We next assume that (a) the variance of the noise increment is proportional to $\Delta t$ for small time steps, with a well-behaved proportionality function to be specified exogenously, and (b) that higher-order moments also carry at least the same power of $\Delta t$, akin to multifractal phenomena [28]. The cross terms equal zero due to (2.2c), and the higher-order contributions vanish as $\Delta t$ approaches zero. This gives (2.5); the operator appearing there is the forward infinitesimal operator of the process. By setting $g(x)=x$ as a special case, we easily find the equation for the expectation. We introduce the probability density of the process. We choose $g$ time independent and can then write (2.5) by definition as (2.6). (Only natural boundary conditions are used in this article and the space integration limits are suppressed.) Integrating (2.6) by parts and assuming natural boundary conditions gives (2.7). Equation (2.7) is valid for all functions $g$. In fact, if our uncorrelated random term is Gaussian, all odd moments are zero, and even moments of higher order than two are of higher order than $\Delta t$ (see Appendix A). This gives the partial differential equation (2.8), called the forward Fokker-Planck equation or forward Kolmogorov equation [29]. We can from (2.8) easily calculate the time derivative of the variance, which for a time-dependent (and uncorrelated) random term becomes (2.9). The general Liouville solution of (2.8) is easily found, to read (2.10), where the arbitrary function in the solution is chosen such that the integral of the density equals 1. Say that the force is a linear friction force and that the noise variance is a specified function of time, where the friction coefficient and the noise parameters are constants. The time derivative of the variance then follows according to (2.9). Say that we will formulate a continuous equation in time that corresponds to (2.2a)–(2.2c), to read (2.12), where the noise term is an arbitrary noise function. We let the noise be deterministic (the randomness is then only in the initial values; see Appendix E for an example). The deterministic approach generally gives a noise term that can be integrated in the traditional Riemann sense. The Stratonovich integral occurs as the limit of colored noise when the correlation time of the noise term approaches zero. A quite common and different integral is the Ito integral for the uncorrelated situation and Gaussian noise. This integral cannot be evaluated in the Riemann sense due to the lack of bounded variation. However, for additive noise the Stratonovich and Ito models give the same answer. To match the noise in (2.2a)–(2.2c) we set the expectation to zero and the noise to be uncorrelated, to read in the Stratonovich sense (indeed, also in the Ito sense, since we achieve the same result for additive noise) equation (2.13), where the Dirac delta function accounts for the lack of correlation. Equation (2.13) is equal to (2.9) under a condition on the noise strength, which we will find is correct if the deterministic force is a linear force. As another example, assume that the continuous process in space is in fact an approximation to a discrete process in space, for instance, that the state variable is the number of cells that die randomly. We find that the drift term (first-order term) of the Fokker-Planck equation (2.8) is related to the diffusion term (second-order term) as shown in Appendix B.
Apply the example of a linear friction force. We then have for the time-continuous case equation (2.14), which can be solved as (2.15). This gives the covariance in (2.16), where we have used that the covariance of the initial value at time 0 and the noise is zero. The integral in (2.16), which is quite general, can be calculated explicitly for a time-dependent uncorrelated noise, to read (2.17). We use our example of time-dependent uncorrelated noise (we will compare this process with a correlated process in the next section). This gives (2.18) from (2.17). The variance follows by setting the two time arguments equal (this actually demands a more careful analysis, but it turns out that setting the time arguments equal in (2.18) is correct), to read (2.19). The time derivative of the variance becomes (2.20). Thus (2.19) gives the same solution as (2.20) for our linear friction force. The expectation is given by (2.21). For Gaussian processes (correlated or uncorrelated) the variance is important since higher-order moments follow from second-order moments (the variance). By comparing with (2.13) and (2.20) we see that the correction term vanishes for a linear deterministic force. However, this can be found more easily for a linear force by using (2.14) directly. We further find from (2.18), for a special choice of the time-dependent noise, that the process is for large times exponentially correlated.
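A minimal numerical check of this example, with assumed parameter values not taken from the article: the linear-friction equation with additive white noise is integrated by Euler-Maruyama, and the empirical variance and lagged covariance are compared with the standard Ornstein-Uhlenbeck expressions (the variance relaxes towards sigma^2/(2*beta), and the covariance decays exponentially in the time lag).

```python
import numpy as np

# Minimal sketch (assumed parameters): a linear friction force with additive
# white noise, dX = -beta*X dt + sigma dW, simulated with Euler-Maruyama.
# The empirical variance is compared with the standard analytic result
# Var[X(t)] = Var[X(0)]*exp(-2*beta*t) + sigma^2/(2*beta)*(1 - exp(-2*beta*t)),
# and the lagged covariance decays as exp(-beta*lag).

rng = np.random.default_rng(2)
beta, sigma, dt, n_steps, n_paths = 1.0, 0.8, 1e-3, 5000, 20000
T = n_steps * dt

x = rng.normal(0.0, 0.3, n_paths)
var0 = x.var()
x_mid = None
for step in range(n_steps):
    x += -beta * x * dt + rng.normal(0.0, sigma * np.sqrt(dt), n_paths)
    if step == n_steps // 2:
        x_mid = x.copy()            # snapshot at time T/2 for the covariance

var_theory = var0 * np.exp(-2 * beta * T) + sigma**2 / (2 * beta) * (1 - np.exp(-2 * beta * T))
print("variance   (sim, theory):", x.var(), var_theory)

lag = T - T / 2
cov_theory = x_mid.var() * np.exp(-beta * lag)
print("covariance (sim, theory):", np.cov(x_mid, x)[0, 1], cov_theory)
```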

We can write (2.8) in the form of a continuity equation, where the current velocity can be arbitrary in a general theory. Generally, we can let the increment depend on the probability density at time $t$. Realizations are easily generated by a computer. All realizations have to be calculated in parallel such that the density can be "counted up" at each time before a new time step is calculated. This process is not Markovian.
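A sketch of such a density-dependent recurrence, with an assumed (hypothetical) form of the coupling: all realizations are advanced in parallel, the density is estimated ("counted up") from the ensemble at each time step, and the increment of each realization depends on the estimated density at its current position.

```python
import numpy as np

# Minimal sketch (assumed drift and coupling): realizations whose increment
# depends on the empirical probability density of the ensemble at the current
# time.  All realizations are advanced in parallel; the density is counted up
# (here with a histogram) before each new time step.

rng = np.random.default_rng(3)
beta, coupling, sigma = 1.0, 0.5, 0.3
dt, n_steps, n_paths = 1e-2, 200, 50000

x = rng.normal(0.0, 1.0, n_paths)
for _ in range(n_steps):
    # Empirical density at the current time, evaluated at each particle.
    counts, edges = np.histogram(x, bins=100, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    rho_at_x = np.interp(x, centers, counts)
    # Density-dependent increment: drift plus a term proportional to the density.
    x += (-beta * x + coupling * rho_at_x) * dt \
         + rng.normal(0.0, sigma * np.sqrt(dt), n_paths)

print("ensemble mean and variance:", x.mean(), x.var())
```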

3. First-Order Stochastic Processes with Exponential Correlation

In Section 2 we analyzed first-order uncorrelated processes with additive noise. The Fokker-Planck equation was applicable for the uncorrelated processes. It turns out that in some cases uncorrelated processes with additive noise can in fact mimic correlated processes. Assume as an example that the random term is exponentially correlated, with a correlation that decays exponentially in the time difference, where $\tau$ is a correlation time parameter. In the limit when $\tau$ approaches zero we achieve uncorrelated noise. Notice that an equation that in fact will generate this exponentially correlated noise for large times (see (2.16)) follows simply by a scaling of the parameters in (2.14), to read (3.1), where the driving term is white noise with zero mean and delta correlation.
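A minimal sketch, with assumed parameters, of generating exponentially correlated noise by driving a first-order equation with white noise: in the stationary regime the autocorrelation of the generated noise decays exponentially with the lag divided by tau, and tau approaching zero recovers the uncorrelated limit.

```python
import numpy as np

# Minimal sketch (assumed parameters): exponentially correlated (Ornstein-
# Uhlenbeck) noise generated by driving a first-order equation with white
# noise, d(lambda) = -(lambda/tau) dt + (c/tau) dW.  In the stationary regime
# the autocorrelation is (c^2 / (2*tau)) * exp(-lag/tau).

rng = np.random.default_rng(4)
tau, c, dt, n_steps, n_paths = 0.2, 1.0, 1e-3, 1500, 20000

lam = rng.normal(0.0, c / np.sqrt(2 * tau), n_paths)   # start in stationarity
snapshot = None
for step in range(n_steps):
    lam += -(lam / tau) * dt + (c / tau) * rng.normal(0.0, np.sqrt(dt), n_paths)
    if step == 999:
        snapshot = lam.copy()                           # reference value for the lag

lag = (n_steps - 1000) * dt
corr_sim = np.mean(snapshot * lam)
corr_theory = c**2 / (2 * tau) * np.exp(-lag / tau)
print("autocorrelation (sim, theory):", corr_sim, corr_theory)
```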

The last line in (2.16) is general. For exponentially correlated noise with linear friction we achieve (3.2). Equation (3.2) then implies (3.3). The variance follows when the two time arguments are set equal, to read (3.4a)–(3.4f). Equations (3.4d) and (3.4e) show that when the correlation time approaches zero we achieve the uncorrelated noise with Hurst exponent one half, while when the correlation time approaches infinity we achieve a fractional noise with Hurst exponent one. (For the concept of fractional Brownian motion see Peitgen et al. [30, Section 9.5, page 491]. See also M. Rypdal and K. Rypdal [31, 32] for fractional Brownian motion models descriptive of avalanching systems.) We can compare the two different stochastic processes: the one with linear friction and uncorrelated time-dependent noise, and the one with linear friction and exponentially correlated noise. The covariances and variances are given by the solutions (2.18)-(2.19) and (3.2)–(3.4a), (3.4b), (3.4c), (3.4d), (3.4e), and (3.4f), respectively. In the limit when the correlation time approaches zero, the variances become equal, but the covariances remain unequal, to read (3.5). Indeed, the variances can be calculated easily in the absence of friction: without correlation we achieve (3.6), while with exponential correlation we achieve (3.7). We can calculate higher-order moments for the two stochastic processes. For the special case that the time-dependent uncorrelated process is Gaussian, a Fokker-Planck equation follows as shown in Section 2. Heinrichs [16] has proved that a Fokker-Planck equation also follows for the probability density when assuming Gaussian exponentially correlated noise (called Ornstein-Uhlenbeck noise, see Appendix A). Thus, in this case, the probability density of the process with time-dependent Gaussian uncorrelated noise is equal to the probability density of the process with exponentially correlated Gaussian noise. For Gaussian processes higher-order moments follow from the second-order moment. Thus we should expect equality of two Gaussian processes, even a correlated and an uncorrelated one, if the variances are equal.
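The two limits can be checked numerically. The sketch below (assumed parameters, no friction) integrates the noise directly, for white noise and for noise whose correlation time greatly exceeds the simulated time; the variance of the integral grows like t in the first case (Hurst exponent one half) and like t squared in the second (Hurst exponent one).

```python
import numpy as np

# Minimal sketch (assumed parameters): variance growth of the frictionless
# integral x(t) of the noise, for (i) uncorrelated white noise and (ii)
# exponentially correlated noise with a correlation time much longer than the
# simulated time.  Case (i) gives Var[x(t)] ~ t (Hurst exponent 1/2); case (ii)
# gives Var[x(t)] ~ t^2 (Hurst exponent 1).

rng = np.random.default_rng(5)
dt, n_steps, n_paths = 1e-3, 2000, 20000
tau_long = 100.0                     # correlation time >> total time

x_white = np.zeros(n_paths)
x_corr = np.zeros(n_paths)
lam = rng.normal(0.0, 1.0 / np.sqrt(2 * tau_long), n_paths)  # stationary OU noise
for _ in range(n_steps):
    x_white += rng.normal(0.0, np.sqrt(dt), n_paths)          # white noise
    x_corr += lam * dt                                        # correlated noise
    lam += -(lam / tau_long) * dt \
           + (1.0 / tau_long) * rng.normal(0.0, np.sqrt(dt), n_paths)

t = n_steps * dt
print("white-noise variance ~ t  :", x_white.var(), " vs t =", t)
print("correlated variance  ~ t^2:", x_corr.var(), " vs var(lam)*t^2 =",
      lam.var() * t**2)
```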

With friction we can also compare the variances. We observe that the variances are different. Thus the two stochastic processes couple differently to the linear friction term. Assume next a limit in which the friction term becomes negligible. The variance then simplifies, and in this limit the variances are not different. Thus pure exponentially time-correlated noise can be mimicked by uncorrelated noise. However, the noise couples differently to a linear deterministic friction force.

Figure 1 shows the Liouville variance, the variances assuming uncorrelated noise (time independent and time dependent), and the variance assuming exponential correlation.

Figure 1: The variances as a function of time assuming linear friction.

We have so far only analyzed first-order systems with additive noise. Consider now the continuous-time approach with nonlinear (multiplicative) noise, to read (3.10). Using the Stratonovich integral, ordinary calculus applies, and the solution becomes (3.11). Assuming Gaussian noise, the expectation follows; here we have used the algebra in (A.5)–(A.7). However, it has in the literature been considered desirable to have a process whose expectation still satisfies the deterministic equation even for multiplicative noise. It turns out that this can be achieved if the time-derivative path is abandoned (see Appendix C for the Ito calculus).

We can mimic Gaussian correlated noise by Gaussian uncorrelated noise when using the Stratonovich integral for multiplicative noise also. To achieve this we must, according to (3.11), have equal variances of the integrated noise; however, we have proven this already in (3.6) and (3.7). More generally, assume that the correlated noise is chosen with some prescribed correlation function. A matching time-dependent uncorrelated noise then provides a solution, which is fulfilled for the two processes we examine in this article. When using the Ito integral in (3.10), the solution is different; Appendix C shows the Ito solution. This solution for the uncorrelated case cannot mimic the correlated case.

4. Second-Order Stochastic Processes

Unfortunately, bidimensional first-order processes and second-order stochastic processes are more difficult to address than one-dimensional first-order processes. This is so because the position at a given time depends strongly on the velocity. Removing this dependence is tricky. We construct a stochastic interpretation of the bidimensional equation set with additive noise, to read as the Ito stochastic equations (4.1), where the noise expectations are zero and each of the two noise terms has its own variance. We achieve equation (4.2) by Taylor expansion of an arbitrary function. For time-uncorrelated Gaussian random terms with no cross-correlation only the second-order terms contribute to order $\Delta t$. Thus, after some simple algebra analogous to the algebra in Section 2, we achieve the well-known Fokker-Planck equation (4.3). In physics or engineering applications second-order differential equations are often used as models. For physical systems where we use Newton's second law we set the mass of the object equal to one. The second-order differential equation can be written as bidimensional first-order equations. The Langevin model for the Ornstein-Uhlenbeck (1930) process with uncorrelated Gaussian random force is a special case of (4.1)–(4.3) obtained by assuming a random term in the velocity equation only (due to a stochastic force). As a well-known example, assume that in the Ornstein-Uhlenbeck (1930) process we have a random force, a conservative nonrandom force, and a linear nonrandom friction force, with the variance of the random force fixed by the friction coefficient and the temperature; $k$ is the Boltzmann constant and $T$ is the temperature. The Fokker-Planck equation corresponding to an uncorrelated Gaussian random force then follows according to (4.4). It is easily verified, and well known, that a steady-state solution is given by the Boltzmann distribution, where the prefactor is a normalization constant and $T$ is the temperature. Thus the Boltzmann distribution is achieved as a steady-state solution when assuming linear friction. Notice that when the friction and the random force vanish, every density that is a function of the Hamiltonian only is a steady-state solution, where the Hamiltonian is the sum of the kinetic and potential energies. This shows the importance of linear friction to achieve the correct steady-state solution for the uncorrelated Gaussian process.
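A minimal sketch of this well-known example, with an assumed harmonic potential and assumed parameters: the Langevin equations with unit mass, a conservative force, linear friction, and a Gaussian white-noise force whose variance is tied to the friction coefficient and kT relax towards the Boltzmann distribution, so the stationary second moments of velocity and position approach kT and kT divided by the squared oscillator frequency.

```python
import numpy as np

# Minimal sketch (assumed harmonic potential and parameters): the Langevin /
# Ornstein-Uhlenbeck process in phase space, with unit mass, a conservative
# force -dU/dx for U(x) = 0.5*omega^2*x^2, linear friction -beta*v, and a
# Gaussian white-noise force of variance 2*beta*kT*dt.  The steady state should
# be the Boltzmann distribution, so <v^2> -> kT and <x^2> -> kT/omega^2.

rng = np.random.default_rng(6)
beta, omega, kT = 2.0, 1.5, 0.7
dt, n_steps, n_paths = 1e-3, 20000, 5000

x = np.zeros(n_paths)
v = np.zeros(n_paths)
for _ in range(n_steps):
    force = -omega**2 * x                      # conservative, nonrandom force
    noise = rng.normal(0.0, np.sqrt(2 * beta * kT * dt), n_paths)
    v += (force - beta * v) * dt + noise       # noise enters the velocity only
    x += v * dt                                # position is the integral of v

print("<v^2> (sim, theory):", (v**2).mean(), kT)
print("<x^2> (sim, theory):", (x**2).mean(), kT / omega**2)
```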

Consider now the second-order process in (4.1) with only a random force. We can find the analogous continuous-time solution equation for the position according to Stratonovich. The question is now whether uncorrelated noise can mimic correlated noise for the position, and not only for the velocity. Assume that we first calculate the variance by applying only an uncorrelated force of our now familiar time-dependent type. For the correlated process, however, we achieve a different result, and thus we find that the variances of the position are unequal. For a correlation time approaching zero we find a Brownian motion with Hurst exponent 1.5, while for a correlation time approaching infinity we find a Brownian motion with Hurst exponent 2. This is in agreement with the asymptotic solution found by Heinrichs [16].
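A numerical check of these exponents, with assumed parameters: a second-order process driven only by a random force, once with white noise and once with noise whose correlation time greatly exceeds the simulated time; the position variance grows like t cubed in the first case (Hurst exponent 1.5) and like t to the fourth power in the second (Hurst exponent 2).

```python
import numpy as np

# Minimal sketch (assumed parameters): a second-order process driven only by a
# random force.  With an uncorrelated (white-noise) force of strength sigma the
# velocity is a Brownian motion and Var[x(t)] ~ sigma^2 * t^3 / 3 (Hurst 1.5);
# with a force that is effectively constant over the run (very long correlation
# time) Var[x(t)] ~ Var[force] * t^4 / 4 (Hurst 2).

rng = np.random.default_rng(7)
sigma, dt, n_steps, n_paths = 1.0, 1e-3, 2000, 20000
tau_long = 100.0

x_w = np.zeros(n_paths); v_w = np.zeros(n_paths)      # white-noise force
x_c = np.zeros(n_paths); v_c = np.zeros(n_paths)      # strongly correlated force
lam = rng.normal(0.0, sigma / np.sqrt(2 * tau_long), n_paths)
for _ in range(n_steps):
    v_w += rng.normal(0.0, sigma * np.sqrt(dt), n_paths)
    x_w += v_w * dt
    v_c += lam * dt
    x_c += v_c * dt
    lam += -(lam / tau_long) * dt \
           + (sigma / tau_long) * rng.normal(0.0, np.sqrt(dt), n_paths)

t = n_steps * dt
print("white force: Var[x] vs sigma^2 t^3 / 3 :", x_w.var(), sigma**2 * t**3 / 3)
print("correlated : Var[x] vs Var[lam] t^4 / 4:", x_c.var(), lam.var() * t**4 / 4)
```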

The equation for the joint distribution is more complicated to derive. We achieve by Taylor expansion with only a random force that