Abstract

We incorporate randomness into deterministic theories and compare analytically and numerically some well-known stochastic theories: the Liouville process, the Ornstein-Uhlenbeck process, and a process that is Gaussian and exponentially time-correlated (Ornstein-Uhlenbeck noise). Different methods of obtaining the marginal densities for correlated and uncorrelated noise are discussed. Analytical results are presented for a deterministic linear friction force and a stochastic force that is uncorrelated or exponentially correlated.

1. Introduction

Stochastic theories model systems which develop in time and space in accordance with probabilistic laws.1 Essential in stochastic theories is how randomness is accounted for. For Markov [1] processes2, which are an important class of stochastic processes, the state value at a given time is given by the state value at the previous time plus the value of a "random variable" at that time.3 A random "disturbance" in a Markov process may influence all subsequent values of the realization, although the influence may decrease rapidly as the time point moves into the future. Five methods to account for randomness are (1) to assume deterministic equations that determine the stochastic process, say for a particle, and apply Monte Carlo simulations or analytical methods [2, 3] to find probability distributions; (2) to use the Liouville equation, with or without added terms, for the probability density per se, for example, for a particle; (3) to use ordinary differential equations for the statistical moments of the probability distribution; (4) to use the "hydrodynamic approach," which is to specify constitutive relations in an equation set akin to what is used in hydrodynamic formulations of gas flow [4]; and (5), in quantum physics, to use traditional quantization rules, Nelson's [5] stochastic version of quantum mechanics, or path integration methods. This article mainly confines attention to method 1, which is addressed in all sections of the article. Method 2 is briefly addressed in the last part of Section 4. Methods 3, 4, and 5 are briefly outlined in Appendix E. We first outline methods 1 and 2 and thereafter specify the contribution of this article towards the end of the introduction.

For method 1, allowing randomness in the initial values for a particle and in a deterministic noise term commonly yields more realistic models of physical situations than, for example, the Liouville process, where a realization of the stochastic process is constructed deterministically and randomness enters only through the initial conditions of the particle. In the Stratonovich [3] method noise is incorporated into a deterministic equation by adding a deterministic noise term that can be integrated in the normal Riemann sense. The Stratonovich integral occurs as the limit of time-correlated noise (so-called colored noise) when the correlation time of the noise term approaches zero. Monte Carlo simulations or analytical methods can be used to "count up" the probability density. In the Einstein-Smoluchowski theory of Brownian motion the increment during a small time interval equals the so-called drift times the time interval, plus a nondeterministic noise given by a time-uncorrelated Gaussian random term (called Gaussian white noise) with mean zero and variance proportional to the time interval. This mathematical model was first treated rigorously by Ito [2] (see also [6, 7]). The work by Bachelier [8], Einstein [9, 10], Smoluchowski [11, 12], Wigner [13], Ornstein-Uhlenbeck (1930), and Ito [2] constitutes the foundations. A system with time-uncorrelated noise is usually just a coarse-grained version of a more fundamental microscopic model with correlation. Thus, depending on the problem under consideration, the Stratonovich (deterministic noise) or the Ito model (stochastic noise) could be the appropriate approximation. For additive noise the Ito model gives the same answer as the Stratonovich model, but for multiplicative noise the results are different. (For a recent treatment of different interpretations of stochastic differential equations see, e.g., Lau and Lubensky [14].) Roughly, the reason for this difference is that the Stratonovich integral, which is a Riemann integral, applies to functions of bounded variation, whereas the Ito integral applies to functions without bounded variation (i.e., white noise).

In physics or engineering applications second-order ordinary differential equations are often used as models. Unfortunately, second-order processes are more difficult to address than first-order processes. The second-order differential equation is first written as the mathematically equivalent set of two first-order equations, and randomness is then incorporated into the first-order equations, in either the Ito or the Stratonovich interpretation, by defining two stochastic differential equations for the two random variables (position and velocity) [6, 7]. In the well-known Langevin models used in physics, or in any second-order process driven by a noise force, a noise term is added to the velocity but not to the position; the position follows as the integral of the velocity. The Ornstein-Uhlenbeck (1930) theory of Brownian motion is expressed in terms of a Markov process in the bidimensional phase space. Masoliver [15] studied second-order processes driven by a dichotomous, exponentially time-correlated noise force, while Heinrichs [16] studied second-order processes driven by a Gaussian, exponentially time-correlated noise force. See also Bag [17] and Bag et al. [18] for studies on stochastic processes and entropy production. Langevin models are very fruitful as a starting point for quantum noise phenomena [19].

In method 2, randomness is implemented without using differential equations and Monte Carlo simulations, by studying the Liouville equation as such, with or without added terms, for example, for a particle. In the Liouville process the probability density is a solution of the so-called Liouville equation4. A traditional picture of introducing realistic randomness, however, is through collision integrals that fulfill local conservation laws of mass, momentum, and energy [20, 21]. This is usually considered in direct reference to the Boltzmann kinetic theory. Microscopic collision rules are established based on time symmetry, but then a rapid decay of correlations amounts to assuming friction [22, 23]. For a second-order process driven by dichotomous noise with exponential correlation Masoliver [15] found a third-order partial differential equation for the joint density distribution. For a second-order process driven by Gaussian noise with exponential correlation Heinrichs [16] found a Fokker-Planck equation with time-variable coefficients for the joint distribution. In the phase-space formulation of quantum physics the Wigner quasijoint distribution is commonly used [24]. The equation for the joint distribution includes additional terms compared to the bidimensional Liouville equation. (See Hillery et al. [25] and Lee [26] for reviews of quantum phase-space distributions.) No corresponding second-order stochastic differential equation is constructed.

A system with time-uncorrelated noise is usually just a coarse-grained version of a more fundamental microscopic model with time correlation. It is therefore of interest to study models with correlation. Bag et al. [18] introduced correlation by increasing the number of differential equations and applying uncorrelated noise throughout. This approach obviously increases the system complexity. This article shows that time-correlated noise can be mimicked by time-dependent, time-uncorrelated noise without increasing the number of equations in the equation set. To provide a benchmark we start in Section 2 by considering a one-dimensional system based on recurrence relations without correlation. We first show how the recurrence relation, which is usually applied for Gaussian noise only (corresponding to the Ito integral), can be used to develop equations for a more general noise that is multifractal. We develop the equations for variances and covariances. For Gaussian processes higher-order moments follow from the second-order moments. Our first-order system has not been analyzed in the manner proposed in this paper, an analysis that is needed to develop alternative accounts of introducing correlation. We compare different methods of obtaining the main equations and study when time-dependent uncorrelated noise can mimic exponentially correlated noise. Section 3 studies first-order systems with correlation, which are compared with the systems without correlation in Section 2. Such a comparison has not been made in the earlier literature. Section 4 considers second-order stochastic processes driven by time-uncorrelated or correlated noise. Proceeding to second order allows capturing a larger fraction of real-life processes, which makes the approach more realistic. We show that a second-order system driven by an exponentially time-correlated noise force can be mimicked by adding time-uncorrelated noise both to the position and to the velocity5. Instead of expanding the equation set, noise is added separately to each of the two dimensions (exemplified with position and velocity) of the two-dimensional system. Section 5 concludes.

2. A First-Order Time-Uncorrelated Process with Additive Noise

This section provides a first-order process which accounts for randomness. Assume that the differential equation has been used to describe a physical phenomenon. Assume that this equation is found to no longer hold due to results from a more developed experimental set-up. The question is then as follows: how should this equation be (a) modified or (b) reinterpreted, to be more realistic? Assume that we use (b), so that the original equation is reinterpreted as an equation for the expectation of a stochastic process. (Another possibility is to reinterpret it as the expectation of the time derivative. These two interpretations are in general different since the latter demands a differentiable path.) The next question is then how to construct a stochastic theory such that the expectation, variance, and higher-order moments can be calculated.

Stochastic or nonstochastic integrals can rarely be solved in analytic form, making numerical algorithms an important tool. Consider the change in the quantity of interest during a small time interval (we will let the time spacing approach zero). Assume that this change is proportional to the time interval and to a state-dependent function, which gives the recurrence relation (2.1), where "mod" means that this is a model assumption and "def" means definition. When constructing a more developed theory accounting for noise, we can, for instance, make the initial values stochastic as in the Liouville approach, or we can change the recurrence relation in (2.1). In the study of nonlinear recurrence relations Glass and Mackey [27] showed that it is possible to construct an infinite number of deterministic relations which are chaotic, but which describe a given density distribution. Thus a given density distribution has no unique recurrence relation. It appears that the broad class of Markovian theories incorporating Gaussian white noise input provides a satisfactory approximation for a large variety of phenomena. We consider the more general stochastic equation (2.2a), which we regard as a stochastic differential equation with additive noise of the Ito form, where the noise term is not a differential in the Riemann sense and is therefore marked by a separate notation. The drift term is a function of the state variable. The noise increment in (2.2b) is modeled as a stochastic variable with expectation zero and a specified variance; we assume no correlation. Equation (2.2c) is assumed in order to develop (2.5); it is valid when using Ito calculus (no bounded variation) but is not valid when using Stratonovich calculus. When the noise term is set to zero, (2.2a)–(2.2c) define what we call a Liouville recurrence relation, which defines the Liouville process, where only the initial values are stochastic.
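As an illustration of how realizations of the type (2.2a)–(2.2c) can be generated on a computer and "counted up", the following minimal sketch uses an Euler-type recurrence with Gaussian increments. The drift function, the constant noise amplitude, and all parameter values are illustrative assumptions and are not taken from the equations above.

```python
import numpy as np

def euler_maruyama(b, g, x0, dt, n_steps, n_paths, rng):
    """Monte Carlo realizations of the additive-noise recurrence
    x_{k+1} = x_k + b(x_k)*dt + g*sqrt(dt)*N(0,1)  (illustrative form)."""
    x = np.full(n_paths, x0, dtype=float)
    for _ in range(n_steps):
        x += b(x) * dt + g * np.sqrt(dt) * rng.standard_normal(n_paths)
    return x

rng = np.random.default_rng(0)
beta, g = 1.0, 0.5          # linear friction parameter and noise amplitude (assumed values)
x = euler_maruyama(lambda x: -beta * x, g, x0=1.0, dt=1e-3,
                   n_steps=2000, n_paths=100_000, rng=rng)
print("E[x(2)] ~", x.mean(), "  Var[x(2)] ~", x.var())
```

Expectations and variances of any function of the state follow directly from the set of realizations at the final time, which is the "counting up" referred to in the text.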

Conceptually, we can easily generate realizations by applying (2.2a)–(2.2c) on a computer. Assume that we perform a number of different runs (which approaches infinity) up to a given time with a constant time spacing (which approaches zero), and let an arbitrary function be applied to each number. Thus we obtain a set of numbers. Applying Taylor expansion, the expectation is obtained when the number of runs goes to infinity; here the "dot" above a variable means the time derivative and "D" means the space derivative. We can conceptually collect all tracks that pass through a given point. We next assume that (a) the variance is, for small time steps, given by some well-behaved function to be specified exogenously, and (b) that higher-order moments also have the same powers of the time step, akin to multifractal phenomena [28]. The cross terms vanish due to (2.2c). This gives (2.5), in which the forward infinitesimal operator of the process appears. By considering the identity function as a special case, we easily find the equation for the expectation. We introduce the probability density, choose the arbitrary function to be time independent, and can then write (2.5) by definition in the form (2.6). (Only natural boundary conditions are used in this article and the space integration limits are suppressed.) Integrating (2.6) by parts and assuming natural boundary conditions gives (2.7). Equation (2.7) is valid for all such functions, and thus (2.8) follows. In fact, if our uncorrelated random term is Gaussian, all odd moments are zero, and even moments of higher order than two are of higher order than the time step (see Appendix A). This implies the partial differential equation (2.8), called the forward Fokker-Planck equation or forward Kolmogorov equation [29]. We can from (2.8) easily calculate the time derivative of the variance, which for a time-dependent (and uncorrelated) random term becomes (2.9).

The general Liouville solution of (2.8) is easily found; the initial density is an arbitrary function such that its integral is equal to 1. Say that the force is a linear friction force with constant parameters. The time derivative of the variance then follows according to (2.9). Say that we formulate a continuous equation in time that corresponds to (2.2a)–(2.2c), with an arbitrary noise function. We let the noise be deterministic (the randomness is then only in the initial values; see Appendix D for an example). The deterministic approach generally gives a noise that can be integrated in the traditional Riemann sense. The Stratonovich integral occurs as the limit of colored noise if the correlation time of the noise term approaches zero. A quite common and different integral is the Ito integral for the uncorrelated situation and Gaussian noise. This integral cannot be evaluated in the Riemann sense due to lack of bounded variation. However, for additive noise the Stratonovich and Ito models give the same answer. To match the noise in (2.2a)–(2.2c) we set that the expectation is zero and that the noise is uncorrelated, to read in the Stratonovich sense (indeed, also in the Ito sense, since we will obtain the same result for additive noise) as in (2.13), where the Dirac delta function accounts for the lack of correlation. Equation (2.13) is equal to (2.9) under a condition which we will find to be fulfilled if the deterministic force is linear. As another example, set that the continuous process in space is in fact an approximation to a discrete process in space. Assume as an example that the stochastic variable is the number of cells, which die randomly. We find that the drift term (first-order term) of the Fokker-Planck equation (2.8) is related to the diffusion term (second-order term) as shown in Appendix B.
Apply the example of linear friction. We then have for the time-continuous case equation (2.14), which can be solved explicitly. This gives the covariance in (2.16), where we have used that the covariance of the initial value and the noise is zero. The integral in (2.16), which is quite general, can be calculated explicitly for a time-dependent uncorrelated noise, to read (2.17). We use our example (we will compare this process with a correlated process in the next section). This gives (2.18) from (2.17). The variance follows by setting the two times equal (the equal-time case actually demands a more careful analysis, but it turns out that the result from (2.18) is correct), to read (2.19). The time derivative of the variance becomes (2.20). Thus (2.19) gives the same solution as (2.20) for our linear friction force. The expectation is also easily found. For Gaussian processes (correlated or uncorrelated) the variance is important since higher-order moments follow from the second-order moments. By comparing with (2.13) and (2.20) we see that the extra term vanishes for a linear deterministic force. However, this can be found more easily for a linear force by using (2.14). We further find from (2.18), for a special choice of the noise, that the process is for large times exponentially correlated.
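For the linear-friction example the sample moments can be checked against a closed form. The sketch below assumes a constant noise amplitude (the article's specific time-dependent amplitude is defined in equations not reproduced here) and compares the Monte Carlo variance with the standard Ornstein-Uhlenbeck result Var(x_t) = (g^2/2beta)(1 - exp(-2*beta*t)) for a deterministic initial value.

```python
import numpy as np

rng = np.random.default_rng(1)
beta, g, dt, T, n_paths = 1.0, 0.5, 1e-3, 2.0, 200_000   # assumed illustrative values
n_steps = int(T / dt)

x = np.zeros(n_paths)                    # deterministic initial value x(0) = 0
for _ in range(n_steps):
    x += -beta * x * dt + g * np.sqrt(dt) * rng.standard_normal(n_paths)

var_mc = x.var()
var_exact = g**2 / (2 * beta) * (1 - np.exp(-2 * beta * T))   # standard OU variance
print(f"Monte Carlo: {var_mc:.4f}   closed form: {var_exact:.4f}")
```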

We can write (2.8) as where is a current velocity, which can be arbitrary in a general theory. Generally, we can let the increment depend on the probability density at time to read Realizations are easily generated by a computer. All realizations have to be calculated in parallel such that the density can be “counted up” at each time before a new time step is calculated. This process is not Markovian.
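A minimal sketch of the non-Markovian scheme just described follows: all realizations are advanced in parallel, and the increment depends on the density "counted up" at the current time step. The density-dependent drift chosen here is purely an illustrative assumption, since the text leaves the dependence general.

```python
import numpy as np

rng = np.random.default_rng(2)
n_paths, dt, n_steps, g = 50_000, 1e-2, 200, 0.3
x = rng.normal(0.0, 1.0, n_paths)                # random initial values

for _ in range(n_steps):
    # "Count up" the density from all parallel realizations at this time step.
    hist, edges = np.histogram(x, bins=100, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    p_at_x = np.interp(x, centers, hist)         # empirical density evaluated at each realization
    drift = -x * (1.0 + p_at_x)                  # illustrative density-dependent drift (assumption)
    x += drift * dt + g * np.sqrt(dt) * rng.standard_normal(n_paths)
```

Because the drift at each step depends on the whole ensemble through the estimated density, the realizations cannot be generated one at a time, which is what makes the process non-Markovian in the sense used above.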

3. First-Order Stochastic Processes with Exponential Correlation

In Section 2 we analyzed first-order uncorrelated processes with additive noise, for which the Fokker-Planck equation was applicable. It turns out that in some cases uncorrelated processes with additive noise can in fact mimic correlated processes. Assume as an example that the random term is exponentially correlated, with a correlation that decays exponentially with a correlation time parameter. In the limit when the correlation time approaches zero we recover uncorrelated noise. Notice that an equation that will in fact generate this exponentially correlated noise for large times (see (2.16)) follows simply by a scaling of the parameters in (2.14), with white noise as the driving term.
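The following sketch generates exponentially correlated noise from white noise in the way just described, using the standard Ornstein-Uhlenbeck construction whose stationary covariance is (D/tau)*exp(-|t-s|/tau); the constants are chosen for illustration and may differ from the article's scaling. The sample covariance at a fixed lag is compared with the exponential form.

```python
import numpy as np

def ou_noise_step(xi, D, tau, dt, rng):
    """One Euler step of the standard OU construction with stationary
    covariance (D/tau)*exp(-|t-s|/tau)  (assumed convention)."""
    return xi - (xi / tau) * dt + (np.sqrt(2 * D) / tau) * np.sqrt(dt) * rng.standard_normal(xi.shape)

rng = np.random.default_rng(3)
D, tau, dt, n_paths = 1.0, 0.5, 1e-3, 100_000
xi = rng.normal(0.0, np.sqrt(D / tau), n_paths)   # start in the stationary state

for _ in range(1000):                             # advance to t = 1
    xi = ou_noise_step(xi, D, tau, dt, rng)
xi_t = xi.copy()
for _ in range(500):                              # advance a further lag of 0.5
    xi = ou_noise_step(xi, D, tau, dt, rng)

cov_mc = np.mean(xi_t * xi) - xi_t.mean() * xi.mean()
print(f"sample covariance {cov_mc:.3f}  vs  (D/tau)*exp(-0.5/tau) = {D/tau*np.exp(-0.5/tau):.3f}")
```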

The last line in (2.16) is general. For exponentially correlated noise with linear friction we obtain (3.2), which then implies (3.3). The variance follows when the two times are set equal, to read (3.4a)–(3.4f). Equations (3.4d) and (3.4e) show that in one limit we obtain the uncorrelated noise with Hurst exponent one half, while in the other limit we obtain a fractional noise with Hurst exponent one. (For the concept of fractional Brownian motion see Peitgen et al. [30, Section 9.5, page 491]. See also M. Rypdal and K. Rypdal [31, 32] for fractional Brownian motion models descriptive of avalanching systems.) We can compare the two different stochastic processes: the one with linear friction and uncorrelated time-dependent noise, and the one with linear friction and exponentially correlated noise. The covariances and variances are given by the solutions (2.18)-(2.19) and (3.2)–(3.4f), respectively. In the limit when the correlation time approaches zero, the variances become equal, but the covariances remain unequal. Indeed, the variances can be calculated easily in this limit, both without correlation and with exponential correlation. We can calculate higher-order moments for the two stochastic processes. For the special case that the time-dependent uncorrelated process is Gaussian, a Fokker-Planck equation follows as shown in Section 2. Heinrichs [16] has proved that a Fokker-Planck equation also follows for the probability density when assuming Gaussian exponentially correlated noise (called Ornstein-Uhlenbeck noise; see Appendix A). Thus, under the appropriate condition, the probability density of the time-dependent Gaussian uncorrelated noise is equal to the probability density of the exponentially correlated Gaussian noise. For Gaussian processes higher-order moments follow from the second-order moment. Thus we should expect equality of two Gaussian processes, even a correlated and an uncorrelated one, if the variances are equal.
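To illustrate the claim that, without friction, the variance of the process driven by exponentially correlated noise can be matched by a time-dependent uncorrelated noise, the sketch below compares the two processes by Monte Carlo. The amplitude g(t)^2 = 2D(1 - exp(-t/tau)) is one natural choice that reproduces the variance of the integrated stationary Ornstein-Uhlenbeck noise; it is an assumption standing in for the article's expression, which is not reproduced here. Only the variances agree; the covariances of the two processes remain different, as stated above.

```python
import numpy as np

rng = np.random.default_rng(4)
D, tau, dt, T, n_paths = 1.0, 0.5, 1e-3, 3.0, 100_000
n_steps = int(T / dt)

# Process 1: driven by stationary exponentially correlated (OU) noise.
xi = rng.normal(0.0, np.sqrt(D / tau), n_paths)
x1 = np.zeros(n_paths)
for _ in range(n_steps):
    x1 += xi * dt
    xi += -(xi / tau) * dt + (np.sqrt(2 * D) / tau) * np.sqrt(dt) * rng.standard_normal(n_paths)

# Process 2: driven by time-dependent *uncorrelated* noise with g(t)^2 = 2D(1 - exp(-t/tau)).
x2 = np.zeros(n_paths)
for k in range(n_steps):
    g2 = 2 * D * (1 - np.exp(-k * dt / tau))
    x2 += np.sqrt(g2 * dt) * rng.standard_normal(n_paths)

var_exact = 2 * D * (T - tau * (1 - np.exp(-T / tau)))    # variance of the integrated OU noise
print(f"Var correlated {x1.var():.3f}, Var time-dependent uncorrelated {x2.var():.3f}, exact {var_exact:.3f}")
```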

With friction we can also compare the variances, to read (3.8a)–(3.8c). We observe that the variances are different. Thus the two stochastic processes couple differently to the linear friction term. Assume next that the correlation time approaches zero. The variance then simplifies, and in this limit the variances are not different. Thus pure exponentially time-correlated noise can be mimicked by uncorrelated noise. However, the noise couples differently to a linear deterministic friction force.

Figure 1 shows the Liouville variance, the variances for the uncorrelated noise models considered above, and the variance assuming exponential correlation.

We have so far only analyzed first-order systems with additive noise. Consider now the continuous-time approach with nonlinear (multiplicative) noise, to read (3.10). Using the Stratonovich integral, ordinary calculus applies, and the solution becomes (3.11). Assuming Gaussian noise, the expectation becomes (3.12); here we have used the algebra in (A.5)–(A.7). However, it has in the literature been considered desirable to have a process whose expectation fulfills the deterministic relation even for multiplicative noise. It turns out that this can be achieved if the requirement of a differentiable path is abandoned (see Appendix C for the Ito calculus).

We can mimic Gaussian correlated noise by Gaussian uncorrelated noise also when using the Stratonovich integral for multiplicative noise. To achieve this we must, according to (3.11), have equal integrated noise variances; however, we have proven this already in (3.6) and (3.7). More generally, assume that the correlated noise is chosen arbitrarily. A solution exists which is fulfilled for the two processes we examine in this article. When using the Ito integral in (3.10), the solution is different. Appendix C shows the solution, and this solution for the uncorrelated case cannot mimic the correlated case.

4. Second-Order Stochastic Processes

Unfortunately, bidimensional first-order processes or second-order stochastic processes are more difficult to address than one-dimensional first-order processes. This is so because the position at a given time depends strongly on the velocity, and removing this dependence is tricky. We construct a stochastic interpretation of the bidimensional equation set with additive noise, to read as the Ito stochastic equations (4.1)–(4.3), where the expectations of the noise terms are zero and their variances are specified. We obtain a relation by Taylor expansion of an arbitrary function. For time-uncorrelated Gaussian random terms with no cross correlation only the diagonal terms contribute to the relevant order. Thus, after some simple algebra analogous to the algebra in Section 2, we obtain the well-known Fokker-Planck equation (4.4). In physics or engineering applications second-order differential equations are often used as models. For physical systems we set the mass of the object equal to one. The second-order differential equation can be written as bidimensional first-order equations. The Langevin model for the Ornstein-Uhlenbeck (1930) process with an uncorrelated Gaussian random force is a special case of (4.1)–(4.3) obtained by assuming a random term in the velocity equation only (due to a stochastic force). As a well-known example, assume in the Ornstein-Uhlenbeck (1930) process a random force, a conservative nonrandom force, and a linear nonrandom friction force, with the noise strength expressed in terms of the Boltzmann constant and the temperature. The Fokker-Planck equation, corresponding to an uncorrelated Gaussian random force, then follows according to (4.4). It is easily verified, and well known, that a steady-state solution is given by the Boltzmann distribution, with a normalization constant and the temperature entering in the usual way. Thus the Boltzmann distribution is obtained as a steady-state solution when assuming linear friction. Notice that, without friction and noise, every function of the Hamiltonian is a steady-state solution. This shows the importance of linear friction for obtaining the correct steady-state solution for the uncorrelated Gaussian process.
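A minimal sketch of this Ornstein-Uhlenbeck (1930) Langevin model follows, with unit mass, linear friction, a harmonic conservative force, and noise only in the velocity equation, using the usual fluctuation-dissipation amplitude sqrt(2*gamma*kT). The parameter values are illustrative assumptions. The long-time sample variances are compared with the equipartition values implied by the Boltzmann steady state.

```python
import numpy as np

rng = np.random.default_rng(5)
gamma, omega, kT = 1.0, 2.0, 0.5         # friction, oscillator frequency, k_B*T (assumed values)
dt, n_steps, n_paths = 2e-3, 10_000, 20_000

x = np.zeros(n_paths)
v = np.zeros(n_paths)
for _ in range(n_steps):
    x += v * dt                           # no noise in the position equation
    v += (-gamma * v - omega**2 * x) * dt + np.sqrt(2 * gamma * kT * dt) * rng.standard_normal(n_paths)

# Boltzmann steady state p ~ exp(-(v**2/2 + omega**2*x**2/2)/kT) implies equipartition:
print(f"<v^2> = {v.var():.3f}  (expected {kT:.3f});  <x^2> = {x.var():.3f}  (expected {kT/omega**2:.3f})")
```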

Consider now the second-order process in (4.1) with only a random force. We can find the analogous continuous-time solution for the position according to Stratonovich. The question is now whether uncorrelated noise can mimic correlated noise for the position, and not only for the velocity. Assume that we first calculate the variance by applying only an uncorrelated force of our now familiar time-dependent type. However, for the correlated process we obtain a different result, and we find that the variances of the position are unequal. In one limit we find a Brownian motion with Hurst exponent 1.5, while in the other limit we find a Brownian motion with Hurst exponent 2. This is in agreement with the asymptotic solution found by Heinrichs [16].
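For a second-order process driven only by a white-noise force with covariance 2D*delta(t-s), the position variance grows as 2D*t^3/3 (Hurst-type exponent 3/2). The sketch below checks this by Monte Carlo; the article's time-dependent uncorrelated force is not reproduced here, so a constant noise intensity is assumed.

```python
import numpy as np

rng = np.random.default_rng(6)
D, dt, T, n_paths = 1.0, 1e-3, 2.0, 100_000
n_steps = int(T / dt)

x = np.zeros(n_paths)
v = np.zeros(n_paths)
for _ in range(n_steps):
    x += v * dt
    v += np.sqrt(2 * D * dt) * rng.standard_normal(n_paths)   # white-noise force only

print(f"Var[x(T)] = {x.var():.3f}   vs  2*D*T**3/3 = {2*D*T**3/3:.3f}")   # x scales as t^(3/2)
```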

The equation for the joint distribution is more complicated to derive. We obtain by Taylor expansion, with only a random force, the relation (4.11). We further find from (4.11) that the mixed term is of the appropriate order. This gives, as it should, the Fokker-Planck equation without mixed terms, to read (4.13). However, Heinrichs [16] found, for a noise force only that is exponentially correlated and Gaussian, the equation (4.14); when the correlation time approaches zero, we recover the traditional Fokker-Planck equation corresponding to the Gaussian uncorrelated random force, and zero noise gives the Liouville equation. The mixed term vanishes in the appropriate limit, as it should. Say that we calculate the expectation of the product of position and velocity. This gives, when using (4.14), the relation (4.15). We further have a corresponding expression for our time-dependent uncorrelated noise according to (4.13), or alternatively by direct calculation from the position solution, and this is in disagreement with (4.15). This shows that our time-dependent uncorrelated noise cannot mimic the correlated noise for the position. To check this further, we find for correlated noise an expression that is in agreement with (4.15). However, it is easily observed from (4.1) that (4.14) follows if noise terms with suitable cross correlation are included. We thus find that noise must be added both to the position and to the velocity in the Ito stochastic differential equation to mimic the correlated noise force; in addition, the noise in velocity and position must be cross correlated. The time-continuous uncorrelated Stratonovich version (but with cross correlation in velocity and position) that will mimic the solution in (4.14) is accordingly (4.19). Agreement can also be shown by studying the variance directly, which agrees with the solution in (4.9). More generally, according to method 2 discussed in the introduction, we can introduce noise simply by adding terms (more or less ad hoc) to the Liouville equation. Expanding on the results of (4.18) with a random and a nonrandom force, consider equation (4.21), where the force is a noise term plus a deterministic force, which follows from (4.18) in a special case. Equation (4.21) gives (4.22). A question is now whether the equation set (4.19) (or (4.21)) could mimic the corresponding correlated model. It will not. As our example we use the linear friction force. The Stratonovich solution of (4.19) is then found explicitly. The solution for the variance is equivalent to (3.8a) with substituted parameters, which assumes time-dependent uncorrelated noise. The solution is not equal to the correlated solution in (3.8c). Thus (4.19) (or (4.21)) and the correlated model in general describe different physical realities, even though the models are the same when the noise is set to zero. Also, both models are equal in the limit where the correlation time approaches zero.
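As an illustration of the general point that noise must enter both the position and the velocity, with cross correlation, the sketch below uses the exact one-step increment covariance of the white-noise-force case, Q = 2D*[[h^3/3, h^2/2], [h^2/2, h]]. This is a standard discretization and an assumption made for illustration; it is not the article's construction for the exponentially correlated force.

```python
import numpy as np

rng = np.random.default_rng(7)
D, h, T, n_paths = 1.0, 1e-2, 2.0, 100_000
n_steps = int(T / h)

# Joint covariance of the position and velocity increments over one step h
# for a pure white-noise force (standard exact discretization, used here
# only to illustrate cross-correlated noise in both coordinates):
Q = 2 * D * np.array([[h**3 / 3, h**2 / 2],
                      [h**2 / 2, h]])
L = np.linalg.cholesky(Q)

x = np.zeros(n_paths)
v = np.zeros(n_paths)
for _ in range(n_steps):
    dx_noise, dv_noise = L @ rng.standard_normal((2, n_paths))
    x += v * h + dx_noise                 # cross-correlated noise enters the position as well
    v += dv_noise
print(f"Var[x(T)] = {x.var():.3f}   vs  2*D*T**3/3 = {2*D*T**3/3:.3f}")
```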

By integrating with respect to the position and velocity, respectively, we achieve the equation for the marginal density of velocity and position, to read The same equations apply for the Liouville process. We can find an explicit relation for

5. Conclusion

This article studies the construction of stochastic theories from deterministic theories based on ordinary differential equations. We incorporate randomness into deterministic theories and compare analytically and numerically some well-known stochastic theories: the Liouville process, the Ornstein-Uhlenbeck process, and a process that is Gaussian and exponentially correlated (Ornstein-Uhlenbeck noise). Different methods of obtaining the marginal densities are discussed, and we find an equation for the marginal density of velocity and position for a second-order process driven by an exponentially correlated Gaussian random force. We show that in some situations a noise process with exponential correlation can be mimicked by time-dependent uncorrelated noise. We show that a second-order system driven by an exponentially time-correlated noise force can be mimicked by adding time-uncorrelated noise both to the position and to the velocity. In such a situation the traditional concept of force loses its meaning.

Appendices

A. Gaussian Exponentially Correlated Noise and the Fokker-Planck Equation

Following Heinrichs [16], we write without drift where is the expectation. The correlation time is We observe that In the limit of zero correlation time we achieve the uncorrelated noise since the Fourier transform (called spectrum) of the delta function is a constant function (called flat). Integrating (A.1) gives The expectation and variance become This gives that Or alternatively Now, by assuming a Gaussian distribution all odd moments become zero. The even moments are given by The characteristic function for the density of becomes which implies Thus we find that Hence with correlation time we achieve a Fokker-Planck or forward Kolmogorov equation [29] with an explicit time variation in the diffusion coefficient. A solution is
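Since the display formulas of this appendix are not reproduced above, the following is a hedged summary of the standard Ornstein-Uhlenbeck noise relations in one common convention; the constants may differ from those in the article's (A.1)–(A.9).

```latex
% Standard Ornstein-Uhlenbeck noise relations (assumed convention):
\begin{align*}
  d\xi &= -\frac{\xi}{\tau}\,dt + \frac{\sqrt{2D}}{\tau}\,dW_t, \\
  \langle \xi(t)\rangle &= \xi(0)\,e^{-t/\tau}, \\
  \langle \xi(t)\xi(s)\rangle - \langle\xi(t)\rangle\langle\xi(s)\rangle
      &= \frac{D}{\tau}\Bigl(e^{-|t-s|/\tau} - e^{-(t+s)/\tau}\Bigr)
      \;\longrightarrow\; \frac{D}{\tau}\,e^{-|t-s|/\tau} \quad (t,s\gg\tau), \\
  \lim_{\tau\to 0}\;\frac{D}{\tau}\,e^{-|t-s|/\tau} &= 2D\,\delta(t-s).
\end{align*}
```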

B. A Discrete Markov Process

We apply a specific Markov process in continuous time and discrete space. Fixing attention to a cell, we define (stochastically varying) as a number indicating whether this cell does not die or dies during the time interval from to It is reasonable to assume that the conditional point probabilities that this cell does not die, or dies, between and are where is given at time is an arbitrary function, and means the absolute value. Equation (B.2) applies to each cell (i.e., all cells are “alike”). This gives a recurrence relation: The conditional point probability is and is binomial. implies . Equation (B.3) implies where applying the law of total probability gives Rearranging and taking the limit as h tends to zero yields Proceeding with this Markov process, the right-hand side in (B.6) can be approximated as a continuous partial differential equation. Taylor expansion up to order 2 gives For large is a good approximation to the space step length, where is an arbitrarily small real increment without denomination. Inserting into (B.6) gives Assuming =0 for where is any arbitrary positive integer, and inserting (B.6) into the definition of gives Applying as an example the special initial condition the solution is
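A minimal simulation sketch of such a death process follows, assuming for illustration a constant per-cell death rate (the appendix allows an arbitrary function); the sample mean is compared with the exponential decay expected for a linear rate.

```python
import numpy as np

rng = np.random.default_rng(8)
g, dt, T, n0, n_runs = 0.5, 1e-2, 4.0, 200, 10_000   # per-cell death rate g (assumed value)
n_steps = int(T / dt)

n = np.full(n_runs, n0)
for _ in range(n_steps):
    deaths = rng.binomial(n, g * dt)     # each living cell dies with probability g*dt in a step
    n -= deaths

print(f"mean number left {n.mean():.1f}  vs  n0*exp(-g*T) = {n0*np.exp(-g*T):.1f}")
```

The small difference between the two numbers reflects the finite time step; it vanishes as the step length approaches zero, consistent with the continuum limit taken in (B.6)–(B.8).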

C. Ito Calculus for Multiplicative Noise

We do not in this section use separate notation to distinguish the Stratonovich from the Ito interpretation. Say that we have two solutions, (C.1b) and (C.1c), of the differential equation (C.1a). One must decide whether the Langevin equation (C.1a) should be solved by Ito or by Stratonovich integrals. Different numerical schemes for solving stochastic differential equations (i.e., Langevin models) driven by Gaussian white noise do not in general give the same solution. It can be shown that the Euler scheme is generally consistent with the Ito formulation, since the leftmost point of each interval is used in the sums that approximate the integral (thus not looking into the future!).

We have the following numerical scheme in the Ito formulation (C.1c), stated in (C.2), where the noise increment is drawn from a distribution with expectation zero and specified variance. The Fokker-Planck equation and the expectation according to (C.2) then follow (when assuming Gaussian noise or even multifractals). For the special case considered in Section 3 we find the expectation explicitly.

Say instead that (C.1a) is interpreted according to Stratonovich (C.1b). It can be shown that an Ito formulation exists which gives the same answer as the Stratonovich formulation, to read Thus the stochastic Ito differential equation that would give the Stratonovich solution is This gives the modified Fokker-Planck equation and the expectation as Setting we find that This is in agreement with the solution in (3.12). The Ito () and Stratonovich () “calculus” can be written for an arbitrary According to the development in (2.4) and (2.5) we have Thus In general the relation between the Ito integrals and the Stratonovich integral is
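The following sketch illustrates the difference numerically for the multiplicative-noise example dx = sigma*x dW (an illustrative choice): the Euler scheme converges to the Ito solution, whose expectation stays at the initial value, while a Heun-type predictor-corrector scheme converges to the Stratonovich solution, whose expectation grows as exp(sigma^2*t/2), consistent with the extra drift term discussed above.

```python
import numpy as np

rng = np.random.default_rng(9)
sigma, dt, T, n_paths = 0.5, 1e-3, 1.0, 200_000
n_steps = int(T / dt)

x_ito = np.ones(n_paths)
x_str = np.ones(n_paths)
for _ in range(n_steps):
    dW = np.sqrt(dt) * rng.standard_normal(n_paths)
    # Euler scheme: integrand evaluated at the left endpoint -> Ito.
    x_ito += sigma * x_ito * dW
    # Heun (predictor-corrector) scheme: endpoints averaged -> Stratonovich.
    pred = x_str + sigma * x_str * dW
    x_str += 0.5 * sigma * (x_str + pred) * dW

print(f"Ito <x> = {x_ito.mean():.3f} (expected 1.000)")
print(f"Stratonovich <x> = {x_str.mean():.3f} (expected {np.exp(0.5*sigma**2*T):.3f})")
```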

D. A Simple Deterministic Noise as an Example

Say that the noise is given by where is a time parameter. Let the density be given by This gives when we for simplicity set that Further we have for where is a time parameter, This gives that Thus the covariance or correlation is cyclic in the time difference.
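The specific noise and density of this appendix are given in formulas not reproduced above. Assuming, purely for illustration, the classical random-phase example xi(t) = a*sin(t/T0 + phi) with phi uniform on [0, 2*pi), the covariance is (a^2/2)*cos((t-s)/T0), cyclic in the time difference, which the sketch below checks.

```python
import numpy as np

rng = np.random.default_rng(10)
a, T0, n_samples = 1.0, 2.0, 1_000_000
phi = rng.uniform(0.0, 2 * np.pi, n_samples)   # random phase drawn from the assumed density

def xi(t):
    return a * np.sin(t / T0 + phi)            # one deterministic realization per phase value

t, s = 3.0, 1.2
cov_mc = np.mean(xi(t) * xi(s)) - xi(t).mean() * xi(s).mean()
print(f"sample covariance {cov_mc:.4f}  vs  (a**2/2)*cos((t-s)/T0) = {0.5*a**2*np.cos((t-s)/T0):.4f}")
```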

E. Methods 3, 4, 5 for Introducing Randomness

A third method to introduce randomness is by ordinary differential equations for the statistical moments of the probability distribution. These moments can be constructed ad hoc or found by mathematical manipulation of the partial differential equation for the probability density, or simply from a recurrence relation in time.

A fourth method of introducing randomness is the hydrodynamic method. Consider the variables as position and velocity for illustration, but the method applies generally. By integrating the equation for the joint distribution for two stochastic variables with respect to the second variable (velocity), the well-known equation for the conservation of probability in space is found. This equation, which is only the conservation of probability, can be used without referring to any stochastic theory. The equation includes the so-called current velocity. It is well known that in Boltzmann kinetic theory or in most Langevin models, the total derivative of the current velocity is equal to the classical force minus a term that involves the space derivative of the velocity variance (conditioned on position) and the probability density [20]. Now, the equation for the conservation of probability in space is the first partial differential equation. As a second equation, set the total derivative of the current velocity equal to the classical force minus a term involving the space derivative of the velocity variance, as in Boltzmann's kinetic theory or in Langevin models. Thus randomness can be accounted for by constitutive relations for this variance without postulating a relation for a joint or quasijoint distribution [4]. For the Liouville process realizations in the position-velocity space (phase space) cannot cross. In addition, for a conservative classical force, all realizations that start at the same position will have a unique velocity at a given position when applying the Liouville process, which implies that the variance term vanishes. The equation for the total derivative of the current velocity, which now equals the classical force, can be integrated in space to give the familiar Hamilton-Jacobi equation in classical mechanics as a special case. More generally, the total derivative of the current velocity of the Liouville and the Ornstein-Uhlenbeck (1930) processes (assuming uncorrelated Gaussian noise) has also been analyzed when assuming initial conditions in position and velocity that are independent and Gaussian distributed. It has been shown that the variance term is independent of x but time-dependent for the free particle or for the harmonic oscillator [33–35]. We believe that this hydrodynamic method can be useful when experimental data pertain to the variables in the equation set, and there is no direct experimental access to microscopic dynamics.

The fifth method of introducing randomness is quantization rules or Nelson's [5] approach. Quantization rules "transform" second-order ordinary differential equations into stochastic equations, assuming that the system of ordinary differential equations follows from Lagrange's formalism. Hojman [36] and Gomeroff and Hojman [37] provide several examples of the construction of Hamilton structures without using any Lagrangian. In the double slit experiment interference appears and the received theory states that the density solution is impossible to construct from Markovian recurrence relations [38]. It turns out that the density solution is equal to the Liouville density solution, and realizations that follow deterministic trajectories can then be used to "count up" the density. Realizations can always be constructed by drawing positions from the density solution at each time step. Such realizations can also be constructed from a Liouville density, but they are certainly different from the classical tracks in that case. A mixture of classical stochastic theory and quantum mechanics has been developed by the use of a quasiclassical Langevin equation. The variables are nonoperator quantities while the spectrum of the random force relates to the zero point fluctuation (that is proportional to frequency) and the Planck spectrum [39, 40]. Nelson's [5, 41, 42] stochastic version of quantum mechanics is believed to be equivalent to quantum mechanics for predictions of the outcomes of experiments. Blanchard et al. [43] show that Nelson's approach is able to describe in a unified language the deterministic and random aspects of quantum evolution. The approach has features analogous to the Ornstein-Uhlenbeck (1930) theory. But the increment in position is not written as the classical velocity times the time step, as in the Ornstein-Uhlenbeck (1930) theory, but as a general drift field (that depends on time and position) times the time step plus a random uncorrelated Gaussian term. In addition, the drift is found by setting the average sum of the so-called backward and forward infinitesimal operators applied to the drift equal to the classical force. By manipulating the equation set the Schrödinger equation follows easily. The work by Nelson has stimulated more recent and refined studies on the stochastic approach. See Albeverio and Høegh-Krohn [44], Ezawa et al. [45], Albeverio et al. [46], Guerra [47], Carlen [48], Zheng [49], Zambrini [50], Blaquière [51], Garbaczewski [4], Blanchard and Garbaczewski [52], Garbaczewski and Olkiewicz [53], Garbaczewski and Klauder [33], Czopnik and Garbaczewski [34], and Garbaczewski [35]. See also Bratteli and Robinson [54] for the statistical mechanics of continuous quantum systems. For a study of statistical interpretations of quantum mechanics see Ballentine [55].

F. The Duffing Equation

The Duffing equation is a nonlinear second-order differential equation exemplifying a dynamical system that exhibits chaotic behavior; in its simplest form it is considered without forcing, as in (F.1). Equations (F.1) describe the motion of a damped oscillator with a more complicated potential than in simple harmonic motion. In physical terms, it models, for example, a spring pendulum whose spring stiffness does not exactly obey Hooke's law. The Duffing equation cannot in general be solved in closed form, though for the special case of the undamped Duffing equation an exact solution can be obtained using Jacobi's elliptic functions. Numerically, Euler's method, the Runge-Kutta method, and various other numerical methods can be used.
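A minimal sketch that integrates an unforced, damped Duffing equation of the form x'' + delta*x' + alpha*x + beta*x**3 = 0 with a fourth-order Runge-Kutta step follows; the parameter values and the double-well choice alpha < 0, beta > 0 are illustrative assumptions and not the article's (F.1).

```python
import numpy as np

delta, alpha, beta = 0.2, -1.0, 1.0       # damping and stiffness parameters (assumed values)

def rhs(state):
    x, v = state
    return np.array([v, -delta * v - alpha * x - beta * x**3])

def rk4_step(state, dt):
    k1 = rhs(state)
    k2 = rhs(state + 0.5 * dt * k1)
    k3 = rhs(state + 0.5 * dt * k2)
    k4 = rhs(state + dt * k3)
    return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

state = np.array([1.0, 0.0])              # initial position and velocity
dt = 1e-2
for _ in range(5000):
    state = rk4_step(state, dt)
x_final, v_final = state
print(f"x(50) = {x_final:.4f}, v(50) = {v_final:.4f}")
```

With damping the trajectory settles into one of the two potential wells, while removing the damping term recovers the conservative case mentioned above.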

To demonstrate, using the Duffing equation, that the methods in this paper work, we propose three approaches. First, substituting the Duffing force into (4.6) gives an analytical expression for the probability density as a function of position and velocity, with the expected limiting behavior.

Second, replacing with and replacing with and substituting into (4.21) give Third, substituting into (4.22) gives

Notation

:Time
:Correlation time parameter
:Initial time
:Temperature
:Stochastic space variable when Stratonovich is applied
:Stochastic space variable when Stratonovich is applied
:Stochastic space variable when Ito is applied
:Stochastic space variable when Ito is applied
:Function of a space variable
:Deterministic space variable
:Deterministic space variable
:Time increment
:Stochastic variable,
:Arbitrary function
:Well-behaved function specified exogenously,
:Probability density of
:Forward infinitesimal operator of the process
:Arbitrary function such that
:
:
:Positive parameter
:Dirac delta function
:Arbitrary noise function
:Parameter, initial value of
:
:Stochastic variable
:
:White noise
: means Ito, means Stratonovich
:Integration variable
:Stochastic variable
:Deterministic variable
:Conditional point probability
:Arbitrary function
:Absolute value of
:Small increment without denomination
:Stochastic variable
:Time parameter
:Probability density.

Acknowledgments

The authors thank an anonymous referee of this journal and Dr. Joseph McCauley for useful comments on the manuscript.

Endnotes

  1. The space is not necessarily the familiar Euclidean space of everyday life. We distinguish between cases which are discrete and continuous in time or space. See Taylor and Karlin [56] for a mathematical definition of stochastic processes, which is not replicated here. Briefly, the usual situation is to have a set of random variables defined for all values of a real index (assume time), which could be discrete or continuous. The outcome of a random variable is a state value (often a real number). The set of random variables is called a stochastic process, which is completely determined if the joint distribution of the set of random variables is known. A realization of the stochastic process is an assignment of a state value to each index in the set.
  2. In the narrowest sense, a stochastic process has the Markov property if the probability of having state at time conditioned on having the particular state at time is equal to the conditional probability of having that same state but conditioned on its value for all previous times before See Feller [57] for a broader definition. However, a Markov process may be deterministic; that is, all values of the process at time are determined when the value is given at time Or a process may be nondeterministic; that is, a knowledge of the process at time is only probabilistically useful in specifying the process at time
  3. By “counting up” the different realizations (tracks) in the state space the joint distribution can be constructed. Although counting up all different realizations in general constructs the joint probability, the inverse does not hold. Hence the joint probability of the set of random variables does not lead to a unique recurrence relation.
  4. The Liouville equation is present in most standard text books of statistical physics. See, for instance, Lifshitz and Pitaevskĭ [58]. By inserting an initial Dirac delta distribution into the Liouville equation, the distribution remains a Dirac delta distribution for all times. Marquis de Laplace [59] wrote on determinism: "We ought to regard the present state of the universe as the effect of its anterior state and as the cause of the one which is to follow. Given for one instant an intelligence which could comprehend all the forces by which nature is animated and the respective situation of the beings who compose it—an intelligence sufficiently vast to submit these data to analysis—it would embrace in the same formulae the movements of the greatest bodies of the universe and those of the lightest atom; for it, nothing would be uncertain and the future, as the past, would be present to its eyes. The human mind offers, in the perfection which it has been able to give astronomy, a feeble idea of this intelligence. Its discoveries in mechanics and in geometry, added to that of universal gravity, have enabled it to comprehend in the analytical expressions the past and the future state of the system of the world". This shows that the Liouville equation supports Laplace's world view, that the future can be foreseen, and the past can be recovered to any desired accuracy by finding sufficiently precise initial data and finding sufficiently powerful laws of nature. In the early 1900s Poincaré supplemented this view by pointing out the possibility that very small differences in the initial conditions may produce large differences in the final phenomena. Poincaré further argued that the initial conditions always are uncertain. Poincaré [60] wrote on chaos: "A very small cause which escapes our notice determines a considerable effect that we cannot fail to see, and then we assume that the effect is due to chance. If we knew exactly the laws of nature and the situation of the universe at the initial moment, we could predict exactly the situation of that same universe at the following moment. But, even if it were the case that the natural laws had no longer any secrets for us, we could still only know the initial conditions approximately. If that enabled us to predict the succeeding situation with the same approximation, that is all we require, and we should say that the phenomenon has been predicted, that is, governed by laws. But it is not always so; it may happen that small differences in the initial conditions produce very great ones in the final phenomena. A small error in the former will produce an enormous error in the latter. Prediction becomes impossible, and we have the fortuitous phenomenon". The marginal probability density can be found by integrating out other variables of the joint probability [61, 62].
  5. In quantum physics, Nelson's [5] approach is able to describe in a unified language the deterministic and random aspects of quantum evolution [43]. Some students believe that Newton's second law is a law of nature (a law that can be falsified by experiments), but actually it is only a definition of force. The mathematical structure, however, suggests that it is easy to find a new relation for the "force", so that the equation set becomes closed. It is not obvious that a mathematical theory accounting for randomness or noise is most easily formulated by seeking to model a random force. In this way Nelson's approach suggests that, at least in quantum mechanics, the force is not so easy to find, and that the mathematical structure can most easily be built from mathematical principles different from Newton's second law. But in principle, the mathematical principle chosen to model randomness or noise is not given a priori. This applies both for the phenomena which we call quantum phenomena and for the phenomena that we call classical stochastic phenomena (where we usually seek a random or noise force).