Abstract

Because 1/f noise is attracting increasing interest in biomedical signal processing and the study of living systems, we present this introductory survey, which may suffice to exhibit the elementary properties of 1/f noise and its particularities in comparison with conventional random functions. Three theorems are given to highlight these particularities. The first says that a random function with long-range dependence (LRD) is a 1/f noise. The second indicates that a heavy-tailed random function is in the class of 1/f noise. The third provides a type of stochastic differential equation that produces 1/f noise.

1. Introduction

The pioneering work on noise may be traced back to the paper by Schottky [1], where he introduced the concept of two classes of noise. One class is thermal noise, such as the random motion of molecules in conductors. The other is the shot noise, which may be caused by the randomness of the emission from the cathode and the randomness of the velocity of the emitted electrons [1, 2]. Johnson described the latter using the term Schottky effect [3], which may be the first paper in the sense of expressing such a type of process by the term 1/f noise.

Let $x(t)$ be a random function. Let $S_{xx}(\omega)$ be its power spectrum density (PSD) function, where $f$ is frequency and $\omega = 2\pi f$ is radian frequency. Then, by 1/f noise, one means that $S_{xx}(\omega) \to \infty$ for $\omega \to 0$. Note that the PSD of a conventional random function is convergent at $\omega = 0$. In the field, the term "1/f noise" is a collective noun, which in fact implies $1/f^{\alpha}$ noise. In the general case, 1/f noise has the meaning of $1/f^{\alpha}$ noise for $\alpha \in \mathbb{R}$, where $\mathbb{R}$ is the set of real numbers; see, for example, [4]. However, since $S_{xx}(\omega)$ does not diverge at $\omega = 0$ for $\alpha \le 0$, one may usually not be interested in the case of $\alpha \le 0$. In what follows, we discuss the noise of type $1/f^{\alpha}$ for $\alpha > 0$ unless otherwise stated.
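As a numerical illustration of this definition (a minimal sketch, not part of the original paper; the sample size, the exponent $\alpha$, and the frequency bands are arbitrary assumptions), one may synthesize an approximate $1/f^{\alpha}$ series by shaping the spectrum of white noise and observe that its estimated PSD grows as the frequency approaches zero:

```python
# Minimal sketch: synthesize an approximate 1/f^alpha series by shaping the
# spectrum of Gaussian white noise, then check that the estimated PSD is much
# larger near f = 0 than at mid frequencies. All parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n, alpha = 2**16, 1.0                      # sample size and spectral exponent

white = rng.standard_normal(n)
spec = np.fft.rfft(white)
freqs = np.fft.rfftfreq(n, d=1.0)
scale = np.ones_like(freqs)
scale[1:] = freqs[1:] ** (-alpha / 2.0)    # |scaled spectrum|^2 ~ f^(-alpha)
x = np.fft.irfft(spec * scale, n)

psd = np.abs(np.fft.rfft(x)) ** 2 / n      # periodogram estimate of the PSD

low = psd[(freqs > 0) & (freqs < 1e-3)].mean()
mid = psd[(freqs > 1e-2) & (freqs < 1e-1)].mean()
print(f"mean PSD near f = 0: {low:.3g}, at mid frequencies: {mid:.3g}")
# For alpha > 0 the low-frequency average is orders of magnitude larger,
# which reflects the divergence at omega = 0 discussed above.
```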

Since the notion of 1/f noise appeared [3], it has gained the increasing interest of scientists in various fields, ranging from bioengineering to computer networks; see, for example, [4–54], simply to mention a few. That fact gives rise to the question of what 1/f noise is. The question may be roughly answered in the way that a 1/f noise is such that its PSD is divergent at $\omega = 0$, as previously mentioned. Nonetheless, that answer may never be enough to describe the full picture of 1/f noise. By full picture, we mean that we should describe a set of main properties of 1/f noise, in addition to the property in the frequency domain. Regarding the divergence of the PSD at $\omega = 0$ as its first property, denoted by P1, we would like to list the other three as follows.

P2: What is the qualitative structure of an autocorrelation function (ACF) of 1/f noise? For this property, we will discuss the statistical dependence based on the hyperbolically decayed ACF structure.

P3: What is the main property of its probability density function (PDF)? With P3, we will explain the heavy-tailed property of 1/f noise, which may produce random functions without mean or variance.

P4: What is the possible structure of the differential equation that synthesizes 1/f noise?

For facilitating the description of the full picture of 1/f noise, we will brief the preliminaries in Section 2. Then, P2–P4 will be discussed in Sections 3–5, respectively. After that, we will conclude the paper in Section 6.

2. Preliminaries

2.1. Dependence of Random Variables

A time series $x(t)$ may also be called a random function [55]. The term random function exhibits that, for each fixed $t$, $x(t)$ is a random variable. We would like to discuss the dependence of $x(t_1)$ and $x(t_2)$ for $t_1 \ne t_2$.

2.1.1. Dependence Description of Random Variables with Probability

Let $P(A)$ be the probability of the event $A$. Denote by $P(B)$ the probability of the event $B$. Then [56], $A$ and $B$ are said to be independent events if

$$P(AB) = P(A)P(B). \quad (2.2)$$

If $A$ and $B$ are dependent, on the other side,

$$P(AB) = P(A)P(B \mid A), \quad (2.3)$$

where $P(B \mid A)$ is the conditional probability, implying the probability of the event $B$ provided that $A$ has occurred.

Note 1. The dependence of $A$ and $B$ is reflected in the conditional probability $P(B \mid A)$. If $A$ and $B$ are independent, $P(B \mid A) = P(B)$.

2.1.2. Dependence Description of Gaussian Random Variables with Correlation

Let $x(t)$ be a Gaussian random function. Denote by $\rho(t_1, t_2)$ the correlation coefficient between $x(t_1)$ and $x(t_2)$. Then [57, 58], $x(t_1)$ and $x(t_2)$ are independent if

$$\rho(t_1, t_2) = 0. \quad (2.4)$$

On the other hand, $x(t_1)$ and $x(t_2)$ are dependent provided that

$$\rho(t_1, t_2) \ne 0. \quad (2.5)$$

The condition (2.4) or (2.5), expressed by the correlation coefficient, may not be enough to identify the independence or dependence of a Gaussian random function completely. For example, with $\tau = t_2 - t_1$, the fact that (2.4) holds for large $\tau$ may not imply that it is valid for small $\tau$. In other words, one may encounter the situations expressed by

$$\rho(t_1, t_2) = 0 \quad \text{for large } |t_2 - t_1|, \quad (2.6)$$

$$\rho(t_1, t_2) \ne 0 \quad \text{for small } |t_2 - t_1|. \quad (2.7)$$

By using the concept of probability, (2.6) corresponds to

$$P(AB) = P(A)P(B) \quad \text{for large } |t_2 - t_1|. \quad (2.8)$$

Similarly, (2.7) corresponds to

$$P(AB) = P(A)P(B \mid A) \quad \text{for small } |t_2 - t_1|. \quad (2.9)$$

Note 2. The notion of the dependence or independence of a set of random variables plays a role in the axiomatic approach of probability theory and stochastic processes; see Kolmogorov [59].

The above example exhibits the interesting fact that the dependence or independence relies on the observation scale or observation range. In conventional time series, we do not usually consider the observation scale. That is, (2.2) and (2.3), or (2.4) and (2.5), hold for all observation ranges, no matter whether $|t_2 - t_1|$ is small or large; see, for example, [60, 61]. Statistical properties that depend on observation ranges are briefed by Papoulis and Pillai [56] and Fuller [62] but detailed in Beran [63, 64].

Note 3. Kolmogorov's work on the axiomatic approach of probability theory and stochastic processes needs the assumption that $x(t_1)$ and $x(t_2)$ are independent in most cases, likely for the completeness of his theory. Nevertheless, he contributed a lot to random functions that are range dependent; see, for example, [65, 66].

2.1.3. Dependence Description of Gaussian Random Variables with ACF

Denote by $r_{xx}(t_1, t_2)$ the ACF of $x(t)$. Denote by E the operator of the mean. Then, it is given by

$$r_{xx}(t_1, t_2) = \mathrm{E}[x(t_1)x(t_2)]. \quad (2.10)$$

It represents the correlation between the one point $x(t_1)$ and the other point $x(t_2)$ apart by the lag $\tau = t_2 - t_1$. For facilitating the discussions, we assume that $x(t)$ is stationary in the wide sense (stationary for short). With this assumption, $r_{xx}$ only relies on the lag $\tau$. Therefore, one has

$$r_{xx}(\tau) = \mathrm{E}[x(t)x(t + \tau)]. \quad (2.11)$$

In the normalized case, one considers

$$\rho_{xx}(\tau) = \frac{r_{xx}(\tau)}{r_{xx}(0)}, \quad |\rho_{xx}(\tau)| \le 1. \quad (2.12)$$

The ACF is a convenient tool for describing the dependence of a Gaussian random function $x(t)$. For instance, on the one hand, we say that any two different points of $x(t)$ are uncorrelated, and accordingly independent as $x(t)$ is Gaussian, if $\rho_{xx}(\tau) = 0$ for $\tau \ne 0$. That is the case of Gaussian white noise. On the other hand, any two different points of $x(t)$ are the most strongly dependent if $|\rho_{xx}(\tau)| = 1$. This is the case of the strongest long-range dependence. In the case of $0 < |\rho_{xx}(\tau)| < 1$, the value of $\rho_{xx}(\tau)$ varies with the lag $\tau$.

A useful measure called the correlation time, which is denoted by $\tau_c$ [67, page 74], is defined in the form

$$\tau_c = \frac{1}{r_{xx}(0)}\int_{0}^{\infty} r_{xx}(\tau)\, d\tau. \quad (2.13)$$

By correlation time, we say that the correlation can be neglected if $\tau_c \ll T$, where $T$ is the time scale of interest [67].

Note 4. For a conventional Gaussian random function $x(t)$, its correlation can be neglected if $\tau_c \ll T$. This implies that the statistical dependence or independence of $x(t)$ relies on its correlation time $\tau_c$. However, we will show in Note 8 that the correlation time fails if $x(t)$ is a 1/f noise.
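To illustrate Note 4 numerically (an assumption-based sketch, not taken from the original text), one may evaluate the integral in (2.13) truncated at an upper limit $T$ for an exponentially decaying ACF and for a hyperbolically decaying one; the correlation time of the former settles to a finite value, whereas that of the latter keeps growing with the upper limit:

```python
# Assumption-based sketch: compare the correlation time of (2.13), truncated
# at an upper limit T, for an exponential ACF and a hyperbolic (LRD-like) ACF.
import numpy as np

def tau_c(acf, T, num=4000):
    # integrate on a logarithmically spaced grid so that both the behavior
    # near tau = 0 and the long tail up to T are resolved
    tau = np.concatenate(([0.0], np.logspace(-6, np.log10(T), num)))
    r = acf(tau)
    return np.sum(0.5 * (r[1:] + r[:-1]) * np.diff(tau)) / r[0]

exp_acf = lambda tau: np.exp(-tau)             # conventional short-range ACF
hyp_acf = lambda tau: (1.0 + tau) ** (-0.5)    # hyperbolic ACF, gamma = 0.5

for T in (1e2, 1e4, 1e6):
    print(f"T = {T:9.0e}   tau_c(exponential) = {tau_c(exp_acf, T):7.3f}   "
          f"tau_c(hyperbolic) = {tau_c(hyp_acf, T):10.1f}")
# The exponential case settles near 1; the hyperbolic case grows roughly like
# 2*sqrt(T), i.e., its correlation time diverges (see Note 8 below).
```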

2.2. ACF and PSD

The PSD of $x(t)$ is the Fourier transform of its ACF. That is,

$$S_{xx}(\omega) = \int_{-\infty}^{\infty} r_{xx}(\tau)\, e^{-i\omega\tau}\, d\tau. \quad (2.14)$$

Equivalently,

$$r_{xx}(\tau) = \frac{1}{2\pi}\int_{-\infty}^{\infty} S_{xx}(\omega)\, e^{i\omega\tau}\, d\omega. \quad (2.15)$$

Letting $\omega = 0$ in (2.14) yields

$$S_{xx}(0) = \int_{-\infty}^{\infty} r_{xx}(\tau)\, d\tau. \quad (2.16)$$

Similarly, letting $\tau = 0$ in (2.15) produces

$$r_{xx}(0) = \frac{1}{2\pi}\int_{-\infty}^{\infty} S_{xx}(\omega)\, d\omega. \quad (2.17)$$

Note 5. Equation (2.16) implies that an ACF is integrable if $S_{xx}(0) < \infty$. On the other side, a PSD is integrable when $r_{xx}(0) < \infty$, as indicated in (2.17). Both are usual cases for conventional random functions.

Note 6. The noise of type $1/f^{\alpha}$ ($\alpha > 0$) has the property $S_{xx}(0) = \infty$, which makes 1/f noise substantially different from conventional random functions.

2.3. Mean and Variance

Denote by $p(x; t)$ the PDF of $x(t)$. Then, the mean, denoted by $\mu_x(t)$, is given by

$$\mu_x(t) = \mathrm{E}[x(t)] = \int_{-\infty}^{\infty} x\, p(x; t)\, dx. \quad (2.18)$$

The variance, denoted by $\sigma_x^2(t)$, is in the form

$$\sigma_x^2(t) = \mathrm{E}\{[x(t) - \mu_x(t)]^2\} = \int_{-\infty}^{\infty} [x - \mu_x(t)]^2\, p(x; t)\, dx. \quad (2.19)$$

Assume that $x(t)$ is stationary. Then, $\mu_x$ and $\sigma_x^2$ do not rely on time $t$. In this case, they are expressed by

$$\mu_x = \int_{-\infty}^{\infty} x\, p(x)\, dx, \qquad \sigma_x^2 = \int_{-\infty}^{\infty} (x - \mu_x)^2\, p(x)\, dx. \quad (2.20)$$

Without loss of generality, we always assume that $x(t)$ is stationary in what follows unless otherwise stated.

Denote by $C_{xx}(\tau)$ the autocovariance of $x(t)$. Then,

$$C_{xx}(\tau) = \mathrm{E}\{[x(t) - \mu_x][x(t + \tau) - \mu_x]\}. \quad (2.21)$$

Taking into account (2.19) and (2.21), one has

$$C_{xx}(0) = \sigma_x^2. \quad (2.22)$$

Denote by $\Psi_x^2$ the mean square value of $x(t)$. Then,

$$\Psi_x^2 = \mathrm{E}[x^2(t)] = r_{xx}(0). \quad (2.23)$$

Considering $r_{xx}(\tau)$ given by (2.11), one has

$$r_{xx}(\tau) = C_{xx}(\tau) + \mu_x^2. \quad (2.24)$$

Therefore,

$$\sigma_x^2 = r_{xx}(0) - \mu_x^2 = \Psi_x^2 - \mu_x^2. \quad (2.25)$$
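A quick numerical check of these relations (illustrative only; the sample and its mean are arbitrary choices) verifies that the sample variance agrees with the mean square value minus the squared mean, as in (2.25):

```python
# Illustrative check of the relations among the mean, the mean square value,
# the variance, and the ACF at lag zero for a stationary sample.
import numpy as np

rng = np.random.default_rng(1)
x = 2.0 + rng.standard_normal(100_000)    # stationary sample with mean about 2

mu = x.mean()                             # estimate of mu_x, cf. (2.20)
psi2 = np.mean(x ** 2)                    # mean square value Psi_x^2 = r_xx(0), cf. (2.23)
var = x.var()                             # estimate of sigma_x^2

print(f"Psi^2 - mu^2 = {psi2 - mu ** 2:.6f}")
print(f"sigma^2      = {var:.6f}")
# The two values coincide up to floating-point error, in line with (2.25).
```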

It is worth noting that the above numerical characteristics are crucial to the analysis of $x(t)$ in practice; see, for example, [60–62, 65–87], just to cite a few. However, things become complicated if $x(t)$ is in the class of 1/f noise. In Section 4 below, we will show that $\mu_x$ and/or $\sigma_x^2$ may not exist for specific types of 1/f noise.

3. Hyperbolically Decayed ACFs and 1/f Noise

The qualitative structure of the PSD of 1/f noise is in the form

$$S_{xx}(\omega) \sim \frac{c}{|\omega|^{\alpha}} \quad (\omega \to 0), \quad (3.1)$$

where $c > 0$ is a constant. Since $S_{xx}(0) = \infty$, as we previously mentioned several times, its ACF is nonintegrable over $(-\infty, \infty)$. That is,

$$\int_{-\infty}^{\infty} r_{xx}(\tau)\, d\tau = \infty. \quad (3.2)$$

Note that the above may be taken as a definition of the LRD property of a random function; see, for example, [4, 63, 64, 88–90]. Thus, the above exhibits that 1/f noise is LRD. Consequently, $r_{xx}(\tau)$ of a 1/f noise may have the asymptotic property given by

$$r_{xx}(\tau) \sim \frac{c_{\gamma}}{|\tau|^{\gamma}} \quad (\tau \to \infty), \quad 0 < \gamma < 1. \quad (3.3)$$

Following [91–93], we have the Fourier transform of $|\tau|^{-\gamma}$ in the form

$$\mathrm{F}\left[|\tau|^{-\gamma}\right] = 2\sin\left(\frac{\pi\gamma}{2}\right)\Gamma(1 - \gamma)\,|\omega|^{\gamma - 1}. \quad (3.4)$$

From the above, we have

$$S_{xx}(\omega) \sim 2 c_{\gamma}\sin\left(\frac{\pi\gamma}{2}\right)\Gamma(1 - \gamma)\,|\omega|^{\gamma - 1} \quad (\omega \to 0), \quad (3.5)$$

which is of the type $1/f^{\alpha}$ with $\alpha = 1 - \gamma$. Thus, we have the following note.

Note 7. Qualitatively, the ACF of 1/f noise has the structure of a power function; it follows a power law. This is the answer to P2 explained in the Introduction.
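The power-law relation behind Note 7 can be checked numerically. The sketch below (grid sizes and the exponent $\gamma$ are assumptions made for illustration, not part of the paper) takes a hyperbolically decaying ACF, computes a discrete approximation of its Fourier transform, and fits the low-frequency slope, which should come out close to $\gamma - 1$, as suggested by (3.4):

```python
# Numerical sketch: the discrete cosine-type transform of a hyperbolic ACF
# r(tau) = (1 + tau)^(-gamma), 0 < gamma < 1, follows an approximate power law
# near omega = 0 with exponent close to gamma - 1. Grid sizes are illustrative.
import numpy as np

gamma_, dt, n = 0.4, 0.5, 2**22
tau = np.arange(n) * dt
r = (1.0 + tau) ** (-gamma_)                  # one-sided hyperbolic ACF

# one-sided cosine transform approximates the PSD of the (even) ACF
freqs = np.fft.rfftfreq(n, d=dt)
S = 2.0 * np.real(np.fft.rfft(r)) * dt

band = (freqs > 1e-5) & (freqs < 1e-4)        # low-frequency fitting band
slope = np.polyfit(np.log(freqs[band]), np.log(S[band]), 1)[0]
print(f"estimated low-frequency slope: {slope:.2f} "
      f"(expected about {gamma_ - 1.0:.2f})")
```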

Example 3.1. A well-known example of 1/f noise is the fractional Gaussian noise (fGn) [93, 94]. Its ACF is in the form

$$r_{\mathrm{fGn}}(\tau) = \frac{V_H}{2}\left(|\tau + 1|^{2H} - 2|\tau|^{2H} + |\tau - 1|^{2H}\right),$$

where $0 < H < 1$ is the Hurst parameter and $V_H$ is the strength of fGn. For $1/2 < H < 1$, this ACF decays hyperbolically, like $\tau^{2H - 2}$, so fGn is LRD. Its PSD is in the form [93]

$$S_{\mathrm{fGn}}(\omega) = V_H \sin(H\pi)\,\Gamma(2H + 1)\,|\omega|^{1 - 2H},$$

which is divergent at $\omega = 0$ for $1/2 < H < 1$.
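For readers who wish to verify the LRD of fGn numerically, the following sketch (using the discrete-lag form of the ACF above with $V_H = 1$; the lag range is an arbitrary assumption) shows that the partial sums of the ACF keep growing for $H > 1/2$ but converge for $H < 1/2$:

```python
# Illustrative check that the fGn ACF above is nonsummable for 1/2 < H < 1
# (LRD, hence 1/f noise) and summable for 0 < H < 1/2.
import numpy as np

def fgn_acf(k, H, V=1.0):
    k = np.abs(np.asarray(k, dtype=float))
    return 0.5 * V * ((k + 1) ** (2 * H) - 2 * k ** (2 * H)
                      + np.abs(k - 1) ** (2 * H))

lags = np.arange(0, 10**6 + 1)
for H in (0.3, 0.7):
    r = fgn_acf(lags, H)
    print(f"H = {H}: partial sum to 10^3 = {r[:10**3].sum():7.3f}, "
          f"to 10^6 = {r.sum():9.3f}")
# For H = 0.7 the partial sums keep increasing with the number of lags;
# for H = 0.3 they have already converged.
```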

Example 3.2. Another example of 1/f noise is the fractional Brownian motion (fBm). Since fBm is nonstationary, its PSD is understood in the generalized (averaged) sense. The PSD of the fBm of the Weyl type is in the form [95, 96]

$$S_{B_H}(\omega) \propto \frac{1}{|\omega|^{2H + 1}}.$$

The PSD of the fBm of the Riemann-Liouville type, given in [97–99], is time dependent and is expressed in terms of the Bessel function and the Struve function, whose orders are determined by $H$.

Example 3.3. The Cauchy-class process with LRD discussed in [100, 101] is a case of 1/f noise. Its ACF is of the Cauchy type and decays hyperbolically for large lags. Its PSD, given in [100, 101], is expressed in terms of the modified Bessel function of the second kind, and it has an asymptotic power-law expression for $\omega \to 0$, being 1/f noise.

Example 3.4. The generalized Cauchy process with LRD reported in [102–105] is an instance of 1/f noise. Its ACF is given by

$$r(\tau) = \left(1 + |\tau|^{\alpha}\right)^{-\beta/\alpha},$$

where $0 < \alpha \le 2$ and $0 < \beta < 1$. Its PSD in the complete form refers to [92]. The following asymptotic expression may suffice to exhibit its 1/f-noise behavior [102]:

$$S(\omega) \sim c\,|\omega|^{\beta - 1} \quad (\omega \to 0),$$

where $c$ is a constant. Before this section finishes, it may be worthwhile for us to write a theorem and a note.

Theorem 3.5. If a random function $x(t)$ is LRD, then it belongs to the class of 1/f noise, and vice versa.

Proof. LRD implies that the right side of (2.16) is divergent, which implies that $x(t)$ is a 1/f noise. On the other side, when $x(t)$ is a 1/f noise, $S_{xx}(0) = \infty$, which means that its ACF is nonintegrable, hence LRD.

Since any random function with LRD is in the class of 1/f noise, one may observe other types of random functions that belong to 1/f noise from the view of LRD processes, for example, those described in [106–112].

Note 8. As 1/f noise is of LRD, its ACF is nonintegrable over $(0, \infty)$. Thus, in (2.13), its correlation time $\tau_c = \infty$. That implies that correlations at any time scale cannot be neglected. Consequently, the measure of correlation time fails to characterize the statistical dependence of 1/f noise.

4. Heavy-Tailed PDFs and 1/f Noise

Heavy-tailed PDFs are widely observed in various fields of sciences and technologies, including life science and bioengineering; see, for example, [113–146]. Typical heavy-tailed PDFs are the Pareto distribution, the log-Weibull distribution, the stretched exponential distribution, the Zipfian distribution, the Lévy distribution, and the Cauchy distribution; see, for example, [51, 96, 125, 143–169], merely to cite a few.

By heavy tails, we mean that the tail of a PDF decays more slowly than the tail of a PDF of the exponential type. More precisely, the term heavy tail implies that the tail of the PDF of a random function $x(t)$ decays so slowly that $C_{xx}(\tau)$ in (2.21), or $r_{xx}(\tau)$ in (2.24), decays hyperbolically such that $x(t)$ is of LRD. Thus, we may have the theorem below.

Theorem 4.1. Let $x(t)$ be a heavy-tailed random function. Then, it is in the class of 1/f noise, and vice versa.

Proof. Considering that $x(t)$ is heavy tailed, we may assume that its ACF decays hyperbolically, that is, $r_{xx}(\tau) \sim c\tau^{-\gamma}$ for $\tau \to \infty$ and $0 < \gamma < 1$. According to (3.3)–(3.5), therefore, we see that $x(t)$ is a 1/f noise. On the other side, if $x(t)$ is a 1/f noise, it is LRD and accordingly heavy tailed [54, 146]. This completes the proof.

Note 9. Theorem 4.1 may be taken as an answer to P3 stated in the Introduction. Since $\gamma$ in Theorem 4.1 is restricted to (0, 1), Theorem 4.1 is consistent with Taqqu's law; see [53, 54, 89] for the details of Taqqu's law.

Note 10. The tail of the PDF of $x(t)$ may be so heavy that the mean or variance of $x(t)$ does not exist.

A commonly used instance to clarify Note 10 is the Pareto distribution; see, for example, [96, 134, 147, 148, 161]. To clarify it further, we would like to write more. Denote by $f_C(x)$ the PDF of the Cauchy distribution. Then, one has

$$f_C(x) = \frac{1}{\pi}\,\frac{b}{(x - a)^2 + b^2}, \quad (4.1)$$

where $b$ is the half width at half maximum and $a$ is the statistical median [170]. Its $n$th moment, denoted by $m_n$, is computed by

$$m_n = \int_{-\infty}^{\infty} x^{n} f_C(x)\, dx. \quad (4.2)$$

Since the above integral is divergent for $n \ge 1$, the mean and variance of $x$ obeying the Cauchy distribution do not exist.
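A short simulation (illustrative, not part of the paper) makes the nonexistence of the Cauchy mean tangible: the sample mean of Cauchy distributed data does not settle down as the sample size grows:

```python
# Illustrative simulation: the running sample mean of Cauchy distributed data
# does not converge, reflecting the nonexistence of the mean discussed above.
import numpy as np

rng = np.random.default_rng(2)
x = rng.standard_cauchy(10**7)           # b = 1, median a = 0

for n in (10**3, 10**5, 10**7):
    print(f"n = {n:>8}: sample mean = {x[:n].mean():10.3f}")
# Unlike a Gaussian sample, the running mean jumps around (it is itself
# Cauchy distributed), and the sample variance grows without bound.
```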

Note 11. Application of the Cauchy distribution to network traffic modeling refers to [169]. The generalized Cauchy distribution is reported in [171173].

Another type of heavy-tailed random function without mean and variance is the Lévy distribution; see, for example, [162–166, 174]. Suppose that $x$ follows the Lévy distribution, whose PDF is denoted by $f_L(x)$. Then, $f_L(x)$ for $x \ge \mu$ is given by

$$f_L(x) = \sqrt{\frac{c}{2\pi}}\;\frac{e^{-c/[2(x - \mu)]}}{(x - \mu)^{3/2}}, \quad (4.3)$$

where $\mu$ is the location parameter and $c$ is the scale parameter. The $n$th moment of a Lévy distributed random function, for simplicity by letting $\mu = 0$, is given by

$$m_n = \sqrt{\frac{c}{2\pi}}\int_{0}^{\infty} x^{n}\,\frac{e^{-c/(2x)}}{x^{3/2}}\, dx. \quad (4.4)$$

The integral in (4.4) is divergent for $n \ge 1$. Thus, its mean and variance do not exist because they approach $\infty$.
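The divergence of the integral in (4.4) for $n = 1$ can also be seen numerically. The sketch below (assuming SciPy is available; the cutoffs are arbitrary) evaluates the truncated first-moment integral of the Lévy PDF up to a cutoff $X$ and shows that it grows roughly like $\sqrt{X}$:

```python
# Numerical sketch (assumes scipy): the truncated first moment of the Levy
# PDF keeps growing as the cutoff X increases, so the mean does not exist.
import numpy as np
from scipy.stats import levy             # location mu = 0, scale c = 1

for X in (1e2, 1e4, 1e6):
    x = np.linspace(1e-6, X, 2_000_000)
    truncated_mean = np.sum(x * levy.pdf(x)) * (x[1] - x[0])
    print(f"integral of x*f(x) up to X = {X:.0e}: {truncated_mean:8.2f}")
# The Levy PDF decays like x^(-3/2), so x*f(x) ~ x^(-1/2) and the truncated
# integral diverges like sqrt(X).
```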

Note 12. Application of the Lévy distribution to network traffic modeling is discussed in [175]. Tools in Matlab for simulating Lévy distributed random data are described in [174].

The previously discussed heavy-tailed distributions, such as the Cauchy distribution and the Lévy distribution, are special cases of stable distributions, which are detailed in [89, 120, 176–179]. Denote by $f(x; \alpha, \beta, c, \mu)$ the PDF of a stable-distributed random function $x$. Then, it is indirectly defined by its characteristic function, denoted by $\varphi(t; \alpha, \beta, c, \mu)$. It is in the form

$$\varphi(t; \alpha, \beta, c, \mu) = \exp\left[it\mu - |ct|^{\alpha}\left(1 - i\beta\,\mathrm{sgn}(t)\,\Phi\right)\right], \quad (4.5)$$

where $\alpha \in (0, 2]$ is the stability parameter, $\beta \in [-1, 1]$ is the skewness parameter, $c > 0$ is the scale parameter, $\mu$ is the location parameter, and

$$\Phi = \begin{cases} \tan\dfrac{\pi\alpha}{2}, & \alpha \ne 1, \\[1ex] -\dfrac{2}{\pi}\log|t|, & \alpha = 1. \end{cases} \quad (4.6)$$

It may be easy to see that $x$ has the mean $\mu$ when $\alpha > 1$. However, its mean is undefined otherwise. In addition, its variance equals $2c^{2}$ if $\alpha = 2$. Otherwise, its variance is infinite.
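For experimentation, stable laws can be sampled with SciPy (an illustrative sketch assuming scipy.stats.levy_stable is available; it is not part of the original text). The empirical variance is only well behaved in the Gaussian case $\alpha = 2$:

```python
# Illustrative sampling of symmetric alpha-stable laws (beta = 0, c = 1):
# only the Gaussian case alpha = 2 has a finite variance, 2*c^2 = 2.
import numpy as np
from scipy.stats import levy_stable

rng = np.random.default_rng(3)
for alpha in (2.0, 1.5, 1.0):
    x = levy_stable.rvs(alpha, 0.0, loc=0.0, scale=1.0,
                        size=200_000, random_state=rng)
    print(f"alpha = {alpha}: sample variance = {x.var():.3e}")
# For alpha = 2 the sample variance is close to 2; for alpha < 2 it is
# dominated by a few extreme values and blows up with the sample size.
```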

Note 13. A stable distribution is characterized by four parameters. In general, the analytical expression of $f(x; \alpha, \beta, c, \mu)$ is unavailable except for some specific values of the parameters, owing to the difficulties in performing the inverse Fourier transform of the right side of (4.5).

Note 14. A stable distribution is generally non-Gaussian except for some specific values of its parameters. When $\alpha = 2$, it is Gaussian with mean $\mu$ and variance $2c^{2}$. Generally speaking, $f(x; \alpha, \beta, c, \mu)$ is heavy tailed.

Note 15. $f(x; \alpha, \beta, c, \mu)$ reduces to the Landau distribution when $\alpha = 1$ and $\beta = 1$ [180]. Then, its characteristic function is given by

$$\varphi(t; 1, 1, c, \mu) = \exp\left[it\mu - |ct|\left(1 + i\frac{2}{\pi}\,\mathrm{sgn}(t)\log|t|\right)\right],$$

where $\mu$ is the location parameter and $c$ is the scale parameter. Inverting it yields the PDF

$$f_{\mathrm{Landau}}(x) = \frac{1}{\pi}\int_{0}^{\infty} e^{-ct}\cos\left[t(x - \mu) + \frac{2c}{\pi}\,t\log t\right] dt.$$

Applications of the Landau distribution can be found in nuclear physics [181, 182].

Note 16. $f(x; \alpha, \beta, c, \mu)$ reduces to the Holtsmark distribution if $\alpha = 3/2$ and $\beta = 0$. Accordingly,

$$\varphi\left(t; \tfrac{3}{2}, 0, c, \mu\right) = \exp\left(it\mu - |ct|^{3/2}\right).$$

Its PDF, for $\mu = 0$ and $c = 1$, is given by

$$f\left(x; \tfrac{3}{2}, 0, 1, 0\right) = \frac{1}{\pi}\int_{0}^{\infty} e^{-t^{3/2}}\cos(xt)\, dt.$$

This type of random function has mean $\mu$, but its variance is infinite [183–185].

The literature regarding applications of stable distributions is rich. Their applications to network traffic modeling can also be found in [186–188].

It is worth noting that the observation of random functions without mean and variance may be traced back to the work of the famous statistician Daniel Bernoulli's cousin, Nicolas Bernoulli, in 1713 [189, 190]. Nicolas Bernoulli studied a casino game as follows. A player bets on how many tosses of a coin will be needed before it first turns up heads. If it falls heads on the first toss, the player wins $2^{1} = 2$ dollars; if it falls tails, heads, the player wins 4 dollars; if it falls tails, tails, heads, the player wins 8 dollars, and so on. According to this game rule, if the probability of an outcome is $2^{-k}$, the player wins $2^{k}$ dollars. Thus, the mean gain is given by

$$\sum_{k=1}^{\infty} 2^{-k}\cdot 2^{k} = \sum_{k=1}^{\infty} 1 = \infty.$$

The above may be used to express the game that is now termed the "Petersburg Paradox." That paradox is now named after Daniel Bernoulli due to his presentation of the problem and his solution in the Commentaries of the Imperial Academy of Science of Saint Petersburg [191].
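A small simulation of the game (the numbers of plays are illustrative assumptions) shows the practical face of the infinite mean: the average win per game keeps drifting upward as more games are played:

```python
# Simulation sketch of the Petersburg game described above: the average win
# per game does not converge, reflecting the infinite mean.
import numpy as np

rng = np.random.default_rng(4)

def average_win(n_games):
    # number of tosses until the first head follows a geometric law
    tosses = rng.geometric(0.5, size=n_games)
    return (2.0 ** tosses).mean()        # win is 2^k when the first head is on toss k

for n in (10**3, 10**5, 10**7):
    print(f"{n:>8} games: average win = {average_win(n):10.2f}")
# Longer runs eventually include an astronomically large win that lifts the
# running mean again, so no finite limit is approached.
```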

5. Fractionally Generalized Langevin Equation and 1/f Noise

The standard Langevin equation is in the form

$$\frac{dx(t)}{dt} + \lambda x(t) = w(t), \quad (5.1)$$

where $\lambda > 0$ [56, 192] and $w(t)$ is a standard white noise. By standard white noise, we mean that its PSD is in the form

$$S_{ww}(\omega) = 1. \quad (5.2)$$

The solution to (5.1) in the frequency domain is given by

$$S_{xx}(\omega) = \frac{1}{\lambda^{2} + \omega^{2}}. \quad (5.3)$$
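The contrast with 1/f noise can be made concrete by simulating (5.1) with an Euler-Maruyama scheme (a sketch with assumed step sizes, not from the paper): the empirical PSD of the solution flattens to the finite value $1/\lambda^{2}$ near $\omega = 0$ instead of diverging:

```python
# Euler-Maruyama sketch of the standard Langevin equation dx/dt + lam*x = w(t):
# the empirical PSD is flat near omega = 0, close to 1/lam^2, so the solution
# is a conventional (non-1/f) random function. Step sizes are illustrative.
import numpy as np

rng = np.random.default_rng(5)
lam, dt, n = 1.0, 0.01, 2**20

x = np.zeros(n)
noise = rng.standard_normal(n) * np.sqrt(dt)     # Brownian increments
for k in range(n - 1):
    x[k + 1] = x[k] - lam * x[k] * dt + noise[k]

freqs = np.fft.rfftfreq(n, d=dt)
psd = np.abs(np.fft.rfft(x)) ** 2 * dt / n       # periodogram (density scaling)
omega = 2 * np.pi * freqs

band = omega < 0.1
print(f"mean PSD for omega < 0.1 : {psd[band].mean():.3f}")
print(f"theoretical 1/lambda^2   : {1.0 / lam**2:.3f}")
# No divergence at omega = 0: the PSD tends to the finite value 1/lambda^2.
```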

The standard Langevin equation may not attract much attention in the field of 1/f noise. People are usually interested in fractionally generalized Langevin equations. There are two types of fractionally generalized Langevin equations. One is in the form [193–208]

$$\frac{d^{v}x(t)}{dt^{v}} + \lambda x(t) = w(t), \quad v > 0. \quad (5.4)$$

The other is expressed by [209–215]

$$\left(\frac{d}{dt} + \lambda\right)^{v} x(t) = w(t), \quad v > 0. \quad (5.5)$$

The two are consistent when $\lambda = 0$. We now adopt the one expressed by (5.5).

Theorem 5.1. A solution $x(t)$ to the stochastically fractional differential equation below belongs to 1/f noise:

$$\frac{d^{v}x(t)}{dt^{v}} = w(t), \quad v > 0. \quad (5.6)$$

Proof. Denote by $g(t)$ the impulse response function of (5.5). That is,

$$\left(\frac{d}{dt} + \lambda\right)^{v} g(t) = \delta(t), \quad (5.7)$$

where $\delta(t)$ is the Dirac-$\delta$ function defined by, for $\phi(t)$ being continuous at $t = 0$,

$$\int_{-\infty}^{\infty} \delta(t)\phi(t)\, dt = \phi(0). \quad (5.8)$$

According to the theory of linear systems, $g(t)$ is the solution to (5.7) under the zero initial condition, and it is usually called the impulse response function in linear systems [61, 68, 69, 76, 77, 216–222]. Denote by $G(\omega)$ the Fourier transform of $g(t)$. Then, with the techniques in fractional calculus [213, 215, 223], doing the Fourier transforms on (5.7) yields

$$G(\omega) = \frac{1}{(i\omega + \lambda)^{v}}. \quad (5.9)$$

Therefore, we have, by taking into account (5.2),

$$S_{xx}(\omega) = |G(\omega)|^{2} S_{ww}(\omega) = \frac{1}{(\lambda^{2} + \omega^{2})^{v}}. \quad (5.10)$$

If $\lambda = 0$ in (5.10), we have 1/f noise expressed by

$$S_{xx}(\omega) = \frac{1}{|\omega|^{2v}}. \quad (5.11)$$

This finishes the proof.

From the above, we see that $x(t)$ expressed by (5.6) belongs to 1/f noise. As a matter of fact, by using the fractional integral on (5.6), we have

$$x(t) = \frac{1}{\Gamma(v)}\int_{0}^{t}(t - u)^{v - 1}\, w(u)\, du. \quad (5.12)$$

The above expression implies that a 1/f noise may be taken as a solution to a stochastically fractional differential equation, which answers P4 described in the Introduction. The following example will further refine this point of view.

Example 5.2. Let $B(t)$ be the Wiener Brownian motion for $t \in [0, \infty)$; see [224] for the details of the Brownian motion. Then, it is nondifferentiable in the domain of ordinary functions. It is differentiable, however, in the domain of generalized functions over the Schwartz space of test functions [91, 225]. Therefore, in the domain of generalized functions, we write the stationary Gaussian white noise by

$$w(t) = \frac{dB(t)}{dt}. \quad (5.13)$$

Based on the definitions of the fractional integrals of the Riemann-Liouville type and the Weyl type [223], on the one hand, when using the Riemann-Liouville integral operator, we express the fBm of the Riemann-Liouville type, which is denoted by $B_H^{\mathrm{RL}}(t)$, in the form

$$B_H^{\mathrm{RL}}(t) = \frac{1}{\Gamma\!\left(H + \tfrac{1}{2}\right)}\int_{0}^{t}(t - u)^{H - 1/2}\, dB(u), \quad (5.14)$$

which is the Riemann-Liouville integral of $w(t)$ of order $H + 1/2$ for $0 < H < 1$. On the other hand, the fBm of the Weyl type, obtained by using the Weyl fractional integral and denoted by $B_H^{\mathrm{W}}(t)$, is given by

$$B_H^{\mathrm{W}}(t) = \frac{1}{\Gamma\!\left(H + \tfrac{1}{2}\right)}\left\{\int_{-\infty}^{0}\left[(t - u)^{H - 1/2} - (-u)^{H - 1/2}\right] dB(u) + \int_{0}^{t}(t - u)^{H - 1/2}\, dB(u)\right\}. \quad (5.15)$$

Expressions (5.14) and (5.15) are the fBms introduced by Mandelbrot and van Ness in [94], but we provide a new outlook of describing them from the point of view of the fractionally generalized Langevin equation with the topic of 1/f noise.
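A discretized sketch of (5.14) (the grid, the number of paths, and $H$ are assumptions chosen for illustration) fractionally integrates white-noise increments and checks that the sample-path variance grows roughly like $t^{2H}$, as expected for fBm:

```python
# Discretized sketch of (5.14): Brownian increments are fractionally integrated
# with the kernel (t - u)^(H - 1/2) / Gamma(H + 1/2). Parameters are illustrative.
import numpy as np
from math import gamma

rng = np.random.default_rng(6)
H, n, n_paths = 0.75, 2000, 500
dt = 1.0 / n
t = np.arange(1, n + 1) * dt

# lower-triangular kernel matrix W[k, j] ~ (t_k - t_j + dt)^(H - 1/2)
diff = t[:, None] - t[None, :] + dt
W = np.where(diff > 0, diff, 0.0) ** (H - 0.5) * np.tri(n) / gamma(H + 0.5)

dB = rng.standard_normal((n, n_paths)) * np.sqrt(dt)   # Brownian increments
B_H = W @ dB                                           # one fBm path per column

v1, v2 = B_H[n // 4 - 1].var(), B_H[-1].var()
print(f"Var[B_H(1)] / Var[B_H(0.25)] = {v2 / v1:.2f}, "
      f"expected about (1/0.25)^(2H) = {4 ** (2 * H):.2f}")
```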

6. Conclusions

We have explained the main properties of 1/f noise as follows. First, it is LRD and its ACF is hyperbolically decayed. Second, its PDF obeys power laws, and it is heavy tailed. Finally, it may be taken as a solution to a stochastically fractional differential equation. Fractal time series, such as fGn, fBm, the generalized Cauchy process, the Lévy flights, and $\alpha$-stable processes, are generally in the class of 1/f noise.

Acknowledgments

This work was supported in part by the 973 Plan under the Project Grant no. 2011CB302800 and by the National Natural Science Foundation of China under the Project Grant nos. 61272402, 61070214, and 60873264.