Research Article  Open Access
Guo Jiang, "The Structure of Autocovariance Matrix of Discrete Time Subfractional Brownian Motion", Mathematical Problems in Engineering, vol. 2018, Article ID 3132048, 14 pages, 2018. https://doi.org/10.1155/2018/3132048
The Structure of Autocovariance Matrix of Discrete Time Subfractional Brownian Motion
Abstract
This article explores the structure of the autocovariance matrix of discrete time subfractional Brownian motion and obtains an approximation theorem and a structure theorem for the autocovariance matrix of this stochastic process. Moreover, we give an expression for the unique time-varying eigenvalue of the autocovariance matrix in the asymptotic sense and prove that the increments of subfractional Brownian motion are asymptotically stationary processes. Finally, we illustrate these results with numerical experiments and give some possible applications to finite impulse response filters.
1. Introduction
Fractional Brownian motion (fBm) with Hurst parameter H ∈ (0, 1) is a continuous centered Gaussian process starting from zero with covariance R_H(s, t) = (1/2)(s^{2H} + t^{2H} − |s − t|^{2H}), where s, t ≥ 0. When H = 1/2, fBm is a standard Brownian motion (see [1, 2] and their references).
Subfractional Brownian motion (sfBm) was introduced by Bojdecki et al. [3]; it is also a continuous centered Gaussian process starting from zero, with covariance C_H(s, t) = s^{2H} + t^{2H} − (1/2)[(s + t)^{2H} + |s − t|^{2H}], s, t ≥ 0. When H > 1/2, it arises from the occupation time fluctuations of branching particle systems (see [3–5]). Moreover, sfBm has some properties analogous to those of fBm, such as self-similarity, long-range dependence, and Hölder continuity (for details we refer to [6–9]). But sfBm has nonstationary increments, its increments over nonoverlapping intervals are more weakly correlated, and the covariance decays polynomially at a higher rate in comparison with fBm; this is why it is called subfractional in [3].
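The faster decay of the increment covariance can be checked numerically. The following sketch assumes the standard covariances R_H(s, t) = (1/2)(s^{2H} + t^{2H} − |s − t|^{2H}) for fBm and C_H(s, t) = s^{2H} + t^{2H} − (1/2)[(s + t)^{2H} + |s − t|^{2H}] for sfBm, and compares the covariance of the increments over the nonoverlapping intervals [0, 1] and [t, t + 1]:

```python
def fbm_cov(s, t, H):
    """Covariance R_H(s, t) of fractional Brownian motion."""
    return 0.5 * (s**(2 * H) + t**(2 * H) - abs(s - t)**(2 * H))

def sfbm_cov(s, t, H):
    """Covariance C_H(s, t) of subfractional Brownian motion."""
    return s**(2 * H) + t**(2 * H) - 0.5 * ((s + t)**(2 * H) + abs(s - t)**(2 * H))

def incr_cov(cov, a, b, c, d, H):
    """Cov(X(b) - X(a), X(d) - X(c)) for a centered Gaussian process X."""
    return cov(b, d, H) - cov(b, c, H) - cov(a, d, H) + cov(a, c, H)

H = 0.7
for t in (2.0, 8.0, 32.0):
    print(t, incr_cov(fbm_cov, 0.0, 1.0, t, t + 1.0, H),
             incr_cov(sfbm_cov, 0.0, 1.0, t, t + 1.0, H))
```

For H = 0.7 the sfBm values are noticeably smaller than the fBm values at the same lag, with the gap widening as t grows, consistent with the higher polynomial decay rate mentioned above.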
It is well known that fBm has been successfully applied in engineering fields such as characterization of texture in bone radiographs, image processing and segmentation, terrain surface modeling, medical image analysis, and network traffic analysis. However, in contrast to the extensive applications of fBm, there have been fewer systematic investigations of sfBm. The main reason for this is the complexity of the dependence structure of a process that does not have stationary increments. Therefore, it seems meaningful to study the structure of the autocovariance matrix of discrete time sfBm.
In recent years, Tudor [7] studied some new properties of sfBm such as strong variation, renormalized power variation, the Dirichlet property, and short and long memory properties. Moreover, in [8, 9], Tudor characterized the domain of the Wiener integral with respect to sfBm. Yan et al. [10] also defined a stochastic integral with respect to sfBm and obtained Itô and Tanaka formulas. Shen and Chen [6] defined the extended divergence integral with respect to sfBm and established the corresponding versions of the Itô and Tanaka formulas. Shen and Yan [11] and Rao [12] gave efficient estimators of the drift parameter for processes driven by sfBm. Shen and Yan [13] obtained an approximation theorem for sfBm using martingale differences. Liu et al. [14, 15] studied estimators of the self-similarity parameter of sfBm.
On the other hand, the covariance matrix is a very important feature of a stochastic process, and its study has long been a significant and interesting topic in probability and statistics. There is a large literature on the theory and applications of covariance matrices (see [16–23], etc.). In fact, one usually works with discrete time stochastic processes for practical application purposes (e.g., [17, 24–26]). In particular, Ayache et al. [24] presented the explicit covariance formula of multifractional Brownian motion and briefly reported some applications to the synthesis problem and the long-term structure. In [17], Gupta and Joshi studied the structure of the covariance matrix of discrete time fBm and gave an application to band wavelet systems. Yet the corresponding research remains open for sfBm, which has nonstationary increments.
Inspired by these works, we explore the structure of the autocovariance matrix of discrete time sfBm (dtsfBm), and then we obtain an approximation theorem and a structure theorem for the autocovariance matrix of dtsfBm. At the same time, we give an expression for the unique time-varying eigenvalue of the autocovariance matrix in the asymptotic sense and prove that the increments of sfBm are asymptotically stationary. Finally, we illustrate these results with numerical experiments and give some possible applications to finite impulse response filters. This work helps fill the gap between the theory and the applications of sfBm.
This paper is organized as follows: In Section 2, we present some preliminaries about sfBm and matrix theory. In Section 3, we give the approximation to the autocovariance matrix of dtsfBm, establish the uniqueness of the time-varying eigenvalue in the asymptotic sense, and prove that the increments of dtsfBm are asymptotically stationary. Then, we establish the structure theorem (Theorem 17) for the autocovariance matrix of dtsfBm. In Section 4, we illustrate the results obtained in Section 3 by numerical experiments and give some possible applications. In Section 5, we conclude the paper.
2. Preliminaries
2.1. Discrete Time Subfractional Brownian Motion
In practice, one usually observes discrete time signals, and sfBm can be used to model certain random phenomena. This article concerns the statistical properties of dtsfBm and presents some results on the structure of its autocovariance matrix. DtsfBm is defined by sampling sfBm at the instants kT, where k = 0, 1, 2, … and T is the sampling period. For convenience, one usually lets T = 1. Since sfBm is self-similar with index H, the sampled process satisfies S_H(kT) = T^H S_H(k) in distribution. Moreover, the mean value, variance, and autocovariance function of dtsfBm are given as follows: E[S_H(k)] = 0, Var[S_H(k)] = (2 − 2^{2H−1}) k^{2H}, and C(k, l) = k^{2H} + l^{2H} − (1/2)[(k + l)^{2H} + |k − l|^{2H}].
Let X be an N-dimensional random vector of dtsfBm. In the following, we mainly study the autocovariance matrix of this vector, whose entries are the autocovariances C(n + p, n + q), 1 ≤ p, q ≤ N.
Obviously, sfBm is nonstationary, and the autocovariance matrix is a function of the time index n.
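For concreteness, the autocovariance matrix can be assembled numerically. The sketch below assumes the entries are C(n + p, n + q) for 1 ≤ p, q ≤ N, with C the sfBm covariance above, and checks that the matrix is symmetric, positive semidefinite, and genuinely dependent on the time index n:

```python
import numpy as np

def sfbm_cov(s, t, H):
    """Covariance C_H(s, t) of subfractional Brownian motion."""
    return s**(2 * H) + t**(2 * H) - 0.5 * ((s + t)**(2 * H) + abs(s - t)**(2 * H))

def autocov_matrix(n, N, H):
    """N x N autocovariance matrix of the dtsfBm vector starting at time n."""
    k = n + np.arange(1, N + 1, dtype=float)
    return np.array([[sfbm_cov(p, q, H) for q in k] for p in k])

G10, G100 = autocov_matrix(10, 8, 0.7), autocov_matrix(100, 8, 0.7)
print(np.allclose(G100, G100.T))                 # symmetric
print(np.linalg.eigvalsh(G100).min() >= -1e-8)   # positive semidefinite
print(np.allclose(G10, G100))                    # False: the matrix depends on n
```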
2.2. Some Results of the Matrix Theory
In this section, we recall some useful results of the matrix theory (for details see also [27]).
Let A be a matrix, let λ_1, …, λ_s be the distinct eigenvalues of A, and let m(λ) be the minimal polynomial of A, of degree m.
Theorem 1 (see [27, P 314]). If A is a matrix, f is a given function defined on the spectrum of A, and f^{(j)}(λ_k) denotes the value of the jth derivative of f at the eigenvalue λ_k, then there exist component matrices Z_{kj}, independent of f, such that f(A) = Σ_{k=1}^{s} Σ_{j=0}^{m_k − 1} f^{(j)}(λ_k) Z_{kj}. Moreover, the matrices Z_{kj} are linearly independent and commute with A and with each other.
In particular, taking f(μ) = (λ − μ)^{−1}, where λ lies outside the spectrum of A, the resolvent of A can be expressed as (λI − A)^{−1} = Σ_{k=1}^{s} Σ_{j=0}^{m_k − 1} j! (λ − λ_k)^{−(j+1)} Z_{kj}.
Theorem 2 (see [27, P 315]). The components of the matrix satisfy the following conditions: where, for and ,
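Theorems 1 and 2 can be illustrated numerically. The sketch below treats the simple case of a symmetric matrix with distinct eigenvalues (so that the minimal polynomial has simple roots and no derivative terms appear); the matrix A is an illustrative choice, not taken from the paper. It builds the component matrices as Lagrange interpolation polynomials in A and verifies the spectral resolution f(A) = Σ_k f(λ_k) Z_k together with the resolution of the identity and the orthogonality of the components:

```python
import numpy as np

# Symmetric tridiagonal matrix with positive off-diagonal entries:
# its eigenvalues are real and distinct.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])
lam = np.linalg.eigvalsh(A)
I = np.eye(3)

Z = []
for k in range(3):
    # Component matrix Z_k as the Lagrange interpolation polynomial in A
    P = I.copy()
    for j in range(3):
        if j != k:
            P = P @ (A - lam[j] * I) / (lam[k] - lam[j])
    Z.append(P)

# Spectral resolution with f(x) = x**3, plus the component-matrix properties
f_A = sum(l**3 * Zk for l, Zk in zip(lam, Z))
print(np.allclose(f_A, A @ A @ A))            # f(A) = sum_k f(lam_k) Z_k
print(np.allclose(sum(Z), I))                 # resolution of the identity
print(np.allclose(Z[0] @ Z[1], 0 * I))        # components annihilate each other
```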
Theorem 3 (see [27, P 319]). For and , We now present some relevant results on the analytic perturbation of linear operators.
Lemma 4 (see [27, P 392]). Let be an unperturbed matrix with eigenvalues , and let be a matrix whose elements are analytic functions of a complex number in a neighborhood of , such that the eigenvalues of depend continuously on , and as for , where the superscripts do not denote derivatives in this paper; then the following results hold:
(i) if is an unrepeated eigenvalue of , then is analytic in a neighborhood of ;
(ii) if has algebraic multiplicity and for , then is an analytic function of in a neighborhood of , where and is one of the branches of the function . Thus, has a power series expansion in .
Remark 5. The perturbed eigenvalues remain real for a Hermitian matrix; the component matrices of associated with perturbed eigenvalues will be Hermitian and analytic in , whatever the multiplicity of the unperturbed eigenvalues may be. The eigenvectors of are analytic and orthonormal throughout a neighborhood of .
Lemma 6 (see [27, P 396]). Let be a matrix that is analytic in on a neighborhood of , and . Let be an unrepeated eigenvalue of with index one; then, for sufficiently small , there is an eigenvalue of such that Moreover, there are right and left eigenvectors and , respectively, associated with for which
Remark 7 (see [27, P 399]). (I) The first-order perturbation coefficients are as follows: where the matrix is defined as (II) In particular, if the perturbation of is linear in , that is, , then the perturbation coefficients of all orders are given by
Lemma 8 (see [27, P 402]). Let be a matrix that is analytic in on a neighborhood of , and . is an eigenvalue of with index one and multiplicity ; then the eigenvalue splits into for sufficiently small . Let be the eigenvalues of and . Then there is a number and a positive integer such that where is an eigenvalue of , and with being a simple closed contour that encircles and no other eigenvalues of or .
Moreover, there is at least one eigenvector which corresponds to each such that with and .
Lemma 9 (see [27, P 403]). If is an eigenvalue of with index one, then there exists such that where is an eigenvalue of with right eigenvector .
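As a numerical illustration of Lemma 6 and Remark 7 (a sketch for a symmetric matrix with a simple eigenvalue; the matrices below are illustrative choices, not from the paper), the first-order perturbation coefficient of a simple eigenvalue of a symmetric matrix is x^T B x for a unit eigenvector x, and the first-order prediction misses the exact perturbed eigenvalue only at order ε²:

```python
import numpy as np

rng = np.random.default_rng(0)
A0 = np.diag([1.0, 2.0, 5.0])            # unperturbed matrix, simple eigenvalues
B = rng.standard_normal((3, 3))
B = (B + B.T) / 2                        # symmetric perturbation direction
eps = 1e-4

lam0, V = np.linalg.eigh(A0)
x = V[:, 2]                              # unit eigenvector of the eigenvalue 5
first_order = lam0[2] + eps * x @ B @ x  # first-order perturbation prediction

exact = np.linalg.eigvalsh(A0 + eps * B)[2]
print(abs(exact - first_order))          # residual of order eps**2
```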
3. Main Results
In this section, we study the autocovariance matrix of dtsfBm and give an approximation theorem for this matrix. At the same time, we give an expression for the unique time-varying eigenvalue of the autocovariance matrix in the asymptotic sense (i.e., for large enough ). Though the increments of sfBm are nonstationary, we show that the increments of dtsfBm are asymptotically stationary and obtain a structure theorem for the autocovariance matrix.
Theorem 10 (approximation theorem). Let be a random vector of length of dtsfBm (see (7)) and let be the autocovariance matrix of this vector; then can be approximated as for large enough , and where , ,
Moreover, for a positive tolerance , if the normalized approximation error is where denotes the Frobenius norm, then
Proof. By virtue of Taylor's theorem, we have For , , terms such as on the right-hand side of (32) are negligible for large enough .
Notice that and are the largest and smallest terms in the autocovariance , respectively; is strictly increasing in the indexes and . The normalized approximation error is defined in (30); that is, where . Therefore, the normalized approximation error decays in .
Given an upper bound on the error , we can obtain the minimum value of as follows: and it tends to zero as .
Thus, for large enough , the approximation to the autocovariance matrix can be written as follows: where and .
That is, which completes the proof.
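The approximation in Theorem 10 can be probed numerically. As a rough sketch (the matrix B below is a leading-order reading assumed here, in which every entry of the autocovariance matrix is replaced by the common value (2 − 2^{2H−1}) n^{2H}; the theorem's approximant retains further terms), the normalized Frobenius error between the exact matrix and even this crude approximant already decays as n grows:

```python
import numpy as np

def sfbm_cov(s, t, H):
    return s**(2 * H) + t**(2 * H) - 0.5 * ((s + t)**(2 * H) + abs(s - t)**(2 * H))

def gamma(n, N, H):
    """Exact N x N autocovariance matrix with entries C(n+p, n+q)."""
    k = n + np.arange(1, N + 1, dtype=float)
    return np.array([[sfbm_cov(p, q, H) for q in k] for p in k])

def rel_err(n, N=8, H=0.7):
    """Normalized Frobenius error of the leading-order constant-entry approximant."""
    G = gamma(n, N, H)
    B = (2 - 2**(2 * H - 1)) * n**(2 * H) * np.ones((N, N))
    return np.linalg.norm(G - B) / np.linalg.norm(G)

for n in (100, 1000, 10000):
    print(n, rel_err(n))
```

The error decreases roughly like 1/n, matching the claim that the discarded terms are negligible for large enough n.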
Remark 11. The matrix has two distinct eigenvalues: one with index one and multiplicity , and one with index one and multiplicity one, with corresponding eigenvector . Moreover, the minimal polynomial of is .
Proposition 12. Let , , , , be the same as in Theorem 10, and let be the unique nonzero eigenvalue of associated with the eigenvector . If is a small perturbation of the matrix , then the perturbed eigenvalue of is denoted as follows: (i) when , (ii) when .
Proof. By Theorem 10, and then the perturbation of is linear in with . By virtue of (19), the corresponding first-order perturbation in is that is, Notice that the component matrices and of corresponding to and , respectively, can be written as By (20) and (21), the first-order perturbation in the eigenvector is given as follows: where ; then By (22), the second-order perturbation coefficient can be expressed as that is, If , then and as ; if , then and as . Therefore, (1) if , (2) if . The third- and higher-order perturbations in are sufficiently small for large enough , and the quantity in (49) is also sufficiently small; we therefore write the perturbed eigenvalue as follows: (i) when , (ii) when . Substituting and into (51) and (52) completes the proof.
Proposition 13. Assume that the autocovariance matrix admits the decomposition where is an orthogonal matrix and is a diagonal matrix; then can be approximated as for large enough , where is characterized in (28) and is the diagonal matrix corresponding to .
Proof. Since approaches for large enough , we have The rest is obvious, so we omit it.
Proposition 14. Suppose that is given as in Proposition 12; then has a unique time-varying eigenvalue in the asymptotic sense (i.e., for large enough ) as follows: (i) when , (ii) when .
Proof. By virtue of Theorem 10 and Proposition 12, it is obvious that is an eigenvalue of . Now we prove that it is the unique time-varying eigenvalue in the asymptotic sense (i.e., for large enough ). Firstly, we study the effect of the perturbation on the eigenvalue of , which has index one and multiplicity . According to Lemma 8, can be split into for sufficiently small , , and is an eigenvalue of , , as , .
Thus, we express the eigenvalues of as where .
By (40) and (43), Notice that the matrix is a constant matrix independent of ; then is an eigenvalue of a constant matrix and is also independent of . On the other hand, the rank of is while is of full rank; then which means that there are eigenvalues of index one for the Hermitian matrix . According to Lemma 9, there exists a number corresponding to such that and is an eigenvalue of with right eigenvector , where and .
By (21), (40), (43), and Theorem 10, we have and Here, without loss of generality, we designate the unique nonzero eigenvalue of as . It is obvious that (for ).
Therefore, we obtain For large enough , we have Since the eigenvalues are independent of time in the asymptotic sense, decreases as increases and can be considered independent of time; we then simply denote in the above formula by , and the diagonal matrix associated with is expressed as follows: The proof is completed.
Remark 15. In view of Theorem 10 and Propositions 13 and 14, we can conclude that has a unique time-varying eigenvalue in the asymptotic sense (i.e., for large enough ).
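Remark 15 can be checked numerically. The sketch below (using the sfBm covariance and the entry indexing assumed earlier) computes the eigenvalues of the exact autocovariance matrix for increasing n: the largest eigenvalue grows like n^{2H} and tracks N(2 − 2^{2H−1})n^{2H}, while the remaining eigenvalues stay negligible by comparison:

```python
import numpy as np

def sfbm_cov(s, t, H):
    return s**(2 * H) + t**(2 * H) - 0.5 * ((s + t)**(2 * H) + abs(s - t)**(2 * H))

def gamma(n, N, H):
    """Exact N x N autocovariance matrix with entries C(n+p, n+q)."""
    k = n + np.arange(1, N + 1, dtype=float)
    return np.array([[sfbm_cov(p, q, H) for q in k] for p in k])

N, H = 8, 0.7
for n in (100, 1000, 10000):
    ev = np.linalg.eigvalsh(gamma(n, N, H))      # ascending order
    predicted = N * (2 - 2**(2 * H - 1)) * n**(2 * H)
    print(n, ev[-1], predicted, ev[-2])          # dominant eigenvalue vs. the next one
```

Each tenfold increase in n multiplies the dominant eigenvalue by roughly 10^{2H} ≈ 25 for H = 0.7, while the second-largest eigenvalue remains orders of magnitude smaller; this is what "unique time-varying eigenvalue in the asymptotic sense" expresses.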
Proposition 16. Let be the increment of dtsfBm and let be the autocovariance matrix of . Then is an asymptotically stationary process (i.e., for large enough ), and the approximate autocovariance matrix of the vector is simultaneously diagonalizable with in the asymptotic sense (i.e., for large enough ).
Proof. It is easy to see that Considering (6), we have where . By (32), for large enough , According to Theorem 10, we obtain where is the same as in Theorem 10 and Hence, the autocovariance matrix is independent of the time index for large enough ; that is, is an asymptotically stationary process.
Denote for simplicity. Assume that the diagonalization of is as follows: where is a constant orthogonal matrix independent of . For large enough , it is easy to verify that which means that the matrices and can be simultaneously diagonalized in the asymptotic sense.
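The asymptotic stationarity in Proposition 16 can also be seen numerically: for large n, the covariance of the increments (a sketch, again assuming the sfBm covariance above) depends essentially only on the lag p − q, and in fact approaches the autocovariance of fractional Gaussian noise, the stationary increment process of fBm:

```python
def sfbm_cov(s, t, H):
    return s**(2 * H) + t**(2 * H) - 0.5 * ((s + t)**(2 * H) + abs(s - t)**(2 * H))

def incr_cov(n, p, q, H):
    """Cov(X(n+p+1) - X(n+p), X(n+q+1) - X(n+q)) for dtsfBm."""
    return (sfbm_cov(n + p + 1, n + q + 1, H) - sfbm_cov(n + p + 1, n + q, H)
            - sfbm_cov(n + p, n + q + 1, H) + sfbm_cov(n + p, n + q, H))

def fgn_cov(d, H):
    """Autocovariance of fractional Gaussian noise at lag d."""
    return 0.5 * (abs(d + 1)**(2 * H) - 2 * abs(d)**(2 * H) + abs(d - 1)**(2 * H))

H, n = 0.7, 10**5
# Two increment covariances at the same lag p - q = 2, plus the fGn value
print(incr_cov(n, 1, 3, H), incr_cov(n, 4, 6, H), fgn_cov(2, H))
```

At n = 10^5 the two entries at equal lag agree to about three decimal places and both match the fGn autocovariance, illustrating the near-Toeplitz (stationary) structure of the increment covariance matrix for large n.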
Now, on the basis of the above discussion, we arrive at the second main result.
Theorem 17 (structure theorem). The autocovariance matrix can be diagonalized as and approximated as , where
4. Simulation and Illustration
In this section, we simulate the eigenvalues and eigenvectors of the actual autocovariance matrices for different and by numerical experiments. The relative errors of the time-varying eigenvalues are given. Moreover, we analyze certain interesting behaviors of the autocovariance matrices; the data in Tables 1–7 were obtained with Maple.
