Abstract

In this paper, we consider a quantitative fourth moment theorem for functions (random variables) defined on a Markov triple equipped with a probability measure and a carré du champ operator. A new technique is developed to derive the fourth moment bound in the normal approximation of a random variable associated with a general Markov diffusion generator, not necessarily belonging to a fixed eigenspace, whereas previous works dealt only with random variables belonging to a fixed eigenspace. Applying this technique to the setting studied by Bourguin et al. (2019), we obtain a new result in the case where the chaos grade of an eigenfunction of a Markov diffusion generator is greater than two. We also introduce a new notion of chaos grade, called the lower chaos grade, in order to obtain a better estimate than the previous one.

1. Introduction

The aim of this paper is to find the fourth moment bound in the normal approximation of a random variable related to a general Markov diffusion generator. A central limit theorem, known as the fourth moment theorem, was first discovered by Nualart and Peccati in [1], where the authors found a necessary and sufficient condition under which a sequence of random variables, belonging to a fixed Wiener chaos, converges in distribution to a Gaussian random variable.

Throughout this paper, we use the mathematical expectation for the integral on a probability space, and a real-valued measurable function defined on a probability space will be called a random variable. Also, we define the variance of a random variable in as

Theorem 1 (fourth moment theorem). Fix an integer , and consider a sequence of random variables belonging to the th Wiener chaos with for all . Then, if and only if , where is a standard Gaussian random variable and the notation denotes convergence in distribution.
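To make Theorem 1 concrete, consider the classical second-chaos example F_n = (2n)^{-1/2} Σ_{i≤n} (X_i² − 1), whose fourth cumulant equals 12/n exactly. The following Monte Carlo sketch (an illustration added here, not part of the original argument) checks numerically that the fourth cumulant vanishes as the sequence approaches the standard Gaussian law:

```python
import numpy as np

rng = np.random.default_rng(0)

def chaos2_sample(n, n_samples=50_000):
    """F_n = (2n)^{-1/2} * sum_{i<=n} (X_i^2 - 1): a second-chaos element
    normalized so that E[F_n^2] = 1 (since Var(X^2 - 1) = 2)."""
    X = rng.standard_normal((n_samples, n))
    return (X**2 - 1).sum(axis=1) / np.sqrt(2.0 * n)

def fourth_cumulant(F):
    """kappa_4(F) = E[F^4] - 3 E[F^2]^2 for a centered sample F."""
    return np.mean(F**4) - 3.0 * np.mean(F**2) ** 2

# Exact value is kappa_4(F_n) = 12/n, so it vanishes iff F_n -> N(0,1).
k4 = {n: fourth_cumulant(chaos2_sample(n)) for n in (5, 50, 200)}
```

For n = 5, 50, 200 the estimates hover around the exact values 12/n = 2.4, 0.24, 0.06, shrinking as the law of F_n approaches the standard Gaussian, in line with the equivalence asserted by Theorem 1.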
Such a result constitutes a dramatic simplification of the method of moments and cumulants. In the paper [2], the fourth moment theorem is expressed in terms of the Malliavin derivative. However, the results given in [1, 2] do not provide any estimates, whereas the authors in [3] prove that Theorem 1 can be recovered from an estimate of the Kolmogorov (or total variation, Wasserstein) distance obtained by combining Malliavin calculus (see, e.g., [4–6]) and Stein's method for the normal approximation (see, e.g., [7–9]). We also refer to the papers [3, 4, 10–13] for an explanation of these techniques.
For estimates in the normal approximation, we consider Kolmogorov distances of the type where is a standard Gaussian random variable. The following statement is a remarkable achievement of the Nourdin–Peccati approach [3] (see Theorem 3.1 in [3]). Let be such that and . Then, the following bound holds: where . The notations , , and , related to Malliavin calculus, are explained in [5] or [6]. In the particular case where is an element of the th Wiener chaos with , the upper bound in (3) is given by Here, is just the fourth cumulant of .

Recently, the author in [14] proved, from a purely spectral point of view, that the fourth moment theorem also holds in the general framework of Markov diffusion generators. Precisely, under a certain spectral condition on a Markov diffusion generator, a sequence of eigenfunctions of such a generator satisfies the bound given in (4) with a different constant. In particular, this new technique avoids the use of complicated product formulas. The authors in [15] introduced a notion of Markov chaos of eigenfunctions that is less restrictive than the notion of Markov chaos defined in [14] and also obtained a quantitative fourth moment theorem for convergence of the eigenfunctions towards Gaussian, gamma, and beta distributions. Furthermore, Bourguin et al. in [16] proved that convergence of the elements of a Markov chaos to a Pearson distribution can still be bounded with just the first four moments, in the form where is a suitable distance, is a random variable with law belonging to the Pearson family, and is a chaotic random variable defined on . Here, and in (5) are polynomials of degree four, and the constants and are determined in terms of the chaos grade defined in Definition 3.5 of [16].

In this paper, we find a bound of the form for the Kolmogorov distance between a standard Gaussian random variable and a random variable defined on related to a Markov diffusion generator with an invariant measure . Since the central limit theorem (normal approximation) is usually the main topic in convergence in distribution, we confine our interest to the normal approximation. In this line of research, the motivations and contributions of our work in comparison with other works are summarized below:
(i) Compared to previous works [14–16], our study is not limited to an element belonging to a fixed eigenspace of Markov chaos. Our result is a remarkable extension in comparison with other works dealing only with random variables in a fixed eigenspace. To achieve our goal, the starting point is the following bound, where is the pseudo-inverse of the underlying Markov generator and is the carré du champ operator, which is a result of the study in [16]. However, in order to find the fourth moment bound (6), we introduce a new technique relying on two types of operators given in [17, 18]. First, we prove that the right-hand side of (7) can be represented as the sum of two integrals involving the operators mentioned above, and we use this representation to prove that the fourth moment bound (6) holds. This is the main innovation of this work.
(ii) If the upper chaos grade of is strictly greater than two, then the constants and in the bound (5) are given as follows: and . This means that the fourth moment theorem of Theorem 1 does not apply. However, the technique developed in this paper eliminates the second term in (5), which means that, for such a random variable , the fourth moment theorem holds, unlike the previous result in [16], where the upper bound (5) for a sequence of chaotic random variables is given in (111) of Remark 12.
(iii) In this paper, another notion of chaos grade, called the lower chaos grade, is introduced and used to provide a better estimate than the previous one obtained from (5) in [16]. Throughout this paper, the existing chaos grade in Definition 3.5 of [16] will be called the upper chaos grade.

The rest of the paper is organized as follows: Section 2 introduces some basic notations and reviews results on Markov diffusion generators. In Section 3, a new notion of chaos grade for a finite sum of Markov chaoses is defined, and orthogonal polynomials are considered in order to illustrate the concept of chaos grades. In Section 4, our main result is given in Theorem 8. Finally, as an application of our main result, in Section 5, we derive upper bounds in the Kolmogorov distance for an eigenfunction belonging to a fixed Markov chaos.

2. Preliminaries

In this section, we recall some basic facts about Markov diffusion generators. The reader is referred to [19] for a more detailed explanation. We begin with the definition of a Markov triple in the sense of [19]. For the infinitesimal generator of a Markov semigroup with -domain , we associate a bilinear form . Assume that we are given a vector space of random variables defined on a probability space such that, for every pair of its elements, the product is again in this space ( is an algebra). On this algebra , the bilinear map (carré du champ operator) is defined by for every . As the carré du champ operator and the measure completely determine the symmetric Markov generator , we will work throughout this paper with a Markov triple equipped with a probability measure on a state space and a symmetric bilinear map such that .

Next, we construct the domain of the Dirichlet form by completion of and then obtain, from this Dirichlet domain, the domain of . Recall the Dirichlet form as

If is endowed with the norm the completion of with respect to this norm turns it into a Hilbert space embedded in . Once the Dirichlet domain is constructed, the domain is defined as the set of all elements such that, for all , where is a finite constant depending only on . On these domains, a relation between and holds, namely, the integration by parts formula

By the integration by parts formula (12) and , the operator is nonnegative and symmetric, and therefore, the spectrum of is contained in [0, ∞).

A full Markov triple is a standard Markov triple for which there is an extended algebra , with no requirements of integrability for elements of , satisfying the requirements given in Section 3.4.3 of [19]. In particular, the diffusion property holds: for any function and ,

We also define the operator , called the pseudo-inverse of , satisfying, for any ,
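In the diagonalizable case the pseudo-inverse acts coefficient-wise on an eigenfunction expansion. The sketch below (with illustrative numbers and a hypothetical coefficient representation, not notation from the paper) verifies the defining identity L L^{-1} f = f − E[f] on the spectral side:

```python
import numpy as np

# Spectrum of -L (lambda_0 = 0 for the constant eigenfunction) and the
# coefficients of f in the eigenfunction basis; both are illustrative
# placeholders, not data from the paper.
lam = np.array([0.0, 1.0, 2.0, 3.0])
c = np.array([0.7, -1.2, 0.5, 2.0])   # f = sum_k c_k phi_k, so E[f] = c_0

def pseudo_inverse(lam, c):
    """Coefficients of L^{-1} f: drop the constant mode and divide the
    remaining coefficients by -lambda_k."""
    out = np.zeros_like(c)
    nz = lam > 0
    out[nz] = -c[nz] / lam[nz]
    return out

# Applying L multiplies the k-th coefficient by -lambda_k, so the
# composition recovers f - E[f], the defining property of the pseudo-inverse.
recovered = -lam * pseudo_inverse(lam, c)
assert np.allclose(recovered, np.where(lam > 0, c, 0.0))
```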

3. Chaos Grade and Orthogonal Polynomials

3.1. Chaos Grade

Fix a probability space . We assume that has a discrete spectrum . Obviously, zero is always an eigenvalue, the constants being the corresponding eigenfunctions, so that . By the assumption on the spectrum of , one has that

Now, we define chaotic random variables as follows.

Definition 2. Suppose that , where and for . The random variable is called chaotic if there exist and such that and satisfy

In this case, the largest number satisfying (16) is called the lower chaos grade of . On the other hand, the smallest number satisfying (16) is called the upper chaos grade of .

Remark 3. (1) The authors in [16] define the chaos grade of , corresponding to the upper chaos grade in Definition 2, in the case where is an eigenfunction with respect to an eigenvalue of the generator . In this paper, we introduce the lower chaos grade of , which will be used to obtain a better estimate for the fourth moment theorem than the estimate given in Theorem 3.9 of [16] in the particular case where the target distribution is a standard Gaussian distribution.
(2) Let for . If is a chaotic random variable, then , , can be expanded as a sum of eigenfunctions with the eigenvalue of the largest magnitude and the eigenvalue of the smallest magnitude , i.e.,
From (17), it follows that where and .
(3) In the paper [16], the authors describe how the chaos grade, corresponding to the upper chaos grade in our work, behaves under tensorization. Let be a generator with invariant measure . Define a generator by
If , then is an eigenfunction of with eigenvalue . Suppose that , , has the lower chaos grades . Then, the lower chaos grade of is given by

See Corollary 4.1 in [16] for the upper chaos grade of .

Next, we consider a finite-dimensional eigenfunction and a finite sum of eigenfunctions to illustrate the concept of chaos grades.

3.2. Ornstein-Uhlenbeck Operator

We consider the -dimensional Ornstein-Uhlenbeck generator , defined for any test function by action on , where

For a multi-index , we define a -dimensional Hermite polynomial by

We write , , and if for all . Let us set where , , denotes the Hermite polynomial of order . Then, we have that . By the well-known product formula of Hermite polynomials and a change of variables, we have that where

Since , can be expanded as a sum of eigenfunctions with the smallest eigenvalue, among positive eigenvalues, being given by

Obviously, , so that the lower chaos grade is . On the other hand, the largest eigenvalue in the expansion of , as a sum of eigenfunctions, is given by . Therefore, the upper chaos grade is given by .
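The chaos-grade computation for a single Hermite polynomial can be verified symbolically. Using the standard product formula He_p·He_q = Σ_r r!·C(p,r)·C(q,r)·He_{p+q−2r}, the square of He_p involves only the eigenvalues 0, 2, …, 2p of the one-dimensional OU generator, giving upper grade 2 and lower grade 2/p; the sympy sketch below (an illustrative check added here, with p = 3) confirms the expansion:

```python
import sympy as sp
from math import comb, factorial

x = sp.symbols('x')

def He(n):
    """Probabilists' Hermite polynomial via He_{n+1} = x He_n - n He_{n-1}."""
    a, b = sp.Integer(1), x
    if n == 0:
        return a
    for k in range(1, n):
        a, b = b, sp.expand(x * b - k * a)
    return b

p = 3
# Product formula with p = q: He_p^2 = sum_r r! C(p,r)^2 He_{2p-2r}.
rhs = sum(factorial(r) * comb(p, r) ** 2 * He(2 * p - 2 * r) for r in range(p + 1))
assert sp.expand(He(p) ** 2 - rhs) == 0

# He_k has OU eigenvalue k, so He_p^2 sits on eigenvalues {0, 2, ..., 2p}:
upper_grade = (2 * p) / p   # largest eigenvalue 2p over p: always 2
lower_grade = 2 / p         # smallest positive eigenvalue 2 over p
```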

If , where , then the lower and upper chaos grades are given, respectively, by

3.3. Jacobi Operator

For , we consider the -dimensional Jacobi generator , defined for any test function by action on , where

Its spectrum is of the form where . Let us set where , , denotes the th Jacobi polynomial being given by

Recall that denotes the generalized hypergeometric function with numerator and denominator, given by where

Then the Jacobi polynomials can be expressed as

The well-known product formula of Jacobi polynomials yields that where

First, observe that

It follows, from (39), that for any indices and such that and , where the notation denotes the degree of . Successive applications of the arguments for (39) yield that

For any other index such that and , we have, from (41), that so that

Let . Now, we find a point yielding the maximum value of under the restriction given by (43). Obviously, Lagrange’s method shows that where is a Lagrange multiplier. Plugging into (43) yields that , so that for all . Therefore, it follows, from (40), that

Hence, the upper chaos grade is given by

Next, we find the lower chaos grade of . From (37), the square of can be expanded as a sum of eigenfunctions with the smallest eigenvalue, among positive eigenvalues, being given by

When for and for , the sum has the minimum value. Hence, , so that the lower chaos grade is given by .
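The qualitative difference from the OU case comes from the quadratic growth of the Jacobi spectrum. As a hedged numerical illustration (using λ_n = n(n + θ) as a stand-in for the Jacobi eigenvalue sequence, with θ > 0 a placeholder for the combination of the parameters, rather than the paper's exact normalization), the upper chaos grade λ_{2p}/λ_p of a single eigenfunction lies strictly between 2 and 4, and in particular always exceeds 2:

```python
def lam(n, theta):
    """Jacobi-type eigenvalue sequence lambda_n = n (n + theta); theta > 0
    is a placeholder for the parameter combination coming from a and b."""
    return n * (n + theta)

def upper_grade(p, theta):
    """Largest eigenvalue in the expansion of P_p^2 (namely lambda_{2p})
    divided by the eigenvalue of P_p."""
    return lam(2 * p, theta) / lam(p, theta)

# upper_grade(p, theta) = 2 (2p + theta) / (p + theta) lies in (2, 4):
# it tends to 2 as theta -> infinity and to 4 as theta -> 0, but never
# equals 2, unlike the linear OU spectrum.
grades = {(p, t): upper_grade(p, t) for p in (1, 2, 3) for t in (0.5, 1.0, 5.0)}
```

This is exactly the regime, upper chaos grade strictly greater than two, in which the bound (5) of [16] degenerates and the technique of the present paper is needed.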

4. Fourth Moment Theorem

In this section, we derive an upper bound on the Kolmogorov distance , where , not necessarily belonging to a fixed eigenspace, is a random variable related to a Markov diffusion generator , and is a standard Gaussian random variable.

4.1. Lemmas

We begin by stating a useful lemma, which will be used frequently in this section. The lemmas appearing in this section are well known in the particular case where the underlying random variable is a functional of Gaussian random fields.

Lemma 4. Let and . Then, we have

Proof. Since , we have that and . Hence, all expectations in equation (48) are well defined. By the integration by parts formula (12) and (14), we have that This gives the desired result.
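In the one-dimensional OU model, where Γ(f, g) = f′g′ and −L He_k = k He_k, Lemma 4 reduces to the Gaussian covariance identity E[fg] − E[f]E[g] = E[Γ(−L^{-1}(f − E[f]), g)]. The sympy sketch below (with illustrative Hermite expansions chosen here, not taken from the paper) checks this identity exactly:

```python
import sympy as sp

x = sp.symbols('x')
w = sp.exp(-x**2 / 2) / sp.sqrt(2 * sp.pi)   # standard Gaussian density

def E(expr):
    """Exact expectation under the standard Gaussian measure."""
    return sp.integrate(expr * w, (x, -sp.oo, sp.oo))

He2 = x**2 - 1
He3 = x**3 - 3 * x

# A centered f and an arbitrary g as finite Hermite expansions; since
# -L He_k = k He_k, the pseudo-inverse divides the k-th coefficient by -k.
f = 2 * He2 + He3
neg_Linv_f = 2 * He2 / 2 + He3 / 3   # -L^{-1} f, coefficient-wise
g = He2 + 5

# Lemma 4 in the OU case: E[f g] = E[Gamma(-L^{-1} f, g)], Gamma(u, v) = u' v'.
lhs = E(f * g)
rhs = E(sp.diff(neg_Linv_f, x) * sp.diff(g, x))
assert sp.simplify(lhs - rhs) == 0
```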

Now, we extend the techniques developed in [20] in the case of a functional of Gaussian fields to a random variable belonging to . Let . Define and . If , , is a well-defined element in , we set

Similarly, let ; we define and . If for fixed ,

Lemma 5. Suppose that , , with . Then, we have

Proof. Observe that . Using the definition of and Lemma 4, we have that The diffusion property and Lemma 4 yield, from (53), that The carré du champ operator and the integration by parts formula show that the right-hand side of (54) can be computed as Again, using the diffusion property, the first and third expectations in (55) can be represented as and similarly, Plugging (56) and (57) into (55) yields that the equality (52) holds.

Next, we investigate the relation between and .

Lemma 6. Suppose that , , with . We have

Proof. By the definition of the operator and the carré du champ operator , we have that Using the diffusion property, the first expectation in (59) can be written as Lemma 4 shows that the second expectation in (59) can be computed as The diffusion property shows that the third expectation in (59) can be represented as follows: Plugging (60), (61), and (62) into (59) yields that From Lemma 4 and (63), it follows that From the above result (64), we deduce that

Hence, the desired result follows.

4.2. Fourth Moment Theorem

Let us define a set

Lemma 7. For any random variable related to a Markov diffusion generator such that , one has that .

Proof. Define . Assumption and Lemma 6 yield that . If , then and by Lemma 6. Hence, there exists such that . If and , then , so that there exists a constant such that . On the other hand, if and , then , which implies that we can find a constant such that . Obviously, combining the above results proves that .

Theorem 8 (fourth moment bound). Suppose that , , with and . If the law of is absolutely continuous with respect to the Lebesgue measure, there exists a constant such that

Proof. By Stein’s equation, we have that, for , where is a solution of Stein’s equation. Since , we have that . Therefore, by the integration by parts formula (12) and the derivation property of , the right-hand side of (68) can be written as Since , we have, from Lemma 5 and Lemma 7 together with (69), that

Remark 9. (1) If , then we see, from the proof of Lemma 7, that , and hence, the value inside the square root in (67) is positive. On the other hand, if , then . This means that the value inside the square root of the upper bound is also positive.
(2) If , then, by (70), is a random variable having a standard Gaussian distribution. Conversely, suppose that is a random variable having a standard Gaussian distribution. Then, we have that and . Hence,

As far as we know, the following is the first result of the quantitative fourth moment theorem for a random variable belonging to a sum of Wiener chaoses.

Corollary 10. Let for , where and are Hermite polynomials of order and , respectively. Then, one has that where the lower chaos grade of is given by

Proof. We compute the expectation . First, note that Let us set Obviously, the well-known product formula and the definition of carré du champ operator prove that for , Direct computations yield, from (73) and (75), together with , that From the right-hand side of (76), it follows that On the other hand, since , we get so that From (77) and (79), we have that Obviously, when , we have Similarly, if , then Therefore, if , then for , so that By a similar computation as for (77), one has that On the other hand, Therefore, one has, from (84) and (85), that Using the same arguments as for the case of , , yields that , , for . This implies that Similarly, Combining the above results (83), (87), (88), and (89), we can show that Hence, the proof of this corollary is completed.
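The moment computations in the proof can be spot-checked symbolically. For independent coordinates the fourth cumulant is additive across the two chaoses, so for the illustrative choice of a second- and a third-chaos component (values picked here for the check, not taken from the paper) one can evaluate the fourth cumulant of the sum exactly under the Gaussian measure:

```python
import sympy as sp

x = sp.symbols('x')
w = sp.exp(-x**2 / 2) / sp.sqrt(2 * sp.pi)   # standard Gaussian density

def E(expr):
    """Exact expectation under the standard Gaussian measure."""
    return sp.integrate(expr * w, (x, -sp.oo, sp.oo))

def kappa4(h):
    """Fourth cumulant of a centered polynomial h(X), X ~ N(0,1)."""
    return E(h**4) - 3 * E(h**2) ** 2

He2 = x**2 - 1       # second-chaos component (p = 2)
He3 = x**3 - 3 * x   # third-chaos component (q = 3)

# For F = He2(X) + He3(Y) with X, Y independent, cumulants add:
k4 = kappa4(He2) + kappa4(He3)   # 48 + 3240
var = E(He2**2) + E(He3**2)      # E[F^2] = 2! + 3! = 8
k4_normalized = k4 / var**2      # kappa_4 of the normalized F / sqrt(E[F^2])
```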

5. Markov Chaos

In this section, as an application of Theorem 8, chaotic random variables such that will be considered.

Theorem 11. Let be a chaotic eigenfunction of with respect to eigenvalue with and . Suppose that has an upper chaos grade and a lower chaos grade .
(a) If , one has that
(b) If and , then there exist and such that
(c) If , and , then there exist and such that
(d) If , and , then there exist and such that

Proof. We compute and . By the definition of , we have On the other hand, by using Lemma 4, we have that We denote by the projection of onto . From (16) in Definition 2, we know that has a chaos decomposition of the form where and . By orthogonality, it follows, from (95) and (96), that The proof of (a): since , it is obvious, from (98) and (99), that If , then we have, from (100), that . Obviously, . This fact implies that there exists such that . Application of Theorem 8 gives the desired result (91).
The proof of (b): first note that, since , If , there exist and , from (102), such that Now, we take so that the right-hand side of (103) is equal to . Since , we have that This inequality (105) shows that , where is given by (104). Hence, applying Theorem 8 yields the bound (92).
The proof of (c): since and , there exist and , by a similar estimate as in case (b), such that Taking yields that the right-hand side of (106) is equal to . Since and , the constant given in (107) belongs to . Application of Theorem 8 proves the bound (93).
The proof of (d): since and , there exist and such that The right-hand side of (108) is equal to when one takes Note that This estimate (110) shows that given in (109) belongs to , from which we obtain the bound (94).

Remark 12. Suppose that the target distribution in Theorem 4.9 of [16] is a standard Gaussian measure. If a chaotic random variable , with and , satisfies , then the bound (3.7) in [16] becomes

Even when the fourth moment of in the first term of (111) converges to , the sequence does not converge in distribution to a standard Gaussian random variable , due to the second term in (111). This means that the fourth moment theorem of Theorem 1 does not apply. The bounds in (b), (c), and (d) of Theorem 11 show that the fourth moment theorem holds, by removing the second term in (111), even if the upper chaos grade is strictly greater than 2.

Remark 13. The Ornstein-Uhlenbeck operator on is given by The carré du champ operator is the usual gradient operator . In the infinite-dimensional setting, the infinite-dimensional Ornstein-Uhlenbeck generator on Wiener space can be obtained with the Wiener measure as an invariant distribution. If is an element belonging to a fixed Wiener chaos of order , i.e., , , the well-known product formula of multiple stochastic integrals gives that where is the contraction of the kernel . This expansion of the square of shows, from the definition of the chaos grade, that and . In this case, it follows, from the above bound (91), that In this specific case, the bound (114) has been applied to obtain the Berry–Esséen bound for parameter estimation of fractional Ornstein-Uhlenbeck processes (see [21]). Furthermore, the authors of [20] derive a bound of the form to obtain an optimal Berry–Esséen bound for parameter estimation of stochastic partial differential equations.
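For a second-chaos element the contraction appearing in the product-formula expansion has a concrete linear-algebra reading: writing F = XᵀAX − tr A for a symmetric matrix A (a standard representation of a second-chaos element, used here for illustration), the classical quadratic-form cumulants give κ₄(F) = 48·tr(A⁴), and tr(A⁴) is exactly the squared norm of the contraction of the kernel with itself. A symbolic check with an illustrative 2×2 matrix (chosen here, not from the paper):

```python
import sympy as sp

x, y = sp.symbols('x y')
w = sp.exp(-(x**2 + y**2) / 2) / (2 * sp.pi)   # 2-d standard Gaussian density

def E(expr):
    """Exact expectation under the 2-d standard Gaussian measure."""
    inner = sp.integrate(expr * w, (x, -sp.oo, sp.oo))
    return sp.integrate(inner, (y, -sp.oo, sp.oo))

A = sp.Matrix([[1, 2], [2, 3]])               # illustrative symmetric kernel
X = sp.Matrix([x, y])
F = sp.expand((X.T * A * X)[0] - A.trace())   # F = X'AX - tr A, so E[F] = 0

k4 = sp.expand(E(F**4) - 3 * E(F**2) ** 2)

# Classical quadratic-form cumulants: kappa_2 = 2 tr(A^2), kappa_4 = 48 tr(A^4);
# tr(A^4) plays the role of the squared contraction norm of the kernel.
assert E(F) == 0
assert E(F**2) == 2 * (A**2).trace()
assert k4 == 48 * (A**4).trace()
```

Spreading the spectrum of A so that tr(A⁴)/tr(A²)² → 0 makes κ₄ vanish relative to the variance, which is the second-chaos instance of the fourth moment phenomenon described above.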

6. Conclusions

In this paper, we derive the fourth moment bound in the normal approximation of a random variable associated with a general Markov diffusion generator. Significant features of our work are that (a) it provides a better estimate than the previous one; (b) it covers general square-integrable random variables, unlike all previous works in this research line, where the authors deal only with random variables belonging to a fixed eigenspace (or Wiener chaos); and (c) unlike the previous result, the fourth moment theorem holds even if the upper chaos grade is strictly greater than 2. In future research, we will study whether our methods are applicable to other (non-Gaussian) target measures (for example, a bound for the gamma or beta approximation).

Data Availability

There is no data used for this research.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This research was supported by Hallym University (HRF-202008-014).