Abstract

Let $\{X_n,\ n\geq 1\}$ be a sequence of independent, not necessarily identically distributed random variables. We obtain a new kind of complete moment convergence for their sums under the Lyapunov condition. Moreover, our result extends and improves the corresponding result for the independent and identically distributed (i.i.d.) case.

1. Introduction and Main Result

Let $\{X_n,\ n\geq 1\}$ be a sequence of random variables, and set $S_n=\sum_{i=1}^{n}X_i$. If for every $\varepsilon>0$, $\sum_{n=1}^{\infty}P(|X_n|>\varepsilon)<\infty$, then $\{X_n\}$ is said to converge to 0 completely.
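To make the definition concrete, the following sketch (our illustration, not part of the original argument; the Rademacher increments and $\varepsilon=0.5$ are our choices) computes the exact tail probabilities $P(|S_n|>\varepsilon n)$ for sums of independent $\pm 1$ signs and checks that their partial sums stabilize, which is exactly the summability that complete convergence of $S_n/n$ requires.

```python
from math import comb

def tail_prob(n, eps):
    """Exact P(|S_n| > eps*n) for S_n a sum of n independent Rademacher (+/-1) signs."""
    # S_n = 2*K - n with K ~ Binomial(n, 1/2), so sum the binomial pmf over
    # the outcomes whose partial sum exceeds the threshold in absolute value.
    total = 0.0
    for k in range(n + 1):
        if abs(2 * k - n) > eps * n:
            total += comb(n, k) / 2.0 ** n
    return total

eps = 0.5
probs = [tail_prob(n, eps) for n in range(1, 201)]
# The terms decay geometrically (Hoeffding: P <= 2*exp(-eps^2 * n / 2)),
# so the series of tail probabilities converges: S_n/n -> 0 completely.
print(sum(probs), probs[-1])
```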

Hsu and Robbins [1] proved that if $\{X_n,\ n\geq 1\}$ is a sequence of independent and identically distributed (i.i.d.) random variables with $EX_1=0$ and $EX_1^{2}<\infty$, then $S_n/n\to 0$ completely.

Erdős [2, 3] proved that if $\{X_n,\ n\geq 1\}$ is a sequence of i.i.d. random variables, then for every $\varepsilon>0$, $\sum_{n=1}^{\infty}P(|S_n|\geq\varepsilon n)<\infty$ holds if and only if $EX_1=0$ and $EX_1^{2}<\infty$.

Obviously the sum $\sum_{n=1}^{\infty}P(|S_n|\geq\varepsilon n)$ tends to infinity as $\varepsilon\searrow 0$, and it is natural to study the rate at which this occurs; Heyde [4] proved that $\lim_{\varepsilon\to 0}\varepsilon^{2}\sum_{n=1}^{\infty}P(|S_n|\geq\varepsilon n)=EX_1^{2}$ when $EX_1=0$ and $EX_1^{2}<\infty$. This research direction is known as precise asymptotics. For analogous results in more general settings, we refer the reader to [5–14] and the references therein.
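As a quick numerical sanity check of Heyde's limit (our illustration; the normal increments and the truncation point are our choices), take i.i.d. standard normal increments, for which $S_n\sim N(0,n)$ exactly and $P(|S_n|\geq\varepsilon n)=2(1-\Phi(\varepsilon\sqrt{n}))$; the normalized sum then approaches $EX_1^{2}=1$ as $\varepsilon\to 0$.

```python
from math import erf, sqrt

def phi(x):
    """Standard normal distribution function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def heyde_sum(eps, n_max):
    """eps^2 * sum_{n=1}^{n_max} P(|S_n| >= eps*n) for S_n ~ N(0, n),
    where the tail probability is exactly 2*(1 - Phi(eps*sqrt(n)))."""
    s = sum(2.0 * (1.0 - phi(eps * sqrt(n))) for n in range(1, n_max + 1))
    return eps * eps * s

# Heyde's precise asymptotic says eps^2 * sum -> E X_1^2 = 1 as eps -> 0.
for eps in (0.5, 0.2, 0.1):
    print(eps, heyde_sum(eps, 200000))
```

Here `n_max` is taken large enough that the neglected tail of the series is negligible ($\varepsilon\sqrt{n_{\max}}$ far in the normal tail).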

Recently, Liu and Lin [15] have introduced a new kind of complete moment convergence and obtained the following result.

Theorem A (see [15]). Suppose that $\{X, X_n,\ n\geq 1\}$ is a sequence of independent and identically distributed (i.i.d.) random variables. Then $\lim_{\varepsilon\searrow 0}\frac{1}{-\log\varepsilon}\sum_{n=1}^{\infty}\frac{1}{n^{2}}ES_n^{2}I\{|S_n|\geq\varepsilon n\}=2\sigma^{2}$ holds if and only if $EX=0$, $EX^{2}=\sigma^{2}<\infty$, and $EX^{2}\log^{+}|X|<\infty$, where $S_n=\sum_{i=1}^{n}X_i$ and $\log^{+}x=\log\max(e,x)$.

However, the assumption of identical distribution is quite strong and rather difficult to verify in some practical situations. The following theorem gives a sufficient condition for complete moment convergence of sums of independent, nonidentically distributed random variables.

Theorem 1. Let be independent random variables such that and , . Assume that there exists a constant such that , a.s., where . Moreover, one also assumes that the following Lyapunov condition [16, page 298] is satisfied: where . Then, one has

Remark 2. Suppose that $\{X_n,\ n\geq 1\}$ is a sequence of independent and identically distributed (i.i.d.) random variables with , where is a constant. The Lyapunov condition (3) is easy to verify in real applications, and so it is much weaker than the assumption of identical distribution. Moreover, the Lyapunov condition (3) constrains the growth rate of the moments.
Many sequences of independent random variables satisfy Lyapunov’s condition; here we give some examples.

Example 1. Let $\{X_n,\ n\geq 1\}$ be a sequence of independent random variables satisfying $EX_n=0$, $EX_n^{2}=\sigma_n^{2}$, $B_n^{2}=\sum_{i=1}^{n}\sigma_i^{2}$, and $B_n\to\infty$. Suppose that the $X_n$, $n\geq 1$, are uniformly bounded; that is, there exists a constant $M>0$ such that $|X_n|\leq M$ a.s. for all $n\geq 1$. Then we have $\frac{1}{B_n^{2+\delta}}\sum_{i=1}^{n}E|X_i|^{2+\delta}\leq\frac{M^{\delta}}{B_n^{2+\delta}}\sum_{i=1}^{n}EX_i^{2}=\left(\frac{M}{B_n}\right)^{\delta}\to 0$, which verifies that $\{X_n\}$ satisfies the Lyapunov condition (3).
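Assuming the Lyapunov condition (3) takes its classical form $B_n^{-(2+\delta)}\sum_{i=1}^{n}E|X_i|^{2+\delta}\to 0$ with $B_n^{2}=\sum_{i=1}^{n}EX_i^{2}$, the bound in Example 1 can be checked numerically. The sketch below is our own illustration (the uniform distributions, the alternating scales, and $\delta=1$ are our choices): it uses independent $X_i\sim\mathrm{Uniform}(-a_i,a_i)$, for which $EX_i^{2}=a_i^{2}/3$ and $E|X_i|^{3}=a_i^{3}/4$.

```python
# Lyapunov ratio L_n = (sum_i E|X_i|^3) / B_n^3 with delta = 1, for
# independent X_i ~ Uniform(-a_i, a_i) and uniformly bounded scales a_i.
def lyapunov_ratio(scales):
    third = sum(a ** 3 / 4.0 for a in scales)   # sum of E|X_i|^3
    b2 = sum(a ** 2 / 3.0 for a in scales)      # B_n^2 = sum of E X_i^2
    return third / b2 ** 1.5

# Scales alternate between 0.5 and 1.5, hence stay uniformly bounded.
scales = [1.0 + 0.5 * ((-1) ** i) for i in range(1, 5001)]
ratios = [lyapunov_ratio(scales[:n]) for n in (10, 100, 1000, 5000)]
print(ratios)  # decreases roughly like n^{-1/2}, as the bound (M/B_n)^delta predicts
```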

Example 2. Let be a sequence of independent random variables, which satisfies , , and By Example 1, we know that satisfies the Lyapunov condition (3).

Remark 3. Suppose that $\{X_n,\ n\geq 1\}$ is a sequence of independent and identically distributed (i.i.d.) random variables such that , for all , , where is a constant. Then, from Remark 2, we know that it satisfies Lyapunov's condition. Therefore, by Theorem 1, we have Obviously, this recovers the result of Liu and Lin [15]. Hence the conditions of Theorem 1 differ from those of Theorem A, and our result partly extends and improves those given in Liu and Lin [15].

2. Proof of Theorem 1

In this section, we will prove Theorem 1. We first present the following two lemmas, which play a key role in the proof of Theorem 1.

Lemma 4 (see [17]). Suppose that are independent random variables with and , where . Let , , and , where is the standard normal distribution function. If , , for some , then for every , holds.
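Lemma 4 compares the distribution of the standardized sum with the standard normal distribution function $\Phi$. Purely as a numerical illustration of this kind of normal approximation (our example, not the lemma itself; the Rademacher increments and sample sizes are our choices), the following computes the exact Kolmogorov distance between the law of $S_n/\sqrt{n}$ and $\Phi$ for sums of independent $\pm 1$ signs, and shows it shrinking as $n$ grows.

```python
from math import comb, erf, sqrt

def phi(x):
    """Standard normal distribution function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def sup_distance(n):
    """sup_x |P(S_n / sqrt(n) <= x) - Phi(x)| for S_n a sum of n independent
    Rademacher (+/-1) signs, computed from the exact binomial law."""
    pmf = [comb(n, k) / 2.0 ** n for k in range(n + 1)]
    cdf, worst = 0.0, 0.0
    for k in range(n + 1):
        x = (2 * k - n) / sqrt(n)
        worst = max(worst, abs(cdf - phi(x)))   # left limit just below the atom
        cdf += pmf[k]
        worst = max(worst, abs(cdf - phi(x)))   # value at the atom
    return worst

print(sup_distance(25), sup_distance(400))
```

Since the step function is compared with $\Phi$ both at each atom and at its left limit, this is the true sup-distance, not a grid approximation.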

Lemma 5 (see page 73 of [18]). Under the conditions of Lemma 4, if a.s., , where , then for every ,

Proof of Theorem 1. Similar to [15], we have
To prove Theorem 1, we only need to study and . We will divide the proof into two steps.
Step 1. We first prove the equality as follows:
In fact, it follows from Proposition 2.1.1 of [19] that By (12), we obtain To establish the equality (11), from (13) we only need to prove that
Obviously, it follows from Lemma 4 that Combining (3) and (15), we get Since it follows from Toeplitz’s lemma (page 120 of [20]) that By (18), we have
On the other hand, it follows from Lemma 5 that Noting that the inequality (20) yields By (22), we obtain Combining (19) and (23), we see that the equality (11) is satisfied.
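Step 1 leans on Toeplitz's lemma: a weighted average of a sequence tending to 0 itself tends to 0, provided the weights sum to (approximately) 1 and individually vanish. A minimal numerical sketch of this fact (our illustration; the null sequence $a_i=i^{-1/2}$ and the Cesàro weights are our choices):

```python
# Toeplitz's lemma, Cesaro-weight special case: if a_i -> 0 and the weights
# w_{n,i} >= 0 satisfy sum_i w_{n,i} = 1 with max_i w_{n,i} -> 0,
# then t_n = sum_i w_{n,i} * a_i -> 0.
def weighted_average(a, weights):
    return sum(w * x for w, x in zip(weights, a))

n_max = 20000
a = [1.0 / (i ** 0.5) for i in range(1, n_max + 1)]   # a_i -> 0, but slowly
for n in (100, 1000, 20000):
    w = [1.0 / n] * n                                  # uniform (Cesaro) weights
    print(n, weighted_average(a[:n], w))               # ~ 2 / sqrt(n) -> 0
```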
Step 2. Next, we need to prove the following equality:
Obviously, it follows from Proposition 3.1 of [15] that To establish (24), from (25) we only need to prove
Letting and , we apply Lemma 4 to obtain
If , then it follows from (3) that Hence, by (27) and (28), we have that for , the following holds:
Noting the fact that the weighted average of a sequence that converges to 0 also converges to 0, we have and so
If , it follows from (8) and (3) that Obviously, by (32), we get Combining (31) and (33), we have which implies that (24) is satisfied.
Therefore, from (11) and (34), we see that (4) is true. This completes the proof of Theorem 1.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

The authors are thankful to the referees for their many valuable suggestions. This work was jointly supported by the National Natural Science Foundation of China (61374080), The Natural Science Foundation for Colleges and Universities of Jiangsu Province (14KJB110025), the Natural Science Foundation of Zhejiang Province (LY12F03010), the Natural Science Foundation of Ningbo (2012A610032), the Postgraduate Innovation Projection of Jiangsu University (CXLX12-0652), Youth Foundation of Xuzhou Institute of Technology (XKY2012301), and a Project Funded by the Priority Academic Program Development of Jiangsu Higher Education Institutions.