Abstract

We study strong limit theorems for hidden Markov chain fields indexed by an infinite tree with uniformly bounded degrees. We mainly establish the strong law of large numbers for hidden Markov chain fields indexed by an infinite tree with uniformly bounded degrees and give the strong limit law of the conditional sample entropy rate.

1. Introduction

The theory of hidden Markov models (HMMs) has had a widespread impact on the modeling of sequences of dependent random variables during the last two decades. Hidden Markov models have many applications in a wide range of areas, such as speech recognition (Rabiner [1]), image processing [2], DNA sequence analysis (see, e.g., [3, 4]), DNA microarray time course analysis [5], and econometrics [6, 7]. For a good review of statistical and information-theoretic aspects of hidden Markov processes (HMPs), see [8]. In recent years, the work of Baum and Petrie [9] on finite-state finite-alphabet HMMs has been extended to HMMs with finite as well as continuous state spaces and a general alphabet. In particular, statistical properties and ergodic theorems for relative entropy densities of HMMs were developed, and consistency and asymptotic normality of the maximum-likelihood (ML) parameter estimator were proved under some mild conditions [9–12].

In this paper we extend tree-indexed homogeneous Markov chain fields to hidden Markov chain fields indexed by an infinite tree.

A tree is a graph which is connected and contains no loops. Given any two vertices $\sigma$ and $t$, let $\overline{\sigma t}$ be the unique path connecting $\sigma$ and $t$. Define the graph distance $d(\sigma,t)$ to be the number of edges contained in the path $\overline{\sigma t}$.

Let $T$ be an infinite tree with root $0$. The set of all vertices with distance $n$ from the root is called the $n$th generation of $T$, which is denoted by $L_n$. We denote by $T^{(n)}$ the union of the first $n$ generations of $T$. For each vertex $t$, there is a unique path from $0$ to $t$, and we write $|t|$ for the number of edges on this path. We denote the first predecessor of $t$ by $1_t$, the second predecessor of $t$ by $2_t$, and the $n$th predecessor of $t$ by $n_t$. The degree of a vertex is defined to be the number of its neighbors. In this paper, we mainly consider an infinite tree which has uniformly bounded degrees; that is, the numbers of neighbors of all vertices in this tree are uniformly bounded. For any two vertices $s$ and $t$ of the tree $T$, write $s \le t$ if $s$ is on the unique path from the root $0$ to $t$. We denote by $s \wedge t$ the vertex farthest from $0$ satisfying $s \wedge t \le s$ and $s \wedge t \le t$, and denote by $|B|$ the number of vertices of a subset $B$ of $T$.

When the context permits, all trees of this type are denoted simply by $T$.
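To fix ideas, the tree notation above can be made concrete in code. The following is a minimal illustrative sketch (not part of the paper's formalism), using a rooted binary tree with vertices encoded as tuples of child indices; the helper names `generation`, `tree_up_to`, and `predecessor` are hypothetical.

```python
# Illustrative sketch of the tree notation: a rooted binary tree.
# The root is the empty tuple (), and the k-th predecessor of a
# vertex t is obtained by dropping the last k coordinates of t.

def generation(n):
    """All vertices of the n-th generation L_n of the rooted binary tree."""
    if n == 0:
        return [()]
    return [v + (c,) for v in generation(n - 1) for c in (0, 1)]

def tree_up_to(n):
    """T^(n): the union of generations 0 through n."""
    return [v for m in range(n + 1) for v in generation(m)]

def predecessor(t, k=1):
    """The k-th predecessor k_t of vertex t."""
    return t[:-k]

print(len(tree_up_to(2)))         # |T^(2)| = 1 + 2 + 4 = 7
print(predecessor((0, 1, 1), 2))  # second predecessor of (0,1,1) is (0,)
```

Here every vertex has degree at most 3 (one parent, two children), so the degrees are uniformly bounded as required.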

Definition 1 (homogeneous Markov chains indexed by a tree (see [13, 14])). Let $T$ be an infinite tree with uniformly bounded degrees, let $G = \{1, 2, \dots, N\}$ be a finite state space, and let $\{X_t,\ t \in T\}$ be a stochastic process defined on a probability space $(\Omega, \mathcal{F}, \mathbb{P})$, taking values in the finite set $G$. Let
$$p = (p(1), p(2), \dots, p(N)) \quad (1)$$
be a distribution on $G$, and let
$$P = \left(P(j \mid i)\right), \quad i, j \in G, \quad (2)$$
be a transition probability matrix on $G^2$. If, for any vertex $t \ne 0$,
$$\mathbb{P}\left(X_t = j \mid X_{1_t} = i \text{ and } X_s \text{ for } s \wedge t \le 1_t\right) = \mathbb{P}\left(X_t = j \mid X_{1_t} = i\right) = P(j \mid i), \quad i, j \in G, \quad (3)$$
and $\mathbb{P}(X_0 = i) = p(i)$ for all $i \in G$, then $\{X_t,\ t \in T\}$ will be called a $G$-valued homogeneous Markov chain indexed by the infinite tree $T$ with the initial distribution (1) and transition probability matrix $P$ whose elements are determined by (3).

Definition 2. Let $T$ be an infinite tree with uniformly bounded degrees and let $\{X_t,\ t \in T\}$ and $\{Y_t,\ t \in T\}$ be two stochastic processes on a probability space $(\Omega, \mathcal{F}, \mathbb{P})$ with finite state spaces $G = \{1, 2, \dots, N\}$ and $\Theta = \{1, 2, \dots, M\}$, respectively. Let $P = (P(j \mid i))$ and $Q = (Q(l \mid j))$ be two stochastic matrices on $G^2$ and $G \times \Theta$, respectively. Suppose
$$\mathbb{P}\left(X^{T^{(n)}} = x^{T^{(n)}}\right) = p(x_0) \prod_{t \in T^{(n)} \setminus \{0\}} P\left(x_t \mid x_{1_t}\right), \quad n \ge 1; \quad (4)$$
that is, $\{X_t,\ t \in T\}$ is a $G$-valued homogeneous Markov chain indexed by $T$ determined by the initial distribution (1) and the transition matrix $P$. If, for any vertex $t$,
$$\mathbb{P}\left(Y_t = l \mid X_t = j \text{ and } (X_s, Y_s) \text{ for } s \ne t\right) = \mathbb{P}\left(Y_t = l \mid X_t = j\right) = Q(l \mid j), \quad (5)$$
moreover, we suppose
$$\mathbb{P}\left(X^{T^{(n)}} = x^{T^{(n)}},\ Y^{T^{(n)}} = y^{T^{(n)}}\right) = p(x_0)\, Q(y_0 \mid x_0) \prod_{t \in T^{(n)} \setminus \{0\}} P\left(x_t \mid x_{1_t}\right) Q\left(y_t \mid x_t\right), \quad n \ge 1, \quad (6)$$
where
$$X^{B} = \{X_t,\ t \in B\}, \qquad x^{B} = \{x_t,\ t \in B\}, \qquad x_t \in G,\ y_t \in \Theta, \quad (7)$$
for any subset $B$ of $T$, then $\{(X_t, Y_t),\ t \in T\}$ will be called a $G \times \Theta$-valued hidden Markov chain indexed by an infinite tree $T$, or called a tree-indexed hidden Markov chain taking values in the finite set $G \times \Theta$.

Remark 3. If we sum over $y^{T^{(n)}}$ in (6) and take conditional expectations with respect to $\sigma(X_{1_t})$ on both sides of the resulting equation, we can easily arrive at (3). In Definition 2, we can also call the processes $\{X_t,\ t \in T\}$ and $\{Y_t,\ t \in T\}$, respectively, the state process and the observed process indexed by an infinite tree.
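A small simulation sketch may help make Definition 2 concrete. The following assumes a rooted binary tree, two states and two observation letters indexed from zero, and illustrative matrices `p`, `P`, `Q`; none of these numerical choices come from the paper.

```python
# A minimal simulation sketch of a tree-indexed hidden Markov chain
# (illustrative assumptions: binary tree, G = {0,1}, Theta = {0,1}).

import random

p = [0.5, 0.5]                    # initial distribution (1) on G
P = [[0.8, 0.2], [0.3, 0.7]]      # transition matrix (2) on G^2
Q = [[0.9, 0.1], [0.2, 0.8]]      # emission matrix on G x Theta

def sample(weights):
    """Draw an index according to the given probability weights."""
    return random.choices(range(len(weights)), weights=weights)[0]

def simulate(n, seed=0):
    """Sample (X_t, Y_t) for all t in T^(n); vertices are tuples."""
    random.seed(seed)
    X = {(): sample(p)}           # root state, drawn from p
    level = [()]
    for _ in range(n):
        nxt = []
        for s in level:
            for c in (0, 1):
                t = s + (c,)
                X[t] = sample(P[X[s]])   # Markov property (3)
                nxt.append(t)
        level = nxt
    # given X, the observations Y_t are conditionally independent, as in (6)
    Y = {t: sample(Q[X[t]]) for t in X}
    return X, Y

X, Y = simulate(4)
print(len(X))   # |T^(4)| = 2^5 - 1 = 31 vertices
```

The state process is sampled down the tree one generation at a time; the observed process is then attached vertexwise, which is exactly the factorized structure of (6).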

2. Two Useful Lemmas

Let $T$ be an infinite tree with uniformly bounded degrees and let $\{(X_t, Y_t),\ t \in T\}$ be a $G \times \Theta$-valued hidden Markov chain indexed by $T$. Let $\{g_t(x, y, z),\ t \in T\}$ be functions defined on $G^2 \times \Theta$. Let $\lambda$ be a real number, let
$$\mathcal{F}_n = \sigma\left(X^{T^{(n)}}, Y^{T^{(n)}}\right), \quad n \ge 0, \quad (8)$$
and now we define a stochastic sequence as follows:
$$t_n(\lambda, \omega) = \frac{\exp\left\{\lambda \sum_{t \in T^{(n)} \setminus \{0\}} g_t\left(X_{1_t}, X_t, Y_t\right)\right\}}{\prod_{t \in T^{(n)} \setminus \{0\}} \mathbb{E}\left[\exp\left\{\lambda g_t\left(X_{1_t}, X_t, Y_t\right)\right\} \mid X_{1_t}\right]}. \quad (9)$$
At first we come to prove the following fact.

Lemma 4. $\{t_n(\lambda, \omega),\ \mathcal{F}_n,\ n \ge 1\}$ is a nonnegative martingale.

Proof of Lemma 4. Obviously, by (6), the process $\{(X_t, Y_t),\ t \in T\}$ is a tree-indexed Markov chain with state space $G \times \Theta$, so that we have
$$\mathbb{P}\left(\left(X_t, Y_t\right) = \left(x_t, y_t\right),\ t \in L_n \mid \mathcal{F}_{n-1}\right) = \prod_{t \in L_n} P\left(x_t \mid X_{1_t}\right) Q\left(y_t \mid x_t\right). \quad (10)$$
Then we have
$$\begin{aligned}
\mathbb{E}\left[\exp\left\{\lambda \sum_{t \in L_n} g_t\left(X_{1_t}, X_t, Y_t\right)\right\} \,\middle|\, \mathcal{F}_{n-1}\right]
&= \sum_{x^{L_n}} \sum_{y^{L_n}} \exp\left\{\lambda \sum_{t \in L_n} g_t\left(X_{1_t}, x_t, y_t\right)\right\} \mathbb{P}\left(\left(X_t, Y_t\right) = \left(x_t, y_t\right),\ t \in L_n \mid \mathcal{F}_{n-1}\right) \\
&= \sum_{x^{L_n}} \sum_{y^{L_n}} \exp\left\{\lambda \sum_{t \in L_n} g_t\left(X_{1_t}, x_t, y_t\right)\right\} \prod_{t \in L_n} P\left(x_t \mid X_{1_t}\right) Q\left(y_t \mid x_t\right) \quad (11) \\
&= \prod_{t \in L_n} \sum_{j \in G} \sum_{l \in \Theta} e^{\lambda g_t\left(X_{1_t}, j, l\right)} P\left(j \mid X_{1_t}\right) Q(l \mid j)
= \prod_{t \in L_n} \mathbb{E}\left[e^{\lambda g_t\left(X_{1_t}, X_t, Y_t\right)} \mid X_{1_t}\right], \quad (12)
\end{aligned}$$
here the second equation holds because of (10). Furthermore, we have
$$t_n(\lambda, \omega) = t_{n-1}(\lambda, \omega) \cdot \frac{\exp\left\{\lambda \sum_{t \in L_n} g_t\left(X_{1_t}, X_t, Y_t\right)\right\}}{\prod_{t \in L_n} \mathbb{E}\left[\exp\left\{\lambda g_t\left(X_{1_t}, X_t, Y_t\right)\right\} \mid X_{1_t}\right]}. \quad (13)$$
On the other hand, since $t_{n-1}(\lambda, \omega)$ and the denominator in (13) are $\mathcal{F}_{n-1}$-measurable, we also have, by (11) and (12),
$$\mathbb{E}\left[t_n(\lambda, \omega) \mid \mathcal{F}_{n-1}\right] = t_{n-1}(\lambda, \omega) \cdot \frac{\mathbb{E}\left[\exp\left\{\lambda \sum_{t \in L_n} g_t\left(X_{1_t}, X_t, Y_t\right)\right\} \mid \mathcal{F}_{n-1}\right]}{\prod_{t \in L_n} \mathbb{E}\left[\exp\left\{\lambda g_t\left(X_{1_t}, X_t, Y_t\right)\right\} \mid X_{1_t}\right]}. \quad (14)$$
Combining (13) and (14), we get
$$\mathbb{E}\left[t_n(\lambda, \omega) \mid \mathcal{F}_{n-1}\right] = t_{n-1}(\lambda, \omega) \quad \text{a.s.} \quad (15)$$
Thus we complete the proof of Lemma 4.

Lemma 5. Let $\{(X_t, Y_t),\ t \in T\}$ be a $G \times \Theta$-valued hidden Markov chain indexed by an infinite tree $T$ with uniformly bounded degrees. $\{g_t(x, y, z),\ t \in T\}$ are functions defined on $G^2 \times \Theta$ as above; denote
$$G_n(\omega) = \sum_{t \in T^{(n)} \setminus \{0\}} g_t\left(X_{1_t}, X_t, Y_t\right), \quad (16)$$
$$H_n(\omega) = \sum_{t \in T^{(n)} \setminus \{0\}} \mathbb{E}\left[g_t\left(X_{1_t}, X_t, Y_t\right) \mid X_{1_t}\right]. \quad (17)$$
Let $\alpha > 0$, so that $\left\{\mathbb{E}\left[g_t^2\left(X_{1_t}, X_t, Y_t\right) e^{\alpha\left|g_t\left(X_{1_t}, X_t, Y_t\right)\right|} \mid X_{1_t}\right],\ t \in T\right\}$ is a sequence of nonnegative random variables. Denote
$$D(\alpha) = \left\{\omega :\ \limsup_{n \to \infty} \frac{1}{\left|T^{(n)}\right|} \sum_{t \in T^{(n)} \setminus \{0\}} \mathbb{E}\left[g_t^2\left(X_{1_t}, X_t, Y_t\right) e^{\alpha\left|g_t\left(X_{1_t}, X_t, Y_t\right)\right|} \mid X_{1_t}\right] = M(\omega) < \infty\right\}. \quad (18)$$
Then
$$\lim_{n \to \infty} \frac{G_n(\omega) - H_n(\omega)}{\left|T^{(n)}\right|} = 0 \quad \text{a.s. on } D(\alpha). \quad (19)$$

Proof. By Lemma 4, we know that $\{t_n(\lambda, \omega),\ \mathcal{F}_n,\ n \ge 1\}$ is a nonnegative martingale. According to the Doob martingale convergence theorem, we have
$$\lim_{n \to \infty} t_n(\lambda, \omega) = t_\infty(\lambda, \omega) < \infty \quad \text{a.s.,} \quad (20)$$
which implies that
$$\limsup_{n \to \infty} \frac{\ln t_n(\lambda, \omega)}{\left|T^{(n)}\right|} \le 0 \quad \text{a.s.} \quad (21)$$
Combining (9), (16), and (21), we arrive at
$$\limsup_{n \to \infty} \frac{1}{\left|T^{(n)}\right|}\left[\lambda G_n(\omega) - \sum_{t \in T^{(n)} \setminus \{0\}} \ln \mathbb{E}\left[e^{\lambda g_t\left(X_{1_t}, X_t, Y_t\right)} \mid X_{1_t}\right]\right] \le 0 \quad \text{a.s.} \quad (22)$$
Let $\lambda > 0$. Dividing both sides of the above inequality by $\lambda$, we get
$$\limsup_{n \to \infty} \frac{1}{\left|T^{(n)}\right|}\left[G_n(\omega) - \frac{1}{\lambda} \sum_{t \in T^{(n)} \setminus \{0\}} \ln \mathbb{E}\left[e^{\lambda g_t\left(X_{1_t}, X_t, Y_t\right)} \mid X_{1_t}\right]\right] \le 0 \quad \text{a.s.} \quad (23)$$
For the case $0 < \lambda \le \alpha$, by using the inequalities $\ln x \le x - 1$ $(x > 0)$, $e^x - 1 - x \le \frac{x^2}{2} e^{|x|}$, and (23),
$$\begin{aligned}
\limsup_{n \to \infty} \frac{G_n(\omega) - H_n(\omega)}{\left|T^{(n)}\right|}
&\le \limsup_{n \to \infty} \frac{1}{\left|T^{(n)}\right|} \sum_{t \in T^{(n)} \setminus \{0\}} \left[\frac{1}{\lambda} \ln \mathbb{E}\left[e^{\lambda g_t} \mid X_{1_t}\right] - \mathbb{E}\left[g_t \mid X_{1_t}\right]\right] \\
&\le \limsup_{n \to \infty} \frac{1}{\left|T^{(n)}\right|} \sum_{t \in T^{(n)} \setminus \{0\}} \frac{\lambda}{2} \mathbb{E}\left[g_t^2 e^{\alpha\left|g_t\right|} \mid X_{1_t}\right] \le \frac{\lambda}{2} M(\omega) \quad \text{a.s. on } D(\alpha). \quad (24)
\end{aligned}$$
Letting $\lambda \to 0^{+}$ in (24), combining with (18), we have
$$\limsup_{n \to \infty} \frac{G_n(\omega) - H_n(\omega)}{\left|T^{(n)}\right|} \le 0 \quad \text{a.s. on } D(\alpha). \quad (25)$$
Let $-\alpha \le \lambda < 0$. Similarly to the analysis of the case $0 < \lambda \le \alpha$, it follows from (22) that
$$\liminf_{n \to \infty} \frac{G_n(\omega) - H_n(\omega)}{\left|T^{(n)}\right|} \ge \frac{\lambda}{2} M(\omega) \quad \text{a.s. on } D(\alpha). \quad (26)$$
Letting $\lambda \to 0^{-}$, we can arrive at
$$\liminf_{n \to \infty} \frac{G_n(\omega) - H_n(\omega)}{\left|T^{(n)}\right|} \ge 0 \quad \text{a.s. on } D(\alpha). \quad (27)$$
Combining (25) and (27), we obtain (19) directly.

Corollary 6. Let $\{(X_t, Y_t),\ t \in T\}$ be a $G \times \Theta$-valued hidden Markov chain indexed by an infinite tree $T$ with uniformly bounded degrees. If $\{g_t,\ t \in T\}$ are uniformly bounded functions defined on $G^2 \times \Theta$, let $c$ be a nonnegative constant such that $|g_t| \le c$ for all $t \in T$. Then
$$\lim_{n \to \infty} \frac{1}{\left|T^{(n)}\right|}\left[\sum_{t \in T^{(n)} \setminus \{0\}} g_t\left(X_{1_t}, X_t, Y_t\right) - \sum_{t \in T^{(n)} \setminus \{0\}} \mathbb{E}\left[g_t\left(X_{1_t}, X_t, Y_t\right) \mid X_{1_t}\right]\right] = 0 \quad \text{a.s.} \quad (28)$$

Proof. Letting $\alpha = 1$ in Lemma 5 and noticing that the $g_t$ are uniformly bounded by $c$, we have $\mathbb{E}\left[g_t^2 e^{\left|g_t\right|} \mid X_{1_t}\right] \le c^2 e^{c}$ for all $t \in T$, so that $M(\omega) \le c^2 e^c < \infty$ and $D(1) = \Omega$. The corollary then follows from Lemma 5 directly.

3. Strong Law of Large Numbers

In the following, we always let $i, j \in G$ and $l \in \Theta$, and denote
$$S_n(i) = \sum_{t \in T^{(n)} \setminus \{0\}} \delta_i\left(X_{1_t}\right), \quad (30)$$
where here and thereafter $\delta_i(\cdot)$ denotes the Kronecker delta function, that is, $\delta_i(x) = 1$ if $x = i$ and $\delta_i(x) = 0$ otherwise. The following lemma is very useful for proving our main result.

Lemma 7 (see [14]). Let $T$ be an infinite tree with uniformly bounded degrees and let $\{X_t,\ t \in T\}$ be a tree-indexed Markov chain with finite state space $G$, which is determined by the initial distribution (1) and the finite transition probability matrix $P$. Suppose that the stochastic matrix $P$ is ergodic, whose unique stationary distribution is $\pi = (\pi(1), \dots, \pi(N))$; that is, $\pi P = \pi$ and $\sum_{i \in G} \pi(i) = 1$. Let $S_n(i)$ be defined as in (30). Thus one has
$$\lim_{n \to \infty} \frac{S_n(i)}{\left|T^{(n)}\right|} = \pi(i) \quad \text{a.s.} \quad (31)$$

Theorem 8. Let $T$ be an infinite tree with uniformly bounded degrees and let $\{(X_t, Y_t),\ t \in T\}$ be a $G \times \Theta$-valued hidden Markov chain indexed by $T$. For all $i, j \in G$, $l \in \Theta$, and each nonnegative integer $n$, define the following empirical measure of the triples $\left(X_{1_t}, X_t, Y_t\right)$:
$$S_n(i, j, l) = \sum_{t \in T^{(n)} \setminus \{0\}} \delta_i\left(X_{1_t}\right) \delta_j\left(X_t\right) \delta_l\left(Y_t\right). \quad (32)$$
If the transition probability matrix $P$ of $\{X_t,\ t \in T\}$ is ergodic, one has
$$\lim_{n \to \infty} \frac{S_n(i, j, l)}{\left|T^{(n)}\right|} = \pi(i) P(j \mid i) Q(l \mid j) \quad \text{a.s.,} \quad (33)$$
where $\pi$ is the stationary distribution of the ergodic matrix $P$.

Proof. For any $i, j \in G$ and $l \in \Theta$, let
$$g_t(x, y, z) = \delta_i(x) \delta_j(y) \delta_l(z), \quad t \in T. \quad (34)$$
Then we have
$$\sum_{t \in T^{(n)} \setminus \{0\}} \mathbb{E}\left[g_t\left(X_{1_t}, X_t, Y_t\right) \mid X_{1_t}\right] = \sum_{t \in T^{(n)} \setminus \{0\}} \delta_i\left(X_{1_t}\right) P(j \mid i) Q(l \mid j) = P(j \mid i) Q(l \mid j) S_n(i). \quad (35)$$
Combining (35) and Corollary 6, we obtain
$$\lim_{n \to \infty} \frac{1}{\left|T^{(n)}\right|}\left[S_n(i, j, l) - P(j \mid i) Q(l \mid j) S_n(i)\right] = 0 \quad \text{a.s.} \quad (36)$$
By using Lemma 7, our conclusion (33) holds.
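The conclusion (33) can be checked numerically. The sketch below is illustrative and not from the paper: it assumes a rooted binary tree and two-state matrices `P`, `Q`; for this `P`, the stationary distribution solving $\pi P = \pi$ is $\pi = (0.6, 0.4)$, so for instance $\pi(0) P(0 \mid 0) Q(0 \mid 0) = 0.6 \cdot 0.8 \cdot 0.9 = 0.432$.

```python
# Simulation check of the SLLN for triple frequencies (illustrative
# assumptions: binary tree, G = {0,1}, Theta = {0,1}).

import random
from collections import Counter

P = [[0.8, 0.2], [0.3, 0.7]]   # illustrative ergodic transition matrix
Q = [[0.9, 0.1], [0.2, 0.8]]   # illustrative emission matrix
pi = [0.6, 0.4]                # stationary distribution: pi P = pi

def sample(weights):
    return random.choices(range(len(weights)), weights=weights)[0]

def triple_freqs(n, seed=1):
    """Frequencies of the triples (X_{1_t}, X_t, Y_t) over the non-root
    vertices of T^(n), normalized by |T^(n)|, on a rooted binary tree."""
    random.seed(seed)
    level = [sample(pi)]           # root state, drawn from pi
    counts, total = Counter(), 1   # total accumulates |T^(n)|
    for _ in range(n):
        nxt = []
        for parent in level:
            for _ in range(2):     # two children per vertex
                x = sample(P[parent])
                y = sample(Q[x])
                counts[(parent, x, y)] += 1
                nxt.append(x)
                total += 1
        level = nxt
    return {key: c / total for key, c in counts.items()}

freqs = triple_freqs(14)
# empirical frequency of (0, 0, 0) vs. pi(0) P(0|0) Q(0|0) = 0.432
print(freqs[(0, 0, 0)], 0.6 * 0.8 * 0.9)
```

With $\left|T^{(14)}\right| = 32767$ vertices, the empirical frequency is typically close to the theoretical value, in line with (33).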

Corollary 9. Under the conditions of Theorem 8, denote $S_n(j, l) = \sum_{i \in G} S_n(i, j, l)$; then one has
$$\lim_{n \to \infty} \frac{S_n(j, l)}{\left|T^{(n)}\right|} = \sum_{i \in G} \pi(i) P(j \mid i) Q(l \mid j) \quad \text{a.s.,} \quad (37)$$
where $\pi$ is the stationary distribution of the ergodic matrix $P$.

For every finite $n \ge 1$, let $\{(X_t, Y_t),\ t \in T\}$ be a $G \times \Theta$-valued hidden Markov chain indexed by an infinite tree $T$ with uniformly bounded degrees. We define the offspring empirical measure as follows:
$$\Omega_n(j, l) = \frac{1}{\left|T^{(n)}\right|} \sum_{t \in T^{(n)} \setminus \{0\}} \delta_j\left(X_t\right) \delta_l\left(Y_t\right), \qquad j \in G,\ l \in \Theta. \quad (38)$$

In the following, we consider the limit law of the random sequence $\{\Omega_n(j, l),\ n \ge 1\}$ defined as above.

Theorem 10. Let $T$ be an infinite tree with uniformly bounded degrees, and let $\{(X_t, Y_t),\ t \in T\}$ be a $G \times \Theta$-valued hidden Markov chain indexed by $T$. If the transition probability matrix $P$ of $\{X_t,\ t \in T\}$ is ergodic, then
$$\lim_{n \to \infty} \Omega_n(j, l) = \pi(j) Q(l \mid j) \quad \text{a.s.,} \quad (39)$$
where $\pi$ is the stationary distribution of the ergodic matrix $P$.

Proof. Letting $t$ run over $T^{(n)} \setminus \{0\}$ and noting that $\sum_{i \in G} \delta_i\left(X_{1_t}\right) = 1$, we have
$$\sum_{i \in G} S_n(i, j, l) = \sum_{t \in T^{(n)} \setminus \{0\}} \delta_j\left(X_t\right) \delta_l\left(Y_t\right). \quad (40)$$
Comparing (38) with (40), it is easy to see that
$$\Omega_n(j, l) = \frac{\sum_{i \in G} S_n(i, j, l)}{\left|T^{(n)}\right|} = \frac{S_n(j, l)}{\left|T^{(n)}\right|}. \quad (41)$$
Taking the limit on both sides of the above equation as $n$ tends to infinity, it follows from Corollary 9 that
$$\lim_{n \to \infty} \Omega_n(j, l) = \sum_{i \in G} \pi(i) P(j \mid i) Q(l \mid j) = \pi(j) Q(l \mid j) \quad \text{a.s.,} \quad (42)$$
where the last equation holds because $\pi$ is the unique stationary distribution of the ergodic stochastic matrix $P$; that is, $\sum_{i \in G} \pi(i) P(j \mid i) = \pi(j)$. Thus we complete the proof of Theorem 10.

From the expression (38) we can easily obtain the empirical measure of the observed chain, which is denoted by $\Gamma_n(l)$:
$$\Gamma_n(l) = \frac{1}{\left|T^{(n)}\right|} \sum_{t \in T^{(n)} \setminus \{0\}} \delta_l\left(Y_t\right) = \sum_{j \in G} \Omega_n(j, l). \quad (43)$$
Thus we can obtain Corollary 11.

Corollary 11. Under the same conditions of Theorem 10, one has
$$\lim_{n \to \infty} \Gamma_n(l) = \sum_{j \in G} \pi(j) Q(l \mid j) \quad \text{a.s.} \quad (44)$$

Let $f$ be any function defined on $G \times \Theta$. Denote
$$H_n(f) = \frac{1}{\left|T^{(n)}\right|} \sum_{t \in T^{(n)} \setminus \{0\}} f\left(X_t, Y_t\right). \quad (45)$$
By simple computation, we arrive at Corollary 12.

Corollary 12. Under the same conditions of Theorem 10, one also has
$$\lim_{n \to \infty} H_n(f) = \sum_{j \in G} \sum_{l \in \Theta} \pi(j) Q(l \mid j) f(j, l) \quad \text{a.s.} \quad (46)$$

Now we define the conditional entropy rate of $Y^{T^{(n)}}$ given $X^{T^{(n)}}$ by
$$f_n(\omega) = -\frac{1}{\left|T^{(n)}\right|} \log \mathbb{P}\left(Y^{T^{(n)}} \mid X^{T^{(n)}}\right). \quad (47)$$
From (6), we obtain that $\mathbb{P}\left(Y^{T^{(n)}} = y^{T^{(n)}} \mid X^{T^{(n)}} = x^{T^{(n)}}\right) = \prod_{t \in T^{(n)}} Q\left(y_t \mid x_t\right)$, and hence
$$f_n(\omega) = -\frac{1}{\left|T^{(n)}\right|} \sum_{t \in T^{(n)}} \log Q\left(Y_t \mid X_t\right). \quad (48)$$

The convergence of $f_n(\omega)$ to a constant in a sense ($L^1$ convergence, convergence in probability, a.e. convergence) is called the conditional version of the Shannon-McMillan theorem or the entropy theorem or the AEP (asymptotic equipartition property) in information theory. Here, from Corollary 12, if we let
$$f(j, l) = -\log Q(l \mid j) \quad (49)$$
in (45), noting that the single root term in (48) vanishes after division by $\left|T^{(n)}\right|$, we can easily obtain the Shannon-McMillan theorem with a.e. convergence for the conditional entropy of hidden Markov chain fields on the tree $T$.

Corollary 13. Under the same conditions of Theorem 10, one has
$$\lim_{n \to \infty} f_n(\omega) = -\sum_{j \in G} \sum_{l \in \Theta} \pi(j) Q(l \mid j) \log Q(l \mid j) \quad \text{a.s.} \quad (50)$$
Here one also specifies $0 \log 0$ as zero by convention.
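The limit in Corollary 13 can also be illustrated numerically. The sketch below makes illustrative assumptions not fixed by the paper (a rooted binary tree and two-state matrices `P`, `Q`, with stationary distribution $\pi = (0.6, 0.4)$ for this `P`); it compares the conditional sample entropy rate with the claimed almost-sure limit.

```python
# Numerical illustration of the conditional sample entropy rate
# (illustrative assumptions: binary tree, G = {0,1}, Theta = {0,1}).

import math
import random

P = [[0.8, 0.2], [0.3, 0.7]]   # illustrative ergodic transition matrix
Q = [[0.9, 0.1], [0.2, 0.8]]   # illustrative emission matrix
pi = [0.6, 0.4]                # stationary distribution: pi P = pi

def sample(weights):
    return random.choices(range(len(weights)), weights=weights)[0]

def sample_entropy_rate(n, seed=0):
    """f_n = -(1/|T^(n)|) * sum over t of log Q(Y_t | X_t), computed
    on a simulated rooted binary tree of depth n."""
    random.seed(seed)
    level = [sample(pi)]            # root state, drawn from pi
    log_sum, count = 0.0, 0
    for depth in range(n + 1):
        nxt = []
        for x in level:
            y = sample(Q[x])        # observation at this vertex
            log_sum += math.log(Q[x][y])
            count += 1
            if depth < n:           # grow two children per vertex
                nxt += [sample(P[x]), sample(P[x])]
        level = nxt
    return -log_sum / count

# the claimed almost-sure limit from (50); approx. 0.3952 for these matrices
limit = -sum(pi[j] * Q[j][l] * math.log(Q[j][l])
             for j in range(2) for l in range(2))
print(sample_entropy_rate(15), limit)
```

For these matrices the limit is a weighted average of the row entropies of `Q`, and on a tree with $\left|T^{(15)}\right| = 65535$ vertices the sample value is typically close to it.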

4. Conclusion

This paper gives some strong limit theorems for hidden Markov chain fields indexed by an infinite tree with uniformly bounded degrees. We establish the strong law of large numbers for such fields and give the strong limit law of the conditional sample entropy rate.

Conflict of Interests

The author declares that there is no conflict of interests regarding the publication of this paper.

Acknowledgment

This work was supported by National Natural Science Foundation of China (Grant no. 11201344).