#### Abstract

We study the large deviations and moderate deviations of hypothesis testing for the squared radial Ornstein-Uhlenbeck model. Large deviation principles for the log-likelihood ratio are obtained, by which we give negative regions for testing the squared radial Ornstein-Uhlenbeck model and obtain the decay rates of the error probabilities.

#### 1. Introduction

Let us consider hypothesis testing for the following squared radial Ornstein-Uhlenbeck model: where is the unknown parameter to be tested on the basis of continuous observation of the process on the time interval [0,T], is a standard Brownian motion, and is known. We denote the distribution of the solution of (1.1) by .
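For reference, a squared radial Ornstein-Uhlenbeck (Cox-Ingersoll-Ross type) process is commonly written as the solution of the SDE below. This is the standard parametrization from the general literature; the symbols $\delta$, $\theta$, and the exact form of (1.1) in this paper may differ.

```latex
% Standard form of a squared radial OU (CIR-type) SDE; the paper's
% own parametrization in (1.1) may differ in notation.
\[
  dX_t = (\delta + 2\theta X_t)\,dt + 2\sqrt{X_t}\,dB_t,
  \qquad X_0 = x_0 \ge 0,
\]
% where \theta is the drift parameter to be tested, \delta > 0 is known,
% and (B_t) is a standard Brownian motion.
```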

We consider the two hypotheses: where . The hypothesis test is based on a partition of of the outcome process on into two (decision) regions and its complement , and we decide that is true or false according to the outcome or .

The probability of accepting when is actually true is called the error probability of the first kind. The probability of accepting when is actually true is called the error probability of the second kind. That is, By the Neyman-Pearson lemma (cf. [1]), the optimal decision region has the following form: where is the -algebra generated by the outcome process on .
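In standard notation (our choice of symbols, not necessarily the paper's), the two error probabilities and the Neyman-Pearson decision region can be sketched as follows:

```latex
% Error probabilities for testing H_0: theta = theta_0 vs H_1: theta = theta_1
% (symbols alpha_T, beta_T, B_T, c_T are our notation):
\[
  \alpha_T = P_{\theta_0}\big(B_T^{c}\big), \qquad
  \beta_T  = P_{\theta_1}\big(B_T\big),
\]
% where B_T is the acceptance region for H_0.  By the Neyman--Pearson
% lemma, the optimal region is a likelihood-ratio threshold set:
\[
  B_T = \Big\{ \log\frac{dP_{\theta_1}}{dP_{\theta_0}}\Big|_{\mathcal{F}_T}
        \le c_T \Big\},
\]
% with \mathcal{F}_T the \sigma-algebra generated by (X_s,\ 0 \le s \le T).
```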

Research on the hypothesis testing problem dates back to the 1930s (cf. [1]). Since the optimal decision region has the above form, we are interested in the calculation or approximation of the constant , and the hypothesis testing problem can be studied by large deviations (cf. [2-6]). In those papers, large deviation estimates of the error probabilities were obtained for i.i.d. sequences, Markov chains, stationary Gaussian processes, stationary diffusion processes, and Ornstein-Uhlenbeck processes. In this paper, we study the large deviations and moderate deviations for the hypothesis testing problem of the squared radial Ornstein-Uhlenbeck model. By the large deviation principle, we show that the error probability of the second kind either decays to 0 exponentially fast or approaches 1, depending on the exponent fixed for the decay of the error probability of the first kind; we also give negative regions and obtain the decay rates of the error probabilities by the moderate deviation principle. The large and moderate deviations for parameter estimators of the squared radial Ornstein-Uhlenbeck model were studied in [7, 8].

#### 2. Main Results

In this section, we state our main results.

Theorem 2.1. *Let be a positive function satisfying
**
For any , set
**
Then
*

Theorem 2.2. *If , then for each , there exists a , such that
**
and when ,
**
when ,
**
where
*

Theorem 2.3. *If , then for each , there exists a , such that
**
and when ,
**
when ,
**
where
*

#### 3. Moderate Deviations in Testing Squared Radial Ornstein-Uhlenbeck Model

In this section, we will prove Theorem 2.1. Let us introduce the log-likelihood ratio process of the squared radial Ornstein-Uhlenbeck model and study the moderate deviations of the log-likelihood ratio process.

By [7], the log-likelihood ratio process has the representation
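As a purely illustrative sketch (not the paper's computation): assuming the standard CIR-type parametrization $dX_t = (\delta + 2\theta X_t)\,dt + 2\sqrt{X_t}\,dB_t$, the process can be simulated by an Euler scheme and a Girsanov-type log-likelihood-ratio statistic approximated from the discretized path. The function names and the exact likelihood-ratio formula below are our assumptions, derived for this assumed parametrization rather than taken from [7].

```python
import math
import random

# Assumed parametrization (standard CIR / squared radial OU form, not
# necessarily identical to (1.1) in the paper):
#   dX_t = (delta + 2*theta*X_t) dt + 2*sqrt(X_t) dB_t.

def simulate_path(delta, theta, x0, T, n, seed=0):
    """Euler-Maruyama discretization of the SDE above on [0, T]."""
    rng = random.Random(seed)
    dt = T / n
    x, path = x0, [x0]
    for _ in range(n):
        dB = rng.gauss(0.0, math.sqrt(dt))
        x = x + (delta + 2.0 * theta * x) * dt \
              + 2.0 * math.sqrt(max(x, 0.0)) * dB
        x = max(x, 0.0)  # keep the scheme inside the state space [0, infinity)
        path.append(x)
    return path

def log_likelihood_ratio(path, delta, theta0, theta1, T):
    """Girsanov-type log-likelihood ratio log(dP_theta1/dP_theta0)|F_T
    for the assumed parametrization, approximated on a discrete path:
    (theta1-theta0)/2 * (X_T - X_0 - delta*T)
      - (theta1^2 - theta0^2)/2 * int_0^T X_s ds."""
    dt = T / (len(path) - 1)
    int_x = sum(path[:-1]) * dt  # Riemann approximation of int_0^T X_s ds
    return (0.5 * (theta1 - theta0) * (path[-1] - path[0] - delta * T)
            - 0.5 * (theta1 ** 2 - theta0 ** 2) * int_x)
```

Comparing this statistic with a threshold $c_T$ is then exactly the Neyman-Pearson test described in the introduction.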

The following lemma (cf. [9]) plays an important role in this paper.

Lemma 3.1. *The law of under is , where denotes the Gamma distribution:
**
Moreover, for any ,
*
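For concreteness, the density and moment generating function of a Gamma distribution are recalled below; these are standard facts, with shape and rate parameters $a, b$ of our choosing, since the lemma's exact parameters are not reproduced here.

```latex
% Density of the Gamma distribution Gamma(a, b) (shape a > 0, rate b > 0):
\[
  f(x) = \frac{b^{a}}{\Gamma(a)}\, x^{a-1} e^{-b x}, \qquad x > 0,
\]
% and for any \lambda < b its moment generating function is
\[
  \mathbb{E}\big[e^{\lambda Z}\big]
    = \Big(1 - \frac{\lambda}{b}\Big)^{-a}.
\]
```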

Lemma 3.2. *For any closed subset ,
**
and for any open subset ,
*

*Proof. *Let
By (3.1), for any , we have
where
For large enough, , we can choose , then and

By Lemma 3.1, we have
Therefore,
Finally, the Gärtner-Ellis theorem (cf. [10]) implies the conclusion of Lemma 3.2.
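For the reader's convenience, the Gärtner-Ellis theorem can be stated roughly as follows (our phrasing; $Z_T$, $a_T$, and $\Lambda$ stand for the generic objects of the theorem, not this paper's specific quantities):

```latex
% Gaertner--Ellis theorem (rough statement).  Suppose the limit
\[
  \Lambda(\lambda)
    = \lim_{T\to\infty} \frac{1}{a_T}
      \log \mathbb{E}\big[e^{\lambda\, a_T Z_T}\big]
\]
% exists for all real \lambda and \Lambda is essentially smooth (steep)
% and lower semicontinuous.  Then (Z_T) satisfies a large deviation
% principle with speed a_T and good rate function given by the
% Legendre transform
\[
  \Lambda^{*}(x) = \sup_{\lambda \in \mathbb{R}}
    \{\lambda x - \Lambda(\lambda)\}.
\]
```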

Noting that we also have the following result.

Lemma 3.3. *For any closed subset ,
**
and for any open subset ,
*

*Proof of Theorem 2.1. *The first claim is a direct conclusion of Lemma 3.2. Since
by Lemma 3.3, we see that the second one also holds.

#### 4. Large Deviations in Testing Squared Radial Ornstein-Uhlenbeck Model

In this section, we will prove Theorems 2.2 and 2.3. We first study the large deviations of the log-likelihood ratio process.

Lemma 4.1. *Assume . Then for any closed subset ,
**
and for any open subset ,
**
where
*

*Proof. *Let
Then for , we have
where .

Since , for , we can choose , for each ; then and
By Lemma 3.1, we get
Therefore,

Since is a strictly convex differentiable function on with
where is the effective domain of , we see that is steep. Finally, by
and the Gärtner-Ellis theorem, we complete the proof of this lemma.
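Steepness (essential smoothness) is the standard condition required by the Gärtner-Ellis theorem; in the usual formulation (our phrasing) it reads:

```latex
% A convex function \Lambda is steep (essentially smooth) if it is
% differentiable in the interior of its effective domain
% D_\Lambda = \{\lambda : \Lambda(\lambda) < \infty\}, and
\[
  \lim_{n\to\infty} \big|\nabla\Lambda(\lambda_n)\big| = \infty
\]
% whenever (\lambda_n) tends, from the interior, to a boundary
% point of D_\Lambda.
```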

Similarly, when , we have . Since is a strictly convex differentiable function on with and , we see that is steep. By the Gärtner-Ellis theorem, we also have the following result.

Lemma 4.2. *Assume . Then for any closed subset ,
**
and for any open subset ,
**
where
**
Note that
**
Then we have the following lemma.*

Lemma 4.3. *Assume . Then for any closed subset ,
**
and for any open subset ,
**
where
*

Lemma 4.4. *Assume . Then for any closed subset ,
**
and for any open subset ,
**
where
*

By the expressions of , , , and , the following lemma holds.

Lemma 4.5. *
(i)**
for all ,
**
(ii)**
for all ,
*

*Proof of Theorems 2.2 and 2.3. *Since the proofs of the two theorems are similar, we only prove Theorem 2.2. Note that is increasing on and . Therefore, for , by Lemma 4.1, we can choose a such that

It is clear that is increasing for , and by Lemma 4.5, we get , which implies . Hence for , we have , and since is nonincreasing for , we get
Similarly, for , we have , and since is nondecreasing for , we get
which completes the proof of Theorem 2.2.

#### Acknowledgments

The authors would like to express their gratitude to Professor F. Q. Gao and the reviewer for their valuable comments.