Abstract

This paper proposes the least squares method to estimate the drift parameter for stochastic differential equations driven by small noises that are more general than pure jump $\alpha$-stable noises. The asymptotic properties of this least squares estimator are studied under some regularity conditions. The asymptotic distribution of the estimator is shown to be the convolution of a stable distribution and a normal distribution, which is completely different from the classical cases.

1. Introduction

Stochastic differential equations (SDEs) are extensively used to model phenomena subject to random influences; they have found many applications in biology [1], medicine [2], econometrics [3, 4], finance [5], geophysics [6], and oceanography [7]. Statistical inference for these equations is therefore of great interest and poses challenging theoretical problems. For a more recent comprehensive discussion, we refer to [8, 9].

The asymptotic theory of parametric estimation for diffusion processes with small white noise based on continuous-time observations is well developed and has been studied by many authors (see, e.g., [10–14]). There have been many applications of small noise in mathematical finance; see, for example, [15–18].

In parametric inference, because it is impossible to observe diffusions continuously throughout a time interval, it is more practical and interesting to consider asymptotic estimation for diffusion processes with small noise based on discrete observations. There are many approaches to drift estimation for discretely observed diffusions (see, e.g., [19–23]). Long [24] started the study of parameter estimation for a class of stochastic differential equations driven by small $\alpha$-stable noise. However, parametric inference for stochastic processes driven by more general small Lévy noises has not yet been studied.

In this paper, we study parameter estimation, based on discrete observations, for stochastic differential equations driven by a more general Lévy noise. We employ the least squares method to obtain an asymptotically consistent estimator.

Let $(\Omega, \mathcal{F}, \{\mathcal{F}_t\}_{t \ge 0}, P)$ be a basic complete filtered probability space satisfying the usual conditions; that is, the filtration is right continuous and $\mathcal{F}_0$ contains all $P$-null sets. In this paper, we consider the class of stochastic differential equations given in (1), where the drift and dispersion coefficients are known functions, the noise level $\varepsilon$ and the initial value are known constants, and $\theta$ is the drift parameter. Let $B = (B_t)_{t \ge 0}$ be a standard Brownian motion and let $Z = (Z_t)_{t \ge 0}$ be a standard $\alpha$-stable Lévy motion independent of $B$, with index $\alpha \in (1, 2)$.
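For orientation, the class of equations considered in this line of work (cf. [24] and its extensions) can be sketched as follows; the coefficient names $f$, $g$, $h$, the horizon $[0,1]$, and the parameterization of the stable law are assumptions of this sketch rather than necessarily the exact notation of (1):
\[
dX_t = \theta f(X_t)\,dt + \varepsilon g(X_t)\,dB_t + \varepsilon h(X_{t-})\,dZ_t, \qquad X_0 = x_0, \quad t \in [0,1],
\]
where $\theta$ is the unknown drift parameter, $\varepsilon \in (0,1]$ is the small-noise level, and $Z_1$ is assumed to follow a standard $\alpha$-stable law $S_\alpha(1, \beta, 0)$ with $\beta \in [-1,1]$.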

Let $X = (X_t)_{t \ge 0}$ be a real-valued, stationary process satisfying the stochastic differential equation (1), and assume that this process is observed at regularly spaced time points $t_i$, $i = 1, \dots, n$. Let $X^0 = (X^0_t)_{t \ge 0}$ be the solution of the underlying ordinary differential equation (ODE) obtained from (1) by removing the noise terms, with the true value $\theta_0$ of the drift parameter. Subtracting this ODE from (1) then yields a decomposition of the observed increments into a drift term and small-noise terms.
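Under the illustrative form of (1) sketched above (with the same assumed notation $f$, $g$, $h$), the associated ODE and the resulting increment decomposition read
\[
dX^0_t = \theta_0 f(X^0_t)\,dt, \qquad X^0_0 = x_0,
\]
and, for $i = 1, \dots, n$,
\[
X_{t_i} - X_{t_{i-1}} = \theta \int_{t_{i-1}}^{t_i} f(X_s)\,ds + \varepsilon \int_{t_{i-1}}^{t_i} g(X_s)\,dB_s + \varepsilon \int_{t_{i-1}}^{t_i} h(X_{s-})\,dZ_s .
\]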

2. Preliminaries

In this paper, we denote by $C$ a generic constant whose value may vary from place to place.

The following regularity conditions are assumed to hold:
(A1) The coefficient functions of (1) satisfy the Lipschitz condition; that is, there exists a constant such that each coefficient function is globally Lipschitz continuous with that constant.
(A2) There exist constants under which the coefficient functions satisfy the linear growth condition.
(A3) There exists a positive constant such that the corresponding bound holds.
(A4) For the relevant range of the parameters, the corresponding condition holds.

The least squares estimator (LSE) $\hat{\theta}_{n,\varepsilon}$ of $\theta$ is defined as the minimizer of the contrast function $\rho_{n,\varepsilon}(\theta)$, and $\hat{\theta}_{n,\varepsilon}$ can then be represented explicitly. Based on (3) and (9), there is a special decomposition for $\hat{\theta}_{n,\varepsilon}$. Using (10), we obtain an explicit expression for the estimation error $\hat{\theta}_{n,\varepsilon} - \theta_0$, whose terms are analyzed in the next section.
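For concreteness, a minimal sketch of the least squares construction in this setting, assuming equally spaced observation times $t_i = i/n$ on $[0,1]$ and the hypothetical drift coefficient $f$ introduced after (1), is
\[
\rho_{n,\varepsilon}(\theta)
=\sum_{i=1}^{n}\Bigl|X_{t_{i}}-X_{t_{i-1}}-\theta\, f(X_{t_{i-1}})\,\Delta t_{i-1}\Bigr|^{2},
\qquad \Delta t_{i-1}=t_{i}-t_{i-1}=\tfrac{1}{n},
\]
and minimizing $\rho_{n,\varepsilon}$ over $\theta$ gives
\[
\hat{\theta}_{n,\varepsilon}
=\frac{\sum_{i=1}^{n} f(X_{t_{i-1}})\bigl(X_{t_{i}}-X_{t_{i-1}}\bigr)}
       {\sum_{i=1}^{n} f^{2}(X_{t_{i-1}})\,\Delta t_{i-1}}.
\]
Multiplying the contrast by a positive weighting factor (such as $n/\varepsilon^{2}$, as is sometimes done) does not change the minimizer. Substituting the increment decomposition written after the ODE splits $\hat{\theta}_{n,\varepsilon}-\theta$ into a drift-approximation term, a Brownian term, and a stable term, which are the objects studied in Section 3.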

One of the important tools we will employ is the following lemma (see Lemma 3.2 of [24]).

Lemma 1. Under the regularity conditions, the convergence (13) holds.
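For reference, a typical estimate of this kind in the small-noise literature, given here only as an illustrative sketch rather than the exact statement of (13), is the uniform closeness of $X$ to the deterministic limit $X^{0}$:
\[
\sup_{0 \le t \le 1} \bigl| X_t - X^0_t \bigr| \longrightarrow 0 \quad \text{in probability as } \varepsilon \to 0 .
\]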

3. Asymptotic Property of the Least Squares Estimator

Theorem 2. Under the regularity conditions, as $\varepsilon \to 0$ and $n \to \infty$ with the stated rate conditions, the estimation error $\hat{\theta}_{n,\varepsilon} - \theta_0$, suitably normalized, converges in distribution to a limit determined by an $\alpha$-stable random variable and an independent standard normal random variable.
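For orientation, a sketch of the type of limit obtained in this setting, written under the illustrative form of (1) above (so $f$, $g$, $h$, the normalization $\varepsilon^{-1}$, and the exact rate conditions between $n$ and $\varepsilon$ are assumptions of this sketch), is
\[
\varepsilon^{-1}\bigl(\hat{\theta}_{n,\varepsilon}-\theta_{0}\bigr)
\;\xrightarrow{\;d\;}\;
\frac{\displaystyle\int_{0}^{1} f(X^{0}_{t})\, g(X^{0}_{t})\,dB_{t}
      + \int_{0}^{1} f(X^{0}_{t})\, h(X^{0}_{t})\,dZ_{t}}
     {\displaystyle\int_{0}^{1} f^{2}(X^{0}_{t})\,dt}.
\]
Since $X^{0}$ is deterministic, the first stochastic integral is a centered normal random variable, the second is $\alpha$-stable, and the two are independent because $B$ and $Z$ are; this is exactly the convolution structure described in the abstract.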

The theorem will be proved by establishing several propositions, in which we consider the asymptotic behavior of each of the terms in the decomposition of $\hat{\theta}_{n,\varepsilon}$, respectively.

Proposition 3. Under the regularity conditions, as $\varepsilon \to 0$ and $n \to \infty$, the corresponding term in the decomposition of $\hat{\theta}_{n,\varepsilon}$ converges to its limit.

Proof. Under the regularity conditions, Proposition 3 can be proved along the lines of the proof of Proposition 3.3 in [24].

Proposition 4. Under the regularity conditions, as $\varepsilon \to 0$ and $n \to \infty$, the corresponding term in the decomposition of $\hat{\theta}_{n,\varepsilon}$ converges to its limit.

Proof. For $t$ in the observation interval, consider the difference between $X_t$ and $X^0_t$. It follows from the Lipschitz condition that this difference is controlled by an integral of itself plus a small-noise term; using the Gronwall inequality, we get a bound which, under the regularity conditions, yields the required estimate. Then, using (13) in Lemma 1 together with the regularity conditions, we obtain the desired convergence as $\varepsilon \to 0$ and $n \to \infty$ (see [24]). By using the same techniques, we can prove that the remaining terms converge as $\varepsilon \to 0$ and $n \to \infty$, respectively.
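To sketch the Gronwall step under the illustrative form of (1) (with $K$ an assumed Lipschitz constant of $f$ and the horizon taken to be $[0,1]$), one has
\[
|X_{t}-X^{0}_{t}|
\le |\theta_{0}| K \int_{0}^{t} |X_{s}-X^{0}_{s}|\,ds
+ \varepsilon \sup_{0 \le u \le t}\Bigl|\int_{0}^{u} g(X_{s})\,dB_{s}+\int_{0}^{u} h(X_{s-})\,dZ_{s}\Bigr|,
\]
so the Gronwall inequality gives
\[
\sup_{0 \le t \le 1}|X_{t}-X^{0}_{t}|
\le \varepsilon\, e^{|\theta_{0}| K}
\sup_{0 \le u \le 1}\Bigl|\int_{0}^{u} g(X_{s})\,dB_{s}+\int_{0}^{u} h(X_{s-})\,dZ_{s}\Bigr|,
\]
which tends to $0$ in probability as $\varepsilon \to 0$.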

Proposition 5. Under the regularity conditions, as $\varepsilon \to 0$ and $n \to \infty$, the corresponding term in the decomposition of $\hat{\theta}_{n,\varepsilon}$ converges to its limit.

Proof. Under the regularity conditions, Proposition 5 can be proved along the lines of the proof of Proposition 4.4 in [24].

Proposition 6. Under the regularity conditions, as $\varepsilon \to 0$ and $n \to \infty$, the remaining term in the decomposition of $\hat{\theta}_{n,\varepsilon}$ converges to its limit.

Proof. Note that this term can be split into several parts. For the first part, the relevant Brownian increments over the intervals $[t_{i-1}, t_i]$, $i = 1, \dots, n$, are easily seen to be independent normal random variables.
It follows that this part converges in distribution as $\varepsilon \to 0$ and $n \to \infty$.
For the second part, using the Markov inequality and Itô's isometry, for any given $\delta > 0$ the corresponding tail probability is controlled; by (13), it tends to $0$ as $\varepsilon \to 0$ and $n \to \infty$.
Applying similar techniques to the remaining parts, we get that they also converge to $0$ in probability as $\varepsilon \to 0$ and $n \to \infty$, respectively.
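To sketch the Markov–Itô isometry step, write a remainder as a single Brownian integral $\int_0^1 \psi_{n,\varepsilon}(s)\,dB_s$, where the integrand $\psi_{n,\varepsilon}$ is a hypothetical placeholder for the actual error term (an assumption of this sketch). Then, for any $\delta > 0$,
\[
P\Bigl(\Bigl|\int_{0}^{1}\psi_{n,\varepsilon}(s)\,dB_{s}\Bigr|>\delta\Bigr)
\le \frac{1}{\delta^{2}}\,E\Bigl(\int_{0}^{1}\psi_{n,\varepsilon}(s)\,dB_{s}\Bigr)^{2}
= \frac{1}{\delta^{2}}\int_{0}^{1}E\bigl[\psi_{n,\varepsilon}^{2}(s)\bigr]\,ds,
\]
and the right-hand side tends to $0$ by (13) and the Lipschitz condition.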

Now we can prove Theorem 2.

Proof. By using Propositions 3–6 and Slutsky's theorem, we obtain the conclusion.

4. Example

We consider the following nonlinear SDE driven by general Lévy noises, in which all coefficients and constants other than the drift parameter are known and $\theta$ is the unknown parameter.

For simplicity, fixing the remaining constants at specific values, we get the corresponding ODE and its explicit solution; the asymptotic distribution of the least squares estimator then follows.
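As a worked illustration only (the specific coefficient choices below are hypothetical and need not coincide with the paper's example), take $f(x) = x$, $g(x) = h(x) = 1$, and $x_0 > 0$ in the illustrative form of (1), with $\theta_0 \neq 0$. The ODE $dX^0_t = \theta_0 X^0_t\,dt$, $X^0_0 = x_0$, has the solution $X^0_t = x_0 e^{\theta_0 t}$, and the sketched limit of Theorem 2 becomes
\[
\varepsilon^{-1}\bigl(\hat{\theta}_{n,\varepsilon}-\theta_{0}\bigr)
\;\xrightarrow{\;d\;}\;
\frac{\displaystyle\int_{0}^{1} x_{0} e^{\theta_{0}t}\,dB_{t}+\int_{0}^{1} x_{0} e^{\theta_{0}t}\,dZ_{t}}
     {\displaystyle\int_{0}^{1} x_{0}^{2} e^{2\theta_{0}t}\,dt},
\]
where the first integral is normal with variance $x_{0}^{2}(e^{2\theta_{0}}-1)/(2\theta_{0})$ and the second is $\alpha$-stable with scale $x_{0}\bigl((e^{\alpha\theta_{0}}-1)/(\alpha\theta_{0})\bigr)^{1/\alpha}$, so the limit is indeed a scaled convolution of a normal and a stable distribution.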

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.