Journal of Probability and Statistics
Volume 2017, Article ID 6219149, 14 pages
https://doi.org/10.1155/2017/6219149
Research Article

Estimation of the Parameters of a Chirp Type Model with Stationary Residuals

K. Perera

Department of Engineering Mathematics, Faculty of Engineering, University of Peradeniya, Peradeniya, Sri Lanka

Correspondence should be addressed to K. Perera; kanthip@pdn.ac.lk

Received 19 July 2016; Accepted 1 December 2016; Published 9 February 2017

Academic Editor: Aera Thavaneswaran

Copyright © 2017 K. Perera. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Consider observations from a chirp type statistical model, that is, a sinusoid whose frequency changes linearly with time, observed with additive stationary noise. We consider a method of estimating the four signal parameters together with the noise variance, which is essentially an approximate least-squares method. The main advantage of the proposed approach is that no further assumptions are required. We make use of three previously established theorems associated with the kernel and then use them to prove, under certain conditions, the consistency of the estimators.

1. Introduction

In 1973, Walker [1] considered the problem of estimating the parameters of a sine wave observed with additive noise, where the noise terms are independent, identically distributed random variables with mean zero and finite but unknown variance. The parameters of the sinusoid and the noise variance are assumed unknown and are to be estimated. He showed that, as the number of observations tends to infinity, the estimators converge in probability to the values they estimate; that is, the estimators are consistent. He then showed that the differences between the estimators and the actual values they estimate have, asymptotically, a joint normal distribution.
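For reference, a standard way of writing such a sine-wave-plus-noise model (the symbols below are illustrative choices and may differ from the article's own notation) is

$$ y_t = \rho \cos(\omega t + \varphi) + \varepsilon_t, \qquad t = 1, \dots, N, $$

where $\rho$ is the amplitude, $\varphi$ the phase, $\omega$ the angular frequency, and the $\varepsilon_t$ are the i.i.d. errors with mean zero and variance $\sigma^2$.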

Suppose now that the frequency of the above sine wave changes with time. If the frequency at time zero is the initial frequency, the frequency at time t may be written as the initial frequency plus a time-dependent increment. The simplest case is one in which this increment is linear in t, so that the phase of the sinusoid contains a quadratic term. This leads to a model whose parameters are the amplitude, the phase, the initial frequency, the rate of change of the frequency, and the noise variance. This is sometimes called the “chirp” model (see [2, 3]). Several authors have considered the parameter estimation of the chirp model; see [4–7]. Different approaches to the estimation of chirp parameters in similar kinds of models are found in [8–12].
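Concretely, if the frequency at time t is taken to be $\omega + \theta t$ (again in illustrative notation), the accumulated phase is $\omega t + \tfrac{\theta}{2} t^{2} + \varphi$, and the chirp model takes the familiar form

$$ y_t = \rho \cos\!\left(\omega t + \tfrac{\theta}{2}\, t^{2} + \varphi\right) + \varepsilon_t, \qquad t = 1, \dots, N; $$

in much of the chirp literature the factor 1/2 is absorbed into the chirp-rate parameter, giving a quadratic phase term of the form $\theta t^{2}$.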

Our approach, however, is entirely different from these: we do not make any additional assumptions. The method used to estimate the parameters and to prove the consistency of the estimators is similar to that used by Walker in [13]. We consider not only estimates of the four signal parameters but also an estimate of the noise variance. Although this leads to an interesting problem in estimation, the model above is somewhat unrealistic: over the course of the observations the frequency changes from its initial value to a value that grows with the number of observations, so, as the sample size goes to infinity, the sine wave oscillates faster and faster (unless the chirp rate is zero), its frequency becomes infinite, and its period approaches zero. We therefore change the model by assuming that the change in frequency over the course of the observations is a number independent of the sample size. We also assume, as in (3), that the change in frequency is linear. This leads to the model or, more precisely, to a sequence of models, one for each sample size; a standard form of this sequence is sketched below. We assume, as Walker does in his 1971 paper, that the errors are independent and identically distributed with mean zero and finite variance. The parameters are thus the amplitude, the phase, the initial frequency, the chirp rate, and the error variance, and the corresponding estimators are constructed in Section 2. Our objective is to show that these estimators are consistent.
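One common way to make the total change in frequency independent of the sample size N is to scale the quadratic phase term by N. In the illustrative notation used above (this is our reading of the construction, not necessarily the article's exact display), the sequence of models is

$$ y_{t,N} = \rho \cos\!\left(\omega t + \frac{\theta}{N}\, t^{2} + \varphi\right) + \varepsilon_t, \qquad t = 1, \dots, N, $$

so that the instantaneous frequency moves from approximately $\omega$ at the start of the record to approximately $\omega + 2\theta$ at the end, a change that does not depend on N.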

We make use of the following three theorems which we have established.

Theorem 1 (see [14]). Let , where is a nonnegative real number. Then , for not a rational multiple of , uniformly in .

Theorem 2 (see [15]). Let be a sequence of independent random variables such that , , and . Then

Theorem 3 (see [16]). For any sufficiently small and

2. Estimation of the Parameters

If the errors are normally distributed, the likelihood function of the observations takes the usual Gaussian form, and the corresponding log likelihood is, apart from an additive constant, determined by the error variance and the sum of squared residuals.
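In the illustrative notation of the previous section, the Gaussian log likelihood referred to here has the standard form

$$ \ell\big(\rho, \varphi, \omega, \theta, \sigma^{2}\big) = -\frac{N}{2}\log\!\big(2\pi\sigma^{2}\big) - \frac{1}{2\sigma^{2}} \sum_{t=1}^{N}\Big(y_t - \rho\cos\!\big(\omega t + \tfrac{\theta}{N} t^{2} + \varphi\big)\Big)^{2}, $$

so that, for each fixed value of $\sigma^{2}$, maximizing the likelihood amounts to minimizing the sum of squared residuals.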

Now consider the sum of squared residuals:

By using standard trigonometric identities for the squares and the product of the cosine and sine terms, we obtain an expansion of this sum. From Theorem 1, the last two terms in (11) are of lower order. Therefore, if we let

then this quantity is an approximation of the sum of squared residuals, with the error of the approximation controlled by Theorem 1. Thus (13) is an approximation of the log likelihood function. Now fix the variance in (13). Then we maximize (13) to obtain the estimates of the four signal parameters; that is, we minimize the approximate sum of squares over the parameter region. Now fix the frequency and the chirp rate. If either of the two amplitude parameters tends to infinity, the criterion tends to infinity, and since the criterion is continuous in the amplitude parameters, the minimum is achieved at some point, at which the partial derivatives with respect to the two amplitude parameters must vanish. Thus the estimators of the amplitude parameters are the solutions of the resulting pair of equations, which has an explicit solution. To obtain estimates of the frequency and the chirp rate we substitute these solutions into (12) and minimize the result as a function of the frequency and the chirp rate; this is the same as maximizing the remaining term. Since the frequency and the chirp rate vary over a compact set, there is a point which maximizes this expression, and we take that point as the estimate of the frequency and the chirp rate. The minimum value of the approximate sum of squares is attained at that point.
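The steps just described can be sketched explicitly in the cosine-sine parametrization. Writing (in our illustrative notation) $x_t = \omega t + \theta t^{2}/N$, $A = \rho\cos\varphi$, and $B = -\rho\sin\varphi$, so that $\rho\cos(x_t + \varphi) = A\cos x_t + B\sin x_t$, the expansion of the sum of squares has the form

$$ \sum_{t=1}^{N}\big(y_t - A\cos x_t - B\sin x_t\big)^{2} = \sum_{t=1}^{N} y_t^{2} - 2A\sum_{t=1}^{N} y_t\cos x_t - 2B\sum_{t=1}^{N} y_t\sin x_t + \frac{N}{2}\big(A^{2}+B^{2}\big) + R_N, $$

where the remainder $R_N$ collects the sums of $\cos 2x_t$ and $\sin 2x_t$ terms, which a kernel result of the type of Theorem 1 shows to be of lower order. Setting the partial derivatives with respect to A and B to zero gives the closed-form solutions

$$ \hat A(\omega, \theta) = \frac{2}{N}\sum_{t=1}^{N} y_t \cos x_t, \qquad \hat B(\omega, \theta) = \frac{2}{N}\sum_{t=1}^{N} y_t \sin x_t, $$

and substituting these back shows that minimizing over the frequency and the chirp rate is equivalent to maximizing the periodogram-type criterion

$$ I_N(\omega, \theta) = \frac{1}{N}\left|\sum_{t=1}^{N} y_t\, e^{-i\left(\omega t + \theta t^{2}/N\right)}\right|^{2} $$

over the compact set of admissible $(\omega, \theta)$ values. This sketch follows the standard Walker-type argument under our assumed parametrization; the article's corresponding displays may differ in detail.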

Now we obtain an estimator of the error variance. If the variance were zero then, since the errors have mean zero and variance zero, they would vanish almost surely and there would be no randomness in the model. We assume, then, that the variance is positive. To obtain the estimator we substitute the estimates of the four signal parameters into (13) and maximize the result with respect to the variance. From (13), we will show that this function of the variance achieves its positive maximum provided a certain sample quantity is positive, and we show that this quantity is positive with high probability for large sample sizes. From (4) and (9), together with (11) and (12) and the mean value theorem (applied at a point on the line segment connecting the estimator and the value it estimates), we obtain the required expansion.

Thus, using the results of Section 3 quoted there, we obtain the corresponding bounds, and (25) and (22), combined with (21), yield the required expression. By the weak law of large numbers, the average of the squared errors converges in probability to the error variance. Using this in (30), we obtain that the quantity in question is positive with probability arbitrarily close to 1 for sufficiently large sample sizes. Suppose, then, that it is positive. Since the criterion tends to minus infinity as the variance tends to zero or to infinity, and since it is a continuous function of the variance, the maximum is achieved at an interior point, at which the derivative with respect to the variance vanishes. Therefore the estimator of the variance is the solution of the corresponding equation.
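The weak law of large numbers step invoked here is the standard statement (in our notation)

$$ \frac{1}{N}\sum_{t=1}^{N} \varepsilon_t^{2} \;\xrightarrow{\;P\;}\; \sigma^{2} \qquad \text{as } N \to \infty, $$

which holds because the squared errors are independent and identically distributed with mean $\sigma^{2}$.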

Note that the estimator of the variance exists only with high probability for large sample sizes. Using (12) and substituting the estimates of the frequency and the chirp rate, then using (18) in the resulting equation and substituting into (34), we obtain the estimator of the variance in closed form.
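As a concrete illustration of the estimation procedure described in this section, the following Python sketch implements the approximate least-squares steps under the illustrative parametrization above: a grid search over the frequency and the chirp rate, closed-form amplitude estimates at the maximizer, and a residual-based variance estimate. The function names, the grids, and the model form y_t = A cos(x_t) + B sin(x_t) + error with x_t = omega*t + theta*t^2/N are our own choices for illustration, not a transcription of the article's formulas.

import numpy as np

def chirp_phase(t, omega, theta, N):
    """Phase x_t = omega*t + (theta/N)*t**2 of the frequency-normalized chirp."""
    return omega * t + theta * t ** 2 / N

def estimate_chirp(y, omega_grid, theta_grid):
    """Approximate least-squares estimates of (A, B, omega, theta, sigma^2).

    For each (omega, theta) on the grid the periodogram-type criterion
    |sum_t y_t exp(-i x_t)|^2 / N is evaluated; at the maximizing pair the
    amplitudes are the closed-form projections A_hat = (2/N) sum_t y_t cos(x_t)
    and B_hat = (2/N) sum_t y_t sin(x_t), and the variance is estimated by the
    mean squared residual.
    """
    y = np.asarray(y, dtype=float)
    N = y.size
    t = np.arange(1, N + 1)
    best_crit, omega_hat, theta_hat = -np.inf, None, None
    for omega in omega_grid:
        for theta in theta_grid:
            x = chirp_phase(t, omega, theta, N)
            crit = np.abs(np.sum(y * np.exp(-1j * x))) ** 2 / N
            if crit > best_crit:
                best_crit, omega_hat, theta_hat = crit, omega, theta
    x = chirp_phase(t, omega_hat, theta_hat, N)
    A_hat = 2.0 / N * np.sum(y * np.cos(x))
    B_hat = 2.0 / N * np.sum(y * np.sin(x))
    resid = y - A_hat * np.cos(x) - B_hat * np.sin(x)
    sigma2_hat = float(np.mean(resid ** 2))
    return A_hat, B_hat, omega_hat, theta_hat, sigma2_hat

if __name__ == "__main__":
    # Simulate from the assumed model and recover the parameters informally.
    rng = np.random.default_rng(0)
    N = 500
    t = np.arange(1, N + 1)
    A, B, omega, theta, sigma = 1.0, 0.5, 1.2, 0.3, 1.0
    y = (A * np.cos(chirp_phase(t, omega, theta, N))
         + B * np.sin(chirp_phase(t, omega, theta, N))
         + sigma * rng.standard_normal(N))
    # Fine grids over a plausible neighbourhood; in practice one would use a
    # coarse-to-fine search over the whole admissible region.
    omega_grid = np.linspace(1.1, 1.3, 201)
    theta_grid = np.linspace(0.2, 0.4, 201)
    print(estimate_chirp(y, omega_grid, theta_grid))

Because the criterion is evaluated on a finite grid, the printed estimates are only as accurate as the grid spacing; the consistency results of Section 3 concern the exact maximizer over the compact parameter set.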

3. The Consistency of the Estimators

Now we establish, under certain conditions, the consistency of the estimators of the four signal parameters and of the error variance.
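In the illustrative notation of Section 2, the statement to be proved can be summarized as

$$ \big(\hat A_N, \hat B_N, \hat\omega_N, \hat\theta_N, \hat\sigma^{2}_N\big) \;\xrightarrow{\;P\;}\; \big(A, B, \omega, \theta, \sigma^{2}\big) \qquad \text{as } N \to \infty; $$

Theorem 4 below states this precisely, together with the assumptions on the errors.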

Theorem 4. Let the errors be a sequence of independent random variables with mean zero, common variance, and uniformly bounded moments of a suitable order, and let the true parameter values satisfy the stated restrictions. The estimators of the four signal parameters and of the error variance given in (12), (13), (15), and (17) are consistent. Furthermore, the two stronger convergence statements used in Section 2 also hold.

Proof. We start from (15). Rewriting the relevant sums in terms of complex exponentials instead of sines and cosines, introducing the corresponding notation, and expanding the third term on the right, we obtain the decomposition treated in the two lemmas below.

Lemma 5.

Proof. From (46), by virtue of Theorem 2, we obtain estimates for the first and second terms in (48). Consider the remaining term in (48). Using (44) it can be rewritten, and it follows by virtue of the lemma in [16] that there is a constant for which the required bound holds. In the proof of Theorem 3 [16, p. 65, eq. (1.9)] we showed that if at least one of the two arguments tends to infinity then the corresponding kernel quantity tends to zero; the same conclusion therefore applies here. This, together with (57), and then using (51) and (60) in (48), gives the stated convergence. This completes the proof of Lemma 5.

Lemma 6. Let be defined by where and . Then

Proof. Throughout the proof the stated restrictions on the arguments are assumed. From (46) and Theorem 2 we obtain estimates for the first and second terms in (65). Consider the remaining term. Using (44) it can be rewritten, and it follows by virtue of the lemma in [9, p. 1] that there is a constant for which the required bound holds; a similar bound holds for the companion term. Taking the appropriate limit in (74), we now show that the limiting quantity is zero, arguing by contradiction. Suppose not. Then there is a subsequence along which the quantity stays bounded away from zero; passing to a further subsequence if necessary, it follows from equation (1.9) of [16, p. 65], in the proof of Theorem 3, that the quantity tends to zero along this subsequence, which is a contradiction. Hence, by (76), the claimed limit holds. Substituting (68) and (84) into (65) gives the stated convergence, which completes the proof of Lemma 6.

Now, combining Lemmas 5 and 6, we obtain the combined estimate. Thus