Abstract
We propose a cross-validation method suitable for smoothing kernel quantile estimators. In particular, our proposed method selects the bandwidth parameter, which is known to play a crucial role in kernel smoothing, based on unbiased estimation of the mean integrated squared error curve, whose minimiser determines the optimal bandwidth. This method is shown to lead to asymptotically optimal bandwidth choice, and we also provide some general theory on the performance of optimal, data-based methods of bandwidth choice. The numerical performance of the proposed method is compared with an existing plug-in rule in simulations, and the new bandwidth selector is demonstrated to work well.
1. Introduction
The estimation of population quantiles is of great interest when one is not prepared to assume a parametric form for the underlying distribution. In addition, due to their robust nature, quantiles often arise as natural quantities to estimate when the underlying distribution is skewed [1]. Similarly, quantiles often arise in statistical inference as the limits of confidence intervals for an unknown quantity.
Let $X_1, X_2, \ldots, X_n$ be an independent and identically distributed sample drawn from an absolutely continuous distribution function $F$ with density $f$. Further, let $X_{(1)} \le X_{(2)} \le \cdots \le X_{(n)}$ denote the corresponding order statistics. For $0 < p < 1$, the quantile function is defined as $Q(p) = \inf\{x : F(x) \ge p\}$. If $S_n(p)$ denotes the $p$th sample quantile, then $S_n(p) = X_{([np]+1)}$, where $[np]$ denotes the integral part of $np$. Because of the variability of individual order statistics, the sample quantiles suffer from a lack of efficiency. In order to reduce this variability, different approaches to estimating sample quantiles through weighted order statistics have been proposed. A popular class of these estimators is the class of kernel quantile estimators. Parzen [2] proposed a version of the kernel quantile estimator as below:
$$QK_h(p) = \sum_{i=1}^{n}\left[\int_{(i-1)/n}^{i/n}\frac{1}{h}K\!\left(\frac{t-p}{h}\right)dt\right]X_{(i)}. \tag{1.2}$$
From (1.2) one can readily observe that $QK_h(p)$ puts most weight on the order statistics $X_{(i)}$ for which $i/n$ is close to $p$. In practice, the following approximation to $QK_h(p)$ is often used:
$$\widetilde{QK}_h(p) = \sum_{i=1}^{n}\frac{1}{nh}K\!\left(\frac{i/n-p}{h}\right)X_{(i)}. \tag{1.3}$$
Yang [3] proved that $QK_h(p)$ and $\widetilde{QK}_h(p)$ are asymptotically equivalent in terms of mean squared error. Similarly, Falk [4] demonstrated that, from a relative-deficiency perspective, the asymptotic performance of $QK_h(p)$ is better than that of the empirical sample quantile.
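As a concrete illustration, the approximation (1.3) can be coded directly. The sketch below assumes a Gaussian kernel; the function names are illustrative, not taken from the paper.

```python
import math

def gaussian_kernel(u):
    """Standard Gaussian kernel K(u) = exp(-u^2/2) / sqrt(2*pi)."""
    return math.exp(-0.5 * u * u) / math.sqrt(2.0 * math.pi)

def kernel_quantile(data, p, h):
    """Approximate kernel quantile estimator in the form of (1.3):
    a kernel-weighted average of the order statistics, with most weight
    on X_(i) for which i/n is close to p.  Gaussian kernel is an
    assumption; any density kernel K could be substituted."""
    x = sorted(data)  # order statistics X_(1) <= ... <= X_(n)
    n = len(x)
    return sum(
        gaussian_kernel((i / n - p) / h) / (n * h) * x[i - 1]
        for i in range(1, n + 1)
    )
```

For interior $p$ and moderate $h$, the kernel weights sum to approximately one, so the estimate is a smoothed average of order statistics near the sample quantile.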
In this paper, we propose a cross-validation method for selecting the bandwidth of kernel quantile estimators, based on unbiased estimation of the corresponding mean integrated squared error curve, whose minimiser determines the optimal bandwidth. We show that this method leads to an asymptotically optimal bandwidth choice, provide some general theory on the performance of optimal, data-based bandwidth selectors, and demonstrate through simulations that the new bandwidth selector works well.
2. Data-Based Selection of the Bandwidth
The bandwidth plays a critical role in practical kernel estimation. Specifically, the choice of the smoothing parameter determines the tradeoff between the amount of smoothness obtained and the closeness of the estimate to the true distribution [5].
Several data-based methods can be used to find the asymptotically optimal bandwidth for the kernel quantile estimator $\widetilde{QK}_h(p)$ given by (1.3). One of these methods uses derivatives of the quantile density $q(p) = Q'(p)$.
Building on Falk [4], Sheather and Marron [1] gave the MSE of $\widetilde{QK}_h(p)$ as follows. If $K$ is not symmetric, or $K$ is symmetric but $Q''(p) \neq 0$, then
$$\mathrm{MSE}\{\widetilde{QK}_h(p)\} = n^{-1}p(1-p)\,q(p)^{2} - n^{-1}h\,\psi(K)\,q(p)^{2} + \tfrac{1}{4}h^{4}\mu_2(K)^{2}\,q'(p)^{2} + o(n^{-1}h + h^{4}), \tag{2.1}$$
where $q(p) = Q'(p)$, $\psi(K) = 2\int uK(u)\bar K(u)\,du$, $\mu_2(K) = \int u^{2}K(u)\,du$, and $\bar K$ is the antiderivative of $K$.
If $Q''(p) \neq 0$, then the value of $h$ minimising (2.1) is
$$h_{opt} = \left[\alpha(K)\,\beta(p)\right]^{1/3} n^{-1/3}, \tag{2.2}$$
where $\alpha(K) = \psi(K)/\mu_2(K)^{2}$ and $\beta(p) = q(p)^{2}/q'(p)^{2}$.
There is no single optimal bandwidth minimising the MSE when $K$ is symmetric and $Q''(p) = 0$. In that case, higher-order terms are needed, and the resulting expansion of the MSE can be found in Cheng and Sun [6].
In order to obtain $\hat h_{opt}$ we need to estimate $q(p)$ and $q'(p)$. It follows from (1.3) that an estimator of $q(p)$ can be constructed by differentiating the kernel quantile estimator with respect to $p$:
$$\hat q(p) = -\sum_{i=1}^{n}\frac{1}{nh^{2}}K'\!\left(\frac{i/n-p}{h}\right)X_{(i)}. \tag{2.4}$$
Jones [7] derived the asymptotic MSE of $\hat q(p)$ as
$$\mathrm{MSE}\{\hat q(p)\} = \frac{q(p)^{2}R(K)}{nh} + \frac{h^{4}}{4}\mu_2(K)^{2}\,q''(p)^{2} + o\!\left((nh)^{-1}+h^{4}\right), \qquad R(K)=\int K(u)^{2}\,du. \tag{2.5}$$
By minimising (2.5), we obtain the asymptotically optimal bandwidth for $\hat q(p)$:
$$h_{q} = \left[\frac{q(p)^{2}R(K)}{\mu_2(K)^{2}\,q''(p)^{2}}\right]^{1/5} n^{-1/5}. \tag{2.6}$$
To estimate $q'(p)$ in (2.2), we employ the known results for kernel estimation of a first derivative, and it readily follows that
$$h_{q'} = \left[\frac{3\,q(p)^{2}R(K')}{\mu_2(K)^{2}\,q'''(p)^{2}}\right]^{1/7} n^{-1/7}, \tag{2.7}$$
which represents the asymptotically optimal bandwidth for $\hat q'(p)$. By substituting $\hat q(p)$ computed from (2.4) and $\hat q'(p)$ computed with the bandwidth in (2.7), we can compute $\hat h_{opt}$ from (2.2).
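To make the derivative-based plug-in idea concrete, the quantile density $q(p)$ can be estimated by differentiating (1.3) with respect to $p$; with a Gaussian kernel, $K'(u) = -u\,K(u)$. The sketch below illustrates this construction and is not necessarily the paper's exact estimator (2.4).

```python
import math

def gaussian_kernel(u):
    """Standard Gaussian kernel K(u)."""
    return math.exp(-0.5 * u * u) / math.sqrt(2.0 * math.pi)

def quantile_density(data, p, h):
    """Estimate q(p) = Q'(p) by differentiating the approximate kernel
    quantile estimator with respect to p.  For the Gaussian kernel,
    K'(u) = -u*K(u), so each weight contributes u*K(u)/(n*h^2)."""
    x = sorted(data)
    n = len(x)
    total = 0.0
    for i in range(1, n + 1):
        u = (i / n - p) / h
        # d/dp [ K((i/n - p)/h) / (n h) ] = u * K(u) / (n h^2)
        total += u * gaussian_kernel(u) / (n * h * h) * x[i - 1]
    return total
```

For data whose quantile function is roughly linear (e.g. an evenly spaced sample), the estimate should be close to the constant slope of $Q$.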
3. Cross-Validation Bandwidth Selection
When measuring the closeness of an estimated and a true function, the mean integrated squared error (MISE), defined as
$$\mathrm{MISE}(h) = E\int_0^1 \left\{\widetilde{QK}_h(p) - Q(p)\right\}^{2}dp, \tag{3.1}$$
is commonly used as a global measure of performance.
The value $h_0$ which minimises $\mathrm{MISE}(h)$ is the optimal smoothing parameter, and it is unknown in practice. The following is the discrete form of the error criterion approximating (3.1):
$$\frac{1}{n}\sum_{i=1}^{n}\left\{\widetilde{QK}_h(i/n) - Q(i/n)\right\}^{2}.$$
The unknown $Q(i/n)$ is replaced by $X_{(i)}$, and a cross-validatory criterion is created as
$$\mathrm{CV}(h) = \frac{1}{n}\sum_{i=1}^{n}\left\{X_{(i)} - \widetilde{QK}_h^{\,-i}(i/n)\right\}^{2},$$
where $\widetilde{QK}_h^{\,-i}$ denotes the kernel estimator evaluated at observation $i$, but constructed from the data with that observation omitted.
The general approach of cross-validation is to compare each observation with a value predicted by the model based on the remainder of the data. A method for density estimation was proposed by Rudemo [8] and Bowman [9]. This method can be viewed as representing each observation by a Dirac delta function $\delta(x - X_i)$, whose expectation is $f(x)$, and contrasting it with a density estimate based on the remainder of the data. In the context of distribution functions, a natural characterisation of each observation is by the indicator function $I(x \ge X_i)$, whose expectation is $F(x)$. This implies that the kernel estimator of the density can be expressed as
$$\hat f(x) = \frac{1}{nh}\sum_{i=1}^{n}K\!\left(\frac{x-X_i}{h}\right),$$
which reduces to the average of the Dirac delta functions when $h \to 0$.

The kernel estimator of the distribution function is
$$\hat F(x) = \frac{1}{n}\sum_{i=1}^{n}\bar K\!\left(\frac{x-X_i}{h}\right),$$
where $\bar K$ is a distribution function and the bandwidth $h$ controls the degree of smoothing. When $h \to 0$,
$$\hat F(x) \to \frac{1}{n}\sum_{i=1}^{n}I(x \ge X_i),$$
the empirical distribution function, where $I(\cdot)$ is the indicator function.
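A minimal sketch of the kernel distribution function estimator just described, assuming a Gaussian kernel so that the smoothing distribution function $\bar K$ is the standard normal CDF:

```python
import math

def kernel_cdf(data, x, h):
    """Kernel distribution function estimator: the empirical indicator
    I(x >= X_i) is replaced by the smooth step W((x - X_i)/h), where W
    is a distribution function.  Gaussian kernel is assumed, so W is
    the standard normal CDF."""
    def norm_cdf(u):
        return 0.5 * (1.0 + math.erf(u / math.sqrt(2.0)))
    return sum(norm_cdf((x - xi) / h) for xi in data) / len(data)
```

As $h \to 0$ each smooth step sharpens into an indicator, and the estimator reduces to the empirical distribution function.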
Now, from (1.3), $\widetilde{QK}_h(i/n) \to X_{(i)}$ when $h \to 0$, and thus a cross-validation function can be written as
$$\mathrm{CV}(h) = \frac{1}{n}\sum_{i=1}^{n}\left\{X_{(i)} - \widetilde{QK}_h^{\,-i}(i/n)\right\}^{2}.$$
The smoothing parameter is then chosen to minimise this function. By subtracting a term that characterises the performance of the true $Q$, we have
$$\mathrm{CV}(h) - \frac{1}{n}\sum_{i=1}^{n}\left\{X_{(i)} - Q(i/n)\right\}^{2},$$
whose subtracted term does not involve $h$. By expanding the braces and taking expectations, and noting that for large $n$ the $i$th order statistic is asymptotically normally distributed about $Q(i/n)$, we obtain
$$E\{\mathrm{CV}(h)\} = \mathrm{MISE}_{n-1}(h) + \text{terms not depending on } h + o\!\left(n^{-1}h + h^{4}\right), \tag{3.12}$$
where the notation with subscript $n-1$ denotes a kernel estimator based on a sample of size $n-1$. The preceding arguments demonstrate that $\mathrm{CV}(h)$ provides an asymptotically unbiased estimator of the true error curve for a sample of size $n-1$. The identity at (3.12) strongly suggests that cross-validation should perform well.
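The leave-one-out construction above can be sketched as follows; the evaluation points $p_i = i/(n+1)$, the Gaussian kernel, and the pointwise squared-error criterion (in place of the integrated one) are simplifying assumptions of this sketch.

```python
import math

def gaussian_kernel(u):
    return math.exp(-0.5 * u * u) / math.sqrt(2.0 * math.pi)

def kernel_quantile(x_sorted, p, h):
    """Approximate kernel quantile estimator applied to sorted data."""
    n = len(x_sorted)
    return sum(gaussian_kernel((i / n - p) / h) / (n * h) * x_sorted[i - 1]
               for i in range(1, n + 1))

def cv_score(data, h):
    """Leave-one-out cross-validation score: each order statistic is
    predicted by the estimator rebuilt from the remaining data, and
    squared prediction errors are averaged.  p_i = i/(n+1) is an
    assumption of this sketch."""
    x = sorted(data)
    n = len(x)
    score = 0.0
    for i in range(1, n + 1):
        loo = x[:i - 1] + x[i:]  # drop X_(i); remains sorted
        score += (x[i - 1] - kernel_quantile(loo, i / (n + 1), h)) ** 2
    return score / n

def select_bandwidth(data, grid):
    """Pick the grid bandwidth minimising the CV score."""
    return min(grid, key=lambda h: cv_score(data, h))
```

In practice one would minimise over a fine grid or by a one-dimensional numerical optimiser; the grid version keeps the sketch short.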
4. Theoretical Properties
From (3.1), we can write $\mathrm{MISE}(h) = \int_0^1 \mathrm{Bias}^{2}\{\widetilde{QK}_h(p)\}\,dp + \int_0^1 \mathrm{Var}\{\widetilde{QK}_h(p)\}\,dp$.
Sheather and Marron [1] have shown that
$$\mathrm{Bias}\{\widetilde{QK}_h(p)\} = \tfrac{1}{2}h^{2}\mu_2(K)\,q'(p) + o(h^{2}),$$
while Falk [4, page 263] proved that
$$\mathrm{Var}\{\widetilde{QK}_h(p)\} = n^{-1}p(1-p)\,q(p)^{2} - n^{-1}h\,\psi(K)\,q(p)^{2} + o(n^{-1}h).$$
On combining the expressions for bias and variance, for $h \to 0$ and $nh \to \infty$, we can express the mean integrated squared error as
$$\mathrm{MISE}(h) = n^{-1}\!\int_0^1 p(1-p)\,q(p)^{2}\,dp - n^{-1}h\,\psi(K)\!\int_0^1 q(p)^{2}\,dp + \tfrac{1}{4}h^{4}\mu_2(K)^{2}\!\int_0^1 q'(p)^{2}\,dp + o(n^{-1}h + h^{4}).$$
Therefore, the asymptotically optimal bandwidth is
$$h_0 = \left[\alpha(K)\,\gamma\right]^{1/3} n^{-1/3}, \qquad \text{where } \alpha(K)=\frac{\psi(K)}{\mu_2(K)^{2}}, \quad \gamma = \frac{\int_0^1 q(p)^{2}\,dp}{\int_0^1 q'(p)^{2}\,dp}.$$
We can see from (3.12) that $\mathrm{CV}(h)$ may be a good approximation to $\mathrm{MISE}(h)$, or at least to that function evaluated for a sample of size $n-1$ rather than $n$. Moreover, the approximation improves if $\mathrm{CV}(h)$ is adjusted by adding a suitable centring quantity. This quantity has mean zero and does not depend on $h$, which makes it attractive for obtaining a particularly good approximation to $\mathrm{MISE}_{n-1}(h)$.
Theorem 4.1. Suppose that $q$ is bounded on $(0,1)$ and right continuous at the point 0, and that $K$ is a compactly supported density, symmetric about 0. Then, for each $C > 0$, with probability 1, $\mathrm{CV}(h) - \mathrm{MISE}_{n-1}(h)$ is of smaller order than $\mathrm{MISE}_{n-1}(h)$, uniformly in $0 < h \le Cn^{-1/3}$, as $n \to \infty$.
(An outline proof of the above theorem is in the appendix).
From the above theorem, we can conclude that minimisation of $\mathrm{CV}(h)$ produces a bandwidth that is asymptotically equivalent to the bandwidth $h_0$ that minimises $\mathrm{MISE}(h)$.
Corollary 4.2. Suppose that the conditions of the previous theorem hold. If $\hat h$ denotes the bandwidth that minimises $\mathrm{CV}(h)$ in the range $0 < h \le Cn^{-1/3}$, for any $C > 0$, then $\hat h / h_0 \to 1$ with probability 1 as $n \to \infty$.
5. A Simulation Study
A numerical study was conducted to compare the performance of two bandwidth selection methods: the plug-in method presented by Sheather and Marron [1] and our proposed cross-validation method.
In order to account for different distributional shapes in our simulation study, we consider the standard normal, exponential, log-normal(0,1), and double exponential distributions, and we calculate 18 quantiles spread over the unit interval. Throughout the numerical study, the Gaussian kernel was used as the kernel function. Sample sizes of 100, 200, and 500 were used, with 100 simulation replications in each case. The performance of the methods was assessed through the mean squared error (MSE) criterion and the relative efficiency (R.E.) of the two methods.
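The two performance measures can be computed as below; treating relative efficiency as the ratio of the MSE of method 1 to that of method 2 (so that values below one favour method 1) is our assumption about the convention.

```python
def mse(estimates, true_value):
    """Mean squared error of repeated estimates of one true quantile,
    averaged over simulation replications."""
    return sum((e - true_value) ** 2 for e in estimates) / len(estimates)

def relative_efficiency(mse_method1, mse_method2):
    """Relative efficiency of method 1 versus method 2.  The ratio
    convention (method 1 over method 2; values below 1 favour method 1)
    is an assumption of this sketch."""
    return mse_method1 / mse_method2
```

In the study, each MSE is computed per distribution, quantile level, and sample size, and the ratios are then tabulated across the simulation settings.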
Further, for comparison purposes, we refer to our proposed method and that of Sheather and Marron [1] as method 1 and method 2, respectively.
(a) Standard normal distribution (see Table 1 and Figure 1).
(b) Exponential distribution (see Table 2 and Figure 2).
(c) Log-normal distribution (see Table 3 and Figure 3).
(d) Double exponential distribution (see Table 4 and Figure 4).
We can compute and summarise the relative efficiencies of the two methods for all of the previous distributions in Table 5.
From Tables 1, 2, 3, and 4, across all distributions, it can be observed that our method produces lower mean squared errors in 52.3% of cases, slightly outperforming the Sheather-Marron method.
Also, from Table 5, which reports the relative efficiencies, we can see that method 1 is more efficient than method 2 in all cases except certain standard normal and double exponential distribution cases.
We may therefore conclude that, in terms of MISE, our bandwidth selection method is more efficient than the Sheather-Marron method for skewed distributions, but not for symmetric distributions.
6. Conclusion
In this paper we have proposed a cross-validation-based rule for selecting the bandwidth of quantile functions estimated by the kernel procedure. The criterion minimised by our proposed method is shown to be an asymptotically unbiased estimator of the MISE curve, and in order to assess numerical performance, we conducted a simulation study comparing the resulting bandwidth with that proposed by Sheather and Marron [1]. Based on the four distributions considered, the proposed bandwidth selection appears to provide accurate estimates of quantiles, and thus we believe that the new bandwidth selection method is a practically useful way to obtain a bandwidth for the quantile estimator of the form (1.3).
Appendix
The proof of Theorem 4.1 proceeds in ten steps. Steps 1 and 2 introduce a decomposition of the cross-validation criterion together with the associated notation and moment conditions. Step 3 combines Steps 1 and 2, and Step 4 establishes a complementary approximation. Step 5 combines Steps 3 and 4. Steps 6 and 7 record bounds on the remainder terms. Step 8 combines the results of Steps 5, 6, and 7. Step 9 notes two further asymptotic facts, and Step 10 combines Steps 8 and 9, establishing the assertion of the theorem.