
Journal of Probability and Statistics

Volume 2012 (2012), Article ID 807045, 5 pages

http://dx.doi.org/10.1155/2012/807045

## Improved Estimators of the Mean of a Normal Distribution with a Known Coefficient of Variation

Department of Statistics, Faculty of Science, Khon Kaen University, Khon Kaen 40002, Thailand

Received 1 August 2012; Revised 10 November 2012; Accepted 13 November 2012

Academic Editor: Shein-Chung Chow

Copyright © 2012 Wuttichai Srisodaphol and Noppakun Tongmol. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

#### Abstract

This paper finds estimators of the mean θ of a normal distribution with mean θ and variance c²θ², θ > 0, c > 0, when the coefficient of variation c is known. Mean square error (MSE) is the criterion used to evaluate the estimators. The results show that the proposed estimators are preferable in the asymptotic comparison. Moreover, in simulation studies the estimator based on the jackknife technique is preferable to the other proposed estimators.

#### 1. Introduction

For a population that is normally distributed with mean θ and variance σ², the sample mean X̄ is the unbiased, minimum-variance estimator of θ. When the coefficient of variation c = σ/θ is known (θ > 0), Khan [1] proposed an unbiased estimator formed as a linear combination of X̄ and the sample variance, and showed that its asymptotic variance attains the Cramér–Rao bound.

Arnholt and Hebert [2] improved upon an unbiased estimator T of θ by considering the scaled estimator kT, where the constant k is chosen using the known coefficient of variation. They found that the scaled estimator has smaller mean square error (MSE) than T. They also gave the example T = X̄ and obtained the corresponding optimal constant; the resulting estimator has smaller MSE than X̄.

This paper focuses on improving the estimators of θ when the coefficient of variation is known, with MSE as the criterion for evaluating the estimators. The estimators are proposed using the methods of Khan [1] and Arnholt and Hebert [2]. In addition, the jackknife technique [3] is used to reduce the bias of the estimator, and a Bayes estimator [4] is proposed based on a noninformative prior distribution, namely the Jeffreys prior.

The paper is organized as follows. The improved estimators are proposed in Section 2. Asymptotic comparison and simulation study results are presented in Section 3. Finally, Section 4 contains conclusions.

#### 2. Improved Estimators

Let X₁, X₂, …, Xₙ be independent and normally distributed with mean θ (θ > 0) and variance c²θ², where the coefficient of variation c > 0 is known. Three estimators are proposed as follows.

(1) Let θ̂₁ = kT be the proposed estimator of θ based on Khan [1] and Arnholt and Hebert [2], where T is an unbiased estimator of θ and k is a constant. θ̂₁ is a biased estimator of θ, and its MSE combines the variance of kT with the squared bias (k − 1)²θ². The constant minimizing the MSE of θ̂₁ is obtained by differentiating the MSE with respect to k and solving the first-order condition.
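To make the idea concrete, here is a minimal sketch in which the class kX̄ is assumed (as in Arnholt and Hebert's example with T = X̄); the constant k* = n/(n + c²), the function name, and the parameter values are illustrative stand-ins, not the paper's exact formulas:

```python
import numpy as np

def shrinkage_estimator(x, c):
    """MSE-optimal multiple of the sample mean when the CV c is known.

    Over the class k * xbar with Var(X) = c^2 * theta^2,
    MSE(k) = k^2 * c^2 * theta^2 / n + (k - 1)^2 * theta^2,
    which is minimized at k* = n / (n + c^2) < 1.
    """
    n = len(x)
    return n / (n + c**2) * np.mean(x)

# Monte Carlo check that shrinking the sample mean lowers the MSE.
rng = np.random.default_rng(0)
theta, c, n = 10.0, 0.5, 20
samples = rng.normal(theta, c * theta, size=(20_000, n))
xbar = samples.mean(axis=1)
k_star = n / (n + c**2)
mse_mean = np.mean((xbar - theta) ** 2)            # ~ c^2 theta^2 / n
mse_shrink = np.mean((k_star * xbar - theta) ** 2)  # smaller, despite the bias
```

Under this assumed form, MSE(k*X̄) = c²θ²/(n + c²), which is always below MSE(X̄) = c²θ²/n.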

(2) Let θ̂₂ be the proposed estimator of θ based on the jackknife technique, constructed from θ̂₁ as follows. Let θ̂₁,(i) denote the estimator computed from the sample of size n − 1 obtained by deleting the i-th observation. The jackknife estimator is then θ̂₂ = nθ̂₁ − (n − 1)(1/n)Σᵢ θ̂₁,(i). The MSE of θ̂₂ is examined in the simulation study in Section 3.
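The delete-one jackknife can be sketched as follows; the bias-correction formula is the standard one reviewed by Miller [3], and the estimator being jackknifed (a shrunken sample mean with an assumed known CV) is only an illustration:

```python
import numpy as np

def jackknife(estimator, x):
    """Delete-one jackknife bias correction of `estimator`.

    theta_(i) is the estimate computed with the i-th observation removed;
    the jackknife estimate is n * theta_hat - (n - 1) * mean_i theta_(i).
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    theta_full = estimator(x)
    leave_one_out = [estimator(np.delete(x, i)) for i in range(n)]
    return n * theta_full - (n - 1) * np.mean(leave_one_out)

# Illustration: jackknifing a biased, shrunken mean (c assumed known).
c = 0.5
biased = lambda x: len(x) / (len(x) + c**2) * np.mean(x)
rng = np.random.default_rng(1)
x = rng.normal(10.0, c * 10.0, size=20)
print(biased(x), jackknife(biased, x))
```

For this shrunken mean the jackknifed estimate lands closer to the unbiased sample mean than the original biased estimate does, which is the bias-reduction effect the construction is after.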

(3) The Bayes estimator θ̂₃ is obtained as follows.

The likelihood function of θ given the data is the product of the normal densities, and taking logarithms gives the log-likelihood. The Jeffreys prior distribution is π(θ) ∝ √I(θ), where I(θ) is Fisher's information.

The resulting prior distribution is proportional to 1/θ. The posterior distribution of θ given the data is proportional to the prior times the likelihood, and the Bayes estimator θ̂₃ of θ is the posterior mean. The MSE of θ̂₃ is examined in the simulation study in Section 3.
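Under the assumed model Xᵢ ~ N(θ, c²θ²) with c known, the Jeffreys-prior calculation sketched above works out as follows (a reconstruction from the standard formulas, not the paper's typeset equations):

```latex
\ell(\theta) = -\frac{n}{2}\log\!\left(2\pi c^{2}\theta^{2}\right)
             - \frac{1}{2c^{2}\theta^{2}}\sum_{i=1}^{n}(x_i-\theta)^{2},
\qquad
I(\theta) = n\left(\frac{1}{c^{2}\theta^{2}} + \frac{2}{\theta^{2}}\right)
          = \frac{n(1+2c^{2})}{c^{2}\theta^{2}},
\qquad
\pi(\theta) \propto \sqrt{I(\theta)} \propto \frac{1}{\theta}.
```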

#### 3. Asymptotic Comparison and Simulation Study Results

(1) For the asymptotic comparison, the estimators are compared through the relative efficiency (RE) of their MSEs. The RE of θ̂₁ with respect to X̄ is the ratio of their MSEs, which is less than one: MSE(θ̂₁) is smaller than MSE(X̄).

The RE of θ̂₁ with respect to the unbiased estimator of Khan [1] is obtained in the same way, and it is likewise less than one: MSE(θ̂₁) is smaller than the MSE of Khan's estimator.

Therefore, from (3.1) and (3.2), the proposed estimator θ̂₁ has smaller MSE than both X̄ and the unbiased estimator of Khan [1].
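If the proposed estimator is taken in the assumed shrinkage form θ̂₁ = k*X̄ with k* = n/(n + c²), the relative-efficiency calculation with respect to X̄ works out as follows (a sketch under that assumption, not the paper's typeset equations):

```latex
\operatorname{MSE}(\bar{X}) = \frac{c^{2}\theta^{2}}{n},
\qquad
\operatorname{MSE}(\hat{\theta}_1) = k^{*2}\,\frac{c^{2}\theta^{2}}{n} + (1-k^{*})^{2}\theta^{2}
 = \frac{c^{2}\theta^{2}}{n+c^{2}},
\qquad
\operatorname{RE} = \frac{\operatorname{MSE}(\hat{\theta}_1)}{\operatorname{MSE}(\bar{X})}
 = \frac{n}{n+c^{2}} < 1.
```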

(2) Simulation results are shown comparing the MSEs of the three proposed estimators θ̂₁, θ̂₂, and θ̂₃, with parameter values θ = …, 10, and 15 and c² = …, 0.09, and 0.25, and small sample sizes n = …, 20, and 30. The results are shown in Tables 1, 2, and 3.

From Tables 1–3, the results show that, for small sample sizes n, the estimator θ̂₂ has smaller MSEs than the estimator θ̂₃. We also see that the estimator θ̂₂ has smaller MSEs than the estimator θ̂₁, since θ̂₂ is constructed by using the jackknife technique to reduce the bias of the biased estimator θ̂₁. Therefore, the estimator θ̂₂ is better than the estimators θ̂₁ and θ̂₃ within the considered ranges of θ and c².
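A simulation study of this kind can be sketched as below; the estimator forms (a shrunken mean and its jackknifed version), the sample size, and the parameter values are illustrative assumptions, and the Bayes estimator is omitted because its closed form is not reproduced here:

```python
import numpy as np

def empirical_mse(estimate_fn, theta, c, n, reps=20_000, seed=0):
    """Empirical MSE of an estimator of theta under X ~ N(theta, c^2 theta^2)."""
    rng = np.random.default_rng(seed)
    samples = rng.normal(theta, c * theta, size=(reps, n))
    estimates = np.array([estimate_fn(row) for row in samples])
    return np.mean((estimates - theta) ** 2)

c = 0.3  # assumed known coefficient of variation

def shrunk_mean(x):
    return len(x) / (len(x) + c**2) * np.mean(x)

def jackknifed_shrunk_mean(x):
    n = len(x)
    loo = [shrunk_mean(np.delete(x, i)) for i in range(n)]
    return n * shrunk_mean(x) - (n - 1) * np.mean(loo)

for name, fn in [("mean", np.mean),
                 ("shrunk", shrunk_mean),
                 ("jackknife", jackknifed_shrunk_mean)]:
    print(name, empirical_mse(fn, theta=10.0, c=c, n=10))
```

Because all three estimators are evaluated on the same simulated samples (same seed), the MSE comparison is a paired one and is stable even for modest replication counts.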

#### 4. Conclusions

Three estimators θ̂₁, θ̂₂, and θ̂₃ are proposed. The estimator θ̂₁ is based on the methods of Khan [1] and Arnholt and Hebert [2]; the estimator θ̂₂ is obtained by reducing the bias of θ̂₁ with the jackknife; and the estimator θ̂₃ is a Bayes estimator under the noninformative Jeffreys prior. The estimator θ̂₁ is better than X̄ and Khan's estimator in the asymptotic comparison. Moreover, the estimator θ̂₂ is better than the other proposed estimators in the simulation studies.

#### Acknowledgment

The authors would like to thank the Computational Science Research Group, Faculty of Science, Khon Kaen University, for financial support.

#### References

- R. A. Khan, “A note on estimating the mean of a normal distribution with known coefficient of variation,” *Journal of the American Statistical Association*, vol. 63, pp. 1039–1041, 1968.
- A. T. Arnholt and J. E. Hebert, “Estimating the mean with known coefficient of variation,” *The American Statistician*, vol. 49, pp. 367–369, 1995.
- R. G. Miller, “The jackknife: a review,” *Biometrika*, vol. 61, no. 1, pp. 1–15, 1974.
- G. Casella and R. L. Berger, *Statistical Inference*, Duxbury, 2nd edition, 2002.