Mathematical Problems in Engineering

Volume 2016, Article ID 7946828, 13 pages

http://dx.doi.org/10.1155/2016/7946828

## Parameter Estimation and Joint Confidence Regions for the Parameters of the Generalized Lindley Distribution

Department of Mathematics, Beijing Jiaotong University, Beijing 100044, China

Received 23 October 2015; Accepted 28 December 2015

Academic Editor: Vladimir Turetsky

Copyright © 2016 Wenhao Gui and Man Chen. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

#### Abstract

We deal with the problem of estimating the parameters of the generalized Lindley distribution. Besides the classical estimators, inverse moment and modified inverse moment estimators are proposed and their properties are investigated. A condition for the existence and uniqueness of the inverse moment and modified inverse moment estimates of the parameters is established. Monte Carlo simulations are conducted to compare the estimators' performances. Two methods for constructing joint confidence regions for the two parameters are also proposed and their performances are discussed. A real example is presented to illustrate the proposed methods.

#### 1. Introduction

Lindley [1] originally introduced the Lindley distribution to illustrate a difference between fiducial distribution and posterior distribution. This distribution has become increasingly popular for modeling lifetime data and is widely applicable in survival and reliability analysis, since its survival and hazard functions have closed forms and it offers good flexibility of fit. Its density function is given by

$$f(x;\theta)=\frac{\theta^{2}}{1+\theta}(1+x)e^{-\theta x},\quad x>0,\ \theta>0.\qquad(1)$$

We denote this by writing $X\sim\mathrm{Lindley}(\theta)$. The Lindley distribution is a mixture of an exponential distribution with scale $\theta$ and a gamma distribution with shape $2$ and scale $\theta$, where the mixing proportion is $\theta/(1+\theta)$.

Ghitany et al. [2] provided a comprehensive treatment of the statistical properties of the Lindley distribution. Mazucheli and Achcar [3] used the Lindley distribution as a good alternative to analyze lifetime data within the competing risks approach as compared with the use of standard exponential or even the Weibull distribution commonly used in this area. Krishna and Kumar [4] considered the reliability estimation in Lindley distribution with progressively type II right censored sample. Al-Mutairi et al. [5] dealt with the estimation of the stress-strength parameter when the variables are independent Lindley random variables with different shape parameters.

Some researchers have proposed and studied new classes of distributions based on the Lindley distribution. See, for example, Sankaran [6], Ghitany et al. [7], Bakouch et al. [8], Shanker et al. [9], and Ghitany et al. [10]. In this paper, we focus on the generalized Lindley distribution (GLD) introduced by Nadarajah et al. [11]. It has the attractive feature of allowing for monotonically decreasing, monotonically increasing, and bathtub shaped hazard rate functions, while not allowing for constant hazard rate functions. It has better hazard rate properties than the gamma, lognormal, and Weibull distributions.

The cumulative distribution function and the probability density function are, respectively, given by

$$F(x;\alpha,\theta)=\left[1-\frac{1+\theta+\theta x}{1+\theta}e^{-\theta x}\right]^{\alpha},\qquad(2)$$

$$f(x;\alpha,\theta)=\frac{\alpha\theta^{2}}{1+\theta}(1+x)e^{-\theta x}\left[1-\frac{1+\theta+\theta x}{1+\theta}e^{-\theta x}\right]^{\alpha-1},\quad x>0,\qquad(3)$$

where $\alpha>0$ and $\theta>0$ are two parameters. We denote this distribution as $\mathrm{GLD}(\alpha,\theta)$. When $\alpha=1$, the generalized Lindley distribution reduces to the one-parameter Lindley distribution.
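As a quick numerical sketch of the distribution functions above (the function names and the $(\alpha,\theta)$ parametrization used here are ours), the cdf and pdf can be coded directly:

```python
import numpy as np

def gld_cdf(x, alpha, theta):
    """CDF of GLD(alpha, theta): [1 - (1+theta+theta*x)/(1+theta)*exp(-theta*x)]**alpha."""
    x = np.asarray(x, dtype=float)
    inner = 1.0 - (1.0 + theta + theta * x) / (1.0 + theta) * np.exp(-theta * x)
    return inner ** alpha

def gld_pdf(x, alpha, theta):
    """PDF of GLD(alpha, theta), obtained by differentiating the CDF above."""
    x = np.asarray(x, dtype=float)
    inner = 1.0 - (1.0 + theta + theta * x) / (1.0 + theta) * np.exp(-theta * x)
    return (alpha * theta**2 / (1.0 + theta)) * (1.0 + x) * np.exp(-theta * x) * inner ** (alpha - 1.0)
```

With $\alpha=1$ the pdf reduces to the Lindley density $\theta^{2}(1+x)e^{-\theta x}/(1+\theta)$.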

Singh et al. [12] developed the Bayesian estimation for the generalized Lindley distribution under squared error and general entropy loss functions in the case of a complete sample of observations. Singh et al. [13] considered the generalized Lindley distribution under a progressive type II censoring scheme that allows the removal of live units from a life-test, with a beta-binomial probability law, during the execution of the experiment.

Nadarajah et al. [11] considered the classical maximum-likelihood estimation of the parameters of the generalized Lindley distribution. The results showed that the bias is not satisfactory, especially for small or even moderate sample sizes. As for the moment estimates, two nonlinear equations need to be solved simultaneously, and the existence and uniqueness of the roots are neither clear nor guaranteed.

In this paper, we consider the problem of estimating the two parameters of the generalized Lindley distribution. We propose inverse moment and modified inverse moment estimators and study their properties. The conditions of the existence and uniqueness of the estimators are established. Monte Carlo simulations are used to compare the performances of the estimators. We also investigate the methods for constructing joint confidence regions for the two parameters and study their performances.

The rest of this paper is organized as follows. In Section 2, we briefly review the classical maximum-likelihood estimation of the parameters of the generalized Lindley distribution. In Section 3, the moment estimator is discussed. In Section 4, we propose two new methods of estimating the parameters and study their properties. Joint confidence regions for the two parameters are proposed in Section 5. Section 6 conducts simulations to assess the methods. Finally, in Section 7, a real example is presented to illustrate the proposed methods.

#### 2. Maximum-Likelihood Estimation

In this section, we briefly review the MLEs of the parameters of the GLD distribution. Let $X_1,\dots,X_n$ be a random sample from $\mathrm{GLD}(\alpha,\theta)$ with pdf and cdf as (3) and (2), respectively. Writing $u_i(\theta)=1-\bigl(1+\theta+\theta x_i\bigr)e^{-\theta x_i}/(1+\theta)$, the log-likelihood function is given by

$$\ell(\alpha,\theta)=n\ln\alpha+2n\ln\theta-n\ln(1+\theta)+\sum_{i=1}^{n}\ln(1+x_i)-\theta\sum_{i=1}^{n}x_i+(\alpha-1)\sum_{i=1}^{n}\ln u_i(\theta).\qquad(4)$$

The score equations are thus as follows:

$$\frac{\partial\ell}{\partial\theta}=\frac{2n}{\theta}-\frac{n}{1+\theta}-\sum_{i=1}^{n}x_i+(\alpha-1)\sum_{i=1}^{n}\frac{\theta x_i\,[\theta+2+(1+\theta)x_i]\,e^{-\theta x_i}}{(1+\theta)^{2}\,u_i(\theta)}=0,\qquad(5)$$

$$\frac{\partial\ell}{\partial\alpha}=\frac{n}{\alpha}+\sum_{i=1}^{n}\ln u_i(\theta)=0.\qquad(6)$$

From (6) we obtain the MLE of $\alpha$ as a function of $\theta$:

$$\hat{\alpha}(\theta)=-\frac{n}{\sum_{i=1}^{n}\ln u_i(\theta)}.\qquad(7)$$

The MLE $\hat{\theta}$ of $\theta$ is the root of the following equation, obtained by substituting (7) into (5):

$$\frac{2n}{\theta}-\frac{n}{1+\theta}-\sum_{i=1}^{n}x_i+\bigl(\hat{\alpha}(\theta)-1\bigr)\sum_{i=1}^{n}\frac{\theta x_i\,[\theta+2+(1+\theta)x_i]\,e^{-\theta x_i}}{(1+\theta)^{2}\,u_i(\theta)}=0,\qquad(8)$$

where $u_i(\theta)=1-\bigl(1+\theta+\theta x_i\bigr)e^{-\theta x_i}/(1+\theta)$.

This nonlinear equation has no closed-form solution. A numerical method such as Newton-Raphson can be applied to compute $\hat{\theta}$, after which $\hat{\alpha}$ follows from (7).
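The route just described can be sketched numerically as follows. This is our own hedged implementation, not the paper's exact algorithm: $\alpha$ is profiled out via its closed form and the resulting one-dimensional log-likelihood is maximized directly (equivalent to solving the score equation in $\theta$); the helper `sample_gld` generates test data by numerically inverting the cdf; all names are ours.

```python
import numpy as np
from scipy.optimize import brentq, minimize_scalar

def inner_cdf(x, theta):
    # the quantity inside the brackets of the GLD cdf
    return 1.0 - (1.0 + theta + theta * x) / (1.0 + theta) * np.exp(-theta * x)

def sample_gld(n, alpha, theta, rng):
    # invert F(x) = inner_cdf(x)**alpha numerically (helper for the demo only)
    u = rng.uniform(size=n) ** (1.0 / alpha)
    return np.array([brentq(lambda x, ui=ui: inner_cdf(x, theta) - ui, 1e-12, 1e4)
                     for ui in u])

def profile_loglik(theta, x):
    # log-likelihood with alpha replaced by its conditional MLE -n / sum(log inner)
    n = len(x)
    s = np.log(inner_cdf(x, theta)).sum()          # strictly negative
    a = -n / s
    return (n * np.log(a) + 2 * n * np.log(theta) - n * np.log(1.0 + theta)
            + np.log1p(x).sum() - theta * x.sum() + (a - 1.0) * s)

def gld_mle(x):
    # maximize the profile log-likelihood over theta, then recover alpha
    res = minimize_scalar(lambda t: -profile_loglik(t, x),
                          bounds=(1e-2, 20.0), method="bounded")
    theta = res.x
    alpha = -len(x) / np.log(inner_cdf(x, theta)).sum()
    return alpha, theta
```

On simulated data `gld_mle` returns estimates $(\hat\alpha,\hat\theta)$ that approach the true values as $n$ grows.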

#### 3. Moment Estimation of Parameters

In this section, we discuss the moment estimation (MOM) of the parameters of the GLD distribution. Let $X_1,\dots,X_n$ be a random sample from $\mathrm{GLD}(\alpha,\theta)$ with pdf and cdf as (3) and (2), respectively, and let $x_1,\dots,x_n$ be the observed values. Let $m_1=\frac{1}{n}\sum_{i=1}^{n}x_i$ and $m_2=\frac{1}{n}\sum_{i=1}^{n}x_i^{2}$ denote the first two sample moments. For the population moments, we need the following lemma (Nadarajah et al. [11]).

Lemma 1. *Let $X\sim\mathrm{GLD}(\alpha,\theta)$. One has closed-form expressions for the population moments $E(X)$ and $E(X^{2})$; see Nadarajah et al. [11] for the explicit formulas.*

Let $X$ denote a $\mathrm{GLD}(\alpha,\theta)$ random variable. By equating the population moments with the sample moments, we obtain the moment equations

$$E(X)=m_1,\qquad(11)$$
$$E(X^{2})=m_2.\qquad(12)$$

The method of moments estimators $\hat{\alpha}_{\mathrm{MOM}}$ and $\hat{\theta}_{\mathrm{MOM}}$ are the roots of these two equations. As with the MLEs, such nonlinear equations have no closed-form solutions, and a numerical method such as Newton-Raphson can be applied to determine the roots.
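Since the closed-form moment expressions of Lemma 1 are somewhat involved, a practical sketch is to evaluate the population moments by numerical quadrature and solve the two moment equations with a generic root-finder. This is our own stand-in, not the paper's method; all names are ours.

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import fsolve

def gld_pdf(x, alpha, theta):
    inner = 1.0 - (1.0 + theta + theta * x) / (1.0 + theta) * np.exp(-theta * x)
    return alpha * theta**2 / (1.0 + theta) * (1.0 + x) * np.exp(-theta * x) * inner ** (alpha - 1.0)

def gld_moment(r, alpha, theta):
    # population moment E(X^r) via quadrature (numerical stand-in for Lemma 1)
    val, _ = quad(lambda x: x**r * gld_pdf(x, alpha, theta), 0.0, np.inf)
    return val

def gld_mom(x, start=(1.0, 1.0)):
    # equate E(X) and E(X^2) to the sample moments and solve the two equations
    m1, m2 = x.mean(), (x**2).mean()
    eqs = lambda p: (gld_moment(1, p[0], p[1]) - m1,
                     gld_moment(2, p[0], p[1]) - m2)
    return fsolve(eqs, start)
```

A sanity check: for $\alpha=1$ the quadrature must reproduce the known Lindley moments $E(X)=(\theta+2)/(\theta(\theta+1))$ and $E(X^{2})=2(\theta+3)/(\theta^{2}(\theta+1))$.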

#### 4. Inverse Moment Estimation of Parameters

Unlike the regular method of moments, the idea of the inverse moment estimation (IME) is as follows: for a given random sample $X_1,\dots,X_n$ from a distribution with unknown parameters, first transform the original sample to a quasisample $Y_1,\dots,Y_n$, where each $Y_i$ contains the unknown parameters but its distribution does not depend on them; that is, each $Y_i$ is a pivotal variable. The population moments of the quasisample do not depend on the unknown parameters, while the sample moments do. Setting the population moments of the quasisample equal to its sample moments and solving yields the estimators of the unknown parameters.

Let $X_1,\dots,X_n$ form a sample from $\mathrm{GLD}(\alpha,\theta)$ with pdf given in (3). It is known that $F(X_i;\alpha,\theta)$, $i=1,\dots,n$, follow the uniform distribution $U(0,1)$, and thus $-\ln F(X_i;\alpha,\theta)=-\alpha\ln u_i(\theta)$, $i=1,\dots,n$, follow the standard exponential distribution $\mathrm{Exp}(1)$, where $u_i(\theta)=1-\bigl(1+\theta+\theta X_i\bigr)e^{-\theta X_i}/(1+\theta)$. By the method of inverse moment estimation, we equate the sample mean of these pivots to the population mean $1$; that is,

$$-\frac{\alpha}{n}\sum_{i=1}^{n}\ln u_i(\theta)=1.\qquad(14)$$

Thus, the IME of $\alpha$ is obtained as a function of $\theta$,

$$\hat{\alpha}(\theta)=-\frac{n}{\sum_{i=1}^{n}\ln u_i(\theta)},\qquad(15)$$

which is identical to the MLE of $\alpha$. In the following, we determine the IME of $\theta$.
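The probability-integral-transform pivot behind this step can be checked empirically. In this sketch (names ours; data generated by numerically inverting the cdf), the transformed values should behave like a standard exponential sample, and the resulting estimate of $\alpha$ evaluated at the true $\theta$ should be close to the true $\alpha$:

```python
import numpy as np
from scipy.optimize import brentq

def inner_cdf(x, theta):
    return 1.0 - (1.0 + theta + theta * x) / (1.0 + theta) * np.exp(-theta * x)

def sample_gld(n, alpha, theta, rng):
    # numerical cdf inversion, for the demo only
    u = rng.uniform(size=n) ** (1.0 / alpha)
    return np.array([brentq(lambda x, ui=ui: inner_cdf(x, theta) - ui, 1e-12, 1e4)
                     for ui in u])

rng = np.random.default_rng(0)
alpha, theta = 2.0, 1.5
x = sample_gld(10000, alpha, theta, rng)
pivot = -alpha * np.log(inner_cdf(x, theta))             # should behave like Exp(1)
alpha_hat = -len(x) / np.log(inner_cdf(x, theta)).sum()  # IME of alpha at the true theta
```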

Lemma 2. *Let $Y_{(1)}\le Y_{(2)}\le\cdots\le Y_{(n)}$ be the order statistics from the standard exponential distribution. Then, the random variables $Z_1,\dots,Z_n$, where

$$Z_i=(n-i+1)\bigl(Y_{(i)}-Y_{(i-1)}\bigr),\quad i=1,\dots,n,$$

with $Y_{(0)}=0$, are independent and follow standard exponential distributions.*

*Proof. *The proof can be found in Arnold et al. [14].

Lemma 3. *Let $Z_1,\dots,Z_n$ be iid standard exponential variables, $S_i=\sum_{j=1}^{i}Z_j$, $U_i=(S_i/S_{i+1})^{i}$, $i=1,\dots,n-1$; then* (1) *$U_1,\dots,U_{n-1}$ are independent;* (2) *$U_1,\dots,U_{n-1}$ follow the uniform distribution $U(0,1)$;* (3) *$-2\sum_{i=1}^{n-1}\ln U_i$ follows $\chi^{2}(2(n-1))$.*

*Proof. *The proof can be found in Wang [15].

For the sample $X_1,\dots,X_n$ from $\mathrm{GLD}(\alpha,\theta)$, considering the order statistics $X_{(1)}\le\cdots\le X_{(n)}$, we have that $-\ln F(X_{(n)};\alpha,\theta)\le\cdots\le-\ln F(X_{(1)};\alpha,\theta)$ are the $n$ order statistics from the standard exponential distribution.

Let $T_i=-\ln F(X_{(n-i+1)};\alpha,\theta)$, $i=1,\dots,n$, where $T_0=0$. Thus, $T_1\le\cdots\le T_n$ are the $n$ order statistics from the standard exponential distribution. By Lemma 2, $Z_i=(n-i+1)(T_i-T_{i-1})$, $i=1,\dots,n$, form a sample from the standard exponential distribution.

Let $S_i=\sum_{j=1}^{i}Z_j$, $U_i=(S_i/S_{i+1})^{i}$, $i=1,\dots,n-1$, and $V(\theta)=-2\sum_{i=1}^{n-1}\ln U_i$; by Lemma 3, we have

$$V(\theta)=2\sum_{i=1}^{n-1}i\ln\frac{S_{i+1}}{S_i}\sim\chi^{2}\bigl(2(n-1)\bigr),$$

where the ratios $S_i/S_{i+1}$ are scale invariant, so the unknown $\alpha$ cancels and $V(\theta)$ depends on the unknown parameters only through $\theta$. Note that the mean of $\chi^{2}(2(n-1))$ is $2(n-1)$. Thus, we obtain an inverse moment equation for $\theta$ as follows:

$$V(\theta)=2(n-1).\qquad(20)$$

Solving this equation, we obtain the inverse moment estimate $\hat{\theta}_{\mathrm{IME}}$ of $\theta$. Plugging $\hat{\theta}_{\mathrm{IME}}$ into (15), we obtain the inverse moment estimate $\hat{\alpha}_{\mathrm{IME}}$. In addition, considering that the mode of $\chi^{2}(2(n-1))$ is $2(n-1)-2=2n-4$, we can obtain a modified equation for $\theta$:

$$V(\theta)=2n-4.\qquad(21)$$

Solving this equation, we obtain the modified inverse moment estimate $\hat{\theta}_{\mathrm{MIME}}$ of $\theta$. Plugging $\hat{\theta}_{\mathrm{MIME}}$ into (15), we obtain the modified inverse moment estimate $\hat{\alpha}_{\mathrm{MIME}}$. Unlike the moment estimation, here we only need to solve one nonlinear equation instead of two.
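The whole construction can be sketched numerically as follows. This is our own hedged implementation (names ours): the sample is generated by inverting the cdf, the spacings-based pivotal statistic is evaluated as a function of $\theta$, and the roots of the two inverse moment equations are found by bracketing on a log grid.

```python
import numpy as np
from scipy.optimize import brentq

def inner_cdf(x, theta):
    return 1.0 - (1.0 + theta + theta * x) / (1.0 + theta) * np.exp(-theta * x)

def sample_gld(n, alpha, theta, rng):
    u = rng.uniform(size=n) ** (1.0 / alpha)
    return np.array([brentq(lambda x, ui=ui: inner_cdf(x, theta) - ui, 1e-12, 1e4)
                     for ui in u])

def pivot_V(theta, x):
    # scale-free spacings statistic; ~ chi^2(2(n-1)) at the true theta
    n = len(x)
    t = np.sort(-np.log1p(-(1.0 + theta + theta * x) / (1.0 + theta) * np.exp(-theta * x)))
    z = (n - np.arange(n)) * np.diff(np.concatenate(([0.0], t)))  # normalized spacings
    s = np.cumsum(z)
    i = np.arange(1, n)
    return 2.0 * np.sum(i * np.log(s[1:] / s[:-1]))

def solve_theta(x, target):
    # bracket the root of V(theta) = target on a log grid, then refine with brentq
    grid = np.logspace(-2, 1, 50)
    vals = np.array([pivot_V(g, x) - target for g in grid])
    j = int(np.where(np.sign(vals[:-1]) != np.sign(vals[1:]))[0][0])
    return brentq(lambda th: pivot_V(th, x) - target, grid[j], grid[j + 1])

rng = np.random.default_rng(7)
x = sample_gld(2000, 2.0, 1.0, rng)
n = len(x)
theta_ime = solve_theta(x, 2.0 * (n - 1))    # inverse moment equation: chi^2 mean
theta_mime = solve_theta(x, 2.0 * n - 4.0)   # modified equation: chi^2 mode
alpha_ime = -n / np.log(inner_cdf(x, theta_ime)).sum()  # plug back in for alpha
```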

In the following, we prove the existence and uniqueness of the roots of (20) and (21).

Lemma 4. *The following limits hold:*

(1) *One has* …

(2) *One has* …

(3) *One has* …

*Lemma 5. For , is a decreasing function of .*

Theorem 6. *Let $Z_1,\dots,Z_n$ form a sample from the standard exponential distribution; then, for the stated range of the right-hand side, the equation has a unique positive solution.*

*Proof. *By Lemma 4, the limits of the pivotal function at the two endpoints lie on opposite sides of the target value; therefore the equation has at least one positive solution. For the uniqueness of the solution, we consider the derivative with respect to $\theta$. By Cauchy's mean-value theorem and Lemma 5, the pivotal function is strictly increasing in $\theta$, and hence the equation has a unique positive solution.

#### 5. Joint Confidence Regions for $\alpha$ and $\theta$

Let $X_1,\dots,X_n$ form a sample from $\mathrm{GLD}(\alpha,\theta)$, and let $X_{(1)}\le\cdots\le X_{(n)}$ be the order statistics. Let $T_i=-\ln F(X_{(n-i+1)};\alpha,\theta)$, $i=1,\dots,n$. Thus, $T_1\le\cdots\le T_n$ are the $n$ order statistics from the standard exponential distribution. By Lemma 2, $Z_i=(n-i+1)(T_i-T_{i-1})$, $i=1,\dots,n$ (with $T_0=0$), form a sample from the standard exponential distribution. Let $S_i=\sum_{j=1}^{i}Z_j$, $U_i=(S_i/S_{i+1})^{i}$, and $V=-2\sum_{i=1}^{n-1}\ln U_i$. Hence $V\sim\chi^{2}(2(n-1))$. It is obvious that the $U_i$ and $S_n$ are independent. Define $W=2S_n$; then $W\sim\chi^{2}(2n)$. We obtain that $V$ and $W$ are independent using the well-known bank-post office story in statistics.

Let $F_{\gamma}(k_1,k_2)$ denote the percentile of the $F$ distribution with left-tail probability $\gamma$ and $k_1$ and $k_2$ degrees of freedom. Let $\chi^{2}_{\gamma}(k)$ denote the percentile of the $\chi^{2}$ distribution with left-tail probability $\gamma$ and $k$ degrees of freedom.

By using the pivotal variables, a joint confidence region for the two parameters $\alpha$ and $\theta$ can be constructed as follows.

Theorem 7 (method 1). *Let $X_1,\dots,X_n$ form a sample from $\mathrm{GLD}(\alpha,\theta)$; then, based on the two independent pivotal variables, a joint confidence region for the two parameters is determined by the corresponding pair of inequalities, where the lower and upper bounds for $\theta$ are the roots in $\theta$ of the associated percentile equations.*

*Proof. *The first pivotal quantity is a function of $\theta$ only and does not depend on $\alpha$. From Theorem 6, for any admissible value of the right-hand side, the corresponding equation has a unique positive root in $\theta$, which gives the stated bounds; the confidence level follows from the independence of the two pivots.

On the other hand, by Lemma 3, the two $\chi^{2}$ pivotal variables are also independent. By using these pivotal variables, a second joint confidence region for the two parameters $\alpha$ and $\theta$ can be constructed as follows.

Theorem 8 (method 2). *Let $X_1,\dots,X_n$ form a sample from $\mathrm{GLD}(\alpha,\theta)$; then, based on the two independent $\chi^{2}$ pivotal variables, a joint confidence region for the two parameters is determined by the corresponding pair of inequalities, where the lower and upper bounds for $\theta$ are the roots in $\theta$ of the associated percentile equations.*

*Proof. *The first pivotal quantity is a function of $\theta$ only and does not depend on $\alpha$. From Theorem 6, for any admissible value of the right-hand side, the corresponding equation has a unique positive root in $\theta$, which gives the stated bounds.
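A numerical sketch of such a region can be built from the two pivots described above. This is our own hedged implementation, not the paper's exact inequalities: we assume an equal splitting of the overall confidence level between the two independent pivots, and all function names are ours.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import chi2

def inner_cdf(x, theta):
    return 1.0 - (1.0 + theta + theta * x) / (1.0 + theta) * np.exp(-theta * x)

def sample_gld(n, alpha, theta, rng):
    u = rng.uniform(size=n) ** (1.0 / alpha)
    return np.array([brentq(lambda x, ui=ui: inner_cdf(x, theta) - ui, 1e-12, 1e4)
                     for ui in u])

def pivot_V(theta, x):
    # scale-free spacings pivot, ~ chi^2(2(n-1)) at the true theta
    n = len(x)
    t = np.sort(-np.log1p(-(1.0 + theta + theta * x) / (1.0 + theta) * np.exp(-theta * x)))
    z = (n - np.arange(n)) * np.diff(np.concatenate(([0.0], t)))
    s = np.cumsum(z)
    i = np.arange(1, n)
    return 2.0 * np.sum(i * np.log(s[1:] / s[:-1]))

def joint_region(x, gamma=0.05):
    # assumed splitting: each pivot gets confidence sqrt(1 - gamma)
    n = len(x)
    c = np.sqrt(1.0 - gamma)
    lo, hi = (1.0 - c) / 2.0, (1.0 + c) / 2.0
    qlo, qhi = chi2.ppf(lo, 2 * (n - 1)), chi2.ppf(hi, 2 * (n - 1))
    grid = np.logspace(-2, 1, 50)

    def root(target):
        vals = np.array([pivot_V(g, x) - target for g in grid])
        j = int(np.where(np.sign(vals[:-1]) != np.sign(vals[1:]))[0][0])
        return brentq(lambda th: pivot_V(th, x) - target, grid[j], grid[j + 1])

    th_lo, th_hi = root(qlo), root(qhi)   # V(theta) is increasing in theta

    def alpha_band(theta):
        # given theta, alpha bounds from 2*alpha*sum(-log inner) ~ chi^2(2n)
        ssum = -np.log(inner_cdf(x, theta)).sum()
        return chi2.ppf(lo, 2 * n) / (2.0 * ssum), chi2.ppf(hi, 2 * n) / (2.0 * ssum)

    return (th_lo, th_hi), alpha_band
```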

#### 6. Simulation Study

#### 6.1. Comparison of the Four Estimation Methods

In this section, we conduct simulations to compare the performances of the MIMEs, IMEs, MLEs, and MOMs, mainly with respect to their biases and mean squared errors (MSEs), for various sample sizes and various true parameter values.

Random data from the $\mathrm{GLD}(\alpha,\theta)$ distribution can be generated as follows:

$$X=-1-\frac{1}{\theta}-\frac{1}{\theta}W_{-1}\!\left((1+\theta)\bigl(U^{1/\alpha}-1\bigr)e^{-(1+\theta)}\right),$$

where $U$ follows the uniform distribution over $(0,1)$ and $W_{-1}$ denotes the negative branch of the Lambert $W$ function, the solution of $we^{w}=z$ with $w\le-1$; see Jodrá [16].
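Assuming the Lambert $W$ inversion after Jodrá [16] in the parametrization used in this paper, a sampler can be sketched as follows (function names ours):

```python
import numpy as np
from scipy.special import lambertw

def gld_quantile(u, alpha, theta):
    # X = -1 - 1/theta - W_{-1}((1+theta)(u^{1/alpha} - 1) e^{-(1+theta)}) / theta
    u = np.asarray(u, dtype=float)
    arg = (1.0 + theta) * (u ** (1.0 / alpha) - 1.0) * np.exp(-(1.0 + theta))
    return -1.0 - 1.0 / theta - np.real(lambertw(arg, k=-1)) / theta

def sample_gld(n, alpha, theta, rng):
    # inverse-transform sampling via the quantile function
    return gld_quantile(rng.uniform(size=n), alpha, theta)
```

The argument of $W_{-1}$ lies in $(-1/e,0)$ because $(1+\theta)e^{-(1+\theta)}<1/e$ for $\theta>0$, so the branch is well defined and the result is real.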

We obtain $\hat{\theta}_{\mathrm{MLE}}$ by solving (8) and then $\hat{\alpha}_{\mathrm{MLE}}$ by (7). $\hat{\alpha}_{\mathrm{MOM}}$ and $\hat{\theta}_{\mathrm{MOM}}$ can be obtained by solving (11) and (12) simultaneously. $\hat{\theta}_{\mathrm{IME}}$ and $\hat{\theta}_{\mathrm{MIME}}$ can be obtained by solving (20) and (21), respectively, and the corresponding $\hat{\alpha}_{\mathrm{IME}}$ and $\hat{\alpha}_{\mathrm{MIME}}$ then follow from (15).

We consider various sample sizes and various true parameter values in all our computations. For each combination of sample size and parameter values, we generate a sample of size $n$ from $\mathrm{GLD}(\alpha,\theta)$ and estimate the parameters $\alpha$ and $\theta$ by the MLE, MOM, IME, and MIME methods. The average values of the estimates, as well as the corresponding MSEs over 1000 replications, are computed and reported.

Table 1 reports the average values of the estimates, and the corresponding MSEs are reported within parentheses. Figures 1(a), 1(b), 1(c), and 1(d) show the relative biases and the MSEs of the four estimators for the considered sample sizes, and Figures 1(e) and 1(f) show the relative biases and the MSEs for the remaining setting. The other cases are similar.