Advances in Statistics

Volume 2014, Article ID 198696, 13 pages

http://dx.doi.org/10.1155/2014/198696

## Designing Bayesian Sampling Plans with Adaptive Progressive Hybrid Censored Samples

TaChen Liang

Wayne State University, Detroit, MI 48202, USA

Received 2 May 2014; Accepted 29 September 2014; Published 16 November 2014

Academic Editor: Chin-Shang Li

Copyright © 2014 TaChen Liang. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

#### Abstract

This paper studies acceptance sampling for exponential distributions under type-I and type-II adaptive progressive hybrid censored samples. Algorithms are proposed for deriving Bayesian sampling plans. We compare the performance of the proposed sampling plans with those of Lin and Huang (2012); the numerical results indicate that the proposed plans outperform theirs.

#### 1. Introduction

Suppose a batch of lifetime components is presented for acceptance sampling. The lifetimes of these units are assumed to be mutually independent and to follow a common exponential distribution, whose parameter in turn follows a gamma prior distribution. In the life test experiment, identical units are sampled from the batch and placed on life test without replacement under a suitable sampling scheme. At the end of the experiment, the duration of the experiment and the number of failures among the items put on life test are recorded, and an action regarding the acceptance sampling is taken: either the batch is accepted or it is rejected. The loss function involves the cost per unit inspected, the loss of rejecting the batch, and the loss of accepting the batch. In many situations, the cost of the time used for the life test is an essential issue and should be included in the loss function, so a cost per unit time of experimentation is also charged. Moreover, when the life test terminates, the unfailed components can be reused and thus have a salvage value smaller than the inspection cost. Accordingly, many researchers, including Chen et al. [1], Liang and Yang [2], and Lin and Huang [3], have considered a loss function for acceptance sampling that combines these components; this loss is given in (1).
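Since the symbolic form of the loss (1) is not reproduced here, the following Python sketch only illustrates the general structure described above; all names (`n`, `tau`, `m`, `C_s`, `r_s`, `C_tau`, and the two decision losses) are illustrative placeholders rather than the paper's notation, and the decision losses are taken as constants for simplicity.

```python
def total_loss(n, tau, m, accept, C_s, r_s, C_tau, loss_accept, loss_reject):
    """Hedged sketch of a loss of the general form described above.

    n       : number of units placed on test
    tau     : duration of the life test
    m       : number of observed failures (n - m units survive and are salvaged)
    accept  : True to accept the batch, False to reject it
    C_s     : cost per unit inspected
    r_s     : salvage value per surviving unit (0 <= r_s < C_s)
    C_tau   : cost per unit time of experimentation
    loss_accept, loss_reject : decision losses (constants here; in the paper
        they depend on the unknown parameter)
    """
    sampling_cost = n * C_s - (n - m) * r_s   # inspection cost minus salvage
    time_cost = C_tau * tau                   # cost of the test duration
    decision_loss = loss_accept if accept else loss_reject
    return sampling_cost + time_cost + decision_loss
```

The three terms mirror the three components named in the text: sampling cost net of salvage, time cost, and decision loss.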

Lin and Huang [3] studied acceptance sampling for exponential distributions with this loss, based on adaptive type-I and type-II progressive hybrid censored samples. For adaptive progressive hybrid censoring (APHC), a positive censoring time and a progressive censoring scheme must be specified before the life test experiment begins. The observable variables are obtained either by type-I APHC or by type-II APHC. A decision function is a function defined on the sample space of the observable variables that gives the probability of accepting the batch when a particular value is observed. The joint determination of the design parameters and the decision function is called a sampling plan.

With the loss of (1) and the prior distribution, the Bayes risk associated with a sampling plan is the expectation of the loss taken over both the observable variables and the prior; this Bayes risk is given in (2).

A sampling plan is said to be a Bayesian sampling plan if it minimizes the Bayes risk among all sampling plans. Lin and Huang [3] claimed to have derived the Bayesian sampling plan for each type of APHC.

However, the reported sampling plans are not Bayesian sampling plans, since their associated Bayes risks are much larger than the minimum Bayes risk.

The goal of this paper is to find the Bayesian sampling plan for acceptance sampling with this loss. The paper is organized as follows. Section 2 derives the Bayesian sampling plan with type-I APHC samples: for given design parameters, a Bayes decision function is derived, and Algorithm A is then proposed for deriving the Bayesian sampling plan. Section 3 carries out the parallel derivation with type-II APHC samples, where Algorithm B is proposed. In Section 4, we compare the performance of the proposed sampling plans with the sampling plans of Lin and Huang [3]; the numerical results indicate that the proposed plans perform better.

#### 2. Derivation of the Bayesian Sampling Plan BSP₁


For type-I APHC, the experiment proceeds as follows. Each time one of the prescribed failures is observed, immediately after the failure the number of functioning items specified by the progressive censoring scheme is randomly withdrawn from the life test. If the prescribed number of failures is not reached before the censoring time, the experiment terminates at the censoring time. If the prescribed number of failures is reached before the censoring time, the prespecified censoring scheme is no longer followed: no further items are withdrawn, and failures continue to be observed up to the censoring time. In either case, the number of failures observed before the censoring time, together with the ordered failure times, constitutes the observable data, and the duration of the life test experiment is recorded. The observable variables have a joint probability density function of the form given in (3).

##### 2.1. A Bayes Decision Function

Let the prior distribution have a probability density. From the joint density of the observable variables, one obtains the marginal joint probability density of the data, the posterior probability density of the parameter, and the posterior expectation of the parameter given the observed data. For each observed data value, a decision function is defined by comparing the posterior expected losses of acceptance and rejection: the batch is accepted whenever acceptance has the smaller posterior expected loss.
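The gamma prior is conjugate to the exponential likelihood, so the posterior quantities above have closed forms. The sketch below assumes the common rate parameterization, a Gamma(a, b) prior on the failure rate with m observed failures and total time on test ttt, which yields a Gamma(a + m, b + ttt) posterior; the threshold in the decision rule is a placeholder for the cutoff determined by the losses, not a value from the paper.

```python
def posterior_mean_rate(a, b, m, ttt):
    """Posterior mean of the exponential failure rate under a Gamma(a, b)
    prior (shape a, rate b), after observing m failures with total time on
    test ttt.  Gamma-exponential conjugacy gives a Gamma(a + m, b + ttt)
    posterior, whose mean is (a + m) / (b + ttt)."""
    return (a + m) / (b + ttt)

def bayes_decision(a, b, m, ttt, threshold):
    """Sketch of a threshold-type Bayes decision function: accept the batch
    (return 1) when the posterior mean failure rate is small enough.
    `threshold` is a placeholder for the cutoff implied by the losses."""
    return 1 if posterior_mean_rate(a, b, m, ttt) <= threshold else 0
```

The decision depends on the data only through the failure count and total time on test, which is why a threshold-type rule is natural here.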

Theorem 1. *The decision function defined above is a Bayes decision function in the sense that, for each fixed set of sampling parameters, its Bayes risk is no larger than that of any other decision function.*

*Proof. *Partition the sample space into the region on which the proposed decision function accepts the batch and its complement. On each region, the proposed decision function attains the pointwise minimum of the posterior expected loss, so no competing decision function can achieve a smaller integrated risk there. Combining the resulting inequalities, displayed in (9)–(11), leads to the result of the theorem.

###### 2.1.1. An Alternative Form of the Bayes Decision Function

A straightforward computation yields an alternative form of the Bayes decision function. If the cost parameters satisfy a certain boundary condition, the decision function takes the same value for every possible observation; in such a situation, the life test provides no useful information, and the sampling plan with no sample data should be adopted instead.

Otherwise, an auxiliary function of the observed data can be defined that is increasing over the relevant range, so the Bayes decision function takes a threshold form. If the censoring time were allowed to exceed a certain bound, the decision function would again take the same value for every observation; accordingly, the censoring time is restricted to lie below this bound.

###### 2.1.2. Bayes Decision Function for No Sample Data Case

In the no-sample situation, no units are placed on test, so both the sampling cost and the test duration are zero.

The Bayes decision function then reduces to comparing the prior expected losses of acceptance and rejection: the batch is accepted when the prior expected loss of acceptance is the smaller of the two. Thus, for the no-sample-data case, the Bayes risk of the sampling plan is the minimum of these two prior expected losses.
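The no-sample rule described above can be sketched directly; the two prior expected losses are passed in as precomputed numbers, since their symbolic forms are not reproduced here.

```python
def no_sample_plan(prior_loss_accept, prior_loss_reject):
    """No-sample-data Bayes rule: with no observations, accept exactly when
    the prior expected loss of acceptance is the smaller of the two; the
    Bayes risk of the plan is the minimum of the two prior expected losses."""
    accept = prior_loss_accept <= prior_loss_reject
    bayes_risk = min(prior_loss_accept, prior_loss_reject)
    return accept, bayes_risk
```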

##### 2.2. Derivation of Bayesian Sampling Plans

From (2), the Bayes risk decomposes into the expected sampling cost, the expected time cost, and the expected decision loss. Note that the duration of the experiment is bounded below in terms of the first order statistic of the lifetimes of the items put on life test.

Enlarging the sample cannot reduce the information available: a plan with a larger sample size can collect at least as much information from the sample as one with a smaller sample size. This yields a lower bound on the Bayes risk in terms of the sample size; consequently, if this lower bound exceeds the Bayes risk of the no-sample plan, the plan cannot be a Bayesian sampling plan. With this property, we propose Algorithm A for deriving a Bayesian sampling plan as follows.

*Algorithm A. *First define an upper bound on the admissible sample size, given by the largest integer below a threshold determined by the cost parameters (see Section 2.2.1).

*Step 1. *For each sample size within the admissible range, construct all type-I adaptive progressive hybrid censoring schemes.

*Step 2. *For each fixed sampling scheme, derive the Bayes decision function. If the lower bound on the Bayes risk exceeds the Bayes risk of the no-sample plan, the corresponding plan is not a Bayesian sampling plan.

If this happens for every admissible sample size, adopt the sampling plan with no sample data and go to Step 8.

Otherwise, go to Step 3.

*Step 3. *For each fixed sample size and censoring scheme, find the censoring time that minimizes the Bayes risk.

*Step 4. *For each fixed sample size, find the censoring scheme that, together with its optimal censoring time, minimizes the Bayes risk.

*Step 5. *For each admissible sample size, record the sampling plan obtained in Steps 3 and 4 together with its Bayes risk.

*Step 6. *Find the sample size whose associated plan attains the smallest Bayes risk over the admissible range.

*Step 7. *If the smallest Bayes risk found in Step 6 exceeds the Bayes risk of the no-sample plan, propose the sampling plan with no sample data; otherwise, the minimizing plan of Step 6 is the proposed sampling plan.

*Step 8. *When the no-sample plan is adopted, set the sample size to zero with an empty censoring scheme, and propose the sampling plan with no sample data.
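Algorithm A is, structurally, a nested minimization over the censoring time, the censoring scheme, and the sample size, followed by a comparison with the no-sample plan. The sketch below shows that structure only: `bayes_risk` is a caller-supplied placeholder for the risk computation, the scheme enumeration is a brute-force helper suitable only for small sample sizes, and the grid of candidate censoring times stands in for the continuous minimization of Step 3.

```python
from itertools import product

def enumerate_schemes(n, m):
    """All progressive censoring schemes (R_1, ..., R_m) of nonnegative
    integers summing to n - m (brute-force placeholder; small n only)."""
    total = n - m
    return [R for R in product(range(total + 1), repeat=m) if sum(R) == total]

def algorithm_a(n_max, m, taus, bayes_risk, no_sample_risk):
    """Structural sketch of Algorithm A: for each admissible sample size,
    censoring scheme, and candidate censoring time, evaluate the Bayes risk
    and keep the minimizer; fall back to the no-sample plan when nothing
    beats it.  `bayes_risk(n, scheme, tau)` is a placeholder callable."""
    best = (None, None, None, no_sample_risk)  # (n, scheme, tau, risk)
    for n in range(m, n_max + 1):              # Steps 1-6: nested search
        for scheme in enumerate_schemes(n, m):
            for tau in taus:
                r = bayes_risk(n, scheme, tau)
                if r < best[3]:
                    best = (n, scheme, tau, r)
    return best                                # Steps 7-8: fallback retained
```

With a toy risk function such as `lambda n, s, t: n + t`, the search returns the smallest sample size with the smallest candidate censoring time, illustrating the nesting order.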

Theorem 2. *The plan produced by Algorithm A is an optimal sampling plan in the sense that its Bayes risk is no larger than that of any other sampling plan.*

*Proof. *It suffices to show that any sampling plan has a Bayes risk no smaller than that of the proposed plan. Steps 2 through 6 successively minimize the Bayes risk over the censoring time, the censoring scheme, and the sample size, each step yielding an inequality. Combining these inequalities with (26), we conclude the theorem.

###### 2.2.1. An Upper Bound on the Sample Size

Since the sampling cost alone eventually exceeds the Bayes risk of the no-sample plan, the sample size of a Bayesian sampling plan is bounded above.

##### 2.3. An Illustrating Numerical Example

A numerical example illustrates the application of Algorithm A for deriving the Bayesian sampling plan.

*Example 1. *A batch of items is presented for acceptance sampling, with specified values for the prior parameters and the cost parameters. The goal is to find the Bayesian sampling plan. The upper bound of Section 2.2.1 restricts the search to a small range of sample sizes. Straightforward computations then show that, for every admissible sample size beyond the smallest, the lower bound on the Bayes risk exceeds the Bayes risk of the no-sample plan. Hence, by Step 2 of Algorithm A, it suffices to consider sampling plans of the smallest admissible sample size.

For such a plan, the duration of the test is the minimum of the censoring time and an exponentially distributed lifetime. Since the relevant components of the Bayes risk are increasing in the censoring time beyond a certain point, the search for the best plan can be restricted to censoring times below that point, and the Bayes risk reduces to the expression in (35). Numerical computation is used to find the censoring time minimizing (35) over the admissible range; the minimizing plan is the Bayesian sampling plan, and its Bayes risk together with the associated quantities is then computed.
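The final numerical step of Example 1 is a one-dimensional minimization of the Bayes risk over the censoring time. A generic grid-search sketch, with the risk function passed in as a placeholder, is:

```python
def minimize_over_tau(risk, tau_max, grid_points=10000):
    """One-dimensional grid search for the censoring time minimizing a
    Bayes-risk function `risk` on (0, tau_max]; a stand-in for the numerical
    computation used in Example 1."""
    best_tau, best_risk = None, float("inf")
    for k in range(1, grid_points + 1):
        tau = k * tau_max / grid_points
        r = risk(tau)
        if r < best_risk:
            best_tau, best_risk = tau, r
    return best_tau, best_risk
```

In practice a bracketing method (e.g. golden-section search) would be preferred when the risk is known to be unimodal in the censoring time, as the monotonicity argument above suggests.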

#### 3. Derivation of the Bayesian Sampling Plan BSP₂


For type-II APHC, the experiment proceeds as follows. When a failure is observed before the threshold time, the number of functioning items specified by the progressive censoring scheme is randomly removed from the life test immediately after the failure. When a failure is observed after the threshold time but before the prescribed number of failures has been reached, the prespecified censoring scheme is no longer followed: failures continue to be observed without any further withdrawals. When the prescribed number of failures is reached, all the remaining functioning items are immediately removed and the experiment terminates. The number of failures observed before the threshold time is recorded, and the duration of the life test experiment is the time of the last observed failure. The observable variables have a joint probability density function of the form given in (36).

##### 3.1. A Bayes Decision Function

As in Section 2.1, the marginal joint density of the data, the posterior density of the parameter, and the posterior expectation are obtained from the joint density of the observable variables. For each observed data value, a Bayes decision function is defined by comparing the posterior expected losses of acceptance and rejection. Similar to Theorem 1, we can obtain the following theorem.

Theorem 3. *The decision function defined above is a Bayes decision function in the sense that, for each fixed set of sampling parameters, its Bayes risk is no larger than that of any other decision function.*

##### 3.2. Derivation of Bayesian Sampling Plans

From (2), the Bayes risk again decomposes into the expected sampling cost, the expected time cost, and the expected decision loss. Note that the duration of the experiment is bounded below in terms of an order statistic of the lifetimes of the items put on life test. Similar to inequality (19), a lower bound on the Bayes risk holds; if this lower bound exceeds the Bayes risk of the no-sample plan, the plan cannot be a Bayesian sampling plan. With this property, we propose Algorithm B for deriving a Bayesian sampling plan as follows.

*Algorithm B. *First define an upper bound on the admissible sample size, given by the largest integer below a threshold determined by the cost parameters.

*Step 1. *For each admissible pair of sample size and prescribed number of failures satisfying the restrictions of (45), construct all type-II adaptive progressive hybrid censoring schemes, and go to Step 2.

If no pair satisfies the restrictions of (45), adopt the sampling plan with no sample data and go to Step 8.

*Step 2. *For each fixed sampling scheme, derive the Bayes decision function. If the lower bound on the Bayes risk exceeds the Bayes risk of the no-sample plan, the corresponding plan is not a Bayesian sampling plan.

If this happens for every admissible pair of design parameters, adopt the sampling plan with no sample data and go to Step 8.

Otherwise, go to Step 3.

*Step 3. *For each fixed sample size, prescribed number of failures, and censoring scheme, find the threshold time that minimizes the Bayes risk.

*Step 4. *For each fixed sample size and prescribed number of failures, find the censoring scheme that, together with its optimal threshold time, minimizes the Bayes risk.

*Step 5. *For each admissible pair of design parameters, record the sampling plan obtained in Steps 3 and 4 together with its Bayes risk.

*Step 6. *Find the pair of design parameters whose associated plan attains the smallest Bayes risk.

*Step 7. *If the smallest Bayes risk found in Step 6 exceeds the Bayes risk of the no-sample plan, propose the sampling plan with no sample data; otherwise, the minimizing plan of Step 6 is the proposed sampling plan.

*Step 8. *When no admissible plan improves on the no-sample plan, the sampling plan with no sample data is the proposed sampling plan.

Analogous to Theorem 2, the following theorem holds.

Theorem 4. *The plan produced by Algorithm B is an optimal sampling plan in the sense that its Bayes risk is no larger than that of any other sampling plan.*

##### 3.3. An Illustrating Example

Before presenting an example, we provide some useful results, (a)–(e): lower bounds on the Bayes risk of any type-II APHC sampling plan, together with related identities, that are used in the example below.

*Example 2. *The model studied in Example 1 is applied here to illustrate the application of Algorithm B for searching for a Bayesian sampling plan. With the numerical values of Example 1, the search is again restricted to a small range of sample sizes. For each admissible pair of design parameters beyond the smallest, the lower-bound inequality of result (a) holds, so the corresponding plan is not a Bayesian sampling plan. Therefore, it suffices to consider the single remaining candidate plan, and a direct computation of its Bayes risk shows that it is the Bayesian sampling plan for the acceptance sampling under consideration.

#### 4. Comparison with Lin and Huang’s [3] Sampling Plans

##### 4.1. Comparison with Type-I APHC

Lin and Huang [3] studied acceptance sampling with this loss under type-I APHC. They considered a particular type of decision function based on the maximum likelihood estimator (MLE) of the expected lifetime: the batch is accepted when the MLE reaches a specified threshold. A sampling plan equipped with this type of decision function has a Bayes risk that can be written out in closed form.
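Under the usual convention, this rule accepts the batch when the MLE of the expected lifetime (total time on test divided by the number of failures) is at least a threshold. The sketch below assumes that convention; the threshold and the handling of the zero-failure case are illustrative placeholders, not values from Lin and Huang [3].

```python
def mle_decision(total_time_on_test, m, threshold):
    """Sketch of a Lin-Huang-type decision rule: accept the batch (return 1)
    when the MLE of the expected lifetime, total time on test divided by the
    number of failures, is at least `threshold`.  With no failures the MLE
    is effectively infinite, so the batch is accepted (placeholder choice)."""
    if m == 0:
        return 1
    mle = total_time_on_test / m
    return 1 if mle >= threshold else 0
```

Unlike the Bayes decision function of Section 2.1, this rule depends on the data only through the MLE, which is why its Bayes risk can exceed the minimum Bayes risk.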

Lin and Huang [3] claimed to have derived the Bayesian sampling plans of this type. The parameter values of their sampling plans and the associated Bayes risks are provided in Table 4 of Lin and Huang [3] (see also Table 1 of the present paper). We note that there are some minor errors in the reported values of the minimum Bayes risk (MBR). Thus, to facilitate comparison, the corresponding quantities and Bayes risks are recalculated from the provided parameter values and reported in Table 2. We will present several propositions to examine the correctness of these sampling plans.

Proposition 5. *Suppose that a sampling plan is such that . Then, this sampling plan *