Research Article | Open Access

Volume 2016 | Article ID 2071582 | 13 pages | https://doi.org/10.1155/2016/2071582

# Applications of Fuss-Catalan Numbers to Success Runs of Bernoulli Trials

Accepted 15 Dec 2015
Published 12 Jan 2016

#### Abstract

In a recent paper, the authors derived the exact solution for the probability mass function of the geometric distribution of order $k$, expressing the roots of the associated auxiliary equation in terms of generating functions for Fuss-Catalan numbers. This paper applies the above formalism for the Fuss-Catalan numbers to treat additional problems pertaining to occurrences of success runs. New exact analytical expressions for the probability mass function, probability generating function, and so forth are derived. First, we treat sequences of Bernoulli trials with $r \ge 1$ occurrences of success runs of length $k$ with $\ell$-overlapping. The case $\ell < 0$, where there must be a gap of at least $|\ell|$ trials between success runs, is also studied. Next we treat the distribution of the waiting time for the nonoverlapping appearance of a pair of successes separated by at most $k$ failures ($k \ge 0$).

#### 1. Introduction

In a recent paper [1], the authors derived the exact analytical solution for the probability mass function of the geometric distribution of order $k$. The roots of the auxiliary equation of the associated recurrence relation were derived in terms of generating functions for Fuss-Catalan numbers. (See the text by Graham et al. [2] for details about Fuss-Catalan numbers.) In this paper, we employ our formalism for the Fuss-Catalan numbers to treat additional problems pertaining to occurrences of success runs in sequences of Bernoulli trials. Throughout our paper, we treat only sequences of independent identically distributed (i.i.d.) Bernoulli trials with constant success probability $p$ (and failure probability $q = 1 - p$). The theory of success runs is discussed extensively in the texts by Balakrishnan and Koutras [3] and Johnson et al. [4]. Our formalism provides a new perspective on problems of success runs in sequences of Bernoulli trials and complements and extends results derived by previous authors (especially Feller [5, pp. 322–326]). Citations and comparisons to the works of others will be presented in Sections 3 and 4, after we have derived our results.

We treat two main problems in this paper. First, we consider sequences with multiple occurrences of success runs of length $k$. The success runs are permitted to overlap, with a maximum of $\ell$ overlaps between consecutive success runs. This is known as "$\ell$-overlapping." The case $\ell < 0$ is perhaps surprising at first sight but is also of interest. In this case there must be a gap or "buffer" of at least $|\ell|$ trials (of arbitrary outcomes) between success runs. We call this "$\ell$-buffering." We also consider the scenario in which the length $n$ of the sequence is held fixed and the number of success runs is allowed to vary. This is the binomial distribution of order $k$ with $\ell$-overlapping success runs. An encyclopedia article on binomial distributions of order $k$ has been published by Philippou and Antzoulakos [6]. Using Fuss-Catalan numbers, we present new concise expressions for the probability mass functions of these distributions.

In Section 4 we study a different problem. We analyze the distribution of the waiting time for the nonoverlapping appearance of a pair of successes separated by at most $k$ failures ($k \ge 0$). Our main reference for this problem is the elegant analysis by Koutras [7], who also gives an excellent bibliography on the subject. For the first appearance, the problem is a special case of the detection waiting time when a 2-out-of-$(k+2)$ moving (or sliding) window detector is employed, since two successes lie within a window of $k + 2$ consecutive trials if and only if they are separated by at most $k$ failures. See Koutras [7] for additional details and references. Note that our material in Section 4 is self-contained and treats a different problem from that mentioned above.

#### 2. Notation and Definitions

We summarize the basic notation and definitions presented in our earlier paper [1]. For a sequence of independent identically distributed Bernoulli trials with success probability $p$ (and failure probability $q = 1 - p$), let $T_k$ be the waiting time for the first run of $k$ consecutive successes. Then $T_k$ is said to have the geometric distribution of order $k$. This distribution was studied by Feller in his classic text [5, pp. 322–326]. It is also known as the negative binomial distribution of order $k$ with parameter $r = 1$; see Philippou [8]. The probability mass function $f_{k,p}(n) = P(T_k = n)$ satisfies the recurrence relation, for $n > k$,
$$f_{k,p}(n) = \sum_{j=1}^{k} q\, p^{j-1} f_{k,p}(n-j).$$
The initial conditions are $f_{k,p}(n) = 0$ for $n < k$ and $f_{k,p}(k) = p^k$. We define the auxiliary polynomial
$$\psi_{k,p}(x) = x^k - \sum_{j=1}^{k} q\, p^{j-1} x^{k-j}.$$
The auxiliary equation is $\psi_{k,p}(x) = 0$. We will drop the subscripts $k$ and $p$ unless necessary. Feller [5] proved that the roots of the auxiliary equation are distinct and also that there is a unique positive real root, and it lies in $(0, 1)$, and the real positive root has a strictly larger magnitude than all the other roots. Additional properties of the roots were derived in [1]. We denote the roots by $\xi_j$, $j = 1, \dots, k$, where $\xi_1$ is the positive real root. We call $\xi_1$ the "principal root" and the other roots "secondary roots." Unless required, we will omit the arguments $k$ and $p$. It is useful to multiply $\psi_{k,p}(x)$ by $(x - p)$ to obtain the polynomial
$$\phi_{k,p}(x) = (x - p)\,\psi_{k,p}(x) = x^{k+1} - x^k + q\, p^k.$$
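The setup above can be checked directly by dynamic programming. The sketch below assumes the recurrence $f(n) = \sum_j q p^{j-1} f(n-j)$ and the auxiliary polynomial $\psi_{k,p}$ in the forms reconstructed here (consistent with Feller's generating function); the variable names are illustrative only.

```python
# Sketch: verify the reconstructed recurrence and auxiliary polynomial for
# the geometric distribution of order k by direct dynamic programming.
k, p = 3, 0.6
q = 1 - p
nmax = 200

# p.m.f. of the waiting time T_k via the streak Markov chain
dist = {0: 1.0}                       # state = current success streak
f = [0.0] * (nmax + 1)
for n in range(1, nmax + 1):
    new = {0: 0.0}
    for streak, w in dist.items():
        new[0] += w * q               # failure resets the streak
        if streak + 1 == k:
            f[n] += w * p             # run completed at trial n
        else:
            new[streak + 1] = new.get(streak + 1, 0.0) + w * p
    dist = new

# initial conditions: f(n) = 0 for n < k, f(k) = p^k
assert all(f[n] == 0.0 for n in range(k))
assert abs(f[k] - p ** k) < 1e-15

# recurrence f(n) = sum_{j=1}^k q p^(j-1) f(n-j) for n > k
for n in range(k + 1, nmax + 1):
    rhs = sum(q * p ** (j - 1) * f[n - j] for j in range(1, k + 1))
    assert abs(f[n] - rhs) < 1e-15

# asymptotically f(n+1)/f(n) -> xi_1, the unique positive root of psi in (0,1)
psi = lambda x: x ** k - q * sum(p ** (j - 1) * x ** (k - j) for j in range(1, k + 1))
lo, hi = 0.0, 1.0                     # psi(0) < 0 < psi(1) = p^k
for _ in range(100):
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if psi(mid) < 0 else (lo, mid)
xi1 = (lo + hi) / 2
assert abs(f[nmax] / f[nmax - 1] - xi1) < 1e-6
print("recurrence and principal root verified; xi_1 =", round(xi1, 6))
```

The bisection is justified because $\psi$ has exactly one sign change in its coefficients (Descartes), hence a unique positive real root.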

Remark 1 (Fuss-Catalan numbers and roots of the auxiliary polynomial). Relevant definitions, formulas, and identities for the Fuss-Catalan numbers can be found in the text by Graham et al. [2]. The Fuss-Catalan numbers are given by
$$A_m(u, r) = \frac{r}{m!} \prod_{i=1}^{m-1} (mu + r - i) = \frac{r}{mu + r} \binom{mu + r}{m}.$$
The first form (finite product) is valid in general, while the second form (Gamma functions) is well defined provided $mu + r$ is not a nonpositive integer. The generating function of the Fuss-Catalan numbers is
$$B_u(z) = \sum_{m=0}^{\infty} A_m(u, 1)\, z^m,$$
and [2, p. 363]
$$B_u(z) = 1 + z\, B_u(z)^u, \qquad B_u(z)^r = \sum_{m=0}^{\infty} A_m(u, r)\, z^m.$$
We will also require a formula, proved in [1], expressing the roots of the auxiliary equation in terms of the generating functions $B_u$. It was proved in [1] that, for all $0 < p < 1$, the secondary roots $\xi_j$, $j = 2, \dots, k$, are given by such expressions. For $p \le k/(k+1)$, the corresponding expression also applies for the principal root, and in particular $\xi_1 = 1/B_{k+1}(qp^k)$, while, for $p > k/(k+1)$, the principal root must be obtained from a different branch.
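A quick numerical sketch of Remark 1, using only the standard library. It checks the closed form for $A_m(u,r)$ against the Catalan numbers ($u = 2$), verifies the functional equation $B_u(z) = 1 + z B_u(z)^u$ by truncated summation, and tests the (reconstructed) relation $\xi_1 = 1/B_{k+1}(qp^k)$ for a value $p < k/(k+1)$.

```python
# Sketch: Fuss-Catalan numbers A_m(u, r) = r/(mu + r) * C(mu + r, m) and the
# generating-function relations quoted in Remark 1.
from math import comb

def fuss_catalan(m, u, r=1):
    # exact integer value; multiply before dividing so the division is exact
    return r * comb(m * u + r, m) // (m * u + r)

# u = 2 reproduces the Catalan numbers 1, 1, 2, 5, 14, 42, ...
assert [fuss_catalan(m, 2) for m in range(6)] == [1, 1, 2, 5, 14, 42]

def B(u, z, terms=200):
    """Truncated generating function B_u(z); converges for |z| within
    the radius (u-1)^(u-1)/u^u."""
    return sum(fuss_catalan(m, u) * z ** m for m in range(terms))

k, p = 3, 0.6              # p < k/(k+1) = 0.75
q = 1 - p
z = q * p ** k             # always within the radius k^k/(k+1)^(k+1)
b = B(k + 1, z)
assert abs(b - (1 + z * b ** (k + 1))) < 1e-12   # functional equation

# principal root of psi_{k,p} by bisection: psi(0) < 0 < psi(1) = p^k
psi = lambda x: x ** k - q * sum(p ** (j - 1) * x ** (k - j) for j in range(1, k + 1))
lo, hi = 0.0, 1.0
for _ in range(200):
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if psi(mid) < 0 else (lo, mid)
xi1 = (lo + hi) / 2
assert abs(xi1 - 1 / b) < 1e-10    # xi_1 = 1/B_{k+1}(q p^k) for p <= k/(k+1)
print("xi_1 =", xi1)
```

The relation $\xi_1 = 1/B_{k+1}(qp^k)$ follows because $s = B_{k+1}(qp^k)$ satisfies $1 - s + qp^k s^{k+1} = 0$, so $1/s$ is a root of $x^{k+1} - x^k + qp^k$.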

For ease of reference, we list several relevant properties of the roots $\xi_j$ in the following. The proofs of all the results were given in [1], or references cited therein, and are omitted here.

Remark 2. All the roots of the auxiliary equation are distinct.

Remark 3. For $0 < p < 1$, the auxiliary equation has a unique positive real root, which lies in $(0, 1)$. We denote the positive real root by $\xi_1$, or $\xi_1(k, p)$, as stated above. For any $k \ge 1$ and $0 < p < 1$, exactly one of the three following statements is true: (i) $p < k/(k+1)$ and $\xi_1 > p$; (ii) $p = k/(k+1)$ and $\xi_1 = p$; (iii) $p > k/(k+1)$ and $\xi_1 < p$.

Remark 4. For $0 < p < 1$, the principal root has a strictly greater magnitude than all the other roots of the auxiliary equation; that is, $\xi_1 > |\xi_j|$ for $j = 2, \dots, k$, where $\xi_j$ is a root of $\psi_{k,p}$. We employ the term "secondary roots" for the set $\{\xi_2, \dots, \xi_k\}$.

Remark 5. For any $0 < p < 1$, the secondary roots $\xi_j$, $j = 2, \dots, k$, satisfy an inequality, derived in [1], bounding their magnitudes. The inequalities involving $|\xi_j|$ are strict if $0 < p < 1$.

Remark 6. For fixed $k$ and $p$, let $R_k$ denote the set of roots of the equation $\phi_{k,p}(x) = 0$. Let $\Psi_k$ denote the set of roots of the auxiliary equation $\psi_{k,p}(x) = 0$, so that $R_k = \Psi_k \cup \{p\}$. Then $p \in \Psi_k$ if $p = k/(k+1)$ (in which case $x = p$ is a double root of $\phi_{k,p}$); otherwise $p \notin \Psi_k$.

In addition to the above properties of the roots, we will also need the following two results, which were not proved in [1], as well as a lemma about sums of series.

Proposition 7 (distinctness of roots for different $k$). Consider fixed $p \in (0, 1)$. Suppose $x$ is a root of the auxiliary equation for a given $k$. Then $x$ is not a root for any other value of $k$.

Proof. We are given that
$$x^k (1 - x) = q\, p^k.$$
Suppose that $x$ is also a root for $k' \ne k$. Then by hypothesis
$$x^{k'} (1 - x) = q\, p^{k'}.$$
From (9), $x \ne 0$ and $x \ne 1$ for $0 < p < 1$; hence we can divide the two equations to deduce
$$x^{k - k'} = p^{k - k'}.$$
Hence $|x| = p$. Also from (9), $|x|^k |1 - x| = q p^k$, so $|1 - x| = q$; but $|1 - x| \ge 1 - |x| = q$, with equality only if $x$ is real and positive, so the equality fails for all the secondary roots. Thus the only possibility is that $x$ is the positive real root. Hence $x = p$, but this is a root if and only if $p = k/(k+1)$ (see Remark 3). However, for arbitrary $p$, the constraint $p = k/(k+1)$ has either no solution for $k$ or at most one solution for $k$.

Proposition 8 (comparison of principal roots for different $k$, for fixed $p$). For fixed $p \in (0, 1)$, if $k' > k$, the principal roots $\xi_1(k', p)$ and $\xi_1(k, p)$, for $k'$ and $k$ respectively, satisfy the inequality
$$\xi_1(k', p) > \xi_1(k, p).$$

Proof. To exhibit the dependence on $k$, we denote the auxiliary polynomial by $\psi_k(x)$. Then from (2)
$$\psi_{k+1}(x) = x\,\psi_k(x) - q\, p^k.$$
Then set $x = \xi_1(k, p)$ to obtain
$$\psi_{k+1}\big(\xi_1(k, p)\big) = -q\, p^k < 0.$$
Now $\psi_{k+1}$ has exactly one real root in $(0, 1)$, which is $\xi_1(k+1, p)$. Also $\psi_{k+1}(1) = p^{k+1} > 0$ and $\psi_{k+1}(\xi_1(k, p)) < 0$. It follows that
$$\xi_1(k, p) < \xi_1(k+1, p) < 1.$$
By extension, this establishes (13) for all $k' > k$.
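Proposition 8 is easy to confirm numerically. The sketch below computes the principal root by bisection (using the polynomial form reconstructed in Section 2) and checks strict monotonicity in $k$ for several values of $p$.

```python
# Sketch: numerical check that xi_1(k, p) increases strictly with k
# (Proposition 8), with the auxiliary polynomial as reconstructed above.
def principal_root(k, p):
    q = 1 - p
    psi = lambda x: x ** k - q * sum(p ** (j - 1) * x ** (k - j)
                                     for j in range(1, k + 1))
    lo, hi = 0.0, 1.0      # psi(0) = -q p^(k-1) < 0 < psi(1) = p^k
    for _ in range(100):
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if psi(mid) < 0 else (lo, mid)
    return (lo + hi) / 2

for p in (0.3, 0.5, 0.9):
    xs = [principal_root(k, p) for k in range(1, 9)]
    assert all(a < b for a, b in zip(xs, xs[1:])), (p, xs)
    assert all(0 < x < 1 for x in xs)
print("xi_1(k, p) strictly increases with k for each tested p")
```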

Lemma 9. For $|x| < 1$ and $j \ge 0$,
$$\sum_{m=j}^{\infty} \binom{m}{j} x^m = \frac{x^j}{(1-x)^{j+1}}.$$
Next, for $|x| < 1$ and $j \ge 0$,
$$\sum_{m=j}^{\infty} \binom{m}{j}\, m\, x^m = \frac{x^j (j + x)}{(1-x)^{j+2}}.$$
The expressions on the right-hand sides of both equations are clearly well defined for all $|x| < 1$.

Proof. To derive (17) we define the sum
$$S(x) = \sum_{m=0}^{\infty} x^m = \frac{1}{1-x}, \qquad |x| < 1.$$
Then differentiate the sum in (19) $j$ times to obtain
$$S^{(j)}(x) = \sum_{m=j}^{\infty} \frac{m!}{(m-j)!}\, x^{m-j} = \frac{j!}{(1-x)^{j+1}}.$$
Division by $j!$ and multiplication by $x^j$ yield (17). The derivation of (18) is an application of Leibniz's rule: differentiating the product $x\,S^{(j)}(x)$ and collecting terms yield (18).
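The two series identities (stated here as reconstructed in Lemma 9) are cheap to verify numerically by truncated summation:

```python
# Sketch: numerical check of the binomial series identities used in Lemma 9.
from math import comb

def lhs1(x, j, terms=2000):
    return sum(comb(m, j) * x ** m for m in range(j, terms))

def lhs2(x, j, terms=2000):
    return sum(comb(m, j) * m * x ** m for m in range(j, terms))

for x in (0.1, 0.5, -0.7):
    for j in (0, 1, 3):
        assert abs(lhs1(x, j) - x ** j / (1 - x) ** (j + 1)) < 1e-9
        assert abs(lhs2(x, j) - x ** j * (j + x) / (1 - x) ** (j + 2)) < 1e-9
print("series identities verified numerically")
```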

#### 3. Multiple Success Runs in Sequences of Bernoulli Trials

##### 3.1. Probability Generating Function, Mean, and Variance

We now turn to the first problem of interest in this paper, namely, the waiting time to obtain $r$ success runs of length $k$. We begin by displaying the following expressions for the case $r = 1$. They were derived by Feller [5] and will be required in the following.

Remark 10 (probability generating function for $r = 1$). Let $q = 1 - p$. The probability generating function (p.g.f.) for the geometric distribution of order $k$ is [5, Section XIII.7]
$$\varphi_{k,p}(s) = E\big[s^{T_k}\big] = \frac{p^k s^k (1 - ps)}{1 - s + q\, p^k s^{k+1}}.$$
The p.g.f. exists for $|s| < 1/\xi_1(k, p)$. The mean and variance are given by [5, Section XIII.7]
$$\mu_{k,p} = \frac{1 - p^k}{q\, p^k}, \qquad \sigma^2_{k,p} = \frac{1}{(q\, p^k)^2} - \frac{2k+1}{q\, p^k} - \frac{p}{q^2}.$$
The dependences on $k$ and $p$ will be omitted in the following unless necessary.
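Feller's mean and variance formulas, as quoted in Remark 10, can be checked by Monte Carlo simulation (a sketch; the sample size and seed are arbitrary):

```python
# Sketch: Monte Carlo check of Feller's mean and variance for the
# geometric distribution of order k.
import random

def waiting_time(k, p, rng):
    """Trials until the first run of k consecutive successes."""
    streak = n = 0
    while True:
        n += 1
        streak = streak + 1 if rng.random() < p else 0
        if streak == k:
            return n

rng = random.Random(12345)
k, p = 3, 0.5
q = 1 - p
N = 200_000
samples = [waiting_time(k, p, rng) for _ in range(N)]
mean = sum(samples) / N
var = sum((s - mean) ** 2 for s in samples) / N
mu = (1 - p ** k) / (q * p ** k)                     # = 14 for k=3, p=1/2
sigma2 = 1 / (q * p ** k) ** 2 - (2 * k + 1) / (q * p ** k) - p / q ** 2
assert abs(mean - mu) < 0.2
assert abs(var - sigma2) < 5.0
print(f"empirical mean {mean:.2f} vs exact {mu:.2f}")
```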

We now calculate the probability generating function, mean, and variance for multiple overlapping runs. The success runs have length $k$ and there can be at most $\ell$ overlaps between consecutive success runs, where $0 \le \ell \le k - 1$. We denote the waiting time by $T_{r,k,\ell}$. The case of nonoverlapping runs ($\ell = 0$) was extensively analyzed by Philippou [8], who named the distribution the negative binomial distribution of order $k$ with parameter $r$. Ling (1989) [9] and Hirano et al. [10] derived results for the special case $\ell = k - 1$. The text by Balakrishnan and Koutras [3] lists the special cases $\ell = 0$ and $\ell = k - 1$ as, respectively, Type I and Type III negative binomial distributions of order $k$.

Proposition 11 (probability generating function for $r$ runs). Let the probability generating function for $r$ success runs of length $k$ with at most $\ell$ overlaps be $\varphi_{r,k,\ell}(s)$. We will omit the subscripts $k$ and $\ell$ unless necessary. Then
$$\varphi_{r,k,\ell}(s) = \frac{\varphi_{k,p}(s)^r}{\varphi_{\ell,p}(s)^{r-1}}.$$
Notice that, for $r = 1$, we obtain $\varphi_{1,k,\ell} = \varphi_{k,p}$ (see (22)), as required.

Proof. Define $T_i$ as the waiting time to complete $i$ success runs. So suppose we have completed $i$ success runs. Hence by definition the last $k$ trials are all successes. Then exactly one of the following mutually exclusive events will occur: (i) The next $k - \ell$ trials are all successes. This yields the $(i+1)$th success run. (ii) The next $j$ trials are successes, followed by a failure, where $0 \le j \le k - \ell - 1$. Then we restart the waiting time for the next success run from scratch (conditioned on an initial failure). We denote this additional waiting time by $T_F$. Clearly, $T_F$ has the same distribution as $T_k$. Since the events are mutually exclusive, we add the probabilities to obtain
$$E\big[s^{T_{i+1} - T_i}\big] = p^{k-\ell} s^{k-\ell} + \left[\sum_{j=0}^{k-\ell-1} p^j q\, s^{j+1}\right] \varphi_{k,p}(s).$$
Now set $\varphi_i(s) = E[s^{T_i}]$ and note that the increments $T_{i+1} - T_i$ are independent and identically distributed. Hence we obtain the following recurrence relation and solution for $\varphi_r$:
$$\varphi_{i+1}(s) = g(s)\,\varphi_i(s), \qquad \varphi_r(s) = \varphi_{k,p}(s)\, g(s)^{r-1}.$$
Define $g(s)$ as the term in the brackets. After some tedious algebra we obtain
$$g(s) = p^{k-\ell} s^{k-\ell}\, \frac{1 - s + q\, p^{\ell} s^{\ell+1}}{1 - s + q\, p^{k} s^{k+1}} = \frac{\varphi_{k,p}(s)}{\varphi_{\ell,p}(s)}.$$
In the last line it is necessary to exhibit the dependences on $k$ and $\ell$ explicitly. Then (24) follows immediately.

Proposition 12 (domain of convergence). The probability generating function for $r$ success runs converges for
$$|s| < \frac{1}{\xi_1(k, p)}.$$
Hence the domain of convergence of the probability generating function is the same for all $r$.

Proof. Clearly, the function $\varphi_{r,k,\ell}$ is well defined if and only if the sums of the series for $\varphi_{k,p}$ and $\varphi_{\ell,p}$ both converge. It was proved by Feller [5] that $\varphi_{k,p}(s)$ exists for $|s| < 1/\xi_1(k, p)$. Hence $\varphi_{r,k,\ell}(s)$ exists if
$$|s| < \min\left\{\frac{1}{\xi_1(k, p)},\ \frac{1}{\xi_1(\ell, p)}\right\}.$$
However, because $\ell < k$, it follows from Proposition 8 that $\xi_1(\ell, p) < \xi_1(k, p)$. Hence $1/\xi_1(\ell, p) > 1/\xi_1(k, p)$. This proves (28).

Proposition 13 (mean and variance). The mean and variance of the waiting time for $r$ success runs are given by
$$\mu_{r,k,\ell} = r\,\mu_{k,p} - (r-1)\,\mu_{\ell,p},$$
$$\sigma^2_{r,k,\ell} = r\,\sigma^2_{k,p} - (r-1)\,\sigma^2_{\ell,p}.$$

Proof. We put $L(s) = \ln \varphi_{r,k,\ell}(s) = r \ln \varphi_{k,p}(s) - (r-1) \ln \varphi_{\ell,p}(s)$ and differentiate with respect to $s$ and evaluate at $s = 1$. We differentiate to obtain
$$L'(s) = r\,\frac{\varphi_{k,p}'(s)}{\varphi_{k,p}(s)} - (r-1)\,\frac{\varphi_{\ell,p}'(s)}{\varphi_{\ell,p}(s)}.$$
Evaluating at $s = 1$ and noting that $\varphi_{k,p}(1) = \varphi_{\ell,p}(1) = 1$ and $\varphi'(1) = \mu$ yield
$$\mu_{r,k,\ell} = L'(1) = r\,\mu_{k,p} - (r-1)\,\mu_{\ell,p}.$$
This proves (30a). We differentiate again to obtain
$$L''(s) = r\,\frac{d^2}{ds^2} \ln \varphi_{k,p}(s) - (r-1)\,\frac{d^2}{ds^2} \ln \varphi_{\ell,p}(s).$$
We again evaluate at $s = 1$, using the identity $\sigma^2 = L''(1) + L'(1)$ (valid for the logarithm of any probability generating function), to obtain
$$\sigma^2_{r,k,\ell} = r\,\sigma^2_{k,p} - (r-1)\,\sigma^2_{\ell,p}.$$
This proves (30b).
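The closed form for the mean (as reconstructed in Proposition 13) can be tested by simulation in the two extreme cases, $\ell = 0$ (nonoverlapping) and $\ell = k - 1$ (fully overlapping); this is a sketch under the streak-reset convention for $\ell$-overlapping counting:

```python
# Sketch: Monte Carlo check of mu_{r,k,l} = r mu_k - (r-1) mu_l for
# l = 0 and l = k-1 (formula as reconstructed in Proposition 13).
import random

def mu(k, p):
    q = 1 - p
    return (1 - p ** k) / (q * p ** k) if k > 0 else 0.0

def waiting_time_r_runs(r, k, l, p, rng):
    """Trials until the r-th run of k successes; consecutive runs may
    share at most l trials (0 <= l <= k-1)."""
    n = runs = streak = 0
    while runs < r:
        n += 1
        streak = streak + 1 if rng.random() < p else 0
        if streak == k:
            runs += 1
            streak = l        # l trials of this run may be reused
    return n

rng = random.Random(99)
k, p, r, N = 3, 0.5, 4, 50_000
for l in (0, k - 1):
    est = sum(waiting_time_r_runs(r, k, l, p, rng) for _ in range(N)) / N
    exact = r * mu(k, p) - (r - 1) * mu(l, p)   # 56 for l=0, 38 for l=2
    assert abs(est - exact) < 0.5, (l, est, exact)
    print(f"l={l}: simulated {est:.2f}, exact {exact:.2f}")
```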

We now show that various results derived by other authors are special cases of our results above. As stated above, the case of nonoverlapping runs ($\ell = 0$) was solved by Philippou [8], while Ling (1989) [9] and Hirano et al. [10] treated the case of overlapping runs ($\ell = k - 1$).

Remark 14 (Philippou (1984)). Philippou [8] stated "Let $X$ be distributed as the negative binomial distribution of order $k$ with parameters $r$ and $p$. Then its probability generating function, to be denoted by $\varphi_r(s)$, is given by"
$$\varphi_r(s) = \left[\frac{p^k s^k (1 - ps)}{1 - s + q\, p^k s^{k+1}}\right]^r.$$
The mean and variance are given by [8]
$$\mu_r = \frac{r(1 - p^k)}{q\, p^k}, \qquad \sigma^2_r = r\left[\frac{1}{(q\, p^k)^2} - \frac{2k+1}{q\, p^k} - \frac{p}{q^2}\right].$$

Proof. By definition, Philippou's notation $\varphi_r$ is the same as our $\varphi_{r,k,\ell}$ with $\ell = 0$. From (22), it is easy to show that $\varphi_{\ell,p}(s) = 1$ for $\ell = 0$, whence $\varphi_{r,k,0} = \varphi_{k,p}^r$ and (35) follows. Note that Philippou [8] stated the domain of convergence to be $|s| \le 1$, but we have shown that it is $|s| < 1/\xi_1(k, p)$, which is a larger domain. Next, it is also easy to show that $\mu_{\ell,p} = 0$ and $\sigma^2_{\ell,p} = 0$ for $\ell = 0$, whence $\mu_{r,k,0} = r\,\mu_{k,p}$ and $\sigma^2_{r,k,0} = r\,\sigma^2_{k,p}$, which yield (36a) and (36b), respectively.

Remark 15 (Ling (1989)). Ling (1989) [9] stated that, for $r \ge 2$,
$$\varphi_r(s) = \big[p s + q s\, \varphi_1(s)\big]\, \varphi_{r-1}(s),$$
$$\varphi_r(s) = \big[p s + q s\, \varphi_1(s)\big]^{r-1}\, \varphi_1(s).$$

Proof. Ling wrote $\varphi_r$ and $\varphi_1$ where we have written $\varphi_{r,k,k-1}$ and $\varphi_{k,p}$, but the connection between the notations is clear. Setting $\ell = k - 1$ in (26a) yields
$$g(s) = p s + q s\, \varphi_{k,p}(s).$$
This yields (37a). Next (37b) follows immediately by solving the recurrence relation
$$\varphi_r(s) = g(s)\, \varphi_{r-1}(s).$$
Similar to Philippou [8], Ling (1989) [9] also stated the domain of convergence to be $|s| \le 1$, but we have shown that it is larger, given by $|s| < 1/\xi_1(k, p)$.

Remark 16 (Hirano et al. (1991)). Hirano et al. [10] stated "Let $\Phi_r(t)$ be the p.g.f. of $W_r$," where $W_r$ is the waiting time for the $r$th overlapping success run. They wrote a recurrence relation expressing $\Phi_r$ in terms of $\Phi_{r-1}$, together with an explicit expression for $\Phi_1$, and then derived the solution for $\Phi_r$ in closed form. They also gave expressions for the mean and variance. The mean is
$$E[W_r] = \frac{1 - p^k + (r-1)\,q}{q\, p^k}.$$

Proof. The connection between the notations is that they write $W_r$ and $\Phi_r$ where we write $T_{r,k,k-1}$ and $\varphi_{r,k,k-1}$, respectively. They employ $t$ as the independent variable, where we use $s$. It is simple to derive that their expression for $\Phi_1$ in (42) equals that for $\varphi_{k,p}$ in (22). Next, (40) is simply (39) with the changes of notation listed above. Next, setting $\ell = k - 1$ in (24) and changing the independent variable from $s$ to $t$ yield their closed-form solution. This is exactly (42). From (30a), the mean for $\ell = k - 1$ is
$$\mu_{r,k,k-1} = r\,\mu_{k,p} - (r-1)\,\mu_{k-1,p} = \frac{1 - p^k + (r-1)\,q}{q\, p^k}.$$
This is exactly (43). Hirano et al. [10] also displayed an expression for the variance. The proof of equivalence with our expression involves merely tedious algebra and is omitted.

##### 3.2. Probability Mass Function

We derive an expression for $f_{r,k,\ell}(n)$, the probability mass function (p.m.f.) that the $r$th success run of length $k$ with $\ell$-overlapping occurs at the $n$th Bernoulli trial, where $0 \le \ell \le k - 1$ and $r \ge 1$. Clearly $f_{r,k,\ell}(n) = 0$ for $n < rk - (r-1)\ell$ and $f_{r,k,\ell}(rk - (r-1)\ell) = p^{rk - (r-1)\ell}$. An expression for the p.m.f. for the case $r = 1$ was derived in [1]. By definition, the probability generating function is related to the probability mass function via
$$\varphi_{r,k,\ell}(s) = \sum_{n=0}^{\infty} f_{r,k,\ell}(n)\, s^n.$$
We derived an expression for $\varphi_{r,k,\ell}$ above and we will use it to derive an expression for $f_{r,k,\ell}(n)$ in the following. From the second form for $\varphi_{k,p}$ in (22), with $x = 1/s$,
$$\varphi_{k,p}(s) = \frac{p^k}{\psi_{k,p}(x)}.$$
Hence for $r$ success runs,
$$\varphi_{r,k,\ell}(s) = p^{rk - (r-1)\ell}\, \frac{\psi_{\ell,p}(x)^{r-1}}{\psi_{k,p}(x)^{r}}.$$
The right-hand side is a rational function of two polynomials. From Proposition 7, the auxiliary polynomials $\psi_{k,p}$ and $\psi_{\ell,p}$ have no roots in common. Furthermore, because $\ell < k$, the numerator polynomial is of a lower degree than the denominator polynomial. We also know that all the roots of the auxiliary polynomials are distinct. Hence we can expand $\varphi_{r,k,\ell}$ as a sum of partial fractions with repeated roots (of the denominator polynomial)
$$\varphi_{r,k,\ell} = p^{rk - (r-1)\ell} \sum_{j=1}^{k} \sum_{i=1}^{r} \frac{c_{ij}}{(x - \xi_j)^i}.$$
Here the coefficients $c_{ij}$ are parameters which depend on $k$, $\ell$, and $r$ but not on $x$. For brevity, we drop the subscripts $k$ and $\ell$ and also write the roots as $\xi_j$ in the following. The coefficients can be evaluated explicitly in terms of the roots via the standard residues formula
$$c_{ij} = \frac{1}{(r-i)!}\, \lim_{x \to \xi_j} \frac{d^{r-i}}{dx^{r-i}} \left[(x - \xi_j)^r\, \frac{\psi_{\ell,p}(x)^{r-1}}{\psi_{k,p}(x)^{r}}\right].$$
Returning to the use of $s$, we see that
$$\frac{1}{(x - \xi_j)^i} = \frac{s^i}{(1 - \xi_j s)^i}.$$
We expand the right-hand side using the negative binomial theorem and equate to the coefficient of $s^n$.

Proposition 17. The probability mass function for the $r$th success run of length $k$ with $\ell$-overlapping is given by
$$f_{r,k,\ell}(n) = p^{rk - (r-1)\ell} \sum_{j=1}^{k} \sum_{i=1}^{r} c_{ij} \binom{n-1}{i-1}\, \xi_j^{\,n-i}.$$
Hence $f_{r,k,\ell}(n)$ is given by a sum of exactly $rk$ terms, independently of $n$. Recall from above that $f_{r,k,\ell}(n) = 0$ for $n < rk - (r-1)\ell$, so the above formula is only required for $n \ge rk - (r-1)\ell$; hence the binomial coefficients are well defined.

The derivation of the above expression has already been given above, where all notation has been defined and explained.
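As an independent check on the structure of the p.m.f., the distribution of $T_{r,k,\ell}$ can be computed by dynamic programming over the Markov state (runs completed, current usable streak); for $\ell = 0$ the result must agree with the $r$-fold convolution of the order-$k$ geometric p.m.f., and the minimum-support value must equal $p^{rk-(r-1)\ell}$. This is a sketch; the streak-reset convention encodes $\ell$-overlapping.

```python
# Sketch: p.m.f. of T_{r,k,l} by dynamic programming, cross-checked against
# the r-fold convolution for l = 0 and the minimum-support probability.
from collections import defaultdict

def pmf_r_runs(r, k, l, p, nmax):
    q = 1 - p
    dist, f = {(0, 0): 1.0}, [0.0] * (nmax + 1)
    for n in range(1, nmax + 1):
        new = defaultdict(float)
        for (runs, streak), w in dist.items():
            new[(runs, 0)] += w * q            # failure resets the streak
            s = streak + 1
            if s == k:
                if runs + 1 == r:
                    f[n] += w * p              # r-th run completed at n
                else:
                    new[(runs + 1, l)] += w * p  # l trials may be reused
            else:
                new[(runs, s)] += w * p
        dist = new
    return f

k, p, r, nmax = 3, 0.5, 3, 200
f1 = pmf_r_runs(1, k, 0, p, nmax)
conv = f1[:]
for _ in range(r - 1):                          # r-fold convolution
    conv = [sum(conv[m] * f1[n - m] for m in range(n + 1))
            for n in range(nmax + 1)]
f3 = pmf_r_runs(r, k, 0, p, nmax)
assert all(abs(a - b) < 1e-12 for a, b in zip(f3, conv))
assert abs(sum(f3) - 1.0) < 1e-3                # nearly all mass below nmax
assert abs(f3[r * k] - p ** (r * k)) < 1e-12    # support starts at rk for l=0
fl = pmf_r_runs(r, k, 2, p, nmax)               # l = 2: support starts at 5
assert abs(fl[r * k - (r - 1) * 2] - p ** (r * k - (r - 1) * 2)) < 1e-12
print("DP p.m.f. consistent with convolution and minimum-support value")
```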

##### 3.3. Binomial Distribution of Order $k$ with $\ell$-Overlapping

Consider a sequence of Bernoulli trials of fixed length $n$ and let $N_{n,k,\ell}$ denote the number of success runs of length $k$ with a maximum of $\ell$ overlaps between consecutive success runs. This is the binomial distribution of order $k$ with $\ell$-overlapping success runs and has been reviewed in the encyclopedia article by Philippou and Antzoulakos [6]. Good overviews have also been given by Makri and Philippou [11] and Makri et al. [12]; see the bibliographies in both references. Ling (1988) [13] introduced the case of $\ell = k - 1$ and called it the "Type II binomial distribution of order $k$."

The case $x = 0$ is the probability that the longest success run in the first $n$ trials has length less than $k$. It is also known as the probability that the waiting time to attain the first success run of length $k$ exceeds $n$ trials. This scenario has been solved by many authors. For example, Feller [5] presented an asymptotic solution in terms of the principal root. In our paper [1], we extended Feller's solution to include all the roots. Solutions have also been derived by Burr and Cane [14], Godbole [15], Philippou and Makri [16], and Muselli [17], all of whom expressed their results using (possibly nested) binomial or multinomial sums.

Let $b_{n,k,\ell}(x) = P(N_{n,k,\ell} = x)$, where $x \ge 0$, be the probability mass function for $N_{n,k,\ell}$. We derive an expression for $b_{n,k,\ell}(x)$ in the following. Note that, to obtain a nontrivial distribution, we must have $n \ge xk - (x-1)\ell$, so for fixed $n$ we must have $x \le \lfloor (n - \ell)/(k - \ell) \rfloor$.

Proposition 18. The probability mass function for the binomial distribution of order $k$ with $\ell$-overlapping is given by
$$b_{n,k,\ell}(x) = \frac{f_{x+1,k,\ell}(n+k+1) - p^{k-\ell}\, f_{x,k,\ell}(n+\ell+1)}{q\, p^k}.$$
Here $x \ge 0$. Note that the last term vanishes for $x = 0$; hence the above expression agrees with our result in [1].

Proof. We solve the problem as follows. Suppose the $(x+1)$th success run is completed on the $t$th trial. Then by definition the outcomes of the last $k$ trials are all successes. There are now two mutually exclusive and exhaustive possibilities, according as the $(x+1)$th success run is contiguous with (and possibly overlaps) the $x$th success run, or there is at least one failure between the runs. (i) In the former case, the $x$th success run terminates at the trial $t - (k - \ell)$. (This event is null if $x = 0$.) The outcome of each of the last $k - \ell$ trials is a success. (ii) In the latter case, the outcome of the trial $t - k$ is a failure. The first $t - k - 1$ trials contain exactly $x$ success runs. Since the events are mutually exclusive and exhaustive, we add the probabilities to obtain
$$f_{x+1,k,\ell}(t) = p^{k-\ell}\, f_{x,k,\ell}\big(t - (k - \ell)\big) + q\, p^k\, b_{t-k-1,k,\ell}(x).$$
Rearranging terms and replacing $t$ by $n + k + 1$ yield (53). Our expression for $b_{n,k,\ell}(x)$ is given by a sum of exactly $(2x+1)k$ terms, independently of $n$. Note, however, that the coefficients $c_{ij}$ must be calculated for each $x$. In practice, this means we must calculate $c_{ij}$ in (52) for $i = 1, \dots, x+1$ and $j = 1, \dots, k$. This requires a total of $(x+1)k$ sums, to obtain the full probability mass distribution.
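The identity in Proposition 18 (as reconstructed here) can be verified exactly for small $n$ by exhaustive enumeration of all $2^n$ sequences on one side and a waiting-time DP on the other; the parameter values below are illustrative.

```python
# Sketch: exact check of b_{n,k,l}(x) = [f_{x+1}(n+k+1) - p^(k-l) f_x(n+l+1)] / (q p^k)
# by exhaustive enumeration versus a waiting-time dynamic program.
from itertools import product
from collections import defaultdict

def count_runs(seq, k, l):
    runs = streak = 0
    for s in seq:
        streak = streak + 1 if s else 0
        if streak == k:
            runs += 1
            streak = l            # at most l trials reused by the next run
    return runs

def pmf_runs(r, k, l, p, nmax):
    q = 1 - p
    if r == 0:
        return [1.0] + [0.0] * nmax   # "zeroth run" completes at n = 0
    dist, f = {(0, 0): 1.0}, [0.0] * (nmax + 1)
    for n in range(1, nmax + 1):
        new = defaultdict(float)
        for (runs, streak), w in dist.items():
            new[(runs, 0)] += w * q
            s = streak + 1
            if s == k:
                if runs + 1 == r:
                    f[n] += w * p
                else:
                    new[(runs + 1, l)] += w * p
            else:
                new[(runs, s)] += w * p
        dist = new
    return f

n, k, l, p = 14, 3, 1, 0.5
q = 1 - p
exact = defaultdict(float)
for seq in product((0, 1), repeat=n):
    exact[count_runs(seq, k, l)] += p ** sum(seq) * q ** (n - sum(seq))

for x in range(5):
    fx = pmf_runs(x, k, l, p, n + k + 1)
    fx1 = pmf_runs(x + 1, k, l, p, n + k + 1)
    rhs = (fx1[n + k + 1] - p ** (k - l) * fx[n + l + 1]) / (q * p ** k)
    assert abs(exact[x] - rhs) < 1e-12, (x, exact[x], rhs)
print("Proposition 18 verified for n=14, k=3, l=1")
```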

We now summarize results for the p.m.f. and p.g.f. derived by other authors. Aki and Hirano (2000) [18] derived an expression for the p.g.f. as a nested sum of multinomial terms. Makri and Philippou [11] derived the p.m.f. $b_{n,k,\ell}(x)$ as a sum of multinomial terms. They also derived an alternative expression for the p.m.f. [11] in terms of a combinatorial quantity which counts the number of possible ways of distributing identical balls into urns such that the maximum allowed number of balls in any one urn is bounded [11]. They also calculated the mean [11]. The special case $\ell = k - 1$ is a proposition of Aki and Hirano (1988) [19]; see also Antzoulakos and Chadjiconstantinidis [20]. The special case $\ell = 0$ is equivalent to a theorem of Ling (1988) [13]. Ling (1988) [13] gave a recursive relation for the p.m.f. and also an explicit expression for the p.m.f. [13], in terms of nested multinomial sums. Ling derived the mean and the variance and a recurrence relation for the m.g.f. [13]. Inoue and Aki [21] derived an explicit expression for the p.g.f. in terms of restricted multiple sums and multinomials. They stated that their expression for the special case $\ell = 0$ was derived by Inoue and Aki [21]. Hirano et al. [10] studied the case $\ell = k - 1$ in some detail. They give an explicit expression for the p.g.f. in terms of restricted multiple sums and multinomials [10]. Hirano et al. [10] give an explicit expression for the p.m.f. in terms of nested multinomial sums, but different from Ling (1988) [13]. Han and Aki [22] presented a recurrence formula to calculate the p.m.f.

##### 3.4. Success Runs with $\ell < 0$

The case $\ell < 0$ is not without interest. In this scenario, there must be a gap or buffer of at least $|\ell|$ trials (of arbitrary outcomes) between success runs. We call this scenario "$\ell$-buffering." First, we derive the probability mass function, probability generating function, mean, and variance of the negative binomial distribution of order $k$ for $r$ success runs of length $k$ with $\ell$-buffering. Next, we treat sequences with a fixed total length $n$ and study the binomial distribution of order $k$ with buffer $|\ell|$. The value $x$ of the number of success runs spans the interval $0 \le x \le \lfloor (n - \ell)/(k - \ell) \rfloor$. This is the same formula as for $\ell \ge 0$. We derive an expression for the probability mass function for the above distribution.

Most of the published literature for the case $\ell < 0$ has treated sequences of fixed length $n$. Inoue and Aki [21] published results for sequences of Markov trials. They derived an expression for the p.g.f. as a nested sum of multinomial terms [21]. Han and Aki [22] treated sequences of i.i.d. Bernoulli trials. They derived a recurrence relation for the p.g.f. [22].

The results for the negative binomial case (fixed $r$, variable $n$) are straightforward to derive for $\ell < 0$. The following results are stated without proof.

Proposition 19. For $\ell < 0$, the probability mass function for $r$ success runs of length $k$ with $\ell$-buffering satisfies the obvious identity
$$f_{r,k,\ell}(n) = f_{r,k,0}\big(n - (r-1)|\ell|\big).$$
The probability generating function is then given by
$$\varphi_{r,k,\ell}(s) = s^{(r-1)|\ell|}\, \varphi_{k,p}(s)^r.$$
The domain of convergence of the p.g.f. is clearly the same as in the case $\ell = 0$ (see (28)) and is $|s| < 1/\xi_1(k, p)$. It follows easily from (58) that the mean and variance are given by
$$\mu_{r,k,\ell} = r\,\mu_{k,p} + (r-1)|\ell|, \qquad \sigma^2_{r,k,\ell} = r\,\sigma^2_{k,p}.$$
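Proposition 19 reflects the fact that each buffer contributes a deterministic shift of $|\ell|$ trials, after which the process restarts afresh. A Monte Carlo sketch of the mean (with illustrative parameter values):

```python
# Sketch: with a buffer of b = |l| arbitrary trials after each completed run,
# the mean waiting time for r runs is r mu_k + (r-1) b (Proposition 19).
import random

def waiting_time_buffered(r, k, b, p, rng):
    n = runs = streak = wait = 0
    while runs < r:
        n += 1
        if wait > 0:               # inside the buffer: outcome is arbitrary
            rng.random()           # consume the trial
            wait -= 1
            continue
        streak = streak + 1 if rng.random() < p else 0
        if streak == k:
            runs += 1
            streak = 0
            if runs < r:
                wait = b           # enforce a gap of b trials
    return n

rng = random.Random(7)
k, p, r, b, N = 3, 0.5, 3, 4, 50_000
mu1 = (1 - p ** k) / ((1 - p) * p ** k)             # = 14
est = sum(waiting_time_buffered(r, k, b, p, rng) for _ in range(N)) / N
exact = r * mu1 + (r - 1) * b                        # = 50
assert abs(est - exact) < 0.5
print(f"simulated {est:.2f}, exact {exact:.2f}")
```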

For sequences of fixed length $n$, the analysis of the binomial distribution of order $k$ is nontrivial for $\ell < 0$. We first state the following obvious result for all $\ell$.

Remark 20. For any $\ell$, let $G_{n,k,\ell}(x)$ be the probability of attaining, after $n$ trials, $x$ or fewer success runs of length $k$ with $\ell$-overlapping for $\ell \ge 0$ or $\ell$-buffering for $\ell < 0$. Then clearly
$$G_{n,k,\ell}(x) = P\big(T_{x+1,k,\ell} > n\big) = 1 - \sum_{m=0}^{n} f_{x+1,k,\ell}(m).$$
Hence, for fixed $n$, the probability mass function for $N_{n,k,\ell}$ is given by
$$b_{n,k,\ell}(x) = G_{n,k,\ell}(x) - G_{n,k,\ell}(x-1) = \sum_{m=0}^{n} \big[f_{x,k,\ell}(m) - f_{x+1,k,\ell}(m)\big].$$

The above expression is valid for all $\ell$ but requires the summation of an infinite series. For $0 \le \ell \le k - 1$, (53) offers a more concise expression for $b_{n,k,\ell}(x)$. For $\ell < 0$, we can also derive a more concise expression for $b_{n,k,\ell}(x)$ as follows.

Proposition 21. For fixed $n$ and fixed $x \ge 0$, the probability mass function for the binomial distribution of order $k$ with $\ell$-buffering ($\ell < 0$) is given by
$$b_{n,k,\ell}(x) = \frac{f_{x+1,k,\ell}(n+k+1) - p^{k+1}\, f_{x,k,\ell}\big(n+1-|\ell|\big)}{q\, p^k} + \sum_{u=n+2-|\ell|}^{n} f_{x,k,\ell}(u).$$
Here $x \ge 0$. Note by definition that $f_{x,k,\ell}(m) = 0$ for $m < xk + (x-1)|\ell|$.

Proof. We omit the indices $k$ and $\ell$ in the following. Consider $f_{x+1}(t)$, where $x + 1$ success runs have taken place, ending at trial $t$. Hence the last $k$ outcomes are all successes. We then have the following mutually exclusive and exhaustive possibilities. (i) The outcome of trial $t - k$ is a success. Then the $x$th success run must end at trial $t - k - |\ell|$. The trials in the sequence from $t - k - |\ell| + 1$ through $t - k$ constitute the buffer between the two success runs. The probability of this event is $p^{k+1} f_x(t - k - |\ell|)$. (This event is null if $x = 0$. Note that the buffer trials before trial $t - k$ have arbitrary outcomes.) (ii) The outcome of trial $t - k$ is a failure. Then we must attain $x$ success runs by trial $t - k - 1$. However, we must subtract the possibility that the $x$th success run ends at one of the trials $t - k - |\ell| + 1$ through $t - k - 1$. (Note that if $|\ell| = 1$, this set is empty.) The probability of this event is $q p^k [\, b_{t-k-1}(x) - \sum_{u=t-k-|\ell|+1}^{t-k-1} f_x(u)\,]$. The events are mutually exclusive and exhaustive (noting that there can be at most one success run completed from trials $t - k - |\ell| + 1$ through $t - k - 1$, because of the buffering requirement); hence we add the probabilities to obtain
$$f_{x+1}(t) = p^{k+1} f_x\big(t - k - |\ell|\big) + q\, p^k \left[\, b_{t-k-1}(x) - \sum_{u=t-k-|\ell|+1}^{t-k-1} f_x(u) \right].$$
Rearranging terms and replacing $t$ by $n + k + 1$ yield (62). For $|\ell| = 1$, the last sum is absent and the above expression is the same as (53) evaluated at $\ell = -1$.

#### 4. Pairs of Successes Separated by At Most $k$ Failures

In this section we study a different problem. We treat the distribution of the waiting time for the nonoverlapping appearance of a pair of successes separated by at most $k$ failures ($k \ge 0$). Our main reference is the elegant analysis by Koutras [7], who also gives an excellent bibliography on the subject. To avoid cluttering the notation in this paper with too many symbols, we will reuse some of the symbols, such as $f(n)$ for the probability mass function, and so forth. It should be understood that we are treating a new problem, and the following notation is self-contained. We begin with the probability mass function. Koutras [7] gave a recurrence relation for the probability mass function $f_{k,p}(n)$. We will suppress the indices $k$ and $p$ unless required. We derive the exact solutions for the roots of the auxiliary polynomial associated with the recurrence relation, in terms of Fuss-Catalan numbers. We also derive various pertinent properties of the roots. We then solve a Vandermonde matrix system of equations to derive an expression for the p.m.f. as a sum over powers of the roots. We also derive an expression for the probability of the waiting time to exceed $n$ trials.

Let us denote the waiting time by $T_{k,p}$. Note that Koutras [7] employs a different symbol, but we write $T_{k,p}$ to maintain consistency with the notation in the earlier parts of our paper. We begin with the case $r = 1$ and drop the subscripts.
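The waiting-time distribution can be computed directly by dynamic programming over the "age of the pending success" (a reconstruction of the renewal argument; for $k = 0$ the pair is simply two consecutive successes, so the problem reduces to the geometric distribution of order 2 and Feller's mean applies):

```python
# Sketch: p.m.f. of the waiting time for a pair of successes separated by
# at most k failures, via a small Markov chain DP.
def pair_pmf(k, p, nmax):
    q = 1 - p
    # state None: no usable pending success; state d (0..k): last success
    # occurred with d failures since.
    dist = {None: 1.0}
    f = [0.0] * (nmax + 1)
    for n in range(1, nmax + 1):
        new = {}
        for state, w in dist.items():
            if state is None:
                new[0] = new.get(0, 0.0) + w * p       # fresh pending success
                new[None] = new.get(None, 0.0) + w * q
            else:
                f[n] += w * p                          # pair completed at n
                nxt = state + 1 if state < k else None  # pending success expires
                new[nxt] = new.get(nxt, 0.0) + w * q
        dist = new
    return f

p, nmax = 0.4, 400
q = 1 - p
# k = 0: pair = two consecutive successes; mean is (1 - p^2)/(q p^2) (Feller).
f0 = pair_pmf(0, p, nmax)
mean0 = sum(n * f0[n] for n in range(nmax + 1))
assert abs(mean0 - (1 - p ** 2) / (q * p ** 2)) < 1e-6

# consistency with the linear recurrence reconstructed in Remark 22 below:
# f(n) = q f(n-1) + p q^(k+1) f(n-k-2) for n >= k+3
k = 3
f = pair_pmf(k, p, nmax)
for n in range(k + 3, 60):
    assert abs(f[n] - (q * f[n - 1] + p * q ** (k + 1) * f[n - k - 2])) < 1e-14
print("pair-waiting-time DP consistent with the recurrence")
```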

Remark 22 (Koutras [7]). The probability mass function satisfies the recurrence relation [7, eq. 3.1], for $n \ge k + 3$,
$$f(n) = q\, f(n-1) + p\, q^{k+1}\, f(n-k-2).$$
The initial conditions are [7, eq. 3.2]
$$f(0) = f(1) = 0, \qquad f(n) = (n-1)\, p^2 q^{n-2}, \quad 2 \le n \le k + 2.$$

The auxiliary polynomial associated with the above recurrence relation is
$$\phi_{k,p}(x) = x^{k+2} - q\, x^{k+1} - p\, q^{k+1}.$$
The auxiliary equation is $\phi_{k,p}(x) = 0$.

Proposition 23 (properties of roots). For fixed $0 < p < 1$, the roots of the auxiliary polynomial have the following properties: (a) There are no repeated roots. (b) There is a unique positive real root. (c) The positive real root lies in $(q, 1)$. (d) If $k$ is odd, there are no other real roots. If $k$ is even, there is exactly one negative real root. (e) The magnitude of the positive real root exceeds that of all the other roots.

Proof. Both $\phi$ and $\phi'$ must vanish simultaneously at a repeated root. Next
$$\phi'(x) = (k+2)\,x^{k+1} - (k+1)\,q\,x^k = x^k\big[(k+2)x - (k+1)q\big].$$
Hence $\phi'$ vanishes at $x = 0$ (not a root of $\phi$) or $x_* = (k+1)q/(k+2)$. Note that $x_* < q$. Now for $0 < x \le x_*$,
$$x - q \le x_* - q = -\frac{q}{k+2} < 0.$$
Hence for $0 < x \le x_*$,
$$\phi(x) = x^{k+1}(x - q) - p\,q^{k+1} < 0.$$
Hence $\phi(x_*) \ne 0$. Hence $\phi$ has no repeated roots. Next note that $\phi(0) = -p q^{k+1} < 0$, $\phi(q) = -p q^{k+1} < 0$, and $\phi(1) = p(1 - q^{k+1}) > 0$. Hence $\phi$ has an odd number of positive real roots in $(q, 1)$. Now from (67), $\phi'(x) > 0$ for $x > x_*$, so $\phi$ is strictly increasing for $x > x_*$. It follows that $\phi$ has exactly one positive real root, and it lies in the interval $(q, 1)$. Also if $k$ is odd then $\phi(x) < 0$ for $x < 0$ and there are no negative real roots. If $k$ is even then $\phi(x)$ increases as $x$ decreases through negative values; hence for even $k$, $\phi$ has exactly one negative real root. Next, if $\nu$ is a root, by the triangle inequality,
$$|\nu|^{k+2} = \big|q\,\nu^{k+1} + p\,q^{k+1}\big| \le q\,|\nu|^{k+1} + p\,q^{k+1}.$$
Hence
$$\phi(|\nu|) \le 0.$$
The inequality is strict unless $\nu$ is real and positive (so that both $\nu^{k+1}$ and $\nu^{k+2}$ are real and positive), and we have shown that there is only one real positive root. Hence the real positive root has a larger magnitude than all the other roots.
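A numerical sketch of Proposition 23, using NumPy to extract the full (complex) root set of the auxiliary polynomial as reconstructed above:

```python
# Sketch: numerical check of Proposition 23 for
# phi(x) = x^(k+2) - q x^(k+1) - p q^(k+1).
import numpy as np

def check(k, p):
    q = 1 - p
    coeffs = [1.0, -q] + [0.0] * k + [-p * q ** (k + 1)]
    roots = np.roots(coeffs)
    # (a) no repeated roots
    for i, a in enumerate(roots):
        for b in roots[i + 1:]:
            assert abs(a - b) > 1e-8
    reals = [x.real for x in roots if abs(x.imag) < 1e-10]
    pos = [x for x in reals if x > 0]
    neg = [x for x in reals if x < 0]
    assert len(pos) == 1                          # (b) unique positive real root
    assert q < pos[0] < 1                         # (c) it lies in (q, 1)
    assert len(neg) == (1 if k % 2 == 0 else 0)   # (d) parity of k
    nu1 = pos[0]
    assert all(abs(x) < nu1 for x in roots if abs(x - nu1) > 1e-9)  # (e)

for k in (1, 2, 3, 6):
    for p in (0.2, 0.5, 0.8):
        check(k, p)
print("Proposition 23 verified numerically")
```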

We will call the positive real root the "principal root" and refer to all the other roots as "secondary roots." We will denote the roots by $\nu_j$, $j = 1, \dots, k+2$, where the principal root is $\nu_1$. Although the following analysis is for $0 < p < 1$, it is helpful to note the following limiting cases for $p = 0$ and $p = 1$.

Proposition 24 (limiting cases for roots). If $p = 0$, the principal root is $\nu_1 = 1$. If $p = 1$, the principal root is $\nu_1 = 0$. All the secondary roots vanish for both $p = 0$ and $p = 1$. None of the roots vanish if $0 < p < 1$.

Proof. We have already seen that $\phi(0) = -pq^{k+1}$; hence obviously $0$ is not a root if $0 < p < 1$. If $p = 1$, the auxiliary equation is $x^{k+2} = 0$; hence all the roots vanish. If $p = 0$, the auxiliary equation is $x^{k+1}(x - 1) = 0$, so one root is $1$ and the others are all $0$. Hence $\nu_1 = 1$ for $p = 0$ and $\nu_1 = 0$ for $p = 1$, and all the secondary roots vanish for both $p = 0$ and $p = 1$.

Proposition 25 (principal root decreases monotonically with increasing $p$). For fixed $k$, let $0 < p_1 < p_2 < 1$ and denote the respective principal roots by $\nu_1(p_1)$ and $\nu_1(p_2)$. Then $\nu_1(p_2) < \nu_1(p_1)$.

Proof. Note that the auxiliary polynomial can be expressed in the following alternative form:
$$\phi_{k,p}(x) = x^{k+2} - q\,x^{k+1} - q^{k+1} + q^{k+2}.$$
For brevity write $\nu = \nu_1(p_1)$. Then by definition
$$\phi_{k,p_1}(\nu) = 0.$$
Differentiating the alternative form with respect to $q$ at fixed $x = \nu$,
$$\frac{\partial \phi}{\partial q} = -\nu^{k+1} - q^k\big[(k+1) - (k+2)q\big] < 0 \qquad (0 < q < \nu),$$
because $q^k\big[(k+1) - (k+2)q\big] > -q^{k+1} > -\nu^{k+1}$. Then because $q_2 < q_1 < \nu$, it follows that $\phi_{k,p_2}(\nu) > \phi_{k,p_1}(\nu) = 0$, and so (because $\phi_{k,p_2}(x) < 0$ for $0 < x < \nu_1(p_2)$) it also follows that $\nu > \nu_1(p_2)$. Hence $\nu_1(p_2) < \nu_1(p_1)$.