The Scientific World Journal

Volume 2014 (2014), Article ID 940358, 13 pages

http://dx.doi.org/10.1155/2014/940358

## New Proofs of Some q-Summation and q-Transformation Formulas

Department of Mathematics, Chongqing Higher Education Mega Center, Chongqing Normal University, Huxi Campus, Chongqing 401331, China

Received 13 February 2014; Accepted 11 April 2014; Published 7 May 2014

Academic Editor: Kishin Sadarangani

Copyright © 2014 Xian-Fang Liu et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

#### Abstract

We obtain an expectation formula and give probabilistic proofs of some summation and transformation formulas of q-series based on our expectation formula. Although these formulas are not themselves probabilistic results, the proofs given rest on probabilistic concepts.

#### 1. Introduction

The probabilistic method is an important tool for deriving results in combinatorics, number theory, and other fields (see [1–15]). It has found many applications in the theory of basic hypergeometric series (or q-series). For example, Fulman [3] presented a probabilistic proof of the Rogers-Ramanujan identity using a Markov chain. Chapman [2] proved the Andrews-Gordon identity by extending Fulman's methods. Kadell [4] gave a probabilistic proof of Ramanujan's ${}_1\psi_1$ summation based on order statistics.

Recently, Wang [13, 14] constructed a random variable and introduced a new probability distribution. By applying this probability distribution, Wang proved the q-binomial theorem and the q-Gauss summation formula and also obtained some new summation and transformation formulas.

One of the most important concepts in probability theory is the expectation of a random variable. If $X$ is a discrete random variable with probability mass function $p$, then the *expectation*, or *expected value*, of $X$, denoted by $E[X]$, is defined by (e.g., [9, page 125])
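For completeness, the standard definition referred to here reads as follows, for a discrete random variable $X$ with mass function $p$:

```latex
E[X] \;=\; \sum_{x} x\, p(x),
```

where the sum runs over all values $x$ that $X$ attains with positive probability.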

In the following section we introduce some notation, definitions, and formulas from the theory of q-series. Throughout this paper we suppose $0 < q < 1$.

The q-shifted factorials are defined by $(a;q)_0 = 1$, $(a;q)_n = \prod_{k=0}^{n-1}(1-aq^k)$, and $(a;q)_\infty = \prod_{k=0}^{\infty}(1-aq^k)$. Clearly, $(a;q)_\infty = (a;q)_n (aq^n;q)_\infty$. The following is a compact notation for multiple q-shifted factorials: $(a_1, a_2, \ldots, a_m; q)_n = (a_1;q)_n (a_2;q)_n \cdots (a_m;q)_n$. The basic hypergeometric series, or q-series, is defined by (see [16, 17]) $${}_r\phi_s\!\left[\begin{matrix} a_1, \ldots, a_r \\ b_1, \ldots, b_s \end{matrix}; q, z\right] = \sum_{n=0}^{\infty} \frac{(a_1, \ldots, a_r; q)_n}{(q, b_1, \ldots, b_s; q)_n}\left[(-1)^n q^{\binom{n}{2}}\right]^{1+s-r} z^n.$$ Heine introduced the basic hypergeometric series $${}_2\phi_1(a, b; c; q, z) = \sum_{n=0}^{\infty} \frac{(a, b; q)_n}{(q, c; q)_n} z^n, \qquad |z| < 1.$$ Jackson defined the q-integral (see [17, 18]) by $$\int_0^d f(t)\, d_q t = d(1-q)\sum_{n=0}^{\infty} f(dq^n) q^n, \qquad \int_c^d f(t)\, d_q t = \int_0^d f(t)\, d_q t - \int_0^c f(t)\, d_q t.$$ The Andrews-Askey integral (11) (see [19]), which can be derived from Ramanujan's ${}_1\psi_1$ summation, evaluates a q-integral of this type in closed form, provided that there are no zero factors in the denominator of the integrals. Recently, Liu and Luo [20] generalized the Andrews-Askey integral to the following more general form.
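These definitions are easy to experiment with numerically. The following Python sketch (an illustration added here, not code from the paper) implements the q-shifted factorial and the Jackson q-integral by truncating the infinite product and sum, and checks two classical consequences:

```python
def qpoch(a, q, n):
    """Finite q-shifted factorial (a; q)_n = prod_{k=0}^{n-1} (1 - a q^k)."""
    p = 1.0
    for k in range(n):
        p *= 1.0 - a * q ** k
    return p

def qpoch_inf(a, q, terms=200):
    """Truncated approximation to (a; q)_inf; accurate for |q| < 1."""
    return qpoch(a, q, terms)

def jackson_integral(f, d, q, terms=400):
    """Jackson q-integral on [0, d]: d(1-q) * sum_{n>=0} f(d q^n) q^n."""
    return d * (1.0 - q) * sum(f(d * q ** n) * q ** n for n in range(terms))

q = 0.5
# identity (a; q)_inf = (a; q)_n (a q^n; q)_inf
a, n = 0.3, 4
assert abs(qpoch_inf(a, q) - qpoch(a, q, n) * qpoch_inf(a * q ** n, q)) < 1e-12
# classical evaluation: int_0^1 t d_q t = 1 / (1 + q)
assert abs(jackson_integral(lambda t: t, 1.0, q) - 1.0 / (1.0 + q)) < 1e-12
```

As $q \to 1^-$ the Jackson integral recovers the ordinary Riemann integral, which is why these q-integral evaluations specialize to classical ones.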

Lemma 1 (see [21, page 5, (2.5)]; [20, Theorem 1]). *One has the q-integral evaluation (12), provided that the parameters satisfy the stated convergence conditions and that there are no zero factors in the denominator of the integrals.*

Lemma 2 (see [21, page 5, (2.7)]). *One has a companion q-integral evaluation, provided that the parameters satisfy the stated convergence conditions and that there are no zero factors in the denominator of the integrals.*

The aim of the present paper is to establish an expectation formula and to give probabilistic proofs of a number of summation and transformation formulas of q-series based on it. In Section 2 we prove expectation formulas for the random variables under consideration. In Section 3 we give probabilistic proofs of transformation formulas for ${}_3\phi_2$ series. In Section 4 we give probabilistic proofs of Heine's and Jackson's transformations. In Section 5 we give probabilistic proofs of some further formulas of q-series, for example, the q-binomial theorem, the q-Chu-Vandermonde sums, the q-Gauss sum, the q-Kummer sum, Bailey's sum, and so forth.

#### 2. Main Theorem

In this section we obtain expectation formulas for some random variables; these formulas are very useful for proving the summation and transformation formulas of q-series.

Theorem 3. *Let $X$ denote a random variable with the probability distribution introduced above. Then one has the expectation formula (14), provided that the parameters satisfy the stated convergence conditions.*

*Proof.* Let $X$ be a random variable with the probability distribution introduced above. From definition (9) we have (15).
From definition (10), combined with (15), we have (16).
Using the probability distribution together with (16) and (12) of Lemma 1, we calculate the expectation of the random variable $X$ as follows: (17).
Hence, we obtain (14).
The proof is complete.

Theorem 4. *Let $X$ denote a random variable with the probability distribution introduced above. Then one has the expectation formula (19), provided that the parameters satisfy the stated convergence conditions.*

*Proof.* By (7) and (8) we have (20).
By (4) we have (21).
Substituting (20) and (21) into the right-hand side of (14), we obtain (22).
Relabeling the parameters and taking the appropriate limit in (22), we get (19).
The proof is complete.

Theorem 5. *Let $X$ denote a random variable with the probability distribution introduced above. Then one has the expectation formula (24), provided that the parameters satisfy the stated convergence conditions.*

*Proof.* Specializing the parameters in (14) of Theorem 3, we obtain (24).

Corollary 6 (see [13, page 463, Theorem 1]). *Let $X$ denote a random variable with the probability distribution introduced above. Then one has (25), provided that the parameters satisfy the stated convergence conditions.*

*Proof.* Using (24) of Theorem 5, we deduce (26).
Using (31) of Theorem 8, we have (27).
Substituting (27) into the right-hand side of (26), we obtain (25).
The proof is complete.

Corollary 7 (see [14, page 245, Lemma 2.4]). *Let $X$ denote a random variable with the probability distribution introduced above. Then one has (29), provided that the parameters satisfy the stated convergence conditions.*

*Proof.* Specializing the parameters (in either of two equivalent ways) in (14) of Theorem 3, we obtain (29).
The proof is complete.

#### 3. Probabilistic Proofs of Transformation Formulas of ${}_3\phi_2$

Sears’ transformation formula is widely used in the theory of special functions. In this section we give probabilistic proofs of transformation formulas for ${}_3\phi_2$ series.

Theorem 8 (see [17, page 359, III.9, III.10]). *One has the transformation formulas (31) and (32).*

*Proof.* Interchanging two of the parameters in (14), we have (33).
Interchanging a different pair of parameters in (14), we have (34).
By (14) and (33), we obtain (35),
and, relabeling the parameters in (35), we obtain a transformation formula (36).
By (14) and (34), and then relabeling the parameters, we obtain (31).

By (33) and (34), we have (37),
and, relabeling the parameters in (37), we obtain (32). The proof is complete.

#### 4. Probabilistic Proofs of Heine’s and Jackson’s Transformations

Heine [22] derived transformation formulas for ${}_2\phi_1$ series and also proved Euler’s transformation formula. A basic hypergeometric representation of a given function is by no means unique: there are groups of transformations between the various hypergeometric representations of the same function. We first prove the classical Heine transformation formula, which is useful in proving many other formulas. In this section we give probabilistic proofs of Heine’s and Jackson’s transformations.

Theorem 9 (see [17, page 359, III.1, III.2, III.3]). *Heine’s transformation formulas for ${}_2\phi_1$ series are (38), (39), and (40).*

*Proof.* Comparing (24) of Theorem 5 with (25) of Corollary 6, we obtain (41),
or, equivalently, (42).
Specializing the parameters in (42), we have (43).
Relabeling the parameters in (43), we get (44),
which is just (38).

Specializing and relabeling the parameters in (14), we have (45).
A second specialization of the parameters in (14) of Theorem 3 gives (46).
Comparing (45) and (46), we obtain (47).
Relabeling the parameters, we get (48).
Taking the appropriate limit in (48) gives (49).
This yields (39), and from (39) we can deduce (40).
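Heine’s first transformation, in standard notation, is ${}_2\phi_1(a,b;c;q,z) = \frac{(b, az; q)_\infty}{(c, z; q)_\infty}\, {}_2\phi_1(c/b, z; az; q, b)$ for $|z| < 1$, $|b| < 1$. As an illustration (not part of the paper's probabilistic argument), it can be checked numerically in Python with truncated series and products and arbitrary admissible parameter values:

```python
def qpoch(a, q, n):
    """(a; q)_n = prod_{k=0}^{n-1} (1 - a q^k); large n approximates (a; q)_inf."""
    p = 1.0
    for k in range(n):
        p *= 1.0 - a * q ** k
    return p

def phi21(a, b, c, q, z, terms=80):
    """Truncated Heine series 2phi1(a, b; c; q, z), for |z| < 1, |q| < 1."""
    return sum(qpoch(a, q, n) * qpoch(b, q, n)
               / (qpoch(q, q, n) * qpoch(c, q, n)) * z ** n
               for n in range(terms))

q, a, b, c, z = 0.3, 0.2, 0.4, 0.5, 0.35   # illustrative values only
INF = 200  # truncation order for the infinite products
lhs = phi21(a, b, c, q, z)
rhs = (qpoch(b, q, INF) * qpoch(a * z, q, INF)
       / (qpoch(c, q, INF) * qpoch(z, q, INF))) * phi21(c / b, z, a * z, q, b)
assert abs(lhs - rhs) < 1e-10
```

Iterating this transformation twice and three times yields the second and third Heine transformations, mirroring how (39) and (40) follow from (38).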

Jackson’s transformation formula is an important formula in basic hypergeometric series, and we now give a probabilistic proof of Jackson’s transformation formulas relating ${}_2\phi_1$ and ${}_2\phi_2$ series.

Theorem 10 (see [17, page 359, III.4]). *Jackson’s transformation of ${}_2\phi_1$ and ${}_2\phi_2$ series is (50).*

*Proof.* The proof employs two different evaluations of the same expectation.

Specializing the parameters in (14) of Theorem 3 and then relabeling, we get (51).
Comparing (51) with (19) of Theorem 4 gives (52).
Then we obtain (53).
Relabeling the parameters gives the transformation (50).
This completes the proof.
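In Gasper and Rahman's numbering (III.4), the transformation proved here reads ${}_2\phi_1(a,b;c;q,z) = \frac{(az;q)_\infty}{(z;q)_\infty}\, {}_2\phi_2(a, c/b; c, az; q, bz)$. A numerical sketch (illustrative values, truncated series; note the extra $(-1)^n q^{\binom{n}{2}}$ factor in a ${}_2\phi_2$ term) confirms it:

```python
def qpoch(a, q, n):
    """(a; q)_n; large n approximates (a; q)_inf for |q| < 1."""
    p = 1.0
    for k in range(n):
        p *= 1.0 - a * q ** k
    return p

def phi21(a, b, c, q, z, terms=80):
    """Truncated 2phi1(a, b; c; q, z)."""
    return sum(qpoch(a, q, n) * qpoch(b, q, n)
               / (qpoch(q, q, n) * qpoch(c, q, n)) * z ** n
               for n in range(terms))

def phi22(a, b, c, d, q, z, terms=80):
    """Truncated 2phi2(a, b; c, d; q, z) with the (-1)^n q^(n(n-1)/2) factor."""
    return sum(qpoch(a, q, n) * qpoch(b, q, n)
               / (qpoch(q, q, n) * qpoch(c, q, n) * qpoch(d, q, n))
               * (-1) ** n * q ** (n * (n - 1) // 2) * z ** n
               for n in range(terms))

q, a, b, c, z = 0.3, 0.2, 0.4, 0.5, 0.35   # illustrative values only
INF = 200
lhs = phi21(a, b, c, q, z)
rhs = qpoch(a * z, q, INF) / qpoch(z, q, INF) * phi22(a, c / b, c, a * z, q, b * z)
assert abs(lhs - rhs) < 1e-10
```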

#### 5. Probabilistic Proofs of Some Formulas of q-Series

The q-binomial theorem is an important mathematical result which has been widely applied in special functions, physics, quantum algebra, and quantum statistics. The nonterminating form of the q-binomial theorem goes back to Cauchy [23], Heine [22], and Jacobi [24]. There are many proofs of the q-binomial theorem in the corresponding references; for example, a short and simple proof using the method of finite differences was given by Askey (see [25]), and a nice proof based on combinatorial considerations was given by Joichi and Stanton (see [26]). In 1847, Heine [22] derived a q-analogue of Gauss’s summation formula, which is important in the theory of q-series. Joichi and Stanton [26] gave a bijective proof of the q-Gauss summation formula based on combinatorial considerations, while Rahman and Suslov [27] proved it by the method of first-order linear difference equations. By analytic continuation, the terminating case reduces to q-analogues of Vandermonde’s formula. Bailey and Daum independently discovered the q-Kummer summation formula.

In this section we give probabilistic proofs of some formulas of q-series, for example, the q-binomial theorem, the q-Chu-Vandermonde sums, the q-Gauss summation formula, the q-Kummer summation formula, and so forth.

Theorem 11 (see [16, page 488] and [17, page 354, II.3]). *The q-binomial theorem is (55).*

*Proof.* Below we give two proofs of (55).

Specializing the parameters in (14), we obtain (56).
Comparing (56) with (29) of Corollary 7, we have (57).
Then we obtain (58).
Relabeling the parameters, we get (59),
that is, the q-binomial theorem (55).

Another proof of the q-binomial theorem is as follows.

Specializing the parameters in (14), we obtain (61).
A further specialization of the parameters in (14) of Theorem 3 gives (62).
Comparing (61) and (62) gives (63).
Then we obtain (64).
Relabeling the parameters gives (65),
that is, the q-binomial theorem (55).
The proof is complete.
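The theorem just proved is the classical identity $\sum_{n \ge 0} \frac{(a;q)_n}{(q;q)_n} z^n = \frac{(az;q)_\infty}{(z;q)_\infty}$ for $|z| < 1$, $|q| < 1$. A short numerical check in Python (illustrative values, truncated sum and products):

```python
def qpoch(a, q, n):
    """(a; q)_n = prod_{k=0}^{n-1} (1 - a q^k)."""
    p = 1.0
    for k in range(n):
        p *= 1.0 - a * q ** k
    return p

def q_binomial_lhs(a, q, z, terms=80):
    """Partial sum of sum_{n>=0} (a; q)_n / (q; q)_n * z^n."""
    return sum(qpoch(a, q, n) / qpoch(q, q, n) * z ** n for n in range(terms))

q, a, z = 0.3, 0.6, 0.4   # illustrative values only
INF = 200
lhs = q_binomial_lhs(a, q, z)
rhs = qpoch(a * z, q, INF) / qpoch(z, q, INF)
assert abs(lhs - rhs) < 1e-10
```

Setting $a = 0$ or $a = q$ recovers Euler's two classical q-exponential-type identities as special cases.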

Theorem 12 (see [17, page 354, II.7]). *The q-Chu-Vandermonde sums are (67).*

*Proof.* Below are two proofs of the q-Chu-Vandermonde sums.

(i) First proof. Specializing the parameters in (14), we have (68).
Relabeling the parameters in (68), we have (69),
where (70) holds.
Hence, (71) follows.

Using the probability distribution and the Andrews-Askey q-integral (11), we calculate the expectation of the random variable as follows: (72).
Comparing (71) and (72) gives (73).
Then we obtain (74),
which is just the q-Chu-Vandermonde sum (67).

(ii) Second proof. Relabeling the parameters in (29), we have (75).
Comparing (71) and (75), we obtain (76).
Hence, (77) follows,
which is just the q-Chu-Vandermonde sum (67).
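One of the two sums in question reads, in standard form (II.7 of [17]), ${}_2\phi_1(q^{-n}, b; c; q, q) = \frac{(c/b;q)_n}{(c;q)_n}\, b^n$; since $(q^{-n};q)_k = 0$ for $k > n$, the series terminates. A terminating-sum check in Python (arbitrary illustrative parameters):

```python
def qpoch(a, q, n):
    """(a; q)_n = prod_{k=0}^{n-1} (1 - a q^k)."""
    p = 1.0
    for k in range(n):
        p *= 1.0 - a * q ** k
    return p

def chu_vandermonde_lhs(n, b, c, q):
    """Terminating 2phi1(q^{-n}, b; c; q, q): the sum stops at k = n."""
    qn = q ** (-n)
    return sum(qpoch(qn, q, k) * qpoch(b, q, k)
               / (qpoch(q, q, k) * qpoch(c, q, k)) * q ** k
               for k in range(n + 1))

n, b, c, q = 4, 0.4, 0.6, 0.3   # illustrative values only
lhs = chu_vandermonde_lhs(n, b, c, q)
rhs = qpoch(c / b, q, n) / qpoch(c, q, n) * b ** n
assert abs(lhs - rhs) < 1e-9
```

The modest tolerance reflects the cancellation among large alternating terms in the terminating sum.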

Theorem 13 (see [16, page 522] or [17, page 354, II.8]). *The q-Gauss sum is (78).*

*Proof.* Specializing the parameters in (14), we obtain (79).
Comparing (29) and (79) gives (80);
hence we get (81).
Relabeling the parameters in the above formula, we obtain (82),
which is just the q-Gauss sum (78).
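In standard notation the q-Gauss sum reads ${}_2\phi_1(a, b; c; q, c/ab) = \frac{(c/a, c/b; q)_\infty}{(c, c/ab; q)_\infty}$, valid for $|c/ab| < 1$. A numerical sketch with illustrative values (not values from the paper):

```python
def qpoch(a, q, n):
    """(a; q)_n; large n approximates (a; q)_inf for |q| < 1."""
    p = 1.0
    for k in range(n):
        p *= 1.0 - a * q ** k
    return p

def phi21(a, b, c, q, z, terms=80):
    """Truncated 2phi1(a, b; c; q, z)."""
    return sum(qpoch(a, q, n) * qpoch(b, q, n)
               / (qpoch(q, q, n) * qpoch(c, q, n)) * z ** n
               for n in range(terms))

q, a, b, c = 0.3, 0.9, 0.8, 0.2   # illustrative values only
z = c / (a * b)          # the q-Gauss argument; here |z| < 1 as required
INF = 200
lhs = phi21(a, b, c, q, z)
rhs = (qpoch(c / a, q, INF) * qpoch(c / b, q, INF)
       / (qpoch(c, q, INF) * qpoch(z, q, INF)))
assert abs(lhs - rhs) < 1e-10
```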

Theorem 14 (see [17, page 354, II.9]). *The q-Kummer (Bailey-Daum) summation formula is (83).*

*Proof.* Specializing the parameters in (14) and then relabeling, we have (84).
Relabeling the parameters in (84), we write (85),
where (86) holds.
Hence, we obtain (87).
Using the probability distribution and Lemma 2, we calculate the expectation of the random variable as follows: (88).
Comparing (87) and (88), we have (89).
Using Heine’s transformation and the q-binomial theorem, we have (90).
Hence, we obtain (83).
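The Bailey-Daum (q-Kummer) sum in standard form is ${}_2\phi_1(a, b; aq/b; q, -q/b) = \frac{(-q;q)_\infty (aq;q^2)_\infty (aq^2/b^2;q^2)_\infty}{(-q/b;q)_\infty (aq/b;q)_\infty}$ for $|q/b| < 1$; note that two of the products run in base $q^2$. A hedged numerical sketch (illustrative values):

```python
def qpoch(a, q, n):
    """(a; q)_n; pass q*q as the base to get (a; q^2)_n products."""
    p = 1.0
    for k in range(n):
        p *= 1.0 - a * q ** k
    return p

def phi21(a, b, c, q, z, terms=80):
    """Truncated 2phi1(a, b; c; q, z)."""
    return sum(qpoch(a, q, n) * qpoch(b, q, n)
               / (qpoch(q, q, n) * qpoch(c, q, n)) * z ** n
               for n in range(terms))

q, a, b = 0.3, 0.2, 0.7   # illustrative values; |q/b| < 1 holds
INF = 200
lhs = phi21(a, b, a * q / b, q, -q / b)
rhs = (qpoch(-q, q, INF) * qpoch(a * q, q * q, INF)
       * qpoch(a * q * q / (b * b), q * q, INF)
       / (qpoch(-q / b, q, INF) * qpoch(a * q / b, q, INF)))
assert abs(lhs - rhs) < 1e-10
```

Letting $a \to 0$ reduces both sides to the q-binomial evaluation $(-q;q)_\infty / (-q/b;q)_\infty$, a quick consistency check on the base-$q^2$ factors.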

Theorem 15 (see [17, page 354, II.10]). *Bailey’s summation formula is (91).*

*Proof.* By (19), we have (92).
Relabeling the parameters in (92) gives (93),
where (94) holds.
Hence, we have (95).

Using the probability distribution and Lemma 2, we calculate the expectation of the random variable as follows: (96).
Comparing (95) and (96), we have (97).
Hence, we get (91).

Theorem 16 (see [17, page 354, II.11]). *The Gauss sum formula is (98).*

*Proof.* By (14), we have