#### Abstract

We investigate polynomials, called *m*-polynomials, whose generator polynomial has coefficients that can be arranged in a square matrix; in particular, the case where this matrix is a Hadamard matrix is considered. Orthogonality relations and recurrence relations are established, and coefficients for the expansion of any polynomial in terms of *m*-polynomials are obtained. We conclude this paper with a Mathematica implementation of *m*-polynomials and of some of the results obtained for them.

#### 1. Introduction

Matrices have been the subject of much study, and large bodies of results have been obtained about them. We study the interplay between the theory of matrices and the theory of orthogonal polynomials. For Krawtchouk polynomials, introduced in [1], interesting results have been obtained in [2–4]; also see the review article [5] and compare [6] for generalized Krawtchouk polynomials. More recently, conditions for the existence of integral zeros of binary Krawtchouk polynomials have been obtained in [7], while properties for generalized Krawtchouk polynomials can be found in [8]. Other generalizations of binary Krawtchouk polynomials have also been considered; for example, some properties of binary Krawtchouk polynomials have been generalized to *q*-Krawtchouk polynomials in [9]. Orthogonality relations for quantum and affine *q*-Krawtchouk polynomials have been derived in [10], and it has been shown that affine *q*-Krawtchouk polynomials are dual to quantum *q*-Krawtchouk polynomials. In this paper, we define and study generalizations of Krawtchouk polynomials, namely, *m*-polynomials.

The Krawtchouk polynomial is given by
$$K_k(x) = \sum_{j=0}^{k} (-1)^j \binom{x}{j} \binom{n-x}{k-j},$$
where $n$ is a natural number and $0 \le x \le n$. The generator polynomial is
$$\sum_{k=0}^{n} K_k(x)\, z^k = (1+z)^{n-x} (1-z)^x.$$
The generalized Krawtchouk polynomial is obtained by generalizing the above generator polynomial as follows:
$$\prod_{i=0}^{q-1} \Bigl(\sum_{j=0}^{q-1} \chi(\omega_i \omega_j)\, z_j\Bigr)^{x_i},$$
where $x_0 + \dots + x_{q-1} = n$, $q$ is a prime power, the $z_j$ are indeterminate, the field with $q$ elements is $\mathbb{F}_q = \{\omega_0, \dots, \omega_{q-1}\}$, and $\chi$ is a character.

The above information about Krawtchouk polynomials and generalized Krawtchouk polynomials was taken from [6].
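The binary case can be made concrete with a small sketch: the Krawtchouk polynomial $K_k(x; n)$ is exactly the coefficient of $z^k$ in $(1+z)^{n-x}(1-z)^x$. The following Python illustration (ours, not from [6]; the helper names are hypothetical) checks this for a small $n$.

```python
from math import comb

def krawtchouk(k, x, n):
    # Binary Krawtchouk polynomial: K_k(x; n) = sum_j (-1)^j C(x,j) C(n-x, k-j).
    return sum((-1) ** j * comb(x, j) * comb(n - x, k - j) for j in range(k + 1))

def generator_coeffs(x, n):
    # Coefficients of the generator polynomial (1 + z)^(n-x) * (1 - z)^x,
    # i.e. the list [K_0(x), K_1(x), ..., K_n(x)].
    coeffs = [1]
    for b in [1] * (n - x) + [-1] * x:   # multiply by (1 + b*z) one factor at a time
        new = [0] * (len(coeffs) + 1)
        for i, c in enumerate(coeffs):
            new[i] += c                   # the "1" part of the factor
            new[i + 1] += b * c           # the "b*z" part of the factor
        coeffs = new
    return coeffs

n = 6
for x in range(n + 1):
    assert generator_coeffs(x, n) == [krawtchouk(k, x, n) for k in range(n + 1)]
```

This is the one-variable prototype of the multivariate generator construction studied below.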

If we replace these character values by arbitrary scalars in the last equation, we obtain the generator polynomial of *m*-polynomials; see Definition 2 below. These *m*-polynomials are the subject of study in this paper.

In Section 2, we present relevant notations and definitions. In Section 3, we introduce the generator polynomial. The associated matrix of coefficients can be any square matrix, and so the question that immediately arises is how the properties of the *m*-polynomials are related to the properties of this matrix. We will establish that, if the matrix is a generalized Hadamard matrix, then the associated *m*-polynomials satisfy orthogonality conditions. In Section 4, we establish recurrence relations for *m*-polynomials. Afterwards, we obtain coefficients for the expansion of a polynomial in terms of *m*-polynomials in Section 5. Finally, in Section 6, we implement the results obtained here in Mathematica, so the reader may easily derive and explore *m*-polynomials for any matrix.

#### 2. Definitions and Notations

In this paper, $\mathbb{N}_0$ denotes the set $\{0, 1, 2, \dots\}$ of nonnegative integers. We use the convention that if $v \in \mathbb{N}_0^q$, then $v$ has components $v_0, \dots, v_{q-1}$; so, if $q = 2$, then $v$ denotes $(v_0, v_1)$. We will also use the *elementary unit vectors* $e_i$, $0 \le i \le q-1$.

We use the $\ell_1$-norm (the “taxicab metric”) to measure the *length* of $v$, that is, $\|v\|_1 = v_0 + \dots + v_{q-1}$. We define the *set of weak compositions of $n$ into $q$ numbers* by $C_n^q = \{v \in \mathbb{N}_0^q : \|v\|_1 = n\}$; in other words, $C_n^q$ is the subset of $q$-dimensional nonnegative vectors of length $n$. We note that the set $C_n^q$ has cardinality $\binom{n+q-1}{q-1}$. For $v \in C_n^q$, we use the multi-index notation $\binom{n}{v} = \frac{n!}{v_0! \cdots v_{q-1}!}$. Similarly, for a variable $z = (z_0, \dots, z_{q-1})$ and $v \in C_n^q$, we write $z^v = \prod_i z_i^{v_i}$ (where the convention $0^0 = 1$ is used). We note that the multinomial theorem reads
$$(z_0 + \dots + z_{q-1})^n = \sum_{v \in C_n^q} \binom{n}{v}\, z^v.$$
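These notions are easy to test numerically. The Python sketch below (an illustration; `weak_compositions` and `multinomial` are our helper names) enumerates the weak compositions of $n$ into $q$ numbers, checks the stated cardinality, and verifies the multinomial theorem at a sample point.

```python
from math import comb, factorial

def weak_compositions(n, q):
    # All q-tuples of nonnegative integers summing to n.
    if q == 1:
        yield (n,)
        return
    for first in range(n + 1):
        for rest in weak_compositions(n - first, q - 1):
            yield (first,) + rest

def multinomial(p):
    # Multi-index coefficient n! / (p_0! p_1! ...), with n = sum(p).
    out = factorial(sum(p))
    for pi in p:
        out //= factorial(pi)
    return out

n, q = 5, 3
comps = list(weak_compositions(n, q))
assert len(comps) == comb(n + q - 1, q - 1)   # cardinality C(n+q-1, q-1)

z = (2, 3, 4)
lhs = sum(multinomial(p) * z[0]**p[0] * z[1]**p[1] * z[2]**p[2] for p in comps)
assert lhs == sum(z) ** n                     # multinomial theorem
```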

In the following, $g$ denotes an arbitrary $q \times q$ matrix. We use the following convention to refer to the entries of a matrix [11]. The entry in the $i$th row and $j$th column is called the $(i,j)$ entry, where $0 \le i, j \le q-1$. Thus, the $(i,j)$ entry of $g$ is denoted by $g_{ij}$, where $0 \le i, j \le q-1$. Given a matrix $g$, the matrix that remains when the first row and the first column of $g$ are removed is called the *core* of $g$.

The next definition is well known.

*Definition 1. *For an integer $q$ greater than one, a $q \times q$ matrix $h$ is called a *generalized Hadamard matrix* if $h h^* = q\, I_q$, where $h^*$ is the complex conjugate transpose of $h$ and $I_q$ is the $q \times q$ identity matrix.
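As a concrete check of this definition (a Python sketch; the helper names are ours): the $q$-point Fourier matrix is a classical example of a generalized Hadamard matrix, as is the $2 \times 2$ matrix {{1, 1}, {1, -1}} used in Section 6.

```python
import cmath

def fourier_matrix(q):
    # F[j][k] = omega^(j*k) with omega = exp(2*pi*i/q); satisfies F F* = q I.
    w = cmath.exp(2j * cmath.pi / q)
    return [[w ** (j * k) for k in range(q)] for j in range(q)]

def is_generalized_hadamard(h, tol=1e-9):
    # Check h h* = q I entrywise (h* is the conjugate transpose of h).
    q = len(h)
    for i in range(q):
        for k in range(q):
            s = sum(h[i][j] * complex(h[k][j]).conjugate() for j in range(q))
            if abs(s - (q if i == k else 0)) > tol:
                return False
    return True

assert is_generalized_hadamard([[1, 1], [1, -1]])
assert is_generalized_hadamard(fourier_matrix(5))
assert not is_generalized_hadamard([[1, 1], [1, 1]])
```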

We now define the *m*-polynomial with respect to a matrix $g$.

*Definition 2. *Let $p$ and $s$ be elements of $C_n^q$. *The* *m*-polynomial in $p$ with respect to $g$ having parameter $s$ is denoted by $m_g(p, s)$ or $m(p, s)$ and given by
$$m_g(p, s) = \sum \prod_{i=0}^{q-1} \binom{s_i}{a_{i0}, \dots, a_{i,q-1}} \prod_{i,j=0}^{q-1} g_{ij}^{a_{ij}},$$
where the summation is taken over all sets of nonnegative integers $a_{ij}$ (with $0 \le i, j \le q-1$) satisfying
$$\sum_{j=0}^{q-1} a_{ij} = s_i \quad (0 \le i \le q-1) \qquad \text{and} \qquad \sum_{i=0}^{q-1} a_{ij} = p_j \quad (0 \le j \le q-1).$$
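Equivalently, $m_g(p, s)$ is the coefficient of $z^p$ in $\prod_i \bigl(\sum_j g_{ij} z_j\bigr)^{s_i}$ (this is the content of Theorem 3 below). The following Python sketch (our illustration; `m_poly` is a hypothetical helper name) computes the values that way, multiplying the linear forms one at a time.

```python
def m_poly(g, p, s):
    # Coefficient of z^p in prod_i (sum_j g[i][j] * z_j)^(s[i]).
    q = len(g)
    poly = {(0,) * q: 1}                  # multi-index exponent -> coefficient
    for i in range(q):
        for _ in range(s[i]):             # multiply by the i-th linear form s[i] times
            new = {}
            for mono, c in poly.items():
                for j in range(q):
                    key = tuple(m + (1 if k == j else 0) for k, m in enumerate(mono))
                    new[key] = new.get(key, 0) + c * g[i][j]
            poly = new
    return poly.get(tuple(p), 0)

g = [[1, 1], [1, -1]]                     # the matrix used in Section 6
# For this g the values reduce to binary Krawtchouk numbers; e.g. for n = 2:
assert m_poly(g, (2, 0), (2, 0)) == 1     # (z0+z1)^2 -> coefficient of z0^2
assert m_poly(g, (1, 1), (2, 0)) == 2
assert m_poly(g, (1, 1), (1, 1)) == 0     # (z0+z1)(z0-z1) = z0^2 - z1^2
assert m_poly(g, (0, 2), (0, 2)) == 1
```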

#### 3. The Generator Polynomial and Orthogonality Relations

The values of the *m*-polynomial with respect to a matrix $g$ can be derived from a generator polynomial.

Theorem 3 (generator polynomial). *Let $s \in C_n^q$ and let $g$ be a $q \times q$ matrix. Then,
$$\prod_{i=0}^{q-1} \Bigl(\sum_{j=0}^{q-1} g_{ij}\, z_j\Bigr)^{s_i} = \sum_{p \in C_n^q} m_g(p, s)\, z^p, \tag{7}$$
where $z = (z_0, \dots, z_{q-1})$.*

*Proof. *This is an application of the multinomial theorem (recall that for $0 \le i \le q-1$, we have $\bigl(\sum_j g_{ij} z_j\bigr)^{s_i} = \sum \binom{s_i}{a_{i0}, \dots, a_{i,q-1}} \prod_j (g_{ij} z_j)^{a_{ij}}$, the sum being over all nonnegative integers $a_{i0}, \dots, a_{i,q-1}$ with $\sum_j a_{ij} = s_i$): multiplying these expansions for $i = 0, \dots, q-1$ and collecting, for each $p \in C_n^q$, the terms with $z$-exponent $p$ yields exactly the sum in Definition 2, where $\sum_i a_{ij} = p_j$.

As an immediate consequence, we can recover the entries of the matrix $g$ as the *m*-polynomials of minimum order.

Corollary 4. *We have $m_g(e_j, e_i) = g_{ij}$.*

*Proof. *For $s = e_i$, the left-hand side of (7) becomes
$$\sum_{j=0}^{q-1} g_{ij}\, z_j,$$
and since $n = 1$ and thus $p$ runs through $e_0, \dots, e_{q-1}$ in this case, the rest follows by comparing coefficients of $z_j$.

*Remark 5. *Theorem 3 can also be used for summation results of *m*-polynomials over $p$: using $z = (1, \dots, 1)$, we obtain
$$\sum_{p \in C_n^q} m_g(p, s) = \prod_{i=0}^{q-1} \Bigl(\sum_{j=0}^{q-1} g_{ij}\Bigr)^{s_i},$$
that is, the product of the $s_i$th powers of the row sums of $g$.

As an immediate result of this remark, we can establish the following corollary.

Corollary 6. *If $\sum_{j=0}^{q-1} g_{ij} = 0$ for one $i$ with $s_i > 0$ (i.e., one of the row sums of the matrix $g$ is zero), then $\sum_{p \in C_n^q} m_g(p, s) = 0$.*
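Both statements are easy to confirm numerically. In the Python sketch below (an illustration; the helper names are ours), the summation identity of Remark 5 is checked for the matrix used in Section 6, whose second row sums to zero, so Corollary 6 applies whenever that row's parameter component is positive.

```python
def weak_comps(n, q):
    # All weak compositions of n into q nonnegative parts.
    if q == 1:
        return [(n,)]
    return [(f,) + r for f in range(n + 1) for r in weak_comps(n - f, q - 1)]

def m_poly(g, p, s):
    # Coefficient of z^p in prod_i (sum_j g[i][j] * z_j)^(s[i]).
    q = len(g)
    poly = {(0,) * q: 1}
    for i in range(q):
        for _ in range(s[i]):
            new = {}
            for mono, c in poly.items():
                for j in range(q):
                    key = tuple(m + (1 if k == j else 0) for k, m in enumerate(mono))
                    new[key] = new.get(key, 0) + c * g[i][j]
            poly = new
    return poly.get(tuple(p), 0)

g, n = [[1, 1], [1, -1]], 4
q = len(g)
for s in weak_comps(n, q):
    total = sum(m_poly(g, p, s) for p in weak_comps(n, q))
    expected = 1
    for i in range(q):
        expected *= sum(g[i]) ** s[i]   # product of s_i-th powers of the row sums
    assert total == expected            # summation identity of Remark 5
    if s[1] > 0:                        # row 1 of g sums to zero ...
        assert total == 0               # ... so the sum vanishes (Corollary 6)
```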

For a generalized Hadamard matrix $g$, the multinomial theorem yields the following orthogonality relation for the corresponding *m*-polynomials.

Theorem 7 (orthogonality relation). *If $g$ is a generalized Hadamard matrix, then the *m*-polynomials $m_g(\cdot\,, x)$, with $x \in C_n^q$, satisfy the orthogonality relations
$$\sum_{x \in C_n^q} \binom{n}{x}\, m_g(s, x)\, \overline{m_g(t, x)} = q^n \binom{n}{s}\, \delta_{s,t}, \qquad s, t \in C_n^q,$$
where
$$\delta_{s,t} = \begin{cases} 1 & \text{if } s = t, \\ 0 & \text{otherwise} \end{cases}$$
denotes Kronecker’s delta.*

*Proof. *Let $z = (z_0, \dots, z_{q-1})$, $w = (w_0, \dots, w_{q-1})$, and define
$$F(z, w) = \Biggl(\sum_{i=0}^{q-1} \Bigl(\sum_{j=0}^{q-1} g_{ij}\, z_j\Bigr)\Bigl(\sum_{k=0}^{q-1} \overline{g_{ik}}\, w_k\Bigr)\Biggr)^{\!n}.$$
Then, on the one hand, we have
$$\sum_{i=0}^{q-1} \Bigl(\sum_{j} g_{ij}\, z_j\Bigr)\Bigl(\sum_{k} \overline{g_{ik}}\, w_k\Bigr) = \sum_{j,k} z_j w_k \sum_{i} g_{ij}\, \overline{g_{ik}} = q \sum_{j} z_j w_j,$$
since $g$ is a generalized Hadamard matrix. Consequently,
$$F(z, w) = q^n \Bigl(\sum_{j} z_j w_j\Bigr)^{\!n} = q^n \sum_{s \in C_n^q} \binom{n}{s}\, z^s w^s.$$
On the other hand, the multinomial theorem yields
$$F(z, w) = \sum_{x \in C_n^q} \binom{n}{x} \prod_{i=0}^{q-1} \Bigl(\sum_{j} g_{ij}\, z_j\Bigr)^{x_i} \Bigl(\sum_{k} \overline{g_{ik}}\, w_k\Bigr)^{x_i} = \sum_{x \in C_n^q} \binom{n}{x} \Bigl(\sum_{s} m_g(s, x)\, z^s\Bigr)\Bigl(\sum_{t} \overline{m_g(t, x)}\, w^t\Bigr).$$

Equating coefficients of $z^s w^t$ in the two above expressions for $F(z, w)$, we obtain the desired result.
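A direct numerical check of an orthogonality relation of this shape is easy: for the real Hadamard matrix $g = \{\{1,1\},\{1,-1\}\}$ from Section 6 (where conjugation is trivial), the weighted sums $\sum_x \binom{n}{x}\, m_g(s,x)\, m_g(t,x)$ collapse to $q^n \binom{n}{s}\, \delta_{s,t}$. A Python sketch (the helper names `weak_comps`, `m_poly`, and `multinomial` are ours):

```python
from math import factorial

def weak_comps(n, q):
    # All weak compositions of n into q nonnegative parts.
    if q == 1:
        return [(n,)]
    return [(f,) + r for f in range(n + 1) for r in weak_comps(n - f, q - 1)]

def m_poly(g, p, s):
    # Coefficient of z^p in prod_i (sum_j g[i][j] * z_j)^(s[i]).
    q = len(g)
    poly = {(0,) * q: 1}
    for i in range(q):
        for _ in range(s[i]):
            new = {}
            for mono, c in poly.items():
                for j in range(q):
                    key = tuple(m + (1 if k == j else 0) for k, m in enumerate(mono))
                    new[key] = new.get(key, 0) + c * g[i][j]
            poly = new
    return poly.get(tuple(p), 0)

def multinomial(p):
    out = factorial(sum(p))
    for x in p:
        out //= factorial(x)
    return out

g, n = [[1, 1], [1, -1]], 3     # real generalized Hadamard matrix, q = 2
q = len(g)
C = weak_comps(n, q)
for s in C:
    for t in C:
        inner = sum(multinomial(x) * m_poly(g, s, x) * m_poly(g, t, x) for x in C)
        assert inner == (q ** n * multinomial(s) if s == t else 0)
```

For a complex generalized Hadamard matrix, the second factor would be conjugated as in the theorem.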

If $g$ is a generalized Hadamard matrix and also satisfies certain additional conditions, then it is possible to establish that the corresponding *m*-polynomials satisfy additional orthogonality conditions. We use the following three results in proving this.

Lemma 8. *If $g$ is symmetric, that is, $g_{ij} = g_{ji}$, then $\binom{n}{s}\, m_g(p, s) = \binom{n}{p}\, m_g(s, p)$.*

*Proof. *By Definition 2, we have (where the $a_{ij}$ satisfy $\sum_j a_{ij} = s_i$ and $\sum_i a_{ij} = p_j$)
$$\binom{n}{s}\, m_g(p, s) = \sum \frac{n!}{\prod_{i,j} a_{ij}!} \prod_{i,j} g_{ij}^{a_{ij}} = \sum \frac{n!}{\prod_{i,j} b_{ij}!} \prod_{i,j} g_{ij}^{b_{ij}} = \binom{n}{p}\, m_g(s, p),$$
where we define $b_{ij} = a_{ji}$ (and thus $\sum_j b_{ij} = p_i$ and $\sum_i b_{ij} = s_j$), and $\prod_{i,j} g_{ij}^{a_{ij}} = \prod_{i,j} g_{ij}^{b_{ij}}$ since $g$ is symmetric (i.e., $g_{ij} = g_{ji}$).

The following results can be obtained in a similar manner to Lemma 8.

Lemma 9. *If $g$ has a symmetric core, for and for , then .*

Such symmetry relations yield additional orthogonality relations.

Theorem 10 (additional orthogonality relation). *Let $g$ be a generalized Hadamard matrix.* (i) *If in addition $g$ is symmetric, then
$$\sum_{x \in C_n^q} m_g(x, s)\, \overline{m_g(t, x)} = q^n\, \delta_{s,t}.$$*
(ii) *If in addition $g$ has a symmetric core, with first row and first column as in Lemma 9, then
$$\sum_{x \in C_n^q} (-1)^{x_0 + s_0}\, m_g(x, s)\, \overline{m_g(t, x)} = q^n\, \delta_{s,t}.$$*

*Proof. *This follows immediately from Theorem 7 and Lemmas 8 and 9.

*Remark 11. *It turns out that a slight modification of the above *m*-polynomials has already been considered in [12, 13]. For a matrix and , [13, Equation (2.3)] defines polynomials by in our notation. For these polynomials, the following multiplication property is proved (see [13, Theorem 3.2]):
for matrices , . With , and thus , Theorem 7 becomes a special case of this equation by noting that (see [13, Equation (4.5)]) and (using the notation of Definition 2, the polynomial on the left can only be nonzero if whenever and otherwise).

#### 4. Recurrence Relations and Summation of Certain Polynomials

We use Theorem 3 to derive a recurrence relation expressing *m*-polynomials $m_g(p, s)$, where both $p, s \in C_{n+1}^q$, by a sum of *m*-polynomials $m_g(p', s')$ where both $p', s' \in C_n^q$.

Theorem 12 (recurrence relations I). *Let $p \in C_{n+1}^q$ and $s' = s + e_i \in C_{n+1}^q$ for one $i$, with $s \in C_n^q$. Then,
$$m_g(p, s + e_i) = \sum_{j} g_{ij}\, m_g(p - e_j, s), \tag{21}$$
where the sum on the right is taken over all $j$ such that the $j$th component of $p$ is positive.*

*Proof. *Using (7) twice, we obtain (if $s' = s + e_i$)
$$\sum_{p \in C_{n+1}^q} m_g(p, s + e_i)\, z^p = \Bigl(\sum_{j=0}^{q-1} g_{ij}\, z_j\Bigr) \sum_{p' \in C_n^q} m_g(p', s)\, z^{p'}.$$
Multiplying the right-hand side out and equating coefficients of $z^p$ establish the claim.
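The recurrence can be exercised numerically as well: in the Python sketch below (helper names ours), $m_g(p, s + e_i)$ is compared against $\sum_{j:\, p_j > 0} g_{ij}\, m_g(p - e_j, s)$ for every admissible triple.

```python
def weak_comps(n, q):
    # All weak compositions of n into q nonnegative parts.
    if q == 1:
        return [(n,)]
    return [(f,) + r for f in range(n + 1) for r in weak_comps(n - f, q - 1)]

def m_poly(g, p, s):
    # Coefficient of z^p in prod_i (sum_j g[i][j] * z_j)^(s[i]).
    q = len(g)
    poly = {(0,) * q: 1}
    for i in range(q):
        for _ in range(s[i]):
            new = {}
            for mono, c in poly.items():
                for j in range(q):
                    key = tuple(m + (1 if k == j else 0) for k, m in enumerate(mono))
                    new[key] = new.get(key, 0) + c * g[i][j]
            poly = new
    return poly.get(tuple(p), 0)

g, n = [[1, 1], [1, -1]], 3
q = len(g)
for i in range(q):
    for s in weak_comps(n, q):
        s_plus = tuple(sv + (1 if k == i else 0) for k, sv in enumerate(s))
        for p in weak_comps(n + 1, q):
            rhs = sum(g[i][j] * m_poly(g, tuple(pv - (1 if k == j else 0)
                                                for k, pv in enumerate(p)), s)
                      for j in range(q) if p[j] > 0)
            assert m_poly(g, p, s_plus) == rhs   # recurrence relation I
```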

Theorem 13 (recurrence relations II). *Let $p \in C_{n+1}^q$ with $p_k > 0$ for one $k$, and let $s \in C_{n+1}^q$. Then,
$$p_k\, m_g(p, s) = \sum_{i} s_i\, g_{ik}\, m_g(p - e_k, s - e_i), \tag{23}$$
where the sum on the right is taken over all $i$ such that the $i$th component of $s$ is positive.*

*Proof. *We note that we have
$$\frac{\partial}{\partial z_k} \Bigl(\sum_{j} g_{ij}\, z_j\Bigr)^{s_i} = s_i\, g_{ik} \Bigl(\sum_{j} g_{ij}\, z_j\Bigr)^{s_i - 1}.$$
Thus, on the one hand, the derivative of the left-hand side of (7) with respect to $z_k$ is
$$\sum_{i :\, s_i > 0} s_i\, g_{ik} \Bigl(\sum_{j} g_{ij}\, z_j\Bigr)^{s_i - 1} \prod_{i' \ne i} \Bigl(\sum_{j} g_{i'j}\, z_j\Bigr)^{s_{i'}}.$$
Using (7), this is equal to
$$\sum_{i :\, s_i > 0} s_i\, g_{ik} \sum_{p' \in C_n^q} m_g(p', s - e_i)\, z^{p'}.$$
On the other hand, the derivative of the right-hand side of (7) with respect to $z_k$ is
$$\sum_{p :\, p_k > 0} p_k\, m_g(p, s)\, z^{p - e_k}.$$
Equating the last two expressions and interchanging the order of summation yield equal coefficients for each monomial; by comparing coefficients of $z^{p - e_k}$, we must have $p' = p - e_k$, and therefore (noting that $p - e_k \in C_n^q$)
$$p_k\, m_g(p, s) = \sum_{i :\, s_i > 0} s_i\, g_{ik}\, m_g(p - e_k, s - e_i).$$
We note that (21) and (23) can be rewritten as respectively.

In particular, Theorem 12 can be easily iterated to obtain the following statements.

Proposition 14. *Let $s \in C_n^q$ and $p \in C_{n+r}^q$ with $r \in \mathbb{N}$, and fix one $i$. Then,
$$m_g(p, s + r e_i) = \sum_{v} \binom{r}{v} \Bigl(\prod_{j=0}^{q-1} g_{ij}^{v_j}\Bigr) m_g(p - v, s),$$
where the sum on the right is taken over all $v \in C_r^q$ such that $p - v$ is in $C_n^q$.*

*Proof. *This follows by repeatedly applying (21) in Theorem 12. The multinomial coefficient $\binom{r}{v}$ arises as the number of ways $p$ can be changed to $p - v$ step-by-step.

Corollary 15. *Let $s \in C_n^q$ with $s_i \ge 0$ for all $i$. Then,
$$m_g(p, s) = \sum \prod_{i=0}^{q-1} \binom{s_i}{v^{(i)}} \prod_{j=0}^{q-1} g_{ij}^{v^{(i)}_j},$$
where the sum on the right is taken over all the vectors $v^{(0)}, \dots, v^{(q-1)}$ with $v^{(i)} \in C_{s_i}^q$ such that $\sum_{i} v^{(i)} = p$.*

*Proof. *This follows from repeatedly applying Proposition 14, by subtracting $s_0 e_0$ from $s$, then $s_1 e_1$ from the remainder, and so on.

We note that the previous corollary recovers the statement of Definition 2 (with $a_{ij} = v^{(i)}_j$).

Combining the recurrence relation in Theorem 12 with Theorem 7 yields the following summation formula for *m*-polynomials.

Proposition 16 (summation formula). *Let $g$ be a generalized Hadamard matrix and $n \in \mathbb{N}$. Then, for any pair of fixed, arbitrary elements of $C_n^q$, one has
*

#### 5. Expansion of a Polynomial

The coefficients for the expansion of a polynomial in terms of Krawtchouk polynomials have been obtained (cf. [6]). We obtain a similar result for *m*-polynomials defined in terms of a generalized Hadamard matrix, using orthogonality properties.

Theorem 17 (polynomial expansion). *Let $g$ be a generalized Hadamard matrix, and let $x$ be a variable element of $C_n^q$.* (i) *If $g$ is symmetric and the expansion of a polynomial $P$ in terms of *m*-polynomials defined with respect to $g$ is (with coefficients $c_s$)
$$P(x) = \sum_{s \in C_n^q} c_s\, m_g(x, s), \tag{34}$$
then, for all $t \in C_n^q$,
$$c_t = \frac{1}{q^n} \sum_{x \in C_n^q} P(x)\, \overline{m_g(t, x)}.$$
Similarly, if the expansion of a polynomial $P$ in terms of *m*-polynomials is (with coefficients $d_s$)
$$P(x) = \sum_{s \in C_n^q} d_s\, m_g(s, x),$$
then, for all $t \in C_n^q$,
$$d_t = \frac{1}{q^n} \sum_{x \in C_n^q} P(x)\, \overline{m_g(x, t)}.$$*
(ii) *If $g$ has a symmetric core, with first row and first column as in Lemma 9, then the corresponding coefficients can be calculated for all $t \in C_n^q$ by
$$c_t = \frac{1}{q^n} \sum_{x \in C_n^q} (-1)^{x_0 + t_0}\, P(x)\, \overline{m_g(t, x)},$$
respectively,
$$d_t = \frac{1}{q^n} \sum_{x \in C_n^q} (-1)^{x_0 + t_0}\, P(x)\, \overline{m_g(x, t)}.$$*

*Proof. *Let $t \in C_n^q$. Multiplying both sides of (34) by $\overline{m_g(t, x)}$ and summing each side over all elements $x$ of $C_n^q$ yield
$$\sum_{x \in C_n^q} P(x)\, \overline{m_g(t, x)} = \sum_{s \in C_n^q} c_s \sum_{x \in C_n^q} m_g(x, s)\, \overline{m_g(t, x)} \stackrel{(\ast)}{=} q^n\, c_t,$$
where step $(\ast)$ uses Theorem 10; the first result in (i) follows. The other results are proved similarly.
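For the symmetric real Hadamard matrix of Section 6 (so conjugation is trivial), the first formula in (i) can be exercised end-to-end with exact rational arithmetic: compute $c_t = q^{-n} \sum_x P(x)\, m_g(t, x)$ for the example polynomial $P(x) = x_0 x_1$ and confirm that $\sum_s c_s\, m_g(x, s)$ reproduces $P$. A Python sketch (helper names ours):

```python
from fractions import Fraction

def weak_comps(n, q):
    # All weak compositions of n into q nonnegative parts.
    if q == 1:
        return [(n,)]
    return [(f,) + r for f in range(n + 1) for r in weak_comps(n - f, q - 1)]

def m_poly(g, p, s):
    # Coefficient of z^p in prod_i (sum_j g[i][j] * z_j)^(s[i]).
    q = len(g)
    poly = {(0,) * q: 1}
    for i in range(q):
        for _ in range(s[i]):
            new = {}
            for mono, c in poly.items():
                for j in range(q):
                    key = tuple(m + (1 if k == j else 0) for k, m in enumerate(mono))
                    new[key] = new.get(key, 0) + c * g[i][j]
            poly = new
    return poly.get(tuple(p), 0)

g, n = [[1, 1], [1, -1]], 3     # symmetric real generalized Hadamard matrix, q = 2
q = len(g)
C = weak_comps(n, q)

def P(x):
    return x[0] * x[1]          # the example polynomial from Section 6

# Coefficients c_t of the expansion P(x) = sum_s c_s * m_g(x, s):
c = {t: Fraction(sum(P(x) * m_poly(g, t, x) for x in C), q ** n) for t in C}
for x in C:                     # the expansion reproduces P exactly
    assert sum(c[t] * m_poly(g, x, t) for t in C) == P(x)
```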

#### 6. Mathematica Code

Here, we present Mathematica code to obtain *m*-polynomials. First, we specify the matrix $g$ and the length $n$:

> g = {{1, 1}, {1, -1}}
> n = 6

From this we can calculate $q$, the set $C_n^q$ (pall), and initialize the list $z$ of variables:

> q = Union[Dimensions[g]][[1]]
> pall = Sort[Flatten[Map[Permutations, IntegerPartitions[n, {q}, Range[0, n]]], 1]]
> z = Table[f[i], {i, q}]

Using Theorem 3, we obtain the *m*-polynomial via its generator:

> generator[s_] := Expand[Product[(Sum[g[[i, j]] z[[j]], {j, 1, q}])^(s[[i]]), {i, 1, q}]]
> mg[p_, s_] := Coefficient[generator[s], Inner[Power, z, p, Times]]

The values of the *m*-polynomial can be shown using the following command:

> TableForm[Table[mg[pall[[i]], pall[[j]]], {i, Length[pall]}, {j, Length[pall]}], TableHeadings -> {pall, pall}, TableDepth -> 2]

For the above chosen matrix $g$ and length $n$, the output is the table of values $m_g(p, s)$, where $p$ denotes the row and $s$ the column. Thus, we are able to evaluate the *m*-polynomial for any $p$ and $s$ (and thus check orthogonality and recurrence relations). With the help of the Mathematica function Fit, we can also express the *m*-polynomials as univariate polynomials in one component of $p$, and similarly as univariate polynomials in one component of $s$. By Theorem 17, we can express any polynomial in terms of *m*-polynomials. Using Mathematica to find the corresponding coefficients $c$ and $d$ in the expansion in terms of *m*-polynomials, for example, for the polynomial $P(x) = x_0 x_1$, is achieved as follows:

> p[{x0_, x1_}] := x0 x1
> Table[1/q^n Total[Table[p[pall[[i]]] Conjugate[mg[pall[[j]], pall[[i]]]], {i, Length[pall]}]], {j, Length[pall]}]
> Table[1/q^n Total[Table[p[pall[[i]]] Conjugate[mg[pall[[i]], pall[[j]]]], {i, Length[pall]}]], {j, Length[pall]}]

The output of the latter two lines gives the lists of coefficients $c_t$, respectively $d_t$, for $t \in C_n^q$. That is, we obtain the expansion of $P$ in terms of *m*-polynomials. As a check, one can verify that the resulting expansion reproduces $P(x) = x_0 x_1$ for all $x \in C_n^q$.

For a generalized Hadamard matrix $g$ with symmetric core, and with first row and first column as in Lemma 9, the above Mathematica code using Theorem 17 has to be modified as follows:

> Table[1/q^n Total[Table[(-1)^(pall[[i, 1]] + pall[[j, 1]]) p[pall[[i]]] Conjugate[mg[pall[[j]], pall[[i]]]], {i, Length[pall]}]], {j, Length[pall]}]
> Table[1/q^n Total[Table[(-1)^(pall[[i, 1]] + pall[[j, 1]]) p[pall[[i]]] Conjugate[mg[pall[[i]], pall[[j]]]], {i, Length[pall]}]], {j, Length[pall]}]

#### 7. Outlook

Krawtchouk polynomials and their generalizations appear in many areas of mathematics; see [5]: harmonic analysis [1, 2, 4], statistics [3], combinatorics and coding theory [6, 7, 9, 11, 14, 15], probability theory [5], representation theory (e.g., of quantum groups) [10, 12, 13], difference equations [16], and pattern recognition [17] (a website called the “Krawtchouk Polynomials Home Page” by V. Zelenkov at http://orthpol.narod.ru/eng/index.html, collecting material about M. Krawtchouk and the polynomials that bear his name, has, unfortunately, not been updated in a while). However, our motivation in this paper was primarily driven by generalizing the results obtained in [8], thus shedding more light on the qualitative structure of generalizations of Krawtchouk polynomials, and not yet with a specific application in mind. By providing the Mathematica code to obtain *m*-polynomials, we also hope that other researchers are encouraged to explore them and see if they can be used in their research.

#### Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.