Abstract

We present a relation between Tsallis entropy and generalized Kerridge inaccuracy, called the generalized Shannon inequality, which is a well-known generalization in information theory, and we then give its application in coding theory. The objective of the paper is to establish a noiseless coding theorem for the proposed mean code length in terms of the generalized information measure of order $\alpha$.

1. Introduction

Throughout the paper, $\mathbb{N}$ denotes the set of natural numbers and, for $n \in \mathbb{N}$, we set
$$\Delta_n = \Big\{ P = (p_1, p_2, \ldots, p_n) : p_i \ge 0,\ \sum_{i=1}^{n} p_i \le 1 \Big\},$$
the set of all $n$-component complete and incomplete discrete probability distributions.

For $P, Q \in \Delta_n$ and $\alpha > 0$, $\alpha \neq 1$, we define a nonadditive measure of inaccuracy, denoted by $H_{\alpha}(P;Q)$; if $Q = P$, then $H_{\alpha}(P;Q)$ reduces to the nonadditive (Tsallis) entropy
$$H_{\alpha}(P) = \frac{1}{1-\alpha}\left( \sum_{i=1}^{n} p_i^{\alpha} - 1 \right).$$
Entropy (3) was first characterized by Havrda and Charvát [1]. Later on, Daróczy [2] and Behara and Nath [3] studied this entropy. Vajda [4] also characterized this entropy for finite discrete generalized probability distributions. Sharma and Mittal [5] generalized this measure to what is known as the entropy of order $\alpha$ and type $\beta$. Pereira and Gur Dial [6] and Gur Dial [7] also studied the Sharma-Mittal entropy for a generalization of the Shannon inequality and gave its applications in coding theory. Kumar and Choudhary [8] also gave its application in coding theory. Recently, Wondie and Kumar [9] gave a joint representation of Rényi's and Tsallis's entropies. Tsallis [10] gave its applications in physics for $\alpha > 0$, $\alpha \neq 1$, and, as $\alpha \to 1$, $H_{\alpha}(P)$ reduces to the Shannon [11] entropy
$$H(P) = -\sum_{i=1}^{n} p_i \log p_i.$$
Inequality (6) has been generalized in the case of Rényi's entropy.
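For a numerical illustration (not part of the original derivation), the following Python sketch computes the nonadditive entropy in the standard Tsallis normalization $\frac{1}{1-\alpha}\left(\sum_i p_i^{\alpha}-1\right)$ assumed above and checks that it approaches the Shannon entropy as $\alpha \to 1$; the function names are ours.

import math

def tsallis_entropy(p, alpha):
    # Nonadditive (Tsallis) entropy of order alpha != 1, in the 1/(1-alpha) normalization.
    return (sum(pi ** alpha for pi in p) - 1.0) / (1.0 - alpha)

def shannon_entropy(p):
    # Classical Shannon entropy (natural logarithm); zero-probability terms are skipped.
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

if __name__ == "__main__":
    P = [0.5, 0.3, 0.2]
    for alpha in (2.0, 1.5, 1.1, 1.01, 1.001):
        print(alpha, tsallis_entropy(P, alpha))
    print("Shannon limit:", shannon_entropy(P))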

2. Formulation of the Problem

For $P, Q \in \Delta_n$ with $\sum_{i=1}^{n} p_i = 1$ and $\sum_{i=1}^{n} q_i = 1$, an important property of Kerridge's inaccuracy [12] is that
$$-\sum_{i=1}^{n} p_i \log p_i \le -\sum_{i=1}^{n} p_i \log q_i.$$
Equality holds if and only if $p_i = q_i$ for all $i$. In other words, Shannon's entropy is the minimum value of Kerridge's inaccuracy. If $\sum_{i=1}^{n} p_i \le 1$ and $\sum_{i=1}^{n} q_i \le 1$, then (5) is no longer necessarily true. Also, the corresponding inequality $H_{\alpha}(P) \le H_{\alpha}(P;Q)$ is not necessarily true even for generalized probability distributions. Hence, it is natural to ask the following question: for generalized probability distributions, what are the quantities whose minimum values are the Shannon entropy $H(P)$ and the nonadditive entropy $H_{\alpha}(P)$? We give below an answer to this question separately for $H(P)$ and $H_{\alpha}(P)$ by dividing the discussion into the two cases below. Also, we shall assume that $\sum_{i=1}^{n} p_i > 0$, because the problem is trivial for $\sum_{i=1}^{n} p_i = 0$.
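As a small numerical check of the classical property (5) (illustrative only; the helper names are ours), the sketch below verifies that Kerridge's inaccuracy $-\sum_i p_i \log q_i$ is never smaller than Shannon's entropy for complete distributions, and that the inequality can fail once the distributions are incomplete.

import math

def shannon_entropy(p):
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def kerridge_inaccuracy(p, q):
    # Kerridge's inaccuracy of P with respect to Q (natural logarithm).
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

if __name__ == "__main__":
    P, Q = [0.5, 0.3, 0.2], [0.4, 0.4, 0.2]          # complete distributions
    print(shannon_entropy(P) <= kerridge_inaccuracy(P, Q))              # True
    print(math.isclose(shannon_entropy(P), kerridge_inaccuracy(P, P)))  # equality at Q = P
    P_inc, Q_inc = [0.2, 0.2], [0.45, 0.45]          # incomplete distributions
    print(shannon_entropy(P_inc) <= kerridge_inaccuracy(P_inc, Q_inc))  # can be False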

Case 1. If $\sum_{i=1}^{n} p_i = 1$ and $\sum_{i=1}^{n} q_i = 1$, then, as remarked earlier, (5) is true. For $\sum_{i=1}^{n} p_i \le 1$ and $\sum_{i=1}^{n} q_i \le 1$, it can easily be seen by using Jensen's inequality that (5) is true if $\sum_{i=1}^{n} q_i \le \sum_{i=1}^{n} p_i$, with equality in (5) holding if and only if $p_i = q_i$ for all $i$.
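The Jensen step invoked in Case 1 can be checked numerically as follows; this is only an illustration of the convexity argument ($-\log$ is convex), with hypothetical inputs, not the paper's derivation.

import math

def jensen_gap_neg_log(weights, xs):
    # Jensen's inequality for the convex function -log:
    # -log(sum w_i x_i) <= -sum w_i log(x_i), with weights summing to 1 and x_i > 0.
    lhs = -math.log(sum(w * x for w, x in zip(weights, xs)))
    rhs = -sum(w * math.log(x) for w, x in zip(weights, xs))
    return rhs - lhs  # nonnegative by convexity

if __name__ == "__main__":
    w = [0.5, 0.3, 0.2]      # a complete probability distribution
    x = [0.8, 2.0, 1.5]      # e.g. ratios q_i / p_i in the argument above
    print(jensen_gap_neg_log(w, x) >= 0)  # True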

Case 2. Since (6) is not necessarily true in general, we need an inequality from which (6) can be recovered, together with the condition under which equality holds.
We use the reverse Hölder inequality, that is, if $0 < \lambda < 1$, $\mu = \lambda/(\lambda - 1)$, and $a_i$, $b_i$ ($i = 1, 2, \ldots, n$) are positive real numbers, then
$$\sum_{i=1}^{n} a_i b_i \ge \left( \sum_{i=1}^{n} a_i^{\lambda} \right)^{1/\lambda} \left( \sum_{i=1}^{n} b_i^{\mu} \right)^{1/\mu}.$$
Let $a_i$, $b_i$, and the exponents be chosen suitably in terms of $p_i$, $q_i$, and $\alpha$.
Putting these values into (9), we get (10), where we used (8) too. This implies, however, (11), or, equivalently, (12); using (12) and the fact that $\sum_{i=1}^{n} q_i \le \sum_{i=1}^{n} p_i$, we get (6).
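The reverse Hölder inequality used in Case 2 can likewise be verified numerically. The sketch below checks the inequality in the form stated above, with an exponent $\lambda \in (0,1)$ and its negative conjugate $\mu = \lambda/(\lambda-1)$; the inputs are hypothetical.

def reverse_holder_holds(a, b, lam):
    # Check sum(a_i b_i) >= (sum a_i^lam)^(1/lam) * (sum b_i^mu)^(1/mu)
    # for 0 < lam < 1, mu = lam/(lam - 1) < 0, and positive a_i, b_i.
    assert 0 < lam < 1 and all(x > 0 for x in a) and all(x > 0 for x in b)
    mu = lam / (lam - 1.0)
    lhs = sum(ai * bi for ai, bi in zip(a, b))
    rhs = sum(ai ** lam for ai in a) ** (1.0 / lam) * sum(bi ** mu for bi in b) ** (1.0 / mu)
    return lhs >= rhs

if __name__ == "__main__":
    print(reverse_holder_holds([1.0, 2.0], [3.0, 4.0], 0.5))            # True
    print(reverse_holder_holds([0.5, 0.3, 0.2], [0.2, 0.3, 0.5], 0.7))  # True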

Particular Case. As $\alpha \to 1$, (6) becomes
$$-\sum_{i=1}^{n} p_i \log p_i \le -\sum_{i=1}^{n} p_i \log q_i,$$
whose right-hand side is Kerridge's inaccuracy [12].

3. Mean Codeword Length and Its Bounds

We will now give an application of inequality (6) in coding theory. Let a finite set of input symbols $X = \{x_1, x_2, \ldots, x_n\}$ be encoded using an alphabet of $D$ symbols; then it has been shown by Feinstein [13] that there is a uniquely decipherable code with codeword lengths $l_1, l_2, \ldots, l_n$ if and only if the Kraft inequality holds; that is,
$$\sum_{i=1}^{n} D^{-l_i} \le 1,$$
where $D$ is the size of the code alphabet.
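As an illustration of the Kraft condition (the function names are ours), the following sketch checks whether a given multiset of codeword lengths over a $D$-symbol alphabet admits a uniquely decipherable code.

def kraft_sum(lengths, D):
    # Kraft sum for codeword lengths l_i over a D-symbol alphabet.
    return sum(D ** (-l) for l in lengths)

def satisfies_kraft(lengths, D):
    # A uniquely decipherable code with these lengths exists iff the sum is <= 1.
    return kraft_sum(lengths, D) <= 1.0

if __name__ == "__main__":
    print(satisfies_kraft([1, 2, 3, 3], D=2))  # 1/2 + 1/4 + 1/8 + 1/8 = 1  -> True
    print(satisfies_kraft([1, 1, 2], D=2))     # 1/2 + 1/2 + 1/4 > 1        -> False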

Furthermore, if
$$L = \sum_{i=1}^{n} p_i l_i$$
is the average codeword length, then for a code satisfying (16), the inequality $L \ge H(P)$ (with logarithms taken to base $D$) is also fulfilled, and equality holds if and only if $l_i = -\log_D p_i$ for all $i$; moreover, by suitably encoding long sequences of source symbols, the average length per symbol can be made arbitrarily close to $H(P)$ (see Feinstein [13]). This is Shannon's noiseless coding theorem.
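A minimal sketch of the classical construction behind Shannon's theorem, assuming the usual choice $l_i = \lceil -\log_D p_i \rceil$ (which always satisfies the Kraft inequality); it verifies $H(P) \le L < H(P) + 1$ with logarithms to base $D$. The code is illustrative, not taken from the paper.

import math

def shannon_code_lengths(p, D=2):
    # Classical choice l_i = ceil(-log_D p_i); it always satisfies the Kraft inequality.
    return [math.ceil(-math.log(pi, D)) for pi in p]

def average_length(p, lengths):
    return sum(pi * li for pi, li in zip(p, lengths))

def entropy_base_D(p, D=2):
    return -sum(pi * math.log(pi, D) for pi in p if pi > 0)

if __name__ == "__main__":
    P = [0.4, 0.3, 0.2, 0.1]
    lengths = shannon_code_lengths(P, D=2)
    L = average_length(P, lengths)
    H = entropy_base_D(P, D=2)
    print(lengths, sum(2 ** (-l) for l in lengths) <= 1.0)  # lengths and Kraft check
    print(H <= L < H + 1.0)                                  # Shannon's noiseless coding bound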

By considering Rényi's entropy (see, e.g., [14]), a coding theorem analogous to the above noiseless coding theorem has been established by Campbell [15], and bounds for his exponentiated mean codeword length were obtained in terms of Rényi's entropy. Kieffer [16] defined a class of rules and showed which rule is best for deciding which of two sources can be coded with the smaller expected cost for long sequences, where the cost of encoding a sequence is assumed to be a function of its length only. Further, in Jelinek [17] it is shown that coding with respect to Campbell's mean length is useful in minimizing the problem of buffer overflow, which occurs when the source symbols are produced at a fixed rate and the code words are stored temporarily in a finite buffer. Concerning Campbell's mean length the reader can consult [15].
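For comparison, the sketch below implements Campbell's exponentiated mean length and Rényi's entropy in their commonly used parameterization ($\alpha = 1/(1+t)$, logarithms to base $D$) and checks Campbell's lower bound numerically; the notation here is the standard one from the literature and need not coincide with the notation used in this paper.

import math

def renyi_entropy(p, alpha, D=2):
    # Renyi entropy of order alpha != 1, logarithms to base D.
    return math.log(sum(pi ** alpha for pi in p), D) / (1.0 - alpha)

def campbell_length(p, lengths, t, D=2):
    # Campbell's exponentiated mean codeword length with parameter t > 0.
    return math.log(sum(pi * D ** (t * li) for pi, li in zip(p, lengths)), D) / t

if __name__ == "__main__":
    P = [0.4, 0.3, 0.2, 0.1]
    t = 0.5
    alpha = 1.0 / (1.0 + t)
    Z = sum(pi ** alpha for pi in P)
    # Integer lengths close to the optimizers of Campbell's criterion:
    lengths = [math.ceil(-math.log(pi ** alpha / Z, 2)) for pi in P]
    print(sum(2 ** (-l) for l in lengths) <= 1.0)                     # Kraft inequality
    print(renyi_entropy(P, alpha) <= campbell_length(P, lengths, t))  # Campbell's bound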

It may be seen that the mean codeword length (17) was generalized parametrically by Campbell [15] and its bounds were studied in terms of generalized measures of entropy. Here we give another generalization of (17) and study its bounds in terms of the generalized entropy of order $\alpha$.

Generalized coding theorems obtained by considering different information measures under the condition of unique decipherability have been investigated by several authors; see, for instance, the papers [6, 13, 18].

An investigation is carried out concerning discrete memoryless sources possessing an additional parameter which seems to be significant in problems of storage and transmission (see [9, 16–18]).

In this section we study a coding theorem by considering a new information measure depending on a parameter. Our motivation is, among others, that this quantity generalizes some information measures already existing in the literature, such as the entropy considered by Arndt [19], which is used in physics.

Definition 1. Let $\alpha \neq 1$ be arbitrarily fixed; then the mean length corresponding to the generalized information measure $H_{\alpha}(P)$ is given by formula (21), where $P \in \Delta_n$ and the codeword lengths $l_1, l_2, \ldots, l_n$ are positive integers satisfying constraint (22). Since (22) reduces to the Kraft inequality as $\alpha \to 1$, it is called the generalized Kraft inequality, and codes obtained under this generalized inequality are called personal codes.

Theorem 2. Let $\alpha \neq 1$ be arbitrarily fixed. Then there exist codeword lengths $l_1, l_2, \ldots, l_n$ such that (23) holds under condition (22), and equality holds if and only if (24) is satisfied, where the entropy and the mean length appearing in (23) are given by (3) and (21), respectively.

Proof. First of all we shall prove the lower bound in (23).
By the reverse Hölder inequality, that is, if $0 < \lambda < 1$, $\mu = \lambda/(\lambda - 1)$, and $a_i$, $b_i$ ($i = 1, 2, \ldots, n$) are positive real numbers, then
$$\sum_{i=1}^{n} a_i b_i \ge \left( \sum_{i=1}^{n} a_i^{\lambda} \right)^{1/\lambda} \left( \sum_{i=1}^{n} b_i^{\mu} \right)^{1/\mu}.$$
Let $a_i$, $b_i$, and the exponents be chosen suitably in terms of $p_i$, $l_i$, $D$, and $\alpha$.
Putting these values into (25), we get (26), where we used (22) too. This implies, however, (27). For the case under consideration, (27) becomes (28); using (28) and the fact that $0 < \sum_{i=1}^{n} p_i \le 1$, we get (29). From (24) and after simplification, we get (30). This implies (31), which gives the required relation; then the equality sign holds in (29).
Now we will prove the upper bound in inequality (23).
We choose the codeword lengths $l_i$, $i = 1, 2, \ldots, n$, in such a way that (32) is fulfilled for all $i$.
From the left inequality of (32), we have (33); multiplying both sides by the appropriate weights and then summing over $i$, we get the generalized Kraft inequality (22). So there exists a generalized code with codeword lengths $l_i$, $i = 1, 2, \ldots, n$.
Since $D > 1$, (32) can be written as (34). Multiplying (34) throughout by the appropriate weights and then summing from $i = 1$ to $n$, we obtain inequality (35). Since $0 < \sum_{i=1}^{n} p_i \le 1$, we get from (35) inequality (23).

Particular Case. For $\alpha \to 1$, (23) becomes the classical bound $H(P) \le L < H(P) + 1$ (with logarithms taken to base $D$), which is Shannon's [11] classical noiseless coding theorem, where $H(P)$ and $L$ are given by (4) and (17), respectively.

4. Conclusion

In this paper we prove a generalization of Shannon's inequality for the entropy of order $\alpha$ with the help of the Hölder inequality. A noiseless coding theorem is also proved. Considering Theorem 2, we remark that the optimal code lengths depend on $\alpha$, in contrast with the optimal code lengths of Shannon, which do not depend on a parameter. However, it is possible to prove a coding theorem with respect to (3) such that the optimal code lengths are identical to those of Shannon.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.