International Journal of Mathematics and Mathematical Sciences
Volume 2018 (2018), Article ID 2861612, 4 pages
https://doi.org/10.1155/2018/2861612
Research Article

Some Inequalities in Information Theory Using Tsallis Entropy

Litegebe Wondie and Satish Kumar

Department of Mathematics, College of Natural and Computational Science, University of Gondar, Gondar, Ethiopia

Correspondence should be addressed to Satish Kumar; drsatish74@rediffmail.com

Received 27 December 2017; Accepted 27 February 2018; Published 3 April 2018

Academic Editor: A. Zayed

Copyright © 2018 Litegebe Wondie and Satish Kumar. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

We present a relation between Tsallis's entropy and generalized Kerridge inaccuracy, called the generalized Shannon inequality, which is a well-known generalization in information theory, and then give its application in coding theory. The objective of the paper is to establish a noiseless coding theorem for the proposed mean code length in terms of the generalized information measure of order $\alpha$.

1. Introduction

Throughout the paper, $\mathbb{N}$ denotes the set of natural numbers and for $n \in \mathbb{N}$ we set
$$\Delta_n = \left\{ P = (p_1, p_2, \dots, p_n) : p_i \ge 0,\ 0 < \sum_{i=1}^{n} p_i \le 1 \right\}, \tag{1}$$
where $\Delta_n$ denotes the set of all $n$-component complete and incomplete discrete probability distributions.
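As a simple illustration of (1) (with $\Delta_n$ as defined above), take $n = 3$:
$$P = \left(\tfrac12, \tfrac14, \tfrac14\right) \in \Delta_3 \ \text{(complete, since } \tfrac12 + \tfrac14 + \tfrac14 = 1\text{)}, \qquad Q = \left(\tfrac12, \tfrac14, \tfrac18\right) \in \Delta_3 \ \text{(incomplete, since } \tfrac12 + \tfrac14 + \tfrac18 = \tfrac78 < 1\text{)}.$$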

For $P, Q \in \Delta_n$ and $\alpha > 0$, $\alpha \ne 1$, we define a nonadditive measure of inaccuracy, denoted by $H_\alpha(P;Q)$, as
$$H_\alpha(P;Q) = \frac{1}{2^{1-\alpha} - 1} \left( \sum_{i=1}^{n} p_i q_i^{\alpha-1} - 1 \right). \tag{2}$$
If $Q = P$, then $H_\alpha(P;Q)$ reduces to the nonadditive entropy
$$H_\alpha(P) = \frac{1}{2^{1-\alpha} - 1} \left( \sum_{i=1}^{n} p_i^{\alpha} - 1 \right). \tag{3}$$
Entropy (3) was first characterized by Havrda and Charvát [1]. Later on, Daróczy [2] and Behara and Nath [3] studied this entropy. Vajda [4] also characterized this entropy for finite discrete generalized probability distributions. Sharma and Mittal [5] generalized this measure to what is known as the entropy of order $\alpha$ and type $\beta$. Pereira and Gur Dial [6] and Gur Dial [7] also studied the Sharma-Mittal entropy, obtained a generalization of the Shannon inequality for it, and gave applications in coding theory. Kumar and Choudhary [8] also gave its application in coding theory. Recently, Wondie and Kumar [9] gave a joint representation of Rényi's and Tsallis's entropy. Tsallis [10] gave its applications in physics for $\alpha > 0$, $\alpha \ne 1$, and for $\alpha \to 1$, (3) reduces to the Shannon [11] entropy
$$H(P) = -\sum_{i=1}^{n} p_i \log_2 p_i. \tag{4}$$
Inequality (6) below has been generalized in the case of Rényi's entropy.
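As an illustrative check of (3) and (4) (in the forms stated above), take the complete distribution $P = (\tfrac12, \tfrac12)$. Then $\sum_{i=1}^{2} p_i^{\alpha} = 2^{1-\alpha}$, so
$$H_\alpha(P) = \frac{2^{1-\alpha} - 1}{2^{1-\alpha} - 1} = 1 \quad \text{for every } \alpha > 0,\ \alpha \ne 1, \qquad \text{and} \qquad H(P) = -2 \cdot \tfrac12 \log_2 \tfrac12 = 1,$$
consistent with (3) reducing to (4) as $\alpha \to 1$.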

2. Formulation of the Problem

For $P, Q \in \Delta_n$ with $\sum_{i=1}^{n} p_i = 1 = \sum_{i=1}^{n} q_i$, an important property of Kerridge's inaccuracy [12], $H(P;Q) = -\sum_{i=1}^{n} p_i \log_2 q_i$, is that
$$H(P) \le H(P;Q). \tag{5}$$
Equality holds if and only if $q_i = p_i$ for all $i = 1, \dots, n$. In other words, Shannon's entropy is the minimum value of Kerridge's inaccuracy. If $0 < \sum_{i=1}^{n} p_i \le 1$ and $0 < \sum_{i=1}^{n} q_i \le 1$ (generalized distributions), then (5) is no longer necessarily true. Also, the corresponding inequality
$$H_\alpha(P) \le H_\alpha(P;Q) \tag{6}$$
is not necessarily true even for generalized probability distributions. Hence, it is natural to ask the following question: for generalized probability distributions, what is the quantity the minimum value of which is $H_\alpha(P)$? We give below an answer to this question by dividing the discussion into the two cases treated below. Also we shall assume that $p_i > 0$ for all $i$, because the problem is trivial otherwise.
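For a concrete instance of (5) with complete distributions, take $P = (\tfrac34, \tfrac14)$ and $Q = (\tfrac12, \tfrac12)$:
$$H(P) = -\tfrac34 \log_2 \tfrac34 - \tfrac14 \log_2 \tfrac14 \approx 0.811, \qquad H(P;Q) = -\tfrac34 \log_2 \tfrac12 - \tfrac14 \log_2 \tfrac12 = 1,$$
so $H(P) \le H(P;Q)$, with equality only when $Q = P$.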

Case 1. Let $\sum_{i=1}^{n} p_i = 1$. If $\sum_{i=1}^{n} q_i = 1$, then, as remarked earlier, (5) is true. For $0 < \sum_{i=1}^{n} q_i < 1$, it can easily be seen by using Jensen's inequality that (5) is still true, since $\sum_{i=1}^{n} q_i \le 1$, with equality in (5) holding if and only if $q_i = p_i$ for all $i = 1, \dots, n$.
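The Jensen step can be made explicit (with $H(P)$ and $H(P;Q)$ as above): since $-\log_2$ is convex and $\sum_{i=1}^{n} p_i = 1$,
$$H(P;Q) - H(P) = \sum_{i=1}^{n} p_i \log_2 \frac{p_i}{q_i} \ \ge\ -\log_2\left(\sum_{i=1}^{n} p_i \cdot \frac{q_i}{p_i}\right) = -\log_2\left(\sum_{i=1}^{n} q_i\right) \ \ge\ 0,$$
the last step using $\sum_{i=1}^{n} q_i \le 1$; equality throughout forces $q_i/p_i$ to be constant and $\sum_{i=1}^{n} q_i = 1$, that is, $q_i = p_i$ for all $i$.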

Case 2. Let $\alpha \ne 1$ ($\alpha > 0$). Since (6) is not necessarily true, we need an inequality (7), together with a condition (8) on $P$ and $Q$, under which (6) does hold, with equality if and only if $q_i = p_i$ for all $i = 1, \dots, n$.
Since $\alpha \ne 1$, we may apply the reverse Hölder inequality: if $p < 1$ ($p \ne 0$), $\frac{1}{p} + \frac{1}{q} = 1$, and $a_i$, $b_i$ ($i = 1, \dots, n$) are positive real numbers, then
$$\sum_{i=1}^{n} a_i b_i \ \ge\ \left(\sum_{i=1}^{n} a_i^{p}\right)^{1/p} \left(\sum_{i=1}^{n} b_i^{q}\right)^{1/q}. \tag{9}$$
Choose $a_i$, $b_i$, $p$, and $q$ in terms of $p_i$, $q_i$, and $\alpha$.
Putting these values into (9), and using (8) as well, we get (10). This implies (11), or, equivalently, (12). Using (12) and the sign of $2^{1-\alpha} - 1$ (positive for $0 < \alpha < 1$, negative for $\alpha > 1$), we get (6).
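As a quick numerical check of (9), take $p = \tfrac12$, $q = -1$, $a = (1, 4)$, and $b = (1, 1)$:
$$\sum_{i=1}^{2} a_i b_i = 5 \ \ge\ \left(\sum_{i=1}^{2} a_i^{1/2}\right)^{2} \left(\sum_{i=1}^{2} b_i^{-1}\right)^{-1} = \frac{(1+2)^2}{2} = 4.5,$$
with equality in (9) holding precisely when $a_i^{p}$ and $b_i^{q}$ are proportional.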

Particular Case. If $\alpha \to 1$, then (6) becomes
$$H(P) \le -\sum_{i=1}^{n} p_i \log_2 q_i = H(P;Q),$$
where the right-hand side is Kerridge's inaccuracy [12].
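Indeed, for a complete distribution $P$ the limit follows from L'Hôpital's rule applied to (2) as stated above:
$$\lim_{\alpha \to 1} H_\alpha(P;Q) = \lim_{\alpha \to 1} \frac{\sum_{i=1}^{n} p_i q_i^{\alpha-1} - 1}{2^{1-\alpha} - 1} = \lim_{\alpha \to 1} \frac{\sum_{i=1}^{n} p_i q_i^{\alpha-1} \ln q_i}{-2^{1-\alpha} \ln 2} = -\sum_{i=1}^{n} p_i \log_2 q_i,$$
and the same argument applied to (3) gives $H_\alpha(P) \to H(P)$.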

3. Mean Codeword Length and Its Bounds

We will now give an application of inequality (6) in coding theory for $\alpha > 0$, $\alpha \ne 1$. Let a finite set of input symbols
$$X = \{x_1, x_2, \dots, x_n\}$$
be encoded using an alphabet of $D$ symbols; then it has been shown by Feinstein [13] that there is a uniquely decipherable code with lengths $n_1, n_2, \dots, n_n$ if and only if the Kraft inequality holds, that is,
$$\sum_{i=1}^{n} D^{-n_i} \le 1, \tag{16}$$
where $D$ is the size of the code alphabet.
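For example, with a binary code alphabet ($D = 2$), the lengths $(n_1, n_2, n_3) = (1, 2, 2)$ satisfy
$$\sum_{i=1}^{3} 2^{-n_i} = \tfrac12 + \tfrac14 + \tfrac14 = 1 \le 1,$$
and the prefix code $\{0, 10, 11\}$ realizes them, whereas the lengths $(1, 1, 2)$ give $\tfrac12 + \tfrac12 + \tfrac14 > 1$, so no uniquely decipherable code with these lengths exists.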

Furthermore, if
$$L = \sum_{i=1}^{n} n_i p_i \tag{17}$$
is the average codeword length, then for a code satisfying (16) the inequality
$$L \ \ge\ \frac{H(P)}{\log_2 D}$$
is also fulfilled, and equality holds if and only if
$$n_i = -\log_D p_i, \quad i = 1, \dots, n;$$
moreover, by suitably encoding long sequences of input symbols, the average length per input symbol can be made arbitrarily close to this lower bound (see Feinstein [13]). This is Shannon's noiseless coding theorem.
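For instance, with $D = 2$ and the dyadic distribution $P = (\tfrac12, \tfrac14, \tfrac18, \tfrac18)$, the lengths $n_i = -\log_2 p_i = (1, 2, 3, 3)$ satisfy (16) with equality, and
$$L = \sum_{i=1}^{4} n_i p_i = \tfrac12 + \tfrac12 + \tfrac38 + \tfrac38 = 1.75 = -\sum_{i=1}^{4} p_i \log_2 p_i = H(P),$$
so the lower bound is attained; for a non-dyadic $P$ one takes $n_i = \lceil -\log_2 p_i \rceil$ and the average length exceeds the bound by less than one.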

By considering Rényi's entropy (see, e.g., [14]), a coding theorem analogous to the above noiseless coding theorem has been established by Campbell [15], who obtained bounds for his mean codeword length in terms of Rényi's entropy of order $\alpha$,
$$\frac{1}{1-\alpha} \log_D \left( \sum_{i=1}^{n} p_i^{\alpha} \right).$$
Kieffer [16] defined a class of rules and showed that Campbell's mean length is the best decision rule for deciding which of two sources can be coded with smaller expected cost for sequences of length $N$, when $N \to \infty$, where the cost of encoding a sequence is assumed to be a function of its length only. Further, Jelinek [17] showed that coding with respect to Campbell's mean length is useful in minimizing the problem of buffer overflow, which occurs when the source symbols are produced at a fixed rate and the code words are stored temporarily in a finite buffer. Concerning Campbell's mean length the reader can consult [15].

It may be seen that the mean codeword length (17) has been generalized parametrically by Campbell [15] and its bounds have been studied in terms of generalized measures of entropy. Here we give another generalization of (17) and study its bounds in terms of the generalized entropy of order $\alpha$.

Generalized coding theorems considering different information measures under the condition of unique decipherability have been investigated by several authors; see, for instance, [6, 13, 18].

An investigation has been carried out concerning discrete memoryless sources possessing an additional parameter which seems to be significant in problems of storage and transmission (see [9, 16–18]).

In this section we study a coding theorem by considering a new information measure depending on a parameter. Our motivation is, among others, that this quantity generalizes some information measures already existing in the literature, such as Arndt's [19] entropy, which is used in physics.

Definition 1. Let $n \in \mathbb{N}$ and $\alpha > 0$, $\alpha \ne 1$, be arbitrarily fixed; then the mean length $L_\alpha(P)$ corresponding to the generalized information measure $H_\alpha(P)$ is given by formula (21), where $D$ and the codeword lengths $n_1, n_2, \dots, n_n$ are positive integers satisfying the constraint (22). Since (22) reduces to the Kraft inequality (16) in the limiting case $\alpha \to 1$, it is called the generalized Kraft inequality, and codes obtained under this generalized inequality are called personal codes.

Theorem 2. Let $n \in \mathbb{N}$ and $\alpha > 0$, $\alpha \ne 1$, be arbitrarily fixed. Then there exist codeword lengths $n_1, n_2, \dots, n_n$ such that the bounds (23) on $L_\alpha(P)$ in terms of $H_\alpha(P)$ hold under condition (22), and equality holds if and only if condition (24) is satisfied, where $H_\alpha(P)$ and $L_\alpha(P)$ are given by (3) and (21), respectively.

Proof. First of all we shall prove the lower bound of $L_\alpha(P)$ in (23).
By the reverse Hölder inequality, that is, if $p < 1$ ($p \ne 0$), $\frac{1}{p} + \frac{1}{q} = 1$, and $a_i$, $b_i$ ($i = 1, \dots, n$) are positive real numbers, then
$$\sum_{i=1}^{n} a_i b_i \ \ge\ \left(\sum_{i=1}^{n} a_i^{p}\right)^{1/p} \left(\sum_{i=1}^{n} b_i^{q}\right)^{1/q}. \tag{25}$$
Choose $a_i$, $b_i$, $p$, and $q$ in terms of $p_i$, $D^{-n_i}$, and $\alpha$.
Putting these values into (25), and using (22) as well, we get (26). This implies (27). For $\alpha > 1$, (27) becomes (28); using (28) and the fact that $2^{1-\alpha} - 1 < 0$, we get (29), the lower bound in (23). From (24) and after simplification, we get (30); this implies (31), and then the equality sign holds in (29).
Now we will prove inequality (23) for the upper bound of $L_\alpha(P)$.
We choose the codeword lengths $n_i$, $i = 1, \dots, n$, in such a way that
$$-\log_D p_i \ \le\ n_i \ <\ -\log_D p_i + 1 \tag{32}$$
is fulfilled for all $i = 1, \dots, n$.
From the left inequality of (32), we have
$$D^{-n_i} \ \le\ p_i; \tag{33}$$
multiplying both sides by the appropriate factor and then taking the sum over $i = 1, \dots, n$, we get the generalized inequality (22). So there exists a generalized code with code lengths $n_i$, $i = 1, \dots, n$.
Since $D > 1$, (32) can be rewritten in exponentiated form as (34). Multiplying (34) throughout by $p_i$ and then summing up from $i = 1$ to $n$, we obtain inequality (35). Since $2^{1-\alpha} - 1 < 0$ for $\alpha > 1$, we get from (35) inequality (23).
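To illustrate the choice (32) (in the Shannon-type form given above), take $D = 2$ and $P = (0.6, 0.3, 0.1)$: since
$$-\log_2 0.6 \approx 0.737, \qquad -\log_2 0.3 \approx 1.737, \qquad -\log_2 0.1 \approx 3.322,$$
the admissible integer lengths are $(n_1, n_2, n_3) = (1, 2, 4)$, and $\sum_{i=1}^{3} 2^{-n_i} = \tfrac12 + \tfrac14 + \tfrac1{16} \le 1$, so in particular the ordinary Kraft inequality (16) is satisfied and a uniquely decipherable code with these lengths exists.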

Particular Case. For $\alpha \to 1$, (23) becomes
$$\frac{H(P)}{\log_2 D} \ \le\ L \ <\ \frac{H(P)}{\log_2 D} + 1, \tag{36}$$
which is Shannon's [11] classical noiseless coding theorem, where $H(P)$ and $L$ are given by (4) and (17), respectively.

4. Conclusion

In this paper we prove a generalization of Shannon's inequality for the case of the entropy of order $\alpha$ with the help of the Hölder inequality, and a noiseless coding theorem is proved. Considering Theorem 2, we remark that the optimal code lengths depend on $\alpha$, in contrast with the optimal code lengths of Shannon, which do not depend on a parameter. However, it is possible to prove a coding theorem with respect to (3) such that the optimal code lengths are identical to those of Shannon.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

References

  1. J. Havrda and F. S. Charvát, “Quantification method of classification processes. Concept of structural α-entropy,” Kybernetika, vol. 3, pp. 30–35, 1967.
  2. Z. Daróczy, “Generalized information functions,” Information and Control, vol. 16, no. 1, pp. 36–51, 1970.
  3. M. Behara and P. Nath, “Additive and non-additive entropies of finite measurable partitions,” in Probability and Information Theory II, pp. 102–138, Springer-Verlag, 1970.
  4. I. Vajda, “Axioms for α-entropy of a generalized probability scheme,” Kybernetika, vol. 4, pp. 105–112, 1968.
  5. B. D. Sharma and D. P. Mittal, “New nonadditive measures of entropy for discrete probability distributions,” Journal of Mathematical Sciences, vol. 10, pp. 28–40, 1975.
  6. R. Pereira and Gur Dial, “Pseudogeneralization of Shannon inequality for Mittal's entropy and its application in coding theory,” Kybernetika, vol. 20, no. 1, pp. 73–77, 1984.
  7. Gur Dial, “On a coding theorem connected with entropy of order α and type β,” Information Sciences, vol. 30, no. 1, pp. 55–65, 1983.
  8. S. Kumar and A. Choudhary, “Some coding theorems on generalized Havrda-Charvat and Tsallis's entropy,” Tamkang Journal of Mathematics, vol. 43, no. 3, pp. 437–444, 2012.
  9. L. Wondie and S. Kumar, “A joint representation of Renyi’s and Tsalli’s entropy with application in coding theory,” International Journal of Mathematics and Mathematical Sciences, vol. 2017, Article ID 2683293, 5 pages, 2017.
  10. C. Tsallis, “Possible generalization of Boltzmann-Gibbs statistics,” Journal of Statistical Physics, vol. 52, no. 1-2, pp. 479–487, 1988.
  11. C. E. Shannon, “A mathematical theory of communication,” Bell System Technical Journal, vol. 27, no. 4, pp. 623–656, 1948.
  12. D. F. Kerridge, “Inaccuracy and inference,” Journal of the Royal Statistical Society, Series B (Methodological), vol. 23, pp. 184–194, 1961.
  13. A. Feinstein, Foundations of Information Theory, McGraw-Hill, New York, NY, USA, 1956.
  14. A. Rényi, “On measures of entropy and information,” in Proceedings of the 4th Berkeley Symposium on Mathematical Statistics and Probability, pp. 547–561, University of California Press, 1961.
  15. L. L. Campbell, “A coding theorem and Rényi's entropy,” Information and Control, vol. 8, no. 4, pp. 423–429, 1965.
  16. J. C. Kieffer, “Variable-length source coding with a cost depending only on the code word length,” Information and Control, vol. 41, no. 2, pp. 136–146, 1979.
  17. F. Jelinek, “Buffer overflow in variable length coding of fixed rate sources,” IEEE Transactions on Information Theory, vol. 14, no. 3, pp. 490–501, 1968.
  18. G. Longo, “A noiseless coding theorem for sources having utilities,” SIAM Journal on Applied Mathematics, vol. 30, no. 4, pp. 739–748, 1976.
  19. C. Arndt, Information Measures: Information and Its Description in Science and Engineering, Springer, Berlin, Germany, 2001.