International Journal of Mathematics and Mathematical Sciences
Volume 2018 (2018), Article ID 2861612, 4 pages
https://doi.org/10.1155/2018/2861612
Research Article

Some Inequalities in Information Theory Using Tsallis Entropy

Department of Mathematics, College of Natural and Computational Science, University of Gondar, Gondar, Ethiopia

Correspondence should be addressed to Satish Kumar; drsatish74@rediffmail.com

Received 27 December 2017; Accepted 27 February 2018; Published 3 April 2018

Academic Editor: A. Zayed

Copyright © 2018 Litegebe Wondie and Satish Kumar. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
