Abstract and Applied Analysis
Volume 2014 (2014), Article ID 505184, 8 pages
http://dx.doi.org/10.1155/2014/505184
Research Article

A Generalization of the Havrda-Charvat and Tsallis Entropy and Its Axiomatic Characterization

Satish Kumar1 and Gurdas Ram2

1Department of Mathematics, College of Natural Sciences, Arba Minch University, Arba Minch, Ethiopia
2Department of Applied Sciences, Maharishi Markandeshwar University, Solan, Himachal Pradesh 173229, India

Received 3 September 2013; Revised 20 December 2013; Accepted 20 December 2013; Published 19 February 2014

Academic Editor: Chengjian Zhang

Copyright © 2014 Satish Kumar and Gurdas Ram. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

In this communication, we characterize a measure of information of types $\alpha$, $\beta$, and $\gamma$ by taking certain axioms parallel to those considered earlier by Havrda and Charvát, along with a recursive relation. Some properties of this measure are also studied. This measure includes Shannon's information measure as a special case.

1. Introduction

Shannon's measure of entropy for a discrete probability distribution $P=(p_1,p_2,\dots,p_n)$, $p_i\ge 0$, $\sum_{i=1}^{n}p_i=1$, given by
\[
H_n(P)=-\sum_{i=1}^{n}p_i\log p_i,
\]
has been characterized in several ways (see Aczél and Daróczy [1]). Among the many characterizations, two elegant approaches are found in the work of (i) Faddeev [2], who uses the branching property, namely,
\[
H_n(p_1,\dots,p_n)=H_{n-1}(p_1+p_2,p_3,\dots,p_n)+(p_1+p_2)\,H_2\!\left(\frac{p_1}{p_1+p_2},\frac{p_2}{p_1+p_2}\right),
\]
for the above distribution $P$, as the basic postulate, and (ii) Chaundy and McLeod [3], who studied the functional equation
\[
\sum_{i=1}^{m}\sum_{j=1}^{n}f(p_iq_j)=\sum_{i=1}^{m}f(p_i)+\sum_{j=1}^{n}f(q_j). \qquad (4)
\]
Both of the above-mentioned approaches have been extensively exploited and generalized. The most general form of (4) was studied by Sharma and Taneja [4], who considered a more general functional equation. We define the information measure (6) for a complete probability distribution $P$.
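As a quick check of the branching property, consider the dyadic distribution $(\tfrac12,\tfrac14,\tfrac14)$, computing with base-2 logarithms:
\[
H_3\!\left(\tfrac12,\tfrac14,\tfrac14\right)=\tfrac12\cdot 1+\tfrac14\cdot 2+\tfrac14\cdot 2=\tfrac32,
\]
\[
H_2\!\left(\tfrac34,\tfrac14\right)+\tfrac34\,H_2\!\left(\tfrac23,\tfrac13\right)\approx 0.8113+0.75\times 0.9183=\tfrac32,
\]
so merging $p_1$ and $p_2$ and adding the weighted entropy of the split recovers the full entropy.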

Measure (6) reduces to the entropy of type $\alpha$ (or $\beta$) when $\beta=1$ (or $\alpha=1$); the resulting one-parameter measure is given by (7). When $\alpha\to 1$, measure (7) reduces to Shannon's entropy [5], namely, (8). The measure (7) has been characterized by many authors through different approaches. Havrda and Charvát [6] characterized (7) by an axiomatic approach, and Daróczy [7] studied (7) via a functional equation. A joint characterization of the measures (7) and (8) has been given by the author in two different ways: first by a generalized functional equation involving four different functions, and second by an axiomatic approach. Later, Tsallis [8] gave applications of (7) in physics.
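For reference, the one-parameter entropies that the text associates with (7) and (8) have the following standard forms; the first is Havrda and Charvát's normalization [6], the second is Tsallis's [8], and the two differ only in the normalizing factor:
\[
H_n^{\alpha}(P)=\frac{1}{2^{1-\alpha}-1}\left(\sum_{i=1}^{n}p_i^{\alpha}-1\right),\quad \alpha>0,\ \alpha\neq 1;\qquad
S_q(P)=\frac{1}{q-1}\left(1-\sum_{i=1}^{n}p_i^{q}\right).
\]
As $\alpha\to 1$ (respectively $q\to 1$), both expressions tend to Shannon's entropy $H_n(P)=-\sum_{i=1}^{n}p_i\log p_i$.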

To characterize strongly interacting statistical systems, complex systems in particular, within a thermodynamical framework, it may be necessary to introduce generalized entropies. A series of such entropies has been proposed in the past, mainly to accommodate important empirical distribution functions under a maximum ignorance principle. The understanding of the fundamental origin of these entropies, and of their deeper relation to complex systems, is limited. The authors of [9] explore this question from first principles: they observed that the fourth Khinchin axiom is violated by strongly interacting systems in general and, assuming only the first three Khinchin axioms, derived a unique entropy and classified the known entropies into equivalence classes.

For statistical systems that violate the four Shannon-Khinchin axioms, entropy takes a more general form than the Boltzmann-Gibbs entropy. The framework of superstatistics allows one to formulate a maximum entropy principle with these generalized entropies, making them useful for understanding the distribution functions of non-Markovian or nonergodic complex systems. For systems in which the composability axiom is violated, there exist only two ways to implement the maximum entropy principle: one uses escort probabilities and the other does not, and the two ways are connected through a duality. The authors of [10] showed that this duality fixes a unique escort probability, derived a complete theory of the generalized logarithms, and gave the relationship between the functional forms of generalized logarithms and the asymptotic scaling behavior of the entropy.

Suyari [11] proposed a generalization of the Shannon-Khinchin axioms, which determines a class of entropies containing the well-known Tsallis and Havrda-Charvát entropies. The authors of [12] showed that the class of entropy functions determined by Suyari's axioms is wider than the one proposed by Suyari, and they generalized Suyari's axioms so as to characterize a recently introduced class of entropies obtained by averaging pseudoadditive information content.

In this communication, we characterize the measure (6) by taking certain axioms parallel to those considered earlier by Havrda and Charvát [6], along with the recursive relation (9). Some properties of this measure are also studied.

The measure (6) satisfies a recursive relation, stated as (9), valid for $n\ge 3$ with the parameters $\alpha$, $\beta$, and $\gamma$ suitably restricted.

Proof. Relation (9) follows by direct computation from the definition (6).
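For orientation, the one-parameter prototype of such a relation is the well-known recursivity of degree $\alpha$ satisfied by the Havrda-Charvát entropy (quoted here as the pattern that (9) generalizes to three parameters):
\[
H_n^{\alpha}(p_1,\dots,p_n)=H_{n-1}^{\alpha}(p_1+p_2,p_3,\dots,p_n)+(p_1+p_2)^{\alpha}\,H_2^{\alpha}\!\left(\frac{p_1}{p_1+p_2},\frac{p_2}{p_1+p_2}\right),\qquad p_1+p_2>0.
\]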

2. Set of Axioms

For characterizing a measure of information of types $\alpha$, $\beta$, and $\gamma$ associated with a probability distribution $P=(p_1,p_2,\dots,p_n)$, $p_i\ge 0$, $\sum_{i=1}^{n}p_i=1$, we introduce the following axioms:
(1) the measure is continuous in the region $p_i\ge 0$, $\sum_{i=1}^{n}p_i=1$;
(2) a normalization condition fixing the value of the measure at a reference distribution;
(3) a second normalization condition;
(4) a branching (recursivity) condition, holding for every $n$;
(5) an additivity condition, holding for every $n$, where the parameters $\alpha$, $\beta$, and $\gamma$ obey the stated restrictions.
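For comparison, the corresponding conditions in Havrda and Charvát's original characterization of the structural $\alpha$-entropy [6] include, in paraphrase, the normalization and expansibility requirements
\[
H_2^{\alpha}\!\left(\tfrac12,\tfrac12\right)=1,\qquad
H_{n+1}^{\alpha}(p_1,\dots,p_n,0)=H_n^{\alpha}(p_1,\dots,p_n),
\]
together with continuity and the recursivity of degree $\alpha$ recalled at the end of Section 1.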

Theorem 1. If the parameters $\alpha$, $\beta$, and $\gamma$ satisfy the stated restrictions, then the axioms (1)–(5) determine a measure given by (15).
Before proving the theorem we prove some intermediate results based on the above axioms.

Lemma 2. Under the stated hypotheses on the probabilities and the parameters, relation (16) holds.

Proof. To prove the lemma, we proceed by induction. For the smallest admissible number of parts, the desired statement holds (cf. axiom (4)). Let us suppose that the result is true for all smaller numbers of parts; we will prove it for the next. We have (17).
One more application of the induction premise yields (18). In the first boundary case, (18) reduces to (19); similarly, in the second, (18) reduces to (20). Expression (17) together with (19) and (20) gives the desired result.

Lemma 3. Under the stated hypotheses on the probabilities and the parameters, relation (21) holds.

Proof. This lemma follows directly from Lemma 2.

Lemma 4. Under the stated hypotheses, relations (22) and (23) hold, with the auxiliary functions and constants as specified below.

Proof. Replacing the arguments in Lemma 3 by the indicated special values, with $m$ and $n$ positive integers, we have (24). Specializing (24) and using the normalization supplied by axiom (2), we get (25), which is (22).
Comparing the right-hand sides of (24) and (25), we get (27). Equation (27) together with (22) gives (28). Specializing (28) and using the normalization once more, we get (29); that is, the resulting function is determined up to an arbitrary constant.
Specializing the parameters fixes the value of this constant.
Thus, we have the first of the two asserted relations. Similarly, the second follows, which is (23).
Now (22) together with (23) gives (32).
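Steps of this kind are instances of the classical argument for functional equations on the positive integers: once a relation of Cauchy type is isolated, a regularity condition (here, the continuity supplied by axiom (1)) forces a logarithmic or power-law solution. In symbols, for functions defined on the positive integers and subject to such a regularity condition,
\[
A(mn)=A(m)+A(n)\ \Longrightarrow\ A(m)=c\log m,\qquad
B(mn)=B(m)\,B(n)\ \Longrightarrow\ B(m)=m^{c},
\]
for all positive integers $m$ and $n$; this is the source of the arbitrary constant appearing above.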

Proof of the Theorem. We prove the theorem for rationals; the continuity axiom then extends the result to the reals. To this end, let the probabilities be rationals, say $p_i=m_i/\sum_{j}m_j$ with positive integers $m_i$; an application of Lemma 3 then gives (33), that is, (34). Equation (34) together with (23) and (32) gives (35), which is (15).
This completes the proof of the theorem.

3. Properties of the Entropy of Types $\alpha$, $\beta$, and $\gamma$

The measure characterized in the preceding section, where $P=(p_1,p_2,\dots,p_n)$, $p_i\ge 0$, $\sum_{i=1}^{n}p_i=1$, is a probability distribution, satisfies certain properties, which are given in the following theorems.

Theorem 5. The measure is nonnegative for all admissible values of the parameters $\alpha$, $\beta$, and $\gamma$.

Proof.
Case 1. For the first range of the parameters, the relevant factors agree in sign, and we get the desired inequality.
Case 2. Similarly, for the complementary range of the parameters, the factors again agree in sign, and we get the desired inequality.
Therefore, from Case 1, Case 2, and axiom (2), we get the nonnegativity of the measure in all cases. This completes the proof of the theorem.
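The two-case sign argument is transparent in the one-parameter prototype, which the proof above parallels: for the Havrda-Charvát entropy,
\[
\alpha>1:\ \sum_{i}p_i^{\alpha}\le 1,\ 2^{1-\alpha}-1<0\ \Longrightarrow\ \frac{\sum_i p_i^{\alpha}-1}{2^{1-\alpha}-1}\ge 0;\qquad
0<\alpha<1:\ \sum_{i}p_i^{\alpha}\ge 1,\ 2^{1-\alpha}-1>0\ \Longrightarrow\ H_n^{\alpha}(P)\ge 0,
\]
since in both ranges the numerator and denominator agree in sign.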

Definition 6. We will use the following definition of a convex function.
A function $f$ over the points of a convex set $R$ is convex if, for all $x_k\in R$ and all $\lambda_k\ge 0$ with $\sum_k\lambda_k=1$,
\[
f\Bigl(\sum_k\lambda_k x_k\Bigr)\ \le\ \sum_k\lambda_k f(x_k). \qquad (40)
\]
The function is concave if (40) holds with $\ge$ in place of $\le$.
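A familiar instance of Definition 6: Shannon's entropy is concave, so for distributions $P$ and $Q$ on the same $n$ outcomes and $0\le\lambda\le 1$,
\[
H_n\bigl(\lambda P+(1-\lambda)Q\bigr)\ \ge\ \lambda\,H_n(P)+(1-\lambda)\,H_n(Q),
\]
that is, (40) with the inequality reversed; Theorem 7 below identifies parameter ranges in which the present measure satisfies (40) itself.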

Theorem 7. The measure is a convex function of the probability distribution $P=(p_1,p_2,\dots,p_n)$ when the parameters lie in either of the two stated ranges.

Proof. Let there be $r$ distributions $P_1,P_2,\dots,P_r$ associated with the random variable $X$.
Consider numbers $\lambda_1,\dots,\lambda_r$ such that $\lambda_k\ge 0$ and $\sum_{k=1}^{r}\lambda_k=1$, and define the mixture $P=\sum_{k=1}^{r}\lambda_k P_k$, formed componentwise. Obviously, the components of $P$ are nonnegative and sum to unity, and thus $P$ is a bona fide distribution of $X$.
In the first of the two parameter ranges, the convexity inequality (40) then holds for the measure; that is, the measure of the mixture $P$ is at most the corresponding convex combination of the measures of the $P_k$.

By symmetry in $\alpha$ and $\beta$, the above result is also true for the second parameter range.

Theorem 8. The measure satisfies the following relations:
(i) generalized additivity: relation (45) holds, where the coefficient appearing in (45) is given by (46);
(ii) subadditivity: for the stated parameter range, the measure is subadditive; that is, the measure of the joint distribution does not exceed the sum of the measures of the marginals, where $P$, $Q$, and $P*Q$ are complete probability distributions.

Proof of (i). We have (49). Also, we have (50). Adding (49) and (50) and using (46), we obtain (45). This completes the proof of part (i).

Proof of (ii). From part (i), we have the generalized additivity (45). As the additional term in (45) is nonpositive for the stated parameter range, we obtain the subadditivity inequality. This proves the subadditivity.
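In the one-parameter prototype, the generalized additivity (45) is the familiar pseudo-additivity of the Havrda-Charvát and Tsallis entropies for independent distributions, and the subadditivity argument of part (ii) can be read off from it directly:
\[
H^{\alpha}(P*Q)=H^{\alpha}(P)+H^{\alpha}(Q)+\bigl(2^{1-\alpha}-1\bigr)\,H^{\alpha}(P)\,H^{\alpha}(Q),
\]
equivalently $S_q(P*Q)=S_q(P)+S_q(Q)+(1-q)\,S_q(P)\,S_q(Q)$ in Tsallis's normalization. For $\alpha>1$ the coefficient $2^{1-\alpha}-1$ is negative while the entropies are nonnegative, so the cross term is nonpositive and $H^{\alpha}(P*Q)\le H^{\alpha}(P)+H^{\alpha}(Q)$.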

4. Conclusion

In addition to the well-known information measures of Shannon [5], Rényi, Havrda-Charvát, Vajda [13], and Daróczy, we have characterized a measure which we call the information measure of types $\alpha$, $\beta$, and $\gamma$. We have given some basic axioms and properties, together with a recursive relation. Shannon's [5] measure is included in the information measure of types $\alpha$, $\beta$, and $\gamma$ as a limiting case of the parameters. This measure is a generalization of the Havrda-Charvát entropy.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

References

  1. J. Aczél and Z. Daróczy, On Measures of Information and Their Characterization, Academic Press, New York, NY, USA, 1975.
  2. D. K. Faddeev, "On the concept of entropy of a finite probabilistic scheme," Uspekhi Matematicheskikh Nauk, vol. 11, no. 1(67), pp. 227–231, 1956.
  3. T. W. Chaundy and J. B. McLeod, "On a functional equation," Proceedings of the Edinburgh Mathematical Society, Series II, vol. 12, no. 43, pp. 6–7, 1960.
  4. B. D. Sharma and I. J. Taneja, "Functional measures in information theory," Funkcialaj Ekvacioj, vol. 17, pp. 181–191, 1974.
  5. C. E. Shannon, "A mathematical theory of communication," The Bell System Technical Journal, vol. 27, pp. 379–423, 623–656, 1948.
  6. J. Havrda and F. Charvát, "Quantification method of classification processes. Concept of structural α-entropy," Kybernetika, vol. 3, pp. 30–35, 1967.
  7. Z. Daróczy, "Generalized information functions," Information and Control, vol. 16, pp. 36–51, 1970.
  8. C. Tsallis, "Possible generalization of Boltzmann-Gibbs statistics," Journal of Statistical Physics, vol. 52, no. 1-2, pp. 479–487, 1988.
  9. R. Hanel and S. Thurner, "A comprehensive classification of complex statistical systems and an ab-initio derivation of their entropy and distribution functions," Europhysics Letters, vol. 93, no. 2, Article ID 20006, 2011.
  10. R. Hanel, S. Thurner, and M. Gell-Mann, "Generalized entropies and logarithms and their duality relations," Proceedings of the National Academy of Sciences of the United States of America, vol. 109, no. 47, pp. 19151–19154, 2012.
  11. H. Suyari, "Generalization of Shannon-Khinchin axioms to nonextensive systems and the uniqueness theorem for the nonextensive entropy," IEEE Transactions on Information Theory, vol. 50, no. 8, pp. 1783–1787, 2004.
  12. V. M. Ilić, M. S. Stanković, and E. H. Mulalić, "Comments on 'Generalization of Shannon-Khinchin axioms to nonextensive systems and the uniqueness theorem for the nonextensive entropy'," IEEE Transactions on Information Theory, vol. 59, no. 10, pp. 6950–6952, 2013.
  13. I. Vajda, "Axioms for α-entropy of a generalized probability scheme," Kybernetika, vol. 2, pp. 105–112, 1968.