Characterizations of Generalized Entropy Functions by Functional Equations

Shigeru Furuichi

Research Article | Open Access

Advances in Mathematical Physics, vol. 2011, Article ID 126108, 12 pages, 2011. https://doi.org/10.1155/2011/126108

Academic Editor: Giorgio Kaniadakis
Received: 03 Mar 2011 | Revised: 22 May 2011 | Accepted: 23 May 2011 | Published: 24 Jul 2011

Abstract

We show that a two-parameter extended entropy function is characterized by a functional equation. As a corollary of this result, we obtain that the Tsallis entropy function is characterized by a functional equation of a different form from that used by Suyari and Tsukada (2009), which is restated as Proposition 2.1 in the present paper. We also give an interpretation of the functional equation in our main theorem.

1. Introduction

Recently, generalized entropies have been studied from the mathematical point of view. The typical generalizations of Shannon entropy [1] are Rényi entropy [2] and Tsallis entropy [3]. The recent comprehensive book [4] and the review [5] help readers understand Tsallis statistics. Rényi entropy and Tsallis entropy are defined by
$$R_q(X) = \frac{1}{1-q}\log\sum_{j=1}^{n} p_j^q, \quad (q \neq 1,\ q > 0), \qquad S_q(X) = \frac{\sum_{j=1}^{n}\left(p_j^q - p_j\right)}{1-q}, \quad (q \neq 1,\ q > 0), \tag{1.1}$$

for a given information source $X = \{x_1, \dots, x_n\}$ with probabilities $p_j \equiv \Pr(X = x_j)$. Both entropies recover Shannon entropy
$$S_1(X) \equiv -\sum_{j=1}^{n} p_j \log p_j \tag{1.2}$$
in the limit $q \to 1$. The uniqueness theorem for Tsallis entropy was first given in [6] and improved in [7].
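
As an illustrative aside (ours, not from the paper), the following Python sketch evaluates the three entropies above on a toy distribution and shows numerically that both $R_q$ and $S_q$ approach $S_1$ as $q \to 1$; all names and the sample distribution are our own choices.

```python
import math

def renyi(p, q):
    # R_q(X) = (1/(1-q)) log(sum_j p_j^q), for q != 1, q > 0; see (1.1)
    return math.log(sum(pj ** q for pj in p)) / (1.0 - q)

def tsallis(p, q):
    # S_q(X) = sum_j (p_j^q - p_j) / (1-q), for q != 1, q > 0; see (1.1)
    return sum(pj ** q - pj for pj in p) / (1.0 - q)

def shannon(p):
    # S_1(X) = -sum_j p_j log p_j; see (1.2)
    return -sum(pj * math.log(pj) for pj in p if pj > 0)

p = [0.5, 0.3, 0.2]
for q in (0.5, 0.9, 0.999, 1.001, 2.0):
    print(q, renyi(p, q), tsallis(p, q))
print("Shannon:", shannon(p))  # both columns tend to this value as q -> 1
```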

Throughout this paper, we call a parametric extension of entropy, such as Rényi entropy and Tsallis entropy, a generalized entropy. If we take $n = 2$ in (1.2), we have the so-called binary entropy $s_b(x) = -x \log x - (1-x)\log(1-x)$. If we take $n = 1$ in (1.2), we have the Shannon entropy function $f(x) = -x \log x$. In this paper, we treat an entropy function with two parameters. We note that the relative entropic function $-y f(x/y) = x(\log x - \log y)$ can be produced by the use of the Shannon entropy function $f(x)$.

We note that Rényi entropy has the additivity
$$R_q(X \times Y) = R_q(X) + R_q(Y), \tag{1.3}$$
but Tsallis entropy has the nonadditivity
$$S_q(X \times Y) = S_q(X) + S_q(Y) + (1-q) S_q(X) S_q(Y), \tag{1.4}$$
where $X \times Y$ means that $X$ and $Y$ are independent random variables. Therefore, these entropies differ in a definitive way, although we have the simple relation between them
$$\exp\left(R_q(X)\right) = \exp_q\left(S_q(X)\right), \quad (q \neq 1), \tag{1.5}$$
where the $q$-exponential function $\exp_q(x) \equiv \{1 + (1-q)x\}^{1/(1-q)}$ is defined if $1 + (1-q)x \geq 0$. Note that we have $\exp_q\left(S_q(X)\right) = \left(\sum_{j=1}^{n} p_j^q\right)^{1/(1-q)} > 0$.
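
To make the contrast concrete, here is a small numerical check (again ours, purely illustrative): on a product distribution, Rényi entropy is additive as in (1.3), Tsallis entropy obeys the nonadditivity (1.4), and the bridge (1.5) holds.

```python
import math

def renyi(p, q):
    return math.log(sum(x ** q for x in p)) / (1.0 - q)

def tsallis(p, q):
    return sum(x ** q - x for x in p) / (1.0 - q)

def exp_q(x, q):
    # q-exponential {1 + (1-q) x}^{1/(1-q)}, defined when 1 + (1-q) x >= 0
    return (1.0 + (1.0 - q) * x) ** (1.0 / (1.0 - q))

q = 1.7
px, py = [0.6, 0.4], [0.5, 0.3, 0.2]
pxy = [a * b for a in px for b in py]  # joint distribution of independent X, Y

print(renyi(pxy, q), renyi(px, q) + renyi(py, q))    # equal: additivity (1.3)
lhs = tsallis(pxy, q)
rhs = (tsallis(px, q) + tsallis(py, q)
       + (1 - q) * tsallis(px, q) * tsallis(py, q))
print(lhs, rhs)                                      # equal: nonadditivity (1.4)
print(math.exp(renyi(px, q)), exp_q(tsallis(px, q), q))  # equal: relation (1.5)
```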

Tsallis entropy is rewritten as
$$S_q(X) = -\sum_{j=1}^{n} p_j^q \ln_q p_j, \tag{1.6}$$
where the $q$-logarithmic function (which is the inverse function of $\exp_q(\cdot)$) is defined by
$$\ln_q x \equiv \frac{x^{1-q} - 1}{1-q}, \quad (q \neq 1), \tag{1.7}$$
which converges to $\log x$ in the limit $q \to 1$.

Since Shannon entropy can be regarded as the expectation value of each value $-\log p_j$, we may consider, by analogy, that Tsallis entropy can be regarded as the $q$-expectation value of each value $-\ln_q p_j$, where the $q$-expectation value $E_q$ is defined by
$$E_q(X) \equiv \sum_{j=1}^{n} p_j^q x_j. \tag{1.8}$$

However, the $q$-expectation value $E_q$ lacks a fundamental property such as $E(1) = 1$, so it was considered inadequate to adopt as a generalized definition of the usual expectation value. The normalized $q$-expectation value was therefore introduced:
$$E_q^{(\mathrm{nor})}(X) \equiv \frac{\sum_{j=1}^{n} p_j^q x_j}{\sum_{i=1}^{n} p_i^q}, \tag{1.9}$$
and by using this, the normalized Tsallis entropy was defined by
$$S_q^{(\mathrm{nor})}(X) \equiv \frac{S_q(X)}{\sum_{j=1}^{n} p_j^q} = -\frac{\sum_{j=1}^{n} p_j^q \ln_q p_j}{\sum_{i=1}^{n} p_i^q}, \quad (q \neq 1). \tag{1.10}$$

We easily find that we have the following nonadditivity relation for the normalized Tsallis entropy:
$$S_q^{(\mathrm{nor})}(X \times Y) = S_q^{(\mathrm{nor})}(X) + S_q^{(\mathrm{nor})}(Y) + (q-1) S_q^{(\mathrm{nor})}(X) S_q^{(\mathrm{nor})}(Y). \tag{1.11}$$
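
A corresponding check for the normalized Tsallis entropy (1.10) and its nonadditivity (1.11) can be sketched as follows (illustrative code of ours; the toy distributions are arbitrary).

```python
def ln_q(x, q):
    # q-logarithm (x^{1-q} - 1)/(1-q), q != 1; see (1.7)
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

def tsallis_nor(p, q):
    # S_q^{(nor)}(X) = -sum_j p_j^q ln_q p_j / sum_i p_i^q; see (1.10)
    return -sum(x ** q * ln_q(x, q) for x in p) / sum(x ** q for x in p)

q = 0.8
px, py = [0.6, 0.4], [0.5, 0.3, 0.2]
pxy = [a * b for a in px for b in py]
lhs = tsallis_nor(pxy, q)
rhs = (tsallis_nor(px, q) + tsallis_nor(py, q)
       + (q - 1) * tsallis_nor(px, q) * tsallis_nor(py, q))
print(lhs, rhs)  # equal, with coefficient (q-1) rather than (1-q); see (1.11)
```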

As for the details of the mathematical properties of the normalized Tsallis entropy, see [8], for example. See also [9] for the role of Tsallis entropy and the normalized Tsallis entropy in statistical physics. The difference between the two nonadditivity relations (1.4) and (1.11) is the sign of the coefficient $1-q$ in the third term of the right-hand sides.

We note that Tsallis entropy can also be rewritten as
$$S_q(X) = \sum_{j=1}^{n} p_j \ln_q \frac{1}{p_j}, \tag{1.12}$$
so that we may regard it as the expectation value $S_q(X) = E_1\left[\ln_q(1/p_j)\right]$, where $E_1$ means the usual expectation value $E_1[X] = \sum_{j=1}^{n} p_j x_j$. However, if we adopt this formulation in the definition of Tsallis conditional entropy, we lose an important property such as the chain rule (see [10] for details). Therefore, we often adopt the formulation using the $q$-expectation value.
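
As a quick illustrative consistency check (ours), the two rewritings (1.6) and (1.12) both reproduce the defining form (1.1) numerically.

```python
def ln_q(x, q):
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

p, q = [0.5, 0.3, 0.2], 1.4
print(sum(x ** q - x for x in p) / (1.0 - q))  # (1.1)
print(-sum(x ** q * ln_q(x, q) for x in p))    # (1.6)
print(sum(x * ln_q(1.0 / x, q) for x in p))    # (1.12): E_1[ln_q(1/p_j)]
```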

As a further generalization, a two-parameter extended entropy
$$S_{\kappa,r}(X) \equiv -\sum_{j=1}^{n} p_j \ln_{\kappa,r} p_j \tag{1.13}$$
was recently introduced in [11, 12] and systematically studied together with the generalized exponential function and the generalized logarithmic function $\ln_{\kappa,r}(x) \equiv x^r \left(\left(x^\kappa - x^{-\kappa}\right)/2\kappa\right)$. In the present paper, we treat a two-parameter extended entropy defined in the following form:
$$S_{\alpha,\beta}(X) \equiv \sum_{j=1}^{n} \frac{p_j^\alpha - p_j^\beta}{\beta - \alpha}, \quad (\alpha, \beta \in \mathbb{R},\ \alpha \neq \beta), \tag{1.14}$$

for two positive numbers $\alpha$ and $\beta$. This form can be obtained by putting $\alpha = 1 + r - \kappa$ and $\beta = 1 + r + \kappa$ in (1.13), and it coincides with the two-parameter extended entropy studied in [13]. In addition, the two-parameter extended entropy (1.14) was axiomatically characterized in [14]. Furthermore, a two-parameter extended relative entropy was also axiomatically characterized in [15].
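
The parameter substitution can also be verified numerically; the following sketch (ours, illustrative) confirms that the $(\kappa, r)$ form (1.13) and the $(\alpha, \beta)$ form (1.14) agree under $\alpha = 1 + r - \kappa$ and $\beta = 1 + r + \kappa$.

```python
def ln_kr(x, kappa, r):
    # ln_{kappa,r}(x) = x^r (x^kappa - x^{-kappa}) / (2 kappa)
    return x ** r * (x ** kappa - x ** (-kappa)) / (2.0 * kappa)

def s_kr(p, kappa, r):
    return -sum(x * ln_kr(x, kappa, r) for x in p)                  # (1.13)

def s_ab(p, alpha, beta):
    return sum(x ** alpha - x ** beta for x in p) / (beta - alpha)  # (1.14)

p, kappa, r = [0.5, 0.3, 0.2], 0.3, 0.1
alpha, beta = 1 + r - kappa, 1 + r + kappa
print(s_kr(p, kappa, r), s_ab(p, alpha, beta))  # identical values
```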

In the paper [16], a characterization of the Tsallis entropy function was proven by using a functional equation. In the present paper, we show that the two-parameter extended entropy function
$$f_{\alpha,\beta}(x) = \frac{x^\alpha - x^\beta}{\beta - \alpha}, \quad (\alpha, \beta \in \mathbb{R},\ \alpha \neq \beta), \tag{1.15}$$
can be characterized by a simple functional equation.

2. A Review of the Characterization of Tsallis Entropy Function by the Functional Equation

The following proposition was originally given in [16] with a simple and elegant proof. Here, we give an alternative proof along the lines of the proof given in [17].

Proposition 2.1 (see [16]). If the differentiable nonnegative function $f_q$ with positive parameter $q \in \mathbb{R}$ satisfies the following functional equation:
$$f_q(xy) + f_q((1-x)y) - f_q(y) = \left(f_q(x) + f_q(1-x)\right) y^q, \quad (0 < x < 1,\ 0 < y \leq 1), \tag{2.1}$$
then the function $f_q$ is uniquely given by
$$f_q(x) = -c_q x^q \ln_q x, \tag{2.2}$$
where $c_q$ is a nonnegative constant depending only on the parameter $q$.

Proof. If we put $y = 1$ in (2.1), then we have $f_q(1) = 0$. From here on, we assume that $y \neq 1$. We also put $g_q(t) \equiv f_q(t)/t$; then we have
$$x g_q(xy) + (1-x) g_q((1-x)y) - g_q(y) = \left(x g_q(x) + (1-x) g_q(1-x)\right) y^{q-1}. \tag{2.3}$$
Putting $x = 1/2$ in (2.3), we have
$$g_q\left(\frac{y}{2}\right) = g_q\left(\frac{1}{2}\right) y^{q-1} + g_q(y). \tag{2.4}$$
Substituting $y/2$ into $y$, we have
$$g_q\left(\frac{y}{2^2}\right) = g_q\left(\frac{1}{2}\right)\left\{y^{q-1} + \left(\frac{y}{2}\right)^{q-1}\right\} + g_q(y). \tag{2.5}$$
By repeating similar substitutions, we have
$$g_q\left(\frac{y}{2^N}\right) = g_q\left(\frac{1}{2}\right) y^{q-1}\left\{1 + \left(\frac{1}{2}\right)^{q-1} + \left(\frac{1}{2}\right)^{2(q-1)} + \cdots + \left(\frac{1}{2}\right)^{(N-1)(q-1)}\right\} + g_q(y) = g_q\left(\frac{1}{2}\right) y^{q-1}\, \frac{2^{N(1-q)} - 1}{2^{1-q} - 1} + g_q(y). \tag{2.6}$$
Then we have
$$\lim_{N \to \infty} \frac{g_q\left(y/2^N\right)}{2^N} = 0, \tag{2.7}$$
due to $q > 0$. Differentiating (2.3) by $y$, we have
$$x^2 g_q'(xy) + (1-x)^2 g_q'((1-x)y) - g_q'(y) = (q-1)\left(x g_q(x) + (1-x) g_q(1-x)\right) y^{q-2}. \tag{2.8}$$
Putting $y = 1$ in the above equation, we have
$$x^2 g_q'(x) + (1-x)^2 g_q'(1-x) + (1-q)\left(x g_q(x) + (1-x) g_q(1-x)\right) = -c_q, \tag{2.9}$$
where $c_q = -g_q'(1)$.
By integrating (2.3) from $2^{-N}$ to $1$ with respect to $y$ and performing the change of variables, we have
$$\int_{2^{-N}x}^{x} g_q(t)\,dt + \int_{2^{-N}(1-x)}^{1-x} g_q(t)\,dt - \int_{2^{-N}}^{1} g_q(t)\,dt = \left(x g_q(x) + (1-x) g_q(1-x)\right) \frac{1 - 2^{-qN}}{q}. \tag{2.10}$$
By differentiating the above equation with respect to $x$, we have
$$g_q(x) - 2^{-N} g_q\left(2^{-N}x\right) - g_q(1-x) + 2^{-N} g_q\left(2^{-N}(1-x)\right) = \frac{1 - 2^{-qN}}{q}\left(g_q(x) + x g_q'(x) - g_q(1-x) - (1-x) g_q'(1-x)\right). \tag{2.11}$$
Taking the limit $N \to \infty$ in the above, we have
$$(1-x) g_q'(1-x) + (1-q) g_q(1-x) = x g_q'(x) + (1-q) g_q(x), \tag{2.12}$$
thanks to (2.7). From (2.9) and (2.12), we have the following differential equation:
$$x g_q'(x) + (1-q) g_q(x) = -c_q. \tag{2.13}$$
This differential equation has the following general solution:
$$g_q(x) = -\frac{c_q}{1-q} + d_q x^{q-1}, \tag{2.14}$$
where $d_q$ is an integration constant depending on $q$. From $g_q(1) = 0$, we have $d_q = c_q/(1-q)$. Thus, we have
$$g_q(x) = c_q \frac{x^{q-1} - 1}{1-q}. \tag{2.15}$$
Finally, we have
$$f_q(x) = c_q \frac{x^q - x}{1-q} = -c_q x^q \ln_q x. \tag{2.16}$$
From $f_q(x) \geq 0$, we have $c_q \geq 0$.
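
Although it is not part of the proof, the solution (2.2) can be checked against the functional equation (2.1) numerically; the sketch below (ours) samples a small grid of $(x, y)$.

```python
def ln_q(t, q):
    return (t ** (1.0 - q) - 1.0) / (1.0 - q)

def f_q(t, q, c=1.0):
    # the claimed solution f_q(t) = -c_q t^q ln_q t of (2.2)
    return -c * t ** q * ln_q(t, q)

q = 1.5
for x in (0.2, 0.5, 0.8):
    for y in (0.3, 0.7, 1.0):
        lhs = f_q(x * y, q) + f_q((1 - x) * y, q) - f_q(y, q)
        rhs = (f_q(x, q) + f_q(1 - x, q)) * y ** q
        assert abs(lhs - rhs) < 1e-12, (x, y)
print("(2.1) holds on the sampled grid")
```
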
If we take the limit as $q \to 1$ in Proposition 2.1, we have the following corollary.

Corollary 2.2 (see [17]). If the differentiable nonnegative function $f$ satisfies the following functional equation:
$$f(xy) + f((1-x)y) - f(y) = \left(f(x) + f(1-x)\right) y, \quad (0 < x < 1,\ 0 < y \leq 1), \tag{2.17}$$
then the function $f$ is uniquely given by
$$f(x) = -c x \log x, \tag{2.18}$$
where $c$ is a nonnegative constant.

3. Main Results

In this section, we give a characterization of a two-parameter extended entropy function by the functional equation. Before we give our main theorem, we review the following result given by Kannappan [18, 19].

Proposition 3.1 (see [18, 19]). Let $(p_1, \dots, p_n)$ and $(q_1, \dots, q_m)$ be two probability distributions. If the measurable function $f : (0,1) \to \mathbb{R}$ satisfies
$$\sum_{i=1}^{n} \sum_{j=1}^{m} f\left(p_i q_j\right) = \sum_{i=1}^{n} p_i^\alpha \sum_{j=1}^{m} f\left(q_j\right) + \sum_{j=1}^{m} q_j^\beta \sum_{i=1}^{n} f\left(p_i\right) \tag{3.1}$$
for all $(p_1, \dots, p_n)$ and $(q_1, \dots, q_m)$ with fixed $m, n \geq 3$, then the function $f$ is given by
$$f(p) = \begin{cases} c\left(p^\alpha - p^\beta\right), & \alpha \neq \beta, \\ c\, p^\alpha \log p, & \alpha = \beta \neq 1, \\ c\, p \log p + b(mn - m - n)p + b, & \alpha = \beta = 1, \end{cases} \tag{3.2}$$
where $c$ and $b$ are arbitrary constants.

Here, we review a two-parameter generalized Shannon additivity [14, equation (30)]:
$$\sum_{i=1}^{n} \sum_{j=1}^{m_i} s_{\alpha,\beta}\left(p_{ij}\right) = \sum_{i=1}^{n} p_i^\alpha \sum_{j=1}^{m_i} s_{\alpha,\beta}\left(p(j \mid i)\right) + \sum_{i=1}^{n} s_{\alpha,\beta}\left(p_i\right) \sum_{j=1}^{m_i} p(j \mid i)^\beta, \tag{3.3}$$
where $s_{\alpha,\beta}$ is a component of the trace form of the two-parameter entropy [14, equation (26)]:
$$S_{\alpha,\beta}\left(\{p_i\}\right) = \sum_{i=1}^{n} s_{\alpha,\beta}\left(p_i\right). \tag{3.4}$$

Equation (3.3) was used to prove the uniqueness theorem for the two-parameter extended entropy in [14], where a tree-graphical interpretation of (3.3) was also given. The condition (3.1) can be read as the independent case ($p(j \mid i) = p_j$) of (3.3).

Here, we consider the simplest nontrivial case of (3.3). Take $\{p_{ij}\} = \{q_1, q_2, q_3\}$, $p_1 = q_1 + q_2$, and $p_2 = q_3$. Then we have $p(1 \mid 1) = q_1/(q_1 + q_2)$, $p(2 \mid 1) = q_2/(q_1 + q_2)$, $p(1 \mid 2) = 1$, and $p(2 \mid 2) = 0$, so (3.3) is written as
$$S_{\alpha,\beta}\left(q_1, q_2, q_3\right) = \left(q_1 + q_2\right)^\alpha \left\{ s_{\alpha,\beta}\left(\frac{q_1}{q_1+q_2}\right) + s_{\alpha,\beta}\left(\frac{q_2}{q_1+q_2}\right) \right\} + q_3^\alpha \left\{ s_{\alpha,\beta}(1) + s_{\alpha,\beta}(0) \right\} + s_{\alpha,\beta}\left(q_1 + q_2\right) \left\{ \left(\frac{q_1}{q_1+q_2}\right)^\beta + \left(\frac{q_2}{q_1+q_2}\right)^\beta \right\} + s_{\alpha,\beta}\left(q_3\right). \tag{3.5}$$

If $s_{\alpha,\beta}$ is an entropic function, then it vanishes at $0$ and $1$, since entropy carries no informational quantity in the deterministic cases; the above identity then reduces to
$$S_{\alpha,\beta}\left(q_1, q_2, q_3\right) = \left(q_1 + q_2\right)^\alpha \left\{ s_{\alpha,\beta}\left(\frac{q_1}{q_1+q_2}\right) + s_{\alpha,\beta}\left(\frac{q_2}{q_1+q_2}\right) \right\} + s_{\alpha,\beta}\left(q_1 + q_2\right) \left\{ \left(\frac{q_1}{q_1+q_2}\right)^\beta + \left(\frac{q_2}{q_1+q_2}\right)^\beta \right\} + s_{\alpha,\beta}\left(q_3\right). \tag{3.6}$$
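
As a sanity check (ours, illustrative), the reduced identity (3.6) can be evaluated directly with the entropic function $s_{\alpha,\beta}(p) = (p^\alpha - p^\beta)/(\beta - \alpha)$ of (1.14).

```python
def s_ab(p, a, b):
    # entropic function s_{alpha,beta}(p) = (p^a - p^b)/(b - a); see (1.14)
    return (p ** a - p ** b) / (b - a)

a, b = 0.7, 1.6
q1, q2, q3 = 0.2, 0.3, 0.5
Q = q1 + q2
lhs = s_ab(q1, a, b) + s_ab(q2, a, b) + s_ab(q3, a, b)
rhs = (Q ** a * (s_ab(q1 / Q, a, b) + s_ab(q2 / Q, a, b))
       + s_ab(Q, a, b) * ((q1 / Q) ** b + (q2 / Q) ** b)
       + s_ab(q3, a, b))
print(lhs, rhs)  # the two sides of (3.6) coincide
```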

In the following theorem, we adopt a simpler condition than (3.1).

Theorem 3.2. If the differentiable nonnegative function $f_{\alpha,\beta}$ with two positive parameters $\alpha, \beta \in \mathbb{R}$ satisfies the following functional equation:
$$f_{\alpha,\beta}(xy) = x^\alpha f_{\alpha,\beta}(y) + y^\beta f_{\alpha,\beta}(x), \quad (0 < x, y \leq 1), \tag{3.7}$$
then the function $f_{\alpha,\beta}$ is uniquely given by
$$f_{\alpha,\beta}(x) = c_{\alpha,\beta} \frac{x^\beta - x^\alpha}{\alpha - \beta}, \quad (\alpha \neq \beta), \qquad f_\alpha(x) = -c_\alpha x^\alpha \log x, \quad (\alpha = \beta), \tag{3.8}$$
where $c_{\alpha,\beta}$ and $c_\alpha$ are nonnegative constants depending only on the parameters $\alpha$ (and $\beta$).

Proof. If we put $y = 1$, then we have $f_{\alpha,\beta}(1) = 0$ due to $x > 0$. By differentiating (3.7) with respect to $y$, we have
$$x f_{\alpha,\beta}'(xy) = x^\alpha f_{\alpha,\beta}'(y) + \beta y^{\beta-1} f_{\alpha,\beta}(x). \tag{3.9}$$
Putting $y = 1$ in (3.9), we have the following differential equation:
$$x f_{\alpha,\beta}'(x) - \beta f_{\alpha,\beta}(x) = -c_{\alpha,\beta} x^\alpha, \tag{3.10}$$
where we put $c_{\alpha,\beta} \equiv -f_{\alpha,\beta}'(1)$. Equation (3.10) can be rewritten as
$$x^{\beta+1} \left(x^{-\beta} f_{\alpha,\beta}(x)\right)' = -c_{\alpha,\beta} x^\alpha, \tag{3.11}$$
that is, we have
$$\left(x^{-\beta} f_{\alpha,\beta}(x)\right)' = -c_{\alpha,\beta} x^{\alpha-\beta-1}. \tag{3.12}$$
Integrating both sides of the above equation with respect to $x$, we have
$$x^{-\beta} f_{\alpha,\beta}(x) = -\frac{c_{\alpha,\beta} x^{\alpha-\beta}}{\alpha - \beta} + d_{\alpha,\beta}, \tag{3.13}$$
where $d_{\alpha,\beta}$ is an integration constant depending on $\alpha$ and $\beta$. Therefore, we have
$$f_{\alpha,\beta}(x) = -\frac{c_{\alpha,\beta} x^\alpha}{\alpha - \beta} + d_{\alpha,\beta} x^\beta. \tag{3.14}$$
By $f_{\alpha,\beta}(1) = 0$, we have $d_{\alpha,\beta} = c_{\alpha,\beta}/(\alpha - \beta)$. Thus, we have
$$f_{\alpha,\beta}(x) = \frac{c_{\alpha,\beta}\left(x^\beta - x^\alpha\right)}{\alpha - \beta}. \tag{3.15}$$
Also, by $f_{\alpha,\beta}(x) \geq 0$, we have $c_{\alpha,\beta} \geq 0$.
The case $\alpha = \beta$ can be proved in a similar way.
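
Complementing the proof, a brief numerical verification (ours) shows that both branches of (3.8) satisfy the functional equation (3.7) on sampled points.

```python
import math

def f_ab(x, a, b, c=1.0):
    # both branches of (3.8), with c_{alpha,beta} = c_alpha = c
    if a != b:
        return c * (x ** b - x ** a) / (a - b)
    return -c * x ** a * math.log(x)

for a, b in ((0.7, 1.6), (1.2, 1.2)):
    for x in (0.2, 0.5, 0.9):
        for y in (0.3, 0.6, 1.0):
            lhs = f_ab(x * y, a, b)
            rhs = x ** a * f_ab(y, a, b) + y ** b * f_ab(x, a, b)
            assert abs(lhs - rhs) < 1e-12
print("(3.7) holds for both the alpha != beta and alpha == beta branches")
```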

Remark 3.3. We can derive (3.6) from our condition (3.7). First, we easily have $f_{\alpha,\beta}(0) = f_{\alpha,\beta}(1) = 0$ from our condition (3.7). In addition, for $q = q_1 + q_2$ we have
$$\begin{aligned} S_{\alpha,\beta}\left(q \frac{q_1}{q}, q \frac{q_2}{q}, q_3\right) &= f_{\alpha,\beta}\left(q \frac{q_1}{q}\right) + f_{\alpha,\beta}\left(q \frac{q_2}{q}\right) + f_{\alpha,\beta}\left(q_3\right) \\ &= q^\alpha f_{\alpha,\beta}\left(\frac{q_1}{q}\right) + \left(\frac{q_1}{q}\right)^\beta f_{\alpha,\beta}(q) + q^\alpha f_{\alpha,\beta}\left(\frac{q_2}{q}\right) + \left(\frac{q_2}{q}\right)^\beta f_{\alpha,\beta}(q) + f_{\alpha,\beta}\left(q_3\right) \\ &= \left(q_1 + q_2\right)^\alpha \left\{ f_{\alpha,\beta}\left(\frac{q_1}{q_1+q_2}\right) + f_{\alpha,\beta}\left(\frac{q_2}{q_1+q_2}\right) \right\} + f_{\alpha,\beta}\left(q_1 + q_2\right) \left\{ \left(\frac{q_1}{q_1+q_2}\right)^\beta + \left(\frac{q_2}{q_1+q_2}\right)^\beta \right\} + f_{\alpha,\beta}\left(q_3\right). \end{aligned} \tag{3.16}$$
Thus, we may interpret that our condition (3.7) contains an essential part of the two-parameter generalized Shannon additivity.
Note that we can reproduce the two-parameter entropic function by the use of $f_{\alpha,\beta}$ as
$$-y f_{\alpha,\beta}\left(\frac{x}{y}\right) = \frac{x^\alpha y^{1-\alpha} - x^\beta y^{1-\beta}}{\alpha - \beta}, \tag{3.17}$$
with $c_{\alpha,\beta} = 1$ for simplicity. This leads to the two-parameter extended relative entropy [15]
$$D_{\alpha,\beta}\left(x_1, \dots, x_n \,\|\, y_1, \dots, y_n\right) \equiv \sum_{j=1}^{n} \frac{x_j^\alpha y_j^{1-\alpha} - x_j^\beta y_j^{1-\beta}}{\alpha - \beta}. \tag{3.18}$$
See also [20] for the first appearance of the Tsallis relative entropy (generalized Kullback-Leibler information).
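
An illustrative check of (3.18) (ours): on a sample pair of distributions, $D_{\alpha,\beta}$ is nonnegative, and it tends to the Kullback-Leibler divergence as $(\alpha, \beta) \to (1, 1)$.

```python
import math

def d_ab(xs, ys, a, b):
    # two-parameter extended relative entropy (3.18)
    return sum((x ** a * y ** (1 - a) - x ** b * y ** (1 - b)) / (a - b)
               for x, y in zip(xs, ys))

def kl(xs, ys):
    return sum(x * math.log(x / y) for x, y in zip(xs, ys))

xs, ys = [0.5, 0.3, 0.2], [0.4, 0.4, 0.2]
print(d_ab(xs, ys, 1.3, 0.8))                  # nonnegative
print(d_ab(xs, ys, 1.001, 0.999), kl(xs, ys))  # approaches the KL divergence
```
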
If we take $\alpha = q, \beta = 1$ or $\alpha = 1, \beta = q$ in Theorem 3.2, we have the following corollary.

Corollary 3.4. If the differentiable nonnegative function $f_q$ with a positive parameter $q \in \mathbb{R}$ satisfies the following functional equation:
$$f_q(xy) = x^q f_q(y) + y f_q(x), \quad (0 < x, y \leq 1,\ q \neq 1), \tag{3.19}$$
then the function $f_q$ is uniquely given by
$$f_q(x) = -c_q x^q \ln_q x, \tag{3.20}$$
where $c_q$ is a nonnegative constant depending only on the parameter $q$.

Here, we give an interpretation of the functional equation (3.19) from the viewpoint of Tsallis statistics.

Remark 3.5. We assume that we have the following two functional equations for $0 < x, y \leq 1$:
$$f_q(xy) = y f_q(x) + x f_q(y) + (1-q) f_q(x) f_q(y), \qquad f_q(xy) = y^q f_q(x) + x^q f_q(y) + (q-1) f_q(x) f_q(y). \tag{3.21}$$
These equations lead to the following equations for $0 < x_i, y_j \leq 1$:
$$f_q\left(x_i y_j\right) = y_j f_q\left(x_i\right) + x_i f_q\left(y_j\right) + (1-q) f_q\left(x_i\right) f_q\left(y_j\right), \qquad f_q\left(x_i y_j\right) = y_j^q f_q\left(x_i\right) + x_i^q f_q\left(y_j\right) + (q-1) f_q\left(x_i\right) f_q\left(y_j\right), \tag{3.22}$$
where $i = 1, \dots, n$ and $j = 1, \dots, m$. Taking the summation over $i$ and $j$ on both sides, we have
$$\sum_{i=1}^{n} \sum_{j=1}^{m} f_q\left(x_i y_j\right) = \sum_{i=1}^{n} f_q\left(x_i\right) + \sum_{j=1}^{m} f_q\left(y_j\right) + (1-q) \sum_{i=1}^{n} f_q\left(x_i\right) \sum_{j=1}^{m} f_q\left(y_j\right), \tag{3.23}$$
$$\sum_{i=1}^{n} \sum_{j=1}^{m} f_q\left(x_i y_j\right) = \sum_{j=1}^{m} y_j^q \sum_{i=1}^{n} f_q\left(x_i\right) + \sum_{i=1}^{n} x_i^q \sum_{j=1}^{m} f_q\left(y_j\right) + (q-1) \sum_{i=1}^{n} f_q\left(x_i\right) \sum_{j=1}^{m} f_q\left(y_j\right), \tag{3.24}$$
under the condition $\sum_{i=1}^{n} x_i = \sum_{j=1}^{m} y_j = 1$. If the function $f_q(x)$ is given by (3.20), then these two functional equations coincide with the two nonadditivity relations given in (1.4) and (1.11).
On the other hand, by averaging the two functional equations in (3.21), we have
$$f_q(xy) = \left(\frac{x^q + x}{2}\right) f_q(y) + \left(\frac{y^q + y}{2}\right) f_q(x), \quad (0 < x, y \leq 1,\ q \neq 1). \tag{3.25}$$
In a similar way to the proof of Theorem 3.2, we can show that the functional equation (3.25) uniquely determines the function $f_q$ in the form given in (3.20). Therefore, we can conclude that the two functional equations in (3.21), which correspond to the nonadditivity relations (1.4) and (1.11), also characterize the Tsallis entropy function.
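
Numerically (an illustrative sketch of ours), $f_q(x) = -x^q \ln_q x$ indeed satisfies both equations in (3.21) as well as the averaged equation (3.25).

```python
def ln_q(t, q):
    return (t ** (1.0 - q) - 1.0) / (1.0 - q)

def f_q(t, q):
    return -t ** q * ln_q(t, q)  # (3.20) with c_q = 1

q = 0.6
for x in (0.2, 0.5, 0.9):
    for y in (0.3, 0.7, 1.0):
        fxy = f_q(x * y, q)
        eq1 = y * f_q(x, q) + x * f_q(y, q) + (1 - q) * f_q(x, q) * f_q(y, q)
        eq2 = (y ** q * f_q(x, q) + x ** q * f_q(y, q)
               + (q - 1) * f_q(x, q) * f_q(y, q))
        eq3 = (x ** q + x) / 2 * f_q(y, q) + (y ** q + y) / 2 * f_q(x, q)
        assert max(abs(fxy - eq1), abs(fxy - eq2), abs(fxy - eq3)) < 1e-12
print("both equations in (3.21) and the averaged equation (3.25) hold")
```
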
If we again take the limit as $q \to 1$ in Corollary 3.4, we have the following corollary.

Corollary 3.6. If the differentiable nonnegative function $f$ satisfies the following functional equation:
$$f(xy) = y f(x) + x f(y), \quad (0 < x, y \leq 1), \tag{3.26}$$
then the function $f$ is uniquely given by
$$f(x) = -c x \log x, \tag{3.27}$$
where $c$ is a nonnegative constant.

4. Conclusion

As we have seen, the two-parameter extended entropy function can be uniquely determined by a simple functional equation. An interpretation related to a tree-graphical structure was also given as a remark.

Recently, the extensive behaviours of generalized entropies were studied in [21–23]. Our condition given in (3.7) may be seen as an extensive form. However, I have not yet found any relation between our functional equation (3.7) and the extensive behaviours of the generalized entropies. This problem is outside the purpose of the present paper, but it is quite interesting to study as future work.

Acknowledgments

This paper is dedicated to Professor Kenjiro Yanagi on his 60th birthday. The author would like to thank the anonymous reviewers for providing valuable comments to improve the paper. The author was partially supported by the Japanese Ministry of Education, Science, Sports and Culture, Grant-in-Aid for Encouragement of Young Scientists (B) 20740067.

References

  1. C. E. Shannon, "A mathematical theory of communication," The Bell System Technical Journal, vol. 27, pp. 379–423, 623–656, 1948.
  2. A. Rényi, "On measures of entropy and information," in Proceedings of the 4th Berkeley Symposium on Mathematical Statistics and Probability, vol. 1, pp. 547–561, University of California Press, Berkeley, Calif, USA, 1961.
  3. C. Tsallis, "Possible generalization of Boltzmann-Gibbs statistics," Journal of Statistical Physics, vol. 52, no. 1-2, pp. 479–487, 1988.
  4. C. Tsallis, Introduction to Nonextensive Statistical Mechanics, Springer, New York, NY, USA, 2009.
  5. C. Tsallis, D. Prato, and A. R. Plastino, "Nonextensive statistical mechanics: some links with astronomical phenomena," Astrophysics and Space Science, vol. 290, pp. 259–274, 2004.
  6. H. Suyari, "Generalization of Shannon-Khinchin axioms to nonextensive systems and the uniqueness theorem for the nonextensive entropy," IEEE Transactions on Information Theory, vol. 50, no. 8, pp. 1783–1787, 2004.
  7. S. Furuichi, "On uniqueness theorems for Tsallis entropy and Tsallis relative entropy," IEEE Transactions on Information Theory, vol. 51, no. 10, pp. 3638–3645, 2005.
  8. H. Suyari, "Nonextensive entropies derived from form invariance of pseudoadditivity," Physical Review E, vol. 65, no. 6, 066118, 2002.
  9. C. Tsallis, R. S. Mendes, and A. R. Plastino, "The role of constraints within generalized nonextensive statistics," Physica A, vol. 261, pp. 534–554, 1998.
  10. S. Furuichi, "Information theoretical properties of Tsallis entropies," Journal of Mathematical Physics, vol. 47, no. 2, 023302, 2006.
  11. G. Kaniadakis, M. Lissia, and A. M. Scarfone, "Deformed logarithms and entropies," Physica A, vol. 340, no. 1–3, pp. 41–49, 2004.
  12. G. Kaniadakis, M. Lissia, and A. M. Scarfone, "Two-parameter deformations of logarithm, exponential, and entropy: a consistent framework for generalized statistical mechanics," Physical Review E, vol. 71, no. 4, 046128, 2005.
  13. E. P. Borges and I. Roditi, "A family of nonextensive entropies," Physics Letters A, vol. 246, no. 5, pp. 399–402, 1998.
  14. T. Wada and H. Suyari, "A two-parameter generalization of Shannon-Khinchin axioms and the uniqueness theorem," Physics Letters A, vol. 368, no. 3-4, pp. 199–205, 2007.
  15. S. Furuichi, "An axiomatic characterization of a two-parameter extended relative entropy," Journal of Mathematical Physics, vol. 51, no. 12, 123302, 2010.
  16. H. Suyari and M. Tsukada, "Tsallis differential entropy and divergences derived from the generalized Shannon-Khinchin axioms," in Proceedings of the IEEE International Symposium on Information Theory (ISIT '09), pp. 149–153, Seoul, Korea, 2009.
  17. Y. Horibe, "Entropy of terminal distributions and the Fibonacci trees," The Fibonacci Quarterly, vol. 26, no. 2, pp. 135–140, 1988.
  18. P. L. Kannappan, "An application of a differential equation in information theory," Glasnik Matematicki, vol. 14, no. 2, pp. 269–274, 1979.
  19. P. L. Kannappan, Functional Equations and Inequalities with Applications, Springer, New York, NY, USA, 2009.
  20. L. Borland, A. R. Plastino, and C. Tsallis, "Information gain within nonextensive thermostatistics," Journal of Mathematical Physics, vol. 39, no. 12, pp. 6490–6501, 1998.
  21. C. Tsallis, M. Gell-Mann, and Y. Sato, "Asymptotically scale-invariant occupancy of phase space makes the entropy Sq extensive," Proceedings of the National Academy of Sciences of the United States of America, vol. 102, no. 43, pp. 15377–15382, 2005.
  22. C. Tsallis, "On the extensivity of the entropy Sq, the q-generalized central limit theorem and the q-triplet," Progress of Theoretical Physics, no. 162, pp. 1–9, 2006.
  23. C. Zander and A. R. Plastino, "Composite systems with extensive Sq (power-law) entropies," Physica A, vol. 364, pp. 145–156, 2006.

Copyright © 2011 Shigeru Furuichi. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

