Advances in Mathematical Physics
Volume 2011 (2011), Article ID 126108, 12 pages
http://dx.doi.org/10.1155/2011/126108
Research Article

Characterizations of Generalized Entropy Functions by Functional Equations

Shigeru Furuichi

Department of Computer Science and System Analysis, College of Humanities and Sciences, Nihon University, 3-25-40, Sakurajyousui, Setagaya-ku, Tokyo 156-8550, Japan

Received 3 March 2011; Revised 22 May 2011; Accepted 23 May 2011

Academic Editor: Giorgio Kaniadakis

Copyright © 2011 Shigeru Furuichi. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

We show that a two-parameter extended entropy function is characterized by a functional equation. As a corollary of this result, we obtain that the Tsallis entropy function is characterized by a functional equation of a different form from that used by Suyari and Tsukada, 2009, which is stated as Proposition 2.1 in the present paper. We also give an interpretation of the functional equation in our main theorem.

1. Introduction

Recently, generalized entropies have been studied from the mathematical point of view. The typical generalizations of Shannon entropy [1] are Rényi entropy [2] and Tsallis entropy [3]. The recent comprehensive book [4] and the review [5] help the reader to understand Tsallis statistics. Rényi entropy and Tsallis entropy are defined by

$$R_q(X) = \frac{1}{1-q}\log\sum_{j=1}^n p_j^q, \quad (q\neq 1,\ q>0), \qquad S_q(X) = \sum_{j=1}^n \frac{p_j^q - p_j}{1-q}, \quad (q\neq 1,\ q>0), \tag{1.1}$$

for a given information source $X=\{x_1,\dots,x_n\}$ with the probabilities $p_j \equiv \Pr(X=x_j)$. Both entropies recover Shannon entropy

$$S_1(X) \equiv -\sum_{j=1}^n p_j \log p_j \tag{1.2}$$

in the limit $q \to 1$. The uniqueness theorem for Tsallis entropy was first given in [6] and improved in [7].
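As a quick numerical sanity check (illustrative code, not part of the original paper; the function names are ours), the definitions in (1.1) and (1.2) can be implemented directly, and both generalized entropies indeed approach Shannon entropy as $q \to 1$:

```python
import math

def renyi(p, q):
    # Rényi entropy (1.1): R_q(X) = log(sum_j p_j^q) / (1 - q), q != 1, q > 0
    return math.log(sum(pj ** q for pj in p)) / (1.0 - q)

def tsallis(p, q):
    # Tsallis entropy (1.1): S_q(X) = sum_j (p_j^q - p_j) / (1 - q), q != 1, q > 0
    return sum(pj ** q - pj for pj in p) / (1.0 - q)

def shannon(p):
    # Shannon entropy (1.2): S_1(X) = -sum_j p_j log p_j
    return -sum(pj * math.log(pj) for pj in p)

p = [0.5, 0.3, 0.2]
# Both generalized entropies approach Shannon entropy in the limit q -> 1.
assert abs(renyi(p, 1.000001) - shannon(p)) < 1e-4
assert abs(tsallis(p, 1.000001) - shannon(p)) < 1e-4
```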

Throughout this paper, we call a parametrically extended entropy, such as Rényi entropy and Tsallis entropy, a generalized entropy. If we take $n=2$ in (1.2), we have the so-called binary entropy $s_b(x) = -x\log x - (1-x)\log(1-x)$. If we take $n=1$ in (1.2), we have the Shannon entropy function $f(x) = -x\log x$. In this paper, we treat the entropy function with two parameters. We note that we can produce the relative entropic function $-y f(x/y) = x(\log x - \log y)$ by the use of the Shannon entropy function $f(x)$.

We note that Rényi entropy has the additivity

$$R_q(X\times Y) = R_q(X) + R_q(Y), \tag{1.3}$$

but Tsallis entropy has the nonadditivity

$$S_q(X\times Y) = S_q(X) + S_q(Y) + (1-q)S_q(X)S_q(Y), \tag{1.4}$$

where $X\times Y$ means that $X$ and $Y$ are independent random variables. Therefore, there is a definitive difference between these entropies, although we have the simple relation between them

$$\exp(R_q(X)) = \exp_q(S_q(X)), \quad (q\neq 1), \tag{1.5}$$

where the $q$-exponential function $\exp_q(x) \equiv \{1+(1-q)x\}^{1/(1-q)}$ is defined if $1+(1-q)x \geq 0$. Note that we have $\exp_q(S_q(X)) = \left(\sum_{j=1}^n p_j^q\right)^{1/(1-q)} > 0$.
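The additivity (1.3), the nonadditivity (1.4), and the relation (1.5) can all be checked numerically; the sketch below (illustrative, with our own function names) uses an independent joint distribution built as a product:

```python
import math

def tsallis(p, q):
    # S_q(X) = sum_j (p_j^q - p_j) / (1 - q)
    return sum(x ** q - x for x in p) / (1.0 - q)

def renyi(p, q):
    # R_q(X) = log(sum_j p_j^q) / (1 - q)
    return math.log(sum(x ** q for x in p)) / (1.0 - q)

def exp_q(x, q):
    # q-exponential: {1 + (1-q)x}^{1/(1-q)}, defined when 1 + (1-q)x >= 0
    return (1.0 + (1.0 - q) * x) ** (1.0 / (1.0 - q))

q = 1.7
px, py = [0.6, 0.4], [0.2, 0.3, 0.5]
pxy = [a * b for a in px for b in py]   # joint distribution of independent X x Y

# Additivity (1.3) for Rényi and nonadditivity (1.4) for Tsallis:
assert abs(renyi(pxy, q) - (renyi(px, q) + renyi(py, q))) < 1e-10
lhs = tsallis(pxy, q)
rhs = tsallis(px, q) + tsallis(py, q) + (1 - q) * tsallis(px, q) * tsallis(py, q)
assert abs(lhs - rhs) < 1e-10

# Relation (1.5): exp(R_q(X)) = exp_q(S_q(X))
assert abs(math.exp(renyi(px, q)) - exp_q(tsallis(px, q), q)) < 1e-10
```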

Tsallis entropy can be rewritten as

$$S_q(X) = -\sum_{j=1}^n p_j^q \ln_q p_j, \tag{1.6}$$

where the $q$-logarithmic function (which is the inverse function of $\exp_q(\cdot)$) is defined by

$$\ln_q x \equiv \frac{x^{1-q}-1}{1-q}, \quad (q\neq 1), \tag{1.7}$$

which converges to $\log x$ in the limit $q\to 1$.

Since Shannon entropy can be regarded as the expectation value of $-\log p_j$, we may consider, by analogy, that Tsallis entropy is the $q$-expectation value of $-\ln_q p_j$, where the $q$-expectation value $E_q$ is defined by

$$E_q(X) \equiv \sum_{j=1}^n p_j^q x_j. \tag{1.8}$$

However, the $q$-expectation value $E_q$ lacks the fundamental property $E_q(1)=1$, so it was considered inadequate as a generalized definition of the usual expectation value. The normalized $q$-expectation value was therefore introduced,

$$E_q^{(\mathrm{nor})}(X) \equiv \frac{\sum_{j=1}^n p_j^q x_j}{\sum_{i=1}^n p_i^q}, \tag{1.9}$$

and by using it, the normalized Tsallis entropy was defined by

$$S_q^{(\mathrm{nor})}(X) \equiv \frac{S_q(X)}{\sum_{j=1}^n p_j^q} = \frac{-\sum_{j=1}^n p_j^q \ln_q p_j}{\sum_{i=1}^n p_i^q}, \quad (q\neq 1). \tag{1.10}$$

We easily find the following nonadditivity relation for the normalized Tsallis entropy:

$$S_q^{(\mathrm{nor})}(X\times Y) = S_q^{(\mathrm{nor})}(X) + S_q^{(\mathrm{nor})}(Y) + (q-1)S_q^{(\mathrm{nor})}(X)S_q^{(\mathrm{nor})}(Y). \tag{1.11}$$
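A short numerical check (illustrative, not from the paper) confirms (1.11), with the coefficient $(q-1)$ rather than the $(1-q)$ of (1.4):

```python
def tsallis_nor(p, q):
    # Normalized Tsallis entropy (1.10): S_q(X) / sum_j p_j^q
    s_q = sum(x ** q - x for x in p) / (1.0 - q)
    return s_q / sum(x ** q for x in p)

q = 1.4
px, py = [0.7, 0.3], [0.25, 0.25, 0.5]
pxy = [a * b for a in px for b in py]   # independent joint distribution

# Nonadditivity (1.11): the product term carries (q - 1), not (1 - q).
lhs = tsallis_nor(pxy, q)
rhs = (tsallis_nor(px, q) + tsallis_nor(py, q)
       + (q - 1) * tsallis_nor(px, q) * tsallis_nor(py, q))
assert abs(lhs - rhs) < 1e-12
```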

As for the details on the mathematical properties of the normalized Tsallis entropy, see [8], for example. See also [9] for the role of Tsallis entropy and the normalized Tsallis entropy in statistical physics. The difference between the two nonadditivity relations (1.4) and (1.11) is the sign of the coefficient $1-q$ in the third term of the right-hand side.

We note that Tsallis entropy can also be rewritten as

$$S_q(X) = \sum_{j=1}^n p_j \ln_q \frac{1}{p_j}, \tag{1.12}$$

so that we may regard it as the usual expectation value $S_q(X) = E_1\left[\ln_q(1/p_j)\right]$, where $E_1$ means the usual expectation value $E_1[X] = \sum_{j=1}^n p_j x_j$. However, if we adopt this formulation in the definition of Tsallis conditional entropy, we lose an important property, namely the chain rule (see [10] for details). Therefore, we often adopt the formulation using the $q$-expectation value.
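Both representations (1.6) and (1.12) can be verified against the direct definition (1.1); the following sketch is illustrative only (our function names), and also checks the limit $\ln_q x \to \log x$:

```python
import math

def ln_q(x, q):
    # q-logarithm (1.7): (x^{1-q} - 1)/(1 - q); tends to log x as q -> 1
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

def tsallis(p, q):
    # S_q(X) = sum_j (p_j^q - p_j)/(1 - q)
    return sum(x ** q - x for x in p) / (1.0 - q)

q = 0.8
p = [0.1, 0.2, 0.7]

# (1.6): S_q(X) = -sum_j p_j^q ln_q p_j
assert abs(tsallis(p, q) - (-sum(x ** q * ln_q(x, q) for x in p))) < 1e-12

# (1.12): S_q(X) = sum_j p_j ln_q(1/p_j)
assert abs(tsallis(p, q) - sum(x * ln_q(1.0 / x, q) for x in p)) < 1e-12

# ln_q converges to the natural logarithm in the limit q -> 1
assert abs(ln_q(0.3, 1.000001) - math.log(0.3)) < 1e-5
```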

As a further generalization, a two-parameter extended entropy

$$S_{\kappa,r}(X) \equiv -\sum_{j=1}^n p_j \ln_{\kappa,r} p_j \tag{1.13}$$

was recently introduced in [11, 12] and systematically studied together with the generalized exponential function and the generalized logarithmic function $\ln_{\kappa,r}(x) \equiv x^r \left(\dfrac{x^\kappa - x^{-\kappa}}{2\kappa}\right)$. In the present paper, we treat a two-parameter extended entropy defined in the following form:

$$S_{\alpha,\beta}(X) \equiv \sum_{j=1}^n \frac{p_j^\alpha - p_j^\beta}{\beta-\alpha}, \quad (\alpha\neq\beta), \tag{1.14}$$

for two positive numbers $\alpha$ and $\beta$. This form can be obtained by putting $\alpha = 1+r-\kappa$ and $\beta = 1+r+\kappa$ in (1.13), and it coincides with the two-parameter extended entropy studied in [13]. In addition, the two-parameter extended entropy (1.14) was axiomatically characterized in [14]. Furthermore, a two-parameter extended relative entropy was also axiomatically characterized in [15].
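The reparametrization $\alpha = 1+r-\kappa$, $\beta = 1+r+\kappa$ can be checked numerically; the sketch below (illustrative, our own function names) compares (1.13) and (1.14) on a sample distribution:

```python
def ln_kr(x, kappa, r):
    # Two-parameter deformed logarithm: x^r * (x^kappa - x^{-kappa}) / (2*kappa)
    return x ** r * (x ** kappa - x ** (-kappa)) / (2.0 * kappa)

def S_kr(p, kappa, r):
    # (1.13): S_{kappa,r}(X) = -sum_j p_j ln_{kappa,r} p_j
    return -sum(x * ln_kr(x, kappa, r) for x in p)

def S_ab(p, alpha, beta):
    # (1.14): S_{alpha,beta}(X) = sum_j (p_j^alpha - p_j^beta)/(beta - alpha)
    return sum(x ** alpha - x ** beta for x in p) / (beta - alpha)

kappa, r = 0.3, 0.1
p = [0.2, 0.5, 0.3]
# alpha = 1 + r - kappa and beta = 1 + r + kappa recover (1.14) from (1.13).
assert abs(S_kr(p, kappa, r) - S_ab(p, 1 + r - kappa, 1 + r + kappa)) < 1e-12
```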

In the paper [16], a characterization of the Tsallis entropy function was proven by means of a functional equation. In the present paper, we will show that the two-parameter extended entropy function

$$f_{\alpha,\beta}(x) = \frac{x^\alpha - x^\beta}{\beta-\alpha}, \quad (\alpha\neq\beta,\ \alpha,\beta>0) \tag{1.15}$$

can be characterized by a simple functional equation.

2. A Review of the Characterization of Tsallis Entropy Function by the Functional Equation

The following proposition was originally given in [16] with a simple and elegant proof. Here, we give an alternative proof along the lines of the proof given in [17].

Proposition 2.1 (see [16]). If a differentiable nonnegative function $f_q$ with positive parameter $q$ satisfies the functional equation

$$f_q(xy) + f_q((1-x)y) - f_q(y) = \left(f_q(x) + f_q(1-x)\right)y^q, \quad (0<x<1,\ 0<y\leq 1), \tag{2.1}$$

then the function $f_q$ is uniquely given by

$$f_q(x) = -c_q x^q \ln_q x, \tag{2.2}$$

where $c_q$ is a nonnegative constant depending only on the parameter $q$.

Proof. If we put $y=1$ in (2.1), then we have $f_q(1)=0$. From here on, we assume that $y\neq 1$. We also put $g_q(t) \equiv f_q(t)/t$; then we have

$$x g_q(xy) + (1-x) g_q((1-x)y) - g_q(y) = \left(x g_q(x) + (1-x) g_q(1-x)\right) y^{q-1}. \tag{2.3}$$

Putting $x=1/2$ in (2.3), we have

$$g_q\left(\frac{y}{2}\right) = g_q\left(\frac{1}{2}\right) y^{q-1} + g_q(y). \tag{2.4}$$

Substituting $y/2$ for $y$, we have

$$g_q\left(\frac{y}{2^2}\right) = g_q\left(\frac{1}{2}\right)\left(y^{q-1} + \left(\frac{y}{2}\right)^{q-1}\right) + g_q(y). \tag{2.5}$$

By repeating similar substitutions, we have

$$g_q\left(\frac{y}{2^N}\right) = g_q\left(\frac{1}{2}\right) y^{q-1}\left(1 + \frac{1}{2^{q-1}} + \frac{1}{2^{2(q-1)}} + \cdots + \frac{1}{2^{(N-1)(q-1)}}\right) + g_q(y) = g_q\left(\frac{1}{2}\right) y^{q-1}\,\frac{2^{N(1-q)}-1}{2^{1-q}-1} + g_q(y). \tag{2.6}$$

Then we have

$$\lim_{N\to\infty} \frac{g_q\left(y/2^N\right)}{2^N} = 0, \tag{2.7}$$

due to $q>0$. Differentiating (2.3) with respect to $y$, we have

$$x^2 g_q'(xy) + (1-x)^2 g_q'((1-x)y) - g_q'(y) = (q-1)\left(x g_q(x) + (1-x) g_q(1-x)\right) y^{q-2}. \tag{2.8}$$

Putting $y=1$ in the above equation, we have

$$x^2 g_q'(x) + (1-x)^2 g_q'(1-x) + (1-q)\left(x g_q(x) + (1-x) g_q(1-x)\right) = -c_q, \tag{2.9}$$

where $c_q \equiv -g_q'(1)$.
By integrating (2.3) from $2^{-N}$ to $1$ with respect to $y$ and changing variables, we have

$$\int_{2^{-N}x}^{x} g_q(t)\,dt + \int_{2^{-N}(1-x)}^{1-x} g_q(t)\,dt - \int_{2^{-N}}^{1} g_q(t)\,dt = \left(x g_q(x) + (1-x) g_q(1-x)\right)\frac{1-2^{-Nq}}{q}. \tag{2.10}$$

By differentiating the above equation with respect to $x$, we have

$$g_q(x) - 2^{-N} g_q\left(2^{-N}x\right) - g_q(1-x) + 2^{-N} g_q\left(2^{-N}(1-x)\right) = \frac{1-2^{-Nq}}{q}\left(g_q(x) + x g_q'(x) - g_q(1-x) - (1-x) g_q'(1-x)\right). \tag{2.11}$$

Taking the limit $N\to\infty$ in the above, we have

$$x g_q'(x) + (1-q) g_q(x) = (1-x) g_q'(1-x) + (1-q) g_q(1-x), \tag{2.12}$$

thanks to (2.7). From (2.9) and (2.12), we have the following differential equation:

$$x g_q'(x) + (1-q) g_q(x) = -c_q. \tag{2.13}$$

This differential equation has the general solution

$$g_q(x) = -\frac{c_q}{1-q} + d_q x^{q-1}, \tag{2.14}$$

where $d_q$ is an integration constant depending on $q$. From $g_q(1)=0$, we have $d_q = c_q/(1-q)$. Thus, we have

$$g_q(x) = c_q\,\frac{x^{q-1}-1}{1-q}. \tag{2.15}$$

Finally, we have

$$f_q(x) = c_q\,\frac{x^q - x}{1-q} = -c_q x^q \ln_q x. \tag{2.16}$$

From $f_q(x) \geq 0$, we have $c_q \geq 0$.
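The solution just derived can be checked directly against the functional equation (2.1); the sketch below (illustrative only, not part of the proof) evaluates both sides on a grid:

```python
def f_q(x, q, c=1.0):
    # Claimed solution (2.2): f_q(x) = -c_q x^q ln_q x = c_q (x^q - x)/(1 - q)
    return c * (x ** q - x) / (1.0 - q)

q, c = 2.5, 3.0
for x in [0.2, 0.5, 0.9]:
    for y in [0.3, 0.6, 1.0]:
        # Functional equation (2.1):
        lhs = f_q(x * y, q, c) + f_q((1 - x) * y, q, c) - f_q(y, q, c)
        rhs = (f_q(x, q, c) + f_q(1 - x, q, c)) * y ** q
        assert abs(lhs - rhs) < 1e-12
```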
If we take the limit $q\to 1$ in Proposition 2.1, we have the following corollary.

Corollary 2.2 (see [17]). If a differentiable nonnegative function $f$ satisfies the functional equation

$$f(xy) + f((1-x)y) - f(y) = \left(f(x) + f(1-x)\right)y, \quad (0<x<1,\ 0<y\leq 1), \tag{2.17}$$

then the function $f$ is uniquely given by

$$f(x) = -c x \log x, \tag{2.18}$$

where $c$ is a nonnegative constant.

3. Main Results

In this section, we give a characterization of the two-parameter extended entropy function by a functional equation. Before we give our main theorem, we review the following result given by Kannappan [18, 19].

Proposition 3.1 (see [18, 19]). Let $(p_1,\dots,p_n)$ and $(q_1,\dots,q_m)$ be two probability distributions. If a measurable function $f$ on $(0,1)$ satisfies

$$\sum_{i=1}^n\sum_{j=1}^m f(p_i q_j) = \sum_{i=1}^n p_i^\alpha \sum_{j=1}^m f(q_j) + \sum_{j=1}^m q_j^\beta \sum_{i=1}^n f(p_i) \tag{3.1}$$

for all $(p_1,\dots,p_n)$ and $(q_1,\dots,q_m)$ with fixed $m,n\geq 3$, then the function $f$ is given by

$$f(p) = \begin{cases} c\left(p^\alpha - p^\beta\right), & \alpha\neq\beta,\\ c\,p^\alpha\log p, & \alpha=\beta\neq 1,\\ c\,p\log p + b(mn-m-n)p + b, & \alpha=\beta=1, \end{cases} \tag{3.2}$$

where $c$ and $b$ are arbitrary constants.

Here, we review the two-parameter generalized Shannon additivity [14, equation (30)]:

$$\sum_{i=1}^n \sum_{j=1}^{m_i} s_{\alpha,\beta}(p_{ij}) = \sum_{i=1}^n p_i^\alpha \sum_{j=1}^{m_i} s_{\alpha,\beta}(p(j|i)) + \sum_{i=1}^n s_{\alpha,\beta}(p_i) \sum_{j=1}^{m_i} p(j|i)^\beta, \tag{3.3}$$

where $p_{ij} = p_i\,p(j|i)$ and $s_{\alpha,\beta}$ is a component of the trace form of the two-parameter entropy [14, equation (26)]

$$S_{\alpha,\beta}(\{p_i\}) = \sum_{i=1}^n s_{\alpha,\beta}(p_i). \tag{3.4}$$

Equation (3.3) was used to prove the uniqueness theorem for the two-parameter extended entropy in [14]. A tree-graphical interpretation of (3.3) was also given in [14]. The condition (3.1) can be read as the independent case ($p(j|i) = q_j$) of (3.3).

Here, we consider the simplest nontrivial case of (3.3). Take $\{p_{ij}\} = \{q_1, q_2, q_3\}$ with $p_1 = q_1+q_2$ and $p_2 = q_3$. Then we have $p(1|1) = q_1/(q_1+q_2)$, $p(2|1) = q_2/(q_1+q_2)$, $p(1|2) = 1$, and $p(2|2) = 0$, so that (3.3) is written as

$$S_{\alpha,\beta}(q_1,q_2,q_3) = (q_1+q_2)^\alpha \left(s_{\alpha,\beta}\left(\frac{q_1}{q_1+q_2}\right) + s_{\alpha,\beta}\left(\frac{q_2}{q_1+q_2}\right)\right) + q_3^\alpha\left(s_{\alpha,\beta}(1) + s_{\alpha,\beta}(0)\right) + s_{\alpha,\beta}(q_1+q_2)\left(\left(\frac{q_1}{q_1+q_2}\right)^\beta + \left(\frac{q_2}{q_1+q_2}\right)^\beta\right) + s_{\alpha,\beta}(q_3). \tag{3.5}$$

If $s_{\alpha,\beta}$ is an entropic function, then it vanishes at $0$ and $1$, since an entropy carries no informational quantity for deterministic cases; hence the above identity reduces to

$$S_{\alpha,\beta}(q_1,q_2,q_3) = (q_1+q_2)^\alpha \left(s_{\alpha,\beta}\left(\frac{q_1}{q_1+q_2}\right) + s_{\alpha,\beta}\left(\frac{q_2}{q_1+q_2}\right)\right) + s_{\alpha,\beta}(q_1+q_2)\left(\left(\frac{q_1}{q_1+q_2}\right)^\beta + \left(\frac{q_2}{q_1+q_2}\right)^\beta\right) + s_{\alpha,\beta}(q_3). \tag{3.6}$$
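Identity (3.6) can be confirmed numerically with the entropic function $s_{\alpha,\beta}(x) = (x^\alpha - x^\beta)/(\beta-\alpha)$ from (1.15); the check below is illustrative only (our function names):

```python
def f_ab(x, alpha, beta):
    # Two-parameter entropic function (1.15): (x^alpha - x^beta)/(beta - alpha)
    return (x ** alpha - x ** beta) / (beta - alpha)

def S_ab(p, alpha, beta):
    # Trace form (3.4) with s_{alpha,beta} = f_ab
    return sum(f_ab(x, alpha, beta) for x in p)

a, b = 0.7, 1.6
q1, q2, q3 = 0.2, 0.3, 0.5
u = q1 + q2
# Both sides of (3.6):
lhs = S_ab([q1, q2, q3], a, b)
rhs = (u ** a * (f_ab(q1 / u, a, b) + f_ab(q2 / u, a, b))
       + f_ab(u, a, b) * ((q1 / u) ** b + (q2 / u) ** b)
       + f_ab(q3, a, b))
assert abs(lhs - rhs) < 1e-12
```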

In the following theorem, we adopt a simpler condition than (3.1).

Theorem 3.2. If a differentiable nonnegative function $f_{\alpha,\beta}$ with two positive parameters $\alpha,\beta$ satisfies the functional equation

$$f_{\alpha,\beta}(xy) = x^\alpha f_{\alpha,\beta}(y) + y^\beta f_{\alpha,\beta}(x), \quad (0<x,y\leq 1), \tag{3.7}$$

then the function $f_{\alpha,\beta}$ is uniquely given by

$$f_{\alpha,\beta}(x) = c_{\alpha,\beta}\,\frac{x^\alpha - x^\beta}{\beta-\alpha}, \quad (\alpha\neq\beta), \qquad f_\alpha(x) = -c_\alpha x^\alpha \log x, \quad (\alpha=\beta), \tag{3.8}$$

where $c_{\alpha,\beta}$ and $c_\alpha$ are nonnegative constants depending only on the parameters $\alpha$ (and $\beta$).

Proof. If we put $y=1$, then we have $f_{\alpha,\beta}(1)=0$ due to $x>0$. By differentiating (3.7) with respect to $y$, we have

$$x f_{\alpha,\beta}'(xy) = x^\alpha f_{\alpha,\beta}'(y) + \beta y^{\beta-1} f_{\alpha,\beta}(x). \tag{3.9}$$

Putting $y=1$ in (3.9), we have the following differential equation:

$$x f_{\alpha,\beta}'(x) - \beta f_{\alpha,\beta}(x) = -c_{\alpha,\beta}\,x^\alpha, \tag{3.10}$$

where we put $c_{\alpha,\beta} \equiv -f_{\alpha,\beta}'(1)$. Equation (3.10) can be rewritten as

$$x^{\beta+1}\left(x^{-\beta} f_{\alpha,\beta}(x)\right)' = -c_{\alpha,\beta}\,x^\alpha, \tag{3.11}$$

that is,

$$\left(x^{-\beta} f_{\alpha,\beta}(x)\right)' = -c_{\alpha,\beta}\,x^{\alpha-\beta-1}. \tag{3.12}$$

Integrating both sides with respect to $x$, we have

$$x^{-\beta} f_{\alpha,\beta}(x) = -\frac{c_{\alpha,\beta}\,x^{\alpha-\beta}}{\alpha-\beta} + d_{\alpha,\beta}, \tag{3.13}$$

where $d_{\alpha,\beta}$ is an integration constant depending on $\alpha$ and $\beta$. Therefore, we have

$$f_{\alpha,\beta}(x) = -\frac{c_{\alpha,\beta}\,x^\alpha}{\alpha-\beta} + d_{\alpha,\beta}\,x^\beta. \tag{3.14}$$

By $f_{\alpha,\beta}(1)=0$, we have $d_{\alpha,\beta} = c_{\alpha,\beta}/(\alpha-\beta)$. Thus, we have

$$f_{\alpha,\beta}(x) = c_{\alpha,\beta}\,\frac{x^\alpha - x^\beta}{\beta-\alpha}. \tag{3.15}$$

Also, by $f_{\alpha,\beta}(x) \geq 0$, we have $c_{\alpha,\beta} \geq 0$.
The case $\alpha=\beta$ can be proven in a similar way.
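Both branches of the solution (3.8) can be tested against the functional equation (3.7) on a grid; the sketch below is illustrative only (our function names, with the constants set to $1$):

```python
import math

def f_ab(x, alpha, beta):
    # (3.8), alpha != beta: c_{alpha,beta}(x^alpha - x^beta)/(beta - alpha), c = 1
    return (x ** alpha - x ** beta) / (beta - alpha)

def f_a(x, alpha):
    # (3.8), alpha == beta: -c_alpha * x^alpha * log x, c = 1
    return -x ** alpha * math.log(x)

a, b = 0.6, 2.2
for x in [0.1, 0.4, 0.9]:
    for y in [0.3, 0.7, 1.0]:
        # Functional equation (3.7) for alpha != beta:
        assert abs(f_ab(x * y, a, b)
                   - (x ** a * f_ab(y, a, b) + y ** b * f_ab(x, a, b))) < 1e-12
        # The alpha = beta branch satisfies (3.7) with beta = alpha:
        assert abs(f_a(x * y, a)
                   - (x ** a * f_a(y, a) + y ** a * f_a(x, a))) < 1e-12
```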

Remark 3.3. We can derive (3.6) from our condition (3.7). First, we easily have $f_{\alpha,\beta}(0) = f_{\alpha,\beta}(1) = 0$ from (3.7). In addition, putting $q = q_1 + q_2$, we have

$$S_{\alpha,\beta}\left(q\cdot\frac{q_1}{q},\ q\cdot\frac{q_2}{q},\ q_3\right) = f_{\alpha,\beta}\left(q\cdot\frac{q_1}{q}\right) + f_{\alpha,\beta}\left(q\cdot\frac{q_2}{q}\right) + f_{\alpha,\beta}(q_3) = q^\alpha f_{\alpha,\beta}\left(\frac{q_1}{q}\right) + \left(\frac{q_1}{q}\right)^\beta f_{\alpha,\beta}(q) + q^\alpha f_{\alpha,\beta}\left(\frac{q_2}{q}\right) + \left(\frac{q_2}{q}\right)^\beta f_{\alpha,\beta}(q) + f_{\alpha,\beta}(q_3) = (q_1+q_2)^\alpha\left(f_{\alpha,\beta}\left(\frac{q_1}{q_1+q_2}\right) + f_{\alpha,\beta}\left(\frac{q_2}{q_1+q_2}\right)\right) + f_{\alpha,\beta}(q_1+q_2)\left(\left(\frac{q_1}{q_1+q_2}\right)^\beta + \left(\frac{q_2}{q_1+q_2}\right)^\beta\right) + f_{\alpha,\beta}(q_3). \tag{3.16}$$

Thus, we may interpret that our condition (3.7) contains an essential part of the two-parameter generalized Shannon additivity.
Note that we can reproduce the two-parameter entropic function by the use of $f_{\alpha,\beta}$ as

$$-y f_{\alpha,\beta}\left(\frac{x}{y}\right) = \frac{x^\alpha y^{1-\alpha} - x^\beta y^{1-\beta}}{\alpha-\beta}, \tag{3.17}$$

with $c_{\alpha,\beta}=1$ for simplicity. This leads to the two-parameter extended relative entropy [15]

$$D_{\alpha,\beta}(x_1,\dots,x_n \| y_1,\dots,y_n) \equiv \sum_{j=1}^n \frac{x_j^\alpha y_j^{1-\alpha} - x_j^\beta y_j^{1-\beta}}{\alpha-\beta}. \tag{3.18}$$

See also [20] for the first appearance of the Tsallis relative entropy (generalized Kullback-Leibler information).
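As a plausibility check on (3.18) as reconstructed above (illustrative code, not from the paper), the two-parameter relative entropy approaches the Kullback-Leibler divergence as $\alpha,\beta \to 1$ and is nonnegative on sample distributions:

```python
import math

def D_ab(xs, ys, alpha, beta):
    # (3.18): sum_j (x_j^alpha y_j^{1-alpha} - x_j^beta y_j^{1-beta}) / (alpha - beta)
    return sum(x ** alpha * y ** (1 - alpha) - x ** beta * y ** (1 - beta)
               for x, y in zip(xs, ys)) / (alpha - beta)

xs, ys = [0.5, 0.3, 0.2], [0.4, 0.4, 0.2]
kl = sum(x * math.log(x / y) for x, y in zip(xs, ys))

# As alpha -> 1 and beta -> 1, D_{alpha,beta} approaches the KL divergence.
assert abs(D_ab(xs, ys, 1.000001, 0.999999) - kl) < 1e-4
# Nonnegativity on these distributions (alpha < 1 < beta):
assert D_ab(xs, ys, 0.7, 1.5) >= 0.0
```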
If we take $\alpha=q,\ \beta=1$ or $\alpha=1,\ \beta=q$ in Theorem 3.2, we have the following corollary.

Corollary 3.4. If a differentiable nonnegative function $f_q$ with a positive parameter $q$ satisfies the functional equation

$$f_q(xy) = x^q f_q(y) + y f_q(x), \quad (0<x,y\leq 1,\ q\neq 1), \tag{3.19}$$

then the function $f_q$ is uniquely given by

$$f_q(x) = -c_q x^q \ln_q x, \tag{3.20}$$

where $c_q$ is a nonnegative constant depending only on the parameter $q$.

Here, we give an interpretation of the functional equation (3.19) from the viewpoint of Tsallis statistics.

Remark 3.5. We assume that we have the following two functional equations for $0<x,y\leq 1$:

$$f_q(xy) = y f_q(x) + x f_q(y) + (1-q) f_q(x) f_q(y), \qquad f_q(xy) = y^q f_q(x) + x^q f_q(y) + (q-1) f_q(x) f_q(y). \tag{3.21}$$

These equations lead to the following equations for $0<x_i, y_j\leq 1$:

$$f_q(x_i y_j) = y_j f_q(x_i) + x_i f_q(y_j) + (1-q) f_q(x_i) f_q(y_j), \qquad f_q(x_i y_j) = y_j^q f_q(x_i) + x_i^q f_q(y_j) + (q-1) f_q(x_i) f_q(y_j), \tag{3.22}$$

where $i=1,\dots,n$ and $j=1,\dots,m$. Taking the summation over $i$ and $j$ on both sides, we have

$$\sum_{i=1}^n\sum_{j=1}^m f_q(x_i y_j) = \sum_{i=1}^n f_q(x_i) + \sum_{j=1}^m f_q(y_j) + (1-q)\sum_{i=1}^n f_q(x_i)\sum_{j=1}^m f_q(y_j), \tag{3.23}$$

$$\sum_{i=1}^n\sum_{j=1}^m f_q(x_i y_j) = \sum_{j=1}^m y_j^q \sum_{i=1}^n f_q(x_i) + \sum_{i=1}^n x_i^q \sum_{j=1}^m f_q(y_j) + (q-1)\sum_{i=1}^n f_q(x_i)\sum_{j=1}^m f_q(y_j), \tag{3.24}$$

under the condition $\sum_{i=1}^n x_i = \sum_{j=1}^m y_j = 1$. If the function $f_q(x)$ is given by (3.20), then the two functional equations above coincide with the two nonadditivity relations given in (1.4) and (1.11).
On the other hand, by adding the two functional equations in (3.21), we obtain

$$f_q(xy) = \frac{x^q + x}{2}\,f_q(y) + \frac{y^q + y}{2}\,f_q(x), \quad (0<x,y\leq 1,\ q\neq 1). \tag{3.25}$$

In a way similar to the proof of Theorem 3.2, we can show that the functional equation (3.25) uniquely determines the function $f_q$ in the form given in (3.20). Therefore, we can conclude that the two functional equations in (3.21), which correspond to the nonadditivity relations (1.4) and (1.11), also characterize the Tsallis entropy function.
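Both equations in (3.21), and their average (3.25), can be verified numerically for the Tsallis entropy function; the sketch below is illustrative only (our function names, with the normalization $c_q = 1$, which the product terms in (3.21) fix):

```python
def f_q(x, q):
    # (3.20) with c_q = 1: f_q(x) = -x^q ln_q x = (x^q - x)/(1 - q)
    return (x ** q - x) / (1.0 - q)

q = 1.8
for x in [0.2, 0.6, 1.0]:
    for y in [0.25, 0.75, 1.0]:
        # Both equations in (3.21):
        e1 = y * f_q(x, q) + x * f_q(y, q) + (1 - q) * f_q(x, q) * f_q(y, q)
        e2 = y ** q * f_q(x, q) + x ** q * f_q(y, q) + (q - 1) * f_q(x, q) * f_q(y, q)
        assert abs(f_q(x * y, q) - e1) < 1e-12
        assert abs(f_q(x * y, q) - e2) < 1e-12
        # Their average, equation (3.25):
        e3 = ((x ** q + x) / 2) * f_q(y, q) + ((y ** q + y) / 2) * f_q(x, q)
        assert abs(f_q(x * y, q) - e3) < 1e-12
```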
If we again take the limit $q\to 1$ in Corollary 3.4, we have the following corollary.

Corollary 3.6. If a differentiable nonnegative function $f$ satisfies the functional equation

$$f(xy) = y f(x) + x f(y), \quad (0<x,y\leq 1), \tag{3.26}$$

then the function $f$ is uniquely given by

$$f(x) = -c x \log x, \tag{3.27}$$

where $c$ is a nonnegative constant.

4. Conclusion

As we have seen, the two-parameter extended entropy function can be uniquely determined by a simple functional equation. An interpretation related to a tree-graphical structure was also given as a remark.

Recently, the extensive behaviours of generalized entropies were studied in [21–23]. Our condition given in (3.7) may be seen as an extensive form. However, I have not yet found any relation between our functional equation (3.7) and the extensive behaviours of the generalized entropies. This problem is beyond the scope of the present paper, but it is quite interesting as future work.

Acknowledgments

This paper is dedicated to Professor Kenjiro Yanagi on his 60th birthday. The author would like to thank the anonymous reviewers for providing valuable comments to improve the paper. The author was partially supported by the Japanese Ministry of Education, Science, Sports and Culture, Grant-in-Aid for Encouragement of Young Scientists (B) 20740067.

References

  1. C. E. Shannon, “A mathematical theory of communication,” The Bell System Technical Journal, vol. 27, pp. 379–423, 623–656, 1948.
  2. A. Rényi, “On measures of entropy and information,” in Proceedings of the 4th Berkeley Symposium on Mathematical Statistics and Probability, vol. 1, pp. 547–561, University of California Press, Berkeley, Calif, USA, 1961.
  3. C. Tsallis, “Possible generalization of Boltzmann-Gibbs statistics,” Journal of Statistical Physics, vol. 52, no. 1-2, pp. 479–487, 1988.
  4. C. Tsallis, Introduction to Nonextensive Statistical Mechanics, Springer, New York, NY, USA, 2009.
  5. C. Tsallis, D. Prato, and A. R. Plastino, “Nonextensive statistical mechanics: some links with astronomical phenomena,” Astrophysics and Space Science, vol. 290, pp. 259–274, 2004.
  6. H. Suyari, “Generalization of Shannon-Khinchin axioms to nonextensive systems and the uniqueness theorem for the nonextensive entropy,” IEEE Transactions on Information Theory, vol. 50, no. 8, pp. 1783–1787, 2004.
  7. S. Furuichi, “On uniqueness theorems for Tsallis entropy and Tsallis relative entropy,” IEEE Transactions on Information Theory, vol. 51, no. 10, pp. 3638–3645, 2005.
  8. H. Suyari, “Nonextensive entropies derived from form invariance of pseudoadditivity,” Physical Review E, vol. 65, no. 6, p. 066118, 2002.
  9. C. Tsallis, R. S. Mendes, and A. R. Plastino, “The role of constraints within generalized nonextensive statistics,” Physica A, vol. 261, pp. 534–554, 1998.
  10. S. Furuichi, “Information theoretical properties of Tsallis entropies,” Journal of Mathematical Physics, vol. 47, no. 2, p. 023302, 2006.
  11. G. Kaniadakis, M. Lissia, and A. M. Scarfone, “Deformed logarithms and entropies,” Physica A, vol. 340, no. 1–3, pp. 41–49, 2004.
  12. G. Kaniadakis, M. Lissia, and A. M. Scarfone, “Two-parameter deformations of logarithm, exponential, and entropy: a consistent framework for generalized statistical mechanics,” Physical Review E, vol. 71, no. 4, p. 046128, 2005.
  13. E. P. Borges and I. Roditi, “A family of nonextensive entropies,” Physics Letters A, vol. 246, no. 5, pp. 399–402, 1998.
  14. T. Wada and H. Suyari, “A two-parameter generalization of Shannon-Khinchin axioms and the uniqueness theorem,” Physics Letters A, vol. 368, no. 3-4, pp. 199–205, 2007.
  15. S. Furuichi, “An axiomatic characterization of a two-parameter extended relative entropy,” Journal of Mathematical Physics, vol. 51, no. 12, p. 123302, 2010.
  16. H. Suyari and M. Tsukada, “Tsallis differential entropy and divergences derived from the generalized Shannon-Khinchin axioms,” in Proceedings of the IEEE International Symposium on Information Theory (ISIT '09), pp. 149–153, Seoul, Korea, 2009.
  17. Y. Horibe, “Entropy of terminal distributions and the Fibonacci trees,” The Fibonacci Quarterly, vol. 26, no. 2, pp. 135–140, 1988.
  18. P. L. Kannappan, “An application of a differential equation in information theory,” Glasnik Matematicki, vol. 14, no. 2, pp. 269–274, 1979.
  19. P. L. Kannappan, Functional Equations and Inequalities with Applications, Springer, New York, NY, USA, 2009.
  20. L. Borland, A. R. Plastino, and C. Tsallis, “Information gain within nonextensive thermostatistics,” Journal of Mathematical Physics, vol. 39, no. 12, pp. 6490–6501, 1998.
  21. C. Tsallis, M. Gell-Mann, and Y. Sato, “Asymptotically scale-invariant occupancy of phase space makes the entropy Sq extensive,” Proceedings of the National Academy of Sciences of the United States of America, vol. 102, no. 43, pp. 15377–15382, 2005.
  22. C. Tsallis, “On the extensivity of the entropy Sq, the q-generalized central limit theorem and the q-triplet,” Progress of Theoretical Physics, no. 162, pp. 1–9, 2006.
  23. C. Zander and A. R. Plastino, “Composite systems with extensive Sq (power-law) entropies,” Physica A, vol. 364, pp. 145–156, 2006.